
Forum: help

RE: SUR methodology, R square and test statistics
By: Arne Henningsen on 2012-03-06 05:46
[forum:5573]
...and here is the answer to your remaining questions:

The R code that you posted tests the hypothesis that the coefficients of each explanatory variable sum to zero across all equations, because pre-multiplying the coefficient vector by the restriction matrix ("Rmat %*% coef(testsur)") gives the sum of the coefficients of an explanatory variable over all equations. This is equivalent to testing whether the average of the coefficients of an explanatory variable over all equations is zero. To verify this, you could repeat the test with "Rmat" replaced by "Rmat/a" (where "a" is the number of equations), because "(Rmat/a) %*% coef(testsur)" returns the average of the coefficients of an explanatory variable over all equations.
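
For illustration, the "average" version of the test would look something like this (untested sketch, reusing "testsur", "a", "aa", and a fixed variable index "l" from the code that you posted):

Rmat <- matrix(0, nrow = 1, ncol = a*aa)
for (j in 0:(a-1)) {
  Rmat[1, l + j*aa] <- 1
}
# H0: the average of the coefficients of variable l over all equations is zero
linearHypothesis(testsur, Rmat/a, test = "F")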

If you want to test whether the coefficients of an explanatory variable across all equations are simultaneously equal to zero, you have "a" restrictions (one per equation) and hence the matrix "Rmat" must have "a" rows, where each row corresponds to one restriction, i.e. to one of the coefficients being zero. This could be done with the following commands (attention: untested code):

for (l in 1:aa) {
  # one restriction (row) per equation: the coefficient of variable l is zero
  Rmat <- matrix(0, nrow = a, ncol = a*aa)
  for (j in 0:(a-1)) {
    Rmat[j+1, l + j*aa] <- 1
  }
  testresult <- linearHypothesis(testsur, Rmat, test = "F")
  resSurtest[l,1] <- testresult[2, 4]  # p-value of the F test
}
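
Note that this snippet assumes, as in your original code, that the "car" package (which provides the "linearHypothesis" generic; the method for systemfit objects comes with the systemfit package) is loaded and that "resSurtest" has been created beforehand, e.g.:

library(car)
resSurtest <- matrix(NA, nrow = aa, ncol = 1)  # one p-value per explanatory variable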


RE: SUR methodology, R square and test statistics
By: Arne Henningsen on 2012-03-06 05:25
[forum:5572]
Dear Florian

Here is the answer to your first question:
The R^2 values of the individual equations are available in the object returned by the "summary" method (see section "Value" in the documentation of "summary.systemfit"). You can get the R^2 value of the i-th equation by the command
summary(testsur$eq[[i]])$r.squared
and you can get the vector of the R^2 values of all individual equations by the command:
sapply(testsur$eq, function(x) summary(x)$r.squared)
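
For example, with the "Kmenta" data set that ships with systemfit (just a minimal sketch; the model and variable names are taken from the package examples, not from your data):

library(systemfit)
data("Kmenta")
eqDemand <- consump ~ price + income
eqSupply <- consump ~ price + farmPrice + trend
testsur <- systemfit(list(demand = eqDemand, supply = eqSupply),
                     method = "SUR", data = Kmenta)
summary(testsur$eq[[1]])$r.squared                    # R^2 of the first equation
sapply(testsur$eq, function(x) summary(x)$r.squared)  # R^2 of all equations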

Best regards,
Arne


SUR methodology, R square and test statistics
By: Florian Seeler on 2012-03-05 09:14
[forum:5569]
Hi,

I am currently using the method of seemingly unrelated regressions in R. After searching for a while I am a bit baffled, as I was unable to extract the R square from the analysis. Therefore I would like to ask whether it is generally possible to get the respective R square for each single company the way it is displayed in the summary of the systemfit analysis. I could also copy it from that output, but I thought it might be possible to do it automatically.

Another question concerns the test statistics of the SUR analysis. So far I run a standard F-test. My code for the test statistic currently looks like this:

qvec <- c(0)

for (l in 1:aa) {
  Rmat <- matrix(0, nrow = 1, ncol = a*aa)
  for (j in 0:(a-1)) {
    Rmat[1, l + j*aa] <- 1
  }
  testresult <- linearHypothesis(testsur, Rmat, qvec, test = "F")
  resSurtest[l,1] <- testresult[2, 4]
}

(a = number of firms included; aa = number of variables included in the model)

This way, it is only tested whether the coefficients of a variable across all equations are simultaneously equal to zero. But I am also interested in testing whether the average of the coefficients associated with a designated variable across all equations is equal to zero. Is there a possibility to adjust the test that way?

Sorry for asking all these questions, but I couldn't find anything in the online community and I am still a bit unfamiliar with R in general.

I would appreciate it a lot if someone could provide me some help with these topics.

Thanks a lot for your great support!!

Best regards,
Florian
