I am trying to reproduce a Weighted Least Squares (WLS) fit in Excel, using R for confirmation. I use the following trivial but reproducible dataset as a double check:
x<-c(1,2,3,4,5,6)
y<-c(9,23,30,42,54,66)
w<-1/x
When I fit the WLS model using lm and the weights argument as follows:
WLS<-lm(y~x, weights = w)
summary(WLS)
The output is:
> summary(WLS)
Call:
lm(formula = y ~ x, weights = w)
Weighted Residuals:
1 2 3 4 5 6
-0.50162 1.67280 -1.02017 -0.44984 -0.01447 0.34087
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) -1.6311 1.2241 -1.333 0.254
x 11.1327 0.4181 26.627 1.18e-05 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 1.05 on 4 degrees of freedom
Multiple R-squared: 0.9944, Adjusted R-squared: 0.993
F-statistic: 709 on 1 and 4 DF, p-value: 1.182e-05
I have read here that the Residual standard error reported by R can be calculated manually using the following lines (adapted to the above model for convenience):
k=length(WLS$coefficients)-1 #Subtract one to ignore intercept
SSE=sum(WLS$residuals**2)
n=length(WLS$residuals)
sqrt(SSE/(n-(1+k))) #Residual Standard Error
This calculation is consistent with the formula I have seen in many books (e.g. here). However, when I run this manual calculation, the result is 1.618487 (i.e. not 1.05).
I found, namely here, that WLS can also be performed by applying OLS to transformed variables (model in matrix notation: Y' = X'B + e') using the transformation Y' = W^(1/2) Y; X' = W^(1/2) X; e' = W^(1/2) e. I performed it in R with the following code:
v<-w^(1/2)
x2<-x*v
y2<-y*v
WLS2<-lm(y2~0+v+x2)
i.e. a model with the intercept forced to zero, where v represents the new intercept. Doing so, I get the following output:
> summary(WLS2)
Call:
lm(formula = y2 ~ 0 + v + x2)
Residuals:
1 2 3 4 5 6
-0.50162 1.67280 -1.02017 -0.44984 -0.01447 0.34087
Coefficients:
Estimate Std. Error t value Pr(>|t|)
v -1.6311 1.2241 -1.333 0.254
x2 11.1327 0.4181 26.627 1.18e-05 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 1.05 on 4 degrees of freedom
Multiple R-squared: 0.9982, Adjusted R-squared: 0.9972
F-statistic: 1085 on 2 and 4 DF, p-value: 3.388e-06
Note that the regression coefficients and the Residual standard error are the same, but the R², F-statistic and residuals are different. Also, I do get 1.049927 when I calculate the Residual standard error manually from the residuals of that model (WLS2).
My question: can someone kindly explain why the Residual standard error returned by R is the same for the two models despite their different residuals? Is it correct that the Residual standard error should be 1.618487 (as calculated manually) for the first model (without data transformation)? Is this a problem with how R internally computes WLS? It seems that R omits to back-transform the residuals before calculating the Residual standard error.
Thanks!
CodePudding user response:
Is it correct that the Residual standard error should be 1.618487 (as calculated manually) for the first model (without data transformation)?
No, because you fitted a model with weights but then left the weights out of your computation of sigma.
x <- c(1, 2, 3, 4, 5, 6)
y <- c(9, 23, 30, 42, 54, 66)
w <- 1 / x
WLS <- lm(y ~ x, weights = w)
summary(WLS)$sigma
#> [1] 1.049927
# You computed sigma for lm(y ~ x)
k <- length(WLS$coefficients) - 1 # Subtract one to ignore intercept
SSE <- sum(WLS$residuals**2)
n <- length(WLS$residuals)
sqrt(SSE / (n - (1 + k))) # Residual Standard Error, without weighting
#> [1] 1.618487
# But what you really wanted is to compute sigma for lm(y ~ x, weights = w)
SSE <- sum(w * (WLS$residuals)**2)
sqrt(SSE / (n - (1 + k))) # Residual Standard Error, with weighting
#> [1] 1.049927
Created on 2022-03-20 by the reprex package (v2.0.1)
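For reference, R's built-in helpers give the same number. A minimal sketch, assuming the WLS object fitted above is in scope: weighted.residuals() returns sqrt(w) * residuals, df.residual() returns n - p, and sigma() extracts the residual standard error directly.
# Same computation via stats helpers
sqrt(sum(weighted.residuals(WLS)^2) / df.residual(WLS))
#> [1] 1.049927
sigma(WLS) # residual standard error, extracted directly
#> [1] 1.049927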
Why is the Residual standard error returned by R the same for the two models despite their different residuals?
It's because the second model explicitly includes the weights:
- WLS: y ~ 1 + x with weights = w
- WLS2: sqrt(w) * y ~ 0 + sqrt(w) + sqrt(w) * x with weights = 1
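As a quick sanity check (a sketch, assuming the objects WLS, WLS2 and w from above are still in scope): the ordinary residuals of WLS2 are exactly the weighted residuals of WLS, so both models share the same residual sum of squares and hence the same sigma.
# WLS2's ordinary residuals equal WLS's weighted residuals
all.equal(unname(sqrt(w) * residuals(WLS)), unname(residuals(WLS2)))
#> [1] TRUE
c(sigma(WLS), sigma(WLS2)) # identical residual standard errors
#> [1] 1.049927 1.049927
The R² and F-statistic differ only because WLS2 is fit without an intercept: for zero-intercept models, summary.lm() computes R² against the uncentered total sum of squares, and the F-test covers both coefficients (hence 2 and 4 DF).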