Standard Error of Coefficients in Regression
1- Regress the dependent variable Y on the independent variables X1, X2, X3, …
2- Sum of Squared Residuals (SS Residual) = Σ (Yi(predicted) − Yi)²
3- Mean Square Error (MSE) = Sum of Squared Residuals / (N − K − 1)
N = number of observations, K = number of regressors (as above)
4- To get the standard error of the slope of X1, regress X1 (dependent) on X2, X3, X4, … (independent).
a- Sum of Squared Residuals (SS Residual) of this auxiliary regression = Σ (X1i(predicted) − X1i)²
b- Standard Error of the slope of X1 = Sqrt( MSE from step 3 / SS Residual from step 4a )
5- T-Statistic = Coefficient / Standard Error.
6- P-Value in Excel: =TDIST(absolute value of t-statistic, degrees of freedom, 2), i.e., a two-tailed test with N − K − 1 degrees of freedom.
Repeat steps 4-6 for the statistics of X2: regress X2 (dependent) on X1 and the other regressors (independent).
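A minimal Python sketch of steps 1-6 above (numpy/scipy; the data and variable names are made up for illustration, not from the post):

import numpy as np
from scipy import stats

# Made-up example data: N observations, K = 2 regressors X1, X2
rng = np.random.default_rng(0)
N, K = 50, 2
X1 = rng.normal(size=N)
X2 = rng.normal(size=N)
Y = 1.0 + 2.0 * X1 - 0.5 * X2 + rng.normal(size=N)

# Step 1: regress Y on X1, X2 (with intercept)
X = np.column_stack([np.ones(N), X1, X2])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Steps 2-3: sum of squared residuals and mean square error
resid = Y - X @ beta
ss_resid = np.sum(resid ** 2)
mse = ss_resid / (N - K - 1)

# Step 4: auxiliary regression of X1 on the other regressor(s), with intercept
X_aux = np.column_stack([np.ones(N), X2])
gamma, *_ = np.linalg.lstsq(X_aux, X1, rcond=None)
ss_resid_aux = np.sum((X1 - X_aux @ gamma) ** 2)

# Step 4b: standard error of the slope of X1
se_b1 = np.sqrt(mse / ss_resid_aux)

# Step 5: t-statistic = coefficient / standard error
t_b1 = beta[1] / se_b1

# Step 6: two-tailed p-value with N - K - 1 degrees of freedom
p_b1 = 2 * stats.t.sf(abs(t_b1), df=N - K - 1)

print(se_b1, t_b1, p_b1)

The resulting standard error, t-statistic and p-value can be cross-checked against Excel's LINEST output or the summary from statsmodels' OLS.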
Standard Error of Multiple Regression (Matrix Form)
Y = b0 + b1*X1 + b2*X2
- Note: It is possible for variables to be dependent but uncorrelated, since correlation only measures linear dependence. Jointly normally distributed RVs are a convenient special case: if they are uncorrelated, they are also independent.
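A quick sketch of the first point (simulated data, my own choice of example): Y = X² is completely determined by X, yet the two are essentially uncorrelated.

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = x ** 2            # Y is a deterministic function of X, so they are dependent

# Sample correlation is close to 0 because Cov(X, X^2) = E[X^3] = 0 for symmetric X
print(np.corrcoef(x, y)[0, 1])

Note that X and X² here are not jointly normal, so the normal special case above does not apply.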
- Coefficients can be significant despite a low R2, since R2 does not take the intercept into account.
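As an illustration of the first half of this point (simulated data, parameters chosen by me): with a weak signal, heavy noise and a large sample, the slope is highly significant even though R2 is tiny.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 10_000
x = rng.normal(size=n)
y = 0.1 * x + rng.normal(size=n)   # weak signal, lots of noise

X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
mse = np.sum(resid ** 2) / (n - 2)

# SE of the slope, t-statistic and p-value (same recipe as steps 4-6 above)
se_slope = np.sqrt(mse / np.sum((x - x.mean()) ** 2))
t = beta[1] / se_slope
p = 2 * stats.t.sf(abs(t), df=n - 2)

r2 = 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
print(r2, t, p)   # R2 is around 0.01, yet the slope is strongly significant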
- For the equation Y = b0 + b1*X1 + b2*X2, the matrix-form derivation (squaring both sides) is shown in the linked image:
https://drive.google.com/file/d/0Bx3mfFH5R-y3c0pOQkc5VkJNVlk/view?usp=sharing
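A minimal matrix-form sketch, assuming the standard OLS result Var(b) = MSE * inverse(X'X), whose diagonal square roots are the coefficient standard errors; the data below are made up and not taken from the linked derivation:

import numpy as np

rng = np.random.default_rng(3)
N = 50
X1 = rng.normal(size=N)
X2 = rng.normal(size=N)
Y = 1.0 + 2.0 * X1 - 0.5 * X2 + rng.normal(size=N)

# Design matrix with intercept column: Y = b0 + b1*X1 + b2*X2
X = np.column_stack([np.ones(N), X1, X2])

# OLS coefficients: b = inverse(X'X) X'Y
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ Y

# Mean square error with N - K - 1 degrees of freedom (K = 2 regressors)
resid = Y - X @ b
mse = np.sum(resid ** 2) / (N - 2 - 1)

# Variance-covariance matrix of the coefficients: MSE * inverse(X'X)
cov_b = mse * XtX_inv
se_b = np.sqrt(np.diag(cov_b))   # standard errors of b0, b1, b2

print(b, se_b)

These standard errors agree with the ones produced by the auxiliary-regression recipe in steps 1-6 above.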