Regression Outputs

- Multiple R - The correlation coefficient between the observed and predicted values of the dependent variable. It ranges in value from 0 to 1. A small value indicates that there is little or no linear relationship between the dependent variable and the independent variables. (See the first sketch after this list.)
- R-Square - The square of Multiple R, i.e. the square of the correlation coefficient. It ranges from zero to one, with zero indicating that the proposed model does not improve prediction over the mean model and one indicating perfect prediction.
- Adjusted R-Square - Adjusted R-squared compensates for the addition of variables to the model. It will decrease as predictors are added if the increase in model fit does not make up for the loss of degrees of freedom, and it will increase if the added predictors improve the fit enough to be worthwhile.
- Significance F - Tells us whether the output is by chance: the smaller the value, the greater the probability that the output is not by chance and that the model is a good fit. (See the second sketch after this list.)
- P-Value - Reported for each coefficient: the smaller the value, the greater the probability that the coefficient's effect is not by chance.
- F - Regression Mean Square / Residual Mean Square.
- Sum of Squares due to Regression - A quantity used to describe how well a regression model represents the data being modeled. The explained sum of squares measures how much variation there is in the modeled values and is compared to the total sum of squares; it is also used in t-test and F-test calculations. (See the third sketch after this list.)
- Residuals - The differences between the observed values and those predicted by the regression equation.
- Residual Sum of Squares - Measures the variation in the modeling errors, i.e. the part of the variation in the observed data that the model does not explain. A smaller residual sum of squares is ideal.
- Mean Square - Sum of Squares / Degrees of Freedom.
- Residual Mean Square - Residual Sum of Squares / Residual Degrees of Freedom. A smaller residual mean square is ideal.
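
To make the first three measures concrete, here is a minimal sketch in Python, assuming NumPy and small made-up observed/predicted values; the number of predictors k is an assumption chosen only to show the adjusted R-squared penalty.

```python
import numpy as np

# Hypothetical observed values of the dependent variable and the
# corresponding predictions from a fitted model with k predictors.
observed  = np.array([3.1, 4.0, 5.2, 6.1, 7.3, 8.2, 9.0, 10.4])
predicted = np.array([3.3, 3.9, 5.0, 6.3, 7.1, 8.4, 8.8, 10.2])
n, k = len(observed), 2          # sample size and (assumed) number of predictors

multiple_r   = np.corrcoef(observed, predicted)[0, 1]      # correlation of observed vs. predicted
r_square     = multiple_r ** 2                             # square of Multiple R
adj_r_square = 1 - (1 - r_square) * (n - 1) / (n - k - 1)  # penalises added predictors

print(f"Multiple R:        {multiple_r:.4f}")
print(f"R-Square:          {r_square:.4f}")
print(f"Adjusted R-Square: {adj_r_square:.4f}")
```

Holding R-Square fixed, increasing k makes the adjusted value smaller, which is exactly the penalty for adding predictors described above.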
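
Significance F, the coefficient p-values and the F statistic are easiest to read off a fitted model. The sketch below assumes the statsmodels package and uses randomly generated data purely for illustration; the coefficients 1.5, 2.0 and -0.5 are made up.

```python
import numpy as np
import statsmodels.api as sm

# Made-up data: 50 observations, two predictors, known coefficients plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = 1.5 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=50)

model = sm.OLS(y, sm.add_constant(X)).fit()

print("F:                 ", model.fvalue)    # Regression Mean Square / Residual Mean Square
print("Significance F:    ", model.f_pvalue)  # probability the overall fit arose by chance
print("P-values:          ", model.pvalues)   # one per coefficient; smaller is stronger evidence
print("R-Square:          ", model.rsquared)
print("Adjusted R-Square: ", model.rsquared_adj)
```

model.summary() prints the full regression table; there the Significance F value appears under the label Prob (F-statistic).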
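
For the remaining ANOVA quantities (sum of squares due to regression, residuals, residual sum of squares, mean squares and F), here is a hand-computed sketch using a simple one-predictor least-squares line on made-up data:

```python
import numpy as np

# Made-up data with one predictor, fitted by a simple least-squares line.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([3.1, 4.0, 5.2, 6.1, 7.3, 8.2, 9.0, 10.4])
slope, intercept = np.polyfit(x, y, 1)
predicted = intercept + slope * x
n, k = len(y), 1                                      # sample size, number of predictors

residuals     = y - predicted                         # observed minus predicted values
ss_regression = np.sum((predicted - y.mean()) ** 2)   # explained (regression) sum of squares
ss_residual   = np.sum(residuals ** 2)                # residual sum of squares (smaller is better)
ss_total      = np.sum((y - y.mean()) ** 2)           # total sum of squares

ms_regression = ss_regression / k                     # Mean Square = Sum of Squares / d.f.
ms_residual   = ss_residual / (n - k - 1)             # Residual Mean Square
f_stat        = ms_regression / ms_residual           # F = Regression MS / Residual MS

print(f"SS regression: {ss_regression:.3f}  SS residual: {ss_residual:.3f}  SS total: {ss_total:.3f}")
print(f"MS regression: {ms_regression:.3f}  MS residual: {ms_residual:.3f}  F: {f_stat:.3f}")
```

Because the line is fitted by least squares with an intercept, SS regression + SS residual equals SS total, which ties the explained and residual sums of squares back to the total variation in the observed data.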