Data for the computational questions below (one column per observation; the first row is the criterion, the remaining rows the predictors):

Y:   27.6  9.4  15.6  20.3  12.3  8.7  7.3  14.9  17.0  -0.8
X1:   1     3    4     5     3    5    7     6     5     4
X2:   4     5    7     5     7    3    5     4     3     2
X3:   4     3    1     4     3    6    7     8     9     0
The multiple correlation (R) is: (check all that apply)

false
true
The correlation between predicted and observed scores.
false
The sum of the simple r's.
false
The highest simple r.
true
Always between 0 and 1 (inclusive).
R is the correlation between predicted and observed scores when there are two or more predictors. It is always between 0 and 1.
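The definition above can be checked directly: fit a multiple regression, then correlate the predicted scores with the observed scores. A minimal sketch with made-up data (the values below are illustrative, not from this exercise):

```python
import numpy as np

# Hypothetical data: two predictors and one criterion (n = 8).
x1 = np.array([1.0, 2, 3, 4, 5, 6, 7, 8])
x2 = np.array([2.0, 1, 4, 3, 6, 5, 8, 7])
y  = np.array([3.0, 2, 6, 5, 9, 8, 12, 10])

# Fit Y' = b1*X1 + b2*X2 + A by least squares.
X = np.column_stack([x1, x2, np.ones_like(x1)])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_pred = X @ b

# R is the correlation between predicted and observed scores.
# With an intercept in the model it is always between 0 and 1.
R = np.corrcoef(y_pred, y)[0, 1]
print(round(R, 4))
```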
In multiple regression there are:

false
false
multiple criterion variables.
true
multiple predictor variables.
false
two predictor variables.
There are not necessarily exactly two. There are two or more.
Having two or more predictor variables is what distinguishes multiple regression from simple regression.
The difference between a regression weight and a beta weight is:

false
false
A regression weight assumes linearity.
false
A beta weight is for the population while a regression weight is for the sample.
For most statistics the Greek symbol represents the population parameter. Confusingly, this is not the case with beta weights.
false
A regression weight is less biased.
true
A beta weight is a standardized regression weight.
A beta weight is a standardized regression weight.
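The relationship between the two weights can be demonstrated numerically: the beta weight is the slope you get after converting both variables to z-scores, which equals b scaled by the ratio of the standard deviations. A sketch with hypothetical data:

```python
import numpy as np

# Hypothetical data for one predictor and one criterion.
x = np.array([1.0, 2, 3, 4, 5, 6])
y = np.array([2.0, 4, 5, 4, 6, 7])

# Unstandardized regression weight b (simple regression slope).
b = np.polyfit(x, y, 1)[0]

# Beta weight: the slope after standardizing both variables.
zx = (x - x.mean()) / x.std(ddof=1)
zy = (y - y.mean()) / y.std(ddof=1)
beta = np.polyfit(zx, zy, 1)[0]

# In simple regression the beta weight equals Pearson's r,
# and equals b times (sd of x / sd of y).
r = np.corrcoef(x, y)[0, 1]
assert np.isclose(beta, r)
assert np.isclose(beta, b * x.std(ddof=1) / y.std(ddof=1))
```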
In the regression equation Y' = b1X1 + b2X2 + A, if b1 = 5, then how much would the predicted value of Y differ for two observations that had the same value of X2 but differed by 7 on X1?

35
0
A difference of 1 on X1 is associated with a change of 5 on Y', so a difference of 7 would be associated with a predicted difference of 5 x 7 = 35.
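The arithmetic, spelled out: holding X2 constant, the predicted difference on Y is the regression weight times the difference on X1.

```python
b1 = 5            # weight on X1 (from the question)
delta_x1 = 7      # difference on X1; X2 held constant
predicted_difference = b1 * delta_x1
print(predicted_difference)  # 35
```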
The difference between a regression weight and a regression coefficient is: (check all that apply)

false
false
The regression weight is more important.
false
The regression weight is unbiased.
false
The regression weight is added rather than multiplied.
They are synonymous.
A regression weight is a partial slope because:

false
true
It is the slope when the part of the predictor independent of the other predictors is used to predict the criterion.
false
It is only one of several slopes, so it is only part of the prediction equation.
false
It is the relationship between the significant part of a predictor and the criterion.
false
It is only an estimate of the true slope and so is a partial solution.
It is the slope when the part of the predictor independent of the other predictors is used to predict the criterion. The other predictors are "partialled out."
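"Partialling out" can be verified numerically: regress X1 on the other predictor, keep the residuals (the part of X1 independent of X2), and the simple slope of Y on those residuals equals the partial slope b1 from the full regression. A sketch with hypothetical data:

```python
import numpy as np

# Hypothetical data: criterion y, predictors x1 and x2.
x1 = np.array([1.0, 2, 3, 4, 5, 6])
x2 = np.array([2.0, 1, 4, 3, 6, 5])
y  = 2 * x1 + 0.5 * x2 + 1   # exact linear relation for a clean check

# Full multiple regression: Y' = b1*X1 + b2*X2 + A.
X = np.column_stack([x1, x2, np.ones_like(x1)])
b1_full = np.linalg.lstsq(X, y, rcond=None)[0][0]

# Partial out x2: keep only the part of x1 independent of x2.
X2 = np.column_stack([x2, np.ones_like(x2)])
x1_resid = x1 - X2 @ np.linalg.lstsq(X2, x1, rcond=None)[0]

# The simple slope of y on that residual equals the partial slope b1.
b1_partial = np.polyfit(x1_resid, y, 1)[0]
assert np.isclose(b1_full, b1_partial)
```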
Find the value of the multiple correlation (R).
You should use a computer to find the solution.

R = 0.7575
.7575
0.005
These are the same data as in the previous question. Find the value of b2.
You should use a computer to find the solution.

b2 = 1.6848
1.6848
0.005
The sum of squares explained is 200 and the sum of squares error is 100. What is the R^2?

It is the sum of squares explained (200) divided by the sum of squares total (200 + 100). The answer is 0.667.
0.667
0.005
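The computation is a single ratio of the sums of squares given in the question:

```python
ss_explained = 200
ss_error = 100

# R^2 = SS explained / SS total, where SS total = SS explained + SS error.
r_squared = ss_explained / (ss_explained + ss_error)
print(round(r_squared, 3))  # 0.667
```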
The sum of the simple r^2's is typically

false
false
less than R^2
There are cases in which it is less than R squared, but this is relatively rare.
false
equal to R^2
It is generally not the case that the sum will be equal to R squared. However, it will be if the predictor variables are uncorrelated.
true
greater than R^2
Greater than R squared. Typically there is overlap in the variance explained by the predictors.
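The overlap effect is easy to demonstrate: with highly correlated predictors, each simple r^2 double-counts the shared variance, so their sum exceeds R^2. A sketch with hypothetical data:

```python
import numpy as np

# Hypothetical data with highly correlated predictors.
x1 = np.array([1.0, 2, 3, 4, 5, 6, 7, 8, 9, 10])
x2 = x1 + np.array([0.1, -0.1] * 5)   # nearly identical to x1
y  = x1.copy()                        # criterion driven by x1

# Simple r^2 for each predictor separately.
r1_sq = np.corrcoef(x1, y)[0, 1] ** 2
r2_sq = np.corrcoef(x2, y)[0, 1] ** 2

# R^2 from the multiple regression on both predictors.
X = np.column_stack([x1, x2, np.ones_like(x1)])
y_pred = X @ np.linalg.lstsq(X, y, rcond=None)[0]
R_sq = np.corrcoef(y_pred, y)[0, 1] ** 2

# The predictors overlap, so the sum of simple r^2's exceeds R^2.
assert r1_sq + r2_sq > R_sq
```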
Which of the following assumptions pertain to inferential statistics in multiple regression?

false
false
The predictor variables are normally distributed.
false
The criterion variable is normally distributed.
true
The errors of prediction (the residuals) are normally distributed.
true
The variance about the regression line is the same for all predicted values.
true
The predictor variables are linearly related to the criterion.
Residuals are normally distributed. The variance about the regression line is the same for all predicted values (homoscedasticity). The predictor variables are linearly related to the criterion.
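These assumptions are usually checked on the residuals after fitting. A minimal sketch with hypothetical data: with an intercept in the model, least squares forces the residuals to sum to zero, and comparing residual spread across the range of predicted values gives a rough check of homoscedasticity.

```python
import numpy as np

# Hypothetical fitted model; residual checks for the assumptions above.
x1 = np.array([1.0, 2, 3, 4, 5, 6, 7, 8])
x2 = np.array([2.0, 1, 4, 3, 6, 5, 8, 7])
y  = np.array([3.1, 1.9, 6.2, 4.8, 9.1, 7.9, 12.2, 9.8])

X = np.column_stack([x1, x2, np.ones_like(x1)])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_pred = X @ b
residuals = y - y_pred

# Least squares with an intercept makes the residuals sum to zero.
assert np.isclose(residuals.sum(), 0, atol=1e-8)

# Rough homoscedasticity check: residual spread in the lower vs.
# upper half of the predicted values should be comparable.
order = np.argsort(y_pred)
lower, upper = residuals[order[:4]], residuals[order[4:]]
print(lower.std(ddof=1), upper.std(ddof=1))
```

Normality of the residuals would typically be examined with a normal quantile plot or a formal test.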