Sunday, March 2, 2014

Multiple Regression

Concepts in Multiple Regression:
Assumptions.
·         For each value of the independent variable, the distribution of the dependent variable must be normal. 
·         The variance of the distribution of the dependent variable should be constant for all values of the independent variable. 
·         The relationship between the dependent variable and each independent variable should be linear, and all observations should be independent.
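In symbols, these assumptions describe the usual linear model (a standard formulation, added here for reference, not part of the original dialog text):

Y_i = \beta_0 + \beta_1 X_{i1} + \dots + \beta_p X_{ip} + \varepsilon_i, \qquad \varepsilon_i \overset{iid}{\sim} N(0, \sigma^2)

Normality, constant variance (\sigma^2 does not depend on the X values), linearity, and independence are all statements about the error term \varepsilon_i.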

Distances.
Measures to identify cases with unusual combinations of values for the independent variables and cases that may have a large impact on the regression model.
Mahalanobis.
A measure of how much a case’s values on the independent variables differ from the average of all cases. A large Mahalanobis distance identifies a case as having extreme values on one or more of the independent variables.
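With x_i the vector of a case's independent-variable values, \bar{x} the vector of their means, and S the sample covariance matrix of the independent variables, the usual definition is

D_i^2 = (x_i - \bar{x})^{\mathsf{T}} S^{-1} (x_i - \bar{x})

which is why the distance grows when a case sits far from the center of the predictor cloud.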
Cook’s.
A measure of how much the residuals of all cases would change if a particular case were excluded from the calculation of the regression coefficients. A large Cook’s D indicates that excluding a case from computation of the regression statistics changes the coefficients substantially.
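In terms of the residual e_i, the leverage h_{ii}, the mean squared error s^2, and the number of parameters p (including the constant), Cook's distance is

D_i = \frac{e_i^2}{p\, s^2} \cdot \frac{h_{ii}}{(1 - h_{ii})^2}

Values that stand well above the bulk of the other cases (a common rough cutoff is 4/N, or even 1) deserve inspection.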
Leverage values.
Measures the influence of a point on the fit of the regression. The centered leverage ranges from 0 (no influence on the fit) to (N-1)/N.
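The (uncentered) leverage is the i-th diagonal element of the hat matrix,

h_{ii} = x_i^{\mathsf{T}} (X^{\mathsf{T}} X)^{-1} x_i, \qquad H = X (X^{\mathsf{T}} X)^{-1} X^{\mathsf{T}}

and the centered leverage is h_{ii} - 1/N, which is what gives the 0 to (N-1)/N range quoted above.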
Prediction Intervals.
The upper and lower bounds for both mean and individual prediction intervals.
Mean.
Lower and upper bounds (two variables) for the prediction interval of the mean predicted response.
Individual.
Lower and upper bounds (two variables) for the prediction interval of the dependent variable for a single case.
Confidence Interval.
Enter a value between 1 and 99.99 to specify the confidence level for the two Prediction Intervals. Mean or Individual must be selected before entering this value. Typical confidence interval values are 90, 95, and 99.
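For a new predictor vector x_0 with fitted value \hat{y}_0 = x_0^{\mathsf{T}} \hat{\beta}, the two intervals at the chosen confidence level are

mean response:       \hat{y}_0 \pm t_{N-p,\,1-\alpha/2}\; s \sqrt{x_0^{\mathsf{T}} (X^{\mathsf{T}} X)^{-1} x_0}
individual response: \hat{y}_0 \pm t_{N-p,\,1-\alpha/2}\; s \sqrt{1 + x_0^{\mathsf{T}} (X^{\mathsf{T}} X)^{-1} x_0}

where p counts all parameters including the constant; the extra 1 under the square root is what makes the individual interval wider than the mean interval.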
Residuals.
The actual value of the dependent variable minus the value predicted by the regression equation.
Unstandardized.
The difference between an observed value and the value predicted by the model.
Standardized.
The residual divided by an estimate of its standard deviation. Standardized residuals, which are also known as Pearson residuals, have a mean of 0 and a standard deviation of 1.
Studentized.
The residual divided by an estimate of its standard deviation that varies from case to case, depending on the distance of each case’s values on the independent variables from the means of the independent variables.
Deleted.
The residual for a case when that case is excluded from the calculation of the regression coefficients. It is the difference between the value of the dependent variable and the adjusted predicted value.
Studentized deleted.
The deleted residual for a case divided by its standard error. The difference between a Studentized deleted residual and its associated Studentized residual indicates how much difference eliminating a case makes on its own prediction.
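Writing e_i = y_i - \hat{y}_i for the raw (unstandardized) residual, h_{ii} for the leverage, s^2 for the mean squared error, and s_{(i)}^2 for the mean squared error with case i excluded, the variants above are

standardized:         e_i / s
studentized:          e_i / ( s \sqrt{1 - h_{ii}} )
deleted:              e_i / ( 1 - h_{ii} )
studentized deleted:  e_i / ( s_{(i)} \sqrt{1 - h_{ii}} )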
Influence Statistics.
The change in the regression coefficients (DfBeta[s]) and predicted values (DfFit) that results from the exclusion of a particular case. Standardized DfBetas and DfFit values are also available along with the covariance ratio.
DfBeta(s).
The difference in beta value is the change in the regression coefficient that results from the exclusion of a particular case. A value is computed for each term in the model, including the constant.
Standardized DfBeta.
Standardized difference in beta value. The change in the regression coefficient that results from the exclusion of a particular case. You may want to examine cases with absolute values greater than 2 divided by the square root of N, where N is the number of cases. A value is computed for each term in the model, including the constant.
DfFit.
The difference in fit value is the change in the predicted value that results from the exclusion of a particular case.
Standardized DfFit.
Standardized difference in fit value. The change in the predicted value that results from the exclusion of a particular case. You may want to examine standardized values which in absolute value exceed 2 times the square root of p/N, where p is the number of parameters in the model and N is the number of cases.
Covariance ratio.
The ratio of the determinant of the covariance matrix with a particular case excluded from the calculation of the regression coefficients to the determinant of the covariance matrix with all cases included. If the ratio is close to 1, the case does not significantly alter the covariance matrix.
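A minimal sketch of how these case-level diagnostics can be computed outside the SPSS dialogs, using Python/statsmodels on simulated data (the toy data and variable names are made up for illustration):

import numpy as np
import statsmodels.api as sm

# Hypothetical data: y regressed on two predictors.
rng = np.random.default_rng(0)
x = rng.normal(size=(50, 2))
y = 1.0 + x @ np.array([0.5, -0.3]) + rng.normal(scale=0.8, size=50)

X = sm.add_constant(x)            # design matrix with intercept
fit = sm.OLS(y, X).fit()
infl = fit.get_influence()        # OLSInfluence object

leverage   = infl.hat_matrix_diag             # h_ii (uncentered leverage)
cooks_d, _ = infl.cooks_distance              # Cook's distance per case
dfbetas    = infl.dfbetas                     # standardized DfBeta, one column per coefficient
dffits, _  = infl.dffits                      # standardized DfFit
cov_ratio  = infl.cov_ratio                   # covariance ratio

n, p = X.shape
flag_dfbetas = np.abs(dfbetas).max(axis=1) > 2 / np.sqrt(n)   # threshold from the text
flag_dffits  = np.abs(dffits) > 2 * np.sqrt(p / n)            # threshold from the text
print(np.where(flag_dfbetas | flag_dffits)[0])                # cases worth a closer look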

Regression Coefficients.
Estimates displays Regression coefficient B, standard error of B, standardized coefficient beta, t value for B, and two-tailed significance level of t.
Confidence intervals.
Displays confidence intervals with the specified level of confidence for each regression coefficient or a covariance matrix.
Covariance matrix.
Displays a variance-covariance matrix of regression coefficients with covariances off the diagonal and variances on the diagonal. A correlation matrix is also displayed.
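In matrix form, the estimates in this table come from

\hat{\beta} = (X^{\mathsf{T}} X)^{-1} X^{\mathsf{T}} y, \qquad \widehat{\operatorname{Var}}(\hat{\beta}) = s^2 (X^{\mathsf{T}} X)^{-1}

so the standard error of each B is the square root of the corresponding diagonal element of s^2 (X^{\mathsf{T}} X)^{-1}, t = B / se(B), and the standardized coefficient beta rescales B by the ratio of the predictor's standard deviation to the dependent variable's standard deviation.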
Model fit.
The variables entered and removed from the model are listed, and the following goodness-of-fit statistics are displayed: multiple R, R² and adjusted R², standard error of the estimate, and an analysis-of-variance table.
R squared change.
The change in the R² statistic that is produced by adding or deleting an independent variable. If the R² change associated with a variable is large, that means that the variable is a good predictor of the dependent variable.
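For reference, with SSE the residual sum of squares, SST the total sum of squares, N cases and p predictors (excluding the constant):

R^2 = 1 - \frac{SSE}{SST}, \qquad R^2_{adj} = 1 - (1 - R^2)\,\frac{N-1}{N-p-1}

and the R² change for a variable is simply the R² of the model containing it minus the R² of the model without it.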
Descriptives.
Provides the number of valid cases, the mean, and the standard deviation for each variable in the analysis. A correlation matrix with a one-tailed significance level and the number of cases for each correlation are also displayed.
Partial Correlation.
The correlation that remains between two variables after removing the correlation that is due to their mutual association with the other variables. The correlation between the dependent variable and an independent variable when the linear effects of the other independent variables in the model have been removed from both.
Part Correlation.
The correlation between the dependent variable and an independent variable when the linear effects of the other independent variables in the model have been removed from the independent variable. It is related to the change in R-squared when a variable is added to an equation. Sometimes called the semipartial correlation.
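Both can be written through the R² change for a predictor X_j. If R^2 is the full-model value and R^2_{(-j)} is the value with X_j dropped, then

part (semipartial):  sr_j^2 = R^2 - R^2_{(-j)}
partial:             pr_j^2 = \frac{R^2 - R^2_{(-j)}}{1 - R^2_{(-j)}}

which is exactly the "related to the change in R-squared" remark above.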
Collinearity diagnostics.
Collinearity (or multicollinearity) is the undesirable situation when one independent variable is a linear function of other independent variables. Eigenvalues of the scaled and uncentered cross-products matrix, condition indices, and variance-decomposition proportions are displayed along with variance inflation factors (VIF) and tolerances for individual variables.
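The tolerance of predictor X_j is 1 - R_j^2, where R_j^2 comes from regressing X_j on all the other predictors; the variance inflation factor is its reciprocal,

VIF_j = \frac{1}{1 - R_j^2}

and the condition indices are \sqrt{\lambda_{max}/\lambda_k} for the eigenvalues \lambda_k of the scaled, uncentered cross-products matrix. Large VIFs (tolerances near 0) and large condition indices both signal collinearity.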
Residuals.
Displays the Durbin-Watson test for serial correlation of the residuals and casewise diagnostics for the cases meeting the selection criterion (outliers above n standard deviations).
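The Durbin-Watson statistic is computed from consecutive residuals,

DW = \frac{\sum_{t=2}^{N} (e_t - e_{t-1})^2}{\sum_{t=1}^{N} e_t^2}

and takes values between 0 and 4; values near 2 indicate little first-order serial correlation.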



Multiple Regression with Matrices

Thanks, Büşra...
Büşra's Work

The Factor Model and Its Variance

The classical test theory model proposed by Spearman is X = T + E.
However, this model is defined for parallel measurements.
For equivalent measurements, the model becomes X = a + bT + E.
If this model is written in factor-analytic notation:
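The figure that originally followed is missing here; a standard way to write X = a + bT + E in factor-analytic notation (my reconstruction, not the original slide) is

X = \mu + \lambda\,\eta + \varepsilon

with the latent factor \eta playing the role of the true score T, the loading \lambda the role of b, and \mu the role of a. The implied variance decomposition is

\operatorname{Var}(X) = \lambda^2 \operatorname{Var}(\eta) + \operatorname{Var}(\varepsilon) = b^2\,\sigma_T^2 + \sigma_E^2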
Please obtain and report, for your own data set, the error variance between the original model (X = a + bT + E) and the estimated model (X = a + bT).
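A possible way to carry out this exercise, sketched in Python on simulated scores (the simulated data and the use of np.polyfit are my own choices, not part of the assignment): fit X = a + bT by least squares and report the variance of the remaining error component.

import numpy as np

rng = np.random.default_rng(1)
t = rng.normal(50, 10, size=200)           # hypothetical true scores T
x = 5 + 0.9 * t + rng.normal(0, 4, 200)    # observed scores X = a + bT + E

b, a = np.polyfit(t, x, 1)                 # least-squares estimates of b and a
x_hat = a + b * t                          # estimated model X = a + bT
e = x - x_hat                              # estimated error component E

error_variance = e.var(ddof=2)             # error variance around a + bT
print(a, b, error_variance)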

Monday, February 24, 2014

Matrix Algebra and Its Concepts

- Matrix Addition
- Matrix Multiplication
- Scalar Multiplication of a Matrix
- Matrix Transpose
- Trace of a Matrix
- Matrix Determinants
- Matrix Inverse
(a short numpy sketch of these operations follows the concept list below)
--------------------
Concepts:
- Square Matrix
- Diagonal Matrix
- Identity Matrix
- Zero Matrix
- Symmetric Matrix
- Idempotent Matrix
- Lower/Upper Triangular Matrix
- Orthogonal Matrix
- Eigenvalues and Eigenvectors
- Quadratic Form

- Positive Definite
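A small numpy sketch (my own illustration, not from the course handouts) that exercises the operations and concepts listed above:

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.array([[1.0, 4.0],
              [2.0, 5.0]])

A + B                           # matrix addition
A @ B                           # matrix multiplication
3 * A                           # scalar multiplication
A.T                             # transpose
np.trace(A)                     # trace
np.linalg.det(A)                # determinant
np.linalg.inv(A)                # inverse (A must be nonsingular)

I = np.eye(2)                   # identity matrix
Z = np.zeros((2, 2))            # zero matrix
S = A @ A.T                     # symmetric (and positive semidefinite) matrix
vals, vecs = np.linalg.eigh(S)  # eigenvalues and eigenvectors of a symmetric matrix

x = np.array([1.0, 2.0])
q = x @ S @ x                   # quadratic form x' S x; nonnegative since S is PSD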

For details:
Matrix Algebra for Statistics
Matrix Algebra Handout

Sunday, December 22, 2013