Comparison of the r-(k, d) class estimator with some estimators for multicollinearity under the Mahalanobis loss function
For the case of an ill-conditioned design matrix in the linear regression model, the r-(k, d) class estimator was proposed; it includes the ordinary least squares (OLS) estimator, the principal component regression (PCR) estimator, and the two-parameter class estimator as special cases. In this paper, we evaluate the performance of the r-(k, d) class estimator relative to these estimators under the weighted quadratic loss function whose weight matrix is the inverse of the variance-covariance matrix of the estimator, also known as the Mahalanobis loss function, using the criterion of average loss. Tests for verifying the conditions of superiority of the r-(k, d) class estimator have also been proposed. Finally, a simulation study and an empirical illustration have been carried out to study the performance of the tests and thereby verify the conditions of dominance of the r-(k, d) class estimator over the others under the Mahalanobis loss function, both for artificially generated data sets and for a real data set. To the best of our knowledge, this study provides stronger evidence of the superiority of the r-(k, d) class estimator over the competing estimators than is available in the literature on multicollinearity, through tests for verifying the conditions of dominance.
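Since the abstract leans on the Mahalanobis loss and the average-loss criterion, a small numerical sketch may help fix ideas. The Python snippet below is not the authors' code: it assumes the Mahalanobis loss weights an estimator's error by the inverse of the OLS variance-covariance matrix, i.e. L(β̂) = (β̂ − β)′X′X(β̂ − β)/σ², and it uses one common parameterisation of the r-(k, d) class estimator built from the first r principal components, which reduces to PCR when k = 0, to the two-parameter estimator when r = p, and to OLS when r = p and k = 0. The data-generating scheme, the chosen values of r, k and d, and all function names are illustrative assumptions.

```python
import numpy as np

def mahalanobis_loss(beta_hat, beta, XtX, sigma2):
    # Weighted quadratic loss with weight matrix taken (as an assumption) to be the
    # inverse of Var(OLS) = sigma^2 (X'X)^{-1}, i.e. (b - beta)' X'X (b - beta) / sigma^2.
    diff = beta_hat - beta
    return float(diff @ XtX @ diff) / sigma2

def r_kd_estimator(X, y, r, k, d):
    # One common parameterisation (assumed here) of the r-(k, d) class estimator:
    #   alpha_pcr = (T_r' X'X T_r)^{-1} T_r' X'y
    #   alpha     = (T_r' X'X T_r + k I_r)^{-1} (T_r' X'y + k d alpha_pcr)
    #   beta_hat  = T_r alpha
    # k = 0 gives PCR, r = p gives the two-parameter estimator, r = p and k = 0 gives OLS.
    XtX, Xty = X.T @ X, X.T @ y
    eigval, eigvec = np.linalg.eigh(XtX)
    T_r = eigvec[:, np.argsort(eigval)[::-1][:r]]   # eigenvectors of the r largest eigenvalues
    A_r = T_r.T @ XtX @ T_r
    alpha_pcr = np.linalg.solve(A_r, T_r.T @ Xty)
    alpha = np.linalg.solve(A_r + k * np.eye(r), T_r.T @ Xty + k * d * alpha_pcr)
    return T_r @ alpha

# Average-loss comparison on artificially generated collinear data (illustrative settings).
rng = np.random.default_rng(0)
n, p, sigma = 50, 4, 1.0
beta = np.array([1.0, 2.0, -1.0, 0.5])
losses = {"OLS": [], "PCR (r=3)": [], "r-(k,d) (r=3, k=0.5, d=0.3)": []}
for _ in range(500):
    common = rng.normal(size=(n, 1))                # shared factor induces multicollinearity
    X = common + 0.1 * rng.normal(size=(n, p))
    y = X @ beta + sigma * rng.normal(size=n)
    XtX = X.T @ X
    losses["OLS"].append(
        mahalanobis_loss(np.linalg.solve(XtX, X.T @ y), beta, XtX, sigma**2))
    losses["PCR (r=3)"].append(
        mahalanobis_loss(r_kd_estimator(X, y, r=3, k=0.0, d=0.0), beta, XtX, sigma**2))
    losses["r-(k,d) (r=3, k=0.5, d=0.3)"].append(
        mahalanobis_loss(r_kd_estimator(X, y, r=3, k=0.5, d=0.3), beta, XtX, sigma**2))

for name, vals in losses.items():
    print(f"{name:30s} average Mahalanobis loss: {np.mean(vals):.3f}")
```

Averaging the recorded losses over the Monte Carlo replications mirrors the average-loss criterion described above; the ranking of the estimators that this sketch produces depends entirely on the assumed parameter values and the degree of collinearity, and is not a substitute for the dominance conditions and tests developed in the paper.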