The Comparison of Artificial Neural Networks and Multiple Regression Analysis

In this study, Artificial Neural Networks (ANN) are treated as a predictor, and their forecasting performance is compared with multiple linear regression analysis. ANN are first compared with the Least Squares (LS) technique, one of the basic methods used in regression analysis, and then, considering the case where the data contain outliers, with the Huber, Tukey, and Andrews M-estimators from among the robust regression techniques. The Mean Squared Error (MSE) and the ICOMP (Information Complexity) model selection criterion are used as comparison criteria.

The Comparison of Artificial Neural Networks and Regression Analysis

In this study, Artificial Neural Networks (ANN) are considered as an estimator, and their forecasting performance is compared with multiple linear regression analysis. First, ANN are compared with the Least Squares (LS) technique, which is one of the basic techniques used in regression analysis. Then, after adding outliers to the data, ANN are compared with the Huber, Tukey, and Andrews M-estimators, which are robust regression techniques. Mean Squared Error (MSE) and ICOMP (Information Complexity) are used as comparison criteria.
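The comparison described above can be sketched in code. The following is a minimal illustration rather than the study's actual procedure: it fits ordinary least squares and a Huber M-estimator (via iteratively reweighted least squares) to simulated data containing one-sided outliers, then prints both coefficient estimates. The function names, data, and tuning choices are illustrative assumptions.

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares: minimizes the sum of squared residuals."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def huber_fit(X, y, c=1.345, n_iter=50):
    """Huber M-estimation via iteratively reweighted least squares (IRLS).

    The Huber weight is w(u) = 1 for |u| <= c and c/|u| otherwise, so large
    residuals are downweighted rather than squared; c = 1.345 is the usual
    tuning constant giving ~95% efficiency under normal errors.
    """
    beta = ols_fit(X, y)                                  # start from OLS
    for _ in range(n_iter):
        r = y - X @ beta
        # Robust scale estimate: median absolute deviation (MAD)
        s = np.median(np.abs(r - np.median(r))) / 0.6745
        u = r / max(s, 1e-12)
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))  # Huber weights
        sw = np.sqrt(w)
        # Weighted least squares step
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

# Simulated data: y = 2 + 3x + noise, contaminated with one-sided outliers
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0.0, 10.0, n)
X = np.column_stack([np.ones(n), x])          # design matrix with intercept
beta_true = np.array([2.0, 3.0])
y = X @ beta_true + rng.normal(0.0, 1.0, n)
y[:20] += 50.0                                # contaminate 20 observations

beta_ols = ols_fit(X, y)
beta_hub = huber_fit(X, y)
print("OLS estimate:  ", beta_ols)            # pulled toward the outliers
print("Huber estimate:", beta_hub)            # close to (2, 3)
```

The Tukey (bisquare) and Andrews (wave) M-estimators follow the same IRLS scheme with different weight functions; in the study, the resulting fits are then judged against the ANN forecasts using MSE and the ICOMP criterion.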

___

  • Akaike, H., A new look at the statistical model identification, IEEE Transactions on Automatic Control 19 (6): 716-723, 1974.
  • Bozdoğan, H., ICOMP: A New Model-Selection Criterion, In Classification and Related Methods of Data Analysis, pp. 599-608, 1988.
  • Bozdoğan, H., Akaike's Information Criterion and Recent Developments in Information Complexity, Journal of Mathematical Psychology 44, 62-91, 2000.
  • Chen, B., Pınar, M. Ç., On Newton's Method for Huber's Robust M-Estimation Problems in Linear Regression, Swets & Zeitlinger, Vol. 38, No. 4, pp. 674-684, 1998.
  • Draper, N.R., Smith, H., Applied Regression Analysis, Wiley, 1998.
  • Elmas, Ç., Yapay Sinir Ağları, Seçkin Yayıncılık, 2003.
  • Kohonen, T., State of the Art in Neural Computing, IEEE First International Conference on Neural Networks, 1987.
  • Kumar, U. A., Comparison of Neural Networks and Regression Analysis: A New Insight, Expert Systems with Applications, 29, 424-430, 2005.
  • Liu, Y. X., Zhang, J., Schaeffer, L. R., Yang, R. Q., Zhang, W. L., Short Communication: Optimal Random Regression Models for Milk Production in Dairy Cattle, American Dairy Science Association, 89:2233-2235, 2006.
  • McCulloch, W., Pitts, W., A Logical Calculus of the Ideas Immanent in Nervous Activity, Bulletin of Mathematical Biophysics, 7:115-133, 1943.
  • Minsky, M., Papert, S. A., Perceptrons: An Introduction to Computational Geometry, Cambridge, MIT Press, 1969.
  • Newbold, P., İşletme ve İktisat için İstatistik, İstanbul, Literatür, 2000.
  • Orhunbilge, N., Uygulamalı Regresyon ve Korelasyon Analizi, İstanbul, İ.Ü. İşletme Fakültesi, 2002.
  • Ortiz, M. C., Sarabia, L. A., Herrero, A., A Useful Alternative for the Detection of Outlier Data in Chemical Analysis, Talanta, 70, 499-512, 2006.
  • Öztemel, E., Yapay Sinir Ağları, Papatya Yayıncılık, 2003.
  • Pan, Z., Chen, Y., Kang, L., Zhang, Y., Parameter Estimation by Genetic Algorithms for Nonlinear Regression, Optimization Techniques and Applications, Proc. of ICOPA'95, Vol. 2, 946-953, 1995.
  • Saraç, T., Yapay Sinir Ağları, Seminer Projesi, Gazi Üniversitesi Endüstri Mühendisliği Anabilim Dalı, 2004.
  • Schwarz, G., Estimating the dimension of a model, Annals of Statistics 6 (2): 461-464, 1978.
  • Stern, H. S., Neural Networks in Applied Statistics, Technometrics, 38, 3, 205-214, 1996.
  • Walczak, S., Terry, S., A Comparative Analysis of Regression and Neural Networks for University Admission, Information Sciences, 119, 1-20, 1999.
  • Wang, Y. M., Elhag, T. M. S., A Comparison of Neural Network, Evidential Reasoning and Multiple Regression Analysis in Modelling Bridge Risks, Expert Systems with Applications, 32, 336-348, 2007.