Psychometric Characteristics of a Test and Individualized Feedback via DINA Model: TIMSS 2015

The aim of this study is to show how the DINA model can be used to describe the psychometric characteristics of a test and to give students individualized feedback. The data come from the multiple-choice items in Booklet 1 of the eighth-grade mathematics test of the TIMSS 2015 Turkey sample; 435 students took this booklet. The DINA model parameters were estimated in R (RStudio) and interpreted by comparison with CTT and IRT parameters. In addition, an example diagnostic profile report for individualized feedback was constructed. The results show that the DINA model fits the data well (SRMSR, MADcor, MADQ3, MADaQ3, and RMSEA all below 0.05), and its parameters are similar to those of CTT and IRT. The reliability values are Pc = 0.913 for the DINA model, KR-20 = 0.80 for CTT, and marginal reliability = 0.70 for IRT, so the DINA model yields the highest reliability. It is therefore suggested that the DINA model can be used to report the psychometric characteristics of a test and, unlike CTT and IRT, to give students detailed individualized feedback.
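The DINA model classifies an examinee's response probability by whether they have mastered every attribute an item requires. As a minimal illustration of this item response function (the Q-matrix row and the slip/guessing values below are hypothetical, not the TIMSS 2015 estimates from the study):

```python
import numpy as np

def dina_prob(alpha, q, slip, guess):
    """P(correct answer) under the DINA model: (1 - s)^eta * g^(1 - eta),
    where eta = 1 iff the examinee masters all attributes the item requires."""
    eta = int(np.all(alpha >= q))  # 1 if every required attribute is mastered
    return (1 - slip) ** eta * guess ** (1 - eta)

# Hypothetical item requiring attributes 1 and 3 (Q-matrix row)
q = np.array([1, 0, 1])
slip, guess = 0.1, 0.2

master = np.array([1, 0, 1])     # masters both required attributes
nonmaster = np.array([1, 0, 0])  # lacks attribute 3

print(dina_prob(master, q, slip, guess))     # 1 - slip = 0.9
print(dina_prob(nonmaster, q, slip, guess))  # guess = 0.2
```

A non-master answers correctly only by guessing, regardless of how many (but not all) required attributes they hold; this conjunctive assumption is what distinguishes DINA from compensatory diagnostic models.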

___

Türk Eğitim Bilimleri Dergisi
  • Issues per year: 3
  • First published: 2003
  • Publisher: Gazi Üniversitesi Rektörlüğü