MEASUREMENT INVARIANCE OF TURKISH “CENTRAL EXAM FOR SECONDARY EDUCATION” BY SPECIFIC LEARNING DISABILITY

Ensuring measurement invariance for students with disabilities, and analyzing it scientifically, is critical for fair measurement in large-scale testing. Specific learning disability constitutes the largest of the disability groups. This study examined the measurement invariance of the Turkish Central Exam for Secondary Education according to whether or not students have a specific learning disability. A total of 994 students diagnosed with a specific learning disability formed the focal group, while 1,000 students without any disability constituted the reference group. The Mantel-Haenszel and Lord’s chi-square methods were used to determine whether the items in each subtest showed Differential Item Functioning (DIF). In addition, Multigroup Confirmatory Factor Analysis was applied to test the configural, metric, scalar, and strict invariance of the subtests in sequence. According to both methods, 34 of the 90 items showed DIF; eleven of these showed moderate and five large DIF. Metric invariance did not hold in any of the subtests: factor loadings varied between the groups in every subtest. These results indicate that the exam does not achieve measurement invariance with respect to specific learning disability.
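Since the abstract names the Mantel-Haenszel procedure without detail, a minimal sketch of the standard computation may help orient readers: score-stratified 2×2 tables, the continuity-corrected chi-square, the common odds ratio, and the ETS delta transform. This is a generic illustration, not code from the study; the function name and the simulated data are ours.

```python
import numpy as np

def mantel_haenszel_dif(item, total, group):
    """Mantel-Haenszel DIF check for one dichotomous item.

    item  -- 0/1 responses to the studied item
    total -- matching criterion, e.g. total subtest score
    group -- 0 = reference group, 1 = focal group
    """
    num = den = 0.0                   # terms of the common odds ratio
    a_sum = ea_sum = var_sum = 0.0    # terms of the MH chi-square
    for k in np.unique(total):
        s = total == k                # one stratum of the matching score
        a = np.sum(s & (group == 0) & (item == 1))  # reference, correct
        b = np.sum(s & (group == 0) & (item == 0))  # reference, incorrect
        c = np.sum(s & (group == 1) & (item == 1))  # focal, correct
        d = np.sum(s & (group == 1) & (item == 0))  # focal, incorrect
        t = a + b + c + d
        if t < 2:                     # stratum too sparse to contribute
            continue
        num += a * d / t
        den += b * c / t
        a_sum += a
        ea_sum += (a + b) * (a + c) / t
        var_sum += (a + b) * (c + d) * (a + c) * (b + d) / (t * t * (t - 1))
    alpha = num / den                          # MH common odds ratio
    chi2 = (abs(a_sum - ea_sum) - 0.5) ** 2 / var_sum  # continuity-corrected
    delta = -2.35 * np.log(alpha)              # ETS delta scale
    return chi2, alpha, delta

# Simulated data with no built-in DIF: alpha should be near 1, delta near 0.
rng = np.random.default_rng(0)
group = np.repeat([0, 1], 1000)               # roughly 1,000 per group, as in the study
total = rng.integers(0, 21, size=2000)
item = (rng.random(2000) < total / 20).astype(int)
print(mantel_haenszel_dif(item, total, group))
```

Under the widely used ETS convention, |Δ| < 1 counts as negligible DIF (category A), 1 ≤ |Δ| < 1.5 as moderate (B), and |Δ| ≥ 1.5 as large (C); the abstract’s “moderate” and “high” DIF counts presumably reflect a classification of this kind.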

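The invariance hierarchy the abstract walks through corresponds to progressively equating parameters of the common factor model across the two groups. The following is a standard formulation (after Meredith, 1993, listed below), not notation taken from the article itself:

```latex
% Common factor model for group g (g = reference or focal):
%   x: observed subtest scores, \tau: intercepts, \Lambda: factor loadings,
%   \xi: latent ability, \delta: unique factors with covariance \Theta.
\[
  x^{(g)} = \tau^{(g)} + \Lambda^{(g)} \xi^{(g)} + \delta^{(g)}
\]
% Nested constraints tested in sequence:
%   configural : same pattern of zero/nonzero loadings in both groups
%   metric     : \Lambda^{(1)} = \Lambda^{(2)}          (weak invariance)
%   scalar     : additionally \tau^{(1)} = \tau^{(2)}   (strong invariance)
%   strict     : additionally \Theta^{(1)} = \Theta^{(2)}
```

Because each level nests the one before it, the reported failure at the metric step (loadings differing between students with and without a specific learning disability) already rules out scalar and strict invariance for the affected subtests.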
___

  • Abedi, J., Leon, S., & Kao, J. C. (2007). Examining differential distractor functioning in reading assessments for students with disabilities. Partnership for Accessible Reading Assessment. https://ici.umn.edu/products/395
  • AERA, APA, & NCME. (2014). Standards for educational and psychological testing. American Educational Research Association. https://www.aera.net/Publications/Books/Standards-for-Educational-Psychological-Testing-2014-Edition
  • Alatlı, B. K., & Bökeoğlu, Ö. Ç. (2018). Investigation of measurement invariance of literacy tests in the Programme for International Student Assessment (PISA-2012). Elementary Education Online, 17(2), 1096–1115. https://doi.org/10.17051/ilkonline.2018.419357
  • American Federation of Teachers, National Council on Measurement in Education, & National Education Association. (1990). Standards for teacher competence in educational assessment of students.
  • Anjorin, I. (2009). High-stakes tests for students with specific learning disabilities: Disability-based differential item functioning [Doctoral dissertation, Southern Illinois University]. https://www.proquest.com/openview/2b3d3f7dd8718df22abe293373d97c35/1?pq-origsite=gscholar&cbl=18750
  • Bolt, S. E. (2004, April 13). Using DIF analyses to examine several commonly-held beliefs about testing accommodations for students with disabilities [Conference presentation]. Annual conference of the National Council on Measurement in Education, San Diego, CA.
  • Bolt, S. E., & Thurlow, M. L. (2007). Item-level effects of the read-aloud accommodation for students with reading disabilities. Assessment for Effective Intervention, 33(1), 15–28. https://doi.org/10.1177/15345084070330010301
  • Bolt, S. E., & Ysseldyke, J. (2008). Accommodating students with disabilities in large-scale testing: A comparison of differential item functioning (DIF) identified across disability types. Journal of Psychoeducational Assessment, 26(2), 121–138. https://doi.org/10.1177/0734282907307703
  • Borsboom, D. (2006). When does measurement invariance matter? Medical Care, 44(11), S176–S181. https://doi.org/10.1097/01.mlr.0000245143.08679.cc
  • Brody, L. E., & Mills, C. J. (1997). Gifted children with learning disabilities: A review of the issues. Journal of Learning Disabilities, 30(3), 282–296. https://doi.org/10.1177/002221949703000304
  • Brumfield, G. A. (2014). The effectiveness of reading accommodations for high school students with reading disabilities [Doctoral dissertation, Walden University]. https://www.proquest.com/openview/8aee69058d23d0cbd915233b60a3a16c/1?pq-origsite=gscholar&cbl=18750
  • Buzick, H., & Stone, E. (2011). Recommendations for conducting differential item functioning (DIF) analyses for students with disabilities based on previous DIF studies. ETS Research Report Series, 2011(2), Article i-26. https://doi.org/10.1002/j.2333-8504.2011.tb02270.x
  • Büttner, G., & Hasselhorn, M. (2011). Learning disabilities: Debates on definitions, causes, subtypes, and responses. International Journal of Disability, Development and Education, 58(1), 75–87.
  • Camara, W. J., Copeland, T., & Rothschild, B. (2005). Effects of extended time on the SAT I: Reasoning Test score growth for students with learning disabilities. The College Board.
  • Cook, L., Eignor, D., Sawaki, Y., Steinberg, J., & Cline, F. (2010). Using factor analysis to investigate accommodations used by students with disabilities on an English-language arts assessment. Applied Measurement in Education, 23(2), 187–208. https://doi.org/10.1080/08957341003673831
  • Cortiella, C. (2005). No Child Left Behind: Determining appropriate assessment accommodations for students with disabilities. National Center for Learning Disabilities.
  • DeMars, C. (2010). Item response theory. Oxford University Press.
  • Elbaum, B. (2007). Effects of an oral testing accommodation on the mathematics performance of secondary students with and without learning disabilities. Journal of Special Education, 40(4), 218–229. https://doi.org/10.1177/00224669070400040301
  • Elbaum, B., Arguelles, M. E., Campbell, Y., & Saleh, M. B. (2004). Effects of a student-reads-aloud accommodation on the performance of students with and without learning disabilities on a test of reading comprehension. Exceptionality, 12(2), 71–87. https://doi.org/10.1207/s15327035ex1202_2
  • Elliott, S. N., Kettler, R. J., Beddow, P. A., & Kurz, A. (Eds.). (2018). Handbook of accessible instruction and testing practices: Issues, innovations, and applications (2nd ed.). Springer. https://doi.org/10.1007/978-3-319-71126-3
  • Finch, W. H., & French, B. F. (2007). Detection of crossing differential item functioning: A comparison of four methods. Educational and Psychological Measurement, 67(4), 565–582. https://doi.org/10.1177/0013164406296975
  • First, M. B. (2013). DSM-5 handbook of differential diagnosis. American Psychiatric Publishing.
  • Fischer, R., & Karl, J. A. (2019). A primer to (cross-cultural) multi-group invariance testing possibilities in R. Frontiers in Psychology, 10, Article 1507. https://doi.org/10.3389/fpsyg.2019.01507
  • Fletcher, J. M., Francis, D. J., Boudousquie, A., Copeland, K., Young, V., Kalinowski, S., & Vaughn, S. (2006). Effects of accommodations on high-stakes testing for students with reading disabilities. Exceptional Children, 72(2), 136–150. https://doi.org/10.1177/001440290607200201
  • French, A. W., & Miller, T. R. (1996). Logistic regression and its use in detecting differential item functioning in polytomous items. Journal of Educational Measurement, 33(3), 315–332. https://doi.org/10.1111/j.1745-3984.1996.tb00495.x
  • Gregg, N., & Nelson, J. M. (2012). Meta-analysis on the effectiveness of extra time as a test accommodation for transitioning adolescents with learning disabilities: More questions than answers. Journal of Learning Disabilities, 45(2), 128–138. https://doi.org/10.1177/0022219409355484
  • Grigorenko, E. L., Compton, D. L., Fuchs, L. S., Wagner, R. K., Willcutt, E. G., & Fletcher, J. M. (2019). Understanding, educating, and supporting children with specific learning disabilities: 50 years of science and practice. American Psychologist, 75(1), 37–51. https://doi.org/10.1037/amp0000452
  • Kamata, A., & Vaughn, B. K. (2004). An introduction to differential item functioning analysis. Learning Disabilities: A Contemporary Journal, 2(2), 49–69.
  • Kauffman, J. M., & Hallahan, D. P. (Eds.). (2011). Handbook of special education (1st ed.). Routledge. https://doi.org/10.4324/9780203837306.ch32
  • Kavale, K. A., & Forness, S. R. (2000). What definitions of learning disability say and don't say: A critical analysis. Journal of Learning Disabilities, 33(3), 239–256.
  • Kim, D.-H., Schneider, C., & Siskind, T. (2009). Examining the underlying factor structure of a statewide science test under oral and standard administrations. Journal of Psychoeducational Assessment, 27(4), 323–333. https://doi.org/10.1177/0734282908328632
  • Kingsbury, G. G., & Houser, R. L. (1988, April 9). A comparison of achievement level estimates from computerized adaptive testing and paper-and-pencil testing [Conference presentation]. Annual Meeting of the American Educational Research Association, New Orleans, LA. http://iacat.org/sites/default/files/biblio/ki88-01.pdf
  • Kishore, M. T., Maru, R., Seshadri, S. P., Kumar, D., Sagar, J. K. V., Jacob, P., & Murugappan, N. P. (2021). Specific learning disability in the context of current diagnostic systems and policies in India: Implications for assessment and certification. Asian Journal of Psychiatry, 55, Article 102506.
  • Knickenberg, M., Zurbriggen, C., Venetz, M., Schwab, S., & Gebhardt, M. (2020). Assessing dimensions of inclusion from students’ perspective–measurement invariance across students with learning disabilities in different educational settings. European Journal of Special Needs Education, 35(3), 287–302. https://doi.org/10.1080/08856257.2019.1646958
  • Koretz, D. (1997). The assessment of students with disabilities in Kentucky. National Center for Research on Evaluation, Standards, and Student Testing (CRESST), University of California. https://cresst.org/wp-content/uploads/TECH431.pdf
  • Lai, S. A., & Berkeley, S. (2012). High-stakes test accommodations: Research and practice. Learning Disability Quarterly, 35(3), 158–169. https://doi.org/10.1177/0731948711433874
  • Lindstrom, J. H., & Gregg, N. (2007). The role of extended time on the SAT for students with learning disabilities and/or attention-deficit/hyperactivity disorder. Learning Disabilities Research & Practice, 22(2), 85–95. https://doi.org/10.1111/j.1540-5826.2007.00233.x
  • Lord, F. M. (1980). Applications of Item Response Theory to practical testing problems. Routledge.
  • Mellenbergh, G. J. (1989). Item bias and item response theory. International Journal of Educational Research, 13(2), 127–143.
  • Meloy, L. L., Deville, C., & Frisbie, D. (2000, April 26). The effect of a reading accommodation on standardized test scores of learning disabled and non learning disabled students [Conference presentation]. Annual Meeting of the National Council on Measurement in Education, New Orleans, LA.
  • Meredith, W. (1993). Measurement invariance, factor analysis, and factorial invariance. Psychometrika, 58(4), 525–543. https://doi.org/10.1007/BF02294825
  • Middleton, K., & Laitusis, C. C. (2007). Examining test items for differential distractor functioning among students with learning disabilities. ETS Research Report Series, 2007(2), Article i-34. https://doi.org/10.1002/j.2333-8504.2007.tb02085.x
  • Milli Eğitim Bakanlığı. (2018). Sınavla öğrenci alacak ortaöğretim kurumlarına ilişkin merkezî sınav başvuru ve uygulama kılavuzu [Application and implementation guide of the central exam for secondary education institutions]. Ankara, Turkey. http://www.meb.gov.tr/sinavlar/dokumanlar/2018/MERKEZI_SINAV_BASVURU_VE_UYGULAMA_KILAVUZU.pdf
  • National Center for Education Statistics. (2021). Students with disabilities. The Condition of Education. https://www2.ed.gov/programs/osepidea/618-data/state-level-data-files/index.html#bcc
  • Ozarkan, H. B., Kucam, E., & Demir, E. (2017). Merkezi ortak sınav matematik alt testinde değişen madde fonksiyonunun görme engeli durumuna göre incelenmesi [An investigation of differential item functioning according to the visually handicapped situation for the Central Joint Exam math subtest]. Current Research in Education, 3(1), 24–34.
  • Randall, J., Cheong, Y. F., & Engelhard, G. (2011). Using explanatory Item Response Theory modeling to investigate context effects of differential item functioning for students with disabilities. Educational and Psychological Measurement, 71(1), 129–147. https://doi.org/10.1177/0013164410391577
  • Randall, J., & Engelhard, G. (2010). Using confirmatory factor analysis and the Rasch Model to assess measurement invariance in a high stakes reading assessment. Applied Measurement in Education, 23(3), 286–306. https://doi.org/10.1080/08957347.2010.486289
  • Rogers, C. M., Lazarus, S. S., & Thurlow, M. L. (2014). A summary of the research on the effects of test accommodations, 2011-2012 (Synthesis Report 94). University of Minnesota, National Center on Educational Outcomes. https://nceo.umn.edu/docs/onlinepubs/Synthesis94/Synthesis94.pdf
  • Rogers, C. M., Lazarus, S. S., & Thurlow, M. L. (2016). A summary of the research on the effects of test accommodations: 2013-2014 (NCEO Report 402). University of Minnesota, National Center on Educational Outcomes. https://nceo.info/Resources/publications/OnlinePubs/Report402/default.htm
  • Rogers, C. M., Thurlow, M. L., Lazarus, S. S., & Liu, K. K. (2019). A summary of the research on effects of test accommodations: 2015-2016 (NCEO Report 412). University of Minnesota, National Center on Educational Outcomes. https://nceo.umn.edu/docs/OnlinePubs/NCEOReport412.pdf
  • Şenel, S. (2021). Assessing measurement invariance of Turkish “Central Examination for Secondary Education Institutions” for visually impaired students. Educational Assessment, Evaluation and Accountability, 33, 621–648. https://doi.org/10.1007/s11092-020-09345-5
  • Silverman, L. K. (2009). The two-edged sword of compensation: How the gifted cope with learning disabilities. Gifted Education International, 25(2), 115–130. https://doi.org/10.1177/026142940902500203
  • Steenkamp, J. B. E. M., & Baumgartner, H. (1998). Assessing measurement invariance in cross-national consumer research. Journal of Consumer Research, 25(1), 78–90. https://doi.org/10.1086/209528
  • Steinberg, J., Cline, F., & Sawaki, Y. (2011). Examining the factor structure of a state standards-based science assessment for students with learning disabilities. ETS Research Report Series, 2011(2), Article i–49. https://doi.org/10.1002/j.2333-8504.2011.tb02274.x
  • Stone, E., Cook, L., Cahalan Laitusis, C., & Cline, F. (2010). Using differential item functioning to investigate the impact of testing accommodations on an English-language arts assessment for students who are blind or visually impaired. Applied Measurement in Education, 23(2), 132–152. https://doi.org/10.1080/08957341003673773
  • Svetina, D., Dai, S., & Wang, X. (2017). Use of cognitive diagnostic model to study differential item functioning in accommodations. Behaviormetrika, 44(2), 313–349. https://doi.org/10.1007/s41237-017-0021-0
  • Swaminathan, H., & Rogers, H. J. (1990). Detecting differential item functioning using logistic regression procedures. Journal of Educational Measurement, 27(4), 361–370. https://doi.org/10.1111/j.1745-3984.1990.tb00754.x
  • Vandenberg, R. J., & Lance, C. E. (1998). A summary of the issues underlying measurement equivalence and their implications for interpreting group differences. Research Methods Forum, 3, 1–10.
  • Van De Schoot, R., Schmidt, P., De Beuckelaer, A., Lek, K., & Zondervan-Zwijnenburg, M. (2015). Editorial: Measurement invariance. Frontiers in Psychology, 6, Article 1064. https://doi.org/10.3389/fpsyg.2015.01064
  • Yen, W. M. (1993). Scaling performance assessments: Strategies for managing local item dependence. Journal of Educational Measurement, 30(3), 187–213. https://doi.org/10.1111/j.1745-3984.1993.tb00423.x
  • Yılmaz, G. (2019). Seçme sınavlarının engel durumlarına göre madde yanlılığının incelenmesi [An investigation of item bias for selection exams according to disability situations] [Master's thesis, Hacettepe University, Turkey]. http://www.openaccess.hacettepe.edu.tr:8080/xmlui/bitstream/handle/11655/8917/10277911.pdf?sequence=1&isAllowed=y
  • Zieky, M. (2003). A DIF primer. Center for Education in Assessment.
  • Zumbo, B. (1999). A handbook on the theory and methods of differential item functioning (DIF). National Defense Headquarters.