Teachers’ trialing procedures for Computer Assisted Language Testing Implementation

Problem Statement: Computer-assisted language testing is becoming increasingly common across Europe and worldwide. In many countries, including Spain, Greece, and Turkey, there is a move toward integrating information technology and computers into high-stakes testing. However, because information on this topic remains limited, practicing and trainee teachers need simple, accessible research that can help them understand the complex processes involved in designing language testing platforms and the relationship between computers and tests.

Purpose of the Study: The main goal of this paper is to examine different approaches to computer-based language test design and to present one example developed in Spain that may also be valid for other contexts and countries, such as university entrance examinations elsewhere in Europe or the ÖSYM in Turkey.

Methods: Linguistic and computer design principles were used to identify the guidelines that programmers and testing stakeholders follow when specifying a testing platform. The process description is further supported by the presentation of a model from Spain that builds on two other significant projects previously undertaken there.

Findings and Results: International and high-stakes tests can be greatly enhanced by delivering them on computers. Beyond the advantages in security and cost, computers allow a richer and wider variety of tasks than traditional pen-and-paper tests. Images, in particular, are a powerful tool for eliciting both higher-quality and more abundant performance in second language testing. According to trialing conducted at the Polytechnic University of Valencia in previous experimentation, students feel at ease using computers in testing and find the experience motivating.
As in other experiments and research experiences, it remains debatable whether this is due to a flow effect or because contextualizing test tasks through images and sound facilitates language communication and performance.

Conclusions and Recommendations: The paper concludes that sound design guidelines grounded in visual ergonomics can enhance students' performance. Guidelines for good practice, together with previous experimentation, can therefore inform similar projects, including those mentioned for Spain, Greece, and the ÖSYM. Further research should investigate why computers may benefit students, particularly on two measures: the difference between boys and girls, and the potential existence of a flow effect in computer-based language testing.
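The claim that computer delivery permits a richer variety of tasks than pen-and-paper tests can be made concrete with a small sketch. The following Python snippet is purely illustrative: the `TestItem` structure and its field names are assumptions for this example, not part of any platform described in the paper. It shows how an image- or audio-contextualized task can be represented, something a paper booklet cannot deliver for listening or multimedia-prompted speaking items.

```python
from dataclasses import dataclass, field

@dataclass
class TestItem:
    """One task in a hypothetical computer-delivered language test."""
    prompt: str          # instruction shown to the candidate
    skill: str           # e.g. "speaking", "listening", "reading"
    media: list = field(default_factory=list)  # image/audio files contextualizing the task

def exceeds_paper_delivery(item: TestItem) -> bool:
    """A pen-and-paper test cannot attach audio or interactive images;
    any item with associated media goes beyond paper-based delivery."""
    return len(item.media) > 0

# An image-prompted speaking task, as trialed in computer-based platforms.
picture_task = TestItem(
    prompt="Describe what is happening in the picture.",
    skill="speaking",
    media=["market_scene.png"],
)

print(exceeds_paper_delivery(picture_task))  # True
```

The design choice here is simply that multimedia context is attached per item, which is what allows a single delivery platform to mix traditional text items with richer, contextualized ones.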
