BİR ÇEVİRİ KALİTESİ ÖLÇME ARACI ÖNERİSİ

Çeviri kalitesinin ölçülmesi her zaman ilgi gören akademik alanlardan biri olmuştur. Bunun sonucunda da birçok çeviri değerlendirme aracı/yönergesi ortaya çıkmıştır. Ancak bunlardan pek azı çeviri çözümlerini bir değerlendirme ölçütü olarak görmüştür ve hiçbiri Türkçeye yapılan çevirileri değerlendirmek için kullanılmamıştır. Bu iki farklılık, bu çalışmada sunulan ölçme aracının öne çıkmasına ve ilgili alanyazına kayda değer bir katkı sağlamasına imkân sunmaktadır. Söz konusu araç, yazarın doktora tezi (Yildiz, 2016) kapsamında üretilmiş ve bu çalışma ile daha da geliştirilmiştir. Değerlendirme aracı hem çeviri hataları hem de çeviri çözümleri üzerine inşa edilmiştir. Çeviri çözümünü bir parametre olarak işlevselleştirmek için alıntılanan kullanım kılavuzunda katılımcılara sorun oluşturması beklenen 25 unsur belirlenmiş ve bu unsurlar ölçme aracının ilk kısmını oluşturacak şekilde çalışmaya dahil edilmiştir. İkinci kısım sadece çeviri hatalarına odaklanmaktadır. Araç, iki hata türünü irdelemektedir: mekanik hatalar ve aktarım hataları. Bu çalışma, ölçme aracı ile birlikte çeviri çözüm ve hatalarına verilen puanların katlanarak arttığı bir puanlama tablosu da sunmaktadır. Küçük, büyük ve kritik hatalara sırasıyla (-2), (-4) ve (-8) puan verilirken kısmen kabul edilebilir ve kabul edilebilir çözümler (+2) ve (+4) puan ile ödüllendirilmiştir. Çalışmada aynı zamanda puanlama tablosunun anlamlandırılmasını kolaylaştıracak bir puanlama yönergesi de önerilmiştir. Bu yönergenin aracın nesnelliğini ve değerlendiriciler arası güvenilirliği geliştireceğine inanılmaktadır. Önerilen ölçme aracı, bu iki bileşeni ile de alana katkı sağlamaktadır. Aracın; çeviri eğitimi veren kurumlar, çevirmen dernekleri ve çeviri şirketleri tarafından kullanılabileceği düşünülmektedir.

A PROPOSED TRANSLATION QUALITY ASSESSMENT TOOL

Translation quality assessment has always attracted a great deal of scholarly attention, which has resulted in a number of translation assessment tools/rubrics. Yet very few of these incorporate translation solutions as an evaluative parameter and, to the author’s best knowledge, none has been used to assess the quality of translations into Turkish. These two features help the tool presented herein stand out and make a substantial contribution to the related literature. The tool was originally developed in the author’s doctoral dissertation (Yildiz, 2016), and an improved version is proposed in this paper. It is built on both translation errors and translation solutions. To operationalize translation solutions as a parameter, 25 rich points (PACTE, 2009), i.e., elements expected to pose problems to the participants, were identified in the excerpted manual. These rich points make up the first part of the tool, while the second part focuses solely on erroneous translation segments. The tool distinguishes two types of errors: mechanical errors and transfer errors. The paper also proposes a grading table, featuring solution- and error-based grades in exponential increments. Minor, major, and critical errors are penalized with (-2), (-4), and (-8) points, whereas partially acceptable and acceptable solutions are awarded (+2) and (+4) points, respectively. The grading table is accompanied by a rubric describing how the degrees of errors and solutions can be operationalized, which is believed to promote objectivity and inter-rater reliability. The tool thus also contributes to the literature with these two components. It is thought to be usable by translation schools, translators’ associations, and translation companies.
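
As a rough illustration of the grading arithmetic summarized above, the short Python sketch below totals the penalties and rewards for a single translation. The weights ((-2), (-4), and (-8) for minor, major, and critical errors; (+2) and (+4) for partially acceptable and acceptable solutions) are taken from the paper's grading table, whereas the function name, category labels, and sample counts are hypothetical illustrations rather than part of the proposed tool.

# Minimal sketch of the grading table's arithmetic; names and counts are illustrative assumptions.
ERROR_PENALTIES = {"minor": -2, "major": -4, "critical": -8}     # error weights from the grading table
SOLUTION_REWARDS = {"partially_acceptable": 2, "acceptable": 4}  # solution weights from the grading table

def score_translation(error_counts, solution_counts):
    # Sum the weighted error penalties and solution rewards for one translation.
    penalty = sum(ERROR_PENALTIES[kind] * n for kind, n in error_counts.items())
    reward = sum(SOLUTION_REWARDS[kind] * n for kind, n in solution_counts.items())
    return penalty + reward

# Hypothetical example: 3 minor errors, 1 major error, 5 partially acceptable and
# 12 acceptable rich-point solutions -> (-6 - 4) + (10 + 48) = 48
print(score_translation({"minor": 3, "major": 1}, {"partially_acceptable": 5, "acceptable": 12}))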

___

  • Al-Qinai, J. (2000). Translation Quality Assessment. Strategies, Parametres and Procedures. Meta: Journal des Traducteurs/Meta: Translators’ Journal, Vol. 45, Issue 3, pp. 497–519. https://doi.org/10.7202/00187.
  • Baker, M. (2011). In Other Words: A Coursebook on Translation (2nd ed.). London/New York: Routledge.
  • Boase-Beier, J. (2011). Stylistics and Translation. The Oxford Handbook of Translation Studies (Eds. K. Malmkjaer and K. Windle). Oxford/New York: Oxford University Press. pp. 71–82.
  • Bowker, L. (2000). A Corpus-Based Approach to Evaluating Student Translations. The Translator, Vol. 6, Issue 2, pp. 183-210. https://doi.org/10.1080/13556509.2000.10799065.
  • Colina, S. (2009). Further Evidence for a Functionalist Approach to Translation Quality Evaluation. Target, Vol. 21, Issue 2, pp. 235–264.
  • Eyckmans, J., and Anckaert, P. (2017). Item-Based Assessment of Translation Competence: Chimera of Objectivity versus Prospect of Reliable Measurement. Linguistica Antverpiensia, New Series: Themes in Translation Studies, Issue 16, pp. 40–56.
  • Farahzad, F. (1992). Testing Achievement in Translation Classes. Teaching Translation and Interpreting: Training, Talent and Experience (Eds. C. Dollerup and A. Loddegaard). Amsterdam: John Benjamins. pp. 271-78.
  • Han, C. (2020). Translation quality assessment: A methodological review. The Translator, Vol. 26, Issue 3, pp. 257–273. https://doi.org/10.1080/13556509.2020.1834751
  • House, J. (1997). Translation Quality Assessment: A Model Revisited. Tübingen: Gunter Narr Verlag.
  • House, J. (2001). Translation Quality Assessment: Linguistic Description versus Social Evaluation. Meta: Journal des Traducteurs/Meta: Translators’ Journal, Vol. 46, Issue 2, pp. 243-257. doi:10.7202/003141ar.
  • House, J. (2015). Translation Quality Assessment: Past and Present. London/New York: Routledge.
  • Hutchins, W. J. and Somers, H. L. (1992). An Introduction to Machine Translation, London/San Diego: Academic Press.
  • Kahl, P. (1991). Translation Quality - How Can We Tell It’s Good Enough?. Proceedings of Translating and the Computer 12: Applying Technology to the Translation Process (Ed. C. Picken), London: Aslib. pp. 149-158.
  • Koby, G. S. and Champe, G. G. (2013). Welcome to the Real World: Professional-Level Translator Certification. The International Journal for Translation and Interpreting Research, Vol. 5, Issue 1, pp. 156-173. doi:ti.105201.2013.a09.
  • Koby, G. S., Fields, P., Hague, D. R., Lommel, A., and Melby, A. (2014). Defining Translation Quality. Revista Tradumàtica: Tecnologies de la Traducció, Vol. 12, pp. 413-420.
  • Lacruz, I., Denkowski, M., and Lavie, A. (2014). Cognitive Demand and Cognitive Effort in Post-Editing. Proceedings of AMTA 2014 Third Workshop on Post-Editing Technology and Practice (Eds. S. O’Brien, M. Simard, and L. Specia), pp. 73-84.
  • Lilova, A. (2008). The Perfect Translation – Ideal and Reality (Trans. by M. J. Stern). Translation Excellence: Assessment, Achievement, Maintenance (Ed. M. G. Rose). Amsterdam: Benjamins. pp. 9-18.
  • Martínez-Mateo, R. (2016). Aligning Qualitative and Quantitative Approaches in Professional Translation Quality Assessment. Encuentro, Issue 25, pp. 36–44.
  • Martínez-Mateo, R., Montero Martínez, S., and Moya Guijarro, A. J. (2017). The Modular Assessment Pack: A New Approach to Translation Quality Assessment at the Directorate General for Translation. Perspectives, Vol. 25, Issue 1, pp. 18-48. doi:10.1080/0907676X.2016.1167923.
  • Martínez-Melis, N. and Hurtado Albir, A. (2001). Assessment in Translation Studies: Research Needs. Meta: Journal des Traducteurs/Meta: Translators’ Journal, Vol. 46, Issue 2, pp. 272–287. https://doi.org/10.7202/003624ar.
  • Mueller, C. W. (2004). Conceptualization, Operationalization, and Measurement. The SAGE Encyclopedia of Social Science Research Methods (Eds. M. Lewis-Beck, A. Bryman, and T. F. Liao). California: SAGE. pp. 161-165.
  • Newmark, P. (1995). A Textbook of Translation, London: Longman.
  • Nord, C. (1991). Text Analysis in Translation: Theory, Methodology and Didactic Application of a Model for Translation-Oriented Text Analysis. Amsterdam/Atlanta: Rodopi.
  • PACTE (2009). Results of the Validation of the PACTE Translation Competence Model: Acceptability and Decision Making. Across Languages and Cultures, Vol. 10, Issue 2, pp. 207-230. https://akjournals.com/view/journals/084/10/2/article-p207.xml
  • Reiss, K. (2014). Translation Criticism – The Potentials and Limitations: Categories and Criteria for Translation Quality Assessment (Trans. by E. F. Rhodes). London/New York: Routledge.
  • Samuelsson-Brown, G. (2010). A Practical Guide for Translators (5th ed.). Bristol/Buffalo/Toronto: Multilingual Matters.
  • Smith, T. P. (ed.) (2003). Manufacturer’s Guide to Developing Consumer Product Instructions, Washington.
  • Thelen, M. (2008). Translation Quality Assessment or Quality Management and Quality Control of Translation?. Translation and Meaning – Part 8 (Eds. B. Lewandowska-Tomaszczyk and M. Thelen). Maastricht: Hogeschool Zuyd. pp. 411-424.
  • Vandepitte, S. (2017). Translation Product Quality: A Conceptual Analysis. Quality Aspects in Institutional Translation (Eds. T. Svoboda, Ł. Biel, and K. Łoboda). Berlin: Language Science Press. pp. 15–29.
  • Vermeer, H. J. (2012). Skopos and Commission in Translational Action (Trans. by A. Chesterman). The Translation Studies Reader (3rd ed.) (Ed. L. Venuti). London and New York: Routledge. pp. 191–202.
  • Waddington, C. (2001). Should Translations Be Assessed Holistically or Through Error Analysis?. Hermes: Journal of Linguistics, Issue 26, pp. 15-38.
  • Williams, M. (2004). Translation Quality Assessment: An Argumentation-Centred Approach. Ottawa/Ontario: University of Ottawa Press.
  • Williams, M. (2009). Translation Quality Assessment. Mutatis Mutandis, Vol. 2, Issue 1, pp. 3-23. http://aprendeenlinea.udea.edu.co/revistas/index.php/mutatismutandis/article/view/1825/1609.
  • Williams, M. (2013). A Holistic-Componential Model for Assessing Translation Student Performance and Competency. Mutatis Mutandis, Vol. 6, Issue 2, pp. 419-443.
  • Wilss, W. (1982). The Science of Translation: Problems and Methods. Tübingen: Gunter Narr Verlag.
  • Yildiz, M. (2016). Mütercim-Tercümanlık Öğrencilerinin Özel Alan Çevirisi Kapsamında Yazılı Çeviri Edinçlerinin Ölçülmesi. Unpublished Doctoral Dissertation, Istanbul: Istanbul University Institute of Social Sciences.
  • Yildiz, M. (2020). A Critical Perspective on Translation Quality Assessments of Five Translators’ Organizations: ATA, CTTIC, ITI, NAATI, and SATI. RumeliDE Dil ve Edebiyat Araştırmaları Dergisi, Issue 18, pp. 568-589. doi:10.29000/rumelide.706390.