Comparability of Computer-based and Paper-based Versions of Writing Section of PET in Iranian EFL Context

Document Type : Original Article

Authors

Urmia University

Abstract

Computer technology has provided language testers with the opportunity to develop computerized versions of traditional paper-based language tests. The computer-based versions of TOEFL and of the Cambridge IELTS, BULATS, KET, and PET are good examples of computer-based language tests. Since this new method of testing introduces new factors into the realm of language assessment (e.g., mode of test delivery, familiarity with computers, etc.), the question arises whether the two modes of computer- and paper-based testing measure the same construct comparably and, hence, whether the scores obtained from the two modes can be used interchangeably. Accordingly, the present study investigated the comparability of the paper- and computer-based versions of a writing test. The data were collected by administering the writing section of a Cambridge Preliminary English Test (PET) to eighty Iranian intermediate EFL learners in both computer- and paper-based modes. In addition, a computer familiarity questionnaire was used to divide the participants into two groups of high and low computer familiarity. The results of the paired samples t-test revealed no statistically significant difference between the learners' computer- and paper-based writing scores, and the results of the independent samples t-test showed no statistically significant difference between the high- and low-computer-familiarity groups on the computer-based writing task. The researchers concluded that the two modes comparably measure the same construct.
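For readers who wish to see the shape of the analysis, the sketch below (not part of the article) illustrates the two comparisons described above using SciPy in Python. All score arrays are hypothetical placeholders generated at random, since the study's data are not reproduced here; only the choice of test for each comparison follows the design reported in the abstract.

    # Minimal sketch, assuming hypothetical score data for 80 learners.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Each learner took the writing test in both modes, so the
    # mode comparison is within-subject: a paired samples t-test.
    paper_scores = rng.normal(loc=20, scale=3, size=80)
    computer_scores = paper_scores + rng.normal(loc=0, scale=1, size=80)
    t_paired, p_paired = stats.ttest_rel(paper_scores, computer_scores)
    print(f"Paired t-test (paper vs. computer): t = {t_paired:.2f}, p = {p_paired:.3f}")

    # High- and low-computer-familiarity learners form two independent
    # groups, compared on computer-based writing scores only: an
    # independent samples t-test. The 40/40 split here is illustrative.
    high_familiarity = computer_scores[:40]
    low_familiarity = computer_scores[40:]
    t_ind, p_ind = stats.ttest_ind(high_familiarity, low_familiarity)
    print(f"Independent t-test (high vs. low familiarity): t = {t_ind:.2f}, p = {p_ind:.3f}")

In each case, a non-significant result (p above the chosen alpha level) is what supports the comparability claim: no detectable mode effect and no detectable familiarity effect.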

Keywords

Chapelle, C. A., & Douglas, D. (2006). Assessing language through computer technology. Cambridge: Cambridge University Press.
 
Choi, I., Kim, K., & Boo, J. (2003) Comparability of a paper-based language test and a computer-based language test. Language Testing, 20 (3), 295- 320.
 
Fulcher, G. (1999). Computerizing an English language placement test. ELTJournal, 53, 289–99.
 
Goldberg, A., Russell, M., & Cook, A. (2003). The effect of computers on student writing: A meta analysis of studies from 1992 to 2002. Journal of Technology, Learning, and Assessment, 2 (1), 2-47.
 
Harrington, S., Shermis, M. D., & Rollins, A. L. (2000). The influence of word processing on English placement test results. Computersand Composition, 17, 197–210.
 
Kim, J. P. (1999). Meta-analysisofequivalenceof computerized and P&P tests on ability measures. Paper presented at the Annual Meeting of the Mid- Western Educational Research Association. Chicago, IL.
 
Lee, H. K. (2004). A comparative study of ESL writers’ performance in a paper-based and a computer-delivered writing test. AssessingWriting, 9 (1), 4–26.
 
Lee, Y. J., (2002). A comparison of composing processes and written products in timed-essay tests across paper-and-pencil and computer modes. AssessingWriting, 8, 135–257.
 
Lottridge, S., Nicewander, A., Schulz, M., & Mitzel, H. (2008). Comparability of Paper-based and Computer-based tests: A review of the methodology. Monterey, CA: Pacific Metrics Corporation.
 
 
 
Puhan, P., Boughton, K., & Kim, S. (2007). Examining differences in examinee performance in paper and pencil and computerized testing. Journal of Technology, Learning, and Assessment, 6 (3). Retrieved on April 24, 2010, from http://www.jtla.org.
 
Taylor, C., Kirsch, I., Eignor, D., & Jamieson, J. (1998). Examining the relationship between computer familiarity and performance on computer- based language tasks. LanguageLearning, 49 (2), 219–274.
 
Wang, S., Jiao, H., Young, M. J., Brooks, T. E., & Olson, J. (2008). Comparability of computer-based and paper-and-pencil testing in K-12 assessment: A meta-analysis of testing mode effects. Educational and Psychological Measurement, 68, 5-24.
 
Wang, H., Shin, C. D. (2009). Computer-based & paper-pencil test comparability studies. Test, measurement, and research services Bulletin, 9, 1-6.
Volume 1, Issue 2
July 2012
Pages 1-20