Download English ISI article No. 138256

English title
Researching the comparability of paper-based and computer-based delivery in a high-stakes writing test
Article code: 138256
Publication year: 2018
English article length: 17 pages (PDF)
Source

Publisher: Elsevier - Science Direct

Journal: Assessing Writing, available online 7 April 2018

Keywords
Test equivalence; Score equivalence; Cognitive validity; Computer-based testing of writing; Delivery mode; Second language writing assessment;
Article preview

English abstract

International language testing bodies are now moving rapidly towards using computers for many areas of English language assessment, even though research on comparability with paper-based assessment remains relatively limited in key areas. This study contributes to the debate by researching the comparability of a high-stakes EAP writing test (IELTS) in two delivery modes, paper-based (PB) and computer-based (CB). The study investigated 153 test takers’ performances and their cognitive processes on IELTS Academic Writing Task 2 in the two modes, and the possible effect of computer familiarity on their test scores. Many-Facet Rasch Measurement (MFRM) was used to examine the difference in test takers’ scores between the two modes, in relation to their overall and analytic scores. By means of questionnaires and interviews, we investigated the cognitive processes students employed under the two conditions of the test. A major contribution of our study is its use – for the first time in the computer-based writing assessment literature – of data from research into undergraduates’ cognitive processes within real-world academic settings as a comparison with test takers’ cognitive processes during academic writing under test conditions. In summary, this study offers important new insights into academic writing assessment in the computer mode.