Assessing speaking at C1 level business English at Masaryk University Brno
Year of publication | 2014 |
---|---|
Type | Article in Proceedings |
Conference | Testing language competence of tertiary students in LSP courses: collective monograph |
Field | Pedagogy and education |
Keywords | methodology; testing; oral test; assessment; reliability; validity; paired format; analytical rating scales |
Description | This paper describes the standardisation process of specifying a new format for the Business English spoken test at the Faculty of Economics and Administration of Masaryk University, which was previously based on assessing the integrated skills of reading and speaking. With growing awareness of the CEFR levels, the need to enhance the fairness, reliability and validity of testing was recognised by the English language department, which initiated a series of discussions that resulted in an overhaul of the oral test format. The major changes are a radical reduction of instruction input, a switch from a teacher-student interview to a peer-to-peer discussion pattern (thereby separating the roles of interlocutor and rater), and the use of analytical rating scales. A year after the implementation of this system, another round of discussions was held to adjust the new format so that it elicits more autonomous and higher-standard language production. Recordings of real live tests have been collected for benchmarking purposes and to evaluate the outcome of the whole process. All signs indicate that the change was for the better: the new format is more capable of eliciting the desired language, and teachers’ ratings are more reliable and consistent. |