TRANSCRIPT
Washback of a High Stakes Language Test: The Impact on Teachers
BAAL TEA SIG, Pitt Building Conference Centre, Cambridge
May 8, 2015
Positive / Negative
1. Washback Literature
2. Research Questions
3. Context of Study
4. Methodology
5. Results
6. Conclusion & Implications
Cheng (1998a; 2004, see also Cheng, 2008)
Context: Hong Kong secondary schools
Participants: 244 teachers, 110 students
Test: HKEA (Hong Kong Examinations Authority) English exam
Test Change: 1) extra integrated component; 2) increased weight for the oral component
Methodology: questionnaires (teacher/student); classroom observation
Intentions: to encourage interactive, task-based learning
Cheng (1998a; 2004, see also Cheng, 2008)
Support:
– Resources: books aligned to the exam curriculum
– Communication: none reported
– Training: none reported
Results:
– Positive: more emphasis on oral content, especially in homework
– Not so positive: teacher methodologies unchanged (high teacher talk)
Concluded: ‘[High stakes] examinations drive teaching in the direction of coaching and drilling for what is required in the examination…a change in the examination syllabus itself alone is highly unlikely to realize the intended goal.’ (2004: 164-165)
Muñoz and Álvarez (2010)
Context: Colombia, university language centre
Participants: 14 teachers (7 control, 7 experimental), 110 students
Test: OAS (Oral Assessment System), a classroom-based assessment
Test Change: entire system change
Methodology: surveys (teacher/student); classroom observation; external evaluations of exams
Intentions: to improve oral education and increase communicative activities
Muñoz and Álvarez (2010)
Support:
– Resources: multiple scoring scales, a set of tasks, outlined speaking standards, a guideline document, assessment report cards (experimental group only)
– Communication: periodic discussion groups
– Training: 30-hour training with 3 modules
Results:
– Experimental: rubrics used as planned, better-informed students, detailed feedback given to students
– Control: more grammar-based teaching, did not state daily objectives, used identical classroom/assessment tasks
– And…students in experimental classes had significant score gains in: 1) communicative effectiveness, 2) grammar, and 3) pronunciation!
Concluded: ‘Constant guidance and support over time are essential in order to help teachers use the system appropriately and therefore create positive washback’ (p. 33)
1. How did the teachers perceive the new exam?
2. How did the teachers perceive the support provided for preparing and examining their students (regarding resources, training, and communication)?
3. What were the reported effects of the new speaking test on teaching practices?
French Ministry of Education → ‘Inspecteurs’ (inspectors) → Teachers
10 min prep with a topic → 5 min oral monologue → 5 min interview
Scoring
– 3-column/4-band scale with descriptors ‘aligned’ to CEFR levels
– Teachers = examiners
– Scores combined with listening and writing (no feedback)
• 10/20 = pass, 20/20 = B2
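A minimal sketch of the thresholds above. The talk does not state how the speaking score is weighted against the listening and writing components, so the equal averaging here is an assumption, and the function names are illustrative only:

```python
# Illustrative sketch only: equal weighting of the three components is an
# assumption; the presentation does not specify how scores are combined.

def combined_score(speaking: float, listening: float, writing: float) -> float:
    """Average three component marks, each out of 20, into one mark out of 20."""
    return (speaking + listening + writing) / 3

def outcome(mark: float) -> str:
    """Map a mark out of 20 to the thresholds given on the slide."""
    if mark >= 20:
        return "B2"    # 20/20 is reported as CEFR B2
    if mark >= 10:
        return "pass"  # 10/20 is the pass mark
    return "fail"

print(outcome(combined_score(14, 12, 10)))  # 'pass' (average = 12/20)
```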
Topics
– The Forms of Power
– Myths and Heroes
– Spaces and Exchanges
– The Idea of Progress
• 8 English teachers
• 3 schools in South-West France:
– Cities of Toulouse, Rodez, Colomiers
– 1 public, 1 private, 1 technical
– Teacher questionnaires
• 22 questions created from 5 open-ended pilot questions
• 2 parts:
– Teaching practices (multiple-choice/open-ended)
– Perceptions/beliefs regarding the test (Likert agreement, 1-5)
– Semi-structured interviews
• 4-7 preliminary questions based on questionnaire responses
• Duration: 30-60 minutes
– Classroom observation
• 5 classes
Teacher 4: ‘When we do oral work, now, it has sense. It was necessary to get this oral test.’
Teacher 3: ‘They dare to speak, before they did not.’
However…
Teacher 4: ‘They’re not stupid. We don’t tell them this, but they just prepare and learn them by heart – then when it comes to the questions, they realise their level is not very good.’
• 88% of the teachers agreed that the Ministry of Education added the oral to change English teaching in secondary schools.
• 100% of teachers agreed that the test was ‘a good thing’ and ‘important’.
• 75% of the teachers felt optimistic about their preparation of the students.
– Guidelines about using ‘authentic documents’ in class
– A few lines of instructions about positive examining
– Aligned to the CEFR ‘B’ levels
T6: ‘They could have given us some database…They gave us nothing, no help. They said to do it, just do it.’
T4: ‘[The teachers] don’t always agree. Like, all have different ideas with the book, some rely on it, sometimes not.’
T7: ‘I think the inspectors should talk to us, and have meetings, like with the Italian. We had none.’
T2: ‘When we asked [the inspector] questions, he said he didn’t know.’
T7: ‘The teachers say different things – even after meetings, and this is difficult for the pupils as well.’
T2: ‘They should propose us some training. The person who decided the oral exam did that, and that was it. They don’t decide on ways of doing it. They passed it down to the inspector, the director, then me. I would have liked to be a pupil – just to see what it is like. Of course it would cost a lot, and take time, so it wasn’t possible, but I would have liked that, to see what a pupil needs, and an idea of the organisation of the exam.’
• Student presentations (100%)
• Speaking tests (88%)
• Recorded language lab work (50%)
• Group work (38%)
• Role-play activities (38%)
• Self-assessment (25%)
• Pair work (12%)
75% of teachers agreed that the school/Ministry could have further aided them [for this test] in training, resources, and/or class time.
– No assessment training
– Misinterpreted instructions
– ‘Easy to pass’ rating scale
– No extra class time for languages
– Few teachers
– Little classroom observation
– No scores
– Not longitudinal
There is a global need! Projects in…
Assessment Literacy Training (pre-service and in-service):
– Learning-oriented assessment design (e.g. projects like LOLA)
– Language testing terminology (e.g. large-scale ‘simple animation’)
– Scale familiarity (CEFR and others)
– Examining sessions and feedback
References

• Bachman, L. F., & Palmer, A. (2010; 2012). Language assessment in practice.
• Cheng, L. (1998). The washback effect of public examination change on students’ perceptions and attitudes toward their English learning. Studies in Educational Evaluation, 24(3), 279-301.
• Cheng, L., & Curtis, A. (2012). Test impact and washback: Implications for teaching and learning. The Cambridge Guide to Second Language Assessment, 89-95.
• Council of Europe. (2001). Common European Framework of Reference for Languages: Learning, teaching, assessment. Cambridge: Cambridge University Press.
• Green (2013).
• Little, D. (2009). Language learner autonomy and the European language portfolio: Two L2 English examples. Language Teaching, 42(2), 222-233.
• Muñoz, A. P., & Álvarez, M. E. (2010). Washback of an oral assessment system in the EFL classroom. Language Testing, 27(1), 33-49.
• O’Sullivan, B. (2012). Assessing speaking. The Cambridge Guide to Second Language Assessment, 234-246.
• Wall, D., & Alderson, J. C. (1993). Examining washback: The Sri Lankan impact study. Language Testing, 10(1), 41-69.
Thank you!