Connect, Grow, Thrive

ETAS Journal Editors' Choice, Number 18 (March 2017)

Emilia Siravo: Testing from a language learner’s perspective

We live in an educational world where people want tests. Governments, universities, employers: they all like numbers. This has created an environment in which we, as teachers, worry about the balance between testing and learning, and we have often erred on the side of being wary of tests. They take up time, they create a negative atmosphere, and they pigeon-hole students.

Emilia’s article refreshes our thinking. Catalysed by her own experiences with a Fitbit, she leads us on a journey through her own action research into how testing helped her focus her study, learning, and acquisition of German. As a learner she was aware of how testing helped (helps!) her to understand where she is in her growth in the language. Testing also inspired her (rather than pressured her) to do more: “Tests pushed me to listen to, to write, to speak, and to read in German anywhere and everywhere I could”. She needed this push. Tests helped her.

This journey takes us down some important roads: we need to think about what we are testing and why. We need to consider the test in the same way we consider the course. It is normal practice to use action verbs in lesson plans and course plans; however, using such verbs, as she does, to formulate testing goals is something for us all to consider.

Tests, therefore, Emilia reminds us, should reflect the course. If we describe the course in terms of can-do, action-verb statements, the tests should reflect this in the activities and in the assessment. Interestingly, in this article, when talking about feedback, Emilia makes no reference to numbers! This is a crucial point. The good student will also be good at realising that they have done well; they don’t necessarily need a number or a letter to prove a point. The not-quite-so-good student could be demotivated by a low number. Emilia talks about tailoring feedback to student need, whether addressing communication errors or errors in the target language.

We teachers need to reconsider our approach to testing – especially internal tests and periodic “progress tests”. As Emilia encourages us to do, we should think of them positively and create tests that reflect the tasks and end goals of our courses. If we think in terms of these action verbs, we will create more dynamic and useful tests. Equally, if students feel our enthusiasm for a test, there is a greater likelihood that they will become infected by it too.

I commend this article to you. It is written with passion by an insightful practitioner and deserves our attention.

Lee Shutler

ETAS Journal Editorial Board

Testing from a language learner’s perspective

Emilia Siravo

My ongoing experience learning German has profoundly changed the way I approach second language teaching to adults. Thanks to my own teacher-student role reversal, I have been able to reexamine and reconsider every aspect of second language learning. One such area has been the use and effectiveness of testing in language classrooms.

As a language teacher to adults, I avoided testing. I argued that tests created too many negative affective filters, that my students did not necessarily have the time to take tests, and that more informal in-class assessments were a better measure of and means towards learning.

Yet, as a student, I felt the benefits of testing far outweighed the drawbacks. Although testing periods were sometimes stressful, I felt they were accompanied by tremendous strides in my language learning. Curious to understand why I felt taking tests helped me, I analysed the roads that led to and from them. Here is what I discovered.

Testing: Sets goals, provides extrinsic motivation

I recently purchased a Fitbit to track my daily activity. I have always been physically active, but as soon as I strapped my Fitbit on, I noticed I was moving much more. When I started, I was walking about 10,000 steps a day, but with the Fitbit on, I started averaging 15,000 steps daily. Simply having my movements measured pushed me to move more. I was not alone. In a systematic review conducted by Bravata et al. (2007), researchers found that people averaged about 26 per cent more steps daily when using a pedometer.

Just like with the Fitbit, having tests in my German class meant that I had measurable goals to work towards. Tests pushed me to listen to, to write, to speak, and to read in German anywhere and everywhere I could. It’s not that I did not study or practise German otherwise, I did. Yet, when I had clear testing goals, I practised and studied more. It was this extra, extrinsic push that helped me make strides forward.

Teaching implications

Thanks to my language learning experience, the need for clear testing goals became apparent. As a teacher, however, I struggled to create testing goals for my courses. Ultimately, what helped was surveying my students to understand their needs and using clearly defined lists of action verbs to formulate goals.

Conduct a needs analysis (NA) survey

To better understand my students’ thoughts and needs about testing, I conducted several needs analysis surveys (Nunan, 1988; Case, 2008), using both take-home questionnaires and in-class interviews. In the surveys, I asked students about their thoughts on and previous experiences with testing, and even what an ideal test would look like to them. When students expressed negative emotions related to testing, I asked what specifically had made their testing experience negative and what they thought could be done to improve testing as a whole.

The results were mixed – there are no broad-brush rules that apply to every class, let alone every student. But the conversations with my students about testing were essential in identifying their testing needs and ultimately in shaping each course’s testing plans.

Clearly formulate testing goals

Once I better understood my students’ needs, I had to translate those needs into measurable goals. Formulating goals, let alone measurable ones, has never been easy for me. What helped was using a list of Bloom’s taxonomy action verbs. These action verbs are very specific and therefore easier to measure. For example, instead of using verbs like know or understand, which can be vague (How do you know that a student has understood something?), I started using verbs such as identify, list, match, and define, which were easier to track. Using these verbs, a test’s goal might read something like this:

In this test, students should be able to (1) write an informal email to a friend and (2) describe their current living situation (city/neighborhood, work/study, home, etc.).

Break goals down into smaller steps

After formulating the goals, I broke them down into smaller steps of what students would need to do to achieve that goal. These objectives were formulated in the 'students should be able to …' structure to ensure clarity. In the case of the sample test goal above, students would need to be able to:

Use appropriate email structures (formal vs. informal)

Construct typical sentences/phrases that are normally used in informal emails (I hope you are well, I’m writing because, I look forward to seeing you soon, etc.)

Apply typical grammatical patterns that are used when talking about current situations (present continuous/present simple patterns, verbs of preference (like, don’t like, prefer etc.))

Make use of vocabulary related to living situations (describing homes, cities, work/study situations, etc.)

Once I had the list of testing goals and subsequent substeps, classroom planning became a lot easier. Students knew what was expected of them and I had a clearer idea of what to measure to ensure progress.

Testing: Provides feedback, raises awareness

Richard Schmidt’s (1990) noticing hypothesis posits that learners’ awareness of the difference between their current and target knowledge is one of the first critical steps of learning. For me, taking tests, but more importantly, receiving feedback about my performance, helped me notice the gap between my interlanguage and target language. This awareness enabled me to start taking the steps towards correcting myself.

Language researchers have long debated what type of feedback is best. Feedback may be implicit or explicit, immediate or delayed, and focused on either global or local errors. As a language learner, the type or delivery of feedback did not matter to me as much as what I could do with the feedback after I received it. In fact, even the most tailor-made feedback seemed useless to me unless I was given the opportunity to redo the assignment or to retake the test.

Teaching implications

As a teacher, I realised that for testing to be beneficial, I needed to (1) provide students with appropriate student-tailored feedback and (2) give students the opportunity to incorporate that feedback into their work.

Provide students with appropriate feedback

After each test, I gave each student feedback on both what was done well and areas for improvement. In cases where students scored perfectly, I tried to challenge them with more difficult questions or follow-up tasks. In contrast, when a student performed poorly, I met with that student individually to gain more insight into why they were experiencing difficulties.

In terms of what feedback to provide, I first addressed any global errors that affected meaning or overall communication, and then focused on errors that were either very frequent or related to an actual class topic. I tailored feedback to students’ individual needs and considered each student’s level and readiness to receive a particular piece of feedback.

Allow redos/retakes

As a teacher, I knew that I had to give students the opportunity to redo assignments and retake tests so that they could work with the feedback they received. While I often allowed redos immediately after a test, I sometimes also reintroduced and retested challenging material a few weeks or months later to see what, if anything, still needed further review. I also had students compare their original assignments with current work so that they could track their progress.

And so…

As a teacher, I used to see testing as outdated and ineffective. Yet, as a student, I saw another side to testing and realised that if tests focus on students’ needs, have clearly formulated goals, and provide appropriate feedback, they become an essential part of language learning.

References

Bloom’s taxonomy action verbs. (n.d.). Retrieved from http://www.apu.edu/live_data/files/333/blooms_taxonomy_action_verbs.pdf
Bravata, D.M., Smith-Spangler, C., Sundaram, V., Gienger, A.L., Lin, N., Lewis, R., Stave, C.D., Olkin, I., & Sirard, J.R. (2007). Using pedometers to increase physical activity and improve health: A systematic review. Journal of the American Medical Association, 298(19), 2296–2304. doi: 10.1001/jama.298.19.2296
Case, A. (2008). 15 ways to do a needs analysis. Retrieved July 2016 from http://edition.tefl.net/ideas/business/needs-analysis/
Nunan, D. (1988). Syllabus design. Oxford, UK: Oxford University Press.
Schmidt, R.W. (1990). The role of consciousness in second language learning. Applied Linguistics, 11, 129-158.

About the Author

Emilia Siravo is a freelance ESL teacher in Zürich, Switzerland. In addition to the CELTA, DELTA I, DELTA III, and SVEB certifications, Emilia received her Master’s in TESOL from The New School University. Emilia loves teaching and finds that second-language learning enables students to open new doors, discover new cultures, and explore new perspectives. Follow Emilia online on Twitter: @esiravo or read her blog: https://emiliasiravo.com