Digging for Knowledge: The Issue with Assessment in English

Assessment is one of the most difficult things we do in any subject. It is a delicate process where we try to create the opportunity to take a peek inside the heads of our students to see what has been learned and what they can actually do with that information. There is a lot of criticism of the current exam system, and there are a number of issues which, once we come through the chaos of the moment, we should be seeking to address. Norm referencing and the car crash of the English Language exam (something I examine here) are just two cases in point. That is not to say I think exams should be removed; there is a great deal of value in having common points of reference and some high-stakes accountability for students to work towards. I know myself that I work much better when I have a clear purpose and goal for my work, and it helps my motivation immensely.

However, in English in particular, I think we often make more errors with our assessment choices than perhaps in some other areas. The subject itself has been buffeted greatly over the years, so much so that at times it feels like it has lost all identity, something David Didau explores extensively in his new book Making Meaning in English. The idea of transferable skills has overtaken the need for any significant grounding in knowledge, and often we focus our assessments on what students can do, the skills, as opposed to what they know. The use of inference is rooted firmly in our knowledge of the topic in hand, not something that is easy to isolate from our knowledge of the text, and yet we often give students unseen material to make detailed inferences from and assume that they will have enough basic knowledge of the world without providing adequate context. Words which may have one meaning in one place can be transformed in another, and I can’t help but think of the very capable students in my class a few years ago who decided the reference to the ‘black bird’ in the text meant that there was something sinister and ominous about a character, able only to see the raven croaking himself hoarse at the arrival of Duncan, as opposed to a bright-eyed and curious blackbird hopping across the early morning dew.

We often assume a certain proficiency in English by the time students arrive at secondary school and, whilst we may teach them a huge range of texts and spend hours feeding back suggestions for improving their writing, we often rush them into writing about a text or creating their own before we take them through the basics. Then we rush to assess them. Students frequently enjoy free writing too, and whilst I am definitely not suggesting we strip that away completely, we need to help them to craft their work, building in opportunities to become proficient in this along the way. Without their knowledge of how writers do this, and without us having a clear understanding of what they know about the process, it becomes tricky. Mistakes are often repeated despite our attempts to assess, reteach and get them to redo.

Then we often look to the other end of the process, creating more difficulties in how we assess. Assessments, and indeed curriculums, have been distorted over the years as we try to replicate assessments for GCSE, bringing in tasks to emulate exam papers which have been carefully designed to assess at the end of the course, not at the start. We get year 7 students writing not about the craft of a writer in conveying their ideas, but about the language question on Paper 1, and assessment objectives have become a staple part of the language of the English classroom. I am not convinced that never seeing how an exam paper is constructed is the way to go, despite seeing very few myself at school, and there is evidence that practising exam questions under exam conditions can go a long way to alleviating some of the anxiety we have created around exams. I mentioned high-stakes exams being important for students, but I am not convinced we have got the balance quite right if we continue to have so many students needing significant exam adjustments because they have become so terrified by the process. And I am definitely not as bold as to go the way of one Maths teacher on Twitter who said the first time his students see an exam paper is on the day of the exam.

However, when we replicate these types of assessment lower down the school and aim to use this data in both formative and summative ways, I believe we are making things difficult for both ourselves and our students.

Too much noise

A phrase that I use a lot, having heard Professor Becky Allen talk about it a few years ago, is ‘noisy assessments’. When we give students a longer response assessment, we are going to be confronted with a number of issues. We have learned to be good at extracting general information about what a student knows and can do from them, sifting through the issues of literacy, time and misunderstandings around the questions, but these conclusions tend to be quite general and focus on exam technique over subject. Indeed, in my case, some of those conclusions could be quite superficial. I can see a student doesn’t include a quote to support a point. But the real question is why didn’t they include it? Couldn’t they find it? Were they struggling to scan back through the text? Didn’t they recognise its relevance? Did they not know what to say about it? Did they forget they were meant to include it? Did they not know how to include it in their text? Were they running out of time, so knew they were meant to include it, knew how to do it, but panicked and just moved on? Some of this can be unpicked via my knowledge of the student and subsequent conversations, but this is quite a labour-intensive process, and I know I would still end up making a guess and perhaps making assumptions about what they did or did not know. Often, we then move on to the next task without more than a cursory attempt to really address the issues which we flagged from the assessment itself.

Technical accuracy in writing is easier to assess, but what do I do when I notice students haven’t used a range of devices or a wide vocabulary? Again, I have had previous students who were clearly very capable, their verbal responses told me so, but when it came to a written assessment they often wrote in a rather pedestrian way. Little variety in vocabulary, despite knowing a huge range in their day-to-day discussions in class, poor sentence construction and the use of lazy tropes. I can see all of this in their work, much of which ran into pages as there was no crafting, but what does it really tell me about what they know? It wasn’t that they didn’t know about these choices; rather, when left to their own devices, they didn’t really understand how to integrate them into their work. These assessments didn’t really tell me anything new, and I was reliant again on what they had done in class, those snippets which actually gave me valuable information, and I would use that as the basis for much of my feedback and forward planning as opposed to what was in the long assessment they had spent so much time writing and I had spent so much time reading and marking.

Good assessments should give us some surprises, and whilst these longer assessments would raise lots of questions and threads to explore, they always seemed to provide more questions than answers too. Without the specific details it was hard to give really focused feedback that was going to have a meaningful impact, and again I think it was the nature of the assessment as opposed to a problem with me. I would work on it for hours and try to do the best for my classes, but it all felt quite disconnected.

As mentioned previously, I have always found that I know more about my students’ knowledge and abilities from the day-to-day classroom interactions and formative practices that have become habit. The continuous dialogue, questions, shorter writing pieces and moments where I have guided their practice in a particular element have always been a much richer vein to tap when it comes to understanding their thinking, identifying their knowledge and moving them forward. Assessments, especially those done every two or three weeks, only seemed to get in the way of the process, interrupting the flow of the learning and serving to frustrate both me and them at the apparent lack of progress. But this was the data which people were interested in the most, students included, and this was where much of the conversation would focus.

Leading in the dark

Despite leading a department for many years, it was only relatively recently that I started to dig around the assessment process and properly understand it. I say that as someone who has marked exams and designed assessments for the team. I have been woefully inadequate. The work of Daisy Christodoulou and Becky Allen, especially when considering what data might actually show, followed up by the Assessment Lead Programme run by Evidence Based Education, gave me opportunities to really reflect on what we were assessing, why, and what we were doing with it. Often it made me think that we were trying to assess many of the wrong things, in the wrong way, and then pretending we could do something with the information it produced.

These longer assessments in English lack nuance, and I have increasingly come to the view that we need to be much more granular in our approach, which is a significant change from previous practices in English. When first using a set of carefully designed multiple choice questions with a class, I was amazed at the amount of information I could mine from it, and whilst my formative assessment practices in class gave me a lot of direction to steer the class and the individuals within it, looking at the cold hard data from assessments such as this was a bit of a revelation. I started exploring other approaches, employing much shorter but much more precise assessments which allowed me to drill down into the knowledge they had and how I could adapt my teaching to accommodate it. This is one reason I think visible differentiation is hard to see in my classes: the subtle changes to my teaching, my repetition of key learning points to embed them and my ongoing assessment of what has now stuck won’t be visible to others who don’t know the group. All of that is based on more effective assessment.

What next?

Assessment has for some time felt like something that is done to us. I think we need to take greater control of the assessments we undertake in our teams, knowing that at the moment we can’t address the issues of the wider assessment picture, and the dialogue around these assessments would benefit from the same attention we give the curriculum. After all, you can have the most wonderful curriculum, but if it is not being learned it may as well not be taught in the first place. Quality of assessment is even more important now, especially with so much talk of gaps and catch up. But as Pedro De Bruyckere said in his ResearchED presentation back at the start of our first lockdown, the evidence from other areas of the world indicates not necessarily a wider gap in the learning but a wider spread. What students have and have not retained over this time will vary greatly from pupil to pupil, and whilst there have continued to be some amazing practices taking place remotely, this is perhaps one of the hardest things to do across a screen. Marc Rowland often reminds us we should be focusing on ‘assessment not assumptions’, and there are a lot of assumptions floating around at the moment: anything from £40,000 in lost future earnings to the idea that all students have lost learning and fallen behind. These assumptions could take us in the wrong direction if we are not careful, and longer, exam-style assessments would certainly help to ensure that would be the case as the pressure mounts to focus on deficits as opposed to building on what is there.

If we are really serious about getting our students to where they need to be to move forward with their learning, high-quality and forensic assessment is essential.
