Improving Secondary Writing 2023–24: Year 7, 8 & 9 results, Sept 2023

Daisy Christodoulou
The No More Marking Blog
5 min read · Oct 20, 2023


Over the last few weeks, 195 secondary schools have taken part in our KS3 writing moderation tasks. In total, 5,114 teachers from these schools judged the writing of 61,416 Year 7, 8 & 9 pupils. The overall reliability of the judging was between 0.90 and 0.91, which shows a high level of consistency between teachers.

The breakdown by year group was as follows.

The task

The task is shown below. Pupils completed the task in controlled conditions to ensure the writing was independent.

The results

You can see the scaled scores of all the pupils on the chart below. We have split them up by year group, but the pupils all did the same task and their results are on the same scale, so you can compare across year groups. In fact, our measurement scale is consistent across all of our assessments at primary and secondary. It runs from approximately 300–700, and you can read more about how we created it here.

If you would like to read more about comparisons between the year groups, see our separate blog here.

Now that we have established the measurement scale, we can use it to provide students with two more pieces of information: a GCSE grade indicator, and a writing age.

GCSE grade indicator

We have written before (extensively!) about the limitations of grading, and we think that the scaled score and writing age both provide you with more nuanced information about student performance. However, we also know that schools are keen to have at least some idea of where the national grade thresholds might sit, so we have provided a statistical indicator of where these grades fall. This statistical indicator obviously does not represent the grade a student would get if they took a GCSE right now! Instead, it applies a typical GCSE distribution to the performance of these students in Year 7, 8 & 9.

Put simply, we use prior information about the 2019 GCSE results of our participating schools to decide the proportions of each grade to award. So, for example, we know that 2.7% of pupils at our participating schools got a grade 9 in the 2019 GCSE — so we award the top 2.7% of pupils in each year group in our assessment a grade 9, and so on and so forth for each grade. You can see what this looks like on the graph below.

We can then identify the scaled score points that these thresholds correspond to. The grade boundaries are different for each year group: that's because you need a higher score to be in the top 2.7% in Year 9 than you do in Year 7!
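To make this concrete, here is a rough sketch in Python of how a percentile-based boundary can be derived. The only figure below taken from the post is the 2.7% for grade 9; the other grade proportions and the simulated scaled scores are invented purely for illustration, so this is not our actual code or data.

```python
# A rough sketch of the percentile calculation, written for illustration only.
# The 2.7% figure for grade 9 comes from the post; every other proportion and
# all of the scores below are made-up assumptions, not No More Marking data.
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical scaled scores for one year group (the real scale runs roughly 300-700).
scaled_scores = rng.normal(loc=500, scale=50, size=10_000)

# Illustrative cumulative proportions of pupils at or above each grade.
share_at_or_above = {
    9: 0.027,  # top 2.7% get a grade 9 (figure quoted in the post)
    8: 0.080,  # placeholder
    7: 0.170,  # placeholder
    6: 0.300,  # placeholder
    5: 0.450,  # placeholder
    4: 0.620,  # placeholder
}

# The boundary for a grade is the scaled score that marks off the top X% of
# the year group, i.e. the (1 - X) quantile of the score distribution.
boundaries = {
    grade: round(float(np.quantile(scaled_scores, 1 - share)))
    for grade, share in share_at_or_above.items()
}

print(boundaries)  # grade -> scaled score boundary for this simulated cohort
```

Because each boundary is simply a quantile of that year group's scores, the same proportions automatically produce higher boundaries for a year group (or a later assessment point) whose scores are higher.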

The full set of grade boundaries is in the PDF reports.

The grade boundaries are statistical indicators. We do not use the GCSE mark schemes, or any adaptation of them, to create the boundaries.

Why are these boundaries different for each year group?

To keep the proportion of pupils awarded each grade consistent, the boundaries have to increase with each year group. Here's an example: to get a grade 9 in Sept of Year 7, you had to be in the top 2.7% of pupils in Sept of Year 7. To get one in Sept of Year 9, you had to be in the top 2.7% of pupils in Sept of Year 9. You need a higher score to get into the top 2.7% of Year 9 than the top 2.7% of Year 7!

This is also why the grade boundaries change from Sept to May. You need a higher score to get into the top 2.7% in May than in Sept.

So are they predicted or expected grades, then?

We find that different schools have very different interpretations of what they mean by ‘predicted’ and ‘expected’! It may be that what we have created corresponds to your usual definition of predicted — it may not be!

The best way of thinking about these grades is as follows: a student who gets a grade 9 is a student whose performance has placed them in the top 2.7% of a large, nationally representative sample of pupils doing the same task at the same time in the same year group. (And so on and so forth for each grade).

In total, approximately 5,000 pupils got a grade 4 on the Year 7 assessment. If (a big if!!) grading standards in summer 2027 are consistent with those in 2019, we would expect that when those students sit their English Language GCSE, they would average grade 4. However, we’d also expect that this average would conceal a great deal of individual variation.

Are these grades similar to a Progress 8 baseline calculation?

Yes, that's a good way of thinking about these grades. They have been calculated in a similar way to the Progress 8 baseline, which is based on KS2 Sats attainment.

Writing age

As well as converting every point on our measurement scale to a grade, we have also converted it to a writing age by using information from our previous assessments. For example, we know that on average, pupils whose actual age is 9y6m tend to score 533 on our writing assessments. So when a pupil scores 533, we can say that they have a writing age of 9y6m.
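As a rough sketch of how a conversion like this can work, the snippet below interpolates between (age, average score) pairs. Apart from the 9y6m/533 pairing mentioned above, the numbers are invented for illustration and are not our actual lookup table.

```python
# A rough sketch of a scaled-score-to-writing-age conversion. Only the
# 9y6m -> 533 pair comes from the post; the other rows are invented.
import numpy as np

age_in_months = np.array([102, 114, 126, 138, 150])   # 8y6m, 9y6m, ..., 12y6m
average_score = np.array([495, 533, 560, 582, 600])   # illustrative averages

def writing_age(score: float) -> str:
    """Interpolate a writing age (years and months) from a scaled score."""
    months = float(np.interp(score, average_score, age_in_months))
    years, remainder = divmod(round(months), 12)
    return f"{years}y{remainder}m"

print(writing_age(533))  # -> 9y6m, matching the example in the post
```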

Learn more

  • You can see some exemplars of writing here.
  • You can compare the results of Years 7, 8 & 9 here.
  • If you didn’t take part in this assessment and would like to learn more about how Comparative Judgement works, we have an intro webinar on Thursday 16th November at 4pm.
  • We have lots of resources that will help improve students’ writing. Read more about our Writing Hub and Automark websites here.
