Many readers of this blog will know that we have been working for some years to raise the importance of reading in secondary schools, and showing how struggling adolescent readers can make impressive gains with skilled instruction. Rather than write off children as ‘low ability’ or ‘disabled’, schools should review their systems and upskill their teaching staff to eliminate adolescent illiteracy.
So, when Ofsted released its research report: ‘Now the whole school is reading’: Supporting Struggling Readers in Secondary School, on October 31, there was a sense that the education system is finally beginning to pay more attention to this issue. The report has some sensible recommendations, and it highlights the importance and potential of effective intervention.
On the other hand, whenever Ofsted takes a position, it can prove to be a double-edged sword. When schools begin to do things because ‘Ofsted wants’ rather than ‘we believe’, the impact is often superficial. Our experience of implementing change in literacy practice in secondary schools has taught us that beliefs really do matter. Short-term compliance does not – repeat, does not – yield the same results as action based on carefully examined beliefs.
So, to the report itself. The headline findings are summarised towards the beginning of the document and, as noted above, these largely line up with practices that we have been advocating:
- leaders making reading a priority;
- identifying all pupils with reading difficulties and providing support at the classroom, small group and individual levels;
- ensuring staff have the required expertise to deal with some of the most serious learning problems;
- detailed monitoring of student progress;
- sharing assessment information with teachers so that they can provide appropriate support in the classroom environment.
Beyond the headlines
So much for the headlines. As always, the details matter. And here there is plenty to caveat or disagree with. The content and examples in the report do not always line up well with the headline statements, with potentially problematic results for implementation.
Screening systems are not well exemplified.
The HMIs interviewed commented that many schools use Accelerated Reader for reading data but that this was often not used systematically. Leaving aside the question of whether it is appropriate for Ofsted to be naming commercial programmes in their report, we would argue that this is not the appropriate tool for measuring reading, and that it does not provide nearly enough information to be useful in identifying types or degrees of reading difficulties. Screening systems need to be much more precise.
Broad-brush approaches to intervention are advocated by implication.
Case Study 2 under ‘Identifying pupils’ reading gaps’ simply takes children with a reading age of 9 or above and puts them into a phonics group. Please don’t do this. Some of those children may have few phonics gaps, and their low test scores may result from weak comprehension instead. Or, indeed, they may be average readers who made little effort in the test. It’s essential to use further diagnostic testing to identify domains of difficulty.
Expectations of the expertise required are set too low.
Schools (and Ofsted) often underestimate the amount of learning required to become an expert teacher of reading. For example, they comment that some schools give staff two days’ training. That might be enough to deliver a simple programme, but it’s nowhere near enough to become an expert. See our blogs Teaching Reading is Rocket Science, Teaching Reading: It’s Not as Niche as You Think, and 12 Qualities of an Effective Reading Teacher.
Teaching phase does not necessarily indicate expertise.
The report notes that some secondary schools rely on the expertise of primary teachers to help them solve reading problems. However, the fix is not that simple. Secondary schools should consider that the reading problems they’re trying to solve developed during six years of primary school. What students need is an effective reading teacher, not necessarily a primary teacher per se.
Monitoring of student progress is exemplified poorly.
While Ofsted says that successful schools monitor progress closely, the details suggest that they don’t really know what this requires. They give the example of a school which uses a pre- and post-test to check ‘if the intervention is right for the child’. First, the school should have known whether the intervention was appropriate before placing the child; otherwise they may be wasting the pupil’s time. Second, even if the pupil has been placed appropriately, waiting until the post-test to discover that they have not made enough progress is clearly inadequate. Progress monitoring should happen in every lesson.
Leaving aside the utility of the research findings, there are also concerns about methodology.
The evidence base is narrow. There are just six schools in the sample (this is acknowledged in the ‘limitations’ section of the report). These schools were selected on the basis of high rates of progress for students who scored low for reading comprehension in Year 6 SATs and scored at grade 4 or above in GCSE English in Year 11. This is a proxy for progress in reading, and a somewhat distant one. For example, GL Assessment’s report from 2020 found a correlation of 0.65 between reading age and GCSE English. Year 6 SATs do not measure some aspects of reading, for example decoding skills. So neither measure is entirely valid, and putting them together does not necessarily lead to finding the best outcomes for improved reading in UK secondary schools.
The GCSE results were affected by the use of teacher assessments following the disruptions of the Covid-19 pandemic. Given the significant grade inflation at that time, impressions of students’ progress may have been overly generous.
While the report states ‘The findings show what works in these schools, rather than providing recommendations for all schools’, there is no doubt that, because of the accountability pressures in the UK system, school leaders will treat this report as a set of recommendations. For leaders who are looking for more depth and clarity on the topic, we’ve linked to posts relevant to each of the headline findings. And we cover all this and more in our book.
Senior leaders prioritised reading.
See our blog (2016) on leadership, mission, and vision, and our checklist on the need for a whole school approach (2015).
Schools accurately identified gaps in pupils’ reading knowledge.
Staff who taught reading had the expertise they needed to teach weaker readers.
Leaders shared information about struggling readers with staff.
See our follow-up blog for researchED Home on screening and assessment (2020).
Schools had clear procedures in place to monitor this teaching and its impact on struggling readers.
See our blog on 15 Tests for Secondary School Reading Interventions (2014), our notes on effective intervention and on achieving striking results from intervention (2020).
As pupils’ reading improved, they gained confidence and became more motivated to engage with reading in class.
We’ve written about the phenomenon of ‘desert bloom’ (when struggling readers find new-found confidence) here, about the potential for change in students’ self-esteem, and about the enormous stress caused by being unable to read in Can Reading Problems Affect Mental Health?
Schools tended to stop additional support and monitoring once pupils were beyond key stage 3.
Don’t stop supporting or monitoring older students – time is of the essence as pupils approach exams. Even students who have graduated from reading interventions should be followed up annually. You can read more about this and the topics above in our book (see link below).
To summarise: it’s great to see that more and more people in secondary education are realising the pervasive importance of reading; but just because Ofsted says something, doesn’t mean it’s automatically a good idea. The rule still applies: caveat emptor.