Moving from SATs data to standardised data
Nicola goes on: “We found the solution in GL Education’s Progress Test Series. The tests measure progress against the Programmes of Study and are computer-based, so the level of preparation is minimal. We introduced them initially for Years 2 and 6, but now also deliver them in Year 4.”
Derek Watson, Phase Leader for Years 5 and 6, explains: “Progress Test data is generated at several different levels, allowing it to be used flexibly by different stakeholders within the school. Individual student and class data is shared with class teachers and examined at a granular level. Year group data is discussed with class teachers, heads of year and myself as the phase leader – we look at trends across a year group and discuss areas requiring improvement. We then look at phase data, which I review with the heads of year and the head of KS2; and whole-school data, which is a senior leadership conversation.
The Progress Tests allow us to compare ourselves against the UK average, and against an international average. This is something, as a leading international school, that our governors and top-tier management are really interested in – as it gives them a tangible way of seeing how well the school is performing.
The Standard Age Score (SAS) data is extremely useful in allowing us to track value added and see which teaching and learning approaches we’ve used across a term or year have added value. It also helps us to look at a child’s performance and their progress over time. That longitudinal assessment is really important. Measuring progress via SAS also allows us to see which classes are adding the most value. We can then put a spotlight on what’s going on in that particular class, and see which strategies are successful. We then carry out observations and peer-to-peer support so that all of our teachers are able to share best practice.”
Data to inform teaching and learning and demonstrate value added
Regular Pupil Progress Meetings offer an opportunity for each child to be looked at in detail.
Derek explains: “Pupil Progress Meetings (PPMs) are held four times a year. They give us the opportunity, as middle leaders, to sit down with classroom teachers, look at each curriculum area and drill down into each student’s performance, categorised as emerging, developing, secure or mastered. The most important factor in that meeting is our teachers’ professional judgements, but our assessment data feeds into the discussions and plays an important part in informing the analysis and guiding next steps. In the meetings we look at student data for the Progress Tests, as well as the New Group Reading Test (NGRT) and Pupil Attitudes to Self and School (PASS) survey – providing a valuable whole-student view.
We look for students who might be at risk of not meeting age-related expectations, so we can put in intervention strategies and timetable support for them. We also identify the highest attaining learners and make sure that strategies are in place to ensure these students are stretched and challenged. We are looking at the granular level, student by student, to ensure that all children are making appropriate progress and we are adding value.”
The tests support the school in identifying areas of the curriculum where further support may be needed, informing planning.
Nicola explains more: “Our whole-school and phase data analysis is invaluable. But it’s not so much having the data, but what we do with the data that’s key. For example, data from the Progress Test in Maths showed us that we needed to develop more opportunities for the children to have fluency with their mental recall and mental calculations. This fed directly back into the planning cycle for this past year and we made a significant shift in the delivery of the maths curriculum with positive results.”
Derek goes on: “We introduced an initiative with our KS2 team for a whole-class approach to guided reading. Using NGRT, we were able to review the impact of this change. We looked at the number of children achieving age-related expectations, and it remained steady at around 95%. But the number of children moving into mastery at greater depth rose from around 50% to 74%.
This gave us the evidence that our changed approach was allowing children to think more deeply about their reading and their learning, and subsequently they were moving into a mastery bracket. When we drilled down into this on a value-added basis, over the last two years we have on average added three standardised age points per student per year – so this approach is being validated by not just maintaining progress, but by adding value to learners.”
Identifying literacy needs and the impact of interventions
Literacy is a key focus for the school – and being able to assess reading in a robust way is vital to support the identification of needs and measure progress.
Derek explains: “NGRT allows us to assess reading three times a year, at the end of each term. It supports me, as the phase leader, in reviewing the data with class teachers – looking for trends, for children who require additional support, and for higher attainers too. We can show progress from year to year and validate some of the pedagogical decisions that we’ve made about reading.
As a High Performance Learning school, it’s important that we never put a cap on students’ learning. As NGRT is adaptive, it means that we’re never saying, ‘that’s as good as you can be’.”
It’s also really important as a way of assessing the impact of interventions. Derek explains: “One learner finished Year 5 with an NGRT SAS of 89. As they entered Year 6 and we held an initial PPM, we knew that this was a learner who needed significant input and intervention. They took part in an intensive six-week phonics recovery programme, and at the end of term 1 their NGRT SAS had improved from 89 to 110. We now not only have evidence of progress for that learner, but also evidence of the efficacy of the intervention, and can roll it out more widely.”
He concludes: “With the benefit of the insights that the data can bring, we have clear direction and we are emboldened to try new things – and the data validates those judgements. We are able to maintain our accountability, and the data that we get from the assessments allows us to back up the decisions we make with evidence of their impact.”