Data Dashboards across the MAT…

I’ve had quite a bit of interest over the last few weeks in the assessment and analytics development we’ve been working on as a trust, so I thought I’d share some of our thinking, along with some insights into how we’ve developed consistent summative assessment processes across our trust of 8 primary schools. I was also supposed to present at the BETT Show this week, sharing the analytics tools we’ve developed, but couldn’t make it – so instead I’ll share some thoughts here.

Trust-Wide Assessment

One of my responsibilities across Northampton Primary Academy Trust is to develop our approaches to assessment in a world without levels. A big part of NPAT’s development is to constantly look at where we standardise practices and where we leave approaches down to individual schools and teachers. One area that made a lot of sense for us to standardise was our summative assessment processes and over the last three years we’ve been working on the what, why, when and how of assessment.

Standardised tests

Like many others, we’ve come round to the view that standardised tests across schools are a really important part of our internal assessment system. There are various standardised tests out there; we use PIRA and PUMA for Reading and Maths respectively. Although ‘testing’ can get a bad press, we see a number of real benefits, including the following:

  • They are more reliable than a teacher assessment grade in comparing attainment.
  • They take much less curriculum time than other forms of ‘Teacher Assessment’ or tests – the ones we use take 45 minutes each.
  • The workload associated with standardised tests is much less than other lengthy processes we’ve experienced involving evidence gathering or maintaining tracking systems with large amounts of objectives.

On the subject of standardised testing, James Pembroke’s post here is well worth a read.

If there is a question around tests such as PIRA and PUMA, it’s around validity: how relevant is the information you get from them in relation to, say, the new end-of-KS2 tests? A specific example is that there is almost no arithmetic in the PUMA tests compared with the new requirements at the end of KS2. Where the outputs are useful, though, is as a predictor of the outcome children are likely to achieve at the end of Year 6, and thankfully some early correlation work now exists – such as this from Tyrone Samuel from Ark Schools – which we can build on. Having a sense of how our children are performing in relation to the rest of the country is a really useful thing.

Stop Chasing Shadows

Getting good standardised attainment data from across different classes and schools is really helpful when identifying the current strengths and weaknesses in a school – particularly in comparison to others. By being able to see data such as comparative average scores, we can flag up potential strengths and weaknesses more accurately across KS2 and, crucially, before children get to Year 6.

Having an earlier radar on standards can help us to focus on the live issues in the school, rather than being duped into a game of chasing shadows, responding to what RAISE/ASP or FFT says about children who left months before. It also gives us the opportunity to intervene earlier in KS2 when necessary, which I hope can mean there is less clamour in Year 6 as cohorts progress through.

How does it work?

Very simply, we have identified three standardised assessment points across the year (AP1, AP2 and AP3), in December, March and June. Children in Years 3–5 complete PIRA and PUMA tests at each point. Children in Year 6 complete the 2016 SATs paper at AP1, the 2017 SATs paper at AP2 and then the real thing in May. This happens consistently in all our schools at these times.

Collecting and Cleansing Data

Once tests are marked, teachers input the results into our MIS (SIMS), and this data is then checked centrally to ensure it is complete and in the right format. There is a lot of data ‘cleansing’ to do at this point, where data needs reformatting, double-checking and testing. This is a really important stage and has required us to invest in staffing to manage the process, as well as to solve technical challenges so that the data manager has access to each school’s MIS remotely.
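
To give a flavour of what ‘cleansing’ means in practice, here is a minimal sketch of the kind of automated checks that can be run over each result record. The field names and the valid score band are assumptions for illustration only, not the trust’s actual schema:

```python
# A minimal sketch of automated cleansing checks on one pupil result record.
# Field names ("upn", "school", etc.) and the 70-130 score band are
# hypothetical examples, not the trust's real schema.

def validate_record(record: dict) -> list:
    """Return a list of problems found in one pupil result record."""
    problems = []
    for field in ("upn", "school", "year_group", "subject", "score"):
        if field not in record or record.get(field) in (None, ""):
            problems.append(f"missing field: {field}")
    score = record.get("score")
    if isinstance(score, (int, float)):
        # Standardised scores typically sit in a fixed band (70-130 assumed here).
        if not (70 <= score <= 130):
            problems.append(f"score out of range: {score}")
    elif score not in (None, ""):
        problems.append(f"score not numeric: {score!r}")
    return problems
```

Running every record through checks like these before it leaves the school means the central analysis never has to guess what a blank or a typo meant.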

Once the data in SIMS is complete, it is sucked up by a ‘data agent’, and all the information held in the different schools is stored centrally in a data warehouse. This part is really clever – way beyond my skill set – and we’ve worked with Matt from Coscole Ltd., who does this work across our trust.

Once the data is in the warehouse, it can then be used for different purposes. This is part of our mantra to ‘collect once, use many times’.

Trust-Wide Analysis

Power BI (again customised and hosted through Coscole Ltd.) then provides the ‘front end’ which is the bit that school staff can engage with. It’s a part of our Office 365 dashboard which all staff already have access to and so it doesn’t require any additional login.

The following three screens are dashboard extracts from our system, which allow us to compare attainment from standardised tests across schools in the trust. There are a range of filters you can tinker with to view the same analytics by school, contextual group and so on.

Please note that the images here are from a version of our data in which all names of schools and individuals have been changed and results randomised so that no-one and no school can be identified.

This is a summary dashboard showing Reading and Maths headline data across all schools (light blue is Reading and black is Maths). You can view this by different year groups or altogether. In this screen, we are viewing Year 3 data.
This dashboard shows the current Year 6 Reading data in December, based on a previous year’s SATs test, across all trust schools. The data here indicates that 65% have achieved 100+ and that 59% of FSM6 children have achieved 100+. There is also analysis by gender, SEND and PP on the right-hand side.
This dashboard shows Maths average data for all Year 3 classes across the trust. It displays the same breakdown as the Reading dashboard.
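
Headline figures like the ‘65% achieving 100+’ above are simple to derive once the raw scores are in one place. The sketch below shows the calculation with invented data; the pupil records and the FSM6 flag are hypothetical examples, not our actual dataset:

```python
# Sketch of deriving a headline "% achieving 100+" figure from raw scores.
# The pupil data and the "fsm6" flag are invented for illustration.

def pct_at_expected(results, threshold=100):
    """Percentage of pupils scoring at or above the threshold (1 d.p.)."""
    if not results:
        return 0.0
    hits = sum(1 for r in results if r["score"] >= threshold)
    return round(100 * hits / len(results), 1)

pupils = [
    {"score": 104, "fsm6": True},
    {"score": 97,  "fsm6": False},
    {"score": 110, "fsm6": True},
    {"score": 101, "fsm6": False},
]

overall = pct_at_expected(pupils)                          # -> 75.0
fsm6 = pct_at_expected([p for p in pupils if p["fsm6"]])   # -> 100.0
```

The same function works for any contextual group – gender, SEND, PP – simply by filtering the list before passing it in, which is essentially what the dashboard filters do.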

This final screen is a scatterplot matching prior attainment (1 = Low, 2 = Middle, 3 = High, 0 = No Data) against current test scores. This is a much more visual way of comparing these two fields than looking down a spreadsheet. The same comparisons can be made with targets.

In this dashboard, the vertical axis separates the children by their prior attainment and the horizontal axis plots their average standardised score according to PIRA/PUMA (or Y6 test).
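
The data behind such a scatterplot is straightforward: each pupil becomes one point, pairing their average standardised score with their prior-attainment band. A sketch, with invented pupil data:

```python
# Sketch of preparing scatterplot points: each pupil becomes a point
# (average standardised score, prior-attainment band). Pupil data is invented.

PRIOR_BANDS = {0: "No Data", 1: "Low", 2: "Middle", 3: "High"}

def scatter_points(pupils):
    """Return (x, y) pairs: x = average score across tests, y = prior band."""
    points = []
    for p in pupils:
        avg = sum(p["scores"]) / len(p["scores"])
        points.append((avg, p["prior_band"]))
    return points

pupils = [
    {"prior_band": 1, "scores": [92, 96]},    # low prior attainment
    {"prior_band": 3, "scores": [112, 108]},  # high prior attainment
]
# scatter_points(pupils) -> [(94.0, 1), (110.0, 3)]
```

Plotting these pairs (with any charting tool) immediately shows, say, a high-prior-attainment child sitting on a low current score – the kind of mismatch that is easy to miss when scanning down a spreadsheet.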

There’s lots more I could write about assessment (who knows, I might have a chapter in an upcoming book?) but that’s all for now – hopefully enough to give a taster.

We’re hoping to host a visit to the trust later in the Spring term where anyone interested can find out more about how the data analysis works.

I’d be interested in any comments or suggestions for improvements, and to know what other trusts or groups of schools are doing in this area around data.