Testing Times for Scotland

'These are not high stakes tests; there will be no 'pass or fail' and no additional workload for children or teachers.' John Swinney 25/11/16 news.gov.scot

I start this look at the introduction of the Scottish National Standardised Assessments (SNSAs) with the statement above from John Swinney, the Deputy First Minister and Cabinet Secretary for Education and Skills, made when he announced that the contract for our new standardised testing had been awarded to ACER International UK Ltd. This organisation is a subsidiary of the Australian Council for Educational Research (ACER), which has been responsible for the development of the National Assessment Program - Literacy and Numeracy (NAPLAN) regime of high-stakes testing in the Australian system since 2008. I also believe they were one of a very short list of providers who tendered a bid for this contract.

I was drawn to this statement as I reflected on many of the responses I have received after I put out a request on Twitter asking for people to get in touch about their experiences with the new standardised tests as they are introduced across our schools. I sit on the board of Connect (formerly the Scottish Parent Teacher Council), and the issue of the new tests had been raised at a recent board meeting. I said I would gather more information for Connect, so that we would be able to offer advice to parents on the new testing regime, and hopefully allay some of their fears.

What quickly emerged was a very mixed picture of how the tests were being used across Scotland, but there was a commonality in the types of experiences children, teachers and schools were having, and it very much flew in the face of Mr Swinney's assurances given at the outset of their development.

'regarding SNSAs…Where do I start? I have had 3 children I have spent all year working with to build self-worth and self-belief, comment that they are ‘no good’, ‘useless’ and then cry. I have had one child who decided to guess most of the numeracy questions, and got them correct! (Lies, damned lies and statistics!) Most frustratingly, I am a class teacher administering the tests in class using 2 ipads and a desk top. Class of 27=81 tests. Huge impact on learning and teaching as you can imagine. With so many children suffering from low self-esteem and an increase in mental health issues, why is this happening? I truly despair.’

This was a response from a primary school class teacher, one of many who got in touch, expressing concern not only about the impact on learners and on the learning going on in their classrooms whilst testing was taking place, but also about the implications for their workload. I had a number of similar responses from teachers, school leaders and members of senior management teams.



 ‘ I took some of our P1s for their assessments today. We have 3 P1 teachers, who stayed with their classes, while 3 class teachers 2 learning support teachers and 3 PSAs spent all day doing the assessments 1-1, roughly 20-30 mins per child per literacy test, plus 15-20 mins for numeracy. Aside from the straight salary cost there, imagine the opportunity cost! The tests themselves are (obviously) far too narrow to give a decent picture of a child’s learning, but also seem generic rather than based on the taught P1 curriculum (despite the Scottish accent). The (now legendary) passage on hummingbirds is just ridiculous, I had one wee girl who was becoming so visibly crushed by it that I told her we would just leave it – I couldn’t let her suffer for something so unrealistic. Most of the children were exhausted by them, especially literacy, and certainly schools shouldn’t  have P1 children doing both in one sitting. I have 2 primary age daughters and if they were still in P1 I’d be withdrawing them from these. My opinion is that all the planned primary tests are at best unnecessary and possibly detrimental, but the P1 test seems to be actively harmful and a phenomenal drain on resources to no obvious benefit to the learners.’

This, from another class teacher, backed up what many colleagues were saying about the impacts on learners and teachers, as well as on wider school workloads. It was the first response that also started to query wider system issues with the new testing, such as the cost, the appropriateness of the content and the emotional impact on very young learners. One or two respondents indicated that most of their children weren't unduly stressed by the tests, as they were able to present them in a fun way, as a quiz or similar, but they still queried some of the content, the usefulness of the outcomes, and the disruption and impact being caused for teachers, children and schools.

 ‘Highlights of the P1 SNSA reading test included a passage on hummingbirds! Hummingbirds??? Vocabulary included hover and perch (and backwards). It also included a question asking what an alternative word for ‘beak’ was. So testing general knowledge then? It is impossible to do with a class of P1. SMT now doing individually, with all 70 plus P1s!!! Aaarghhh!!’

was a reflection of some of the frustrations felt by one headteacher. She went on to add,


‘seems to be the only game in town. I really question the validity of the ‘standardisation’ too. Even within my cluster we have some folks using iPads, some PCs, some testing all day, some only mornings, some individually and some whole class, some folk reading to their p1s instead of using the voice and doing the clicking because their mouse skills are not sophisticated enough. And don’t get me started on the IT and wifi capacity!!’

‘Who does my work while I collect meaningless data for HQ/Scot Govt?’

It would seem that many schools had resorted to senior management teams, Support for Learning teachers and other support staff, where there were any left, to carry out the testing, recognising the impossibility of teachers being able to deliver these tests, especially the P1 ones, whilst still teaching a class. The lack of equipment, and poor ICT systems were cited by many as a frustration and cause of more stress for teachers and young learners.
Another headteacher sent me the following,


‘The torture continues. P7 writing assessment (which in fact is assessing punctuation, grammar and spelling so therefore just the tools of writing) has questions where children asked to correct the spelling of a word. One of my enterprising P7s worked out that if you right click on the answer, the computer will tell you if its correct! Brilliant!’

This story caused a flurry of tweets and incredulity on Twitter, and beyond, and also pointed to a concern raised by many: that these tests of 'literacy' and 'numeracy' assess neither. What they assess are some of the skills required to demonstrate literacy and numeracy, but they are not a test of either literacy or numeracy.

The sense of frustration felt by one Support for Learning teacher is palpable in her response.

‘ SNSA aaaaahhhhhh! As you can imagine this is an extra to what we are all doing. Local authority has decided to do them in May, which is probably a good time of year.
Getting them all logged on, finding the website (the long name) and saving it in favourites takes time in itself. Logging onto the website is laborious for P1 as adult needs to do it as they are so long. OK for most P4 and P7.
P1 pupils need good competent keyboard/generic skills to complete assessments (click and drag, do not double click, etc.) Our screens do not show the ‘Next’ key unless pupils scroll down to find it.
P1 pupils have a lot of pointer movements to make every time they go to the next screen (go to top left to read out instructions then read out questions and possible answers, now find the ‘next’ button etc.)
The guidance says give pupils the same support they would get in class – this is quite subjective. Do you give them the support they DO get or what you would like them to get if there were more staff?? As a teacher I am unsure what is being assessed in some areas. For example is the reading assessing comprehension or decoding?
Teachers cannot do sample assessments.
No text to speech option for P4 and P7 pupils – for pupils who are still developing skills in decoding (only parts of the P1 have speech option)
Font is very small on P4 and P7 assessments – we are all having to peer at the screen.
P1 reading requires them to read or hear about 4 sections of a story before they answer questions – lots of memory rather than find the answer in the text.
Lots of words and names used in P1 assessments that are not decodable using Alphabetic Codes taught in P1.
P1 pupils need lots of support to get through the practice and 2 assessments. We do not have time to do 1 to 1 support so independent working through them digitally may not give correct measure against benchmarks.
‘I was demented this morning. Getting P1s set up. Broken headphones, notebooks with no audio! Eventually got them all working independently and keeping them happy. No idea how they have done. What a palaver! Glad I am retiring early after next session.’

She raises more issues about the validity and content of the tests, all of which have supposedly been tested and piloted extensively before their introduction, as well as the technical issues that teachers and schools are having to deal with. Since the introduction of such online testing was first mooted, many of these concerns have been raised by teachers and schools, but it would seem that little heed was paid to them.

Another class teacher pointed out yet another technical issue that surely could have been resolved before the tests went 'live'.
‘One of the problems we faced is that the usernames include the child’s middle names, so some of our kids are taking a long time to log in. One pupil has 5 middle names, time was up and he was still trying to log in.' 
Whilst another articulated a question many were asking,
‘How much is this costing? I have no jotters or whiteboard-pens, general basics to do my job …Ah, priorities. Hang them out to dry!’

It is clear that many local authorities are asking, or telling, their schools to administer the tests towards the end of the school year, i.e. May/June, which is a very busy time in schools anyway and does not allow teachers to use the results in a properly diagnostic way, but some have taken a different approach.

‘In our small cluster, we have analysed the SNSAs our P7 pupils sat in October. Teachers used the results diagnostically to aid planning, but we have looked at what the trends for cluster mean for secondary. Many of the results haven’t changed judgements about achievement of a level but some clear trends have emerged, which we will address for next session.’

However, this has allowed some to question the validity of the 'standardisation' claimed for the tests by the government and its supplier. What is clear is there are a range of approaches and experiences happening across Scotland, some of which bring into question the validity of outcomes produced by the testing software.

A DHT wrote,

‘Looked at P1 results with CT. Children are ranked Low, Medium or High. All exactly where CT put them at beginning of the week. A week of quality teaching time lost and stressed pupils and teachers … not to mention the cost of it all!’
which really does bring into question the added value to teachers' professional judgement from these assessments. If they are not telling teachers or schools anything they do not already know about learners, what then is their purpose, and at what cost? This was reflected in the latest comment I have received from a teacher.

‘Have just attended the phase B SNSA training. All about the data. We were told that the Scot Gvmt will not have access to the data. It belongs to the school and their LA. We were told again it is NOT high stakes, but there to inform the teachers. However she then kept telling us that HMIe will ask SMT what are they doing about areas flagged up as low. Kept referring to how it will show how PEF interventions are closing the gap and raising attainment. We pointed out that SNSA is done at P1, 4, 7 and S3 only. Unless you have data before and after a PEF intervention how can you possibly say what the impact is from SNSAs? The reports/graphs were so busy I defy anyone to have the time to fully interrogate them for each pupil as we were being shown. It also does not produce block graphs for year groups less than ten pupils, which means that many small schools cannot get them. We also said we do not see how they can be standardised assessments if LAs can do them at whatever time of year they choose.’

On the last point, it would also seem that schools are administering the tests in a myriad of ways, and with varying levels of support for learners. All this brings into question the validity of the 'results' across schools, local authorities and further afield. Observing from the outside now, it would seem to me that the Government rhetoric about the tests 'not being high stakes' is being ignored by local authorities, who are making them, alongside the benchmarks, very much high stakes, and central to how they judge schools. This is exactly the scenario that played out in Australia with the NAPLAN tests, in England with SATs, and in other countries that have gone down similar routes. In all these countries, the early talk was of tests supporting 'teacher professional judgement', but they soon mutated into high-stakes accountability measures. Scotland is heading the same way.

Some of the tweets I received from teachers included the following selection:
‘Accountability. Pure and simple. In no way will this benefit our learners.’
‘If we can’t clearly decide the nature of the question it shouldn’t be used – a reading passage should have all the answers. Anyway the whole set-up is simply ScotGovt data trawling not promoting best practice.’
‘The maths question about how many Tuesdays in a particular calendar month made my heart sink. Far too difficult and not reflective of Early level,’
‘This is for P1!! Its not reflective of early level literacy curriculum. The hummingbird passage is beyond the expected usual level by the end of P1. That question in particular totally relies on children’s own prior knowledge of birds, there were no contextual clues.’
‘AND it was in the norming study completed in march when I know that HTs specifically said that that particular passage was not appropriate for P1, when asked for feedback re the assessment.’
‘And also a gorgeous and very bright P1 could say, ‘I am not good, am I’ after trying really hard to work out the words in the ‘hummingbird’ passage. Well done the system – a curious and excited learner demoralised!’
‘I have just had a flash back to the Counting Rhymes in an African Village paper from 5-14 test bank. Is the purpose of spending all this money to help teachers know how chn are progressing? That will be a great help because how would teachers ever know otherwise??? ‘
‘Can parents ask for their child not to do this?’
As things stand, I have hundreds of responses to this request for information about the tests, and whilst I recognise this is anything but a scientific examination of SNSAs, I do think there is enough already for the profession and parents to be concerned about. Regarding that last question in a tweet: the tests are not compulsory or mandatory, and the Government's own advice recognises this. However, some schools and local authorities are presenting them to parents as 'mandatory'. I would argue that, even were they designated 'mandatory', parents would still have the right to withdraw their children. After all, they are their children, and if they think the impacts of such testing are harmful to their wellbeing, then they should withdraw them.

Just like the tests themselves, my request for thoughts around them provides us with a snapshot in time, and quite early in the timeframe of their introduction. However, I think there are indications of significant issues that need to be addressed by the Scottish government, local authorities and schools. I have summarised these as follows:
Assessments aren’t really assessing literacy and numeracy, just bits of the skills required to be literate and numerate
Tests not assessing the taught curriculum in Scotland, especially at Early Level
They don’t reflect the principles and practice of CfE
Technical problems within the tests themselves
Workload for teachers and schools, and time being swallowed up in their administration
Lack of, or poor, hardware and infrastructures in schools to administer tests
Lack of ‘standardisation’ in how they are being applied, used and supported – a very mixed approach across the country
Stresses for children, especially P1s, and staff
When and how tests are being delivered is being heavily dictated by LAs
Are the tests actually telling the teachers anything they don’t already know, and at what cost?
Headteachers telling parents tests are mandatory, or not even informing parents they are taking place
The validity of the tests, how they will be interpreted, and how they will be used by schools, LAs and government
Does categorising learners as 'Low', 'Medium' or 'High' promote setting, labelling and further disadvantage?
I think there are big questions for everyone in the Scottish system to ask and seek answers to. The cost of the introduction of the SNSAs is huge, running into millions of pounds, much of which is 'hidden' and being absorbed by schools and local authorities. The big question is: is it worth it? The EIS said it would oppose the carrying out of the tests if they began to skew the curriculum and put undue extra pressure on their members. I would suggest both of those are already beginning to happen. Teachers and school leaders need to be asking, as suggested by Mr Swinney himself, whether they have more freedom to focus on learning and teaching with the introduction of the tests. In 2017 he said, 'When Scotland set out to reform our school curriculum, a critical question was how we break free of the top-down diktats that dominated Scottish school education.' He gave teachers and schools 'permission' to challenge anything that took them away from the core business of learning and teaching. Perhaps it is now time to make some of those challenges!
If you don't think it is worth it, just read this tweet again,
‘And also a gorgeous and very bright P1 could say, ‘I am not good, am I’ after trying really hard to work out the words in the ‘hummingbird’ passage. Well done the system – a curious and excited learner demoralised!’
Is that really what we want for our very youngest learners? I hope not! Perhaps we are all being tested?
