
Testing Times for Scotland

'These are not high stakes tests; there will be no 'pass or fail' and no additional workload for children or teachers.' John Swinney 25/11/16 news.gov.scot

I start this look at the introduction of the Scottish National Standardised Assessments (SNSAs) with the statement above from John Swinney, the Deputy First Minister and Cabinet Secretary for Education and Skills, made when he announced that the contract for our new standardised testing had been awarded to ACER International UK, Ltd. This organisation is a subsidiary of the Australian Council for Educational Research (ACER), which has been responsible for the development of the National Assessment Program - Literacy and Numeracy (NAPLAN) regime of high-stakes testing in the Australian system since 2008. I also believe they were one of a very short list of providers who tendered a bid for this contract.

I was drawn to this statement as I reflected on many of the responses I have received after I put out a request on Twitter asking for people to get in touch about their experiences with the new standardised tests as they are introduced across our schools. I sit on the board of Connect (formerly the Scottish Parent Teacher Council), and the issue of the new tests had been raised at a recent board meeting. I said I would gather more information for Connect, so that we would be able to offer advice to parents on the new testing regime and hopefully allay some of their fears.

What quickly emerged was a very mixed picture of how the tests were being used across Scotland, but there was a commonality in the types of experiences children, teachers and schools were having, and it very much flew in the face of the assurances Mr Swinney gave at the outset of their development.

'regarding SNSAs…Where do I start? I have had 3 children I have spent all year working with to build self-worth and self-belief, comment that they are ‘no good’, ‘useless’ and then cry. I have had one child who decided to guess most of the numeracy questions, and got them correct! (Lies, damned lies and statistics!) Most frustratingly, I am a class teacher administering the tests in class using 2 ipads and a desk top. Class of 27=81 tests. Huge impact on learning and teaching as you can imagine. With so many children suffering from low self-esteem and an increase in mental health issues, why is this happening? I truly despair.’

This was a response from a primary school class teacher, one of many who got in touch, expressing concern not only about the impact on learners and on the learning going on in their classrooms whilst testing was taking place, but also about the implications for their workload. I had a number of similar responses from teachers, school leaders and members of senior management teams.



 ‘ I took some of our P1s for their assessments today. We have 3 P1 teachers, who stayed with their classes, while 3 class teachers 2 learning support teachers and 3 PSAs spent all day doing the assessments 1-1, roughly 20-30 mins per child per literacy test, plus 15-20 mins for numeracy. Aside from the straight salary cost there, imagine the opportunity cost! The tests themselves are (obviously) far too narrow to give a decent picture of a child’s learning, but also seem generic rather than based on the taught P1 curriculum (despite the Scottish accent). The (now legendary) passage on hummingbirds is just ridiculous, I had one wee girl who was becoming so visibly crushed by it that I told her we would just leave it – I couldn’t let her suffer for something so unrealistic. Most of the children were exhausted by them, especially literacy, and certainly schools shouldn’t  have P1 children doing both in one sitting. I have 2 primary age daughters and if they were still in P1 I’d be withdrawing them from these. My opinion is that all the planned primary tests are at best unnecessary and possibly detrimental, but the P1 test seems to be actively harmful and a phenomenal drain on resources to no obvious benefit to the learners.’

This, from another class teacher, backed up what many colleagues were saying about the impacts on learners, teachers and wider school workloads. It was also the first response to query wider system issues with the new testing, such as the cost, the appropriateness of the content and the emotional impact on very young learners. One or two respondents indicated that most of their children weren't unduly stressed by the tests, as they had been able to present them in a fun way, such as a quiz, but they still queried some of the content, the usefulness of the outcomes, and the disruption and impact being caused for teachers, children and schools.

 ‘Highlights of the P1 SNSA reading test included a passage on hummingbirds! Hummingbirds??? Vocabulary included hover and perch (and backwards). It also included a question asking what an alternative word for ‘beak’ was. So testing general knowledge then? It is impossible to do with a class of P1. SMT now doing individually, with all 70 plus P1s!!! Aaarghhh!!’

was a reflection of some of the frustrations felt by one headteacher. She went on to add,


‘seems to be the only game in town. I really question the validity of the ‘standardisation’ too. Even within my cluster we have some folks using iPads, some PCs, some testing all day, some only mornings, some individually and some whole class, some folk reading to their p1s instead of using the voice and doing the clicking because their mouse skills are not sophisticated enough. And don’t get me started on the IT and wifi capacity!!’

‘Who does my work while I collect meaningless data for HQ/Scot Govt?’

It would seem that many schools had resorted to using senior management teams, Support for Learning teachers and other support staff, where there were any left, to carry out the testing, recognising the impossibility of teachers delivering these tests, especially the P1 ones, whilst still teaching a class. The lack of equipment and poor ICT systems were cited by many as a frustration and a cause of more stress for teachers and young learners.
Another headteacher sent me the following,


‘The torture continues. P7 writing assessment (which in fact is assessing punctuation, grammar and spelling so therefore just the tools of writing) has questions where children asked to correct the spelling of a word. One of my enterprising P7s worked out that if you right click on the answer, the computer will tell you if its correct! Brilliant!’

This story caused a flurry of tweets and incredulity on Twitter, and beyond, and also pointed to a concern raised by many: that these tests of 'literacy' and 'numeracy' assess neither. What they assess is some of the skills required to demonstrate literacy and numeracy, but they are not a test of either literacy or numeracy.

The sense of frustration felt by one Support for Learning teacher is palpable in her response.

‘ SNSA aaaaahhhhhh! As you can imagine this is an extra to what we are all doing. Local authority has decided to do them in May, which is probably a good time of year.
Getting them all logged on, finding the website (the long name) and saving it in favourites takes time in itself. Logging onto the website is laborious for P1 as adult needs to do it as they are so long. OK for most P4 and P7.
P1 pupils need good competent keyboard/generic skills to complete assessments (click and drag, do not double click, etc.) Our screens do not show the ‘Next’ key unless pupils scroll down to find it.
P1 pupils have a lot of pointer movements to make every time they go to the next screen (go to top left to read out instructions then read out questions and possible answers, now find the ‘next’ button etc.)
The guidance says give pupils the same support they would get in class – this is quite subjective. Do you give them the support they DO get or what you would like them to get if there were more staff?? As a teacher I am unsure what is being assessed in some areas. For example is the reading assessing comprehension or decoding?
Teachers cannot do sample assessments.
No text to speech option for P4 and P7 pupils – for pupils who are still developing skills in decoding (only parts of the P1 have speech option)
Font is very small on P4 and P7 assessments – we are all having to peer at the screen.
P1 reading requires them to read or hear about 4 sections of a story before they answer questions – lots of memory rather than find the answer in the text.
Lots of words and names used in P1 assessments that are not decodable using Alphabetic Codes taught in P1.
P1 pupils need lots of support to get through the practice and 2 assessments. We do not have time to do 1 to 1 support so independent working through them digitally may not give correct measure against benchmarks.
‘I was demented this morning. Getting P1s set up. Broken headphones, notebooks with no audio! Eventually got them all working independently and keeping them happy. No idea how they have done. What a palaver! Glad I am retiring early after next session.’

She raises more issues about the validity and content of the tests, all of which have supposedly been tested and piloted extensively before their introduction, as well as the technical issues that teachers and schools are having to deal with. Since the introduction of such online testing was first mooted, many of these concerns had been raised by teachers and schools, but it would seem that little heed was paid to them.

Another class teacher pointed out yet another technical issue that surely could have been resolved before the tests went 'live'.
‘One of the problems we faced is that the usernames include the child’s middle names, so some of our kids are taking a long time to log in. One pupil has 5 middle names, time was up and he was still trying to log in.' 
Whilst another articulated a question many were asking,
‘How much is this costing? I have no jotters or whiteboard-pens, general basics to do my job …Ah, priorities. Hang them out to dry!’

It is clear that many local authorities are asking/telling their schools to administer the tests towards the end of the school year, i.e. May/June, which is a very busy time in schools anyway and does not allow teachers to use them in a properly diagnostic way, but some have taken a different approach.

‘In our small cluster, we have analysed the SNSAs our P7 pupils sat in October. Teachers used the results diagnostically to aid planning, but we have looked at what the trends for cluster mean for secondary. Many of the results haven’t changed judgements about achievement of a level but some clear trends have emerged, which we will address for next session.’

However, this has led some to question the validity of the 'standardisation' claimed for the tests by the government and its supplier. What is clear is that there is a range of approaches and experiences across Scotland, some of which call into question the validity of the outcomes produced by the testing software.

A DHT wrote,

‘Looked at P1 results with CT. Children are ranked Low, Medium or High. All exactly where CT put them at beginning of the week. A week of quality teaching time lost and stressed pupils and teachers … not to mention the cost of it all!’
which really does call into question the added value of these assessments over teachers' professional judgement. If they are not telling teachers or schools anything they do not already know about learners, what then is their purpose, and at what cost? This was reflected in the latest comment I have received from a teacher.

‘Have just attended the phase B SNSA training. All about the data. We were told that the Scot Gvmt will not have access to the data. It belongs to the school and their LA. We were told again it is NOT high stakes, but there to inform the teachers. However she then kept telling us that HMIe will ask SMT what are they doing about areas flagged up as low. Kept referring to how it will show how PEF interventions are closing the gap and raising attainment. We pointed out that SNSA is done at P1, 4, 7 and S3 only. Unless you have data before and after a PEF intervention how can you possibly say what the impact is from SNSAs? The reports/graphs were so busy I defy anyone to have the time to fully interrogate them for each pupil as we were being shown. It also does not produce block graphs for year groups less than ten pupils, which means that many small schools cannot get them. We also said we do not see how they can be standardised assessments if LAs can do them at whatever time of year they choose.’

On the last point, it would also seem that schools are administering the tests in a myriad of ways, and with varying levels of support for learners. All this calls into question the validity of the 'results' across schools, local authorities and further afield. Observing from the outside now, it would seem to me that the Government rhetoric about the tests 'not being high stakes' is being ignored by local authorities, who are making them, alongside the benchmarks, very much high stakes in how they judge schools. This is exactly the scenario that played out in Australia with the NAPLAN tests, in England with SATs, and in other countries that have gone down similar routes. In all these countries, the early talk was of the tests supporting 'teacher professional judgement', but they soon mutated into high-stakes accountability measures. Scotland is heading the same way.

Some of the tweets I received from teachers included the following selection:
‘Accountability. Pure and simple. In no way will this benefit our learners.’
‘If we can’t clearly decide the nature of the question it shouldn’t be used – a reading passage should have all the answers. Anyway the whole set-up is simply ScotGovt data trawling not promoting best practice.’
‘The maths question about how many Tuesdays in a particular calendar month made my heart sink. Far too difficult and not reflective of Early level,’
‘This is for P1!! Its not reflective of early level literacy curriculum. The hummingbird passage is beyond the expected usual level by the end of P1. That question in particular totally relies on children’s own prior knowledge of birds, there were no contextual clues.’
‘AND it was in the norming study completed in march when I know that HTs specifically said that that particular passage was not appropriate for P1, when asked for feedback re the assessment.’
‘And all so a gorgeous and very bright P1 could say, ‘I am not good, am I’ after trying really hard to work out the words in the ‘hummingbird’ passage. Well done the system – a curious and excited learner demoralised!’
‘I have just had a flash back to the Counting Rhymes in an African Village paper from 5-14 test bank. Is the purpose of spending all this money to help teachers know how chn are progressing? That will be a great help because how would teachers ever know otherwise??? ‘
‘Can parents ask for their child not to do this?’
As things stand, I have hundreds of responses to this request for information about the tests, and whilst I recognise this is anything but a scientific examination of SNSAs, I do think there is already enough for the profession and parents to be concerned about. Regarding that last question in a tweet, the tests are not compulsory or mandatory; the Government's own advice recognises this. However, some schools and local authorities are presenting them to parents as 'mandatory'. I would argue that, even were they designated as 'mandatory', parents would still have the right to withdraw their children. After all, they are their children, and if parents think the impacts of such testing are harmful to their wellbeing, then they should withdraw them.

Just like the tests themselves, my request for thoughts about them provides us with a snapshot in time, and quite early in the timeframe of their introduction. However, I think there are indications of significant issues that need to be addressed by the Scottish Government, local authorities and schools. I have summarised these as follows:
Assessments aren’t really assessing literacy and numeracy, just bits of the skills required to be literate and numerate
Tests not assessing the taught curriculum in Scotland, especially at Early Level
They don’t reflect the principles and practice of CfE
Technical problems within the tests themselves
Workload for teachers and schools, and time being swallowed up in their administration
Lack of, or poor, hardware and infrastructures in schools to administer tests
Lack of ‘standardisation’ in how they are being applied, used and supported – a very mixed approach across the country
Stresses for children, especially P1s, and staff
When and how tests are being delivered is being heavily dictated by LAs
Are the tests actually telling the teachers anything they don’t already know, and at what cost?
Headteachers telling parents tests are mandatory, or not even informing parents they are taking place
The validity of the tests, how they will be interpreted, and how they will be used by schools, LAs and Gov

Does categorising learners as 'Low', 'Medium' and 'High' promote setting, labelling and further disadvantage?
I think there are big questions for everyone in the Scottish system to ask and seek answers to. The cost of the introduction of the SNSAs is huge, running into millions of pounds, much of which is 'hidden' and is being absorbed by schools and local authorities. The big question is: is it worth it? The EIS said it would oppose the carrying out of the tests if they began to skew the curriculum and put undue extra pressure on its members. I would suggest both of those are already beginning to happen. Teachers and school leaders need to be asking, as suggested by Mr Swinney himself, whether they have more freedom to focus on learning and teaching with the introduction of the tests. In 2017 he said, 'When Scotland set out to reform our school curriculum, a critical question was how we break free of the top-down diktats that dominated Scottish school education.' He gave teachers and schools 'permission' to challenge anything that took them away from the core business of learning and teaching. Perhaps it is now time to make some of those challenges!
If you are in any doubt about whether it is worth it, just read this tweet again,
‘And all so a gorgeous and very bright P1 could say, ‘I am not good, am I’ after trying really hard to work out the words in the ‘hummingbird’ passage. Well done the system – a curious and excited learner demoralised!’
Is that really what we want for our very youngest learners? I hope not! Perhaps we are all being tested?






