By Dr Kanayo Dike-Oduah Situ, EdD, NPQH, MA, PGCE, BSc
Teacher, Examiner and Independent Researcher
Note: This post builds on my article published in Schools Week: Tech problems are putting the entire exam system at risk (linked below).
Every summer, I log on to mark scripts knowing that the grades I award will help shape the futures of young people. Like many examiners, I fit this work around a full-time teaching job, family responsibilities, and late nights. I do it because I believe in fair outcomes and the integrity of the assessment system.
But this year, I found myself battling not only the workload but also the technology: log-in failures, scripts that wouldn’t save, platforms that crashed mid-mark, annotations that disappeared. At times, it felt like the system was working against me.
I know I’m not alone. In an independent survey I launched last week, more than 70% of respondents so far have reported similar technical issues during this summer’s marking period.
One examiner described being “more demotivated to mark to the best of my ability whenever the system crashed.”
Another admitted they had to “rush the marking to meet the deadline and even reduced my effort when I had to mark the same script twice due to tech issues.”
A third reflected: “Because the process took longer, and a full day of marking was missed, I believe examiners would have been ‘playing catch up’ and trying to get through responses quicker… I was less inclined to take time reading each response in detail, as it was taking long enough to get through them.”
These problems aren’t just inconveniences. They affect concentration, consistency, and morale. They also create unfair pressure. Like many, I found myself having to “catch up” after losing time to technical failures. That inevitably raises the question: what impact might this have on the quality of marking, and ultimately, on the fairness of grades awarded?
The Numbers Behind the Experiences
So far, 56 examiners have responded to my survey. Here’s what they told me:
- Experience: 19% have been examiners for zero to two years, 30% for three to five years, 26% for six to ten years, and 26% for more than ten years. These frustrations affect new and experienced examiners alike.
- Technical Failures: 75% reported technical failures during marking this summer.
- Stress & Motivation: 75% said technical issues made them feel stressed and demotivated.
- Frustration: 80% reported feeling frustrated.
- Wellbeing: 26% agreed their overall wellbeing was affected.
- Support: Despite everything, 75% reported still feeling supported in completing their marking.
The picture that emerges is clear: examiners are deeply committed to their role, but poor systems risk undermining their efforts.
What Needs to Change
The qualitative feedback highlights a set of consistent requests from examiners:
- Robust, user-friendly software that works seamlessly across devices, including Macs and tablets.
- Clearer and more consistent training, so examiners are not left navigating confusing interfaces.
- Responsive technical support, available during the hours examiners actually work, not just office hours.
- Flexibility in deadlines when technical systems fail.
As one examiner put it: “Lack of being able to get onto the software made it so much longer and impossible to meet the tight turnaround of deadlines. Logging on and it then crashing deleted marks and affected quality.”
Why This Matters
As someone who has completed a Doctorate in Education at UCL with a focus on assessment, holds the National Professional Qualification in Headship (NPQH), and works as an examiner, I understand how much the system depends on trust. But trust is fragile.
If examiners themselves lose confidence in the systems they are required to use, how can we expect students, parents, and the wider public to trust the results?
Assessment is a public endeavour. It must be fair, transparent, and trusted. Examiners are the unseen backbone of that system. Their message is clear: the current technical infrastructure may not be fit for purpose.
Behind every exam result is an examiner. If we want to protect the futures of young people, we must listen to the people who mark their work.
Read More & Get Involved
- You can read my national piece in Schools Week: Tech problems are putting the entire exam system at risk (https://schoolsweek.co.uk/tech-problems-are-putting-the-entire-exam-system-at-risk/).
- If you’re an examiner who experienced technical issues while marking this summer, please share your experience through my ongoing anonymous survey: Examiners’ Voices Matter – Share Your Experience.
The more we can evidence these challenges, the stronger the case will be for change.

Disclaimer
This research is independent of any exam board. No exam boards are targeted, and no confidential materials are shared. The survey is anonymous and ongoing, and while the sample size is currently modest, further study is planned with assessment experts and in collaboration with exam boards. The findings cannot be considered representative of all examiners, but the qualitative accounts and lived experiences presented here are powerful and deserve attention.