How a computer algorithm caused a grading crisis in UK schools


Gavin Williamson, U.K. Education Secretary, arrives for a weekly meeting of cabinet ministers at 10 Downing Street in London, United Kingdom, on Tuesday, April 23, 2019.

Bloomberg | Getty Images

Britain is in the midst of a nationwide grading debacle after an automated algorithm downgraded the A-level results of nearly 40% of students who were unable to sit their exams because of the coronavirus pandemic.

To determine each student’s results, the United Kingdom decided to use an algorithm that took into account their mock exam results, as well as their school’s historical track record in the exams. Lawmakers said the software would give students a “more honest” result, after concluding that teachers could potentially try to inflate their students’ grades.
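To see why this kind of standardization can disadvantage individual students, consider a simplified toy sketch of the approach the article describes: rank a school's students by mock exam performance, then force that ranking into the grade distribution the school achieved historically. (This is an illustration only, not Ofqual's actual model; the function name, inputs, and allocation rule are assumptions made for the example.)

```python
def standardize_grades(mock_scores, historical_distribution):
    """Assign grades by ranking students on mock-exam scores, then
    fitting that ranking to the school's historical grade distribution.

    mock_scores: dict mapping student -> mock exam score (higher is better)
    historical_distribution: list of (grade, fraction) pairs, best grade
    first, with fractions summing to 1.0
    """
    # Rank students from strongest to weakest mock result.
    ranked = sorted(mock_scores, key=mock_scores.get, reverse=True)
    n = len(ranked)
    results = {}
    start = 0
    cumulative = 0.0
    for grade, fraction in historical_distribution:
        cumulative += fraction
        # Each grade band gets the slice of the ranking that the
        # school historically awarded at that grade.
        end = round(cumulative * n)
        for student in ranked[start:end]:
            results[student] = grade
        start = end
    # Any rounding leftovers receive the lowest listed grade.
    lowest_grade = historical_distribution[-1][0]
    for student in ranked[start:]:
        results[student] = lowest_grade
    return results

# A school that historically awarded 20% A, 40% B, 40% C:
grades = standardize_grades(
    {"Amy": 91, "Cara": 85, "Ben": 78, "Eve": 72, "Dan": 60},
    [("A", 0.2), ("B", 0.4), ("C", 0.4)],
)
```

In this toy example only one student can receive an A, no matter how strong the whole cohort is, because the school's past results cap the distribution. That is the structural effect critics pointed to: a high-achieving student at a school with a weak track record gets pulled down toward that record.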

But the model ended up favoring students from private schools and affluent areas, leaving high achievers at free, state-funded schools disproportionately affected. Many students lost their university places as a result of the downgraded exam results, and there have been protests in response.

There is no direct equivalent to A-level exams in the U.S., but these are the tests that students in the U.K. take at age 17-18, often to help them get into university, making them roughly comparable to the SATs. Some employers look at them when considering new applicants, and they are widely regarded as the most important exams in U.K. schools.

U.K. Education Secretary Gavin Williamson said he was “incredibly sorry” for the exam fiasco and said his main priority now is to ensure students get fair results.

The grades produced by the algorithm have been withdrawn in favor of teachers’ predicted grades, marking one of the biggest U-turns in U.K. education history. The Department for Education said it continues to work with exam regulator Ofqual to try to deliver fair results for young people at this unprecedented time.

But many students have already lost their places at their preferred university, and the admissions process is now in chaos. Students who applied to Oxford and Cambridge were told they might have to wait a year before starting their courses, even after successfully appealing their results. Oxford’s Worcester College said it would accept all students who held offers, regardless of their grades.

The opposition Labour Party on Thursday described the algorithm as “unfair,” claiming it may have breached anti-discrimination legislation and the laws requiring those standards to be enforced.

Catherine Breslin, a machine learning consultant who formerly worked for Amazon, said: “Algorithms can reflect and surface the unfairness and discrimination of the systems they automate.”

She added: “While Ofqual’s algorithm was clearly the wrong way to go, and has caused a great deal of distress up and down the country, perhaps this will lead to a re-evaluation of our exam system.”

Labour’s shadow communities and local government secretary, Steve Reed, said on Wednesday that the “fiasco” was far from over. “The government may have U-turned on Monday, but thousands of families are still dealing with the consequences,” Reed said.

“We need an iron guarantee from ministers that no student will miss out on their first choice due to government incompetence. And they must ensure that all students have their final grades by the end of the week,” he continued.

“It beggars belief that students are still in limbo, without clarity about their futures, because of a mess of the government’s making. Families deserve better than this.”

The opposition party now wants Education Secretary Gavin Williamson, who has faced calls to resign since the U-turn, to publish the legal advice he received. Williamson has yet to say whether he will resign or publish the legal advice.

William Tunstall-Pedoe, an entrepreneur who sold his artificial intelligence start-up to Amazon, told CNBC: “A grade in an exam is direct evidence of an individual’s achievement: what they were able to do on the day they were tested. A direct achievement of that individual that was entirely in their hands (and therefore where they were to blame if it wasn’t what they hoped for).”

“Any prediction of what someone would have achieved when they did not take the exam, regardless of how it is produced, is therefore very different. It may be necessary in order to sort out university places, but it is not the same as a grade.”

The Office for Statistics Regulation (OSR) said on Tuesday that it would conduct an evaluation of the approach taken to develop the statistical models for the 2020 exams.

“The review will seek to highlight learning from the challenges posed by these unusual circumstances,” the OSR said.

Teachers' predictions

This week, teachers’ predicted grades were also used to determine student results for their GCSEs, the exams most students in the U.K. take two years before their A-levels. Pass rates were up across the board, and just over a quarter of students were awarded grade 7 or above (equivalent to an A or an A* under the previous system).

The debacle raises questions about how much governments should rely on algorithms that have a major impact on citizens.

Britain and other countries are using more and more software to automate public services, with the hope of cutting costs and making processes more efficient.

Earlier this month, the British government promised to stop using an algorithm when considering foreign visa applications after receiving a legal complaint accusing it of discrimination.
