A British fiasco born of an algorithm
It was a British fiasco, but Prime Minister Boris Johnson termed it a "mutant algorithm". The fiasco concerned the GCSE and A-level exam results of millions of pupils. Though both exams are run and managed by British authorities, the fallout was felt by many thousands around the world, including in Bangladesh. According to the Dhaka Tribune, about 8,000 Bangladeshi youngsters received their A-level grades this year, and a similar number got International GCSE results. As the Covid-19 pandemic made holding any examination impossible this year, results were instead generated by a mathematical procedure, an algorithm, which caused a national scandal. Hence PM Johnson, after almost two weeks of silence, told pupils at a school: "I'm afraid your grades were almost derailed by a mutant algorithm and I know how stressful that must have been."
Initially, an algorithm was used to determine A-level grades for about 700,000 students this year, but it was scrapped after a nationwide outcry once serious problems were detected. Government ministers at first defended the grades produced by the algorithm, calling it a world-class procedure. But following widespread anger over the major flaws in the algorithm-based grading, the government made a U-turn and decided to use teachers' predicted grades instead. The GCSE results were then delayed so that algorithm-based grades could be replaced in the wake of the A-level fiasco. This sudden, last-minute change in the GCSE results of more than four million school-leavers meant more generous grading, which has been described as grade inflation.
In England, the official exam regulator, Ofqual, is responsible for awarding grades, and this year it had asked teachers to supply, for each pupil in every subject, an estimated grade as well as a ranking compared with every other pupil at the school within that same estimated grade. These were put through an algorithm, which also factored in the school's performance in each subject over the previous three years. The idea was that this year's grades, even without exams, would be consistent with how schools had done in the past. Ofqual said this was a more accurate way of awarding grades than simply relying on teachers' assessments; its rationale was that teachers would likely be generous in assigning estimated marks, which might lead to a much higher number of pupils getting the top grades.
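In broad outline, the standardisation described above can be sketched in a few lines of code. The following is a toy illustration only: the function name, grade bands and figures are invented here, and Ofqual's actual model was far more elaborate. It assigns grades to a teacher-ranked cohort so that the grade distribution matches the school's historical one, which is precisely why a strong pupil at a historically weak school could be capped at a low grade.

```python
# Toy model of rank-based grade standardisation (illustrative only).
# Names, grade bands and proportions are invented for this sketch;
# Ofqual's real model was considerably more complex.

def standardise(rankings, historical_distribution):
    """Assign grades to a ranked cohort so that the share of each grade
    matches the school's historical distribution.

    rankings: list of pupil names, best first (teachers' rank order)
    historical_distribution: dict mapping grade -> fraction of pupils,
        ordered best grade first, e.g. {"A": 0.2, "B": 0.5, "C": 0.3}
    """
    n = len(rankings)
    grades = {}
    i = 0
    for grade, share in historical_distribution.items():
        count = round(share * n)  # how many pupils this grade "allows"
        for pupil in rankings[i:i + count]:
            grades[pupil] = grade
        i += count
    # Any pupils left over from rounding receive the lowest grade
    lowest = list(historical_distribution)[-1]
    for pupil in rankings[i:]:
        grades[pupil] = lowest
    return grades

# A top-ranked pupil at a school that historically awarded no A grades
# is capped at a B, regardless of the teacher's estimate:
cohort = ["Ayesha", "Ben", "Chloe", "Dan"]    # teachers' rank order
history = {"B": 0.25, "C": 0.5, "D": 0.25}    # no A grades in past years
print(standardise(cohort, history))
# → {'Ayesha': 'B', 'Ben': 'C', 'Chloe': 'C', 'Dan': 'D'}
```

Notice that the teachers' estimated grades barely matter in this sketch except to order pupils: the school's past record sets the ceiling for the whole cohort, which is the flaw critics seized on.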
When the A-level grades were announced, nearly 40 percent of students received lower grades than their teachers had assessed. More shockingly, the downgrading hit state schools much harder than privately run schools. Because of the weight given to past school performance, a bright student from an underperforming school was likely to have their results downgraded through no fault of their own. Likewise, a school that was rapidly improving would not have seen that progress reflected in its results. As private schools are selective and better funded, in most years they perform well in exam results; an algorithm based on past performance therefore puts their students at an advantage over their state-educated counterparts.
The English fiasco happened within two weeks of the Scottish experience, where the algorithm-based results for their higher qualification, comparable to the A-level, were overturned by the government as soon as the fault was detected. But the government in London, responsible for exam results in England, Wales and Northern Ireland, seemed reluctant to learn lessons from Scotland and insisted that its algorithm was robust. Prime Minister Johnson was on summer holiday, and his silence caused widespread anger. One tabloid not known for political journalism splashed a single-word headline, "Missing", alongside a manipulated caricature of Mr Johnson, asking its readers, "Have you seen him?" The fallout from the scandal continues: the National Education Union (NEU) called Johnson's "mutant" algorithm comment "brazen", and accused him of trying to "idly shrug away a disaster that his own government created."
The results fiasco also caused considerable logistical problems for universities. Some students who had lost out on their first-choice course and university because of lower grades rushed back, leaving many universities oversubscribed. This forced the government to lift its cap on the number of students each institution can admit. But admitting more students means tackling other challenges such as capacity, staffing and facilities. Though the cap and universities' advance course offers have no direct impact on international students, including those from Bangladesh, the grading fiasco had unsettling effects on many Bangladeshi families. Many of our friends and relatives made their children's results known only after the revised grades were announced, for the obvious reason that the initial results were not what they had expected.
The fiasco raises questions about the oversight of algorithms used at all levels of society, from very basic ones to complex systems that use artificial intelligence. Tech giants like Facebook, Twitter, and Google rely on algorithms, and whatever we see in our newsfeeds on social media platforms is chosen by such mathematical tools. The exam results produced by the algorithm left everyone unhappy, and the Office for Statistics Regulation (OSR) now says it will conduct an urgent review of the approach taken by Ofqual. The algorithm fiasco also exposed the sense of powerlessness felt by students disappointed with their results. Many experts now want to find a way back to human judgment, instead of letting the computer decide such crucial things for us. One may wonder whether that desire will extend to other things too.
Kamal Ahmed is a freelance journalist based in London, UK.