News article by Sarah Marsh.
Published in The Guardian.
Call for more transparency on how such tools are used in public services as 20 councils stop using computer algorithms.
Councils are quietly scrapping the use of computer algorithms to help make decisions on benefit claims and other welfare issues, the Guardian has found, as critics call for more transparency on how such tools are being used in public services.
It comes as an expert warns that government bodies around the world have cancelled such programmes for reasons ranging from problems in the way the systems work to concerns about bias and other negative effects. Most systems are implemented without consultation with the public, but critics say this must change.
The use of artificial intelligence and automated decision-making came into sharp focus after an algorithm used by the exam regulator Ofqual downgraded almost 40% of the A-level grades assessed by teachers, culminating in a humiliating government U-turn and the system being scrapped.
The fiasco has prompted critics to call for more scrutiny and transparency about the algorithms being used to make decisions related to welfare, immigration, and asylum cases. [ . . . ]