Home Office drops 'racist' algorithm from visa decisions
After a legal challenge to the algorithm, the UK will "redesign" its visa process.
www.bbc.co.uk
The Home Office has agreed to stop using a computer algorithm to help decide visa applications after allegations that it contained "entrenched racism".
The Joint Council for the Welfare of Immigrants (JCWI) and digital rights group Foxglove launched a legal challenge against the system.
Foxglove characterised it as "speedy boarding for white people".
The Home Office, however, said it did not accept that description.
"We have been reviewing how the visa application streaming tool operates and will be redesigning our processes to make them even more streamlined and secure," it said in a statement.
The controversy centred on an applicant's nationality being used as part of the automatic system.
Use of the controversial algorithm will be suspended on Friday 7 August, with a redesigned system expected to be in place by the autumn.
Foxglove said the system had "been used for years to process every visa application to the UK".
What did the algorithm do?
The Home Office characterised the algorithm as a "streamlining" system.
The system took some information provided by visa applicants and automatically processed it, giving each person a colour code based on a "traffic light" system - green, amber, or red.
One metric used was nationality - and Foxglove alleged that the Home Office kept a "secret list of suspect nationalities" which would automatically be given a red rating.
Those people were likely to be denied a visa, the group said.
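Based only on the allegations described above, the streaming rule can be sketched as a simple traffic-light function. The "suspect nationalities" list and the risk-flag input here are hypothetical placeholders - the Home Office has never published the real rules.

```python
# Illustrative sketch only: the actual streaming tool's rules are not public.
# SUSPECT_NATIONALITIES is a hypothetical stand-in for the alleged secret list.
SUSPECT_NATIONALITIES = {"CountryA", "CountryB"}

def stream_application(nationality: str, other_risk_flags: int) -> str:
    """Assign a traffic-light rating, as alleged in the legal challenge."""
    if nationality in SUSPECT_NATIONALITIES:
        return "red"    # listed nationalities red-flagged automatically, per Foxglove
    if other_risk_flags > 0:
        return "amber"  # some concerns: closer scrutiny
    return "green"      # minimal scrutiny

print(stream_application("CountryA", 0))  # red
print(stream_application("CountryC", 1))  # amber
print(stream_application("CountryC", 0))  # green
```

The point of contention was the first branch: nationality alone, before any individual facts, could determine the rating.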
People from red-flagged countries, it said, "received intensive scrutiny by Home Office officials, were approached with more scepticism, took longer to determine, and were much more likely to be refused".
The group argued this process amounted to racial discrimination, putting it in breach of the Equality Act.
There was another factor at play, which the JCWI and Foxglove called a "feedback loop".
Visa decision rates would be used to decide which countries were on the "suspect nationalities" list, they said.
But the algorithm used that list, and red-flagged applications were less likely to succeed. Those results were then used to reinforce the list.
The JCWI said it was "a vicious circle".
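The "vicious circle" described above can be shown with a toy simulation. All figures here are invented for illustration, not real Home Office data: two hypothetical countries start with similar refusal rates, but the one just over the threshold gets listed, scrutinised harder, and refused more - which feeds back into the list next round.

```python
# Toy model of the alleged feedback loop. All numbers are invented.
refusal_rate = {"X": 0.45, "Y": 0.30}  # hypothetical countries; X starts slightly higher

for year in range(5):
    # Step 1: countries with high past refusal rates join the "suspect" list...
    suspect_list = {c for c, r in refusal_rate.items() if r > 0.40}
    # Step 2: ...listed applications are red-flagged and refused more often,
    # which raises the very rate used in step 1 the following year.
    for c in suspect_list:
        refusal_rate[c] = min(1.0, refusal_rate[c] + 0.10)

print(refusal_rate)  # X's refusal rate climbs each year; Y's never moves
```

A small initial difference is amplified into a near-certain refusal, while an unlisted country is untouched - the self-reinforcing loop the JCWI and Foxglove complained of.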
'Institutionally racist'
"We're delighted the Home Office has seen sense and scrapped the streaming tool. Racist feedback loops meant that what should have been a fair migration process was, in practice, just speedy boarding for white people," said Cori Crider, founder of Foxglove.
Chai Patel, legal policy director of JCWI, said the Windrush scandal had shown the Home Office was "oblivious to the racist assumptions and systems it operates".
"This streaming tool took decades of institutionally racist practices, such as targeting particular nationalities for immigration raids, and turned them into software," he said.
The Home Office said it could not comment further while litigation was still ongoing.
Until the new system is in place, the streaming of visa applications will be based on information about the specific person - such as their previous travel - and nationality will not be taken into account.
---
So basically, they had an AI algorithm that looked at different risk factors and said, for example, that 'a young man from Nigeria asking for a student visa' is a high risk to break the rules, whereas 'an old lady from Russia' is low risk.
Note that visas are incredibly racist by design: why should people from Afghanistan need a visa to visit the UK when visitors from Australia don't?
Essentially, the AI scores you as 'green', 'yellow', or 'red'; little attention was paid to the greens and lots to the reds. Hence 99.5% of African visit visa applications initially scored “Green” were successful, whereas only 55% of applications given a “Red” flag were granted.
Public must be told how controversial visa streaming tool works, immigration inspector says - Free Movement
David Bolt says the Home Office must reassure the public that automated red flags on visa applications are not the result of biased algorithms.
www.freemovement.org.uk
Which is not so bad, considering that dozens of countries in the world refuse entry to Israelis outright; a 45% chance of being rejected when the AI spots that you are a likely future illegal immigrant/rapist/bomber is not too bad at all.