The government plans to use AI technology to verify the age of migrants who arrive in the UK, in an attempt to stop adults pretending to be children.
A report carried out by the government’s independent immigration inspector found cases where adult migrants had been classified as children – and cases where child migrants had been wrongly classified as adults.
But the asylum system makes it easier for children to apply to stay, and last year 56% of migrants who claimed to be children were either assessed to be adults or later admitted they were 18 or over.
BBC News understands the government plans to use existing technology that was created for online retailers that sell age-restricted products.
Border Security and Asylum Minister Angela Eagle said the AI is trained on millions of images of faces and was “able to produce an age estimate with a known degree of accuracy for an individual whose age is unknown or disputed”.
Facial Age Estimation offers a “potentially rapid and simple means” for testing judgements when assessing age, Eagle said.
Currently, immigration officials and social workers have to produce an assessment of the actual age of migrants claiming to be under 18, but both the Home Office and the independent immigration inspector have said accurately assessing an age is “challenging”.
David Bolt, the Independent Chief Inspector of Borders and Immigration, said that in the absence of a “foolproof test” of age it was “inevitable that some age assessments will be wrong”.
In a sample of 100 case files, inspectors found that of 38 people who had been initially assessed as an adult by the Home Office, 22 were later assessed by a local authority to be under 18.
Mr Bolt’s report was prepared before the government announced its plans for AI facial recognition.
The government said it would trial the technology ahead of an expected rollout in 2026. A tender for providers of the technology will be launched in August.
Similar technology is already used in the private sector by banks and online retailers to verify the ages of customers buying products such as knives.
The government is now encouraging the companies who have pioneered that technology to take part in a Home Office procurement process.
A senior Home Office source said they hoped to “leverage the power of the private sector” by working with companies who are “investing in this in the billions”.
The previous Conservative government introduced a plan to examine the bones and teeth of some migrants in order to verify their age.
But Labour ministers are thought to be sceptical about that plan because it relied on people being taken to separate facilities; they instead want a verification system that can be used at the border.
Mr Bolt’s report noted the safeguarding risk of a child incorrectly assessed to be an adult having to share a room with an adult stranger – as well as an adult incorrectly assessed as a child being placed with other children.
The inspector highlighted the case of a male small boat arrival who claimed to be 17, but whom the Home Office assessed to be 22 on the basis of physical characteristics such as his “deep voice”, “fully developed facial structure” and “thick black stubble”.
He criticised the Home Office for relying on “generic physical characteristics” and “failing to take into account the young person’s individual circumstances”.
The report also said some migrants were signing “Statement of Age” forms without properly understanding what they were signing after their long and often-arduous journeys. This led to ages later being disputed.
The report also criticised immigration officers’ “lack of curiosity” about age assessment decisions that were later overturned, noting that no lessons were learnt from those cases.
The government has accepted all eight of the recommendations Mr Bolt made in his report, including on increased training for immigration officers and improving communication.
The chief executive of the Refugee Council, Enver Solomon, said he was “not convinced” that using AI tools was the correct approach.
He said the inspector’s report highlighted concerns about children being put in unsafe situations and said “these technologies continue to raise serious questions about accuracy, ethics and fairness”.