
Facial recognition: What is AFR? And why is it being challenged?

Image: stock image of facial recognition (image source: Maxiphoto)

The first legal challenge to the police use of automated facial recognition (AFR) is taking place.

A man from Cardiff is taking South Wales Police to court to try to get them to stop using the technology.

He says it is 'indiscriminate' because it scans everybody who passes the camera.

South Wales Police is just one of several forces trying out the technology.


What is automated facial recognition?

Automated facial recognition - or AFR - is a way of recognising people by using computers to scan their faces with a camera as they walk by.

Image caption: Police in South Wales have been using AFR since 2017 (image source: Matthew Horwood)

Police facial recognition cameras have been trialled at events such as football matches, festivals and parades.

High-definition cameras detect all the faces in a crowd, and software compares them with existing police photographs, such as photos from previous arrests.
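
As a very rough illustration of that detect-and-compare step, here is a minimal sketch in Python using the open-source face_recognition library. It is not the software any police force actually uses, and the image file names are made up for the example.

    # Minimal sketch of a detect-and-compare pipeline using the open-source
    # face_recognition library (pip install face_recognition).
    # Illustration only - not the system used by any police force.
    import face_recognition

    # Hypothetical images: one known "watchlist" photo and one crowd photo.
    known_image = face_recognition.load_image_file("watchlist_photo.jpg")
    crowd_image = face_recognition.load_image_file("crowd_photo.jpg")

    # Encode the known face as a 128-number vector.
    known_encoding = face_recognition.face_encodings(known_image)[0]

    # Detect every face in the crowd image and encode each one.
    locations = face_recognition.face_locations(crowd_image)
    encodings = face_recognition.face_encodings(crowd_image, locations)

    # Compare each detected face with the known face. A lower distance means
    # a closer match; 0.6 is the library's default decision threshold.
    for location, encoding in zip(locations, encodings):
        distance = face_recognition.face_distance([known_encoding], encoding)[0]
        status = "possible match" if distance < 0.6 else "no match"
        print(f"Face at {location}: {status} (distance {distance:.2f})")

A real deployment works on live video rather than single photos, but the basic idea is the same: detect faces, turn each one into a numerical code, and compare that code against a watchlist.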

Who is using AFR?

Image caption: The Met Police used AFR at the Notting Hill carnival in 2016 and 2017 (image source: Getty Images)

Some UK police forces have used AFR technology in public spaces since June 2015 - including South Wales, the Metropolitan Police and Leicestershire Police.

Liberty, an organisation which works to defend human rights, says South Wales Police have used AFR in public spaces on at least 22 occasions since 2017.

This includes the 2017 Champions League final in Cardiff.

London's police force - the Metropolitan Police - says it will test AFR 10 times before deciding whether to use the system. Two of its trials were at the Notting Hill carnival in 2016 and 2017.

Who is against AFR and why?

Image caption: A camera in San Francisco (image source: Justin Sullivan)

Some human rights groups have spoken out against all facial recognition software.

They say that it takes away the rights to privacy and freedom of expression - both of which are protected by the UK's Human Rights Act.

It's argued that the use of facial recognition surveillance means thousands of people are subjected to sensitive identity checks without their permission.

Critics also argue that it is not accurate. When South Wales Police used it at the Champions League final, 92% of the 2,470 potential matches made using AFR were wrong!

At the 2017 Notting Hill carnival, the system was wrong 98% of the time, falsely telling officers on 102 occasions it had spotted someone they suspected of a crime.
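
To put those percentages into rough numbers, here is a short calculation worked back from the figures quoted above; the resulting counts are approximate.

    # Rough arithmetic behind the accuracy figures quoted above.
    # Derived counts are approximate, worked back from reported percentages.

    # Champions League final: 2,470 potential matches, 92% wrong.
    total_matches = 2470
    false_matches = round(total_matches * 0.92)      # about 2,272 incorrect alerts
    correct_matches = total_matches - false_matches  # about 198 correct alerts
    print(f"Champions League final: ~{false_matches} wrong, ~{correct_matches} right")

    # Notting Hill 2017: 102 false alerts, wrong 98% of the time.
    false_alerts = 102
    total_alerts = round(false_alerts / 0.98)        # about 104 alerts in total
    correct_alerts = total_alerts - false_alerts     # about 2 correct alerts
    print(f"Notting Hill 2017: ~{total_alerts} alerts, ~{correct_alerts} right")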

This month, San Francisco in the US became the first city to ban the use of AFR technology by police.

What do the people who use AFR say?

Image caption: These modified sunglasses used in China are connected to an internal database of suspects (image source: AFP)

China is reported to have more than 176 million facial recognition cameras, which are used to target even minor offences. Police there have even started using glasses equipped with the technology.

Last month, AFR was used to find a man who had committed fraud in a crowd of 60,000 people at a concert in Nanchang, China.

In the UK, some of the police forces trialling the technology say that, despite failures, there have been some successes.

A spokesman for South Wales Police said: "Over 2,000 positive matches have been made using our 'identify' facial recognition technology with over 450 arrests."

The forces also stressed the need to test new technologies which could help catch criminals in the future.

On their website, the Metropolitan Police say: "We feel it's important to run the trial in real-life conditions to get accurate data and learn as much as possible from it."