Something you might imagine confined to the pages of science fiction books and films, Facial Recognition Software – also referred to as ‘Live Facial Recognition’ – is technology currently being trialled around the UK as a method of crime detection and prevention. It is used similarly to CCTV, except that the cameras can identify a person from digital imaging by analysing the faces they detect.
How does it work?
The Metropolitan Police describe how the Live Facial Recognition Software “measures the structure of each face, including distance between eyes, nose, mouth and jaw to create facial data”. This information is then streamed directly to the Live Facial Recognition System database, which contains a “Watch List” comprised of people who are currently wanted by the Police.
Once the system has detected a face, it searches it against the Watch List for possible matches. If it identifies a match, an alert is sent to a nearby police officer. It is then up to the officer to compare the camera image with the Watch List image before deciding whether to stop the identified person.
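The matching step described above can be sketched in code. This is purely an illustrative sketch, not the Met's actual system: the feature vectors, the distance measure, and the threshold value are all hypothetical, standing in for whatever proprietary "facial data" the real software computes.

```python
import math

SIMILARITY_THRESHOLD = 0.6  # hypothetical tuning parameter, not a real system value


def euclidean_distance(a, b):
    """Distance between two facial-feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def match_against_watch_list(face_vector, watch_list):
    """Return the closest Watch List entry if it falls under the threshold,
    otherwise None (mirroring the stated policy that non-matching
    recordings are discarded)."""
    best_name, best_distance = None, float("inf")
    for name, listed_vector in watch_list.items():
        d = euclidean_distance(face_vector, listed_vector)
        if d < best_distance:
            best_name, best_distance = name, d
    if best_distance <= SIMILARITY_THRESHOLD:
        return best_name  # an alert would go to a nearby officer for a human check
    return None


# Example: one hypothetical Watch List entry, two passers-by
watch_list = {"wanted_person": [0.32, 0.51, 0.47, 0.60]}
print(match_against_watch_list([0.30, 0.50, 0.48, 0.61], watch_list))  # close match
print(match_against_watch_list([0.90, 0.10, 0.75, 0.05], watch_list))  # no match
```

Note that the final decision rests on a human: the code only raises the alert, just as the real system leaves the stop decision to the officer.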
Where has Facial Recognition Software been used?
So far, three UK police forces have used Facial Recognition Software since June 2015:
• London Metropolitan Police
• Leicestershire Police
• South Wales Police
How does this affect me?
The Metropolitan Police state that the Live Facial Recognition System “will only keep faces matching the watch list”, with all other recordings being deleted immediately.
They also state that you can refuse to be scanned, as “it’s not an offence or considered ‘obstruction’ to actively avoid being scanned” – which brings into question the effectiveness of the Software.
Controversy
Despite being in its infancy, Facial Recognition Software has already been met with widespread criticism. Contrary to the notion that you are able to decline a scan, there are reports that a man was fined £90 by plain-clothed police after he refused to be scanned by the Software, with several others being stopped after “covering their faces or pulling up hoods”.
Trialling the Software around London has already cost £222,000, which some critics have argued is both a waste of money and a violation of privacy. Big Brother Watch, an organisation that campaigns against state surveillance, states in its ‘Face Off Report’ that the Software threatens the public right “to go about your daily activity undisturbed by state authorities”, which is set out in Article 10 of the Human Rights Act 1998 and provides:
Freedom of expression
1. Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This Article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises.
2. The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.
The Software has also been challenged by human rights groups such as Liberty, which believes the technology to be “dangerously intrusive and discriminatory” and that it does not belong in a “free, rights-respecting democracy”.
The Watch List created by the police is also seen as disproportionate, according to Liberty, as it is comprised of those who have come into contact with the police, “including thousands of innocent people” – the police can also use pictures taken from social media with the Software.
Problems with the Software
The effectiveness of the technology also poses problems for those from ethnic minority backgrounds – according to the BBC, university research and police documents have revealed that the system is “trained predominantly using white faces”. The Chief Constable of Durham Police stated in a 2014 meeting that “ethnicity can have an impact on search accuracy”.
The Software is also seen as extremely inaccurate, reinforced by reports that trials carried out in London over a two-year period resulted in a 96% false positive rate, where the Software wrongly alerted the police to a match between a passing person and the Watch List.
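The reported figure can be read as follows: of all the alerts the system raised, 96% turned out to be wrong. A small worked example makes the arithmetic concrete – the counts below are hypothetical, chosen only to illustrate what a 96% false positive rate among alerts means in practice.

```python
# Hypothetical counts for illustration; not figures from the London trials.
total_alerts = 50          # alerts raised by the system
correct_matches = 2        # alerts that were genuine Watch List hits
false_positives = total_alerts - correct_matches

# Proportion of alerts that were wrong
false_positive_rate = false_positives / total_alerts
print(f"{false_positive_rate:.0%} of alerts were wrong")  # 96% of alerts were wrong
```

In other words, under these illustrative numbers, for every genuine match the system also flagged 24 innocent passers-by.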
Ed Bridges & South Wales Police
Ed Bridges, an office worker from Cardiff, is the first person to take legal action against the police over their use of Facial Recognition Software. According to The Guardian, Mr. Bridges said he was left feeling distressed after being subjected to the scanner whilst on his lunch break.
Mr. Bridges is campaigning against use of the Software, believing it to have “profound consequences for privacy and data protection rights” – more information about your data protection rights can be found here.
The claim is being supported by Liberty, which described the Software as “a mockery of our rights to privacy”. The force itself, however, has argued that the use of the Software does not infringe any data protection rights, as it is used in the same way as photographing a person in public.
Dan Squires QC, acting as Mr. Bridges’ representative, submitted to the Court that the collection of the “biometric data” used by the Software is similar to taking the fingerprints or DNA of those it scans without their consent – yet while there are laws governing fingerprint and DNA data, none apply to facial data.
The police are essentially dealing with a legal grey area – Dr. Purshouse of the UEA School of Law describes the Software as “operating in a legal vacuum” as there is “currently no legal framework specifically regulating police use of facial recognition technology”.
Future of facial recognition and privacy
The law has yet to catch up with the technology and its implications. Whilst existing laws can be said to cover facial recognition – for instance Article 8 of the Human Rights Act 1998 and the right to privacy – the clue is in the date the legislation was passed (1998). It is far outdated, and that is where the problem lies: laws drafted decades ago must now be stretched to fit today’s technology and its use. The Court will determine how the law applies to the facts of the current case. Time will tell where the Court draws its dividing line, whether it finds a breach, and whether it will quash any fine and award compensation for the misuse of facial recognition software.