Automated Facial Recognition (AFR)

From Security Vision
Revision as of 12:20, 15 March 2020

Summary

Main deployments

United Kingdom

Mandate

Scotland Yard’s legal mandate for using live facial recognition states that the Human Rights Act recognises action in the interests of national security, public safety and the prevention of disorder or crime as legitimate aims.

Uses in 2020

Scotland Yard has this year deployed cameras to scan shoppers:

  • In Stratford, east London
  • At Oxford Circus in London[1]
    • The Oxford Circus deployment on 27 February scanned 8,600 faces to see if any matched a watchlist of more than 7,000 individuals. During the session, police wrongly stopped five people and correctly stopped one.
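The stop figures above can be turned into a simple hit rate. A minimal sketch (variable names are illustrative; this measures only the outcome of the stops the police actually made, not the overall accuracy of the face-matching algorithm):

```python
# Stop outcomes reported for the Oxford Circus deployment (27 February 2020):
# five people wrongly stopped, one correctly stopped.
wrong_stops = 5
correct_stops = 1
total_stops = wrong_stops + correct_stops

# Share of stops that were correct.
precision = correct_stops / total_stops
print(f"{precision:.1%}")  # 16.7%
```

This figure describes the six stops alone; it is not the same quantity as the 19% accuracy reported in the independent review cited below.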

South Wales police used the technology at a Slipknot concert at the Cardiff City football club stadium in January 2020, as well as to monitor football fans.[1]

Police forces in Hull, Leicestershire, Sheffield, Manchester, Birmingham, Bradford and Brighton have also experimented with the technology in recent years, according to research by the campaign group Big Brother Watch.[1]

Statistics

Prof Peter Fussey, an expert on surveillance from Essex University who conducted the only independent review of the Metropolitan police’s public trials on behalf of the force, has found it was verifiably accurate in just 19% of cases.[1] In January 2019, the Information Commissioner’s Office commissioned research into public perceptions about its use. Of those surveyed, 58% thought it was acceptable to be stopped erroneously by the police, while 30% thought it was unacceptable.[1]

Criticisms

The demands for the technology to be halted add to pressure from civil liberties organisations, including Amnesty International, which has described the Met’s rollout as “putting many human rights at risk, including the rights to privacy, non-discrimination, freedom of expression, association and peaceful assembly”.[1]

Legal Challenges

Last September, however, the high court refused a judicial review of South Wales police’s use of the technology. Judges ruled that although it amounted to interference with privacy rights, there was a lawful basis for it and the legal framework used by the police was proportionate. “The law is clearly on the back foot with invasive AFR and predictive policing technologies,” said Rebecca Hilsenrath, chief executive at the EHRC.[1]

In a report to the United Nations on civil and political rights in the UK, the EHRC said: “Evidence indicates many AFR algorithms disproportionately misidentify black people and women and therefore operate in a potentially discriminatory manner … Such technologies may replicate and magnify patterns of discrimination in policing and have a chilling effect on freedom of association and expression.”[1]

Artistic reactions

Artists Georgina Rowlands (left) and Anna Hart (right), of the Dazzle Club, which holds monthly walks in London to raise awareness of AFR technology and ‘rampant surveillance’. Their facepaint is to confuse the cameras. Photograph: Kelvin Chan/AP Photo[1]


References

  1. Booth, Robert. ‘Halt Public Use of Facial Recognition Tech, Says Equality Watchdog’. The Guardian, 12 March 2020, sec. UK news. https://www.theguardian.com/uk-news/2020/mar/12/halt-public-use-of-facial-recognition-tech-says-equality-watchdog.
