Automated Facial Recognition (AFR)

From Security Vision
Revision as of 12:13, 15 March 2020 by Francesco.

Summary

Main deployments

United Kingdom

2020

Scotland Yard has deployed facial recognition cameras this year to scan shoppers

  • in Stratford, east London, and
  • at Oxford Circus in London[1]
    • The Oxford Circus deployment on 27 February scanned 8,600 faces to see if any matched a watchlist of more than 7,000 individuals. During the session, police wrongly stopped five people and correctly stopped one.

South Wales Police used the technology at a Slipknot concert at Cardiff City's football stadium in January 2020, as well as to monitor football fans[1].

Accuracy

Prof Peter Fussey, a surveillance expert at Essex University who conducted the only independent review of the Metropolitan Police's public trials on behalf of the force, found that the technology was verifiably accurate in just 19% of cases.[1]

Criticisms


Legal Challenges

In September 2019, the high court refused a judicial review of South Wales Police's use of the technology. Judges ruled that although it amounted to interference with privacy rights, there was a lawful basis for it and the legal framework used by the police was proportionate. “The law is clearly on the back foot with invasive AFR and predictive policing technologies,” said Rebecca Hilsenrath, chief executive of the EHRC[1].

In a report to the United Nations on civil and political rights in the UK, the EHRC said: “Evidence indicates many AFR algorithms disproportionately misidentify black people and women and therefore operate in a potentially discriminatory manner … Such technologies may replicate and magnify patterns of discrimination in policing and have a chilling effect on freedom of association and expression.”[1]

Artistic reactions

Artists Georgina Rowlands (left) and Anna Hart (right), of the Dazzle Club, which holds monthly walks in London to raise awareness of AFR technology and ‘rampant surveillance’. Their facepaint is to confuse the cameras. Photograph: Kelvin Chan/AP Photo

References

  1. Booth, Robert. ‘Halt Public Use of Facial Recognition Tech, Says Equality Watchdog’. The Guardian, 12 March 2020, sec. UK news. https://www.theguardian.com/uk-news/2020/mar/12/halt-public-use-of-facial-recognition-tech-says-equality-watchdog.
