Racist/Biased Facial Recognition Software?

Electronics · Samantha Severyn

Amazon has developed, and is selling to law enforcement agencies, a new facial recognition system called ‘Rekognition’ that we believe may pose a privacy threat to customers across the country. While advertised as a tool to enhance law enforcement activities, it may also violate civil and human rights. The algorithms behind similar facial recognition systems in recent years have shown evidence of serious bias problems. This technology has been used unfairly in the past, and who’s to say it won’t disproportionately target and surveil people of diverse ethnic backgrounds or civil society groups in the present?

  • According to the BBC, police in South Wales using facial recognition programs “made 2,685 ‘matches’ between May 2017 and March 2018—but 2,451 were false alarms.” 
  • In China, Apple has reportedly provided refunds to iPhone X customers who repeatedly found that the phone’s face recognition software can have trouble accurately recognizing people of color.
  • The American Civil Liberties Union (ACLU) referred to Rekognition as “. . . a powerful surveillance system readily available to violate rights and target communities of color.”
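The scale of the South Wales false-alarm figure above is easy to work out from the two numbers the BBC reported. A minimal sketch in Python (the variable names are our own, chosen for illustration):

```python
# Figures reported by the BBC for South Wales Police,
# May 2017 - March 2018.
total_matches = 2685   # total "matches" flagged by the system
false_alarms = 2451    # matches that turned out to be wrong

true_matches = total_matches - false_alarms  # 234 correct matches

false_alarm_rate = false_alarms / total_matches
print(f"False alarm rate: {false_alarm_rate:.1%}")  # ~91.3%
```

In other words, on these figures, more than nine out of every ten "matches" the system produced were wrong.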

Amazon shareholders have also expressed concern that this software may violate not only customers’ rights but stakeholders’ rights and fortunes as well.

What is your opinion? Do you think this technology can do more harm than good? Have you been a victim of racist or biased facial recognition? Tell us about it at (650) 762-8545 or fill out the form below!


