Amazon’s Facial Recognition Technology Can Now Detect Fear in People

Posted on August 18, 2019 in Government, Spying and Surveillance

By Vandita | We Are Anonymous

(CD) — Privacy advocates are responding with alarm to Amazon’s claim this week that the controversial cloud-based facial recognition system the company markets to law enforcement agencies can now detect “fear” in the people it targets.

“Amazon is going to get someone killed by recklessly marketing this dangerous and invasive surveillance technology to governments,” warned Evan Greer, deputy director of the digital rights group Fight for the Future, in a statement Wednesday.


Amazon Web Services detailed new updates to its system—called Rekognition—in an announcement Monday:

With this release, we have further improved the accuracy of gender identification. In addition, we have improved accuracy for emotion detection (for all 7 emotions: 'Happy', 'Sad', 'Angry', 'Surprised', 'Disgusted', 'Calm', and 'Confused') and added a new emotion: 'Fear'. Lastly, we have improved age range estimation accuracy; you also get narrower age ranges across most age groups.
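For context on what such an emotion score looks like in practice, here is a minimal sketch of reading emotion predictions from a Rekognition DetectFaces-style response. A live call would use the boto3 client (`detect_faces` with `Attributes=["ALL"]`); the sample response below is hypothetical and only mimics the documented shape, so no AWS account is needed to follow along.

```python
# Sketch: extracting the highest-scored emotion from an Amazon Rekognition
# DetectFaces response. A real call would look roughly like:
#   client = boto3.client("rekognition")
#   resp = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
# Here we parse a hand-written sample dict instead of calling AWS.

def top_emotion(face_detail):
    """Return the (type, confidence) pair the service scored highest,
    or None if no emotions were returned."""
    emotions = face_detail.get("Emotions", [])
    if not emotions:
        return None
    best = max(emotions, key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Hypothetical sample response mimicking the documented FaceDetails shape.
sample = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 22.1},
                {"Type": "FEAR", "Confidence": 61.9},
                {"Type": "CONFUSED", "Confidence": 9.3},
            ]
        }
    ]
}

for face in sample["FaceDetails"]:
    print(top_emotion(face))  # -> ('FEAR', 61.9)
```

Note that the confidence values are the model's own scores, not ground truth about what the person feels; that gap is precisely what the critics quoted below object to.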

Pointing to research on the technology conducted by the ACLU and others, Fight for the Future’s Greer said that “facial recognition already automates and exacerbates police abuse, profiling, and discrimination.”

“Now Amazon is setting us on a path where armed government agents could make split-second judgments based on a flawed algorithm’s cold testimony. Innocent people could be detained, deported, or falsely imprisoned because a computer decided they looked afraid when being questioned by authorities,” she warned. “The dystopian surveillance state of our nightmares is being built in plain sight—by a profit-hungry corporation eager to cozy up to governments around the world.”

VICE reported that “despite Amazon’s bold claims, the efficacy of emotion recognition is in dispute. A recent study reviewing over 1,000 academic papers on emotion recognition found that the technique is deeply flawed—there just isn’t a strong enough correlation between facial expressions and actual human emotions, and common methods for training algorithms to spot emotions present a host of other problems.”

Amid mounting concerns over how police and other agencies may use and abuse facial recognition tools, Fight for the Future launched a national #BanFacialRecognition campaign last month. Highlighting that there are currently no nationwide standards for how agencies and officials can use the emerging technology, the group calls on federal lawmakers to ban the government from using it at all.

Fight for the Future reiterated its demand Wednesday in response to Amazon's latest claims. Although there are not yet any federal regulations for the technology, city councils—from San Francisco to Somerville, Massachusetts—have recently taken steps to outlaw government use of such systems.


Activists are especially concerned about the technology in the hands of federal agencies such as U.S. Immigration and Customs Enforcement (ICE) and Customs and Border Protection (CBP), whose implementation of the Trump administration's immigration policies has spurred condemnation from human rights advocates the world over.

Civil and human rights advocates have strongly urged Amazon—as well as other developers including Google and Microsoft—to refuse to sell facial recognition technology to governments in the United States and around the world, emphasizing concerns about safety, civil liberties, and public trust.

However, documents obtained last year by the Project on Government Oversight revealed that in the summer of 2018, Amazon pitched its Rekognition system to the Department of Homeland Security—which oversees ICE and CBP—over the objections of Amazon employees. More recently, the corporation has been targeted by protesters of the Trump administration's immigration agenda over Amazon Web Services' cloud contracts with ICE.

In a July report on Amazon’s role in the administration’s immigration policies, Al Jazeera explained that “U.S. authorities manage their immigration caseload with Palantir software that facilitates tracking down would-be deportees. Amazon Web Services hosts these databases, while Palantir provides the computer program to organize the data.”

“Amazon provides the technological backbone for the brutal deportation and detention machine that is already terrorizing immigrant communities,” Audrey Sasson, executive director of Jews For Racial and Economic Justice, told VICE Tuesday. “[A]nd now Amazon is giving ICE tools to use the terror the agency already inflicts to help agents round people up and put them in concentration camps.”

“Just as IBM collaborated with the Nazis, Amazon and Palantir are collaborating with ICE today,” added Sasson. “They’ve chosen which side of history they want to be on.”

Read more great articles at We Are Anonymous.


