Image Cloaking Tool Thwarts Facial Recognition Programs

Published August 6, 2020 in Sci-Tech, Technology

Credit: University of Chicago

Researchers at the University of Chicago were not happy with the creeping erosion of privacy posed by facial recognition apps. So they did something about it.

They developed a program that helps individuals fend off programs that could appropriate their images without their permission and identify them in massive database pools.

Fawkes, named after the Guy Fawkes mask worn by the fictional anarchist in the “V for Vendetta” comics and film, makes subtle pixel-level alterations to images that, while invisible to the human eye, distort the image enough that it cannot be usefully exploited by online image scrapers.

“What we are doing is using the cloaked photo in essence like a Trojan Horse, to corrupt unauthorized models to learn the wrong thing about what makes you look like you and not someone else,” Fawkes co-creator Ben Zhao, a computer science professor at the University of Chicago, said. “Once the corruption happens, you are continuously protected no matter where you go or are seen.”
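To make the “pixel-level alterations” idea concrete, here is a minimal, hypothetical sketch of feature-space cloaking: nudge a photo’s embedding toward a decoy identity while capping how much any pixel may change, so the edit stays imperceptible. This is not the Fawkes code; the `feature_extractor` below is a random-weight stand-in for a real face-embedding model, and the `cloak` function, perturbation budget, and step counts are illustrative assumptions only.

```python
# Illustrative sketch only -- not the Fawkes implementation.
# Idea: add a small, bounded perturbation to a photo so that a feature
# extractor "sees" a decoy identity, while the pixels barely change.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder embedding model with random weights; a real cloaking tool
# would use a pretrained face-recognition feature extractor here.
feature_extractor = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 128),
).eval()

def cloak(image, decoy, budget=0.03, steps=100, lr=0.01):
    """Return a copy of `image` whose embedding is pulled toward `decoy`,
    with every pixel changed by at most `budget` (images scaled to [0, 1])."""
    with torch.no_grad():
        target_feat = feature_extractor(decoy)
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        cloaked = (image + delta).clamp(0.0, 1.0)
        # Move the cloaked photo's features toward the decoy's features.
        loss = torch.norm(feature_extractor(cloaked) - target_feat)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-budget, budget)  # keep the change visually subtle
    return (image + delta.detach()).clamp(0.0, 1.0)

# Toy usage with random tensors standing in for 64x64 RGB photos.
original = torch.rand(1, 3, 64, 64)
decoy_photo = torch.rand(1, 3, 64, 64)
protected = cloak(original, decoy_photo)
print("largest per-pixel change:", (protected - original).abs().max().item())
```

A model trained on photos cloaked in this general way would, in principle, learn features pointing at the decoy rather than the real person, which is the “Trojan Horse” effect Zhao describes.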

The advent of facial recognition technology carried with it the promise of great benefits for society. It helps us protect our data and unlock our phones, organizes our massive photo collections by matching names with faces, makes air travel more tolerable by cutting waits at ticket and baggage check-ins, and helps the visually impaired recognize facial reactions in social situations.

There are obvious advantages for law enforcement agencies, which use facial recognition to detect and catch bad actors, track transactions at ATMs, and find missing children.

It is also helping businesses crack down on theft, tracking student attendance in schools, and, in China, allowing customers to leave their credit cards behind and pay for meals with just a smile. The National Human Genome Research Institute is even using facial recognition, with a near 100-percent success rate, to identify symptoms of a rare disease that reveals itself in facial changes.

But concerns abound as well. With few federal regulations guiding the use of such an invasive technology, abuse is inevitable. The FBI has compiled a facial recognition database of more than 412 million images. Some of the people pictured, to be sure, are criminals. But not all. The notion of an increasingly surveilled population suggests to many the slow erosion of our privacy and, along with it, possibly our freedoms and rights. A society scrutinized under the watchful eye of Big Brother evokes totalitarian societies, imagined, as in “1984,” and real, as in North Korea.

Concerns have also been raised about the consequences of misidentification, especially in cases involving serious crimes, and about the potential for abuse when corrupt governments or rogue police agents have such tools at hand. And facial recognition programs are sometimes simply wrong: recent troubling studies have found that they have particular difficulty correctly identifying women of color.

Earlier this year, The New York Times reported on the controversial activity of Clearview AI, an app that claims to have compiled a database of more than 3 billion images from sources such as Facebook, YouTube, and Venmo. All of this was done without the permission of the subjects. Paired with augmented-reality eyeglasses, the Clearview AI app could let members of law enforcement and security agencies walk down the street and identify anyone they see, along with their names, addresses, and other vital information.


Credit: University of Chicago 

Clearview’s tool certainly can be used for good. Federal and state law enforcement officers, according to the Times, say the app has helped solve cases of murder, shoplifting, identity theft, credit card fraud, and child sexual exploitation.

