In June, we wrote about the expanding use of cameras, particularly by police. Now growing largely unchecked is the use of cameras – particularly facial recognition cameras – in retail.

Last month, the privacy non-profit Fight for the Future launched its Ban Facial Recognition campaign and has been joined by more than 35 organizations demanding that American retailers stop using facial recognition to identify shoppers and workers in their stores. The campaign is designed to put pressure on companies currently using facial recognition as well as those that say they might use it in the future.

According to the campaign, retailers Albertsons, Macy’s and Apple are currently using the technology. The campaign also keeps a long list of retailers that say they’re considering it, and a far shorter list of those that have committed not to use it. You can find the complete list on the campaign’s website, www.BanFacialRecognition.com/stores.

Campaign supporters say that while retailers implement facial recognition to deter theft and identify shoplifters, the technology is simply an extension of the video surveillance citizens are already subjected to by police – one that is far less regulated and often undisclosed.

“A lot of people would probably be surprised to know how many retailers that they shop in on a regular basis are using this technology in a variety of ways to protect their profits and maximize their profits as well,” Caitlin Seeley George, a campaign director at Fight for the Future, told Recode.

As quiet as it might have been kept, stores’ use of facial recognition systems isn’t new. Last year, Reuters reported on the drugstore chain Rite Aid’s extensive use of the cameras in its stores, predominantly in largely lower-income, non-White neighborhoods.

According to the Reuters article, “the cameras matched facial images of customers entering a store to those of people Rite Aid previously observed engaging in potential criminal activity, causing an alert to be sent to security agents’ smartphones. Agents then reviewed the match for accuracy and could tell the customer to leave.”

In its defense, Rite Aid said customers were apprised of the technology’s use by signage in the shops and a policy posted on the company’s website. Reuters, however, said it found no notice of the surveillance in more than a third of the stores it visited that had facial recognition cameras.

Since Reuters’ report, the company has committed to discontinuing use of the technology.

RACIAL BIAS IN FACIAL RECOGNITION

If you’ve ever wondered how Facebook knew to tag you in a photo, facial recognition is the answer. It’s estimated that as of 2016, more than 117 million people had photos within a facial recognition network – the technology used to tag you. And just as Facebook tagged you without asking, your inclusion in these networks occurred without your consent.

Unfortunately for African Americans, facial recognition technologies have been shown to carry a significant racial bias. While the technology has a high rate of accuracy overall, that accuracy drops sharply when it comes to Black people, particularly dark-skinned Black people.

In the 2018 “Gender Shades” project, researchers tested three commercial facial recognition programs, including programs by IBM and Microsoft, on subjects grouped into four categories: darker-skinned females, darker-skinned males, lighter-skinned females, and lighter-skinned males. All three algorithms performed the worst on darker-skinned females, with error rates up to 34% higher than for lighter-skinned males. The results of the study were confirmed through an independent assessment by the National Institute of Standards and Technology.

REGULATIONS NEEDED

Among the biggest concerns of the Ban Facial Recognition campaign is that use of the technology remains largely unregulated, with most of the effort to rein in the technology focused on law enforcement and government.

While members of Congress have proposed several ideas for giving customers more protection against private companies’ use of facial recognition, there has yet to be significant regulation at the federal level.

“In the vast majority of cities and towns, there are no rules on when private companies can use surveillance tech, and when they can share the information with police, ICE [Immigration and Customs Enforcement], or even private ads,” warns Albert Fox Cahn, the executive director of the Surveillance Technology Oversight Project.

Since 1996, Bonita has served as Editor-in-Chief of The Community Voice newspaper. As the owner, she has guided the Wichita-based publication’s growth in reach across the state of Kansas and into...
