There are many problematic aspects of facial recognition technology that the general public might not be aware of, such as how and when your likeness is used, and whether you ever consented to that use.
That’s according to a recent investigation of the imminent dangers of facial recognition by Mashable, which concluded that the technology is “inherently creepy.” Many of the world’s biggest companies — like Amazon, which created Rekognition — are developing their own versions and allowing the government to use them.
“It sounds like something a villainous, tyrannical government would do in a science fiction story,” the site wrote, “but it’s happening in real life.”
And it’s happening against our will. Mashable discovered earlier this year that IBM was taking people’s photos from Flickr to train its own facial recognition software without telling the owners it was doing so. U.S. Customs and Border Protection has also been feeding pictures taken of passengers entering and leaving airports across the country into its own AI — again, without anyone’s permission.
But one of the biggest concerns with this ever-evolving tech is that it’s being used by law enforcement and submitted as evidence in criminal trials, despite the fact that it’s not always accurate. A study conducted by the American Civil Liberties Union in 2018 found that Rekognition “incorrectly matched members of Congress with other people who appeared in police mugshots,” according to Mashable. Most of those false matches involved members of Congress who are people of color — suggesting the tech could be inherently biased against nonwhite faces.
MIT backed up this finding, reporting in 2018 that it, too, found a disproportionate number of errors when facial-recognition technology was used on people of color — meaning that the tech, “as it exists now,” Mashable wrote, “reflects the biases of the people who create it.”
Editor’s Note: RealClearLife, a news and lifestyle publisher, is now a part of InsideHook.