Microsoft Wants Artificial Intelligence Regulated Before It’s Too Late

The company's employees are backed by tech ethicists who are pushing for restrictions.

Controversial facial recognition technology needs to be regulated, according to some experts. (Getty Images)

Microsoft, Google and a New York University research institute are banding together to urge lawmakers to take a closer look at the dangers of unregulated artificial intelligence as a law enforcement tool.

The group, called AI Now, published a report on Thursday that underscores the inherent dangers of using artificial intelligence to surveil people — especially without their knowledge or permission.

“Mere public notice of their use is not sufficient, and there should be a high threshold for any consent, given the dangers of oppressive and continual mass surveillance,” the organization wrote, according to Vanity Fair.

“These tools are very suspect and based on faulty science,” Kate Crawford, one of the group’s co-founders and a researcher at Microsoft Research, told Bloomberg. “You cannot have black-box systems in core social services.”

Microsoft proposed laws that would require companies using facial recognition technology to submit to third-party audits, and that would require the government to obtain a court order before using the technology to track a person over time.

The report comes the same week that the ACLU publicized the Secret Service’s plans to test facial-recognition technology in and around the White House, Vanity Fair reported.

“How far-ranging does the Secret Service believe its monitoring efforts need to be to fulfill its mission?” concluded the ACLU. “Whatever the answer is today, there is good reason to be concerned about what that answer might be in the future—especially if unregulated face recognition and other technologies make it cheap and easy to extend the tentacles of its surveillance outwards.”
