The 5 Biggest Criticisms Leveled Against Facebook in a New Civil Rights Audit

No surprise, the site is "far too reluctant to adopt strong rules to limit misinformation"

Facebook founder Mark Zuckerberg

Facebook's Civil Rights Audit is out, and it's bad news for Mark Zuckerberg's company.

By Kirk Miller

Facebook’s continued lack of urgency surrounding voter suppression, misinformation and algorithmic bias could have “dangerous (and life threatening) real-world consequences,” according to a new audit of the company.

Facebook’s Civil Rights Audit, released today, July 8, is an independent survey of the company led by Laura W. Murphy, a civil rights lawyer and former American Civil Liberties Union executive, and a team from the law firm Relman Colfax. It’s actually the third and final report after two preliminary releases; in all, the two-year audit encompassed interviews with “100 civil rights organizations, hundreds of advocates and several members of Congress.”

The report comes at a time when over 900 companies have threatened or started a boycott of the social media site — and when a Zoom meeting this week between civil rights groups and Facebook’s top executives Mark Zuckerberg and Sheryl Sandberg was considered “very disappointing.”

The auditors note that their report is not intended as a comparison with Facebook’s tech industry peers (“In some areas [Facebook] may outperform peers with respect to civil rights, and in other areas, it may not”) and that it concentrates on the “core” Facebook app only as it relates to the U.S. The report also wasn’t limited to racial justice; it touched on unfair treatment and discrimination in education, employment, housing, credit, voting, public accommodations and more.

While the audit does suggest Facebook made some improvements over the course of the two-year study, it also notes that any progress is merely a “start, not a destination.”

Five startling conclusions:

The report did laud Facebook’s census interference policy, its handling of voting misinformation related to COVID-19 and the site’s increased capacity to combat coordinated inauthentic behavior (e.g., foreign actors trying to sway an election through fake or banned accounts). While the report featured an extensive overview of Facebook’s commitment to weeding out algorithmic and machine-learning biases, those programs were either too new or the auditors lacked the access needed to draw conclusions; they did suggest the company approach these issues with “a greater sense of urgency.”

As Sandberg, Facebook’s COO, noted in a blog post today, “What has become increasingly clear is that we have a long way to go. As hard as it has been to have our shortcomings exposed by experts, it has undoubtedly been a really important process for our company.”
