Lawsuit Suggests TikTok Is the Latest App With a Content Moderation Problem

This is a growing problem in the industry

A TikTok logo displayed on a smartphone screen. (Nikolas Kokovlis/NurPhoto via Getty Images)

Sometimes, in the world of social media, what makes an app enticing is the same thing that can make it toxic. Do you like being able to upload fun images or video to Instagram or TikTok? Sure — but that same functionality might allow someone else to upload horrific imagery or explicit films to the same place.

In theory, moderation policies should block harmful content before the app’s users ever see it — but reviewing that content can still be a traumatizing experience for the people doing the moderating. In 2019, the psychological toll of working as a moderator for Facebook came to light. And now, something similar seems to be happening with TikTok.

A recent Washington Post article reports that a TikTok content moderator has sued TikTok and ByteDance, its parent company. The lawsuit argues that the moderator and her colleagues spent up to 12 hours each day reviewing violent and traumatizing videos, including “genocide in Myanmar, mass shootings, children being raped, and animals being mutilated.”

The lawsuit seeks compensation for moderators who have had to review this and similar material, and calls on TikTok to bolster the mental health resources it makes available to them.

TikTok responded with a statement noting, in part, that “we strive to promote a caring working environment for our employees and our contractors.” Whether this lawsuit will change the way social networks moderate content remains to be seen, but it points to a growing issue.
