'I was moderating hundreds of horrific and traumatising videos'
Key Excerpts from Article on Website of BBC News
Posted: November 27th, 2024
https://www.bbc.com/news/articles/crr9q2jz7y0o
Beheadings, mass killings, child abuse, hate speech – all of it ends up in the inboxes of a global army of content moderators. You don’t often see or hear from them – but these are the people whose job it is to review and then, when necessary, delete content that either gets reported by other users, or is automatically flagged by tech tools. Moderators are often employed by third-party companies, but they work on content posted directly on to the big social networks including Instagram, TikTok and Facebook.

“If you take your phone and then go to TikTok, you will see a lot of activities, dancing, you know, happy things,” says Mojez, a former Nairobi-based moderator. “But in the background, I personally was moderating, in the hundreds, horrific and traumatising videos.

“I took it upon myself. Let my mental health take the punch so that general users can continue going about their activities on the platform.”

In 2020, Meta, then known as Facebook, agreed to pay a settlement of $52m (£40m) to moderators who had developed mental health issues. The legal action was initiated by a former moderator [who] described moderators as the “keepers of souls”, because of the amount of footage they see containing the final moments of people’s lives.

The ex-moderators I spoke to all used the word “trauma” in describing the impact the work had on them. One ... said he found it difficult to interact with his wife and children because of the child abuse he had witnessed. What came across, very powerfully, was the immense pride the moderators had in the roles they had played in protecting the world from online harm.
Note: Read more about the disturbing world of content moderation. For more along these lines, explore concise summaries of revealing news articles on Big Tech from reliable major media sources.