Facebook keeps researching its own harms — and burying the findings
Key Excerpts from an Article on the Washington Post Website
Posted: September 27th, 2021
https://www.washingtonpost.com/technology/2021/09/16/faceboo...
Facebook knew that teen girls on Instagram reported in large numbers that the app was hurting their body image and mental health. It knew that its content moderation systems suffered from an indefensible double standard in which celebrities were treated far differently than the average user. It knew that a 2018 change to its news feed software, intended to promote “meaningful interactions,” ended up promoting outrageous and divisive political content.

Facebook knew all of those things because they were findings from its own internal research teams. But it didn’t tell anyone. In some cases, its executives even made public statements at odds with the findings. The world’s largest social network employs teams of people to study its own ugly underbelly, only to ignore, downplay and suppress the results of their research when it proves awkward or troubling.

A pattern has emerged in which findings that implicate core Facebook features or systems, or which would require costly or politically dicey interventions, are reportedly brushed aside by top executives, and come out only when leaked to the media by frustrated employees or former employees. For instance, the New York Times reported in 2018 that Facebook’s security team had uncovered evidence of Russian interference ahead of the 2016 U.S. election, but that Chief Operating Officer Sheryl Sandberg and Vice President of Global Public Policy Joel Kaplan had opted to keep it secret for fear of the political fallout.
Note: For more along these lines, see concise summaries of deeply revealing news articles on corporate corruption from reliable major media sources.