Mark Zuckerberg Knew Facebook Was Harmful to Children and Didn't Do Anything About It, Says Report. But Are We Even Surprised?
Mark Zuckerberg is under fire after a scathing Bloomberg report revealed that employees of Meta Platforms and ByteDance were aware of their platforms’ dangers to young children—and did nothing about it. The report cited an unredacted version of a court filing that made headlines over the weekend. The filing revealed that key engineers and executives, including Meta CEO Mark Zuckerberg himself, knew of the harm social media was doing to young users yet chose to disregard the information. The lawsuit over social media addiction even claimed that Zuckerberg had been personally warned of the ways children and teens were at risk on his platforms. Nothing was done, or so the report claims. And we aren’t that surprised.
The Oakland case, which has become a PR nightmare for the platforms involved, has compiled countless complaints across the U.S. regarding American youths who’ve suffered anxiety, depression, eating disorders, and sleeplessness due to Facebook, Instagram, TikTok, Snapchat, and YouTube. The case has even blamed the platforms for social-media-related suicides, and public school districts have alleged that social media has impeded education. The filing even claims that Meta defunded its mental health team, which Meta vehemently refutes. As for TikTok, the internal documents claim that ByteDance (TikTok) was aware that young TikTok users were being lured to try dangerous stunts back when these sorts of videos went viral.
Despite all of these claims, social media companies in the U.S. are largely shielded by Section 230 of the Communications Decency Act of 1996 (codified in Title 47 of the United States Code), which provides that online platforms are not to be treated as the publishers of content posted by their users, and are therefore generally not liable for it.
“These never-before-seen documents show that social media companies treat the crisis in youth mental health as a public relations issue rather than an urgent societal problem brought on by their products,” said Lexi Hazam, Previn Warren, and Chris Seeger, the three plaintiffs’ lawyers leading the lawsuit. “This includes burying internal research documenting these harms, blocking safety measures because they decrease ‘engagement,’ and defunding teams focused on protecting youth mental health.”
As concerning as the report is, the truth of the matter is that this isn’t surprising behavior from Meta or ByteDance. For years, Instagram has been criticized for fueling body dysmorphia among young users and presenting a false sense of reality. TikTok has encouraged a dangerous level of irreverence for the sake of popularity. And based on Facebook’s track record, this new filing seems to fit the status quo. In 2016, Facebook saw a massive proliferation of misinformation on its platform, which critics argue contributed to the election of Donald Trump as president of the U.S. Data firm Cambridge Analytica notoriously helped bring Trump to power through the use of “propaganda warfare” via Facebook ads. In 2021, an ISD Global study found that Facebook failed to curb COVID-19 misinformation coming from a “pseudo-science” anti-vaxxer group. And in the same year, just months after the January 6 Capitol riots, Facebook came under fire once more for failing to stop white supremacists from organizing on its platform.
Don’t get us wrong. Social media platforms have changed society as we know it, and in some ways, for the better. They gave people a place to connect, but lax moderation has turned them into a modern-day no man’s land, giving dangerous people more audacity to do dangerous things. We’ve seen some measures being imposed, like TikTok’s screen-time limit for users under 18. Whether that’s enough to curb the dangers of social media remains to be seen. After all, based on Meta’s history, what are a few lawsuits compared to billions of dollars in revenue?
To the disappointment of the many who once believed in Facebook’s mission, Meta has made it clear that its metaverse is more of a priority than keeping Facebook and its other platforms safe—if safety ever was a priority in the first place.