The Western Journal

‘Meta Never Told Parents…’: Court Docs Claim Meta Failed to Prevent Abuse of Minors

A lawsuit filed against Meta alleges the company failed to warn about the risks its platform poses to young users and showed negligence regarding adults exploiting minors. Court documents claim Meta knowingly allowed harmful content related to eating disorders, suicide, and child sexual abuse to persist without effective removal. According to testimony from former Instagram safety chief Vaishnavi Jayakumar, Meta maintained a high tolerance for violations involving sex trafficking, suspending accounts only after multiple offenses, and did not adequately inform parents or authorities. The lawsuit accuses Meta of prioritizing user growth and engagement over safety, comparing its approach to marketing dangerous products to children despite knowing the mental health risks. Internal reports indicated Meta was aware that actions like making teen accounts private could drastically reduce unwanted adult interactions, but the company chose not to implement them immediately due to concerns about losing users. The case includes more than 1,800 plaintiffs and criticizes other social media companies for similar reckless growth strategies at the expense of children's well-being. Meta has not responded to these allegations.


Court documents filed in a lawsuit against Meta allege that the company failed to inform anyone of risks to young users on the platform and had a cavalier attitude about adults preying on minors.

The allegations add that Meta chose not to take steps to increase safety for young users, according to Time.

The brief, which contains the claims against Meta, was filed in the Northern District of California. It claimed Meta knew its products worsened teen mental health, and that content concerning eating disorders, suicide, and child sexual abuse was frequently detected but allegedly rarely taken down.

In the brief, Instagram’s former head of safety and well-being, Vaishnavi Jayakumar, said that Meta, which she joined in 2020, had a lax policy in terms of “trafficking of humans for sex.”

“You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” Jayakumar said, adding, “by any measure across the industry, [it was] a very, very high strike threshold.”

“Meta never told parents, the public, or the Districts that it doesn’t delete accounts that have engaged over fifteen times in sex trafficking,” the plaintiffs wrote in the court filing.

“Meta has designed social media products and platforms that it is aware are addictive to kids, and they’re aware that those addictions lead to a whole host of serious mental health issues,” Previn Warren, an attorney for the plaintiffs in the case, said.

“Like tobacco, this is a situation where there are dangerous products that were marketed to kids. They did it anyway, because more usage meant more profits for the company,” he said.

The lawsuit, which includes more than 1,800 plaintiffs, says the parent companies for Instagram, TikTok, Snapchat, and YouTube “relentlessly pursued a strategy of growth at all costs, recklessly ignoring the impact of their products on children’s mental and physical health.”

Although Time noted that Meta has made some changes, it said the brief includes testimony from Brian Boland, Meta’s former vice president of partnerships, who left in 2020 after 11 years.

“My feeling then and my feeling now is that they don’t meaningfully care about user safety,” he said. “It’s not something that they spend a lot of time on. It’s not something they think about. And I really think they don’t care.”

Meta did not respond to Time’s request for comment.

The lawsuit noted that five years before Instagram took teen accounts private to avoid adults contacting underage users, Meta considered what to do about that problem. In 2020, it determined it could lose 1.5 million monthly active teens per year, with an employee quoted as saying that taking action “is likely to lead to a potentially untenable problem with engagement and growth.”

“Meta knew that placing teens into a default-private setting would have eliminated 5.4 million unwanted interactions a day,” the lawsuit said.

The lawsuit added that although Meta’s artificial intelligence tools detected harmful content, such as self-harm, child sex abuse, or eating disorder content, most of the content was never taken down.
