Meta Tolerated Sex Traffickers With 17-Strike System, Lawsuit Claims
Newly unsealed court documents allege that Meta, the parent company of platforms like Instagram, had an inadequate approach to blocking accounts involved in sex trafficking. Vaishnavi Jayakumar, former head of Instagram safety, testified that Meta’s policy allowed accounts to remain active until they received 17 strikes for sex trafficking-related violations, a threshold considered very high compared to industry standards. The lawsuit claims Meta was aware that millions of adults contacted minors on its platforms and that those platforms exacerbated mental health issues among teenagers, with harmful content about suicide, eating disorders, and child sexual abuse rarely removed. Filed in California, the lawsuit accuses Meta and other tech companies, including those behind Snapchat, TikTok, and YouTube, of prioritizing profits over children’s safety and knowingly designing addictive features targeting youth despite internal research highlighting harm. Meta has denied these allegations, stating it has made notable efforts to protect teens over the past decade.
Newly unsealed court filings asserted that Meta, the parent company of Instagram, had a lackluster approach toward blocking accounts that appeared to participate in sex trafficking.
Vaishnavi Jayakumar, the former head of safety for Instagram, testified in the documents that when she joined Meta in 2020, there was a “17x” strike policy for accounts involved with “trafficking of humans for sex.”
“You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” Jayakumar testified, per a report from Time.
Jayakumar said that “by any measure across the industry” the strike threshold was “very, very high.”
The lawsuit claimed that Meta knew of harms on its platforms and downplayed risks to younger users anyway.
Meta was allegedly aware that millions of adults on its platforms were actively contacting minors and that those platforms worsened mental health issues in teenagers.
The company also allegedly knew that content about suicide, eating disorders, and child sexual abuse was detected but hardly ever removed.
The lawsuit, filed in the Northern District of California, claimed that Meta, as well as the companies behind Snapchat, TikTok, and YouTube, ignored harms to children as they pursued profits.
“Meta has designed social media products and platforms that it is aware are addictive to kids, and they’re aware that those addictions lead to a whole host of serious mental health issues,” Previn Warren, an attorney for the plaintiffs, said of the case, per Time.
“Like tobacco, this is a situation where there are dangerous products that were marketed to kids,” Warren continued.
“They did it anyway, because more usage meant more profits for the company.”
The plaintiffs also allege that Meta has been targeting younger users since 2017 despite internal research warning that its platforms were harmful to children.
Meta executives also allegedly shut down mitigation measures suggested by employees.
A representative for the company denied the validity of the lawsuit in a statement to Time.
“We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading picture,” the representative said.
“The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens — like introducing Teen Accounts with built-in protections and providing parents with controls to manage their teens’ experiences,” the statement continued.
“We’re proud of the progress we’ve made and we stand by our record.”