Apple iCloud lawsuit alleges failure to stop ‘child porn’ and abuse material
West Virginia Attorney General JB McCuskey filed a lawsuit in Mason County Circuit Court against Apple, alleging that the company’s iCloud storage service was used to distribute child sexual abuse material and that Apple failed to adopt stronger detection measures. The complaint contends Apple prioritized user privacy over child safety by expanding end-to-end encryption and abandoning technologies that could identify illegal images; McCuskey’s office calls the case the first brought by a government agency targeting distribution of such material on Apple’s cloud platform. Apple disputes the claims, pointing to safeguards such as parental controls and its Communication Safety feature and arguing that it protects minors while preserving user privacy. The state maintains that Apple’s design choices created a public nuisance and violated consumer-protection laws, and it seeks damages and a court order requiring Apple to implement stronger safeguards. The lawsuit also references Apple’s canceled NeuralHash initiative and notes the company’s 2023 reports of suspected abuse material to the National Center for Missing and Exploited Children, contrasting that figure with calls for more robust action; Apple separately faces a pending federal class action, which it has asked a judge to dismiss. The case highlights the ongoing tension between privacy through encryption and the demands of law enforcement and child safety.
West Virginia Attorney General JB McCuskey filed a lawsuit on Thursday against Apple, accusing the company of allowing its iCloud storage service to be used to distribute child sexual abuse material and failing to adopt stronger detection measures.
The complaint, filed in Mason County Circuit Court, alleges Apple prioritized user privacy over child safety by expanding end-to-end encryption and abandoning technologies that could have identified illegal images. McCuskey’s office called the case the first brought by a government agency targeting alleged distribution of such material on Apple’s cloud platform.
The state attorney general’s office said in a statement that the suit “reveals that Apple, in its own internal communications, described itself as the ‘greatest platform for distributing child porn’ — yet took no meaningful action to stop it.”
McCuskey said in a statement accompanying the lawsuit’s filing that “[p]reserving the privacy of child predators is absolutely inexcusable,” adding that “it violates West Virginia law.”
Apple disputed the claims, saying it has built safeguards aimed at protecting minors while maintaining privacy and security for users. The company pointed to parental-control tools and its Communication Safety feature, which can automatically intervene when nudity is detected in messages, shared photos, AirDrop transfers, or FaceTime calls involving children.
“All of our industry-leading parental controls and features, like Communication Safety — which automatically intervenes on kids’ devices when nudity is detected in Messages, shared Photos, AirDrop and even live FaceTime calls — are designed with the safety, security, and privacy of our users at their core,” an Apple spokesman told the Washington Examiner. “We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.”
At the center of the lawsuit is Apple’s shift toward stronger encryption, which places stored data beyond the company’s ability to access and, critics argue, beyond the reach of law enforcement with a warrant. The case highlights a long-running debate between privacy advocates, who say encryption is essential to protect users, and government officials who contend it can impede criminal investigations.
The state also cites Apple’s canceled “NeuralHash” initiative, announced in 2021 as a way to detect known abuse material on devices before upload. Apple delayed and ultimately scrapped the program in 2022 after backlash from privacy and security researchers concerned about possible misuse or false identifications.
An Apple spokesman also pointed to the company’s communication safety requirements webpage, which notes that “[f]or children under 18 Communication Safety is turned on by default on iPhone, iPad, Mac, and Apple Watch with the latest software version.”
West Virginia’s complaint alleges Apple’s design choices created a public nuisance and violated state consumer-protection laws by failing to implement effective detection tools used by other technology companies. The lawsuit seeks statutory and punitive damages and a court order requiring Apple to adopt stronger safeguards.
“Since Apple has so far refused to police themselves and do the morally right thing, I am filing this lawsuit to demand Apple follow the law, report these images, and stop re-victimizing children by allowing these images to be stored and shared,” McCuskey said.
Notably, Apple reported 267 cases of suspected abuse material to the National Center for Missing and Exploited Children in 2023, far fewer than reports made by some rival platforms, according to the complaint. The company faces similar allegations in a pending federal class action, which it has asked a judge to dismiss.