West Virginia AG sues Apple for allegedly failing to stop child sexual abuse material on iCloud
“We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids,” an Apple spokesperson said
West Virginia Attorney General JB McCuskey, a Republican, announced Thursday that he filed a lawsuit against Apple alleging the tech giant has failed to stop child sexual abuse material (CSAM) on its iCloud service and iOS devices.
McCuskey accused Apple of prioritizing its privacy branding and business interests over child safety, in contrast to other big tech companies that have been more proactive, deploying detection systems to combat such material, CNBC reported.
In 2021, Apple tested its own CSAM-detection features, which could automatically find and remove images of child exploitation and report those uploaded to iCloud in the U.S. to the National Center for Missing & Exploited Children. Apple withdrew the plans after privacy advocates warned that the technology could create a backdoor for government surveillance and be exploited to censor other kinds of content on iOS devices.
The National Society for the Prevention of Cruelty to Children, a child-protection watchdog based in the United Kingdom, said in 2024 that Apple had failed to adequately monitor, tabulate, and report to authorities the CSAM on its products.
Also in 2024, thousands of child sexual-abuse survivors sued Apple, alleging that the company should never have abandoned its earlier CSAM-detection plans and that, by allowing such material to proliferate online, it had caused survivors to relive their trauma.
A win for West Virginia could force Apple to make design or data-security changes. The state is seeking statutory and punitive damages, as well as injunctive relief requiring the company to implement effective CSAM detection.
A spokesperson for Apple told CNBC that “protecting the safety and privacy of our users, especially children, is central to what we do.”
The spokesperson pointed to Apple's parental controls and features such as Communication Safety, which “automatically intervenes on kids’ devices when nudity is detected in Messages, shared Photos, AirDrop and even live FaceTime calls,” as evidence of the company's commitment to providing “safety, security, and privacy” to users.
“We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids,” the spokesperson added.