West Virginia sues Apple, saying it doesn't do enough to report child porn on iCloud
West Virginia took a major step in the fight against child pornography Thursday when the state filed a first-of-its-kind lawsuit against Apple for allegedly failing to detect and report the sharing of this material on its systems and devices.
West Virginia Attorney General JB McCuskey said thousands of kids are at great risk of being sexually exploited, and that the exploitation stems from the images Apple allows people to store and distribute on its iCloud system, from which the company profits.
"They have made a conscious decision to not filter these images, not report them to the proper authorities, and to charge the people who are trying to keep and use these images for what are undoubtedly disgusting and illegal purposes," McCuskey said at a press conference.
McCuskey and his legal team are suing the tech giant, alleging it has prioritized user privacy over the safety of children for years.
The lawsuit says that under federal law, U.S. companies like Apple are required to report this kind of content to the National Center for Missing and Exploited Children. It claims that in 2023, Apple filed 267 such reports, compared with nearly 1.47 million filed by Google.
"There is a social construct that dictates that you also have to be part of solving these large scale problems," McCuskey said.
McCuskey said he believes this is a responsibility for large companies like Apple and argued Apple can't be unaware of the problem as it maintains strict control over its hardware, software and cloud infrastructure.
In fact, the lawsuit includes a screenshot of what it says are texts from 2020 between executives.
"What Apple's response was, 'We are the greatest platform for distribution of child porn,'" McCuskey said, quoting the text.
West Virginia claims Apple once moved to address the problem by developing tools to detect images of child exploitation, as its competitors have, but abandoned the effort and has remained negligent ever since.
KDKA reached out to Apple for comment. A spokesperson provided the following statement:
"At Apple, protecting the safety and privacy of our users, especially children, is central to what we do. We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids. All of our industry-leading parental controls and features, like Communication Safety — which automatically intervenes on kids' devices when nudity is detected in Messages, shared Photos, AirDrop and even live FaceTime calls — are designed with the safety, security, and privacy of our users at their core."
McCuskey doesn't believe that's enough.
"We're very, very hopeful that Apple will look us in the face and say, 'You know what? You're right,'" McCuskey said.
West Virginia is seeking punitive damages and a court order requiring Apple to implement effective detection measures. McCuskey believes other states will eventually join the lawsuit.