Case Overview: A class action lawsuit has been filed against Apple, alleging that the company's iCloud service facilitated child sexual abuse by failing to detect and report child sexual abuse material (CSAM).
Who's Impacted: Children who have been victims of child sexual abuse facilitated through Apple's iCloud service.
Court: U.S. District Court for the Northern District of California
A new lawsuit brought against Apple on behalf of a 9-year-old North Carolina girl accuses the company of facilitating her sexual abuse by allowing abusive material to be sent through iMessage and stored on iCloud when the company had the ability to detect and report it.
The girl, who uses the pseudonym Jane Doe in the lawsuit, was given an iPad for Christmas in 2023 and “never imagined perpetrators would use it to coerce her to produce and upload child sexual abuse material to iCloud,” the lawsuit states.
The lawsuit accuses Apple of knowing that iCloud offers a safe haven for producers of child sexual abuse material (CSAM) to store illicit videos and images, saying Apple chose not to adopt PhotoDNA, a widely used hash-matching tool for detecting known CSAM, and as a result failed to report the material to law enforcement.
Apple’s choice not to use CSAM detection is not a consequence of end-to-end encryption but a business decision, the lawsuit alleges, adding that “Apple’s purported focus on privacy is arbitrary at best, and hypocritical at worst.”
The 9-year-old girl at the center of the lawsuit was allegedly tricked by two people who contacted her first through Snapchat and then through iMessage into creating illicit images and videos of herself and uploading them to iCloud. They then asked her to share the link with them.
“As a result of this interaction, plaintiff is severely harmed, mentally and physically,” the lawsuit states, adding that she is currently receiving psychotherapy and mental health care and that she and her family have been suffering from the sexual abuse she endured. “Plaintiff’s mother laments that Plaintiff’s innocence and carefree childhood has been ripped away from her.”
Many more children are harmed by Apple’s unwillingness to root out child sexual abuse material from its platforms, the lawsuit alleges, and the company must invest in preventing the material’s spread and compensate victims who have been exploited “due to Apple’s choice to overlook this problem.”
The lawsuit claims Apple knew it had a dire CSAM problem but chose not to address it, accusing the company of engaging in “privacywashing,” a tactic in which a company touts its commitment to protecting consumers’ privacy but neglects to meaningfully put those commitments into practice.
Apple’s anti-fraud chief even acknowledged that iCloud had become “the greatest platform for distributing child porn,” the lawsuit states, writing in text messages: “the spotlight at Facebook is all on trust and safety . . . in privacy, they suck. Our priorities are the inverse. . . we have chosen to not know [about CSAM] in enough places where we really cannot say.”
The lawsuit adds that Apple also consistently underreports child sexual abuse material to organizations like the National Center for Missing & Exploited Children (NCMEC), noting that in 2023, while four leading tech companies submitted over 32 million reports of CSAM to NCMEC, Apple submitted only 267.
Apple’s “incredulous rhetoric hedges one notion of privacy against another,” the lawsuit alleges, accusing the company of creating a false narrative that detecting and deterring CSAM would risk creating a surveillance regime that violates other users’ privacy.
“By framing privacy and safety as a zero-sum game, Apple made choices to overlook safety, even though other tech companies have deployed technology to disrupt the distribution of CSAM.”
But, the lawsuit argues, Apple should take a comprehensive approach to privacy that encompasses safety for all consumers, which it describes as the basis for a functioning society, especially in contexts where children and minors are involved.
Just as other industries and fields are required to strike a balance between reasonable expectations of privacy and reasonable expectations of safety, such as showing a government ID when purchasing alcohol at a restaurant, granting parents and guardians access to a minor’s medical records, or requiring teenage drivers to obtain a learner’s permit before a driver’s license, Apple and other tech companies shouldn’t be exempt from ensuring the safety of their users.
Just a couple of years ago, Apple came close to instituting a system that would have rooted out CSAM from its platforms, using a cryptographic detection tool it had built for that purpose, but the company ditched the plan at the end of 2022 over fears it would intrude on users’ privacy.
The move, while hailed by some privacy advocates, was largely derided by child protection agencies, Forbes reports. A review by the publication of 100 federal cases in which Apple’s systems were believed to harbor CSAM found thousands of items stored between 2014 and 2023.
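For readers unfamiliar with the technology at the center of these allegations: tools like PhotoDNA work by computing a fingerprint (hash) of each stored image and comparing it against a database of fingerprints of already-identified abuse material, so that known images can be flagged for review without anyone browsing users’ photos. The sketch below is a heavily simplified illustration of that matching flow, not Apple’s or Microsoft’s actual system; it uses an exact SHA-256 digest in place of a real perceptual hash, and the directory name, hash value, and function names are hypothetical placeholders.

```python
# Simplified illustration of hash-matching detection (the general approach
# used by tools like PhotoDNA). Real systems use perceptual hashes that
# tolerate resizing and re-encoding; this sketch uses an exact SHA-256
# digest purely to show the matching flow. All names here are hypothetical.

import hashlib
from pathlib import Path

# Hypothetical set of digests for known, previously identified images.
# In practice such lists are maintained by organizations like NCMEC.
KNOWN_HASHES = {
    "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_uploads(upload_dir: Path) -> list[Path]:
    """Flag any uploaded file whose digest matches the known-hash set."""
    flagged = []
    for path in sorted(upload_dir.glob("*")):
        if path.is_file() and file_digest(path) in KNOWN_HASHES:
            flagged.append(path)  # would be queued for human review / reporting
    return flagged

if __name__ == "__main__":
    matches = scan_uploads(Path("uploads"))
    print(f"{len(matches)} file(s) matched the known-hash list")
```

A production system would rely on a perceptual hash rather than a cryptographic one, so that resized or re-compressed copies of the same image still match, which an exact digest like SHA-256 cannot do.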
The plaintiff alleges violations of sex trafficking laws, consumer protection laws, breach of contract, misrepresentation, and unjust enrichment. She seeks to represent all other victims of child sexual abuse whose CSAM was stored or shared on Apple’s iCloud service. She is seeking declaratory and injunctive relief; compensatory and punitive damages; and attorneys’ fees and costs.
Injury Claims keeps you informed about lawsuits large and small that could affect your daily life. We simplify the complexities of class action lawsuits, open class action settlements, mass torts, and individual cases to ensure you understand how these legal matters could impact your rights and interests.
If you think a recent legal case might affect you, action is required. Select a class action lawsuit or class action settlement, share your details, and connect with a qualified attorney who will explain your legal options and assist in pursuing any compensation due. Take the first step now to secure your rights.