
Apple under fire for allegedly using privacy to excuse inaction on child abuse material on iCloud
19 Aug, 2024 / 09:11 am / Apple

Source: http://www.mashable.com


Mashable: A proposed class action lawsuit asserts that Apple is using privacy claims as a shield to evade its obligation to prevent the storage of child sexual abuse material on iCloud and to address alleged grooming activity via iMessage.

Lawsuit Accuses Apple of Ignoring Child Sexual Abuse Content on iCloud... The company stopped using a child sexual material scanning tool, saying it posed a risk to user privacy, the complaint noted. https://t.co/pI2lhz4G3g — ashlanddog (@ashlanddog) August 16, 2024

This development follows allegations from a UK organization that Apple is significantly underreporting instances of child sexual abuse material (CSAM). The new lawsuit contends that the company is engaging in "privacy-washing" to diminish its responsibilities, reports AppleInsider.

A recent filing in the US District Court for the Northern District of California was brought on behalf of an unidentified 9-year-old plaintiff. Referred to only as Jane Doe in the legal document, the filing alleges that she was pressured into creating and uploading child sexual abuse material (CSAM) using her iPad.

The filing states that when 9-year-old Jane Doe received an iPad as a Christmas gift, she could not have anticipated that offenders would exploit it to pressure her into creating and uploading CSAM to iCloud.

The lawsuit requests a jury trial and seeks damages exceeding $5 million for each member of the class. It also demands that Apple be compelled to:

Implement safeguards to protect children from the storage and dissemination of CSAM on iCloud.
Establish accessible reporting mechanisms.
Adhere to quarterly monitoring by an independent third party.

Apple's move to scan phones and computers for child sex abuse images has reportedly sparked concerns among the company's employees too https://t.co/W5V6ncL8xf pic.twitter.com/Z5icE8ETDu — Gadgets 360 (@Gadgets360) August 13, 2021

The legal complaint asserts that Apple manipulates the concept of 'privacy' to suit its own interests, stating, "At times, it prominently displays 'privacy' on urban billboards and polished advertisements, while at other moments, it employs privacy as a rationale to ignore the blatant violations of Doe and other children's privacy due to the widespread presence of CSAM on Apple's iCloud."