
Apple CSAM scanning plans may have been abandoned, but that hasn’t ended the controversy. An Australian regulator has accused the Cupertino company of turning a blind eye to the sexual exploitation of children.

The commissioner said that both Apple and Microsoft are failing to take steps to protect “the most vulnerable from the most predatory” …

Background

The usual way to detect Child Sexual Abuse Material (CSAM) is for cloud services like Google Photos to scan uploaded photos and compare them against a database of known CSAM images. This database is provided by NCMEC and similar child-safety organizations around the world.

The actual matching process uses what’s known as a hash, or digital fingerprint. This is derived from key elements of the image, and is deliberately fuzzy so that it will continue to work when images are resized, cropped, or otherwise processed. This means there will sometimes be false positives: an innocent image whose hash happens to be a close enough match to a CSAM one.
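To illustrate the idea, here is a minimal Python sketch of one simple perceptual hash (an “average hash”) together with a fuzzy comparison based on Hamming distance. This is not the algorithm Apple or NCMEC actually use; production systems such as Apple’s NeuralHash or Microsoft’s PhotoDNA are far more sophisticated, and the database values and threshold below are placeholders.

```python
# Minimal perceptual-hash sketch (average hash), for illustration only.
# It shows why matching is "fuzzy": small edits such as resizing or mild
# cropping change only a few bits, so a near-match threshold on the Hamming
# distance still catches them -- and occasionally an unrelated image lands
# within the threshold too (a false positive).
from PIL import Image  # pip install Pillow


def average_hash(path: str, hash_size: int = 8) -> int:
    """Shrink to hash_size x hash_size grayscale, then emit one bit per pixel:
    1 if the pixel is brighter than the image's mean, else 0."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical database of known hashes and a match threshold (placeholders).
KNOWN_HASHES = {0x9F3B6C1E2A4D8870}
MATCH_THRESHOLD = 5  # max differing bits (out of 64) to count as a match


def is_match(path: str) -> bool:
    h = average_hash(path)
    return any(hamming_distance(h, known) <= MATCH_THRESHOLD
               for known in KNOWN_HASHES)
```

The looser the threshold, the more robust the match is to cropping and re-encoding, but the higher the chance that an innocent image falls within it.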

Apple planned a more privacy-friendly approach, in which scanning took place on the user’s iPhone rather than in the cloud – but cybersecurity experts, human rights organizations, governments, and Apple’s own employees all raised concerns about the plans.

While Apple appeared surprised by the pushback, we pointed out at the time that this was entirely predictable given the company’s constant privacy-based messaging that “what happens on iPhone stays on iPhone.”

The company has put up huge billboards. It has run amusing ads. It has an entire privacy microsite. Its CEO talks about privacy in every interview and public appearance. The company attacks other tech giants over privacy. It fought the entire ad industry over a new privacy feature.

After initially stating that it would pause the rollout to consider the concerns raised and put additional privacy safeguards in place, the company quietly removed all references to the plan from its website. When questioned, Apple said that the feature was delayed, not cancelled. However, that changed last week.

On the same day that the company announced Advanced Data Protection with end-to-end encryption for all iCloud data, it also put an end to the never-released CSAM scanning feature. The news was confirmed by Apple’s senior vice president of software engineering Craig Federighi in an interview with the WSJ’s Joanna Stern.

Australian regulator accuses Apple of turning a blind eye

Reuters reports that the Australian e-Safety commissioner has accused both Apple and Microsoft of failing to play their part in preventing the sharing of CSAM.

The e-Safety Commissioner, an office set up to protect internet users, said that after it sent legal demands for information to some of the world’s biggest internet firms, the responses showed that Apple and Microsoft do not proactively screen for child abuse material in their storage services, iCloud and OneDrive.

An Apple announcement a week ago that it would not scan iCloud accounts for child abuse material, following pressure from privacy advocates, was “a major step backwards from their responsibilities to help keep children safe,” commissioner Julie Inman Grant said.

The failure of both firms to detect live-streamed abuse amounted to “some of the biggest and richest technology companies in the world turning a blind eye and failing to take appropriate steps to protect the most vulnerable from the most predatory”, she added.

Whether Apple will be able to maintain its new position remains to be seen. The company may in the future be faced with a legal requirement to detect CSAM.

Photo: Priscilla Du Preez/Unsplash


