
Apple delays child abuse detection system after backlash



Apple has bowed to pressure over the planned launch of software to detect photos of child pornography and sex abuse on iPhones, after a fierce backlash from privacy campaigners.

The company said it would delay and potentially modify the new system, which was originally expected to launch this year.

“We have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in a statement.

One of the proposed features involved matching files being uploaded from a user’s iPhone to iCloud Photos against a database of known child sex abuse imagery.
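
Apple’s published design used a perceptual hash it called NeuralHash, compared against the database through a private set intersection protocol. The sketch below is a heavily simplified stand-in for that matching step, not Apple’s implementation: it substitutes an ordinary cryptographic digest and a plain set, and the digest values are placeholders.

    # Heavily simplified stand-in for database matching, not Apple's design.
    # Apple's system used a perceptual hash ("NeuralHash") and a private set
    # intersection protocol; this sketch uses an ordinary SHA-256 digest and
    # a plain Python set instead. Digest values below are placeholders.
    import hashlib
    from pathlib import Path

    def file_digest(path: Path) -> str:
        """Hex digest of the file's raw bytes."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # Digests of known images, as supplied by a clearinghouse (placeholders).
    KNOWN_DIGESTS = {"placeholder-digest-1", "placeholder-digest-2"}

    def upload_matches_database(path: Path) -> bool:
        """True if the file's digest appears in the known-image database."""
        return file_digest(path) in KNOWN_DIGESTS

A real system needs a perceptual hash rather than a cryptographic one, so that resized or re-encoded copies of the same image still match the database entry.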

But the new controls, which were announced last month, sparked widespread alarm among privacy and human rights groups, who feared that a tool for scanning images on iPhones could be abused by repressive regimes.

The American Civil Liberties Union was among those warning that any system to detect data stored on a phone could also be used against activists, dissidents and minorities.

“Given the widespread interests of governments around the world, we cannot be sure Apple will always resist demands that iPhones be scanned for additional selected material,” the ACLU’s staff technologist, Daniel Kahn Gillmor, said last week. “These changes are a step toward significantly worse privacy for all iPhone users.”

Apple’s change of course dismayed some child protection campaigners. Andy Burrows, head of child safety online policy at the UK charity NSPCC, said the move was “incredibly disappointing” and that the company “should have stood their ground”.

Apple’s original proposal had been welcomed by officials in the US, UK and India but caused anger in Silicon Valley during delicate negotiations between the tech industry and regulators over tackling child abuse online.

The head of WhatsApp called it “very concerning”. The Electronic Frontier Foundation, the Silicon Valley digital rights group, said it was a “shocking about-face for users who have relied on the company’s leadership in privacy and security”.

In an email circulated internally at Apple, child safety campaigners had dismissed the complaints of privacy activists and security researchers as the “screeching voice of the minority”.

Apple had spent weeks robustly defending its plan, which it said involved “state of the art” cryptographic techniques to ensure the company itself could not see what images were stored on any customer’s device.

It said the system would be used only for child protection, and that the involvement of a team of human reviewers, alongside a minimum number of images that had to be detected before an account was flagged, would all but eliminate the potential for mistakes or abuse.
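
That threshold mechanism is straightforward to picture in code. The sketch below is an illustration under stated assumptions, not Apple’s implementation: the names are invented, and the threshold of 30 reflects the figure Apple executives later cited publicly for the initial system.

    # Illustrative sketch of threshold-gated review, not Apple's code.
    # Names are invented; the threshold of 30 reflects the figure Apple
    # executives cited publicly for the initial system.
    MATCH_THRESHOLD = 30

    def accounts_for_review(match_counts: dict[str, int],
                            threshold: int = MATCH_THRESHOLD) -> list[str]:
        """Return account IDs whose known-image match count meets the
        threshold. Below it, nothing is surfaced to human reviewers,
        which is what the company argued would keep isolated false
        positives from flagging an account."""
        return [acct for acct, n in match_counts.items() if n >= threshold]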

But Craig Federighi, Apple’s senior vice-president of software engineering, admitted that the introduction of the child pornography detection system, alongside a separate tool that could warn parents if their children received sexually explicit photos through its iMessage system, was confusing.

“It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood,” Federighi told the Wall Street Journal last month. “In hindsight, introducing these two features at the same time was a recipe for this kind of confusion.”




