Apple will scan US iPhones looking for images of child sexual abuse


Apple has revealed plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments seeking to surveil their citizens.

A tool designed to detect known images of child sexual abuse, called "neuralMatch," will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user's account will be deactivated and the National Center for Missing and Exploited Children will be notified.

Separately, Apple plans to scan users' encrypted messages for sexually explicit content as a child safety measure, which has also upset privacy advocates.

The detection system will only flag images that are already in the center's database of known child pornography. Parents taking innocent photos of a child in the bath presumably have nothing to worry about. But researchers say the matching tool, which does not "see" such images but only the mathematical "fingerprints" that represent them, could be put to more sinister purposes.
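To make the "fingerprint" idea concrete, here is a minimal sketch in Python of how matching against a database of known-image fingerprints works in principle. It is not Apple's neuralMatch or Microsoft's PhotoDNA, which use perceptual hashes designed to survive resizing and re-compression; this sketch substitutes a plain SHA-256 digest, and the fingerprint set and function names are hypothetical, purely to illustrate that the scanner compares opaque hashes rather than looking at the photos themselves.

```python
# Illustrative sketch only: real systems use perceptual hashes, not SHA-256,
# and the fingerprint database below is a made-up placeholder.
import hashlib
import sys
from pathlib import Path

# Hypothetical database of fingerprints of known abuse images, distributed
# as opaque digests -- the scanner never holds or displays the images.
KNOWN_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_path: Path) -> str:
    """Return a hex digest standing in for a perceptual image fingerprint."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def flag_before_upload(image_path: Path) -> bool:
    """True if the image matches a known fingerprint and should go to human review."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS

if __name__ == "__main__":
    # Usage: python scan_sketch.py photo1.jpg photo2.jpg ...
    for name in sys.argv[1:]:
        if flag_before_upload(Path(name)):
            print(f"{name}: matched a known fingerprint, queue for human review")
        else:
            print(f"{name}: no match, upload proceeds")
```

In the scheme described above, only a match confirmed by human review would lead to an account being disabled and a report being filed; nothing in a hash comparison of this kind reveals the content of non-matching photos.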

Matthew Green, a leading cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly harmless images designed to trigger matches for child pornography. That could fool Apple's algorithm and alert law enforcement. "Researchers have been able to do this pretty easily," he said of the ability to trick such systems.

Other potential abuses include government surveillance of dissidents or protesters. "What happens when the Chinese government says, 'Here's a list of files we want you to scan,'" Green asked. "Does Apple say no? I hope they say no, but their technology won't say no."


Tech companies including Microsoft, Google, and Facebook have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan user files stored on its iCloud service, which is not as securely encrypted as on-device data, for child pornography.

Apple has been under government pressure for years to allow greater surveillance of encrypted data. Devising the new security measures required Apple to strike a delicate balance between cracking down on child exploitation and maintaining its high-profile commitment to protecting the privacy of its users.

But the Electronic Frontier Foundation, the online civil liberties pioneer, called Apple's compromise on privacy protections "a shocking about-face for users who have relied on the company's leadership in privacy and security."

Meanwhile, the computer scientist who more than a decade ago invented PhotoDNA, a technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple's system but said it was far outweighed by the imperative of combating child sexual abuse.


"Is it possible? Of course. But is that something that worries me? No," said Hany Farid, a researcher at the University of California, Berkeley, who argues that many other programs designed to protect devices from various threats have not. seen "in this way, for example, WhatsApp provides users with end-to-end encryption to protect their privacy, but also uses a malware detection system and warns users not to click on malicious links.

Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit photos on children's phones, and can also warn the parents of younger children via text message. The software is also said to "step in" when users try to search for topics related to child sexual abuse.

To receive warnings about sexually explicit images on their children's devices, parents will need to enroll their child's phone. Children over the age of 13 can unenroll, meaning that parents of teenagers will not receive notifications.
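As a rough illustration of how such an on-device pipeline could be wired together, the sketch below blurs a photo that a classifier flags and only notifies a parent for an enrolled phone belonging to a child under 13, mirroring the behavior described above. The classifier, account record, and notification hook are all hypothetical placeholders, not Apple's machine learning model or APIs.

```python
# Illustrative sketch only: looks_sexually_explicit(), ChildAccount, and
# notify_parent() are hypothetical stand-ins, not Apple's on-device ML or APIs.
from dataclasses import dataclass
from PIL import Image, ImageFilter  # Pillow: pip install Pillow

@dataclass
class ChildAccount:
    age: int
    parent_enrolled: bool  # parent opted this phone into warnings

def looks_sexually_explicit(image: Image.Image) -> bool:
    """Placeholder for an on-device classifier; a real system would run an ML model here."""
    return False

def notify_parent(message: str) -> None:
    """Stand-in for a real notification channel (e.g., a text message to the parent)."""
    print(f"[parent alert] {message}")

def handle_incoming_photo(image: Image.Image, account: ChildAccount) -> Image.Image:
    """Blur a flagged photo on the device and decide whether a parent gets alerted."""
    if not looks_sexually_explicit(image):
        return image

    blurred = image.filter(ImageFilter.GaussianBlur(radius=25))

    # Mirrors the policy described above: alerts go only to parents of enrolled
    # phones belonging to children under 13; teenagers can opt out of notifications.
    if account.parent_enrolled and account.age < 13:
        notify_parent("A sexually explicit image was blurred on your child's device.")
    return blurred
```

In this sketch both the classification and the blurring happen locally; no image leaves the device, which is the distinction the next paragraph turns on.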

Apple said none of these features would compromise the security of private communications or notify the police.

Lineesh Kumar

"Lk Techsky is Expertise in covering technology news, reviews, producing quality tech videos, graphic designing, VFX editing, and more..." facebook instagram linkedin

Post a Comment

Previous Post Next Post

Subscription