Apple’s plan to scan iPhones for images of child sexual abuse ignites privacy concerns


(NewsNation Now) — Apple recently unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool designed to detect known images of child sexual abuse, called “neuralMatch,” will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified.

The system will not flag images not already in the center’s child pornography database. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn’t “see” such images, just mathematical “fingerprints” that represent them — could be put to more nefarious purposes.
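The fingerprint matching the article describes can be illustrated with a minimal sketch. This is a hypothetical toy, not Apple's implementation: real systems like neuralMatch use perceptual hashes that survive resizing and re-encoding, whereas a plain cryptographic hash is used here only as a stand-in for the "fingerprint" idea.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of image content.

    Stand-in: SHA-256. A production system would use a perceptual
    hash so visually identical images map to the same fingerprint.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known abuse images,
# supplied by a clearinghouse such as the National Center for
# Missing and Exploited Children.
KNOWN_HASHES = {fingerprint(b"example-known-image-bytes")}

def flag_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches the known-content database.

    Per the article, a match triggers human review rather than
    any automatic action against the user.
    """
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The key property the article notes follows directly from this design: the scanner never "sees" image content outside the database, because only fingerprints are compared, and an image absent from `KNOWN_HASHES` can never match.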

Privacy expert Sharon Bradford Franklin says that what Apple is attempting is admirable, but believes it poses a privacy risk.

“You should not feel fully comforted, because these policy choices can easily give way in the face of demands that Apple expand and repurpose its new tools,” she said.

Franklin believes the scope of what images Apple scans for could easily change over time.

“Right now, Apple is limiting the photo scanning to scanning for hash images of child sexual abuse material,” she said. “But absolutely, the next thing to come down is going to be a government request to scan for terrorist content.”

The Associated Press contributed to this report.
