Experts Say Apple’s Child Porn Detection Tool Is Less Accurate Than Advertised

Critics worry authoritarian governments could use tool to hunt dissidents


Santi Ruiz • August 24, 2021 4:50 pm

Privacy advocates say Apple’s child pornography detection tool has a higher false-positive rate than the company claims.

Apple announced earlier this month that it will begin scanning user photos stored on iCloud for material found in a database of known child pornography. The company says the chance of the system incorrectly flagging a given account is less than one in a trillion per year. But researchers have reverse engineered Apple's system and shown that it classifies multiple sets of distinct photos as identical. Those mistakes cast doubt on Apple's numbers and suggest iCloud users could be wrongly flagged to law enforcement or targeted by malicious actors.

The detection tool marks a shift for Apple. After a 2015 terrorist attack in San Bernardino, Calif., the company refused to build the FBI a backdoor into the shooter’s phone, saying, “The only way to guarantee that such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it.” Experts have warned that authoritarian governments could use the detection tool to hunt down political dissidents.

Apple’s detection system works by comparing “hashes,” or digital fingerprints, of iCloud photos against hashes of known child pornography. If an account accumulates a certain number of hash matches, Apple employees will review the flagged material and alert authorities.
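
To make the idea concrete, here is a minimal sketch of threshold-based hash matching. The hash values, threshold, and function names are placeholders invented for illustration; this is not Apple's actual implementation.

```python
# Minimal sketch of threshold-based hash matching. All values are hypothetical;
# Apple's real system uses a perceptual hash and additional safeguards.

KNOWN_HASHES = {"9f2c1a", "4b7e0d", "c31a88"}  # stand-in database of known hashes
MATCH_THRESHOLD = 30                            # assumed number of matches before human review

def count_matches(photo_hashes: list[str]) -> int:
    """Count how many of a user's photo hashes appear in the known-hash database."""
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

def should_flag(photo_hashes: list[str]) -> bool:
    """Flag an account for review only once the match count passes the threshold."""
    return count_matches(photo_hashes) >= MATCH_THRESHOLD
```

The threshold is meant to keep a handful of chance matches from triggering a report; the collision research described below is what calls that safety margin into question.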

But researchers using code posted publicly by Apple say they’ve already found distinct photos that share the same hash. In one example, researchers artificially modified a picture of a dog so that its hash matched that of a photo of a young girl. Experts worry that bad actors could modify ordinary photos to trigger matches with known child pornography and send those photos to unsuspecting users.
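
A toy example shows the failure mode. The hash function below is invented purely for illustration; it is not NeuralHash, and the pixel values are made up. It only demonstrates how two visually different images can land on the same hash value and become indistinguishable to a matcher like the one sketched above.

```python
# Toy "perceptual" hash: bucket an image's average brightness into 16 levels.
# Invented for illustration only; real perceptual hashes are far more complex.

def toy_perceptual_hash(pixels: list[int]) -> int:
    return (sum(pixels) // len(pixels)) // 16

dog_photo   = [200, 190, 210, 205]  # stand-in pixel values for one image
other_photo = [198, 206, 199, 201]  # a different image with a similar average

# Both images fall in the same brightness bucket, so their hashes collide
# and the matcher cannot tell them apart.
print(toy_perceptual_hash(dog_photo) == toy_perceptual_hash(other_photo))  # True
```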

Apple has resisted outside scrutiny of its new system. While Apple executive Craig Federighi told the Wall Street Journal that independent researchers were welcome to assess the child porn detection mechanism, the company sued Corellium, a firm that provides researchers with tools to analyze Apple products.

“For the last several years, Apple has argued that doing privacy and security research on iOS is illegal,” said former Facebook chief security officer Alex Stamos. Apple has not said whether the code it posted publicly is the code for the final scanning tool, leaving researchers in the dark.

The company has been similarly unclear about the database it will use to identify child sexual abuse material, maintaining that keeping the image database private will prevent people from reverse engineering the hashes of inappropriate photos.

But while Apple says any flagged iCloud content must match databases from at least two child protection organizations, it has only named one such organization, the National Center for Missing and Exploited Children.

Apple did not respond to a request for the names of other partner organizations.
