Apple is facing new calls to drop its plans to scan phones
Apple is facing new calls to drop its plans to scan phones and other devices for images of child sex abuse.
A coalition of 90 groups around the world is calling on Apple to abandon its plan to scan its products for images of child sexual abuse stored in iCloud.
In a letter published on the Center for Democracy and Technology website, the groups said the feature, known as CSAM detection, which matches image hashes against a database of known child sexual abuse material, may jeopardize the privacy and security of Apple users worldwide.
“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the letter said.
Apple’s plan, announced earlier this month, involves a software update that scans Apple devices for child sexual abuse material; any such material found would then be reported to the National Center for Missing and Exploited Children (NCMEC).
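Hash-based detection of this kind generally works by comparing a fingerprint of each image against a database of fingerprints of already-identified abuse material, rather than by inspecting image content directly. The sketch below is a simplified illustration of that idea, not Apple's implementation: Apple's system relies on a proprietary perceptual hash (NeuralHash) and on-device cryptographic protocols, whereas this example uses an ordinary SHA-256 digest, a placeholder hash list, and an assumed "photos" folder.

```python
import hashlib
from pathlib import Path

# Placeholder stand-in for a database of hashes of known abuse imagery,
# of the kind maintained by organizations such as NCMEC. The value here
# is invented for illustration only.
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(path: Path) -> str:
    """Return a SHA-256 digest of the file's bytes.

    A real system would use a perceptual hash that tolerates resizing
    and re-encoding; a cryptographic hash keeps this sketch self-contained.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(image_paths: list[Path]) -> list[Path]:
    """Return the images whose hashes appear in the known-hash database."""
    return [p for p in image_paths if image_hash(p) in KNOWN_CSAM_HASHES]

if __name__ == "__main__":
    # Assumed directory name; in practice this would be the photo library
    # queued for upload.
    matches = flag_matches(list(Path("photos").glob("*.jpg")))
    print(f"{len(matches)} image(s) matched the known-hash database")
```

In a deployment like the one described, only images whose fingerprints match the database would be surfaced for reporting; the critics' concern is that the same matching pipeline could be pointed at other databases.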