Monday My Two Cents Special: Apple to pause its CSAM photo-scanning program for now.
Apple appears to be heeding public complaints about scanning its users' photos for Child Sexual Abuse Material (CSAM). Last Friday, iMore reported that Apple announced it has "... decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features." Recall that before a user's photos are uploaded to Apple's iCloud, the program would run on the user's device and compare each photo's "NeuralHash" against the hashes of known CSAM images. Do not get me wrong; child pornography is wrong. But no one should have access to my information unless I have granted that access or the government has a legitimate warrant.
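For readers curious about the mechanics, here is a minimal, purely illustrative sketch of the on-device matching idea described above. NeuralHash itself is Apple's proprietary perceptual-hash algorithm, so the hash function, hash values, and function names below are hypothetical stand-ins, not Apple's actual implementation.

    import hashlib

    # Hypothetical database of hashes of known flagged images (placeholder values).
    KNOWN_IMAGE_HASHES = {"placeholder_hash_1", "placeholder_hash_2"}

    def image_fingerprint(image_bytes: bytes) -> str:
        # Stand-in for a perceptual hash: a real system tolerates resizing and
        # re-encoding, while this exact-match SHA-256 does not.
        return hashlib.sha256(image_bytes).hexdigest()

    def flag_before_upload(image_bytes: bytes) -> bool:
        # Runs on the device, before upload: compare the photo's fingerprint
        # against the list of known hashes.
        return image_fingerprint(image_bytes) in KNOWN_IMAGE_HASHES

The key design point in Apple's proposal was that this comparison happens on the user's device rather than on Apple's servers, which is exactly what raised the privacy concerns discussed below.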
As MacRumors reported, "... the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees." I personally do not like the idea of private companies taking on the role of "Big Brother," snooping for trouble unless they have been served with a "good faith" warrant to search a person's device. Apple is a big privacy advocate (one of the many reasons I like Apple). Not only does Apple's original announcement of the program put a chink in its (privacy) reputation, it also gives me great pause about what it (and others) may do to proactively share my personal information with any government, not just my own.
The CSAM snooping program was set to be released with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. We will see whether that still happens, whether the new operating systems roll out on time, or whether the program is released later. Either way, I won't be storing my personal items in iCloud.
MTC.