Apple announced today that it is delaying the rollout of its controversial new surveillance technology that sought to identify child pornography.
Apple has updated its previous press release with this disclaimer at the top:
“Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Initial concerns were wide-ranging:
- How accurate would this scanning be? This was a particular concern when the consequences of a false match are so extreme. A misclassification could potentially mean a phone owner would be reported to law enforcement.
- Many people didn’t like Apple accessing their photo libraries or scanning their media.
- Apple could review your searches (potentially in real time) and block them or report them.
- A number of privacy groups were concerned that once this “thin wedge” was installed, it could be misused in the future.
Note that Apple has only announced it is taking “additional time,” not that it is canceling the initiative. What are your thoughts on this?
Again, just to be clear:
“How accurate would this scanning be? This was a particular concern when the consequences of a false match are so extreme. A misclassification could potentially mean a phone owner would be reported to law enforcement.”
It would take approximately 30 false positives before it would BE REVIEWED BY AN APPLE EMPLOYEE to see if law enforcement should be contacted.
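To make the threshold argument concrete, here is a minimal sketch of that kind of gate. This is NOT Apple’s actual implementation (which involves perceptual hashing and cryptographic threshold schemes); the hash values, helper names, and the figure of 30 are illustrative assumptions drawn from the comment above.

```python
# Illustrative sketch only -- not Apple's real system. It shows why a
# single false hash match would not, by itself, trigger human review.

MATCH_THRESHOLD = 30  # approximate figure cited in the discussion above


def count_matches(photo_hashes, known_hashes):
    """Count how many of a library's photo hashes appear in the
    known-CSAM hash set (both are hypothetical placeholders here)."""
    return sum(1 for h in photo_hashes if h in known_hashes)


def needs_human_review(photo_hashes, known_hashes):
    """Only once the match count crosses the threshold would an
    account be flagged for review; a handful of false positives
    alone would not."""
    return count_matches(photo_hashes, known_hashes) >= MATCH_THRESHOLD
```

So under this simplified model, a library with a few stray matches stays below the threshold, while one with 30 or more matches gets escalated to a human reviewer.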
“Many people didn’t like Apple accessing their photo libraries or scanning their media.”
You mean, just like Dropbox, Microsoft, Google, Facebook, etc ALREADY do?
“Apple could review your searches (potentially in real time) and block them or report them.”
You mean something like this: https://www.searchenginejournal.com/ways-to-get-deindexed-by-google/242069/
“A number of privacy groups were concerned that once this ‘thin wedge’ was installed, it could be misused in the future.”
There are MANY more efficient ways Apple ALREADY has in place to scan your photos. Go to Apple Photos on your phone or Mac and search for “dog” or “car.” It will likely find instances of them. You know why? Because Apple already scans your photos and categorizes them. Search for text in a photo and IT CAN FIND that! Because it has scanned your photos already. This has been in place for YEARS, has never been expanded to include nefarious information, and would be a much easier path to the abuses people are so concerned this CSAM tool could expand into!
There are worse things to worry about….
“You mean, just like Dropbox, Microsoft, Google, Facebook, etc ALREADY do?”
“There are MANY more efficient ways Apple ALREADY has in place to scan your photos.”
Sure. But why should we allow the practice to continue/escalate? What is the purpose of this argument?
I simply don’t understand the reasoning behind sitting back and accepting our privacy being consistently violated, especially at an escalating pace. I do not use Apple devices or services, but from what I understand, a main concern with this new scenario is that it goes through the device’s storage, not just what’s been uploaded. Who owns the device?
The point is that whether it is Google, Microsoft, Apple, or for that matter Lenovo, Dell, HP… you eventually have to TRUST someone. In theory, you could set up a device (mobile or desktop/laptop) with a fully open source software stack and review all the code you install on it, but that is nearly impossible to do.
So, if you are not willing to devote your full time to maintaining the integrity of your devices, you eventually have to TRUST that they are not doing something nefarious.
Google dropped the “Don’t be Evil” motto a long time ago. Microsoft couldn’t care less about most consumer devices; it moved on long ago to the corporate world, which provides most of its revenue. So in my OPINION, and I squarely understand this is my opinion, the company that advertises based on not invading your privacy is where I choose to go.
Apple has said for a long time that it is maintaining your privacy. Should it break that trust, it would be squandering a lot of goodwill. Apple has a stake in maintaining whatever privacy policies it can. So I will ride that horse until a better option becomes available…