I just read up on it, and I hadn’t realized this is not so much about stopping new images as it is about restitution for continued damages.
The plaintiffs are “victims of the Misty Series and Jessica of the Jessica Series” (be careful with your googling): https://www.casemine.com/judgement/us/5914e81dadd7b0493491c7d7
Correct me if I’m wrong, but the plaintiffs’ logic is: “The existence of these files is damaging to us. Anyone ever found in possession of one of these files is required by law to pay damages. Any company that stores files for others must search every file for one of these 100 files, and report that file’s owner to the court.”
I thought it was more about protecting the innocent, present and future, but it seems to be more about compensating those already hurt.
Am I missing something?
Children should be made illegal; this is a self-resolving problem.
Is iCloud a file-sharing service or a social network in some way? If it isn’t, comparing it with such services makes no sense.
I thought the way they intended to handle it was pretty reasonable, but the idea that there is an actual obligation to scan content is disgusting.
“People like to joke about how we don’t listen to users/feedback. About how we just assert our vision and do things how we wish. Like our mouse. It drives people absolutely bonkers! But this time we listened to the pushback. And now they sue us?”
I’d posit that the people who don’t want their files scanned, and the people suing Apple are not the same people.
If I’ve learned one thing from my time on earth, it’s that all humans are the same, and all of the opinions of one are shared by the majority.
The irony is that the Apple CSAM detection system was as good as we could make it at the time, with multiple steps to protect people from accidental positives.
But, as usual, I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.
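For anyone who didn’t read it, here’s a minimal sketch of the threshold idea, in Python. To be clear, this is not Apple’s actual protocol (the real design used NeuralHash, a blinded hash database, private set intersection, and threshold secret sharing, so the server learns nothing below the threshold); the hash function, the known-hash set, and the threshold value here are placeholders, just to illustrate the safeguard: no single match reveals anything, and nothing gets escalated to human review until an account crosses a multi-match threshold.

    import hashlib

    # Stand-in for a perceptual hash. Apple's design used NeuralHash, a
    # neural-network perceptual hash, not a cryptographic digest like this.
    def image_hash(image_bytes: bytes) -> str:
        return hashlib.sha256(image_bytes).hexdigest()

    # Placeholder set of known hashes. In the real design this database is
    # blinded, so the device never learns which hashes are in it.
    KNOWN_HASHES = {image_hash(b"placeholder-known-image")}

    # Apple's published design used a threshold of roughly 30 matches.
    MATCH_THRESHOLD = 30

    def count_matches(images: list[bytes]) -> int:
        return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)

    def should_escalate(images: list[bytes]) -> bool:
        # Below the threshold, matches stay unreadable to the server; only
        # above it would they become decryptable and go to human review.
        return count_matches(images) >= MATCH_THRESHOLD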
You should have, though. This type of scanning is the thin end of the wedge toward complete surveillance. If it’s added, next year it gets extended to cover terrorism. Then to look for missing people. Then “illegal content” in general.
The reason most people seem to disagree with you in this case is that you’re wrong
We could’ve burned that bridge when we got to it. If Apple had been allowed to implement on-device scanning, they could’ve done proper E2E “we don’t have the keys, officer, we can’t unlock it” encryption for iCloud.
Instead, what we have now is what EVERY SINGLE other cloud provider does: they scan your shit in the cloud all the time unless you specifically upload only locally-encrypted content, which 99.9999% of people will never bother to do.
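For the handful of people who do bother: here’s a minimal sketch of what “upload locally-encrypted content” means, using Python’s third-party cryptography package. Fernet is just one convenient authenticated-encryption recipe, and the file names and the upload step are placeholders; the point is that the provider only ever stores ciphertext it can’t scan, and the trade-off is that losing the key means losing the data.

    from pathlib import Path
    from cryptography.fernet import Fernet  # pip install cryptography

    # Generate (and keep!) a key; lose it and the ciphertext is unrecoverable.
    key = Fernet.generate_key()
    Path("backup.key").write_bytes(key)
    f = Fernet(key)

    # Encrypt locally, before the file ever leaves the machine.
    plaintext = Path("photo.jpg").read_bytes()        # placeholder file name
    ciphertext = f.encrypt(plaintext)
    Path("photo.jpg.enc").write_bytes(ciphertext)

    # Upload photo.jpg.enc with whatever sync client you like; the provider
    # only ever sees ciphertext, so server-side scanning finds nothing.

    # After downloading it back, decrypt with the same key:
    restored = Fernet(Path("backup.key").read_bytes()).decrypt(ciphertext)
    assert restored == plaintext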
iCloud does have E2EE (Advanced Data Protection) if you enable it
It does now; it didn’t at the time