So Apple decided that they'll take every one of your pictures and try to assess them to check whether they contain nude children.

Imagine some random guy knocking on your door, stating he's from the photo album manufacturer and now needs to check whether any of your photo albums contain pictures of naked children. And if he considers anything suspicious, he might need to report it to the police.

I'm sure you would enjoy such a visit every week. I mean, it's to protect children, right? Right?

Oops I did it again…

A little warning banner that shows up, reminding you of what Apple decided to do. I guess it will stick around a bit longer than the hype around the problem.

git.shivering-isles.com/shiver

Enjoy!

Well, that didn't take long… we have a first hash collision: social.wildeboer.net/@jwildebo

Have that picture of a dog on your phone, and the secret threshold for Apple's CSAM scanner is down by one.
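To see how two unrelated pictures can share a hash at all, here's a toy sketch using a simple average hash. This is not Apple's NeuralHash, just the same general idea: a perceptual hash keeps only very coarse structure, so visibly different images can reduce to an identical bit pattern.

```python
# Toy illustration of a perceptual hash collision. This is a simple
# "average hash", NOT Apple's NeuralHash; the point is only that a
# perceptual hash throws away almost all detail.

def average_hash(pixels: list[list[int]]) -> int:
    """Hash a tiny grayscale image: one bit per pixel, set if the pixel
    is at least as bright as the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

# Two clearly different 4x4 "images" with the same bright-left/dark-right layout.
img_a = [[200, 210, 20, 10],
         [190, 220, 30, 25],
         [180, 205, 15,  5],
         [195, 215, 35, 20]]
img_b = [[140, 255, 60,  0],
         [130, 250, 70, 50],
         [120, 245, 40, 10],
         [135, 240, 80, 55]]

print(hex(average_hash(img_a)))  # 0xcccc
print(hex(average_hash(img_b)))  # also 0xcccc, a collision
```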

Good thing there is only a one-in-a-trillion chance of a false positive.

But no worries, it's just your own phone making false accusations of pedophilia against you and maybe reporting you to Apple, which might take it to the police.
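For the sake of argument, here's a rough back-of-the-envelope model of why the threshold matters: treat every photo as an independent small chance of a false match, and the account as flagged once enough matches pile up. All numbers below are made up for illustration; they are not Apple's published parameters.

```python
# Back-of-the-envelope model of the "threshold" logic.
# Every number here is hypothetical.

def prob_flagged(n_photos: int, p_match: float, threshold: int) -> float:
    """P(at least `threshold` false matches) for Binomial(n_photos, p_match),
    computed as 1 minus the probability of fewer than `threshold` matches."""
    pmf = (1 - p_match) ** n_photos      # P(exactly 0 matches)
    below = pmf
    for k in range(1, threshold):
        # Binomial PMF recurrence: P(k) = P(k-1) * (n-k+1)/k * p/(1-p)
        pmf *= (n_photos - k + 1) / k * p_match / (1 - p_match)
        below += pmf
    return 1 - below

# Made-up numbers: 1,000 photos, a 1-in-1,000 per-photo false match rate,
# and a review threshold of 3 matches.
print(prob_flagged(1_000, 1e-3, 3))  # ~0.08
# One known colliding image (like the dog picture) already counts as a match,
# so only two accidental matches are still needed:
print(prob_flagged(1_000, 1e-3, 2))  # ~0.26
```

The point isn't the exact numbers; it's that every collision already circulating in public eats one step of whatever safety margin the threshold was supposed to provide.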


@sheogorath Many argue that if it's done by a piece of software, then it is not a search without a warrant, because a search needs to be carried out by a human. But a search by software may sometimes be even worse than a search by a human, exactly because of cases like these (and if people blindly believe the results from such software, it's just a recipe for dystopia).
