
This is Apple installing code on their users' devices with the express intent to harm their customers. That's it! This is inarguable! If this system works as intended, Apple is knowingly selling devices that will harm their customers. We can have the argument as to whether the harm is justified, whether the users deserved it. Sure, this only impacts child molesters. That makes it ok?

"But it only impacts iCloud Photos". Valid! So why not run the scanner in iCloud and not on MY PHONE that I paid OVER A THOUSAND DOLLARS for? Because of end-to-end encryption. Apple wants to have their cake and eat it too. They can say they have E2EE, but also give users no way to opt-out of code, running on 100% of the "end" devices in that "end-to-end encryption" system, which subverts the E2EE. A beautiful little system they've created. "E2EE" means different things on Apple devices, for sure!

And you're ignoring (or didn't read) the central, valid point of the EFF article: Maybe you can justify this in the US. Most countries are far, far worse than the US when it comes to privacy and human rights. The technology exists. The policy has been drafted and enacted; Apple is now alright with subverting E2EE. We start with hashes of images of child exploitation. What's next? Tank man in China? Photos of naked adult women, in conservative parts of the world? A meme criticizing your country's leader? I want to believe that Apple will, AT LEAST, stop at child exploitation, but Apple has already destroyed the faith I held, only yesterday, in their fight for privacy as a right.

This isn't an issue you can hold a middle-ground position on. Encryption doesn't just kinda-sorta work in a half-assed implementation; it doesn't work at all.



> So the Photos app will hash images and compare to the list in the database.

I am wondering what hashes are in this database now, and what will be in it later. Or combine this with a Pegasus exploit: put a few bad images on a journalist's or politician's iPhone, clean up the tracks, and wait for Apple and the FBI to destroy the person.
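For anyone wondering what "compare to the list in the database" means mechanically, here is a toy sketch in Python. To be clear about what's assumed: Apple's actual system uses its proprietary NeuralHash perceptual hash plus private set intersection on the server; the average hash, the plain set lookup, the BAD_HASHES value, and the distance threshold below are all stand-ins of mine for illustration.

    # Toy sketch of perceptual-hash matching. NOT Apple's system:
    # Apple uses NeuralHash + private set intersection; this uses a
    # simple 8x8 average hash and a plain set lookup, for intuition.
    from PIL import Image

    def average_hash(img):
        """64-bit aHash: grayscale, shrink to 8x8, threshold at the mean."""
        pixels = list(img.convert("L").resize((8, 8)).getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | int(p >= mean)
        return bits

    def hamming(a, b):
        """Number of differing bits between two 64-bit hashes."""
        return bin(a ^ b).count("1")

    # Hypothetical database of hashes of known images (placeholder value).
    BAD_HASHES = {0x81C3E7FF7FE3C381}

    def matches_database(path, max_distance=5):
        """Flag an image within a few bits of any database entry."""
        h = average_hash(Image.open(path))
        return any(hamming(h, bad) <= max_distance for bad in BAD_HASHES)

Note that matching is on *nearby* hashes, not exact equality; that's what catches re-encoded or slightly cropped copies, and it's also why planted or adversarially crafted images are a real worry rather than a hypothetical one.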


> in their fight for privacy as a right

I kept the LineageOS phone in my back pocket, confident that it would be a good 4-5 years before Apple shipped something that violated their claims. I figured that by then the alternatives would be stable and widespread.

My timing was off.


> with the express intent to harm their customers.

This of course gets into 'what even is harm?' since that's a very subjective way of classifying something, especially when you try to do it on behalf of others.

For CSAM you could probably assume that "everyone this code takes action against would consider doing so harmful", but *consequences in general are harmful* and thus you could make this same argument about anything that tries to prevent crime or catch criminals instead of simply waiting for people to turn themselves in. You harm a burglar when you call for emergency services to apprehend them.

> This isn't an issue you can hold a middle-ground position on. Encryption doesn't just kinda-sorta work in a half-assed implementation; it doesn't work at all.

This is exactly the trap the U.S. has fallen into - thinking that you can't disagree with one thing someone says or does while agreeing with other things they say or do. You can support Apple deciding to combat CSAM. You can not support Apple for trying to do this client-side instead of server-side. You can also support Apple for taking steps toward bringing E2EE to iCloud Photos. You can also not support them bowing to the CCP and giving up Chinese citizens' iCloud data encryption keys to the CCP. This is a middle ground - and just because you financially support Apple by buying an iPhone or making in-app purchases doesn't mean you suddenly agree with everything they do. This isn't a new phenomenon - before the internet, we just didn't have the capacity to know, in an instant, the bad parts of the people and companies we interfaced with.


You do harm a burglar when you call for emergency services; but the burglar doesn't pay for your security system. And more accurately: an innocent man pays for his neighbor's security system, which has a "one in a trillion" chance of accusing the innocent man of breaking in, totally randomly and without any evidence. Of course, the chances are slim, and he would never be charged with breaking in if it did happen, but would you still take that deal?
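Taking the quoted "one in a trillion" per-account-per-year figure at face value, the expected-value math looks like this (the account count is my rough guess, not an Apple number):

    # Back-of-the-envelope expected false accusations per year.
    accounts = 1_000_000_000   # rough guess at iCloud Photos accounts, not an Apple figure
    p_false = 1e-12            # the claimed "one in a trillion" per account per year
    print(accounts * p_false)  # 0.001 -> roughly one false flag per thousand years

Of course, that math is only as trustworthy as the claimed rate, the match threshold, and the hash list behind it.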

I've seen the "right against unreasonable searches and seizures" Americans hold quoted a bit during this discussion. Valid, though to be clear, the Constitution doesn't apply to a private company's products. But more interestingly: what about the right against self-incrimination? That's what Apple is pushing here: that by owning an iPhone, you may incriminate yourself, and it may end up happening whether or not you're actually guilty.


Regarding the legal point in your second paragraph: Apple doesn't incriminate you even if they send the image off and an image reviewer deems something CSAM. If Apple then files a police report or otherwise hands the evidence to the police, the police will still have to prove that (A) the images do indeed depict sexually suggestive content involving a minor, and (B) you could not raise an affirmative defense under 18 USC 2252A(d) [0], i.e. that you possessed 3 or more actual illegal images, or that you didn't take reasonable steps to destroy the images or immediately report them to law enforcement and give law enforcement access to said photos.

The biggest issue with this is, of course, that Apple's accusation is most certainly going to be enough evidence to get a search warrant, meaning a search and seizure of every hard drive of yours they can find.

0: https://www.law.cornell.edu/uscode/text/18/2252A#:~:text=(d)...


Based on your A and B there, I think we're about to see a new form of swatting. How many people regularly go through all of their photos? Now if someone pisses someone else off and has physical access to their phone, they just need to add 3 pictures with older timestamps and wait for the inevitable results.


I believe the account gets disabled...? More of that, no thanks.


> Sure, this only impacts child molesters.

Um. No?

I would be very surprised if more than 10% of people in possession of sexual images of under-18s have molested (prepubescent) children.


There's a database of known child porn. The hashes of these images are compared with the content on people's phones.


I don't think GP missed that point, and it seems like you missed theirs. Having illegal CSAM and molesting children are two different crimes.


The OP gets away with that argument because many people who have such images are, hopefully, also minors themselves.

However, this is NOT the use case being applied here. Holding those images, which are not part of the known-CP set, will not be an issue; bringing it up is a red herring. The issue most people have fruitfully started discussing is the scanning of content on your own phone.

Secondly - the correlation between holding known CP and child molestation IS, sadly, high.


I think Apple has always installed software on their users' devices with explicit intent to harm their customers. This instance just makes it a little more obvious what the harm is, but not enough to harm Apple's bottom line. Eventually Apple will do something that is obvious to everyone, but by then it will probably be too late for most people to leave the walled garden (prison).


There is no E2E encryption of iCloud Photos or backups, and Apple never claimed to have that (except for Keychain); the FBI reportedly stepped in and prevented them from doing so years ago.



