>18 U.S.C. § 2258A is specific: the data can only be sent to NCMEC. (Under 2258A, it is illegal for a service provider to turn CP photos over to the police or the FBI; you can only send them to NCMEC. NCMEC then contacts the police or FBI.) What Apple has detailed is the intentional distribution (to Apple), collection (at Apple), and access (viewing at Apple) of material that they have strong reason to believe is CSAM. As my attorney explained it to me, that is a felony.
I'm not sure, after reading the article, whose system is more insane: Apple's or NCMEC's.
Because Apple (its employees) aren't actually viewing or transmitting the images themselves. They mention somewhere that it's a low-res proxy of the image, or something similar.
But wouldn't a low-res image of CP still be classified as CP? I guess the manual verification will be done as a joint venture by Apple and the authorities.
Well, the trouble is that it makes developing detection nearly impossible.
And the line where the law is crossed is fuzzy. Say you use an AI classifier: at what accuracy does validating the results of that AI become a crime? 50.000001%?
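To make the arbitrariness concrete: any detection pipeline has to pick some numeric confidence cutoff before a human ever looks at a match, and nothing in the statute says where that cutoff turns "incidental" into "knowing." A minimal sketch (the function name, threshold, and scores here are invented for illustration, not anything Apple has described):

```python
# Hypothetical illustration of the threshold problem: the classifier
# score and cutoff below are made up, not from any real system.

def should_flag_for_review(classifier_score: float, threshold: float = 0.5) -> bool:
    """Flag an item for human review when the (hypothetical) classifier's
    confidence exceeds the chosen cutoff. The legal question is whether
    crossing this line constitutes 'knowledge' of the content."""
    return classifier_score > threshold

# A score of 0.50000001 gets flagged; 0.49 does not. The legal line is
# no sharper than this floating-point comparison.
print(should_flag_for_review(0.50000001))  # True
print(should_flag_for_review(0.49))        # False
```

The point isn't the code, which is trivial, but that any such system necessarily encodes an arbitrary numeric line, while the statute offers no corresponding number.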
What part of criminalizing the obvious course of action that everyone is taught to do (find evidence of illegal activity, give it to the police) makes any iota of sense to you?