Hacker News

Catching child pornographers should not involve subjecting innocent people to scans and searches. Frankly, I don't care if this "CSAM" system is effective - I paid for the phone, it should operate for ME, not for the government or law enforcement. Besides, the imagery already exists by the time it's been found - the damage has been done. I'd say the authorities should prioritise tracking down the creators but I'm sure their statistics look much more impressive by cracking down on small fry.

I've had enough of the "think of the children" arguments.



The algorithms and data involved are too sensitive to be discussed publicly, and that reasoning is apparently acceptable even to the most knowledgeable people. They can't even be pressured to prove that the system is effective at its primary purpose.

This is the perfect way to begin opening back doors.


The algorithm is actually public: https://www.apple.com/child-safety/pdf/Apple_PSI_System_Secu... From an intellectual point of view it's interesting to learn about.
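For intuition, here is a toy Diffie-Hellman-style private set intersection (PSI) sketch in pure Python. This is NOT Apple's protocol (the linked paper describes a far more elaborate system); the modulus, exponents, and item names are all illustrative assumptions. It only shows the core PSI idea: blinding by exponentiation with secret keys commutes, so doubly blinded values match exactly when the underlying items match, without either party revealing its set in the clear.

```python
import hashlib

P = 2**521 - 1  # Mersenne prime used as a toy group modulus

def h(item: str) -> int:
    """Hash an item to a group element (toy hash-to-group)."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

a, b = 1234567, 7654321  # the two parties' secret exponents (toy values)

server_hashes = {"img_hash_1", "img_hash_2", "img_hash_3"}
client_hash = "img_hash_2"

# Server blinds its set with a; client re-blinds those values with b.
doubly_blinded_server = {pow(pow(h(x), a, P), b, P) for x in server_hashes}
# Client blinds its item with b; server re-blinds it with a.
doubly_blinded_client = pow(pow(h(client_hash), b, P), a, P)

in_both = doubly_blinded_client in doubly_blinded_server  # True for a match
```

Because (h^a)^b = (h^b)^a mod P, a match can be detected on the blinded values alone; a non-member item produces a value that appears in neither set.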

I agree with the rest of your points. The problem is that we don't know if Apple implemented this algorithm correctly, or even this algorithm at all, because the source code isn't subject to review and, even if it were, the binary cannot be proven to have been built from that source code. We also have no proof that the only images being searched for are child abuse images, as they claim.


Security by obscurity has never been particularly effective, and there are some articles which allege that detection algorithms can be defeated fairly easily.
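To see why evasion can be easy, here is a toy "average hash" (aHash) sketch in pure Python. It is a stand-in for perceptual image hashes in general, not the algorithm Apple uses, and the pixel values are made up. A visually negligible tweak to one pixel near the image mean flips a hash bit, so the image no longer matches a database entry.

```python
def average_hash(pixels):
    """pixels: flat list of grayscale values.
    Each bit is 1 where the pixel is >= the image mean, else 0."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p >= mean else "0" for p in pixels)

original = [50, 60, 99, 101, 140, 150]   # mean is exactly 100.0
tweaked  = [50, 60, 101, 101, 140, 150]  # one pixel nudged from 99 to 101

hash_before = average_hash(original)  # "000111"
hash_after = average_hash(tweaked)    # a different bit string: match evaded
```

Real perceptual hashes are more robust than this toy, but published collision and evasion attacks rest on the same principle: the hash must tolerate small changes, so an attacker can search for small changes that cross a threshold.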


I’m furious. My top app has 250,000 uniques a day.

I’m considering a 24h blackout with a protest link to Apple’s support email explaining what they’ve done.

I wonder if anyone else would join me?


We need to get organized first. We need a support platform where we can coordinate these types of actions. It's on my todo list, but if anyone can get this started, please do so.


There isn't any reason to believe the CSAM hash list contains only hashes of CSAM images. The government now has the ability to search for anything in your iCloud account with this.


Why is it always "think of the children"? It gets people emotional? What about terrorism, murder, or a litany of other heinous violent crimes?


I invite you to look up "The Four Horsemen of the Infocalypse". Child pornography is but one of the well-trodden paths to removing privacy and security.


And remember, a minor who takes pictures of him or herself is an offender.


As it has to be. Because there's no defense against the possession of it, you don't want a situation where a person under 18 can take pictures of him or herself, send it to an adult unsolicited, and then call the police and not suffer any consequences.


That doesn't make any sense and it's not how it works in a number of other countries. You could (for example) make it illegal to send these pictures instead.


That's how it works in the U.S. It's called "strict liability."


People own their bodies. Taking pictures of yourself, if you're a child, isn't child porn any more than touching yourself is molestation/assault.

Children don't need to be hit with "strict liability".

A person trying to frame someone else of a serious crime commits a serious offense, yes.

But that's a logically separate concept from the production or possession of child pornography, which that person must not be regarded as committing if the images are of him or herself.

The idiotic law potentially victimizes victims. A perpetrator can threaten the child into denying the existence of the perpetrator, and into falsely admitting to having taken pictures him or herself. It's exactly like taking the victims of human trafficking and charging them with prostitution, because the existence and whereabouts of the traffickers couldn't be established.

Whoever came up with this nonsense was blinded by their Bible Belt morality into not seeing the unintended consequences.


We have to deal with the reality of our legal system.


Note that this message is substantially different from what was communicated with "As it has to be".


"CSAM" is an easy target because people can't see it - it would be wrong for you to audit the db, because then you'd need the illicit content. So it's invisible to the average law-abider.


France has been pushing terrorism as a justification for mass-surveillance in the E.U.


Yes. I'm not interested in catching pedophiles, or drug dealers, or terrorists. It's the job of the police. I'm not the police.


Yes, if you act as the police you are a vigilante.


> the damage has been done. I'd say the authorities should prioritise tracking down the creators

Russian and ex-Soviet countries with human trafficking mafias host several fucked up people who produce this crap.


You know it can be used to get the geolocation from the metadata of pictures taken by people at protests, etc.


I do agree with your points, but I think it's easy to see that this feature is trying to allow authorities to catch the creators.


[flagged]


The work of Facebook's illicit media team has led to many, many prosecutions. They intentionally keep quiet about it because the reaction to a headline like "500-member Child Porn Ring busted on Facebook" isn't "Geez, I'm glad Facebook is keeping us safe," it's "Wow, maybe we shouldn't let our teenagers on Facebook" -- a reaction that significantly hurts their bottom line, and tips off the ChiPo folks besides.

Source: my own experiences in the criminal justice system and Chaos Monkeys, by Antonio Garcia-Martinez (a Y Combinator alum!).


>They intentionally keep quiet about it because the reaction

This sentiment can go fuck itself...

It's HIDING the fact of how rampant this is.


> Source: my own experiences in the criminal justice system and Chaos Monkeys, by Antonio Garcia-Martinez (a Y Combinator alum!).

Makes me think even harder about the _real_ reason Apple canceled him.


>"the reaction to a headline like "500-member Child Porn Ring busted on Facebook" isn't "Geez, I'm glad Facebook is keeping us safe," it's "Wow, maybe we shouldn't let our teenagers on Facebook"

-- Exactly. Fuck facebook.

If they wanted more credibility, it wouldn't be about "making the bottom line a more profitable place", as opposed to the bullshit "making the world a better place".


I can tell that you're angry at Facebook. However, I don't really understand why. You're upset that they aren't taking more public credit? Perhaps this is a cultural difference, but I've never been exposed to a community where not taking credit violates social values. Help me understand?


Put your email in your profile, plz


There have certainly been busts in the media, including some depraved individuals who have blackmailed teenagers into sending them images, one of which set the dangerous precedent of tech companies developing exploits, and refusing to disclose them after the fact.

It isn't terribly surprising that a platform like Facebook, which has a lot of children on it, would end up attracting predators who seek to prey on them. Fortunately, Facebook has been deploying a number of tools to improve their safety over the past few years which don't rely on surveillance or even censorship.

Statistically, there have been a number of arrests which have been a product of their activities, although I don't have much info on those. Someone else may.

The real question is whether it is worth sacrificing everyone's privacy, so that a few people can be arrested.

I can imagine iCloud being a lower risk platform than Facebook. Someone can't really groom someone into uploading photos, although the existence of such images is still very condemnable.



