Hacker News

> designate your partner as a child.

That's not how it works, unless you control your partner's Apple ID and you lie about their DOB when you create their account.

I created my kids' Apple IDs when they were minors and enrolled them in Family Sharing. They are now both over 18 and I cannot just designate them as minors. Apple automatically removed my ability to control any aspects of their phones when they turned 18.

> Dads' photos of their kids landing them on a national kiddie porn watch list.

Indeed, false positives are much more worrying. The idea that my phone is spying on my pictures... like, what the hell.



> That's not how it works, unless you control your partner's Apple ID and you lie about their DOB when you create their account.

Rather than reassuring me, this sounds like an achievable set of steps for an abuser to carry out.


I recently had a friend stay with me after being abused by their partner. The partner had paid for their phone and account and was using that control to spy on them. I wish that cyber security was taught in a more practical way because it has real world consequences. And like two comments on here and it’s now clear as day how this change could be used to perpetuate abuse. I’m not sure what the right solution is, but I wish there was a tech non profit that secured victims of abuse in their communication in an accessible way to non tech people.


Most people on this platform understand cyber security and OpSec relatively well. The problem is the people who aren't on a platform like this; they need a good learning system and ways of making the material interesting enough to actually retain and understand.


More than achievable. Abusers often control their victims' accounts.


Exactly. As the IT guy in the family, you set up accounts for everybody all the time.


Here’s better -

There’s a repository built from seized child porn.

Those pictures and videos have hashes. Apple wants to match against those hashes.

That’s it.

That’s it for now.
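In spirit, the matching step is just set membership. A toy sketch below (note: Apple's real pipeline uses a perceptual NeuralHash plus private set intersection, not plain SHA-256 or a plaintext on-device list; the hash values here are made up for illustration):

```python
import hashlib

# Hypothetical known-bad hash set. The real database is derived from
# seized material held by NCMEC and is never shipped in plaintext.
known_hashes = {
    hashlib.sha256(b"seized-image-bytes").hexdigest(),
}

def is_flagged(photo_bytes: bytes) -> bool:
    # Hash the photo and check membership in the known-bad set.
    return hashlib.sha256(photo_bytes).hexdigest() in known_hashes

print(is_flagged(b"seized-image-bytes"))  # True: exact byte-for-byte match
print(is_flagged(b"family-photo-bytes"))  # False: no match
```

An exact cryptographic hash like this would miss any re-encoded or resized copy, which is exactly why a perceptual hash is used instead, and that is where the false-positive concerns come from.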


How do you prevent photos of your kids from ending up in such a database? Perhaps you mailed grandma a photo of a nude two-year-old during bath time during a Covid lockdown — you know, normal parenting stuff. Grandma posted it on Facebook (accidentally, naively, doesn't matter) or someone gained access to it, and it ended up on a seedy image board that caters to that niche. A year later it's part of the big black-box database of hashes, and ping: a flag lights up next to your name on Apple's dashboard and local law enforcement is notified.

I don't know how most people feel about this, but even a false positive would seem hazardous. Does that put you on some permanent watch list in the lowest tier? How can you even know? And besides, it's all automated.

We could of course massively shift society towards a no-photo/video policy for our kids (perhaps only kept on a non-internet connected camera and hard drive), and tell grandma to just deal with it (come back after the lockdown granny, if you survive). Some people do.

And don't think that normal family photos won't get classified as CSAM. One person's harmless family photo is another's titillation.


This implies that all the concerns about possible future uses of this technology are unreasonable slippery-slope concerns, but we're on something like our fourth or fifth trip down this slope, and we've slipped every previous time, so it's not unreasonable to be concerned.

Previous times down this slope:

* UK internet filters for child porn -> opt-out filters for regular porn (ISPs now have a list of porn viewers) + mandatory filters for copyright infringement

* Google Drive filters for illegal content -> Google Drive filters for copyrighted content

* iCloud data is totally protected so it's ok to require an apple account -> iCloud in China run by government controlled data centers without encryption

* Protection against malware is important so Windows defender is mandatory unless you have a third party program -> Windows Defender deletes DeCSS

* Need to protect users against malware, so mobile devices are set up as walled gardens -> Providers use these walled gardens to prevent business models that are bad for them


The first slippery slope for this was when people made tools to do deep packet inspection and find copyrighted content during the Napster era.

That was the first sin of the internet era.

Discussing slippery slopes does nothing.

Edit: It is frustrating to see where we are going. However - conversations on HN tend to focus on the false positives, and not too much on the actual villains who are doing unspeakable things.

Perhaps people need to hear stories from case workers or people actually dealing with the other side of the coin to better make a call on where the line should be drawn.


I don't think anyone here is trying to detract from the horrors and the crimes.

My problem is that these lists have already been used for retaliation against valid criticism. Scope creep is real, and in the case of this particular list, adding an item is an explicit, global accusation that the creator and/or distributor is a child molester.

I also wrote some less commonly voiced thoughts yesterday: https://news.ycombinator.com/item?id=28071872


Rather than try to rehash the arguments myself, I'll just point you to Matthew Green's detailed takedown: https://twitter.com/matthew_d_green/status/14230910979334266...

But just to highlight one aspect, the list of maintained hashes has a known, non-negligible fraction of false positives.
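For intuition on why perceptual hashes produce false positives, here is a toy average-hash (a deliberate simplification, not Apple's actual NeuralHash): these hashes collapse visually similar inputs to the same value, so quite different images can still collide.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the image mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

# Two different 3x3 "images" (flattened grayscale values)...
img_a = [200, 10, 10, 10, 200, 10, 10, 10, 200]
img_b = [90, 40, 40, 40, 90, 40, 40, 40, 90]

# ...collapse to the identical hash: a collision at the hash level.
print(average_hash(img_a) == average_hash(img_b))  # True
```

Real perceptual hashes operate on far more bits and use distance thresholds rather than exact equality, but the failure mode is the same: distinct content can land within matching distance.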

> That’s it for now.

If this is an attempt at "first they came...", we're not biting.


Bit confused here.

My statement was to clarify incorrect statements of the issue. Someone was worried that incorrect DoBs entered by jilted lovers would get people flagged.

I just outlined what the actual process is. I feel that discussing the actual problem leads to better solutions and discussions.

Since this topic attracts strong viewpoints, I was as brief as possible to reduce any potential target area, and even left a line supporting the slippery slope argument.

If this was not conveyed, please let me know.

Matter of fact, your response pointing out the false-positive issues is a win in my book! It's better than what the parent discussion was about.

But what I am truly perplexed by is your talk of "first they came..." and "we're not biting".

Who is we, and why wouldn't YOU agree with a position supporting a slippery slope argument?

You seem to disagree with the actions being telegraphed by Apple.

Could you clarify what you mean?


This isn't a question about condoning child abuse. It's a question of doing probabilistic detection of someone possessing "objectionable content". Not sharing, not storing - possessing. This system, once deployed, will be used for other purposes. Just look at the history of every other technology supposedly built to combat CP. They all have expanded in scope.

Trying to frame the question along the usual slippery-slope arguments implicitly sets up anyone criticising the mechanism as a supporter of fundamentally objectionable content.


Sure, and I have no objection to what you are saying.

This thread however was where I was making a separate point that helps this discussion by removing confusion or assumptions on how Apple’s proposal works.

Perhaps you may have misread what I was saying?


Sorry about the really long delay with the answer; the week got the better of me.

Your original post posited a reasonable question, but I felt the details were somewhat muddled. The reason I reacted and answered was that I have seen this style of questioning elsewhere before. The way you finished off was actually a little alarming: it'd be really easy to drop in with a followup that in turn would look like the other person was trying to defend the indefensible.

With my original reply I attempted to defuse that potential. The issue is incendiary enough without people willingly misunderstanding each other.


What forms of abuse will this open up to the prospective abuser that they couldn't do previously?


Suspect your spouse/partner of cheating? Designate them a minor and let the phone tell you if they sext.


If you already control their apple account, then you already have access to this information. Your threat model can’t be “the user is already pwned” because then everything is vulnerable, always


The real problem here is that the user can't un-pwn the device, because it's the corporation that has root instead of the user.

To do a factory reset or otherwise get it back into a state where the spyware the abuser installed is not present, the manufacturer requires the authorization of the abuser. If you can't root your own device then you can't e.g. spoof the spyware and have it report what you want it to report instead of what you're actually doing.


I wish I could downvote this a million times. If someone has to seize physical control of your phone to see sexts, that's one thing; this informs the abuser whenever a sext is sent/received. This feature will lead to violent beatings of victims who share a residence with their abuser. Consider the scenario of Sally sexting Jim while Tom sits in another room of the same home, waiting for the text to set him off. In other circumstances, Sally would be able to delete her texts; now violent Tom will know immediately. Apple has just removed the protection of deleting texts from victims of same-residence abusers.

Apple should be ashamed. I see this as Apple paying the tax of doing business in many of the world's most lucrative markets. Apple has developed this feature to gain access to markets that require this level of surveillance of their citizens.


> Consider the scenario of Sally sexting Jim while Tom sits in another room

Consider Sally sending a picture of a bee that Apple’s algo determines with 100% confidence is a breast while Tom sits in another room. One could iterate ad infinitum.


Well yeah, but this makes a better UI for pwning your victims: now it tells you when they do something instead of you needing to watch for it.


Parents policing the phones of 16 and 17 year olds? That's some horrifying over-parenting, Britney Spears conservatorship-level madness. Those kids have no hope in the real world.


Clearly you are not a parent.

Well, as a parent, I can tell you that some 16/17 year olds are responsible and worthy of the trust that comes with full independence. Others have more social/mental maturing to do yet and need some extra guidance. That's just how it goes.


When you write that out, the idea of getting Apple IDs for your kids doesn't sound that great.

Register your kids with a corporate behemoth! Why not!? Get them hooked on Apple right from childhood, get their entire life in iCloud, and see if they'll ever break out of the walled garden.


> false positives is much more worrying

This is an argument for me to not start using iCloud keychain. If Apple flags my account, I don't want to lose access to literally all my other accounts.


The “child” would be alerted and given a chance not to send the objectionable content prior to alerting anyone else. Did you read how it works?

Also, a father would only land in a national registry if their child's photos are known to be CSAM. Simply taking a photo of your child wouldn't trigger it.


> That's not how it works, unless you control your partner's Apple ID and you lie about their DOB when you create their account.

The most annoying thing about Apple Family sharing is that in order to create accounts for people you must specify that they are under 13 (source: https://www.apple.com/lae/family-sharing) - otherwise the only other option is for your "family member" to link their account to the Apple Family which is under your purview, which understandably many people might be hesitant to do because of privacy concerns (as opposed to logging into the child account on a Windows computer exclusively to listen to Apple Music - which doesn't tie the entire machine to that Apple ID as long as it's not a mac).

And so in my case, I have zero actual family members in my Apple Family (they're more interested in my Netflix family account). It begs the question, why does Apple insist on having people be family members in order to share Apple Music? We have five slots to share, and they get our money either way. They also don't let you remove family members - which may be the original intent for insisting on such a ridiculous thing - as if they're trying to take the moral high ground and guilt trip us for disowning a family member when in fact it simply benefits them when a fallout occurs between non-family members, because there's a good chance that the person in question will stop using the service due to privacy concerns, and that's less traffic for Apple.

It's actually kind of humorous to think that I still have my ex-ex-ex-girlfriend in my Apple Family account, and according to Apple she's 11 now (in reality, she's in her 30s). I can't remove her until another 7 years pass (and even then it’s questionable if they’ll allow it, because they might insist that I can’t divorce my “children”).

And honestly, at this point I wouldn’t even remove her if I could, she has a newborn baby and a partner now, and I’m happy to provide that account, and I still have two unused slots to give away. I’ve never been the type of person who has a lot of friends, I have a few friends, and one girlfriend at a time. But the thing is she’s never been a music person and I assume that she isn’t even using it - and so even if I made a new best friend or two and reached out to her to let her know that I wanted to add them, Apple currently wouldn’t let me remove her to make room for those theoretical friends.

While I'm a big fan of Apple hardware, it really bothers me that a group of sleazy people sat around a table trying to figure out how to maximize income and minimize network traffic, and this is what they came up with.


Did you ever stop to consider whether licensing has anything to do with this? You also lied about someone's age when creating their Apple account and continue to provide access to someone outside your family. Call them and remove them, and then they'll likely ban you for violating the ToS.


Curious, how would licensing affect this? Would the assumption be that everyone resides under the same roof? Because that's not a requirement for being in a family.


No, generally to license content you pay a fee per screen, as well as a fee per viewer. In the case of families this is calculated by the number of people on the account. They don't charge you per person; they charge a flat rate based on the maximum number of people you can add to your account. So by doing it this way, you're trying to get them to pay the per-viewer fees, for free, for people outside your family.


> They don’t charge your costs per person, they charge a flat rate based on the maximum number of people you can add to your account. So doing it this way they’re not circumventing the per device fee they are charged that you’re trying to get them to pay for you for free.

I'm confused, how am I trying to get them to provide anything for free? I pay for the service, and that service has a limited number of account slots, and the people using those slots have their own devices. What am I missing?

Are you under the assumption that child accounts don't occupy a slot, and are free-riding? If so, that's not the case. Child accounts occupy a slot all the same, the only difference is that by providing child accounts to my adult friends, they aren't required to link their existing Apple accounts to the service that's under my control.



