
This take is borderline absurd.

The risk of AI is in centralization of AI. In the same way the risk of the web was in centralization of the web.

Those very real risks manifested in very real ways on the web.

I have young kids. I'm not exaggerating when I say I wake up every day and diligently build the future I was promised for them.

I've got somewhere around 10 years, conservatively, until Instagram/TikTok/YouTube/Facebook/etc. try to get my daughter to kill herself. I've got around 10 years until she is a statistic, hopefully the statistic that survives.

Social Media wasn't always Social Media. We weren't promised Social Media. We were promised Social Networking.

Social Networks were supposed to connect us with our communities. Bring the world together in a way previously impossible. Allow the free exchange of information and ideas. That future is still possible.

But it turns out expecting some dude in Palo Alto to pay to be the free Proxy and Archivist of the entire world's social communication is a hard problem to solve. It's a hard engineering problem. It's a hard financial problem. It's a hard legal problem. It's a hard ethics problem.

Social Networking companies didn't solve that problem. They failed and pivoted into Social Media.

Social Media divides us. Social Media isolates us. Social Media believes my friends, my family, and my community "aren't engaging enough" and litters my timeline with randos it's trying to hype to keep me engaged so they can monetize the experience.

I go to my PTO meetings and parents talk about their kids being isolated and depressed. They talk about body image issues and lack of purpose.

I go to my community gatherings, business networking groups, and gym. People talk about how lonely they feel. How angry they feel. How depressed and anxious they feel.

Instead of the Social Networks we were promised, we have these Social Media monstrosities that take randos on the internet and try to make them famous in front of other randos on the internet in a variable rate reward slot machine.

Social Networks on P2P technology have been viable for at least 5 years now, but the industry hasn't caught up. It's possible to join a p2p social network from your browser tab, pair it with your phone, have them both simultaneously be able to generate posts and sync over a gossip protocol without conflict. And it's possible for your social group to backup your content. All of this is still possible with the Application Service Provider architecture of having a "backup pinning server" that will make your data highly available.
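A minimal sketch of the conflict-free sync idea in Python (hypothetical names; a real gossip protocol involves peer discovery and anti-entropy rounds, not shown here). Because post ids are content hashes, merging two devices' state is just a set union, so concurrent posts from a browser tab and a phone never conflict:

```python
# Hypothetical sketch: each device keeps a map of post-id -> post, where
# ids are content hashes. Merging is a union, so merge order doesn't matter
# (this is essentially a grow-only set CRDT).
def gossip_merge(a: dict, b: dict) -> dict:
    merged = dict(a)
    merged.update(b)  # ids are content hashes, so identical keys mean identical posts
    return merged

browser = {"h1": "post from browser"}
phone = {"h2": "post from phone"}

synced = gossip_merge(browser, phone)
assert synced == gossip_merge(phone, browser)  # order-independent
```

A "backup pinning server" in this model is just another peer holding a copy of the same map, with no special authority over it.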

Identity is solvable in a way that allows for full control of your identity without relying exclusively on public/private key cryptography - and permits social recovery of your identity when your keys are lost or compromised [1]. End users don't need to understand private keys or public keys. They just know they have an account, and that their friends can send them a friend request.

I'm going long on p2p applications. I think they aren't just viable, but that they can outcompete the user experience that current generation Application Service Providers offer while simultaneously breaking the incentives that are destroying the societies my children will inherit.

AI is no different. The call to arms about AI regulation and alignment often degrades to advocating for a world where my children do not get access to AI because it's "too dangerous" for them - while someone else's kids belong in the privileged AI class.

The trajectory forming for AI strongly signals to me that it will be the single most profound means of production our species has produced. Any philosophy that advocates taking that away from my kids can go play in its own corner. I'm not taking back the web for my kids just to let them take away AI in return.

If these things interest you, and you're interested in doing hardcore research on productionizing real, no bullshit, peer-to-peer networks both for end users and enterprises, my email is in my bio. I want to talk to you. We need more good people in this space, and we have budget for it.

My team is building peer-to-peer networks for AI, but the solutions that break AI out of its ivory walls aren't AI specific solutions. These peer-to-peer building blocks are general purpose and the next generation of the web is going to run on them.

This is the other other road ahead [2].

We will succeed or I will die trying.

[1] http://www.blankenship.io/essays/2023-09-24/

[2] http://paulgraham.com/road.html



>Social Media wasn't always Social Media. We weren't promised Social Media. We were promised Social Networking.

Brilliant.


Everything you're saying resonates, though I'm in that "10 years" zone, ahead of you, in regards to parenting.

I'm here to talk about the problems of adolescent girls in this era, if you ever want to reach out. A horrible reality I've lived as a parent.

And I would like to build the online social world we wanted, back in the early 90s, rather than the one we eventually sold ourselves to, or were sold to.

... and to me that social tech we need is concrete rather than virtual. It's immanent rather than transcendent. Infra or "within", rather than meta and "without", or virtual. It's creative rather than passive-consumptive. And narrative-textual-descriptive rather than artificial-graphical. And to me the outline of that looks more like what was being grasped at in the early 90s with synchronous, live, shared creative social worlds like LambdaMOO -- if that isn't too obscure for you -- than what we ended up with: the web, and then Facebook, Twitter, etc.

The peer to peer aspect is of less ... interest? ... to me? I'm not convinced that this aspect of it is as important as it is to refocus on control/ownership, intimacy, and moderation?


> Everything you're saying resonates, though I'm in that "10 years" zone, ahead of you, in regards to parenting.

> I'm here to talk about the problems of adolescent girls in this era, if you ever want to reach out. A horrible reality I've lived as a parent.

This is heartbreaking and I'd love to talk... Permission to reach out via email?

> I'm not convinced that [peer to peer] is as important as it is to refocus on control/ownership, intimacy, and moderation?

I believe these are synonymous.

> control/ownership

When you invert control/ownership, the user controls their data. The application is delegated access to it. This is a peer-to-peer data structure.

Instead of generating the data on an ASP's server, the user generates their data and stores it in a data structure they have control over. They can _delegate_ generation of data to an ASP style app, but ultimately the app is taking actions _on their behalf_ which the user owns.

They can pull that data down to their laptop, push it up to a peer-to-peer enabled NAS (purchased from the big box store) that they plugged into their router, etc.

The app can't revoke access to their data if they've backed it up. The app can refuse to continue mirroring it or modifying it, but it can't _lock them out_. It's the user's data.

That data is sharable and discoverable without the app's permission. The app can participate in sharing and discovering the data, but it can't _block_ the user from interacting with other users on the web.

The user isn't _forced_ to interact with other users either. They are in full control of who they associate with.

Right now we are solving this with a hash-tree. Every time a user takes an action ("creates a post", "uploads a picture", etc.) it's added to their hash tree. When you add a new device, the tree "branches" and you get two branches, one for each device's key to sign and append its next blocks. This resolves concurrent writes much like a CRDT.

That hash-tree is an append-only database. Anything you want to record goes into the database. The nodes in the tree do not store content, only a _reference_ to the content. This keeps the tree very small (1M nodes can fit in less than a GB of storage). You can backup the entire tree without backing up all of the content. And you can "delete" large files without having to rewrite the tree.
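A toy sketch of that structure in Python (hypothetical names, not their actual implementation): each entry links to its parent by hash and stores only a hash *reference* to the content, which is why the tree stays small and content can be "deleted" without rewriting it:

```python
import hashlib
import json

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class HashTreeLog:
    """Append-only log: each entry references its parent entry by hash
    and stores only a reference (hash) to the content, not the bytes."""

    def __init__(self):
        self.entries = {}  # entry hash -> entry dict
        self.head = None   # hash of the latest entry on this device's branch

    def append(self, action: str, content: bytes) -> str:
        entry = {
            "parent": self.head,             # link to the previous entry
            "action": action,                # e.g. "create_post"
            "content_ref": sha256(content),  # reference only, content stored elsewhere
        }
        h = sha256(json.dumps(entry, sort_keys=True).encode())
        self.entries[h] = entry
        self.head = h
        return h

log = HashTreeLog()
h1 = log.append("create_post", b"hello world")
h2 = log.append("upload_picture", b"<jpeg bytes>")
assert log.entries[h2]["parent"] == h1  # entries chain back through the tree
```

Backing up the whole log means copying `entries`, which never includes the content bytes themselves.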

The tree itself isn't the user's identity though. That's fiat. If a key is compromised, users can flag content and say "hey, this isn't Alice, Alice wouldn't send me a crypto scam." Users can take two trees and "merge" them under the same identity (these are both Alice, but Alice lost her login and got locked out of her account, so has a new tree). This can be done without appealing to some application owner to "permit" you to fix Alice's social identity from your perspective on the web.

Then we use a capability system to delegate rights to keys added to the tree. So you can add an app (like iCloud) to your tree, and that app can now take actions within the scope of the capability it was granted.
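A minimal sketch of that delegation in Python (hypothetical names; a real system would use signed capability tokens recorded in the hash tree, not an in-memory list). An identity grants an app's key a scoped capability, and actions are checked against the granted scopes:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Capability:
    grantee: str         # the app's key (stand-in: a string id)
    scopes: frozenset    # actions the app may take, e.g. {"backup"}

@dataclass
class Identity:
    capabilities: list = field(default_factory=list)

    def delegate(self, app_key: str, scopes: set) -> Capability:
        # In practice this grant would be signed and appended to the hash tree.
        cap = Capability(app_key, frozenset(scopes))
        self.capabilities.append(cap)
        return cap

    def authorized(self, app_key: str, action: str) -> bool:
        return any(c.grantee == app_key and action in c.scopes
                   for c in self.capabilities)

alice = Identity()
alice.delegate("icloud-key", {"backup", "mirror"})
assert alice.authorized("icloud-key", "backup")
assert not alice.authorized("icloud-key", "create_post")  # outside granted scope
```

The point is that the app acts within a scope the user granted, rather than the user acting within an account the app controls.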

> intimacy, and moderation

Intimacy, to me, means letting users decide how they want to model their social interactions. Who they pull in content from, how that content gets aggregated and displayed, etc.

Moderation, to me, means deciding what the user gets to see.

Those decisions are for communities, not for some remote disinterested platform operator.

You solve intimacy and moderation by allowing communities to form and moderate themselves.


>> I'm not convinced that [peer to peer] is as important as it is to refocus on control/ownership, intimacy, and moderation?

>I believe these are synonymous.

Not op, but I don't believe they are, and while I would prefer more p2p in so many cases, onboarding issues break it. My hope is more in federation now, but still you see so many people complain about how Mastodon is not as simple as Twitter ...

But good luck in your projects :).


> My hope is more on federation

Personally I see federation as N copies of the centralization problem.

I don't think it foundationally solves any of the problems that resulted in centralization of the web, or the incentives that lead to social networking evolving into social media. It just creates N copies of them. They are still ASPs, there are just N instances of them now.

Federation still segregates users into the "administrator class" and the "user class" for web technology. And the administrator class has a substantial power imbalance over the user class.

Users aren't platforming themselves on a federated system. System administrators retain "root access" and users are second class. Users have little to no way of operating independently of the administrator class, unless they are technically savvy enough to administer their own node. And administrators still need to pay the bills (though many federated servers operate on altruism).

It's also rather nonsensical to me. Most of what the arguments for Application Service Providers promise can be accomplished with peer-to-peer software.

The software can be remotely administrated, updated, and managed. You can deliver a peer-to-peer experience via a browser tab, an app store, or a binary. Those can still update the same way an ASP does today.

The user's data can be backed up and made highly available (though that becomes less important, because a user's social group is doing that too!).

Many of the arguments for someone to administer a server on behalf of a user are still valid in p2p, but the server becomes a p2p node - not a root administrator over the user's account.

Peer-to-Peer doesn't mean users have to build and operate everything. It just means the data they generate is theirs and their social connections are theirs. They can directly dial and communicate with the devices in their social group without needing to go through a centrally managed server.

The federated incentive model can still incentivize administrators to build and maintain systems for peer-to-peer networks without requiring a "first class" and "second class" digital citizen class structure baked into the fabric of the internet protocols.


I would have more to say about a lot of these things, but I have no time today but also don't want to leave the thread stale too long, so just a few points.

> This is heartbreaking and I'd love to talk... Permission to reach out via email?

Absolutely. I think this might be visible through my GH profile, linked from my HN profile? Unsure.

> When you invert control/ownership, the user controls their data. The application is delegated access to it. This is a peer-to-peer data structure.

I feel like this is probably an orthogonal question to where the data/system physically lives. It's possible to give logistical / social control over one's data without going to a fully distributed p2p transport/data model.

> Then we use a capability system to delegate rights

This is good. Also a path I'm pursuing.

> Intimacy, to me, means letting users decide how they want to model their social interactions.

> Moderation, to me, means deciding what the user gets to see.

I have thoughts on this but can't get into it in depth right now. Suffice it to say I am getting a bit of an impression that you are trying -- like an engineer would -- to solve social problems with technical solutions.

There is no magic bullet for these issues that becomes solved through a P2P (or federative, etc.) model.

And my chief concern is actually with the content model / style itself that "social networks" and actually the web itself promulgated. It is asynchronous, "post" oriented, like a mailbox. It is not interactive or real time (apart from messaging), and it is consumptive rather than creative/interactive. I feel like from your comments you are leaving that unchallenged. Or hoping that by giving people control that this will just change. I don't think it will, I think people need to be given better environments and tools.

> You solve intimacy and moderation by allowing communities to form and moderate themselves.

Agree on this, but I also don't think the solution here is technical. P2P (or federation or whatever) may assist, but in the end these are social problems, and problems of scale.

What I perceive as the more resilient social model is a kind of village focus. So what I'm concerned with is two things: 1) how to give people the tools for social/community content construction. So to go beyond the style of magazine-page/leaflet/post/article/shared-link model that the web encouraged and to something that's more about building shared environments. 2) how to let people do that in a more "village" fashion, where, yes, they can moderate themselves; because I believe actually the wide-open mass scale of things like Twitter and/or Facebook etc actually do not really scale.

In any case, interesting discussion.


> I've got somewhere around 10 years, conservatively, until Instagram/TikTok/YouTube/Facebook/etc. try to get my daughter to kill herself.

I plan to keep my kids off social media as much as possible, but I've never heard claims that these companies are trying to get kids to kill themselves. Why bother exaggerating when the truth (more social media use, especially for teen girls, is bad for mental health) is as bad as it is?


Because they knew what it was doing to teens (this was their own research) and chose to do nearly nothing at all. They've chosen profit over individual lives time and time again and yet people still think this is an exaggeration?


There's a difference between trying to do a thing and doing a thing knowing that it will have a side effect in certain cases.

For example, is the President trying to warm the globe when he flies around in Air Force One, with his entourage and protection detail? Or is he trying to do something else, but is aware that in doing so he is generating greenhouse gasses?

There's a difference, at least in my book. And if you've lost me (HN-reading father of soon-to-be-tweenage daughter), then good luck convincing the average Joe.


They did the thing, observed and documented the side effect, and chose to continue doing the thing. They did this in Myanmar and they are still doing this to teenagers.


> I've got somewhere around 10 years, conservatively, until Instagram/TikTok/YouTube/Facebook/etc. try to get my daughter to kill herself. I've got around 10 years until she is a statistic, hopefully the statistic that survives.

You make a fair point gnicholas.

If you can reword this sentence in a way that accurately reflects what you think is happening, while not losing the concision with which "in 10 years, Instagram is going to try to get my daughter to kill herself" captures the waters I'm navigating as a parent - I'll gladly edit it and start using that instead.

I haven't found a better way to express myself though.

Sure, apps aren't sentient. They don't have free will. They can't "try" to do anything. But language is messy.

You're right about losing people in the argument too, that's not a great outcome.

Micromacrofoot is spot on as well, you have a generation of parents who understand well that social media is a root rot in our society and evidence showing the people building them _knew_ about the impact it was having on kids.

"Bad for mental health" doesn't cut it. I reject any language that dulls the blade of kids committing suicide.


If you share the evidence you're referring to, perhaps I could come up with alternative language to reflect that.

It sounds like you prefer to go for punchier language at the expense of technical accuracy. That's a valid choice to make in certain circumstances (advertisers and political campaigns do it all the time), but you just have to be aware that you will lose some people when you make statements like that.



I think it's more subtle. I think someone like Zuckerberg thinks the thing we interpret as the path to harm is actually the positive goal.

What I see as atomization, isolation, alienation, detachment from the real physical-lived social world, the hyper-focus/engagement into the "meta" or "virtual" world, specifically geared around commercial / advertisement interests, the loss of privacy? All of this is turned on its head by people like him and represented as a kind of liberation, transcendence, a kind of connectedness. If you listen to him talk, he seems to actually believe this stuff.

And all the harms that might come to youth along the way is seen as either not-their-problem, or "worth it" for that goal.

Oh, and it just so happens to make them billions. Ideology is an amazing thing, and usually driven from the wallet.


Serious question - why don't you just not let your daughter use these apps? My kid's not old enough to want to use them yet, but knowing what I know, she won't have access until she is 18; by then, you have no control anyway


I have every intention of encouraging my daughter to read books like Hooked[1] to understand why these experiences are designed the way they are.

I have every intention of helping her develop a personal philosophy that helps her navigate through the world in the face of easy access to variable rate reward systems, opioids, amphetamines, etc.

But she lives in a society full of the consequences of these things.

Sheltering her from it won't shelter her from it. It's pervasive; she's going to be exposed to it regardless. At friends' houses, at school, at the mall. It is everywhere.

Her social groups are going to be deep into the dark well of these variable rate reward systems, their character and behaviors defined by their interactions with this technology. And she will be interacting with those kids.

Prohibition is not a solution. By the time she is 18, either she has a personal philosophy that keeps her from overdosing on this nonsense without supervision or I've failed as a parent.

But, at all costs, my daughter will be free: http://www.blankenship.io/essays/2023-10-09/

[1] https://www.amazon.com/Hooked-How-Build-Habit-Forming-Produc...


You will know when your children start to reach adolescence. It's a story as old as parenting itself. You don't control your children. They are products first of the family they are born to, but very rapidly -- probably around kindergarten -- they become primarily products of the society they are in.

You can try to block it all you want. It won't work, unless you have some sort of extra compliant child. Which has its own clear downsides.


I guess I will find out; blocking TikTok seems pretty straightforward until high school


I'd go for tumblr, pinterest, reddit, and twitter as well, I'm sorry to say.


Mastodon is an actual social network at the moment, but the ongoing flood of Twitter emigres have triggered a culture war that is getting some really funky permutations.



