
I would go so far as to say that people who worry about the dangers of unfriendly superoptimizer intelligences need to look at the massively distributed unfriendly superoptimizer running on the abstracted meatware of human relationships and conveniently labeled "capitalism".


Well, you can say that about many of the problems we actually have today. That does not necessarily mean UFAI isn't the greatest long-term existential threat. I'm not saying it is, but the SIAI sees it that way, while recognizing that a lot of money and energy should go into solving the problems we have right now. Does that mean absolutely no energy or money should be spent on unlikely existential threats?


Quite the contrary. I'm pointing out that while a bunch of people choose to worry about potential threats in the future, the exact kind of threat they worry about is already here, actively destroying the world and ruining lives. The prophecies are all true, you could say; they're just coming true right now, in a way the worrying prophets don't recognize because they themselves, insofar as they favor capitalism, are participating in the damage.

When I first realized this, the irony kind of threw me for a giggling fit.



