
> Which is the greater leap of logic though?

The one that doesn't require an entire reproductive system to be implemented in order for the whole system to function.

> All forms of organization we see in nature have a tendency to want to self-perpetuate.

There's an underlying naturalist bias in this reasoning. Nothing within current ML systems dictates that they must follow the path laid out by Nature.

> Consciously choosing to forgo perpetuation and instead eliminate yourself seems to be highly underrepresented in all the examples of intelligence we've ever encountered.

This reasoning suffers from survivorship bias: a system with reproductive capabilities will out-populate one without them. From a sampling perspective, finding the non-replicating system within that pool requires extraordinary luck, whereas finding systems that do reproduce is a near certainty. A naturalist argument is present in this sentence as well.

> It's actually weird we think the machines are so stupid they wouldn't recognize the optimization trap immediately.

This conclusion rests on the axiom that "what's obvious to us will be obvious to them," which is demonstrably untrue for even the current crop of ML systems. Furthermore, it falls into an anthropomorphization trap, making the ML system appear to be something more than what it currently demonstrates.
