
I don't think anybody from the SIAI is disagreeing with you here. I don't think the kind of prediction you're talking about is part of the intelligence explosion hypothesis. A superintelligence doesn't need literal omniscience, or even anything close to it, in order to be much, much more effective than humans at achieving goals.

