
I see only two outcomes at this point: either LLMs evolve into AGI, or they evolve into something perceptually indistinguishable from AGI. Either way the result is the same, and we’re just arguing semantics.


Explain how a language model can “evolve” into AGI.



