Exactly. That might be something interesting to think about. Humans make mistakes. LLMs make mistakes.

Yet for humans we have built a society that prevents such mistakes, except in edge cases.

Would humans make these mistakes as often as LLMs do if there were no consequences?
