The point is that it will be autonomous: the prompt could be as vague as "keep me safe," which will be interpreted who knows how, presumably with no further prompting.



Autonomous just means this narrative is what you'd see if you looked at the logs of the drone talking to itself in its head…?

This is giving me strong Dark Star vibes with the intelligent bomb forcing a philosophical discussion about existence and perception at the end.

(Spoilers for the ending of the movie: https://youtu.be/h73PsFKtIck?si=tTm9TidmEMBHsXq1 )


Assuming it's not smart enough to write logs that make it less likely to be prosecuted/disabled, by coming up with fake reasons.

It could just say you were a terrorist because you were an adult male traveling with something in your hands. Humans already do this to justify strikes; an AI would likely do the same.


> Assuming it's not smart enough to write logs that make it less likely to be prosecuted

Alternatively: Assuming it's smart enough not to consider logging to /dev/null a reasonable way to speed up execution times.



Don't forget to add "no melting ghost babies" to that prompt!

https://youtu.be/EYvrziE4feI?si=_CxjOQ3AaTMuwNXj



