
90-98% of the time I want the LLM to only have the knowledge I gave it in the prompt. I'm actually kind of scared that I'll wake up one day and the web interface for ChatGPT/Opus/Gemini will pull information from my prior chats.
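For what it's worth, the API side is still stateless: the model sees only the messages you send in that one request, with no memory of prior chats. A minimal sketch of a prompt-only request payload (the model name and message contents here are illustrative, not a recommendation):

```python
# Sketch: with a chat-completions-style API, the model's context is
# exactly the `messages` list you send -- no hidden memory feature.
# Model name and contents are illustrative assumptions.
request = {
    "model": "gpt-4o",  # illustrative
    "messages": [
        {"role": "system", "content": "You are a terse assistant."},
        {"role": "user", "content": "Explain TCP slow start in one sentence."},
    ],
}

# Every call is independent: to "reset context", send a fresh
# messages list containing only what the model should know.
print(len(request["messages"]))  # 2 -- nothing else reaches the model
```

So anyone who wants the prompt-only behavior can already get it today by going through the API rather than the web UI.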


They already do this

I've had Claude reference prior conversations. When I'm trying to get technical help on thing A, it will ask me whether this conversation is because of thing B, which we talked about in the recent past.


You can disable this at Settings > Capabilities > Memory > Search and reference chats.


I'm fairly sure OpenAI/ChatGPT does pull in prior information in the form of its memories.


Ah, that could explain why I've found myself using it the least.


All of these providers support this feature. I don't know about ChatGPT, but the rest are opt-in. I imagine with Gemini it'll be default-on soon enough, since it's consumer focused. Claude does constantly nag me to enable it, though.


Had ChatGPT reference three prior chats a few days ago. So if you're looking for a total reset of context, you'd probably need to do a small bit of work.


Gemini has this feature but it’s opt-in.



