Hacker News
dep_b | 10 months ago | on: LLMs get lost in multi-turn conversation
Doing the same. Though I wish there were some kind of optimization of text generated by an LLM for an LLM. Just mentioning it's for an LLM instead of human consumption yields no observably different results.