Hacker News

Whether it's a non-problem or not very much depends on how much the LLM API providers actually bother to add enforcement server-side.

Anecdotally, I saw Azure OpenAI services hallucinate tool calls just last week, when I provided an empty array of tools rather than omitting the tools key entirely (silly me!). Up until that point I would have assumed there were server-side safeguards against that, but now I have to consider spending time on client-side checks for all kinds of bugs in that area.
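A minimal sketch of the kind of client-side check I mean (hypothetical helper, not part of any SDK): build the request kwargs so an empty tool list drops the key entirely instead of sending "tools": [], which is what apparently tripped up the server side.

```python
def build_chat_kwargs(model, messages, tools=None):
    """Assemble kwargs for a chat-completions call, omitting 'tools'
    when the list is None or empty, rather than sending 'tools': []."""
    kwargs = {"model": model, "messages": messages}
    if tools:  # only attach the key when there is at least one tool
        kwargs["tools"] = tools
    return kwargs

# Usage: client.chat.completions.create(**build_chat_kwargs(...))
```

It's a small guard, but it centralizes the workaround instead of scattering "is the list empty?" checks across every call site.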


