  • Apr 7, 2026, 5:09 PM

    The least an LLM interface should have, to even qualify as designed, is letting the user know it's failing predictably.

    Like, if you assume it's going to fuck up, at least have it limited to fucking up in only xyz ways, so the user has a way to ration their time and energy to steer it.

    Designing predictable failure modes in LLMs
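
    A minimal sketch of what "fucking up in only xyz ways" could look like at the interface boundary, assuming a JSON-returning model; `callModel`, `isExpectedShape`, and the failure kinds are hypothetical names, not any real library's API:

    ```ts
    // Hypothetical sketch: funnel every way an LLM call can break into a
    // closed, enumerable union, so the caller always knows *which* way it broke.
    type LlmFailure =
      | { kind: "malformed_json"; raw: string }   // model output wasn't parseable JSON
      | { kind: "schema_mismatch"; raw: unknown } // JSON parsed, but wrong shape
      | { kind: "timeout" };                      // model never answered in time

    type LlmResult<T> =
      | { ok: true; value: T }
      | { ok: false; failure: LlmFailure };

    // `callModel` and `isExpectedShape` stand in for whatever client and
    // validator you actually use (e.g. a fetch wrapper plus a schema check).
    async function askModel<T>(
      callModel: (prompt: string) => Promise<string>,
      isExpectedShape: (x: unknown) => x is T,
      prompt: string,
      timeoutMs = 10_000,
    ): Promise<LlmResult<T>> {
      let raw: string;
      try {
        // Race the model call against a timer; a hang becomes a known failure.
        raw = await Promise.race([
          callModel(prompt),
          new Promise<never>((_, reject) =>
            setTimeout(() => reject(new Error("timeout")), timeoutMs),
          ),
        ]);
      } catch {
        return { ok: false, failure: { kind: "timeout" } };
      }

      let parsed: unknown;
      try {
        parsed = JSON.parse(raw);
      } catch {
        return { ok: false, failure: { kind: "malformed_json", raw } };
      }

      if (!isExpectedShape(parsed)) {
        return { ok: false, failure: { kind: "schema_mismatch", raw: parsed } };
      }

      return { ok: true, value: parsed };
    }
    ```

    Because `LlmFailure` is a closed union, the UI can exhaustively handle every way the call can break, which is what lets the user budget their time and energy to steer it.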
