Indication that request has been forwarded to LLM #3078
Unanswered · MichaelaMer asked this question in AI & LLMs · Replies: 0 comments
Describe the feature
Local LLM integration is relatively slow due to the currently unavoidable prompting, which leads to a long time to first token. Most of us only forward requests to the LLM when they can't be resolved locally. It would be great if HA could indicate that it forwarded the request to the LLM (something like "Request forwarded", or a different beep).
Use cases
Turn the light on .. beep .. light goes on
Tell me a joke .. beep .. different beep .. (wait some seconds) .. the "atoms make up everything" joke.
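The flow above could be sketched roughly as follows. This is only an illustrative sketch of the proposed behavior, not actual Home Assistant code: `notify`, `slow_llm`, and the `LOCAL_INTENTS` table are hypothetical stand-ins for HA's intent matching, the LLM conversation agent, and whatever chime/UI mechanism would signal the handoff.

```python
# Hypothetical sketch: indicate when a voice request falls back to the LLM.
LOCAL_INTENTS = {"turn the light on": "light goes on"}

def notify(event: str) -> str:
    # Stand-in for the proposed indication: a beep or a
    # "Request forwarded" message in the UI.
    return f"[{event}]"

def slow_llm(request: str) -> str:
    # Stand-in for a local LLM with a long time to first token.
    return f"LLM answer to: {request}"

def handle(request: str, events: list) -> str:
    """Resolve locally if possible; otherwise signal the handoff, then forward."""
    key = request.lower().strip()
    if key in LOCAL_INTENTS:
        events.append(notify("beep"))  # resolved locally, answer is immediate
        return LOCAL_INTENTS[key]
    events.append(notify("beep"))               # request received
    events.append(notify("request forwarded"))  # proposed: second, different cue
    return slow_llm(request)                    # only now pay the slow first token
```

The point of the second cue is purely UX: the user learns immediately that the pause ahead is the LLM thinking, not the system having missed the request.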
Anything else?
No response