Expose the LLM response text from the Assist Pipeline as: #2833
ondrom111 asked this question in Voice assistants · Unanswered · 0 replies
Describe the feature
There is currently no variable, event, or hook that contains the LLM response.
This means automations have no way to read or act on the reply text.
For example, if a user wants to say “translate” and have Home Assistant translate the last LLM reply, this is impossible because the reply text is not available anywhere.
Requested solution
Expose the LLM response text from the Assist Pipeline as:
- a template variable, e.g. `{{ conversation.response }}`, or
- event data, e.g. `conversation.response`, or
- a `response` variable.

Any of these would allow automations to work with the actual text returned by the LLM.
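As a sketch of how the requested variable could be used (note: `conversation.response` is the name proposed above and does not exist in Home Assistant today, and the notify service is a placeholder):

```yaml
# Hypothetical automation for this feature request: when the user says
# "translate", forward the last LLM reply for further processing.
# `{{ conversation.response }}` is the proposed (not yet existing) variable.
automation:
  - alias: "Act on the last LLM reply"
    trigger:
      - platform: conversation
        command: "translate"
    action:
      - service: notify.mobile_app_phone  # placeholder notify target
        data:
          message: "Please translate: {{ conversation.response }}"
```

The `conversation` sentence trigger already exists in Home Assistant; only the response variable in the template is the new part being requested.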
Thank you for considering this feature.
Example commands
E.g. saying “translate” to have Home Assistant translate the last LLM reply.
Use cases
The Assist Pipeline already receives the LLM response internally, but Home Assistant does not expose it.
Making the response available to automations would unlock many powerful use cases and significantly improve the flexibility of the Assist system.
Anything else?
No response