Make chat UI #32

@light-and-ray

Description

Add a minimalist chat UI based on Gradio chat components, with a single purpose: let you use an LLM on the same GPU where you run Comfy.

Proposal:

  1. Can be enabled in settings
  2. A separate UI available at its own link
  3. Interrupt and restart the current task in the mcww UI; keep it paused while the LLM generates an answer
  4. An unload-LLM callback that is executed before ComfyUI generation begins
  5. Clear the cache and unload models before calling the LLM
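The GPU hand-off in points 3–5 could be sketched as a small arbiter that serializes the Comfy queue and the chat LLM. This is only a sketch: `GpuArbiter`, the callback names, and the `generate`/`task` callables are hypothetical, not actual mcww or ComfyUI APIs.

```python
import threading

class GpuArbiter:
    """Serialize GPU use between the Comfy queue and the chat LLM (sketch)."""

    def __init__(self, unload_llm, unload_comfy):
        self._lock = threading.Lock()
        self.unload_llm = unload_llm      # point 4: free LLM weights before Comfy runs
        self.unload_comfy = unload_comfy  # point 5: clear Comfy cache/models before the LLM runs

    def run_llm(self, generate):
        # points 3 + 5: the Comfy queue stays paused while the lock is held,
        # and Comfy's models are unloaded before the LLM answers
        with self._lock:
            self.unload_comfy()
            return generate()

    def run_comfy(self, task):
        # point 4: make sure the LLM is unloaded before generation begins
        with self._lock:
            self.unload_llm()
            return task()
```

A Gradio chat handler would call `run_llm` around its generation function, while the queue worker wraps each workflow in `run_comfy`, so only one of the two ever holds the GPU.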
