[Question]: Do you support local LLMs (Ollama and LM Studio)? #34

@velteyn

Description

Do you need to ask a question?

  • I have searched the existing questions and discussions, and this question is not already answered.
  • I believe this is a legitimate question, not just a bug or feature request.

Your Question

Hello,
My question is simple: do you support local LLM managers such as Ollama or LM Studio? I tried putting http://localhost:1234 in my configuration, but I get loads of errors, so I think this is not supported.
Thank you
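For context, a common cause of errors like this is the base URL: LM Studio's local server (default port 1234) and Ollama (default port 11434) both expose OpenAI-compatible routes under a `/v1` prefix, so a bare `http://localhost:1234` often produces 404s. A minimal sketch, assuming the tool accepts an OpenAI-style base URL (the backend names and helper below are illustrative, not this project's actual configuration):

```python
# Default local endpoints for two common local LLM servers.
# These ports are the documented defaults; adjust if you changed them.
LOCAL_BACKENDS = {
    "lm-studio": "http://localhost:1234",
    "ollama": "http://localhost:11434",
}

def openai_base_url(backend: str) -> str:
    """Return the OpenAI-compatible base URL for a local backend.

    Both LM Studio and Ollama serve OpenAI-style endpoints
    (e.g. /v1/chat/completions) under the /v1 prefix, so the
    base URL usually needs that suffix.
    """
    return LOCAL_BACKENDS[backend].rstrip("/") + "/v1"

print(openai_base_url("lm-studio"))  # http://localhost:1234/v1
print(openai_base_url("ollama"))     # http://localhost:11434/v1
```

If the tool uses an OpenAI client under the hood, pointing its base URL at one of these addresses (with `/v1`) is typically what local-server setups expect.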

Additional Context

No response

Metadata

Assignees

No one assigned

Labels

question (Further information is requested)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests