Using Web Interfaces as Ollama Backend? #113

@pat-richter

Description

Is your feature request related to a problem? Please describe.
I use several tools that can connect to local Ollama instances, but I dislike using small models when I could use something like Gemini 2.5 Pro instead.

Describe the solution you'd like
Is it somehow possible, with the proxy (and perhaps something like the browser tools server), to use the chat interfaces as a backend? I would then chat from inside my IDE with a faked Ollama instance that is actually a browser instance.

I mainly use these tools for file edits, which can be quite slow.

Metadata

Labels

enhancement (New feature or request)
