
Conversation

@michaelneale (Collaborator) commented Aug 26, 2025

This detects Ollama and shows the card only when it is installed.

Another take on #4329; cc @blackgirlbytes @angiejones.

(Currently the card sits below the fold and mentions significant hardware requirements.)

@angiejones (Collaborator) commented

I'd prefer #4348. Just because Ollama is installed doesn't mean the user has a model that is good at tool calling; overall it's not the best experience.

@michaelneale (Collaborator, Author) commented

@angiejones given how often I have been pinged about the streaming regression from 1.5, I think far more people are using Ollama than we realise. This change won't show the card unless Ollama is actually running, and it will check for the model (and download the prescribed one) if the user chooses that path; otherwise they are more likely to hit the generic Ollama case and end up with a model we don't know will work.
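A minimal sketch of the kind of check described above, assuming Ollama's default local endpoint (`http://localhost:11434`, whose root route answers "Ollama is running"). The function names here are illustrative, not the PR's actual implementation:

```python
import urllib.request
import urllib.error

# Ollama's default local endpoint (assumption: default port, no custom OLLAMA_HOST)
OLLAMA_URL = "http://localhost:11434"

def looks_like_ollama(status: int, body: str) -> bool:
    """Heuristic: Ollama's root endpoint answers 200 with 'Ollama is running'."""
    return status == 200 and "Ollama is running" in body

def is_ollama_running(base_url: str = OLLAMA_URL, timeout: float = 1.0) -> bool:
    """Return True only when a local Ollama server actually answers.

    A UI would show the Ollama card only when this succeeds; any
    connection failure means 'not installed or not running'.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return looks_like_ollama(resp.status, resp.read().decode("utf-8", "replace"))
    except (urllib.error.URLError, OSError):
        return False  # not running -> don't show the card
```

Checking that the server is *running* (rather than merely installed on disk) matches the comment above: it avoids advertising a provider the user cannot immediately use.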

@michaelneale (Collaborator, Author) commented

Closing, as the card was removed instead.

