
Support Ollama model selection #1

Merged
duartebarbosadev merged 4 commits into main from codex/implement-ollama-support-if-missing on Dec 3, 2025
Merged

Support Ollama model selection#1
duartebarbosadev merged 4 commits intomainfrom
codex/implement-ollama-support-if-missing

Conversation

duartebarbosadev (Owner) commented on Dec 3, 2025

Summary

  • require an explicitly provided Ollama model instead of falling back to a default (see the sketch below)
  • remove backup-model usage for Ollama and align tests with the explicit requirement
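
As a rough sketch of the behavior described above (hypothetical helper name and signature; the real change lives under apps/web/utils/llms), model resolution might look like:

    import { createOllama } from "ollama-ai-provider-v2";

    // Hypothetical sketch: require an explicitly selected model for Ollama
    // instead of silently falling back to a default or backup model.
    function getOllamaModel(modelName: string | undefined, baseURL: string) {
      if (!modelName) {
        throw new Error("Ollama requires an explicitly selected model");
      }
      // Assumes createOllama accepts a baseURL option, as in ollama-ai-provider.
      const ollama = createOllama({ baseURL });
      return ollama(modelName);
    }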

Testing

  • cd apps/web && pnpm vitest run utils/llms/model.test.ts

duartebarbosadev changed the title from "Require explicit Ollama model selection" to "Support Ollama model selection" on Dec 3, 2025
duartebarbosadev marked this pull request as ready for review on December 3, 2025 at 14:15
Copilot AI review was requested due to automatic review settings on December 3, 2025 at 14:15
duartebarbosadev added the enhancement label and removed the codex label on Dec 3, 2025
duartebarbosadev (Owner, Author) commented:

@codex review


chatgpt-codex-connector (bot) left a comment:


💡 Codex Review

Here are some automated review suggestions for this pull request.


Review comment on the test file's imports (utils/llms/model.test.ts):

    import { Provider } from "./config";
    import { env } from "@/env";
    import type { UserAIFields } from "./types";
    import { createOllama } from "ollama-ai-provider-v2";

P1: Mock new Ollama provider in tests

The test file now imports createOllama from ollama-ai-provider-v2, but the mock still targets the old ollama-ai-provider module later in the file, so createOllama remains the real implementation. The new Ollama tests that call expect(createOllama).toHaveBeenCalledWith(...) will therefore fail with Vitest’s “received value must be a mock or spy” error when running pnpm vitest run utils/llms/model.test.ts. Update the mock to point at the -v2 module so the spy assertions execute against a mocked function.

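A minimal sketch of the suggested fix, assuming Vitest's standard vi.mock factory API (the exact mock shape in this repo may differ):

    import { vi } from "vitest";

    // Point the mock at the package the test actually imports, so the
    // imported createOllama is a spy rather than the real implementation.
    vi.mock("ollama-ai-provider-v2", () => ({
      createOllama: vi.fn(() => vi.fn()),
    }));

    // With the mock in place, spy assertions such as
    // expect(createOllama).toHaveBeenCalledWith({ baseURL: expect.any(String) })
    // run against a mocked function instead of throwing
    // "received value must be a mock or spy".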

duartebarbosadev merged commit cf4bd32 into main on Dec 3, 2025
duartebarbosadev requested a review due to automatic review settings on March 23, 2026 at 22:41

Labels

enhancement (New feature or request)

Projects

None yet

1 participant