Conversation
@codex review
💡 Codex Review
Here are some automated review suggestions for this pull request.
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you:
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review"
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
import { Provider } from "./config";
import { env } from "@/env";
import type { UserAIFields } from "./types";
import { createOllama } from "ollama-ai-provider-v2";
Mock new Ollama provider in tests
The test file now imports createOllama from ollama-ai-provider-v2, but the mock still targets the old ollama-ai-provider module later in the file, so createOllama remains the real implementation. The new Ollama tests that call expect(createOllama).toHaveBeenCalledWith(...) will therefore fail with Vitest’s “received value must be a mock or spy” error when running pnpm vitest run utils/llms/model.test.ts. Update the mock to point at the -v2 module so the spy assertions execute against a mocked function.
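The fix is a one-line change to the mock's module specifier. A minimal sketch, assuming the test file uses Vitest's vi.mock with a factory; the factory shape here (a vi.fn returning another vi.fn, mimicking a provider factory) is a hypothetical stand-in, since the actual mock body in the file isn't shown:

```typescript
import { vi } from "vitest";

// Before (stale, hypothetical): this mocks the old package, so the
// createOllama imported from "ollama-ai-provider-v2" stays real.
// vi.mock("ollama-ai-provider", () => ({ createOllama: vi.fn() }));

// After: target the package the code under test actually imports, so
// expect(createOllama).toHaveBeenCalledWith(...) asserts against a spy.
vi.mock("ollama-ai-provider-v2", () => ({
  createOllama: vi.fn(() => vi.fn()),
}));
```

Note that Vitest hoists vi.mock calls to the top of the file, and the module specifier string must match the import path exactly; mocking a different path silently leaves the real module in place, which is what produces the "received value must be a mock or spy" error.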