
Add MedGemma to MLLM detection patterns and fix the mllm flag#22

Merged
waybarrios merged 1 commit into main from fix/medgemma-mllm-detection on Jan 28, 2026
Conversation

@waybarrios
Owner

Fixes #21

Summary

MedGemma models were not being recognized as multimodal because the pattern was missing from the detection list. Additionally, the --mllm flag was being ignored because force_mllm was never passed to the engines.
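To illustrate the detection bug, here is a minimal sketch of substring-based MLLM detection. The actual function and pattern names in `api/utils.py` are not shown in this PR, so the names below (`MLLM_PATTERNS`, `is_mllm`) are assumptions:

```python
# Hypothetical sketch of pattern-based MLLM detection before this PR.
# The real pattern list in api/utils.py may differ; the point is that
# "medgemma" was absent, so MedGemma model names never matched.
MLLM_PATTERNS = ["llava", "qwen2-vl", "idefics"]  # "medgemma" missing

def is_mllm(model_name: str) -> bool:
    """Return True if the model name matches a known multimodal pattern."""
    name = model_name.lower()
    return any(pattern in name for pattern in MLLM_PATTERNS)

# MedGemma falls through every pattern and is treated as text-only.
print(is_mllm("mlx-community/medgemma-1.5-4b-it-bf16"))  # → False
```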

Changes

Added medgemma to MLLM detection patterns in both api/utils.py and models/mllm.py. The --mllm CLI flag now properly passes force_mllm to SimpleEngine and BatchedEngine, allowing users to force MLLM mode for models not yet in the pattern list.
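The fix can be sketched in two parts: the new pattern entry, and the `force_mllm` override being threaded into the engine. `SimpleEngine` and `force_mllm` are named in the PR text; the constructor signature and helper names below are assumptions, not the repository's exact code:

```python
# Hedged sketch of the fix. "medgemma" joins the pattern list, and the
# --mllm flag's force_mllm value now short-circuits detection.
MLLM_PATTERNS = ["llava", "qwen2-vl", "idefics", "medgemma"]  # pattern added

def is_mllm(model_name: str, force_mllm: bool = False) -> bool:
    if force_mllm:  # --mllm overrides pattern matching entirely
        return True
    name = model_name.lower()
    return any(pattern in name for pattern in MLLM_PATTERNS)

class SimpleEngine:
    # Previously force_mllm was never forwarded here, so --mllm was ignored.
    def __init__(self, model_name: str, force_mllm: bool = False):
        self.is_multimodal = is_mllm(model_name, force_mllm)

# MedGemma now matches by pattern; unknown models can still be forced.
engine = SimpleEngine("some-org/new-vision-model", force_mllm=True)
print(engine.is_multimodal)  # → True
```

The override matters because pattern lists inevitably lag behind new model releases; `--mllm` gives users an escape hatch without waiting for a patch.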

Testing

All 389 tests pass. Verified that mlx-community/medgemma-1.5-4b-it-bf16 is now correctly detected as MLLM.



Development

Successfully merging this pull request may close these issues.

Doesn't recognize medgemma mlx-community/medgemma-1.5-4b-it-bf16 as mllm (multimodal model)
