Description
I'm trying to follow the Build your first flow guide from CrewAI with a gpt-5-mini
deployment on Azure, but when I try running the flow with crewai flow kickoff
I get the following error:
```
ValueError: The model azure/my-gpt-5-mini-deployment does not support response_format for provider 'azure'. Please remove response_format or use a supported model.
```
The same error occurs with `azure/gpt-4.1-mini`. Checking whether my deployments support `response_format` directly through LiteLLM returns `True` (see Evidence below).
Steps to Reproduce
- Follow the Build your first flow guide
- Set up the Azure model according to the Azure configuration example in the docs, with AZURE_API_KEY, AZURE_API_BASE, and AZURE_API_VERSION keys in the `.env` file
- Edit the LLM constructor in `src/crewai_flow/main.py`:

```python
# Initialize the LLM
llm = LLM(model="azure/my-gpt-5-mini-deployment", response_format=GuideOutline)
```

- Edit `crews/content_crew/config/agents.yaml` and set `llm` to `azure/my-gpt-5-mini-deployment`
- Run the flow with `crewai flow kickoff`
- Provide inputs, for example "python" and "beginner"
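For reference, a `.env` set up as in the second step would look roughly like this (the key, resource name, and API version shown are placeholders; substitute your own values):

```
AZURE_API_KEY=<your-azure-openai-key>
AZURE_API_BASE=https://<your-resource>.openai.azure.com/
AZURE_API_VERSION=2024-08-01-preview
```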
Expected behavior
The output described in the "Build your first flow" guide.
Screenshots/Code snippets
Switching to a different provider, e.g. `gemini/gemini-2.0-flash`, works fine.
Operating System
Other (specify in additional context)
Python Version
3.11
crewAI Version
0.165.1
crewAI Tools Version
--
Virtual Environment
Venv
Evidence
Checking my model deployment through LiteLLM directly seems to be fine.
```python
from litellm import get_supported_openai_params

params = get_supported_openai_params(model="my-gpt-5-mini-deployment", custom_llm_provider="azure")
print("response_format" in params)
# Outputs "True"
```
Also, according to Microsoft's docs, both the gpt-4.1 and gpt-5 models support structured outputs.
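As a further check that the Azure endpoint itself accepts `response_format`, one could build the raw chat-completions request by hand. This is a minimal stdlib-only sketch: the endpoint, deployment name, API version, and schema below are placeholders (not the real `GuideOutline` model), and the actual POST is left commented out so nothing is sent without real credentials.

```python
import json
import urllib.request

# Placeholder values -- substitute your own deployment details.
ENDPOINT = "https://<your-resource>.openai.azure.com"
DEPLOYMENT = "my-gpt-5-mini-deployment"
API_VERSION = "2024-08-01-preview"

def build_structured_request(prompt: str) -> dict:
    """Build a chat-completions body asking Azure for a JSON-schema
    structured output, analogous to passing response_format=GuideOutline."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "response_format": {
            "type": "json_schema",
            "json_schema": {
                "name": "GuideOutline",  # illustrative schema, not the real model
                "schema": {
                    "type": "object",
                    "properties": {"title": {"type": "string"}},
                    "required": ["title"],
                    "additionalProperties": False,
                },
                "strict": True,
            },
        },
    }

body = build_structured_request("Outline a beginner Python guide")
url = f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
req = urllib.request.Request(
    url,
    data=json.dumps(body).encode(),
    headers={"Content-Type": "application/json", "api-key": "<AZURE_API_KEY>"},
)
# urllib.request.urlopen(req)  # uncomment with real credentials to send
print(body["response_format"]["type"])  # → json_schema
```

If a request like this succeeds against the deployment, it would suggest the error is raised by CrewAI's provider validation rather than by Azure itself.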
Possible Solution
None
Additional context
OS: WSL - Debian 12