GPT-5: fail gracefully if temperature parameter is provided #2482

@gusfraser

Description

If temperature is sent as a parameter, e.g.

agent = PydanticAgent(
    model="openai:gpt-5-mini",
    instructions="You are an expert at evaluating whether conversation goals have been achieved. Be precise and analytical.",
    model_settings={"temperature": 0.7, "max_tokens": 200},
)

this fails with gpt-5 models, as temperature is no longer a supported parameter:

PydanticAI generation failed: status_code: 400, model_name: gpt-5-mini, body: {'message': "Unsupported value: 'temperature' does not support 0.7 with this model. Only the default (1) value is supported.", 'type': 
'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}

I think this should fail gracefully, since temperature is a common attribute sent to other models. Having to remember and configure which parameters to send (or not send) for each model makes switching models more difficult.
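Until the library handles this, a caller-side workaround is to filter settings before constructing the agent. Below is a minimal sketch; the `UNSUPPORTED_SETTINGS` mapping and the `filter_model_settings` helper are hypothetical names, and the gpt-5 entry is inferred from the 400 error above, not an official list:

```python
# Hypothetical caller-side workaround: drop settings that a given model
# family rejects before passing model_settings to the agent.
# The mapping below is an assumption based on the error in this report.
UNSUPPORTED_SETTINGS = {
    "gpt-5": {"temperature"},  # gpt-5 models only accept the default (1)
}

def filter_model_settings(model: str, settings: dict) -> dict:
    """Return a copy of `settings` without keys the model rejects."""
    model_name = model.split(":", 1)[-1]  # strip provider prefix, e.g. "openai:"
    blocked = set()
    for prefix, keys in UNSUPPORTED_SETTINGS.items():
        if model_name.startswith(prefix):
            blocked |= keys
    return {k: v for k, v in settings.items() if k not in blocked}
```

For example, `filter_model_settings("openai:gpt-5-mini", {"temperature": 0.7, "max_tokens": 200})` returns `{"max_tokens": 200}`, while gpt-4 models keep their temperature setting untouched.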

References

https://community.openai.com/t/temperature-in-gpt-5-models/1337133/13 - a breaking change, according to OpenAI community comments
