SQLDatabaseToolkit adds temperature=0 by default and breaks execution when using o3-mini #29541
Open
Labels: 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature), investigate (Flagged for investigation.)
Example Code
In langchain_community/agent_toolkits/sql/toolkit.py, the docstring of class SQLDatabaseToolkit(BaseToolkit), at line 48, shows:

Instantiate:
.. code-block:: python

This example instructs the LLM to generate code using temperature=0. When o3-mini is used, this causes:

BadRequestError('Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", …

I removed temperature=0 from my local copy of the code, instructed the model not to use temperature with o3-mini, and the error stopped. It also works when I pass the instruction not to set temperature directly in the user/system prompt.
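To make the failure mode concrete, here is a minimal sketch of why the call is rejected. The payload shape, the unsupported-parameter list, and the `find_unsupported_params` helper are assumptions for illustration, not the toolkit's actual request code:

```python
# Sketch of the chat-completions payload an agent ends up sending when it
# follows the docstring example. With o3-mini, the `temperature` field
# triggers a 400 BadRequestError (code: unsupported_parameter).
payload = {
    "model": "o3-mini",
    "messages": [{"role": "user", "content": "List all tables."}],
    "temperature": 0,  # <- rejected by o3-mini
}

# Hypothetical pre-flight check: report parameters the model would reject.
def find_unsupported_params(payload: dict, unsupported=("temperature",)) -> list:
    return [name for name in unsupported if name in payload]

print(find_unsupported_params(payload))  # → ['temperature']
```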
Error Message and Stack Trace (if applicable)
BadRequestError('Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}')

Traceback (most recent call last):
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 637, in generate
self._generate_with_cache(
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 855, in _generate_with_cache
result = self._generate(
^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 717, in _generate
response = self.client.create(**payload)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 850, in create
return self._post(
^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1283, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 960, in request
return self._request(
^^^^^^^^^^^^^^
File "/root/.cache/pypoetry/virtualenvs/gix-service-9TtSrW0h-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1064, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}
Description
The docstring of class SQLDatabaseToolkit(BaseToolkit) in langchain_community/agent_toolkits/sql/toolkit.py, line 48, demonstrates instantiation with temperature=0. When an LLM follows that example and o3-mini is the target model, the request fails with the BadRequestError above, because o3-mini does not support the temperature parameter. Removing temperature=0 from a local copy of the code and instructing the model not to use temperature with o3-mini stops the error. It also works when I pass the instruction not to set temperature directly in the user/system prompt.
System Info
System Information
Package Information
Other Dependencies