Issue
While testing something that uses this client to make requests, things were working fine yesterday but started failing today with the following error: `Invalid arguments while trying to generate text result: Bad Request (400) - Model incompatible request argument supplied: temperature.`
Debugging further, I found that if I only have OpenAI configured and don't specify a model, the SDK started picking gpt-4o-mini-search-preview today. I'm not sure whether that's a model OpenAI just released or whether they changed the ordering of the model list their API returns, but that model now ends up at the top of the list when we try to determine which model works best.
As the error shows, that model doesn't support standard parameters like temperature or n, so the request fails.
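
For what it's worth, the 400 appears to come from OpenAI itself rather than from how this SDK builds the request. Here's a minimal sketch (outside the SDK, plain curl against the chat completions endpoint) that I'd expect to reproduce it directly:

```php
<?php
// Minimal sketch, not part of the SDK: call OpenAI's chat completions
// endpoint directly with temperature set on the search-preview model.
$payload = json_encode([
    'model'       => 'gpt-4o-mini-search-preview',
    'messages'    => [['role' => 'user', 'content' => 'Hello']],
    'temperature' => 0.9, // rejected by the search-preview models
]);

$ch = curl_init('https://api.openai.com/v1/chat/completions');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/json',
        'Authorization: Bearer ' . getenv('OPENAI_API_KEY'),
    ],
    CURLOPT_POSTFIELDS     => $payload,
]);

$body   = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_RESPONSE_CODE);
curl_close($ch);

echo $status . PHP_EOL; // 400
echo $body . PHP_EOL;   // error about the incompatible "temperature" argument
```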
This does highlight a downside of the approach taken in this SDK, where we try to intelligently choose the provider and model: any time a new model is released, or the model APIs we use change the order of their results, we can end up breaking integrations. I was only testing locally, so this was annoying rather than critical, but for production integrations it would be ideal to have more certainty that things won't just break one day.
There's potentially a conversation to have around that (for instance, should we let callers set preferred models within the client to avoid breaking changes like this?), but for now it seems we need to update the OpenAiModelMetadataDirectory class to account for this search model. A rough sketch of the idea is below.
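
I haven't checked the exact shape OpenAiModelMetadataDirectory expects, so the class and property names below are hypothetical, but conceptually the directory would need entries (or a fallback rule keyed on the `-search-preview` suffix) that mark the sampling parameters these models reject, so model selection and request building can take that into account:

```php
<?php
// Hypothetical sketch only -- the real OpenAiModelMetadataDirectory API may differ.
// The idea: describe the search-preview models so the SDK knows they reject
// sampling parameters such as temperature and n.

final class SearchPreviewModelMetadata
{
    public function __construct(
        public readonly string $modelId,
        /** @var list<string> request options the model does NOT accept */
        public readonly array $unsupportedOptions = [],
    ) {}
}

// Entries that would need to be added (the option list below covers what the
// error message confirms; other sampling options may also be rejected):
$searchPreviewModels = [
    new SearchPreviewModelMetadata(
        'gpt-4o-mini-search-preview',
        unsupportedOptions: ['temperature', 'n'],
    ),
];
```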
Steps to Reproduce
Things may change, but as of right now I can easily reproduce this by calling our CLI command:
OPENAI_API_KEY=sk-proj-XXXX php cli.php 'Hello' --temperature=0.9

If the model the SDK chooses changes in the future, you can still reproduce it by forcing the model explicitly:

OPENAI_API_KEY=sk-proj-XXXX php cli.php 'Hello' --modelId=gpt-4o-mini-search-preview --temperature=0.9