fix no max_tokens in payload when the vision model name does not contain 'vision' #5304
Conversation
@dustookk is attempting to deploy a commit to the NextChat Team on Vercel. A member of the Team first needs to authorize it.
Your build has completed!
Actionable comments posted: 0
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files selected for processing (1)
- app/client/platforms/openai.ts (1 hunks)
Additional comments not posted (1)
app/client/platforms/openai.ts (1)
Lines 193-194: LGTM! Simplified condition for `max_tokens`. The removal of the additional condition ensures that `max_tokens` is set for all vision models, improving flexibility and aligning with the PR objectives.
💻 Change Type
🔀 Description of Change
Fix no 'max_tokens' in the payload when the vision model name does not contain 'vision'
📝 Additional Information
The condition `&& modelConfig.model.includes("preview")` prevents regular vision models from entering the `if` block, so `max_tokens` is never set. The bug observed locally: after uploading an image, OpenAI's streamed reply is cut off. Removing the `&&` clause resolves it.
Summary by CodeRabbit
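The change described above can be sketched as follows. This is a simplified illustration, not the exact upstream code: the helper name `buildPayload`, the `ModelConfig` shape, and the `4000` floor on `max_tokens` are assumptions drawn from the PR context.

```typescript
// Hypothetical sketch of the fix in app/client/platforms/openai.ts.
// Names and the 4000 floor are assumed for illustration.

interface ModelConfig {
  model: string;
  max_tokens: number;
}

function buildPayload(
  modelConfig: ModelConfig,
  isVisionModel: boolean,
): Record<string, unknown> {
  const requestPayload: Record<string, unknown> = {
    model: modelConfig.model,
  };

  // Before the fix, the condition also required the model name to
  // contain "preview", so a model like "gpt-4-vision" never got
  // max_tokens in its payload:
  //
  //   if (isVisionModel && modelConfig.model.includes("preview")) { ... }
  //
  // After the fix, any vision model gets max_tokens:
  if (isVisionModel) {
    requestPayload["max_tokens"] = Math.max(modelConfig.max_tokens, 4000);
  }

  return requestPayload;
}
```

With the `&&` clause removed, image uploads to vision models that lack "preview" in their name no longer produce payloads missing `max_tokens`, which is what caused the truncated streamed replies.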
New Features
- Improved handling of the `max_tokens` parameter in vision model requests, enhancing compatibility across various model configurations.

Bug Fixes
- Ensured that `max_tokens` is applied more consistently.