Add native OpenRouter model #2409
Conversation
PR Change Summary: Added native support for the OpenRouter model in the documentation, enhancing the API's capabilities and providing detailed setup instructions.
@abhishekbhakat The reasoning parameters can already be set using the As for the As it stands, I don't think the slight increase in convenience justifies the additional burden of maintaining this code and keeping it in sync with I'll close this PR, but feel free to submit a new one that involves less duplication.
@abhishekbhakat Having seen some more OpenRouter issues show up recently, I've changed my mind on this and think we should indeed have a separate Are you interested in updating this PR to match the latest code in
Yeah. Now I feel that you feel what I feel about this 😂. I'll update the PR this weekend.
@abhishekbhakat Awesome, thank you! Can you see how we can keep the amount of copy-pasting to a minimum? Subclassing
I took a stab at updating the PR by extracting some common compat code. Perhaps it will be an ongoing effort as and when OpenRouter starts to drift apart from OpenAI.
Thanks for working on this! I just saw that OpenRouter is launching an alpha version of the Responses API, so in the future we'll benefit from having this dedicated model class :)
@abhishekbhakat Thanks for picking this up again. I haven't done a full review yet, because looking through the diff there seem to be a lot of changes that are not strictly necessary. Can you see if you can reduce the diff so that I can focus on just the bits that were pulled out of OpenAIChatModel, and the new model?
I'm still working through the PR. When I tried to extract some common functions, a lot of coverage gaps appeared. The PR is not ready for review at this point.
@DouweM Can you take a look now?
@abhishekbhakat I'll review this on Monday, sorry for the delay!
@abhishekbhakat What do you think of the implementation at #3089, which is a lot simpler because it subclasses
IMO, since OpenRouter is an aggregator, the OpenAI SDK is a lifesaver for it, and I'm very 50-50 on seeing OpenRouter diverge from OpenAI. They definitely cannot create a whole new SDK in the foreseeable future (which is relatively short in AI terms). I would say let's proceed with #3089, and we can always revisit if they actually decide to move away.
Closed in favor of #3089 |
Native Model for OpenRouter
This document outlines the implementation of a native model for OpenRouter, leveraging the OpenAI SDK internally to improve maintainability.
Reasoning Parameter Structure
The native model handles reasoning parameters differently depending on the provider:
- For OpenAI-style providers: the `openai_reasoning_effort` parameter, with accepted values `low`, `medium`, or `high`.
- For OpenRouter: the following settings are mapped into `extra_body.reasoning`:
  - `openrouter_reasoning_effort`
  - `openrouter_reasoning_max_tokens`
  - `openrouter_reasoning_enabled`
  - `openrouter_reasoning_exclude`
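Roughly, the OpenRouter-prefixed settings are collapsed into a single `reasoning` object passed through `extra_body`. The sketch below is illustrative only: the `OpenRouterReasoningSettings` type, the helper name, and the target key names (`effort`, `max_tokens`, `enabled`, `exclude`) are assumptions, not the PR's actual code.

```python
from typing import Any, TypedDict


class OpenRouterReasoningSettings(TypedDict, total=False):
    """Hypothetical container for the OpenRouter-prefixed settings named above."""

    openrouter_reasoning_effort: str  # 'low', 'medium', or 'high'
    openrouter_reasoning_max_tokens: int
    openrouter_reasoning_enabled: bool
    openrouter_reasoning_exclude: bool


def reasoning_extra_body(settings: OpenRouterReasoningSettings) -> dict[str, Any]:
    """Collect the reasoning-related settings into an extra_body-style payload.

    The target key names are an assumption about how the prefixed settings
    map onto `extra_body.reasoning`.
    """
    reasoning: dict[str, Any] = {}
    if 'openrouter_reasoning_effort' in settings:
        reasoning['effort'] = settings['openrouter_reasoning_effort']
    if 'openrouter_reasoning_max_tokens' in settings:
        reasoning['max_tokens'] = settings['openrouter_reasoning_max_tokens']
    if 'openrouter_reasoning_enabled' in settings:
        reasoning['enabled'] = settings['openrouter_reasoning_enabled']
    if 'openrouter_reasoning_exclude' in settings:
        reasoning['exclude'] = settings['openrouter_reasoning_exclude']
    return {'reasoning': reasoning} if reasoning else {}


# Example: produces {'reasoning': {'effort': 'high', 'max_tokens': 2000}}
print(reasoning_extra_body({'openrouter_reasoning_effort': 'high', 'openrouter_reasoning_max_tokens': 2000}))
```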
Finish Reason Handling
Both models use the same validation pipeline from `_openai_compat.py`, with one key difference:

- The standard finish reasons are `stop`, `length`, `tool_calls`, `content_filter`, and `function_call`.
- The OpenRouter model additionally recognizes `error` to handle OpenRouter-specific error states.

Unknown finish reasons are gracefully handled by mapping to `None` rather than raising validation errors.
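A rough sketch of that mapping; the constant and function names here are mine, not the PR's.

```python
from __future__ import annotations

# Finish reasons shared with the OpenAI-compatible pipeline, per the list above.
OPENAI_FINISH_REASONS = frozenset({'stop', 'length', 'tool_calls', 'content_filter', 'function_call'})
# OpenRouter additionally reports 'error' when an upstream provider fails.
OPENROUTER_FINISH_REASONS = OPENAI_FINISH_REASONS | {'error'}


def map_finish_reason(raw: str | None) -> str | None:
    """Return the finish reason if it is known, otherwise None (no validation error)."""
    return raw if raw in OPENROUTER_FINISH_REASONS else None


assert map_finish_reason('tool_calls') == 'tool_calls'
assert map_finish_reason('some_new_reason') is None
```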
Error Response Handling
OpenRouter can return HTTP 200 responses with an `error` field in the body when upstream providers fail. This implementation detects such responses and raises `ModelHTTPError` appropriately.
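A hedged sketch of that check, assuming `pydantic_ai.exceptions.ModelHTTPError` accepts `status_code`, `model_name`, and `body`; the helper name, the `error['code']` lookup, and the 500 fallback are assumptions for illustration.

```python
from __future__ import annotations

from typing import Any

from pydantic_ai.exceptions import ModelHTTPError


def raise_on_openrouter_error(model_name: str, body: dict[str, Any]) -> None:
    """Raise if a 200 response body carries an OpenRouter error object from a failed upstream provider."""
    error = body.get('error')
    if not error:
        return
    # Assumption: the upstream status code is nested in error['code']; fall back to 500 if absent.
    status = error.get('code') if isinstance(error, dict) else None
    raise ModelHTTPError(
        status_code=status if isinstance(status, int) else 500,
        model_name=model_name,
        body=error,
    )
```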
Closes #2323