
Conversation

cbrit (Contributor) commented Apr 11, 2025

Closes #393 (feat: Add support for Inception Labs' Mercury Coder Beta dLLM)

yavens (Contributor) commented Apr 11, 2025

It looks like Inception is an OpenAI-compatible provider. Is there a reason you can't reuse the types directly from the OAI module? See Hyperbolic's integration

cbrit (Contributor, Author) commented Apr 11, 2025

> It looks like Inception is an OpenAI-compatible provider. Is there a reason you can't reuse the types directly from the OAI module? See Hyperbolic's integration

Oh great catch. That would simplify things for sure.

cbrit (Contributor, Author) commented Apr 11, 2025

So since the beta only supports text, it seems they don't yet accept structured content data in the content field of the messages:

```
// this works
{ ... "messages": [{ "content": "Hello, world!" }] }

// this is what OpenAI expects and what the OpenAI Message type serializes to; it does not work with the Inception API right now
{ ... "messages": [{ "content": { "type": "text", "text": "Hello, world!" }}]}
```

So I was able to simplify it by using the OpenAI provider's streaming utility, but not by using its Message type.

The resulting runtime error when using the OpenAI Message type:

```
ProviderError: {"detail":[{"type":"string_type","loc":["body","messages",0,"content"],"msg":"Input should be a valid string","input":[{"type":"text","text":"You are a helpful AI assistant."}]},{"type":"string_type","loc":["body","messages",1,"content"],"msg":"Input should be a valid string","input":[{"type":"text","text":"Hello, how are you?"}]}]}
```
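As a rough illustration of the workaround described above, a provider-specific message type can simply serialize `content` as a plain string. The `InceptionMessage` name and fields below are assumptions for the sketch, not the types actually used in this PR:

```rust
use serde::{Deserialize, Serialize};

// Hypothetical sketch: because the Inception beta only accepts a plain string
// in `content`, the provider keeps its own message type instead of reusing
// the OpenAI one, whose `content` serializes to structured parts.
#[derive(Debug, Serialize, Deserialize)]
struct InceptionMessage {
    role: String,
    // Serializes to `"content": "Hello, world!"`, which the beta accepts.
    content: String,
}

fn main() {
    let msg = InceptionMessage {
        role: "user".to_string(),
        content: "Hello, world!".to_string(),
    };
    // Prints: {"role":"user","content":"Hello, world!"}
    println!("{}", serde_json::to_string(&msg).unwrap());
}
```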

0xMochan (Contributor) commented:
> So since the beta only supports text, it seems they don't yet accept structured content data in the content field of the messages […] So I was able to simplify it by using the OpenAI provider's streaming utility, but not by using its Message type.

It seems like a lot of "OpenAI-compatible" APIs are like this, which makes me think we might want a custom serializer here.
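For reference, a custom serializer along those lines could be wired up with serde's `serialize_with`; the `TextPart` and `Message` types and the newline join below are illustrative assumptions, not rig's actual OpenAI module types:

```rust
use serde::{Serialize, Serializer};

// Hypothetical text part mirroring the OpenAI-style structured content.
#[derive(Serialize)]
struct TextPart {
    r#type: String,
    text: String,
}

// Flatten structured text parts into a single plain string, which is what
// string-only "OpenAI-compatible" APIs accept.
fn content_as_string<S>(parts: &[TextPart], serializer: S) -> Result<S::Ok, S::Error>
where
    S: Serializer,
{
    let joined = parts
        .iter()
        .map(|p| p.text.as_str())
        .collect::<Vec<_>>()
        .join("\n");
    serializer.serialize_str(&joined)
}

#[derive(Serialize)]
struct Message {
    role: String,
    // With this attribute the same structured content serializes to
    // `"content": "..."` instead of an array of typed parts.
    #[serde(serialize_with = "content_as_string")]
    content: Vec<TextPart>,
}
```

With that attribute in place, the structured content would serialize to a plain `"content": "..."` string, matching what the string-only APIs above accept.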

cbrit (Contributor, Author) commented Apr 16, 2025

@0xMochan Reverted to the custom message implementation

0xMochan (Contributor) commented:
> @0xMochan Reverted to the custom message implementation

Is this because they support the full content type now? Otherwise, we can flag this for review.

cbrit (Contributor, Author) commented May 7, 2025

> > @0xMochan Reverted to the custom message implementation
>
> Is this because they support the full content type now? Otherwise, we can flag this for review.

We discussed reverting in Discord DMs because they don't. I haven't actually checked whether the API has changed now that it's been a few weeks, though.

joshua-mo-143 (Contributor) commented:

@cbrit Don't suppose anything has changed on this? I'm looking into potentially adding more new providers.

