
feat: add TextChunk as possible content type for Assistant message#57

Closed
gcalmettes wants to merge 1 commit into mistralai:main from gcalmettes:fix/assitant-message-openai-compatibility

Conversation


@gcalmettes gcalmettes commented Sep 23, 2024

Currently, the AssistantMessage class only accepts str as its content type. It is the only role that accepts nothing other than str (or None) as content.

This means that payloads generated by external systems that use the Mistral tokenizer (e.g. vLLM) and that also accept TextChunk as content for the assistant message will break:

"messages": [
    {
        "role": "system",
        "content": [
            {
                "type": "text",
                "text": "\nCurrent model: pixtral-12b-2409\nCurrent date: 2024-09-23T07:14:36.737Z\n\nYou are a helpful assistant. You can help me by answering my questions. You can also ask me questions."
            }
        ]
    },
    {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": "Does the animal in the first image could live in what is describe in the second image ?"
            },
            {
                "type": "image_url",
                "image_url": {
                    "url": "https://some-urls"
                }
            },
            {
                "type": "image_url",
                "image_url": {
                    "url": "https://some-urls"
                }
            }
        ]
    },
    {
        "role": "assistant",
        "content": [
            {
                "type": "text",
                "text": "Some response from Pixtral."
            }
        ]
    },
    {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": "What time is it ?"
            }
        ]
    }
]

This PR adds TextChunk as a possible content type for the AssistantMessage class.
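A minimal sketch of the change described above, using plain dataclasses rather than the library's actual pydantic models (the class and field names mirror the PR's description but are illustrative, not the real mistral-common API):

```python
from dataclasses import dataclass
from typing import List, Optional, Union


@dataclass
class TextChunk:
    """One {"type": "text", "text": ...} content part."""
    text: str
    type: str = "text"


@dataclass
class AssistantMessage:
    # Before this PR, content could only be Optional[str].
    # After, a list of TextChunk is also accepted, matching the
    # other roles (and the OpenAI client's payload shape):
    content: Optional[Union[str, List[TextChunk]]] = None
    role: str = "assistant"


def content_as_text(msg: AssistantMessage) -> Optional[str]:
    """Normalize either content form to a plain string for tokenization."""
    if msg.content is None or isinstance(msg.content, str):
        return msg.content
    return "".join(chunk.text for chunk in msg.content)


# Both forms now validate and normalize to the same text:
plain = AssistantMessage(content="Some response from Pixtral.")
chunked = AssistantMessage(content=[TextChunk(text="Some response from Pixtral.")])
```

With this shape, the assistant message from the payload above (a one-element list of text chunks) parses the same way as a bare string.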

@gcalmettes
Author

@jean-malo @patrickvonplaten any chance of this PR being reviewed on your side? This still causes some incompatibility when querying Mistral with the OpenAI client (since on their side, TextChunks are also valid for assistant messages, as for the other roles).

Thanks !

@gcalmettes
Author

related: vllm-project/vllm#12859

@juliendenize
Contributor

Done in another PR!
