Add model prop to useChat hook to remove the need for a backend #5207

Open
wants to merge 19 commits into base: main
Conversation

The-Best-Codes

This PR introduces a model prop on the useChat hook, enabling direct interaction with LanguageModelV1-compliant AI providers (e.g., Google, Groq, and other @ai-sdk providers) from the client side, removing the need for a dedicated backend in simple use cases.

Addresses: #5140

Changes:

  • Added model prop to UseChatOptions: the UseChatOptions interface now accepts an optional model property of type LanguageModelV1.
  • Conditional API call: triggerRequest now checks whether a model is provided. If so, it uses the model's doStream method to interact with the AI provider directly; otherwise it falls back to the existing callChatApi function, which requires a backend endpoint.
  • Streaming implementation: when a model is provided, the response is streamed using TransformStream and WritableStream, processed chunk by chunk, and the chat messages are updated in real time.
  • Updated chatKey: the chatKey used for SWR now includes the model instance when one is provided, ensuring unique cache keys for different model configurations.
  • TypeScript improvements: added the language model types to the existing types.
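For reviewers, the conditional dispatch and chunk-by-chunk streaming described above can be sketched roughly as follows. This is a self-contained illustration, not the actual SDK code: MinimalModel, fakeModel, streamFromModel, and the callChatApi stub are hypothetical stand-ins for the real LanguageModelV1 types and internals.

```typescript
import { ReadableStream, TransformStream } from "node:stream/web";

// Hypothetical stand-in for a LanguageModelV1 provider instance.
type MinimalModel = {
  doStream: () => Promise<{ stream: ReadableStream<string> }>;
};

// Stand-in for callChatApi: the backend path taken when no model is given.
async function callChatApi(api: string): Promise<string> {
  return `POST ${api}`;
}

// Client-side path: pipe the provider stream through a TransformStream
// and consume it chunk by chunk, mirroring the mechanism in the PR.
async function streamFromModel(model: MinimalModel): Promise<string> {
  const { stream } = await model.doStream();
  const passthrough = stream.pipeThrough(
    new TransformStream<string, string>({
      transform(chunk, controller) {
        // A real implementation would update chat messages per chunk here.
        controller.enqueue(chunk);
      },
    }),
  );
  const reader = passthrough.getReader();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += value ?? "";
  }
  return text;
}

// If a model instance is present, talk to the provider directly;
// otherwise fall back to the backend endpoint.
async function triggerRequest(opts: {
  api: string;
  model?: MinimalModel;
}): Promise<string> {
  return opts.model ? streamFromModel(opts.model) : callChatApi(opts.api);
}

// A fake model whose doStream yields two chunks.
const fakeModel: MinimalModel = {
  doStream: async () => ({
    stream: new ReadableStream<string>({
      start(controller) {
        controller.enqueue("Hello, ");
        controller.enqueue("world!");
        controller.close();
      },
    }),
  }),
};

triggerRequest({ api: "/api/chat", model: fakeModel }).then((out) => {
  console.log(out); // "Hello, world!"
});
```

The key design point is that both paths resolve through the same triggerRequest entry point, so existing backend-based usage is unaffected when no model is passed.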

Usage Example:

"use client";
import { createGoogleGenerativeAI } from "@ai-sdk/google";
import { useChat } from "@ai-sdk/react";

// Existing function, I made no changes here
const google = createGoogleGenerativeAI({
  apiKey: "(pass the key from somewhere)",
});

function ChatComponent() {
  const { messages, append, input, setInput, handleSubmit } = useChat({
    model: google("gemini-2.0-flash-001"), // Pass the model instance - this is the change I made
  });

  return (
    <div>
      {/* Some UI */}
    </div>
  );
}

export default ChatComponent;

Super open to changing this if it's not what you had in mind. 😀

The-Best-Codes and others added 4 commits March 12, 2025 21:23
This commit introduces a new `model` prop to the `useChat` hook.
This prop allows the integration of an AI provider model directly
into the hook, enabling streaming functionality using
`TransformStream`. The commit also updates the key generation logic
to account for the new `model` prop and includes error handling for
stream abortions.
@The-Best-Codes
Author

Please do not merge this PR yet. I need to do some more tests. 🫤

@The-Best-Codes
Author

I'm done testing, ready for feedback 👂
