AI Core: Perplexity support (#1357)
lgrammel authored Apr 16, 2024
1 parent 66b5892 commit 7b8791d
Showing 27 changed files with 202 additions and 76 deletions.
5 changes: 5 additions & 0 deletions .changeset/short-masks-visit.md
@@ -0,0 +1,5 @@
---
'@ai-sdk/openai': patch
---

Support streams with 'chat.completion' objects.
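
A rough illustration of what this patch loosens (an assumption based on the changeset text, not code from this commit): the OpenAI-compatible stream handling no longer requires every streamed event to be a `'chat.completion.chunk'`, so providers that label their streamed events `'chat.completion'` also work.

```ts
// Hypothetical check; the real validation lives in the @ai-sdk/openai chunk schema.
const isSupportedStreamEvent = (event: { object: string }) =>
  event.object === 'chat.completion.chunk' || event.object === 'chat.completion';
```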
9 changes: 9 additions & 0 deletions .changeset/tiny-seas-unite.md
@@ -0,0 +1,9 @@
---
'@ai-sdk/provider-utils': patch
'@ai-sdk/anthropic': patch
'@ai-sdk/mistral': patch
'@ai-sdk/google': patch
'@ai-sdk/openai': patch
---

Rename baseUrl to baseURL. Automatically remove trailing slashes.
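
A minimal usage sketch of the renamed option (based on the Anthropic facade changes further down in this diff): `baseURL` replaces `baseUrl`, and a trailing slash is stripped before the value is used.

```ts
import { Anthropic } from '@ai-sdk/anthropic';

// Both instances end up with 'https://api.anthropic.com/v1' internally:
// the explicit trailing slash is removed, and omitting the option falls back to the default.
const viaOption = new Anthropic({ baseURL: 'https://api.anthropic.com/v1/' });
const viaDefault = new Anthropic();
```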
2 changes: 1 addition & 1 deletion docs/pages/docs/ai-core/anthropic.mdx
@@ -47,7 +47,7 @@ You can import `Anthropic` from `ai/anthropic` and initialize a provider instanc
import { Anthropic } from '@ai-sdk/anthropic';

const anthropic = new Anthropic({
baseUrl: '', // optional base URL for proxies etc.
baseURL: '', // optional base URL for proxies etc.
apiKey: '', // optional API key, default to env property ANTHROPIC_API_KEY
});
```
11 changes: 6 additions & 5 deletions docs/pages/docs/ai-core/custom-provider.mdx
@@ -28,30 +28,31 @@ A custom provider should follow the pattern of using a provider facade with fact
An instance of the custom provider class with default settings can be exported for convenience.

```ts filename="custom-provider-facade.ts"
import { generateId, loadApiKey } from '@ai-sdk/provider-utils';
import { generateId, loadApiKey, withoutTrailingSlash } from '@ai-sdk/provider-utils';
import { CustomChatLanguageModel } from './custom-chat-language-model';
import { CustomChatModelId, CustomChatSettings } from './mistral-chat-settings';

/**
* Custom provider facade.
*/
export class CustomProvider {
readonly baseUrl?: string;
readonly baseURL: string;
readonly apiKey?: string;

constructor(
options: {
baseUrl?: string;
baseURL?: string;
apiKey?: string;
} = {},
) {
this.baseUrl = options.baseUrl;
this.baseURL = withoutTrailingSlash(options.baseURL) ??
'https://api.custom.ai/v1';
this.apiKey = options.apiKey;
}

private get baseConfig() {
return {
baseUrl: this.baseUrl ?? 'https://custom.ai/v1',
baseURL: this.baseURL,
headers: () => ({
Authorization: `Bearer ${loadApiKey({
apiKey: this.apiKey,
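For context, a hypothetical way such a custom provider might be consumed, assuming it exposes a `chat()` factory method like the built-in providers (the factory itself is outside the excerpt shown here):

```ts
import { CustomProvider } from './custom-provider-facade';

// Uses the default baseURL ('https://api.custom.ai/v1') and an explicitly provided API key.
const provider = new CustomProvider({ apiKey: process.env.CUSTOM_API_KEY });
const model = provider.chat('custom-model-id'); // hypothetical model id
```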
2 changes: 1 addition & 1 deletion docs/pages/docs/ai-core/google.mdx
@@ -45,7 +45,7 @@ You can import `Google` from `ai/google` and initialize a provider instance with
import { Google } from '@ai-sdk/google';

const google = new Google({
baseUrl: '', // optional base URL for proxies etc.
baseURL: '', // optional base URL for proxies etc.
apiKey: '', // optional API key, default to env property GOOGLE_GENERATIVE_AI_API_KEY
});
```
2 changes: 1 addition & 1 deletion docs/pages/docs/ai-core/mistral.mdx
@@ -45,7 +45,7 @@ You can import `Mistral` from `@ai-sdk/mistral` and initialize a provider instan
import { Mistral } from '@ai-sdk/mistral';

const mistral = new Mistral({
baseUrl: '', // optional base URL for proxies etc.
baseURL: '', // optional base URL for proxies etc.
apiKey: '', // optional API key, default to env property MISTRAL_API_KEY
});
```
2 changes: 1 addition & 1 deletion docs/pages/docs/ai-core/openai.mdx
@@ -45,7 +45,7 @@ You can import `OpenAI` from `ai/openai` and initialize a provider instance with
import { OpenAI } from '@ai-sdk/openai'

const openai = new OpenAI({
baseUrl: '', // optional base URL for proxies etc.
baseURL: '', // optional base URL for proxies etc.
apiKey: '' // optional API key, default to env property OPENAI_API_KEY
organization: '' // optional organization
})
3 changes: 3 additions & 0 deletions examples/next-openai/app/api/chat/route.ts
@@ -4,12 +4,15 @@ import { StreamingTextResponse, experimental_streamText } from 'ai';
export const runtime = 'edge';

export async function POST(req: Request) {
// Extract the `messages` from the body of the request
const { messages } = await req.json();

// Call the language model
const result = await experimental_streamText({
model: openai.chat('gpt-4-turbo-preview'),
messages,
});

// Respond with the stream
return new StreamingTextResponse(result.toAIStream());
}
43 changes: 20 additions & 23 deletions examples/next-perplexity/app/api/chat/route.ts
@@ -1,32 +1,29 @@
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';
import { OpenAI } from '@ai-sdk/openai';
import { StreamingTextResponse, experimental_streamText } from 'ai';

export const runtime = 'edge';

// Create an OpenAI API client (that's edge friendly!)
// but configure it to point to perplexity.ai
const perplexity = new OpenAI({
apiKey: process.env.PERPLEXITY_API_KEY || '',
apiKey: process.env.PERPLEXITY_API_KEY ?? '',
baseURL: 'https://api.perplexity.ai/',
});

// IMPORTANT! Set the runtime to edge
export const runtime = 'edge';

export async function POST(req: Request) {
// Extract the `messages` from the body of the request
const { messages } = await req.json();

// Ask Perplexity for a streaming chat completion using PPLX 70B online model
// @see https://blog.perplexity.ai/blog/introducing-pplx-online-llms
const response = await perplexity.chat.completions.create({
model: 'pplx-70b-online',
stream: true,
max_tokens: 1000,
messages,
});
try {
// Extract the `messages` from the body of the request
const { messages } = await req.json();

// Convert the response into a friendly text-stream.
const stream = OpenAIStream(response);
// Call the language model
const result = await experimental_streamText({
// see https://docs.perplexity.ai/docs/model-cards for models
model: perplexity.chat('sonar-medium-chat'),
messages,
});

// Respond with the stream
return new StreamingTextResponse(stream);
// Respond with the stream
return new StreamingTextResponse(result.toAIStream());
} catch (error) {
console.log(error);
throw error;
}
}
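The reworked route streams plain text that the example app can render on the client. A minimal sketch of such a consumer using the SDK's `useChat` hook (not part of this commit; the component and file layout are assumed):

```tsx
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  // useChat posts to /api/chat by default, matching the route above.
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <form onSubmit={handleSubmit}>
      {messages.map(m => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <input value={input} onChange={handleInputChange} placeholder="Ask Perplexity..." />
    </form>
  );
}
```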
2 changes: 1 addition & 1 deletion examples/next-perplexity/package.json
@@ -9,9 +9,9 @@
"lint": "next lint"
},
"dependencies": {
"@ai-sdk/openai": "latest",
"ai": "latest",
"next": "14.1.1",
"openai": "4.16.1",
"react": "18.2.0",
"react-dom": "^18.2.0"
},
2 changes: 1 addition & 1 deletion packages/anthropic/README.md
@@ -21,7 +21,7 @@ You can import `Anthropic` from `ai/anthropic` and initialize a provider instanc
import { Anthropic } from '@ai-sdk/anthropic';

const anthropic = new Anthropic({
baseUrl: '', // optional base URL for proxies etc.
baseURL: '', // optional base URL for proxies etc.
apiKey: '', // optional API key, default to env property ANTHROPIC_API_KEY
});
```
30 changes: 26 additions & 4 deletions packages/anthropic/src/anthropic-facade.ts
@@ -1,4 +1,4 @@
import { loadApiKey } from '@ai-sdk/provider-utils';
import { loadApiKey, withoutTrailingSlash } from '@ai-sdk/provider-utils';
import { AnthropicMessagesLanguageModel } from './anthropic-messages-language-model';
import {
AnthropicMessagesModelId,
@@ -9,23 +9,45 @@ import {
* Anthropic provider.
*/
export class Anthropic {
readonly baseUrl?: string;
/**
* Base URL for the Anthropic API calls.
*/
readonly baseURL: string;

readonly apiKey?: string;

/**
* Creates a new Anthropic provider instance.
*/
constructor(
options: {
/**
* Base URL for the Anthropic API calls.
*/
baseURL?: string;

/**
* @deprecated Use `baseURL` instead.
*/
baseUrl?: string;

/**
* API key for authenticating requests.
*/
apiKey?: string;

generateId?: () => string;
} = {},
) {
this.baseUrl = options.baseUrl;
this.baseURL =
withoutTrailingSlash(options.baseURL ?? options.baseUrl) ??
'https://api.anthropic.com/v1';
this.apiKey = options.apiKey;
}

private get baseConfig() {
return {
baseUrl: this.baseUrl ?? 'https://api.anthropic.com/v1',
baseURL: this.baseURL,
headers: () => ({
'anthropic-version': '2023-06-01',
'anthropic-beta': 'tools-2024-04-04',
6 changes: 3 additions & 3 deletions packages/anthropic/src/anthropic-messages-language-model.ts
@@ -23,7 +23,7 @@ import { mapAnthropicStopReason } from './map-anthropic-stop-reason';

type AnthropicMessagesConfig = {
provider: string;
baseUrl: string;
baseURL: string;
headers: () => Record<string, string | undefined>;
};

@@ -165,7 +165,7 @@ export class AnthropicMessagesLanguageModel implements LanguageModelV1 {
const { args, warnings } = this.getArgs(options);

const response = await postJsonToApi({
url: `${this.config.baseUrl}/messages`,
url: `${this.config.baseURL}/messages`,
headers: this.config.headers(),
body: args,
failedResponseHandler: anthropicFailedResponseHandler,
@@ -220,7 +220,7 @@ export class AnthropicMessagesLanguageModel implements LanguageModelV1 {
const { args, warnings } = this.getArgs(options);

const response = await postJsonToApi({
url: `${this.config.baseUrl}/messages`,
url: `${this.config.baseURL}/messages`,
headers: this.config.headers(),
body: {
...args,
4 changes: 2 additions & 2 deletions packages/core/core/prompt/prepare-call-settings.ts
@@ -135,8 +135,8 @@ export function prepareCallSettings({
maxTokens,
temperature: temperature ?? 0,
topP,
presencePenalty: presencePenalty ?? 0,
frequencyPenalty: frequencyPenalty ?? 0,
presencePenalty,
frequencyPenalty,
seed,
maxRetries: maxRetries ?? 2,
};
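
The assumed effect of dropping these defaults (not spelled out in the diff itself): penalties the caller leaves unset are now forwarded as `undefined`, so each provider applies its own default instead of always receiving `0`.

```ts
// Sketch of the behavioral difference; the call signature with all settings omitted is assumed.
const settings = prepareCallSettings({});
// before this change: settings.presencePenalty === 0, settings.frequencyPenalty === 0
// after this change:  settings.presencePenalty === undefined, settings.frequencyPenalty === undefined
```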
2 changes: 1 addition & 1 deletion packages/google/README.md
@@ -17,7 +17,7 @@ You can import `Google` from `ai/google` and initialize a provider instance with
import { Google } from '@ai-sdk/google';

const google = new Google({
baseUrl: '', // optional base URL for proxies etc.
baseURL: '', // optional base URL for proxies etc.
apiKey: '', // optional API key, default to env property GOOGLE_GENERATIVE_AI_API_KEY
});
```
35 changes: 30 additions & 5 deletions packages/google/src/google-facade.ts
@@ -1,4 +1,8 @@
import { generateId, loadApiKey } from '@ai-sdk/provider-utils';
import {
generateId,
loadApiKey,
withoutTrailingSlash,
} from '@ai-sdk/provider-utils';
import { GoogleGenerativeAILanguageModel } from './google-generative-ai-language-model';
import {
GoogleGenerativeAIModelId,
@@ -9,27 +13,48 @@ import {
* Google provider.
*/
export class Google {
readonly baseUrl?: string;
/**
* Base URL for the Google API calls.
*/
readonly baseURL: string;

readonly apiKey?: string;

private readonly generateId: () => string;

/**
* Creates a new Google provider instance.
*/
constructor(
options: {
/**
* Base URL for the Google API calls.
*/
baseURL?: string;

/**
* @deprecated Use `baseURL` instead.
*/
baseUrl?: string;

/**
* API key for authenticating requests.
*/
apiKey?: string;

generateId?: () => string;
} = {},
) {
this.baseUrl = options.baseUrl;
this.baseURL =
withoutTrailingSlash(options.baseURL ?? options.baseUrl) ??
'https://generativelanguage.googleapis.com/v1beta';
this.apiKey = options.apiKey;
this.generateId = options.generateId ?? generateId;
}

private get baseConfig() {
return {
baseUrl:
this.baseUrl ?? 'https://generativelanguage.googleapis.com/v1beta',
baseURL: this.baseURL,
headers: () => ({
'x-goog-api-key': loadApiKey({
apiKey: this.apiKey,
6 changes: 3 additions & 3 deletions packages/google/src/google-generative-ai-language-model.ts
@@ -23,7 +23,7 @@ import { mapGoogleGenerativeAIFinishReason } from './map-google-generative-ai-fi

type GoogleGenerativeAIConfig = {
provider: string;
baseUrl: string;
baseURL: string;
headers: () => Record<string, string | undefined>;
generateId: () => string;
};
@@ -152,7 +152,7 @@ export class GoogleGenerativeAILanguageModel implements LanguageModelV1 {
const { args, warnings } = this.getArgs(options);

const response = await postJsonToApi({
url: `${this.config.baseUrl}/${this.modelId}:generateContent`,
url: `${this.config.baseURL}/${this.modelId}:generateContent`,
headers: this.config.headers(),
body: args,
failedResponseHandler: googleFailedResponseHandler,
@@ -190,7 +190,7 @@ export class GoogleGenerativeAILanguageModel implements LanguageModelV1 {
const { args, warnings } = this.getArgs(options);

const response = await postJsonToApi({
url: `${this.config.baseUrl}/${this.modelId}:streamGenerateContent?alt=sse`,
url: `${this.config.baseURL}/${this.modelId}:streamGenerateContent?alt=sse`,
headers: this.config.headers(),
body: args,
failedResponseHandler: googleFailedResponseHandler,
2 changes: 1 addition & 1 deletion packages/mistral/README.md
@@ -19,7 +19,7 @@ You can import `Mistral` from `ai/mistral` and initialize a provider instance wi
import { Mistral } from '@ai-sdk/mistral';

const mistral = new Mistral({
baseUrl: '', // optional base URL for proxies etc.
baseURL: '', // optional base URL for proxies etc.
apiKey: '', // optional API key, default to env property MISTRAL_API_KEY
});
```