[Cookbook] Add Anyscale Endpoint cookbook #746

Merged (4 commits) on Jan 4, 2024
29 changes: 29 additions & 0 deletions cookbooks/Anyscale/README.md
@@ -0,0 +1,29 @@
# Anyscale Endpoints with AIConfig

[![colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1JgGjJ2YglyaT5GHQNswkPOyB5oHGbOcv?usp=sharing)

[Anyscale Endpoints](https://www.anyscale.com/endpoints) support optimized inference for many open source models, including the LLaMA2 family of models (7B, 13B, 70B, CodeLLaMA, LLaMA Guard) and Mistral (7B, Mixtral 8x7B).

This cookbook shows how to use any [Anyscale Endpoints](https://www.anyscale.com/endpoints) model with AIConfig using the same simple API.

We cover:

- Inference using Anyscale Endpoints
- Prompt chains with multiple models

Read more about [AIConfig for prompt and model management](https://github.com/lastmile-ai/aiconfig).

## Models supported with Anyscale Endpoints

For the complete list, see the [Anyscale Endpoints documentation](https://app.endpoints.anyscale.com/docs).

- LLaMA2-7B: `meta-llama/Llama-2-7b-chat-hf`
- LLaMA2-13B: `meta-llama/Llama-2-13b-chat-hf`
- LLaMA2-70B: `meta-llama/Llama-2-70b-chat-hf`
- LLaMA Guard: `Meta-Llama/Llama-Guard-7b`
- CodeLLaMA: `codellama/CodeLlama-34b-Instruct-hf`
- Mistral-7B (OpenOrca): `Open-Orca/Mistral-7B-OpenOrca`
- Mistral-7B: `mistralai/Mistral-7B-Instruct-v0.1`
- Mixtral-8x7B: `mistralai/Mixtral-8x7B-Instruct-v0.1`
- Zephyr: `HuggingFaceH4/zephyr-7b-beta`
- GTE: `thenlper/gte-large`
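Any of the model IDs above can be reached through an OpenAI-compatible client by swapping the base URL. The sketch below shows the idea; the base URL and environment-variable name are assumptions drawn from Anyscale's public docs, not from this cookbook:

```python
import os

# Anyscale Endpoints speaks the OpenAI chat-completions protocol, so an
# OpenAI-style client can target it by pointing at a different base URL.
# URL and env-var name are assumptions, not taken from this cookbook.
ANYSCALE_BASE_URL = "https://api.endpoints.anyscale.com/v1"

def anyscale_client_kwargs(api_key: str = "") -> dict:
    """Build constructor kwargs for an OpenAI-style client aimed at
    Anyscale Endpoints; falls back to an env var when no key is passed."""
    return {
        "base_url": ANYSCALE_BASE_URL,
        "api_key": api_key or os.environ.get("ANYSCALE_ENDPOINT_API_KEY", ""),
    }

kwargs = anyscale_client_kwargs("esecret_demo")
print(kwargs["base_url"])
# With the real client this becomes, e.g.:
#   client = openai.OpenAI(**kwargs)
#   client.chat.completions.create(
#       model="meta-llama/Llama-2-7b-chat-hf",
#       messages=[{"role": "user", "content": "Hello"}],
#   )
```

This is the same pattern the `AnyscaleEndpoint` model parser relies on: one protocol, many model IDs.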
542 changes: 542 additions & 0 deletions cookbooks/Anyscale/getting_started.ipynb

Large diffs are not rendered by default.

55 changes: 55 additions & 0 deletions cookbooks/Anyscale/travel.aiconfig.json
@@ -0,0 +1,55 @@
{
"name": "NYC Trip Planner",
"description": "Intrepid explorer with ChatGPT and AIConfig",
"schema_version": "latest",
"metadata": {
"model_parsers": {
"meta-llama/Llama-2-7b-chat-hf": "AnyscaleEndpoint",
"meta-llama/Llama-2-13b-chat-hf": "AnyscaleEndpoint",
"Meta-Llama/Llama-Guard-7b": "AnyscaleEndpoint",
"meta-llama/Llama-2-70b-chat-hf": "AnyscaleEndpoint",
"Open-Orca/Mistral-7B-OpenOrca": "AnyscaleEndpoint",
"codellama/CodeLlama-34b-Instruct-hf": "AnyscaleEndpoint",
"HuggingFaceH4/zephyr-7b-beta": "AnyscaleEndpoint",
"mistralai/Mistral-7B-Instruct-v0.1": "AnyscaleEndpoint",
"mistralai/Mixtral-8x7B-Instruct-v0.1": "AnyscaleEndpoint",
"thenlper/gte-large": "AnyscaleEndpoint"
},
"models": {
"meta-llama/Llama-2-7b-chat-hf": {
"model": "meta-llama/Llama-2-7b-chat-hf",
"top_p": 1,
"temperature": 1
},
"meta-llama/Llama-2-70b-chat-hf": {
"model": "meta-llama/Llama-2-70b-chat-hf",
"max_tokens": 3000,
"system_prompt": "You are an expert travel coordinator with exquisite taste."
}
},
"default_model": "meta-llama/Llama-2-7b-chat-hf"
},
"prompts": [
{
"name": "get_activities",
"input": "Tell me 10 fun attractions to do in NYC."
},
{
"name": "gen_itinerary",
"input": "Generate an itinerary ordered by {{order_by}} for these activities: {{get_activities.output}}.",
"metadata": {
"model": {
"name": "AnyscaleEndpoint",
"settings": {
"model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
"max_tokens": 3000,
"system_prompt": "You are an expert travel coordinator with exquisite taste."
}
},
"parameters": {
"order_by": "geographic location"
}
}
}
]
}
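The `gen_itinerary` prompt above chains on `get_activities` via the `{{get_activities.output}}` reference and takes an `order_by` parameter. A minimal illustration of that substitution step (not the actual AIConfig resolver) looks like:

```python
import re

def resolve_template(template: str, context: dict) -> str:
    """Replace each {{name}} with context[name]; unknown names are left as-is.
    Handles both plain parameters (order_by) and dotted chain references
    (get_activities.output)."""
    return re.sub(
        r"\{\{\s*([\w.]+)\s*\}\}",
        lambda m: str(context.get(m.group(1), m.group(0))),
        template,
    )

# Pretend the first prompt already ran and produced this output.
context = {
    "order_by": "geographic location",
    "get_activities.output": "1. Central Park\n2. The Met",
}
resolved = resolve_template(
    "Generate an itinerary ordered by {{order_by}} for these activities: "
    "{{get_activities.output}}.",
    context,
)
print(resolved)
```

In the real SDK, running `gen_itinerary` triggers `get_activities` first, then performs this substitution before calling the Mixtral model.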
53 changes: 53 additions & 0 deletions cookbooks/Anyscale/travel.aiconfig.yaml
@@ -0,0 +1,53 @@
description: Intrepid explorer with ChatGPT and AIConfig
metadata:
default_model: meta-llama/Llama-2-7b-chat-hf
model_parsers:
HuggingFaceH4/zephyr-7b-beta: AnyscaleEndpoint
Meta-Llama/Llama-Guard-7b: AnyscaleEndpoint
Open-Orca/Mistral-7B-OpenOrca: AnyscaleEndpoint
codellama/CodeLlama-34b-Instruct-hf: AnyscaleEndpoint
meta-llama/Llama-2-13b-chat-hf: AnyscaleEndpoint
meta-llama/Llama-2-70b-chat-hf: AnyscaleEndpoint
meta-llama/Llama-2-7b-chat-hf: AnyscaleEndpoint
mistralai/Mistral-7B-Instruct-v0.1: AnyscaleEndpoint
mistralai/Mixtral-8x7B-Instruct-v0.1: AnyscaleEndpoint
thenlper/gte-large: AnyscaleEndpoint
models:
meta-llama/Llama-2-70b-chat-hf:
max_tokens: 3000
model: meta-llama/Llama-2-70b-chat-hf
system_prompt: You are an expert travel coordinator with exquisite taste.
meta-llama/Llama-2-7b-chat-hf:
model: meta-llama/Llama-2-7b-chat-hf
temperature: 1
top_p: 1
parameters: {}
name: NYC Trip Planner
prompts:
- input: Tell me 10 fun attractions to do in NYC.
name: get_activities
- input: 'Generate an itinerary ordered by {{order_by}} for these activities: {{get_activities.output}}.'
metadata:
model:
name: AnyscaleEndpoint
settings:
max_tokens: 3000
model: mistralai/Mixtral-8x7B-Instruct-v0.1
system_prompt: You are an expert travel coordinator with exquisite taste.
parameters:
order_by: geographic location
name: gen_itinerary
- input: What should I bring to {{location}}?
metadata:
model:
name: gpt-3.5-turbo
settings:
model: gpt-3.5-turbo
system_prompt:
content: You provide a bulleted list of items to pack for a week long trip.
role: system
parameters:
location: nyc
remember_chat_context: true
name: gen_packing_list
schema_version: latest
2 changes: 1 addition & 1 deletion cookbooks/Function-Calling-OpenAI/README.md
@@ -4,6 +4,6 @@

This example is taken from [OpenAI's function calling demo](https://github.com/openai/openai-node/blob/v4/examples/function-call-stream.ts) and modified to show how the same functionality can be done more simply with AIConfig.

-TThis notebook serves as a practical guide for leveraging AIConfig and function calling with OpenAI models. We start with a mock database of books and functions to list, search, and retrieve books. Function calling is enabled so the LLM can interpret a user's question, determine the appropriate function to call, and execute the function. Read more about [Function Calling with Open AI](https://openai.com/blog/function-calling-and-other-api-updates) and [AIConfig for prompt and model management](https://github.com/lastmile-ai/aiconfig).
+This notebook serves as a practical guide for leveraging AIConfig and function calling with OpenAI models. We start with a mock database of books and functions to list, search, and retrieve books. Function calling is enabled so the LLM can interpret a user's question, determine the appropriate function to call, and execute the function. Read more about [Function Calling with Open AI](https://openai.com/blog/function-calling-and-other-api-updates) and [AIConfig for prompt and model management](https://github.com/lastmile-ai/aiconfig).

[Google Colab notebook](https://colab.research.google.com/drive/1RZ5s2XmD-Gkg64QlS80lgwOFT3F7J346)
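The function-calling setup that README describes can be sketched as follows; the function name `search_books` and the mock data are illustrative assumptions, since the notebook's actual identifiers may differ:

```python
import json

# A mock book "database" plus one tool the model may choose to invoke.
BOOKS = [
    {"title": "A Wild Sheep Chase", "author": "Haruki Murakami"},
    {"title": "Kafka on the Shore", "author": "Haruki Murakami"},
]

def search_books(author: str) -> list:
    """The host-side function executed when the model requests it."""
    return [b for b in BOOKS if b["author"] == author]

# JSON-schema description handed to the model so it can request this call.
SEARCH_TOOL = {
    "name": "search_books",
    "description": "Search the mock book database by author name.",
    "parameters": {
        "type": "object",
        "properties": {"author": {"type": "string"}},
        "required": ["author"],
    },
}

# When the model answers with a function call, its arguments arrive as a
# JSON string that the host code parses and dispatches:
args = json.loads('{"author": "Haruki Murakami"}')
results = search_books(**args)
print(len(results))
```

The model never executes anything itself: it only names the function and supplies arguments, and the host runs the call and feeds the result back.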
2 changes: 2 additions & 0 deletions python/src/aiconfig/Config.py
@@ -5,6 +5,7 @@

import requests
from aiconfig.callback import CallbackEvent, CallbackManager
+from aiconfig.default_parsers.anyscale_endpoint import DefaultAnyscaleEndpointParser
from aiconfig.default_parsers.openai import DefaultOpenAIParser
from aiconfig.default_parsers.palm import PaLMChatParser, PaLMTextParser
from aiconfig.model_parser import InferenceOptions, ModelParser
@@ -35,6 +36,7 @@
]
for model in gpt_models:
ModelParserRegistry.register_model_parser(DefaultOpenAIParser(model))
+ModelParserRegistry.register_model_parser(DefaultAnyscaleEndpointParser("AnyscaleEndpoint"))
ModelParserRegistry.register_model_parser(PaLMChatParser())
ModelParserRegistry.register_model_parser(PaLMTextParser())
ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationParser())
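The diff above registers one `AnyscaleEndpoint` parser for all Anyscale-served models. A self-contained sketch of that registry pattern (an assumption-level illustration, not the real aiconfig internals) is:

```python
# Minimal registry pattern: each parser registers under an ID string, and
# prompt metadata (model_parsers in the configs above) resolves to it by ID.
class AnyscaleEndpointParser:
    def __init__(self, parser_id: str):
        self._id = parser_id

    def id(self) -> str:
        return self._id

class ModelParserRegistry:
    _parsers = {}

    @classmethod
    def register_model_parser(cls, parser) -> None:
        cls._parsers[parser.id()] = parser

    @classmethod
    def get_model_parser(cls, parser_id: str):
        return cls._parsers[parser_id]

# One registration covers every model routed through Anyscale Endpoints.
ModelParserRegistry.register_model_parser(AnyscaleEndpointParser("AnyscaleEndpoint"))
parser = ModelParserRegistry.get_model_parser("AnyscaleEndpoint")
print(parser.id())
```

This is why the JSON/YAML configs map ten different model IDs to the single string `"AnyscaleEndpoint"`: the registry needs only one parser entry for the whole family.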