Refactor Fireworks and add ChatFireworks #3

Merged (14 commits) on Sep 14, 2023
255 changes: 255 additions & 0 deletions docs/extras/integrations/chat/fireworks.ipynb
@@ -0,0 +1,255 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"id": "642fd21c-600a-47a1-be96-6e1438b421a9",
"metadata": {},
"source": [
"# ChatFireworks\n",
"\n",
">[Fireworks](https://app.fireworks.ai/) accelerates product development on generative AI by creating an innovative AI experiment and production platform. \n",
"\n",
"This example goes over how to use LangChain to interact with `ChatFireworks` models."
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "d00d850917865298",
"metadata": {
"collapsed": false,
"jupyter": {
"outputs_hidden": false
}
},
"outputs": [],
"source": [
"from langchain.chat_models.fireworks import ChatFireworks\n",
"from langchain.schema import SystemMessage, HumanMessage\n",
"import os"
]
},
{
"cell_type": "markdown",
"id": "f28ebf8b-f14f-46c7-9962-8b8dc42e31be",
"metadata": {},
"source": [
"# Setup\n",
"Contact Fireworks AI for the an API Key to access our models\n",
"\n",
"Set up your model using a model id. If the model is not set, the default model is fireworks-llama-v2-7b-chat."
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "d096fb14-8acc-4047-9cd0-c842430c3a1d",
"metadata": {},
"outputs": [],
"source": [
"# Initialize a Fireworks Chat model\n",
"os.environ['FIREWORKS_API_KEY'] = \"<your_api_key>\" # Change this to your own API key\n",
"chat = ChatFireworks(model=\"accounts/fireworks/models/llama-v2-13b-chat\")"
]
},
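{
"cell_type": "markdown",
"id": "apikey-getpass-note",
"metadata": {},
"source": [
"If you prefer not to hard-code the key, the cell below is a minimal sketch that reuses an existing `FIREWORKS_API_KEY` environment variable and otherwise prompts for it with `getpass`; the prompt text is illustrative."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "apikey-getpass-code",
"metadata": {},
"outputs": [],
"source": [
"# Minimal sketch: load the API key without hard-coding it.\n",
"# Assumes the key is either already exported as FIREWORKS_API_KEY\n",
"# or can be entered interactively via getpass.\n",
"import getpass\n",
"\n",
"if \"FIREWORKS_API_KEY\" not in os.environ:\n",
"    os.environ[\"FIREWORKS_API_KEY\"] = getpass.getpass(\"Fireworks API key: \")\n",
"\n",
"chat = ChatFireworks(model=\"accounts/fireworks/models/llama-v2-13b-chat\")"
]
},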
{
"cell_type": "markdown",
"id": "d8f13144-37cf-47a5-b5a0-e3cdf76d9a72",
"metadata": {},
"source": [
"# Calling the Model\n",
"\n",
"You can use the LLMs to call the model for specified message(s). \n",
"\n",
"See the full, most up-to-date model list on [app.fireworks.ai](https://app.fireworks.ai)."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "72340871-ae2f-415f-b399-0777d32dc379",
"metadata": {},
"outputs": [],
"source": [
"# ChatFireworks Wrapper\n",
"system_message = SystemMessage(content=\"You are to chat with the user.\")\n",
"human_message = HumanMessage(content=\"Who are you?\")\n",
"response = chat([system_message, human_message])"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "2d6ef879-69e3-422b-8379-bb980b70fe55",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=\"Hello! My name is LLaMA, I'm a large language model trained by a team of researcher at Meta AI. My primary function is to assist users with tasks and answer questions to the best of my ability. I am capable of understanding and responding to natural language input, and I am here to help you with any questions or tasks you may have. Is there anything specific you would like to know or discuss?\", additional_kwargs={}, example=False)"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"response"
]
},
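{
"cell_type": "markdown",
"id": "response-content-note",
"metadata": {},
"source": [
"The call returns an `AIMessage`; the generated text itself is available on its `content` attribute, as the short snippet below shows."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "response-content-code",
"metadata": {},
"outputs": [],
"source": [
"# The generated text of the AIMessage above\n",
"print(response.content)"
]
},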
{
"cell_type": "code",
"execution_count": 5,
"id": "68c6b1fa-2ff7-4a63-8d88-3cec302180b8",
"metadata": {},
"outputs": [],
"source": [
"# Setting additional parameters: temperature, max_tokens, top_p\n",
"chat = ChatFireworks(model=\"accounts/fireworks/models/llama-v2-13b-chat\", model_kwargs={\"temperature\":1, \"max_tokens\": 20, \"top_p\": 1})\n",
"system_message = SystemMessage(content=\"You are to chat with the user.\")\n",
"human_message = HumanMessage(content=\"How's the weather today?\")\n",
"response = chat([system_message, human_message])"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "a09025f8-e4c3-4005-a8fc-c9c774b03a64",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=\"Oh, you know, it's just another beautiful day in the virtual world! The sun\", additional_kwargs={}, example=False)"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"response"
]
},
{
"cell_type": "markdown",
"id": "d93aa186-39cf-4e1a-aa32-01ed31d43bc8",
"metadata": {},
"source": [
"# ChatFireworks Wrapper with generate"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "cbe29efc-37c3-4c83-8b84-b8bba1a1e589",
"metadata": {},
"outputs": [],
"source": [
"chat = ChatFireworks()\n",
"message = HumanMessage(content=\"Hello\")\n",
"response = chat.generate([[message], [message]])"
]
},
{
"cell_type": "code",
"execution_count": 8,
"id": "35109f36-9519-47a6-a223-25639123e836",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"LLMResult(generations=[[ChatGeneration(text=\"Hello! It's nice to meet you. I'm here to help answer any questions you may have, while being respectful and safe. Please feel free to ask me anything, and I will do my best to provide helpful and positive responses. Is there something specific you would like to know or discuss?\", generation_info={'finish_reason': 'stop'}, message=AIMessage(content=\"Hello! It's nice to meet you. I'm here to help answer any questions you may have, while being respectful and safe. Please feel free to ask me anything, and I will do my best to provide helpful and positive responses. Is there something specific you would like to know or discuss?\", additional_kwargs={}, example=False))], [ChatGeneration(text=\"Hello! *smiling* I'm here to help you with any questions or concerns you may have. Please feel free to ask me anything, and I will do my best to provide helpful, respectful, and honest responses. I'm programmed to avoid any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content, and to provide socially unbiased and positive responses. Is there anything specific you would like to talk about or ask?\", generation_info={'finish_reason': 'stop'}, message=AIMessage(content=\"Hello! *smiling* I'm here to help you with any questions or concerns you may have. Please feel free to ask me anything, and I will do my best to provide helpful, respectful, and honest responses. I'm programmed to avoid any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content, and to provide socially unbiased and positive responses. Is there anything specific you would like to talk about or ask?\", additional_kwargs={}, example=False))]], llm_output={'model': 'accounts/fireworks/models/llama-v2-7b-chat'}, run=[RunInfo(run_id=UUID('f137463e-e1c7-454a-8b85-b999ce20e0f2')), RunInfo(run_id=UUID('f3ef1138-92de-4e01-900b-991e34a647a7'))])"
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"response"
]
},
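{
"cell_type": "markdown",
"id": "generate-result-note",
"metadata": {},
"source": [
"`generate` returns an `LLMResult` whose `generations` field holds one list of candidates per input. A small sketch for pulling out just the generated text:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "generate-result-code",
"metadata": {},
"outputs": [],
"source": [
"# One inner list per input message list; print the first candidate's text from each\n",
"for generation in response.generations:\n",
"    print(generation[0].text)"
]
},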
{
"cell_type": "markdown",
"id": "92c2cabb-9eaf-4c49-b0e5-a5de5a7d920e",
"metadata": {},
"source": [
"# ChatFireworks Wrapper with stream"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "12717a29-fb7d-4a4d-860b-40435452b065",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"Hello! I'm just\n",
" an AI assistant,\n",
" here to help answer your\n",
" questions and provide information in\n",
" a responsible and respectful manner\n",
". I'm not able\n",
" to access personal information or provide\n",
" any content that could be considered\n",
" harmful, uneth\n",
"ical, racist, sex\n",
"ist, toxic, dangerous\n",
", or illegal. My purpose\n",
" is to assist and provide helpful\n",
" responses that are socially un\n",
"biased and positive in nature\n",
". Is there something specific you\n",
" would like to know or discuss\n",
"?\n"
]
}
],
"source": [
"llm = ChatFireworks()\n",
"\n",
"for token in llm.stream(\"Who are you\"):\n",
" print(token.content)"
]
},
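{
"cell_type": "markdown",
"id": "astream-note",
"metadata": {},
"source": [
"In async code the same pattern can be used with `astream`. This is a sketch that assumes `ChatFireworks` implements LangChain's asynchronous streaming interface; Jupyter allows top-level `async for`."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "astream-code",
"metadata": {},
"outputs": [],
"source": [
"# Async streaming sketch (assumes the async interface is implemented for ChatFireworks)\n",
"llm = ChatFireworks()\n",
"\n",
"async for token in llm.astream(\"Who are you\"):\n",
"    print(token.content, end=\"\", flush=True)"
]
},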
{
"cell_type": "code",
"execution_count": null,
"id": "02991e05-a38e-47d4-9ab3-7e630a8ead55",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.4"
}
},
"nbformat": 4,
"nbformat_minor": 5
}