
Azure Functions

Chat using ChatGPT (Python v2 Function)

This sample shows how to take a ChatGPT prompt as HTTP GET or POST input, calculate the completion using the OpenAI ChatGPT service, and then return the output and cache it in a Blob state store.

Open in GitHub Codespaces

Run on your local environment

Pre-reqs

  1. Python 3.8+
  2. Azure Functions Core Tools
  3. OpenAI API key
  4. Export this secret as an environment variable using the value from step 3.

Mac/Linux

export OPENAI_API_KEY=*Paste from step 3*

Windows

Search for Environment Variables in Settings and create a new System Variable like the following:

Variable           Value
OPENAI_API_KEY     Paste from step 3
  5. Add this local.settings.json file to the root folder to simplify local development, and include the key from step 3:
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
    "AzureWebJobsStorage": "",
    "OPENAI_API_KEY": "*Paste from step 3*"
  }
}

Using Functions CLI

  1. Open a new terminal and run the following:
pip3 install -r requirements.txt
func start
  2. Using your favorite REST client (e.g., REST Client in VS Code, Postman, or curl), make a POST request. A test.http file has been provided so you can run this quickly.

Terminal:

curl -i -X POST http://localhost:7071/api/chat/ \
  -H "Content-Type: text/json" \
  --data-binary "@testdata.json"

testdata.json

{
    "prompt": "Write a poem about Azure Functions.  Include two reasons why users love them."
}

test.http

POST http://localhost:7071/api/chat HTTP/1.1
content-type: application/json

{
    "prompt": "Write a poem about Azure Functions.  Include two reasons why users love them."
}

You will see the chat output in the terminal's standard out and in the HTTP response, and it will also be saved to a Blob in the samples-chatgpt-output container for state management.

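As a rough illustration of how the HTTP trigger and the Blob caching fit together in the Python v2 programming model, the flow looks roughly like the sketch below. The decorator arguments, blob path, and variable names here are assumptions for illustration only; see ./function_app.py for the actual code.

import os
import openai
import azure.functions as func

app = func.FunctionApp()
openai.api_key = os.environ["OPENAI_API_KEY"]

@app.route(route="chat", methods=["GET", "POST"])
@app.blob_output(arg_name="outputblob",
                 path="samples-chatgpt-output/{rand-guid}.txt",
                 connection="AzureWebJobsStorage")
def chat(req: func.HttpRequest, outputblob: func.Out[str]) -> func.HttpResponse:
    # Accept the prompt from the query string (GET) or the JSON body (POST)
    prompt = req.params.get("prompt")
    if not prompt:
        try:
            prompt = req.get_json().get("prompt")
        except ValueError:
            prompt = None
    if not prompt:
        return func.HttpResponse("Please pass a prompt", status_code=400)

    # Ask OpenAI for a completion (same call as shown in the Source Code section)
    completion = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        temperature=0.9,
        max_tokens=200,
    )
    text = completion.choices[0].text

    # Cache the result as a blob in the samples-chatgpt-output container
    outputblob.set(text)
    return func.HttpResponse(text)
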
Source Code

The key code that makes this work is in ./function_app.py, as follows. You can customize it or learn more with the OpenAI Examples and Playground.

    # Send the generated prompt to the OpenAI Completions endpoint
    completion = openai.Completion.create(
        model='text-davinci-003',
        prompt=generate_prompt(prompt),
        temperature=0.9,
        max_tokens=200
    )
    # Return the text of the first completion choice
    return completion.choices[0].text

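If you customize the sample to use a newer release of the openai Python package (1.x), the legacy Completion endpoint is replaced by the client-based Chat Completions API. The following is a minimal sketch of that variant, not the code shipped in this sample; the model name gpt-3.5-turbo is just an example.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def get_completion(prompt: str) -> str:
    # Chat Completions takes a list of messages instead of a raw prompt string
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.9,
        max_tokens=200,
    )
    return response.choices[0].message.content
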
Deploy to Azure

The easiest way to deploy this app is with the Azure Developer CLI (azd). If you open this repo in GitHub Codespaces, the azd tooling is already preinstalled.

To provision and deploy:

azd up
