Merge pull request #185 from Azure-Samples/main
merge changes in main to wrk550 branch
nitya authored Sep 20, 2024
2 parents 08782be + 18b9645 commit 24c79d8
Showing 16 changed files with 185 additions and 136 deletions.
2 changes: 1 addition & 1 deletion docs/workshop/docs/00-Before-You-Begin/index.md
@@ -40,7 +40,7 @@ _Tip: every page will have these handy **Next →** markers to help you navigate
- [X] The infrastructure is pre-provisioned for you. Just launch the lab to get started.
- [X] The sessions run for a fixed time. You have 75 minutes to complete the lab.

!!! example "**Next** → Doing a self-guided walkthrough of the workshop? [Get Started Here](./../02-Self-Guide-Setup/02-provision.md)"
!!! example "**Next** → Doing a self-guided walkthrough of the workshop? [Get Started Here](./../02-Self-Guide-Setup/01-setup.md)"

- [X] You will use your own Azure subscription and laptop.
- [X] You will provision Azure infrastructure and deploy the application yourself.
8 changes: 5 additions & 3 deletions docs/workshop/docs/01-Tour-Guide-Setup/01-setup.md
@@ -19,7 +19,7 @@ The **WRK550 Lab** is run using the Skillable platform which provides you with a

!!! info "If you are currently in an AI Tour session and have already launched the Skillable Lab and verified credentials - move on to Section 2 below. Otherwise, complete these two steps now."

1. Open a new bowser window In incognito mode (window A)
1. Open a new browser window in incognito mode (window A)

The workshop is conducted completely within a browser environment. You may have an enterprise Azure or GitHub account that you are logged into from your browser, which may cause conflicts. To avoid this, we recommend opening a new browser window in **incognito mode** (private mode) with your preferred browser.

@@ -88,6 +88,7 @@ GitHub Codespaces will be our development environment for this workshop. Let's l
https://portal.azure.com
```
1. **Sign in** using the `Username` and `Password` displayed under "Azure Credentials" in the Skillable Lab window you launched in **Step 1** (above).
1. You will be presented with a "Welcome to Microsoft Azure" screen. Click **Cancel** to dismiss, or click **Get Started** if you'd like to take an introductory tour of the Azure Portal.
1. In the Navigate section, **Click** `Resource Groups`.
1. A resource group has been created for you, containing the resources needed for the RAG application. **Click** `rg-AITOUR`.
1. **Check:** Deployments (under "Essentials") - There are **35 succeeded** Deployments.
@@ -126,11 +127,11 @@ GitHub Codespaces will be our development environment for this workshop. Let's l
1. Visit the `rg-AITOUR` Resource group page
1. Click the `Container App` resource to display the Overview page
1. Look for `Application Url` (at top right), and click it to launch in new tab (Tab 5️⃣)
* This creates a new tab `"Azure Container Apps"` displaying the logo
* This creates a new tab `"Welcome to Azure Container Apps!"` displaying the logo
!!! info "Azure Container Apps (ACA) is an easy-to-use compute solution for hosting our chat AI application. The application is implemented as a FastAPI server that exposes a simple `/create_request` API endpoint to clients for direct use or integration with third-party clients."
**🌟 | CONGRATULATIONS!** - Your ACA Endpoint is ready!
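For orientation, the sketch below shows roughly what a FastAPI endpoint like `/create_request` could look like. This is a simplified, hypothetical example - the HTTP method and handler logic are assumptions for illustration (the parameter names mirror the ones used later in the workshop), not the actual contoso-chat source code.
```python
# Hypothetical sketch only - not the actual contoso-chat implementation.
from fastapi import FastAPI

app = FastAPI()

@app.get("/create_request")
async def create_request(question: str, customer_id: str, chat_history: str = "[]"):
    # The real app would retrieve customer and product context here (RAG)
    # and call a generative AI model to produce the grounded answer.
    answer = f"(placeholder answer for customer {customer_id})"
    return {"question": question, "answer": answer, "context": []}
```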
## Step 6: Make sure CodeSpaces has completed launching
@@ -139,6 +140,7 @@ GitHub Codespaces will be our development environment for this workshop. Let's l
You should see the Visual Studio Online development environment. If you have used Visual Studio Code on the desktop, it will look very familiar. You will see these components:
* Left sidebar: The Activity Bar, including the "Prompty" extension logo at the end
![Prompty logo](../img/prompty-logo.png)
* Left pane: The Explorer pane, showing the files in the `contoso-chat` repository
* Right pane: A preview of the main README.md file from the repository
* Lower pane: A terminal pane, with a `bash` prompt ready to receive input
10 changes: 7 additions & 3 deletions docs/workshop/docs/01-Tour-Guide-Setup/02-validate.md
@@ -33,8 +33,9 @@ azd version
```

```bash
python --version
prompty --version
```

```bash
fastapi --version
```
@@ -63,6 +64,7 @@ From the VS Code Online Terminal pane (in Tab 2️⃣):
```
azd auth login
```
- You won't need to enter the password again. Simply select your Skillable Lab account.
!!! success "You are now logged into Azure CLI and Azure Developer CLI"
@@ -75,12 +77,14 @@ From the Terminal pane in Tab 2️⃣:
1. Run the commands below
```
azd env set AZURE_LOCATION francecentral
azd env set AZURE_LOCATION francecentral -e AITOUR --no-prompt
```
```
azd env refresh -e AITOUR --no-prompt
azd env refresh -e AITOUR
```
(Press ENTER to select the default Azure subscription presented).
The file `.azure/AITOUR/.env` has been updated in our filesystem with information needed to build our app: connection strings, endpoint URLs, resource names and much more. You can open the file to see the values retrieved, or display them with this command:
```
2 changes: 1 addition & 1 deletion docs/workshop/docs/02-Self-Guide-Setup/01-setup.md
@@ -1,6 +1,6 @@
# 1️⃣ | Getting Started (Self-Guided Workshop)

These are the instructions for **Self Guided** learners for this workshop. If you are participating in an instructor-led version of this workshop, please skip ahead to Section 3️⃣ [Provision Infra](./../02-Self-Guide-Setup/02-provision.md).
These are the instructions for **Self Guided** learners for this workshop. If you are participating in an instructor-led version of this workshop, please skip ahead to Section 3️⃣ [Explore App Infrastructure](./../03-Workshop-Build/03-infra.md).

In this section, you will provision the required resources to your Azure subscription, and validate your local development environment in GitHub Codespaces.
2 changes: 1 addition & 1 deletion docs/workshop/docs/02-Self-Guide-Setup/02-provision.md
@@ -27,4 +27,4 @@ The last step provisions the Azure infrastructure **and** deploys the first vers

---

!!! info "Next → 3️⃣ [Let's Explore App Infrastructure](./../03-Workshop-Build/03-infra.md) before we start building!
!!! info "Next → 3️⃣ [Let's Explore App Infrastructure](./../03-Workshop-Build/03-infra.md) before we start building!"
13 changes: 9 additions & 4 deletions docs/workshop/docs/03-Workshop-Build/03-infra.md
@@ -35,7 +35,11 @@ The Azure AI Search resource contains the product index for our retailer's prod
1. Click `Search Explorer` in the top-nav menu
- see Search explorer with default index `contoso-products`
- **click** "Search" with no other input
- you will see: Results dialog filled with index data for the product database.
- you will see: Results dialog filled with index data for the entire product database.
1. Enter `sleeping bag` in the text box, and click Search
- Verify that the first result returned relates to a sleeping bag from the catalog
1. Enter `something to make food with` in the text box, and click Search
- Verify that the first result returned relates to a camping stove

✅ | Your Azure AI Search resource is ready!
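If you want to reproduce those Search Explorer queries from code, a minimal sketch using the `azure-search-documents` Python package might look like this - the endpoint and query key shown are placeholders; take the real values from your `rg-AITOUR` resources:
```python
# Sketch: query the contoso-products index programmatically.
# Endpoint and query key are placeholders from your own Search resource.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search_client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="contoso-products",
    credential=AzureKeyCredential("<your-query-key>"),
)

for doc in search_client.search(search_text="sleeping bag", top=3):
    print(doc)  # each result is a dict of index fields for one product
```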

@@ -52,10 +56,11 @@ When iterating on a prototype application, we start with manual testing - using
- enter `[]` for **chat_history**
- enter **Execute** to run the endpoint with the provided parameters.

You will get a response body with `answer` and `context` components.
You will get a response body with `question`, `answer` and `context` components.

* `answer` is the chatbot's response to the customer's `question`, entered into the chat window
* `context` is the additional information provided to the Generative AI model, which it uses to ground its answer. In this app, that includes information about products relevant to the customer question. The products selected may depend on the `customer_id` and their associated order history.
* `question` is the customer's question as typed in the chat window on the Contoso Outdoor website
* `answer` is the chatbot's response to the customer's `question`, as generated by this RAG application
* `context` is the additional information provided to the Generative AI model, which it used to ground its answer. In this app, that includes information about products relevant to the customer question. The products selected may depend on the `customer_id` and their associated order history.
* The web app provides the `chat_history` from the chat window, which provides additional context for the generative AI model to ground its response.

✅ | Your Contoso Chat AI is deployed - and works with valid inputs!
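The same smoke test can also be scripted. Below is a rough sketch using the `requests` package - the HTTP method and parameter encoding are assumptions, so check the Swagger page for the endpoint's actual contract, and substitute your Container App's `Application Url`:
```python
# Sketch: call the deployed Contoso Chat endpoint outside the Swagger UI.
import requests

aca_url = "https://<your-container-app>.azurecontainerapps.io"  # Application Url
response = requests.get(
    f"{aca_url}/create_request",
    params={
        "question": "tell me about your hiking jackets",  # sample question
        "customer_id": "2",                               # sample customer id
        "chat_history": "[]",
    },
    timeout=60,
)
response.raise_for_status()
body = response.json()
print(body["answer"])   # the chatbot's grounded response
print(body["context"])  # the product/customer data used for grounding
```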
30 changes: 20 additions & 10 deletions docs/workshop/docs/03-Workshop-Build/04-ideation.md
@@ -120,9 +120,18 @@ cp chat-0.prompty chat-1.prompty
Open the file `chat-1.prompty` and edit it as described below.
### Set the temperature parameter
1. Add the following at Line 15 (at the end of the `parameters:` section):
```
temperature: 0.2
```
!!! info "[Temperature](https://learn.microsoft.com/azure/ai-services/openai/concepts/advanced-prompt-engineering?pivots=programming-language-chat-completions#temperature-and-top_p-parameters) is one of the parameters you can use to modify the behavior of Generative AI models. It controls the degree of randomness in the response, from 0.0 (deterministic) to 1.0 (maximum variability)."
### Use a sample data file
For now, we'll supply data in a JSON file to provide context for the generative AI model. (Later, we'll extract this data from the databases.)
From here, we'll supply data in a JSON file to provide context for the generative AI model. (Later, we'll extract this data from the databases.)
1. Copy a JSON file with sample data to provide as context in our Prompty.
```
@@ -192,14 +201,14 @@ Prompty constructs the meta-prompt from the inputs before passing it to the mode
3. Ideate on your own!
You can change the system prompt to modify the style and tone of the responses from the chatbot.
- Try adding `Provide responses in a bullet list of items` to the end of the `system:` section. What happens to the output?
You can also change the parameters passed to the generative AI model in the `parameters:` section.
- Have you observed truncated responses in the output? Try changing `max_tokens` to 3000 - does that fix the problem?
- Try changing `temperature` to 0.7. Try some other values between 0.0 and 1.0. What happens to the output?
✅ | Your prompty template is updated, and uses a sample test data file
@@ -219,9 +228,9 @@ cp chat-1.prompty chat-2.prompty
cp chat-1.json chat-2.json
```
1. Open `chat-1.prompty` for editing
1. Open `chat-2.prompty` for editing
1. Change line 22 to input the new data file:
1. Change line 21 to input the new data file:
```
sample: ${file:chat-2.json}
@@ -283,7 +292,7 @@ cp chat-1.json chat-2.json
cp ../docs/workshop/src/1-build/chat-3.json .
```
1. In the Explorer pane, right-click on the new `chat-3.prompty` file: select `Add Code > Add Prompty Code`. This creates a new Python file `chat-3.py` and opens it in VS Code.
1. In the Explorer pane, right-click on the new `chat-3.prompty` file and select `Add Code > Add Prompty Code`. This creates a new Python file `chat-3.py` and opens it in VS Code.
1. Add the three lines below to the top of `chat-3.py`:
@@ -297,6 +306,8 @@ cp chat-1.json chat-2.json
1. Execute `chat-3.py` by clicking the "play" button at the top-right of its VS Code window.
A Python script forms the basis of the FastAPI endpoint we deployed in Tab 5️⃣. We'll explore the source code later.
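For reference, a minimal sketch of executing a Prompty asset from Python looks roughly like the snippet below. The generated `chat-3.py` (and the three lines you added to it) may differ in detail - treat this as an approximation, not the generated code:
```python
# Sketch: run a .prompty file from Python with the prompty package.
import json

import prompty
import prompty.azure  # Azure OpenAI invoker: pip install "prompty[azure]"

# Use the same sample data the Prompty references as its inputs
with open("chat-3.json") as f:
    sample_inputs = json.load(f)

result = prompty.execute("chat-3.prompty", inputs=sample_inputs)
print(result)
```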
!!! quote "Congratulations! You just learned prompt engineering with Prompty!"
Let's recap what we tried:
@@ -314,7 +325,6 @@ _In this section, you saw how Prompty tooling supports rapid prototyping - start
!!! example "Next → [Let's Evaluate with AI!](./05-evaluation.md) and learn about custom evaluators!"
We didn't change the Customer and Context section, but observe how the parameters will insert the input customer name and context into the meta-prompt.