diff --git a/docs/concept-dialog.md b/docs/concept-dialog.md index 0889f5b49d..57ade761fe 100644 --- a/docs/concept-dialog.md +++ b/docs/concept-dialog.md @@ -1,115 +1,58 @@ # Dialogs -Modern conversational software is comprised of many components, including programming code, custom business logic, cloud APIs, training data for language processing systems and perhaps most importantly, the actual content used in conversations with the bot's end users. With Composer, all of these pieces are integrated with one another into a single interface for constructing blocks of bot functionality called **Dialogs**. ([SDK Docs: Bot Framework Dialogs](https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-concept-dialog?view=azure-bot-service-4.0)) +Modern conversational software has many different components, including source code, custom business logic, cloud APIs, training data for language processing systems, and perhaps most importantly, the actual content used in conversations with the bot's end users. Composer integrates all of these pieces into a single interface for constructing the building blocks of bot functionality called **Dialogs**. -Each dialog represents a piece of the bot's functionality. They contain instructions for how the bot will react to input. Simple bots will have a few dialogs. Complex bots may have dozens or hundreds of individual dialogs. +Each dialog represents a portion of the bot's functionality and contains instructions for how the bot will react to input. Simple bots will have just a few dialogs. Sophisticated bots may have dozens or hundreds of individual dialogs. -In Composer, dialogs are functional components offered in a visual interface and do not require you to write code. The dialog system supports building a pluggable and extensible model that integrates building blocks of bot functionality. Dialogs help users focus on conversation modeling rather than the mechanics of dialog management. +In Composer, dialogs are functional components offered in a visual interface that do not require you to write code. The dialog system supports building an extensible model that integrates all of the building blocks of a bot's functionality. Composer helps you focus on conversation modeling rather than the mechanics of dialog management. ## Types of dialogs -There are two types of dialogs in Composer: main dialog and child dialog. Below is a screenshot of a main dialog named `MyBot.Main` and two child dialogs named `Weather` and `Greeting`. +You create a dialog in Composer to manage a conversation objective. There are two types of dialogs in Composer: _main dialog_ and _child dialog_. The main dialog is initialized by default when you create a new bot, and it has a **.Main** file extension. You can create one or more child dialogs to keep the dialog system organized. Each bot has one main dialog and can have zero or more child dialogs. Refer to the [Create a bot](./tutorial/tutorial-introduction.md) article on how to create a bot and its main dialog in Composer. Refer to the [Add a dialog](./tutorial/tutorial-add-dialog.md) article on how to create a child dialog and wire it up in the dialog system. -![main_child_dialog](./media/dialog/main_child_dialog.png) +Below is a screenshot of a main dialog named `MyBot.Main` and two child dialogs named `Weather` and `Greeting`. -You create a dialog in Composer to manage a conversation objective. Main dialog is initialized by default when you create a new bot and it has a **.Main** file extension.
Each bot has one main dialog and can have multiple child dialogs or no child dialogs. Read the [Add a dialog](./tutorial/bot-tutorial-add-dialog.md) section to create a dialog in Composer. +![Main and child dialog](./media/dialog/main-and-child-dialog.png) -At runtime, the main dialog is called into action and becomes the active dialog, triggering event handlers with pre-defined actions. As the conversation flows, a child dialog can be called by a main dialog, and vice versa. Different child dialogs can be called with each other as well. +At runtime, the main dialog is called into action and becomes the active dialog, triggering event handlers with the actions you defined during the creation of the bot. As the conversation flows, the main dialog can call a child dialog, and a child dialog can, in turn, call the main dialog or other child dialogs. ## Anatomy of a dialog The following diagram shows the anatomy of a dialog in Composer. Note that dialogs in Composer are based on [Adaptive dialogs](https://github.com/Microsoft/BotBuilder-Samples/tree/master/experimental/adaptive-dialog#readme). -![adaptive-dialog-anatomy](./media/dialog/adaptive-dialog-anatomy.png) +![The adaptive dialog anatomy](./media/dialog/adaptive-dialog-anatomy.png) + ### Recognizer -When a dialog is called into action its **recognizer** will start to process the message and try to extract the primary **intent** and any **entity values** the message includes. After processing the message, both the **intent** and **entity values** are passed onto the dialog's triggers. Composer currently supports two recognizers: the LUIS recognizer (default) and the Regular Expression recognizer. You can choose only one recognizer per dialog, and a dialog can have no recognizer choosing the `None` type. Below is a screenshot of recognizers supported in Composer. +The recognizer interprets what the user wants based on their input. When a dialog is invoked, its **recognizer** will start to process the message and try to extract the primary [**intent**](concept-language-understanding.md#intents) and any [**entity values**](concept-language-understanding.md#entities) the message includes. After processing the message, both the **intent** and **entity values** are passed onto the dialog's triggers. Composer currently supports two recognizers: the LUIS recognizer, which is the default, and the Regular Expression recognizer. You can choose only one recognizer per dialog, or you can choose not to have a recognizer at all. Below is a screenshot of recognizers supported in Composer. -![recognizer](./media/dialog/recognizer.png) +![Recognizer](./media/dialog/recognizer.png) -**Recognizers** provide the functionality of understanding and extracting meaningful pieces of information from a user's input. All recognizers emit events when the recognizer picks up an **intent** (or extracts **entities**) from a given user utterance. A **recognizer** of a dialog is not always called into play when a dialog is called. It depends on how you design the dialog system. +**Recognizers** give your bot the ability to understand and extract meaningful pieces of information from user input. All recognizers emit events when the recognizer picks up an **intent** (or extracts **entities**) from a given user **utterance**. The **recognizer** of a dialog is not always called into play when a dialog is invoked. It depends on how you design the dialog system.
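+For example, if a dialog's LUIS recognizer is trained with language understanding data like the hypothetical `BookFlight` intent sketched below (see [Language Understanding](concept-language-understanding.md) for the .lu format), a message such as "book a flight to seattle" would be resolved to the **BookFlight** intent with a `toCity` entity value of "seattle", and both would be passed on to the dialog's triggers:
+
+    # BookFlight
+    - book a flight to {toCity=seattle}
+    - get me a ticket to {toCity=paris}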
### Trigger -The functionality of a dialog is contained within triggers - rules that tell the bot how to process incoming messages. They are also used to define a wide variety of bot behaviors, from performing the main fulfillment of the user's request, to handling [interruptions](https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-howto-handle-user-interrupt?view=azure-bot-service-4.0&tabs=csharp) like requests for help, to handling custom, developer-defined events originating from the app itself. Below is a screenshot of the trigger menu in Composer. +The functionality of a dialog is contained within triggers. Triggers are rules that tell the bot how to process incoming messages and are also used to define a wide variety of bot behaviors, from performing the main fulfillment of the user's request, to handling [interruptions](https://aka.ms/botservice-howto-handle-user-interruptions?view=azure-bot-service-4.0&tabs=csharp) like requests for help, to handling custom, developer-defined events originating from the app itself. Below is a screenshot of the trigger menu in Composer. -![trigger_menu](./media/dialog/trigger_menu.gif) +![Trigger menu](./media/dialog/trigger-menu.png) ### Action -Triggers contain a series of actions that the bot will undertake to fulfill a user's request. Actions are things like sending messages, making calculations, and performing computational tasks on behalf of the user. The path the bot follows through a dialog can branch and loop. The bot can ask questions, validate input, manipulate and store values in memory, and make decisions. Below is a screenshot of the action menu in Composer. Click the **+** sign below the trigger you can mouse over the action menu. - -![action_menu](./media/dialog/action_menu.gif) - -### Language generator - -As the bot takes actions and sends messages, the **language generator** is called into play. This allows messages sent by the bot to be composed from variables and templates. Language generators can be used to create reusable components, variable messages, macros, and dynamic messages that are grammatically correct. - - - -## Create a dialog - -When you create a bot in Composer you also create its main dialog by default. Follow the steps to create a bot project and its main dialog: - -1. On the left side of the Composer home screen, click **+ New** from the upper left corner (or the big `+` sign in the middle part of the home screen). - -![create_new_bot](./media/dialog/create_new_bot.png) - -2. After you see the pop-up window, select **Create from scratch** and click **Submit**. - -3. In the pop-up window give a name for your bot and optionally fill in a brief description and click **Next**. Leave the **Location** field as is at this time. - -![new_bot](./media/dialog/new_bot.png) - -When your bot is created successfully you will see a **.Main** dialog in the dialog navigation pane. Congratulations! You have created your first bot! Below is a screenshot of a bot named `GreetingBot` and its main dialog named `GreetingBot.Main`: - -![main_dialog](./media/dialog/main_dialog.png) +Triggers contain a series of actions that the bot will undertake to fulfill a user's request. Actions are things like sending messages, responding to user questions using a [knowledge base](./how-to-add-qna-to-bot.md), making calculations, and performing computational tasks on behalf of the user. The path the bot follows through a dialog can branch and loop. 
The bot can ask and even answer questions, validate input, manipulate and store values in memory, and make decisions. Below is a screenshot of the action menu in Composer. Select the **+** sign below the trigger to open the action menu. -> [!NOTE] -> After you create a bot a **Greeting** trigger will be created by default. It is a trigger to handle activities such as sending a welcome message. For details please read [events and triggers](concept-events-and-triggers.md). +![Action Menu](./media/dialog/action-menu.gif) -## Add a dialog +### Language Generator -After you create a bot you are also creating its main dialog by default. The main dialog is like the brain of our bot, controlling and managing the dialog system. Sometimes we find it useful to create a child dialog that contains a chunk of functionality so that our dialog system is organized and easily managed. Let's walk through a very simple example to show how to create a child dialog and wire it up to the main dialog. +As the bot takes actions and sends messages, the **Language Generator** is used to create those messages from variables and templates. Language generators can create reusable components, variable messages, macros, and dynamic messages that are grammatically correct. -1. Create a child dialog. Click **New Dialog** on the navigation pane. On the pop-up window give a name for the new dialog and optionally fill in the description and then click **Next**. - -![weather_dialog](./media/dialog/weather_dialog.png) - -After that, you will see an empty dialog on the navigation pane and a pre-configured **BeginDialog** trigger. The new dialog named `Weather` looks like the following: - -![new_weather_dialog](./media/dialog/new_weather_dialog.png) - -2. Define an action in the **BeginDialog** trigger. In the new dialog's authoring canvas, click the `+` sign under **BeginDialog** trigger node and select **Send a response**. In the Language Generation editor put such a sentence: `The weather dialog is calle with success!` - -![send_response](./media/dialog/send_response.gif) - -3. Wire up the new dialog. Click the main dialog in navigation pane and select **Greeting** trigger. In the authoring canvas, click the `+` sign under **Greeting** trigger node and select **Dialog management** and then **Begin a new dialog**. This is a dialog action that begins another dialog. When that dialog is completed, it will return to the caller. - -![begin_dialog_action](./media/dialog/begin_dialog_action.png) - -Now in the properties panel on the right side select the dialog you want to wire up from the drop-down menu. Let's select `Weather` dialog and then you will see the name of the new dialog appear in the **Begin a new dialog** action node. - -![wire_up_dialog](./media/dialog/wire_up_dialog.gif) - -When the bot runs, the pattern of this simple design is as follows: - -- The main dialog `Greeting.Main` is called at bot runtime. -- The **Greeting** trigger in the main dialog is activated and begins to execute the **Begin a new dialog** action which begins `Weather` dialog. -- When `Weather` dialog becomes active, the **BeginDialog** trigger in the child dialog is fired and send the response "The weather dialog is called with success!" to users. - -You can test the result by clicking **Start** on the upper right corner and then click **Test in Emulator**. You should be able to see the following result in the emulator: - -You can test the result by clicking **Start** on the upper right corner and then click **Test in Emulator**.
You should be able to see the following result in the emulator: - -![test_emulator](./media/dialog/test_emulator.png) ## Dialog actions -A bot will have a few dialogs or hundreds of individual dialogs and traditionally it's difficult to manage the dialog system and the conversation with user. In the previous "Add a dialog" section, we cover how to create a child dialog and wire it up to the dialog system using **Begin a new dialog** action. In fact, Composer provides more dialog actions to make it easier to manage the dialog system. You can access the different dialog actions by clicking the **+** node under a trigger and then select **Dialog management**. +A bot can have from one to several hundred dialogs, and it can get challenging to manage the dialog system and the conversation with users. In the [Add a dialog](./tutorial/tutorial-add-dialog.md) section, we covered how to create a child dialog and wire it up to the dialog system using **Begin a new dialog** action. Composer provides more dialog actions to make it easier to manage the dialog system. You can access the different dialog actions by clicking the **+** node under a trigger and then select **Dialog management**. -Below is a list of the dialog actions provided in Composer: +Below is a list of the dialog actions available in Composer: | Dialog Action | Description | | ------------------- | -------------------------------------------------------------------------------------------------------------------------------- | @@ -120,13 +63,14 @@ Below is a list of the dialog actions provided in Composer: | Repeat this Dialog | An action that repeats the current dialog with the same dialog. | | Replace this Dialog | An action that replaces the current dialog with the target dialog. | -With these dialog actions, we can easily build a pluggable and extensible dialog system without worrying about the mechanics of dialog management. +With these dialog actions, you can easily create an extensible dialog system without worrying about the complexities of dialog management. ## Further reading -[Dialogs library](https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-concept-dialog?view=azure-bot-service-4.0) +- [Dialogs library](https://aka.ms/bot-builder-concept-dialog?view=azure-bot-service-4.0) + +- [Adaptive dialogs](https://github.com/Microsoft/BotBuilder-Samples/tree/master/experimental/adaptive-dialog#readme) -[Adaptive dialogs](https://github.com/Microsoft/BotBuilder-Samples/tree/master/experimental/adaptive-dialog#readme) +## Next -## Next - [Events and triggers](./concept-events-and-triggers.md) diff --git a/docs/concept-events-and-triggers.md b/docs/concept-events-and-triggers.md index e9260f4bf8..268686d479 100644 --- a/docs/concept-events-and-triggers.md +++ b/docs/concept-events-and-triggers.md @@ -1,81 +1,105 @@ # Events and triggers -In Bot Framework Composer, each dialog includes a set of triggers (event handlers) that contain instructions for how the bot will respond to inputs received when the dialog is active. When a bot receives a message, an event of the type `activityReceived` is fired. As the message is processed by the recognizer and passes through the dialog system, other events of different types are fired. If an event handler is found to handle an incoming event, that event is considered handled, and processing of further event handlers stops. If no event handler is found, the event will pass through the bot with no additional actions taken. 
-On the navigation pane, click **New Trigger** and you will see the trigger menu in Composer as follows: +In the Bot Framework Composer, each [dialog](./concept-dialog.md) includes one or more event handlers called _triggers_. Each trigger contains one or more _actions_. Actions are the instructions that the bot will execute when the dialog receives any event that it has a trigger defined to handle. Once a given event is handled by a trigger, no further action is taken on that event. Some event handlers have a condition specified that must be met before they will handle the event; if that condition is not met, the event is passed to the next event handler. If an event is not handled in a child dialog, it gets passed up to its parent dialog to handle and this continues until it is either handled or reaches the bot's main dialog. If no event handler is found, the event will be ignored and no action will be taken. -![trigger_menu](./media/dialog/trigger_menu.gif) +To see the trigger menu in Composer, select **New Trigger** in the navigation pane. + +![Trigger menu](./media/events-triggers/trigger-menu.gif) ## Anatomy of a trigger -The basic idea behind a trigger (event handler) is "When (_event_) happens, then do (_actions_)". The trigger is a conditional test on an incoming event, while the actions are one or more programmatic steps the bot will take to fulfill the user's request. -Every trigger contains the following components: -- A trigger name that can be changed in the property panel -- Possible **Condition** (specified using [Common Language Expression](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language)), must evaluate to be "true" for the event to fire -- Actions defined to fire when the event is triggered +The basic idea behind a trigger (event handler) is "When (_event_) happens, do (_actions_)". The trigger is a conditional test on an incoming event, while the actions are one or more programmatic steps the bot will take to fulfill the user's request. + +A trigger contains the following properties: + +| Trigger property | Description | | ---------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | | Name | The trigger name can be changed in the property panel. | | Actions | The set of instructions that the bot will execute. | | Condition | The condition can be created or updated in the properties panel and is ignored if left blank; otherwise, it must evaluate to _true_ for the event to fire. Conditions must follow the [Common Expression Language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language) syntax. If the condition is ignored or evaluates to false, processing of the event continues with the next trigger. | + +A dialog can contain multiple triggers. You can view them under the specific dialog in the navigation pane. Each trigger appears as the first node in the authoring canvas. A trigger contains actions defined to be executed. Actions within a trigger occur in the context of the active dialog.
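+For example, the **Condition** field of a trigger might hold a Common Expression Language expression like the sketch below; the trigger then fires only when the expression evaluates to _true_. The `user.profile.age` property here is a hypothetical value assumed to have been stored in [memory](./concept-memory.md) earlier in the conversation:
+
+    user.profile.age >= 18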
+ +The screenshot below shows the properties of an **Intent recognized** trigger named _Cancel_ that is configured to fire whenever the _Cancel_ [intent](./concept-language-understanding.md#intents) is detected as shown in the properties panel. In this example the **Condition** field is left blank, so no additional conditions are required in order to fire this trigger. + +![Cancel trigger](./media/events-triggers/cancel-trigger.png) + +## Types of triggers + +There are different types of triggers that all work in a similar manner, and in some cases can be interchanged. This section will cover the different types of triggers and when you should use them. See the [define triggers](how-to-define-triggers.md) article for additional information. + +### Intent triggers -The screenshot below shows the definition of an **Intent** trigger that is configured to fire whenever the "cancel" intent is detected. It is possible to add a condition to the event - this expression which follows [Common Language Expression](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language), if specified, must evaluate to be "true" for the event to fire. +Intent triggers work with recognizers. There are two types of intent triggers in Composer: **Intent recognized** and **Unknown intent**. After the first round of events is fired, the bot will pass the incoming message through the recognizer. If an intent is detected, it will be passed into the trigger (event handler) with any **entities** contained in the message. If no intent is detected by the recognizer, an **Unknown intent** trigger will fire, which handles intents not handled by any trigger. -![anatomy_trigger](./media/events_triggers/anatomy_trigger.png) +You should use _intent triggers_ when you want to: -This event will appear in the dialog as a node at the top of the editor. Actions within this trigger occur in the context of the active dialog. These steps control the main functionality of a bot. +- Trigger major features of your bot using natural language. +- Recognize common interruptions like "help" or "cancel" and provide context-specific responses. +- Extract and use entity values as parameters to your dialog. -![cancel_trigger](./media/events_triggers/cancel_trigger.png) +For additional information see how to define an [Intent recognized](how-to-define-triggers.md#intent-recognized) trigger or an [Unknown intent](how-to-define-triggers.md#unknown-intent) trigger in the article titled _How to define triggers_. -## Types of triggers -There are different types of triggers. They all work in a similar manner, and in some cases, can be interchanged. This section will cover the different types of triggers and when should we use them. Read more to learn how to [define triggers](how-to-define-triggers.md). +### Dialog events -### Dialog events -The base type of triggers are dialog triggers. Almost all events start as dialog events which are related to the "lifecycle" of the dialog. Currently there are four different dialog triggers in Composer: **Dialog started (Begin dialog event)**, **Dialog cancelled (Cancel dialog event)**, **Error occurred(Error event)** and **Re-prompt for input(Reprompt dialog event)**. Most dialogs will include a trigger configured to respond to the `BeginDialog` event, which fires when the dialog begins and allows the bot to respond immediately. +The base type of triggers are dialog triggers. Almost all events start as dialog events which are related to the "lifecycle" of the dialog. 
Currently there are four different dialog events triggers in Composer: -Use dialog triggers when you want to: -- Take actions immediately when the dialog starts, even before the recognizer is called -- Take actions when a "cancel" signal is detected -- Take automatic action on every message as it is received or sent -- Evaluate the raw content of the incoming activity +- **Dialog started (Begin dialog event)** +- **Dialog cancelled (Cancel dialog event)** +- **Error occurred(Error event)** +- **Re-prompt for input(Reprompt dialog event)** -See how to define dialog events [here](how-to-define-triggers.md#dialog-events). +Most dialogs include a trigger configured to respond to the `BeginDialog` event, which fires when the dialog begins. This allows the bot to respond immediately. -### Intent triggers -Intent triggers work with recognizers. There are two intent triggers in Composer: **Intent recognized** and **Unknown intent**. After the first round of events is fired, the bot will pass the incoming activity through the configured recognizer. If an intent is detected, it will be passed onto the matching handler along with any **entity values** the message contains. If an intent is not detected by the recognizer, any configured **Unknown intent** trigger will fire. This will only fire if no matching intent handler is found. **Unknown intent** handles any intent that is not handled by a trigger. +You should use _dialog triggers_ to: -Use intent triggers when you want to: -- Trigger major features of your bot using natural language -- Recognize common interruptions like "help" or "cancel" and provide context-specific responses -- Extract and use entity values as parameters to your dialog or a child dialog +- Take actions immediately when the dialog starts, even before the recognizer is called. +- Take actions when a "cancel" signal is detected. +- Take actions on messages received or sent. +- Evaluate the content of the incoming activity. -See how to define an **Intent recognized** trigger [here](how-to-define-triggers.md#intent-recognized) and how to define an **Unknown intent** trigger [here](how-to-define-triggers.md#unknown-intent). +For additional information, see the [dialog events](how-to-define-triggers.md#dialog-events) section of the article on how to define triggers. -### Activities -Activity trigger is used to handle activities such as when a user joins and the bot begins a new conversation. **Greeting (ConversationUpdate activity)** is a trigger of this type and you can use it to send a greeting message. When you create a new bot, the **Greeting (ConversationUpdate activity)** trigger is initialized by default in the main dialog. This specialized option is provided to avoid handling an event with a complex condition attached. **Message events** is a type of Activity trigger to handle message activities. +### Activities -Use **Activities** trigger when you want to: -- Take actions when a user begins a new conversation with the bot -- Take actions on receipt of an activity with type `EndOfConversation` -- Take actions on receipt of an activity with type `Event` -- Take actions on receipt of an activity with type `HandOff` -- Take actions on receipt of an activity with type `Invoke` -- Take actions on receipt of an activity with type `Typing` +Activity triggers are used to handle activities such as when a new user joins and the bot begins a new conversation. **Greeting (ConversationUpdate activity)** is a trigger of this type and you can use it to send a greeting message. 
When you create a new bot, the **Greeting (ConversationUpdate activity)** trigger is initialized by default in the main dialog. This specialized option is provided to avoid handling an event with a complex condition attached. **Message events** is a type of Activity trigger to handle message activities. -Use **Message events** when you want to: -- Take actions when a message is updated (on receipt of an activity with type `MessageUpdate`) -- Take actions when a message is deleted (on receipt of an activity with type `MessageDelete`) -- Take actions when a message is reacted (on receipt of an activity with type `MessageReaction`). +You should use **Activities** trigger when you want to: -See how to define an **Activities** trigger [here](how-to-define-triggers.md#activities). +- Take actions when a user begins a new conversation with the bot. +- Take actions on receipt of an activity with type `EndOfConversation`. +- Take actions on receipt of an activity with type `Event`. +- Take actions on receipt of an activity with type `HandOff`. +- Take actions on receipt of an activity with type `Invoke`. +- Take actions on receipt of an activity with type `Typing`. + +For additional information, see [Activities](how-to-define-triggers.md#activities) trigger in the article titled _How to define triggers_. + +### Message events + +**Message event** triggers allow you to react to the different message events such as when a message is updated or deleted or when someone reacts to a message (for example, some of the common message reactions include a Like, Heart, Laugh, Surprised, Sad and Angry reactions). + +You should use **Message events** when you want to: + +- Take actions when a message is updated (on receipt of an activity with type `MessageUpdate`). +- Take actions when a message is deleted (on receipt of an activity with type `MessageDelete`). +- Take actions when a message is reacted (on receipt of an activity with type `MessageReaction`). ### Custom event -**Custom event** is a trigger to handle **Emit a custom event**. Bots can emit user-defined events using **Emit a custom event** which will fire this trigger. If you define an **Emit a custom event** and it fires, any **Custom event** in any level of dialogs will catch it and trigger corresponding actions. -Use **Custom event** when you want to: -- handle a pre-defined **Emit a custom event** +You can create and emit your own events by creating an action associated with any trigger, then you can handle that custom event in any dialog in your bot by defining a **Custom event** event trigger. + +Bots can emit your user-defined events using **Emit a custom event**. If you define an **Emit a custom event** and it fires, any **Custom event** in any dialog will catch it and execute the corresponding actions. -See how to define a **Custom event** [here](how-to-define-triggers.md#custom-event). +For additional information, see [Custom event](how-to-define-triggers.md#custom-event) in the article titled _How to define triggers_. 
## Further reading + - [Adaptive dialog: Recognizers, rules, steps and inputs](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/adaptive-dialog/docs/recognizers-rules-steps-reference.md#Rules) - [.lu format file](https://github.com/microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md) - [RegEx recognizer and LUIS recognizer](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/adaptive-dialog/docs/recognizers-rules-steps-reference.md#regex-recognizer) -## Next -- Learn [conversation flow and memory](./concept-memory.md) -- Learn [how to define triggers](how-to-define-triggers.md) +## Next + +- [conversation flow and memory](./concept-memory.md) +- [how to define triggers](how-to-define-triggers.md) diff --git a/docs/concept-language-generation.md b/docs/concept-language-generation.md index 53ff9e57bb..bc4c578d9e 100644 --- a/docs/concept-language-generation.md +++ b/docs/concept-language-generation.md @@ -1,87 +1,84 @@ # Language Generation -Language Generation (LG) enables you to define multiple variations on a phrase, execute simple expressions based on context, and refer to conversational memory. At the core of language generation lies template expansion and entity substitution. You can provide one-of variation for expansion as well as conditionally expand a template. The output from language generation can be a simple text string or multi-line response or a complex object payload that a layer above language generation will use to construct a complete [activity](https://github.com/microsoft/botframework-sdk/blob/master/specs/botframework-activity/botframework-activity.md). Bot Framework Composer natively supports language generation to produce output activities using the LG templating system. -You can use language generation to: -- achieve a coherent personality, tone of voice for your bot -- separate business logic from presentation -- include variations and sophisticated composition based resolution for any of your bot's replies -- construct cards, suggested actions and attachments using [structured response template](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md) +Language Generation (LG) enables you to define multiple variations of a phrase, execute simple expressions based on context, and refer to conversational memory. At the core of language generation lies template expansion and entity substitution. You can provide one-off variation for expansion as well as conditionally expanding a template. The output from language generation can be a simple text string or multi-line response or a complex object payload that a layer above language generation will use to construct a complete [activity](https://github.com/microsoft/botframework-sdk/blob/master/specs/botframework-activity/botframework-activity.md). The Bot Framework Composer natively supports language generation to produce output activities using the LG templating system. -Language generation is achieved through: -- markdown based [.lg file format](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md) that describes the templates and their composition -- full access to current bot's memory so you can data bind language to the state of memory -- parser and runtime libraries that help achieve runtime resolution +You can use Language Generation to: +- Achieve a coherent personality, tone of voice for your bot. 
+- Separate business logic from presentation. +- Include variations and sophisticated composition for any of your bot's replies. +- Construct cards, suggested actions and attachments using a [structured response template](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md). -## Templates -Templates are functions which return one of the variations of the text but fully resolve any other references to template for composition. You can define one or more text response in a template. For multiple responses, one response will be picked by random. You can also define one or more expressions (using the [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language)) so when it is a conditional template, those expressions control which particular collection of variations get picked. Templates can be parameterized meaning that different callers to the template can pass in different values for use in expansion resolution. Read details in [.lg file format](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md). +Language Generation is achieved through: +- A markdown-based [.lg file](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md) that contains the templates and their composition. +- Full access to the current [bot's memory](concept-memory.md) so you can data bind language to the state of memory. +- Parser and runtime libraries that help achieve runtime resolution. -### Template types -Composer currently supports three different types of templates: -- Simple response template -- Conditional response template -- Structured response template (read more [here](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md)) +You can read more about Language Generation [here](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/language-generation). + +## Templates + +Templates are functions which return one of the variations of the text and fully resolve any references to other templates for composition. You can define one or more text responses in a template. When multiple responses are defined in the template, a single response will be selected at random. + +You can also define one or more expressions using the [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language), so when it is a conditional template, those expressions control which particular collection of variations get picked. Templates can be parameterized, meaning that different callers to the template can pass in different values for use in expansion resolution. For additional information, see the [.lg file format](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md). + +Composer currently supports three types of templates: [simple response template](#simple-response-template), [conditional response template](#conditional-response-template), and [structured response template](#structured-response-template). You can read [this section](#define-lg-templates) and learn how to define each of them. + +You can break the Language Generation templates into separate files and refer to them from one another.
You can use markdown-style links to import templates defined in another file, for example, `[description text](file/uri path)`. Make sure your template names are unique across files. ### Anatomy of a template -A template usually consists of a name of the template with `#` and one of the following parts: - -- a list of one-of variation text values defined using "-" -- a collection of conditions, each with a - - condition expression which is expressed using [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme) and - - list of one-of variation text values per condition -- a structure that contains +A template usually consists of the name of the template with `#` and one of the following parts: + +- A list of one-off variation text values defined using "-" +- A collection of conditions, each with a: + - conditional expression which is expressed using [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme) and + - list of one-off variation text values per condition +- A structure that contains: - structure-name - properties -Below is an example of a simple [.lg template](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/language-generation) with one-of variation text values. +Below is an example of a simple [.lg template](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/language-generation) with one-off variation text values. > this is a comment # nameTemplate - Hello @{user.name}, how are you? - Good morning @{user.name}. It's nice to see you again. - Good day @{user.name}. What can I do for you today? - -## Define LG template +## Define LG templates -### User scenario +When you want to determine how your bots should respond to user input, you can define LG templates to generate responses. For example, you can define a welcome message to the user in the **Send a response** action. To do this, select the **Send a response** action node, you will see the inline LG editor where you can define LG templates. -When you want to determine how your bots should respond to users, you can define LG templates to generate responses. For example, you can define a welcome message to the user in the **Send a response** action. Click on the **Send a response** action node, you will see the inline LG editor where you can define LG templates. - -### What to know To define LG templates in Composer, you will need to know - - Supported concepts of LG + - Supported concepts of LG mentioned above. - [.lg file format](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md) - - [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme) - -### Where to define -You can define LG template in two places: inline LG editor and LG all-up view (**Bot Responses**) that lists all templates. Below is a screenshot of LG inline editor. + - [Common Expression Language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme) -![inline_editor](./media/language_generation/inline_editor.png) +You can define LG templates in two places: The inline LG editor and **Bot Responses** that lists all templates. Below is a screenshot of LG inline editor. 
-On the navigation pane click **Bot Responses** icon (or the bot icon when collapsed), you will see the all-up LG editor listing all LG templates defined in the bot. Toggle **Edit Mode** on the upper right corner to start editing your LG template. +![LG inline editor](./media/language-generation/lg-inline-editor.png) -![bot_responses](./media/language_generation/bot_responses.png) +Select the **Bot Responses** icon (or the bot icon when collapsed) in the navigation pane to see the LG editor listing all LG templates defined in the bot. Toggle **Edit Mode** on the upper right corner to edit your LG template. -### How to define -Composer currently supports definition of the following three types of templates: simple template, conditional template and structured template. Let's walk through each type of them. +![Bot Responses](./media/language-generation/Bot-Responses.png) -#### Simple template -A simple template is defined to generate a simple text response. A simple template can be a single-line response, text with memory, or a response of multiline text, etc. You will need to use a `-` before a response text or an expression with returned property value. +Composer currently supports definitions of the following three types of templates: simple template, conditional template and structured template. -Here is an example of a single line text response from the [RespondingWithTextSample](https://github.com/microsoft/BotFramework-Composer/tree/master/Composer/packages/server/assets/projects/RespondingWithTextSample): +### Simple response template + +A simple template is defined to generate a simple text response. A simple template can be a single-line response, text with memory, or a response of multiline text, etc. You will need to use a `-` before a response text or an expression with returned property value. Here are a few examples of simple response template from the [RespondingWithTextSample](https://github.com/microsoft/BotFramework-Composer/tree/master/Composer/packages/server/assets/projects/RespondingWithTextSample). + +Here is an example of a single line text response: - Here is a simple text message. -This is an example of a single line expression response with memory from the [RespondingWithTextSample](https://github.com/microsoft/BotFramework-Composer/tree/master/Composer/packages/server/assets/projects/RespondingWithTextSample): +This is an example of a single line response using a variable: - @{user.message} -A multiline response include multiline text enclosed in `...`. Here is an example response from the [RespondingWithTextSample](https://github.com/microsoft/BotFramework-Composer/tree/master/Composer/packages/server/assets/projects/RespondingWithTextSample). +> [!NOTE] +> Variables and expressions are enclosed in curly brackets - @{} + +Here is an example of a multi-line response. It includes multiple lines of text enclosed in ` ``` `. # multilineText - ``` you have such alarms @@ -89,8 +86,8 @@ A multiline response include multiline text enclosed in `...`. Here is an exampl alarm2: 9:pm ``` -#### Conditional template -For all conditional templates, all conditions are expressed in [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme) and condition expressions are enclosed in curly brackets. Here are two [conditional template examples](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md#conditional-response-template). 
+### Conditional response template +For all conditional templates, all conditions are expressed in the [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme) and condition expressions are enclosed in curly brackets. Here are two [conditional template examples](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md#conditional-response-template) examples of a conditional response. IF...ELSE @@ -112,10 +109,10 @@ SWITCH...CASE - DEFAULT: - final output -#### Structured template -[Structured response template](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md) enables users to define a complex structure that supports all the benefits of LG (templating, composition, substitution) while leaving the interpretation of the structured response up to the caller of the LG library. It provides an easier way to define a full blown outgoing [activity](https://github.com/Microsoft/botframework-sdk/blob/master/specs/botframework-activity/botframework-activity.md) in a simple text format. Composer currently support structured LG templates such as Cards, SuggestedActions and other [Chatdown](https://github.com/microsoft/botbuilder-tools/tree/master/packages/Chatdown) style constructs. +### Structured response template +[Structured response templates](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md) enable users to define a complex structure that supports all the benefits of LG (templating, composition, substitution) while leaving the interpretation of the structured response up to the bot developer. It provides an easier way to define a full blown outgoing [activity](https://github.com/Microsoft/botframework-sdk/blob/master/specs/botframework-activity/botframework-activity.md) in a simple text format. Composer currently support structured LG templates such as Cards, SuggestedActions and other [Chatdown](https://github.com/microsoft/botbuilder-tools/tree/master/packages/Chatdown) style constructs. -The definition of a [structured response template](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md) is as follows: +The definition of a structured response template is as follows: # TemplateName > this is a comment @@ -126,11 +123,11 @@ The definition of a [structured response template](https://github.com/microsoft/ Property3 = Nested structures are achieved through composition ] -Below is an example of SuggestedActions from the [InterruptionSample](https://github.com/microsoft/BotFramework-Composer/tree/master/Composer/packages/server/assets/projects/InterruptionSample): +Below is an example of SuggestedActions from the [Interruption Sample](https://github.com/microsoft/BotFramework-Composer/tree/master/Composer/packages/server/assets/projects/InterruptionSample): - Hello, I'm the interruption demo bot! 
\n \[Suggestions=Get started | Reset profile] -Below is another example of Thumbnail card from the [RespondingWithCardsSample](https://github.com/microsoft/BotFramework-Composer/tree/master/Composer/packages/server/assets/projects/RespondingWithCardsSample): +Below is another example of a Thumbnail card from the [Responding With Cards Sample](https://github.com/microsoft/BotFramework-Composer/tree/master/Composer/packages/server/assets/projects/RespondingWithCardsSample): # ThumbnailCard [ThumbnailCard @@ -141,19 +138,19 @@ Below is another example of Thumbnail card from the [RespondingWithCardsSample]( image = https://sec.ch9.ms/ch9/7ff5/e07cfef0-aa3b-40bb-9baa-7c9ef8ff7ff5/buildreactionbotframework_960.jpg buttons = Get Started] -For more information on structured template please read the [structured response template](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md) article. For more examples of structured templates, please refer to [Example 1](https://github.com/microsoft/botbuilder-dotnet/blob/master/tests/Microsoft.Bot.Builder.LanguageGeneration.Tests/Examples/StructuredTemplate.lg) and [Example 2](https://github.com/microsoft/botbuilder-dotnet/blob/master/tests/Microsoft.Bot.Builder.Dialogs.Adaptive.Templates.Tests/lg/NormalStructuredLG.lg). +For more examples of structured response templates, see [StructuredTemplate.lg](https://github.com/microsoft/botbuilder-dotnet/blob/master/tests/Microsoft.Bot.Builder.LanguageGeneration.Tests/Examples/StructuredTemplate.lg) and [NormalStructuredLG.lg](https://github.com/microsoft/botbuilder-dotnet/blob/master/tests/Microsoft.Bot.Builder.Dialogs.Adaptive.Templates.Tests/lg/NormalStructuredLG.lg) in GitHub. -### Common expression language cheatsheet - -| Symbol | Description | -| ------ | -------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| # | Template definition symbol | -| - | Variation | -| \ | Escape character | -| @ | A prefix character to signify need expression evaluation when in multi-line response | -| {} | Used for all expressions. Templates are also functions so {templateName()} is valid and supported. | +### Common expression language cheat sheet + +| Symbol | Description | +| ------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| # | Template definition symbol | +| - | Variation | +| \ | Escape character | +| @ | A prefix character to signify need expression evaluation when in multi-line response | +| {} | Used for all expressions. Templates are also functions so {templateName()} is valid and supported. | | () | Used to denote parameters to a function or to a template. E.g {templateName(‘value1’, ‘value2’)} or to a prebuilt function {length(greeting)} or {length(‘value1’)} | -| ``` | Used in pair to denote multi-line segment. |``` +| ``` | Used in pair to denote multi-line segment. 
| ## References - [Language generation preview](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/language-generation) @@ -161,8 +158,6 @@ For more information on structured template please read the [structured response - [.lg API reference](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/api-reference.md) - [Common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme) - [Structured response template](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md) -- [Structured template example1](https://github.com/microsoft/botbuilder-dotnet/blob/master/tests/Microsoft.Bot.Builder.LanguageGeneration.Tests/Examples/StructuredTemplate.lg) -- [Structured template example2](https://github.com/microsoft/botbuilder-dotnet/blob/master/tests/Microsoft.Bot.Builder.Dialogs.Adaptive.Templates.Tests/lg/NormalStructuredLG.lg) ## Next - [Linting and validation](./how-to-use-validation.md) diff --git a/docs/concept-language-understanding.md b/docs/concept-language-understanding.md index 443d8690e5..afd0b8c70f 100644 --- a/docs/concept-language-understanding.md +++ b/docs/concept-language-understanding.md @@ -1,188 +1,154 @@ # Language Understanding -Language Understanding (LU) is used by the bot to understand language naturally and contextually to determine what next to do in a conversation flow. In Bot Framework Composer,the process is achieved through setting up recognizers and providing training data in the dialog so that any **intents** and **entities** contained in the message can be captured. These values will then be passed on to triggers which define how bots will respond with appropriate actions. -In Bot Framework Composer LU has the following characteristics: +Language Understanding (LU) is used by a bot to understand language naturally and contextually to determine what next to do in a conversation flow. In the Bot Framework Composer, the process is achieved through setting up recognizers and providing training data in the dialog so that the **intents** and **entities** contained in the message can be captured. These values will then be passed on to triggers which define how the bot responds using the appropriate actions. -- LU content can be authored in inline editor using the [.lu file format](https://github.com/Microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md). -- LU content is training data for recognizers. -- Composer currently supports LU technologies such as LUIS and Regular Expression. -- Composer provides an all-up LU view in **User Responses**. +LU has the following characteristics when used in the Bot Framework Composer: -## Core LU concepts in Composer -### Intents -Intents are categories or classifications of user intentions. An intent represents an action the user wants to perform. The intent is a purpose or goal expressed in a user's input, such as booking a flight, paying a bill, or finding a news article. You define and name intents that correspond to these actions. A travel app may define an intent named "BookFlight." +- LU content can be authored in an inline editor or in **User Input** using the [.lu file format](https://github.com/Microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md). +- LU content is training data for recognizers. 
+- Composer currently supports LU technologies such as LUIS and Regular Expression. -Here's a simple .lu file that captures a simple **Greeting** intent with a list of example utterances that capture different ways users will express this intent. You can use - or + or * to denote lists. Numbered lists are not supported. +## Core LU concepts in Composer - # Greeting - - Hi - - Hello - - How are you? +### Intents -`#` describes a new intent definition section. Each line after the intent definition are example utterances that describe that intent. You can stitch together multiple intent definitions in a language understanding editor in Composer. Each section is identified by `#` notation. Blank lines are skipped when parsing the file. +Intents are categories or classifications of user intentions. An intent represents an action the user wants to perform. The intent is a purpose or goal expressed in the user's input, such as booking a flight, paying a bill, or finding a news article. You define and name intents that correspond to these actions. A travel app may define an intent named "BookFlight." + +Here's a simple .lu file that captures a simple **Greeting** intent with a list of example utterances that capture different ways users will express this intent. You can use - or + or \* to denote lists. Numbered lists are not supported. + + # Greeting + - Hi + - Hello + - How are you? + +`#` describes a new intent definition section. Each line after the intent definition are example utterances that describe that intent. You can stitch together multiple intent definitions in a language understanding editor in Composer. Each section is identified by `#` notation. Blank lines are skipped when parsing the file. To define and use intents in Composer, you will need to: -- setup **LUIS** as recognizer type -- specify intent(s) and example utterances in [.lu file format](https://github.com/microsoft/botbuilder-tools/edit/master/packages/Ludown/docs/lu-file-format.md) as mentioned above -- create an **Intent** trigger to handle each pre-defined intent -- publish the training data to LUIS +1. Setup **LUIS** as recognizer type. +2. Specify intent(s) and example utterances in [.lu file format](https://github.com/microsoft/botbuilder-tools/edit/master/packages/Ludown/docs/lu-file-format.md) as mentioned above. +3. Create an **Intent** trigger to handle each pre-defined intent. +4. Publish the training data to LUIS. > [!NOTE] -> Please read details of how to define intents with LUIS recognizer and Regular Expression recognizer [here](how-to-define-triggers.md#intent). +> For additional information on defining intents with the LUIS recognizer and Regular Expression recognizer refer to the [Defining triggers](how-to-define-triggers.md#intent) article. + +### Utterances -### Utterances -Utterances are input from users and may have a lot of variations. Since utterances are not always well formed we need to provide example utterances for specific intents to train our bots to recognize intents from different utterances. By doing so, our bots will have some "intelligence" to understand human languages. +Utterances are input from users and may have a lot of variations. Since utterances are not always well-formed, we need to provide example utterances for specific intents to train bots to recognize intents from different utterances. By doing so, your bots will have some "intelligence" to understand human languages. -In Composer, utterances are always captured in a markdown list and followed by an intent. 
For example, the **Greeting** intent with some example utterances are shown in the above section [here](concept-language-understanding.md#intents).
+In Composer, utterances are always captured in a markdown list that follows an intent. For example, the **Greeting** intent and some of its example utterances are shown in the [Intents](concept-language-understanding.md#intents) section above.
-You may have noticed that LU format is very similar to LG format but they are different.
+You may have noticed that LU format is very similar to LG format but there are some key differences.
- LU is for bots to understand user's input (primarily capture **intent** and more)
- LU is associated with recognizers (LUIS/Regular Expression)
-- LG is for bots to respond to users as outputs
-- LG is associated with language generator
+- LG is for bots to respond to users as output
+- LG is associated with a language generator
### Entities
-Entities are a collection of objects data extracted from an utterance such as places, time, and people. Entities and intents are both important data extracted from utterances, but they are different. An intent indicates what the user is trying to do. An utterance may include several entities or no entity, while an utterance usually represents one intent. In Composer, all entities are defined and managed inline. Entity in the [.lu file format](https://github.com/microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md) is denoted using {\=\} notation as follows:
+
+Entities are a collection of objects, each consisting of data extracted from an utterance such as places, time, and people. Entities and intents are both important data extracted from utterances, but they are different. An intent indicates what the user is trying to do. An utterance may include zero or more entities, while an utterance usually represents one intent. In Composer, all entities are defined and managed inline. Entities in the [.lu file format](https://github.com/microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md) are denoted using the `{<entityName>=<labeled value>}` notation. For example:
    # BookFlight
    - book a flight to {toCity=seattle}
    - book a flight from {fromCity=new york} to {toCity=seattle}
-The example above shows the definition of a `BookFlight` intent with two example utterances and two entity definitions: `toCity` and `fromCity`. When triggered, if LUIS is able to identify a destination city, the city name will be made available as `@toCity` within the triggered actions or a departure city with `@fromCity` as available entity values. The entity values can be used directly in expressions and LG templates, or [stored into a memory property](concept-memory.md) for later use. Read [here](how-to-define-advanced-intents-entities.md) for advanced intents and entities definition.
+The example above shows the definition of a `BookFlight` intent with two example utterances and two entity definitions: `toCity` and `fromCity`. When triggered, if LUIS is able to identify a destination city, the city name will be made available as `@toCity` within the triggered actions or a departure city with `@fromCity` as available entity values. The entity values can be used directly in expressions and LG templates, or stored into a property in [memory](concept-memory.md) for later use. For additional information on entities, see the article [advanced intents and entities](how-to-define-advanced-intents-entities.md).
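+For illustration only (the response text below is a sketch, not part of the sample above), a **Send a response** action could reference the recognized entities using the `@entityName` shorthand:
+
+```
+- Booking a flight from {@fromCity} to {@toCity}.
+- OK, {@toCity} it is. When would you like to leave {@fromCity}?
+```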
+ +### Example -### Example -To better understand intents, entities and utterances, we provide some examples in the table below. All the three utterances share the same intent _BookFlight_ and with different entities. There are different types of entities, see details [here](https://github.com/microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md). +The table below shows an example of an intent with its corresponding utterances and entities. All three utterances share the same intent _BookFlight_ each with a different entity. There are different types of entities, you can find more information in this article on the [LU file format](https://github.com/microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md). -| Intent | Utterances | Entity | +| Intent | Utterances | Entity | | ---------- | --------------------------------------------- | ----------------------- | | BookFlight | "Book me a flight to London" | "London" | | | "Fly me to London on the 31st" | "London", "31st" | | | "I need a plane ticket next Sunday to London" | "next Sunday", "London" | -Below is a similar definition of a _BookFlight_ intent with entity specification `{city=name}` and a set of example utterances. We use this example to show how they are manifested in Composer. Extracted entities are passed along to any triggered actions or child dialogs using the syntax `@city`. +Below is a similar definition of a _BookFlight_ intent with entity specification `{city=name}` and a set of example utterances. We use this example to show how they are manifested in Composer. Extracted entities are passed along to any triggered actions or child dialogs using the syntax `@city`. ``` # BookFlight - book a flight to {city=austin} - travel to {city=new york} -- i want to go to {city=los angeles} +- I want to go to {city=los angeles} ``` -After publishing, LUIS will be able to identify a city as entity and the city name will be made available as `@city` within the triggered actions. The entity value can be used directly in expressions and LG templates, or [stored into a memory property](concept-memory.md) for later use. - -Based on the training data above, the JSON view of the query "book me a flight to London" in LUIS app looks like this: - -```json -{ - "query": "book me a flight to london", - "prediction": { - "normalizedQuery": "book me a flight to london", - "topIntent": "BookFlight", - "intents": { - "BookFlight": { - "score": 0.9345866 - } - }, - "entities": { - "city": [ - "london" - ], - "$instance": { - "city": [ - { - "type": "city", - "text": "london", - "startIndex": 20, - "length": 6, - "score": 0.834206, - "modelTypeId": 1, - "modelType": "Entity Extractor", - "recognitionSources": [ - "model" - ] - } - ] - } - } - } -} -``` +After publishing, LUIS will be able to identify a city as entity and the city name will be made available as `@city` within the triggered actions. The entity value can be used directly in expressions and LG templates, or stored into a property in [memory](concept-memory.md) for later use. Read [here](how-to-define-advanced-intents-entities.md) for advanced intents and entities definition. -## Author LU content in Composer -### User scenario -To enable your bot to understand user's input contextually and conversationally so that your bot can decide how to respond to different user inputs, you should author LU as training data. 
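+For illustration only (the property name below is hypothetical), a **Set a property** action could then copy the recognized `@city` value into memory for use in later turns:
+
+```
+Property: user.destinationCity
+Value:    @city
+```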
+
+## Author LU content in Composer
-### What to know
-To author proper LU content in Composer, you need to know
-  - LU concepts
-  - [.lu file format](https://github.com/Microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md)
-  - [Common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme)
+To enable your bot to understand user input contextually and conversationally, so that it can decide how to respond to different user inputs, you should author LU content as training data.
-### How to author
-To create the LU content, follow these steps:
+To author proper LU content in Composer, you need to know:
+
+- [LU concepts](https://aka.ms/botbuilder-luis-concept?view=azure-bot-service-4.0)
+- [.lu file format](https://github.com/Microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md)
+- [Common Expression Language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme)
-- set up a **Recognizer** for a specific dialog (per dialog per recognizer)
-- author LU content as training data in [.lu format](https://github.com/Microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md)
-- create **Intent** triggers to wire up the LU content
-- publish LU content (for LUIS)
To create the LU content, follow these steps:
-#### Set up a recognizer
-Composer currently support two types of recognizers: LUIS (by default) and Regular Expression. Before setting up a recognizer type, you need to select the dialog where you want to customize your LU content. For example, let's select the main dialog and then set up LUIS as recognizer type.
+- Set up a **Recognizer** for a specific dialog (per dialog per recognizer).
+- Author LU content as training data in [.lu format](https://github.com/Microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md).
+- Create **Intent** triggers to wire up the LU content.
+- Publish LU content (for LUIS).
-1. Go to your bot's navigation pane on the left side and select the main dialog.
### Set up a recognizer
-![select_dialog](./media/language_understanding/select_dialog.png)
+Composer currently supports two types of recognizers: LUIS (by default) and Regular Expressions. This article focuses solely on the LUIS recognizer. Before setting up a recognizer type, you need to select the dialog you will be using for this purpose. In this example, you will use the main dialog to set up LUIS as the recognizer type.
-2. When you see the Language Understanding editor on the right side, select **LUIS** as its **Recognizer Type**.
+1. Select the main dialog in the navigation pane. Then you will see the **Language Understanding** section in the bot's properties panel on the right side of the Composer window. Select **LUIS** from the **Recognizer Type** drop-down list.
+   ![select-recognizer](./media/language-understanding/select-recognizer.png)
-![luis](./media/language_understanding/luis.png)
#### Author LU content
-#### Author LU content
After you set up the recognizer type, you can customize your LU content in the editor using the [.lu format](https://github.com/Microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md).
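+For reference, the LU content used in this example might look something like the following sketch (the exact utterances shown in the screenshot that follows may differ):
+
+```
+# Greeting
+- hi
+- hello
+- good morning
+
+# CheckWeather
+- what is the weather today
+- will it rain tomorrow
+- how is the weather in {city=seattle}
+```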
-For example, let's define two intents: **Greeting** and **CheckWeather** with some example utterances inline: +For this example define two intents: **Greeting** and **CheckWeather** with some example utterances inline: -![intents](./media/language_understanding/intents.gif) +![LU content](./media/language-understanding/LU-content.png) -#### Wire up LU with **Intent recognized** trigger -After you define the intents with example utterances, you need to create **Intent recognized** triggers in the dialog to handle each intent. In the **Intent recognized** trigger you can define the actions to take when an intent is recognized. +### Wire up LU content -1. Go to your bot's navigation pane on the left side and select **New Trigger** in the dialog you wish you create the trigger. +After you define the intents with example utterances, you need to create **Intent recognized** triggers in the dialog to handle each intent by defining the actions to take when an intent is recognized. -![new_trigger](./media/language_understanding/new_trigger.png) +1. Go to your bot's navigation pane on the left side and select **New Trigger**. -2. In the `Create a trigger` pop-up window, select **Intent recognized** as the type of trigger. Pick the intent you want to handle from the drop-down menu and then click **Submit**. +2. In the `Create a trigger` pop-up window, select **Intent recognized** as the type of trigger. Pick the intent you want to handle from the drop-down menu and select **Submit**. -![wireup_intent](./media/language_understanding/wireup_intent.png) + ![Wireup intent](./media/language-understanding/wireup-intent.png) -Click "User Input" icon on the left navigation pane to get an all-up view. +3. Click the **User Input** icon in Composer menu. You will see a list of all the LU content you have authored along with details such as which dialog the content is associated with and whether it is published or not. -![user_say](./media/language_understanding/user_say.png) + ![User Input](./media/language-understanding/user-input.png) -The all-up view lists all LU content you have authored and some details such as which dialog you define the content and whether it is published or not. +### Publish LU -![all_up_view](./media/language_understanding/all_up_view.png) +Now the last step is to publish your LU content to LUIS. -#### Publish LU +1. Select **Start Bot** on the upper right corner of the Composer. -Now the last step is to publish your LU content to LUIS. +2. Fill in your **LUIS Primary key** and select **OK**. -Click **Start Bot** on the upper right corner of your Composer, fill in your LUIS authoring key and click **Publish**. If you do not have a LUIS account, you need to apply one first from [here](https://www.luis.ai/home). If you have a LUIS account but do not know how to find your LUIS authoring key please read [here](https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-keys?tabs=V2#programmatic-key). + > [!Note] + > If you do not have a LUIS account, you can get one on the [LUIS](https://www.luis.ai/home). If you have a LUIS account but do not know how to find your LUIS primary key please see the [Azure resources for LUIS](https://aka.ms/LUIS-Authoring-Keys#programmatic-key) section of the _Authoring and runtime keys_ article. -![publish_luis](./media/add_luis/publish_luis.png) +3. Select **OK**. + ![Publish LU](./media/language-understanding/publish-lu.png) -Any time you hit **Start Bot** (or **Restart Bot**), Composer will evaluate if your LU content has changed. 
If so Composer will automatically make required updates to your LUIS applications, train and publish them. If you go to your LUIS app website, you will find the newly published LU model. +Any time you select **Start Bot** (or **Restart Bot**), Composer will evaluate if your LU content has changed. If so Composer will automatically make the required updates to your LUIS applications then train and publish them. If you go to your LUIS app website, you will find the newly published LU model. ## References -- [What is LUIS](https://docs.microsoft.com/en-us/azure/cognitive-services/luis/what-is-luis) -- [Language Understanding](https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-concept-luis?view=azure-bot-service-4.0) + +- [What is LUIS](https://aka.ms/luis-what-is-luis) +- [Language Understanding](https://aka.ms/botbuilder-luis-concept?view=azure-bot-service-4.0) - [.lu file format](https://github.com/Microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md) -- [Common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme) +- [Common Expression Language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme) - [Using LUIS for language understanding](https://github.com/microsoft/BotFramework-Composer/blob/kaiqb/Ignite2019/docs/howto-using-LUIS.md) -- [Extract data from utterance text with intents and entities](https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-data-extraction?tabs=V2) +- [Extract data from utterance text with intents and entities](https://aka.ms/luis-concept-data-extraction?tabs=V2) + +## Next -## Next - Learn how to [use validation](./how-to-use-validation.md) - Learn how to [send messages to users](how-to-send-messages.md) diff --git a/docs/concept-memory.md b/docs/concept-memory.md index 001cc68179..a47f99bae9 100644 --- a/docs/concept-memory.md +++ b/docs/concept-memory.md @@ -1,43 +1,49 @@ # Conversation flow and memory -All bots built with Bot Framework Composer have a "memory" - a representation of everything that is currently in the bot's active mind. Developers can store and retrieve values in the bot's memory, and can use those values to create loops, branches, dynamic messages and behaviors in the bot. Properties from memory can be used inside templates, and can also be used as part of a calculation. +All bots built with the Bot Framework Composer have a "memory", a representation of everything that is currently in the bot's active mind. Developers can store and retrieve values in the bot's memory, and can use those values to create loops, branches, dynamic messages and behaviors in the bot. Properties stored in memory can be used inside templates or as part of a calculation. The memory system makes it possible for bots built in Composer to do things like: -* store a user profile and user preferences -* remember things between sessions - like the last search query or a list of recently mentioned locations -* pass information between dialogs +- Store user profiles and preferences. +- Remember things between sessions such as the last search query or a list of recently mentioned locations. +- Pass information between dialogs. ## Anatomy of a property in memory -A piece of data in memory is referred to as a **property**. A property is a distinct value identified by a specific address comprised of two parts - the **scope** of the property and the **name** of the property. 
+A piece of data in memory is referred to as a **property**. A property is a distinct value identified by a specific address comprised of two parts, the **scope** of the property and the **name** of the property: `scope.name`. Here are a couple of examples:
-* `user.name`
-* `turn.activity`
-* `dialog.index`
-* `user.profile.age`
+
+- `user.name`
+- `turn.activity`
+- `dialog.index`
+- `user.profile.age`
The scope of the property determines when the property is available, and how long the value will be retained.
### Storing information about users and ongoing conversations
-The bot's memory has two "permanent" scopes - a place to store information about individual users, and a place to store information about ongoing conversations:
-* **user** is associated with a specific user. Properties in the user scope are retained forever.
-* **conversation** is associated with the conversation id. Properties in the conversation scope are retained forever and may be accessed by multiple users within the same conversation (for example, multiple users together in an Microsoft Teams channel).
+The bot's memory has two "permanent" scopes. The first is a place to store information about individual users, and the second is a place to store information about ongoing conversations:
+
+1. **user** is associated with a specific user. Properties in the user scope are retained forever.
+
+2. **conversation** is associated with the conversation id. Properties in the conversation scope are retained forever and may be accessed by multiple users within the same conversation (for example, multiple users together in a Microsoft Teams channel).
### Storing temporary values during task handling
-The bot's memory also has two "ephemeral" scopes - a place to store temporary values that are only relevant while a task is being handled:
-* **dialog** is associated with the active dialog and any child or parent dialogs. Properties in the dialog scope are retained until the last active dialog ends.
-* **turn** is associated with a single turn. You can also think of this as the bot handling a single message from the user. Properties in the turn scope are discarded at the end of the turn.
+The bot's memory also has two "ephemeral" scopes. Ephemeral scopes are a place to store temporary values that are only relevant while a task is being handled. The two scopes are:
+
+1. **dialog** is associated with the active dialog and any child or parent dialogs. Properties in the dialog scope are retained until the last active dialog ends.
+
+2. **turn** is associated with a single turn. You can also think of this as the bot handling a single message from the user. Properties in the turn scope are discarded at the end of the turn.
## Set properties with prompts
+
Input is collected from users with the prompt types provided in the **Ask a question** sub-menu.
![Ask a question submenu](./media/memory/ask-a-question-menu.png)
-Prompts define the question to pose to the user and are set in the **Prompt** box under the **Bot Asks** tab in properties panel on the left.
+Prompts define the question posed to the user and are set in the **Prompt** box under the **Bot Asks** tab in the properties panel on the left.
![Prompt Bot Asks](./media/memory/bot-asks.png)
@@ -47,94 +53,108 @@ Under the **User Input** tab you'll see **Property to fill**, where the user's r
In the above example of a number prompt, the result of the prompt "What is your age?" will be stored as the `user.age` property.
The result will be stored as a float since the `float` output format was selected.
-For more information about implementing text other prompts and read [asking users for input](./how-to-ask-for-user-input.md).
+For more information about implementing text and other prompt types, see the article [Asking users for input](./how-to-ask-for-user-input.md).
## Manipulating properties using memory actions
-Bot Framework provides a set of memory manipulation actions in the **Manage properties** sub-menu to create and modify properties in memory. Properties can be created on the fly in the editor - the runtime will automatically manage the underlying data for you in the background.
+The Bot Framework Composer provides a set of memory manipulation actions in the **Manage properties** sub-menu. These actions can be used to create, initialize, modify, and delete properties in memory. Properties can be created in the editor and during runtime. Composer will automatically manage the underlying data for you.
![Memory manipulation menu](./media/memory/memory-mainpulation-menu.png)
-### Set a Property
-Use **Set a Property** to set the value of a property.
-![Delete Property](./media/memory/set-property.png)
+### Set a property
-Use **Set a property** to set the value of a property.
- The value of a property can be set to a literal value, like `true`, 0, or `fred`, or it can be set to the result of an [computed expression](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language). When storing simple values it is not necessary to initialize the property.
+![Set a property](./media/memory/set-property.png)
-### Initialize a Property
-Use **Initialize a Property** to create new properties that are objects or arrays. This allows your bot to use sub-properties, or store multiple values inside the property.
+The value of a property can be set to a literal value, like `true`, `0`, or `fred`, or it can be set to the result of a [computed expression](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language). When storing simple values it is not necessary to initialize the property.
-![Initialize Property](./media/memory/initialize-property.png)
### Set properties
-It is important to note that before setting the value of a sub-property like `user.profile.age` that `user.profile` must first be initialized. It is not necessary to further initialize `user.profile.age` unless `age` must also contain sub-values.
Use **Set properties** to set a group of properties.
-### Edit an Array Property
-Use **Edit an Array Property** to add and remove items from an array. Items set in **Value** can be added or removed from the top or bottom of an array in the **Items property** using push, pop, take, remove, and clear in **Type of change**. The result of the edited array is saved to **Result Property**
+![Set properties](./media/memory/set-properties.png)
-![Edit Array Property](./media/memory/edit-array-property.png)
+The value of each property is assigned individually in the **Properties panel**. Don't forget to press `Enter` to save the property setting before you set the next one.
-Note that it is possible to push the value of an existing property into another Array property - for example, push `turn.choice` onto `dialog.choices`.
### Initialize a property
+
+Use **Initialize a property** to create new properties that are objects or arrays.
+ +![Initialize property](./media/memory/initialize-property.png) + +It is important to note that before setting the value of a sub-property like `user.profile.age` that `user.profile` must first be initialized. It is not necessary to also initialize `user.profile.age` unless `age` also contains sub-values. + +### Delete a property -### Delete a Property Use **Delete a Property** to remove a property from memory. -![Delete Property](./media/memory/delete-property.png) +![Delete a property](./media/memory/delete-property.png) + +### Delete properties + +Use **Delete properties** to remove properties from memory. + +![Delete properties](./media/memory/delete-properties.png) + +### Edit an Array Property + +Use **Edit an Array Property** to add and remove items from an array. Items set in **Value** can be added or removed from the beginning or end of an array in the **Items property** using push, pop, take, remove, and clear in **Type of change**. The result of the edited array is saved to **Result Property** + +![Edit an Array Property](./media/memory/edit-array-property.png) + +Note that it is possible to push the value of an existing property into an array property. For example, push `turn.choice` onto `dialog.choices`. ## Manipulating properties with dialogs -Dialogs can return values to their parent dialogs. In this way, a child dialog can encapsulate a multi-step interaction, collect and compute multiple values, and then return a single value to the parent. +Child dialogs can return values to their parent dialogs. In this way, a child dialog can encapsulate a multi-step interaction, collect and compute multiple values, and then return a single value to its parent dialog. -For example, a child dialog might first **Initialize an object** property called `dialog.profile`. Then, using prompts, build a compound property representing a user profile: +For example, a child dialog might first **Initialize an object** property called `dialog.profile`. Then, using prompts, build a compound property representing a user profile: ![Initialize object profile](./media/memory/initialize-object-profile.png) -Finally, the dialog returns the compound value to the parent dialog. The return value is specified as the **Default Result Property** within the trigger for the child dialog: +Finally, the dialog returns the compound value to the parent dialog. The return value is specified as the **Default result property** within the trigger for the child dialog: -![Default Result Property](./media/memory/default-result-property.png) +![Default result property](./media/memory/default-result-property.png) -Finally, the parent dialog is configured to capture the return value inside the **Begin a Dialog** action: +Finally, the parent dialog is configured to capture the return value inside the **Begin a new dialog** action: ![Return value stored in parent dialog](./media/memory/begin-new-dialog.png) - -When executed, the bot will perform the `profile` child dialog, collect the user's name and age in a _temporary_ scope, then return it to the parent dialog where it is captured into the `user.profile` property and stored permanently. +When executed, the bot will execute the **profile** child dialog, collect the user's name and age in a _temporary_ scope, then return it to the parent dialog where it is captured into the `user.profile` property and stored permanently. ## Automatic properties Some properties are automatically created and managed by the bot. These are available automatically. 
-| Property | Description | -| ---------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------- | -| turn.activity | The full incoming [Activity](https://docs.microsoft.com/en-us/javascript/api/botframework-schema/activity?view=botbuilder-ts-latest) object | -| turn.intents | If a recognizer is run, the intents found | -| turn.entities | If a recognizer is run, the entities found | -| turn.dialogEvents.event name.value | Payload of a custom event fired using the EmitEvent action. | +| Property | Description | +| ---------------------------------- | ----------------------------------------------------------------------------------------------------------------------------- | +| turn.activity | The full incoming [Activity](https://aka.ms/typescript-namespace-latest-activity-interface?view=botbuilder-ts-latest) object. | +| turn.intents | If a recognizer is run, the intents found. | +| turn.entities | If a recognizer is run, the entities found. | +| turn.dialogEvents.event name.value | Payload of a custom event fired using the EmitEvent action. | ## Refer to properties in memory -Bots can retrieve and use values from memory for a variety of purposes. The bot may need to use a value in order to construct an outgoing message. The bot may need to make a decision based on a value and perform different actions based on that decision. The bot may need to use the value to calculate other values. +Bots can retrieve values from memory for a variety of purposes. The bot may need to use a value in order to construct an outgoing message, or make a decision based on a value then perform actions based on that decision, or use the value to calculate other values. -Sometimes, you will refer directly to a property by its address in memory: `user.name`. Other times, you will refer to one or more properties as part of an expression: `(dialog.orderTotal + dialog.orderTax) > 50`. When referring to properties in memory, it is generally possibly to use either mechanism to access the necessary values. +Sometimes, you will refer directly to a property by its address in memory: `user.name`. Other times, you will refer to one or more properties as part of an expression: `(dialog.orderTotal + dialog.orderTax) > 50`. ### Expressions -Bot Framework uses the [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language) to calculate computed values. This syntax allows developers to create composite values, define complex conditional tests, and transform the content and format of values. - -* [Operators](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#operators) -* [Built-in functions](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/common-expression-language/prebuilt-functions.md#pre-built-functions) +The Bot Framework Composer uses the [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language) to calculate computed values. This syntax allows developers to create composite values, define complex conditional tests, and transform the content and format of values. 
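+For illustration, here are a few expressions that reference properties from memory; `concat` is an example of a pre-built function:
+
+```
+(dialog.orderTotal + dialog.orderTax) > 50
+user.profile.age >= 18
+concat('Hello, ', user.profile.name)
+```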
For more information see the common expression language [operators](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#operators) and [pre-built functions](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/common-expression-language/prebuilt-functions.md#pre-built-functions). When used in expressions, no special notation is necessary to refer to a property from memory. ### Memory in branching actions -A bot can evaluate values from memory when making decisions inside a [branching action](./howto-controlling-conversation-flow.md) like an **If/Else** or **Switch** branch. The conditional expression that is tested in one of these branching actions is an expression that, when evaluated, drives the decision. +A bot can evaluate values from memory when making decisions inside a [branching action](./how-to-control-conversation-flow.md) like an **If/Else** or **Switch** branch. The conditional expression that is tested in one of these branching actions is an expression that, when evaluated, drives the decision. -In the example below, the expression `user.profile.age > 13` will evaluate to either `True` or `False`, and the branch action will then execute the appropriate branch. +In the example below, the expression `user.profile.age > 13` will evaluate to either `True` or `False`, and the flow will continue through the appropriate branch. ![If/Else Condition](./media/memory/if-else.png) -In this second example, the value of `turn.choice` is used to match against multiple `Switch` cases. Note that, while it looks like a raw reference to a property, this is actually an expression - since no operation is being taken on the property, the expression evaluates to the raw value. +In this second example, the value of `turn.choice` is used to match against multiple `Switch` cases. Note that, while it looks like a raw reference to a property, this is actually an expression and since no operation is being taken on the property, the expression evaluates to the raw value. ![Switch condition](./media/memory/switch.png) @@ -142,13 +162,15 @@ In this second example, the value of `turn.choice` is used to match against mult When using **For each** and **For each page** loops, properties also come into play. Both require an **Items property** that holds the array, and **For each page** loops also require a **Page size**, or number of items per page. -![for each page properties](./media/memory/for-each.png) +![For each page properties](./media/memory/for-each.png) ### Memory in LG -One of the most powerful features of the Bot Framework system is Language Generation - particularly when used alongside properties pulled from memory. -You can refer to properties in the text of any message - including prompts. -Properties can also be referred to in LG templates and functions - [learn more about the full scope of Language Generation system in this section.](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/language-generation) +One of the most powerful features of the Bot Framework system is Language Generation, particularly when used alongside properties pulled from memory. + +You can refer to properties in the text of any message, including prompts. + +You can also refer to properties in [LG templates](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md). 
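+As a rough sketch (the template name is hypothetical, and the exact conditional syntax is described in the LG file format linked above), an LG template that reads properties from memory to produce conditional variants might look like:
+
+```
+# AgeConfirmation
+- IF: {user.profile.age >= 18}
+    - Thanks {user.profile.name}, you're all set.
+- ELSE:
+    - Sorry {user.profile.name}, a parent or guardian needs to complete this step.
+```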
See the Language Generation [readme](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/language-generation) to learn more about the Language Generation system.
To use the value of a property from memory inside a message, wrap the property reference in curly brackets: `{user.profile.name}`
@@ -156,22 +178,19 @@ The screenshot below demonstrates how a bot can prompt a user for a value, then
![LG memory](./media/memory/lg.png)
-In addition to raw properties values, it is also possible to embed [expressions](#expressions) into the message template. See the full list of pre-built functions you can use in expression [here](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/common-expression-language/prebuilt-functions.md).
+In addition to getting property values, it is also possible to embed properties in [expressions](#expressions) used in a message template. Refer to the _Common Expression Language_ page for the full list of [pre-built functions](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/common-expression-language/prebuilt-functions.md).
-Properties from memory can also be used within an LG template to provide conditional variants of a message and can be passed as parameters to built-in and custom functions. [Learn more about LG](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/language-generation).
+Properties can also be used within an LG template to provide conditional variants of a message and can be passed as parameters to both built-in and custom functions. [Learn more about LG](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/language-generation).
-### Memory shorthands
-
-Bot Framework provides a variety of shortcuts for referring to properties in memory. See the full list [here](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/adaptive-dialog/docs/memory-model-overview.md#memory-short-hands)
+### Memory shorthand notations
+The Bot Framework Composer provides a variety of shortcuts for referring to properties in memory. Refer to the _Managing state_ documentation for the complete list of [memory shorthand notations](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/adaptive-dialog/docs/memory-model-overview.md#memory-short-hand-notations).
## Further reading
-* [Bot Framework Adaptive dialogs Memory Model](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/adaptive-dialog/docs/memory-model-overview.md)
-
-* [Bot Framework on Github](https://github.com/microsoft/botframework)
-
+- [The Bot Framework Adaptive dialogs Memory Model](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/adaptive-dialog/docs/memory-model-overview.md).
+- [The Microsoft Bot Framework](https://github.com/microsoft/botframework).
## Next
-* [Language Generation in Bot Framework Composer](./concept-language-generation.md)
+- [Language Generation](./concept-language-generation.md) in the Bot Framework Composer.
diff --git a/docs/how-to-ask-for-user-input.md b/docs/how-to-ask-for-user-input.md
index 49619e7e2a..1e5809c5bc 100644
--- a/docs/how-to-ask-for-user-input.md
+++ b/docs/how-to-ask-for-user-input.md
@@ -15,7 +15,7 @@ As seen in the **TextInput** dialog the user is prompted for their name in the *
![Text prompt bot says](./media/ask-for-input/text-bot-asks.png)
-The user's response is stored in **Property to fill** in the **User Asks** section as `user.name`.
Note that you can change the **Output Format** if you want to save the text as trimmed (leading and trailing whitespace removed), uppercase, or lowercase. +The user's response is stored in **Property to fill** in the **User Input** section as `user.name`. Note that you can change the **Output Format** if you want to save the text as trimmed (leading and trailing whitespace removed), uppercase, or lowercase. ![Text prompt user input](./media/ask-for-input/text-user-input.png) @@ -51,7 +51,9 @@ In the **ChoiceInput** dialog you will see the **Property to fill** is set to `u | Suggested Action | ![list suggested action](./media/ask-for-input/multichoice-list-suggestedactions.png) | displays options as Suggested Action buttons | | Hero Card | ![list hero card](./media/ask-for-input/multichoice-list-herocard.png) | displays Hero Card with options as buttons **within** card | -In the **User Answers** section in the Property panel you will also notice **Choice Options**, which can be used to add more choices and their synonyms. You'll also see three boxes related to inline separation, or how your bot separates the text of your choices: +In the **User Input** section in the properties panel you will notice **Choice Options**, which can be used to add more choices and their synonyms. You can set the choice options in **Static** or **Dynamic** format. For **Static** format, you need to write each choice option manually; for **Dynamic** format, you can set the options to an array and then retrieve the value dynamically. + +You'll also see three boxes related to inline separation, or how your bot separates the text of your choices: - **Inline separator** - character used to separate individual choices when there are more than two choices, usually `,`. - **Inline or** - separator used when there are only two choices, usually `or`. - **Inline or more** - separator between last two choices when there are more than two options, usually `, or`. diff --git a/docs/how-to-define-triggers.md b/docs/how-to-define-triggers.md index 184c2fe727..4ba6a62024 100644 --- a/docs/how-to-define-triggers.md +++ b/docs/how-to-define-triggers.md @@ -1,161 +1,186 @@ -# Defining triggers -Each dialog in Bot Framework Composer includes a set of triggers (event handlers) that contain instructions for how the bot will respond to inputs received when the dialog is active. There are several different types of triggers in Composer. They all work in a similar manner and can be interchanged in some cases. In this article, we instruct how to define each type of them. Before you walk through this article, please read the [events and triggers](concept-events-and-triggers.md) concept article. +# Defining triggers +Each dialog in the Bot Framework Composer includes a set of triggers (event handlers) that contain actions (instructions) for how the bot will respond to inputs received when the dialog is active. There are several different types of triggers in Composer. They all work in a similar manner and can even be interchanged in some cases. This article explains how to define each type of trigger. Before you walk through this article, please read the [events and triggers](concept-events-and-triggers.md) concept article. -The table below lists the six different types of triggers in Composer and their descriptions. +The table below lists the six different types of triggers in Composer and their descriptions. 
-| Trigger Type | Description |
-| ----------------- | --------------------------------------------------------------------------------------- |
-| [Intent recognized](how-to-define-triggers.md#intent-recognized) | Trigger an action when an `intent` is recognized (and optionally `entities`) |
-| [Unknown intent](how-to-define-triggers.md#Unknown-intent) | Trigger an action when no intent is recognized |
-| [Dialog events](how-to-define-triggers.md#Dialog-events) | Trigger an action when a dialog event such as **BeginDialog** is fired |
-| [Activities](how-to-define-triggers.md#Activities) | Trigger an action to take when an activity event such as when a new conversation starts |
-| [Message events](how-to-define-triggers.md) | Trigger an action to take when a message activity is fired. |
-| [Custom event](how-to-define-triggers.md#custom-event) | Trigger a pre-defined custom event such as **Emit a custom event**. |
+| Trigger Type | Description |
+| ----------------- | ------------------------------------------------------------------------------------------- |
+| [Intent recognized](#intent-recognized) | When an intent is recognized, the **Intent recognized** trigger fires. |
+| [Unknown intent](#Unknown-intent) | The **Unknown intent** trigger fires when an intent is defined and recognized but there is no **Intent recognized** trigger defined for that intent. |
+| [Dialog events](#Dialog-events) | When a dialog event such as **BeginDialog** occurs, it will fire the specified trigger. |
+| [Activities](#Activities) | When an activity event occurs, such as when a new conversation starts, the **Activities** trigger will fire. |
+| [Message events](#message-events) | When a message activity occurs, such as when a message is updated, deleted, or reacted to, the **Message events** trigger will fire. |
+| [Custom event](#custom-event) | When an **Emit a custom event** occurs, the **Custom event** trigger will fire. |
## Intent recognized
-This is a trigger type we use to define actions to take when an `intent` is recognized (and optionally `entities`). It is a trigger that works with **recognizers**. There are two **recognizers** in Composer: [LUIS](https://www.luis.ai) recognizer and [Regular Expression](https://regexr.com/) recognizer. On the navigation pane click **New Trigger** and select **Intent recognized** from the drop-down menu. You will see the intent trigger menu as follows:
+This trigger type is used to define the actions to execute when an [intent](concept-language-understanding.md#intents) is found in a message sent from the user. The **Intent recognized** trigger works in conjunction with **recognizers**. There are two [recognizers](concept-dialog.md#recognizer) in Composer: one for [LUIS](#luis-recognizer) and the other for [Regular Expression](#regular-expression-recognizer). You define which recognizer is used, if any, at the dialog level.
-![intent_trigger](./media/events_triggers/intent_trigger.png)
To create the **Intent recognized** trigger, select **New Trigger** in the navigation pane, then **Intent recognized** from the drop-down list. You will see the intent trigger menu as follows:
-If you have not defined any intents the sub-menu will show `No intents configured for this dialog` and there is no intent to configure.
The basic steps to define an **Intent recognized** trigger are as follows:
-- set up a recognizer type in your selected dialog
-- define intent(s) in the language understanding editor
-- create **Intent recognized** triggers to handle pre-defined intents (one trigger handles one intent)
-- define actions in the trigger
+![Intent trigger](./media/events-triggers/intent-trigger.png)
-### LUIS recognizer
-Composer enables developers to create language training data in the dialog editing surface because it is deeply integrated with the [LUIS.ai](https://www.luis.ai/home) language understanding API. LUIS is able to take natural language input from users and translate it into a named intent and a set of extracted entity values the message contains.
+If you have not defined any intents, the **Which intent do you want to handle?** drop-down list will show "_No intents configured for this dialog_" and there will be no intent to configure; however, you can define an intent later in the trigger's properties panel on the right side of the Composer screen.
-Follow the steps to define an **Intent recognized** trigger with LUIS recognizer:
+Once the **Intent recognized** trigger has been created, you can further refine it by assigning one or more [entities](concept-language-understanding.md#entities) in the properties panel.
+
+> [!TIP]
+> You need to press the **Enter** key after entering an entity or it will not be saved.
+
+It is also possible to add a **condition** to the trigger. A condition is an expression that follows [Common Expression Language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language) syntax. If a condition is specified, it must evaluate to "true" for the event to fire.
+
+The basic steps to define an **Intent recognized** trigger are as follows:
+1. Set up a [recognizer](./concept-dialog.md#recognizer) type in your dialog.
+2. Define [intents](concept-language-understanding.md#intents) in the Language Understanding editor.
+3. Create an **Intent recognized** trigger to handle each intent you created (one trigger per intent).
+4. Define [actions](./concept-dialog.md#action) in the trigger.
### LUIS recognizer
+[LUIS](https://www.luis.ai/home) is a machine learning-based service you can use to build natural language capabilities into your bot. Using a LUIS recognizer enables you to extract intents and entities based on a LUIS application.
+
+Composer enables developers to create language training data in the dialog authoring canvas because it is deeply integrated with the [LUIS](https://www.luis.ai/home) API. LUIS is able to take natural language input from users and translate it into a named intent and a set of extracted entity values the message contains.
+
+Follow the steps to define an **Intent recognized** trigger with a LUIS recognizer:
1. In the properties panel of your selected dialog, choose **LUIS** as recognizer type.
-![luis_recognizer](./media/events_triggers/luis_recognizer.png)
+   ![luis recognizer](./media/events-triggers/luis-recognizer.png)
-2. In the Language Understanding editor, create intents with sample utterances and follow the [.lu file format](https://github.com/Microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md#lu-file-format).
2.
In the Language Understanding editor, create intents with sample [utterances](concept-language-understanding.md#utterances) following the [.lu file format](https://github.com/Microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md#lu-file-format). ->[!NOTE] -> Each intent contains a series of sample utterances which will be used as training data in LUIS to recognize any pre-defined intent. You will need a [LUIS authoring key](https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-keys?tabs=V2#programmatic-key) to get your training data published. For details, read [using LUIS for language understanding](howto-using-LUIS.md) article. + Below is a screenshot showing the **text editor** in a dialogs properties panel. This example captures two simple _intents_ ("Greeting" and "BookFlight") each with a list of example _utterances_ that capture ways users might express these two intents. You can use - or + or * to denote lists. Numbered lists are not supported. -Below is a screenshot to show the previous two steps: + ![LUIS intent](./media/events-triggers/LUIS-intent.png) -![LUIS_intent](./media/events_triggers/LUIS_intent.png) + >[!NOTE] + > Each intent contains a series of sample utterances which will be used as training data in LUIS to recognize any pre-defined intent. + + > [!IMPORTANT] + > You will need a [LUIS authoring key](https://aka.ms/bot-framework-emulator-LUIS-keys?tabs=V2#programmatic-key) to get your training data published. For details, read [using LUIS for language understanding](how-to-use-LUIS.md) article. -3. Select **Intent recognized** from the trigger menu and pick any pre-defined intent you want this trigger to handle. Each **Intent** trigger handles one pre-defined intent. +3. Select **Intent recognized** from the trigger menu and pick the intent you want the trigger to handle. Each **Intent** trigger handles one intent. -![BookFlight_configure](./media/events_triggers/BookFlight_configure.png) + ![BookFlight configure](./media/events-triggers/BookFlight-configure.png) 4. Optionally, you can set the **Condition** property to avoid low confidence results given that LUIS is a machine learning based intent classifier. For example, set the **Condition** property to this in the **Greeting** intent: - `#Greeting.Score >=0.8` - -![score](./media/events_triggers/score.png) + `#Greeting.Score >=0.8` -This definition means that the **Greeting** intent trigger will only fire when the confidence score returned by LUIS equals or is higher than 0.8. + ![Score](./media/events-triggers/score.png) -#### LUIS for entity extraction -In addition to specifying intents and utterances, it is also possible to train LUIS to recognize named entities and patterns. Entities are a collection of objects data extracted from an utterance such as places, time, and people. Read more about the full capabilities of LUIS recognizers [here](https://github.com/microsoft/botbuilder-tools/blob/master/packages/Ludown/docs/lu-file-format.md). +This definition means that the **Greeting** intent trigger will only fire when the confidence score returned by LUIS is equal to or greater than 0.8. -Extracted entities are passed along to any triggered actions or child dialogs using the syntax `@[Entity Name]`. For example, given an intent definition like below: +### Regular Expression recognizer +A [regular expression](https://regexr.com/) is a special text string for describing a search pattern that can be used to match simple or sophisticated patterns in a string. 
Composer exposes the ability to define intents using regular expressions and also allows regular expressions to extract simple entity values. While LUIS offers the flexibility of a more fully featured language understanding technology, the [regular expression recognizer](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/adaptive-dialog/docs/recognizers-rules-steps-reference.md#regex-recognizer) works well when you need to match a narrow set of highly structured commands or keywords. -``` -# book-flight -- book a flight to {city=austin} -- travel to {city=new york} -- i want to go to {city=los angeles} -``` +In the example below, a similar book-flight intent is defined. However, this will _only_ match the very narrow pattern "book flight to [somewhere]", whereas the LUIS recognizer will be able to match a much wider variety of messages. -When triggered, if LUIS is able to identify a city, the city name will be made available as `@city` within the triggered actions. The entity value can be used directly in expressions and LG templates, or [stored into a memory property](concept-memory.md) for later use. Advanced intents and entities definition in Composer can be found [here](howto-define-advanced-intents-entities.md). +Follow the steps to define **Intent recognized** trigger with [Regular Expression](https://regexr.com/) recognizer: -### Regular Expression recognizer -[Regular expressions](https://regexr.com/) are rigid patterns that can be used to match simple or sophisticated patterns in a text. Composer exposes the ability to define intents using regular expressions and also allows the regular expressions to extract simple entity values. While LUIS offers the flexibility of a more fully featured language understanding technology, [Regular Expression recognizer](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/adaptive-dialog/docs/recognizers-rules-steps-reference.md#regex-recognizer) works well when you need to match a narrow set of highly structured commands or keywords. +1. In the properties panel of your selected dialog, choose **Regular Expression** as recognizer type for your dialog. -In the example below, a similar book-flight intent is defined. However, this will _only_ match the very narrow pattern "book flight to [somewhere]", whereas the LUIS recognizer will be able match a much wider variety of messages. + ![regex recognizer](./media/events-triggers/regex-recognizer.png) -Follow the steps to define **Intent recognized**trigger with [Regular Expressions](https://regexr.com/) recognizer: +2. In the regular expression editor, create a regular expression **intent** and **pattern** as shown in the screenshot below: -1. In the properties panel of your selected dialog, choose **Regular Expression** as recognizer type for your trigger. + ![regular expression intent](./media/events-triggers/regular-expression-intent.png) -![regex_recognizer](./media/events_triggers/regex_recognizer.png) +3. You can then create an **Intent recognized** trigger to handle each intent you define as instructed in the [LUIS recognizer](how-to-define-triggers.md#LUIS-recognizer) section. -2. In the Language Understanding editor, create [Regular Expression](https://regexr.com/) **intents** and **pattern** as shown in the screenshot below: +> [!NOTE] +> For more information on how to write regular expression, read [here](https://regexr.com/). 
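+For illustration, the intent and pattern pairs entered in the regular expression editor might look something like the sketch below. The patterns are hypothetical; a named capture group such as `(?<city>...)` is one way a regular expression can mark a matched value (here, the city) for extraction as a simple entity:
+
+```
+Intent:  BookFlight
+Pattern: book flight to (?<city>.*)
+
+Intent:  Greeting
+Pattern: ^(hi|hello|hey)$
+```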
-![regular_expression_intent](./media/events_triggers/regular_expression_intent.png) +## Unknown intent +This is a trigger type used to define actions to take when there is no **Intent recognized** trigger to handle an existing intent. -3. Create an **Intent recognized** trigger for each pre-defined intent as instructed in the [LUIS recognizer section](how-to-define-triggers.md#LUIS-recognizer). +Follow the steps to define an **Unknown intent** trigger: -## Unknown intent -This is a trigger type we use to define actions to take when an intent is not recognized. You do not need to define any intents for this trigger. Follow the steps to define an **Unknown intent** trigger: +1. In the navigation pane, select **New Trigger**. -1. On the navigation pane click **New Trigger** and select **Unknown intent** from the drop-down menu. +2. Select **Create a Trigger** from the **What is the type of this trigger?** drop-down list, then **Submit**. -![unknown_intent](./media/events_triggers/unknown_intent.png) + ![Unknown intent](./media/events-triggers/unknown-intent.png) -2. After you click **Submit**, you will see an empty **Unknown intent** trigger in the authoring canvas. +2. After you select **Submit**, you will see an empty **Unknown intent** trigger in the authoring canvas. -3. Click the "+" sign under the trigger node to add any action node(s) you want to include. For example, you can click **Send a response** to send a message `This is an unknown intent trigger!`. When this trigger is fired, the response message will be sent to the user. +3. Select the **+** sign under the trigger node to add any action node(s) you want to include. For example, you can select **Send a response** to send a message "This is an unknown intent trigger!". When this trigger is fired, the response message will be sent to the user. -![unknown_intent_response](./media/events_triggers/unknown_intent_response.gif) + ![Unknown intent response](./media/events-triggers/unknown-intent-response.gif) ## Dialog events -This is a trigger type we use to define actions to take when a dialog event such as `BeginDialog` is fired. Most dialogs will include an event handler (trigger) configured to respond to the `BeginDialog` event, which fires when the dialog begins and allows the bot to respond immediately. Follow the steps below to define a **Dialog started (Begin dialog event)** trigger: +This is a trigger type used to define actions to take when a dialog event such as `BeginDialog` is fired. Most dialogs will include an event handler (trigger) configured to respond to the `BeginDialog` event, which fires when the dialog begins and allows the bot to respond immediately. Follow the steps below to define a **Dialog started** trigger: -On the navigation pane click **New Trigger** and select **Dialog events** from the drop-down menu. +1. Select **New Trigger** in the navigation pane then select **Dialog events** from the drop-down list. -![dialog_events](./media/events_triggers/dialog_events.png) + ![Dialog events](./media/events-triggers/dialog-events.png) -Select **Dialog started (Begin dialog event)** from the drop-down menu. Click **Submit**. +2. Select **Dialog started (Begin dialog event)** from the **Which event?** drop-down list then select **Submit**. -![begin_dialog](./media/events_triggers/begin_dialog.png) + ![Begin dialog](./media/events-triggers/begin-dialog.png) -Click the "+" sign under the trigger node and mouse over the action menu. Click **Dialog management** and then **Begin a new dialog**. 
You can configure any pre-defined dialog to the **Begin a new dialog** action on the properties panel on the right side. Before you use this trigger you must define a dialog to be configured. +3. Select the **+** sign under the *Dialog started* node and then select **Begin a new dialog** from the **Dialog management** menu. -![configure_dialog](./media/events_triggers/configure_dialog.png) + ![Begin a new dialog](./media/events-triggers/begin-a-new-dialog.png) + +4. Before you can use this trigger, you must associate a dialog with it. You do this by selecting a dialog from the **Dialog name** drop-down list in the **properties panel** on the right side of the Composer window. You can select an existing dialog or create a new one. The example below demonstrates selecting an existing dialog named *weather*. + + ![Configure dialog](./media/events-triggers/wire-up-dialog.gif) ## Activities -This is a type of trigger used to handle activity events such as your bot receiving a `ConversationUpdate` Activity. This indicates a new conversation begins and you use a **Greeting (ConversationUpdate activity)** trigger to handle it. Follow the steps to define a **Greeting (ConversationUpdate activity)** trigger to send a welcome message: +This type of trigger is used to handle activity events such as your bot receiving a `ConversationUpdate` Activity. This indicates that a new conversation has begun, and you can use a **Greeting (ConversationUpdate activity)** trigger to handle it. + +The following steps demonstrate how to create a **Greeting (ConversationUpdate activity)** trigger to send a welcome message: + +1. Select **New Trigger** in the navigation pane, then select **Activities** from the drop-down list. -On the navigation pane click **New Trigger** and select **Activities** from the drop-down menu. + ![Activities](./media/events-triggers/activities.png) -![activities](./media/events_triggers/activities.png) -Select **Greeting (ConversationUpdate activity)** from the drop-down menu. Click **Submit**. +2. Select **Greeting (ConversationUpdate activity)** from the **Which activity type?** drop-down list, then select **Submit**. -![conversationupdate](./media/events_triggers/conversationupdate.png) + ![Conversation update](./media/events-triggers/conversation-update.png) -Click the "+" sign under the trigger node and and mouse over the action menu then click **Send a response**. In the language generation editor, author your response message following [.lg file format](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md). +3. Select the **+** sign under the *ConversationUpdate Activity* node and then select **Send a response**. -![welcome](./media/events_triggers/welcome.gif) +4. Author your response in the **Language Generation** editor in the **properties panel** on the right side of the Composer window by entering a message that follows the [.lg file format](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md), as demonstrated in the image below. + + ![Welcome](./media/events-triggers/welcome.gif) ## Custom event -**Custom event** is a trigger to handle a custom event such as **Emit a custom event**. Bots can emit user-defined events using the **Emit a custom event** which will trigger this handler. Follow the steps below to define a **Custom event**: +The **Custom event** trigger will only fire when a matching **Emit a custom event** occurs. It is a trigger that any dialog in your bot can consume.
To define and consume a **Custom event** trigger, you need to create an **Emit a custom event** first. Follow the steps below to create an **Emit a custom event**: + +### Emit a custom event + +1. In the Composer navigation pane, select the trigger you want to associate your custom event with. This opens the trigger in the authoring canvas, where you can specify exactly where in the flow you want to trigger this event from. Once determined, select the **+** sign and then select **Emit a custom event** from the **Access external resources** drop-down list. -In your bot's authoring canvas, select the trigger you want to define **Emit a custom event**. Under this trigger, click the "+" sign and mouse over the action menu. Click **Access external resources** and then select **Emit a custom event**. + ![Emit custom event](./media/events-triggers/emit-custom-event.png) -![emit_custom_event](./media/events_triggers/emit_custom_event.png) +2. In the _properties panel_ of this activity, on the right side of the Composer window, enter a name ("*Weather*") into the **Event name** field, then select **Bubble event**. -You can define properties of this event on the properties panel on the right side. Let's give this event a name "Weather", leave `Event value` as is and check `Bubble event`. When `Bubble event`is checked this event will be passed on to the parent dialogs to look for handlers to handle it. + ![Emit custom event property](./media/events-triggers/emit-custom-event-property.png) -![emit_custom_event_property](./media/events_triggers/emit_custom_event_property.png) +> [!TIP] +> When **Bubble event** is selected, any event that is not handled in the current dialog will _bubble up_ to that dialog's parent dialog, where it will continue to look for handlers for the custom event. -Now let's create a **Custom event** trigger to handle the pre-defined event. On the navigation pane on the left, click on **New Trigger** and select **Custom event** from the drop-down menu. Click **Submit**. +### Create a custom event trigger +Now that your **Emit a custom event** has been created, you can create a **Custom event** trigger to handle this event. When the **Emit a custom event** occurs, any matching **Custom event** trigger at any dialog level will fire. Follow the steps to create a **Custom event** trigger to be associated with the previously defined **Emit a custom event**. -![custom_event](./media/events_triggers/custom_event.png) +1. Select **New Trigger** in the navigation pane, then select **Custom event** from the drop-down list, then **Submit**. -On the properties panel on the right side, fill in the name of you pre-defined event in the `Custom event name` section. We fill in `Weather` as we created in step 1. The name in the `Custom event name` section must match the name of the **Emit a custom event** you just created. +2. Enter "*Weather*" into the **Custom event name** field in the properties panel on the right side of the Composer window. You must enter the same name ("*Weather*") to associate this trigger with the **Emit a custom event** you defined previously. -![custom_event_property](./media/events_triggers/custom_event_property.png) + ![Custom event property](./media/events-triggers/custom-event-property.png) -You can add an action to this trigger to test if it triggers the pre-defined event. Click the "+" sign and select **Send a response** from the actions menu. Author your response for this action in the language generation editor as you want. +3. 
Now you can add an action to your custom event handler; this defines what will happen when it is triggered. Do this by selecting the **+** sign and then **Send a response** from the actions menu. Enter the desired response for this action in the Language Generation editor; for this example, enter "This is a custom trigger!". -![custom_event_response](./media/events_triggers/custom_event_response.gif) + ![Custom event response](./media/events-triggers/custom-event-response.gif) -Now you have completed defining a **Custom event** trigger. When **Emit a custom event** is fired, the **Custom event** will handle this event and send the response you have defined. +Now you have completed both of the steps required to create and execute a custom event. When **Emit a custom event** fires, your custom event handler will fire and handle this event, sending the response you defined. -![custom_event_response](./media/events_triggers/custom_event_response.png) +![Custom event response](./media/events-triggers/custom-event-response.png) ## References -- [Events and triggers](./concept-events-and-triggers.md) +- The [Events and triggers](./concept-events-and-triggers.md) concept article. ## Next - Learn how to [control conversation flow](./how-to-control-conversation-flow.md). diff --git a/docs/how-to-publish-bot.md b/docs/how-to-publish-bot.md index b089693db6..a6a248bd78 100644 --- a/docs/how-to-publish-bot.md +++ b/docs/how-to-publish-bot.md @@ -6,7 +6,7 @@ To publish a bot, you will need to use Az CLI tool and Bot Framework LuBuild too - A subscription to [Microsoft Azure](https://azure.microsoft.com/en-us/free/) - To install **Az CLI**, follow [Install the Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest) instructions - To install **PowerShell 6.0**, follow [Install PowerShell 6.0](https://docs.microsoft.com/en-us/powershell/scripting/install/installing-powershell?view=powershell-6) instructions -- To install **LuBuild**, run the following command +- To install **[LuBuild](https://botbuilder.myget.org/feed/botbuilder-declarative/package/npm/lubuild)**, run the following command ``` npm install -g https://botbuilder.myget.org/F/botbuilder-declarative/npm/lubuild/-/1.0.3-preview.tgz ``` diff --git a/docs/how-to-send-cards.md b/docs/how-to-send-cards.md index 6bdb8c3fa2..a69746dbe9 100644 --- a/docs/how-to-send-cards.md +++ b/docs/how-to-send-cards.md @@ -1,5 +1,5 @@ # Sending responses with cards -A bot communicates with users through message activities which are multi-modal. There are messages which simply consist of plain text and there are also richer message content such as cards. Bot Framework Composer supports [structured response template](https://github.com/microsoft/BotBuilder-Samples/blob/vishwac/master-4.6/experimental/language-generation/docs/structured-response-template.md) with which you can add rich cards to your bot and enhance your bot's design. If you are looking for examples about sending text messages to users please read the [sending messages to users](./how-to-send-messages.md) article. +A bot communicates with users through message activities, which are multi-modal. There are messages that simply consist of plain text, and there is also richer message content such as cards.
Bot Framework Composer supports [structured response template](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md) with which you can add rich cards to your bot and enhance your bot's design. If you are looking for examples about sending text messages to users, please read the [sending messages to users](./how-to-send-messages.md) article. In this article, we will cover different types of cards you can define in Composer using [structured response template](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md). We use the examples provided in the [RespondingWithCardsSample](https://github.com/microsoft/BotFramework-Composer/tree/master/Composer/packages/server/assets/projects/RespondingWithCardsSample) throughout this article. @@ -141,12 +141,12 @@ This template "#AllCards" is defined to display all cards when the template is c ## References - [Bot Framework - Cards](https://github.com/microsoft/botframework-sdk/blob/master/specs/botframework-activity/botframework-cards.md) -- [Add media to messages](https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-howto-add-media-attachments?view=azure-bot-service-4.0&tabs=csharp) +- [Add media to messages](https://docs.microsoft.com/azure/bot-service/bot-builder-howto-add-media-attachments) - [Language Generation](./concept-language-generation.md) - [Structured response template](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/structured-response-template.md) -- [Adaptive Cards overview](https://docs.microsoft.com/en-us/adaptive-cards/) +- [Adaptive Cards overview](https://docs.microsoft.com/adaptive-cards/) - [Adaptive Cards Sample](https://github.com/microsoft/BotBuilder-Samples/tree/master/samples/csharp_dotnetcore/07.using-adaptive-cards) -- [Adaptive Cards for bot developers](https://docs.microsoft.com/en-us/adaptive-cards/getting-started/bots) +- [Adaptive Cards for bot developers](https://docs.microsoft.com/adaptive-cards/getting-started/bots) ## Next - Learn [how to define triggers and events](./how-to-define-triggers.md). diff --git a/docs/how-to-send-messages.md b/docs/how-to-send-messages.md index 90f44f02ff..8d2d442d47 100644 --- a/docs/how-to-send-messages.md +++ b/docs/how-to-send-messages.md @@ -38,10 +38,10 @@ To define a simple text message, use a "-" before the text that you want your bo You can also define a simple text message with multiple variations. Bot will respond with any of the simple text messages by random. For example: - > Greeting template with 2 variations. - # GreetingPrefix - - Hi - - Hello + # SimpleText + - Hi, this is simple text + - Hey, this is simple text + - Hello, this is simple text ### Text with memory To define a text message with memory, you need to **Set a Property** first and then use an expression response like this: @@ -127,7 +127,7 @@ Similar to If/Else conditional template, you can define a Switch conditional tem In this Switch conditional template, bot will respond in text message `Happy Sunday!`, `Happy Saturday` or `Working day!` based on the returned value of days of the week for a given timestamp. `utcNow()`is a pre-built function which returns current timestamp as string. `dayOfWeek()` is a pre-built function which returns the day of the week from a timestamp.
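A Switch conditional template of this kind might look roughly like the sketch below. The template name is illustrative, and depending on your SDK version the expression delimiter may be `{...}` or `${...}`:

```
# GreetInAWeek
- SWITCH: {dayOfWeek(utcNow())}
- CASE: {0}
    - Happy Sunday!
- CASE: {6}
    - Happy Saturday!
- DEFAULT:
    - Working day!
```

Here `dayOfWeek(utcNow())` evaluates to 0 on Sundays and 6 on Saturdays, so those two cases return the weekend greetings and the `DEFAULT` branch returns `Working day!`.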
Read more about [pre-built functions](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/common-expression-language/prebuilt-functions.md) in [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language). ## References -- [Send and receive text message](https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-howto-send-messages?view=azure-bot-service-4.0) +- [Send and receive text message](https://docs.microsoft.com/azure/bot-service/bot-builder-howto-send-messages) - [Language generation](./concept-language-generation.md) - [.lg file format](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/language-generation/docs/lg-file-format.md) - [Common language expression](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language#readme) diff --git a/docs/how-to-use-LUIS.md b/docs/how-to-use-LUIS.md index b2ddf07fdc..444dd6c526 100644 --- a/docs/how-to-use-LUIS.md +++ b/docs/how-to-use-LUIS.md @@ -67,7 +67,7 @@ To test your bot which you just added LUIS to, click the **Test in Emulator** bu ## References - [LUIS.ai](https://www.luis.ai/home) -- [Add natural language understanding to your bot](https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-howto-v4-luis?view=azure-bot-service-4.0&tabs=csharp) +- [Add natural language understanding to your bot](https://docs.microsoft.com/azure/bot-service/bot-builder-howto-v4-luis) - [Events and triggers](./concept-events-and-triggers.md) - [Language Understanding](./concept-language-understanding.md) diff --git a/docs/introduction.md b/docs/introduction.md new file mode 100644 index 0000000000..be05f2553b --- /dev/null +++ b/docs/introduction.md @@ -0,0 +1,65 @@ +# Introduction to the Bot Framework Composer + +The Bot Framework Composer is an integrated development tool that developers and multi-disciplinary teams can use to build bots. It is built using the latest features of the Bot Framework SDK. Within Composer, you'll find everything you need to build a sophisticated conversational experience: + +* A visual dialog editor. +* Tools to train and manage Language Understanding (LU). +* Powerful language generation and templating systems. +* A ready-to-use bot runtime executable. + +![BF Composer](./media/introduction/composer-overview.png) + +Under the hood, Composer harnesses the power of many of the components from the Bot Framework SDK. When building bots in Composer, developers will have access to: + +**Adaptive dialogs** + +Dialogs provide a way for the bot to manage conversations with the user. The new [Adaptive dialog](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/adaptive-dialog) and the event model simplify sophisticated conversation modeling and help you focus on the model of the conversation rather than the mechanics of dialog management. + +**Language Understanding (LU)** + +LU is a core component of Composer, allowing developers and conversation designers to train language understanding directly in the context of editing a dialog. As dialogs are edited in Composer, developers can continuously add to their bots' natural language capabilities using the [lu file format](https://aka.ms/lu-file-format), a simple markdown-like format that makes it easy to define new [intents](concept-language-understanding.md#intents) and provide sample [utterances](concept-language-understanding.md#utterances).
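As a rough illustration of that format, an .lu intent is a heading followed by sample utterances, and simple entity values can be labeled inline in braces. The intent name and utterances below are only examples of the format:

```
# BookFlight
- book a flight to {city=seattle}
- travel to {city=new york}
- i want to go to {city=paris}
```

Definitions like this are edited in the context of a dialog, and, as noted below, Composer keeps the cloud-based NLU model in sync as they change.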
In Composer, you can use either regular expressions or the [LUIS](https://docs.microsoft.com/azure/cognitive-services/luis/what-is-luis) service. + + ![BF Composer NLU](./media/introduction/intro-nlu.png) + +Composer detects changes and updates the bot's cloud-based natural-language understanding (NLU) model automatically so it is always up to date. + +**Language Generation (LG)** + +Creating grammatically correct, data-driven responses that have a consistent tone and convey a clear brand voice has always been a challenge for bot developers. Composer's integrated [Language Generation](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/language-generation) system allows developers to create bot replies with a great deal of flexibility. + + ![BF Composer LG](./media/introduction/Bot-Responses.png) + +With Language Generation, previously complex tasks can be quickly achieved, like: +* Including dynamic elements in messages. +* Generating grammatically correct lists, pronouns, and articles. +* Providing context-sensitive variation in messages. +* Creating Adaptive Cards attachments, as seen above. + +**QnA Maker** + +[QnA Maker](https://docs.microsoft.com/azure/cognitive-services/qnamaker/overview/overview) is a cloud-based Natural Language Processing (NLP) service that easily creates a natural conversational layer over your data. It can be used to find the most appropriate answer for any given natural language input from your custom knowledge base (KB) of information. + +**Bot Framework Emulator** + +[Emulator](https://github.com/Microsoft/BotFramework-Emulator/blob/master/README.md) is a desktop application that allows bot developers to test and debug bots built using Composer. + + +## Advantages of developing bots with Composer +Developers familiar with the Bot Framework SDK will notice differences between bots developed with it and the Bot Framework Composer. Some of the advantages of developing bots in Composer include: +- The use of Adaptive dialogs allows for Language Generation (LG), which can simplify interruption handling and give bots character. +- The visual design surface in Composer eliminates the need for boilerplate code and makes bot development more accessible. You no longer need to navigate between experiences to maintain the LU model, as it is editable within the app. +- Time saved with fewer steps to set up your environment. + +A major difference between the current version of the Bot Framework SDK and Composer is that apps created using Composer use the Adaptive dialog format, a JSON specification shared by many tools provided by the Bot Framework. More information about Adaptive dialog is available on [GitHub](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/adaptive-dialog). + + + +Composer bot projects contain reusable assets, such as dialogs, language understanding (LU) training data, and message templates, in the form of JSON and Markdown files that can be bundled and packaged with a bot's source code. These can be checked into source control systems and deployed along with code updates. + +## Additional resources +- [Bot Framework SDK](https://github.com/microsoft/botframework-sdk/blob/master/README.md) +- [Common Expression Language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language) + +## Next steps + +* Learn how to [create an echo bot](./quickstart-create-bot.md) using Composer.
diff --git a/docs/media/ask-for-input/choice-and-inline.png b/docs/media/ask-for-input/choice-and-inline.png index d4fd8ad753..2f7cb765dc 100644 Binary files a/docs/media/ask-for-input/choice-and-inline.png and b/docs/media/ask-for-input/choice-and-inline.png differ diff --git a/docs/media/dialog/action-menu.gif b/docs/media/dialog/action-menu.gif new file mode 100644 index 0000000000..7c109e9aea Binary files /dev/null and b/docs/media/dialog/action-menu.gif differ diff --git a/docs/media/dialog/action_menu.gif b/docs/media/dialog/action_menu.gif deleted file mode 100644 index 8f3716d790..0000000000 Binary files a/docs/media/dialog/action_menu.gif and /dev/null differ diff --git a/docs/media/dialog/adaptive-dialog-anatomy.png b/docs/media/dialog/adaptive-dialog-anatomy.png index 02d0039e59..57f1617aff 100644 Binary files a/docs/media/dialog/adaptive-dialog-anatomy.png and b/docs/media/dialog/adaptive-dialog-anatomy.png differ diff --git a/docs/media/dialog/begin_dialog_action.png b/docs/media/dialog/begin_dialog_action.png deleted file mode 100644 index b37f69def5..0000000000 Binary files a/docs/media/dialog/begin_dialog_action.png and /dev/null differ diff --git a/docs/media/dialog/create_new_bot.png b/docs/media/dialog/create_new_bot.png deleted file mode 100644 index b4d12e9afa..0000000000 Binary files a/docs/media/dialog/create_new_bot.png and /dev/null differ diff --git a/docs/media/dialog/main-and-child-dialog.png b/docs/media/dialog/main-and-child-dialog.png new file mode 100644 index 0000000000..79ba35199f Binary files /dev/null and b/docs/media/dialog/main-and-child-dialog.png differ diff --git a/docs/media/dialog/main_child_dialog.png b/docs/media/dialog/main_child_dialog.png deleted file mode 100644 index c301739962..0000000000 Binary files a/docs/media/dialog/main_child_dialog.png and /dev/null differ diff --git a/docs/media/dialog/main_dialog.png b/docs/media/dialog/main_dialog.png deleted file mode 100644 index 58e4db297b..0000000000 Binary files a/docs/media/dialog/main_dialog.png and /dev/null differ diff --git a/docs/media/dialog/new_bot.png b/docs/media/dialog/new_bot.png deleted file mode 100644 index 9245e46296..0000000000 Binary files a/docs/media/dialog/new_bot.png and /dev/null differ diff --git a/docs/media/dialog/new_weather_dialog.png b/docs/media/dialog/new_weather_dialog.png deleted file mode 100644 index 78edb56e52..0000000000 Binary files a/docs/media/dialog/new_weather_dialog.png and /dev/null differ diff --git a/docs/media/dialog/recognizer.png b/docs/media/dialog/recognizer.png index ede233429c..f1d8e118e9 100644 Binary files a/docs/media/dialog/recognizer.png and b/docs/media/dialog/recognizer.png differ diff --git a/docs/media/dialog/send_response.gif b/docs/media/dialog/send_response.gif deleted file mode 100644 index 81da445381..0000000000 Binary files a/docs/media/dialog/send_response.gif and /dev/null differ diff --git a/docs/media/dialog/test_emulator.png b/docs/media/dialog/test_emulator.png deleted file mode 100644 index 1747ca1e52..0000000000 Binary files a/docs/media/dialog/test_emulator.png and /dev/null differ diff --git a/docs/media/dialog/trigger-menu.png b/docs/media/dialog/trigger-menu.png new file mode 100644 index 0000000000..b16b3f2f5a Binary files /dev/null and b/docs/media/dialog/trigger-menu.png differ diff --git a/docs/media/dialog/trigger_menu.gif b/docs/media/dialog/trigger_menu.gif deleted file mode 100644 index e0952a109d..0000000000 Binary files a/docs/media/dialog/trigger_menu.gif and /dev/null differ diff --git 
a/docs/media/dialog/weather_dialog.png b/docs/media/dialog/weather_dialog.png deleted file mode 100644 index 7b31039f67..0000000000 Binary files a/docs/media/dialog/weather_dialog.png and /dev/null differ diff --git a/docs/media/dialog/wire_up_dialog.gif b/docs/media/dialog/wire_up_dialog.gif deleted file mode 100644 index f2b109d6ee..0000000000 Binary files a/docs/media/dialog/wire_up_dialog.gif and /dev/null differ diff --git a/docs/media/events-triggers/BookFlight-configure.png b/docs/media/events-triggers/BookFlight-configure.png new file mode 100644 index 0000000000..bdd991e018 Binary files /dev/null and b/docs/media/events-triggers/BookFlight-configure.png differ diff --git a/docs/media/events-triggers/LUIS-intent.png b/docs/media/events-triggers/LUIS-intent.png new file mode 100644 index 0000000000..241ef9388c Binary files /dev/null and b/docs/media/events-triggers/LUIS-intent.png differ diff --git a/docs/media/events-triggers/activities.png b/docs/media/events-triggers/activities.png new file mode 100644 index 0000000000..2db8f3b7b1 Binary files /dev/null and b/docs/media/events-triggers/activities.png differ diff --git a/docs/media/events-triggers/begin-a-new-dialog.png b/docs/media/events-triggers/begin-a-new-dialog.png new file mode 100644 index 0000000000..5075fa23a7 Binary files /dev/null and b/docs/media/events-triggers/begin-a-new-dialog.png differ diff --git a/docs/media/events-triggers/begin-dialog.png b/docs/media/events-triggers/begin-dialog.png new file mode 100644 index 0000000000..df2116d633 Binary files /dev/null and b/docs/media/events-triggers/begin-dialog.png differ diff --git a/docs/media/events-triggers/cancel-trigger.png b/docs/media/events-triggers/cancel-trigger.png new file mode 100644 index 0000000000..6cbe0ee9b9 Binary files /dev/null and b/docs/media/events-triggers/cancel-trigger.png differ diff --git a/docs/media/events_triggers/conversationupdate.png b/docs/media/events-triggers/conversation-update.png similarity index 100% rename from docs/media/events_triggers/conversationupdate.png rename to docs/media/events-triggers/conversation-update.png diff --git a/docs/media/events-triggers/custom-event-property.png b/docs/media/events-triggers/custom-event-property.png new file mode 100644 index 0000000000..38e4da88d7 Binary files /dev/null and b/docs/media/events-triggers/custom-event-property.png differ diff --git a/docs/media/events-triggers/custom-event-response.gif b/docs/media/events-triggers/custom-event-response.gif new file mode 100644 index 0000000000..cf25759a8e Binary files /dev/null and b/docs/media/events-triggers/custom-event-response.gif differ diff --git a/docs/media/events_triggers/custom_event_response.png b/docs/media/events-triggers/custom-event-response.png similarity index 100% rename from docs/media/events_triggers/custom_event_response.png rename to docs/media/events-triggers/custom-event-response.png diff --git a/docs/media/events-triggers/dialog-events.png b/docs/media/events-triggers/dialog-events.png new file mode 100644 index 0000000000..7c4152faba Binary files /dev/null and b/docs/media/events-triggers/dialog-events.png differ diff --git a/docs/media/events_triggers/emit_custom_event_property.gif b/docs/media/events-triggers/emit-custom-event-property.gif similarity index 100% rename from docs/media/events_triggers/emit_custom_event_property.gif rename to docs/media/events-triggers/emit-custom-event-property.gif diff --git a/docs/media/events-triggers/emit-custom-event-property.png 
b/docs/media/events-triggers/emit-custom-event-property.png new file mode 100644 index 0000000000..ec05136fb9 Binary files /dev/null and b/docs/media/events-triggers/emit-custom-event-property.png differ diff --git a/docs/media/events-triggers/emit-custom-event.png b/docs/media/events-triggers/emit-custom-event.png new file mode 100644 index 0000000000..c6560ff471 Binary files /dev/null and b/docs/media/events-triggers/emit-custom-event.png differ diff --git a/docs/media/events-triggers/intent-trigger.png b/docs/media/events-triggers/intent-trigger.png new file mode 100644 index 0000000000..73806e5915 Binary files /dev/null and b/docs/media/events-triggers/intent-trigger.png differ diff --git a/docs/media/events-triggers/luis-recognizer.png b/docs/media/events-triggers/luis-recognizer.png new file mode 100644 index 0000000000..f19b8a4388 Binary files /dev/null and b/docs/media/events-triggers/luis-recognizer.png differ diff --git a/docs/media/events-triggers/regex-recognizer.png b/docs/media/events-triggers/regex-recognizer.png new file mode 100644 index 0000000000..6e572c964a Binary files /dev/null and b/docs/media/events-triggers/regex-recognizer.png differ diff --git a/docs/media/events-triggers/regular-expression-intent.png b/docs/media/events-triggers/regular-expression-intent.png new file mode 100644 index 0000000000..c253cb29b4 Binary files /dev/null and b/docs/media/events-triggers/regular-expression-intent.png differ diff --git a/docs/media/events-triggers/score.png b/docs/media/events-triggers/score.png new file mode 100644 index 0000000000..9f904da047 Binary files /dev/null and b/docs/media/events-triggers/score.png differ diff --git a/docs/media/events-triggers/trigger-menu.gif b/docs/media/events-triggers/trigger-menu.gif new file mode 100644 index 0000000000..9b62c981cc Binary files /dev/null and b/docs/media/events-triggers/trigger-menu.gif differ diff --git a/docs/media/events-triggers/unknown-intent-response.gif b/docs/media/events-triggers/unknown-intent-response.gif new file mode 100644 index 0000000000..e2b184b0d9 Binary files /dev/null and b/docs/media/events-triggers/unknown-intent-response.gif differ diff --git a/docs/media/events-triggers/unknown-intent.png b/docs/media/events-triggers/unknown-intent.png new file mode 100644 index 0000000000..aded6b6754 Binary files /dev/null and b/docs/media/events-triggers/unknown-intent.png differ diff --git a/docs/media/events-triggers/welcome.gif b/docs/media/events-triggers/welcome.gif new file mode 100644 index 0000000000..af44c1cc04 Binary files /dev/null and b/docs/media/events-triggers/welcome.gif differ diff --git a/docs/media/events-triggers/wire-up-dialog.gif b/docs/media/events-triggers/wire-up-dialog.gif new file mode 100644 index 0000000000..0fa52182ae Binary files /dev/null and b/docs/media/events-triggers/wire-up-dialog.gif differ diff --git a/docs/media/events_triggers/BookFlight_configure.png b/docs/media/events_triggers/BookFlight_configure.png deleted file mode 100644 index 1878fdade8..0000000000 Binary files a/docs/media/events_triggers/BookFlight_configure.png and /dev/null differ diff --git a/docs/media/events_triggers/LUIS_intent.png b/docs/media/events_triggers/LUIS_intent.png deleted file mode 100644 index 2ae601a9bc..0000000000 Binary files a/docs/media/events_triggers/LUIS_intent.png and /dev/null differ diff --git a/docs/media/events_triggers/activities.png b/docs/media/events_triggers/activities.png deleted file mode 100644 index e308c903ae..0000000000 Binary files 
a/docs/media/events_triggers/activities.png and /dev/null differ diff --git a/docs/media/events_triggers/activity_trigger.png b/docs/media/events_triggers/activity_trigger.png deleted file mode 100644 index 48a0d1b6f3..0000000000 Binary files a/docs/media/events_triggers/activity_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/anatomy_trigger.png b/docs/media/events_triggers/anatomy_trigger.png deleted file mode 100644 index 61b8d3db39..0000000000 Binary files a/docs/media/events_triggers/anatomy_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/author_LUIS_intent.gif b/docs/media/events_triggers/author_LUIS_intent.gif deleted file mode 100644 index 0cb7ceabca..0000000000 Binary files a/docs/media/events_triggers/author_LUIS_intent.gif and /dev/null differ diff --git a/docs/media/events_triggers/author_RegEx_intent.gif b/docs/media/events_triggers/author_RegEx_intent.gif deleted file mode 100644 index 5249b554fd..0000000000 Binary files a/docs/media/events_triggers/author_RegEx_intent.gif and /dev/null differ diff --git a/docs/media/events_triggers/begin_dialog.png b/docs/media/events_triggers/begin_dialog.png deleted file mode 100644 index f824570462..0000000000 Binary files a/docs/media/events_triggers/begin_dialog.png and /dev/null differ diff --git a/docs/media/events_triggers/cancel_trigger.png b/docs/media/events_triggers/cancel_trigger.png deleted file mode 100644 index 400cdaf259..0000000000 Binary files a/docs/media/events_triggers/cancel_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/configure_dialog.png b/docs/media/events_triggers/configure_dialog.png deleted file mode 100644 index 6841aa1d48..0000000000 Binary files a/docs/media/events_triggers/configure_dialog.png and /dev/null differ diff --git a/docs/media/events_triggers/configure_trigger.png b/docs/media/events_triggers/configure_trigger.png deleted file mode 100644 index 8d636c9f8f..0000000000 Binary files a/docs/media/events_triggers/configure_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/create_custom_trigger.png b/docs/media/events_triggers/create_custom_trigger.png deleted file mode 100644 index db78b13e8f..0000000000 Binary files a/docs/media/events_triggers/create_custom_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/custom_event.png b/docs/media/events_triggers/custom_event.png deleted file mode 100644 index 859e294ad4..0000000000 Binary files a/docs/media/events_triggers/custom_event.png and /dev/null differ diff --git a/docs/media/events_triggers/custom_event_property.png b/docs/media/events_triggers/custom_event_property.png deleted file mode 100644 index f4d89be832..0000000000 Binary files a/docs/media/events_triggers/custom_event_property.png and /dev/null differ diff --git a/docs/media/events_triggers/custom_event_response.gif b/docs/media/events_triggers/custom_event_response.gif deleted file mode 100644 index 19341d706f..0000000000 Binary files a/docs/media/events_triggers/custom_event_response.gif and /dev/null differ diff --git a/docs/media/events_triggers/custom_trigger.png b/docs/media/events_triggers/custom_trigger.png deleted file mode 100644 index c1776db7b7..0000000000 Binary files a/docs/media/events_triggers/custom_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/define_intents.png b/docs/media/events_triggers/define_intents.png deleted file mode 100644 index 59d4d892c7..0000000000 Binary files a/docs/media/events_triggers/define_intents.png and 
/dev/null differ diff --git a/docs/media/events_triggers/dialog_events.png b/docs/media/events_triggers/dialog_events.png deleted file mode 100644 index 477cd87829..0000000000 Binary files a/docs/media/events_triggers/dialog_events.png and /dev/null differ diff --git a/docs/media/events_triggers/dialog_trigger.png b/docs/media/events_triggers/dialog_trigger.png deleted file mode 100644 index 363572e4da..0000000000 Binary files a/docs/media/events_triggers/dialog_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/emit_custom_event.gif b/docs/media/events_triggers/emit_custom_event.gif deleted file mode 100644 index 928029d92c..0000000000 Binary files a/docs/media/events_triggers/emit_custom_event.gif and /dev/null differ diff --git a/docs/media/events_triggers/emit_custom_event.png b/docs/media/events_triggers/emit_custom_event.png deleted file mode 100644 index 96d46fe476..0000000000 Binary files a/docs/media/events_triggers/emit_custom_event.png and /dev/null differ diff --git a/docs/media/events_triggers/emit_custom_event_property.png b/docs/media/events_triggers/emit_custom_event_property.png deleted file mode 100644 index 63ad99a6cb..0000000000 Binary files a/docs/media/events_triggers/emit_custom_event_property.png and /dev/null differ diff --git a/docs/media/events_triggers/empty_custom_trigger.png b/docs/media/events_triggers/empty_custom_trigger.png deleted file mode 100644 index fc7d0fddc7..0000000000 Binary files a/docs/media/events_triggers/empty_custom_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/empty_intent_trigger.png b/docs/media/events_triggers/empty_intent_trigger.png deleted file mode 100644 index 7ac5a6e3f9..0000000000 Binary files a/docs/media/events_triggers/empty_intent_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/empty_unrecognized_intent.png b/docs/media/events_triggers/empty_unrecognized_intent.png deleted file mode 100644 index ccbd99299e..0000000000 Binary files a/docs/media/events_triggers/empty_unrecognized_intent.png and /dev/null differ diff --git a/docs/media/events_triggers/event_name_greeting.png b/docs/media/events_triggers/event_name_greeting.png deleted file mode 100644 index 70e47d6518..0000000000 Binary files a/docs/media/events_triggers/event_name_greeting.png and /dev/null differ diff --git a/docs/media/events_triggers/greeting_trigger.gif b/docs/media/events_triggers/greeting_trigger.gif deleted file mode 100644 index 40c66092ab..0000000000 Binary files a/docs/media/events_triggers/greeting_trigger.gif and /dev/null differ diff --git a/docs/media/events_triggers/intent_trigger.png b/docs/media/events_triggers/intent_trigger.png deleted file mode 100644 index 4dbcd848d7..0000000000 Binary files a/docs/media/events_triggers/intent_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/luis_recognizer.png b/docs/media/events_triggers/luis_recognizer.png deleted file mode 100644 index 3332666d38..0000000000 Binary files a/docs/media/events_triggers/luis_recognizer.png and /dev/null differ diff --git a/docs/media/events_triggers/message_activity_trigger.png b/docs/media/events_triggers/message_activity_trigger.png deleted file mode 100644 index f466aa82f4..0000000000 Binary files a/docs/media/events_triggers/message_activity_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/new_trigger.png b/docs/media/events_triggers/new_trigger.png deleted file mode 100644 index c4ba849add..0000000000 Binary files 
a/docs/media/events_triggers/new_trigger.png and /dev/null differ diff --git a/docs/media/events_triggers/recognizer_type.png b/docs/media/events_triggers/recognizer_type.png deleted file mode 100644 index 3714176ca8..0000000000 Binary files a/docs/media/events_triggers/recognizer_type.png and /dev/null differ diff --git a/docs/media/events_triggers/regex_recognizer.png b/docs/media/events_triggers/regex_recognizer.png deleted file mode 100644 index 92dff0ec70..0000000000 Binary files a/docs/media/events_triggers/regex_recognizer.png and /dev/null differ diff --git a/docs/media/events_triggers/regular_expression_intent.png b/docs/media/events_triggers/regular_expression_intent.png deleted file mode 100644 index ca1bf38d70..0000000000 Binary files a/docs/media/events_triggers/regular_expression_intent.png and /dev/null differ diff --git a/docs/media/events_triggers/score.png b/docs/media/events_triggers/score.png deleted file mode 100644 index 23ab40d1e8..0000000000 Binary files a/docs/media/events_triggers/score.png and /dev/null differ diff --git a/docs/media/events_triggers/setup_LUIS_recognizer.gif b/docs/media/events_triggers/setup_LUIS_recognizer.gif deleted file mode 100644 index 04fd0f6488..0000000000 Binary files a/docs/media/events_triggers/setup_LUIS_recognizer.gif and /dev/null differ diff --git a/docs/media/events_triggers/setup_LUIS_recognizer.png b/docs/media/events_triggers/setup_LUIS_recognizer.png deleted file mode 100644 index a0a0effc89..0000000000 Binary files a/docs/media/events_triggers/setup_LUIS_recognizer.png and /dev/null differ diff --git a/docs/media/events_triggers/setup_RegEx_recognizer.gif b/docs/media/events_triggers/setup_RegEx_recognizer.gif deleted file mode 100644 index b6661fbc1d..0000000000 Binary files a/docs/media/events_triggers/setup_RegEx_recognizer.gif and /dev/null differ diff --git a/docs/media/events_triggers/setup_RegEx_recognizer.png b/docs/media/events_triggers/setup_RegEx_recognizer.png deleted file mode 100644 index 288d623a6e..0000000000 Binary files a/docs/media/events_triggers/setup_RegEx_recognizer.png and /dev/null differ diff --git a/docs/media/events_triggers/trigger_menu.png b/docs/media/events_triggers/trigger_menu.png deleted file mode 100644 index 8ae7898b01..0000000000 Binary files a/docs/media/events_triggers/trigger_menu.png and /dev/null differ diff --git a/docs/media/events_triggers/unknown_intent.png b/docs/media/events_triggers/unknown_intent.png deleted file mode 100644 index 7baa6dd3a6..0000000000 Binary files a/docs/media/events_triggers/unknown_intent.png and /dev/null differ diff --git a/docs/media/events_triggers/unknown_intent_response.gif b/docs/media/events_triggers/unknown_intent_response.gif deleted file mode 100644 index f0a5c295a1..0000000000 Binary files a/docs/media/events_triggers/unknown_intent_response.gif and /dev/null differ diff --git a/docs/media/events_triggers/welcome.gif b/docs/media/events_triggers/welcome.gif deleted file mode 100644 index b7e6d32c05..0000000000 Binary files a/docs/media/events_triggers/welcome.gif and /dev/null differ diff --git a/docs/media/introduction/Bot-Responses.png b/docs/media/introduction/Bot-Responses.png new file mode 100644 index 0000000000..e0e5099d36 Binary files /dev/null and b/docs/media/introduction/Bot-Responses.png differ diff --git a/docs/media/introduction/composer-overview.png b/docs/media/introduction/composer-overview.png index 0fa76db8c8..0f9be4fbc2 100644 Binary files a/docs/media/introduction/composer-overview.png and 
b/docs/media/introduction/composer-overview.png differ diff --git a/docs/media/introduction/intro-nlu.png b/docs/media/introduction/intro-nlu.png index 3a7ac7efc5..1285833bd1 100644 Binary files a/docs/media/introduction/intro-nlu.png and b/docs/media/introduction/intro-nlu.png differ diff --git a/docs/media/language-generation/Bot-Responses.png b/docs/media/language-generation/Bot-Responses.png new file mode 100644 index 0000000000..e0e5099d36 Binary files /dev/null and b/docs/media/language-generation/Bot-Responses.png differ diff --git a/docs/media/language-generation/lg-inline-editor.png b/docs/media/language-generation/lg-inline-editor.png new file mode 100644 index 0000000000..9deb1ce9cb Binary files /dev/null and b/docs/media/language-generation/lg-inline-editor.png differ diff --git a/docs/media/language-understanding/LU-content.png b/docs/media/language-understanding/LU-content.png new file mode 100644 index 0000000000..dccb91043a Binary files /dev/null and b/docs/media/language-understanding/LU-content.png differ diff --git a/docs/media/language-understanding/publish-lu.png b/docs/media/language-understanding/publish-lu.png new file mode 100644 index 0000000000..f98f17d7da Binary files /dev/null and b/docs/media/language-understanding/publish-lu.png differ diff --git a/docs/media/language-understanding/select-recognizer.png b/docs/media/language-understanding/select-recognizer.png new file mode 100644 index 0000000000..ef214404b4 Binary files /dev/null and b/docs/media/language-understanding/select-recognizer.png differ diff --git a/docs/media/language-understanding/user-input.png b/docs/media/language-understanding/user-input.png new file mode 100644 index 0000000000..86e370bc13 Binary files /dev/null and b/docs/media/language-understanding/user-input.png differ diff --git a/docs/media/language-understanding/wireup-intent.png b/docs/media/language-understanding/wireup-intent.png new file mode 100644 index 0000000000..e15ae6692c Binary files /dev/null and b/docs/media/language-understanding/wireup-intent.png differ diff --git a/docs/media/language_generation/bot_responses.png b/docs/media/language_generation/bot_responses.png deleted file mode 100644 index 0d610204ad..0000000000 Binary files a/docs/media/language_generation/bot_responses.png and /dev/null differ diff --git a/docs/media/language_generation/bot_says.png b/docs/media/language_generation/bot_says.png deleted file mode 100644 index 750ec50a77..0000000000 Binary files a/docs/media/language_generation/bot_says.png and /dev/null differ diff --git a/docs/media/language_generation/inline_editor.png b/docs/media/language_generation/inline_editor.png deleted file mode 100644 index 13d9e50d50..0000000000 Binary files a/docs/media/language_generation/inline_editor.png and /dev/null differ diff --git a/docs/media/language_generation/multi_line_response.gif b/docs/media/language_generation/multi_line_response.gif deleted file mode 100644 index 64daf8fee1..0000000000 Binary files a/docs/media/language_generation/multi_line_response.gif and /dev/null differ diff --git a/docs/media/language_generation/one_line_response.gif b/docs/media/language_generation/one_line_response.gif deleted file mode 100644 index 397c25f9f4..0000000000 Binary files a/docs/media/language_generation/one_line_response.gif and /dev/null differ diff --git a/docs/media/language_generation/send_an_activity.gif b/docs/media/language_generation/send_an_activity.gif deleted file mode 100644 index 4ed6156333..0000000000 Binary files 
a/docs/media/language_generation/send_an_activity.gif and /dev/null differ diff --git a/docs/media/language_generation/single_line_response_expression.png b/docs/media/language_generation/single_line_response_expression.png deleted file mode 100644 index 1daa21c3fd..0000000000 Binary files a/docs/media/language_generation/single_line_response_expression.png and /dev/null differ diff --git a/docs/media/language_generation/single_line_response_text.png b/docs/media/language_generation/single_line_response_text.png deleted file mode 100644 index c031a12070..0000000000 Binary files a/docs/media/language_generation/single_line_response_text.png and /dev/null differ diff --git a/docs/media/language_understanding/all_up_view.png b/docs/media/language_understanding/all_up_view.png deleted file mode 100644 index 3af1531a66..0000000000 Binary files a/docs/media/language_understanding/all_up_view.png and /dev/null differ diff --git a/docs/media/language_understanding/intents.gif b/docs/media/language_understanding/intents.gif deleted file mode 100644 index e00ee5fcfd..0000000000 Binary files a/docs/media/language_understanding/intents.gif and /dev/null differ diff --git a/docs/media/language_understanding/luis.png b/docs/media/language_understanding/luis.png deleted file mode 100644 index bb52f8a977..0000000000 Binary files a/docs/media/language_understanding/luis.png and /dev/null differ diff --git a/docs/media/language_understanding/new_trigger.png b/docs/media/language_understanding/new_trigger.png deleted file mode 100644 index ec7728e1e9..0000000000 Binary files a/docs/media/language_understanding/new_trigger.png and /dev/null differ diff --git a/docs/media/language_understanding/select_dialog.png b/docs/media/language_understanding/select_dialog.png deleted file mode 100644 index 300e649d33..0000000000 Binary files a/docs/media/language_understanding/select_dialog.png and /dev/null differ diff --git a/docs/media/language_understanding/user_say.png b/docs/media/language_understanding/user_say.png deleted file mode 100644 index 9f9b2fb696..0000000000 Binary files a/docs/media/language_understanding/user_say.png and /dev/null differ diff --git a/docs/media/language_understanding/wireup_intent.png b/docs/media/language_understanding/wireup_intent.png deleted file mode 100644 index ad266db8db..0000000000 Binary files a/docs/media/language_understanding/wireup_intent.png and /dev/null differ diff --git a/docs/media/memory/delete-properties.png b/docs/media/memory/delete-properties.png new file mode 100644 index 0000000000..9edf86d09d Binary files /dev/null and b/docs/media/memory/delete-properties.png differ diff --git a/docs/media/memory/memory-mainpulation-menu.png b/docs/media/memory/memory-mainpulation-menu.png index 3ebc153116..8f382b6ce0 100644 Binary files a/docs/media/memory/memory-mainpulation-menu.png and b/docs/media/memory/memory-mainpulation-menu.png differ diff --git a/docs/media/memory/set-properties.png b/docs/media/memory/set-properties.png new file mode 100644 index 0000000000..6359b2f087 Binary files /dev/null and b/docs/media/memory/set-properties.png differ diff --git a/docs/media/setup-yarn/address.png b/docs/media/setup-yarn/address.png index a17729e2b0..85971c27b6 100644 Binary files a/docs/media/setup-yarn/address.png and b/docs/media/setup-yarn/address.png differ diff --git a/docs/media/tutorial-weatherbot/01/WelcomeTheUser.gif b/docs/media/tutorial-weatherbot/01/WelcomeTheUser.gif new file mode 100644 index 0000000000..1086a2322f Binary files /dev/null and 
b/docs/media/tutorial-weatherbot/01/WelcomeTheUser.gif differ diff --git a/docs/media/tutorial-weatherbot/01/add-send-activity.gif b/docs/media/tutorial-weatherbot/01/add-send-activity.gif deleted file mode 100644 index 643286a9e3..0000000000 Binary files a/docs/media/tutorial-weatherbot/01/add-send-activity.gif and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/01/create-1.png b/docs/media/tutorial-weatherbot/01/create-1.png index 8483c2e633..85a0eddefa 100644 Binary files a/docs/media/tutorial-weatherbot/01/create-1.png and b/docs/media/tutorial-weatherbot/01/create-1.png differ diff --git a/docs/media/tutorial-weatherbot/01/create-2.png b/docs/media/tutorial-weatherbot/01/create-2.png index 0363a30c16..cc2a66eb18 100644 Binary files a/docs/media/tutorial-weatherbot/01/create-2.png and b/docs/media/tutorial-weatherbot/01/create-2.png differ diff --git a/docs/media/tutorial-weatherbot/01/create-welcome-trigger.png b/docs/media/tutorial-weatherbot/01/create-welcome-trigger.png deleted file mode 100644 index 7e3578ca5e..0000000000 Binary files a/docs/media/tutorial-weatherbot/01/create-welcome-trigger.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/01/empty-main-dialog.png b/docs/media/tutorial-weatherbot/01/empty-main-dialog.png index 0529abb15c..b7cef02c37 100644 Binary files a/docs/media/tutorial-weatherbot/01/empty-main-dialog.png and b/docs/media/tutorial-weatherbot/01/empty-main-dialog.png differ diff --git a/docs/media/tutorial-weatherbot/01/emulator-launch.png b/docs/media/tutorial-weatherbot/01/emulator-launch.png index 02f80d2899..0e15e880f3 100644 Binary files a/docs/media/tutorial-weatherbot/01/emulator-launch.png and b/docs/media/tutorial-weatherbot/01/emulator-launch.png differ diff --git a/docs/media/tutorial-weatherbot/01/greeting-in-emulator.png b/docs/media/tutorial-weatherbot/01/greeting-in-emulator.png deleted file mode 100644 index c7058c036f..0000000000 Binary files a/docs/media/tutorial-weatherbot/01/greeting-in-emulator.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/01/new-flow.png b/docs/media/tutorial-weatherbot/01/new-flow.png deleted file mode 100644 index 3d78e68685..0000000000 Binary files a/docs/media/tutorial-weatherbot/01/new-flow.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/01/new.png b/docs/media/tutorial-weatherbot/01/new.png new file mode 100644 index 0000000000..24b98d0f06 Binary files /dev/null and b/docs/media/tutorial-weatherbot/01/new.png differ diff --git a/docs/media/tutorial-weatherbot/01/recognizer-none.gif b/docs/media/tutorial-weatherbot/01/recognizer-none.gif deleted file mode 100644 index 0fa9038f45..0000000000 Binary files a/docs/media/tutorial-weatherbot/01/recognizer-none.gif and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/01/send-activity.gif b/docs/media/tutorial-weatherbot/01/send-activity.gif new file mode 100644 index 0000000000..27e31a620b Binary files /dev/null and b/docs/media/tutorial-weatherbot/01/send-activity.gif differ diff --git a/docs/media/tutorial-weatherbot/01/send-activity.png b/docs/media/tutorial-weatherbot/01/send-activity.png deleted file mode 100644 index 23024c37b6..0000000000 Binary files a/docs/media/tutorial-weatherbot/01/send-activity.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/01/start-bot.gif b/docs/media/tutorial-weatherbot/01/start-bot.gif index e873c3bbff..84ed62a7cf 100644 Binary files a/docs/media/tutorial-weatherbot/01/start-bot.gif and 
b/docs/media/tutorial-weatherbot/01/start-bot.gif differ diff --git a/docs/media/tutorial-weatherbot/02/begin-dialog-congifure.gif b/docs/media/tutorial-weatherbot/02/begin-dialog-congifure.gif deleted file mode 100644 index d0f85c773c..0000000000 Binary files a/docs/media/tutorial-weatherbot/02/begin-dialog-congifure.gif and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/02/begin-dialog.png b/docs/media/tutorial-weatherbot/02/begin-dialog.png new file mode 100644 index 0000000000..110c14f914 Binary files /dev/null and b/docs/media/tutorial-weatherbot/02/begin-dialog.png differ diff --git a/docs/media/tutorial-weatherbot/02/begindialog-trigger.png b/docs/media/tutorial-weatherbot/02/begindialog-trigger.png deleted file mode 100644 index 9915c199cc..0000000000 Binary files a/docs/media/tutorial-weatherbot/02/begindialog-trigger.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/02/connect-dialog.png b/docs/media/tutorial-weatherbot/02/connect-dialog.png new file mode 100644 index 0000000000..dc09e6e828 Binary files /dev/null and b/docs/media/tutorial-weatherbot/02/connect-dialog.png differ diff --git a/docs/media/tutorial-weatherbot/02/create-getweather-dialog.png b/docs/media/tutorial-weatherbot/02/create-getweather-dialog.png new file mode 100644 index 0000000000..b4bf645928 Binary files /dev/null and b/docs/media/tutorial-weatherbot/02/create-getweather-dialog.png differ diff --git a/docs/media/tutorial-weatherbot/02/create-getweather.png b/docs/media/tutorial-weatherbot/02/create-getweather.png deleted file mode 100644 index 22ac6e33fc..0000000000 Binary files a/docs/media/tutorial-weatherbot/02/create-getweather.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/02/dialog-management.png b/docs/media/tutorial-weatherbot/02/dialog-management.png new file mode 100644 index 0000000000..846c32996b Binary files /dev/null and b/docs/media/tutorial-weatherbot/02/dialog-management.png differ diff --git a/docs/media/tutorial-weatherbot/02/emulator-weather-draft.png b/docs/media/tutorial-weatherbot/02/emulator-weather-draft.png deleted file mode 100644 index f3cac868bd..0000000000 Binary files a/docs/media/tutorial-weatherbot/02/emulator-weather-draft.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/02/emulator-weather.png b/docs/media/tutorial-weatherbot/02/emulator-weather.png new file mode 100644 index 0000000000..b9fe1c68fb Binary files /dev/null and b/docs/media/tutorial-weatherbot/02/emulator-weather.png differ diff --git a/docs/media/tutorial-weatherbot/02/getweather-dialog.gif b/docs/media/tutorial-weatherbot/02/getweather-dialog.gif new file mode 100644 index 0000000000..bd1532a767 Binary files /dev/null and b/docs/media/tutorial-weatherbot/02/getweather-dialog.gif differ diff --git a/docs/media/tutorial-weatherbot/02/getweather-draft.png b/docs/media/tutorial-weatherbot/02/getweather-draft.png deleted file mode 100644 index 1cafacedd9..0000000000 Binary files a/docs/media/tutorial-weatherbot/02/getweather-draft.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/02/intent-pattern.png b/docs/media/tutorial-weatherbot/02/intent-pattern.png new file mode 100644 index 0000000000..991db35af9 Binary files /dev/null and b/docs/media/tutorial-weatherbot/02/intent-pattern.png differ diff --git a/docs/media/tutorial-weatherbot/02/recognizer-type.png b/docs/media/tutorial-weatherbot/02/recognizer-type.png new file mode 100644 index 0000000000..a3976911bd Binary files /dev/null and 
b/docs/media/tutorial-weatherbot/02/recognizer-type.png differ diff --git a/docs/media/tutorial-weatherbot/02/regexp-recognizer.gif b/docs/media/tutorial-weatherbot/02/regexp-recognizer.gif deleted file mode 100644 index c62c31355a..0000000000 Binary files a/docs/media/tutorial-weatherbot/02/regexp-recognizer.gif and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/02/select-begin.gif b/docs/media/tutorial-weatherbot/02/select-begin.gif deleted file mode 100644 index 31509e3df3..0000000000 Binary files a/docs/media/tutorial-weatherbot/02/select-begin.gif and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/02/weather-intent-trigger.png b/docs/media/tutorial-weatherbot/02/weather-intent-trigger.png new file mode 100644 index 0000000000..e91b9af555 Binary files /dev/null and b/docs/media/tutorial-weatherbot/02/weather-intent-trigger.png differ diff --git a/docs/media/tutorial-weatherbot/02/weather-intent.png b/docs/media/tutorial-weatherbot/02/weather-intent.png deleted file mode 100644 index e94b06bcc2..0000000000 Binary files a/docs/media/tutorial-weatherbot/02/weather-intent.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/02/weather-trigger.png b/docs/media/tutorial-weatherbot/02/weather-trigger.png deleted file mode 100644 index 37fcd2189f..0000000000 Binary files a/docs/media/tutorial-weatherbot/02/weather-trigger.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/03/empty-prompt.png b/docs/media/tutorial-weatherbot/03/empty-prompt.png deleted file mode 100644 index 27c16819f3..0000000000 Binary files a/docs/media/tutorial-weatherbot/03/empty-prompt.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/03/http-props.png b/docs/media/tutorial-weatherbot/03/http-props.png deleted file mode 100644 index 05b79cb1ae..0000000000 Binary files a/docs/media/tutorial-weatherbot/03/http-props.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/03/http-result-property.png b/docs/media/tutorial-weatherbot/03/http-result-property.png new file mode 100644 index 0000000000..8eef3ce311 Binary files /dev/null and b/docs/media/tutorial-weatherbot/03/http-result-property.png differ diff --git a/docs/media/tutorial-weatherbot/03/http-step.png b/docs/media/tutorial-weatherbot/03/http-step.png index ef6ba5526a..1fc41be791 100644 Binary files a/docs/media/tutorial-weatherbot/03/http-step.png and b/docs/media/tutorial-weatherbot/03/http-step.png differ diff --git a/docs/media/tutorial-weatherbot/03/http-url.png b/docs/media/tutorial-weatherbot/03/http-url.png new file mode 100644 index 0000000000..8939adac00 Binary files /dev/null and b/docs/media/tutorial-weatherbot/03/http-url.png differ diff --git a/docs/media/tutorial-weatherbot/03/ifelse.png b/docs/media/tutorial-weatherbot/03/ifelse.png index 2fd61f9456..7295a077d0 100644 Binary files a/docs/media/tutorial-weatherbot/03/ifelse.png and b/docs/media/tutorial-weatherbot/03/ifelse.png differ diff --git a/docs/media/tutorial-weatherbot/03/ifelse2.png b/docs/media/tutorial-weatherbot/03/ifelse2.png index 386f982b00..e9c223f2d8 100644 Binary files a/docs/media/tutorial-weatherbot/03/ifelse2.png and b/docs/media/tutorial-weatherbot/03/ifelse2.png differ diff --git a/docs/media/tutorial-weatherbot/03/prompt-tabs.png b/docs/media/tutorial-weatherbot/03/prompt-tabs.png deleted file mode 100644 index 852be8e59d..0000000000 Binary files a/docs/media/tutorial-weatherbot/03/prompt-tabs.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/03/set-a-property.png 
b/docs/media/tutorial-weatherbot/03/set-a-property.png new file mode 100644 index 0000000000..f448891fa5 Binary files /dev/null and b/docs/media/tutorial-weatherbot/03/set-a-property.png differ diff --git a/docs/media/tutorial-weatherbot/03/tab-exceptions.png b/docs/media/tutorial-weatherbot/03/tab-exceptions.png deleted file mode 100644 index f69b17b18b..0000000000 Binary files a/docs/media/tutorial-weatherbot/03/tab-exceptions.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/03/user-prompts.png b/docs/media/tutorial-weatherbot/03/user-prompts.png new file mode 100644 index 0000000000..637e1f77bd Binary files /dev/null and b/docs/media/tutorial-weatherbot/03/user-prompts.png differ diff --git a/docs/media/tutorial-weatherbot/03/zipcode-answer.png b/docs/media/tutorial-weatherbot/03/zipcode-answer.png index 120819934e..bccb0ab4c7 100644 Binary files a/docs/media/tutorial-weatherbot/03/zipcode-answer.png and b/docs/media/tutorial-weatherbot/03/zipcode-answer.png differ diff --git a/docs/media/tutorial-weatherbot/03/zipcode-extensions.png b/docs/media/tutorial-weatherbot/03/zipcode-extensions.png index d837bc007e..a3dcb660e7 100644 Binary files a/docs/media/tutorial-weatherbot/03/zipcode-extensions.png and b/docs/media/tutorial-weatherbot/03/zipcode-extensions.png differ diff --git a/docs/media/tutorial-weatherbot/03/zipcode-flow.png b/docs/media/tutorial-weatherbot/03/zipcode-flow.png deleted file mode 100644 index 356f77c991..0000000000 Binary files a/docs/media/tutorial-weatherbot/03/zipcode-flow.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/04/begin-new-dialog-cancel.png b/docs/media/tutorial-weatherbot/04/begin-new-dialog-cancel.png new file mode 100644 index 0000000000..e85d5cc24b Binary files /dev/null and b/docs/media/tutorial-weatherbot/04/begin-new-dialog-cancel.png differ diff --git a/docs/media/tutorial-weatherbot/04/begin-new-dialog.png b/docs/media/tutorial-weatherbot/04/begin-new-dialog.png index 441f6a1db5..f45de7419a 100644 Binary files a/docs/media/tutorial-weatherbot/04/begin-new-dialog.png and b/docs/media/tutorial-weatherbot/04/begin-new-dialog.png differ diff --git a/docs/media/tutorial-weatherbot/04/cancel-flow.png b/docs/media/tutorial-weatherbot/04/cancel-flow.png index 708124576b..42971ac7a7 100644 Binary files a/docs/media/tutorial-weatherbot/04/cancel-flow.png and b/docs/media/tutorial-weatherbot/04/cancel-flow.png differ diff --git a/docs/media/tutorial-weatherbot/04/cancel-intent.png b/docs/media/tutorial-weatherbot/04/cancel-intent.png new file mode 100644 index 0000000000..c4beda8a63 Binary files /dev/null and b/docs/media/tutorial-weatherbot/04/cancel-intent.png differ diff --git a/docs/media/tutorial-weatherbot/04/cancel-trigger.png b/docs/media/tutorial-weatherbot/04/cancel-trigger.png index 018a231ab1..0b1e80c322 100644 Binary files a/docs/media/tutorial-weatherbot/04/cancel-trigger.png and b/docs/media/tutorial-weatherbot/04/cancel-trigger.png differ diff --git a/docs/media/tutorial-weatherbot/04/help-dialog.png b/docs/media/tutorial-weatherbot/04/help-dialog.png index 5dd3aaa452..ebb99e8d13 100644 Binary files a/docs/media/tutorial-weatherbot/04/help-dialog.png and b/docs/media/tutorial-weatherbot/04/help-dialog.png differ diff --git a/docs/media/tutorial-weatherbot/04/help-intent.png b/docs/media/tutorial-weatherbot/04/help-intent.png index 1dcfde7bb4..5c6ba9af3f 100644 Binary files a/docs/media/tutorial-weatherbot/04/help-intent.png and b/docs/media/tutorial-weatherbot/04/help-intent.png differ diff --git 
a/docs/media/tutorial-weatherbot/04/help-props.png b/docs/media/tutorial-weatherbot/04/help-props.png deleted file mode 100644 index 9331112c2d..0000000000 Binary files a/docs/media/tutorial-weatherbot/04/help-props.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/04/help.png b/docs/media/tutorial-weatherbot/04/help.png index dd8d8e5dee..d81c71f44f 100644 Binary files a/docs/media/tutorial-weatherbot/04/help.png and b/docs/media/tutorial-weatherbot/04/help.png differ diff --git a/docs/media/tutorial-weatherbot/04/intent-trigger-cancel.png b/docs/media/tutorial-weatherbot/04/intent-trigger-cancel.png new file mode 100644 index 0000000000..a1bd360360 Binary files /dev/null and b/docs/media/tutorial-weatherbot/04/intent-trigger-cancel.png differ diff --git a/docs/media/tutorial-weatherbot/04/intent-trigger-help.png b/docs/media/tutorial-weatherbot/04/intent-trigger-help.png index e1273a186b..847c785fd9 100644 Binary files a/docs/media/tutorial-weatherbot/04/intent-trigger-help.png and b/docs/media/tutorial-weatherbot/04/intent-trigger-help.png differ diff --git a/docs/media/tutorial-weatherbot/04/interrupts.png b/docs/media/tutorial-weatherbot/04/interrupts.png index ac1987d524..c3f6564ef6 100644 Binary files a/docs/media/tutorial-weatherbot/04/interrupts.png and b/docs/media/tutorial-weatherbot/04/interrupts.png differ diff --git a/docs/media/tutorial-weatherbot/04/new_trigger.png b/docs/media/tutorial-weatherbot/04/new_trigger.png index 25e2da9b66..b424485178 100644 Binary files a/docs/media/tutorial-weatherbot/04/new_trigger.png and b/docs/media/tutorial-weatherbot/04/new_trigger.png differ diff --git a/docs/media/tutorial-weatherbot/04/plus-sign-icon.png b/docs/media/tutorial-weatherbot/04/plus-sign-icon.png new file mode 100644 index 0000000000..94f1fdec61 Binary files /dev/null and b/docs/media/tutorial-weatherbot/04/plus-sign-icon.png differ diff --git a/docs/media/tutorial-weatherbot/04/select-begindialog-trigger.png b/docs/media/tutorial-weatherbot/04/select-begindialog-trigger.png new file mode 100644 index 0000000000..44b377bc93 Binary files /dev/null and b/docs/media/tutorial-weatherbot/04/select-begindialog-trigger.png differ diff --git a/docs/media/tutorial-weatherbot/04/select-cancel-dialog.png b/docs/media/tutorial-weatherbot/04/select-cancel-dialog.png new file mode 100644 index 0000000000..6b8ef4b1e6 Binary files /dev/null and b/docs/media/tutorial-weatherbot/04/select-cancel-dialog.png differ diff --git a/docs/media/tutorial-weatherbot/04/select-help-dialog.png b/docs/media/tutorial-weatherbot/04/select-help-dialog.png new file mode 100644 index 0000000000..cd419c8942 Binary files /dev/null and b/docs/media/tutorial-weatherbot/04/select-help-dialog.png differ diff --git a/docs/media/tutorial-weatherbot/04/select-textinput-action.png b/docs/media/tutorial-weatherbot/04/select-textinput-action.png new file mode 100644 index 0000000000..d15a9b0144 Binary files /dev/null and b/docs/media/tutorial-weatherbot/04/select-textinput-action.png differ diff --git a/docs/media/tutorial-weatherbot/05/Select-the-BeginDialog-trigger.png b/docs/media/tutorial-weatherbot/05/Select-the-BeginDialog-trigger.png new file mode 100644 index 0000000000..ee02e4408b Binary files /dev/null and b/docs/media/tutorial-weatherbot/05/Select-the-BeginDialog-trigger.png differ diff --git a/docs/media/tutorial-weatherbot/05/bot-responses.png b/docs/media/tutorial-weatherbot/05/bot-responses.png new file mode 100644 index 0000000000..491b397beb Binary files /dev/null and 
b/docs/media/tutorial-weatherbot/05/bot-responses.png differ diff --git a/docs/media/tutorial-weatherbot/05/botsays.png b/docs/media/tutorial-weatherbot/05/botsays.png index ae2dd4050c..0a056ec5cc 100644 Binary files a/docs/media/tutorial-weatherbot/05/botsays.png and b/docs/media/tutorial-weatherbot/05/botsays.png differ diff --git a/docs/media/tutorial-weatherbot/05/editmode.png b/docs/media/tutorial-weatherbot/05/editmode.png index 750ec50a77..d6c442c2a1 100644 Binary files a/docs/media/tutorial-weatherbot/05/editmode.png and b/docs/media/tutorial-weatherbot/05/editmode.png differ diff --git a/docs/media/tutorial-weatherbot/05/lg-2.png b/docs/media/tutorial-weatherbot/05/lg-2.png index ff685ddf9d..1858123d67 100644 Binary files a/docs/media/tutorial-weatherbot/05/lg-2.png and b/docs/media/tutorial-weatherbot/05/lg-2.png differ diff --git a/docs/media/tutorial-weatherbot/05/nav1.png b/docs/media/tutorial-weatherbot/05/nav1.png index 63ba47d5ca..2d65927a42 100644 Binary files a/docs/media/tutorial-weatherbot/05/nav1.png and b/docs/media/tutorial-weatherbot/05/nav1.png differ diff --git a/docs/media/tutorial-weatherbot/05/nav2.png b/docs/media/tutorial-weatherbot/05/nav2.png deleted file mode 100644 index e6f73699e2..0000000000 Binary files a/docs/media/tutorial-weatherbot/05/nav2.png and /dev/null differ diff --git a/docs/media/tutorial-weatherbot/05/nice-weather.png b/docs/media/tutorial-weatherbot/05/nice-weather.png index 7b0d4f1a1e..873fb69f22 100644 Binary files a/docs/media/tutorial-weatherbot/05/nice-weather.png and b/docs/media/tutorial-weatherbot/05/nice-weather.png differ diff --git a/docs/media/tutorial-weatherbot/05/select-the-WelcomeTheUser-trigger.png b/docs/media/tutorial-weatherbot/05/select-the-WelcomeTheUser-trigger.png new file mode 100644 index 0000000000..c5ad97459d Binary files /dev/null and b/docs/media/tutorial-weatherbot/05/select-the-WelcomeTheUser-trigger.png differ diff --git a/docs/media/tutorial-weatherbot/06/getWeather-beginDialog.png b/docs/media/tutorial-weatherbot/06/getWeather-beginDialog.png index 682af40a48..1457f1ab6a 100644 Binary files a/docs/media/tutorial-weatherbot/06/getWeather-beginDialog.png and b/docs/media/tutorial-weatherbot/06/getWeather-beginDialog.png differ diff --git a/docs/media/tutorial-weatherbot/06/getWeather-beginDialog2.png b/docs/media/tutorial-weatherbot/06/getWeather-beginDialog2.png new file mode 100644 index 0000000000..837c4937c3 Binary files /dev/null and b/docs/media/tutorial-weatherbot/06/getWeather-beginDialog2.png differ diff --git a/docs/media/tutorial-weatherbot/06/suggested-actions-emulator.png b/docs/media/tutorial-weatherbot/06/suggested-actions-emulator.png index cb900975e3..7e82663aa1 100644 Binary files a/docs/media/tutorial-weatherbot/06/suggested-actions-emulator.png and b/docs/media/tutorial-weatherbot/06/suggested-actions-emulator.png differ diff --git a/docs/media/tutorial-weatherbot/06/weather-card.png b/docs/media/tutorial-weatherbot/06/weather-card.png index b8e0351df7..cb4da3ee16 100644 Binary files a/docs/media/tutorial-weatherbot/06/weather-card.png and b/docs/media/tutorial-weatherbot/06/weather-card.png differ diff --git a/docs/media/tutorial-weatherbot/06/zipcode-prompt.png b/docs/media/tutorial-weatherbot/06/zipcode-prompt.png index e7bf1eaad3..7af79a224f 100644 Binary files a/docs/media/tutorial-weatherbot/06/zipcode-prompt.png and b/docs/media/tutorial-weatherbot/06/zipcode-prompt.png differ diff --git a/docs/media/tutorial-weatherbot/07/back-at-zipcode-prompt.png 
b/docs/media/tutorial-weatherbot/07/back-at-zipcode-prompt.png index 818f1517ed..01239f3f85 100644 Binary files a/docs/media/tutorial-weatherbot/07/back-at-zipcode-prompt.png and b/docs/media/tutorial-weatherbot/07/back-at-zipcode-prompt.png differ diff --git a/docs/media/tutorial-weatherbot/07/luis-key.png b/docs/media/tutorial-weatherbot/07/luis-key.png index d58e5570f6..a703ae87e7 100644 Binary files a/docs/media/tutorial-weatherbot/07/luis-key.png and b/docs/media/tutorial-weatherbot/07/luis-key.png differ diff --git a/docs/media/tutorial-weatherbot/07/luis-score.png b/docs/media/tutorial-weatherbot/07/luis-score.png index 9687c7d390..3e71d69fb9 100644 Binary files a/docs/media/tutorial-weatherbot/07/luis-score.png and b/docs/media/tutorial-weatherbot/07/luis-score.png differ diff --git a/docs/media/tutorial-weatherbot/07/luis-wired-up.png b/docs/media/tutorial-weatherbot/07/luis-wired-up.png index 580278ba91..f8dea8fcdf 100644 Binary files a/docs/media/tutorial-weatherbot/07/luis-wired-up.png and b/docs/media/tutorial-weatherbot/07/luis-wired-up.png differ diff --git a/docs/media/tutorial-weatherbot/07/set-property-zipcode.png b/docs/media/tutorial-weatherbot/07/set-property-zipcode.png index dc69888ff5..f35f8f4cbf 100644 Binary files a/docs/media/tutorial-weatherbot/07/set-property-zipcode.png and b/docs/media/tutorial-weatherbot/07/set-property-zipcode.png differ diff --git a/docs/tutorial-onboarding.md b/docs/onboarding.md similarity index 100% rename from docs/tutorial-onboarding.md rename to docs/onboarding.md diff --git a/docs/tutorial-create-echobot.md b/docs/quickstart-create-bot.md similarity index 96% rename from docs/tutorial-create-echobot.md rename to docs/quickstart-create-bot.md index 53aa4f6c9d..26e65d2c23 100644 --- a/docs/tutorial-create-echobot.md +++ b/docs/quickstart-create-bot.md @@ -29,5 +29,5 @@ After opening Composer in a browser click the **Echo Bot** button at the top of You've successfully created an echo bot! ## Next Steps -Create a [weather bot](tutorial/bot-tutorial-introduction.md) using Composer. +Create a [weather bot](tutorial/tutorial-introduction.md) using Composer. diff --git a/docs/setup-yarn.md b/docs/setup-yarn.md index 0a04d73ec8..a9f4aa09d9 100644 --- a/docs/setup-yarn.md +++ b/docs/setup-yarn.md @@ -11,32 +11,43 @@ Bot Framework Composer is designed to be a hosted web app. Currently, you need t - [Bot Framework Emulator](https://github.com/microsoft/BotFramework-Emulator/releases/latest): latest stable version - [.NET Core SDK 2.2](https://dotnet.microsoft.com/download/dotnet-core/2.2): required to test your bot -## Set up yarn for Composer -To start, clone the Composer GitHub repository. -``` -git clone https://github.com/microsoft/BotFramework-Composer.git -``` - -After cloning the repo open a terminal and navigate to the Bot Framework Composer folder. Navigate to the **Composer** folder and run the following commands: -``` -cd Composer -yarn -``` - This command gets all dependent packages. - -``` -yarn build -``` - This command builds the Composer app. The build process can take few minutes. +## Installation instructions +1. To start, open a terminal and clone the Composer GitHub repository. You will use this terminal for the rest of the steps in this section. + + ``` + git clone https://github.com/microsoft/BotFramework-Composer.git + ``` + +2. After cloning the repository, navigate to the **Bot Framework Composer** folder. 
Then run the following commands to navigate to the **Composer** folder and get all required packages: + + ``` + cd Composer + yarn + ``` + +3. Next, run the following command to build the Composer application. This command can take several minutes to finish: + + ``` + yarn build + ``` - If you are having trouble intalling or building Composer run `yarn tableflip`, which removes all of the Composer application's dependencies (node_modules) and reinstalls and rebuilds the application's dependencies. After running `yarn tableflip` run `yarn install` and `yarn build` again. This process can take anywhere from 5-10 minutes. + > [!NOTE] + > If you are having trouble installing or building Composer, run `yarn tableflip`. This will remove all of the Composer application's dependencies (node_modules) and then it reinstalls and rebuilds all of its dependencies. Once completed, run `yarn install` and `yarn build` again. This process generally takes 5-10 minutes. + +4. Again using Yarn, start the Composer authoring application and the bot runtime: + + ``` + yarn startall + ``` + +5. Once you see **Composer now running at:** appear in your terminal, you can run Composer in your browser using the address http://localhost:3000. + + ![browser address](./media/setup-yarn/address.png) + +Keep the terminal open as long as you plan to work with Composer. If you close it, Composer will stop running. -``` -yarn startall -``` - This command starts the Composer authoring application and the bot runtime. +The next time you need to run Composer, all you need to do is run `yarn startall` from the **Composer** directory. -## Open Composer in a browser -To use Composer open a browser and navigate to the address after the message `Compiled successfully`, as seen below where the address is http://localhost:3000. +## Next steps -![browser address](./media/setup-yarn/address.png) +- Create an [echo bot](./quickstart-create-bot.md) using Composer. diff --git a/docs/tutorial/bot-tutorial-add-dialog.md b/docs/tutorial/bot-tutorial-add-dialog.md deleted file mode 100644 index 5adad95604..0000000000 --- a/docs/tutorial/bot-tutorial-add-dialog.md +++ /dev/null @@ -1,91 +0,0 @@ -# Add a dialog - -When building features of a bot with Composer, it is sometimes useful to create a new **dialog** to contain a chunk of functionality. This helps keep the dialog system organized, and also allows sub-dialogs to be combined into larger, more complex dialogs. - -Each dialog contains one or more triggers that launch associated actions. They can have their own dedicated language model. Dialogs can call other dialogs and can pass values back and forth. - -## What are we building? - -The main feature of this bot is reporting on the current weather conditions. - -To do this, we'll create a dialog that -- prompts the user to enter a zipcode to use as location for weather lookup -- calls an external API to retrieve the weather data for a specific zipcode. - -First, we'll set up all the components and make sure they work together. Then, we'll flesh out the functionality. - -## Create a new dialog -1. Click the **+ New Dialog** button in the left hand explorer. A dialog will appear and ask for a **Name** and **Description** - -2. Give this new dialog the name: - - `getWeather` - - and the description: - - `Get the current weather conditions` - -3. Click **Next**, and Composer will create the new dialog and open it in the editor.
- - ![](../media/tutorial-weatherbot/02/create-getweather.png) - -Composer created this new dialog with a `BeginDialog` trigger pre-configured. - -3. For now, we'll just add a simple message to get things hooked up, then we'll come back to flesh out the feature. With `BeginDialog` trigger selected, click the **+** in the flow, and use the same **Send a response** action. Set the text of the activity to: - - `Let's check the weather` - - You'll have a flow that looks like this: - - ![](../media/tutorial-weatherbot/02/getweather-draft.png) - -## Wiring up dialogs -You can break pieces of your conversation flow into dialogs and can chain them together. Let's get the newly created `getWeather` dialog wired up to the root dialog. - -1. Click on `WeatherBot.Main` from the left navigation tree. After selecting `WeatherBot.Main` from the explorer, find the **Language Understanding** section of the properties panel. - - > Each dialog can have it's own **recognizer**, a component that lets the bot examine an incoming message and decide what it means by choosing between a set of predefined **intents**. Different types of recognizers use different techniques to determine which intent, if any, to choose. - - > For now, we're going to use the **Regular Expression** recognizer, which uses pattern matching. Later, we'll use more sophisticated natural language understanding technology from **LUIS**. - -2. Under the **Recognizer Type**, select `Regular Expression` - - ![](../media/tutorial-weatherbot/02/regexp-recognizer.gif) - -3. Click the **Add** button. Two new fields will appear: **Intent** and **Pattern** - - ![](../media/tutorial-weatherbot/02/weather-intent.png) - -4. Define the bot's first intent. Set the value of the **Intent** field to: - - `weather` - -5. Set the value of the **Pattern** field to: - - `weather` - - > This tells the bot to look for the word "weather" anywhere in an incoming message. Regular expression patterns can be much more complicated than this, but for now, this will do! - -6. Click "+ New Trigger" in the left hand side under the `weatherBot.Main` header, and a modal will appear. Select **Intent** from the first dropdown, and then select our freshly created `weather` intent from the second dropdown. - - ![](../media/tutorial-weatherbot/02/weather-trigger.png) - -7. Click the **+** in the flow and select the `Dialog management` option. From the submenu, select `Begin a new dialog` - -8. In the properties panel for the new action, set the `dialog name` property to our `getWeather` dialog. - -![](../media/tutorial-weatherbot/02/begin-dialog-congifure.gif) - - -## Let's test it out. - -1. Click the **Restart Bot** button in the upper right hand corner of the Composer window. This will update the bot runtime app with all the new content and settings. Then, click **Test in Emulator**. When Emulator connects to your bot, it'll send the greeting we configured in the last section. - - ![](../media/tutorial-weatherbot/02/restart-bot.gif) - -2. Send the bot a message that says `weather`. The bot should respond with our test message, confirming that our intent was recognized as expected, and the fulfillment action was triggered. 
- - ![](../media/tutorial-weatherbot/02/emulator-weather-draft.png) - -## Next steps -- [Get weather](./bot-tutorial-get-weather.md) diff --git a/docs/tutorial/bot-tutorial-add-help.md b/docs/tutorial/bot-tutorial-add-help.md deleted file mode 100644 index 277a6d00d4..0000000000 --- a/docs/tutorial/bot-tutorial-add-help.md +++ /dev/null @@ -1,132 +0,0 @@ -# Add Help and Cancel - -With even a simple bot, it is a good practice to provide a help command. You'll also want to provide a way for users to back out. - -1. Click **+ New Dialog** in the left hand explorer. You'll see a popup window. -2. Give this new dialog the name: - - `help` - - ![](../media/tutorial-weatherbot/04/help-dialog.png) - -3. Click **Next**, and you'll land in the editor view for the new help dialog. - - - - Composer created this new dialog with one `BeginDialog` trigger pre-configured. - -4. With the `BeginDialog` trigger selected, use the **+** button at the bottom of the flow, choose **Send a response** -5. In the properties panel on the right side, set the text of the activity to: - - `I am a weather bot! I can tell you the current weather conditions. Just say WEATHER.` - - ![](../media/tutorial-weatherbot/04/help.png) - - Next, let's wire this new dialog up to the Main dialog (your bot's brain). - -6. In the left hand explorer, click on `weatherBot.Main` at the top of the list. -7. In the right hand property pane, find the "Language Understanding" section and click the "Add" button at the bottom. This will reveal 2 new fields, allowing you to define a new intent. -8. Set the **Intent** field to: - - `help` - - Set the **Pattern** field to: - - `help` - - ![](../media/tutorial-weatherbot/04/help-intent.png) - -9. In the left hand explorer, click **+ New Trigger** -10. In the resulting dialog box, select **Intent recognized**, then choose the new `help` intent. Click `Submit`. - - ![](../media/tutorial-weatherbot/04/intent-trigger-help.png) - -11. In the flow editor, click the **+** button at the bottom of the empty flow. -12. Choose **Dialog management** and then select **Begin a new dialog** - - ![](../media/tutorial-weatherbot/04/begin-new-dialog.png) - -13. In the right hand properties panel, select the `help` dialog. - - ![](../media/tutorial-weatherbot/04/help-props.png) - -14. Click **Restart Bot** and open it in the emulator. - ----- - -Now, in addition to giving you the current weather, your bot can also offer help. - -![](../media/tutorial-weatherbot/04/basic-help.gif) - -However, notice that once you start the weather dialog by saying weather, your bot doesn't know how to provide help. Let's fix this! - - ---- - -## Allowing interruptions - -1. In Composer's left hand explorer, navigate back to the `getWeather` dialog. Make sure to highlight the `BeginDialog` trigger. -2. Select the **Bot Asks** node in the flow that says `What is your zipcode?` -3. In the right hand properties panel, set **Allow Interruptions** to `true` - ![](../media/tutorial-weatherbot/04/interrupts.png) - - > This tells Bot Framework to consult the parent dialog's recognizer, which will allow the bot to respond to `help` at the prompt as well. - -4. Hit **Restart Bot** and open it in the emulator. - -Say `weather` to your bot. It will ask for a zipcode. - -Now say `help`. It'll provide the global help response, even though that intent and trigger are defined in another dialog. Interruptions are a powerful way to make complex bots - we'll come back to that later. 
- -![](../media/tutorial-weatherbot/04/better-help.gif) - -For now, let's add one more global function - a cancel command. - - -## Global cancel - -1. In Composer's left hand explorer, click the **+ New Dialog** button again. -2. Give this new dialog the name: - - `cancel` - -3. Use the **+** button at the bottom of the flow, choose **Send a response** -4. In the properties panel on the right side, set the text of the activity to: - - `Canceling!` - -5. Use the **+** button again, this time choose **Dialog management**, then **Cancel all dialogs** - - > When triggered, this will cause the bot to cancel any active dialogs, and send the user back to the main dialog. - - ![](../media/tutorial-weatherbot/04/cancel-flow.png) - -6. In the left hand explorer, click on `weatherBot.Main` at the top of the list. -7. In the right hand property pane, find the **Language Understanding** section and click the **Add** button at the bottom. This will reveal two new fields, allowing you to define a new intent. -8. Set the **Intent** field to: - - `cancel` - -9. Set the **Pattern** field to: - - `cancel` - -10. In the left hand explorer, click **+ New Trigger** -11. In the resulting dialog box, select **Intent**, then choose the new `cancel` intent. Submit the dialog. -12. In the flow editor, click the **+** button at the bottom of the empty flow. -13. Choose **Dialog management** and then select **Begin a new dialog** -14. In the right hand properties panel, select the `cancel` dialog. - - ![](../media/tutorial-weatherbot/04/cancel-trigger.png) - -15. Click **Restart Bot** and open it in the emulator. - -Say `weather` to your bot. It will ask for a zipcode. - -Now say `help`. It'll provide the global help response. - -Now, say `cancel` - notice, the bot doesn't resume the weather dialog. Instead, it confirms the cancelation, and waits for your next message. - - -## Next steps -- [Add Language Generation](./bot-tutorial-lg.md) diff --git a/docs/tutorial/bot-tutorial-cards.md b/docs/tutorial/bot-tutorial-cards.md deleted file mode 100644 index dc6b50e62c..0000000000 --- a/docs/tutorial/bot-tutorial-cards.md +++ /dev/null @@ -1,60 +0,0 @@ -# Using cards - -You can use the Language Generation system to also render UI cards and button actions to user. - -Let's further refine the responses provided by the weather bot to include cards and button actions. - -## Rendering suggested actions - -Suggested actions help guide user by providing them most frequently used set of actions to interact with the bot. - -First, let's go ahead and update `prompt for zipcode` to include suggested actions for help and cancel actions. - -1. In Composer, click on `getWeather`, then make sure to highlight the `BeginDialog` trigger. - - ![](../media/tutorial-weatherbot/06/getWeather-beginDialog.png) - -2. Select the `What is your zipcode?` node in the flow. - - ![](../media/tutorial-weatherbot/06/zipcode-prompt.png) - -3. Update the prompt text to be this instead - - ``` - [Activity - Text = What is your zipcode? - SuggestedActions = help | cancel - ] - ``` - -4. Click **Restart Bot** and open it in the emulator. - -Now when you say weather to your bot, you will not only see that your bot asks you for zipcode but also presents help and cancel button as suggested actions. - -![](../media/tutorial-weatherbot/06/suggested-actions-emulator.png) - ---- - -Next up, let's change the weather report to also include a card. - -5. 
With the `getWeather` dialog selected and `BeginDialog` trigger selected, scroll down to the bottom, and click on the **Send a response** node that starts with `{DescribeWeather(dialog.weather)}...` -6. Instead of coming back with simple text response, let's have this action come back with a weather card. Replace the activity with this - - ``` - [ThumbnailCard - title = Weather for @{dialog.weather.city} - text = The weather is {dialog.weather.weather} and @{dialog.weather.temp}° - image = @{dialog.weather.icon} - ] - ``` - -7. Click **Restart Bot** and open it in the emulator. - ---- - -Go through the bot flow, say `weather` followed by a zip code. Notice now the bot responds back with a card and image. - - ![](../media/tutorial-weatherbot/06/weather-card.png) - ---- - -## Next steps -- [Add LUIS](./bot-tutorial-luis.md) diff --git a/docs/tutorial/bot-tutorial-get-weather.md b/docs/tutorial/bot-tutorial-get-weather.md deleted file mode 100644 index 08133e4687..0000000000 --- a/docs/tutorial/bot-tutorial-get-weather.md +++ /dev/null @@ -1,151 +0,0 @@ -# Get weather report - -1. In the explorer, click on `getWeather` to select the dialog and reveal the triggers it contains. -2. Click on the `BeginDialog` trigger underneath `getWeather`. The first thing we need to do to check a user's local weather is collect the user's location. Our weather API accepts a 5 digit zipcode as a parameter. So, let's add a **Text Input** to prompt the user for a `zipcode`. -3. Click the **+** button in the flow and select **Ask a question**. You'll see a variety of options for asking for different types of input. Read details [here](howto-ask-for-user-input.md) -4. Select **Text Input** from the sub-menu. Two new nodes will appear in the flow! - - > You use prompts to collect information from user. Prompt are broken down into a few pieces. We'll configure each separately. - - ![](../media/tutorial-weatherbot/03/empty-prompt.png) - -5. Click on the **Bot Asks** node. This part of the prompt represents the message the bot will send to the user requesting information. In the properties panel set the prompt to: - - `What is your zipcode?` - -6. Set the **Default value** property (next to Max turn count) to `'98052'` (include the quotes). - - > By default prompts are configured to ask the user for information `Max turn count` number of times (defaults to 3). When this happens, the prompt will stop and set the **Default value** to the `Property` and move forward with the conversaiton. - - ![](../media/tutorial-weatherbot/03/zipcode-prompt.png) - -7. Next, click the **User Input** tab in the properties panel. This part of the prompt represents the user's response, including where to store the value and how to pre-process it. - -8. Here, we can specify what property in memory will be used to store the user's response. In `Property to fill`, enter the value: - - `user.zipcode` - - For **Output Format**, select `trim`. This ensures leading and trailing spaces in user input are trimmed before the value is assigned to `user.zipcode` - - ![](../media/tutorial-weatherbot/03/zipcode-answer.png) - -9. Click on the **Others** tab in the properties panel. This section allows you to specify validation rules for the prompt, as well as error messages that will be used if the user provides an invalid response. - -10. In the **Unrecognized Prompt** field, enter: - - `- Sorry, I do not understand '{this.value}'. 
Please specify a zipcode in the form 12345` - - In the **Invalid Prompt** field, also enter: - - `- Sorry, '{this.value}' is not valid. I'm looking for a 5 digit number as zipcode. Please specify a zipcode in the form 12345` - -11. In **Validation Rules**, type: - - > validation rule 1 says we need a five characters - - `length(this.value) == 5` - - and then press enter. - - > Make sure to press enter to add the rule! - - Your properties pane should look like this: - - ![](../media/tutorial-weatherbot/03/zipcode-extensions.png) - - And your flow should look like this: - - ![](../media/tutorial-weatherbot/03/zipcode-flow.png) - - With these options set, we have a dialog that will prompt the user for a zipcode. If the user gives a valid 5 digit zipcode, the prompt will store the value in `user.zipcode` and move on. If the user gives an invalid zipcode (e.g. `tomato` or `123456`), the prompt will present an error message and repeat until a valid response is received. - - > There are some options in the footer of the prompt properties that can be used to tune how the prompt works. - - > Max turn count can be used to control how many times the bot will reprompt after invalid responses. - - > By default, prompts will be skip if the bound property already has a value. Always prompt, when enabled, will cause the prompt to appear even if the value is already known. Leave this unchecked for now. - - After this action occurs, the bot can use `{user.zipcode}` in messages, and more importantly, in calls to external APIs! - -## Add an HTTP request - -The http request action is found under the **Access external resources** menu in the flow **+** button. - -1. Select **Send an HTTP request** to add a the step to your flow. - - ![](../media/tutorial-weatherbot/03/http-step.png) - -2. In the properties editor, - - Set the method to `GET` - - Set the URL to: - - `http://weatherbot-ignite-2019.azurewebsites.net/api/getWeather?zipcode={user.zipcode}` - - Set the **Result property** to: - - `dialog.api_response` - - ![](../media/tutorial-weatherbot/03/http-props.png) - - This will cause the bot to make an HTTP request to the url specified. The reference to `{user.zipcode}` will be replaced by a live value from the bot's memory. - - > HTTP action sets the following information in the **Result property**: statusCode, reasonPhrase, content, headers. Setting the **Result property** to `dialog.api_response` means we can access those values via `dialog.api_response.statusCode`, `dialog.api_response.reasonPhrase`, `dialog.api_response.content` and `dialog.api_response.headers`. If the response is json, it will be a deserialized object available via `dialog.api_response.content`. - - After making an HTTP request, we need to test the status of the response. To do this, we'll use an If/Else branch. - -3. Use the '+' button, then choose **Create a condition**, then choose **Branch: If/Else** -4. In the properties panel on the right, set the **Condition** field to: - - `dialog.api_response.statusCode == 200` - -5. In the `true` branch click the **+** button, select **Manage properties**, and then **Set a Property** - - Set **Property** to: - - `dialog.weather` - - Set **Value** to: - - `dialog.api_response.content` - - ![](../media/tutorial-weatherbot/03/set-property-condition.png) - -6. 
Still in the `true` branch, use the **+** button, then select **Send a response** - - Set the text of the message to: - - `The weather is {dialog.weather.weather} and the temp is {dialog.weather.temp}°` - - ![](../media/tutorial-weatherbot/03/ifelse.png) - -7. Now, in the `false` branch, use the **+** button, then select **Send a response** - - Set the text of the message to: - - `I got an error: {dialog.api_response.content.message}` - -8. To be safe, let's clean up the invalid value which otherwise would persist. Use the **+**, select **Manage properties**, then select **Delete a property** - - Set the property to: - - `user.zipcode` - - ![](../media/tutorial-weatherbot/03/ifelse2.png) - - -## Test in Emulator - -1. Restart the bot again, and open it in the emulator. - - ![](../media/tutorial-weatherbot/02/restart-bot.gif) - -2. After the greeting, send `weather` to the bot. The bot will prompt you for a zipcode. Give it your home zipcode, and seconds later, you should see the current weather conditions! - - ![](../media/tutorial-weatherbot/03/basic-weather.gif) - - If you ask for the weather again, notice that the bot doesn't prompt for a zipcode the second time. Remember, this is because `user.zipcode` is already set. Had we checked **Always prompt** the bot would ask each time. Go back to step 10, check **Always prompt** and try again. Your bot will ask for a zipcode everytime you re-start the conversation in emulator. - -## Next steps -- [Add help and cancel command](./bot-tutorial-add-help.md) diff --git a/docs/tutorial/bot-tutorial-introduction.md b/docs/tutorial/bot-tutorial-introduction.md deleted file mode 100644 index f1208ffcde..0000000000 --- a/docs/tutorial/bot-tutorial-introduction.md +++ /dev/null @@ -1,119 +0,0 @@ -# Build a weather bot -In this tutorial, you will build a weather bot using Bot Framework Composer. We'll start simple and gradually introduce sophistication. We'll cover how to: -- Create a new bot -- Author a new dialog -- Add global help and cancel handling -- Use Language Generation to power your bot's responses -- Use Adaptive Cards -- Handle interruptions in the conversation flow -- Add multiple dialogs to help your bot fulfill more than one scenario - -## Prerequisites -- [Bot Framework Composer](./docs/setup-yarn.md) - -- A LUIS subscription [key](https://stackoverflow.com/questions/42920829/where-can-i-get-the-luis-subscription-key) (found in [Settings](https://www.luis.ai/user/settings) in LUIS) - - -# Create the Weather Bot - -The first step in creating a bot with Bot Framework Composer is to create a new bot project from the home screen in the Composer. This will create a new folder locally on your computer with all the files necessary to build, test and run the bot. - -## Create Project - -1. From the home screen, select **New** from the upper left corner. You'll be presented with a dialog with options to either create an empty bot project from scratch, or to create one based on a template. For this workshop, make sure `Create from Scratch` selected and click **Next** - - ![create project](../media/tutorial-weatherbot/01/create-1.png) - -2. The second screen asks for a **Name** and **Description** of your bot. Let's call it: - - `WeatherBot` - - and give it a description: - - `A friendly bot who can talk about the weather.` - - > Make sure not to put any spaces or special characters in the bot's name. 
- - > Leave the **Location** field with its default value - this will put the bot project into Composer's default project folder where it will be easy to find. - - ![create bot](../media/tutorial-weatherbot/01/create-2.png) - -3. Click **Next**, and Composer will create the project for you! - -## Give your bot something to say - -After creating your bot, Composer will load the new bot's `Main` dialog in the editor. It should look like this: - -![bot conversation](../media/tutorial-weatherbot/01/empty-main-dialog.png) - -Each dialog contains one or more **[Triggers](concept-events-and-triggers.md)** that define the actions available to the bot while the dialog is active. Right now the dialog is empty, so the bot won't do anything. - -You will notice that the new bot is pre-configured with one trigger in the left dialogs window - `Greeting`. - -> Triggers help your dialog capture events of interest and respond to them using actions. - -1. Click the `Greeting` trigger in the left hand explorer. - -2. You will see a new flow has been added to the dialog. - - ![dialog](../media/tutorial-weatherbot/01/new-flow.png) - -3. To help keep the bot organized, let's rename this trigger to something that describes what it does. In the **properties panel** on the right side of the screen, click on the name of the trigger ("Greeting"). You'll be able to update the title there, and the change will be instantly reflected in the dialog and navigation on the left. Rename ths trigger to: - - `WelcomeTheUser` - - ![trigger](../media/tutorial-weatherbot/01/rename-trigger.gif) - -Now, let's actually make the bot do something! -Inside the flow, you'll see that the real **Trigger** box has a line below it that includes in a **+** button. - -The **+** button can be used to add **Actions** to the conversation flow. You can use this to add actions to the end of a flow, or insert actions at an earlier point. - -For now, let's instruct the bot to send a simple greeting. - -4. Click the **+** button and select the first menu item **Send a response**. - - ![menu](../media/tutorial-weatherbot/01/add-send-activity.gif) - -5. Select the new **Send a response** action in the flow and it's properties will appear on the right hand side of the screen. This action has only one main property - the text of the activity to send. - -6. Type a welcome message into this field. It is always a good idea to have your bot introduce itself and explain it's main features. So let's make the welcome message something like: - - `Hi! I'm a friendly bot that can help with the weather. Try saying WEATHER or FORECAST.` - -Your bot should now look like this: - - ![activity](../media/tutorial-weatherbot/01/send-activity.png) - -Next, let's temporarily disable the recognizer for the main dialog. We will get back to this in the next step. - -7. Click on `WeatherBot.Main` in the left pane to bring up the properties editor for the root dialog. - -6. In the properties panel on the right hand side, click on **Recognizer type** and select `None`. - - > Dialogs in Composer support two different recognizer types - LUIS and Regular expressions. Unless you need intent classification or entity extraction you can remove the recognizer by setting it to `None`. - - ![recognizer](../media/tutorial-weatherbot/01/recognizer-none.gif) - -## Start your bot and test it - -Now that our new bot has its first simple feature, let's launch it in the emulator and make sure everything works. - -1. Click the **Start Bot** button in the upper right hand corner of the screen. 
This tells Composer to launch the bot's runtime (an external app powered by the Bot Framework SDK) and updates it with the latest content and settings from Composer. - -2. After a few seconds, a second link will appear next to the button **Test in Emulator**. Click this link to open Emulator and connect. - - ![start bot](../media/tutorial-weatherbot/01/start-bot.gif) - -You should see a window like this appear: - - ![emulator](../media/tutorial-weatherbot/01/emulator-launch.png) - -And the bot should immediately greet you with the message we just configured: - - ![emulator](../media/tutorial-weatherbot/01/greeting-in-emulator.png) - -We now have a working bot, and we're ready to add some more substantial functionality! - -## Next steps -- [Add a dialog](./bot-tutorial-add-dialog.md) diff --git a/docs/tutorial/bot-tutorial-lg.md b/docs/tutorial/bot-tutorial-lg.md deleted file mode 100644 index fe5729f609..0000000000 --- a/docs/tutorial/bot-tutorial-lg.md +++ /dev/null @@ -1,113 +0,0 @@ -# Language Generation - -Now that the bot can perform its basic tasks, it's time to work on the conversational UI. A good bot doesn't just do a task - it does it with style and personality. - -Composer includes the Bot Framework Language Generation library, a set of powerful templating and message formatting tools that make it easy to include variation, conditional messages, and dynamic content that puts you control of how your bot responds to the user! - -Let's start by adding some variation to the welcome message. - -1. In Composer, click on `weatherBot.Main` and highlight the `WelcomeTheUser` trigger. - - ![](../media/tutorial-weatherbot/05/nav1.png) - -2. Select the **Send a response** node in the flow. - - ![](../media/tutorial-weatherbot/05/lg-1.png) - -3. In the right hand properties panel, replace the text with the following: - - ``` - -Hi! I'm a friendly bot that can help with the weather. Try saying WEATHER. - -Hello! I am Weather Bot! Say WEATHER to get the current conditions. - -Howdy! Weather bot is my name and weather is my game. - ``` - - > Each tick mark indicates a variation in the message. The bot will choose one of the responses randomly at runtime! - -4. Click **Restart Bot** and open it in the emulator. - ---- - -You'll see the bot greet you with one of the three variants we listed. - -Click the **Restart conversation** link in Emulator's top bar. You might see another variant! If you see the same response, click **Restart conversation** again! - ---- - -Currently, the bot reports the weather in a very robotic manner: The weather is Clouds and it is 75°. - -Let's improve the language used when delivering the weather conditions. To do this, we'll use 2 features of the Language Generation system: conditional messages, and parameterized messages. - -5. Navigate to the **Bot Responses** tab by clicking the bot icon on the far left of the screen. - - ![](../media/tutorial-weatherbot/05/botsays.png) - -6. Toggle the **Edit Mode** switch in the upper right hand corner so that it turns blue. This will enable a syntax-highlighted LG editor in the main pane. - - > You'll notice that every message you created in the flow editor also appears here. They're linked, and any changes you make in this view will be reflected in the flow as well. - - ![](../media/tutorial-weatherbot/05/editmode.png) - -7. Scroll to the bottom of the editor. -8. 
Paste the following text: - ``` - # DescribeWeather(weather) - - IF: {weather.weather=="Clouds"} - - It is cloudy - - ELSEIF: {weather.weather=="Thunderstorm"} - - There's a thunderstorm - - ELSEIF: {weather.weather=="Drizzle"} - - It is drizzling - - ELSEIF: {weather.weather=="Rain"} - - It is raining - - ELSEIF: {weather.weather=="Snow"} - - There's snow - - ELSEIF: {weather.weather=="Clear"} - - The skies are clear - - ELSEIF: {weather.weather=="Mist"} - - There's a mist in the air - - ELSEIF: {weather.weather=="Smoke"} - - There's smoke in the air - - ELSEIF: {weather.weather=="Haze"} - - There's a haze - - ELSEIF: {weather.weather=="Dust"} - - There's a dust in the air - - ELSEIF: {weather.weather=="Fog"} - - It's foggy - - ELSEIF: {weather.weather=="Ash"} - - There's ash in the air - - ELSEIF: {weather.weather=="Squall"} - - There's a squall - - ELSEIF: {weather.weather=="Tornado"} - - There's a tornado happening - - ELSE: - - {weather.weather} - ``` - - > This creates a new Language Generation template called `DescribeWeather`. This template receives weather data from our API as a parameter, and outputs a friendlier - description of the weather based on the raw data from the API. - -9. Navigate back to the flow designer by clicking on **Flow designer** in the left navigation bar. -10. In Composer's explorer, click on the `getWeather` dialog, and make sure the `BeginDialog` trigger is highlighted. - - ![](../media/tutorial-weatherbot/05/nav2.png) - -11. Scroll to the bottom, and click on the **Send a response** node that starts with `The weather is...` -13. In the right hand property pane, replace the activity text with the following: - - `- {DescribeWeather(dialog.weather)} and the temp is {dialog.weather.temp}°` - - > Here, we are using the `DescribeWeather` template _inside another template_. LG templates can be combined in this way to create more complex templates. - - ![](../media/tutorial-weatherbot/05/lg-2.png) - -14. Click **Restart Bot** and open it in the emulator. - ---- - -Now, when you say `weather`, the bot will send you a message that sounds much more human than it did before. It's possible to combine these techniques to quickly create lots of variety in your messages! - -![](../media/tutorial-weatherbot/05/nice-weather.png) - -## Next steps -- [Use cards](./bot-tutorial-cards.md) diff --git a/docs/tutorial/tutorial-add-dialog.md b/docs/tutorial/tutorial-add-dialog.md new file mode 100644 index 0000000000..6d689bb3d6 --- /dev/null +++ b/docs/tutorial/tutorial-add-dialog.md @@ -0,0 +1,101 @@ +# Tutorial: Adding dialogs to your bot + +This tutorial walks you through adding additional dialogs to a basic bot with the Bot Framework Composer and testing it in the Emulator. + +It can be useful to include functionality in [**dialogs**](../concept-dialog.md) when building the features of your bot with Composer. This helps keep the dialogs organized and allow sub-dialogs to be combined into larger and more complex dialogs. + +A dialog contains one or more [triggers](../concept-events-and-triggers.md). Each trigger consists of one or more actions which are the set of instructions that the bot will execute. Dialogs can also call other dialogs and can pass values back and forth between them. 
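The dialog, trigger, and action relationship described in the paragraph above can be pictured as a small tree: a bot contains dialogs, each dialog contains triggers, and each trigger runs a list of actions when it fires, with values passed between dialogs through shared memory. The TypeScript sketch below is only an illustration of that structure; the type names, the `beginDialog` helper, and the `memory` object are assumptions made for this example and are not Composer's file format or the Bot Framework SDK API.

```
// Illustrative model only: a dialog holds triggers, a trigger holds actions.
type Action = (input: string, memory: Record<string, unknown>) => void;

interface Trigger {
  matches: (input: string) => boolean; // fires for a matching message or intent
  actions: Action[];                   // the instructions the bot executes
}

interface Dialog {
  name: string;
  triggers: Trigger[];
}

// Run the first trigger in a dialog whose condition matches the input.
function beginDialog(dialog: Dialog, input: string, memory: Record<string, unknown>): void {
  for (const trigger of dialog.triggers) {
    if (trigger.matches(input)) {
      trigger.actions.forEach((action) => action(input, memory));
      return; // only the first matching trigger runs in this sketch
    }
  }
}

// A child dialog that "returns" a value by writing it into shared memory.
const getWeather: Dialog = {
  name: "getWeather",
  triggers: [
    {
      matches: () => true, // stands in for a BeginDialog trigger
      actions: [(_input, memory) => { memory["dialog.weather"] = "sunny"; }],
    },
  ],
};

// The main dialog hands control to the child dialog when its trigger fires.
const main: Dialog = {
  name: "WeatherBot.Main",
  triggers: [
    {
      matches: (input) => /weather/i.test(input), // stands in for an intent trigger
      actions: [
        (input, memory) => beginDialog(getWeather, input, memory),
        (_input, memory) => console.log(`The weather is ${memory["dialog.weather"]}`),
      ],
    },
  ],
};

beginDialog(main, "what's the weather?", {}); // prints "The weather is sunny"
```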
+ +In this tutorial, you learn how to: + +> [!div class="checklist"] +> * Build on the basic bot created in the previous tutorial by adding an additional dialog +> * Run your bot locally and test it using the Bot Framework Emulator + +## Prerequisites +- Completion of the tutorial [Create a new bot and test it in the Emulator](./tutorial-create-bot.md) +- An understanding of the concepts taught in [the dialog concept article](../concept-dialog.md) + +## What are we building? + +The main feature of this bot is reporting on the current weather conditions. + +To do this, you will create a dialog that +- Prompts the user to enter a zip code to use as location for weather lookup. +- Calls an external API to retrieve the weather data for a specific zip code. + +> [!TIP] +> It is recommended that you first create all of the components of your bot and make sure they work together before creating the detailed functionality. + +## Create a new dialog +1. Click the **+ New Dialog** button in the navigation pane. A dialog will appear and ask for a **Name** and **Description**. + +2. Fill in the **Name** field with **getWeather** and the **Description** field with **Get the current weather conditions**. + + ![Create Get Weather Dialog](../media/tutorial-weatherbot/02/create-getweather-dialog.png) + +1. After clicking **Next**, Composer will create the new dialog and open it in the editor. Composer will also create this new dialog with a pre-configured **BeginDialog** trigger. + +2. For now, we'll just add a simple message to get things hooked up, then come back to flesh out the feature. With the **BeginDialog** trigger selected, click the **+** sign in the flow and use the same **Send a response** action. Set the text of the activity to: + + **Let's check the weather** + + You'll have a flow that looks like this: + + ![Begin dialog](../media/tutorial-weatherbot/02/begin-dialog.png) + +## Connect your new dialog +You can break pieces of your conversation flow into different dialogs and can chain them together. Next you need to get the newly created **getWeather** dialog connected to the main dialog. + +1. Select **WeatherBot.Main** in the **Navigation pane**. + +2. Find the **Language Understanding** section in the **Properties panel**. + + > Each dialog can have its own [recognizer](../concept-dialog.md#recognizer), a component that lets the bot examine an incoming message and decide what it means by choosing between a set of predefined [intents](../concept-language-understanding.md#intents). Different types of recognizers use different techniques to determine which intent, if any, to choose. + + > [!NOTE] + > For now, you're going to use the [Regular Expression Recognizer](../how-to-define-triggers.md#regular-expression-recognizer), which uses pattern matching. Later, you will use more sophisticated natural language understanding technology from [LUIS](../how-to-define-triggers.md#luis-recognizer). + +3. Select **Regular Expression** from the **Recognizer Type** drop-down list. + + ![regular expression recognizer](../media/tutorial-weatherbot/02/recognizer-type.png) + +4. Enter **weather** for both **Intent** and **Pattern**. Make sure you press **Enter** on your keyboard to save the setting. + + ![intent and pattern](../media/tutorial-weatherbot/02/intent-pattern.png) + + > [!NOTE] + > This tells the bot to look for the word "weather" anywhere in an incoming message. Regular expression patterns are generally much more complicated, but this is adequate for the purposes of this example. A short sketch of this style of matching appears below.
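The following TypeScript sketch illustrates how this kind of case-insensitive pattern matching can map an utterance to an intent name. The `patterns` table and the `recognize` function are hypothetical names chosen for this example; they are not Composer's or the Regular Expression recognizer's actual implementation.

```
// Illustrative only: match an utterance against a table of regex patterns.
const patterns: Record<string, RegExp> = {
  weather: /weather/i, // matches "weather" anywhere in the message, ignoring case
};

function recognize(utterance: string): string | undefined {
  for (const [intent, pattern] of Object.entries(patterns)) {
    if (pattern.test(utterance)) {
      return intent; // first matching pattern wins in this sketch
    }
  }
  return undefined; // no intent matched
}

console.log(recognize("How is the WEATHER today?")); // "weather"
console.log(recognize("Tell me a joke"));            // undefined
```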
+ +5. Next, create a new trigger in the **weatherBot.Main** dialog by selecting **+ New Trigger**. + +6. In the **Create a trigger** form that appears, select **Intent recognized** as the trigger type, then select **weather** from the **Which intent do you want to handle?** drop-down list, then **Submit**. + + ![weather intent trigger](../media/tutorial-weatherbot/02/weather-intent-trigger.png) + +7. Next, create a new action for the **Intent recognized** trigger you just created. You can do this by selecting the **+** sign under the trigger node in the _Authoring canvas_, then select **Begin a new dialog** from the **Dialog management** menu. + + ![dialog management](../media/tutorial-weatherbot/02/dialog-management.png) + +8. In the **Properties panel** for the new **Begin a new dialog** action, select **getWeather** from the **dialog name** drop-down list. + + ![connect dialog](../media/tutorial-weatherbot/02/connect-dialog.png) + +Now when a user enters **weather**, your bot will respond by activating the **getWeather** dialog. + +In the next tutorial you will learn how to prompt the user for additional information, then query a weather service and return the results to the user. First, however, you need to validate that the functionality developed so far works correctly; you will do this using the Emulator. + + +## Test bot in the Emulator + +1. Select the **Restart Bot** button in the upper right hand corner of the Composer window. This will update the bot runtime app with all the new content and settings. Then select **Test in Emulator**. When the Emulator connects to your bot, it'll send the greeting you configured in the last section. + + ![Restart Bot](../media/tutorial-weatherbot/02/restart-bot.gif) + +2. Send the bot a message that says **weather**. The bot should respond with your test message, confirming that your intent was recognized as expected, and the fulfillment action was triggered. + + ![Weather Bot in Emulator](../media/tutorial-weatherbot/02/emulator-weather.png) + + +## Next steps +- [Get weather](./tutorial-get-weather.md) diff --git a/docs/tutorial/tutorial-add-help.md b/docs/tutorial/tutorial-add-help.md new file mode 100644 index 0000000000..5e6dc2b3c8 --- /dev/null +++ b/docs/tutorial/tutorial-add-help.md @@ -0,0 +1,154 @@ +# Tutorial: Adding Help and Cancel to your bot + +In this tutorial you will learn how to handle interruptions to the conversation flow. Composer enables you to add help topics to your bot and to enable users to exit out of any process at any time. + +In this tutorial, you learn how to: + +> [!div class="checklist"] +> * Create help topics that can be accessed from any flow at any time. +> * Interrupt your bot's flow to enable your users to exit out of any process before it is completed. + + +## Prerequisites +- Completion of the tutorial [Adding actions to your dialog](./tutorial-get-weather.md). + + +## Add Help and Cancel + +With even a simple bot, it is a good practice to provide help. You'll also want to provide a way for users to exit at any point in the flow. + +1. You first need to create a new dialog. Select **+ New Dialog** in the **Navigation** pane. + +2. Enter **help** in the **Name** field and **global help** in the **Description** field of the **Define conversation objective** form, then select **Next**. + + ![The Define conversation objective form](../media/tutorial-weatherbot/04/help-dialog.png) + + Composer will create the new _help_ dialog with one **BeginDialog** trigger pre-configured. + +3.
Select the **BeginDialog** trigger in the **Navigation** pane. + +4. Create a new action at the bottom of the flow by selecting the plus (+) icon in the **Authoring canvas**, then select **Send a response** from the list of actions. + + ![Plus sign icon](../media/tutorial-weatherbot/04/plus-sign-icon.png) + +5. Enter the following text into the **Properties** panel on the right side of the Composer screen: + + **I am a weather bot! I can tell you the current weather conditions. Just say WEATHER.** + + The next step is to connect this new _help_ dialog with the bot's main dialog. + +6. Select **weatherBot.Main** at the top of the **Navigation** pane. + + Next you will create a new [intent](../concept-language-understanding.md#intents) so that when the user enters 'help', the bot's recognizer will identify that as a user intent and the bot will know what to do. + +7. Create a new intent in the **RegEx patterns to intents** section of the **Properties** panel by entering **help** in both the **Intent** and **Pattern** fields, then press the enter key on your keyboard to ensure the new values are saved. + + > [!IMPORTANT] + > You need to enter values into both the **Intent** and **Pattern** fields and then press enter, or the new intent will not be saved correctly. + + ![Enter the Help intent in the Properties panel](../media/tutorial-weatherbot/04/help-intent.png) + +8. Now you will need to create a new trigger in the main dialog to handle the new intent. You do this by selecting **+ New Trigger** in the **Navigation** pane. + +9. The **Create a trigger** form will appear. Select **Intent recognized** as the trigger type and **help** from the list of intents, then select the **Submit** button. + + ![The Help intent](../media/tutorial-weatherbot/04/intent-trigger-help.png) + +10. Next select the **+** button in the **Authoring canvas** to create a new action, then select **Begin a new dialog** from the **Dialog management** menu. + + ![Selecting the "Begin a new dialog" action](../media/tutorial-weatherbot/04/begin-new-dialog.png) + +11. Next you need to specify the dialog to call when the _help_ intent is recognized. You do this by selecting **help** from the **Dialog name** drop-down list in the **Properties** panel. + + ![Selecting the "help" dialog in the Properties panel](../media/tutorial-weatherbot/04/select-help-dialog.png) + + Now, in addition to giving you the current weather, your bot should also offer help. You can verify this using the Emulator. + +12. Select **Restart Bot** and open it in the Emulator to verify you are able to call your new help dialog. + + ![Basic Help test in Emulator](../media/tutorial-weatherbot/04/basic-help.gif) + +However, notice that once you start the weather dialog by saying weather, your bot doesn't know how to provide help; it is still trying to resolve the zip code. You need to configure your bot to allow interruptions to the dialog's flow before this will work. + + +### Allowing interruptions +The **getWeather** dialog handles getting the weather forecast, so you will need to configure its flow to handle interruptions, which will allow the new help functionality to work. The following steps demonstrate how to do this. + +1. Select the **BeginDialog** trigger in the **getWeather** dialog. + + ![Select the beginDialog trigger](../media/tutorial-weatherbot/04/select-begindialog-trigger.png) + +2. Select the **Text input** action in the **Authoring canvas**.
+
+   ![Select the textInput action](../media/tutorial-weatherbot/04/select-textinput-action.png)
+
+
+3. Select the **Bot Asks** tab in the **Properties** panel. In the **Prompt settings** section, set the **Allow interruptions** field to `true`.
+
+   ![Setting "Allow interruptions" to true](../media/tutorial-weatherbot/04/interrupts.png)
+
+   > This tells Bot Framework to consult the parent dialog's recognizer, which will allow the bot to respond to **help** at the prompt as well.
+
+4. Select **Restart Bot** and open it in the Emulator to verify you are able to call your new help dialog.
+
+5. Say **weather** to your bot. It will ask for a zip code.
+
+6. Now say **help**. It will now provide the global help response, even though that intent and trigger are defined in another dialog. Interruptions are a powerful way to build complex bots - more on that later.
+
+   ![Testing Help in the Emulator](../media/tutorial-weatherbot/04/better-help.gif)
+
+You have learned how to interrupt a flow to add help functionality to your bot. Next you will learn how to add another useful global function, the ability to exit out of a flow without completing it - a cancel command.
+
+
+### Global cancel
+
+1. You first need to create a new dialog. Select **+ New Dialog** in the **Navigation** pane.
+
+2. Enter **cancel** in the **Name** field of the **Define conversation objective** form, then select **Next**.
+
+3. Select **+** at the bottom of the flow in the **Authoring canvas** and select **Send a response** from the list of actions.
+
+4. In the **Properties** panel on the right side of the Composer screen, enter **Canceling!**
+
+5. Add another action by selecting **+** at the bottom of the flow in the **Authoring canvas**, then select **Cancel all dialogs** from the **Dialog management** menu.
+
+   > [!NOTE]
+   > When **Cancel all dialogs** is triggered, the bot will cancel all active dialogs and send the user back to the main dialog.
+
+   ![Create the "Cancel all dialogs" action](../media/tutorial-weatherbot/04/cancel-flow.png)
+
+   Next you will add a _cancel_ intent, the same way you added the _help_ intent in the previous section.
+
+6. Select **weatherBot.Main** at the top of the **Navigation** pane.
+
+7. Create a new intent in the **RegEx patterns to intents** section of the **Properties** panel by entering **cancel** in both the **Intent** and **Pattern** fields, then press the enter key on your keyboard to ensure the new values are saved.
+
+   ![Enter the cancel intent in the Properties panel](../media/tutorial-weatherbot/04/cancel-intent.png)
+
+8. Now you will need to create a new trigger in the main dialog to handle the new intent. You do this by selecting **+ New Trigger** in the **Navigation** pane.
+
+9. The **Create a trigger** form will appear. Select **Intent recognized** as the trigger type and **cancel** from the list of intents, then select the **Submit** button.
+
+   ![The cancel intent](../media/tutorial-weatherbot/04/intent-trigger-cancel.png)
+
+10. Next select the **+** button in the **Authoring Canvas** to create a new action, then select **Begin a new dialog** from the **Dialog management** menu.
+
+    ![Selecting the "Begin a new dialog" action](../media/tutorial-weatherbot/04/begin-new-dialog-cancel.png)
+
+11. Next you need to specify the dialog to call when the _cancel_ intent is recognized. You do this by selecting **cancel** from the **Dialog name** drop-down list in the **Properties** panel.
+
+   ![Selecting the "cancel" dialog in the Properties panel](../media/tutorial-weatherbot/04/select-cancel-dialog.png)
+
+   Now, your users will be able to cancel out of the weather dialog at any point in the flow. You can verify this using the Emulator.
+
+12. Select **Restart Bot** and open it in the Emulator to verify you are able to cancel.
+
+13. Say **weather** to your bot. _It will ask for a zip code_.
+
+14. Now say **help**. _It'll provide the global help response_.
+
+15. Now, say **cancel**. Notice that the bot doesn't resume the weather dialog; instead, it sends the **Canceling!** response and then waits for your next message.
+
+
+## Next steps
+- [Add Language Generation](./tutorial-lg.md) to power your bot's responses.
diff --git a/docs/tutorial/tutorial-cards.md b/docs/tutorial/tutorial-cards.md
new file mode 100644
index 0000000000..09eecc0917
--- /dev/null
+++ b/docs/tutorial/tutorial-cards.md
@@ -0,0 +1,72 @@
+# Tutorial: Incorporating cards and buttons into your bot
+The previous tutorial taught you how to add language generation to your bot to include variation, conditional messages, and dynamic content that give you greater control over how your bot responds to the user. However, all of your responses to the user are still plain text. This tutorial builds on what you learned in the previous tutorial by adding richer message content to your bot using cards and buttons.
+
+
+In this tutorial, you learn how to:
+
+> [!div class="checklist"]
+> * Add cards and buttons to your bot using Composer
+
+## Prerequisites
+- Completion of the tutorial [Adding language generation to your bot](./tutorial-lg.md).
+- A working knowledge of the concepts taught in the [Language Generation](../concept-language-generation.md) article.
+- A high-level understanding of the concepts taught in the how-to article [Sending responses with cards](../how-to-send-cards.md).
+- A high-level understanding of the concepts taught in the _Cards_ section of the [Design the user experience](https://docs.microsoft.com/azure/bot-service/bot-service-design-user-experience?view=azure-bot-service-4.0#cards) article.
+
+## Using cards
+
+The Language Generation system can also be used to render UI cards and button actions to the user.
+
+Next you will learn how to refine the responses provided by the weather bot to include cards and button actions.
+
+### Adding buttons
+Buttons are added as _suggested actions_, which enable your bot to present buttons that the user can select to provide input. Suggested actions can enhance the user experience by enabling users to answer a question or make a selection with a simple tap of a button, rather than having to type a response with a keyboard.
+
+First, update the prompt for the user's zip code to include suggested actions for **help** and **cancel**.
+
+1. Select the **BeginDialog** trigger in the **getWeather** dialog.
+
+2. Select the **Text Input** action, which is the second action in the flow.
+
+   ![](../media/tutorial-weatherbot/06/getWeather-beginDialog.png)
+
+3. Update the **Prompt** to include the suggested actions as shown below:
+
+   ```
+   [Activity
+       Text = What is your zip code?
+       SuggestedActions = help | cancel
+   ]
+   ```
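+
+   The **SuggestedActions** value is a pipe-separated (`|`) list of button labels, so the same pattern extends to additional buttons if you ever need them. The sketch below is purely illustrative and not part of this tutorial; an extra **start over** button like this would only do something useful if you also defined a matching intent and trigger for it:
+
+   ```
+   [Activity
+       Text = What is your zip code?
+       SuggestedActions = help | cancel | start over
+   ]
+   ```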
+
+4. Click **Restart Bot** and open it in the Emulator.
+
+   Now when you say **weather** to your bot, it will not only ask you for a zip code but also present **help** and **cancel** buttons as suggested actions.
+
+   ![](../media/tutorial-weatherbot/06/suggested-actions-emulator.png)
+
+### Adding cards
+
+Now you can change the weather report to also include a card.
+
+1. Next, scroll down to the bottom of the **Authoring canvas**, then select the **Send a response** node that starts with `@{DescribeWeather(dialog.weather)}...`
+
+2. Instead of coming back with a simple text response, you can respond with a weather card. To do that, replace the response with this Thumbnail Card:
+
+   ```
+   [ThumbnailCard
+       title = Weather for @{dialog.weather.city}
+       text = The weather is @{dialog.weather.weather} and @{dialog.weather.temp}°
+       image = @{dialog.weather.icon}
+   ]
+   ```
+
+3. Select **Restart Bot** in the Composer **Toolbar**, then **Test in Emulator**.
+
+In the Emulator, go through the bot flow: say **weather** followed by a zip code. Notice that the bot now responds with a card that contains the results, along with a card title and image.
+
+   ![](../media/tutorial-weatherbot/06/weather-card.png)
+
+---
+
+## Next steps
+- [Tutorial: Adding LUIS functionality to your bot](./tutorial-luis.md)
diff --git a/docs/tutorial/tutorial-create-bot.md b/docs/tutorial/tutorial-create-bot.md
new file mode 100644
index 0000000000..cd7c544317
--- /dev/null
+++ b/docs/tutorial/tutorial-create-bot.md
@@ -0,0 +1,91 @@
+# Tutorial: Create a new bot and test it in the Emulator
+
+This tutorial walks you through creating a basic bot with the Bot Framework Composer and testing it in the Emulator.
+
+In this tutorial, you will learn how to:
+
+> [!div class="checklist"]
+> * Create a basic bot using the Bot Framework Composer
+> * Run your bot locally and test it using the Bot Framework Emulator
+
+## Prerequisites
+- The [Bot Framework Composer](../setup-yarn.md)
+- The [Bot Framework Emulator](https://aka.ms/bot-framework-emulator-readme)
+
+
+## Create a new bot project
+The first step in creating a bot with the Bot Framework Composer is to create a new bot project from the home screen in Composer. This will create a new folder locally on your computer with all the files necessary to build, test, and run the bot.
+
+1. From the home screen, select **New**.
+
+   ![create project](../media/tutorial-weatherbot/01/new.png)
+
+2. In the **Create from scratch?** screen, you'll be presented with options to create an empty bot project from scratch or to create one based on a template. For this tutorial, select the **Create from Scratch** option, then **Next**.
+
+   ![create project](../media/tutorial-weatherbot/01/create-1.png)
+
+3. In the **Define conversation objective** form:
+   1. Enter the name **WeatherBot** in the **Name** field.
+   2. Enter **A friendly bot who can talk about the weather** in the **Description** field.
+   3. Select the location to save your bot.
+   4. Save your changes and create your new bot by selecting **Next**.
+
+   ![create project](../media/tutorial-weatherbot/01/create-2.png)
+
+   > [!TIP]
+   > Spaces and special characters are not allowed in the bot's name.
+
+   After creating your bot, Composer will load the new bot's main dialog in the editor. It should look like this:
+
+   ![bot conversation](../media/tutorial-weatherbot/01/empty-main-dialog.png)
+
+   > [!NOTE]
+   > Each dialog contains one or more [triggers](../concept-events-and-triggers.md) that define the actions available to the bot while the dialog is active. When you create a new bot, an **Activities** trigger of type **Greeting (ConversationUpdate activity)** is automatically provisioned. Triggers help your dialog capture events of interest and respond to them using actions.
+
+   > [!TIP]
+   > To help keep bots created in Composer organized, you can rename any trigger to something that better describes what it does.
+
+   **Steps 4-8 are demonstrated in the image immediately following step 8.**
+
+4. Click the **Greeting** trigger in the **Navigation** pane.
+
+5. In the **Properties panel** on the right side of the screen, select the trigger name and type **WelcomeTheUser**.
+
+6. Next you will start adding functionality to your bot by adding **Actions** to the **WelcomeTheUser** trigger. You do this by selecting the **+** button in the **Authoring canvas** and selecting **Send a response** from the list.
+
+   Now, it's time to make the bot do something.
+
+   You will see that the flow in the **Authoring canvas** starts with the **Trigger** name, with a line below it that includes a **+** button.
+
+   > [!TIP]
+   > The **+** button can be used to add **Actions** to the conversation flow. You can use this to add actions to the end of a flow, or to insert new actions between existing actions.
+
+   For now, instruct the bot to send a simple greeting.
+
+7. Select the new **Send a response** action in the flow and its properties will appear on the right-hand side of the screen. This action has only one main property, the text of the activity to send.
+
+8. Type a welcome message into this field. It is always a good idea to have your bot introduce itself and explain its main features, something like:
+
+   **Hi! I'm a friendly bot that can help with the weather. Try saying WEATHER or FORECAST.**
+
+   ![trigger](../media/tutorial-weatherbot/01/WelcomeTheUser.gif)
+
+
+## Start your bot and test it
+
+Now that your new bot has its first simple feature, you can launch it in the Emulator and verify that it works.
+
+1. Click the **Start Bot** button in the upper right-hand corner of the screen. This tells Composer to launch the bot's runtime, which is powered by the Bot Framework SDK.
+
+2. After a second, the **Start Bot** button will change to **Restart Bot**, which indicates that the bot's runtime has started. At the same time, a new link labeled **Test in Emulator** will appear next to the button. Selecting this link will open your bot in the Emulator.
+
+   ![start bot](../media/tutorial-weatherbot/01/start-bot.gif)
+
+   Soon the Emulator will appear, and the bot should immediately greet you with the message you just configured:
+
+   ![emulator](../media/tutorial-weatherbot/01/emulator-launch.png)
+
+You now have a working bot, and you're ready to add some more substantial functionality!
+
+## Next steps
+- [Add a dialog](./tutorial-add-dialog.md)
diff --git a/docs/tutorial/tutorial-get-weather.md b/docs/tutorial/tutorial-get-weather.md
new file mode 100644
index 0000000000..013d79a050
--- /dev/null
+++ b/docs/tutorial/tutorial-get-weather.md
@@ -0,0 +1,179 @@
+# Tutorial: Adding actions to your dialog
+
+In this tutorial you will use the Bot Framework Composer to add actions to your dialog that prompt the user for their zip code. The bot will then respond with the weather forecast for the specified location, based on a query to an external service.
+
+In this tutorial, you learn how to:
+
+> [!div class="checklist"]
+> * Add actions in your trigger to prompt the user for information.
+> * Create properties with default values.
+> * Save data into properties for later use.
+> * Retrieve data from properties and use it to accomplish tasks.
+> * Make calls to external services.
+
+## Prerequisites
+- Completion of the tutorial [Adding dialogs to your bot](./tutorial-add-dialog.md).
+- An understanding of the concepts taught in the [Dialogs](../concept-dialog.md) concept article, specifically the section on [actions](../concept-dialog.md#action).
+- An understanding of the concepts taught in the [Conversation flow and memory](../concept-dialog.md) concept article.
+
+## Get weather report
+Before you can get the weather forecast, you need to know the desired location. You can create a **Text Input** action to prompt the user for a zip code to pass to the weather service. At the end of the following steps is an animated GIF that demonstrates this process.
+
+1. Select **getWeather** in the **Navigation panel** to show the **getWeather** dialog.
+
+2. Select the **BeginDialog** trigger.
+
+3. To create the **Text Input** action, select **+** under the trigger node in the **Authoring canvas**, then select **Text Input** from the **Ask a question** menu.
+
+   > [!TIP]
+   > There are multiple options in the **Ask a question** menu. This enables you to easily request and validate different types of user input depending on your needs. See the [Asking for user input](../how-to-ask-for-user-input.md) article for more information.
+
+   > [!IMPORTANT]
+   > After selecting **Text Input** from the **Ask a question** menu, you will notice that two new nodes appear in the flow. Each node corresponds to a tab in the _Properties panel_ as shown in the following image:
+
+   ![Set the Default value property](../media/tutorial-weatherbot/03/user-prompts.png)
+
+   > 1. **Bot Asks** refers to the bot's prompt to the user for information.
+   > 2. **User Input** enables you to assign the user input to a property that is saved in memory and can be used by the bot for further processing.
+   > 3. **Other** enables you to validate the user input and respond with a message if invalid input is entered.
+
+1. Click on the **Bot Asks** tab in the **Properties panel** and enter the **Prompt** that the bot will display to the user to request their input:
+
+   **What is your zip code?**
+
+   ![zipcode prompt](../media/tutorial-weatherbot/03/zipcode-prompt.png)
+
+1. Set the **Default value** property (next to **Max turn count**) to ***'98052'*** (include the quotes).
+
+   > [!TIP]
+   > By default, prompts are configured to ask the user for information up to **Max turn count** times (3 by default). When the _max turn count_ is reached, the prompt will stop and the property will be set to the value defined in the **Default value** field before moving forward with the conversation.
+
+2. Next, select the **User Input** tab in the **Properties panel**. This part of the prompt represents the user's response, including where to store the value and how to pre-process it.
+
+3. Here is where you specify the property used to store the user's response. Enter the following value in the **Property to fill** field:
+
+   **user.zipcode**
+
+4. Enter **trim(this.value)** in the **Output Format** field. This ensures that all leading and trailing spaces in the user's input are trimmed before the value is validated and assigned to the property defined in the **Property to fill** field (**user.zipcode**).
+
+   ![zipcode answer](../media/tutorial-weatherbot/03/zipcode-answer.png)
+
+   > [!TIP]
+   > **trim()** is a [pre-built function](https://github.com/microsoft/BotBuilder-Samples/blob/master/experimental/common-expression-language/prebuilt-functions.md) of the [common expression language](https://github.com/microsoft/BotBuilder-Samples/tree/master/experimental/common-expression-language).
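+
+   For example, assuming the user typed `" 98052 "` with a stray space on each side, the prompt would behave roughly like this:
+
+   ```
+   this.value        ->  " 98052 "    raw input captured by the prompt
+   trim(this.value)  ->  "98052"      the value that is validated and stored in user.zipcode
+   ```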
+
+1. Select the **Other** tab in the **Properties panel**. This is where you can specify your validation rules for the prompt, as well as any error messages that will be displayed to the user if they enter an invalid value based on the **Validation Rules** you create.
+
+2. In the **Unrecognized Prompt** field, enter:
+
+   **Sorry, I do not understand '@{this.value}'. Please specify a zip code in the form 12345**
+
+3. In the **Invalid Prompt** field, enter:
+
+   **Sorry, '@{this.value}' is not valid. I'm looking for a 5 digit number as zip code. Please specify a zip code in the form 12345**
+
+4. In the **Validation Rules** field, enter:
+
+   **length(this.value) == 5**
+
+   This is your first validation rule; it requires the value to be five characters long, otherwise the invalid prompt will be shown to the user.
+
+   > [!IMPORTANT]
+   > Make sure to press the enter key after entering the rule; if you don't, it will not be added.
+
+   Your **Properties** panel should look like this:
+
+   ![zipcode extensions](../media/tutorial-weatherbot/03/zipcode-extensions.png)
+
+You have created an action in your **BeginDialog** trigger that will prompt the user for their zip code and store it in the **user.zipcode** property. Next you will pass the value of that property in an HTTP request to a weather service and validate the response; if it passes your validation, you will display the weather report to the user.
+
+## Add an HTTP request
+
+This section demonstrates the entire process of adding an HTTP request, capturing the results into a property, and then determining what action to take depending on the results.
+
+1. Select the **+** in the **Authoring canvas**, then select **Send an HTTP request** from the **Access external resources** menu.
+
+   ![Select Send an HTTP request](../media/tutorial-weatherbot/03/http-step.png)
+
+2. In the **Properties panel**:
+
+   - Select **GET** from the **HTTP method** drop-down list.
+
+   - Enter the following in the **Url** field:
+
+     **http://weatherbot-ignite-2019.azurewebsites.net/api/getWeather?zipcode=@{user.zipcode}**
+
+     This will enable the bot to make an HTTP request to the specified URL. The reference to **@{user.zipcode}** will be replaced by the value from the bot's **user.zipcode** property.
+
+     ![http url](../media/tutorial-weatherbot/03/http-url.png)
+
+   - Next, enter the following in the **Result property** field:
+
+     **dialog.api_response**
+
+     ![http result property](../media/tutorial-weatherbot/03/http-result-property.png)
+
+   > [!TIP]
+   > **Result property** represents the property where the result of this action will be stored. The result can include any of the following 4 properties from the HTTP response:
+   > - _statusCode_. This can be accessed via `dialog.api_response.statusCode`.
+   > - _reasonPhrase_. This can be accessed via `dialog.api_response.reasonPhrase`.
+   > - _content_. This can be accessed via `dialog.api_response.content`.
+   > - _headers_. This can be accessed via `dialog.api_response.headers`.
+   > - If the **Response type** is JSON, it will be a deserialized object available via the `dialog.api_response.content` property.
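+
+   The exact payload depends on the weather service, but based on the properties this tutorial reads later (`weather`, `temp`, `city`, `icon`, and `message` on errors), a successful result stored in **dialog.api_response** might look roughly like the following illustration:
+
+   ```
+   {
+     "statusCode": 200,
+     "reasonPhrase": "OK",
+     "content": {
+       "city": "Redmond",
+       "weather": "Clouds",
+       "temp": 75,
+       "icon": "<icon url>"
+     },
+     "headers": { }
+   }
+   ```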
+
+3. After making an HTTP request, you need to test the status of the response and handle errors if they occur. You can use an **If/Else branch** for this purpose. To do this, select the **+** button, then select **Branch: If/Else** from the **Create a condition** menu.
+
+4. In the **Properties panel** on the right, enter the following value into the **Condition** field:
+
+   **dialog.api_response.statusCode == 200**
+
+5. In the **True** branch, select the **+** button, then select **Set a Property** from the **Manage properties** menu.
+
+6. In the **Properties panel** on the right, enter the following in the **Property** field:
+
+   **dialog.weather**
+
+7. Next, enter the following in the **Value** field:
+
+   **dialog.api_response.content**
+
+   ![set a property](../media/tutorial-weatherbot/03/set-a-property.png)
+
+8. While still in the **True** branch, select the **+** button that appears beneath the action created in the previous step, then select **Send a response**.
+
+9. In the **Properties panel** on the right, enter the following response to send:
+
+   `The weather is @{dialog.weather.weather} and the temp is @{dialog.weather.temp}°`
+
+   The flow should now appear in the **Authoring canvas** as follows:
+
+   ![If/Else diagram](../media/tutorial-weatherbot/03/ifelse.png)
+
+You will now tell the bot what to do in the event that the [statusCode](https://docs.microsoft.com/en-us/windows/win32/winhttp/http-status-codes) returned is not 200.
+
+10. Select the **+** button in the **False** branch, then select **Send a response** and set the text of the message to:
+
+    **I got an error: @{dialog.api_response.content.message}**
+
+11. For the purposes of this tutorial, assume that if you are in this branch it is because the zip code is invalid; if it is invalid, it should be removed so that the bad value does not persist in the **user.zipcode** property. To remove the invalid value from this property, select the **+** button that follows the **Send a response** action you created in the previous step, then select **Delete a property** from the **Manage properties** menu.
+
+12. In the **Properties panel** on the right, enter **user.zipcode** into the **Property** field.
+
+    The flow should appear in the **Authoring canvas** as follows:
+
+    ![user.zipcode If/Else diagram](../media/tutorial-weatherbot/03/ifelse2.png)
+
+You have now completed adding an HTTP request to your **BeginDialog** trigger. The next step is to validate that these additions to your bot work correctly. To do that, you can test it in the Emulator.
+
+## Test in the bot Emulator
+
+1. Select the **Restart Bot** button in the upper right-hand corner of the Composer screen, then **Test in Emulator**.
+
+   ![Restart bot and test in Emulator](../media/tutorial-weatherbot/02/restart-bot.gif)
+
+2. After the greeting, send **weather** to the bot. The bot will prompt you for a zip code. Give it your home zip code, and seconds later, you should see the current weather conditions.
+
+   ![Weather bot in Emulator](../media/tutorial-weatherbot/03/basic-weather.gif)
+
+   If you ask for the weather again, notice that the bot doesn't prompt for a zip code the second time. Remember, this is because **user.zipcode** is already set. If **Always prompt** had been selected, the bot would have prompted for the zip code. You can verify this by going back to the prompt's properties, checking **Always prompt**, and trying again. Your bot will then ask for a zip code every time you restart the conversation in the Emulator.
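+
+Before moving on, here is a rough summary of the memory properties this tutorial has used so far (see the concept articles listed in the prerequisites for the full details):
+
+```
+user.zipcode          persists for this user beyond the active dialog, which is why the bot can skip the prompt
+dialog.api_response   scoped to the active dialog and discarded when the dialog ends
+dialog.weather        scoped to the active dialog
+this.value            the value the user just entered into the active prompt
+```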
+
+## Next steps
+- [Add help and cancel commands](./tutorial-add-help.md)
diff --git a/docs/tutorial/tutorial-introduction.md b/docs/tutorial/tutorial-introduction.md
new file mode 100644
index 0000000000..ebeee89a43
--- /dev/null
+++ b/docs/tutorial/tutorial-introduction.md
@@ -0,0 +1,20 @@
+# The Bot Framework Composer tutorials
+
+Welcome to the Bot Framework Composer tutorials. They start with the creation of a simple bot and gradually introduce more sophistication, with each successive tutorial building on the previous one and adding capabilities. These tutorials are designed to teach some of the basic concepts required to build bots with the Bot Framework Composer.
+
+In these tutorials, you'll learn how to:
+
+> [!div class="checklist"]
+> * [Create a simple bot](tutorial-create-bot.md) and test it in the Emulator
+> * [Add multiple dialogs](tutorial-add-dialog.md) to help your bot fulfill more than one scenario
+> * [Use prompts](tutorial-get-weather.md) to ask questions and get responses from an HTTP request
+> * [Handle interruptions](tutorial-add-help.md) in the conversation flow in order to add global help and the ability to cancel at any time
+> * [Use Language Generation](tutorial-lg.md) to power your bot's responses
+> * [Send responses with cards](tutorial-cards.md)
+> * [Use LUIS in your bot](tutorial-luis.md)
+
+## Prerequisites
+- A good understanding of the material covered in the [Introduction to Bot Framework Composer](../introduction.md), including the naming conventions used for elements in Composer.
+
+## Next step
+ - [Create a new bot and test it in the Emulator](./tutorial-create-bot.md)
diff --git a/docs/tutorial/tutorial-lg.md b/docs/tutorial/tutorial-lg.md
new file mode 100644
index 0000000000..d761fb10f7
--- /dev/null
+++ b/docs/tutorial/tutorial-lg.md
@@ -0,0 +1,116 @@
+# Tutorial: Adding language generation to your bot
+Now that the bot can perform its basic tasks, it's time to improve your bot's ability to converse with the user. The ability to understand what your user means conversationally and contextually, and then respond with useful information, is often the primary challenge for a bot developer. The Bot Framework Composer integrates with the Bot Framework Language Generation library, a set of powerful templating and message formatting tools that enable you to include variation, conditional messages, and dynamic content that give you greater control of how your bot responds to the user. A good bot doesn't just do a task - it does it with style and personality - and Composer makes it easier to integrate these capabilities into your bot.
+
+In this tutorial, you learn how to:
+
+> [!div class="checklist"]
+> * Integrate Language Generation into your bot using Composer
+
+## Prerequisites
+- Completion of the tutorial [Adding Help and Cancel functionality to your bot](./tutorial-add-help.md).
+- A working knowledge of the concepts taught in the [Language Generation](../concept-language-generation.md) article.
+
+
+## Language Generation
+Let's start by adding some variation to the welcome message.
+
+1. Go to the **Navigation** pane and select the **weatherBot.Main** dialog's **WelcomeTheUser** trigger.
+
+2. Select the **Send a response** action in the **Authoring Canvas**.
+
+   ![Select the WelcomeTheUser trigger](../media/tutorial-weatherbot/05/select-the-WelcomeTheUser-trigger.png)
+
+3. Replace the response text in the **Properties** panel with the following:
+
+   ```
+   - Hi! I'm a friendly bot that can help with the weather. Try saying WEATHER.
+   - Hello! I am Weather Bot! Say WEATHER to get the current conditions.
+   - Howdy! Weather bot is my name and weather is my game.
+   ```
+
+   > [!NOTE]
+   > Your bot will randomly select any of the above entries when responding to the user. Each entry must begin with the dash (**-**) character on a separate line. For more information, see the [Template](../concept-language-generation.md#templates) and [Anatomy of a template](../concept-language-generation.md#anatomy-of-a-template) sections of the **Language Generation** article.
+
+4. To see how this works, select the **Restart Bot** button in the **Toolbar** and open it in the Emulator, then select **Restart conversation** a few times to see the results of the greetings being randomly selected.
+
+   Currently, the bot reports the weather in a very robotic manner:
+
+   > The weather is Clouds and it is 75°.
+
+   It's possible to improve the language used when delivering the weather conditions to the user. You can do this by using two features of the Language Generation system: conditional messages and parameterized messages.
+
+5. Select **Bot Responses** from the Composer menu.
+
+   ![Bot Responses](../media/tutorial-weatherbot/05/bot-responses.png)
+
+6. Toggle the **Edit Mode** switch in the upper right-hand corner so that it turns blue. This will enable a syntax-highlighted LG editor in the main pane.
+
+   > You'll notice that every message you created in the flow editor also appears here. They're linked, and any changes you make in this view will be reflected in the flow as well.
+
+   ![](../media/tutorial-weatherbot/05/editmode.png)
+
+7. Scroll to the bottom of the editor.
+8. Paste the following text:
+
+   ```
+   # DescribeWeather(weather)
+   - IF: @{weather.weather=="Clouds"}
+       - It is cloudy
+   - ELSEIF: @{weather.weather=="Thunderstorm"}
+       - There's a thunderstorm
+   - ELSEIF: @{weather.weather=="Drizzle"}
+       - It is drizzling
+   - ELSEIF: @{weather.weather=="Rain"}
+       - It is raining
+   - ELSEIF: @{weather.weather=="Snow"}
+       - There's snow
+   - ELSEIF: @{weather.weather=="Clear"}
+       - The sky is clear
+   - ELSEIF: @{weather.weather=="Mist"}
+       - There's a mist in the air
+   - ELSEIF: @{weather.weather=="Smoke"}
+       - There's smoke in the air
+   - ELSEIF: @{weather.weather=="Haze"}
+       - There's a haze
+   - ELSEIF: @{weather.weather=="Dust"}
+       - There's dust in the air
+   - ELSEIF: @{weather.weather=="Fog"}
+       - It's foggy
+   - ELSEIF: @{weather.weather=="Ash"}
+       - There's ash in the air
+   - ELSEIF: @{weather.weather=="Squall"}
+       - There's a squall
+   - ELSEIF: @{weather.weather=="Tornado"}
+       - There's a tornado happening
+   - ELSE:
+       - @{weather.weather}
+   ```
+
+   > This creates a new Language Generation template named `DescribeWeather`. The template takes the data returned from the weather service (passed in as the `weather` parameter) and uses it to respond to the user with a friendlier, more natural message.
+
+9. Select **Design Flow** from the Composer menu.
+
+10. Select the **getWeather** dialog, then its **BeginDialog** trigger in the **Navigation** pane.
+
+    ![Select the BeginDialog trigger](../media/tutorial-weatherbot/05/Select-the-BeginDialog-trigger.png)
+
+11. Scroll down in the **Authoring Canvas** and select the **Send a response** action that starts with _The weather is..._.
+
+13. 
Now replace the response with the following: + + `- @{DescribeWeather(dialog.weather)} and the temp is @{dialog.weather.temp}°` + + > This syntax enables us to nest the `DescribeWeather` template _inside another template_. LG templates can be combined in this way to create more complex templates. + + ![Nesting template example](../media/tutorial-weatherbot/05/lg-2.png) + + You are now ready to test this in the Emulator. + +14. Select the **Restart Bot** button in the **Toolbar** then open it in the Emulator. + +Now, when you say `weather`, the bot will send you a message that sounds much more natural than it did previously. You can combine these techniques to create more variety in your messages! + +![](../media/tutorial-weatherbot/05/nice-weather.png) + +## Next steps +- [Tutorial: Incorporating cards and buttons into your bot](./tutorial-cards.md) \ No newline at end of file diff --git a/docs/tutorial/bot-tutorial-luis.md b/docs/tutorial/tutorial-luis.md similarity index 86% rename from docs/tutorial/bot-tutorial-luis.md rename to docs/tutorial/tutorial-luis.md index 9a9e878bda..c223581f25 100644 --- a/docs/tutorial/bot-tutorial-luis.md +++ b/docs/tutorial/tutorial-luis.md @@ -1,8 +1,16 @@ -# Using LUIS for Language Understanding - +# Tutorial: Using LUIS for Language Understanding Up until this point, we have been using a simple regex recognizer to detect user intent. Bot Framework Composer has deep integration with LUIS. -Let's go ahead and update our dialog's recognizers to use luis instead. +In this tutorial, you learn how to: + +> [!div class="checklist"] +> * Add LUIS into your bot with Composer for Language Understanding + +## Prerequisites +- Completion of the tutorial [Incorporating cards and buttons into your bot](./tutorial-cards.md). +- A working knowledge of the concepts taught in the [Language Understanding](../concept-language-understanding.md) article. +- [LUIS](https://www.luis.ai/home) and how to get [LUIS authoring key](https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-keys#programmatic-key). + ## Update recognizer @@ -44,7 +52,7 @@ Let's go ahead and update our dialog's recognizers to use luis instead. ![](../media/tutorial-weatherbot/07/luis-with-lu-content.png) 3. Once you have done this, you need to re-configure the various **Intent** triggers within that dialog. -4. Click on `weather` trigger in the left navigation and choose `Weather` from the intent drop down +4. Click on `weather` trigger in the left navigation. Select `Weather` from the intent drop-down menu in the properties panel on the right side of the Composer. Update the title of the trigger to `Weather` instead of **Intent** @@ -91,8 +99,8 @@ With LUIS, you no longer have to type in exact regex patterns to trigger specifi * "How is the weather" * "Weather please" -* "Cancel everything" -* "Not sure what I can do" +* "Can you help me" +* "Cancel please" ![](../media/tutorial-weatherbot/07/luis-wired-up.png) diff --git a/toc.md b/toc.md index c22812f63d..720cfb389e 100644 --- a/toc.md +++ b/toc.md @@ -1,19 +1,21 @@ # Microsoft Bot Framework Composer ### Overview -- [Introduction to Bot Framework Composer](./docs/bfcomposer-intro.md) - +- [Introduction to Bot Framework Composer](./docs/introduction.md) +### Installation +- [Set up Composer using Yarn](./docs/setup-yarn.md) ### QuickStart -- [Create your first bot](./docs/tutorial-create-echobot.md) +- [Tour of Composer](./docs/onboarding.md) +- [Create your first bot](./docs/quickstart-create-bot.md) ### Tutorials -0. 
[Onboarding](./docs/tutorial-onboarding.md) -1. [Create a bot](./docs/tutorial/bot-tutorial-introduction.md) -2. [Add a dialog](./docs/tutorial/bot-tutorial-add-dialog.md) -3. [Get weather report](./docs/tutorial/bot-tutorial-get-weather.md) -4. [Add help and cancel commands](./docs/tutorial/bot-tutorial-add-help.md) -5. [Add Language Generation](./docs/tutorial/bot-tutorial-lg.md) -6. [Use cards](./docs/tutorial/bot-tutorial-cards.md) -7. [Add LUIS](./docs/tutorial/bot-tutorial-luis.md) +0. [Introduction](./docs/tutorial/tutorial-introduction.md) +1. [Create a bot](./docs/tutorial/tutorial-create-bot.md) +2. [Add a dialog](./docs/tutorial/tutorial-add-dialog.md) +3. [Get weather report](./docs/tutorial/tutorial-get-weather.md) +4. [Add help and cancel commands](./docs/tutorial/tutorial-add-help.md) +5. [Add Language Generation](./docs/tutorial/tutorial-lg.md) +6. [Use cards](./docs/tutorial/tutorial-cards.md) +7. [Add LUIS](./docs/tutorial/tutorial-luis.md) ### Concepts - [Dialogs](./docs/concept-dialog.md) @@ -23,8 +25,6 @@ - [Language understanding](./docs/concept-language-understanding.md) ### How to -#### Set up -- [Set up Composer using Yarn](./docs/setup-yarn.md) #### Develop - [Send messages](./docs/how-to-send-messages.md) - [Ask for user input](./docs/how-to-ask-for-user-input.md)