
Working with the Entity Node

Bots need to extract relevant information from the user utterance to fulfill the user intent.

Take a look at this sample utterance: Book me a flight from LA to NYC on Sunday. To act on this user intent, which is to book a flight ticket, the bot should extract entities such as the source city (LA), the destination city (NYC), and the departure date (Sunday).

So, in a Dialog Task, for every piece of critical data you want to capture from the user utterance, you should create a corresponding Entity node. You can add prompt messages to these nodes asking users to input the required values.
Kore.ai supports more than 30 entity types such as Address, Airport, Quantity, and Time Zone. You can also define entities as a selection from a list, a free-form entry, a file or image attachment from the user, or a regular expression.

Note: You may need to follow one Entity node with several other Entity nodes to collect a series of user inputs required to complete a transaction, such as a username, location, amount, and due date, followed by a Webhook node that sends the request to an API to complete the online transaction.
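
Once Entity nodes capture such values, downstream nodes (for example, a Script node that prepares the payload for a Service or Webhook node) can read them from the Context Object. The following is only a minimal sketch: the entity names SourceCity, DestinationCity, and DepartureDate are illustrative, and the session path used for staging the values is an assumption (see Session and Context Variables).

    // Minimal sketch of reading captured entity values inside a Script node.
    // SourceCity, DestinationCity, and DepartureDate are illustrative entity names.
    var source = context.entities.SourceCity;           // e.g., "LA"
    var destination = context.entities.DestinationCity; // e.g., "NYC"
    var travelDate = context.entities.DepartureDate;    // e.g., "Sunday", resolved to a date

    // A downstream Service or Webhook node could then send these values to a booking API;
    // here they are simply staged in a session variable for later use (path is an assumption).
    context.session.BotUserSession.flightSearch = {
        from: source,
        to: destination,
        date: travelDate
    };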

Setting up an Entity Node

Setting up an Entity node in a Dialog task involves the following steps:

Step 1: Adding an Entity Node to the Dialog Task

  1. Open the Dialog Task where you want to add the Entity node.
  2. Hover over a node next to which you want to add the Entity node, and click the plus icon on it.
  3. Select Entity > New entity node (you can also use an existing entity node by selecting it from the list).
  4. The Component Properties panel for the Entity Node opens.

Step 2: Configuring the Component Properties

The Entity node's Component Properties allow you to configure the general settings as well as the user and error prompts.

Note: The configurations you set up or modify in this section are reflected in all other Dialog Tasks that use this node.

  1. Enter a Name and Display Name for the Entity node. Entity names cannot have spaces.
  2. From the Type drop-down list, select an entity type depending on the expected user input. For example, if you want the user to enter the departure date, select Date from the drop-down list. The platform performs basic validation based on the selected type.
    The Entity Type provides the NLP Interpreter with the expected type of data from a user utterance to enhance recognition and system performance. For more information, see Supported Entity Types.
  3. Based on the Type selected, you might have the option to set the Entity as Multi-Item, thereby allowing the user to make multiple selections.
  4. In the User Prompt text box, enter the prompt message that you want the user to see for this entity. For example, Enter the Departure Date. You can enter channel-specific messages for user prompts (a sample dynamic prompt is sketched after these steps). For more information, see Using the Prompt Editor.
  5. In the Error Prompts box, review the default error message, and if required, click Edit Errors to modify it. For more information, see Using the Prompt Editor.
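
Where a static prompt is not enough, the Prompt Editor also supports dynamic and channel-specific messages written in JavaScript. The snippet below is a sketch under that assumption: print() emits the prompt text, and DestinationCity is an illustrative entity name; see Using the Prompt Editor for the exact options.

    // Sketch of a dynamic prompt in the Prompt Editor's JavaScript mode (assumed).
    // print() emits the prompt text; DestinationCity is an illustrative entity name.
    if (context.entities.DestinationCity) {
        print("Enter the departure date for your trip to " + context.entities.DestinationCity + ".");
    } else {
        print("Enter the departure date.");
    }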

Step 3: Configuring the Instance Properties

Use the Instance Properties to determine whether the entity value is mandatory, and whether values from previous user utterances should be considered when capturing this entity.

Note: The settings in the Instance Properties panel are specific to the current Dialog Task and do not reflect in the other dialog tasks that use this entity node.

  1. Click the Instance Properties icon on the Entity node.
  2. Under the User Input section, select an option (see the User Input Flow section below for how the input is processed):
    • Mandatory: This entity is required, and users must provide a valid entry before proceeding with the dialog flow. A resolution prompt is displayed if ambiguous values for the entity are detected in the user utterance.
    • Optional: The user is prompted only once for this entity, and the system proceeds with any input the user provides. If ambiguous values are detected in the user utterance, a resolution prompt is displayed allowing the user to pick the correct value.
    • Hidden: The bot does not prompt for the entity value; the value is captured only if the user provides it explicitly.
  3. Under the Entity Extraction section, select one of these options:
    • Evaluate the un-used text from the previous utterance: When this option is selected, the entity uses the text that was not used by any other entity in the dialog so far. This is the default option.
    • Evaluate un-used text and text used for extracting entities from previous utterances: Select this option if you would like to reuse an entity value extracted by another Entity node in the dialog.
    • Do not evaluate previous utterances and explicitly prompt the user: Select this option if you want the bot to ignore the previous user utterances and explicitly prompt the user to provide a value for the entity.
  4. Click the Advanced controls to set up these options:
    • User Prompts: Use the Click to Override button to write custom user prompts for this particular instance of the Entity node. Once you override, the User Prompts section in the Component Properties panel is disabled.  Also, these user prompts do not apply to any other instances of the node.
    • Error Prompts: Use the Click to Override button to write custom error prompts for this particular instance of the Entity node. Once you override, the Error Prompts section in the Component Properties panel is disabled.  Also, these error prompts do not apply to any other instances of the node.
    • Intent Detection (Applies only to String and Description entities): Select one of these options to determine the course of action when the bot detects an intent in the user's entry for a String or Description entity:
      • Accept input as entity value and discard intent detected: The bot captures the user entry as a string or description and ignores the intent.
      • Prefer user input as intent and proceed with Hold & Resume settings: The user input is considered for intent detection, and the bot proceeds according to the Hold & Resume settings.
      • Ask the user how to proceed: Allow the user to specify whether they meant the intent or the entity value.
    • Hold & Resume: Define the interruption handling behavior at this node by choosing from:
      • Use the task level ‘Hold & Resume’ setting: The bot refers to the Hold & Resume settings set at the dialog task level.
      • Customize for this node option: Select this option to customize the Hold & Resume settings for this node. Read the Interruption Handling and Context Switching article for more information.
    • Precedence (Applies to all Entity types except String and Description nodes): When the user's input for an entity contains both a valid entity value and another intent, you can control the experience by choosing between the Intent over Entity and Entity over Intent options. For example, if a Flight Booking bot prompts for the destination and the user enters, "Bangalore, how's the weather there?", you define how the bot responds in such cases: either pick the entity and add the intent to the Follow-up Intents stack, or execute the intent first based on the Hold & Resume settings.
  5. Custom Tags: Define tags to build custom profiles of your bot conversations. See here for more information.

User Input Flow:

When a user is prompted for input, the platform processes the response as follows (a pseudocode sketch follows this list):

  • If the user responds with a valid value, the entity is populated with that value and the dialog flow continues.
  • If ambiguous values are identified in the user response, an ambiguity resolution prompt is displayed.
  • If the user responds to the ambiguity resolution prompt with an invalid utterance, that is, an utterance that doesn't contain a valid input for the ambiguity resolution, then:
    • If the given value is valid for the entity (any possible value for the entity), it is used for the entity and the conversation continues;
    • If the given value is not valid for the entity, and
      • If the value triggers a task intent, FAQ, or Small Talk, then
        • The intent is executed as per the Hold & Resume settings, and when (and if) the dialog containing the entity is resumed, the user is re-prompted for the entity value;
      • If the value doesn't trigger any task intent, FAQ, or Small Talk, the entity is left blank and the conversation continues from the entity node's transitions.
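
The same flow can be summarized in pseudocode. This is only a descriptive sketch of the behavior listed above; every function name in it is illustrative and not a platform API.

    // Descriptive pseudocode of the input-handling flow above; all names are illustrative.
    function handleEntityInput(input, entity) {
        if (isValid(input, entity)) {                      // valid value: populate and continue
            entity.value = input;
            return continueDialog();
        }
        if (isAmbiguous(input, entity)) {                  // ambiguous values: ask the user to choose
            var reply = showAmbiguityResolutionPrompt(entity);
            if (isValid(reply, entity)) {                  // any possible value is still accepted
                entity.value = reply;
                return continueDialog();
            }
            if (triggersTaskIntentFaqOrSmallTalk(reply)) { // another intent: Hold & Resume applies
                executePerHoldAndResume(reply);
                return repromptWhenDialogResumes(entity);  // re-prompt when (and if) the dialog resumes
            }
            entity.value = null;                           // nothing usable: leave the entity blank
            return continueFromEntityTransitions(entity);  // and follow the node's transitions
        }
        // Other cases (for example, an intent detected in the initial reply) follow the
        // Intent Detection and Hold & Resume settings described earlier.
    }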

Step 4: Configuring the Connections Properties

From the node's Connections panel, you can determine which node in the dialog task to execute next. You can write conditional statements based on the values of any entity or Context Object in the dialog task, or you can use intents for transitions.
To set up component transitions, follow these steps:

  1. You can select from the available nodes under the Default connections.
  2. To configure a conditional flow, click Add IF.
  3. Configure the conditional expression based on one of the following criteria:
    1. Entity: Compare an Entity node in the dialog with a specific value using one of these operators: Exists, equals to, greater than equals to, less than equals to, not equal to, greater than, and less than. Select the entity and the operator using the respective drop-down lists, and type the value in the Value box. Example: PassengerCount (entity) greater than (operator) 5 (specified value)
    2. Context: Compare a context object in the dialog with a specific value using one of these operators: Exists, equals to, greater than equals to, less than equals to, not equal to, greater than, and less than. Example: Context.entity.PassengerCount (Context object) greater than (operator) 5 (specified value)
    3. Intent: Select an intent that should match the next user utterance.
  4. In the Then go to drop-down list, select the next node to execute in the dialog flow if the conditional expression succeeds. For example, if PassengerCount (entity) greater than (operator) 5 (specified value), Then go to Offers (sub-dialog). This example is sketched after these steps.
  5. In the Else drop-down list, select the node to execute if the condition fails.
Note: If you want to write multiple If conditions, click Add Else If below the last If conditional expression.
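
To make the example above concrete, the condition configured through the drop-downs is logically equivalent to the following check. This is an illustration of the logic only, not an additional scripting surface, and the context path exposed to scripts may differ from the Context.entity.PassengerCount notation shown in the Context criterion above.

    // Logical equivalent of the example transition above; illustration only.
    if (context.entities.PassengerCount > 5) {
        // Then go to: Offers (sub-dialog)
    } else {
        // Else: the node selected in the Else drop-down list
    }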

Step 5: Configuring the NLP Properties

  1. In the Suggested Synonyms for the < Entity Name > box, enter synonyms for the name of your Entity. Press enter after each entry to save it. For more information, see Managing Synonyms.
  2. In the Suggested Patterns for < Entity Name >, click +Add Pattern to add new patterns for your Entity. The Patterns field is displayed. For more information, see Managing Patterns.
  3. Manage Context
    • Using the Context Output field, define the context tags to be set when this entity is populated.
    • Auto-emit the entity values captured as part of the Context Object (an illustrative snapshot follows these steps).
      (See Context Management for more information.)
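
To make the auto-emit option concrete, the captured value appears under the entity's name in the Context Object. The snippet below shows an illustrative shape only, with example entity names and a normalized date; see the Context Object documentation for the exact structure.

    // Illustrative shape only; see the Context Object documentation for the exact fields.
    var snapshot = {
        entities: {
            DepartureDate: "2021-06-20", // a Date entity, shown here after normalization
            DestinationCity: "NYC"       // a City entity captured earlier in the dialog
        }
    };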

Step 6: Configuring the IVR Properties

Use this tab to define, at the node level, the input mode, grammar, prompts, and call behavior parameters to be used when this node is executed in the IVR channel. Refer here for details.
