Storyboard Dialog Tasks

Dialog Tasks are the infrastructure that the Kore.ai XO Platform provides for building conversation-based interactions between Virtual Assistants and Users, and integrating these interactions with business systems and processes.

Dialog Tasks are available within the XO Platform as follows:

  1. As part of the Storyboard, where the focus is on the Conversation Designer.
  2. As a standalone module, under Conversation Skills > Dialog Tasks, where the focus is on integrating the Conversation Designer and Dialog Builder to create interactions between Virtual Assistants, Users and business systems and processes.

The Structure of a Dialog Task

The Dialog Task development process consists of three main stages, built into the structure that the XO Platform makes available to you:

  1. Design your conversation flow using natural language.
  2. Build business logic into your conversation. 
  3. Train: Once you are done creating your conversation flow, you can train your VA on it.

The Design and Build stages work in sync, so that you can generate logical flows and integrate business processes while simultaneously working with conversational elements. Essentially, this allows you to see what the conversation looks like both to a human user and to the VA. This is achieved using two important features, called the Conversation Designer and the Dialog Builder, respectively.

The Train section is where you can train your Virtual Assistant using a variety of parameters that help it fulfill the Dialog Task you are working on. Using this feature, you engage with Kore.ai’s NLP engines to allow your VA to maximize its potential for reaching your users’ goals. This article does not focus on training-related aspects; please see Navigating Dialog Tasks to learn more about the Train section.

The Conversation Designer and the Dialog Builder

When creating Dialog Tasks, teams can work with dedicated views. In the XO Platform, these views are referred to as the Conversation Designer and the Dialog Builder:

  1. From under Conversation Skills > Storyboard > Dialog Tasks. When creating or opening a Dialog Task, you will be presented with the Design View, called the Conversation Designer. Conversation Designers and Business Analysts can use the Design View to work on the natural language side of the interaction. The Design View looks similar to Storyboard Mock Scenes, providing a chat-like interface. However, unlike Mock Scenes, this view is meant to be used for implementation rather than prototyping; therefore, all features are available in the Design View. This is why, throughout this documentation, we refer to this view as the Conversation Designer or the Conversation Builder, interchangeably. Please continue reading to learn more about this view.
  2. From under Conversation Skills > Dialog Tasks. When creating or opening a Dialog Task, you will first be presented with the Dialog Builder. If you want to work with the Conversation Designer from here, please select the Design View. Engineers can use the Build View to add logic and integrate systems into Dialog Tasks that become functionalities of the VA being built. This view is based on logic rather than natural language and provides a flow visualization that uses nodes to manage and view the flow of the conversation. This is where the full extent of the Dialog Task is built, which is why, throughout this documentation, we refer to this view as the Dialog Builder. Within the Dialog Builder, there is also an integrated Conversation Builder, which engineers can use to see what the VA looks like in each view. Please read our article about the Dialog Builder to learn more.

Create a Dialog Task using the Conversation Designer

Moving forward, this article demonstrates how to create a Dialog Task using the Conversation Designer. For information on creating a Dialog Task using the Dialog Builder, please see Using the Dialog Builder.

To create a new Dialog Task:

  1. Select the Virtual Assistant that you want to work on.
  2. Go to the Build Tab.
  3. Select Conversation Skills > Storyboard.
  4. Select Dialog Tasks.
  5. Click Create Dialog.
  6. Enter a name for the dialog task in the Intent Name field. As the field label states, the name should reflect the user intent you want to achieve with this Dialog Task. Optionally, you can add a Description.
  7. You can add user utterances that should trigger this intent, using the Intent Training field. You may add this data later, from the user intent node that is generated within this task.
  8. Set additional options:
    • Set the task to be independent or a sub-intent dialog.
    • Hide from help requests by the user.
    • Specify context tags as intent preconditions – this ensures that the intent is detected only when the specified context tag is present (see the sketch after these steps).
    • Set Context Output and define the context tags to set when this intent is initiated.
    • If you are using the legacy Dialog Builder, please select Create Using the New Conversation Driven Dialog Builder, which was introduced in v9.0 of the platform. This is the default setting and is recommended. If you do not select this option, you will be prompted to Upgrade whenever you open the dialog task.
  9. Click Proceed.
  10. You will now be presented with the Conversation Designer (The Design View), where you can begin creating your new Dialog Task. The first element you will see in the new conversation is the intent, which you have provided as the name for the Dialog Task. This represents the initial message within the conversation, which should belong to the User. To continue developing the conversation, you may want to switch to the Bot tab.
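
    The context tag options in step 8 can be thought of as a simple gating mechanism: preconditions decide when the intent is even considered, and outputs mark the context for later intents. The following minimal TypeScript sketch only illustrates that idea; the types, function, and tag values are hypothetical and not the platform’s API.

```typescript
// Illustrative model of context-tag preconditions and outputs (hypothetical, not the platform's API).
interface IntentDefinition {
  name: string;
  preconditionTags: string[]; // intent is considered only if all of these tags are in the context
  outputTags: string[];       // tags added to the context when the intent is initiated
}

const webCheckIn: IntentDefinition = {
  name: "Web Check In",
  preconditionTags: ["flightBooked"], // only detectable after a previous dialog set this tag
  outputTags: ["checkInStarted"],
};

function isIntentEligible(intent: IntentDefinition, contextTags: Set<string>): boolean {
  return intent.preconditionTags.every((tag) => contextTags.has(tag));
}

// Example: a tag set by a previous Book Flight dialog makes Web Check In detectable.
const currentContext = new Set<string>(["flightBooked"]);
console.log(isIntentEligible(webCheckIn, currentContext));    // true
console.log(isIntentEligible(webCheckIn, new Set<string>())); // false
```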

    Now, let us look at the features you can use to develop conversations using the Conversation Designer, all approached based on a recommended workflow. 

    Configure the Intent Node

    Once you create the Dialog Task, a Primary Intent is created by default. This is taken as the User Utterance that triggers this dialog. To configure this node, click the More Options button next to the Intent and select Configure Properties.

    Here are the configurations you can make:

    1. Change Display Name and Node Name if needed.
    2. Add/change Description.
    3. Add utterances that would trigger this intent. You can add multiple utterances. These would be in addition to the primary intent utterance.
    4. You also have the option to Manage Training, which enables you to train the intent by adding Utterances, Patterns, and Rules. Refer here for more on utterance training; an illustrative sketch of such training data follows below.
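
    As an illustration only, the training data managed through Manage Training can be pictured as a small record of utterances, patterns, and rules per intent. The shape below is hypothetical and is not an actual platform format.

```typescript
// Hypothetical shape for intent training data (illustration only; not an actual platform format).
interface IntentTraining {
  displayName: string;
  utterances: string[]; // example user inputs, in addition to the primary intent utterance
  patterns: string[];   // word patterns the NLP engine can match
  rules: string[];      // traits or context rules that can trigger the intent
}

const bookFlightTraining: IntentTraining = {
  displayName: "Book Flight",
  utterances: ["I want to book a flight", "Get me a ticket to New York"],
  patterns: ["book * flight", "fly to *"],
  rules: [],
};

console.log(`${bookFlightTraining.displayName}: ${bookFlightTraining.utterances.length} utterances`);
```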

    Entity Node

    The Entity Node is created whenever Bot > Ask a Question is selected for the Bot in the Design section.

    For each question you can do the following:

    1. Define Entity Type – Select from the drop-down list. This list includes the entity types supported by the Kore.ai XO Platform.
    2. Apply Formatting Options like bold, italics, etc.
    3. Use the following Templates to present the query:
      1. Attachment
      2. Text
      3. Quick Reply
      4. Carousel
      5. Button
      6. List
      7. Image

    For the Button Template of the Entity Node, the Platform now provides the ability to customize the path behavior on the storyboard designer by removing/reassigning the buttons from/to the path.

    To customize the path behavior, follow the steps below:

    1. Click the Design tab for the selected Dialog Task’s storyboard screen.
    2. Click the Templates icon and select Button.
    3. Click Path Behavior.
    4. On the Assign Path to Each Button window, map the buttons to their respective path boxes, delete, or switch the buttons as required.

    The system allows the user to do the following when designing a Button Template:

    1. Select one of the following Path Behaviors for the button template:
      1. When Common for all is selected, a single path flow is mapped to all the buttons in the template on the Dialog Builder.
      2. When Individual Paths is selected, individual path flows are mapped to all the paths of all the buttons in the template.
    2. Drag and drop the buttons into their respective path boxes when assigning a Path to each created button.
    3. Delete a button from a path by clicking the X icon.
    4. Add a Path to the storyboard preview screen by clicking the Add Path icon.
    5. Delete a Path by clicking the delete icon.
    6. Swap the buttons between paths or reassign paths.
    7. Be notified if there are duplicate buttons in the same path, with the message “Button already exists in the path.”
    8. Be prompted, when Individual Paths is selected and a button is not associated with any path, to associate the button with one of the existing paths.

      Note:

      1. The user can ignore the prompt and skip associating the buttons with a path.
      2. The user can have paths without any associated buttons on the storyboard.
      3. The user can delete the button from the path.

    When a new path is added and no buttons are associated with it, the system treats that path flow for the Entity Node as an Else transition in the Build section (a sketch of this mapping follows below).

    Once you add an Entity Node, the following properties can be configured:
      1. Display name
      2. Node name
      3. Type
      4. Is Multi-Item
      5. User Prompts
      6. Error Prompts

    For details on Entity component properties, please refer to this link.
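
    To make the button-to-path behavior easier to picture, here is a minimal sketch of the mapping described above. It only models the idea of Common for all versus Individual Paths and of an empty path behaving like an Else transition; the type and function names are hypothetical, not platform APIs.

```typescript
// Hypothetical model of Button Template path behavior (illustration only).
type PathBehavior = "CommonForAll" | "IndividualPaths";

interface PathDefinition {
  name: string;
  buttons: string[]; // buttons assigned to this path; may be empty
}

interface ButtonTemplate {
  behavior: PathBehavior;
  paths: PathDefinition[];
}

// A path with no associated buttons behaves like an Else transition in the Build section.
function pathKind(path: PathDefinition): "button path" | "else transition" {
  return path.buttons.length > 0 ? "button path" : "else transition";
}

function resolvePath(template: ButtonTemplate, clickedButton: string): string | undefined {
  if (template.behavior === "CommonForAll") {
    // A single path flow is mapped to every button in the template.
    return template.paths[0]?.name;
  }
  return template.paths.find((p) => p.buttons.includes(clickedButton))?.name;
}

const seatTemplate: ButtonTemplate = {
  behavior: "IndividualPaths",
  paths: [
    { name: "economyPath", buttons: ["Economy"] },
    { name: "businessPath", buttons: ["Business"] },
    { name: "fallbackPath", buttons: [] }, // no buttons: treated as an Else transition
  ],
};

console.log(resolvePath(seatTemplate, "Economy")); // economyPath
console.log(pathKind(seatTemplate.paths[2]));      // else transition
```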

    Set up Bot Messages

    Since the user has started the conversation through the intent, it is recommended to continue with a message from the VA to the user. This serves as a guide to the user on the conversation flow.

    Bot Messages can serve one of the following purposes:

    1. Ask a Question with the intention of gathering information from the user. This gets converted to an Entity node in the dialog task.
      1. You can further specify the type of user input expected like string, number, date, etc. It gets translated to an entity type in the dialog task.
      2. You can format the message using simple formatting options or by selecting a template for presenting the bot message.
    2. Ask Confirmation with the intention of deciding the path of the conversation. This gets converted to a Confirmation node in the dialog task.
    3. Inform the User, for example with a help message, a welcome message, or a response to the user query. These get converted to a Message node in the dialog task.

    For each of the above selections, you can either use an existing node by selecting from the list or create a new node. Please see the documentation relevant to each node type for detailed information.
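
    The three Bot Message purposes map one-to-one onto node types in the dialog task. Below is a minimal, hypothetical model of that mapping (illustration only, not platform code).

```typescript
// Hypothetical mapping of Bot Message purposes to dialog task node types (illustration only).
type BotMessagePurpose = "AskAQuestion" | "AskConfirmation" | "InformTheUser";
type NodeType = "Entity" | "Confirmation" | "Message";

function nodeTypeFor(purpose: BotMessagePurpose): NodeType {
  switch (purpose) {
    case "AskAQuestion":    return "Entity";       // gathers information from the user
    case "AskConfirmation": return "Confirmation"; // decides the path of the conversation
    case "InformTheUser":   return "Message";      // help, welcome, or response messages
  }
}

console.log(nodeTypeFor("AskAQuestion")); // Entity
```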

    Here is a quick demo on how to work with Bot Messages in the Conversation Designer.

    Set Up User Responses

    Every bot message is ideally followed by a user response. User Responses can be used to define the conversation flow by predicting the user’s reply. For example, the VA might have asked the user to confirm an input, then based on the response there would be two paths – one for affirmation and one for negation.

    Note: If not specified, the platform adds a Sample User Response placeholder to maintain the conversational flow.

    Work with Other Nodes

    Apart from Bot and User Messages, you can:

    1. Add placeholders for Bot Actions like service calls, scripts, logic, webhooks, and processes to define the flow, and more. The actual functionality needs to be added from the Dialog Builder. You can leave comments for the developer explaining the purpose of such a Bot Action node. For example, for the Book Flight task, you may want to connect to your backend servers for the actual booking process (a hedged sketch follows this list).
    2. Trigger Dialog Tasks for subtasks or related/follow-up tasks. For example, after booking a flight you might want to trigger the Web Checkin dialog.
    3. Add Agent Transfer nodes (only at the end of the conversation). For example, for a Flight Transfer task, you might want to authenticate the user’s credentials via a live agent.
    4. Add Digital Forms for capturing a series of user inputs. For example, for a Web Check In task, you might want to present a form to capture user details like name, address, phone number, etc.
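
    As an illustration of what eventually replaces a Bot Action placeholder, here is a hedged sketch of the backend booking call for the Book Flight example. The endpoint, function name, and payload shape are entirely hypothetical; in practice, the real integration is configured later in the Dialog Builder.

```typescript
// Hypothetical backend call that a Book Flight Bot Action placeholder might eventually perform.
// The URL and payload shape are invented for illustration; configure the real call in the Dialog Builder.
interface BookingRequest {
  from: string;
  to: string;
  date: string; // ISO date, e.g. "2024-07-01"
}

async function bookFlight(request: BookingRequest): Promise<{ confirmationId: string }> {
  const response = await fetch("https://api.example.com/bookings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(request),
  });
  if (!response.ok) {
    throw new Error(`Booking failed with status ${response.status}`);
  }
  return response.json();
}

// Example usage, once the origin, destination, and date entities have been collected:
// bookFlight({ from: "LAX", to: "JFK", date: "2024-07-01" }).then((r) => console.log(r.confirmationId));
```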

    Node Configurations

    While building the conversation, the nodes are generated with default settings. You can customize these configurations from the Conversation Designer itself or do it at a later time from the Dialog Builder.

    In the following section, we see the various configurations available from the Conversation Designer for each available node type.

    Entity Node

    The entity node is created whenever Bot > Ask a Question is selected.

    For each question you can:

    1. Define Entity Type – select from the drop-down list. This list includes the entity types supported by the Kore.ai XO Platform.
    2. Apply Formatting Options like bold, italics, etc.
    3. Use Templates like buttons, carousel, etc., to present the query.

    Once added you can configure properties:

    • Display name,
    • Node name,
    • Type,
    • Is Multi-Item,
    • User Prompts, and
    • Error Prompts.

    Refer here for details on Entity component properties.

    Confirmation Node

    The confirmation node is created whenever Bot > Ask Confirmation is selected, along with Yes, No, and two other user response paths. You can delete options or add more.

    For each confirmation you can:

    1. Apply Formatting Options like bold, italics, etc.
    2. Use Templates like buttons, carousel, etc., to present the confirmation options.
    3. You can set configuration properties like
      1. Display name,
      2. Node name,
      3. User prompts,
      4. Display options, and
      5. Synonyms for yes/no.
    4. Select the concerned user response options to continue with the appropriate path.

    Refer here for confirmation node component properties.
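
    To illustrate how the Synonyms for yes/no setting behaves, here is a small, hypothetical sketch of matching a user reply to the affirmation or negation path. The synonym lists and function are illustrative only, not the platform’s matching logic.

```typescript
// Hypothetical yes/no matching for a Confirmation node (illustration only).
const yesSynonyms = ["yes", "yeah", "sure", "ok", "yep"];
const noSynonyms = ["no", "nope", "nah", "not now"];

type ConfirmationPath = "yes" | "no" | "unrecognized";

function matchConfirmation(userReply: string): ConfirmationPath {
  const normalized = userReply.trim().toLowerCase();
  if (yesSynonyms.includes(normalized)) return "yes";
  if (noSynonyms.includes(normalized)) return "no";
  return "unrecognized"; // would typically trigger an error prompt and a retry
}

console.log(matchConfirmation("Yep"));     // yes
console.log(matchConfirmation("Not now")); // no
console.log(matchConfirmation("maybe"));   // unrecognized
```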

    Advanced Features for User Responses

    Apart from the linear flow, you can add exceptional flows to the conversation. For example, while in the Book Flight task, the user might request the Weather Report at the destination city, or while in the Web Check In task, the user might have entered the wrong flight number three times in a row. These exceptional cases can be taken into consideration as part of the conversation.

    For each user response you can:

    1. Set up alternative user responses,
    2. Configure user retries and the VA’s behavior when this limit is exceeded.
    3. Ask for an intent when there is an interruption or a sub-intent.

    Alternative User Response 

    This allows you to trigger a different conversation flow. For example, at the prompt for the City entity within the Weather Report task, if the user says “Not now” then the conversation should end. This denotes that the user’s response is unrelated to the VA’s initial question; therefore, the behavior should not be the same as if the response were the expected one.

    Retry Limits

    You can set the number of wrong responses that the user can provide, then define the VA’s behavior when the user exceeds that number. A standard response is set by default.

    Use Configure Properties to define settings like the number of Allowed Retries and the Behavior on exceeding retries (end of dialog or transition to a node). A sketch of this retry logic follows below.
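
    The retry behavior described above can be pictured with a small sketch. Everything here (type names, defaults, node names) is illustrative only; the actual limits and behaviors are set through Configure Properties.

```typescript
// Hypothetical retry handling for an entity prompt (illustration only).
type ExceededBehavior = "endOfDialog" | "transitionToNode";

interface RetrySettings {
  allowedRetries: number;       // how many wrong responses the user may give
  onExceeded: ExceededBehavior; // what happens after the limit is crossed
  transitionNode?: string;      // target node when onExceeded is "transitionToNode"
}

function handleWrongResponse(settings: RetrySettings, attemptsSoFar: number): string {
  if (attemptsSoFar < settings.allowedRetries) {
    return "repromptUser"; // a standard error prompt is shown by default
  }
  return settings.onExceeded === "endOfDialog"
    ? "endDialog"
    : `goTo:${settings.transitionNode}`;
}

const retrySettings: RetrySettings = { allowedRetries: 3, onExceeded: "transitionToNode", transitionNode: "agentTransfer" };
console.log(handleWrongResponse(retrySettings, 1)); // repromptUser
console.log(handleWrongResponse(retrySettings, 3)); // goTo:agentTransfer
```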

    Ask Another Intent

    Configure the VA behavior with an interruption or a sub-intent, indicated using Ask Another Intent. For example, within the Book Flight task, the user asks for the Weather Report at the destination city using that specific intent. This option, therefore, lets you define the VA’s behavior when a user utterance deviates from the task at hand. To manage this:

    1. Enter the user response that is likely to ask for another intent.
    2. You can choose the intent from the available list or create a new one.
    3. You can set the transition to the new intent (see the sketch after this list):
      1. As an Interruption, to allow the user to switch to another task. Please see here for more on interruption handling; or
      2. As a Sub-intent, to allow the user to seamlessly branch into related intents. See here for more on sub-intents.
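
    Here is a minimal, hypothetical sketch of the two transition options for another intent. It only models the distinction between switching tasks (Interruption) and branching within the current task (Sub-intent); the names are not platform APIs.

```typescript
// Hypothetical model of the two transitions for "Ask Another Intent" (illustration only).
type IntentTransition =
  | { kind: "interruption"; targetIntent: string } // switch to another task, e.g. Weather Report
  | { kind: "subIntent"; targetIntent: string };   // branch into a related intent within the current task

function describeTransition(currentTask: string, transition: IntentTransition): string {
  return transition.kind === "interruption"
    ? `Pause or leave "${currentTask}" and switch to "${transition.targetIntent}" (interruption handling applies).`
    : `Stay within "${currentTask}" and branch into the related intent "${transition.targetIntent}".`;
}

console.log(describeTransition("Book Flight", { kind: "interruption", targetIntent: "Weather Report" }));
console.log(describeTransition("Book Flight", { kind: "subIntent", targetIntent: "Select Seat" }));
```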
