Release 6.2 introduces the Chatbot-IVR integration capability, which enables developers to integrate conversational bots with their existing IVR systems. This release also brings several significant additions to the Knowledge Graph and System Entities. The Kore.ai Bots Platform now also lets developers customize NLP parameters, since every use case demands different NLP criteria. Read on for the full list of features included in this release.
Seamless Chatbot-IVR Integration
If you are using an IVR system for your customer service, it’s about time to give it an AI-era makeover. Kore.ai now lets you build chatbots that empower your IVR system with human-like conversation capabilities, taking your organization’s customer service experience to the next level.
Our platform provides the following features that make the integration quick and seamless:
- Native VXML Support: Built-in support to parse and generate W3C-compliant VXML files.
- Hybrid Integration: Flexibility to build only the essential use cases or dialogs on the Bots Platform and run them in sync with your existing IVR dialogs.
- Grammar and Transcription Engine Capabilities: Choice to define the grammar or to use UNIMRCP-based voice-to-text services.
- Granular Call Flow Support: Ability to define all the call flow elements such as grammar, prompts, retries, and time-out periods (see the sketch at the end of this section).
Help link: IVR Integration
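For a rough sense of what these call flow settings cover, the sketch below models them as a plain JavaScript object. The property names (initialPrompt, grammar, retries, timeoutSeconds, and so on) are illustrative assumptions, not the platform’s actual configuration keys; the real settings are defined per node in the IVR properties.

```javascript
// Illustrative only: a hypothetical call-flow definition for one IVR node.
// Property names are assumptions for this sketch, not actual platform keys.
var accountNumberNode = {
    initialPrompt: "Please say or enter your ten-digit account number.",
    grammar: {
        type: "builtin",                  // e.g. a built-in digits grammar
        value: "digits?length=10"
    },
    retries: 2,                           // number of re-prompts before fallback
    retryPrompt: "Sorry, I didn't get that. Please repeat your account number.",
    timeoutSeconds: 5,                    // silence period before the no-input prompt
    noInputPrompt: "Are you still there? Please enter your account number."
};

// A generated VXML <field> for this node would carry the same prompt, grammar,
// retry, and timeout settings.
console.log(JSON.stringify(accountNumberNode, null, 2));
```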
Knowledge Graph Enhancements
- Term Designations to Qualify Paths: You can now designate the terms in your ontology as Regular, Mandatory, or Organizer, depending on their importance in qualifying matching paths.
- Language-Specific Ontology and Publishing Choice: You can now build a separate ontology for each supported language to customize its flow according to the selected language. You can also publish the Knowledge Graph only in selected languages.
- Global Synonyms: When you add a synonym for a term in the Knowledge Graph, you can now add it as a global synonym. Local synonyms apply to the term only in that particular path, whereas global synonyms apply to the term in any path where it occurs.
- Concurrent Development: Multiple developers can now work simultaneously on an ontology. Only the node being edited and its parents up to the first-level node are locked; the other nodes remain open for editing.
- Multi-message Responses: Responses to FAQs can sometimes be lengthy or include nice-to-have information beyond the primary answer. To improve readability, you can now split such information into multiple responses that are sent consecutively as separate messages.
Help link: Kore.ai Knowledge Graph
Customizable NLP Thresholds
Kore.ai now exposes vital NLP thresholds so that, instead of relying on default values, you can customize them for your bot’s specific needs. These thresholds control the bot’s Fundamental Meaning, Knowledge Graph, and Ranking & Resolver engines.
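For illustration only, the kinds of knobs involved might look like the sketch below; the names and values are assumptions made for this example, and the actual thresholds are configured in the bot’s Natural Language settings.

```javascript
// Illustrative only: hypothetical threshold names and values,
// not the platform's actual keys or defaults.
var nlpThresholds = {
    fundamentalMeaning: {
        intentScoreThreshold: 0.3       // minimum FM score for an intent to count as a match
    },
    knowledgeGraph: {
        definitiveScoreThreshold: 0.93, // score above which a KG match is treated as definitive
        suggestionsLimit: 3             // maximum FAQ suggestions shown for probable matches
    },
    rankingAndResolver: {
        proximityRange: 0.05            // score gap within which two intents are treated as ambiguous
    }
};
console.log(nlpThresholds);
```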
Entity Node Enhancements
We’ve made the following enhancements to entity recognition by the Kore.ai NLP engine:
- New Entity Types:
  - Composite: Captures a composite value from the user input, one that involves more than one entity. For example, Car Type can be a combination of build date, make, and model, and Distance could be a combination of direction and number of miles, such as 2 miles East (see the sketch at the end of this section).
  - Date Period: Captures a start date and an end date from the user input, for example, Book the hotel for five days starting May 5. If the user input doesn’t include one or both of the dates, the bot prompts the user to provide the necessary input.
  - Quantity – Age: Captures the age in one of the following units: days, weeks, months, years, decades, and centuries.
- Instance-Specific Text Prompts: You can now write User or Error Prompts specific to a particular instance of the Entity node in the Instances Properties window. When you do so, the prompts that you define in the Component Properties of the node become disabled for the entity instance.
- Auto-Correction of Relevant Inputs for List of Values: You can now set up auto-correct thresholds for the LOV entity type so that it accepts not only exact matches but also the closest utterances with minor variations. For example, consider a list value called Apple with the synonyms red apple, green apple, and raw apple. For this entity, user inputs like raw apple, green, raw, and apple were always accepted, but based on your threshold settings the bot now also accepts a typo such as “appel”. Spell correction doesn’t apply to dictionary words or alphanumeric inputs.
- Reuse a Used-up Entity Value: You can now turn on the option to reuse a user utterance for an entity node, wherein the utterance is used to extract the entity value for the node even if the same utterance has been used by another entity before.
- Optional User Input: You can now mark entities as Mandatory, Optional, or Hidden. Mandatory entities require the end user to provide a value before the dialog proceeds. Optional entities prompt the user for input only once and proceed with the dialog regardless of the input provided. Hidden entities do not prompt the user for input at all.
Help link: Working with the Entity Node
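To make the new entity types more concrete, the sketch below shows how captured Composite and Date Period values might look once extracted for downstream use. The structure and key names are assumptions for this illustration, not the exact shape of the platform’s context object.

```javascript
// Illustrative only: hypothetical shapes for extracted entity values.
var capturedEntities = {
    // Composite entity combining two member entities (direction + number of miles)
    Distance: {
        direction: "East",
        miles: 2
    },
    // Date Period entity: start and end dates captured from the user input,
    // or prompted for if either is missing
    StayPeriod: {
        startDate: "2019-05-05",
        endDate: "2019-05-10"
    }
};
console.log("Heading " + capturedEntities.Distance.miles + " miles " + capturedEntities.Distance.direction);
```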
Code Samples for Message Templates
Creating channel-specific messages using templates is now a lot easier. When you select a message template for the Web/Mobile or Facebook Messenger channels, the editor shows sample code that you can read, copy, or tweak to create your own.
Help link: Message Formatting and Templates
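As a rough illustration, a sample for a button template on the Web/Mobile channel might look like the sketch below. The field names here (template_type, buttons, postback) follow common messenger-style payloads and are assumptions for this sketch; the sample the editor generates remains the authoritative reference.

```javascript
// Illustrative sketch of a channel-specific button template; the editor's generated
// sample is the authoritative reference and its fields may differ.
var message = {
    "type": "template",
    "payload": {
        "template_type": "button",
        "text": "What would you like to do next?",
        "buttons": [
            { "type": "postback", "title": "Check balance", "payload": "check_balance" },
            { "type": "postback", "title": "Recent transactions", "payload": "recent_transactions" }
        ]
    }
};
// Assumes the message editor's JavaScript mode, where the composed payload is printed.
print(JSON.stringify(message));
```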
Advanced, Channel-Specific Welcome Messages
You can now format your bot’s welcome message using JavaScript, just like all other prompts. You can also customize the welcome message for each bot channel.
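For example, a JavaScript-formatted welcome message might vary its greeting by the time of day, along the lines of the sketch below. This assumes the prompt editor’s JavaScript mode, where the composed text is emitted with print(); the greeting text itself is only an example.

```javascript
// Illustrative sketch of a JavaScript-formatted welcome message.
var hour = new Date().getHours();
var greeting = (hour < 12) ? "Good morning" :
               (hour < 18) ? "Good afternoon" : "Good evening";
// Assumes the prompt editor's JavaScript mode, where the message is printed.
print(greeting + "! Welcome to our support bot. How can I help you today?");
```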
Minor Updates
- As part of bot training, when a task has both published and in-development versions, the training windows no longer show the task name twice (once as the task name and once with an @development suffix). The task name now appears only once. When you click the name to edit it, the in-development version opens directly if one exists. If the task has only a published version, you receive an alert with an option to create an in-development version.
- You can now dynamically pass the end user’s language preference to the WebSDK for every user session. This enhances the end-user experience when their preferred language has already been captured (see the sketch at the end of this list).
- Smart Bots: Inherited bots can now use the parent bot’s BotKit, without the need to host a separate BotKit for each inherited bot.
- You now have an option to turn off developer interactions from the Analyze page.
- The Context object now includes all user utterances along with timestamps.
- We’ve improved the Did you mean dialog to perform as follows:
  - For a single possible match:
    - If the response is affirmative (Yes) – Executes the intent
    - If the response is negative (No) – Discards the Did you mean dialog and initiates the help message
    - If the user message exactly matches the task name – Executes the intent
    - Anything else/partial match – Discards the Did you mean dialog and initiates the help message
  - For multiple possible matches:
    - If the user message exactly matches a task name – Executes the intent
    - If the user types a, b, c, and so on – Selects the corresponding task from the list and executes it
    - Anything else/partial match – Discards the Did you mean dialog and initiates the help message
- Kore.ai NLP intent detection and entity extraction services are now available via a public API.
- All Post Login API calls now include a Random ID in the API request header.
- Elasticsearch has been updated to the latest version.
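Regarding the WebSDK language preference noted above, passing it per session might look like the sketch below. The option name used here (language) and the identifier values are assumptions for this illustration; refer to the WebSDK documentation for the exact parameter.

```javascript
// Illustrative only: the option and identifier values below are hypothetical placeholders.
var botOptions = {};
botOptions.userIdentity = "user@example.com";   // hypothetical end-user identity
botOptions.language = "de";                      // end user's preferred language for this session

// In a host page, the value could come from the signed-in user's stored profile, e.g.:
// botOptions.language = currentUser.preferredLocale;
```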