Natural Language

Spell Correction – Version 2

The Spell Correction feature corrects misspelled words in the user’s utterance for each supported language. Version 10.1.19 of the XO Platform includes a new version of Spell Correction (Version 2) for English that comes with the following advantages: Consistent Spell Correction Experience: The spell…
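
As a rough illustration of what utterance-level spell correction does, the sketch below uses the open-source pyspellchecker library as a stand-in; it is not the Platform’s correction engine, and the output shown is approximate.

    # Illustrative stand-in for utterance-level spell correction (not the XO Platform's engine).
    from spellchecker import SpellChecker

    spell = SpellChecker(language="en")  # language-specific dictionary

    def correct_utterance(utterance: str) -> str:
        corrected = []
        for word in utterance.split():
            # correction() returns the most likely candidate, or None if there is none
            corrected.append(spell.correction(word) or word)
        return " ".join(corrected)

    print(correct_utterance("plese bok a fligt to london"))
    # expected (approximately): "please book a flight to london"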

Prompts and Requests Library

If you have integrated a Custom LLM, continue to add prompts. If not, skip this article and see the Co-Pilot and Dynamic Conversations features. Effective prompts play a crucial role in improving response accuracy when interacting with LLMs. The new Prompts Library module empowers bot designers by allowing them to create…
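
To make the idea of a reusable prompt concrete, here is a minimal sketch of a prompt template with placeholder variables; the variable names are hypothetical and not the Platform’s reserved context variables.

    # Hypothetical prompt template; the placeholder names below are illustrative only.
    PROMPT_TEMPLATE = (
        "You are a virtual assistant for {brand_name}.\n"
        "Conversation so far:\n{conversation_history}\n"
        "User: {user_input}\n"
        "Answer concisely, using only information grounded in the context above."
    )

    def render_prompt(brand_name: str, conversation_history: str, user_input: str) -> str:
        return PROMPT_TEMPLATE.format(
            brand_name=brand_name,
            conversation_history=conversation_history,
            user_input=user_input,
        )

    print(render_prompt("Acme Travel", "User: Hi\nBot: Hello!", "I want to change my booking"))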

LLM Integration

To use LLM and Generative AI features, you must configure the integration with a pre-built or custom LLM. Pre-built LLM Integration: The XO Platform offers seamless integration with leading AI services like Azure OpenAI, OpenAI, and Anthropic. Utilizing pre-configured prompts and APIs, you can effortlessly tap into the core capabilities…
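
For context, the pre-built integrations wrap the providers’ public APIs. The sketch below shows the kind of request a pre-built OpenAI integration might issue on your behalf; it is an outside illustration, not the Platform’s internal code, and the model name and settings are assumptions.

    # Illustrative call to OpenAI's chat completions API (not the XO Platform's internal code).
    from openai import OpenAI

    client = OpenAI(api_key="YOUR_OPENAI_API_KEY")  # key supplied during integration setup

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; use the model configured in the integration
        messages=[
            {"role": "system", "content": "You are a helpful virtual assistant."},
            {"role": "user", "content": "Summarize my last order status."},
        ],
        temperature=0.2,
    )

    print(response.choices[0].message.content)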

Co-Pilot Features

The Co-Pilot features add design-time capabilities that accelerate your bot development process using tailored LLM features. By default, all the features are disabled. To enable a feature, select the model and the prompt (for custom models only), and then toggle the status. You can select another supported model for…

Dynamic Conversations Features

The Dynamic Conversations features boost your virtual assistant’s performance with LLM-powered runtime capabilities designed to streamline development and reduce time and effort. By default, all the features are disabled. To enable a feature, select the model and toggle the status. You can select another supported model for…

Kore.ai XO GPT Module

The new Kore.ai XO GPT Models module provides fine-tuned large language models optimized for enterprise conversational AI applications. These models have been evaluated and fine-tuned to be accurate, safe, and efficient for production deployment. Initial capabilities include Conversation Summarization and User Query Rephrasing. Additional models for features like Intent Resolution,…

Introduction to LLM and Generative AI

The Kore.ai XO Platform helps enhance your bot development process and enrich end-user conversational experiences by integrating pre-built (OpenAI, Azure OpenAI, Anthropic) or custom models in the backend. In addition to the out-of-the-box integration with pre-built models, the Platform supports the bring-your-own (BYO) model framework to integrate with externally hosted…

Training Validations

NLP models play a significant role in providing natural conversational experiences for your customers and employees. Improving the accuracy of the NLP models is a continuous journey and requires fine-tuning, as you add new use cases to your virtual assistant. The Kore.ai XO Platform proactively validates the NLP training provided…

Advanced NLP Configurations

You can fine-tune intent detection for each language enabled for your Virtual Assistant (VA). To do this, follow these steps: On the left pane, click Natural Language > Training > Thresholds & Configurations. Under the Thresholds & Configurations section, you can fine-tune intent detection by customizing the Fundamental Meaning model – see…
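
As a conceptual illustration of how such thresholds can influence intent selection, the sketch below applies a minimum and a definitive confidence threshold to a set of intent scores; the values and scoring are invented for the example and do not reflect the Platform’s actual algorithm.

    # Conceptual threshold logic; thresholds and scores are invented for illustration.
    DEFINITIVE_THRESHOLD = 0.9  # assumed: a score at or above this wins outright
    MINIMUM_THRESHOLD = 0.3     # assumed: scores below this are discarded

    def resolve_intent(scores: dict) -> str:
        eligible = {intent: s for intent, s in scores.items() if s >= MINIMUM_THRESHOLD}
        if not eligible:
            return "No intent matched"
        best_intent, best_score = max(eligible.items(), key=lambda kv: kv[1])
        if best_score >= DEFINITIVE_THRESHOLD or len(eligible) == 1:
            return best_intent
        return "Ambiguous: ask the user to choose among " + ", ".join(sorted(eligible))

    print(resolve_intent({"Book Flight": 0.95, "Cancel Flight": 0.40}))  # Book Flight
    print(resolve_intent({"Book Flight": 0.55, "Cancel Flight": 0.50}))  # Ambiguous: ...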

Model Validation

Once you have built and trained your virtual assistant, the Kore.ai platform builds an ML model mapping user utterances to intents. Once the model is created, it is recommended that you validate it to understand and estimate its unbiased generalization performance. The XO…
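
To see what estimating unbiased generalization performance looks like in practice, here is a small K-fold cross-validation sketch on a toy utterance-to-intent dataset using scikit-learn; it is an outside illustration, not the Platform’s validation code.

    # K-fold cross-validation on a toy intent classifier (illustration only).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    utterances = [
        "book a flight to paris", "i need a plane ticket", "reserve a flight for me",
        "cancel my booking", "i want to cancel the ticket", "drop my reservation",
    ]
    intents = ["BookFlight", "BookFlight", "BookFlight",
               "CancelFlight", "CancelFlight", "CancelFlight"]

    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))

    # Each fold is held out once, so every utterance is scored by a model
    # that never saw it during training.
    scores = cross_val_score(model, utterances, intents, cv=3, scoring="accuracy")
    print("per-fold accuracy:", scores, "mean:", scores.mean())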