# Platform

### What is the NLX platform?

NLX is an end-to-end platform for building conversational AI applications. What are conversational applications, you ask? The answer is simple: any application you can converse with naturally, as opposed to having to click around.

Whether you’re launching a hands-free Voice+ app, an immersive chat interface, or an agentic AI application that reasons and acts, NLX provides everything you need to design, deploy, and scale in one place.

<h4 align="center">Chat.   Voice.   Voice+</h4>

<figure><img src="https://2737319166-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHCxYxhIU0Bqkjj942mGk%2Fuploads%2F8i1xLkgEpqTN1HiViDUD%2FApplicationStack.png?alt=media&#x26;token=0a38dc12-de5c-426b-ad15-b2ee04693e6e" alt=""><figcaption></figcaption></figure>

<h4 align="center">Build.   Deploy.   Analyze.</h4>

Rather than forcing builders to juggle multiple tools or write complex orchestration code, NLX handles the infrastructure, language models, integrations, security, and runtime complexity so your team can focus on crafting exceptional user experiences quickly.

#### With NLX, you can

<i class="fa-check">:check:</i>  Build visual workflows that automate tasks and guide conversations

<i class="fa-check">:check:</i>  Power your apps with LLMs, APIs, knowledge bases, and agentic reasoning

<i class="fa-check">:check:</i>  Deploy to any channel: web, mobile, telephony, kiosks, IoT, and beyond

<i class="fa-check">:check:</i>  Monitor performance, analyze user behavior, and continuously refine the experience

<i class="fa-check">:check:</i>  Scale effortlessly across regions, languages, and business units

<table><thead><tr><th width="379">Other platforms</th><th>NLX</th></tr></thead><tbody><tr><td>Custom code required to stitch together LLMs, STT, TTS, APIs, and workflows</td><td>No-code orchestration with built-in nodes, tools, and plug-and-play integrations</td></tr><tr><td>Changes require updating many systems</td><td>Change once in NLX and all channels receive the update</td></tr><tr><td>Debugging spans logs across several services</td><td>One unified debugger with full turn-by-turn state tracking</td></tr><tr><td>Each channel (chat, voice, web, CCaaS) requires its own setup</td><td>Channels are managed in-app, share the same flows, and deploy from a single build</td></tr><tr><td>LLM usage is inconsistent and hard to govern</td><td>Centralized governance for models, guardrails, data access, and behavior</td></tr><tr><td>Building multi-step or agentic logic requires custom engineering</td><td>Agentic Generative Journey® automates complex workflows out of the box</td></tr><tr><td>Hard to maintain multilingual experiences</td><td>Native translation management and auto-translate across all resources</td></tr><tr><td>Fragmented analytics across products</td><td>Unified analytics and conversation insights in one place</td></tr></tbody></table>

### How we power conversations

NLX’s conversation engine orchestrates every layer of your conversational AI stack, from application design and workflow automation to data integrations, analytics, and access management. It brings together the systems, models, and tools needed to build scalable, intelligent interactions.

With NLX, you can design and deploy applications that automate tasks, connect to APIs and knowledge bases, support multiple languages, and deliver real-time insights. Built-in tools for analytics, conversation review, and role management make it simple to monitor performance, refine experiences, and keep your applications running smoothly.

<figure><img src="https://2737319166-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHCxYxhIU0Bqkjj942mGk%2Fuploads%2Fl7QyIJn9UrXj7A3DoXTO%2FInfrastructure%20Diagram.png?alt=media&#x26;token=e90f661b-665d-4509-933d-be3d0e3126b1" alt=""><figcaption></figcaption></figure>

### Core runtime functions

Our conversation engine orchestrates communication between your frontend interface, backend systems, and connected services. It tracks conversation state (every turn between the user and your app) to maintain continuity, support debugging, and feed powerful analytics.

Put simply, the NLX runtime handles three core functions:

<table data-view="cards"><thead><tr><th></th><th></th><th data-hidden data-card-cover data-type="image">Cover image</th></tr></thead><tbody><tr><td><strong>Sending &#x26; receiving messages</strong></td><td>The runtime manages all inbound and outbound messages between users and your application across any channel</td><td data-object-fit="cover"><a href="https://2737319166-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHCxYxhIU0Bqkjj942mGk%2Fuploads%2Fhqqr5KEllCfJf9oeYyO6%2FMessages.png?alt=media&#x26;token=b73b6677-dd0e-470c-b7b5-9e0e0f1e5a95">Messages.png</a></td></tr><tr><td><strong>Calling APIs, LLMs, &#x26; CCaaS systems</strong></td><td>It executes real-time calls to custom or managed APIs, LLMs, and CCaaS platforms to fetch data, run logic, and keep your AI agent informed and responsive</td><td><a href="https://2737319166-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHCxYxhIU0Bqkjj942mGk%2Fuploads%2FmSeYvZ9OuR1FhAZqD7s1%2FCalls.png?alt=media&#x26;token=b8dd84aa-db81-42a3-9d7a-bdab49260175">Calls.png</a></td></tr><tr><td><strong>Building &#x26; executing workflows</strong></td><td>The runtime assembles your conversation logic, messaging, and prompts into a deployable build and then executes that logic at runtime to power each turn of the interaction</td><td><a href="https://2737319166-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHCxYxhIU0Bqkjj942mGk%2Fuploads%2FpJn3yyOUpmdcOYje0oz6%2FLogic.png?alt=media&#x26;token=6ee9f7e1-1482-4a13-9283-f877b6a59246">Logic.png</a></td></tr></tbody></table>
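To make these three functions concrete, here is a minimal, hypothetical sketch of a single conversation turn: the runtime receives a message, a workflow node calls an external service, and the turn is recorded in conversation state before the reply is sent back. All names below (`ConversationState`, `WorkflowNode`, `runTurn`, `lookupOrder`) are illustrative stand-ins, not NLX's actual SDK.

```typescript
// Hypothetical model of one runtime turn. These types and functions are
// illustrative only; they are not part of the NLX SDK.

interface ConversationState {
  turns: { user: string; app: string }[]; // turn-by-turn history
  slots: Record<string, string>;          // data captured so far
}

// A workflow node: takes the user message and state, may call an
// external service, and returns the app's reply.
type WorkflowNode = (message: string, state: ConversationState) => string;

// Stand-in for a real-time API call made during the turn.
const lookupOrder = (orderId: string): string =>
  `Order ${orderId} ships Friday`;

const orderStatusNode: WorkflowNode = (message, state) => {
  const id = message.match(/\d+/)?.[0];
  if (!id) return "Which order number?";
  state.slots.orderId = id;       // slot capture
  return lookupOrder(id);         // API call mid-turn
};

// The runtime loop: receive a message, execute workflow logic,
// record the turn in state, and send the reply back.
function runTurn(
  state: ConversationState,
  node: WorkflowNode,
  message: string
): string {
  const reply = node(message, state);
  state.turns.push({ user: message, app: reply });
  return reply;
}

const state: ConversationState = { turns: [], slots: {} };
console.log(runTurn(state, orderStatusNode, "Where is order 42?"));
// → "Order 42 ships Friday"
```

Because every turn is appended to the state object, the same record that powers continuity can also feed the debugger and analytics described above.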

### Built-in AI capabilities

<i class="fa-brain-circuit">:brain-circuit:</i> Agentic app: Execute complex, multi-step tasks by combining API calls, knowledge retrieval, and user input, all within a single node

<i class="fa-bullseye-arrow">:bullseye-arrow:</i> Intent classification with NLX Boost: Improve flow detection accuracy by up to 90% compared to traditional NLP alone

<i class="fa-grid-2-plus">:grid-2-plus:</i> Slot capture: Collect multiple user inputs naturally, even when details are provided out of order or revised mid-conversation

<i class="fa-split">:split:</i> Conditional logic & data transformation: Dynamically branch conversations or filter data sources to deliver more relevant, personalized responses

<i class="fa-dumbbell">:dumbbell:</i> Training data: Automatically produce sample utterances for training NLP engines on intent detection

<i class="fa-clipboard-check">:clipboard-check:</i> Tests: Instantly create variations of test inputs to validate your model’s accuracy and coverage

#### BYO LLM

Integrate your preferred large language model (LLM) to power AI-driven logic, reasoning, and messaging within your NLX flows. Connect a model to generate text, automate decisions, and extend your agentic capabilities.

<table data-view="cards"><thead><tr><th></th><th></th><th data-hidden data-card-target data-type="content-ref"></th></tr></thead><tbody><tr><td><p><i class="fa-comment-dots">:comment-dots:</i></p><p><strong>Text outputs</strong></p></td><td>Generate dynamic responses, summaries, or formatted content directly within conversation nodes</td><td><a href="../../flows-and-building-blocks/overview/nodes#generative-text">#generative-text</a></td></tr><tr><td><p><i class="fa-gear">:gear:</i></p><p><strong>Agentic mode</strong></p></td><td>Use your chosen model to autonomously run multi-tool calls and complete complex user tasks</td><td><a href="../ai-applications/types/agentic">agentic</a></td></tr><tr><td><p><i class="fa-link">:link:</i></p><p><strong>Model Context Protocol</strong></p></td><td>Allow LLMs to call NLX workflows as tools, passing context between steps for seamless task execution</td><td><a href="../ai-applications/deployment/mcp-server">mcp-server</a></td></tr></tbody></table>
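The Model Context Protocol pattern above can be sketched roughly: an LLM issues a tool call by name, and a server routes that call to the matching workflow, returning the result as context for the model's next step. This is a generic illustration of the tool-calling pattern, assuming a simple name-to-handler registry; none of these identifiers belong to the actual NLX MCP server.

```typescript
// Illustrative sketch of MCP-style tool dispatch. The names here
// (ToolCall, tools, handleToolCall, bookFlight) are hypothetical,
// not the NLX MCP server interface.

interface ToolCall {
  name: string;                     // tool the model wants to invoke
  args: Record<string, unknown>;    // arguments supplied by the model
}

// Workflows exposed to the model as callable tools.
const tools: Record<string, (args: Record<string, unknown>) => string> = {
  bookFlight: (args) => `Booked flight to ${args.destination}`,
};

// Route a model-issued tool call to the matching workflow and
// return the result so it can be passed back as model context.
function handleToolCall(call: ToolCall): string {
  const tool = tools[call.name];
  if (!tool) throw new Error(`Unknown tool: ${call.name}`);
  return tool(call.args);
}

console.log(
  handleToolCall({ name: "bookFlight", args: { destination: "Lisbon" } })
);
// → "Booked flight to Lisbon"
```

The key design point is that the model never executes the workflow itself; it only names a tool and supplies arguments, while the server stays in control of what actually runs.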
