Platform
Create intelligent conversational applications with NLX
What is the NLX platform?
NLX is an end-to-end platform for building conversational AI applications. What are conversational applications, you ask? Simply put: any application you can converse with naturally, rather than having to click around.
Whether you’re launching a handsfree Voice+ app, an immersive chat interface, or an agentic AI application that reasons and acts, NLX provides everything to design, deploy, and scale in one place.
Chat. Voice. Voice+.

Build. Deploy. Analyze.
Builders shouldn't have to juggle multiple tools or write complex orchestration code. NLX handles the infrastructure, language models, integrations, security, and runtime complexity so your team can focus on crafting exceptional user experiences quickly.
With NLX, you can:
Build visual workflows that automate tasks and guide conversations
Power your apps with LLMs, APIs, knowledge bases, and agentic reasoning
Deploy to any channel: web, mobile, telephony, kiosks, IoT, and beyond
Monitor performance, analyze user behavior, and continuously refine the experience
Scale effortlessly across regions, languages, and business units
| Without NLX | With NLX |
| --- | --- |
| Custom code required to stitch together LLMs, STT, TTS, APIs, and workflows | No-code orchestration with built-in nodes, tools, and plug-and-play integrations |
| Changes require updating many systems | Change once in NLX and all channels receive the update |
| Debugging spans logs across several services | One unified debugger with full turn-by-turn state tracking |
| Each channel (chat, voice, web, CCaaS) requires its own setup | Channels are managed in-app, share the same flows, and deploy from a single build |
| LLM usage is inconsistent and hard to govern | Centralized governance for models, guardrails, data access, and behavior |
| Building multi-step or agentic logic requires custom engineering | Agentic Generative Journey® automates complex workflows out of the box |
| Hard to maintain multilingual experiences | Native translation management and auto-translate across all resources |
| Fragmented analytics across products | Unified analytics and conversation insights in one place |
How we power conversations
NLX’s conversation engine orchestrates every layer of your conversational AI stack, from application design and workflow automation to data integrations, analytics, and access management. It brings together the systems, models, and tools needed to build scalable, intelligent interactions.
With NLX, you can design and deploy applications that automate tasks, connect to APIs and knowledge bases, support multiple languages, and deliver real-time insights. Built-in tools for analytics, conversation review, and role management make it simple to monitor performance, refine experiences, and keep your applications running smoothly.

Core runtime functions
Our conversation engine orchestrates communication between your frontend interface, backend systems, and connected services. It tracks conversation state (every turn between the user and your app) to maintain continuity, support debugging, and feed powerful analytics.
Put simply, the NLX runtime handles three core functions:

Sending & receiving messages
The runtime manages all inbound and outbound messages between users and your application across any channel

Calling APIs, LLMs, & CCaaS systems
It executes real-time calls to custom or managed APIs, LLMs, and CCaaS platforms to fetch data, run logic, and keep your AI agent informed and responsive

Building & executing workflows
The runtime assembles your conversation logic, messaging, and prompts into a deployable build and then executes that logic at runtime to power each turn of the interaction
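As a rough mental model, the three core functions above can be pictured as a single turn loop. This is only an illustrative sketch, not NLX's actual runtime or API; every name here (`ConversationState`, `call_llm`, `handle_turn`) is hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class ConversationState:
    # Every turn between the user and the app is recorded, which is what
    # enables continuity, debugging, and analytics.
    history: list = field(default_factory=list)

def call_llm(prompt: str) -> str:
    # Stand-in for a real-time call to an LLM, API, or CCaaS system.
    return f"echo: {prompt}"

def handle_turn(state: ConversationState, user_message: str) -> str:
    # 1. Receive the inbound message (from any channel).
    state.history.append(("user", user_message))
    # 2. Call external services to fetch data or run logic.
    reply = call_llm(user_message)
    # 3. Execute the workflow step and send the outbound message.
    state.history.append(("app", reply))
    return reply

state = ConversationState()
print(handle_turn(state, "hello"))  # → echo: hello
```

The point of the sketch is the state object: because each turn is appended to one tracked conversation state, a single debugger can replay the interaction turn by turn.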
Built-in AI capabilities
Agentic app: Execute complex, multi-step tasks by combining API calls, knowledge retrieval, and user input, all within a single node
Async Worker applications: Create fully autonomous agents that can search the web, generate content, compile PDFs, or perform custom workflows using managed or custom APIs
Intent classification with NLX Boost: Improve flow detection accuracy by up to 90% compared to traditional NLP alone
Slot capture: Collect multiple user inputs naturally, even when details are provided out of order or revised mid-conversation
Conditional logic & data transformation: Dynamically branch conversations or filter data sources to deliver more relevant personalized responses
Training data: Automatically produce sample utterances for training NLP engines on intent detection
Tests: Instantly create variations of test inputs to validate your model’s accuracy and coverage
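To make the slot-capture idea concrete: inputs can arrive out of order or be revised mid-conversation, and the collector simply merges each utterance into the set of filled slots. This toy sketch is not how NLX is configured (slot capture is set up in the builder, not in code), and the slot names and patterns are invented:

```python
import re

# Each utterance may fill or revise any slot; later mentions win.
SLOT_PATTERNS = {
    "date": re.compile(r"\bon (\w+day)\b"),
    "party_size": re.compile(r"\bfor (\d+) people\b"),
}

def capture(slots: dict, utterance: str) -> dict:
    for name, pattern in SLOT_PATTERNS.items():
        match = pattern.search(utterance)
        if match:
            slots[name] = match.group(1)  # revision overwrites earlier value
    return slots

slots = {}
capture(slots, "for 4 people please")               # party_size arrives first
capture(slots, "on Friday, actually for 5 people")  # date, plus a revision
print(slots)  # {'party_size': '5', 'date': 'Friday'}
```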
BYO LLM
Integrate your preferred large language model (LLM) to power AI-driven logic, reasoning, and messaging within your NLX flows. Connect a model to generate text, automate decisions, and extend your agentic capabilities.
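In general terms, "bring your own LLM" means placing any model behind a common interface so the rest of the flow logic doesn't care which vendor is underneath. A minimal sketch of that adapter pattern, with hypothetical names that are not NLX's integration API:

```python
from abc import ABC, abstractmethod

class LLM(ABC):
    """Common interface any connected model must satisfy."""
    @abstractmethod
    def generate(self, prompt: str) -> str: ...

class MyCustomModel(LLM):
    def generate(self, prompt: str) -> str:
        # In practice this would call your hosted model's API;
        # here it just transforms the prompt so the example runs.
        return f"[my-model] {prompt.upper()}"

def answer_user(model: LLM, user_message: str) -> str:
    # Flow logic depends only on the interface, not the vendor.
    return model.generate(f"Reply helpfully to: {user_message}")

print(answer_user(MyCustomModel(), "hi"))
# → [my-model] REPLY HELPFULLY TO: HI
```

Swapping models then means swapping the adapter, while the flows that generate text, automate decisions, and drive agentic reasoning stay unchanged.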