Overview
Easily connect third-party services or custom systems in NLX
What are integrations?
Integrations in NLX are how you connect your conversational applications to the external systems, models, and services they depend on. Whether you need to fetch data from your internal APIs, use a specific Large Language Model, connect to an NLP engine, or configure third-party tooling, integrations make those capabilities available across your workspace for use with any of your AI apps.
Once an integration is added, its related resources become instantly available for use in flows, agent tool calls, testing, and deployment.
To simplify, we can think of integrations in three primary categories, each serving a different purpose in your conversational AI stack:
Managed integrations
Prebuilt integrations with providers that NLX supports out of the box, like LLMs, NLP engines, TTS, CCaaS, etc.
Channel integrations
A type of managed integration, these connections deliver your apps to the environments where users interact
Custom integrations
For connecting NLX to your own systems. Think internal APIs, custom services, or bespoke endpoints not covered by managed integrations
Managed integrations & channels (out-of-the-box providers)
These are prebuilt connectors for services like Amazon Connect, OpenAI, Shopify, Bedrock, Twilio, Gmail, and others. Adding them typically involves authenticating with the provider and enabling NLX to use the service (e.g., for voice, telephony, NLP, LLMs, CCaaS routing, etc.).
They can be used in two ways:
In a flow: Assigned to generative nodes to power their intelligence (LLMs), or used in nodes that trigger API calls (Data request nodes or an agentic Generative Journey)
On an application: Assigned as a preferred NLP engine or as a communication channel
To access these types of integrations, select Resources in your workspace menu and choose the Integrations card:
Custom integrations
Custom integrations are user-defined API integrations that let your application communicate with your own services or bespoke endpoints. They can be used in two ways:
In a flow: Trigger an API call during conversation and use the returned data to personalize the next steps
As an agentic tool: Let our agentic Generative Journey autonomously invoke your custom integration to complete tasks
Common examples include retrieving account info, checking order status, writing to a CRM, fetching appointment availability, or sending transactional emails.
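To make the order-status example concrete, here is a minimal sketch of the kind of backend handler a custom integration might call. The function name, route, field names, and response shape are all illustrative assumptions for this sketch, not an NLX-defined contract.

```python
# Hypothetical backend logic behind an order-status endpoint that an NLX
# custom integration could call. The field names and schema are examples,
# not an NLX-defined contract.

def order_status_response(order_id: str) -> dict:
    """Shape the JSON payload an order-status endpoint might return."""
    # In a real service this would query your order database;
    # a small in-memory table stands in for it here.
    fake_orders = {"A-1001": {"status": "shipped"}}
    order = fake_orders.get(order_id)
    if order is None:
        return {"found": False, "orderId": order_id}
    return {"found": True, "orderId": order_id, **order}

print(order_status_response("A-1001"))
```

A structured response like this is what a flow would then use to personalize its next steps, for example branching on the `found` field.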
Custom integrations support:
Development vs. Production endpoints
Structured request/response schemas
Dynamic headers
Secrets for secure credential management
Built-in testing and debugging
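The capabilities above can be pictured as a single definition along these lines. This is only a sketch: the key names, the `{{secrets.*}}` placeholder syntax, and the schema format are assumptions made for illustration, not NLX's actual configuration format.

```python
# Illustrative shape of a custom integration definition. All key names,
# the secret-placeholder syntax, and the schema format are assumptions
# for this sketch, not NLX's actual configuration format.

integration = {
    "name": "order-status",
    "endpoints": {
        # Separate development vs. production endpoints
        "development": "https://dev.api.example.com/order-status",
        "production": "https://api.example.com/order-status",
    },
    "headers": {
        # Dynamic headers; the secret is resolved at call time rather
        # than stored in plain text
        "Authorization": "Bearer {{secrets.ORDER_API_KEY}}",
        "Content-Type": "application/json",
    },
    # Structured request/response schemas
    "request_schema": {"orderId": "string"},
    "response_schema": {"found": "boolean", "status": "string"},
}

# Selecting the endpoint for the current deployment stage:
stage = "development"
print(integration["endpoints"][stage])
```

Keeping the development and production URLs side by side in one definition is what lets the same integration be exercised in built-in testing and then deployed without editing the flow that calls it.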
To access this type of integration, select Resources in your workspace menu and choose the Data requests card:
How integrations work
When you complete the one-time setup of any integration:
It becomes available to all applications and flows in your workspace
Flows can call it via agent tools or Data requests/Action nodes
Permissions, settings, and auth tokens stay centrally managed
Updates apply automatically everywhere the integration is used