Overview

Easily integrate third-party or custom connections in NLX

What are Integrations?

Integrations in NLX are how you connect your conversational applications to the external systems, models, and services they depend on. Whether you need to fetch data from your internal APIs, use a specific Large Language Model, connect to an NLP engine, or configure third-party tooling, integrations make those capabilities available across your workspace for use with any of your AI apps.

Once an integration is added, its related resources become instantly available for use in flows, agent tool calls, testing, and deployment.

To simplify, we can think of integrations in three primary categories, each serving a different purpose in your conversational AI stack:

Managed integrations

Prebuilt integrations with providers that NLX supports out of the box, like LLMs, NLP engines, TTS, CCaaS, etc.

Channel integrations

A type of managed integration, these connections deliver your apps to the environments where users interact.

Custom integrations

For connecting NLX to your own systems. Think internal APIs, custom services, or bespoke endpoints not covered by managed integrations.

Managed integrations & Channels (out-of-the-box providers)

These are prebuilt connectors for services like Amazon Connect, OpenAI, Shopify, Bedrock, Twilio, Gmail, and others. Adding them typically involves authenticating with the provider and enabling NLX to use the service (e.g., for voice, telephony, NLP, LLMs, CCaaS routing, etc.).

They can be used in two ways:

  1. In a flow: Assigned to generative nodes to power their intelligence (LLMs), or used in nodes that trigger API calls (Data requests or the agentic Generative Journey)

  2. On an application: Assigned as a preferred NLP engine or as a communication channel

To access these types of integrations, select Resources in your workspace menu and choose the Integrations card:

Custom integrations

Custom integrations are user-defined API integrations that let your application communicate with your own services or bespoke endpoints. They can be used in two ways:

  1. In a flow: Trigger an API call during conversation and use the returned data to personalize the next steps

  2. As an agentic tool: Let our agentic Generative Journey autonomously invoke your custom integration to complete tasks

Common examples include retrieving account info, checking order status, writing to a CRM, fetching appointment availability, or sending transactional emails.
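To make the pattern concrete, a custom integration ultimately points at an HTTP endpoint you own. The sketch below is a hypothetical order-status handler; the payload shapes, field names, and lookup data are all assumptions for illustration, not schemas defined by NLX. In practice, the request and response models you configure on the integration determine these shapes.

```typescript
// Hypothetical payload shapes -- in NLX, the actual schemas are
// whatever you define on the custom integration.
interface OrderStatusRequest {
  orderId: string;
}

interface OrderStatusResponse {
  orderId: string;
  status: "processing" | "shipped" | "delivered" | "unknown";
}

// Stand-in for a lookup against your own system of record.
const orders: Record<string, OrderStatusResponse["status"]> = {
  "A-1001": "shipped",
  "A-1002": "processing",
};

// The handler your endpoint runs when a flow triggers the API call:
// it receives structured input and returns structured data the flow
// (or the agentic Generative Journey) can use in its next steps.
function handleOrderStatus(req: OrderStatusRequest): OrderStatusResponse {
  return { orderId: req.orderId, status: orders[req.orderId] ?? "unknown" };
}

console.log(handleOrderStatus({ orderId: "A-1001" }).status); // prints "shipped"
```

An unrecognized `orderId` falls through to `"unknown"` rather than throwing, which keeps the conversation flow able to branch on the result instead of failing outright.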

Custom integrations support:

  • Development vs. Production endpoints

  • Structured request/response schemas

  • Dynamic headers

  • Secrets for secure credential management

  • Built-in testing and debugging
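A minimal sketch of how the features above typically fit together on the service side: separate development and production endpoints, headers computed per request, and a secret injected at call time rather than hard-coded. Every name here (`endpoints`, `buildHeaders`, the header keys) is illustrative, not part of NLX's configuration surface.

```typescript
// Development vs. Production endpoints, selected per environment.
type Environment = "development" | "production";

const endpoints: Record<Environment, string> = {
  development: "https://dev.example.com/api/orders",
  production: "https://api.example.com/orders",
};

// Dynamic headers: computed for each request instead of hard-coded,
// with the credential supplied from a secret rather than source code.
function buildHeaders(apiKey: string, requestId: string): Record<string, string> {
  return {
    "Content-Type": "application/json",
    Authorization: `Bearer ${apiKey}`, // secret injected at call time
    "X-Request-Id": requestId,         // per-request value for tracing
  };
}

const env: Environment = "development";
const apiKey = "demo-key"; // in practice, resolved from your secrets store
const headers = buildHeaders(apiKey, "req-42");
console.log(endpoints[env], headers["X-Request-Id"]);
```

Keeping the endpoint map and header builder separate from the request logic mirrors what the integration settings manage for you: swapping environments or rotating a secret never touches the flow that uses the integration.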

To access this type of integration, select Resources in your workspace menu and choose the Data requests card:

How integrations work

When you complete the one-time setup of any integration:

  • It becomes available to all applications and flows in your workspace

  • Flows can call it via agent tools or Data requests/Action nodes

  • Permissions, settings, and auth tokens stay centrally managed

  • Updates apply automatically everywhere the integration is used

Channels and most managed integrations require you to authenticate with the provider and grant permissions, while custom integrations require you to define request and/or response models and environment settings.
