Core

Learn to set up your conversational AI application with NLX in minutes

What's a Core app?

A Core application (or Custom) is your fully configurable conversational experience. It gives you complete control over logic, messaging, routing, and behavior using the flows you design in our Canvas. Because Core apps are not tied to a single modality, they work across chat, voice, phone, SMS, and more, making them the most flexible option for building structured, production-grade experiences.

Core apps are ideal when you need:

  • Precise control over every turn of the conversation

  • Deterministic logic and routing, generative AI, or a combination of both

  • Customizable NLP configuration (engine selection, thresholds, intent matching)

  • Deployment across multiple delivery channels

Set up a Core application end-to-end

Below is the complete workflow from creation through deployment and implementation.

Step 1: Add app

  1. Select Applications in your workspace menu

  2. Click New application

  3. Choose Blank application from the available options and select Custom

  4. Enter a name for your application and click Create

Your new app will open on the Configuration tab.

Step 2: Configure

The Configuration tab defines your AI engine, delivery channels, and connected workflows.

1. AI Engine

An AI engine disambiguates human speech for both language understanding and intent recognition, and constructs a package of your application whenever a new build is made. The built-in NLX model is provided for seamless setup.

  1. Keep NLX’s built-in model or swap to a managed provider you previously integrated in your workspace (Amazon Lex, Google Dialogflow, etc.)

2. Delivery

Channels determine where your users interact with your app.

  1. Every Core app includes an API channel with native Voice+ support for easy implementation in web or mobile environments. See API channel for all settings and configuration options

    • [Optional] Hosting: Host your application as a Touchpoint app via the conversational.app domain. Ideal for previewing its final look during development and sharing your app externally with collaborators. Select Touchpoint from the default API channel and enable Hosting to configure the URL (e.g., mybusiness.conversational.app)

    • [Optional] For MCP setup, enable MCP interface under the API's General tab

  2. [Optional] Click + Add channel and choose one or more channels from the list where you want your application deployed. Fields are specific to the channel type selected. See list of channels
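To make the API channel concrete, here is a minimal sketch of how a client might assemble a request to send one user utterance to your app. The endpoint path, header name, and payload shape below are assumptions for illustration only, not NLX's documented API; always follow the channel's Setup instructions tab for the real contract.

```python
import json

def build_message_request(app_url: str, api_key: str,
                          conversation_id: str, utterance: str) -> dict:
    """Assemble a hypothetical request for sending one user utterance
    to a Core app's API channel. All field names are illustrative."""
    return {
        "url": f"{app_url}/conversations/{conversation_id}/messages",
        "headers": {
            "x-api-key": api_key,               # assumed auth header
            "Content-Type": "application/json",
        },
        "body": json.dumps({"type": "text", "payload": {"text": utterance}}),
    }

req = build_message_request(
    "https://example.nlx.invalid", "demo-key", "conv-123", "Track my order"
)
print(req["url"])
```

Keeping request assembly separate from transport like this makes it easy to swap in whatever HTTP client your web or mobile environment already uses.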

3. Functionality (flows)

Flows define your application’s behavior. Attach the flows you want your app to use:

  1. Click + Add flow and select one or more flows from your workspace for your app

  2. Click gear Default behavior and assign a flow to run during the following situations:

    • Welcome: Runs when a new conversation session starts. Use it to greet the user, set expectations, and collect any essentials (e.g., name or intent)

    • Fallback: Runs on timeouts, integration failures, state breaks, or exceeded incomprehension events that are not handled by a knowledge base. Route here to recover gracefully and guide the user forward

    • Unknown: Runs when the AI cannot match the user's response to any flow or provided choices. Use to invoke a knowledge base and check a question against a repository of information

    • [Optional] Escalation: Runs when a node hits the escalation path

  3. Click Save
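Conceptually, the Default behavior assignments act as a routing table keyed by conversation event. A minimal sketch of that idea (the event keys mirror the list above; the flow names are made up for illustration):

```python
# Map conversation events to the flows assigned under Default behavior.
# Flow names here are hypothetical placeholders.
DEFAULT_BEHAVIOR = {
    "welcome": "GreetUser",            # a new conversation session starts
    "fallback": "RecoverGracefully",   # timeout, integration failure, state break
    "unknown": "SearchKnowledgeBase",  # no flow or provided choice matched
    "escalation": "TransferToAgent",   # a node hit the escalation path
}

def route_event(event: str) -> str:
    """Return the flow assigned to a conversation event, or raise if unassigned."""
    try:
        return DEFAULT_BEHAVIOR[event]
    except KeyError:
        raise ValueError(f"No default flow assigned for event: {event}")

print(route_event("unknown"))  # SearchKnowledgeBase
```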

Step 3: Deploy

Deploying an application allows you to construct a build that contains a package of the flows, AI engine, settings, and delivery details in the state they exist at the time the build is created. You may then deploy a successful build to make it live or roll back to a previous deployment.

1. Build

  1. Click the deployment status in the upper right and choose Build and deploy

  2. Review the Validation check for critical errors or detected UX issues in custom flows. A validation check runs before each new build and previews issues that may cause the build to fail. Detected issues are listed with descriptions and potential solutions

  3. Select Build after reviewing validator

  4. [Optional] Provide a Description of notable build edits as a changelog

  5. Click Build

You can now test your newest build in your workspace using any of the test chats.

2. Deploy

Channels provide the frontend interface (how users experience your app). Deploying a build pushes your updates through any delivery channels set up on the app, making your app available outside your NLX workspace.

  1. Click the deployment status in the upper right and select All builds

  2. Choose Deploy on a successful build

    • [Optional] Deployment languages: Select the languages to include in the deployment, if multiple are available

  3. Click Deploy

Note: Enable One-click deploy to auto-deploy every new build. From the deployment status, open Deployment settings and turn One-click deploy on. All future builds will deploy automatically.

Once a build is made, flows can be further edited without affecting a deployed application. Only deploying a new build will impact live applications. One build may be deployed at a time and deploying any build deactivates the previous one. You can freely alternate between newer and older builds using Rollback or Deploy.
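The build/deploy lifecycle above can be summarized as a small state model: building never affects the live app, exactly one build is live at a time, and deploying any build (newer or older) deactivates the previous one. A toy sketch, purely illustrative:

```python
class AppDeployment:
    """Toy model of the behavior described above: builds accumulate,
    and deploying any build deactivates the previously deployed one."""

    def __init__(self):
        self.builds = []   # build IDs in creation order
        self.live = None   # currently deployed build, if any

    def build(self, build_id: str) -> None:
        # Building alone never impacts the live application.
        self.builds.append(build_id)

    def deploy(self, build_id: str) -> None:
        if build_id not in self.builds:
            raise ValueError("Can only deploy an existing successful build")
        self.live = build_id   # previous deployment is deactivated

app = AppDeployment()
app.build("v1")
app.build("v2")
app.deploy("v2")
app.deploy("v1")   # rollback is just deploying an older build
print(app.live)    # v1
```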

3. Implement

  1. Click the Configuration tab of your application and choose any channel assigned in the Delivery section

  2. Choose the Setup instructions tab and follow instructions for installing to your chosen communication channel

Note: Want to take your app offline? Click the deployed build > Select the Deployment tab in the modal > Scroll to Danger zone and click Delete deployment. The app stays offline until you redeploy.

Once you deploy a build, you may use your application outside the NLX workspace in two ways:

  • Delivery channel: Interact with the app through the channel where it’s installed (e.g., web chat via the API channel, voice, SMS)

  • NLX hosted: Open the hosting URL from a deployed build to chat with the app. Hosting must be enabled when you deploy

App settings

Select your app's Settings tab to access the following:

General

Provide a workspace description of your app with any relevant resource tags for better filing and organization.

AI settings
  • AI description: Input a brief description of the application for Model Context Protocol (MCP) Clients to reference when using your app as an MCP server

  • NLX Boost: Choose Enable NLX Boost to enhance the performance of your AI engine with generative AI intent classification

    • Allow NLX Boost to override the NLP: Enabling this relies on NLX's built-in generative AI to detect user intent (based on your routing data) and route to a flow accordingly, regardless of the AI engine's detected match

Advanced
  • Child-directed: For applications subject to COPPA. Utterance information will not be stored if enabled

  • Autocorrection: For supported NLPs, spell correction will be applied towards written user responses

  • Repeat on incomprehension: If a user's response is unclear, the application will repeat its last message to the user

  • NLP confidence threshold: When an utterance is at or above this value, your application assumes it matched to a flow with certainty

  • Negative sentiment threshold: The flow assigned to the Frustration default will be triggered when the NLP detects negative sentiment (e.g., profanity, sarcasm) above this threshold

  • Incomprehension count: The number of sequential utterances the application asks to clarify before an escalation or the Unknown flow is triggered

  • Conversation timeout (min): Sets the timeout period for all channels on your application. If a timeout was also configured for a channel when managing channels, that channel's setting will take precedence over the application's setting

  • Default project ID: Default Project ID when using Google's Dialogflow NLP
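A few of the Advanced settings above are easiest to understand as simple rules. The sketch below illustrates three of them (parameter names and values are illustrative, not NLX configuration keys): the confidence threshold is inclusive ("at or above"), the incomprehension count gates escalation, and a channel-level timeout takes precedence over the application's.

```python
from typing import Optional

def matched_with_certainty(confidence: float, threshold: float) -> bool:
    """NLP confidence threshold: at or above the threshold counts as a certain match."""
    return confidence >= threshold

def should_trigger_fallback(unclear_turns: int, incomprehension_count: int) -> bool:
    """Escalation or the Unknown flow triggers once the configured number
    of sequential unclear utterances is reached."""
    return unclear_turns >= incomprehension_count

def effective_timeout_min(app_timeout: int, channel_timeout: Optional[int]) -> int:
    """A channel-level timeout, when configured, overrides the application's."""
    return channel_timeout if channel_timeout is not None else app_timeout

print(matched_with_certainty(0.85, 0.85))  # True: "at or above" is inclusive
print(effective_timeout_min(30, 10))       # 10: the channel setting wins
```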

Automated tests (enterprise)

See Automated tests for complete setup instructions

Languages (check the list of supported languages in your workspace)
  • Choose which languages your application supports when in production. Though the flows attached to your application may be set up and developed in several languages, your application ultimately decides which to provide when released.

  • Expand either Main language or any Supported language to view advanced settings:

    • Use native NLP: Sends a user's utterance to the NLP directly without translation

    • Region: Global is chosen by default but may be toggled to EU for compliance and performance when using Dialogflow's NLP

    • Dialogflow project ID: If using Dialogflow's NLP, enter the project ID generated on the application's deployment tab

    • Amazon Lex Voice: If using Amazon's Lex NLP for voice channels, select the conversational AI voice to be used. You may listen to the selection of Amazon Polly voices here

Note: When an application is created, it also inherits the list of workspace languages that have been pushed to all resources. Under Main language, the default language is English (US), while Supported languages lists any additional workspace languages applied via Translations.

Need to adjust languages and translations at the workspace level? See Translations.

Lifecycle (enterprise)

Choose any Lifecycle hook (enterprise only) created within your workspace that enables your application to retrieve and use content from external resources or trigger an external operation during specific points within the application's conversation session. The following are all lifecycle types available to configure on your application:

  • Conversation start: Triggers when a conversation session starts

  • Conversation end: Triggers when a conversation session ends/times out

  • Escalation: Triggers when an Escalate node is hit and the agent escalation transfer initiates

  • Message received: Triggers whenever a user submits a message (utterance)

  • State modification: Triggers when a node is hit that includes State modifications applied to one or more Data requests where the Stream state modification setting is also enabled
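On the receiving side, an external service typically dispatches on the lifecycle event type. A hypothetical handler sketch (the event keys, payload fields, and responses are all made up for illustration; they are not NLX's webhook contract):

```python
# Hypothetical dispatcher mirroring the lifecycle hook types listed above.
def handle_lifecycle_event(event_type: str, payload: dict) -> str:
    """Route a lifecycle event to an illustrative handler. Payload fields
    such as 'conversation_id', 'text', and 'state' are assumptions."""
    handlers = {
        "conversation_start": lambda p: f"session {p['conversation_id']} started",
        "conversation_end": lambda p: f"session {p['conversation_id']} ended",
        "escalation": lambda p: "agent transfer initiated",
        "message_received": lambda p: f"utterance: {p['text']}",
        "state_modification": lambda p: f"streamed state keys: {sorted(p['state'])}",
    }
    if event_type not in handlers:
        raise ValueError(f"Unknown lifecycle event: {event_type}")
    return handlers[event_type](payload)

print(handle_lifecycle_event("message_received", {"text": "hello"}))
```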
