Agentic
From start to finish, set up a flow using Generative Journey in agentic mode with NLX
What's an agentic app?
An agentic app uses Generative Journey® v2 to let an LLM complete a task by combining conversation, data capture, and tool use inside a single node. Instead of hard-mapping every branch of a workflow, you define the task, assign the tools, and let the agent decide what to collect, what to use, and when to move forward.
With Generative Journey v2, the agent can:
gather required or optional details naturally from the user
call tools such as custom data requests, managed integrations, knowledge bases, modalities, flows, and slot-based data capture
complete multi-step tasks within one conversational experience
exit the node based on one or more defined completion criteria or data capture completion

Imagine a customer says:
"I’d like to exchange my shirt for a larger size."
With agentic Generative Journey:
the app invokes a flow designed for exchanges or returns
the Generative Journey v2 node gathers the needed details, such as order number, requested size, and shipping preference
the agent may use a custom data request to retrieve the order from your backend
it may send a modality, such as a carousel, so the customer can choose from available replacement sizes or colors
it may reference a knowledge base for return policy details
it can trigger another flow or integration to complete the exchange
In practice, this one node can replace many traditional workflow nodes, making complex tasks easier to build and maintain.
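To make the tool side concrete: a custom data request typically calls an endpoint on your backend and returns structured data the agent can use mid-conversation. The sketch below is a minimal, hypothetical handler for the exchange scenario above; the function name, field names (orderNumber, item, size), and payload shape are illustrative, not NLX's actual request schema:

```python
# Hypothetical backend handler for a custom data request tool.
# All field names here are illustrative, not NLX's schema.

ORDERS = {
    "A-1001": {"item": "Crew-neck shirt", "size": "M", "status": "delivered"},
}

def handle_order_lookup(payload: dict) -> dict:
    """Resolve the order the agent collected so it can offer an exchange."""
    order = ORDERS.get(payload.get("orderNumber"))
    if order is None:
        # An explicit "not found" result lets the agent re-prompt for the number.
        return {"found": False}
    return {"found": True, "order": order}
```

Because the agent decides when to call the tool, the handler only needs to answer one question well; the conversation logic stays in the node.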
Step 1: Set up tools
Tools are the resources your Generative Journey v2 agent can use to complete its task. Before building the node, identify which tools the agent will need and make sure they are configured.
Step 2: Create a flow
Create a flow that invokes your agent. The flow runs when the AI engine detects the appropriate user intent, or when it is assigned to an application default behavior (e.g., Welcome, Unknown) in Step 3.

Configure the node
In the node’s side panel:
Enter a Prompt that explains:
the task the agent is responsible for
the information it may need to collect
any tone, brand, or behavioral guidance
anything the agent should avoid
Add one or more Exit conditions to define when the agent should leave the node and proceed through the rest of the flow
Attach the tools the agent should be allowed to use
Add an optional Interim message: The agent will deliver this message before a tool is invoked (e.g., "Let me take a look," or, "One moment while I check")
If using Data capture, assign required and optional slots
Optionally enable Exit when complete for data capture if the node should exit automatically once all required slots are resolved
Select the LLM model best suited for your task and channel
Optionally adjust node settings such as:
Max steps
Zero-turn mode
Timeout
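Taken together, the side-panel fields above amount to a declarative configuration for the node. The sketch below models it as a plain Python dict purely for illustration; the key names and values are ours, not NLX's stored format, and the prompt and exit conditions continue the shirt-exchange example:

```python
# Illustrative model of a Generative Journey v2 node configuration.
# Keys and values are hypothetical; in NLX the side panel stores this for you.
node_config = {
    "prompt": (
        "Help the customer exchange an item. Collect the order number, "
        "requested size, and shipping preference. Stay concise and friendly; "
        "never promise refunds."
    ),
    "exit_conditions": [
        "Exchange has been submitted",
        "Customer asks for a human",
    ],
    "tools": ["order_lookup", "size_carousel", "returns_knowledge_base"],
    "data_capture": {
        "required": ["order_number", "requested_size"],
        "optional": ["shipping_preference"],
        "exit_when_complete": True,
    },
    "interim_message": "One moment while I check.",
    "settings": {"max_steps": 10, "zero_turn_mode": False, "timeout_s": 60},
}
```

Note how the exit conditions and required slots together define "done": the node can leave either because a condition is met or because every required slot is resolved with Exit when complete enabled.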
Step 3: Add the app
Once your flow is ready, create an application that will use it.
Select Applications in your workspace menu
Click New application
Choose Blank application
Enter a name and create the application
Step 4: Configure
The Configuration tab defines your AI engine, delivery channels, and flows your app will use.
Delivery
Choose the channels where users will interact with your app.
[Optional] For MCP setup, choose the API channel and enable MCP interface
[Optional] Hosting: Host your application as a Touchpoint app via the conversational.app domain. Ideal for previewing its final look during development and sharing your app externally with collaborators. Select Touchpoint from the default API channel and enable Hosting to configure the URL (e.g., mybusiness.conversational.app)
Select + Add channel and choose one or more channels from the list where you want your application deployed
Channel name: Defaults to the chosen channel type, but you may overwrite with a custom name
Integration: For channels requiring a one-time workspace integration, select from applicable integration(s) created
Custom conversation timeout: Customize the timeout period (in minutes) for a conversation session to end on the selected channel
Escalation: If any escalation channels have been created in your workspace, they will be listed here for selection
Remaining fields are specific to the channel type selected. See list of channels for complete instructions
Functionality + default behavior
Flows define your application’s behavior. Attach the flow that contains your Generative Journey v2 node:
Click + Add flow and select one or more flows from your workspace for your app
Click Default behavior and assign a flow to run during the following situations:
Welcome: Runs when a new conversation session starts. Use it to greet the user, set expectations, and collect any essentials (e.g., name or intent)
Fallback: Runs on timeouts, integration failures, state breaks, or exceeded incomprehension events that are not handled by a knowledge base. Route here to recover gracefully and guide the user forward
Unknown: Runs when the AI cannot match the user's response to any flow or provided choices. Use it to invoke a knowledge base and check the question against a repository of information
Escalation (optional): Runs when a node hits the escalation path. Route here to transfer to a human agent
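The four defaults behave like an event dispatcher: each conversation event is routed to whichever flow you assigned. A minimal sketch, with hypothetical flow names standing in for flows from your workspace:

```python
# Minimal sketch of how default behaviors route conversation events to flows.
# Flow names are placeholders, not flows that ship with NLX.
DEFAULTS = {
    "welcome": "GreetAndRoute",
    "fallback": "RecoverGracefully",
    "unknown": "KnowledgeBaseLookup",
    "escalation": "TransferToAgent",
}

def route_event(event: str) -> str:
    """Return the flow assigned to a default-behavior event.

    Unrecognized events fall through to the fallback flow, mirroring
    its role as the graceful-recovery path.
    """
    return DEFAULTS.get(event, DEFAULTS["fallback"])
```

In practice you assign these in the UI; the point of the sketch is that each default is a single, unambiguous entry point, so a conversation always has somewhere to go.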
Click Save
Step 5: Deploy
Deploying an application starts with constructing a build: a package of your flows, AI engine, settings, and delivery details in the state they exist when the build is created. You can then deploy a successful build to make it live, or roll back to a previous deployment.
Build
Click deployment status in upper right and choose Build and deploy
Review the Validation check for critical errors or detected UX issues in custom flows. Before each new build initiates, a validation check runs to preview potential errors that could cause the build to fail; detected issues are listed with descriptions and suggested solutions
Provide a Description of notable build edits as a changelog
Click Create build
You can now test your newest build in your workspace using any of the test chats.
Experiencing a failed build? Select the All builds option in the deployment menu and click the failed build to view details on what caused the error.
Deploy build
Channel(s) provide the frontend interface (how users experience your app). Deploying a build pushes your updates through any delivery channels set up on the app, effectively making your app live outside of your NLX workspace.
Click deployment status in upper right and select All builds
Choose Deploy on a successful build
[Optional] Deployment languages: Select the languages to include in the deployment, if multiple are available
Click Create deployment
To auto-deploy every new build, open Deployment settings from the deployment status and turn One-click deploy on. All future builds will then deploy automatically.
Once a build is made, flows can be further edited without affecting a deployed application. Only deploying a new build will impact live applications. Only one build can be deployed at a time and deploying any build deactivates the previous one. You can freely alternate between newer and older builds using Rollback or Deploy.
Implement
Click the Configuration tab of your application and choose any channel assigned in the Delivery section
Choose the Setup instructions tab and follow instructions for installing to your chosen communication channel
For finalizing MCP, follow these instructions for your chosen MCP client
Want to take your app offline? Click the deployed build > Select Deployment tab in modal > Scroll to Danger zone and click Delete deployment. The app stays offline until you redeploy.
Once you deploy a build, you may use your agentic app outside the NLX workspace in two ways:
Delivery channel
Interact with the app through the channel where it’s installed (e.g., web chat via API channel, voice, SMS)
NLX hosted
Open the hosting URL from a deployed build to chat with the app. Hosting must be enabled when you deploy
App settings
Select your app's Settings tab to access the following:
General
Provide a workspace description of your app with any relevant resource tags for better filing and organization.
AI settings
AI description: Input a brief description of the application for Model Context Protocol (MCP) Clients to reference when using your app as an MCP server
NLX Boost: Choose Enable NLX Boost to enhance the performance of your AI engine with generative AI intent classification
Allow NLX Boost to override the NLP: Enabling this relies on NLX's built-in generative AI to detect user intent (based on your routing data) and route to a flow accordingly, regardless of the AI engine's detected match
Advanced
Child-directed: For applications subject to COPPA. Utterance information will not be stored if enabled
Autocorrection: For supported NLPs, spell correction will be applied towards written user responses
Repeat on incomprehension: If a user's response is unclear, the application will repeat its last message to the user
NLP confidence threshold: When an utterance is at or above this value, your application assumes it matched to a flow with certainty
Negative sentiment threshold: The flow assigned to the Frustration default will be triggered when the NLP detects negative sentiment (e.g., profanity or sarcasm) above this threshold
Incomprehension count: Define the number of sequential utterances that the application asks for clarity on before an escalation or Unknown flow is triggered
Conversation timeout (min): Sets the timeout period for all channels on your application. If a timeout was also configured for a channel when managing channels, that channel's setting will take precedence over the application's setting
Default project ID: Default Project ID when using Google's Dialogflow NLP
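Two of these settings interact in a way worth spelling out: the NLP confidence threshold decides whether a turn counts as a certain match, and the incomprehension count caps how many unclear turns are tolerated before the Unknown or escalation flow triggers. The timeout fields also have a precedence rule (channel setting over application setting). The sketch below is our reading of the descriptions above, not NLX internals:

```python
# Rough sketch of the advanced-settings logic described above.
# Function names and the exact decision order are illustrative assumptions.

def classify_turn(confidence: float, threshold: float,
                  unclear_streak: int, incomprehension_count: int) -> str:
    """Decide what happens to one user turn under these settings."""
    if confidence >= threshold:
        return "match_flow"            # at/above threshold: assumed certain match
    if unclear_streak + 1 >= incomprehension_count:
        return "unknown_or_escalate"   # clarification attempts exhausted
    return "ask_for_clarity"           # repeat/clarify and count the miss

def effective_timeout(channel_timeout_min, app_timeout_min):
    """Channel-level timeout takes precedence over the application-level one."""
    return channel_timeout_min if channel_timeout_min is not None else app_timeout_min
```

For example, with a threshold of 0.75 and an incomprehension count of 3, a 0.9-confidence utterance routes immediately, while a third consecutive low-confidence utterance triggers the Unknown or escalation path.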
Automated tests (enterprise)
See Automated tests for complete setup instructions
Languages (check the list of supported languages in your workspace)
Choose which languages your application supports when in production. Though the flows attached to your application may be set up and developed in several languages, your application ultimately decides which to provide when released.
Expand either Main language or any Supported language to view advanced settings:
Use native NLP: Sends a user's utterance to the NLP directly without translation
Region: Global is chosen by default but may be toggled to EU for compliance and performance when using Dialogflow's NLP
Dialogflow project ID: If using Dialogflow's NLP, enter the project ID generated on the application's deployment tab
Amazon Lex Voice: If using Amazon's Lex NLP for voice channels, select the conversational AI voice to be used. You may listen to the selection of Amazon Polly voices here
When an application is created, it also inherits the list of workspace languages that have been pushed to all resources. Under Main language, the default language is English (US), while Supported languages lists any additional workspace languages applied via Translations.
Need to adjust languages and translations at the workspace level? See Translations.
As some voice channels leverage NLP models to convert audio inputs to text, check your NLP provider's supported languages for phone-enabled applications deployed with NLX.