Agentic
From start to finish, set up an app using Generative Journey in agentic mode with NLX
What's an agentic app?
NLX's Generative Journey brings intelligence and flexibility to your conversational AI apps. When Generative Journey is set to Agentic mode, it uses an LLM to decide which tools to call (like custom or managed data requests, modalities, knowledge bases, or even other flows) based on what’s happening in the conversation.
It can gather details naturally from users (like a name, order number, or size) without requiring rigid, step-by-step mapping across several flows.

Imagine a customer says:
"I’d like to exchange my shirt for a larger size."
With agentic Generative Journey:
A flow designed to handle order exchanges and returns is invoked
When the Generative Journey node is hit, the agent powering it understands the intent (exchange request) and gathers needed details like order number, shirt size, and preferred shipping option directly from the user in natural conversation
The agent may then use a custom data request to retrieve the order from your backend system
If available, the agent uses a modality (like a carousel of available sizes/colors) to let the customer pick a replacement
The agent may reference a knowledge base for return policies and automatically shares that info with the customer
Finally, the agent executes the exchange by triggering another flow or integration tool
In practice, this one node can replace dozens of traditional workflow nodes, giving you far more flexibility and making your flows cleaner, smarter, and easier to maintain.
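Conceptually, the agent's behavior is a decision loop: look at the conversation state, pick the next tool, apply it, and repeat until the task is done. The sketch below is a simplified illustration of that loop only, not NLX internals; the tool names and selection logic are hypothetical.

```python
# Simplified illustration of an agentic decision loop (not NLX internals).
# The agent inspects conversation state and picks the next tool until the
# task is complete. Tool names and selection logic here are hypothetical.

TOOLS = {
    "lookup_order": lambda s: {**s, "order": {"id": s["order_number"], "item": "shirt"}},
    "show_sizes": lambda s: {**s, "choices": ["M", "L", "XL"]},   # e.g. a carousel modality
    "execute_exchange": lambda s: {**s, "done": True},            # e.g. trigger another flow
}

def pick_next_tool(state):
    """Stand-in for the LLM's choice of which tool to call next."""
    if "order" not in state:
        return "lookup_order"
    if "choices" not in state:
        return "show_sizes"
    if not state.get("done"):
        return "execute_exchange"
    return None  # nothing left to do

def run_agent(state):
    # Keep calling tools until the model decides the task is finished
    while (tool := pick_next_tool(state)) is not None:
        state = TOOLS[tool](state)
    return state

final = run_agent({"order_number": "A1001"})
print(final["done"])  # True once the exchange step has run
```

In a real agentic Generative Journey, the LLM makes the `pick_next_tool` decision itself, using the descriptions you attach to each tool in Step 1 below.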
Step 1: Set up tools
Tools are the building blocks the agent can call on to complete tasks. When assigned to the agent, tools give the agent both the capability and the context it needs to autonomously decide what to use and in what order to fulfill the user’s request.
Custom/managed integration
Connects your agent to real-time data from external systems or services
Modalities
Presents information in different formats (e.g., a carousel, a list, a map)
Knowledge base
Pulls from stored documentation or reference material
Flows
Triggers other workflows in your workspace
Begin by identifying the tools that your agent must use and complete their setup:
Select Resources from workspace menu > Choose Data requests > Add a new data request
Enter a descriptive name (no spaces or special characters) > Select Save
Provide Response model schema and/or Request model schema (if entering Request model schema, provide a description for each property so the agent understands what information is needed for the payload)
Provide a description on the Settings tab of the data request for the agent to understand the purpose of the tool > Click Save
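As an illustration, an order-lookup data request might use request and response models like the following. The property names and structure below are hypothetical examples in JSON-Schema style; the exact format expected by your workspace is shown in the schema editor itself.

```json
{
  "requestModel": {
    "type": "object",
    "properties": {
      "orderNumber": {
        "type": "string",
        "description": "The customer's order number the agent should collect, e.g. A1001"
      }
    },
    "required": ["orderNumber"]
  },
  "responseModel": {
    "type": "object",
    "properties": {
      "orderStatus": { "type": "string" },
      "items": { "type": "array", "items": { "type": "string" } }
    }
  }
}
```

Note how each request property carries a description: that text is what lets the agent know it must ask the user for an order number before calling the tool.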
Select Resources from workspace menu > Choose Integrations > Add a new integration
Enter a descriptive name > Choose integration type from Managed list
Complete required fields > Click Create integration
Select Resources from workspace menu > Choose Flows > Add a new flow or select existing
Choose the flow's settings icon in the toolbar > Enter an AI description in the Routing tab explaining the purpose of the workflow
Choose MCP tab > Toggle ON MCP. Optionally, you can enter an input schema for the flow's MCP properties that the agent will need to collect from a user before executing the workflow
Click Save
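For example, an exchange flow's MCP input schema might declare the properties the agent must gather before handing off to the workflow. The field names below are illustrative only:

```json
{
  "type": "object",
  "properties": {
    "orderNumber": { "type": "string", "description": "Order to exchange" },
    "newSize": { "type": "string", "description": "Replacement size, e.g. L or XL" }
  },
  "required": ["orderNumber", "newSize"]
}
```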
Select Resources from workspace menu > Choose Modalities > Add a new modality
Select the auto-generate option to provide the output schema that your front-end will render when sent by the agent
Click Save
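To make the modality concrete: a carousel for the shirt-exchange example might have an output schema along these lines. This is a hypothetical sketch of what an auto-generated schema could look like, not the exact format NLX produces:

```json
{
  "type": "object",
  "properties": {
    "cards": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "title": { "type": "string" },
          "imageUrl": { "type": "string" },
          "size": { "type": "string" }
        }
      }
    }
  }
}
```

Your front-end receives a payload matching this schema whenever the agent chooses the modality, and is responsible for rendering it (e.g., as a swipeable carousel).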
Select Resources from workspace menu > Choose Knowledge bases > Add a new knowledge base or select an existing
Choose type > Provide content or external integration
Provide a description in the settings > Click Save
Step 2: Create flow
Identify the overall task your agent is completing and add a flow. Each flow is invoked when your chosen AI model identifies user intent from a user's utterance ("I want to order room service") and matches it to a flow you've created, such as OrderRoomService. This match is made based on the routing data you provide for your AI model.

Set up your flow
Complete Flow setup by providing routing data
Add a Generative Journey (GJ) node to the Canvas and connect it to the Start node
In the GJ node's configuration panel, select Agentic
🧠 Looking for more? See Generative Journey® node and Intro to flows
Step 3: Add app
Select Applications in your workspace menu
Click New application
Choose Blank application from the available options and select Custom
Enter a name for your application and click Create
Your new app will open on the Configuration tab.
Step 4: Configure
The Configuration tab defines your AI engine, delivery channels, and connected workflows.
AI Engine
An AI engine helps disambiguate human speech for both language understanding and intent recognition, and constructs a package of your application whenever a new build is made. The built-in NLX model is provided for seamless setup.
Choose NLX’s built-in model or a managed provider you previously integrated into your workspace (Lex, Dialogflow, etc.)
Delivery
Channels determine where your users interact with your app. Every custom core app includes an API channel by default for easy installation in web or mobile environments.
[Optional] For MCP setup, choose the API channel and enable MCP interface
Select + Add channel and choose one or more channels from the list where you want your application deployed
Channel name: Defaults to the chosen channel type, but you may overwrite with a custom name
Integration: For channels requiring a one-time workspace integration, select from applicable integration(s) created
Custom conversation timeout: Customize the timeout period (in minutes) for a conversation session to end on the selected channel
Escalation: If any escalation channels have been created in your workspace, they will be listed here for selection
Remaining fields are specific to the channel type selected. See list of channels for complete instructions
Functionality
Flows define your application’s behavior. Attach the flow that contains your agentic Generative Journey:
Click + Add flow and select one or more flows from your workspace for your app
Click Default behavior and assign a flow to run during the following situations:
Welcome: Runs when a new conversation session starts. Use it to greet the user, set expectations, and collect any essentials (e.g., name or intent)
Fallback: Runs on timeouts, integration failures, state breaks, or exceeded incomprehension events that are not handled by a knowledge base. Route here to recover gracefully and guide the user forward
Unknown: Runs when the AI cannot match the user's response to any flow or provided choices. Use to invoke a knowledge base and check a question against a repository of information
Escalation (optional): Runs when a node hits the escalation path. Route here to transfer to a human agent
[Optional] You may view all Linked resources associated with your app and select one to navigate to its edit page
Click Save
Step 5: Deploy
Deploying an application allows you to construct a build that contains a package of the flows, AI engine, settings, and delivery details in the state they exist at the time the build is created. You may then deploy a successful build to make it live or roll back to a previous deployment.
Build
Click deployment status in upper right and choose Build and deploy
Review the Validation check for critical errors or detected UX issues in custom flows. A validation check runs before each new build initiates, previewing potential errors that may cause a failed build; detected issues are listed with descriptions and potential solutions
[Optional] Provide a Description of notable build edits as a changelog
[Optional] Choose Enable NLX Boost to enhance the performance of your AI engine with generative AI intent classification
Allow NLX Boost to override the NLP: Enabling this relies on NLX's built-in generative AI to detect user intent (based on your routing data) and route to a flow accordingly, regardless of the AI engine's detected match
Click Create build
You can now test your newest build in your workspace using any of the test chats.
Experiencing a Failed build? Select the All builds option in the deployment menu and click the failed build to view details on what caused an error.
Deploy build
Channel(s) provide the frontend interface (how users experience your app). Deploying a build pushes your updates through any delivery channels set up on the app, effectively making your app live outside of your NLX workspace.
Click deployment status in upper right and select All builds
Choose Deploy on a successful build
[Optional] Deployment languages: Select the languages to include in the deployment, if multiple are available
[Optional] Hosting: Host your application as a Touchpoint app via the conversational.app domain. Ideal for previewing its final look during development and sharing your app externally with collaborators. Enable the Hosting option to configure the URL (e.g., mybusiness.conversational.app)
Click Create deployment
Once a build is made, flows can be further edited without affecting a deployed application. Only deploying a new build will impact live applications. Only one build can be deployed at a time and deploying any build deactivates the previous one. You can freely alternate between newer and older builds using Rollback or Deploy.
Implement
Click the Configuration tab of your application and choose any channel assigned in the Delivery section
For finalizing MCP, follow these instructions for your chosen MCP client
Choose the Setup instructions tab and follow instructions for installing to your chosen communication channel
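As one hypothetical illustration of the MCP side: many MCP clients register servers through a JSON config file, and remote servers are often bridged with the `mcp-remote` npm package. Your client's exact config format and your server URL come from your MCP client's documentation and your API channel's MCP settings; the entry below is illustrative only.

```json
{
  "mcpServers": {
    "my-nlx-app": {
      "command": "npx",
      "args": ["mcp-remote", "https://<your-mcp-server-url>"]
    }
  }
}
```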
Want to take your app offline? Click the deployed build > Select Deployment tab in modal > Scroll to Danger zone and click Delete deployment. The app stays offline until you redeploy.
Once you deploy a build, you may use your agentic app outside the NLX workspace in two ways:
Delivery channel
Interact with the app through the channel where it’s installed (e.g., web chat via API channel, voice, SMS)
NLX hosted
Open the hosting URL from a deployed build to chat with the app. Hosting must be enabled when you deploy
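For the API channel, a frontend typically posts each user utterance as JSON to the channel's endpoint. The sketch below only shows that general pattern; the real endpoint URL, header names, and payload shape come from your channel's Setup instructions in NLX, and every field name here is hypothetical.

```python
# Hypothetical sketch of a frontend call to an app's API channel.
# Endpoint, headers, and payload fields below are illustrative only;
# use the exact values from your channel's Setup instructions.
import json
import urllib.request

def build_message(conversation_id: str, text: str) -> dict:
    """Assemble an illustrative message payload for one user utterance."""
    return {"conversationId": conversation_id, "text": text}

def send(endpoint: str, api_key: str, payload: dict):
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "x-api-key": api_key,  # header name is hypothetical
        },
    )
    return urllib.request.urlopen(req)  # returns the HTTP response

payload = build_message("conv-123", "I'd like to exchange my shirt for a larger size.")
# send("https://<your-channel-endpoint>", "<api-key>", payload)  # endpoint is app-specific
```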