Agentic Generative Journey®

From start to finish, set up a flow using Generative Journey in agentic mode with NLX

What's Generative Journey® in agentic mode?

The Generative Journey® node is a powerful building block in your flow that brings a large language model (LLM) into the process. When set to agentic mode, the node can call on a range of tools you assign to it, such as managed or custom data requests, modalities, knowledge bases, or even other flows. It knows when and how to use these tools to facilitate a task while conversing with a user, and it can naturally collect details from the user, like a name, order number, or t-shirt size, without requiring a rigid, pre-mapped flow.

Imagine a customer uses your conversational app and says: "I’d like to exchange my shirt for a larger size."

With agentic Generative Journey:

  • A flow is invoked that is designed to handle an order return

  • When the Generative Journey node is reached, the agent powering it understands the intent (an exchange request) and gathers the needed details, like order number, shirt size, and preferred shipping option, directly from the user in natural conversation

  • The agent then calls a custom data request to retrieve the order from your backend system

  • If available, the agent uses a modality (like a carousel of available sizes/colors) to let the customer pick a replacement

  • The agent may reference a knowledge base for return policies and automatically share that information with the customer

  • Finally, the agent executes the exchange by triggering another flow or integration

In practice, this one node can replace dozens of traditional workflow nodes, giving you far more flexibility and making your flows cleaner, smarter, and easier to maintain.


Checklist

You'll complete the following to successfully launch a workflow powered by Generative Journey in agentic mode:

  • Step 1: Set up tools

  • Step 2: Create flow

  • Step 3: Deploy

  • Step 4: Install


Step 1: Set up tools

Tools are the building blocks the agent can call on to complete tasks. These include:

  • Data requests (custom or managed integrations): Retrieve or send information through APIs and integrations

  • Modalities: Present information in different formats (e.g., a carousel, a list, a map)

  • Flows: Trigger other workflows in your workspace

  • Knowledge bases: Pull from stored documentation or reference material

When assigned to the node, these tools give the agent both the capability and the context it needs to autonomously decide what to use and in what order to fulfill the user’s request.

Begin by identifying the tools your agent will use to complete the task, then create each one in your workspace:

Custom data request
  • Select Resources from workspace menu > Choose Data requests > Add a new data request

  • Enter a descriptive name (no spaces or special characters) > Select Save

  • Provide a Response model schema and/or Request model schema (if entering a Request model schema, provide a description for each property so the agent understands what information is needed for the payload; see the example below)

  • On the data request's Settings tab, provide a description so the agent understands the purpose of the tool > Click Save
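
Below is a minimal sketch of what the Request and Response model schemas for a hypothetical order-lookup data request might contain, written as JSON Schema-style objects inside a TypeScript snippet. The property names (orderId, email, orderStatus, items) and the backend they describe are assumptions for illustration, not a format NLX requires; the point is that every request property carries a description the agent can use to know what to collect.

```typescript
// Illustrative request/response model schemas for a hypothetical
// "GetOrder" data request. Property names and shapes are assumptions
// for this example, not an NLX-mandated format.
const requestModelSchema = {
  type: "object",
  properties: {
    orderId: {
      type: "string",
      description: "The customer's order number from their confirmation email",
    },
    email: {
      type: "string",
      description: "Email address used to place the order",
    },
  },
  required: ["orderId"],
};

const responseModelSchema = {
  type: "object",
  properties: {
    orderStatus: { type: "string", description: "Current fulfillment status" },
    items: {
      type: "array",
      description: "Line items on the order, including size and color",
      items: { type: "object" },
    },
  },
};

// Quick sanity check when drafting the schemas locally
console.log(JSON.stringify({ requestModelSchema, responseModelSchema }, null, 2));
```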

Managed integration
  • Select Resources from workspace menu > Choose Integrations > Add a new integration

  • Enter a descriptive name > Choose integration type from Managed list

  • Complete required fields > Click Create integration

Flow
  • Select Resources from workspace menu > Choose Flows > Add a new flow or select an existing one

  • Choose the flow's settings icon in the toolbar > Enter an AI description in the Routing tab explaining the purpose of the workflow

  • Choose the MCP tab > Toggle MCP ON. Optionally, enter an input schema for the flow's MCP properties that the agent will need to collect from a user before executing the workflow (see the example below)

  • Click Save
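
If you enable MCP for a flow, the input schema tells the agent which properties to gather in conversation before invoking that flow as a tool. Here is a minimal sketch for a hypothetical ExchangeOrder flow matching the exchange scenario above; the property names are illustrative assumptions, and your workspace's schema editor may expect a slightly different shape.

```typescript
// Hypothetical MCP input schema for an "ExchangeOrder" flow. The agent
// collects these values from the user before triggering the flow.
// Names and shapes are illustrative only.
const exchangeOrderInputSchema = {
  type: "object",
  properties: {
    orderId: { type: "string", description: "Order number to exchange" },
    newSize: { type: "string", description: "Replacement size, e.g. M, L, XL" },
    shippingOption: {
      type: "string",
      description: "Preferred shipping speed for the replacement item",
    },
  },
  required: ["orderId", "newSize"],
};
```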

Modality
  • Select Resources from workspace menu > Choose Modalities > Add a new modality

  • Select the auto-generate option to provide the output schema that your front end will render when sent by the agent (see the example below)

  • Click Save
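
A modality's output schema describes the structured payload the agent sends for your front end to render. The sketch below imagines a size-picker carousel for the exchange example; the interface names and fields (SizeCarouselCard, imageUrl, value) are assumptions about a hypothetical front end, not a fixed NLX format.

```typescript
// Hypothetical payload for a "SizeCarousel" modality. The front end
// decides how to render it; these fields are illustrative assumptions.
interface SizeCarouselCard {
  label: string;    // e.g. "Large - Navy"
  imageUrl: string; // product image to display
  value: string;    // value sent back when the user selects this card
}

interface SizeCarouselPayload {
  title: string;             // heading shown above the carousel
  cards: SizeCarouselCard[]; // one card per available size/color
}

const examplePayload: SizeCarouselPayload = {
  title: "Pick a replacement size",
  cards: [
    { label: "Large - Navy", imageUrl: "https://example.com/shirt-l.png", value: "L-navy" },
    { label: "XL - Navy", imageUrl: "https://example.com/shirt-xl.png", value: "XL-navy" },
  ],
};
```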

Knowledge base
  • Select Resources from workspace menu > Choose Knowledge bases > Add a new knowledge base or select an existing one

  • Choose a type > Provide content or an external integration

  • Provide a description in the settings > Click Save


Step 2: Create flow

Identify the overall task your agent is completing and add a flow. Each flow is invoked when your chosen AI model identifies the user's intent from an utterance ("I want to order room service") and matches it to a flow you've created (e.g., OrderRoomService). This match is based on the routing data you provide for your AI model.

Agentic Generative Journey node
  • Select Resources from workspace menu > Choose Flows > Click New flow

  • Enter a descriptive name (no spaces or special characters) > Select Save

  • Complete the Flow setup by attaching training data and slots

  • Place a Generative Journey node on the Canvas and attach it to the Start node

  • Click the Generative Journey node and choose Agentic on its configuration panel

  • Provide a succinct prompt telling the LLM what flow the task is part of, the end goal of the task, any specific messaging or branding requirements, and anything to avoid (see the example prompt after these steps)

  • Provide the criteria that must be met for the agent to exit through the Generative Journey node's Success path and move the user on through the remainder of the flow

  • Assign one or more tools from your workspace for the agent to use. Select the check icon after choosing each to attach it to the node

    • For any Input schema (payload) your tools require, you may switch each property field between LLM prompt, Explicit values, or Placeholder variables by selecting the menu to the right of the field

  • Choose one of the NLX-provided LLMs to power your agent's intelligence. Note that some models are better suited to specific tasks and communication channels

  • Link from the node's Success edge to the next node(s) in a flow (e.g., Basic node, Redirect node, etc.)

  • Click Save
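
For reference, here is one way the prompt and Success criteria for the exchange scenario might read. The wording is an illustration of the level of detail that tends to work well, not required phrasing, and the brand name is made up; the TypeScript string constants are used purely to format the example text.

```typescript
// Illustrative prompt and success criteria for the hypothetical
// ExchangeOrder flow. Adapt the wording to your own flow and brand.
const agentPrompt = `
You are assisting with the ExchangeOrder flow. Your goal is to help the
customer exchange an item from an existing order for a different size.
Collect the order number, the desired replacement size, and a shipping
preference, then use the available tools to look up the order and submit
the exchange. Keep a friendly, concise tone consistent with the Acme
Apparel brand. Do not quote refund amounts or promise delivery dates.
`;

const successCriteria = `
The exchange has been submitted and the customer has confirmed the
replacement size and shipping option. Exit through the Success path so
the remainder of the flow can deliver the confirmation message.
`;
```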

🧠 Looking for more? See Generative Journey® node and Intro to flows


Step 3: Deploy

Now you'll create the conversational AI application users will interface with. This step involves attaching all flows you want your application to access, defining flows to handle certain behaviors, setting up the channels where your application will be installed, and deploying.

  • Select Applications from workspace menu > Choose New application

  • Click Blank application from the available options > Choose Custom

  • Provide a name for your application > Click Create application

  • On the Configuration tab of the application:

    • Under the Functionality section, attach one or more of the flows created in the previous step to make them available to your application > Click Save

    • Click Default behavior > Assign any attached flow to the application's behaviors > Click Save

A build now constructs a package of your application with a snapshot of the current state of your flows, languages, and application setup. A deployment then pushes a successful build to the communication channels where your app is installed:

  • Click deployment status in upper right > Select Build and deploy

    • Review the Validation check for critical errors or detected UX issues in custom flows. Before each new build starts, a validation check runs to preview potential errors that could cause the build to fail. Detected issues are listed with descriptions and potential solutions

    • You may provide a Description of notable build edits as a changelog

    • Click Create build

  • Click deployment status in upper right > Select All builds > Choose Deploy on successful build > Click Create deployment


Step 4: Install

  • Click the Configuration tab of your application > Click any channel assigned in the Delivery section

    • To finish setting up MCP, choose the API channel on the Configuration tab and open the Setup instructions tab for directions on setting up your MCP client (ensure the MCP interface was toggled ON before building and deploying)

  • Choose the Setup instructions tab and follow instructions for installing to your chosen communication channel

Once you deploy a build, you can use your application outside the NLX workspace in two ways:

  1. Delivery channel: Interact with the app through the channel where it’s installed (e.g., web chat via API channel, voice, SMS, MCP Client)

  2. Touchpoint (hosted chat): Open the hosting URL from the deployment details to chat with the app. Hosting must be enabled when you deploy
