# Flows & variables

## Flows and user intent

When communicating with your conversational AI, a user's *intention* can be as simple as getting an answer to a question or as involved as completing a series of tasks (e.g., canceling a flight, filing an insurance claim). The user's goal, or *intent*, is handled by a *flow*, which defines the appropriate response or action your conversational application should follow.

An example of your conversational AI application at work:

* A user says, “*I want to book a room*”
* Using [routing data](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/setup#routing), an AI model recognizes "I want to book a room" as an expression that suggests the user's intent should match to the flow, `BookRoom`
* Your conversational application then responds by following the conversation logic defined within the `BookRoom` flow

<figure><img src="https://2737319166-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHCxYxhIU0Bqkjj942mGk%2Fuploads%2FP6MT6ox3afyzeMzWp5pd%2FFlows%20address%20intents.png?alt=media&#x26;token=3d294a07-62ea-4fa4-95a1-83ac9bcaabdf" alt=""><figcaption><p>Intent recognized from user utterance and matched to a flow</p></figcaption></figure>

***

## Flow invocation

Flows are triggered in conversation in one of four ways:

* *User invocation*: Through a [*User input* node](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/nodes#user-input) that captures a user's utterance and sends it to your application's AI model for intent recognition. When an intent is recognized and matched to a flow, the user is redirected to it (using an accompanying *Redirect* node set to *Recognized flow*). [*User choice* nodes](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/nodes#user-choice) also support automatic intent recognition and routing when a user's utterance doesn't match the assigned choices and a new intent is detected
* *Application default*: When assigned to one of the available [default behaviors](https://docs.nlx.ai/platform/nlx-platform-guide/ai-applications/setup#default-behavior), a flow is invoked during certain events the [NLX NLU](https://docs.nlx.ai/platform/nlx-platform-guide/introduction-to-nlx/platform) tracks (the start of a conversation session, fallback/failure events, unrecognized intent and therefore unmatched flows, etc.)
* *Redirect node*: When deliberately directed to via a [*Redirect* node](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/nodes#redirect) in another flow
* *MCP extension*: When a flow is exposed to an LLM via [Model Context Protocol](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/setup#model-context-protocol-mcp), the LLM invokes it as a tool when user intent is matched

***

## How flows are defined in NLX

A flow's core functionality comprises the following:

* *Nodes*: Sequenced on a flow's Canvas builder, [nodes](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/overview/nodes) each represent a single step executed in a turn
* *Turn*: A single exchange between the conversational AI and the user. A turn stops when a specific node is reached (a *User choice* or *User input* node, a *Generative Journey* node, or a node with no connection to a subsequent node)
* *Flow name*: A brief descriptive name for the flow that an NLP engine uses when constructing the stack of flows that make up your conversational application. Must be unique and must avoid spaces or special characters (e.g., `BookRoom`)
* *AI description*: Context on the purpose of the flow so an LLM or the NLX NLP understands when to invoke the flow through user intent recognition
* *Training phrases*: A sample of phrases (utterances) users might say so an NLP model (NLX or other provider) understands when to invoke the flow
* *Variables*: Pieces of information that are required to successfully complete the flow and may be extracted from the user or defined from data gathered externally
* *External actions*: Events triggered that send or retrieve data outside NLX. These may be triggered via [*Data request, Generative Journey*, or *Action* nodes](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/overview/nodes)

***

## Variables in flows

Every conversation between your application and a user is unique, with details that change from session to session based on user choices, context, and external or generated data that changes in real time. Since this information can't be hardcoded into your flow's messaging or logic, variable placeholders are used to handle it dynamically.
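Conceptually, each placeholder is resolved against the session's current values when a message is rendered. A minimal sketch of that substitution, assuming `{Name}`-style tokens (this is not NLX's implementation, and the variable names are hypothetical):

```python
import re

# Illustrative sketch of resolving {Variable} placeholders in a message
# template against a session's current values. Unknown tokens are left intact.
def render(template: str, variables: dict[str, str]) -> str:
    """Replace each {Name} token with its value from the session's variables."""
    return re.sub(
        r"\{(\w+)\}",
        lambda m: variables.get(m.group(1), m.group(0)),
        template,
    )
```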

To reference a variable you've created, bring up the placeholder menu in supported text fields of your flow by typing an open curly brace `{`:

<figure><img src="https://2737319166-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHCxYxhIU0Bqkjj942mGk%2Fuploads%2FQzoZ6DV1EvrZrDzsqOS0%2FFlows_Variables%20color%20codes.png?alt=media&#x26;token=1338f49a-d025-4027-b367-5d68f2ccfeb5" alt=""><figcaption><p>Using a variable in a flow via the placeholder menu</p></figcaption></figure>

Each dynamic placeholder (*variable*) is distinguishable by color:

* Purple: Local flow variables or *Built-in integrations* and their variables
* Green: Custom or built-in [*Slots*](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/advanced/slots-custom)
* Orange: [*Data requests*](https://docs.nlx.ai/platform/nlx-platform-guide/integrations/types/data-requests) and their data variables
* Pink: [*Context variables*](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/advanced/context-variables)*,* [*Secrets*](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/advanced/secrets)*,* or [MCP inputs](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/setup#mcp)
* Light teal: [*System variables*](#list-of-system-variables)

<table data-full-width="false"><thead><tr><th width="163">Flow variable</th><th width="268">Description</th><th width="302">Memory</th></tr></thead><tbody><tr><td>Local variable</td><td>Includes named outputs from <a href="../nodes#define"><em>Define</em></a> nodes, <a href="../nodes#loop"><em>Loop</em></a> nodes,  <a href="../nodes#generative-journey"><em>Generative</em></a> nodes, and <a href="../nodes#knowledge-base"><em>Knowledge Base</em></a> nodes.</td><td>Retained locally in a flow where they are created/generated while a user is active in the flow. If a user may be looped through the same flow or may revisit a flow in the same conversation later, these variables may be cleared via a <a href="../nodes#state-modifications">state modification </a>to avoid auto-traversal. To preserve the variable in one flow for use in other flows, use a <a href="../nodes#state-modifications">state modification</a> to set the variable as a <a href="../advanced/context-variables"><em>Context variable</em></a></td></tr><tr><td>Slot</td><td>Captured through user utterance. Slots allow you to restrict the accepted "value" of the user's utterance using <a href="../setup#custom-vs-built-in-slots">custom or built-in slot</a> types</td><td>Retained locally to the flow where attached. If a user revisits a flow during a session, slots may be cleared via a <a href="../nodes#state-modifications">state modification </a>to avoid auto-traversal. To preserve a slot selection in one flow for reference in other flows, use a <a href="../nodes#state-modifications">state modification</a> to set the slot as a <a href="../advanced/context-variables"><em>Context variable</em></a></td></tr><tr><td>Data request variable</td><td>External data retrieved first through a <a href="../nodes#data-request"><em>Data request</em></a> node</td><td>Retained throughout a conversation session after variables are fetched. 
If the data may become outdated and the node is expected to be revisited during the session, enable the <em>Always retrigger</em> toggle on the <em>Data request</em> node to fetch updated values</td></tr><tr><td><a href="../advanced/context-variables">Context variable</a></td><td>Usable across any flow, these are empty on their own and are set or defined by other variables</td><td>Variables that begin empty and can be set either by externally-fetched data (via <em>Data request</em>s or <em>Lifecycle hooks</em> set to <a href="../../../ai-applications/setup#custom-app-settings">'Start' lifecycle</a>) or by internal values you've defined or captured in a flow. If not using lifecycles, use a <em>state modification</em> to set or alter these variables first. Tracked and retained throughout a conversation session</td></tr><tr><td>Model Context Protocol (MCP) variables</td><td>Externally set by an LLM and passed along to NLX via MCP</td><td>Passed from the LLM interfacing with a user when an <a href="../setup#model-context-protocol-mcp">MCP-enabled flow</a> is invoked. Retained throughout a conversation session</td></tr><tr><td><a href="../advanced/secrets">Secrets</a></td><td>Created at the workspace and usable across any flow in <em>Data request</em> node payloads</td><td>Set when integrated in the workspace and retained throughout a conversation session</td></tr><tr><td>System variable</td><td>Tracked automatically by NLX</td><td><a href="#list-of-system-variables">Set and tracked</a> by the system and retained throughout the conversation session</td></tr></tbody></table>

#### List of *System variables*

NLX system variables are helpful when referenced in conditional logic in *Split* nodes, added as dynamic placeholders in messages, or used to provide context to *Generative* nodes for more tailored responses.

To use a system variable while [configuring a node](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/nodes#configuring-nodes), type an open curly brace `{` and start typing `system` to choose from the available options.

<table data-full-width="false"><thead><tr><th width="269">System variable</th><th width="212">Description</th><th>Example</th></tr></thead><tbody><tr><td><code>System.capturedIntent</code></td><td>Set by the <em>User input</em> node</td><td>Use in a <em>Split</em> node using conditional rules from the match edge of a <em>User input</em> node</td></tr><tr><td><code>System.capabilities</code></td><td>Uses the flow names and their AI descriptions of the application being executed</td><td>Use in a <em>Generative text</em> prompt to determine user intent from a given utterance captured prior through the NLX.Text slot. Then use in a <em>Split</em> node to check for the name of the flow and <em>Redirect</em> accordingly.</td></tr><tr><td><code>System.conversationId</code></td><td>The unique id for the current conversation</td><td>Pass to <em>Data requests</em> to identify a unique interaction</td></tr><tr><td><code>System.channelType</code></td><td>Provides the channel the user is currently using </td><td>Use in a <em>Split</em> node to route the user based on the channel used in conversation</td></tr><tr><td><code>System.currentTimestamp</code></td><td>Indicates the time (in UTC) recorded by the system when triggered in conversation (returns the number of milliseconds elapsed since the <a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date#the_epoch_timestamps_and_invalid_date">epoch</a>, defined as midnight at the beginning of January 1, 1970, UTC)</td><td>Use in a <em>Split</em> node to route the user if their query is time sensitive or for changes to verbiage (e.g., “Good morning”) or processes during or outside of hours of operation</td></tr><tr><td><code>System.environment</code></td><td>Provides the environment the application is deployed to (Dev/Prod)</td><td>Use in a <em>Split</em> node to route users based on the production environment</td></tr><tr><td><code>System.isVoice</code></td><td>Indicates whether the user is 
communicating through text or voice interface. Applicable to all channels, including API with Touchpoint Voice</td><td>Use in a <em>Split</em> node with a True/False check to route the user based on whether the channel is voice or text</td></tr><tr><td><code>System.language</code></td><td>The ISO 639-1 language code (e.g., <code>en</code>)</td><td>Parameterize URLs sent in messaging to support localized versions of a webpage by language/region</td></tr><tr><td><code>System.languageCode</code></td><td>The IETF language tag (language and region) that the conversation session is in (e.g., <code>en-US</code>)</td><td>Pass to <em>Data requests</em> as a parameter to support translation in the <em>Data request</em> response</td></tr><tr><td><code>System.lastIntent</code></td><td>The last flow visited by the user in the conversation session</td><td>Use in a <em>Split</em> node at the beginning of a flow to route the path of a user based on the context of where they were in a conversation</td></tr><tr><td><code>System.locale</code></td><td>The ISO 3166-1 alpha-2 region code (e.g., <code>US</code>)</td><td>Parameterize URLs sent in messaging to support localized versions of a webpage by language/region</td></tr><tr><td><code>System.voicePlusTimeoutType</code></td><td>Timeout occurring from a Voice+ experience. 
Reference in a <em>Split</em> node condition: <code>blank</code> = no timeout occurred; <em><code>sessionStart</code></em> = user didn't tap SMS link; <code>inactivity</code> = Voice+ started but inactivity between steps occurred</td><td>Use in a <em>Split</em> node to route users that may need to be escalated based on timeout condition</td></tr><tr><td><code>System.nlpConfidenceScore</code></td><td>The value (out of 100) that the NLP confidently matched a user's input to a flow</td><td>Use to ask clarifying questions if below a threshold to ensure the correct flow has been matched</td></tr><tr><td><code>System.resumeIntentMessage</code></td><td>The message relayed to a user when a flow is resumed from a previous interruption</td><td>Enabled on the <em>Start</em> node of a flow to pick up where a user left off</td></tr><tr><td><code>System.sentiment</code></td><td>The sentiment of the user (Positive, Neutral, Negative) for the current exchange.  Note:  sentiment analysis <a href="../../../integrations/types/nlp-engines/amazon-lex#enabling-sentiment">must be enabled</a> for the application</td><td>Use in a <em>Split</em> node to provide empathetic responses</td></tr><tr><td><code>System.transcript</code></td><td>Provides the complete exchange between the human and the conversational AI application.  Any sensitive information will be redacted</td><td>Use as a payload field in a <em>Data request</em> that creates a ticket in a help desk system</td></tr><tr><td><code>System.userId</code></td><td>The user ID identifying the human in the conversation.  
Value varies by channel</td><td>Pass to <em>Data requests</em> to look up information on the user's phone number over a voice channel</td></tr><tr><td><code>System.utterance</code></td><td>The most recent message received from the human</td><td>Use in a <em>Split</em> node and the <em>Contains</em> operator to determine if the user said a specific keyword</td></tr><tr><td><code>System.timezone</code></td><td>The user's detected time zone in the conversation</td><td>Determine the real time a user means for <a href="../setup#custom-vs-built-in-slots">NLX.Date or NLX.Time</a> slots when passing to a <em>Data request</em></td></tr></tbody></table>
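As one example of working with these values downstream, `System.currentTimestamp` returns epoch milliseconds in UTC. A sketch of the kind of hours-of-operation check a *Data request* backend might perform; the 9:00–17:00 UTC business window is an assumption for illustration:

```python
from datetime import datetime, timezone

# System.currentTimestamp is epoch milliseconds (UTC). This sketches an
# hours-of-operation check a downstream service might run; the 9-17 UTC
# window is an illustrative assumption, not an NLX default.
def within_business_hours(epoch_ms: int, open_hour: int = 9, close_hour: int = 17) -> bool:
    t = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
    return open_hour <= t.hour < close_hour
```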

***

## Capturing variables from users

<figure><img src="https://2737319166-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHCxYxhIU0Bqkjj942mGk%2Fuploads%2FoQueS4f6DhNlsTbAzQnn%2FFlows%20and%20slot%20variables.png?alt=media&#x26;token=0792827f-aadd-4530-b186-2362ccc603b4" alt=""><figcaption><p>User utterance containing required variables for flow completion</p></figcaption></figure>

[*Slots*](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/advanced/slots-custom) are used to extract specific information from the user, whether the values are custom and static (e.g., yes/no, small/medium/large) or abstract and open-ended in range (e.g., times, cities, names). They may be used in [training phrases](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/setup#training-phrases) to better route to flows or requested by the conversational AI when assigned to [*User choice*](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/nodes#user-choice) or [*Generative Journey*](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/nodes#generative-journey) nodes.

In a `BookRoom` flow, for example, variables that need to be resolved could include check-in and check-out dates, room type, number of guests, etc.

In the sample utterance, "I want to book a King room for Saturday," two variables are given by the user: room type and check-in date. These variables can be captured using a custom slot type named `{RoomType}` and a built-in date slot named `{CheckInDate}`.
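Slot filling is handled by the NLX platform itself; purely to illustrate the idea, a toy extractor over that sample utterance might look like this (the value lists and slot names are hypothetical):

```python
import re

# Toy illustration of slot capture from an utterance like
# "I want to book a King room for Saturday". {RoomType} stands in for a
# custom slot with a fixed value list; {CheckInDate} for a built-in date slot.
ROOM_TYPES = {"king", "queen", "double", "suite"}
DAYS = {"monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"}

def fill_slots(utterance: str) -> dict[str, str]:
    slots = {}
    for word in re.findall(r"[a-z]+", utterance.lower()):
        if word in ROOM_TYPES:
            slots["RoomType"] = word.capitalize()
        elif word in DAYS:
            slots["CheckInDate"] = word.capitalize()
    return slots
```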

#### Externally defined

For variables that must first be fetched externally in real time to present to users as options or to relay dynamic data, use [*Data request*](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/nodes#data-request) nodes, or let the *Generative Journey* node in agentic mode use a custom or managed data request as a tool.

For example, with a `BookRoom` flow, available room types can be retrieved from a *Data request* node and then presented in a *User choice* node, since these variables are likely to change regularly.
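The fetch-then-present pattern can be sketched as follows; `fetch_room_types` is a stand-in for the *Data request*, and in NLX this is node configuration rather than code:

```python
# Sketch of the pattern: fetch options externally, then present them as
# choices. fetch_room_types is a hypothetical stand-in for a Data request
# node; the hardcoded list substitutes for a live hotel-inventory API.
def fetch_room_types() -> list[str]:
    return ["King", "Queen", "Suite"]

def build_choice_prompt(options: list[str]) -> str:
    """Phrase the fetched options the way a User choice node might."""
    return "Which room type would you like? Options: " + ", ".join(options)
```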

Additional methods for passing variables from another system to NLX include:

* [*Lifecycle hook*](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/advanced/lifecycle-hooks): Passes variables captured from a system or channel provider to NLX (at the time your conversational application is called) through the use of [context](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/advanced/context-variables). The *Lifecycle hook* you create must be assigned to the application's lifecycle
* [MCP input](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/setup#model-context-protocol-mcp): Passes variables captured from an LLM to NLX when the LLM invokes the flow. MCP setup must be completed for your application
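For illustration only, context passed at the start of a session might look like the following JSON payload; the field names and shape are hypothetical, not NLX's actual schema:

```python
import json

# Hypothetical context variables a Lifecycle hook might set at session start.
# Names and structure are illustrative only, not NLX's payload schema.
context_payload = {
    "context": {
        "CustomerTier": "gold",        # e.g., pulled from a CRM by the hook
        "PreferredLanguage": "en-US",  # e.g., supplied by the channel provider
    }
}

payload_json = json.dumps(context_payload)
```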


