Testing
Test and experience a single workflow or full conversation in NLX
Testing in NLX lets you simulate real user conversations before deploying your application. You can run full end-to-end tests from an app’s initialization flow or isolate specific flows for focused debugging.
Through your workspace's built-in test feature, you can interact with your AI application in text or voice mode, review conversation turns, inspect variable state, and track events in real time. For deeper troubleshooting, the test's Debugger log highlights each operation that occurred during a turn and pinpoints where issues arise.
Enterprise users can also streamline testing with Automated tests, which validate user intent recognition and flow performance at scale.
Application test
Run a full end-to-end simulation of your app to confirm that routing, responses, and integrations work as expected
Flow test
Quickly test and refine a single flow in isolation, adjusting logic, variables, and translations in real time
Automated tests (Enterprise)
Validate intent recognition and flow accuracy at scale through automated test cases for faster QA
Application test
After completing an application build, choose an app from your workspace and select the test (play) icon in the upper right to launch a test session. The conversation begins from the flow assigned to the app's Welcome default behavior.
Type your responses directly, or for voice-enabled applications, activate Voice mode using the microphone icon at the bottom of the tester:
Flow test
Test mode on a flow allows you to isolate a specific conversation flow for evaluation. With test mode, you can easily debug or test different variables and translations belonging to your flow while conversing with your application.
To launch the test widget, select your flow and enable a test conversation from the toolbar:
Select Resources from your workspace menu > Choose Flows > Select an existing flow
Click the test icon (play button) in the Canvas toolbar to launch a test conversation session
Type your responses directly, or for voice-enabled applications, activate Voice mode using the microphone icon at the bottom of the tester
You can continue to modify and build with your flow while in Test mode. Simply select the Canvas to adjust the flow and save changes. Re-select the Test icon and refresh the chat to test changes in real time.
Troubleshoot
When testing conversations, you may notice errors or unanticipated behavior from the application. These issues usually stem from problems with state and variables in a node of a turn:
Missing required variables used in application messaging, payloads, or state modifications
Uncleared variables in retry or revisit scenarios
If an issue is detected by NLX's state tracker, the entire turn fails to execute or triggers the application's fallback behavior. Within a test chat in your NLX workspace, you can converse with your AI application to troubleshoot and surface potential errors.
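As an illustration of the two failure conditions above (this is a simplified model, not NLX's actual implementation), a turn can be thought of as a check over its required variables: a missing required variable, or a stale variable that was never cleared before a retry, causes the turn to fall back. The names below (`run_turn`, `required_vars`) are hypothetical.

```python
# Illustrative model of turn-level state checking -- not NLX's actual
# implementation. Function and variable names are hypothetical.

def run_turn(state: dict, required_vars: list[str]) -> str:
    """Return 'ok' if every required variable has a value, else 'fallback'."""
    missing = [v for v in required_vars if state.get(v) in (None, "")]
    if missing:
        # A missing (or uncleared-then-emptied) required variable
        # prevents the turn from executing as designed.
        return f"fallback (missing: {', '.join(missing)})"
    return "ok"

# A turn whose message or payload references {orderId} needs it set:
print(run_turn({"orderId": "A-1001"}, ["orderId"]))  # ok
print(run_turn({"orderId": None}, ["orderId"]))      # fallback (missing: orderId)
```

The second call mirrors a retry scenario: a variable that was cleared (or never re-collected) on revisit trips the same check as one that was never set.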
Note the four turns that have taken place in this test chat so far (application messages in grey):

To troubleshoot unexpected errors in a test chat, click the most recent application message to open the Debugger. This shows a top-down timeline of events for that turn, allowing you to expand each logged event and identify the cause of the issue. The Debugger also highlights the nodes involved in the turn directly on the Canvas of the current flow:

If you're having trouble pinpointing which variable's state is causing an issue, start debugging from the first node in the affected turn: detach it from all subsequent nodes in the turn and connect it to a simple Basic node with a success message. If the turn reaches the success message, reconnect the next node, save, and test again. When the turn fails, the last node you reconnected is where the issue lies.
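The reconnect-and-test loop above can be sketched as follows. Here `test_turn` is a hypothetical stand-in for running a test chat against the currently connected nodes; it is not an NLX API.

```python
# Illustrative sketch of the reconnect-and-test debugging loop.
# `test_turn` stands in for manually running a test chat -- it is
# a hypothetical callback, not part of NLX.

def find_failing_node(nodes, test_turn):
    """Reconnect nodes one at a time; return the first node whose
    reconnection makes the turn fail, or None if every step passes."""
    connected = []
    for node in nodes:
        connected.append(node)  # reconnect the next node and save
        if not test_turn(connected):
            return node  # the last connected node is where the issue lies
    return None

# Example: the turn fails whenever the "Lookup order" node is connected.
culprit = find_failing_node(
    ["Collect email", "Lookup order", "Confirm"],
    lambda active: "Lookup order" not in active,
)
print(culprit)  # Lookup order
```

This is a linear walk rather than a binary search, matching the doc's advice to reconnect one node at a time so each failure maps to exactly one node.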
🧠 Want to learn more? Read all about conversation turns & state
Configure & debug
Within a test session's panel, select the gear icon to set context variables or other settings for your test, or select the bug icon to review all logged events from your test transcript for troubleshooting.
Selecting any AI message (in grey) within your test chat provides a sequential list (top-down) of all tracked operations and events that occurred up to the turn selected.
Customize data: If you have used context variables in your conversation, open the test chat's Settings (gear icon) and enter values into the corresponding context variable fields. Click Save and refresh to apply them to your test
Event details: To debug events in your flow, select a conversational AI message (in grey) in the transcript window. Click the expand caret on any item to view the NLU's details
Environment: Change the endpoint for the Data request environment between Production and Development
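The context variable values you enter in test Settings stand in for data that would normally arrive with a real conversation request from your deployment channel. As a purely hypothetical sketch (the field names below, including `"context"` and `"conversationId"`, are illustrative and not a documented NLX API schema), the equivalent shape might look like:

```python
# Hypothetical sketch of supplying context-variable values with a
# conversation request. All field names are illustrative assumptions,
# not a documented NLX API schema.
import json

def build_start_request(conversation_id: str, context: dict) -> str:
    """Assemble an example request carrying context-variable values."""
    payload = {
        "conversationId": conversation_id,
        # Mirrors the fields you fill in under the test chat's Settings:
        "context": context,
    }
    return json.dumps(payload, indent=2)

print(build_start_request("test-123", {"userTier": "gold", "locale": "en-US"}))
```

Entering the same values in the Settings panel lets you exercise flow logic that branches on these variables without a live channel integration.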