Model Context Protocol
From start to finish, set up an NLX application exposed through Model Context Protocol (MCP)
What's an application integrated through MCP?
Model Context Protocol (MCP) is a standardized protocol that defines how Large Language Models (LLMs) integrate with and act on external systems. Before MCP, builders had to expose a multitude of tools via REST APIs to give an LLM the ability to perform even the simplest of tasks, such as:
Check the bank balance of a user
Provide an activity recommendation local to a user
NLX's MCP support allows you to turn any NLX application into an MCP Server, giving an MCP Client the ability to perform the tasks outlined in the application's flow(s) and to pass context to NLX whenever the MCP Client interfaces with a user.
Your NLX flows effectively become MCP tools that you provide to a supporting MCP Client to boost its capabilities. This process is done entirely without writing code or exposing your systems and services to new vulnerabilities.
Checklist
You'll complete the following to successfully launch your MCP implementation:
Step 1: Create flow
Est. time to complete: ~10 minutes
Begin by identifying the tasks your conversational AI application will automate and organize them into individual topics handled by flows. Determine the sequence of steps and messaging that the conversational application follows to assist a user with the task. The conversation workflow is assembled in a flow's Canvas with a pattern of nodes similar to a flow diagram.
Each flow is invoked when your chosen AI model identifies customer intent from a user's query ("Where can I get Portuguese food in Atlanta?") and matches it to a flow you've created (MCPFoodRecommendation).
Variables required by the flow, which the LLM extracts in conversation and passes along to NLX, are set up first in the flow's Settings:

Select Resources from workspace menu > Choose Flows > Click New flow
Enter a descriptive name (no spaces or special characters) > Select Save
Choose Settings (gear icon) in flow toolbar
From the Routing tab:
In the AI description field, enter a concise description explaining the purpose of the flow that LLM models reference to invoke the flow (e.g., A tool for providing food recommendations based on cuisine and location)
From the MCP tab:
Enable MCP toggle
Provide a unique and concise input name (no spaces or special characters)
Enter any optional input schema containing variable(s) that will be passed along by the LLM interfacing with a user
Enter a brief description for each property so the LLM understands the purpose and context of each (e.g., a cuisine property in the food recommendation flow might have the description: The cuisine for the recommendation)

Click Save
On any node of the flow, enter an open curly brace { and reference the MCP input variable you want to use as an output in messaging, payload fields, Split node conditions, etc.:
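As an illustration, an input schema for the food recommendation flow above might look like the following. This sketch assumes the input schema follows standard JSON Schema conventions; the property names cuisine and location are hypothetical and should match the variables your flow actually uses:

```json
{
  "type": "object",
  "properties": {
    "cuisine": {
      "type": "string",
      "description": "The cuisine for the recommendation"
    },
    "location": {
      "type": "string",
      "description": "The city or area to search for recommendations"
    }
  },
  "required": ["cuisine", "location"]
}
```

A node message could then reference these values, e.g., Here are some {cuisine} options near {location}.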

Step 2: Deploy
Est. time to complete: ~2 minutes
Now you'll set up and deploy an application that will serve as an MCP server.
Select Applications from workspace menu > Choose New application
Click Blank application from the available options > Choose Custom
Provide a name for your application > Click Create application
On Configuration tab of application:
Under Delivery section, select the API channel
Enable MCP interface toggle > Click Update channel
Under Functionality section, attach one or more flows created in previous step to make available to your application
Click Default behavior > Assign any attached flow to the application's behaviors
Select Settings tab of application > Under AI settings, enter a concise description of the application's purpose into the AI description field
Click Save
A build now constructs a package of your application with a snapshot of the current state of your flows, languages, and application setup.
Select deployment status in upper right > Select Build and deploy
Review the Validation check for critical errors or detected UX issues in custom flows. Before each new build initiates, a validation check runs to preview potential errors that may cause failed builds; detected issues are listed with descriptions and potential solutions
You may provide a Description of notable build edits as a changelog
Click Create build
Click deployment status in upper right > Select All builds
Choose Deploy on successful build > Click Create deployment
Step 3: Install
Est. time to complete: ~3 minutes
Click the Configuration tab of your application > Click the API channel assigned in the Delivery section
Choose the Setup instructions tab > Access Setup instructions for MCP Client
Follow these instructions for setting up Claude Desktop
Open the MCP configuration in Claude Desktop > Select Settings menu > Choose Developer tab > Click Edit Config
Claude Desktop will open the file explorer to the claude_desktop_config.json file
Paste the MCP server JSON copied from the setup instructions into the claude_desktop_config.json file
Relaunch Claude Desktop and open the conversation settings. Check that your new MCP server is enabled
Claude Desktop can now use your NLX MCP server when relevant.
Example MCP server JSON configuration:
{
  "mcpServers": {
    "YourApplicationName": {
      "command": "npx",
      "args": ["-y", "@nlxai/mcp-nodejs-server"],
      "env": {
        "NLX_API_KEY": "your API key",
        "NLX_APP_URL": "your MCP URL"
      }
    }
  }
}
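If you prefer to script the edit rather than paste by hand, the entry above can be merged into an existing claude_desktop_config.json programmatically. This is a minimal sketch, not an official NLX tool; the server name, API key, and app URL values are placeholders you would replace with your own:

```python
import json
from pathlib import Path

def add_mcp_server(config_path, name, api_key, app_url):
    """Merge an NLX MCP server entry into a Claude Desktop config file.

    The entry shape (command, args, env) mirrors the example JSON above;
    any existing mcpServers entries in the file are preserved.
    """
    path = Path(config_path)
    # Start from the existing config if present, otherwise an empty one
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})[name] = {
        "command": "npx",
        "args": ["-y", "@nlxai/mcp-nodejs-server"],
        "env": {"NLX_API_KEY": api_key, "NLX_APP_URL": app_url},
    }
    path.write_text(json.dumps(config, indent=2))

# Hypothetical usage; substitute your real key and MCP URL:
# add_mcp_server("claude_desktop_config.json", "YourApplicationName",
#                "your API key", "your MCP URL")
```

Relaunch Claude Desktop after writing the file so the new server is picked up.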
Want to take your app offline? Click the deployed build > Select Deployment tab in modal > Scroll to Danger zone and click Delete deployment. The app stays offline until you redeploy.