
Model Context Protocol

From start to finish, set up an NLX application exposed through Model Context Protocol (MCP)

Last updated 22 days ago

What's an application integrated through MCP?

Model Context Protocol (MCP) is a standardized protocol for how Large Language Models (LLMs) integrate and engage with external systems. Before MCP, builders had to expose a multitude of tools via REST APIs to give an LLM the ability to perform even the simplest tasks, such as:

  • Check the weather local to a user

  • Provide an activity recommendation local to a user

NLX’s MCP support allows you to turn any NLX application into an MCP Server, giving an LLM (the MCP Client) the ability to follow and perform tasks outlined in the application's flows, as well as easily pass context to NLX when the LLM interfaces with a user.

Your NLX flows effectively become MCP tools that you provide to a supporting LLM to boost its capabilities. This process is done entirely without writing code or exposing your systems and services to new vulnerabilities.


Checklist

You'll complete the following to successfully launch your MCP implementation:

  • Step 1: Construct a flow

  • Step 2: Deploy application

  • Step 3: Set up MCP Client


Step 1: Construct a flow

Est. time to complete: ~10 minutes

Each flow is invoked when your chosen AI model identifies customer intent from a user's query ("What's the weather like?") and matches it to a flow you've created (WeatherUpdate).

Variables required for the flow to work, which the LLM extracts in conversation and passes along to NLX, are set up first in the flow's Settings:

  • Select Flows in workspace menu > Choose New flow > Enter a descriptive name (no spaces or special characters) > Select Save

  • Choose Settings (gear icon) in flow toolbar

  • From the AI settings tab:

    • In the AI description field, enter a concise description explaining the purpose of the flow; LLMs reference this description when deciding whether to invoke the flow

    • Enable MCP toggle

      • Provide a unique and concise input name (no spaces or special characters)

      • Expand each property variable defined in your MCP input schema > Expand its settings

        • Enter a brief description in each property's Description field so the LLM understands its purpose and context (e.g., a location property in a weather update flow might have the description: The location of the weather request)

  • Click Save
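As an illustration, an MCP input schema for the WeatherUpdate example might look like the following JSON Schema sketch (the location property name and its description are assumptions for this example, not required values):

```json
{
  "type": "object",
  "properties": {
    "location": {
      "type": "string",
      "description": "The location of the weather request"
    }
  },
  "required": ["location"]
}
```

The per-property descriptions are what the LLM reads when deciding which value from the conversation to extract into each variable.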

On any node of the flow, enter an open curly brace { and reference the MCP input variable you want to use as an output in messaging, payload fields, Split node conditions, and so on.
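To make that concrete, here is an illustrative Data request payload that forwards the MCP input to a downstream service (the city field name is an assumption for this example):

```json
{
  "city": "{location}"
}
```

A Basic node message can reference the same variable in its text, e.g. Here is the current weather for {location}.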


Step 2: Deploy application

Est. time to complete: ~10 minutes

Now you'll set up and deploy an application to become your MCP server.

  • Select Applications from workspace menu > Choose New application

  • Enter a descriptive name > Click Save

  • Click Flows tab of application > Select Attach flows > Attach one or more flows you created to make them available to your application > Click Attach selected

  • Select Channels tab of application > Expand API option > Click + Create channel

    • Enable MCP interface toggle

    • Click Create channel

  • Select Settings tab of application > Under AI settings, enter a concise description of the application's purpose into the AI description field

  • Click Save

A build constructs the array of flows that make up your conversational AI application and updates any changes made to your flows, while deploying makes a successful build live:

  • Click Deployment tab of application > Select Create or Review & build

  • Wait for validation to complete > Select Create build*

  • When satisfied with a successful build, click Deploy


Step 3: Set up MCP Client

Est. time to complete: ~5 minutes

To make your application available to a supported MCP Client (Large Language Model service), complete the following:

  • From the Deployment tab of your NLX application, select Details next to the Deployed status

  • Expand the API section under Setup instructions in the pop-up > Copy the MCP URL and the API key

    • Use the following for claude_desktop_config.json, replacing the placeholder values with the MCP URL and API key you copied earlier:

{
  "mcpServers": {
    "nlx": {
      "command": "npx",
      "args": [
        "-y",
        "@nlxai/mcp-nodejs-server"
      ],
      "env": {
        "NLX_API_KEY": "your API key",
        "NLX_APP_URL": "your MCP URL"
      }
    }
  }
}
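Under the hood, the MCP Client discovers your flows as tools and invokes them over JSON-RPC. For illustration, a tools/call request for the WeatherUpdate flow might look like the following sketch (the tool name and argument depend on the input name and schema you configured):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "WeatherUpdate",
    "arguments": {
      "location": "Austin, TX"
    }
  }
}
```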

Begin by identifying the tasks your conversational AI application will automate and organize them into individual topics handled by flow(s). Determine the sequence of steps and messaging that the conversational application follows to assist a user with the task. The conversation workflow is assembled in a flow's Canvas with a pattern of nodes similar to a flow diagram.

Enter the input schema containing the necessary variable(s) that will be set and passed along by the LLM interfacing with a user.

*After a build status appears as Built, you may use the Test feature to test the conversation with your application using the latest build.

Looking for more? See Manage channels.

Complete the MCP setup for your preferred MCP-supporting client. For example, here are the instructions for Anthropic.

[Image: MCP variable being passed to a Data request payload and referenced in a Basic node's message]
[Image: MCP URL and API key in deployment details]