MCP

The NLX MCP Interface exposes your AI app's capabilities as standard Model Context Protocol tools. This allows external agents, LLMs, and MCP-compliant clients to discover and execute actions defined within your NLX deployment.

When to use this API

Agent integration

Allow autonomous agents (like AutoGen or LangChain agents) to use your NLX app as a toolkit

Tool discovery

Dynamically query what your AI app can do (e.g., "CheckInventory", "ResetPassword") and retrieve the schema for those actions

Configuration

Base URL: https://apps.nlx.ai/c/mcp/{deploymentKey}/{channelKey}-{languageCode}

Header: nlx-api-key: YOUR_API_KEY

As with other endpoints, append the language code (e.g., -en-US) to the channel key.

1. List available tools (GET)

Retrieve a list of all tools exposed by this deployment, including their names, descriptions, and JSON schemas for arguments.

Endpoint: GET /tools

curl -X GET "https://apps.nlx.ai/c/mcp/xxxx/xxxx-en-US/tools" \
  -H "nlx-api-key: YOUR_API_KEY"

Response example:

{
  "tools": [
    {
      "name": "check_order_status",
      "description": "Retrieves the status of a customer order",
      "inputSchema": {
        "type": "object",
        "properties": {
          "orderId": { "type": "string" }
        },
        "required": ["orderId"]
      }
    }
  ]
}
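A client can use this listing to discover tools and validate arguments before calling them. A minimal sketch in Python, assuming only the response shape shown above (the `required_args` helper is illustrative, not part of any NLX SDK):

```python
import json

# Example payload matching the /tools response shape shown above.
listing = json.loads("""
{
  "tools": [
    {
      "name": "check_order_status",
      "description": "Retrieves the status of a customer order",
      "inputSchema": {
        "type": "object",
        "properties": {"orderId": {"type": "string"}},
        "required": ["orderId"]
      }
    }
  ]
}
""")

def required_args(listing: dict, tool_name: str) -> list:
    """Return the required argument names for a tool, or raise if unknown."""
    for tool in listing["tools"]:
        if tool["name"] == tool_name:
            return tool["inputSchema"].get("required", [])
    raise KeyError(f"no such tool: {tool_name}")

print(required_args(listing, "check_order_status"))  # ['orderId']
```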

2. Execute a tool (POST)

Invoke a specific tool by name, passing the arguments defined in the tool's schema.

Endpoint: POST /tools/{toolName}

Standard execution (JSON)

Useful for single, synchronous results.
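A sketch of assembling such a request, assuming the POST body is a JSON object whose keys match the tool's `inputSchema` (an assumption — verify the exact body shape against your deployment; `build_tool_request` is a hypothetical helper, not part of any NLX SDK):

```python
import json

def build_tool_request(base_url: str, tool_name: str, args: dict, api_key: str):
    """Assemble the URL, headers, and body for POST /tools/{toolName}.

    Assumption: the POST body is the argument object itself, matching the
    tool's inputSchema.
    """
    url = f"{base_url}/tools/{tool_name}"
    headers = {
        "nlx-api-key": api_key,          # same auth header as the GET endpoint
        "Content-Type": "application/json",
    }
    body = json.dumps(args)
    return url, headers, body

url, headers, body = build_tool_request(
    "https://apps.nlx.ai/c/mcp/xxxx/xxxx-en-US",
    "check_order_status",
    {"orderId": "A-1001"},
    "YOUR_API_KEY",
)
print(url)  # https://apps.nlx.ai/c/mcp/xxxx/xxxx-en-US/tools/check_order_status
```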

Response example:
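The original example body is not reproduced here. Assuming NLX follows the standard MCP tool-result shape (a `content` array of typed parts), a response might look like the sketch below — treat this as an assumption and confirm it against an actual response from your deployment:

```json
{
  "content": [
    {
      "type": "text",
      "text": "Order A-1001 is out for delivery"
    }
  ]
}
```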

Streaming execution (SSE)

If a tool produces long-running output or you want real-time feedback, enable streaming.

Response example: Returns text/event-stream chunks prefixed with data:.
