# Custom Voice+

## **What's Voice+?**

*Voice+* delivers a fully hands-free, conversational experience by providing a voice AI assistant that handles real-time navigation, form filling, and knowledge base answers on your website or mobile app. Instead of following a [predefined script](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/advanced/voice+-scripts) over a phone channel, users can ask open-ended questions or give natural commands while your AI responds dynamically with both spoken answers and contextual actions on the page. Adding this type of Voice+ experience to your frontend makes interactions more fluid and task-driven, without requiring users to click or tap.

* *Navigate hands-free*: Voice commands automatically trigger page changes (“Take me to pricing”)
* *Fill forms by voice*: Spoken input is mapped to form fields on the webpage (“My address is 31 Hudson Yards”)
* *Enhance knowledge base answers*: Responses not only include an answer but can support relevant navigation or form-fill actions
* *Trigger custom commands*: Send structured data to the frontend for domain-specific tasks (“Add the Hokas size 9 to my cart”)
* *Use real-time context*: The Touchpoint SDK continuously analyzes pages to improve navigation and form recognition
* *Combine scripted responses*: Use predefined voice messages for consistent and custom responses
* *Extend with MCP tools:* Fetch live data or run workflows to pull in information from outside your domain

```
User: What are your store hours?
AI: We’re open Monday through Saturday from 10AM to 8PM, and Sundays from 11AM to 6PM.

> Thanks to custom commands added as metadata in a knowledge base response, you
> can support automatic navigation to the Store Hours & Location page for this query.
```

<table><thead><tr><th>Features</th><th width="200.8125" data-type="checkbox">Readymade Voice+ app</th><th data-type="checkbox">Custom app with Voice+ node</th></tr></thead><tbody><tr><td><em>Custom Welcome message &#x26; process</em></td><td>false</td><td>true</td></tr><tr><td><em>Combine scripted responses</em></td><td>false</td><td>true</td></tr><tr><td><em>Turn off small talk</em></td><td>false</td><td>true</td></tr><tr><td><em>UI navigation</em></td><td>true</td><td>true</td></tr><tr><td><em>Fill forms</em></td><td>true</td><td>true</td></tr><tr><td><em>Use real-time context</em></td><td>true</td><td>true</td></tr><tr><td><em>Trigger commands</em></td><td>true</td><td>true</td></tr><tr><td><em>Answer FAQs</em></td><td>true</td><td>true</td></tr><tr><td><em>MCP tools</em></td><td>true</td><td>true</td></tr></tbody></table>

{% hint style="success" %}
Prefer a starter Voice+ app for lightning-fast implementation? Try our [readymade Voice+ app](https://docs.nlx.ai/platform/nlx-platform-guide/ai-applications/types/voice+-tm).
{% endhint %}

## Requirements

* [ ] Create knowledge base
* [ ] Create flow
* [ ] Deploy
* [ ] Install Touchpoint SDK

## Create KB & add metadata

{% hint style="success" %}
Est. time to complete: \~10 minutes
{% endhint %}

A *Q\&A* knowledge base (KB) provides answers to commonly asked questions that users may pose while engaged with your conversational AI. But a KB also serves another significant purpose: supporting navigation and custom actions that are triggered in the frontend when appropriate.

*User: What do you have that's vegetarian on the lunch menu?*

*Knowledge base answer: Our lunch menu includes a variety of sandwiches and salads*

*Action: Menu page is loaded*

*Custom action: Vegetarian options are shown in view pane*

To support these actions, metadata is added to relevant KB responses as key-value pairs. In the above example, the following metadata would be provided:

```yaml
nlx:destination: https://www.restaurant.com/menu
nlx:action: showLunch
nlx:actionPayload.vegetarian: true
nlx:actionPayload.glutenFree: false
```

Thus, when the KB answer is delivered, Touchpoint navigates to the menu page and sends a structured payload for the custom `showLunch` action, with `vegetarian` and `glutenFree` flags the frontend can use to filter the view.
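Conceptually, Touchpoint receives two separate commands for this response: a navigation command and a custom command. The sketch below illustrates their shape using the classification/action terminology from the key table that follows (field names are illustrative, not a guaranteed wire format):

```typescript
// Illustrative command shapes only; consult the Touchpoint SDK
// reference for the exact format your version delivers.
const navigationCommand = {
  classification: "navigation",
  action: "page_custom",
  destination: "https://www.restaurant.com/menu",
};

const customCommand = {
  classification: "custom",
  action: "showLunch",
  payload: { vegetarian: true, glutenFree: false },
};
```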

**Supported metadata keys**

<table><thead><tr><th>Key</th><th>Classification</th><th width="140.33331298828125">Action</th><th>Description</th></tr></thead><tbody><tr><td><code>nlx:destination</code></td><td><code>navigation</code></td><td><code>page_custom</code></td><td>Navigate to a specific page or section</td></tr><tr><td><code>nlx:action</code></td><td><code>custom</code></td><td>Custom action name</td><td>Send custom actions to the frontend</td></tr><tr><td><code>nlx:actionPayload</code></td><td><code>custom</code></td><td>Custom action data</td><td>Optional; only taken into account if the <code>nlx:action</code> key is present. Sent to the frontend as the <code>payload</code> key, along with command classification <code>custom</code> and <code>action</code> set to the <code>nlx:action</code> value</td></tr><tr><td><code>nlx:uri</code></td><td>n/a</td><td>Custom behavior (see <a href="#advanced-using-nlx-uri">Advanced section</a> below)</td><td>Limit responses and actions to specific <code>uri</code>s within your client application</td></tr></tbody></table>

* Select *Resources* > *Knowledge bases* in your workspace menu
* Create a [new *Q\&A* type knowledge base](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/knowledge-bases/ingest-content#q-and-a)
* On the *Q\&As* tab, upload or manually enter new articles
* Select *Metadata* tab
  * Choose the *Auto-generate* option to insert a sample schema, or manually enter metadata properties > Click *Save*

{% @arcade/embed flowId="4nU3Rtqx1SHAcNS8zr0N" url="<https://app.arcade.software/share/4nU3Rtqx1SHAcNS8zr0N>" %}

* On the *Q\&As* tab, expand any question that requires navigation or custom actions to be executed on your frontend when asked by a user
  * Click *+ Add metadata*
  * Input values into relevant metadata key fields > Click *Save*
* Choose the *Publish* tab of your knowledge base > Publish a new version

{% @arcade/embed flowId="3SdUjb0ufnOpRw7Hh59s" url="<https://app.arcade.software/share/3SdUjb0ufnOpRw7Hh59s>" %}

<details>

<summary>Advanced: Using nlx:uri</summary>

In the food menu example, you might provide a different answer to the question if the user is already on the menu page. That's where `nlx:uri` can change how Voice+ behaves:

Add a new article to your Q\&A knowledge base with the following content and `nlx:uri` metadata:

* Question: How do I know which lunch options are vegetarian?
* Answer: Look for the green leaf icon next to listed menu options.

| metadata key                   | value       |
| ------------------------------ | ----------- |
| `nlx:destination`              | N/A         |
| `nlx:action`                   | `showLunch` |
| `nlx:actionPayload`            | `{}`        |
| `nlx:actionPayload.vegetarian` | `true`      |
| `nlx:actionPayload.glutenFree` | `false`     |
| `nlx:uri`                      | `/menu`     |

With this metadata, the article only applies when the user is already on the `/menu` page: Touchpoint receives the custom action payload but no navigation command (a user asking from another page, such as `/contact`, gets the original article with its navigation instead). The voice assistant also adapts its spoken response to the page the user is currently viewing.
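No extra frontend logic is needed to honor `nlx:uri`, since the scoping happens before any command is sent. A minimal sketch of the resulting handler behavior (`setVegetarianFilter` and `setGlutenFreeFilter` are hypothetical UI state setters):

```typescript
// Hypothetical UI state setters; replace with your app's own.
declare function setVegetarianFilter(on: boolean): void;
declare function setGlutenFreeFilter(on: boolean): void;

// Invoked only while the user is on /menu, because nlx:uri scopes the
// article server-side. No navigation command arrives, so the handler
// simply applies the dietary filters in place.
function handleShowLunch(payload: { vegetarian: boolean; glutenFree: boolean }) {
  setVegetarianFilter(payload.vegetarian);
  setGlutenFreeFilter(payload.glutenFree);
}
```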

</details>

## Create flow

{% hint style="success" %}
Est. time to complete: \~5 minutes
{% endhint %}

The `Welcome` flow kickstarts your voice assistant's workflow. It delivers the initial greeting and activates Voice+ mode within your conversational AI application. Once active, your frontend (where the application is installed) is ready to receive API calls from NLX, enabling it to handle navigation, fill forms, trigger custom UI actions, and provide answers to user questions.

<figure><img src="https://2737319166-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHCxYxhIU0Bqkjj942mGk%2Fuploads%2Fqx5LjtYhFmPQTLfeG8kD%2FVoice%2B%20node%20in%20flow%20(1).png?alt=media&#x26;token=c85669f8-931a-4fef-88fd-f60e58bfb0bd" alt=""><figcaption></figcaption></figure>

* Select *Resources* > Choose the *Flows* card
* Create a new flow > Provide a name
* Add a [*Basic* node](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/overview/nodes#basic) containing a welcome message to fit your use case > Attach it to the *Start* node
* Add a [*Voice+* node](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/overview/nodes#voice) > Link it to the Basic node's Next path
  * Ensure *Agentic mode* is turned ON
  * Assign the knowledge base created earlier
  * Optionally assign [MCP-enabled flows](https://docs.nlx.ai/platform/nlx-platform-guide/flows-and-building-blocks/overview/setup#model-context-protocol-mcp) as *Tools*, if you would like your assistant to retrieve external data or execute a workflow during conversation
* Click *Save* on Canvas toolbar

## Deploy

{% hint style="success" %}
Est. time to complete: \~5 minutes
{% endhint %}

Now you'll construct your conversational AI application with the changes you've made.

* Select *Applications* from workspace menu
* Create an application > Choose *Blank application* > Select the *Custom* app type and provide a name
* On *Configuration* tab:
  * Choose the API channel > Select the *General* tab in the modal > Whitelist your domain
    * Select the *Voice* tab in the modal > Enable voice, then choose a TTS provider and the voice persona you'd like to use (built-in options include Inworld AI, ElevenLabs, Hume, OpenAI, and Amazon Polly)
    * Select the *Touchpoint* tab in the modal > Switch *Communication style* to *Voice* > Select *Mini* as the Layout
    * Change Color mode, Font, and Accent color to fit your business
    * Click *Update channel*
  * Attach your flow in the *Functionality* section > Select the *Default behavior* cog and assign your flow to the *Welcome* default
  * Optionally apply guardrail rules defined in your workspace to user inputs and/or application outputs: Select *+ Add guardrail* and choose one or more guardrails > Click *Save*

A build packages your application with a snapshot of the current state of flows, languages, and application setup. A deployment then pushes a successful build to the delivery channels where your app will be installed:

* Click deployment status in upper right > Select *Build and deploy*
  * Review the *Validation check* for critical errors or detected UX issues in custom flows. Before each new build initiates, a validation check runs to preview potential errors that could cause the build to fail; detected issues are listed with descriptions and potential solutions
  * Click *Create build*
* Click deployment status in upper right > Select *All builds*
* Choose *Deploy* on successful build > Click *Create deployment*

## Implement Touchpoint

{% hint style="success" %}
Est. time to complete: \~5 minutes
{% endhint %}

* Click the *Configuration* tab of your application > Select the API channel under *Delivery*
* Choose *Touchpoint*
* Copy the setup snippet (scroll down) and install it in your frontend
  * If applicable, define handlers for navigation, form fill, and custom commands in your frontend code
    * Use provided metadata to route each command to the correct UI behavior

Example:

```typescript
// setVegetarianOption / setGlutenFreeOption are your app's own UI state setters.
function handleShowLunch(payload) {
  setVegetarianOption(payload.vegetarian);
  setGlutenFreeOption(payload.glutenFree);
}

// Route each custom command from NLX to the matching UI behavior.
function handleCustomCommand(action, payload) {
  // Example: Voice-enabled lunch menu navigation
  if (action === "showLunch") {
    handleShowLunch(payload);
  }
}

const touchpointOptions = {
  config: {
    applicationUrl: "YOUR_APPLICATION_URL",
    headers: {
      "nlx-api-key": "YOUR_API_KEY",
    },
    languageCode: "en-US",
  },
  input: "voiceMini", // Enables voice input with bidirectional support
  bidirectional: {
    custom: handleCustomCommand,
  },
};
```
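To activate these options, pass them to the Touchpoint initializer from your setup snippet. A minimal sketch, assuming the `@nlxai/touchpoint-ui` package and its `create` export (the snippet you copied from the channel modal is the authoritative reference for the exact import and call):

```typescript
import { create } from "@nlxai/touchpoint-ui";

// Initialize the widget with the options defined above; create()
// resolves once Touchpoint is mounted and ready to receive commands.
const touchpoint = await create(touchpointOptions);
```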

{% hint style="info" %}
Want to take your app offline? Click the deployed build > Select the *Deployment* tab in the modal > Scroll to *Danger zone* and click *Delete deployment*. The app stays offline until you redeploy
{% endhint %}

