Touchpoint

Touchpoint is a drop-in conversational UI and SDK with support for chat, voice, and Voice+™ interactions.

Overview

Touchpoint is a drop-in conversational user interface (UI) and Software Development Kit (SDK) designed to seamlessly integrate conversational AI into web and mobile applications. It provides native support for text-based chat and voice interactions — as well as Voice+™, NLX's patented multimodal technology that synchronizes voice with digital assets.

Visual examples of Touchpoint on mobile screens

Design philosophy

The future of human-computer interaction is conversational. As applications evolve, user expectations are shifting from static menus to "conversation-first" experiences. Touchpoint enables this future today.

Unlike a traditional customer service chatbot that lives in a silo, NLX Touchpoint is a rich multimodal interface. It serves as a dynamic layer on top of your application: users can interact via text or voice to control the digital elements on their screen, and on-screen interactions can drive the conversation in turn. It is completely agnostic of any industry, vertical, or use case and is equally capable of powering a retail checkout assistant, a healthcare intake form, or a banking support agent.

Supported channels

  • Chat

  • Voice

  • Voice+™

Key features

Drop-in installation

Get up and running in minutes using a simple HTML script tag or a standard NPM package.
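As a sketch of the NPM route, initialization might look like the following. The option names, the package name, and the `create` call are assumptions for illustration, not the SDK's confirmed API; consult the Touchpoint package documentation for the actual fields.

```javascript
// Hypothetical configuration sketch: field names are illustrative
// assumptions, not the SDK's documented API.
const touchpointConfig = {
  applicationUrl: "https://example.com/your-application-url", // placeholder endpoint
  headers: { "nlx-api-key": "YOUR_API_KEY" },                 // placeholder credential
  languageCode: "en-US",
};

// With the NPM package (name assumed), initialization would then be roughly:
//   import { create } from "@nlxai/touchpoint-ui";
//   create({ config: touchpointConfig });
```

The script-tag route works the same way: the tag loads the SDK onto the page, and the same configuration object is passed to its initialization call.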

Themable

Comes with polished, accessibility-compliant defaults that can be extensively branded to match your design system.
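Branding typically amounts to overriding the defaults with a small theme object at initialization. The property names below are illustrative assumptions, not the SDK's actual theming API; check the Touchpoint documentation for the supported keys.

```javascript
// Hypothetical theme sketch: the accessibility-compliant defaults remain
// in effect for anything not overridden here. Property names are
// illustrative assumptions.
const theme = {
  fontFamily: "'Inter', sans-serif", // match your product's typography
  accentColor: "#0a4cff",            // primary brand color
};

// A theme like this would be passed alongside the main configuration
// when the widget is initialized.
```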

Responsive web and mobile support

Automatically adapts to desktop, tablet, and mobile form factors, including full-screen and headless modes.

Visual modality support

Goes beyond text bubbles by rendering rich UI elements (cards, carousels, date pickers) in response to both chat and voice commands.

Voice+ support

Supports NLX's signature Voice+ technology out of the box.

  • Voice drives digital: A user speaking on the phone or to the website can trigger actions, such as navigation or form completion, on a connected device.

  • Digital drives voice: On-screen interactions can trigger spoken responses or IVR logic.
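The "voice drives digital" direction can be pictured as the application reacting to command payloads emitted while the user speaks. The handler and payload shape below are purely hypothetical, for illustrating the pattern; they are not the Touchpoint API.

```javascript
// Hypothetical sketch: a connected device reacting to Voice+ commands.
// The payload fields (classification, destination, field, value) are
// illustrative assumptions, not the SDK's actual schema.
function handleVoicePlusCommand(command) {
  switch (command.classification) {
    case "navigation":
      // The voice bot referenced a page; route the app to it.
      return { action: "navigate", target: command.destination };
    case "input":
      // The user spoke a value; fill the matching form field.
      return { action: "fillField", field: command.field, value: command.value };
    default:
      return { action: "ignore" };
  }
}
```

The "digital drives voice" direction is the mirror image: an on-screen interaction sends an event back through the channel, which the voice bot or IVR logic responds to.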

Built-in support for agent escalation

Natively handles the UI state changes required when handing off from an AI assistant to a human agent.
