Android
This page documents the Touchpoint Android package and serves as the guide for the Touchpoint SDK for Android applications.
Touchpoint is a native Android SDK that provides a customizable conversational interface you can embed in your Android applications. It lets users interact with your NLX applications through natural language and delivers a seamless conversational experience.
Overview
This SDK provides:
Voice conversations via NLX (real‑time, full‑duplex audio).
Floating widget: a compact bubble with mic/speaker/close controls that snaps to screen edges.
Bidirectional form interaction: the assistant sends field updates over a WebSocket (“voice‑plus” channel); your app applies them to native views.
Simple integration: one configuration object and a single custom view.
Requirements
Android: API 29+ (Android 10+).
Kotlin: 1.8+
Gradle: Android Gradle Plugin 8+
Permissions:
RECORD_AUDIO, INTERNET
Network: TLS/WSS access to NLX endpoints
Installation
You can include the SDK as a module inside your app, or publish it to a repository and consume it as a dependency.
Option A — Local module (recommended while iterating)
Copy the SDK modules into your project (e.g., core/, voiceengine/, touchpointui/).
Include the modules in your settings.gradle and add them as dependencies in your app's build.gradle.
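A minimal sketch of both files, assuming the module directory names above; adjust paths and module names to match your project:

```groovy
// settings.gradle: include the SDK modules that live inside your project
include ':core', ':voiceengine', ':touchpointui'
```

```groovy
// app/build.gradle: depend on the included modules
dependencies {
    implementation project(':core')
    implementation project(':voiceengine')
    implementation project(':touchpointui')
}
```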
Option B — Composite build (Git submodule)
Add the SDK repo as a git submodule (or plain checkout) at, say, third_party/nlx-sdk-android.
Include it as a composite build in your root settings.gradle; you can then depend on the included modules without publishing.
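A sketch assuming the submodule path above; the Maven coordinates used for dependency substitution are illustrative and must match the ones declared in the SDK's own build files:

```groovy
// settings.gradle (root): pull the SDK checkout in as a composite build
includeBuild 'third_party/nlx-sdk-android'
```

```groovy
// app/build.gradle: Gradle substitutes these coordinates with the included build
dependencies {
    implementation 'ai.nlx:touchpointui:0.1.0'
}
```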
Tip: Tag the SDK repo (e.g., v0.1.0) and pin your submodule to that tag to lock versions across teams.
Option C — Maven repository (GitHub Packages – private)
Publish artifacts to GitHub Packages and consume them by version tags.
Publishing (in each module’s build.gradle.kts):
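A sketch of the publishing-related parts only, assuming an Android library module, AGP 8's singleVariant publishing, and hypothetical ai.nlx coordinates; replace OWNER/REPO and the credentials wiring to match your setup:

```kotlin
// build.gradle.kts (each published module)
plugins {
    id("com.android.library")
    `maven-publish`
}

android {
    publishing {
        singleVariant("release") { withSourcesJar() }
    }
}

afterEvaluate {
    publishing {
        publications {
            register<MavenPublication>("release") {
                groupId = "ai.nlx"          // hypothetical group
                artifactId = "touchpointui" // one artifact per module
                version = "0.1.0"
                from(components["release"])
            }
        }
        repositories {
            maven {
                name = "GitHubPackages"
                url = uri("https://maven.pkg.github.com/OWNER/REPO")
                credentials {
                    username = System.getenv("GITHUB_ACTOR")
                    password = System.getenv("GITHUB_TOKEN")
                }
            }
        }
    }
}
```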
Consume (in your app):
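On the consumer side, add the GitHub Packages repository (credentials are required even for read access) and depend on the published coordinates; the group and artifact names are again illustrative:

```kotlin
// settings.gradle.kts
dependencyResolutionManagement {
    repositories {
        google()
        mavenCentral()
        maven {
            url = uri("https://maven.pkg.github.com/OWNER/REPO")
            credentials {
                username = System.getenv("GITHUB_ACTOR")
                password = System.getenv("GITHUB_TOKEN")
            }
        }
    }
}
```

```kotlin
// app/build.gradle.kts
dependencies {
    implementation("ai.nlx:touchpointui:0.1.0") // hypothetical coordinates
}
```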
Versioning with tags: Create a tag per release and publish:
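For example, assuming a publish task is wired up as in the sketch above:

```bash
git tag v0.1.0
git push origin v0.1.0
./gradlew publish
```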
Consumers then use 0.1.0 in their Gradle dependencies.
Option D — JitPack (public)
If the repo can be public, you may use JitPack:
Push a version tag (e.g., v0.1.0) to your GitHub repo.
In your app:
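A sketch assuming the repository lives at github.com/YOUR_ORG/nlx-sdk-android; JitPack coordinates follow the com.github.USER:REPO:TAG pattern:

```groovy
// settings.gradle: add the JitPack repository
dependencyResolutionManagement {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://jitpack.io' }
    }
}
```

```groovy
// app/build.gradle
dependencies {
    implementation 'com.github.YOUR_ORG:nlx-sdk-android:v0.1.0'
}
```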
Check JitPack’s build log for the exact artifact coordinates if you publish multiple modules.
Quick Start
1) Manifest & runtime permission
Request RECORD_AUDIO at runtime on Android 10+.
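Declare both android.permission.RECORD_AUDIO and android.permission.INTERNET in AndroidManifest.xml, then request the microphone at runtime. A minimal sketch using the AndroidX Activity Result API:

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class MainActivity : AppCompatActivity() {

    // RECORD_AUDIO and INTERNET must also be declared in AndroidManifest.xml
    private val micPermission =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) {
                // Safe to start a voice conversation / show the widget
            }
        }

    private fun ensureMicPermission() {
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
            != PackageManager.PERMISSION_GRANTED
        ) {
            micPermission.launch(Manifest.permission.RECORD_AUDIO)
        }
    }
}
```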
2) Configure the SDK
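Build an NlxConfig and wrap it in a TouchPointConfiguration. The constructor parameters below are assumptions based on the values the SDK needs (deployment key, API key, bot/channel, language) plus the input callback described under Bidirectional Form Updates; check the classes in your SDK version for the exact signatures:

```kotlin
// Field names are illustrative, not the exact SDK signatures
val nlxConfig = NlxConfig(
    deploymentKey = "YOUR_DEPLOYMENT_KEY",
    apiKey = "YOUR_NLX_API_KEY",
    botId = "YOUR_BOT_ID",
    languageCode = "en-US",
)

val touchPointConfig = TouchPointConfiguration(
    nlxConfig = nlxConfig,
    input = { update ->
        // Apply assistant-driven form updates to your native views
        // (see "Bidirectional Form Updates" below)
    },
)
```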
3) Add the floating widget
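Add a TouchPointWidgetView to your layout and pass it the configuration via configure(config) (listed in the API surface below). A sketch that attaches the widget programmatically in onCreate; the constructor and layout parameters are assumptions:

```kotlin
val widget = TouchPointWidgetView(this)   // assumed (Context) constructor
addContentView(
    widget,
    ViewGroup.LayoutParams(
        ViewGroup.LayoutParams.MATCH_PARENT,
        ViewGroup.LayoutParams.MATCH_PARENT
    )
)
widget.configure(touchPointConfig)
```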
4) Provide screen context & show the widget
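setCurrentContext { … }, setOnCloseActionListener { … }, and show() come from the API surface below; the context builder called here is a hypothetical stand-in for your own helper (or the provided buildFlightDemoContext(...)):

```kotlin
widget.setCurrentContext {
    // Return a structured description of the current screen
    // (see "Screen Context" below); buildMyScreenContext() is a hypothetical helper
    buildMyScreenContext()
}
widget.setOnCloseActionListener {
    // The user closed the floating widget
}
widget.show()   // connects to NLX and opens the voice-plus WebSocket
```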
Bidirectional Form Updates
When the assistant wants to fill a field, it sends a field update as JSON over the voice-plus WebSocket. These updates are surfaced through the input callback you set in TouchPointConfiguration:
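A sketch of handling an update inside that callback; the update's accessors are assumptions (the SDK parses the raw JSON for you), but the id-to-value shape matches the mapping described next:

```kotlin
input = { update ->
    // Illustrative: an update carries the NLX field ID and the new value,
    // e.g. id = "departureCity", value = "Boston"
    runOnUiThread {
        applyFieldUpdate(update.id, update.value)   // hypothetical accessors and helper
    }
}
```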
Mapping IDs to views: keep a map from NLX field IDs to Android view tags/IDs and update the view:
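A minimal sketch inside your Activity, with illustrative field IDs and view IDs:

```kotlin
// Map NLX field IDs to the views that render them (IDs are illustrative)
val fieldViews: Map<String, Int> = mapOf(
    "departureCity" to R.id.departureCityInput,
    "arrivalCity" to R.id.arrivalCityInput,
)

fun applyFieldUpdate(fieldId: String, value: String) {
    val viewId = fieldViews[fieldId] ?: return
    when (val view = findViewById<View>(viewId)) {
        is EditText -> view.setText(value)
        is TextView -> view.text = value
    }
}
```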
For RadioGroups, maintain an input‑ID → value table and select the matching option:
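For example, with illustrative values and button IDs:

```kotlin
// Value sent by the assistant mapped to the radio button to check (illustrative)
val cabinClassOptions: Map<String, Int> = mapOf(
    "economy" to R.id.cabinEconomy,
    "business" to R.id.cabinBusiness,
)

fun applyRadioUpdate(group: RadioGroup, value: String) {
    cabinClassOptions[value.lowercase()]?.let { group.check(it) }
}
```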
Screen Context
Provide a structured view of the current screen so the assistant knows your fields:
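The exact context format is defined by the SDK (use buildFlightDemoContext(...) as the reference); purely as an illustration, a builder that lists each field's ID, type, and label might look like this:

```kotlin
// Hypothetical shape: field descriptors keyed by the same IDs used in the view mapping
fun buildMyScreenContext(): Map<String, Any> = mapOf(
    "screen" to "FlightSearch",
    "fields" to listOf(
        mapOf("id" to "departureCity", "type" to "text", "label" to "Departure city"),
        mapOf("id" to "arrivalCity", "type" to "text", "label" to "Arrival city"),
        mapOf("id" to "cabinClass", "type" to "radio", "label" to "Cabin class"),
    ),
)
```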
Use the provided buildFlightDemoContext(...) as a template and resend context when your screen changes.
How it Works
NLX.conversation().getVoiceCredentials() fetches a token for your conversationId.
VoiceEngine connects to NLX for real-time audio.
In parallel, the SDK opens a WebSocket (voice-plus) using your deployment key, bot channel (botId-language), conversationId, and API key.
The assistant publishes form updates, navigation, and custom actions over that socket; the SDK parses and routes them to your callbacks.
The conversationId must match across REST, NLX, and the WebSocket URL (the SDK handles this).
Troubleshooting
403 Forbidden on WebSocket: Add Origin: https://demos.nlx.ai (or your approved origin) and ensure a valid nlx-api-key. Double-check deploymentKey, channelKey (BOTID-LANG), languageCode, and conversationId in the URL.
404 Not Found on WebSocket: The URL is malformed. Pattern: wss://us-east-1-ws.bots.studio.nlx.ai/?deploymentKey=…&channelKey=BOTID-LANG&languageCode=…&conversationId=…&type=voice-plus&apiKey=…
No form updates: Ensure your input callback is set. Provide a correct context (IDs, types). Verify conversationId consistency. Confirm the agent flow actually emits updates (try the web demo with the same bot).
Bot gets “confused” after playback: Don't auto-call StructuredRequest(poll = true) on every playback end.
API Surface (selected)
NlxConfig
TouchPointConfiguration
TouchPointWidgetView
configure(config) – pass your TouchPointConfiguration.
setCurrentContext { … } – provide a context builder lambda.
show() – connects to NLX and opens the Voice+ WebSocket.
setOnCloseActionListener { … } – receive close events.
Visual states: Idle, Connecting, Active; mic/speaker color hints reflect who is speaking.