Android
This page documents the Touchpoint Android package and serves as the guide for the Touchpoint SDK for Android applications.
Touchpoint is a native Android SDK that provides a customizable conversational interface you can embed in your Android applications. It lets users interact with your NLX applications through natural language and delivers a seamless conversational experience.
Overview
This SDK provides:
Voice conversations via NLX (real‑time, full‑duplex audio).
Floating widget: a compact bubble with mic/speaker/close controls that snaps to screen edges.
Bidirectional form interaction: the assistant sends field updates over a WebSocket (“voice‑plus” channel); your app applies them to native views.
Simple integration: one configuration object and a single custom view.
Requirements
Android: API 29+ (Android 10+)
Kotlin: 1.8+
Gradle: Android Gradle Plugin 8+
Permissions: RECORD_AUDIO, INTERNET
Network: TLS/WSS access to NLX endpoints
Installation
You can include the SDK as a module inside your app, or publish it to a repository and consume it as a dependency.
Option A — Local module (recommended while iterating)
Copy the SDK modules into your project (e.g., core/, voiceengine/, touchpointui/).
In your settings.gradle:

include(":nlx-core", ":nlx-voiceengine", ":nlx-touchpointui")

In your app's build.gradle:

dependencies {
    implementation(project(":nlx-core"))
    implementation(project(":nlx-voiceengine"))
    implementation(project(":nlx-touchpointui"))
}
Option B — Composite build (Git submodule)
Add the SDK repo as a git submodule (or plain checkout) at, say, third_party/nlx-sdk-android.
In your root settings.gradle:

includeBuild("third_party/nlx-sdk-android")

Now you can depend on the included modules without publishing:

dependencies {
    implementation("com.yourorg.nlx:nlx-core") // resolved from the composite build
    implementation("com.yourorg.nlx:nlx-voiceengine")
    implementation("com.yourorg.nlx:nlx-touchpointui")
}
Tip: Tag the SDK repo (e.g., v0.1.0) and pin your submodule to that tag to lock versions across teams.
Option C — Maven repository (GitHub Packages – private)
Publish artifacts to GitHub Packages and consume them by version tags.
Publishing (in each module’s build.gradle.kts):
plugins {
id("maven-publish")
}
group = "com.yourorg.nlx"
version = "0.1.0" // bump this and create a git tag v0.1.0
publishing {
publications {
create<MavenPublication>("release") {
from(components["release"]) // for Android libraries; use "java" for pure JVM modules
artifactId = "nlx-touchpointui" // change per module
}
}
repositories {
maven {
name = "GitHubPackages"
url = uri("https://maven.pkg.github.com/YOUR_GH_ORG/YOUR_REPO")
credentials {
username = System.getenv("GITHUB_ACTOR") ?: "YOUR_GH_USER"
password = System.getenv("GITHUB_TOKEN") ?: "YOUR_PAT_WITH_read:packages_write:packages"
}
}
}
}

Consume (in your app):
// settings.gradle or repositories { ... } in build.gradle
repositories {
mavenCentral()
maven {
url = uri("https://maven.pkg.github.com/YOUR_GH_ORG/YOUR_REPO")
credentials {
username = providers.environmentVariable("GITHUB_ACTOR").orNull ?: "YOUR_GH_USER"
password = providers.environmentVariable("GITHUB_TOKEN").orNull ?: "YOUR_PAT_WITH_read:packages"
}
}
}
// build.gradle
dependencies {
implementation("com.yourorg.nlx:nlx-core:0.1.0")
implementation("com.yourorg.nlx:nlx-voiceengine:0.1.0")
implementation("com.yourorg.nlx:nlx-touchpointui:0.1.0")
}

Versioning with tags: create a tag per release and publish:
git tag v0.1.0
git push origin v0.1.0
./gradlew publish

Consumers then use 0.1.0 in their Gradle dependencies.
Option D — JitPack (public)
If the repo can be public, you may use JitPack:
Push a version tag (e.g., v0.1.0) to your GitHub repo.
In your app:

repositories {
    maven { url = uri("https://jitpack.io") }
}

dependencies {
    // For multi-module builds, the artifactId is each module name you publish
    implementation("com.github.YOUR_GH_USER:nlx-android-sdk:0.1.0") // single-module example
    // or, if JitPack exposes modules separately:
    // implementation("com.github.YOUR_GH_USER:YOUR_REPO_MODULE:0.1.0")
}
Check JitPack’s build log for the exact artifact coordinates if you publish multiple modules.
Quick Start
1) Manifest & runtime permission
<!-- AndroidManifest.xml -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.INTERNET" />

Request RECORD_AUDIO at runtime on Android 10+.
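As a sketch, the runtime request can be wired with the AndroidX Activity Result API; the class and method names here are illustrative, not part of the Touchpoint SDK:

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class VoiceActivity : AppCompatActivity() {
    // Register once; launch when the user first taps the voice widget.
    private val micPermission =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startVoice() else showMicRationale()
        }

    private fun ensureMicThenStart() {
        val state = ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
        if (state == PackageManager.PERMISSION_GRANTED) startVoice()
        else micPermission.launch(Manifest.permission.RECORD_AUDIO)
    }

    private fun startVoice() { /* e.g., widget.show() */ }
    private fun showMicRationale() { /* explain why the mic is needed */ }
}
```

Call the equivalent of ensureMicThenStart() before showing the widget so the mic is available when the voice session connects.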
2) Configure the SDK
val nlxConfig = NlxConfig(
applicationUrl = "https://bots.studio.nlx.ai/c/DEPLOYMENT/CHANNEL",
headers = mapOf("nlx-api-key" to "YOUR_API_KEY"),
conversationId = UUID.randomUUID().toString(),
userId = UUID.randomUUID().toString(),
languageCode = "en-US"
)
val tpConfig = TouchPointConfiguration(
nlxConfig = nlxConfig,
input = Input.VoiceMini,
bidirectional = TouchPointConfiguration.Bidirectional.Automatic(
TouchPointConfiguration.AutomaticBidirectionalConfig(
navigation = { command, payload -> /* handle navigation */ },
input = { fieldId, value, meta -> formBinder.applyFieldUpdate(fieldId, value) },
custom = { name, payload -> /* custom actions */ }
)
)
)

3) Add the floating widget
<com.nlx.touchpointui.TouchPointWidgetView
android:id="@+id/touchpointWidget"
android:layout_width="match_parent"
android:layout_height="match_parent"/>

4) Provide screen context & show the widget
class MainActivity : AppCompatActivity() {
private lateinit var widget: TouchPointWidgetView
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
widget = findViewById(R.id.touchpointWidget)
val context = buildFlightDemoContext(
tripType = "", departure = "", destination = "",
departureDate = "", returnDate = "",
numPassengers = "", cabinClass = "",
directOnly = false, flexibleDates = false, nearbyAirports = false
)
widget.setCurrentContext { context }
widget.configure(tpConfig)
findViewById<FloatingActionButton>(R.id.fabNlx).setOnClickListener {
it.visibility = View.GONE
widget.show() // connects to NLX and opens the Voice+ WebSocket
}
widget.setOnCloseActionListener {
findViewById<FloatingActionButton>(R.id.fabNlx).visibility = View.VISIBLE
}
}
}

Bidirectional Form Updates
When the assistant wants to fill a field, you’ll receive JSON over the voice‑plus WebSocket, e.g.:
{
"classification": "input",
"fields": [
{ "id": "input-4", "value": "2025-09-03" },
{ "id": "input-5", "value": "2025-09-08" }
],
"conversationId": "…"
}

These updates are surfaced through the input callback you set in TouchPointConfiguration:
val handleForm: (String, Any?, Map<String, Any?>) -> Unit = { fieldId, value, meta ->
formBinder.applyFieldUpdate(fieldId, value)
}

Mapping IDs to views: keep a map from NLX field IDs to Android view tags/IDs and update the view:
private val idAlias = mapOf(
"input-2" to "departure",
"input-3" to "destination",
"input-4" to "departureDate",
"input-5" to "returnDate",
"select-6" to "numPassengers",
"select-7" to "cabinClassType",
"input-8" to "directOnly",
"input-9" to "flexibleDates",
"input-10" to "nearbyAirports",
"input-0" to "tripType", // radio group
"input-1" to "tripType"
)

For RadioGroups, maintain an input‑ID → value table and select the matching option:
private val radioValueById = mapOf(
"input-0" to "round-trip",
"input-1" to "one-way"
)

Screen Context
Provide a structured view of the current screen so the assistant knows your fields:
{
"nlx:vpContext": {
"fields": [
{"id":"input-2","name":"departure","type":"text","placeholder":"Enter departure city","value":""},
{"id":"input-3","name":"destination","type":"text","placeholder":"Enter destination city","value":""},
{"id":"input-4","name":"departureDate","type":"date","value":""},
{"id":"input-5","name":"returnDate","type":"date","value":""},
{"id":"select-6","name":"numPassengers","type":"select-one","value":"1","options":[...]}
],
"destinations": []
}
}

Use the provided buildFlightDemoContext(...) as a template and resend context when your screen changes.
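If you need to assemble such a context by hand, the shape can be built from plain Kotlin maps and handed to whatever JSON serializer you already use. The key names below mirror the example above; the helper functions themselves are illustrative assumptions, not SDK API:

```kotlin
// Sketch: assemble the "nlx:vpContext" payload as nested maps.
// Only the key names come from the documented example; these helpers are illustrative.
fun buildVpContext(fields: List<Map<String, Any?>>): Map<String, Any?> =
    mapOf(
        "nlx:vpContext" to mapOf(
            "fields" to fields,
            "destinations" to emptyList<Any?>()
        )
    )

fun textField(id: String, name: String, placeholder: String, value: String = ""): Map<String, Any?> =
    mapOf("id" to id, "name" to name, "type" to "text",
          "placeholder" to placeholder, "value" to value)

val context = buildVpContext(
    listOf(
        textField("input-2", "departure", "Enter departure city"),
        textField("input-3", "destination", "Enter destination city")
    )
)
```

Serializing this map with your JSON library of choice yields the structure shown above.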
How it Works
NLX.conversation().getVoiceCredentials() fetches a token for your conversationId.
VoiceEngine connects to NLX for real‑time audio.
In parallel, the SDK opens a WebSocket (voice‑plus) using your deployment key, bot channel (botId‑language), conversationId, and API key.
The assistant publishes form updates, navigation, and custom actions over that socket; the SDK parses and routes them to your callbacks.
The conversationId must match across REST, NLX, and the WebSocket URL (the SDK handles this).
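To make the consistency requirement concrete, here is a sketch of how such a URL can be assembled. The host and query parameter names follow the voice‑plus pattern documented under Troubleshooting; the helper function itself is not part of the SDK, which builds this URL for you:

```kotlin
import java.net.URLEncoder

// Illustrative only: the SDK assembles this URL internally.
// Parameter names follow the documented voice-plus URL pattern.
fun voicePlusUrl(
    deploymentKey: String,
    channelKey: String,     // BOTID-LANG
    languageCode: String,
    conversationId: String, // must match the id used for REST and audio
    apiKey: String
): String {
    fun enc(s: String) = URLEncoder.encode(s, "UTF-8")
    return "wss://us-east-1-ws.bots.studio.nlx.ai/?" +
        "deploymentKey=${enc(deploymentKey)}" +
        "&channelKey=${enc(channelKey)}" +
        "&languageCode=${enc(languageCode)}" +
        "&conversationId=${enc(conversationId)}" +
        "&type=voice-plus" +
        "&apiKey=${enc(apiKey)}"
}
```

Passing the same conversationId here as in NlxConfig is what keeps the REST, audio, and WebSocket legs of the session correlated.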
Troubleshooting
403 Forbidden on WebSocket
Add Origin: https://demos.nlx.ai (or your approved origin) and ensure a valid nlx-api-key. Double‑check deploymentKey, channelKey (BOTID-LANG), languageCode, and conversationId in the URL.
404 Not Found on WebSocket
The URL is malformed. Pattern:
wss://us-east-1-ws.bots.studio.nlx.ai/?deploymentKey=…&channelKey=BOTID-LANG&languageCode=…&conversationId=…&type=voice-plus&apiKey=…
No form updates
Ensure your input callback is set.
Provide a correct context (IDs, types).
Verify conversationId consistency.
Confirm the agent flow actually emits updates (try the web demo with the same bot).
Bot gets “confused” after playback
Don’t auto‑call StructuredRequest(poll = true) on every playback end.
API Surface (selected)
NlxConfig
data class NlxConfig(
val applicationUrl: String,
val headers: Map<String, String>,
val languageCode: String = "en-US",
val userId: String? = null,
val conversationId: String? = null
)

TouchPointConfiguration
data class TouchPointConfiguration(
val nlxConfig: NlxConfig,
val input: Input = Input.VoiceMini,
val bidirectional: Bidirectional = Bidirectional.Disabled
) {
sealed class Bidirectional {
data object Disabled : Bidirectional()
data class Automatic(val config: AutomaticBidirectionalConfig) : Bidirectional()
data class Manual(val controller: BidirectionalController) : Bidirectional()
}
data class AutomaticBidirectionalConfig(
val navigation: ((String, Map<String, Any?>) -> Unit)? = null,
val input: ((String, Any?, Map<String, Any?>) -> Unit)? = null,
val custom: ((String, Map<String, Any?>) -> Unit)? = null
)
}

TouchPointWidgetView
configure(config) – pass your TouchPointConfiguration.
setCurrentContext { … } – provide a context builder lambda.
show() – connects to NLX and opens the Voice+ WebSocket.
setOnCloseActionListener { … } – receive close events.
Visual states: Idle, Connecting, Active; mic/speaker color hints reflect who is speaking.
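Tying the pieces together, the routing described under Bidirectional Form Updates can be condensed into a small pure-Kotlin dispatcher. The maps mirror idAlias and radioValueById from that section; the FieldAction type and resolve function are illustrative assumptions, with the actual view update left to your binder:

```kotlin
// Illustrative routing of a voice-plus field update to an app-level field name.
// idAlias and radioValueById mirror the Quick Start; FieldAction is an assumption.
sealed class FieldAction {
    data class SetValue(val fieldName: String, val value: Any?) : FieldAction()
    data class SelectRadio(val fieldName: String, val option: String) : FieldAction()
    data class Unknown(val fieldId: String) : FieldAction()
}

val idAlias = mapOf(
    "input-2" to "departure", "input-4" to "departureDate",
    "input-0" to "tripType", "input-1" to "tripType"
)
val radioValueById = mapOf("input-0" to "round-trip", "input-1" to "one-way")

fun resolve(fieldId: String, value: Any?): FieldAction {
    val name = idAlias[fieldId] ?: return FieldAction.Unknown(fieldId)
    val radio = radioValueById[fieldId]
    return if (radio != null) FieldAction.SelectRadio(name, radio)
           else FieldAction.SetValue(name, value)
}
```

Inside the real input callback you would then look up the view tagged with the resolved field name and apply the value on the main thread.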