OpenAI
The NLX OpenAI Chat channel acts as a compatibility layer (proxy) that allows standard LLM client libraries to communicate directly with your NLX AI apps.
This lets you swap out "pure" LLM calls in your existing applications (such as those using LangChain, AutoGen, or the OpenAI SDK) for calls to your managed NLX app, without rewriting your integration logic.
Key Differences
Unlike the Standard REST API, this interface strictly mimics the OpenAI API specification.
Authentication: Uses Bearer token auth (standard for OpenAI clients).
Endpoint: Mimics /chat/completions.
Model Parameter: Used to pass routing and session information.
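To make these differences concrete, here is a minimal sketch of a raw request against the channel using Node's built-in fetch. The deployment key, channel key, and API key are placeholders, the model value is explained under "The model Parameter" below, and the sketch assumes the /chat/completions path is appended to the Base URL (as the OpenAI SDK does) and that the response follows the standard OpenAI chat completion shape.
// Sketch of a raw OpenAI-style request to an NLX channel.
// All key values below are placeholders — substitute your own.
async function rawRequest() {
  const response = await fetch(
    'https://apps.nlx.ai/v1/YOUR_DEPLOYMENT_KEY/YOUR_CHANNEL_KEY-en-US/chat/completions',
    {
      method: 'POST',
      headers: {
        Authorization: 'Bearer YOUR_NLX_API_KEY', // Bearer token auth
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        model: 'nlx', // no conversation ID: starts a new session
        messages: [{ role: 'user', content: 'Hello!' }],
      }),
    }
  );
  const data = await response.json();
  console.log(data.choices[0]?.message?.content);
}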
Configuration
Base URL: https://apps.nlx.ai/v1/{deploymentKey}/{channelKey}-{languageCode}
API Key: Your NLX API Key (passed as Bearer token)
Important: You must append the language code (e.g., -en-US) to the end of your channel key in the Base URL.
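As a quick sketch (placeholder values only), the Base URL is assembled from its parts like this:
// Assembling the Base URL; deployment and channel keys are placeholders.
const deploymentKey = 'YOUR_DEPLOYMENT_KEY';
const channelKey = 'YOUR_CHANNEL_KEY';
const languageCode = 'en-US'; // appended to the channel key after a hyphen
const baseURL = `https://apps.nlx.ai/v1/${deploymentKey}/${channelKey}-${languageCode}`;
// => https://apps.nlx.ai/v1/YOUR_DEPLOYMENT_KEY/YOUR_CHANNEL_KEY-en-US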
The model Parameter
In standard OpenAI calls, model specifies the engine (e.g., gpt-4). In NLX OpenAI Connect, the model parameter is used to maintain conversation state.
Format: nlx:{conversationId}
nlx:61d4735d-d67b-4496-8e41-3094b7c5e1d5: Resume or continue conversation 61d4735d-d67b-4496-8e41-3094b7c5e1d5.
nlx: (Omit ID) Start a brand new session. The API will generate a UUID for you.
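A small helper makes the two forms explicit. The nlxModel function below is a hypothetical name used only for illustration, not part of the NLX API:
// Hypothetical helper that builds the model string.
function nlxModel(conversationId) {
  // With an ID, the conversation is resumed; without one, 'nlx' starts a new session.
  return conversationId ? `nlx:${conversationId}` : 'nlx';
}

nlxModel();                                       // 'nlx' -> new session
nlxModel('61d4735d-d67b-4496-8e41-3094b7c5e1d5'); // resume that conversation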
Integration Examples
OpenAI Node.js SDK
You can use the official OpenAI library by simply changing the baseURL and apiKey.
import OpenAI from 'openai';
const client = new OpenAI({
  apiKey: 'YOUR_NLX_API_KEY', // Use your NLX key here
  baseURL: 'https://apps.nlx.ai/v1/xxxx/xxxx-en-US'
});

async function main() {
  const stream = await client.chat.completions.create({
    model: 'nlx:61d4735d-d67b-4496-8e41-3094b7c5e1d5', // Pass session ID here
    messages: [{ role: 'user', content: 'Hello!' }],
    stream: true,
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }
}

main();
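If you don't need streaming, the same client can make a plain completion call. The sketch below reuses the client configured above, omits the conversation ID to start a fresh session, and assumes non-streamed responses behave as they do in the standard OpenAI API:
// Non-streaming sketch, reusing the client from the example above.
async function ask() {
  const completion = await client.chat.completions.create({
    model: 'nlx', // omit the conversation ID to start a new session
    messages: [{ role: 'user', content: 'How do I return a product?' }],
  });
  console.log(completion.choices[0]?.message?.content);
}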
LangChain (Python)
Easily plug an NLX App into a LangChain workflow.
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(
    openai_api_key="YOUR_NLX_API_KEY",
    openai_api_base="https://apps.nlx.ai/v1/xxxx/xxxx-en-US",
    model_name="nlx:61d4735d-d67b-4496-8e41-3094b7c5e1d5"
)

response = llm.invoke("How do I return a product?")
print(response.content)
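Because conversation state travels in the model string, reusing the same model_name across invoke calls keeps the exchange in the same NLX conversation, while switching it to plain "nlx" starts a new session and lets the API generate a fresh conversation ID.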

