LLM services

Effortlessly set up a Large Language Model service in your NLX workspace


LLMs (Large Language Models) are AI services that comprehend human language and generate responses without the need to design hardcoded messaging. They are pre-trained on massive datasets, enabling them to construct natural, appropriate responses to your users.

You may use an LLM service to handle the majority of your conversation flows with users, or to resolve multiple parameters with users via the Generative Journey node.
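
For context, the sketch below is not NLX configuration; it only illustrates the kind of request an LLM service handles behind the scenes when composing a reply to a user message. It uses the official `openai` npm package (one of the integrations listed below); the model name, prompt, and `generateReply` helper are illustrative placeholders.

```typescript
import OpenAI from "openai";

// Reads the API key from the environment; any of the providers listed below
// fills the same role in an NLX workspace.
const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function generateReply(userMessage: string): Promise<string> {
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini", // illustrative model choice
    messages: [
      { role: "system", content: "You are a helpful customer-support assistant." },
      { role: "user", content: userMessage },
    ],
  });
  // The model composes a natural-language reply — no hardcoded messaging required.
  return completion.choices[0].message.content ?? "";
}
```
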

NLX provides instructions for the following LLM integrations:

Amazon Bedrock
Anthropic
Azure OpenAI
Cerebras
Cohere
Google Vertex AI
Groq
NVIDIA
OpenAI
xAI