Custom NLP

Quickly set up your custom NLP in your NLX workspace

Why use a custom NLP?

Off-the-shelf NLP providers, such as Amazon Lex or Google Dialogflow, may not provide the functionality a business requires for handling conversations with its users. Large enterprises often turn to tailored solutions built on a custom NLP model.


How does NLX handle my custom NLP?

Just as Amazon Lex or Google Dialogflow provide APIs to ensure their NLPs are compatible with conversational AI builders, NLX provides you with simple API specifications that allow your custom NLP to handle the same actions. These include the build and deployment of an application as well as essential conversation runtime operations, such as disambiguating a user utterance.

Just as with off-the-shelf NLPs, you can test utterances through our automated test suite, architect and build flows using the Canvas builder, and track conversation performance with analytics when using a custom NLP.

Once your custom NLP's API is made compatible and integrated in NLX, you can select it as the engine of choice when deploying your application.


Create compatible API

Your custom NLP's API interfaces with NLX to build your conversation flows and process user utterances.

Integration architecture

Before you can link your custom NLP in NLX, its API must be configured using our specification:

POST /disambiguate
Disambiguate unstructured text using NLP

Authorization: x-api-key header

Body parameters:

  • buildId (string, optional)

  • utterance (string, optional)

  • languageCode (string, optional): en-US, es-ES, etc.

  • context (object, optional): context attributes available in the NLX conversation

Responses: 200 Successful operation (application/json)

Example request:
POST /disambiguate HTTP/1.1
Host: 
x-api-key: YOUR_API_KEY
Content-Type: application/json
Accept: */*
Content-Length: 193

{
  "buildId": "text",
  "utterance": "text",
  "languageCode": "text",
  "state": {
    "intentId": "text",
    "slotToElicit": {
      "intentId": "text",
      "slotId": "text",
      "slotType": "text"
    },
    "conversationId": "text"
  },
  "context": {}
}
Example response:

{
  "intentId": "text",
  "slots": [
    {
      "slotId": "text",
      "value": "text"
    }
  ],
  "sentiment": "positive",
  "confidenceScore": 1
}
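
For illustration only, the sketch below shows one way a /disambiguate handler could be implemented. It assumes a Python/Flask service; match_intent is a hypothetical helper that runs your model against the cached build metadata. Field names mirror the request and response examples above.

# A minimal sketch of a /disambiguate handler (Flask assumed; match_intent is a
# hypothetical helper that runs your model against the cached build metadata).
from flask import Flask, abort, jsonify, request

app = Flask(__name__)
API_KEY = "YOUR_API_KEY"  # compared against the x-api-key request header

@app.post("/disambiguate")
def disambiguate():
    if request.headers.get("x-api-key") != API_KEY:
        abort(401)
    body = request.get_json(force=True)
    # Run your custom model over the utterance for the given build and language.
    prediction = match_intent(
        build_id=body.get("buildId"),
        utterance=body.get("utterance"),
        language_code=body.get("languageCode"),
        context=body.get("context", {}),
    )
    # Respond with the fields NLX expects, mirroring the example response above.
    return jsonify({
        "intentId": prediction.intent_id,
        "slots": [{"slotId": s.slot_id, "value": s.value} for s in prediction.slots],
        "sentiment": prediction.sentiment,         # e.g. "positive"
        "confidenceScore": prediction.confidence,  # between 0 and 1
    })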
POST /builds
Create a new NLP build

Authorization: x-api-key header

Body parameters:

  • buildId (string, optional)

  • signedUrl (string, optional)

Responses: 200 Successful operation (application/json)

Example request:
POST /builds HTTP/1.1
Host: 
x-api-key: YOUR_API_KEY
Content-Type: application/json
Accept: */*
Content-Length: 37

{
  "buildId": "text",
  "signedUrl": "text"
}
Example response:

{
  "buildId": "text",
  "status": "built",
  "errorMessage": "text"
}

The signedUrl attribute in the request body refers to a presigned S3 URL for accessing the bot's build metadata. The URL expires in 5 minutes, which should suffice for downloading and caching the metadata within your custom NLP. Your custom NLP may cache and use the build metadata for disambiguation requests (see the sketch after the list below). The artifact is a zipped file containing the following top-level files and directories:

  1. intents - A directory with information about the bot's intents.

    • Includes a sub-directory for each languageCode.

    • Within each languageCode, there is one JSON file per intent containing metadata such as utterances, intentId, and slots, with each utterance translated to the respective languageCode.

  2. slotTypes - A directory with information about the bot's slots.

    • Includes a sub-directory for each languageCode.

    • Within each languageCode, there is one JSON file per slot with metadata like values and synonyms translated to the relevant languageCode.

  3. manifest.json - A JSON file with metadata about the build, including attributes like botId, buildId, supported languageCodes and createdTimestamp.
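
As a rough sketch (Python assumed), the snippet below downloads the artifact from the signedUrl, unzips it in memory, and caches the intents, slot types, and manifest keyed by buildId. Your POST /builds handler could call a function like this and return its result.

# A sketch of downloading and caching a build artifact (Python assumed).
import io
import json
import urllib.request
import zipfile

BUILD_CACHE = {}  # buildId -> cached metadata and status

def cache_build(build_id, signed_url):
    # The presigned URL expires in 5 minutes, so download the artifact right away.
    with urllib.request.urlopen(signed_url) as response:
        archive = zipfile.ZipFile(io.BytesIO(response.read()))
    manifest = json.loads(archive.read("manifest.json"))
    intents, slot_types = {}, {}
    for name in archive.namelist():
        # Expected layout: intents/<languageCode>/<intent>.json and
        # slotTypes/<languageCode>/<slotType>.json
        parts = name.split("/")
        if len(parts) == 3 and name.endswith(".json"):
            top, language_code, filename = parts
            data = json.loads(archive.read(name))
            if top == "intents":
                intents.setdefault(language_code, {})[filename[:-5]] = data
            elif top == "slotTypes":
                slot_types.setdefault(language_code, {})[filename[:-5]] = data
    BUILD_CACHE[build_id] = {
        "manifest": manifest,  # botId, buildId, supported languageCodes, createdTimestamp
        "intents": intents,
        "slotTypes": slot_types,
        "status": "built",
    }
    # Shaped like the /builds response body above.
    return {"buildId": build_id, "status": "built", "errorMessage": ""}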

GET /builds/{buildId}
Retrieve status of a build

Authorization: x-api-key header

Path parameters:

  • buildId (string, required)

Responses: 200 Successful operation (application/json)

Example request:
GET /builds/{buildId} HTTP/1.1
Host: 
x-api-key: YOUR_API_KEY
Accept: */*
Example response:

{
  "buildId": "text",
  "status": "built",
  "errorMessage": "text"
}
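
Continuing the Flask sketch above, a build-status handler might simply read back the cached record; returning 404 for an unknown buildId is an assumption, not part of the specification.

# Continues the Flask sketch above (app, jsonify, abort) and the BUILD_CACHE
# populated by cache_build.
@app.get("/builds/<build_id>")
def get_build_status(build_id):
    record = BUILD_CACHE.get(build_id)
    if record is None:
        abort(404)  # 404 for unknown builds is an assumption, not part of the spec
    return jsonify({
        "buildId": build_id,
        "status": record["status"],  # e.g. "built"
        "errorMessage": record.get("errorMessage", ""),
    })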
PUT /builds/{buildId}
Update the deployment status of a build

Authorization: x-api-key header

Path parameters:

  • buildId (string, required)

Body parameters:

  • action (string enum, optional)

Responses: 200 Successful operation (application/json)

Example request:
PUT /builds/{buildId} HTTP/1.1
Host: 
x-api-key: YOUR_API_KEY
Content-Type: application/json
Accept: */*
Content-Length: 19

{
  "action": "deploy"
}
Example response:

{
  "buildId": "text",
  "status": "built",
  "errorMessage": "text"
}
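
Finally, a sketch of the deployment update, again continuing the Flask example; the deploy action comes from the example request above, and any other action values or deployment semantics are assumptions.

# Continues the Flask sketch above. "deploy" comes from the example request;
# other actions and the exact deployment behavior are assumptions.
@app.put("/builds/<build_id>")
def update_build(build_id):
    if request.headers.get("x-api-key") != API_KEY:
        abort(401)
    action = request.get_json(force=True).get("action")
    record = BUILD_CACHE.get(build_id)
    if record is None:
        abort(404)  # assumption, as above
    if action == "deploy":
        record["deployed"] = True  # make this build available to /disambiguate
    return jsonify({
        "buildId": build_id,
        "status": record["status"],
        "errorMessage": record.get("errorMessage", ""),
    })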

Download full OpenAPI Specification:

You can import and visualize the NLX OpenAPI specifications using Swagger Editor.


Integrate API

Once your API is ready for integration, navigate to Integrations in your workspace menu:

  • Click + Add integration > Choose Custom NLP from the dropdown

  • Provide your integration a name

  • Enter both your Endpoint URL and API key

  • Select Create integration
