# Models

## The CreateChatCompletionRequest object

```json
{
  "openapi": "3.0.3",
  "info": { "title": "NLX OpenAI Connect API", "version": "1.0.0" },
  "components": {
    "schemas": {
      "CreateChatCompletionRequest": {
        "type": "object",
        "required": ["model", "messages"],
        "properties": {
          "model": {
            "type": "string",
            "description": "Used to pass the session identifier. Expected format: \"nlx:{conversationId}\". If the conversationId is omitted (e.g., just \"nlx\"), a new UUID will be generated."
          },
          "messages": {
            "type": "array",
            "description": "A list of messages comprising the conversation so far.",
            "items": { "$ref": "#/components/schemas/ChatCompletionMessageParam" }
          },
          "stream": {
            "type": "boolean",
            "default": true,
            "description": "If set, partial message deltas will be sent using Server-Sent Events. This API is optimized for streaming."
          }
        }
      }
    }
  }
}
```
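For example, a minimal request body that starts a new conversation by passing just `"nlx"` as the model (so the server generates a fresh conversation UUID); the message content is illustrative:

```json
{
  "model": "nlx",
  "messages": [
    { "role": "user", "content": "Hello" }
  ],
  "stream": true
}
```

To continue an existing session, replace `"nlx"` with `"nlx:{conversationId}"`, using the conversation identifier from the earlier exchange.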

## The ChatCompletionMessageParam object

```json
{
  "ChatCompletionMessageParam": {
    "type": "object",
    "required": ["role", "content"],
    "properties": {
      "role": {
        "type": "string",
        "enum": ["system", "user", "assistant"],
        "description": "The role of the message's author."
      },
      "content": {
        "type": "string",
        "description": "The contents of the message."
      }
    }
  }
}
```
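A typical `messages` array mixes the three allowed roles; the contents below are illustrative:

```json
[
  { "role": "system", "content": "You are a helpful assistant." },
  { "role": "user", "content": "Hello" },
  { "role": "assistant", "content": "Hi! How can I help you today?" }
]
```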

## The ChatCompletionChunk object

```json
{
  "ChatCompletionChunk": {
    "type": "object",
    "description": "Represents a streamed chunk of a chat completion response.",
    "properties": {
      "id": {
        "type": "string",
        "description": "A unique identifier for the chat completion."
      },
      "object": {
        "type": "string",
        "enum": ["chat.completion.chunk"],
        "description": "The object type, which is always \"chat.completion.chunk\"."
      },
      "created": {
        "type": "integer",
        "description": "The Unix timestamp (in seconds) of when the chat completion was created."
      },
      "model": {
        "type": "string",
        "description": "The model (deployment key) used for the completion."
      },
      "system_fingerprint": {
        "type": "string",
        "description": "A fingerprint representing the backend system state (process.env.TAG)."
      },
      "choices": {
        "type": "array",
        "description": "A list of chat completion choices.",
        "items": { "$ref": "#/components/schemas/ChatCompletionChunkChoice" }
      }
    }
  }
}
```
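Each chunk arrives as the payload of a Server-Sent Events `data:` line. A mid-stream chunk might look like the following (all field values here are illustrative, not actual server output):

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion.chunk",
  "created": 1700000000,
  "model": "nlx",
  "system_fingerprint": "v1.0.0",
  "choices": [
    {
      "index": 0,
      "delta": { "role": "assistant", "content": "Hello! How can I help?" },
      "logprobs": null,
      "finish_reason": null
    }
  ]
}
```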

## The ChatCompletionChunkChoice object

```json
{
  "ChatCompletionChunkChoice": {
    "type": "object",
    "properties": {
      "index": {
        "type": "integer",
        "description": "The index of the choice in the list of choices."
      },
      "delta": { "$ref": "#/components/schemas/ChatCompletionChunkDelta" },
      "logprobs": {
        "type": "object",
        "nullable": true,
        "description": "Log probability information for the choice (null in this implementation)."
      },
      "finish_reason": {
        "type": "string",
        "nullable": true,
        "enum": ["stop", "length", "content_filter", null],
        "description": "The reason the model stopped generating tokens. \"stop\" means the model hit a natural stop point or a provided stop sequence."
      }
    }
  }
}
```
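While the stream is in progress, `finish_reason` is `null`; the final chunk's choice typically carries an empty delta and a terminal reason, for example:

```json
{
  "index": 0,
  "delta": {},
  "logprobs": null,
  "finish_reason": "stop"
}
```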

## The ChatCompletionChunkDelta object

```json
{
  "ChatCompletionChunkDelta": {
    "type": "object",
    "description": "A chat completion delta generated by streamed model responses.",
    "properties": {
      "role": {
        "type": "string",
        "enum": ["system", "user", "assistant"],
        "description": "The role of the author of this message."
      },
      "content": {
        "type": "string",
        "description": "The contents of the chunk message. In this implementation, this is the aggregated text from the NLX app messages."
      }
    }
  }
}
```
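The `role` field generally appears only in the first delta of a stream; subsequent deltas carry just the text. An illustrative content-only delta:

```json
{ "content": "Thanks for reaching out." }
```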
