chat_completion

Classes:

| Name | Description |
| --- | --- |
| ChatCompletion | A chat completion response returned by the model. |
| Choice | A single chat completion choice. |
| ChoiceLogprobs | Log probability information for a choice. |

ChatCompletion

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| choices | List[Choice] | A list of chat completion choices. |
| created | int | The Unix timestamp (in seconds) of when the chat completion was created. |
| id | str | A unique identifier for the chat completion. |
| model | str | The model used for the chat completion. |
| object | Literal['chat.completion'] | The object type, which is always chat.completion. |
| system_fingerprint | Optional[str] | This fingerprint represents the backend configuration that the model runs with. |
| usage | Optional[CompletionUsage] | Usage statistics for the completion request. |

choices instance-attribute

choices: List[Choice]

A list of chat completion choices.

Can be more than one if n is greater than 1.

created instance-attribute

created: int

The Unix timestamp (in seconds) of when the chat completion was created.

id instance-attribute

id: str

A unique identifier for the chat completion.

model instance-attribute

model: str

The model used for the chat completion.

object instance-attribute

object: Literal['chat.completion']

The object type, which is always chat.completion.

system_fingerprint class-attribute instance-attribute

system_fingerprint: Optional[str] = None

This fingerprint represents the backend configuration that the model runs with.

Can be used in conjunction with the seed request parameter to understand when backend changes have been made that might impact determinism.

usage class-attribute instance-attribute

usage: Optional[CompletionUsage] = None

Usage statistics for the completion request.
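The sketch below is a minimal example, assuming the openai Python client with an API key available in the environment, of how these ChatCompletion fields are typically read after a request; the model name gpt-4o-mini and the seed value are only illustrative.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

completion = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello in one word."}],
    seed=1234,  # optional; pair with system_fingerprint to reason about determinism
)

print(completion.id)                  # unique identifier for this completion
print(completion.created)             # Unix timestamp (seconds)
print(completion.model)               # model that produced the response
print(completion.object)              # always "chat.completion"
print(completion.system_fingerprint)  # backend configuration fingerprint; may be None
if completion.usage is not None:
    print(completion.usage.prompt_tokens,
          completion.usage.completion_tokens,
          completion.usage.total_tokens)
```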

Choice

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| finish_reason | Literal['stop', 'length', 'tool_calls', 'content_filter', 'function_call'] | The reason the model stopped generating tokens. |
| index | int | The index of the choice in the list of choices. |
| logprobs | Optional[ChoiceLogprobs] | Log probability information for the choice. |
| message | ChatCompletionMessage | A chat completion message generated by the model. |

finish_reason instance-attribute

finish_reason: Literal[
    "stop",
    "length",
    "tool_calls",
    "content_filter",
    "function_call",
]

The reason the model stopped generating tokens.

This will be stop if the model hit a natural stop point or a provided stop sequence, length if the maximum number of tokens specified in the request was reached, content_filter if content was omitted due to a flag from our content filters, tool_calls if the model called a tool, or function_call (deprecated) if the model called a function.
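Continuing from the sketch above, one hedged way to branch on finish_reason when handling a response:

```python
choice = completion.choices[0]

if choice.finish_reason == "length":
    # Generation stopped because the requested token limit was reached.
    print("Truncated output:", choice.message.content)
elif choice.finish_reason == "tool_calls":
    # The model asked to call one or more tools instead of replying directly.
    for tool_call in choice.message.tool_calls or []:
        print("Tool requested:", tool_call.function.name, tool_call.function.arguments)
elif choice.finish_reason == "content_filter":
    print("Content was omitted by the content filter.")
else:
    # "stop" (natural stop point or stop sequence) or the deprecated "function_call".
    print(choice.message.content)
```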

index instance-attribute

index: int

The index of the choice in the list of choices.

logprobs class-attribute instance-attribute

logprobs: Optional[ChoiceLogprobs] = None

Log probability information for the choice.

message instance-attribute

message: ChatCompletionMessage

A chat completion message generated by the model.
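As a small illustrative sketch (reusing the client and model name assumed earlier), the message on each choice is the assistant's reply and can be appended to the running conversation for a follow-up turn:

```python
messages = [{"role": "user", "content": "Say hello in one word."}]
completion = client.chat.completions.create(model="gpt-4o-mini", messages=messages)

assistant_message = completion.choices[0].message
print(assistant_message.role)     # "assistant"
print(assistant_message.content)  # generated text; may be None when only tool calls are returned

# Feed the reply back as conversation history for the next request.
messages.append({"role": "assistant", "content": assistant_message.content})
messages.append({"role": "user", "content": "Now say it in French."})
follow_up = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(follow_up.choices[0].message.content)
```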

ChoiceLogprobs

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| content | Optional[List[ChatCompletionTokenLogprob]] | A list of message content tokens with log probability information. |

content class-attribute instance-attribute

content: Optional[List[ChatCompletionTokenLogprob]] = None

A list of message content tokens with log probability information.
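Log probabilities are only populated when requested. A minimal sketch, assuming the same client as above and the logprobs/top_logprobs request parameters, of reading choice.logprobs.content:

```python
completion = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello in one word."}],
    logprobs=True,
    top_logprobs=2,
)

choice = completion.choices[0]
if choice.logprobs is not None and choice.logprobs.content is not None:
    for token_logprob in choice.logprobs.content:
        # Each entry carries the sampled token, its log probability,
        # and the requested number of most likely alternatives.
        alternatives = [(alt.token, alt.logprob) for alt in token_logprob.top_logprobs]
        print(token_logprob.token, token_logprob.logprob, alternatives)
```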