maxTokens property

int? maxTokens
getter/setter pair

The maximum number of tokens to generate in the completion. The token count of your prompt plus maxTokens cannot exceed the model's context length. Most models have a context length of 2048 tokens (except for the newest models, which support 4096).

Minimum value: 0
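
For illustration only, a minimal sketch of setting this property. The CompletionRequest class and its prompt field below are assumptions made for the example and are not part of this page; only the maxTokens field documented here is taken from the source.

// Hypothetical request class for illustration; only maxTokens comes from this page.
class CompletionRequest {
  CompletionRequest({this.prompt, this.maxTokens});

  /// Hypothetical prompt field, included so the example is self-contained.
  String? prompt;

  /// The maximum number of tokens to generate in the completion.
  /// Must be >= 0; prompt tokens + maxTokens must fit within the model's context length.
  int? maxTokens;
}

void main() {
  final request = CompletionRequest(prompt: 'Say this is a test');

  // Leave maxTokens null to use the service default, or set a cap explicitly.
  request.maxTokens = 256;

  print(request.maxTokens); // 256
}

Because the field is a getter/setter pair, it can be supplied in the constructor or assigned afterwards, as shown above; leaving it null defers the limit to the service's default.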

Implementation

int? maxTokens;