maxTokens property

@BuiltValueField(wireName: r'max_tokens')
int? maxTokens

The maximum number of tokens to generate in the completion. The token count of your prompt plus max_tokens cannot exceed the model's context length. Most models have a context length of 2048 tokens (except for the newest models, which support 4096).
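A minimal sketch of how this field ends up on the wire. The CompletionRequest class below is illustrative only (it is not the generated built_value class); it just shows the maxTokens property serializing under the max_tokens wire name declared by @BuiltValueField.

```dart
// Illustrative stand-in for the generated request class; field names and
// endpoint shape are assumptions, not the library's actual API.
class CompletionRequest {
  final String model;
  final String prompt;
  final int? maxTokens;

  CompletionRequest({
    required this.model,
    required this.prompt,
    this.maxTokens,
  });

  Map<String, dynamic> toJson() => {
        'model': model,
        'prompt': prompt,
        // Serialized as `max_tokens`, matching
        // @BuiltValueField(wireName: r'max_tokens').
        if (maxTokens != null) 'max_tokens': maxTokens,
      };
}

void main() {
  final request = CompletionRequest(
    model: 'text-davinci-003',
    prompt: 'Say this is a test',
    // Prompt tokens + maxTokens must stay within the model's context length.
    maxTokens: 16,
  );
  print(request.toJson()); // {model: ..., prompt: ..., max_tokens: 16}
}
```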

Implementation

@BuiltValueField(wireName: r'max_tokens')
int? get maxTokens;