OpenAIOptions class

Options to pass into the OpenAI LLM.

For available models, check the provider's documentation.

Annotations
  • @immutable

Constructors

OpenAIOptions.new({String? model, int? bestOf, double? frequencyPenalty, Map<String, int>? logitBias, int? logprobs, int? maxTokens, int? n, double? presencePenalty, int? seed, List<String>? stop, String? suffix, double? temperature, double? topP, String? user, int concurrencyLimit = 1000})
Options to pass into the OpenAI LLM.
const
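As a sketch, the options can be constructed like this (the model ID and parameter values below are illustrative, not defaults):

```dart
import 'package:langchain_openai/langchain_openai.dart';

// Illustrative values; 'gpt-3.5-turbo-instruct' is assumed to be a valid
// completions model ID for your account.
const options = OpenAIOptions(
  model: 'gpt-3.5-turbo-instruct',
  temperature: 0.7, // moderately random sampling
  maxTokens: 256,   // cap the completion length
  stop: ['\n\n'],   // stop at the first blank line
);
```

Because the constructor is `const`, the options object can be built at compile time and reused across calls.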

Properties

bestOf → int?
Generates best_of completions server-side and returns the "best" (the one with the highest log probability per token).
final
concurrencyLimit → int
The maximum number of concurrent calls that the runnable can make. Defaults to 1000 (different Runnable types may have different defaults).
final, inherited
frequencyPenalty → double?
Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
final
hashCode → int
The hash code for this object.
no setter
logitBias → Map<String, int>?
Modify the likelihood of specified tokens appearing in the completion.
final
logprobs → int?
Include the log probabilities on the logprobs most likely tokens, as well as the chosen tokens. For example, if logprobs is 5, the API will return a list of the 5 most likely tokens. The API will always return the logprob of the sampled token, so there may be up to logprobs+1 elements in the response.
final
maxTokens → int?
The maximum number of tokens to generate in the completion.
final
model → String?
ID of the language model to use. Check the provider's documentation for available models.
final, inherited
n → int?
How many completions to generate for each prompt.
final
presencePenalty → double?
Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
final
runtimeType → Type
A representation of the runtime type of the object.
no setter, inherited
seed → int?
If specified, our system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result.
final
stop → List<String>?
Up to 4 sequences where the API will stop generating further tokens. The returned text will not contain the stop sequence.
final
suffix → String?
The suffix that comes after a completion of inserted text.
final
temperature → double?
What sampling temperature to use, between 0 and 2. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic.
final
topP → double?
An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered.
final
user → String?
A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
final
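A sketch combining several of the sampling-related properties above. The token ID '50256' is the GPT `<|endoftext|>` token, used here purely as an illustration; in the OpenAI API, bias values range from -100 to 100, and -100 effectively bans a token:

```dart
import 'package:langchain_openai/langchain_openai.dart';

const opts = OpenAIOptions(
  frequencyPenalty: 0.5,      // discourage verbatim repetition
  presencePenalty: 0.6,       // nudge the model toward new topics
  logitBias: {'50256': -100}, // effectively ban this token from completions
  seed: 42,                   // best-effort deterministic sampling
);
```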

Methods

copyWith({String? model, int? bestOf, double? frequencyPenalty, Map<String, int>? logitBias, int? logprobs, int? maxTokens, int? n, double? presencePenalty, int? seed, List<String>? stop, String? suffix, double? temperature, double? topP, String? user, int? concurrencyLimit}) → OpenAIOptions
Creates a copy of this RunnableOptions with the given fields replaced by the new values.
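For instance, copyWith can derive a variant from a base configuration without mutating it (the values are illustrative):

```dart
import 'package:langchain_openai/langchain_openai.dart';

const base = OpenAIOptions(model: 'gpt-3.5-turbo-instruct', temperature: 0.2);
final creative = base.copyWith(temperature: 1.0);
// `creative` keeps the model from `base` but samples at a higher temperature.
```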
merge(covariant OpenAIOptions? other) → OpenAIOptions
Merges this RunnableOptions with another RunnableOptions.
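A sketch of merging, assuming the usual convention that the non-null fields of other take precedence over this instance's fields:

```dart
import 'package:langchain_openai/langchain_openai.dart';

const defaults = OpenAIOptions(model: 'gpt-3.5-turbo-instruct', temperature: 0.2);
const perCall = OpenAIOptions(temperature: 0.9);
final merged = defaults.merge(perCall);
// Under the assumed semantics: model comes from `defaults`,
// temperature from `perCall`.
```

This pattern lets shared defaults be defined once and overridden per call.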
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.
inherited

Operators

operator ==(covariant OpenAIOptions other) → bool
The equality operator.