GroqChatSettings class
Available extensions: GroqChatSettingsExtension
Constructors

- GroqChatSettings({double temperature = 1.0, int maxTokens = 8192, double topP = 1.0, String? stop, int maxConversationalMemoryLength = 1024})

  GroqChatSettings constructor.

  - temperature: controls the randomness of responses.
  - maxTokens: the maximum number of tokens that can be generated in the chat completion.
  - topP: a method of text generation where the model considers only the most probable next tokens that together make up the probability p.
  - stream: uses server-sent events to send the completion in small deltas rather than in a single batch after all processing has finished.
  - choicesCount: how many chat completion choices to generate for each input message.
  - stop: a stop sequence is a predefined or user-specified text string that signals the AI to stop generating content.
  - maxConversationalMemoryLength: the number of previous messages to include in the model's context. A higher value results in more context-aware responses. Default: 1024.

  Throws an assertion error if temperature is not between 0.0 and 2.0, maxTokens is less than or equal to 0, topP is not between 0.0 and 1.0, or choicesCount is less than or equal to 0.

  Default values: temperature: 1.0, maxTokens: 8192, topP: 1.0, stream: false, choicesCount: 1, stop: null, maxConversationalMemoryLength: 1024.

- GroqChatSettings.defaults()

  Default GroqChatSettings constructor. Returns a GroqChatSettings object with the default values: temperature: 1.0, maxTokens: 8192, topP: 1.0, stream: false, choicesCount: 1, stop: null, maxConversationalMemoryLength: 1024.

  const
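A brief usage sketch of the two constructors above. The parameter names, defaults, and assertion ranges are taken from the signatures documented on this page; the import path is an assumption based on the package name:

```dart
import 'package:groq_sdk/groq_sdk.dart'; // assumed import path

// Custom settings: a lower temperature gives more predictable replies.
final focused = GroqChatSettings(
  temperature: 0.3, // asserted to be between 0.0 and 2.0
  maxTokens: 1024,  // asserted to be > 0
  topP: 0.9,        // asserted to be between 0.0 and 1.0
  stop: '###',      // optional stop sequence
  maxConversationalMemoryLength: 20,
);

// Library defaults: temperature 1.0, maxTokens 8192, topP 1.0,
// stream false, choicesCount 1, stop null, memory length 1024.
const defaults = GroqChatSettings.defaults();
```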
Properties

- hashCode → int

  The hash code for this object.
  no setter, inherited

- maxConversationalMemoryLength → int

  Conversational memory length. The number of previous messages to include in the model's context. A higher value will result in more context-aware responses.
  final

- maxTokens → int

  The maximum number of tokens that can be generated in the chat completion. The total length of input tokens and generated tokens is limited by the model's context length. Default: 8192.
  final

- runtimeType → Type

  A representation of the runtime type of the object.
  no setter, inherited
- stop → String?

  A stop sequence is a predefined or user-specified text string that signals an AI to stop generating content, ensuring its responses remain focused and concise. Default: null.
  final

- temperature → double

  Controls the randomness of responses. A lower temperature leads to more predictable outputs, while a higher temperature results in more varied and sometimes more creative outputs. Default: 1.0.
  final

- topP → double

  A method of text generation where a model will only consider the most probable next tokens that make up the probability p. 0.5 means half of all likelihood-weighted options are considered. Default: 1.0.
  final
Methods

- copyWith({double? temperature, int? maxTokens, double? topP, bool? stream, String? stop, int? choicesCount, int? maxConversationalMemoryLength}) → GroqChatSettings

  Returns a copy of the current GroqChatSettings object with the new values.

  - temperature: controls the randomness of responses.
  - maxTokens: the maximum number of tokens that can be generated in the chat completion.
  - topP: a method of text generation where the model considers only the most probable next tokens that together make up the probability p.
  - stream: uses server-sent events to send the completion in small deltas rather than in a single batch after all processing has finished.
  - choicesCount: how many chat completion choices to generate for each input message.
  - stop: a stop sequence is a predefined or user-specified text string that signals the AI to stop generating content.
  - maxConversationalMemoryLength: the number of previous messages to include in the model's context. A higher value results in more context-aware responses.

- noSuchMethod(Invocation invocation) → dynamic

  Invoked when a nonexistent method or property is accessed.
  inherited

- toJson() → Map<String, dynamic>

  Available on GroqChatSettings, provided by the GroqChatSettingsExtension extension.

- toString() → String

  A string representation of this object.
  inherited
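A sketch of the two non-inherited methods above, copyWith and the extension-provided toJson. Only the fields passed to copyWith change; everything else is copied from the receiver (the serialized key names are not specified on this page, so none are assumed here):

```dart
final base = GroqChatSettings.defaults();

// Derive a variant, overriding only the fields you pass.
final creative = base.copyWith(temperature: 1.5, choicesCount: 2);

// Serialize, e.g. for a request body.
// Provided by the GroqChatSettingsExtension extension.
final Map<String, dynamic> json = creative.toJson();
```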
Operators

- operator ==(Object other) → bool

  The equality operator.
  inherited