GroqChatSettings constructor

GroqChatSettings({
  double temperature = 1.0,
  int maxTokens = 8192,
  double topP = 1.0,
  String? stop,
  int maxConversationalMemoryLength = 1024,
})

temperature controls the randomness of responses.
maxTokens the maximum number of tokens that can be generated in the chat completion.
topP nucleus sampling: the model considers only the most probable next tokens whose cumulative probability adds up to topP.
stop a stop sequence is a predefined or user-specified text string that signals the model to stop generating content.
maxConversationalMemoryLength conversational memory length: the number of previous messages to include in the model's context. A higher value results in more context-aware responses. Default: 1024
Throws an assertion error if temperature is not between 0.0 and 2.0, maxTokens is less than or equal to 0, topP is not between 0.0 and 1.0, or maxConversationalMemoryLength is less than or equal to 0. Default values: temperature: 1.0, maxTokens: 8192, topP: 1.0, stop: null, maxConversationalMemoryLength: 1024.
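A minimal usage sketch (the values below are illustrative, not recommendations):

final settings = GroqChatSettings(
  temperature: 0.7,    // slightly less random than the default
  maxTokens: 2048,     // cap the completion length
  topP: 0.9,           // nucleus sampling cutoff
  stop: '###',         // hypothetical stop sequence
  maxConversationalMemoryLength: 10, // keep the last 10 messages in context
);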

Implementation

GroqChatSettings({
  this.temperature = 1.0,
  this.maxTokens = 8192,
  this.topP = 1.0,
  this.stop,
  this.maxConversationalMemoryLength = 1024,
}) {
  assert(temperature >= 0.0 && temperature <= 2.0,
      'Temperature must be between 0.0 and 2.0');
  assert(maxTokens > 0, 'Max tokens must be greater than 0');
  assert(topP >= 0.0 && topP <= 1.0, 'Top P must be between 0.0 and 1.0');
  assert(maxConversationalMemoryLength > 0,
      'Max conversational memory length must be greater than 0');
  // assert(choicesCount > 0, 'Choices count must be greater than 0');
}
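Note that the validation uses assert, so it only runs when Dart assertions are enabled (debug mode, or the VM's --enable-asserts flag); in release builds invalid values pass through unchecked. For example, this hypothetical call trips the first assertion in a debug build:

// AssertionError: 'Temperature must be between 0.0 and 2.0'
final invalid = GroqChatSettings(temperature: 3.0);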