OllamaConfig class

Configuration for Ollama LLM calls.

Constructors

OllamaConfig({required String baseUrl, required String model, double? temperature, double? topP, int? maxTokens, bool stream = false, Map<String, dynamic> additionalParameters = const {}, int timeoutSeconds = 120})
const
OllamaConfig.local({required String model, double? temperature, double? topP, int? maxTokens, Map<String, dynamic> additionalParameters = const {}, int timeoutSeconds = 120})
Creates a default configuration for a local Ollama instance.
factory
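A minimal usage sketch based on the constructor signatures above. The import path is an assumption; adjust it to the actual package name.

```dart
// Hypothetical import path -- substitute the real package.
import 'package:ollama_config/ollama_config.dart';

void main() {
  // Full constructor: point at any reachable Ollama server.
  const remote = OllamaConfig(
    baseUrl: 'http://192.168.1.10:11434',
    model: 'mistral',
    temperature: 0.7,
    maxTokens: 512,
  );

  // Factory for a local instance: only the model is required,
  // so baseUrl presumably defaults to the standard local endpoint.
  final local = OllamaConfig.local(model: 'llama2', temperature: 0.2);

  print(remote.model); // mistral
  print(local.timeoutSeconds); // 120 (the documented default)
}
```

Note that the default constructor is `const`, so a configuration can be built at compile time when all arguments are constants.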

Properties

additionalParameters Map<String, dynamic>
Additional parameters to pass to the Ollama API.
final
baseUrl String
Base URL for the Ollama API (e.g., 'http://localhost:11434').
final
hashCode int
The hash code for this object.
no setter, override
maxTokens int?
Maximum number of tokens to generate.
final
model String
Model name to use (e.g., 'llama2', 'mistral', 'codellama').
final
runtimeType Type
A representation of the runtime type of the object.
no setter, inherited
stream bool
Whether to stream the response (not supported in this implementation).
final
temperature double?
Temperature for controlling randomness (0.0 to 1.0).
final
timeoutSeconds int
Request timeout in seconds.
final
topP double?
Top-p for nucleus sampling (0.0 to 1.0).
final

Methods

noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toApiRequest(String prompt) → Map<String, dynamic>
Converts the configuration to a JSON object for API requests.
toMap() → Map<String, dynamic>
Returns configuration information as a map.
toString() → String
A string representation of this object.
override
validate() → void
Validates the configuration parameters.
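A sketch of the request-building flow implied by validate and toApiRequest. The exact validation behavior and the endpoint path are assumptions, not documented here.

```dart
import 'dart:convert';

// Hypothetical import path -- substitute the real package.
import 'package:ollama_config/ollama_config.dart';

void main() {
  final config = OllamaConfig.local(model: 'codellama', temperature: 0.5);

  // validate() checks the configuration parameters; given the documented
  // ranges, it plausibly rejects temperature or topP outside 0.0-1.0
  // (assumption: it throws on invalid values).
  config.validate();

  // Build the request body for a prompt, then serialize it.
  final body = config.toApiRequest('Write a Dart hello world');
  final json = jsonEncode(body);

  // POST `json` to '${config.baseUrl}/api/generate' with your HTTP
  // client of choice (endpoint path is an assumption from the
  // public Ollama REST API, not from this class's docs).
  print(json);
}
```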

Operators

operator ==(Object other) → bool
The equality operator.
override
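Since ==, hashCode, and toString are all overridden, the class plausibly implements value equality; a hedged sketch of what that would mean in practice:

```dart
// Hypothetical import path -- substitute the real package.
import 'package:ollama_config/ollama_config.dart';

void main() {
  final a = OllamaConfig.local(model: 'llama2');
  final b = OllamaConfig.local(model: 'llama2');

  // Assumption: the overridden == compares field values, so two
  // configs built from identical parameters compare equal, and
  // their hash codes agree (required by Dart's hashCode contract).
  print(a == b);
  print(a.hashCode == b.hashCode);
}
```

Value equality would make OllamaConfig safe to use as a map key or set member, which is a common reason for overriding both members together.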