providers/ollama/ollama library

Modular Ollama Provider

This library provides a modular implementation of the Ollama provider, with each capability (chat, completion, embeddings, model listing) implemented as a separate module.

Key Benefits:

  • Single Responsibility: Each module handles one capability
  • Easier Testing: Modules can be tested independently
  • Better Maintainability: Changes isolated to specific modules
  • Cleaner Code: Smaller, focused classes
  • Reusability: Modules can be reused across providers
  • Local Deployment: Designed for local Ollama instances

Usage:

import 'package:llm_dart/providers/ollama/ollama.dart';

final provider = OllamaProvider(OllamaConfig(
  baseUrl: 'http://localhost:11434',
  model: 'llama3.2',
));

// Use chat capability (messages is your existing list of chat messages)
final response = await provider.chat(messages);

// Use completion capability
final completion = await provider.complete(CompletionRequest(prompt: 'Hello'));

// Use embeddings capability
final embeddings = await provider.embed(['text to embed']);

// List available models
final models = await provider.models();
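For single-capability setups, the convenience constructors listed under Functions below avoid building an OllamaConfig by hand. A minimal sketch, using only the defaults and parameters shown in those signatures (the system prompt and temperature values here are illustrative):

```dart
import 'package:llm_dart/providers/ollama/ollama.dart';

// Chat-focused provider; defaults to llama3.2 at http://localhost:11434.
final chatProvider = createOllamaChatProvider(
  systemPrompt: 'You are a helpful assistant.',
  temperature: 0.7,
);

// Embedding-focused provider; defaults to the nomic-embed-text model.
final embeddingProvider = createOllamaEmbeddingProvider();
final vectors = await embeddingProvider.embed(['text to embed']);
```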

Classes

OllamaChat
Ollama chat capability implementation
OllamaChatResponse
Ollama chat response implementation
OllamaClient
Core Ollama HTTP client shared across all capability modules
OllamaCompletion
Ollama completion capability implementation
OllamaConfig
Ollama provider configuration
OllamaEmbeddings
Ollama embeddings capability implementation
OllamaModels
Ollama models capability implementation
OllamaProvider
Ollama provider implementation

Functions

createOllamaChatProvider({String baseUrl = 'http://localhost:11434', String model = 'llama3.2', String? systemPrompt, double? temperature, int? maxTokens}) OllamaProvider
Create an Ollama provider for chat
createOllamaCodeProvider({String baseUrl = 'http://localhost:11434', String model = 'codellama', String? systemPrompt, double? temperature, int? maxTokens}) OllamaProvider
Create an Ollama provider for code generation
createOllamaCompletionProvider({String baseUrl = 'http://localhost:11434', String model = 'llama3.2', double? temperature, int? maxTokens}) OllamaProvider
Create an Ollama provider for completion tasks
createOllamaEmbeddingProvider({String baseUrl = 'http://localhost:11434', String model = 'nomic-embed-text'}) OllamaProvider
Create an Ollama provider for embeddings
createOllamaProvider({String? baseUrl, String? apiKey, String? model, int? maxTokens, double? temperature, String? systemPrompt, Duration? timeout, double? topP, int? topK, List<Tool>? tools, StructuredOutputFormat? jsonSchema, int? numCtx, int? numGpu, int? numThread, bool? numa, int? numBatch, String? keepAlive, bool? raw}) OllamaProvider
Create an Ollama provider with default configuration
createOllamaVisionProvider({String baseUrl = 'http://localhost:11434', String model = 'llava', String? systemPrompt, double? temperature, int? maxTokens}) OllamaProvider
Create an Ollama provider for vision tasks
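Since createOllamaProvider exposes Ollama-specific tuning parameters (context window, GPU offloading, keep-alive), a local instance can be tuned at construction time. A sketch using parameters from the signature above; the values chosen are illustrative, and their semantics follow Ollama's own model options:

```dart
import 'package:llm_dart/providers/ollama/ollama.dart';

// A provider tuned for a local Ollama instance; values are examples only.
final provider = createOllamaProvider(
  baseUrl: 'http://localhost:11434',
  model: 'llama3.2',
  temperature: 0.2,
  numCtx: 8192, // context window size in tokens
  numGpu: 1, // number of layers to offload to the GPU
  keepAlive: '5m', // keep the model loaded between requests
);
```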