neom_ollama 1.2.0

neom_ollama: ^1.2.0

Ollama integration for Dart — model discovery, pull/delete, health checks, setup automation, OpenAI-compatible chat, hardware profiling, reasoning trace parsing, and plain-text tool-call recovery for local models that don't return structured tool_calls (Llama 3.2, Phi, Qwen, Hermes, DeepSeek).
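The "plain-text tool-call recovery" mentioned above targets local models that emit a JSON tool call inside ordinary prose instead of a structured `tool_calls` field. A minimal sketch of that idea (the function name and heuristics here are illustrative, not the package's actual API) is to scan the reply for the first balanced JSON object and accept it if it looks like a tool call:

```dart
import 'dart:convert';

/// Hypothetical sketch, not neom_ollama's real API: recover a tool call
/// from a plain-text model reply by finding the first balanced `{...}`
/// span and decoding it. Naive brace counting ignores braces inside JSON
/// strings, which is usually fine for short tool-call payloads.
Map<String, dynamic>? recoverToolCall(String reply) {
  final start = reply.indexOf('{');
  if (start == -1) return null;
  var depth = 0;
  for (var i = start; i < reply.length; i++) {
    if (reply[i] == '{') depth++;
    if (reply[i] == '}') {
      depth--;
      if (depth == 0) {
        try {
          final obj = jsonDecode(reply.substring(start, i + 1));
          // Treat it as a tool call only if it carries a "name" field.
          if (obj is Map<String, dynamic> && obj.containsKey('name')) {
            return obj;
          }
        } on FormatException {
          return null;
        }
        return null;
      }
    }
  }
  return null;
}

void main() {
  const reply = 'Sure, calling the tool now: '
      '{"name": "get_weather", "arguments": {"city": "Lisbon"}} done.';
  final call = recoverToolCall(reply);
  print(call?['name']); // get_weather
}
```

A production parser would also handle fenced ```json blocks and multiple candidate objects, but the brace-scan above captures the core recovery step.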

example/example.dart

import 'dart:io';
import 'package:neom_ollama/neom_ollama.dart';

void main() async {
  final client = OllamaClient();

  // Check status
  final status = await client.checkStatus();
  print('Ollama: $status');

  if (status != OllamaStatus.running) {
    print('Start Ollama first: ollama serve');
    return;
  }

  // List models
  final models = await client.listModels();
  for (final m in models) {
    print('  ${m.displayName} — ${m.sizeLabel}');
  }

  // Chat
  if (models.isNotEmpty) {
    final model = models.first.name;
    print('\nChatting with $model...');

    await for (final chunk in client.chatStream(model, 'Hello! Who are you?')) {
      stdout.write(chunk);
    }
    print('');
  }
}
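Reasoning trace parsing (another feature from the description) deals with models such as DeepSeek-R1 that wrap their chain of thought in `<think>...</think>` tags before the visible answer. A self-contained sketch of that split — function and record names here are illustrative assumptions, not the package's documented API:

```dart
/// Hypothetical sketch, not neom_ollama's real API: split a
/// DeepSeek-R1-style reply into its <think>...</think> reasoning trace
/// and the remaining visible answer.
({String reasoning, String answer}) parseReasoning(String reply) {
  final match = RegExp(r'<think>([\s\S]*?)</think>', caseSensitive: false)
      .firstMatch(reply);
  // No trace present: the whole reply is the answer.
  if (match == null) return (reasoning: '', answer: reply.trim());
  // Cut the tagged span out of the reply to leave only the answer.
  final answer = reply.replaceRange(match.start, match.end, '').trim();
  return (reasoning: match.group(1)!.trim(), answer: answer);
}

void main() {
  const reply = '<think>The user greets me; respond politely.</think>Hello!';
  final parsed = parseReasoning(reply);
  print(parsed.reasoning); // The user greets me; respond politely.
  print(parsed.answer);    // Hello!
}
```

Keeping the trace separate lets a UI show the answer immediately and the reasoning on demand.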
0 likes · 130 points · 107 downloads

Documentation

API reference

Publisher

openneom.dev (verified publisher)


Repository (GitHub)
View/report issues

Topics

#ai #ollama #local-inference #llm

License

unknown (license)

Dependencies

http
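Since `http` is the package's only dependency, operations like model pull/delete presumably reduce to requests against Ollama's documented REST endpoints (`POST /api/pull`, `DELETE /api/delete`). A sketch of what that could look like — the function names and error handling are assumptions; only the endpoint paths come from Ollama's own API:

```dart
import 'dart:convert';
import 'package:http/http.dart' as http;

/// Illustrative only: what pull/delete likely look like under the hood.
/// These helpers are hypothetical; they are not neom_ollama's API.
Future<void> pullModel(String name,
    {String host = 'http://localhost:11434'}) async {
  // Ollama's pull endpoint streams progress by default; stream=false
  // asks for a single terminal response instead.
  final res = await http.post(
    Uri.parse('$host/api/pull'),
    headers: {'Content-Type': 'application/json'},
    body: jsonEncode({'name': name, 'stream': false}),
  );
  if (res.statusCode != 200) {
    throw http.ClientException('pull failed: HTTP ${res.statusCode}');
  }
}

Future<void> deleteModel(String name,
    {String host = 'http://localhost:11434'}) async {
  final res = await http.delete(
    Uri.parse('$host/api/delete'),
    headers: {'Content-Type': 'application/json'},
    body: jsonEncode({'name': name}),
  );
  if (res.statusCode != 200) {
    throw http.ClientException('delete failed: HTTP ${res.statusCode}');
  }
}
```

Both calls require a running Ollama server (`ollama serve`) on the given host.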

More

Packages that depend on neom_ollama