llm_ollama 0.1.7

Ollama backend implementation for LLM interactions. Provides streaming chat, embeddings, tool calling, vision support, and model management.

Use this package as a library

Depend on it

Run this command:

With Dart:

 $ dart pub add llm_ollama

With Flutter:

 $ flutter pub add llm_ollama

This will add a line like this to your package's pubspec.yaml (and run an implicit dart pub get):

dependencies:
  llm_ollama: ^0.1.7

Alternatively, your editor might support dart pub get or flutter pub get. Check the docs for your editor to learn more.

Import it

Now in your Dart code, you can use:

import 'package:llm_ollama/llm_ollama.dart';
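With the package imported, a streaming chat call against a local Ollama server might look roughly like the sketch below. This is an illustration only, not the package's documented API: the OllamaProvider class, the chatStream method, the ChatMessage helper, and the chunk shape are all assumptions made for the example; consult the API reference for the actual names and signatures.

import 'dart:io';

import 'package:llm_ollama/llm_ollama.dart';

// Hypothetical sketch: OllamaProvider, chatStream, and ChatMessage are
// assumed names, not the package's confirmed API.
Future<void> main() async {
  // Point the backend at a locally running Ollama server (default port 11434).
  final provider = OllamaProvider(baseUrl: 'http://localhost:11434');

  // Request a streaming chat completion from a local model.
  final stream = provider.chatStream(
    model: 'llama3.2',
    messages: [ChatMessage.user('Why is the sky blue?')],
  );

  // Print each streamed chunk as it arrives.
  await for (final chunk in stream) {
    stdout.write(chunk.text);
  }
}

The same provider would typically also expose the embeddings, tool-calling, vision, and model-management features listed in the package description, again under names documented in the API reference rather than the ones assumed here.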

Repository (GitHub)
View/report issues
Contributing

Topics

#ollama #llm #ai #chat #embeddings

Documentation

API reference

License

MIT

Dependencies

http, llm_core

More

Packages that depend on llm_ollama