llm_llamacpp 0.1.9
A llama.cpp backend implementation for LLM interactions. Enables local, on-device inference with GGUF models on Android, iOS, macOS, Windows, and Linux.
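To use the package, add it to your project's `pubspec.yaml`. The version constraint below matches the latest stable release listed on this page; pin it differently if you need an earlier version.

```yaml
dependencies:
  llm_llamacpp: ^0.1.9
```

Alternatively, `dart pub add llm_llamacpp` (or `flutter pub add llm_llamacpp` in a Flutter project) adds the same constraint automatically.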
Stable versions of llm_llamacpp
| Version | Min Dart SDK | Uploaded |
|---|---|---|
| 0.1.9 | 3.8 | 38 days ago |
| 0.1.8 | 3.8 | 40 days ago |
| 0.1.7 | 3.8 | 56 days ago |
| 0.1.6 | 3.8 | 56 days ago |
| 0.1.5 | 3.8 | 2 months ago |
| 0.1.0 | 3.8 | 2 months ago |