llm_llamacpp 0.1.0
A llama.cpp backend implementation for LLM interactions. It enables local, on-device inference with GGUF models on Android, iOS, macOS, Windows, and Linux.
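As a rough illustration of what local GGUF inference with such a backend might look like, here is a hypothetical Dart sketch. The class and method names (`LlamaCppProvider`, `generate`, the `modelPath` parameter) are assumptions for illustration only and are not taken from this package's actual API; consult the package documentation for real usage.

```dart
// Hypothetical sketch only — names below are illustrative assumptions,
// not the real llm_llamacpp API.
Future<void> main() async {
  // Point the backend at a locally downloaded GGUF model file.
  final provider = LlamaCppProvider(modelPath: 'models/model.gguf');

  // Run a single prompt through the local model.
  final reply = await provider.generate('Explain GGUF in one sentence.');
  print(reply);
}
```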
Stable versions of llm_llamacpp
| Version | Min Dart SDK | Uploaded |
|---|---|---|
| 0.1.0 | 3.8 | 32 hours ago |