llm_llamacpp 0.1.9

llm_llamacpp: ^0.1.9

llama.cpp backend implementation for LLM interactions. Enables local on-device inference with GGUF models on Android, iOS, macOS, Windows, and Linux.
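To use the package, add it to the dependencies section of your app's pubspec.yaml. The caret constraint `^0.1.9` allows any compatible version from 0.1.9 up to, but not including, 0.2.0:

```yaml
dependencies:
  llm_llamacpp: ^0.1.9
```

Alternatively, run `flutter pub add llm_llamacpp` from the project root, which adds the latest compatible version for you.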

6 versions

Stable versions of llm_llamacpp

Version   Min Dart SDK   Uploaded
0.1.9     3.8            38 days ago
0.1.8     3.8            40 days ago
0.1.7     3.8            56 days ago
0.1.6     3.8            56 days ago
0.1.5     3.8            2 months ago
0.1.0     3.8            2 months ago
Likes: 1
Points: 160
Downloads: 122

Documentation

API reference

Publisher

unverified uploader

Repository (GitHub)
View/report issues
Contributing

Topics

#llamacpp #llama #llm #flutter #ffi

License

MIT

Dependencies

code_assets, ffi, flutter, hooks, http, llm_core, logging, path

More

Packages that depend on llm_llamacpp

Packages that implement llm_llamacpp