onenm_local_llm 0.1.5

pubspec dependency: onenm_local_llm: ^0.1.5

Platform: Android

Flutter plugin for on-device LLM inference on Android using llama.cpp. Simplifies model management, loading, and multi-turn chat — no cloud, no API keys, fully offline.
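The description covers three capabilities: model management, model loading, and multi-turn chat. A minimal usage sketch in Dart follows; the class and method names (`LocalLlm`, `downloadModel`, `loadModel`, `chat`) are illustrative assumptions, not the plugin's confirmed API, so check the API reference for the actual names.

```dart
// Hypothetical usage sketch. All identifiers below except the
// package import are assumptions for illustration only.
import 'package:onenm_local_llm/onenm_local_llm.dart';

Future<void> main() async {
  final llm = LocalLlm(); // assumed entry-point class

  // Model management: fetch a GGUF model to app-local storage.
  // (path_provider is a declared dependency, so models are
  // presumably cached in an app documents directory.)
  final modelPath =
      await llm.downloadModel('https://example.com/model.gguf');

  // Load the model into llama.cpp for on-device inference.
  await llm.loadModel(modelPath);

  // Multi-turn chat: conversation context is kept across calls,
  // entirely offline, no API keys.
  final reply = await llm.chat('Hello, who are you?');
  print(reply);
}
```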

6 versions

Stable versions of onenm_local_llm

Version  Min Dart SDK  Uploaded     Documentation  Archive
0.1.5    3.5           17 days ago  yes            yes
0.1.4    3.5           19 days ago  yes            yes
0.1.3    3.5           20 days ago  —              yes
0.1.2    3.5           22 days ago  —              yes
0.1.1    3.5           22 days ago  —              yes
0.1.0    3.5           23 days ago  —              yes
4 likes · 160 points · 398 downloads

Documentation

API reference

Publisher

unverified uploader


Repository (GitHub)
View/report issues
Contributing

Topics

#ai #llm #llama #on-device #inference

License

MIT

Dependencies

flutter, http, path_provider, plugin_platform_interface
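To depend on the package, the standard pubspec entry would look like the following sketch, using the version constraint shown at the top of this page:

```yaml
# pubspec.yaml (fragment)
dependencies:
  flutter:
    sdk: flutter
  onenm_local_llm: ^0.1.5
```

Then run `flutter pub get` to fetch it.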
