onenm_local_llm 0.1.5
A Flutter plugin for on-device LLM inference on Android using llama.cpp. It simplifies model management, loading, and multi-turn chat — no cloud, no API keys, fully offline.
Use this package as a library
Depend on it
Run this command:
With Flutter:
$ flutter pub add onenm_local_llm
This will add a line like this to your package's pubspec.yaml (and run an implicit flutter pub get):
dependencies:
  onenm_local_llm: ^0.1.5
Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.
Import it
Now in your Dart code, you can use:
import 'package:onenm_local_llm/onenm_local_llm.dart';
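To give a feel for how an on-device chat flow might look, here is a minimal Dart sketch. The class and method names below (`OnenmLocalLlm`, `loadModel`, `chat`, `dispose`) and the model path are assumptions for illustration only, not the plugin's confirmed API — consult the package's README and API docs for the actual surface.

```dart
import 'package:onenm_local_llm/onenm_local_llm.dart';

// Hypothetical usage sketch: the identifiers below are assumptions,
// not the plugin's documented API.
Future<void> runChat() async {
  final llm = OnenmLocalLlm();

  // llama.cpp consumes GGUF model files; the path here is an example.
  await llm.loadModel('/data/local/models/model-q4_k_m.gguf');

  // Multi-turn chat: conversation state stays on the device.
  final reply = await llm.chat('Hello! What can you do offline?');
  print(reply);

  // Free native resources when finished.
  await llm.dispose();
}
```

Because inference runs entirely on-device, model loading and generation can take noticeable time on mobile hardware, so keeping these calls asynchronous (as sketched) avoids blocking the UI thread.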