onde_inference 0.1.1
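To install, add the package to your app's pubspec.yaml with the version shown on this listing (a minimal sketch of a consuming app's manifest; the `my_app` name and SDK constraint are placeholders, not from this listing):

```yaml
# pubspec.yaml of an app that uses onde_inference.
name: my_app            # placeholder app name
environment:
  sdk: ">=3.3.0 <4.0.0" # matches the package's minimum Dart SDK of 3.3

dependencies:
  flutter:
    sdk: flutter
  onde_inference: ^0.1.1
```

Equivalently, run `flutter pub add onde_inference` from the project root.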

On-device LLM inference SDK for Flutter & Dart. Runs Qwen 2.5 models locally with Metal (Apple silicon) and CPU acceleration — no cloud, no data leaving the device. Powered by the Onde Rust engine and mistral.rs.

2 versions

Stable versions of onde_inference

Version  Min Dart SDK  Uploaded
0.1.1    3.3           13 hours ago
0.1.0    3.3           20 hours ago
3 likes · 110 points · 0 downloads

Documentation

API reference

Publisher

Verified publisher: ondeinference.com


Homepage
Repository (GitHub)
View/report issues

License

MIT

Dependencies

flutter, flutter_rust_bridge, freezed_annotation
