llamadart 0.2.0

A Dart/Flutter plugin for llama.cpp - run LLM inference on any platform using GGUF models
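
To use this version in a project, add a dependency on llamadart ^0.2.0 to the dependencies section of pubspec.yaml (a minimal sketch of the standard pubspec layout):

    dependencies:
      llamadart: ^0.2.0

Then run flutter pub get (or dart pub get for a pure Dart project) to fetch the package.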

12 likes
0 pub points
707 downloads

This package version is not analyzed. Check the latest stable version for its analysis, or see the analysis log for details.

Publisher

Verified publisher: leehack.com

Repository (GitHub)
View/report issues

Topics

#llama #llm #ai #inference #gguf

License

unknown

Dependencies

code_assets, ffi, flutter, hooks, http, logging, path, web
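
The ffi, hooks, and code_assets dependencies suggest the plugin binds to a native llama.cpp build through dart:ffi and Dart's native build-hooks tooling. The snippet below is an illustrative sketch of that general pattern only, not llamadart's actual API: the library path is an assumption, and llama_backend_init comes from llama.cpp's C API rather than from this package.

    // Illustrative sketch only (assumed names): binding to a native llama.cpp
    // build through dart:ffi. llamadart's real API may differ.
    import 'dart:ffi';
    import 'dart:io';

    // C signature: void llama_backend_init(void);  (from llama.cpp's C header)
    typedef _BackendInitC = Void Function();
    typedef _BackendInitDart = void Function();

    void main() {
      // The library name/path is platform dependent and assumed here.
      final libName = Platform.isWindows
          ? 'llama.dll'
          : Platform.isMacOS
              ? 'libllama.dylib'
              : 'libllama.so';
      final lib = DynamicLibrary.open(libName);

      // Look up the C symbol and call it through an FFI trampoline.
      final backendInit = lib
          .lookupFunction<_BackendInitC, _BackendInitDart>('llama_backend_init');
      backendInit();
    }

In practice, the hooks and code_assets packages are part of Dart's native build-hooks tooling, which typically builds or bundles the native library for the plugin, so application code would not usually open it by hand as shown above.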

More

Packages that depend on llamadart