llamadart 0.5.1

A Dart/Flutter plugin for llama.cpp - run LLM inference on any platform using GGUF models

Use this package as a library

Depend on it

Run this command:

With Dart:

 $ dart pub add llamadart

With Flutter:

 $ flutter pub add llamadart

This will add a line like this to your package's pubspec.yaml (and run an implicit dart pub get):

dependencies:
  llamadart: ^0.5.1

Alternatively, your editor might support dart pub get or flutter pub get. Check the docs for your editor to learn more.

Import it

Now in your Dart code, you can use:

import 'package:llamadart/llamadart.dart';
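
As a rough sketch of what inference could look like once the package is imported, the snippet below loads a GGUF model and runs a single prompt. The class and method names (LlamaModel, loadFromFile, generate, dispose) and the model path are illustrative assumptions rather than llamadart's documented API; check the package's README and API reference for the actual calls.

// Hypothetical sketch only: LlamaModel, loadFromFile, generate, and dispose
// are assumed names for illustration, not llamadart's confirmed API.
import 'package:llamadart/llamadart.dart';

Future<void> main() async {
  // Load a local GGUF model file (the path is just an example).
  final model = await LlamaModel.loadFromFile('models/llama.gguf');

  // Generate a completion for a single prompt and print it.
  final reply = await model.generate('Explain GGUF in one sentence.');
  print(reply);

  // Free the native llama.cpp resources when finished.
  model.dispose();
}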

Publisher

leehack.com (verified publisher)

Repository (GitHub)
View/report issues

Topics

#llama #llm #ai #inference #gguf

License

unknown

Dependencies

code_assets, dinja, ffi, flutter, hooks, http, json_rpc_2, logging, path, path_provider, web
