llm_llamacpp 0.1.6
To depend on this package, add `llm_llamacpp: ^0.1.6` under `dependencies` in your `pubspec.yaml`.
A llama.cpp backend implementation for LLM interactions, enabling local on-device inference with GGUF models on Android, iOS, macOS, Windows, and Linux.
1 like · – pub points · 122 downloads
This package version is not analyzed; check the latest stable version for its analysis.