# fasttext_flutter
On-device text classification and sentence embeddings using fastText .ftz quantized models via Dart FFI — no internet, no server required.
## Features

| Feature | Supported |
|---|---|
| Load `.ftz` quantized models | ✅ |
| Load `.bin` full models | ✅ |
| Text classification (`predict()`) | ✅ |
| Top-k predictions with probabilities | ✅ |
| Sentence embeddings (`computeEmbedding()`) | ✅ |
| Cosine similarity (`FastTextModel.cosineSimilarity()`) | ✅ |
| Unsupervised / word-vector models (cbow, skipgram) | ✅ |
| Load from file system | ✅ |
| Load from Flutter assets | ✅ |
| Background isolate inference (non-blocking UI) | ✅ |
| Android (arm64, arm, x86_64) | ✅ |
| iOS (arm64, sim-x86_64, sim-arm64) | ✅ |
| macOS · Linux · Windows | ✅ |
## Getting started

### 1. Add the dependency

```yaml
dependencies:
  fasttext_flutter: ^0.1.0
```
### 2. Add your model as an asset

```yaml
flutter:
  assets:
    - assets/model.ftz
```

Download a pre-trained language-ID model (917 KB, 177 languages):
https://fasttext.cc/docs/en/language-identification.html
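For example, from the command line (the URL below is the quantized `lid.176.ftz` download linked from the page above; verify it against the page before scripting):

```shell
mkdir -p assets
curl -L -o assets/lid.176.ftz \
  https://dl.fbaipublicfiles.com/fasttext/supervised-models/lid.176.ftz
```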
## Usage

### Load a model

```dart
import 'package:fasttext_flutter/fasttext_flutter.dart';

// From Flutter assets (most common)
final assetModel = await FastTextModel.loadAsset('assets/lid.176.ftz');

// From an absolute file path
final fileModel = await FastTextModel.load('/sdcard/downloads/model.ftz');
```
### Classify text (supervised models)

```dart
// Top-1 prediction (default)
final results = await model.predict('Bonjour le monde');
print(results.first.cleanLabel);  // fr
print(results.first.probability); // 0.9994...

// Top-5
final top5 = await model.predict('Hello, world!', k: 5);

// With probability threshold
final preds = await model.predict('Hola', k: 3, threshold: 0.5);
```
### Compute sentence embeddings (all model types)

```dart
final vector = await model.computeEmbedding('Flutter developer');
// → Float32List of length model.dimension (e.g. 10, 100, 300)
```
### Cosine similarity

```dart
final a = await model.computeEmbedding('Flutter developer');
final b = await model.computeEmbedding('Mobile app developer');
final sim = FastTextModel.cosineSimilarity(a, b);
print(sim.toStringAsFixed(4)); // e.g. 0.9797 (1 = identical, 0 = unrelated)
```
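Cosine similarity is the standard formula dot(a, b) / (|a| · |b|). A minimal standalone sketch for reference (this is the usual definition; it is an assumption that `FastTextModel.cosineSimilarity()` computes exactly this quantity):

```dart
import 'dart:math' as math;
import 'dart:typed_data';

/// Cosine similarity of two equal-length vectors:
/// dot(a, b) divided by the product of their Euclidean norms.
double cosineSimilarity(Float32List a, Float32List b) {
  assert(a.length == b.length);
  var dot = 0.0, normA = 0.0, normB = 0.0;
  for (var i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  if (normA == 0 || normB == 0) return 0.0; // zero vector: undefined, return 0
  return dot / (math.sqrt(normA) * math.sqrt(normB));
}

void main() {
  final a = Float32List.fromList([1, 0, 1]);
  final b = Float32List.fromList([1, 1, 0]);
  print(cosineSimilarity(a, b).toStringAsFixed(2)); // 0.50
}
```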
### Check model type

```dart
print(model.modelType);    // FastTextModelType.supervised / .cbow / .skipgram
print(model.isSupervised); // true for classification models
print(model.dimension);    // embedding dimension
print(model.labelCount);   // 0 for unsupervised models
```
### Close the model

```dart
model.close(); // frees native memory, safe to call multiple times
```
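For one-off use, `try`/`finally` ensures the native memory is released even if inference throws. A sketch using only the calls shown above (the asset path is illustrative):

```dart
import 'package:fasttext_flutter/fasttext_flutter.dart';

Future<void> classifyOnce(String text) async {
  final model = await FastTextModel.loadAsset('assets/lid.176.ftz');
  try {
    final preds = await model.predict(text);
    print(preds.first.cleanLabel);
  } finally {
    model.close(); // release native memory even if predict() throws
  }
}
```

For repeated requests, keep the model alive instead (see Performance notes).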
### FastTextPrediction

```dart
final r = results.first;
r.label;       // raw label, e.g. '__label__en'
r.cleanLabel;  // stripped label, e.g. 'en'
r.probability; // double in [0.0, 1.0]
```
## Error handling

All native errors are surfaced as `FastTextException`:

```dart
try {
  final model = await FastTextModel.load('/bad/path.ftz');
} on FastTextException catch (e) {
  print(e.message);
}
```

Calling `predict()` on an unsupervised model also throws `FastTextException`, with a clear message directing you to `computeEmbedding()` instead.
## Performance notes

- `.ftz` quantized models are 10–100× smaller than `.bin` and nearly as accurate for classification.
- `computeEmbedding()` from `.ftz` is approximate, suitable for similarity ranking.
- `predict()` and `computeEmbedding()` run inside `Isolate.run()`, so the UI thread is never blocked.
- Load the model once; keep the `FastTextModel` instance alive across requests.
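One way to follow the load-once advice is a lazy holder that loads the model on first use and reuses it afterwards. A sketch (the `LanguageId` class and asset path are illustrative, not part of the package):

```dart
import 'package:fasttext_flutter/fasttext_flutter.dart';

/// Loads the model lazily on first use and reuses the same
/// FastTextModel instance for every subsequent request.
class LanguageId {
  static Future<FastTextModel>? _model;

  static Future<FastTextModel> get _instance =>
      _model ??= FastTextModel.loadAsset('assets/lid.176.ftz');

  static Future<String> detect(String text) async {
    final model = await _instance;
    final preds = await model.predict(text);
    return preds.first.cleanLabel;
  }
}
```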
## License
Apache-2.0. See LICENSE.
Bundled fastText C++ sources: © Facebook, Inc., MIT license.