Ultralytics logo

English | 简体中文

🚀 YOLO Flutter - Ultralytics Official Plugin

Ultralytics Actions .github/workflows/ci.yml codecov

Ultralytics Discord Ultralytics Forums Ultralytics Reddit

Ultralytics YOLO Flutter is the official plugin for running YOLO models in Flutter apps on iOS and Android. It supports object detection, instance segmentation, image classification, pose estimation, and oriented bounding box (OBB) detection with two simple entry points:

  • YOLO for single-image inference
  • YOLOView for real-time camera inference

The main goal is simple integration: use an official model ID, or drop in your own exported model and let the plugin resolve task metadata for you.


Ultralytics YOLO iOS App previews

Ultralytics GitHub space Ultralytics LinkedIn space Ultralytics Twitter space Ultralytics YouTube space Ultralytics TikTok space Ultralytics BiliBili space Ultralytics Discord

Apple App store

✨ Why This Plugin

  • Official Ultralytics plugin for Flutter
  • One Dart API for Android and iOS
  • Metadata-first model loading with official model download and caching
  • Real-time camera inference and single-image inference
  • Production-ready controls for thresholds, GPU use, and streaming
  • YOLO26 and YOLO11 model families supported

| Feature | Android | iOS | Details |
| --- | --- | --- | --- |
| Object Detection | ✅ | ✅ | Bounding boxes, labels, and confidence scores |
| Image Classification | ✅ | ✅ | Top class predictions and scores |
| Instance Segmentation | ✅ | ✅ | Instance masks with boxes and classes |
| Pose Estimation | ✅ | ✅ | Keypoints with boxes and confidence scores |
| Oriented Bounding Box (OBB) Detection | ✅ | ✅ | Rotated boxes and polygon corners |
| Real-Time Camera Inference | ✅ | ✅ | YOLOView for live camera workflows |
| Single-Image Inference | ✅ | ✅ | YOLO for image bytes |
| Official Models | ✅ | ✅ | Discovery, download, and caching for packaged model IDs |
| Custom Models | ✅ | ✅ | TFLite on Android, Core ML on iOS, metadata-first tasks |

⚡ Quick Start

Install the package:

Package: https://pub.dev/packages/ultralytics_yolo

dependencies:
  ultralytics_yolo: ^0.3.4

Then fetch packages:

flutter pub get

Start with the default official model:

import 'package:ultralytics_yolo/ultralytics_yolo.dart';

final modelId = YOLO.defaultOfficialModel() ?? 'yolo26n';

YOLOView(
  modelPath: modelId,
  onResult: (results) {
    for (final r in results) {
      debugPrint('${r.className}: ${r.confidence}');
    }
  },
)

For single-image inference:

final yolo = YOLO(modelPath: 'yolo26n');
await yolo.loadModel();
final results = await yolo.predict(imageBytes);

▶️ Example App | 📖 Installation Guide | ⚡ Quick Start Guide

📦 Model Loading

The plugin supports three model-loading flows.

1. Official model IDs

Use the default official model or a specific official ID and let the plugin handle download and caching:

final yolo = YOLO(modelPath: YOLO.defaultOfficialModel() ?? 'yolo26n');

Call YOLO.officialModels() to see which official IDs are available on the current platform.
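For example, model selection at startup might look like the sketch below. It assumes YOLO.officialModels() returns the platform's supported IDs synchronously as a list of strings; check the API Reference for the exact signature.

```dart
import 'package:ultralytics_yolo/ultralytics_yolo.dart';

// Sketch: prefer a known official ID if the platform supports it,
// otherwise fall back to the plugin's default official model.
final available = YOLO.officialModels();
final modelId = available.contains('yolo26n')
    ? 'yolo26n'
    : (YOLO.defaultOfficialModel() ?? available.first);

final yolo = YOLO(modelPath: modelId);
```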

2. Your own exported model

Pass your own exported YOLO model as a local path or Flutter asset path:

final yolo = YOLO(modelPath: 'assets/models/my-finetuned-model.tflite');

If the exported model includes metadata, the plugin infers the task automatically. If metadata is missing, pass task explicitly:

final yolo = YOLO(
  modelPath: 'assets/models/my-finetuned-model.tflite',
  task: YOLOTask.detect,
);

3. Remote model URL

Pass an HTTP or HTTPS URL and the plugin will download the model into app storage before loading it.
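A minimal sketch of the remote flow; the URL below is a hypothetical placeholder for wherever you host the exported model:

```dart
import 'package:ultralytics_yolo/ultralytics_yolo.dart';

// The plugin downloads the file into app storage on first load,
// then loads it like a local model. (Illustrative URL only.)
final yolo = YOLO(
  modelPath: 'https://example.com/models/my-finetuned-model.tflite',
  task: YOLOTask.detect, // pass explicitly if the export lacks metadata
);
await yolo.loadModel();
```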

🧭 Official vs. Custom

| Use case | Recommended path |
| --- | --- |
| Fastest first integration | Official model ID like yolo26n |
| You trained or exported your own model | Custom asset or local file |
| You ship different models per customer or environment | Remote URL |
| You need the plugin to infer task automatically | Any export with metadata |
| You have an older or stripped export without metadata | Custom model plus explicit task |

For official models, start with YOLO.defaultOfficialModel() or YOLO.officialModels(). For custom models, start with the exported file you actually plan to ship.

📥 Drop Your Own Model Into an App

For custom models, keep the app-side setup minimal.

  • Android native assets: place .tflite files in android/app/src/main/assets
  • Flutter assets on Android: place .tflite files in assets/models/
  • iOS bundle: drag .mlpackage or .mlmodel into ios/Runner.xcworkspace
  • Flutter assets on iOS: place .mlpackage.zip files in assets/models/

Then point modelPath at that file or asset path.
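If you ship the model as a Flutter asset, remember to also declare the folder in pubspec.yaml so the file is bundled with the app (a sketch; adjust the path to match your project):

```yaml
flutter:
  assets:
    - assets/models/
```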

iOS export note

Detection models exported to Core ML must use nms=True:

from ultralytics import YOLO

# Square [640, 640] works best when one model must run in both portrait and landscape.
# Ultralytics imgsz order is [height, width]; use [640, 384] for portrait-only or [384, 640] for landscape-only.
YOLO("yolo26n.pt").export(format="coreml", nms=True, imgsz=[640, 640])

Other tasks can use the default export settings, with the same square-orientation guidance for imgsz.
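For Android, the TFLite export follows the same pattern. This is a sketch assuming the standard Ultralytics export flow; adjust arguments to your model and deployment.

```python
from ultralytics import YOLO

# Same orientation guidance applies: square [640, 640] covers both
# portrait and landscape with a single model.
YOLO("yolo26n.pt").export(format="tflite", imgsz=[640, 640])
```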

🎯 Choose The Right API

Use YOLO when you already have image bytes and want one prediction at a time:

final yolo = YOLO(modelPath: 'yolo26n');
await yolo.loadModel();
final results = await yolo.predict(imageBytes);

Use YOLOView when you want live camera inference:

final controller = YOLOViewController();

YOLOView(
  modelPath: 'yolo26n',
  controller: controller,
  onResult: (results) {},
)

await controller.switchModel('assets/models/custom.tflite', YOLOTask.detect);

| App type | Model loading pattern |
| --- | --- |
| Live camera app | YOLOView(modelPath: 'yolo26n') |
| Photo picker or gallery workflow | YOLO(modelPath: 'yolo26n') |
| App with your own bundled model | YOLO(modelPath: 'assets/models/custom.tflite') |
| Cross-platform Core ML + TFLite app | Platform-appropriate exported assets, with metadata driving the task |
| App that changes models at runtime | YOLOViewController.switchModel(...) |

📚 Documentation

| Guide | Description |
| --- | --- |
| Installation Guide | Requirements and platform setup |
| Quick Start | Minimal setup for the first working app |
| Model Guide | Official models, custom models, export flow |
| Usage Guide | Common app patterns and examples |
| API Reference | Full API surface |
| Performance Guide | Tuning and performance controls |
| Troubleshooting | Common problems and fixes |

🤝 Community & Support

Ultralytics Discord Ultralytics Forums Ultralytics Reddit

💡 Contribute

Ultralytics thrives on community collaboration, and we deeply value your contributions! Whether it's bug fixes, feature enhancements, or documentation improvements, your involvement is crucial. Please review our Contributing Guide for detailed insights on how to participate. We also encourage you to share your feedback through our Survey. A heartfelt thank you 🙏 goes out to all our contributors!

Ultralytics open-source contributors

📄 License

Ultralytics offers two licensing options to accommodate diverse needs:

  • AGPL-3.0 License: Ideal for students, researchers, and enthusiasts passionate about open-source collaboration. This OSI-approved license promotes knowledge sharing and open contribution. See the LICENSE file for details.
  • Enterprise License: Designed for commercial applications, this license permits seamless integration of Ultralytics software and AI models into commercial products and services, bypassing the open-source requirements of AGPL-3.0. For commercial use cases, please inquire about an Enterprise License.

Native iOS Development

If you're interested in using YOLO models directly in iOS applications with Swift (without Flutter), check out our dedicated iOS repository:

👉 Ultralytics YOLO iOS App - A native iOS application for real-time detection, segmentation, classification, pose estimation, and OBB detection with Ultralytics YOLO models.

This repository provides:

  • Pure Swift implementation for iOS
  • Direct Core ML integration
  • Native iOS UI components
  • Example code for various YOLO tasks
  • Optimized for iOS performance

📮 Contact

Encountering issues or have feature requests related to Ultralytics YOLO? Please report them via GitHub Issues. For broader discussions, questions, and community support, join our Discord server!

