flutter_agentic_ai 0.1.3
Embed intelligent, UI-aware AI agents in any Flutter app. Autonomous tap, type, scroll, and navigation using Flutter's semantics tree.
# flutter_agentic_ai
Embed intelligent AI agents into any Flutter app – with a single widget.

flutter_agentic_ai lets users talk to your app in plain language and get things done. The agent navigates, taps, fills forms, and completes multi-step tasks – without you writing any automation code. Works out of the box on any production app with zero widget instrumentation.
## Features

- **Natural language tasks** – users describe what they want, the agent does it
- **Autonomous navigation** – seamlessly routes through your app (currently requires go_router)
- **Floating chat bar** – draggable FAB + expandable panel, ready out of the box
- **Live thinking indicator** – status overlay with cancel support
- **Security guardrails** – blacklist elements, mask PII, or disable UI control entirely
- **RTL / Arabic support** – full right-to-left layout built in
- **Zero setup** – no native code, no permissions, just wrap your MaterialApp
- **Gemini powered** – built-in Gemini provider with proxy URL support for production
## Quick Start

> **Important – navigation requirement:** The autonomous navigation engine currently requires go_router. If your app uses the standard Navigator or another routing package, the agent can still tap, type, and scroll, but it will not be able to autonomously route between screens.
### 1. Install

```yaml
dependencies:
  flutter_agentic_ai: ^0.1.3
```
### 2. Wrap your app
```dart
import 'package:flutter/material.dart';
import 'package:flutter_agentic_ai/flutter_agentic_ai.dart';

// Pass your API key via --dart-define (never hardcode it)
const apiKey = String.fromEnvironment('GEMINI_API_KEY');

Widget build(BuildContext context) {
  return AiAgent(
    apiKey: apiKey,
    router: router, // your GoRouter instance
    instructions: 'You are a helpful assistant for MyApp.',
    accentColor: Colors.deepPurple,
    onResult: (result) => debugPrint(result.message),
    child: MaterialApp.router(
      routerConfig: router,
      title: 'MyApp',
    ),
  );
}
```
Run with:

```shell
flutter run --dart-define=GEMINI_API_KEY=your_key_here
```
## AiAgent Props

### Provider
| Prop | Type | Description |
|---|---|---|
| `apiKey` | `String?` | Gemini API key. Dev/prototyping only – use `proxyUrl` in production. |
| `provider` | `AiProvider?` | Pre-configured provider instance (takes precedence over `apiKey`). |
| `proxyUrl` | `String?` | Your backend proxy URL – keeps API keys off the device. |
| `proxyHeaders` | `Map<String, String>?` | Auth headers to send with proxy requests. |
| `model` | `String?` | Override the Gemini model (default: `gemini-2.5-flash`). |
### Behavior

| Prop | Type | Default | Description |
|---|---|---|---|
| `maxSteps` | `int` | `15` | Maximum agent steps per task. |
| `instructions` | `String?` | – | System-level instructions for every interaction. |
| `router` | `dynamic` | – | go_router instance for deep navigation (currently the only supported router). |
| `language` | `String` | `'en'` | `'en'` or `'ar'` – controls locale and RTL layout. |
| `maxTokenBudget` | `int?` | – | Auto-stop when the token budget is exceeded. |
| `maxCostUsd` | `double?` | – | Auto-stop when estimated cost exceeds this value. |
| `debug` | `bool` | `false` | Enable verbose debug logging. |
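The budget props compose: whichever limit trips first (steps, tokens, or cost) stops the task. A minimal sketch combining them – the specific limit values here are illustrative, not recommendations:

```dart
AiAgent(
  maxSteps: 10,           // hard cap on agent actions per task
  maxTokenBudget: 50000,  // auto-stop once this many tokens are consumed
  maxCostUsd: 0.05,       // auto-stop once estimated spend exceeds 5 cents
  debug: true,            // verbose logging while tuning the limits
  child: ...,
)
```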
### Lifecycle Callbacks

| Prop | Type | Description |
|---|---|---|
| `onResult` | `(ExecutionResult) → void` | Called when the agent finishes a task. |
| `onBeforeStep` | `(int stepCount) → Future<void>` | Called before each agent step. |
| `onAfterStep` | `(List<AgentStep>) → Future<void>` | Called after each step. |
| `onStatusUpdate` | `(String) → void` | Live status text for custom UI integration. |
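Together these callbacks let you observe a running task or drive your own status UI instead of the built-in overlay. A sketch, assuming a `statusNotifier` (a `ValueNotifier<String>` you own – hypothetical here) backs your custom widget:

```dart
AiAgent(
  onBeforeStep: (stepCount) async {
    debugPrint('Starting step $stepCount');
  },
  onAfterStep: (steps) async {
    debugPrint('Completed ${steps.length} steps so far');
  },
  // statusNotifier is your own ValueNotifier<String>, not part of the package
  onStatusUpdate: (status) => statusNotifier.value = status,
  onResult: (result) => debugPrint('Done: ${result.message}'),
  child: ...,
)
```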
### Security

| Prop | Type | Description |
|---|---|---|
| `interactiveBlacklist` | `List<GlobalKey>?` | Elements the AI must not interact with. |
| `interactiveWhitelist` | `List<GlobalKey>?` | If set, the AI can only interact with these elements. |
| `transformScreenContent` | `(String) → Future<String>` | Mask PII before the AI sees screen content. |
| `enableUiControl` | `bool` | Set to `false` for knowledge-only mode (default: `true`). |
### UI

| Prop | Type | Description |
|---|---|---|
| `accentColor` | `Color?` | Accent color for the FAB and send button. |
| `theme` | `AgentChatBarTheme?` | Full chat bar theme override. |
| `showChatBar` | `bool` | Show/hide the floating chat bar (default: `true`). |
## Security Guardrails

### Block sensitive UI areas
```dart
final _paymentKey = GlobalKey();

AiAgent(
  interactiveBlacklist: [_paymentKey],
  child: Scaffold(
    body: Container(key: _paymentKey, child: CreditCardForm()),
  ),
);
```
### Mask PII before the AI sees it

```dart
AiAgent(
  transformScreenContent: (content) async {
    return content
        .replaceAll(RegExp(r'\b\d{16}\b'), '****-****-****-****')
        .replaceAll(RegExp(r'[\w.]+@[\w.]+'), '[email]');
  },
  child: ...,
)
```
### Knowledge-only mode

```dart
AiAgent(
  enableUiControl: false,
  instructions: 'Answer questions about the app only.',
  child: ...,
)
```
### Production proxy

```dart
AiAgent(
  proxyUrl: 'https://api.myapp.com/ai',
  proxyHeaders: {'Authorization': 'Bearer $userToken'},
  child: ...,
)
```
> **Warning:** Using `apiKey` directly in a release build will log a security warning. Use `proxyUrl` in production.
## Custom Theme

```dart
AiAgent(
  theme: AgentChatBarTheme(
    primaryColor: Colors.indigo,
    backgroundColor: const Color(0xFF1A1A2E),
    textColor: Colors.white,
  ),
  child: ...,
)
```
## Access the Agent Anywhere
Trigger the agent from any widget in the tree:
```dart
final agent = AiAgentScope.of(context);

ElevatedButton(
  onPressed: () => agent.send('Add the first item to cart'),
  child: const Text('Let AI do it'),
);
```
## Custom Actions
Register custom Dart functions the agent can call:
```dart
actionRegistry.register(AgentAction(
  name: 'open_support_chat',
  description: 'Opens the in-app support chat',
  parameters: {},
  handler: (_) async {
    SupportChat.show();
    return ActionResult(success: true);
  },
));
```
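Actions can also declare parameters for the model to fill in. The sketch below assumes `parameters` takes name-to-description entries (check the package docs for the exact shape); `CartService` stands in for your own app code and is hypothetical:

```dart
actionRegistry.register(AgentAction(
  name: 'apply_coupon',
  description: 'Applies a discount coupon to the current cart',
  parameters: {'code': 'The coupon code to apply'},
  handler: (args) async {
    // CartService is your own app code, not part of this package
    final ok = await CartService.applyCoupon(args['code'] as String);
    return ActionResult(success: ok);
  },
));
```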
## Requirements

- Flutter `>=3.0.0`
- Dart `>=3.0.0`
- go_router (required for autonomous navigation)
- Gemini API key – get one free at Google AI Studio
## License

MIT © 2025 MobileAI