# 🤖 AI SDK Dart

A Dart/Flutter port of Vercel AI SDK v6: provider-agnostic APIs for text generation, streaming, structured output, tool use, embeddings, image generation, speech, and more.
## What is this?

AI SDK Dart brings the full power of Vercel AI SDK v6 to Dart and Flutter. Write your AI logic once, swap providers without changing a line of business code, and ship on every platform: mobile, web, and server. Every API mirrors its JavaScript counterpart, so the official Vercel docs apply directly to your Dart code.
## Screenshots

### Flutter Chat App (`examples/flutter_chat`)

| Multi-turn Chat | Streaming Response |
|---|---|
| ![]() | ![]() |

| Completion | Object Stream |
|---|---|
| ![]() | ![]() |

### Advanced App (`examples/advanced_app`)

| Provider Chat | Tools Chat |
|---|---|
| ![]() | ![]() |

| Image Generation | Multimodal |
|---|---|
| ![]() | ![]() |
## ✨ Features

### 🗣️ Text Generation & Streaming

- `generateText` – single-turn or multi-step text generation with a full result envelope
- `streamText` – real-time token streaming with a typed event taxonomy
- `smoothStream` transform – configurable chunk-size smoothing for UX
- Multi-step agentic loops with `maxSteps`, `prepareStep`, and `stopConditions`
- Callbacks: `onFinish`, `onStepFinish`, `onChunk`, `onError`, `experimentalOnStart`
### 🧩 Structured Output

- `Output.object(schema)` – parse model output into a typed Dart object
- `Output.array(schema)` – parse model output into a typed Dart list
- `Output.choice(options)` – constrain output to a fixed set of string values
- `Output.json()` – raw JSON without schema validation
- Automatic code-fence stripping (`` ```json ... ``` ``)
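The other variants follow the same call shape as `Output.object` shown in the Quick Start below. As a hedged sketch, `Output.choice` for a classification-style prompt, assuming it accepts the allowed string values directly (the exact parameter shape may differ in this port):

```dart
// Sketch only: assumes Output.choice takes the allowed values directly.
final result = await generateText(
  model: openai('gpt-4.1-mini'),
  prompt: 'Classify this review as positive or negative: "Great product!"',
  output: Output.choice(['positive', 'negative']),
);
print(result.output); // one of 'positive' | 'negative'
```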
### 🔧 Type-Safe Tools & Multi-Step Agents

- `tool<Input, Output>()` – fully typed tool definitions with JSON schema
- `dynamicTool()` – tools with an unknown input type for dynamic use cases
- Tool choice: `auto`, `required`, `none`, or a specific tool
- Tool approval workflow with `needsApproval`
- Multi-step agentic loops with automatic tool result injection
- `onInputStart`, `onInputDelta`, `onInputAvailable` lifecycle hooks
### 🖼️ Multimodal

- `generateImage` – image generation (DALL-E 3 via OpenAI)
- `generateSpeech` – text-to-speech audio synthesis
- `transcribe` – speech-to-text transcription
- Image inputs in prompts (multimodal vision)
### 🧮 Embeddings & Cosine Similarity

- `embed()` – single-value embedding with usage tracking
- `embedMany()` – batch embedding for multiple values
- `cosineSimilarity()` – built-in similarity computation
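For intuition, cosine similarity is the dot product of two vectors divided by the product of their magnitudes: 1.0 for vectors pointing the same way, 0.0 for orthogonal ones. The SDK's `cosineSimilarity()` does this for you; a self-contained plain-Dart sketch of the same math:

```dart
import 'dart:math';

/// Cosine similarity: dot(a, b) / (|a| * |b|).
double cosine(List<double> a, List<double> b) {
  var dot = 0.0, normA = 0.0, normB = 0.0;
  for (var i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (sqrt(normA) * sqrt(normB));
}

void main() {
  print(cosine([1, 2], [2, 4])); // 1.0 (same direction)
  print(cosine([1, 0], [0, 1])); // 0.0 (orthogonal)
}
```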
### 🧱 Middleware System

- `wrapLanguageModel(model, middlewares)` – composable middleware pipeline
- `extractReasoningMiddleware` – strips `<think>` tags into `ReasoningPart`
- `extractJsonMiddleware` – strips `` ```json ``` `` fences
- `simulateStreamingMiddleware` – converts non-streaming models to streaming
- `defaultSettingsMiddleware` – applies default temperature/top-p/etc.
- `addToolInputExamplesMiddleware` – enriches tool descriptions with examples
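Putting the pipeline together: a hedged sketch of composing middlewares with the `wrapLanguageModel(model, middlewares)` shape listed above. The constructor arguments (e.g. `tagName`) are assumptions mirroring the JS SDK and may differ in this port:

```dart
// Sketch: wrap a model with reasoning extraction plus simulated streaming.
// Middleware constructor parameters below are assumptions; check the package docs.
final model = wrapLanguageModel(
  openai('gpt-4.1-mini'),
  [
    extractReasoningMiddleware(tagName: 'think'),
    simulateStreamingMiddleware(),
  ],
);

// The wrapped model drops into any call site unchanged:
final result = await generateText(model: model, prompt: 'Think step by step: 17 * 23?');
```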
### 📇 Provider Registry

- `createProviderRegistry` – map provider aliases to model factories
- Resolve models by `'provider:modelId'` string at runtime
- Mix providers in a single registry for multi-provider apps
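A hedged sketch of registry usage: `createProviderRegistry` and the `'provider:modelId'` resolution string come from the list above, while the map shape and the `languageModel` accessor are assumptions mirroring the JS SDK:

```dart
// Sketch: alias -> provider map, then resolve by 'provider:modelId' at runtime.
final registry = createProviderRegistry({
  'openai': openai,       // assumption: provider factories are registered directly
  'anthropic': anthropic,
});

final model = registry.languageModel('anthropic:claude-sonnet-4-5'); // assumed accessor
final result = await generateText(model: model, prompt: 'Hello!');
```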
### 📱 Flutter UI Controllers

- `ChatController` – multi-turn streaming chat with message history
- `CompletionController` – single-turn text completion with status
- `ObjectStreamController` – streaming typed JSON object updates
### 🔌 MCP Client (Model Context Protocol)

- `MCPClient` – connect to MCP servers, discover tools, invoke them
- `SseClientTransport` – HTTP SSE transport
- `StdioMCPTransport` – stdio process transport
- Discovered tools are directly compatible with `generateText` / `streamText`
### 🧪 Conformance Suite
- 178+ tests covering every public API
- Spec-driven JSON fixtures as the source of truth
- Provider wire-format conformance tests for OpenAI, Anthropic, and Google
## 📦 Packages

| Package | pub.dev | What it gives you |
|---|---|---|
| `ai_sdk_dart` | `dart pub add ai_sdk_dart` | `generateText`, `streamText`, tools, middleware, embeddings, registry |
| `ai_sdk_openai` | `dart pub add ai_sdk_openai` | `openai('gpt-4.1-mini')`, embeddings, image gen, speech, transcription |
| `ai_sdk_anthropic` | `dart pub add ai_sdk_anthropic` | `anthropic('claude-sonnet-4-5')`, extended thinking |
| `ai_sdk_google` | `dart pub add ai_sdk_google` | `google('gemini-2.0-flash')`, embeddings |
| `ai_sdk_flutter_ui` | `dart pub add ai_sdk_flutter_ui` | `ChatController`, `CompletionController`, `ObjectStreamController` |
| `ai_sdk_mcp` | `dart pub add ai_sdk_mcp` | `MCPClient`, `SseClientTransport`, `StdioMCPTransport` |
| `ai_sdk_provider` | (transitive) | Provider interfaces for building custom providers |

`ai_sdk_provider` is a transitive dependency; you do not need to add it directly.
## 🚀 Quick Start

### Dart CLI

```sh
dart pub add ai_sdk_dart ai_sdk_openai
export OPENAI_API_KEY=sk-...
```

```dart
import 'package:ai_sdk_dart/ai_sdk_dart.dart';
import 'package:ai_sdk_openai/ai_sdk_openai.dart';

void main() async {
  // Text generation
  final result = await generateText(
    model: openai('gpt-4.1-mini'),
    prompt: 'Say hello from AI SDK Dart!',
  );
  print(result.text);
}
```
### Streaming

```dart
import 'dart:io'; // for stdout

final result = await streamText(
  model: openai('gpt-4.1-mini'),
  prompt: 'Count from 1 to 5.',
);
await for (final chunk in result.textStream) {
  stdout.write(chunk);
}
```
### Structured Output

```dart
final result = await generateText<Map<String, dynamic>>(
  model: openai('gpt-4.1-mini'),
  prompt: 'Return the capital and currency of Japan as JSON.',
  output: Output.object(
    schema: Schema<Map<String, dynamic>>(
      jsonSchema: const {
        'type': 'object',
        'properties': {
          'capital': {'type': 'string'},
          'currency': {'type': 'string'},
        },
      },
      fromJson: (json) => json,
    ),
  ),
);
print(result.output); // {capital: Tokyo, currency: JPY}
```
### Type-Safe Tools

```dart
final result = await generateText(
  model: openai('gpt-4.1-mini'),
  prompt: 'What is the weather in Paris?',
  maxSteps: 5,
  tools: {
    'getWeather': tool<Map<String, dynamic>, String>(
      description: 'Get current weather for a city.',
      inputSchema: Schema(
        jsonSchema: const {
          'type': 'object',
          'properties': {'city': {'type': 'string'}},
        },
        fromJson: (json) => json,
      ),
      execute: (input, _) async => 'Sunny, 18°C',
    ),
  },
);
print(result.text);
```
### Flutter Chat UI

```sh
dart pub add ai_sdk_dart ai_sdk_openai ai_sdk_flutter_ui
```

```dart
import 'package:ai_sdk_flutter_ui/ai_sdk_flutter_ui.dart';

final chat = ChatController(model: openai('gpt-4.1-mini'));

// In your widget:
await chat.append('Tell me a joke');
print(chat.messages.last.content);
```
## 🤖 Providers

| Capability | OpenAI | Anthropic | Google |
|---|---|---|---|
| Text generation | ✅ | ✅ | ✅ |
| Streaming | ✅ | ✅ | ✅ |
| Structured output | ✅ | ✅ | ✅ |
| Tool use | ✅ | ✅ | ✅ |
| Embeddings | ✅ | ❌ | ✅ |
| Image generation | ✅ | ❌ | ❌ |
| Speech synthesis | ✅ | ❌ | ❌ |
| Transcription | ✅ | ❌ | ❌ |
| Extended thinking | ❌ | ✅ | ❌ |
| Multimodal (image input) | ✅ | ✅ | ✅ |
## 🛠️ Flutter UI Controllers

The `ai_sdk_flutter_ui` package provides three reactive controllers that integrate with any Flutter state management approach.

### ChatController – Multi-turn streaming chat

```dart
final chat = ChatController(model: openai('gpt-4.1-mini'));

// In your widget:
ListenableBuilder(
  listenable: chat,
  builder: (context, _) {
    return Column(
      children: [
        for (final msg in chat.messages)
          Text('${msg.role}: ${msg.content}'),
        if (chat.isLoading) const CircularProgressIndicator(),
      ],
    );
  },
);

// Send a message:
await chat.append('What is the capital of France?');
```
### CompletionController – Single-turn completion

```dart
final completion = CompletionController(model: openai('gpt-4.1-mini'));
await completion.complete('Write a haiku about Dart.');
print(completion.text);
```
### ObjectStreamController – Streaming typed JSON

```dart
final controller = ObjectStreamController<Map<String, dynamic>>(
  model: openai('gpt-4.1-mini'),
  schema: Schema<Map<String, dynamic>>(
    jsonSchema: const {'type': 'object'},
    fromJson: (json) => json,
  ),
);
await controller.submit('Describe Japan as a JSON object.');
print(controller.object); // Partial updates arrive in real time
```
## 🔌 MCP Support

Connect to any Model Context Protocol server and use its tools directly in your AI calls:

```dart
import 'package:ai_sdk_mcp/ai_sdk_mcp.dart';

final client = MCPClient(
  transport: SseClientTransport(
    url: Uri.parse('http://localhost:3000/mcp'),
  ),
);
await client.initialize();
final tools = await client.tools(); // Returns a ToolSet

final result = await generateText(
  model: openai('gpt-4.1-mini'),
  prompt: 'What files are in the project?',
  tools: tools,
  maxSteps: 5,
);
```
For stdio-based MCP servers (local processes):

```dart
final client = MCPClient(
  transport: StdioMCPTransport(
    command: 'npx',
    args: ['-y', '@modelcontextprotocol/server-filesystem', '/path/to/dir'],
  ),
);
```
## 🗺️ Roadmap

### ✅ Implemented

- ✅ `generateText` – full result envelope (text, steps, usage, reasoning, sources, files)
- ✅ `streamText` – complete event taxonomy (22 typed event types)
- ✅ `generateObject` / structured output (object, array, choice, json)
- ✅ `embed` / `embedMany` + `cosineSimilarity`
- ✅ `generateImage` (OpenAI DALL-E 3)
- ✅ `generateSpeech` (OpenAI TTS)
- ✅ `transcribe` (OpenAI Whisper)
- ✅ `rerank`
- ✅ Middleware system with 5 built-in middlewares
- ✅ Provider registry (`createProviderRegistry`)
- ✅ Multi-step agentic loops with tool approval
- ✅ Flutter UI controllers (Chat, Completion, ObjectStream)
- ✅ MCP client (SSE + stdio transports)
- ✅ OpenAI, Anthropic, Google providers
- ✅ 178+ conformance tests
### 🔜 Planned

- 🔜 Video generation support
- 🔜 Streaming MCP tool outputs + automatic reconnection
- 🔜 Cohere / Vertex AI / Mistral / Ollama providers
- 🔜 Additional Flutter widgets (file picker, reasoning display, citation cards)
- 🔜 Dart Edge / Cloudflare Workers support
- 🔜 WebSocket transport for MCP
## 🤝 Contributing

Contributions are welcome! Please open an issue first to discuss changes before submitting a PR.

- 🐛 Bug reports – use the Bug Report template
- 💡 Feature requests – use the Feature Request template
- 💬 Questions & discussions – use GitHub Discussions
### Running tests

```sh
dart pub global activate melos
melos bootstrap
melos test     # run all package tests
melos analyze  # dart analyze across all packages
```

Or with the Makefile:

```sh
make get      # install all workspace dependencies
make test     # run all package tests
make analyze  # run dart analyze
make format   # format all Dart source files
```
### Runnable examples

Set API keys before running:

```sh
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
export GOOGLE_API_KEY=AIza...
```

| Example | Command | What it shows |
|---|---|---|
| Dart CLI | `make run-basic` | `generateText`, streaming, structured output, tools, embeddings, middleware |
| Flutter chat | `make run` | `ChatController`, `CompletionController`, `ObjectStreamController` |
| Flutter chat (web) | `make run-web` | Same as above on Chrome |
| Advanced app | `make run-advanced` | All providers, image gen, TTS, STT, multimodal |
| Advanced app (web) | `make run-advanced-web` | Same as above on Chrome |
### Development

Managed with Melos as a monorepo workspace:

```sh
dart pub global activate melos
melos bootstrap
melos analyze
melos test
```

See `docs/v6-parity-matrix.md` for a feature-by-feature parity matrix against Vercel AI SDK v6.







