🤖 AI SDK Dart

A Dart/Flutter port of Vercel AI SDK v6 — provider-agnostic APIs for text generation, streaming, structured output, tool use, embeddings, image generation, speech, and more.



What is this?

AI SDK Dart brings the full power of Vercel AI SDK v6 to Dart and Flutter. Write your AI logic once, swap providers without changing a line of business code, and ship on every platform — mobile, web, and server. Every API mirrors its JavaScript counterpart, so the official Vercel docs apply directly to your Dart code.


Screenshots

Flutter Chat App (examples/flutter_chat)

Screenshots: multi-turn chat, streaming response, completion result, and object stream result.

Advanced App (examples/advanced_app)

Screenshots: provider chat, tools chat, image generation, and multimodal input.

✨ Features

๐Ÿ—ฃ๏ธ Text Generation & Streaming

  • generateText โ€” single-turn or multi-step text generation with full result envelope
  • streamText โ€” real-time token streaming with typed event taxonomy
  • smoothStream transform โ€” configurable chunk-size smoothing for UX
  • Multi-step agentic loops with maxSteps, prepareStep, and stopConditions
  • Callbacks: onFinish, onStepFinish, onChunk, onError, experimentalOnStart
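
As a sketch of how these pieces compose, assuming the Dart signatures mirror the Vercel v6 ones (the experimentalTransform parameter name and the step callback's field shape are assumptions, not confirmed by this README):

```dart
import 'package:ai_sdk_dart/ai_sdk_dart.dart';
import 'package:ai_sdk_openai/ai_sdk_openai.dart';

Future<void> main() async {
  final result = await streamText(
    model: openai('gpt-4.1-mini'),
    prompt: 'Plan a weekend trip to Kyoto.',
    // Smooth bursty token deltas into evenly sized chunks for the UI.
    experimentalTransform: smoothStream(),
    // Fires after each completed step of a multi-step run.
    onStepFinish: (step) => print('step finished'),
    onError: (error) => print('stream error: $error'),
  );
  await for (final chunk in result.textStream) {
    print(chunk); // render as tokens arrive
  }
}
```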

🧩 Structured Output

  • Output.object(schema) — parse model output into a typed Dart object
  • Output.array(schema) — parse model output into a typed Dart list
  • Output.choice(options) — constrain output to a fixed set of string values
  • Output.json() — raw JSON without schema validation
  • Automatic code-fence stripping (```json ... ```)
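
For instance, Output.choice might be used like this (a sketch; the exact parameter shape of Output.choice is an assumption modeled on the Vercel v6 API):

```dart
final result = await generateText<String>(
  model: openai('gpt-4.1-mini'),
  prompt: 'Classify the sentiment of: "Great app, works flawlessly!"',
  // Constrain the answer to one of a fixed set of strings.
  output: Output.choice(['positive', 'negative', 'neutral']),
);
print(result.output);
```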

🔧 Type-Safe Tools & Multi-Step Agents

  • tool<Input, Output>() — fully typed tool definitions with JSON schema
  • dynamicTool() — tools with unknown input type for dynamic use cases
  • Tool choice: auto, required, none, or specific tool
  • Tool approval workflow with needsApproval
  • Multi-step agentic loops with automatic tool result injection
  • onInputStart, onInputDelta, onInputAvailable lifecycle hooks

๐Ÿ–ผ๏ธ Multimodal

  • generateImage โ€” image generation (DALL-E 3 via OpenAI)
  • generateSpeech โ€” text-to-speech audio synthesis
  • transcribe โ€” speech-to-text transcription
  • Image inputs in prompts (multimodal vision)
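
A minimal sketch of image generation and speech synthesis (the openai.image / openai.speech factory names and the result field names are assumptions modeled on the Vercel v6 API):

```dart
import 'dart:io';
import 'package:ai_sdk_dart/ai_sdk_dart.dart';
import 'package:ai_sdk_openai/ai_sdk_openai.dart';

Future<void> main() async {
  // Generate an image with DALL-E 3.
  final image = await generateImage(
    model: openai.image('dall-e-3'),
    prompt: 'A watercolor painting of Mount Fuji at dawn',
  );
  await File('fuji.png').writeAsBytes(image.image.bytes);

  // Synthesize speech from text.
  final speech = await generateSpeech(
    model: openai.speech('tts-1'),
    text: 'Hello from AI SDK Dart!',
  );
  await File('hello.mp3').writeAsBytes(speech.audio.bytes);
}
```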

🧮 Embeddings & Cosine Similarity

  • embed() — single value embedding with usage tracking
  • embedMany() — batch embedding for multiple values
  • cosineSimilarity() — built-in similarity computation
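
A sketch of a similarity check built from these calls (the openai.embedding factory name is an assumption; the rest follows the Vercel v6 shape):

```dart
final model = openai.embedding('text-embedding-3-small');

final a = await embed(model: model, value: 'feline companion');
final b = await embed(model: model, value: 'house cat');

// Cosine similarity of the two vectors; 1.0 means identical direction.
print(cosineSimilarity(a.embedding, b.embedding));
```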

🧱 Middleware System

  • wrapLanguageModel(model, middlewares) — composable middleware pipeline
  • extractReasoningMiddleware — strips <think> tags into ReasoningPart
  • extractJsonMiddleware — strips ```json ``` fences
  • simulateStreamingMiddleware — converts non-streaming models to streaming
  • defaultSettingsMiddleware — applies default temperature/top-p/etc.
  • addToolInputExamplesMiddleware — enriches tool descriptions with examples
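
A sketch of composing two of the built-in middlewares (the middlewares and tagName parameter names and the settings map shape are assumptions modeled on the Vercel v6 API):

```dart
final model = wrapLanguageModel(
  model: openai('gpt-4.1-mini'),
  middlewares: [
    // Move <think>...</think> spans out of the text into ReasoningPart.
    extractReasoningMiddleware(tagName: 'think'),
    // Apply defaults unless the call site overrides them.
    defaultSettingsMiddleware(settings: const {'temperature': 0.2}),
  ],
);

final result = await generateText(
  model: model,
  prompt: 'Why is the sky blue?',
);
```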

๐ŸŒ Provider Registry

  • createProviderRegistry โ€” map provider aliases to model factories
  • Resolve models by 'provider:modelId' string at runtime
  • Mix providers in a single registry for multi-provider apps
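
For instance, a two-provider registry might look like this (a sketch; registry.languageModel mirrors the Vercel v6 method name and is an assumption here):

```dart
import 'package:ai_sdk_dart/ai_sdk_dart.dart';
import 'package:ai_sdk_anthropic/ai_sdk_anthropic.dart';
import 'package:ai_sdk_openai/ai_sdk_openai.dart';

final registry = createProviderRegistry({
  'openai': openai,
  'anthropic': anthropic,
});

Future<void> main() async {
  // Resolve a model from a 'provider:modelId' string at runtime.
  final result = await generateText(
    model: registry.languageModel('anthropic:claude-sonnet-4-5'),
    prompt: 'Say hello!',
  );
  print(result.text);
}
```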

📱 Flutter UI Controllers

  • ChatController — multi-turn streaming chat with message history
  • CompletionController — single-turn text completion with status
  • ObjectStreamController — streaming typed JSON object updates

🔌 MCP Client (Model Context Protocol)

  • MCPClient — connect to MCP servers, discover tools, invoke them
  • SseClientTransport — HTTP SSE transport
  • StdioMCPTransport — stdio process transport
  • Discovered tools are directly compatible with generateText/streamText

🧪 Conformance Suite

  • 178+ tests covering every public API
  • Spec-driven JSON fixtures as the source of truth
  • Provider wire-format conformance tests for OpenAI, Anthropic, and Google

📦 Packages

| Package | pub.dev | What it gives you |
| --- | --- | --- |
| ai_sdk_dart | dart pub add ai_sdk_dart | generateText, streamText, tools, middleware, embeddings, registry |
| ai_sdk_openai | dart pub add ai_sdk_openai | openai('gpt-4.1-mini'), embeddings, image gen, speech, transcription |
| ai_sdk_anthropic | dart pub add ai_sdk_anthropic | anthropic('claude-sonnet-4-5'), extended thinking |
| ai_sdk_google | dart pub add ai_sdk_google | google('gemini-2.0-flash'), embeddings |
| ai_sdk_flutter_ui | dart pub add ai_sdk_flutter_ui | ChatController, CompletionController, ObjectStreamController |
| ai_sdk_mcp | dart pub add ai_sdk_mcp | MCPClient, SseClientTransport, StdioMCPTransport |
| ai_sdk_provider | (transitive) | Provider interfaces for building custom providers |

ai_sdk_provider is a transitive dependency — you do not need to add it directly.


🚀 Quick Start

Dart CLI

dart pub add ai_sdk_dart ai_sdk_openai
export OPENAI_API_KEY=sk-...
import 'package:ai_sdk_dart/ai_sdk_dart.dart';
import 'package:ai_sdk_openai/ai_sdk_openai.dart';

void main() async {
  // Text generation
  final result = await generateText(
    model: openai('gpt-4.1-mini'),
    prompt: 'Say hello from AI SDK Dart!',
  );
  print(result.text);
}

Streaming

import 'dart:io';

final result = await streamText(
  model: openai('gpt-4.1-mini'),
  prompt: 'Count from 1 to 5.',
);
await for (final chunk in result.textStream) {
  stdout.write(chunk);
}

Structured Output

final result = await generateText<Map<String, dynamic>>(
  model: openai('gpt-4.1-mini'),
  prompt: 'Return the capital and currency of Japan as JSON.',
  output: Output.object(
    schema: Schema<Map<String, dynamic>>(
      jsonSchema: const {
        'type': 'object',
        'properties': {
          'capital': {'type': 'string'},
          'currency': {'type': 'string'},
        },
      },
      fromJson: (json) => json,
    ),
  ),
);
print(result.output); // {capital: Tokyo, currency: JPY}

Type-Safe Tools

final result = await generateText(
  model: openai('gpt-4.1-mini'),
  prompt: 'What is the weather in Paris?',
  maxSteps: 5,
  tools: {
    'getWeather': tool<Map<String, dynamic>, String>(
      description: 'Get current weather for a city.',
      inputSchema: Schema(
        jsonSchema: const {
          'type': 'object',
          'properties': {'city': {'type': 'string'}},
        },
        fromJson: (json) => json,
      ),
      execute: (input, _) async => 'Sunny, 18°C',
    ),
  },
);
print(result.text);

Flutter Chat UI

dart pub add ai_sdk_dart ai_sdk_openai ai_sdk_flutter_ui
import 'package:ai_sdk_flutter_ui/ai_sdk_flutter_ui.dart';

final chat = ChatController(model: openai('gpt-4.1-mini'));

// In your widget:
await chat.append('Tell me a joke');
print(chat.messages.last.content);

🤖 Providers

| Capability | OpenAI | Anthropic | Google |
| --- | --- | --- | --- |
| Text generation | ✅ | ✅ | ✅ |
| Streaming | ✅ | ✅ | ✅ |
| Structured output | ✅ | ✅ | ✅ |
| Tool use | ✅ | ✅ | ✅ |
| Embeddings | ✅ | — | ✅ |
| Image generation | ✅ | — | — |
| Speech synthesis | ✅ | — | — |
| Transcription | ✅ | — | — |
| Extended thinking | — | ✅ | — |
| Multimodal (image input) | ✅ | ✅ | ✅ |

๐Ÿ› ๏ธ Flutter UI Controllers

The ai_sdk_flutter_ui package provides three reactive controllers that integrate with any Flutter state management approach.

ChatController — Multi-turn streaming chat

final chat = ChatController(model: openai('gpt-4.1-mini'));

// In your widget:
ListenableBuilder(
  listenable: chat,
  builder: (context, _) {
    return Column(
      children: [
        for (final msg in chat.messages)
          Text('${msg.role}: ${msg.content}'),
        if (chat.isLoading) const CircularProgressIndicator(),
      ],
    );
  },
);

// Send a message:
await chat.append('What is the capital of France?');

CompletionController — Single-turn completion

final completion = CompletionController(model: openai('gpt-4.1-mini'));
await completion.complete('Write a haiku about Dart.');
print(completion.text);

ObjectStreamController — Streaming typed JSON

final controller = ObjectStreamController<Map<String, dynamic>>(
  model: openai('gpt-4.1-mini'),
  schema: Schema<Map<String, dynamic>>(
    jsonSchema: const {'type': 'object'},
    fromJson: (json) => json,
  ),
);
await controller.submit('Describe Japan as a JSON object.');
print(controller.object); // Partial updates arrive in real-time

🔌 MCP Support

Connect to any Model Context Protocol server and use its tools directly in your AI calls:

import 'package:ai_sdk_mcp/ai_sdk_mcp.dart';

final client = MCPClient(
  transport: SseClientTransport(
    url: Uri.parse('http://localhost:3000/mcp'),
  ),
);

await client.initialize();
final tools = await client.tools(); // Returns a ToolSet

final result = await generateText(
  model: openai('gpt-4.1-mini'),
  prompt: 'What files are in the project?',
  tools: tools,
  maxSteps: 5,
);

For stdio-based MCP servers (local processes):

final client = MCPClient(
  transport: StdioMCPTransport(
    command: 'npx',
    args: ['-y', '@modelcontextprotocol/server-filesystem', '/path/to/dir'],
  ),
);

๐Ÿ—บ๏ธ Roadmap

✅ Implemented

  • ✅ generateText — full result envelope (text, steps, usage, reasoning, sources, files)
  • ✅ streamText — complete event taxonomy (22 typed event types)
  • ✅ generateObject / structured output (object, array, choice, json)
  • ✅ embed / embedMany + cosineSimilarity
  • ✅ generateImage (OpenAI DALL-E 3)
  • ✅ generateSpeech (OpenAI TTS)
  • ✅ transcribe (OpenAI Whisper)
  • ✅ rerank
  • ✅ Middleware system with 5 built-in middlewares
  • ✅ Provider registry (createProviderRegistry)
  • ✅ Multi-step agentic loops with tool approval
  • ✅ Flutter UI controllers (Chat, Completion, ObjectStream)
  • ✅ MCP client (SSE + stdio transports)
  • ✅ OpenAI, Anthropic, Google providers
  • ✅ 178+ conformance tests

🔜 Planned

  • 🔜 Video generation support
  • 🔜 Streaming MCP tool outputs + automatic reconnection
  • 🔜 Cohere / Vertex AI / Mistral / Ollama providers
  • 🔜 Additional Flutter widgets (file picker, reasoning display, citation cards)
  • 🔜 Dart Edge / Cloudflare Workers support
  • 🔜 WebSocket transport for MCP

๐Ÿค Contributing

Contributions are welcome! Please open an issue first to discuss changes before submitting a PR.

Running tests

dart pub global activate melos
melos bootstrap
melos test       # run all package tests
melos analyze    # dart analyze across all packages

Or with the Makefile:

make get      # install all workspace dependencies
make test     # run all package tests
make analyze  # run dart analyze
make format   # format all Dart source files

Runnable examples

Set API keys before running:

export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
export GOOGLE_API_KEY=AIza...

| Example | Command | What it shows |
| --- | --- | --- |
| Dart CLI | make run-basic | generateText, streaming, structured output, tools, embeddings, middleware |
| Flutter chat | make run | ChatController, CompletionController, ObjectStreamController |
| Flutter chat (web) | make run-web | Same as above on Chrome |
| Advanced app | make run-advanced | All providers, image gen, TTS, STT, multimodal |
| Advanced app (web) | make run-advanced-web | Same as above on Chrome |

Development

Managed with Melos as a monorepo workspace:

dart pub global activate melos
melos bootstrap
melos analyze
melos test

See docs/v6-parity-matrix.md for a feature-by-feature parity matrix against Vercel AI SDK v6.


📄 License

MIT
