OpenRouter Flutter SDK
A powerful Flutter plugin for the OpenRouter API, providing unified access to hundreds of AI models from OpenAI, Anthropic, Google, Meta, and other providers.
Features
- Chat Completions - Support for streaming and non-streaming chat completions
- Embeddings - Generate text embeddings for any supported model
- Model Discovery - Browse 400+ available models with detailed metadata
- Provider Information - List all available providers and their status
- Generation Metadata - Get detailed token usage and cost information
- Responses API (Beta) - OpenAI-compatible stateless API with advanced features:
  - Reasoning/thinking process display
  - Web search integration
  - Function calling
  - File search
- Comprehensive Error Handling - Specific exception types for different error scenarios
- Type-Safe - Fully typed API with null safety
- Streaming Support - Real-time streaming responses for chat and responses API
Installation
Add this to your package's pubspec.yaml file:
dependencies:
  openrouter: ^1.0.0
Then run:
flutter pub get
Quick Start
import 'package:openrouter/openrouter.dart';

void main() async {
  final client = OpenRouterClient(apiKey: 'your-api-key');
  // Simple chat completion
  final response = await client.chatCompletion(
    ChatRequest(
      model: 'openai/gpt-4o',
      messages: [Message.user('Hello, world!')],
    ),
  );
  print(response.content);
}
Usage
Chat Completions
Non-Streaming
final response = await client.chatCompletion(
  ChatRequest(
    model: 'openai/gpt-4o',
    messages: [
      Message.system('You are a helpful assistant.'),
      Message.user('What is the capital of France?'),
    ],
    temperature: 0.7,
    maxTokens: 500,
  ),
);
print(response.content);
print('Tokens used: ${response.usage?.totalTokens}');
Streaming
final stream = client.streamChatCompletion(
  ChatRequest(
    model: 'anthropic/claude-3.5-sonnet',
    messages: [Message.user('Tell me a story.')],
    stream: true,
  ),
);
await for (final chunk in stream) {
  print(chunk.contentChunk);
}
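In a Flutter UI you typically accumulate the streamed chunks and rebuild as they arrive. A minimal sketch reusing the contentChunk field from above (the StringBuffer handling is ours, not part of the SDK):

final buffer = StringBuffer();
await for (final chunk in stream) {
  buffer.write(chunk.contentChunk);
  // In a widget, call setState (or update a ValueNotifier) here so the
  // partial reply is rendered as it streams in.
}
print('Full reply: $buffer');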
Embeddings
final response = await client.createEmbeddings(
  EmbeddingsRequest(
    model: 'openai/text-embedding-3-small',
    input: 'The quick brown fox jumps over the lazy dog.',
  ),
);
print('Embedding dimensions: ${response.embeddings.first.embedding.length}');
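Embeddings are usually compared with cosine similarity. A small helper sketch, assuming each embedding is a List<double> as returned above:

import 'dart:math' as math;

// Cosine similarity between two equal-length embedding vectors.
double cosineSimilarity(List<double> a, List<double> b) {
  var dot = 0.0, normA = 0.0, normB = 0.0;
  for (var i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (math.sqrt(normA) * math.sqrt(normB));
}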
Model Listing
final models = await client.listModels();
for (final model in models) {
  print('${model.id}: ${model.description}');
}
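Model ids are prefixed with the provider name, so the list can be narrowed with a plain filter. A sketch using only the id field shown above:

final models = await client.listModels();
// Keep only Anthropic-hosted models, e.g. 'anthropic/claude-3.5-sonnet'.
final anthropicModels =
    models.where((model) => model.id.startsWith('anthropic/')).toList();
for (final model in anthropicModels) {
  print(model.id);
}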
Responses API (Beta)
The Responses API is a stateless API compatible with OpenAI's Responses format:
// Simple text request
final response = await client.createResponse(
ResponsesRequest.simple(
input: 'What is the weather in Tokyo?',
model: 'openai/gpt-4o',
),
);
print(response.outputText);
With Reasoning and Tools
final request = ResponsesRequest.withMessages(
  input: [
    EasyInputMessage.user('What is the latest news about AI?'),
  ],
  model: 'openai/gpt-4o',
  tools: [
    WebSearchPreviewTool(),
  ],
);
final response = await client.createResponse(request);

// Access reasoning
final reasoning = response.reasoningOutput;
if (reasoning != null) {
  for (final summary in reasoning.summary ?? []) {
    print('Thinking: ${summary.text}');
  }
}

// Access content
print(response.messageContent);
Streaming Responses
final stream = client.streamResponse(
  ResponsesRequest(
    input: [EasyInputMessage.user('Explain quantum computing.')],
    model: 'openai/gpt-4o',
    stream: true,
  ),
);
await for (final chunk in stream) {
  final text = chunk.outputText;
  if (text != null) print(text);
}
Error Handling
try {
  final response = await client.chatCompletion(request);
  print(response.content);
} on OpenRouterRateLimitException catch (e) {
  print('Rate limited. Retry after: ${e.retryAfter}');
} on OpenRouterAuthenticationException catch (e) {
  print('Authentication failed: ${e.message}');
} on OpenRouterApiException catch (e) {
  // Catch the generic API error last so the specific cases above are not shadowed.
  print('API Error: ${e.message} (Status: ${e.statusCode})');
} catch (e) {
  print('Unexpected error: $e');
}
Configuration
Base URL
By default, the client uses https://openrouter.ai/api. You can customize this:
final client = OpenRouterClient(
  apiKey: 'your-api-key',
  baseUrl: 'https://custom-proxy.com/api',
);
HTTP Headers
Additional headers can be added for features like prompt caching:
final client = OpenRouterClient(
  apiKey: 'your-api-key',
  headers: {
    'X-Prompt-Cache-Key': 'my-cache-key',
  },
);
Example App
The package includes a comprehensive example app demonstrating all features:
cd example
flutter run
The example app includes:
- Chat Screen - Interactive chat interface with streaming support
- Responses Screen - Demo of the new Responses API with reasoning and web search
- Models Screen - Browse and filter all available models
- Settings Screen - API key management and usage statistics
API Reference
Client Methods
- chatCompletion(ChatRequest) - Single chat completion
- streamChatCompletion(ChatRequest) - Streaming chat completion
- createEmbeddings(EmbeddingsRequest) - Generate embeddings
- listModels() - List all available models
- listProviders() - List all providers
- getGeneration(String) - Get generation metadata by ID
- getKeyInfo() - Get API key usage information
- createResponse(ResponsesRequest) - Create a response (non-streaming)
- streamResponse(ResponsesRequest) - Stream responses
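The utility methods follow the same async pattern as the chat calls. A quick sketch ('gen-abc123' is a placeholder id; field names on the returned objects may differ, so check the package's models):

// List providers and inspect your key's usage.
final providers = await client.listProviders();
print('Providers available: ${providers.length}');
final keyInfo = await client.getKeyInfo();
print('Key info: $keyInfo');
// Look up token usage and cost for an earlier request by generation id.
final generation = await client.getGeneration('gen-abc123');
print('Generation metadata: $generation');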
Request Models
- ChatRequest - Chat completion request
- EmbeddingsRequest - Embeddings request
- ResponsesRequest - Responses API request
Response Models
- ChatResponse - Chat completion response
- EmbeddingsResponse - Embeddings response
- ResponsesResponse - Responses API response
- StreamingChunk - Streaming chat chunk
- ResponsesStreamingChunk - Streaming response chunk
Supported Models
OpenRouter provides access to 400+ models including:
- OpenAI: GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo, DALL-E 3
- Anthropic: Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku
- Google: Gemini Pro, Gemini Flash
- Meta: Llama 3, Llama 3.1
- Mistral: Mistral Large, Mistral Medium, Mistral Small
- And many more...
See the OpenRouter documentation for the complete list.
Additional Information
OpenRouter Features
- Automatic Fallbacks - If a model/provider is down, requests automatically fall back to alternatives
- Prompt Caching - Cache prompts for faster responses and lower costs
- Request IDs - Track requests for debugging and analytics
- Provider Routing - Choose specific providers for each request
- Credits System - Transparent pricing with detailed cost breakdowns
Getting an API Key
- Visit openrouter.ai
- Sign up for an account
- Go to the API Keys section
- Create a new key
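To keep the key out of source control, a common Flutter pattern is to inject it at build time with --dart-define (standard Dart tooling, nothing SDK-specific):

// Run with: flutter run --dart-define=OPENROUTER_API_KEY=your-api-key
const apiKey = String.fromEnvironment('OPENROUTER_API_KEY');
final client = OpenRouterClient(apiKey: apiKey);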
Pricing
OpenRouter provides transparent pricing across all models. Visit the pricing page for details.
Contributing
Contributions are welcome! Please read our contributing guide before submitting pull requests.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
Acknowledgments
- Thanks to OpenRouter for providing the API
- Built with Flutter and Dart
Note: This is an unofficial Flutter SDK. It is not officially maintained by OpenRouter.