flutter_ai_providers 0.3.2
Flutter AI Toolkit Community Providers #
Community-contributed providers for the Flutter AI Toolkit.
Features #
- 🤖 Multiple LLM Providers Support:
  - OpenAI (GPT-4o, o1, etc.)
  - Anthropic (Claude)
  - Ollama (Local Models)
  - Llama.cpp (Local Models)
  - Open WebUI
  - Dartantic AI (Google Gemini, OpenAI, and more)
- 💬 Streaming Responses: Real-time message streaming for a smooth chat experience
- 🖼️ Image Understanding: Support for image attachments in conversations
Getting Started #
Installation #
Add the following dependencies to your project:
flutter pub add flutter_ai_toolkit flutter_ai_providers
Or, if you prefer to do it manually, add the following to your pubspec.yaml file:
dependencies:
  flutter_ai_toolkit: {version}
  flutter_ai_providers: {version}
Usage #
You can find a complete example in the official Flutter AI Toolkit repository. Just replace the existing provider with the one you want to use.
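For instance, a provider from this package is handed straight to the toolkit's LlmChatView widget. The following is a minimal sketch, assuming the OpenAI provider described later in this README; the API key and model are placeholders:
import 'package:flutter/material.dart';
import 'package:flutter_ai_providers/flutter_ai_providers.dart';
import 'package:flutter_ai_toolkit/flutter_ai_toolkit.dart';

void main() {
  // Any provider from this package works here; OpenAI is used as an example.
  final provider = OpenAIProvider(
    apiKey: 'your-api-key',
    model: 'gpt-4o',
  );
  runApp(
    MaterialApp(
      home: Scaffold(
        // LlmChatView is the chat widget exposed by flutter_ai_toolkit.
        body: LlmChatView(provider: provider),
      ),
    ),
  );
}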
Providers #
The following providers are currently supported; see the sections below for a usage example of each.
For Google Gemini AI and Firebase Vertex AI providers, please refer to the official Flutter AI Toolkit package.
Remember that your API key is a secret!
Do not share it with others or expose it in any client-side code. Production requests must be routed through your own backend server where your API key can be securely loaded.
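One way to do this with the providers in this package is to point the baseUrl parameter (shown in the next section) at your own backend and keep the real key on the server. The endpoint and header in this sketch are assumptions about your own infrastructure, not part of this package:
// Hypothetical setup: the app calls your backend, and the backend injects the
// real API key before forwarding the request to the upstream LLM API.
final provider = OpenAIProvider(
  baseUrl: 'https://your-backend.example.com/v1', // assumption: your own proxy endpoint
  headers: {'Authorization': 'Bearer <session-token>'}, // app/user auth, not the LLM key
  model: 'gpt-4o',
);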
OpenAI Provider #
final provider = OpenAIProvider(
  apiKey: 'your-api-key',
  model: 'gpt-4o',
);
With this provider you can also consume OpenAI-compatible APIs like OpenRouter, xAI, Groq, GitHub Models, TogetherAI, Anyscale, One API, Llamafile, GPT4All, FastChat, etc. To do so, just set the baseUrl parameter to the desired API endpoint and add the required headers. For example:
OpenRouter:
final provider = OpenAIProvider(
  baseUrl: 'https://openrouter.ai/api/v1',
  headers: {'api-key': 'YOUR_OPEN_ROUTER_API_KEY'},
  model: 'meta-llama/llama-3.3-70b-instruct',
);
xAI:
final provider = OpenAIProvider(
  baseUrl: 'https://api.x.ai/v1',
  headers: {'api-key': 'YOUR_XAI_API_KEY'},
  model: 'grok-beta',
);
GitHub Models:
final provider = OpenAIProvider(
  baseUrl: 'https://models.inference.ai.azure.com',
  headers: {'api-key': 'YOUR_GITHUB_TOKEN'},
  model: 'Phi-3.5-MoE-instruct',
);
And so on for any other OpenAI-compatible service.
Anthropic Provider #
final provider = AnthropicProvider(
  apiKey: 'your-api-key',
  model: 'claude-3-opus-20240229',
);
Ollama Provider #
final provider = OllamaProvider(
  model: 'llama3.2-vision',
);
Llama.cpp Provider #
final provider = LlamaProvider(
  modelPath: '/path/to/your/model/file.gguf',
);
Open WebUI Provider #
final provider = OpenWebUIProvider(
  baseUrl: 'http://localhost:3000',
  apiKey: 'your-api-key',
  model: 'llama3.1:latest',
);
Dartantic Provider #
The Dartantic provider offers a unified interface for multiple AI models through the dartantic_ai package, supporting Google Gemini, OpenAI, and other providers.
import 'package:dartantic_ai/dartantic_ai.dart';

final provider = DartanticProvider(
  Agent(
    'google:gemini-2.0-flash', // or 'openai:gpt-4o', etc.
    apiKey: 'YOUR-API-KEY', // or pulled from your environment automatically
  ),
);
Contributing #
Contributions are welcome! If you'd like to add support for additional providers or improve existing ones, please feel free to submit a pull request.
License #
This package is licensed under the MIT License.