flutter_ai_providers 0.1.0
Community-contributed providers for the Flutter AI Toolkit.
Flutter AI Toolkit Community Providers #
Features #
- 🤖 Multiple LLM Providers Support:
- OpenAI (GPT-4o, o1, etc.)
- Anthropic (Claude)
- Ollama (Local Models)
- 💬 Streaming Responses: Real-time message streaming for a smooth chat experience
- 🖼️ Image Understanding: Support for image attachments in conversations
Getting Started #
Installation #
Add the following dependencies to your project:
flutter pub add flutter_ai_toolkit flutter_ai_providers
Or, if you prefer to do it manually, add the following to your pubspec.yaml file:
dependencies:
  flutter_ai_toolkit: {version}
  flutter_ai_providers: {version}
Usage #
You can find a complete example in the official Flutter AI Toolkit repository. Just replace the existing provider with the one you want to use.
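As a minimal sketch of what "replacing the provider" means in practice: the Flutter AI Toolkit exposes an `LlmChatView` widget that takes the provider as a constructor argument, so any provider from this package can be dropped in. (The widget tree below is illustrative, not taken from the example repository.)

```dart
import 'package:flutter/material.dart';
import 'package:flutter_ai_toolkit/flutter_ai_toolkit.dart';
import 'package:flutter_ai_providers/flutter_ai_providers.dart';

void main() => runApp(const ChatApp());

class ChatApp extends StatelessWidget {
  const ChatApp({super.key});

  @override
  Widget build(BuildContext context) => MaterialApp(
        home: Scaffold(
          // Swap in any provider from this package here.
          body: LlmChatView(
            provider: OpenAIProvider(
              apiKey: 'your-api-key', // never ship a real key in client code
              model: 'gpt-4o',
            ),
          ),
        ),
      );
}
```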
Providers #
The following providers are currently supported: OpenAI, Anthropic, Ollama, and Open WebUI (see the sections below).
For Google Gemini AI and Firebase Vertex AI providers, please refer to the official Flutter AI Toolkit package.
Remember that your API key is a secret!
Do not share it with others or expose it in any client-side code. Production requests must be routed through your own backend server where your API key can be securely loaded.
OpenAI Provider #
final provider = OpenAIProvider(
  apiKey: 'your-api-key',
  model: 'gpt-4o',
);
With this provider you can also consume OpenAI-compatible APIs such as OpenRouter, xAI, Groq, GitHub Models, TogetherAI, Anyscale, One API, Llamafile, GPT4All, FastChat, etc. To do so, replace the baseUrl parameter with the desired API endpoint and set the required headers. For example, OpenRouter:
final provider = OpenAIProvider(
  baseUrl: 'https://openrouter.ai/api/v1',
  headers: {'api-key': 'YOUR_OPEN_ROUTER_API_KEY'},
  model: 'meta-llama/llama-3.3-70b-instruct',
);
xAI:
final provider = OpenAIProvider(
  baseUrl: 'https://api.x.ai/v1',
  headers: {'api-key': 'YOUR_XAI_API_KEY'},
  model: 'grok-beta',
);
GitHub Models:
final provider = OpenAIProvider(
  baseUrl: 'https://models.inference.ai.azure.com',
  headers: {'api-key': 'YOUR_GITHUB_TOKEN'},
  model: 'Phi-3.5-MoE-instruct',
);
And likewise for the other OpenAI-compatible services listed above.
Anthropic Provider #
final provider = AnthropicProvider(
  apiKey: 'your-api-key',
  model: 'claude-3-opus-20240229',
);
Ollama Provider #
final provider = OllamaProvider(
  model: 'llama3.2-vision',
);
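This provider talks to a locally running Ollama server, so the model has to be available on your machine first. Assuming you have the Ollama CLI installed, a typical setup looks like:

```shell
# One-time download of the vision-capable model used above
ollama pull llama3.2-vision

# Ollama usually runs as a background service; start it manually if needed
ollama serve
```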
Open WebUI Provider #
final provider = OpenWebUIProvider(
  baseUrl: 'http://localhost:3000',
  apiKey: 'your-api-key',
  model: 'llama3.1:latest',
);
Contributing #
Contributions are welcome! If you'd like to add support for additional providers or improve existing ones, please feel free to submit a pull request.
License #
This package is licensed under the MIT License.