providers/openai/openai library
Modular OpenAI Provider
This library provides a modular implementation of the OpenAI provider: each capability (chat, embeddings, audio, and so on) lives in its own focused module.
Key Benefits:
- Single Responsibility: Each module handles one capability
- Easier Testing: Modules can be tested independently
- Better Maintainability: Changes isolated to specific modules
- Cleaner Code: Smaller, focused classes
- Reusability: Modules can be reused across providers
Usage:

```dart
import 'package:llm_dart/providers/openai/openai.dart';

final provider = OpenAIProvider(OpenAIConfig(
  apiKey: 'your-api-key',
  model: 'gpt-4',
));

// Use any capability through the same external API.
final response = await provider.chat(messages);
final embeddings = await provider.embed(['text']);
final audio = await provider.speech('Hello world');
```
Classes
- ModerationAnalysis: Extended moderation analysis result
- ModerationStats: Moderation statistics for a batch of texts
- OpenAIAssistants: OpenAI Assistant Management capability implementation
- OpenAIAudio: OpenAI Audio capabilities implementation
- OpenAIBuiltInTool: Base class for OpenAI built-in tools
- OpenAIBuiltInTools: Convenience factory methods for creating built-in tools
- OpenAIChat: OpenAI Chat capability implementation
- OpenAIChatResponse: OpenAI chat response implementation
- OpenAIClient: Core OpenAI HTTP client shared across all capability modules
- OpenAICompletion: OpenAI Text Completion capability implementation
- OpenAIComputerUseTool: Computer use built-in tool
- OpenAIConfig: OpenAI provider configuration
- OpenAIEmbeddings: OpenAI Embeddings capability implementation
- OpenAIFiles: OpenAI File Management capability implementation
- OpenAIFileSearchTool: File search built-in tool
- OpenAIImages: OpenAI Image Generation capability implementation
- OpenAIModels: OpenAI Model Listing capability implementation
- OpenAIModeration: OpenAI Content Moderation capability implementation
- OpenAIProvider: OpenAI Provider implementation
- OpenAIResponses: OpenAI Responses API capability implementation
- OpenAIResponsesResponse: OpenAI Responses API response implementation
- OpenAIWebSearchTool: Web search built-in tool
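The built-in tool classes above are typically obtained through the OpenAIBuiltInTools factory. A minimal sketch, assuming factory method names `webSearch()` and `fileSearch()` (the exact names are not confirmed by this page; check the class reference):

```dart
import 'package:llm_dart/providers/openai/openai.dart';

// Hypothetical sketch: `webSearch()` and `fileSearch()` are assumed
// factory names on OpenAIBuiltInTools, not a confirmed API.
final OpenAIBuiltInTool search = OpenAIBuiltInTools.webSearch();
final OpenAIBuiltInTool fileSearch = OpenAIBuiltInTools.fileSearch();
```

Built-in tools of this kind (web search, file search, computer use) belong to OpenAI's Responses API, so they would be consumed through the OpenAIResponses capability rather than plain chat completions.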
Enums
- CompletionUseCase: Use cases for completion optimization
- OpenAIBuiltInToolType: OpenAI built-in tool types
Functions
- createAzureOpenAIProvider({required String apiKey, required String endpoint, required String deploymentName, String apiVersion = '2024-02-15-preview', double? temperature, int? maxTokens, String? systemPrompt}) → OpenAIProvider: Create an OpenAI provider for Azure OpenAI
- createCopilotProvider({required String apiKey, String model = ProviderDefaults.githubCopilotDefaultModel, double? temperature, int? maxTokens, String? systemPrompt}) → OpenAIProvider: Create an OpenAI provider for GitHub Copilot
- createDeepSeekProvider({required String apiKey, String model = ProviderDefaults.deepseekDefaultModel, double? temperature, int? maxTokens, String? systemPrompt}) → OpenAIProvider: Create an OpenAI provider for DeepSeek
- createGroqProvider({required String apiKey, String model = ProviderDefaults.groqDefaultModel, double? temperature, int? maxTokens, String? systemPrompt}) → OpenAIProvider: Create an OpenAI provider for Groq
- createOpenAIProvider({required String apiKey, String model = ProviderDefaults.openaiDefaultModel, String baseUrl = ProviderDefaults.openaiBaseUrl, double? temperature, int? maxTokens, String? systemPrompt}) → OpenAIProvider: Create an OpenAI provider with default settings
- createOpenRouterProvider({required String apiKey, String model = ProviderDefaults.openRouterDefaultModel, double? temperature, int? maxTokens, String? systemPrompt}) → OpenAIProvider: Create an OpenAI provider for OpenRouter
- createTogetherProvider({required String apiKey, String model = ProviderDefaults.togetherAIDefaultModel, double? temperature, int? maxTokens, String? systemPrompt}) → OpenAIProvider: Create an OpenAI provider for Together AI