dart_agent_core library

A mobile-first, local-first Dart library for building stateful, tool-using AI agents with multi-provider LLM support (OpenAI, Anthropic Claude, Gemini, AWS Bedrock).

This library provides a unified interface for interacting with various LLMs, managing agent state, handling tool calls, and maintaining conversation history.
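A minimal usage sketch of the flow described above. The class names (`OpenAIClient`, `StatefulAgent`, `FileStateStorage`) appear in the index below, but the constructor parameters and the `run` method shown here are assumptions for illustration, not the library's confirmed API surface:

```dart
import 'package:dart_agent_core/dart_agent_core.dart';

Future<void> main() async {
  // Hypothetical construction; the actual constructor parameters may differ.
  final client = OpenAIClient(apiKey: 'sk-...', model: 'gpt-4o');

  final agent = StatefulAgent(
    client: client,
    tools: [/* Tool instances the agent may call */],
    // FileStateStorage persists state as "$directory/$sessionId.json".
    storage: FileStateStorage('/tmp/agent_state'),
  );

  // Hypothetical run method: sends a user message and drives the
  // LLM-call / tool-call loop until the model produces a final answer.
  final result = await agent.run('Summarize my notes from today');
  print(result);
}
```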

Classes

AfterCallLLMEvent
AfterToolCallEvent
AfterToolCallRequest
AfterToolCallResponse
AgentCallToolContext
AgentController
A flexible agent controller that wraps an EventBus and provides helper methods for interacting with it.
AgentMessageHistory
AgentResumedEvent
AgentRunSuccessedEvent
AgentStartedEvent
AgentState
Represents the state of an AI agent, including its history, token usage, active skills, and planning metadata.
AgentStoppedEvent
AgentToolResult
AudioPart
BedrockClaudeClient
BeforeCallLLMEvent
BeforeCallLLMRequest
BeforeCallLLMResponse
BeforeRunAgentRequest
BeforeRunAgentResponse
BeforeToolCallEvent
BeforeToolCallRequest
BeforeToolCallResponse
ButtonToolCallBuffer
CallLLMParams
ClaudeClient
Client for the Anthropic Messages API (direct, not via AWS Bedrock).
ContextCompressor
DefaultLoopDetector
DirectorySkillInjections
DirectorySkillLoadError
DirectorySkillLoadResult
DirectorySkillMetadata
DocumentPart
EpisodicMemory
Event
Base class for all events in the system.
EventBus
A simple EventBus that supports both Publish/Subscribe and Request/Response patterns.
EventStreamDecoder
AWS EventStream Decoder for Dart. Decodes binary application/vnd.amazon.eventstream data.
EventStreamMessage
ExecutionToolResult
FileStateStorage
A simple file-based implementation of StateStorage. Saves state to a JSON file in the specified directory, naming the file based on the session ID (e.g., "$directory/$sessionId.json").
FunctionCall
Represents a function call to be executed.
FunctionExecutionResult
FunctionExecutionResultMessage
GeminiChunkDecoder
Decodes a stream of JSON lines (potentially multi-line) from Gemini into JSON objects.
GeminiClient
ImagePart
JavaScriptBridgeContext
JavaScriptBridgeRegistry
JavaScriptExecutionResult
JavaScriptRuntime
LLMBasedContextCompressor
LLMChunkEvent
LLMClient
LLMMessage
Base class for all messages exchanged with an LLM.
LLMRetryingEvent
LoopDetector
LoopDetectorResult
ModelAudioPart
ModelConfig
ModelContentPart
ModelImagePart
ModelMessage
A message generated by the AI model, including text output, function calls, and multi-modal content.
ModelTextPart
ModelUsage
ModelVideoPart
NodeJavaScriptRuntime
Node.js-backed JavaScript runtime with bidirectional bridge calls over stdio.
OnAgentCancelEvent
OnAgentErrorEvent
OnAgentExceptionEvent
OpenAIChunkDecoder
OpenAIClient
OpenAIResponseTransformer
PlanChangedEvent
Planner
PlanState
PlanStep
ResponsesAPIResponseTransformer
ResponsesChunkDecoder
ResponsesClient
ResumeAgentRequest
ResumeAgentResponse
Skill
StatefulAgent
A stateful AI agent that orchestrates LLM calls, tool execution, skill management, and context compression.
StateStorage
StreamingControlMessage
StreamingEvent
StreamingMessage
SubAgent
SystemMessage
SystemPromptHistoryItem
SystemPromptPart
TextPart
Tool
Defines a tool that can be executed by an agent.
ToolChoice
ToolsHistoryItem
UserContentPart
Base class for user content parts.
UserMessage
A message sent by the user, which can contain multiple content parts (text, image, video, audio, document).
VideoPart
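EventBus is described above as supporting both Publish/Subscribe and Request/Response patterns. The following self-contained sketch illustrates those two patterns; it is an illustration of the idea, not the library's actual `EventBus` implementation, and the `MiniEventBus` name and its methods are hypothetical:

```dart
import 'dart:async';

// Minimal sketch of a bus supporting both patterns that EventBus describes.
class MiniEventBus {
  final _controllers = <Type, StreamController<Object>>{};
  final _responders = <Type, FutureOr<Object> Function(Object)>{};

  // Publish/Subscribe: every subscriber receives each event of type T.
  Stream<T> on<T extends Object>() =>
      (_controllers[T] ??= StreamController<Object>.broadcast())
          .stream
          .cast<T>();

  void publish<T extends Object>(T event) => _controllers[T]?.add(event);

  // Request/Response: a single registered handler answers each request.
  void respond<T extends Object>(FutureOr<Object> Function(T) handler) =>
      _responders[T] = (Object e) => handler(e as T);

  Future<R> request<T extends Object, R>(T req) async =>
      await _responders[T]!(req) as R;
}
```

Events dispatch by runtime type, so subscribing to a base `Event` type versus a concrete event type is a design decision the real library resolves with its `Event` class hierarchy.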

Extensions

ListTakeLast on List<T>
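Judging by its name, ListTakeLast adds a `takeLast` helper to `List<T>`. A plausible self-contained equivalent (the library's actual implementation may differ):

```dart
// Hypothetical reconstruction of a takeLast helper on List<T>;
// the library's ListTakeLast extension may be implemented differently.
extension ListTakeLast<T> on List<T> {
  /// Returns up to the last [n] elements, preserving order.
  List<T> takeLast(int n) =>
      n >= length ? List<T>.of(this) : sublist(length - n);
}
```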

Properties

memoryTools List<Tool>
final
skillOperationTools List<Tool>
final
subAgentTools List<Tool>
final

Typedefs

JavaScriptBridgeHandler = FutureOr Function(Map<String, dynamic> payload, JavaScriptBridgeContext context)
SystemCallback = Future<(SystemMessage?, List<Tool>, List<LLMMessage>)> Function(StatefulAgent agent, SystemMessage? systemMessage, List<Tool> tools, List<LLMMessage> requestMessages)
Callback type for intercepting and modifying system_message, tools, and request_messages before each LLM call. Receives the StatefulAgent instance as the first argument.
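A function matching the SystemCallback signature above, sketched here as an example: it trims the request history to the most recent messages and passes the system message and tools through unchanged. The signature follows the typedef as documented (a Dart 3 record return type); `SystemMessage`, `Tool`, `LLMMessage`, and `StatefulAgent` are library types, so this compiles only against dart_agent_core, and the 20-message cutoff is an arbitrary choice for illustration:

```dart
// Example SystemCallback: keep only the last 20 request messages.
Future<(SystemMessage?, List<Tool>, List<LLMMessage>)> trimHistory(
  StatefulAgent agent,
  SystemMessage? systemMessage,
  List<Tool> tools,
  List<LLMMessage> requestMessages,
) async {
  const keep = 20; // arbitrary window size for this sketch
  final trimmed = requestMessages.length <= keep
      ? requestMessages
      : requestMessages.sublist(requestMessages.length - keep);
  // System message and tools pass through unmodified.
  return (systemMessage, tools, trimmed);
}
```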

Exceptions / Errors

AgentException