lite_agent_core_dart 0.1.1
LLM AI Agent sessions service in Dart.
LiteAgent core for dart #
A multi-session LLM AI agent service.
Features #
- Supports the OpenAPI, OpenRPC, OpenModbus, and OpenTool JSON specs.
- Supports LLM function calling to HTTP APIs, JSON-RPC 2.0 over HTTP, Modbus, and more custom tools.
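As an illustration only (not taken from this package's `/example/json` files), a minimal OpenAPI 3.0 spec describing one callable endpoint on a hypothetical tool server might look like:

```json
{
  "openapi": "3.0.0",
  "info": { "title": "Demo Tool Server", "version": "1.0.0" },
  "servers": [{ "url": "http://localhost:8080" }],
  "paths": {
    "/temperature": {
      "get": {
        "operationId": "getTemperature",
        "summary": "Read the current temperature",
        "responses": { "200": { "description": "OK" } }
      }
    }
  }
}
```

The `operationId` is what an LLM function call would be mapped to; see the package's own example specs for the exact shape it expects.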
Usage #
Prepare #
- Prepare some OpenSpec JSON files, following the examples in `/example/json/*.json`; these describe the callable tools.
- Run your tool server, which is described in the JSON file.
- Add a `.env` file in the `example` folder with the following content:

```
baseUrl = https://xxx.xxx.com        # LLM API base URL
apiKey = sk-xxxxxxxxxxxxxxxxxxxx     # LLM API key
```

- Use one of the methods below to run the agent service.
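For example, the `.env` file can be created from the shell (the values below are the README's placeholders; substitute your real endpoint and key):

```shell
# Write placeholder LLM credentials into example/.env
mkdir -p example
cat > example/.env <<'EOF'
baseUrl = https://xxx.xxx.com
apiKey = sk-xxxxxxxxxxxxxxxxxxxx
EOF
cat example/.env
```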
Method 1 (Recommended): AgentService #
- See `/example/agent_service_example.dart`.
- Supports multiple agent sessions via session IDs.
```dart
Future<void> main() async {
  CapabilityDto capabilityDto = CapabilityDto(
    llmConfig: _buildLLMConfig(),             // LLM config
    systemPrompt: _buildSystemPrompt(),       // System prompt
    openSpecList: await _buildOpenSpecList()  // OpenSpec description string list
  );
  SessionDto sessionDto = await agentService.initChat(
    capabilityDto,
    listen // Subscribe to AgentMessage; the agent chats with User/Client/LLM/Tool roles
  ); // Get the session id
  String prompt = "<USER PROMPT, e.g. call any one tool>";
  await agentService.startChat(
    sessionDto.id, // Start chat with the session id
    [UserMessageDto(type: UserMessageType.text, message: prompt)] // User content list; supports types text/imageUrl
  );
}
```
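The `listen` callback passed to `initChat` receives each `AgentMessage` as the agent converses with the User/Client/LLM/Tool roles. A minimal sketch (the `AgentMessage` field names below are illustrative assumptions, not confirmed from this package's API; check the example file for the real shape):

```dart
// Hypothetical message handler; field names are assumptions.
void listen(AgentMessage agentMessage) {
  print('${agentMessage.from} -> ${agentMessage.to}: ${agentMessage.message}');
}
```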
Method 2: ToolAgent #
- See `/example/tool_agent_example.dart`.
- Pure native calling; supports a single session.
- Method 1 (AgentService) is a friendlier encapsulation of this.
```dart
Future<void> main() async {
  ToolAgent toolAgent = ToolAgent(
    llmRunner: _buildLLMRunner(),
    session: _buildSession(),
    toolRunnerList: await _buildToolRunnerList(),
    systemPrompt: _buildSystemPrompt()
  );
  String prompt = "<USER PROMPT, e.g. call any one tool>";
  toolAgent.userToAgent([Content(type: ContentType.text, message: prompt)]);
}
```