llm_core 0.1.5
Core abstractions for LLM (Large Language Model) interactions. Provides common interfaces, models, and utilities used by LLM backend implementations.
# Changelog
All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
## [Unreleased]
## 0.1.5 - 2026-01-26

### Added
- `StreamChatOptions` class to encapsulate all streaming chat options and reduce parameter proliferation
- `RetryConfig` and `RetryUtil` for configurable retry logic with exponential backoff
- `TimeoutConfig` for flexible timeout configuration (connection, read, total, large payloads)
- `LLMMetrics` interface and `DefaultLLMMetrics` implementation for optional metrics collection
- `chatResponse()` method on `LLMChatRepository` for non-streaming complete responses
- Input validation utilities in `Validation` class
- `ChatRepositoryBuilderBase` for implementing builder patterns in repository implementations
- `StreamChatOptionsMerger` for merging options from multiple sources
- HTTP client utilities (`HttpClientHelper`) for consistent request handling
- Error handling utilities (`ErrorHandlers`, `BackendErrorHandler`) for standardized error processing
- Tool execution utilities (`ToolExecutor`) for managing tool calling workflows
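The retry additions above can be sketched as follows. The changelog names `RetryConfig` and a retry helper with exponential backoff, but the field names, defaults, and helper signature below are illustrative assumptions, not llm_core's actual API:

```dart
import 'dart:async';

/// Sketch of configurable retry with exponential backoff.
/// NOTE: field names and the helper signature are assumptions for
/// illustration; they are not taken from the llm_core API.
class RetryConfig {
  final int maxAttempts;
  final Duration initialDelay;
  final double backoffMultiplier;
  const RetryConfig({
    this.maxAttempts = 3,
    this.initialDelay = const Duration(milliseconds: 200),
    this.backoffMultiplier = 2.0,
  });
}

Future<T> retry<T>(Future<T> Function() action, RetryConfig config) async {
  var delay = config.initialDelay;
  for (var attempt = 1; ; attempt++) {
    try {
      return await action();
    } catch (_) {
      // Give up once the attempt budget is spent; otherwise back off.
      if (attempt >= config.maxAttempts) rethrow;
      await Future<void>.delayed(delay);
      delay *= config.backoffMultiplier; // 200ms, 400ms, 800ms, ...
    }
  }
}

Future<void> main() async {
  var calls = 0;
  final result = await retry(() async {
    calls++;
    if (calls < 3) throw StateError('transient failure');
    return 'ok';
  }, const RetryConfig(maxAttempts: 5, initialDelay: Duration(milliseconds: 1)));
  print('$result after $calls calls'); // ok after 3 calls
}
```

Keeping the knobs in a single `RetryConfig` value (rather than loose parameters) mirrors the changelog's stated goal of reducing parameter proliferation.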
### Changed
- `streamChat()` now accepts optional `StreamChatOptions` parameter
- Improved error handling and retry logic across all backends
- Enhanced documentation
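An optional options parameter like the one `streamChat()` gained can be sketched as below; `StreamChatOptions` is from the changelog, but its fields and this `streamChat` signature are assumptions (a real backend would stream model chunks rather than echo words):

```dart
import 'dart:async';

/// Hypothetical options bag; field names are assumptions, not
/// llm_core's actual StreamChatOptions definition.
class StreamChatOptions {
  final double? temperature;
  final int? maxTokens;
  const StreamChatOptions({this.temperature, this.maxTokens});
}

/// Toy stand-in for a streaming chat call: options are optional, so
/// existing call sites keep working unchanged.
Stream<String> streamChat(String prompt, {StreamChatOptions? options}) async* {
  final limit = options?.maxTokens ?? 1 << 30;
  var emitted = 0;
  for (final word in prompt.split(' ')) {
    if (emitted++ >= limit) break;
    yield word;
  }
}

Future<void> main() async {
  final chunks = await streamChat('hello streaming world',
          options: const StreamChatOptions(maxTokens: 2))
      .toList();
  print(chunks); // [hello, streaming]
}
```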
## 0.1.0 - 2026-01-19

### Added
- Initial release
- Core abstractions for LLM interactions:
  - `LLMChatRepository` - Abstract interface for chat completions
  - `LLMMessage` - Message representation with roles and content
  - `LLMResponse` - Response wrapper with metadata
  - `LLMChunk` - Streaming response chunks
  - `LLMEmbedding` - Text embedding representation
- Tool calling support:
  - `LLMTool` - Tool definition with JSON Schema parameters
  - `LLMToolCall` - Tool invocation representation
  - `LLMToolParam` - Parameter definitions
- Exception types for error handling
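The tool-calling abstractions listed above might look like the following sketch, which renders a tool as the kind of JSON Schema object backends typically expect. The class and field names beyond `LLMTool`/`LLMToolParam` themselves are assumptions, not llm_core's actual definitions:

```dart
import 'dart:convert';

/// Hypothetical parameter definition; fields are assumptions.
class LLMToolParam {
  final String name;
  final String type;
  final String description;
  final bool required;
  const LLMToolParam(this.name, this.type, this.description,
      {this.required = true});
}

/// Hypothetical tool definition with JSON Schema parameters.
class LLMTool {
  final String name;
  final String description;
  final List<LLMToolParam> params;
  const LLMTool(this.name, this.description, this.params);

  /// Render the tool as a JSON Schema object of the shape commonly
  /// sent to tool-calling endpoints.
  Map<String, Object?> toJsonSchema() => {
        'name': name,
        'description': description,
        'parameters': {
          'type': 'object',
          'properties': {
            for (final p in params)
              p.name: {'type': p.type, 'description': p.description},
          },
          'required': [
            for (final p in params.where((p) => p.required)) p.name
          ],
        },
      };
}

void main() {
  const tool = LLMTool('get_weather', 'Look up current weather', [
    LLMToolParam('city', 'string', 'City name'),
    LLMToolParam('unit', 'string', 'celsius or fahrenheit', required: false),
  ]);
  print(jsonEncode(tool.toJsonSchema()));
}
```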