llm_core 0.1.5

llm_core: ^0.1.5

Core abstractions for LLM (Large Language Model) interactions. Provides common interfaces, models, and utilities used by LLM backend implementations.
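
The snippet below is a minimal sketch of that architecture: a shared abstract interface that concrete backend packages implement. The member shapes, the streamChat() signature, and the EchoBackend class are hypothetical stand-ins for illustration, not the actual llm_core API.

```dart
// Illustrative only: member shapes are assumptions, not the real llm_core API.
abstract class LLMChatRepository {
  /// Streams model output for a list of chat messages.
  Stream<String> streamChat(List<String> messages);
}

/// A toy backend; a real implementation (OpenAI, Ollama, ...) would make
/// HTTP calls and map the wire format onto the shared abstractions.
class EchoBackend implements LLMChatRepository {
  @override
  Stream<String> streamChat(List<String> messages) =>
      Stream.fromIterable(messages.map((m) => 'echo: $m'));
}

Future<void> main() async {
  final LLMChatRepository repo = EchoBackend();
  await for (final chunk in repo.streamChat(['Hello'])) {
    print(chunk);
  }
}
```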

Changelog #

All notable changes to this project will be documented in this file.

The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

[Unreleased] #

0.1.5 - 2026-01-26 #

Added #

  • StreamChatOptions class to encapsulate all streaming chat options and reduce parameter proliferation
  • RetryConfig and RetryUtil for configurable retry logic with exponential backoff (see the sketch after this list)
  • TimeoutConfig for flexible timeout configuration (connection, read, total, large payloads)
  • LLMMetrics interface and DefaultLLMMetrics implementation for optional metrics collection
  • chatResponse() method on LLMChatRepository for non-streaming complete responses
  • Input validation utilities in Validation class
  • ChatRepositoryBuilderBase for implementing builder patterns in repository implementations
  • StreamChatOptionsMerger for merging options from multiple sources
  • HTTP client utilities (HttpClientHelper) for consistent request handling
  • Error handling utilities (ErrorHandlers, BackendErrorHandler) for standardized error processing
  • Tool execution utilities (ToolExecutor) for managing tool calling workflows
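
To illustrate the retry behavior described in this release, here is a minimal, self-contained sketch of retry with exponential backoff. The RetryConfig fields, their defaults, and the retryWithBackoff helper are assumptions made for the sketch and may not match the actual RetryConfig/RetryUtil API.

```dart
import 'dart:math';

/// Illustrative only: field names and defaults are assumptions.
class RetryConfig {
  final int maxAttempts;
  final Duration initialDelay;
  final double backoffMultiplier;
  final Duration maxDelay;

  const RetryConfig({
    this.maxAttempts = 3,
    this.initialDelay = const Duration(milliseconds: 500),
    this.backoffMultiplier = 2.0,
    this.maxDelay = const Duration(seconds: 30),
  });
}

/// Retries [action] with exponentially growing delays between attempts.
Future<T> retryWithBackoff<T>(
  Future<T> Function() action, {
  RetryConfig config = const RetryConfig(),
}) async {
  var delay = config.initialDelay;
  for (var attempt = 1; ; attempt++) {
    try {
      return await action();
    } catch (_) {
      if (attempt >= config.maxAttempts) rethrow;
      await Future<void>.delayed(delay);
      // Grow the delay by the multiplier, capped at the configured ceiling.
      final nextMs = (delay.inMilliseconds * config.backoffMultiplier).round();
      delay = Duration(
        milliseconds: min(nextMs, config.maxDelay.inMilliseconds),
      );
    }
  }
}
```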

Changed #

  • streamChat() now accepts an optional StreamChatOptions parameter (see the sketch after this list)
  • Improved error handling and retry logic across all backends
  • Enhanced documentation
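
The following self-contained sketch shows the options-object pattern this change introduces: a single optional argument replaces a growing list of named parameters. The StreamChatOptions fields and the streamChat() signature here are hypothetical and may not match the real API.

```dart
/// Illustrative only: hypothetical fields, not the actual StreamChatOptions.
class StreamChatOptions {
  final double? temperature;
  final int? maxTokens;
  final Duration? timeout;

  const StreamChatOptions({this.temperature, this.maxTokens, this.timeout});
}

/// One options object instead of many individual named parameters.
Stream<String> streamChat(String prompt, {StreamChatOptions? options}) async* {
  // A real backend would forward options?.temperature and options?.maxTokens
  // to the model API and apply options?.timeout to the HTTP request.
  yield 'echo: $prompt (temperature=${options?.temperature ?? 1.0})';
}

Future<void> main() async {
  const options = StreamChatOptions(temperature: 0.7, maxTokens: 1024);
  await for (final chunk in streamChat('Hello', options: options)) {
    print(chunk);
  }
}
```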

0.1.0 - 2026-01-19 #

Added #

  • Initial release
  • Core abstractions for LLM interactions:
    • LLMChatRepository - Abstract interface for chat completions
    • LLMMessage - Message representation with roles and content
    • LLMResponse - Response wrapper with metadata
    • LLMChunk - Streaming response chunks
    • LLMEmbedding - Text embedding representation
  • Tool calling support (see the sketch after this list):
    • LLMTool - Tool definition with JSON Schema parameters
    • LLMToolCall - Tool invocation representation
    • LLMToolParam - Parameter definitions
  • Exception types for error handling
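
The sketch below illustrates the tool-calling model: a tool definition whose parameters render to a JSON Schema object, the format most chat-completion APIs expect. The field names and the toJsonSchema() helper are hypothetical; the real LLMTool/LLMToolParam classes may differ.

```dart
import 'dart:convert';

/// Illustrative only: hypothetical parameter definition.
class LLMToolParam {
  final String name;
  final String type;
  final String description;
  final bool required;

  const LLMToolParam({
    required this.name,
    required this.type,
    required this.description,
    this.required = false,
  });
}

/// Illustrative only: hypothetical tool definition.
class LLMTool {
  final String name;
  final String description;
  final List<LLMToolParam> params;

  const LLMTool({
    required this.name,
    required this.description,
    this.params = const [],
  });

  /// Renders the parameters as a JSON Schema object.
  Map<String, Object?> toJsonSchema() => {
        'name': name,
        'description': description,
        'parameters': {
          'type': 'object',
          'properties': {
            for (final p in params)
              p.name: {'type': p.type, 'description': p.description},
          },
          'required': [
            for (final p in params.where((p) => p.required)) p.name,
          ],
        },
      };
}

void main() {
  const tool = LLMTool(
    name: 'get_weather',
    description: 'Look up the current weather for a city',
    params: [
      LLMToolParam(
        name: 'city',
        type: 'string',
        description: 'City name',
        required: true,
      ),
    ],
  );
  print(const JsonEncoder.withIndent('  ').convert(tool.toJsonSchema()));
}
```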