noc_llm_dart 1.0.0
noc_llm_dart: ^1.0.0
Lightweight, asynchronous Dart & Flutter library for interacting with LLMs from cloud providers (OpenAI, Gemini, Groq, Sumopod) and local APIs (LM Studio, Ollama). Features SSE streaming, auto-provider detection, a [...]
# Changelog
All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
## [1.0.0] - 2026-04-15

### Added
- Core
  - `NocAI` class — Single entry point for all LLM providers
  - SSE Streaming — Real-time token-by-token streaming via `stream()`
  - Non-Streaming — Full response via `chat()` and `chatSync()`
  - Auto Provider Detection — Automatically detects Gemini Native from the URL
- Dynamic Connection — HTTP for local (zero SSL overhead), HTTPS for cloud
- Multi-Turn Conversations — Built-in conversation history management
- Dual JSON Parser — Handles both OpenAI-compatible and Gemini Native formats
- Typed Exceptions — `NocConnectionException`, `NocAuthException`, `NocRateLimitException`, `NocParseException`
- 5 Comprehensive Examples:
  - `01_sumopod.dart` — Sumopod Cloud with deepseek-v3
  - `02_openai.dart` — Classic GPT-4/GPT-3.5 integration
  - `03_gemini_native.dart` — Gemini 1.5 Flash with auto-detection
  - `04_groq.dart` — Ultra-fast llama3-70b via OpenAI bridge
  - `05_local_lmstudio.dart` — Local Ollama/LM Studio with zero SSL config
- Full Platform Support — Android, iOS, Linux, macOS, Web, Windows
- Bilingual Documentation — English and Indonesian README
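As a rough illustration of how the features above fit together, here is a hedged usage sketch. Only the names `NocAI`, `stream()`, `chat()`, and `NocAuthException` come from this changelog; the constructor parameters (`baseUrl`, `apiKey`, `model`) and the string-based call shapes are assumptions for illustration, not the package's confirmed API — consult the package README for the real signatures.

```dart
import 'dart:io';

import 'package:noc_llm_dart/noc_llm_dart.dart';

Future<void> main() async {
  // Hypothetical constructor — parameter names are illustrative only.
  // A local endpoint uses plain HTTP (the "zero SSL overhead" path);
  // a cloud endpoint would use HTTPS plus an apiKey.
  final ai = NocAI(
    baseUrl: 'http://localhost:11434/v1',
    model: 'llama3',
  );

  try {
    // Non-streaming: await the full response (chat() per the changelog).
    final reply = await ai.chat('Hello, who are you?');
    print(reply);

    // SSE streaming: consume tokens one by one via stream().
    await for (final token in ai.stream('Tell me a short joke')) {
      stdout.write(token);
    }
  } on NocAuthException {
    // One of the typed exceptions listed above; cloud providers
    // would raise this on a missing or invalid API key.
    stderr.writeln('Authentication failed — check your API key.');
  }
}
```

The `await for` loop is the idiomatic Dart way to consume a `Stream`, which is presumably what `stream()` returns given the SSE token-by-token design described above.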