LlmInferenceEngine constructor
LlmInferenceEngine(
  LlmInferenceOptions _options, {
  Duration timeout = const Duration(seconds: 10),
  int maxRetries = 2,
})
Utility to query an LLM with a prompt and receive its response as a stream.
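A minimal usage sketch. The `generateResponse` stream method and the shape of the `options` value are assumptions drawn from the class description above, not from this page:

```dart
// Hypothetical usage; `options` is an LlmInferenceOptions instance
// configured elsewhere, and `generateResponse` is assumed to return
// a Stream<String> of partial results.
final engine = LlmInferenceEngine(
  options,
  timeout: const Duration(seconds: 30), // override the 10-second default
  maxRetries: 3,                        // override the default of 2
);

// Consume the LLM's response as it streams in.
await for (final chunk in engine.generateResponse('Tell me a joke')) {
  print(chunk);
}
```

Because the constructor kicks off `_initializeIsolate()` asynchronously, callers should expect the engine to finish warming up before the first query completes.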
Implementation
LlmInferenceEngine(
  this._options, {
  this.timeout = const Duration(seconds: 10),
  this.maxRetries = 2,
}) : _readyCompleter = Completer<bool>() {
  _initializeIsolate();
}