LlmInferenceEngine constructor

LlmInferenceEngine(
  LlmInferenceOptions _options, {
  Duration timeout = const Duration(seconds: 10),
  int maxRetries = 2,
})

Utility to query an LLM with a prompt and receive its response as a stream of tokens.
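A minimal usage sketch is shown below. The `LlmInferenceOptions.cpu` factory and the `generateResponse` method are assumptions about the surrounding package API and may differ from the actual surface; consult the package documentation for the exact names.

```dart
// Hypothetical sketch: `LlmInferenceOptions.cpu` and `generateResponse`
// are assumed names, not confirmed by this page.
final engine = LlmInferenceEngine(
  LlmInferenceOptions.cpu(
    modelPath: 'assets/model.bin',
  ),
  timeout: const Duration(seconds: 30), // override the 10-second default
  maxRetries: 3,
);

// Consume the streamed response token by token.
await for (final chunk in engine.generateResponse('Tell me a joke')) {
  print(chunk);
}
```

Because the constructor calls `_initializeIsolate()` and completes `_readyCompleter` asynchronously, the first query may wait until the engine's isolate is ready.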

Implementation

LlmInferenceEngine(
  this._options, {
  this.timeout = const Duration(seconds: 10),
  this.maxRetries = 2,
}) : _readyCompleter = Completer<bool>() {
  _initializeIsolate();
}