LlmInferenceBaseOptions constructor

LlmInferenceBaseOptions({
  String? modelAssetPath,
  JSAny? modelAssetBuffer,
})

Implementation

external factory LlmInferenceBaseOptions({
  String? modelAssetPath,      // For cacheApi/none modes (Blob URL)
  JSAny? modelAssetBuffer,     // For streaming mode (ReadableStreamDefaultReader from OPFS)
});
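
Usage sketch. A minimal, hypothetical example of constructing the options for each loading mode, assuming a `blobUrl` String (a Blob URL prepared for cacheApi/none modes) and an `opfsReader` JSAny (a ReadableStreamDefaultReader obtained from an OPFS file stream) already exist; these variable names are illustrative only.

// Blob URL path for cacheApi/none modes (hypothetical `blobUrl`).
final pathOptions = LlmInferenceBaseOptions(
  modelAssetPath: blobUrl,
);

// Streaming mode, passing a ReadableStreamDefaultReader from OPFS
// (hypothetical `opfsReader`).
final streamingOptions = LlmInferenceBaseOptions(
  modelAssetBuffer: opfsReader,
);

Only one of the two parameters is expected to be set at a time, matching the mode-specific comments in the implementation above.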