LlmInferenceExecutor class

Executes MediaPipe's inference task.


Constructors

LlmInferenceExecutor(LlmInferenceOptions options)
Executes MediaPipe's inference task.
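
A minimal construction sketch. The package import path, the LlmInferenceOptions.gpu factory, and its modelPath/maxTokens parameters are assumptions here; see LlmInferenceOptions for the actual option fields.

import 'package:mediapipe_genai/mediapipe_genai.dart'; // assumed import path

// Hypothetical options setup; the real factories and parameters are
// documented on LlmInferenceOptions.
final options = LlmInferenceOptions.gpu(
  modelPath: '/path/to/model.bin',
  maxTokens: 512,
);
final executor = LlmInferenceExecutor(options);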

Properties

hashCode int
The hash code for this object.
no setter, inherited
options LlmInferenceOptions
Initialization values for the worker.
final, inherited
runtimeType Type
A representation of the runtime type of the object.
no setter, inherited
taskName String
Debug value for log statements.
getter/setter pair
worker Pointer<Void>
The native MediaPipe object which will complete this task.
no setter, inherited

Methods

cancel() → void
Terminates an in-progress query, closing down the stream.
closeWorker(Pointer<Void> worker, Pointer<Pointer<Char>> error) → int
Releases the worker object behind this task.
createResultsPointer() → Pointer<LlmResponseContext>
Allocates this task's results struct in native memory.
createWorker(Pointer<LlmSessionConfig> options, Pointer<Pointer<Char>> error) → Pointer<Void>
Allocates this task's worker object in native memory.
dispose() → void
Releases all native resources and closes any open streams.
generateResponse(String text) → Stream<LlmResponseContext>
Generates a response based on the input text (see the usage sketch after this list).
handleErrorMessage(Pointer<Pointer<Char>> errorMessage, [int? status]) → void
Throws an exception if errorMessage is non-empty.
inherited
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
sizeInTokens(String text) → int
Runs only the LLM's tokenization step and returns the size (in tokens) of the result.
toString() → String
A string representation of this object.
inherited
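
A usage sketch for a full request cycle, assuming an executor built as in the constructor example above and an enclosing async function. Only members listed on this page are used; each streamed LlmResponseContext chunk is simply logged, since its fields are documented on that class.

// Count the prompt's tokens without generating a response.
final promptTokens = executor.sizeInTokens('Tell me a story.');
print('Prompt occupies $promptTokens tokens');

// Stream the generated response; call executor.cancel() to stop early.
await for (final LlmResponseContext chunk
    in executor.generateResponse('Tell me a story.')) {
  print(chunk);
}

// Release native resources and close any open streams when finished.
executor.dispose();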

Operators

operator ==(Object other) → bool
The equality operator.
inherited