io library

Classes

LlmInferenceEngine
Utility to query an LLM with a prompt and receive its response as a stream.
LlmInferenceExecutor
Executes MediaPipe's inference task.
LlmInferenceOptions
Configuration object for a MediaPipe LLM inference task.
LlmResponseContext
Represents all or part of an LLM's response to a query.
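A minimal usage sketch tying these classes together. The constructor shape and the `generateResponse` method name are assumptions inferred from the class descriptions above, not confirmed API; consult the generated class pages for the real signatures.

```dart
import 'dart:async';

// Hypothetical sketch: assumes LlmInferenceEngine is constructed from an
// LlmInferenceOptions object and exposes the response as a Stream<String>.
Future<void> main() async {
  final options = LlmInferenceOptions(
    modelPath: '/path/to/model.bin', // assumed parameter name
    maxTokens: 512,                  // assumed parameter name
  );
  final engine = LlmInferenceEngine(options);

  // Receive the LLM's response incrementally as it is generated.
  await for (final chunk in engine.generateResponse('Tell me a joke.')) {
    print(chunk);
  }
}
```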

Typedefs

LlmResponseCallback = Void Function(Pointer<Void>, Pointer<LlmResponseContext>)
Shape of the function MediaPipe calls with each additional response chunk from the LLM.