BaseLlmInferenceOptions class abstract

Configuration object for MediaPipe's LLM inference task.

Constructors

BaseLlmInferenceOptions()

Properties

cacheDir String
Directory path for storing the model-related tokenizer and cache weights. The caller is responsible for providing a directory that is writable by the program. Used by CPU only.
no setter
decodeStepsPerSync int
Number of decode steps per sync. Used by GPU only. The default value is 3.
no setter
hashCode int
The hash code for this object.
no setter, inherited
loraPath String
Path to the LoRA tflite flatbuffer file. Optional (default is empty string). This is only compatible with GPU models.
no setter
maxTokens int
The total length of the kv-cache, which bounds the combined number of input and output tokens.
no setter
modelPath String
The path to the tflite model file to use for inference.
no setter
props List<Object?>
The list of properties that will be used to determine whether two instances are equal.
no setter
randomSeed int
Random seed for sampling tokens.
no setter
runtimeType Type
A representation of the runtime type of the object.
no setter, inherited
sequenceBatchSize int
Sequence batch size for encoding: the number of input tokens processed at a time during batch processing. Used by GPU only. Setting this value to 1 means encoding and decoding share the same graph with a sequence length of 1; setting it to 0 lets the batch size be chosen programmatically.
no setter
stringify bool?
If set to true, the toString method will be overridden to output this instance's props.
no setter, inherited
temperature double
The amount of randomness injected when decoding the next token; lower values make sampling more deterministic.
no setter
topK int
Top K number of tokens to be sampled from for each decoding step.
no setter
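Because the class is abstract, it is consumed through a concrete implementer that supplies values for the getters above. The following is a minimal, hypothetical sketch of such a subclass; the class name (`MyLlmOptions`), the chosen defaults, and the assumption that `props` is already implemented by the base class are illustrative guesses, not part of the package.

```dart
// Purely illustrative sketch: a concrete subclass of the abstract
// BaseLlmInferenceOptions. Field names mirror the getters documented
// above; defaults here are assumptions, not the package's own.
class MyLlmOptions extends BaseLlmInferenceOptions {
  MyLlmOptions({
    required this.modelPath,
    this.cacheDir = '',
    this.loraPath = '',
    this.maxTokens = 512,
    this.temperature = 0.8,
    this.topK = 40,
    this.randomSeed = 0,
    this.decodeStepsPerSync = 3,
    this.sequenceBatchSize = 0,
  });

  @override
  final String modelPath; // path to the tflite model file

  @override
  final String cacheDir; // CPU only: writable directory for cache weights

  @override
  final String loraPath; // GPU only: optional LoRA flatbuffer file

  @override
  final int maxTokens; // total kv-cache length

  @override
  final double temperature; // sampling randomness

  @override
  final int topK; // tokens sampled from per decoding step

  @override
  final int randomSeed; // seed for token sampling

  @override
  final int decodeStepsPerSync; // GPU only

  @override
  final int sequenceBatchSize; // GPU only; 0 = chosen programmatically
}
```

Since the base class exposes these values as getters with no setters, a subclass like this one is effectively immutable once constructed, which fits the `props`-based equality described above.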

Methods

noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.

Operators

operator ==(Object other) → bool
The equality operator.
inherited