LlmInferenceEngine class
Utility to query an LLM with a prompt and receive its response as a stream.
- Inheritance
  - Object
  - BaseLlmInferenceEngine
  - LlmInferenceEngine
Constructors
- LlmInferenceEngine(LlmInferenceOptions options)
- Creates an engine configured with the given options.
Properties
- hashCode → int
  The hash code for this object.
  no setter, inherited
- runtimeType → Type
  A representation of the runtime type of the object.
  no setter, inherited
Methods
- dispose() → void
  Releases all native resources.
- generateResponse(String text) → Stream<String>
  Generates a response based on the input text.
  override
- noSuchMethod(Invocation invocation) → dynamic
  Invoked when a nonexistent method or property is accessed.
  inherited
- sizeInTokens(String text) → Future<int>
  Runs an invocation of only the tokenization for the LLM, and returns the size (in tokens) of the result.
  override
- toString() → String
  A string representation of this object.
  inherited
Operators
- operator ==(Object other) → bool
  The equality operator.
  inherited
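A minimal usage sketch based only on the signatures listed above. The import path and the arguments to `LlmInferenceOptions` are assumptions (the option fields are not documented on this page); consult the package's own documentation for the real names.

```dart
// Hypothetical import path; adjust to the actual package location.
import 'package:your_llm_package/llm_inference_engine.dart';

Future<void> main() async {
  // LlmInferenceOptions' fields are an assumption here; fill in
  // whatever the package actually requires (model path, etc.).
  final engine = LlmInferenceEngine(LlmInferenceOptions(/* ... */));
  try {
    // sizeInTokens runs only the tokenizer, which is useful for
    // checking a prompt's length before generating.
    final tokenCount = await engine.sizeInTokens('Hello, world!');
    print('Prompt is $tokenCount tokens');

    // generateResponse yields the reply incrementally as a Stream.
    await for (final chunk in engine.generateResponse('Hello, world!')) {
      print(chunk);
    }
  } finally {
    // dispose releases all native resources; call it when finished.
    engine.dispose();
  }
}
```

Wrapping the calls in `try`/`finally` ensures `dispose()` runs even if generation throws, which matters because the engine holds native resources.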