FakeEchoLLM class

A fake LLM for testing. It simply returns the prompt as its output, or streams it back character by character.
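A minimal usage sketch (assuming FakeEchoLLM, PromptValue and LLMResult are exported from package:langchain/langchain.dart; the import path and the result accessor may differ between package versions):

import 'package:langchain/langchain.dart'; // assumed export path

Future<void> main() async {
  const llm = FakeEchoLLM();
  // invoke() echoes the prompt back as the generated text.
  final result = await llm.invoke(PromptValue.string('Hello world!'));
  print(result.output); // Hello world!  (field name assumed)
}

The method sketches below reuse this llm instance and the same import.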

Inheritance

Constructors

FakeEchoLLM()
Creates a fake LLM for testing that simply returns the prompt as its output, or streams it back character by character.
const

Properties

defaultOptions → FakeLLMOptions
The default options to use when invoking the Runnable.
final, inherited
hashCode → int
The hash code for this object.
no setter, inherited
modelType → String
Return type of language model.
no setter, override
runtimeType → Type
A representation of the runtime type of the object.
no setter, inherited

Methods

batch(List<PromptValue> inputs, {List<FakeLLMOptions>? options}) → Future<List<LLMResult>>
Batches the invocation of the Runnable on the given inputs.
inherited
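For example, batching several prompts at once (a sketch reusing the llm instance from the class-level example above):

final results = await llm.batch([
  PromptValue.string('first'),
  PromptValue.string('second'),
]);
// One LLMResult per input, in the same order.
print(results.length); // 2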
bind(FakeLLMOptions options) → RunnableBinding<PromptValue, FakeLLMOptions, LLMResult>
Binds the Runnable to the given options.
inherited
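For instance, pre-binding default options (FakeLLMOptions is shown with no arguments here; its available fields depend on the package version):

// The returned RunnableBinding forwards the bound options on every call.
final boundLlm = llm.bind(const FakeLLMOptions());
final result = await boundLlm.invoke(PromptValue.string('Hi'));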
call(String prompt, {FakeLLMOptions? options}) → Future<String>
Runs the LLM on the given String prompt and returns a String with the generated text.
inherited
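A convenience sketch using the callable-class shorthand, which works with plain Strings instead of PromptValues:

// Calling the instance like a function delegates to call().
final text = await llm('Echo me');
print(text); // Echo me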
close() → void
Cleans up any resources associated with the Runnable.
inherited
countTokens(PromptValue promptValue, {FakeLLMOptions? options}) → Future<int>
Returns the number of tokens resulting from tokenizing the given prompt.
inherited
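For example (the exact count depends on the tokenizer backing the fake model, so it is only printed here):

final numTokens = await llm.countTokens(PromptValue.string('Hello world!'));
print(numTokens);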
getCompatibleOptions(RunnableOptions? options) → FakeLLMOptions?
Returns the given options if they are compatible with the Runnable, otherwise returns null.
inherited
invoke(PromptValue input, {LLMOptions? options}) → Future<LLMResult>
Invokes the Runnable on the given input.
override
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
pipe<NewRunOutput extends Object?, NewCallOptions extends RunnableOptions>(Runnable<LLMResult, NewCallOptions, NewRunOutput> next) → RunnableSequence<PromptValue, NewRunOutput>
Pipes the output of this Runnable into another Runnable using a RunnableSequence.
inherited
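A sketch piping the fake LLM into an output parser (assuming the package's StringOutputParser accepts an LLMResult as input; any Runnable taking LLMResult would work the same way):

// The resulting chain maps a PromptValue to the parser's String output.
final chain = llm.pipe(StringOutputParser());
final echoed = await chain.invoke(PromptValue.string('Hello'));
print(echoed); // Hello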
stream(PromptValue input, {LLMOptions? options}) → Stream<LLMResult>
Streams the output of invoking the Runnable on the given input.
override
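For FakeEchoLLM, streaming yields the prompt back one character at a time, roughly as sketched below:

// Each emitted LLMResult is assumed to carry one character of the prompt.
await for (final chunk in llm.stream(PromptValue.string('Hi!'))) {
  print(chunk.output); // H, i, ! (one chunk per character)
}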
streamFromInputStream(Stream<PromptValue> inputStream, {FakeLLMOptions? options}) → Stream<LLMResult>
Streams the output of invoking the Runnable on the given inputStream.
inherited
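A sketch feeding a stream of prompts and streaming back every result:

final prompts = Stream.fromIterable([
  PromptValue.string('one'),
  PromptValue.string('two'),
]);
await for (final chunk in llm.streamFromInputStream(prompts)) {
  print(chunk.output);
}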
tokenize(PromptValue promptValue, {LLMOptions? options}) → Future<List<int>>
Tokenizes the given prompt using the encoding used by the language model.
override
toString() → String
A string representation of this object.
inherited
withFallbacks(List<Runnable<PromptValue, RunnableOptions, LLMResult>> fallbacks) → RunnableWithFallback<PromptValue, LLMResult>
Adds fallback runnables to be invoked if the primary runnable fails.
inherited
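For example, registering a second fake model as a fallback (both are fakes here, so this only illustrates the wiring):

final llmWithFallback = llm.withFallbacks([const FakeEchoLLM()]);
final result = await llmWithFallback.invoke(PromptValue.string('Hello'));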
withRetry({int maxRetries = 3, FutureOr<bool> retryIf(Object e)?, List<Duration?>? delayDurations, bool addJitter = false}) → RunnableRetry<PromptValue, LLMResult>
Adds retry logic to an existing runnable.
inherited
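And a sketch adding retry behavior with a capped number of attempts (delays and jitter left at their defaults):

// Retries the invocation up to 2 times before rethrowing the error.
final retryingLlm = llm.withRetry(maxRetries: 2);
final result = await retryingLlm.invoke(PromptValue.string('Hello'));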

Operators

operator ==(Object other) → bool
The equality operator.
inherited