SimpleLLM<Options extends LLMOptions> class
abstract
SimpleLLM provides a simplified interface for working with LLMs. Rather than expecting the user to implement the full SimpleLLM.invoke method, the user only needs to implement SimpleLLM.callInternal.
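A minimal sketch of what a concrete subclass might look like, assuming the class is consumed through the langchain_dart packages. The import paths, the no-argument LLMOptions() constructor, and the EchoLLM / tokenize details are illustrative assumptions, not part of this API surface:

```dart
// A minimal sketch, not a definitive implementation. The import paths and the
// no-argument LLMOptions() constructor are assumptions about langchain_dart.
import 'package:langchain_core/llms.dart';
import 'package:langchain_core/prompts.dart';

/// A toy model that "generates" text by echoing the prompt back.
class EchoLLM extends SimpleLLM<LLMOptions> {
  EchoLLM() : super(defaultOptions: LLMOptions());

  @override
  String get modelType => 'echo-llm';

  @override
  Future<String> callInternal(String prompt, {LLMOptions? options}) async {
    // A real subclass would call a model API here.
    return 'Echo: $prompt';
  }

  @override
  Future<List<int>> tokenize(PromptValue promptValue, {LLMOptions? options}) async {
    // Trivial "tokenizer" for the sketch: one token id per whitespace-separated word.
    final words = promptValue.toString().split(' ');
    return List<int>.generate(words.length, (i) => i);
  }
}
```

With callInternal (plus modelType and tokenize) in place, the rest of the Runnable surface listed below (invoke, call, stream, batch, and so on) is provided by the base classes.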
- Inheritance
  - Object
  - Runnable<PromptValue, Options, LLMResult>
  - BaseLangChain<PromptValue, Options, LLMResult>
  - BaseLanguageModel<String, Options, LLMResult>
  - BaseLLM<Options>
  - SimpleLLM
- Implementers
Constructors
- SimpleLLM({required Options defaultOptions})
  SimpleLLM provides a simplified interface for working with LLMs. Rather than expecting the user to implement the full SimpleLLM.invoke method, the user only needs to implement SimpleLLM.callInternal.
  const
Properties
- defaultOptions → Options
  The default options to use when invoking the Runnable.
  final, inherited
- hashCode → int
  The hash code for this object.
  no setter, inherited
- modelType → String
  Return type of language model.
  no setter, inherited
- runtimeType → Type
  A representation of the runtime type of the object.
  no setter, inherited
Methods
- batch(List<PromptValue> inputs, {List<Options>? options}) → Future<List<LLMResult>>
  Batches the invocation of the Runnable on the given inputs.
  inherited
- bind(Options options) → RunnableBinding<PromptValue, Options, LLMResult>
  Binds the Runnable to the given options.
  inherited
- call(String prompt, {Options? options}) → Future<String>
  Runs the LLM on the given String prompt and returns a String with the generated text (see the usage sketch after this list).
  inherited
- callInternal(String prompt, {Options? options}) → Future<String>
  Method which should be implemented by subclasses to run the model.
- close() → void
  Cleans up any resources associated with the Runnable.
  inherited
- countTokens(PromptValue promptValue, {Options? options}) → Future<int>
  Returns the number of tokens resulting from tokenizing the given prompt.
  inherited
-
getCompatibleOptions(
RunnableOptions? options) → Options? -
Returns the given
options
if they are compatible with the Runnable, otherwise returnsnull
.inherited -
invoke(
PromptValue input, {Options? options}) → Future< LLMResult> -
Invokes the Runnable on the given
input
.override -
noSuchMethod(
Invocation invocation) → dynamic -
Invoked when a nonexistent method or property is accessed.
inherited
-
pipe<
NewRunOutput extends Object?, NewCallOptions extends RunnableOptions> (Runnable< LLMResult, NewCallOptions, NewRunOutput> next) → RunnableSequence<PromptValue, NewRunOutput> -
Pipes the output of this Runnable into another Runnable using a
RunnableSequence.
inherited
-
stream(
PromptValue input, {Options? options}) → Stream< LLMResult> -
Streams the output of invoking the Runnable on the given
input
.inherited -
streamFromInputStream(
Stream< PromptValue> inputStream, {Options? options}) → Stream<LLMResult> -
Streams the output of invoking the Runnable on the given
inputStream
.inherited -
tokenize(
PromptValue promptValue, {Options? options}) → Future< List< int> > -
Tokenizes the given prompt using the encoding used by the language
model.
inherited
-
toString(
) → String -
A string representation of this object.
inherited
- withFallbacks(List<Runnable<PromptValue, RunnableOptions, LLMResult>> fallbacks) → RunnableWithFallback<PromptValue, LLMResult>
  Adds fallback runnables to be invoked if the primary runnable fails.
  inherited
- withRetry({int maxRetries = 3, FutureOr<bool> retryIf(Object e)?, List<Duration?>? delayDurations, bool addJitter = false}) → RunnableRetry<PromptValue, LLMResult>
  Adds retry logic to an existing runnable.
  inherited
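The sketch below shows how some of the members above (call, invoke, batch, withRetry) might be used, continuing the hypothetical EchoLLM subclass from the class description. PromptValue.string and the LLMResult.output field are assumptions about the surrounding langchain_dart API, not guaranteed by this page:

```dart
// Usage sketch for the hypothetical EchoLLM defined earlier.
Future<void> main() async {
  final llm = EchoLLM();

  // call(): plain String in, generated String out.
  final text = await llm.call('Hello!');
  print(text); // Echo: Hello!

  // invoke(): the full Runnable interface, PromptValue in, LLMResult out.
  // PromptValue.string and result.output are assumed helpers/fields.
  final result = await llm.invoke(PromptValue.string('Hello!'));
  print(result.output);

  // batch(): invoke the model on several prompts at once.
  final results = await llm.batch([
    PromptValue.string('One'),
    PromptValue.string('Two'),
  ]);
  print(results.length); // 2

  // withRetry(): wrap the model so failed invocations are retried.
  final retrying = llm.withRetry(maxRetries: 2);
  print(await retrying.invoke(PromptValue.string('Hello again!')));
}
```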
Operators
- operator ==(Object other) → bool
  The equality operator.
  inherited