LlamaScope class

A scope that filters responses from LlamaParent for specific prompt IDs

Constructors

LlamaScope(LlamaParent _parent)
Create a new scope for the given parent
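A minimal construction sketch. The `LlamaParent` load configuration shown here is an assumption about the surrounding API; the underscored `_parent` parameter suggests scopes may normally be handed out by the parent itself rather than constructed directly.

```dart
// Sketch: attach a scope to an existing parent. How `parent` is
// configured and started is assumed, not shown in this reference.
final parent = /* an initialized, running LlamaParent */ getParent();

// Each scope filters the parent's responses down to the prompt IDs
// it has sent, so multiple scopes can share one model instance.
final scope = LlamaScope(parent);
```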

Properties

completions → Stream<CompletionEvent>
Stream of completion events for prompts sent through this scope
no setter
hashCode → int
The hash code for this object.
no setter, inherited
id → String
final
runtimeType → Type
A representation of the runtime type of the object.
no setter, inherited
stream → Stream<String>
Stream of text generated for prompts sent through this scope
no setter
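The two streams above can be consumed independently: `stream` yields generated text as it arrives, while `completions` signals when a prompt finishes. A sketch, assuming `scope` is a live `LlamaScope` (the fields of `CompletionEvent` are not documented here, so none are accessed):

```dart
import 'dart:io';

void listenToScope(LlamaScope scope) {
  // Incremental text for prompts sent through this scope.
  scope.stream.listen((chunk) {
    stdout.write(chunk);
  });

  // One event per finished prompt.
  scope.completions.listen((event) {
    print('\n[generation complete]');
  });
}
```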

Methods

addPromptId(String promptId) → void
Add a prompt ID to this scope (used internally)
dispose() → Future<void>
Dispose of resources and free the slot.
handleCompletion(CompletionEvent event) → void
Handle a completion event from the parent
handleResponse(LlamaResponse response) → void
Handle a response from the parent
loadSession(String path) → Future<void>
Restores a session directly from a disk file into VRAM (Tier 3).
loadState(Uint8List data) → Future<void>
Restores a serialized state from RAM into VRAM for this scope.
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
saveState() → Future<Uint8List>
Captures the current KV cache of this scope from VRAM. Returns the serialized state as a byte array (Tier 2).
sendPrompt(String prompt) → Future<String>
Send a prompt to the model and track its ID in this scope
sendPromptWithImages(String prompt, List<LlamaImage> images) → Future<String>
Send a prompt with images to the model and track its ID in this scope
stop({bool alsoCancelQueued = true}) → Future<void>
Stop generation for this scope only.
toString() → String
A string representation of this object.
inherited
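A hedged end-to-end sketch combining the methods above. Whether the `Future<String>` returned by `sendPrompt` resolves to the prompt ID or to the finished text is not stated in this reference, so the variable name below is an assumption; the save/load pairing follows the Tier 2 description on `saveState` and `loadState`.

```dart
import 'dart:typed_data';

Future<void> runScope(LlamaScope scope) async {
  // Send a prompt through this scope; its ID is tracked internally.
  final String result = await scope.sendPrompt('Write a haiku about Dart.');

  // Tier 2: snapshot this scope's KV cache from VRAM into RAM...
  final Uint8List state = await scope.saveState();

  // ...and restore it later to continue from the same context.
  await scope.loadState(state);

  // Stop generation for this scope only; alsoCancelQueued defaults
  // to true, so queued prompts are cancelled as well.
  await scope.stop();

  // Release resources and free the slot when done.
  await scope.dispose();
}
```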

Operators

operator ==(Object other) → bool
The equality operator.
inherited