LlamaCpp class

A brief overview of the interactions among the main classes:

+----------------+---------------------+--------------------+
|  main Isolate  |    llama Isolate    |    native world    |
+----------------+---------------------+--------------------+
|   LlamaCpp     |     NativeLlama     |    llama_cpp       |
|                |                     |                    |
|          send --> incoming          -->       +           |
|                |                     |        |           |
|                |                    ffi       |           |
|                |                     |        |           |
|     receiving <-- outgoing          <--       +           |
|                |                     |                    |
+----------------+---------------------+--------------------+
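The flow above can be sketched as a minimal usage example. The import path and model path are placeholders assumed for illustration; the calls (`LlamaCpp.load`, `answer`, `dispose`) are the members documented below.

```dart
import 'dart:io';

import 'package:llama_cpp/llama_cpp.dart';

Future<void> main() async {
  // `load` spawns the llama isolate and initializes native llama_cpp state.
  final ai = await LlamaCpp.load('path/to/model.gguf');

  // Each token produced in the native world travels back through the
  // llama isolate and surfaces here as a stream event.
  await for (final token in ai.answer('Hello, who are you?')) {
    stdout.write(token);
  }

  // Free native resources, then shut down the llama isolate.
  await ai.dispose();
}
```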

Properties

hashCode → int
The hash code for this object.
no setter; inherited
runtimeType → Type
A representation of the runtime type of the object.
no setter; inherited
verbose → bool
final

Methods

answer(String question, {int? nPrev, int? nProbs, int? topK, double? topP, double? minP, double? tfsZ, double? typicalP, double? temperature, int? penaltyLastN, double? penaltyRepeat, double? penaltyFrequency, double? penaltyPresent, int? mirostat, double? mirostatTau, double? mirostatEta, bool? penalizeNewline, String? samplersSequence}) → Stream<String>
Generates a text stream for the given prompt. question is the prompt supplied by the user for the model to answer.
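The optional arguments override the default sampling settings per call. A sketch, assuming `ai` is an already-loaded `LlamaCpp` instance; the parameter names are those in the signature above, and the values are illustrative only:

```dart
final stream = ai.answer(
  'Summarize the actor model in one sentence.',
  temperature: 0.7,   // lower = more deterministic output
  topK: 40,           // sample only from the 40 most likely tokens
  topP: 0.9,          // nucleus-sampling probability threshold
  penaltyRepeat: 1.1, // discourage verbatim repetition
);
await stream.forEach(stdout.write);
```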
answerWith(String params) → Stream<String>
Generates a text stream from the given params, a JSON string of named parameters, e.g.: ai.answerWith('{"prompt": "my question is", "min_p": 20}');
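Because `params` is a JSON string, building it with `jsonEncode` from `dart:convert` avoids manual quoting. A sketch, assuming `ai` is an already-loaded instance; the keys are the ones from the example above:

```dart
import 'dart:convert';

final params = jsonEncode({
  'prompt': 'my question is',
  'min_p': 20,
});
await for (final token in ai.answerWith(params)) {
  stdout.write(token);
}
```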
dispose() → Future<void>
Notifies the llama isolate to free its native resources, then shuts the isolate down.
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.
inherited

Operators

operator ==(Object other) → bool
The equality operator.
inherited

Static Methods

load(String path, {int? seed, int? nThread, int? nThreadBatch, int? nPredict, int? nCtx, int? nBatch, int? nKeep, int? nGpuLayers, int? mainGpu, int numa = 0, bool verbose = true}) → Future<LlamaCpp>
Asynchronously creates a LlamaCpp instance with the given params.
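A sketch of passing tuning parameters to `load`; the model path is a placeholder and the values are illustrative assumptions, not recommended defaults:

```dart
final ai = await LlamaCpp.load(
  'path/to/model.gguf',
  seed: 1234,     // fixed seed for reproducible sampling
  nCtx: 2048,     // context window size in tokens
  nThread: 4,     // CPU threads used for generation
  nGpuLayers: 0,  // 0 = run entirely on CPU
  verbose: false, // suppress native log output
);
```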