LlmInferenceOptions.cpu constructor
LlmInferenceOptions.cpu({required String modelPath, required String cacheDir, required int maxTokens, required double temperature, required int topK, int? randomSeed})
Creates inference options for models that run on the CPU.
Implementation
factory LlmInferenceOptions.cpu({
  required String modelPath,
  required String cacheDir,
  required int maxTokens,
  required double temperature,
  required int topK,
  int? randomSeed,
}) =>
    throw UnimplementedError();
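A sketch of how this constructor might be called. The parameter values below are illustrative assumptions, not documented defaults; the model path and cache directory in particular are placeholders.

```dart
// Hypothetical usage; all values here are illustrative only.
final options = LlmInferenceOptions.cpu(
  modelPath: '/path/to/model.bin', // placeholder path to a model file
  cacheDir: '/tmp/llm_cache',      // placeholder cache directory
  maxTokens: 512,                  // maximum tokens to generate
  temperature: 0.8,                // sampling randomness
  topK: 40,                        // top-k sampling cutoff
  randomSeed: 42,                  // optional; omit for nondeterministic output
);
```

Note that the listed implementation is a stub (`throw UnimplementedError()`), so a concrete subclass or a later package version is presumably expected to supply the actual behavior.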