LlmInferenceOptions.cpu constructor

LlmInferenceOptions.cpu({
  required String modelPath,
  required String cacheDir,
  required int maxTokens,
  required double temperature,
  required int topK,
  int? randomSeed,
})


Creates inference options for models that run on the CPU.

Implementation

factory LlmInferenceOptions.cpu({
  required String modelPath,
  required String cacheDir,
  required int maxTokens,
  required double temperature,
  required int topK,
  int? randomSeed,
}) =>
    throw UnimplementedError();
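A hypothetical usage sketch follows. The base factory shown above throws `UnimplementedError`, which is consistent with a federated-plugin pattern where a platform-specific package supplies the real implementation; the paths and option values below are illustrative assumptions, not values from this page.

```dart
// Hypothetical usage sketch: construct CPU inference options.
// All values here are placeholders for illustration only.
final options = LlmInferenceOptions.cpu(
  modelPath: '/data/local/tmp/model.bin', // assumed on-device model location
  cacheDir: '/data/local/tmp/cache',      // assumed writable cache directory
  maxTokens: 512,       // upper bound on generated tokens
  temperature: 0.8,     // sampling temperature
  topK: 40,             // sample from the top-K candidate tokens
  randomSeed: 42,       // optional; omit for nondeterministic sampling
);
```

Because `randomSeed` is the only nullable parameter, it may be omitted entirely; all other arguments are required named parameters and must be passed explicitly.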