ConfigFineTunes class

Constructors

ConfigFineTunes({String model = 'curie', int nEpochs = 4, int? batchSize, double? learningRateMultiplier, double promptLossWeight = 0.01, bool computeClassificationMetrics = false, int? classificationNClasses, int? classificationPositiveClass, String? suffix})
ConfigFineTunes.fromMap(Map<String, dynamic> map)
factory
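
A minimal construction sketch, assuming the public API is exactly the signatures listed on this page. The snake_case keys passed to fromMap below are an assumption based on the OpenAI fine-tunes API field names, not confirmed by this page; check the package source for the keys it actually reads.

// Defaults: model 'curie', 4 epochs, promptLossWeight 0.01.
final defaults = ConfigFineTunes();

// Override selected fields at construction time.
final tuned = ConfigFineTunes(
  model: 'davinci',
  nEpochs: 6,
  suffix: 'support-bot',
);

// Build from a map. NOTE: these snake_case keys are assumed from the
// OpenAI API field names and may differ from the package's actual keys.
final fromJson = ConfigFineTunes.fromMap(<String, dynamic>{
  'model': 'curie',
  'n_epochs': 4,
  'prompt_loss_weight': 0.01,
});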

Properties

batchSize int?
The batch size to use for training. The batch size is the number of training examples used in a single forward and backward pass. By default, the batch size is dynamically configured to be ~0.2% of the number of examples in the training set, capped at 256. In general, we've found that larger batch sizes tend to work better for larger datasets.
final
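
As an illustration only (this sketches the documented heuristic, not the library's or the server's actual code), the default could be computed as:

import 'dart:math';

// Roughly 0.2% of the training set, capped at 256, per the
// description above; at least 1 example per batch.
int defaultBatchSize(int trainingExamples) =>
    min(256, max(1, (trainingExamples * 0.002).round()));

// defaultBatchSize(100000) == 200; defaultBatchSize(500000) == 256.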
classificationNClasses int?
The number of classes in a classification task. This parameter is required for multiclass classification.
final
classificationPositiveClass int?
The positive class in binary classification. This parameter is needed to generate precision, recall, and F1 metrics when doing binary classification.
final
computeClassificationMetrics bool
If set, we calculate classification-specific metrics such as accuracy and F-1 score using the validation set at the end of every epoch. These metrics can be viewed in the results file. In order to compute classification metrics, you must provide a validation_file. Additionally, you must specify classification_n_classes for multiclass classification or classification_positive_class for binary classification.
final
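
For example, sketches of the two documented cases, assuming the signatures above (the validation file itself is uploaded with the fine-tune job, outside this class):

// Binary classification: the positive class is required for
// precision, recall, and F-1 metrics.
final binary = ConfigFineTunes(
  computeClassificationMetrics: true,
  classificationPositiveClass: 1,
);

// Multiclass classification: the class count is required.
final multiclass = ConfigFineTunes(
  computeClassificationMetrics: true,
  classificationNClasses: 5,
);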
hashCode int
The hash code for this object.
no setter, inherited
learningRateMultiplier double?
The learning rate multiplier to use for training. The fine-tuning learning rate is the original learning rate used for pretraining multiplied by this value. By default, the learning rate multiplier is 0.05, 0.1, or 0.2 depending on the final batch_size (larger learning rates tend to perform better with larger batch sizes). We recommend experimenting with values in the range 0.02 to 0.2 to see what produces the best results.
final
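
A sketch of the recommended experiment, sweeping candidates in the 0.02 to 0.2 range (the job-submission step is omitted because it is outside this class):

// One config per candidate multiplier; submit a fine-tune job for
// each and compare the resulting metrics.
for (final multiplier in [0.02, 0.05, 0.1, 0.2]) {
  final config = ConfigFineTunes(learningRateMultiplier: multiplier);
}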
model String
The name of the base model to fine-tune. You can select one of "ada", "babbage", "curie", "davinci", or a fine-tuned model created after 2022-04-21.
final
nEpochs int
The number of epochs to train the model for. An epoch refers to one full cycle through the training dataset.
final
promptLossWeight double
The weight to use for loss on the prompt tokens. This controls how much the model tries to learn to generate the prompt (as compared to the completion which always has a weight of 1.0), and can add a stabilizing effect to training when completions are short. If prompts are extremely long (relative to completions), it may make sense to reduce this weight so as to avoid over-prioritizing learning the prompt.
final
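
For example, a dataset with very long prompts and short completions might lower the weight below the 0.01 default:

// De-emphasize prompt tokens further for long-prompt datasets.
final longPromptConfig = ConfigFineTunes(promptLossWeight: 0.005);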
runtimeType Type
A representation of the runtime type of the object.
no setter, inherited
suffix String?
A string of up to 40 characters that will be added to your fine-tuned model name.
final

Methods

copyWith({String? model, int? nEpochs, int? batchSize, double? learningRateMultiplier, double? promptLossWeight, bool? computeClassificationMetrics, int? classificationNClasses, int? classificationPositiveClass, String? suffix}) → ConfigFineTunes
Returns a copy of this ConfigFineTunes with the given non-null fields replaced; see the sketch after this list.
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toMap() → Map<String, dynamic>
Serializes this configuration to a Map<String, dynamic>; see the sketch after this list.
toString() → String
A string representation of this object.
inherited
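
A sketch of the copyWith and toMap round trip referenced above, assuming fromMap accepts whatever toMap produces:

final base = ConfigFineTunes(model: 'curie');

// copyWith: derive a variant without mutating the original.
final moreEpochs = base.copyWith(nEpochs: 8);

// Round-trip the configuration, e.g. to log or persist a job definition.
final map = moreEpochs.toMap();
final restored = ConfigFineTunes.fromMap(map);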

Operators

operator ==(Object other) → bool
The equality operator.
inherited