EvaluationParameters class

Parameters that define how to split a dataset into training data and testing data, and the number of iterations to perform. These parameters are specified in the predefined algorithms, but you can override them in the CreatePredictor request.

Constructors

EvaluationParameters({int? backTestWindowOffset, int? numberOfBacktestWindows})
EvaluationParameters.fromJson(Map<String, dynamic> json)
factory
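
A minimal usage sketch for the two constructors above. The import path is illustrative (it depends on the package this page was generated from), and the PascalCase JSON keys are an assumption based on the AWS Forecast API shape.

// Illustrative import; substitute the actual package path for this class.
// import 'package:aws_forecast_api/forecast-2018-06-26.dart';

void main() {
  // Direct construction with the named, optional parameters.
  final params = EvaluationParameters(
    backTestWindowOffset: 30,
    numberOfBacktestWindows: 3,
  );

  // Construction from a JSON map; the key names shown here are an assumption.
  final decoded = EvaluationParameters.fromJson({
    'BackTestWindowOffset': 30,
    'NumberOfBacktestWindows': 3,
  });

  print(params.numberOfBacktestWindows); // 3
  print(decoded.backTestWindowOffset); // 30
}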

Properties

backTestWindowOffset → int?
The point from the end of the dataset where you want to split the data for model training and testing (evaluation). Specify the value as the number of data points. The default is the value of the forecast horizon. BackTestWindowOffset can be used to mimic a past virtual forecast start date. This value must be greater than or equal to the forecast horizon and less than half of the TARGET_TIME_SERIES dataset length (see the validation sketch after this property list).
final
hashCode → int
The hash code for this object.
no setter, inherited
numberOfBacktestWindows → int?
The number of times to split the input data. The default is 1. Valid values are 1 through 5.
final
runtimeType → Type
A representation of the runtime type of the object.
no setter, inherited
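
The constraints documented for backTestWindowOffset and numberOfBacktestWindows can be checked before constructing the object. The helper below is a hedged sketch, not part of this class; forecastHorizon and datasetLength are hypothetical inputs introduced only for illustration.

// Hypothetical helper that validates the documented constraints before
// constructing EvaluationParameters.
EvaluationParameters buildEvaluationParameters({
  required int forecastHorizon,
  required int datasetLength,
  int? backTestWindowOffset,
  int numberOfBacktestWindows = 1,
}) {
  // Per the docs, the default offset is the forecast horizon.
  final offset = backTestWindowOffset ?? forecastHorizon;

  // Must be >= the forecast horizon and < half the TARGET_TIME_SERIES length.
  if (offset < forecastHorizon || offset >= datasetLength / 2) {
    throw ArgumentError.value(offset, 'backTestWindowOffset',
        'must be >= the forecast horizon and < half the dataset length');
  }

  // Valid values are 1 through 5.
  if (numberOfBacktestWindows < 1 || numberOfBacktestWindows > 5) {
    throw ArgumentError.value(numberOfBacktestWindows,
        'numberOfBacktestWindows', 'must be between 1 and 5');
  }

  return EvaluationParameters(
    backTestWindowOffset: offset,
    numberOfBacktestWindows: numberOfBacktestWindows,
  );
}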

Methods

noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toJson() → Map<String, dynamic>
Converts this object to a JSON map.
toString() → String
A string representation of this object.
inherited
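
A short round-trip sketch for toJson and the fromJson factory. It assumes the map produced by toJson can be fed back into fromJson, which is the usual pattern for generated model classes but is an assumption here.

// Hedged round-trip sketch; assumes toJson output is accepted by fromJson.
void roundTripExample() {
  final original = EvaluationParameters(
    backTestWindowOffset: 14,
    numberOfBacktestWindows: 2,
  );

  final Map<String, dynamic> json = original.toJson();
  final restored = EvaluationParameters.fromJson(json);

  print(restored.backTestWindowOffset); // 14
  print(restored.numberOfBacktestWindows); // 2
}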

Operators

operator ==(Object other) → bool
The equality operator.
inherited