evaluate abstract method
Future<Vector> evaluate(
  ModelFactory createModel,
  MetricType metricType, {
  DataPreprocessFn? onDataSplit,
})
Returns a future that resolves with a vector of quality scores for the
model, one score per train-test split, computed according to the given
metricType
Parameters:
createModel
A function that returns a model to be evaluated
metricType
A metric used to assess a model created by createModel
onDataSplit
A callback that is called whenever a new train-test split is ready to be
passed into the model. One may place additional data-dependent logic here,
e.g., data preprocessing. The callback accepts the train and test data of
a new split and returns the transformed split as a list, where the first
element is the train data and the second one is the test data, both of
DataFrame type. This transformed split is then passed into the model
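Given the description above, the callback presumably has to match a shape like the following sketch; the exact DataPreprocessFn typedef is defined by the library, so treat this as an assumption rather than the authoritative declaration:

```dart
import 'package:ml_dataframe/ml_dataframe.dart';

// Assumed shape of DataPreprocessFn: takes the train and test frames of a
// split and returns them, transformed, as a two-element list
// [trainData, testData].
typedef DataPreprocessFn = List<DataFrame> Function(
    DataFrame trainData, DataFrame testData);
```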
Example:
final data = DataFrame([
  [ 1,  1,  1,   1],
  [ 2,  3,  4,   5],
  [18, 71, 15,  61],
  [19,  0, 21, 331],
  [11, 10,  9,  40],
],
  headerExists: false,
);
final modelFactory = (trainData) =>
KnnRegressor(trainData, 'col_3', k: 4);
final onDataSplit = (trainData, testData) {
  final standardizer = Standardizer(trainData);
  return [
    standardizer.process(trainData),
    standardizer.process(testData),
  ];
};
final validator = CrossValidator.kFold(data);
final scores = await validator.evaluate(
modelFactory,
MetricType.mape,
onDataSplit: onDataSplit,
);
final averageScore = scores.mean();
print(averageScore);
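Since onDataSplit is optional, the validator can also be run without any preprocessing. A minimal sketch, reusing the hypothetical modelFactory and validator from the example above:

```dart
// Assuming `modelFactory` and `validator` are defined as in the example
// above. Each train-test split contributes one score, so `scores` holds
// as many elements as there are folds.
final scores = await validator.evaluate(modelFactory, MetricType.mape);
print(scores.length); // number of folds
```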
Implementation
Future<Vector> evaluate(
  ModelFactory createModel,
  MetricType metricType, {
  DataPreprocessFn? onDataSplit,
});