ml_algo 12.0.2

Machine learning algorithms with Dart #

What is ml_algo for? #

The main purpose of the library is to give developers who are interested both in the Dart language and in data science a native Dart implementation of machine learning algorithms. The library targets the Dart VM, so to get the smoothest experience with it, please do not use it in a browser.

The following algorithms are implemented:

  • Linear regression:

    • Gradient descent based linear regression
    • Coordinate descent based linear regression
  • Linear classifier:

    • Logistic regression
    • Softmax regression
  • Non-parametric regression:

    • KNN regression

The library's structure #

  • Model selection #

    • CrossValidator. A factory that creates instances of cross validators. Cross validation lets you tune the hyperparameters of machine learning algorithms by assessing prediction quality on different parts of a dataset (see the short sketch after this list).
  • Classification algorithms #

    • Linear classification #
      • Logistic regression #

        An algorithm that performs linear binary classification.

        • LogisticRegressor.gradient. Logistic regression with gradient ascent optimization of the log-likelihood cost function. To use this kind of classifier, your data has to be linearly separable.

        • LogisticRegressor.coordinate. Not implemented yet. Logistic regression with coordinate descent optimization of the negated log-likelihood cost function. Coordinate descent makes feature selection (aka L1 regularization) possible. To use this kind of classifier, your data has to be linearly separable.

      • Softmax regression #

        An algorithm that performs linear multiclass classification.

        • SoftmaxRegressor.gradient. Softmax regression with gradient ascent optimization of the log-likelihood cost function. To use this kind of classifier, your data has to be linearly separable.

        • SoftmaxRegressor.coordinate. Not implemented yet. Softmax regression with coordinate descent optimization of the negated log-likelihood cost function. As with logistic regression, coordinate descent makes feature selection (aka L1 regularization) possible. To use this kind of classifier, your data has to be linearly separable.

  • Regression algorithms #

    • Linear regression #
      • LinearRegressor.gradient. A well-known algorithm that performs linear regression using the gradient vector of a cost function.

      • LinearRegressor.coordinate. An algorithm that uses coordinate descent to find the optimal value of a cost function. Coordinate descent allows feature selection to be performed along with the regression itself (this technique is often called Lasso regression).

    • Nonlinear regression #
      • ParameterlessRegressor.knn. An algorithm that makes a prediction for each new observation based on the k closest observations from the training data. It has quite high computational complexity, but at the same time it can easily capture non-linear patterns in the data.
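
To preview how these pieces fit together, here is a minimal sketch that evaluates a gradient-based logistic regressor with k-fold cross validation. The dataset path, label column name and hyperparameter values below are placeholders; complete, runnable examples follow in the next section.

import 'dart:async';

import 'package:ml_algo/ml_algo.dart';
import 'package:ml_preprocessing/ml_preprocessing.dart';

Future main() async {
  // read a csv dataset (placeholder path and label column name)
  final data = DataFrame.fromCsv('datasets/some_dataset.csv', labelName: 'class');

  // normalize the features column-wise and extract the labels
  final features = (await data.features).mapColumns((column) => column.normalize());
  final labels = await data.labels;

  // assess a gradient-based logistic regressor on 5 folds of the data
  final validator = CrossValidator.kFold(numberOfFolds: 5);
  final accuracy = validator.evaluate((trainFeatures, trainLabels) =>
      LogisticRegressor.gradient(trainFeatures, trainLabels,
          initialLearningRate: .8,       // illustrative values
          iterationsLimit: 100,
          learningRateType: LearningRateType.constant),
      features, labels, MetricType.accuracy);

  print(accuracy);
}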

Examples #

Logistic regression #

Let's classify records from the well-known Pima Indians Diabetes Database with a logistic regressor.

Import all the necessary packages. First, make sure that you have the ml_preprocessing package in your dependencies:

dependencies:
  ml_preprocessing: ^3.2.0

We need this package to parse the raw data so that we can use it further. For more details, please visit the ml_preprocessing repository page.

import 'dart:async';

import 'package:ml_algo/ml_algo.dart';
import 'package:ml_preprocessing/ml_preprocessing.dart';

Download the Pima Indians Diabetes Database dataset and read it (of course, you should provide a proper path to your downloaded file):

final data = DataFrame.fromCsv('datasets/pima_indians_diabetes_database.csv', 
  labelName: 'class variable (0 or 1)');
final features = (await data.features)
      .mapColumns((column) => column.normalize()); // the matrix is normalized column-wise to achieve 
                                                   // numerical stability and a uniform scale for all 
                                                   // the values in each column
final labels = await data.labels;

The data in this file consists of 768 records with 8 features each. The 9th column is the label column; it contains either 0 or 1 in each row. This column is our target - we want to predict a class label for each observation. Therefore, we have to point out where the label values come from. Let's use the labelName parameter for that (the label column's name, 'class variable (0 or 1)' in our case).

The processed features and labels are stored in data structures of the Matrix type. To get more information about the Matrix type, please visit the ml_linalg repo.
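
For instance, a Matrix can be built directly from a list of rows (the same constructor is used in the synthetic example at the end of this page); the values below are arbitrary:

import 'package:ml_linalg/matrix.dart';

// a 2x3 matrix created from a list of rows
final matrix = Matrix.fromList([
  [1.0, 2.0, 3.0],
  [4.0, 5.0, 6.0],
]);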

Then we should create an instance of the CrossValidator class to fit the hyperparameters of our model:

final validator = CrossValidator.kFold(numberOfFolds: 5);

Everything is set, so we can do our classification.

Let's evaluate our model with the accuracy metric:

final accuracy = validator.evaluate((trainFeatures, trainLabels) => 
    LogisticRegressor.gradient(
        trainFeatures, trainLabels,
        initialLearningRate: .8,
        iterationsLimit: 500,
        batchSize: 768,
        fitIntercept: true,
        interceptScale: .1,
        learningRateType: LearningRateType.constant), 
    features, labels, MetricType.accuracy);

Let's print the score:

print('accuracy on classification: ${accuracy.toStringAsFixed(2)}');

We will see something like this:

accuracy on classification: 0.77

All the code above put together:

import 'dart:async';

import 'package:ml_algo/ml_algo.dart';
import 'package:ml_preprocessing/ml_preprocessing.dart';

Future main() async {
  final data = DataFrame.fromCsv('datasets/pima_indians_diabetes_database.csv', 
     labelName: 'class variable (0 or 1)');
  final features = (await data.features).mapColumns((column) => column.normalize());
  final labels = await data.labels;
  final validator = CrossValidator.kFold(numberOfFolds: 5);
  final accuracy = validator.evaluate((trainFeatures, trainLabels) => 
    LogisticRegressor.gradient(
        trainFeatures, trainLabels,
        initialLearningRate: .8,
        iterationsLimit: 500,
        batchSize: 768,
        fitIntercept: true,
        interceptScale: .1,
        learningRateType: LearningRateType.constant), 
    features, labels, MetricType.accuracy);

  print('accuracy on classification: ${accuracy.toStringAsFixed(2)}');
}

Softmax regression #

Let's classify another famous dataset - the Iris dataset. The data in this csv is divided into 3 classes, so we need a different approach to classification - softmax regression.

As usual, start with data preparation. Before we start, we should update our pubspec's dependencies with the xrange library:

dependencies:
    ...
    xrange: ^0.0.5
    ...

Download the file and read it:

final data = DataFrame.fromCsv('datasets/iris.csv',
    labelName: 'Species',
    columns: [ZRange.closed(1, 5)],
    categories: {
      'Species': CategoricalDataEncoderType.oneHot,
    },
);

final features = await data.features;
final labels = await data.labels;

The csv file has 6 columns, but we need to get rid of the first column because it contains just the ID of each observation, which is useless for training. So, as you may have noticed, we provided a column range to exclude the ID column:

columns: [ZRange.closed(1, 5)]

Also, since the label column 'Species' contains categorical data, we encoded it into a numerical format with one-hot encoding (each species value becomes a vector such as [1, 0, 0]):

categories: {
  'Species': CategoricalDataEncoderType.oneHot,
},

The next step is to create a cross validator instance:

final validator = CrossValidator.kFold(numberOfFolds: 5);

Evaluate the quality of the prediction:

final accuracy = validator.evaluate((trainFeatures, trainLabels) => 
      LinearClassifier.softmaxRegressor(
          trainFeatures, trainLabels,
          initialLearningRate: 0.03,
          iterationsLimit: null,
          minWeightsUpdate: 1e-6,
          randomSeed: 46,
          learningRateType: LearningRateType.constant
      ), features, labels, MetricType.accuracy);

print('Iris dataset, softmax regression: accuracy is '
  '${accuracy.toStringAsFixed(2)}'); // It yields 0.93

All the code above put together:

import 'dart:async';

import 'package:ml_algo/ml_algo.dart';
import 'package:ml_preprocessing/ml_preprocessing.dart';
import 'package:xrange/zrange.dart';

Future main() async {
  final data = DataFrame.fromCsv('datasets/iris.csv',
    labelName: 'Species',
    columns: [ZRange.closed(1, 5)],
    categories: {
      'Species': CategoricalDataEncoderType.oneHot,
    },
  );

  final features = await data.features;
  final labels = await data.labels;
  final validator = CrossValidator.kFold(numberOfFolds: 5);
  final accuracy = validator.evaluate((trainFeatures, trainLabels) => 
      LinearClassifier.softmaxRegressor(
          trainFeatures, trainLabels,
          initialLearningRate: 0.03,
          iterationsLimit: null,
          minWeightsUpdate: 1e-6,
          randomSeed: 46,
          learningRateType: LearningRateType.constant
      ), features, labels, MetricType.accuracy);

  print('Iris dataset, softmax regression: accuracy is '
      '${accuracy.toStringAsFixed(2)}');
}

K nearest neighbour regression #

Let's make some predictions with a well-known non-parametric regression algorithm - k nearest neighbours. Let's take a classic dataset - Boston Housing.

As usual, import all the necessary packages:

import 'dart:async';

import 'package:ml_algo/ml_algo.dart';
import 'package:ml_preprocessing/ml_preprocessing.dart';
import 'package:xrange/zrange.dart';

and then download and read the data:

final data = DataFrame.fromCsv('lib/_datasets/housing.csv',
    headerExists: false,
    fieldDelimiter: ' ',
    labelIdx: 13,
);

As you can see, the dataset is headless, which means there is no descriptive header line at the beginning of the file, hence we can just use the index-based approach to point out where the outcome column resides (index 13 in our case).

Extract features and labels

// As in the examples above, the matrix is normalized column-wise to achieve numerical stability and provide 
// a uniform scale for all the values in each column
final features = (await data.features).mapColumns((column) => column.normalize());
final labels = await data.labels;

Create a cross-validator instance

final validator = CrossValidator.kFold(numberOfFolds: 5);

Let the parameter k be equal to 4.

Assess the knn regressor with the chosen k value using the MAPE metric:

final error = validator.evaluate((trainFeatures, trainLabels) => 
  ParameterlessRegressor.knn(trainFeatures, trainLabels, k: 4), features, labels, MetricType.mape);

Let's print the error:

print('MAPE error on k-fold validation: ${error.toStringAsFixed(2)}%'); // it yields approx. 6.18
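
As with the previous examples, here is all the KNN code above put together (a sketch assembled from the snippets in this section):

import 'dart:async';

import 'package:ml_algo/ml_algo.dart';
import 'package:ml_preprocessing/ml_preprocessing.dart';

Future main() async {
  final data = DataFrame.fromCsv('lib/_datasets/housing.csv',
      headerExists: false,
      fieldDelimiter: ' ',
      labelIdx: 13,
  );
  // normalize the features column-wise, as in the examples above
  final features = (await data.features).mapColumns((column) => column.normalize());
  final labels = await data.labels;
  final validator = CrossValidator.kFold(numberOfFolds: 5);
  final error = validator.evaluate((trainFeatures, trainLabels) =>
      ParameterlessRegressor.knn(trainFeatures, trainLabels, k: 4),
      features, labels, MetricType.mape);

  print('MAPE error on k-fold validation: ${error.toStringAsFixed(2)}%');
}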

Contacts #

If you have questions, feel free to write me on

Changelog #

12.0.2 #

  • ScoreToProbMapperFactory removed
  • ScoreToProbMapperType enum removed
  • ScoreToProbMapper: the entity renamed to LinkFunction

12.0.1 #

  • Cost function factory removed
  • Cost function type removed

12.0.0 #

  • Breaking change: GradientType enum removed
  • Breaking change: OptimizerType enum removed
  • Breaking change, Predictor: fit method removed, fitting now happens while a model is being created
  • Breaking change, Predictor: interface replaced with Assessable, redundant properties removed
  • Breaking change: LinearClassifier reorganized
  • Optimizers now have immutable state
  • InterceptPreprocessor replaced with a helper function addInterceptIf

11.0.1 #

  • Cross validator refactored
  • Data splitters refactored
  • Unit tests for cross validator added

11.0.0 #

  • Added immutable state to all the predictor subclasses

10.3.0 #

  • kernels added:
    • uniform
    • epanechnikov
    • cosine
    • gaussian
  • NoNParametricRegressor.nearestNeighbour: added possibility to specify the kernel function

10.2.1 #

  • test coverage restored

10.2.0 #

  • NoNParametricRegressor class added
  • KNNRegressor class added
  • ml_linalg v9.0.0 supported

10.1.0 #

  • ml_linalg v7.0.0 support

10.0.0 #

9.2.4 #

  • Data preprocessing: All categorical values are now converted to String type

9.2.3 #

  • Examples for Linear regression and Logistic regression updated (vector's normalize method used)
  • CategoricalDataEncoderType: one-hot encoding documentation corrected

9.2.2 #

  • Softmax regression example added to README

9.2.1 #

  • README corrected

9.2.0 #

  • LinearClassifier.logisticRegressor: numerical stability improved
  • LinearClassifier.logisticRegressor: probabilityThreshold parameter added
  • DataFrame.fromCsv: parameter fieldDelimiter added

9.1.0 #

  • DataFrame: labelName parameter added

9.0.0 #

  • ml_linalg v6.0.2 supported
  • Classifier: type of weightsByClasses changed from Map to Matrix
  • SoftmaxRegressor: more detailed unit tests for softmax regression added
  • Data preprocessing: DataFrame introduced (former MLData)

8.0.0 #

  • LinearClassifier.softmaxRegressor implemented
  • Metric interface refactored (getError renamed to getScore)

7.2.0 #

  • SoftmaxMapper added (aka Softmax activation function)

7.1.0 #

  • ConvergenceDetector added (this entity stops the optimizer when it is needed)

7.0.0 #

  • All the exports packed into ml_algo entry

6.2.0 #

  • Coefficients in optimizers now are a matrix
  • InitialWeightsGenerator instantiation fixed: dtype is now passed

6.1.0 #

  • LinkFunction renamed to ScoreToProbMapper
  • ScoreToProbMapper accepts vector and returns vector instead of a scalar

6.0.6 #

  • Pedantic package integration added
  • Some linter issues fixed

6.0.5 #

  • Coveralls integration added
  • dartfmt check task added

6.0.4 #

  • Documentation for linear regression corrected
  • Documentation for MLData corrected

6.0.3 #

  • Documentation for logistic regression corrected

6.0.2 #

  • Tests corrected: removed import test_api.dart

6.0.1 #

  • Readme corrected

6.0.0 #

  • Library fully refactored:
    • added the possibility to set a specific data type for numeric computations
    • all algorithms are now more generic
    • a lot of unit tests added
    • bug fixes

5.2.0 #

  • Ordinal encoder added
  • Float32x4CsvMlData significantly extended

5.1.0 #

  • Real-life example added (black friday dataset)
  • rows parameter added to Float32x4CsvMlData
  • Unknown categorical values handling strategy types added

5.0.0 #

  • One hot encoder integrated into CSV ML data

4.3.3 #

  • Performance test for one hot encoder added

4.3.2 #

  • One hot encoder implemented

4.3.1 #

  • enum for categorical data encoding added

4.3.0 #

  • Cross validator factory added
  • README updated

4.2.0 #

  • csv-parser added

4.1.0 #

  • ml_linalg removed from export file
  • README refreshed
  • General datasets directory created

4.0.0 #

  • ml_linalg ^4.0.0 supported

3.5.4 #

  • README.md updated
  • build_runner dependency updated

3.5.3 #

  • dartfmt tool applied to all necessary files

3.5.2 #

  • Travis configuration file name corrected

3.5.1 #

  • Travis integration added

3.5.0 #

  • Vectorized cost functions applied

3.4.0 #

  • ml_linalg 2.0.0 supported

3.3.0 #

  • Matrix-based gradient calculation added for log likelihood cost function

3.2.0 #

  • Matrix-based gradient calculation added for squared cost function

3.1.2 #

  • Description corrected

3.1.1 #

  • dartfmt tool applied

3.1.0 #

  • Get rid of MLVector's deprecated methods

3.0.0 #

  • Library public release

2.0.0 #

  • ml_linalg supported

1.2.1 #

  • subVector -> subvector

1.2.0 #

  • Matrices support added

1.1.1 #

  • Examples fixed, dependencies fixed

1.1.0 #

  • Support of updated linalg package

1.0.1 #

  • Readme updated, dependencies fixed

1.0.0 #

  • Migration to dart 2.0

0.38.1 #

0.38.0 #

  • Lasso solution refactored

0.37.0 #

  • Support of linalg package (former simd_vector)

0.36.0 #

  • Intercept term considered (fitIntercept and interceptScale parameters)

0.35.1 #

  • Logistic regression tests improved

0.35.0 #

  • One versus all refactored, tests for logistic regression added

0.34.0 #

  • One versus all classifier

0.33.0 #

  • Gradient descent regressor type enum added

0.32.1 #

  • Gradient optimizer unit tests

0.32.0 #

  • Get rid of derivative computation

0.31.0 #

  • Get rid of di package usage

0.30.1 #

  • File structure flattened

0.30.0 #

  • Redundant gradient optimizers removed

0.29.0 #

  • part ... part of directives removed

0.28.0 #

  • Coordinate descent optimizer added
  • Lasso regressor added

0.27.0 #

  • Gradient calculation changed

0.26.1 #

  • Code optimized (unnecessary code removed)
  • Refactoring

0.26.0 #

  • More distinct modularity was added to the library
  • Unit tests were fixed

0.25.0 #

  • Tests for gradient optimizers were added
  • Gradient calculator was created as a separate entity
  • Initial weights generator was created as a separate entity
  • Learning rate generator was created as a separate entity

0.24.0 #

  • All implementations were hidden

0.23.0 #

  • findMaxima and findMinima methods were added to Optimizer interface

0.22.0 #

  • File structure reorganized, predictor classes refactored
  • README.md updated

0.21.0 #

  • Logistic regression model added (with example)

0.20.2 #

  • README.md updated

0.20.1 #

  • simd_vector dependency url fixed

0.20.0 #

  • Repository dependency corrected (dart_vector -> simd_vector)

0.19.0 #

  • Support for Float32x4Vector class was added (from dart_vector library)
  • Type List for label (target) list replaced with Float32List (in Predictor.train() and Optimizer.optimize())

0.18.0 #

  • class Vector and enum Norm were extracted to separate library (https://github.com/gyrdym/dart_vector.git)

0.17.0 #

  • Common interface for loss function was added
  • Derivative calculation was fixed (common canonical method was used)
  • Squared loss function was added as a separate class

0.16.0 #

  • README.md was brought up to date

0.15.0 #

  • Tests for gradient optimizers were added
  • Interfaces (almost for all entities) for DI and IOC mechanism were added
  • Randomizer class was added
  • Removed separate classes for k-fold cross validation and lpo cross validation, now it resides in CrossValidation class

0.14.0 #

  • L1 and L2 regularization added

0.13.0 #

  • Script for running all unit tests added

0.12.0 #

  • Vector interface removed
  • Regular vector implementation removed
  • TypedVector -> Vector
  • Implicit vectors constructing replaced with explicit new-instantiation

0.11.0 #

  • Entity names correction

0.10.0 #

  • K-fold cross validation added (KFoldCrossValidation)
  • Leave P out cross validation added (LpoCrossValidation)
  • DataTrainTestSplitter was removed

0.9.0 #

  • copy, fill methods were added to Vector

0.8.0 #

  • Reflection was removed for all cases (Vector instantiation, Optimizer instantiation)

0.7.0 #

  • Abstract Vector-class was added as a base for typed and regular vector classes

0.6.0 #

  • Manhattan norm support was added

0.5.2 #

  • README file was extended and clarified

0.5.1 #

  • Random interval obtaining for the mini-batch gradient descent was fixed

0.5.0 #

  • BGDOptimizer, MBGDOptimizer and GradientOptimizer were added

0.4.0 #

  • OptimizerInterface was added
  • Stochastic gradient descent optimizer was extracted from the linear regressor class
  • Line separators changed for all files (CRLF -> LF)

0.3.1 #

  • tests for sum, abs, fromRange methods of the TypedVector were added
  • tests for DataTrainTestSplitter was added

0.3.0 #

  • MAPE cost function was added

0.2.0 #

  • SGD Regressor refactored (rmse on training removed, estimator added) + example extended

0.1.0 #

  • Implementation of -, *, / operators and all vectors methods added to the TypedVector

0.0.1 #

  • Initial version

example/main.dart

import 'dart:async';

import 'package:ml_algo/ml_algo.dart';
import 'package:ml_linalg/matrix.dart';

/// A simple usage example using synthetic data. To see more complex examples,
/// please visit the other directories in this folder
Future main() async {
  // Let's create a feature matrix (a set of independent variables)
  final features = Matrix.fromList([
    [2.0, 3.0, 4.0, 5.0],
    [12.0, 32.0, 1.0, 3.0],
    [27.0, 3.0, 0.0, 59.0],
  ]);

  // Let's create a vector of dependent variables. It will be used as `true` values
  // to adjust the regression coefficients
  final labels = Matrix.fromList([
    [4.3],
    [3.5],
    [2.1],
  ]);

  // Let's create the regressor itself and train it
  final regressor = LinearRegressor.gradient(
      features, labels,
      iterationsLimit: 100,
      initialLearningRate: 0.0005,
      learningRateType: LearningRateType.constant);

  // Let's see the adjusted coefficients
  print('Regression coefficients: ${regressor.coefficients}');
}

Use this package as a library

1. Depend on it

Add this to your package's pubspec.yaml file:


dependencies:
  ml_algo: ^12.0.2

2. Install it

You can install packages from the command line:

with pub:


$ pub get

with Flutter:


$ flutter pub get

Alternatively, your editor might support pub get or flutter pub get. Check the docs for your editor to learn more.

3. Import it

Now in your Dart code, you can use:


import 'package:ml_algo/ml_algo.dart';
  
Version  Uploaded
12.0.2   May 24, 2019
12.0.1   May 24, 2019
12.0.0   May 21, 2019
11.0.1   Apr 23, 2019
11.0.0   Apr 21, 2019
10.3.0   Apr 20, 2019
10.2.1   Apr 17, 2019
10.2.0   Apr 16, 2019
10.1.0   Apr 3, 2019
10.0.0   Mar 28, 2019

All 48 versions...

Popularity: 48 (how popular the package is relative to other packages)
Health: 99 (code health derived from static analysis)
Maintenance: 90 (how tidy and up-to-date the package is)
Overall: 72 (weighted score of the above)

We analyzed this package on Jun 11, 2019, and provided the score, details, and suggestions below. The analysis was performed using:

  • Dart: 2.3.1
  • pana: 0.12.17

Platforms

Detected platforms: Flutter, web, other

No platform restriction found in primary library package:ml_algo/ml_algo.dart.

Health suggestions

Fix lib/src/regressor/linear_regressor.dart. (-1 points)

Analysis of lib/src/regressor/linear_regressor.dart reported 2 hints:

line 21 col 3: Prefer using /// for doc comments.

line 86 col 3: Prefer using /// for doc comments.
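
For reference, the fix this hint suggests is to replace block-style doc comments with the /// style; a hypothetical illustration (the method names below are made up):

/**
 * A block-style doc comment like this one triggers the hint.
 */
void someMethod() {}

/// The /// style preferred by the analyzer.
void anotherMethod() {}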

Fix lib/src/optimizer/convergence_detector/convergence_detector.dart. (-0.50 points)

Analysis of lib/src/optimizer/convergence_detector/convergence_detector.dart reported 1 hint:

line 1 col 1: Prefer using /// for doc comments.

Format lib/src/algorithms/knn/kernel.dart.

Run dartfmt to format lib/src/algorithms/knn/kernel.dart.

Fix additional 28 files with analysis or formatting issues.

Additional issues in the following files:

  • lib/src/algorithms/knn/kernel_type.dart (Run dartfmt to format lib/src/algorithms/knn/kernel_type.dart.)
  • lib/src/algorithms/knn/knn.dart (Run dartfmt to format lib/src/algorithms/knn/knn.dart.)
  • lib/src/classifier/linear_classifier_mixin.dart (Run dartfmt to format lib/src/classifier/linear_classifier_mixin.dart.)
  • lib/src/classifier/logistic_regressor/gradient_logistic_regressor.dart (Run dartfmt to format lib/src/classifier/logistic_regressor/gradient_logistic_regressor.dart.)
  • lib/src/classifier/logistic_regressor/logistic_regressor.dart (Run dartfmt to format lib/src/classifier/logistic_regressor/logistic_regressor.dart.)
  • lib/src/classifier/softmax_regressor/gradient_softmax_regressor.dart (Run dartfmt to format lib/src/classifier/softmax_regressor/gradient_softmax_regressor.dart.)
  • lib/src/classifier/softmax_regressor/softmax_regressor.dart (Run dartfmt to format lib/src/classifier/softmax_regressor/softmax_regressor.dart.)
  • lib/src/cost_function/log_likelihood.dart (Run dartfmt to format lib/src/cost_function/log_likelihood.dart.)
  • lib/src/cost_function/squared.dart (Run dartfmt to format lib/src/cost_function/squared.dart.)
  • lib/src/helpers/add_intercept_if.dart (Run dartfmt to format lib/src/helpers/add_intercept_if.dart.)
  • lib/src/helpers/get_probabilities.dart (Run dartfmt to format lib/src/helpers/get_probabilities.dart.)
  • lib/src/link_function/logit/float32_inverse_logit_link_function_mixin.dart (Run dartfmt to format lib/src/link_function/logit/float32_inverse_logit_link_function_mixin.dart.)
  • lib/src/link_function/logit/inverse_logit_link_function.dart (Run dartfmt to format lib/src/link_function/logit/inverse_logit_link_function.dart.)
  • lib/src/link_function/softmax/float32_softmax_link_function_mixin.dart (Run dartfmt to format lib/src/link_function/softmax/float32_softmax_link_function_mixin.dart.)
  • lib/src/link_function/softmax/softmax_link_function.dart (Run dartfmt to format lib/src/link_function/softmax/softmax_link_function.dart.)
  • lib/src/metric/regression/mape.dart (Run dartfmt to format lib/src/metric/regression/mape.dart.)
  • lib/src/model_selection/cross_validator/cross_validator.dart (Run dartfmt to format lib/src/model_selection/cross_validator/cross_validator.dart.)
  • lib/src/model_selection/cross_validator/cross_validator_impl.dart (Run dartfmt to format lib/src/model_selection/cross_validator/cross_validator_impl.dart.)
  • lib/src/model_selection/data_splitter/k_fold.dart (Run dartfmt to format lib/src/model_selection/data_splitter/k_fold.dart.)
  • lib/src/optimizer/coordinate/coordinate.dart (Run dartfmt to format lib/src/optimizer/coordinate/coordinate.dart.)
  • lib/src/optimizer/gradient/gradient.dart (Run dartfmt to format lib/src/optimizer/gradient/gradient.dart.)
  • lib/src/optimizer/optimizer.dart (Run dartfmt to format lib/src/optimizer/optimizer.dart.)
  • lib/src/optimizer/optimizer_factory.dart (Run dartfmt to format lib/src/optimizer/optimizer_factory.dart.)
  • lib/src/optimizer/optimizer_factory_impl.dart (Run dartfmt to format lib/src/optimizer/optimizer_factory_impl.dart.)
  • lib/src/regressor/coordinate_regressor.dart (Run dartfmt to format lib/src/regressor/coordinate_regressor.dart.)
  • lib/src/regressor/gradient_regressor.dart (Run dartfmt to format lib/src/regressor/gradient_regressor.dart.)
  • lib/src/regressor/knn_regressor.dart (Run dartfmt to format lib/src/regressor/knn_regressor.dart.)
  • lib/src/regressor/parameterless_regressor.dart (Run dartfmt to format lib/src/regressor/parameterless_regressor.dart.)

Maintenance suggestions

The package description is too short. (-10 points)

Add more detail to the description field of pubspec.yaml. Use 60 to 180 characters to describe the package, what it does, and its target use case.

Dependencies

Package  Constraint  Resolved  Available

Direct dependencies:
  Dart SDK  >=2.3.0 <3.0.0
  ml_linalg  ^10.0.0  10.3.2
  quiver  ^2.0.2  2.0.3
  tuple  ^1.0.2  1.0.2
  xrange  ^0.0.5  0.0.6

Transitive dependencies:
  matcher  0.12.5
  meta  1.1.7
  path  1.6.2
  stack_trace  1.9.3

Dev dependencies:
  benchmark_harness  >=1.0.0 <2.0.0
  build_runner  ^1.1.2
  build_test  ^0.10.2
  mockito  ^3.0.0
  pedantic  1.1.0
  test  ^1.2.0