synadart 0.3.1
Library for creating neural networks, with a purely Dart implementation.
Version 0.1.0 #
- Added Multi-layer Perceptron and a basic algorithm for backpropagation
Version 0.1.1 #
- Added README.md, updated formatting
Version 0.2.0 #
- Added FF (feedforward) and simple Perceptron networks
- Added LReLU, eLU and tanh activation functions (standard definitions are sketched below)
- Renamed the 'sigmoid' function to 'logistic'
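
The functions added in this release follow their standard mathematical definitions. The snippet below is a minimal plain-Dart reference sketch of that maths; the names and signatures are illustrative and are not necessarily those used inside the package.

```dart
import 'dart:math';

// Leaky ReLU: passes positive values through, scales negatives by a
// small slope (0.01 is a common default).
double lrelu(double x, [double slope = 0.01]) => x > 0 ? x : slope * x;

// eLU: exponential linear unit; smooth for negative inputs.
double elu(double x, [double alpha = 1.0]) => x > 0 ? x : alpha * (exp(x) - 1);

// tanh: hyperbolic tangent, squashes input into (-1, 1).
double tanh(double x) {
  final e2x = exp(2 * x);
  return (e2x - 1) / (e2x + 1);
}

// Logistic (previously named 'sigmoid'): squashes input into (0, 1).
double logistic(double x) => 1 / (1 + exp(-x));
```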
Version 0.2.1 #
- Removed FF (feedforward) and simple Perceptron networks in favour of an upcoming simpler implementation of basically the same idea, through just one network model.
- Added [learningRate] as a parameter, replacing the hard-coded value of 0.2 (see the sketch below).
- Organised the files slightly
- Updated documentation of `Neuron`
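
For context, the learning rate scales how far each weight moves during a backpropagation update. This is an illustrative plain-Dart sketch of that role; the names (`updateWeights`, `delta`, and so on) are hypothetical and not the package's API.

```dart
// Illustrative only: how a learning rate typically scales a weight
// update during backpropagation; names are not the package's API.
void updateWeights(
  List<double> weights,
  List<double> inputs,
  double delta, // error signal for this neuron
  double learningRate, // previously hard-coded as 0.2
) {
  for (var i = 0; i < weights.length; i++) {
    // w_i := w_i + learningRate * delta * input_i
    weights[i] += learningRate * delta * inputs[i];
  }
}
```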
Version 0.2.2 #
- Updated documentation of `Layer` and removed a chunk of dead code.
Version 0.2.3 #
- Updated documentation of `Network`.
- Replaced `process()` in `Layer` with an `output` getter, simplifying the implementation of getting each `Neuron`'s output (a sketch of the shape of this change follows below).
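
A hypothetical sketch of what the refactor looks like in shape, assuming a `Layer` holds a list of `Neuron`s that each expose an `output`; the fields and types here are illustrative, not the package's exact implementation.

```dart
// Hypothetical sketch of the refactor's shape; fields and types are
// illustrative, not the package's exact implementation.
class Neuron {
  final List<double> weights;
  List<double> inputs = [];

  Neuron(this.weights);

  // Weighted sum of the inputs (activation omitted for brevity).
  double get output {
    var sum = 0.0;
    for (var i = 0; i < weights.length; i++) {
      sum += weights[i] * inputs[i];
    }
    return sum;
  }
}

class Layer {
  final List<Neuron> neurons;

  Layer(this.neurons);

  // Rather than calling a `process()` method, callers read a getter
  // that collects each neuron's output.
  List<double> get output => neurons.map((n) => n.output).toList();
}
```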
Version 0.2.4 #
- Updated documentation of `activation.dart`, having added explanations for the different activation functions.
Version 0.2.5 #
- Renamed `Multilayer Perceptron` to `Deep Feed-forward`, which should be a more fitting and future-proof name.
Version 0.3.0 #
- Updated documentation of `Logger`, `Backpropagation` and `ValueGenerator`.
- Created `examples` folder with a `recognise_5` example that allows for recognition of the number '5'.
Version 0.3.1 #
- Added 5 new activation functions: `SeLU`, `Softplus`, `Softsign`, `Swish` and `Gaussian` (standard definitions are sketched below).
- Renamed the `logistic` function to `sigmoid`.
- Added `abs` function for obtaining the absolute value of a number to `mathematical_operations`.
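
The newly added functions again follow their standard mathematical definitions. The plain-Dart sketch below is for reference only; signatures are illustrative, and the SeLU constants are the commonly published values rather than anything confirmed from the package source.

```dart
import 'dart:math';

// SeLU: scaled exponential linear unit, using the commonly published
// constants (lambda ≈ 1.0507, alpha ≈ 1.6733).
double selu(double x) {
  const lambda = 1.0507009873554805;
  const alpha = 1.6732632423543772;
  return lambda * (x > 0 ? x : alpha * (exp(x) - 1));
}

// Softplus: a smooth approximation of ReLU.
double softplus(double x) => log(1 + exp(x));

// Softsign: squashes input into (-1, 1) more gently than tanh.
double softsign(double x) => x / (1 + x.abs());

// Sigmoid (previously named 'logistic').
double sigmoid(double x) => 1 / (1 + exp(-x));

// Swish: the input scaled by its own sigmoid.
double swish(double x) => x * sigmoid(x);

// Gaussian: peaks at 1 for x = 0 and decays towards 0.
double gaussian(double x) => exp(-(x * x));

// Absolute value of a number.
double abs(double x) => x < 0 ? -x : x;
```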