synadart 0.5.0
A limited but fully documented neural network library created for educational purposes.
0.5.0 #
- Update licence bearer and links.
- Bump dependencies.
- Bump the SDK version to `3.0.0`.
- BREAKING: Remove the `sprint` dependency, which is inappropriate for a package of this kind. Replaced logging with documented exceptions.
0.4.5 #
- Relicensed from GPLv3 to MIT.
0.4.4 #
- Updated SDK version from `2.12.0` to `2.17.0`.
- Updated project description to make it more accurate in describing what `synadart` actually is.
0.4.3 #
- Bumped version of `sprint` from `1.0.2+3` to `1.0.3`.
- Updated repository, homepage and issue tracker links.
- Refactored and made formatting and style changes to bring the project up to par.
0.4.2+1 #
- Updated package description.
0.4.2 #
- Updated `sprint` version from `1.0.0+1` to `1.0.2+3`.
- Replaced the now discontinued `pedantic` with the `words` lint ruleset.
- Reversed the order of versions in `CHANGELOG.md` from ascending to descending.
0.4.1+1 #
- Refactored code.
- Removed `logger.dart` in favour of the `sprint` package.
0.4.1 #
- Updated documentation.
0.4.0 #
- Organised code.
- Replaced network types such as `feed-forward` or `deep feed-forward` with a single class, `Sequential`.
- Moved the focus from `Network` to `Layer`, so that different layers can be added to a `Network` rather than creating new types of networks and limiting the user to a preset model (a minimal sketch follows below).
- Updated `example.dart` and `README.md`.
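To illustrate the layer-centric design, below is a minimal sketch of constructing a `Sequential` network from individual layers. It assumes the `Dense` layer, the `ActivationAlgorithm` enum and the `process` method as they appear in the package's later examples; exact names and signatures may differ between versions, so treat `example.dart` as the authoritative reference.

```dart
import 'package:synadart/synadart.dart';

void main() {
  // Sketch only: layer and parameter names are assumed from the package's
  // example code and may not match every version exactly.
  final network = Sequential(
    learningRate: 0.3,
    layers: [
      // Layers are passed to the network directly, rather than picking a
      // preset network type such as 'feed-forward' or 'deep feed-forward'.
      Dense(size: 3, activation: ActivationAlgorithm.sigmoid),
      Dense(size: 2, activation: ActivationAlgorithm.sigmoid),
      Dense(size: 1, activation: ActivationAlgorithm.sigmoid),
    ],
  );

  // Feed a three-value input through the (untrained) network.
  print(network.process([0.0, 1.0, 0.0]));
}
```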
0.3.2 #
- Added a simple feed-forward network model.
0.3.1 #
- Added 5 new activation functions: `SeLU`, `Softplus`, `Softsign`, `Swish` and `Gaussian` (standard definitions are sketched below).
- Renamed the 'logistic' function to 'sigmoid'.
- Created function `abs()` for obtaining the absolute value of a variable.
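For reference, the standard mathematical definitions of the five newly added functions are sketched below as standalone Dart functions. This is illustration only, not the package's implementation; the `SeLU` constants are the commonly used approximations.

```dart
import 'dart:math';

// Standard definitions of the five activation functions added in 0.3.1,
// written as plain functions for illustration.
double selu(double x, {double alpha = 1.6733, double lambda = 1.0507}) =>
    x > 0 ? lambda * x : lambda * alpha * (exp(x) - 1);

double softplus(double x) => log(1 + exp(x));

double softsign(double x) => x / (1 + x.abs());

// Swish is x multiplied by the sigmoid of x.
double swish(double x) => x / (1 + exp(-x));

double gaussian(double x) => exp(-(x * x));
```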
0.3.0 #
- Updated documentation of `Logger`, `Backpropagation` and `ValueGenerator`.
- Created `/examples` directory with a file `example.dart` that demonstrates the network being used to recognise the number '5'.
0.2.5 #
- Renamed 'Multilayer Perceptron' to 'Deep Feed-Forward', since 'deep feed-forward' is broader as a concept than 'multi-layer perceptrons'.
0.2.4 #
- Updated documentation of `activation.dart`, adding explanations for the different activation functions.
0.2.3 #
- Updated documentation of `Network`.
- Replaced `process()` in `Layer` with an `output` getter, simplifying how each `Neuron`'s output is obtained.
0.2.2 #
- Updated documentation of `Layer` and removed a chunk of dead code.
0.2.1 #
- Removed the feed-forward network and simple perceptrons in favour of an upcoming simpler implementation of networks, through the use of a single network model.
- Added [learningRate] as a parameter and removed the hard-coded value of `0.2` (the role of the learning rate is sketched below).
- Updated documentation of `Neuron`.
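As a brief illustration of why this matters: in plain gradient-descent backpropagation, the learning rate scales every weight adjustment. The snippet below is a generic sketch of that update, not code taken from the package.

```dart
// Generic gradient-descent weight update: the learning rate controls the
// step size that was previously fixed at 0.2.
double updateWeight(double weight, double gradient, double learningRate) =>
    weight - learningRate * gradient;
```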
0.2.0 #
- Added a feed-forward network and simple perceptrons.
- Added `LReLU`, `eLU` and `tanh` activation functions.
- Renamed 'sigmoid' to 'logistic'.
0.1.1 #
- Added `README.md` and updated formatting.
0.1.0 #
- Implemented a multilayer perceptron and a basic algorithm for backpropagation.