eneural_net 1.1.3
AI Library to create efficient Artificial Neural Networks. Computation uses SIMD (Single Instruction Multiple Data) to improve performance.
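For context on the SIMD claim: Dart exposes 128-bit SIMD lanes through `Float32x4` in `dart:typed_data`, which is the kind of primitive this package builds on. A minimal, stand-alone sketch using only the Dart SDK (no package API involved):

```dart
import 'dart:typed_data';

void main() {
  // Two Float32x4 values: each holds 4 single-precision lanes.
  var a = Float32x4(1.0, 2.0, 3.0, 4.0);
  var b = Float32x4.splat(0.5); // all 4 lanes = 0.5

  // A single operation updates all 4 lanes at once.
  var sum = a + b;
  var prod = a * b;

  print('sum : ${sum.x}, ${sum.y}, ${sum.z}, ${sum.w}');    // 1.5, 2.5, 3.5, 4.5
  print('prod: ${prod.x}, ${prod.y}, ${prod.z}, ${prod.w}'); // 0.5, 1.0, 1.5, 2.0

  // Float32x4List stores many lanes contiguously, useful for signals/weights.
  var list = Float32x4List(2);
  list[0] = a;
  list[1] = b;
  print(list[1].x); // 0.5
}
```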
1.1.3 #
- `ANN`:
  - Added `toJson`, `toJsonMap` and `fromJson` (see the serialization sketch after this list).
- `Layer`:
  - Added `toJson`, `toJsonMap` and `fromJson`.
- `ActivationFunction`:
  - Added `toJson`, `toJsonMap`, `fromJson` and `byName`.
- `Scale`:
  - Added `format`.
  - Added `toJson`, `toJsonMap` and `fromJson`.
- `Signal`:
  - Added `format` and `fromFormat`.
  - Optimized `values` implementation for each format.
- `Propagation`:
  - Removed unused `_layersPreviousGradientsDeltas`.
- Extension `ListExtension`:
  - Added `asDoubles` and `asInts`.
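The new JSON methods make it possible to persist a network and restore it later. A minimal sketch of that round trip, loosely modeled on the package README; the scale constant, layer classes and the Map-based `toJsonMap`/`fromJson` pairing (`ScaleDouble.ZERO_TO_ONE`, `LayerFloat32x4`, `HiddenLayerConfig`) are assumptions here, not confirmed signatures:

```dart
import 'package:eneural_net/eneural_net.dart';

void main() {
  var scale = ScaleDouble.ZERO_TO_ONE; // assumed scale constant
  var sigmoid = ActivationFunctionSigmoid();

  // A small 2-3-1 network (constructor shape assumed from the README example):
  var ann = ANN(
    scale,
    LayerFloat32x4(2, true, ActivationFunctionLinear()),
    [HiddenLayerConfig(3, true, sigmoid)],
    LayerFloat32x4(1, false, sigmoid),
  );

  // Serialize: assumed to return a JSON-compatible Map.
  var jsonMap = ann.toJsonMap();

  // Restore: assumed to accept the same Map back.
  var ann2 = ANN.fromJson(jsonMap);

  print(ann2);
}
```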
1.1.2 #
- `ActivationFunctionSigmoid`:
  - Changed to use the new, faster `dart:math.exp` function.
1.1.1 #
- `ActivationFunction`:
  - Added base class `ActivationFunctionFloat32x4`.
  - SIMD optimization: improved performance by 2x in `ActivationFunctionLinear`, `ActivationFunctionSigmoid`, `ActivationFunctionSigmoidFast` and `ActivationFunctionSigmoidBoundedFast`.
- `eneural_net_fast_math.dart` (see the sketch after this list):
  - `exp`: improved performance; input range bounded to -87..87.
  - `expFloat32x4`: new SIMD-optimized exponential function.
- `Chronometer`:
  - Improved `toString` numbers.
  - `Comparable`.
  - Operator `+`.
- `eneural_net_extensions`:
  - Improved extensions.
- Improved documentation.
- `Training`:
  - Added `logProgressEnabled`.
- Added dependency: `intl: ^0.17.0`.
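The fast-math functions are exposed as their own library (see 1.0.2 below). A minimal sketch of calling them; the import path and exact signatures are assumptions, only the function names come from this changelog:

```dart
import 'dart:typed_data';
import 'package:eneural_net/eneural_net_fast_math.dart' as fast_math;

void main() {
  // Scalar fast exponential; per the entry above, inputs are bounded to -87..87.
  print(fast_math.exp(1.0));

  // SIMD exponential: computes e^x for 4 lanes at once.
  var x = Float32x4(0.0, 1.0, 2.0, 3.0);
  var ex = fast_math.expFloat32x4(x);
  print('${ex.x}, ${ex.y}, ${ex.z}, ${ex.w}');
}
```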
1.1.0 #
- `ActivationFunction`:
  - Added field `flatSpot` for `derivativeEntryWithFlatSpot()`.
  - Added `ActivationFunctionLinear`.
  - `ActivationFunctionSigmoid`: activation with bounds (-700 .. 700).
- Improved collections and numeric extensions.
- Improved `DataStatistics` and added CSV generator.
- `Signal`:
  - Added SIMD related operations.
  - Added `computeSumSquaresMean`, `computeSumSquares` and `valuesAsDouble`.
  - Set extra values (out of length range): `setExtraValuesToZero`, `setExtraValuesToOne`, `setExtraValues`.
  - Improved documentation.
- `Sample`:
  - Input/output statistics and proximity.
- Added `SamplesSet`:
  - With per-set computed `defaultTargetGlobalError`.
  - Automatic `removeConflicts`.
- `Training` (see the training sketch after this list):
  - Split into `Propagation` and `ParameterStrategy`, allowing other algorithms.
  - Added `Backpropagation` with SIMD, smart learning rate and smart momentum.
  - Added `iRprop+`.
  - Added `TrainingLogger`.
  - Added `selectInitialANN`.
- `ANN`:
  - Optional bias neuron.
  - Allow a different `ActivationFunction` for each layer.
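The `Training`/`Backpropagation` and per-layer `ActivationFunction` entries fit together as follows. A minimal sketch loosely modeled on the package README; the sample/layer class names and constructor shapes (`SampleFloat32x4`, `LayerFloat32x4`, `HiddenLayerConfig`, `trainUntilGlobalError`) are assumptions, not verified signatures:

```dart
import 'package:eneural_net/eneural_net.dart';

void main() {
  var scale = ScaleDouble.ZERO_TO_ONE;

  // XOR samples, already normalized in the scale (sample format assumed):
  var samples = SampleFloat32x4.toListFromString(
    ['0,0=0', '1,0=1', '0,1=1', '1,1=0'],
    scale,
    true,
  );
  var samplesSet = SamplesSet(samples, subject: 'xor');

  var sigmoid = ActivationFunctionSigmoid();

  // 2-3-1 network; each layer may use a different ActivationFunction:
  var ann = ANN(
    scale,
    LayerFloat32x4(2, true, ActivationFunctionLinear()),
    [HiddenLayerConfig(3, true, sigmoid)],
    LayerFloat32x4(1, false, sigmoid),
  );

  // Backpropagation (SIMD, smart learning rate and momentum per this entry):
  var training = Backpropagation(ann, samplesSet);
  training.trainUntilGlobalError(
      targetGlobalError: 0.01, maxEpochs: 3000, maxRetries: 10);

  print('Global error: ${ann.computeSamplesGlobalError(samples)}');
}
```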
1.0.2 #
- Expose fast math as an additional library.
1.0.1 #
- `README.md`:
  - Improve text.
  - Improve activation function text.
- Fix example.
1.0.0 #
- Initial version.
- Training algorithms: Backpropagation.
- Activation functions: Sigmoid and approximation versions (see the sketch below).
- Fast math functions.
- SIMD: Float32x4
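To make the "approximation versions" idea concrete, here is a generic illustration of the standard technique: a logistic sigmoid next to a division-based approximation that avoids `exp`. This is not necessarily the package's exact `ActivationFunctionSigmoidFast` formula:

```dart
import 'dart:math' as math;

// Classic logistic sigmoid: smooth, 0..1 range, but calls exp().
double sigmoid(double x) => 1.0 / (1.0 + math.exp(-x));

// Generic fast approximation: same shape and 0..1 range, no exp() call.
double sigmoidFast(double x) => 0.5 + 0.5 * (x / (1.0 + x.abs()));

void main() {
  for (var x in [-4.0, -1.0, 0.0, 1.0, 4.0]) {
    print('x=$x  sigmoid=${sigmoid(x).toStringAsFixed(4)}'
        '  fast=${sigmoidFast(x).toStringAsFixed(4)}');
  }
}
```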