Activation.relu constructor
Activation.relu()
ReLU (Rectified Linear Unit) activation function
relu(x) = max(0, x)

Its derivative, used during backpropagation, is 1 for x > 0 and 0 otherwise.
Example:
final relu = Activation.relu();
final x = Matrix.row([-1, 1]);
final y = relu.function(x);
print(y); // output: matrix 1 x 2 [[0.0, 1.0]]
Implementation
Activation.relu() {
  function =
      (Matrix m, [dynamic param]) => m.apply((double x) => math.max(x, 0));
  dfunction =
      (Matrix m, [dynamic param]) => [m.apply((double x) => x > 0 ? 1 : 0)];
  name = 'relu';
}
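The derivative callback can be exercised the same way as `function`. A minimal sketch, assuming the same `Matrix` API as the example above (`dfunction` returns a list of matrices, per the implementation):

```dart
final relu = Activation.relu();
final x = Matrix.row([-2.0, 0.5, 3.0]);

// The gradient of relu is element-wise: 1 where x > 0, 0 elsewhere.
final dy = relu.dfunction(x);
print(dy.first); // a 1 x 3 matrix with entries 0, 1, 1
```

Note that the derivative at exactly x = 0 is defined as 0 here, matching the `x > 0 ? 1 : 0` condition in the implementation.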