softmax method
The softmax activation function:
\text{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}
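In practice the largest input m is subtracted before exponentiating; the shift cancels in the ratio, so the result is unchanged while \exp can no longer overflow for large inputs:
\text{Softmax}(x_{i}) = \frac{\exp(x_i - m)}{\sum_j \exp(x_j - m)}, \quad m = \max_j x_j
The implementation below uses this numerically stable form.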
Implementation
import 'dart:math';

// Applies the softmax in place and returns the same list.
List<double> softmax(List<double> val) {
  // Subtract the largest element before exponentiating so exp() cannot overflow.
  final maxVal = val.fold(double.negativeInfinity, max);
  double sum = 0.0;
  for (var i = 0; i < val.length; i++) {
    val[i] = exp(val[i] - maxVal);
    sum += val[i];
  }
  // Normalize so the outputs sum to 1.
  for (var i = 0; i < val.length; i++) {
    val[i] /= sum;
  }
  return val;
}
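A minimal usage sketch (note that softmax mutates its argument in place and returns it): for the input [1.0, 2.0, 3.0] the outputs are approximately [0.0900, 0.2447, 0.6652] and sum to 1.

void main() {
  final probs = softmax([1.0, 2.0, 3.0]);
  print(probs); // approximately [0.0900, 0.2447, 0.6652]
}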