Loss.crossEntropy constructor

Loss.crossEntropy()

Cross-entropy loss function for a batch of samples.

crossEntropy(y, yP) = -sum(y * ln(yP)) / m, for y being one-hot encoded and m the batch size (the implementation divides the sum by y.m).

Because y is one-hot, only the term at the hot index survives, so a single sample's loss reduces to -ln(yP[hot]).

Example:

final y = Matrix.column([0, 1, 0]);
final yP = Matrix.column([0.1, 0.1, 0.8]);

final crossEntropy = Loss.crossEntropy();
double loss = crossEntropy.function(y, yP);
print(loss); // output: 2.3025850929940455 (= -ln(0.1))
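
Only the entry where y equals 1 contributes to the sum, which is why the result is exactly -ln(0.1). A standalone check in plain Dart, independent of the Matrix class:

import 'dart:math' as math;

void main() {
  final y = [0.0, 1.0, 0.0]; // one-hot target
  final yP = [0.1, 0.1, 0.8]; // predicted probabilities

  // -sum(y * ln(yP)); only the hot index survives.
  var loss = 0.0;
  for (var i = 0; i < y.length; i++) {
    loss -= y[i] * math.log(yP[i]);
  }
  print(loss); // 2.3025850929940455, matching the output above
}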

Implementation

// `math` refers to `dart:math`, imported as: import 'dart:math' as math;
Loss.crossEntropy() {
  // Mean cross-entropy over the batch: -sum(y ∘ ln(yP)) / y.m.
  // Zero entries in yP are shifted by 1e-4 before the log to avoid
  // ln(0) = -infinity.
  function = (Matrix y, Matrix yP, [dynamic parameter]) {
    return -(y % yP.apply((x) => x != 0 ? math.log(x) : math.log(x + 1e-4)))
            .reduceSum() /
        y.m;
  };
  // Derivative with respect to yP: -y ∘ (1 / yP), with division by zero
  // clamped to 1e4. When the loss directly follows a softmax layer, pass
  // fromSoftmax = true to use the simplified combined derivative yP - y.
  dfunction = (Matrix y, Matrix yP, [dynamic fromSoftmax = false]) {
    if (fromSoftmax as bool) {
      return yP - y;
    }
    return -y % (yP.apply((x) => x != 0 ? 1 / x : 1e4));
  };
  name = 'cross_entropy';
}
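
The derivative is exposed through dfunction in the same way as function. Below is a minimal sketch of both branches, reusing the matrices from the example above; the expected values in the comments follow directly from the formulas in the implementation:

final y = Matrix.column([0, 1, 0]);
final yP = Matrix.column([0.1, 0.1, 0.8]);

final crossEntropy = Loss.crossEntropy();

// Plain gradient with respect to yP: -y ∘ (1 / yP).
// Only the hot index is nonzero here: [0, -10, 0].
final grad = crossEntropy.dfunction(y, yP);

// Shortcut for predictions coming from a softmax layer: yP - y.
// Here: [0.1, -0.9, 0.8].
final gradFromSoftmax = crossEntropy.dfunction(y, yP, true);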