Loss.sparseCrossEntropy constructor

Loss.sparseCrossEntropy()

Sparse cross-entropy loss function for a batch of samples.

sparseCrossEntropy(y, yP) = -sum(oneHot(y) * ln(yP)) / m, where y holds class indices (not one-hot vectors) and m is the number of samples
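In index notation, writing y_i for the class index of sample i and m for the batch size (a restatement of the formula above, assuming samples are stored one per column as in the implementation below):

\[
\mathrm{sparseCrossEntropy}(y, \hat{y}) = -\frac{1}{m} \sum_{i=1}^{m} \ln \hat{y}_{\,y_i,\, i}
\]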

Example:

final y = Matrix.column([1]); // class index 1 (the second class), equivalent to one-hot [0, 1, 0]
final yP = Matrix.column([0.1, 0.1, 0.8]);

final sparseCrossEntropy = Loss.sparseCrossEntropy();
double loss = sparseCrossEntropy.function(y, yP);
print(loss); // output: 2.3025850929940455, i.e. -ln(0.1)

Implementation

// Requires: import 'dart:math' as math;
Loss.sparseCrossEntropy() {
  function = (Matrix y, Matrix yP, [dynamic parameter]) {
    // Expand the sparse class indices into a one-hot matrix shaped like yP.
    // Labels may arrive as a column vector (one row per sample) or as a
    // row vector (one column per sample).
    Matrix categorical = Matrix.zero(n: yP.n, m: yP.m);
    if (y.m == 1) {
      for (int i = 0; i < y.n; i += 1) {
        categorical[y[i][0].toInt()][i] = 1;
      }
    } else {
      for (int i = 0; i < y.m; i += 1) {
        categorical[y[0][i].toInt()][i] = 1;
      }
    }
    // Element-wise ('%') product of the one-hot matrix with ln(yP), summed
    // and averaged over the yP.m samples; the ternary guards against ln(0).
    return -(categorical %
                yP.apply((x) => x != 0 ? math.log(x) : math.log(x + 1e-4)))
            .reduceSum() /
        yP.m;
  };
  dfunction = (Matrix y, Matrix yP, [dynamic fromSoftmax = false]) {
    // Same one-hot expansion as in `function`.
    Matrix categorical = Matrix.zero(n: yP.n, m: yP.m);
    if (y.m == 1) {
      for (int i = 0; i < y.n; i += 1) {
        categorical[y[i][0].toInt()][i] = 1;
      }
    } else {
      for (int i = 0; i < y.m; i += 1) {
        categorical[y[0][i].toInt()][i] = 1;
      }
    }
    if (fromSoftmax as bool) {
      // Fused with a softmax layer, the gradient simplifies to yP - oneHot(y).
      return yP - categorical;
    }
    // Otherwise d(loss)/d(yP) = -oneHot(y) / yP element-wise; the 1e4 caps
    // the reciprocal where a prediction is exactly 0.
    return -categorical % (yP.apply((x) => x != 0 ? 1 / x : 1e4));
  };
  name = 'sparse_cross_entropy';
}
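For backpropagation, dfunction returns the gradient of the loss with respect to the predictions. A minimal sketch of the fromSoftmax shortcut, reusing the Matrix API from the example above (the expected gradient is worked out by hand from the implementation, not taken from a test run):

final y = Matrix.column([1]);
final yP = Matrix.column([0.1, 0.1, 0.8]); // e.g. a softmax output
final sparseCrossEntropy = Loss.sparseCrossEntropy();

// With fromSoftmax = true the gradient collapses to yP - oneHot(y).
final grad = sparseCrossEntropy.dfunction(y, yP, true);
// grad is the column [0.1, -0.9, 0.8]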