Kullback-Leibler Divergence
The Kullback-Leibler Divergence (KLD), also known as relative entropy, measures how one probability distribution P diverges from a second, reference distribution Q.
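For discrete distributions P and Q over the same outcomes, it is defined (written here in LaTeX notation) as

D_{\mathrm{KL}}(P \parallel Q) = \sum_{i} p_i \log \frac{p_i}{q_i}

The sum is only well defined when every outcome with q_i = 0 also has p_i = 0; the implementation below sidesteps this edge case by clamping zero entries to a small epsilon.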
Implementation
import 'dart:math';

/// Computes the Kullback-Leibler divergence D(p || q) between two
/// discrete probability distributions given as equal-length lists.
double divergence(List<double> p, List<double> q) {
  if (p.length != q.length) {
    throw ArgumentError('The length of p and q must be the same.');
  }
  const double epsilon = 1e-12;
  double sum = 0;
  for (int i = 0; i < p.length; i++) {
    // Clamp zero entries to epsilon to avoid log(0) and division by zero.
    // This approximates the convention 0 * log(0) = 0 and keeps the result
    // finite when q has zero entries. Reading into locals also leaves the
    // caller's lists unmodified.
    final double pi = p[i] == 0 ? epsilon : p[i];
    final double qi = q[i] == 0 ? epsilon : q[i];
    sum += pi * log(pi / qi);
  }
  return sum;
}
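A minimal usage sketch follows; the sample distributions are illustrative and not part of the original. Since dart:math's log is the natural logarithm, the result is in nats.

void main() {
  // Two example distributions over three outcomes (values are illustrative).
  final p = [0.10, 0.40, 0.50];
  final q = [0.80, 0.15, 0.05];

  // KLD is asymmetric: D(p || q) generally differs from D(q || p).
  print(divergence(p, q)); // ≈ 1.336 nats
  print(divergence(q, p)); // ≈ 1.401 nats
}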