Commit 15a731ca authored by Romain Reuillon

[Tool] enh: implement KL divergence

parent e319119c
Pipeline #1240 passed with stages in 45 minutes and 46 seconds
@@ -206,4 +206,12 @@ trait Stat {
    Some(test.kolmogorovSmirnovTest(new NormalDistribution(null, mu, sigma), data.toArray))
  }
  def klDivergence(p1: Array[Double], p2: Array[Double]) = {
    // Drop pairs where either probability is zero: the p1 == 0 terms are 0 by
    // convention, and the p2 == 0 terms are undefined.
    val s = (p1 zip p2).
      filterNot { case (p1, p2) ⇒ p1 == 0.0 || p2 == 0.0 }.
      map { case (p1, p2) ⇒ p1 * math.log(p1 / p2) }.sum
    // Convert the sum from nats to bits (log base 2)
    s / math.log(2)
  }
}
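For context, here is a minimal standalone sketch of how the new helper behaves. The object name, the example distributions, and the printed value are illustrative and not part of the patch; only the klDivergence body mirrors the diff above. It computes D_KL(P || Q) = Σ p_i · log2(p_i / q_i), skipping zero entries, and reports the result in bits.

// Illustrative only: a standalone copy of the computation from this patch,
// applied to two hypothetical discrete distributions.
object KlDivergenceExample extends App {
  def klDivergence(p1: Array[Double], p2: Array[Double]) = {
    val s = (p1 zip p2).
      filterNot { case (a, b) ⇒ a == 0.0 || b == 0.0 }.
      map { case (a, b) ⇒ a * math.log(a / b) }.sum
    s / math.log(2) // nats to bits
  }

  val p = Array(0.4, 0.3, 0.2, 0.1)     // hypothetical distribution P
  val q = Array(0.25, 0.25, 0.25, 0.25) // hypothetical uniform reference Q

  // D_KL(P || Q) ≈ 0.154 bits for these inputs
  println(klDivergence(p, q))
}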