Kullback.Leibler {truecluster}                R Documentation
Description

These functions calculate multivariate information-theoretic measures: Kullback-Leibler divergence.
Usage

Kullback.Leibler(p, basis = 2)
Arguments

p        a matrix; each column contains the probabilities of one distribution over the same alphabet (rows)
basis    basis of the logarithm used (default 2)
Details

For two distributions P and Q over the same alphabet, the divergence is

    D_KL(P||Q) = sum(P * log(P/Q))

where the logarithm is taken to the given basis (with the default basis of 2 the result is in bits).
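A minimal sketch of how a pairwise divergence matrix can be computed from this formula, assuming element [i, j] of the result holds D_KL(P_i || P_j) for columns i and j of p (kl.matrix is a hypothetical illustration, not the truecluster implementation):

## Hypothetical illustration of the formula above; not the truecluster code.
## Assumes element [i, j] of the result is D_KL(P_i || P_j) for columns i, j of p.
kl.matrix <- function(p, basis = 2) {
  k <- ncol(p)
  D <- matrix(0, k, k, dimnames = list(colnames(p), colnames(p)))
  for (i in seq_len(k)) {
    for (j in seq_len(k)) {
      P <- p[, i]
      Q <- p[, j]
      nz <- P > 0  ## 0 * log(0/q) contributes 0 by convention
      D[i, j] <- sum(P[nz] * log(P[nz] / Q[nz], base = basis))
    }
  }
  D
}

Note that the diagonal of such a matrix is always zero, since D_KL(P||P) = 0.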
Value

A square matrix of pairwise divergences between the distributions in the columns of p.

Totals
    always returned: a list with components H (joint entropy), Ha (row entropy), Hab (row conditional entropy given columns), Hb (column entropy), Hba (column conditional entropy given rows), Im (mutual information)
Margins
    returned unless grain="Total": pa (row probabilities), ha (row entropies), hab (columnwise conditional entropies), pb (column probabilities), hb (column entropy), hba (rowwise conditional entropies)
Cells
    returned if grain="Cells": p (joint probabilities), pab (columnwise conditional probabilities), pba (rowwise conditional probabilities), h (joint entropies), hab (columnwise conditional entropies), hba (rowwise conditional entropies)
Author(s)

Jens Oehlschlägel
References

MacKay, David J.C. (2003). Information Theory, Inference, and Learning Algorithms (chapter 8). Cambridge University Press.
See Also

shannon.information, dist.entropy, Kullback.Leibler, log
Examples

x <- seq(-3, 3, 0.1)
## discretize a standard normal over the grid cells
cp <- pnorm(x)
p <- cp[-1] - cp[-length(cp)]
## discretize a uniform distribution on [-3, 3] over the same cells
cq <- punif(x, -3, 3)
q <- cq[-1] - cq[-length(cq)]
## pairwise divergences between the two discrete distributions
Kullback.Leibler(cbind(p, q))
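The result can be checked against a direct evaluation of the formula from the Details section; the following is a hedged check, assuming the matrix entry pairing p with q equals D_KL(p||q) in bits:

## Direct evaluation of the Details formula for the example above,
## assuming the corresponding matrix entry is D_KL(p || q) in bits:
sum(p * log2(p / q))

Both p and q are strictly positive on this grid, so the sum is well defined.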