Monday, March 20, 2006

Kullback-Leibler Divergence Properties

1. The divergence is never negative. By Gibbs' inequality, Dkl(P||Q) >= 0, with equality exactly when the two distributions are identical.

2. One might be tempted to call it a distance metric, but it is not symmetric, i.e.

Dkl(P||Q) is not in general equal to Dkl(Q||P) (from Wikipedia).
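Both properties are easy to check numerically. Here is a small Python sketch (the two distributions p and q are made-up examples, not from any particular dataset):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i), in nats.
    Terms with p_i == 0 contribute 0 by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

d_pq = kl_divergence(p, q)  # D_KL(P || Q)
d_qp = kl_divergence(q, p)  # D_KL(Q || P)

# Property 1: both values are >= 0 (Gibbs' inequality),
# and D_KL of a distribution with itself is 0.
print(d_pq, d_qp, kl_divergence(p, p))

# Property 2: the two directions generally differ.
print(d_pq != d_qp)
```

Running this shows two small positive numbers that are not equal, a zero for the self-divergence, and True for the asymmetry check.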
