An Eager Avocado


I give myself very good advice, but I very seldom follow it.

kNN estimator in action


$k$-nearest neighbor entropy estimator

Nearest neighbor estimator

$$\hat{H}_n = \frac{1}{n}\sum_{i=1}^{n} \ln(n\,\rho_i) + \ln 2 + \gamma$$

in which $\rho_i$ is the distance from $X_i$ to its nearest neighbor and $\gamma \approx 0.5772$ is the Euler–Mascheroni constant.

$k$-nearest neighbor estimator

$$\hat{H}_{n,k} = \psi(n) - \psi(k) + \ln c_d + \frac{d}{n}\sum_{i=1}^{n} \ln \epsilon_i$$

in which $\epsilon_i$ is the distance from $X_i$ to its $k$-th nearest neighbor, $c_d = \pi^{d/2}/\Gamma(d/2+1)$ is the volume of the $d$-dimensional unit ball, and $\psi$ is the digamma function.
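As a concrete sketch of the Kozachenko–Leonenko estimator for 1-D data (Python with NumPy rather than the post's original code; all function and variable names below are my own):

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329

def digamma_int(m):
    # digamma at a positive integer: psi(m) = -gamma + sum_{i=1}^{m-1} 1/i
    return -EULER_GAMMA + sum(1.0 / i for i in range(1, m))

def knn_entropy_1d(x, k=1):
    """Kozachenko-Leonenko k-NN entropy estimate (in nats) for 1-D data."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # brute-force pairwise distances; after sorting each row, column 0 is
    # the zero self-distance, so column k holds the k-th neighbor distance
    dists = np.abs(x[:, None] - x[None, :])
    dists.sort(axis=1)
    eps = dists[:, k]
    c_1 = 2.0  # "volume" of the 1-D unit ball, the interval (-1, 1)
    return digamma_int(n) - digamma_int(k) + np.log(c_1) + np.mean(np.log(eps))
```

For a few thousand draws from $N(0,1)$ the estimate should land near $\frac{1}{2}\ln(2\pi e) \approx 1.4189$ nats. The brute-force distance matrix is $O(n^2)$; a k-d tree is the usual choice for larger samples.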

1D Example

The data below are generated from the standard Normal distribution $X \sim N(0,1)$.

The true entropy is

$$H(X) = \frac{1}{2}\log_2(2\pi e) \approx 2.0470956 \text{ bits}$$
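As a quick check (a snippet of my own, not from the post), the quoted value matches the differential entropy of $N(0,1)$ measured in bits, $\frac{1}{2}\log_2(2\pi e)$:

```python
import math

# differential entropy of N(0, 1) in bits: H = (1/2) * log2(2 * pi * e)
h_bits = 0.5 * math.log2(2 * math.pi * math.e)
print(f"{h_bits:.7f}")  # 2.0470956
```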


To estimate entropy using $k$-nearest neighbor distances, we

(1) sort the data

(2) compute each point's nearest-neighbor distance and average the log-distances

(3) plug the result into the estimator
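The three steps might be sketched as follows (Python rather than the post's original code; the estimate is in nats, so for $N(0,1)$ it should approach $\frac{1}{2}\ln(2\pi e) \approx 1.4189$):

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329

def entropy_1nn(x):
    """1-nearest-neighbor entropy estimate (nats) for 1-D continuous data."""
    # (1) sort the data (assumes no repeated values)
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # (2) nearest-neighbor distance: for sorted 1-D data it is the smaller
    #     of the gaps to the left and right neighbors
    gaps = np.diff(x)
    rho = np.minimum(np.concatenate(([np.inf], gaps)),
                     np.concatenate((gaps, [np.inf])))
    # (3) plug the log-distances into the 1-NN estimator
    return np.mean(np.log(n * rho)) + np.log(2.0) + EULER_GAMMA
```

Sorting first makes step (2) an $O(n \log n)$ pass over adjacent gaps instead of an $O(n^2)$ pairwise search.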

The approximation at different numbers of samples:

$k$-nearest neighbor mutual information estimator

Estimating entropy in joint space

Consider a pair of data points $(X, Y)$ to be a single data point $Z$ in the joint space of $d_X + d_Y$ dimensions. The joint entropy $H(X, Y)$ can then be treated as the entropy of $Z$ in the joint space.
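Under that view, one naive sketch of a $k$-NN mutual information estimator uses $I(X;Y) = H(X) + H(Y) - H(X,Y)$, estimating each term with the Kozachenko–Leonenko estimator in its own space (Python; names are my own, and the KSG estimator of Kraskov et al. refines this idea by sharing one length scale across the three terms):

```python
import math
import numpy as np

EULER_GAMMA = 0.5772156649015329

def digamma_int(m):
    # digamma at a positive integer: psi(m) = -gamma + sum_{i=1}^{m-1} 1/i
    return -EULER_GAMMA + sum(1.0 / i for i in range(1, m))

def knn_entropy(z, k=3):
    """Kozachenko-Leonenko entropy estimate (nats) for an (n, d) sample."""
    z = np.asarray(z, dtype=float)
    if z.ndim == 1:
        z = z[:, None]
    n, d = z.shape
    # Euclidean distance from each point to its k-th nearest neighbor
    diff = z[:, None, :] - z[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    dist.sort(axis=1)
    eps = dist[:, k]  # column 0 is the zero self-distance
    c_d = math.pi ** (d / 2) / math.gamma(d / 2 + 1)  # unit-ball volume
    return (digamma_int(n) - digamma_int(k) + math.log(c_d)
            + d * np.mean(np.log(eps)))

def knn_mutual_info(x, y, k=3):
    # I(X; Y) = H(X) + H(Y) - H(X, Y): treat each pair as one point Z
    # in the joint space and estimate the joint term there
    z = np.column_stack((x, y))
    return knn_entropy(x, k) + knn_entropy(y, k) - knn_entropy(z, k)
```

For independent inputs the estimate should hover near zero, and it grows with correlation; the naive three-entropy version carries some bias, which is exactly what KSG was designed to cancel.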