# kNN estimator in action


## $k$-nearest neighbor entropy estimator

### Nearest neighbor estimator

$$
\hat{H}(X) = \frac{d}{N} \sum_{i=1}^{N} \ln d_1(x_i) + \ln V_d + \gamma + \ln(N-1)
$$

in which

- $\Gamma(\cdot)$ is the gamma function
- $\gamma \approx 0.5772$ is the Euler–Mascheroni constant
- $V_d = \pi^{d/2} / \Gamma(1 + d/2)$ is the volume of the $d$-dimensional unit sphere
- $d_1(x_i)$ is the distance from $x_i$ to its nearest neighbor
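As a concrete illustration, the estimator above can be sketched in a few lines of pure Python for 1D data (the function name `nn_entropy_1d` and the brute-force distance search are mine, chosen for clarity rather than speed):

```python
import math
import random

def nn_entropy_1d(xs):
    """Nearest neighbor (Kozachenko-Leonenko) entropy estimate in nats:
    (d/N) * sum ln d_1(x_i) + ln V_d + gamma + ln(N-1), with d = 1, V_1 = 2."""
    n = len(xs)
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    # mean log distance to each point's nearest neighbor (O(n^2) for clarity)
    mean_log_d1 = sum(
        math.log(min(abs(x - y) for j, y in enumerate(xs) if j != i))
        for i, x in enumerate(xs)
    ) / n
    return mean_log_d1 + math.log(2.0) + gamma + math.log(n - 1)

# Sanity check: the uniform distribution on [0, 1] has differential entropy 0.
random.seed(0)
h_hat = nn_entropy_1d([random.random() for _ in range(1000)])
```

For large samples a k-d tree or sorted-array lookup would replace the quadratic scan, but the constant terms of the estimator stay the same.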

### $k$-nearest neighbor estimator

$$
\hat{H}_k(X) = \frac{d}{N} \sum_{i=1}^{N} \ln d_k(x_i) + \ln V_d - \psi(k) + \ln(N-1)
$$

in which

- $d_k(x_i)$ is the distance from $x_i$ to its $k^{th}$-nearest neighbor
- $\psi(\cdot)$ is the digamma function; for $k = 1$, $-\psi(1) = \gamma$ recovers the nearest neighbor estimator
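A minimal sketch of the $k$-NN version for 1D data, again in pure Python (the helper names are mine; for integer $k$ the digamma function reduces to a harmonic sum, so no special-function library is needed):

```python
import math
import random

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def digamma_int(k):
    # psi(k) for a positive integer k: psi(k) = -gamma + sum_{j=1}^{k-1} 1/j
    return -GAMMA + sum(1.0 / j for j in range(1, k))

def knn_entropy_1d(xs, k):
    """k-NN entropy estimate (nats) for 1D data:
    (d/N) * sum ln d_k(x_i) + ln V_d - psi(k) + ln(N-1), with d = 1, V_1 = 2."""
    n = len(xs)
    total = 0.0
    for i, x in enumerate(xs):
        # distance to the k-th nearest neighbor, brute force for clarity
        dk = sorted(abs(x - y) for j, y in enumerate(xs) if j != i)[k - 1]
        total += math.log(dk)
    return total / n + math.log(2.0) - digamma_int(k) + math.log(n - 1)

# Sanity check: uniform on [0, 1] has differential entropy 0 nats.
random.seed(0)
h_hat = knn_entropy_1d([random.random() for _ in range(600)], k=3)
```

Larger $k$ trades variance for bias: the estimate averages over a wider neighborhood, which smooths noise but blurs fine structure in the density.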

### 1D Example

The data below are generated from the normal distribution $X \sim N(0, 1)$.

The true entropy is

$$
H(X) = \frac{1}{2} \log_2 (2 \pi e) \approx 2.0470956 \text{ bits}
$$

To estimate entropy using the $k$-nearest neighbor distance, we

(1) sort the data

(2) compute each point's nearest neighbor distance (for sorted 1D data the nearest neighbor is an adjacent point) and take the mean of its logarithm

(3) plug the result into the estimator above

The approximation at different sample sizes:
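The three steps above can be sketched as follows (a pure-Python illustration under my own naming; the sample here is drawn fresh with a fixed seed rather than being the exact data from the text, and the result is converted to bits to match the true value):

```python
import math
import random

def entropy_1d_sorted_bits(xs):
    """1-NN entropy estimate for 1D data via the three steps; returns bits."""
    xs = sorted(xs)                              # (1) sort the data
    n = len(xs)
    mean_log_d1 = sum(                           # (2) mean log NN distance;
        math.log(min(x - xs[i - 1] if i > 0 else float("inf"),
                     xs[i + 1] - x if i < n - 1 else float("inf")))
        for i, x in enumerate(xs)                #     in sorted order the NN is adjacent
    ) / n
    gamma = 0.5772156649015329                   # Euler-Mascheroni constant
    h_nats = mean_log_d1 + math.log(2.0) + gamma + math.log(n - 1)  # (3) plug in
    return h_nats / math.log(2.0)                # convert nats -> bits

random.seed(1)
sample = [random.gauss(0.0, 1.0) for _ in range(4000)]
est_bits = entropy_1d_sorted_bits(sample)  # close to 1/2 * log2(2*pi*e)
```

Sorting first makes step (2) linear time, since in 1D each point's nearest neighbor must be one of its two sorted neighbors.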

## $k$-nearest neighbor mutual information estimator

### Estimating entropy in joint space

Consider a pair of data points $(X, Y)$ as a single point $Z = (X, Y)$ in the joint space of dimension $d_X + d_Y$. The joint entropy $H(X, Y)$ can then be treated as the entropy of $Z$ in the joint space and estimated with the same $k$-nearest neighbor estimator, so that $I(X; Y) = H(X) + H(Y) - H(X, Y)$.
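A naive plug-in sketch of this joint-space idea, assuming 1D marginals and using the general $d$-dimensional unit-ball volume $V_d = \pi^{d/2}/\Gamma(1 + d/2)$ (function names are mine; refinements such as the Kraskov-Stögbauer-Grassberger estimator reduce the bias from mixing estimates across spaces of different dimension, but the plug-in version shows the structure):

```python
import math
import random

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def nn_entropy(points):
    """1-NN entropy estimate (nats) for d-dimensional points given as tuples:
    (d/N) * sum ln d_1(z_i) + ln V_d + gamma + ln(N-1)."""
    n = len(points)
    d = len(points[0])
    v_d = math.pi ** (d / 2) / math.gamma(1 + d / 2)  # unit-ball volume
    mean_log_d1 = sum(
        math.log(min(math.dist(p, q) for j, q in enumerate(points) if j != i))
        for i, p in enumerate(points)
    ) / n
    return d * mean_log_d1 + math.log(v_d) + GAMMA + math.log(n - 1)

def mi_nn(xs, ys):
    """Naive MI estimate I(X;Y) = H(X) + H(Y) - H(X,Y), where the joint
    term treats each pair (x_i, y_i) as a single 2D point z_i."""
    return (nn_entropy([(x,) for x in xs])
            + nn_entropy([(y,) for y in ys])
            - nn_entropy(list(zip(xs, ys))))

# Sanity check: independent variables have zero mutual information.
random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(500)]
ys = [random.gauss(0.0, 1.0) for _ in range(500)]
mi_hat = mi_nn(xs, ys)
```

Note that for $d = 1$ the volume term reduces to $V_1 = 2$ and for $d = 2$ to $V_2 = \pi$, matching the univariate estimator used earlier.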