## Kullback-Leibler Divergence

Tags: #machine learning #kl divergence

### Equation

$$KL(P||Q)=\sum_{x}P(x)\log(\frac{P(x)}{Q(x)})$$

### Latex Code

KL(P||Q)=\sum_{x}P(x)\log(\frac{P(x)}{Q(x)})


### Introduction

#### Explanation

LaTeX code for the Kullback-Leibler divergence. I will briefly introduce the notations in this formula.

- $KL(P||Q)$: KL divergence between distributions $P$ and $Q$
- $P(x)$: probability that distribution $P$ assigns to $x$
- $Q(x)$: probability that distribution $Q$ assigns to $x$
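
The summation above can be computed directly for discrete distributions. Below is a minimal sketch in Python (the function name `kl_divergence` is illustrative, not from a specific library); it uses the common convention that terms with $P(x)=0$ contribute zero:

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence: KL(P||Q) = sum_x P(x) * log(P(x) / Q(x)).

    p, q: sequences of probabilities over the same support, each summing to 1.
    Terms where P(x) == 0 are skipped (they contribute 0 by convention).
    Assumes Q(x) > 0 wherever P(x) > 0; otherwise KL(P||Q) is infinite.
    """
    total = 0.0
    for px, qx in zip(p, q):
        if px > 0:
            total += px * math.log(px / qx)
    return total

# Example: two biased coins.
p = [0.7, 0.3]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # positive, since P differs from Q
print(kl_divergence(p, p))  # 0.0, since KL(P||P) = 0
```

Note that KL divergence is not symmetric: in general `kl_divergence(p, q) != kl_divergence(q, p)`, which is why it is called a divergence rather than a distance.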
