Entropy

The entropy of a random variable measures how hard it is to predict the variable's value, i.e., how uncertain the variable is. It is also called the "expected surprisal", since -log p(x) is the surprisal of observing the outcome x.

H(X) = -∫ p(x) log p(x) dx = E[-log p(X)]


Illustration: Entropy of a coin flip


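The coin-flip illustration can be sketched numerically. A minimal Python example (the function name `entropy` is my own, not from the original): a fair coin is maximally unpredictable and carries 1 bit of entropy, while a biased coin is easier to predict and carries less.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), with 0*log(0) := 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: maximum uncertainty -> 1 bit
print(entropy([0.5, 0.5]))   # → 1.0

# Biased coin: more predictable -> lower entropy (~0.469 bits)
print(entropy([0.9, 0.1]))

# Deterministic "coin": no uncertainty -> 0 bits
print(entropy([1.0]))        # → 0.0
```

Sweeping the bias p from 0 to 1 traces the familiar curve that peaks at p = 0.5, which is what the coin-flip figure typically depicts.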

Reference

Entropy: https://en.wikipedia.org/wiki/Entropy