# of observing an [[sigma-algebra|event]]
The **surprise** upon observing an event $A$ is its **negative log-likelihood**:
$$
\begin{align*}
\mathrm{surprise}(A) &:= - \log_{2} \Pr(A) \text{ bits} \\
&= -\ln \Pr(A) \text{ nats}
\end{align*}
$$
i.e. gaining $1$ bit of information from an event means the event had probability $\frac{1}{2}$, so observing it cut the sample space in half
![[negative log likelihood graph.png|200]]
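as a sanity check, a minimal Python sketch (function names here are illustrative, not from any library) computing the surprise of a couple of events in both units; the fair-coin case shows the one-bit-halves-the-space intuition:

```python
import math

def surprise_bits(p: float) -> float:
    """Surprise of an event with probability p, in bits."""
    return -math.log2(p)

def surprise_nats(p: float) -> float:
    """Surprise of the same event, in nats."""
    return -math.log(p)

# A fair coin landing heads: probability 1/2 -> exactly 1 bit,
# i.e. the event cut the sample space in half.
print(surprise_bits(0.5))    # 1.0
print(surprise_nats(0.5))    # ~0.693 (= ln 2)

# Rolling a 6 on a fair die: probability 1/6 -> log2(6) ≈ 2.585 bits.
print(surprise_bits(1 / 6))  # ~2.585
```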
# of observing the value of a [[random variable]]
i.e. if $x \sim p$ then $- \log_{2} p(x)$ measures the surprise of observing the value $x$
the expected surprise $\mathbb{E}_{x \sim p}[-\log_{2} p(x)]$ is called the [[entropy]] of $p$
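a quick Monte Carlo sketch of that fact (the distribution `p` below is a made-up example): the sample mean of the surprise over draws $x \sim p$ converges to the entropy:

```python
import math
import random

# Hypothetical example distribution p over three outcomes.
p = {"a": 0.5, "b": 0.25, "c": 0.25}

# Exact entropy: the expected surprise under p.
entropy = sum(prob * -math.log2(prob) for prob in p.values())

# Monte Carlo estimate: average the surprise -log2 p(x) over samples x ~ p.
values, probs = zip(*p.items())
samples = random.choices(values, weights=probs, k=100_000)
avg_surprise = sum(-math.log2(p[x]) for x in samples) / len(samples)

print(entropy)       # 1.5 bits
print(avg_surprise)  # ≈ 1.5 bits
```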