This function calculates the (Shannon) entropy of a ddf distribution.
Details
The entropy \(\mathrm{H}\) of a discrete random variable \(X\) with image \(\mathcal{X}\) and probability mass function \(p\) is defined as $$\mathrm{H} = - \sum_{x\in\mathcal{X}} p(x) \log_b(p(x)),$$ where \(b\) denotes the base of the logarithm. Common choices for \(b\) are \(2\), Euler's number \(e\), and \(10\); the corresponding units of entropy are "bits" (or "shannons"), "nats", and "bans" (also called "hartleys" or "dits"), respectively.
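As a worked instance of the definition, consider two fair coin tosses, i.e. the uniform distribution on four equally likely outcomes with \(p(x) = 1/4\) for each \(x\): $$\mathrm{H} = -\sum_{x\in\mathcal{X}} \tfrac{1}{4} \log_2\!\left(\tfrac{1}{4}\right) = -4 \cdot \tfrac{1}{4} \cdot (-2) = 2 \text{ bits}.$$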
Entropy measures the expected level of "surprise" (information content) of the possible outcomes: it is largest for a uniform distribution and zero for a deterministic one.
Examples
# The entropy of two fair coin tosses in "bits" is 2
entropy(unif(4))
#> [1] 2
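# The same value can be checked directly from the definition
# using base R only (a manual check, independent of this package):
-sum(rep(1/4, 4) * log2(1/4))
#> [1] 2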
