Calculate the [[wp:Entropy (information theory)|information entropy]] (Shannon entropy) of a given input string.
Entropy is the [[wp:Expected value|expected value]] of the [[wp:Self-information|information]] content in a system. In general, the Shannon entropy of a variable <math>X</math> taking values in a set <math>\Omega</math> is defined as:
:<math>H(X) = \sum_{x\in\Omega} P(x) I(x)</math>
where the information content <math>I(x) = -\log_{b} P(x)</math>. If the base of the logarithm is <math>b = 2</math>, the result is expressed in ''bits'', a [[wp:Units of information|unit of information]]. Therefore, given a string <math>S</math> whose <math>n</math> distinct characters occur with relative frequencies <math>P(s_1), \ldots, P(s_n)</math>, the entropy of the string in bits is:
:<math>H(S) = -\sum_{i=1}^{n} P(s_i) \log_2 (P(s_i))</math>
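For instance, a string such as "<tt>0011</tt>" has two distinct characters, each with relative frequency <math>\tfrac{1}{2}</math>, so its entropy is
:<math>H = -\left(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2}\right) = 1 \text{ bit}.</math>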
For this task, use "<tt>1223334444</tt>" as an example. The result should be around 1.84644 bits.
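One way to compute this is sketched below in Python (the task does not prescribe a language): count how often each character occurs, convert the counts to relative frequencies, and accumulate the sum from the formula above.
<syntaxhighlight lang="python">
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Shannon entropy of the string s, in bits."""
    n = len(s)
    # Relative frequency of each distinct character in s.
    freqs = (count / n for count in Counter(s).values())
    return -sum(p * log2(p) for p in freqs)

print(shannon_entropy("1223334444"))  # prints approximately 1.8464393446710154
</syntaxhighlight>
The <code>Counter</code> gathers per-character counts in a single pass; any equivalent frequency count would serve equally well.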