Shannon (unit)

From Wikipedia, the free encyclopedia

The shannon (symbol: Sh) is a unit of information defined by IEC 80000-13. One shannon is the information content of an event occurring when its probability is 1/2. If a message is made of a sequence of a given number of bits, with all possible bit strings being equally likely, the information content of one such message expressed in shannons is equal to the number of bits in the sequence.[1] For this and historical reasons, the shannon is more commonly known as the bit. Using the shannon rather than the bit as a unit gives an explicit distinction between the amount of information that is expressed and the quantity of data that may be used to represent the information.[2]
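The definition above can be illustrated with a short calculation: the self-information of an event with probability p is −log₂(p) shannons, so an event of probability 1/2 carries 1 Sh, and a message of n bits whose 2ⁿ possible strings are equally likely carries n Sh. A minimal sketch (the function name is illustrative):

```python
import math

def information_content_sh(p):
    """Self-information of an event with probability p, in shannons."""
    return -math.log2(p)

# An event with probability 1/2 carries exactly 1 Sh:
print(information_content_sh(0.5))        # 1.0
# An 8-bit message, all 256 strings equally likely, carries 8 Sh:
print(information_content_sh(1 / 256))    # 8.0
```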

The shannon can be converted to other information units according to[3]

1 Sh ≈ 0.693 nat ≈ 0.301 Hart.
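These conversion factors are just changes of logarithm base: the nat uses the natural logarithm and the hartley the base-10 logarithm, so 1 Sh = ln 2 nat = log₁₀ 2 Hart. A quick check:

```python
import math

# 1 shannon expressed in nats and hartleys via change of base
sh_in_nat = math.log(2)      # ln 2  ≈ 0.693 nat
sh_in_hart = math.log10(2)   # log10 2 ≈ 0.301 Hart

print(round(sh_in_nat, 3), round(sh_in_hart, 3))   # 0.693 0.301
```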

The shannon is named after Claude Shannon, the founder of information theory.

The Shannon entropy (uncertainty) of a discrete probability distribution is equal to the expected value of the information content of an outcome,[citation needed] and so Shannon entropy has the same units as information. Thus, one shannon is also the Shannon entropy of a system with two equally probable states.[4]
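The two-state case can be sketched numerically: the entropy of a Bernoulli(p) variable is −p log₂ p − (1 − p) log₂(1 − p) shannons, which reaches its maximum of 1 Sh at p = 1/2 (the function name below is illustrative):

```python
import math

def binary_entropy_sh(p):
    """Shannon entropy of a two-state system with probabilities p and 1-p, in shannons."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy_sh(0.5))   # 1.0 — equally probable states give exactly 1 Sh
print(binary_entropy_sh(0.9))   # less than 1 Sh for any biased two-state system
```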

References

  1. ^ ""shannon", A Dictionary of Units of Measurement".
  2. ^ Olivier Rioul (2018). "This is IT: A primer on Shannon's entropy and Information" (PDF). L'Information, Séminaire Poincaré. XXIII: 43–77. Retrieved 2021-05-23. The Système International d'unités recommends the use of the shannon (Sh) as the information unit in place of the bit to distinguish the amount of information from the quantity of data that may be used to represent this information. Thus according to the SI standard, H(X) should actually be expressed in shannons. The entropy of one bit lies between 0 and 1 Sh.
  3. ^ "IEC 80000-13:2008". International Organization for Standardization. Retrieved 21 July 2013.
  4. ^ Olivier Rioul (2018). "This is IT: A primer on Shannon's entropy and Information" (PDF). L'Information, Séminaire Poincaré. XXIII: 43–77. Retrieved 2021-05-23. To illustrate the difference between binary digit [bit] and binary unit [shannon], consider one random bit X ∈ {0,1}. This random variable X follows a Bernoulli distribution with some parameter p. Its entropy [...] can take any value between 0 bit and 1 bit. The maximum value 1 [shannon] is attained in the equiprobable case p = 1/2. Otherwise, the entropy of one bit is actually less than one [shannon].