Let $C$ be a $q$-ary code of length $n$, i.e. a subset of $[q]^n$.[1] Let $R$ be the rate of $C$, $\delta$ the relative distance and
$$B_q(y, r) = \{ x \in [q]^n \mid \Delta(x, y) \le r \}$$
be the Hamming ball of radius $r$ centered at $y$. Let $\operatorname{Vol}_q(r, n) = |B_q(y, r)|$ be the volume of the Hamming ball of radius $r$. It is obvious that the volume of a Hamming ball is translation-invariant, i.e. indifferent to the center $y$. In particular, $|B_q(y, r)| = |B_q(0, r)|$.
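As a concrete illustration (not part of the original statement), the volume can be computed from the standard counting formula $\operatorname{Vol}_q(r, n) = \sum_{i=0}^{r}\binom{n}{i}(q-1)^i$. The sketch below, with the illustrative function name `hamming_ball_volume`, shows that the count depends only on $q$, $n$ and $r$, never on the center.

```python
from math import comb

def hamming_ball_volume(q: int, n: int, r: int) -> int:
    """Volume of a Hamming ball of radius r in [q]^n.

    Counting words at distance exactly i from any fixed center gives
    C(n, i) * (q - 1)**i choices, so the volume is the sum over i <= r.
    The center never enters the count, which is the translation
    invariance noted above.
    """
    return sum(comb(n, i) * (q - 1) ** i for i in range(r + 1))

# Example: ternary words of length 5 within distance 2 of any center.
print(hamming_ball_volume(3, 5, 2))  # 1 + 5*2 + 10*2**2 = 51
```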
With large enough $n$, the rate $R$ and the relative distance $\delta$ satisfy the Elias–Bassalygo bound:
$$R \le 1 - H_q\bigl(J_q(\delta)\bigr) + o(1),$$
where
$$H_q(x) = x\log_q(q-1) - x\log_q x - (1-x)\log_q(1-x)$$
is the $q$-ary entropy function and
$$J_q(\delta) = \left(1 - \frac{1}{q}\right)\left(1 - \sqrt{1 - \frac{q\delta}{q-1}}\right)$$
is a function related to the Johnson bound.
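The bound can be evaluated numerically; the sketch below does so for a binary code, where the function names `entropy_q`, `johnson_radius`, and `elias_bassalygo_rate` are illustrative choices rather than notation from the article.

```python
from math import log, sqrt

def entropy_q(q: int, x: float) -> float:
    """q-ary entropy H_q(x) = x*log_q(q-1) - x*log_q(x) - (1-x)*log_q(1-x)."""
    if x == 0.0:
        return 0.0
    if x == 1.0:
        return log(q - 1, q)
    return x * log(q - 1, q) - x * log(x, q) - (1 - x) * log(1 - x, q)

def johnson_radius(q: int, delta: float) -> float:
    """J_q(delta) = (1 - 1/q) * (1 - sqrt(1 - q*delta/(q-1)))."""
    return (1 - 1 / q) * (1 - sqrt(1 - q * delta / (q - 1)))

def elias_bassalygo_rate(q: int, delta: float) -> float:
    """Asymptotic upper bound 1 - H_q(J_q(delta)) on the rate R."""
    return 1 - entropy_q(q, johnson_radius(q, delta))

# A binary code with relative distance 0.1 has rate at most about 0.70.
print(elias_bassalygo_rate(2, 0.1))
```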
Proof
To prove the Elias–Bassalygo bound, start with the following Lemma:
Lemma. For $C \subseteq [q]^n$ and $0 \le e \le n$, there exists a Hamming ball of radius $e$ with at least
$$\frac{|C| \cdot \operatorname{Vol}_q(e, n)}{q^n}$$
codewords in it.
Proof of Lemma. Randomly pick a received word $y \in [q]^n$ and let $B_q(y, e)$ be the Hamming ball centered at $y$ with radius $e$. Since $y$ is uniformly random, each codeword lies in $B_q(y, e)$ with probability $\operatorname{Vol}_q(e, n)/q^n$, so by linearity of expectation the expected size of the overlapped region $|B_q(y, e) \cap C|$ is
$$\frac{|C| \cdot \operatorname{Vol}_q(e, n)}{q^n}.$$
Since this is the expected value of the size, there must exist at least one $y$ such that
$$|B_q(y, e) \cap C| \ge \frac{|C| \cdot \operatorname{Vol}_q(e, n)}{q^n},$$
otherwise the expectation would be smaller than this value.
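The averaging argument can be checked exhaustively on a toy example. The code below uses a small, hypothetical binary code (not one discussed here) and verifies that the average of $|B_q(y, e) \cap C|$ over all centers equals $|C|\cdot\operatorname{Vol}_q(e, n)/q^n$, so the best center contains at least that many codewords.

```python
from itertools import product
from math import comb

def hamming_distance(a, b):
    return sum(u != v for u, v in zip(a, b))

def ball_volume(q, n, e):
    return sum(comb(n, i) * (q - 1) ** i for i in range(e + 1))

# Hypothetical toy code: binary, length 4, ball radius e = 1.
q, n, e = 2, 4, 1
code = [(0, 0, 0, 0), (1, 1, 1, 1), (0, 0, 1, 1), (1, 1, 0, 0)]

# Count the codewords inside B_q(y, e) for every possible center y.
counts = [sum(hamming_distance(y, c) <= e for c in code)
          for y in product(range(q), repeat=n)]

expected = len(code) * ball_volume(q, n, e) / q ** n
# The average over all centers equals |C|*Vol_q(e,n)/q^n exactly, so the
# maximum over centers is at least that value.
print(sum(counts) / len(counts), expected, max(counts))
```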
Now we prove the Elias–Bassalygo bound. Define $e = nJ_q(\delta) - 1$. By the lemma, there exists a Hamming ball with $B$ codewords such that
$$B \ge \frac{|C| \cdot \operatorname{Vol}_q(e, n)}{q^n}.$$
By the Johnson bound, we have $B \le qdn$. Thus,
$$|C| \le qnd \cdot \frac{q^n}{\operatorname{Vol}_q(e, n)} \le qnd \cdot q^{n(1 - H_q(J_q(\delta)) + o(1))}.$$
The second inequality follows from the lower bound on the volume of a Hamming ball:
$$\operatorname{Vol}_q(e, n) \ge q^{H_q\left(\frac{e}{n}\right)n - o(n)}.$$
Taking $\log_q$ and dividing by $n$ gives $R = \frac{\log_q |C|}{n} \le 1 - H_q(J_q(\delta)) + o(1)$, which is the claimed bound.
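As a rough numerical sanity check (an illustration, not part of the proof), the normalized log-volume $\frac{1}{n}\log_q \operatorname{Vol}_q(\lfloor pn \rfloor, n)$ approaches $H_q(p)$ as $n$ grows, which is the volume lower bound used in the last inequality.

```python
from math import comb, log

def entropy_q(q, x):
    """q-ary entropy function H_q(x)."""
    if x in (0.0, 1.0):
        return 0.0 if x == 0.0 else log(q - 1, q)
    return x * log(q - 1, q) - x * log(x, q) - (1 - x) * log(1 - x, q)

def ball_volume(q, n, r):
    return sum(comb(n, i) * (q - 1) ** i for i in range(r + 1))

# (1/n) * log_q Vol_q(floor(p*n), n) climbs toward H_q(p) as n grows,
# illustrating Vol_q(p*n, n) >= q^(H_q(p)*n - o(n)).
q, p = 2, 0.3
for n in (50, 200, 800):
    print(n, log(ball_volume(q, n, int(p * n)), q) / n, entropy_q(q, p))
```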
^ Each $q$-ary block code of length $n$ is a subset of the strings of $\mathcal{A}_q^n$, where the alphabet set $\mathcal{A}_q$ has $q$ elements.
Bassalygo, L. A. (1965), "New upper bounds for error-correcting codes", Problems of Information Transmission, 1 (1): 32–35
Shannon, Claude E.; Gallager, Robert G.; Berlekamp, Elwyn R. (1967), "Lower bounds to error probability for coding on discrete memoryless channels. Part I.", Information and Control, 10: 65–103, doi:10.1016/s0019-9958(67)90052-6
Shannon, Claude E.; Gallager, Robert G.; Berlekamp, Elwyn R. (1967), "Lower bounds to error probability for coding on discrete memoryless channels. Part II.", Information and Control, 10: 522–552, doi:10.1016/s0019-9958(67)91200-4