Contraharmonic mean

In mathematics, a contraharmonic mean is a function complementary to the harmonic mean. The contraharmonic mean is a special case of the Lehmer mean, L_p, where p = 2.

Definition

The contraharmonic mean of a set of positive numbers x_1, x_2, …, x_n is defined as the arithmetic mean of the squares of the numbers divided by the arithmetic mean of the numbers; since the common factor of 1/n cancels, this is

    C(x_1, x_2, …, x_n) = (x_1² + x_2² + … + x_n²) / (x_1 + x_2 + … + x_n)

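A minimal sketch of this definition in Python (the function name is ad hoc, not from the article):

    def contraharmonic_mean(xs):
        # sum of squares divided by the sum, i.e. the mean of the squares over the mean
        return sum(x * x for x in xs) / sum(xs)

    print(contraharmonic_mean([2, 6]))  # 5.0, compared with an arithmetic mean of 4.0
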
Properties

It is easy to show that this satisfies the characteristic properties of a mean:

    min(x_1, …, x_n) ≤ C(x_1, …, x_n) ≤ max(x_1, …, x_n)

    C(t·x_1, t·x_2, …, t·x_n) = t·C(x_1, x_2, …, x_n) for every t > 0

The first property implies the fixed point property, that for all k > 0,

C(k, k, …, k) = k

The contraharmonic mean is higher in value than the arithmetic mean and also higher than the root mean square:

    min(x) ≤ H(x) ≤ G(x) ≤ L(x) ≤ A(x) ≤ R(x) ≤ C(x) ≤ max(x)

where x is a list of positive values, H is the harmonic mean, G is the geometric mean, L is the logarithmic mean, A is the arithmetic mean, R is the root mean square and C is the contraharmonic mean. Unless all values of x are the same, the ≤ signs above can be replaced by <.

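As an illustrative sketch (not part of the original article), the following Python snippet computes these means for a small list of positive numbers and checks the ordering; the function names are ad hoc, and the logarithmic mean is omitted:

    import math

    def harmonic(xs): return len(xs) / sum(1 / x for x in xs)
    def geometric(xs): return math.prod(xs) ** (1 / len(xs))
    def arithmetic(xs): return sum(xs) / len(xs)
    def rms(xs): return math.sqrt(sum(x * x for x in xs) / len(xs))
    def contraharmonic(xs): return sum(x * x for x in xs) / sum(xs)

    xs = [2.0, 3.0, 10.0]
    chain = [min(xs), harmonic(xs), geometric(xs), arithmetic(xs),
             rms(xs), contraharmonic(xs), max(xs)]
    print(chain)
    assert all(a <= b for a, b in zip(chain, chain[1:]))  # min <= H <= G <= A <= R <= C <= max
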
The name contraharmonic may be due to the fact that when taking the mean of only two variables, the contraharmonic mean is as high above the arithmetic mean as the arithmetic mean is above the harmonic mean (i.e., the arithmetic mean of the two variables is equal to the arithmetic mean of their harmonic and contraharmonic means).

Two-variable formulae

From the formulas for the arithmetic mean and harmonic mean of two variables we have:

    C(a, b) = a + b − 2ab/(a + b) = (a² + b²) / (a + b)

Notice that for two variables the average of the harmonic and contraharmonic means is exactly equal to the arithmetic mean:

A(H(a, b), C(a, b)) = A(a, b)

As a gets closer to 0, H(a, b) also gets closer to 0. The harmonic mean is very sensitive to low values. The contraharmonic mean, on the other hand, is sensitive to larger values, so as a approaches 0, C(a, b) approaches b (and their average remains A(a, b)).

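A quick numerical sketch of this identity and of the limiting behaviour (not from the article; the function names are ad hoc):

    def A(a, b): return (a + b) / 2
    def H(a, b): return 2 * a * b / (a + b)
    def C(a, b): return (a * a + b * b) / (a + b)

    b = 10.0
    for a in (5.0, 1.0, 0.1, 0.001):
        # H(a, b) sinks towards 0 and C(a, b) rises towards b, but their average stays at A(a, b)
        print(a, H(a, b), C(a, b), (H(a, b) + C(a, b)) / 2, A(a, b))
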
There are two other notable relationships between 2-variable means. First, the geometric mean of the arithmetic and harmonic means is equal to the geometric mean of the two values:

    G(A(a, b), H(a, b)) = √(((a + b)/2) · (2ab/(a + b))) = √(ab) = G(a, b)

The second relationship is that the geometric mean of the arithmetic and contraharmonic means is the root mean square:

    G(A(a, b), C(a, b)) = √(((a + b)/2) · ((a² + b²)/(a + b))) = √((a² + b²)/2) = R(a, b)

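A short Python check of both relationships (a sketch, not from the article; the function names are ad hoc):

    import math

    def A(a, b): return (a + b) / 2
    def H(a, b): return 2 * a * b / (a + b)
    def C(a, b): return (a * a + b * b) / (a + b)
    def G(a, b): return math.sqrt(a * b)
    def R(a, b): return math.sqrt((a * a + b * b) / 2)

    a, b = 3.0, 7.0
    assert math.isclose(G(A(a, b), H(a, b)), G(a, b))  # G(A, H) = G
    assert math.isclose(G(A(a, b), C(a, b)), R(a, b))  # G(A, C) = R
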
The contraharmonic mean of two variables can be constructed geometrically using a trapezoid (see [1]).

Additional constructions

The contraharmonic mean can be constructed on a circle in a way similar to the construction of the Pythagorean means of two variables. The contraharmonic mean is the remainder of the diameter on which the harmonic mean lies: because A(H(a, b), C(a, b)) = A(a, b), the harmonic and contraharmonic means together make up the whole diameter a + b, so C(a, b) = a + b − H(a, b).

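For instance, taking a = 3 and b = 6 gives a diameter of a + b = 9, with H(3, 6) = 36/9 = 4 and C(3, 6) = (9 + 36)/9 = 5, so the two segments of the diameter have lengths 4 and 5.
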
Properties

The contraharmonic mean of a random variable is equal to the sum of the arithmetic mean and the variance divided by the arithmetic mean,[1] that is,

    C = (μ² + σ²) / μ = μ + σ²/μ

where μ is the mean and σ² is the variance. Since the variance is always ≥ 0, the contraharmonic mean is always greater than or equal to the arithmetic mean.

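As a numerical sketch (not from the article), for a finite sample the same identity holds when the variance is computed with the divide-by-n (population) convention:

    import statistics

    xs = [2.0, 3.0, 5.0, 10.0]
    m = statistics.fmean(xs)
    v = statistics.pvariance(xs)          # divide-by-n variance
    c = sum(x * x for x in xs) / sum(xs)  # contraharmonic mean of the sample
    print(m, v, c)                        # 5.0 9.5 6.9
    assert abs(c - (m + v / m)) < 1e-12
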
The ratio of the variance to the mean was proposed as a test statistic by Clapham.[2] This statistic is the contraharmonic mean less the mean.

It is also related to Katz's statistic[3]

    J_n = √(n/2) · (s²/m − 1)

where m is the mean, s² the variance and n is the sample size.

J_n is asymptotically normally distributed with a mean of zero and variance of 1.

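Assuming the form of J_n given above, a minimal Python sketch (the sample-variance convention is not specified in the article; the unbiased estimator is used here as an assumption):

    import math
    import statistics

    def katz_jn(xs):
        # J_n = sqrt(n/2) * (s^2/m - 1)
        n = len(xs)
        m = statistics.fmean(xs)
        s2 = statistics.variance(xs)  # unbiased sample variance (divide by n - 1)
        return math.sqrt(n / 2) * (s2 / m - 1)

    print(katz_jn([2, 4, 4, 5, 7, 8]))
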
Uses in statistics

The problem of a size-biased sample was discussed by Cox in 1969 in connection with the sampling of fibres. The expectation of a size-biased sample is equal to the contraharmonic mean of the original distribution.[4]

The probability of a fibre being sampled is proportional to its length. Because of this the usual sample mean (arithmetic mean) is a biased estimator of the true mean. To see this consider

    g(x) = x f(x) / m

where f(x) is the true population distribution, g(x) is the length-weighted distribution and m is the mean of f. Taking the expectation of x under g gives the contraharmonic mean rather than the usual (arithmetic) mean of the population. This problem can be overcome by taking instead the expectation of the harmonic mean (1/x). Under the length-weighted distribution, 1/x has expectation

    E[1/x] = 1/m

and variance

    Var(1/x) = (m · E_f[1/x] − 1) / m²

where E_f[] is the expectation operator with respect to f. Asymptotically the sample estimate of E[1/x] is normally distributed.

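The following simulation sketch (not from the article; it assumes NumPy and an arbitrary gamma population) illustrates the bias and its correction: the mean of a length-biased sample approaches the contraharmonic mean of the population, while the reciprocal of the mean of 1/x recovers the true mean.

    import numpy as np

    rng = np.random.default_rng(0)
    population = rng.gamma(shape=3.0, scale=2.0, size=200_000)  # true mean about 6

    # length-biased sample: selection probability proportional to the value itself
    probs = population / population.sum()
    sample = rng.choice(population, size=50_000, p=probs)

    contraharmonic = (population ** 2).mean() / population.mean()
    print(population.mean())        # about 6.0 (true mean)
    print(sample.mean())            # about 8.0, close to the contraharmonic mean
    print(contraharmonic)           # about 8.0
    print(1 / (1 / sample).mean())  # about 6.0, the harmonic-mean correction
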
The asymptotic efficiency of length-biased sampling compared with random sampling depends on the underlying distribution. If f(x) is log-normal the efficiency is 1, while if the population is gamma distributed with index b, the efficiency is b/(b − 1).

The size-biased distribution has been used in several areas.[5][6]

The contraharmonic mean has also been used in image analysis, for example in contraharmonic mean filtering for image denoising.[7]

History

The contraharmonic mean was discovered by the Greek mathematician Eudoxus in the 4th century BCE.

See also

References

  1. ^ Kingley MSC (1989) The distribution of hauled-out ringed seals: an interpretation of Taylor's law. Oecologia 79: 106–110
  2. ^ Clapham AR (1936) Overdispersion in grassland communities and the use of statistical methods in plant ecology. J Ecol 14: 232
  3. ^ Katz L (1965) Unified treatment of a broad class of discrete probability distributions. In Proceedings of the International Symposium on Discrete Distributions. Montreal
  4. ^ Zelen M (1972) Length-biased sampling and biomedical problems. In Biometric Society Meeting, Dallas, Texas
  5. ^ Keillor BD, D'Amico M & Horton V (2001) Global Consumer Tendencies. Psychology & Marketing 18(1) 1-19
  6. ^ Sudman (1980) Quota sampling techniques and weighting procedures to correct for frequency bias
  7. ^ Pathak M, Singh S (2014) Comparative analysis of image denoising techniques. International Journal of Computer Science & Engineering Technology 5 (2) 160-167

External links

  • Essay #3 - Some "mean" Trapezoids, by Shannon Umberger: [2]
  • Construction of the Contraharmonic Mean in a Trapezoid: [3]
  • Means in the Trapezoid: [4]
  • Means of Complex Numbers: [5]
  • Proofs without Words / Exercises in Visual Thinking, by Roger B. Nelsen, page 56, ISBN 0-88385-700-6
  • Pythagorean Means: [6] (extend the segment that represents the Harmonic mean through the circle's center to the other side, creating a diameter. The length of the diameter segment after the Harmonic segment is the Contraharmonic mean.)
  • Pahikkala, Jussi (2010), On contraharmonic mean and Pythagorean triples, Elemente der Mathematik 65 (2): 62–67.
