Entropy

The information entropy of the von Mises distribution is defined as:[1]

H = -\int_\Gamma f(\theta;\mu,\kappa)\,\ln(f(\theta;\mu,\kappa))\,d\theta

where \Gamma is any interval of length 2\pi. The logarithm of the density of the von Mises distribution is straightforward:

\ln f(\theta;\mu,\kappa) = \kappa\cos(\theta-\mu) - \ln(2\pi I_0(\kappa))

The characteristic function representation for the von Mises distribution is:

f(\theta;\mu,\kappa) = \frac{1}{2\pi}\left(1 + 2\sum_{j=1}^{\infty}\phi_j\cos(j(\theta-\mu))\right)

where \phi_j = I_j(\kappa)/I_0(\kappa). Substituting these expressions into the entropy integral, exchanging the order of integration and summation, and using the orthogonality of the cosines, the entropy may be written:

H = \ln(2\pi I_0(\kappa)) - \kappa\,\frac{I_1(\kappa)}{I_0(\kappa)}
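As a sanity check on the closed-form entropy, it can be compared against direct numerical integration of −∫ f ln f over one period. The following is a sketch using SciPy (not part of the original derivation); the symbols μ and κ follow the text above:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import i0, i1

def vonmises_pdf(theta, mu, kappa):
    """Density of the von Mises distribution on (-pi, pi]."""
    return np.exp(kappa * np.cos(theta - mu)) / (2 * np.pi * i0(kappa))

def entropy_numeric(mu, kappa):
    """Entropy -integral of f*ln(f) over one period, by quadrature."""
    integrand = lambda t: -vonmises_pdf(t, mu, kappa) * np.log(vonmises_pdf(t, mu, kappa))
    return quad(integrand, -np.pi, np.pi)[0]

def entropy_closed(kappa):
    """Closed form: ln(2*pi*I0(kappa)) - kappa*I1(kappa)/I0(kappa)."""
    return np.log(2 * np.pi * i0(kappa)) - kappa * i1(kappa) / i0(kappa)
```

Note that at κ = 0 the closed form reduces to ln(2π), the entropy of the uniform circular distribution, since I₁(0) = 0.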

Table of closed-form stable distribution PDFs
 

where S_{\mu,\nu}(z) is a Lommel function. (Reference: Garoni & Frankel.[2])

 

where S(x) and C(x) are Fresnel integrals. (Reference: Hopcraft et al.[3])

 

where W_{\lambda,\mu}(z) is a Whittaker function. (Reference: Uchaikin & Zolotarev.[4])

Cauchy distribution

f(x; 1, 0, 1, 0) = \frac{1}{\pi(1+x^2)}


  •  

(Reference: Garoni & Frankel [2].)



  •  

(Reference: Garoni & Frankel[2].)
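The Cauchy case is easy to spot-check numerically: the density 1/(π(1+x²)) integrates to one over the real line and agrees with SciPy's reference implementation. A quick check (not part of the original table):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import cauchy

def cauchy_pdf(x):
    """Standard Cauchy density: the alpha = 1, beta = 0 stable law."""
    return 1.0 / (np.pi * (1.0 + x * x))

# Normalization: the density integrates to 1 over the real line.
total, _ = quad(cauchy_pdf, -np.inf, np.inf)

# Agreement with SciPy's implementation on a grid of points.
xs = np.linspace(-5.0, 5.0, 11)
max_err = np.max(np.abs(cauchy_pdf(xs) - cauchy.pdf(xs)))
```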


 

The following are asymmetric distributions (specifically, where β = 1).


  •  

where K_v(x) is a modified Bessel function of the second kind. (Reference: Hopcraft et al.[3])



 



  •  

(Reference: Zolotarev 1961 [5].)


  •  .

(Reference: Zaliapin et al.[6])
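One asymmetric case that is elementary to state is the Lévy distribution (α = 1/2, β = 1), whose density is f(x) = (2π)^{−1/2} x^{−3/2} e^{−1/(2x)} for x > 0. It is not one of the entries above, but it makes a convenient numerical touchstone; a sketch using scipy.stats.levy:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import levy

def levy_pdf(x):
    """Levy density: the alpha = 1/2, beta = 1 stable law, supported on x > 0."""
    return np.exp(-1.0 / (2.0 * x)) / np.sqrt(2.0 * np.pi * x**3)

# Normalization over the positive half-line.
total, _ = quad(levy_pdf, 0.0, np.inf)

# Agreement with SciPy's Levy distribution on a grid of points.
xs = np.linspace(0.1, 10.0, 25)
max_err = np.max(np.abs(levy_pdf(xs) - levy.pdf(xs)))
```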


The symmetric distributions for which α = p/q with p > q can be derived from a result in Garoni & Frankel.[2]

Stable densities for other rational values of α can also be expressed in terms of Meijer G-functions (Zolotarev).

Relationship to Tsallis entropy


While the normal distribution has the maximum entropy for fixed first and second moments of the random variable, the Student's t-distribution has the maximum Tsallis entropy for fixed first and second moments.

The Tsallis entropy of a probability density f(t) is defined as:

H_q[f] = \frac{1}{q-1}\left(1 - \int_W f(t)^q\,dt\right)

where W is the support of f(t). For a normalized density (zeroth moment equal to unity), with fixed values of the first and second moments, using the calculus of variations and the method of Lagrange multipliers, the entropy H_q will be maximized when the Lagrangian equation is satisfied:

\delta\left[H_q[f] + \lambda_0\int_W f(t)\,dt + \lambda_1\int_W t\,f(t)\,dt + \lambda_2\int_W t^2\,f(t)\,dt\right] = 0

where the \lambda_i are the Lagrange multipliers. The variation of the Tsallis entropy is

\delta H_q = -\frac{q}{q-1}\int_W f(t)^{q-1}\,\delta f(t)\,dt

and so the Lagrange equation is satisfied when:

-\frac{q}{q-1}\,f(t)^{q-1} + \lambda_0 + \lambda_1 t + \lambda_2 t^2 = 0

or, solving for f(t):

f(t) = \left[\frac{q-1}{q}\left(\lambda_0 + \lambda_1 t + \lambda_2 t^2\right)\right]^{1/(q-1)}

Solving for the \lambda_i using the three moment constraints (assuming the centered Student's t-distribution, for which the mean of t is zero):

\int_W f(t)\,dt = 1
\int_W t\,f(t)\,dt = 0
\int_W t^2\,f(t)\,dt = \sigma^2

yields the Student's t-distribution as the expression which maximizes the Tsallis entropy.
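The definition can be exercised numerically. One standard property worth checking is that as q → 1 the Tsallis entropy recovers the Shannon differential entropy, e.g. ½ ln(2πe) for the standard normal. A sketch (the quadrature tolerance below is an assumption, not part of the text):

```python
import numpy as np
from scipy.integrate import quad

def tsallis_entropy(f, q, support=(-np.inf, np.inf)):
    """Tsallis entropy H_q[f] = (1 - integral of f(t)^q dt) / (q - 1)."""
    integral, _ = quad(lambda t: f(t) ** q, *support)
    return (1.0 - integral) / (q - 1.0)

# Standard normal density.
normal_pdf = lambda t: np.exp(-t * t / 2.0) / np.sqrt(2.0 * np.pi)

# For q close to 1, H_q approaches the Shannon entropy 0.5 * ln(2*pi*e).
h_near_1 = tsallis_entropy(normal_pdf, 1.0001)
shannon = 0.5 * np.log(2.0 * np.pi * np.e)
```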

  1. ^ Mardia, K. V.; Jupp, P. E. (1999). Directional Statistics. Chichester: Wiley.
  2. ^ a b c Garoni, T. M.; Frankel, N. E. (2002). "Lévy flights: Exact results and asymptotics beyond all orders". Journal of Mathematical Physics. 43 (5): 2670–2689.
  3. ^ a b Hopcraft, K. I.; Jakeman, E.; Tanner, R. M. J. (1999). "Lévy random walks with fluctuating step number and multiscale behavior". Physical Review E. 60 (5): 5327–5343.
  4. ^ Uchaikin, V. V.; Zolotarev, V. M. (1999). Chance and Stability: Stable Distributions and their Applications. Utrecht, Netherlands: VSP.
  5. ^ Zolotarev, V. M. (1961). "Expression of the density of a stable distribution with exponent alpha greater than one by means of a frequency with exponent 1/alpha". Selected Translations in Mathematical Statistics and Probability. 1: 163–167.
  6. ^ Zaliapin, I. V.; Kagan, Y. Y.; Schoenberg, F. P. (2005). "Approximating the Distribution of Pareto Sums". Pure and Applied Geophysics. 162 (6): 1187–1228.