Statistics 351 (Fall 2007)

The Gamma Function

Suppose that $p > 0$, and define
\[
\Gamma(p) := \int_0^\infty u^{p-1} e^{-u} \, du.
\]
We call $\Gamma(p)$ the Gamma function. It appears in many of the density-function formulas for continuous random variables, such as the Gamma distribution, Beta distribution, chi-squared distribution, $t$ distribution, and $F$ distribution.

The first thing that should be checked is that the integral defining $\Gamma(p)$ converges for $p > 0$. For now, we will take it as given that the Gamma function is well-defined. This will allow us to derive some of its important properties and show its utility for statistics.

The Gamma function may be viewed as a generalization of the factorial function, as this first result shows.

Proposition 1. If $p > 0$, then $\Gamma(p+1) = p\,\Gamma(p)$.

Proof. This is proved using integration by parts from first-year calculus. Indeed,
\[
\Gamma(p+1) = \int_0^\infty u^{(p+1)-1} e^{-u} \, du
            = \int_0^\infty u^{p} e^{-u} \, du
            = \Bigl[ -u^{p} e^{-u} \Bigr]_0^\infty + \int_0^\infty p u^{p-1} e^{-u} \, du
            = 0 + p\,\Gamma(p).
\]
To do the integration by parts, let $w = u^p$, $dw = p u^{p-1}\,du$, $dv = e^{-u}\,du$, $v = -e^{-u}$, and recall that $\int w \, dv = wv - \int v \, dw$.
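As a quick numerical sanity check (not part of the original notes), the recursion $\Gamma(p+1) = p\,\Gamma(p)$ and the factorial connection $\Gamma(n+1) = n!$ can be verified with Python's standard-library `math.gamma`; the sample values of `p` below are arbitrary choices for illustration.

```python
import math

# Check the recursion Gamma(p + 1) = p * Gamma(p) at a few arbitrary points.
for p in [0.5, 1.0, 2.5, 7.3]:
    lhs = math.gamma(p + 1)
    rhs = p * math.gamma(p)
    assert math.isclose(lhs, rhs), f"recursion fails at p = {p}"

# Check the factorial connection Gamma(n + 1) = n! for small integers n.
for n in range(1, 8):
    assert math.isclose(math.gamma(n + 1), math.factorial(n)), f"fails at n = {n}"

print("recursion and factorial checks passed")
```

Iterating the recursion from $\Gamma(1) = \int_0^\infty e^{-u}\,du = 1$ gives $\Gamma(n+1) = n!$ for every non-negative integer $n$, which is why the Gamma function is called a generalization of the factorial.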
Instructor: Michael Kozdron