Statistics 351 (Fall 2007)

The Gamma Function

Suppose that $p > 0$, and define
$$\Gamma(p) := \int_0^\infty u^{p-1} e^{-u} \, du.$$
We call $\Gamma(p)$ the Gamma function, and it appears in many of the formulae of density functions for continuous random variables, such as the Gamma distribution, Beta distribution, Chi-squared distribution, $t$ distribution, and $F$ distribution.

The first thing that should be checked is that the integral defining $\Gamma(p)$ is convergent for $p > 0$. For now, we will assume that the Gamma function is well-defined. This will allow us to derive some of its important properties and show its utility for statistics.

The Gamma function may be viewed as a generalization of the factorial function, as this first result shows.

Proposition 1. If $p > 0$, then $\Gamma(p+1) = p\,\Gamma(p)$.

Proof. This is proved using integration by parts from first-year calculus. Indeed,
$$\Gamma(p+1) = \int_0^\infty u^{(p+1)-1} e^{-u} \, du = \int_0^\infty u^p e^{-u} \, du = \Big[ -u^p e^{-u} \Big]_0^\infty + \int_0^\infty p u^{p-1} e^{-u} \, du = 0 + p\,\Gamma(p).$$
To do the integration by parts, let $w = u^p$, $dw = p u^{p-1} \, du$, $dv = e^{-u} \, du$, $v = -e^{-u}$, and recall that $\int w \, dv = wv - \int v \, dw$.
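Both the recurrence in Proposition 1 and the defining integral can be checked numerically. The sketch below uses only Python's standard library: `math.gamma` for $\Gamma$, and a crude midpoint-rule sum (the helper `gamma_integral`, an illustrative name introduced here, with the truncation point and step count chosen ad hoc) to approximate the improper integral.

```python
import math

# Check the recurrence Gamma(p + 1) = p * Gamma(p) for a few values of p,
# using the standard library's math.gamma.
for p in [0.5, 1.0, 2.5, 7.0]:
    assert math.isclose(math.gamma(p + 1), p * math.gamma(p))

# The factorial connection: Gamma(n + 1) = n! for nonnegative integers n.
for n in range(8):
    assert math.gamma(n + 1) == math.factorial(n)

# Crude midpoint-rule approximation of the defining integral
# Gamma(p) = int_0^inf u^(p-1) e^(-u) du, truncated at u = 50
# (the integrand's exponential decay makes the discarded tail negligible).
def gamma_integral(p, upper=50.0, steps=200_000):
    h = upper / steps
    return h * sum(((i + 0.5) * h) ** (p - 1) * math.exp(-(i + 0.5) * h)
                   for i in range(steps))

# For a p > 1 (no singularity at u = 0) the quadrature matches math.gamma.
assert math.isclose(gamma_integral(2.5), math.gamma(2.5), rel_tol=1e-4)
```

Note that for $0 < p < 1$ the integrand blows up at $u = 0$ (the integral still converges), so this simple quadrature is only tested above at a $p > 1$.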