# The Gamma Function (Statistics 351, Fall 2007)

Suppose that $p > 0$, and define

$$\Gamma(p) := \int_0^\infty u^{p-1} e^{-u}\, du.$$

We call $\Gamma(p)$ the Gamma function. It appears in many of the formulae for density functions of continuous random variables, such as the Gamma distribution, Beta distribution, chi-squared distribution, $t$ distribution, and $F$ distribution.

The first thing that should be checked is that the integral defining $\Gamma(p)$ is convergent for $p > 0$. For now, we will assume that the Gamma function is well-defined. This will allow us to derive some of its important properties and show its utility for statistics.

The Gamma function may be viewed as a generalization of the factorial function, as this first result shows.

**Proposition 1.** If $p > 0$, then $\Gamma(p + 1) = p\,\Gamma(p)$.

*Proof.* This is proved using integration by parts from first-year calculus. Indeed,

$$\Gamma(p+1) = \int_0^\infty u^{(p+1)-1} e^{-u}\, du = \int_0^\infty u^{p} e^{-u}\, du = \Big[-u^{p} e^{-u}\Big]_0^\infty + \int_0^\infty p\, u^{p-1} e^{-u}\, du = 0 + p\,\Gamma(p).$$

To do the integration by parts, let $w = u^{p}$, $dw = p\,u^{p-1}\,du$, $dv = e^{-u}\,du$, $v = -e^{-u}$, and recall that $\int w\, dv = wv - \int v\, dw$. $\blacksquare$
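The definition and the recurrence above can be checked numerically. The sketch below (my own illustrative code, not part of the handout; the function name `gamma_quad` and the truncation at $u = 60$ are assumptions) approximates the defining integral with a simple midpoint rule and compares $\Gamma(p+1)$ against $p\,\Gamma(p)$:

```python
import math

def gamma_quad(p, upper=60.0, n=200_000):
    """Approximate Gamma(p) = integral of u^(p-1) e^(-u) on (0, infinity)
    by a midpoint rule on the truncated interval (0, upper).
    The tail beyond `upper` is negligible for moderate p."""
    h = upper / n
    total = 0.0
    for i in range(n):
        u = (i + 0.5) * h  # midpoints avoid evaluating at u = 0
        total += u ** (p - 1) * math.exp(-u) * h
    return total

p = 2.5
lhs = gamma_quad(p + 1)      # Gamma(p + 1)
rhs = p * gamma_quad(p)      # p * Gamma(p), per Proposition 1
print(lhs, rhs, math.gamma(p + 1))  # all three agree to several decimals
```

The standard library's `math.gamma` serves as an independent check on the hand-rolled quadrature; the two sides of Proposition 1 match to the accuracy of the midpoint rule.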