Statistics 351 (Fall 2007)

The Gamma Function

Suppose that $p > 0$, and define
$$\Gamma(p) := \int_0^\infty u^{p-1} e^{-u} \, du.$$
We call $\Gamma(p)$ the Gamma function; it appears in many of the formulae of density functions for continuous random variables, such as the Gamma distribution, Beta distribution, Chi-squared distribution, $t$ distribution, and $F$ distribution.

The first thing that should be checked is that the integral defining $\Gamma(p)$ is convergent for $p > 0$. For now, we will assume that the Gamma function is well-defined. This will allow us to derive some of its important properties and show its utility for statistics.

The Gamma function may be viewed as a generalization of the factorial function, as this first result shows.

Proposition 1. If $p > 0$, then $\Gamma(p+1) = p\,\Gamma(p)$.

Proof. This is proved using integration by parts from first-year calculus. Indeed,
$$\Gamma(p+1) = \int_0^\infty u^{(p+1)-1} e^{-u} \, du = \int_0^\infty u^{p} e^{-u} \, du = \Big[ -u^{p} e^{-u} \Big]_0^\infty + \int_0^\infty p u^{p-1} e^{-u} \, du = 0 + p\,\Gamma(p).$$
To do the integration by parts, let $w = u^{p}$, $dw = p u^{p-1}\,du$, $dv = e^{-u}\,du$, $v = -e^{-u}$, and recall that $\int w \, dv = wv - \int v \, dw$.
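For example, since $\Gamma(1) = \int_0^\infty e^{-u}\,du = 1$, repeated application of Proposition 1 gives $\Gamma(n+1) = n!$ for every nonnegative integer $n$, which is the precise sense in which the Gamma function generalizes the factorial.

The recursion is also easy to check numerically. The following is a minimal Python sketch, assuming SciPy is available for the quadrature; the helper name gamma_by_integral is illustrative. It approximates $\Gamma(p)$ directly from the defining integral and compares $\Gamma(p+1)$ against $p\,\Gamma(p)$ and against Python's built-in math.gamma.

    import math
    from scipy.integrate import quad

    def gamma_by_integral(p):
        """Approximate Gamma(p) = integral from 0 to infinity of u^(p-1) * exp(-u) du."""
        value, _error = quad(lambda u: u ** (p - 1) * math.exp(-u), 0, math.inf)
        return value

    for p in (1.5, 2.5, 4.0):
        lhs = gamma_by_integral(p + 1)   # Gamma(p + 1) from the defining integral
        rhs = p * gamma_by_integral(p)   # p * Gamma(p), the right-hand side of Proposition 1
        print(f"p = {p}: Gamma(p+1) = {lhs:.6f}, "
              f"p*Gamma(p) = {rhs:.6f}, "
              f"math.gamma(p+1) = {math.gamma(p + 1):.6f}")

For each $p$ the three printed values agree to the reported precision, as Proposition 1 predicts.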