Introduction to ACL1
OBJECTIVE
The objective of this introduction is for students to gain access to the ACL software and perform some
basic functions with it. This assignment is written as a
3.3 Continuous Distribution
3.3.1 Uniform Distribution
The continuous uniform distribution is defined by spreading mass uniformly over an interval [a, b]. Its pdf is given by
\[
f(x \mid a, b) =
\begin{cases}
\dfrac{1}{b - a} & \text{if } x \in [a, b], \\[4pt]
0 & \text{otherwise.}
\end{cases}
\]
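As a quick illustration, the pdf above can be coded directly; the helper name `uniform_pdf` is our own, not from the notes.

```python
def uniform_pdf(x, a, b):
    """Density of the continuous Uniform(a, b) distribution at x."""
    if a <= x <= b:
        return 1.0 / (b - a)   # constant mass 1/(b - a) on [a, b]
    return 0.0                 # no mass outside the interval

print(uniform_pdf(0.5, 0.0, 2.0))  # 0.5, i.e. 1/(b - a)
print(uniform_pdf(3.0, 0.0, 2.0))  # 0.0, outside [a, b]
```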
3.3.3 Normal Distribution
The normal distribution has several advantages over the other distributions.
a. The normal distribution and the distributions associated with it are very tractable analytically
3.3.4 Beta Distribution
The beta(α, β) pdf is
\[
f(x \mid \alpha, \beta) = \frac{1}{B(\alpha, \beta)}\, x^{\alpha - 1} (1 - x)^{\beta - 1}, \qquad 0 < x < 1, \quad \alpha > 0, \quad \beta > 0,
\]
where B(α, β) denotes the beta function,
\[
B(\alpha, \beta) = \int_0^1 x^{\alpha - 1} (1 - x)^{\beta - 1}\, dx = \frac{\Gamma(\alpha)\, \Gamma(\beta)}{\Gamma(\alpha + \beta)}.
\]
For n > −α, we have
\[
E X^n = \frac{B(\alpha + n, \beta)}{B(\alpha, \beta)} = \frac{\Gamma(\alpha + n)\, \Gamma(\alpha + \beta)}{\Gamma(\alpha + \beta + n)\, \Gamma(\alpha)}.
\]
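A minimal sketch of the beta pdf, built from the gamma-function identity for B(α, β); the function names are ours, and the printed checks use the facts that beta(1, 1) reduces to Uniform(0, 1) and that B(2, 3) = Γ(2)Γ(3)/Γ(5) = 2/24.

```python
import math

def beta_fn(a, b):
    """Beta function via B(a, b) = Γ(a)Γ(b) / Γ(a + b)."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

def beta_pdf(x, a, b):
    """beta(a, b) density x^(a-1) (1 - x)^(b-1) / B(a, b) on 0 < x < 1."""
    if not 0.0 < x < 1.0:
        return 0.0
    return x ** (a - 1) * (1.0 - x) ** (b - 1) / beta_fn(a, b)

print(beta_pdf(0.3, 1.0, 1.0))  # 1.0: beta(1, 1) is the Uniform(0, 1) density
print(beta_fn(2.0, 3.0))        # 1/12 = 0.08333...
```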
3.4 Exponential Families
A family of pdfs or pmfs is called an exponential family if it can be expressed as
\[
f(x \mid \theta) = h(x)\, c(\theta) \exp\left( \sum_{i=1}^{k} w_i(\theta)\, t_i(x) \right). \qquad (1)
\]
Here h(x) ≥ 0 and t1(x), . . . , tk(x) are real-valued functions of the observation x (they cannot depend on θ)
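To make form (1) concrete, here is a sketch of the Bernoulli(p) pmf rewritten as an exponential family with h(x) = 1, c(p) = 1 − p, w1(p) = log(p/(1 − p)), and t1(x) = x; the function names are our own illustration.

```python
import math

def bernoulli_pmf(x, p):
    """Direct Bernoulli(p) pmf: p^x (1 - p)^(1 - x) for x in {0, 1}."""
    return p ** x * (1.0 - p) ** (1 - x)

def bernoulli_expfam(x, p):
    """The same pmf in the exponential-family form (1):
    h(x) = 1, c(p) = 1 - p, w1(p) = log(p / (1 - p)), t1(x) = x."""
    h, c = 1.0, 1.0 - p
    w1, t1 = math.log(p / (1.0 - p)), x
    return h * c * math.exp(w1 * t1)

for x in (0, 1):
    print(bernoulli_pmf(x, 0.3), bernoulli_expfam(x, 0.3))  # each pair agrees
```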
3.6 Inequalities and Identities
Theorem 3.6.1 (Chebychev's Inequality) Let X be a random variable and let g(x) be a nonnegative function. Then, for any r > 0,
\[
P(g(X) \ge r) \le \frac{E\, g(X)}{r}.
\]
Proof:
\[
E\, g(X) = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx \ge \int_{\{x :\, g(x) \ge r\}} g(x)\, f_X(x)\, dx \ge r \int_{\{x :\, g(x) \ge r\}} f_X(x)\, dx = r\, P(g(X) \ge r),
\]
and dividing both sides by r gives the result.
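The inequality can be checked empirically; in this sketch the choices g(x) = x² and X standard normal are illustrative only, not from the notes.

```python
import random

# Empirical check of P(g(X) >= r) <= E g(X) / r with g(x) = x^2
# and X a standard normal draw.
random.seed(0)
n, r = 100_000, 4.0
xs = [random.gauss(0.0, 1.0) for _ in range(n)]

prob = sum(1 for x in xs if x * x >= r) / n   # estimate of P(g(X) >= r)
bound = sum(x * x for x in xs) / (n * r)      # estimate of E g(X) / r
print(prob <= bound)  # True: the bound holds (here prob is ~0.05, bound ~0.25)
```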
3 Bivariate Transformations
Let (X, Y) be a bivariate random vector with a known probability distribution. Let U = g1(X, Y)
and V = g2(X, Y), where g1(x, y) and g2(x, y) are some specified functions.
4 Hierarchical Models and Mixture Distributions
Example 4.1 (Binomial-Poisson hierarchy) Perhaps the most classic hierarchical model is the following. An insect lays a large number of eggs, each surviving with probability p
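A simulation sketch of this hierarchy, under the illustrative assumption that the egg count is Poisson(λ = 10) with survival probability p = 0.3: draw N ~ Poisson(λ), then the survivors X | N ~ binomial(N, p), so E X = λp. The `poisson_draw` helper (Knuth's multiplication method) is our own.

```python
import math
import random

def poisson_draw(lam, rng):
    """One Poisson(lam) draw via Knuth's method (fine for small lam)."""
    limit, k, prod = math.exp(-lam), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

rng = random.Random(0)
lam, p, n = 10.0, 0.3, 50_000

totals = 0
for _ in range(n):
    eggs = poisson_draw(lam, rng)                         # N ~ Poisson(lam)
    totals += sum(rng.random() < p for _ in range(eggs))  # X | N ~ binomial(N, p)

mean = totals / n
print(mean)  # close to E X = lam * p = 3.0
```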
4. Multiple Random Variables
4.1 Joint and Marginal Distributions
Definition 4.1.1 An n-dimensional random vector is a function from a sample space S into Rn, n-dimensional Euclidean space. Suppose, fo
4.2 Conditional Distributions and Independence
Definition 4.2.1 Let (X, Y) be a discrete bivariate random vector with joint pmf f(x, y) and marginal pmfs fX(x) and fY(y). For any x such that P(X = x) = fX(x) > 0, the conditional pmf of Y given that X = x is the function of y defined by f(y | x) = P(Y = y | X = x) = f(x, y)/fX(x).
4.5 Covariance and Correlation
In earlier sections, we have discussed the absence or presence of a relationship between two random variables: independence or nonindependence. But if there is a relationship, the relationship may be strong or weak.
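The strength of a linear relationship is what the sample covariance and correlation measure; here is a small sketch with our own helper and a toy data set chosen so that y = 2x exactly, which forces the correlation to 1.

```python
import math

def cov_corr(xs, ys):
    """Sample covariance and sample correlation of paired observations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / (n - 1))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / (n - 1))
    return cov, cov / (sx * sy)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # exact linear relationship y = 2x
cov, corr = cov_corr(xs, ys)
print(cov, corr)            # correlation is exactly 1 for a linear relation
```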
Multivariate Distribution
The random vector X = (X1, . . . , Xn) has a sample space that is a subset of Rn. If X is a discrete random vector, then the joint pmf of X is the function defined by f(x) = f(x1, . . . , xn) = P(X1 = x1, . . . , Xn = xn).
5.1 Basic Concepts of Random Samples
Definition 5.1.1
The random variables X1, . . . , Xn are called a random sample of size n from the population
f(x) if X1, . . . , Xn are mutually independent random variables and the marginal pdf or pmf of each Xi is the same function f(x).
5.3.1 Properties of the sample mean and variance
Lemma 5.3.2 (Facts about chi-squared random variables) We use the notation χ²_p to denote a chi-squared random variable with p degrees of freedom.
(a) If Z is a N(0, 1) random variable, then Z² ∼ χ²₁; that is, the square of a standard normal random variable is a chi-squared random variable.
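Two standard facts, that a χ²_p variable has mean p and variance 2p, can be checked by simulation using part (a): a χ²_p draw is a sum of p squared independent standard normals. The degrees of freedom p = 5 and the seed are illustrative choices.

```python
import random

random.seed(1)
p, n = 5, 50_000

# A chi-squared(p) draw as the sum of p squared independent standard normals.
draws = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(p)) for _ in range(n)]

mean = sum(draws) / n
var = sum((d - mean) ** 2 for d in draws) / (n - 1)
print(mean, var)  # near the theoretical values p = 5 and 2p = 10
```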
5.3.2 The Derived Distributions: Student's t and Snedecor's F
Definition Let X1, . . . , Xn be a random sample from a N(μ, σ²) distribution. The quantity (X̄ − μ)/(S/√n) has Student's t distribution with n − 1 degrees of freedom.
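A minimal sketch of computing the quantity in the definition for a small sample; the helper name and the toy data are our own (for these five values, X̄ = 5.1 and the statistic works out to √2 against μ = 5.0).

```python
import math

def t_statistic(sample, mu):
    """Student's t statistic (xbar - mu) / (s / sqrt(n)) for a sample."""
    n = len(sample)
    xbar = sum(sample) / n
    s2 = sum((x - xbar) ** 2 for x in sample) / (n - 1)  # sample variance
    return (xbar - mu) / math.sqrt(s2 / n)

t = t_statistic([5.2, 5.1, 5.3, 5.0, 4.9], 5.0)
print(t)  # sqrt(2), about 1.4142
```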
Sums of Random Variables from a Random Sample
Denition 5.2.1
Let X1, . . . , Xn be a random sample of size n from a population and let T(x1, . . . , xn) be a
real-valued or vector-valued function
Order Statistics
Definition The order statistics of a random sample X1, . . . , Xn are the sample values placed in ascending order. They are denoted by X(1), . . . , X(n). The order statistics are random variables that satisfy X(1) ≤ X(2) ≤ · · · ≤ X(n).
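Computationally, the order statistics of an observed sample are just the sorted sample values, so X(1) is the minimum and X(n) the maximum; the uniform sample below is an illustrative choice.

```python
import random

random.seed(2)
sample = [round(random.uniform(0.0, 10.0), 2) for _ in range(5)]

# The order statistics X_(1) <= ... <= X_(n) are the sorted sample values.
order_stats = sorted(sample)
print("sample:     ", sample)
print("order stats:", order_stats)  # X_(1) = min, X_(n) = max
```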
5.5 Convergence Concepts
This section treats the somewhat fanciful idea of allowing the sample size to approach infinity
and investigates the behavior of certain sample quantities as this happens. We ar
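One such sample quantity is the sample mean; this sketch watches it settle near the population mean as n grows, with an iid Bernoulli(1/2) population and seed as illustrative choices.

```python
import random

random.seed(3)
flips = [random.random() < 0.5 for _ in range(100_000)]  # iid Bernoulli(1/2)

for n in (100, 10_000, 100_000):
    print(n, sum(flips[:n]) / n)  # sample mean drifts toward 0.5 as n grows
```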
3.2.3 Binomial Distribution
The binomial distribution is based on the idea of a Bernoulli trial. A Bernoulli trial is
an experiment with two, and only two, possible outcomes. A random variable X has a
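A sketch of the binomial pmf built from counting Bernoulli successes; the helper name is ours. Note the single Bernoulli trial is just the n = 1 case, and the pmf sums to 1 over x = 0, . . . , n.

```python
import math

def binomial_pmf(x, n, p):
    """P(X = x) for X ~ binomial(n, p): C(n, x) p^x (1 - p)^(n - x)."""
    return math.comb(n, x) * p ** x * (1.0 - p) ** (n - x)

print(binomial_pmf(1, 1, 0.3))                           # 0.3: a Bernoulli trial
print(sum(binomial_pmf(x, 10, 0.3) for x in range(11)))  # 1.0 up to rounding
```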
3.2.4 Poisson Distribution
Definition Let X be the number of events per basic unit: for example, the number of raindrops in one minute, the number of cars passing by you in an hour, or the number of chocolate particles
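A sketch of the Poisson pmf P(X = x) = e^(−λ) λ^x / x!; the helper name and the rate λ = 3 (say, three raindrops per minute on average) are our own illustration.

```python
import math

def poisson_pmf(x, lam):
    """P(X = x) for X ~ Poisson(lam): e^(-lam) lam^x / x!."""
    return math.exp(-lam) * lam ** x / math.factorial(x)

# Chance of seeing exactly 3 events in a unit when the rate is 3 per unit:
print(poisson_pmf(3, 3.0))                           # about 0.224
print(sum(poisson_pmf(x, 3.0) for x in range(50)))   # total mass is 1
```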
Advanced Statistical Inference I
Homework 4: Multiple Random Variables
Due Date: November 30th
1. (Basic Calculation)
(a) Exercise 4.4.
(b) Exercise 4.5.
(c) Exercise 4.7.
2. (Check on Independence an
Advanced Statistical Inference I
Homework 3: Common Families of Distributions
Due Date: November 2nd
1. (Engineering Applications)
(a) Exercise 3.2(c).
(b) Exercise 3.3.
2. (Business and Law Applications)
Advanced Statistical Inference I
Homework 1: Probability Theory
Due Date: October 7th
1. (Detect mixture distribution) Exercise 1.6.
2. (Countable additivity and Kolmogorov's Axiom) Exercise 1.12 and E
Lecture 1: Set Theory
1 Set Theory
One of the main objectives of a statistician is to draw conclusions about a population of objects by
conducting an experiment. The first step in this endeavor is to identify
1.2.3 Counting and Equally Likely Outcomes
Methods of counting are often used in order to construct probability assignments on finite
sample spaces, although they can be used to answer other questions a
1.3 Conditional Probability and Independence
All of the probabilities that we have dealt with thus far have been unconditional probabilities.
A sample space was defined and all probabilities were calculated with respect to that sample space.
1.6. Density and Mass Functions
Definition 1.6.1 (Probability Mass Function) The probability mass function (pmf) of a discrete random variable X is given by fX(x) = P(X = x) for all x.
Example 1.6.2 (
Lecture 2 : Basics of Probability Theory
When an experiment is performed, the realization of the experiment is an outcome in the sample
space. If the experiment is performed a number of times, different outcomes may occur.