
1 Stationary distributions and the limit theorem

Definition 1.1. The vector $\pi$ is called a stationary distribution of the chain if $\pi$ has entries $(\pi_j : j \in S)$ such that:
(a) $\pi_j \ge 0$ for all $j$, and $\sum_j \pi_j = 1$,
(b) $\pi = \pi P$, which is to say that $\pi_j = \sum_i \pi_i p_{ij}$ for all $j$ (the balance equations).
Note: $\pi = \pi P^n$ for all $n \ge 0$. If $X_0$ has distribution $\pi$ then $X_n$ has distribution $\pi$ for all $n$.
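As a quick numerical companion to the definition, the sketch below (the 3-state matrix $P$ is a made-up illustration, not taken from these notes) finds an approximate stationary distribution by repeatedly applying the balance equations, using the note that $\pi = \pi P^n$ for all $n$:

```python
# A minimal sketch: for an irreducible aperiodic chain, repeatedly
# multiplying any starting distribution by P converges to the
# stationary distribution pi. The matrix P here is an arbitrary example.

P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]

def step(mu, P):
    """One application of the balance equations: (mu P)_j = sum_i mu_i p_ij."""
    n = len(P)
    return [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

mu = [1.0, 0.0, 0.0]          # start in state 0
for _ in range(200):
    mu = step(mu, P)

# mu is now (numerically) invariant under P and sums to 1.
print(mu)
print(step(mu, P))
```

The same vector comes out regardless of the starting distribution, which previews Theorem 1.7 below.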
Proposition 1.2. An irreducible chain has a stationary distribution $\pi$ if and only if all the states are non-null persistent; in this case, $\pi$ is the unique stationary distribution and is given by $\pi_i = 1/\mu_i$ for each $i \in S$, where $\mu_i$ is the mean recurrence time of $i$. We will carry out the proof of this in several steps.
Fix a state $k$ and let $\rho_i(k)$ be the mean number of visits of the chain to the state $i$ between two successive visits to state $k$; that is, $\rho_i(k) = E(N_i \mid X_0 = k)$ where
$$N_i = \sum_{n=1}^{\infty} 1\{X_n = i,\ T_k \ge n\}$$
and $T_k$ is the time of the first return to state $k$. We write $\rho(k)$ for the vector $(\rho_i(k) : i \in S)$. Clearly $T_k = \sum_{i \in S} N_i$, and hence
$$\mu_k = \sum_{i \in S} \rho_i(k).$$
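The identity $\mu_k = \sum_i \rho_i(k)$ can be checked by simulation. In the sketch below the 3-state birth-death matrix is a made-up example whose stationary distribution is $(1/4, 1/2, 1/4)$ by detailed balance, so $\mu_0 = 4$; counting visits over excursions from state 0 recovers both quantities:

```python
import random

# Estimate rho_i(k) by averaging visit counts over excursions from k back
# to k. The chain below is an arbitrary 3-state example with mu_0 = 4.
random.seed(0)

P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]

def excursion(k):
    """Run from X_0 = k until the first return time T_k; return
    (T_k, visit counts N_i for 1 <= n <= T_k)."""
    counts = [0, 0, 0]
    state, t = k, 0
    while True:
        state = random.choices(range(3), weights=P[state])[0]
        t += 1
        counts[state] += 1
        if state == k:
            return t, counts

trials = 20000
total_T, total_N = 0, [0, 0, 0]
for _ in range(trials):
    T, N = excursion(0)
    total_T += T
    for i in range(3):
        total_N[i] += N[i]

mu_k = total_T / trials                 # mean recurrence time of k = 0
rho = [c / trials for c in total_N]     # estimates of rho_i(k)
print(mu_k, sum(rho))                   # equal: each step visits one state
print([r / mu_k for r in rho])          # approximates the stationary vector
```

Note that $\sum_i N_i = T_k$ holds path by path, so the first identity is exact even in simulation; dividing $\rho(k)$ by $\mu_k$ previews the construction of $\pi$ used below.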
Lemma 1.3. For any state $k$ of an irreducible persistent chain, the vector $\rho(k)$ satisfies $\rho_i(k) < \infty$ for all $i$, and furthermore $\rho(k) = \rho(k)P$.
Proof. We show first that $\rho_i(k) < \infty$ when $i \ne k$. Observe that $\rho_k(k) = 1$. We write $l_{ki}(n) = P(X_n = i, T_k \ge n \mid X_0 = k)$. Clearly $f_{kk}(m+n) \ge l_{ki}(m) f_{ik}(n)$. By irreducibility of the chain, there exists $n$ such that $f_{ik}(n) > 0$. So
$$\rho_i(k) = \sum_{m=1}^{\infty} l_{ki}(m) \le \frac{1}{f_{ik}(n)} \sum_{m=1}^{\infty} f_{kk}(m+n) \le \frac{1}{f_{ik}(n)} < \infty$$
as required. Next observe that $l_{ki}(1) = p_{ki}$, and for $n \ge 2$,
$$l_{ki}(n) = \sum_{j : j \ne k} P(X_n = i, X_{n-1} = j, T_k \ge n \mid X_0 = k) = \sum_{j : j \ne k} l_{kj}(n-1)\, p_{ji}.$$
Summing over $n \ge 2$, we obtain
$$\rho_i(k) = p_{ki} + \sum_{j : j \ne k} \sum_{n \ge 2} l_{kj}(n-1)\, p_{ji} = \rho_k(k)\, p_{ki} + \sum_{j : j \ne k} \rho_j(k)\, p_{ji} = \sum_j \rho_j(k)\, p_{ji},$$
since $\rho_k(k) = 1$. Hence $\rho(k) = \rho(k)P$.
For any irreducible persistent chain, the vector $\rho(k)$ satisfies $\rho(k) = \rho(k)P$, and furthermore the components of $\rho(k)$ are non-negative with sum $\mu_k$. Hence, if $\mu_k < \infty$, the vector $\pi$ with entries $\pi_i = \rho_i(k)/\mu_k$ satisfies $\pi = \pi P$ and furthermore has non-negative entries which sum to 1; that is to say, $\pi$ is a stationary distribution. We have proved that every non-null persistent irreducible chain has a stationary distribution.
Proposition 1.4. If the chain is irreducible and persistent, there exists a positive root $x$ of the equation $x = xP$, which is unique up to a multiplicative constant. The chain is non-null if $\sum_i x_i < \infty$ and null if $\sum_i x_i = \infty$. We've seen all of this except for the uniqueness claim, which we won't establish although it isn't difficult.
Proof of Proposition 1.2. Suppose that $\pi$ is a stationary distribution of the chain. If all states are transient then $p_{ij}(n) \to 0$ as $n \to \infty$, for all $i$ and $j$. So
$$\pi_j = \sum_i \pi_i p_{ij}(n) \to 0 \quad \text{as } n \to \infty, \text{ for all } j$$
(the interchange of limit and sum is justified by bounded convergence), which contradicts $\sum_j \pi_j = 1$.
We show next that the existence of $\pi$ implies that all states are non-null and that $\pi_i = 1/\mu_i$ for each $i$. Suppose that $X_0$ has distribution $\pi$, so that $P(X_0 = i) = \pi_i$ for each $i$. Then
$$\mu_j \pi_j = \sum_{n=1}^{\infty} P(T_j \ge n \mid X_0 = j)\, P(X_0 = j) = \sum_{n=1}^{\infty} P(T_j \ge n, X_0 = j).$$
However, $P(T_j \ge 1, X_0 = j) = P(X_0 = j)$, and for $n \ge 2$,
$$\begin{aligned}
P(T_j \ge n, X_0 = j) &= P(X_0 = j,\ X_m \ne j \text{ for } 1 \le m \le n-1) \\
&= P(X_m \ne j \text{ for } 1 \le m \le n-1) - P(X_m \ne j \text{ for } 0 \le m \le n-1) \\
&= P(X_m \ne j \text{ for } 0 \le m \le n-2) - P(X_m \ne j \text{ for } 0 \le m \le n-1) \quad \text{by homogeneity} \\
&= a_{n-2} - a_{n-1},
\end{aligned}$$
where $a_n = P(X_m \ne j \text{ for } 0 \le m \le n)$.
Sum over $n$ to obtain
$$\mu_j \pi_j = P(X_0 = j) + P(X_0 \ne j) - \lim_{n \to \infty} a_n = 1 - \lim_{n \to \infty} a_n.$$
However, $a_n \to P(X_m \ne j \text{ for all } m) = 0$ as $n \to \infty$, by the persistence of $j$. We have shown that $\mu_j \pi_j = 1$, so that $\mu_j = 1/\pi_j < \infty$ if $\pi_j > 0$. To see that $\pi_j > 0$ for all $j$, suppose on the contrary that $\pi_j = 0$ for some $j$.
Then
$$0 = \pi_j = \sum_i \pi_i p_{ij}(n) \ge \pi_i p_{ij}(n) \quad \text{for all } i \text{ and } n,$$
yielding that $\pi_i = 0$ whenever $i \to j$. The chain is assumed irreducible, so that $\pi_i = 0$ for all $i$, in contradiction of the fact that the $\pi_i$'s sum to 1. Hence $\mu_j < \infty$ and all states of the chain are non-null. Furthermore, we see that the $\pi_j$ are specified uniquely as $\pi_j = 1/\mu_j$.
Thus, if $\pi$ exists then it is unique and all the states of the chain are non-null persistent. Conversely, if the states of the chain are non-null persistent then the chain has a stationary distribution given by Lemma 1.3.
Proposition 1.5. If $i \leftrightarrow j$ then $i$ is null persistent if and only if $j$ is null persistent.
Proof. Let $C(i)$ be the irreducible closed equivalence class of states which contains the non-null persistent state $i$. Suppose that $X_0 \in C(i)$. Then $X_n \in C(i)$ for all $n$, and Lemma 1.3 and Proposition 1.2 combine to tell us that all the states in $C(i)$ are non-null.
Proposition 1.6. Let $s \in S$ be any state of an irreducible chain. The chain is transient if and only if there exists a non-zero solution $\{y_i : i \ne s\}$, satisfying $|y_i| \le 1$ for all $i$, to the equations
$$y_i = \sum_{j : j \ne s} p_{ij}\, y_j, \quad i \ne s.$$
Example [random walk with retaining barrier]. A particle performs a random walk on the non-negative integers with a retaining barrier at 0. The transition probabilities are
$$p_{0,0} = q, \qquad p_{i,i+1} = p \ \text{ for } i \ge 0, \qquad p_{i,i-1} = q \ \text{ for } i \ge 1,$$
where $p + q = 1$. Let $\rho = p/q$.
(a) If $q < p$, take $s = 0$ to see that $y_i = 1 - \rho^{-i}$ satisfies the equation in Proposition 1.6, and so the chain is transient.
(b) Solve the equation $\pi = \pi P$ to find that there exists a stationary distribution, with $\pi_j = \rho^j (1 - \rho)$, if and only if $q > p$. Thus the chain is non-null persistent if and only if $q > p$.
(c) If $q = p = \frac{1}{2}$ the chain is persistent, since the symmetric random walk is persistent (just reflect negative excursions of a symmetric random walk into the positive half-line). Solve the equation $x = xP$ to find that $x_i = 1$ for all $i$ provides a root, unique up to a multiplicative constant. However, $\sum_i x_i = \infty$, so that the chain is null by Proposition 1.4.
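Part (b) can be checked mechanically. The sketch below uses the arbitrary values $p = 0.3$, $q = 0.7$ and verifies that the geometric vector $\pi_j = (1-\rho)\rho^j$ satisfies the balance equations of the retaining-barrier walk, both at the barrier and in the interior:

```python
# Check of part (b) with illustrative values p = 0.3, q = 0.7 (so q > p).
p, q = 0.3, 0.7
rho = p / q

# The claimed stationary distribution, truncated to 50 states.
pi = [(1 - rho) * rho**j for j in range(50)]

# Balance at the barrier: state 0 is entered from 0 (stay) or from 1.
assert abs(pi[0] - (pi[0] * q + pi[1] * q)) < 1e-12
# Balance in the interior: state j is entered from j-1 (up) or j+1 (down).
for j in range(1, 49):
    assert abs(pi[j] - (pi[j - 1] * p + pi[j + 1] * q)) < 1e-12

print("balance equations hold; truncated mass =", sum(pi))
```

The truncated mass is $1 - \rho^{50}$, essentially 1, consistent with $\pi$ being a genuine distribution when $q > p$.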
Theorem 1.7. For an irreducible aperiodic chain, we have that $p_{ij}(n) \to 1/\mu_j$ as $n \to \infty$, for all $i$ and $j$.
Proof. If the chain is transient then the result is trivial. Suppose that $X$ is irreducible, aperiodic and non-null persistent. Construct a 'coupled chain' $Z = (X, Y)$, being an ordered pair $X = \{X_n : n \ge 0\}$, $Y = \{Y_n : n \ge 0\}$ of independent Markov chains, each having state space $S$ and transition matrix $P$. Then $Z = \{Z_n = (X_n, Y_n) : n \ge 0\}$ takes values in $S \times S$, and it is easy to check that $Z$ is a Markov chain with transition probabilities
$$p_{ij,kl} = P(Z_{n+1} = (k,l) \mid Z_n = (i,j)) = P(X_{n+1} = k \mid X_n = i)\, P(Y_{n+1} = l \mid Y_n = j) = p_{ik}\, p_{jl}$$
by independence.
Since $X$ is irreducible and aperiodic, $Z$ is also irreducible (this is where aperiodicity is needed). Since $X$ is non-null persistent it has a unique stationary distribution $\pi$, and it is easy to see that $Z$ has a stationary distribution $\nu = (\nu_{ij} : i, j \in S)$ given by $\nu_{ij} = \pi_i \pi_j$; thus $Z$ is also non-null persistent.
Now suppose that $X_0 = i$ and $Y_0 = j$, so that $Z_0 = (i,j)$. Choose any state $s \in S$ and let $T = \min\{n \ge 1 : Z_n = (s,s)\}$ denote the time of the first passage of $Z$ to $(s,s)$. Then $P(T < \infty) = 1$. Starting from $Z_0 = (X_0, Y_0) = (i,j)$,
$$\begin{aligned}
p_{ik}(n) = P(X_n = k) &= P(X_n = k, T \le n) + P(X_n = k, T > n) \\
&= P(Y_n = k, T \le n) + P(X_n = k, T > n) \\
&\qquad \text{because, given } T \le n,\ X_n \text{ and } Y_n \text{ are identically distributed} \\
&\le P(Y_n = k) + P(T > n) = p_{jk}(n) + P(T > n).
\end{aligned}$$
This, and the related inequality with $i$ and $j$ interchanged, yields
$$|p_{ik}(n) - p_{jk}(n)| \le P(T > n) \to 0 \quad \text{as } n \to \infty;$$
therefore $p_{ik}(n) - p_{jk}(n) \to 0$ as $n \to \infty$ for all $i$, $j$ and $k$. Thus, if $\lim_{n\to\infty} p_{ik}(n)$ exists, then it does not depend on $i$. To show that it exists, write
$$\pi_k - p_{jk}(n) = \sum_i \pi_i \left( p_{ik}(n) - p_{jk}(n) \right) \to 0 \quad \text{as } n \to \infty.$$
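The conclusion of Theorem 1.7 can be seen directly from matrix powers: the rows of $P^n$ all converge to the same vector, so $p_{ij}(n) \to 1/\mu_j$ independently of the starting state $i$. The 3-state matrix below is a made-up example:

```python
# Illustration of Theorem 1.7 on an arbitrary irreducible aperiodic
# 3-state chain: every row of P^n converges to the same limit vector pi.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

Pn = P
for _ in range(100):
    Pn = matmul(Pn, P)

for row in Pn:
    print(row)      # the three rows agree to many decimal places
```

Each printed row is the stationary distribution $\pi$; the $j$th entry is $1/\mu_j$.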
Example [the age of a renewal process]. Initially an item is put into use, and when it fails it is replaced at the beginning of the next time period by a new item. Suppose that the lives of the items are independent and each will fail in its $i$th period of use with probability $P_i$, $i \ge 1$, where the distribution $\{P_i\}$ is aperiodic and $\sum_i i P_i < \infty$. Let $X_n$ denote the age of the item in use at time $n$; that is, the number of periods (including the $n$th) it has been in use.
Then if we let
$$\lambda(i) = \frac{P_i}{\sum_{j \ge i} P_j}$$
denote the probability that an $i$ unit old item fails, then $\{X_n, n \ge 0\}$ is a Markov chain with transition probabilities given by
$$P_{i,1} = \lambda(i) = 1 - P_{i,i+1}, \quad i \ge 1.$$
Hence the limiting probabilities are such that
$$\pi_1 = \sum_i \pi_i \lambda(i), \qquad \pi_{i+1} = \pi_i (1 - \lambda(i)), \quad i \ge 1.$$
Iterating yields
$$\begin{aligned}
\pi_{i+1} &= \pi_i (1 - \lambda(i)) = \pi_{i-1} (1 - \lambda(i))(1 - \lambda(i-1)) \\
&= \pi_1 (1 - \lambda(1))(1 - \lambda(2)) \cdots (1 - \lambda(i)) \\
&= \pi_1 \sum_{j \ge i+1} P_j = \pi_1 P(X \ge i+1),
\end{aligned}$$
where $X$ is the life of an item. Using $\sum_i \pi_i = 1$ yields
$$1 = \pi_1 \sum_{i=1}^{\infty} P(X \ge i)$$
or $\pi_1 = 1/E[X]$, and hence
$$\pi_i = \frac{P(X \ge i)}{E[X]}, \quad i \ge 1.$$
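The formula $\pi_i = P(X \ge i)/E[X]$ can be checked on a concrete case. The lifetime distribution below ($P_1 = 0.3$, $P_2 = 0.5$, $P_3 = 0.2$) is a made-up example; we build the age chain from $\lambda(i)$, iterate to its limiting distribution, and compare:

```python
# A check of the renewal-age formula on an arbitrary lifetime
# distribution P_1 = 0.3, P_2 = 0.5, P_3 = 0.2 (so E[X] = 1.9).
life = {1: 0.3, 2: 0.5, 3: 0.2}

def lam(i):
    """Failure probability of an i-period-old item: P_i / sum_{j>=i} P_j."""
    return life[i] / sum(pr for j, pr in life.items() if j >= i)

# Transition matrix of the age chain on states 1..3.
n = 3
P = [[0.0] * n for _ in range(n)]
for i in range(1, n + 1):
    P[i - 1][0] = lam(i)                 # failure: age resets to 1
    if i < n:
        P[i - 1][i] = 1 - lam(i)         # survival: age increases by 1

# Power-iterate to the limiting distribution.
mu = [1.0, 0.0, 0.0]
for _ in range(500):
    mu = [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

EX = sum(j * pr for j, pr in life.items())
tail = [sum(pr for j, pr in life.items() if j >= i) for i in range(1, n + 1)]
print(mu)                                # limiting probabilities
print([t / EX for t in tail])            # P(X >= i) / E[X]: same vector
```

Both printed vectors equal $(1, 0.7, 0.2)/1.9$, as the example predicts.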
Example. Suppose that during each time period, every member of a population independently dies with probability $p$, and also that the number of new members that join the population in each time period is a Poisson random variable with mean $\lambda$. If we let $X_n$ denote the number of members of the population at the beginning of period $n$, then it is easy to see that $\{X_n, n = 1, \ldots\}$ is a Markov chain. To find the stationary probabilities of this chain, suppose that $X_0$ is distributed as a Poisson random variable with parameter $\alpha$. Since each of these $X_0$ individuals will independently be alive at the beginning of the next period with probability $1 - p$, it follows that the number of them that are still in the population at time 1 is a Poisson random variable with mean $\alpha(1-p)$.
As the number of new members that join the population by time 1 is an independent Poisson random variable with mean $\lambda$, it follows that $X_1$ is a Poisson random variable with mean $\alpha(1-p) + \lambda$. Hence, if $\alpha = \alpha(1-p) + \lambda$, that is $\alpha = \lambda/p$, then the chain would be stationary. Hence, by the uniqueness of the stationary distribution, we can conclude that the stationary distribution is Poisson with mean $\lambda/p$. That is,
$$\pi_j = e^{-\lambda/p} (\lambda/p)^j / j!, \quad j = 0, 1, \ldots$$
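A simulation makes the stationary mean $\lambda/p$ visible. The parameter values $\lambda = 2$, $p = 0.25$ below are arbitrary illustrative choices:

```python
import math
import random

# Monte Carlo check: run the population chain for many periods and compare
# the long-run average of X_n to the stationary mean lambda / p.
# lambda = 2 and p = 0.25 are arbitrary illustrative values.
random.seed(1)
lam, p = 2.0, 0.25

def poisson(mean):
    """Sample a Poisson(mean) random variable by sequential inversion."""
    u, k = random.random(), 0
    prob = math.exp(-mean)
    cum = prob
    while u > cum:
        k += 1
        prob *= mean / k
        cum += prob
    return k

x, total, steps = 0, 0, 100000
for _ in range(steps):
    # each current member survives independently with probability 1 - p,
    # then a Poisson(lambda) number of new members arrives
    survivors = sum(1 for _ in range(x) if random.random() > p)
    x = survivors + poisson(lam)
    total += x

print(total / steps)        # close to lambda / p = 8
```

The time average settles near $\lambda/p$, consistent with the Poisson$(\lambda/p)$ stationary distribution derived above.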
Example [the Gibbs sampler]. Let $p(x_1, \ldots, x_n)$ be the joint probability mass function of the random vector $X_1, \ldots, X_n$. In cases where it is difficult to generate the values of such a random vector directly, but where it is relatively easy to generate, for each $i$, a random variable having the conditional distribution of $X_i$ given all of the other $X_j$, $j \ne i$, we can generate a random vector whose probability mass function is approximately $p(x_1, \ldots, x_n)$ by using the Gibbs sampler. It works as follows.
Let $X^0 = (x_1^0, \ldots, x_n^0)$ be any vector for which $p(x_1^0, \ldots, x_n^0) > 0$. Then generate a random variable whose distribution is the conditional distribution of $X_1$ given that $X_j = x_j^0$, $j = 2, \ldots, n$, and call its value $x_1^1$. Then generate a random variable whose distribution is the conditional distribution of $X_2$ given that $X_1 = x_1^1$, $X_j = x_j^0$, $j = 3, \ldots, n$, and call its value $x_2^1$.
Then continue in this fashion until you have generated a random variable whose distribution is the conditional distribution of $X_n$ given that $X_j = x_j^1$, $j = 1, \ldots, n-1$, and call its value $x_n^1$. Let $X^1 = (x_1^1, \ldots, x_n^1)$, and repeat the process, this time starting with $X^1$ in place of $X^0$, to obtain the new vector $X^2$, and so on. It is easy to see that the sequence of vectors $X^j$, $j \ge 0$, is a Markov chain, and the claim is that its stationary probabilities are given by $p(x_1, \ldots, x_n)$.
To verify the claim, suppose that $X^0$ has probability mass function $p(x_1, \ldots, x_n)$. Then it is easy to see that at any point in this algorithm the vector $(x_1^j, \ldots, x_{i-1}^j, x_i^{j-1}, \ldots, x_n^{j-1})$ will be the value of a random variable with mass function $p(x_1, \ldots, x_n)$. For instance, letting $X_i^j$ be the random variable that takes on the value denoted by $x_i^j$, then
$$\begin{aligned}
P(X_1^1 = x_1,\ X_j^0 = x_j,\ j = 2, \ldots, n) &= P(X_1^1 = x_1 \mid X_j^0 = x_j,\ j = 2, \ldots, n)\, P(X_j^0 = x_j,\ j = 2, \ldots, n) \\
&= P(X_1 = x_1 \mid X_j = x_j,\ j = 2, \ldots, n)\, P(X_j = x_j,\ j = 2, \ldots, n) \\
&= p(x_1, \ldots, x_n).
\end{aligned}$$
Thus $p(x_1, \ldots, x_n)$ is a stationary probability distribution and so, provided that the Markov chain is irreducible and aperiodic, we can conclude that it is the limiting probability vector for the Gibbs sampler. It also follows from the preceding that $p(x_1, \ldots, x_n)$ would be the limiting probability vector even if the Gibbs sampler were not systematic in first changing the value of $X_1$, then $X_2$, and so on. Indeed, even if the component whose value was to be changed was always randomly determined, $p(x_1, \ldots, x_n)$ would remain a stationary distribution, and would thus be the limiting probability mass function provided that the resulting chain is aperiodic and irreducible.
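The scheme above can be sketched concretely. The joint pmf on $\{0,1\}^2$ below is a made-up example; each coordinate is resampled from its exact conditional, and the empirical distribution of the chain matches the target:

```python
import random

# A minimal Gibbs-sampler sketch for an arbitrary joint pmf p(x1, x2)
# on {0,1} x {0,1}. The conditionals are read off the table directly.
random.seed(2)
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def sample_x1_given_x2(x2):
    w0, w1 = p[(0, x2)], p[(1, x2)]
    return 0 if random.random() < w0 / (w0 + w1) else 1

def sample_x2_given_x1(x1):
    w0, w1 = p[(x1, 0)], p[(x1, 1)]
    return 0 if random.random() < w0 / (w0 + w1) else 1

x1, x2 = 0, 0                          # any state with p > 0
counts = {k: 0 for k in p}
steps = 200000
for _ in range(steps):
    x1 = sample_x1_given_x2(x2)        # update coordinate 1
    x2 = sample_x2_given_x1(x1)        # then coordinate 2
    counts[(x1, x2)] += 1

for k in sorted(p):
    print(k, counts[k] / steps, p[k])  # empirical vs target pmf
```

In realistic uses the joint table is unavailable and only the conditionals can be sampled; the point of the sketch is that the chain's long-run frequencies reproduce $p$.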
2 Exercises

1) Each day one of $n$ possible elements is requested; the $i$th one with probability $P_i$, $i \ge 1$, $\sum_{i=1}^n P_i = 1$. These elements are at all times arranged in an ordered list that is revised as follows: the element selected is moved to the front of the list with the relative positions of all the other elements remaining unchanged. Define the state at any time to be the list ordering at that time.
(a) Argue that the above is a Markov chain.
(b) For any state $i_1, \ldots, i_n$ (which is a permutation of $1, 2, \ldots, n$), let $\pi(i_1, \ldots, i_n)$ denote the limiting probability. Argue that
$$\pi(i_1, \ldots, i_n) = P_{i_1} \cdot \frac{P_{i_2}}{1 - P_{i_1}} \cdots \frac{P_{i_{n-1}}}{1 - P_{i_1} - \cdots - P_{i_{n-2}}}.$$
2) Let $\{X_n, n \ge 0\}$ be a Markov chain with stationary probabilities $\pi_j$, $j \ge 0$. Suppose that $X_0 = 0$ and define $T = \min\{n : n > 0 \text{ and } X_n = 0\}$. Let $Y_j = X_{T-j}$, $j = 0, 1, \ldots, T$. Argue that $\{Y_j, j = 0, \ldots, T\}$ is distributed as the states of the reverse Markov chain (with transition probabilities $P_{ij}^* = \pi_j P_{ji}/\pi_i$) starting in state 0 until it returns to 0.
3) Consider a finite Markov chain on the state space $\{0, 1, 2, \ldots, N\}$ with transition probability matrix $P = (P_{ij})_{i,j=0}^N$ consisting of three classes $\{0\}$, $\{1, 2, \ldots, N-1\}$ and $\{N\}$, where 0 and $N$ are absorbing states, both accessible from $k = 1, \ldots, N-1$, and $\{1, 2, \ldots, N-1\}$ is a transient class. Let $k$ be a state satisfying $0 < k < N$. We define an auxiliary process called "the return process" with transition matrix $\tilde{P}$ obtained by altering the first and last rows of $P$ so that $\tilde{P}_{0k} = \tilde{P}_{Nk} = 1$, leaving the other rows unchanged. The return process is clearly irreducible. Prove that the expected time until absorption $\tau_k$ with initial state $k$ in the original process equals $1/(\pi_0 + \pi_N) - 1$, where $\pi_0 + \pi_N$ is the stationary probability of being in state 0 or $N$ for the return process. Hint: use the relation between stationary probabilities and expected recurrence times to states.
3 Reversibility

Suppose that $\{X_n : 0 \le n \le N\}$ is an irreducible non-null persistent Markov chain, with transition matrix $P$ and stationary distribution $\pi$. Suppose further that $X_n$ has distribution $\pi$ for every $n$. Define the 'reversed chain' $Y$ by $Y_n = X_{N-n}$ for $0 \le n \le N$.
Proposition 3.1. The sequence $Y$ is a Markov chain with
$$P(Y_{n+1} = j \mid Y_n = i) = (\pi_j/\pi_i)\, p_{ji}.$$
Proof. We have, as required, that
$$\begin{aligned}
P(Y_{n+1} = i_{n+1} \mid Y_n = i_n, Y_{n-1} = i_{n-1}, \ldots, Y_0 = i_0)
&= \frac{P(Y_k = i_k,\ 0 \le k \le n+1)}{P(Y_k = i_k,\ 0 \le k \le n)} \\
&= \frac{P(X_{N-n-1} = i_{n+1}, X_{N-n} = i_n, \ldots, X_N = i_0)}{P(X_{N-n} = i_n, \ldots, X_N = i_0)} \\
&= \frac{\pi_{i_{n+1}}\, p_{i_{n+1}, i_n}\, p_{i_n, i_{n-1}} \cdots p_{i_1, i_0}}{\pi_{i_n}\, p_{i_n, i_{n-1}} \cdots p_{i_1, i_0}} \\
&= \frac{\pi_{i_{n+1}}\, p_{i_{n+1}, i_n}}{\pi_{i_n}}.
\end{aligned}$$
We call the chain $Y$ the time reversal of the chain $X$, and we say that $X$ is reversible if $X$ and $Y$ have the same transition probabilities.
Let $X = \{X_n : 0 \le n \le N\}$ be an irreducible Markov chain such that $X_n$ has the stationary distribution $\pi$ for all $n$. The chain is called reversible if the transition matrices of $X$ and its time-reversal $Y$ are the same, which is to say that
$$\pi_i p_{ij} = \pi_j p_{ji} \quad \text{for all } i, j.$$
These equations are called the detailed balance equations.
Proposition 3.2. Let $P$ be the transition matrix of an irreducible chain $X$, and suppose that there exists a distribution $\pi$ such that $\pi_i p_{ij} = \pi_j p_{ji}$ for all $i, j \in S$. Then $\pi$ is a stationary distribution of the chain. Furthermore, $X$ is reversible in equilibrium.
Proof. Suppose that $\pi$ satisfies the conditions above. Then
$$\sum_i \pi_i p_{ij} = \sum_i \pi_j p_{ji} = \pi_j \sum_i p_{ji} = \pi_j,$$
and so $\pi = \pi P$, whence $\pi$ is stationary. The reversibility in equilibrium of $X$ follows from the definition.
Example [Ehrenfest model of diffusion]. Two containers A and B are placed adjacent to each other and gas is allowed to pass through a small aperture joining them. A total of $m$ gas molecules is distributed between the containers. We assume that at each epoch of time one molecule, picked uniformly at random from the $m$ available, passes through this aperture. Let $X_n$ be the number of molecules in container A after $n$ units of time have passed. Clearly $\{X_n\}$ is a Markov chain with transition matrix
$$p_{i,i+1} = 1 - \frac{i}{m}, \qquad p_{i,i-1} = \frac{i}{m}, \qquad 0 \le i \le m.$$
Rather than solve the equation $\pi = \pi P$ to find the stationary distribution, we note that such a reasonable diffusion model should be reversible in equilibrium. Look for solutions of the detailed balance equations $\pi_i p_{ij} = \pi_j p_{ji}$ to obtain
$$\pi_i = \binom{m}{i} \left(\tfrac{1}{2}\right)^m.$$
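The binomial solution can be verified directly against the detailed balance equations; the value $m = 10$ below is an arbitrary choice:

```python
from math import comb

# Check that pi_i = C(m, i) (1/2)^m satisfies detailed balance for the
# Ehrenfest chain: pi_i * p_{i,i+1} = pi_{i+1} * p_{i+1,i}. m = 10 is
# an arbitrary illustrative size.
m = 10
pi = [comb(m, i) * 0.5**m for i in range(m + 1)]

for i in range(m):
    up = 1 - i / m                # p_{i,i+1}
    down = (i + 1) / m            # p_{i+1,i}
    assert abs(pi[i] * up - pi[i + 1] * down) < 1e-12

print("detailed balance holds; total mass =", sum(pi))
```

By Proposition 3.2, satisfying detailed balance already makes this vector stationary, with no need to solve $\pi = \pi P$ directly.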
Example [the Metropolis algorithm]. Let $a_j$, $j = 1, \ldots, m$, be positive numbers and let $A = \sum_{j=1}^m a_j$. Suppose that $m$ is large and that $A$ is difficult to compute, and suppose we ideally want to simulate the values of a sequence of independent random variables whose probabilities are $p_j = a_j/A$, $j = 1, \ldots, m$. One way of simulating a sequence of random variables whose distributions converge to $\{p_j, j = 1, \ldots, m\}$ is to find a Markov chain that is both easy to simulate and whose limiting probabilities are the $p_j$. The Metropolis algorithm provides an approach for accomplishing this task.
Let $Q$ be any irreducible transition probability matrix on the integers $1, 2, \ldots, m$ such that $q_{ij} = q_{ji}$ for all $i$ and $j$. Now define a Markov chain $\{X_n, n \ge 0\}$ as follows. If $X_n = i$, then generate a random variable that is equal to $j$ with probability $q_{ij}$, $j = 1, \ldots, m$. If this random variable takes on the value $j$, then set $X_{n+1}$ equal to $j$ with probability $\min\{1, a_j/a_i\}$, and set it equal to $i$ otherwise. That is, the transition probabilities of $\{X_n, n \ge 0\}$ are
$$P_{ij} = \begin{cases} q_{ij} \min(1, a_j/a_i) & \text{if } j \ne i, \\[2pt] q_{ii} + \sum_{j \ne i} q_{ij}\,\{1 - \min(1, a_j/a_i)\} & \text{if } j = i. \end{cases}$$
We will now show that the limiting probabilities of this Markov chain are precisely the $p_j$. To prove this, we will first show that the chain is time reversible with stationary probabilities $p_j$, $j = 1, \ldots, m$, by showing that $p_i P_{ij} = p_j P_{ji}$. To verify the preceding we must show that
$$p_i q_{ij} \min(1, a_j/a_i) = p_j q_{ji} \min(1, a_i/a_j).$$
Now, $q_{ij} = q_{ji}$ and $a_j/a_i = p_j/p_i$, and so we must verify that
$$p_i \min(1, p_j/p_i) = p_j \min(1, p_i/p_j).$$
However this is immediate, since both sides of the equation are equal to $\min(p_i, p_j)$. That these stationary probabilities are also limiting probabilities follows from the fact that, since $Q$ is an irreducible transition probability matrix, $\{X_n\}$ will also be irreducible, and, as (except in the trivial case where $p_i \equiv 1/m$) $P_{ii} > 0$ for some $i$, it is also aperiodic. By choosing a transition probability matrix $Q$ that is easy to simulate (that is, for each $i$ it is easy to generate the value of a random variable that is equal to $j$ with probability $q_{ij}$, $j = 1, \ldots, m$) we can use the preceding to generate a Markov chain whose limiting probabilities are $a_j/A$, $j = 1, \ldots, m$. This can be accomplished without computing $A$.
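The algorithm above can be sketched in a few lines. The weights below are arbitrary, the proposal $Q$ is the uniform (hence symmetric) matrix $q_{ij} = 1/m$, and note that the sampler itself never uses $A$:

```python
import random

# Metropolis sketch: sample from p_j = a_j / A using only the a_j.
# The weights a are arbitrary; the proposal is uniform, so q_ij = q_ji.
random.seed(3)
a = [1.0, 2.0, 3.0, 4.0]
m = len(a)

x = 0
counts = [0] * m
steps = 200000
for _ in range(steps):
    j = random.randrange(m)                    # propose j with prob 1/m
    if random.random() < min(1.0, a[j] / a[x]):
        x = j                                  # accept the move
    counts[x] += 1                             # otherwise stay at i

A = sum(a)                                     # used only to check the answer
for i in range(m):
    print(counts[i] / steps, a[i] / A)         # empirical vs a_i / A
```

Only the ratio $a_j/a_i$ enters the acceptance step, which is exactly why the normalising constant $A$ never needs to be computed.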
Example [random walk on a graph]. Consider a graph having a positive number $w_{ij}$ associated with each edge $(i,j)$, and suppose that a particle moves from vertex to vertex in the following manner: if the particle is presently at vertex $i$ then it will next move to vertex $j$ with probability
$$P_{ij} = \frac{w_{ij}}{\sum_j w_{ij}},$$
where $w_{ij}$ is 0 if $(i,j)$ is not an edge of the graph. The Markov chain describing the sequence of vertices visited by the particle is called a random walk on an edge weighted graph.
Proposition 3.3. Consider a random walk on an edge weighted graph with a finite number of vertices. If this Markov chain is irreducible then it is, in steady state, time reversible with stationary probabilities given by
$$\pi_i = \frac{\sum_j w_{ij}}{\sum_i \sum_j w_{ij}}.$$
Proof. The time reversibility equations $\pi_i P_{ij} = \pi_j P_{ji}$ reduce to
$$\frac{\pi_i w_{ij}}{\sum_k w_{ik}} = \frac{\pi_j w_{ji}}{\sum_k w_{jk}}$$
or, equivalently, since $w_{ij} = w_{ji}$,
$$\frac{\pi_i}{\sum_k w_{ik}} = \frac{\pi_j}{\sum_k w_{jk}},$$
implying that
$$\pi_i = c \sum_k w_{ik},$$
which, since $\sum_i \pi_i = 1$, proves the result.
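Proposition 3.3 is easy to confirm on a small instance. The weighted triangle below is a made-up example; normalising the vertex weight sums gives a vector that satisfies $\pi = \pi P$:

```python
# Check of Proposition 3.3 on an arbitrary weighted triangle graph:
# pi_i is sum_j w_ij divided by the total weight sum_i sum_j w_ij.
w = [[0.0, 1.0, 2.0],
     [1.0, 0.0, 3.0],
     [2.0, 3.0, 0.0]]
n = len(w)

strength = [sum(w[i]) for i in range(n)]          # sum_j w_ij at vertex i
total = sum(strength)
pi = [s / total for s in strength]                # claimed stationary vector

# Transition probabilities P_ij = w_ij / sum_j w_ij; verify pi = pi P.
for j in range(n):
    balance = sum(pi[i] * w[i][j] / strength[i] for i in range(n))
    assert abs(balance - pi[j]) < 1e-12

print(pi)
```

For this graph the vector is $(3, 4, 5)/12$; the balance check works because $w_{ij} = w_{ji}$, exactly as in the proof.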
4 Exercises

1) Consider a time-reversible Markov chain on the state space $\{0, 1, 2, \ldots\}$ with transition probabilities $P_{ij}$ and limiting probabilities $\pi_i$. Now consider the same chain truncated to the states $\{0, 1, \ldots, M\}$. That is, the truncated chain has transition probabilities
$$\bar{P}_{ij} = \begin{cases} P_{ij} + \sum_{k > M} P_{ik}, & 0 \le i \le M,\ j = i, \\ P_{ij}, & 0 \le i \ne j \le M, \\ 0, & \text{otherwise.} \end{cases}$$
Show that the truncated chain is also time reversible and has limiting probabilities given by
$$\bar{\pi}_i = \frac{\pi_i}{\sum_{i=0}^M \pi_i}.$$
2) Suppose $M$ balls are initially distributed among $m$ urns. At each stage one of the balls is selected at random, taken from whichever urn it is in, and placed, again at random, in one of the other $m - 1$ urns. Consider the Markov chain whose state at any time is the vector $(n_1, \ldots, n_m)$, where $n_i$ denotes the number of balls in urn $i$. Guess at the limiting probabilities for this Markov chain, verify your guess, and show at the same time that the Markov chain is time reversible.
