1 Linear Programming: Basic Concepts and Graphical Solutions

1. Elements of linear programming
   (a) Decision variables: X_j, j = 1, ..., n
   (b) Objective function: Max (or Min) Z = sum_{j=1}^n c_j X_j = c_1 X_1 + c_2 X_2 + ... + c_n X_n
   (c) Constraints: represent physical (or financial) restrictions on decisions.
       i-th constraint: sum_{j=1}^n a_{ij} X_j = a_{i1} X_1 + a_{i2} X_2 + ... + a_{in} X_n  (<=, >=, or =)  b_i
2. Solution concepts
   Feasible solution: A specification of decision variables satisfying all constraints.
   Feasible region: The set of all feasible solutions; its boundary consists of straight lines and corner points.
   Extreme point (corner point) feasible solution: A solution that lies at a corner of the feasible region.
   Binding constraint: A constraint that is satisfied with "=" (Usage (L.H.S.) = Limit (R.H.S.)).
   Fundamental Theorem of Linear Programming: If there is exactly one optimal solution, it must be a corner point feasible solution.
   Four scenarios: a. unique optimal solution, b. multiple optimal solutions, c. unbounded, d. infeasible. Every LP falls into exactly one of these four categories.

3. Graphical method of solving LPs: problems with 2 (or 3) variables
   Step 1. Find the range of the decision variables for which each constraint is met.
   Step 2. Determine the feasible region: the intersection of the feasible regions of all constraints.
   Step 3. Define the slope of the iso-profit (iso-cost) function.
   Step 4. Increase (decrease) the intercept of the iso-profit (iso-cost) function until it is farthest from (closest to) the origin while still intersecting the feasible region.

2 LP: Excel Outputs, Sensitivity Analysis, and Applications

1. Excel Solver.
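Excel Solver is one way to solve such models; as a sketch of the same idea in code, the toy LP below is solved with SciPy's linprog. All coefficients and RHS values are invented for illustration (they are not an example from these notes), and a shadow price is estimated by raising one RHS by a unit and re-solving:

```python
# Toy LP (hypothetical numbers): Max Z = 3*X1 + 5*X2
# s.t.  X1 <= 4,  2*X2 <= 12,  3*X1 + 2*X2 <= 18,  X1, X2 >= 0
from scipy.optimize import linprog

c = [-3, -5]                          # linprog minimizes, so negate to maximize
A_ub = [[1, 0], [0, 2], [3, 2]]
b_ub = [4, 12, 18]
bounds = [(0, None), (0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("optimal X:", res.x)            # a corner point of the feasible region
print("optimal Z:", -res.fun)

# Shadow price of the third constraint: raise its RHS by 1 and re-solve;
# the change in Z is the shadow price a sensitivity report would show.
res2 = linprog(c, A_ub=A_ub, b_ub=[4, 12, 19], bounds=bounds)
print("shadow price:", -res2.fun + res.fun)
```

Re-solving after a unit RHS change mirrors the definition of the shadow price used in the sensitivity-report notes; within the allowable increase/decrease it matches the reported value.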
2. Answer and sensitivity reports:
   Answer report: Displays the optimal solution (X_j*, j = 1, ..., n), the value of the objective function (Z*), and the binding constraints.
   Sensitivity report: Allows us to investigate the effect of changing objective function coefficients (e.g., unit profit contribution) and constraint levels. The Solver sensitivity report has two panels:
   (a) "Adjustable Cells": Contains information on sensitivity to changes in objective function coefficients.
       o Allowable increase/decrease: The allowable increase and decrease in the objective function coefficient over which the optimal decision variable levels do not change. The objective function value (e.g., profit, cost) changes by an amount equal to the change in the coefficient times the optimal decision variable level. For any change outside the allowable increase/decrease, the LP must be re-solved.
       o Reduced cost: For X_j = 0, the additional decrease in X_j's objective function coefficient needed to make X_j > 0. Alternatively, if X_j is changed from zero to one, the objective function changes by an amount equal to the reduced cost. If X_j != 0, then the reduced cost is zero.
   (b) "Constraints": Contains information on sensitivity to changes in RHS constraint levels.
       o Allowable increase/decrease: The allowable increase and decrease in the constraint RHS over which the shadow price is applicable and the binding constraints stay the same. For any change outside the allowable increase/decrease, the LP must be re-solved.
       o Shadow price: Change in the objective function per unit increase in the RHS of the constraint. Note that for changes in constraints with non-zero shadow prices, the optimal decision variable levels change as well.

3 Basic Probability Concepts and Bayes' Rule

1. Basic concepts
Sample space: The set of all possible outcomes/sample points.
Sample point (elementary event): A unique individual outcome. The set of sample points is a mutually exclusive and collectively exhaustive set. Event : A collection of outcomes of interest.
Probability: Likelihood of the event. Maps outcome(s) to a number between 0 and 1. Random variable: Maps outcomes to numbers: Continuous or Discrete. 2. Finding the probability of an event or compound event
2. Finding the probability of an event or compound event
   Prob. of an event E: If E = {e_1, ..., e_n}, then P(E) = P(e_1) + P(e_2) + ... + P(e_n).
   o If all outcomes are equally likely: P(E) = (# of ways that the event occurs) / (# of outcomes)
   Probability rules:
   (d) Conditional probability and multiplication rule:
       P(A|B) = P(A ∩ B) / P(B)  <=>  P(A ∩ B) = P(A) P(B|A) = P(B) P(A|B)
   (e) Independence: P(A ∩ B) = P(A) P(B), or P(A|B) = P(A) and P(B|A) = P(B).
   (f) Total probability rule:
       o (Two events, A and A^c): P(B) = P(B ∩ A) + P(B ∩ A^c) = P(A) P(B|A) + P(A^c) P(B|A^c)
       o (n mutually exclusive and collectively exhaustive events E_1, ..., E_n, i.e., E_i ∩ E_j = Ø for i != j and P(E_1 ∪ ... ∪ E_n) = 1):
         P(B) = sum_{i=1}^n P(B ∩ E_i) = P(E_1) P(B|E_1) + P(E_2) P(B|E_2) + ... + P(E_n) P(B|E_n)
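As a sketch, the total probability rule can be checked numerically. The three-event partition and every probability below are invented for illustration (they are not from the notes):

```python
# Hypothetical partition E1, E2, E3 (mutually exclusive, collectively
# exhaustive) and likelihoods of an event B given each; numbers invented.
priors = [0.5, 0.3, 0.2]            # P(E1), P(E2), P(E3); they sum to 1
likelihoods = [0.10, 0.40, 0.25]    # P(B|E1), P(B|E2), P(B|E3)

# Total probability rule: P(B) = sum_i P(Ei) * P(B|Ei)
p_b = sum(p_e * p_b_given_e for p_e, p_b_given_e in zip(priors, likelihoods))
print(p_b)   # 0.5*0.10 + 0.3*0.40 + 0.2*0.25
```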
   Bayes' Rule (multiplication rule + total probability rule): Revising the probability of an event after learning new information.
       P(A|B) = P(A ∩ B) / P(B) = P(A ∩ B) / (P(A ∩ B) + P(A^c ∩ B)) = P(A) P(B|A) / (P(A) P(B|A) + P(A^c) P(B|A^c))

3. Tables (and figures) for summarizing probability information. We presented two tables and one "tree" for this. Examples of each, filled in for the "tale of two suppliers" in-class example, are presented below. (B = part is bad, B^c = part is not bad; A and A^c are the two suppliers, with A^c = Supplier S.)

   Joint probability table:

                 B                                 B^c                                Total
   A             P(A ∩ B) = (.6)(.04) = .024       P(A ∩ B^c) = (.6)(.96) = .576      P(A) = .024 + .576 = .60
   A^c           P(A^c ∩ B) = (.4)(.02) = .008     P(A^c ∩ B^c) = (.4)(.98) = .392    P(A^c) = .008 + .392 = .40
   Total         P(B) = .024 + .008 = .032         P(B^c) = .576 + .392 = .968        P(A) + P(A^c) = .60 + .40 = 1.00

   Prior, conditional, joint, and posterior table:

   Prior          Conditional       Joint                               Posterior
   P(A) = .60     P(B|A) = .04      P(A ∩ B) = P(A) P(B|A)              P(A|B) = P(A ∩ B)/P(B)
                                      = (.6)(.04) = .024                  = .024/.032 = .75
   P(A^c) = .40   P(B|A^c) = .02    P(A^c ∩ B) = P(A^c) P(B|A^c)        P(A^c|B) = P(A^c ∩ B)/P(B)
                                      = (.4)(.02) = .008                  = .008/.032 = .25
                                    P(B) = P(A ∩ B) + P(A^c ∩ B)
                                      = .024 + .008 = .032

   [Probability tree figure: first branch on supplier (A with prob. .6, A^c with prob. .4), then on part quality; e.g., P(A ∩ B) = P(A) P(B|A) = (.6)(.04) = .024 and P(A^c ∩ B^c) = P(A^c) P(B^c|A^c) = (.4)(.98) = .392, giving P(B) = .024 + .008 = .032.]
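The two-supplier tables can be reproduced in a few lines of code. The inputs are the prior supplier probabilities and conditional bad-part rates from the in-class example:

```python
# Priors and conditionals from the "tale of two suppliers" example:
# P(A) = .60 with P(B|A) = .04; the other supplier has P(A^c) = .40
# with P(B|A^c) = .02, where B is the event "part is bad".
p_a, p_ac = 0.60, 0.40
p_b_given_a, p_b_given_ac = 0.04, 0.02

# Multiplication rule gives the joint probabilities (the "Joint" column)
p_ab = p_a * p_b_given_a          # P(A and B)
p_acb = p_ac * p_b_given_ac       # P(A^c and B)

# Total probability gives the marginal, Bayes' rule the posterior
p_b = p_ab + p_acb
p_a_given_b = p_ab / p_b
print(round(p_b, 3), round(p_a_given_b, 2))   # 0.032 0.75
```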
4 Decision Analysis

1. One-stage decision analysis
   How to solve? Payoff table, decision tree (TreePlan).
   Sequence of moves: DM chooses an alternative a_i => Nature chooses a state of nature (s.o.n.) s_j => Payoff realized based on the combination of a_i and s_j.
   Max EMV rule: Choose the alternative a_i that maximizes EMV.
       EMV[a_i] = sum_{j=1}^n P(s.o.n. j) * payoff(a_i, s_j)
   [Decision tree figure: three alternatives (Apartment; a second alternative whose label is illegible in the source; Warehouse) against two states of nature, Good Economy (prob. 0.6) and Bad Economy (prob. 0.4); payoffs: Apartment 50 / 30, second alternative 100 / -40, Warehouse 30 / 10.]
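The one-stage tree reduces to a few lines of arithmetic. The payoffs and the 0.6/0.4 state probabilities below are as read from the partly garbled figure, and "Alt B" stands in for the middle alternative, whose label is not legible in the source:

```python
# Payoffs and Good/Bad Economy probabilities as read from the figure.
# "Alt B" is a placeholder name for an alternative whose label is illegible.
probs = {"Good": 0.6, "Bad": 0.4}
payoffs = {
    "Apartment": {"Good": 50, "Bad": 30},
    "Alt B":     {"Good": 100, "Bad": -40},
    "Warehouse": {"Good": 30, "Bad": 10},
}

# Max EMV rule: EMV[a_i] = sum_j P(s_j) * payoff(a_i, s_j); pick the max
emv = {a: sum(probs[s] * pay[s] for s in probs) for a, pay in payoffs.items()}
best = max(emv, key=emv.get)
print(emv)    # Apartment 42.0, Alt B 44.0, Warehouse 22.0
print(best)   # Alt B
```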
2. Multistage decision analysis
   How to solve? Decision tree (TreePlan).
DM selects a series of decisions from a set of choices over time, in the face of uncertainty.
   (DM's move => Immediate payoff => Nature's move => ... => DM's move => Immediate payoff => Nature's move => Game over (Terminal payoff))
   How to solve?
   Step 1. Calculate terminal payoffs: sum the immediate payoffs (IPs) on the path to each terminal node.
   Step 2. Roll back as before, starting from the terminals and using terminal payoffs:
       At event nodes: Compute the expected value of the final payoffs of all branches.
       At decision nodes: Select the maximum (or minimum, if minimizing costs or EOL) final payoff among branches.
   Repeat the process until reaching the beginning of the tree.

3. Perfect information
   Complete foresight; information given by an oracle. With PI, a decision maker chooses a_i* after learning the state of nature s_j.
   [Decision tree figure with perfect information: nature moves first, then the DM picks the best alternative for each state; same payoffs as the one-stage example above.]
   EVPI (expected value of perfect information): The maximum amount to pay for the perfect info.
       EMV[with PI] = sum_{j=1}^n P(s.o.n. j) * payoff(best a_i for s_j)
       EVPI = EMV[with free PI] - EMV[no info]

4. Using sample (imperfect) information
   (a) Revised probabilities of the states of nature using Bayes' rule.
       Changes the sequence of events and the decision tree. Prior & likelihood (conditional) (what we have) => marginal & posterior (what we need). Figure 1: Two states of nature (S, S^c), two information types (Pos, Neg).
   (b) EVSI (expected value of sample information): The maximum amount to pay for the sample info.
       EVSI = EMV[free sample info] - EMV[no info]   (EVSI <= EVPI)
   (c) ENGS (expected net gain from sampling): The average gain from using sample info at cost c.
       ENGS = EMV[sample info at cost c] - EMV[no info] = EVSI - c

5 Discrete Random Variables, Binomial and Poisson RVs

1. Discrete random variable: Assumes finitely or countably many values (e.g., # of bottles of beer consumed at a party).
   PMF: Probability mass function p(x) = P(X = x).
   CDF: Cumulative distribution function F(x) = P(X <= x) = sum_{y <= x} p(y).
   If X is a non-negative, integer-valued r.v., F(x) = P(X <= x) = p(0) + p(1) + ... + p(x).
   Upper tail distribution: P(X > x) = 1 - P(X <= x) = 1 - F(x).

2. Mean (expectation): Weighted average of the possible values (weights: p(x))
       mu = E[X] = sum_x x p(x)
   o For any number c, E[cX] = c E[X] = c mu
   o For any two X_1 and X_2, E[X_1 + X_2] = E[X_1] + E[X_2] = mu_1 + mu_2

3. Variance: Probability-weighted sum of squared differences from the mean
       Var[X] = sigma^2 = E[(X - mu)^2] = sum_x (x - mu)^2 p(x)
       Var[X] = sigma^2 = E[X^2] - (E[X])^2
   o For any number c, Var[cX] = c^2 Var[X] = c^2 sigma^2
   o Standard deviation: sigma = sqrt(Var[X])

4. Binomial distribution X ~ binom(n, p)
   o Used to count the number of successes out of n i.i.d. trials, where each trial has success probability p (i.e., X = X_1 + X_2 + ... + X_n).
       P(X = x) = C(n, x) p^x (1 - p)^(n - x),  x = 0, ..., n        EXCEL: BINOMDIST
       where C(n, x) = n! / (x! (n - x)!) and n! = 1 x 2 x ... x n, with 0! = 1.
   Mean: E[X] = np and Variance: Var[X] = np(1 - p)

5. Poisson distribution X ~ Poisson(lambda, t)
   o Used to count the number of events (e.g., arrivals) occurring over time or space, where lambda is the mean number of arrivals per unit (time or space) and t is the number of units.
       P(X = x) = e^(-lambda*t) (lambda*t)^x / x!,  x = 0, 1, ...        EXCEL: POISSON
   Mean = Variance: E[X] = Var[X] = lambda*t
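Both distributions can be sanity-checked directly from their formulas. The parameter values below (n = 10, p = 0.3; lambda = 2, t = 3) are invented for illustration:

```python
from math import comb, exp, factorial

# Binomial: P(X = x) = C(n, x) p^x (1-p)^(n-x); hypothetical n = 10, p = 0.3
n, p = 10, 0.3
binom_pmf = [comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)]
mean = sum(x * binom_pmf[x] for x in range(n + 1))
var = sum((x - mean) ** 2 * binom_pmf[x] for x in range(n + 1))
print(round(mean, 6), round(var, 6))     # matches n*p and n*p*(1-p)

# Poisson: P(X = x) = e^(-lambda*t) (lambda*t)^x / x!; hypothetical lam=2, t=3
lam, t = 2.0, 3.0
mu = lam * t                             # mean number of arrivals

def poisson_pmf(x):
    return exp(-mu) * mu**x / factorial(x)

# Upper tail via the complement rule: P(X > 4) = 1 - F(4)
upper_tail = 1 - sum(poisson_pmf(x) for x in range(5))
print(round(upper_tail, 4))
```

The empirical mean and variance of the binomial pmf reproduce np and np(1-p), matching the formulas above.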
Winter '07