# ReynoldsCh02 - The Simple Imperative Language


## The Simple Imperative Language

Syntax:

    intexp  ::= 0 | 1 | ... | var | -intexp
              | intexp + intexp | intexp - intexp | ...
    boolexp ::= true | false
              | intexp = intexp | intexp < intexp | intexp ≤ intexp | ...
              | ¬boolexp | boolexp ∧ boolexp | boolexp ∨ boolexp | ...
                (no quantified terms)
    comm    ::= var := intexp | skip | comm ; comm
              | if boolexp then comm else comm
              | while boolexp do comm          (may fail to terminate)

## Denotational Semantics of SIL

The semantic functions:

    [[−]]intexp  ∈ intexp  → Σ → Z        where Σ  = var → Z
    [[−]]boolexp ∈ boolexp → Σ → B
    [[−]]comm    ∈ comm    → Σ → Σ⊥       where Σ⊥ = Σ ∪ {⊥}   (⊥ marks divergence)

Assignment (simpler than [[−]]assert):

    [[v := e]]comm σ ≝ [σ | v: [[e]]intexp σ]

For example:

    [[x := x*6]]comm [x: 7] = [x: 7 | x: [[x*6]]intexp [x: 7]]
                            = [x: 7 | x: 42]
                            = [x: 42]

Skip:

    [[skip]]comm σ = σ

Sequencing is NOT simply

    [[c ; c′]]comm σ = [[c′]]comm ([[c]]comm σ)

because the argument [[c]]comm σ is ⊥ when c fails to terminate, and [[c′]]comm is defined only on Σ, not on Σ⊥.
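As a concrete companion to the clauses above, here is a minimal Python sketch of [[−]]intexp and the first two [[−]]comm clauses, assuming states are dicts from variable names to integers; the tuple tags ("const", "var", ...) are our own encoding, not from the text.

```python
# A minimal sketch of the first semantic clauses, assuming states are
# Python dicts (var -> int). The tuple tags below are our own encoding.

def meaning_intexp(e, sigma):
    """[[e]]intexp sigma : the integer an expression denotes in state sigma."""
    tag = e[0]
    if tag == "const":                      # 0 | 1 | ...
        return e[1]
    if tag == "var":                        # var
        return sigma[e[1]]
    if tag == "neg":                        # -intexp
        return -meaning_intexp(e[1], sigma)
    if tag == "add":                        # intexp + intexp
        return meaning_intexp(e[1], sigma) + meaning_intexp(e[2], sigma)
    if tag == "mul":
        return meaning_intexp(e[1], sigma) * meaning_intexp(e[2], sigma)
    raise ValueError(f"unknown intexp: {tag}")

def meaning_comm(c, sigma):
    """[[c]]comm sigma for the assignment and skip clauses."""
    tag = c[0]
    if tag == "assign":                     # [[v := e]] sigma = [sigma | v: [[e]] sigma]
        return {**sigma, c[1]: meaning_intexp(c[2], sigma)}
    if tag == "skip":                       # [[skip]] sigma = sigma
        return sigma
    raise ValueError(f"unknown comm: {tag}")

# The worked example: [[x := x*6]] [x: 7] = [x: 42]
prog = ("assign", "x", ("mul", ("var", "x"), ("const", 6)))
print(meaning_comm(prog, {"x": 7}))   # {'x': 42}
```

Sequencing is deliberately omitted here, since it needs the strict extension operator introduced next.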
## Semantics of Sequential Composition

We can extend f ∈ S → T⊥ to f⊥⊥ ∈ S⊥ → T⊥:

    f⊥⊥ x ≝ ⊥,     if x = ⊥
            f x,   otherwise

This defines

    (−)⊥⊥ ∈ (S → T⊥) → (S⊥ → T⊥)

(a special case of the Kleisli monadic extension operator).
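A minimal sketch of this extension operator, assuming ⊥ is modelled by Python's None; `inc` and `diverge` are made-up command meanings for illustration.

```python
# A sketch of (−)⊥⊥, assuming ⊥ is modelled by None: it turns
# f : S -> T⊥ into f⊥⊥ : S⊥ -> T⊥ by making the result strict in ⊥.

def kleisli_lift(f):
    """(−)⊥⊥ : (S -> T⊥) -> (S⊥ -> T⊥)."""
    return lambda x: None if x is None else f(x)

# Sequential composition in this model:
# [[c ; c']] sigma = ([[c']])⊥⊥ ([[c]] sigma)
def seq(m1, m2):
    return lambda sigma: kleisli_lift(m2)(m1(sigma))

inc = lambda s: {**s, "x": s["x"] + 1}   # a terminating command meaning
diverge = lambda s: None                 # meaning of `while true do skip`

print(seq(inc, inc)({"x": 0}))       # {'x': 2}
print(seq(diverge, inc)({"x": 0}))   # None: ⊥ propagates through `;`
```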
So

    [[−]]comm ∈ comm → Σ → Σ⊥  ⇒  [[c′]]comm ∈ Σ → Σ⊥  ⇒  ([[c′]]comm)⊥⊥ ∈ Σ⊥ → Σ⊥

and we define

    [[c ; c′]]comm σ = ([[c′]]comm)⊥⊥ ([[c]]comm σ)

## Semantics of Conditionals

    [[if b then c0 else c1]]comm σ = [[c0]]comm σ,   if [[b]]boolexp σ = true
                                     [[c1]]comm σ,   if [[b]]boolexp σ = false

Example:

    [[if x<0 then x:= -x else skip]]comm [x: −3]
        = [[x:= -x]]comm [x: −3]                since [[x<0]]boolexp [x: −3] = true
        = [x: −3 | x: [[-x]]intexp [x: −3]]
        = [x: 3]

    [[if x<0 then x:= -x else skip]]comm [x: 5]
        = [[skip]]comm [x: 5]                   since [[x<0]]boolexp [x: 5] = false
        = [x: 5]

## Problems with the Semantics of Loops

Idea: define the meaning of while b do c as that of if b then (c ; while b do c) else skip. But the equation

    [[while b do c]]comm σ = [[if b then (c ; while b do c) else skip]]comm σ
                           = ([[while b do c]]comm)⊥⊥ ([[c]]comm σ),   if [[b]]boolexp σ = true
                             σ,                                        otherwise

is not syntax-directed and sometimes has infinitely many solutions: λσ: Σ. σ′ is a solution for [[while true do x:= x+1]]comm for any σ′.

## Partially Ordered Sets

A relation ρ is

    reflexive on S   iff   ∀x ∈ S. x ρ x
    transitive       iff   x ρ y & y ρ z ⇒ x ρ z
    antisymmetric    iff   x ρ y & y ρ x ⇒ x = y
    symmetric        iff   x ρ y ⇒ y ρ x

- ⊑ reflexive on P & transitive ⇒ ⊑ is a preorder on P
- ⊑ a preorder on P & antisymmetric ⇒ ⊑ is a partial order on P
- P with a partial order ⊑ on P ⇒ a poset
- y ∈ P with ∀x ∈ X. x ⊑ y, for X ⊆ P ⇒ y is an upper bound of X
- P with the identity relation I_P as a partial order ⇒ a discretely ordered P
- f ∈ P → P′ & ∀x, y ∈ P. (x ⊑ y ⇒ f x ⊑′ f y) ⇒ f is monotone from P to P′

## Least Upper Bounds

y is a lub (least upper bound) of X ⊆ P if y is an upper bound of X and

    ∀z ∈ P. (z is an upper bound of X ⇒ y ⊑ z)

If P is a poset and X ⊆ P, there is at most one lub ⊔X of X.

⊔{} = ⊥, the least element of P (when it exists).

Let 𝒳 ⊆ 𝒫 P be such that ⊔X exists for every X ∈ 𝒳. Then

    ⊔ { ⊔X | X ∈ 𝒳 } = ⊔ ⋃𝒳

if either of these lubs exists. In particular,

    ⊔_{i=0..∞} ⊔_{j=0..∞} x_ij = ⊔ { x_ij | i ∈ N and j ∈ N } = ⊔_{j=0..∞} ⊔_{i=0..∞} x_ij

if ⊔_{i=0..∞} x_ij exists for all j, or ⊔_{j=0..∞} x_ij exists for all i.
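The lub definitions above can be checked by brute force on a small finite poset; the sketch below assumes the divisibility order on {1, 2, 3, 6}, which is our own example, not from the text.

```python
# A brute-force check of upper bounds and lubs on a small finite poset:
# the divisibility order on {1, 2, 3, 6} (1 below everything, 6 on top).

DIV = {1, 2, 3, 6}
leq = lambda x, y: y % x == 0          # x ⊑ y iff x divides y

def upper_bounds(X):
    """All y ∈ DIV with x ⊑ y for every x ∈ X."""
    return {y for y in DIV if all(leq(x, y) for x in X)}

def lub(X):
    """The least upper bound of X, or None if it does not exist."""
    ubs = upper_bounds(X)
    least = [y for y in ubs if all(leq(y, z) for z in ubs)]
    return least[0] if least else None  # at most one, by antisymmetry

print(lub({2, 3}))   # 6
print(lub(set()))    # 1 : lub {} is the least element, when it exists
```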
## Domains

A chain is a countably infinite non-decreasing sequence x_0 ⊑ x_1 ⊑ ...

The limit of a chain C is its lub ⊔C, when it exists. A chain C is interesting if ⊔C ∉ C. (Chains with finitely many distinct elements are uninteresting.)

A poset P is a predomain (or complete partial order, cpo) if P contains the limits of all its chains. A predomain P is a domain (or pointed cpo) if P also has a least element ⊥.

In semantic domains, ⊑ is an order based on information content: x ⊑ y (x approximates y, y is a refinement of x) if x yields the same results as y in all contexts when it terminates, but may diverge in more contexts.

## Lifting

Any set S can be viewed as a predomain with the discrete partial order ⊑ = I_S.

The lifting P⊥ of a predomain P is the domain D = P ∪ {⊥}, where ⊥ ∉ P, and x ⊑_D y iff x = ⊥ or x ⊑_P y.

(The slide shows a Hasse diagram at this point: a poset with elements a through g on the left is lifted, on the right, by placing a new least element ⊥ beneath all of them.)

D is a flat domain if D − {⊥} is discretely ordered by ⊑.

## Continuous Functions

If P and P′ are predomains, f ∈ P → P′ is a continuous function from P to P′ if it maps limits to limits:

    f (⊔ { x_i | x_i ∈ C }) = ⊔′ { f x_i | x_i ∈ C }      for every chain C ⊆ P

Continuous functions are monotone: consider chains x ⊑ y ⊑ y ⊑ ...

There are non-continuous monotone functions: let P contain the interesting chain C = (x_0 ⊑ x_1 ⊑ ...) with a limit x in P, and let P′ = {⊥, ⊤} with ⊥ ⊑′ ⊤.
Then f = { [x_i, ⊥] | x_i ∈ C } ∪ { [x, ⊤] } is monotone but not continuous:

    ⊔′ { f x_i | x_i ∈ C } = ⊥ ≠ ⊤ = f (⊔ C)

## Monotone vs Continuous Functions

If f ∈ P → P′ is monotone, then f is continuous iff

    f (⊔_i x_i) ⊑′ ⊔_i (f x_i)      for all interesting chains x_i (i ∈ N) in P

Proof. For uninteresting chains: if ⊔_i x_i = x_n, then ⊔_i (f x_i) = f x_n = f (⊔_i x_i).

For interesting chains, prove the opposite approximation:

    (∀i ∈ N. x_i ⊑ ⊔_j x_j)  ⇒  (∀i ∈ N. f x_i ⊑′ f (⊔_j x_j))  ⇒  ⊔_i (f x_i) ⊑′ f (⊔_i x_i)

## The (Pre)domain of Continuous Functions

Pointwise ordering on functions in P → P′, where P′ is a predomain:

    f ⊑→ g  ⇐⇒  ∀x ∈ P. f x ⊑′ g x

Proposition: if both P and P′ are predomains, then the set [P → P′] of continuous functions from P to P′ with partial order ⊑→ is a predomain, with

    ⊔_i f_i = λx ∈ P. ⊔′_i (f_i x)

If P′ is a domain, then [P → P′] is a domain, with ⊥→ = λx ∈ P. ⊥′.

Proof that [P → P′] is a predomain: let f_i be a chain in [P → P′], and let f = λx ∈ P. ⊔′_i (f_i x). (⊔′_i (f_i x) exists because f_0 x ⊑′ f_1 x ⊑′ ..., since f_0 ⊑→ f_1 ⊑→ ... and P′ is a predomain.) f_i ⊑→ f since ∀x ∈ P. f_i x ⊑′ f x; hence f is an upper bound of {f_i}. If g is such that ∀i ∈ N. f_i ⊑→ g, then ∀x ∈ P. f_i x ⊑′ g x, hence ∀x ∈ P. f x ⊑′ g x, i.e. f ⊑→ g. So f is the limit of the f_i ... but is f continuous, so that it is in [P → P′]? Yes: if x_j is a chain in P, then

    f (⊔_j x_j) = ⊔′_i f_i (⊔_j x_j) = ⊔′_i ⊔′_j f_i x_j = ⊔′_j ⊔′_i f_i x_j = ⊔′_j f x_j

## Some Continuous Functions

For predomains P, P′, P″:

- if f ∈ P → P′ is a constant function, then f ∈ [P → P′]
- I_P ∈ [P → P]
- if f ∈ [P → P′] and g ∈ [P′ → P″], then g · f ∈ [P → P″]
- if f ∈ [P → P′], then (− · f) ∈ [[P′ → P″] → [P → P″]]
- if f ∈ [P′ → P″], then (f · −) ∈ [[P → P′] → [P → P″]]

## Strict Functions and Lifting

If D and D′ are domains, f ∈ D → D′ is strict if f ⊥ = ⊥′.

If P and P′ are predomains and f ∈ P → P′, then the strict function
    f⊥ ≝ λx ∈ P⊥.  f x,   if x ∈ P
                   ⊥′,    if x = ⊥

is the lifting of f to P⊥ → P′⊥. If P′ is a domain (so that ⊥′ ∈ P′), then the strict function

    f⊥⊥ ≝ λx ∈ P⊥.  f x,   if x ∈ P
                    ⊥′,    if x = ⊥

is the source lifting of f to P⊥ → P′. If f is continuous, so are f⊥ and f⊥⊥; the operators (−)⊥ and (−)⊥⊥ are also continuous.

## Least Fixed-Point

If f ∈ S → S, then x ∈ S is a fixed-point of f if x = f x.

Theorem (Least Fixed-Point of a Continuous Function). If D is a domain and f ∈ [D → D], then

    x ≝ ⊔_{i=0..∞} f^i ⊥

is the least fixed-point of f.

Proof. x exists because ⊥ ⊑ f ⊥ ⊑ ... ⊑ f^i ⊥ ⊑ f^{i+1} ⊥ ⊑ ... is a chain. x is a fixed-point because

    f x = f (⊔_{i=0..∞} f^i ⊥) = ⊔_{i=0..∞} f (f^i ⊥) = ⊔_{i=1..∞} f^i ⊥ = ⊔_{i=0..∞} f^i ⊥ = x

For any fixed-point y of f: ⊥ ⊑ y ⇒ f ⊥ ⊑ f y = y, so by induction ∀i ∈ N. f^i ⊥ ⊑ y; therefore x = ⊔_{i=0..∞} (f^i ⊥) ⊑ y.

## The Least Fixed-Point Operator

Let

    Y_D = λf ∈ [D → D]. ⊔_{i=0..∞} f^i ⊥

Then for each f ∈ [D → D], Y_D f is the least fixed-point of f, and Y_D ∈ [[D → D] → D].

## Semantics of Loops

The semantic equation

    [[while b do c]]comm σ = ([[while b do c]]comm)⊥⊥ ([[c]]comm σ),   if [[b]]boolexp σ = true
                             σ,                                        otherwise

implies that [[while b do c]]comm is a fixed-point of

    F ≝ λf ∈ [Σ → Σ⊥]. λσ ∈ Σ.  f⊥⊥ ([[c]]comm σ),   if [[b]]boolexp σ = true
                                σ,                    otherwise

We pick the least fixed-point:

    [[while b do c]]comm ≝ Y_{[Σ → Σ⊥]} F

## Semantics of Loops: Intuition

    w_0     ≝ while true do skip               [[w_0]]comm = λσ. ⊥
    w_{i+1} ≝ if b then (c ; w_i) else skip    [[w_{i+1}]]comm = F [[w_i]]comm

The loop while b do c behaves like w_i from state σ if the loop evaluates the condition n ≤ i times:

    [[w_i]]comm σ = [[while b do c]]comm σ,   if n ≤ i
                    ⊥,                        if n > i, or if the loop fails to terminate
                                              (and then [[while b do c]]comm σ = ⊥ = [[w_i]]comm σ)
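The construction Y F = ⊔ F^i ⊥ can be watched converge concretely when the state space is finite. The sketch below uses our own encoding (a state is just the value of x, drawn from {0,...,5}, and None stands for ⊥) and iterates F for the loop `while x ≠ 0 do x := x - 1`.

```python
# Kleene iteration: compute Y F = ⊔ F^i ⊥ for a loop over a finite state
# space, so the chain ⊥ ⊑ F⊥ ⊑ F(F⊥) ⊑ ... stabilizes after finitely
# many steps. The loop is `while x ≠ 0 do x := x - 1`; None models ⊥.

STATES = range(6)
b = lambda x: x != 0          # [[x ≠ 0]]boolexp
c = lambda x: x - 1           # [[x := x-1]]comm on this state space

def F(f):
    """One unfolding: F f = λσ. f⊥⊥([[c]]σ) if [[b]]σ, else σ."""
    # f[c(x)] may be None (⊥), and None propagates, as with (−)⊥⊥.
    return {x: (f[c(x)] if b(x) else x) for x in STATES}

f = {x: None for x in STATES}     # ⊥ of [Σ -> Σ⊥]: everywhere divergent
while F(f) != f:                  # iterate up the chain to the fixed point
    f = F(f)

print(f)   # {0: 0, 1: 0, 2: 0, 3: 0, 4: 0, 5: 0}: every state reaches x = 0
```

Each iterate corresponds to one of the approximants w_i above: after i iterations, exactly the states that exit the loop within i tests have a non-⊥ result.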
Hence

    ∀σ ∈ Σ. [[while b do c]]comm σ = ⊔_{n=0..∞} [[w_n]]comm σ
    ⇒  [[while b do c]]comm = Y_{[Σ → Σ⊥]} F

## Variable Declarations

Syntax:

    comm ::= newvar var := intexp in comm

Semantics:

    [[newvar v := e in c]]comm σ ≝ ([− | v: σ v])⊥⊥ ([[c]]comm [σ | v: [[e]]intexp σ])
                                 = ⊥,              if σ′ = ⊥
                                   [σ′ | v: σ v],  otherwise

where σ′ = [[c]]comm [σ | v: [[e]]intexp σ].

newvar v := e in c binds v in c, but not in e:

    FV(newvar v := e in c) = (FV(c) − {v}) ∪ FV(e)

## Problems with Substitutions

Only variables are allowed on the left of an assignment, so substitution cannot be defined as for predicate logic:

    (x:= x+1)/(x → 10) = 10:= 10+1

is not a command. We have to require δ ∈ var → var; then

    (v := e)/δ = (δ v) := (e/(c_var · δ))
    (c_0 ; c_1)/δ = (c_0/δ) ; (c_1/δ)
    ...
    (newvar v := e in c)/δ = newvar u := (e/(c_var · δ)) in (c/[δ | v: u])
                             where u ∉ { δ w | w ∈ FV(c) − {v} }

## Assigned Variables

Hence it is useful to know which variables are assigned to:

    FA(v := e) = {v}
    FA(c_0 ; c_1) = FA(c_0) ∪ FA(c_1)
    ...
    FA(newvar v := e in c) = FA(c) − {v}

Note that FA(c) ⊆ FV(c).

## Coincidence Theorem for Commands

The meaning of a command now involves more than the mapping of its free variables: all non-free variables keep the values they had before c was executed,

    [[c]]comm σ v = σ v      if [[c]]comm σ ≠ ⊥ and v ∉ FV(c)

Coincidence Theorem:

(a) If σ u = σ′ u for all u ∈ FV(c), then either [[c]]comm σ = ⊥ = [[c]]comm σ′, or ∀v ∈ FV(c). [[c]]comm σ v = [[c]]comm σ′ v.

(b) If [[c]]comm σ ≠ ⊥, then [[c]]comm σ v = σ v for all v ∉ FA(c).
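The FV and FA definitions above can be sketched over a tuple encoding of commands; the constructor tags are our own, not from the text.

```python
# Free variables FV and assigned variables FA over a tuple-encoded AST
# (tags "const"/"var"/"assign"/"seq"/"newvar"/... are our own encoding).

def fv_exp(e):
    """Free variables of an integer expression."""
    tag = e[0]
    if tag == "const":
        return set()
    if tag == "var":
        return {e[1]}
    return set().union(*(fv_exp(a) for a in e[1:]))   # neg/add/mul/...

def FV(c):
    tag = c[0]
    if tag == "assign":                 # FV(v := e) = {v} ∪ FV(e)
        return {c[1]} | fv_exp(c[2])
    if tag == "skip":
        return set()
    if tag == "seq":
        return FV(c[1]) | FV(c[2])
    if tag == "newvar":                 # binds v in c, but not in e
        return (FV(c[3]) - {c[1]}) | fv_exp(c[2])
    raise ValueError(tag)

def FA(c):
    tag = c[0]
    if tag == "assign":                 # FA(v := e) = {v}
        return {c[1]}
    if tag == "skip":
        return set()
    if tag == "seq":
        return FA(c[1]) | FA(c[2])
    if tag == "newvar":                 # FA(newvar v := e in c) = FA(c) − {v}
        return FA(c[3]) - {c[1]}
    raise ValueError(tag)

# newvar x := y in (x := x+1 ; z := x)
prog = ("newvar", "x", ("var", "y"),
        ("seq", ("assign", "x", ("add", ("var", "x"), ("const", 1))),
                ("assign", "z", ("var", "x"))))
print(sorted(FV(prog)))   # ['y', 'z']
print(sorted(FA(prog)))   # ['z']
```

Note that FA(prog) ⊆ FV(prog), as the slide claims in general.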
## More Trouble with Substitutions

Recall that for predicate logic, [[−]](([[−]]intexp σ) · δ) = [[−/δ]]σ. The corresponding property for commands,

    [[−]]comm (σ · δ) = ([[−/δ]]comm σ) · δ

fails in general due to aliasing:

    (x:= x+1 ; y:= y*2)/[x: z | y: z] = (z:= z+1 ; z:= z*2)
    [x: 2 | y: 2] = [z: 2] · [x: z | y: z]

but

    [[x:= x+1 ; y:= y*2]]comm [x: 2 | y: 2] = [x: 3 | y: 4]
    ([[z:= z+1 ; z:= z*2]]comm [z: 2]) · [x: z | y: z] = [z: 6] · [x: z | y: z] = [x: 6 | y: 6]

Substitution Theorem for Commands: if δ ∈ var → var is an injection on a set V ⊇ FV(c), and σ and σ′ are such that σ′ v = σ (δ v) for all v ∈ V, then

    ([[c]]comm σ′) v = (([[c/δ]]comm σ) · δ) v      for all v ∈ V

## Abstractness of Semantics

Abstract semantics are an attempt to separate the important properties of a language (what computations it can express) from the unimportant (how exactly computations are represented). The more terms a semantics considers equal, the more abstract it is.

A semantic function [[−]]1 is at least as abstract as [[−]]0 if [[−]]1 equates all terms that [[−]]0 does:

    ∀c, c′. [[c]]0 = [[c′]]0 ⇒ [[c]]1 = [[c′]]1

## Soundness of Semantics

If there are other means of observing the result of a computation, a semantics may be incorrect if it equates too many terms.

𝒞 = the set of contexts: terms with a hole •. A term c can be placed in the hole of a context C, yielding the term C[c] (this is not substitution: variable capture is possible).

Example: if C = newvar x:= 1 in •, then C[x:= x+1] = newvar x:= 1 in x:= x+1.

𝒪 ⊆ terms → outcomes: the set of observations.

A semantic function [[−]] is sound iff

    ∀c, c′. [[c]] = [[c′]] ⇒ ∀O ∈ 𝒪. ∀C ∈ 𝒞. O(C[c]) = O(C[c′])

## Fully Abstract Semantics

Recap: [[−]]1 is at least as abstract as [[−]]0 if [[−]]1 equates all terms that [[−]]0 does, and [[−]] is sound iff no observation distinguishes terms that [[−]] equates. A semantics is fully abstract iff

    ∀c, c′. [[c]] = [[c′]]  ⇐⇒  ∀O ∈ 𝒪. ∀C ∈ 𝒞. O(C[c]) = O(C[c′])

i.e. iff it is a "most abstract" sound semantics.
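The aliasing counterexample from the substitution slides above can be replayed concretely; the two functions below hand-code the meanings of the original command and of its image under δ = [x: z | y: z], in the dict-state model used informally throughout.

```python
# Replaying the aliasing counterexample: substituting z for both x and y
# changes the meaning of (x := x+1 ; y := y*2).

def original(s):
    """[[x := x+1 ; y := y*2]]comm, hand-coded."""
    s = {**s, "x": s["x"] + 1}       # x := x+1
    return {**s, "y": s["y"] * 2}    # y := y*2

def substituted(s):
    """[[z := z+1 ; z := z*2]]comm, the image under δ = [x: z | y: z]."""
    s = {**s, "z": s["z"] + 1}
    return {**s, "z": s["z"] * 2}

# [[c]]comm [x: 2 | y: 2]
print(original({"x": 2, "y": 2}))    # {'x': 3, 'y': 4}

# ([[c/δ]]comm [z: 2]) · δ : read x and y back through z
t = substituted({"z": 2})            # {'z': 6}
print({"x": t["z"], "y": t["z"]})    # {'x': 6, 'y': 6}: not the same state
```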
## Full Abstractness of Semantics for SIL

Consider observations O_{σ,v} ∈ 𝒪 ≝ comm → Z⊥ observing the value of variable v after executing from state σ:

    O_{σ,v}(c) = ⊥,               if [[c]]comm σ = ⊥
                 [[c]]comm σ v,   otherwise
               = ((−) v)⊥ ([[c]]comm σ)

[[−]]comm is fully abstract (with respect to the observations 𝒪):

- [[−]]comm is sound: by compositionality, if [[c]]comm = [[c′]]comm, then [[C[c]]]comm = [[C[c′]]]comm for any context C (by induction); hence O(C[c]) = O(C[c′]) for any observation O.
- [[−]]comm is most abstract: consider the empty context C = •; if O_{σ,v}(c) = O_{σ,v}(c′) for all v ∈ var and σ ∈ Σ, then [[c]] = [[c′]].

## Observing Termination of Closed Commands

It even suffices to observe whether closed commands terminate: if [[c]]comm ≠ [[c′]]comm, we can construct a context that distinguishes c and c′.

Suppose [[c]]comm σ ≠ [[c′]]comm σ for some σ. Let {v_i | i ∈ 1 to n} ≝ FV(c) ∪ FV(c′), and let κ_i be constants such that [[κ_i]]intexp σ′ = σ v_i (for every state σ′, since they are constants). Then by the Coincidence Theorem,

    [[c]]comm [σ′ | v_i: σ v_i]_{i ∈ 1 to n} ≠ [[c′]]comm [σ′ | v_i: σ v_i]_{i ∈ 1 to n}

for any state σ′.

## Observing Termination, Cont'd

Consider the context C closing both c and c′:

    C ≝ newvar v_1:= κ_1 in ... newvar v_n:= κ_n in •

C[c] and C[c′] cannot both diverge from an initial state σ′, since

    [[C[c]]]comm σ′ = ([− | v_i: σ′ v_i]_{i ∈ 1 to n})⊥⊥ ([[c]]comm [σ′ | v_i: σ v_i]_{i ∈ 1 to n})

so [[C[c]]]comm σ′ = ⊥ = [[C[c′]]]comm σ′ would require

    [[c]]comm [σ′ | v_i: σ v_i]_{i ∈ 1 to n} = ⊥ = [[c′]]comm [σ′ | v_i: σ v_i]_{i ∈ 1 to n}

but, by assumption and Coincidence, the initial state [σ′ | v_i: σ v_i]_{i ∈ 1 to n} distinguishes c and c′.

If only one of C[c] and C[c′] terminates, then the termination observations on C already distinguish them. If both C[c] and C[c′] terminate, then [[c]]comm σ̂ ≠ ⊥ ≠ [[c′]]comm σ̂ for σ̂ = [σ′ | v_i: σ v_i]_{i ∈ 1 to n}, and hence [[c]]comm σ̂ v = [[κ]]intexp σ′ ≠ [[c′]]comm σ̂ v for some variable v and constant κ. Then for the context

    D ≝ C[(• ; while v ≠ κ do skip)]

we have [[D[c]]]comm σ′ ≠ ⊥ = [[D[c′]]]comm σ′, so the termination observations distinguish D[c] and D[c′].
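The observations O_{σ,v} can be sketched in the same dict-and-None model used informally above; `double_x` and `loop_forever` are made-up command meanings for illustration.

```python
# A sketch of the observations O_{σ,v}: given a command meaning m (a
# function from dict-states to dict-states, None for ⊥), report the value
# of v in the final state, or None if the command diverges from σ.

def observe(sigma, v):
    """O_{σ,v} as a function on command meanings."""
    def O(m):
        out = m(sigma)
        return None if out is None else out[v]   # the lifted ((−) v)
    return O

double_x = lambda s: {**s, "x": s["x"] * 2}   # a terminating command meaning
loop_forever = lambda s: None                 # [[while true do skip]]

O = observe({"x": 21}, "x")
print(O(double_x))       # 42
print(O(loop_forever))   # None
```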