Signal Processing and Linear Systems - B. P. Lathi



... virtue of orthogonality; that is, all terms of ...

$I = \int_{T_0} \cos n\omega_0 t \, \cos m\omega_0 t \; dt$   (3.96a)

where $\int_{T_0}$ stands for integration over any contiguous interval of $T_0$ seconds. By using a trigonometric identity (see Sec. B.7-6), Eq. (3.96a) can be expressed as

$I = \frac{1}{2}\left[\int_{T_0} \cos (n+m)\omega_0 t \; dt + \int_{T_0} \cos (n-m)\omega_0 t \; dt\right]$   (3.96b)

Since $\cos \omega_0 t$ executes one complete cycle during any interval of duration $T_0$, $\cos (n+m)\omega_0 t$ executes $(n+m)$ complete cycles during any interval of duration $T_0$. Therefore, the first integral in Eq. (3.96b), which represents the area under $(n+m)$ complete cycles of a sinusoid, equals zero. The same argument shows that the second integral in Eq. (3.96b) is also zero, except when $n = m$. Hence $I$ in Eq. (3.96) is zero for all $n \neq m$. When $n = m$, the first integral in Eq. (3.96b) is still zero, but the second integral yields

$I = \frac{1}{2}\int_{T_0} dt = \frac{T_0}{2}$

Thus

$\int_{T_0} \cos n\omega_0 t \, \cos m\omega_0 t \; dt = \begin{cases} 0 & n \neq m \\ \frac{T_0}{2} & n = m \neq 0 \end{cases}$   (3.97a)

Using similar arguments, we can show that

$\int_{T_0} \sin n\omega_0 t \, \sin m\omega_0 t \; dt = \begin{cases} 0 & n \neq m \\ \frac{T_0}{2} & n = m \neq 0 \end{cases}$   (3.97b)

and

$\int_{T_0} \sin n\omega_0 t \, \cos m\omega_0 t \; dt = 0 \qquad \text{for all } n \text{ and } m$   (3.97c)

Appendix 3C: Orthogonality of the Exponential Signal Set

The set of exponentials $e^{jn\omega_0 t}$ $(n = 0, \pm 1, \pm 2, \ldots)$ is orthogonal over any interval of duration $T_0$; that is,

$\int_{T_0} e^{jm\omega_0 t} \left(e^{jn\omega_0 t}\right)^{*} dt = \int_{T_0} e^{j(m-n)\omega_0 t} \, dt = \begin{cases} 0 & m \neq n \\ T_0 & m = n \end{cases}$   (3.98)

Let the integral on the left-hand side of Eq. (3.98) be $I$:

$I = \int_{T_0} e^{j(m-n)\omega_0 t} \, dt$   (3.99)

The case $m = n$ is trivial: the integrand is unity, and $I = T_0$. When $m \neq n$,

$I = \frac{1}{j(m-n)\omega_0}\, e^{j(m-n)\omega_0 t} \Big|_{t_1}^{t_1 + T_0} = \frac{e^{j(m-n)\omega_0 t_1}}{j(m-n)\omega_0}\left[e^{j(m-n)\omega_0 T_0} - 1\right] = 0$

The last result follows from the fact that $\omega_0 T_0 = 2\pi$ and $e^{j2\pi k} = 1$ for all integral values of $k$.

3.9 Summary

This chapter discusses the foundations of signal representation in terms of its components. There is a perfect analogy between vectors and signals; the analogy is so strong that the term 'analogy' understates the reality. Signals are not just like vectors. Signals are vectors. The inner or scalar product of two (real) signals is the area under the product of the two signals. If this inner or scalar product is zero, the signals are said to be orthogonal. A signal $f(t)$ has a component $c\,x(t)$, where $c$ is the inner product of $f(t)$ and $x(t)$ divided by $E_x$, the energy of $x(t)$. A good measure of similarity of two signals $f(t)$ and $x(t)$ is the correlation coefficient $c_n$, which is equal to the inner product of $f(t)$ and $x(t)$ divided by $\sqrt{E_f E_x}$. It can be shown that $-1 \leq c_n \leq 1$. The maximum similarity ($c_n = 1$) occurs only when the two signals have the same waveform within a (positive) multiplicative constant, that is, when $f(t) = Kx(t)$. The maximum dissimilarity ($c_n = -1$) occurs only when $f(t) = -Kx(t)$. Zero similarity ($c_n = 0$) occurs when the signals are orthogonal. In binary communication, where we are required to distinguish between the two known waveforms in the presence of noise and distortion, selecting the two waveforms with maximum dissimilarity ($c_n = -1$) provides maximum distinguishability. Just as a vector can be represente...
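As a quick numerical cross-check of the orthogonality relations (3.97a) and (3.98), not part of the original text, the sketch below integrates a few products of sinusoids and exponentials over one period. It assumes NumPy and SciPy are available; the period $T_0$, the starting point $t_1$, and the harmonic indices are arbitrary illustrative choices.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative (hypothetical) parameters: any period T0 and any start t1 work.
T0 = 2.0
w0 = 2 * np.pi / T0
t1 = 0.37                      # arbitrary start of the contiguous interval

def integrate_over_T0(g):
    """Numerically integrate a (possibly complex) function g(t) over [t1, t1 + T0]."""
    re, _ = quad(lambda t: np.real(g(t)), t1, t1 + T0)
    im, _ = quad(lambda t: np.imag(g(t)), t1, t1 + T0)
    return re + 1j * im

# Eq. (3.97a): result is 0 for n != m and T0/2 for n = m != 0.
for n, m in [(2, 5), (3, 3)]:
    I = integrate_over_T0(lambda t: np.cos(n * w0 * t) * np.cos(m * w0 * t))
    print(f"cos({n} w0 t) x cos({m} w0 t) over T0 = {I.real:+.6f}")

# Eq. (3.98): exp(j m w0 t) against conj(exp(j n w0 t)); 0 for m != n, T0 for m = n.
for n, m in [(1, 4), (2, 2)]:
    I = integrate_over_T0(lambda t: np.exp(1j * m * w0 * t) * np.conj(np.exp(1j * n * w0 * t)))
    print(f"exp(j{m} w0 t) x exp(-j{n} w0 t) over T0 = {abs(I):.6f}")
```

Because the integrands complete whole cycles over any interval of length $T_0$, the printed values should be numerically zero except for the $n = m$ cases, which return $T_0/2$ and $T_0$ respectively.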
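The correlation coefficient described in the summary can also be illustrated numerically. The following is a minimal sketch, not from the book, again assuming NumPy and SciPy; the signals and the observation interval are hypothetical choices meant only to exhibit the $c_n = +1$, $-1$, and $0$ cases.

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical example signals on an observation interval [0, T].
T = 1.0
f  = lambda t: np.sin(2 * np.pi * t)           # reference signal f(t)
x1 = lambda t: 3.0 * np.sin(2 * np.pi * t)     # same waveform, positive scale -> c_n = +1
x2 = lambda t: -0.5 * np.sin(2 * np.pi * t)    # negated waveform              -> c_n = -1
x3 = lambda t: np.cos(2 * np.pi * t)           # orthogonal to f over [0, T]   -> c_n = 0

def correlation_coefficient(f, x):
    """c_n = <f, x> / sqrt(Ef * Ex) for real signals, with <f, x> the area under f(t) x(t)."""
    inner, _ = quad(lambda t: f(t) * x(t), 0.0, T)
    Ef, _ = quad(lambda t: f(t) ** 2, 0.0, T)
    Ex, _ = quad(lambda t: x(t) ** 2, 0.0, T)
    return inner / np.sqrt(Ef * Ex)

for label, x in [("x1 = +3 f", x1), ("x2 = -0.5 f", x2), ("x3 orthogonal", x3)]:
    print(f"{label}: c_n = {correlation_coefficient(f, x):+.4f}")
```

Here $c_n$ reaches $+1$ when $x(t)$ is a positive multiple of $f(t)$, $-1$ when it is a negative multiple, and $0$ when the two signals are orthogonal over the interval, matching the bounds $-1 \leq c_n \leq 1$ stated in the summary.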