CIS 5371 Cryptography
3. Probability & Information Theory

Basic rules of probability
Notation
Events: S, ∅, E, F, ..., E ∪ F, E ∩ F, ...
Pr[S] = 1, Pr[∅] = 0, 0 ≤ Pr[E] ≤ 1
Pr[E ∪ F] = Pr[E] + Pr[F] − Pr[E ∩ F]
Complement: Ē = S \ E, so Pr[Ē] = 1 − Pr[E]
If E ⊆ F then Pr[E] ≤ Pr[F] ≤ Pr[E ∪ F]
Conditional probability: Pr[F | E] = Pr[E ∩ F] / Pr[E]
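These rules can be checked mechanically on a small finite sample space. The sketch below is a minimal illustration, assuming a fair six-sided die and the event choices E ("even") and F ("at least 4"); it verifies inclusion-exclusion, the complement rule, and conditional probability with exact rational arithmetic:

```python
from fractions import Fraction

# Sample space: a fair six-sided die (illustrative choice, not from the slides).
S = {1, 2, 3, 4, 5, 6}

def pr(event):
    """Probability of an event (a subset of S) under the uniform distribution."""
    return Fraction(len(event), len(S))

E = {2, 4, 6}   # "roll is even"
F = {4, 5, 6}   # "roll is at least 4"

# Inclusion-exclusion: Pr[E ∪ F] = Pr[E] + Pr[F] - Pr[E ∩ F]
assert pr(E | F) == pr(E) + pr(F) - pr(E & F)

# Complement rule: Pr[S \ E] = 1 - Pr[E]
assert pr(S - E) == 1 - pr(E)

# Conditional probability: Pr[F | E] = Pr[E ∩ F] / Pr[E]
print(pr(E & F) / pr(E))   # 2/3
```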
Basic rules of probability
An experiment can yield one of n equally probable outcomes.
Event E1: one particular value occurs.  Pr[E1] = 1/n
Event E2: m of the n values occur.  Pr[E2] = m/n
Event E3: an Ace is drawn from a pack of 52 cards.  Pr[E3] = 4/52 = 1/13
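The Ace example is just counting. A minimal counting check, where the (rank, suit) deck encoding is an illustrative assumption rather than anything from the slides:

```python
from fractions import Fraction
from itertools import product

# Build a standard 52-card deck as (rank, suit) pairs (illustrative encoding).
ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
suits = ['clubs', 'diamonds', 'hearts', 'spades']
deck = list(product(ranks, suits))

# Event E3: the drawn card is an Ace.
aces = [card for card in deck if card[0] == 'A']
p_ace = Fraction(len(aces), len(deck))
print(p_ace)   # 1/13
```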
Basic rules of probability
Binomial distribution:
Pr[k successes in n trials] = C(n, k) · p^k · (1 − p)^(n−k)

Bayes' Law:
Pr[E | F] = Pr[F | E] · Pr[E] / Pr[F]
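Both formulas evaluate directly. A sketch (the function names binomial and bayes are my own, not from the slides) using Python's math.comb for the binomial coefficient:

```python
from fractions import Fraction
from math import comb

def binomial(k, n, p):
    """Pr[k successes in n independent trials, each succeeding with probability p]."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def bayes(pr_f_given_e, pr_e, pr_f):
    """Bayes' law: Pr[E | F] = Pr[F | E] * Pr[E] / Pr[F]."""
    return pr_f_given_e * pr_e / pr_f

p = Fraction(1, 2)
print(binomial(3, 10, p))                          # 15/128: exactly 3 heads in 10 fair flips
print(sum(binomial(k, 10, p) for k in range(11)))  # 1: the probabilities sum to one
```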
Birthday Paradox
Let f : X → Y, where Y is a set of n elements.
Event E(k, ε): for k pairwise distinct values x1, x2, ..., xk, a collision
f(xi) = f(xj) occurs, for some i ≠ j, with probability at least ε.
Birthday Paradox: if ε = 1/2 then k ≈ 1.1774 √n.
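The bound is easy to check numerically. A sketch, assuming the outputs of f are independent and uniform over the n values, that computes the exact collision probability for the classic n = 365 case:

```python
from math import ceil, sqrt

def collision_prob(k, n):
    """Exact probability that k independent uniform draws from n values collide."""
    p_no_collision = 1.0
    for i in range(k):
        p_no_collision *= (n - i) / n   # the (i+1)-th draw avoids the first i
    return 1.0 - p_no_collision

n = 365
k = ceil(1.1774 * sqrt(n))        # the birthday-paradox estimate: 23 for n = 365
print(k, collision_prob(k, n))    # the collision probability crosses 1/2 at this k
```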
Information Theory
Entropy (Shannon)
The entropy of a message source is a measure of the amount of
information the source conveys.
Let L = {a1, ..., an} be a language with n letters. S is a source that
outputs these letters with independent probabilities Pr[a1], ..., Pr[an].
The entropy of S is:

H(S) = Σ (i = 1 to n) Pr[ai] · log2(1 / Pr[ai])  bits

Information Theory
Example
A source S outputs a uniformly random bit. Its entropy is:

H(S) = Pr[0] · log2(1/Pr[0]) + Pr[1] · log2(1/Pr[1])
     = (1/2)(1) + (1/2)(1) = 1 bit
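The entropy formula is a one-liner in code. This sketch reproduces the fair-bit example above and, for contrast, a biased bit (the 0.25/0.75 split is an illustrative choice, not from the slides):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(S) = sum_i Pr[a_i] * log2(1 / Pr[a_i]), in bits."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # 1.0: a uniformly random bit carries one bit of information
print(entropy([0.25, 0.75]))  # about 0.811: a biased bit carries less
```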
This note was uploaded on 12/11/2011 for the course CIS 5371 taught by Professor Mascagni during the Fall '11 term at FSU.