

Weichih Sun
4/11/17
TIM 158 Homework 1

Problem      Estimated Time    Actual Time
Problem 1    1 Hour
Problem 2    1 Hour
Problem 3    1 Hour
Problem 4    2 Hours

1. Entropy of a 2-event source

Step 1: Define the Problem
(a) Write an expression for the entropy H(p) in bits of a 2-event source, where p is the probability of the first event.
(b) Determine the value of p which maximizes H(p).
(c) Draw a graph of H(p) as a function of p, and use it to check the answer in part (b).
(d) Carefully study the graph of H(p) and draw relevant conclusions.

Step 2: Create a Plan
1. For a 2-event source, write an expression for H(p).
2. Determine the value of p which maximizes H(p).
3. Create a graph of H(p).
4. Draw conclusions using the graph.

Step 3: Execute the Plan

1. For a 2-event source, write an expression for H(p).
A 2-event source has event P_1 with probability p and event P_2 with probability (1 - p). With all logarithms taken base 2, the entropy is

H(p) = -[p \log_2 p + (1 - p) \log_2 (1 - p)]

Using the change of base \log_2 x = \frac{\ln x}{\ln 2}, this can also be written as

H(p) = -\frac{p \ln p + (1 - p) \ln (1 - p)}{\ln 2}

2. Determine the value of p which maximizes H(p).
To find the value of p that maximizes H(p), set the derivative of H(p) with respect to p equal to zero:

\frac{dH(p)}{dp} = \log_2\left(\frac{1 - p}{p}\right) = 0

which requires (1 - p)/p = 1, i.e. p = 0.5. At that point, H(0.5) = 1 bit.
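As a numerical sanity check, here is a minimal Python sketch (assuming NumPy and matplotlib are available; the function name binary_entropy is illustrative) that evaluates H(p) on a grid, confirms the maximum at p = 0.5, and produces the graph discussed in steps 3 and 4 below:

```python
import numpy as np
import matplotlib.pyplot as plt

def binary_entropy(p):
    """Entropy H(p) of a 2-event source, in bits; 0 by convention at p = 0 or 1."""
    p = np.asarray(p, dtype=float)
    h = np.zeros_like(p)
    inside = (p > 0) & (p < 1)
    q = p[inside]
    h[inside] = -(q * np.log2(q) + (1 - q) * np.log2(1 - q))
    return h

ps = np.linspace(0, 1, 1001)
hs = binary_entropy(ps)

# The maximum should sit at p = 0.5 with H = 1 bit.
print(ps[np.argmax(hs)], hs.max())  # -> 0.5 1.0

plt.plot(ps, hs)
plt.xlabel("p (probability of the first event)")
plt.ylabel("H(p) in bits")
plt.title("Entropy of a 2-event source")
plt.show()
```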


3. Create a graph of H(p).

[Graph of H(p) as a function of p, as produced by the sketch above: a symmetric curve rising from 0 at p = 0 to a maximum of 1 bit at p = 0.5, then falling back to 0 at p = 1.]

4. Draw conclusions using the graph.
From the graph we can see that entropy increases as p increases until it reaches its maximum at p = 0.5. Beyond that point, entropy decreases as p continues to increase.

Step 4: Check your Work
1. From this we can tell that entropy measures the information we expect to receive, i.e. the average information taken over all possible outcomes. For a 2-event source, entropy is highest when the two events are equally likely.

2. Entropy for a Uniform Distribution

Step 1: Define the Problem
1. What is the entropy in bits of a random variable with a uniform distribution over 32 (equally likely) outcomes? If a string of n binary digits is used to identify these 32 outcomes, what is the value of n?
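One way to check the arithmetic for this problem is a short Python sketch, using the same base-2 convention as Problem 1 (the names k, p, H, and n here are illustrative):

```python
import math

k = 32        # number of equally likely outcomes
p = 1 / k     # probability of each outcome

# Entropy of a uniform distribution: H = -sum(p * log2(p)) = log2(k).
H = -sum(p * math.log2(p) for _ in range(k))
print(H)             # 5.0 bits

# n binary digits can label 2**n distinct outcomes, so n = log2(32) = 5.
n = int(math.log2(k))
print(n, 2 ** n)     # 5 32
```

Since 2^5 = 32, five binary digits are exactly enough to give each outcome a distinct label.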

