STATS 408 HOMEWORK #5 SOLUTION

1. When would we not want to use range as a measure of variation?

We would not want to use the range to compare sets that differ in the number of items. As the number of items in a dataset grows, the range can only stay the same or increase, so larger datasets tend to show larger ranges regardless of the underlying spread (see the first sketch below).

2. Provide an example to illustrate when the mean absolute deviation may be a poor reflection of variation.

If we add sources of variation, the MAD does not increase. For example, take a basket containing {-1, 1} and add each of its values to each value of an identical basket {-1, 1}; the resulting basket of sums is {-2, 0, 0, 2}. There is more variation in this basket, but its MAD is 1, the same as the MAD of {-1, 1} (see the second sketch below).

3. Give an example of how the standard deviation can mislead if used with spatial data.

Standard deviation can be misleading with spatial data when the starting point of the measurement is rather arbitrary. For example, with angular data the 0-degree mark is much closer to the 350-degree mark than to the 60-degree mark, but this is not reflected in the standard deviation (see the third sketch below).

4. Give an example where you might use the entropy as a reflection of variation.

We would use entropy as a reflection of variation when we want to see how items are distributed across categories. Entropy could be used, for example, to assess people's preferences among ice cream flavors (say, vanilla, chocolate, and strawberry, to keep it simple).

5. A congressional committee learns that members have various opinions regarding an issue. Describe how you would use entropy to learn whether a dialogue on these issues has moved the committee towards a consensus.

In this case, low entropy would correspond to a consensus (all possessing the same opinion) and high entropy would correspond to disagreement (a number of different opinions). To assess this with our measure of entropy, we could categorize the opinions into k categories and calculate the entropy before the discussion using

E = -[p_1 ln p_1 + … + p_k ln p_k],

where p_i is the proportion of members holding opinion i, then recalculate the entropy after the discussion. A drop in entropy would indicate that the dialogue has moved the committee towards consensus (see the last sketch below).
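As a quick illustration of the claim in question 1, here is a minimal Python sketch (a made-up simulation, not part of the original solution) showing that the range of a growing sample is non-decreasing:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=1000)  # one fixed sample from a normal distribution

    # Range (max - min) of the first n observations: it can only grow with n.
    for n in (10, 100, 1000):
        print(n, np.ptp(x[:n]))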
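As a check of the arithmetic in question 2, a minimal Python sketch computing the MAD of both baskets; the comparison with the standard deviation is an addition for contrast:

    import numpy as np

    def mad(x):
        # Mean absolute deviation about the mean.
        x = np.asarray(x, dtype=float)
        return np.mean(np.abs(x - x.mean()))

    basket = np.array([-1.0, 1.0])
    sums = np.add.outer(basket, basket).ravel()  # all pairwise sums: {-2, 0, 0, 2}

    print(mad(basket), mad(sums))        # 1.0 1.0 -- MAD does not increase
    print(np.std(basket), np.std(sums))  # 1.0 ~1.41 -- the standard deviation does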
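To make question 3 concrete, a sketch contrasting the naive standard deviation of angles that straddle the 0/360 wrap-around with a circular measure of spread; the circular standard deviation formula sqrt(-2 ln R) is a standard result, not something from the original solution:

    import numpy as np

    # Angles tightly clustered around 0 degrees, some recorded near 360.
    angles_deg = np.array([350.0, 355.0, 0.0, 5.0, 10.0])

    # Naive standard deviation treats 350 as far from 10.
    print(np.std(angles_deg))  # roughly 170 -- wildly misleading

    # Circular approach: average the unit vectors, then measure spread.
    theta = np.deg2rad(angles_deg)
    R = np.hypot(np.sin(theta).mean(), np.cos(theta).mean())
    print(np.rad2deg(np.sqrt(-2 * np.log(R))))  # roughly 7 degrees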
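Finally, for questions 4 and 5, a minimal sketch computing E = -[p_1 ln p_1 + … + p_k ln p_k] from category counts; the before/after opinion counts are hypothetical:

    import numpy as np

    def entropy(counts):
        # E = -(p_1 ln p_1 + ... + p_k ln p_k), with 0 ln 0 taken as 0.
        p = np.asarray(counts, dtype=float)
        p = p / p.sum()
        p = p[p > 0]  # drop empty categories so the log is defined
        return -np.sum(p * np.log(p))

    before = [5, 4, 3]   # opinions spread across k = 3 categories
    after  = [10, 1, 1]  # most members now agree

    print(entropy(before))  # ~1.08
    print(entropy(after))   # ~0.57 -- the drop suggests movement toward consensus

The same function applies to the ice cream example in question 4: counts split evenly across the three flavors would give the maximum possible entropy, ln 3.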