08-2 - Image compression

Basic Image compression methods
Spring 2008, ELEN 4304/5365 Digital Image Processing
by Gleb V. Tcheslavski: gleb@ee.lamar.edu
http://ee.lamar.edu/gleb/dip/index.htm

Huffman coding

Huffman coding is one of the most popular techniques for removing coding redundancy. It yields the smallest possible average number of code symbols per source symbol. The first step is to create a series of source reductions by ordering the probabilities of the symbols under consideration and combining the lowest-probability symbols into a single symbol that replaces them in the next source reduction. K-ary Huffman coding generalizes this idea by combining the K lowest-probability symbols at each reduction.
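As a rough sketch (not part of the original slides), the source-reduction step can be written in a few lines of Python; the probabilities used below are those of the slide example.

# Illustrative sketch of Huffman source reductions (not from the slides).
def source_reductions(probs):
    """Repeatedly merge the two least-probable symbols until only two remain.

    Returns the list of reduced sources, each sorted in decreasing probability.
    """
    reductions = [sorted(probs, reverse=True)]
    while len(reductions[-1]) > 2:
        current = reductions[-1]
        merged = round(current[-1] + current[-2], 6)   # combine the two smallest
        reductions.append(sorted(current[:-2] + [merged], reverse=True))
    return reductions

for step, source in enumerate(source_reductions([0.4, 0.3, 0.1, 0.1, 0.06, 0.04])):
    print(f"reduction {step}: {source}")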
Huffman coding

A set of symbols and their probabilities are ordered from top to bottom in decreasing order of probability. To form the first source reduction, the bottom two probabilities (0.04 and 0.06) are combined to form a "compound symbol" with probability 0.1. This compound symbol and its associated probability are placed in the first source-reduction column so that the probabilities of the reduced source are also ordered from the most to the least probable. This process is repeated until a reduced source with only two symbols is reached.

The second step is to code each reduced source, starting with the smallest source and working back to the original source. The minimal-length binary code for a two-symbol source consists of the symbols 0 and 1. These code symbols are assigned to the two symbols on the right.
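To make both steps concrete, here is a short Python sketch (not from the slides) that performs the reductions and assigns codewords with a priority queue; the symbol names a1 through a6 are assumed for illustration, while the probabilities match the slide example.

import heapq

def huffman_code(probabilities):
    """Build a binary Huffman code: repeatedly merge the two least-probable
    nodes, then grow each group's codewords one bit at a time."""
    # Each heap entry: (probability, tie-breaker, {symbol: codeword-so-far})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probabilities.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, codes1 = heapq.heappop(heap)   # least probable group
        p2, _, codes2 = heapq.heappop(heap)   # next least probable group
        # Prepend one bit to every codeword in each merged group (0/1 choice is arbitrary).
        merged = {s: "0" + c for s, c in codes2.items()}
        merged.update({s: "1" + c for s, c in codes1.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"a1": 0.4, "a2": 0.3, "a3": 0.1, "a4": 0.1, "a5": 0.06, "a6": 0.04}
codes = huffman_code(probs)
for sym in probs:
    print(sym, probs[sym], codes[sym])

Because Huffman codes are not unique, the individual codeword lengths produced this way may differ from the ones shown on the slides, but the average code length is the same 2.2 bits/symbol.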
Huffman coding

We observe that the assignment of "1" and "0" is arbitrary and can be reversed without any harm. Since the reduced-source symbol with probability 0.6 was generated by combining two symbols in the reduced source to its left, the "0" used to code it is now assigned to both of these symbols, and "0" and "1" are arbitrarily appended to each to distinguish them from each other. This procedure is repeated for each reduced source until the original source is reached. The final code appears as shown. The average code length is:

L_avg = 0.4(1) + 0.3(2) + 0.1(3) + 0.1(4) + 0.06(5) + 0.04(5) = 2.2 bits/pixel

The entropy of the source is 2.14 bits/symbol.

Huffman coding

Huffman's procedure creates the optimal code for a set of symbols and probabilities, where each symbol is coded one at a time. Once the code has been created, coding and/or error-free decoding is accomplished in a simple lookup-table manner. The code itself is an instantaneous, uniquely decodable block code. Block code: each source symbol is mapped into a fixed sequence of code symbols. Instantaneous code: each code word in a string of code symbols can be decoded independently of neighboring symbols.
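As a quick check of these numbers (not part of the original slides), the average code length and source entropy for the slide's probabilities and codeword lengths can be computed directly:

import math

# Probabilities and codeword lengths from the slide example.
probs   = [0.4, 0.3, 0.1, 0.1, 0.06, 0.04]
lengths = [1,   2,   3,   4,   5,    5]

l_avg   = sum(p * l for p, l in zip(probs, lengths))   # average code length
entropy = -sum(p * math.log2(p) for p in probs)        # source entropy

print(f"L_avg   = {l_avg:.2f} bits/symbol")    # 2.20
print(f"entropy = {entropy:.2f} bits/symbol")  # ~2.14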