ELEC317 Digital Image Processing
Lecture 10: Image Compression

The amount of data in images is huge. For example, a 512 x 512 image at 8 bits/pixel takes 512 x 512 x 8 ≈ 2 Mbits. An R-G-B video with 512 x 512 pixels/frame at 30 frames/sec requires about 2 Mbits x 3 x 30 = 180 Mb/s.

There are two big categories of image compression methods:
1. Predictive coding
2. Transform coding

For an image where each pixel takes one of L levels with probabilities p_1, p_2, ..., p_L, the entropy per pixel is

    H = -\sum_{i=1}^{L} p_i \log_2 p_i = E\{-\log_2 p\}    (bits/symbol)

From the noiseless coding theorem, we can code the data at an average bit rate of H bits per symbol without distortion.

e.g. L = 2:

    p_1 = p_2 = 1/2:   H = -\frac{1}{2}\log_2\frac{1}{2} - \frac{1}{2}\log_2\frac{1}{2} = 1
    p_1 = 1/4, p_2 = 3/4:   H < 1

Based on this theorem, if the bit rate of the original data is B and its entropy is H, then the maximum compression ratio is

    C = B / H

The above conclusion applies if we code each pixel independently of its neighbors. It turns out that for most images, if we encode a group (or block) of pixels together, the compression ratio can be increased further. This method is referred to as block coding or vector quantization.
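The bound C = B/H is easy to check numerically. Below is a minimal Python sketch (not part of the original notes) that estimates H from an image's gray-level histogram and computes C for an 8-bit image; the random test image is only a stand-in, and a natural image would give an entropy well below 8 bits/pixel.

```python
import numpy as np

def entropy_bits_per_pixel(image: np.ndarray) -> float:
    """H = -sum_i p_i log2 p_i over the pixel-value histogram."""
    hist = np.bincount(image.ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                      # take 0 * log2(0) as 0
    return float(-np.sum(p * np.log2(p)))

# Example: a synthetic 512 x 512 image at B = 8 bits/pixel.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(512, 512), dtype=np.uint8)

B = 8.0
H = entropy_bits_per_pixel(img)
C = B / H                             # maximum lossless compression ratio
print(f"H = {H:.3f} bits/pixel, C = B/H = {C:.3f}")
```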
1. Pixel Coding

Here we treat each pixel as an individual and code each pixel separately.

1.1. PCM (Pulse Code Modulation)

Here each pixel value (a continuous variable) is quantized to one of 2^B levels, where B is the number of bits per pixel. The quantization levels can be non-uniform. For normal images, if uniform quantization levels are used, then usually 2^4 to 2^8 levels are enough, i.e. 4-8 bits/pixel.

1.2. Entropy Coding

Suppose the probability of a pixel value lying at the i-th level is p_i. Then, as mentioned before, we can design a code such that the average bit rate equals

    -\sum_{i=1}^{L} p_i \log_2 p_i

One method of designing such a code is the Huffman code (for the details of its construction, refer to ELEC 214).

Design of a Huffman code:
1. Arrange the probabilities in decreasing order.
2. Combine the two symbols with the lowest probabilities into one.
3. Reorder into decreasing order again and repeat.

e.g. for eight symbols with probabilities 0.25, 0.25, 0.125, 0.125, 0.0625, 0.0625, 0.0625, 0.0625:

Average number of bits
    = 2(0.25) + 2(0.25) + 3(0.125) + 3(0.125) + 4(0.0625) + 4(0.0625) + 4(0.0625) + 4(0.0625)
    = 2.75

Entropy
    H = -2(0.25 \log_2 0.25) - 2(0.125 \log_2 0.125) - 4(0.0625 \log_2 0.0625)
      = 1 + 0.75 + 1
      = 2.75

Since the average code length equals the entropy, the code is optimal.
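The three design steps can be turned into a short program. The sketch below (my own illustration, not the ELEC 214 construction) computes Huffman code lengths for the eight-symbol example above by repeatedly merging the two least probable nodes; the function name huffman_lengths is made up.

```python
import heapq

def huffman_lengths(probs):
    """Return code lengths by repeatedly merging the two least-probable nodes."""
    # Each heap entry is (probability, tiebreak index, symbols under this node).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:       # every symbol under the merge gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, min(syms1 + syms2), syms1 + syms2))
    return lengths

probs = [0.25, 0.25, 0.125, 0.125, 0.0625, 0.0625, 0.0625, 0.0625]
lengths = huffman_lengths(probs)
avg = sum(l * p for l, p in zip(lengths, probs))
print(lengths)   # [2, 2, 3, 3, 4, 4, 4, 4]
print(avg)       # 2.75 bits/symbol, equal to the entropy -> optimal
```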
1.3. Run-Length Coding

Here the number of 0's between two successive 1's is coded. e.g. the binary sequence

    0000 1 0 1 00000 1 1 ...

is coded as the run lengths 4, 1, 5, 0, ...
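A sketch of this scheme in Python (my own illustration; the names rle_encode and rle_decode are made up, and any trailing 0's after the last 1 are simply dropped):

```python
def rle_encode(bits):
    """Return the list of zero-run lengths preceding each 1."""
    runs, count = [], 0
    for b in bits:
        if b == 0:
            count += 1        # extend the current run of 0's
        else:
            runs.append(count)
            count = 0         # a 1 terminates the run
    return runs

def rle_decode(runs):
    """Rebuild the binary sequence from the zero-run lengths."""
    bits = []
    for r in runs:
        bits.extend([0] * r + [1])
    return bits

seq = [0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 1]
runs = rle_encode(seq)
print(runs)                    # [4, 1, 5, 0] -- the example above
assert rle_decode(runs) == seq
```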