Image Interpolation, Rotation and Straightening

Lab Exercise 1 for the Computer Vision Course
Informatics Institute, University of Amsterdam, The Netherlands
January 6, 2009

1 Introduction

In many image processing applications you have access to a number of image transformation methods, including rotating, stretching and even shearing an image. These methods can be used to create artistic effects, but they are also useful for correcting unwanted distortions that were introduced when the picture was taken (camera rotation, perspective distortion, etc.).

The input for these methods (at least in this exercise) is always a quadrilateral region of a discrete image F. That means you only know the intensity values at discrete, integral pixel positions. The output is a rectangular image G whose axes are aligned with the Cartesian coordinate axes x and y.

In an arbitrary image transformation there is no guarantee that an "input" pixel will land exactly on a pixel in the "output" image. Rather, most of the time your output image pixels will "look at" positions in the input image that lie "between" pixels.

Figure 1: Pixel positions in an original and a transformed version of an image don't always agree.

So you need access to intensity values that are not on the sampling grid of the original image, e.g. the intensity value at position (6.4, 7.3). In this exercise you will examine and implement interpolation techniques that solve this problem. Then you will implement image transformations, such as rotations and affine transformations, employing these techniques.

For information on the grading, see the hand-in section at the end of this assignment. The Matlab code given throughout the exercise is meant to help you structure your program; it is not obligatory to use these exact functions.
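The "look at" relation described above is usually implemented by inverse mapping: iterate over the pixels of the output image G, map each output position back into the input image F, and sample F there. The following Matlab sketch illustrates the idea; the function name transformImage, the 2x2 matrix T and the outSize parameter are illustrative choices, not part of the mandatory assignment structure, and pixelValue is the interpolation function developed in Section 2.

```
% Inverse mapping sketch (illustrative, not the mandatory structure).
% F       : input image
% T       : 2x2 transformation matrix (e.g. a rotation)
% outSize : [rows cols] of the output image
% method  : 'nearest' or 'linear', passed on to pixelValue
function G = transformImage(F, T, outSize, method)
  G = zeros(outSize);
  Tinv = inv(T);                    % map output coordinates back to input
  for yo = 1:outSize(1)
    for xo = 1:outSize(2)
      p = Tinv * [xo; yo];          % position in the input image, usually
                                    % between pixels
      G(yo, xo) = pixelValue(F, p(1), p(2), method);
    end
  end
end
```

Iterating over the output (rather than the input) guarantees that every pixel of G receives exactly one value, with no holes or double assignments.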
2 Interpolation

The transformation algorithm needs a way to access the original image F at locations that are not on the sampling grid. We therefore need a function pixelValue that returns the value at location (x, y) even for non-integer coordinate values.

2.1 Questions and Exercises

? Write a Matlab function pixelValue that returns the value of a pixel at real-valued coordinates (x, y). The function should look like:

Listing 1: Image Interpolation

function color = pixelValue(image, x, y, method)
% pixel value at real coordinates
if inImage(size(image), x, y)
    % do the interpolation
    switch (method)
        case 'nearest'
            % Do nearest neighbour
        case 'linear'
            % Do bilinear interpolation
    end % end switch
else
    % return a constant
end

? Implement nearest neighbour interpolation (method = 'nearest') and bilinear interpolation (method = 'linear'), as described in the syllabus. Take the necessary steps to deal with the "border problem" (i.e. implement the function inImage). The simplest choice is to return a constant value in case (x, y) is not within the domain of the image.
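One possible completion of the skeleton is sketched below. It is an illustrative solution, not the required one: it assumes a single-channel (grayscale) image with Matlab's 1-based indexing, and returns 0 outside the image domain. Nearest neighbour rounds to the closest grid point; bilinear interpolation weights the four surrounding pixels by the fractional offsets.

```
% Illustrative completion of pixelValue (assumes a grayscale image).
function color = pixelValue(image, x, y, method)
  if inImage(size(image), x, y)
    switch (method)
      case 'nearest'
        % Round to the closest pixel on the sampling grid.
        color = image(round(y), round(x));
      case 'linear'
        % Four neighbouring grid points around (x, y).
        x0 = floor(x); y0 = floor(y);
        x1 = min(x0 + 1, size(image, 2));  % clamp at the border
        y1 = min(y0 + 1, size(image, 1));
        a = x - x0;                        % fractional offsets in [0, 1)
        b = y - y0;
        % Bilinear blend of the four neighbours.
        color = (1 - a) * (1 - b) * image(y0, x0) ...
              +      a  * (1 - b) * image(y0, x1) ...
              + (1 - a) *      b  * image(y1, x0) ...
              +      a  *      b  * image(y1, x1);
    end
  else
    color = 0;                             % constant outside the image
  end
end

function inside = inImage(sz, x, y)
  % True when (x, y) lies within the 1-based image domain.
  inside = x >= 1 && y >= 1 && x <= sz(2) && y <= sz(1);
end
```

For example, pixelValue(F, 6.4, 7.3, 'linear') blends the pixels at rows 7-8 and columns 6-7 of F with weights determined by the offsets 0.4 and 0.3.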
This note was uploaded on 10/26/2010 for the course AI CV taught by Professor Boomgaard during the Spring '10 term at UVA.