centimeters too low. The normals used are the same.
The resulting graph is no longer consistent. It does not look as though a simple smooth curve would fit this data well.
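The "smooth curve" criterion can be made concrete. The sketch below is purely illustrative (the synthetic shading function, the noise level, and the quadratic model are assumptions, not anything from the thesis): it fits a low-order polynomial to an intensity-versus-normal scatter and compares the residual for an aligned pairing against a shuffled, misaligned one.

```python
# Illustrative sketch: when image and model are aligned, intensity is a
# smooth function of the surface normal, so a low-order polynomial fits the
# scatter with small residual; after misalignment (simulated by shuffling
# the pairing), no smooth curve fits well.
import numpy as np

rng = np.random.default_rng(0)
normal_x = np.linspace(-0.8, 0.6, 80)                  # x component of the surface normal
intensity = 0.5 + 0.4 * normal_x - 0.2 * normal_x**2   # a hypothetical smooth shading function
intensity += rng.normal(0.0, 0.01, size=80)            # small measurement noise

def rms_residual(x, y, degree=2):
    """RMS residual after fitting a degree-`degree` polynomial to (x, y)."""
    coeffs = np.polyfit(x, y, degree)
    return float(np.sqrt(np.mean((np.polyval(coeffs, x) - y) ** 2)))

aligned = rms_residual(normal_x, intensity)
misaligned = rms_residual(normal_x, rng.permutation(intensity))
print(aligned < misaligned)   # the aligned scatter is far closer to a smooth curve
```

The shuffled residual is on the order of the intensity spread, while the aligned residual is on the order of the noise, which is exactly the qualitative difference visible in the scatter plots.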
In summary, when model and image are aligned there will be a consistent relationship
between image intensity and model normal. This is predicted by our assumption that there
is an imaging function that relates models and images. While the actual form of this function
depends on lighting and surface properties, a correct alignment will always lead to a consistent
relationship. Conversely, when model and image are misaligned, the relationship between intensity and normal is inconsistent.

Though developed for alignment, the EMMA estimates of entropy and mutual information can be used in other applications. We will show that EMMA can be used to correct
inhomogeneities in MRI scans. In addition, we will derive a new approach for dimensionality
reduction based on entropy. Similar to principal components analysis, our technique can find low dimensional projections of higher dimensional data that preserve the most information.

This same technology can be used for the alignment of other types of signals. In its full
generality, EMMA can be used whenever there is a need to align images from two different sensors, the so-called "sensor fusion" problem. For example, in medical imaging, data from
one type of sensor such as magnetic resonance imaging must be aligned to data from another
sensor such as computed tomography. We will demonstrate that EMMA can be used to
solve problems such as this.

A major contribution of this thesis is the derivation of a formal technique that delivers a principled estimate of "consistency". We will show that when the mutual information
between an image and a model is high they are likely to be aligned. Toward making this
technique a reality we have defined a new approach for evaluating entropy and information called EMMA. We have also defined an efficient scheme for adjusting a set of parameters so that mutual information and entropy can be optimized. We will use EMMA to effectively
evaluate and adjust the alignment of three dimensional models and two dimensional images.

Figure 1.3: On the left is a video image of Ron with the single scanline highlighted. On the right is a graph of the intensities observed along this scan line.

Figure 1.4: On the left is a depth map of Ron with the single scanline highlighted. At top right is a graph of the x component of the surface normal. On the bottom right is the y component of the normal.

Figure 1.5: The Aligned Case: A scatter plot of the intensity of the video image versus the x component of the surface normal from the model. The image and model are correctly aligned.

Figure 1.6: The Misaligned Case: On the left is the misaligned scanline from the video image of Ron. On the right is a scatter plot of the intensity of this part of the video image versus the x component of the surface normal from the model.
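The quantity behind the scatter plots of Figures 1.5 and 1.6 can be sketched numerically. The code below is an assumed stand-in, not the thesis's method: it estimates mutual information from a simple joint histogram (EMMA itself uses a more sophisticated sample-based estimator), using synthetic intensity and normal data, and shows that alignment appears as high mutual information while a shuffled, misaligned pairing does not.

```python
# Illustrative sketch: mutual information between image intensity and the
# model's surface normal, estimated from a 2-D histogram. High MI indicates
# a consistent (aligned) relationship; shuffling destroys it.
import numpy as np

def mutual_information(x, y, bins=10):
    """I(X;Y) = sum p(x,y) log[p(x,y) / (p(x)p(y))], from a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    nz = pxy > 0                                   # skip empty bins: avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(1)
normal_x = rng.uniform(-0.8, 0.6, 2000)            # x component of the normal
intensity = 0.5 + 0.4 * normal_x + rng.normal(0.0, 0.02, 2000)  # hypothetical shading

mi_aligned = mutual_information(intensity, normal_x)
mi_misaligned = mutual_information(rng.permutation(intensity), normal_x)
print(mi_aligned > mi_misaligned)   # alignment shows up as high mutual information
```

Histogram estimators need many samples per bin to be reliable, which is one motivation for the sample-based estimators developed in the thesis.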
1.2 Overview of the Thesis
The second chapter contains an overview of the probability theory necessary to understand
what EMMA is doing and how it does it. The third chapter discusses estimation of entropy
from samples. While a number of techniques for manipulating entropy currently exist, EMMA combines computational efficiency with the flexibility necessary to model a wide variety of
distributions. The fourth chapter returns to our discussion of alignment. We show here that
EMMA is capable of aligning signals where simpler techniques cannot. This chapter will
present the basic equations that underlie alignment by maximization of mutual information. The fifth chapter contains a wide variety of alignment experiments designed both to validate
our approach and explore the scope of possible application. In chapter six we will describe
applications besides alignment to which we have applied EMMA. For example, our scheme
for efficiently manipulating entropy includes a stochastic form of gradient descent. We will describe a flow estimation problem in which stochastic gradient descent speeds convergence
by a factor of thirty. Chapter seven will include a discussion of our results and a comparison
with related work.

Chapter 2
Probability and Entropy
One of the key insights in this thesis is that many of the techniques that are common in
computer vision, such as correlation, are easily interpreted as statistics of random variables.
Once this is done we can use a broad range of tools from probability to analyze the behavior of
these statistics....
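The claim that a technique like correlation is "a statistic of random variables" can be illustrated directly. The sketch below (illustrative only; not code from the thesis) computes normalized cross-correlation as a sample estimate of E[(U - mu_U)(V - mu_V)] / (sigma_U * sigma_V):

```python
# Illustrative sketch: normalized cross-correlation, viewed as a sample
# estimate of the correlation coefficient of two random variables U and V.
import math

def normalized_correlation(u, v):
    """Sample estimate of E[(U - mu_U)(V - mu_V)] / (sigma_U * sigma_V)."""
    n = len(u)
    mu_u = sum(u) / n
    mu_v = sum(v) / n
    cov = sum((a - mu_u) * (b - mu_v) for a, b in zip(u, v)) / n
    sigma_u = math.sqrt(sum((a - mu_u) ** 2 for a in u) / n)
    sigma_v = math.sqrt(sum((b - mu_v) ** 2 for b in v) / n)
    return cov / (sigma_u * sigma_v)

u = [0.1, 0.4, 0.5, 0.9]
print(round(normalized_correlation(u, [2 * x for x in u]), 6))  # → 1.0
```

Interpreting such measures probabilistically is what lets the tools of probability theory, entropy and mutual information among them, be brought to bear on their behavior.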