

4.2 Weighted Neighbor Likelihood vs. EMMA

Weighted neighbor likelihood and EMMA are both smoothly differentiable functions that can be used to align signals when the imaging function is unknown. Qualitatively the EMMA estimate of joint entropy seems better: joint entropy has a wider basin in these synthetic experiments.

If weighted neighbor likelihood and EMMA are so similar, why is there a difference? Recall that weighted neighbor likelihood measures the conditional entropy of the image given the model. It does this under the assumption that the conditional distribution of the image is Gaussian. Weighted neighbor likelihood uses the data around a point to estimate the mean of a Gaussian; the log likelihood of that point is then proportional to the squared difference from this mean. In general, log likelihood calculations are very sensitive to outliers. Outliers are points that, because of noise or measurement error, have been perturbed and land far from where they should. Recall that the log likelihood of a sample is the sum of the log likelihoods of each point in the sample. As a result, a single outlier can ruin a sample that would otherwise have had a high likelihood. A more reasonable measure might introduce a bound on the penalty for a single point: once a point moved beyond a certain distance from the local mean, the cost would no longer increase. Calculating likelihood in this way is closely related to the concept of a robust statistic.

EMMA, on the other hand, does not assume that the conditional distribution of the image given the model is Gaussian. Instead it approximates the density non-parametrically, so EMMA can handle situations where there are multiple peaks in the conditional distribution. While there is a likelihood penalty when a group of points is perturbed away from the local mean, it is not a function of the distance from the mean. Once these points move outside the effective range of the smoothing function there is no additional penalty. EMMA's robust nature prevents it from getting swamped by a few outliers in the joint distribution. This gives it a greater ability to deal with the distributions that arise from misalignments between model and image. In the next sections we will describe a number of other situations where EMMA is better than weighted neighbor likelihood.

4.2.1 Non-functional Signals

Up until this point our analysis of alignment has assumed that there exists an imaging function that relates the model and the image. For at least two classes of problems there will be no imaging function at all. The first arises from a common situation in computer vision: occlusion. The second arises when the model does not contain all of the information required to predict the image. In both cases no single function, regardless of exogenous variables, can be used to predict the image from the model.

Figure 4.14 shows a graph of our original pair of signals, except that v(x) has now been corrupted by an occlusion. Occlusion proves to be particularly bothersome for the alignment techniques we have proposed. For example, the basic assumption behind normalized cost has been violated: the occluded signal is not a linearly transformed version of the model. In addition, a quick glance at the joint space shows that the assumption behind weighted neighbor likelihood has also been violated (see Figure 4.15); even when the signals are aligned, there is no longer any function that relates u(x) and v(x).
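Before turning to those plots, the contrast between the two measures can be made concrete with a small numerical sketch. This is not code from the thesis: the function names, Gaussian kernels, and kernel widths are illustrative assumptions. The first function scores each sample of v(x) by its squared deviation from a locally estimated mean, the weighted-neighbor-style likelihood whose penalty grows without bound; the second estimates joint entropy from a Parzen-window density, an EMMA-style estimate in which a point that drifts outside the kernel's effective range contributes only a bounded penalty.

import numpy as np

def weighted_neighbor_loglik(u, v, sigma=0.1):
    # Weighted-neighbor-style likelihood of v given u (illustrative).
    # Each point's prediction is a weighted mean of the other samples' v,
    # weighted by proximity in u; the contribution is a quadratic penalty,
    # so a single outlier can dominate the sum.
    total = 0.0
    for i in range(len(u)):
        w = np.exp(-0.5 * ((u[i] - np.delete(u, i)) / sigma) ** 2)
        mean_v = np.sum(w * np.delete(v, i)) / (np.sum(w) + 1e-12)
        total += -0.5 * (v[i] - mean_v) ** 2      # unbounded quadratic cost
    return total / len(u)

def emma_joint_entropy(u, v, psi=0.1):
    # Parzen-window (EMMA-style) estimate of the joint entropy H(u, v).
    # The joint density at each sample is a leave-one-out sum of Gaussian
    # kernels centred on the other samples; an outlier far from every
    # kernel adds a bounded, not quadratic, penalty.
    n = len(u)
    h = 0.0
    for i in range(n):
        du = u[i] - np.delete(u, i)
        dv = v[i] - np.delete(v, i)
        k = np.exp(-0.5 * (du ** 2 + dv ** 2) / psi ** 2)
        dens = np.sum(k) / ((n - 1) * 2.0 * np.pi * psi ** 2)
        h += -np.log(dens + 1e-12)
    return h / n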
Figure 4.16 shows a graph of weighted neighbor likelihood versus translation. The global minimum no longer coincides with the correct translation. In some cases EMMA can be used to align partially occluded signals. Joint entropy does not suffer from the strong assumption that the signals are functionally related. Though part of the signal may be corrupted, the remaining parts retain their low entropy relationship. Figure 4.17 shows a plot of joint entropy for the occluded pair of signals.

The simplest example of non-functional signals often arises when the model and the image are swapped. Whenever the function between the model and the image is non-monotonic, the relationship between the image and the model is non-functional. The non-monotonically related signals shown in Figure 4.2 are an example. Figure 4.18 shows the joint space of the swapped signals and a weighted neighbor function approximation. The function fit to this joint space is a terrible approximation of the data.

The quality of the function approximation points out an important limitation of weighted neighbor likelihood. While normalized cost is a symmetric comparison metric, weighted neighbor likelihood is not. It may seem at first that this is an unimportant distinction. It is not. Symmetric measures allow us to match images to models as well as models to images. This can be critical when it is not possible to construct a detailed model.

[Figure 4.14: Graph of u(x) and v(x), where v(x) has been perturbed by an occlusion. Axes: Position (0 to 400) vs. Intensity (-1.5 to 2).]
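The occlusion experiment summarized in Figures 4.16 and 4.17 can be reproduced in spirit with a short scan over translations, reusing the two functions from the sketch above. The signal shapes, the occluded interval, and the shift range below are invented for the illustration and are not taken from the thesis; the comments only describe the qualitative behavior one would expect.

import numpy as np

# Toy signals: u(x) is the model, v(x) a non-linearly related image
# with part of it occluded (zeroed out).
x = np.linspace(0.0, 4.0 * np.pi, 400)
u = np.sin(x)
v = np.sin(x) ** 3
v[150:250] = 0.0                      # simulated occlusion

def score_translations(u, v, shifts):
    # Evaluate both measures over a range of integer shifts;
    # lower is better for both curves.
    wn_cost, joint_ent = [], []
    for t in shifts:
        vt = np.roll(v, t)
        wn_cost.append(-weighted_neighbor_loglik(u, vt))
        joint_ent.append(emma_joint_entropy(u, vt))
    return np.array(wn_cost), np.array(joint_ent)

shifts = range(-40, 41, 5)
wn_cost, joint_ent = score_translations(u, v, shifts)
# With the occlusion in place, the weighted-neighbor cost may reach its
# minimum at the wrong shift, while the joint-entropy curve typically
# keeps its minimum near zero translation.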