
45 Pages

### instancelearning

Course: CSE 847, Fall 2008
School: Michigan State University

Word Count: 1811


#### Unformatted Document Excerpt


**Statistical Classification** (Rong Jin)

**Classification problems.** Given an input space $X$ and an output space $Y$, learn a function $f: X \to Y$ that predicts the class label $y \in Y$ of an input $x \in X$:

- $Y = \{-1, +1\}$: binary classification
- $Y = \{1, 2, \dots, c\}$: multi-class classification

**Examples of classification problems.**

- Text categorization. Doc: "Months of campaigning and weeks of round-the-clock efforts in Iowa all came down to a final push Sunday, ..." Topic: politics or non-politics. Input features $X$: word frequencies, e.g. {(campaigning, 1), (democrats, 2), (basketball, 0), ...}. Class label $y$: $y = +1$ for politics, $y = -1$ for non-politics.
- Image classification. Which images are birds, and which are not? Input features $X$: a color histogram, e.g. {(red, 1004), (blue, 23000), ...}. Class label $y$: $y = +1$ for a bird image, $y = -1$ for a non-bird image.

How do we obtain $f$? Learn the classification function $f$ from examples.

**Learning from examples.** Training examples:

$$D_{train} = \{ (\vec{x}_1, y_1), (\vec{x}_2, y_2), \dots, (\vec{x}_n, y_n) \}$$

where $n$ is the number of training examples, $\vec{x}_i \in \mathbb{R}^d$ with $\vec{x}_i = (x_{i,1}, x_{i,2}, \dots, x_{i,d})$, and $y_i \in Y$ (binary: $Y = \{+1, -1\}$; multi-class: $Y = \{1, 2, \dots, c\}$). The examples are assumed independent and identically distributed (i.i.d.):
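The word-frequency features in the text-categorization example can be sketched as follows (a minimal illustration; the three-word vocabulary and the helper name `word_freq_features` are invented for the example):

```python
# Turn a document into word-frequency features over a fixed vocabulary,
# as in the text-categorization example (the vocabulary here is made up).
vocab = ["campaigning", "democrats", "basketball"]

def word_freq_features(doc, vocab):
    """Count how often each vocabulary word occurs in the document."""
    words = doc.lower().split()
    return [words.count(w) for w in vocab]

# The resulting vector is the input x fed to the classifier f.
x = word_freq_features("Months of campaigning and more campaigning", vocab)
```

In practice the vocabulary would cover the whole training corpus; three words are enough to show the mapping from a document to a feature vector.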
Each training example is drawn independently from the same underlying source, so training examples resemble the test examples.

**Learning from examples (goal).** Given training examples $D_{train} = \{ (\vec{x}_1, y_1), \dots, (\vec{x}_n, y_n) \}$, learn a classification function $f(x): X \to Y$ that is consistent with the training examples. What is the easiest way to do it?

**K-nearest-neighbor (kNN) approach.** Classify a new point by the labels of its $k$ closest training examples. How many neighbors should we count ($k = 1$? $k = 4$?)?

**Cross-validation.** Divide the training examples into two sets: a training set (80%) and a validation set (20%). Predict the class labels of the validation examples using the training set, and choose the number of neighbors $k$ that maximizes classification accuracy on the validation set.

**Leave-one-out method.** For $k = 1, 2, \dots, K$, set $\mathrm{Err}(k) = 0$ and:

1. Select a training data point and hide its class label.
2. Use the remaining data and the given $k$ to predict the class label of the held-out point.
3. Increment $\mathrm{Err}(k)$ if the predicted label differs from the true label.

Repeat the procedure until all training examples have been tested, and choose the $k$ whose $\mathrm{Err}(k)$ is minimal.
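The leave-one-out procedure for choosing k can be sketched in Python (a minimal sketch assuming squared-Euclidean distance and majority vote; the helper names `knn_predict` and `leave_one_out_k` are invented):

```python
# Leave-one-out selection of k for kNN, following the procedure above.
from collections import Counter

def knn_predict(train, x, k):
    """Majority label among the k nearest training points (squared Euclidean)."""
    nearest = sorted(train,
                     key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[:k]
    return Counter(y for _, y in nearest).most_common(1)[0][0]

def leave_one_out_k(train, K):
    """For k = 1..K, count leave-one-out errors and return the best k."""
    err = {}
    for k in range(1, K + 1):
        err[k] = 0
        for i, (x, y) in enumerate(train):
            rest = train[:i] + train[i + 1:]          # hide example i's label
            if knn_predict(rest, x, k) != y:          # prediction differs: error
                err[k] += 1
    return min(err, key=err.get)                      # k with fewest errors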
In the slide's running example, $\mathrm{Err}(1) = 3$, $\mathrm{Err}(2) = 2$, and $\mathrm{Err}(3) = 6$, so $k = 2$ is chosen.

**Probabilistic interpretation of kNN.** kNN estimates the probability density $\Pr(y \mid x)$ around the location of $x$ by counting the data points of class $y$ in the neighborhood of $x$. A small neighborhood gives large variance and unreliable estimates; a large neighborhood gives large bias and inaccurate estimates. This is the bias-variance tradeoff.

**Weighted kNN.** Weight the contribution of each neighbor by its distance, using the weight function

$$w(\vec{x}, \vec{x}_i) = \exp\left( -\frac{\| \vec{x} - \vec{x}_i \|^2}{2\sigma^2} \right)$$

and predict with

$$\Pr(y \mid \vec{x}) = \frac{\sum_i w(\vec{x}, \vec{x}_i)\, \delta(y, y_i)}{\sum_i w(\vec{x}, \vec{x}_i)}, \qquad \delta(y, y_i) = \begin{cases} 1 & y = y_i \\ 0 & y \neq y_i \end{cases}$$

**Estimating $\sigma^2$ in the weight function.** Use leave-one-out cross-validation: split the training set $D$ into a validation example $(\vec{x}_1, y_1)$ and a training set $D^{-1} = \{ (\vec{x}_2, y_2), \dots, (\vec{x}_n, y_n) \}$, and compute

$$\Pr(y_1 \mid \vec{x}_1, D^{-1}) = \frac{\sum_{i=2}^{n} w(\vec{x}_1, \vec{x}_i)\, \delta(y_1, y_i)}{\sum_{i=2}^{n} w(\vec{x}_1, \vec{x}_i)} = \frac{\sum_{i=2}^{n} \exp\left(-\frac{\|\vec{x}_1 - \vec{x}_i\|^2}{2\sigma^2}\right) \delta(y_1, y_i)}{\sum_{i=2}^{n} \exp\left(-\frac{\|\vec{x}_1 - \vec{x}_i\|^2}{2\sigma^2}\right)}$$

which is a function of $\sigma^2$. In general, with validation example $(\vec{x}_i, y_i)$ and training set $D^{-i} = \{ (\vec{x}_1, y_1), \dots, (\vec{x}_{i-1}, y_{i-1}), (\vec{x}_{i+1}, y_{i+1}), \dots, (\vec{x}_n, y_n) \}$, we obtain an analogous expression for $\Pr(y_i \mid \vec{x}_i, D^{-i})$.
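The weighted-kNN prediction can be sketched directly from the Gaussian weight and the normalized sum above (a minimal sketch; the function names and the toy data in the test are invented):

```python
import math

def gauss_weight(x, xi, sigma2):
    """w(x, xi) = exp(-||x - xi||^2 / (2 sigma^2))."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, xi))
    return math.exp(-d2 / (2.0 * sigma2))

def weighted_knn_prob(train, x, y, sigma2):
    """Pr(y | x) = sum_i w(x, xi) * delta(y, yi) / sum_i w(x, xi)."""
    num = sum(gauss_weight(x, xi, sigma2) for xi, yi in train if yi == y)
    den = sum(gauss_weight(x, xi, sigma2) for xi, _ in train)
    return num / den
```

Because the weights are normalized by the denominator, the estimates $\Pr(y \mid x)$ over all labels sum to one.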
Estimate $\sigma^2$ by maximizing the leave-one-out log-likelihood

$$l(\sigma) = \sum_{i=1}^{n} \log \Pr(y_i \mid \vec{x}_i, D^{-i}), \qquad \sigma^* = \arg\max_\sigma\, l(\sigma)$$

**Optimization.** Written out,

$$l(\sigma) = \sum_{i=1}^{n} \log \frac{\sum_{k \neq i} \exp\left(-\frac{\|\vec{x}_i - \vec{x}_k\|^2}{2\sigma^2}\right) \delta(y_k, y_i)}{\sum_{k \neq i} \exp\left(-\frac{\|\vec{x}_i - \vec{x}_k\|^2}{2\sigma^2}\right)}$$

This is a DC function (a difference of two convex functions).

**Challenges in optimization.** Convex functions are the easiest to optimize; single-mode functions are the second easiest; multi-mode functions are difficult to optimize.

**Gradient ascent.** Substituting $\beta = 1/(2\sigma^2)$,

$$l(\beta) = \sum_{i=1}^{n} \log \frac{\sum_{k \neq i} \exp\left(-\beta \|\vec{x}_i - \vec{x}_k\|^2\right) \delta(y_k, y_i)}{\sum_{k \neq i} \exp\left(-\beta \|\vec{x}_i - \vec{x}_k\|^2\right)}$$

Compute the derivative $dl(\beta)/d\beta$ and update $\beta \leftarrow \beta + t \cdot dl(\beta)/d\beta$. How do we decide the step size $t$?

**Gradient ascent: line search.** Choose $t$ by backtracking line search (excerpt from the slides by Stephen Boyd).

**Gradient ascent (algorithm).** Stopping criterion: $|dl(\beta)/d\beta| \le \epsilon$ for a predefined small value $\epsilon$. Starting from an initial $\beta$ with the backtracking parameters fixed:

1. Compute $dl(\beta)/d\beta$.
2. Choose the step size $t$ via backtracking line search.
3. Update $\beta \leftarrow \beta + t \cdot dl(\beta)/d\beta$.
4. Repeat until the stopping criterion is met.
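The gradient-ascent loop with backtracking line search can be sketched numerically (a minimal sketch: the 1-D toy dataset, the finite-difference gradient, and the backtracking constants `alpha` and `rho` are illustrative choices, not taken from the slides):

```python
import math

# Toy 1-D dataset: two tight same-class pairs, so every point has a
# same-class neighbor and the leave-one-out probabilities stay nonzero.
data = [(0.0, +1), (0.2, +1), (5.0, -1), (5.2, -1)]

def loo_loglik(beta, data):
    """l(beta) = sum_i log Pr(y_i | x_i, D^{-i}) with weights exp(-beta * d^2)."""
    total = 0.0
    for i, (xi, yi) in enumerate(data):
        num = den = 0.0
        for k, (xk, yk) in enumerate(data):
            if k == i:
                continue
            w = math.exp(-beta * (xi - xk) ** 2)
            den += w
            if yk == yi:                 # delta(y_k, y_i) selects same-class terms
                num += w
        total += math.log(num / den)
    return total

def gradient_ascent(l, beta0, eps=1e-6, alpha=0.3, rho=0.5, max_iter=200):
    """Maximize l(beta) by gradient ascent with backtracking line search."""
    beta, h = beta0, 1e-6
    for _ in range(max_iter):
        g = (l(beta + h) - l(beta - h)) / (2 * h)   # numerical dl/dbeta
        if abs(g) < eps:                            # stopping criterion
            break
        t = 1.0                                     # backtracking: shrink t until
        while l(beta + t * g) < l(beta) + alpha * t * g * g:
            t *= rho                                # a sufficient increase holds
        beta += t * g
    return beta
```

A closed-form derivative of $l(\beta)$ would replace the finite difference in a real implementation; the loop structure (gradient, line search, update, stopping test) is the same either way.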

