(19) United States
(12) Patent Application Publication  (10) Pub. No.: US 2008/0242978 A1
Simon et al.  (43) Pub. Date: Oct. 2, 2008

(54) METHOD AND APPARATUS FOR REGISTERING A PHYSICAL SPACE TO IMAGE SPACE

(75) Inventors: David Simon, Boulder, CO (US); Amir Ghanei, Louisville, CO (US); Kevin Mark, Westminster, CO (US); Andrew Lajoie, Superior, CO (US); Geoffrey M. Ruckel, Denver, CO (US)

Correspondence Address: HARNESS, DICKEY & PIERCE, P.L.C., P.O. BOX 828, BLOOMFIELD HILLS, MI 48303 (US)

(73) Assignee: Medtronic Navigation, Inc., Louisville, CO (US)

(21) Appl. No.: 11/693,558
(22) Filed: Mar. 29, 2007

Publication Classification
(51) Int. Cl. A61B 6/02 (2006.01)
(52) U.S. Cl. ........ 600/426

(57) ABSTRACT

A method and apparatus for registering physical space to image space is disclosed. The system allows for determining fiducial markers as pixels or voxels in image data. The system can correlate and register the determined fiducial points with fiducial markers in physical space.

[Fig. 1 (Sheets 1-2 of 5): flow diagram with block 22, Identification of Fiducial Point Candidates (Initial 2D Search; 3D Refinement; Optionally Attach Fiducial Markers), and block 24, Matching & Transformation (Distance Search; Find All Fiducial Markers in Physical Space; Find Best Transform; Register Image Space to Physical Space), followed by Fine Search for Missed Fiducial Points and Register Image Space to Physical Space. Figs. 2-3: environmental views with reference numerals including 50b, 82, 90, 104, 106, 108, 110, 114.]
[Fig. 4 (Sheet 3 of 5): detailed flow diagram. Fig. 5 (Sheet 4 of 5): exemplary image data. Fig. 6 (Sheet 5 of 5): flow diagram of matching and registration, with blocks including: Find All Physical Space Fiducial Markers (24, 202, 204); Determine Distance Between All Physical Space Fiducial Markers; Determine Distance Between All Fiducial Point Candidates in Best Score List (Block 196); Determine All Subsets of Three Fiducial Markers in Physical Space (205); Determine All Subsets of Three Fiducial Point Candidates in Image Space; Compare All Physical Space Three Fiducial Marker Subsets and All Image Data Three Fiducial Point Subsets; Output a List of All Matching Three Fiducial Marker Subsets and Three Fiducial Point Subsets; Calculate an Image Space to Physical Space Transform Based on Each of the Three-Point Subset Matches Between Physical Space and Image Space; Find Additional Matching Fiducial Point Candidates and Fiducial Markers Based on Transforms; Add at Least One Point to Each of the Three-Point Subsets to Form N-Point Subsets; Perform Transform with N- or N+1 Point Subset; Find Additional Matching Points; Additional Points Found?; Perform Transform with Last N+1 Subset (26); All Physical Space Fiducial Markers Matched to One Fiducial Point Candidate?; Register Image Space to Physical Space; Navigate Procedure.]
[Fig. 6 concludes with block 226: Add Additional Matching Points to Form N+1 Subsets.]

METHOD AND APPARATUS FOR REGISTERING A PHYSICAL SPACE TO IMAGE SPACE

FIELD

[0001] The present disclosure relates to a method and apparatus for performing a computer assisted surgical procedure, and particularly to a method and apparatus for registering physical space to image space.

BACKGROUND

[0002] The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.

[0003] Surgical procedures can be performed on anatomies, such as the human anatomy, to provide a therapy to the anatomy. Therapies can include implantation of a prosthesis, application of pharmaceutical or biological materials, and other therapies. To apply a therapy to internal portions of the anatomy, however, an incision or access portal generally must be formed in the anatomy.

[0004] The access portal, such as an incision, can require healing and recovery time in addition to that required by the therapy itself. Therefore, it is desirable to minimize the trauma produced by, and the size of, the access portal. In reducing the size of the access portal, however, the ability of a user, such as a surgeon, to effectively apply a therapy can be reduced. Therefore, it is desirable to provide a mechanism by which a surgeon can apply a therapy without a reduction in its effectiveness or in the ability to view the area being treated. Navigation of instruments relative to an anatomy can be used for this purpose. The navigation can use acquired image data to confirm and position an instrument relative to the patient. The image data, if preacquired or acquired prior to positioning an instrument within an anatomy, is generally registered to the patient. The registration process, however, can require several steps and interaction of a user with a workstation or computer system.
Therefore, it is desirable to minimize the steps a user must take to register image data to a patient and to make the registration process as efficient as possible.

SUMMARY

[0005] A system and apparatus are disclosed that allow for registration of image space to physical space. The physical space can include patient space or the navigated space relative to a patient. The physical space can include any portion of, or the entire, patient or the area surrounding the patient. Generally, patient space includes the area that is part of the navigable field in which an instrument or navigated portion can be tracked.

[0006] Image space can be image data acquired of any appropriate portion, such as image data of a patient. The image data can be any compilation or data set of image points imaged with any appropriate imaging system. The image points can include pixels, voxels, groups of pixels or voxels, or any other appropriate data points. The image data can include data regarding the anatomy of the patient, a physical property of a portion affixed relative to the patient, and other appropriate data. For example, the patient may also include a fiducial point or marker. The fiducial point or marker can be a natural portion of the patient or can include an artificial structure that is interconnected to the patient. For example, a fiducial marker can be stuck to the surface of the patient using an adhesive material. The image data can then include image points or data points about the patient and the fiducial marker.

[0007] According to various embodiments, a method of registering image space and physical space in a computer aided surgical navigation system is disclosed. The method can include acquiring image data to define the image space, including a plurality of points having a sub-plurality of fiducial points.
The method can further include forming a template of a fiducial marker, comparing the template to the plurality of points, and determining the sub-plurality of fiducial points from the plurality of points based on the comparison. Fiducial markers in the physical space can be identified. A first subset of the fiducial points can be matched with a first subset of the fiducial markers, and a transformation of the image space to the physical space can be made with the match. The image space can also be registered to the physical space.

[0008] According to various embodiments, a method of registering image space and physical space in a computer aided surgical navigation system is disclosed. The method can include imaging a member, including a fiducial marker affixed to the member, to acquire image data, and comparing a computer readable fiducial template to the image data to identify fiducial points in the image data. A device defining a single center point in any orientation can be tracked relative to the fiducial marker, and a position of the fiducial marker can be determined based on tracking the device. The identified fiducial points and the determined fiducial markers can be compared, and the image space can be registered to the physical space.

[0009] According to various embodiments, a computer aided surgical navigation system to navigate a procedure relative to a patient, having registration of image space and physical space, is disclosed. The system can include a tracking system having a localizer and a tracking device. A fiducial marker can be associated with the patient to define a fiducial marker point. An instrument can be associated with the tracking device, wherein the instrument includes a fiducial marker contact portion defining a single center point. A processor can be associated with the tracking system to determine a position of the single center point in physical space.
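The marker-localization step described above, in which a tracked device defining a single center point determines a fiducial marker's physical-space position, can be sketched as follows. This is a minimal sketch only: the function name and the simple averaging of tracked tip samples are illustrative assumptions, not the method specified by the disclosure.

```python
import numpy as np

def estimate_marker_position(tracked_tip_points):
    """Estimate a fiducial marker's physical-space position from repeated
    samples of a tracked pointer tip held against the marker.

    Because the pointer defines a single center point in any orientation,
    the samples cluster around the marker center; averaging the samples
    reduces tracking noise.  (Illustrative assumption, not the patent's
    prescribed computation.)
    """
    pts = np.asarray(tracked_tip_points, dtype=float)  # shape (n, 3)
    return pts.mean(axis=0)
```

In practice a tracking system would stream many such samples per marker; a more careful implementation might also reject outliers before averaging.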
A memory system can store the image data of the patient and the fiducial marker, and a display device can display the image data. The processor can execute a first set of instructions to compare the image data of the patient and the fiducial marker to a predetermined fiducial template to determine fiducial points in the image data. The processor can also execute a second set of instructions to match the fiducial points in image space to fiducial marker points in physical space.

[0010] Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.

DRAWINGS

[0011] The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.

[0012] FIG. 1 is a schematic overview of a system for identification of fiducial points in image data and registration to physical space, according to various embodiments;

[0013] FIG. 2 is an environmental view of a surgical navigation system, according to various embodiments;

[0014] FIG. 3 is an environmental view of an anatomy with fiducial markers and a pointing device, according to various embodiments, contacting the fiducial markers;

[0015] FIG. 4 is a detailed flow diagram of a portion of the process illustrated in FIG. 1;

[0016] FIG. 5 is exemplary image data; and

[0017] FIG. 6 is a detailed flow diagram of a portion of the process illustrated in FIG. 1.

DETAILED DESCRIPTION

[0018] The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses. Initially, one skilled in the art will understand that the system and apparatus disclosed herein can be used in any appropriate procedure.
Although a head frame is illustrated attached to a cranium and image data is illustrated for a cranium, any appropriate portion of the anatomy can be imaged. Moreover, a head frame may not be necessary, and a dynamic reference frame can be attached to any appropriate structure, such as a bone screw, an adhesive base, an orifice, etc.

[0019] With reference to FIG. 1, an identification and registration system 20 identifies a fiducial point in image data and registers the identified or determined fiducial point to a fiducial marker in physical space or patient space. Herein, patient space can be understood to be a specific physical space to which registration can be made. As discussed in detail herein, registration matches or correlates points in the image space and points in the physical space. Image data can be acquired in any appropriate format, as discussed herein. The image data can be used during an operative procedure, prior to an operative procedure, or after an operative procedure.

[0020] With continuing reference to FIG. 1, the system 20 includes three main portions or procedures. Initially, fiducial points in image data defining image space are identified in block 22. Once fiducial points are identified in block 22, a matching and registration process occurs in block 24. After the matching and transformation process in block 24, a fine or refinement search can optionally be performed in block 26. After the fine search, if needed, in block 26, a final transformation or registration can occur in block 27. The registration can register the image space and the patient space based upon the matched fiducial markers and fiducial points.

[0021] The initial identification of the fiducial points in block 22 can include two main sub-portions. The sub-portions can include an initial two-dimensional (2D) search. In the 2D search, the fiducial points can be searched for in a 2D resected data set or after a 2D resection of three-dimensional (3D) data.
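The 2D slice search for fiducial point candidates can be sketched as a template comparison over a slice. This is a minimal sketch under assumptions: the function name, the brute-force scan, and the use of normalized cross-correlation as the similarity measure are illustrative choices, not the specific method of the disclosure.

```python
import numpy as np

def find_fiducial_candidates_2d(slice_2d, template, threshold=0.9):
    """Scan one 2D image slice with a small fiducial template and return
    (row, col) positions whose normalized cross-correlation with the
    template exceeds `threshold`.

    A sketch only: a full system would run this over each resected 2D
    slice and then refine surviving hits in the 3D volume.
    """
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    hits = []
    rows, cols = slice_2d.shape
    for r in range(rows - th + 1):
        for c in range(cols - tw + 1):
            window = slice_2d[r:r + th, c:c + tw]
            w = window - window.mean()
            denom = np.linalg.norm(w) * t_norm
            if denom == 0:
                continue  # flat window: correlation undefined, skip
            score = float((w * t).sum() / denom)
            if score > threshold:
                hits.append((r, c))
    return hits
```

The brute-force double loop is O(rows x cols x template size); production code would typically use an FFT-based or library-provided matcher instead.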
After the initial 2D search, a 3D refinement search can occur. The 3D refinement search can occur in three dimensional image data. As is understood in the art, two dimensional image data and three dimensional image data can be acquired with a single imaging process. For example, a plurality of two dimensional image slices can be registered together or stacked to create a three dimensional model. The 3D refinement search can then search the created three dimensional model at the locations identified in the initial two dimensional search. In addition, as discussed herein, 2D image data can be resected from the 3D data set for the 2D search.

[0022] After the search and identification of the fiducial points in block 22, a match and transformation can occur in block 24. Briefly, the matching and transformation can include finding or determining all or a subset of the fiducial markers in physical space. The finding or determining of the fiducial markers in patient space or physical space can occur according to any appropriate method, as discussed further herein. Once the fiducial markers are determined in physical space, they can be compared to the determined fiducial points in the image space, identified or determined in block 22. Initially, a distance search can be performed for both the fiducial markers found in physical space and the fiducial points determined in the image space. A transformation can then be calculated to attempt to match the set of found fiducial markers in physical space and the determined fiducial points in the image space, as discussed in greater detail herein. Once an optimized or appropriate transformation has been determined, registration of the image data to the physical space can be performed in block 27.

[0023] Optionally, a fine search for fiducial points in the image data can be performed in block 26.
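The "find best transform" portion of the matching in block 24 can be sketched as a least-squares rigid fit between matched image-space and physical-space point sets. The disclosure does not prescribe a particular fitting method; the SVD-based Kabsch approach below is one standard choice, stated here as an assumption.

```python
import numpy as np

def rigid_transform(image_pts, physical_pts):
    """Least-squares rigid (rotation + translation) fit mapping matched
    image-space points onto physical-space points (Kabsch method).
    Returns (R, t) such that physical ~= R @ image + t.
    """
    P = np.asarray(image_pts, dtype=float)     # shape (n, 3), image space
    Q = np.asarray(physical_pts, dtype=float)  # shape (n, 3), physical space
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P - p_mean).T @ (Q - q_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

With the three-point subsets of Fig. 6, such a fit would be computed per matched subset, and the transform whose residual error over all points is smallest would be kept as the "best transform."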
For example, if the matching of found physical space fiducial markers and determined fiducial points in the image space does not produce an appropriate number of matchable fiducial points and markers, additional fiducial points can be determined. For example, a second search, based upon a position of a found fiducial marker in physical space, can be used to attempt to locate an additional fiducial point in image space. It will be understood that a fine or refinement search is not required, and an appropriate number of fiducial points in image space and fiducial markers in physical space can be found initially without performing the fine search in block 26.

[0024] With reference to FIG. 2, a navigation system 30 that can be used for various procedures, including the system 20, is illustrated. The navigation system 30 can be used to track the location of a device 82, such as a pointer probe, relative to a patient 32 to assist in the implementation of the system 20. It should be further noted that the navigation system 30 may be used to navigate or track other devices, including catheters, probes, needles, leads, implants, etc. Moreover, the navigated device may be used in any region of the body. The navigation system 30 and the various devices may be used in any appropriate procedure, such as one that is generally minimally invasive, arthroscopic, percutaneous, stereotactic, or an open procedure. Although an exemplary navigation system 30 including an imaging system 34 is discussed herein, one skilled in the art will understand that the disclosure is merely for clarity of the present discussion, and any appropriate imaging system, navigation system, patient specific data, and non-patient specific data can be used. For example, the intraoperative imaging system can include an MRI imaging system, such as the PoleStar® MRI sold by Medtronic, Inc., or an O-arm™ imaging system sold by Breakaway Imaging, LLC, having a place of business in Massachusetts, USA.
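The second search described for block 26, locating a missed image-space fiducial point from the position of a found physical-space marker, can be sketched as mapping the marker through the current transform and looking for a nearby image-space candidate. The function name, the fixed search radius, and the nearest-candidate rule are illustrative assumptions, not the disclosure's specified procedure.

```python
import numpy as np

def fine_search(unmatched_physical, image_candidates, R, t, radius=5.0):
    """Sketch of the optional fine search (block 26): map an unmatched
    physical-space fiducial marker back into image space using the inverse
    of the current transform (physical ~= R @ image + t), then return the
    nearest image-space fiducial point candidate within `radius`, or None.
    """
    p = np.asarray(unmatched_physical, dtype=float)
    predicted = np.linalg.inv(R) @ (p - t)  # expected image-space location
    best, best_d = None, radius
    for c in image_candidates:
        d = float(np.linalg.norm(np.asarray(c, dtype=float) - predicted))
        if d <= best_d:
            best, best_d = tuple(c), d
    return best
```

Any candidate recovered this way could then be added to the matched subsets (the N+1 subsets of Fig. 6) and the transform recomputed.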
It will be understood that the navigation system 30 can incorporate or be used with any appropriate preoperatively or intraoperatively acquired image data.

[0025] The navigation system 30 can include the optional imaging device 34 that is used to acquire pre-, intra-, or post-operative or real-time image data of the patient 32. The image data acquired with the imaging device 34 can be used as part of the image data in the system 20. In addition, data from atlas models can be used to produce patient images, such as those disclosed in U.S. patent application Ser. No. 10/687,539, filed Oct. 16, 2003, now U.S. Pat. App. Pub. No. 2005/0085714, entitled "METHOD AND APPARATUS FOR SURGICAL NAVIGATION OF A MULTIPLE PIECE CONSTRUCT FOR IMPLANTATION", incorporated herein by reference. The optional imaging device 34 is, for example, a fluoroscopic X-ray imaging device that may be configured as a C-arm 36 having an X-ray source 38, an X-ray receiving section 40, an optional calibration and tracking target 42, and optional radiation sensors. The calibration and tracking target 42 includes calibration markers (not illustrated). Image data may also be acquired using other imaging devices, such as those discussed above and herein.

[0026] An optional imaging device controller 43 may control the imaging device 34, such as the C-arm 36, which can capture the x-ray images received at the receiving section 40 and store the images for later use. The controller 43 may also be separate from the C-arm 36 and can be part of, or incorporated into, a work station 44. The controller 43 can control the rotation of the C-arm 36. For example, the C-arm 36 can move in the direction of arrow 46 or rotate about a longitudinal axis 32a of the patient 32, allowing anterior or lateral views of the patient 32 to be imaged. Each of these movements involves rotation about a mechanical axis 48 of the C-arm 36.
The movements of the imaging device 34, such as the C-arm 36, can be tracked with a tracking device 50.

[0027] In the example of FIG. 2, the longitudinal axis 32a of the patient 32 is substantially in line with the mechanical axis 48 of the C-arm 36. This enables the C-arm 36 to be rotated relative to the patient 32, allowing images of the patient 32 to be taken from multiple directions or in multiple planes. An example of a fluoroscopic C-arm X-ray device that may be used as the optional imaging device 34 is the "Series 9600 Mobile Digital Imaging System" from GE Healthcare (formerly OEC Medical Systems, Inc.) of Salt Lake City, Utah. Other exemplary fluoroscopes include bi-plane fluoroscopic systems, ceiling mounted fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, intraoperative O-arm™ imaging systems, etc.

[0028] The C-arm X-ray imaging system 36 can be any ...