Systems and Processes Using Imaging Data To Facilitate Surgical Procedures
Systems and processes for tracking anatomy, instrumentation, trial implants, implants, and references, and rendering images and data related to them in connection with surgical operations, for example total knee arthroplasties (“TKA”). These systems and processes are accomplished by using a computer to intraoperatively obtain images of body parts and to register, navigate, and track surgical instruments.
This application is a continuation of U.S. Ser. No. 10/084,012, filed Feb. 27, 2002 and entitled “Total Knee Arthroplasty Systems and Processes,” which claims the benefit of U.S. Ser. No. 60/271,818, filed Feb. 27, 2001 and entitled “Image Guided System for Arthroplasty” and U.S. Ser. No. 60/355,899, filed Feb. 11, 2002 and entitled “Surgical Navigation Systems and Processes,” all of which are incorporated herein by this reference.
FIELD OF INVENTION

Systems and processes for tracking anatomy, implements, instrumentation, trial implants, implant components and virtual constructs or references, and rendering images and data related to them in connection with orthopedic, surgical and other operations, for example Total Knee Arthroplasty (“TKA”). Anatomical structures and such items may be attached to or otherwise associated with fiducial functionality, and constructs may be registered in position using fiducial functionality whose position and orientation can be sensed and tracked in three dimensions by systems and according to processes of the present invention in order to perform TKA. Such structures, items and constructs can be rendered onscreen, properly positioned and oriented relative to each other, using associated image files, data files, image input and other sensory input, based on the tracking. Such systems and processes, among other things, allow surgeons to navigate and perform TKA using images that reveal interior portions of the body combined with computer generated or transmitted images that show surgical implements, instruments, trials, implants, and/or other devices located and oriented properly relative to the body part. Such systems and processes allow, among other things, more accurate and effective resection of bone, placement and assessment of trial implants and joint performance, and placement and assessment of performance of actual implants.
BACKGROUND AND SUMMARY

A leading cause of wear and revision in prosthetics such as knee implants, hip implants and shoulder implants is less than optimum implant alignment. In a Total Knee Arthroplasty, for example, current instrument design for resection of bone limits the alignment of the femoral and tibial resections to average values for varus/valgus, flexion/extension, and external/internal rotation. Additionally, surgeons often use visual landmarks or “rules of thumb” for alignment, which can be misleading due to anatomical variability. Intramedullary referencing instruments also violate the femoral and tibial canal; this intrusion increases the risk of fat embolism and unnecessary blood loss in the patient. Surgeons also rely on instrumentation to predict the appropriate implant size for the femur and tibia rather than templating the appropriate size of the implants intraoperatively for optimal performance. Another challenge for surgeons is soft tissue or ligament balancing after the bone resections have been made. Releasing some of the soft tissue points can change the balance of the knee; however, the multiple options can be confusing for many surgeons. In revision TKA, for example, many of the visual landmarks are no longer present, making alignment and restoration of the joint line difficult. The present invention is applicable not only to knee repair, reconstruction or replacement surgery, but also to repair, reconstruction or replacement surgery in connection with any other joint of the body, as well as any other surgical or other operation where it is useful to track position and orientation of body parts, non-body components and/or virtual references such as rotational axes, and to display and output data regarding their positioning and orientation relative to each other for use in navigation and performance of the operation.
Several providers have developed and marketed various forms of imaging systems for use in surgery. Many are based on CT scans and/or MRI data or on digitized points on the anatomy. Other systems align preoperative CT scans, MRIs or other images with intraoperative patient positions. A preoperative planning system allows the surgeon to select reference points and to determine the final implant position. Intraoperatively, the system calibrates the patient position to that preoperative plan, such as using a “point cloud” technique, and can use a robot to make femoral and tibial preparations.
Systems and processes according to one embodiment of the present invention use position and/or orientation tracking sensors such as infrared sensors acting stereoscopically or otherwise to track positions of body parts, surgery-related items such as implements, instrumentation, trial prosthetics, prosthetic components, and virtual constructs or references such as rotational axes which have been calculated and stored based on designation of bone landmarks. Processing capability such as any desired form of computer functionality, whether standalone, networked, or otherwise, takes into account the position and orientation information as to various items in the position sensing field (which may correspond generally or specifically to all or portions or more than all of the surgical field) based on sensed position and orientation of their associated fiducials or based on stored position and/or orientation information. The processing functionality correlates this position and orientation information for each object with stored information regarding the items, such as a computerized fluoroscopic image file of a femur or tibia, a wire frame data file for rendering a representation of an instrumentation component, trial prosthesis or actual prosthesis, or a computer generated file relating to a rotational axis or other virtual construct or reference. The processing functionality then displays position and orientation of these objects on a screen or monitor, or otherwise. Thus, systems and processes according to one embodiment of the invention can display and otherwise output useful data relating to predicted or actual position and orientation of body parts, surgically related items, implants, and virtual constructs for use in navigation, assessment, and otherwise performing surgery or other operations.
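The stereoscopic position sensing described above can be sketched as follows. This is a minimal illustrative sketch only, assuming each infrared sensor reports a ray from its optical center toward a reflective element; the element is then located at the midpoint of the shortest segment between the two sensed rays. The function name and the midpoint triangulation method are assumptions introduced for illustration, not part of the disclosure.

```python
import numpy as np

def triangulate_midpoint(origin_a, dir_a, origin_b, dir_b):
    """Locate a marker as the midpoint of the shortest segment
    between two sensor rays (each ray: origin + t * direction)."""
    da = dir_a / np.linalg.norm(dir_a)
    db = dir_b / np.linalg.norm(dir_b)
    w0 = origin_a - origin_b
    a, b, c = da @ da, da @ db, db @ db
    d, e = da @ w0, db @ w0
    denom = a * c - b * b          # approaches 0 when the rays are parallel
    t_a = (b * e - c * d) / denom
    t_b = (a * e - b * d) / denom
    p_a = origin_a + t_a * da      # closest point on ray A
    p_b = origin_b + t_b * db      # closest point on ray B
    return (p_a + p_b) / 2.0
```

Repeating this per reflective element, with calibrated sensor geometry, yields the three or more points needed to recover a fiducial's full position and orientation.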
As one example, images such as fluoroscopy images showing internal aspects of the femur and tibia can be displayed on the monitor in combination with actual or predicted shape, position and orientation of surgical implements, instrumentation components, trial implants, actual prosthetic components, and rotational axes in order to allow the surgeon to properly position and assess performance of various aspects of the joint being repaired, reconstructed or replaced. The surgeon may navigate tools, instrumentation, trial prostheses, actual prostheses and other items relative to bones and other body parts in order to perform TKA's more accurately, efficiently, and with better alignment and stability. Systems and processes according to the present invention can also use the position tracking information and, if desired, data relating to shape and configuration of surgical related items and virtual constructs or references in order to produce numerical data which may be used with or without graphic imaging to perform tasks such as assessing performance of trial prosthetics statically and throughout a range of motion, appropriately modifying tissue such as ligaments to improve such performance and similarly assessing performance of actual prosthetic components which have been placed in the patient for alignment and stability. Systems and processes according to the present invention can also generate data based on position tracking and, if desired, other information to provide cues on screen, aurally or as otherwise desired to assist in the surgery such as suggesting certain bone modification steps or measures which may be taken to release certain ligaments or portions of them based on performance of components as sensed by systems and processes according to the present invention.
According to a preferred embodiment of systems and processes according to the present invention, at least the following steps are involved:
1. Obtaining appropriate images, such as fluoroscopy images, of appropriate body parts such as the femur and tibia, the imager being tracked in position via an associated fiducial whose position and orientation is tracked by position/orientation sensors such as stereoscopic infrared (active or passive) sensors according to the present invention.
2. Registering tools, instrumentation, trial components, prosthetic components, and other items to be used in surgery, each of which corresponds to a fiducial whose position and orientation can be tracked by the position/orientation sensors.
3. Locating and registering body structure, such as by designating points on the femur and tibia using a probe associated with a fiducial, in order to provide the processing functionality information relating to the body part such as rotational axes.
4. Navigating and positioning instrumentation such as cutting instrumentation in order to modify bone, at least partially using images generated by the processing functionality corresponding to what is being tracked, has been tracked, and/or is predicted by the system, and thereby resecting bone effectively, efficiently and accurately.
5. Navigating and positioning trial components such as femoral components and tibial components, some or all of which may be installed using impactors with a fiducial and, if desired, at the appropriate time discontinuing tracking the position and orientation of the trial component using the impactor fiducial and starting to track that position and orientation using the body part fiducial on which the component is installed.
6. Assessing alignment and stability of the trial components and joint, both statically and dynamically as desired, using images of the body parts in combination with images of the trial components while conducting appropriate rotation, anterior-posterior drawer and flexion/extension tests and automatically storing and calculating results to present data or information which allows the surgeon to assess alignment and stability.
7. Releasing tissue such as ligaments if necessary and adjusting trial components as desired for acceptable alignment and stability.
8. Installing implant components whose positions may be tracked at first via fiducials associated with impactors for the components and then tracked via fiducials on the body parts in which the components are installed.
9. Assessing alignment and stability of the implant components and joint by use of some or all tests mentioned above and/or other tests as desired, releasing tissue if desired, adjusting if desired, and otherwise verifying acceptable alignment, stability and performance of the prosthesis, both statically and dynamically.
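Step 5 above, handing tracking of a seated trial component from the impactor fiducial to the body part fiducial, can be sketched with homogeneous transforms. This is an illustrative sketch only; the 4x4 matrix representation and the function names are assumptions for illustration.

```python
import numpy as np

def handoff(T_world_component, T_world_bone):
    """At the moment the trial is seated, record its (now fixed) pose
    relative to the bone fiducial:
    T_bone_component = inverse(T_world_bone) @ T_world_component."""
    return np.linalg.inv(T_world_bone) @ T_world_component

def track_via_bone(T_world_bone_now, T_bone_component):
    """Thereafter, recover the component's world pose from the bone
    fiducial alone, with no further need for the impactor fiducial."""
    return T_world_bone_now @ T_bone_component
```

Because the recorded relative pose is constant once the component is seated, the impactor and its fiducial can be removed while tracking continues uninterrupted.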
This process, or processes including it or some of it may be used in any total or partial joint repair, reconstruction or replacement, including knees, hips, shoulders, elbows, ankles and any other desired joint in the body.
Such processes are disclosed in U.S. Ser. No. 60/271,818 filed Feb. 27, 2001, entitled Image Guided System for Arthroplasty, which is incorporated herein by reference as are all documents incorporated by reference therein.
Systems and processes according to the present invention represent significant improvement over other, previous systems and processes. For instance, systems which use CT and MRI data generally require the placement of reference frames pre-operatively which can lead to infection at the pin site. The resulting 3D images must then be registered, or calibrated, to the patient anatomy intraoperatively. Current registration methods are less accurate than the fluoroscopic system. These imaging modalities are also more expensive. Some “imageless” systems, or non-imaging systems, require digitizing a large number of points to define the complex anatomical geometries of the knee at each desired site. This can be very time intensive resulting in longer operating room time. Other imageless systems determine the mechanical axis of the knee by performing an intraoperative kinematic motion to determine the center of rotation at the hip, knee, and ankle. This requires placement of reference frames at the iliac crest of the pelvis and in or on the ankle. This calculation is also time consuming as the system must find multiple points in different planes in order to find the center of rotation. This is also problematic in patients with a pathologic condition. Ligaments and soft tissues in the arthritic patient are not normal and thus will give a center of rotation that is not desirable for normal knees. Robotic systems require expensive CT or MRI scans and also require pre-operative placement of reference frames, usually the day before surgery. These systems are also much slower, almost doubling operating room time and expense.
None of these systems can effectively track femoral and/or tibial trials during a range of motion and calculate the relative positions of the articular surfaces, among other things. Also, none of them currently make suggestions on ligament balancing, display ligament balancing techniques, or surgical techniques. Additionally, none of these systems currently track the patella.
An object of certain aspects of the present invention is to use computer processing functionality in combination with imaging and position and/or orientation tracking sensors to present to the surgeon during surgical operations visual and data information useful to navigate, track and/or position implements, instrumentation, trial components, prosthetic components and other items and virtual constructs relative to the human body in order to improve performance of a repaired, replaced or reconstructed knee joint.
Another object of certain aspects of the present invention is to use computer processing functionality in combination with imaging and position and/or orientation tracking sensors to present to the surgeon during surgical operations visual and data information useful to assess performance of a knee and certain items positioned therein, including components such as trial components and prosthetic components, for stability, alignment and other factors, and to adjust tissue and body and non-body structure in order to improve such performance of a repaired, reconstructed or replaced knee joint.
Another object of certain aspects of the present invention is to use computer processing functionality in combination with imaging and position and/or orientation tracking sensors to present to the surgeon during surgical operations visual and data information useful to show predicted position and movement of implements, instrumentation, trial components, prosthetic components and other items and virtual constructs relative to the human body in order to select appropriate components, resect bone accurately, effectively and efficiently, and thereby improve performance of a repaired, replaced or reconstructed knee joint.
Other objects, features and advantages of the present invention are apparent with respect to the remainder of this document.
Systems and processes according to a preferred embodiment of the present invention use computer capacity, standalone and/or networked, to store data regarding spatial aspects of surgically related items and virtual constructs or references, including body parts, implements, instrumentation, trial components, prosthetic components and rotational axes of body parts. Any or all of these may be physically or virtually connected to or incorporate any desired form of mark, structure, component, or other fiducial or reference device or technique which allows position and/or orientation of the item to which it is attached to be sensed and tracked, preferably in three dimensions of translation and three degrees of rotation, as well as in time if desired. In the preferred embodiment, such “fiducials” are reference frames each containing at least three, preferably four, and sometimes more, reflective elements such as spheres reflective of lightwave or infrared energy, or active elements such as LEDs.
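The recovery of a fiducial's position and orientation from three or more sensed sphere centers can be sketched with the SVD-based Kabsch method, assuming each fiducial's marker layout is known in its own local frame and point correspondences have been established. An illustrative sketch only; the names are hypothetical.

```python
import numpy as np

def fiducial_pose(local_pts, world_pts):
    """Rigid transform (R, t) mapping a fiducial's known marker layout
    (local_pts, Nx3) onto the sensed sphere centers (world_pts, Nx3)
    via the SVD-based Kabsch method.  Needs N >= 3 non-collinear markers."""
    lc = local_pts.mean(axis=0)
    wc = world_pts.mean(axis=0)
    H = (local_pts - lc).T @ (world_pts - wc)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force a proper rotation with det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = wc - R @ lc
    return R, t
```

With noisy measurements the same code returns the least-squares best-fit pose, which is why four or more markers per fiducial improve robustness.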
In a preferred embodiment, orientation of the elements on a particular fiducial varies from one fiducial to the next so that sensors according to the present invention may distinguish between various components to which the fiducials are attached in order to correlate for display and other purposes data files or images of the components. In a preferred embodiment of the present invention, some fiducials use reflective elements and some use active elements, both of which may be tracked by preferably two, sometimes more infrared sensors whose output may be processed in concert to geometrically calculate position and orientation of the item to which the fiducial is attached.
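The geometric distinction between fiducials described above can be sketched by matching sorted pairwise inter-marker distances, which are invariant under rotation and translation. An illustrative sketch under the assumption that each fiducial's marker layout is catalogued; the names, millimeter units and tolerance are hypothetical.

```python
import numpy as np
from itertools import combinations

def distance_signature(pts):
    """Sorted pairwise distances between marker centers -- invariant
    under rigid motion, so the signature survives any repositioning."""
    return sorted(np.linalg.norm(a - b) for a, b in combinations(pts, 2))

def identify_fiducial(sensed_pts, catalog, tol=0.5):
    """Return the name of the catalog fiducial whose marker geometry
    best matches the sensed spheres, or None if nothing is close."""
    sig = distance_signature(sensed_pts)
    best, best_err = None, tol
    for name, layout in catalog.items():
        ref = distance_signature(layout)
        if len(ref) != len(sig):
            continue
        err = max(abs(a - b) for a, b in zip(sig, ref))
        if err < best_err:
            best, best_err = name, err
    return best
```

This is why the disclosure varies the element layout from one fiducial to the next: distinct signatures let the sensors associate each tracked frame with the correct component's data file or image.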
Position/orientation tracking sensors and fiducials need not be confined to the infrared spectrum. Any electromagnetic, electrostatic, light, sound, radiofrequency or other desired technique may be used. Alternatively, each item such as a surgical implement, instrumentation component, trial component, implant component or other device may contain its own “active” fiducial such as a microchip with appropriate field sensing or position/orientation sensing functionality and a communications link such as a spread spectrum RF link, in order to report position and orientation of the item. Such active fiducials, or hybrid active/passive fiducials such as transponders, can be implanted in the body parts or in any of the surgically related devices mentioned above, or conveniently located at their surface or otherwise as desired. Fiducials may also take the form of conventional structures such as a screw driven into a bone, or any other three dimensional item attached to another item, the position and orientation of which can be tracked in order to track position and orientation of body parts and surgically related items. Hybrid fiducials may be partly passive and partly active, such as inductive components or transponders which respond with a certain signal or data set when queried by sensors according to the present invention.
Systems and processes according to a preferred embodiment of the present invention employ a computer to calculate and store reference axes of body components such as in a TKA, for example, the mechanical axis of the femur and tibia. From these axes such systems track the position of the instrumentation and osteotomy guides so that bone resections will locate the implant position optimally, usually aligned with the mechanical axis. Furthermore, during trial reduction of the knee, the systems provide feedback on the balancing of the ligaments in a range of motion and under varus/valgus, anterior/posterior and rotary stresses and can suggest or at least provide more accurate information than in the past about which ligaments the surgeon should release in order to obtain correct balancing, alignment and stability. Systems and processes according to the present invention can also suggest modifications to implant size, positioning, and other techniques to achieve optimal kinematics. Systems and processes according to the present invention can also include databases of information regarding tasks such as ligament balancing, in order to provide suggestions to the surgeon based on performance of test results as automatically calculated by such systems and processes.
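The mechanical axis computation described above can be sketched as follows, assuming the hip, knee and ankle centers have already been designated or calculated as landmarks. A signed varus/valgus reading would additionally require a coronal plane convention not shown here; the function name and landmark conventions are hypothetical.

```python
import numpy as np

def mechanical_axis_angle(hip_center, knee_center, ankle_center):
    """Angle (degrees) between the femoral mechanical axis
    (hip center -> knee center) and the tibial mechanical axis
    (knee center -> ankle center).  0 degrees = perfectly aligned."""
    femoral = knee_center - hip_center
    tibial = ankle_center - knee_center
    cosang = femoral @ tibial / (np.linalg.norm(femoral) * np.linalg.norm(tibial))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
```

Resections aligned with the mechanical axis drive this deviation toward zero, which is the optimization target the instrumentation tracking supports.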
In the embodiment shown in
Computing functionality 18 can process, store and output on monitor 24 and otherwise various forms of data which correspond in whole or part to body parts 10 and 12 and other components or items 22. For example, in the embodiment shown in
The computer functionality 18 can also store data relating to configuration, size and other properties of items 22 such as implements, instrumentation, trial components, implant components and other items used in surgery. When those are introduced into the field of position/orientation sensor 16, computer functionality 18 can generate and display overlaid or in combination with the fluoroscopic images of the body parts 10 and 12, computer generated images of implements, instrumentation components, trial components, implant components and other items 22 for navigation, positioning, assessment and other uses.
Additionally, computer functionality 18 can track any point in the position/orientation sensor 16 field, such as by using a designator or a probe 26. The probe also can contain or be attached to a fiducial 14. The surgeon, nurse, or other user touches the tip of probe 26 to a point such as a landmark on bone structure and actuates the foot pedal 20 or otherwise instructs the computer 18 to note the landmark position. The position/orientation sensor 16 “sees” the position and orientation of fiducial 14; the computer 18 “knows” where the tip of probe 26 is relative to that fiducial 14 and thus calculates and stores, and can display on monitor 24 whenever desired and in whatever form, fashion or color, the point or other position designated by probe 26 when the foot pedal 20 is depressed or another command is given. Thus, probe 26 can be used to designate landmarks on bone structure in order to allow the computer 18 to store and track, relative to movement of the bone fiducial 14, virtual or logical information such as mechanical axis 28, medial/lateral axis 30 and anterior/posterior axis 32 of femur 12, tibia 10 and other body parts, in addition to any other virtual or actual construct or reference.
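The probe tip calculation described above can be sketched as follows, assuming a calibrated tip offset expressed in the probe fiducial's local frame and poses reported as a rotation matrix plus translation vector. Storing the designated point in the bone fiducial's frame is what lets it move with the bone; all names here are hypothetical.

```python
import numpy as np

def probe_tip_world(R_probe, t_probe, tip_offset_local):
    """World position of the probe tip, given the tracked pose (R, t)
    of the probe's fiducial and the calibrated tip offset expressed
    in the fiducial's local frame."""
    return R_probe @ tip_offset_local + t_probe

def landmark_in_bone_frame(tip_world, R_bone, t_bone):
    """Express the designated point relative to the bone fiducial so
    it stays attached to the bone as the limb moves."""
    return R_bone.T @ (tip_world - t_bone)
```

Landmarks recorded this way (hip, knee and ankle centers, epicondyles, and so on) are the inputs from which axes 28, 30 and 32 are subsequently computed.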
Systems and processes according to an embodiment of the present invention such as the subject of
Similarly, the mechanical axis and other axes or constructs of body parts 10 and 12 can also be “registered” for tracking by the system. Again, the system has employed a fluoroscope to obtain images of the femoral head, knee and ankle of the sort shown in
After the mechanical axis and other rotation axes and constructs relating to the femur and tibia are established, instrumentation can be properly oriented to resect or modify bone in order to fit trial components and implant components properly according to the embodiment of the invention shown in
Once the extramedullary rod, intramedullary rod, or other type of rod has been placed, instrumentation can be positioned as tracked in position and orientation by sensor 16 and displayed on monitor 24. Thus, a cutting block of the sort used to establish the condylar anterior cut, with its fiducial 14 attached, is introduced into the field and positioned on the rod. Because the cutting block corresponds to a particular implant product and can be adjusted and designated on screen to correspond to a particular implant size of that product, the computer 18 can generate and display a graphic of the cutting block and the femoral component overlain on the fluoroscopic image as shown in
In a similar fashion, instrumentation may be navigated and positioned on the proximal portion of the tibia 10 as shown in
Once resection and modification of bone has been accomplished, implant trials can then be installed and tracked by the system in a manner similar to navigating and positioning the instrumentation, as displayed on the screen 24. Thus, a femoral component trial, a tibial plateau trial, and a bearing plate trial may be placed as navigated on screen using computer generated overlays corresponding to the trials.
During the trial installation process, and also during the implant component installation process, instrument positioning process or at any other desired point in surgical or other operations according to the present invention, the system can transition or segue from tracking a component according to a first fiducial to tracking the component according to a second fiducial. Thus, as shown in
Once the trial components are installed, the surgeon can assess alignment and stability of the components and the joint. During such assessment, in trial reduction, the computer can display on monitor 24 the relative motion between the trial components to allow the surgeon to make soft tissue releases and changes in order to improve the kinematics of the knee. The system can also apply rules and/or intelligence to make suggestions based on the information such as what soft tissue releases to make if the surgeon desires. The system can also display how the soft tissue releases are to be made.
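The relative motion between trial components described above can be sketched by comparing the two tracked orientations. This illustrative sketch reports only the overall angle between the femoral and tibial frames; a clinical decomposition into flexion/extension, varus/valgus and internal/external rotation would use a fixed joint coordinate convention not shown here. The names are hypothetical.

```python
import numpy as np

def relative_rotation_deg(R_femur, R_tibia):
    """Overall rotation angle (degrees) between the femoral and tibial
    trial frames, recovered from the trace of the relative rotation
    matrix: trace(R) = 1 + 2*cos(angle)."""
    R_rel = R_femur.T @ R_tibia
    cosang = (np.trace(R_rel) - 1.0) / 2.0
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
```

Sampling this quantity (and the corresponding translations) throughout a range of motion is the raw data from which the system's stability and balancing feedback can be derived.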
At the end of the case, all alignment information can be saved for the patient file. This is of great assistance to the surgeon because the outcome of implant positioning can be seen before any resections have been made to the bone. The system is also capable of tracking the patella and the resulting placement of cutting guides and the patellar trial position. The system then tracks alignment of the patella with the patellar femoral groove and gives feedback on issues such as patellar tilt.
The tracking and image information provided by systems and processes according to the present invention facilitates telemedical techniques, because it provides useful images for distribution to distant geographic locations where expert surgical or medical specialists may collaborate during surgery. Thus, systems and processes according to the present invention can be used in connection with computing functionality 18 which is networked or otherwise in communication with computing functionality in other locations, whether by PSTN, information exchange infrastructures such as packet switched networks including the Internet, or as otherwise desired. Such remote imaging may occur on computers, wireless devices, videoconferencing devices or in any other mode or on any other platform which is now or may in the future be capable of rendering images or parts of them produced in accordance with the present invention. Parallel communication links such as switched or unswitched telephone call connections may also accompany or form part of such telemedical techniques. Distant databases such as online catalogs of implant suppliers or prosthetics buyers or distributors may form part of or be networked with functionality 18 to give the surgeon real-time access to additional options for implants which could be procured and used during the surgical operation.
Claims
1-20. (canceled)
21. A system for facilitating a joint arthroplasty on a particular patient's joint, the system comprising a computer display system for planning the joint arthroplasty on the particular patient's joint, the computer display system comprising:
- (a) an input configured to receive anatomic structure data about at least a portion of the patient's joint and to receive first device data about a first size of an orthopaedic implant;
- (b) a processor; and
- (c) an output configured to cause an electronic display to display: (i) a visual representation of the portion of the patient's joint; (ii) a visual representation of at least one predicted resection relative to the visual representation of the portion of the patient's joint; (iii) a visual representation of a predicted position and orientation of the first size of the orthopaedic implant relative to the visual representation of the portion of the joint; (iv) a numerical representation of a predicted internal/external rotational alignment of the orthopaedic implant relative to the patient's joint; (v) a numerical representation of a predicted varus/valgus rotational alignment of the orthopaedic implant relative to the patient's joint; and (vi) a numerical representation of a predicted flexion/extension rotational alignment of the orthopaedic implant relative to the patient's joint.
22. The system of claim 21, further comprising the first size of orthopaedic implant.
23. The system of claim 21, wherein the at least one predicted resection comprises a distal femoral resection and a tibial resection; and wherein the system further comprises a distal femoral cutting guide configured to facilitate the distal femoral resection and a tibial cutting guide configured to facilitate the tibial resection.
24. The system of claim 21, wherein the input is configured to receive data relating to additional sizes of the orthopaedic implant, and wherein the output is configured to drive the display to display a numerical size indicia of the orthopaedic implant.
25. The system of claim 24, wherein the output is configured to cause the display to alternatively display a visual representation of a predicted position and orientation of at least one of the additional sizes of the orthopaedic implant.
26. The system of claim 21, wherein the input is configured to receive data relating to a change to at least one of the predicted internal/external rotational alignment, predicted varus/valgus rotational alignment, and the flexion/extension rotational alignment.
27. The system of claim 26, wherein the output is configured to cause the display to display an updated visual representation of the predicted position and orientation of the first size of the orthopaedic implant based on the received data relating to the change.
28. The system of claim 26, wherein the output is configured to cause the display to display an updated visual representation of the at least one predicted resection based on the received data relating to the change.
29. The system of claim 28, wherein the output is configured to cause the display to display an updated numerical representation of at least one of the predicted internal/external rotational alignment, predicted varus/valgus rotational alignment, and the flexion/extension rotational alignment based on the received data relating to the change.
30. The system of claim 26, wherein the output is configured to cause the display to display an updated numerical representation of at least one of the predicted internal/external rotational alignment, predicted varus/valgus rotational alignment, and the flexion/extension rotational alignment based on the received data relating to the change.
31. The system of claim 21, wherein the input is configured to receive data relating to an axis of the patient's joint.
32. The system of claim 21, wherein the input is configured to receive data relating to a mechanical axis of the patient's joint.
33. The system of claim 21, wherein the output is configured to cause the display to display a visual representation of an axis of the patient's joint.
34. The system of claim 21, wherein the output is configured to cause the display to display a visual representation of a mechanical axis of the patient's joint.
35. The system of claim 21, wherein the input is configured to receive data relating to an actual position and orientation of a surgical instrument relative to the patient's joint.
36. The system of claim 35, wherein the processor is configured to automatically determine a position and orientation of the at least one predicted resection based on the data relating to the actual position and orientation of the surgical instrument relative to the patient's joint.
37. The system of claim 21, wherein the output is configured to cause the display to display the visual representation of the predicted position and orientation of the first size of the orthopaedic implant in an anterior-posterior view and a lateral view.
38. A system for facilitating a joint arthroplasty on a particular patient's joint, the system comprising a computer display system for planning the joint arthroplasty on the particular patient's joint, the computer display system comprising:
- (a) an input configured to receive anatomic structure data about at least a portion of the patient's joint and to receive first device data about a first size of an orthopaedic implant;
- (b) a processor; and
- (c) an output configured to cause an electronic display to display: (i) a visual representation of the portion of the patient's joint; (ii) a visual representation of at least one predicted resection relative to the visual representation of the portion of the patient's joint; (iii) a numerical representation of a predicted varus/valgus rotational alignment of the orthopaedic implant relative to the patient's joint; and (iv) a numerical representation of a predicted flexion/extension rotational alignment of the orthopaedic implant relative to the patient's joint.
39. The system of claim 38, wherein the at least one predicted resection comprises a distal femoral resection, an anterior femoral resection, a posterior femoral resection, an anterior chamfer femoral resection, a posterior chamfer femoral resection, and a tibial resection; and wherein the system further comprises a distal femoral cutting guide configured to facilitate the distal femoral resection, a four-in-one femoral cutting guide configured to facilitate the anterior, posterior, anterior chamfer and posterior chamfer resections, and a tibial cutting guide configured to facilitate the tibial resection.
40. The system of claim 38, wherein the at least one predicted resection comprises a distal femoral resection and a tibial resection; and wherein the system further comprises a distal femoral cutting guide configured to facilitate the distal femoral resection and a tibial cutting guide configured to facilitate the tibial resection.
41. The system of claim 38, wherein the input is configured to receive data relating to additional sizes of the orthopaedic implant, and wherein the output is configured to cause the display to display a numerical size indicia of the orthopaedic implant.
42. The system of claim 38, wherein the input is configured to receive data relating to a change to at least one of the predicted varus/valgus rotational alignment and the flexion/extension rotational alignment.
43. The system of claim 42, wherein the output is configured to cause the display to display an updated visual representation of the at least one predicted resection based on the received data relating to the change.
44. The system of claim 43, wherein the output is configured to cause the display to display an updated numerical representation of at least one of the predicted varus/valgus rotational alignment and the flexion/extension rotational alignment based on the received data relating to the change.
45. The system of claim 38, wherein the output is configured to cause the display to display the visual representation of the at least one predicted resection in an anterior-posterior view and a lateral view.
46. A system for facilitating a joint arthroplasty on a particular patient's joint, the system comprising a computer display system for planning the joint arthroplasty on the particular patient's joint, the computer display system comprising:
- (a) an input configured to receive anatomic structure data about at least a portion of the patient's joint and to receive first device data about a first size of an orthopaedic implant;
- (b) a processor; and
- (c) an output configured to cause an electronic display to display: (i) a visual representation of the portion of the patient's joint; (ii) a visual representation of a predicted position and orientation of the first size of the orthopaedic implant relative to the visual representation of the portion of the joint; and
- (iii) a numerical representation of a flexion/extension rotational alignment of the orthopaedic implant relative to the patient's joint.
47. A process for planning and facilitating a joint arthroplasty procedure on a particular patient's joint, the process comprising:
- (a) receiving, by a computing device including a processor, anatomic structure data about at least a portion of the patient's joint and first device data about a first size of an orthopaedic implant; and
- (b) based on the received data, and using the computing device, causing an electronic display to display: (i) a visual representation of the portion of the patient's joint; (ii) a visual representation of at least one predicted resection relative to the visual representation of the portion of the patient's joint; (iii) a visual representation of a predicted position and orientation of the first size of the orthopaedic implant relative to the visual representation of the portion of the joint; (iv) a numerical representation of a predicted internal/external rotational alignment of the orthopaedic implant relative to the patient's joint; (v) a numerical representation of a predicted varus/valgus rotational alignment of the orthopaedic implant relative to the patient's joint; and (vi) a numerical representation of a predicted flexion/extension rotational alignment of the orthopaedic implant relative to the patient's joint.
48. The process of claim 47, further comprising providing the first size of the orthopaedic implant.
49. The process of claim 47, further comprising providing a distal femoral cutting guide and a tibial cutting guide; wherein the at least one predicted resection comprises a distal femoral resection and a tibial resection; and wherein the distal femoral cutting guide is configured to facilitate the distal femoral resection and the tibial cutting guide is configured to facilitate the tibial resection.
50. The process of claim 47, further comprising:
- (a) receiving, by the computing device, data relating to additional sizes of the orthopaedic implant; and
- (b) based on the received data, and using the computing device, causing the display to display a numerical size indicia of the orthopaedic implant.
51. The process of claim 50, further comprising using the computing device to cause the display to alternatively display a visual representation of a predicted position and orientation of at least one of the additional sizes of the orthopaedic implant.
52. The process of claim 47, further comprising receiving, by the computing device, data relating to a change to at least one of the predicted internal/external rotational alignment, predicted varus/valgus rotational alignment, and the flexion/extension rotational alignment.
53. The process of claim 52, further comprising using the computing device to cause the display to display an updated visual representation of the predicted position and orientation of the first size of the orthopaedic implant based on the received data relating to the change.
54. The process of claim 52, further comprising using the computing device to cause the display to display an updated visual representation of the at least one predicted resection based on the received data relating to the change.
55. The process of claim 54, further comprising using the computing device to cause the display to display an updated numerical representation of at least one of the predicted internal/external rotational alignment, predicted varus/valgus rotational alignment, and the flexion/extension rotational alignment based on the received data relating to the change.
56. The process of claim 52, further comprising using the computing device to cause the display to display an updated numerical representation of at least one of the predicted internal/external rotational alignment, predicted varus/valgus rotational alignment, and the flexion/extension rotational alignment based on the received data relating to the change.
57. The process of claim 47, further comprising receiving, by the computing device, data relating to an axis of the patient's joint.
58. The process of claim 47, further comprising receiving, by the computing device, data relating to a mechanical axis of the patient's joint.
59. The process of claim 47, further comprising using the computing device to cause the display to display a visual representation of an axis of the patient's joint.
60. The process of claim 47, further comprising using the computing device to cause the display to display a visual representation of a mechanical axis of the patient's joint.
61. The process of claim 47, further comprising receiving, by the computing device, data relating to an actual position and orientation of a surgical instrument relative to the patient's joint.
62. The process of claim 61, wherein the processor of the computing device automatically determines a position and orientation of the at least one predicted resection based on the data relating to the actual position and orientation of the surgical instrument relative to the patient's joint.
63. The process of claim 47, further comprising using the computing device to cause the display to display the visual representation of the predicted position and orientation of the first size of the orthopaedic implant in an anterior-posterior view and a lateral view.
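The axis-related steps recited above (receiving data relating to a mechanical axis of the patient's joint and displaying a visual representation of it) can be sketched as below. This is an illustrative sketch only: it assumes the mechanical axis is represented as the line through registered hip- and ankle-center landmarks, and all function names are hypothetical.

```python
import math

def mechanical_axis(hip_center, ankle_center):
    """Represent the leg's mechanical axis as the line from the hip
    (femoral head) center to the ankle center, returned as an origin
    point plus a unit direction vector. Landmark coordinates are assumed
    to come from registration data."""
    v = tuple(a - h for h, a in zip(hip_center, ankle_center))
    length = math.sqrt(sum(c * c for c in v))
    direction = tuple(c / length for c in v)
    return hip_center, direction

def angle_between_axes(dir_a, dir_b):
    """Angle in degrees between two unit axis directions, e.g. to report
    deviation of a planned resection normal from the mechanical axis."""
    dot = sum(a * b for a, b in zip(dir_a, dir_b))
    dot = max(-1.0, min(1.0, dot))  # guard acos against rounding
    return math.degrees(math.acos(dot))
```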
64. A process for planning and facilitating a joint arthroplasty procedure on a particular patient's joint, the process comprising:
- (a) receiving, by a computing device including a processor, anatomic structure data about at least a portion of the patient's joint and first device data about a first size of an orthopaedic implant; and
- (b) based on the received data, and using the computing device, causing an electronic display to display: (i) a visual representation of the portion of the patient's joint; (ii) a visual representation of at least one predicted resection relative to the visual representation of the portion of the patient's joint; (iii) a numerical representation of a predicted varus/valgus rotational alignment of the orthopaedic implant relative to the patient's joint; and (iv) a numerical representation of a predicted flexion/extension rotational alignment of the orthopaedic implant relative to the patient's joint.
65. The process of claim 64, further comprising providing a femoral cutting guide and a tibial cutting guide; wherein the at least one predicted resection comprises a distal femoral resection and a tibial resection; and wherein the distal femoral cutting guide is configured to facilitate the distal femoral resection and the tibial cutting guide is configured to facilitate the tibial resection.
66. The process of claim 64, further comprising:
- (a) receiving, by the computing device, data relating to additional sizes of the orthopaedic implant; and
- (b) based on the received data, and using the computing device, causing the display to display a numerical size indicia of the orthopaedic implant.
67. The process of claim 64, further comprising receiving, by the computing device, data relating to a change to at least one of the predicted varus/valgus rotational alignment and the flexion/extension rotational alignment.
68. The process of claim 67, further comprising using the computing device to cause the display to display an updated visual representation of the at least one predicted resection based on the received data relating to the change.
69. The process of claim 68, further comprising using the computing device to cause the display to display an updated numerical representation of at least one of the predicted varus/valgus rotational alignment and the flexion/extension rotational alignment based on the received data relating to the change.
70. The process of claim 64, further comprising using the computing device to cause the display to display the visual representation of the at least one predicted resection in an anterior-posterior view and a lateral view.
71. A process for planning and facilitating a joint arthroplasty procedure on a particular patient's joint, the process comprising:
- (a) receiving, by a computing device including a processor, data about at least a portion of the patient's joint and data about a first size of an orthopaedic implant; and
- (b) based on the received data, and using the computing device, causing an electronic display to display: (i) a visual representation of the portion of the patient's joint; (ii) a visual representation of a predicted position and orientation of the first size of the orthopaedic implant relative to the visual representation of the portion of the joint; and (iii) a numerical representation of a flexion/extension rotational alignment of the orthopaedic implant relative to the patient's joint.
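The claims reciting automatic determination of a predicted resection from the actual tracked position and orientation of a surgical instrument can be sketched as below. This is an illustrative sketch under stated assumptions: the instrument's tracked pointing direction is taken as the normal of the predicted resection plane, a modeling choice that is not specified by the claims, and all names are hypothetical.

```python
import math

def predicted_resection_plane(instrument_position, instrument_direction):
    """Derive a predicted resection plane, in point-normal form n.x = d,
    from the tracked position and orientation of a surgical instrument
    relative to the patient's joint."""
    norm = math.sqrt(sum(c * c for c in instrument_direction))
    n = tuple(c / norm for c in instrument_direction)  # unit normal
    d = sum(nc * pc for nc, pc in zip(n, instrument_position))
    return n, d

def signed_distance(point, plane):
    """Signed distance of an anatomic landmark from the predicted
    resection plane, useful when rendering the plane relative to the
    visual representation of the joint."""
    n, d = plane
    return sum(nc * pc for nc, pc in zip(n, point)) - d
```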
Type: Application
Filed: May 14, 2012
Publication Date: Sep 6, 2012
Inventor: Christopher P. Carson (Seymour, CT)
Application Number: 13/470,765