Patents by Inventor William E. Higgins

William E. Higgins has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10667679
    Abstract: A global registration system and method identifies bronchoscope position without the need for significant bronchoscope maneuvers, technician intervention, or electromagnetic sensors. Virtual bronchoscopy (VB) renderings of a 3D airway tree are obtained including VB views of branch positions within the airway tree. At least one real bronchoscopic (RB) video frame is received from a bronchoscope inserted into the airway tree. An algorithm according to the invention is executed on a computer to identify the several most likely branch positions having a VB view closest to the received RB view, and the 3D position of the bronchoscope within the airway tree is determined in accordance with the branch position identified in the VB view. The preferred embodiment involves a fast local registration search over all the branches in a global airway-bifurcation search space, with the weighted normalized sum of squares distance metric used for finding the best match.
    Type: Grant
    Filed: March 22, 2018
    Date of Patent: June 2, 2020
    Assignee: The Penn State Research Foundation
    Inventors: William E. Higgins, Rahul Khare, Scott A. Merritt
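A minimal sketch of the branch-matching idea in the abstract above (patent 10667679): each candidate VB rendering is compared to the real bronchoscopic frame with a weighted normalized sum-of-squares distance, and the branch whose view scores lowest is taken as the bronchoscope's position. The function names, the zero-mean/unit-variance normalization, and the per-pixel weight array are illustrative assumptions, not the patented implementation.

        import numpy as np

        def weighted_nssd(rb_frame, vb_view, weights):
            """Weighted normalized sum-of-squares distance between a real
            bronchoscopic (RB) frame and a virtual bronchoscopy (VB) view.
            All arguments are 2-D arrays of the same shape."""
            rb = (rb_frame - rb_frame.mean()) / (rb_frame.std() + 1e-8)
            vb = (vb_view - vb_view.mean()) / (vb_view.std() + 1e-8)
            return float(np.sum(weights * (rb - vb) ** 2) / np.sum(weights))

        def best_branch(rb_frame, vb_views, weights):
            """Index of the branch whose VB view best matches the RB frame."""
            scores = [weighted_nssd(rb_frame, vb, weights) for vb in vb_views]
            return int(np.argmin(scores))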
  • Patent number: 10482606
    Abstract: This invention relates generally to medical imaging and, in particular, to a method and system for automatic lymph-node station mapping and automatic path or route report generation. A computer-based system for automatically locating the central chest lymph-node stations in a 3D MDCT image is described. Automated analysis methods extract the airway tree, airway-tree centerlines, aorta, pulmonary artery, lungs, key skeletal structures, and major-airway labels. Geometrical and anatomical cues arising from the extracted structures are used to localize the major nodal stations. The system calculates and displays the nodal stations in 3D. Visualization tools within the system enable the user to interact with the stations to locate visible lymph nodes.
    Type: Grant
    Filed: May 16, 2017
    Date of Patent: November 19, 2019
    Assignee: The Penn State Research Foundation
    Inventors: William E. Higgins, Jason D. Gibbs, Kun-Chang Yu, Michael W. Graham, Kongkuo Lu
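The geometric localization described above (patent 10482606) can be illustrated, very loosely, by placing a station region at a fixed offset from landmarks derived from the extracted structures. The landmark choice, offsets, and box size below are made-up illustrative values, not the station definitions used by the patented system.

        import numpy as np

        def station_box(carina, axis, offset_mm, size_mm):
            """Axis-aligned box for one nodal station, positioned at a fixed
            offset from the carina along a given anatomical axis.  All values
            are purely illustrative."""
            center = np.asarray(carina, float) + np.asarray(axis, float) * offset_mm
            half = np.asarray(size_mm, float) / 2.0
            return center - half, center + half  # (min corner, max corner)

        # e.g. a subcarinal-style region 15 mm caudal to the carina (made-up numbers)
        lo, hi = station_box(carina=[0.0, 0.0, 0.0], axis=[0.0, 0.0, -1.0],
                             offset_mm=15.0, size_mm=[30.0, 30.0, 20.0])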
  • Publication number: 20180220883
    Abstract: A global registration system and method identifies bronchoscope position without the need for significant bronchoscope maneuvers, technician intervention, or electromagnetic sensors. Virtual bronchoscopy (VB) renderings of a 3D airway tree are obtained including VB views of branch positions within the airway tree. At least one real bronchoscopic (RB) video frame is received from a bronchoscope inserted into the airway tree. An algorithm according to the invention is executed on a computer to identify the several most likely branch positions having a VB view closest to the received RB view, and the 3D position of the bronchoscope within the airway tree is determined in accordance with the branch position identified in the VB view. The preferred embodiment involves a fast local registration search over all the branches in a global airway-bifurcation search space, with the weighted normalized sum of squares distance metric used for finding the best match.
    Type: Application
    Filed: March 22, 2018
    Publication date: August 9, 2018
    Inventors: William E. Higgins, Rahul Khare, Scott A. Merritt
  • Publication number: 20170345155
    Abstract: This invention relates generally to medical imaging and, in particular, to a method and system for automatic lymph-node station mapping and automatic path or route report generation. A computer-based system for automatically locating the central chest lymph-node stations in a 3D MDCT image is described. Automated analysis methods extract the airway tree, airway-tree centerlines, aorta, pulmonary artery, lungs, key skeletal structures, and major-airway labels. Geometrical and anatomical cues arising from the extracted structures are used to localize the major nodal stations. The system calculates and displays the nodal stations in 3D. Visualization tools within the system enable the user to interact with the stations to locate visible lymph nodes.
    Type: Application
    Filed: May 16, 2017
    Publication date: November 30, 2017
    Inventors: William E. Higgins, Jason D. Gibbs, Kun-Chang Yu, Michael W. Graham, Kongkuo Lu
  • Patent number: 9757021
    Abstract: Two system-level bronchoscopy guidance solutions are presented. The first incorporates a global-registration algorithm to provide the physician with updated navigational and guidance information during bronchoscopy. The system can handle general navigation to a region of interest (ROI), as well as adverse events, and it requires minimal commands so that it can be directly controlled by the physician. The second solution visualizes the global picture of all the bifurcations and their relative orientations in advance and suggests the maneuvers needed by the bronchoscope to approach the ROI. Guided bronchoscopy results using human airway-tree phantoms demonstrate the potential of the two solutions.
    Type: Grant
    Filed: January 31, 2012
    Date of Patent: September 12, 2017
    Assignee: The Penn State Research Foundation
    Inventors: William E. Higgins, Rahul Khare
  • Patent number: 9675420
    Abstract: Methods and apparatus assist in planning routes through hollow, branching organs in patients to optimize subsequent endoscopic procedures. Information is provided about the organ and a follow-on endoscopic procedure associated with the organ. The most appropriate navigable route or routes to a target region of interest (ROI) within the organ are then identified given anatomical, endoscopic-device, or procedure-specific constraints derived from the information provided. The method may include the step of modifying the viewing direction at each site along a route to give physically meaningful navigation directions or to reflect the requirements of a follow-on live endoscopic procedure. An existing route may further be extended, if necessary, to an ROI beyond the organ. The information provided may include anatomical constraints that define locations or organs to avoid; anatomical constraints that confine the route within specific geometric locations; or a metric for selecting the most appropriate route.
    Type: Grant
    Filed: May 18, 2015
    Date of Patent: June 13, 2017
    Assignee: The Penn State Research Foundation
    Inventors: William E. Higgins, Jason D. Gibbs
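One way to picture the constrained route search described above (patent 9675420) is a shortest-path search over an airway-centerline graph in which branches too narrow for the endoscopic device are excluded. The graph encoding, the single diameter constraint, and the function below are illustrative assumptions; the patented method supports richer anatomical and procedure-specific constraints.

        import heapq

        def plan_route(graph, start, rois, min_diameter_mm):
            """Shortest feasible route over an airway-centerline graph.
            graph: {node: [(neighbor, length_mm, diameter_mm), ...]}
            Branches narrower than the device (min_diameter_mm) are skipped."""
            dist, prev = {start: 0.0}, {}
            heap = [(0.0, start)]
            while heap:
                d, u = heapq.heappop(heap)
                if d > dist.get(u, float("inf")):
                    continue
                for v, length, diameter in graph.get(u, []):
                    if diameter < min_diameter_mm:
                        continue  # the device cannot pass this branch
                    nd = d + length
                    if nd < dist.get(v, float("inf")):
                        dist[v], prev[v] = nd, u
                        heapq.heappush(heap, (nd, v))
            # choose the reachable region of interest with the shortest route
            target = min((r for r in rois if r in dist), key=dist.get, default=None)
            route = []
            while target is not None:
                route.append(target)
                target = prev.get(target)
            return route[::-1]

A call such as plan_route(tree, "trachea", {"roi_1"}, min_diameter_mm=2.8), with the graph built from extracted centerlines, would return the ordered list of route sites (the node names here are hypothetical).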
  • Patent number: 9672631
    Abstract: This invention relates generally to medical imaging and, in particular, to a method and system for reconstructing a model path through a branched tubular organ. Novel methodologies and systems segment and define accurate endoluminal surfaces in airway trees, including small peripheral bronchi. An automatic algorithm is described that searches the entire lung volume for airway branches and poses airway-tree segmentation as a global graph-theoretic optimization problem. A suite of interactive segmentation tools for cleaning and extending critical areas of the automatically segmented result is disclosed. A model path is reconstructed through the airway tree.
    Type: Grant
    Filed: February 16, 2009
    Date of Patent: June 6, 2017
    Assignee: The Penn State Research Foundation
    Inventors: William E. Higgins, Jason D. Gibbs, Kun-Chang Yu, Michael W. Graham, Kongkuo Lu
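The abstract above (patent 9672631) poses airway-tree segmentation as a global graph-theoretic optimization over candidate branches. As a heavily simplified stand-in, the sketch below grows a connected tree greedily from the root by accepting scored branch candidates whose parents have already been accepted; the real method optimizes the selection globally rather than greedily, and the data layout here is an assumption.

        def select_branches(candidates, root, score_threshold=0.0):
            """Greedy stand-in for global branch selection.
            candidates: {branch_id: (parent_id, score)}
            A branch is accepted only if its parent is accepted and its score
            clears the threshold, so the result remains a connected tree."""
            accepted = {root}
            changed = True
            while changed:
                changed = False
                for branch, (parent, score) in candidates.items():
                    if branch not in accepted and parent in accepted and score > score_threshold:
                        accepted.add(branch)
                        changed = True
            return accepted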
  • Publication number: 20150257847
    Abstract: Methods and apparatus assist in planning routes through hollow, branching organs in patients to optimize subsequent endoscopic procedures. Information is provided about the organ and a follow-on endoscopic procedure associated with the organ. The most appropriate navigable route or routes to a target region of interest (ROI) within the organ are then identified given anatomical, endoscopic-device, or procedure-specific constraints derived from the information provided. The method may include the step of modifying the viewing direction at each site along a route to give physically meaningful navigation directions or to reflect the requirements of a follow-on live endoscopic procedure. An existing route may further be extended, if necessary, to an ROI beyond the organ. The information provided may include anatomical constraints that define locations or organs to avoid; anatomical constraints that confine the route within specific geometric locations; or a metric for selecting the most appropriate route.
    Type: Application
    Filed: May 18, 2015
    Publication date: September 17, 2015
    Inventors: William E. Higgins, Jason D. Gibbs
  • Patent number: 9037215
    Abstract: Methods and apparatus assist in planning routes through hollow, branching organs in patients to optimize subsequent endoscopic procedures. Information is provided about the organ and a follow-on endoscopic procedure associated with the organ. The most appropriate navigable route or routes to a target region of interest (ROI) within the organ are then identified given anatomical, endoscopic-device, or procedure-specific constraints derived from the information provided. The method may include the step of modifying the viewing direction at each site along a route to give physically meaningful navigation directions or to reflect the requirements of a follow-on live endoscopic procedure. An existing route may further be extended, if necessary, to an ROI beyond the organ. The information provided may include anatomical constraints that define locations or organs to avoid; anatomical constraints that confine the route within specific geometric locations; or a metric for selecting the most appropriate route.
    Type: Grant
    Filed: January 24, 2008
    Date of Patent: May 19, 2015
    Assignee: The Penn State Research Foundation
    Inventors: William E. Higgins, Jason D. Gibbs
  • Patent number: 8675935
    Abstract: Fast and continuous registration between two imaging modalities makes it possible to completely determine the rigid transformation between multiple sources at real-time or near real-time frame-rates in order to localize video cameras and register the two sources. A set of reference images is computed or captured within a known environment, with corresponding depth maps and image gradients defining a reference source. Given one frame from a real-time or near-real-time video feed, and starting from an initial guess of viewpoint, the real-time video frame is warped to the nearest viewing site of the reference source. An image difference is computed between the warped video frame and the reference image. Steps are repeated for each frame until the viewpoint converges or the next video frame becomes available. The final viewpoint gives an estimate of the relative rotation and translation between the camera at that particular video frame and the reference source.
    Type: Grant
    Filed: November 16, 2011
    Date of Patent: March 18, 2014
    Assignee: The Penn State Research Foundation
    Inventors: William E. Higgins, Scott A. Merritt, Lav Rai
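The registration loop described above (patent 8675935) can be sketched as follows: for each video frame, the current viewpoint guess is used to warp the frame to the nearest reference viewing site, an image difference is computed, and the viewpoint is nudged to reduce that difference until it converges or the next frame arrives. The reference object's nearest_site and warp callables, the finite-difference gradient, and the fixed step sizes are assumptions made only to keep the sketch self-contained.

        import numpy as np

        def register_frame(frame, reference, pose0, max_iters=20, step=1e-3, rate=1e-6, tol=1e-4):
            """Refine a 6-parameter pose (3 rotation, 3 translation) for one frame.
            `reference` is assumed (hypothetically) to provide:
              reference.nearest_site(pose)      -> (ref_image, site)
              reference.warp(frame, pose, site) -> frame warped to that site's view
            """
            pose = np.asarray(pose0, dtype=float)
            for _ in range(max_iters):
                ref_image, site = reference.nearest_site(pose)
                residual = reference.warp(frame, pose, site) - ref_image
                cost = float(np.sum(residual ** 2))
                # finite-difference gradient of the image-difference cost
                grad = np.zeros(6)
                for k in range(6):
                    d = np.zeros(6)
                    d[k] = step
                    ref_k, site_k = reference.nearest_site(pose + d)
                    r_k = reference.warp(frame, pose + d, site_k) - ref_k
                    grad[k] = (float(np.sum(r_k ** 2)) - cost) / step
                update = -rate * grad
                pose = pose + update
                if np.linalg.norm(update) < tol:
                    break
            return pose  # estimated rotation/translation relative to the reference source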
  • Patent number: 8672836
    Abstract: Methods and apparatus provide continuous guidance of endoscopy during a live procedure. A data-set based on 3D image data is pre-computed including reference information representative of a predefined route through a body organ to a final destination. A plurality of live real endoscopic (RE) images are displayed as an operator maneuvers an endoscope within the body organ. A registration and tracking algorithm registers the data-set to one or more of the RE images and continuously maintains the registration as the endoscope is locally maneuvered. Additional information related to the final destination is then presented enabling the endoscope operator to decide on a final maneuver for the procedure. The reference information may include 3D organ surfaces, 3D routes through an organ system, or 3D regions of interest (ROIs), as well as a virtual endoscopic (VE) image generated from the precomputed data-set.
    Type: Grant
    Filed: January 30, 2008
    Date of Patent: March 18, 2014
    Assignee: The Penn State Research Foundation
    Inventors: William E. Higgins, Scott A. Merritt, Lav Rai, Jason D. Gibbs, Kun-Chang Yu
  • Publication number: 20120203065
    Abstract: Two system-level bronchoscopy guidance solutions are presented. The first incorporates a global-registration algorithm to provide the physician with updated navigational and guidance information during bronchoscopy. The system can handle general navigation to a region of interest (ROI), as well as adverse events, and it requires minimal commands so that it can be directly controlled by the physician. The second solution visualizes the global picture of all the bifurcations and their relative orientations in advance and suggests the maneuvers needed by the bronchoscope to approach the ROI. Guided bronchoscopy results using human airway-tree phantoms demonstrate the potential of the two solutions.
    Type: Application
    Filed: January 31, 2012
    Publication date: August 9, 2012
    Applicant: The Penn State Research Foundation
    Inventors: William E. Higgins, Rahul Khare
  • Publication number: 20120203067
    Abstract: A technician-free strategy enables real-time guidance of bronchoscopy. The approach uses measurements of the bronchoscope's movement to predict its position in 3D virtual space. To achieve this, a bronchoscope model, defining the device's shape in the airway tree to a given point p, provides an insertion depth to p. In real time, the invention compares an observed bronchoscope insertion depth and roll angle, measured by an optical sensor, to precalculated insertion depths along a predefined route in the virtual airway tree to predict a bronchoscope's location and orientation.
    Type: Application
    Filed: January 31, 2012
    Publication date: August 9, 2012
    Applicant: The Penn State Research Foundation
    Inventors: William E. Higgins, Jason D. Gibbs, Duane C. Cornish
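The prediction step described above (publication 20120203067) reduces, in its simplest form, to looking up the measured insertion depth against the depths precalculated along the planned route. The array layout and the linear interpolation below are illustrative assumptions; the patented method also uses a bronchoscope shape model to relate insertion depth to position.

        import numpy as np

        def predict_position(route_sites, route_depths, measured_depth, roll_deg):
            """Predict the bronchoscope tip location along a planned route.
            route_sites:  (N, 3) array of 3-D route points in the virtual airway tree
            route_depths: (N,) precalculated insertion depths (mm), increasing along the route
            measured_depth: insertion depth (mm) reported by the optical sensor
            Returns the interpolated 3-D position and the roll angle in radians."""
            route_sites = np.asarray(route_sites, float)
            route_depths = np.asarray(route_depths, float)
            x = np.interp(measured_depth, route_depths, route_sites[:, 0])
            y = np.interp(measured_depth, route_depths, route_sites[:, 1])
            z = np.interp(measured_depth, route_depths, route_sites[:, 2])
            return np.array([x, y, z]), np.deg2rad(roll_deg)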
  • Publication number: 20120082351
    Abstract: Fast and continuous registration between two imaging modalities makes it possible to completely determine the rigid transformation between multiple sources at real-time or near real-time frame-rates in order to localize video cameras and register the two sources. A set of reference images is computed or captured within a known environment, with corresponding depth maps and image gradients defining a reference source. Given one frame from a real-time or near-real-time video feed, and starting from an initial guess of viewpoint, the real-time video frame is warped to the nearest viewing site of the reference source. An image difference is computed between the warped video frame and the reference image. Steps are repeated for each frame until the viewpoint converges or the next video frame becomes available. The final viewpoint gives an estimate of the relative rotation and translation between the camera at that particular video frame and the reference source.
    Type: Application
    Filed: November 16, 2011
    Publication date: April 5, 2012
    Applicant: The Penn State Research Foundation
    Inventors: William E. Higgins, Scott A. Merritt, Lav Rai
  • Patent number: 8064669
    Abstract: A novel framework for fast and continuous registration between two imaging modalities is disclosed. The approach makes it possible to completely determine the rigid transformation between multiple sources at real-time or near real-time frame-rates in order to localize the cameras and register the two sources. A disclosed example includes computing or capturing a set of reference images within a known environment, complete with corresponding depth maps and image gradients. The collection of these images and depth maps constitutes the reference source. The second source is a real-time or near-real-time source, which may include a live video feed. Given one frame from this video feed, and starting from an initial guess of viewpoint, the real-time video frame is warped to the nearest viewing site of the reference source. An image difference is computed between the warped video frame and the reference image.
    Type: Grant
    Filed: February 7, 2011
    Date of Patent: November 22, 2011
    Assignee: The Penn State Research Foundation
    Inventors: William E. Higgins, Scott A. Merritt, Lav Rai
  • Publication number: 20110184238
    Abstract: A global registration system and method identifies bronchoscope position without the need for significant bronchoscope maneuvers, technician intervention, or electromagnetic sensors. Virtual bronchoscopy (VB) renderings of a 3D airway tree are obtained including VB views of branch positions within the airway tree. At least one real bronchoscopic (RB) video frame is received from a bronchoscope inserted into the airway tree. An algorithm according to the invention is executed on a computer to identify the several most likely branch positions having a VB view closest to the received RB view, and the 3D position of the bronchoscope within the airway tree is determined in accordance with the branch position identified in the VB view. The preferred embodiment involves a fast local registration search over all the branches in a global airway-bifurcation search space, with the weighted normalized sum of squares distance metric used for finding the best match.
    Type: Application
    Filed: January 28, 2011
    Publication date: July 28, 2011
    Applicant: The Penn State Research Foundation
    Inventors: William E. Higgins, Rahul Khare, Scott A. Merritt
  • Publication number: 20110128352
    Abstract: A novel framework for fast and continuous registration between two imaging modalities is disclosed. The approach makes it possible to completely determine the rigid transformation between multiple sources at real-time or near real-time frame-rates in order to localize the cameras and register the two sources. A disclosed example includes computing or capturing a set of reference images within a known environment, complete with corresponding depth maps and image gradients. The collection of these images and depth maps constitutes the reference source. The second source is a real-time or near-real-time source, which may include a live video feed. Given one frame from this video feed, and starting from an initial guess of viewpoint, the real-time video frame is warped to the nearest viewing site of the reference source. An image difference is computed between the warped video frame and the reference image.
    Type: Application
    Filed: February 7, 2011
    Publication date: June 2, 2011
    Applicant: The Penn State Research Foundation
    Inventors: William E. Higgins, Scott A. Merritt, Lav Rai
  • Patent number: 7889905
    Abstract: A novel framework for fast and continuous registration between two imaging modalities is disclosed. The approach makes it possible to completely determine the rigid transformation between multiple sources at real-time or near real-time frame-rates in order to localize the cameras and register the two sources. A disclosed example includes computing or capturing a set of reference images within a known environment, complete with corresponding depth maps and image gradients. The collection of these images and depth maps constitutes the reference source. The second source is a real-time or near-real-time source, which may include a live video feed. Given one frame from this video feed, and starting from an initial guess of viewpoint, the real-time video frame is warped to the nearest viewing site of the reference source. An image difference is computed between the warped video frame and the reference image.
    Type: Grant
    Filed: May 19, 2006
    Date of Patent: February 15, 2011
    Assignee: The Penn State Research Foundation
    Inventors: William E. Higgins, Scott A. Merritt, Lav Rai
  • Publication number: 20100310146
    Abstract: This invention relates generally to medical imaging and, in particular, to a method and system for reconstructing a model path through a branched tubular organ. Novel methodologies and systems segment and define accurate endoluminal surfaces in airway trees, including small peripheral bronchi. An automatic algorithm is described that searches the entire lung volume for airway branches and poses airway-tree segmentation as a global graph-theoretic optimization problem. A suite of interactive segmentation tools for cleaning and extending critical areas of the automatically segmented result is disclosed. A model path is reconstructed through the airway tree.
    Type: Application
    Filed: February 16, 2009
    Publication date: December 9, 2010
    Applicant: The Penn State Research Foundation
    Inventors: William E. Higgins, Jason D. Gibbs, Kun-Chang Yu, Michael W. Graham, Kongkuo Lu
  • Publication number: 20100280365
    Abstract: A method provides guidance to the physician during a live bronchoscopy or other endoscopic procedures. The 3D motion of the bronchoscope is estimated using a fast coarse tracking step followed by a fine registration step. The tracking is based on finding a set of corresponding feature points across a plurality of consecutive bronchoscopic video frames, then estimating the new pose of the bronchoscope. In the preferred embodiment the pose estimation is based on linearization of the rotation matrix. Given a set of corresponding points between the current bronchoscopic video image and the CT-based virtual image as input, the same method can also be used for manual registration. The fine registration step is preferably a gradient-based Gauss-Newton method that maximizes the correlation between the bronchoscopic video image and the CT-based virtual image. The continuous guidance is provided by estimating the 3D motion of the bronchoscope in a loop.
    Type: Application
    Filed: July 12, 2010
    Publication date: November 4, 2010
    Applicant: The Penn State Research Foundation
    Inventors: William E. Higgins, Scott A. Merritt, Lav Rai
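The coarse tracking step above (publication 20100280365) estimates a new bronchoscope pose from matched feature points using a linearized rotation matrix. The sketch below solves the linearized problem for matched 3-D points (dst ≈ src + w × src + t); the patent tracks 2-D feature points across video frames, so treat this purely as an illustration of the linearization, with the function name and least-squares setup assumed.

        import numpy as np

        def small_motion_pose(src_pts, dst_pts):
            """Estimate a small rigid motion (rotation vector w, translation t)
            from matched 3-D points, using the linearization R ~ I + [w]_x, so
            that each row of dst_pts ~ src + w x src + t."""
            src = np.asarray(src_pts, float)
            dst = np.asarray(dst_pts, float)
            n = len(src)
            A = np.zeros((3 * n, 6))
            b = (dst - src).reshape(-1)
            for i, p in enumerate(src):
                # cross product w x p rewritten as a matrix acting on w
                A[3*i:3*i+3, :3] = np.array([[0.0,  p[2], -p[1]],
                                             [-p[2], 0.0,  p[0]],
                                             [p[1], -p[0], 0.0]])
                A[3*i:3*i+3, 3:] = np.eye(3)
            x, *_ = np.linalg.lstsq(A, b, rcond=None)
            return x[:3], x[3:]   # rotation vector w, translation t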