Patents by Inventor Greg M. Osgood

Greg M. Osgood has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240041558
    Abstract: A system for surgical navigation, including an instrument for a medical procedure attached to a camera and having a spatial position relative to the camera, an x-ray system to acquire x-ray images, and multiple fiducial markers detectable by both the camera and x-ray system, having a radio-opaque material arranged as at least one of a line and a point. A computer receives an optical image from the camera and an x-ray image from the x-ray system, identifies fiducial markers visible in both the optical image and x-ray image, determines for each fiducial marker a spatial position relative to the camera based on the optical image and relative to the x-ray system based on the x-ray image, and determines a spatial position for the instrument relative to the x-ray system based on at least the spatial positions relative to the camera and x-ray system.
    Type: Application
    Filed: December 9, 2021
    Publication date: February 8, 2024
    Applicant: The Johns Hopkins University
    Inventors: Jeffrey H. SIEWERDSEN, Niral M. SHETH, Prasad VAGDARGI, Greg M. OSGOOD, Wathudurage Tharindu DE SILVA
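
The abstract above essentially describes registering two coordinate frames through fiducial markers visible to both a camera and an x-ray system. Below is a minimal sketch of that idea under assumed inputs: numpy, a least-squares (Kabsch/SVD) rigid fit between the fiducial positions in each frame, and an instrument-to-camera transform taken as known. It illustrates the geometry only and is not the patented implementation.

```python
# Sketch: estimate the camera->x-ray rigid transform from fiducials seen in both
# frames, then place the camera-attached instrument in the x-ray frame.
import numpy as np

def rigid_transform(src_pts, dst_pts):
    """Least-squares rigid transform (4x4) mapping src_pts onto dst_pts, both Nx3."""
    src_c = src_pts.mean(axis=0)
    dst_c = dst_pts.mean(axis=0)
    H = (src_pts - src_c).T @ (dst_pts - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical fiducial positions (metres): as recovered from the optical image
# (camera frame) and from the x-ray image (x-ray frame).
fiducials_cam  = np.array([[ 0.10,  0.00, 0.50],
                           [ 0.00,  0.12, 0.55],
                           [-0.08,  0.03, 0.48],
                           [ 0.05, -0.09, 0.60]])
fiducials_xray = fiducials_cam + np.array([0.20, -0.10, 1.00])   # toy correspondence

T_cam_to_xray   = rigid_transform(fiducials_cam, fiducials_xray)
T_instr_to_cam  = np.eye(4)            # instrument pose in the camera frame (assumed known)
T_instr_to_xray = T_cam_to_xray @ T_instr_to_cam
print(np.round(T_instr_to_xray, 3))
```

With real data, the fiducial positions would come from detecting the markers in the optical and x-ray images rather than from the toy arrays used here.
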
  • Patent number: 11369440
    Abstract: A system for modelling a portion of a patient includes a processing unit, a manipulator, a sensor, and a display device. The processing unit is configured to receive patient data and to process the patient data to generate a model of the portion of the patient. The sensor is configured to capture manipulator data. The processing unit is configured to receive the manipulator data from the sensor and to process the manipulator data to determine a position of the manipulator, an orientation of the manipulator, or both. The display device is configured to display the model on or in the manipulator.
    Type: Grant
    Filed: March 30, 2018
    Date of Patent: June 28, 2022
    Assignee: THE JOHNS HOPKINS UNIVERSITY
    Inventors: Bernhard Fuerst, Greg M. Osgood, Nassir Navab, Alexander Winkler
  • Publication number: 20210121243
    Abstract: A system for modelling a portion of a patient includes a processing unit, a manipulator, a sensor, and a display device. The processing unit is configured to receive patient data and to process the patient data to generate a model of the portion of the patient. The sensor is configured to capture manipulator data. The processing unit is configured to receive the manipulator data from the sensor and to process the manipulator data to determine a position of the manipulator, an orientation of the manipulator, or both. The display device is configured to display the model on or in the manipulator.
    Type: Application
    Filed: March 30, 2018
    Publication date: April 29, 2021
    Inventors: Bernhard FUERST, Greg M. OSGOOD, Nassir NAVAB, Alexander WINKLER
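
The grant (11369440) and the published application (20210121243) above share the same abstract: patient data is processed into a model, a sensor yields the manipulator's position and orientation, and the display renders the model on or in the manipulator. The sketch below traces that data flow with hypothetical function and parameter names; it is not the patented implementation.

```python
# Sketch: build a model from patient data, estimate the manipulator pose from
# sensor data, and re-pose the model so a display can render it on the manipulator.
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    rotation: np.ndarray      # 3x3 rotation matrix
    translation: np.ndarray   # 3-vector

def build_model(patient_data: np.ndarray) -> np.ndarray:
    """Stand-in for model generation: here the 'model' is just an Nx3 point cloud."""
    return np.asarray(patient_data, dtype=float)

def estimate_manipulator_pose(sensor_frame: np.ndarray) -> Pose:
    """Stand-in for pose estimation from sensor data (e.g., marker tracking)."""
    # A real system would solve for the pose; this toy version returns a fixed pose.
    return Pose(rotation=np.eye(3), translation=np.array([0.0, 0.0, 0.3]))

def pose_model_on_manipulator(model: np.ndarray, pose: Pose) -> np.ndarray:
    """Express the model at the manipulator's position and orientation."""
    return model @ pose.rotation.T + pose.translation

patient_data   = np.random.rand(100, 3)                  # hypothetical imaging-derived points
model          = build_model(patient_data)
pose           = estimate_manipulator_pose(sensor_frame=np.zeros((480, 640)))
display_points = pose_model_on_manipulator(model, pose)  # handed to the display device
```
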
  • Publication number: 20200275988
    Abstract: The present invention is directed to a system and method for image-to-world registration for medical reality applications using a world spatial map. The system links any point in a fluoroscopic image to its corresponding position in the visual world using spatial mapping with a head-mounted display (HMD) (world tracking). On a projectional 2D fluoroscopic image, any point can be thought of as representing a line perpendicular to the image plane and passing through that point. The point itself could lie anywhere along this line between the x-ray source and the detector. With the aid of the HMD, a virtual line is displayed in the visual field of the user.
    Type: Application
    Filed: October 2, 2018
    Publication date: September 3, 2020
    Inventors: Alex A. Johnson, Kevin Yu, Sebastian Andress, Mohammadjavad Fotouhighazvin, Greg M. Osgood, Nassir Navab, Mathias Unberath
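
The last abstract describes back-projecting a selected point on a 2-D fluoroscopic image to a 3-D line between the x-ray source and the detector, which the HMD then draws in the user's view. Below is a minimal sketch of that geometry with assumed C-arm parameter names and values; it illustrates the idea only and is not the patented method.

```python
# Sketch: map an image pixel (u, v) to the 3-D segment joining the x-ray source
# and that pixel's position on the detector, expressed in the HMD's world frame.
import numpy as np

def backproject_pixel(u, v, source_pos, detector_origin, u_axis, v_axis, pixel_spacing):
    """Return the two world-space endpoints of the line through image pixel (u, v)."""
    # 3-D position of the selected pixel on the detector plane.
    pixel_world = (detector_origin
                   + u * pixel_spacing * u_axis
                   + v * pixel_spacing * v_axis)
    return source_pos, pixel_world   # the point of interest lies somewhere on this segment

# Hypothetical C-arm geometry (millimetres) in the HMD's world frame.
source  = np.array([0.0, 0.0, 1000.0])        # x-ray source position
det_org = np.array([-150.0, -150.0, 0.0])     # detector corner, i.e. pixel (0, 0)
u_axis  = np.array([1.0, 0.0, 0.0])           # detector column direction
v_axis  = np.array([0.0, 1.0, 0.0])           # detector row direction
spacing = 0.3                                 # mm per pixel

p0, p1 = backproject_pixel(512, 512, source, det_org, u_axis, v_axis, spacing)
# Sample a few points along the segment, e.g. to feed a line renderer in the HMD view.
line_points = [tuple(p0 + t * (p1 - p0)) for t in np.linspace(0.0, 1.0, 5)]
print(line_points)
```
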