Patents by Inventor Peter Kazanzides

Peter Kazanzides has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11928838
    Abstract: A calibration platform may obtain measurements for aligning a real-world coordinate system and a display coordinate system. For example, the calibration platform may display, via an optical see-through head-mounted display (OST-HMD), a three-dimensional virtual object and receive, from a positional tracking device, information that relates to a current pose of a three-dimensional real-world object to be aligned with the three-dimensional virtual object. The calibration platform may record a three-dimensional position of a plurality of points on the three-dimensional real-world object based on the current pose of the three-dimensional real-world object, based on an indication that the plurality of points on the three-dimensional real-world object respectively corresponds with a plurality of points on the three-dimensional virtual object.
    Type: Grant
    Filed: July 8, 2022
    Date of Patent: March 12, 2024
    Assignee: The Johns Hopkins University
    Inventors: Ehsan Azimi, Long Qian, Peter Kazanzides, Nassir Navab
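The alignment step this abstract describes — recovering a transform from pairs of corresponding 3-D points on the real and virtual objects — is commonly solved with a least-squares rigid fit. A minimal sketch (not the patented method itself; the SVD-based Kabsch/Arun approach is one standard choice, and all data here are synthetic):

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q.

    P, Q: (N, 3) arrays of corresponding 3-D points.
    Returns rotation R (3x3) and translation t with Q ~ P @ R.T + t.
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)               # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Synthetic check: transform a small point set, then recover the pose.
rng = np.random.default_rng(0)
P = rng.normal(size=(6, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 0.5])
Q = P @ R_true.T + t_true
R, t = rigid_align(P, Q)
residual = float(np.abs(Q - (P @ R.T + t)).max())
```

With exact correspondences the residual is at machine precision; in a real calibration, tracker noise makes the fit a least-squares compromise over many recorded points.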
  • Publication number: 20240013679
    Abstract: A system includes: a phantom object; one or more sensors within the phantom object; and a processor, a computer readable memory, a non-transitory computer readable storage medium associated with a computing device in communication with the phantom object, and program instructions executable by the computing device to cause the computing device to perform operations including: detecting a medical instrument within the phantom object based on sensor data captured by the one or more sensors; measuring a distance between the medical instrument and a target point based on the sensor data; and storing or outputting information identifying the distance between the medical instrument and the target point.
    Type: Application
    Filed: September 28, 2021
    Publication date: January 11, 2024
    Applicant: THE JOHNS HOPKINS UNIVERSITY
    Inventors: Ehsan AZIMI, Peter KAZANZIDES, Zhiyuan NIU
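The detect-then-measure loop the abstract describes can be sketched very simply: estimate the instrument tip from the phantom's sensor readings, then report its distance to the target point. This is an illustrative sketch only — the threshold, the centroid-of-triggered-sensors estimate, and all coordinates are assumptions, not the patented design:

```python
import math

def estimate_tip(sensor_positions, readings, threshold=0.5):
    """Crude tip estimate: centroid of the phantom sensors whose
    reading exceeds a detection threshold (values illustrative)."""
    hits = [p for p, r in zip(sensor_positions, readings) if r >= threshold]
    if not hits:
        return None                      # instrument not detected
    n = len(hits)
    return tuple(sum(p[i] for p in hits) / n for i in range(3))

def distance_to_target(tip, target):
    """Euclidean distance between the estimated tip and the target."""
    return math.dist(tip, target)

# Two sensors triggered at (0,0,0) and (2,0,0) -> estimated tip (1,0,0).
tip = estimate_tip([(0, 0, 0), (2, 0, 0)], [1.0, 0.9])
d = distance_to_target(tip, (1, 3, 0))
```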
  • Patent number: 11861062
    Abstract: A calibration platform may display, via an optical see-through head-mounted display (OST-HMD), a virtual image having at least one feature. The calibration platform may determine, based on information relating to a gaze of a user wearing the OST-HMD, that the user performed a voluntary eye blink to indicate that the at least one feature of the virtual image appears to the user to be aligned with at least one point on the three-dimensional real-world object. The calibration platform may record an alignment measurement based on a position of the at least one point on the three-dimensional real-world object in a real-world coordinate system based on a time when the user performed the voluntary eye blink. Accordingly, the alignment measurement may be used to generate a function providing a mapping between three-dimensional points in the real-world coordinate system and corresponding points in a display space of the OST-HMD.
    Type: Grant
    Filed: January 31, 2019
    Date of Patent: January 2, 2024
    Assignee: The Johns Hopkins University
    Inventors: Ehsan Azimi, Long Qian, Peter Kazanzides, Nassir Navab
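The mapping this abstract ends with — from 3-D real-world points to 2-D display-space points — is typically estimated as a 3x4 projection matrix from the blink-confirmed alignment measurements. A sketch using the Direct Linear Transform (one standard technique for this class of calibration, not necessarily the method claimed; the camera numbers below are synthetic):

```python
import numpy as np

def estimate_projection(world_pts, display_pts):
    """DLT: find 3x4 matrix G with display ~ G @ [world; 1] (N >= 6 pairs)."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, display_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)          # null-space vector = flattened G

def project(G, p):
    q = G @ np.append(p, 1.0)
    return q[:2] / q[2]                  # homogeneous divide

# Synthetic check: generate points under a known projection, then recover it.
K = np.array([[800.0, 0, 640], [0, 800, 360], [0, 0, 1]])
G_true = K @ np.hstack([np.eye(3), [[0.0], [0.0], [2.0]]])
rng = np.random.default_rng(1)
world = rng.uniform(-1, 1, size=(10, 3))
disp = np.array([project(G_true, p) for p in world])
G = estimate_projection(world, disp)
err = max(np.linalg.norm(project(G, p) - d) for p, d in zip(world, disp))
```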
  • Publication number: 20230149084
Abstract: A computer-implemented method includes: receiving, by an augmented reality device, a medical image of a surgical site; generating, by the augmented reality device, a virtual surgical site model based on the medical image; presenting, by the augmented reality device, the virtual surgical site model; receiving, by the augmented reality device, user calibration input; aligning, by the augmented reality device, the virtual surgical site model with a real-life surgical site based on the user calibration input; and displaying, by the augmented reality device and after the aligning, a virtual insertion path between an incision point and a target point to aid in inserting a tool as part of performing a surgical procedure.
    Type: Application
    Filed: March 18, 2021
    Publication date: May 18, 2023
    Applicant: THE JOHNS HOPKINS UNIVERSITY
    Inventors: Ehsan AZIMI, Peter KAZANZIDES, Judy HUANG, Camilo MOLINA
  • Publication number: 20230027801
    Abstract: A computer-implemented method for displaying augmented reality (AR) content within an AR device coupled to one or more loupe lenses comprising: obtaining calibration parameters defining a magnified display portion within a display of the AR device, wherein the magnified display portion corresponds to boundaries encompassing the one or more loupe lenses; receiving the AR content for display within the AR device; and rendering the AR content within the display, wherein the rendering the AR content comprises: identifying a magnified portion of the AR content to be displayed within the magnified display portion, and rendering the magnified portion of the AR content within the magnified display portion.
    Type: Application
    Filed: January 5, 2021
    Publication date: January 26, 2023
    Applicant: THE JOHNS HOPKINS UNIVERSITY
    Inventors: Long QIAN, Peter KAZANZIDES, Mathias UNBERATH, Tianyu SONG
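The rendering split the abstract describes — magnified content inside the loupe boundary, ordinary content outside — can be sketched as a per-pixel lookup. The circular-boundary model, parameter names, and magnification-about-center behavior are illustrative assumptions, not details from the patent:

```python
def loupe_source_pixel(px, py, cx, cy, radius, mag):
    """Which content pixel to sample for display pixel (px, py).

    Inside the circular loupe region (center cx, cy) the content is
    magnified by `mag` about the loupe center: the sampled footprint
    shrinks, so the content appears enlarged. Outside, display is 1:1.
    """
    dx, dy = px - cx, py - cy
    if dx * dx + dy * dy <= radius * radius:
        return cx + dx / mag, cy + dy / mag
    return float(px), float(py)

# 2x loupe of radius 50 centered at (100, 100):
inside = loupe_source_pixel(120, 100, 100, 100, 50, 2)   # sampled closer in
outside = loupe_source_pixel(200, 100, 100, 100, 50, 2)  # unchanged
```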
  • Publication number: 20220366598
    Abstract: A calibration platform may obtain measurements for aligning a real-world coordinate system and a display coordinate system. For example, the calibration platform may display, via an optical see-through head-mounted display (OST-HMD), a three-dimensional virtual object and receive, from a positional tracking device, information that relates to a current pose of a three-dimensional real-world object to be aligned with the three-dimensional virtual object. The calibration platform may record a three-dimensional position of a plurality of points on the three-dimensional real-world object based on the current pose of the three-dimensional real-world object, based on an indication that the plurality of points on the three-dimensional real-world object respectively corresponds with a plurality of points on the three-dimensional virtual object.
    Type: Application
    Filed: July 8, 2022
    Publication date: November 17, 2022
    Applicant: The Johns Hopkins University
    Inventors: Ehsan AZIMI, Long QIAN, Peter KAZANZIDES, Nassir NAVAB
  • Patent number: 11386572
    Abstract: A calibration platform may obtain measurements for aligning a real-world coordinate system and a display coordinate system. For example, the calibration platform may display, via an optical see-through head-mounted display (OST-HMD), a three-dimensional virtual object and receive, from a positional tracking device, information that relates to a current pose of a three-dimensional real-world object to be aligned with the three-dimensional virtual object. The calibration platform may record a three-dimensional position of a plurality of points on the three-dimensional real-world object based on the current pose of the three-dimensional real-world object, based on an indication that the plurality of points on the three-dimensional real-world object respectively corresponds with a plurality of points on the three-dimensional virtual object.
    Type: Grant
    Filed: January 31, 2019
    Date of Patent: July 12, 2022
    Assignee: The Johns Hopkins University
    Inventors: Ehsan Azimi, Long Qian, Peter Kazanzides, Nassir Navab
  • Patent number: 11259870
Abstract: In one embodiment of the invention, a minimally invasive surgical system is disclosed. The system is configured to capture and display camera images of a surgical site on at least one display device at a surgeon console; switch out of a following mode and into a masters-as-mice (MaM) mode; overlay a graphical user interface (GUI) including an interactive graphical object onto the camera images; and render a pointer within the camera images for user interactive control. In the following mode, the input devices of the surgeon console may couple motion into surgical instruments. In the MaM mode, the input devices interact with the GUI and interactive graphical objects. The pointer is manipulated in three dimensions by input devices having at least three degrees of freedom. Interactive graphical objects are related to physical objects in the surgical site or a function thereof and are manipulatable by the input devices.
    Type: Grant
    Filed: October 4, 2017
    Date of Patent: March 1, 2022
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Simon P. DiMaio, Christopher J. Hasser, Russell H. Taylor, David Q. Larkin, Peter Kazanzides, Anton Deguet, Balazs Peter Vagvolgyi, Joshua Leven
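The mode switch at the heart of this abstract — the same master input devices driving either the instruments (following mode) or a GUI pointer (masters-as-mice mode) — amounts to routing input deltas to one of two targets. A toy sketch of that routing, with all class and attribute names invented for illustration:

```python
from enum import Enum, auto

class ConsoleMode(Enum):
    FOLLOWING = auto()        # master motion drives the instruments
    MASTERS_AS_MICE = auto()  # master motion drives the 3-D GUI pointer

class InputRouter:
    def __init__(self):
        self.mode = ConsoleMode.FOLLOWING
        self.instrument = [0.0, 0.0, 0.0]  # instrument tip position
        self.pointer = [0.0, 0.0, 0.0]     # rendered pointer position

    def toggle_mode(self):
        self.mode = (ConsoleMode.MASTERS_AS_MICE
                     if self.mode is ConsoleMode.FOLLOWING
                     else ConsoleMode.FOLLOWING)

    def move(self, dx, dy, dz):
        """Apply a 3-DOF master displacement to the active target."""
        target = (self.instrument if self.mode is ConsoleMode.FOLLOWING
                  else self.pointer)
        for i, d in enumerate((dx, dy, dz)):
            target[i] += d

router = InputRouter()
router.move(1.0, 0.0, 0.0)    # following mode: instrument moves
router.toggle_mode()
router.move(0.0, 2.0, 0.0)    # MaM mode: pointer moves instead
```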
  • Patent number: 11244508
Abstract: A device may determine a view of a user of a head mounted display. The device may obtain tracking data relating to a surgical procedure. The device may obtain, from an imaging device, surgical imaging relating to the surgical procedure. The device may orient models of objects based on the tracking data and the view of the user of the head mounted display, wherein the objects include the imaging device. The device may augment, by providing output to the head mounted display for display, the view of the user with contextual information relating to the objects based on orienting the models based on the tracking data and the view of the user, wherein the contextual information includes the surgical imaging captured by the imaging device.
    Type: Grant
    Filed: January 25, 2019
    Date of Patent: February 8, 2022
    Assignee: The Johns Hopkins University
    Inventors: Peter Kazanzides, Long Qian
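"Orienting models of objects based on the tracking data and the view of the user" is, at bottom, coordinate-frame composition: express each tracked object's pose in the HMD's frame before rendering. A minimal sketch with 4x4 homogeneous transforms (frame names and the numeric poses are illustrative):

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def model_in_view(T_world_hmd, T_world_tool):
    """Pose of a tracked tool expressed in the HMD (viewer) frame."""
    return np.linalg.inv(T_world_hmd) @ T_world_tool

# HMD 1.6 m above the world origin; tool on the table in front of it.
T_hmd = make_T(np.eye(3), np.array([0.0, 1.6, 0.0]))
T_tool = make_T(np.eye(3), np.array([0.2, 1.0, -0.5]))
tool_in_view = model_in_view(T_hmd, T_tool)[:3, 3]
ok = bool(np.allclose(tool_in_view, [0.2, -0.6, -0.5]))
```

With identity rotations the composition reduces to a translation difference; with real tracker data the rotation blocks carry the orientation needed to draw the model at the correct attitude in the user's view.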
  • Publication number: 20220015832
    Abstract: A minimally invasive surgical system is disclosed including a processor coupled to a stereoscopic endoscope and a stereoscopic video display device. The processor generates an operative image of an anatomic structure in the surgical site, overlays the operative image onto the captured stereo video images for display on the stereoscopic video display device, generates and overlays a pointer onto the operative image or the captured stereo video images to display the pointer on the stereoscopic video display device with a three dimensional appearance, and switches between a first mode for input devices of a surgeon console used to couple motion into surgical instruments and a second mode used to control an interactive graphical user interface to allow interactions with the pointer and the operative image in three dimensions using input devices having at least three degrees of freedom.
    Type: Application
    Filed: September 7, 2021
    Publication date: January 20, 2022
    Inventors: Simon P. DiMaio, Christopher J. Hasser, Russell H. Taylor, David Q. Larkin, Peter Kazanzides, Anton Deguet, Balazs Peter Vagvolgyi, Joshua Leven
  • Publication number: 20210142508
    Abstract: A calibration platform may obtain measurements for aligning a real-world coordinate system and a display coordinate system. For example, the calibration platform may display, via an optical see-through head-mounted display (OST-HMD), a three-dimensional virtual object and receive, from a positional tracking device, information that relates to a current pose of a three-dimensional real-world object to be aligned with the three-dimensional virtual object. The calibration platform may record a three-dimensional position of a plurality of points on the three-dimensional real-world object based on the current pose of the three-dimensional real-world object, based on an indication that the plurality of points on the three-dimensional real-world object respectively corresponds with a plurality of points on the three-dimensional virtual object.
    Type: Application
    Filed: January 31, 2019
    Publication date: May 13, 2021
    Applicant: The Johns Hopkins University
    Inventors: Ehsan AZIMI, Long QIAN, Peter KAZANZIDES, Nassir NAVAB
  • Publication number: 20200388075
Abstract: A device may determine a view of a user of a head mounted display. The device may obtain tracking data relating to a surgical procedure. The device may obtain, from an imaging device, surgical imaging relating to the surgical procedure. The device may orient models of objects based on the tracking data and the view of the user of the head mounted display, wherein the objects include the imaging device. The device may augment, by providing output to the head mounted display for display, the view of the user with contextual information relating to the objects based on orienting the models based on the tracking data and the view of the user, wherein the contextual information includes the surgical imaging captured by the imaging device.
    Type: Application
    Filed: January 25, 2019
    Publication date: December 10, 2020
    Inventors: Peter KAZANZIDES, Long QIAN
  • Publication number: 20200363867
    Abstract: A calibration platform may display, via an optical see-through head-mounted display (OST-HMD), a virtual image having at least one feature. The calibration platform may determine, based on information relating to a gaze of a user wearing the OST-HMD, that the user performed a voluntary eye blink to indicate that the at least one feature of the virtual image appears to the user to be aligned with at least one point on the three-dimensional real-world object. The calibration platform may record an alignment measurement based on a position of the at least one point on the three-dimensional real-world object in a real-world coordinate system based on a time when the user performed the voluntary eye blink. Accordingly, the alignment measurement may be used to generate a function providing a mapping between three-dimensional points in the real-world coordinate system and corresponding points in a display space of the OST-HMD.
    Type: Application
    Filed: January 31, 2019
    Publication date: November 19, 2020
    Applicant: The Johns Hopkins University
    Inventors: Ehsan AZIMI, Long QIAN, Peter KAZANZIDES, Nassir NAVAB
  • Patent number: 10531828
Abstract: The present invention is directed to a method and system for photoacoustic imaging for guiding medical procedures. A transducer is placed near the site of the procedure. An optical fiber, coupled to an electromagnetic source such as a laser, is attached to a medical device. During the procedure, the device and optical fiber are inserted into the procedure site, where the optical fiber illuminates the procedure site, which has a thickness of approximately 2 mm. Photoacoustic images are acquired to visualize the procedure site as the procedure is proceeding in order to provide real-time guidance. This system is applicable to multiple surgical and interventional procedures, such as transsphenoidal surgery.
    Type: Grant
    Filed: February 2, 2015
    Date of Patent: January 14, 2020
    Assignee: The Johns Hopkins University
    Inventors: Muyinatu Bell, Emad Boctor, Peter Kazanzides
  • Publication number: 20180042680
Abstract: In one embodiment of the invention, a minimally invasive surgical system is disclosed. The system is configured to capture and display camera images of a surgical site on at least one display device at a surgeon console; switch out of a following mode and into a masters-as-mice (MaM) mode; overlay a graphical user interface (GUI) including an interactive graphical object onto the camera images; and render a pointer within the camera images for user interactive control. In the following mode, the input devices of the surgeon console may couple motion into surgical instruments. In the MaM mode, the input devices interact with the GUI and interactive graphical objects. The pointer is manipulated in three dimensions by input devices having at least three degrees of freedom. Interactive graphical objects are related to physical objects in the surgical site or a function thereof and are manipulatable by the input devices.
    Type: Application
    Filed: October 4, 2017
    Publication date: February 15, 2018
    Inventors: Simon P. DiMaio, Christopher J. Hasser, Russell H. Taylor, David Q. Larkin, Peter Kazanzides, Anton Deguet, Balazs Peter Vagvolgyi, Joshua Leven
  • Patent number: 9815206
    Abstract: According to some embodiments of the present invention, a cooperatively controlled robot includes a robotic actuator assembly comprising a tool holder and a force sensor, a control system adapted to communicate with the robotic actuator assembly and the force sensor, and an output system in communication with the control system. The tool holder is configured to receive a tool to be manipulated by a user. The control system is configured to receive an instruction from a user to switch from a robot control mode into a user interface control mode. The force sensor is configured to measure at least one of a force and a torque applied to the tool, and the control system is configured to receive an indication of the at least one of a force and a torque applied to the tool and manipulate the output system based on the indication.
    Type: Grant
    Filed: September 25, 2014
    Date of Patent: November 14, 2017
    Assignee: THE JOHNS HOPKINS UNIVERSITY
    Inventors: Marcin A. Balicki, Peter Kazanzides, Anton Deguet, Russell H. Taylor
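In cooperative ("hands-on") control of the kind this abstract describes, a common scheme maps the force the user applies to the tool into a commanded tool velocity. A one-axis admittance-law sketch — the gain, deadband, and linear form are illustrative assumptions, not the claimed controller:

```python
def admittance_velocity(force, gain=0.002, deadband=0.5):
    """Map a measured hand force (N) on the tool to a commanded
    tool velocity (m/s). Gain and deadband values are illustrative.

    A deadband suppresses sensor noise and incidental light touches;
    beyond it, velocity grows linearly with the excess force.
    """
    mag = abs(force)
    if mag < deadband:
        return 0.0
    sign = 1.0 if force > 0 else -1.0
    return sign * gain * (mag - deadband)
```

Running the same law per axis on the force/torque sensor's readings yields a robot that "follows the hand"; the patent's mode switch would redirect these same readings to drive a user interface instead of the robot.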
  • Patent number: 9795446
    Abstract: In one embodiment of the invention, a method for a minimally invasive surgical system is disclosed. The method includes capturing and displaying camera images of a surgical site on at least one display device at a surgeon console; switching out of a following mode and into a masters-as-mice (MaM) mode; overlaying a graphical user interface (GUI) including an interactive graphical object onto the camera images; and rendering a pointer within the camera images for user interactive control. In the following mode, the input devices of the surgeon console may couple motion into surgical instruments. In the MaM mode, the input devices interact with the GUI and interactive graphical objects. The pointer is manipulated in three dimensions by input devices having at least three degrees of freedom. Interactive graphical objects are related to physical objects in the surgical site or a function thereof and are manipulatable by the input devices.
    Type: Grant
    Filed: February 25, 2013
    Date of Patent: October 24, 2017
    Assignee: Intuitive Surgical Operations, Inc.
    Inventors: Simon P. DiMaio, Christopher J. Hasser, Russell H. Taylor, David Q. Larkin, Peter Kazanzides, Anton Deguet, Balazs Peter Vagvolgyi, Joshua Leven
  • Patent number: 9770828
    Abstract: A combined teleoperative-cooperative controllable robotic system includes a robotic actuator assembly, a control system adapted to communicate with the robotic actuator assembly, and a teleoperation unit adapted to communicate with the control system. The control system is configured to control at least a first portion of the robotic actuator assembly in response to at least one of a force or a torque applied to at least a second portion of the robotic actuator assembly by a first user for cooperative control. The control system is further configured to control at least a third portion of the robotic actuator assembly in response to input by a second user from the teleoperation unit for teleoperative control.
    Type: Grant
    Filed: September 28, 2012
    Date of Patent: September 26, 2017
    Assignee: THE JOHNS HOPKINS UNIVERSITY
    Inventors: Russell H. Taylor, Marcin A. Balicki, Peter Kazanzides, Xia Tian
  • Publication number: 20160089212
    Abstract: According to some embodiments of the present invention, a cooperatively controlled robot includes a robotic actuator assembly comprising a tool holder and a force sensor, a control system adapted to communicate with the robotic actuator assembly and the force sensor, and an output system in communication with the control system. The tool holder is configured to receive a tool to be manipulated by a user. The control system is configured to receive an instruction from a user to switch from a robot control mode into a user interface control mode. The force sensor is configured to measure at least one of a force and a torque applied to the tool, and the control system is configured to receive an indication of the at least one of a force and a torque applied to the tool and manipulate the output system based on the indication.
    Type: Application
    Filed: September 25, 2014
    Publication date: March 31, 2016
    Inventors: Marcin A. Balicki, Peter Kazanzides, Anton Deguet, Russell H. Taylor
  • Publication number: 20150223903
Abstract: The present invention is directed to a method and system for photoacoustic imaging for guiding medical procedures. A transducer is placed near the site of the procedure. An optical fiber, coupled to an electromagnetic source such as a laser, is attached to a medical device. During the procedure, the device and optical fiber are inserted into the procedure site, where the optical fiber illuminates the procedure site, which has a thickness of approximately 2 mm. Photoacoustic images are acquired to visualize the procedure site as the procedure is proceeding in order to provide real-time guidance. This system is applicable to multiple surgical and interventional procedures, such as transsphenoidal surgery.
    Type: Application
    Filed: February 2, 2015
    Publication date: August 13, 2015
    Inventors: Muyinatu Bell, Emad Boctor, Peter Kazanzides