Patents by Inventor Praveen BABU J D

Praveen BABU J D has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
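Short illustrative code sketches for a few of the listed inventions appear after the listing.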

  • Publication number: 20210177370
    Abstract: A method of viewing a patient, including inserting a catheter, is described for health procedure navigation. A CT scan is carried out on a body part of the patient. Raw data from the CT scan is processed to create three-dimensional image data, which is stored in a data store. Projectors receive light generated in a pattern representative of the image data, and waveguides guide the light to a retina of an eye of a viewer while light from an external surface of the body is transmitted to the retina, so that the viewer sees the external surface of the body augmented with the rendering of the processed image data of the body part.
    Type: Application
    Filed: August 22, 2019
    Publication date: June 17, 2021
    Applicant: Magic Leap, Inc.
    Inventors: Nastasja U Robaina, Praveen Babu J D, David Charles Lundmark, Alexander Ilic
  • Patent number: 11024086
    Abstract: Disclosed is an approach for managing and displaying virtual content in a mixed reality environment on a one-on-one basis, independently by each application: each piece of virtual content is rendered by its respective application into a bounded volume referred to herein as a “Prism.” Each Prism may have characteristics and properties that allow a universe application to manage and display the Prism in the mixed reality environment, such that the universe application manages the placement and display of the virtual content by managing the Prism itself.
    Type: Grant
    Filed: December 18, 2018
    Date of Patent: June 1, 2021
    Assignee: Magic Leap, Inc.
    Inventors: June Tate-Gans, Eric Norman Yiskis, Mark Ashley Rushton, David William Hover, Praveen Babu J D
  • Patent number: 11017592
    Abstract: A method is disclosed, the method comprising the steps of receiving, from a first client application, first graphical data comprising a first node; receiving, from a second client application independent of the first client application, second graphical data comprising a second node; and generating a scenegraph, wherein the scenegraph describes a hierarchical relationship between the first node and the second node.
    Type: Grant
    Filed: March 29, 2018
    Date of Patent: May 25, 2021
    Assignee: Magic Leap, Inc.
    Inventor: Praveen Babu J D
  • Patent number: 10939084
    Abstract: Disclosed is an approach for displaying 3D videos in a VR and/or AR system. The 3D videos may include 3D animated objects that escape from the display screen. The 3D videos may interact with objects within the VR and/or AR environment. The 3D video may be interactive with a user, such that user input corresponding to decisions elected by the user at certain portions of the 3D video may result in a different storyline and possibly a different conclusion for the 3D video. The 3D video may be a 3D icon displayed within a portal of a final 3D render world.
    Type: Grant
    Filed: December 19, 2018
    Date of Patent: March 2, 2021
    Assignee: Magic Leap, Inc.
    Inventors: Praveen Babu J D, Sean Christopher Riley
  • Publication number: 20200363865
    Abstract: Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The multiple inputs can also be used by the wearable system to permit a user to interact with text, such as, e.g., composing, selecting, or editing text.
    Type: Application
    Filed: August 4, 2020
    Publication date: November 19, 2020
    Inventors: James M. Powderly, Savannah Niles, Jennifer M.R. Devine, Adam C. Carlson, Jeffrey Scott Sommers, Praveen Babu J D
  • Patent number: 10768693
    Abstract: Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The multiple inputs can also be used by the wearable system to permit a user to interact with text, such as, e.g., composing, selecting, or editing text.
    Type: Grant
    Filed: April 17, 2018
    Date of Patent: September 8, 2020
    Assignee: Magic Leap, Inc.
    Inventors: James M. Powderly, Savannah Niles, Jennifer M. R. Devine, Adam C. Carlson, Jeffrey Scott Sommers, Praveen Babu J D
  • Publication number: 20200036816
    Abstract: A host device having a first processor executes an application via the first processor. The host device determines a state of the application. A scenegraph is generated corresponding to the state of the application, and the scenegraph is presented to a remote device having a display and a second processor. The remote device is configured to, in response to receiving the scenegraph, render to the display a view corresponding to the scenegraph, without executing the application via the second processor.
    Type: Application
    Filed: July 22, 2019
    Publication date: January 30, 2020
    Inventors: Praveen Babu J D, Karen Stolzenberg, Jehangir Tajik, Rohit Anil Talwalkar, Colman Thomas Bryant, Leonid Zolotarev
  • Publication number: 20190199993
    Abstract: Disclosed is an approach for displaying 3D videos in a VR and/or AR system. The 3D videos may include 3D animated objects that escape from the display screen. The 3D videos may interact with objects within the VR and/or AR environment. The 3D video may be interactive with a user, such that user input corresponding to decisions elected by the user at certain portions of the 3D video may result in a different storyline and possibly a different conclusion for the 3D video. The 3D video may be a 3D icon displayed within a portal of a final 3D render world.
    Type: Application
    Filed: December 19, 2018
    Publication date: June 27, 2019
    Applicant: Magic Leap, Inc.
    Inventors: Praveen Babu J D, Sean Christopher Riley, Jeffrey Scott Sommers
  • Publication number: 20180307303
    Abstract: Examples of wearable systems and methods can use multiple inputs (e.g., gesture, head pose, eye gaze, voice, and/or environmental factors (e.g., location)) to determine a command that should be executed and objects in the three-dimensional (3D) environment that should be operated on. The multiple inputs can also be used by the wearable system to permit a user to interact with text, such as, e.g., composing, selecting, or editing text.
    Type: Application
    Filed: April 17, 2018
    Publication date: October 25, 2018
    Inventors: James M. Powderly, Savannah Niles, Jennifer M.R. Devine, Adam C. Carlson, Jeffrey Sommers, Praveen Babu J D
  • Publication number: 20180286116
    Abstract: A method is disclosed, the method comprising the steps of receiving, from a first client application, first graphical data comprising a first node; receiving, from a second client application independent of the first client application, second graphical data comprising a second node; and generating a scenegraph, wherein the scenegraph describes a hierarchical relationship between the first node and the second node.
    Type: Application
    Filed: March 29, 2018
    Publication date: October 4, 2018
    Inventor: Praveen Babu J D
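
The “Prism” abstraction in patent 11024086 above is essentially a management layer: each application renders into a bounded volume, and a separate universe application places and shows those volumes without touching their contents. The Python sketch below illustrates that split; the class and method names (Prism, UniverseApp, and their methods) are assumptions for illustration, not Magic Leap's actual API.

# Minimal sketch of the "Prism" idea from patent 11024086: each application
# renders its virtual content into a bounded volume (a Prism), and a separate
# "universe" manager places and displays content only by managing the Prisms.
# All names here are illustrative, not an actual API.
from dataclasses import dataclass


@dataclass
class Prism:
    """A bounded volume owning one application's rendered content."""
    owner_app: str
    position: tuple = (0.0, 0.0, 0.0)   # world-space placement
    size: tuple = (1.0, 1.0, 1.0)       # bounding-box extents
    visible: bool = True
    content: object = None              # whatever the owning app rendered


class UniverseApp:
    """Manages placement and visibility of Prisms, never the content inside them."""

    def __init__(self):
        self._prisms: list[Prism] = []

    def create_prism(self, owner_app: str) -> Prism:
        prism = Prism(owner_app=owner_app)
        self._prisms.append(prism)
        return prism

    def move(self, prism: Prism, position: tuple) -> None:
        prism.position = position       # repositioning the Prism moves its content with it

    def set_visible(self, prism: Prism, visible: bool) -> None:
        prism.visible = visible


# Each client application renders into its own Prism independently.
universe = UniverseApp()
chess_prism = universe.create_prism("chess_app")
chess_prism.content = {"mesh": "chess_board"}   # stand-in for real render output
universe.move(chess_prism, (0.5, 1.0, -2.0))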
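
Patent 11017592 and publication 20180286116 above describe composing graphical data submitted by independent client applications into a single scenegraph. A minimal sketch of that idea, with all class and method names assumed for illustration:

# Rough sketch of a centralized scenegraph: two independent client
# applications each submit nodes, and a shared service composes them into one
# hierarchy so a single traversal can render both clients' content.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Node:
    name: str
    client: str                          # which client application submitted it
    children: List["Node"] = field(default_factory=list)


class SceneGraphService:
    def __init__(self):
        self.root = Node(name="root", client="system")

    def submit(self, client: str, node: Node,
               parent: Optional[Node] = None) -> None:
        """Attach a client's node into the shared hierarchy."""
        node.client = client
        (parent or self.root).children.append(node)

    def walk(self, node=None, depth=0):
        """Depth-first traversal, e.g. for a renderer drawing all clients' content."""
        node = node or self.root
        yield depth, node
        for child in node.children:
            yield from self.walk(child, depth + 1)


service = SceneGraphService()
service.submit("app_a", Node(name="teapot", client="app_a"))
service.submit("app_b", Node(name="label", client="app_b"))
for depth, node in service.walk():
    print("  " * depth + f"{node.name} ({node.client})")

Because both clients' nodes live in one hierarchy, the service can describe relationships (such as parenting or ordering) between content that the two applications produced independently.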
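
Publication 20200036816 above splits work between a host that executes the application and a remote device that only renders a scenegraph describing the application's state. A rough sketch of that split, assuming a JSON wire format purely for illustration:

# Sketch of the host/remote split: the host runs the application and
# serializes a scenegraph describing its current state; the remote device only
# deserializes and renders that scenegraph, never executing the application.
import json


def host_build_scenegraph(app_state: dict) -> str:
    """Host side: translate application state into a serialized scenegraph."""
    scenegraph = {
        "root": {
            "children": [
                {"type": "mesh", "name": name, "transform": pose}
                for name, pose in app_state.items()
            ]
        }
    }
    return json.dumps(scenegraph)


def remote_render(serialized: str) -> None:
    """Remote side: draw whatever the scenegraph describes; no app logic here."""
    scenegraph = json.loads(serialized)
    for node in scenegraph["root"]["children"]:
        print(f"draw {node['name']} at {node['transform']}")


# The host ticks the application and streams scenegraph updates to the remote.
state = {"cube": [0.0, 1.2, -1.0], "pointer": [0.1, 1.0, -0.5]}
remote_render(host_build_scenegraph(state))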
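
Patent 10768693 and the related publications above combine multiple inputs (voice, eye gaze, gesture, head pose) to determine both a command and the 3D object it targets. The fusion rule in this sketch (voice supplies the verb; gaze, then gesture, supplies the target) is an assumption for illustration, not the patented method:

# Illustrative sketch of multimodal input fusion: several input channels are
# combined to decide both the command to execute and the object it targets.
from dataclasses import dataclass
from typing import Optional


@dataclass
class InputSnapshot:
    voice_command: Optional[str]    # e.g. "move", "delete", "select"
    gaze_target: Optional[str]      # object id the eyes are fixated on
    gesture_target: Optional[str]   # object id a hand gesture points at


def resolve_command(snapshot: InputSnapshot) -> Optional[tuple]:
    """Return (command, target_object), or None if the inputs are ambiguous."""
    command = snapshot.voice_command
    # Prefer eye gaze for target selection; fall back to the pointing gesture.
    target = snapshot.gaze_target or snapshot.gesture_target
    if command and target:
        return command, target
    return None


print(resolve_command(InputSnapshot("move", gaze_target="window_1",
                                    gesture_target=None)))
# -> ('move', 'window_1')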