Patents by Inventor Joseph G. Hager, IV

Joseph G. Hager, IV has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11475647
    Abstract: Systems and methods are presented for immersive and simultaneous animation in a mixed reality environment. Techniques disclosed represent a physical object, present at a scene, in a 3D space of a virtual environment associated with the scene. A virtual element is posed relative to the representation of the physical object in the virtual environment. The virtual element is displayed to users from a perspective of each user in the virtual environment. Responsive to an interaction of one user with the virtual element, an edit command is generated and the pose of the virtual element is adjusted in the virtual environment according to the edit command. The display of the virtual element to the users is then updated according to the adjusted pose. When simultaneous and conflicting edit commands are generated by collaborating users, policies to reconcile the conflicting edit commands are disclosed.
    Type: Grant
    Filed: April 28, 2021
    Date of Patent: October 18, 2022
    Assignee: Disney Enterprises, Inc.
    Inventors: Corey D. Drake, Kenneth J. Mitchell, Rachel E. Rodgers, Joseph G. Hager, IV, Kyna P. McIntosh, Ye Pan
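The abstract above mentions policies for reconciling simultaneous, conflicting edit commands from collaborating users, but does not spell one out. As an illustration only (not the patented method), here is a minimal sketch of one plausible policy: prefer the highest-priority user, breaking ties by earliest timestamp. The `EditCommand` fields and the priority map are assumptions for this sketch.

```python
from dataclasses import dataclass

@dataclass
class EditCommand:
    """A pose edit issued by one collaborating user (illustrative fields)."""
    user_id: str
    timestamp: float  # seconds since session start
    pose: tuple       # (x, y, z) target position for the virtual element

def reconcile(commands, priority):
    """Pick the winning edit among simultaneous, conflicting commands.

    Illustrative policy: lowest priority number wins; ties go to the
    earliest timestamp ("first writer wins" among peers).
    """
    return min(commands, key=lambda c: (priority.get(c.user_id, 99), c.timestamp))
```

In a real system the losing commands would typically still be acknowledged to their senders so every user's display converges on the reconciled pose.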
  • Publication number: 20220215634
Abstract: Systems and methods are presented for immersive and simultaneous animation in a mixed reality environment. Techniques disclosed represent a physical object, present at a scene, in a 3D space of a virtual environment associated with the scene. A virtual element is posed relative to the representation of the physical object in the virtual environment. The virtual element is displayed to users from a perspective of each user in the virtual environment. Responsive to an interaction of one user with the virtual element, an edit command is generated and the pose of the virtual element is adjusted in the virtual environment according to the edit command. The display of the virtual element to the users is then updated according to the adjusted pose. When simultaneous and conflicting edit commands are generated by collaborating users, policies to reconcile the conflicting edit commands are disclosed.
    Type: Application
    Filed: April 28, 2021
    Publication date: July 7, 2022
    Inventors: Corey D. Drake, Kenneth J. Mitchell, Rachel E. Rodgers, Joseph G. Hager, IV, Kyna P. McIntosh, Ye Pan
  • Patent number: 11354852
    Abstract: The present disclosure relates generally to systems and methods for creating a mixed reality environment. A mixed reality system includes a performance area for generating a mixed reality environment, a motion determination module that determines and tracks the motion of an object within the performance area, a physical article module that generates a physical article within the performance area, and a mixed reality display that displays a virtual article within the performance area, wherein the physical article and the virtual article are correlated to the motion of the object.
    Type: Grant
    Filed: October 10, 2019
    Date of Patent: June 7, 2022
Assignee: Disney Enterprises, Inc.
    Inventors: Leslie M. Evans, Siroberto Scerbo, Clare M. Carroll, Joseph G. Hager, IV, Nicholas S. Newberg, Alexis P. Wieland, Jonathan Becker
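The abstract above describes a physical article and a virtual article both correlated to the tracked motion of an object in the performance area. As a toy illustration of that coordination (an assumption-laden sketch, not the patented design), the coordinator below places the physical article at the object's current position and renders the virtual article slightly ahead along its motion. The class name, `lead_time` parameter, and output dictionary are all invented for this sketch.

```python
class MixedRealityCoordinator:
    """Toy coordinator correlating physical and virtual articles to motion.

    The physical article (e.g., a fog or light effect) is commanded at the
    object's tracked position; the virtual article is rendered a short
    lead time ahead along the object's velocity, so both track the motion.
    """

    def __init__(self, lead_time=0.1):
        self.lead_time = lead_time  # seconds of look-ahead for the virtual article

    def update(self, position, velocity):
        """position, velocity: (x, y, z) tuples from the motion module."""
        physical = position
        virtual = tuple(p + self.lead_time * v for p, v in zip(position, velocity))
        return {"physical_article": physical, "virtual_article": virtual}
```

A real motion determination module would feed `update` at the tracking rate, and the two outputs would drive the physical article module and the mixed reality display respectively.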
  • Publication number: 20210374982
    Abstract: A system can be used in conjunction with a display configured to display an augmented reality (AR) environment including a virtual object placed in a real environment, the virtual object having a virtual location in the AR environment. The system includes a projector, a memory storing a software code, and a hardware processor configured to execute the software code to: determine a projector location of the projector in the real environment; generate a shadow projection in the real environment, the shadow projection corresponding to the virtual object and being based on the virtual location of the virtual object and the projector location; and project, using the projector, a light pattern in the real environment, the light pattern including a light projection and the shadow projection corresponding to the virtual object.
    Type: Application
    Filed: May 27, 2020
    Publication date: December 2, 2021
    Inventors: Zdravko V. Velinov, Kenneth John Mitchell, Joseph G. Hager, IV
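The abstract above generates a shadow projection for a virtual object based on the object's virtual location and the projector's location. The core geometry can be sketched as casting a ray from the projector through the object down to the ground plane: the point where that ray lands is where the projected light pattern should leave a dark region. This is an illustrative single-point calculation, not the patented rendering pipeline; coordinates are (x, y, z) with y as height, and the function name is invented here.

```python
def shadow_point(projector, obj, ground_y=0.0):
    """Project one point of a virtual object onto the ground plane.

    Follows the ray from the projector through the object until it hits
    y == ground_y; the returned point marks where the light pattern should
    be dark so the virtual object appears to cast a physical shadow.
    """
    px, py, pz = projector
    ox, oy, oz = obj
    if py <= oy:
        raise ValueError("projector must be above the virtual object")
    t = (py - ground_y) / (py - oy)  # ray parameter at the ground plane
    return (px + t * (ox - px), ground_y, pz + t * (oz - pz))
```

Repeating this over the object's silhouette yields the full shadow projection, which is then composited into the projector's light pattern.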
  • Patent number: 11107286
    Abstract: There are provided systems and methods for synchronizing effects for multi-user mixed reality experiences. In one implementation, such a system includes a computing platform having a hardware processor and a system memory storing a software code. The hardware processor executes the software code to receive sensor data from multiple sensors within a venue, identify an activity in the venue based on the sensor data, and track a respective perspective and a respective location within the venue of each of multiple observers of the activity. The hardware processor also executes the software code to identify a first effect triggered by an action of one of the observers, conform the first effect to the respective perspective and the respective location of each of the observers to produce multiple second effects corresponding to the first effect, and output the second effects for operating multiple actuating devices within the venue during the activity.
    Type: Grant
    Filed: September 25, 2019
    Date of Patent: August 31, 2021
    Assignee: Disney Enterprises, Inc.
    Inventors: Leslie Evans, Nicholas Newberg, Alexis P. Wieland, Clare M. Carroll, Joseph G. Hager, IV, Siroberto Scerbo, Jonathan Becker
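The abstract above conforms a single triggered effect to each observer's tracked perspective and location, producing one "second effect" per observer. As an illustration of that per-observer transformation only (not the patented implementation), the sketch below converts the effect's origin into each observer's local frame as a distance and bearing, which local actuating devices could use to render the effect from that viewpoint. The observer dictionary keys are assumptions for this sketch.

```python
import math

def conform_effect(effect_origin, observers):
    """Produce one per-observer effect from a single triggered effect.

    effect_origin: (x, y) position of the first effect in the venue.
    observers: list of dicts with "id", "location" (x, y), and
    "heading" (radians) from the tracking system.
    """
    second_effects = []
    for obs in observers:
        dx = effect_origin[0] - obs["location"][0]
        dy = effect_origin[1] - obs["location"][1]
        second_effects.append({
            "observer": obs["id"],
            "distance": math.hypot(dx, dy),                  # range to the effect
            "bearing": math.atan2(dy, dx) - obs["heading"],  # angle in the observer's frame
        })
    return second_effects
```

Each resulting record would then be routed to the actuating devices nearest that observer, so everyone perceives the same effect from their own perspective.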
  • Patent number: 11024098
    Abstract: Systems and methods are presented for immersive and simultaneous animation in a mixed reality environment. Techniques disclosed represent a physical object, present at a scene, in a 3D space of a virtual environment associated with the scene. A virtual element is posed relative to the representation of the physical object in the virtual environment. The virtual element is displayed to users from a perspective of each user in the virtual environment. Responsive to an interaction of one user with the virtual element, an edit command is generated and the pose of the virtual element is adjusted in the virtual environment according to the edit command. The display of the virtual element to the users is then updated according to the adjusted pose. When simultaneous and conflicting edit commands are generated by collaborating users, policies to reconcile the conflicting edit commands are disclosed.
    Type: Grant
    Filed: June 30, 2020
    Date of Patent: June 1, 2021
    Assignee: Disney Enterprises, Inc.
    Inventors: Corey D. Drake, Kenneth J. Mitchell, Rachel E. Rodgers, Joseph G. Hager, IV, Kyna P. McIntosh, Ye Pan
  • Publication number: 20210110598
    Abstract: The present disclosure relates generally to systems and methods for creating a mixed reality environment. A mixed reality system includes a performance area for generating a mixed reality environment, a motion determination module that determines and tracks the motion of an object within the performance area, a physical article module that generates a physical article within the performance area, and a mixed reality display that displays a virtual article within the performance area, wherein the physical article and the virtual article are correlated to the motion of the object.
    Type: Application
    Filed: October 10, 2019
    Publication date: April 15, 2021
    Inventors: Leslie M. Evans, Siroberto Scerbo, Clare M. Carroll, Joseph G. Hager, IV, Nicholas S. Newberg, Alexis P. Wieland, Jonathan Becker
  • Publication number: 20210090334
    Abstract: There are provided systems and methods for synchronizing effects for multi-user mixed reality experiences. In one implementation, such a system includes a computing platform having a hardware processor and a system memory storing a software code. The hardware processor executes the software code to receive sensor data from multiple sensors within a venue, identify an activity in the venue based on the sensor data, and track a respective perspective and a respective location within the venue of each of multiple observers of the activity. The hardware processor also executes the software code to identify a first effect triggered by an action of one of the observers, conform the first effect to the respective perspective and the respective location of each of the observers to produce multiple second effects corresponding to the first effect, and output the second effects for operating multiple actuating devices within the venue during the activity.
    Type: Application
    Filed: September 25, 2019
    Publication date: March 25, 2021
    Inventors: Leslie Evans, Nicholas Newberg, Alexis P. Wieland, Clare M. Carroll, Joseph G. Hager, IV, Siroberto Scerbo, Jonathan Becker