Patents by Inventor Mathias Kolsch

Mathias Kolsch has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220398844
    Abstract: Augmented reality (AR) or virtual reality (VR) systems described herein can be configured to record images, video, and/or annotations for concurrent communication to a remote system for display or subsequent access. A communication between a user and an expert user using the system can include an audio communication (unidirectional or bidirectional), a video communication from the user to the expert user (allowing the expert user to see, in real time, the same environment as the user), and a data communication (via which content overlaid over the video communication, such as annotations, may be displayed for both the expert user and the user). The systems can be configured to communicate concurrently while also recording the session, so that a “live” session in which an expert assists a user with a current issue can be played back later by other users.
    Type: Application
    Filed: August 16, 2022
    Publication date: December 15, 2022
    Inventors: Gerald Wright, Jr., John James Lechleiter, Michael Gervautz, Steffen Gauglitz, Mathias Kolsch, Arungundram Mahendran
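
The abstract above describes an architecture with three concurrent channels (audio, video, and data/annotations) that are simultaneously mirrored into a recording for later playback. Below is a minimal sketch of that session model, not the patented implementation; all class, field, and channel names are illustrative assumptions.

    import time
    from dataclasses import dataclass, field
    from typing import Iterator, List, Literal

    Channel = Literal["audio", "video", "data"]

    @dataclass
    class SessionEvent:
        channel: Channel   # which concurrent stream the payload belongs to
        sender: str        # "user" or "expert"
        payload: bytes     # encoded audio chunk, video frame, or annotation blob
        timestamp: float = field(default_factory=time.time)

    @dataclass
    class ARSession:
        events: List[SessionEvent] = field(default_factory=list)

        def send(self, channel: Channel, sender: str, payload: bytes) -> None:
            """Forward to the remote peer (delivery elided) and record concurrently."""
            self.events.append(SessionEvent(channel, sender, payload))

        def replay(self) -> Iterator[SessionEvent]:
            """Yield recorded events in capture order for later playback."""
            yield from sorted(self.events, key=lambda e: e.timestamp)

    # Usage: the user streams video while the expert overlays an annotation.
    session = ARSession()
    session.send("video", "user", b"<camera frame>")
    session.send("data", "expert", b'{"shape": "circle", "x": 0.4, "y": 0.6}')
    for event in session.replay():
        print(event.channel, event.sender, event.timestamp)
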
  • Patent number: 11417091
    Abstract: A shared augmented reality system can support the sharing of video captured by a local user, using a head mounted display (HMD), with a remote user. The remote user may add augmented reality annotations (markings, notes, drawings) to certain objects within the environment captured within the video, where the annotations track the movement of those objects within the shared video. An HMD may not, however, provide a convenient interface for performing certain user-interface intensive tasks, which might be better performed on an additional device such as a mobile phone, tablet, or computer. During a shared augmented reality session, the additional device can be configured to communicate with an HMD such that certain tasks can be performed by the user through the additional device, and other tasks can be performed or experienced through the HMD. The additional device, the HMD, and the remote user's device can communicatively coordinate during the session.
    Type: Grant
    Filed: May 30, 2018
    Date of Patent: August 16, 2022
    Inventors: Gerald Wright, Jr., John James Lechleiter, Devender Yamakawa, Michael Gervautz, Steffen Gauglitz, Mathias Kolsch, Arungundram Mahendran
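
This grant (and the related publication 20200394012 below, which shares its abstract) centers on splitting tasks between the HMD and a companion device. A minimal sketch of that routing decision follows; the task names and the routing table are assumptions for illustration, not claim language.

    from enum import Enum, auto

    class Device(Enum):
        HMD = auto()
        COMPANION = auto()   # mobile phone, tablet, or computer

    # Assumed routing table: tasks that are awkward on an HMD go to the companion.
    UI_INTENSIVE_TASKS = {"enter_text", "browse_manual", "select_annotation_tool"}

    def route_task(task: str) -> Device:
        """Pick the device on which a task is performed during a shared session."""
        return Device.COMPANION if task in UI_INTENSIVE_TASKS else Device.HMD

    for task in ("enter_text", "render_annotation_overlay"):
        print(task, "->", route_task(task).name)
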
  • Publication number: 20200394012
    Abstract: A shared augmented reality system can support the sharing of video captured by a local user, using a head mounted display (HMD), with a remote user. The remote user may add augmented reality annotations (markings, notes, drawings) to certain objects within the environment captured within the video, where the annotations track the movement of those objects within the shared video. An HMD may not, however, provide a convenient interface for performing certain user-interface intensive tasks, which might be better performed on an additional device such as a mobile phone, tablet, or computer. During a shared augmented reality session, the additional device can be configured to communicate with an HMD such that certain tasks can be performed by the user through the additional device, and other tasks can be performed or experienced through the HMD. The additional device, the HMD, and the remote user's device can communicatively coordinate during the session.
    Type: Application
    Filed: May 30, 2018
    Publication date: December 17, 2020
    Inventors: Gerald Wright, Jr., John James Lechleiter, Devender Yamakawa, Michael Gervautz, Steffen Gauglitz, Mathias Kolsch, Arungundram Mahendran
  • Patent number: 9489851
    Abstract: Embodiments in accordance with the invention provide a Landing Signal Officer (LSO) Information Management and Trend Analysis (IMTA) system for electronically capturing landing performance data related to aircraft approaches and landings in an IMTA application residing on a portable electronic device (PED) and for automatically generating performance data and trend analysis of the data. In one embodiment, data is input by a user, such as an LSO, to one or more context-sensitive graphical user interfaces displayed on a touch-screen PED. Data entered into and generated by the IMTA application can be further communicated to and updated by external computer systems and appended with additional data and/or video available from those systems.
    Type: Grant
    Filed: March 27, 2015
    Date of Patent: November 8, 2016
    Assignee: The United States of America, as represented by the Secretary of the Navy
    Inventors: Michael Gregory Ross, Michael E. McCauley, Neil Charles Rowe, Mathias Kolsch, Arijit Das, Terry D. Norbraten
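
The abstract describes capturing per-pass landing data on a touch-screen PED and generating trend analyses from it. Below is a minimal sketch of such a record and a moving-average trend; the grade-to-score mapping and field names are illustrative assumptions, not the Navy's actual grading scheme or the IMTA data model.

    from dataclasses import dataclass
    from statistics import mean
    from typing import List

    # Assumed grade-to-score mapping for illustration only.
    GRADE_SCORES = {"OK": 4.0, "FAIR": 3.0, "NO_GRADE": 2.0, "CUT": 0.0}

    @dataclass
    class LandingPass:
        pilot: str
        aircraft: str
        wire: int    # arresting wire caught (1-4; 0 for a bolter)
        grade: str   # grade keyed in by the LSO on the PED

    def grade_trend(passes: List[LandingPass], window: int = 5) -> List[float]:
        """Moving average of grade scores over the most recent passes."""
        scores = [GRADE_SCORES[p.grade] for p in passes]
        return [mean(scores[max(0, i + 1 - window): i + 1]) for i in range(len(scores))]

    log = [
        LandingPass("PILOT1", "F/A-18E", 3, "OK"),
        LandingPass("PILOT1", "F/A-18E", 2, "FAIR"),
        LandingPass("PILOT1", "F/A-18E", 3, "OK"),
    ]
    print(grade_trend(log))  # approximately [4.0, 3.5, 3.67]
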
  • Patent number: 7893935
    Abstract: The present invention relates to a system, method, and computer program product for immersive navigation in a virtual environment (VE), suitable for allowing a user to change view orientation in the VE independently of the physical orientation of a user input, such as the orientation of the user's head.
    Type: Grant
    Filed: November 30, 2009
    Date of Patent: February 22, 2011
    Assignee: HRL Laboratories, LLC
    Inventors: Howard Neely, III, Jason Fox, Mathias Kolsch, Matthew Shomphe, Jason Jerald
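
The key idea here is decoupling the rendered view orientation from the tracked orientation of an input such as the head. The sketch below shows one assumed mechanism (composing a user-driven yaw offset with the tracked head yaw); the patent's actual method may differ.

    def view_yaw(head_yaw_deg: float, offset_yaw_deg: float) -> float:
        """Compose tracked head yaw with an input-driven offset, wrapped to [0, 360)."""
        return (head_yaw_deg + offset_yaw_deg) % 360.0

    head_yaw = 10.0     # physical head orientation reported by the tracker
    offset_yaw = 45.0   # accumulated from a joystick or similar input
    print(view_yaw(head_yaw, offset_yaw))  # 55.0: the view turns, the head does not
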
  • Patent number: 7646394
    Abstract: The present invention relates to a system, method, and computer program product for enabling user interaction with objects in a virtual environment independently of apparent virtual viewpoint altitude, by non-linearly scaling the virtual actuator. In doing so, the system receives a virtual-viewpoint position and a virtual-actuator position from a virtual environment processing subsystem, and a real-viewpoint position and a real-actuator position from a real-world environment tracking subsystem. An xy-scale factor is then calculated based on the virtual-viewpoint position. A non-linear mapping is thereafter calculated between a real dataset and a virtual dataset based on the xy-scale factor. The real dataset comprises the real-actuator position and the real-viewpoint position in the real-world environment, and the virtual dataset comprises the virtual-actuator position and the virtual-viewpoint position in the virtual environment.
    Type: Grant
    Filed: March 7, 2005
    Date of Patent: January 12, 2010
    Assignees: HRL Laboratories, LLC, Raytheon Systems
    Inventors: Howard Neely, III, Jason Fox, Mathias Kolsch, Matt Shomphe, Jason Jerald, Mike Daily
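
The abstract above spells out the data flow: compute an xy-scale factor from the virtual-viewpoint position, then map the real actuator offset into the virtual environment. A minimal sketch follows; the power-law scale function and the use of the z coordinate as apparent altitude are assumptions for illustration, since the patent defines its own xy-scale factor.

    from typing import Tuple

    Vec3 = Tuple[float, float, float]

    def xy_scale(virtual_altitude: float, k: float = 1.0, gamma: float = 1.5) -> float:
        """Assumed non-linear scale factor that grows faster than altitude itself."""
        return k * max(virtual_altitude, 0.0) ** gamma

    def map_actuator(real_actuator: Vec3, real_viewpoint: Vec3,
                     virtual_viewpoint: Vec3) -> Vec3:
        """Map the real actuator offset into the VE, scaling only x and y."""
        s = xy_scale(virtual_viewpoint[2])  # z treated as apparent altitude
        dx, dy, dz = (a - v for a, v in zip(real_actuator, real_viewpoint))
        return (virtual_viewpoint[0] + s * dx,
                virtual_viewpoint[1] + s * dy,
                virtual_viewpoint[2] + dz)  # altitude offset left unscaled

    # At a higher apparent altitude, the same arm movement reaches farther in the VE.
    print(map_actuator((0.3, 0.0, 0.0), (0.0, 0.0, 0.0), (10.0, 20.0, 100.0)))
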