Patents by Inventor Valentin Heun

Valentin Heun has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11776205
    Abstract: One or more image and/or depth cameras capture images and/or depth data of a physical environment over time. A computer system processes the captured data to create a static 3-dimensional (3D) model representing stationary structure and a dynamic 3D model representing moving or moveable objects within the environment. The system visually overlays the dynamic 3D model over the static 3D model in a user interface. Through the user interface, a user can create virtual spatial interaction sensors, each of which is defined by a volume of space within the environment. A virtual spatial interaction sensor can be triggered, based on analysis of the dynamic 3D model by the computer system, whenever a moveable object within the environment intersects the sensor's defined volume. Times and durations of intersections can be logged and used for process refinement.
    Type: Grant
    Filed: June 9, 2021
    Date of Patent: October 3, 2023
    Inventors: James Keat Hobin, Valentin Heun
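
The spatial-sensor mechanism described in the abstract above lends itself to a compact illustration. The following is a minimal Python sketch, not the patented implementation: the `SpatialSensor` class, its axis-aligned box volume, and the `update` method are hypothetical names and assumptions, showing only how a defined volume could be tested against points from a dynamic 3D model and how intersection times and durations could be logged.

```python
import time
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class SpatialSensor:
    # Hypothetical virtual spatial interaction sensor modeled as an axis-aligned box.
    name: str
    min_corner: Tuple[float, float, float]   # lower (x, y, z) bound of the volume, in meters
    max_corner: Tuple[float, float, float]   # upper (x, y, z) bound of the volume, in meters
    active_since: Optional[float] = None     # start time of the current intersection, if any
    log: List[Tuple[float, float]] = field(default_factory=list)  # (start_time, duration) entries

    def contains(self, point):
        # True if a 3D point from the dynamic model lies inside the sensor's volume.
        return all(lo <= p <= hi for p, lo, hi in zip(point, self.min_corner, self.max_corner))

    def update(self, dynamic_points, now=None):
        # Test the current frame of the dynamic 3D model and log intersection intervals.
        now = time.time() if now is None else now
        triggered = any(self.contains(p) for p in dynamic_points)
        if triggered and self.active_since is None:
            self.active_since = now                                        # intersection begins
        elif not triggered and self.active_since is not None:
            self.log.append((self.active_since, now - self.active_since))  # record duration
            self.active_since = None                                       # intersection ends
        return triggered
```

Calling `update` with a point inside the volume at one time and with no intersecting points at a later time appends a (start, duration) entry to `log`, mirroring the logging of intersection times and durations mentioned in the abstract.
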
  • Publication number: 20230229282
    Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, the configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones that track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
    Type: Application
    Filed: March 25, 2023
    Publication date: July 20, 2023
    Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
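
One core step this abstract describes, producing a live 3D model from the output of multiple depth cameras, can be sketched as back-projecting each depth image through its camera intrinsics and transforming the result into a shared world frame. The snippet below is an illustrative assumption, not the patented system: `depth_to_points`, `fuse_cameras`, and the calibration inputs are hypothetical, and it ignores color, meshing, and streaming.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    # Back-project a depth image (in meters) into camera-space 3D points
    # using a pinhole model with intrinsics (fx, fy, cx, cy).
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                       # drop pixels with no valid depth

def fuse_cameras(frames):
    # Merge per-camera point clouds into one cloud in a shared world frame.
    # `frames` is a list of (depth_image, intrinsics, camera_to_world) tuples,
    # where `intrinsics` is a dict with fx/fy/cx/cy and `camera_to_world` is a
    # 4x4 pose matrix assumed to come from an earlier multi-camera calibration.
    clouds = []
    for depth, K, cam_to_world in frames:
        pts = depth_to_points(depth, K["fx"], K["fy"], K["cx"], K["cy"])
        pts_h = np.c_[pts, np.ones(len(pts))]             # homogeneous coordinates
        clouds.append((cam_to_world @ pts_h.T).T[:, :3])  # transform into the world frame
    return np.vstack(clouds)                              # the combined live 3D model
```

In practice the fused cloud would feed the remote operator's live telepresence view and the 3D user interface; the sketch covers only the geometric merge.
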
  • Patent number: 11625140
    Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, the configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones that track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
    Type: Grant
    Filed: June 1, 2020
    Date of Patent: April 11, 2023
    Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
  • Patent number: 11562544
    Abstract: A display of an augmented reality (AR)-enabled device, such as a mobile phone, can be used to transfer a graphical object between a secondary display, such as a computer monitor captured by the AR device's camera, and AR space, where the object is visible only through the AR interface of the AR device. A graphical object can be selected through the AR interface and, for example, moved around on a canvas of the secondary display by the user of the AR device. When the AR interface is used to move an enabled object near an edge of the canvas or the physical boundary of the secondary display, the object can be made to disappear from the secondary display and be replaced by a virtual object shown only on the AR interface at a similar location.
    Type: Grant
    Filed: June 28, 2021
    Date of Patent: January 24, 2023
    Inventors: Valentin Heun, Benjamin Reynolds, Christian Vázquez
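
The edge-triggered handoff between the secondary display and AR space that this abstract describes can be illustrated with a small sketch. The code below is a hypothetical approximation, not the patented method: `CanvasObject`, `handoff_if_near_edge`, the `EDGE_MARGIN_PX` threshold, and the `canvas_to_world` mapping are all assumed names, showing only the idea of hiding an object on the monitor once it nears the canvas boundary and handing the AR interface a world-space counterpart.

```python
from dataclasses import dataclass

EDGE_MARGIN_PX = 24   # assumed threshold for "near the edge", in canvas pixels

@dataclass
class CanvasObject:
    obj_id: str
    x: float                      # current position on the secondary display's canvas
    y: float
    visible_on_canvas: bool = True

def handoff_if_near_edge(obj, canvas_w, canvas_h, canvas_to_world):
    # If a dragged object reaches the canvas boundary, hide it on the secondary
    # display and return a descriptor for an AR-only counterpart. `canvas_to_world`
    # is an assumed callback that maps a 2D canvas point to a 3D pose near the
    # physical screen, e.g. derived from the AR device tracking the monitor.
    near_edge = (
        obj.x < EDGE_MARGIN_PX or obj.y < EDGE_MARGIN_PX
        or obj.x > canvas_w - EDGE_MARGIN_PX or obj.y > canvas_h - EDGE_MARGIN_PX
    )
    if not near_edge:
        return None                       # object stays on the secondary display
    obj.visible_on_canvas = False         # make it disappear from the monitor
    return {"obj_id": obj.obj_id, "pose": canvas_to_world(obj.x, obj.y)}
```

A reverse transfer, from AR space back onto the canvas, could be handled symmetrically by an analogous check on the virtual object's pose relative to the tracked screen.
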
  • Publication number: 20210383604
    Abstract: One or more image and/or depth cameras capture images and/or depth data of a physical environment over time. A computer system processes the captured data to create a static 3-dimensional (3D) model representing stationary structure and a dynamic 3D model representing moving or moveable objects within the environment. The system visually overlays the dynamic 3D model over the static 3D model in a user interface. Through the user interface, a user can create virtual spatial interaction sensors, each of which is defined by a volume of space within the environment. A virtual spatial interaction sensor can be triggered, based on analysis of the dynamic 3D model by the computer system, whenever a moveable object within the environment intersects the sensor's defined volume. Times and durations of intersections can be logged and used for process refinement.
    Type: Application
    Filed: June 9, 2021
    Publication date: December 9, 2021
    Inventors: James Keat Hobin, Valentin Heun
  • Publication number: 20210335047
    Abstract: A display of an augmented reality (AR)-enabled device, such as a mobile phone, can be used to transfer a graphical object between a secondary display, such as a computer monitor captured by the AR device's camera, and AR space, where the object is visible only through the AR interface of the AR device. A graphical object can be selected through the AR interface and, for example, moved around on a canvas of the secondary display by the user of the AR device. When the AR interface is used to move an enabled object near an edge of the canvas or the physical boundary of the secondary display, the object can be made to disappear from the secondary display and be replaced by a virtual object shown only on the AR interface at a similar location.
    Type: Application
    Filed: June 28, 2021
    Publication date: October 28, 2021
    Inventors: Valentin Heun, Benjamin Reynolds, Christian Vázquez
  • Patent number: 11049322
    Abstract: A display of an augmented reality (AR)-enabled device, such as a mobile phone, can be used to transfer a graphical object between a secondary display, such as a computer monitor captured by the AR device's camera, and AR space, where the object is visible only through the AR interface of the AR device. A graphical object can be selected through the AR interface and, for example, moved around on a canvas of the secondary display by the user of the AR device. When the AR interface is used to move an enabled object near an edge of the canvas or the physical boundary of the secondary display, the object can be made to disappear from the secondary display and be replaced by a virtual object shown only on the AR interface at a similar location.
    Type: Grant
    Filed: June 18, 2019
    Date of Patent: June 29, 2021
    Inventors: Benjamin Reynolds, Christian Vázquez, Valentin Heun
  • Publication number: 20200379627
    Abstract: A configuration system uses multiple depth cameras to create a volumetric capture space around an electronically controllable industrial machine or system, referred to as a target system. The output of the cameras is processed to create a live 3D model of everything within the space. A remote operator can then navigate within this 3D model, for example from a desktop application, to view the target system from various perspectives in a live 3D telepresence. In addition to the live 3D model, the configuration system generates a 3D user interface for programming and configuring machines or target systems within the space in a spatially coherent way. Local operators can interact with the target system using mobile phones that track the target system in augmented reality. Any number of local operators can interact with a remote operator to simultaneously program and configure the target system.
    Type: Application
    Filed: June 1, 2020
    Publication date: December 3, 2020
    Inventors: Benjamin Reynolds, Valentin Heun, James Keat Hobin, Hisham Bedri
  • Publication number: 20200051337
    Abstract: A display of an augmented reality (AR)-enabled device, such as a mobile phone, can be used to transfer a graphical object between a secondary display, such as a computer monitor captured by the AR device's camera, and AR space, where the object is visible only through the AR interface of the AR device. A graphical object can be selected through the AR interface and, for example, moved around on a canvas of the secondary display by the user of the AR device. When the AR interface is used to move an enabled object near an edge of the canvas or the physical boundary of the secondary display, the object can be made to disappear from the secondary display and be replaced by a virtual object shown only on the AR interface at a similar location.
    Type: Application
    Filed: June 18, 2019
    Publication date: February 13, 2020
    Inventors: Benjamin Reynolds, Christian Vázquez, Valentin Heun