Patents by Inventor Jakob Zillner

Jakob Zillner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12265222
    Abstract: Visual-inertial tracking of an eyewear device using sensors. The eyewear device monitors the sensors of a visual inertial odometry system (VIOS) that provide input for determining a position of the device within its environment. The eyewear device determines the status of the VIOS based on information from the sensors and adjusts the plurality of sensors (e.g., by turning sensors on/off, changing the sampling rate, or a combination thereof) based on the determined status. The eyewear device then determines the position of the eyewear device within the environment using the adjusted plurality of sensors.
    Type: Grant
    Filed: September 14, 2023
    Date of Patent: April 1, 2025
    Assignee: Snap Inc.
    Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
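The sensor-adjustment flow this abstract describes (determine a system status, then enable or disable sensors or change their sampling rates) can be sketched as follows. All names, statuses, and rates below are illustrative assumptions, not taken from the patent:

```python
# Illustrative sketch of status-driven sensor adjustment in a VIO system.
# Statuses ("low_power", "fast_motion") and rates are invented for the example.
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str
    enabled: bool = True
    sample_hz: int = 100

def adjust_sensors(sensors, status):
    """Turn sensors on/off or change sampling rates based on the VIOS status."""
    for s in sensors:
        if status == "low_power":
            # Keep only the IMU running, at a throttled rate.
            s.enabled = (s.name == "imu")
            s.sample_hz = 50 if s.enabled else 0
        elif status == "fast_motion":
            # Fast motion: run every sensor at a high rate.
            s.enabled = True
            s.sample_hz = 200
    return sensors

sensors = [Sensor("imu"), Sensor("left_camera"), Sensor("right_camera")]
adjusted = adjust_sensors(sensors, "low_power")
```

The position estimate would then be computed from whichever sensors remain active, per the abstract's final step.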
  • Publication number: 20250093948
    Abstract: A method for calibrating a visual-inertial tracking system is described. A device operates the visual-inertial tracking system without receiving a tracking request from a virtual object display application. In response to operating the visual-inertial tracking system, the device accesses sensor data from sensors at the device. The device identifies, based on the sensor data, a first calibration parameter value of the visual-inertial tracking system and stores the first calibration parameter value. The system detects a tracking request from the virtual object display application. In response to the tracking request, the system accesses the first calibration parameter value and determines a second calibration parameter value from the first calibration parameter value.
    Type: Application
    Filed: December 5, 2024
    Publication date: March 20, 2025
    Inventors: Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
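The calibration flow in this abstract (calibrate in the background, store a first parameter value, derive a second value when a tracking request arrives) might look like the following sketch. The bias-style parameter and the blending rule are assumptions made for illustration only:

```python
# Sketch of background calibration with cached parameters, loosely following
# the abstract's flow. The averaged "bias" and 0.9/0.1 blend are illustrative.
class CalibrationCache:
    def __init__(self):
        self._stored = None

    def background_calibrate(self, sensor_samples):
        # First calibration parameter value, estimated while no tracking
        # request is active (here: a simple mean of idle sensor samples).
        self._stored = sum(sensor_samples) / len(sensor_samples)

    def on_tracking_request(self, live_sample):
        if self._stored is None:
            raise RuntimeError("no background calibration available")
        # Second parameter value, determined from the stored first value.
        return 0.9 * self._stored + 0.1 * live_sample

cache = CalibrationCache()
cache.background_calibrate([0.10, 0.12, 0.08])
refined = cache.on_tracking_request(0.2)
```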
  • Patent number: 12210672
    Abstract: A method for calibrating a visual-inertial tracking system is described. A device operates the visual-inertial tracking system without receiving a tracking request from a virtual object display application. In response to operating the visual-inertial tracking system, the device accesses sensor data from sensors at the device. The device identifies, based on the sensor data, a first calibration parameter value of the visual-inertial tracking system and stores the first calibration parameter value. The system detects a tracking request from the virtual object display application. In response to the tracking request, the system accesses the first calibration parameter value and determines a second calibration parameter value from the first calibration parameter value.
    Type: Grant
    Filed: March 2, 2023
    Date of Patent: January 28, 2025
    Assignee: Snap Inc.
    Inventors: Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
  • Publication number: 20250008068
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for misaligned vantage point mitigation for computer stereo vision. A misaligned vantage point mitigation system determines whether vantage points of the optical sensors are misaligned from an expected vantage point and, if so, determines an adjustment variable to mitigate the misalignment based on the location of matching features identified in images captured by both optical sensors.
    Type: Application
    Filed: September 12, 2024
    Publication date: January 2, 2025
    Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
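One way to read the mitigation step: in a rectified stereo pair, matched features should land on the same image row, so a consistent vertical offset between matches signals misaligned vantage points. A toy sketch, where the threshold and the 1-D offset model are assumptions:

```python
# Toy misalignment check for a stereo pair using matched feature locations.
# Real rectification handles full rotations; this only models a row offset.
def vertical_misalignment(matches, threshold=1.0):
    """matches: list of ((xl, yl), (xr, yr)) feature pairs from both sensors."""
    offsets = [yl - yr for (_, yl), (_, yr) in matches]
    mean_offset = sum(offsets) / len(offsets)
    misaligned = abs(mean_offset) > threshold
    # Adjustment variable: shift one image to cancel the observed offset.
    return misaligned, (-mean_offset if misaligned else 0.0)

matches = [((10, 52), (8, 50)), ((40, 33), (37, 31)), ((75, 90), (71, 88))]
misaligned, adjustment = vertical_misalignment(matches)
```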
  • Publication number: 20240355079
    Abstract: A system is disclosed, including a processor and a memory. The memory stores instructions that, when executed by the processor, configure the system to perform operations. Surface plane information is obtained, defining a surface plane passing through a surface location and oriented according to a surface normal. An edge is detected in an image. Virtual content is presented, having a virtual position based on an orientation of the edge and the surface plane information.
    Type: Application
    Filed: April 24, 2023
    Publication date: October 24, 2024
    Inventors: Lien Le Hong Tran, Olha Borys, Ilteris Kaan Canberk, Tobias Maier, Jakob Zillner
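A rough sketch of placing virtual content from a surface plane and a detected edge: project the edge direction into the plane and use the result as the content's orientation. The vector math is generic; none of the names come from the patent claims:

```python
# Sketch: orient virtual content along a detected edge, constrained to a
# surface plane defined by a point and a unit normal. Names are illustrative.
import math

def place_on_surface(surface_point, surface_normal, edge_direction):
    # Project the edge direction onto the plane so the content lies flat.
    dot = sum(e * n for e, n in zip(edge_direction, surface_normal))
    in_plane = [e - dot * n for e, n in zip(edge_direction, surface_normal)]
    norm = math.sqrt(sum(c * c for c in in_plane))
    forward = [c / norm for c in in_plane]
    return {"position": surface_point, "forward": forward}

pose = place_on_surface(
    surface_point=(0.0, 1.0, 0.0),
    surface_normal=(0.0, 1.0, 0.0),   # horizontal table top
    edge_direction=(1.0, 1.0, 0.0),   # edge tilted out of the plane
)
```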
  • Patent number: 12126783
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for misaligned vantage point mitigation for computer stereo vision. A misaligned vantage point mitigation system determines whether vantage points of the optical sensors are misaligned from an expected vantage point and, if so, determines an adjustment variable to mitigate the misalignment based on the location of matching features identified in images captured by both optical sensors.
    Type: Grant
    Filed: May 18, 2022
    Date of Patent: October 22, 2024
    Assignee: Snap Inc.
    Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
  • Publication number: 20240177328
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for continuous surface and depth estimation. A continuous surface and depth estimation system determines the depth and surface normal of physical objects by using stereo vision limited within a predetermined window.
    Type: Application
    Filed: February 7, 2024
    Publication date: May 30, 2024
    Inventors: Olha Borys, Ilteris Kaan Canberk, Daniel Wagner, Jakob Zillner
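Limiting stereo vision to a predetermined window could look like this toy sketch, where disparities are converted to depth only for pixels inside the window. The intrinsics, baseline, and disparity values are made up for illustration:

```python
# Toy windowed stereo depth: depth = focal_length * baseline / disparity,
# computed only inside a predetermined pixel window. Values are illustrative.
def windowed_depth(disparities, window, focal_px=500.0, baseline_m=0.06):
    x0, x1 = window
    return {x: focal_px * baseline_m / d
            for x, d in disparities.items()
            if x0 <= x < x1 and d > 0}

# Pixel 200 falls outside the window and is skipped entirely.
depths = windowed_depth({10: 30.0, 60: 15.0, 200: 10.0}, window=(0, 100))
```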
  • Patent number: 11961251
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for continuous surface and depth estimation. A continuous surface and depth estimation system determines the depth and surface normal of physical objects by using stereo vision limited within a predetermined window.
    Type: Grant
    Filed: May 18, 2022
    Date of Patent: April 16, 2024
    Assignee: Snap Inc.
    Inventors: Olha Borys, Ilteris Kaan Canberk, Daniel Wagner, Jakob Zillner
  • Publication number: 20240004197
    Abstract: Visual-inertial tracking of an eyewear device using sensors. The eyewear device monitors the sensors of a visual inertial odometry system (VIOS) that provide input for determining a position of the device within its environment. The eyewear device determines the status of the VIOS based on information from the sensors and adjusts the plurality of sensors (e.g., by turning sensors on/off, changing the sampling rate, or a combination thereof) based on the determined status. The eyewear device then determines the position of the eyewear device within the environment using the adjusted plurality of sensors.
    Type: Application
    Filed: September 14, 2023
    Publication date: January 4, 2024
    Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
  • Publication number: 20230401796
    Abstract: A method for aligning coordinate systems from separate augmented reality (AR) devices is described. In one aspect, the method includes generating predicted depths of a first point cloud by applying a pre-trained model to a first single image generated by a first monocular camera of a first augmented reality (AR) device, and first sparse 3D points generated by a first SLAM system at the first AR device, generating predicted depths of a second point cloud by applying the pre-trained model to a second single image generated by a second monocular camera of the second AR device, and second sparse 3D points generated by a second SLAM system at the second AR device, and determining a relative pose between the first AR device and the second AR device by registering the first point cloud with the second point cloud.
    Type: Application
    Filed: August 23, 2022
    Publication date: December 14, 2023
    Inventors: Georgios Evangelidis, Branislav Micusik, Jakob Zillner, Nathan Jacob Litke
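The final registration step can be illustrated with a deliberately simplified version: known point correspondences and a pure translation between the two devices' frames. A real system would register the predicted-depth point clouds with a full 6-DoF method such as ICP:

```python
# Toy registration between two devices' point clouds, assuming known
# correspondences and translation only. Real alignment estimates rotation too.
def register_translation(cloud_a, cloud_b):
    """Return the offset mapping cloud_b into cloud_a's coordinate frame."""
    n = len(cloud_a)
    return tuple(
        sum(a[i] - b[i] for a, b in zip(cloud_a, cloud_b)) / n
        for i in range(3)
    )

cloud_a = [(1.0, 0.0, 2.0), (3.0, 1.0, 4.0)]
cloud_b = [(0.0, 0.0, 1.0), (2.0, 1.0, 3.0)]  # same points, shifted frame
offset = register_translation(cloud_a, cloud_b)
```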
  • Patent number: 11789266
    Abstract: Visual-inertial tracking of an eyewear device using sensors. The eyewear device monitors the sensors of a visual inertial odometry system (VIOS) that provide input for determining a position of the device within its environment. The eyewear device determines the status of the VIOS based on information from the sensors and adjusts the plurality of sensors (e.g., by turning sensors on/off, changing the sampling rate, or a combination thereof) based on the determined status. The eyewear device then determines the position of the eyewear device within the environment using the adjusted plurality of sensors.
    Type: Grant
    Filed: December 15, 2020
    Date of Patent: October 17, 2023
    Assignee: Snap Inc.
    Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
  • Publication number: 20230205311
    Abstract: A method for calibrating a visual-inertial tracking system is described. A device operates the visual-inertial tracking system without receiving a tracking request from a virtual object display application. In response to operating the visual-inertial tracking system, the device accesses sensor data from sensors at the device. The device identifies, based on the sensor data, a first calibration parameter value of the visual-inertial tracking system and stores the first calibration parameter value. The system detects a tracking request from the virtual object display application. In response to the tracking request, the system accesses the first calibration parameter value and determines a second calibration parameter value from the first calibration parameter value.
    Type: Application
    Filed: March 2, 2023
    Publication date: June 29, 2023
    Inventors: Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
  • Publication number: 20230177708
    Abstract: A depth estimation system to perform operations that include: receiving image data generated by a client device, the image data comprising a depiction of an environment; identifying a set of image features based on the image data; determining a pose of the client device based on the set of image features; generating a depth estimation based on the image data and the pose of the client device; and generating a mesh model of the environment based on the depth estimation.
    Type: Application
    Filed: December 5, 2022
    Publication date: June 8, 2023
    Inventors: Erick Mendez Mendez, Isac Andreas Müller Sandvik, Qi Pan, Edward James Rosten, Andrew Tristan Spek, Daniel Wagner, Jakob Zillner
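One conventional way to realize the abstract's last step (depth estimation to mesh model) is to back-project depth pixels into 3D vertices and connect neighbouring pixels into triangles. The intrinsics below are placeholder values, not anything from the patent:

```python
# Sketch: build a triangle mesh from a small depth map by back-projecting
# each pixel with pinhole intrinsics (placeholder values) and gridding faces.
def depth_to_mesh(depth, fx=1.0, fy=1.0, cx=0.0, cy=0.0):
    h, w = len(depth), len(depth[0])
    verts = [((x - cx) * depth[y][x] / fx,
              (y - cy) * depth[y][x] / fy,
              depth[y][x])
             for y in range(h) for x in range(w)]
    tris = []
    for y in range(h - 1):
        for x in range(w - 1):
            i = y * w + x
            # Two triangles per 2x2 pixel quad.
            tris += [(i, i + 1, i + w), (i + 1, i + w + 1, i + w)]
    return verts, tris

verts, tris = depth_to_mesh([[1.0, 1.0], [1.0, 1.0]])
```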
  • Patent number: 11662805
    Abstract: A method for calibrating a visual-inertial tracking system is described. A device operates the visual-inertial tracking system without receiving a tracking request from a virtual object display application. In response to operating the visual-inertial tracking system, the device accesses sensor data from sensors at the device. The device identifies, based on the sensor data, a first calibration parameter value of the visual-inertial tracking system and stores the first calibration parameter value. The system detects a tracking request from the virtual object display application. In response to the tracking request, the system accesses the first calibration parameter value and determines a second calibration parameter value from the first calibration parameter value.
    Type: Grant
    Filed: April 9, 2021
    Date of Patent: May 30, 2023
    Assignee: Snap Inc.
    Inventors: Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
  • Publication number: 20220377306
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for misaligned vantage point mitigation for computer stereo vision. A misaligned vantage point mitigation system determines whether vantage points of the optical sensors are misaligned from an expected vantage point and, if so, determines an adjustment variable to mitigate the misalignment based on the location of matching features identified in images captured by both optical sensors.
    Type: Application
    Filed: May 18, 2022
    Publication date: November 24, 2022
    Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
  • Publication number: 20220375112
    Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for continuous surface and depth estimation. A continuous surface and depth estimation system determines the depth and surface normal of physical objects by using stereo vision limited within a predetermined window.
    Type: Application
    Filed: May 18, 2022
    Publication date: November 24, 2022
    Inventors: Olha Borys, Ilteris Kaan Canberk, Daniel Wagner, Jakob Zillner
  • Publication number: 20220206565
    Abstract: A method for calibrating a visual-inertial tracking system is described. A device operates the visual-inertial tracking system without receiving a tracking request from a virtual object display application. In response to operating the visual-inertial tracking system, the device accesses sensor data from sensors at the device. The device identifies, based on the sensor data, a first calibration parameter value of the visual-inertial tracking system and stores the first calibration parameter value. The system detects a tracking request from the virtual object display application. In response to the tracking request, the system accesses the first calibration parameter value and determines a second calibration parameter value from the first calibration parameter value.
    Type: Application
    Filed: April 9, 2021
    Publication date: June 30, 2022
    Inventors: Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
  • Patent number: 11082607
    Abstract: Systems and methods for generating composite depth images are disclosed. Exemplary implementations may: capture, by a depth sensor, a set of depth images over a capture period of time; generate, by an inertial sensor, inertial signals that convey values of one or more inertial parameters characterizing motion of the depth sensor during the capture period of time; select a target capture position based on one or more of the capture positions of the set of depth images; generate, using the values of the one or more inertial parameters during the capture period of time, re-projected depth images; and generate a composite depth image by combining multiple depth images, such multiple depth images including a first re-projected depth image and a second re-projected depth image.
    Type: Grant
    Filed: June 11, 2020
    Date of Patent: August 3, 2021
    Assignee: Facebook Technologies, LLC
    Inventors: Luca Ricci, Daniel Wagner, Jeroen Diederik Hol, Jakob Zillner
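The pipeline in this abstract (re-project depth frames to a target capture position using inertial motion estimates, then combine them) can be caricatured in one dimension; a real implementation would apply a full 6-DoF warp per frame rather than a scalar depth correction:

```python
# 1-D caricature of composite depth imaging: shift each depth frame by the
# along-axis displacement inferred from inertial data, then fuse per pixel.
def composite_depth(frames, offsets):
    """frames: per-capture depth rows; offsets: per-frame displacement of the
    capture position relative to the target position (illustrative)."""
    reprojected = [[d + off for d in frame]
                   for frame, off in zip(frames, offsets)]
    n = len(reprojected)
    # Fuse the re-projected frames by per-pixel averaging.
    return [sum(px) / n for px in zip(*reprojected)]

fused = composite_depth(
    frames=[[2.0, 3.0], [2.2, 3.2]],
    offsets=[0.1, -0.1],  # from integrated inertial signals
)
```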
  • Publication number: 20200322542
    Abstract: Systems and methods for generating composite depth images are disclosed. Exemplary implementations may: capture, by a depth sensor, a set of depth images over a capture period of time; generate, by an inertial sensor, inertial signals that convey values of one or more inertial parameters characterizing motion of the depth sensor during the capture period of time; select a target capture position based on one or more of the capture positions of the set of depth images; generate, using the values of the one or more inertial parameters during the capture period of time, re-projected depth images; and generate a composite depth image by combining multiple depth images, such multiple depth images including a first re-projected depth image and a second re-projected depth image.
    Type: Application
    Filed: June 11, 2020
    Publication date: October 8, 2020
    Inventors: Luca Ricci, Daniel Wagner, Jeroen Diederik Hol, Jakob Zillner
  • Patent number: 10686980
    Abstract: Systems and methods for generating composite depth images are disclosed. Exemplary implementations may: capture, by a depth sensor, a set of depth images over a capture period of time; generate, by an inertial sensor, inertial signals that convey values of one or more inertial parameters characterizing motion of the depth sensor during the capture period of time; select a target capture position based on one or more of the capture positions of the set of depth images; generate, using the values of the one or more inertial parameters during the capture period of time, re-projected depth images; and generate a composite depth image by combining multiple depth images, such multiple depth images including a first re-projected depth image and a second re-projected depth image.
    Type: Grant
    Filed: January 22, 2019
    Date of Patent: June 16, 2020
    Assignee: DAQRI, LLC
    Inventors: Luca Ricci, Daniel Wagner, Jeroen Diederik Hol, Jakob Zillner