Patents by Inventor Matthias Kalkgruber
Matthias Kalkgruber has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240111156
Abstract: A system for deformation or bending correction in an Augmented Reality (AR) system. Sensors are positioned in a frame of a head-worn AR system to sense forces or pressure exerted on the frame by temple pieces attached to the frame. The sensed forces or pressure are used in conjunction with a model of the frame to determine a corrected model of the frame. The corrected model is used to correct video data captured by the AR system and to correct a video virtual overlay that is provided to a user wearing the head-worn AR system.
Type: Application
Filed: October 4, 2022
Publication date: April 4, 2024
Inventors: Matthias Kalkgruber, Tiago Miguel Pereira Torres, Weston Welge, Ramzi Zahreddine
-
Patent number: 11941184
Abstract: A method for dynamically initializing a 3 degrees of freedom (3DOF) tracking device is described. In one aspect, the method includes accessing a gyroscope signal from a gyroscope of the 3DOF tracking device, accessing an accelerometer signal from an accelerometer of the 3DOF tracking device, determining an initial state that includes a combination of an initial orientation, an initial position, and an initial velocity of the 3DOF tracking device, the initial state indicating a starting condition of the 3DOF tracking device, integrating the gyroscope signal and the accelerometer signal using the initial state to obtain orientation and position signals, and refining an inclination signal of the orientation signal using the position signal.
Type: Grant
Filed: November 11, 2021
Date of Patent: March 26, 2024
Assignee: Snap Inc.
Inventors: Jeroen Diederik Hol, Matthias Kalkgruber
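The initialization described above can be illustrated with a minimal sketch: a stationary accelerometer sample fixes the initial inclination (gravity observes roll and pitch; yaw is unobservable), and the gyroscope signal is then integrated forward from that initial state. Function names and the simple Euler integration are illustrative assumptions, not the patented estimator.

```python
import numpy as np

def initial_inclination(accel_sample):
    """Estimate initial roll/pitch from a stationary accelerometer
    reading: gravity fixes the device's inclination."""
    ax, ay, az = accel_sample
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    return roll, pitch

def integrate_gyro(orientation, gyro_sample, dt):
    """One Euler-integration step of the gyroscope signal (rad/s),
    propagating orientation from the initial state."""
    return orientation + np.asarray(gyro_sample) * dt
```

For a device lying flat (acceleration purely along +z), both roll and pitch come out zero, and subsequent orientation is accumulated one gyroscope sample at a time.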
-
Publication number: 20240085166
Abstract: An eyewear device including a strain gauge sensor to determine when the eyewear device is manipulated by a user, such as being put on, taken off, and interacted with. A processor identifies a signature event based on sensor signals received from the strain gauge sensor and a data table of strain gauge sensor measurements corresponding to signature events. The processor controls the eyewear device as a function of the identified signature event, such as powering on a display of the eyewear device as the eyewear device is being put on a user's head, and then turning off the display when the eyewear device is removed from the user's head.
Type: Application
Filed: September 12, 2022
Publication date: March 14, 2024
Inventors: Jason Heger, Matthias Kalkgruber, Erick Mendez Mendez
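The table-lookup scheme in this abstract can be sketched as follows. The event names, strain ranges, and display policy below are invented for illustration; the patent's actual signature table and matching logic are not public in this listing.

```python
# Hypothetical signature table: (min, max) strain ranges per event.
SIGNATURES = {
    "don": (200, 400),     # putting the eyewear on
    "doff": (-400, -200),  # taking it off
}

def identify_event(strain_reading):
    """Match a strain gauge reading against the signature table."""
    for event, (lo, hi) in SIGNATURES.items():
        if lo <= strain_reading <= hi:
            return event
    return None

def control_display(event, display_on):
    """Power the display as a function of the identified event."""
    if event == "don":
        return True
    if event == "doff":
        return False
    return display_on  # unrecognized readings leave the state unchanged
```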
-
Publication number: 20240031678
Abstract: A method and apparatus for tracking poses of a rolling-shutter camera in an augmented reality (AR) system are provided. The method and apparatus use camera information and inertial sensor readings from an Inertial Measurement Unit (IMU) to estimate the pose of the camera at a reference line. Thereafter, relative pose changes at scanlines may be calculated using the inertial sensor data. The estimated reference pose of the camera is then further refined based on the visual information from the camera, the relative pose changes, and the optimized reference line pose of a previous image. Thereafter, the estimate of the scanline poses may be updated using the relative pose changes obtained in the earlier steps.
Type: Application
Filed: October 2, 2023
Publication date: January 25, 2024
Inventors: Georg Halmetschlager-Funek, Jeroen Hol, Matthias Kalkgruber, Nikolaj Kuntner, Daniel Wolf
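The scanline step can be sketched with a toy model: given a pose at the reference line, the IMU's angular rate propagates a relative pose change to every other scanline, offset by that line's exposure-time delta. This small-angle, constant-rate sketch (pose as an additive rotation vector) stands in for the patent's estimator, which is not specified here.

```python
import numpy as np

def scanline_poses(ref_pose, gyro_rate, line_time, n_lines, ref_line):
    """Propagate a reference-line pose to every scanline of a
    rolling-shutter image using a constant angular rate (small-angle
    model: pose treated as an additive rotation vector)."""
    poses = []
    for line in range(n_lines):
        dt = (line - ref_line) * line_time  # readout offset of this line
        poses.append(ref_pose + np.asarray(gyro_rate) * dt)
    return poses
```

Refinement of the reference pose from visual information would then simply re-run this propagation from the updated reference.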
-
Publication number: 20240029302
Abstract: A method for correcting a bending of a flexible device is described. In one aspect, the method includes accessing feature data of a first stereo frame that is generated by stereo optical sensors of the flexible device, the feature data generated based on a visual-inertial odometry (VIO) system of the flexible device, accessing depth map data of the first stereo frame, the depth map data generated based on a depth map system of the flexible device, estimating a pitch-roll bias and a yaw bias based on the feature data and the depth map data of the first stereo frame, and generating a second stereo frame after the first stereo frame, the second stereo frame based on the pitch-roll bias and the yaw bias of the first stereo frame.
Type: Application
Filed: September 27, 2023
Publication date: January 25, 2024
Inventors: Sagi Katz, Matthias Kalkgruber
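The bias-estimation idea can be reduced to a toy 2D version: compare feature bearings reported by the VIO system against the same bearings from the depth map, take the mean angular discrepancy as the yaw bias, and rotate the next frame's bearings to compensate. This is a simplified stand-in (yaw only, 2D) for the joint pitch-roll/yaw estimation the abstract describes.

```python
import numpy as np

def estimate_yaw_bias(vio_bearings, depth_bearings):
    """Estimate a yaw bias as the mean azimuth difference between
    corresponding 2D feature bearings from the VIO system and the
    depth map system."""
    diffs = [np.arctan2(v[1], v[0]) - np.arctan2(d[1], d[0])
             for v, d in zip(vio_bearings, depth_bearings)]
    return float(np.mean(diffs))

def apply_yaw_correction(bearing, yaw_bias):
    """Rotate a 2D bearing by -yaw_bias to correct a subsequent frame."""
    c, s = np.cos(-yaw_bias), np.sin(-yaw_bias)
    x, y = bearing
    return (c * x - s * y, s * x + c * y)
```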
-
Publication number: 20240004197
Abstract: Visual-inertial tracking of an eyewear device using sensors. The eyewear device monitors the sensors of a visual inertial odometry system (VIOS) that provide input for determining a position of the device within its environment. The eyewear device determines the status of the VIOS based on information from the sensors and adjusts the plurality of sensors (e.g., by turning sensors on or off, changing the sampling rate, or a combination thereof) based on the determined status. The eyewear device then determines the position of the eyewear device within the environment using the adjusted plurality of sensors.
Type: Application
Filed: September 14, 2023
Publication date: January 4, 2024
Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
-
Patent number: 11854227
Abstract: A method for correcting a bending of a flexible device is described. In one aspect, the method includes accessing feature data of a first stereo frame that is generated by stereo optical sensors of the flexible device, the feature data generated based on a visual-inertial odometry (VIO) system of the flexible device, accessing depth map data of the first stereo frame, the depth map data generated based on a depth map system of the flexible device, estimating a pitch-roll bias and a yaw bias based on the feature data and the depth map data of the first stereo frame, and generating a second stereo frame after the first stereo frame, the second stereo frame based on the pitch-roll bias and the yaw bias of the first stereo frame.
Type: Grant
Filed: September 21, 2021
Date of Patent: December 26, 2023
Assignee: Snap Inc.
Inventors: Sagi Katz, Matthias Kalkgruber
-
Publication number: 20230388632
Abstract: A method for limiting motion blur in a visual tracking system is described. In one aspect, the method includes accessing a first image generated by an optical sensor of the visual tracking system, identifying camera operating parameters of the optical sensor for the first image, determining a motion of the optical sensor for the first image, determining a motion blur level of the first image based on the camera operating parameters of the optical sensor and the motion of the optical sensor, and adjusting the camera operating parameters of the optical sensor based on the motion blur level.
Type: Application
Filed: August 14, 2023
Publication date: November 30, 2023
Inventors: Bo Ding, Ozi Egri, Matthias Kalkgruber, Daniel Wolf
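A common way to quantify the blur level described here is the image-space streak length: angular velocity times exposure time, projected through the focal length in pixels. The sketch below uses that approximation and a proportional exposure cut; the blur model, threshold, and adjustment policy are assumptions for illustration, not the patented method.

```python
def motion_blur_pixels(angular_velocity, exposure_s, focal_px):
    """Approximate motion blur as the streak length in pixels:
    rotation (rad/s) during the exposure, scaled by focal length."""
    return angular_velocity * exposure_s * focal_px

def adjust_exposure(exposure_s, blur_px, max_blur_px):
    """Shorten the exposure proportionally when the predicted blur
    exceeds the budget; otherwise leave the parameters unchanged."""
    if blur_px <= max_blur_px:
        return exposure_s
    return exposure_s * max_blur_px / blur_px
```

For example, a 10 ms exposure at 2 rad/s with a 500 px focal length predicts a 10 px streak; halving the exposure brings it within a 5 px budget.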
-
Publication number: 20230360267
Abstract: A method for adjusting camera intrinsic parameters of a multi-camera visual tracking device is described. In one aspect, a method for calibrating the multi-camera visual tracking system includes disabling a first camera of the multi-camera visual tracking system while a second camera of the multi-camera visual tracking system is enabled, detecting a first set of features in a first image generated by the first camera after detecting that the temperature of the first camera is within a threshold of the factory calibration temperature of the first camera, and accessing and correcting intrinsic parameters of the second camera based on the projection of the first set of features into the second image and a second set of features in the second image.
Type: Application
Filed: May 17, 2023
Publication date: November 9, 2023
Inventors: Clemens Birklbauer, Georg Halmetschlager-Funek, Matthias Kalkgruber, Kai Zhou
-
Publication number: 20230350206
Abstract: Devices and methods for dynamic power configuration (e.g., reduction) for thermal management (e.g., mitigation) in a wearable electronic device such as an eyewear device. The wearable electronic device monitors its temperature and, responsive to the temperature, configures the services it provides to operate in different modes for thermal mitigation (e.g., to prevent overheating). For example, based on temperature, the wearable electronic device adjusts sensors (e.g., turns cameras on or off, changes the sampling rate, or a combination thereof) and adjusts display components (e.g., adjusting the rate at which a graphical processing unit generates images and a visual display is updated). This enables the wearable electronic device to consume less power when temperatures are too high in order to provide thermal mitigation.
Type: Application
Filed: July 6, 2023
Publication date: November 2, 2023
Applicant: Snap Inc.
Inventors: Sumant Hanumante, Bernhard Jung, Matthias Kalkgruber, Anton Kondratenko, Edward Lee Kim-Koon, Gerald Nilles, John James Robertson, Dmitry Ryuma, Alexander Sourov, Daniel Wolf
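The temperature-to-mode mapping can be sketched as a simple threshold table. The temperature thresholds, camera counts, and render rates below are invented for illustration; the actual modes and limits are not disclosed in this listing.

```python
def thermal_mode(temp_c):
    """Map device temperature to a power configuration
    (thresholds and mode contents are hypothetical)."""
    if temp_c < 40:
        return {"cameras": 2, "render_hz": 60}  # normal operation
    if temp_c < 50:
        return {"cameras": 1, "render_hz": 30}  # first mitigation step
    return {"cameras": 0, "render_hz": 15}      # aggressive mitigation
```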
-
Patent number: 11789266
Abstract: Visual-inertial tracking of an eyewear device using sensors. The eyewear device monitors the sensors of a visual inertial odometry system (VIOS) that provide input for determining a position of the device within its environment. The eyewear device determines the status of the VIOS based on information from the sensors and adjusts the plurality of sensors (e.g., by turning sensors on or off, changing the sampling rate, or a combination thereof) based on the determined status. The eyewear device then determines the position of the eyewear device within the environment using the adjusted plurality of sensors.
Type: Grant
Filed: December 15, 2020
Date of Patent: October 17, 2023
Assignee: Snap Inc.
Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
-
Patent number: 11792517
Abstract: A method and apparatus for tracking poses of a rolling-shutter camera in an augmented reality (AR) system are provided. The method and apparatus use camera information and inertial sensor readings from an Inertial Measurement Unit (IMU) to estimate the pose of the camera at a reference line. Thereafter, relative pose changes at scanlines may be calculated using the inertial sensor data. The estimated reference pose of the camera is then further refined based on the visual information from the camera, the relative pose changes, and the optimized reference line pose of a previous image. Thereafter, the estimate of the scanline poses may be updated using the relative pose changes obtained in the earlier steps.
Type: Grant
Filed: December 28, 2020
Date of Patent: October 17, 2023
Assignee: Snap Inc.
Inventors: Georg Halmetschlager-Funek, Jeroen Hol, Matthias Kalkgruber, Nikolaj Kuntner, Daniel Wolf
-
Publication number: 20230296902
Abstract: An eyewear device with a flexible frame for Augmented Reality (AR) is disclosed. At least two sensors and a display are mounted on the flexible frame. When in use, the real-time geometry of the eyewear device may deviate from the factory-calibrated geometry, resulting in low-quality AR rendering. A modeling module is provided to model the real-time geometry of the eyewear device on the fly using sensor information from the at least two sensors. The modeled real-time geometry is then provided to a rendering module to accurately display the AR content to the user.
Type: Application
Filed: April 21, 2023
Publication date: September 21, 2023
Inventors: Clemens Birklbauer, Georg Halmetschlager-Funek, Jeroen Hol, Matthias Kalkgruber, Daniel Wagner
-
Publication number: 20230300464
Abstract: A method for mitigating motion blur in a visual-inertial tracking system is described. In one aspect, the method includes accessing a first image generated by an optical sensor of the visual tracking system, accessing a second image generated by the optical sensor of the visual tracking system, the second image following the first image, determining a first motion blur level of the first image, determining a second motion blur level of the second image, identifying a scale change between the first image and the second image, determining a first optimal scale level for the first image based on the first motion blur level and the scale change, and determining a second optimal scale level for the second image based on the second motion blur level and the scale change.
Type: Application
Filed: May 22, 2023
Publication date: September 21, 2023
Inventors: Matthias Kalkgruber, Daniel Wolf
-
Publication number: 20230297164
Abstract: A method for improving the startup time of a six-degrees-of-freedom tracking system is described. An augmented reality system receives a device initialization request and activates a first set of sensors in response to the device initialization request. The augmented reality system receives first tracking data from the first set of sensors. The augmented reality system receives an augmented reality experience request and, in response to the augmented reality experience request, causes display of a set of augmented reality content items based on the first tracking data and simultaneously activates a second set of sensors. The augmented reality system receives second tracking data from the activated second set of sensors. The augmented reality system updates the display of the set of augmented reality content items based on the second tracking data.
Type: Application
Filed: May 24, 2023
Publication date: September 21, 2023
Inventors: Jeroen Diederik Hol, Matthias Kalkgruber, Erick Mendez Mendez, Niall Murphy, Gerald Nilles, Mathieu Emmanuel Vignau
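The two-stage bring-up described above can be sketched as a small state machine: a fast first sensor set serves coarse tracking immediately, while the slower second set is activated in parallel and refines the result once ready. Sensor names, the "3dof"/"6dof" labels, and the class interface are illustrative assumptions.

```python
class ARStartup:
    """Two-stage sensor bring-up for fast AR startup (illustrative)."""

    def __init__(self):
        self.active = set()
        self.tracking = None

    def on_init_request(self):
        self.active |= {"imu"}    # first set: fast-starting sensors
        self.tracking = "3dof"    # coarse tracking from the first set

    def on_experience_request(self):
        # Display content with first-stage data while the slower
        # second set of sensors is activated simultaneously.
        self.active |= {"camera_left", "camera_right"}
        return self.tracking

    def on_second_stage_ready(self):
        self.tracking = "6dof"    # update display with refined tracking
```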
-
Patent number: 11765457
Abstract: A method for limiting motion blur in a visual tracking system is described. In one aspect, the method includes accessing a first image generated by an optical sensor of the visual tracking system, identifying camera operating parameters of the optical sensor for the first image, determining a motion of the optical sensor for the first image, determining a motion blur level of the first image based on the camera operating parameters of the optical sensor and the motion of the optical sensor, and adjusting the camera operating parameters of the optical sensor based on the motion blur level.
Type: Grant
Filed: November 9, 2021
Date of Patent: September 19, 2023
Assignee: Snap Inc.
Inventors: Bo Ding, Ozi Egri, Matthias Kalkgruber, Daniel Wolf
-
Patent number: 11719939
Abstract: Devices and methods for dynamic power configuration (e.g., reduction) for thermal management (e.g., mitigation) in a wearable electronic device such as an eyewear device. The wearable electronic device monitors its temperature and, responsive to the temperature, configures the services it provides to operate in different modes for thermal mitigation (e.g., to prevent overheating). For example, based on temperature, the wearable electronic device adjusts sensors (e.g., turns cameras on or off, changes the sampling rate, or a combination thereof) and adjusts display components (e.g., adjusting the rate at which a graphical processing unit generates images and a visual display is updated). This enables the wearable electronic device to consume less power when temperatures are too high in order to provide thermal mitigation.
Type: Grant
Filed: May 9, 2022
Date of Patent: August 8, 2023
Assignee: Snap Inc.
Inventors: Sumant Hanumante, Bernhard Jung, Matthias Kalkgruber, Anton Kondratenko, Edward Lee Kim-Koon, Gerald Nilles, John James Robertson, Dmitry Ryuma, Alexander Sourov, Daniel Wolf
-
Publication number: 20230205311
Abstract: A method for calibrating a visual-inertial tracking system is described. A device operates the visual-inertial tracking system without receiving a tracking request from a virtual object display application. In response to operating the visual-inertial tracking system, the device accesses sensor data from sensors at the device. The device identifies, based on the sensor data, a first calibration parameter value of the visual-inertial tracking system and stores the first calibration parameter value. The system detects a tracking request from the virtual object display application. In response to the tracking request, the system accesses the first calibration parameter value and determines a second calibration parameter value from the first calibration parameter value.
Type: Application
Filed: March 2, 2023
Publication date: June 29, 2023
Inventors: Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
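The warm-start pattern here can be sketched as a cache: a calibration parameter estimated while the tracker runs idle is stored, then used to seed the value delivered when an application actually requests tracking. The class name, scalar parameter, and additive refinement are illustrative assumptions, not the patented calibration procedure.

```python
class CalibrationCache:
    """Warm-start calibration: a parameter estimated during idle
    operation seeds the value used when tracking is requested."""

    def __init__(self):
        self.stored = None

    def background_update(self, sensor_estimate):
        # First calibration parameter value, identified from sensor
        # data while no tracking request is pending.
        self.stored = sensor_estimate

    def on_tracking_request(self, refinement=0.0):
        # Second calibration parameter value, determined from the first.
        if self.stored is None:
            raise RuntimeError("no prior calibration available")
        return self.stored + refinement
```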
-
Patent number: 11688101
Abstract: A method for adjusting camera intrinsic parameters of a multi-camera visual tracking device is described. In one aspect, a method for calibrating the multi-camera visual tracking system includes disabling a first camera of the multi-camera visual tracking system while a second camera of the multi-camera visual tracking system is enabled, detecting a first set of features in a first image generated by the first camera after detecting that the temperature of the first camera is within a threshold of the factory calibration temperature of the first camera, and accessing and correcting intrinsic parameters of the second camera based on the projection of the first set of features into the second image and a second set of features in the second image.
Type: Grant
Filed: September 23, 2021
Date of Patent: June 27, 2023
Assignee: Snap Inc.
Inventors: Clemens Birklbauer, Georg Halmetschlager-Funek, Matthias Kalkgruber, Kai Zhou
-
Patent number: 11681361
Abstract: A method for improving the startup time of a six-degrees-of-freedom tracking system is described. An augmented reality system receives a device initialization request and activates a first set of sensors in response to the device initialization request. The augmented reality system receives first tracking data from the first set of sensors. The augmented reality system receives an augmented reality experience request and, in response to the augmented reality experience request, causes display of a set of augmented reality content items based on the first tracking data and simultaneously activates a second set of sensors. The augmented reality system receives second tracking data from the activated second set of sensors. The augmented reality system updates the display of the set of augmented reality content items based on the second tracking data.
Type: Grant
Filed: May 12, 2022
Date of Patent: June 20, 2023
Assignee: Snap Inc.
Inventors: Jeroen Diederik Hol, Matthias Kalkgruber, Erick Mendez Mendez, Niall Murphy, Gerald Nilles, Mathieu Emmanuel Vignau