Patents by Inventor Daniel Wolfe

Daniel Wolfe has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12382188
    Abstract: A hand-tracking input pipeline dimming system for an AR system is provided. The AR system deactivates the hand-tracking input pipeline and places a camera component of the hand-tracking input pipeline in a limited operational mode. The AR system uses the camera component to detect initiation of a gesture by a user of the AR system and, in response to detecting the initiation of the gesture, activates the hand-tracking input pipeline and places the camera component in a fully operational mode.
    Type: Grant
    Filed: September 19, 2022
    Date of Patent: August 5, 2025
    Assignee: SNAP INC.
    Inventors: Jan Bajana, Daniel Colascione, Georgios Evangelidis, Erick Mendez Mendez, Daniel Wolf
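The dimming behavior described in this entry's abstract amounts to a small state machine: idle with the camera in a limited mode, then wake the full pipeline when a gesture begins. Below is a minimal sketch in Python; the class, method, and field names (HandTrackingPipeline, CameraMode, "hand_raised") are hypothetical and not taken from the patent.

```python
from enum import Enum, auto


class CameraMode(Enum):
    LIMITED = auto()  # reduced frame rate/resolution, used only to spot a gesture starting
    FULL = auto()     # fully operational mode, used for full hand tracking


class HandTrackingPipeline:
    """Sketch of the dimming behavior: the pipeline idles with the camera in a
    limited mode and wakes to full operation when a gesture is initiated."""

    def __init__(self):
        self.active = False
        self.camera_mode = CameraMode.LIMITED

    def dim(self):
        # Deactivate hand tracking and keep the camera in a limited operational mode.
        self.active = False
        self.camera_mode = CameraMode.LIMITED

    def on_frame(self, frame):
        if not self.active:
            # Limited mode: run only a cheap gesture-onset detector.
            if self._gesture_initiated(frame):
                self.active = True
                self.camera_mode = CameraMode.FULL
        else:
            self._track_hands(frame)

    def _gesture_initiated(self, frame):
        # Placeholder for a lightweight detector (e.g., a coarse hand-presence check).
        return bool(frame.get("hand_raised", False))

    def _track_hands(self, frame):
        # Placeholder for the full hand-tracking pipeline.
        pass


pipeline = HandTrackingPipeline()
pipeline.dim()
pipeline.on_frame({"hand_raised": False})    # stays dimmed
pipeline.on_frame({"hand_raised": True})     # wakes: pipeline active, camera FULL
print(pipeline.active, pipeline.camera_mode)
```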
  • Patent number: 12356095
    Abstract: Visual-inertial tracking of an eyewear device using one or more rolling shutter cameras. The device includes a position determining system. Visual-inertial tracking is implemented by sensing motion of the device. An initial pose is obtained for a rolling shutter camera and an image of an environment is captured. The image includes feature points captured at a particular capture time. A number of poses for the rolling shutter camera is computed based on the initial pose and sensed movement of the device. The number of computed poses is responsive to the sensed movement of the mobile device. A computed pose is selected for each feature point in the image by matching the particular capture time for the feature point to the particular computed time for the computed pose. The position of the mobile device is determined within the environment using the feature points and the selected computed poses for the feature points.
    Type: Grant
    Filed: May 29, 2024
    Date of Patent: July 8, 2025
    Assignee: Snap Inc.
    Inventors: Matthias Kalkgruber, Erick Mendez Mendez, Daniel Wagner, Daniel Wolf, Kai Zhou
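The key step in this entry's abstract is assigning each feature point the camera pose whose timestamp matches that point's row-dependent capture time. The snippet below is a minimal nearest-timestamp sketch (a real system might interpolate poses instead); all names and values are illustrative.

```python
import numpy as np


def select_poses_for_features(feature_times, pose_times, poses):
    """For each feature point's capture time, pick the computed pose whose
    timestamp is closest. Nearest-neighbour matching is an assumption here;
    interpolating between poses is an equally plausible choice."""
    pose_times = np.asarray(pose_times)
    indices = [int(np.argmin(np.abs(pose_times - t))) for t in feature_times]
    return [poses[i] for i in indices]


# Poses computed across one rolling-shutter readout; more poses would be
# computed when the IMU senses larger movement.
pose_times = [0.000, 0.005, 0.010, 0.015]                      # seconds within the frame
poses = ["pose_t0", "pose_t5ms", "pose_t10ms", "pose_t15ms"]   # placeholders for 6-DoF poses

feature_times = [0.001, 0.012]   # per-feature capture times (depend on the image row)
print(select_poses_for_features(feature_times, pose_times, poses))
# ['pose_t0', 'pose_t10ms']
```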
  • Publication number: 20250216679
    Abstract: Visual-inertial tracking of an eyewear device using sensors. The eyewear device monitors the sensors of a visual inertial odometry system (VIOS) that provide input for determining a position of the device within its environment. The eyewear device determines the status of the VIOS based on information from the sensors and adjusts the plurality of sensors (e.g., by turning on/off sensors, changing the sampling rate, or a combination thereof) based on the determined status. The eyewear device then determines the position of the eyewear device within the environment using the adjusted plurality of sensors.
    Type: Application
    Filed: March 17, 2025
    Publication date: July 3, 2025
    Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
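The sensor adjustment described in this entry's abstract can be pictured as a lookup from the determined tracking status to a sensor configuration. The sketch below is illustrative only; the status names, sampling rates, and frame rates are assumptions, not values from the patent.

```python
from dataclasses import dataclass


@dataclass
class SensorConfig:
    camera_on: bool
    imu_rate_hz: int
    camera_fps: int


def adjust_sensors(tracking_status: str) -> SensorConfig:
    """Map the determined VIOS status to a sensor configuration by turning
    sensors on/off and changing sampling rates."""
    if tracking_status == "stationary":
        # Device is not moving: switch the camera off and sample the IMU slowly.
        return SensorConfig(camera_on=False, imu_rate_hz=50, camera_fps=0)
    if tracking_status == "slow_motion":
        # Moderate motion: keep the camera on at a reduced frame rate.
        return SensorConfig(camera_on=True, imu_rate_hz=200, camera_fps=15)
    # Fast motion or degraded tracking: run everything at full rate.
    return SensorConfig(camera_on=True, imu_rate_hz=800, camera_fps=60)


print(adjust_sensors("stationary"))
print(adjust_sensors("fast_motion"))
```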
  • Publication number: 20250199622
    Abstract: A method for aligning coordinate systems of user devices in an augmented reality system using somatic points of a user's hand as alignment markers. Images captured from multiple user devices are used to align the reference coordinate systems of the user devices to a common reference coordinate system. In some examples, user devices capture images of a hand of a user and use object recognition to identify somatic points as alignment markers. The somatic points of a user device are translated to a common reference coordinate system determined by another user device.
    Type: Application
    Filed: March 4, 2025
    Publication date: June 19, 2025
    Inventors: Georgios Evangelidis, Bernhard Jung, Ilteris Kaan Canberk, Daniel Wolf, Balázs Tóth, Márton Gergely Kajtár, Branislav Micusik
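Once the same somatic points (e.g., knuckles and fingertips) are identified in two devices' coordinate systems, aligning those systems is a standard rigid-registration problem. The sketch below uses the classic Kabsch/Procrustes fit as an illustration; the patent does not necessarily claim this particular estimator, and the point values are made up.

```python
import numpy as np


def rigid_align(points_a, points_b):
    """Estimate rotation R and translation t such that R @ a + t ≈ b for
    corresponding 3-D points (e.g., somatic points of one hand seen by two
    devices). Standard Kabsch/Procrustes fit."""
    A = np.asarray(points_a, dtype=float)
    B = np.asarray(points_b, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                 # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t


# The same four somatic points in device A's frame and in device B's frame
# (device B's frame is rotated 90 degrees about z and shifted by (1, 2, 3)).
pts_a = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]]
pts_b = [[1, 2, 3], [1, 3, 3], [0, 2, 3], [1, 2, 4]]
R, t = rigid_align(pts_a, pts_b)
print(np.round(R @ np.array([1, 0, 0]) + t, 3))   # ≈ [1. 3. 3.], i.e. pts_b[1]
```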
  • Publication number: 20250190050
    Abstract: Bending data is used to facilitate tracking operations of an extended reality (XR) device, such as hand tracking or other object tracking operations. The XR device obtains bending data indicative of bending of the XR device to accommodate a body part of a user wearing the XR device. The XR device determines, based on the bending data, whether to use previously identified biometric data in a tracking operation. A mode of the XR device is selected based on this determination. The XR device performs the tracking operation based on the selected mode. The selected mode may be a first mode in which the previously identified biometric data is used in the tracking operation or a second mode which does not apply previously identified biometric data in the tracking operation.
    Type: Application
    Filed: February 20, 2025
    Publication date: June 12, 2025
    Inventors: Thomas Faeulhammer, Matthias Kalkgruber, Thomas Muttenthaler, Tiago Miguel Pereira Torres, Daniel Wolf
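The mode selection in this entry's abstract reduces to comparing the current bend of the device against the bend recorded when the biometric data was captured. The sketch below is a toy illustration; the millimetre units, tolerance, and mode labels are assumptions, not details from the patent.

```python
def select_tracking_mode(bend_mm: float, calibrated_bend_mm: float,
                         tolerance_mm: float = 1.5) -> str:
    """Choose the tracking mode from the measured frame bend: if it matches the
    bend recorded with the previously identified biometric data (e.g., a
    per-user hand model), reuse that data; otherwise track without it."""
    if abs(bend_mm - calibrated_bend_mm) <= tolerance_mm:
        return "mode_1_reuse_biometric_data"
    return "mode_2_track_without_biometric_data"


print(select_tracking_mode(bend_mm=4.1, calibrated_bend_mm=4.0))   # reuse stored data
print(select_tracking_mode(bend_mm=9.0, calibrated_bend_mm=4.0))   # fresh start
```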
  • Publication number: 20250182302
    Abstract: A method for carving a 3D space using hands tracking is described. In one aspect, a method includes accessing a first frame from a camera of a display device, tracking, using a hand tracking algorithm operating at the display device, hand pixels corresponding to one or more user hands depicted in the first frame, detecting, using a sensor of the display device, depths of the hand pixels, identifying a 3D region based on the depths of the hand pixels, and applying a 3D reconstruction engine to the 3D region.
    Type: Application
    Filed: February 13, 2025
    Publication date: June 5, 2025
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wolf
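The carving step in this entry's abstract uses the depths of the hand pixels to bound the 3-D region handed to the reconstruction engine. The sketch below returns a simple near/far depth slab; the margin and the return format are assumptions for illustration.

```python
import numpy as np


def carve_region_from_hands(hand_mask, depth_map, margin_m=0.15):
    """Derive a 3-D region of interest from the depths of tracked hand pixels:
    take the depth range the hands cover, pad it by a margin, and reconstruct
    only within that slab."""
    hand_depths = depth_map[hand_mask]             # depths of hand pixels only
    if hand_depths.size == 0:
        return None                                # no hands in this frame
    near = float(hand_depths.min()) - margin_m
    far = float(hand_depths.max()) + margin_m
    return {"near_m": max(near, 0.0), "far_m": far}


# Toy 4x4 frame: a depth map in metres and a mask of pixels labelled as "hand".
depth = np.full((4, 4), 2.0)
depth[1:3, 1:3] = 0.6                              # the hand is about 0.6 m away
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
print(carve_region_from_hands(mask, depth))        # {'near_m': 0.45, 'far_m': 0.75}
```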
  • Patent number: 12287164
    Abstract: A modular handgun system comprises an elongated universal trigger frame having a pair of slide rails and an accessory rail. The trigger frame is adapted to have a trigger assembly mounted thereto. The system further comprises a grip frame having an elongated channel and a hand grip extending downwardly from the elongated channel. The trigger frame is removably mounted in the elongated channel of the grip frame. The accessory rail of the trigger frame is positioned forward of a forward end of the elongated channel of the grip frame so as to be exposed. The system further comprises a slide and barrel assembly slidably mounted on the slide rails of the trigger frame.
    Type: Grant
    Filed: December 8, 2022
    Date of Patent: April 29, 2025
    Assignee: ZEV Technologies, LLC
    Inventors: Alec Daniel Wolf, Gene Anthony Velasquez
  • Patent number: 12271517
    Abstract: Bending data is used to facilitate tracking operations of an extended reality (XR) device, such as hand tracking or other object tracking operations. The XR device obtains bending data indicative of bending of the XR device to accommodate a body part of a user wearing the XR device. The XR device determines, based on the bending data, whether to use previously identified biometric data in a tracking operation. A mode of the XR device is selected responsive to determining whether to use the previously identified biometric data. The selected mode is used to initialize the tracking operation. The selected mode may be a first mode in which the previously identified biometric data is used in the tracking operation or a second mode in which the previously identified biometric data is not used in the tracking operation.
    Type: Grant
    Filed: September 29, 2023
    Date of Patent: April 8, 2025
    Assignee: Snap Inc.
    Inventors: Thomas Faeulhammer, Matthias Kalkgruber, Thomas Muttenthaler, Tiago Miguel Pereira Torres, Daniel Wolf
  • Publication number: 20250110547
    Abstract: Bending data is used to facilitate tracking operations of an extended reality (XR) device, such as hand tracking or other object tracking operations. The XR device obtains bending data indicative of bending of the XR device to accommodate a body part of a user wearing the XR device. The XR device determines, based on the bending data, whether to use previously identified biometric data in a tracking operation. A mode of the XR device is selected responsive to determining whether to use the previously identified biometric data. The selected mode is used to initialize the tracking operation. The selected mode may be a first mode in which the previously identified biometric data is used in the tracking operation or a second mode in which the previously identified biometric data is not used in the tracking operation.
    Type: Application
    Filed: September 29, 2023
    Publication date: April 3, 2025
    Inventors: Thomas Faeulhammer, Matthias Kalkgruber, Thomas Muttenthaler, Tiago Miguel Pereira Torres, Daniel Wolf
  • Patent number: 12265222
    Abstract: Visual-inertial tracking of an eyewear device using sensors. The eyewear device monitors the sensors of a visual inertial odometry system (VIOS) that provide input for determining a position of the device within its environment. The eyewear device determines the status of the VIOS based on information from the sensors and adjusts the plurality of sensors (e.g., by turning on/off sensors, changing the sampling rate, or a combination thereof) based on the determined status. The eyewear device then determines the position of the eyewear device within the environment using the adjusted plurality of sensors.
    Type: Grant
    Filed: September 14, 2023
    Date of Patent: April 1, 2025
    Assignee: Snap Inc.
    Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
  • Patent number: 12265664
    Abstract: A method for aligning coordinate systems of user devices in an augmented reality system using somatic points of a user's hand as alignment markers. Images captured from multiple user devices are used to align the reference coordinate systems of the user devices to a common reference coordinate system. In some examples, user devices capture images of a hand of a user and use object recognition to identify somatic points as alignment markers. The somatic points of a user device are translated to a common reference coordinate system determined by another user device.
    Type: Grant
    Filed: June 26, 2023
    Date of Patent: April 1, 2025
    Assignee: Snap Inc.
    Inventors: Georgios Evangelidis, Bernhard Jung, Ilteris Kaan Canberk, Daniel Wolf, Balázs Tóth, Márton Gergely Kajtár, Branislav Micusik
  • Patent number: 12260567
    Abstract: A method for carving a 3D space using hands tracking is described. In one aspect, a method includes accessing a first frame from a camera of a display device, tracking, using a hand tracking algorithm operating at the display device, hand pixels corresponding to one or more user hands depicted in the first frame, detecting, using a sensor of the display device, depths of the hand pixels, identifying a 3D region based on the depths of the hand pixels, and applying a 3D reconstruction engine to the 3D region.
    Type: Grant
    Filed: October 25, 2022
    Date of Patent: March 25, 2025
    Assignee: Snap Inc.
    Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wolf
  • Publication number: 20250093948
    Abstract: A method for calibrating a visual-inertial tracking system is described. A device operates the visual-inertial tracking system without receiving a tracking request from a virtual object display application. In response to operating the visual-inertial tracking system, the device accesses sensor data from sensors at the device. The device identifies, based on the sensor data, a first calibration parameter value of the visual-inertial tracking system and stores the first calibration parameter value. The system detects a tracking request from the virtual object display application. In response to the tracking request, the system accesses the first calibration parameter value and determines a second calibration parameter value from the first calibration parameter value.
    Type: Application
    Filed: December 5, 2024
    Publication date: March 20, 2025
    Inventors: Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
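The calibration flow in this entry's abstract is essentially a warm start: estimate a calibration parameter while no application is requesting tracking, store it, and derive the value actually used once a request arrives. The sketch below is a toy version; the class name, the blend factor, and the example gyro-bias value are assumptions.

```python
class VioCalibrationCache:
    """Background calibration sketch: the tracker runs without a tracking
    request, stores the first calibration parameter value it converges to,
    and later derives a second value from the stored one."""

    def __init__(self):
        self.stored_value = None

    def background_update(self, sensor_estimate: float) -> None:
        # Called while the system runs without a request from the display app.
        self.stored_value = sensor_estimate

    def on_tracking_request(self, default_value: float, blend: float = 0.8) -> float:
        # Derive the second calibration parameter value from the stored first one.
        if self.stored_value is None:
            return default_value
        return blend * self.stored_value + (1.0 - blend) * default_value


cache = VioCalibrationCache()
cache.background_update(0.012)           # e.g., a gyro bias estimated in the background
print(cache.on_tracking_request(0.0))    # 0.0096: warm start instead of a cold default
```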
  • Publication number: 20250071422
    Abstract: A method for mitigating motion blur in a visual-inertial tracking system is described. In one aspect, the method includes accessing a first image generated by an optical sensor of the visual tracking system, accessing a second image generated by the optical sensor of the visual tracking system, the second image following the first image, determining a first motion blur level of the first image, determining a second motion blur level of the second image, identifying a scale change between the first image and the second image, determining a first optimal scale level for the first image based on the first motion blur level and the scale change, and determining a second optimal scale level for the second image based on the second motion blur level and the scale change.
    Type: Application
    Filed: November 14, 2024
    Publication date: February 27, 2025
    Inventors: Matthias Kalkgruber, Daniel Wolf
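The selection of an "optimal scale level" in this entry's abstract can be illustrated as choosing a pyramid level from the measured blur and the scale change between consecutive frames. The thresholds and scoring below are assumptions for illustration, not values from the patent.

```python
import math


def optimal_scale_level(blur_px: float, scale_change: float, max_level: int = 3) -> int:
    """Pick an image-pyramid level for tracking: heavier motion blur pushes the
    tracker to a coarser (more downscaled) level, and a large scale change
    between the two images adds one more level."""
    level = 0
    if blur_px > 2.0:
        level += 1
    if blur_px > 5.0:
        level += 1
    if abs(math.log2(scale_change)) > 0.5:   # scene scale changed by more than ~sqrt(2)
        level += 1
    return min(level, max_level)


print(optimal_scale_level(blur_px=1.0, scale_change=1.0))   # 0: sharp image, stable scale
print(optimal_scale_level(blur_px=6.0, scale_change=1.6))   # 3: blurry image while zooming
```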
  • Publication number: 20250068228
    Abstract: Examples describe a method performed by an extended reality (XR) device that implements a multi-camera object tracking system. The XR device accesses object tracking data associated with an object in a real-world environment. Based on the object tracking data, the XR device activates a low-power mode of the multi-camera object tracking system. In the low-power mode, a state of the object in the real-world environment is determined by using the multi-camera object tracking system.
    Type: Application
    Filed: September 29, 2023
    Publication date: February 27, 2025
    Inventors: Evangelos Chatzikalymnios, Thomas Faeulhammer, Daniel Wolf, Kai Zhou
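The low-power mode in this entry's abstract can be pictured as a policy that, based on the current object tracking data, decides how many cameras need to stay active to keep the object's state updated. The sketch below is illustrative; the confidence threshold and camera names are assumptions.

```python
def choose_camera_set(tracking_confidence: float, object_is_static: bool,
                      all_cameras=("left", "right", "center")):
    """Decide which cameras keep running: with high-confidence tracking of a
    static object, one camera suffices (low-power mode); otherwise the full
    multi-camera rig is used."""
    if tracking_confidence > 0.9 and object_is_static:
        return ("center",)   # low-power mode: a single camera maintains the object state
    return all_cameras       # full mode: all cameras contribute


print(choose_camera_set(0.95, object_is_static=True))    # ('center',)
print(choose_camera_set(0.60, object_is_static=False))   # ('left', 'right', 'center')
```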
  • Publication number: 20250054176
    Abstract: A device includes a processor, image sensors, and memory storing instructions to obtain images from the sensors and process a first image to identify coordinates of a bounding box around an object. The device processes the area within the first box to determine 2-D positions of landmarks associated with the object, derives first 3-D positions of the landmarks, and determines coordinates of a second box bounding the object in the second image using the 3-D landmark positions. The device processes the area within the second box to determine 2-D positions of landmarks and uses triangulation to derive second 3-D positions of the landmarks. Overall, the device obtains images, detects objects and landmarks, determines 2-D and 3-D positions of landmarks, and triangulates 3-D positions.
    Type: Application
    Filed: September 27, 2023
    Publication date: February 13, 2025
    Inventors: Evangelos Chatzikalymnios, Thomas Faeulhammer, Ihor Tymchyshyn, Daniel Wolf
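The 2-D-to-3-D step in this entry's abstract, deriving 3-D landmark positions from their 2-D detections in two images, is commonly done by linear triangulation. The sketch below shows the standard direct linear transform (DLT) on a toy two-camera rig; it illustrates the general technique and is not necessarily the claimed method.

```python
import numpy as np


def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one landmark observed at pixel uv1 by a
    camera with projection matrix P1 and at uv2 by a camera with P2."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                      # homogeneous -> Euclidean coordinates


# Two toy cameras with identity intrinsics; the second camera sits 0.1 m to the
# right of the first (a rough stand-in for a stereo pair on an XR headset).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

X_true = np.array([0.05, 0.02, 0.5])          # a landmark 0.5 m in front of the rig
uv1 = (X_true[0] / X_true[2], X_true[1] / X_true[2])            # ideal projection, camera 1
uv2 = ((X_true[0] - 0.1) / X_true[2], X_true[1] / X_true[2])    # ideal projection, camera 2
print(np.round(triangulate(P1, P2, uv1, uv2), 4))               # ≈ [0.05 0.02 0.5]
```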
  • Publication number: 20250037249
    Abstract: A method for mitigating motion blur in a visual tracking system is described. In one aspect, a method for selective motion blur mitigation in a visual tracking system includes accessing a first image generated by an optical sensor of the visual tracking system, identifying camera operating parameters of the optical sensor during the optical sensor generating the first image, determining a motion of the optical sensor during the optical sensor generating the first image, determining a motion blur level of the first image based on the camera operating parameters of the optical sensor and the motion of the optical sensor, and determining whether to downscale the first image using a pyramid computation algorithm based on the motion blur level.
    Type: Application
    Filed: October 15, 2024
    Publication date: January 30, 2025
    Inventors: Olha Borys, Matthias Kalkgruber, Daniel Wolf
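The blur estimate in this entry's abstract combines the camera operating parameters (here, exposure time and focal length) with the sensed motion. A common small-angle approximation is blur ≈ focal_length × angular_velocity × exposure_time, used below purely as an illustration; the downscaling threshold is likewise an assumption.

```python
def estimate_blur_px(exposure_s: float, angular_velocity_rad_s: float,
                     focal_length_px: float) -> float:
    """Approximate motion blur in pixels as the image-space displacement
    accumulated during the exposure: blur ≈ f * omega * t_exp."""
    return focal_length_px * angular_velocity_rad_s * exposure_s


def should_downscale(blur_px: float, threshold_px: float = 2.0) -> bool:
    # Downscale via the pyramid computation only when blur exceeds the threshold.
    return blur_px > threshold_px


blur = estimate_blur_px(exposure_s=0.008, angular_velocity_rad_s=1.2,
                        focal_length_px=450.0)
print(round(blur, 2), should_downscale(blur))   # 4.32 True
```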
  • Patent number: 12210672
    Abstract: A method for calibrating a visual-inertial tracking system is described. A device operates the visual-inertial tracking system without receiving a tracking request from a virtual object display application. In response to operating the visual-inertial tracking system, the device accesses sensor data from sensors at the device. The device identifies, based on the sensor data, a first calibration parameter value of the visual-inertial tracking system and stores the first calibration parameter value. The system detects a tracking request from the virtual object display application. In response to the tracking request, the system accesses the first calibration parameter value and determines a second calibration parameter value from the first calibration parameter value.
    Type: Grant
    Filed: March 2, 2023
    Date of Patent: January 28, 2025
    Assignee: Snap Inc.
    Inventors: Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
  • Patent number: 12207848
    Abstract: A cervical plate assembly is disclosed. The cervical plate assembly includes a base plate including: (1) at least two bone screw seats, each bone screw seat including a borehole dimensioned to receive a bone screw, and (2) a first blocking seat positioned between the at least two bone screw seats. The cervical plate assembly includes a blocking mechanism retained within the first blocking seat of the base plate. The blocking mechanism is selectively positionable between a closed position in which the blocking mechanism obstructs at least one bone screw seat to retain a bone screw with the base plate, and an open position in which the bone screw seats are unobstructed.
    Type: Grant
    Filed: February 18, 2022
    Date of Patent: January 28, 2025
    Assignee: Globus Medical, Inc.
    Inventors: Daniel Wolfe, Jason Zappacosta
  • Patent number: D1082546
    Type: Grant
    Filed: December 9, 2024
    Date of Patent: July 8, 2025
    Assignee: International Lifeline Fund
    Inventor: Daniel Wolf