Patents by Inventor Daniel Wolfe
Daniel Wolfe has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12287164
Abstract: A modular handgun system comprises an elongated universal trigger frame having a pair of slide rails and an accessory rail. The trigger frame is adapted to have a trigger assembly mounted thereto. The system further comprises a grip frame having an elongated channel and a hand grip extending downwardly from the elongated channel. The trigger frame is removably mounted in the elongated channel of the grip frame. The accessory rail of the trigger frame is positioned forward of a forward end of the elongated channel of the grip frame so as to be exposed. The system further comprises a slide and barrel assembly slidably mounted on the slide rails of the trigger frame.
Type: Grant
Filed: December 8, 2022
Date of Patent: April 29, 2025
Assignee: ZEV Technologies, LLC
Inventors: Alec Daniel Wolf, Gene Anthony Velasquez
-
Patent number: 12271517
Abstract: Bending data is used to facilitate tracking operations of an extended reality (XR) device, such as hand tracking or other object tracking operations. The XR device obtains bending data indicative of bending of the XR device to accommodate a body part of a user wearing the XR device. The XR device determines, based on the bending data, whether to use previously identified biometric data in a tracking operation. A mode of the XR device is selected responsive to determining whether to use the previously identified biometric data. The selected mode is used to initialize the tracking operation. The selected mode may be a first mode in which the previously identified biometric data is used in the tracking operation or a second mode in which the previously identified biometric data is not used in the tracking operation.
Type: Grant
Filed: September 29, 2023
Date of Patent: April 8, 2025
Assignee: Snap Inc.
Inventors: Thomas Faeulhammer, Matthias Kalkgruber, Thomas Muttenthaler, Tiago Miguel Pereira Torres, Daniel Wolf
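As an illustration of the mode selection this abstract describes, the sketch below compares current bending data against the bending measured when the stored biometric data was captured and picks between the two modes. The class names, fields, and tolerance are assumptions made for the example, not details taken from the patent.

```python
# Hypothetical sketch of the mode-selection step: decide whether previously
# identified biometric data can seed a tracking session, based on how much the
# device frame has bent since that data was captured. All names and thresholds
# are illustrative assumptions.

from dataclasses import dataclass
from enum import Enum, auto


class TrackingMode(Enum):
    REUSE_BIOMETRICS = auto()   # first mode: stored biometric data is used
    FRESH_ACQUISITION = auto()  # second mode: biometric data is re-estimated


@dataclass
class BendingData:
    hinge_angle_deg: float      # e.g. temple-arm flex measured by a strain gauge
    bridge_flex_deg: float


def select_tracking_mode(current: BendingData,
                         at_capture: BendingData,
                         tolerance_deg: float = 1.5) -> TrackingMode:
    """Reuse stored biometrics only if the frame geometry is close to what it
    was when those biometrics were captured; otherwise fall back to a fresh fit."""
    drift = max(abs(current.hinge_angle_deg - at_capture.hinge_angle_deg),
                abs(current.bridge_flex_deg - at_capture.bridge_flex_deg))
    if drift <= tolerance_deg:
        return TrackingMode.REUSE_BIOMETRICS
    return TrackingMode.FRESH_ACQUISITION


if __name__ == "__main__":
    stored = BendingData(hinge_angle_deg=92.0, bridge_flex_deg=1.0)
    now = BendingData(hinge_angle_deg=94.5, bridge_flex_deg=1.2)
    print(select_tracking_mode(now, stored))   # drift of 2.5 deg -> FRESH_ACQUISITION
```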
-
Publication number: 20250110547
Abstract: Bending data is used to facilitate tracking operations of an extended reality (XR) device, such as hand tracking or other object tracking operations. The XR device obtains bending data indicative of bending of the XR device to accommodate a body part of a user wearing the XR device. The XR device determines, based on the bending data, whether to use previously identified biometric data in a tracking operation. A mode of the XR device is selected responsive to determining whether to use the previously identified biometric data. The selected mode is used to initialize the tracking operation. The selected mode may be a first mode in which the previously identified biometric data is used in the tracking operation or a second mode in which the previously identified biometric data is not used in the tracking operation.
Type: Application
Filed: September 29, 2023
Publication date: April 3, 2025
Inventors: Thomas Faeulhammer, Matthias Kalkgruber, Thomas Muttenthaler, Tiago Miguel Pereira Torres, Daniel Wolf
-
Patent number: 12265222
Abstract: Visual-inertial tracking of an eyewear device using sensors. The eyewear device monitors the sensors of a visual inertial odometry system (VIOS) that provide input for determining a position of the device within its environment. The eyewear device determines the status of the VIOS based on information from the sensors and adjusts the plurality of sensors (e.g., by turning sensors on or off, changing the sampling rate, or a combination thereof) based on the determined status. The eyewear device then determines the position of the eyewear device within the environment using the adjusted plurality of sensors.
Type: Grant
Filed: September 14, 2023
Date of Patent: April 1, 2025
Assignee: Snap Inc.
Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
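The sketch below illustrates the status-driven adjustment the abstract describes: a simple health check over the VIOS inputs drives which sensors stay on and at what rate. The status heuristic, sensor names, and rates are assumptions for illustration, not the patented implementation.

```python
# Illustrative only: adjust a set of visual-inertial odometry sensors based on
# a status assessment derived from sensor information.

from dataclasses import dataclass


@dataclass
class Sensor:
    name: str
    enabled: bool
    rate_hz: float


def assess_status(tracked_features: int, imu_noise: float) -> str:
    """Toy status heuristic returning 'good', 'degraded', or 'lost'."""
    if tracked_features < 10:
        return "lost"
    if tracked_features < 40 or imu_noise > 0.05:
        return "degraded"
    return "good"


def adjust_sensors(sensors: list[Sensor], status: str) -> list[Sensor]:
    """Turn sensors on/off or change sampling rates depending on tracker status."""
    adjusted = []
    for s in sensors:
        if status == "good":
            # Save power: drop auxiliary cameras to a lower frame rate.
            rate = s.rate_hz * 0.5 if s.name.startswith("camera_aux") else s.rate_hz
            adjusted.append(Sensor(s.name, True, rate))
        elif status == "degraded":
            # Bring every sensor up to full rate to recover tracking quality.
            adjusted.append(Sensor(s.name, True, max(s.rate_hz, 60.0)))
        else:  # "lost": keep only the IMU running while the tracker re-initialises
            adjusted.append(Sensor(s.name, s.name == "imu", s.rate_hz))
    return adjusted


if __name__ == "__main__":
    vios = [Sensor("camera_main", True, 30.0),
            Sensor("camera_aux_left", True, 30.0),
            Sensor("imu", True, 200.0)]
    print(adjust_sensors(vios, assess_status(tracked_features=25, imu_noise=0.02)))
```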
-
Patent number: 12265664
Abstract: A method for aligning coordinate systems of user devices in an augmented reality system using somatic points of a user's hand as alignment markers. Images captured from multiple user devices are used to align the reference coordinate systems of the user devices to a common reference coordinate system. In some examples, user devices capture images of a hand of a user and use object recognition to identify somatic points as alignment markers. The somatic points of a user device are translated to a common reference coordinate system determined by another user device.
Type: Grant
Filed: June 26, 2023
Date of Patent: April 1, 2025
Assignee: Snap Inc.
Inventors: Georgios Evangelidis, Bernhard Jung, Ilteris Kaan Canberk, Daniel Wolf, Balázs Tóth, Márton Gergely Kajtár, Branislav Micusik
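A minimal sketch of the alignment idea follows: given the same somatic points of a hand (e.g., fingertips and knuckles) expressed in two devices' coordinate systems, estimate the rigid transform that maps one system onto the other. The standard Kabsch/Procrustes solution is used here; the abstract does not state which solver the patented method relies on.

```python
import numpy as np


def align_coordinate_systems(points_a: np.ndarray, points_b: np.ndarray):
    """Return rotation R and translation t with R @ a + t ~= b for
    corresponding somatic points (each array is N x 3)."""
    centroid_a = points_a.mean(axis=0)
    centroid_b = points_b.mean(axis=0)
    a = points_a - centroid_a
    b = points_b - centroid_b
    u, _, vt = np.linalg.svd(a.T @ b)
    d = np.sign(np.linalg.det(vt.T @ u.T))            # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = centroid_b - r @ centroid_a
    return r, t


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hand_in_device_a = rng.normal(size=(21, 3))       # 21 hand landmarks, device A frame
    true_r, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(true_r) < 0:                     # keep it a proper rotation
        true_r[:, 0] *= -1
    hand_in_device_b = hand_in_device_a @ true_r.T + np.array([0.1, -0.3, 0.05])
    r, t = align_coordinate_systems(hand_in_device_a, hand_in_device_b)
    print(np.allclose(hand_in_device_a @ r.T + t, hand_in_device_b))
```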
-
Patent number: 12260567
Abstract: A method for carving a 3D space using hand tracking is described. In one aspect, a method includes accessing a first frame from a camera of a display device, tracking, using a hand tracking algorithm operating at the display device, hand pixels corresponding to one or more user hands depicted in the first frame, detecting, using a sensor of the display device, depths of the hand pixels, identifying a 3D region based on the depths of the hand pixels, and applying a 3D reconstruction engine to the 3D region.
Type: Grant
Filed: October 25, 2022
Date of Patent: March 25, 2025
Assignee: Snap Inc.
Inventors: Branislav Micusik, Georgios Evangelidis, Daniel Wolf
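The following rough sketch walks through the carving steps named in the abstract: take the pixels a hand tracker labels as hand, look up their depths, back-project them, and grow a 3D box around the resulting points so that a reconstruction engine only has to process that region. The function names, pinhole back-projection, and bounding margin are illustrative assumptions.

```python
import numpy as np


def carve_region_from_hands(hand_mask: np.ndarray,
                            depth_map: np.ndarray,
                            intrinsics: np.ndarray,
                            margin_m: float = 0.10):
    """Return (min_corner, max_corner) of an axis-aligned 3D box, in camera
    coordinates, around the back-projected hand pixels."""
    v, u = np.nonzero(hand_mask)                 # pixel rows/cols labelled as hand
    z = depth_map[v, u]
    valid = z > 0
    u, v, z = u[valid], v[valid], z[valid]
    fx, fy = intrinsics[0, 0], intrinsics[1, 1]
    cx, cy = intrinsics[0, 2], intrinsics[1, 2]
    x = (u - cx) * z / fx                        # pinhole back-projection
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=1)
    return points.min(axis=0) - margin_m, points.max(axis=0) + margin_m


if __name__ == "__main__":
    mask = np.zeros((480, 640), dtype=bool)
    mask[200:260, 300:360] = True                # pretend hand pixels
    depth = np.full((480, 640), 0.6)             # 0.6 m in front of the camera
    k = np.array([[500.0, 0, 320.0], [0, 500.0, 240.0], [0, 0, 1.0]])
    print(carve_region_from_hands(mask, depth, k))
```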
-
Publication number: 20250093948
Abstract: A method for calibrating a visual-inertial tracking system is described. A device operates the visual-inertial tracking system without receiving a tracking request from a virtual object display application. In response to operating the visual-inertial tracking system, the device accesses sensor data from sensors at the device. The device identifies, based on the sensor data, a first calibration parameter value of the visual-inertial tracking system and stores the first calibration parameter value. The system detects a tracking request from the virtual object display application. In response to the tracking request, the system accesses the first calibration parameter value and determines a second calibration parameter value from the first calibration parameter value.
Type: Application
Filed: December 5, 2024
Publication date: March 20, 2025
Inventors: Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
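A hedged sketch of the described flow: keep the visual-inertial tracker calibrating in the background so a first parameter estimate already exists when an application finally asks for tracking, then derive a refined second value from it. Class names, fields, and the refinement step are assumptions made for the example.

```python
import random
from dataclasses import dataclass


@dataclass
class CalibrationParams:
    gyro_bias: float
    cam_imu_time_offset_ms: float


class VisualInertialTracker:
    def __init__(self):
        self._stored = None   # first calibration parameter value, if any

    def background_calibrate(self, gyro_samples: list[float]) -> None:
        """Runs before any tracking request: estimate and store a first value."""
        self._stored = CalibrationParams(
            gyro_bias=sum(gyro_samples) / len(gyro_samples),
            cam_imu_time_offset_ms=2.0,
        )

    def on_tracking_request(self) -> CalibrationParams:
        """An app asked for tracking: start from the stored value and refine it."""
        if self._stored is None:
            return CalibrationParams(gyro_bias=0.0, cam_imu_time_offset_ms=0.0)
        return CalibrationParams(
            gyro_bias=0.9 * self._stored.gyro_bias,     # stand-in refinement step
            cam_imu_time_offset_ms=self._stored.cam_imu_time_offset_ms,
        )


if __name__ == "__main__":
    tracker = VisualInertialTracker()
    tracker.background_calibrate([random.gauss(0.01, 0.002) for _ in range(200)])
    print(tracker.on_tracking_request())
```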
-
Publication number: 20250068228
Abstract: Examples describe a method performed by an extended reality (XR) device that implements a multi-camera object tracking system. The XR device accesses object tracking data associated with an object in a real-world environment. Based on the object tracking data, the XR device activates a low-power mode of the multi-camera object tracking system. In the low-power mode, a state of the object in the real-world environment is determined by using the multi-camera object tracking system.
Type: Application
Filed: September 29, 2023
Publication date: February 27, 2025
Inventors: Evangelos Chatzikalymnios, Thomas Faeulhammer, Daniel Wolf, Kai Zhou
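An illustrative sketch of the decision this abstract implies: use recent tracking data (confidence and object motion) to decide when the multi-camera tracker can drop to a single low-rate camera while still estimating the object's state. The thresholds and camera-rate policy are assumptions, not details from the application.

```python
from dataclasses import dataclass


@dataclass
class ObjectTrack:
    confidence: float       # 0..1 from the tracker
    speed_m_s: float        # how fast the tracked object is moving


def should_enter_low_power(track: ObjectTrack) -> bool:
    """A slow-moving, confidently tracked object does not need every camera."""
    return track.confidence > 0.8 and track.speed_m_s < 0.1


def configure_cameras(track: ObjectTrack, camera_ids: list[str]) -> dict[str, float]:
    """Return a per-camera frame rate: full rate normally, one camera at a
    reduced rate in low-power mode."""
    if should_enter_low_power(track):
        return {cam: (10.0 if i == 0 else 0.0) for i, cam in enumerate(camera_ids)}
    return {cam: 60.0 for cam in camera_ids}


if __name__ == "__main__":
    print(configure_cameras(ObjectTrack(confidence=0.93, speed_m_s=0.02),
                            ["cam_left", "cam_right", "cam_center"]))
```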
-
Publication number: 20250071422
Abstract: A method for mitigating motion blur in a visual-inertial tracking system is described. In one aspect, the method includes accessing a first image generated by an optical sensor of the visual tracking system, accessing a second image generated by the optical sensor of the visual tracking system, the second image following the first image, determining a first motion blur level of the first image, determining a second motion blur level of the second image, identifying a scale change between the first image and the second image, determining a first optimal scale level for the first image based on the first motion blur level and the scale change, and determining a second optimal scale level for the second image based on the second motion blur level and the scale change.
Type: Application
Filed: November 14, 2024
Publication date: February 27, 2025
Inventors: Matthias Kalkgruber, Daniel Wolf
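The sketch below follows the structure of the abstract: estimate a motion blur level per image, account for the scale change between consecutive images, and pick a pyramid level ("optimal scale") per image so that blurred or shrunken content is processed at a coarser scale. The gradient-based blur proxy and the level-selection rule are simple stand-ins, not the patented method.

```python
import numpy as np


def motion_blur_level(image: np.ndarray) -> float:
    """Crude blur proxy: inverse of the mean gradient magnitude (higher = blurrier)."""
    gy, gx = np.gradient(image.astype(float))
    return 1.0 / (float(np.mean(np.hypot(gx, gy))) + 1e-6)


def optimal_scale_level(blur: float, scale_change: float,
                        blur_thresholds=(0.02, 0.05, 0.10)) -> int:
    """Pick a pyramid level: more blur, or a zoom-out between frames, means coarser."""
    level = sum(blur > t for t in blur_thresholds)
    if scale_change < 0.9:                       # the content shrank between frames
        level += 1
    return min(level, len(blur_thresholds))


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    sharp = rng.integers(0, 255, size=(120, 160)).astype(float)
    blurred = (sharp + np.roll(sharp, 5, axis=1)) / 2.0   # crude horizontal smear
    scale_change = 0.85                                   # e.g. estimated from feature tracks
    for frame in (sharp, blurred):
        blur = motion_blur_level(frame)
        print(round(blur, 4), optimal_scale_level(blur, scale_change))
```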
-
Publication number: 20250054176
Abstract: A device includes a processor, image sensors, and memory storing instructions to obtain images from the sensors and process a first image to identify coordinates of a first bounding box around an object. The device processes the area within the first box to determine 2-D positions of landmarks associated with the object, derives first 3-D positions of the landmarks, and determines coordinates of a second box bounding the object in a second image using the 3-D landmark positions. The device processes the area within the second box to determine 2-D positions of landmarks and uses triangulation to derive second 3-D positions of the landmarks. Overall, the device obtains images, detects objects and landmarks, determines 2-D and 3-D positions of landmarks, and triangulates 3-D positions.
Type: Application
Filed: September 27, 2023
Publication date: February 13, 2025
Inventors: Evangelos Chatzikalymnios, Thomas Faeulhammer, Ihor Tymchyshyn, Daniel Wolf
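A compact sketch of the last step named in the abstract follows: triangulating a landmark's 3-D position from its 2-D positions in two camera views. The linear (DLT) triangulation below is a standard textbook method; detection of the bounding boxes and 2-D landmarks is assumed to have happened upstream, and the camera matrices are invented for the example.

```python
import numpy as np


def triangulate(p1: np.ndarray, p2: np.ndarray,
                uv1: np.ndarray, uv2: np.ndarray) -> np.ndarray:
    """Linear (DLT) triangulation of one landmark from two 3x4 projection
    matrices and its pixel coordinates in the corresponding images."""
    a = np.stack([
        uv1[0] * p1[2] - p1[0],
        uv1[1] * p1[2] - p1[1],
        uv2[0] * p2[2] - p2[0],
        uv2[1] * p2[2] - p2[1],
    ])
    _, _, vt = np.linalg.svd(a)
    x = vt[-1]
    return x[:3] / x[3]


def project(p: np.ndarray, xyz: np.ndarray) -> np.ndarray:
    uvw = p @ np.append(xyz, 1.0)
    return uvw[:2] / uvw[2]


if __name__ == "__main__":
    k = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
    p_left = k @ np.hstack([np.eye(3), np.zeros((3, 1))])
    p_right = k @ np.hstack([np.eye(3), np.array([[-0.06], [0.0], [0.0]])])  # ~6 cm baseline
    landmark = np.array([0.05, -0.02, 0.5])            # ground-truth 3-D position
    print(triangulate(p_left, p_right,
                      project(p_left, landmark), project(p_right, landmark)))
```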
-
Publication number: 20250037249
Abstract: A method for mitigating motion blur in a visual tracking system is described. In one aspect, a method for selective motion blur mitigation in a visual tracking system includes accessing a first image generated by an optical sensor of the visual tracking system, identifying camera operating parameters of the optical sensor during the optical sensor generating the first image, determining a motion of the optical sensor during the optical sensor generating the first image, determining a motion blur level of the first image based on the camera operating parameters of the optical sensor and the motion of the optical sensor, and determining whether to downscale the first image using a pyramid computation algorithm based on the motion blur level.
Type: Application
Filed: October 15, 2024
Publication date: January 30, 2025
Inventors: Olha Borys, Matthias Kalkgruber, Daniel Wolf
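The sketch below, under simplifying assumptions, shows the decision the abstract describes: predict blur from the camera's exposure time and the measured motion, and only pay for pyramid downscaling when the predicted blur is large. The blur-in-pixels model and the threshold are illustrative.

```python
import math
from dataclasses import dataclass


@dataclass
class CameraParams:
    exposure_s: float        # shutter time for this frame
    focal_px: float          # focal length expressed in pixels


def predicted_blur_px(cam: CameraParams, angular_speed_rad_s: float) -> float:
    """Approximate smear length: focal length times the angle swept during exposure."""
    return cam.focal_px * math.tan(angular_speed_rad_s * cam.exposure_s)


def should_downscale(cam: CameraParams, angular_speed_rad_s: float,
                     blur_threshold_px: float = 2.0) -> bool:
    """Downscale (coarser pyramid level) only when the frame is noticeably smeared."""
    return predicted_blur_px(cam, angular_speed_rad_s) > blur_threshold_px


if __name__ == "__main__":
    cam = CameraParams(exposure_s=0.008, focal_px=500.0)
    for gyro in (0.2, 2.5):                      # slow vs. fast rotation, rad/s
        print(gyro, should_downscale(cam, gyro))
```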
-
Patent number: 12210672
Abstract: A method for calibrating a visual-inertial tracking system is described. A device operates the visual-inertial tracking system without receiving a tracking request from a virtual object display application. In response to operating the visual-inertial tracking system, the device accesses sensor data from sensors at the device. The device identifies, based on the sensor data, a first calibration parameter value of the visual-inertial tracking system and stores the first calibration parameter value. The system detects a tracking request from the virtual object display application. In response to the tracking request, the system accesses the first calibration parameter value and determines a second calibration parameter value from the first calibration parameter value.
Type: Grant
Filed: March 2, 2023
Date of Patent: January 28, 2025
Assignee: Snap Inc.
Inventors: Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
-
Patent number: 12207848
Abstract: A cervical plate assembly is disclosed. The cervical plate assembly includes a base plate including: (1) at least two bone screw seats, each bone screw seat including a borehole dimensioned to receive a bone screw, and (2) a first blocking seat positioned between the at least two bone screw seats. The cervical plate assembly includes a blocking mechanism retained within the first blocking seat of the base plate. The blocking mechanism is selectively positionable between a closed position in which the blocking mechanism obstructs at least one bone screw seat to retain a bone screw with the base plate, and an open position in which the bone screw seats are unobstructed.
Type: Grant
Filed: February 18, 2022
Date of Patent: January 28, 2025
Assignee: Globus Medical, Inc.
Inventors: Daniel Wolfe, Jason Zappacosta
-
Publication number: 20250022162
Abstract: Examples disclosed herein relate to the use of shared pose data in extended reality (XR) tracking. A communication link is established between a first XR device and a second XR device. The second XR device is worn by a user. The first XR device receives pose data of the second XR device via the communication link and captures an image of the user. The user is identified based on the image and the pose data.
Type: Application
Filed: August 15, 2023
Publication date: January 16, 2025
Inventors: Brian Fulkerson, Thomas Muttenthaler, Georgios Papandreou, Daniel Wolf
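A loose sketch of one way the image and the shared pose could be combined: project the other device's reported position into this device's camera image and check which detected person lies closest to that projection. The projection helper, detection format, and distance gate are assumptions for illustration only.

```python
import numpy as np


def project_point(k: np.ndarray, point_cam: np.ndarray) -> np.ndarray:
    """Project a 3-D point in this device's camera frame to pixel coordinates."""
    uvw = k @ point_cam
    return uvw[:2] / uvw[2]


def identify_user(k: np.ndarray,
                  shared_pose_in_cam: np.ndarray,
                  detected_people_px: dict[str, np.ndarray],
                  max_px: float = 40.0):
    """Return the detection whose image position best matches the shared pose,
    or None if nothing is close enough."""
    target = project_point(k, shared_pose_in_cam)
    best_id, best_dist = None, float("inf")
    for person_id, px in detected_people_px.items():
        dist = float(np.linalg.norm(px - target))
        if dist < best_dist:
            best_id, best_dist = person_id, dist
    return best_id if best_dist <= max_px else None


if __name__ == "__main__":
    k = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
    pose = np.array([0.2, 0.0, 2.0])             # other headset: 2 m ahead, 20 cm right
    detections = {"person_a": np.array([368.0, 242.0]),
                  "person_b": np.array([120.0, 250.0])}
    print(identify_user(k, pose, detections))    # expected: person_a
```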
-
Patent number: 12192625
Abstract: A method for mitigating motion blur in a visual-inertial tracking system is described. In one aspect, the method includes accessing a first image generated by an optical sensor of the visual tracking system, accessing a second image generated by the optical sensor of the visual tracking system, the second image following the first image, determining a first motion blur level of the first image, determining a second motion blur level of the second image, identifying a scale change between the first image and the second image, determining a first optimal scale level for the first image based on the first motion blur level and the scale change, and determining a second optimal scale level for the second image based on the second motion blur level and the scale change.
Type: Grant
Filed: May 22, 2023
Date of Patent: January 7, 2025
Assignee: Snap Inc.
Inventors: Matthias Kalkgruber, Daniel Wolf
-
Publication number: 20250008068
Abstract: Disclosed are systems, methods, and non-transitory computer-readable media for misaligned vantage point mitigation for computer stereo vision. A misaligned vantage point mitigation system determines whether vantage points of the optical sensors are misaligned from an expected vantage point and, if so, determines an adjustment variable to mitigate the misalignment based on the location of matching features identified in images captured by both optical sensors.
Type: Application
Filed: September 12, 2024
Publication date: January 2, 2025
Inventors: Olha Borys, Georg Halmetschlager-Funek, Matthias Kalkgruber, Daniel Wolf, Jakob Zillner
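A rough sketch of the mitigation idea: if features matched between the left and right images show a consistent vertical offset, which an ideally rectified stereo pair should not have, estimate that offset and use it as the adjustment variable. Reducing the adjustment to a single vertical scalar is an assumption made to keep the example small.

```python
import numpy as np


def vertical_misalignment(matches_left: np.ndarray,
                          matches_right: np.ndarray) -> float:
    """Median vertical disparity of matched feature pairs (N x 2 pixel arrays).
    Close to zero for a well-aligned pair; otherwise the residual to correct."""
    dy = matches_right[:, 1] - matches_left[:, 1]
    return float(np.median(dy))


def mitigate(matches_left: np.ndarray, matches_right: np.ndarray,
             tolerance_px: float = 0.5) -> float:
    offset = vertical_misalignment(matches_left, matches_right)
    if abs(offset) <= tolerance_px:
        return 0.0                                # vantage points still as expected
    return -offset                                # vertical shift to apply to the right image


if __name__ == "__main__":
    rng = np.random.default_rng(2)
    left = rng.uniform(0, 480, size=(50, 2))
    right = left + np.array([12.0, 3.0])          # horizontal disparity plus a 3 px droop
    print(mitigate(left, right))                  # about -3.0
```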
-
Publication number: 20240427156
Abstract: Devices and methods for dynamic power configuration (e.g., reduction) for thermal management (e.g., mitigation) in a wearable electronic device such as an eyewear device. The wearable electronic device monitors its temperature and, responsive to the temperature, configures the services it provides to operate in different modes for thermal mitigation (e.g., to prevent overheating). For example, based on temperature, the wearable electronic device adjusts sensors (e.g., turns cameras on or off, changes the sampling rate, or a combination thereof) and adjusts display components (e.g., adjusting the rate at which a graphics processing unit generates images and a visual display is updated). This enables the wearable electronic device to consume less power when temperatures are too high in order to provide thermal mitigation.
Type: Application
Filed: September 6, 2024
Publication date: December 26, 2024
Inventors: Sumant Hanumante, Bernhard Jung, Matthias Kalkgruber, Anton Kondratenko, Edward Lee Kim-Koon, Gerald Nilles, John James Robertson, Dmitry Ryuma, Alexander Sourov, Daniel Wolf
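A hedged sketch of the temperature-driven reconfiguration the abstract describes: map the current temperature to a mitigation tier and derive camera frame rates and display/render rates from that tier. The tier boundaries and rates below are invented for the example.

```python
from dataclasses import dataclass


@dataclass
class ServiceConfig:
    camera_fps: float
    display_fps: float
    aux_cameras_on: bool


def thermal_config(temperature_c: float) -> ServiceConfig:
    if temperature_c < 40.0:
        return ServiceConfig(camera_fps=60.0, display_fps=60.0, aux_cameras_on=True)
    if temperature_c < 47.0:
        # Mild mitigation: halve sensor and render rates.
        return ServiceConfig(camera_fps=30.0, display_fps=30.0, aux_cameras_on=True)
    # Aggressive mitigation: minimum rates, auxiliary cameras off.
    return ServiceConfig(camera_fps=15.0, display_fps=20.0, aux_cameras_on=False)


if __name__ == "__main__":
    for t in (35.0, 43.0, 50.0):
        print(t, thermal_config(t))
```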
-
Patent number: 12148128
Abstract: A method for mitigating motion blur in a visual tracking system is described. In one aspect, a method for selective motion blur mitigation in a visual tracking system includes accessing a first image generated by an optical sensor of the visual tracking system, identifying camera operating parameters of the optical sensor during the optical sensor generating the first image, determining a motion of the optical sensor during the optical sensor generating the first image, determining a motion blur level of the first image based on the camera operating parameters of the optical sensor and the motion of the optical sensor, and determining whether to downscale the first image using a pyramid computation algorithm based on the motion blur level.
Type: Grant
Filed: November 8, 2021
Date of Patent: November 19, 2024
Assignee: Snap Inc.
Inventors: Olha Borys, Matthias Kalkgruber, Daniel Wolf
-
Patent number: 12133137
Abstract: Systems and methods of providing a user interface in which map features associated with places are selectively highlighted are disclosed herein. In some example embodiments, a computer system receives a request for a transportation service associated with a place, retrieves an entrance geographic location for the place from a database, with the entrance geographic location being stored in association with the place in the database and representing an entrance for accessing the place, generates route information based on the retrieved entrance geographic location, with the route information indicating a route from an origin geographic location of a computing device of a user to the entrance geographic location of the place, and causes the generated route information to be displayed within a user interface on a computing device of the user.
Type: Grant
Filed: September 1, 2023
Date of Patent: October 29, 2024
Assignee: Uber Technologies, Inc.
Inventors: Satyendra Kumar Nainwal, Daniel Wolf, Kaivalya Bachubhai Parikh, Ankit Tandon
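A small sketch of the flow in the abstract: look up the stored entrance location for a place, then build route information from the rider's origin to that entrance. The in-memory "database", the haversine-distance routing stub, and the field names are all assumptions for illustration, not the production system.

```python
import math
from dataclasses import dataclass


@dataclass
class LatLng:
    lat: float
    lng: float


ENTRANCE_DB = {                      # place_id -> entrance geolocation stored with the place
    "airport_terminal_b": LatLng(37.6213, -122.3790),
}


def haversine_km(a: LatLng, b: LatLng) -> float:
    phi1, phi2 = math.radians(a.lat), math.radians(b.lat)
    dphi = phi2 - phi1
    dlmb = math.radians(b.lng - a.lng)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(h))


def route_to_entrance(place_id: str, origin: LatLng) -> dict:
    entrance = ENTRANCE_DB[place_id]             # retrieve the stored entrance location
    return {
        "origin": origin,
        "destination": entrance,
        "distance_km": round(haversine_km(origin, entrance), 2),
    }


if __name__ == "__main__":
    print(route_to_entrance("airport_terminal_b", LatLng(37.7749, -122.4194)))
```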
-
Publication number: 20240350179
Abstract: A cervical plate assembly is disclosed. The cervical plate assembly includes a base plate including: (1) at least two bone screw seats, each bone screw seat including a borehole dimensioned to receive a bone screw, and (2) a first blocking seat positioned between the at least two bone screw seats. The cervical plate assembly includes a blocking mechanism retained within the first blocking seat of the base plate. The blocking mechanism is selectively positionable between a closed position in which the blocking mechanism obstructs at least one bone screw seat to retain a bone screw with the base plate, and an open position in which the bone screw seats are unobstructed.
Type: Application
Filed: July 2, 2024
Publication date: October 24, 2024
Inventors: Daniel Wolfe, Jason Zappacosta