Patents by Inventor David Samuel HOLZ
David Samuel HOLZ has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240045509
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality (AR) or virtual reality (VR) environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking the motion of one or more real objects within view of a wearable sensor system. Switching the AR/VR presentation on or off, so that users can interact with the real world surrounding them, for example to drink some soda, can be accomplished with a convenient mode-switching gesture associated with switching between operational modes in a VR/AR-enabled device.
Type: Application
Filed: September 28, 2023
Publication date: February 8, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventor: David Samuel HOLZ
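The mode-switching behavior this abstract describes can be sketched as a small state machine that toggles the presentation only when a designated gesture is recognized. The gesture name and mode labels below are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class ModeSwitcher:
    """Toggles between the VR/AR presentation and real-world passthrough
    when a designated mode-switching gesture is recognized."""
    mode: str = "vr"                       # current operational mode
    switch_gesture: str = "double_pinch"   # hypothetical gesture name
    _modes: tuple = ("vr", "passthrough")

    def on_gesture(self, gesture: str) -> str:
        # Only the designated gesture changes the operational mode;
        # any other gesture leaves the current mode in place.
        if gesture == self.switch_gesture:
            i = self._modes.index(self.mode)
            self.mode = self._modes[(i + 1) % len(self._modes)]
        return self.mode

sw = ModeSwitcher()
sw.on_gesture("swipe")         # unrelated gesture: mode unchanged
sw.on_gesture("double_pinch")  # mode gesture: VR -> passthrough
print(sw.mode)
```

A real device would recognize the gesture from tracked hand motion; here the recognizer's output is assumed and passed in as a string.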
-
Publication number: 20230419460
Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to a screen and is viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while it is simultaneously observing the calibration image on the screen. One or more cameras are located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Application
Filed: September 13, 2023
Publication date: December 28, 2023
Inventors: Johnathon Scott SELSTAD, David Samuel HOLZ
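The search this abstract describes can be illustrated with a one-dimensional toy: exhaustively trying projection offsets until the re-projected inverse pattern cancels the calibration pattern observed by the camera. The 64-sample "image" and the exhaustive search are illustrative assumptions, not the patented implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
calibration = rng.random(64)                  # pattern shown on the screen
true_offset = 5                               # unknown projection misalignment
inverse = -np.roll(calibration, true_offset)  # inverse image as the camera sees it

def residual(offset: int) -> float:
    # What the camera observes: the calibration pattern plus the inverse
    # pattern re-projected at a candidate offset. Perfect cancellation
    # drives the total observed energy to zero.
    observed = calibration + np.roll(inverse, -offset)
    return float(np.abs(observed).sum())

# Search through candidate projection positions for best cancellation.
best = min(range(64), key=residual)
print(best)  # recovers the misalignment: 5
```

A real system searches a 2-D warp (the distortion mapping transform) rather than a scalar shift, but the stopping criterion is the same: the observed residual falls below an acceptable threshold.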
-
Patent number: 11854308
Abstract: The technology disclosed initializes a new hand that enters the field of view of a gesture recognition system using a parallax detection module. The parallax detection module determines candidate regions of interest (ROI) for a given input hand image and computes depth, rotation, and position information for the candidate ROI. Then, for each candidate ROI, an ImagePatch, which includes the hand, is extracted from the original input hand image to minimize processing of low-information pixels. Further, a hand classifier neural network is used to determine which ImagePatch most resembles a hand. For the qualified, most hand-like ImagePatch, a 3D virtual hand is initialized with depth, rotation, and position matching that of the qualified ImagePatch.
Type: Grant
Filed: February 14, 2017
Date of Patent: December 26, 2023
Assignee: ULTRAHAPTICS IP TWO LIMITED
Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
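The candidate-ROI selection step can be sketched as extracting a patch per ROI and keeping the most hand-like one. The `hand_score` stub below stands in for the hand-classifier neural network, and all dimensions and ROIs are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
frame = rng.random((120, 160))  # stand-in for the input hand image

# Candidate regions of interest as (row, col, height, width), as a
# parallax-style detector might propose them (values illustrative).
candidates = [(10, 20, 40, 40), (60, 90, 40, 40), (30, 50, 40, 40)]

def hand_score(patch: np.ndarray) -> float:
    # Stand-in for the hand-classifier network: any function mapping a
    # patch to a "hand-likeness" score. Here, simply mean intensity.
    return float(patch.mean())

def best_image_patch(frame, rois):
    # Extract an ImagePatch per ROI (skipping the low-information pixels
    # outside each ROI) and keep the patch scored most hand-like.
    patches = [frame[r:r + h, c:c + w] for r, c, h, w in rois]
    scores = [hand_score(p) for p in patches]
    i = int(np.argmax(scores))
    return rois[i], patches[i]

roi, patch = best_image_patch(frame, candidates)
print(roi, patch.shape)
```

The winning ROI's depth, rotation, and position would then seed the 3D virtual hand's initialization.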
-
Patent number: 11841920
Abstract: The technology disclosed introduces two types of neural networks: "master" or "generalist" networks and "expert" or "specialist" networks. Both master networks and expert networks are fully connected neural networks that take a feature vector of an input hand image and produce a prediction of the hand pose. Master networks and expert networks differ from each other based on the data on which they are trained. In particular, master networks are trained on the entire dataset. In contrast, expert networks are trained only on a subset of the entire dataset. With regard to hand poses, master networks are trained on the input image data representing all available hand poses comprising the training data (including both real and simulated hand images).
Type: Grant
Filed: February 14, 2017
Date of Patent: December 12, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
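The master/expert training split can be illustrated with simple least-squares models standing in for the fully connected networks: the master is fit on the entire dataset, each expert only on its pose subset. The data, pose labels, and routing rule are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy data: feature vectors of hand images, grouped into 3 pose subsets.
X = rng.random((300, 8))
pose = rng.integers(0, 3, size=300)   # pose-subset label per sample
y = X @ rng.random(8) + pose          # toy "hand pose" target

A = np.c_[X, np.ones(len(X))]         # features plus intercept column

# Master model: fit on the ENTIRE dataset.
master_w, *_ = np.linalg.lstsq(A, y, rcond=None)

# Expert models: each fit ONLY on its pose subset.
experts = {}
for p in range(3):
    m = pose == p
    experts[p], *_ = np.linalg.lstsq(A[m], y[m], rcond=None)

def predict(x, p):
    # Route the sample to the expert for its (known) pose subset.
    return float(np.append(x, 1.0) @ experts[p])
```

On this toy data each expert fits its own subset essentially exactly, while the single master model must average across subsets; that gap is the motivation for the specialist networks.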
-
Patent number: 11798141
Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to an external screen and is viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while it is simultaneously observing the calibration image on the external screen. One or more cameras are located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the external screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Grant
Filed: May 10, 2022
Date of Patent: October 24, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Johnathon Scott Selstad, David Samuel Holz
-
Patent number: 11782513
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality (AR) or virtual reality (VR) environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking the motion of one or more real objects within view of a wearable sensor system. Switching the AR/VR presentation on or off, so that users can interact with the real world surrounding them, for example to drink some soda, can be accomplished with a convenient mode-switching gesture associated with switching between operational modes in a VR/AR-enabled device.
Type: Grant
Filed: June 11, 2021
Date of Patent: October 10, 2023
Assignee: Ultrahaptics IP Two Limited
Inventor: David Samuel Holz
-
Patent number: 11714880
Abstract: The technology disclosed performs hand pose estimation on a so-called "joint-by-joint" basis. So, when a plurality of estimates for the 28 hand joints are received from a plurality of expert networks (and from master experts in some high-confidence scenarios), the estimates are analyzed at a joint level and a final location for each joint is calculated based on the plurality of estimates for a particular joint. This is a novel solution discovered by the technology disclosed because nothing in the field of art determines hand pose estimates at such granularity and precision. Regarding granularity and precision, because hand pose estimates are computed on a joint-by-joint basis, this allows the technology disclosed to detect in real time even the minutest and most subtle hand movements, such as a bend/yaw/tilt/roll of a segment of a finger or a tilt of an occluded finger, as demonstrated supra in the Experimental Results section of this application.
Type: Grant
Filed: July 10, 2019
Date of Patent: August 1, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
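The joint-by-joint combination can be sketched as computing each of the 28 joint locations independently from the per-joint estimates of several experts. The confidence-weighted average below is an assumption for illustration, not necessarily the patent's exact combination rule:

```python
import numpy as np

rng = np.random.default_rng(3)
N_JOINTS = 28
true_pose = rng.random((N_JOINTS, 3))       # ground-truth 3D joint positions

# Five expert estimates of the full hand pose, each independently noisy,
# with a per-joint confidence for every expert.
estimates = true_pose + 0.01 * rng.standard_normal((5, N_JOINTS, 3))
confidences = rng.random((5, N_JOINTS))

def fuse_joint_by_joint(estimates, confidences):
    # The final location is computed independently for EACH joint as a
    # confidence-weighted average over that joint's expert estimates.
    w = confidences / confidences.sum(axis=0, keepdims=True)  # (5, 28)
    return (w[:, :, None] * estimates).sum(axis=0)            # (28, 3)

fused = fuse_joint_by_joint(estimates, confidences)
print(fused.shape)  # (28, 3)
```

Because fusion is per joint, an expert that is unreliable on one finger segment can still contribute strongly to joints it estimates well.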
-
Publication number: 20230214458
Abstract: The technology disclosed performs hand pose estimation on a so-called "joint-by-joint" basis. So, when a plurality of estimates for the 28 hand joints are received from a plurality of expert networks (and from master experts in some high-confidence scenarios), the estimates are analyzed at a joint level and a final location for each joint is calculated based on the plurality of estimates for a particular joint. This is a novel solution discovered by the technology disclosed because nothing in the field of art determines hand pose estimates at such granularity and precision. Regarding granularity and precision, because hand pose estimates are computed on a joint-by-joint basis, this allows the technology disclosed to detect in real time even the minutest and most subtle hand movements, such as a bend/yaw/tilt/roll of a segment of a finger or a tilt of an occluded finger, as demonstrated supra in the Experimental Results section of this application.
Type: Application
Filed: July 10, 2019
Publication date: July 6, 2023
Applicant: Ultrahaptics IP Two Limited
Inventors: Jonathan MARSDEN, Raffi BEDIKIAN, David Samuel HOLZ
-
Publication number: 20220270218
Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to an external screen and is viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while it is simultaneously observing the calibration image on the external screen. One or more cameras are located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the external screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Application
Filed: May 10, 2022
Publication date: August 25, 2022
Applicant: Ultrahaptics IP Two Limited
Inventors: Johnathon Scott SELSTAD, David Samuel Holz
-
Patent number: 11354787
Abstract: The disclosed technology teaches an AR calibration system for compensating for AR headset distortions. A calibration image is provided to an external screen and is viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while it is simultaneously observing the calibration image on the external screen. The camera is located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera cancels out an acceptable portion of the calibration image provided to the external screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Grant
Filed: November 5, 2019
Date of Patent: June 7, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: Johnathon Scott Selstad, David Samuel Holz
-
Publication number: 20210303079
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality (AR) or virtual reality (VR) environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking the motion of one or more real objects within view of a wearable sensor system. Switching the AR/VR presentation on or off, so that users can interact with the real world surrounding them, for example to drink some soda, can be accomplished with a convenient mode-switching gesture associated with switching between operational modes in a VR/AR-enabled device.
Type: Application
Filed: June 11, 2021
Publication date: September 30, 2021
Applicant: Ultrahaptics IP Two Limited
Inventor: David Samuel HOLZ
-
Patent number: 11036304
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality (AR) or virtual reality (VR) environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking the motion of one or more real objects within view of a wearable sensor system. Switching the AR/VR presentation on or off, so that users can interact with the real world surrounding them, for example to drink some soda, can be accomplished with a convenient mode-switching gesture associated with switching between operational modes in a VR/AR-enabled device.
Type: Grant
Filed: May 18, 2020
Date of Patent: June 15, 2021
Assignee: Ultrahaptics IP Two Limited
Inventor: David Samuel Holz
-
Publication number: 20200278756
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality (AR) or virtual reality (VR) environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking the motion of one or more real objects within view of a wearable sensor system. Switching the AR/VR presentation on or off, so that users can interact with the real world surrounding them, for example to drink some soda, can be accomplished with a convenient mode-switching gesture associated with switching between operational modes in a VR/AR-enabled device.
Type: Application
Filed: May 18, 2020
Publication date: September 3, 2020
Applicant: Ultrahaptics IP Two Limited
Inventor: David Samuel HOLZ
-
Patent number: 10656720
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality (AR) or virtual reality (VR) environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking the motion of one or more real objects within view of a wearable sensor system. Switching the AR/VR presentation on or off, so that users can interact with the real world surrounding them, for example to drink some soda, can be accomplished with a convenient mode-switching gesture associated with switching between operational modes in a VR/AR-enabled device.
Type: Grant
Filed: January 15, 2016
Date of Patent: May 19, 2020
Assignee: Ultrahaptics IP Two Limited
Inventor: David Samuel Holz
-
Publication number: 20200143524
Abstract: The disclosed technology teaches an AR calibration system for compensating for AR headset distortions. A calibration image is provided to an external screen and is viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while it is simultaneously observing the calibration image on the external screen. The camera is located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera cancels out an acceptable portion of the calibration image provided to the external screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Application
Filed: November 5, 2019
Publication date: May 7, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: Johnathon Scott SELSTAD, David Samuel HOLZ
-
Patent number: 10139918
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Grant
Filed: September 28, 2016
Date of Patent: November 27, 2018
Assignee: Leap Motion, Inc.
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Samuel Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Christopher Julian
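The scale-ratio behavior in this abstract amounts to multiplying the physical gesture distance by the ratio between the display's scale and the gesture's scale. The units and numbers below are illustrative assumptions:

```python
def display_movement(gesture_distance_mm: float,
                     gesture_scale_mm: float,
                     display_scale_px: float) -> float:
    """Map a physical gesture distance to an on-screen movement using
    the ratio between the display scale and the gesture scale."""
    return gesture_distance_mm * (display_scale_px / gesture_scale_mm)

# A 50 mm hand motion, where a 200 mm sweep is taken to correspond to
# the full 1080 px height of the display:
print(display_movement(50, 200, 1080))  # 270.0
```

Identifying `gesture_scale_mm` per user (for instance from arm reach) is what lets the same on-screen travel require proportionally less motion from a smaller gesture envelope.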
-
Publication number: 20170017306
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Application
Filed: September 28, 2016
Publication date: January 19, 2017
Applicant: Leap Motion, Inc.
Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David Samuel HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel A. HARE, Ryan Christopher JULIAN
-
Patent number: 9507442
Abstract: Disclosed are a system and a device including a motion capture device and an input device, hereafter referred to as a stylus, which has additional functionality. The motion capture device detects the motion of the stylus and the detected motion is used as an input to a computer system. The system is able to differentiate identical movements of a stylus as different inputs by varying a detectable property of the stylus. The stylus may exhibit a variable reflective property that is detectable by the motion capture device. The variable reflective property gives the stylus additional functionality with an extended vocabulary. The extended vocabulary includes supplemental information and/or instructions detected by the motion capture device.
Type: Grant
Filed: May 21, 2014
Date of Patent: November 29, 2016
Assignee: Leap Motion, Inc.
Inventors: David Samuel Holz, Kevin A. Horowitz, Justin Schunick
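The extended vocabulary can be sketched as a lookup from the stylus's detected reflectivity state to an input meaning, so that identical motion paths carry different instructions. The quantized levels and their meanings below are hypothetical:

```python
# Hypothetical mapping from quantized stylus reflectivity levels (as a
# motion-capture device might report them) to supplemental meanings.
REFLECTIVITY_VOCAB = {0: "hover", 1: "draw", 2: "erase"}

def interpret_stroke(path, reflectivity_level: int) -> tuple:
    # The same motion path yields a different input depending on the
    # stylus's detectable reflective state at the time of the stroke.
    meaning = REFLECTIVITY_VOCAB.get(reflectivity_level, "unknown")
    return meaning, path

stroke = [(0, 0), (1, 1), (2, 2)]
print(interpret_stroke(stroke, 1))  # same path, "draw" semantics
print(interpret_stroke(stroke, 2))  # same path, "erase" semantics
```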
-
Patent number: 9348419
Abstract: First and second detection systems coupled to a controller are synchronized, with the first detection system including first emission and detection modules while the second detection system includes a second detection module, for emitting radiation towards and detecting radiation from a region. A pulse of radiation emitted from the first emission module is detected by the first and second detection modules for a first time interval starting at time T1 and for a second time interval starting at time T2, respectively. The radiation received is compared to determine a radiation difference measurement. The starting time T2 is adjusted relative to starting time T1 based at least in part upon the radiation difference measurement to determine a revised starting time T2, thereby aiding the synchronization of starting time T2 with starting time T1.
Type: Grant
Filed: January 24, 2014
Date of Patent: May 24, 2016
Assignee: Leap Motion, Inc.
Inventors: Ryan Christopher Julian, Hongyuan He, David Samuel Holz
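The iterative revision of starting time T2 can be sketched as a feedback loop driven by the radiation difference measurement. The toy detector response (signal strongest when the window start coincides with the pulse), the proportional gain, and the assumption that T2 starts late are all illustrative:

```python
def synchronize(t1: float, t2: float, measure, gain: float = 0.5,
                tol: float = 1e-6, max_iter: int = 100) -> float:
    """Iteratively revise starting time t2 until the radiation detected
    in the second window matches the first (difference near zero)."""
    for _ in range(max_iter):
        diff = measure(t1) - measure(t2)  # radiation difference measurement
        if abs(diff) < tol:
            break
        t2 -= gain * diff                 # revise t2 based on the difference
    return t2

# Toy response: detected radiation peaks when the detection window
# starts exactly at the pulse time t = 0, and falls off on either side.
measure = lambda t: -abs(t)
print(round(synchronize(0.0, 0.3, measure), 6))  # t2 converges toward t1
```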
-
Patent number: 9285893
Abstract: Imaging systems and methods optimize illumination of objects for purposes of detection, recognition and/or tracking by tailoring the illumination to the position of the object within the detection space. For example, feedback from a tracking system may be used to control and aim the lighting elements so that the illumination can be reduced or increased depending on the need.
Type: Grant
Filed: January 18, 2013
Date of Patent: March 15, 2016
Assignee: Leap Motion, Inc.
Inventor: David Samuel Holz
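Tailoring illumination to object position can be sketched as a power schedule driven by the tracked distance: dim for nearby objects, brighter for distant ones, clamped at the emitter's maximum. The inverse-square scaling and the power limits are assumptions for illustration:

```python
def illumination_level(distance_m: float, base_power: float = 1.0,
                       max_power: float = 10.0) -> float:
    """Scale emitter power with the tracked object's distance so nearby
    objects are not overexposed and distant ones remain detectable
    (inverse-square falloff assumed), clamped to the emitter's maximum."""
    return min(max_power, base_power * distance_m ** 2)

# Feedback from the tracker: brighten as the hand moves away.
for d in (0.2, 0.5, 1.0, 2.0):
    print(d, illumination_level(d))
```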