Patents by Inventor David S. HOLZ
David S. HOLZ has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12105889
Abstract: The technology disclosed relates to automatically interpreting motion of a control object in a three-dimensional (3D) sensor space by sensing a movement of the control object in the 3D sensor space, interpreting the movement of the control object, and presenting the interpreted movement as a path on a display. The path may be displayed once the speed of the movement exceeds a pre-determined threshold measured in cm per second. Once the path is displayed, the technology duplicates a display object that intersects the path on the display. In some implementations, the control object may be a device, a hand, or a portion of a hand (such as a finger).
Type: Grant
Filed: June 23, 2023
Date of Patent: October 1, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
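A minimal sketch of the speed-gated path display described in this abstract: the path is recorded only while the control object's speed exceeds a threshold in cm/s, and display objects intersecting the path are duplicated. All function names, data shapes, and the threshold value are illustrative assumptions, not taken from the patent.

```python
SPEED_THRESHOLD_CM_S = 25.0  # assumed pre-determined threshold

def path_points(samples, threshold=SPEED_THRESHOLD_CM_S):
    """samples: list of (t_seconds, (x_cm, y_cm)) observations.
    Returns the positions recorded while speed exceeds the threshold."""
    shown = []
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        dist = ((p1[0] - p0[0]) ** 2 + (p1[1] - p0[1]) ** 2) ** 0.5
        if dist / dt > threshold:  # speed in cm/s
            shown.append(p1)
    return shown

def duplicate_intersecting(path, objects, radius=1.0):
    """Duplicate any display object whose center lies within
    `radius` cm of a displayed path point."""
    copies = []
    for obj in objects:
        cx, cy = obj["center"]
        if any((cx - x) ** 2 + (cy - y) ** 2 <= radius ** 2 for x, y in path):
            copies.append(dict(obj, name=obj["name"] + "_copy"))
    return objects + copies
```

In this sketch a slow drag produces no path at all, so nothing is duplicated; only a fast stroke through an object triggers the copy.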
-
Patent number: 12095969
Abstract: A motion sensory and imaging device capable of acquiring imaging information of the scene and providing at least a near real time pass-through of imaging information to a user. The sensory and imaging device can be used stand-alone or coupled to a wearable or portable device to create a wearable sensory system capable of presenting to the wearer the imaging information augmented with virtualized or created presentations of information.
Type: Grant
Filed: September 28, 2023
Date of Patent: September 17, 2024
Assignee: ULTRAHAPTICS IP TWO LIMITED
Inventors: David S. Holz, Neeloy Roy, Hongyuan He
-
Publication number: 20240302163
Abstract: Methods and systems for determining a gesture command from analysis of differences in positions of closed curves fit to observed edges of a control object, to track motion of the control object while making a gesture in 3D space, include repeatedly obtaining captured images of a control object moving in 3D space and calculating observed edges of the control object from the captured images. Closed curves are fit to the observed edges of the control object, including control object appendages for multiple portions of any complex control object, by selecting a closed curve from a family of similar closed curves that fit the observed edges using an assumed parameter. Using the fitted closed curves, a complex control object is constructed from its multiple portions, with one or more control object appendages appended.
Type: Application
Filed: May 14, 2024
Publication date: September 12, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventor: David S. HOLZ
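To illustrate the curve-fitting step, here is a toy fit using circles as one simple family of closed curves: the center comes from the centroid of the observed edge points and the radius from their mean distance to it. This is a deliberately simplified stand-in for the patent's family-of-curves selection, and all names are assumptions.

```python
import math

def fit_circle(edge_points):
    """Fit a circle (one simple family of closed curves) to observed
    edge points: center at the centroid, radius as the mean distance
    from the centroid to the points."""
    n = len(edge_points)
    cx = sum(x for x, _ in edge_points) / n
    cy = sum(y for _, y in edge_points) / n
    r = sum(math.hypot(x - cx, y - cy) for x, y in edge_points) / n
    return (cx, cy), r
```

Tracking would then compare the fitted centers and radii across frames; differences in those fitted parameters, rather than raw pixels, drive gesture interpretation.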
-
Patent number: 12086327
Abstract: A method for detecting a finger is provided. The method includes obtaining a plurality of digital images including a first digital image captured by a camera from a field of view containing a background and a hand including at least one finger, and obtaining an identification of pixels of the plurality of digital images that correspond to at least one finger that is visible in the plurality of digital images rather than to the background, the pixels being identified by: obtaining, from the digital images, a Gaussian brightness falloff pattern indicative of at least one finger, identifying an axis of the at least one finger based on the obtained Gaussian brightness falloff pattern indicative of the at least one finger without identifying edges of the at least one finger, and identifying the pixels that correspond to the at least one finger based on the identified axis.
Type: Grant
Filed: September 18, 2023
Date of Patent: September 10, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Hua Yang
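One hedged way to picture axis identification without edge detection: if brightness falls off roughly as a Gaussian across the finger's width, the brightness-weighted centroid of each image row approximates the Gaussian's peak, and those per-row peaks trace the finger's axis. The function below is an illustrative sketch, not the patented method.

```python
def row_axis(image):
    """image: 2D list of brightness values (rows of pixels).
    Returns, per row, the brightness-weighted centroid column,
    approximating the peak of a Gaussian falloff across the finger;
    None for rows with no brightness."""
    axis = []
    for row in image:
        total = sum(row)
        if total == 0:
            axis.append(None)
            continue
        axis.append(sum(col * v for col, v in enumerate(row)) / total)
    return axis
```

Note that this needs no edge map at all: even with blurred or noisy borders, the weighted centroid stays close to the brightness peak.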
-
Patent number: 12086935
Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured using image information or the equivalents. Predictive information can be improved by applying techniques for correlating with information from observations.
Type: Grant
Filed: September 26, 2023
Date of Patent: September 10, 2024
Assignee: ULTRAHAPTICS IP TWO LIMITED
Inventors: Kevin A. Horowitz, David S. Holz
-
Publication number: 20240296588
Abstract: The technology disclosed relates to coordinating motion-capture of a hand by a network of motion-capture sensors having overlapping fields of view. In particular, it relates to designating a first sensor among three or more motion-capture sensors as having a master frame of reference, observing motion of a hand as it passes through overlapping fields of view of the respective motion-capture sensors, synchronizing capture of images of the hand within the overlapping fields of view by pairs of the motion-capture devices, and using the pairs of the hand images captured by the synchronized motion-capture devices to automatically calibrate the motion-capture sensors to the master frame of reference.
Type: Application
Filed: May 13, 2024
Publication date: September 5, 2024
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ
-
Patent number: 12067157
Abstract: The technology disclosed can provide capabilities such as using motion sensors and/or other types of sensors coupled to a motion-capture system to monitor motions within a real environment. A virtual object can be projected to a user of a portable device integrated into an augmented rendering of a real environment about the user. Motion information of a user body portion is determined based at least in part upon sensory information received from imaging or acoustic sensory devices. Control information is communicated to a system based in part on a combination of the motion of the portable device and the detected motion of the user. The virtual device experience can be augmented in some implementations by the addition of haptic, audio and/or other sensory information projectors.
Type: Grant
Filed: December 23, 2022
Date of Patent: August 20, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
-
Patent number: 12050757
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video stream to a second user of the wearable sensor system.
Type: Grant
Filed: February 24, 2023
Date of Patent: July 30, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 12045394
Abstract: Methods and systems for processing input from an image-capture device for gesture-recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining an intent of the user to change modes of operation, subsequently interpreting user gestures in accordance with a second mode of operation.
Type: Grant
Filed: July 26, 2023
Date of Patent: July 23, 2024
Assignee: ULTRAHAPTICS IP TWO LIMITED
Inventor: David S. Holz
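The mode-switching flow above can be sketched with one plausible intent signal (the patent does not specify this particular one): a dwell, where the object's path stays nearly stationary for several consecutive frames, is treated as the user's intent to switch from the first mode to the second. Thresholds and names are illustrative.

```python
def interpret(positions, dwell_frames=3, dwell_eps=0.5):
    """positions: sequence of (x, y) object positions, one per frame.
    Returns per-frame mode labels, switching from 'mode1' to 'mode2'
    once the path has been nearly stationary for `dwell_frames` frames."""
    mode, still, labels = "mode1", 0, []
    for prev, cur in zip(positions, positions[1:]):
        moved = ((cur[0] - prev[0]) ** 2 + (cur[1] - prev[1]) ** 2) ** 0.5
        still = still + 1 if moved < dwell_eps else 0
        if still >= dwell_frames:
            mode = "mode2"  # dwell detected: interpret as intent to switch
        labels.append(mode)
    return labels
```

Any other path feature (a circle traced in the air, a sharp reversal) could serve as the switch trigger in the same loop; only the `moved < dwell_eps` test would change.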
-
Patent number: 12039103
Abstract: The technology disclosed relates to determining intent for the interaction by calculating a center of effort for the applied forces. Movement of the points of virtual contacts and the center of effort are then monitored to determine a gesture-type intended for the interaction. The number of points of virtual contacts of the feeler zones and proximities between the points of virtual contacts are used to determine a degree of precision of a control object-gesture.
Type: Grant
Filed: December 19, 2022
Date of Patent: July 16, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Pohung Chen, David S. Holz
-
Patent number: 12032746
Abstract: The technology disclosed relates to a method of realistic displacement of a virtual object for an interaction between a control object in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and, in response to detecting a 2D sub-component free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting rotation of the virtual object by the 3D solid control object model.
Type: Grant
Filed: July 18, 2022
Date of Patent: July 9, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Alex Marcolina, David S. Holz
-
Patent number: 12020458
Abstract: The technology disclosed relates to coordinating motion-capture of a hand by a network of motion-capture sensors having overlapping fields of view. In particular, it relates to designating a first sensor among three or more motion-capture sensors as having a master frame of reference, observing motion of a hand as it passes through overlapping fields of view of the respective motion-capture sensors, synchronizing capture of images of the hand within the overlapping fields of view by pairs of the motion-capture devices, and using the pairs of the hand images captured by the synchronized motion-capture devices to automatically calibrate the motion-capture sensors to the master frame of reference.
Type: Grant
Filed: January 18, 2022
Date of Patent: June 25, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
-
Publication number: 20240201794
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Application
Filed: February 26, 2024
Publication date: June 20, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
-
Patent number: 11994377
Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera based on reflections therefrom or shadows cast thereby.
Type: Grant
Filed: September 2, 2020
Date of Patent: May 28, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
-
Publication number: 20240143871
Abstract: The technology disclosed relates to simplifying updating of a predictive model using clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
Type: Application
Filed: January 5, 2024
Publication date: May 2, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: David S. HOLZ, Kevin HOROWITZ, Raffi BEDIKIAN, Hua YANG
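The clustering step named in this abstract, grouping observed 3D points by surface-normal direction and spatial adjacency, can be sketched greedily as below. The angle and distance thresholds, and the greedy strategy itself, are illustrative assumptions; normals are assumed to be unit vectors.

```python
import math

def cluster(points, normals, angle_deg=20.0, adjacency=1.0):
    """Greedy clustering of 3D points: a point joins an existing
    cluster if it lies within `adjacency` of some member and its unit
    normal is within `angle_deg` of that member's normal; otherwise it
    seeds a new cluster. Returns lists of point indices."""
    cos_thresh = math.cos(math.radians(angle_deg))
    clusters = []
    for i, (p, n) in enumerate(zip(points, normals)):
        placed = False
        for c in clusters:
            for j in c:
                q, m = points[j], normals[j]
                close = sum((a - b) ** 2 for a, b in zip(p, q)) <= adjacency ** 2
                aligned = sum(a * b for a, b in zip(n, m)) >= cos_thresh
                if close and aligned:
                    c.append(i)
                    placed = True
                    break
            if placed:
                break
        if not placed:
            clusters.append([i])
    return clusters
```

Each resulting cluster would then be matched against a segment of the hand's predictive model, so the model update operates on a handful of clusters instead of every observed point.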
-
Publication number: 20240094860
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video stream to a second user of the wearable sensor system.
Type: Application
Filed: February 24, 2023
Publication date: March 21, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 11914792
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: February 17, 2023
Date of Patent: February 27, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
-
Publication number: 20240031547
Abstract: A motion sensory and imaging device capable of acquiring imaging information of the scene and providing at least a near real time pass-through of imaging information to a user. The sensory and imaging device can be used stand-alone or coupled to a wearable or portable device to create a wearable sensory system capable of presenting to the wearer the imaging information augmented with virtualized or created presentations of information.
Type: Application
Filed: September 28, 2023
Publication date: January 25, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: David S. HOLZ, Neeloy ROY, Hongyuan HE
-
Publication number: 20240029356
Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured using image information or the equivalents. Predictive information can be improved by applying techniques for correlating with information from observations.
Type: Application
Filed: September 26, 2023
Publication date: January 25, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: Kevin A. HOROWITZ, David S. HOLZ
-
Patent number: 11868687
Abstract: The technology disclosed relates to simplifying updating of a predictive model using clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
Type: Grant
Filed: January 30, 2023
Date of Patent: January 9, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Kevin Horowitz, Raffi Bedikian, Hua Yang