Patents by Inventor David S. HOLZ
David S. HOLZ has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240094860
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Application
Filed: February 24, 2023
Publication date: March 21, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 11914792
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract feature information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: February 17, 2023
Date of Patent: February 27, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
-
Publication number: 20240031547
Abstract: A motion sensory and imaging device capable of acquiring imaging information of the scene and providing at least a near-real-time pass-through of imaging information to a user. The sensory and imaging device can be used stand-alone or coupled to a wearable or portable device to create a wearable sensory system capable of presenting to the wearer the imaging information augmented with virtualized or created presentations of information.
Type: Application
Filed: September 28, 2023
Publication date: January 25, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: David S. HOLZ, Neeloy ROY, Hongyuan HE
-
Publication number: 20240029356
Abstract: Free-space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, e.g., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured from image information or equivalents. Predictive information can be improved by applying techniques for correlating it with information from observations.
Type: Application
Filed: September 26, 2023
Publication date: January 25, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: Kevin A. HOROWITZ, David S. HOLZ
-
Patent number: 11868687
Abstract: The technology disclosed relates to simplifying updating of a predictive model by clustering observed points. In particular, it relates to observing a set of points in a 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of the control object such that motion between the first position and the second position was at least partially occluded from observation.
Type: Grant
Filed: January 30, 2023
Date of Patent: January 9, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Kevin Horowitz, Raffi Bedikian, Hua Yang
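The clustering step this abstract describes, grouping observed 3D points by surface-normal similarity and spatial adjacency, can be sketched as a greedy flood-fill. This is one illustrative reading, not the claimed method; the function name and both thresholds are invented:

```python
import numpy as np

def cluster_by_normals(points, normals, angle_thresh_deg=20.0, dist_thresh=0.05):
    """Greedy clustering of 3D points by surface-normal similarity and adjacency.

    points:  (N, 3) array of observed 3D positions.
    normals: (N, 3) array of unit surface normals, one per point.
    Two points join the same cluster when their normals differ by less than
    angle_thresh_deg AND the points lie within dist_thresh of each other.
    Returns an (N,) array of integer cluster labels.
    """
    n = len(points)
    cos_thresh = np.cos(np.radians(angle_thresh_deg))
    labels = -np.ones(n, dtype=int)  # -1 means "not yet assigned"
    cluster_id = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        labels[seed] = cluster_id
        frontier = [seed]
        while frontier:  # flood-fill over adjacent, similar-normal points
            i = frontier.pop()
            close = np.linalg.norm(points - points[i], axis=1) < dist_thresh
            similar = normals @ normals[i] > cos_thresh
            for j in np.nonzero(close & similar & (labels == -1))[0]:
                labels[j] = cluster_id
                frontier.append(j)
        cluster_id += 1
    return labels
```

The resulting clusters could then be matched against segments of a hand model, as the abstract outlines; that matching step is not shown here.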
-
Publication number: 20240004438
Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system, augmenting the system's three-dimensional (3D) sensory space. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., the blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
Type: Application
Filed: September 19, 2023
Publication date: January 4, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: David S. HOLZ, Paul DURDIK
-
Publication number: 20240004479
Abstract: A method for detecting a finger is provided. The method includes obtaining a plurality of digital images including a first digital image captured by a camera from a field of view containing a background and a hand including at least one finger, and obtaining an identification of pixels of the plurality of digital images that correspond to at least one finger that is visible in the plurality of digital images rather than to the background, the pixels being identified by: obtaining, from the digital images, a Gaussian brightness falloff pattern indicative of at least one finger; identifying an axis of the at least one finger based on the obtained Gaussian brightness falloff pattern without identifying edges of the at least one finger; and identifying the pixels that correspond to the at least one finger based on the identified axis.
Type: Application
Filed: September 18, 2023
Publication date: January 4, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. HOLZ, Hua YANG
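The axis-without-edges idea in this abstract can be illustrated with a toy estimator: treat each image row's brightness-weighted centroid as the peak of the Gaussian cross-section, then fit a line through the centroids. This is a hedged sketch only; the assumption of a roughly vertical, brighter-than-background finger and every name here are mine, not the application's:

```python
import numpy as np

def finger_axis_from_brightness(img):
    """Estimate a finger axis from a Gaussian-like brightness falloff.

    img: 2D array of brightness values; the finger is assumed brighter than
    the background and roughly vertical in the frame. For each row, the
    brightness-weighted centroid approximates the peak of the Gaussian
    cross-section; a least-squares line through the centroids gives the
    axis, with no edge detection involved.
    Returns (slope, intercept) such that column = slope * row + intercept.
    """
    rows = np.arange(img.shape[0])
    cols = np.arange(img.shape[1])
    weights = img - img.min(axis=1, keepdims=True)  # suppress background level
    totals = np.maximum(weights.sum(axis=1), 1e-9)  # avoid divide-by-zero
    centroids = (weights * cols).sum(axis=1) / totals
    slope, intercept = np.polyfit(rows, centroids, 1)  # fit the axis line
    return slope, intercept
```

Pixels within a few standard deviations of this line could then be labeled as finger pixels, per the abstract's last step.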
-
Publication number: 20230367399
Abstract: Methods and systems for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining an intent of the user to change modes of operation, subsequently interpreting user gestures in accordance with a second mode of operation.
Type: Application
Filed: July 26, 2023
Publication date: November 16, 2023
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventor: David S. HOLZ
-
Publication number: 20230368577
Abstract: The technology disclosed relates to highly functional, highly accurate motion sensory control devices for use in automotive and industrial control systems, capable of capturing and providing images to motion capture systems that detect gestures in a three-dimensional (3D) sensory space.
Type: Application
Filed: July 24, 2023
Publication date: November 16, 2023
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. HOLZ, Justin Schunick, Neeloy Roy, Chen Zheng, Ward Travis
-
Publication number: 20230333662
Abstract: The technology disclosed relates to automatically interpreting motion of a control object in a three-dimensional (3D) sensor space by sensing a movement of the control object in the 3D sensor space, interpreting the movement of the control object, and presenting the interpreted movement as a path on a display. The path may be displayed once the speed of the movement exceeds a pre-determined threshold measured in cm per second. Once the path is displayed, the technology duplicates a display object that intersects the path on the display. In some implementations, the control object may be a device, a hand, or a portion of a hand (such as a finger).
Type: Application
Filed: June 23, 2023
Publication date: October 19, 2023
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac COHEN, David S. HOLZ, Maxwell SILLS
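The cm-per-second gate described in this abstract reduces to a per-sample speed check over tracked positions. A minimal sketch; the 25 cm/s default and the function name are invented, not values from the publication:

```python
def path_exceeds_threshold(positions_cm, timestamps_s, threshold_cm_per_s=25.0):
    """Decide whether to start rendering the control object's path.

    positions_cm: sequence of (x, y, z) positions in centimetres.
    timestamps_s: matching capture times in seconds.
    Returns True once any inter-sample speed exceeds the threshold,
    mirroring the abstract's speed-in-cm-per-second gate.
    """
    samples = list(zip(positions_cm, timestamps_s))
    for (p0, t0), (p1, t1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue  # skip duplicate or out-of-order timestamps
        dist = sum((a - b) ** 2 for a, b in zip(p0, p1)) ** 0.5
        if dist / dt > threshold_cm_per_s:
            return True
    return False
```

In a real pipeline this check would run on a sliding window of tracker frames, with the path rendered from the sample where the threshold was first crossed.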
-
Publication number: 20230334796
Abstract: The technology disclosed can provide capabilities to view and/or interact with the real world to the user of a wearable (or portable) device, using a sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved user experience, greater safety, and greater functionality for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted devices (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Application
Filed: June 8, 2023
Publication date: October 19, 2023
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventor: David S. HOLZ
-
Publication number: 20230325005
Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
Type: Application
Filed: June 13, 2023
Publication date: October 12, 2023
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ
-
Patent number: 11782516
Abstract: A method for detecting a finger is provided. The method includes obtaining a plurality of digital images including a first digital image captured by a camera from a field of view containing a background and a hand including at least one finger, and obtaining an identification of pixels of the plurality of digital images that correspond to at least one finger that is visible in the plurality of digital images rather than to the background, the pixels being identified by: obtaining, from the digital images, a Gaussian brightness falloff pattern indicative of at least one finger; identifying an axis of the at least one finger based on the obtained Gaussian brightness falloff pattern without identifying edges of the at least one finger; and identifying the pixels that correspond to the at least one finger based on the identified axis.
Type: Grant
Filed: March 11, 2022
Date of Patent: October 10, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Hua Yang
-
Patent number: 11778159
Abstract: A motion sensory and imaging device capable of acquiring imaging information of the scene and providing at least a near-real-time pass-through of imaging information to a user. The sensory and imaging device can be used stand-alone or coupled to a wearable or portable device to create a wearable sensory system capable of presenting to the wearer the imaging information augmented with virtualized or created presentations of information.
Type: Grant
Filed: October 24, 2022
Date of Patent: October 3, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Neeloy Roy, Hongyuan He
-
Patent number: 11776208
Abstract: Free-space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, e.g., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured from image information or equivalents. Predictive information can be improved by applying techniques for correlating it with information from observations.
Type: Grant
Filed: March 21, 2022
Date of Patent: October 3, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, David S. Holz
-
Patent number: 11775033
Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system, augmenting the system's three-dimensional (3D) sensory space. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., the blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
Type: Grant
Filed: September 1, 2022
Date of Patent: October 3, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Paul Durdik
-
Patent number: 11749026
Abstract: The technology disclosed relates to highly functional, highly accurate motion sensory control devices for use in automotive and industrial control systems, capable of capturing and providing images to motion capture systems that detect gestures in a three-dimensional (3D) sensory space.
Type: Grant
Filed: June 23, 2022
Date of Patent: September 5, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Justin Schunick, Neeloy Roy, Chen Zheng, Ward Travis
-
Patent number: 11726575
Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three-dimensional sensor space by sensing a movement of the control object in the three-dimensional sensor space, sensing orientation of the control object, defining a control plane tangential to a surface of the control object, and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to the control plane.
Type: Grant
Filed: July 19, 2021
Date of Patent: August 15, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
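The normal-versus-parallel test in this abstract amounts to decomposing the displacement vector against the control plane's normal and comparing the two components. A sketch under my own naming and labels; the patent does not specify this exact computation:

```python
import numpy as np

def classify_stroke(movement, plane_normal):
    """Classify a gesture stroke relative to a control plane.

    movement:     3-vector of control-object displacement.
    plane_normal: normal of the control plane (the plane is tangential to
                  the control object's surface, per the abstract).
    Returns "normal" when the displacement is mostly along the plane normal
    (e.g. a push or tap) and "parallel" when it is mostly in-plane (e.g. a
    swipe). The two labels are illustrative.
    """
    movement = np.asarray(movement, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)                       # ensure unit normal
    normal_part = abs(movement @ n)                 # component along the normal
    parallel_part = np.linalg.norm(movement - (movement @ n) * n)  # in-plane part
    return "normal" if normal_part > parallel_part else "parallel"
```

A production system would presumably apply hysteresis or a dead zone around the 45-degree boundary rather than a hard comparison.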
-
Patent number: 11720181
Abstract: Methods and systems for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining an intent of the user to change modes of operation, subsequently interpreting user gestures in accordance with a second mode of operation.
Type: Grant
Filed: August 26, 2022
Date of Patent: August 8, 2023
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
-
Patent number: 11720180
Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
Type: Grant
Filed: July 11, 2022
Date of Patent: August 8, 2023
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz