Patents Assigned to Ultrahaptics IP Two Limited
-
Publication number: 20250232539
Abstract: The technology disclosed relates to a method of realistic rendering of a real object as a virtual object in a virtual space using an offset in the position of the hand in a three-dimensional (3D) sensory space. An offset between expected positions of the eye(s) of a wearer of a head mounted device and a sensor attached to the head mounted device for sensing a position of at least one hand in a three-dimensional (3D) sensory space is determined. A position of the hand in the three-dimensional (3D) sensory space can be sensed using the sensor. The sensed position of the hand can be transformed by the offset into a re-rendered position of the hand as it would appear to the wearer of the head mounted device if the wearer were looking at the actual hand. The re-rendered hand can be depicted to the wearer of the head mounted device.
Type: Application | Filed: April 7, 2025 | Publication date: July 17, 2025 | Applicant: Ultrahaptics IP Two Limited | Inventors: Alex Marcolina, David Holz
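The offset transform this abstract describes reduces to re-expressing a point sensed in the head-mounted sensor's frame in the wearer's eye frame. A minimal sketch in Python, assuming a fixed rigid eye-from-sensor calibration; the matrix and offset values are illustrative, not figures from the filing:

```python
import numpy as np

def make_rigid_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Assumed calibration: the sensor sits 8 cm above and 2 cm in front of the
# eyes, with no relative rotation. A real device would calibrate this offset.
eye_from_sensor = make_rigid_transform(
    rotation=np.eye(3),
    translation=np.array([0.0, -0.08, 0.02]),  # metres, in the eye frame
)

def rerender_position(hand_in_sensor: np.ndarray) -> np.ndarray:
    """Transform a sensed 3D hand position into the wearer's viewing frame."""
    p = np.append(hand_in_sensor, 1.0)         # homogeneous coordinates
    return (eye_from_sensor @ p)[:3]

print(rerender_position(np.array([0.1, 0.0, 0.4])))  # hand as the wearer would see it
```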
-
Patent number: 12347145
Abstract: The technology disclosed relates to coordinating motion capture of a hand by a network of motion-capture sensors having overlapping fields of view. In particular, it relates to designating a first sensor among three or more motion-capture sensors as having a master frame of reference, observing motion of a hand as it passes through the overlapping fields of view of the respective motion-capture sensors, synchronizing capture of images of the hand within the overlapping fields of view by pairs of the motion-capture devices, and using the pairs of hand images captured by the synchronized motion-capture devices to automatically calibrate the motion-capture sensors to the master frame of reference.
Type: Grant | Filed: May 13, 2024 | Date of Patent: July 1, 2025 | Assignee: Ultrahaptics IP Two Limited | Inventor: David S. Holz
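Reduced to its geometric core, the pairwise calibration step the abstract describes is a rigid-transform fit between corresponding hand points seen by two synchronized sensors. The patent does not specify a fitting method, so the standard Kabsch algorithm below, run on synthetic data, is an assumed stand-in:

```python
import numpy as np

def kabsch(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid fit: returns R, t with dst ~ src @ R.T + t."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, c_dst - R @ c_src

# Synthetic scene: the second sensor's frame is the master frame rotated
# 30 degrees about Z and shifted; both sensors observe the same hand points.
theta = np.radians(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 0.1])

hand_in_master = np.random.default_rng(0).uniform(-0.3, 0.3, (20, 3))
hand_in_second = hand_in_master @ R_true.T + t_true

R, t = kabsch(hand_in_second, hand_in_master)   # second sensor -> master frame
print(np.allclose(hand_in_second @ R.T + t, hand_in_master))  # True
```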
-
Publication number: 20250165079
Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
Type: Application | Filed: January 16, 2025 | Publication date: May 22, 2025 | Applicant: Ultrahaptics IP Two Limited | Inventor: David S. Holz
-
Patent number: 12293478
Abstract: The technology disclosed relates to a method of realistic rendering of a real object as a virtual object in a virtual space using an offset in the position of the hand in a three-dimensional (3D) sensory space. An offset between expected positions of the eye(s) of a wearer of a head mounted device and a sensor attached to the head mounted device for sensing a position of at least one hand in a three-dimensional (3D) sensory space is determined. A position of the hand in the three-dimensional (3D) sensory space can be sensed using the sensor. The sensed position of the hand can be transformed by the offset into a re-rendered position of the hand as it would appear to the wearer of the head mounted device if the wearer were looking at the actual hand. The re-rendered hand can be depicted to the wearer of the head mounted device.
Type: Grant | Filed: April 8, 2024 | Date of Patent: May 6, 2025 | Assignee: Ultrahaptics IP Two Limited | Inventors: Alex Marcolina, David Holz
-
Publication number: 20250130648
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or of a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified accordingly. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Application | Filed: December 19, 2024 | Publication date: April 24, 2025 | Applicant: Ultrahaptics IP Two Limited | Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
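The scale-ratio mapping in the abstract is straightforward arithmetic. A hedged sketch with assumed names and values, since the abstract fixes neither scale:

```python
def on_screen_displacement(gesture_cm: float,
                           gesture_scale_cm: float = 30.0,
                           display_scale_px: float = 1920.0) -> float:
    """Map a real-world gesture traversal to pixels via the scale ratio."""
    return gesture_cm * (display_scale_px / gesture_scale_cm)

# A 12 cm swipe across an identified 30 cm gesture range drives 768 px of
# on-screen travel under these (illustrative) scales.
print(on_screen_displacement(12.0))
```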
-
Patent number: 12260023
Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
Type: Grant | Filed: June 13, 2023 | Date of Patent: March 25, 2025 | Assignee: Ultrahaptics IP Two Limited | Inventor: David S. Holz
-
Publication number: 20250086910
Abstract: The technology disclosed can provide the user of a wearable (or portable) device with capabilities to view and/or interact with the real world, using a sensor configured to capture motion and/or to determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved user experience, greater safety, and greater functionality for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted devices (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Application | Filed: November 22, 2024 | Publication date: March 13, 2025 | Applicant: Ultrahaptics IP Two Limited | Inventor: David S. Holz
-
Patent number: 12236528
Abstract: Free-space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured from image information or its equivalents. Predictive information can be improved by applying techniques that correlate it with information from observations.
Type: Grant | Filed: September 30, 2022 | Date of Patent: February 25, 2025 | Assignee: Ultrahaptics IP Two Limited | Inventors: Kevin A. Horowitz, David S. Holz
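The abstract leaves the form of the predictive entities open. The simplest predictor that correlates predictions with observations is an alpha-beta filter; the sketch below is purely an assumed illustration of that idea, not the patented model:

```python
def alpha_beta_track(observations, dt=1.0 / 60.0, alpha=0.85, beta=0.005):
    """Yield smoothed positions: predict, then correct with each observation."""
    x, v = observations[0], 0.0
    for z in observations[1:]:
        x_pred = x + v * dt      # predictive step (constant-velocity model)
        r = z - x_pred           # residual between observation and prediction
        x = x_pred + alpha * r   # correct the position estimate
        v = v + (beta / dt) * r  # correct the velocity estimate
        yield x

# A hand drifting right at roughly 6 units/s with noisy samples (made up).
samples = [0.00, 0.11, 0.19, 0.32, 0.41, 0.48, 0.62, 0.71]
print([round(x, 3) for x in alpha_beta_track(samples)])
```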
-
Publication number: 20250054248
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or to determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Application | Filed: October 21, 2024 | Publication date: February 13, 2025 | Applicant: Ultrahaptics IP Two Limited | Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
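As a toy illustration of one unsafe condition the abstract names, a collision check might extrapolate a sensed object's path and test clearance around the user. The radius, horizon, and constant-velocity assumption below are mine, not the patent's:

```python
import numpy as np

CLEARANCE_M = 0.5     # assumed safety radius around the wearer
HORIZON_S = 2.0       # how far ahead to extrapolate the sensed path

def collision_warning(position, velocity, user=np.zeros(3), steps=20):
    """True if the constant-velocity path enters the clearance sphere."""
    for s in np.linspace(0.0, HORIZON_S, steps):
        if np.linalg.norm(position + velocity * s - user) < CLEARANCE_M:
            return True
    return False

# An object 2 m away closing at 1.5 m/s breaches the 0.5 m radius within 2 s.
print(collision_warning(np.array([2.0, 0.0, 0.0]), np.array([-1.5, 0.0, 0.0])))
```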
-
Patent number: 12204695
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or of a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, the gesture is identified as an engagement gesture and compared with reference gestures from a library of reference gestures. In some embodiments, a degree of completion of the recognized engagement gesture is determined, and the display contents are modified accordingly. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Grant | Filed: July 7, 2023 | Date of Patent: January 21, 2025 | Assignee: Ultrahaptics IP Two Limited | Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
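One plausible reading of "dominant gesture" and "degree of completion" is sketched below with hypothetical scoring (confidence-weighted traversal) and made-up data; the patent does not define a scoring rule:

```python
from dataclasses import dataclass

@dataclass
class GestureCandidate:
    name: str
    confidence: float      # recognizer score in [0, 1]
    traversed_cm: float    # distance covered so far
    reference_cm: float    # full length of the reference gesture

def dominant(candidates: list[GestureCandidate]) -> GestureCandidate:
    """Pick the candidate with the highest confidence-weighted extent."""
    return max(candidates, key=lambda g: g.confidence * g.traversed_cm)

def completion(g: GestureCandidate) -> float:
    """Fraction of the reference gesture completed so far, capped at 1."""
    return min(1.0, g.traversed_cm / g.reference_cm)

concurrent = [GestureCandidate("swipe-left", 0.9, 8.0, 20.0),
              GestureCandidate("circle", 0.6, 14.0, 40.0)]
g = dominant(concurrent)
print(g.name, f"{completion(g):.0%} complete")   # circle 35% complete
```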
-
Publication number: 20250021169
Abstract: A method for detecting a finger is provided. The method includes obtaining a plurality of digital images, including a first digital image captured by a camera from a field of view containing a background and a hand including at least one finger, and obtaining an identification of pixels of the plurality of digital images that correspond to at least one finger visible in the plurality of digital images rather than to the background. The pixels are identified by: obtaining, from the digital images, a Gaussian brightness falloff pattern indicative of at least one finger; identifying an axis of the at least one finger based on the obtained Gaussian brightness falloff pattern, without identifying edges of the at least one finger; and identifying the pixels that correspond to the at least one finger based on the identified axis.
Type: Application | Filed: July 22, 2024 | Publication date: January 16, 2025 | Applicant: Ultrahaptics IP Two Limited | Inventors: David S. Holz, Hua Yang
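The edge-free detection hinges on the observation that an illuminated finger produces a roughly Gaussian brightness falloff across its width, so the brightness-weighted centroid of each image row traces the finger's axis without any edge detection. A sketch on a synthetic ridge image; the energy threshold is an assumption:

```python
import numpy as np

def finger_axis(image: np.ndarray, min_row_energy: float = 1.0):
    """Return (row, centre-column) samples along the brightest ridge."""
    axis = []
    for r, row in enumerate(image.astype(float)):
        if row.sum() < min_row_energy:
            continue                       # treat dim rows as background
        c = (row * np.arange(row.size)).sum() / row.sum()
        axis.append((r, c))                # peak of the Gaussian falloff
    return axis

# Synthetic frame: a tilted finger modelled as a Gaussian ridge (width ~3 px)
# on a dark background, slanted across the image.
rows, cols = np.mgrid[0:64, 0:64]
centreline = 20 + 0.3 * rows               # assumed slanted finger axis
image = np.exp(-((cols - centreline) ** 2) / (2 * 3.0 ** 2))
print(finger_axis(image)[:3])              # first axis samples near column 20
```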
-
Publication number: 20250004568
Abstract: The technology disclosed relates to automatically interpreting motion of a control object in a three-dimensional (3D) sensor space by sensing a movement of the control object in the 3D sensor space, interpreting the movement of the control object, and presenting the interpreted movement as a path on a display. The path may be displayed once the speed of the movement exceeds a pre-determined threshold measured in cm per second. Once the path is displayed, the technology duplicates a display object that intersects the path on the display. In some implementations, the control object may be a device, a hand, or a portion of a hand (such as a finger).
Type: Application | Filed: September 12, 2024 | Publication date: January 2, 2025 | Applicant: Ultrahaptics IP Two Limited | Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
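The speed gate and the duplication rule compose naturally. A sketch with an assumed threshold value (the abstract only says the threshold is measured in cm per second) and simplified 2D box geometry:

```python
import math

SPEED_THRESHOLD_CM_S = 25.0    # assumed; the abstract leaves the value open

def path_if_fast(samples, dt):
    """Return the sampled path for display once any hop exceeds the threshold."""
    fast = any(math.dist(a, b) / dt > SPEED_THRESHOLD_CM_S
               for a, b in zip(samples, samples[1:]))
    return samples if fast else None

def duplicate_intersected(path, boxes):
    """Append a copy of every (xmin, ymin, xmax, ymax) box the path enters."""
    hit = [b for b in boxes
           if any(b[0] <= x <= b[2] and b[1] <= y <= b[3] for x, y in path)]
    return boxes + hit

path = path_if_fast([(0, 0), (3, 0), (9, 1)], dt=0.1)   # 30 cm/s hop -> shown
if path:
    print(duplicate_intersected(path, [(2, -1, 4, 1), (20, 20, 22, 22)]))
```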
-
Publication number: 20240419254
Abstract: The technology disclosed relates to automatically (e.g., programmatically) initializing predictive information for tracking a control object (e.g., a hand, a hand-and-tool combination, or a robot end effector) based upon information about characteristics of the object determined from sets of collected observed information. Automated initialization techniques obviate the need for the special and often bizarre start-up rituals (place your hands on the screen at the places indicated during a full moon, and so forth) required by conventional techniques. In implementations, systems can refine initial predictive information to reflect an observed condition based on comparison of the observed information with an analysis of sets of collected observed information.
Type: Application | Filed: August 27, 2024 | Publication date: December 19, 2024 | Applicant: Ultrahaptics IP Two Limited | Inventor: Kevin A. Horowitz
-
Publication number: 20240419256
Abstract: The technology disclosed relates to using gestures to supplant or augment use of a standard input device coupled to a system. It also relates to controlling a display using gestures, and to controlling a system using more than one input device. In particular, it relates to detecting a standard input device that causes on-screen actions on a display in response to control manipulations performed using the standard input device. Further, a library of analogous gestures is identified, which includes gestures that are analogous to the control manipulations and that also cause the on-screen actions responsive to the control manipulations. Thus, when a gesture from the library of analogous gestures is detected, a signal is generated that mimics a standard signal from the standard input device and causes at least one on-screen action.
Type: Application | Filed: August 29, 2024 | Publication date: December 19, 2024 | Applicant: Ultrahaptics IP Two Limited | Inventor: David Holz
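The analogous-gesture library can be read as a mapping from recognized gestures to synthetic input signals. The event dictionaries below are hypothetical placeholders, not a real OS input API:

```python
from typing import Callable

# Assumed library: each recognized gesture maps to the standard-input signal
# it is analogous to, so the gesture causes the same on-screen action.
analogous_gestures: dict[str, Callable[[], dict]] = {
    "pinch":    lambda: {"device": "mouse", "event": "button_down"},
    "swipe-up": lambda: {"device": "mouse", "event": "scroll", "amount": +3},
    "air-tap":  lambda: {"device": "keyboard", "event": "key", "key": "Enter"},
}

def mimic_standard_input(gesture: str) -> dict | None:
    """Generate a signal mimicking the standard input device, if analogous."""
    make_signal = analogous_gestures.get(gesture)
    return make_signal() if make_signal else None

print(mimic_standard_input("swipe-up"))   # scroll event, as a mouse would send
```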
-
Patent number: 12169918
Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to a screen and is viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while the camera simultaneously observes the calibration image on the screen. One or more cameras are located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm that searches through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the screen, as observed through the reflector by the camera. The transform is then used by the headset to compensate for distortions.
Type: Grant | Filed: September 13, 2023 | Date of Patent: December 17, 2024 | Assignee: Ultrahaptics IP Two Limited | Inventors: Johnathon Scott Selstad, David Samuel Holz
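The cancellation search can be miniaturized to a brute-force search over projection offsets that minimizes the residual energy the camera observes. A real system searches a full distortion mapping rather than the single 2D shift modelled here; that simplification is my assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
calibration = rng.uniform(0.0, 1.0, (32, 32))   # pattern shown on the screen
inverse = 1.0 - calibration                     # pattern sent to the headset

true_shift = (3, -2)   # unknown headset distortion, modelled as a pure shift

def camera_view(applied):
    """Screen pattern plus the headset's inverse image, displaced by the
    unknown distortion and by the offset applied during the search."""
    total = (true_shift[0] + applied[0], true_shift[1] + applied[1])
    return calibration + np.roll(inverse, total, axis=(0, 1))

def residual(applied):
    """Energy left over; zero when the inverse exactly cancels the pattern."""
    return float(np.square(camera_view(applied) - 1.0).sum())

# Brute-force the search space of candidate projection positions.
candidates = [(dy, dx) for dy in range(-5, 6) for dx in range(-5, 6)]
best = min(candidates, key=residual)
print(best, residual(best))   # (-3, 2): cancels the modelled distortion
```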
-
Patent number: 12164694
Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space, generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of the fingers, thumb, and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and the positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
Type: Grant | Filed: November 22, 2021 | Date of Patent: December 10, 2024 | Assignee: Ultrahaptics IP Two Limited | Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
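Dynamic selection of a manipulation point can be illustrated, under assumptions, as a nearest-candidate query against the predictive model's calculation points; the patent's actual selection logic also weighs tracked motion, which this sketch omits:

```python
import numpy as np

def select_manipulation_point(candidates: np.ndarray,
                              calc_points: np.ndarray) -> np.ndarray:
    """Pick the candidate nearest any calculation point of the hand model."""
    dists = [np.linalg.norm(calc_points - m, axis=1).min() for m in candidates]
    return candidates[int(np.argmin(dists))]

# Calculation points: five fingertip positions plus a palm centre (made up).
hand = np.array([[0.10, 0.02, 0.30], [0.12, 0.04, 0.31], [0.14, 0.05, 0.31],
                 [0.16, 0.04, 0.30], [0.18, 0.02, 0.29], [0.14, 0.00, 0.26]])
# Candidate manipulation points on the virtual object's surface (hypothetical).
grasp_points = np.array([[0.13, 0.06, 0.33], [0.40, 0.10, 0.50]])
print(select_manipulation_point(grasp_points, hand))   # the nearer grasp point
```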
-
Publication number: 20240361877
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system, using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment; in particular, to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems; in particular, to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video stream to a second user of the wearable sensor system.
Type: Application | Filed: July 10, 2024 | Publication date: October 31, 2024 | Applicant: Ultrahaptics IP Two Limited | Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 12131011
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and the positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Grant | Filed: July 28, 2020 | Date of Patent: October 29, 2024 | Assignee: Ultrahaptics IP Two Limited | Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Gabriel A. Hare, Maxwell Sills
-
Patent number: 12125157
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or to determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Grant | Filed: December 21, 2022 | Date of Patent: October 22, 2024 | Assignee: Ultrahaptics IP Two Limited | Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
-
Patent number: 12118134
Abstract: The technology disclosed relates to a method of realistic simulation of real-world interactions as virtual interactions between a control object, sensed acting in a three-dimensional (3D) sensory space, and a virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a 3D sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object. In response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, the generated display depicts the virtual contact and the resulting motions of the virtual object by the 3D solid control object model.
Type: Grant | Filed: January 27, 2022 | Date of Patent: October 15, 2024 | Assignee: Ultrahaptics IP Two Limited | Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina