Patents Assigned to Ultrahaptics IP Two Limited
  • Patent number: 12293478
    Abstract: The technology disclosed relates to a method of realistic rendering of a real object as a virtual object in a virtual space using an offset in the position of the hand in a three-dimensional (3D) sensory space. An offset between expected positions of the eye(s) of a wearer of a head mounted device and a sensor attached to the head mounted device for sensing a position of at least one hand in a three-dimensional (3D) sensory space is determined. A position of the hand in the three-dimensional (3D) sensory space can be sensed using the sensor. The sensed position of the hand can be transformed by the offset into a re-rendered position of the hand as it would appear to the wearer of the head mounted device if the wearer were looking at the actual hand. The re-rendered hand can be depicted to the wearer of the head mounted device. (A sketch follows this entry.)
    Type: Grant
    Filed: April 8, 2024
    Date of Patent: May 6, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Alex Marcolina, David Holz
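A minimal sketch of the offset correction described above: a hand position sensed in the sensor's frame is shifted by the sensor-to-eye offset before rendering. The offset value, the names, and the pure-translation model are illustrative assumptions, not the patent's method.

```python
import numpy as np

# Assumed fixed offset (meters) from the HMD-mounted sensor to the
# expected eye position, expressed in the sensor's coordinate frame.
SENSOR_TO_EYE_OFFSET = np.array([0.0, -0.04, 0.11])

def rerender_position(hand_pos_sensor: np.ndarray) -> np.ndarray:
    """Transform a sensed hand position by the offset so the hand is
    depicted where the wearer would see the actual hand."""
    return hand_pos_sensor + SENSOR_TO_EYE_OFFSET

sensed = np.array([0.10, -0.20, 0.35])   # hand position from the sensor
print(rerender_position(sensed))          # re-rendered position
```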
  • Publication number: 20250130648
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: December 19, 2024
    Publication date: April 24, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
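A minimal sketch of the scale-ratio mapping in publication 20250130648 above: physical gesture distance drives on-screen movement through the ratio between the identified gesture scale and the display scale. All names and constants are illustrative assumptions.

```python
def display_movement_px(gesture_distance_cm: float,
                        gesture_scale_cm: float = 30.0,
                        display_scale_px: float = 1080.0) -> float:
    """Map a physical gesture distance to on-screen pixels via the
    ratio between the gesture scale and the display scale."""
    return gesture_distance_cm * (display_scale_px / gesture_scale_cm)

# A 3 cm hand movement maps to a proportional on-screen movement.
print(display_movement_px(3.0))  # 108.0
```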
  • Publication number: 20250130700
    Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and the positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point. (A sketch follows this entry.)
    Type: Application
    Filed: October 28, 2024
    Publication date: April 24, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Gabriel A. Hare, Maxwell Sills
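A minimal sketch of the dynamic selection step, reading "proximate" as nearest: the calculation point closest to the virtual object becomes the manipulation point. The nearest-point rule and all names are assumptions for illustration.

```python
import numpy as np

def select_manipulation_point(calc_points: np.ndarray,
                              object_center: np.ndarray) -> np.ndarray:
    """Pick the calculation point (e.g., a fingertip or palm position)
    closest to the virtual object as the manipulation point."""
    dists = np.linalg.norm(calc_points - object_center, axis=1)
    return calc_points[np.argmin(dists)]

points = np.array([[0.10, 0.00, 0.30],    # thumb tip
                   [0.12, 0.05, 0.28],    # index tip
                   [0.08, -0.02, 0.33]])  # palm center
print(select_manipulation_point(points, np.array([0.11, 0.04, 0.29])))
```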
  • Publication number: 20250130647
    Abstract: The technology disclosed relates to identifying an object in a field of view of a camera. In particular, it relates to identifying a display in the field of view of the camera. This is achieved by monitoring a space, including acquiring a series of image frames of the space using the camera and detecting one or more light sources in the series of image frames. Further, one or more frequencies of periodic intensity or brightness variations, also referred to as the ‘refresh rate’, of light emitted from the light sources are measured. Based on the one or more frequencies of periodic intensity variations of light emitted from the light sources, at least one display that includes the light sources is identified. (A sketch follows this entry.)
    Type: Application
    Filed: November 18, 2024
    Publication date: April 24, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David Holz
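One plausible realization of the frequency measurement is a Fourier transform over a light source's per-frame brightness; the frame rate, the synthetic signal, and the function names below are assumptions.

```python
import numpy as np

def dominant_flicker_hz(brightness: np.ndarray, fps: float) -> float:
    """Return the dominant frequency of periodic brightness variation
    in a per-frame brightness series sampled at fps frames/second."""
    spectrum = np.abs(np.fft.rfft(brightness - brightness.mean()))
    freqs = np.fft.rfftfreq(len(brightness), d=1.0 / fps)
    return float(freqs[np.argmax(spectrum)])

fps = 240.0                                 # high-speed capture assumed
t = np.arange(480) / fps
series = 0.5 + 0.1 * np.sin(2 * np.pi * 60.0 * t)  # 60 Hz display flicker
print(dominant_flicker_hz(series, fps))     # ~60.0 -> likely a display
```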
  • Publication number: 20250117070
    Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object. In response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, the virtual contact and the resulting motions of the virtual object are depicted in the generated display by the 3D solid control object model. (A sketch follows this entry.)
    Type: Application
    Filed: October 14, 2024
    Publication date: April 10, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
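A minimal sketch of the virtual-contact detection, approximating a sub-component of the 3D solid control object model (e.g., a fingertip segment) and the virtual object as spheres; the sphere representation is an assumption for illustration.

```python
import numpy as np

def in_virtual_contact(segment_center: np.ndarray, segment_radius: float,
                       object_center: np.ndarray, object_radius: float) -> bool:
    """Report virtual contact when the spheres intersect; a renderer
    would then depict the contact and resulting object motion."""
    gap = float(np.linalg.norm(segment_center - object_center))
    return gap <= segment_radius + object_radius

fingertip = np.array([0.10, 0.02, 0.30])
ball = np.array([0.11, 0.03, 0.31])
print(in_virtual_contact(fingertip, 0.008, ball, 0.02))  # True
```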
  • Patent number: 12265761
    Abstract: The technology disclosed relates to simplifying updating of a predictive model using clustering of observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space. (A sketch follows this entry.)
    Type: Grant
    Filed: January 5, 2024
    Date of Patent: April 1, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Kevin Horowitz, Raffi Bedikian, Hua Yang
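A minimal sketch of grouping observed points by surface normal direction; quantizing normals into coarse direction bins stands in for the patent's direction-and-adjacency clustering, and all names are illustrative.

```python
import numpy as np
from collections import defaultdict

def cluster_by_normal(points: np.ndarray, normals: np.ndarray,
                      bins: int = 8) -> dict:
    """Group points whose unit surface normals fall into the same
    coarse direction bin (a stand-in for full clustering)."""
    clusters = defaultdict(list)
    for point, normal in zip(points, normals):
        key = tuple(np.floor((normal + 1.0) * bins / 2.0).astype(int))
        clusters[key].append(point)
    return clusters

pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0]], dtype=float)
nrm = np.array([[0, 0, 1], [0, 0, 1], [1, 0, 0]], dtype=float)
print({k: len(v) for k, v in cluster_by_normal(pts, nrm).items()})
```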
  • Publication number: 20250103145
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Application
    Filed: December 9, 2024
    Publication date: March 27, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
  • Patent number: 12260023
    Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
    Type: Grant
    Filed: June 13, 2023
    Date of Patent: March 25, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Patent number: 12260679
    Abstract: The technology disclosed initializes a new hand that enters the field of view of a gesture recognition system using a parallax detection module. The parallax detection module determines candidate regions of interest (ROI) for a given input hand image and computes depth, rotation and position information for the candidate ROI. Then, for each of the candidate ROI, an ImagePatch, which includes the hand, is extracted from the original input hand image to minimize processing of low-information pixels. Further, a hand classifier neural network is used to determine which ImagePatch most resembles a hand. For the qualified, most hand-like ImagePatch, a 3D virtual hand is initialized with depth, rotation and position matching that of the qualified ImagePatch. (A sketch follows this entry.)
    Type: Grant
    Filed: December 20, 2023
    Date of Patent: March 25, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
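A minimal sketch of the selection step in patent 12260679 above: score each candidate ImagePatch with a hand classifier and seed the 3D virtual hand from the most hand-like one. The dictionary layout and the stub scores are assumptions, not the patent's network.

```python
from typing import Callable, List

def pick_best_patch(patches: List[dict],
                    hand_score: Callable[[dict], float]) -> dict:
    """Return the candidate ImagePatch the classifier rates as most
    hand-like; its depth/rotation/position initialize the 3D hand."""
    return max(patches, key=hand_score)

candidates = [
    {"depth": 0.42, "rotation": 10.0, "position": (120, 88), "score": 0.31},
    {"depth": 0.40, "rotation": 4.0, "position": (130, 92), "score": 0.87},
]
best = pick_best_patch(candidates, lambda p: p["score"])
print(best["depth"], best["rotation"], best["position"])
```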
  • Publication number: 20250086910
    Abstract: The technology disclosed can provide the user of a wearable (or portable) device with capabilities to view and/or interact with the real world, using a sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved user experience, greater safety and greater functionality for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted devices (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Application
    Filed: November 22, 2024
    Publication date: March 13, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Patent number: 12243238
    Abstract: The technology disclosed performs hand pose estimation on a so-called “joint-by-joint” basis. So, when a plurality of estimates for the 28 hand joints are received from a plurality of expert networks (and from master experts in some high-confidence scenarios), the estimates are analyzed at a joint level and a final location for each joint is calculated based on the plurality of estimates for that particular joint. This is a novel solution discovered by the technology disclosed because nothing in the field of art determines hand pose estimates at such granularity and precision. Because hand pose estimates are computed on a joint-by-joint basis, the technology disclosed can detect in real time even the minutest and most subtle hand movements, such as a bend/yaw/tilt/roll of a segment of a finger or a tilt of an occluded finger, as demonstrated in the Experimental Results section of this application. (A sketch follows this entry.)
    Type: Grant
    Filed: July 20, 2023
    Date of Patent: March 4, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
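A minimal sketch of joint-by-joint fusion: each of the 28 joints gets its final location from the per-joint set of expert estimates. The abstract does not state the combination rule; a coordinate-wise median is assumed here for robustness to outlier experts.

```python
import numpy as np

def fuse_joint_estimates(estimates: np.ndarray) -> np.ndarray:
    """Combine per-joint 3D predictions from several expert networks.

    estimates: shape (n_experts, 28, 3).
    Returns (28, 3): a coordinate-wise median per joint."""
    return np.median(estimates, axis=0)

rng = np.random.default_rng(0)
expert_preds = rng.normal(scale=0.01, size=(5, 28, 3))  # 5 experts
print(fuse_joint_estimates(expert_preds).shape)          # (28, 3)
```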
  • Patent number: 12242312
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted. (A sketch follows this entry.)
    Type: Grant
    Filed: September 19, 2023
    Date of Patent: March 4, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Paul Durdik
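The redirection in patent 12242312 above can be approximated with the standard thin-prism relation, deviation ≈ (n − 1) × apex angle; the refractive index and apex angle below are assumed example values, not figures from the patent.

```python
def thin_prism_deviation_deg(apex_angle_deg: float,
                             refractive_index: float = 1.5) -> float:
    """Small-angle deviation (degrees) a thin prism adds to a camera's
    optical axis: delta ~= (n - 1) * A for apex angle A."""
    return (refractive_index - 1.0) * apex_angle_deg

# An assumed 40-degree apex angle redirects the optical axis by about
# 20 degrees, tilting the field of view toward a former blind spot.
print(thin_prism_deviation_deg(40.0))  # 20.0
```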
  • Patent number: 12236528
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured from image information or equivalents. Predictive information can be improved by applying techniques for correlating it with information from observations.
    Type: Grant
    Filed: September 30, 2022
    Date of Patent: February 25, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Kevin A. Horowitz, David S. Holz
  • Patent number: 12229217
    Abstract: The technology disclosed introduces two types of neural networks: “master” or “generalist” networks and “expert” or “specialist” networks. Both master networks and expert networks are fully connected neural networks that take a feature vector of an input hand image and produce a prediction of the hand pose. Master networks and expert networks differ from each other based on the data on which they are trained. In particular, master networks are trained on the entire dataset, whereas expert networks are trained only on a subset of the entire dataset. With regard to hand poses, master networks are trained on the input image data representing all available hand poses comprising the training data (including both real and simulated hand images). (A sketch follows this entry.)
    Type: Grant
    Filed: December 11, 2023
    Date of Patent: February 18, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
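A minimal sketch of the training-data split described above: a master (generalist) network sees the entire dataset, while each expert (specialist) sees only the subset for its pose group. The sample layout is an illustrative assumption.

```python
from collections import defaultdict

samples = [
    {"features": [0.1, 0.9], "pose_group": "pinch"},
    {"features": [0.7, 0.2], "pose_group": "fist"},
    {"features": [0.4, 0.4], "pose_group": "pinch"},
]

# Master networks train on every sample, real and simulated alike.
master_training_set = samples

# Each expert network trains only on the subset for its pose group.
expert_training_sets = defaultdict(list)
for sample in samples:
    expert_training_sets[sample["pose_group"]].append(sample)

print(len(master_training_set),
      {k: len(v) for k, v in expert_training_sets.items()})
```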
  • Publication number: 20250054248
    Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection. (A sketch follows this entry.)
    Type: Application
    Filed: October 21, 2024
    Publication date: February 13, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
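A minimal sketch of one unsafe condition the abstract names (collision): warn when a tracked obstacle's closest approach to the user falls inside a safety radius within a short horizon. The constant-velocity model and the thresholds are assumptions.

```python
import numpy as np

def collision_warning(rel_pos: np.ndarray, rel_vel: np.ndarray,
                      horizon_s: float = 1.5, radius_m: float = 0.5) -> bool:
    """Warn when an obstacle at rel_pos moving at rel_vel (both relative
    to the user) comes within radius_m during the next horizon_s."""
    speed_sq = float(rel_vel @ rel_vel)
    if speed_sq == 0.0:
        return float(np.linalg.norm(rel_pos)) < radius_m
    t_star = float(np.clip(-(rel_pos @ rel_vel) / speed_sq, 0.0, horizon_s))
    closest = float(np.linalg.norm(rel_pos + t_star * rel_vel))
    return closest < radius_m

print(collision_warning(np.array([2.0, 0.0, 0.0]),
                        np.array([-2.0, 0.0, 0.0])))  # True: warn user
```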
  • Patent number: 12225280
    Abstract: The technology disclosed relates to adjusting the monitored field of view of a camera and/or a view of a virtual scene from a point of view of a virtual camera based on the distance between tracked objects. For example, if the user's hand is being tracked for gestures, the closer the hand gets to another object, the tighter the frame can become—i.e., the more the camera can zoom in so that the hand and the other object occupy most of the frame. The camera can also be reoriented so that the hand and the other object remain in the center of the field of view. The distance between two objects in a camera's field of view can be determined and a parameter adjusted based thereon. In particular, the pan and/or zoom levels of the camera may be adjusted in accordance with the distance.
    Type: Grant
    Filed: August 12, 2022
    Date of Patent: February 11, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
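A minimal sketch of distance-driven framing from patent 12225280 above: as the tracked hand nears the other object, the zoom factor rises so the pair fills most of the frame. The mapping and its constants are illustrative assumptions.

```python
import numpy as np

def zoom_for_distance(hand: np.ndarray, other: np.ndarray,
                      full_frame_m: float = 1.0,
                      max_zoom: float = 8.0) -> float:
    """Choose a zoom factor inversely proportional to the separation
    between the hand and the other object, capped at max_zoom."""
    separation = float(np.linalg.norm(hand - other))
    return min(full_frame_m / max(separation, full_frame_m / max_zoom),
               max_zoom)

print(zoom_for_distance(np.array([0.0, 0.0, 1.0]),
                        np.array([0.2, 0.0, 1.0])))  # 5.0: tight framing
```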
  • Patent number: 12204695
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, the gesture is identified as an engagement gesture, and compared with reference gestures from a library of reference gestures. In some embodiments, a degree of completion of the recognized engagement gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Grant
    Filed: July 7, 2023
    Date of Patent: January 21, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
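A minimal sketch of the degree-of-completion measurement in patent 12204695 above, comparing arc length traveled against a reference gesture's total length; representing gestures as 2D point sequences is an assumption.

```python
import numpy as np

def completion_fraction(observed: np.ndarray, reference: np.ndarray) -> float:
    """Estimate how much of a reference gesture has been performed, as
    arc length traveled over the reference path's total length."""
    def arc_length(path: np.ndarray) -> float:
        return float(np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1)))
    total = arc_length(reference)
    return min(arc_length(observed) / total, 1.0) if total > 0 else 0.0

ref = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # L-shaped swipe
seen = np.array([[0.0, 0.0], [0.0, 1.0]])             # halfway through
print(completion_fraction(seen, ref))  # 0.5 -> modify display at 50%
```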
  • Publication number: 20250021169
    Abstract: A method for detecting a finger is provided. The method includes obtaining a plurality of digital images including a first digital image captured by a camera from a field of view containing a background and a hand including at least one finger, and obtaining an identification of pixels of the plurality of digital images that correspond to at least one finger that is visible in the plurality of digital images rather than to the background, the pixels being identified by: obtaining, from the digital images, a Gaussian brightness falloff pattern indicative of at least one finger, identifying an axis of the at least one finger based on the obtained Gaussian brightness falloff pattern indicative of the at least one finger without identifying edges of the at least one finger, and identifying the pixels that correspond to the at least one finger based on the identified axis.
    Type: Application
    Filed: July 22, 2024
    Publication date: January 16, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Hua Yang
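A minimal sketch of recovering a finger axis from brightness alone, per publication 20250021169 above: a brightness-weighted principal axis of above-threshold pixels, with no edge detection. The threshold and the PCA stand-in are assumptions, not the disclosed method.

```python
import numpy as np

def finger_axis(brightness: np.ndarray, thresh: float = 0.2):
    """Return (centroid, unit axis) of the bright ridge in a grayscale
    image via brightness-weighted PCA rather than edge detection."""
    ys, xs = np.nonzero(brightness > thresh)
    weights = brightness[ys, xs]
    pts = np.stack([xs, ys], axis=1).astype(float)
    centroid = np.average(pts, axis=0, weights=weights)
    cov = np.cov((pts - centroid).T, aweights=weights)
    vals, vecs = np.linalg.eigh(cov)
    return centroid, vecs[:, np.argmax(vals)]

img = np.zeros((9, 9))
img[4, 1:8] = 1.0                 # bright ridge along the finger core
img[3, 2:7] = img[5, 2:7] = 0.4   # Gaussian-like falloff to the sides
print(finger_axis(img))           # axis ~ (1, 0): ridge runs horizontally
```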
  • Publication number: 20250013314
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: September 23, 2024
    Publication date: January 9, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Isaac Cohen, Maxwell Sills
  • Publication number: 20250004568
    Abstract: The technology disclosed relates to automatically interpreting motion of a control object in a three-dimensional (3D) sensor space by sensing a movement of the control object in the 3D sensor space, interpreting the movement of the control object, and presenting the interpreted movement as a path on a display. The path may be displayed once the speed of the movement exceeds a pre-determined threshold measured in cm per second. Once the path is displayed, the technology duplicates a display object that intersects the path on the display. In some implementations, the control object may be a device, a hand, or a portion of a hand (such as a finger). (A sketch follows this entry.)
    Type: Application
    Filed: September 12, 2024
    Publication date: January 2, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
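A minimal sketch of the two steps described above: display the path only when average speed exceeds a pre-determined cm/s threshold, then duplicate any display object the path intersects. The object layout and the threshold value are illustrative assumptions.

```python
import numpy as np

SPEED_THRESHOLD_CM_S = 30.0  # assumed pre-determined threshold

def process_motion(path_cm: np.ndarray, dt_s: float, objects: list) -> list:
    """Given an (N, 2) path sampled every dt_s seconds, duplicate each
    display object the path intersects, once speed exceeds the threshold."""
    steps = np.linalg.norm(np.diff(path_cm, axis=0), axis=1)
    if steps.sum() / (dt_s * len(steps)) <= SPEED_THRESHOLD_CM_S:
        return objects                        # too slow: path not shown
    result = list(objects)
    for obj in objects:
        hits = np.linalg.norm(path_cm - obj["center"], axis=1) < obj["radius"]
        if hits.any():
            result.append(dict(obj))          # duplicate intersected object
    return result

objs = [{"center": np.array([5.0, 0.0]), "radius": 1.0}]
path = np.array([[0.0, 0.0], [5.0, 0.0], [10.0, 0.0]])
print(len(process_motion(path, 0.05, objs)))  # 2: object duplicated
```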