Patents Assigned to Ultrahaptics IP Two Limited
  • Publication number: 20240143871
    Abstract: The technology disclosed relates to simplifying updating of a predictive model using clustering of observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of the control object such that motion between the first position and the second position was at least partially occluded from observation in the 3D sensory space.
    Type: Application
    Filed: January 5, 2024
    Publication date: May 2, 2024
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventors: David S. HOLZ, Kevin HOROWITZ, Raffi BEDIKIAN, Hua YANG
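The clustering step this abstract describes — grouping 3D points by surface-normal direction and spatial adjacency before matching them to hand-model segments — can be sketched as a toy union-find pass. This is an illustrative reconstruction, not the patented method; the function names, thresholds, and distance criterion are assumptions:

```python
import math

def cluster_points(points, normals, angle_tol_deg=20.0, adj_dist=0.05):
    """Greedy clustering: two points join the same cluster when their
    surface normals differ by less than angle_tol_deg AND the points
    lie within adj_dist of each other (adjacency)."""
    parent = list(range(len(points)))  # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    cos_tol = math.cos(math.radians(angle_tol_deg))
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            dot = sum(a * b for a, b in zip(normals[i], normals[j]))
            if dot >= cos_tol and math.dist(points[i], points[j]) <= adj_dist:
                union(i, j)

    clusters = {}
    for i in range(len(points)):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())
```

Points on the same flat patch share a normal direction and sit close together, so they merge; points on a differently oriented or distant patch stay in their own cluster.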
  • Patent number: 11954808
    Abstract: The technology disclosed relates to a method of realistic rendering of a real object as a virtual object in a virtual space using an offset in the position of the hand in a three-dimensional (3D) sensory space. An offset between expected positions of the eye(s) of a wearer of a head mounted device and a sensor attached to the head mounted device for sensing a position of at least one hand in a three-dimensional (3D) sensory space is determined. A position of the hand in the three-dimensional (3D) sensory space can be sensed using a sensor. The sensed position of the hand can be transformed by the offset into a re-rendered position of the hand as would appear to the wearer of the head mounted device if the wearer were looking at the actual hand. The re-rendered hand can be depicted to the wearer of the head mounted device.
    Type: Grant
    Filed: February 7, 2022
    Date of Patent: April 9, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Alex Marcolina, David Holz
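The offset transform the abstract describes — moving a hand position sensed at the head-mounted sensor to where the wearer's eyes would see the real hand — reduces in the simplest case to a translation. A minimal sketch, assuming a purely translational offset (a real device would also correct for sensor rotation):

```python
def rerender_position(sensed_pos, sensor_to_eye_offset):
    """Shift a hand position sensed by a head-mounted sensor by the
    measured sensor-to-eye offset, yielding the position at which the
    hand should be re-rendered so it appears where the wearer would
    see the actual hand."""
    return tuple(p + o for p, o in zip(sensed_pos, sensor_to_eye_offset))
```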
  • Patent number: 11941163
    Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and, in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
    Type: Grant
    Filed: January 27, 2022
    Date of Patent: March 26, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
  • Publication number: 20240094860
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video stream to a second user of the wearable sensor system.
    Type: Application
    Filed: February 24, 2023
    Publication date: March 21, 2024
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Publication number: 20240077950
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The technology disclosed includes determining from the motion information whether a motion of the control object with respect to the virtual control construct is an engagement gesture, such as a virtual mouse click or other control device operation. The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Application
    Filed: November 9, 2023
    Publication date: March 7, 2024
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ
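The engagement-gesture test — deciding whether the control object's motion penetrated the virtual control construct — can be illustrated with a fixed virtual plane. A toy sketch; the plane position and function name are assumptions, and the patent additionally updates the construct's position over time:

```python
def detect_engagement(positions, plane_z=0.0):
    """Detect a 'virtual click': the control object (e.g., a fingertip
    tracked frame by frame) crosses from in front of a virtual surface
    (z > plane_z) to at or behind it (z <= plane_z). Returns the index
    of the frame where penetration occurred, or None."""
    for i in range(1, len(positions)):
        if positions[i - 1][2] > plane_z >= positions[i][2]:
            return i
    return None
```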
  • Publication number: 20240069646
    Abstract: The technology disclosed relates to identifying an object in a field of view of a camera. In particular, it relates to identifying a display in the field of view of the camera. This is achieved by monitoring a space including acquiring a series of image frames of the space using the camera and detecting one or more light sources in the series of image frames. Further, one or more frequencies of periodic intensity or brightness variations, also referred to as ‘refresh rate’, of light emitted from the light sources are measured. Based on the one or more frequencies of periodic intensity variations of light emitted from the light sources, at least one display that includes the light sources is identified.
    Type: Application
    Filed: November 6, 2023
    Publication date: February 29, 2024
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventor: David HOLZ
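Measuring the 'refresh rate' of a detected light source — the frequency of its periodic brightness variation across image frames — can be approximated by counting mean-level crossings in a per-frame brightness series. This is an illustrative sketch, not the claimed method; each full period of a roughly periodic signal produces two crossings:

```python
def estimate_refresh_rate(brightness, frame_rate):
    """Estimate the dominant frequency (Hz) of periodic brightness
    variation from a per-frame brightness series by counting crossings
    of the mean level, then dividing by twice the capture duration."""
    mean = sum(brightness) / len(brightness)
    crossings = 0
    for prev, cur in zip(brightness, brightness[1:]):
        if (prev - mean) * (cur - mean) < 0:
            crossings += 1
    duration = (len(brightness) - 1) / frame_rate  # seconds spanned
    return crossings / (2 * duration)
```

The frame rate must exceed twice the source's flicker frequency for the count to be meaningful (Nyquist); a production system would more likely use a Fourier transform of the series.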
  • Patent number: 11914792
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Grant
    Filed: February 17, 2023
    Date of Patent: February 27, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
  • Publication number: 20240061511
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, the gesture is identified as an engagement gesture, and compared with reference gestures from a library of reference gestures. In some embodiments, a degree of completion of the recognized engagement gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: July 7, 2023
    Publication date: February 22, 2024
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel HARE, Ryan JULIAN
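Determining a dominant gesture from several simultaneously recognized candidates could, for example, weight each candidate's recognition confidence by its amplitude. The scoring rule below is an assumption for illustration, not the patent's actual criterion:

```python
def dominant_gesture(gestures):
    """Given simultaneously recognized gestures as
    (name, confidence, amplitude) tuples, return the name of the
    dominant one: the highest confidence-weighted amplitude wins."""
    return max(gestures, key=lambda g: g[1] * g[2])[0]
```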
  • Publication number: 20240045509
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality (AR) or virtual reality (VR) environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system. Switching the AR/VR presentation on or off, so that the user can interact with the surrounding real world (for example, to drink some soda), can be addressed with a convenient mode-switching gesture associated with switching between operational modes in a VR/AR enabled device.
    Type: Application
    Filed: September 28, 2023
    Publication date: February 8, 2024
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventor: David Samuel HOLZ
  • Publication number: 20240028131
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: September 26, 2023
    Publication date: January 25, 2024
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventors: Isaac COHEN, Maxwell SILLS, Paul DURDIK
  • Publication number: 20240031547
    Abstract: A motion sensory and imaging device capable of acquiring imaging information of the scene and providing at least a near real time pass-through of imaging information to a user. The sensory and imaging device can be used stand-alone or coupled to a wearable or portable device to create a wearable sensory system capable of presenting to the wearer the imaging information augmented with virtualized or created presentations of information.
    Type: Application
    Filed: September 28, 2023
    Publication date: January 25, 2024
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventors: David S. HOLZ, Neeloy ROY, Hongyuan HE
  • Publication number: 20240029356
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured from image information or its equivalents. Predictive information can be improved by applying techniques for correlating it with information from observations.
    Type: Application
    Filed: September 26, 2023
    Publication date: January 25, 2024
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventors: Kevin A. HOROWITZ, David S. HOLZ
  • Publication number: 20240019941
    Abstract: The technology disclosed relates to operating a motion-capture system responsive to available computational resources. In particular, it relates to assessing a level of image acquisition and image-analysis resources available using benchmarking of system components. In response, one or more image acquisition parameters and/or image-analysis parameters are adjusted. Acquisition and/or analysis of image data are then made compliant with the adjusted image acquisition parameters and/or image-analysis parameters. In some implementations, image acquisition parameters include frame resolution and frame capture rate and image-analysis parameters include analysis algorithm and analysis density.
    Type: Application
    Filed: September 27, 2023
    Publication date: January 18, 2024
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventor: David HOLZ
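The resource-responsive adjustment this abstract describes — benchmarking available compute and then setting frame resolution, capture rate, and analysis density accordingly — can be sketched as a threshold table. The score ranges and parameter values below are illustrative assumptions, not the patent's figures:

```python
def adjust_acquisition(benchmark_score):
    """Map a normalized benchmark score of available compute (0..1)
    to image-acquisition and image-analysis parameters: plentiful
    resources allow full resolution, rate, and analysis density;
    scarce resources force progressively coarser settings."""
    if benchmark_score >= 0.75:
        return {"frame_resolution": (640, 480), "frame_rate": 120,
                "analysis_density": "full"}
    if benchmark_score >= 0.4:
        return {"frame_resolution": (320, 240), "frame_rate": 60,
                "analysis_density": "half"}
    return {"frame_resolution": (160, 120), "frame_rate": 30,
            "analysis_density": "sparse"}
```

Acquisition and analysis then proceed under whichever parameter set the current benchmark selects, keeping the pipeline compliant with available resources.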
  • Patent number: 11874970
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Grant
    Filed: June 6, 2022
    Date of Patent: January 16, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
  • Patent number: 11875012
    Abstract: The technology disclosed relates to positioning and revealing a control interface in a virtual or augmented reality that includes causing display of a plurality of interface projectiles at a first region of a virtual or augmented reality. Input is received that is interpreted as user interaction with an interface projectile. User interaction includes selecting and throwing the interface projectile in a first direction. An animation of the interface projectile is displayed along a trajectory in the first direction to a place where it lands. A blooming of the control interface from the interface projectile at the place where it lands is displayed.
    Type: Grant
    Filed: May 21, 2019
    Date of Patent: January 16, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventor: Nicholas James Benson
  • Patent number: 11868687
    Abstract: The technology disclosed relates to simplifying updating of a predictive model using clustering of observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of the control object such that motion between the first position and the second position was at least partially occluded from observation in the 3D sensory space.
    Type: Grant
    Filed: January 30, 2023
    Date of Patent: January 9, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Kevin Horowitz, Raffi Bedikian, Hua Yang
  • Publication number: 20240004438
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Application
    Filed: September 19, 2023
    Publication date: January 4, 2024
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventors: David S. HOLZ, Paul DURDIK
  • Publication number: 20240004479
    Abstract: A method for detecting a finger is provided. The method includes obtaining a plurality of digital images including a first digital image captured by a camera from a field of view containing a background and a hand including at least one finger, and obtaining an identification of pixels of the plurality of digital images that correspond to at least one finger that is visible in the plurality of digital images rather than to the background, the pixels being identified by: obtaining, from the digital images, a Gaussian brightness falloff pattern indicative of at least one finger, identifying an axis of the at least one finger based on the obtained Gaussian brightness falloff pattern indicative of the at least one finger without identifying edges of the at least one finger, and identifying the pixels that correspond to the at least one finger based on the identified axis.
    Type: Application
    Filed: September 18, 2023
    Publication date: January 4, 2024
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Hua YANG
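The edge-free axis identification this abstract describes can be illustrated by taking a brightness-weighted centroid per image row — the peak of the Gaussian-like falloff across the finger — and fitting a line through the centroids. A toy sketch on a grayscale, row-major image; the function name and line parameterization are assumptions:

```python
def finger_axis(image):
    """Estimate a finger's axis without edge detection: compute the
    brightness-weighted centroid of each row (the center of the
    Gaussian brightness falloff across the finger), then least-squares
    fit the line x = slope * y + intercept through the centroids."""
    centroids = []
    for y, row in enumerate(image):
        total = sum(row)
        if total == 0:
            continue  # row contains only background
        x = sum(i * v for i, v in enumerate(row)) / total
        centroids.append((x, y))
    n = len(centroids)
    mx = sum(x for x, _ in centroids) / n
    my = sum(y for _, y in centroids) / n
    num = sum((y - my) * (x - mx) for x, y in centroids)
    den = sum((y - my) ** 2 for _, y in centroids)
    slope = num / den
    return slope, mx - slope * my
```

Because only the brightness peak is used, the axis survives soft or noisy finger boundaries where edge detectors would fail.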
  • Patent number: 11854308
    Abstract: The technology disclosed also initializes a new hand that enters the field of view of a gesture recognition system using a parallax detection module. The parallax detection module determines candidate regions of interest (ROI) for a given input hand image and computes depth, rotation and position information for the candidate ROI. Then, for each of the candidate ROI, an ImagePatch, which includes the hand, is extracted from the original input hand image to minimize processing of low-information pixels. Further, a hand classifier neural network is used to determine which ImagePatch most resembles a hand. For the qualified, most hand-like ImagePatch, a 3D virtual hand is initialized with depth, rotation and position matching that of the qualified ImagePatch.
    Type: Grant
    Filed: February 14, 2017
    Date of Patent: December 26, 2023
    Assignee: ULTRAHAPTICS IP TWO LIMITED
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
  • Patent number: 11841920
    Abstract: The technology disclosed introduces two types of neural networks: “master” or “generalist” networks and “expert” or “specialist” networks. Both master networks and expert networks are fully connected neural networks that take a feature vector of an input hand image and produce a prediction of the hand pose. Master networks and expert networks differ from each other based on the data on which they are trained. In particular, master networks are trained on the entire data set, whereas expert networks are trained only on a subset of it. With regard to hand poses, master networks are trained on input image data representing all available hand poses comprising the training data (including both real and simulated hand images).
    Type: Grant
    Filed: February 14, 2017
    Date of Patent: December 12, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
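The master/expert split can be sketched as a two-tier predictor: a generalist model always answers, and a specialist trained on a narrower subset overrides it when a router assigns the input to that subset. The callables below are stand-ins for illustration, not the patent's trained networks:

```python
def predict_pose(feature_vec, master, experts, router):
    """Two-tier hand-pose prediction: the 'master' (generalist),
    trained on the full dataset, always produces a pose; if the router
    maps the input to a cluster with a matching 'expert' (specialist,
    trained only on that subset), the expert's prediction is used
    instead."""
    pose = master(feature_vec)
    cluster = router(feature_vec)
    if cluster in experts:
        pose = experts[cluster](feature_vec)
    return pose
```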