Patents Assigned to Ultrahaptics IP Two Limited
  • Patent number: 11726575
    Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three-dimensional sensor space by sensing a movement of the control object in the three-dimensional sensor space, sensing an orientation of the control object, defining a control plane tangential to a surface of the control object, and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to it.
    Type: Grant
    Filed: July 19, 2021
    Date of Patent: August 15, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
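    The normal-versus-parallel test described in the abstract above can be sketched with a simple dot-product check (a minimal illustration only, not the patented method; the function name and threshold are hypothetical):

    ```python
    import numpy as np

    def classify_gesture(movement, plane_normal, threshold=0.5):
        """Label a movement as 'normal' (e.g., a press toward the control
        plane) or 'parallel' (e.g., a swipe along it) by comparing the
        movement direction against the plane's normal vector."""
        m = movement / np.linalg.norm(movement)
        n = plane_normal / np.linalg.norm(plane_normal)
        # |cos(angle)| near 1 -> movement mostly along the normal;
        # near 0 -> movement mostly within the plane.
        alignment = abs(float(np.dot(m, n)))
        return "normal" if alignment >= threshold else "parallel"
    ```

    For a plane with normal (0, 0, 1), a movement of (0, 0, -1) classifies as "normal" (a press), while (1, 0, 0) classifies as "parallel" (a swipe).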
  • Patent number: 11720180
    Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
    Type: Grant
    Filed: July 11, 2022
    Date of Patent: August 8, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Patent number: 11720181
    Abstract: Methods and systems are provided for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining such an intent, subsequently interpreting user gestures in accordance with a second mode of operation.
    Type: Grant
    Filed: August 26, 2022
    Date of Patent: August 8, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Publication number: 20230245500
    Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Application
    Filed: March 31, 2023
    Publication date: August 3, 2023
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventors: Maxwell SILLS, Aaron SMITH, David S. HOLZ, Hongyuan (Jimmy) HE
  • Patent number: 11714880
    Abstract: The technology disclosed performs hand pose estimation on a so-called “joint-by-joint” basis. So, when a plurality of estimates for the 28 hand joints are received from a plurality of expert networks (and from master experts in some high-confidence scenarios), the estimates are analyzed at a joint level and a final location for each joint is calculated based on the plurality of estimates for a particular joint. This is a novel solution discovered by the technology disclosed because nothing in the field of art determines hand pose estimates at such granularity and precision. Regarding granularity and precision, because hand pose estimates are computed on a joint-by-joint basis, this allows the technology disclosed to detect in real time even the minutest and most subtle hand movements, such as a bend/yaw/tilt/roll of a segment of a finger or a tilt of an occluded finger, as demonstrated supra in the Experimental Results section of this application.
    Type: Grant
    Filed: July 10, 2019
    Date of Patent: August 1, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
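    The joint-by-joint fusion described above can be illustrated as a per-joint confidence-weighted average of the expert networks' estimates (a hypothetical sketch; the patent's actual combination rule may differ):

    ```python
    import numpy as np

    def fuse_joint_estimates(estimates, confidences):
        """Fuse per-joint 3D position estimates from several expert networks.

        estimates:   shape (n_experts, n_joints, 3)
        confidences: shape (n_experts, n_joints), per-joint weights

        Each joint's final position is a confidence-weighted average taken
        independently of every other joint ('joint-by-joint')."""
        estimates = np.asarray(estimates, dtype=float)
        w = np.asarray(confidences, dtype=float)[..., np.newaxis]
        return (estimates * w).sum(axis=0) / w.sum(axis=0)
    ```

    With two equally confident experts placing a joint at (0, 0, 0) and (2, 2, 2), the fused location is their midpoint (1, 1, 1).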
  • Publication number: 20230214458
    Abstract: The technology disclosed performs hand pose estimation on a so-called “joint-by-joint” basis. So, when a plurality of estimates for the 28 hand joints are received from a plurality of expert networks (and from master experts in some high-confidence scenarios), the estimates are analyzed at a joint level and a final location for each joint is calculated based on the plurality of estimates for a particular joint. This is a novel solution discovered by the technology disclosed because nothing in the field of art determines hand pose estimates at such granularity and precision. Regarding granularity and precision, because hand pose estimates are computed on a joint-by-joint basis, this allows the technology disclosed to detect in real time even the minutest and most subtle hand movements, such as a bend/yaw/tilt/roll of a segment of a finger or a tilt of an occluded finger, as demonstrated supra in the Experimental Results section of this application.
    Type: Application
    Filed: July 10, 2019
    Publication date: July 6, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Jonathan MARSDEN, Raffi BEDIKIAN, David Samuel HOLZ
  • Patent number: 11693115
    Abstract: The technology disclosed relates to determining positional information of an object in a field of view. In particular, it relates to measuring, using a light sensitive sensor, one or more differences in an intensity of returning light that is (i) emitted from respective directionally oriented non-coplanar light sources of a plurality of directionally oriented light sources that have at least some overlapping fields of illumination and (ii) reflected from the target object as the target object moves through a region of space monitored by the light sensitive sensor, and recognizing signals in response to (i) positional information of the target object determined based on a first position in space at a first time t0 and a second position in space at a second time t1 sensed using the measured one or more differences in the intensity of the returning light and (ii) a non-coplanar movement of the target object.
    Type: Grant
    Filed: February 24, 2020
    Date of Patent: July 4, 2023
    Assignee: ULTRAHAPTICS IP TWO LIMITED
    Inventor: David Holz
  • Publication number: 20230205321
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Application
    Filed: February 17, 2023
    Publication date: June 29, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
  • Publication number: 20230205151
    Abstract: The technology disclosed relates to selecting among devices to interact with. It also relates to operating a smart phone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
    Type: Application
    Filed: January 4, 2023
    Publication date: June 29, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Robert Samuel GORDON, Paul Alan DURDIK, Maxwell SILLS
  • Patent number: 11676349
    Abstract: The technology disclosed can provide the user of a wearable (or portable) device with capabilities to view and/or interact with the real world, using a sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved user experience, greater safety, greater functionality to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted devices (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Grant
    Filed: July 23, 2021
    Date of Patent: June 13, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Publication number: 20230169236
    Abstract: The technology disclosed relates to simplifying updating of a predictive model using clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
    Type: Application
    Filed: January 30, 2023
    Publication date: June 1, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Kevin HOROWITZ, Raffi BEDIKIAN, Hua YANG
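    Clustering observed points by surface-normal direction and adjacency, as described above, can be sketched as greedy region growing (an illustrative approximation; the thresholds and function name are assumptions, not taken from the patent):

    ```python
    import numpy as np

    def cluster_by_normals(points, normals, max_angle_deg=20.0, max_dist=0.05):
        """Greedy region growing: two points join the same cluster when their
        surface normals agree within max_angle_deg and they are adjacent
        (closer than max_dist). Returns one integer cluster label per point."""
        points = np.asarray(points, dtype=float)
        normals = np.asarray(normals, dtype=float)
        normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
        cos_thresh = np.cos(np.radians(max_angle_deg))

        labels = np.full(len(points), -1, dtype=int)  # -1 = unassigned
        current = 0
        for seed in range(len(points)):
            if labels[seed] != -1:
                continue
            labels[seed] = current
            frontier = [seed]
            while frontier:
                i = frontier.pop()
                dists = np.linalg.norm(points - points[i], axis=1)
                agree = normals @ normals[i] >= cos_thresh
                for j in np.where((labels == -1) & (dists <= max_dist) & agree)[0]:
                    labels[j] = current
                    frontier.append(j)
            current += 1
        return labels
    ```

    Points with similar normals that sit close together grow into one cluster, which could then be matched against segments of a hand model as the abstract describes.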
  • Publication number: 20230140737
    Abstract: The technology disclosed can provide capabilities such as using motion sensors and/or other types of sensors coupled to a motion-capture system to monitor motions within a real environment. A virtual object can be projected to a user of a portable device integrated into an augmented rendering of a real environment about the user. Motion information of a user body portion is determined based at least in part upon sensory information received from imaging or acoustic sensory devices. Control information is communicated to a system based in part on a combination of the motion of the portable device and the detected motion of the user. The virtual device experience can be augmented in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Application
    Filed: December 23, 2022
    Publication date: May 4, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. HOLZ
  • Publication number: 20230125265
    Abstract: The technology disclosed can provide improved safety by detecting potential unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determining the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Application
    Filed: December 21, 2022
    Publication date: April 27, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Robert Samuel GORDON, Matias PEREZ
  • Publication number: 20230121570
    Abstract: The technology disclosed relates to determining intent for the interaction by calculating a center of effort for the applied forces. Movement of the points of virtual contacts and the center of effort are then monitored to determine a gesture-type intended for the interaction. The number of points of virtual contacts of the feeler zones and proximities between the points of virtual contacts are used to determine a degree of precision of a control-object gesture.
    Type: Application
    Filed: December 19, 2022
    Publication date: April 20, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Pohung CHEN, David S. HOLZ
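    The center of effort described above can be illustrated as a force-weighted centroid of the virtual contact points (a hypothetical sketch, not the patented computation):

    ```python
    import numpy as np

    def center_of_effort(contact_points, forces):
        """Force-magnitude-weighted centroid of the virtual contact points.
        Tracking its movement over time, together with the contact points
        themselves, supports inferring the intended gesture type."""
        p = np.asarray(contact_points, dtype=float)
        f = np.asarray(forces, dtype=float)
        return (p * f[:, np.newaxis]).sum(axis=0) / f.sum()
    ```

    For contacts at (0, 0, 0) and (2, 0, 0) with force magnitudes 1 and 3, the center of effort is (1.5, 0, 0), pulled toward the harder press.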
  • Patent number: 11620859
    Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Grant
    Filed: July 28, 2020
    Date of Patent: April 4, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Maxwell Sills, Aaron Smith, David S. Holz, Hongyuan (Jimmy) He
  • Publication number: 20230094182
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured using image information or equivalents. Predictive information can be improved by applying techniques for correlating it with information from observations.
    Type: Application
    Filed: September 30, 2022
    Publication date: March 30, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Kevin A. HOROWITZ, David S. HOLZ
  • Publication number: 20230072748
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: November 10, 2022
    Publication date: March 9, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Isaac COHEN, Maxwell SILLS
  • Patent number: 11599237
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video stream to a second user of the wearable sensor system.
    Type: Grant
    Filed: February 12, 2021
    Date of Patent: March 7, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Patent number: 11586292
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Grant
    Filed: March 1, 2021
    Date of Patent: February 21, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
  • Publication number: 20230042990
    Abstract: The technology disclosed relates to a motion sensory and imaging device capable of acquiring imaging information of the scene and providing at least a near real time pass-through of imaging information to a user. The sensory and imaging device can be used stand-alone or coupled to a wearable or portable device to create a wearable sensory system capable of presenting to the wearer the imaging information augmented with virtualized or created presentations of information.
    Type: Application
    Filed: October 24, 2022
    Publication date: February 9, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Neeloy ROY, Hongyuan HE