Patents Assigned to Ultrahaptics IP Two Limited
  • Patent number: 11854308
    Abstract: The technology disclosed initializes a new hand that enters the field of view of a gesture recognition system using a parallax detection module. The parallax detection module determines candidate regions of interest (ROI) for a given input hand image and computes depth, rotation and position information for each candidate ROI. Then, for each candidate ROI, an ImagePatch, which includes the hand, is extracted from the original input hand image to minimize processing of low-information pixels. Further, a hand classifier neural network is used to determine which ImagePatch most resembles a hand. For the qualifying, most hand-like ImagePatch, a 3D virtual hand is initialized with depth, rotation and position matching those of the qualifying ImagePatch. A code sketch of this pipeline follows this entry.
    Type: Grant
    Filed: February 14, 2017
    Date of Patent: December 26, 2023
    Assignee: ULTRAHAPTICS IP TWO LIMITED
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
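The abstract of 11854308 outlines a staged pipeline: candidate ROIs with estimated depth, rotation and position; ImagePatch extraction; a hand-classifier network; and a 3D hand seeded from the winning patch. Below is a minimal Python sketch of that control flow only, under assumed ROI, scoring and hand representations; the brightness-based scorer is a hypothetical stand-in for the actual hand classifier neural network.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class ROI:
    # Hypothetical candidate region: a pixel box plus its estimated pose parameters.
    x: int; y: int; w: int; h: int
    depth: float        # estimated distance from the camera
    rotation: float     # estimated in-plane rotation, radians
    position: tuple     # estimated 3D position (x, y, z)

def extract_image_patch(image: np.ndarray, roi: ROI) -> np.ndarray:
    """Crop only the candidate region so low-information pixels are not processed."""
    return image[roi.y:roi.y + roi.h, roi.x:roi.x + roi.w]

def hand_likelihood(patch: np.ndarray) -> float:
    """Placeholder for the hand classifier neural network.

    A real system would run a trained classifier here; this stand-in scores
    patches by mean brightness purely so the sketch runs end to end.
    """
    return float(patch.mean())

def initialize_virtual_hand(image: np.ndarray, candidates: list) -> dict:
    """Pick the most hand-like ImagePatch and seed a 3D hand from its pose estimate."""
    scored = [(hand_likelihood(extract_image_patch(image, roi)), roi) for roi in candidates]
    best_score, best_roi = max(scored, key=lambda s: s[0])
    # The "3D virtual hand" is represented here as a plain dict of pose parameters.
    return {"depth": best_roi.depth, "rotation": best_roi.rotation,
            "position": best_roi.position, "score": best_score}

if __name__ == "__main__":
    frame = np.random.rand(480, 640)                      # stand-in camera frame
    rois = [ROI(100, 120, 64, 64, 450.0, 0.1, (0.0, 0.1, 0.45)),
            ROI(300, 200, 64, 64, 600.0, -0.3, (0.2, 0.0, 0.60))]
    print(initialize_virtual_hand(frame, rois))
```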
  • Patent number: 11841920
    Abstract: The technology disclosed introduces two types of neural networks: “master” or “generalist” networks and “expert” or “specialist” networks. Both master networks and expert networks are fully connected neural networks that take a feature vector of an input hand image and produce a prediction of the hand pose. Master networks and expert networks differ from each other based on the data on which they are trained. In particular, master networks are trained on the entire data set, whereas expert networks are trained only on a subset of the entire dataset. With regard to hand poses, master networks are trained on input image data representing all available hand poses comprising the training data (including both real and simulated hand images). A minimal training sketch follows this entry.
    Type: Grant
    Filed: February 14, 2017
    Date of Patent: December 12, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
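A minimal sketch of the generalist/specialist split described in 11841920, assuming scikit-learn's MLPRegressor as a stand-in for the fully connected networks, and synthetic feature vectors, pose targets and pose-cluster labels in place of real hand-image data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in data: 1,000 feature vectors of a hand image (32-D) and
# hand-pose targets (8-D, e.g. joint angles). A real system would extract these
# from captured and simulated hand images.
X = rng.normal(size=(1000, 32))
y = rng.normal(size=(1000, 8))

# Hypothetical pose-cluster labels used to carve out the specialists' subsets,
# e.g. "fist", "pinch", "open palm".
cluster = rng.integers(0, 3, size=1000)

# "Master"/generalist: a fully connected network trained on the entire dataset.
master = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300).fit(X, y)

# "Experts"/specialists: the same architecture, each trained only on one subset.
experts = {}
for c in range(3):
    idx = cluster == c
    experts[c] = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300).fit(X[idx], y[idx])

# At inference time a system might fall back to the master, or route to an
# expert when the pose cluster of the input is known or estimated.
sample = X[:1]
print("master prediction:", master.predict(sample)[0])
print("expert prediction:", experts[int(cluster[0])].predict(sample)[0])
```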
  • Publication number: 20230367399
    Abstract: Methods and systems for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining an intent of the user to change modes of operation, subsequently interpreting user gestures in accordance with a second mode of operation. A sketch of this mode-switching logic follows this entry.
    Type: Application
    Filed: July 26, 2023
    Publication date: November 16, 2023
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventor: David S. HOLZ
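A minimal sketch of switching interpretation modes from path analysis, as outlined in 20230367399. The specific intent test used here (a roughly circular, self-closing path) and its tolerances are illustrative assumptions, not the claimed method.

```python
import numpy as np

def path_is_circular(points: np.ndarray, closure_tol: float = 0.15) -> bool:
    """Heuristic stand-in for analyzing a path of movement for switch intent.

    Treats a path as circular when it nearly closes on itself and its points
    keep a roughly constant distance from their centroid.
    """
    span = np.linalg.norm(points.max(axis=0) - points.min(axis=0))
    if span == 0:
        return False
    closes = np.linalg.norm(points[0] - points[-1]) < closure_tol * span
    radii = np.linalg.norm(points - points.mean(axis=0), axis=1)
    round_enough = radii.std() < 0.2 * radii.mean()
    return closes and round_enough

class GestureInterpreter:
    """Interprets gestures in a first mode until switch intent is detected."""
    def __init__(self):
        self.mode = "first"

    def feed_path(self, points: np.ndarray) -> str:
        if path_is_circular(points):
            self.mode = "second" if self.mode == "first" else "first"
        return self.mode

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 50)
    circle = np.c_[np.cos(t), np.sin(t)]                  # a closed, circular path
    swipe = np.c_[np.linspace(0, 1, 50), np.zeros(50)]    # a straight swipe
    interp = GestureInterpreter()
    print(interp.feed_path(swipe))    # stays in the first mode
    print(interp.feed_path(circle))   # switches to the second mode
```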
  • Publication number: 20230368577
    Abstract: The technology disclosed relates to highly functional/highly accurate motion sensory control devices for use in automotive and industrial control systems capable of capturing and providing images to motion capture systems that detect gestures in a three dimensional (3D) sensory space.
    Type: Application
    Filed: July 24, 2023
    Publication date: November 16, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Justin Schunick, Neeloy Roy, Chen Zheng, Ward Travis
  • Patent number: 11809634
    Abstract: The technology disclosed relates to identifying an object in a field of view of a camera. In particular, it relates to identifying a display in the field of view of the camera. This is achieved by monitoring a space, including acquiring a series of image frames of the space using the camera and detecting one or more light sources in the series of image frames. Further, one or more frequencies of periodic intensity or brightness variations, also referred to as the ‘refresh rate’, of light emitted from the light sources are measured. Based on the one or more frequencies of periodic intensity variations of light emitted from the light sources, at least one display that includes the light sources is identified. A sketch of this frequency measurement follows this entry.
    Type: Grant
    Filed: May 2, 2022
    Date of Patent: November 7, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David Holz
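A minimal sketch of measuring the periodic intensity (the 'refresh rate') of a detected light source from a series of frames, per 11809634. The 240 fps capture rate, the 60 Hz synthetic source and the FFT-peak estimator are assumptions for illustration; any method that samples faster than twice the refresh rate could recover the peak.

```python
import numpy as np

def dominant_flicker_hz(brightness: np.ndarray, fps: float) -> float:
    """Estimate the dominant periodic-intensity frequency of a light source.

    `brightness` is the mean pixel intensity of one detected light source,
    sampled once per frame at `fps` frames per second. The camera must sample
    faster than twice the refresh rate for the spectral peak to be unaliased.
    """
    signal = brightness - brightness.mean()          # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return float(freqs[spectrum.argmax()])

if __name__ == "__main__":
    fps, seconds, refresh = 240.0, 2.0, 60.0
    t = np.arange(int(fps * seconds)) / fps
    # Synthetic light source flickering at a 60 Hz refresh rate, plus noise.
    brightness = 0.5 + 0.1 * np.sin(2 * np.pi * refresh * t) + 0.01 * np.random.randn(t.size)
    estimate = dominant_flicker_hz(brightness, fps)
    print(f"estimated refresh rate: {estimate:.1f} Hz")
    # A source whose estimate matches a known display rate (e.g. 60/120/144 Hz)
    # would be classified as belonging to a display.
```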
  • Publication number: 20230342024
    Abstract: The technology disclosed relates to selecting a virtual item from a virtual grid in a three-dimensional (3D) sensory space. It also relates to navigating a virtual modality displaying a plurality of virtual items arranged in a grid by automatically selecting a virtual item in a virtual grid at a terminal end of a control gesture of a control object, responsive to a terminal gesture that transitions the control object from one physical arrangement to another. In one implementation, the control object is a hand. In some implementations, physical arrangements of the control object include at least a flat hand with thumb parallel to fingers, closed, half-open, pinched, curled, fisted, mime gun, okay sign, thumbs-up, ILY sign, one-finger point, two-finger point, thumb point, or pinkie point. A sketch of the terminal-gesture selection follows this entry.
    Type: Application
    Filed: June 29, 2023
    Publication date: October 26, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Bingxin Ku, Pohung Chen, Isaac Cohen, Paul A. Durdik
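A minimal sketch of selecting a grid item at the terminal end of a control gesture when the control object transitions between physical arrangements (20230342024). The pose names, the open-to-pinch trigger and the grid geometry are illustrative assumptions.

```python
def grid_cell_at(point, origin, cell_size, rows, cols):
    """Map a 2D point at the terminal end of a gesture to a (row, col) grid cell."""
    col = int((point[0] - origin[0]) // cell_size)
    row = int((point[1] - origin[1]) // cell_size)
    if 0 <= row < rows and 0 <= col < cols:
        return row, col
    return None

def select_on_terminal_gesture(prev_pose, pose, point, grid):
    """Select the item under the control object when its pose transitions,
    e.g. from an open hand to a pinch (the pose names are illustrative)."""
    if prev_pose == "open" and pose == "pinched":
        return grid_cell_at(point, origin=(0.0, 0.0), cell_size=1.0,
                            rows=grid[0], cols=grid[1])
    return None

if __name__ == "__main__":
    # The hand moves to (2.4, 1.7) over a 3x4 grid and closes into a pinch there.
    print(select_on_terminal_gesture("open", "pinched", (2.4, 1.7), (3, 4)))  # (1, 2)
```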
  • Patent number: 11798141
    Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to an external screen and is viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while the camera simultaneously observes the calibration image on the external screen. One or more cameras are located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm that searches through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the external screen, as observed through the reflector by the camera; the transform is then used by the headset to compensate for distortions. A sketch of the cancellation search follows this entry.
    Type: Grant
    Filed: May 10, 2022
    Date of Patent: October 24, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Johnathon Scott Selstad, David Samuel Holz
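A minimal sketch of the search idea in 11798141: shift the observed inverse calibration image until it cancels the calibration image as seen by the camera. Restricting the search to integer pixel offsets is a simplification for illustration; the patent describes a general distortion mapping transform.

```python
import numpy as np

def residual(calibration: np.ndarray, inverse: np.ndarray, dx: int, dy: int) -> float:
    """How far a shifted inverse image is from cancelling the calibration image.

    Perfect cancellation makes calibration + shifted_inverse uniform everywhere.
    """
    shifted = np.roll(np.roll(inverse, dy, axis=0), dx, axis=1)
    combined = calibration + shifted
    return float(np.abs(combined - combined.mean()).sum())

def search_offset(calibration: np.ndarray, observed_inverse: np.ndarray, max_shift: int = 8):
    """Brute-force search over projection offsets, keeping the best-cancelling one."""
    best = None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = residual(calibration, observed_inverse, dx, dy)
            if best is None or err < best[0]:
                best = (err, dx, dy)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    calibration = rng.random((64, 64))
    # Simulate the projected inverse image being observed 3 px right and 2 px down.
    observed_inverse = np.roll(np.roll(1.0 - calibration, -2, axis=0), -3, axis=1)
    err, dx, dy = search_offset(calibration, observed_inverse)
    print(f"recovered offset: dx={dx}, dy={dy}, residual={err:.3f}")  # dx=3, dy=2
```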
  • Publication number: 20230333662
    Abstract: The technology disclosed relates to automatically interpreting motion of a control object in a three-dimensional (3D) sensor space by sensing a movement of the control object in the 3D sensor space, interpreting the movement of the control object, and presenting the interpreted movement as a path on a display. The path may be displayed once the speed of the movement exceeds a pre-determined threshold measured in cm per second. Once the path is displayed, the technology duplicates a display object that intersects the path on the display. In some implementations, the control object may be a device, a hand, or a portion of a hand (such as a finger). A sketch of the speed-threshold logic follows this entry.
    Type: Application
    Filed: June 23, 2023
    Publication date: October 19, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Isaac COHEN, David S. HOLZ, Maxwell SILLS
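A minimal sketch of the behaviour described in 20230333662: show the path once the control object's speed exceeds a threshold in cm per second, then duplicate display objects the path intersects. The threshold value, sampling rate and object representation are illustrative assumptions.

```python
import numpy as np

SPEED_THRESHOLD_CM_S = 30.0   # illustrative threshold, not a value from the patent

def path_speeds(points_cm: np.ndarray, dt_s: float) -> np.ndarray:
    """Per-segment speeds (cm/s) from sampled control-object positions."""
    return np.linalg.norm(np.diff(points_cm, axis=0), axis=1) / dt_s

def duplicate_intersected(path: np.ndarray, objects: list, radius: float = 1.0) -> list:
    """Duplicate every display object whose centre lies within `radius` of the path."""
    duplicated = []
    for obj in objects:
        centre = np.asarray(obj["centre"])
        if np.min(np.linalg.norm(path - centre, axis=1)) <= radius:
            duplicated.append(dict(obj, name=obj["name"] + "_copy"))
    return duplicated

if __name__ == "__main__":
    dt = 0.01                                             # 100 Hz sampling
    path = np.c_[np.linspace(0, 20, 50), np.zeros(50)]    # a 20 cm sweep in ~0.5 s
    if path_speeds(path, dt).max() > SPEED_THRESHOLD_CM_S:
        objects = [{"name": "icon_a", "centre": (10.0, 0.5)},
                   {"name": "icon_b", "centre": (10.0, 9.0)}]
        print(duplicate_intersected(path, objects))        # only icon_a is duplicated
```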
  • Publication number: 20230334796
    Abstract: The technology disclosed can provide capabilities to view and/or interact with the real world to the user of a wearable (or portable) device using a sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved user experience, greater safety, and greater functionality to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted devices (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Application
    Filed: June 8, 2023
    Publication date: October 19, 2023
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventor: David S. HOLZ
  • Publication number: 20230325005
    Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
    Type: Application
    Filed: June 13, 2023
    Publication date: October 12, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. HOLZ
  • Patent number: 11782513
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality (AR) or virtual reality (VR) environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system. Switching the AR/VR presentation on or off so that the user can interact with the real world surrounding them, for example to drink some soda, can be addressed with a convenient mode-switching gesture associated with switching between operational modes in a VR/AR enabled device.
    Type: Grant
    Filed: June 11, 2021
    Date of Patent: October 10, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David Samuel Holz
  • Patent number: 11782516
    Abstract: A method for detecting a finger is provided. The method includes obtaining a plurality of digital images including a first digital image captured by a camera from a field of view containing a background and a hand including at least one finger, and obtaining an identification of pixels of the plurality of digital images that correspond to at least one finger that is visible in the plurality of digital images rather than to the background. The pixels are identified by: obtaining, from the digital images, a Gaussian brightness falloff pattern indicative of at least one finger; identifying an axis of the at least one finger based on the obtained Gaussian brightness falloff pattern without identifying edges of the at least one finger; and identifying the pixels that correspond to the at least one finger based on the identified axis. A sketch of the axis-fitting idea follows this entry.
    Type: Grant
    Filed: March 11, 2022
    Date of Patent: October 10, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Hua Yang
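A minimal sketch of the axis-first idea in 11782516: locate the centre of a Gaussian-like brightness ridge in each image row, fit a straight line through those centres as the finger axis, and mark nearby pixels as finger pixels, with no edge detection. The synthetic frame, brightness threshold and straight-line axis model are illustrative assumptions.

```python
import numpy as np

def finger_axis(image: np.ndarray, min_row_brightness: float = 0.2):
    """Estimate a finger axis from a Gaussian-like brightness ridge.

    For every image row bright enough to plausibly contain the finger, the
    brightness-weighted centroid column approximates the ridge centre (the
    peak of the Gaussian falloff). A straight line fitted through those
    centroids serves as the finger axis; no edges are identified.
    """
    rows, centroids = [], []
    cols = np.arange(image.shape[1])
    for r, row in enumerate(image):
        if row.max() < min_row_brightness:
            continue                         # row likely contains only background
        centroids.append((row * cols).sum() / row.sum())
        rows.append(r)
    # Fit column = slope * row + intercept through the ridge centroids.
    slope, intercept = np.polyfit(rows, centroids, 1)
    return slope, intercept, rows

def finger_pixels(image: np.ndarray, slope, intercept, rows, half_width=3.0):
    """Mark pixels near the fitted axis as belonging to the finger."""
    mask = np.zeros(image.shape, dtype=bool)
    cols = np.arange(image.shape[1])
    for r in rows:
        centre = slope * r + intercept
        mask[r] = np.abs(cols - centre) <= half_width
    return mask

if __name__ == "__main__":
    # Synthetic frame: a vertical finger at column 20 with Gaussian brightness falloff.
    cols = np.arange(64)
    frame = np.tile(np.exp(-0.5 * ((cols - 20) / 2.5) ** 2), (48, 1))
    slope, intercept, rows = finger_axis(frame)
    print(f"axis column at top row: {intercept:.1f}, slope: {slope:.3f}")
    print("finger pixel count:", int(finger_pixels(frame, slope, intercept, rows).sum()))
```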
  • Patent number: 11778159
    Abstract: A motion sensory and imaging device capable of acquiring imaging information of a scene and providing at least a near-real-time pass-through of the imaging information to a user. The sensory and imaging device can be used stand-alone or coupled to a wearable or portable device to create a wearable sensory system capable of presenting to the wearer the imaging information augmented with virtualized or created presentations of information.
    Type: Grant
    Filed: October 24, 2022
    Date of Patent: October 3, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Neeloy Roy, Hongyuan He
  • Patent number: 11775080
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Grant
    Filed: October 3, 2022
    Date of Patent: October 3, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
  • Patent number: 11776208
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured using image information or equivalents. Predictive information can be improved by applying techniques for correlating it with information from observations.
    Type: Grant
    Filed: March 21, 2022
    Date of Patent: October 3, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Kevin A. Horowitz, David S. Holz
  • Patent number: 11775033
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted. A sketch of the prism-deviation geometry follows this entry.
    Type: Grant
    Filed: September 1, 2022
    Date of Patent: October 3, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Paul Durdik
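For 11775033, a back-of-the-envelope sketch of how a prismatic element shifts a camera's field of view, using the standard thin-prism small-angle approximation delta ≈ (n − 1)·A. The apex angle, refractive index and original field-of-view span are illustrative and do not describe the patented Fresnel or triangular-prism design.

```python
def thin_prism_deviation_deg(apex_angle_deg: float, refractive_index: float) -> float:
    """Thin-prism, small-angle approximation of beam deviation: delta ~= (n - 1) * A.

    Standard textbook optics, not the specific prismatic design claimed in the
    patent; it only illustrates how a prism redirects a camera's optical axis.
    """
    return (refractive_index - 1.0) * apex_angle_deg

if __name__ == "__main__":
    apex_deg, n = 30.0, 1.49                      # illustrative acrylic prism
    delta = thin_prism_deviation_deg(apex_deg, n)
    # If a camera's field of view originally spans 10-70 degrees from the
    # tangential axis, redirecting its optical axis by `delta` shifts that span.
    original_fov = (10.0, 70.0)
    redirected = (original_fov[0] + delta, original_fov[1] + delta)
    print(f"deviation: {delta:.1f} deg, redirected field of view: {redirected}")
```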
  • Patent number: 11775078
    Abstract: The technology disclosed relates to operating a motion-capture system responsive to available computational resources. In particular, it relates to assessing a level of image acquisition and image-analysis resources available using benchmarking of system components. In response, one or more image acquisition parameters and/or image-analysis parameters are adjusted. Acquisition and/or analysis of image data are then made compliant with the adjusted image acquisition parameters and/or image-analysis parameters. In some implementations, image acquisition parameters include frame resolution and frame capture rate, and image-analysis parameters include analysis algorithm and analysis density. A sketch of this parameter adjustment follows this entry.
    Type: Grant
    Filed: July 14, 2022
    Date of Patent: October 3, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David Holz
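A minimal sketch of the resource-responsive adjustment in 11775078: benchmark the available compute, then pick frame resolution, frame capture rate, analysis algorithm and analysis density accordingly. The benchmark and the parameter tiers are illustrative assumptions, not values from the patent.

```python
import time
import numpy as np

def benchmark_score(size: int = 512, repeats: int = 5) -> float:
    """Crude compute benchmark: matrix multiplications per second (illustrative)."""
    a = np.random.rand(size, size)
    start = time.perf_counter()
    for _ in range(repeats):
        a @ a
    return repeats / (time.perf_counter() - start)

def acquisition_parameters(score: float) -> dict:
    """Choose image-acquisition and image-analysis parameters for the available
    resources. The tiers and values below are illustrative assumptions."""
    if score > 50:
        return {"frame_resolution": (1280, 720), "frame_rate": 120,
                "analysis_algorithm": "full_3d_model", "analysis_density": 1.0}
    if score > 10:
        return {"frame_resolution": (640, 480), "frame_rate": 60,
                "analysis_algorithm": "full_3d_model", "analysis_density": 0.5}
    return {"frame_resolution": (320, 240), "frame_rate": 30,
            "analysis_algorithm": "contour_only", "analysis_density": 0.25}

if __name__ == "__main__":
    score = benchmark_score()
    print(f"benchmark: {score:.1f} ops/s ->", acquisition_parameters(score))
```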
  • Publication number: 20230288563
    Abstract: The technology disclosed relates to determining positional information of an object in a field of view. In particular, it relates to measuring, using a light sensitive sensor, returning light that is (i) emitted from respective directionally oriented non-coplanar light sources of a plurality of directionally oriented light sources and (ii) returning from the target object, such as an automobile, as the target object moves through a region of space monitored by the light sensitive sensor. The technology disclosed compares the measured returning light to a look-up table that comprises mappings of measurements from the light sensitive sensor to a corresponding incoming angle of light, and determines positional information for the target object using the incoming angle of light. A sketch of the look-up-table mapping follows this entry.
    Type: Application
    Filed: May 16, 2023
    Publication date: September 14, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David HOLZ
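A minimal sketch of the look-up-table step in 20230288563: map a measured return from one directionally oriented light source to an incoming angle by interpolating a calibration table. The table values and the monotonic angle-to-measurement model are synthetic assumptions; combining angles recovered from several non-coplanar sources would then yield positional information.

```python
import numpy as np

# Hypothetical calibration look-up table for one directionally oriented light
# source: the normalised sensor measurement observed when light returns from a
# target at a known incoming angle (degrees). Real tables come from calibration.
ANGLES_DEG = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
MEASUREMENTS = np.array([0.80, 0.55, 0.35, 0.20, 0.10])

def incoming_angle_deg(measurement: float) -> float:
    """Map a measured return to an incoming angle via the look-up table.

    np.interp needs monotonically increasing x values, so both arrays are
    reversed; intermediate measurements are linearly interpolated.
    """
    return float(np.interp(measurement, MEASUREMENTS[::-1], ANGLES_DEG[::-1]))

if __name__ == "__main__":
    # A target (e.g. an automobile) returns a mid-strength measurement.
    print(f"estimated incoming angle: {incoming_angle_deg(0.6):.1f} degrees")
```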
  • Patent number: 11749026
    Abstract: The technology disclosed relates to highly functional/highly accurate motion sensory control devices for use in automotive and industrial control systems capable of capturing and providing images to motion capture systems that detect gestures in a three dimensional (3D) sensory space.
    Type: Grant
    Filed: June 23, 2022
    Date of Patent: September 5, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Justin Schunick, Neeloy Roy, Chen Zheng, Ward Travis
  • Patent number: 11740705
    Abstract: A method and system are provided for controlling a machine using gestures. The method includes sensing a variation of position of a control object using an imaging system, determining, from the variation, one or more primitives describing a characteristic of a control object moving in space, comparing the one or more primitives to one or more gesture templates in a library of gesture templates, selecting, based on a result of the comparing, one or more gesture templates corresponding to the one or more primitives, and providing at least one gesture template of the selected one or more gesture templates as an indication of a command to issue to a machine under control responsive to the variation. A sketch of the template-matching flow follows this entry.
    Type: Grant
    Filed: February 7, 2022
    Date of Patent: August 29, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
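A minimal sketch of the flow in 11740705: derive primitives from the sensed variation of position, compare them to a library of gesture templates, and report the command of the best-matching template. The direction-vector primitives, the template library and the Euclidean comparison are illustrative assumptions.

```python
import numpy as np

# Hypothetical gesture-template library: each template pairs a sequence of
# motion "primitives" (here, unit direction vectors) with the command to issue.
TEMPLATES = {
    "swipe_right": {"primitives": np.array([[1, 0], [1, 0], [1, 0]], dtype=float),
                    "command": "next_page"},
    "swipe_up":    {"primitives": np.array([[0, 1], [0, 1], [0, 1]], dtype=float),
                    "command": "scroll_up"},
    "push":        {"primitives": np.array([[0, 0], [0, 0], [0, 0]], dtype=float),
                    "command": "select"},
}

def primitives_from_positions(positions: np.ndarray) -> np.ndarray:
    """Derive direction primitives from the sensed variation of position."""
    deltas = np.diff(positions, axis=0)
    norms = np.linalg.norm(deltas, axis=1, keepdims=True)
    return np.divide(deltas, norms, out=np.zeros_like(deltas), where=norms > 1e-6)

def match_gesture(positions: np.ndarray):
    """Compare the primitives against every template and return the closest command."""
    prims = primitives_from_positions(positions)
    best_name, best_dist = None, np.inf
    for name, template in TEMPLATES.items():
        dist = np.linalg.norm(prims - template["primitives"])
        if dist < best_dist:
            best_name, best_dist = name, dist
    return TEMPLATES[best_name]["command"], best_name

if __name__ == "__main__":
    # Control object moves steadily to the right across four sampled positions.
    track = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.0], [3.0, -0.1]])
    print(match_gesture(track))   # ('next_page', 'swipe_right')
```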