Patents Assigned to Ultrahaptics IP Two Limited
  • Patent number: 11461966
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, e.g., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured from image information or its equivalents. Predictive information can be improved by applying techniques that correlate it with information from observations.
    Type: Grant
    Filed: November 20, 2020
    Date of Patent: October 4, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Kevin A. Horowitz, David S. Holz
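The predictive-entity idea above — predict the control object's next position from motion information, then improve the prediction by correlating it with observations — can be sketched as a simple alpha-beta filter. This is an illustrative sketch only; the function names and filter gains are assumptions, not the patent's method.

```python
# A constant-velocity model predicts the next position of a tracked
# control object; each new observation then corrects the prediction.

def predict(pos, vel, dt=1.0):
    """Predict the next position under a constant-velocity model."""
    return pos + vel * dt

def correct(pos, vel, observed, dt=1.0, alpha=0.85, beta=0.005):
    """Blend the prediction with an observation (alpha-beta filter)."""
    predicted = predict(pos, vel, dt)
    residual = observed - predicted
    new_pos = predicted + alpha * residual
    new_vel = vel + (beta / dt) * residual
    return new_pos, new_vel

# Track one coordinate of a moving hand from noisy observations.
pos, vel = 0.0, 1.0
for observed in [1.1, 1.9, 3.2, 4.0, 5.1]:
    pos, vel = correct(pos, vel, observed)
```

With a high `alpha`, the estimate tracks the observations closely while the velocity estimate stays smooth.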
  • Publication number: 20220300085
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Application
    Filed: June 6, 2022
    Publication date: September 22, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
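The mode-switching scheme above — a virtual surface construct whose position trails the control object, with penetration triggering a control-mode change — can be sketched in one dimension. The mode names, offset, and easing rate below are illustrative assumptions.

```python
# The virtual plane eases toward a point a fixed offset in front of the
# fingertip; a quick deliberate push outruns the update and penetrates it.

PLANE_OFFSET = 0.1  # plane floats this far (meters) in front of the fingertip

def update_plane(plane_z, fingertip_z, rate=0.2):
    """Move the plane a fraction of the way toward its target position."""
    target = fingertip_z - PLANE_OFFSET
    return plane_z + rate * (target - plane_z)

def control_mode(plane_z, fingertip_z):
    """'engaged' once the fingertip has penetrated the plane, else 'hover'."""
    return "engaged" if fingertip_z <= plane_z else "hover"

# Fingertip z-coordinates: a slow drift, then a deliberate push forward.
plane_z, modes = 0.4, []
for fingertip_z in [0.48, 0.30]:
    plane_z = update_plane(plane_z, fingertip_z)
    modes.append(control_mode(plane_z, fingertip_z))
```

Because the plane only trails the fingertip slowly, casual drift stays in hover mode while a fast push engages.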
  • Patent number: 11435788
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system to augment the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Grant
    Filed: March 1, 2021
    Date of Patent: September 6, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Paul Durdik
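The blind-spot geometry above — regions within a given angular band of the camera's tangential axis — can be sketched as a simple angle test. The coordinate convention and band limits below are illustrative assumptions drawn from the 20-85 degree figure in the abstract.

```python
# Check whether a point (relative to the camera) falls inside the angular
# band, measured from the tangential axis, that the prism-augmented field
# of view covers.

import math

def angle_from_tangential(dx, dy):
    """Angle (degrees) of a point from the camera's tangential axis."""
    return math.degrees(math.atan2(dy, dx))

def in_augmented_view(dx, dy, low=20.0, high=85.0):
    """True if the point lies in the redirected 20-85 degree band."""
    return low <= angle_from_tangential(dx, dy) <= high
```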
  • Publication number: 20220277527
    Abstract: The technology disclosed relates to a method of realistic rendering of a real object as a virtual object in a virtual space using an offset in the position of the hand in a three-dimensional (3D) sensory space. An offset between expected positions of the eye(s) of a wearer of a head mounted device and a sensor attached to the head mounted device for sensing a position of at least one hand in a three-dimensional (3D) sensory space is determined. A position of the hand in the three-dimensional (3D) sensory space can be sensed using a sensor. The sensed position of the hand can be transformed by the offset into a re-rendered position of the hand as would appear to the wearer of the head mounted device if the wearer were looking at the actual hand. The re-rendered hand can be depicted to the wearer of the head mounted device.
    Type: Application
    Filed: February 7, 2022
    Publication date: September 1, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Alex Marcolina, David Holz
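The transform described above — re-rendering the sensed hand position as it would appear from the wearer's eyes — reduces, in its simplest form, to applying the calibrated eye-to-sensor offset. The vectors and offset values below are illustrative assumptions.

```python
# Transform a hand position from the head-mounted sensor's coordinates to
# the position it would occupy as seen from the wearer's eyes.

def to_eye_space(sensor_pos, eye_to_sensor_offset):
    """Apply the fixed offset between the sensor and the expected
    position of the wearer's eye(s)."""
    return tuple(p + o for p, o in zip(sensor_pos, eye_to_sensor_offset))

# Assumed geometry: sensor mounted 4 cm above and 8 cm in front of the eyes.
offset = (0.0, 0.04, 0.08)
rendered = to_eye_space((0.10, -0.20, 0.35), offset)
```

A full implementation would use a rigid-body transform (rotation plus translation); the pure translation here is the minimal case.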
  • Patent number: 11429194
    Abstract: Methods and systems are provided for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining such an intent, subsequently interpreting user gestures in accordance with a second mode of operation.
    Type: Grant
    Filed: December 6, 2021
    Date of Patent: August 30, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
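Analyzing a path of movement for an intent to change modes can be sketched with a simple geometric cue. The choice of cue here — a sharp reversal of direction across three consecutive samples — is an illustrative assumption, not the patent's criterion.

```python
# Detect a mode-change intent from a sampled 2D path: a negative dot
# product between successive path segments means the motion reversed.

def direction(a, b):
    return (b[0] - a[0], b[1] - a[1])

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

def intends_mode_change(path):
    """True if any three consecutive samples show a direction reversal."""
    for a, b, c in zip(path, path[1:], path[2:]):
        if dot(direction(a, b), direction(b, c)) < 0:
            return True
    return False
```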
  • Publication number: 20220269352
    Abstract: The technology disclosed relates to the creation of a gesture library for subsequent use in filtering gestures. In particular, the methods disclosed here generate and store gestures and their characteristic values to create a set of user-defined reference gestures that can be compared against characteristics of actual gestures performed in a 3D sensory space. Based on these comparisons, a set of gestures of interest may be filtered from all the gestures performed in the 3D sensory space. The technology disclosed also relates to customizing gesture interpretation for a particular user, and to setting parameters for recognizing gestures by prompting the user to select values for characteristics of the gestures. In one implementation, the technology disclosed includes performing characteristic-focused demonstrations of boundaries of the gesture.
    Type: Application
    Filed: May 10, 2022
    Publication date: August 25, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
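The gesture-library workflow above — store user-defined reference gestures with characteristic values, then filter performed gestures by comparison — can be sketched as follows. The particular characteristics (speed, path length) and the tolerance are illustrative assumptions.

```python
# A small user-defined gesture library; performed gestures are kept only
# if every characteristic falls within a relative tolerance of a reference.

REFERENCE_GESTURES = {
    "swipe": {"speed": 1.5, "path_length": 0.40},
    "tap":   {"speed": 0.8, "path_length": 0.05},
}

def matches(observed, reference, tolerance=0.25):
    """True if every characteristic is within `tolerance` (relative)."""
    return all(abs(observed[k] - v) <= tolerance * v for k, v in reference.items())

def filter_gestures(performed):
    """Keep only gestures of interest, labeled with the matched reference."""
    out = []
    for gesture in performed:
        for name, ref in REFERENCE_GESTURES.items():
            if matches(gesture, ref):
                out.append(name)
                break
    return out
```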
  • Publication number: 20220270218
    Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to an external screen and is viewable through a headset reflector; an inverse of the calibration image is provided to a headset display and reflected off the reflector toward a camera of the system, which simultaneously observes the calibration image on the external screen. One or more cameras are located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created by searching through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image seen through the reflector, and the transform is then used by the headset to compensate for distortions.
    Type: Application
    Filed: May 10, 2022
    Publication date: August 25, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Johnathon Scott Selstad, David Samuel Holz
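The calibration search above — adjust the projection position of the inverse image until it cancels the calibration image as observed by the camera — can be sketched as a toy one-dimensional residual minimization. The 1D images and brute-force search are illustrative assumptions standing in for the 2D distortion mapping.

```python
# Slide an inverse image across a calibration image and pick the shift
# where the overlaid brightness residual is smallest (best cancellation).

def residual(calibration, inverse, shift):
    """Summed absolute brightness after overlaying the shifted inverse
    image onto the calibration image (perfect cancellation -> 0)."""
    total = 0
    for i, c in enumerate(calibration):
        j = i - shift
        total += abs(c + (inverse[j] if 0 <= j < len(inverse) else 0))
    return total

def best_shift(calibration, inverse, max_shift=3):
    """Search projection positions for the best cancellation."""
    return min(range(-max_shift, max_shift + 1),
               key=lambda s: residual(calibration, inverse, s))

calibration = [0, 0, 5, 9, 5, 0, 0]
inverse = [-5, -9, -5]  # negative of the pattern, initially unaligned
```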
  • Patent number: 11418706
    Abstract: The technology disclosed relates to adjusting the monitored field of view of a camera and/or a view of a virtual scene from a point of view of a virtual camera based on the distance between tracked objects. For example, if the user's hand is being tracked for gestures, the closer the hand gets to another object, the tighter the frame can become—i.e., the more the camera can zoom in so that the hand and the other object occupy most of the frame. The camera can also be reoriented so that the hand and the other object remain in the center of the field of view. The distance between two objects in a camera's field of view can be determined and a parameter of a motion-capture system adjusted based thereon. In particular, the pan and/or zoom levels of the camera may be adjusted in accordance with the distance.
    Type: Grant
    Filed: April 30, 2021
    Date of Patent: August 16, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
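The distance-to-zoom relationship above can be sketched directly: as the tracked hand nears the other object, the zoom level rises so both fill the frame. The mapping function and clamp bounds are illustrative assumptions.

```python
# Map the distance between two tracked objects to a camera zoom level,
# clamped to the camera's supported range.

def distance(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def zoom_level(hand, other, base_fov=0.6, min_zoom=1.0, max_zoom=4.0):
    """Closer objects -> higher zoom, clamped to the camera's limits."""
    d = distance(hand, other)
    zoom = base_fov / max(d, 1e-6)
    return max(min_zoom, min(max_zoom, zoom))
```

Pan would be handled analogously, steering the optical axis toward the midpoint of the two objects.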
  • Publication number: 20220254138
    Abstract: The technology disclosed relates to identifying an object in a field of view of a camera. In particular, it relates to identifying a display in the field of view of the camera. This is achieved by monitoring a space, including acquiring a series of image frames of the space using the camera and detecting one or more light sources in the series of image frames. Further, one or more frequencies of periodic intensity or brightness variations, also referred to as the 'refresh rate', of light emitted from the light sources are measured. Based on the one or more frequencies of periodic intensity variations of light emitted from the light sources, at least one display that includes the light sources is identified.
    Type: Application
    Filed: May 2, 2022
    Publication date: August 11, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David Holz
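Measuring the periodic brightness variation of a detected light source can be sketched with a discrete Fourier transform over its per-frame brightness samples; a source whose dominant frequency matches a common display refresh rate is flagged as a display. The naive DFT and the rate list are illustrative assumptions.

```python
# Estimate the dominant flicker frequency of a light source from its
# brightness across image frames, and test it against display refresh rates.

import math

def dominant_frequency(samples, frame_rate):
    """Return the nonzero DFT bin frequency (Hz) with the most energy."""
    n = len(samples)
    best_k, best_power = 1, -1.0
    for k in range(1, n // 2 + 1):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return best_k * frame_rate / n

def looks_like_display(samples, frame_rate, refresh_rates=(50.0, 60.0)):
    """True if the source pulses near a common display refresh rate."""
    f = dominant_frequency(samples, frame_rate)
    return any(abs(f - r) < 2.0 for r in refresh_rates)
```

The camera's frame rate must exceed twice the refresh rate (here, 240 fps for a 60 Hz display) for the flicker to be sampled without aliasing.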
  • Publication number: 20220236808
    Abstract: A method and system are provided for controlling a machine using gestures. The method includes sensing a variation of position of a control object using an imaging system, determining, from the variation, one or more primitives describing a characteristic of a control object moving in space, comparing the one or more primitives to one or more gesture templates in a library of gesture templates, selecting, based on a result of the comparing, one or more gesture templates corresponding to the one or more primitives, and providing at least one gesture template of the selected one or more gesture templates as an indication of a command to issue to a machine under control responsive to the variation.
    Type: Application
    Filed: February 7, 2022
    Publication date: July 28, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
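The pipeline above — sense a variation of position, reduce it to primitives, compare against a library of gesture templates, and emit the matched template's command — can be sketched with two simple primitives (net direction and extent). The primitives, templates, and commands are illustrative assumptions.

```python
# Reduce a sensed position variation to primitives, match the nearest
# template in a library, and return that template's command.

TEMPLATE_LIBRARY = {
    # name: (direction primitive, extent primitive, command)
    "swipe_right": ((1.0, 0.0), 0.30, "next_page"),
    "swipe_left":  ((-1.0, 0.0), 0.30, "previous_page"),
}

def primitives(positions):
    """Net unit direction and extent of a variation in position."""
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    extent = (dx * dx + dy * dy) ** 0.5
    if extent == 0:
        return (0.0, 0.0), 0.0
    return (dx / extent, dy / extent), extent

def command_for(positions):
    """Select the closest template and return its command."""
    (ux, uy), extent = primitives(positions)
    best = min(
        TEMPLATE_LIBRARY.values(),
        key=lambda t: (t[0][0] - ux) ** 2 + (t[0][1] - uy) ** 2 + (t[1] - extent) ** 2,
    )
    return best[2]
```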
  • Patent number: 11392212
    Abstract: The technology disclosed relates to a method of realistic displacement of a virtual object during an interaction between a control object in a three-dimensional (3D) sensory space and the virtual object in a virtual space. In particular, it relates to detecting free-form gestures of a control object in the 3D sensory space and generating for display a 3D solid control object model, including sub-components of the control object, during the free-form gestures; and, in response to detecting a 2D sub-component free-form gesture of the control object in virtual contact with the virtual object, depicting in the generated display the virtual contact and the resulting rotation of the virtual object by the 3D solid control object model.
    Type: Grant
    Filed: February 26, 2021
    Date of Patent: July 19, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Alex Marcolina, David S. Holz
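The contact-driven rotation above can be sketched in 2D: when a sub-component of the control object model (e.g., a fingertip) pushes a virtual object off-center, the object rotates in proportion to the torque about its center. The geometry and gain are illustrative assumptions.

```python
# Rotation of a virtual object from a push at a contact point: the
# z-component of the cross product r x F gives the signed torque.

def rotation_from_contact(center, contact, push, gain=1.0):
    """Signed rotation (radians, positive = counterclockwise) produced by
    a push vector applied at a contact point on the virtual object."""
    rx, ry = contact[0] - center[0], contact[1] - center[1]
    return gain * (rx * push[1] - ry * push[0])
```

A push directed through the object's center produces zero torque, i.e., pure displacement rather than rotation.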
  • Patent number: 11392211
    Abstract: The technology disclosed relates to operating a motion-capture system responsive to available computational resources. In particular, it relates to assessing a level of image acquisition and image-analysis resources available using benchmarking of system components. In response, one or more image acquisition parameters and/or image-analysis parameters are adjusted. Acquisition and/or analysis of image data are then made compliant with the adjusted image acquisition parameters and/or image-analysis parameters. In some implementations, image acquisition parameters include frame resolution and frame capture rate and image-analysis parameters include analysis algorithm and analysis density.
    Type: Grant
    Filed: July 23, 2020
    Date of Patent: July 19, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David Holz
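Adjusting acquisition and analysis parameters to the available resources can be sketched as a tiered mapping from a benchmark score. The tiers, scores, and parameter values below are illustrative assumptions, not the patent's.

```python
# Map a benchmark of available resources to image-acquisition parameters
# (frame resolution, capture rate) and an image-analysis density.

def capture_parameters(benchmark_score):
    """Map a benchmark score (0-100) to acquisition/analysis parameters."""
    if benchmark_score >= 70:
        return {"resolution": (1280, 720), "fps": 120, "analysis_density": "full"}
    if benchmark_score >= 40:
        return {"resolution": (640, 480), "fps": 60, "analysis_density": "half"}
    return {"resolution": (320, 240), "fps": 30, "analysis_density": "sparse"}
```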
  • Patent number: 11386711
    Abstract: The technology disclosed relates to highly functional/highly accurate motion sensory control devices for use in automotive and industrial control systems capable of capturing and providing images to motion capture systems that detect gestures in a three dimensional (3D) sensory space.
    Type: Grant
    Filed: August 13, 2015
    Date of Patent: July 12, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Justin Schunick, Neeloy Roy, Chen Zheng, Ward Travis
  • Publication number: 20220215623
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, e.g., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured from image information or its equivalents. Predictive information can be improved by applying techniques that correlate it with information from observations.
    Type: Application
    Filed: March 21, 2022
    Publication date: July 7, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Kevin A. Horowitz, David S. Holz
  • Publication number: 20220197479
    Abstract: A method is provided, which includes identifying, for presentation, separate interactive objects, each separate interactive object being associated with a particular finger of two or more fingers, wherein a position of each separate interactive object changes in response to a movement of the associated finger of the two or more fingers, obtaining, for at least one separate interactive object, a determination of a property of a screen area surrounding the at least one separate interactive object, and modifying a presentation property of the at least one separate interactive object based on the determined property of the screen area surrounding the at least one separate interactive object. The property of the screen area surrounding the at least one separate interactive object is determined based on at least one of: a brightness, a color and a pattern of the screen area surrounding the at least one separate interactive object.
    Type: Application
    Filed: March 8, 2022
    Publication date: June 23, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
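Modifying a presentation property of a per-finger interactive object based on the surrounding screen area can be sketched with the brightness case from the abstract: sample the neighborhood, then pick a contrasting color. The threshold and colors are illustrative assumptions.

```python
# Choose a contrasting color for an on-screen interactive object based on
# the mean brightness of the screen area surrounding it.

def mean_brightness(pixels):
    return sum(pixels) / len(pixels)

def presentation_color(surrounding_pixels, threshold=0.5):
    """Dark surroundings -> light object, light surroundings -> dark object."""
    if mean_brightness(surrounding_pixels) < threshold:
        return "white"
    return "black"
```

The same pattern extends to the abstract's other cues (color and texture of the surrounding area) by swapping in the relevant statistic.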
  • Publication number: 20220198776
    Abstract: A method for detecting a finger is provided. The method includes obtaining a plurality of digital images including a first digital image captured by a camera from a field of view containing a background and a hand including at least one finger, and obtaining an identification of pixels of the plurality of digital images that correspond to at least one finger that is visible in the plurality of digital images rather than to the background, the pixels being identified by: obtaining, from the digital images, a Gaussian brightness falloff pattern indicative of at least one finger, identifying an axis of the at least one finger based on the obtained Gaussian brightness falloff pattern indicative of the at least one finger without identifying edges of the at least one finger, and identifying the pixels that correspond to the at least one finger based on the identified axis.
    Type: Application
    Filed: March 11, 2022
    Publication date: June 23, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Hua Yang
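Identifying a finger axis from a Gaussian brightness falloff, without edge detection, can be sketched by estimating the brightness peak of each image row and chaining the peaks into an axis. The centroid-based peak estimate below is an illustrative stand-in for a true Gaussian fit.

```python
# Per-row brightness centroids approximate the centers of Gaussian-like
# falloff profiles; chained together they trace the finger's axis.

def row_peak(row):
    """Brightness-weighted centroid approximates the Gaussian center."""
    total = sum(row)
    return sum(i * v for i, v in enumerate(row)) / total

def finger_axis(image):
    """One (row, column-center) sample per image row."""
    return [(r, row_peak(row)) for r, row in enumerate(image)]

# A vertical bright ridge around column 2 (Gaussian-like falloff per row).
image = [
    [0.1, 0.5, 1.0, 0.5, 0.1],
    [0.1, 0.5, 1.0, 0.5, 0.1],
    [0.1, 0.5, 1.0, 0.5, 0.1],
]
axis = finger_axis(image)
```

Pixels within a fixed distance of the recovered axis would then be labeled as finger rather than background.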
  • Patent number: 11353962
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Grant
    Filed: August 6, 2020
    Date of Patent: June 7, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
  • Patent number: 11354787
    Abstract: The disclosed technology teaches an AR calibration system for compensating for AR headset distortions. A calibration image is provided to an external screen and viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector and observed by a camera of the system while it is simultaneously observing the calibration image on the external screen. The camera is located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera cancels out an acceptable portion of the calibration image provided to the external screen as observed through the reflector by the camera, and the transform is used by the headset, to compensate for distortions.
    Type: Grant
    Filed: November 5, 2019
    Date of Patent: June 7, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Johnathon Scott Selstad, David Samuel Holz
  • Patent number: 11347317
    Abstract: The technology disclosed relates to filtering gestures, according to one implementation. In particular, it relates to distinguishing interesting gestures from non-interesting gestures in a three-dimensional (3D) sensory space by comparing characteristics of user-defined reference gestures against characteristics of actual gestures performed in the 3D sensory space. Based on the comparison, a set of gestures of interest are filtered from all the gestures performed in the 3D sensory space. The technology disclosed also relates to customizing gesture interpretation for a particular user, according to another implementation. In particular, it relates to setting parameters for recognizing gestures by prompting the user to select values for characteristics of the gestures. In one implementation, the technology disclosed includes performing characteristic-focused demonstrations of boundaries of the gesture.
    Type: Grant
    Filed: April 14, 2020
    Date of Patent: May 31, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Publication number: 20220155873
    Abstract: A method and system are provided for controlling an electronic device using gestures and/or a hand-held device. The method includes capturing, in a 3D sensor space, an image including a user-manipulable hand-held input device and a body part of a user; finding an entry in a database of multiple user-manipulable hand-held input devices that matches the image of the device, wherein each device having an entry in the database generates signals in response to performing one or more specific control manipulations; determining a primary control mode, i.e., whether to primarily control the electronic device using 3D gestures or using control manipulations directly from the hand-held input device, based on a predetermined priority level associated with the device; and controlling the electronic device using the determined primary control mode.
    Type: Application
    Filed: November 22, 2021
    Publication date: May 19, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David Holz
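The priority-based decision above — match the imaged device to a database entry, then let its priority level decide whether gestures or device manipulations primarily control the electronic device — can be sketched as follows. The database entries, priority values, and threshold are illustrative assumptions.

```python
# Decide the primary control mode from the matched device's priority level.

DEVICE_DATABASE = {
    "game_controller": {"priority": 9},   # rich, precise inputs: device leads
    "laser_pointer":   {"priority": 2},   # sparse inputs: gestures lead
}

GESTURE_PRIORITY = 5  # assumed threshold favoring free-space gestures

def primary_control_mode(matched_device):
    """Return which input channel primarily drives the electronic device."""
    entry = DEVICE_DATABASE[matched_device]
    if entry["priority"] >= GESTURE_PRIORITY:
        return "device_manipulations"
    return "3d_gestures"
```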