Patents Assigned to Ultrahaptics IP Two Limited
  • Publication number: 20210003977
    Abstract: The technology disclosed relates to selecting among devices in a room to interact with. It also relates to operating a smartphone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three-dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
    Type: Application
    Filed: September 21, 2020
    Publication date: January 7, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Robert Samuel Gordon, Paul Alan Durdik, Maxwell Sills
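Selecting among devices in a room by gesture can be reduced to a ray test: pick the device whose direction from the hand best aligns with the pointing ray. A minimal sketch of that idea follows; the function name, the device names, and the cosine-similarity selection rule are all illustrative assumptions, not taken from the patent.

```python
import math

def select_pointed_device(hand_pos, pointing_dir, devices):
    """Pick the device whose direction from the hand best aligns
    with the pointing ray (maximum cosine similarity)."""
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)
    d = norm(pointing_dir)
    best, best_cos = None, -2.0
    for name, pos in devices.items():
        to_dev = norm(tuple(p - h for p, h in zip(pos, hand_pos)))
        cos = sum(a * b for a, b in zip(d, to_dev))
        if cos > best_cos:
            best, best_cos = name, cos
    return best

devices = {"tv": (0.0, 0.0, 3.0), "lamp": (2.0, 0.0, 0.0)}
print(select_pointed_device((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), devices))  # tv
```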
  • Patent number: 10880537
    Abstract: A motion sensory and imaging device capable of acquiring imaging information of a scene and providing at least a near-real-time pass-through of that imaging information to a user. The sensory and imaging device can be used stand-alone or coupled to a wearable or portable device to create a wearable sensory system capable of presenting to the wearer the imaging information augmented with virtualized or created presentations of information.
    Type: Grant
    Filed: July 8, 2019
    Date of Patent: December 29, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Neeloy Roy, Hongyuan He
  • Publication number: 20200400428
    Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera based on reflections therefrom or shadows cast thereby.
    Type: Application
    Filed: September 2, 2020
    Publication date: December 24, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. HOLZ
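The cross-section idea above can be illustrated with a per-row scan of a binary silhouette (a shadow cast by, or reflection from, the object): each row yields a chord approximating the object's cross-section at that height. This is a deliberately simplified sketch; the function name and the chord representation are assumptions for illustration.

```python
def cross_sections(silhouette):
    """For each row of a binary silhouette mask, approximate the
    object's cross-section at that height as a chord:
    (center_x, width). Rows with no object pixels yield None."""
    sections = []
    for row in silhouette:
        cols = [i for i, v in enumerate(row) if v]
        if not cols:
            sections.append(None)
        else:
            left, right = min(cols), max(cols)
            sections.append(((left + right) / 2.0, right - left + 1))
    return sections

mask = [
    [0, 0, 1, 1, 0],
    [0, 1, 1, 1, 1],
    [0, 0, 0, 0, 0],
]
print(cross_sections(mask))  # [(2.5, 2), (2.5, 4), None]
```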
  • Publication number: 20200401232
    Abstract: The technology disclosed relates to motion capture and gesture recognition. In particular, it calculates the exerted force implied by a human hand motion and applies the equivalent through a robotic arm to a target object. In one implementation, this is achieved by tracking the motion and contact of the human hand and generating corresponding robotic commands that replicate the motion and contact of the human hand on a workpiece through a robotic tool.
    Type: Application
    Filed: September 3, 2020
    Publication date: December 24, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Maxwell Sills, Robert S. Gordon, Paul Durdik
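Calculating the force implied by a tracked hand motion can be sketched with finite differences: acceleration from three position samples, then Newton's second law. A one-dimensional illustration under assumed sample values; the function name is hypothetical.

```python
def implied_force(positions, dt, mass):
    """Estimate the force implied by a tracked hand trajectory via
    central-difference acceleration on the last three samples:
    a = (p[-1] - 2*p[-2] + p[-3]) / dt**2, then F = m * a (1-D sketch)."""
    a = (positions[-1] - 2 * positions[-2] + positions[-3]) / dt ** 2
    return mass * a

# Uniform acceleration of 2 m/s^2: p(t) = t^2 sampled at t = 0.0, 0.1, 0.2 s;
# a hand-held mass of 0.5 kg implies F = 1.0 N.
samples = [0.0, 0.01, 0.04]
print(implied_force(samples, dt=0.1, mass=0.5))
```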
  • Patent number: 10866632
    Abstract: The technology disclosed relates to a method of realistically simulating real-world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and a virtual object in a virtual space with which the control object interacts. In particular, it relates to detecting free-form gestures of a control object in the 3D sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and, in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting in the generated display the virtual contact and the resulting motions of the virtual object driven by the 3D solid control object model.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: December 15, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
  • Patent number: 10846942
    Abstract: Free-space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, e.g., humans, animals, and/or machines). Predictive entities can be driven using motion information captured from image information or its equivalents. Predictive information can be improved by applying techniques that correlate it with information from observations.
    Type: Grant
    Filed: August 29, 2014
    Date of Patent: November 24, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Kevin A Horowitz, David S Holz
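A predictive entity driven by observations can be sketched as an alpha-beta filter: the entity carries a position and velocity model, predicts forward each frame, and corrects itself against the captured observation. This is a generic tracking sketch, not the patent's specific method; all names and gain values are illustrative.

```python
def alpha_beta_track(observations, dt=1.0, alpha=0.85, beta=0.005):
    """Maintain a predictive entity (position + velocity) for a control
    object: predict forward each step, then correct with the observation.
    Returns the filtered positions."""
    x, v = observations[0], 0.0
    out = []
    for z in observations:
        x_pred = x + v * dt          # predict from the motion model
        r = z - x_pred               # innovation vs. the observation
        x = x_pred + alpha * r       # correct the position estimate
        v = v + (beta / dt) * r      # correct the velocity estimate
        out.append(x)
    return out

track = alpha_beta_track([0.0, 1.0, 2.1, 2.9, 4.0])
print([round(p, 3) for p in track])
```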
  • Publication number: 20200356179
    Abstract: The technology disclosed relates to operating a motion-capture system responsive to available computational resources. In particular, it relates to assessing a level of image acquisition and image-analysis resources available using benchmarking of system components. In response, one or more image acquisition parameters and/or image-analysis parameters are adjusted. Acquisition and/or analysis of image data are then made compliant with the adjusted image acquisition parameters and/or image-analysis parameters. In some implementations, image acquisition parameters include frame resolution and frame capture rate and image-analysis parameters include analysis algorithm and analysis density.
    Type: Application
    Filed: July 23, 2020
    Publication date: November 12, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David HOLZ
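The benchmark-then-adjust loop described above can be sketched as: time one analysis pass, then pick acquisition parameters (frame resolution, capture rate, analysis density) that fit a time budget. The parameter tiers and names here are illustrative assumptions, not values from the patent.

```python
import time

def benchmark_and_configure(analysis_fn, budget_s=0.005):
    """Benchmark one image-analysis pass, then choose acquisition
    parameters that should keep processing within the time budget."""
    start = time.perf_counter()
    analysis_fn()
    elapsed = time.perf_counter() - start
    if elapsed <= budget_s:
        return {"resolution": (640, 480), "fps": 60, "analysis": "dense"}
    elif elapsed <= 4 * budget_s:
        return {"resolution": (320, 240), "fps": 30, "analysis": "sparse"}
    return {"resolution": (160, 120), "fps": 15, "analysis": "sparse"}

config = benchmark_and_configure(lambda: sum(range(1000)))
print(config)
```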
  • Publication number: 20200356238
    Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object through interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
    Type: Application
    Filed: July 28, 2020
    Publication date: November 12, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Hua YANG, Gabriel A. HARE, Maxwell SILLS
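One way to picture dynamic manipulation-point selection: take the calculation points (e.g., fingertips) nearest the virtual object and derive a manipulation point from them. The midpoint rule below is a deliberately simplified assumption for illustration, not the patent's selection logic.

```python
import math

def select_manipulation_point(calc_points, object_center):
    """Pick the manipulation point as the midpoint of the two
    calculation points (e.g., fingertips) nearest the virtual
    object's center."""
    ranked = sorted(calc_points,
                    key=lambda p: math.dist(p, object_center))
    a, b = ranked[0], ranked[1]
    return tuple((x + y) / 2.0 for x, y in zip(a, b))

# Thumb and index tip flank the object; the far point is ignored.
fingertips = [(0.0, 1.0, 0.0), (0.2, 1.0, 0.0), (3.0, 3.0, 3.0)]
print(select_manipulation_point(fingertips, (0.1, 1.0, 0.0)))
```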
  • Patent number: 10831281
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with, a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
    Type: Grant
    Filed: May 2, 2019
    Date of Patent: November 10, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Hua Yang, Leonid Kontsevich, James Donald, David S. Holz, Jonathan Marsden, Paul Durdik
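Mode switching against a computationally defined plane comes down to a signed-distance test: the control object is in one mode on one side of the plane and switches when it crosses. A minimal sketch under assumed names and a z=0 plane:

```python
def plane_side(point, plane_point, plane_normal):
    """Signed distance of the control object to the virtual touch plane."""
    return sum(n * (p - q) for n, p, q in
               zip(plane_normal, point, plane_point))

def track_mode(positions, plane_point=(0, 0, 0), plane_normal=(0, 0, 1)):
    """Switch between 'hover' and 'touch' modes as the control object
    crosses the computationally defined plane."""
    return ["touch" if plane_side(p, plane_point, plane_normal) <= 0
            else "hover" for p in positions]

path = [(0, 0, 0.3), (0, 0, 0.1), (0, 0, -0.05), (0, 0, 0.2)]
print(track_mode(path))  # ['hover', 'hover', 'touch', 'hover']
```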
  • Patent number: 10832470
    Abstract: Free-space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, e.g., humans, animals, and/or machines). Predictive entities can be driven using motion information captured from image information or its equivalents. Predictive information can be improved by applying techniques that correlate it with information from observations.
    Type: Grant
    Filed: August 12, 2019
    Date of Patent: November 10, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Kevin A. Horowitz, David S. Holz
  • Patent number: 10832080
    Abstract: The technology disclosed relates to identifying an object in a field of view of a camera. In particular, it relates to identifying a display in the field of view of the camera. This is achieved by monitoring a space, including acquiring a series of image frames of the space using the camera and detecting one or more light sources in the series of image frames. Further, one or more frequencies of periodic intensity or brightness variations, also referred to as the ‘refresh rate’, of light emitted from the light sources are measured. Based on the one or more frequencies of periodic intensity variations of light emitted from the light sources, at least one display that includes the light sources is identified.
    Type: Grant
    Filed: March 7, 2019
    Date of Patent: November 10, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David Holz
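Measuring a light source's flicker frequency from a series of image frames can be sketched by counting zero crossings of the mean-removed brightness time series. The zero-crossing estimator and sample rates below are illustrative assumptions, not the patent's method; a real pipeline would need a camera faster than the display's refresh rate (or aliasing-aware analysis).

```python
import math

def estimate_refresh_rate(brightness, fps):
    """Estimate the flicker frequency of a light source from a brightness
    time series by counting zero crossings of the mean-removed signal
    (two crossings per cycle)."""
    mean = sum(brightness) / len(brightness)
    centered = [b - mean for b in brightness]
    crossings = sum(1 for a, b in zip(centered, centered[1:])
                    if a * b < 0)
    duration = len(brightness) / fps
    return crossings / (2.0 * duration)

# Synthetic 60 Hz flicker sampled by a 1000 fps camera for 1 second.
fps = 1000
signal = [0.5 + 0.5 * math.sin(2 * math.pi * 60 * t / fps)
          for t in range(fps)]
print(estimate_refresh_rate(signal, fps))
```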
  • Patent number: 10817130
    Abstract: The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing within the 3D sensory space, or virtual object density in the gestural interface.
    Type: Grant
    Filed: August 3, 2018
    Date of Patent: October 27, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
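Determining a dominant gesture from trajectory magnitudes can be sketched as: compute the total path length of each candidate trajectory and keep the largest, so a deliberate sweep beats incidental jitter. The gesture labels and path-length metric are illustrative assumptions.

```python
import math

def dominant_gesture(trajectories):
    """Pick the dominant gesture: the one whose spatial trajectory
    has the largest total path length."""
    def path_length(points):
        return sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    return max(trajectories, key=lambda k: path_length(trajectories[k]))

trajectories = {
    "swipe":  [(0, 0, 0), (0.3, 0, 0), (0.6, 0, 0)],   # deliberate sweep
    "tremor": [(0, 0, 0), (0.01, 0, 0), (0.0, 0, 0)],  # incidental jitter
}
print(dominant_gesture(trajectories))  # swipe
```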
  • Publication number: 20200334491
    Abstract: Enhanced contrast between an object of interest and background surfaces visible in an image is provided using controlled lighting directed at the object. Exploiting the falloff of light intensity with distance, a light source (or multiple light sources), such as an infrared light source, can be positioned near one or more cameras to shine light onto the object while the camera(s) capture images. The captured images can be analyzed to distinguish object pixels from background pixels.
    Type: Application
    Filed: June 29, 2020
    Publication date: October 22, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Hua Yang
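Exploiting light-intensity falloff for segmentation reduces to a brightness threshold: with the source near the camera, a nearby object is lit far more strongly than the distant background. The ratio-based threshold below is an illustrative assumption, not the patent's classifier.

```python
def segment_by_falloff(image, ratio=4.0):
    """Label pixels as object (1) or background (0), exploiting the
    1/r^2 falloff of a light source placed near the camera: pixels on
    the nearby object are markedly brighter than the distant background.
    The threshold splits pixels at `ratio` times the minimum brightness."""
    lo = min(min(row) for row in image)
    threshold = lo * ratio
    return [[1 if px > threshold else 0 for px in row] for row in image]

# Nearby hand (bright) against a far wall (dim).
frame = [
    [10, 12, 95, 90],
    [11, 88, 92, 12],
]
print(segment_by_falloff(frame))  # [[0, 0, 1, 1], [0, 1, 1, 0]]
```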
  • Publication number: 20200319715
    Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
    Type: Application
    Filed: June 22, 2020
    Publication date: October 8, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
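Monitoring a region for the presence or absence of a control object can be illustrated with simple frame differencing, the crudest form of image-based sensing: enough changed pixels between frames signals an object entering the region. The thresholds and names are illustrative assumptions.

```python
def object_present(frame_a, frame_b, diff_threshold=20, min_pixels=3):
    """Report a control object entering the monitored region when
    enough pixels change between two frames."""
    changed = sum(1 for row_a, row_b in zip(frame_a, frame_b)
                  for a, b in zip(row_a, row_b)
                  if abs(a - b) > diff_threshold)
    return changed >= min_pixels

empty = [[10, 10, 10], [10, 10, 10]]
hand  = [[10, 90, 95], [10, 88, 10]]
print(object_present(empty, hand))   # True
print(object_present(empty, empty))  # False
```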
  • Publication number: 20200320793
    Abstract: The technology disclosed relates to a method of realistic rendering of a real object as a virtual object in a virtual space using an offset in the position of the hand in a three-dimensional (3D) sensory space. An offset between expected positions of the eye(s) of a wearer of a head mounted device and a sensor attached to the head mounted device for sensing a position of at least one hand in a three-dimensional (3D) sensory space is determined. A position of the hand in the three-dimensional (3D) sensory space can be sensed using a sensor. The sensed position of the hand can be transformed by the offset into a re-rendered position of the hand as would appear to the wearer of the head mounted device if the wearer were looking at the actual hand. The re-rendered hand can be depicted to the wearer of the head mounted device.
    Type: Application
    Filed: March 11, 2020
    Publication date: October 8, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Alex MARCOLINA, David HOLZ
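The core transform described above is an offset applied to the sensed hand position so it re-renders where the wearer's eyes would expect it. A translation-only sketch under an assumed offset; the full method would also account for sensor rotation.

```python
def rerender_position(sensed_hand, sensor_to_eye_offset):
    """Transform the hand position sensed by the head-mounted sensor
    by the sensor-to-eye offset, yielding the position at which the
    hand should be re-rendered for the wearer."""
    return tuple(p + o for p, o in zip(sensed_hand, sensor_to_eye_offset))

# Illustrative offset: sensor sits 4 cm above and 9 cm in front of the eyes.
offset = (0.0, 0.04, 0.09)
print(rerender_position((0.1, -0.2, 0.4), offset))
```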
  • Publication number: 20200301515
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: February 28, 2020
    Publication date: September 24, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Isaac COHEN, Maxwell SILLS, Paul DURDIK
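Defining a virtual-camera parameter from a gesture parameter can be pictured as a direct mapping, e.g., change in pinch distance to field of view. The mapping, constants, and function name below are illustrative assumptions only.

```python
def camera_zoom_from_pinch(pinch_start, pinch_end,
                           base_fov=60.0, sensitivity=100.0):
    """Map one gesture parameter (change in pinch distance, meters) to
    one virtual-camera parameter (field of view, degrees). Widening
    the pinch zooms in by narrowing the FOV."""
    delta = pinch_end - pinch_start
    fov = base_fov - sensitivity * delta
    return max(10.0, min(120.0, fov))  # clamp to a sane FOV range

print(camera_zoom_from_pinch(0.02, 0.12))  # pinch widens by 10 cm
```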
  • Patent number: 10782657
    Abstract: The technology disclosed relates to selecting among devices in a room to interact with. It also relates to operating a smartphone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three-dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
    Type: Grant
    Filed: February 19, 2015
    Date of Patent: September 22, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Robert Samuel Gordon, Maxwell Sills, Paul Alan Durdik
  • Patent number: 10782847
    Abstract: The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing within the 3D sensory space, or virtual object density in the gestural interface.
    Type: Grant
    Filed: April 19, 2017
    Date of Patent: September 22, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
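The adaptive-responsiveness idea can be sketched as a gain that proportions on-screen travel to the user's comfortable physical movement range and grows with the user's distance from the sensor, where movements subtend less of the field of view. All constants and names below are illustrative assumptions.

```python
def adaptive_scale(movement_extent_m, screen_extent_px,
                   user_distance_m, reference_distance_m=0.6):
    """Proportion on-screen responsiveness (pixels per meter of hand
    travel) to the user's physical movement range, scaled up for
    users standing farther from the sensor."""
    base_gain = screen_extent_px / movement_extent_m
    return base_gain * (user_distance_m / reference_distance_m)

# 1920 px of horizontal travel mapped onto a 0.4 m comfortable sweep,
# for a user standing 1.2 m away.
print(adaptive_scale(0.4, 1920, 1.2))
```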
  • Publication number: 20200293115
    Abstract: The technology disclosed relates to using gestures to supplant or augment use of a standard input device coupled to a system. It also relates to controlling a display using gestures. It further relates to controlling a system using more than one input device. In particular, it relates to detecting a standard input device that causes on-screen actions on a display in response to control manipulations performed using the standard input device. Further, a library of analogous gestures is identified, which includes gestures that are analogous to the control manipulations and also cause the on-screen actions responsive to the control manipulations. Thus, when a gesture from the library of analogous gestures is detected, a signal is generated that mimics a standard signal from the standard input device and causes at least one on-screen action.
    Type: Application
    Filed: May 28, 2020
    Publication date: September 17, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David HOLZ
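The library of analogous gestures described above can be pictured as a lookup table from detected gestures to the standard-input signals they mimic. Every gesture name and event payload here is hypothetical, invented for illustration.

```python
# Hypothetical mapping from detected gestures to the standard-input
# signals they mimic.
ANALOGOUS_GESTURES = {
    "air_swipe_left":  {"device": "keyboard", "event": "key", "code": "PageDown"},
    "air_swipe_right": {"device": "keyboard", "event": "key", "code": "PageUp"},
    "pinch_drag":      {"device": "mouse",    "event": "drag", "button": "left"},
}

def mimic_standard_signal(gesture):
    """Look up a detected gesture in the analogous-gesture library and
    emit the signal a standard input device would have produced."""
    signal = ANALOGOUS_GESTURES.get(gesture)
    if signal is None:
        return None  # not analogous to any control manipulation
    return dict(signal)

print(mimic_standard_signal("air_swipe_left"))
print(mimic_standard_signal("wave"))  # None: no analogous manipulation
```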
  • Publication number: 20200286295
    Abstract: The technology disclosed can provide capabilities to view and/or interact with the real world to the user of a wearable (or portable) device, using a sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved user experience, greater safety, and greater functionality to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted devices (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
    Type: Application
    Filed: March 18, 2020
    Publication date: September 10, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. HOLZ