Patents Assigned to Ultrahaptics IP Two Limited
  • Patent number: 11347317
    Abstract: The technology disclosed relates to filtering gestures, according to one implementation. In particular, it relates to distinguishing interesting gestures from non-interesting gestures in a three-dimensional (3D) sensory space by comparing characteristics of user-defined reference gestures against characteristics of actual gestures performed in the 3D sensory space. Based on the comparison, a set of gestures of interest is filtered from all the gestures performed in the 3D sensory space. The technology disclosed also relates to customizing gesture interpretation for a particular user, according to another implementation. In particular, it relates to setting parameters for recognizing gestures by prompting the user to select values for characteristics of the gestures. In one implementation, the technology disclosed includes performing characteristic-focused demonstrations of boundaries of the gesture.
    Type: Grant
    Filed: April 14, 2020
    Date of Patent: May 31, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
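The filtering idea in patent 11347317 — keeping only gestures whose characteristics match user-defined reference gestures — can be illustrated with a minimal sketch. The `Gesture` characteristics, field names, and tolerance below are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    # Hypothetical characteristics of a gesture in the 3D sensory space
    path_length: float   # cm traversed by the control object
    duration: float      # seconds
    velocity: float      # cm/s, average

def matches(actual: Gesture, reference: Gesture, tolerance: float = 0.25) -> bool:
    """Compare each characteristic of an actual gesture against a
    user-defined reference gesture, within a relative tolerance."""
    for field in ("path_length", "duration", "velocity"):
        ref = getattr(reference, field)
        act = getattr(actual, field)
        if ref and abs(act - ref) / ref > tolerance:
            return False
    return True

def filter_gestures(actual, references):
    """Keep only gestures of interest: those matching any reference gesture."""
    return [g for g in actual if any(matches(g, r) for r in references)]

ref = Gesture(path_length=30.0, duration=0.5, velocity=60.0)
swipe = Gesture(path_length=28.0, duration=0.45, velocity=62.0)
jitter = Gesture(path_length=2.0, duration=0.1, velocity=20.0)
print(filter_gestures([swipe, jitter], [ref]))  # only the swipe survives
```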
  • Publication number: 20220155873
    Abstract: A method and system for controlling an electronic device using gestures and/or a device are provided. The method includes: capturing, in a 3D sensor space, an image including a user-manipulable hand-held input device and a body part of a user; finding an entry in a database of multiple user-manipulable hand-held input devices that matches the image of the user-manipulable hand-held input device, wherein each user-manipulable hand-held input device having an entry in the database respectively generates signals in response to performing one or more specific control manipulations; determining a primary control mode of primarily controlling the electronic device using 3D gestures or using control manipulations directly from the user-manipulable hand-held input device, the primary control mode being determined based on a predetermined priority level associated with the user-manipulable hand-held input device; and controlling the electronic device using the determined primary control mode.
    Type: Application
    Filed: November 22, 2021
    Publication date: May 19, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David HOLZ
  • Publication number: 20220147137
    Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object and in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
    Type: Application
    Filed: January 27, 2022
    Publication date: May 12, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur JOHNSTON, Johnathon Scott SELSTAD, Alex MARCOLINA
  • Patent number: 11321577
    Abstract: The technology disclosed relates to identifying an object in a field of view of a camera. In particular, it relates to identifying a display in the field of view of the camera. This is achieved by monitoring a space, including acquiring a series of image frames of the space using the camera and detecting one or more light sources in the series of image frames. Further, one or more frequencies of periodic intensity or brightness variations, also referred to as 'refresh rate', of light emitted from the light sources are measured. Based on the one or more frequencies of periodic intensity variations of light emitted from the light sources, at least one display that includes the light sources is identified.
    Type: Grant
    Filed: October 1, 2020
    Date of Patent: May 3, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David Holz
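The measurement step in patent 11321577 — estimating the frequency of periodic brightness variation of a light source across image frames — can be sketched with a spectral peak search. The function name, the FFT approach, and the 240 fps sampling rate are assumptions for illustration only:

```python
import numpy as np

def dominant_flicker_hz(brightness: np.ndarray, frame_rate: float) -> float:
    """Estimate the dominant frequency of periodic brightness variation
    ('refresh rate') from a per-frame brightness time series."""
    signal = brightness - brightness.mean()        # drop the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / frame_rate)
    return float(freqs[np.argmax(spectrum)])

# Simulated light source flickering at 60 Hz, sampled by a 240 fps camera
# (the camera must sample above twice the flicker frequency to resolve it)
t = np.arange(0, 1, 1 / 240.0)
brightness = 0.5 + 0.5 * np.sin(2 * np.pi * 60 * t)
print(dominant_flicker_hz(brightness, 240.0))  # ≈ 60.0
```

A source whose dominant flicker frequency matches a known display refresh rate (e.g. 60 Hz) can then be classified as a display.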
  • Patent number: 11307282
    Abstract: The technology disclosed relates to determining positional information about an object of interest. In particular, it includes conducting scanning of a field of interest with an emission from a transmission area according to an ordered scan pattern. The emission can be received to form a signal based upon at least one salient property (e.g., intensity, amplitude, frequency, polarization, phase, or other detectable feature) of the emission varying with time at the object of interest. Synchronization information about the ordered scan pattern can be derived from a source, a second signal broadcast separately, a social media share, others, and/or combinations thereof. A correspondence between at least one characteristic of the signal and the synchronization information can be established. Positional information can be determined based at least in part upon the correspondence.
    Type: Grant
    Filed: October 24, 2014
    Date of Patent: April 19, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Robert Samuel Gordon, Gabriel A. Hare, Neeloy Roy, Maxwell Sills, Paul Durdik
  • Patent number: 11308711
    Abstract: Enhanced contrast between an object of interest and background surfaces visible in an image is provided using controlled lighting directed at the object. Exploiting the falloff of light intensity with distance, a light source (or multiple light sources), such as an infrared light source, can be positioned near one or more cameras to shine light onto the object while the camera(s) capture images. The captured images can be analyzed to distinguish object pixels from background pixels.
    Type: Grant
    Filed: June 29, 2020
    Date of Patent: April 19, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Hua Yang
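The contrast-enhancement idea in patent 11308711 exploits the falloff of light intensity with distance: with a light source near the camera, a nearby object reflects far more light than the distant background, so a brightness threshold separates object pixels from background pixels. The sketch below assumes an idealized inverse-square falloff and made-up distances; the threshold value is illustrative, not from the patent:

```python
import numpy as np

def segment_foreground(image: np.ndarray, threshold: float) -> np.ndarray:
    """Separate object pixels from background pixels by brightness.

    With a light source positioned near the camera, reflected intensity
    falls off roughly with the inverse square of distance, so a nearby
    object of interest appears much brighter than the distant background.
    """
    return image > threshold

# Toy 1-D "image": object at 20 cm, background at 200 cm,
# intensity ~ 1/d^2 (arbitrary units)
distances = np.array([20.0, 20.0, 200.0, 200.0])
image = 1.0 / distances**2
mask = segment_foreground(image, threshold=1.0 / 100.0**2)
print(mask)  # [ True  True False False]
```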
  • Publication number: 20220091680
    Abstract: Methods and systems for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining an intent of the user to change modes of operation, subsequently interpreting user gestures in accordance with a second mode of operation.
    Type: Application
    Filed: December 6, 2021
    Publication date: March 24, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. HOLZ
  • Patent number: 11282273
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured using image information or equivalents. Predictive information can be improved by applying techniques for correlating with information from observations.
    Type: Grant
    Filed: October 1, 2020
    Date of Patent: March 22, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Kevin A. Horowitz, David S. Holz
  • Publication number: 20220083880
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Application
    Filed: November 22, 2021
    Publication date: March 17, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Maxwell SILLS, Hua YANG, Gabriel HARE
  • Patent number: 11275480
    Abstract: Aspects of systems and methods are described for modifying a presented interactive element or object, such as a cursor, based on user-input gestures, the presented environment of the cursor, or any combination thereof. The color, size, shape, transparency, and/or responsiveness of the cursor may change based on the gesture velocity, acceleration, or path. In one implementation, the cursor "stretches" to graphically indicate the velocity and/or acceleration of the gesture. The display properties of the cursor may also change if, for example, the area of the screen occupied by the cursor is dark, bright, textured, or is otherwise complicated. In another implementation, the cursor is drawn using sub-pixel smoothing to improve its visual quality.
    Type: Grant
    Filed: March 1, 2021
    Date of Patent: March 15, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Patent number: 11269481
    Abstract: The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing within the 3D sensory space, or virtual object density in the gestural interface.
    Type: Grant
    Filed: October 1, 2020
    Date of Patent: March 8, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
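The dominant-gesture selection in patent 11269481 — comparing magnitudes of spatial trajectories — can be sketched as a path-length comparison over sampled 3D points. The gesture names and sample coordinates below are invented for illustration:

```python
import numpy as np

def trajectory_magnitude(points: np.ndarray) -> float:
    """Total path length of a gesture's spatial trajectory: the sum of
    segment lengths between successive 3D sample points."""
    return float(np.linalg.norm(np.diff(points, axis=0), axis=1).sum())

def dominant_gesture(gestures: dict) -> str:
    """Pick the gesture whose spatial trajectory magnitude is largest."""
    return max(gestures, key=lambda name: trajectory_magnitude(gestures[name]))

# A deliberate sweep versus a small proximate twitch (coordinates in cm)
sweep = np.array([[0, 0, 0], [10, 0, 0], [20, 0, 0]], dtype=float)
twitch = np.array([[0, 0, 0], [0.5, 0, 0]], dtype=float)
print(dominant_gesture({"sweep": sweep, "twitch": twitch}))  # sweep
```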
  • Patent number: 11243612
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Grant
    Filed: November 19, 2018
    Date of Patent: February 8, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
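The scale-ratio mapping in patent 11243612 — displaying a movement based on the ratio between an identified gesture scale and the display scale — reduces to a simple proportion. The function name and the specific 30 cm / 1920 px numbers are assumptions for illustration:

```python
def on_screen_delta(gesture_cm: float, gesture_scale_cm: float,
                    display_scale_px: float) -> float:
    """Map an actual gesture distance to a displayed movement using the
    ratio between the identified gesture scale and the display scale."""
    return gesture_cm * (display_scale_px / gesture_scale_cm)

# A gesture spanning half of a 30 cm gesture range, mapped onto a
# 1920 px wide display, moves the on-screen element half the screen:
print(on_screen_delta(15.0, 30.0, 1920.0))  # 960.0 px
```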
  • Patent number: 11244513
    Abstract: The technology disclosed relates to a method of realistic rendering of a real object as a virtual object in a virtual space using an offset in the position of the hand in a three-dimensional (3D) sensory space. An offset between expected positions of the eye(s) of a wearer of a head mounted device and a sensor attached to the head mounted device for sensing a position of at least one hand in a three-dimensional (3D) sensory space is determined. A position of the hand in the three-dimensional (3D) sensory space can be sensed using a sensor. The sensed position of the hand can be transformed by the offset into a re-rendered position of the hand as would appear to the wearer of the head mounted device if the wearer were looking at the actual hand. The re-rendered hand can be depicted to the wearer of the head mounted device.
    Type: Grant
    Filed: March 11, 2020
    Date of Patent: February 8, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Alex Marcolina, David Holz
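The core transform in patent 11244513 — re-rendering a hand sensed by a head-mounted sensor at the position it would occupy from the wearer's eye point — amounts to applying the eye-to-sensor offset to the sensed position. The coordinate convention and the 8 cm / 4 cm offset values below are hypothetical:

```python
import numpy as np

def re_render_position(sensed_hand: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Transform a hand position sensed by the head-mounted sensor into
    the position it would appear to occupy from the wearer's eyes, by
    applying the determined eye-to-sensor offset."""
    return sensed_hand + offset

# Hypothetical numbers: sensor mounted 8 cm forward and 4 cm above the
# eyes; offset is the eye frame origin minus the sensor frame origin (cm)
offset = np.array([0.0, -4.0, -8.0])
sensed = np.array([10.0, 0.0, 40.0])   # hand position in sensor frame (cm)
print(re_render_position(sensed, offset))  # [10. -4. 32.]
```

A full implementation would use a rigid transform (rotation plus translation) rather than a pure translation when the sensor is tilted relative to the eye axis.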
  • Patent number: 11237625
    Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object and in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
    Type: Grant
    Filed: November 24, 2020
    Date of Patent: February 1, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
  • Patent number: 11227172
    Abstract: The technology disclosed relates to coordinating motion-capture of a hand by a network of motion-capture sensors having overlapping fields of view. In particular, it relates to designating a first sensor among three or more motion-capture sensors as having a master frame of reference, observing motion of a hand as it passes through overlapping fields of view of the respective motion-capture sensors, synchronizing capture of images of the hand within the overlapping fields of view by pairs of the motion-capture devices, and using the pairs of the hand images captured by the synchronized motion-capture devices to automatically calibrate the motion-capture sensors to the master frame of reference.
    Type: Grant
    Filed: July 29, 2019
    Date of Patent: January 18, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
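The calibration step in patent 11227172 — aligning each sensor to the master frame of reference using paired observations of the same hand — can be sketched with the classic Kabsch/Procrustes rigid-alignment method. The patent does not specify this algorithm; it is one standard way to solve the stated problem, and the point values below are synthetic:

```python
import numpy as np

def calibrate_to_master(src: np.ndarray, dst: np.ndarray):
    """Estimate the rotation R and translation t mapping hand positions
    observed in a secondary sensor's frame (src) onto the same positions
    in the master frame of reference (dst), via the Kabsch method."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Synthetic paired observations: the secondary sensor's frame is rotated
# 90 degrees about z and shifted relative to the master frame.
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
dst = src @ R_true.T + t_true
R, t = calibrate_to_master(src, dst)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```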
  • Publication number: 20220011871
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: September 27, 2021
    Publication date: January 13, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Isaac COHEN, Maxwell SILLS
  • Publication number: 20210382543
    Abstract: The technology disclosed can provide capabilities such as using motion sensors and/or other types of sensors coupled to a motion-capture system to monitor motions within a real environment. A virtual object can be projected to a user of a portable device integrated into an augmented rendering of a real environment about the user. Motion information of a user body portion is determined based at least in part upon sensory information received from imaging or acoustic sensory devices. Control information is communicated to a system based in part on a combination of the motion of the portable device and the detected motion of the user. The virtual device experience can be augmented in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Application
    Filed: August 23, 2021
    Publication date: December 9, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. HOLZ
  • Publication number: 20210382563
    Abstract: Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape or a motion of the portion of the hand or other detectable object.
    Type: Application
    Filed: August 23, 2021
    Publication date: December 9, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Avinash DABIR, Paul DURDIK, Keith MERTENS, Michael ZAGORSEK
  • Patent number: 11194404
    Abstract: Methods and systems for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining an intent of the user to change modes of operation, subsequently interpreting user gestures in accordance with a second mode of operation.
    Type: Grant
    Filed: January 21, 2021
    Date of Patent: December 7, 2021
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Patent number: 11181985
    Abstract: The technology disclosed relates to using gestures to supplant or augment use of a standard input device coupled to a system. It also relates to controlling a display using gestures. It further relates to controlling a system using more than one input device. In particular, it relates to detecting a standard input device that causes on-screen actions on a display in response to control manipulations performed using the standard input device. Further, a library of analogous gestures is identified, which includes gestures that are analogous to the control manipulations and also cause the on-screen actions responsive to the control manipulations. Thus, when a gesture from the library of analogous gestures is detected, a signal is generated that mimics a standard signal from the standard input device and causes at least one on-screen action.
    Type: Grant
    Filed: May 28, 2020
    Date of Patent: November 23, 2021
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David Holz