Patents by Inventor Gabriel Hare

Gabriel Hare has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250130648
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: December 19, 2024
    Publication date: April 24, 2025
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel HARE, Ryan JULIAN
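The scale-ratio idea in the abstract above (mapping the physical distance a gesture traverses to the distance a movement travels on screen) can be sketched minimally as follows. All function and parameter names are illustrative assumptions, not taken from the patent:

```python
def scaled_display_motion(gesture_path, gesture_scale, display_scale):
    """Map physical gesture displacements to on-screen motion using the
    ratio between the identified gesture scale and the display scale.
    Names and units here are illustrative, not from the patent."""
    ratio = display_scale / gesture_scale
    return [(dx * ratio, dy * ratio) for (dx, dy) in gesture_path]

# A 0.30 m gesture span rendered into a 0.60-unit display region doubles each step.
print(scaled_display_motion([(0.10, 0.0), (0.05, 0.02)],
                            gesture_scale=0.30, display_scale=0.60))
# → [(0.2, 0.0), (0.1, 0.04)]
```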
  • Publication number: 20250103145
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Application
    Filed: December 9, 2024
    Publication date: March 27, 2025
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Maxwell SILLS, Hua YANG, Gabriel HARE
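The dynamic selection of a manipulation point described above (choosing a point on the virtual object proximate to the hand's tracked calculation points) can be sketched as a nearest-candidate search. This is a simplified illustration under assumed names, not the patented method itself:

```python
import math

def select_manipulation_point(calculation_points, candidate_points):
    """Pick the candidate manipulation point on a virtual object that lies
    closest to any of the hand's tracked calculation points (fingertips,
    thumb, palm). Illustrative sketch only."""
    return min(candidate_points,
               key=lambda c: min(math.dist(c, p) for p in calculation_points))

fingertips = [(0.0, 0.1, 0.3), (0.02, 0.12, 0.31)]
candidates = [(1.0, 1.0, 1.0), (0.05, 0.1, 0.3)]
print(select_manipulation_point(fingertips, candidates))  # → (0.05, 0.1, 0.3)
```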
  • Patent number: 12204695
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, the gesture is identified as an engagement gesture, and compared with reference gestures from a library of reference gestures. In some embodiments, a degree of completion of the recognized engagement gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Grant
    Filed: July 7, 2023
    Date of Patent: January 21, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
  • Patent number: 12164694
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Grant
    Filed: November 22, 2021
    Date of Patent: December 10, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
  • Publication number: 20240061511
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, the gesture is identified as an engagement gesture, and compared with reference gestures from a library of reference gestures. In some embodiments, a degree of completion of the recognized engagement gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: July 7, 2023
    Publication date: February 22, 2024
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel HARE, Ryan JULIAN
  • Patent number: 11740705
    Abstract: A method and system are provided for controlling a machine using gestures. The method includes sensing a variation of position of a control object using an imaging system, determining, from the variation, one or more primitives describing a characteristic of a control object moving in space, comparing the one or more primitives to one or more gesture templates in a library of gesture templates, selecting, based on a result of the comparing, one or more gesture templates corresponding to the one or more primitives, and providing at least one gesture template of the selected one or more gesture templates as an indication of a command to issue to a machine under control responsive to the variation.
    Type: Grant
    Filed: February 7, 2022
    Date of Patent: August 29, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
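The pipeline described in this abstract (derive motion primitives, compare them against a library of gesture templates, select the best match as a command) can be sketched with a toy distance metric. The template names, the tuple encoding of primitives, and the threshold are all assumptions for illustration:

```python
def match_gesture(primitives, template_library, threshold=0.5):
    """Compare motion primitives (here, a normalized direction vector)
    against a library of gesture templates and return the name of the
    best-matching template, or None if nothing is close enough."""
    def distance(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    name, score = min(((n, distance(primitives, t))
                       for n, t in template_library.items()),
                      key=lambda pair: pair[1])
    return name if score <= threshold else None

library = {"swipe_right": (1.0, 0.0), "swipe_up": (0.0, 1.0)}
print(match_gesture((0.9, 0.1), library))  # → swipe_right
```

A matched template name would then serve as the indication of the command to issue to the machine under control.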
  • Publication number: 20220236808
    Abstract: A method and system are provided for controlling a machine using gestures. The method includes sensing a variation of position of a control object using an imaging system, determining, from the variation, one or more primitives describing a characteristic of a control object moving in space, comparing the one or more primitives to one or more gesture templates in a library of gesture templates, selecting, based on a result of the comparing, one or more gesture templates corresponding to the one or more primitives, and providing at least one gesture template of the selected one or more gesture templates as an indication of a command to issue to a machine under control responsive to the variation.
    Type: Application
    Filed: February 7, 2022
    Publication date: July 28, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
  • Publication number: 20220083880
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Application
    Filed: November 22, 2021
    Publication date: March 17, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Maxwell SILLS, Hua YANG, Gabriel HARE
  • Patent number: 11243612
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Grant
    Filed: November 19, 2018
    Date of Patent: February 8, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
  • Patent number: 11182685
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Grant
    Filed: June 5, 2018
    Date of Patent: November 23, 2021
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
  • Publication number: 20200004403
    Abstract: The technology disclosed relates to using virtual attraction between a hand or other control object in a three-dimensional (3D) sensory space and a virtual object in a virtual space. In particular, it relates to defining a virtual attraction zone of a hand or other control object that is tracked in a three-dimensional (3D) sensory space and generating one or more interaction forces between the control object and a virtual object in a virtual space that cause motion of the virtual object responsive to proximity of the control object to the virtual object and escalation with a virtual pinch or grasp action of the control object directed to a manipulation point of the virtual object.
    Type: Application
    Filed: September 11, 2019
    Publication date: January 2, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Hua YANG, Maxwell SILLS, Gabriel HARE
  • Patent number: 10416834
    Abstract: The technology disclosed relates to using virtual attraction between a hand or other control object in a three-dimensional (3D) sensory space and a virtual object in a virtual space. In particular, it relates to defining a virtual attraction zone of a hand or other control object that is tracked in a three-dimensional (3D) sensory space and generating one or more interaction forces between the control object and a virtual object in a virtual space that cause motion of the virtual object responsive to proximity of the control object to the virtual object and escalation with a virtual pinch or grasp action of the control object directed to a manipulation point of the virtual object.
    Type: Grant
    Filed: November 13, 2014
    Date of Patent: September 17, 2019
    Assignee: Leap Motion, Inc.
    Inventors: David S Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Maxwell Sills, Gabriel Hare
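The attraction-zone behavior in this abstract (a force on the virtual object that grows with proximity and escalates with a pinch or grasp) can be sketched as a simple proximity-weighted force. The parameter names and the linear falloff are assumptions, not the patented formulation:

```python
import math

def attraction_force(control_pos, object_pos, zone_radius,
                     pinch_strength=0.0, k=1.0):
    """Force on a virtual object: zero outside the attraction zone, growing
    linearly as the control object approaches, and scaled up by a pinch or
    grasp action (pinch_strength in [0, 1]). Illustrative sketch only."""
    d = math.dist(control_pos, object_pos)
    if d > zone_radius or d == 0.0:
        return (0.0, 0.0, 0.0)  # outside the zone (or coincident): no force
    magnitude = k * (1.0 - d / zone_radius) * (1.0 + pinch_strength)
    # unit vector from the object toward the control object
    direction = tuple((c - o) / d for c, o in zip(control_pos, object_pos))
    return tuple(magnitude * u for u in direction)

print(attraction_force((0, 0, 1), (0, 0, 0), zone_radius=2.0))
# → (0.0, 0.0, 0.5)
print(attraction_force((0, 0, 1), (0, 0, 0), zone_radius=2.0, pinch_strength=1.0))
# → (0.0, 0.0, 1.0)
```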
  • Publication number: 20190155394
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: November 19, 2018
    Publication date: May 23, 2019
    Applicant: Leap Motion, Inc.
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel HARE, Ryan JULIAN
  • Publication number: 20190042957
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Application
    Filed: June 5, 2018
    Publication date: February 7, 2019
    Applicant: Leap Motion, Inc.
    Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Maxwell SILLS, Hua YANG, Gabriel HARE
  • Patent number: 10168873
    Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of the calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
    Type: Grant
    Filed: October 29, 2014
    Date of Patent: January 1, 2019
    Assignee: LEAP MOTION, INC.
    Inventors: David S Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
  • Patent number: 9996797
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Grant
    Filed: October 31, 2014
    Date of Patent: June 12, 2018
    Assignee: LEAP MOTION, INC.
    Inventors: David S Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
  • Publication number: 20180130050
    Abstract: Blockchain-based systems and methods incorporate secure wallets; enhanced, randomized, secure identifiers for uniquely identifying discrete items; and cryptographically secure time-stamped blockchains. Role-based secure wallets include private cryptographic keys for digitally signing transactions for recording in a blockchain. Operations to be performed using role-based wallets can be permitted or restricted based on privileges associated with the respective wallets. Multiple blockchains can also be used to track transactions involving different units of account, for example, a measurement of a discrete product being transferred and an associated value of the transaction.
    Type: Application
    Filed: March 9, 2017
    Publication date: May 10, 2018
    Inventors: Benjamin James Taylor, Gabriel Hare
  • Publication number: 20180130034
    Abstract: Blockchain-based systems and methods incorporate secure wallets; enhanced, randomized, secure identifiers for uniquely identifying discrete items; and cryptographically secure time-stamped blockchains. Role-based secure wallets include private cryptographic keys for digitally signing transactions for recording in a blockchain. Operations to be performed using role-based wallets can be permitted or restricted based on privileges associated with the respective wallets. Multiple blockchains can also be used to track transactions involving different units of account, for example, a measurement of a discrete product being transferred and an associated value of the transaction.
    Type: Application
    Filed: November 7, 2016
    Publication date: May 10, 2018
    Inventors: Benjamin J. Taylor, Gabriel Hare, Aaron M. Smith
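The role-based wallet concept in these two blockchain filings (a signing key whose permitted operations are restricted by role, with signed transactions recorded in a time-stamped, hash-chained ledger) can be sketched as a toy model. Class names, the key scheme, and the block layout are all illustrative assumptions, not the patented design:

```python
import hashlib
import json
import time

class RoleWallet:
    """A toy role-based wallet: a signing key plus the set of operations its
    role is privileged to perform. Illustrative only; a real wallet would use
    asymmetric signatures, not a keyed hash."""
    def __init__(self, key: bytes, permissions: set):
        self.key = key
        self.permissions = permissions

    def sign(self, operation: str, payload: dict) -> str:
        if operation not in self.permissions:
            raise PermissionError(f"wallet role may not perform {operation!r}")
        body = json.dumps({"op": operation, **payload}, sort_keys=True).encode()
        return hashlib.sha256(self.key + body).hexdigest()

def append_block(chain: list, signed: str) -> dict:
    """Append a time-stamped block whose hash covers the previous block,
    forming a tamper-evident chain."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"signed": signed, "prev": prev, "ts": time.time()}
    block["hash"] = hashlib.sha256((prev + signed).encode()).hexdigest()
    chain.append(block)
    return block

producer = RoleWallet(b"producer-key", {"transfer"})
chain = []
append_block(chain, producer.sign("transfer", {"units": 10}))
```

Separate chains of this shape could track different units of account, e.g. one for product quantity and one for transaction value.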