Patents by Inventor Jonathan Marsden

Jonathan Marsden has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240077950
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The technology disclosed includes determining from the motion information whether a motion of the control object with respect to the virtual control construct is an engagement gesture, such as a virtual mouse click or other control device operation. The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Application
    Filed: November 9, 2023
    Publication date: March 7, 2024
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ
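The engagement-gesture mechanism summarized in the abstract above lends itself to a short illustration. The following Python class is a minimal sketch under stated assumptions, not the patented implementation: a fingertip position is tracked relative to a virtual plane, sufficient penetration is treated as a virtual mouse click, and while disengaged the plane drifts toward the hand so its position is updated over time. The class name, thresholds, and drift rule are all illustrative.

```python
import numpy as np

class VirtualControlPlane:
    """A virtual surface construct used to switch control modes (illustrative)."""

    def __init__(self, normal=(0.0, 0.0, 1.0), offset=-0.05,
                 standoff=0.05, engage_depth=0.02, drift_rate=0.05):
        self.normal = np.asarray(normal, dtype=float)
        self.normal /= np.linalg.norm(self.normal)
        self.offset = float(offset)        # plane position along its normal
        self.standoff = standoff           # resting gap between tip and plane
        self.engage_depth = engage_depth   # penetration that counts as a "click"
        self.drift_rate = drift_rate       # how quickly the plane follows the hand
        self.engaged = False

    def update(self, tip_position):
        """Return True when the control object newly penetrates the plane."""
        tip = np.asarray(tip_position, dtype=float)
        depth = self.offset - tip @ self.normal   # > 0 once the tip is past the plane

        newly_engaged = False
        if not self.engaged and depth > self.engage_depth:
            self.engaged = True                   # analogous to a virtual mouse press
            newly_engaged = True
        elif self.engaged and depth < 0.0:
            self.engaged = False                  # virtual mouse release

        # While disengaged, let the plane drift toward a point just in front of
        # the tip, so the construct's position tracks the control object.
        if not self.engaged:
            rest_offset = tip @ self.normal - self.standoff
            self.offset += self.drift_rate * (rest_offset - self.offset)
        return newly_engaged


if __name__ == "__main__":
    plane = VirtualControlPlane()
    # Simulated fingertip pushing through the plane along -z, then retracting.
    for z in (0.10, 0.06, 0.02, -0.02, -0.07, -0.07, 0.02, 0.10):
        if plane.update((0.0, 0.0, z)):
            print(f"engagement gesture (virtual click) at z = {z:+.2f}")
```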
  • Publication number: 20240061511
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, the gesture is identified as an engagement gesture, and compared with reference gestures from a library of reference gestures. In some embodiments, a degree of completion of the recognized engagement gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: July 7, 2023
    Publication date: February 22, 2024
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel HARE, Ryan JULIAN
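The abstract above combines three ideas: matching a recognized gesture against a library of reference gestures, estimating its degree of completion, and selecting a dominant gesture from several concurrent ones. The sketch below illustrates those ideas at a toy level only; the trajectory representation, match tolerance, and spatial-extent rule for dominance are assumptions, not the patent's method.

```python
import numpy as np

REFERENCE_GESTURES = {
    # name -> reference trajectory, resampled to a fixed number of 2D points
    "swipe_right": np.linspace([0.0, 0.5], [1.0, 0.5], 16),
    "swipe_down":  np.linspace([0.5, 1.0], [0.5, 0.0], 16),
}

def completion(observed, reference, tolerance=0.15):
    """Fraction of the reference trajectory that the observed motion covers."""
    n = min(len(observed), len(reference))
    if n == 0:
        return 0.0
    err = np.linalg.norm(observed[:n] - reference[:n], axis=1).mean()
    coverage = n / len(reference)
    return coverage if err < tolerance else 0.0

def dominant_gesture(concurrent):
    """Pick the gesture with the largest spatial extent as the dominant one."""
    extent = lambda traj: np.linalg.norm(traj[-1] - traj[0])
    return max(concurrent, key=lambda name: extent(concurrent[name]))

if __name__ == "__main__":
    # Simulated partial swipe to the right (roughly half complete).
    observed = np.linspace([0.0, 0.5], [0.55, 0.5], 8)
    for name, ref in REFERENCE_GESTURES.items():
        print(name, f"completion={completion(observed, ref):.2f}")

    concurrent = {
        "left_hand":  np.linspace([0.2, 0.5], [0.3, 0.5], 8),   # small motion
        "right_hand": np.linspace([0.0, 0.5], [0.8, 0.5], 8),   # large swipe
    }
    print("dominant:", dominant_gesture(concurrent))
```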
  • Patent number: 11874970
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Grant
    Filed: June 6, 2022
    Date of Patent: January 16, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
  • Patent number: 11854308
    Abstract: The technology disclosed initializes a new hand that enters the field of view of a gesture recognition system using a parallax detection module. The parallax detection module determines candidate regions of interest (ROI) for a given input hand image and computes depth, rotation, and position information for each candidate ROI. Then, for each candidate ROI, an ImagePatch, which includes the hand, is extracted from the original input hand image to minimize processing of low-information pixels. A hand classifier neural network is then used to determine which ImagePatch most resembles a hand. For the qualified, most hand-like ImagePatch, a 3D virtual hand is initialized with depth, rotation, and position matching those of the qualified ImagePatch.
    Type: Grant
    Filed: February 14, 2017
    Date of Patent: December 26, 2023
    Assignee: ULTRAHAPTICS IP TWO LIMITED
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
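The parallax-based initialization described above can be sketched as a small pipeline: propose candidate ROIs, estimate depth from stereo disparity, crop an ImagePatch per ROI, score each patch with a hand classifier, and initialize the 3D hand from the best patch. Everything below is a hypothetical stand-in; `hand_classifier` is an assumed pretrained model and the camera constants are invented for the example.

```python
import numpy as np

FOCAL_LENGTH_PX = 700.0   # assumed camera intrinsics
BASELINE_M = 0.04         # assumed stereo baseline

def depth_from_parallax(disparity_px):
    """Classic stereo relation: depth = focal length * baseline / disparity."""
    return FOCAL_LENGTH_PX * BASELINE_M / max(disparity_px, 1e-6)

def extract_patch(image, roi):
    """Crop the ImagePatch so later stages ignore low-information pixels."""
    x, y, w, h = roi
    return image[y:y + h, x:x + w]

def initialize_hand(image, candidate_rois, disparities, hand_classifier):
    """Score each candidate ROI and initialize a 3D hand from the best one."""
    best = None
    for roi, disparity in zip(candidate_rois, disparities):
        patch = extract_patch(image, roi)
        score = hand_classifier(patch)        # probability the patch is a hand
        if best is None or score > best["score"]:
            best = {"roi": roi, "score": score,
                    "depth": depth_from_parallax(disparity)}
    # A real system would also estimate rotation; this sketch returns position only.
    x, y, w, h = best["roi"]
    return {"position": (x + w / 2, y + h / 2, best["depth"]), "score": best["score"]}

if __name__ == "__main__":
    image = np.random.rand(480, 640)
    rois = [(100, 120, 80, 80), (400, 300, 60, 60)]
    disparities = [35.0, 12.0]
    # Stand-in classifier: favors brighter patches (a real one would be a CNN).
    classifier = lambda patch: float(patch.mean())
    print(initialize_hand(image, rois, disparities, classifier))
```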
  • Patent number: 11841920
    Abstract: The technology disclosed introduces two types of neural networks: “master” or “generalist” networks and “expert” or “specialist” networks. Both master networks and expert networks are fully connected neural networks that take a feature vector of an input hand image and produce a prediction of the hand pose. Master networks and expert networks differ from each other based on the data on which they are trained. In particular, master networks are trained on the entire dataset; expert networks, in contrast, are trained only on a subset of it. With regard to hand poses, master networks are trained on the input image data representing all available hand poses comprising the training data (including both real and simulated hand images).
    Type: Grant
    Filed: February 14, 2017
    Date of Patent: December 12, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
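Since master and expert networks differ only in the data they are trained on, a toy version is easy to state. The sketch below is an assumed, miniature illustration: identical fully connected networks that map a hand-image feature vector to a pose prediction, with the master trained on the whole dataset and each expert on one subset. The layer sizes, subset assignment, and training loop are placeholders, not the patented design.

```python
import numpy as np

def make_network(in_dim, hidden, out_dim, rng):
    """A small fully connected network: features -> hidden (ReLU) -> pose."""
    return {"W1": rng.normal(0, 0.1, (in_dim, hidden)), "b1": np.zeros(hidden),
            "W2": rng.normal(0, 0.1, (hidden, out_dim)), "b2": np.zeros(out_dim)}

def predict(net, X):
    H = np.maximum(0.0, X @ net["W1"] + net["b1"])
    return H @ net["W2"] + net["b2"]

def train(net, X, Y, lr=1e-3, steps=100):
    """Plain gradient descent on mean squared error (placeholder training loop)."""
    for _ in range(steps):
        H = np.maximum(0.0, X @ net["W1"] + net["b1"])
        pred = H @ net["W2"] + net["b2"]
        grad_out = 2.0 * (pred - Y) / len(X)
        grad_h = (grad_out @ net["W2"].T) * (H > 0)
        net["W2"] -= lr * H.T @ grad_out
        net["b2"] -= lr * grad_out.sum(axis=0)
        net["W1"] -= lr * X.T @ grad_h
        net["b1"] -= lr * grad_h.sum(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    features = rng.normal(size=(1000, 64))      # feature vectors of hand images
    poses = rng.normal(size=(1000, 84))         # e.g. 28 joints x 3 coordinates
    subset = rng.integers(0, 4, size=1000)      # groups of similar hand poses

    # Master network: trained on the entire dataset (real and simulated images).
    master = make_network(64, 128, 84, rng)
    train(master, features, poses)

    # Expert networks: each trained only on one subset of the dataset.
    experts = [make_network(64, 128, 84, rng) for _ in range(4)]
    for c, expert in enumerate(experts):
        train(expert, features[subset == c], poses[subset == c])

    print("master prediction shape:", predict(master, features[:1]).shape)
```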
  • Patent number: 11740705
    Abstract: A method and system are provided for controlling a machine using gestures. The method includes sensing a variation of position of a control object using an imaging system, determining, from the variation, one or more primitives describing a characteristic of a control object moving in space, comparing the one or more primitives to one or more gesture templates in a library of gesture templates, selecting, based on a result of the comparing, one or more gesture templates corresponding to the one or more primitives, and providing at least one gesture template of the selected one or more gesture templates as an indication of a command to issue to a machine under control responsive to the variation.
    Type: Grant
    Filed: February 7, 2022
    Date of Patent: August 29, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
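The claimed flow above, from sensed position variation to primitives to template matching to a command, can be shown with a compact sketch. The primitives chosen here (net direction and path length), the template library, and the thresholds are illustrative assumptions rather than the patented definitions.

```python
import numpy as np

GESTURE_TEMPLATES = {
    # template name -> (unit direction, minimum path length in metres)
    "swipe_right": (np.array([1.0, 0.0, 0.0]), 0.15),
    "swipe_left":  (np.array([-1.0, 0.0, 0.0]), 0.15),
    "push":        (np.array([0.0, 0.0, -1.0]), 0.10),
}

def primitives_from_positions(positions):
    """Characterize the motion with two primitives: net direction and path length."""
    positions = np.asarray(positions, dtype=float)
    displacement = positions[-1] - positions[0]
    direction = displacement / (np.linalg.norm(displacement) + 1e-9)
    path_length = np.linalg.norm(np.diff(positions, axis=0), axis=1).sum()
    return direction, path_length

def match_template(direction, path_length, min_alignment=0.9):
    """Return the best-matching template name, or None if nothing matches."""
    best_name, best_alignment = None, min_alignment
    for name, (template_dir, min_length) in GESTURE_TEMPLATES.items():
        alignment = float(direction @ template_dir)
        if path_length >= min_length and alignment > best_alignment:
            best_name, best_alignment = name, alignment
    return best_name

if __name__ == "__main__":
    # Simulated control-object positions sweeping to the right.
    track = [(0.00, 0.0, 0.4), (0.06, 0.0, 0.4), (0.13, 0.01, 0.4), (0.20, 0.0, 0.4)]
    direction, length = primitives_from_positions(track)
    print("command to issue:", match_template(direction, length))  # swipe_right
```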
  • Patent number: 11714880
    Abstract: The technology disclosed performs hand pose estimation on a so-called “joint-by-joint” basis. So, when a plurality of estimates for the 28 hand joints are received from a plurality of expert networks (and from master experts in some high-confidence scenarios), the estimates are analyzed at a joint level and a final location for each joint is calculated based on the plurality of estimates for a particular joint. This is a novel solution discovered by the technology disclosed because nothing in the field of art determines hand pose estimates at such granularity and precision. Regarding granularity and precision, because hand pose estimates are computed on a joint-by-joint basis, the technology disclosed can detect in real time even the minutest and most subtle hand movements, such as a bend/yaw/tilt/roll of a segment of a finger or a tilt of an occluded finger, as demonstrated supra in the Experimental Results section of this application.
    Type: Grant
    Filed: July 10, 2019
    Date of Patent: August 1, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
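A joint-by-joint fusion step of the kind described above can be sketched as a per-joint weighted average over the expert estimates. The confidence weighting below is an assumed fusion rule; only the structure of analyzing each of the 28 joints separately comes from the abstract.

```python
import numpy as np

NUM_JOINTS = 28

def fuse_joint_estimates(estimates, confidences):
    """
    estimates:   (num_experts, NUM_JOINTS, 3) joint positions from each expert
    confidences: (num_experts, NUM_JOINTS) per-joint confidence weights
    returns:     (NUM_JOINTS, 3) final joint positions, fused joint by joint
    """
    weights = confidences / confidences.sum(axis=0, keepdims=True)
    return np.einsum("ej,ejc->jc", weights, estimates)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_pose = rng.normal(size=(NUM_JOINTS, 3))
    # Three experts, each noisy, with confidence inversely related to its noise.
    noise_levels = np.array([0.02, 0.05, 0.10])
    estimates = np.stack([true_pose + rng.normal(0, s, (NUM_JOINTS, 3))
                          for s in noise_levels])
    confidences = np.repeat((1.0 / noise_levels)[:, None], NUM_JOINTS, axis=1)
    fused = fuse_joint_estimates(estimates, confidences)
    print("mean error per joint:",
          float(np.linalg.norm(fused - true_pose, axis=1).mean()))
```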
  • Publication number: 20230214458
    Abstract: The technology disclosed performs hand pose estimation on a so-called “joint-by-joint” basis. So, when a plurality of estimates for the 28 hand joints are received from a plurality of expert networks (and from master experts in some high-confidence scenarios), the estimates are analyzed at a joint level and a final location for each joint is calculated based on the plurality of estimates for a particular joint. This is a novel solution discovered by the technology disclosed because nothing in the field of art determines hand pose estimates at such granularity and precision. Regarding granularity and precision, because hand pose estimates are computed on a joint-by-joint basis, the technology disclosed can detect in real time even the minutest and most subtle hand movements, such as a bend/yaw/tilt/roll of a segment of a finger or a tilt of an occluded finger, as demonstrated supra in the Experimental Results section of this application.
    Type: Application
    Filed: July 10, 2019
    Publication date: July 6, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Jonathan MARSDEN, Raffi BEDIKIAN, David Samuel HOLZ
  • Patent number: 11567578
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with, a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
    Type: Grant
    Filed: November 9, 2020
    Date of Patent: January 31, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Hua Yang, Leonid Kontsevich, James Donald, David S. Holz, Jonathan Marsden, Paul Durdik
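The virtual touch plane or surface in the entry above is, geometrically, a plane defined computationally in space or fitted to a physical surface, plus a contact test. Below is a minimal sketch assuming a point-normal representation and an arbitrary contact tolerance; nothing here is the patented formulation.

```python
import numpy as np

def plane_from_points(p0, p1, p2):
    """Define the touch plane from three points (e.g. sampled on a desk surface)."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    normal = np.cross(p1 - p0, p2 - p0)
    return p0, normal / np.linalg.norm(normal)

def in_contact(point, plane_point, plane_normal, tolerance=0.01):
    """Treat the control object as touching when it is at or just behind the plane."""
    signed_distance = float((np.asarray(point, dtype=float) - plane_point) @ plane_normal)
    return signed_distance <= tolerance

if __name__ == "__main__":
    origin, normal = plane_from_points((0, 0, 0), (1, 0, 0), (0, 1, 0))
    for z in (0.10, 0.02, 0.005, -0.01):
        touching = in_contact((0.3, 0.3, z), origin, normal)
        print(f"fingertip z={z:+.3f} -> contact={touching}")
```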
  • Publication number: 20220300085
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Application
    Filed: June 6, 2022
    Publication date: September 22, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ
  • Publication number: 20220236808
    Abstract: A method and system are provided for controlling a machine using gestures. The method includes sensing a variation of position of a control object using an imaging system, determining, from the variation, one or more primitives describing a characteristic of a control object moving in space, comparing the one or more primitives to one or more gesture templates in a library of gesture templates, selecting, based on a result of the comparing, one or more gesture templates corresponding to the one or more primitives, and providing at least one gesture template of the selected one or more gesture templates as an indication of a command to issue to a machine under control responsive to the variation.
    Type: Application
    Filed: February 7, 2022
    Publication date: July 28, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
  • Publication number: 20220207182
    Abstract: A method of providing anonymized data comprises: a first server connected to a first data source generates a shared encryption key and receives, from a networked location, a first message comprising a first encrypted identifier that it is unable to decrypt. The first server generates, from the first data source, a first dataset that may include data corresponding to the first encrypted identifier, and returns it to the networked location. To generate the first dataset, the first server obtains, from the first data source, a set of at least one local unencrypted identifier. For each local unencrypted identifier, it verifies, using the shared encryption key, the first encrypted identifier against the local unencrypted identifier to obtain a successful or failed verification. If the verification is successful, it obtains, from the first data source, first corresponding data associated with the local unencrypted identifier, wherein the first dataset comprises this first corresponding data.
    Type: Application
    Filed: December 22, 2021
    Publication date: June 30, 2022
    Inventors: Jonathan Marsden Bradshaw, Philip-Moritz Russmeyer
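The anonymized-data exchange above can be approximated with standard keyed hashing: the requester sends a keyed, non-reversible form of an identifier, and the first server verifies it against each local unencrypted identifier by re-deriving the same value under the shared key, returning only the matching rows. HMAC-SHA256 below is a stand-in assumption; the patent application does not specify the scheme, and the key handling is simplified.

```python
import hmac
import hashlib

SHARED_KEY = b"shared-key-established-out-of-band"   # illustrative key material

def protect(identifier: str) -> bytes:
    """Keyed one-way transform of an identifier (the server cannot reverse it)."""
    return hmac.new(SHARED_KEY, identifier.encode(), hashlib.sha256).digest()

def build_dataset(encrypted_identifier: bytes, data_source: dict) -> list:
    """Return data for every local identifier that verifies against the request."""
    dataset = []
    for local_id, row in data_source.items():
        if hmac.compare_digest(protect(local_id), encrypted_identifier):
            dataset.append(row)   # matched without exposing the requested identifier
    return dataset

if __name__ == "__main__":
    data_source = {
        "alice@example.com": {"purchases": 3},
        "bob@example.com": {"purchases": 7},
    }
    request = protect("bob@example.com")   # prepared at the networked location
    print(build_dataset(request, data_source))
```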
  • Patent number: 11353962
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Grant
    Filed: August 6, 2020
    Date of Patent: June 7, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
  • Patent number: 11243612
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Grant
    Filed: November 19, 2018
    Date of Patent: February 8, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
  • Publication number: 20210081054
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with, a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
    Type: Application
    Filed: November 9, 2020
    Publication date: March 18, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Hua Yang, Leonid Kontsevich, James Donald, David S. Holz, Jonathan Marsden, Paul Durdik
  • Publication number: 20200363874
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Application
    Filed: August 6, 2020
    Publication date: November 19, 2020
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ
  • Patent number: 10831281
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with, a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
    Type: Grant
    Filed: May 2, 2019
    Date of Patent: November 10, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Hua Yang, Leonid Kontsevich, James Donald, David S. Holz, Jonathan Marsden, Paul Durdik
  • Patent number: 10739862
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Grant
    Filed: August 3, 2018
    Date of Patent: August 11, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
  • Publication number: 20190258320
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with, a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
    Type: Application
    Filed: May 2, 2019
    Publication date: August 22, 2019
    Applicant: Leap Motion, Inc.
    Inventors: Hua Yang, Leonid Kontsevich, James Donald, David S. Holz, Jonathan Marsden, Paul Durdik
  • Publication number: 20190155394
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: November 19, 2018
    Publication date: May 23, 2019
    Applicant: Leap Motion, Inc.
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel HARE, Ryan JULIAN