Patents by Inventor Raffi A. Bedikian

Raffi A. Bedikian has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20190146660
    Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of the calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
    Type: Application
    Filed: December 20, 2018
    Publication date: May 16, 2019
    Applicant: Leap Motion, Inc.
    Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Hua YANG, Gabriel A. HARE, Maxwell SILLS
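    The abstract above describes dynamically selecting a manipulation point near a virtual object from the hand's calculation points. A minimal sketch of one way that selection could work is below; the function name and the centroid-of-nearby-points heuristic are illustrative assumptions, not the patented method.

    ```python
    import math

    def select_manipulation_point(calc_points, obj_center, proximity=0.15):
        """Illustrative heuristic: the manipulation point is the centroid of
        all hand calculation points within `proximity` of the object center.
        Returns None when no calculation point is close enough."""
        near = [p for p in calc_points if math.dist(p, obj_center) < proximity]
        if not near:
            return None
        n = len(near)
        return tuple(sum(p[i] for p in near) / n for i in range(3))
    ```

    In a real tracker, the selection would also weigh detected hand motion (approach direction, velocity), which this sketch omits.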
  • Publication number: 20190050509
    Abstract: The technology disclosed relates to simplifying updating of a predictive model by clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
    Type: Application
    Filed: June 8, 2018
    Publication date: February 14, 2019
    Applicant: Leap Motion, Inc.
    Inventors: David S. HOLZ, Raffi BEDIKIAN, Kevin HOROWITZ, Hua YANG
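    The clustering step described above (grouping observed points by surface normal direction) can be sketched with a simple greedy pass; the function name, the angular threshold, and the omission of the adjacency criterion are simplifying assumptions for illustration only.

    ```python
    import math

    def cluster_by_normal(points, normals, angle_thresh_deg=20.0):
        """Greedy clustering sketch: assign each point to the first cluster
        whose seed normal lies within the angular threshold (compared via
        dot product of unit normals); otherwise start a new cluster."""
        clusters = []  # each cluster: {'normal': (x, y, z), 'points': [...]}
        cos_t = math.cos(math.radians(angle_thresh_deg))
        for p, n in zip(points, normals):
            for c in clusters:
                if sum(a * b for a, b in zip(n, c['normal'])) >= cos_t:
                    c['points'].append(p)
                    break
            else:
                clusters.append({'normal': n, 'points': [p]})
        return clusters
    ```

    The abstract's method also requires spatial adjacency before two points share a cluster; a fuller implementation would add a distance check alongside the normal comparison.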
  • Publication number: 20190042957
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Application
    Filed: June 5, 2018
    Publication date: February 7, 2019
    Applicant: Leap Motion, Inc.
    Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Maxwell SILLS, Hua YANG, Gabriel HARE
  • Publication number: 20190033979
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Application
    Filed: August 3, 2018
    Publication date: January 31, 2019
    Applicant: Leap Motion, Inc.
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ
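    The virtual control construct described above switches control modes when the control object penetrates a virtual surface, with the surface's position updated from the object's location. A minimal one-axis sketch follows; the mode names, hysteresis band, and follow-rate update rule are illustrative assumptions, not the patented formulation.

    ```python
    def update_mode(hand_z, plane_z, mode, hysteresis=0.02):
        """Switch between 'hover' and 'engaged' when the control object
        crosses the virtual plane; hysteresis prevents mode flicker at
        the boundary. Smaller z means deeper penetration here."""
        if mode == 'hover' and hand_z < plane_z - hysteresis:
            return 'engaged'
        if mode == 'engaged' and hand_z > plane_z + hysteresis:
            return 'hover'
        return mode

    def update_plane(plane_z, hand_z, follow_rate=0.1, offset=0.05):
        """Drift the virtual plane toward a fixed offset behind the hand,
        so the construct tracks the control object over time."""
        target = hand_z + offset
        return plane_z + follow_rate * (target - plane_z)
    ```

    Calling `update_plane` each frame keeps the engagement surface a comfortable distance from wherever the hand currently hovers, which is one way to realize "updated, continuously or from time to time."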
  • Publication number: 20190033975
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Application
    Filed: May 24, 2018
    Publication date: January 31, 2019
    Applicant: Leap Motion, Inc.
    Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
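    One of the abstract features named above, pinch strength, is commonly derived from the thumb-to-index-fingertip distance. The sketch below shows one plausible normalization; the function name, the linear falloff, and the 8 cm full-open distance are assumptions for illustration, not the method claimed in the patent.

    ```python
    import math

    def pinch_strength(thumb_tip, index_tip, max_dist=0.08):
        """Illustrative pinch-strength metric in [0, 1]: 1.0 when the
        thumb and index fingertips touch, falling linearly to 0.0 at
        `max_dist` meters of separation."""
        d = math.dist(thumb_tip, index_tip)
        return max(0.0, 1.0 - d / max_dist)
    ```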
  • Patent number: 10168873
    Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of the calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
    Type: Grant
    Filed: October 29, 2014
    Date of Patent: January 1, 2019
    Assignee: LEAP MOTION, INC.
    Inventors: David S Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
  • Patent number: 10139918
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Grant
    Filed: September 28, 2016
    Date of Patent: November 27, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Samuel Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Christopher Julian
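    Two of the mechanisms in the abstract above lend themselves to a short sketch: mapping physical gesture distance to on-screen movement via a scale ratio, and tracking a gesture's degree of completion. The function names and units below are illustrative assumptions.

    ```python
    def display_delta(gesture_delta_mm, gesture_scale_mm, display_scale_px):
        """Map a physical gesture displacement to an on-screen displacement
        using the ratio between the gesture scale and the display scale."""
        return gesture_delta_mm * (display_scale_px / gesture_scale_mm)

    def completion(traversed_mm, full_extent_mm):
        """Degree of completion of a recognized gesture, clamped to [0, 1],
        e.g. for driving a partial-progress UI response."""
        return max(0.0, min(1.0, traversed_mm / full_extent_mm))
    ```

    For example, with a 100 mm gesture scale mapped onto a 1920 px display, a 50 mm hand movement would correspond to a 960 px on-screen movement.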
  • Patent number: 10042430
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Grant
    Filed: November 21, 2016
    Date of Patent: August 7, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
  • Patent number: 9996638
    Abstract: The technology disclosed relates to simplifying updating of a predictive model by clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
    Type: Grant
    Filed: October 31, 2014
    Date of Patent: June 12, 2018
    Assignee: LEAP MOTION, INC.
    Inventors: David S Holz, Raffi Bedikian, Kevin Horowitz, Hua Yang
  • Patent number: 9996797
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Grant
    Filed: October 31, 2014
    Date of Patent: June 12, 2018
    Assignee: LEAP MOTION, INC.
    Inventors: David S Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
  • Patent number: 9983686
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Grant
    Filed: October 9, 2017
    Date of Patent: May 29, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
  • Patent number: 9911240
    Abstract: The technology disclosed relates to a method of interacting with a virtual object. In particular, it relates to referencing a virtual object in an augmented reality space, identifying a physical location of a device in at least one image of the augmented reality space, generating for display a control coincident with a surface of the device, sensing interactions between at least one control object and the control coincident with the surface of the device, and generating data signaling manipulations of the control coincident with the surface of the device.
    Type: Grant
    Filed: August 16, 2017
    Date of Patent: March 6, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Hongyuan (Jimmy) He, David S. Holz
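    The abstract above describes sensing interactions with a control generated coincident with a tracked device surface. Once an interaction point has been projected into the surface plane, the sensing step reduces to a 2D hit test; the sketch below assumes that projection has already happened, and its function name and axis-aligned rectangle are illustrative.

    ```python
    def on_surface_control(touch_point, control_origin, control_size):
        """Hit-test a rectangular control rendered coincident with a device
        surface. All coordinates are 2D, already projected into the plane
        of the surface (units are arbitrary but consistent)."""
        x, y = touch_point
        ox, oy = control_origin
        w, h = control_size
        return ox <= x <= ox + w and oy <= y <= oy + h
    ```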
  • Publication number: 20180032144
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Application
    Filed: October 9, 2017
    Publication date: February 1, 2018
    Applicant: Leap Motion, Inc.
    Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
  • Publication number: 20170345218
    Abstract: The technology disclosed relates to a method of interacting with a virtual object. In particular, it relates to referencing a virtual object in an augmented reality space, identifying a physical location of a device in at least one image of the augmented reality space, generating for display a control coincident with a surface of the device, sensing interactions between at least one control object and the control coincident with the surface of the device, and generating data signaling manipulations of the control coincident with the surface of the device.
    Type: Application
    Filed: August 16, 2017
    Publication date: November 30, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Hongyuan (Jimmy) He, David S. Holz
  • Patent number: 9785247
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Grant
    Filed: May 14, 2015
    Date of Patent: October 10, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
  • Patent number: 9767613
    Abstract: The technology disclosed relates to a method of interacting with a virtual object. In particular, it relates to referencing a virtual object in an augmented reality space, identifying a physical location of a device in at least one image of the augmented reality space, generating for display a control coincident with a surface of the device, sensing interactions between at least one control object and the control coincident with the surface of the device, and generating data signaling manipulations of the control coincident with the surface of the device.
    Type: Grant
    Filed: February 19, 2015
    Date of Patent: September 19, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: Raffi Bedikian, Hongyuan (Jimmy) He, David S. Holz
  • Publication number: 20170075428
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Application
    Filed: November 21, 2016
    Publication date: March 16, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ
  • Publication number: 20170017306
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: September 28, 2016
    Publication date: January 19, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David Samuel HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel A. HARE, Ryan Christopher JULIAN
  • Patent number: 9501152
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes may be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct may be updated, continuously or from time to time, based on the control object's location.
    Type: Grant
    Filed: January 14, 2014
    Date of Patent: November 22, 2016
    Assignee: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
  • Patent number: 9459697
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Grant
    Filed: January 15, 2014
    Date of Patent: October 4, 2016
    Assignee: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Julian