Patents by Inventor Matias Perez

Matias Perez has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10139918
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Grant
    Filed: September 28, 2016
    Date of Patent: November 27, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Samuel Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Christopher Julian
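The scale-ratio idea in this abstract can be sketched in a few lines. This is an illustrative example, not the patented implementation: the function name and the specific scales are assumptions.

```python
# Illustrative sketch: scale a tracked hand displacement into display
# coordinates using the ratio between the physical gesture scale and
# the scale of the displayed movement.

def map_gesture_to_display(gesture_delta_mm, gesture_scale_mm, display_scale_px):
    """Scale a per-frame displacement (mm) into an on-screen move (px).

    gesture_scale_mm: span of the user's comfortable gesture range, e.g. 200 mm.
    display_scale_px: span of the display region being controlled, e.g. 1920 px.
    """
    ratio = display_scale_px / gesture_scale_mm
    return tuple(d * ratio for d in gesture_delta_mm)

# Example: a 10 mm rightward hand move over a 200 mm gesture range drives
# a roughly 96 px cursor move across a 1920 px wide region.
move = map_gesture_to_display((10.0, 0.0), 200.0, 1920.0)
```

The same ratio works in reverse to tune sensitivity: shrinking the assumed gesture range makes small hand motions cover more of the screen.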
  • Patent number: 10043320
    Abstract: The technology disclosed can provide improved safety by detecting potential unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or to determine the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Grant
    Filed: August 31, 2017
    Date of Patent: August 7, 2018
    Assignee: Leap Motion, Inc.
    Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
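One family of "potential unsafe condition" checks can be illustrated with a simple time-to-contact test along an object's sensed path. This sketch is hypothetical: the function names and the 2-second threshold are assumptions, not details from the patent.

```python
# Hypothetical sketch: flag a possible collision when the time until
# contact along the object's current path drops below a safety threshold.

def time_to_contact(distance_m, closing_speed_mps):
    """Seconds until contact, or None if the object is not closing."""
    if closing_speed_mps <= 0:
        return None
    return distance_m / closing_speed_mps

def is_unsafe(distance_m, closing_speed_mps, threshold_s=2.0):
    """True when the estimated time to contact falls under the threshold."""
    ttc = time_to_contact(distance_m, closing_speed_mps)
    return ttc is not None and ttc < threshold_s
```

A wearable or vehicle-mounted system would run such a check per tracked object per frame, with distance and closing speed estimated from the imaging, acoustic, or vibrational data the abstract mentions.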
  • Patent number: 10007350
    Abstract: A technology for tracking motion of a wearable sensor system uses a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. In particular, it relates to capturing gross features and feature values of a real world space using RGB pixels and capturing fine features and feature values of the real world space using IR pixels. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. It also relates to capturing different sceneries of a shared real world space from the perspective of multiple users. It further relates to sharing content between wearable sensor systems. It further relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video stream to a second user of the wearable sensor system.
    Type: Grant
    Filed: June 25, 2015
    Date of Patent: June 26, 2018
    Assignee: LEAP MOTION, INC.
    Inventors: David S. Holz, Matias Perez, Davide Onofrio
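The gross-versus-fine split in this abstract can be sketched as two feature extractors over the same scene: a coarse, downsampled pass over RGB intensity and a full-resolution pass over IR. The data layout and helper names below are hypothetical stand-ins, not the patented pipeline.

```python
# Illustrative sketch: gross (coarse) features from RGB via block
# averaging, fine features from IR kept at full resolution.

def gross_features(rgb_rows, block=2):
    """Downsample an RGB luma grid by averaging block x block cells."""
    out = []
    for r in range(0, len(rgb_rows), block):
        row = []
        for c in range(0, len(rgb_rows[0]), block):
            cells = [rgb_rows[r + i][c + j]
                     for i in range(block) for j in range(block)]
            row.append(sum(cells) / len(cells))
        out.append(row)
    return out

def fine_features(ir_rows):
    """Keep IR at full resolution (identity here, standing in for
    edge or keypoint extraction)."""
    return [row[:] for row in ir_rows]
```

The coarse RGB map would localize a region of interest cheaply; the full-resolution IR map would then refine feature values inside that region.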
  • Patent number: 9983686
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Grant
    Filed: October 9, 2017
    Date of Patent: May 29, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
  • Publication number: 20180032144
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Application
    Filed: October 9, 2017
    Publication date: February 1, 2018
    Applicant: Leap Motion, Inc.
    Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
  • Publication number: 20180012074
    Abstract: The technology disclosed can provide improved safety by detecting potential unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or to determine the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Application
    Filed: August 31, 2017
    Publication date: January 11, 2018
    Applicant: Leap Motion, Inc.
    Inventors: David S. HOLZ, Robert Samuel GORDON, Matias PEREZ
  • Patent number: 9857876
    Abstract: Implementations of the technology disclosed convert captured motion from Cartesian/(x,y,z) space to Frenet-Serret frame space, apply one or more filters to the motion in Frenet-Serret space, and output data (for display or control) in a desired coordinate space—e.g., in a Cartesian/(x,y,z) reference frame. The output data can better represent a user's actual motion or intended motion.
    Type: Grant
    Filed: July 22, 2014
    Date of Patent: January 2, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Gabriel A Hare, Keith Mertens, Matias Perez, Neeloy Roy, David Holz
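The coordinate-space conversion this abstract describes can be approximated discretely: estimate a Frenet-Serret frame (tangent T, normal N, binormal B) from three consecutive motion samples. This is a minimal sketch of the standard construction, not the patented filtering pipeline; a real tracker would also filter in this frame before converting back to (x, y, z).

```python
# Illustrative sketch: discrete Frenet-Serret frame at the middle of
# three consecutive 3D path samples.

def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _norm(v):
    n = sum(x * x for x in v) ** 0.5
    return tuple(x / n for x in v)

def frenet_frame(p0, p1, p2):
    """Discrete (T, N, B) at p1; assumes the path actually curves there."""
    t = _norm(_sub(p2, p0))                   # tangent: central difference
    accel = _sub(_sub(p2, p1), _sub(p1, p0))  # second difference
    b = _norm(_cross(t, accel))               # binormal: T x acceleration
    n = _cross(b, t)                          # normal completes the frame
    return t, n, b
```

For a straight segment the cross product vanishes and the frame is undefined, which is why practical implementations carry the previous frame forward through low-curvature stretches.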
  • Patent number: 9785247
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Grant
    Filed: May 14, 2015
    Date of Patent: October 10, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
  • Patent number: 9754167
    Abstract: The technology disclosed can provide improved safety by detecting potential unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or to determine the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Grant
    Filed: April 17, 2015
    Date of Patent: September 5, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
  • Patent number: 9645654
    Abstract: The technology disclosed relates generally to image analysis and, in particular embodiments, to identifying shapes and capturing motions of objects in three-dimensional space. This is accomplished by calculation of numerous span lengths between opposing sides of a control object wherein each control object can consist of a plurality of span modes, each span mode identified by a frequency distribution of a plurality of sample points. The relevant sample points are derived from the pairing of boundary points on the opposing sides of the control object. For each span mode, span width parameters are calculated from at least part of the distribution of the span lengths, using the span width parameters to initialize at least a portion of a model of the control object, and generating predictive information from the initialized model.
    Type: Grant
    Filed: December 4, 2014
    Date of Patent: May 9, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: Matias Perez, Kevin Horowitz
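The span-length calculation this abstract describes can be sketched directly: pair boundary points on opposing sides of a control object (say, the left and right edges of a finger), measure the spans, and reduce the distribution to width parameters that seed a model. All names here are hypothetical illustrations, not the patented method.

```python
# Illustrative sketch: span lengths between paired boundary points and
# width parameters summarizing their distribution.

def span_lengths(left_edge, right_edge):
    """Euclidean distance between paired (x, y) boundary points."""
    return [((xr - xl) ** 2 + (yr - yl) ** 2) ** 0.5
            for (xl, yl), (xr, yr) in zip(left_edge, right_edge)]

def span_width_params(spans):
    """Mean and standard deviation of the span distribution, usable to
    initialize a cylinder-like model of the control object."""
    mean = sum(spans) / len(spans)
    var = sum((s - mean) ** 2 for s in spans) / len(spans)
    return mean, var ** 0.5
```

In the patent's terms, clustering the span lengths by their frequency distribution would separate span modes (e.g. finger width versus palm width) before these parameters are computed per mode.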
  • Publication number: 20170017306
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: September 28, 2016
    Publication date: January 19, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David Samuel HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel A. HARE, Ryan Christopher JULIAN
  • Patent number: 9459697
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Grant
    Filed: January 15, 2014
    Date of Patent: October 4, 2016
    Assignee: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Julian
  • Publication number: 20150153835
    Abstract: The technology disclosed relates generally to image analysis and, in particular embodiments, to identifying shapes and capturing motions of objects in three-dimensional space. This is accomplished by calculation of numerous span lengths between opposing sides of a control object wherein each control object can consist of a plurality of span modes, each span mode identified by a frequency distribution of a plurality of sample points. The relevant sample points are derived from the pairing of boundary points on the opposing sides of the control object. For each span mode, span width parameters are calculated from at least part of the distribution of the span lengths, using the span width parameters to initialize at least a portion of a model of the control object, and generating predictive information from the initialized model.
    Type: Application
    Filed: December 4, 2014
    Publication date: June 4, 2015
    Applicant: LEAP MOTION, INC.
    Inventors: Matias Perez, Kevin Horowitz
  • Publication number: 20150029092
    Abstract: The technology disclosed relates to using a curvilinear gestural path of a control object as a gesture-based input command for a motion-sensing system. In particular, the curvilinear gestural path can be broken down into curve segments, and each curve segment can be mapped to a recorded gesture primitive. Further, certain sequences of gesture primitives can be used to identify the original curvilinear gesture.
    Type: Application
    Filed: July 23, 2014
    Publication date: January 29, 2015
    Applicant: LEAP MOTION, INC.
    Inventors: David HOLZ, Matias PEREZ, Neeloy ROY, Maxwell Sills
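The segment-to-primitive mapping this abstract describes can be sketched by classifying each consecutive point triple along the gestural path by its signed turning direction, then matching the resulting sequence against recorded gestures. The primitive alphabet ("L"/"R"/"S") and function names are assumptions for illustration.

```python
# Hypothetical sketch: reduce a curvilinear path to a sequence of
# turning-direction primitives.

def turn_sign(p0, p1, p2, eps=1e-9):
    """Classify the turn at p1: 'L' (left), 'R' (right), or 'S' (straight),
    from the sign of the 2D cross product of the two step vectors."""
    cross = ((p1[0] - p0[0]) * (p2[1] - p1[1])
             - (p1[1] - p0[1]) * (p2[0] - p1[0]))
    if cross > eps:
        return "L"
    if cross < -eps:
        return "R"
    return "S"

def primitives(path):
    """Primitive sequence over all consecutive point triples."""
    return [turn_sign(path[i], path[i + 1], path[i + 2])
            for i in range(len(path) - 2)]
```

A counter-clockwise loop reduces to a run of "L" primitives, which a recognizer could match against a recorded circle gesture regardless of the loop's exact size or position.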
  • Publication number: 20150022447
    Abstract: Implementations of the technology disclosed convert captured motion from Cartesian/(x,y,z) space to Frenet-Serret frame space, apply one or more filters to the motion in Frenet-Serret space, and output data (for display or control) in a desired coordinate space—e.g., in a Cartesian/(x,y,z) reference frame. The output data can better represent a user's actual motion or intended motion.
    Type: Application
    Filed: July 22, 2014
    Publication date: January 22, 2015
    Applicant: Leap Motion, Inc.
    Inventors: Gabriel A. HARE, Keith MERTENS, Matias PEREZ, Neeloy ROY, David HOLZ
  • Publication number: 20140201666
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: January 15, 2014
    Publication date: July 17, 2014
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Julian