Patents by Inventor Matias Perez
Matias Perez has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10139918
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Grant
Filed: September 28, 2016
Date of Patent: November 27, 2018
Assignee: Leap Motion, Inc.
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Samuel Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Christopher Julian
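The scale-ratio mapping and degree-of-completion logic this abstract describes can be pictured with a small sketch. The function and parameter names here are hypothetical illustrations, not taken from the patent:

```python
def on_screen_movement(gesture_distance_mm: float,
                       gesture_scale_mm: float,
                       display_scale_px: float) -> float:
    """Scale a physically traversed gesture distance into display units
    using the ratio between the display scale and the identified
    gesture scale."""
    return gesture_distance_mm * (display_scale_px / gesture_scale_mm)


def completion(gesture_distance_mm: float, full_gesture_mm: float) -> float:
    """Degree of completion of a recognized gesture, clamped to [0, 1]."""
    return max(0.0, min(1.0, gesture_distance_mm / full_gesture_mm))
```

For example, a 50 mm hand movement against a 100 mm gesture scale and a 500 px display scale would map to a 250 px on-screen movement.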
-
Patent number: 10043320
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Grant
Filed: August 31, 2017
Date of Patent: August 7, 2018
Assignee: Leap Motion, Inc.
Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
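One way to picture the unsafe-condition check is a minimal sketch that linearly extrapolates a tracked object's path and flags an approach within a safety margin. All names, thresholds, and the constant-velocity model are illustrative assumptions, not the patent's method:

```python
def predicts_collision(position, velocity, obstacle,
                       safe_radius, horizon=2.0, dt=0.1):
    """Extrapolate a 3-D position along a constant velocity for `horizon`
    seconds and report whether the path ever comes within `safe_radius`
    of `obstacle` (all coordinates as (x, y, z) tuples)."""
    steps = int(horizon / dt)
    for i in range(steps + 1):
        t = i * dt
        p = tuple(position[k] + velocity[k] * t for k in range(3))
        dist = sum((p[k] - obstacle[k]) ** 2 for k in range(3)) ** 0.5
        if dist < safe_radius:
            return True
    return False
```

A real system would use the sensor's estimated trajectory rather than a constant-velocity model, but the thresholded distance test captures the basic idea.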
-
Patent number: 10007350
Abstract: A technology for tracking motion of a wearable sensor system uses a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. In particular, it relates to capturing gross features and feature values of a real-world space using RGB pixels and capturing fine features and feature values of the real-world space using IR pixels. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment, to capturing different sceneries of a shared real-world space from the perspective of multiple users, and to sharing content between wearable sensor systems. It further relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Grant
Filed: June 25, 2015
Date of Patent: June 26, 2018
Assignee: Leap Motion, Inc.
Inventors: David S. Holz, Matias Perez, Davide Onofrio
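Combining RGB and IR pixels implies separating an interleaved sensor frame into per-channel images before feature extraction. The sketch below assumes a hypothetical repeating 2x2 channel tile; the tile layout and function names are illustrative, not the patent's design:

```python
def split_channels(frame, tile):
    """Separate an interleaved RGB-IR mosaic into per-channel pixel lists.

    `frame` is a 2-D list of raw pixel values; `tile` is the repeating
    2x2 channel pattern of the sensor, e.g. [["R", "G"], ["IR", "B"]].
    Returns {channel: [((row, col), value), ...]} so the RGB channels
    (gross features) and the IR channel (fine features) can be
    processed separately.
    """
    channels = {}
    for r, row in enumerate(frame):
        for c, val in enumerate(row):
            ch = tile[r % 2][c % 2]
            channels.setdefault(ch, []).append(((r, c), val))
    return channels
```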
-
Patent number: 9983686
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine, to providing monitoring information about a process under control, to providing biometric information about an individual, and to providing abstract feature information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: October 9, 2017
Date of Patent: May 29, 2018
Assignee: Leap Motion, Inc.
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
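The abstract features named here (pose, grab strength, pinch strength, confidence) resemble the per-hand properties the Leap Motion tracking API exposed. A hedged sketch of how such features might be consumed, with illustrative types and thresholds:

```python
from dataclasses import dataclass


@dataclass
class HandFeatures:
    """Abstract features for one tracked hand (names loosely follow the
    Leap Motion API; the exact fields here are illustrative)."""
    pinch_strength: float  # 0.0 (open) .. 1.0 (full pinch)
    grab_strength: float   # 0.0 (open) .. 1.0 (closed fist)
    confidence: float      # tracker's confidence in the pose estimate


def is_grabbing(hand: HandFeatures,
                threshold: float = 0.8,
                min_confidence: float = 0.5) -> bool:
    """Treat the hand as issuing a grab command only when the tracker
    is sufficiently confident in its estimate."""
    return (hand.confidence >= min_confidence
            and hand.grab_strength >= threshold)
```

Gating command input on the confidence value is one simple way such abstract features support robust machine control.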
-
Publication number: 20180032144
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine, to providing monitoring information about a process under control, to providing biometric information about an individual, and to providing abstract feature information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Application
Filed: October 9, 2017
Publication date: February 1, 2018
Applicant: Leap Motion, Inc.
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
-
Publication number: 20180012074
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Application
Filed: August 31, 2017
Publication date: January 11, 2018
Applicant: Leap Motion, Inc.
Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
-
Patent number: 9857876
Abstract: Implementations of the technology disclosed convert captured motion from Cartesian (x, y, z) space to Frenet-Serret frame space, apply one or more filters to the motion in Frenet-Serret space, and output data (for display or control) in a desired coordinate space, e.g., a Cartesian (x, y, z) reference frame. The output data can better represent a user's actual motion or intended motion.
Type: Grant
Filed: July 22, 2014
Date of Patent: January 2, 2018
Assignee: Leap Motion, Inc.
Inventors: Gabriel A. Hare, Keith Mertens, Matias Perez, Neeloy Roy, David Holz
-
Patent number: 9785247
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine, to providing monitoring information about a process under control, to providing biometric information about an individual, and to providing abstract feature information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: May 14, 2015
Date of Patent: October 10, 2017
Assignee: Leap Motion, Inc.
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
-
Patent number: 9754167
Abstract: The technology disclosed can provide improved safety by detecting potentially unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic, or vibrational waves. Implementations can enable improved safety for users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head-mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio, or vibrational detection.
Type: Grant
Filed: April 17, 2015
Date of Patent: September 5, 2017
Assignee: Leap Motion, Inc.
Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
-
Patent number: 9645654
Abstract: The technology disclosed relates generally to image analysis and, in particular embodiments, to identifying shapes and capturing motions of objects in three-dimensional space. This is accomplished by calculating numerous span lengths between opposing sides of a control object, wherein each control object can consist of a plurality of span modes, each span mode identified by a frequency distribution of a plurality of sample points. The relevant sample points are derived from pairing boundary points on the opposing sides of the control object. For each span mode, span width parameters are calculated from at least part of the distribution of the span lengths; the span width parameters are used to initialize at least a portion of a model of the control object, and predictive information is generated from the initialized model.
Type: Grant
Filed: December 4, 2014
Date of Patent: May 9, 2017
Assignee: Leap Motion, Inc.
Inventors: Matias Perez, Kevin Horowitz
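The span-length idea can be sketched as pairing boundary points across the object, measuring the distances, and summarizing their distribution to seed a model. This is a simplified illustration under assumed names, not the patent's algorithm:

```python
import math
import statistics


def span_lengths(side_a, side_b):
    """Lengths of spans between paired boundary points on opposing
    sides of a control object (points as (x, y) tuples)."""
    return [math.dist(a, b) for a, b in zip(side_a, side_b)]


def span_width_parameters(lengths):
    """Summarize the span-length distribution; parameters like these
    could initialize part of a model of the control object."""
    return {
        "mean": statistics.mean(lengths),
        "stdev": statistics.pstdev(lengths),
        # Most frequent (binned) span length, akin to a dominant span mode.
        "mode": statistics.mode(round(x, 1) for x in lengths),
    }
```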
-
Publication number: 20170017306
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Application
Filed: September 28, 2016
Publication date: January 19, 2017
Applicant: Leap Motion, Inc.
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Samuel Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Christopher Julian
-
Patent number: 9459697
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Grant
Filed: January 15, 2014
Date of Patent: October 4, 2016
Assignee: Leap Motion, Inc.
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Julian
-
Publication number: 20150153835
Abstract: The technology disclosed relates generally to image analysis and, in particular embodiments, to identifying shapes and capturing motions of objects in three-dimensional space. This is accomplished by calculating numerous span lengths between opposing sides of a control object, wherein each control object can consist of a plurality of span modes, each span mode identified by a frequency distribution of a plurality of sample points. The relevant sample points are derived from pairing boundary points on the opposing sides of the control object. For each span mode, span width parameters are calculated from at least part of the distribution of the span lengths; the span width parameters are used to initialize at least a portion of a model of the control object, and predictive information is generated from the initialized model.
Type: Application
Filed: December 4, 2014
Publication date: June 4, 2015
Applicant: Leap Motion, Inc.
Inventors: Matias Perez, Kevin Horowitz
-
Publication number: 20150029092
Abstract: The technology disclosed relates to using a curvilinear gestural path of a control object as a gesture-based input command for a motion-sensing system. In particular, the curvilinear gestural path can be broken down into curve segments, and each curve segment can be mapped to a recorded gesture primitive. Further, certain sequences of gesture primitives can be used to identify the original curvilinear gesture.
Type: Application
Filed: July 23, 2014
Publication date: January 29, 2015
Applicant: Leap Motion, Inc.
Inventors: David Holz, Matias Perez, Neeloy Roy, Maxwell Sills
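The segment-to-primitive mapping might look like the following sketch, which labels each turn of a sampled 2-D path as left, right, or straight and collapses runs into a primitive string. The primitive alphabet and gesture table are illustrative assumptions:

```python
def turn_direction(p0, p1, p2):
    """Sign of the 2-D cross product of consecutive segments:
    +1 = left turn, -1 = right turn, 0 = straight."""
    cross = ((p1[0] - p0[0]) * (p2[1] - p1[1])
             - (p1[1] - p0[1]) * (p2[0] - p1[0]))
    return (cross > 1e-9) - (cross < -1e-9)


def to_primitives(points):
    """Collapse a sampled curvilinear path into gesture primitives:
    'L' (left curve), 'R' (right curve), 'S' (straight run)."""
    labels = {1: "L", -1: "R", 0: "S"}
    prims = []
    for p0, p1, p2 in zip(points, points[1:], points[2:]):
        label = labels[turn_direction(p0, p1, p2)]
        if not prims or prims[-1] != label:
            prims.append(label)
    return "".join(prims)


# A hypothetical lookup from primitive sequences to recorded gestures.
GESTURES = {"L": "counterclockwise arc", "R": "clockwise arc",
            "S": "swipe", "LR": "s-curve"}
```

A primitive string such as "LR" can then be matched against the recorded-gesture table to recover the original curvilinear gesture.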
-
Publication number: 20150022447
Abstract: Implementations of the technology disclosed convert captured motion from Cartesian (x, y, z) space to Frenet-Serret frame space, apply one or more filters to the motion in Frenet-Serret space, and output data (for display or control) in a desired coordinate space, e.g., a Cartesian (x, y, z) reference frame. The output data can better represent a user's actual motion or intended motion.
Type: Application
Filed: July 22, 2014
Publication date: January 22, 2015
Applicant: Leap Motion, Inc.
Inventors: Gabriel A. Hare, Keith Mertens, Matias Perez, Neeloy Roy, David Holz
-
Publication number: 20140201666
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Application
Filed: January 15, 2014
Publication date: July 17, 2014
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Julian