Patents Assigned to Leap Motion, Inc.
  • Publication number: 20180046256
    Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
    Type: Application
    Filed: September 5, 2017
    Publication date: February 15, 2018
    Applicant: Leap Motion, Inc.
    Inventor: David S. Holz
  • Patent number: 9891712
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Grant
    Filed: December 16, 2014
    Date of Patent: February 13, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
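The abstract above describes deriving spatial attributes of a user-defined interaction space from parameters of the control gesture itself. A minimal sketch of that idea, with all names and the fitting rule chosen purely for illustration (the patent's claims do not specify this construction): the center of a circular sweep gesture becomes the space's center, and the mean sample distance becomes its radius.

```python
# Hypothetical sketch: derive an interaction space's spatial attributes
# from parameters of a control gesture. Names and the fitting rule are
# illustrative assumptions, not taken from the patent's claims.
from dataclasses import dataclass

@dataclass
class InteractionSpace:
    center: tuple   # 3D center of the user-defined region
    radius: float   # spatial extent of the region

def space_from_circular_sweep(points):
    """Fit a sphere-like interaction space to sampled gesture points.

    The center is the centroid of the sweep; the radius is the mean
    distance of the samples from that centroid.
    """
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    radius = sum(
        ((p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2) ** 0.5
        for p in points
    ) / n
    return InteractionSpace((cx, cy, cz), radius)
```

The same pattern extends to the virtual-camera case: gesture parameters (sweep speed, extent) would map to camera parameters (zoom, orbit radius) instead of region attributes.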
  • Publication number: 20180039334
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: August 18, 2017
    Publication date: February 8, 2018
    Applicant: Leap Motion, Inc.
    Inventors: Isaac COHEN, Maxwell SILLS
  • Patent number: 9886097
    Abstract: The technology disclosed relates to automatically (e.g., programmatically) initializing predictive information for tracking a complex control object (e.g., hand, hand and tool combination, robot end effector) based upon information about characteristics of the object determined from sets of collected observed information. Automated initialization techniques obviate the need for special and often bizarre start-up rituals (place your hands on the screen at the places indicated during a full moon, and so forth) required by conventional techniques. In implementations, systems can refine initial predictive information to reflect an observed condition based on comparison of the observed with an analysis of sets of collected observed information.
    Type: Grant
    Filed: May 5, 2017
    Date of Patent: February 6, 2018
    Assignee: Leap Motion, Inc.
    Inventor: Kevin A. Horowitz
  • Publication number: 20180033159
    Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera based on edge points thereof.
    Type: Application
    Filed: August 18, 2017
    Publication date: February 1, 2018
    Applicant: Leap Motion, Inc.
    Inventor: David Holz
  • Publication number: 20180032144
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Application
    Filed: October 9, 2017
    Publication date: February 1, 2018
    Applicant: Leap Motion, Inc.
    Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
  • Patent number: 9881386
    Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera based on edge points thereof.
    Type: Grant
    Filed: August 18, 2017
    Date of Patent: January 30, 2018
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
  • Patent number: 9868449
    Abstract: The technology disclosed relates to an embeddable motion sensory control device that detects gestures in a three dimensional (3D) sensory space within a vehicle cabin, detecting a gesture in the 3D sensory space and interpreting the gesture as a command to a (sub) system of the vehicle under control, and issuing the command when appropriate.
    Type: Grant
    Filed: May 29, 2015
    Date of Patent: January 16, 2018
    Assignee: Leap Motion, Inc.
    Inventors: David S. Holz, Hua Yang, Robert Samuel Gordon, Neeloy Roy, Justin Schunick, Paul A. Durdik
  • Publication number: 20180011546
    Abstract: The technology disclosed relates to using gestures to supplant or augment use of a standard input device coupled to a system. It also relates to controlling a display using gestures. It further relates to controlling a system using more than one input device. In particular, it relates to detecting a standard input device that causes on-screen actions on a display in response to control manipulations performed using the standard input device. Further, a library of analogous gestures is identified, which includes gestures that are analogous to the control manipulations and also cause the on-screen actions responsive to the control manipulations. Thus, when a gesture from the library of analogous gestures is detected, a signal is generated that mimics a standard signal from the standard input device and causes at least one on-screen action.
    Type: Application
    Filed: September 5, 2017
    Publication date: January 11, 2018
    Applicant: Leap Motion, Inc.
    Inventor: David HOLZ
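The "library of analogous gestures" in the abstract above can be pictured as a lookup from recognized gestures to synthetic events that mimic a standard input device. The gesture names and event format below are assumptions for illustration only; the publication does not specify this representation.

```python
# Illustrative sketch of a library of analogous gestures: each gesture
# maps to a synthetic signal mimicking a standard input device. All
# gesture names and event fields are hypothetical.
ANALOGOUS_GESTURES = {
    "swipe_left":  {"device": "keyboard", "signal": "key_press", "code": "PageDown"},
    "swipe_right": {"device": "keyboard", "signal": "key_press", "code": "PageUp"},
    "pinch":       {"device": "mouse",    "signal": "scroll",    "code": "zoom_out"},
}

def mimic_standard_signal(gesture_name):
    """Return the standard-device signal a recognized gesture stands in
    for, or None if the gesture has no analogue in the library."""
    return ANALOGOUS_GESTURES.get(gesture_name)
```

A detected gesture with no entry falls through to None, so the system can leave unmapped gestures to other handlers rather than emitting a spurious on-screen action.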
  • Publication number: 20180012074
    Abstract: The technology disclosed can provide improved safety by detecting potential unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determining the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Application
    Filed: August 31, 2017
    Publication date: January 11, 2018
    Applicant: Leap Motion, Inc.
    Inventors: David S. HOLZ, Robert Samuel GORDON, Matias PEREZ
  • Patent number: 9857876
    Abstract: Implementations of the technology disclosed convert captured motion from Cartesian/(x,y,z) space to Frenet-Serret frame space, apply one or more filters to the motion in Frenet-Serret space, and output data (for display or control) in a desired coordinate space—e.g., in a Cartesian/(x,y,z) reference frame. The output data can better represent a user's actual motion or intended motion.
    Type: Grant
    Filed: July 22, 2014
    Date of Patent: January 2, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Gabriel A. Hare, Keith Mertens, Matias Perez, Neeloy Roy, David Holz
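A discrete Frenet-Serret frame, as used in the patent above to re-express captured motion before filtering, can be built from three consecutive trajectory samples. The finite-difference construction below is one standard discretization chosen for illustration; the patent does not specify this particular scheme.

```python
# Sketch: discrete Frenet-Serret (T, N, B) frame from three trajectory
# samples, and projection of a Cartesian vector into that frame. The
# discretization is an illustrative assumption.
import numpy as np

def frenet_frame(p_prev, p_curr, p_next):
    """Return the (tangent, normal, binormal) unit vectors at p_curr."""
    t = p_next - p_prev
    t = t / np.linalg.norm(t)                    # unit tangent (central difference)
    a = (p_next - p_curr) - (p_curr - p_prev)    # second difference (curvature direction)
    n = a - np.dot(a, t) * t                     # remove tangential component
    n = n / np.linalg.norm(n)                    # unit principal normal
    b = np.cross(t, n)                           # binormal completes the right-handed frame
    return t, n, b

def to_frenet(v, frame):
    """Express Cartesian vector v in the local (T, N, B) basis, where
    filters (e.g., smoothing along the normal) can then be applied."""
    t, n, b = frame
    return np.array([np.dot(v, t), np.dot(v, n), np.dot(v, b)])
```

After filtering in Frenet-Serret space, the inverse projection (a dot with the transposed basis) maps the data back to the Cartesian reference frame for display or control.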
  • Publication number: 20170374279
    Abstract: The technology disclosed relates to adjusting the monitored field of view of a camera and/or a view of a virtual scene from a point of view of a virtual camera based on the distance between tracked objects. For example, if the user's hand is being tracked for gestures, the closer the hand gets to another object, the tighter the frame can become—i.e., the more the camera can zoom in so that the hand and the other object occupy most of the frame. The camera can also be reoriented so that the hand and the other object remain in the center of the field of view. The distance between two objects in a camera's field of view can be determined and a parameter of a motion-capture system adjusted based thereon. In particular, the pan and/or zoom levels of the camera may be adjusted in accordance with the distance.
    Type: Application
    Filed: September 8, 2017
    Publication date: December 28, 2017
    Applicant: Leap Motion, Inc.
    Inventor: David S. HOLZ
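The zoom behavior described above (the closer the tracked hand gets to another object, the tighter the frame) reduces to a mapping from inter-object distance to a camera zoom parameter. A minimal sketch, with the linear mapping and all parameter values being illustrative assumptions:

```python
# Hypothetical sketch: map the distance between two tracked objects to a
# camera zoom level. The linear mapping and default values are assumptions.
def zoom_for_distance(distance, min_dist=0.1, max_dist=1.0,
                      min_zoom=1.0, max_zoom=4.0):
    """Closer objects -> tighter frame (higher zoom)."""
    d = max(min(distance, max_dist), min_dist)      # clamp to working range
    frac = (max_dist - d) / (max_dist - min_dist)   # 0.0 at far, 1.0 at near
    return min_zoom + frac * (max_zoom - min_zoom)
```

Pan would be handled analogously: reorient the camera toward the midpoint of the two objects so both stay centered in the field of view.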
  • Publication number: 20170345218
    Abstract: The technology disclosed relates to a method of interacting with a virtual object. In particular, it relates to referencing a virtual object in an augmented reality space, identifying a physical location of a device in at least one image of the augmented reality space, generating for display a control coincident with a surface of the device, sensing interactions between at least one control object and the control coincident with the surface of the device, and generating data signaling manipulations of the control coincident with the surface of the device.
    Type: Application
    Filed: August 16, 2017
    Publication date: November 30, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Hongyuan (Jimmy) He, David S. Holz
  • Publication number: 20170344068
    Abstract: A motion control assembly includes a motion control device electrically connected to a battery pack and to a mobile computing device for at least data transmission therebetween. The motion control device can generate inputs, such as inputs corresponding to an attribute of a sensed object, for transmission to the mobile computing device. The drain on a battery of a battery-powered mobile computing device can be reduced when used with a motion control device as follows. A motion control assembly, comprising a motion control device and a battery pack, capable of powering the motion control device, as an integral, one-piece unit, is selected. The motion control device is connected to an electrical connector of a battery-powered mobile computing device. The motion control device is supplied with power from the battery pack during use so the motion control device can be operated using the power from the battery pack.
    Type: Application
    Filed: August 15, 2017
    Publication date: November 30, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Robert Samuel GORDON, Paul Alan DURDIK
  • Publication number: 20170345219
    Abstract: The technology disclosed can provide capabilities to view and/or interact with the real world to the user of a wearable (or portable) device using a sensor configured to capture motion and/or determining the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved user experience, greater safety, greater functionality to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted devices (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Application
    Filed: August 18, 2017
    Publication date: November 30, 2017
    Applicant: Leap Motion, Inc.
    Inventor: David S. HOLZ
  • Publication number: 20170330374
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured using image information or the equivalents. Predictive information can be improved applying techniques for correlating with information from observations.
    Type: Application
    Filed: July 31, 2017
    Publication date: November 16, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Kevin A. HOROWITZ, David S. HOLZ
  • Publication number: 20170300125
    Abstract: The technology disclosed relates to operating a motion-capture system responsive to available computational resources. In particular, it relates to assessing a level of image acquisition and image-analysis resources available using benchmarking of system components. In response, one or more image acquisition parameters and/or image-analysis parameters are adjusted. Acquisition and/or analysis of image data are then made compliant with the adjusted image acquisition parameters and/or image-analysis parameters. In some implementations, image acquisition parameters include frame resolution and frame capture rate and image-analysis parameters include analysis algorithm and analysis density.
    Type: Application
    Filed: July 3, 2017
    Publication date: October 19, 2017
    Applicant: Leap Motion, Inc.
    Inventor: David HOLZ
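The resource-responsive operation described above, assessing available resources via benchmarking and then adjusting acquisition and analysis parameters, can be sketched as a tiered policy. The tiers, scores, and concrete parameter values below are illustrative assumptions; the publication describes the adaptive principle, not these numbers.

```python
# Illustrative sketch: pick image-acquisition and image-analysis
# parameters from a benchmark of available resources. All tier
# thresholds and parameter values are assumptions.
def adjust_acquisition(benchmark_score):
    """Map a normalized benchmark score (0.0-1.0) to acquisition settings."""
    if benchmark_score >= 0.8:      # ample resources: full fidelity
        return {"resolution": (1280, 720), "fps": 120, "analysis_density": "full"}
    if benchmark_score >= 0.4:      # moderate resources: reduce load
        return {"resolution": (640, 480), "fps": 60, "analysis_density": "half"}
    # constrained resources: keep the pipeline responsive at low fidelity
    return {"resolution": (320, 240), "fps": 30, "analysis_density": "sparse"}
```

Acquisition and analysis then run against whichever settings were selected, so the pipeline stays compliant with the resources actually available.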
  • Publication number: 20170300209
    Abstract: The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing with the 3D sensory space, or virtual object density in the gestural interface.
    Type: Application
    Filed: June 29, 2017
    Publication date: October 19, 2017
    Applicant: Leap Motion, Inc.
    Inventor: David S. HOLZ
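The dominant-gesture determination above, comparing magnitudes of spatial trajectories, can be sketched as follows. The trajectory representation (a list of 3D displacement vectors per candidate gesture) is an assumption for illustration:

```python
# Hypothetical sketch: pick the dominant gesture as the one whose spatial
# trajectory has the largest total magnitude. The displacement-list
# representation is an illustrative assumption.
def dominant_gesture(trajectories):
    """trajectories: {gesture_name: [(dx, dy, dz), ...]} -> dominant name."""
    def magnitude(traj):
        # total path length of the gesture's spatial trajectory
        return sum((dx * dx + dy * dy + dz * dz) ** 0.5 for dx, dy, dz in traj)
    return max(trajectories, key=lambda name: magnitude(trajectories[name]))
```

On this reduction, a deliberate broad wave outweighs an incidental finger twitch detected in the same frame, which is the meaningful/non-meaningful distinction the abstract describes.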
  • Patent number: 9785247
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Grant
    Filed: May 14, 2015
    Date of Patent: October 10, 2017
    Assignee: Leap Motion, Inc.
    Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
  • Publication number: 20170285169
    Abstract: The technology disclosed relates to determining positional information of an object in a field of view. In particular, it relates to calculating a distance of the object from a reference such as a sensor including scanning the field of view by selectively illuminating directionally oriented light sources and measuring one or more differences in property of returning light emitted from the light sources and reflected from the object. The property can be intensity or phase difference of the light. It also relates to finding an object in a region of space. In particular, it relates to scanning the region of space with directionally controllable illumination, determining a difference in a property of the illumination received for two or more points in the scanning, and determining positional information of the object based in part upon the points in the scanning corresponding to the difference in the property.
    Type: Application
    Filed: June 16, 2017
    Publication date: October 5, 2017
    Applicant: Leap Motion, Inc.
    Inventor: David HOLZ
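One concrete reading of "measuring a difference in property of returning light" when that property is phase is the standard amplitude-modulated time-of-flight relation: the phase shift between emitted and returning light, at a known modulation frequency, yields distance. This sketch shows only that textbook relation; it is not claimed to be the patent's specific method.

```python
# Sketch of phase-based distance estimation (standard AMCW time-of-flight
# relation): d = c * dphi / (4 * pi * f). The halving accounts for the
# round trip to the object and back.
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_shift_rad, mod_freq_hz):
    """Distance to the reflecting object from the measured phase shift
    of returning amplitude-modulated light."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)
```

Note the usual ambiguity: phase wraps every 2*pi, so the unambiguous range at 10 MHz modulation is about 15 m, after which distances alias.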