Patents Assigned to Leap Motion, Inc.
  • Patent number: 9778752
    Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
    Type: Grant
    Filed: December 28, 2016
    Date of Patent: October 3, 2017
    Assignee: Leap Motion, Inc.
    Inventor: David S. Holz
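    Sketch: A minimal, hypothetical Python illustration of monitoring a region for the presence or absence of a control object and turning attribute changes into control input; the event names, jitter threshold, and data types are assumptions, not the patented method.
```python
# Hypothetical sketch: watch a monitored region for a control object and
# emit control events when its presence or tracked attributes change.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ObjectState:
    present: bool
    position: Optional[Tuple[float, float, float]] = None  # sensor coordinates

def interpret_frame(prev: ObjectState, curr: ObjectState) -> list:
    """Translate presence changes and attribute deltas into control events."""
    events = []
    if curr.present and not prev.present:
        events.append(("object_entered", curr.position))
    elif prev.present and not curr.present:
        events.append(("object_left", prev.position))
    elif curr.present and prev.present:
        delta = tuple(c - p for c, p in zip(curr.position, prev.position))
        if any(abs(d) > 1e-3 for d in delta):   # ignore sensor jitter
            events.append(("object_moved", delta))
    return events
```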
  • Publication number: 20170270356
    Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Application
    Filed: June 2, 2017
    Publication date: September 21, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Maxwell SILLS, Aaron SMITH, David S. HOLZ, Hongyuan (Jimmy) He
  • Patent number: 9767345
    Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera based on edge points thereof.
    Type: Grant
    Filed: August 31, 2016
    Date of Patent: September 19, 2017
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
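    Sketch: The abstract above describes recovering shape and position from cross-sections via edge points; as an illustration only (not the patented algorithm), the following fits a circular cross-section to 2D edge points with an algebraic least-squares fit.
```python
# Illustrative only: estimate the center and radius of a roughly circular
# cross-section from its detected edge points.
import numpy as np

def fit_cross_section(edge_points: np.ndarray):
    """edge_points: (N, 2) edge coordinates in one cross-section plane."""
    x, y = edge_points[:, 0], edge_points[:, 1]
    # Solve x^2 + y^2 + a*x + b*y + c = 0 for (a, b, c) in least squares.
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x ** 2 + y ** 2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    center = (-a / 2.0, -b / 2.0)
    radius = float(np.sqrt(a ** 2 / 4.0 + b ** 2 / 4.0 - c))
    return center, radius
```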
  • Patent number: 9766709
    Abstract: The technology disclosed relates to using gestures to supplant or augment use of a standard input device coupled to a system. It also relates to controlling a display using gestures. It further relates to controlling a system using more than one input device. In particular, it relates to detecting a standard input device that causes on-screen actions on a display in response to control manipulations performed using the standard input device. Further, a library of analogous gestures is identified, which includes gestures that are analogous to the control manipulations and also cause the on-screen actions responsive to the control manipulations. Thus, when a gesture from the library of analogous gestures is detected, a signal is generated that mimics a standard signal from the standard input device and causes at least one on-screen action.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: September 19, 2017
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
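    Sketch: A hedged Python illustration of the "library of analogous gestures" idea, mapping recognized gestures to synthetic standard-input signals; the gesture names, signal names, and send_signal callback are invented for this example.
```python
# Hypothetical mapping from gestures to the signals a standard input device
# (keyboard/mouse) would have produced for the same on-screen action.
ANALOGOUS_GESTURES = {
    "swipe_left":  ("keyboard", "KEY_LEFT"),
    "swipe_right": ("keyboard", "KEY_RIGHT"),
    "pinch":       ("mouse",    "WHEEL_ZOOM_OUT"),
    "spread":      ("mouse",    "WHEEL_ZOOM_IN"),
}

def mimic_standard_input(gesture: str, send_signal) -> bool:
    """If the gesture is in the library, emit the mimicked device signal."""
    entry = ANALOGOUS_GESTURES.get(gesture)
    if entry is None:
        return False
    device, signal = entry
    send_signal(device, signal)   # the display then reacts as it would to the device
    return True
```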
  • Patent number: 9767613
    Abstract: The technology disclosed relates to a method of interacting with a virtual object. In particular, it relates to referencing a virtual object in an augmented reality space, identifying a physical location of a device in at least one image of the augmented reality space, generating for display a control coincident with a surface of the device, sensing interactions between at least one control object and the control coincident with the surface of the device, and generating data signaling manipulations of the control coincident with the surface of the device.
    Type: Grant
    Filed: February 19, 2015
    Date of Patent: September 19, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: Raffi Bedikian, Hongyuan (Jimmy) He, David S. Holz
  • Patent number: 9762792
    Abstract: The technology disclosed relates to adjusting the monitored field of view of a camera and/or a view of a virtual scene from a point of view of a virtual camera based on the distance between tracked objects. For example, if the user's hand is being tracked for gestures, the closer the hand gets to another object, the tighter the frame can become, i.e., the more the camera can zoom in so that the hand and the other object occupy most of the frame. The camera can also be reoriented so that the hand and the other object remain in the center of the field of view. The distance between two objects in a camera's field of view can be determined and a parameter of a motion-capture system adjusted based thereon. In particular, the pan and/or zoom levels of the camera may be adjusted in accordance with the distance.
    Type: Grant
    Filed: December 20, 2016
    Date of Patent: September 12, 2017
    Assignee: Leap Motion, Inc.
    Inventor: David S. Holz
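    Sketch: A minimal Python sketch of the distance-driven adjustment described above, assuming the hand and the other object are tracked as 3D points; the distance-to-zoom mapping, limits, and units are illustrative assumptions.
```python
import numpy as np

def adjust_camera(hand: np.ndarray, other: np.ndarray,
                  min_zoom: float = 1.0, max_zoom: float = 4.0,
                  near: float = 0.05, far: float = 0.60):
    """hand, other: 3D positions in meters. Returns (zoom, pan_target)."""
    distance = float(np.linalg.norm(hand - other))
    # Closer together -> tighter frame (more zoom); farther apart -> wider frame.
    t = float(np.clip((far - distance) / (far - near), 0.0, 1.0))
    zoom = min_zoom + t * (max_zoom - min_zoom)
    pan_target = (hand + other) / 2.0   # keep both objects centered in view
    return zoom, pan_target
```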
  • Patent number: 9754167
    Abstract: The technology disclosed can provide improved safety by detecting potential unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Grant
    Filed: April 17, 2015
    Date of Patent: September 5, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
  • Patent number: 9747691
    Abstract: The technology disclosed relates to tracking movement of a real world object in three-dimensional (3D) space. In particular, it relates to mapping, to image planes of a camera, projections of observation points on a curved volumetric model of the real world object. The projections are used to calculate a retraction of the observation points at different times during which the real world object has moved. The retraction is then used to determine translational and rotational movement of the real world object between the different times.
    Type: Grant
    Filed: July 2, 2015
    Date of Patent: August 29, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: David S. Holz, W. Dale Hall
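    Sketch: The patent's projection/retraction computation is not reproduced here; as a generic stand-in for the end result it describes (recovering translational and rotational movement of an object between two times from corresponding 3D observation points), this is a standard Kabsch-style estimate.
```python
import numpy as np

def rigid_motion(points_t0: np.ndarray, points_t1: np.ndarray):
    """points_t0, points_t1: (N, 3) corresponding observation points at two times.
    Returns (R, t) such that points_t1 is approximately R @ points_t0 + t."""
    c0, c1 = points_t0.mean(axis=0), points_t1.mean(axis=0)
    H = (points_t0 - c0).T @ (points_t1 - c1)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against an improper reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c1 - R @ c0
    return R, t
```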
  • Patent number: 9747696
    Abstract: Systems and methods are disclosed for detecting user gestures using detection zones to save computational time and cost and/or to provide normalized position-based parameters, such as position coordinates or movement vectors. The detection zones may be established explicitly by a user or a computer application, or may instead be determined from the user's pattern of gestural activity. The detection zones may have three-dimensional (3D) boundaries or may be two-dimensional (2D) frames. The size and location of the detection zone may be adjusted based on the distance and direction between the user and the motion-capture system.
    Type: Grant
    Filed: May 19, 2014
    Date of Patent: August 29, 2017
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
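    Sketch: A minimal illustration of a detection zone with normalized position output, assuming an axis-aligned 3D box; the bounds and class name are hypothetical.
```python
import numpy as np

class DetectionZone:
    def __init__(self, lo, hi):
        self.lo = np.asarray(lo, dtype=float)   # lower (x, y, z) corner
        self.hi = np.asarray(hi, dtype=float)   # upper (x, y, z) corner

    def contains(self, p) -> bool:
        p = np.asarray(p, dtype=float)
        return bool(np.all(p >= self.lo) and np.all(p <= self.hi))

    def normalize(self, p) -> np.ndarray:
        """Map a position inside the zone to normalized [0, 1]^3 coordinates."""
        return (np.asarray(p, dtype=float) - self.lo) / (self.hi - self.lo)

# Gestures outside the zone are skipped, saving analysis time; positions inside
# are reported relative to the zone rather than to the whole sensor range.
zone = DetectionZone(lo=(-0.2, 0.1, -0.2), hi=(0.2, 0.5, 0.2))
if zone.contains((0.0, 0.3, 0.0)):
    print(zone.normalize((0.0, 0.3, 0.0)))      # -> [0.5 0.5 0.5]
```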
  • Publication number: 20170242492
    Abstract: The technology disclosed relates to automatically (e.g., programmatically) initializing predictive information for tracking a complex control object (e.g., hand, hand and tool combination, robot end effector) based upon information about characteristics of the object determined from sets of collected observed information. Automated initialization techniques obviate the need for special and often bizarre start-up rituals (place your hands on the screen at the places indicated during a full moon, and so forth) required by conventional techniques. In implementations, systems can refine initial predictive information to reflect an observed condition based on comparison of the observed with an analysis of sets of collected observed information.
    Type: Application
    Filed: May 5, 2017
    Publication date: August 24, 2017
    Applicant: Leap Motion, Inc.
    Inventor: Kevin A. HOROWITZ
  • Patent number: 9741169
    Abstract: The technology disclosed can provide capabilities to view and/or interact with the real world to the user of a wearable (or portable) device using a sensor configured to capture motion and/or determine the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved user experience, greater safety, and greater functionality to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted devices (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Grant
    Filed: May 20, 2015
    Date of Patent: August 22, 2017
    Assignee: Leap Motion, Inc.
    Inventor: David S. Holz
  • Patent number: 9740242
    Abstract: A motion control assembly includes a motion control device electrically connected to a battery pack and to a mobile computing device for at least data transmission therebetween. The motion control device can generate inputs, such as inputs corresponding to an attribute of a sensed object, for transmission to the mobile computing device. The drain on a battery of a battery-powered mobile computing device can be reduced when used with a motion control device as follows. A motion control assembly, comprising a motion control device and a battery pack, capable of powering the motion control device, as an integral, one-piece unit, is selected. The motion control device is connected to an electrical connector of a battery-powered mobile computing device. The motion control device is supplied with power from the battery pack during use so the motion control device can be operated using the power from the battery pack.
    Type: Grant
    Filed: January 5, 2015
    Date of Patent: August 22, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: Robert Samuel Gordon, Paul Alan Durdik
  • Patent number: 9740296
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Grant
    Filed: December 16, 2014
    Date of Patent: August 22, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: Isaac Cohen, Maxwell Sills
  • Patent number: 9741136
    Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera based on edge points thereof.
    Type: Grant
    Filed: December 21, 2016
    Date of Patent: August 22, 2017
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
  • Publication number: 20170235377
    Abstract: The technology disclosed relates to a method of realistic rotation of a virtual object for an interaction between a control object in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and, in response to detecting a two-sub-component free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting rotation of the virtual object by the 3D solid control object model.
    Type: Application
    Filed: May 4, 2017
    Publication date: August 17, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Alex MARCOLINA, David S. HOLZ, Paul A. DURDIK
  • Publication number: 20170236293
    Abstract: Enhanced contrast between an object of interest and background surfaces visible in an image is provided using controlled lighting directed at the object. Exploiting the falloff of light intensity with distance, a light source (or multiple light sources), such as an infrared light source, can be positioned near one or more cameras to shine light onto the object while the camera(s) capture images. The captured images can be analyzed to distinguish object pixels from background pixels.
    Type: Application
    Filed: May 3, 2017
    Publication date: August 17, 2017
    Applicant: Leap Motion, Inc.
    Inventors: David S. Holz, Hua Yang
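    Sketch: An illustrative (not patent-faithful) Python snippet of the core idea above: because illumination from a source near the camera falls off roughly with the square of distance, nearby object pixels are markedly brighter than the background and can be separated with a simple intensity threshold; the threshold value is an assumption.
```python
import numpy as np

def segment_object(image: np.ndarray, threshold: float = 0.35) -> np.ndarray:
    """image: 2D array of intensities in [0, 1] captured under controlled
    near-camera (e.g., infrared) lighting. Returns True for likely object pixels."""
    return image >= threshold

# Intuition for the threshold: a background surface twice as far from the
# light source receives roughly a quarter of the illumination (1/r^2 falloff).
```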
  • Patent number: 9733715
    Abstract: The technology disclosed relates to operating a motion-capture system responsive to available computational resources. In particular, it relates to assessing a level of image acquisition and image-analysis resources available using benchmarking of system components. In response, one or more image acquisition parameters and/or image-analysis parameters are adjusted. Acquisition and/or analysis of image data are then made compliant with the adjusted image acquisition parameters and/or image-analysis parameters. In some implementations, image acquisition parameters include frame resolution and frame capture rate and image-analysis parameters include analysis algorithm and analysis density.
    Type: Grant
    Filed: March 14, 2014
    Date of Patent: August 15, 2017
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
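    Sketch: A hypothetical illustration of adjusting acquisition and analysis parameters to available resources; the benchmark scale, tiers, frame rates, and resolutions are invented for the example.
```python
def select_capture_parameters(benchmark_score: float) -> dict:
    """benchmark_score in [0, 1]; higher means more acquisition/analysis headroom."""
    if benchmark_score > 0.75:
        return {"fps": 120, "resolution": (640, 480), "analysis_density": "full"}
    if benchmark_score > 0.40:
        return {"fps": 60, "resolution": (640, 480), "analysis_density": "half"}
    return {"fps": 30, "resolution": (320, 240), "analysis_density": "sparse"}
```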
  • Publication number: 20170223260
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Application
    Filed: April 20, 2017
    Publication date: August 3, 2017
    Applicant: Leap Motion, Inc.
    Inventors: David S. HOLZ, Paul DURDIK
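    Sketch: A back-of-envelope calculation of how redirecting a camera's optical axis shifts its covered angular band relative to the tangential axis; the field-of-view and deviation angles are illustrative, not the patent's optics.
```python
def covered_band(camera_fov_deg: float, deviation_deg: float):
    """Return (low, high) angles, in degrees from the tangential axis, covered
    after a prism deviates the optical axis by deviation_deg."""
    half = camera_fov_deg / 2.0
    return deviation_deg - half, deviation_deg + half

# Example: a 90-degree FOV camera whose axis is bent 60 degrees toward the
# screen covers roughly 15 to 105 degrees from the tangential axis.
print(covered_band(90.0, 60.0))   # -> (15.0, 105.0)
```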
  • Publication number: 20170223271
    Abstract: The technology disclosed provides systems and methods for reducing the overall power consumption of an optical motion-capture system without compromising the quality of motion capture and tracking. In implementations, this is accomplished by operating the motion-detecting cameras and associated image-processing hardware in a low-power mode (e.g., at a low frame rate or in a standby or sleep mode) unless and until touch gestures of an object such as a tap, sequence of taps, or swiping motions are performed with a surface proximate to the cameras. A contact microphone or other appropriate sensor is used for detecting audio signals or other vibrations generated by contact of the object with the surface.
    Type: Application
    Filed: April 14, 2017
    Publication date: August 3, 2017
    Applicant: Leap Motion, Inc.
    Inventor: David S. HOLZ
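    Sketch: A minimal state-machine illustration of the power-saving scheme above, assuming a contact microphone that reports taps and a camera with switchable frame rates; the camera/sensor interfaces, frame rates, and timeout are all hypothetical.
```python
class PowerManagedCapture:
    LOW_FPS, HIGH_FPS = 10, 120          # illustrative low-power vs. tracking rates

    def __init__(self, camera, contact_sensor):
        self.camera = camera
        self.contact_sensor = contact_sensor
        self.camera.set_frame_rate(self.LOW_FPS)      # start in low-power mode

    def poll(self):
        if self.contact_sensor.tap_detected():
            # A tap or swipe on the proximate surface wakes full-rate tracking.
            self.camera.set_frame_rate(self.HIGH_FPS)
        elif self.camera.idle_seconds() > 5.0:
            self.camera.set_frame_rate(self.LOW_FPS)  # drop back to standby
```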
  • Publication number: 20170220126
    Abstract: The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing within the 3D sensory space, or virtual object density in the gestural interface.
    Type: Application
    Filed: April 19, 2017
    Publication date: August 3, 2017
    Applicant: Leap Motion, Inc.
    Inventor: David S. HOLZ
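    Sketch: A hedged Python illustration of the two ideas above: choosing the dominant gesture by the magnitude of its spatial trajectory, and proportioning on-screen response to physical movement; the magnitude measure and scaling constants are assumptions.
```python
import numpy as np

def dominant_gesture(trajectories: dict) -> str:
    """trajectories: gesture name -> (N, 3) array of sampled 3D positions."""
    def magnitude(path: np.ndarray) -> float:
        return float(np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1)))
    return max(trajectories, key=lambda name: magnitude(trajectories[name]))

def on_screen_delta(physical_delta_m: float, user_distance_m: float,
                    base_scale_px_per_m: float = 3000.0) -> float:
    """Scale responsiveness so users farther from the sensor need not gesture larger."""
    return physical_delta_m * base_scale_px_per_m * max(user_distance_m, 0.3)
```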