Patents Assigned to Leap Motion, Inc.
  • Publication number: 20190146660
    Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
    Type: Application
    Filed: December 20, 2018
    Publication date: May 16, 2019
    Applicant: Leap Motion, Inc.
    Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Hua YANG, Gabriel A. HARE, Maxwell SILLS
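The dynamic selection of a manipulation point described in the abstract above can be sketched as follows. This is a minimal illustrative reading, not the patented method: the function names, the spherical virtual object, and the nearest-point heuristic are all assumptions.

```python
import numpy as np

def select_manipulation_point(calc_points, object_center, object_radius):
    """Pick a manipulation point proximate to a virtual sphere.

    calc_points: (N, 3) array of hand calculation points (e.g. fingertips).
    Returns the point on the object's surface nearest the closest
    calculation point. This distance-based heuristic is illustrative,
    not the method claimed in the patent.
    """
    calc_points = np.asarray(calc_points, dtype=float)
    center = np.asarray(object_center, dtype=float)
    # Distance of each calculation point from the object's center.
    dists = np.linalg.norm(calc_points - center, axis=1)
    nearest = calc_points[np.argmin(dists)]
    # Project the nearest calculation point onto the sphere's surface.
    direction = (nearest - center) / np.linalg.norm(nearest - center)
    return center + object_radius * direction
```

A fingertip at (2, 0, 0) approaching a unit sphere at the origin would yield a manipulation point at (1, 0, 0), the closest surface point.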
  • Patent number: 10281553
    Abstract: A method and system determine object orientation using a light source to create a shadow line extending from the light source. A camera captures an image including the shadow line on an object surface. An orientation module determines the surface orientation from the shadow line. In some examples, a transparency imperfection in a window through which a camera receives light can be detected and a message sent to a user as to the presence of a light-blocking or light-distorting substance or particle. A system can control illumination while imaging an object in space using a light source mounted to a support structure so that a camera captures an image of the illuminated object. Direct illumination of the camera by light from the light source can be prevented, such as by blocking the light or by using a light-transmissive window adjacent the camera to reject light transmitted directly from the light source.
    Type: Grant
    Filed: December 2, 2013
    Date of Patent: May 7, 2019
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
  • Patent number: 10281992
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Grant
    Filed: January 3, 2018
    Date of Patent: May 7, 2019
    Assignee: Leap Motion, Inc.
    Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
  • Patent number: 10281987
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with, a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
    Type: Grant
    Filed: September 3, 2014
    Date of Patent: May 7, 2019
    Assignee: Leap Motion, Inc.
    Inventors: Hua Yang, Leonid Kontsevich, James Donald, David S Holz, Jonathan Marsden, Paul Durdik
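The virtual-touch-plane mode switching above reduces to tracking a signed distance from the control object to a plane defined in space. A minimal sketch, assuming a fingertip position, a plane given by a point and normal, and illustrative mode names and thresholds:

```python
import numpy as np

def control_mode(tip_position, plane_point, plane_normal, hover_band=0.02):
    """Classify a control object against a virtual touch plane.

    Returns 'free', 'hover', or 'touch' based on the signed distance of
    the fingertip to the plane. Mode names and the 2 cm hover band are
    illustrative assumptions, not taken from the patent.
    """
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    tip = np.asarray(tip_position, dtype=float)
    # Signed distance: positive on the user's side of the plane.
    d = float(np.dot(tip - np.asarray(plane_point, dtype=float), n))
    if d <= 0.0:
        return "touch"   # tip has contacted or pierced the plane
    if d <= hover_band:
        return "hover"   # near the plane: e.g. cursor-positioning mode
    return "free"        # away from the plane: no engagement
```

Crossing the plane flips the interface from a hover/pointing mode into an engaged (touch) mode, mirroring how a physical touchscreen switches on contact.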
  • Patent number: 10275039
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Grant
    Filed: August 18, 2017
    Date of Patent: April 30, 2019
    Assignee: Leap Motion, Inc.
    Inventors: Isaac Cohen, Maxwell Sills
  • Publication number: 20190113980
    Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three dimensional sensor space by sensing a movement of the control object in the three dimensional sensor space, sensing orientation of the control object, defining a control plane tangential to a surface of the control object and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to the control plane.
    Type: Application
    Filed: December 7, 2018
    Publication date: April 18, 2019
    Applicant: Leap Motion, Inc.
    Inventors: Isaac COHEN, David S. HOLZ, Maxwell SILLS
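The normal-versus-parallel test in the abstract above is a standard vector decomposition: split the movement into components along and across the control plane's normal, then compare magnitudes. A sketch, with illustrative labels:

```python
import numpy as np

def classify_gesture(movement, plane_normal):
    """Label a movement as more 'normal' to a control plane (e.g. a push
    or tap) or more 'parallel' to it (e.g. a swipe).

    The decomposition into normal and tangential components is standard
    vector math; the two labels are illustrative assumptions.
    """
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    v = np.asarray(movement, dtype=float)
    normal_part = np.dot(v, n) * n      # component along the plane normal
    parallel_part = v - normal_part     # component within the plane
    if np.linalg.norm(normal_part) >= np.linalg.norm(parallel_part):
        return "normal"
    return "parallel"
```

With the control plane tangential to the palm, a push toward the plane classifies as "normal" while a lateral sweep classifies as "parallel".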
  • Patent number: 10261594
    Abstract: The technology disclosed relates to a method of realistic displacement of a virtual object for an interaction between a control object in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and, in response to detecting a 2D sub-component free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting rotation of the virtual object by the 3D solid control object model.
    Type: Grant
    Filed: May 4, 2017
    Date of Patent: April 16, 2019
    Assignee: Leap Motion, Inc.
    Inventors: Alex Marcolina, David S. Holz
  • Publication number: 20190108676
    Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera based on edge points thereof.
    Type: Application
    Filed: December 7, 2018
    Publication date: April 11, 2019
    Applicant: Leap Motion, Inc.
    Inventor: David Holz
  • Patent number: 10254849
    Abstract: Methods and systems for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of the user to change modes of operation; and, upon determining such an intent, subsequently interpreting user gestures in accordance with a second mode of operation.
    Type: Grant
    Filed: March 22, 2018
    Date of Patent: April 9, 2019
    Assignee: Leap Motion, Inc.
    Inventor: David S. Holz
  • Publication number: 20190094909
    Abstract: A motion control assembly includes a motion control device electrically connected to a battery pack and to a mobile computing device for at least data transmission therebetween. The motion control device can generate inputs, such as inputs corresponding to an attribute of a sensed object, for transmission to the mobile computing device. The drain on a battery of a battery-powered mobile computing device can be reduced when used with a motion control device as follows. A motion control assembly, comprising a motion control device and a battery pack, capable of powering the motion control device, as an integral, one-piece unit, is selected. The motion control device is connected to an electrical connector of a battery-powered mobile computing device. The motion control device is supplied with power from the battery pack during use so the motion control device can be operated using the power from the battery pack.
    Type: Application
    Filed: October 5, 2018
    Publication date: March 28, 2019
    Applicant: Leap Motion, Inc.
    Inventors: Robert Samuel GORDON, Paul Alan Durdik
  • Patent number: 10241639
    Abstract: The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing within the 3D sensory space, or virtual object density in the gestural interface.
    Type: Grant
    Filed: January 15, 2014
    Date of Patent: March 26, 2019
    Assignee: LEAP MOTION, INC.
    Inventor: David Holz
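The two ideas in the abstract above can be sketched together: pick the dominant gesture as the trajectory with the largest spatial magnitude, and proportion on-screen response to the user's spacing from the sensor. Total path length as the magnitude measure and the linear distance scaling are both illustrative assumptions:

```python
import numpy as np

def dominant_gesture(trajectories):
    """Return the trajectory with the largest spatial magnitude.

    Each trajectory is a sequence of 3D points; magnitude is taken here
    as total path length, a simple reading of the abstract (the actual
    magnitude measure may differ).
    """
    def path_length(traj):
        pts = np.asarray(traj, dtype=float)
        return float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())
    return max(trajectories, key=path_length)

def screen_response(movement_mm, user_distance_mm, ref_distance_mm=300.0):
    """Scale on-screen response so distant users need not exaggerate
    their gestures. The linear scaling and the 300 mm reference distance
    are illustrative assumptions."""
    return movement_mm * (user_distance_mm / ref_distance_mm)
```

For example, a 5 cm sweep dominates a 1 cm flick of the other hand, and the same 10 mm gesture produces twice the on-screen travel at 600 mm as at the 300 mm reference distance.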
  • Publication number: 20190079594
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: November 13, 2018
    Publication date: March 14, 2019
    Applicant: Leap Motion, Inc.
    Inventors: Isaac COHEN, Maxwell SILLS
  • Patent number: 10229339
    Abstract: The technology disclosed relates to identifying an object in a field of view of a camera. In particular, it relates to identifying a display in the field of view of the camera. This is achieved by monitoring a space, including acquiring a series of image frames of the space using the camera and detecting one or more light sources in the series of image frames. Further, one or more frequencies of periodic intensity or brightness variations, also referred to as ‘refresh rate’, of light emitted from the light sources are measured. Based on the one or more frequencies of periodic intensity variations of light emitted from the light sources, at least one display that includes the light sources is identified.
    Type: Grant
    Filed: March 13, 2017
    Date of Patent: March 12, 2019
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
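The refresh-rate measurement above can be sketched as a frequency analysis of a light source's per-frame brightness. This FFT-peak approach is an illustrative assumption (the patent does not specify the estimator), and it requires the camera to sample faster than twice the flicker frequency:

```python
import numpy as np

def dominant_flicker_hz(brightness, frame_rate_hz):
    """Estimate the dominant periodic intensity (flicker) frequency of a
    light source from per-frame brightness samples via an FFT peak.

    Valid only when frame_rate_hz exceeds twice the flicker frequency;
    otherwise the flicker aliases to a lower apparent frequency.
    """
    samples = np.asarray(brightness, dtype=float)
    samples = samples - samples.mean()           # remove the DC component
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / frame_rate_hz)
    return float(freqs[np.argmax(spectrum)])

def looks_like_display(brightness, frame_rate_hz, rates=(50.0, 60.0), tol=2.0):
    """Flag a light source as a display if its flicker sits near a common
    refresh rate. The candidate rates and tolerance are illustrative."""
    f = dominant_flicker_hz(brightness, frame_rate_hz)
    return any(abs(f - r) <= tol for r in rates)
```

A light source whose brightness oscillates at 60 Hz, sampled at 240 frames per second, would be flagged as a display; steady room lighting would not.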
  • Publication number: 20190073112
    Abstract: The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing within the 3D sensory space, or virtual object density in the gestural interface.
    Type: Application
    Filed: August 3, 2018
    Publication date: March 7, 2019
    Applicant: Leap Motion, Inc.
    Inventor: David S. HOLZ
  • Publication number: 20190073829
    Abstract: The technology disclosed can provide improved safety by detecting potential unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determining the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Application
    Filed: August 3, 2018
    Publication date: March 7, 2019
    Applicant: Leap Motion, Inc.
    Inventors: David S. HOLZ, Robert Samuel GORDON, Matias PEREZ
  • Patent number: 10222871
    Abstract: The technology disclosed relates to operating a motion-capture system responsive to available computational resources. In particular, it relates to assessing a level of image acquisition and image-analysis resources available using benchmarking of system components. In response, one or more image acquisition parameters and/or image-analysis parameters are adjusted. Acquisition and/or analysis of image data are then made compliant with the adjusted image acquisition parameters and/or image-analysis parameters. In some implementations, image acquisition parameters include frame resolution and frame capture rate and image-analysis parameters include analysis algorithm and analysis density.
    Type: Grant
    Filed: July 3, 2017
    Date of Patent: March 5, 2019
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
  • Publication number: 20190064918
    Abstract: The technology disclosed can provide capabilities such as using motion sensors and/or other types of sensors coupled to a motion-capture system to monitor motions within a real environment. A virtual object can be projected to a user of a portable device integrated into an augmented rendering of a real environment about the user. Motion information of a user body portion is determined based at least in part upon sensory information received from imaging or acoustic sensory devices. Control information is communicated to a system based in part on a combination of the motion of the portable device and the detected motion of the user. The virtual device experience can be augmented in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Application
    Filed: June 22, 2018
    Publication date: February 28, 2019
    Applicant: Leap Motion, Inc.
    Inventor: David S. HOLZ
  • Patent number: 10218895
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Grant
    Filed: April 20, 2017
    Date of Patent: February 26, 2019
    Assignee: Leap Motion, Inc.
    Inventors: David S. Holz, Paul Durdik
  • Publication number: 20190056791
    Abstract: The technology disclosed relates to tracking motion of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. In particular, it relates to capturing gross features and feature values of a real world space using RGB pixels and capturing fine features and feature values of the real world space using IR pixels. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. It also relates to capturing different sceneries of a shared real world space from the perspective of multiple users. It further relates to sharing content between wearable sensor systems, and to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video stream to a second user of the wearable sensor system.
    Type: Application
    Filed: June 22, 2018
    Publication date: February 21, 2019
    Applicant: Leap Motion, Inc.
    Inventors: David S. HOLZ, Matias PEREZ, Davide ONOFRIO
  • Publication number: 20190058868
    Abstract: A single image sensor includes an array of uniformly and continuously spaced light-sensing pixels, used in conjunction with a plurality of lenses that focus light reflected from an object onto a plurality of different pixel regions of the image sensor, each lens focusing light on a different one of the pixel regions. This enables a controller, including a processor and an object detection module, coupled to the single image sensor to analyze the pixel regions, generate a three-dimensional (3D) image of the object from a plurality of images obtained with the image sensor, generate a depth map that calculates depth values for pixels of at least the object, detect 3D motion of the object using the depth values, create a 3D model of the object based on the 3D image, and track 3D motion of the object based on the 3D model.
    Type: Application
    Filed: May 11, 2018
    Publication date: February 21, 2019
    Applicant: Leap Motion, Inc.
    Inventor: David HOLZ
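The depth map in the abstract above rests on the standard stereo relation: with two pixel regions acting as views separated by a baseline, depth is focal length times baseline over disparity. A minimal sketch; the parameter names are illustrative and the patent's multi-lens geometry may combine more than two views:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Standard pinhole-stereo depth relation: depth = f * B / d.

    focal_px:     focal length expressed in pixels
    baseline_mm:  separation between the two lens centers
    disparity_px: pixel offset of the same object feature between the
                  two pixel regions of the single image sensor
    Returns depth in millimeters. Illustrative, not the patented method.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px
```

For example, a 700 px focal length, a 40 mm lens separation, and a 7 px disparity place the feature 4000 mm (4 m) from the sensor; smaller disparities correspond to greater depths.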