Patents Assigned to Leap Motion, Inc.
  • Publication number: 20180253150
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: January 3, 2018
    Publication date: September 6, 2018
    Applicant: Leap Motion, Inc.
    Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
  • Patent number: 10043320
    Abstract: The technology disclosed can provide improved safety by detecting potential unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determining the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Grant
    Filed: August 31, 2017
    Date of Patent: August 7, 2018
    Assignee: Leap Motion, Inc.
    Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
  • Patent number: 10042510
    Abstract: The technology disclosed relates to distinguishing meaningful gestures from proximate non-meaningful gestures in a three-dimensional (3D) sensory space. In particular, it relates to calculating spatial trajectories of different gestures and determining a dominant gesture based on magnitudes of the spatial trajectories. The technology disclosed also relates to uniformly responding to gestural inputs from a user irrespective of a position of the user. In particular, it relates to automatically adapting a responsiveness scale between gestures in a physical space and resulting responses in a gestural interface by automatically proportioning on-screen responsiveness to scaled movement distances of gestures in the physical space, user spacing with the 3D sensory space, or virtual object density in the gestural interface.
    Type: Grant
    Filed: January 15, 2014
    Date of Patent: August 7, 2018
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
  • Patent number: 10042430
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
    Type: Grant
    Filed: November 21, 2016
    Date of Patent: August 7, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
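The mode-switching idea in patent 10042430 (engaging a control by penetrating a virtual surface whose position follows the hand) can be illustrated with a minimal sketch. The class, parameter names, and update rule below are illustrative assumptions, not the patented implementation:

```python
class VirtualPlane:
    """Sketch of a virtual control construct: the control object 'engages'
    by penetrating a plane that drifts to stay in front of it."""

    def __init__(self, z=0.0, follow=0.1, offset=0.05):
        self.z = z              # current plane position along the depth axis
        self.follow = follow    # how quickly the plane drifts toward the hand
        self.offset = offset    # resting distance kept in front of the hand

    def update(self, hand_z):
        """Return True when the hand has penetrated the plane (engaged mode).
        While disengaged, the plane position is updated to track the hand."""
        engaged = hand_z <= self.z
        if not engaged:
            target = hand_z - self.offset
            self.z += self.follow * (target - self.z)
        return engaged
```

A hand hovering in front of the plane leaves it disengaged while the plane creeps toward the hand; pushing through the plane toggles the engaged (e.g., "click") mode.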
  • Patent number: 10037474
    Abstract: The technology disclosed relates to coordinating motion-capture of a hand by a network of motion-capture sensors having overlapping fields of view. In particular, it relates to designating a first sensor among three or more motion-capture sensors as having a master frame of reference, observing motion of a hand as it passes through overlapping fields of view of the respective motion-capture sensors, synchronizing capture of images of the hand within the overlapping fields of view by pairs of the motion-capture devices, and using the pairs of the hand images captured by the synchronized motion-capture devices to automatically calibrate the motion-capture sensors to the master frame of reference.
    Type: Grant
    Filed: March 15, 2014
    Date of Patent: July 31, 2018
    Assignee: Leap Motion, Inc.
    Inventor: David S. Holz
  • Patent number: 10007350
    Abstract: A technology for tracking motion of a wearable sensor system uses a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. In particular, it relates to capturing gross features and feature values of a real world space using RGB pixels and capturing fine features and feature values of the real world space using IR pixels. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. It also relates to capturing different sceneries of a shared real world space from the perspective of multiple users. It further relates to sharing content between wearable sensor systems, and to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Grant
    Filed: June 25, 2015
    Date of Patent: June 26, 2018
    Assignee: Leap Motion, Inc.
    Inventors: David S. Holz, Matias Perez, Davide Onofrio
  • Patent number: 10007329
    Abstract: The technology disclosed can provide capabilities such as using motion sensors and/or other types of sensors coupled to a motion-capture system to monitor motions within a real environment. A virtual object can be projected to a user of a portable device integrated into an augmented rendering of a real environment about the user. Motion information of a user body portion is determined based at least in part upon sensory information received from imaging or acoustic sensory devices. Control information is communicated to a system based in part on a combination of the motion of the portable device and the detected motion of the user. The virtual device experience can be augmented in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Grant
    Filed: February 11, 2015
    Date of Patent: June 26, 2018
    Assignee: Leap Motion, Inc.
    Inventor: David S. Holz
  • Patent number: 9996638
    Abstract: The technology disclosed relates to simplifying updating of a predictive model by clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
    Type: Grant
    Filed: October 31, 2014
    Date of Patent: June 12, 2018
    Assignee: Leap Motion, Inc.
    Inventors: David S Holz, Raffi Bedikian, Kevin Horowitz, Hua Yang
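The first step of patent 9996638, grouping observed points by their surface normal directions, can be sketched as follows. This toy version quantizes normals into coarse buckets and ignores the adjacency criterion the patent also uses; all names and bucket sizes are illustrative assumptions:

```python
from collections import defaultdict

def cluster_by_normal(points, normals, bins=8):
    """Group observed 3D points by the coarse direction of their surface
    normals. Each normal component in [-1, 1] is quantized into buckets,
    and points sharing a bucket key form one cluster."""
    def key(n):
        # Quantize each normal component into one of `bins` coarse buckets.
        return tuple(int((c + 1.0) * bins / 2) for c in n)

    clusters = defaultdict(list)
    for p, n in zip(points, normals):
        clusters[key(n)].append(p)
    return dict(clusters)
```

Points on roughly the same face of a hand segment (similar normals) end up in one cluster, which can then be matched against a segment of the predictive hand model.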
  • Patent number: 9996797
    Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
    Type: Grant
    Filed: October 31, 2014
    Date of Patent: June 12, 2018
    Assignee: Leap Motion, Inc.
    Inventors: David S Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
  • Patent number: 9983686
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Grant
    Filed: October 9, 2017
    Date of Patent: May 29, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
  • Patent number: 9986153
    Abstract: The technology disclosed relates to adjusting the monitored field of view of a camera and/or a view of a virtual scene from a point of view of a virtual camera based on the distance between tracked objects. For example, if the user's hand is being tracked for gestures, the closer the hand gets to another object, the tighter the frame can become; i.e., the more the camera can zoom in so that the hand and the other object occupy most of the frame. The camera can also be reoriented so that the hand and the other object remain in the center of the field of view. The distance between two objects in a camera's field of view can be determined and a parameter of a motion-capture system adjusted based thereon. In particular, the pan and/or zoom levels of the camera may be adjusted in accordance with the distance.
    Type: Grant
    Filed: September 8, 2017
    Date of Patent: May 29, 2018
    Assignee: Leap Motion, Inc.
    Inventor: David S. Holz
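The distance-to-zoom mapping described in patent 9986153 can be sketched with a simple linear interpolation: the closer two tracked objects are, the higher the zoom. The function, range limits, and zoom levels below are illustrative assumptions, not values from the patent:

```python
def zoom_for_distance(d, d_near=0.1, d_far=1.0, zoom_max=4.0, zoom_min=1.0):
    """Map the distance between two tracked objects to a camera zoom level.
    Objects closer together produce a tighter frame (higher zoom)."""
    # Clamp the distance to the working range.
    d = max(d_near, min(d_far, d))
    # Linear interpolation: d_near -> zoom_max, d_far -> zoom_min.
    t = (d - d_near) / (d_far - d_near)
    return zoom_max + t * (zoom_min - zoom_max)
```

A motion-capture loop would recompute this each frame and feed the result to the camera's (real or virtual) zoom control, optionally smoothing it to avoid jitter.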
  • Patent number: 9973741
    Abstract: A single image sensor includes an array of uniformly and continuously spaced light-sensing pixels, used in conjunction with a plurality of lenses that each focus light reflected from an object onto a different pixel region of the sensor. This arrangement enables a controller, including a processor and an object detection module coupled to the single image sensor, to analyze the pixel regions to generate a three-dimensional (3D) image of the object from a plurality of images obtained with the image sensor, generate a depth map that calculates depth values for pixels of at least the object, detect 3D motion of the object using the depth values, create a 3D model of the object based on the 3D image, and track 3D motion of the object based on the 3D model.
    Type: Grant
    Filed: June 27, 2016
    Date of Patent: May 15, 2018
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
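Depth recovery from multiple lens sub-images on one sensor, as in patent 9973741, ultimately rests on standard triangulation: an object's apparent shift (disparity) between two pixel regions determines its depth. The function and parameter names below are an illustrative sketch, not the patented method:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate depth from the pixel disparity of an object between
    two lens sub-images: depth = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Repeating this per pixel over matched features yields the depth map the abstract describes, from which 3D motion can be detected and tracked.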
  • Publication number: 20180130228
    Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera based on edge points thereof.
    Type: Application
    Filed: January 4, 2018
    Publication date: May 10, 2018
    Applicant: Leap Motion, Inc.
    Inventor: David Holz
  • Patent number: 9945660
    Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera based on reflections therefrom or shadows cast thereby.
    Type: Grant
    Filed: May 27, 2015
    Date of Patent: April 17, 2018
    Assignee: Leap Motion, Inc.
    Inventor: David S. Holz
  • Patent number: 9934580
    Abstract: Enhanced contrast between an object of interest and background surfaces visible in an image is provided using controlled lighting directed at the object. Exploiting the falloff of light intensity with distance, a light source (or multiple light sources), such as an infrared light source, can be positioned near one or more cameras to shine light onto the object while the camera(s) capture images. The captured images can be analyzed to distinguish object pixels from background pixels.
    Type: Grant
    Filed: May 3, 2017
    Date of Patent: April 3, 2018
    Assignee: Leap Motion, Inc.
    Inventors: David S. Holz, Hua Yang
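The contrast-enhancement idea in patent 9934580 exploits the inverse-square falloff of light: an object near a camera-mounted light source appears far brighter than the background, so a brightness threshold separates object pixels from background pixels. A minimal sketch, with an assumed grayscale image as nested lists and an illustrative threshold:

```python
def segment_by_brightness(image, threshold):
    """Label pixels as object (True) or background (False) by intensity.
    Because intensity falls off roughly as 1/r^2, a nearby object lit by
    a source next to the camera is much brighter than the background."""
    return [[px >= threshold for px in row] for row in image]
```

Real systems would pick the threshold from the expected falloff between the object's distance and the background's, or adapt it per frame.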
  • Patent number: 9934609
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured from image information or equivalents. Predictive information can be improved by applying techniques for correlating with information from observations.
    Type: Grant
    Filed: July 31, 2017
    Date of Patent: April 3, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Kevin A. Horowitz, David S. Holz
  • Patent number: 9927880
    Abstract: Methods and systems are disclosed for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining an intent of the user to change modes of operation, subsequently interpreting user gestures in accordance with the second mode of operation.
    Type: Grant
    Filed: January 20, 2017
    Date of Patent: March 27, 2018
    Assignee: Leap Motion, Inc.
    Inventor: David S. Holz
  • Patent number: 9927522
    Abstract: The technology disclosed relates to determining positional information of an object in a field of view. In particular, it relates to calculating a distance of the object from a reference such as a sensor including scanning the field of view by selectively illuminating directionally oriented light sources and measuring one or more differences in property of returning light emitted from the light sources and reflected from the object. The property can be intensity or phase difference of the light. It also relates to finding an object in a region of space. In particular, it relates to scanning the region of space with directionally controllable illumination, determining a difference in a property of the illumination received for two or more points in the scanning, and determining positional information of the object based in part upon the points in the scanning corresponding to the difference in the property.
    Type: Grant
    Filed: June 16, 2017
    Date of Patent: March 27, 2018
    Assignee: Leap Motion, Inc.
    Inventor: David Holz
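One way to read the distance measurement in patent 9927522 (using the intensity of returning light) is through inverse-square falloff calibrated against a reference: if light returning from a known distance has a known intensity, a weaker return implies a proportionally farther object. The function below is an illustrative sketch under that assumption, not the patented procedure:

```python
import math

def distance_from_intensity(i_returned, i_ref, d_ref=1.0):
    """Estimate distance from returning-light intensity, assuming
    inverse-square falloff calibrated at reference distance d_ref:
    i ~ 1/d^2  =>  d = d_ref * sqrt(i_ref / i_returned)."""
    if i_returned <= 0:
        raise ValueError("returned intensity must be positive")
    return d_ref * math.sqrt(i_ref / i_returned)
```

The patent also describes using phase differences rather than intensity; the same scanning framework applies, with phase shift standing in for the intensity ratio.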
  • Patent number: 9916009
    Abstract: Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape, or motion of the portion of the hand or other detectable object.
    Type: Grant
    Filed: April 25, 2014
    Date of Patent: March 13, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Michael Zagorsek, Avinash Dabir, Paul Durdik, Keith Mertens
  • Patent number: 9911240
    Abstract: The technology disclosed relates to a method of interacting with a virtual object. In particular, it relates to referencing a virtual object in an augmented reality space, identifying a physical location of a device in at least one image of the augmented reality space, generating for display a control coincident with a surface of the device, sensing interactions between at least one control object and the control coincident with the surface of the device, and generating data signaling manipulations of the control coincident with the surface of the device.
    Type: Grant
    Filed: August 16, 2017
    Date of Patent: March 6, 2018
    Assignee: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Hongyuan (Jimmy) He, David S. Holz