Patents by Inventor Maxwell Sills

Maxwell Sills has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180039334
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: August 18, 2017
    Publication date: February 8, 2018
    Applicant: Leap Motion, Inc.
    Inventors: Isaac COHEN, Maxwell SILLS
  • Publication number: 20170270356
    Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Application
    Filed: June 2, 2017
    Publication date: September 21, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Maxwell SILLS, Aaron SMITH, David S. HOLZ, Hongyuan (Jimmy) He
  • Patent number: 9740296
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Grant
    Filed: December 16, 2014
    Date of Patent: August 22, 2017
    Assignee: LEAP MOTION, INC.
    Inventors: Isaac Cohen, Maxwell Sills
  • Patent number: 9679197
    Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Grant
    Filed: March 13, 2015
    Date of Patent: June 13, 2017
    Assignee: Leap Motion, Inc.
    Inventors: Maxwell Sills, Aaron Smith, David S. Holz, Hongyuan (Jimmy) He
  • Publication number: 20170017306
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: September 28, 2016
    Publication date: January 19, 2017
    Applicant: Leap Motion, Inc.
    Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David Samuel HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel A. HARE, Ryan Christopher JULIAN
  • Patent number: 9459697
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Grant
    Filed: January 15, 2014
    Date of Patent: October 4, 2016
    Assignee: Leap Motion, Inc.
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Julian
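The scale-ratio idea in the abstract above (displayed movement derived from the ratio between the sensed gesture scale and the display scale) can be illustrated with a minimal, hypothetical sketch. The function name and parameters are illustrative, not taken from the patent.

```python
def displayed_movement(gesture_distance_mm: float,
                       gesture_scale_mm: float,
                       display_scale_px: float) -> float:
    """Map a physical gesture distance (mm) to an on-screen distance (px)
    using the ratio between the display scale and the sensed gesture scale.
    All names are illustrative; the patent does not specify this API."""
    if gesture_scale_mm <= 0:
        raise ValueError("gesture scale must be positive")
    return gesture_distance_mm * (display_scale_px / gesture_scale_mm)
```

For example, a 50 mm hand movement within a 100 mm gesture space mapped onto an 800 px display span would produce a 400 px on-screen movement.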
  • Publication number: 20150346701
    Abstract: The technology disclosed relates to selecting among devices in a room to interact with. It also relates to operating a smart phone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
    Type: Application
    Filed: February 19, 2015
    Publication date: December 3, 2015
    Applicant: Leap Motion, Inc.
    Inventors: Robert Samuel Gordon, Maxwell Sills, Paul Alan Durdik
  • Publication number: 20150169076
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: December 16, 2014
    Publication date: June 18, 2015
    Applicant: LEAP MOTION, INC.
    Inventors: Isaac Cohen, Maxwell Sills, Paul Alan Durdik
  • Publication number: 20150169176
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: December 16, 2014
    Publication date: June 18, 2015
    Applicant: Leap Motion, Inc.
    Inventors: Isaac Cohen, Maxwell Sills
  • Publication number: 20150169175
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: December 16, 2014
    Publication date: June 18, 2015
    Applicant: LEAP MOTION, INC.
    Inventors: Isaac Cohen, Maxwell Sills
  • Publication number: 20150119074
    Abstract: The technology disclosed relates to determining positional information about an object of interest. In particular, it includes conducting scanning of a field of interest with an emission from a transmission area according to an ordered scan pattern. The emission can be received to form a signal based upon at least one salient property (e.g., intensity, amplitude, frequency, polarization, phase, or other detectable feature) of the emission varying with time at the object of interest. Synchronization information about the ordered scan pattern can be derived from a source, a second signal broadcast separately, a social media share, others, and/or combinations thereof. A correspondence between at least one characteristic of the signal and the synchronization information can be established. Positional information can be determined based at least in part upon the correspondence.
    Type: Application
    Filed: October 24, 2014
    Publication date: April 30, 2015
    Applicant: LEAP MOTION, INC.
    Inventors: David S. HOLZ, Robert Samuel GORDON, Gabriel A. HARE, Neeloy ROY, Maxwell SILLS, Paul DURDIK
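The positional-determination scheme sketched in the abstract above (correlating the time at which the swept emission peaks at the object with synchronization information about the ordered scan pattern) might look roughly like the following for the simplest case of a linear, periodic scan. This is a speculative sketch; the scan model and every name here are assumptions, not details from the patent.

```python
def position_from_scan(peak_time: float, scan_start: float,
                       scan_period: float, scan_width: float) -> float:
    """Recover a scan-axis coordinate from the time at which the swept
    emission peaked at the object, assuming (hypothetically) a linear
    scan that sweeps scan_width units once per scan_period seconds."""
    # Phase of the scan cycle at which the peak was observed, in [0, 1).
    phase = ((peak_time - scan_start) % scan_period) / scan_period
    return phase * scan_width
```

A peak observed a quarter of the way through a cycle would place the object a quarter of the way across the scanned field.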
  • Publication number: 20150103004
    Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three dimensional sensor space by sensing a movement of the control object in the three dimensional sensor space, sensing orientation of the control object, defining a control plane tangential to a surface of the control object and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to the control plane.
    Type: Application
    Filed: October 16, 2014
    Publication date: April 16, 2015
    Applicant: LEAP MOTION, INC.
    Inventors: Isaac COHEN, David S. HOLZ, Maxwell SILLS
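The classification described in the abstract above (interpreting a gesture by whether the control object's movement is more normal to, or more parallel to, a control plane tangential to the object) reduces to comparing the movement vector against the plane normal. Below is a minimal sketch; the 45-degree threshold and all names are illustrative assumptions.

```python
import math

def classify_gesture(movement, plane_normal):
    """Classify a 3-D movement as more 'normal' to a control plane
    (e.g., a push or tap) or more 'parallel' to it (e.g., a swipe).
    The plane is assumed tangential to the control object's surface;
    names and the threshold are illustrative, not from the patent."""
    dot = sum(m * n for m, n in zip(movement, plane_normal))
    norm_m = math.sqrt(sum(m * m for m in movement))
    norm_n = math.sqrt(sum(n * n for n in plane_normal))
    # |cos| of the angle between the movement and the plane normal.
    along_normal = abs(dot) / (norm_m * norm_n)
    # Split at 45 degrees: closer to the normal than to the plane.
    return "normal" if along_normal > math.cos(math.pi / 4) else "parallel"
```

A push straight along the palm normal classifies as "normal", while a sideways swipe in the plane of the palm classifies as "parallel".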
  • Publication number: 20150029092
    Abstract: The technology disclosed relates to using a curvilinear gestural path of a control object as a gesture-based input command for a motion-sensing system. In particular, the curvilinear gestural path can be broken down into curve segments, and each curve segment can be mapped to a recorded gesture primitive. Further, certain sequences of gesture primitives can be used to identify the original curvilinear gesture.
    Type: Application
    Filed: July 23, 2014
    Publication date: January 29, 2015
    Applicant: LEAP MOTION, INC.
    Inventors: David HOLZ, Matias PEREZ, Neeloy ROY, Maxwell SILLS
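The decomposition described in the abstract above (breaking a curvilinear gestural path into curve segments, each mapped to a gesture primitive) can be sketched for a 2-D polyline by segmenting on turning direction. The primitive labels and thresholds are hypothetical, chosen only to illustrate the idea.

```python
import math

def turning_angles(points):
    """Signed turning angle (radians) at each interior vertex of a
    2-D polyline, positive for counter-clockwise turns."""
    angles = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        # Wrap the heading change into (-pi, pi].
        angles.append((a2 - a1 + math.pi) % (2 * math.pi) - math.pi)
    return angles

def primitives(points, straight_eps=0.05):
    """Map each stretch of the path to a hypothetical gesture primitive:
    'line', 'ccw' (counter-clockwise arc), or 'cw' (clockwise arc).
    Consecutive identical labels collapse into one curve segment."""
    out = []
    for d in turning_angles(points):
        if abs(d) < straight_eps:
            label = "line"
        else:
            label = "ccw" if d > 0 else "cw"
        if not out or out[-1] != label:
            out.append(label)
    return out
```

A sequence of primitives produced this way could then be matched against recorded primitive sequences to identify the original curvilinear gesture, as the abstract suggests.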
  • Publication number: 20140201666
    Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
    Type: Application
    Filed: January 15, 2014
    Publication date: July 17, 2014
    Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Julian