Patents by Inventor Maxwell Sills
Maxwell Sills has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20200004403
Abstract: The technology disclosed relates to using virtual attraction between a hand or other control object in a three-dimensional (3D) sensory space and a virtual object in a virtual space. In particular, it relates to defining a virtual attraction zone of a hand or other control object that is tracked in a three-dimensional (3D) sensory space, and generating one or more interaction forces between the control object and a virtual object in a virtual space that cause motion of the virtual object responsive to proximity of the control object to the virtual object, escalating with a virtual pinch or grasp action of the control object directed to a manipulation point of the virtual object.
Type: Application
Filed: September 11, 2019
Publication date: January 2, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Maxwell Sills, Gabriel Hare
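The attraction-zone idea above (a force that grows as the control object nears the virtual object and escalates on a pinch or grasp) can be sketched roughly as follows. This is an illustrative reading of the abstract, not the patented implementation; the function name, zone radius, and escalation factor are all hypothetical.

```python
import math

def attraction_force(control_pos, object_pos, zone_radius=0.3,
                     base_strength=1.0, pinch=False):
    """Return a force vector pulling the virtual object toward the control object.

    The force is zero outside the attraction zone, ramps up as the control
    object approaches, and escalates when a pinch/grasp gesture is detected.
    """
    dx = [c - o for c, o in zip(control_pos, object_pos)]
    dist = math.sqrt(sum(d * d for d in dx))
    if dist >= zone_radius or dist == 0.0:
        return [0.0, 0.0, 0.0]
    # Strength ramps linearly from 0 at the zone edge to base_strength at contact.
    strength = base_strength * (1.0 - dist / zone_radius)
    if pinch:
        strength *= 3.0  # hypothetical escalation on a pinch/grasp action
    return [strength * d / dist for d in dx]
```

A pinched hand 10 cm from the object thus pulls three times harder than an open hand at the same distance, while anything outside the 30 cm zone exerts no force at all.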
-
Publication number: 20200004344
Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three-dimensional sensor space by sensing a movement of the control object in the three-dimensional sensor space, sensing the orientation of the control object, defining a control plane tangential to a surface of the control object, and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to it.
Type: Application
Filed: September 13, 2019
Publication date: January 2, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
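The "more normal versus more parallel" test above amounts to splitting the movement vector into its component along the control plane's normal and its component in the plane, then comparing the two. A minimal sketch of that decomposition (names and the binary classification are illustrative assumptions, not taken from the patent):

```python
import math

def classify_gesture(movement, plane_normal):
    """Classify a movement as 'normal' (e.g. a push toward the surface) or
    'parallel' (e.g. a swipe along it), relative to a control plane given
    by a normal vector tangential to the control object's surface.
    """
    n_len = math.sqrt(sum(c * c for c in plane_normal))
    n = [c / n_len for c in plane_normal]  # unit normal
    # Magnitude of the movement component along the normal...
    normal_mag = abs(sum(m * c for m, c in zip(movement, n)))
    # ...and of the remaining component lying in the plane.
    total_sq = sum(m * m for m in movement)
    parallel_mag = math.sqrt(max(total_sq - normal_mag ** 2, 0.0))
    return "normal" if normal_mag > parallel_mag else "parallel"
```

A push straight along the palm normal classifies as "normal"; a sideways swipe with only a slight downward drift classifies as "parallel".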
-
Patent number: 10452154
Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three-dimensional sensor space by sensing a movement of the control object in the three-dimensional sensor space, sensing the orientation of the control object, defining a control plane tangential to a surface of the control object, and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to it.
Type: Grant
Filed: December 7, 2018
Date of Patent: October 22, 2019
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
-
Patent number: 10416834
Abstract: The technology disclosed relates to using virtual attraction between a hand or other control object in a three-dimensional (3D) sensory space and a virtual object in a virtual space. In particular, it relates to defining a virtual attraction zone of a hand or other control object that is tracked in a three-dimensional (3D) sensory space, and generating one or more interaction forces between the control object and a virtual object in a virtual space that cause motion of the virtual object responsive to proximity of the control object to the virtual object, escalating with a virtual pinch or grasp action of the control object directed to a manipulation point of the virtual object.
Type: Grant
Filed: November 13, 2014
Date of Patent: September 17, 2019
Assignee: Leap Motion, Inc.
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Maxwell Sills, Gabriel Hare
-
Publication number: 20190265803
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It further relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: May 6, 2019
Publication date: August 29, 2019
Applicant: Leap Motion, Inc.
Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
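The mapping this abstract describes (gesture parameters driving a virtual camera's visual parameters) could look something like the sketch below. The specific gesture parameters ('spread', 'direction'), the field-of-view range, and the clamping are all hypothetical choices for illustration; the patent does not specify them.

```python
def camera_from_gesture(gesture):
    """Map gesture parameters to virtual-camera parameters.

    `gesture` is a dict with hypothetical keys: 'spread' (hand spread in
    meters) and 'direction' (a 3D look vector). Field of view widens with
    the spread; the camera looks along the gesture direction.
    """
    min_fov, max_fov = 30.0, 90.0
    spread = min(max(gesture["spread"], 0.0), 0.2)  # clamp to a 0-20 cm range
    fov = min_fov + (max_fov - min_fov) * (spread / 0.2)
    return {"fov_degrees": fov, "look_direction": gesture["direction"]}
```

A closed pinch yields the narrowest 30° view; a fully spread hand yields the widest 90° view, with the camera aimed wherever the gesture points.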
-
Publication number: 20190250719
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It further relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: April 29, 2019
Publication date: August 15, 2019
Applicant: Leap Motion, Inc.
Inventors: Isaac Cohen, Maxwell Sills
-
Publication number: 20190155394
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Application
Filed: November 19, 2018
Publication date: May 23, 2019
Applicant: Leap Motion, Inc.
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
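Two of the embodiments above translate directly into small computations: scaling on-screen movement by the ratio between the gesture scale and the display scale, and picking a dominant gesture from several candidates. The sketch below is one plausible reading; the function names, the dict format, and the use of traversed distance as the dominance criterion are assumptions, not details from the publication.

```python
def display_delta(gesture_start, gesture_end, gesture_scale, display_scale):
    """Convert an actual gesture displacement into on-screen movement using
    the ratio between the display scale and the identified gesture scale."""
    ratio = display_scale / gesture_scale
    return [(e - s) * ratio for s, e in zip(gesture_start, gesture_end)]

def dominant_gesture(gestures):
    """Pick the dominant gesture from several candidates; here, dominance is
    simply the greatest distance traversed."""
    return max(gestures, key=lambda g: g["distance"])
```

With a 0.5 m gesture scale mapped onto a 500-pixel display scale, a 10 cm hand movement becomes a 100-pixel cursor movement, and a long swipe outranks a small incidental tap.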
-
Publication number: 20190146660
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and the positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and the positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Application
Filed: December 20, 2018
Publication date: May 16, 2019
Applicant: Leap Motion, Inc.
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Gabriel A. Hare, Maxwell Sills
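The dynamic selection step above can be pictured as choosing, among candidate points on the virtual object, the one closest overall to the hand's calculation points (fingertips, thumb, palm). This is an illustrative sketch of that idea, assuming a summed-distance criterion the publication does not actually specify:

```python
import math

def select_manipulation_point(calculation_points, candidates):
    """Select the candidate manipulation point on a virtual object with the
    smallest total distance to the hand's calculation points.

    Both arguments are lists of 3D points (lists of three floats).
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(candidates, key=lambda c: sum(dist(c, p) for p in calculation_points))
```

As the hand moves, re-running the selection each frame makes the manipulation point track whichever part of the object the fingers are converging on.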
-
Patent number: 10281992
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It further relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: January 3, 2018
Date of Patent: May 7, 2019
Assignee: Leap Motion, Inc.
Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
-
Patent number: 10275039
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It further relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: August 18, 2017
Date of Patent: April 30, 2019
Assignee: Leap Motion, Inc.
Inventors: Isaac Cohen, Maxwell Sills
-
Publication number: 20190113980
Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three-dimensional sensor space by sensing a movement of the control object in the three-dimensional sensor space, sensing the orientation of the control object, defining a control plane tangential to a surface of the control object, and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to it.
Type: Application
Filed: December 7, 2018
Publication date: April 18, 2019
Applicant: Leap Motion, Inc.
Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
-
Publication number: 20190079594
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It further relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: November 13, 2018
Publication date: March 14, 2019
Applicant: Leap Motion, Inc.
Inventors: Isaac Cohen, Maxwell Sills
-
Publication number: 20190042957
Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space, generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of the fingers, thumb, and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
Type: Application
Filed: June 5, 2018
Publication date: February 7, 2019
Applicant: Leap Motion, Inc.
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
-
Patent number: 10168873
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and the positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and the positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Grant
Filed: October 29, 2014
Date of Patent: January 1, 2019
Assignee: Leap Motion, Inc.
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
-
Patent number: 10152136
Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three-dimensional sensor space by sensing a movement of the control object in the three-dimensional sensor space, sensing the orientation of the control object, defining a control plane tangential to a surface of the control object, and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to it.
Type: Grant
Filed: October 16, 2014
Date of Patent: December 11, 2018
Assignee: Leap Motion, Inc.
Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
-
Patent number: 10139918
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Grant
Filed: September 28, 2016
Date of Patent: November 27, 2018
Assignee: Leap Motion, Inc.
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Samuel Holz, Maxwell Sills, Matias Perez, Gabriel A. Hare, Ryan Christopher Julian
-
Patent number: 10126822
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It further relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: December 16, 2014
Date of Patent: November 13, 2018
Assignee: Leap Motion, Inc.
Inventors: Isaac Cohen, Maxwell Sills
-
Publication number: 20180253150
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It further relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: January 3, 2018
Publication date: September 6, 2018
Applicant: Leap Motion, Inc.
Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
-
Patent number: 9996797
Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space, generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of the fingers, thumb, and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
Type: Grant
Filed: October 31, 2014
Date of Patent: June 12, 2018
Assignee: Leap Motion, Inc.
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
-
Patent number: 9891712
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It further relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: December 16, 2014
Date of Patent: February 13, 2018
Assignee: Leap Motion, Inc.
Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik