Patents by Inventor Maxwell Sills
Maxwell Sills has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11182685
Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
Type: Grant
Filed: June 5, 2018
Date of Patent: November 23, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Maxwell Sills, Hua Yang, Gabriel Hare
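The selection step this abstract describes can be illustrated with a minimal sketch. All names, the distance-based selection rule, and the `grab_radius` parameter are assumptions for illustration, not the patent's claimed method: calculation points are 3D positions taken from the tracked hand model, and candidate manipulation points lie on the virtual object's surface.

```python
import math

def dist(a, b):
    # Euclidean distance between two 3D points
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_manipulation_point(calc_points, candidates):
    """Pick the candidate point closest to any calculation point."""
    return min(candidates,
               key=lambda c: min(dist(c, p) for p in calc_points))

def manipulate(calc_points, candidates, grab_radius=0.05):
    """Return the selected manipulation point and whether any
    calculation point is close enough to engage it."""
    m = select_manipulation_point(calc_points, candidates)
    engaged = any(dist(m, p) <= grab_radius for p in calc_points)
    return m, engaged
```

In this toy version, manipulation only proceeds when a fingertip or palm point falls within the (assumed) grab radius of the dynamically selected point.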
-
Publication number: 20210342012
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: July 16, 2021
Publication date: November 4, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac COHEN, Maxwell SILLS, Paul DURDIK
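One way to picture "defining a spatial attribute of an interaction space from a gesture parameter" is to derive the space's extent from the swept gesture path itself. This sketch is purely illustrative (the bounding-box rule and function names are assumptions, not taken from the publication):

```python
def interaction_space(path):
    """Derive an axis-aligned interaction space from the 3D samples
    of a control gesture's path."""
    xs, ys, zs = zip(*path)
    lo = (min(xs), min(ys), min(zs))
    hi = (max(xs), max(ys), max(zs))
    return lo, hi

def contains(space, point):
    # True if the point lies inside the interaction space
    lo, hi = space
    return all(l <= p <= h for l, p, h in zip(lo, point, hi))
```

A larger sweep would define a larger active region, so the gesture parameter (its spatial extent) directly determines the space's spatial attribute.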
-
Publication number: 20210342013
Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three dimensional sensor space by sensing a movement of the control object in the three dimensional sensor space, sensing orientation of the control object, defining a control plane tangential to a surface of the control object and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to the control plane.
Type: Application
Filed: July 19, 2021
Publication date: November 4, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac COHEN, David S. HOLZ, Maxwell SILLS
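The normal-versus-parallel test in this abstract reduces to vector projection. The sketch below is a minimal, assumed implementation (function names and the two-way classification are illustrative): project the movement vector onto the control plane's normal and compare the normal component against the in-plane remainder.

```python
import math

def interpret_gesture(movement, plane_normal):
    """Classify a movement vector as more normal to the control plane
    (e.g. a press) or more parallel to it (e.g. a swipe)."""
    nx, ny, nz = plane_normal
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    n = (nx / norm, ny / norm, nz / norm)
    # component of the movement along the plane normal
    normal_mag = abs(sum(m * c for m, c in zip(movement, n)))
    total = math.sqrt(sum(m * m for m in movement))
    # remaining component lies in the plane
    parallel_mag = math.sqrt(max(total * total - normal_mag * normal_mag, 0.0))
    return "normal" if normal_mag > parallel_mag else "parallel"
```

With the palm's surface normal as `plane_normal`, a push toward the plane classifies as "normal" and a sideways drag as "parallel".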
-
Patent number: 11132064
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: November 13, 2018
Date of Patent: September 28, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, Maxwell Sills
-
Patent number: 11068070
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: February 28, 2020
Date of Patent: July 20, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
-
Patent number: 11068071
Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three dimensional sensor space by sensing a movement of the control object in the three dimensional sensor space, sensing orientation of the control object, defining a control plane tangential to a surface of the control object and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to the control plane.
Type: Grant
Filed: April 27, 2020
Date of Patent: July 20, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
-
Publication number: 20210141462
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: January 21, 2021
Publication date: May 13, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac COHEN, Maxwell SILLS
-
Patent number: 10901518
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: April 29, 2019
Date of Patent: January 26, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, Maxwell Sills
-
Publication number: 20210003977
Abstract: The technology disclosed relates to selecting among devices in a room to interact with. It also relates to operating a smart phone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
Type: Application
Filed: September 21, 2020
Publication date: January 7, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: Robert Samuel Gordon, Paul Alan Durdik, Maxwell Sills
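One plausible mechanism for "selecting among devices in a room" is pointing: cast a ray along the control object's pointing direction and pick the device nearest that ray. This sketch is an assumption for illustration only (the publication does not specify this rule, and all names are invented):

```python
import math

def point_to_ray_distance(origin, direction, point):
    """Distance from a 3D point to the ray origin + t*direction, t >= 0."""
    d = math.sqrt(sum(c * c for c in direction))
    u = tuple(c / d for c in direction)          # unit direction
    v = tuple(p - o for p, o in zip(point, origin))
    t = max(sum(a * b for a, b in zip(v, u)), 0.0)  # clamp behind origin
    closest = tuple(o + t * c for o, c in zip(origin, u))
    return math.sqrt(sum((p - c) ** 2 for p, c in zip(point, closest)))

def select_device(origin, direction, devices):
    """devices: mapping of name -> (x, y, z) position in the room."""
    return min(devices, key=lambda name:
               point_to_ray_distance(origin, direction, devices[name]))
```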
-
Publication number: 20200401232
Abstract: The technology disclosed relates to motion capture and gesture recognition. In particular, it calculates the exerted force implied by a human hand motion and applies the equivalent through a robotic arm to a target object. In one implementation, this is achieved by tracking the motion and contact of the human hand and generating corresponding robotic commands that replicate the motion and contact of the human hand on a workpiece through a robotic tool.
Type: Application
Filed: September 3, 2020
Publication date: December 24, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: Maxwell Sills, Robert S. Gordon, Paul Durdik
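The "exerted force implied by a human hand motion" can be estimated, in the simplest reading, from the acceleration of the tracked hand via F = m·a. The sketch below is a hedged illustration, not the publication's method: the effective mass, the finite-difference scheme, and the command format are all assumptions.

```python
def implied_force(positions, dt, effective_mass=0.4):
    """Estimate force (newtons) from three consecutive 1D position
    samples (metres) using a central second difference for acceleration."""
    p0, p1, p2 = positions
    accel = (p2 - 2 * p1 + p0) / (dt * dt)
    return effective_mass * accel

def robot_command(force, max_force=50.0):
    """Clamp the force and package it for a hypothetical robot
    controller, so a tracking glitch cannot command a dangerous spike."""
    f = max(-max_force, min(max_force, force))
    return {"axis": "z", "force_n": f}
```

A real system would track full 3D motion and contact, but the same idea applies per axis: differentiate tracked positions twice, scale by an effective mass, and replay the equivalent force through the robotic tool.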
-
Publication number: 20200356766
Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
Type: Application
Filed: July 28, 2020
Publication date: November 12, 2020
Inventors: Maxwell SILLS, Aaron SMITH, David S. HOLZ, Hongyuan (Jimmy) HE
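The identification idea here can be sketched as template matching: compare a feature vector derived from the user's captured motion path against enrolled templates. The matching scheme below (Euclidean nearest neighbour with a rejection threshold) is an assumption for illustration, not the publication's method.

```python
import math

def identify(features, templates, threshold=1.0):
    """Return the enrolled user whose template is nearest to the
    observed features, or None if no template is close enough.
    templates: mapping of user id -> enrolled feature vector."""
    best, best_d = None, float("inf")
    for user, tmpl in templates.items():
        d = math.sqrt(sum((f - t) ** 2 for f, t in zip(features, tmpl)))
        if d < best_d:
            best, best_d = user, d
    return best if best_d <= threshold else None
```

The rejection threshold matters for the security-aware settings the abstract lists: an unenrolled hand should yield no identification rather than the nearest wrong one.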
-
Publication number: 20200356238
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of the calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Application
Filed: July 28, 2020
Publication date: November 12, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Hua YANG, Gabriel A. HARE, Maxwell SILLS
-
Publication number: 20200301515
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: February 28, 2020
Publication date: September 24, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac COHEN, Maxwell SILLS, Paul DURDIK
-
Patent number: 10782657
Abstract: The technology disclosed relates to selecting among devices in a room to interact with. It also relates to operating a smart phone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
Type: Grant
Filed: February 19, 2015
Date of Patent: September 22, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: Robert Samuel Gordon, Maxwell Sills, Paul Alan Durdik
-
Patent number: 10768708
Abstract: The technology disclosed relates to motion capture and gesture recognition. In particular, it calculates the exerted force implied by a human hand motion and applies the equivalent through a robotic arm to a target object. In one implementation, this is achieved by tracking the motion and contact of the human hand and generating corresponding robotic commands that replicate the motion and contact of the human hand on a workpiece through a robotic tool.
Type: Grant
Filed: August 21, 2015
Date of Patent: September 8, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: Maxwell Sills, Robert S. Gordon, Paul Durdik
-
Publication number: 20200257374
Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three dimensional sensor space by sensing a movement of the control object in the three dimensional sensor space, sensing orientation of the control object, defining a control plane tangential to a surface of the control object and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to the control plane.
Type: Application
Filed: April 27, 2020
Publication date: August 13, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac COHEN, David S. HOLZ, Maxwell SILLS
-
Patent number: 10739965
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of the calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Grant
Filed: December 20, 2018
Date of Patent: August 11, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Gabriel A. Hare, Maxwell Sills
-
Patent number: 10733429
Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
Type: Grant
Filed: June 2, 2017
Date of Patent: August 4, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: Maxwell Sills, Aaron Smith, David S. Holz, Hongyuan (Jimmy) He
-
Patent number: 10635185
Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three dimensional sensor space by sensing a movement of the control object in the three dimensional sensor space, sensing orientation of the control object, defining a control plane tangential to a surface of the control object and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to the control plane.
Type: Grant
Filed: September 13, 2019
Date of Patent: April 28, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
-
Patent number: 10579155
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: May 6, 2019
Date of Patent: March 3, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik