Patents by Inventor Maxwell Sills
Maxwell Sills has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240061511
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, the gesture is identified as an engagement gesture, and compared with reference gestures from a library of reference gestures. In some embodiments, a degree of completion of the recognized engagement gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Application
Filed: July 7, 2023
Publication date: February 22, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel HARE, Ryan JULIAN
-
Publication number: 20240028131
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: September 26, 2023
Publication date: January 25, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: Isaac COHEN, Maxwell SILLS, Paul DURDIK
-
Publication number: 20230333662
Abstract: The technology disclosed relates to automatically interpreting motion of a control object in a three dimensional (3D) sensor space by sensing a movement of the control object in the (3D) sensor space, interpreting movement of the control object, and presenting the interpreted movement as a path on a display. The path may be displayed once the speed of the movement exceeds a pre-determined threshold measured in cm per second. Once the path is displayed, the technology duplicates a display object that intersects the path on the display. In some implementations, the control object may be a device, a hand, or a portion of a hand (such as a finger).
Type: Application
Filed: June 23, 2023
Publication date: October 19, 2023
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac COHEN, David S. HOLZ, Maxwell SILLS
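The speed-threshold test this abstract describes can be sketched in a few lines. This is an illustrative reading of the claim language, not the patented implementation; the threshold value and function name are assumptions.

```python
# A path is displayed only once the control object's speed exceeds a
# pre-determined threshold measured in cm per second, per the abstract.
SPEED_THRESHOLD_CM_S = 30.0  # assumed value for illustration

def should_display_path(distance_cm, elapsed_s, threshold=SPEED_THRESHOLD_CM_S):
    """Return True when the movement's speed (cm/s) exceeds the threshold."""
    if elapsed_s <= 0:
        return False
    return (distance_cm / elapsed_s) > threshold
```

For example, a 40 cm movement over one second would trigger the path display under this assumed threshold, while a 10 cm movement would not.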
-
Patent number: 11775080
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: October 3, 2022
Date of Patent: October 3, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
-
Patent number: 11740705
Abstract: A method and system are provided for controlling a machine using gestures. The method includes sensing a variation of position of a control object using an imaging system, determining, from the variation, one or more primitives describing a characteristic of a control object moving in space, comparing the one or more primitives to one or more gesture templates in a library of gesture templates, selecting, based on a result of the comparing, one or more gesture templates corresponding to the one or more primitives, and providing at least one gesture template of the selected one or more gesture templates as an indication of a command to issue to a machine under control responsive to the variation.
Type: Grant
Filed: February 7, 2022
Date of Patent: August 29, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
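The pipeline this abstract walks through (positions → primitives → template comparison → selection) can be sketched as follows. This is a minimal reading of the abstract, not the patented method: the primitive representation (unit direction vectors), the distance metric, and all names are assumptions.

```python
import math

# Assumed template library: each gesture maps to a sequence of unit
# direction primitives, standing in for the abstract's "gesture templates".
GESTURE_LIBRARY = {
    "swipe_right": [(1.0, 0.0), (1.0, 0.0), (1.0, 0.1)],
    "push":        [(0.0, -1.0), (0.0, -1.0), (0.1, -1.0)],
}

def primitives_from_positions(positions):
    """Reduce a sequence of sensed 2D positions to unit direction primitives."""
    prims = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        dx, dy = x1 - x0, y1 - y0
        norm = math.hypot(dx, dy) or 1.0
        prims.append((dx / norm, dy / norm))
    return prims

def match_gesture(primitives, library=GESTURE_LIBRARY):
    """Select the template whose primitives are closest overall; its name
    stands in for the command issued to the machine under control."""
    def distance(a, b):
        return sum(math.hypot(pa[0] - pb[0], pa[1] - pb[1])
                   for pa, pb in zip(a, b))
    return min(library, key=lambda name: distance(primitives, library[name]))
```

A rightward sweep of sensed positions such as `[(0, 0), (1, 0), (2, 0), (3, 0.1)]` would match the `swipe_right` template under these assumptions.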
-
Patent number: 11726575
Abstract: The technology disclosed relates to automatically interpreting a gesture of a control object in a three dimensional sensor space by sensing a movement of the control object in the three dimensional sensor space, sensing orientation of the control object, defining a control plane tangential to a surface of the control object and interpreting the gesture based on whether the movement of the control object is more normal to the control plane or more parallel to the control plane.
Type: Grant
Filed: July 19, 2021
Date of Patent: August 15, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
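The "more normal or more parallel" test at the heart of this abstract is a vector decomposition: split the movement into its component along the control plane's unit normal and its component within the plane, then compare magnitudes. The sketch below illustrates that geometry; the function name and the representation of the plane by its unit normal are assumptions, not the patented implementation.

```python
import math

def classify_movement(movement, plane_normal):
    """Return 'normal' if the movement vector is mostly along the control
    plane's unit normal, else 'parallel' (mostly within the plane)."""
    nx, ny, nz = plane_normal
    mx, my, mz = movement
    # Scalar component of the movement along the plane's unit normal.
    along = mx * nx + my * ny + mz * nz
    normal_mag = abs(along)
    # Subtracting the normal component leaves the in-plane component.
    px, py, pz = mx - along * nx, my - along * ny, mz - along * nz
    parallel_mag = math.sqrt(px * px + py * py + pz * pz)
    return "normal" if normal_mag > parallel_mag else "parallel"
```

For a control plane facing the sensor (normal `(0, 0, 1)`), a push straight toward the plane classifies as `normal`, while a sideways drag classifies as `parallel`.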
-
Publication number: 20230245500
Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
Type: Application
Filed: March 31, 2023
Publication date: August 3, 2023
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: Maxwell SILLS, Aaron SMITH, David S. HOLZ, Hongyuan (Jimmy) HE
-
Publication number: 20230205151
Abstract: The technology disclosed relates to selecting among devices to interact with. It also relates to operating a smartphone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
Type: Application
Filed: January 4, 2023
Publication date: June 29, 2023
Applicant: Ultrahaptics IP Two Limited
Inventors: Robert Samuel GORDON, Paul Alan DURDIK, Maxwell SILLS
-
Patent number: 11620859
Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
Type: Grant
Filed: July 28, 2020
Date of Patent: April 4, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Maxwell Sills, Aaron Smith, David S. Holz, Hongyuan (Jimmy) He
-
Publication number: 20230072748
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: November 10, 2022
Publication date: March 9, 2023
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac COHEN, Maxwell SILLS
-
Patent number: 11567583
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: September 27, 2021
Date of Patent: January 31, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, Maxwell Sills
-
Publication number: 20230025269
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: October 3, 2022
Publication date: January 26, 2023
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac COHEN, Maxwell SILLS, Paul DURDIK
-
Patent number: 11561519
Abstract: The technology disclosed relates to selecting among devices in a room to interact with. It also relates to operating a smartphone with reduced power consumption. It further relates to gesturally interacting with devices that lack gestural responsiveness. The technology disclosed also relates to distinguishing control gestures from proximate non-control gestures in a pervasive three dimensional (3D) sensory space. The technology disclosed further relates to selecting among virtual interaction modalities to interact with.
Type: Grant
Filed: September 21, 2020
Date of Patent: January 24, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Robert Samuel Gordon, Paul Alan Durdik, Maxwell Sills
-
Patent number: 11500473
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: January 21, 2021
Date of Patent: November 15, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, Maxwell Sills
-
Patent number: 11460929
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: July 16, 2021
Date of Patent: October 4, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
-
Publication number: 20220236808
Abstract: A method and system are provided for controlling a machine using gestures. The method includes sensing a variation of position of a control object using an imaging system, determining, from the variation, one or more primitives describing a characteristic of a control object moving in space, comparing the one or more primitives to one or more gesture templates in a library of gesture templates, selecting, based on a result of the comparing, one or more gesture templates corresponding to the one or more primitives, and providing at least one gesture template of the selected one or more gesture templates as an indication of a command to issue to a machine under control responsive to the variation.
Type: Application
Filed: February 7, 2022
Publication date: July 28, 2022
Applicant: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
-
Patent number: 11307282
Abstract: The technology disclosed relates to determining positional information about an object of interest. In particular, it includes conducting scanning of a field of interest with an emission from a transmission area according to an ordered scan pattern. The emission can be received to form a signal based upon at least one salient property (e.g., intensity, amplitude, frequency, polarization, phase, or other detectable feature) of the emission varying with time at the object of interest. Synchronization information about the ordered scan pattern can be derived from a source, a second signal broadcast separately, a social media share, or combinations thereof. A correspondence between at least one characteristic of the signal and the synchronization information can be established. Positional information can be determined based at least in part upon the correspondence.
Type: Grant
Filed: October 24, 2014
Date of Patent: April 19, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Robert Samuel Gordon, Gabriel A. Hare, Neeloy Roy, Maxwell Sills, Paul Durdik
-
Publication number: 20220083880
Abstract: The technology disclosed relates to manipulating a virtual object. In particular, it relates to detecting a hand in a three-dimensional (3D) sensory space and generating a predictive model of the hand, and using the predictive model to track motion of the hand. The predictive model includes positions of calculation points of fingers, thumb and palm of the hand. The technology disclosed relates to dynamically selecting at least one manipulation point proximate to a virtual object based on the motion tracked by the predictive model and positions of one or more of the calculation points, and manipulating the virtual object by interaction between at least some of the calculation points of the predictive model and the dynamically selected manipulation point.
Type: Application
Filed: November 22, 2021
Publication date: March 17, 2022
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Maxwell SILLS, Hua YANG, Gabriel HARE
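One simple way to realize the "dynamically selecting at least one manipulation point" step above is a nearest-point query over the hand model's calculation points. The sketch below illustrates that idea only; approximating proximity to the virtual object by distance to its center, and all names, are assumptions rather than the patented implementation.

```python
import math

def select_manipulation_point(calculation_points, object_center):
    """Pick the calculation point (finger, thumb, or palm position from the
    predictive hand model) currently nearest the virtual object's center."""
    return min(calculation_points, key=lambda p: math.dist(p, object_center))
```

For example, with calculation points for a fingertip, a knuckle, and the palm, the point returned is whichever lies closest to the object at that instant, and it would be re-selected each frame as the tracked hand moves.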
-
Patent number: 11243612
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Grant
Filed: November 19, 2018
Date of Patent: February 8, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
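The ratio-based mapping in this abstract (actual gesture scale to displayed-movement scale) reduces to a single proportional conversion. The sketch below shows that arithmetic under assumed units (centimeters in, pixels out); the function name and units are illustrative, not taken from the patent.

```python
def display_movement(gesture_distance_cm, gesture_scale_cm, display_scale_px):
    """Map a measured gesture distance to on-screen movement using the
    ratio between the gesture's scale and the display's movement scale."""
    return gesture_distance_cm * (display_scale_px / gesture_scale_cm)
```

Under these assumptions, if a 10 cm gesture scale corresponds to 200 px of on-screen travel, a 5 cm hand movement produces 100 px of displayed movement.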
-
Publication number: 20220011871
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Application
Filed: September 27, 2021
Publication date: January 13, 2022
Applicant: Ultrahaptics IP Two Limited
Inventors: Isaac COHEN, Maxwell SILLS