Patents by Inventor Keith Mertens
Keith Mertens has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240077950
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The technology disclosed includes determining from the motion information whether a motion of the control object with respect to the virtual control construct is an engagement gesture, such as a virtual mouse click or other control device operation. The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Application
Filed: November 9, 2023
Publication date: March 7, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
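The engagement mechanism described above can be illustrated with a minimal sketch. All names, the fixed-offset plane, and the follow rate are assumptions for illustration only; this is not the patented implementation.

```python
# Hypothetical sketch: a virtual control surface floats at a fixed
# offset in front of the tracked hand. When the hand penetrates the
# surface, an engagement (e.g. a virtual mouse click) fires once.
class VirtualControlSurface:
    def __init__(self, offset=0.1, follow_rate=0.05):
        self.offset = offset            # distance the surface floats in front of the hand (m)
        self.follow_rate = follow_rate  # how quickly the surface re-centers on the hand
        self.plane_z = None             # current z-position of the virtual surface
        self.engaged = False

    def update(self, hand_z):
        """Feed one tracked hand position; return True on new engagement."""
        if self.plane_z is None:
            self.plane_z = hand_z - self.offset
        # Update the surface position "from time to time" toward the hand.
        self.plane_z += self.follow_rate * ((hand_z - self.offset) - self.plane_z)
        engaged_now = hand_z < self.plane_z   # hand pushed past the surface
        fired = engaged_now and not self.engaged
        self.engaged = engaged_now
        return fired
```

Feeding successive depth samples, a slow drift moves the surface along with the hand, while a quick forward push crosses it and fires a single engagement event.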
-
Publication number: 20240061511
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, the gesture is identified as an engagement gesture, and compared with reference gestures from a library of reference gestures. In some embodiments, a degree of completion of the recognized engagement gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Application
Filed: July 7, 2023
Publication date: February 22, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
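The "dominant gesture" selection can be sketched as picking the highest-confidence candidate from several concurrently recognized gestures, with a degree-of-completion value carried along to scale the display response. The tuple layout and confidence scoring are illustrative assumptions, not the claimed method.

```python
# Hypothetical sketch: select a dominant gesture from candidates of the
# form (name, confidence, completion), where completion is in [0, 1].
def dominant_gesture(candidates):
    """Return the candidate with the highest recognition confidence."""
    return max(candidates, key=lambda c: c[1])

cands = [("swipe", 0.4, 0.9), ("circle", 0.8, 0.5)]
name, conf, completion = dominant_gesture(cands)
# Display contents could then be modified in proportion to `completion`,
# e.g. animating an on-screen action halfway through at completion=0.5.
```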
-
Patent number: 11874970
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Grant
Filed: June 6, 2022
Date of Patent: January 16, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
-
Patent number: 11740705
Abstract: A method and system are provided for controlling a machine using gestures. The method includes sensing a variation of position of a control object using an imaging system, determining, from the variation, one or more primitives describing a characteristic of a control object moving in space, comparing the one or more primitives to one or more gesture templates in a library of gesture templates, selecting, based on a result of the comparing, one or more gesture templates corresponding to the one or more primitives, and providing at least one gesture template of the selected one or more gesture templates as an indication of a command to issue to a machine under control responsive to the variation.
Type: Grant
Filed: February 7, 2022
Date of Patent: August 29, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
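The primitive-to-template matching pipeline described in this abstract can be sketched as follows. Representing primitives as 2D motion segments and scoring templates by summed segment distance are assumptions made for illustration; the actual primitive representation is not specified here.

```python
# Hypothetical sketch: describe the control object's motion as a list of
# (dx, dy) motion primitives, compare against a template library, and
# return the best-matching template name as the command indication.
import math

def match_gestures(primitives, library, tolerance=1.0):
    """Return the closest template within tolerance, or None."""
    def distance(a, b):
        # Sum of per-segment Euclidean distances between primitive lists.
        return sum(math.hypot(ax - bx, ay - by)
                   for (ax, ay), (bx, by) in zip(a, b))
    scored = [(distance(primitives, tmpl), name)
              for name, tmpl in library.items()
              if len(tmpl) == len(primitives)]
    scored = [s for s in scored if s[0] <= tolerance]
    return min(scored)[1] if scored else None

library = {
    "swipe_right": [(1, 0), (1, 0)],
    "swipe_up":    [(0, 1), (0, 1)],
}
```

A roughly rightward motion selects the `swipe_right` template; a motion unlike any template yields no command.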
-
Publication number: 20220300085
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Application
Filed: June 6, 2022
Publication date: September 22, 2022
Applicant: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
-
Publication number: 20220236808
Abstract: A method and system are provided for controlling a machine using gestures. The method includes sensing a variation of position of a control object using an imaging system, determining, from the variation, one or more primitives describing a characteristic of a control object moving in space, comparing the one or more primitives to one or more gesture templates in a library of gesture templates, selecting, based on a result of the comparing, one or more gesture templates corresponding to the one or more primitives, and providing at least one gesture template of the selected one or more gesture templates as an indication of a command to issue to a machine under control responsive to the variation.
Type: Application
Filed: February 7, 2022
Publication date: July 28, 2022
Applicant: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
-
Patent number: 11386522
Abstract: Disclosed herein are systems and methods for correcting distortion in a camera lens. The methods can include receiving at least one image of a calibration object, in which the image is captured via the camera lens and the lens has lens distortion. The methods can further include fitting a plurality of geodesics in the image; determining at least one connection equation for the plurality of geodesics; and determining a metric based on the connection equation, the metric comprising a first distorted radial coordinate. The methods can further include determining an undistorted radial coordinate based on the first distorted radial coordinate; determining a second distorted radial coordinate as a function of the undistorted radial coordinate; inverting the undistorted radial coordinate; and generating an undistorted image based on the inverted undistorted radial coordinate.
Type: Grant
Filed: August 7, 2020
Date of Patent: July 12, 2022
Inventors: Gabriel Archacki Hare, Keith Mertens, Daniel Rothman
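For context on the radial-coordinate inversion step, the sketch below inverts a distorted radius back to an undistorted one numerically. Note the hedge: the patent derives its metric from fitted geodesics and a connection equation, whereas this stand-in uses a generic single-coefficient polynomial distortion model purely to illustrate what "inverting the radial coordinate" means.

```python
# Context sketch only (not the geodesic-based method of the patent):
# invert the polynomial radial model r_d = r_u * (1 + k1 * r_u**2),
# recovering the undistorted radius r_u from the distorted radius r_d
# by fixed-point iteration.
def undistort_radius(r_d, k1, iters=20):
    """Numerically solve r_d = r_u * (1 + k1 * r_u**2) for r_u."""
    r_u = r_d                      # distorted radius is a good initial guess
    for _ in range(iters):
        r_u = r_d / (1 + k1 * r_u ** 2)
    return r_u
```

Applying this per pixel radius, followed by resampling, yields an undistorted image under the assumed model.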
-
Patent number: 11353962
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Grant
Filed: August 6, 2020
Date of Patent: June 7, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
-
Publication number: 20220146415
Abstract: Disclosed herein are devices and related methods for identifying odorants using a combination of light sources which enable discrimination between different odorants and/or determination of odorant concentration. The response of a chemical sensor to one or more odorants is observed under a combination and/or sequence of light sources, and the resulting data is subsequently analyzed using machine learning methods to identify one or more odorants and/or to determine the concentration of one or more odorants.
Type: Application
Filed: November 10, 2021
Publication date: May 12, 2022
Applicant: Olfato Wearables Limited
Inventors: Vakhtang Putkaradze, Natasa Vretenar, Josef Hocher, Keith Mertens
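The classification idea can be sketched by treating the sensor's response under each light source as one feature dimension and matching against reference profiles. The feature layout, the example substances, and the nearest-neighbor rule are illustrative assumptions, not the disclosed machine-learning method.

```python
# Hypothetical sketch: classify an odorant from a chemical sensor's
# response measured under a sequence of light sources (e.g. UV, blue,
# red), by nearest reference profile in feature space.
def classify_odorant(response, reference_responses):
    """response: per-light-source readings; returns the closest reference name."""
    def dist(a, b):
        # Squared Euclidean distance between response vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(reference_responses,
               key=lambda name: dist(response, reference_responses[name]))

refs = {
    "ethanol": (0.9, 0.2, 0.1),   # illustrative reference profiles
    "acetone": (0.3, 0.8, 0.4),
}
```

A measured response close to one of the stored profiles is labeled accordingly; a regression over the same features could estimate concentration.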
-
Patent number: 11243612
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Grant
Filed: November 19, 2018
Date of Patent: February 8, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
-
Publication number: 20210382563
Abstract: Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape or a motion of the portion of the hand or other detectable object.
Type: Application
Filed: August 23, 2021
Publication date: December 9, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: Avinash Dabir, Paul Durdik, Keith Mertens, Michael Zagorsek
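The zone-based interpretation can be sketched as a lookup from the detected position to a zone, where each zone carries its own rule for interpreting position, shape, or motion inputs. The zone names, depth bounds, and interpretations below are illustrative assumptions only.

```python
# Hypothetical sketch: map a detected hand position (normalized depth
# from the 3D sensor) to a zone, each with its own input interpretation.
ZONES = {
    "hover":   ((0.0, 0.3), "interpret motion as pointer movement"),
    "touch":   ((0.3, 0.5), "interpret motion as click/drag"),
    "command": ((0.5, 1.0), "interpret hand shape as a command gesture"),
}

def zone_for(depth):
    """Return the zone name for a normalized depth, or None if outside all zones."""
    for name, ((lo, hi), _interpretation) in ZONES.items():
        if lo <= depth < hi:
            return name
    return None
```

Once the zone is known, the associated rule decides whether the same hand motion is treated as, say, pointer movement or a click.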
-
Patent number: 11099653
Abstract: Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape or a motion of the portion of the hand or other detectable object.
Type: Grant
Filed: October 21, 2019
Date of Patent: August 24, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: Michael Zagorsek, Avinash Dabir, Paul Durdik, Keith Mertens
-
Publication number: 20210118089
Abstract: Disclosed herein are systems and methods for correcting distortion in a camera lens. The methods can include receiving at least one image of a calibration object, in which the image is captured via the camera lens and the lens has lens distortion. The methods can further include fitting a plurality of geodesics in the image; determining at least one connection equation for the plurality of geodesics; and determining a metric based on the connection equation, the metric comprising a first distorted radial coordinate. The methods can further include determining an undistorted radial coordinate based on the first distorted radial coordinate; determining a second distorted radial coordinate as a function of the undistorted radial coordinate; inverting the undistorted radial coordinate; and generating an undistorted image based on the inverted undistorted radial coordinate.
Type: Application
Filed: August 7, 2020
Publication date: April 22, 2021
Inventors: Gabriel Archacki Hare, Keith Mertens, Daniel Rothman
-
Publication number: 20200363874
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Application
Filed: August 6, 2020
Publication date: November 19, 2020
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
-
Patent number: 10739862
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Grant
Filed: August 3, 2018
Date of Patent: August 11, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
-
Publication number: 20200050281
Abstract: Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape or a motion of the portion of the hand or other detectable object.
Type: Application
Filed: October 21, 2019
Publication date: February 13, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: Michael Zagorsek, Avinash Dabir, Paul Durdik, Keith Mertens
-
Patent number: 10452151
Abstract: Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape or a motion of the portion of the hand or other detectable object.
Type: Grant
Filed: March 9, 2018
Date of Patent: October 22, 2019
Assignee: Ultrahaptics IP Two Limited
Inventors: Michael Zagorsek, Avinash Dabir, Paul Durdik, Keith Mertens
-
Publication number: 20190155394
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Application
Filed: November 19, 2018
Publication date: May 23, 2019
Applicant: Leap Motion, Inc.
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz, Maxwell Sills, Matias Perez, Gabriel Hare, Ryan Julian
-
Publication number: 20190033979
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Application
Filed: August 3, 2018
Publication date: January 31, 2019
Applicant: Leap Motion, Inc.
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
-
Publication number: 20190018495
Abstract: Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape or a motion of the portion of the hand or other detectable object.
Type: Application
Filed: March 9, 2018
Publication date: January 17, 2019
Applicant: Leap Motion, Inc.
Inventors: Michael Zagorsek, Avinash Dabir, Paul Durdik, Keith Mertens