Patents by Inventor Gabriel A. Hare
Gabriel A. Hare has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250130700
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object through interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Application
Filed: October 28, 2024
Publication date: April 24, 2025
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Hua YANG, Gabriel A. HARE, Maxwell SILLS
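The abstract above describes, at a high level, dynamically selecting a manipulation point on a virtual object from the positions of the hand's "calculation points" (e.g., fingertips and joints). As an illustration only, here is a minimal sketch of one plausible selection rule for a spherical object — project the centroid of the calculation points onto the object's surface. All names and the centroid-projection heuristic are hypothetical, not taken from the patent claims:

```python
from dataclasses import dataclass
import math

@dataclass
class Vec3:
    x: float
    y: float
    z: float
    def __sub__(self, o): return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)
    def norm(self): return math.sqrt(self.x**2 + self.y**2 + self.z**2)

def select_manipulation_point(calc_points, obj_center, obj_radius):
    """Pick a manipulation point on a spherical object's surface nearest
    the centroid of the hand's calculation points."""
    n = len(calc_points)
    centroid = Vec3(sum(p.x for p in calc_points) / n,
                    sum(p.y for p in calc_points) / n,
                    sum(p.z for p in calc_points) / n)
    d = centroid - obj_center          # direction from object toward the hand
    dist = d.norm()
    if dist == 0:                      # degenerate case: centroid at center
        return obj_center
    scale = obj_radius / dist          # project onto the sphere surface
    return Vec3(obj_center.x + d.x * scale,
                obj_center.y + d.y * scale,
                obj_center.z + d.z * scale)
```

For example, fingertips straddling the point (2, 0, 0) near a unit sphere at the origin would select the surface point (1, 0, 0) as the manipulation point.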
-
Publication number: 20240361877
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different scenes of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Application
Filed: July 10, 2024
Publication date: October 31, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. HOLZ, Barrett FOX, Kyle A. HAY, Gabriel A. HARE, Wilbur Yung Sheng YU, Dave EDELHART, Jody MEDICH, Daniel PLEMMONS
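The combined RGB-and-IR tracking idea can be illustrated with a toy segmentation step: pixels are classified as foreground by their IR response (hands and near objects reflect illumination strongly in IR), and the foreground centroid is tracked frame to frame, while the RGB values remain available for appearance. This is an illustrative sketch, not the patented method; the frame representation and threshold are assumptions:

```python
def ir_foreground_centroid(frame, ir_threshold=128):
    """frame: dict mapping (x, y) pixel coordinates to (r, g, b, ir) values.
    Classify pixels with IR response above the threshold as foreground and
    return the (x, y) centroid of that region, or None if nothing is found."""
    fg = [(x, y) for (x, y), (_r, _g, _b, ir) in frame.items()
          if ir > ir_threshold]
    if not fg:
        return None
    n = len(fg)
    return (sum(x for x, _ in fg) / n, sum(y for _, y in fg) / n)
```

On a frame where two bright-IR pixels sit at (0, 0) and (2, 0) and a dim-IR pixel sits elsewhere, the tracked centroid is (1.0, 0.0); a real pipeline would feed such centroids into a motion tracker and composite virtual content into the RGB stream.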
-
Patent number: 12131011
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object through interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Grant
Filed: July 28, 2020
Date of Patent: October 29, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Gabriel A. Hare, Maxwell Sills
-
Patent number: 12050757
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different scenes of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Grant
Filed: February 24, 2023
Date of Patent: July 30, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Publication number: 20240201794
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Application
Filed: February 26, 2024
Publication date: June 20, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
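The "abstract features" this family of filings mentions (pose, grab strength, pinch strength, confidence) are typically exposed to applications as normalized 0-to-1 quantities derived from tracked hand geometry. As a hedged sketch only — the formulas, the `max_gap` parameter, and the function names below are hypothetical, not the patented definitions:

```python
import math

def pinch_strength(thumb_tip, index_tip, max_gap=8.0):
    """Map the thumb-to-index fingertip distance (same units as max_gap)
    to a 0..1 pinch strength: 0 at or beyond max_gap apart, 1 when touching."""
    d = math.dist(thumb_tip, index_tip)
    return max(0.0, min(1.0, 1.0 - d / max_gap))

def grab_strength(finger_curls):
    """Average per-finger curl values, each 0 (extended) to 1 (fully curled)."""
    return sum(finger_curls) / len(finger_curls)
```

An application consuming such features can then trigger a "select" action when, say, `pinch_strength` crosses 0.9 with high tracking confidence, rather than reasoning about raw joint positions.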
-
Publication number: 20240094860
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different scenes of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Application
Filed: February 24, 2023
Publication date: March 21, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 11914792
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: February 17, 2023
Date of Patent: February 27, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
-
Publication number: 20230205321
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Application
Filed: February 17, 2023
Publication date: June 29, 2023
Applicant: Ultrahaptics IP Two Limited
Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
-
Patent number: 11599237
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different scenes of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Grant
Filed: February 12, 2021
Date of Patent: March 7, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 11586292
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: March 1, 2021
Date of Patent: February 21, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
-
Patent number: 11307282
Abstract: The technology disclosed relates to determining positional information about an object of interest. In particular, it includes conducting scanning of a field of interest with an emission from a transmission area according to an ordered scan pattern. The emission can be received to form a signal based upon at least one salient property (e.g., intensity, amplitude, frequency, polarization, phase, or other detectable feature) of the emission varying with time at the object of interest. Synchronization information about the ordered scan pattern can be derived from a source, a second signal broadcast separately, a social media share, others, and/or combinations thereof. A correspondence between at least one characteristic of the signal and the synchronization information can be established. Positional information can be determined based at least in part upon the correspondence.
Type: Grant
Filed: October 24, 2014
Date of Patent: April 19, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Robert Samuel Gordon, Gabriel A. Hare, Neeloy Roy, Maxwell Sills, Paul Durdik
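The correspondence step in this abstract can be made concrete with a toy one-dimensional case: if synchronization information tells the receiver when an angular sweep started and how fast it progresses, then the time at which the received intensity peaks maps directly to the scan angle, i.e., the bearing of the object. This sketch is an assumption-laden illustration (a uniform angular sweep, intensity as the salient property), not the claimed method:

```python
def bearing_from_peak_time(samples, sweep_start, angular_rate):
    """samples: list of (timestamp, intensity) measured at the object of
    interest. Using synchronization information about the ordered scan
    pattern (sweep_start time and angular_rate in degrees/second), map the
    time of peak received intensity to the emitter's scan angle at that
    instant, which gives the object's bearing from the transmission area."""
    peak_time = max(samples, key=lambda s: s[1])[0]
    return (peak_time - sweep_start) * angular_rate
```

For instance, with a 90 degrees/second sweep starting at t = 0, a peak observed at t = 0.1 s corresponds to a bearing of about 9 degrees; a second, differently oriented sweep would supply the other coordinate.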
-
Publication number: 20210181859
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Application
Filed: March 1, 2021
Publication date: June 17, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
-
Publication number: 20210165555
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different scenes of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Application
Filed: February 12, 2021
Publication date: June 3, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 10936082
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: September 30, 2019
Date of Patent: March 2, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
-
Patent number: 10921949
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different scenes of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Grant
Filed: July 12, 2019
Date of Patent: February 16, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Publication number: 20200356238
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object through interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Application
Filed: July 28, 2020
Publication date: November 12, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Hua YANG, Gabriel A. HARE, Maxwell SILLS
-
Patent number: 10739965
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object through interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Grant
Filed: December 20, 2018
Date of Patent: August 11, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Gabriel A. Hare, Maxwell Sills
-
Publication number: 20200033951
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Application
Filed: September 30, 2019
Publication date: January 30, 2020
Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
-
Publication number: 20190391724
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different scenes of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Application
Filed: July 12, 2019
Publication date: December 26, 2019
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 10429943
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: May 24, 2018
Date of Patent: October 1, 2019
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare