Patents by Inventor Kyle A. HAY
Kyle A. HAY has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240094860
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Application
Filed: February 24, 2023
Publication date: March 21, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
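The abstract above does not disclose implementation details, but the core idea it names, combining a camera's RGB and IR pixels to track a real object, can be sketched roughly. The threshold values, array shapes, and function name below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def track_object(rgb: np.ndarray, ir: np.ndarray, ir_threshold: float = 0.6):
    """Estimate the 2D centroid of an IR-bright object, using RGB to reject background.

    rgb: (H, W, 3) float array in [0, 1] from the camera's color pixels
    ir:  (H, W) float array in [0, 1] from the camera's infrared pixels
    """
    # IR pixels respond strongly to a nearby, IR-illuminated object (e.g. a hand).
    ir_mask = ir > ir_threshold
    # Reject pixels that are also very bright in RGB (likely ambient light, not the object).
    brightness = rgb.mean(axis=2)
    mask = ir_mask & (brightness < 0.9)
    if not mask.any():
        return None  # nothing tracked this frame
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

# Synthetic frame: a 10x10 IR-bright patch at rows/columns 20..29.
ir = np.zeros((64, 64))
ir[20:30, 20:30] = 1.0
rgb = np.zeros((64, 64, 3))
print(track_object(rgb, ir))  # centroid near (24.5, 24.5)
```

Running the tracker per frame yields a stream of centroids whose motion over time drives the gesture-based user interface the abstract describes.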
-
Patent number: 11914025
Abstract: A plurality of positional sensing devices are situated at regular intervals within an environment and collect data for tracking objects moving within the environment. Phase shifts of modulated Doppler pulses reflected from objects back to the sensing devices are measured and converted into positional data indicating positions of detected objects within the environment. Associated timestamp data is also collected by the positional sensing devices. The positional data and associated timestamp data are aggregated from the plurality of positional sensors, and the aggregated positional data is clustered to determine point clouds, which are associated with the detected objects. The clusters are tracked by tracklets that record the position of each cluster over time. Trajectories for each detected object are determined by connecting tracklets that are associated with the same detected object.
Type: Grant
Filed: August 4, 2023
Date of Patent: February 27, 2024
Assignee: Density, Inc.
Inventors: Andrew Farah, Casey Kelso, Christian Ayerh, John Shanley, Robert Grazioli, Benjamin Redfield, Garrett Bastable, Brian Weinreich, Kyle Hay
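The cluster-then-link pipeline this abstract describes (aggregate timestamped positions, cluster them into per-object point clouds, and connect per-frame clusters into tracklets and trajectories) can be sketched minimally. The greedy clustering, the distance thresholds, and the data layout below are assumptions for illustration, not the patented method.

```python
import numpy as np

def cluster_points(points: np.ndarray, radius: float = 1.0):
    """Greedy clustering: a point joins the first cluster whose seed is within `radius`."""
    clusters = []
    for p in points:
        for c in clusters:
            if np.linalg.norm(p - c[0]) < radius:
                c.append(p)
                break
        else:
            clusters.append([p])
    # One centroid per point cloud, standing in for a detected object.
    return [np.mean(c, axis=0) for c in clusters]

def link_tracklets(frames, max_jump: float = 1.5):
    """Connect per-frame cluster centroids into tracklets by nearest-neighbor matching."""
    tracklets = []  # each tracklet is a list of (frame_index, centroid) pairs
    for t, centroids in enumerate(frames):
        for c in centroids:
            # Extend the nearest tracklet if it is close enough; otherwise start a new one.
            best = min(tracklets, key=lambda tr: np.linalg.norm(c - tr[-1][1]), default=None)
            if best is not None and np.linalg.norm(c - best[-1][1]) < max_jump:
                best.append((t, c))
            else:
                tracklets.append([(t, c)])
    return tracklets

# Two timestamped frames: one object drifting from near (0, 0) toward (0.6, 0.6).
frames = [cluster_points(np.array([[0.0, 0.0], [0.1, 0.1]])),
          cluster_points(np.array([[0.6, 0.6]]))]
tracks = link_tracklets(frames)
print(len(tracks))  # 1: both detections joined into a single trajectory
```

A production tracker would replace the greedy pass with a density-based clusterer (e.g. DBSCAN) and global assignment between frames, but the sketch shows how timestamped point data becomes per-object trajectories.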
-
Publication number: 20230210230
Abstract: A watchband connection mechanism includes a watchband with a watchband housing and a cam rotatable between a first position, where the cam extends from the watchband into a watch ledge in a watch side cavity, and a second position, where the cam is recessed in the watchband housing. A watch includes a watch housing, a watch cavity in the watch housing for receiving at least a portion of the watchband, and a button coupled to the watch housing and movable with respect to the watch cavity. In operation, the watchband is received in the watch cavity of the watch housing, which enables the cam to rotate into the first position and extend from the band into the watch ledge. With the cam in the first position, the watch housing prevents the watchband from uncoupling from the watch because the cam is secured in the watch ledge. The user then depresses the button to move the cam into the second position, back within the watchband, to uncouple the watchband from the watch.
Type: Application
Filed: December 22, 2022
Publication date: July 6, 2023
Inventors: Thinh Tran, Kyle Hay
-
Patent number: 11599237
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Grant
Filed: February 12, 2021
Date of Patent: March 7, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Publication number: 20210165555
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Application
Filed: February 12, 2021
Publication date: June 3, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 10921949
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Grant
Filed: July 12, 2019
Date of Patent: February 16, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Publication number: 20190391724
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Application
Filed: July 12, 2019
Publication date: December 26, 2019
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 10353532
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Grant
Filed: February 19, 2015
Date of Patent: July 16, 2019
Assignee: LEAP MOTION, INC.
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Publication number: 20150320189
Abstract: The technology disclosed relates to providing devices and methods for attaching motion capture devices to head-mounted displays (HMDs) using existing features of the HMDs, with no modification to the design of the HMDs. A motion capture device is attached with an adapter to a wearable device that can be a personal HMD having a goggle form factor. The motion capture device is operable to be attached to or detached from an adapter, and the adapter is operable to be attached to or detached from an HMD. The motion capture device is attached to the HMD with an adapter in a fixed position and orientation. In embodiments, the attachment mechanism coupling the adapter to the HMD utilizes existing functional or ornamental elements of the HMD. Functional or ornamental elements of the HMD include air vents, bosses, grooves, recessed channels, slots formed where two parts connect, and openings for head straps.
Type: Application
Filed: May 8, 2015
Publication date: November 12, 2015
Applicant: LEAP MOTION, INC.
Inventors: Barry JU, Kyle A. HAY
-
Publication number: 20150326762
Abstract: The technology disclosed relates to providing devices and methods for attaching motion capture devices to head-mounted displays (HMDs) using existing features of the HMDs, with no modification to the design of the HMDs. A motion capture device is attached with an adapter to a wearable device that can be a personal HMD having a goggle form factor. The motion capture device is operable to be attached to or detached from an adapter, and the adapter is operable to be attached to or detached from an HMD. The motion capture device is attached to the HMD with an adapter in a fixed position and orientation. In embodiments, the attachment mechanism coupling the adapter to the HMD utilizes existing functional or ornamental elements of the HMD. Functional or ornamental elements of the HMD include air vents, bosses, grooves, recessed channels, slots formed where two parts connect, and openings for head straps.
Type: Application
Filed: May 8, 2015
Publication date: November 12, 2015
Applicant: LEAP MOTION, INC.
Inventors: Barry JU, Kyle A. HAY
-
Patent number: D726727
Type: Grant
Filed: August 3, 2012
Date of Patent: April 14, 2015
Assignee: Leap Motion, Inc.
Inventors: David Holz, Kyle Hay, Michael Buckwald