Patents by Inventor Kyle A. HAY

Kyle A. HAY has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240094860
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Application
    Filed: February 24, 2023
    Publication date: March 21, 2024
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
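    The abstract above describes tracking real objects by combining RGB and IR pixels from a wearable sensor system's cameras. As illustrative context only, a toy sketch of one way IR intensity could be used alongside RGB frames to localize and track a bright foreground object follows; the frame layout, threshold, and centroid-displacement approach are assumptions for this example, not details from the filing:

    ```python
    # Toy frames: each pixel is an (r, g, b, ir) tuple with values 0-255.
    # The IR channel segments a bright foreground object (e.g., an IR-lit hand),
    # and the segmented pixels' centroid is tracked from frame to frame.

    def segment_ir(frame, ir_threshold=200):
        """Return (row, col) coordinates of pixels whose IR value exceeds the threshold."""
        return [(r, c)
                for r, row in enumerate(frame)
                for c, (_, _, _, ir) in enumerate(row)
                if ir > ir_threshold]

    def centroid(pixels):
        """Mean (row, col) position of a list of pixel coordinates."""
        rows = [p[0] for p in pixels]
        cols = [p[1] for p in pixels]
        return (sum(rows) / len(rows), sum(cols) / len(cols))

    def motion_between(frame_a, frame_b):
        """Estimate object motion as the displacement of the IR-bright centroid."""
        ca = centroid(segment_ir(frame_a))
        cb = centroid(segment_ir(frame_b))
        return (cb[0] - ca[0], cb[1] - ca[1])
    ```

    A real system would fuse the RGB channels for appearance and operate on camera-resolution images; this sketch only shows the IR-driven segmentation-and-track step in miniature.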
  • Patent number: 11914025
    Abstract: A plurality of positional sensing devices are situated at regular intervals within an environment and collect data for tracking objects moving within the environment. Phase shifts of modulated Doppler pulses reflected from objects back to the sensing devices are measured and converted into positional data indicating positions of detected objects within the environment. Associated timestamp data is also collected by the positional sensing devices. The positional data and associated timestamp data are aggregated from the plurality of positional sensors, and the aggregated positional data is clustered to determine point clouds associated with the detected objects. The clusters are tracked by tracklets that record the position of each cluster over time. Trajectories for each detected object are determined by connecting tracklets associated with the same detected object.
    Type: Grant
    Filed: August 4, 2023
    Date of Patent: February 27, 2024
    Assignee: Density, Inc.
    Inventors: Andrew Farah, Casey Kelso, Christian Ayerh, John Shanley, Robert Grazioli, Benjamin Redfield, Garrett Bastable, Brian Weinreich, Kyle Hay
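    The pipeline this abstract describes (cluster timestamped positional data into point clouds, track each cluster with a tracklet, and connect tracklets into per-object trajectories) can be sketched roughly as follows. The data shapes, the greedy radius-based clustering, and the nearest-centroid linking threshold are illustrative stand-ins, not details from the patent:

    ```python
    from dataclasses import dataclass, field
    from math import dist

    # Hypothetical data shapes; the patent does not specify these structures.
    @dataclass
    class Detection:
        position: tuple   # (x, y) position derived from phase-shift measurements
        timestamp: float  # timestamp recorded by the sensing device

    @dataclass
    class Tracklet:
        points: list = field(default_factory=list)  # cluster centroids over time

    def cluster(detections, radius=1.0):
        """Greedily group detections within `radius` of a cluster's first member
        (a simple stand-in for real point-cloud clustering)."""
        clusters = []
        for d in detections:
            for c in clusters:
                if dist(d.position, c[0].position) <= radius:
                    c.append(d)
                    break
            else:
                clusters.append([d])
        return clusters

    def centroid(cluster):
        """Mean (x, y) position of a cluster of detections."""
        xs = [d.position[0] for d in cluster]
        ys = [d.position[1] for d in cluster]
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    def update_tracklets(tracklets, clusters, max_jump=2.0):
        """Extend each tracklet with the nearest new cluster centroid;
        start a new tracklet when no centroid is close enough."""
        cents = [centroid(c) for c in clusters]
        for t in tracklets:
            if not cents:
                break
            best = min(cents, key=lambda p: dist(p, t.points[-1]))
            if dist(best, t.points[-1]) <= max_jump:
                t.points.append(best)
                cents.remove(best)
        tracklets.extend(Tracklet(points=[p]) for p in cents)
        return tracklets
    ```

    Calling `update_tracklets` once per time step grows each tracklet's `points` list into a trajectory; a production system would add track termination, re-identification, and smarter association than nearest-centroid matching.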
  • Publication number: 20230210230
    Abstract: A watchband connection mechanism includes a watchband with a watchband housing and a cam rotatable between a first position, where the cam extends from the watchband into a watch ledge in a watch side cavity, and a second position, where the cam is recessed in the watchband housing. A watch includes a watch housing, a watch cavity in the watch housing for receiving at least a portion of the watchband, and a button coupled to the watch housing and movable with respect to the watch cavity. In operation, the watchband is received in the watch cavity of the watch housing, which enables the cam to rotate into the first position and extend from the band into the watch ledge. With the cam in the first position, the watch housing prevents the watchband from uncoupling from the watch because the cam is secured in the watch ledge. The user then depresses the button to move the cam into the second position, back within the watchband, to uncouple the watchband from the watch.
    Type: Application
    Filed: December 22, 2022
    Publication date: July 6, 2023
    Inventors: Thinh Tran, Kyle Hay
  • Patent number: 11599237
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Grant
    Filed: February 12, 2021
    Date of Patent: March 7, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Publication number: 20210165555
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Application
    Filed: February 12, 2021
    Publication date: June 3, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Patent number: 10921949
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Grant
    Filed: July 12, 2019
    Date of Patent: February 16, 2021
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Publication number: 20190391724
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Application
    Filed: July 12, 2019
    Publication date: December 26, 2019
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Patent number: 10353532
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
    Type: Grant
    Filed: February 19, 2015
    Date of Patent: July 16, 2019
    Assignee: LEAP MOTION, INC.
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Publication number: 20150320189
    Abstract: The technology disclosed relates to devices and methods for attaching motion capture devices to head-mounted displays (HMDs) using existing features of the HMDs, with no modification to the design of the HMDs. A motion capture device is attached with an adapter to a wearable device that can be a personal HMD having a goggle form factor. The motion capture device is operable to be attached to or detached from an adapter, and the adapter is operable to be attached to or detached from an HMD. The motion capture device is attached to the HMD with an adapter in a fixed position and orientation. In embodiments, the attachment mechanism coupling the adapter to the HMD utilizes existing functional or ornamental elements of the HMD. Functional or ornamental elements of the HMD include: air vents, bosses, grooves, recessed channels, slots formed where two parts connect, openings for head straps, etc.
    Type: Application
    Filed: May 8, 2015
    Publication date: November 12, 2015
    Applicant: LEAP MOTION, INC.
    Inventors: Barry JU, Kyle A. HAY
  • Publication number: 20150326762
    Abstract: The technology disclosed relates to devices and methods for attaching motion capture devices to head-mounted displays (HMDs) using existing features of the HMDs, with no modification to the design of the HMDs. A motion capture device is attached with an adapter to a wearable device that can be a personal HMD having a goggle form factor. The motion capture device is operable to be attached to or detached from an adapter, and the adapter is operable to be attached to or detached from an HMD. The motion capture device is attached to the HMD with an adapter in a fixed position and orientation. In embodiments, the attachment mechanism coupling the adapter to the HMD utilizes existing functional or ornamental elements of the HMD. Functional or ornamental elements of the HMD include: air vents, bosses, grooves, recessed channels, slots formed where two parts connect, openings for head straps, etc.
    Type: Application
    Filed: May 8, 2015
    Publication date: November 12, 2015
    Applicant: LEAP MOTION, INC.
    Inventors: Barry JU, Kyle A. HAY
  • Patent number: D726727
    Type: Grant
    Filed: August 3, 2012
    Date of Patent: April 14, 2015
    Assignee: Leap Motion, Inc.
    Inventors: David Holz, Kyle Hay, Michael Buckwald