Patents by Inventor Johnathon Scott Selstad

Johnathon Scott Selstad has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO). Because the abstracts below describe two recurring techniques, an AR headset calibration search and a gesture-driven virtual interaction method, short illustrative code sketches of both appear after the listing.

  • Patent number: 12169918
    Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to a screen and is viewable through a headset reflector; an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while the camera simultaneously observes the calibration image on the screen. One or more cameras are positioned to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created by an algorithm that searches through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the screen, as observed through the reflector by the camera, and the transform is then used by the headset to compensate for distortions.
    Type: Grant
    Filed: September 13, 2023
    Date of Patent: December 17, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Johnathon Scott Selstad, David Samuel Holz
  • Patent number: 12118134
    Abstract: The technology disclosed relates to a method for realistically simulating real-world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and a virtual object, in a virtual space, that the control object interacts with. In particular, it relates to detecting free-form gestures of the control object in the 3D sensory space and generating for display a 3D solid control object model, including sub-components of the control object, during the free-form gestures. In response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, the technology depicts, in the generated display, the virtual contact and the resulting motions of the virtual object caused by the 3D solid control object model.
    Type: Grant
    Filed: January 27, 2022
    Date of Patent: October 15, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
  • Patent number: 11941163
    Abstract: The technology disclosed relates to a method for realistically simulating real-world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and a virtual object, in a virtual space, that the control object interacts with. In particular, it relates to detecting free-form gestures of the control object in the 3D sensory space and generating for display a 3D solid control object model, including sub-components of the control object, during the free-form gestures. In response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, the technology depicts, in the generated display, the virtual contact and the resulting motions of the virtual object caused by the 3D solid control object model.
    Type: Grant
    Filed: January 27, 2022
    Date of Patent: March 26, 2024
    Assignee: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
  • Publication number: 20230419460
    Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to a screen and is viewable through a headset reflector; an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while the camera simultaneously observes the calibration image on the screen. One or more cameras are positioned to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created by an algorithm that searches through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the screen, as observed through the reflector by the camera, and the transform is then used by the headset to compensate for distortions.
    Type: Application
    Filed: September 13, 2023
    Publication date: December 28, 2023
    Inventors: Johnathon Scott SELSTAD, David Samuel HOLZ
  • Patent number: 11798141
    Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to an external screen and is viewable through a headset reflector; an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while the camera simultaneously observes the calibration image on the external screen. One or more cameras are positioned to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created by an algorithm that searches through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the external screen, as observed through the reflector by the camera, and the transform is then used by the headset to compensate for distortions.
    Type: Grant
    Filed: May 10, 2022
    Date of Patent: October 24, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Johnathon Scott Selstad, David Samuel Holz
  • Publication number: 20220270218
    Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to an external screen and is viewable through a headset reflector; an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while the camera simultaneously observes the calibration image on the external screen. One or more cameras are positioned to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created by an algorithm that searches through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the external screen, as observed through the reflector by the camera, and the transform is then used by the headset to compensate for distortions.
    Type: Application
    Filed: May 10, 2022
    Publication date: August 25, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Johnathon Scott SELSTAD, David Samuel HOLZ
  • Patent number: 11354787
    Abstract: The disclosed technology teaches an AR calibration system for compensating for AR headset distortions. A calibration image is provided to an external screen and is viewable through a headset reflector; an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while the camera simultaneously observes the calibration image on the external screen. The camera is positioned to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created by an algorithm that searches through projection positions of the inverse calibration image until the inverse image observed by the camera cancels out an acceptable portion of the calibration image provided to the external screen, as observed through the reflector by the camera, and the transform is then used by the headset to compensate for distortions.
    Type: Grant
    Filed: November 5, 2019
    Date of Patent: June 7, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Johnathon Scott Selstad, David Samuel Holz
  • Publication number: 20220147137
    Abstract: The technology disclosed relates to a method for realistically simulating real-world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and a virtual object, in a virtual space, that the control object interacts with. In particular, it relates to detecting free-form gestures of the control object in the 3D sensory space and generating for display a 3D solid control object model, including sub-components of the control object, during the free-form gestures. In response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, the technology depicts, in the generated display, the virtual contact and the resulting motions of the virtual object caused by the 3D solid control object model.
    Type: Application
    Filed: January 27, 2022
    Publication date: May 12, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur JOHNSTON, Johnathon Scott SELSTAD, Alex MARCOLINA
  • Patent number: 11237625
    Abstract: The technology disclosed relates to a method for realistically simulating real-world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and a virtual object, in a virtual space, that the control object interacts with. In particular, it relates to detecting free-form gestures of the control object in the 3D sensory space and generating for display a 3D solid control object model, including sub-components of the control object, during the free-form gestures. In response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, the technology depicts, in the generated display, the virtual contact and the resulting motions of the virtual object caused by the 3D solid control object model.
    Type: Grant
    Filed: November 24, 2020
    Date of Patent: February 1, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
  • Publication number: 20210081036
    Abstract: The technology disclosed relates to a method for realistically simulating real-world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and a virtual object, in a virtual space, that the control object interacts with. In particular, it relates to detecting free-form gestures of the control object in the 3D sensory space and generating for display a 3D solid control object model, including sub-components of the control object, during the free-form gestures. In response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, the technology depicts, in the generated display, the virtual contact and the resulting motions of the virtual object caused by the 3D solid control object model.
    Type: Application
    Filed: November 24, 2020
    Publication date: March 18, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur JOHNSTON, Johnathon Scott SELSTAD, Alex MARCOLINA
  • Patent number: 10866632
    Abstract: The technology disclosed relates to a method for realistically simulating real-world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and a virtual object, in a virtual space, that the control object interacts with. In particular, it relates to detecting free-form gestures of the control object in the 3D sensory space and generating for display a 3D solid control object model, including sub-components of the control object, during the free-form gestures. In response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, the technology depicts, in the generated display, the virtual contact and the resulting motions of the virtual object caused by the 3D solid control object model.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: December 15, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
  • Publication number: 20200143524
    Abstract: The disclosed technology teaches an AR calibration system for compensating for AR headset distortions. A calibration image is provided to an external screen and is viewable through a headset reflector; an inverse of the calibration image is provided to a headset display, reflected off the reflector, and observed by a camera of the system while the camera simultaneously observes the calibration image on the external screen. The camera is positioned to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created by an algorithm that searches through projection positions of the inverse calibration image until the inverse image observed by the camera cancels out an acceptable portion of the calibration image provided to the external screen, as observed through the reflector by the camera, and the transform is then used by the headset to compensate for distortions.
    Type: Application
    Filed: November 5, 2019
    Publication date: May 7, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Johnathon Scott SELSTAD, David Samuel HOLZ
  • Publication number: 20200097071
    Abstract: The technology disclosed relates to a method for realistically simulating real-world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and a virtual object, in a virtual space, that the control object interacts with. In particular, it relates to detecting free-form gestures of the control object in the 3D sensory space and generating for display a 3D solid control object model, including sub-components of the control object, during the free-form gestures. In response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, the technology depicts, in the generated display, the virtual contact and the resulting motions of the virtual object caused by the 3D solid control object model.
    Type: Application
    Filed: September 30, 2019
    Publication date: March 26, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur JOHNSTON, Johnathon Scott SELSTAD, Alex MARCOLINA
  • Patent number: 10429923
    Abstract: The technology disclosed relates to a method for realistically simulating real-world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and a virtual object, in a virtual space, that the control object interacts with. In particular, it relates to detecting free-form gestures of the control object in the 3D sensory space and generating for display a 3D solid control object model, including sub-components of the control object, during the free-form gestures. In response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, the technology depicts, in the generated display, the virtual contact and the resulting motions of the virtual object caused by the 3D solid control object model.
    Type: Grant
    Filed: May 25, 2017
    Date of Patent: October 1, 2019
    Assignee: Ultrahaptics IP Two Limited
    Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
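
Illustrative sketch 1: the calibration search described in the AR calibration abstracts above (e.g., patent numbers 12169918, 11798141, and 11354787). This is a minimal, hypothetical Python sketch of the core idea only: render the inverse calibration image at candidate projection positions and keep the position whose camera observation best cancels the calibration image seen through the reflector. The function names, the `render_inverse`/`capture_frame` stubs, the residual measure, and the brute-force grid search are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def residual_energy(observed_calibration, observed_inverse):
    """Energy remaining after the observed inverse image is overlaid on the
    observed calibration image. Perfect cancellation drives the combined
    frame toward a uniform field, so we measure deviation from its mean."""
    combined = (np.asarray(observed_calibration, dtype=float)
                + np.asarray(observed_inverse, dtype=float))
    return float(np.mean((combined - combined.mean()) ** 2))

def search_projection_offset(render_inverse, capture_frame, search_radius=20):
    """Grid-search the (dx, dy) projection position of the inverse calibration
    image that best cancels the calibration image seen through the reflector.

    render_inverse(dx, dy) -- hypothetical headset-display call that draws the
                              inverse image at the candidate offset.
    capture_frame()        -- hypothetical camera call returning a pair of
                              arrays (observed_calibration, observed_inverse).
    """
    best_offset = (0, 0)
    best_energy = float("inf")
    for dx in range(-search_radius, search_radius + 1):
        for dy in range(-search_radius, search_radius + 1):
            render_inverse(dx, dy)
            observed_cal, observed_inv = capture_frame()
            energy = residual_energy(observed_cal, observed_inv)
            if energy < best_energy:          # better cancellation found
                best_offset, best_energy = (dx, dy), energy
    return best_offset, best_energy           # offset feeds the distortion map
```

In practice such a search would presumably be repeated per control point of a distortion grid, and likely with a smarter optimizer than an exhaustive grid, with the winning offsets assembled into the distortion mapping transform the headset applies at render time.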
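
Illustrative sketch 2: the free-form gesture interaction described in the remaining abstracts above (e.g., patent numbers 12118134, 11941163, 11237625, 10866632, and 10429923). This hypothetical Python sketch makes simplifying assumptions: hand sub-components are approximated as spheres, the virtual object is a sphere, and "virtual contact" is resolved by pushing the object out of penetration so a display can show the contact and the object's resulting motion. None of these modeling choices are taken from the patents themselves.

```python
import numpy as np

class SolidControlObjectModel:
    """A 3D 'solid' control-object model: sub-components (e.g. finger bones)
    approximated as spheres with centers and radii, updated from tracking."""
    def __init__(self, centers, radii):
        self.centers = np.asarray(centers, dtype=float)   # shape (N, 3)
        self.radii = np.asarray(radii, dtype=float)       # shape (N,)

def resolve_virtual_contact(model, obj_center, obj_radius):
    """Detect sub-components in virtual contact with a spherical virtual
    object and return the object's displaced center, so the display can show
    both the contact and the object's resulting motion."""
    obj_center = np.asarray(obj_center, dtype=float)
    for center, radius in zip(model.centers, model.radii):
        offset = obj_center - center
        dist = np.linalg.norm(offset)
        penetration = (radius + obj_radius) - dist
        if penetration > 0:                               # virtual contact detected
            direction = offset / dist if dist > 1e-9 else np.array([0.0, 1.0, 0.0])
            obj_center = obj_center + direction * penetration   # push object out
    return obj_center

# Usage: a single fingertip sphere pressing into a virtual ball from the left.
hand = SolidControlObjectModel(centers=[[0.0, 0.0, 0.0]], radii=[0.01])
new_center = resolve_virtual_contact(hand, obj_center=[0.015, 0.0, 0.0], obj_radius=0.03)
print(new_center)  # ball center nudged to x = 0.04, just clear of the fingertip
```

A real pipeline would update the model's sphere centers every frame from the 3D sensory-space tracking data and fold the displacement into the virtual object's velocity rather than teleporting its center.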