Patents by Inventor Johnathon Scott Selstad
Johnathon Scott Selstad has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12169918
Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to a screen and viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector and observed by a camera of the system while it is simultaneously observing the calibration image on the screen. One or more cameras are located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Grant
Filed: September 13, 2023
Date of Patent: December 17, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Johnathon Scott Selstad, David Samuel Holz
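The search loop described in the abstract can be illustrated with a toy sketch: trial projection positions for the inverse image are scored by how completely they cancel the calibration image as seen by the camera, and the best-cancelling position is kept. This is a hypothetical illustration only, assuming a pure translational distortion and a synthetic noise calibration pattern; the names (`find_cancelling_offset`, `apply_distortion`, etc.) are illustrative and not from the patent.

```python
import numpy as np

def make_calibration_image(h=32, w=32, seed=0):
    # A random-noise pattern avoids the ambiguity a periodic
    # target (e.g. a checkerboard) would have under translation.
    rng = np.random.default_rng(seed)
    return rng.random((h, w))

def apply_distortion(img, dx, dy):
    # Stand-in for the headset optics: a pure (wrapping) translation.
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

def find_cancelling_offset(cal, true_dx, true_dy, search=4):
    # The camera sees the screen's calibration image plus the
    # headset's (distorted) inverse image reflected off the combiner.
    inverse = 1.0 - cal
    observed_inverse = apply_distortion(inverse, true_dx, true_dy)
    best, best_err = (0, 0), np.inf
    for dx in range(-search, search + 1):
        for dy in range(-search, search + 1):
            # Candidate projection position for the inverse image.
            candidate = apply_distortion(observed_inverse, -dx, -dy)
            # Perfect cancellation -> cal + inverse == 1 everywhere.
            err = np.abs(cal + candidate - 1.0).sum()
            if err < best_err:
                best, best_err = (dx, dy), err
    return best

cal = make_calibration_image()
print(find_cancelling_offset(cal, true_dx=2, true_dy=-1))  # -> (2, -1)
```

In the real system the recovered per-position corrections would populate a distortion mapping transform applied by the headset at render time; here a single global offset stands in for that map.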
-
Patent number: 12118134
Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
Type: Grant
Filed: January 27, 2022
Date of Patent: October 15, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
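The "virtual contact and resulting motions" step in this abstract can be sketched as a minimal contact-resolution routine: when the sensed control object overlaps a virtual object, the virtual object is displaced along the contact normal. This is a hypothetical toy model, assuming spheres approximate both the control object and the virtual object; all names are illustrative and not from the patent.

```python
import numpy as np

class VirtualObject:
    def __init__(self, center, radius):
        self.center = np.asarray(center, dtype=float)
        self.radius = radius

def resolve_contact(control_center, control_radius, obj):
    # On virtual contact (sphere overlap), push the virtual object
    # out along the contact normal so the solid control object model
    # never interpenetrates it; return whether contact occurred.
    delta = obj.center - np.asarray(control_center, dtype=float)
    dist = np.linalg.norm(delta)
    overlap = control_radius + obj.radius - dist
    if overlap > 0:
        normal = delta / dist
        obj.center = obj.center + normal * overlap
        return True
    return False

obj = VirtualObject([1.0, 0.0, 0.0], 0.5)
touched = resolve_contact([0.0, 0.0, 0.0], 0.75, obj)
print(touched, obj.center)  # -> True [1.25 0. 0.]
```

A full implementation would run this per sub-component of the hand model (finger segments, palm) every frame and integrate the resulting impulses, rather than displacing a single sphere.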
-
Patent number: 11941163
Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
Type: Grant
Filed: January 27, 2022
Date of Patent: March 26, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
-
Publication number: 20230419460
Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to a screen and viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector and observed by a camera of the system while it is simultaneously observing the calibration image on the screen. One or more cameras are located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Application
Filed: September 13, 2023
Publication date: December 28, 2023
Inventors: Johnathon Scott Selstad, David Samuel Holz
-
Patent number: 11798141
Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to an external screen and viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector and observed by a camera of the system while it is simultaneously observing the calibration image on the external screen. One or more cameras are located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the external screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Grant
Filed: May 10, 2022
Date of Patent: October 24, 2023
Assignee: Ultrahaptics IP Two Limited
Inventors: Johnathon Scott Selstad, David Samuel Holz
-
Publication number: 20220270218
Abstract: An AR calibration system for correcting AR headset distortions. A calibration image is provided to an external screen and viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector and observed by a camera of the system while it is simultaneously observing the calibration image on the external screen. One or more cameras are located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera(s) cancels out an acceptable portion of the calibration image provided to the external screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Application
Filed: May 10, 2022
Publication date: August 25, 2022
Applicant: Ultrahaptics IP Two Limited
Inventors: Johnathon Scott Selstad, David Samuel Holz
-
Patent number: 11354787
Abstract: The disclosed technology teaches an AR calibration system for compensating for AR headset distortions. A calibration image is provided to an external screen and viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector and observed by a camera of the system while it is simultaneously observing the calibration image on the external screen. The camera is located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera cancels out an acceptable portion of the calibration image provided to the external screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Grant
Filed: November 5, 2019
Date of Patent: June 7, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: Johnathon Scott Selstad, David Samuel Holz
-
Publication number: 20220147137
Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
Type: Application
Filed: January 27, 2022
Publication date: May 12, 2022
Applicant: Ultrahaptics IP Two Limited
Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
-
Patent number: 11237625
Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
Type: Grant
Filed: November 24, 2020
Date of Patent: February 1, 2022
Assignee: Ultrahaptics IP Two Limited
Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
-
Publication number: 20210081036
Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
Type: Application
Filed: November 24, 2020
Publication date: March 18, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
-
Patent number: 10866632
Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
Type: Grant
Filed: September 30, 2019
Date of Patent: December 15, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
-
Publication number: 20200143524
Abstract: The disclosed technology teaches an AR calibration system for compensating for AR headset distortions. A calibration image is provided to an external screen and viewable through a headset reflector, and an inverse of the calibration image is provided to a headset display, reflected off the reflector and observed by a camera of the system while it is simultaneously observing the calibration image on the external screen. The camera is located to represent a user's point of view and aligned to observe the inverse calibration image projected onto the reflector. A distortion mapping transform is created using an algorithm to search through projection positions of the inverse calibration image until the inverse image observed by the camera cancels out an acceptable portion of the calibration image provided to the external screen as observed through the reflector by the camera, and the transform is used by the headset to compensate for distortions.
Type: Application
Filed: November 5, 2019
Publication date: May 7, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: Johnathon Scott Selstad, David Samuel Holz
-
Publication number: 20200097071
Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
Type: Application
Filed: September 30, 2019
Publication date: March 26, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
-
Patent number: 10429923
Abstract: The technology disclosed relates to a method of realistic simulation of real world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting motions of the virtual object by the 3D solid control object model.
Type: Grant
Filed: May 25, 2017
Date of Patent: October 1, 2019
Assignee: Ultrahaptics IP Two Limited
Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina