Abstract: A method of operating a procedural training user interface system involves displaying an interactive guided process of a first user using at least one augmented reality (AR) layer through an AR device worn by a second user, where a representation of the first user's hands is displayed. Interactions of the second user may be detected during the interactive guided process as the second user attempts to superimpose the second user's hands on the representation of the first user's hands in the at least one AR layer. The interactive guided process of the second user may then be displayed using the AR layer through an AR device worn by the first user and the AR device worn by the second user. If the first user's hands and the second user's hands are not superimposed in the AR layer, the first user or the second user may be notified to take corrective action.
Type: Grant
Filed: August 29, 2022
Date of Patent: September 12, 2023
Assignee: LabLightAR, Inc.
Inventors: Roger Brent, John Max Kellum, James Ashley
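The abstract above does not specify how superimposition of the second user's hands on the displayed representation of the first user's hands would be evaluated. Below is a minimal sketch of one plausible check, assuming both hand poses are available as matching sets of 3D landmarks in the shared AR-layer coordinate system; the function name, landmark format, and tolerance are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def hands_superimposed(first_user_landmarks, second_user_landmarks, tolerance_m=0.03):
    """Return True if every landmark of the second user's hands lies within
    `tolerance_m` metres of the corresponding landmark of the first user's
    displayed hand representation.

    Both inputs are (N, 3) arrays of landmark positions in the same AR-layer
    coordinate system; the landmark set and the tolerance are assumed here,
    not specified by the abstract.
    """
    first = np.asarray(first_user_landmarks, dtype=float)
    second = np.asarray(second_user_landmarks, dtype=float)
    if first.shape != second.shape:
        raise ValueError("landmark sets must have matching shapes")
    offsets = np.linalg.norm(first - second, axis=1)  # per-landmark distance
    return bool(np.all(offsets <= tolerance_m))
```

A per-landmark distance test is only one choice; a real system might instead compare palm poses, fingertip positions, or a weighted error, and might smooth the decision over several frames to avoid flicker.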
Abstract: A method of operating a procedural training user interface system involves displaying an interactive guided process of a first user using at least one augmented reality (AR) layer through an AR device worn by a second user, where a representation of the first user's hands is displayed. Interactions of the second user may be detected during the interactive guided process as the second user attempts to superimpose the second user's hands on the representation of the first user's hands in the at least one AR layer. The interactive guided process of the second user may then be displayed using the AR layer through an AR device worn by the first user and the AR device worn by the second user. If the first user's hands and the second user's hands are not superimposed in the AR layer, the first user or the second user may be notified to take corrective action.
Type: Application
Filed: August 29, 2022
Publication date: March 2, 2023
Applicant: LabLightAR, Inc.
Inventors: Roger Brent, John Max Kellum, James Ashley
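The same abstract also calls for notifying the first user or the second user to take corrective action when the hands are not superimposed. A hedged sketch of that decision step follows; the `notify` callable stands in for whatever messaging path the two AR devices expose and is an assumed interface, as are the threshold and the message text.

```python
import numpy as np

def check_and_notify(first_landmarks, second_landmarks, notify, tolerance_m=0.03):
    """Compare the two users' hand landmarks and, if they are not superimposed
    within `tolerance_m`, call `notify(recipient, message)` for each user.
    Returns True when the hands are considered superimposed."""
    offsets = np.linalg.norm(
        np.asarray(first_landmarks, dtype=float) - np.asarray(second_landmarks, dtype=float),
        axis=1,
    )
    worst = float(offsets.max())
    if worst > tolerance_m:
        notify("first_user", f"Trainee hands are off by up to {worst:.3f} m.")
        notify("second_user", f"Align your hands with the guide overlay (off by up to {worst:.3f} m).")
        return False
    return True

# Example with a stand-in notifier:
# check_and_notify(guide_landmarks, trainee_landmarks, notify=lambda who, msg: print(who, msg))
```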
Abstract: A method of operating a procedural language and content generation system involves correlating environment objects and object movement to input controls through operation of a correlator, operating an interpreter to evaluate the correlation of the input controls and the objects and object movement against known libraries to generate programmatic instructions, storing the programmatic instructions as an instruction set, transforming the instruction set into executable commands through a compiler, and configuring control logic to perform the executable commands in response to receiving detected environment objects and detected object movement from an image processor.
Type: Grant
Filed: February 18, 2021
Date of Patent: July 12, 2022
Assignee: LabLightAR, Inc.
Inventors: Roger Brent, Jamie Douglas Tremaine, John Max Kellum
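This abstract names a correlator, an interpreter that consults known libraries, a stored instruction set, a compiler, and control logic driven by an image processor. The sketch below strings those stages together in the order the abstract lists them; every class name, data shape, and library entry here is an illustrative assumption, not the patented implementation.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class Observation:
    obj: str        # detected environment object, e.g. "pipette"
    movement: str   # detected object movement, e.g. "aspirate"

@dataclass
class InstructionSet:
    instructions: List[str] = field(default_factory=list)

def correlate(observations: List[Observation], controls: List[str]) -> List[Tuple[Observation, str]]:
    """Correlator: pair each observed object/movement with the input control
    active at the same moment (pairing by index is an assumed simplification)."""
    return list(zip(observations, controls))

# Stand-in "known library" mapping object/movement pairs to instruction templates.
KNOWN_LIBRARY: Dict[Tuple[str, str], str] = {
    ("pipette", "aspirate"): "ASPIRATE volume=10uL",
    ("pipette", "dispense"): "DISPENSE target=well_A1",
}

def interpret(correlations: List[Tuple[Observation, str]]) -> InstructionSet:
    """Interpreter: evaluate correlations against the known library and emit
    programmatic instructions, stored as an instruction set."""
    out = InstructionSet()
    for obs, control in correlations:
        template = KNOWN_LIBRARY.get((obs.obj, obs.movement))
        if template is not None:
            out.instructions.append(f"{template} ; triggered_by={control}")
    return out

def compile_instructions(instruction_set: InstructionSet) -> List[Callable[[], None]]:
    """Compiler: transform stored instructions into executable commands
    (represented here as simple callables)."""
    return [lambda text=text: print(f"EXECUTE: {text}") for text in instruction_set.instructions]

def control_loop(commands: List[Callable[[], None]], detections: List[Observation]) -> None:
    """Control logic: run the compiled commands when the image processor
    reports detected objects and movement."""
    if detections:
        for command in commands:
            command()

if __name__ == "__main__":
    observed = [Observation("pipette", "aspirate"), Observation("pipette", "dispense")]
    correlations = correlate(observed, controls=["voice:next", "gesture:tap"])
    instruction_set = interpret(correlations)
    commands = compile_instructions(instruction_set)
    control_loop(commands, detections=observed)
```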
Abstract: An augmented reality system for procedural guidance identifies a fiducial marker object in a frame of a first field of view generated by a camera, determines a pose of the fiducial marker object, applies the fiducial marker pose to generate a first transformation between a first coordinate system of the fiducial marker object and a second coordinate system of the camera, and applies a pose of a headset to determine a second transformation between the first coordinate system and a third coordinate system of the headset.
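The two-step chain described in this abstract, from the fiducial marker's coordinate system to the camera's and then to the headset's, is a composition of rigid transforms. The sketch below shows that composition with 4x4 homogeneous matrices; it assumes the headset pose supplies the camera-to-headset transform, and the frame conventions and numeric values are for illustration only.

```python
import numpy as np

def pose_to_transform(rotation_3x3, translation_3):
    """Build a 4x4 homogeneous transform from a rotation matrix and a
    translation vector, mapping points from the source frame to the target frame."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(rotation_3x3, dtype=float)
    T[:3, 3] = np.asarray(translation_3, dtype=float)
    return T

def marker_to_headset_transform(T_camera_from_marker, T_headset_from_camera):
    """Compose the camera<-marker transform with the headset<-camera transform
    so that marker-frame coordinates can be expressed in the headset frame."""
    return T_headset_from_camera @ T_camera_from_marker

# Example: a marker detected 0.5 m in front of the camera, with the camera
# mounted 0.1 m in front of the headset origin (identity rotations for simplicity).
T_cam_from_marker = pose_to_transform(np.eye(3), [0.0, 0.0, 0.5])
T_head_from_cam = pose_to_transform(np.eye(3), [0.0, 0.0, 0.1])
T_head_from_marker = marker_to_headset_transform(T_cam_from_marker, T_head_from_cam)
marker_origin_in_headset = T_head_from_marker @ np.array([0.0, 0.0, 0.0, 1.0])
print(marker_origin_in_headset[:3])  # -> [0.  0.  0.6]
```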