Patents by Inventor Larry Clay Greunke
Larry Clay Greunke has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12223605
Abstract: This disclosure and exemplary embodiments described herein provide a Parallel Content Authoring Method and Tool for Procedural Guidance, and a Remote Expert Method and System Utilizing Quantitative Quality Assurance in Mixed Reality. The implementation described herein relates to the generation of a content/instruction set that can be viewed in different modalities, including but not limited to mixed reality, VR, and audio text; however, it is to be understood that the scope of this disclosure is not limited to such applications.
Type: Grant
Filed: February 17, 2023
Date of Patent: February 11, 2025
Assignee: The Government of the United States of America, as represented by the Secretary of the Navy
Inventor: Larry Clay Greunke
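The patented method itself is not published as code, but the core idea of the abstract above — author a procedure once, then render it to several modalities — can be illustrated with a minimal sketch. All names here (`ProcedureStep`, `render_text`, `render_mixed_reality`) are hypothetical, not from the patent:

```python
from dataclasses import dataclass

# Hypothetical sketch: one authored step, rendered to multiple modalities.
@dataclass
class ProcedureStep:
    action: str                      # e.g. "Remove", "Inspect"
    target: str                      # the component the step applies to
    detail: str = ""                 # extra guidance text
    anchor: tuple = (0.0, 0.0, 0.0)  # 3D anchor a mixed-reality view could use

def render_text(step: ProcedureStep) -> str:
    """Text modality: a manual-style sentence."""
    out = f"{step.action} the {step.target}."
    return out + (f" {step.detail}" if step.detail else "")

def render_mixed_reality(step: ProcedureStep) -> dict:
    """Mixed-reality modality: the same step as an overlay payload."""
    return {"label": f"{step.action}: {step.target}", "anchor": step.anchor}

step = ProcedureStep("Inspect", "hydraulic line", "Check for leaks.", (1.2, 0.4, 0.9))
print(render_text(step))           # Inspect the hydraulic line. Check for leaks.
print(render_mixed_reality(step))  # {'label': 'Inspect: hydraulic line', ...}
```

The point of the single-source design is that the authored step, not any one rendering of it, is the artifact under version control; each modality is a pure projection of the same data.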
-
Patent number: 12182950
Abstract: This disclosure and exemplary embodiments described herein provide a Parallel Content Authoring Method and Tool for Procedural Guidance, and a Remote Expert Method and System Utilizing Quantitative Quality Assurance in Mixed Reality. The implementation described herein relates to the generation of a content/instruction set that can be viewed in different modalities, including but not limited to mixed reality, VR, and audio text; however, it is to be understood that the scope of this disclosure is not limited to such applications.
Type: Grant
Filed: February 17, 2023
Date of Patent: December 31, 2024
Assignee: The United States of America, as represented by the Secretary of the Navy
Inventor: Larry Clay Greunke
-
Publication number: 20240012954
Abstract: This disclosure and exemplary embodiments described herein provide methods and systems for a blockchain-based system for managing discrepancies, assigning procedures, ensuring quality assurance, and updating digital twins with real-world information. According to an embodiment, the system uses a combination of smart contracts and a decentralized network to facilitate transparency, security, and efficiency in various industries such as manufacturing, construction, and maintenance. The system includes multiple block types, each representing a specific aspect of the process: marking up discrepancies, assigning Parallel Content Authored instructions, completing tasks with quantitative quality assurance, and updating digital twins. By leveraging the inherent advantages of blockchain technology, this system provides an auditable, tamper-proof record of the entire process while streamlining interactions between different blocks.
Type: Application
Filed: May 30, 2023
Publication date: January 11, 2024
Inventor: Larry Clay Greunke
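The abstract above names four block types linked into one auditable record. As a hypothetical sketch only (the application's actual data layout, smart contracts, and network are not public), hash-chaining blocks tagged with those four types looks like this:

```python
import hashlib
import json

# Illustrative sketch of a hash-chained record with the four block types
# named in the abstract; field names here are assumptions, not the patent's.
BLOCK_TYPES = {"discrepancy", "assignment", "qa_completion", "twin_update"}

def make_block(block_type: str, payload: dict, prev_hash: str) -> dict:
    """Build a block that commits to its content and its predecessor's hash."""
    assert block_type in BLOCK_TYPES
    body = {"type": block_type, "payload": payload, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

def verify_chain(chain) -> bool:
    """Tamper check: every block must cite the hash of the block before it."""
    return all(b["prev_hash"] == a["hash"] for a, b in zip(chain, chain[1:]))

genesis = make_block("discrepancy", {"item": "corroded bracket"}, "0" * 64)
nxt = make_block("assignment", {"instruction_id": 42}, genesis["hash"])
print(verify_chain([genesis, nxt]))  # True
```

Because each block's hash covers the previous block's hash, rewriting any earlier block breaks every later link, which is the "auditable, tamper-proof record" property the abstract claims.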
-
Publication number: 20230343044
Abstract: This disclosure and exemplary embodiments described herein provide methods and systems for multimodal procedural guidance content creation and conversion. One of the implementations described herein relates to the generation of a content/instruction set that can be viewed in different modalities, including but not limited to mixed reality, VR, and audio text; however, it is to be understood that the scope of this disclosure is not limited to such applications.
Type: Application
Filed: May 30, 2023
Publication date: October 26, 2023
Inventor: Larry Clay Greunke
-
Publication number: 20230343042
Abstract: This disclosure and exemplary embodiments described herein provide methods and systems using mixed reality for the creation of in-situ CAD models, and methods and systems for multimodal procedural guidance content creation and conversion. One of the implementations described herein relates to the generation of a content/instruction set that can be viewed in different modalities, including but not limited to mixed reality, VR, and audio text; however, it is to be understood that the scope of this disclosure is not limited to such applications.
Type: Application
Filed: April 20, 2023
Publication date: October 26, 2023
Inventor: Larry Clay Greunke
-
Publication number: 20230343043
Abstract: This disclosure and exemplary embodiments described herein provide methods and systems using mixed reality for the creation of in-situ CAD models, and methods and systems for multimodal procedural guidance content creation and conversion. One of the implementations described herein relates to the generation of a content/instruction set that can be viewed in different modalities, including but not limited to mixed reality, VR, and audio text; however, it is to be understood that the scope of this disclosure is not limited to such applications.
Type: Application
Filed: April 20, 2023
Publication date: October 26, 2023
Inventor: Larry Clay Greunke
-
Publication number: 20230260224
Abstract: This disclosure and exemplary embodiments described herein provide a Parallel Content Authoring Method and Tool for Procedural Guidance, and a Remote Expert Method and System Utilizing Quantitative Quality Assurance in Mixed Reality. The implementation described herein relates to the generation of a content/instruction set that can be viewed in different modalities, including but not limited to mixed reality, VR, and audio text; however, it is to be understood that the scope of this disclosure is not limited to such applications.
Type: Application
Filed: February 17, 2023
Publication date: August 17, 2023
Inventor: Larry Clay Greunke
-
Publication number: 20230260415
Abstract: This disclosure and exemplary embodiments described herein provide a Parallel Content Authoring Method and Tool for Procedural Guidance, and a Remote Expert Method and System Utilizing Quantitative Quality Assurance in Mixed Reality. The implementation described herein relates to the generation of a content/instruction set that can be viewed in different modalities, including but not limited to mixed reality, VR, and audio text; however, it is to be understood that the scope of this disclosure is not limited to such applications.
Type: Application
Filed: February 17, 2023
Publication date: August 17, 2023
Inventor: Larry Clay Greunke
-
Patent number: 11138805
Abstract: The invention relates to quantitative quality assurance in a mixed reality environment. In some embodiments, the invention includes using mixed reality sensors embedded in a mixed reality device to detect body positional movements of a user and using an indirect measuring device to determine a target location for the current state of the target equipment and a current subtask of a predefined workflow. The invention further includes using a direct measuring device associated with the target location to detect a user interaction by the user at the target location, determining a confidence value based on the user movements, the current subtask, and the user interaction, and displaying confirmation of the user interaction on a mixed reality display of the user.
Type: Grant
Filed: October 19, 2020
Date of Patent: October 5, 2021
Assignee: The Government of the United States of America, as represented by the Secretary of the Navy
Inventors: Christopher James Angelopoulos, Larry Clay Greunke
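The abstract above fuses three signals — body movements, the current subtask, and a direct sensor interaction — into a single confidence value. The patent does not publish a formula; as one hypothetical way to realize the idea, a weighted sum of the normalized signals could look like this (weights and inputs are illustrative assumptions):

```python
# Hypothetical sketch: combine the three signals named in the abstract into
# one confidence value. Weights are illustrative, not from the patent.
def confidence(movement_match: float, subtask_match: float,
               interaction_detected: bool,
               weights=(0.3, 0.3, 0.4)) -> float:
    """Each signal is normalized to [0, 1]; the result is their weighted sum."""
    signals = (movement_match, subtask_match,
               1.0 if interaction_detected else 0.0)
    return sum(w * s for w, s in zip(weights, signals))

# e.g. body pose mostly agrees, correct subtask, direct sensor fired:
c = confidence(0.8, 1.0, True)
print(round(c, 2))  # 0.3*0.8 + 0.3*1.0 + 0.4*1.0 = 0.94
```

A threshold on this value would then decide whether the mixed-reality display confirms the step or asks the user to repeat it.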
-
Patent number: 11062523
Abstract: The invention relates to creating actual object data for mixed reality applications. In some embodiments, the invention includes using a mixed reality controller to (1) define a coordinate system frame of reference for a target object, the coordinate system frame of reference including an initial point of the target object and at least one directional axis that are specified by a user of the mixed reality controller, (2) define additional points of the target object, and (3) define interface elements of the target object. A 3D model of the target object is generated based on the coordinate system frame of reference, the additional points, and the interface elements. After receiving input metadata for defining interface characteristics for the interface elements displayed on the 3D model, the input metadata is used to generate a workflow for operating the target object in a mixed reality environment.
Type: Grant
Filed: July 15, 2020
Date of Patent: July 13, 2021
Assignee: The Government of the United States of America, as represented by the Secretary of the Navy
Inventors: Larry Clay Greunke, Mark Bilinski, Christopher James Angelopoulos, Michael Joseph Guerrero
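Step (1) of the abstract above — a user-specified initial point plus one directional axis defining the object's coordinate frame — has a standard geometric core: complete the single axis into an orthonormal basis, then express captured points relative to that frame. The sketch below is an illustrative assumption about that step, not the patented method; completing the basis via cross products is one common choice:

```python
# Hypothetical sketch: express controller-captured world points in an object
# frame defined by a user-chosen origin and one directional axis.
def norm(v):
    m = sum(c * c for c in v) ** 0.5
    return tuple(c / m for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def object_frame(origin, x_axis):
    """Complete one user-given axis into a right-handed orthonormal basis."""
    x = norm(x_axis)
    # any vector not parallel to x works as a helper for the cross products
    helper = (0.0, 0.0, 1.0) if abs(x[2]) < 0.9 else (0.0, 1.0, 0.0)
    z = norm(cross(x, helper))
    y = cross(z, x)  # already unit length: z and x are orthonormal
    return origin, (x, y, z)

def to_object_coords(p, origin, basis):
    """World-space point -> coordinates in the object's own frame."""
    d = tuple(p[i] - origin[i] for i in range(3))
    return tuple(sum(d[i] * axis[i] for i in range(3)) for axis in basis)

origin, basis = object_frame((1, 1, 0), (1, 0, 0))
print(to_object_coords((2, 1, 0), origin, basis))  # (1.0, 0.0, 0.0)
```

Once every captured point lives in the object's own frame, the resulting 3D model is independent of where the object happened to sit in the room when it was scanned.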
-
Publication number: 20210118234
Abstract: The invention relates to quantitative quality assurance in a mixed reality environment. In some embodiments, the invention includes using mixed reality sensors embedded in a mixed reality device to detect body positional movements of a user and using an indirect measuring device to determine a target location for the current state of the target equipment and a current subtask of a predefined workflow. The invention further includes using a direct measuring device associated with the target location to detect a user interaction by the user at the target location, determining a confidence value based on the user movements, the current subtask, and the user interaction, and displaying confirmation of the user interaction on a mixed reality display of the user.
Type: Application
Filed: October 19, 2020
Publication date: April 22, 2021
Inventors: Christopher James Angelopoulos, Larry Clay Greunke
-
Publication number: 20210019947
Abstract: The invention relates to creating actual object data for mixed reality applications. In some embodiments, the invention includes using a mixed reality controller to (1) define a coordinate system frame of reference for a target object, the coordinate system frame of reference including an initial point of the target object and at least one directional axis that are specified by a user of the mixed reality controller, (2) define additional points of the target object, and (3) define interface elements of the target object. A 3D model of the target object is generated based on the coordinate system frame of reference, the additional points, and the interface elements. After receiving input metadata for defining interface characteristics for the interface elements displayed on the 3D model, the input metadata is used to generate a workflow for operating the target object in a mixed reality environment.
Type: Application
Filed: July 15, 2020
Publication date: January 21, 2021
Inventors: Larry Clay Greunke, Mark Bilinski, Christopher James Angelopoulos, Michael Joseph Guerrero
-
Patent number: 10438413
Abstract: A method for using a virtual reality (VR) headset to view a two-dimensional (2D) technical drawing of a physical object in a real-world environment in three dimensions (3D), the method comprising: using LiDAR to produce a 3D point cloud of the real-world environment; scaling and aligning the 2D technical drawing to match the size and orientation of the physical object as depicted in the 3D point cloud; overlaying the 2D technical drawing (including all labels and dimensions) over the physical object as depicted in the 3D point cloud; and visually comparing the 3D point cloud representation of the physical object to the 2D technical drawing by simultaneously displaying the 3D point cloud of the real-world environment and the overlaid 2D technical drawing to a user with the VR headset.
Type: Grant
Filed: November 7, 2017
Date of Patent: October 8, 2019
Assignee: United States of America as represented by the Secretary of the Navy
Inventors: Mark Bilinski, Larry Clay Greunke
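The "scaling and aligning" step in the abstract above implies recovering a scale factor that maps drawing units onto the real-world units the LiDAR scan measures. As a hypothetical illustration of just that sub-step (the patent does not publish its algorithm), one way is to pick two landmarks known in both the drawing and the point cloud and take the ratio of their spacings:

```python
import math

# Hypothetical sketch of the scale step: two landmark points known in both
# the 2D drawing and the 3D point cloud give the drawing-to-world scale.
def distance(a, b):
    """Euclidean distance; math.dist handles 2D and 3D tuples alike."""
    return math.dist(a, b)

def drawing_scale(p2d_a, p2d_b, p3d_a, p3d_b):
    """Ratio of real-world landmark spacing to drawing-space spacing."""
    return distance(p3d_a, p3d_b) / distance(p2d_a, p2d_b)

# the drawing shows two flange holes 10 drawing-units apart; the LiDAR
# point cloud measures those same holes 0.25 m apart:
s = drawing_scale((0, 0), (10, 0), (1.0, 2.0, 0.5), (1.25, 2.0, 0.5))
print(s)  # 0.025 metres per drawing unit
```

A full alignment would also need a rotation and translation (e.g. from two or more landmark pairs), after which the drawing, with its labels and dimensions, can be rendered in place over the point cloud in the headset.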
-
Publication number: 20190139306
Abstract: A method for using a virtual reality (VR) headset to view a two-dimensional (2D) technical drawing of a physical object in a real-world environment in three dimensions (3D), the method comprising: using LiDAR to produce a 3D point cloud of the real-world environment; scaling and aligning the 2D technical drawing to match the size and orientation of the physical object as depicted in the 3D point cloud; overlaying the 2D technical drawing (including all labels and dimensions) over the physical object as depicted in the 3D point cloud; and visually comparing the 3D point cloud representation of the physical object to the 2D technical drawing by simultaneously displaying the 3D point cloud of the real-world environment and the overlaid 2D technical drawing to a user with the VR headset.
Type: Application
Filed: November 7, 2017
Publication date: May 9, 2019
Inventors: Mark Bilinski, Larry Clay Greunke