Patents by Inventor Larry Clay Greunke

Larry Clay Greunke has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240012954
    Abstract: This disclosure and the exemplary embodiments described herein provide a blockchain-based system for managing discrepancies, assigning procedures, ensuring quality assurance, and updating digital twins with real-world information. According to an embodiment, the system uses a combination of smart contracts and a decentralized network to facilitate transparency, security, and efficiency in industries such as manufacturing, construction, and maintenance. The system includes multiple block types, each representing a specific aspect of the process: marking up discrepancies, assigning Parallel Content Authored instructions, completing tasks with quantitative quality assurance, and updating digital twins. By leveraging the inherent advantages of blockchain technology, the system provides an auditable, tamper-proof record of the entire process while streamlining interactions between the different blocks. (A minimal sketch of such typed, hash-linked blocks follows this entry.)
    Type: Application
    Filed: May 30, 2023
    Publication date: January 11, 2024
    Inventor: Larry Clay Greunke
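    Sketch: The abstract above names four block types (discrepancy markup, instruction assignment, task completion with quantitative quality assurance, digital-twin update) linked into an auditable record. The minimal Python sketch below illustrates one way such typed, hash-linked blocks could be chained; the field names, payloads, and SHA-256 linking scheme are illustrative assumptions, not the patent's actual design.

        import hashlib
        import json
        import time
        from dataclasses import dataclass, field, asdict

        # Hypothetical block types mirroring the four process stages named in the abstract.
        BLOCK_TYPES = {"DISCREPANCY", "ASSIGNMENT", "COMPLETION_QA", "DIGITAL_TWIN_UPDATE"}

        @dataclass
        class Block:
            block_type: str      # one of BLOCK_TYPES
            payload: dict        # e.g. discrepancy markup, assigned instructions, QA scores
            previous_hash: str   # hash link to the prior block (tamper evidence)
            timestamp: float = field(default_factory=time.time)

            def digest(self) -> str:
                # Hash the block's canonical JSON form, including the link to its predecessor.
                return hashlib.sha256(json.dumps(asdict(self), sort_keys=True).encode()).hexdigest()

        class Chain:
            def __init__(self):
                self.blocks = []

            def append(self, block_type: str, payload: dict) -> Block:
                if block_type not in BLOCK_TYPES:
                    raise ValueError(f"unknown block type: {block_type}")
                prev = self.blocks[-1].digest() if self.blocks else "0" * 64
                block = Block(block_type, payload, prev)
                self.blocks.append(block)
                return block

            def verify(self) -> bool:
                # Recompute every hash link; editing any earlier block breaks the chain.
                return all(self.blocks[i].previous_hash == self.blocks[i - 1].digest()
                           for i in range(1, len(self.blocks)))

        # One pass through the four stages described in the abstract.
        chain = Chain()
        chain.append("DISCREPANCY", {"asset": "pump-12", "issue": "corrosion on flange"})
        chain.append("ASSIGNMENT", {"procedure": "PCA-041", "technician": "tech-7"})
        chain.append("COMPLETION_QA", {"procedure": "PCA-041", "qa_confidence": 0.97})
        chain.append("DIGITAL_TWIN_UPDATE", {"asset": "pump-12", "state": "flange replaced"})
        print(chain.verify())  # True while the record is untampered

    Because each block's hash covers its predecessor's hash, editing any earlier block invalidates every later link, which is the tamper-evidence property the abstract relies on.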
  • Publication number: 20230343043
    Abstract: This disclosure and the exemplary embodiments described herein provide methods and systems that use mixed reality for the creation of in-situ CAD models, as well as methods and systems for multimodal procedural guidance content creation and conversion; the scope of this disclosure, however, is not limited to such applications. One of the implementations described herein relates to the generation of a content/instruction set 1007 that can be viewed in different modalities, including but not limited to mixed reality 1012, VR 1012, and audio text 1008. (A minimal sketch of rendering one modality-neutral instruction step into several modalities follows this entry.)
    Type: Application
    Filed: April 20, 2023
    Publication date: October 26, 2023
    Inventor: Larry Clay Greunke
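    Sketch: The central idea in this family of applications is authoring a single content/instruction set and viewing it in several modalities. The minimal Python sketch below shows one way a modality-neutral step could be rendered as mixed-reality overlay data, VR scene data, and audio text; the data structure and field names are assumptions for illustration only.

        from dataclasses import dataclass

        @dataclass
        class InstructionStep:
            # A modality-neutral authoring unit: what to do, where, and how.
            action: str        # e.g. "remove", "inspect", "torque"
            target: str        # named part or location on the equipment
            detail: str        # free-text guidance
            anchor_xyz: tuple  # 3D anchor point used by the spatial modalities

        def to_mixed_reality(step: InstructionStep) -> dict:
            # Mixed-reality rendering: a world-anchored label at the target location.
            return {"overlay": "label", "position": step.anchor_xyz,
                    "text": f"{step.action.title()} {step.target}: {step.detail}"}

        def to_vr(step: InstructionStep) -> dict:
            # VR rendering: highlight the target in a virtual replica of the equipment.
            return {"highlight": step.target, "camera_focus": step.anchor_xyz,
                    "caption": step.detail}

        def to_audio_text(step: InstructionStep) -> str:
            # Audio/text rendering: a sentence that can be spoken or printed.
            return f"Step: {step.action} the {step.target}. {step.detail}"

        step = InstructionStep("inspect", "hydraulic line", "Check the fitting for leaks.",
                               (0.42, 1.10, -0.35))
        for render in (to_mixed_reality, to_vr, to_audio_text):
            print(render(step))

    The content is authored once, in InstructionStep; each renderer is a thin projection, so adding a new modality does not require re-authoring the procedure.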
  • Publication number: 20230343044
    Abstract: This disclosure and the exemplary embodiments described herein provide methods and systems for multimodal procedural guidance content creation and conversion; the scope of this disclosure, however, is not limited to such application. One of the implementations described herein relates to the generation of a content/instruction set that can be viewed in different modalities, including but not limited to mixed reality, VR, and audio text.
    Type: Application
    Filed: May 30, 2023
    Publication date: October 26, 2023
    Inventor: Larry Clay Greunke
  • Publication number: 20230343042
    Abstract: This disclosure and the exemplary embodiments described herein provide methods and systems that use mixed reality for the creation of in-situ CAD models, as well as methods and systems for multimodal procedural guidance content creation and conversion; the scope of this disclosure, however, is not limited to such applications. One of the implementations described herein relates to the generation of a content/instruction set 1007 that can be viewed in different modalities, including but not limited to mixed reality 1012, VR 1012, and audio text 1008.
    Type: Application
    Filed: April 20, 2023
    Publication date: October 26, 2023
    Inventor: Larry Clay Greunke
  • Publication number: 20230260224
    Abstract: This disclosure and the exemplary embodiments described herein provide a Parallel Content Authoring Method and Tool for Procedural Guidance, and a Remote Expert Method and System Utilizing Quantitative Quality Assurance in Mixed Reality. The implementation described herein relates to the generation of a content/instruction set that can be viewed in different modalities, including but not limited to mixed reality, VR, and audio text; the scope of this disclosure, however, is not limited to such application. (A minimal sketch of routing low-confidence steps to a remote expert follows this entry.)
    Type: Application
    Filed: February 17, 2023
    Publication date: August 17, 2023
    Inventor: Larry Clay Greunke
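    Sketch: The remote-expert aspect pairs each completed step with a quantitative quality-assurance score and brings in a human expert only when that score is low. The minimal Python sketch below illustrates such a routing rule; the 0.85 threshold, the StepResult fields, and the review queue are assumptions, not details from the filing.

        from dataclasses import dataclass

        CONFIDENCE_THRESHOLD = 0.85  # assumed cutoff; the filing does not specify one

        @dataclass
        class StepResult:
            step_id: str
            qa_confidence: float  # quantitative quality-assurance score for the step
            mr_snapshot: str      # e.g. path to captured mixed-reality imagery (hypothetical)

        def route_step(result: StepResult, expert_queue: list) -> str:
            # Auto-confirm high-confidence steps; escalate the rest to a remote expert.
            if result.qa_confidence >= CONFIDENCE_THRESHOLD:
                return "confirmed"
            expert_queue.append(result)  # the remote expert reviews the MR snapshot later
            return "pending expert review"

        queue = []
        print(route_step(StepResult("torque-bolt-3", 0.96, "snap_001.png"), queue))  # confirmed
        print(route_step(StepResult("seat-gasket", 0.62, "snap_002.png"), queue))    # pending expert review
        print(len(queue))  # 1 step awaiting the remote expert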
  • Publication number: 20230260415
    Abstract: This disclosure and the exemplary embodiments described herein provide a Parallel Content Authoring Method and Tool for Procedural Guidance, and a Remote Expert Method and System Utilizing Quantitative Quality Assurance in Mixed Reality. The implementation described herein relates to the generation of a content/instruction set that can be viewed in different modalities, including but not limited to mixed reality, VR, and audio text; the scope of this disclosure, however, is not limited to such application.
    Type: Application
    Filed: February 17, 2023
    Publication date: August 17, 2023
    Inventor: Larry Clay Greunke
  • Patent number: 11138805
    Abstract: The invention relates to quantitative quality assurance in a mixed reality environment. In some embodiments, the invention includes using mixed reality sensors embedded in a mixed reality device to detect body positional movements of a user and using an indirect measuring device to determine a target location for the current state of the target equipment and a current subtask of a predefined workflow. The invention further includes using a direct measuring device associated with the target location to detect a user interaction by the user at the target location, determining a confidence value based on the user movements, the current subtask, and the user interaction, and displaying confirmation of the user interaction on a mixed reality display of the user. (A minimal sketch of such a confidence-value computation follows this entry.)
    Type: Grant
    Filed: October 19, 2020
    Date of Patent: October 5, 2021
    Assignee: The Government of the United States of America, as represented by the Secretary of the Navy
    Inventors: Christopher James Angelopoulos, Larry Clay Greunke
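    Sketch: The claimed confidence value combines an indirect cue (tracked body position relative to the target location), the current subtask of the workflow, and a direct measurement at the target. The minimal Python sketch below blends those three cues into a single score; the Gaussian falloff, the weights, and the 0-1 scale are illustrative assumptions.

        import math

        def qa_confidence(hand_xyz, target_xyz, expected_action, detected_action,
                          direct_sensor_triggered, sigma=0.15):
            # 1) Indirect cue: how close the tracked hand is to the expected target location.
            dist = math.dist(hand_xyz, target_xyz)
            proximity = math.exp(-(dist ** 2) / (2 * sigma ** 2))
            # 2) Subtask cue: does the recognized action match the current workflow subtask?
            action_match = 1.0 if detected_action == expected_action else 0.0
            # 3) Direct cue: did a sensor at the target location (switch, torque sensor) fire?
            direct = 1.0 if direct_sensor_triggered else 0.0
            # Weighted blend on a 0-1 scale; the weights are illustrative only.
            return 0.3 * proximity + 0.2 * action_match + 0.5 * direct

        # The user reached the valve, performed the expected turn, and the valve's own
        # sensor confirmed the change of state, so the confidence is high.
        print(round(qa_confidence((0.31, 1.02, -0.12), (0.30, 1.00, -0.10),
                                  "turn_valve", "turn_valve", True), 3))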
  • Patent number: 11062523
    Abstract: The invention relates to creating actual object data for mixed reality applications. In some embodiments, the invention includes using a mixed reality controller to (1) define a coordinate system frame of reference for a target object, the coordinate system frame of reference including an initial point of the target object and at least one directional axis that are specified by a user of the mixed reality controller, (2) define additional points of the target object, and (3) define interface elements of the target object. A 3D model of the target object is generated based on the coordinate system frame of reference, the additional points, and the interface elements. After receiving input metadata for defining interface characteristics for the interface elements displayed on the 3D model, the input metadata is used to generate a workflow for operating the target object in a mixed reality environment. (A minimal sketch of building the object's local frame, points, and interface elements follows this entry.)
    Type: Grant
    Filed: July 15, 2020
    Date of Patent: July 13, 2021
    Assignee: The Government of the United States of America, as represented by the Secretary of the Navy
    Inventors: Larry Clay Greunke, Mark Bilinski, Christopher James Angelopoulos, Michael Joseph Guerrero
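    Sketch: The method builds a coordinate frame for the target object from a user-picked initial point and one directional axis, then records additional points and interface elements in that frame. The minimal Python sketch below shows that geometric bookkeeping; the helper names, the world-up convention, and the metadata shape are assumptions for illustration.

        from dataclasses import dataclass, field

        def _sub(a, b): return tuple(ai - bi for ai, bi in zip(a, b))
        def _dot(a, b): return sum(ai * bi for ai, bi in zip(a, b))
        def _cross(a, b): return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
        def _norm(a):
            m = sum(x * x for x in a) ** 0.5
            return tuple(x / m for x in a)

        @dataclass
        class ObjectModel:
            origin: tuple                                    # initial point picked with the MR controller
            axes: tuple                                      # orthonormal frame derived from one picked axis
            points: dict = field(default_factory=dict)       # named additional points (local coordinates)
            interfaces: dict = field(default_factory=dict)   # named interface elements plus metadata

        def make_frame(origin, axis_point, world_up=(0.0, 0.0, 1.0)):
            # Build a right-handed frame from the picked origin and one directional axis.
            # (Assumes the picked axis is not parallel to world_up.)
            x = _norm(_sub(axis_point, origin))
            y = _norm(_cross(world_up, x))
            z = _cross(x, y)
            return ObjectModel(origin, (x, y, z))

        def add_point(model, name, world_point):
            # Express a controller-picked world point in the object's local frame.
            d = _sub(world_point, model.origin)
            model.points[name] = tuple(_dot(d, axis) for axis in model.axes)

        def add_interface(model, name, point_name, metadata):
            # Attach interface metadata (e.g. a toggle switch and its valid states) to a point;
            # a downstream tool could turn these entries into workflow steps.
            model.interfaces[name] = {"at": point_name, **metadata}

        panel = make_frame(origin=(1.0, 2.0, 0.0), axis_point=(2.0, 2.0, 0.0))
        add_point(panel, "power_switch", (1.2, 2.1, 0.3))
        add_interface(panel, "power_switch", "power_switch",
                      {"kind": "toggle", "states": ["off", "on"]})
        print(panel.points["power_switch"], panel.interfaces["power_switch"]["kind"])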
  • Publication number: 20210118234
    Abstract: The invention relates to quantitative quality assurance in a mixed reality environment. In some embodiments, the invention includes using mixed reality sensors embedded in a mixed reality device to detect body positional movements of a user and using an indirect measuring device to determine a target location for the current state of the target equipment and a current subtask of a predefined workflow. The invention further includes using a direct measuring device associated with the target location to detect a user interaction by the user at the target location, determining a confidence value based on the user movements, the current subtask, and the user interaction, and displaying confirmation of the user interaction on a mixed reality display of the user.
    Type: Application
    Filed: October 19, 2020
    Publication date: April 22, 2021
    Inventors: Christopher James Angelopoulos, Larry Clay Greunke
  • Publication number: 20210019947
    Abstract: The invention relates to creating actual object data for mixed reality applications. In some embodiments, the invention includes using a mixed reality controller to (1) define a coordinate system frame of reference for a target object, the coordinate system frame of reference including an initial point of the target object and at least one directional axis that are specified by a user of the mixed reality controller, (2) define additional points of the target object, and (3) define interface elements of the target object. A 3D model of the target object is generated based on the coordinate system frame of reference, the additional points, and the interface elements. After receiving input metadata for defining interface characteristics for the interface elements displayed on the 3D model, the input metadata is used to generate a workflow for operating the target object in a mixed reality environment.
    Type: Application
    Filed: July 15, 2020
    Publication date: January 21, 2021
    Inventors: Larry Clay Greunke, Mark Bilinski, Christopher James Angelopoulos, Michael Joseph Guerrero
  • Patent number: 10438413
    Abstract: A method for using a virtual reality (VR) headset to view a two-dimensional (2D) technical drawing of a physical object in a real-world environment in three dimensions (3D), the method comprising: using LiDAR to produce a 3D point cloud of the real-world environment; scaling and aligning the 2D technical drawing to match the size and orientation of the physical object as depicted in the 3D point cloud; overlaying the 2D technical drawing (including all labels and dimensions) over the physical object as depicted in the 3D point cloud; and visually comparing the 3D point cloud representation of the physical object to the 2D technical drawing by simultaneously displaying the 3D point cloud of the real-world environment and the overlaid 2D technical drawing to a user with the VR headset. (A minimal sketch of the scale-and-align step follows this entry.)
    Type: Grant
    Filed: November 7, 2017
    Date of Patent: October 8, 2019
    Assignee: United States of America as represented by the Secretary of the Navy
    Inventors: Mark Bilinski, Larry Clay Greunke
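    Sketch: The core geometric step is scaling and aligning the 2D technical drawing so it matches the size and orientation of the object in the LiDAR point cloud. The minimal Python sketch below fits a 2D similarity transform (uniform scale, rotation, translation) from two corresponding reference points; it assumes the cloud points have already been projected onto the drawing's plane, and all values are illustrative.

        def fit_similarity(drawing_pts, cloud_pts):
            # drawing_pts / cloud_pts: two matching (x, y) pairs each.
            # Complex arithmetic keeps the similarity-transform math compact.
            d1, d2 = (complex(*p) for p in drawing_pts)
            c1, c2 = (complex(*p) for p in cloud_pts)
            a = (c2 - c1) / (d2 - d1)  # encodes uniform scale and rotation
            b = c1 - a * d1            # encodes translation
            return a, b

        def apply(a, b, point):
            # Map a drawing coordinate into point-cloud (plane) coordinates.
            z = a * complex(*point) + b
            return (z.real, z.imag)

        # Two corners of a flange in drawing units and the same corners measured in the
        # point cloud (meters). The numbers here are purely illustrative.
        a, b = fit_similarity([(0.0, 0.0), (10.0, 0.0)], [(2.00, 1.00), (2.00, 1.25)])
        print(apply(a, b, (5.0, 0.0)))  # the drawing's midpoint lands midway between the cloud corners
        print(abs(a))                   # recovered scale factor (0.025 m per drawing unit)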
  • Publication number: 20190139306
    Abstract: A method for using a virtual reality (VR) headset to view a two-dimensional (2D) technical drawing of a physical object in a real-world environment in three dimensions (3D), the method comprising: using LiDAR to produce a 3D point cloud of the real-world environment; scaling and aligning the 2D technical drawing to match the size and orientation of the physical object as depicted in the 3D point cloud; overlaying the 2D technical drawing (including all labels and dimensions) over the physical object as depicted in the 3D point cloud; and visually comparing the 3D point cloud representation of the physical object to the 2D technical drawing by simultaneously displaying the 3D point cloud of the real-world environment and the overlaid 2D technical drawing to a user with the VR headset.
    Type: Application
    Filed: November 7, 2017
    Publication date: May 9, 2019
    Inventors: Mark Bilinski, Larry Clay Greunke