Patents by Inventor Christopher James Angelopoulos

Christopher James Angelopoulos has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11138805
    Abstract: The invention relates to quantitative quality assurance in a mixed reality environment. In some embodiments, the invention includes using mixed reality sensors embedded in a mixed reality device to detect body positional movements of a user and using an indirect measuring device to determine a target location for the current state of the target equipment and a current subtask of a predefined workflow. The invention further includes using a direct measuring device associated with the target location to detect a user interaction by the user at the target location, determining a confidence value based on the user movements, the current subtask, and the user interaction, and displaying confirmation of the user interaction on a mixed reality display of the user.
    Type: Grant
    Filed: October 19, 2020
    Date of Patent: October 5, 2021
    Assignee: The Government of the United States of America, as represented by the Secretary of the Navy
    Inventors: Christopher James Angelopoulos, Larry Clay Greunke
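The quality-assurance flow described in this abstract (track body movements, locate the target via an indirect measuring device, confirm contact via a direct measuring device, then score the interaction) can be sketched roughly as follows. This is a minimal illustrative sketch only: the class names, signal weights, and scoring formula are assumptions for exposition, not details taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One snapshot of the three signals named in the abstract."""
    movement_match: float      # 0..1: how well tracked body motion fits the current subtask
    at_target_location: bool   # indirect measuring device places the user at the expected spot
    interaction_detected: bool # direct measuring device registered an interaction there

def confidence_value(obs: Observation) -> float:
    """Blend the three signals into a single 0..1 confidence score.

    The weights below are illustrative assumptions; the patent only states
    that confidence is determined from these inputs, not how.
    """
    score = 0.4 * obs.movement_match
    if obs.at_target_location:
        score += 0.3
    if obs.interaction_detected:
        score += 0.3
    return round(score, 3)

if __name__ == "__main__":
    obs = Observation(movement_match=0.9,
                      at_target_location=True,
                      interaction_detected=True)
    print(confidence_value(obs))  # 0.96
```

In this sketch, a high score would trigger the confirmation shown on the user's mixed reality display; thresholds for that step are likewise left unspecified by the abstract.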
  • Patent number: 11062523
Abstract: The invention relates to creating actual object data for mixed reality applications. In some embodiments, the invention includes using a mixed reality controller to (1) define a coordinate system frame of reference for a target object, the coordinate system frame of reference including an initial point of the target object and at least one directional axis that are specified by a user of the mixed reality controller, (2) define additional points of the target object, and (3) define interface elements of the target object. A 3D model of the target object is generated based on the coordinate system frame of reference, the additional points, and the interface elements. After receiving input metadata for defining interface characteristics for the interface elements displayed on the 3D model, the input metadata is used to generate a workflow for operating the target object in a mixed reality environment.
    Type: Grant
    Filed: July 15, 2020
    Date of Patent: July 13, 2021
    Assignee: The Government of the United States of America, as represented by the Secretary of the Navy
    Inventors: Larry Clay Greunke, Mark Bilinski, Christopher James Angelopoulos, Michael Joseph Guerrero
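The capture flow in this abstract (user anchors a coordinate frame on the target object, marks additional points and interface elements, then metadata on those elements drives a generated workflow) might be modeled with data structures like these. All class, field, and function names here are hypothetical illustrations; the patent does not specify a representation.

```python
from dataclasses import dataclass, field
from typing import Tuple, List, Dict

Vec3 = Tuple[float, float, float]

@dataclass
class InterfaceElement:
    """An interface element of the target object, plus its input metadata."""
    name: str
    position: Vec3
    metadata: Dict[str, str] = field(default_factory=dict)  # e.g. {"type": "toggle"}

@dataclass
class TargetObject:
    """Coordinate frame and geometry captured via the mixed reality controller."""
    origin: Vec3                  # initial point specified by the user
    axis: Vec3                    # user-specified directional axis
    points: List[Vec3] = field(default_factory=list)
    elements: List[InterfaceElement] = field(default_factory=list)

def build_workflow(obj: TargetObject) -> List[str]:
    """Turn annotated interface elements into ordered workflow steps."""
    return [f"Operate '{e.name}' ({e.metadata.get('type', 'unknown')})"
            for e in obj.elements]

if __name__ == "__main__":
    panel = TargetObject(origin=(0.0, 0.0, 0.0), axis=(0.0, 0.0, 1.0))
    panel.elements.append(InterfaceElement("power", (0.1, 0.2, 0.0),
                                           {"type": "toggle"}))
    print(build_workflow(panel))  # ["Operate 'power' (toggle)"]
```

A real implementation would also generate the 3D model from the frame and points; this sketch only shows how element metadata could map to workflow steps.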
  • Publication number: 20210118234
    Abstract: The invention relates to quantitative quality assurance in a mixed reality environment. In some embodiments, the invention includes using mixed reality sensors embedded in a mixed reality device to detect body positional movements of a user and using an indirect measuring device to determine a target location for the current state of the target equipment and a current subtask of a predefined workflow. The invention further includes using a direct measuring device associated with the target location to detect a user interaction by the user at the target location, determining a confidence value based on the user movements, the current subtask, and the user interaction, and displaying confirmation of the user interaction on a mixed reality display of the user.
    Type: Application
    Filed: October 19, 2020
    Publication date: April 22, 2021
    Inventors: Christopher James Angelopoulos, Larry Clay Greunke
  • Publication number: 20210019947
Abstract: The invention relates to creating actual object data for mixed reality applications. In some embodiments, the invention includes using a mixed reality controller to (1) define a coordinate system frame of reference for a target object, the coordinate system frame of reference including an initial point of the target object and at least one directional axis that are specified by a user of the mixed reality controller, (2) define additional points of the target object, and (3) define interface elements of the target object. A 3D model of the target object is generated based on the coordinate system frame of reference, the additional points, and the interface elements. After receiving input metadata for defining interface characteristics for the interface elements displayed on the 3D model, the input metadata is used to generate a workflow for operating the target object in a mixed reality environment.
    Type: Application
    Filed: July 15, 2020
    Publication date: January 21, 2021
    Inventors: Larry Clay Greunke, Mark Bilinski, Christopher James Angelopoulos, Michael Joseph Guerrero