Patents by Inventor Benjamin Michael Bishop

Benjamin Michael Bishop has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents granted by the United States Patent and Trademark Office (USPTO). Short, illustrative code sketches of the techniques described in the abstracts follow the listing.

  • Patent number: 11244515
    Abstract: In a method of mapping a real-world process control environment, a mobile device is registered at a reference location, and positions and orientations of the mobile device are tracked using an inertial measurement unit. A user input indicating that a new node is to be added to a 3D map of the process control environment is detected, and a 3D position of a real-world object relative to the reference location is determined, or caused to be determined, based on a tracked position and orientation of the mobile device. A node database is caused to add the new node to the 3D map of the process control environment, at least by causing the 3D position of the real-world object to be stored in association with the new node.
    Type: Grant
    Filed: September 8, 2020
    Date of Patent: February 8, 2022
    Assignee: FISHER-ROSEMOUNT SYSTEMS, INC.
    Inventors: James Aaron Crews, Trevor Duncan Schleiss, Benjamin Michael Bishop
  • Patent number: 11080931
    Abstract: In a method of providing virtual enhanced vision to a user of an augmented reality (AR) mobile device, it is determined that a first node associated with a map of a process control environment corresponds to a first real-world object currently within a field of view of a camera of the AR mobile device. A relationship between the first node and one or more other nodes is determined, with the relationship indicating that one or more other objects corresponding to other nodes are at least partially obscured by the first object. At least partially in response to determining the relationship, one or more digital models or images depicting the other object(s) is/are retrieved from memory. A display of the AR mobile device is caused to present the retrieved digital models or images to the user while the first object is in the field of view of the camera.
    Type: Grant
    Filed: July 16, 2018
    Date of Patent: August 3, 2021
    Assignee: FISHER-ROSEMOUNT SYSTEMS, INC.
    Inventors: James Aaron Crews, Trevor Duncan Schleiss, Benjamin Michael Bishop
  • Patent number: 11062517
    Abstract: In a method of facilitating interaction between a user of an augmented reality (AR) mobile device and a first real-world object, a display device is caused to superimpose digital information on portions of a process control environment within a field of view of a camera of the device. The superimposed information is associated with nodes in a map of the environment, and the nodes correspond to other objects in the environment. The display is caused to indicate a direction to the first object. After detecting a user input that indicates selection of the first object, the display is caused to superimpose, on a portion of the process control environment currently within the field of view, a digital model or image of the first object. A user interface is caused to provide one or more virtual controls and/or one or more displays associated with the first object.
    Type: Grant
    Filed: July 16, 2018
    Date of Patent: July 13, 2021
    Assignee: FISHER-ROSEMOUNT SYSTEMS, INC.
    Inventors: James Aaron Crews, Trevor Duncan Schleiss, Benjamin Michael Bishop
  • Publication number: 20200402320
    Abstract: In a method of mapping a real-world process control environment, a mobile device is registered at a reference location, and positions and orientations of the mobile device are tracked using an inertial measurement unit. A user input indicating that a new node is to be added to a 3D map of the process control environment is detected, and a 3D position of a real-world object relative to the reference location is determined, or caused to be determined, based on a tracked position and orientation of the mobile device. A node database is caused to add the new node to the 3D map of the process control environment, at least by causing the 3D position of the real-world object to be stored in association with the new node.
    Type: Application
    Filed: September 8, 2020
    Publication date: December 24, 2020
    Inventors: James Aaron Crews, Trevor Duncan Schleiss, Benjamin Michael Bishop
  • Patent number: 10796487
    Abstract: In a method of mapping a real-world process control environment, a mobile device is registered at a reference location, and 3D positions and orientations of the mobile device are tracked using an inertial measurement unit. A user input indicating that a new node is to be added to a 3D map of the process control environment is detected, and a 3D position of a real-world object relative to the reference location is determined, or caused to be determined, based on a tracked 3D position and orientation of the mobile device. A node database is caused to add the new node to the 3D map of the process control environment, at least by causing the 3D position of the real-world object to be stored in association with the new node.
    Type: Grant
    Filed: July 16, 2018
    Date of Patent: October 6, 2020
    Assignee: FISHER-ROSEMOUNT SYSTEMS, INC.
    Inventors: James Aaron Crews, Trevor Duncan Schleiss, Benjamin Michael Bishop
  • Publication number: 20190096132
    Abstract: In a method of providing virtual enhanced vision to a user of an augmented reality (AR) mobile device, it is determined that a first node associated with a map of a process control environment corresponds to a first real-world object currently within a field of view of a camera of the AR mobile device. A relationship between the first node and one or more other nodes is determined, with the relationship indicating that one or more other objects corresponding to other nodes are at least partially obscured by the first object. At least partially in response to determining the relationship, one or more digital models or images depicting the other object(s) is/are retrieved from memory. A display of the AR mobile device is caused to present the retrieved digital models or images to the user while the first object is in the field of view of the camera.
    Type: Application
    Filed: July 16, 2018
    Publication date: March 28, 2019
    Inventors: James Aaron Crews, Trevor Duncan Schleiss, Benjamin Michael Bishop
  • Publication number: 20190096131
    Abstract: In a method of mapping a real-world process control environment, a mobile device is registered at a reference location, and 3D positions and orientations of the mobile device are tracked using an inertial measurement unit. A user input indicating that a new node is to be added to a 3D map of the process control environment is detected, and a 3D position of a real-world object relative to the reference location is determined, or caused to be determined, based on a tracked 3D position and orientation of the mobile device. A node database is caused to add the new node to the 3D map of the process control environment, at least by causing the 3D position of the real-world object to be stored in association with the new node.
    Type: Application
    Filed: July 16, 2018
    Publication date: March 28, 2019
    Inventors: James Aaron Crews, Trevor Duncan Schleiss, Benjamin Michael Bishop
  • Publication number: 20190096133
    Abstract: In a method of facilitating interaction between a user of an augmented reality (AR) mobile device and a first real-world object, a display device is caused to superimpose digital information on portions of a process control environment within a field of view of a camera of the device. The superimposed information is associated with nodes in a map of the environment, and the nodes correspond to other objects in the environment. The display is caused to indicate a direction to the first object. After detecting a user input that indicates selection of the first object, the display is caused to superimpose, on a portion of the process control environment currently within the field of view, a digital model or image of the first object. A user interface is caused to provide one or more virtual controls and/or one or more displays associated with the first object.
    Type: Application
    Filed: July 16, 2018
    Publication date: March 28, 2019
    Inventors: James Aaron Crews, Trevor Duncan Schleiss, Benjamin Michael Bishop
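
Illustrative sketches

The mapping method described in patent 11244515 (and in related filings 10796487, 20200402320, and 20190096131) registers a mobile device at a reference location, tracks its pose with an inertial measurement unit, and stores each newly mapped object's 3D position in a node database. The Python sketch below is a minimal reading of that flow; every name in it (NodeDatabase, MappingSession, update_from_imu, and so on) is invented for illustration and does not come from the patents.

    import numpy as np

    class NodeDatabase:
        """Hypothetical store for nodes in the 3D map."""
        def __init__(self):
            self.nodes = {}

        def add_node(self, name, position):
            # Store the object's 3D position (relative to the reference
            # location) in association with the new node.
            self.nodes[name] = {"position": position}

    class MappingSession:
        """Tracks a mobile device's pose relative to a reference location."""
        def __init__(self, node_db):
            self.node_db = node_db
            self.position = np.zeros(3)   # translation from the reference location
            self.orientation = np.eye(3)  # rotation: device frame -> reference frame

        def register_at_reference(self):
            # Registration zeroes the tracked pose; all later positions are
            # expressed relative to this reference location.
            self.position = np.zeros(3)
            self.orientation = np.eye(3)

        def update_from_imu(self, delta_position, delta_rotation):
            # Simplified dead reckoning: fold in the IMU-derived rotation,
            # then map the device-frame translation increment into the
            # reference frame.
            self.orientation = self.orientation @ delta_rotation
            self.position = self.position + self.orientation @ delta_position

        def add_node_for_object(self, name, offset_in_device_frame):
            # Turn an offset from the device to the object (e.g., estimated
            # depth along the camera axis) into a reference-frame position,
            # then cause the node database to add the new node.
            object_position = self.position + self.orientation @ offset_in_device_frame
            self.node_db.add_node(name, object_position)
            return object_position

    # Walk 2 m forward from the reference, then map an object 1.5 m ahead.
    db = NodeDatabase()
    session = MappingSession(db)
    session.register_at_reference()
    session.update_from_imu(np.array([0.0, 0.0, 2.0]), np.eye(3))
    session.add_node_for_object("valve-101", np.array([0.0, 0.0, 1.5]))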
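
Patent 11080931 (and publication 20190096132) describes presenting stored digital models or images of objects that a foreground object at least partially obscures. The sketch below, assuming plain dict-based relationship and model stores (all names hypothetical), shows the core lookup: for each node in the field of view, fetch the models of the nodes it obscures.

    def render_obscured_objects(relationships, model_store, fov_node_ids):
        """Collect stored models/images of objects obscured by any object
        whose node is currently within the camera's field of view."""
        retrieved = []
        for node_id in fov_node_ids:
            # relationships maps a node id to the ids of nodes whose objects
            # it at least partially obscures (e.g., instruments mounted
            # inside an enclosure).
            for obscured_id in relationships.get(node_id, []):
                model = model_store.get(obscured_id)  # retrieved from memory
                if model is not None:
                    retrieved.append((obscured_id, model))
        return retrieved

    # Example: a cabinet in view obscures two transmitters behind its door.
    relationships = {"cabinet-7": ["xmtr-1", "xmtr-2"]}
    model_store = {"xmtr-1": "<3D model>", "xmtr-2": "<image>"}
    print(render_obscured_objects(relationships, model_store, ["cabinet-7"]))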
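
Patent 11062517 (and publication 20190096133) covers guiding a user toward a selected object and, on selection, superimposing its digital model along with virtual controls and displays. A minimal sketch, again with hypothetical names and a dict-based node record:

    import numpy as np

    def direction_to_object(device_position, object_position):
        """Unit vector from the device toward the target object, which the
        display can render as a directional indicator."""
        v = np.asarray(object_position, float) - np.asarray(device_position, float)
        return v / np.linalg.norm(v)

    def on_object_selected(node, model_store):
        """After a selection input, gather what the display should
        superimpose: the object's model/image plus the virtual controls
        and displays associated with it."""
        return {
            "model": model_store.get(node["id"]),
            "controls": node.get("controls", []),
            "displays": node.get("displays", []),
        }

    # Example: point the user toward a pump, then surface its controls.
    node = {"id": "pump-3", "position": [4.0, 0.0, 1.2],
            "controls": ["start", "stop"], "displays": ["discharge pressure"]}
    print(direction_to_object([0.0, 0.0, 0.0], node["position"]))
    print(on_object_selected(node, {"pump-3": "<3D model>"}))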