FACILITY OPERATIONS MANAGEMENT USING AUGMENTED REALITY

A wearable device is provided that is configured to guide a user to perform a procedure in a facility. The device includes a wearable element; a display area; a sensor; and a controller. The controller is configured to control the sensor to capture sensor data, identify equipment from the sensor data, and present an image on the display area. The image includes information indicating how to perform a current step of a procedure associated with the identified equipment. The controller includes a transceiver that enables the controller to wirelessly receive the information from a remote device.

Description
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims priority to U.S. provisional application Ser. No. 62/031,277 filed Jul. 31, 2014, U.S. provisional application Ser. No. 62/031,283 filed Jul. 31, 2014, and U.S. provisional application Ser. No. 62/089,633 filed Dec. 9, 2014, the entire contents of which are herein incorporated by reference.

BACKGROUND OF THE INVENTION

1. Technical Field

The present disclosure relates to management of facility operations, and more specifically, to management of facility operations using augmented reality.

2. Discussion of Related Art

Operating mission critical facilities may involve monitoring numerous building functions and equipment on a regular basis. If an individual performing such monitoring observes that equipment is operating outside of its designed limits, various steps may need to be taken to correct the situation.

Further, comparison of equipment data with benchmark values can provide a reasonable indication that equipment is close to failing or that it is operating near or exceeding its designed limits.

In the event of emergencies, facility component maintenance shutdowns, or other site specific events, facility engineers may be required to complete procedures from memory or using paper instruction. However, since these procedures can be long and complex, they can be difficult for a human operator to perform without assistance.

Augmented reality is a live direct or indirect view of a physical, real world environment whose elements are supplemented by computer-generated sensory input. As a result, the technology functions by enhancing one's current perception of reality.

Thus, there is a need for a system that enables facility management using augmented reality.

SUMMARY OF THE INVENTION

According to an exemplary embodiment of the invention, a wearable device is provided that is configured to guide a user to perform a procedure in a facility. The device includes a wearable element (e.g., an eyeglass frame, a watch band, etc.); a display area; a sensor; and a controller. The controller is configured to control the sensor to capture sensor data, identify equipment from the sensor data, and present an image on the display area. The image includes information indicating how to perform a current step of a procedure associated with the identified equipment. The controller includes a transceiver that enables the controller to wirelessly receive the information from a remote device.

According to an exemplary embodiment of the invention, a wearable device is provided to guide a user safely through a facility. The device includes a wearable element; a display area; a sensor; and a controller comprising a transceiver that enables the controller to wirelessly receive information from a remote device. The controller is configured to control the sensor to capture sensor data, identify equipment from the sensor data and the received information, and present an image on the display area representing a safe path through the equipment using the sensor data and the received information.

According to an exemplary embodiment of the invention, a wearable device is provided to manage a facility using a drone. The device includes a wearable element; a display area; and a controller configured to wirelessly control the drone to capture sensor data, identify equipment from the sensor data, determine whether the equipment is malfunctioning using the sensor data, and present an image on the display area representing the equipment and providing information indicating whether the equipment is malfunctioning.

According to an exemplary embodiment of the invention, a wearable device is provided to manage a facility using at least one robot. The device includes a wearable element; a display area; and a controller configured to wirelessly control the robot to capture sensor data, identify equipment from the sensor data, determine whether the equipment is malfunctioning using the sensor data, and present an image on the display area representing the equipment and providing information indicating whether the equipment is malfunctioning.

BRIEF DESCRIPTION OF THE DRAWINGS

A more complete appreciation of the present disclosure and many of the attendant aspects thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:

FIG. 1 is a schematic diagram illustrating a system according to an exemplary embodiment of the invention.

FIG. 2 illustrates an augmented reality device according to an exemplary embodiment of the invention.

FIG. 3 illustrates a controller according to an exemplary embodiment of the invention.

FIG. 4 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.

FIG. 5 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.

FIG. 6 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.

FIG. 7 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.

FIG. 8 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.

FIG. 9 illustrates an augmented view of the augmented reality device according to an exemplary embodiment of the invention.

FIGS. 10A-D illustrate augmented views of the augmented reality device according to exemplary embodiments of the invention.

FIG. 11 illustrates an exemplary screen for a walkthrough function of a possible graphical user interface (GUI) of the system.

FIG. 12 illustrates another exemplary screen for the walkthrough function.

FIG. 13 illustrates an exemplary screen for a trending function of a possible GUI.

FIG. 14 illustrates another exemplary screen for the trending function.

FIG. 15 illustrates an exemplary view of a smart watch that may be used in a system of an embodiment of the invention.

FIG. 16 illustrates an exemplary view of a smart watch that may be used in a system of an embodiment of the invention.

FIG. 17 illustrates an example of a drone that may be controlled by a system of an embodiment of the invention.

FIG. 18 illustrates an example of a robot being controlled by a system of an embodiment of the invention.

FIG. 19 shows an example of a computer system capable of implementing one or more devices of the invention or methods of the invention according to embodiments of the present disclosure.

DETAILED DESCRIPTION

In describing exemplary embodiments of the present disclosure illustrated in the drawings, specific terminology is employed for sake of clarity. However, the present disclosure is not intended to be limited to the specific terminology so selected, and it is to be understood that each specific element includes all technical equivalents which operate in a similar manner.

FIG. 1 illustrates a system that enables facility management to be performed using at least one of an augmented reality device 110, a smart watch 185, a mobile device 180, a drone 190, and a robot 195 according to an exemplary embodiment of the invention. The augmented reality device 110 and the smart watch 185 are wearable devices. The system further includes a central server 120 and a database 130. The mobile device 180 may be a tablet computer, a smart phone, etc.

The augmented reality device 110 includes a client application layer 111 and a static presentation layer 112. The mobile device 180 or the smart watch 185 may also include layers 111 and 112.

When the server 120 interfaces with a browser 140 of a remote computer, the server 120 includes a web presentation layer 121. The server 120 further includes a dynamic presentation layer 122 to interface with the static presentation layer 112 of the augmented reality device 110 and the web presentation layer 121. The server 120 further includes a business logic layer 123 and data access layer 124 to interface with a database layer 131 of the database 130.

FIG. 2 illustrates the augmented reality device 110 according to an exemplary embodiment of the invention. Referring to FIG. 2, the augmented reality device 110 includes a controller (not shown), a frame 202, a camera lens 203, a lens 204 (e.g., a prism), a display 205 or a projected image 205, and a control panel 206 with one or more physical buttons. In an exemplary embodiment, the augmented reality device 110 includes a projector that projects an image onto the prism to realize the projected image 205. In an exemplary embodiment, a case is mounted to the frame 202 that includes the control panel 206, the controller, a camera having the camera lens 203, and the projector. In an embodiment, the case includes an opening that enables the projector to project images onto the lens 204 or prism attached adjacent to the case. In an exemplary embodiment, the augmented reality device 110 does not include a frame but is incorporated into a wrist watch (i.e., a smart watch). In this embodiment, the display 205 is not fitted over a lens 204, but corresponds to the screen of the watch.

FIG. 3 illustrates the controller according to an exemplary embodiment of the invention.

The controller may be present within any of devices 110, 180, 185, or 190. The controller includes an application processor 310, a presentation subsystem 320, a connectivity subsystem 330, a sensor subsystem 340, an input/output subsystem 350, a memory 360, and a power management system 370. The controller may omit any one of the elements illustrated in FIG. 3 or may include additional elements.

The application processor 310 is configured to execute a computer program that controls the augmented reality device 110. The computer program is stored in memory 360. The computer program will be discussed in more detail below. When the application processor 310 is in a controller of devices 180, 185, or 190, it is configured to execute a computer program that controls the corresponding devices.

The presentation subsystem 320 controls what is presented on display 205 or seen in the projected image 205 of the augmented reality (AR) device 110, a display of the mobile device 180, or a screen of the watch 185.

The connectivity subsystem 330 enables the AR device 110 to communicate with other devices such as the central server 120, another mobile device (e.g., 180), or other devices such as a mainframe, a workstation, a server, a database, a desktop computer, a tablet computer, a smart watch (e.g., 185), another client, etc. The connectivity subsystem 330 includes a wireless transceiver that enables the augmented reality device 110 or devices 180 or 185 to wirelessly communicate with the other devices. The connectivity subsystem 330 may include the technology (e.g., suitable hardware and/or software) to exchange data wirelessly (e.g., using radio waves) over a computer network (e.g., the Internet). This technology may enable Wi-Fi communications based on the IEEE 802.11 standard, Bluetooth communications, Near Field Communications (NFC), Radio Frequency Identification (RFID), Infrared, etc.

The sensor subsystem 340 may include one or more sensors, such as an ambient light sensor, a proximity sensor, a global positioning system (GPS), a compass, an accelerometer, a gyroscope, etc. Since the controller includes the sensor subsystem 340, and the controller may be present in any of devices 110, 180, or 185, any of these devices may provide the functions of the sensor subsystem 340.

The input/output (I/O) subsystem 350 may provide an interface to input devices, such as control panel 206. The I/O subsystem 350 may include a digital camera having lens 203 controlled by the application processor 310 or by a controller of the I/O subsystem 350 for capturing images and videos. The I/O subsystem may be present within any of devices 110, 180, or 185. The images and videos may be stored in a memory or a buffer of the I/O subsystem 350 or the memory 360.

The memory 360 may be embodied by various types of volatile or non-volatile memory devices. For example, the memory 360 may include flash memory, such as an SD card, an MMC card, an eMMC card, hard drive, etc. The memory may be located within any of devices 110, 180, 185, or 190.

The power management subsystem 370 may include a battery, an interface for receiving power from an external power source, software and/or hardware to manage power usage of the augmented reality device 110, etc. The power management subsystem 370 may include an AC power adaptor for receiving power in a wired manner or a Wireless Power Receiver for receiving power in a wireless manner. The power management subsystem 370 may be located within any of devices 110, 180, 185, or 190.

A user places the frame 202 of the AR device 110 over one or more eyes like a pair of glasses. The view perceived by the user through the lens 204 is referred to as a lens view 204-1, and the view perceived by the user through the display device 205 or an area of the projected image 205 is referred to as an augmented view 205-1. Since the lens 204 is transparent, all objects that would be visible to the naked eye are visible in the lens view 204-1. When there is no augmented data to present, anything that would be visible to the naked eye in the area of the augmented view 205-1 is visible to the user. The size and location of the augmented view 205-1 may vary within the lens view 204-1, and may be adjusted by the controller of FIG. 3. The display device 205 or the projector can present augmented data (e.g., images) to all or just a portion of the augmented view 205-1. When the display device 205 or the projector presents augmented data to a portion of the augmented view 205-1, any objects that would be visible to the naked eye in the remaining area of the augmented view 205-1 are visible to the user.

The augmented data is provided by the central server 120 and is sent either directly to the connectivity subsystem 330 of the controller within the augmented reality device 110, or the augmented data is sent in an indirect manner from the central server 120 to the mobile device 180 or the watch 185 (e.g., the intermediary party), and then from the intermediary party to the augmented reality device 110.

No Internet or WiFi service may be present in an equipment room. Thus, prior to entering the facility, the AR device 110 can be preloaded wirelessly with all the necessary augmented data from the server 120, which may include all instructions that need to be performed, existing parameter data about the equipment in the facility, existing warnings, trends, etc. The mobile device 180 or the watch 185 may be preloaded with the same or a portion of this information wirelessly.
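By way of a non-limiting illustration, the preload step described above might be sketched as follows; the endpoint path, payload layout, and cache format are assumptions introduced only for this example.

```python
# Illustrative sketch of preloading augmented data while WiFi is still available;
# the endpoint path and local cache format are assumptions, not part of the disclosure.
import json
import urllib.request

def preload_facility_data(server_url, facility_id, cache_path):
    """Fetch procedures, existing parameter data, warnings, and trends from the
    server and store them locally for offline use inside the facility."""
    url = f"{server_url}/facilities/{facility_id}/augmented-data"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    with open(cache_path, "w") as f:
        json.dump(data, f)
    return data
```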

In an exemplary embodiment, the instructions are derived from operational procedures including Standard Operating Procedures (SOPs), Emergency Action Procedures (EAPs), Emergency Operating Procedures (EOPs), Maintenance Procedures (MPs), Method of Procedures (MOPs) and other facility documentation. The procedures are stored on the server 120, and procedures for a given facility are then transferred to the augmented reality device 110 prior to entering the facility. The procedures may also be transferred to the mobile device 180 or the watch 185.

The facility may be marked with tags that can be scanned by the augmented reality device 110, the mobile device 180, or the watch 185 to determine whether the user has entered a particular room or is standing in front of a particular facility component.

In an exemplary embodiment, at least one of the tags is a radio frequency identification (RFID) tag or a near field communication (NFC) tag that can identify a given room/floor in the facility and/or a given facility component within the facility. For example, a tag may be present in the doorway of a room in the facility that identifies the room the user is about to enter, and tags may be present on all the facility components within the room that identify the corresponding equipment. The connectivity subsystem 330 of the augmented reality device 110, the mobile device 180, or the watch 185 may include an RFID or an NFC reader that is operated by the processor 310 to read the tags. For example, if the user passes a tag identifying a room, the reader can retrieve the next instruction for the identified room from the pre-retrieved data found in the memory 360 of the controller. In another example, if the user passes a tag identifying a given facility component, the reader can retrieve the next instruction for the identified component from the pre-retrieved data. If WiFi is available, the controller can retrieve the instruction directly from the server 120.
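As a minimal sketch of the lookup described above, the pre-retrieved data might be organized as a mapping from a tag identifier to an ordered list of instructions; the structure and names below are assumptions made only for illustration.

```python
# Illustrative lookup of the next instruction for a scanned RFID/NFC tag;
# the cache layout (tag identifier -> ordered list of steps) is an assumption.
def next_instruction(tag_id, instruction_cache, completed_steps):
    """Return the first uncompleted step for the room or component the tag identifies."""
    steps = instruction_cache.get(tag_id, [])
    done = completed_steps.get(tag_id, set())
    for index, step in enumerate(steps):
        if index not in done:
            return step
    return None  # no remaining steps for this room or component

# Hypothetical example: a tag at a doorway identifying the main electrical room
cache = {"main_electrical_room": ["Check voltage levels on Switchgear A",
                                  "Record UPS battery voltage"]}
print(next_instruction("main_electrical_room", cache, {}))  # first uncompleted step
```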

For equipment that is equipped with wireless metering devices, scanning the tag may cause the augmented reality device 110, the mobile device 180, or the watch 185 to interface with the metering device to automatically download the latest data readings for a given component (e.g., power left in a UPS), which could be presented in the augmented view 205-1, a display of the mobile device 180, or on a screen of the watch 185.

In another exemplary embodiment, the tags have barcodes (e.g., UPC, QR, etc.) that are scanned by a camera of the augmented reality device 110 using lens 203, or a camera of the watch 185, or a camera of the mobile device 180, and the scanned codes identify a room and/or a floor of the facility, and/or a component within the facility/room/floor.

For example, when a user wearing the augmented reality (AR) device 110 looks at a barcode, the augmented reality device 110 scans the barcode, retrieves identifying information from the read bar code, and retrieves augmented data to present based on the retrieved identifying information.

Since certain clients will object to placement of tags on their equipment or on the walls of their facility, in another embodiment of the invention, images can be captured using the AR device 110, and image recognition can then be used to automatically identify facility components and areas of the facility (e.g., a particular room, a particular floor, etc.). The AR device 110 may be configured to ping the server 120 or the mobile device 180 regularly for updates. Depending on the hardware, these requests can be made more efficient by detecting shifts in an internal gyroscope, accelerometer, or inclinometer of the sensor subsystem 340 of the device, which allows the server 120 to be queried only when new data is likely to be needed. For example, the AR device 110 may call for new information only when the visuals have changed significantly or the user has moved a certain distance. The server 120 sends information at a level of granularity based on the scope that the client using the AR device 110 has most recently entered. For example, when a client enters a facility, a configuration is retrieved from the server 120, based on preferences set during the initial installation, specifying how tags (e.g., RFID, QR, or manual cues) are used, whether geometries are used, and to what granularity. As the user enters a sub-area or comes within the vicinity of a piece of equipment, the server 120 then provides more and more granular information as the client enters different zones, either via sensors specified in the configuration or via the client's manual pinging.
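One way to picture the movement-gated polling described above is the hypothetical sketch below; the thresholds are assumed example values and are not taken from the disclosure.

```python
# Illustrative decision of when the AR device requests fresh data, based on
# accumulated movement reported by the sensor subsystem; thresholds are assumptions.
DISTANCE_THRESHOLD_M = 3.0     # assumed distance moved before a new request
ROTATION_THRESHOLD_DEG = 45.0  # assumed heading change before a new request

def should_request_update(distance_moved_m, heading_change_deg):
    """Ping the server only when the view has likely changed significantly."""
    return (distance_moved_m >= DISTANCE_THRESHOLD_M or
            abs(heading_change_deg) >= ROTATION_THRESHOLD_DEG)
```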

Information can be retrieved as rows of column data. Depending on the user's configuration (e.g., preferences set when configuring the user's data), the user can select from a list of equipment as he navigates to more and more granular levels to identify his area/equipment. If there is data on the server 120, the user will receive coordinates, or geometries, as applicable and as available for that facility.

The AR device 110 may include multiple infrared cameras to distinguish geometries. The AR device 110 uses the multiple cameras to scan the geometries and retrieve their metadata (such as angles, geometries, the closest basic overall mesh color of the geometry, etc.). It periodically sends a query with this data to the server 120 to retrieve possible matches. Images of the equipment previously captured by any of the devices (110, 180, 185, 190) and stored by the server 120, together with a visual scan of an area of the facility processed by a geometry-analyzing function or API, are used to determine whether there is a similar fit in terms of the number of edges and the number of geometric objects. The system may be configured to confirm that the equipment being looked at matches the representation identified by the database algorithm and the visual device. If no exact match is found, the system shows other equipment in the nearest geometrically mapped vicinity/scope.
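The geometry-matching exchange might be sketched as follows; the metadata fields, the scoring rule, and the fallback of returning nearby candidates are illustrative assumptions rather than the disclosed algorithm.

```python
# Illustrative matching of scanned geometry metadata against stored equipment
# records; the scoring rule and record layout are assumptions.
def match_equipment(scan, equipment_records):
    """Return the best-fitting record, or nearby candidates if no exact fit is found."""
    def score(record):
        return (abs(record["edge_count"] - scan["edge_count"]) +
                abs(record["object_count"] - scan["object_count"]))
    ranked = sorted(equipment_records, key=score)
    if ranked and score(ranked[0]) == 0:
        return [ranked[0]]   # exact geometric fit to confirm with the user
    return ranked[:3]        # otherwise show equipment in the nearest mapped vicinity
```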

FIG. 4 illustrates an example of the augmented view 205-1 being used to present augmented data. For example, in the lens view 204-1, the user is viewing facility components, and in the augmented view 205-1 the user is viewing information about one or more of the components. In the example shown in FIG. 4, the equipment includes an Electrical Switchgear and the augmented view 205-1 presents data such as the name of the equipment (e.g., “Switchgear A”), a current instruction to perform on the equipment (e.g., “check voltage levels”), the name of the room the equipment is housed within (e.g., “Main Electrical Room”), the time, date, etc. The presentation subsystem 320 formats the data presented in the augmented view 205-1 based on augmented data pre-stored on the augmented reality device 110 or retrieved wirelessly from the mobile device 180 or the central server 120 when WiFi is present. It is assumed that prior to presenting the augmented data, a user wearing the augmented reality device 110 passed and/or scanned a tag identifying the room or the component so that the augmented reality device 110 can retrieve the next instruction that corresponds to the identified room or component. The data presented in the augmented view 205-1 shown in FIG. 4 may also be presented on a display of the mobile device 180 or a screen of the watch 185.

After the user performs the instruction, the user can acknowledge that the instruction was performed by pressing a button on control panel 206, which sends an acknowledge command to the controller shown in FIG. 3. The control panel 206 may be located on the watch 185 to allow the user to send the acknowledge command using the watch 185. The user may also use the mobile device 180 to send the acknowledge command.

Upon receipt of the acknowledge command, the controller can present a next step in the augmented view 205-1, a screen of the watch 185, or a display of the mobile device 180, that is to be performed on a component in the same room, or another instruction (e.g., “head to another room”).

While the control panel 206 is illustrated in FIG. 2 as being located next to the camera lens 203, the position of the panel 206 can be moved to various locations on the augmented reality device 110, or may be present on watch 185.

The panel 206 may include at least one depressible button, such as a button to advance to a next instruction (e.g., a “+”), a button to display a previous instruction (e.g., a “−”), and a button to acknowledge that a current instruction has been performed. For example, when the user has used the panel 206 or the mobile device 180 to acknowledge that a current instruction has been performed, the presentation subsystem 320 displays the text of the next step in the augmented view 205-1, on a display of the mobile device 180, or on a screen of the watch 185.

In an alternate embodiment, the controller is configured to receive voice commands from the user. For example, the sensor subsystem 340 may include a microphone for receiving the voice commands and the memory 360 may store speech recognition software executed by the processor 310 to interpret the entered voice commands. For example, the user can acknowledge that the current instruction has been completed by speaking a term recognized by the processor 310 as indicating that the current step has been completed. Upon the processor 310 recognizing that the user has responded with a voice command indicating that a current step has been completed, the controller can present the next step in the augmented view 205-1, on a display of the mobile device 180, or on a screen of the watch 185.
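The acknowledge-and-advance behavior described above (a panel button press or a recognized voice command moves the procedure to its next step) can be pictured with the following sketch; the class and method names are assumptions made for illustration.

```python
# Illustrative procedure walker: a button press or recognized voice command
# acknowledges the current step and advances to the next one.
class ProcedureWalker:
    def __init__(self, steps):
        self.steps = steps
        self.index = 0

    def current_step(self):
        if self.index < len(self.steps):
            return self.steps[self.index]
        return "Procedure complete"

    def acknowledge(self):
        """Called on a panel 206 button press or a recognized completion phrase."""
        if self.index < len(self.steps):
            self.index += 1
        return self.current_step()

walker = ProcedureWalker(["Use key to unlock breaker", "Power switch on"])
print(walker.current_step())   # Use key to unlock breaker
print(walker.acknowledge())    # Power switch on
print(walker.acknowledge())    # Procedure complete
```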

While the above discusses use of information to guide the operator in performing maintenance steps during a walkthrough of a facility, the invention is not limited thereto. For example, a user may be presented with information about a given component such as parameter data about the component, a warning about the component, a data trend corresponding to the component, etc.

Further, while FIG. 4 illustrates an Electrical Switchgear, the invention is not limited to providing instruction steps or information for an Electrical Switchgear, as information and instructions for various different types of facility components may be presented. For example, the facility components may include at least one of an Uninterruptible Power Supply (UPS), a Power Transfer Switch (PTS), a Computer Room Air Conditioner (CRAC), a Generator, a Boiler, or any other type of facility component that is included in a facility's core infrastructure.

FIG. 5 illustrates another example of the augmented view 205-1 being used to present augmented data according to an exemplary embodiment of the invention. For example, the user views facility components 501, 502, 503, 504, . . . , 50n that are part of an equipment path in the lens view 204-1 and a path 510 through the equipment that provides the user with safe passage through the equipment. For example, when the components are transformers, an arc flash may occur that can injure someone that is too close to the equipment.

An arc flash is a type of electrical explosion that results from a low-impedance connection to ground or another voltage phase in an electrical system. Arc flashes are often witnessed from lines or transformers just before a power outage, creating bright flashes. Most 480 volt electrical equipment has sufficient capacity to cause an arc flash hazard. Higher voltages can cause a spark to jump, initiating an arc flash without the need for physical contact.

Thus, a user wearing the AR device 110 who is about to walk past equipment is presented with the safe path in the augmented view 205-1. For example, the user is not likely to be injured by an arc flash if he stays inside the dotted lines. The safe distances between each component in the facility may be stored in the server 120 or the mobile device 180 and pre-retrieved by the augmented reality device 110 prior to entering the facility. A tag at the entrance of a room or at the beginning of a path through the equipment may identify the components along the path, so that the augmented reality device 110 can retrieve the corresponding safe distances, and then the presentation subsystem 320 can display these distances like the boundary path 510 illustrated in FIG. 5.

The watch 185 can vibrate (e.g., using an internal vibration motor) or provide an audible or visible alert when the user is getting close to stepping outside the safe boundaries, such as those illustrated in FIG. 5. For example, the watch 185 could display green to indicate the user is within a safe boundary from the equipment, yellow to indicate they are getting too close to the boundary, and red to indicate they are outside the safe boundary (i.e., too close to the equipment). However, other colors or other graphical indicator may be used. The boundary path 510 may be presented on a display of the mobile device 180 or a screen of the watch 185.
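As a hedged sketch, the green/yellow/red cue described above might be derived from the user's distance to the nearest component and that component's stored safe distance; the margin value below is an assumed example, not a value from the disclosure.

```python
# Illustrative mapping from distance-to-equipment to the watch indicator color;
# the warning margin is an assumed example value.
def boundary_indicator(distance_m, safe_distance_m, warning_margin_m=0.5):
    if distance_m >= safe_distance_m + warning_margin_m:
        return "green"   # well inside the safe boundary
    if distance_m >= safe_distance_m:
        return "yellow"  # approaching the boundary
    return "red"         # outside the safe boundary (too close to the equipment)
```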

The boundary path 510 shown in FIG. 5 has a left side and a right side. However, if there is no equipment on one side, for example the right side, or if the equipment on the right side is not capable of causing arc flashes, the right dotted lines would be omitted. In the example shown in FIG. 5, the power rating of the component 503 allows the user to be closer, and accordingly its adjacent dotted line is closer to component 503. The boundary path 510 can change dynamically as the user walks past equipment (e.g., marked with a corresponding tag or recognized through image recognition), since the power ratings of each component can vary. While FIG. 5 illustrates the boundary path 510 as having dotted lines, the invention is not limited to any particular graphical shape. For example, the boundary path 510 could be represented by other types of lines (e.g., straight, dashed, etc.), or by a multisided shape with straight or curved lines. In the example shown in FIG. 5, any object visible to the naked eye would be visible in the area of the augmented view 205-1 not covered by the dotted lines. Further, the invention is not limited to arc flash boundaries, as there may be other reasons for staying a certain distance away from equipment, such as to prevent electrostatic discharge, to prevent excessive inhalation of noxious fumes, to protect the user from being harmed by equipment with sharp edges, to keep the user away from equipment that does not need to be serviced, etc.

FIG. 6 illustrates another example of the augmented view 205-1 being used to present an arc flash hazard according to an exemplary embodiment of the invention. As shown in FIG. 6, the equipment, the walls, and the floor of the facility are visible in the lens view 204-1, and the augmented view 205-1 comprises several augmented images 520, 521, 522, 523, and 524. The images 520-524 are part of an example of an alert that the user would see when walking near electrical equipment. Each of the images 521-524 represents a different path, where each path has a different shade or color to indicate a different amount of energy that one would be exposed to in the event of an arc flash. For example, a first shading or color (e.g., green) associated with the first image path 521 could indicate a first amount of energy, a second shading or color (e.g., yellow) associated with the second image path 522 could indicate a second amount of energy higher than the first, a third shading or color (e.g., gold) associated with the third image path 523 could indicate a third amount of energy higher than the second, and a fourth shading or color (e.g., red) associated with the fourth image path 524 could indicate a fourth amount of energy higher than the third. Other colors or shading styles may be used, and additional or fewer image paths may be presented. The portions outside the paths 521-524, which have no shading, indicate zero exposure to arc flash hazards. The augmented image 520 provides a textual warning, which may be omitted. In an exemplary embodiment, one or more of the image paths 521-524 are transparent to allow the floor to be seen.

The augmented images 520-524, or data derived therefrom, may be presented on a display of the mobile device 180, or a screen of the watch 185. In another embodiment, the watch 185 can vibrate when the user is close to approaching an outer one of the paths such as 521. For example, the watch 185 can vibrate at different frequencies to indicate different levels of energy that one would be exposed to in the event of an arc flash. For example, a lower level of energy such as what would be encountered by walking within path 521 could be indicated by the watch vibrating at a first frequency and a higher level of energy such as what would be encountered by walking within path 522 could be indicated by the watch 185 vibrating at a second frequency higher than the first.
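The vibration cue could be modeled as a simple lookup from the path the user is entering to a vibration frequency; the frequency values below are illustrative assumptions, not values from the disclosure.

```python
# Illustrative mapping from arc-flash energy band (image paths 521-524) to a
# watch vibration frequency in hertz; the values are assumptions.
VIBRATION_HZ = {521: 2, 522: 4, 523: 6, 524: 8}

def vibration_for_path(path_id):
    """Higher-energy paths produce a higher vibration frequency."""
    return VIBRATION_HZ.get(path_id, 0)  # 0 = no vibration outside the paths
```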

The watch 185, mobile device 180, or the augmented reality device 110 may produce audible alarms to indicate the user is about to step into one of the paths 521-524 or upon entering one of the paths. The audible alarms may be different to indicate the different energy levels. For example, the audible alarm could be a beeping that increases in volume and/or frequency as the user moves from a lower energy path to a higher energy path.

FIG. 7 illustrates another example of the augmented view 205-1 being used to present augmented data according to an exemplary embodiment of the invention. As shown in FIG. 7 the equipment, the walls, and floor of the facility are visible in the lens view 204-1 and the augmented view 205-1 comprises several augmented images 530, 531, 532, and 533. The augmented images 530 and 532 may be referred to as boundary images that correspond to the boundary of one or more facility components. In an exemplary embodiment, the boundary images 530 and 532 are transparent so that the underlying component is visible. The boundary images 530 and 532 may be different colors or shading styles to indicate whether the underlying equipment has been serviced recently or is in need of service. For example, the first boundary image 530 is presented in a color (e.g., green) that indicates it was recently serviced and the second boundary image 532 is presented in a color (e.g., red) that indicates it is in need of service. The boundary images may however be presented in different color or shading styles. The textual images 531 and 533 may be presented to indicate the actual date the underlying equipment was last serviced, and/or to warn that service is overdue. For example, the first textual image 531 may textually and/or graphically (e.g., in green) indicate that service was recent and the second textual image 533 may textually and/or graphically (e.g., in red) indicate that service is overdue. The textual images 531 and 533 may be omitted.

At least some of the augmented images 531-533 or data derived therefrom may be presented on a screen of the watch 185 or a display of the mobile device 180. For example, if the user is within a certain distance from equipment that needs to be serviced, the watch 185 or the mobile device 180 can indicate with a graphical, audible, or vibratory cue that the equipment needs to be serviced.

FIG. 8 illustrates another example of the augmented view 205-1 being used to present augmented data according to an exemplary embodiment of the invention. As shown in FIG. 8, the equipment, the walls, and the floor of the facility are visible in the lens view 204-1, and the augmented view 205-1 comprises several augmented images 541 and 542. The augmented data presented in FIG. 8 is a warning of an action that is not to be performed on equipment. The augmented data may include a boundary image 541 that surrounds the underlying equipment and may be transparent to allow the underlying equipment to be viewed. The augmented data may include a textual image 542 that describes textually and/or graphically (e.g., in yellow) the specific action not to be performed (e.g., do not turn off power, do not flip switch, do not turn lever, etc.).

At least some of the augmented images 541 and 542 or data derived therefrom may be presented on a screen of the watch 185 or a display of the mobile device 180. For example, if the user is within a certain distance from equipment that should be avoided, the watch 185 or the mobile device 180 can indicate with a graphical, audible, or vibratory cue that the equipment needs to be avoided.

FIG. 9 shows an example of the augmented view 205-1 being used to present augmented data according to an exemplary embodiment of the invention. In FIG. 9, the user is observing a facility component 600 with several sub-components 610, 620, 630, . . . , 63n, and the augmented view 205-1 indicates that the third sub-component 630 should be operated next. In an exemplary embodiment, each sub-component (e.g., 610, 620, 630, . . . , 63n) is marked with a tag that identifies the sub-component and its location so that the presentation subsystem 320 can move the augmented view 205-1 to the identified location, assuming it corresponds to a sub-component that needs to be next acted upon. For example, when tags with barcodes are used and a user wearing the augmented reality device 110 views the tag of a sub-component, the camera takes a picture of the barcode, and the display 205 presents the augmented data in the augmented view 205-1 only if the augmented reality device 110 determines augmented data is available for the scanned sub-component. A camera of the mobile device 180 or the watch 185 may also be used to take a picture of the barcode.

In another example, the component 600 is marked with an RFID or NFC tag that identifies all the sub-components and all their relative locations, a reader of the AR device 110, the watch 185, or the mobile device 180 scans the tag, and the AR device 110 displays the augmented view 205-1 at a location based on the location of the tag and the relative location corresponding to the sub-component that is next to be operated on. For example, a single tag can be located just to the left of the first sub-component 610 that represents the location of component 610 and scanning of the tag could indicate the third sub-component 630 is offset to the right by 12 inches so that the augmented view 205-1 is presented next to the third subcomponent 630. The data presented in the augmented view 205-1 of FIG. 9 can also be presented on the screen of the watch 185 or a display of the mobile device 180 when the user is close to the next sub-component that needs to be acted upon.

FIG. 10A shows an example of the augmented view 205-1 being used to present augmented data, according to an exemplary embodiment of the invention. The augmented view 205-1 includes augmented images 621 and 622. The first augmented image 621 identifies the next part of the equipment (e.g., a sub-component) that is to be operated on. For example, the first augmented image 621 may be a boundary image that surrounds the part such as a rectangle, square, circle, etc. The second augmented image 622 describes textually (e.g., use key to unlock breaker, etc.) the current action/step that is to be performed on the identified equipment part. The action or step may be derived from a procedure for the equipment that is initially provided by server 120. For example, a scanner of the AR device 110, the mobile device 180, or the watch 185 can scan a tag (e.g., RFID, barcode, etc.) near the equipment that identifies the equipment, or a camera of the device 110, 180, or 185 can snap a picture that can be used to identify the equipment by image recognition, and then, assuming a procedure for the equipment has already been downloaded from server 120, the AR device 110 can present a current step of the procedure in the augmented view 205-1 using the augmented images 621 and 622.

FIG. 10B shows the result of the user performing the action requested in FIG. 10A. For example, since the user has unlocked the breaker, he is no longer instructed to perform the current action. The user can notify the system that the action has been performed by pressing a button on the control panel 206 on devices 110 or 185, touching a screen of the mobile device 180, or through a voice command received by devices 110, 185, or 180. The user can request a next step in the procedure by pressing the same or a different button on the control panel 206 on devices 110 or 185, or by again touching a screen of the mobile device.

FIG. 10C illustrates a second step being presented to the user in the augmented view 205-1 using a first image 621 to identify the component to be operated on and a second image 622 to present the next step of the procedure (e.g., power switch on). The user can again notify the system that the next action has been performed by pressing the button on the control panel 206 or through a voice command.

FIG. 10D illustrates that no more steps are left in the procedure and the procedure has been completed. For example, FIG. 10D illustrates a textual image 622 that indicates that the procedure is complete.

The augmented images 621 and/or 622, or data derived therefrom, may be presented on a screen of the watch 185 or a display of the mobile device 180. For example, when the user is within a certain distance of equipment for which a current procedure step is to be performed, the screen of the watch 185 or the display of the mobile device 180 can indicate the current procedure step that is to be performed. A button of the watch 185 (e.g., on panel 206) or a touch of a screen of the mobile device 180 can be used to acknowledge to the system that the current procedure step has in fact been performed.

The augmented reality device 110, the mobile device 180, or the watch 185 can enable the user to perform data collection, view data trends, and perform operational procedures including Standard Operating Procedures (SOPs), Emergency Action Procedures (EAPs), Emergency Operating Procedures (EOPs), Maintenance Procedures (MPs), Method of Procedures (MOPs) and other facility documentation.

Since the mobile device 180 includes a larger screen than the augmented reality device 110 or the watch 185, the steps can be presented in more detail on the mobile device 180. For example, if the user wearing the AR device 110 scans a tag associated with equipment, while a brief message related to the component can be displayed in the augmented view 205-1 or on the watch 185, more information about the component or the facility (e.g., a schematic diagram, the entire procedure, etc.) can be displayed on the mobile device 180.

FIG. 11 illustrates an example of a possible graphical user interface (GUI) 700 that can be presented on the mobile device 180, in the augmented view 205-1, or on the screen of the watch 185.

Referring to FIG. 11, the GUI 700 includes a walkthrough mode 710 that is selected to enable the user to configure their facility into separate rooms. Each of the rooms may be represented by selectable room buttons 711. When the user selects one of the room buttons 711, the electrical and mechanical apparatuses that are associated with the selected room will appear.

When the room buttons 711 appear on the display of the mobile device 180, a user can use a touch screen of the device to touch the buttons 711. When one or more of the room buttons 711 appear in the augmented view 205-1 or a screen of the watch, a user can advance to and select the room buttons 711 by selecting a button of panel 206. When the watch is used, the panel 206 may be located on the watch.

Each room button 711 may provide a graphical indicator indicating whether the room has already been configured (e.g., a check) or has yet to be configured (e.g., an ‘x’ or a blank box). However, the invention is not limited thereto. For example, the graphical indicators illustrated in FIG. 11 are merely examples, as other graphical symbols or text may be used to convey the same information.

The user may also upload facility floor layout plans to be used for navigation of this screen. The user may configure areas of the floor layout plan to correspond to rooms within the system. Selecting the room from this view will display the electrical and mechanical apparatuses as previously described.

Selection of one of the available room buttons 711 brings up a new interface screen that enables the user to enter a new facility component that is housed within the corresponding room, or view/edit facility components that were previously entered (e.g., either manually or automatically). Further, one or more of the facility components in the rooms may be pre-loaded automatically using default facility component templates or site templates. The default facility component template may be used by the GUI 700 to provide the user with a list of available facility components. Custom facility component fields may also be created by the user and/or added to the default facility component template. In this way, each room may be configured to accommodate a unique facility component setup (e.g., UPS, PTS, switchgear, generator, power distribution unit (PDU), boiler, chiller, etc.).

FIG. 12 illustrates an exemplary screen of the GUI 700 when the walkthrough mode 710 is selected. This screen enables a user to be guided through a facility walkthrough room by room, clearly indicating data values that may be recorded for each facility component. In the example shown in FIG. 12, the screen includes the name of the room 712, an image 713 of the selected facility component or the room, a data entry pane 714, and buttons 715 for selecting one of the available facility components/equipment in the room. The image 713 and the room name 712 are optional. The buttons 715 may include labels that identify the corresponding facility components, which can be revised as necessary by the user.

In an exemplary embodiment, all or a portion of the information presented in FIG. 12 is presented in the augmented view 205-1 or a screen of the watch, and the user can advance to different fields by using a button on panel 206 located on the device 110 or the watch, or by using voice commands received through the device 110 or the watch. When the information presented in FIG. 12 is presented on the mobile device 180, a user can adjust the different fields using a touch screen of the device 180.

Since the augmented reality device 110 includes a camera, the image 713 (or a video) may be captured using its camera. A camera may also be present within the watch to capture the image/video. The room name 712 field may be edited by the user using a virtual/physical keyboard of the mobile device 180 to identify the room or by the user speaking into a microphone of the augmented reality device 110, the watch, or the mobile device 180.

The data entry pane 714 includes one or more parameters and data entry fields corresponding to the parameters associated with the selected facility components. For example, the parameters and corresponding data entry fields for a UPS could include its current battery voltages, currents, power levels, power quality, temperatures, statuses, alarms, etc.

In an exemplary embodiment, the data entry pane 714 is presented in the augmented view 205-1 or a screen of the watch, and the user can advance to and change different parameters within the pane 714 by selecting one or more buttons of the panel 206, which is located on the device 110 or the watch.

In an exemplary embodiment of the invention, the data fields can be one of various field types, such as numeric (e.g., integer or decimal), a text string, an array of choices, or a checkbox. The text string may be input via a virtual/physical keyboard of the mobile device 180 or by a user speaking into a microphone of the augmented reality device 110 or the watch.

The array of choices may be a selectable list box or dropdown menu with selectable items. The selectable choices can be presented in the augmented view 205-1 or on the watch, where each choice could be advanced to and/or selected by pressing one or more buttons of the panel 206, which can be located on the device 110 or the watch.

Each item may be associated with or return a unique integer value when selected that corresponds to an index into an array that stores the items of the list/menu. The data field may also be a label with one or more selectable arrow buttons that allow the user to increment or decrement the value on the label by a predefined amount. For example, when the selectable arrow buttons are presented in the augmented view 205-1 or a screen of the watch, an arrow button can be selected by selecting a button on panel 206. Selection of the checkbox may be stored as an integer representing whether the checkbox has been checked (e.g., 1) or unchecked (e.g., 0).

The application may maintain a data structure or object that corresponds to a facility component, which may comprise one or more of the above-described data fields. The equipment object (or facility component object) may include data regarding its name, type, image file location, its collection of fields, and a collection of document references. When an object is used, it may include access methods (e.g., object methods) that can be called by the system to set its data and read its data. The name may be a string representation of the name of the equipment/facility component (e.g., “Ferro-Resonant UPS”, “Line-Interactive UPS”, etc.).

The image file location is the string representation of an absolute or relative file path of the image file (e.g., a .png, .jpg) that may either be located within a memory file system of the AR device 110, the mobile device 180, or its appropriate location on a memory file system of the server 120 that visually describes the equipment/facility component, which may have been captured by the camera of the augmented reality device 110. The collection of fields is a collection of field objects that pertains to the equipment/facility components. Likewise, the collection of documents is a collection of strings that point to the absolute or relative file path of the documentation files that may either be located within the memory file system of devices 110 or 180, or its appropriate location on a memory file system of the server 120 that describe the structure, use, properties, or maintenance of the specific equipment/facility component.
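A minimal sketch of the field and equipment/facility component objects described above might look like the following Python; the class and attribute names are assumptions chosen only for illustration.

```python
# Illustrative field and equipment/facility component objects; names are assumptions.
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class DataField:
    name: str
    field_type: str                                   # "numeric", "string", "choices", or "checkbox"
    value: Union[int, float, str, None] = None
    choices: List[str] = field(default_factory=list)  # used when field_type == "choices"

@dataclass
class Equipment:
    name: str                                         # e.g., "Ferro-Resonant UPS"
    equipment_type: str
    image_path: str                                   # absolute or relative path to a .png/.jpg
    fields: List[DataField] = field(default_factory=list)
    documents: List[str] = field(default_factory=list)  # paths to documentation files
```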

Both fields and equipment/facility components may be used as collections within a facility room, which the system can maintain using a room object. Like equipment/facility component objects, room objects have data regarding the name, image file location, a collection of fields, and a collection of documents for the particular room. Room objects may also have a collection of equipment/facility components, as mentioned previously, which is a collection of equipment/facility component objects whose data collection interfaces are spatially located within that room area. In addition, room objects may have data pertaining to their representation in the walkthrough data collection. For example, a room object may include a flag that indicates whether or not the room is required to be checked during a specific scheduled walkthrough, as well as data denoting the percentage of the fields within the room and within the room's equipment/facility components that have been completed (whether problematic or not), taken over all of the fields within the room and its equipment/facility components.

Fields, equipment/facility components, and rooms may be used as collections within a facility area, which the devices 110/180 or server 120 can maintain using an area object. An area is a specific dimension of facility space that separates the total collection of facility rooms into smaller collections. Each area object may have a name and an image file location for a picture representing that area. Areas are not only limited to different areas within the facility building itself, but also include rooftops and outside areas of a facility.

Fields, equipment/facility components, rooms, and areas may be used as collections within a facility, which the devices 110/180 or server 120 can maintain using a facility object. The facility object itself may have a name, address, image file location, a collection of all its area objects, and a collection of all the actual document objects pertaining to the entire facility. All of the rooms and equipment/facility components may have a reference to the facility's master list of documents in order to link themselves to a specific document or collection of documents.

Fields, equipment, rooms, areas, and facilities may be used as collections within a client profile, which the devices 110/180 or server 120 can maintain using a software license object. The software license object itself may have an organization name, owner, license key, a collection of all its facility objects, and a collection of all the user objects working for or within the facility. Users of the application may be represented within the devices 110/180 or server 120 using user objects. The user object itself may have a name and privileges.
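Continuing the sketch above (and reusing its imports and classes), the nesting of rooms, areas, facilities, and a client profile could be represented as follows; again, the names and attributes are illustrative assumptions rather than the disclosed object model.

```python
@dataclass
class Room:
    name: str
    image_path: str
    required_in_walkthrough: bool = True               # checked during a scheduled walkthrough
    percent_complete: float = 0.0                      # completed fields over all fields
    equipment: List[Equipment] = field(default_factory=list)
    fields: List[DataField] = field(default_factory=list)
    documents: List[str] = field(default_factory=list)

@dataclass
class Area:
    name: str                                          # a floor, rooftop, or outside area
    image_path: str
    rooms: List[Room] = field(default_factory=list)

@dataclass
class Facility:
    name: str
    address: str
    image_path: str
    areas: List[Area] = field(default_factory=list)
    documents: List[str] = field(default_factory=list)  # master list referenced by rooms/equipment

@dataclass
class SoftwareLicense:
    organization: str
    owner: str
    license_key: str
    facilities: List[Facility] = field(default_factory=list)
    users: List[str] = field(default_factory=list)       # users with associated privileges
```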

The data entered into the fields may be stored in a database in the memory 360 and/or in the remote database 130. When a session is saved to the database 130, every room within the facility may be stored as a database table. A time stamp may be applied to each and every walkthrough session. Each record in that table may contain as columns every single field from that room and its equipment/facility components, as well as the time stamp of the walkthrough session. Every time a walkthrough session is saved, it may either overwrite the latest record within the table, or insert a new record into the table. If the current session being saved is a new session, it may insert a new record, but if it is a continued session that was previously saved, it may overwrite the last record saved.
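The save behavior described above (insert a new record for a new session, overwrite the last record for a continued session) might be sketched with SQLite as follows; the table layout and column naming are assumptions for illustration only.

```python
# Illustrative save of a walkthrough session for one room: one table per room,
# one column per field plus the session timestamp; the layout is an assumption.
import sqlite3

def save_room_session(conn, room_table, session_timestamp, field_values, continued_session):
    columns = ", ".join(["timestamp"] + list(field_values))
    placeholders = ", ".join("?" for _ in range(len(field_values) + 1))
    values = [session_timestamp] + list(field_values.values())
    cur = conn.cursor()
    if continued_session:
        # overwrite the last record saved for this session
        cur.execute(f"DELETE FROM {room_table} WHERE timestamp = ?", (session_timestamp,))
    cur.execute(f"INSERT INTO {room_table} ({columns}) VALUES ({placeholders})", values)
    conn.commit()
```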

Each walkthrough session is saved in a manner which the devices 110/180 or server 120 may maintain using a walkthrough session object. The walkthrough session object itself may have a timestamp for when the session began and a timestamp for when the session ends, the user which performed the walkthrough session, and the data collected.

During walkthrough sessions, users may record information as comments attached to facility components. These comments may be maintained by the devices 110/180 or the server 120 as a comment object. The comment object itself may contain a title, a message, a timestamp, an image, a user as an author, and a sound file.

When the application is started, it may first retrieve the stored object-oriented data from the device memory, which may be encoded. The encoded data for the facility and its areas, rooms, equipment/facility components, fields, users, comments, walkthrough sessions and documents may be decoded and recreated at runtime, after a user attempting to login to the application has been authenticated. Afterwards, if any synchronization to a remote database is to be made, the last record from each table in the local database (e.g., each table may refer to a specific room within the facility) may update the values of all the fields within the facility's field objects. Also, as far as data analysis is concerned, an entire set of records across multiple timestamp ranges may be imported into the application from the remote database for the sake of viewing, analyzing, and reporting trends throughout time across equipment/facility components and fields.

Data from a record set may be collected as an array of arrays (a two-dimensional array). In an exemplary embodiment, the first position of the two-dimensional array refers to the index of the record retrieved, uniquely identified by its timestamp, while the second position of the two-dimensional array refers to the column of the record retrieved. An embodiment of the application may point to and retrieve a specific data field collected from any time. In an exemplary embodiment, multiple record sets are obtained, one for each database table (e.g., one for each room), from which a user may choose any collected data field from any time, as far as the database record set allows.
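Indexing the record set as described (the first index selects a record, uniquely identified by its timestamp, and the second index selects a column) might look like the following sketch; the sample values and column names are hypothetical.

```python
# Illustrative two-dimensional record set: rows are records identified by their
# timestamp, columns are fields; all values shown are hypothetical.
records = [
    ["2014-12-01 09:00", 250, 48.1],   # timestamp, boiler temperature, UPS voltage
    ["2014-12-02 09:00", 248, 48.0],
]
columns = {"timestamp": 0, "boiler_temp": 1, "ups_voltage": 2}

def field_at(record_index, column_name):
    """Retrieve a specific data field collected at a specific time."""
    return records[record_index][columns[column_name]]

print(field_at(1, "boiler_temp"))  # 248
```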

The devices 110/180 or server 120 may compare the entered parameter data against stored thresholds or previously entered data to determine whether an error has occurred. The thresholds may include a Maximum Threshold and a Precise Threshold.

Data fields that conform to the Maximum Threshold may not exceed the nominal value (Xnominal). Warnings may be displayed on the GUI 700 when the data (Xactual) falls outside a tolerance value (T %) as shown by the following Equation 1:


Xactual>Xnominal−(Xnominal*T %)  (1).

For example, if the tolerance value is 10%, the actual temperature of a boiler is 200 degrees, and the nominal value is 250 degrees, since an actual of 200 is not above 250−(250*0.1) (i.e., 200 is not above 225), no warning would be displayed. However, if the temperature had risen to 226 degrees in this example, a warning would have been displayed.

The devices 110/180 or server 120 may also, or instead, average the previous logged data/parameter with the current entered parameter data, and compare this average value with a corresponding threshold to determine if a warning should be displayed. For example, if the current boiler temperature was entered at 226 degrees, but the prior 4 samples have the temperature at 200 degrees, since the overall average temperature is less than 226, no warning would be displayed. The amount of samples used for this averaging may vary and be a configurable parameter.

Data fields that conform to the Precise Threshold must not rise above or fall below the nominal value by more than the tolerance value. The GUI 700 may display a warning when data is outside of the tolerance as shown by the following Equation 2:


Xactual>Xnominal+(Xnominal*T %) OR Xactual<Xnominal−(Xnominal*T %)  (2).

If the value of the data field is beyond the threshold, then the value of the field is out of range and the system may alert the user. For example, if the tolerance value is 10%, the actual temperature of the boiler is 100 degrees, and the nominal value is 140 degrees, a warning would be displayed because an actual of 100 is less than 140−140*0.1 (i.e., 100 is less than 126). However, if the boiler temperature rises to 150 degrees, no warning would be displayed since 150 is lower than 140+140*0.1 (i.e., 154).
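
A corresponding non-limiting sketch of the Precise Threshold check of Equation 2 (the function name is hypothetical):

```python
def precise_threshold_warning(actual, nominal, tolerance_pct):
    """Warn when the value is more than the tolerance above or below nominal, per Equation 2."""
    band = nominal * tolerance_pct / 100.0
    return actual > nominal + band or actual < nominal - band

print(precise_threshold_warning(100, 140, 10))  # True:  100 < 126
print(precise_threshold_warning(150, 140, 10))  # False: 126 <= 150 <= 154
```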

If it has been determined that a warning is to be displayed, the devices 110/180 or server 120 may notify the user to take corrective action to maintain the facility components before a failure occurs. The notification may appear as a visual on the GUI 700, an audible alert, or a vibratory alert. For example, the visual can be presented on a display of the mobile device 180, in the augmented view 205-1, or on the screen of the watch 185. For example, a speaker of the devices 110, 180, or 185 can sound the audible alert, or a vibrating mechanism of the devices 110, 180, or 185 can vibrate to present the vibratory alert.

For example, parameter data for facility components that are outside of the threshold limits may be marked with visual cues on a display of the mobile device 180, in the augmented view 205-1, or on a screen of the watch. Further, different levels of alerts may be present. For example, a low level alert may become an elevated alert if new data is entered outside of a user configurable normal tolerance threshold. Since the error may have been caused by a data entry error, the user can choose to commit the data with the value as is, or edit the data before it is committed.

The notifications may be managed by selecting the configurations tab 730. For example, the interface 700 enables the user to indicate who should be contacted and the form used for the contact (e.g., SMS text message, MMS message, e-mail, social network message, etc.). For example, in addition to presenting the user of the devices 110/180/185 with a visual or audible notification, the devices 110/180/185 can notify the site administrator via an automatically generated e-mail, text message, social network message, etc. The devices 110/180/185 may enable the user to set the system into a training mode that prevents the application from sending a message to a remote party when it is determined that a facility component is not functioning properly. The devices 110, 180, 185, or server 120 may include a wireless transceiver and use it to send such a message when it is determined that a facility component is not functioning properly.
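
As a non-limiting illustration, the notification logic described above could be organized as follows; the function names and contact structure are hypothetical, and the actual contact forms are those configured through the configurations tab 730:

```python
def notify(warning, contacts, training_mode=False):
    """Present a local alert and, outside of training mode, notify remote parties."""
    # Local alert: visual on the GUI, audible tone, or vibration.
    show_local_alert(warning)
    if training_mode:
        return  # training mode suppresses messages to remote parties
    for contact in contacts:
        send_message(contact["address"], contact["form"], warning)  # e.g., SMS, MMS, e-mail

def show_local_alert(warning):
    print(f"ALERT: {warning}")

def send_message(address, form, warning):
    print(f"Sending {form} to {address}: {warning}")

notify("Boiler temperature above threshold",
       [{"address": "admin@example.com", "form": "e-mail"}],
       training_mode=False)
```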

The user may record comments in the comment field 716 for each room for each facility component within the room. For example, the comment field 716 displayed on the mobile device 600 can be advanced to by touching a screen option of the mobile device and then the comment can be entered by using a virtual/physical keyboard. For example, the comment field 716 displayed in the augmented view 205-1 or a screen of the watch can be advanced to using a button on panel 206 and then the comment can be entered by a user speaking into a microphone of the augmented reality device 110, the mobile device 180, or the watch 185.

The camera of the augmented reality device 110, the device 180, or watch 185 may be used to record one or more pictures that are automatically associated with the comment. The comments and associated pictures may be stored on a database of the devices 110/180/185, or the remote database 130. For example, if the user notices that the temperature of the boiler is too high, he can provide a corresponding comment and snap a photo of the instrument panel of the boiler showing the elevated temperature.

Once sufficient data has been captured by the devices 110/180/185, the user may select the trending mode 720 to access a visual representation of all data. In this visual representation, each data field may be graphed against its threshold limits and an estimated prediction trend line may be illustrated to show if and when the thresholds may be exceeded. For example, when the interface 700 is displayed in the augmented view 205-1, the user can use a button of panel 206 to advance to and select the trending mode 720.

Analysis of the data may reveal opportunities for the user to take corrective action to maintain facility components before a failure occurs. The trending may identify spikes and dips in recorded data as anomalies. The anomalies may be used as a basis of analysis where related data fields are analyzed for anomalies occurring during the same time period. Correlation between anomalies occurring during the same time period may provide a basis for corrective actions to locate and address the issue.

The AR device 110, mobile device 180, watch 185, or the server 120 may trend data input by the user during a walkthrough data collection session. A data trending activity fragment may first retrieve historic data from a database of the devices 110/180/185 or the remote database 130 up to a specified range of time defined by the user, by means of the data retrieval mechanism detailed previously. The data trending activity fragment may then calculate the positive and negative slopes of the sinusoidal curves that were created by the obtained information. The data trending activity fragment may be able to calculate transients (voltage spikes, current spikes, brownouts) by using the trends generated by the data collected. If such problematic transients are implied by the generated trends, the data trending feature may pinpoint the facility components or location of the issue within the facility.
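
A non-limiting sketch of the slope and transient calculations described above for the data trending activity fragment (hypothetical function names; a real implementation would operate on the historic records retrieved from the local or remote database):

```python
def slopes(values):
    """Differences between consecutive samples (positive and negative slopes)."""
    return [b - a for a, b in zip(values, values[1:])]

def find_transients(values, spike_threshold):
    """Flag sample indices where the change between readings implies a spike or dip."""
    return [i + 1 for i, d in enumerate(slopes(values)) if abs(d) > spike_threshold]

voltages = [480, 481, 479, 520, 478, 480]            # a voltage spike at index 3
print(find_transients(voltages, spike_threshold=20)) # [3, 4]
```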

The trending mode 720 may also be used to generate a report of the collected data stating a brief analysis of the trends or anomalies, and the report may also include a printout of the data for the selected fields in graphical and tabular formats. The report can be presented on a display of the mobile device 180, in the augmented view 205-1, or on a screen of the watch.

As shown in FIG. 13, the interface displayed by selecting the trending mode 720 may include an equipment label 721 identifying the equipment (e.g., a facility component), an overlay button 722, and a toggle button 723. The interface of FIG. 13 can be displayed on a display of the mobile device 180 or may be presented in the augmented view 205-1 or the screen of the watch 185.

The arrows to the left and right of the equipment name 721 may be used to advance to a preceding or subsequent piece of equipment in the room. For example, when the arrows are presented on the display of the mobile device 180, a user can use a touch screen to select the arrows. For example, when the arrows are presented in the augmented view 205-1 or the screen of the watch, a user can advance to and select the arrows by selecting one or more buttons on panel 206.

The trending mode 720 interface enables the user to specify equipment data fields and date ranges to be displayed on the graph for comparison of past and present data. The toggle button 723 can be used to toggle the view of the data between the graphical view and a tabular view. When the toggle button 723 is displayed on a display of the mobile device 180, a user can use a touch screen of the mobile device 180 to select the toggle button 723. When the toggle button 723 is presented in the augmented view 205-1 or the screen of the watch, a user can use one or more buttons of the panel 206 to advance to and select the toggle button 723.

The overlay button 722 can be used to view an overlay of projected equipment loading values to identify trends in data that was collected, as shown in FIG. 14. When the overlay button 722 is displayed on a display of the mobile device 180, a user can use a touch screen of the mobile device 180 to select the overlay button 722. When the overlay button 722 is presented in the augmented view 205-1 or the screen of the watch, a user can use one or more buttons of the panel 206 to advance to and select the overlay button 722.

As an example, the graph 724 can display one data parameter of a piece of equipment over time, and against the threshold or capacity limits, and an estimated (e.g., extrapolated) trend line will show if and when the thresholds may be exceeded. This information may aid the user in identifying system capacity trends as new facility components are added to the facility. The graph 724 may be presented on the display of the mobile device 180, the augmented view 205-1, or the screen of the watch 185.
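
As a non-limiting illustration of the extrapolated trend line, the following Python sketch fits a straight line to evenly spaced samples and estimates how many future samples remain before a threshold may be exceeded (hypothetical helpers; the actual trending calculation may differ):

```python
def linear_fit(values):
    """Least-squares slope and intercept for evenly spaced samples."""
    n = len(values)
    xs = range(n)
    mean_x, mean_y = (n - 1) / 2.0, sum(values) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x

def steps_until_threshold(values, threshold):
    """Estimate how many future samples until the trend line crosses the threshold."""
    slope, intercept = linear_fit(values)
    if slope <= 0:
        return None  # not trending toward the threshold
    return max(0.0, (threshold - intercept) / slope - (len(values) - 1))

loads = [60, 62, 65, 67, 70]             # equipment loading (% of capacity)
print(steps_until_threshold(loads, 90))  # roughly 8 more samples at this rate
```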

The user may select equipment data fields to be displayed and analyzed on the data trending graphs. These graphs may be overlaid on the same set of axes for data comparison. The interface 700 may highlight increasing and decreasing trends for the user.

In the portfolio mode 750, the devices 110/180/185 or the server 120 may be configured to generate reports of the collected data, a brief analysis, and an overview of the data that was entered for selected fields. The devices 110/180/185 enable the user to generate a full report based on the entered data. The system may be able to manage data, functionalities, and components for multiple facilities.

The document library mode 740 can be selected to access one or more documents such as SOPs, EAPs, EOPs, MPs, MOPs, drawings, schematics, or other relevant documentation associated with facility components in the corresponding room or the facility. For example, when the interface is presented in the augmented view 205-1 or the screen of the watch, the user can advance to and select the library mode 740 by selecting one or more buttons of panel 206. For example, when the interface is presented on a display of the mobile device 180, the user can select the library mode 740 by touching its screen.

These documents may be accessed in a “step by step” mode of the interface 700 on a display of the mobile device 180, in the augmented view 205-1, or the screen of the watch to guide the user through each step in a procedure of a corresponding one of the documents. For example, text of the steps that have been completed and the subsequent steps can be displayed on the display 205 or the display of the mobile device 180, where the next step to be completed can be emphasized (e.g., highlighted in a different color, underlined, etc.). The interface 700 enables the user to mark each step as complete.

For example, the interface 700 may provide a check box next to each step that can be selected when the user has finished that part of the procedure. The check box may be visible in the augmented view 205-1, the screen of the watch, or on the display of the mobile device 180. The user may also be notified if a step has been skipped. The “step by step” mode may reduce confusion during an emergency by guiding the user in a high stress environment, which may also increase safety.
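
A non-limiting sketch of the "step by step" tracking described above, including detection of a skipped step (hypothetical class; the interface 700 would render the steps and check boxes on the display):

```python
class Procedure:
    """Tracks which steps of a document's procedure have been completed."""
    def __init__(self, steps):
        self.steps = steps
        self.completed = [False] * len(steps)

    def mark_complete(self, index):
        self.completed[index] = True
        skipped = [i for i in range(index) if not self.completed[i]]
        if skipped:
            print(f"Warning: step(s) {skipped} appear to have been skipped.")

    def next_step(self):
        """The step to emphasize (e.g., highlight) on the display."""
        for i, done in enumerate(self.completed):
            if not done:
                return self.steps[i]
        return None

p = Procedure(["Open breaker A", "Verify zero voltage", "Apply grounds"])
p.mark_complete(0)
p.mark_complete(2)        # prints a warning: step 1 was skipped
print(p.next_step())      # "Verify zero voltage"
```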

The user may also access electrical and mechanical one-line diagrams and various other important operational drawings, including floor layouts. These documents may also be used to facilitate practice simulations that may reduce risk and improve life safety. The documents may be viewable in the augmented view 205-1, the screen of the watch, or a display of the mobile device 180.

The document library mode 740 may be selected to provide personnel with immediate access to important facility drawings (e.g., electrical one line diagrams, floor plans, and mechanical drawings, etc.), emergency operating procedures, and other pertinent information to keep operations running smoothly. The document library mode 740 can be selected on the mobile device 180 or using the augmented reality device 110 when the interface is presented in the augmented view 205-1.

The document library mode 740 is not limited to providing emergency related information, as it may also provide pertinent information on switching procedures, technical maintenance programs, and other operations procedures (e.g., SOPs, EAPs, EOPs, MPs, MOPs, etc.), as well as facility one-line diagrams and other facility infrastructure drawings. Further, other documentation may be imported from a remote server and saved into the application locally, which allows use of the data without network connectivity.

The Simulation mode 760 may be selected to simulate one or more functions of the client 100. The simulation mode 760 can be selected on the mobile device 600 or using the augmented reality device 110 when the interface is presented in the augmented view 205-1.

Selection of the simulation mode 760 may present a screen that enables a trainee to run one or more available training scenarios or a site administrator to create the training scenarios. The screen may be presented on a display of the mobile device or in the augmented view 205-1.

As an example, the training scenarios may be used to teach a trainee or to simulate conditions that would be predicted to occur based on data entered by a user. For example, a training scenario could be a walkthrough of a room containing several facility components (e.g., pieces of equipment), where the trainee is expected to enter parameter data (e.g., boiler temperature, battery voltage, etc.) for each component. If the trainee enters parameter data that is outside of the expected thresholds, the trainee would receive an alert similar to the one that would have been received during normal operation. However, since this is merely a simulation, the site administrator would not receive a notification (e.g., e-mail, text, etc.) of the alert.

The application that launches the interface 700 may be represented on the display of the mobile device 180 by a selectable graphical icon. The icon may also be presented in the augmented view 205-1 or a screen of the watch. When the icon is selected (e.g., using a touch screen of the mobile device 180 or selecting a button of panel 206), prior to enabling the user to access the functions of the interface 700, the user may be prompted to enter a login name/identifier and a password that corresponds to one or more accounts maintained by the application. The application may support different layers of accounts, where some accounts have access to more features of the interface 700 than others. For example, the administrator may have access to all features whereas a trainee could have access only to the simulation mode 760, etc. This layered access may be secured by multiple authorization methods including encrypted passphrases, hardware keys, iris scans, user fingerprints, and facial recognition.
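
As a non-limiting illustration of the layered account access described above (the role names and feature list are hypothetical; actual authorization may combine passphrases, hardware keys, or biometrics):

```python
# Features of the interface 700 available to each account layer.
ACCESS = {
    "administrator": {"logging", "trending", "configurations", "library", "portfolio", "simulation"},
    "engineer":      {"logging", "trending", "library", "simulation"},
    "trainee":       {"simulation"},
}

def can_access(role, feature):
    return feature in ACCESS.get(role, set())

print(can_access("trainee", "simulation"))  # True
print(can_access("trainee", "trending"))    # False
```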

In at least one embodiment of the invention, the devices 110/180/185 synchronize all collected data with a centralized, secure (e.g., encrypted) database 130. For example, updates to data generated on the devices 110/180/185 may be uploaded to the remote database 130. The remote database 130 may then update other device(s) with the updated data. The data may include facility configurations, collected facility component data, facility documentation, document association properties with the facility, equipment, and/or rooms, or training simulations.

The remote server 120 or the devices 110/180/185 may also aggregate data from various third party data sources including electrical system metering devices, mechanical system metering devices, facility alarms and alerts, security systems, and weather data. This data may be used for analytics and for basic viewing.

The remote server 120 or database 130 may host a library containing generic manufacturer data, equipment manuals, and other documentation that may be accessed by the augmented reality device 110, the mobile device 180, or watch 185.

The devices 110/180/185 may communicate with the remote server 120 via a private network where all interactions may be encrypted. The users of the devices 110/180/185 may access all information via a locally cached database for locations that do not have network access. For example, if network access is not available, the application will locally cache data on the device's internal storage (e.g., memory 360). Once network access is restored, the application may synchronize the locally stored data with the remote database automatically.
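
A non-limiting sketch of the cache-then-synchronize behavior described above (hypothetical functions; the application would use its local storage, e.g., memory 360, and the remote database 130):

```python
pending = []  # records cached locally while the network is unavailable

def save_record(record, network_up, upload):
    """Cache locally when offline; otherwise upload immediately."""
    if network_up:
        upload(record)
    else:
        pending.append(record)

def on_network_restored(upload):
    """Synchronize locally cached records once connectivity returns."""
    while pending:
        upload(pending.pop(0))

uploads = []
save_record({"boiler_temp": 226}, network_up=False, upload=uploads.append)
save_record({"boiler_temp": 224}, network_up=False, upload=uploads.append)
on_network_restored(uploads.append)
print(uploads)  # both cached records are now synchronized
```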

FIG. 15 illustrates an example of a view that may be presented by the watch 185. The view may include an image of a place such as the facility, a facility room, or an area within the facility and a status message or an alert about the place. For example, FIG. 15 shows an alert that indicates the overall health of the facility shown as a grade out of 100. The view may be presented when a position of the user wearing the watch 185 is within a pre-defined distance from the location of the place depicted in the image.

FIG. 16 illustrates another example of a view that may be presented by the watch 185. The view may include an image of a facility component, and a status or alert about the facility component. For example, FIG. 16 shows an alert that indicates the depicted facility component (i.e., a UPS of system A) is powered off. The view may be presented when a position of the user wearing the watch 185 is within a predefined distance from the location of the component. Thus, if a user walks to another facility component, the view will update to show any status or alert associated with the new component. The data presented in the view can be output to the smart watch 185 from the mobile device 180. Any alerts or status messages that can be viewed on the mobile device 180 or the AR device 110 can be pushed to the smart watch 185 for display. Results calculated by analytics (e.g., located on the mobile device 180 or the server 120) can also be pushed to the smart watch 185 for display. For example, facility health may be determined from factors such as frequency of rounds performed, available documentation, redundancy of critical systems, number of recent alerts, outstanding maintenance tickets, and predictions made by the analytics. Other results that may be pushed to the watch for display include a rounds alert summary, equipment capacity trending, and critical equipment status.
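
As a non-limiting illustration, the proximity check that drives the watch views of FIGS. 15 and 16 could resemble the following Python sketch (hypothetical helper; positions are treated as planar coordinates in meters for simplicity):

```python
import math

def nearest_component(user_pos, components, max_distance):
    """Return the facility component whose status should be shown on the watch,
    if the user is within the predefined distance of it."""
    best, best_d = None, max_distance
    for comp in components:
        d = math.dist(user_pos, comp["pos"])
        if d <= best_d:
            best, best_d = comp, d
    return best

components = [
    {"name": "UPS System A", "pos": (10.0, 4.0), "status": "powered off"},
    {"name": "Boiler 1",     "pos": (40.0, 9.0), "status": "normal"},
]
comp = nearest_component((11.0, 5.0), components, max_distance=5.0)
print(comp["name"], "-", comp["status"])  # UPS System A - powered off
```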

In an exemplary embodiment, the AR device 110 is used to control a drone 190 (e.g., a remote controlled aircraft (RCA) or unmanned aircraft system (UAS)) or a robot 195. The RCA may be controlled with a handheld radio transmitter, which communicates with a receiver aboard the aircraft or the robot 195. In an exemplary embodiment, the transmitter is embedded within the AR device 110 (e.g., within the case) to enable a user wearing the AR device 110 to control the RCA or the robot. The receiver directs the aircraft's servos to move the control surfaces based on pilot input. The UAS may correspond to a quadrocopter (e.g., FIG. 17) that includes a frame, a motor, an electronic speed control (ESC), a flight control board, a radio transmitter and receiver, a battery, and a charger. In an exemplary embodiment of the invention, the UAS or the robot 195 is further modified to include at least one of an infra-red sensor, a camera, an ambient temperature sensor, a smoke detector, a GPS, a gyroscope, and an accelerometer. A user wearing the AR device 110 can control the camera of the drone to take pictures and videos of rooftop system components and control the IR sensor to take IR scans of equipment to detect overheating/hot spots. The pictures taken by the drone 190 or the robot 195 can be presented in the augmented view 205-1. The radio transmitter of the drone 190 or the robot can transmit the images, the IR scans, and temperature data captured by the temperature sensor to the AR device 110, the server 120, the mobile device 180, or the watch 185. The temperature sensor can be used to analyze data center cooling efficiency and identify hot spots. Large facilities covering an area of one million square feet or more can utilize the drone 190 to monitor temperature and equipment throughout the data center with a reduced need for on-site staff.

In another embodiment, the drone 190 is land based (e.g., an unmanned ground system (UGS)). The UGS may be remotely controlled by the AR device 110 or the mobile device 180. A UGS may be used to assist human operation in environments in which it is not suitable or feasible for personnel to work. The UGS may include a temperature sensor and/or a forward looking infrared (FLIR) sensor synced to a user interface of the mobile device 180 or the AR device 110 to provide real time data readings. The sensors can be used to detect hot spots on wire connections and larger scale hot spots such as server racks and equipment. Use of the UGS eliminates arc flash hazard exposure by removing personnel from close proximity to energized equipment. The UGS has the ability to take photographs of equipment, and may be able to more precisely detect hot racks/equipment than the UAS. The UGS has the ability to monitor large areas in large data centers and relay current data to the AR device 110, the mobile device 180, the watch 185, or the server 120, for access by a user. The UGS allows the facility to reduce the manpower required for monitoring equipment, which allows facility engineering staff to focus on other operational tasks.

The IR monitoring that a drone 190 or the robot 195 performs can be used to identify hot spot locations in electrical and mechanical hardware before they cause equipment failure. A system for detecting infrared radiation using infrared sensors may include: i) an infrared source such as blackbody radiators, tungsten lamps, and silicon carbide; ii) a transmission medium used for infrared transmission, which includes a vacuum, the atmosphere, and optical fibers; iii) optical components such as optical lenses made from quartz, CaF2, Ge, and Si, polyethylene Fresnel lenses, and Al or Au mirrors, used to converge or focus infrared radiation; and iv) an infrared detector for detecting the infrared radiation. The IR monitoring may be used as part of a predictive maintenance regime to identify potential failures and prevent them.

The temperature sensors can be used to detect temperature fluctuation in a data center environment to identify areas where cooling efficiency can be improved. The UGS can systematically scan each row in a data center following any work to check for changes in air flow patterns. The UGS can utilize spatial detection sensors and algorithms to autonomously scan entire rooms. The sensors allow cooling efficiency to be optimized, reducing energy consumption and maximizing the useful life of the equipment.

FIG. 18 illustrates an example of at least one robot being controlled by a wearable device (e.g., device 110, watch 185) or the mobile device 180. The circular objects depicted in the figures are the robots and the rectangular objects depicted in the figures are rows or columns of equipment that are spaced apart to create a path that is travelled by the robots. Each robot includes one or more sensors. The sensors may include a sensor to detect a biological or chemical agent, to detect radiation, to detect whether conductors are burning (e.g., smell or detect certain scents), to capture regular images (e.g., high resolution camera) or thermal images (e.g., a thermal imaging or infrared camera), to detect hot and cold spots (e.g., a temperature sensor), to detect noise frequencies that indicate a server shutdown (e.g., ultrasound sensor/instrumentation), a GPS locator to detect a location of the robot, etc.

Each robot may include a transmitter for transmitting all data collected by its sensors to the wearable device, the mobile device 180, or the central server 120. A physical cable may be connected to a port of a robot and a port of the wearable device or the mobile device 180 to enable sensor data to be downloaded from the robot to the wearable device or to the mobile device 180. The wearable device or the mobile device 180 may be used to control the robots to move to a particular location within the facility or to provide an instruction to the robots so that they can carry out their duties (e.g., capture sensor data, move to various locations) in an autonomous fashion. Images or video captured by at least one of the robots may be presented in the augmented view 205-1 so that a user can see what the robot(s) see in real-time. A digitally enhanced floor plan using a 4dScape technology may be presented in the augmented view 205-1 based on the location sensed by the GPS of the robot. For example, the robot can use its GPS to detect its current location and send that location to the wearable device, the mobile device 180, or the server 120, and the wearable device, the mobile device 180, or the server 120 then retrieves and presents the digitally enhanced floor plan that corresponds to the location.

The digitally enhanced floor plan may include three dimensional graphics representing the equipment known to be present in the room of the location, and textual information identifying the equipment and warnings based on data sensed by the robots (e.g., a hot spot, excessive radiation, etc.).
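
A non-limiting sketch of the location-to-floor-plan flow described above (the names, coordinates, and room lookup are hypothetical; the floor plan retrieval would be backed by the facility's stored documentation):

```python
FLOOR_PLANS = {
    "room-101": "plans/room-101-enhanced.plan",  # digitally enhanced floor plan files
    "room-102": "plans/room-102-enhanced.plan",
}

def room_for_location(lat, lon):
    """Map a GPS fix reported by the robot to a facility room (simplified lookup)."""
    return "room-101" if lat < 40.7440 else "room-102"

def on_robot_location(lat, lon, warnings):
    """Retrieve the floor plan for the robot's room and attach sensed warnings."""
    room = room_for_location(lat, lon)
    return {"plan": FLOOR_PLANS[room], "room": room, "warnings": warnings}

view = on_robot_location(40.7436, -73.4890, warnings=["hot spot near rack 12"])
print(view)
```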

In an exemplary embodiment, each robot is configured to respond to voice commands. The voice commands may be spoken into a microphone of the robot when a user is near the robot, or transmitted by the wearable device, the mobile device 180, or the server 120 to the robot when a voice command is spoken into a microphone of the sending device.

In an exemplary embodiment, each robot includes a touch screen for entering commands to control the robot, or a touch screen of the mobile device 180 or the watch 185 is used to enter commands to control the robot.

In an exemplary embodiment, a robot includes an extendible and/or rotatable extension (e.g., an arm or leg) that can be used to remotely turn on/off equipment in the facility or make equipment adjustments. The extension can be a rod that protrudes some distance away from the robot and is oriented at a certain angle, where the distance and the angle can be adjusted remotely by the wearable device, the mobile device 180, or the server 120 using instructions transmitted from the wearable device or the mobile device 180 to the robot. The extension may be attached to the robot by a ball joint to enable the extension to be oriented to various different angles. A motor within the robot may be used to extend or retract the extension to change the length of the extension and to adjust the angle of the extension. The extension can be used to reach places that are difficult or dangerous to reach, such as for closing a steam valve. The robots may also be controlled globally from a central command center (e.g., the central server 120). The robots of a given facility may include robots of different types. For example, in one facility, the robots may include one set of robots for normal monitoring and another set of robots for repairing systems or shutting down systems.
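
As a non-limiting illustration, a command instructing the robot to adjust its extension could resemble the following (the message format and limits are hypothetical):

```python
def extension_command(length_cm, angle_deg):
    """Build a command instructing the robot to set its extension's length and angle."""
    if not (0 <= length_cm <= 100):
        raise ValueError("length out of range")
    if not (-90 <= angle_deg <= 90):
        raise ValueError("angle out of range")
    return {"type": "extension", "length_cm": length_cm, "angle_deg": angle_deg}

# e.g., extend 60 cm at 30 degrees to reach and close a steam valve
print(extension_command(60, 30))
```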

The AR device 110, the mobile device 180, or the watch 185 may include the computer system shown in FIG. 19. The computer system, referred to generally as system 1000, may include, for example, a central processing unit (CPU) 1001, random access memory (RAM) 1004, a printer interface 1010, a display unit 1011, a local area network (LAN) data transmission controller 1005, a LAN interface 1006, a network controller 1003, an internal bus 1002, and one or more input devices 1009, for example, a keyboard, a mouse, etc. As shown, the system 1000 may be connected to a data storage device 1008, for example, a hard disk, via a link 1007.

Please note that while a particular graphical user interface (GUI) 700 is described above and illustrated in the figures, embodiments of the inventive concept are not limited thereto, as the GUI 700 may be changed in various ways. For example, one or more of the illustrated modes may be omitted, additional modes may be present, selection of modes may be accomplished in a different manner from that illustrated, and different interactive or descriptive graphical elements may be used from those illustrated (e.g., labels may have different text; buttons may have different sizes, shapes, colors, etc.; text fields may be replaced with drop down menus, lists, etc.). Furthermore, the interface may not be a graphical user interface. The interface may be non-graphical and rely on other means of interaction (e.g., voice control).

Exemplary embodiments described herein are illustrative, and many variations can be introduced without departing from the spirit of the disclosure or from the scope of the appended claims. For example, elements and/or features of different exemplary embodiments may be combined with each other and/or substituted for each other within the scope of this disclosure and appended claims.

Claims

1. A wearable device configured to guide a user to perform a procedure in a facility, the device comprising:

a wearable element;
a display area;
a sensor; and
a controller configured to control the sensor to capture sensor data, identify equipment from the sensor data, and present a first image on the display area, the first image including information indicating how to perform a current step of a procedure associated with the determined equipment,
wherein the controller comprises a transceiver that enables the controller to wirelessly receive the information from a remote device.

2. The wearable device of claim 1, wherein the wearable element is one of a band or an eyeglass frame.

3. The wearable device of claim 1, further comprising a projector, where the display area includes a prism and the controller is configured to project the first image onto the prism.

4. The wearable device of claim 1, wherein the remote device is configured to download the entire procedure from a central server and the current step upon the remote device establishing a connection to a network attached to the central server.

5. The wearable device of claim 1, wherein the sensor is a camera, the sensor data is a second image captured by the camera, and the equipment is identified from the second image.

6. The wearable device of claim 5, wherein the second image includes a bar code and the controller determines a code from the bar code and uses the code to identify the equipment.

7. The wearable device of claim 5, wherein the controller sends the second image to the remote device, the remote device performs image recognition on the second image to detect an object, and the remote device compares the detected objects against pre-stored images to identify the equipment.

8. The wearable device of claim 1, wherein the sensor includes a radio frequency identification (RFID) reader, the sensor data is RFID data, the controller determines a code from the RFID data and uses the code to identify the equipment.

9. The wearable device of claim 1, wherein the sensor comprises a plurality of infrared (IR) cameras, the IR cameras scan geometries to retrieve metadata, the controller sends the metadata to the remote device, and the remote device uses the metadata to identify the equipment.

10. The wearable device of claim 1, wherein the first image includes text of the current step and text identifying a room the equipment is housed within.

11. The wearable device of claim 1, wherein the wearable element includes a physical button and the controller determines that the current step has been completed in response to a user depressing the physical button.

12. The wearable device of claim 1, wherein the wearable element includes a physical button and the controller sends a message to the remote device indicating the current step has been completed in response to a user depressing the physical button.

13. The wearable device of claim 1, wherein the wearable element includes a physical button and the controller presents a second image on the display area including information indicating how to perform a next step of the procedure in response to a user depressing the physical button.

14. The wearable device of claim 1, wherein the first image is transparent and overlays the equipment.

15. The wearable device of claim 1, wherein the equipment comprises a plurality of components and the first image is disposed closest to one of the components that is part of the current step.

16. A wearable device configured to guide a user safely through a facility, the device comprising:

a wearable element;
a display area;
a sensor; and
a controller comprising a transceiver that enables the controller to wirelessly receive information from a remote device,
wherein the controller is configured to control the sensor to capture sensor data, identify equipment from the sensor data and the received information, and present an image on the display area representing a safe path through the equipment using the sensor data and the received information.

17. The wearable device of claim 16, wherein the path comprises at least one line that is spaced a distance away from the equipment.

18. The wearable device of claim 17, wherein the distance is based on a range of an arc flash predicted for the equipment.

19. The wearable device of claim 17, wherein the controller dynamically adjusts a position of the line according to a distance determined between the user and the equipment using the sensor data.

20. The wearable device of claim 17, wherein the path comprises a plurality of different adjacent sub-paths, where each sub-path represents a different amount of energy the user would be exposed to when the arc flash occurs.

21. The wearable device of claim 16, further comprising a vibration motor that vibrates when the controller determines that the user is approaching a boundary of the path.

22. A wearable device configured to manage a facility using a remotely controllable device, the device comprising:

a wearable element;
a display area; and
a controller configured to wirelessly control the remotely controllable device to capture sensor data, identify equipment from the sensor data, determine whether the equipment is malfunctioning using the sensor data, and present an image on the display area representing the equipment and providing information indicating whether the equipment is malfunctioning.

23. The wearable device of claim 22, wherein the controller is configured to receive temperature data from the remotely controllable device to determine whether a temperature of the equipment is abnormal.

24. The wearable device of claim 23, wherein the temperature data comprises infrared scans of the equipment performed by the remotely controllable device.

25. The wearable device of claim 22, wherein the remotely controllable device includes a rotatable or extendible extension and the controller enables a user to transmit a command to the remotely controllable device to rotate or extend the extension.

Patent History
Publication number: 20160035246
Type: Application
Filed: Jul 29, 2015
Publication Date: Feb 4, 2016
Inventor: PETER M. CURTIS (Bethpage, NY)
Application Number: 14/812,712
Classifications
International Classification: G09B 19/00 (20060101); G09B 19/24 (20060101); G08B 7/06 (20060101);