METHODS OF SENSOR EXPOSURE VALIDATION
Implementations of the disclosed subject matter may provide a method that includes emitting, using a light source of a mobile robot, ultraviolet (UV) light to disinfect at least a portion of an area, including at least one of air, a surface, and an object, where the UV light may be emitted onto one or more exposure sensors disposed in the area. A processor of the mobile robot may plot a representation of the emission of the UV light onto a map of the area to generate an exposure plot, where the representation is of the UV light emitted on at least one of the air, the surface, the object, and the one or more exposure sensors. The mobile robot may receive images of the one or more exposure sensors and/or disinfection levels measured by the one or more exposure sensors. The method may include generating a validation report that includes the exposure plot, the images of the exposure sensors, and/or the disinfection levels measured by the one or more exposure sensors.
This application claims priority to U.S. Application Ser. No. 63/254,891, filed Oct. 12, 2021, the disclosure of which is incorporated by reference in its entirety.
BACKGROUND
Mobile devices, such as mobile robots, can be operated to disinfect areas of a room, such as a floor, walls, furniture, and the like, that have surfaces contaminated with pathogens. Typically, it is difficult to determine whether such mobile devices have disinfected all contaminated surfaces, or whether the disinfection has been effective.
BRIEF SUMMARY
According to an implementation of the disclosed subject matter, a method may include emitting, using a light source of a mobile robot, ultraviolet (UV) light to disinfect at least a portion of an area, including at least one of air, a surface, and an object, where the UV light may be emitted onto one or more exposure sensors disposed in the area. A processor of the mobile robot may plot a representation of the emission of the UV light onto a map of the area to generate an exposure plot, where the representation is of the UV light emitted on at least one of the air, the surface, the object, and the one or more exposure sensors in the area. The mobile robot may receive at least one of images of the one or more exposure sensors and/or disinfection levels measured by the one or more exposure sensors. The method may include generating, at the mobile robot and/or a remote processor, a validation report that includes the exposure plot and at least one of the images of the one or more exposure sensors and/or the disinfection levels measured by the one or more exposure sensors.
Additional features, advantages, and implementations of the disclosed subject matter may be set forth or apparent from consideration of the following detailed description, drawings, and claims. Moreover, it is to be understood that both the foregoing summary and the following detailed description are illustrative and are intended to provide further explanation without limiting the scope of the claims.
The accompanying drawings, which are included to provide a further understanding of the disclosed subject matter, are incorporated in and constitute a part of this specification. The drawings also illustrate implementations of the disclosed subject matter and together with the detailed description serve to explain the principles of implementations of the disclosed subject matter. No attempt is made to show structural details in more detail than may be necessary for a fundamental understanding of the disclosed subject matter and various ways in which it may be practiced.
Implementations of the disclosed subject matter provide a mobile robot to perform disinfection of an area using ultraviolet (UV) light that is emitted by a light source disposed on the mobile robot, and to generate a disinfection exposure map of the portions of the area that have been disinfected. The disinfection by the mobile robot of the area and/or the disinfection exposure map may be validated by using one or more exposure sensors (e.g., UV light dosimeter) disposed in the area, which may change color or output a dosage level when exposed to the UV light emitted by the mobile robot. That is, information from the one or more exposure sensors may be used to validate the disinfection and/or the disinfection exposure map, and may be used to generate a report regarding the disinfection of the area by the mobile robot. The validation may demonstrate that the area has been successfully disinfected, and is safe to use. For example, a mobile robot may be deployed in a hospital, medical office, dental office, pharmaceutical and/or medical device manufacturing facility, and/or other settings such as an office, a warehouse, a school, a store, or the like to disinfect an area within these facilities, and may generate a report that includes an exposure map of the area, and validation of the exposure based on the output of one or more exposure sensors that are disposed in the area.
In some implementations, one or more exposure sensors (e.g., UV light dosimeters) may be placed in an area. The exposure sensors may be disposed at disinfection positions of interest in the area. For example, the positions of interest may be selected based on the likelihood of contamination, such as from activity within the area that may result in contamination. The mobile robot may output UV light from a light source to disinfect the area. In some implementations, an operator may capture images of the exposure sensors with a digital camera or image capture device, where the images may show a change in color of the exposure sensor (e.g., the exposure sensor changes from a first color to a second color based on the dosage of UV light) or a dosage value that is output by the exposure sensor (e.g., on a display of the exposure sensor). In some implementations, the operator may manually record a state (e.g., when the exposure sensor changes color) or a value output by the exposure sensor, and provide the recorded states and/or values to the mobile robot via the user interface of the mobile robot. In some implementations, the operator may provide the values from the exposure sensors to the mobile robot via a user interface and/or a communications interface (e.g., where the operator may transmit the values from a user device to the mobile robot via a wired and/or wireless communications link). In some implementations, the exposure sensor may provide a state value (whether the exposure sensor has changed colors or not) or a dosage value via a wireless communications link to the mobile robot. The mobile robot and/or the remote processor may generate a report that includes an exposure map of the area, and validation of the exposure based on the assigned UV exposure category for the exposure sensors.
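The validation step described above can be sketched in a few lines: dosage values reported by the exposure sensors are compared against a required UV dose, and a simple per-sensor report is assembled. The function and field names and the 25 mJ/cm² threshold are illustrative assumptions, not values taken from the disclosure.

```python
# Hypothetical sketch of validating a disinfection run from sensor dosage
# values. The required dose is an assumed placeholder threshold.
REQUIRED_DOSE_MJ_PER_CM2 = 25.0

def build_validation_report(sensor_readings):
    """sensor_readings: dict mapping sensor id -> measured dose (mJ/cm^2)."""
    entries = []
    for sensor_id, dose in sorted(sensor_readings.items()):
        entries.append({
            "sensor": sensor_id,
            "dose": dose,
            "validated": dose >= REQUIRED_DOSE_MJ_PER_CM2,
        })
    return {
        "all_validated": all(e["validated"] for e in entries),
        "sensors": entries,
    }
```

A report of this shape could then be combined with the exposure plot, as the disclosure describes, so that each sensor position on the map carries a validated/not-validated flag.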
In some implementations, an operator may assign positions on the exposure map to individual exposure sensors. The mobile robot may locate the exposure sensors that are dispersed in the area based on the assigned positions on the map, and may capture images of the exposure sensors (e.g., a change in color of the exposure sensor, or a value output by the exposure sensor that indicates dosage) before exposure of at least a portion of the area to UV light that is output by a light source disposed on the robot. In some implementations, the exposure sensors may transmit their positional location to the mobile robot via a wireless communications link. The mobile robot may assign the position of the exposure sensors on the map based on the transmitted positional location information. The mobile robot may perform a disinfection of the area by emitting UV light, and may capture images of the exposure sensors and/or may receive state values and/or dosage values from the exposure sensors after performing disinfection by using the assigned positions of the exposure sensors. The mobile robot may assign an actual UV exposure category (e.g., dosage amount) based on the images of the exposure sensors captured after disinfection (e.g., based on the change on color of the exposure sensor or a value outputted by the exposure sensor after being radiated with UV light), and/or may assign the actual UV exposure category based on the state value and/or dosage value received from the exposure sensors via the wireless communications link. The mobile robot and/or a remote processor may generate a report that includes an exposure map of the area, and validation of the exposure based on the images of the exposure sensors and/or the values from the exposure sensors (e.g., the state values and/or dosage values) to validate the exposure and/or the exposure map.
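Assigning an "actual UV exposure category" from a reported dosage value, as described above, amounts to bucketing the dose into coarse bands. The category names and boundaries below are assumptions for illustration only; a real dosimeter's bands would come from its datasheet.

```python
# Illustrative mapping from a measured UV dose (mJ/cm^2) to a coarse
# exposure category. Band boundaries are assumed, not from the disclosure.
def assign_exposure_category(dose_mj_per_cm2):
    """Map a measured UV dose to an exposure category label."""
    if dose_mj_per_cm2 <= 0:
        return "none"
    if dose_mj_per_cm2 < 10.0:
        return "low"
    if dose_mj_per_cm2 < 25.0:
        return "medium"
    return "high"
```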
In some implementations, the mobile robot may identify the exposure sensors disposed within the area using sensors disposed on the mobile robot (e.g., at least one image sensor). The mobile robot may capture images of the exposure sensors (e.g., a change in color of the exposure sensor, or a value output by the exposure sensor that indicates a state value and/or dosage value) before exposure of at least a portion of the area to UV light that is output by a light source disposed on the robot. The mobile robot may perform a disinfection of the area by emitting UV light, and may capture images of the exposure sensors after performing disinfection. The mobile robot may assign an actual UV exposure category (e.g., dosage amount) based on the images of the exposure sensors captured after disinfection (e.g., by an image sensor of the mobile robot). The mobile robot and/or a remote processor may generate a report that includes an exposure map of the area, and validation of the exposure based on the images of the exposure sensors or the values from the exposure sensors to validate the exposure and/or the exposure map.
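One way to decide from before/after images whether a color-changing dosimeter has been exposed, per the description above, is to compare the average color of the sensor patch in each image. This is a hedged sketch under simplifying assumptions: a real system would first locate the sensor in the frame, and the color-distance threshold here is arbitrary.

```python
# Assumed sketch: reduce each image patch to a mean RGB value, then use a
# Euclidean color distance with an arbitrary threshold to decide "changed".
def mean_rgb(pixels):
    """pixels: list of (r, g, b) tuples covering the dosimeter patch."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def color_changed(before_pixels, after_pixels, threshold=40.0):
    """True if the patch color moved more than the assumed threshold."""
    b, a = mean_rgb(before_pixels), mean_rgb(after_pixels)
    distance = sum((bi - ai) ** 2 for bi, ai in zip(b, a)) ** 0.5
    return distance >= threshold
```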
In some implementations, the mobile robot may perform a disinfection of the area by emitting UV light, and may capture images of the exposure sensors after performing a disinfection operation. If the captured images of the exposure sensors do not show a UV exposure category of at least a predetermined amount, the mobile robot may perform additional disinfection of at least a portion of the area by emitting UV light from the light source. The mobile robot may capture images of the exposure sensors after performing the additional disinfection to determine whether the exposure sensors show the predetermined amount of a UV exposure category. The mobile robot and/or a remote processor may generate a report that includes an exposure map of the area, and validation of the exposure based on the images of the exposure sensors or the values (e.g., a state value and/or dosage value) from the exposure sensors to validate the exposure and/or the exposure map.
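The re-disinfection loop described above can be sketched as: run a disinfection pass, read the sensors, and repeat until every sensor reaches the predetermined exposure level or a pass limit is hit. The callback interface, dose units, and pass limit are assumptions for illustration.

```python
# Minimal sketch of the "disinfect until validated" loop. read_sensors and
# run_pass are assumed callables standing in for the robot's real I/O.
def disinfect_until_validated(read_sensors, run_pass, required_dose, max_passes=3):
    """read_sensors() -> dict of sensor id to dose; run_pass() does one pass.
    Returns the number of passes needed, or None if the limit is reached."""
    for passes in range(1, max_passes + 1):
        run_pass()
        doses = read_sensors()
        if all(d >= required_dose for d in doses.values()):
            return passes
    return None
```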
For example, the exposure sensor may be a UV light dosimeter which may change color based on the dose of UV radiation that is delivered to the exposure sensor. That is, the exposure sensor may change from a first color to a second color when the dosage of UV light received by the exposure sensor is greater than or equal to a predetermined amount. In another example, the exposure sensor may display a dosage value of UV radiation that the sensor has received. In yet another example, the exposure sensor may transmit a dosage value and/or a state value (e.g., where the state value represents a color of the exposure sensor) via a wireless communications link to the mobile robot. In some implementations, an operator may place the one or more exposure sensors in the area at relevant disinfection positions (e.g., surfaces and/or areas that may be most likely to be contaminated, such as shown by exposure sensors 320 in
In some implementations, positions of the one or more exposure sensors may be assigned on an electronic map of the area, which may be stored in memory 118 and/or fixed storage 120 shown in
At operation 14, the processor of the mobile robot (e.g., controller 114 shown in
An example of the exposure plot (e.g., exposure plot 350) is shown in
At operation 16, the mobile robot may receive at least one of images of the one or more exposure sensors (e.g., exposure sensors 320 shown in
At operation 18, the mobile robot and/or a remote processor (e.g., server 140 and/or remote platform 160 shown in
In some implementations, the mobile robot (e.g., controller 114 shown in
In some implementations, the mobile robot or a remote processor (e.g., operator device 200, server 140, and/or remote platform 160 shown in
In some implementations, the mobile robot may receive images of the one or more exposure sensors (e.g., exposure sensors 320), and/or disinfection levels (e.g., state values and/or dosage values) measured by the one or more exposure sensors before and after the emission of UV light in the area. The images and/or disinfection levels (e.g., state values and/or dosage values) may be provided by a device of the operator (e.g., operator device 200 shown in
In some implementations, the mobile robot or a remote processor (e.g., operator device 200, server 140, and/or remote platform 160 shown in
At operation 54, an image sensor (e.g., sensor 102, 102a, 102b shown in
At operation 58, the mobile robot may move or stop along the disinfection route and may output UV light to disinfect the area. The mobile robot may move or stop autonomously along the disinfection route, and/or may receive control signals via a communications interface (e.g., network interface 116 shown in
At operation 62, the image sensor (e.g., sensor 102, 102a, 102b shown in
At operation 66, the mobile robot and/or server system (e.g., server 140 and/or the remote platform 160) may receive images of the one or more exposure sensors (e.g., UV light dosimeters), receive state values of the one or more exposure sensors, and/or receive dosage values of the one or more exposure sensors before, during, and/or after the emission of the UV light. The images may be received, and/or the state values and/or dosage values may be received, from the operator device 200 via a communications link. Alternatively, the mobile robot may capture images and/or receive the state values and/or dosage values before, during, and/or after the emission of UV light, and may retrieve the captured images, the state values, and/or dosage values from memory (e.g., memory 118 and/or fixed storage 120 shown in
At operation 68, the mobile robot and/or server system (e.g., server 140 and/or remote platform 160 shown in
The mobile robot may reduce human error in disinfecting an area, room, building, or the like by tracking the location and/or intensity (e.g., optical power of UV light) of light radiated, and determining which areas may need to be radiated and/or disinfected based on the positions of the exposure sensors. The representation of the emission of the UV light may be based on a light emission model used by the processor, where the modeled intensity of the UV light falls off with the square of the distance from the light source. The exposure plot (e.g., exposure plot 350 shown in
In some implementations, the method 10 or the method 50 may include transmitting a disinfection report using a communications interface of the mobile robot (e.g., network interface 116 of the mobile robot 100 shown in
As shown in the disinfection report 400 of
The at least one first sensor 102 (including sensors 102a, 102b shown in
In some implementations, the at least one first sensor 102 may have a field of view of 70 degrees diagonally. The at least one sensor 102 may have a detection distance of 0.2-4 meters. As shown in
The at least one first sensor 102 may include a first side sensor disposed on a first side of the mobile robot 100 and a second side sensor that may be disposed on a second side of the mobile robot 100. For example, as shown in
The light source 104 may be one or more bulbs, one or more lamps, and/or an array of light emitting diodes (LEDs) or organic light emitting diodes (OLEDs) to emit UV light (e.g., light having a wavelength of 10 nm-400 nm). The intensity (i.e., optical power output) may be controlled by the controller 114, which may also turn on or off a portion or all of the devices (e.g., bulbs, lamps, LEDs, OLEDs) of the light source 104. The light source may be controlled to emit UV light when the mobile robot is within an area, as the mobile robot moves within the area, before the mapping of the area, during the mapping of the area, and/or after the mapping of the area.
The at least one second sensor 106 may be communicatively coupled to the controller 114 shown in
In some implementations, the sensor 102, 106 may be a time-of-flight sensor, an ultrasonic sensor, a two-dimensional (2D) Light Detection and Ranging (LiDAR) sensor, a three-dimensional (3D) LiDAR sensor, a radar (radio detection and ranging) sensor, a stereo vision sensor, a 3D camera, a structured light camera, or the like. The sensor 106 may have a field of view of 20-27 degrees. In some implementations, the sensor 106 may have a detection distance of 0.05-4 meters.
The mobile robot 100 may include a motor to drive the drive system 108 to move the mobile robot in an area, such as a room, a building, or the like. The drive system 108 may include wheels, which may be adjustable so that the drive system 108 may control the direction of the mobile robot 100.
In some implementations, the mobile robot 100 may include a base with the drive system 108, and the sensor 102, 106 may be disposed on the base.
The controller 114 may control and/or operate the mobile robot 100 in an operation mode which may be a manual mode, an autonomous mode, and/or a tele-operation mode. In the manual mode, the controller 114 may receive one or more control signals from the user interface 110 and/or the stop button 112. For example, a user may control the movement and/or direction of the mobile robot 100, and/or stop its motion, by making one or more selections on the user interface 110. The stop button 112 may be an emergency stop (ESTOP) button which may stop all operations and/or movement of the mobile robot 100 when selected. In some implementations, the controller 114 may receive at least one control signal via a network interface 116 (shown in
In some implementations, when the mobile robot 100 is moving in a direction, the sensor 102, 106 may detect a geometry of one or more surfaces (e.g., surfaces 300, 302, 304 shown in
When detecting the surface and/or object, the sensor 102, 106 may be a time-of-flight (TOF) sensor. At least one photon of light may be output by the sensor 102, 106, and may be transmitted through the air. When the at least one photon of light radiates a surface and/or an object, a portion of the light may be reflected by the surface and/or the object and may return to a receiver portion of the sensor 102, 106. The sensor 102, 106 may calculate the time between sending the at least one photon of light and receiving the reflection, multiply this value by the speed of light in air, and divide by two to account for the round trip, to determine the distance between the sensor 102, 106 and the surface and/or object. This may be used to generate the map of the area that the mobile robot is operating within.
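The time-of-flight calculation described above reduces to a one-line formula: the measured round-trip time is multiplied by the speed of light and halved, since the photon travels to the surface and back.

```python
# Time-of-flight range calculation: distance = (round-trip time * c) / 2.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0  # approximately the speed of light in air

def tof_distance_m(round_trip_time_s):
    """Distance to a surface from a round-trip photon travel time."""
    return round_trip_time_s * SPEED_OF_LIGHT_M_PER_S / 2.0
```

For example, a 20 ns round trip corresponds to a surface roughly 3 m away, consistent with the stated 0.2-4 meter detection range.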
An exposure calculation model may determine an intensity and/or duration of UV light to emit from the light source 104 to disinfect air, surfaces, and/or objects within a room and/or area.
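An exposure calculation model of the kind described above can be sketched with an inverse-square falloff of irradiance with distance: given a lamp's irradiance at a reference distance, the model estimates irradiance at the target and the duration needed to deliver a target dose. The reference irradiance value and target dose in the example are illustrative assumptions, not parameters from the disclosure.

```python
# Sketch of an exposure calculation model assuming inverse-square falloff
# from a point-like source. Units: irradiance in mW/cm^2, dose in mJ/cm^2,
# so dose / irradiance yields seconds.
def irradiance_at(distance_m, intensity_at_1m_mw_per_cm2):
    """Irradiance falls with the square of the distance from the source."""
    return intensity_at_1m_mw_per_cm2 / (distance_m ** 2)

def required_duration_s(target_dose_mj_per_cm2, distance_m, intensity_at_1m_mw_per_cm2):
    """Seconds of exposure needed to deliver the target dose at a distance."""
    return target_dose_mj_per_cm2 / irradiance_at(distance_m, intensity_at_1m_mw_per_cm2)
```

For instance, a source delivering an assumed 100 mW/cm² at 1 m delivers 25 mW/cm² at 2 m, so a 50 mJ/cm² dose would take 2 seconds at that distance.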
The bus 122 allows data communication between the controller 114 and one or more memory components, which may include RAM, ROM, and other memory, as previously noted. Typically, RAM is the main memory into which an operating system and application programs are loaded. A ROM or flash memory component can contain, among other code, the Basic Input-Output system (BIOS) which controls basic hardware operation such as the interaction with peripheral components. Applications resident with the mobile robot 100 are generally stored on and accessed via a computer readable medium (e.g., fixed storage 120), such as a solid-state drive, a hard disk drive, an optical drive, or other storage medium.
The network interface 116 may provide a direct connection to a remote server (e.g., server 140, database 150, and/or remote platform 160 shown in
Many other devices or components (not shown) may be connected in a similar manner. Conversely, all of the components shown in
More generally, various implementations of the presently disclosed subject matter may include or be embodied in the form of computer-implemented processes and apparatuses for practicing those processes. Implementations also may be embodied in the form of a computer program product having computer program code containing instructions embodied in non-transitory and/or tangible media, such as solid state drives, DVDs, CD-ROMs, hard drives, USB (universal serial bus) drives, or any other machine readable storage medium, such that when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing implementations of the disclosed subject matter. Implementations also may be embodied in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, such that when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing implementations of the disclosed subject matter. When implemented on a general-purpose microprocessor, the computer program code segments configure the microprocessor to create specific logic circuits.
In some configurations, a set of computer-readable instructions stored on a computer-readable storage medium may be implemented by a general-purpose processor, which may transform the general-purpose processor or a device containing the general-purpose processor into a special-purpose device configured to implement or carry out the instructions. Implementations may include using hardware that has a processor, such as a general-purpose microprocessor and/or an Application Specific Integrated Circuit (ASIC) that embodies all or part of the techniques according to implementations of the disclosed subject matter in hardware and/or firmware. The processor may be coupled to memory, such as RAM, ROM, flash memory, a hard disk or any other device capable of storing electronic information. The memory may store instructions adapted to be executed by the processor to perform the techniques according to implementations of the disclosed subject matter.
The foregoing description, for purpose of explanation, has been described with reference to specific implementations. However, the illustrative discussions above are not intended to be exhaustive or to limit implementations of the disclosed subject matter to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The implementations were chosen and described in order to explain the principles of implementations of the disclosed subject matter and their practical applications, to thereby enable others skilled in the art to utilize those implementations as well as various implementations with various modifications as may be suited to the particular use contemplated.
Claims
1. A method comprising:
- emitting, using a light source of a mobile robot, ultraviolet (UV) light to disinfect at least a portion of an area, including the at least one of air, a surface, and an object, wherein the UV light is emitted onto one or more exposure sensors disposed in the area;
- plotting, using a processor of the mobile robot, a representation of the emission of the UV light onto a map of the area to generate an exposure plot, wherein the representation is of the UV light emitted on at least one of the air, the surface, the object, and the one or more exposure sensors in the area;
- receiving, at the mobile robot, at least one from the group consisting of: images of the one or more exposure sensors, and disinfection levels measured by the one or more exposure sensors; and
- generating, using at least one selected from the group consisting of:
- (i) the mobile robot, and
- (ii) a remote processor,
- a validation report that includes the exposure plot and at least one selected from the group consisting of: the images of the one or more exposure sensors, and the disinfection levels measured by the one or more exposure sensors.
2. The method of claim 1, further comprising:
- controlling, at the processor of the mobile robot, a drive system of the mobile robot to move or stop the mobile robot by at least one selected from the group consisting of: autonomously moving or stopping the mobile robot, and moving or stopping the mobile robot by control signals received via a communications interface, along a disinfection route in the area.
3. The method of claim 2, wherein the emitting the UV light to disinfect at least a portion of the area comprises:
- emitting UV light as the mobile robot moves along the disinfection route.
4. The method of claim 1, further comprising:
- identifying, at an image sensor or the processor of the mobile robot, the one or more exposure sensors in the area.
5. The method of claim 1, further comprising:
- capturing, at an image sensor of the mobile robot, a reference image of the one or more exposure sensors.
6. The method of claim 5, further comprising:
- storing, in a memory communicatively coupled to the mobile robot, the captured reference image.
7. The method of claim 1, further comprising:
- assigning, at the processor of the mobile robot, positions of the one or more exposure sensors on the map of the area.
8. The method of claim 1, wherein the disinfection levels of the one or more exposure sensors are received during at least one time selected from the group consisting of: before the emission of UV light in the area, during the emission of UV light in the area, and after the emission of UV light in the area.
9. The method of claim 1, further comprising:
- assigning, at the mobile robot, an actual exposure amount of the one or more exposure sensors to the UV light based on at least one of the received images and the disinfection levels; and
- determining, at the mobile robot or the remote processor, a future disinfection plan for the mobile robot based on a comparison between the exposure plot and at least one of the received images and the disinfection levels.
10. The method of claim 1, further comprising:
- determining, at the mobile robot, whether to continue to emit UV light based on the at least one of the images of the one or more exposure sensors and disinfection levels measured by the one or more exposure sensors; and
- determining, at the mobile robot or the remote processor, a future disinfection plan for the mobile robot based on a comparison between the exposure plot and the at least one of the images of the one or more exposure sensors and disinfection levels measured by the one or more exposure sensors.
11. The method of claim 1, wherein receiving the images of the one or more exposure sensors comprises:
- capturing, at an image sensor of the mobile robot, the images of the one or more exposure sensors.
12. The method of claim 1, further comprising:
- determining a delivered dose of UV light based on at least one selected from the group consisting of: the received images of the one or more exposure sensors, and the received disinfection levels.
13. The method of claim 1, further comprising:
- comparing, at the processor of the mobile robot or a server system communicatively coupled to the mobile robot, a determined delivered dose with the exposure plot;
- storing a result of the comparison in a memory communicatively coupled to at least one selected from the group consisting of: the mobile robot, and the server system; and
- determining, at the mobile robot or the remote processor, a future disinfection plan for the mobile robot based on the stored comparison.
14. The method of claim 13, wherein the generating the validation report is further based on the result of the comparison.
Type: Application
Filed: Oct 4, 2022
Publication Date: Apr 13, 2023
Applicant: Blue Ocean Robotics ApS (Odense)
Inventors: Rasmus Vistisen (Odense), John Erland Østergaard (Odense), Per Juul Nielsen (Odense), Efraim Vitzrabin (Odense)
Application Number: 17/959,411