System, Device, and Method For Stain Detection and Cleaning Using Robotic Cleaners
Systems, devices, and methods for stain detection and cleaning using robotic cleaners are disclosed. An example robotic cleaning system may include a robotic cleaner. The robotic cleaner may include a housing, at least one wet cleaning element including at least one cleaning surface on an underside of the housing, at least one sensor, and at least one processor. The processor(s) may be programmed or configured to detect at least one stain on a surface to be cleaned based on sensor data from the at least one sensor and perform at least one cleaning operation based on the at least one stain.
This application claims priority to U.S. Provisional Patent Application No. 63/535,365, filed Aug. 30, 2023, the disclosure of which is hereby incorporated by reference in its entirety.
BACKGROUND

Field

This disclosure relates generally to automated cleaning apparatuses and, in some non-limiting embodiments or aspects, to systems, devices, and methods for stain detection and cleaning using robotic cleaners.
Technical Considerations

Certain robotic cleaners may be equipped for wet and/or dry cleaning operations. For example, convertible robotic cleaners may be equipped to convert from operating in a wet mode (e.g., mopping) to a dry mode (e.g., vacuuming), such as through the addition or removal of components useful for a given mode. Some robotic cleaners equipped for wet cleaning operations may use a wet cleaning element (e.g., a mop pad).
However, it can be difficult for a robotic cleaner to clean (e.g., remove) stains effectively. For example, if the path and/or cleaning operations of the robotic cleaner are not adjusted to target the location and/or type of stain, then such a stain may not be adequately cleaned.
There is a need in the art for a technical solution to enable automatic detection and cleaning of stains using a robotic cleaner (e.g., wet cleaning elements thereof, such as mop pads and/or the like).
SUMMARY

According to some non-limiting embodiments or aspects, provided are systems, devices, and methods for stain detection and cleaning using robotic cleaners, e.g., that overcome some or all of the deficiencies identified above.
According to non-limiting embodiments or aspects, provided is a robotic cleaning system. An example robotic cleaning system may include a robotic cleaner. The robotic cleaner may include a housing, at least one wet cleaning element including at least one cleaning surface on an underside of the housing, at least one sensor, and at least one processor. The processor(s) may be programmed or configured to detect at least one stain on a surface to be cleaned based on sensor data from the at least one sensor and perform at least one cleaning operation based on the at least one stain.
In some non-limiting embodiments or aspects, the system may further include a docking station, which may include a base and/or a support extending laterally from the base. In some non-limiting embodiments or aspects, the support may be configured to receive at least a portion of the robotic cleaner on top of the support.
In some non-limiting embodiments or aspects, the at least one sensor may include at least one of a camera; a color camera; a red, green, and blue (RGB) camera; a three-dimensional camera; a red, green, blue, and depth (RGB-D) camera; a light emitter; a light detector; an infrared (IR) camera; a spectrometer; an IR spectrometer; an image capture device; a LiDAR device; or any combination thereof.
In some non-limiting embodiments or aspects, the sensor data may include at least one of an image, a video, a color image, an RGB image, a three-dimensional image, an RGB-D image, an IR image, spectroscopy data, IR spectroscopy data, reflectance data, specular reflectance data, diffuse reflectance data, color data, texture data, or any combination thereof.
In some non-limiting embodiments or aspects, detecting the at least one stain may include detecting the at least one stain on the surface to be cleaned based on sensor data from the at least one sensor and at least one computer vision operation.
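For the purpose of illustration, the computer vision operation described above may be sketched as a simplified color-deviation check against a baseline floor color; the function name, threshold value, and pixel data below are hypothetical examples, not a definitive implementation.

```python
# Illustrative sketch only: flags candidate stain pixels whose color
# deviates from a baseline floor color, a simplified stand-in for the
# computer vision operation described above. All names are hypothetical.

def detect_stain_pixels(image, baseline, threshold=60):
    """Return (row, col) coordinates whose summed absolute per-channel
    deviation from the baseline floor color exceeds the threshold."""
    stains = []
    for r, row in enumerate(image):
        for c, pixel in enumerate(row):
            deviation = sum(abs(p - b) for p, b in zip(pixel, baseline))
            if deviation > threshold:
                stains.append((r, c))
    return stains

# Example: a 3x3 RGB frame with one dark spot on a light floor.
floor = (200, 200, 200)
frame = [
    [(200, 200, 200), (198, 201, 199), (200, 200, 200)],
    [(200, 200, 200), (90, 60, 40),    (200, 200, 200)],
    [(200, 200, 200), (200, 200, 200), (200, 200, 200)],
]
print(detect_stain_pixels(frame, floor))  # candidate stain at (1, 1)
```

In practice such a check may be one stage of a larger computer vision pipeline (e.g., combined with texture or reflectance data as described herein).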
In some non-limiting embodiments or aspects, performing the at least one cleaning operation may include performing at least one wet cleaning operation with the at least one cleaning surface of the at least one wet cleaning element.
In some non-limiting embodiments or aspects, the at least one processor may be further programmed or configured to select a cleaning operation from a plurality of predetermined cleaning operations based on the at least one stain and the surface to be cleaned. In some non-limiting embodiments or aspects, performing the at least one cleaning operation may include performing the cleaning operation selected from the plurality of predetermined cleaning operations.
In some non-limiting embodiments or aspects, selecting the cleaning operation may include selecting the cleaning operation from the plurality of predetermined cleaning operations based on at least one of a stain type of the at least one stain, a surface type of the surface to be cleaned, or any combination thereof.
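For the purpose of illustration, selecting a cleaning operation from a plurality of predetermined cleaning operations keyed on stain type and surface type may be sketched as a lookup table; the stain types, surface types, and operation names below are hypothetical examples.

```python
# Illustrative sketch: a plurality of predetermined cleaning operations
# keyed on (stain type, surface type). Entries are hypothetical.

PREDETERMINED_OPERATIONS = {
    ("dried", "hardwood"): "soak_and_scrub",
    ("dried", "tile"):     "high_pressure_scrub",
    ("wet",   "hardwood"): "light_mop",
    ("wet",   "tile"):     "standard_mop",
}

def select_cleaning_operation(stain_type, surface_type, default="standard_mop"):
    """Select an operation for the detected stain/surface pair, falling
    back to a default operation for unrecognized combinations."""
    return PREDETERMINED_OPERATIONS.get((stain_type, surface_type), default)

print(select_cleaning_operation("dried", "tile"))  # high_pressure_scrub
print(select_cleaning_operation("wet", "carpet"))  # standard_mop (fallback)
```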
In some non-limiting embodiments or aspects, the robotic cleaner may be performing a first cleaning operation before detecting the at least one stain. The at least one cleaning operation may include a second cleaning operation, and performing the at least one cleaning operation based on the at least one stain may include interrupting the first cleaning operation and initiating the second cleaning operation.
In some non-limiting embodiments or aspects, the at least one processor may be further programmed or configured to communicate a notification based on detecting the at least one stain.
In some non-limiting embodiments or aspects, communicating the notification may include communicating the notification to a remote device of a user associated with the robotic cleaner.
In some non-limiting embodiments or aspects, the remote device may display a graphical user interface (GUI) based on the at least one stain.
In some non-limiting embodiments or aspects, the GUI may include a map of the surface to be cleaned. The map may include at least one stain icon based on the at least one stain.
In some non-limiting embodiments or aspects, the remote device may be programmed or configured to receive an input from the user indicating confirmation that the at least one stain is present and/or communicate a communication based on the input.
In some non-limiting embodiments or aspects, the at least one processor may be further programmed or configured to receive the communication from the remote device. In some non-limiting embodiments or aspects, performing the at least one cleaning operation may include performing the at least one cleaning operation based on the communication.
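For the purpose of illustration, performing the cleaning operation based on the communication from the remote device may be sketched as a simple dispatch on a confirmation message; the message fields and callback names below are hypothetical.

```python
# Illustrative sketch: gating the cleaning operation on a confirmation
# communication received from the user's remote device. The message
# structure and field names are hypothetical.

def handle_remote_communication(message, perform, skip):
    """Dispatch on a confirmation message from the remote device: clean
    the stain if the user confirmed it is present, otherwise skip it."""
    if message.get("confirmed"):
        return perform(message["stain_id"])
    return skip(message["stain_id"])

cleaned, skipped = [], []
handle_remote_communication(
    {"stain_id": 7, "confirmed": True},
    perform=cleaned.append,
    skip=skipped.append,
)
print(cleaned, skipped)  # [7] []
```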
According to non-limiting embodiments or aspects, provided is a method. An example method may include detecting (e.g., with at least one processor) at least one stain on a surface to be cleaned based on sensor data from at least one sensor of a robotic cleaner and/or controlling (e.g., with at least one processor) the robotic cleaner to perform at least one cleaning operation based on the at least one stain.
In some non-limiting embodiments or aspects, the at least one sensor may include at least one of a camera, a color camera, an RGB camera, a three-dimensional camera, an RGB-D camera, a light emitter, a light detector, an IR camera, a spectrometer, an IR spectrometer, an image capture device, a LiDAR device, or any combination thereof.
In some non-limiting embodiments or aspects, the sensor data may include at least one of an image, a video, a color image, an RGB image, a three-dimensional image, an RGB-D image, an IR image, spectroscopy data, IR spectroscopy data, reflectance data, specular reflectance data, diffuse reflectance data, color data, texture data, or any combination thereof.
In some non-limiting embodiments or aspects, detecting the at least one stain may include detecting the at least one stain on the surface to be cleaned based on sensor data from the at least one sensor and at least one computer vision operation.
In some non-limiting embodiments or aspects, performing the at least one cleaning operation may include performing at least one wet cleaning operation with at least one cleaning surface of at least one wet cleaning element of the robotic cleaner.
In some non-limiting embodiments or aspects, the method may further include selecting (e.g., with at least one processor) a cleaning operation from a plurality of predetermined cleaning operations based on the at least one stain and the surface to be cleaned. In some non-limiting embodiments or aspects, performing the at least one cleaning operation may include performing the cleaning operation selected from the plurality of predetermined cleaning operations.
In some non-limiting embodiments or aspects, selecting the cleaning operation may include selecting the cleaning operation from the plurality of predetermined cleaning operations based on at least one of a stain type of the at least one stain, a surface type of the surface to be cleaned, or any combination thereof.
In some non-limiting embodiments or aspects, the robotic cleaner may be performing a first cleaning operation before detecting the at least one stain. The at least one cleaning operation may include a second cleaning operation, and performing the at least one cleaning operation based on the at least one stain may include interrupting the first cleaning operation and initiating the second cleaning operation.
In some non-limiting embodiments or aspects, the method may further include communicating (e.g., with at least one processor) a notification based on detecting the at least one stain.
In some non-limiting embodiments or aspects, communicating the notification may include communicating the notification to a remote device of a user associated with the robotic cleaner.
In some non-limiting embodiments or aspects, the remote device may display a GUI based on the at least one stain.
In some non-limiting embodiments or aspects, the GUI may include a map of the surface to be cleaned. The map may include at least one stain icon based on the at least one stain.
In some non-limiting embodiments or aspects, the method may further include receiving, with the remote device, an input from the user indicating confirmation that the at least one stain is present and/or communicating, with the remote device, a communication based on the input.
In some non-limiting embodiments or aspects, the method may further include receiving (e.g., with at least one processor) the communication from the remote device. In some non-limiting embodiments or aspects, performing the at least one cleaning operation may include performing the at least one cleaning operation based on the communication.
According to non-limiting embodiments or aspects, provided is a computer program product. An example computer program product may include at least one non-transitory computer-readable medium including program instructions that, when executed by at least one processor, cause the at least one processor to perform any of the methods disclosed herein.
Further non-limiting embodiments or aspects will be set forth in the following numbered clauses:
Clause 1: A robotic cleaning system, comprising: a robotic cleaner comprising: a housing; at least one wet cleaning element comprising at least one cleaning surface on an underside of the housing; at least one sensor; and at least one processor programmed or configured to: detect at least one stain on a surface to be cleaned based on sensor data from the at least one sensor; and perform at least one cleaning operation based on the at least one stain.
Clause 2: The system of clause 1, further comprising: a docking station comprising: a base; and a support extending laterally from the base, the support configured to receive at least a portion of the robotic cleaner on top of the support.
Clause 3: The system of clause 1 or clause 2, wherein the at least one sensor comprises at least one of: a camera; a color camera; a red, green, and blue (RGB) camera; a three-dimensional camera; a red, green, blue, and depth (RGB-D) camera; a light emitter; a light detector; an infrared (IR) camera; a spectrometer; an IR spectrometer; an image capture device; a LiDAR device; or any combination thereof.
Clause 4: The system of any of clauses 1-3, wherein the sensor data comprises at least one of: an image; a video; a color image; a red, green, and blue (RGB) image; a three-dimensional image; a red, green, blue, and depth (RGB-D) image; an infrared (IR) image; spectroscopy data; IR spectroscopy data; reflectance data; specular reflectance data; diffuse reflectance data; color data; texture data; or any combination thereof.
Clause 5: The system of any of clauses 1-4, wherein detecting the at least one stain comprises detecting the at least one stain on the surface to be cleaned based on sensor data from the at least one sensor and at least one computer vision operation.
Clause 6: The system of any of clauses 1-5, wherein performing the at least one cleaning operation comprises performing at least one wet cleaning operation with the at least one cleaning surface of the at least one wet cleaning element.
Clause 7: The system of any of clauses 1-6, wherein the at least one processor is further programmed or configured to: select a cleaning operation from a plurality of predetermined cleaning operations based on the at least one stain and the surface to be cleaned, wherein performing the at least one cleaning operation comprises performing the cleaning operation selected from the plurality of predetermined cleaning operations.
Clause 8: The system of any of clauses 1-7, wherein selecting the cleaning operation comprises selecting the cleaning operation from the plurality of predetermined cleaning operations based on at least one of: a stain type of the at least one stain; a surface type of the surface to be cleaned; or any combination thereof.
Clause 9: The system of any of clauses 1-8, wherein the robotic cleaner is performing a first cleaning operation before detecting the at least one stain, wherein the at least one cleaning operation comprises a second cleaning operation, and wherein performing the at least one cleaning operation based on the at least one stain comprises interrupting the first cleaning operation and initiating the second cleaning operation.
Clause 10: The system of any of clauses 1-9, wherein the at least one processor is further programmed or configured to: communicate a notification based on detecting the at least one stain.
Clause 11: The system of any of clauses 1-10, wherein communicating the notification comprises communicating the notification to a remote device of a user associated with the robotic cleaner.
Clause 12: The system of any of clauses 1-11, wherein the remote device displays a graphical user interface (GUI) based on the at least one stain.
Clause 13: The system of any of clauses 1-12, wherein the GUI comprises a map of the surface to be cleaned, the map comprising at least one stain icon based on the at least one stain.
Clause 14: The system of any of clauses 1-13, wherein the remote device is programmed or configured to: receive an input from the user indicating confirmation that the at least one stain is present; and communicate a communication based on the input.
Clause 15: The system of any of clauses 1-14, wherein the at least one processor is further programmed or configured to: receive the communication from the remote device, wherein performing the at least one cleaning operation comprises performing the at least one cleaning operation based on the communication.
Clause 16: A method, comprising: detecting, with at least one processor, at least one stain on a surface to be cleaned based on sensor data from at least one sensor of a robotic cleaner; and controlling, with at least one processor, the robotic cleaner to perform at least one cleaning operation based on the at least one stain.
Clause 17: The method of clause 16, wherein the at least one sensor comprises at least one of: a camera; a color camera; a red, green, and blue (RGB) camera; a three-dimensional camera; a red, green, blue, and depth (RGB-D) camera; a light emitter; a light detector; an infrared (IR) camera; a spectrometer; an IR spectrometer; an image capture device; a LiDAR device; or any combination thereof.
Clause 18: The method of clause 16 or clause 17, wherein the sensor data comprises at least one of: an image; a video; a color image; a red, green, and blue (RGB) image; a three-dimensional image; a red, green, blue, and depth (RGB-D) image; an infrared (IR) image; spectroscopy data; IR spectroscopy data; reflectance data; specular reflectance data; diffuse reflectance data; color data; texture data; or any combination thereof.
Clause 19: The method of any of clauses 16-18, wherein detecting the at least one stain comprises detecting the at least one stain on the surface to be cleaned based on sensor data from the at least one sensor and at least one computer vision operation.
Clause 20: The method of any of clauses 16-19, wherein performing the at least one cleaning operation comprises performing at least one wet cleaning operation with at least one cleaning surface of at least one wet cleaning element of the robotic cleaner.
Clause 21: The method of any of clauses 16-20, further comprising: selecting, with at least one processor, a cleaning operation from a plurality of predetermined cleaning operations based on the at least one stain and the surface to be cleaned, wherein performing the at least one cleaning operation comprises performing the cleaning operation selected from the plurality of predetermined cleaning operations.
Clause 22: The method of any of clauses 16-21, wherein selecting the cleaning operation comprises selecting the cleaning operation from the plurality of predetermined cleaning operations based on at least one of: a stain type of the at least one stain; a surface type of the surface to be cleaned; or any combination thereof.
Clause 23: The method of any of clauses 16-22, wherein the robotic cleaner is performing a first cleaning operation before detecting the at least one stain, wherein the at least one cleaning operation comprises a second cleaning operation, and wherein performing the at least one cleaning operation based on the at least one stain comprises interrupting the first cleaning operation and initiating the second cleaning operation.
Clause 24: The method of any of clauses 16-23, further comprising: communicating, with at least one processor, a notification based on detecting the at least one stain.
Clause 25: The method of any of clauses 16-24, wherein communicating the notification comprises communicating the notification to a remote device of a user associated with the robotic cleaner.
Clause 26: The method of any of clauses 16-25, wherein the remote device displays a graphical user interface (GUI) based on the at least one stain.
Clause 27: The method of any of clauses 16-26, wherein the GUI comprises a map of the surface to be cleaned, the map comprising at least one stain icon based on the at least one stain.
Clause 28: The method of any of clauses 16-27, further comprising: receiving, with the remote device, an input from the user indicating confirmation that the at least one stain is present; and communicating, with the remote device, a communication based on the input.
Clause 29: The method of any of clauses 16-28, further comprising: receiving, with at least one processor, the communication from the remote device, wherein performing the at least one cleaning operation comprises performing the at least one cleaning operation based on the communication.
Clause 30: A computer program product comprising at least one non-transitory computer-readable medium including program instructions that, when executed by at least one processor, cause the at least one processor to perform the method of any of clauses 16-29.
These and other features and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structures and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the present disclosure. As used in the specification and the claims, the singular forms of “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
Additional advantages and details of the disclosure are explained in greater detail below with reference to the exemplary embodiments or aspects that are illustrated in the accompanying schematic figures, in which:
For purposes of the description hereinafter, the terms “upper”, “lower”, “right”, “left”, “vertical”, “horizontal”, “top”, “bottom”, “lateral”, “longitudinal,” and derivatives thereof shall relate to non-limiting embodiments or aspects as they are oriented in the drawing figures. However, it is to be understood that non-limiting embodiments or aspects may assume various alternative variations and step sequences, except where expressly specified to the contrary. It is also to be understood that the specific devices and processes illustrated in the attached drawings, and described in the following specification, are simply exemplary embodiments or aspects. Hence, specific dimensions and other physical characteristics related to the embodiments or aspects disclosed herein are not to be considered as limiting.
No aspect, component, element, structure, act, step, function, instruction, and/or the like used herein should be construed as critical or essential unless explicitly described as such. Also, as used herein, the articles “a” and “an” are intended to include one or more items and may be used interchangeably with “one or more” and “at least one.” Furthermore, as used herein, the term “set” is intended to include one or more items (e.g., related items, unrelated items, a combination of related and unrelated items, etc.) and may be used interchangeably with “one or more” or “at least one.” Where only one item is intended, the term “one” or similar language is used. Also, as used herein, the terms “has,” “have,” “having,” or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based at least partially on” unless explicitly stated otherwise. The phrase “based on” may also mean “in response to” where appropriate.
Some non-limiting embodiments or aspects are described herein in connection with thresholds. As used herein, satisfying a threshold may refer to a value being greater than the threshold, more than the threshold, higher than the threshold, greater than or equal to the threshold, less than the threshold, fewer than the threshold, lower than the threshold, less than or equal to the threshold, equal to the threshold, and/or the like.
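For the purpose of illustration, the several senses of "satisfying a threshold" described above may be sketched as a configurable comparison; the mode names below are hypothetical.

```python
# Illustrative sketch: satisfying a threshold may mean exceeding it,
# equaling it, or falling below it, depending on the comparison
# configured for a given value. Mode names are hypothetical.
import operator

COMPARISONS = {
    "greater_than":     operator.gt,
    "greater_or_equal": operator.ge,
    "less_than":        operator.lt,
    "less_or_equal":    operator.le,
    "equal":            operator.eq,
}

def satisfies_threshold(value, threshold, mode="greater_than"):
    """Return True if the value satisfies the threshold under the
    selected comparison mode."""
    return COMPARISONS[mode](value, threshold)

print(satisfies_threshold(0.8, 0.5))                    # True
print(satisfies_threshold(0.3, 0.5, mode="less_than"))  # True
```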
As used herein, the term “communication” may refer to the reception, receipt, transmission, transfer, provision, and/or the like of data (e.g., information, signals, messages, instructions, commands, and/or the like). For one unit (e.g., a device, a system, a component of a device or system, combinations thereof, and/or the like) to be in communication with another unit means that the one unit is able to directly or indirectly receive information from and/or transmit information to the other unit. This may refer to a direct or indirect connection (e.g., a direct communication connection, an indirect communication connection, and/or the like) that is wired and/or wireless in nature. Additionally, two units may be in communication with each other even though the information transmitted may be modified, processed, relayed, and/or routed between the first and second unit. For example, a first unit may be in communication with a second unit even though the first unit passively receives information and does not actively transmit information to the second unit. As another example, a first unit may be in communication with a second unit if at least one intermediary unit processes information received from the first unit and communicates the processed information to the second unit.
As used herein, the term “computing device” may refer to one or more electronic devices configured to process data. A computing device may, in some examples, include the necessary components to receive, process, and output data, such as a processor, a display, a memory, an input device, a network interface, and/or the like. A computing device may be a mobile device. As an example, a mobile device may include a cellular phone (e.g., a smartphone or standard cellular phone), a portable computer, a wearable device (e.g., watches, glasses, lenses, clothing, and/or the like), a personal digital assistant (PDA), and/or other like devices. A computing device may also be a desktop computer or other form of non-mobile computer.
As used herein, the term “server” may refer to or include one or more computing devices that are operated by or facilitate communication and processing for multiple parties in a network environment, such as the internet, although it will be appreciated that communication may be facilitated over one or more public or private network environments and that various other arrangements are possible. Further, multiple computing devices (e.g., servers, desktop computers, mobile devices, etc.) directly or indirectly communicating in the network environment may constitute a “system.” Reference to “a server” or “a processor,” as used herein, may refer to a previously recited server and/or processor that is recited as performing a previous step or function, a different server and/or processor, and/or a combination of servers and/or processors. For example, as used in the specification and the claims, a first server and/or a first processor that is recited as performing a first step or function may refer to the same or different server and/or a processor recited as performing a second step or function.
The systems, devices, and methods described herein provide numerous technical advantages in systems for stain detection and cleaning using robotic cleaners, including robotic cleaners equipped for wet mode operation.
Referring now to
In some non-limiting embodiments or aspects, docking station 100 may include base 102 forming a housing. Base 102 may include internal components for emptying and refilling robotic cleaner 300 when robotic cleaner 300 is docked with docking station 100. For example, base 102 may include an upper portion for housing a cleaning fluid tank 104, where new cleaning fluid may be loaded into cleaning fluid tank 104 for storage. New cleaning fluid may be delivered into robotic cleaner 300 from cleaning fluid tank 104 via cleaning fluid conduit 132. Additionally or alternatively, new cleaning fluid may be delivered to wash basin 127 from cleaning fluid tank 104 via internal cleaning fluid conduit 137. Cleaning fluid tank 104 may be removably coupled to base 102. Docking station 100 may include debris tank 106, which may be removably coupled to base 102 to allow debris tank 106 to be cleaned and emptied. Debris tank 106 may be configured to collect wet and/or dry debris collected by robotic cleaner 300 during a cleaning operation. Debris tank 106 may be filled with debris, at least partly, via docking station suction inlet 108.
Docking station 100 may include support 103 and/or suction housing 116, which may extend from support 103 to form the bottom of base 102. Suction housing 116 may enclose at least one suction motor used to create an inflow of air via docking station suction inlet 108. Suction housing 116 may further include an internal conduit to convey debris from robotic cleaner 300 to debris tank 106. Suction housing 116 may further define docking station suction inlet 108. Docking station suction inlet 108 may be configured to fluidly couple to at least a portion of robotic cleaner 300 such that at least a portion of debris stored within debris cup 308 of robotic cleaner 300 may be urged through docking station suction inlet 108 and into debris tank 106. For example, and as shown in
When robotic cleaner 300 seeks to recharge one or more batteries and/or empty debris cup 308 of robotic cleaner 300, robotic cleaner 300 may enter a docking mode. When in the docking mode, robotic cleaner 300 may approach docking station 100 in a manner that allows robotic cleaner 300 to electrically couple to charging contacts 110 and fluidly couple outlet port 316 of robotic cleaner 300 to docking station suction inlet 108. For the purpose of illustration, when in docking mode, robotic cleaner 300 may move to align itself relative to docking station 100, such that robotic cleaner 300 may become docked with docking station 100. For example, when in docking mode, robotic cleaner 300 may approach docking station 100 in a forward direction of travel until reaching a predetermined distance from docking station 100, stop at the predetermined distance and rotate approximately 180°, and proceed in a rearward direction of travel until robotic cleaner 300 docks with docking station 100.
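For the purpose of illustration, the docking sequence described above (approach in a forward direction, stop at a predetermined distance and rotate approximately 180°, then proceed rearward until docked) may be sketched as a simple state machine; the state names and stand-off distance below are hypothetical.

```python
# Illustrative sketch of the docking mode described above as a state
# machine. The state names and the predetermined stand-off distance
# are hypothetical examples.

DOCKING_DISTANCE_M = 0.5  # hypothetical predetermined distance

def next_docking_state(state, distance_to_dock_m, rotation_complete, docked):
    """Advance the docking state machine one step based on sensed inputs."""
    if state == "APPROACH" and distance_to_dock_m <= DOCKING_DISTANCE_M:
        return "ROTATE"    # stop at predetermined distance, begin ~180° turn
    if state == "ROTATE" and rotation_complete:
        return "REVERSE"   # proceed in a rearward direction of travel
    if state == "REVERSE" and docked:
        return "DOCKED"    # charging contacts and suction inlet engaged
    return state

state = "APPROACH"
state = next_docking_state(state, 0.4, False, False)
state = next_docking_state(state, 0.4, True, False)
state = next_docking_state(state, 0.0, True, True)
print(state)  # DOCKED
```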
As shown, docking station 100 may include a boot 109 that extends around docking station suction inlet 108. Boot 109 may be configured to engage debris cup 308, such that boot 109 extends around outlet port 316. Boot 109 may be resiliently deformable, such that boot 109 generally conforms to a shape of debris cup 308 of robotic cleaner 300. As such, boot 109 may be configured to sealingly engage debris cup 308. For example, boot 109 may be made of a natural or synthetic rubber, a foam, and/or any other resiliently deformable material. Boot 109 may define one or more ribs 118. Ribs 118 are configured to expand and/or compress in response to robotic cleaner 300 engaging boot 109, allowing boot 109 to deform to accommodate the form of debris cup 308.
In some non-limiting embodiments or aspects, when robotic cleaner 300 is engaging docking station 100 in a misaligned orientation, robotic cleaner 300 may be configured to pivot in place according to an oscillatory pattern. By pivoting in place, robotic cleaner 300 may cause outlet port 316 of robotic cleaner 300 to align with boot 109, such that outlet port 316 is fluidly coupled to docking station suction inlet 108.
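For the purpose of illustration, an oscillatory pivot pattern for correcting a misaligned orientation may be sketched as a sequence of alternating pivot angles of decreasing amplitude; the initial angle and decay schedule below are hypothetical.

```python
# Illustrative sketch: an oscillatory pivot-in-place pattern that
# alternates pivot direction with decaying amplitude until the outlet
# port aligns with the suction inlet. The amplitude schedule is a
# hypothetical example.

def oscillation_angles(initial_deg=10.0, decay=0.5, steps=4):
    """Yield alternating left/right pivot angles of decreasing size."""
    angle, sign = initial_deg, 1
    for _ in range(steps):
        yield sign * angle
        sign = -sign      # alternate pivot direction
        angle *= decay    # reduce amplitude each oscillation

print(list(oscillation_angles()))  # [10.0, -5.0, 2.5, -1.25]
```

In a real controller, the oscillation would typically terminate early once an alignment sensor indicates the outlet port is fluidly coupled to the suction inlet.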
In some non-limiting embodiments or aspects, base 102 and/or support 103 may define one or more stops configured to engage a portion of robotic cleaner 300 when robotic cleaner 300 is docking with docking station 100. For example, base 102 may define lower docking stops 112 and upper docking stops 124. One or more stops 112, 124 may be configured to prevent further movement of robotic cleaner 300 toward docking station 100 when robotic cleaner 300 is docking with docking station 100. In some non-limiting embodiments or aspects, upper docking stops 124 may define a guide surface having a taper. For example, a plurality of stops may be provided, each having a tapered guide surface such that engagement of robotic cleaner 300 with the guide surfaces urges robotic cleaner 300 towards an aligned orientation.
In some non-limiting embodiments or aspects, support 103 may define a ramp 122 to allow robotic cleaner 300 to travel onto support 103 from a surface to be cleaned (e.g., a floor). Ramp 122 may include a surface configured to be non-slip for at least one drive wheel 304 of robotic cleaner 300, such as a textured or coated surface. Support 103 may further define a guide surface 120 configured as an apron extending from base 102 toward ramp 122. Guide surface 120 may be configured to engage with an outer edge of at least one wet cleaning element 301 of robotic cleaner 300, creating a generally fluid-tight seal between the outer edge of at least one wet cleaning element 301 and support 103.
When robotic cleaner 300 is docked with docking station 100, at least one wet cleaning element 301 may form a cover over wash basin 127 of support 103. Wash basin 127 may define a cavity in support 103 that houses a shuttle rail 114 along which at least one cleaning shuttle 130 (e.g., including upward-facing sprayers, scrubbers, agitators, and/or the like) may translate back and forth (e.g., along shuttle rail 114) to clean at least one wet cleaning element 301 when robotic cleaner 300 is docked with docking station 100. Cleaning shuttle 130 may emit cleaning fluid from cleaning fluid tank 104 to clean at least one wet cleaning element 301. Used cleaning fluid, after being emitted from cleaning shuttle 130, may drip into wash basin 127 and be carried, by fluid flow and gravity, to wash basin drain 128. Wash basin drain 128 may be mechanically coupled with a pump (not shown) to empty wash basin 127 of used cleaning fluid. The used cleaning fluid may be carried, via a conduit (e.g., tubing), from wash basin drain 128 to debris tank 106. At least one detent 107 further holds robotic cleaner 300 in place, via at least one wet cleaning element 301, to prevent robotic cleaner 300 from becoming dislodged while cleaning shuttle 130 performs the cleaning process.
In some non-limiting embodiments or aspects, at least a portion of the shuttle assembly (e.g., including shuttle rail 114 and cleaning shuttle 130) may be mechanically coupled to the at least one detent 107. In such a manner, at least one detent 107 may be configured to retract at least partly into support 103 (e.g., detent housing 126) upon movement of the shuttle mechanism conveying cleaning shuttle 130 from a first position (e.g., directly underneath wet cleaning element 301) to a second position (e.g., tucked to the side of wash basin 127). For example, a connector (e.g., cable, armature, lever, etc.) may cause at least one detent 107 to retract when cleaning shuttle 130 has completed a cleaning cycle of at least one wet cleaning element 301 and moves from an operational position to a non-operational position.
Docking station 100 may include at least one detent 107 extending vertically from support 103 (e.g., as shown, two detents 107). At least one detent 107 is configured to depress away from robotic cleaner 300 when at least a portion of robotic cleaner 300 (e.g., at least one wet cleaning element 301) travels over at least one detent 107 toward base 102 of docking station 100. When moving from a raised position to a depressed position, at least one detent 107 may recess into at least one detent housing 126. At least one detent housing 126 may include a cavity, into which at least one detent 107 may recess when being depressed. At least one detent housing 126 may further include a biasing mechanism (e.g., compression spring, torsion spring, elastomeric material, and/or the like) to urge at least one detent 107 back to a raised position when not opposed by a greater downward force. In some non-limiting embodiments or aspects, docking station 100 may include a plurality of detents 107, which may be mechanically coupled together at least partly within one or more detent housings 126. In such cases, movement of each detent 107 may be coupled such that each detent 107 raises or lowers together. At least one detent 107 may raise when at least a portion (e.g., engaging surface 320) of at least one wet cleaning element 301 of robotic cleaner 300 has passed over at least one detent 107. At least one detent 107 may contact at least a portion (e.g., engaging surface 320) of at least one wet cleaning element 301 when robotic cleaner 300 is docked with docking station 100.
Referring now to
In some non-limiting embodiments or aspects, robotic cleaner 300 may include housing 305, at least one drive wheel 304, at least one wet cleaning element 301 (e.g., including at least one cleaning surface 307 on an underside of wet cleaning element 301 and/or an underside of housing 305), any combination thereof, and/or the like, as described herein. Additionally or alternatively, robotic cleaner 300 may include at least one processor (e.g., onboard processor 322), as described herein. Additionally or alternatively, robotic cleaner 300 may include at least one sensor 323. For example, sensor(s) 323 may be mounted to and/or incorporated into robotic cleaner 300 (e.g., housing 305 thereof). For the purpose of illustration, at least one sensor 323 may be included on a front of robotic cleaner 300.
In some non-limiting embodiments or aspects, robotic cleaner 300 (e.g., onboard processor 322 thereof) may detect at least one stain 392 on a surface to be cleaned 390 based on sensor data from the sensor(s) 323. For example, stain 392 may be within field of view 325 of sensor 323 (e.g., while robotic cleaner 300 is moving and/or performing cleaning operations).
In some non-limiting embodiments or aspects, robotic cleaner 300 may perform at least one cleaning operation based on at least one stain 392. For example, the cleaning operation(s) may include at least one wet cleaning operation with the cleaning surface 307 of wet cleaning element 301. In some non-limiting embodiments or aspects, performing the at least one cleaning operation may include robotic cleaner 300 navigating (e.g., onboard processor 322 controlling robotic cleaner 300 to navigate) to stain 392. In some non-limiting embodiments or aspects, the cleaning operation(s) may include repeating the wet cleaning operation(s) multiple times in an area of surface to be cleaned 390 associated with (e.g., including, overlapping with, near, and/or the like) stain 392.
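For the purpose of illustration only, the repeated wet cleaning passes over an area associated with stain 392 might be planned as short back-and-forth segments over that area. The following Python sketch shows one such approach; the function name, coordinate convention, and default values are illustrative assumptions and not part of the disclosure:

```python
def plan_spot_passes(stain_x, stain_y, radius, passes):
    """Generate waypoint segments for repeated back-and-forth wet-cleaning
    passes over the area around a detected stain (coordinates in meters;
    names and units are illustrative)."""
    waypoints = []
    for i in range(passes):
        # Alternate the travel direction on each pass (boustrophedon-style).
        x0, x1 = stain_x - radius, stain_x + radius
        if i % 2:
            x0, x1 = x1, x0
        # Space the passes evenly across the stain area from bottom to top.
        y = stain_y - radius + (2 * radius) * i / max(passes - 1, 1)
        waypoints.append(((x0, y), (x1, y)))
    return waypoints
```

Each segment's endpoints could then be fed to the navigation routine as successive goals, with the wet cleaning element engaged throughout.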
In some non-limiting embodiments or aspects, robotic cleaner 300 (e.g., onboard processor 322 thereof) may select a cleaning operation from a plurality of predetermined cleaning operations based on stain 392 and surface to be cleaned 390. Additionally or alternatively, performing the cleaning operation(s) may include performing the cleaning operation selected from the plurality of predetermined cleaning operations.
In some non-limiting embodiments or aspects, selecting the cleaning operation may include selecting the cleaning operation from the plurality of predetermined cleaning operations based on at least one of a stain type of stain 392, a surface type of surface to be cleaned 390, any combination thereof, and/or the like.
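As one non-limiting illustration, a plurality of predetermined cleaning operations might be stored as a lookup table keyed by stain type and surface type, with a gentle default when a pairing is not listed. The Python sketch below uses hypothetical stain names, surface names, and operation parameters that are not taken from the disclosure:

```python
# Hypothetical lookup table mapping (stain type, surface type) pairs to
# predetermined cleaning operations; all names and values are illustrative.
CLEANING_OPERATIONS = {
    ("grease", "tile"):       {"passes": 3, "fluid_ml": 12, "scrub": True},
    ("grease", "wood"):       {"passes": 2, "fluid_ml": 6,  "scrub": False},
    ("dried_liquid", "tile"): {"passes": 2, "fluid_ml": 8,  "scrub": True},
}
DEFAULT_OPERATION = {"passes": 1, "fluid_ml": 4, "scrub": False}

def select_cleaning_operation(stain_type: str, surface_type: str) -> dict:
    # Fall back to a gentle default when the pair is not in the table.
    return CLEANING_OPERATIONS.get((stain_type, surface_type), DEFAULT_OPERATION)
```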
In some non-limiting embodiments or aspects, robotic cleaner 300 (e.g., onboard processor 322 thereof) may be performing a first cleaning operation before detecting stain 392. Additionally or alternatively, robotic cleaner 300 (e.g., onboard processor 322 thereof) may interrupt the first cleaning operation and initiate a second cleaning operation (e.g., based on stain 392).
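The interrupt-and-initiate behavior described above can be illustrated with a minimal controller sketch; the class and method names below are hypothetical and chosen only for illustration:

```python
class CleaningController:
    """Sketch of interrupting a first cleaning operation and initiating a
    second, stain-targeted operation (names are illustrative)."""

    def __init__(self):
        self.current_operation = None
        self.history = []

    def start(self, operation: str) -> None:
        self.current_operation = operation
        self.history.append(operation)

    def on_stain_detected(self, stain_operation: str) -> None:
        # Interrupt the in-progress first operation and initiate the
        # stain-targeted second operation.
        self.current_operation = None
        self.start(stain_operation)
```

Whether the first operation later resumes is not specified by the description above; a resume step could be added after the second operation completes.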
In some non-limiting embodiments or aspects, sensor(s) 323 may include at least one of a camera; a color camera; a red, green, and blue (RGB) camera; a three-dimensional camera; a red, green, blue, and depth (RGB-D) camera; a light emitter; a light detector; an infrared (IR) camera; a spectrometer; an IR spectrometer; an image capture device; a LiDAR device; any combination thereof; and/or the like. Additionally or alternatively, the sensor data may include at least one of an image, a video, a color image, an RGB image, a three-dimensional image, an RGB-D image, an IR image, spectroscopy data, IR spectroscopy data, reflectance data, specular reflectance data, diffuse reflectance data, color data, texture data, any combination thereof, and/or the like. In some non-limiting embodiments or aspects, reflectance data (e.g., specular and/or diffuse reflectance data) may be obtained using multiple incident angles and/or multiple wavelengths of light emitted from sensor 323.
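For illustration, reflectance readings taken at multiple incident angles and wavelengths might be summarized into simple features that help characterize a surface or stain (e.g., specular surfaces vary strongly with incident angle, while colored residues vary across wavelengths). The feature set in the Python sketch below is an illustrative assumption and is not specified by the disclosure:

```python
import numpy as np

def reflectance_features(readings):
    """Summarize a grid of reflectance readings into simple features.
    `readings[angle_idx][wavelength_idx]` holds a detector intensity in
    [0, 1]; the feature definitions are illustrative assumptions."""
    r = np.asarray(readings, dtype=float)
    return {
        "mean_reflectance": float(r.mean()),
        # Variation across incident angles (per wavelength, then averaged):
        # large for specular (shiny) patches.
        "angle_variation": float(r.std(axis=0).mean()),
        # Variation across wavelengths (per angle, then averaged):
        # large for colored stains.
        "spectral_variation": float(r.std(axis=1).mean()),
    }
```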
In some non-limiting embodiments or aspects, detecting the stain(s) 392 may include detecting the stain(s) 392 on the surface to be cleaned 390 based on sensor data from the sensor(s) 323 and at least one computer vision operation.
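For the purpose of illustration only, one simple computer vision operation for flagging candidate stains is to mark pixels whose color deviates strongly from the dominant floor color. The following Python sketch shows this approach; the function name and threshold value are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def detect_stains(rgb_image: np.ndarray, threshold: float = 60.0) -> np.ndarray:
    """Return a boolean mask of pixels that deviate strongly from the
    dominant floor color (a stand-in for a 'computer vision operation';
    the threshold is illustrative)."""
    # Estimate the background floor color as the per-channel median.
    floor_color = np.median(rgb_image.reshape(-1, 3), axis=0)
    # Flag pixels whose Euclidean color distance exceeds the threshold.
    distance = np.linalg.norm(rgb_image.astype(float) - floor_color, axis=-1)
    return distance > threshold
```

In practice, a detection pipeline would likely add morphological filtering, texture cues, or a learned classifier on top of such a color test.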
In some non-limiting embodiments or aspects, robotic cleaner 300 (e.g., onboard processor 322 thereof) may communicate a notification based on detecting the at least one stain. For example, robotic cleaner 300 (e.g., onboard processor 322 thereof) may communicate the notification to remote device 606 (e.g., a mobile device, a computing device, and/or the like) of a user associated with robotic cleaner 300.
In some non-limiting embodiments or aspects, remote device 606 may display a graphical user interface (GUI) based on the at least one stain. For example, referring now to
In some non-limiting embodiments or aspects, GUI 200 may include a map 390a of surface to be cleaned 390. For example, map 390a may include at least one stain icon 392a based on stain(s) 392. Additionally or alternatively, map 390a may include at least one object icon 394 (e.g., representing furniture, a column, a staircase, a docking station 100, another type of physical object or obstacle, and/or the like).
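For illustration, placing stain icon 392a on map 390a involves converting the stain's position on surface to be cleaned 390 into map grid coordinates. A minimal Python sketch follows; the origin, resolution, and row/column convention are illustrative assumptions:

```python
def world_to_map(x_m: float, y_m: float,
                 origin=(0.0, 0.0), resolution=0.05):
    """Convert a stain's world coordinates (meters) to map grid indices
    for placing an icon on a GUI map. A 0.05 m/cell resolution and a
    (row, col) convention are illustrative assumptions."""
    col = int(round((x_m - origin[0]) / resolution))
    row = int(round((y_m - origin[1]) / resolution))
    return row, col
```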
In some non-limiting embodiments or aspects, remote device 606 may receive an input from the user indicating confirmation that at least one stain 392 is present. Additionally or alternatively, remote device 606 may communicate a communication based on the input.
With continued reference to
Referring now to
As shown, robotic cleaner 300 includes a release 310 for debris cup 308 positioned between a top surface 314 of debris cup 308 and outlet port 316. Release 310 may include opposing depressible triggers 312 configured to be actuated in opposing directions. Actuation of triggers 312 may cause at least a portion of debris cup 308 to disengage a portion of robotic cleaner 300 such that debris cup 308 may be removed therefrom.
Outlet port 316 may include an evacuation pivot door 318. Evacuation pivot door 318 may be configured to transition from an open position (e.g., when robotic cleaner 300 is docked with docking station 100) to a closed position (e.g., when robotic cleaner 300 is carrying out a cleaning operation). When transitioning to the closed position, evacuation pivot door 318 may pivot in a direction of debris cup 308. As such, during a cleaning operation, a suction force generated by a suction motor of robotic cleaner 300 may urge evacuation pivot door 318 toward the closed position. Additionally or alternatively, a biasing mechanism (e.g., a compression spring, a torsion spring, an elastomeric material, and/or any other biasing mechanism) may urge evacuation pivot door 318 toward the closed position. When transitioning to the open position, evacuation pivot door 318 may pivot in a direction away from debris cup 308. As such, when robotic cleaner 300 is docked with docking station 100, the suction generated by a suction motor of docking station 100 may urge evacuation pivot door 318 toward the open position.
At least one wet cleaning element 301 is detachably connected (e.g., using releasable attachment 309) to an underside 303 of housing 305 of robotic cleaner 300. Releasable attachments 309 usable for connecting wet cleaning element 301 to housing 305 may include, but are not limited to, friction clips, snug-fit adaptors, and/or the like. At least one wet cleaning element 301 includes an engaging surface 320 configured to contact at least one detent 107 when robotic cleaner 300 is docked with docking station 100. At least one wet cleaning element 301 may include at least one cleaning surface 307 (e.g., mop pad) that is configured to be imbued with a cleaning solution and contact a floor surface to be cleaned by robotic cleaner 300. At least one wet cleaning element 301 may be detached from housing 305 of robotic cleaner 300 by being held in place by at least one detent 107 while at least one drive wheel 304 causes robotic cleaner 300 to travel away from base 102 of docking station 100.
Referring now to
Referring now to
Docking station 100 may include one or more computing devices configured to communicate with remote device 606, cloud system 608, and/or robotic cleaner 300 at least partly over communication network 610. Docking station 100 may be configured to monitor its operational parameters, such as the unused cleaning fluid level, used cleaning fluid level, debris tank status, docking status, robotic cleaner status, and/or the like. Docking station 100 may communicate with robotic cleaner 300 to determine when to extend or retract detents 107 electronically in support 103 of docking station 100, to allow the conversion from wet mode to dry mode operation and vice versa, in certain electronic-controlled embodiments or aspects.
Robotic cleaner 300 may include one or more computing devices configured to communicate with remote device 606, cloud system 608, and/or docking station 100 at least partly over communication network 610. Robotic cleaner 300 may be configured to autonomously carry out cleaning operations in wet mode or dry mode, and may further autonomously convert between those modes in concert with docking station 100. Robotic cleaner 300 may communicate with remote device 606 to determine an operational mode and relay parameters of the operations of robotic cleaner 300 to remote device 606, including an operational mode status. Robotic cleaner 300 may further communicate with cloud system 608 to relay operational parameters, including cleaning statuses, operational modes, obstacles detected, errors, failures, and/or the like. Robotic cleaner 300 may further communicate with docking station 100 to cause docking station 100 to extend or retract detents 107, in certain electronic-controlled embodiments or aspects.
Remote device 606 may include one or more computing devices configured to communicate with cloud system 608, robotic cleaner 300, and/or docking station 100 at least partly over communication network 610. Remote device 606 may be configured to instruct robotic cleaner 300 to change modes of operation and carry out docking or undocking procedures. Remote device 606 may further communicate with robotic cleaner 300 to receive status updates and parameters of cleaning operation. Remote device 606 may communicate with cloud system 608 to view historical and real-time operation parameters of robotic cleaner 300. Remote device 606 may further communicate with docking station 100 to receive parameters and status information of docking station 100 operation.
Cloud system 608 may include one or more computing devices configured to communicate with remote device 606, robotic cleaner 300, and/or docking station 100 at least partly over communication network 610. Cloud system 608 may be configured to receive operational information from robotic cleaner 300 and/or docking station 100 and store at least some of the information in memory. Cloud system 608 may communicate with remote device 606 to transmit at least a portion of real-time or stored operational information that is received from robotic cleaner 300 and/or docking station 100. Cloud system 608 may further store cleaning operation parameters and preferences for a cleaning system and/or user. Cloud system 608 may communicate operation instructions to robotic cleaner 300.
Communication network 610 may include one or more wired and/or wireless networks over which the systems and devices of environment 500 may communicate. For example, communication network 610 may include a cellular network (e.g., a long-term evolution (LTE®) network, a third generation (3G) network, a fourth generation (4G) network, a fifth generation (5G) network, a code division multiple access (CDMA) network, etc.), a public land mobile network (PLMN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a telephone network (e.g., the public switched telephone network (PSTN)), a private network, an ad hoc network, an intranet, the Internet, a fiber optic-based network, a cloud computing network, and/or the like, and/or a combination of these or other types of networks.
The number and arrangement of devices and networks shown in
Referring now to
Robotic cleaner 300 may further include one or more wet cleaning elements 301 that are configured to be removably attached to housing 305. When robotic cleaner 300 is in a wet mode of operation (e.g., mopping), one or more wet cleaning elements 301 may be attached to an underside 303 of robotic cleaner 300, and each wet cleaning element 301 may include a cleaning surface 307 (e.g., a mop pad) for contacting and cleaning the surface to be cleaned. Robotic cleaner 300 may include an onboard tank for cleaning solution, which may be delivered to cleaning surface 307 via conduit running at least partially within housing 305 from the onboard tank to cleaning surface 307. One or more wet cleaning elements 301 may each include an engaging surface 320 configured to interface with and contact one or more detents 107 of docking station 100, to allow wet cleaning elements 301 to be detached from housing 305 of robotic cleaner 300.
Robotic cleaner 300 may further include one or more agitators 328 that are configured to carry out a dry mode of operation (e.g., vacuuming). For example, agitator 328 may include a rotating agitator including bristles, fabric, or other cleaning elements, or any combination thereof, around the outside of agitator 328. A rotating agitator 328 may include, for example, strips of bristles in combination with strips of rubber or elastomer material. A rotating agitator 328 may also be removable to allow the rotating agitator 328 to be cleaned more easily and allow the user to change the size of the rotating agitator 328, change the type of bristles on the rotating agitator 328, and/or remove the rotating agitator 328 depending on the intended application. Robotic cleaner 300 may further include, as an agitator 328, a bristle strip on an underside of housing 305 and adjacent a portion of the suction conduit, to contact the surface to be cleaned and urge debris toward the suction conduit of robotic cleaner 300. In some non-limiting embodiments or aspects, one or more agitators 328 may be used in concert with at least one wet cleaning element 301 for carrying out wet mode operation of robotic cleaner 300. Additionally or alternatively, one or more agitators 328 may be at least partially disabled, occluded, covered, and/or the like, by at least one wet cleaning element 301 when robotic cleaner 300 is in a wet mode of operation.
Robotic cleaner 300 may further include onboard processor 322 (e.g., a controller). Onboard processor 322 may be communicatively connected to sensors 323 of robotic cleaner 300 (e.g., bump sensors, wheel drop sensors, rotation sensors, forward obstacle sensors, side wall sensors, cliff sensors, any sensor described herein, and/or the like) and to driving mechanisms (e.g., drive motor 324, motors configured to control one or more features of an air jet assembly, agitator 328 assembly, side brush 306, etc.). Thus, onboard processor 322 may be configured to operate one or more drive wheels 304, air jet assemblies, agitators 328, etc., in response to sensed conditions. Onboard processor 322 may operate robotic cleaner 300 to perform various operations, such as autonomous cleaning (e.g., including randomly moving and turning, wall following, obstacle following, etc.), spot cleaning, and docking. Onboard processor 322 may also operate robotic cleaner 300 to avoid obstacles and cliffs and to escape from various situations where robotic cleaner 300 may become stuck. Onboard processor 322 may include one or more hardware components, such as described in
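For illustration, operating drive wheels and other actuators "in response to sensed conditions" can be sketched as one iteration of a sense-act loop. In the Python sketch below, the sensor names and wheel commands are illustrative assumptions and do not correspond to a specific interface in the disclosure:

```python
def control_step(sensors: dict) -> dict:
    """One iteration of a simple sense-act loop for an onboard controller.
    Keys and command values are illustrative; priorities run from most to
    least safety-critical."""
    if sensors.get("cliff_detected"):
        return {"left_wheel": -0.2, "right_wheel": -0.2}   # back away from a drop
    if sensors.get("bump"):
        return {"left_wheel": -0.1, "right_wheel": 0.1}    # pivot away from contact
    return {"left_wheel": 0.3, "right_wheel": 0.3}         # cruise forward
```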
Referring now to
As shown in
Storage component 708 may store information and/or software related to the operation and use of device 700. For example, storage component 708 may include a hard disk (e.g., a magnetic disk, an optical disk, a magneto-optic disk, a solid state disk, etc.) and/or another type of computer-readable medium.
Input component 710 may include a component that permits device 700 to receive information, such as via user input (e.g., a touch screen display, a keyboard, a keypad, a mouse, a button, a switch, a microphone, etc.). Additionally or alternatively, input component 710 may include a sensor for sensing information (e.g., a global positioning system (GPS) component, an accelerometer, a gyroscope, an actuator, etc.). Output component 712 may include a component that provides output information from device 700 (e.g., a display, a speaker, one or more light-emitting diodes (LEDs), etc.).
Communication interface 714 may include a transceiver-like component (e.g., a transceiver, a separate receiver and transmitter, etc.) that enables device 700 to communicate with other devices, such as via a wired connection, a wireless connection, or a combination of wired and wireless connections. Communication interface 714 may permit device 700 to receive information from another device and/or provide information to another device. For example, communication interface 714 may include an Ethernet interface, an optical interface, a coaxial interface, an infrared interface, a radio frequency (RF) interface, a universal serial bus (USB) interface, a Wi-Fi® interface, a cellular network interface, and/or the like.
Device 700 may perform one or more processes described herein. Device 700 may perform these processes based on processor 704 executing software instructions stored by a computer-readable medium, such as memory 706 and/or storage component 708. A computer-readable medium (e.g., a non-transitory computer-readable medium) is defined herein as a non-transitory memory device. A memory device includes memory space located inside of a single physical storage device or memory space spread across multiple physical storage devices.
Software instructions may be read into memory 706 and/or storage component 708 from another computer-readable medium or from another device via communication interface 714. When executed, software instructions stored in memory 706 and/or storage component 708 may cause processor 704 to perform one or more processes described herein. Additionally or alternatively, hardwired circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein. Thus, embodiments or aspects described herein are not limited to any specific combination of hardware circuitry and software. The term “programmed or configured,” as used herein, refers to an arrangement of software, hardware circuitry, or any combination thereof on one or more devices.
The number and arrangement of components shown in
Referring now to
As shown in
In some non-limiting embodiments or aspects, detecting the at least one stain 392 may include detecting the at least one stain 392 on the surface to be cleaned 390 based on sensor data from the at least one sensor 323 and at least one computer vision operation, as described herein.
In some non-limiting embodiments or aspects, robotic cleaner 300 (e.g., onboard processor 322 thereof) may communicate a notification based on detecting the at least one stain, as described herein. For example, communicating the notification may include communicating the notification to remote device 606 of a user associated with robotic cleaner 300.
In some non-limiting embodiments or aspects, remote device 606 may display a GUI based on the at least one stain 392, as described herein. For example, the GUI may include map 390a of the surface to be cleaned 390. The map may include at least one stain icon 392a based on the at least one stain 392, as described herein.
In some non-limiting embodiments or aspects, remote device 606 may receive an input from the user indicating confirmation that the at least one stain 392 is present, as described herein. Remote device 606 may communicate a communication based on the input (e.g., to robotic cleaner 300 and/or onboard processor 322 thereof).
As shown in
In some non-limiting embodiments or aspects, performing the at least one cleaning operation may include performing at least one wet cleaning operation with at least one cleaning surface 307 of at least one wet cleaning element 301 of robotic cleaner 300.
In some non-limiting embodiments or aspects, robotic cleaner 300 (e.g., onboard processor 322 thereof) may select the cleaning operation from a plurality of predetermined cleaning operations based on the at least one stain 392 and surface to be cleaned 390, as described herein. Performing the at least one cleaning operation may include robotic cleaner 300 performing the cleaning operation selected from the plurality of predetermined cleaning operations.
In some non-limiting embodiments or aspects, selecting the cleaning operation may include selecting the cleaning operation from the plurality of predetermined cleaning operations based on at least one of a stain type of the at least one stain, a surface type of the surface to be cleaned, or any combination thereof, as described herein.
In some non-limiting embodiments or aspects, robotic cleaner 300 may be performing a first cleaning operation before detecting the at least one stain. The at least one cleaning operation may be a second cleaning operation, and performing the at least one cleaning operation based on the at least one stain 392 may include interrupting the first cleaning operation and initiating the second cleaning operation.
In some non-limiting embodiments or aspects, robotic cleaner 300 (e.g., onboard processor 322 thereof) may receive the communication from remote device 606. Performing the at least one cleaning operation may include performing the at least one cleaning operation based on the communication.
While the principles of the disclosed subject matter have been described herein, it is to be understood by those skilled in the art that this description is made only by way of example and not as a limitation as to the scope of the disclosed subject matter. Other embodiments are contemplated within the scope of the presently disclosed subject matter in addition to the exemplary embodiments shown and described herein. Modifications and substitutions by one of ordinary skill in the art are considered to be within the scope of the presently disclosed subject matter, which is not to be limited except by the following claims.
Claims
1. A robotic cleaning system, comprising:
- a robotic cleaner comprising: a housing; at least one wet cleaning element comprising at least one cleaning surface on an underside of the housing; at least one sensor; and
- at least one processor programmed or configured to: detect at least one stain on a surface to be cleaned based on sensor data from the at least one sensor; and perform at least one cleaning operation based on the at least one stain.
2. The system of claim 1, further comprising:
- a docking station comprising: a base; and a support extending laterally from the base, the support configured to receive at least a portion of the robotic cleaner on top of the support.
3. The system of claim 1, wherein the at least one sensor comprises at least one of:
- a camera;
- a color camera;
- a red, green, and blue (RGB) camera;
- a three-dimensional camera;
- a red, green, blue, and depth (RGB-D) camera;
- a light emitter;
- a light detector;
- an infrared (IR) camera;
- a spectrometer;
- an IR spectrometer;
- an image capture device;
- a LiDAR device; or
- any combination thereof.
4. The system of claim 1, wherein the sensor data comprises at least one of:
- an image;
- a video;
- a color image;
- a red, green, and blue (RGB) image;
- a three-dimensional image;
- a red, green, blue, and depth (RGB-D) image;
- an infrared (IR) image;
- spectroscopy data;
- IR spectroscopy data;
- reflectance data;
- specular reflectance data;
- diffuse reflectance data;
- color data;
- texture data; or
- any combination thereof.
5. The system of claim 1, wherein detecting the at least one stain comprises detecting the at least one stain on the surface to be cleaned based on sensor data from the at least one sensor and at least one computer vision operation.
6. The system of claim 1, wherein performing the at least one cleaning operation comprises performing at least one wet cleaning operation with the at least one cleaning surface of the at least one wet cleaning element.
7. The system of claim 1, wherein the at least one processor is further programmed or configured to:
- select a cleaning operation from a plurality of predetermined cleaning operations based on the at least one stain and the surface to be cleaned,
- wherein performing the at least one cleaning operation comprises performing the cleaning operation selected from the plurality of predetermined cleaning operations.
8. The system of claim 7, wherein selecting the cleaning operation comprises selecting the cleaning operation from the plurality of predetermined cleaning operations based on at least one of:
- a stain type of the at least one stain;
- a surface type of the surface to be cleaned; or
- any combination thereof.
9. The system of claim 1, wherein the robotic cleaner is performing a first cleaning operation before detecting the at least one stain,
- wherein the at least one cleaning operation comprises a second cleaning operation, and
- wherein performing the at least one cleaning operation based on the at least one stain comprises interrupting the first cleaning operation and initiating the second cleaning operation.
10. The system of claim 1, wherein the at least one processor is further programmed or configured to:
- communicate a notification based on detecting the at least one stain.
11. The system of claim 10, wherein communicating the notification comprises communicating the notification to a remote device of a user associated with the robotic cleaner.
12. The system of claim 11, wherein the remote device displays a graphical user interface (GUI) based on the at least one stain.
13. The system of claim 12, wherein the GUI comprises a map of the surface to be cleaned, the map comprising at least one stain icon based on the at least one stain.
14. The system of claim 11, wherein the remote device is programmed or configured to:
- receive an input from the user indicating confirmation that the at least one stain is present; and
- communicate a communication based on the input.
15. The system of claim 14, wherein the at least one processor is further programmed or configured to:
- receive the communication from the remote device,
- wherein performing the at least one cleaning operation comprises performing the at least one cleaning operation based on the communication.
16. A method, comprising:
- detecting, with at least one processor, at least one stain on a surface to be cleaned based on sensor data from at least one sensor of a robotic cleaner; and
- controlling, with at least one processor, the robotic cleaner to perform at least one cleaning operation based on the at least one stain.
17. The method of claim 16, wherein detecting the at least one stain comprises detecting the at least one stain on the surface to be cleaned based on sensor data from the at least one sensor and at least one computer vision operation.
18. The method of claim 16, wherein performing the at least one cleaning operation comprises performing at least one wet cleaning operation with at least one cleaning surface of at least one wet cleaning element of the robotic cleaner.
19. The method of claim 16, further comprising:
- selecting, with at least one processor, a cleaning operation from a plurality of predetermined cleaning operations based on the at least one stain and the surface to be cleaned,
- wherein performing the at least one cleaning operation comprises performing the cleaning operation selected from the plurality of predetermined cleaning operations, and
- wherein selecting the cleaning operation comprises selecting the cleaning operation from the plurality of predetermined cleaning operations based on at least one of: a stain type of the at least one stain; a surface type of the surface to be cleaned; or any combination thereof.
20. The method of claim 16, wherein the robotic cleaner is performing a first cleaning operation before detecting the at least one stain,
- wherein the at least one cleaning operation comprises a second cleaning operation, and
- wherein performing the at least one cleaning operation based on the at least one stain comprises interrupting the first cleaning operation and initiating the second cleaning operation.
Type: Application
Filed: Aug 29, 2024
Publication Date: Mar 6, 2025
Inventors: Derek Lessard (Dorchester, MA), Max Davidowitz (Watertown, MA)
Application Number: 18/819,826