SYSTEM AND METHOD FOR INDICATING A PLANNED ROBOT MOVEMENT
An information system configured to indicate a condition of one or more robotic devices to a user, the information system having: a communication interface for obtaining at least one planned movement path of the robotic devices and a position of the user; an augmented-reality, AR, interface associated with the user; and processing circuitry configured to display, by means of the AR interface, a visualization of the at least one planned movement path relative to the user position, wherein the visualization is responsive to at least one quantity, which is one or more of: a robotic device's identity, a robotic device's mass or physical dimensions, a robotic device's velocity, a robotic device's proximity to the user position.
The present disclosure relates to the field of human-machine interaction and, in particular, human-robot interaction. The disclosure proposes a system and a method for indicating a planned movement path of a robotic device.
BACKGROUND

Mobile robots are increasingly used for autonomous transportation tasks in industrial environments, such as factories. However, the increasing presence of transportation robots means that their interaction with humans (e.g., workers, operators, users) in the factory has to be managed with great care. In other words, the field of human-robot interaction with regard to mobile robots needs to be developed further.
The presence of mobile robots in a factory or plant can lead to collisions with operators if these people are not well informed about the robots' future movements. While mobile robots can be fitted with advanced sensors to reduce the likelihood of collisions, the programmed behavior usually follows one of two main approaches: either to stop when a collision is detected (collision detection approach) or to change route or speed so as to avoid an anticipated collision (collision avoidance approach). Neither approach is a complete guarantee against collisions, and accidents do occur, causing work interruption and delays at the very least. Besides this, operators or supervisors will from time to time need to consult the movement information of a robot, e.g., its origin or next waypoint, for inspection or maintenance purposes.
Existing solutions that involve physical modifications of the environment (e.g., highlighting tape on the floor) or on the robots (e.g., installing additional displays) are not scalable and not easily adaptable when requirements change over time. Solutions relying on augmented reality (AR) techniques may be preferable in this respect though they have other limitations.
WO2019173396 discloses techniques for using AR to improve coordination between human and robot actors. Various embodiments provide predictive graphical interfaces such that a teleoperator controls a virtual robot surrogate, rather than directly operating the robot itself, providing the user with foresight regarding where the physical robot will end up and how it will get there. Using a human interface module, a probabilistic or other indicator of those paths may be displayed via an augmented reality display in use by a human to help aid the human in understanding the anticipated path, intent, or activities of the robot. The human-intelligible presentation may indicate the position of the robot within the environment, objects within the environment, and/or collected data. For example, an arrow may be shown that always stays 15 seconds ahead of an aerial robot. As another example, the human interface module may display one or more visual cues (e.g., projections, holograms, lights, etc.) to advise nearby users of the intent of the robot.
SUMMARY

One objective of the present disclosure is to make available an improved method and system for indicating to a user a planned movement path of a robotic device. Another objective is to augment the indication with at least one quantity derivable from the movement path. Another objective is to leverage techniques from augmented reality (AR), extended reality (XR) and/or virtual reality (VR) to provide such indication. A further objective is to propose a human-robot interface that improves safety in environments where mobile robotic devices operate.
These and other objects are achieved by the invention, as defined by the appended independent claims. The dependent claims are directed to embodiments of the invention.
In a first aspect, a method of indicating a condition of one or more mobile robotic devices to a user comprises: obtaining at least one planned movement path of the robotic device or devices; obtaining a position of the user; and, by means of an AR interface associated with the user, displaying a visualization of the at least one planned movement path relative to the user position. According to an embodiment, the visualization of the movement path is responsive to at least one quantity, namely, a robotic device's identity, a robotic device's activity or task, a robotic device's mass or physical dimensions, a robotic device's velocity and/or a robotic device's proximity to the user position. A visualization that is responsive to two or more of the quantities falls within the scope of this embodiment, as does a visualization that is responsive to the value of one quantity for several robotic devices.
It is understood that a “planned movement path” in the sense of the claims is a data structure or other collection of information that represents locations of the at least one robotic device at different points in time. The information may include specifics of the robotic device, such as inherent or semi-inherent properties (e.g., a hardware identifier, a basic mass, dimensions) and variable properties (e.g., a currently mounted robot tool, a temporary identifier assigned for use in a particular work environment, a current load). A visualization may be “responsive to” a quantity if a feature or characteristic of the visualization (or of non-visual content accompanying the visualization) is different for different values of the quantity.
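By way of illustration only, such a planned movement path might be sketched as the following data structure; the field names and types (robot_id, mass_kg, waypoints, etc.) are assumptions for the sake of the example and are not prescribed by the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class Waypoint:
    t: float   # time in seconds, e.g., relative to the start of the path
    x: float   # planar coordinates in the work environment, in meters
    y: float

@dataclass
class PlannedMovementPath:
    robot_id: str         # inherent property, e.g., a hardware identifier
    mass_kg: float        # basic mass; the current load may be tracked separately
    dimensions_m: tuple   # (length, width, height)
    waypoints: list = field(default_factory=list)  # locations at points in time

# A robot planned to travel 1 m along the x-axis over 2 seconds:
path = PlannedMovementPath("AGV-7", 350.0, (1.2, 0.8, 0.4),
                           [Waypoint(0.0, 0.0, 0.0), Waypoint(2.0, 1.0, 0.0)])
```

Any serialization of equivalent information (e.g., a list of timestamped coordinates with attached device metadata) would fall under the same notion.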
Accordingly, this embodiment enables a real-time presentation of visual aids that inform the user of the robotic device's movement path together with a further quantity, which helps the user deal with, or collaborate with, the robotic device in a safer way. In particular, the visual aids may help the user discover and avoid future collisions between themselves and mobile robots. Because the visual aids are amenable to a high degree of spatial and temporal accuracy, the user has the option of stepping out of the robot's path with a reasonable safety margin. If the user and the mobile robotic device share a work area, such accurate indications are clearly in the interest of productivity, knowing that users exposed to indistinct warnings may otherwise develop a culture of excessive precaution, at a cost to productivity.
In one embodiment, the at least one quantity is derivable from the movement path. How such derivation may proceed will be discussed in further detail below.
In another embodiment, the method further comprises obtaining said at least one quantity, e.g., in a manner similar to how the movement path is obtained.
In another aspect, there is provided an information system comprising: a communication interface for obtaining at least one planned movement path of the robotic devices and a position of the user; an AR interface associated with the user; and processing circuitry configured to display a visualization of the at least one planned movement path relative to the user position. According to an embodiment, the visualization of the movement path is responsive to at least one quantity, namely, a robotic device's identity, a robotic device's activity or task, a robotic device's mass or physical dimensions, a robotic device's velocity and/or a robotic device's proximity to the user position.
The information system is technically advantageous in the same or a similar way as the method discussed above.
A further aspect relates to a computer program containing instructions for causing a computer, or the information system in particular, to carry out the above method. The computer program may be stored or distributed on a data carrier. As used herein, a “data carrier” may be a transitory data carrier, such as modulated electromagnetic or optical waves, or a non-transitory data carrier. Non-transitory data carriers include volatile and non-volatile memories, such as permanent and non-permanent storages of magnetic, optical, or solid-state type. Still within the scope of “data carrier”, such memories may be fixedly mounted or portable.
All terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to “a/an/the element, apparatus, component, means, step, etc.” are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
Aspects and embodiments are now described, by way of example, with reference to the accompanying drawings, on which:
The aspects of the present disclosure will now be described more fully hereinafter with reference to the accompanying drawings, on which certain embodiments of the invention are shown. These aspects may, however, be embodied in many different forms and should not be construed as limiting; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of all aspects of the invention to those skilled in the art. Like numbers refer to like elements throughout the description.
The AR interface is here illustrated by glasses 720—also referred to as smart glasses, AR glasses or a head-mounted display (HMD)—which when worn by a user allow them to observe the environment through the glasses in the natural manner and are further equipped with arrangements for generating visual stimuli adapted to produce, from the user's point of view, an appearance of graphic elements overlaid (or superimposed) on top of the view of the environment. Various ways to generate such stimuli in see-through HMDs are known per se in the art, including diffractive, holographic, reflective, and other optical techniques for presenting a digital image to the user.
The illustrated AR interface further includes at least one acoustic transducer, as illustrated by a speaker 721 located in the proximity of the user's ear when the interface is worn. Preferably, the AR interface comprises at least two acoustic transducers capable of coherent playback, such that an audio signal with a variable imaginary point of origin can be generated to convey spatial information to the user.
To implement embodiments of the invention, the information system 700 further comprises a communication interface, symbolically illustrated in
As described above on an overview level, the visualization of the movement path displayed in the third step 330 is rendered in a manner responsive to at least one quantity, which is optionally derivable from the movement path. Specific examples of said quantity include:
1. a robotic device's identity,
2. a robotic device's activity or task,
3. a robotic device's mass or physical dimensions,
4. a robotic device's velocity,
5. a robotic device's proximity to the user position.
Example 3 provides that the robotic device's mass or physical dimensions may be represented in audible form. The information system 700 may be able to determine the mass or physical dimensions by extracting an identity of the robotic device 110 from the planned movement path and consulting a look-up table or database associating identities of the robotic devices with their model, type etc. For instance, a number of distinguishable tunes (melodies) played as an audio signal accompanying the visualization 130 may correspond to different weight or size classes of the robotic devices. The visualization 130 may be of any of the various types described in other sections of this disclosure. Different pitches or average pitches may be used for the same purpose, e.g., such that a lower pitch corresponds to a heavier and/or taller robotic device and a higher pitch corresponds to a lighter and/or lower device. This assists the user 120 in selecting an adequate safety margin, knowing that heavier robotic devices normally pose a more serious risk of physical injury.
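The weight-class-to-pitch mapping just described can be sketched as follows. The class boundaries (100 kg, 500 kg) and the specific frequencies are illustrative assumptions, not values taken from the disclosure:

```python
# Hypothetical mapping from a robot's weight class to the average pitch of the
# audio signal accompanying the visualization: heavier robots get a lower pitch.
WEIGHT_CLASSES = [          # (upper mass bound in kg, pitch in Hz)
    (100.0, 880.0),         # light  -> high pitch (A5)
    (500.0, 440.0),         # medium -> middle pitch (A4)
    (float("inf"), 220.0),  # heavy  -> low pitch (A3)
]

def pitch_for_mass(mass_kg: float) -> float:
    """Return the pitch corresponding to the robot's weight class."""
    for upper_bound, pitch_hz in WEIGHT_CLASSES:
        if mass_kg <= upper_bound:
            return pitch_hz
```

The same table-driven structure could map size classes to distinguishable tunes instead of pitches.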
Regarding Example 4, it is clearly within the skilled person's abilities to derive the velocity of the robotic device 110 from a planned movement path that specifies locations at different points in time. Stereophonic or spatial playback of an audio signal through multiple speakers 721 of the AR interface, wherein the relative phases and/or intensities (panning) are controlled, may furthermore be used to indicate the vector aspect (direction) of the robotic device's velocity. More precisely, an imaginary point of origin of a played-back audio signal accompanying the visualization 130 may correspond to the robotic device's 110 direction relative to the user 120. Alternatively, the imaginary point of origin may illustrate the robotic device's 110 location. Further still, an imaginary point of origin which is moving during the playback of the audio signal may correspond to the geometry of the planned movement path. This provides the user 120 with an approximate idea of the planned movement path while leaving their visual perception available for other information.
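A minimal sketch of the two derivations above: the velocity is obtained as a finite difference between consecutive timestamped waypoints, and a stereo pan value places the imaginary point of origin of the audio signal in the robot's direction relative to the user. Waypoints as (t, x, y) tuples, a planar environment, and the pan convention are assumptions for illustration:

```python
import math

def velocity(p0, p1):
    """Finite-difference velocity between two timestamped (t, x, y) waypoints."""
    t0, x0, y0 = p0
    t1, x1, y1 = p1
    dt = t1 - t0
    return ((x1 - x0) / dt, (y1 - y0) / dt)

def stereo_pan(user_pos, user_heading_rad, robot_pos):
    """Pan value in [-1, 1]: -1 = fully left, +1 = fully right of the user's heading."""
    dx = robot_pos[0] - user_pos[0]
    dy = robot_pos[1] - user_pos[1]
    bearing = math.atan2(dy, dx) - user_heading_rad
    return -math.sin(bearing)  # left/right component of the relative direction

# Robot travels 1 m along x over 2 s:
v = velocity((0.0, 0.0, 0.0), (2.0, 1.0, 0.0))   # -> (0.5, 0.0) m/s
```

Feeding successive path points through `stereo_pan` during playback would yield the moving imaginary point of origin that traces the geometry of the planned movement path.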
Still within Example 4, the scalar aspect (speed) of the robotic device's 110 velocity may be reflected by the visualization 130. For instance,
Turning to Example 5,
An important special case of Example 5 is the indication of a collision risk to the user 120. A collision risk may be estimated as the robotic device's 110 minimal distance to the user 120 over the course of the planned movement path, wherein a distance of zero may correspond to an unambiguous collision unless the user 120 moves. The visualization 130 may be generated such that the severity of this risk is communicated to the user 120.
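The collision-risk estimate described above can be sketched as follows, assuming (t, x, y) waypoints and a stationary user; the 2 m danger radius used to grade severity is an assumed parameter, not part of the disclosure. The minimum is evaluated at the waypoints only; a finer implementation would also check the segments between them:

```python
import math

def min_distance_to_user(waypoints, user_pos):
    """Smallest Euclidean distance from the user to any waypoint on the path."""
    return min(math.hypot(x - user_pos[0], y - user_pos[1])
               for _, x, y in waypoints)

def collision_severity(min_dist_m, danger_radius_m=2.0):
    """Map the minimal distance to a severity in [0, 1], where 1 means an
    unambiguous collision (zero distance) unless the user moves."""
    return max(0.0, 1.0 - min_dist_m / danger_radius_m)

# A robot approaching to within 1 m of a user standing at the origin:
path = [(0.0, 5.0, 0.0), (1.0, 3.0, 0.0), (2.0, 1.0, 0.0)]
d = min_distance_to_user(path, (0.0, 0.0))   # -> 1.0
```

The resulting severity value could drive any of the visualization features discussed elsewhere in this disclosure, e.g., the loudness of the accompanying audio signal.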
The aspects of the present disclosure have mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.
Claims
1. A method of indicating a condition of one or more mobile robotic devices to a user, comprising the steps of:
- obtaining at least one planned movement path of the robotic devices;
- obtaining a position of the user; and
- displaying, by means of an augmented-reality, AR, interface associated with the user, a visualization of the at least one planned movement path relative to the user position, wherein the visualization of the movement path is responsive to at least one quantity selected from: a robotic device's identity, a robotic device's activity or task, a robotic device's mass or physical dimensions, a robotic device's velocity, and a robotic device's proximity to the user position.
2. The method of claim 1, further comprising obtaining said at least one quantity.
3. The method of claim 1, wherein said at least one quantity is derivable from the movement path.
4. The method of claim 1, wherein the AR interface associated with the user is worn by the user.
5. The method of claim 1, wherein the robotic device's identity, activity or task is represented by any of:
- a hue of particles of a particle flow,
- a hue of animated pointing elements.
6. The method of claim 1, wherein the robotic device's mass or physical dimensions are represented by a tune or an average pitch of an audio signal accompanying the visualization.
7. The method of claim 1, wherein the visualization is responsive to the robotic device's speed.
8. The method of claim 1, wherein the visualization is responsive to the sense of the robotic device's planned movement path.
9. The method of claim 1, wherein the robotic device's direction relative to the user is represented by an imaginary point of origin of an audio signal accompanying the visualization.
10. The method of claim 1, wherein the visualization is responsive to the robotic device's proximity in terms of distance.
11. The method of claim 1, wherein the visualization is responsive to the robotic device's proximity in terms of travel time.
12. The method of claim 10, wherein the robotic device's proximity is represented by any of:
- a particle density of a particle flow,
- a lightness, brightness, colorfulness, or saturation of animated pointing elements,
- a loudness of an audio signal accompanying the visualization.
13. The method of claim 1, wherein the robotic device's proximity to the user position includes the robotic device's risk of colliding with the user.
14. The method of claim 13, wherein a risk of colliding with the user that exceeds a predetermined threshold is represented by any of:
- a local deviation from the movement path of particles of a particle flow,
- a shift of animated pointing elements.
15. An information system configured to indicate a condition of one or more robotic devices to a user, the information system comprising:
- a communication interface for obtaining at least one planned movement path of the robotic devices, and a position of the user;
- an augmented-reality, AR, interface associated with the user; and
- processing circuitry configured to display, by means of the AR interface, a visualization of the at least one planned movement path relative to the user position, wherein the visualization of the movement path is responsive to at least one quantity selected from: a robotic device's identity, a robotic device's activity or task, a robotic device's mass or physical dimensions, a robotic device's velocity, and a robotic device's proximity to the user position.
16. A computer program comprising instructions to cause an information system configured to indicate a condition of one or more robotic devices to a user, the information system having:
- a communication interface for obtaining at least one planned movement path of the robotic devices, and a position of the user;
- an augmented reality, AR, interface associated with the user; and
- processing circuitry configured to display, by means of the AR interface, a visualization of the at least one planned movement path relative to the user position,
- wherein the visualization of the movement path is responsive to at least one quantity selected from: a robotic device's identity, a robotic device's activity or task, a robotic device's mass or physical dimensions, a robotic device's velocity, and a robotic device's proximity to the user position.
17. A data carrier having stored thereon a computer program comprising instructions to cause an information system configured to indicate a condition of one or more robotic devices to a user, the information system having:
- a communication interface for obtaining at least one planned movement path of the robotic devices, and a position of the user;
- an augmented reality, AR, interface associated with the user; and
- processing circuitry configured to display, by means of the AR interface, a visualization of the at least one planned movement path relative to the user position,
- wherein the visualization of the movement path is responsive to at least one quantity selected from: a robotic device's identity, a robotic device's activity or task, a robotic device's mass or physical dimensions, a robotic device's velocity, and a robotic device's proximity to the user position.
Type: Application
Filed: Sep 24, 2020
Publication Date: Oct 19, 2023
Inventors: Saad Azhar (Västerås), Duy Khanh Le (Hoà Thành, Tây Ninh)
Application Number: 18/245,598