ASSISTANCE DEVICE, WORK MACHINE, AND RECORDING MEDIUM

An assistance device includes circuitry configured to acquire data relating to a shape of a work target around a work machine; and propose, to a user, a movement from among a plurality of candidate movements in a predetermined work of the work machine, based on the acquired data, the plurality of candidate movements being of different types from each other.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a continuation application of International Application No. PCT/JP2023/013195 filed on Mar. 30, 2023, which is based on and claims priority to Japanese Patent Application No. 2022-058984, filed on Mar. 31, 2022, and Japanese Patent Application No. 2022-060273, filed on Mar. 31, 2022. The contents of these applications are incorporated herein by reference in their entireties.

BACKGROUND

1. Technical Field

The present disclosure relates to an assistance device for a work machine, and the like.

2. Description of Related Art

A work machine such as a shovel is known in the related art.

SUMMARY

According to one embodiment of the present disclosure, an assistance device is provided. The assistance device includes circuitry configured to:

    • acquire data relating to a shape of a work target around a work machine; and
    • propose, to a user, a movement from among a plurality of candidate movements in a predetermined work of the work machine, based on the acquired data, the plurality of candidate movements being of different types from each other.
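Purely for illustration, the behavior summarized above can be pictured as the following sketch. All names (e.g., AssistanceCircuitry, propose_movement, CandidateMovement) are hypothetical and are not part of the disclosed configuration; the sketch only shows the flow from acquired shape data to a single proposed movement.

```python
# Illustrative sketch only; class, method, and field names are hypothetical
# and are not taken from the present disclosure.
from dataclasses import dataclass
from typing import Sequence


@dataclass
class CandidateMovement:
    name: str           # e.g., "sweep-out", "horizontal leveling", "rolling compaction"
    score: float = 0.0  # suitability for the acquired work-target shape


class AssistanceCircuitry:
    """Acquires work-target shape data and proposes one movement to the user."""

    def acquire_shape_data(self):
        # Corresponds to acquiring data relating to the shape of the work
        # target around the work machine (e.g., image or three dimensional data).
        raise NotImplementedError

    def propose_movement(self, shape_data, candidates: Sequence[CandidateMovement]) -> CandidateMovement:
        # Evaluate each candidate movement against the acquired shape data and
        # present the most suitable one to the user.
        scored = [self._evaluate(shape_data, c) for c in candidates]
        return max(scored, key=lambda c: c.score)

    def _evaluate(self, shape_data, candidate: CandidateMovement) -> CandidateMovement:
        # Placeholder for any scoring method, e.g., a trained model.
        return CandidateMovement(candidate.name, score=0.0)
```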

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagram illustrating an example of a shovel activation assistance system.

FIG. 2 is a top view illustrating an example of a shovel.

FIG. 3 is a diagram illustrating an example of a configuration relating to remote operation of the shovel.

FIG. 4 is a block diagram illustrating an example of a hardware configuration of the shovel.

FIG. 5 is a diagram illustrating an example of a hardware configuration of an information processing device.

FIG. 6 is a functional block diagram illustrating a first example of a functional configuration relating to a movement proposing function of the shovel activation assistance system.

FIG. 7 is a flowchart schematically illustrating a first example of processing relating to a movement proposing function of the shovel.

FIG. 8 is a functional block diagram illustrating a second example of a functional configuration relating to the movement proposing function of the shovel activation assistance system.

FIG. 9 is a flowchart schematically illustrating a second example of processing relating to the movement proposing function of the shovel.

FIG. 10 is a diagram illustrating a first example of display contents of a display device relating to the movement proposing function of a shovel.

FIG. 11 is a diagram illustrating a second example of display contents of the display device relating to the movement proposing function of the shovel.

FIG. 12 is a diagram illustrating a third example of display contents of the display device relating to the movement proposing function of the shovel.

FIG. 13 is a diagram illustrating a third example of display contents of the display device relating to the movement proposing function of the shovel.

FIG. 14 is a diagram illustrating a fourth example of display contents of the display device relating to the movement proposing function of the shovel.

FIG. 15 is a diagram illustrating a fifth example of display contents of the display device relating to the movement proposing function of the shovel.

FIG. 16 is a functional block diagram illustrating an example of a functional configuration relating to generation of a target trajectory of a work part of the shovel.

FIG. 17 is a diagram illustrating an example of a screen relating to generation of the target trajectory of the work part of the shovel.

FIG. 18 is a diagram illustrating another example of a screen relating to generation of a target trajectory of a work part of a shovel.

FIG. 19 is a diagram illustrating still another example of a screen relating to generation of a target trajectory of a work part of a shovel.

FIG. 20 is a flowchart schematically illustrating an example of processing relating to generation of a target trajectory of a work part of the shovel.

DETAILED DESCRIPTION

When a work machine is used to perform a certain work, it is necessary to selectively use a plurality of movements in accordance with a state of a work target such as a landform. For example, in the case of a ground leveling work by a shovel, a movement of sweeping out earth and sand forward using the back surface of a bucket, a horizontal leveling movement, a rolling compaction movement, and the like are used selectively. Therefore, in the case of an inexperienced operator, it may be difficult to select an appropriate movement from among a plurality of candidate movements, and as a result, the work efficiency or the like may be degraded.

Accordingly, in view of the above problems, it is an object of the present disclosure to provide a technique that enables a work machine to perform a more appropriate movement.

According to the above embodiment, a technique capable of operating the work machine more appropriately can be provided.

Hereinafter, embodiments will be described with reference to the drawings.

[Overview of Activation Assistance System]

First, an overview of an activation assistance system SYS according to the present embodiment will be described with reference to FIGS. 1 to 3.

FIG. 1 is a diagram illustrating an example of an activation assistance system SYS. In FIG. 1, the shovel 100 is illustrated in a left side view. FIG. 2 is a top view illustrating an example of the shovel 100. FIG. 3 is a diagram illustrating an example of a configuration relating to remote operation of the shovel 100. Hereinafter, a direction in the shovel 100 or a direction viewed from the shovel 100 may be described by defining a direction in which the attachment AT extends in a top view of the shovel 100 (an upper direction in FIG. 2) as “front”.

As illustrated in FIG. 1, the activation assistance system SYS includes a shovel 100 and an information processing device 200.

The activation assistance system SYS coordinates with the shovel 100 using the information processing device 200 and assists the activation of the shovel 100.

The number of shovels 100 included in the activation assistance system SYS may be one or more.

The shovel 100 is a work machine that is a target of assistance relating to activation in the activation assistance system SYS.

As illustrated in FIGS. 1 and 2, the shovel 100 includes a lower traveling body 1; an upper turning body 3; an attachment AT including a boom 4, an arm 5, and a bucket 6; and a cabin 10.

The lower traveling body 1 causes the shovel 100 to travel by using a crawler 1C. The crawler 1C includes a left crawler 1CL and a right crawler 1CR. The left crawler 1CL is hydraulically driven by a traveling hydraulic motor 1ML. Similarly, the right crawler 1CR is hydraulically driven by a traveling hydraulic motor 1MR. Thus, the lower traveling body 1 can travel by itself.

The upper turning body 3 is rotatably mounted on the lower traveling body 1 via a turning mechanism 2. For example, the upper turning body 3 turns with respect to the lower traveling body 1 by the turning mechanism 2 being hydraulically driven by the turning hydraulic motor 2M.

The boom 4 is attached to the center of the front portion of the upper turning body 3 so as to be able to be elevated and lowered about a rotation axis along a left-right direction. The arm 5 is attached to the distal end of the boom 4 so as to be rotatable about a rotation axis along the left-right direction. The bucket 6 is attached to the distal end of the arm 5 so as to be rotatable about a rotation axis along the left-right direction.

The bucket 6 is an example of an end attachment and is used for, for example, excavation work.

The bucket 6 is attached to the distal end of the arm 5 such that the bucket 6 can be appropriately replaced according to the work content of the shovel 100. That is, instead of the bucket 6, a bucket of a type different from the bucket 6, for example, a relatively large bucket, a slope bucket, a dredging bucket, or the like may be attached to the distal end of the arm 5. Further, an end attachment of a type other than the bucket, for example, a stirrer, a breaker, a crusher, or the like may be attached to the distal end of the arm 5. Further, for example, an auxiliary attachment such as a quick coupling or a tilt rotator may be provided between the arm 5 and the end attachment.

The boom 4, the arm 5, and the bucket 6 are hydraulically driven by a boom cylinder 7, an arm cylinder 8, and a bucket cylinder 9, respectively.

The cabin 10 is a control room in which an operator boards and operates the shovel 100. The cabin 10 is mounted on, for example, the left side of the front portion of the upper turning body 3.

For example, the shovel 100 activates driven elements such as the lower traveling body 1 (i.e., a pair of left and right crawlers 1CL and 1CR), the upper turning body 3, the boom 4, the arm 5, and the bucket 6 in response to an operation of an operator who boards in the cabin 10.

Further, the shovel 100 may be configured to be remotely operated from the outside of the shovel 100 instead of or in addition to being configured to be operable by the operator boarding the cabin 10. When the shovel 100 is remotely operated, the inside of the cabin 10 may be unmanned. Hereinafter, the description will be given on the assumption that an operation by an operator includes at least one of an operation on an operation device 26 by an operator of the cabin 10 or a remote operation by an external operator.

For example, as illustrated in FIG. 3, the remote operation includes a mode in which the shovel 100 is operated by an operation input relating to the actuator of the shovel 100 performed by a remote operation assistance device 300.

The remote operation assistance device 300 is provided in, for example, a management center that manages the work of the shovel 100 from the outside. The remote operation assistance device 300 may be a portable operation terminal. In this case, the operator can remotely operate the shovel 100 while directly checking the work situation of the shovel 100 from the surroundings of the shovel 100.

The shovel 100 may transmit an image (hereinafter, referred to as a “surrounding image”) representing a situation around the shovel 100 including the front of the shovel 100 based on the captured image output by an imaging device 40 described later to the remote operation assistance device 300 through a communication device 60 described later, for example. Then, the remote operation assistance device 300 may display the image (surrounding image) received from the shovel 100 on the display device. Further, various information images (information screens) displayed on an output device 50 (a display device 50A) inside the cabin 10 of the shovel 100 may be similarly displayed on the display device of the remote operation assistance device 300. Thus, the operator who uses the remote operation assistance device 300 can remotely operate the shovel 100 while checking the display content such as the image or the information screen representing the situation around the shovel 100 displayed on the display device. The shovel 100 may activate actuators in response to a remote operation signal indicating the content of the remote operation received from the remote operation assistance device 300 by the communication device 60, and drive the driven elements such as the lower traveling body 1, the upper turning body 3, the boom 4, the arm 5, and the bucket 6.
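The exchange between the shovel 100 and the remote operation assistance device 300 described above can be sketched roughly as follows. The message format and function names are assumptions introduced only for explanation and do not reflect an actual protocol or the actual signal formats.

```python
# Hedged sketch of the remote-operation data flow described above.
# All message types and function names are illustrative assumptions.

def shovel_send_surrounding_image(communication_device, imaging_device):
    # Shovel 100 side: generate a surrounding image from the captured images of
    # the imaging device 40 and transmit it through the communication device 60.
    surrounding_image = imaging_device.capture_surrounding_image()
    communication_device.send({"type": "surrounding_image", "payload": surrounding_image})


def remote_assistance_device_step(communication_device, display_device, operation_input):
    # Remote operation assistance device 300 side: display the received image
    # and send back a remote operation signal corresponding to the operator input.
    message = communication_device.receive()
    if message["type"] == "surrounding_image":
        display_device.show(message["payload"])
    lever_state = operation_input.read()  # e.g., boom-raise lever at 40 %
    communication_device.send({"type": "remote_operation_signal", "payload": lever_state})


def shovel_apply_remote_operation(communication_device, controller):
    # Shovel 100 side: drive the driven elements in response to the content of
    # the received remote operation signal (forwarded here to the controller 30).
    message = communication_device.receive()
    if message["type"] == "remote_operation_signal":
        controller.drive_actuators(message["payload"])
```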

The remote operation may include, for example, a mode in which the shovel 100 is operated by voice input, gesture input, or the like from the outside to the shovel 100 by a person (e.g., a worker) around the shovel 100. Specifically, the shovel 100 recognizes a voice uttered by a worker or the like around the shovel 100, a gesture performed by the worker or the like, and the like through a voice input device (e.g., a microphone), a gesture input device (e.g., an imaging device), and the like mounted on the shovel 100. The shovel 100 may activate the actuators in accordance with the recognized contents of the voice, the gesture, and the like, and drive the driven elements such as the lower traveling body 1 (the left and right crawlers 1C), the upper turning body 3, the boom 4, the arm 5, and the bucket 6.

The work of the shovel 100 may be remotely monitored. In this case, a remote monitoring assistance device having the same function as the remote operation assistance device 300 may be provided. The remote monitoring assistance device is, for example, the information processing device 200. Thus, a monitoring person who is the user of the remote monitoring assistance device can monitor the situation of the work of the shovel 100 while checking the surrounding image displayed on the display device of the remote monitoring assistance device. Further, for example, when the monitoring person determines that the monitoring is necessary from the viewpoint of safety, the monitoring person can perform a predetermined input using the input device of the remote monitoring assistance device to intervene in the activation of the shovel 100 by the operator and perform an emergency stop.

The information processing device 200 communicates with the shovel 100 to coordinate with each other and assist the activation of the shovel 100.

The information processing device 200 is, for example, a server or a terminal device for management installed in a management office at a work site of the shovel 100 or in a management center or the like that is located at a place different from the work site of the shovel 100 and manages an activation state or the like of the shovel 100. The terminal device for management may be, for example, a stationary terminal device such as a desktop personal computer (PC) or a portable terminal device (portable terminal) such as a tablet terminal, a smartphone, or a laptop PC. In the latter case, a worker at the work site, a supervisor who supervises the work, a manager who manages the work site, or the like can move around the work site while carrying the portable information processing device 200, and the operator can bring the portable information processing device 200 into the cabin 10 of the shovel 100, for example. Further, a plurality of the information processing devices 200 may be provided in accordance with the use, for example, for remote monitoring, for processing relating to a function of proposing a movement of the shovel 100 to the operator described later, or the like.

The information processing device 200 acquires data relating to the activation state from the shovel 100, for example. Thus, the information processing device 200 can identify the activation state of the shovel 100 and monitor the presence or absence of abnormality of the shovel 100. The information processing device 200 can display data relating to the activation state of the shovel 100 through a display device 208 to be described later and cause the user to check the data.

Further, the information processing device 200 transmits various data such as programs and reference data used in the processing of the controller 30 and the like to the shovel 100, for example. Thus, the shovel 100 can perform various processes relating to the activation of the shovel 100 using various data downloaded from the information processing device 200.

The information processing device 200 performs, for example, a process for supporting a function (hereinafter, referred to as a “movement proposing function”) relating to a proposal of a movement of the shovel 100 to the operator to be described later (refer to FIG. 6). Details will be described later.

[Hardware Configuration of Activation Assistance System]

Next, a hardware configuration of the activation assistance system SYS will be described with reference to FIGS. 4 and 5 in addition to FIGS. 1 to 3.

<Hardware Configuration of Shovel>

FIG. 4 is a block diagram illustrating an example of a hardware configuration of the shovel 100.

In FIG. 4, a path through which mechanical power is transmitted is indicated by a double line, a path through which high-pressure hydraulic oil for driving the hydraulic actuator flows is indicated by a solid line, a path through which pilot pressure is transmitted is indicated by a broken line, and a path through which an electric signal is transmitted is indicated by a dotted line.

The shovel 100 includes respective components such as a hydraulic drive system relating to hydraulic drive of a driven element, an operation system relating to operation of the driven element, a user interface system relating to exchange of information with a user, a communication system relating to communication with the outside, and a control system relating to various controls.

<<Hydraulic Drive System>>

As illustrated in FIG. 4, the hydraulic drive system of the shovel 100 includes the hydraulic actuators HA that hydraulically drive the driven elements such as the lower traveling body 1 (the left and right crawlers 1C), the upper turning body 3, and the attachment AT, as described above. The hydraulic drive system of the shovel 100 according to the present embodiment includes an engine 11, a regulator 13, a main pump 14, and a control valve 17.

The hydraulic actuators HA include traveling hydraulic motors 1ML and 1MR, a turning hydraulic motor 2M, a boom cylinder 7, an arm cylinder 8, a bucket cylinder 9, and the like.

In the shovel 100, a part or all of the hydraulic actuators HA may be replaced with electric actuators. That is, the shovel 100 may be a hybrid shovel or an electric shovel.

The engine 11 is a prime mover of the shovel 100 and is a main power source in a hydraulic drive system. The engine 11 is, for example, a diesel engine using light oil as fuel. The engine 11 is mounted on, for example, a rear portion of the upper turning body 3. The engine 11 rotates at a constant target rotation speed set in advance under direct or indirect control by a controller 30 described later, and drives the main pump 14 and a pilot pump 15.

Note that, instead of or in addition to the engine 11, another prime mover (e.g., an electric motor) or the like may be mounted on the shovel 100.

The regulator 13 controls (adjusts) the discharge amount of the main pump 14 under the control of the controller 30. For example, the regulator 13 adjusts the angle of the swash plate of the main pump 14 (hereinafter, referred to as “tilt angle”) in response to a control instruction from the controller 30.

The main pump 14 supplies the hydraulic oil to the control valve 17 through a high-pressure hydraulic line. The main pump 14 is mounted, for example, on the rear portion of the upper turning body 3, as in the engine 11. The main pump 14 is driven by the engine 11 as described above. The main pump 14 is, for example, a variable displacement hydraulic pump, and as described above, the stroke length of the piston is adjusted by the tilting angle of the swash plate being adjusted by the regulator 13 under the control of the controller 30, and the discharge flow rate and the discharge pressure are controlled.
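As a rough numerical illustration of the relationship described above (the function name and figures below are assumptions, not values from this description), the theoretical discharge flow rate of a variable displacement pump is the displacement determined by the swash plate tilt angle multiplied by the rotation speed:

```python
def theoretical_discharge_flow_rate(displacement_cc_per_rev: float, pump_speed_rpm: float) -> float:
    # Liters per minute; the regulator 13 changes the swash plate tilt angle,
    # which changes the displacement per revolution and hence the flow rate.
    return displacement_cc_per_rev * pump_speed_rpm / 1000.0


# Example (assumed values): 100 cc/rev at 1,800 rpm gives 180 L/min in theory.
print(theoretical_discharge_flow_rate(100.0, 1800.0))
```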

The control valve 17 drives the hydraulic actuators HA in accordance with the content of the operation or remote operation with respect to the operation device 26 by the operator or an operation instruction corresponding to the automatic operation function. The control valve 17 is mounted, for example, in a central portion of the upper turning body 3. As described above, the control valve 17 is connected to the main pump 14 via the high-pressure hydraulic line, and selectively supplies the hydraulic oil supplied from the main pump 14 to each hydraulic actuator in response to an operation by the operator or an operation instruction corresponding to the automatic operation function. Specifically, the control valve 17 includes a plurality of control valves (also referred to as “direction switching valves”) that control the flow rate and the flow direction of the hydraulic oil supplied from the main pump 14 to each of the hydraulic actuators HA.

<<Operation System>>

As illustrated in FIG. 4, the operation system of the shovel 100 includes the pilot pump 15, the operation device 26, a hydraulic control valve 31, a shuttle valve 32, and a hydraulic control valve 33.

The pilot pump 15 supplies a pilot pressure to various hydraulic devices via a pilot line 25. The pilot pump 15 is mounted, for example, on the rear portion of the upper turning body 3, as in the engine 11. The pilot pump 15 is, for example, a fixed displacement hydraulic pump, and is driven by the engine 11 as described above.

The pilot pump 15 may be omitted. In this case, the hydraulic oil at a relatively low pressure obtained by reducing the pressure of the hydraulic oil at a relatively high pressure discharged from the main pump 14 by a predetermined pressure reducing valve may be supplied to various hydraulic devices as the pilot pressure.

The operation device 26 is provided near an operator's cab of the cabin 10 and is used by the operator to operate various driven elements. Specifically, the operation device 26 is used for the operator to operate the hydraulic actuator HA that drives each driven element, and as a result, the operation by the operator of the driven element to be driven by the hydraulic actuator HA can be implemented. The operation device 26 includes a pedal device and a lever device for operating each driven element (hydraulic actuator HA).

For example, as illustrated in FIG. 4, the operation device 26 is a hydraulic pilot type. Specifically, the operation device 26 outputs a pilot pressure corresponding to the operation content to the pilot line 27A on the secondary side by using the hydraulic oil supplied from the pilot pump 15 through the pilot line 25 and the pilot line 25A branching therefrom. The pilot line 27A is connected to one of the inlet ports of the shuttle valve 32, and is connected to the control valve 17 via a pilot line 27 connected to the outlet port of the shuttle valve 32. Thus, the pilot pressure corresponding to the operation content relating to various driven elements (hydraulic actuator HA) in the operation device 26 can be input to the control valve 17 via the shuttle valve 32. Therefore, the control valve 17 can drive each hydraulic actuator HA in accordance with the operation content of the operation device 26 by the operator or the like.

The operation device 26 may be an electric type. In this case, the pilot line 27A, the shuttle valve 32, and the hydraulic control valves 33 are omitted. Specifically, the operation device 26 outputs an electric signal (hereinafter, referred to as an “operation signal”) corresponding to the operation content, and the operation signal is input to the controller 30. The controller 30 outputs a control instruction corresponding to the content of the operation signal, that is, a control signal corresponding to the operation content with respect to the operation device 26 to the hydraulic control valve 31. Thus, the pilot pressure corresponding to the operation content of the operation device 26 is input from the hydraulic control valve 31 to the control valve 17, and the control valve 17 can drive each hydraulic actuator HA according to the operation content of the operation device 26.

Further, the control valve (direction switching valve) built in the control valve 17 for driving each hydraulic actuator HA may be an electromagnetic solenoid type. In this case, the operation signal output from the operation device 26 may be directly input to the control valve 17, that is, the electromagnetic solenoid type control valve.

As described above, a part or all of the hydraulic actuators HA may be replaced with electric actuators. In this case, the controller 30 may output a control instruction corresponding to the operation content of the operation device 26 or the content of the remote operation defined by the remote operation signal to the electric actuator or a driver or the like that drives the electric actuator. In addition, when the shovel 100 is remotely operated, the operation device 26 may be omitted.

The hydraulic control valve 31 is provided for each driven element (hydraulic actuator HA) to be operated by the operation device 26 and for each driving direction (e.g., the raising direction and the lowering direction of the boom 4) of the driven element (hydraulic actuator HA). That is, two hydraulic control valves 31 are provided for each double-acting hydraulic actuator HA. The hydraulic control valve 31 may be provided, for example, in the pilot line 25B between the pilot pump 15 and the control valve 17, and may be configured to be able to change the flow passage areas (i.e., the cross-sectional areas through which the hydraulic oil can flow). Thus, the hydraulic control valve 31 can output a predetermined pilot pressure to the pilot line 27B on the secondary side by using the hydraulic oil of the pilot pump 15 supplied through the pilot line 25B. Therefore, as illustrated in FIG. 4, the hydraulic control valve 31 can indirectly apply a predetermined pilot pressure corresponding to a control signal from the controller 30 to the control valve 17 through the shuttle valve 32 between the pilot line 27B and the pilot line 27. Therefore, the controller 30 can cause the hydraulic control valve 31 to supply the pilot pressure corresponding to the operation content of the operation device 26 to the control valve 17, and implement the movement of the shovel 100 based on the operation by the operator. The controller 30 can cause the hydraulic control valve 31 to supply the pilot pressure corresponding to the operation instruction corresponding to the automatic operation function to the control valve 17, and can implement the movement of the shovel 100 by the automatic operation function.

Further, the controller 30 may control the hydraulic control valve 31 to implement remote operation of the shovel 100, for example. Specifically, the controller 30 outputs a control signal corresponding to the content of the remote operation designated by the remote operation signal received from the remote operation assistance device 300 to the hydraulic control valve 31 by the communication device 60. Thus, the controller 30 can cause the hydraulic control valve 31 to supply the pilot pressure corresponding to the content of the remote operation to the control valve 17, and can implement the movement of the shovel 100 based on the remote operation by the operator.

The shuttle valve 32 has two inlet ports and one outlet port, and outputs the hydraulic oil having the higher pilot pressure of the pilot pressures input to the two inlet ports to the outlet port. The shuttle valve 32 is provided for each driven element (hydraulic actuator HA) to be operated by the operation device 26 and for each driving direction of the driven element (hydraulic actuator HA). One of the two inlet ports of the shuttle valve 32 is connected to a pilot line 27A on the secondary side of the operation device 26 (specifically, the above-described lever device or pedal device included in the operation device 26), and the other is connected to a pilot line 27B on the secondary side of the hydraulic control valve 31. The outlet port of the shuttle valve 32 is connected to the pilot port of the corresponding control valve of the control valve 17 through the pilot line 27. The corresponding control valve is a control valve that drives the hydraulic actuator that is the operation target of the above-described lever device or pedal device connected to one inlet port of the shuttle valve 32. Therefore, each of these shuttle valves 32 can cause the higher one of the pilot pressure of the pilot line 27A on the secondary side of the operation device 26 and the pilot pressure of the pilot line 27B on the secondary side of the hydraulic control valve 31 to act on the pilot port of the corresponding control valve. That is, the controller 30 can control the corresponding control valve without depending on the operation of the operation device 26 by the operator by outputting the pilot pressure higher than the pilot pressure on the secondary side of the operation device 26 from the hydraulic control valve 31. Therefore, the controller 30 can control the operation of the driven elements (the lower traveling body 1, the upper turning body 3, and the attachment AT) regardless of the operation state of the operator on the operation device 26, and can implement the remote operation function.

The hydraulic control valve 33 is provided in the pilot line 27A that connects the operation device 26 and the shuttle valve 32. The hydraulic control valve 33 is configured to be able to change, for example, the flow passage area thereof. The hydraulic control valve 33 is activated in response to a control signal input from the controller 30. Thus, the controller 30 can forcibly reduce the pilot pressure output from the operation device 26 when the operation device 26 is operated by the operator. Therefore, even when the operation device 26 is operated, the controller 30 can forcibly inhibit or stop the activation of the hydraulic actuator corresponding to the operation of the operation device 26. Further, for example, even when the operation device 26 is operated, the controller 30 can reduce the pilot pressure output from the operation device 26 to be lower than the pilot pressure output from the hydraulic control valve 31. Therefore, the controller 30 can reliably apply a desired pilot pressure to the pilot port of the control valve in the control valve 17, for example, regardless of the operation content of the operation device 26 by controlling the hydraulic control valve 31 and the hydraulic control valve 33. Therefore, the controller 30 can more appropriately implement the remote operation function and the automatic operation function of the shovel 100 by controlling the hydraulic control valve 33 in addition to the hydraulic control valve 31, for example.
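The combined effect of the hydraulic control valve 31, the shuttle valve 32, and the hydraulic control valve 33 described above can be summarized by the following sketch, in which the function name, parameters, and example values are assumptions: the pressure acting on the pilot port of a control valve is the higher of the operator-side pressure, as possibly reduced by the hydraulic control valve 33, and the controller-side pressure output by the hydraulic control valve 31.

```python
def effective_pilot_pressure(
    operator_pressure_mpa: float,     # output of the operation device 26 (pilot line 27A)
    valve_33_upper_limit_mpa: float,  # limit imposed by the hydraulic control valve 33
    controller_pressure_mpa: float,   # output of the hydraulic control valve 31 (pilot line 27B)
) -> float:
    # The hydraulic control valve 33 can reduce the operator-side pressure, and
    # the shuttle valve 32 then passes the higher of its two inlet pressures.
    operator_side = min(operator_pressure_mpa, valve_33_upper_limit_mpa)
    return max(operator_side, controller_pressure_mpa)


# Example (assumed values): even against a 3.0 MPa operator input, the controller 30
# can take priority by suppressing the operator side to 0.5 MPa and outputting 2.0 MPa.
print(effective_pilot_pressure(3.0, 0.5, 2.0))  # -> 2.0
```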

<<User Interface System>>

As illustrated in FIG. 4, the user interface system of the shovel 100 includes the operation device 26, the output device 50, and an input device 52.

The output device 50 outputs various kinds of information to a user of the shovel 100 (e.g., an operator of the cabin 10 or an operator of an external remote operation), a person around the shovel 100 (e.g., a worker or a driver of a work vehicle), or the like.

For example, the output device 50 includes an illumination device, the display device 50A (see FIG. 6), or the like that outputs various kinds of information in a visual manner. The illumination device is, for example, a warning lamp (indicator lamp) or the like. The display device 50A is, for example, a liquid-crystal display, an organic electroluminescence (EL) display, or the like. For example, as illustrated in FIG. 2, the illumination device and the display device 50A may be provided inside the cabin 10, and output various kinds of information to an operator or the like inside the cabin 10 in a visual manner. The illumination device and the display device 50A may be provided on, for example, a side surface of the upper turning body 3, and may output various kinds of information to an operator or the like around the shovel 100 in a visual manner.

Further, for example, the output device 50 includes a sound output device 50B (not illustrated in FIG. 6) that outputs various kinds of information in an auditory manner. The sound output device 50B includes, for example, a buzzer, a speaker, and the like. The sound output device 50B may be provided, for example, at least one of inside and outside of the shovel 100, and output various kinds of information to the operator inside the shovel 100 and a person (worker or the like) around the shovel 100 in an auditory manner.

Further, for example, the output device 50 may include a device that outputs various kinds of information by a tactile method such as vibration of the operator's cab.

The input device 52 receives various inputs from the user of the shovel 100, and signals corresponding to the received inputs are taken into the controller 30. The input device 52 is provided inside the cabin 10, for example, and receives an input from an operator or the like inside the cabin 10. The input device 52 may be provided on, for example, a side surface of the upper turning body 3 and may receive an input from an operator or the like around the shovel 100.

For example, the input device 52 includes an operation input device that receives an operation input. The operation input device may include a touch panel mounted on the display device, a touch pad installed around the display device, a button switch, a lever, a toggle, a knob switch provided in the operation device 26 (lever device), and the like.

Further, for example, the input device 52 may include a voice input device that receives a voice input of the user. The voice input device includes, for example, a microphone.

Further, for example, the input device 52 may include a gesture input device that receives a gesture input of the user. The gesture input device includes, for example, an imaging device that images a state of a gesture performed by the user.

Further, for example, the input device 52 may include a biometric input device that receives a biometric input of the user. The biometric input includes, for example, input of biometric information such as a fingerprint or an iris of the user.

<<Communication System>>

As illustrated in FIG. 4, the communication system of the shovel 100 according to the present embodiment includes the communication device 60.

The communication device 60 is connected to an external communication line and communicates with a device provided separately from the shovel 100. The device provided separately from the shovel 100 may include a portable terminal device (portable terminal) brought into the cabin by the user of the shovel 100, in addition to the device outside the shovel 100. The communication device 60 may include, for example, a mobile communication module conforming to a standard such as 4G (4th Generation) or 5G (5th Generation). The communication device 60 may include, for example, a satellite communication module. The communication device 60 may include, for example, a WiFi communication module or a Bluetooth (registered trademark) communication module. The communication device 60 may include a plurality of communication devices in accordance with communication lines to be connected.

For example, the communication device 60 communicates with an external device such as the information processing device 200 or the remote operation assistance device 300 in the work site through a local communication line constructed in the work site. The local communication line is, for example, a local 5G mobile communication line (so-called local 5G) constructed in the work site or a WiFi6 local area network (LAN).

For example, the communication device 60 communicates with the information processing device 200, the remote operation assistance device 300, and the like outside the work site through a communication line of a wide area including the work site, that is, a wide area network (WAN). The wide area network includes, for example, a wide area mobile communication network, a satellite communication network, the Internet, and the like.

<<Control System>>

As illustrated in FIG. 4, the control system of the shovel 100 includes a controller 30. The control system of the shovel 100 according to the present embodiment includes an operation pressure sensor 29, the imaging device 40, and the sensors S1 to S5.

The controller 30 performs various controls relating to the shovel 100.

The functions of the controller 30 may be implemented by any given hardware, or a combination of any given hardware and software, or the like. For example, as illustrated in FIG. 4, the controller 30 includes an auxiliary storage device 30A, a memory device 30B, a central processing unit (CPU) 30C, and an interface device 30D, which are connected to each other via a bus B1.

The auxiliary storage device 30A is a non-volatile storage unit, and stores necessary files, data, and the like together with programs to be installed. The auxiliary storage device 30A is, for example, an electrically erasable programmable read-only memory (EEPROM), a flash memory, or the like.

The memory device 30B loads the program in the auxiliary storage device 30A so that the CPU 30C can read the program, for example, when an instruction to start the program is given. The memory device 30B is, for example, a static random access memory (SRAM).

The CPU 30C executes, for example, a program loaded into the memory device 30B, and implements various functions of the controller 30 in accordance with instructions of the program.

The interface device 30D functions as, for example, a communication interface for connection to a communication line inside the shovel 100. The interface device 30D may include a plurality of different types of communication interfaces in accordance with the type of communication line to be connected.

The interface device 30D functions as an external interface for reading and writing of data from and to recording media. The recording medium is, for example, a dedicated tool that is connected to a connector installed inside the cabin 10 by a detachable cable. The recording medium may be a general-purpose recording medium such as an SD memory card or a universal serial bus (USB) memory. Thus, the program for implementing various functions of the controller 30 can be provided by, for example, portable recording media and installed in the auxiliary storage device 30A of the controller 30. The program may be downloaded from another computer outside the shovel 100 through the communication device 60 and installed in the auxiliary storage device 30A.

Note that some of the functions of the controller 30 may be implemented by another controller (control device). That is, the functions of the controller 30 may be implemented by a plurality of controllers in a distributed manner.

The operation pressure sensor 29 detects a pilot pressure on the secondary side (pilot line 27A) of the hydraulic pilot-type operation device 26, that is, a pilot pressure corresponding to the operation state of each of the driven elements (hydraulic actuators) in the operation device 26. A detection signal of the pilot pressure corresponding to the operation state of each driven element (hydraulic actuator HA) in the operation device 26 by the operation pressure sensor 29 is taken into the controller 30.

When the operation device 26 is an electric type, the operation pressure sensor 29 is omitted. This is because the controller 30 can identify the operation state of each driven element through the operation device 26 based on the operation signal taken in from the operation device 26.

The imaging device 40 acquires an image of the surroundings of the shovel 100. The imaging device 40 may acquire (generate) three dimensional data (hereinafter, simply referred to as “three dimensional data of an object”) indicating the position and the outer shape of an object around the shovel 100 in the imaging range (angle of view) based on the acquired image and data relating to a distance described later. The three dimensional data of the object around the shovel 100 is, for example, data of coordinate information of a point group representing the surface of the object, distance image data, or the like.

For example, as illustrated in FIG. 2, the imaging device 40 includes a camera 40F that images the front of the upper turning body 3, a camera 40B that images the rear of the upper turning body 3, a camera 40L that images the left side of the upper turning body 3, and a camera 40R that images the right side of the upper turning body 3. Thus, the imaging device 40 can image the entire circumference around the shovel 100, that is, a range over an angular direction of 360 degrees in a top view of the shovel 100. The operator can visually recognize the captured images of the cameras 40B, 40L, and 40R and the surrounding images such as the processed images generated based on the captured images through the output device 50 (the display device) and the remote operation display device, and can check the states of the left side, the right side, and the rear side of the upper turning body 3. Further, the operator can remotely operate the shovel 100 while checking the operation of the attachment AT including the bucket 6 by visually recognizing the captured image of the camera 40F and the surrounding image such as the processed image generated based on the captured image through the remote operation display device. Hereinafter, the cameras 40F, 40B, 40L, and 40R may be collectively or individually referred to as a “camera 40X”.

The camera 40X is, for example, a monocular camera. The camera 40X may be a stereo camera, a TOF (Time Of Flight) camera, or another camera capable of acquiring depth information in addition to a two dimensional image (hereinafter, such cameras are collectively referred to as a “3D camera”).

The controller 30 receives output data (e.g., image data or three dimensional data of an object around the shovel 100) captured by the imaging device 40 (camera 40X) via a one-to-one communication line or an in-vehicle network. Thus, for example, the controller 30 can monitor an object around the shovel 100 based on output data of the camera 40X. Further, for example, the controller 30 can determine the surrounding environment of the shovel 100 based on output data of the camera 40X. In addition, for example, the controller 30 can determine the posture state of the attachment AT in the captured image based on the output data of camera 40X (camera 40F). Further, for example, the controller 30 can determine the posture state of the machine body (the upper turning body 3) of the shovel 100 with reference to an object around the shovel 100 based on the output data of the camera 40X.

Note that some of the cameras 40F, 40B, 40L, and 40R may be omitted. For example, when the remote operation of the shovel 100 is not performed, the camera 40F and the camera 40L may be omitted. This is because the operator of the cabin 10 can relatively easily check the state of the front and the left side of the shovel 100. Instead of or in addition to the imaging device 40 (camera 40X), a distance sensor may be provided in the upper turning body 3. The distance sensor is attached to, for example, an upper portion of the upper turning body 3, and acquires data relating to the distance and direction of a surrounding object with respect to the shovel 100. The distance sensor may acquire (generate) three dimensional data (e.g., data of coordinate information of a point group) of an object around the shovel 100 in the sensing range based on the acquired data. The distance sensor is, for example, a light detection and ranging (LIDAR). Further, for example, the distance sensor may be a millimeter wave radar, an ultrasonic sensor, an infrared sensor, or the like.

The sensor S1 is attached to the boom 4 and detects a posture angle (hereinafter, referred to as a “boom angle”) around a rotation axis of a base end corresponding to a coupling portion of the boom 4 with the upper turning body 3. The sensor S1 includes, for example, a rotary potentiometer, a rotary encoder, an accelerometer, an angular accelerometer, a six-axis sensor, an inertial measurement unit (IMU), or the like. Hereinafter, the same may be applied to the sensors S2 to S4. The sensor S1 may include a cylinder sensor that detects the extension/contraction position of the boom cylinder 7. The same applies to the sensors S2 and S3. A detection signal of the boom angle by the sensor S1 is taken into the controller 30. Thus, the controller 30 can identify the posture state of the boom 4.

The sensor S2 is attached to the arm 5 and detects a posture angle (hereinafter, referred to as an “arm angle”) around a rotation axis of a base end corresponding to a coupling portion of the arm 5 with the boom 4. A detection signal of the arm angle by the sensor S2 is taken into the controller 30. Thus, the controller 30 can identify the posture state of the arm 5.

The sensor S3 is attached to the bucket 6 and detects a posture angle (hereinafter, referred to as a “bucket angle”) around a rotation axis of a base end corresponding to a coupling portion of the bucket 6 with the arm 5. A detection signal of the bucket angle by the sensor S3 is taken into the controller 30. Thus, the controller 30 can identify the posture state of the bucket 6.
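As a purely illustrative sketch (the link lengths, angle conventions, and function name below are assumptions, not values given in this description), the posture angles obtained from the sensors S1 to S3 allow a position of the bucket 6 relative to the upper turning body 3 to be computed by simple forward kinematics:

```python
import math


def bucket_tip_position(
    boom_angle_rad: float,    # from the sensor S1
    arm_angle_rad: float,     # from the sensor S2
    bucket_angle_rad: float,  # from the sensor S3
    boom_length_m: float = 5.7,    # assumed link lengths, for illustration only
    arm_length_m: float = 2.9,
    bucket_length_m: float = 1.4,
):
    # Returns an (x, z) position in a machine-body frame, treating each angle as
    # an absolute angle from the horizontal; an actual implementation depends on
    # how each sensor defines its posture angle.
    x = (boom_length_m * math.cos(boom_angle_rad)
         + arm_length_m * math.cos(arm_angle_rad)
         + bucket_length_m * math.cos(bucket_angle_rad))
    z = (boom_length_m * math.sin(boom_angle_rad)
         + arm_length_m * math.sin(arm_angle_rad)
         + bucket_length_m * math.sin(bucket_angle_rad))
    return x, z
```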

The sensor S4 detects an inclined state of the machine body (e.g., the upper turning body 3) with respect to a predetermined reference surface (e.g., a horizontal plane). The sensor S4 is attached to, for example, the upper turning body 3, and detects inclination angles (hereinafter, referred to as a “front-rear inclination angle” and a “left-right inclination angle”) around two axes in the front-rear direction and the left-right direction of the shovel 100 (i.e., the upper turning body 3). A detection signal corresponding to the inclination angle (the front-rear inclination angle and the left-right inclination angle) detected by the sensor S4 is taken into the controller 30. Thus, the controller 30 can identify the inclination state of the machine body (upper turning body 3).

The sensor S5 is attached to the upper turning body 3 and outputs detection information relating to a turning state of the upper turning body 3. The sensor S5 detects, for example, a turning angular velocity and a turning angle of the upper turning body 3. The sensor S5 includes, for example, a gyro sensor, a resolver, a rotary encoder, and the like. Detection information relating to the turning state detected by the sensor S5 is taken into the controller 30. Thus, the controller 30 can identify the turning state such as the turning angle of the upper turning body 3.

Note that, in a case where the sensor S4 includes a gyro sensor, a six-axis sensor, an inertial measurement unit (IMU), or the like capable of detecting angular velocities about three axes, the turning state (e.g., turning angular velocities) of the upper turning body 3 may be detected based on detection signals of the sensor S4. In this case, the sensor S5 may be omitted. In addition, when it is possible to identify the posture state of the upper turning body 3, the attachment AT, or the like based on the output of the imaging device 40 or the distance sensor, at least some of the sensors S1 to S5 may be omitted.

<Hardware Configuration of Information Processing Device>

FIG. 5 is a block diagram illustrating an example of a hardware configuration of the information processing device 200.

The functions of the information processing device 200 are implemented by any given hardware, a combination of any given hardware and software, or the like. For example, as illustrated in FIG. 5, the information processing device 200 includes an external interface 201, an auxiliary storage device 202, a memory device 203, a CPU 204, a high-speed arithmetic device 205, a communication interface 206, an input device 207, and the display device 208, which are connected via a bus B2.

The external interface 201 functions as an interface for reading data from recording media 201A and writing data to the recording media 201A. The recording media 201A include, for example, flexible disks, CDs (Compact Discs), DVDs (Digital Versatile Discs), BDs (Blu-ray (trademark) Discs), SD memory cards, USB-type memories, and the like. The information processing device 200 can read various kinds of data used in processing through the recording media 201A, store the information in the auxiliary storage device 202, and install programs for implementing various functions.

The information processing device 200 may acquire various data and programs used in processing from an external device via the communication interface 206.

The auxiliary storage device 202 stores the various installed programs, and also stores files, data, and the like necessary for various processes. The auxiliary storage device 202 includes, for example, a hard disk drive (HDD), a solid state drive (SSD), a flash memory, or the like.

When an instruction to activate a program is issued, the memory device 203 reads the program from the auxiliary storage device 202 and stores the program. The memory device 203 includes, for example, a dynamic random access memory (DRAM) or an SRAM.

The CPU 204 executes various programs loaded from the auxiliary storage device 202 to the memory device 203 and implements various functions relating to the information processing device 200 according to the programs.

The high-speed arithmetic device 205 performs arithmetic processing at a relatively high speed in conjunction with the CPU 204. The high-speed arithmetic device 205 includes, for example, a graphics processing unit (GPU), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or the like.

The high-speed arithmetic device 205 may be omitted depending on the speed of necessary arithmetic processing.

The communication interface 206 is used as an interface for connecting to an external device so as to be able to communicate with the external device. Thus, the information processing device 200 can communicate with an external device such as the shovel 100 through the communication interface 206. The communication interface 206 may include a plurality of types of communication interfaces depending on a communication method with a device to be connected.

The input device 207 receives various inputs from a user.

The input device 207 includes, for example, an operation input device that receives a mechanical operation input from a user. The operation input device includes, for example, a button, a toggle, a lever, and the like. The operation input device includes, for example, a touch panel mounted on the display device 208, a touch pad provided separately from the display device 208, and the like.

The input device 207 includes, for example, a voice input device capable of receiving a voice input from the user. The voice input device includes, for example, a microphone capable of collecting a voice of the user.

The input device 207 includes, for example, a gesture input device capable of receiving a gesture input from the user. The gesture input device includes, for example, a camera capable of capturing an image of a gesture of the user.

The input device 207 includes, for example, a biometric input device capable of receiving a biometric input from the user. The biometric input device includes, for example, a camera capable of acquiring image data containing information on a fingerprint or an iris of the user.

The display device 208 displays an information screen and an operation screen to the user. For example, the display device 208 includes the above-described remote operation display device. The display device 208 is, for example, a liquid crystal display or an organic electroluminescence (EL) display.

Note that the remote operation assistance device 300 may be implemented by any given hardware or a combination of any given hardware and software, and may have the same hardware configuration as the information processing device 200. For example, the remote operation assistance device 300 is mainly configured by a computer including a CPU, a memory device, an auxiliary storage device, an interface device, an input device, and a display device, as in the information processing device 200 (FIG. 5). The memory device is, for example, an SRAM, a DRAM, or the like. The auxiliary storage device is, for example, an HDD, an SSD, an EEPROM, a flash memory, or the like. The interface device includes an external interface for connection to an external recording medium and a communication interface for communication with the outside such as the shovel 100. The input device includes, for example, a lever-type operation input device. Thus, the operator can perform an operation input relating to the actuator of the shovel 100 using the operation input device, and the remote operation assistance device 300 can transmit a signal corresponding to the operation input to the shovel 100 using the communication interface. Therefore, the operator can remotely operate the shovel 100 using the remote operation assistance device.

First Example of Movement Proposing Function

Next, a first example of a function of proposing a movement of the shovel 100 to a user (operator) (movement proposing function) will be described with reference to FIGS. 6 and 7 in addition to FIGS. 1 to 5.

<Functional Configuration>

FIG. 6 is a functional block diagram illustrating a first example of a functional configuration relating to a movement proposing function of the activation assistance system SYS.

The shovel 100 includes an assistance device 150. The assistance device 150 assists an operation of the shovel 100 performed by the operator.

As illustrated in FIG. 6, the assistance device 150 includes a controller 30, the imaging device 40, the output device 50 (e.g., display device 50A), and the communication device 60.

The controller 30 includes a movement log providing unit 301 and a work assistance unit 302 as functional units.

In a case where the activation assistance system SYS includes a plurality of shovels 100, the controller 30 of one shovel 100 may include only the movement log providing unit 301 out of the movement log providing unit 301 and the work assistance unit 302, and the controller 30 of another shovel 100 may include only the work assistance unit 302. In this case, the former shovel 100 only has a function of acquiring a movement log of the shovel 100 and providing the movement log to the information processing device 200, and that movement log is used for the operator's operation assistance function (movement proposing function) in the latter shovel 100. The same applies to a second example (FIG. 8) of the movement proposing function described below.

The information processing device 200 includes, as functional units, a movement log acquisition unit 2001, a movement log storage unit 2002, a training data generation unit 2003, a machine learning unit 2004, a trained model storage unit 2005, and a distribution unit 2006.
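How these functional units might fit together is sketched below. The class and method names are assumptions introduced only to illustrate a possible flow from accumulated movement logs to a distributed trained model; the individual units themselves are described in the following paragraphs.

```python
# Illustrative sketch only; class and method names are assumptions, not the
# disclosed design of the information processing device 200.

class MovementProposalPipeline:
    def __init__(self, movement_log_storage, trained_model_storage, distribution):
        self.movement_log_storage = movement_log_storage      # movement log storage unit 2002
        self.trained_model_storage = trained_model_storage    # trained model storage unit 2005
        self.distribution = distribution                      # distribution unit 2006

    def run_training_cycle(self):
        # Training data generation unit 2003: pair the shape of the work target
        # with the movement actually executed for that shape (see the movement
        # log contents described below); "movement_data" is a hypothetical field.
        logs = self.movement_log_storage.load_all()
        training_data = [(log.shape_data, log.movement_data) for log in logs]

        # Machine learning unit 2004: train a model that infers a suitable
        # movement from the shape of the work target.
        trained_model = self.train(training_data)

        # Store the trained model and distribute it to shovels 100 that use
        # the movement proposing function.
        self.trained_model_storage.save(trained_model)
        self.distribution.distribute(trained_model)

    def train(self, training_data):
        raise NotImplementedError  # any supervised learning method could be used
```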

The movement log providing unit 301 is a functional unit for acquiring a movement log of the shovel 100, which is original data for implementing the movement proposing function, and providing the movement log to the information processing device 200. Specifically, the movement log of an operator who has been operating the shovel 100 for a long time and is relatively experienced (hereinafter, for convenience, referred to as a “skilled operator”) is acquired and provided to the information processing device 200.

The movement log of the shovel 100 includes data relating to the shape of the work target around the shovel 100 and data relating to the movement of the shovel 100 executed with respect to the shape of the work target. The data relating to the shape of the work target around the shovel 100 is, for example, data relating to the topography of the ground of the work site as the work target of the shovel 100. The data relating to the shape of the work target of the shovel 100 is, for example, image data of the imaging device 40 or three dimensional data of the work target obtained from the image data. The data relating to the movement of the shovel 100 is, for example, data representing the operation content of the operator. The data representing the operation content of the operator is, for example, output data of the operation pressure sensor 29 in the case of the hydraulic pilot type operation device 26 or output data (data of an operation signal) of the operation device 26 in the case of the electric operation device 26. The data relating to the movement of the shovel 100 may be data representing the movement state of the shovel 100 actually executed in response to the operation by the operator. The data representing the movement state of the shovel 100 is, for example, data output from the sensors S1 to S5 or data relating to the posture state of the shovel 100 acquired from the output data of the sensors S1 to S5.
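The content of one movement log described above can be pictured as the following non-limiting sketch; the field names and the timestamp field are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Any


@dataclass
class MovementLogEntry:
    # Shape of the work target around the shovel 100 at or immediately before
    # the start of the movement: image data of the imaging device 40 or three
    # dimensional data (e.g., point-cloud coordinates or a distance image).
    shape_data: Any

    # Movement of the shovel 100 executed for that shape: the operation content
    # of the operator (output of the operation pressure sensor 29 or of the
    # electric operation device 26) and/or the movement state identified from
    # the output of the sensors S1 to S5.
    operation_content: Any = None
    movement_state: Any = None

    # Illustrative assumption, not stated in the description.
    timestamp: float = 0.0
```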

The movement log providing unit 301 includes a movement log recording unit 301A, a movement log storage unit 301B, and a movement log transmission unit 301C.

The movement log recording unit 301A acquires a movement log of the shovel 100 and records the movement log in the movement log storage unit 301B. For example, every time the movement of the shovel 100 is executed, the movement log recording unit 301A records, in the movement log storage unit 301B, data relating to the shape of the work target around the shovel 100 at the start of execution of the movement or immediately before the execution of the movement and data relating to the movement of the shovel 100.

The movement log storage unit 301B stores movement logs of the shovel 100 in an accumulated manner. For example, in the movement log storage unit 301B, data relating to the shape of the work target around the shovel 100 for each movement of the shovel 100 and data relating to the corresponding movement of the shovel 100 are stored in association with each other. For example, the movement log storage unit 301B may accumulate record data representing the correspondence between data relating to the shape of the work target around the shovel 100 for each movement of the shovel 100 and data relating to the corresponding movement of the shovel 100, and a database of movement logs may be constructed.

The movement log in the movement log storage unit 301B that has been transmitted to the information processing device 200 by the movement log transmission unit 301C described later may be deleted afterward.

The movement log transmission unit 301C transmits the movement log of the shovel 100 stored in the movement log storage unit 301B to the information processing device 200 through the communication device 60. The movement log transmission unit 301C may transmit the record data representing the correspondence relationship between data relating to the shape of the work target around the shovel 100 for each movement of the shovel 100 and data relating to the corresponding movement of the shovel 100 to the information processing device 200.

For example, the movement log transmission unit 301C transmits, to the information processing device 200, the movement log of the shovel 100 that is stored in the movement log storage unit 301B and has not yet been transmitted, in response to a signal requesting transmission of the movement log of the shovel 100 (hereinafter, a "transmission request signal") received from the information processing device 200. The movement log transmission unit 301C may also automatically transmit, at a predetermined timing, the movement log of the shovel 100 that is stored in the movement log storage unit 301B and has not yet been transmitted. The predetermined timing is, for example, the time of stopping the activation of the shovel 100 (the time of turning off the key switch) or the time of starting the activation (the time of turning on the key switch). The movement log acquisition unit 2001 acquires the movement log of the shovel 100 received from the shovel 100.
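
As an illustration of the transmission policy described above, the sketch below shows how the movement log transmission unit 301C might select and upload logs that have not yet been sent, either in response to a transmission request signal or at a predetermined timing such as key-off. The storage and communication interfaces (fetch_untransmitted, send, mark_transmitted) are hypothetical placeholders, not actual device APIs.

```python
# Illustrative sketch only: upload untransmitted movement logs.
def transmit_pending_logs(storage, comm, triggered_by_request=False, key_switch_off=False):
    # Transmit either in response to a transmission request signal from the
    # information processing device 200 or at a predetermined timing (key-off).
    if not (triggered_by_request or key_switch_off):
        return 0
    pending = storage.fetch_untransmitted()
    for entry in pending:
        comm.send(entry)               # upload via the communication device 60
        storage.mark_transmitted(entry)
    return len(pending)
```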

The movement log acquisition unit 2001 acquires the movement log of the shovel 100 by transmitting a transmission request signal to the shovel 100 in response to an operation of the user of the information processing device 200 or automatically at a predetermined timing. The movement log acquisition unit 2001 may acquire the movement log of the shovel 100 transmitted from the shovel 100 at a predetermined timing.

The movement log storage unit 2002 stores the movement logs of the shovel 100 acquired by the movement log acquisition unit 2001 in an accumulated manner. For example, in the movement log storage unit 2002, as in the case of the movement log storage unit 301B, data relating to the shape of the work target around the shovel 100 for each movement of the shovel 100 and data relating to the corresponding movement of the shovel 100 are stored in association with each other.

The training data generation unit 2003 generates training data for machine learning based on the movement log of the shovel 100 in the movement log storage unit 2002. The training data generation unit 2003 may automatically generate the training data by batch processing or may generate the training data in response to an input from a user of the information processing device 200. Each piece of training data is a combination of input data, which is data relating to the shape of the work target around the shovel 100, and ground truth output data (hereinafter, "ground truth data"), which represents the movement of the shovel 100 suitable for the shape of the work target corresponding to the input data.

The ground truth data includes, for example, data representing the type of movement selected from among a plurality of candidate movements that can be executed in a predetermined work. In the case of a ground leveling work at the work site, the plurality of candidate movements include, for example, a sweep-out movement, a horizontal leveling movement, a rolling compaction movement, a broom-turning movement, and the like. The sweep-out movement is, for example, a movement of sweeping out earth and sand forward with the back surface of the bucket 6 by moving the attachment AT and pushing the bucket 6 forward along the ground. In the sweep-out movement, for example, the attachment AT performs a lowering movement of the boom 4 and an opening movement of the arm 5. The horizontal leveling movement is, for example, a movement of leveling the unevenness of the ground (topographic surface) by moving the attachment AT and moving the claw tip of the bucket 6 in a substantially horizontal direction along the ground so as to pull the claw tip back toward the front. In the horizontal leveling movement, for example, the attachment AT performs a raising movement of the boom 4 and a closing movement of the arm 5. The rolling compaction movement is, for example, a movement of moving the attachment AT to press the ground with the back surface of the bucket 6. The rolling compaction movement may be a movement of pushing the bucket 6 forward along the ground to sweep the earth and sand to a predetermined position in front of the bucket 6 with the back surface of the bucket 6 and then pressing the ground at the predetermined position with the back surface of the bucket 6. In the rolling compaction movement, for example, the attachment AT performs a lowering movement of the boom 4 when pressing the ground. The broom-turning movement is, for example, a movement of moving the upper turning body 3 to turn the bucket 6 to the left and right in a state where the bucket 6 is placed along the ground. The broom-turning movement may be, for example, a movement of pushing the bucket 6 forward while moving the attachment AT and the upper turning body 3 to turn the bucket 6 alternately to the left and to the right in a state where the bucket 6 is placed along the ground. In the broom-turning movement, for example, the upper turning body 3 alternately repeats a turning movement to the left and to the right. In the broom-turning movement, for example, the attachment AT may perform the lowering movement of the boom 4 and the opening movement of the arm 5, as in the sweep-out movement, in addition to the alternate left-and-right turning movement of the upper turning body 3. The ground truth data may include, for example, data representing the trajectory of the bucket 6 during the movement of the shovel 100.
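
The sketch below illustrates, under assumptions, how one training example could be assembled from a movement-log entry: the terrain data becomes the input and the selected candidate movement (optionally with the bucket trajectory) becomes the ground truth. The candidate-movement labels follow the ground leveling work described above; everything else is an assumed, illustrative structure rather than the specified data format.

```python
# Hypothetical construction of a (input, ground truth) training pair.
from enum import Enum

class CandidateMovement(Enum):
    SWEEP_OUT = 0            # push the bucket forward along the ground
    HORIZONTAL_LEVELING = 1  # pull the claw tip back horizontally along the ground
    ROLLING_COMPACTION = 2   # press the ground with the back surface of the bucket
    BROOM_TURNING = 3        # turn the bucket left and right along the ground

def make_training_pair(log_entry):
    """Return (input_data, ground_truth) for one logged movement."""
    input_data = log_entry.terrain_heightmap
    ground_truth = {
        "movement": CandidateMovement[log_entry.movement_type.upper()].value,
        # Optionally, the bucket trajectory actually traced by the skilled operator.
        "trajectory": getattr(log_entry, "bucket_trajectory", None),
    }
    return input_data, ground_truth
```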

The machine learning unit 2004 causes the base learning model to perform machine learning based on the set of training data generated by the training data generation unit 2003, and generates a trained model LM. The trained model LM (base learning model) includes, for example, a neural network such as a deep neural network (DNN).

The trained model LM outputs, for example, a prediction probability for each of a plurality of candidate movements that can be executed in a predetermined work, using data relating to the shape of the work target around the shovel 100 as an input condition. Each prediction probability represents the reliability of a candidate movement. As described above, the trained model LM reflects the movement log of the skilled operator who operates the shovel 100, and it is considered that the higher the prediction probability, the higher the reliability of selecting the candidate movement. The prediction probability also represents a suitability for the shape of the work target around the shovel 100 given as the input condition, because it is considered that the higher the prediction probability, the more likely it is that a skilled operator would determine that the candidate movement is suitable for the shape of the work target. The trained model LM may output data representing a trajectory (hereinafter, a "target trajectory") of the bucket 6 for each of the plurality of candidate movements, using data relating to the shape of the work target around the shovel 100 as the input condition. In addition, the trained model LM may output a plurality of data representing target trajectories of the bucket 6 for each of the plurality of candidate movements and may output a prediction probability for each of the plurality of target trajectories of the bucket 6, using data relating to the shape of the work target around the shovel 100 as the input condition. This prediction probability represents the reliability of the target trajectory or the suitability for the shape of the work target around the shovel 100 given as the input condition, as in the prediction probability of the candidate movement. The trained model LM may be generated for each of a plurality of different works. For example, the trained model LM is generated for each work such as the ground leveling work, the slope construction work, and the filling work.
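
As a minimal, illustrative stand-in for the trained model LM, the sketch below uses a small fully connected network (NumPy only) that maps a flattened terrain height map to a prediction probability for each candidate movement via softmax. The actual trained model LM may be a deep neural network; the layer sizes, random weights, and flattening scheme here are assumptions made purely for illustration.

```python
# Illustrative terrain-to-candidate-movement probability model (untrained).
import numpy as np

class CandidateMovementModel:
    def __init__(self, n_inputs, n_candidates=4, n_hidden=64, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (n_inputs, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(0.0, 0.1, (n_hidden, n_candidates))
        self.b2 = np.zeros(n_candidates)

    def predict_proba(self, heightmap):
        x = np.asarray(heightmap, dtype=float).ravel()   # flatten the terrain data
        h = np.tanh(x @ self.w1 + self.b1)
        logits = h @ self.w2 + self.b2
        exp = np.exp(logits - logits.max())
        return exp / exp.sum()   # prediction probability per candidate movement

# Example: probabilities for a 16x16 terrain patch (weights are untrained here).
model = CandidateMovementModel(n_inputs=16 * 16)
probs = model.predict_proba(np.zeros((16, 16)))
```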

The trained model LM output by the machine learning unit 2004 is stored in the trained model storage unit 2005.

The distribution unit 2006 distributes the trained model LM to the shovel 100.

For example, when the trained model LM is generated by the machine learning unit 2004, the distribution unit 2006 distributes the most recently generated trained model LM to the shovel 100. The distribution unit 2006 may distribute the latest trained model LM of the trained model storage unit 2005 to the shovel 100 in response to a signal for requesting distribution of the trained model LM received from the shovel 100.

The work assistance unit 302 is a functional unit for assisting work of the shovel 100 by an operation by the operator.

The work assistance unit 302 includes a trained model storage unit 302A, a work target shape acquisition unit 302B, an estimation unit 302C, and a proposing unit 302D.

The trained model storage unit 302A stores the trained model LM distributed from the information processing device 200 and received through the communication device 60.

The work target shape acquisition unit 302B acquires data relating to a shape (topography) of a work target around the shovel 100 based on outputs of the imaging device 40 and the distance sensor.

The estimation unit 302C estimates a movement having a relatively high reliability or a relatively high suitability for the shape of the work target around the shovel 100 among a plurality of candidate movements that can be performed in the predetermined work, based on the data relating to the shape of the work target around the shovel 100. The estimation unit 302C may estimate one or more target trajectories of the bucket 6 having relatively high reliability or suitability, based on the data relating to the shape of the work target around the shovel 100, for each of the plurality of candidate movements.

For example, the estimation unit 302C may estimate a movement having a relatively high reliability or a relatively high suitability for the shape of the work target around the shovel 100 using the trained model LM, based on the data relating to the shape of the work target around the shovel 100 as an input condition. The estimation unit 302C may estimate one or a plurality of target trajectories of the bucket 6 having relatively high reliability or suitability using the trained model LM, based on the data relating to the shape of the work target around the shovel 100 as an input condition.
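
The sketch below illustrates the kind of estimation the estimation unit 302C performs: run the (assumed) model on the current terrain data and rank the candidate movements by their prediction probability (reliability or suitability). The model object and label names are the hypothetical ones introduced above.

```python
# Illustrative ranking of candidate movements by prediction probability.
def estimate_candidates(model, heightmap,
                        labels=("sweep_out", "horizontal_leveling",
                                "rolling_compaction", "broom_turning")):
    probs = model.predict_proba(heightmap)
    ranked = sorted(zip(labels, probs), key=lambda lp: lp[1], reverse=True)
    return ranked  # e.g. [("sweep_out", 0.46), ("horizontal_leveling", 0.31), ...]
```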

The proposing unit 302D proposes, to the operator in the cabin 10 through the output device 50 such as the display device 50A, a movement of the shovel 100 having relatively high reliability or suitability for the shape of the work target around the shovel 100, based on the estimation result of the estimation unit 302C. Thus, even an inexperienced operator can select a more appropriate movement in accordance with the current shape of the work target around the shovel 100. Therefore, it is possible to improve the convenience of the operator and to improve the work efficiency of the shovel 100. The number of movements proposed to the operator may be one or more. For example, the proposing unit 302D notifies the operator of the numerical value of the suitability (reliability) for the shape of the work target around the shovel 100 for all or some of the plurality of candidate movements, thereby proposing a movement having a relatively high suitability (see FIG. 10 described later).

The proposing unit 302D may propose, through the output device 50, one target trajectory of the bucket 6 for the proposed movement, which has a relatively high reliability or suitability for the shape of the work target around the shovel 100, based on the estimation result of the estimation unit 302C. Thus, even an inexperienced operator can identify a more appropriate target trajectory of the bucket 6 in accordance with the current shape of the work target around the shovel 100, and operate the shovel 100 so as to achieve the target trajectory. Therefore, it is possible to further improve the convenience of the operator and to further improve the work efficiency of the shovel 100.

The proposing unit 302D may propose, through the output device 50, a plurality of target trajectories of the bucket 6 for the proposed movement, which have relatively high reliability or suitability for the shape of the work target around the shovel 100. Thus, the operator can identify a plurality of more appropriate target trajectories of the bucket 6 in accordance with the current shape of the work target around the shovel 100, and can operate the shovel 100 so as to achieve one target trajectory selected by the operator. Therefore, the work efficiency of the shovel 100 can be improved in a manner reflecting the intention of the operator. For example, the proposing unit 302D notifies the operator of a numerical value of the suitability (reliability) for the shape of the work target around the shovel 100 for each of the plurality of target trajectories of the proposed movement, thereby proposing a target trajectory of the bucket 6 having a relatively high suitability (see FIGS. 11 and 13 described later).

In addition, in a case where the shovel 100 is remotely operated, the proposing unit 302D may propose, through the communication device 60, a movement or a target trajectory of the bucket 6 having relatively high reliability or suitability to the operator using the remote operation assistance device 300. In this case, the proposing unit 302D transmits data representing the proposed content to the remote operation assistance device 300 through the communication device 60. Thus, the remote operation assistance device 300 can propose the movement or the target trajectory of the bucket 6 having relatively high reliability or suitability to the operator using the remote operation assistance device 300, through its display device, sound output device, or the like.

<Processing>

FIG. 7 is a flowchart schematically illustrating a first example of processing relating to the movement proposing function of the shovel 100.

The flowchart of FIG. 7 is started when a predetermined input for starting the movement proposing function is received through the input device 52 or the input device of the remote operation assistance device 300, for example. The same applies to the flowchart of FIG. 9 described below.

As illustrated in FIG. 7, in step S102 (an example of an acquiring step), the work target shape acquisition unit 302B acquires data relating to the shape of the work target around the shovel 100 based on the output of the imaging device 40.

When the processing of step S102 is completed, the controller 30 proceeds to step S104.

In step S104, the estimation unit 302C estimates a movement with a relatively high suitability (reliability) for the shape of the current work target around the shovel 100, based on the data acquired in step S102.

When the processing of step S104 is completed, the controller 30 proceeds to step S106.

In step S106 (an example of a proposing step), the proposing unit 302D causes the display device 50A to display a proposed movement from among the plurality of candidate movements and a target trajectory of the movement based on the estimation result in step S104.

When the processing of step S106 is completed, the controller 30 proceeds to step S108.

In step S108, the controller 30 determines whether the driven elements (actuators) have been operated. The controller 30 proceeds to step S110 when the driven elements are not operated, and proceeds to step S112 when the driven elements are operated.

In step S110, the controller 30 determines whether the end condition is satisfied. The end condition is, for example, a condition that a predetermined input representing the end of the movement proposing function from the operator is received through the input device 52 or the input device of the remote operation assistance device 300. The end condition may be a condition that a predetermined input representing the end of the work is received from the operator through the input device 52 or the input device of the remote operation assistance device 300. The end condition may be a condition that the controller 30 determines the end of the work based on the captured image of the imaging device 40. The controller 30 ends the processing of the current flowchart when the end condition is satisfied, and returns to step S108 when the end condition is not satisfied.

Meanwhile, in step S112, the controller 30 determines whether the operation of the driven elements corresponding to one movement of the shovel 100 has been completed, based on the operation state of the operation device 26, the movement state of the shovel 100, and the like. The controller 30 can identify the operation state of the operation device 26, the movement state of the shovel 100, and the like based on the outputs of the operation pressure sensor 29, the operation signals outputted from the operation device 26, the outputs of the sensors S1 to S5, and the like. The controller 30 proceeds to step S114 when the operation of the driven elements corresponding to one movement of the shovel 100 has been completed, and waits until the operation is completed (repeats the processing of step S112) when the operation is not completed.

In step S114, the controller 30 determines whether the end condition is satisfied. The controller 30 ends the processing of the current flowchart when the end condition is satisfied, and returns to step S102 when the end condition is not satisfied.
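
The sketch below is a schematic rendering of the loop of FIG. 7 (steps S102 to S114). The helper callables (acquire_shape, estimate, propose_on_display, operation_detected, one_movement_completed, end_condition) are hypothetical placeholders standing in for the units described above, not actual controller APIs.

```python
# Schematic rendering of the movement proposing loop of FIG. 7 (illustrative only).
def movement_proposing_loop(acquire_shape, estimate, propose_on_display,
                            operation_detected, one_movement_completed, end_condition):
    while True:
        shape = acquire_shape()              # S102: work target shape acquisition unit 302B
        result = estimate(shape)             # S104: estimation unit 302C
        propose_on_display(result)           # S106: proposing unit 302D via display device 50A
        while not operation_detected():      # S108: wait for operation of the driven elements
            if end_condition():              # S110: end check while no operation is made
                return
        while not one_movement_completed():  # S112: wait until one movement is completed
            pass
        if end_condition():                  # S114: end check after the movement
            return
```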

In this way, in the present example, the assistance device 150 can propose a movement of the shovel 100 and a target trajectory of the bucket 6 having a high suitability (reliability) with respect to the shape of the work target around the shovel 100 to the operator through the display device 50A and the remote operation assistance device 300.

[Second Example of Movement Proposing Function]

Next, a second example of the movement proposing function of the shovel 100 to the user (operator) will be described with reference to FIGS. 8 and 9 in addition to FIGS. 1 to 5.

Hereinafter, the same reference symbols are attached to configurations that are the same as or correspond to those in the first example described above, the description will focus on the parts that differ from the first example, and the description of the same or corresponding contents may be simplified or omitted.

<Functional Configuration>

FIG. 8 is a functional block diagram illustrating a second example of a functional configuration relating to the movement proposing function of the activation assistance system SYS.

As illustrated in FIG. 8, the assistance device 150 of the shovel 100 includes the controller 30, the hydraulic control valve 31, the imaging device 40, the output device 50 (the display device 50A), the input device 52, and the communication device 60.

The controller 30 includes, as functional units, the movement log providing unit 301 and the work assistance unit 302, as in the first example described above.

The work assistance unit 302 includes a trained model storage unit 302A, a work target shape acquisition unit 302B, an estimation unit 302C, a proposing unit 302D, and a movement control unit 302E.

The movement control unit 302E controls the hydraulic control valve 31 in response to an input of an instruction from the operator received through the input device 52 or the communication device 60, and causes the shovel 100 to automatically execute the movement proposed to the operator by the proposing unit 302D. Thus, on the condition that an instruction is input by the operator, the assistance device 150 can cause the shovel 100 to automatically perform the proposed movement in accordance with the current shape of the work target around the shovel 100. Accordingly, even an inexperienced operator can cause the shovel 100 to move more appropriately in accordance with the current shape of the work target around the shovel 100 simply by inputting the instruction. Therefore, it is possible to further improve the convenience of the operator and to further improve the work efficiency of the shovel 100.

For example, when the number of proposed movements is one, the movement control unit 302E controls the hydraulic control valve 31 in response to an input of an instruction from the operator, and causes the movement of the shovel 100 to be automatically executed as the movement proposed by the proposing unit 302D. For example, when there are a plurality of proposed movements, the movement control unit 302E automatically executes one movement selected by an input of an instruction of the operator among the plurality of proposed movements. For example, when one target trajectory of the bucket 6 is proposed by the proposing unit 302D, the movement control unit 302E automatically executes the movement of the shovel 100 to be proposed so that the bucket 6 moves along the target trajectory to be proposed. For example, when a plurality of target trajectories of the bucket 6 are proposed by the proposing unit 302D, the movement control unit 302E automatically executes the proposed movement so that the bucket 6 moves along one target trajectory selected by the input of the instruction of the operator from among the plurality of target trajectories.
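
The sketch below hedges a possible way for the movement control unit 302E to resolve the operator's instruction into one movement and one target trajectory before driving the hydraulic control valve 31. The proposal format, instruction fields, and execution callback are illustrative assumptions.

```python
# Illustrative resolution of an execution instruction into one movement/trajectory.
def execute_instructed_movement(proposals, instruction, execute_along):
    # proposals: list of (movement, [trajectory, ...]) pairs from the proposing unit 302D
    # instruction: operator input received through the input device 52 or the
    #              communication device 60, assumed here to carry selection indices
    movement, trajectories = proposals[instruction.get("movement_index", 0)]
    trajectory = trajectories[instruction.get("trajectory_index", 0)] if trajectories else None
    # Automatically execute the selected movement so that the bucket 6 follows the
    # selected target trajectory (placeholder for the hydraulic control processing).
    execute_along(movement, trajectory)
    return movement, trajectory
```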

The movement log recording unit 301A records, in the movement log storage unit 301B, a movement log including data relating to the shape of the work target acquired by the work target shape acquisition unit 302B and data representing the movement performed by the movement control unit 302E and its target trajectory. Thus, the movement log including the data representing the movement of the shovel 100 actually executed in response to the operator's instruction and the target trajectory can be accumulated in the movement log storage unit 301B. Further, the movement log transmission unit 301C can upload the accumulated movement logs to the information processing device 200. Therefore, the machine learning unit 2004 can update the trained model LM by retraining or additionally training the trained model LM using the movement logs.

The machine learning unit 2004 may compare the retrained or additionally trained model with the current trained model LM using predetermined evaluation data, and update the trained model LM in the trained model storage unit 2005 when the evaluation result of the retrained or additionally trained model is higher.
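
As an illustration of this update policy, the sketch below retrains a copy of the trained model LM on the new movement logs, evaluates both models on held-out evaluation data, and replaces the stored model only if the retrained model scores higher. All functions are assumed placeholders rather than actual interfaces of the information processing device 200.

```python
# Illustrative evaluate-before-update policy for the trained model LM.
def maybe_update_model(current_model, new_logs, retrain, evaluate, store):
    candidate_model = retrain(current_model, new_logs)   # retrain or additionally train
    if evaluate(candidate_model) > evaluate(current_model):
        store(candidate_model)        # update the trained model storage unit 2005
        return candidate_model
    return current_model              # keep the current trained model LM
```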

<Processing>

FIG. 9 is a flowchart schematically illustrating a second example of the processing relating to the movement proposing function of the shovel 100.

As illustrated in FIG. 9, the processing of steps S202 to S206 is the same as the processing of steps S102 to S106 of FIG. 7, and thus the description thereof will be omitted.

When the processing of step S206 is completed, the controller 30 proceeds to step S208.

In step S208, the controller 30 determines whether an input instructing execution of the proposed movement (hereinafter, “input of an execution instruction”) has been received from the operator through the input device 52 or the communication device 60. The controller 30 proceeds to step S210 when the input of the execution instruction is not received, and proceeds to step S212 when the input of the execution instruction is received.

In step S210, the controller 30 determines whether the end condition is satisfied. The controller 30 ends the processing of the current flowchart when the end condition is satisfied, and returns to step S208 when the end condition is not satisfied.

On the other hand, in step S212, the movement control unit 302E controls the hydraulic control valve 31 to automatically execute the movement designated by the input of the execution instruction. Further, when a target trajectory is also designated by the input of the execution instruction, the movement control unit 302E causes the designated movement of the shovel 100 to be executed so that the bucket 6 moves along the designated target trajectory.

When the processing of step S212 is completed, the controller 30 proceeds to step S214.

In step S214, the movement log recording unit 301A records a movement log including data relating to the shape of the work target acquired by the work target shape acquisition unit 302B and data relating to the movement performed by the movement control unit 302E and the target trajectory in the movement log storage unit 301B.

When the processing of step S214 is completed, the controller 30 proceeds to step S216.

In step S216, the controller 30 determines whether the end condition is satisfied. The controller 30 ends the processing of the current flowchart when the end condition is satisfied, and returns to step S202 when the end condition is not satisfied.

In this example, the work target shape acquisition unit 302B may acquire data relating to the shape of the work target by predicting a change in the shape of the work target due to the movement of the shovel 100 in the processing of the previous step S212.
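
The sketch below is a conceptual illustration (an assumption, not the specified prediction method) of predicting how the terrain height map changes when the bucket follows a target trajectory: any terrain point lying above the trajectory height is lowered to it, approximating material being cut or swept away.

```python
# Conceptual prediction of the terrain shape after an automatic movement.
import numpy as np

def predict_terrain_after_movement(heightmap, trajectory_points):
    # heightmap: 2-D array of ground heights; trajectory_points: iterable of (row, col, height)
    predicted = np.array(heightmap, dtype=float, copy=True)
    for r, c, h in trajectory_points:
        predicted[r, c] = min(predicted[r, c], h)  # remove material above the bucket path
    return predicted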

In this way, in the present example, the assistance device 150 can automatically execute the movement of the shovel 100 and the target trajectory of the bucket 6 having a high suitability (reliability) with respect to the shape of the work target around the shovel 100, in response to the instruction of the operator.

In this example, the assistance device 150 can accumulate a movement log including data relating to the shape of the work target around the shovel 100 and data relating to the automatically performed movement of the shovel 100 and the target trajectory. Therefore, the assistance device 150 can update the trained model LM using the accumulated movement logs.

Specific Example of Display Content Relating to Movement Proposing Function of Shovel

Next, specific examples of display contents of the display device 50A relating to the movement proposing function of the shovel 100 will be described with reference to FIGS. 10 to 15.

The display contents of FIGS. 10 to 15 may be displayed on the display device of the remote operation assistance device 300.

First Example

FIG. 10 is a diagram illustrating a first example (a screen 1000) of display contents of the display device 50A relating to the movement proposing function of the shovel 100.

The screen 1000 includes images 1001 to 1006.

The image 1001 is an image representing a work target around the shovel 100. In the present example, the image 1001 is an image representing the work target (the ground at the work site) around the shovel 100 when viewed from a predetermined viewpoint around the shovel 100, which is generated based on the output (image data) of the imaging device 40 using a known image processing technique.

The image 1002 is an image schematically representing the shovel 100. In the present example, the image 1002 is an image schematically representing the shovel 100 when viewed from the same viewpoint as the image 1001, and is displayed superimposed on the image 1001.

The image 1003 includes images representing movements proposed by the proposing unit 302D from among the plurality of candidate movements in a list format. In the present example, the image 1003 includes images 1003A to 1003D representing respective rows of the sweep-out movement, the horizontal leveling movement, the rolling compaction movement, and the broom-turning movement from among the plurality of candidate movements, as the proposed movements. In this example, the images 1003A to 1003D represent the reliability (suitability) of each of the sweep-out movement, the horizontal leveling movement, the rolling compaction movement, and the broom-turning movement. Thus, the operator can select a movement to be executed by his/her operation or to be automatically performed by the shovel 100 in consideration of the reliability (suitability) from among the proposed movements.

Note that only the movement having the highest reliability (suitability) (the sweep-out movement in this example) may be represented in the image 1003. That is, the proposing unit 302D may propose, to the operator through the image 1003, only the movement having the highest reliability (suitability) from among the plurality of candidate movements in the predetermined work. In addition, in the image 1003, only the movements (e.g., the sweep-out movement and the horizontal leveling movement) having a reliability (suitability) equal to or higher than a predetermined reference (e.g., 30%) from among the plurality of candidate movements may be represented. That is, the proposing unit 302D may propose to the operator only the movements having a reliability (suitability) equal to or higher than the predetermined reference from among the plurality of candidate movements.
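
The sketch below illustrates the two proposal policies just mentioned: presenting only the candidate with the highest reliability, or presenting only candidates whose reliability is at or above a predetermined reference (e.g., 0.30). The ranked-list format follows the hypothetical estimation sketch given earlier; the policy names are illustrative.

```python
# Illustrative filtering of ranked candidates before they are proposed.
def select_proposals(ranked, mode="threshold", reference=0.30):
    # ranked: list of (movement_name, probability) sorted by probability, descending
    if mode == "best_only":
        return ranked[:1]                                   # highest reliability only
    return [(name, p) for name, p in ranked if p >= reference]  # at or above the reference
```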

The image 1004 includes images representing target trajectories for the respective proposed movements, which are represented in the image 1003. The image 1004 is displayed in a superimposed manner on the image 1001 around the image 1002. Thus, the operator can easily identify a target trajectory for each proposed movement while comparing the image 1001 representing the state of the ground surface of the work site around the shovel 100 and the image 1002 representing the shovel 100. The image 1004 includes images 1004A to 1004D.

The image 1004A is an image representing the target trajectory of the sweep-out movement.

The image 1004B is an image representing a target trajectory of the horizontal leveling movement.

The image 1004C is an image representing a target trajectory of the rolling compaction movement.

The image 1004D is an image representing a target trajectory of the broom-turning movement.

The images 1004A to 1004D may be expressed in a manner that allows a portion of the target trajectory that comes into contact with the work target (ground) to be distinguished from the other portions. For example, the images 1004A to 1004D may be expressed such that the portion of the target trajectory that comes into contact with the work target and the other portions differ in color. This can help the operator grasp the perspective (depth) of the images 1004A to 1004D, which correspond to the target trajectories, on the image 1001.
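
A hedged sketch of one way to distinguish the contact portion: a trajectory point whose height is at or below the ground height at that position is tagged for a different display color. The tolerance value, point layout, and ground lookup function are assumptions for illustration only.

```python
# Illustrative tagging of the contact portion of a target trajectory.
def tag_contact_points(trajectory_points, ground_height_at, tolerance=0.02):
    # trajectory_points: iterable of (row, col, height); ground_height_at(row, col) -> float
    tagged = []
    for r, c, h in trajectory_points:
        in_contact = h <= ground_height_at(r, c) + tolerance
        tagged.append((r, c, h, "contact" if in_contact else "airborne"))
    return tagged
```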

In this example, a shaded cursor is represented in the image 1003A corresponding to the sweep-out movement. In this example, the image 1004A corresponding to the sweep-out movement of the image 1004 is expressed by a thicker line than the images 1004B to 1004D corresponding to the other movements. Thus, in this example, a state in which the sweep-out movement is selected is represented. For example, the operator can select any one of the sweep-out movement, the horizontal leveling movement, the rolling compaction movement, and the broom-turning movement by designating any one of the images 1003A to 1003D using the input device 52. Similarly, for example, the operator can select any one of the sweep-out movement, the horizontal leveling movement, the rolling compaction movement, and the broom-turning movement by designating any one of the images 1004A to 1004D using the input device 52.

The image 1005 is an icon for determining execution of a movement selected by the user (operator) from the proposed movements.

For example, the operator can cause the shovel 100 to automatically execute the selected movement by operating the image 1005 using the input device 52.

The image 1006 is an icon for ending the movement proposing function of the shovel 100. The same applies to images 1106, 1206, 1306, 1406, and 1506 described below.

In this way, in the present example, the controller 30 causes the display device 50A to display a plurality of movements from among the plurality of candidate movements in the ground leveling work together with the reliability (suitability) of each movement with respect to the current shape (topography) of the work target around the shovel 100. Thus, the controller 30 can propose, to the operator, a movement having a relatively high reliability (suitability) with respect to the current shape (topography) of the work target around the shovel 100.

Second Example

FIG. 11 is a diagram illustrating a second example (a screen 1100) of a display content of the display device 50A relating to the movement proposing function of the shovel 100.

The following description will focus on the parts that differ from the first example described above, and the description of the same or corresponding contents as in the first example may be simplified or omitted.

The screen 1100 includes images 1101 to 1106.

The image 1101 is an image representing a work target around the shovel 100, as in the image 1001 of FIG. 10.

The image 1102 is an image schematically representing the shovel 100, as in the image 1002 of FIG. 10.

The image 1103 includes images representing a plurality of target trajectories of the bucket 6 in a list format with respect to one movement proposed by the proposing unit 302D from among a plurality of candidate movements. In the present example, the image 1103 includes images 1103A to 1103D representing respective rows of four target trajectories (“Sweep-out I” to “Sweep-out IV”) of the bucket 6 for the sweep-out movement as one proposed movement. In the present example, the reliabilities (suitabilities) of the four target trajectories of the bucket 6 are represented in the images 1103A to 1103D. Thus, the operator can select a target trajectory of the bucket 6 to be executed by his/her operation or to be automatically performed by the shovel 100 in consideration of the reliability from among the four target trajectories of the bucket 6 for one proposed movement (the sweep-out movement).

Note that only the target trajectory of the bucket 6 having the highest reliability (suitability) (in this example, "Sweep-out I") may be represented in the image 1103. That is, the proposing unit 302D may propose, to the operator through the image 1103, only the target trajectory having the highest reliability (suitability) from among the plurality of target trajectories for one proposed movement in the predetermined work. In addition, only the target trajectories (e.g., "Sweep-out I" and "Sweep-out II") having a reliability (suitability) equal to or higher than a predetermined reference (e.g., 30%) from among the plurality of target trajectories of the bucket 6 may be represented in the image 1103. That is, the proposing unit 302D may propose, to the operator, only the target trajectories having a reliability (suitability) equal to or higher than the predetermined reference from among the plurality of target trajectories of the bucket 6 for one proposed movement. The same applies to an image 1203A described below.

The image 1104 includes images representing the four target trajectories for the one proposed movement, which are represented in the image 1103. The image 1104 is displayed in a superimposed manner on the image 1101 around the image 1102, as in the image 1004. Thus, the operator can easily identify the plurality of target trajectories for the one proposed movement (the sweep-out movement) while comparing the image 1101 representing the state of the ground surface of the work site around the shovel 100 and the image 1102 representing the shovel 100. The image 1104 includes images 1104A to 1104D.

The image 1104A is an image representing a target trajectory of the sweep-out movement corresponding to the image 1103A (“Sweep-out I”).

The image 1104B is an image representing a target trajectory of the sweep-out movement corresponding to the image 1103B (“Sweep-out II”).

The image 1104C is an image representing a target trajectory of the sweep-out movement corresponding to the image 1103C (“Sweep-out III”).

The image 1104D is an image representing a target trajectory of the sweep-out movement corresponding to the image 1103D ("Sweep-out IV").

In this example, a shaded cursor is represented in the image 1103A corresponding to "Sweep-out I". In this example, the image 1104A corresponding to the target trajectory of "Sweep-out I" in the image 1104 is expressed by a thicker line than the images 1104B to 1104D corresponding to the other target trajectories. Thus, in the present example, a state in which the target trajectory of "Sweep-out I" is selected from among the four target trajectories is represented. For example, the operator can select any one of the four target trajectories by designating any one of the images 1103A to 1103D using the input device 52. Similarly, for example, the operator can select any one of the four target trajectories of the bucket 6 by designating any one of the images 1104A to 1104D using the input device 52.

The image 1105 is an icon for executing one proposed movement (sweep-out movement) such that the bucket 6 moves along a target trajectory selected by the user (operator) from among a plurality of target trajectories.

For example, the operator can cause the shovel 100 to automatically execute one proposed movement such that the bucket 6 moves along the selected target trajectory by operating the image 1105 using the input device 52.

In this way, in the present example, the controller 30 causes the display device 50A to display a plurality of target trajectories of the bucket 6 for one movement relating to the ground leveling work together with the reliability (suitability) of each of the target trajectories with respect to the current shape (topography) of the work target around the shovel 100. Thus, the controller 30 can propose, to the operator, a plurality of target trajectories of the bucket 6 having relatively high reliability (suitability) with respect to the current shape (topography) of the work target around the shovel 100.

Third Example

FIGS. 12 and 13 are diagrams illustrating a third example (a screen 1200 or 1300) of the display contents of the display device 50A relating to the movement proposing function of the shovel 100.

The following description will focus on the parts that differ from the first and second examples described above, and the description of the same or corresponding contents as in the first and second examples may be simplified or omitted.

The screen 1200 includes images 1201 to 1206.

The image 1201 is an image representing the work target around the shovel 100, as in the image 1001 of FIG. 10.

The image 1202 is an image schematically representing the shovel 100, as in the image 1002 of FIG. 10.

The image 1203 includes images representing movements proposed by the proposing unit 302D from among the plurality of candidate movements in a list format, as in the image 1003 of FIG. 10. In the present example, the image 1203 includes images 1203A to 1203D representing respective rows of the sweep-out movement, the horizontal leveling movement, the rolling compaction movement, and the broom-turning movement from among the plurality of candidate movements, as the proposed movements. In this example, the images 1203A to 1203D represent the reliability (suitability) of each of the sweep-out movement, the horizontal leveling movement, the rolling compaction movement, and the broom-turning movement.

Further, the image 1203A is an image representing a plurality of target trajectories of the bucket 6 in a list format for one proposed movement (sweep-out movement), as in the image 1103 of FIG. 11. In the present example, the image 1203A includes images 1203A1 to 1203A4 representing respective rows of four target trajectories (“Sweep-out I” to “Sweep-out IV”) of the bucket 6 for the sweep-out movement having the highest reliability (suitability) from among the plurality of candidate movements. In the present example, the reliabilities (suitabilities) of the four target trajectories of the bucket 6 are expressed in the images 1203A1 to 1203A4.

The image 1204 includes, as in the image 1004 of FIG. 10, images representing target trajectories for respective proposed movements, which are represented in the image 1203. The image 1204 includes images 1204A to 1204D.

The image 1204A is an image representing a target trajectory of the sweep-out movement. Specifically, the image represents one target trajectory (“Sweep-out I”) having the highest reliability from among the target trajectories (“Sweep-out I” to “Sweep-out IV”) of the bucket 6 corresponding to the sweep-out movement.

The images 1204B to 1204D are the same as the images 1004B to 1004D of FIG. 10, and thus the description thereof will be omitted.

In this example, a shaded cursor is represented in the image 1203A corresponding to the sweep-out movement. In this example, the image 1204A corresponding to the sweep-out movement in the image 1204 is expressed by a thicker line than the images 1204B to 1204D corresponding to the other movements. Thus, in this example, a state in which the sweep-out movement is selected is represented.

The image 1205 is an icon for determining execution of a movement selected by the user (operator) from among the proposed movements. More specifically, in the present example, the image 1205 is an icon for shifting to the screen 1300 for selecting one of the four target trajectories of the bucket 6 corresponding to the sweep-out movement, in a state where the sweep-out movement having the highest reliability is selected from among the proposed movements. That is, when the image 1205 is operated through the input device 52 in the state of the screen 1200, the display shifts to the screen 1300.

The screen 1300 includes images 1301 to 1306.

The image 1303 is an image representing movements proposed by the proposing unit 302D in a list format, from among the plurality of candidate movements, as in the image 1203 of FIG. 12. Specifically, the image 1303 includes images 1303A to 1303D representing rows of the sweep-out movement, the horizontal leveling movement, the rolling compaction movement, and the broom-turning movement, respectively, from among the plurality of candidate movements, as the proposed movements.

Further, the image 1303A is an image representing a plurality of target trajectories of the bucket 6 in a list format for one proposed movement (the sweep-out movement), as in the image 1203A of FIG. 12. Specifically, the image 1303A includes images 1303A1 to 1303A4 representing respective rows of four target trajectories (“Sweep-out I” to “Sweep-out IV”) of the bucket 6 for the sweep-out movement having the highest reliability (suitability) from among the plurality of candidate movements.

The image 1304 includes, as in the image 1204 of FIG. 12, images representing target trajectories for respective proposed movements, which are represented in the image 1303. Specifically, the image 1304 includes images 1304A to 1304D.

The images 1304A to 1304D are the same as the images 1104A to 1104D of FIG. 11, respectively, and thus the description thereof will be omitted.

In this example, a shaded cursor is represented in the image 1303A1 corresponding to "Sweep-out I". In this example, the image 1304A corresponding to the target trajectory of "Sweep-out I" in the image 1304 is expressed by a thicker line than the images 1304B to 1304D corresponding to the other target trajectories. Thus, in the present example, a state in which the target trajectory of "Sweep-out I" is selected from among the four target trajectories is represented.

The image 1305 is an icon for executing one proposed movement (sweep-out movement) such that the bucket 6 moves along a target trajectory selected by the user (operator) from among a plurality of target trajectories.

In this way, in the present example, the controller 30 causes the display device 50A to display a plurality of movements from among the plurality of candidate movements relating to the ground leveling work together with their respective reliabilities, and also causes the display device 50A to display a plurality of target trajectories for the movement having the highest reliability. Thus, the controller 30 can propose, to the operator, a plurality of movements having relatively high reliability with respect to the current shape (topography) of the work target around the shovel 100 and a plurality of target trajectories of the bucket 6 for the movement having the highest reliability.

Fourth Example

FIG. 14 is a diagram illustrating a fourth example (screen 1400) of display contents of the display device 50A relating to the movement proposing function of the shovel 100.

Hereinafter, a description will be given focusing on a portion different from the first to third examples described above, and a description of the same or corresponding content as the first to third examples described above may be simplified or omitted.

The screen 1400 includes images 1401 to 1406.

The image 1401 is an image representing a work target around the shovel 100, as in the image 1001 of FIG. 10.

The image 1402 is an image schematically representing the shovel 100, as in the image 1002 of FIG. 10.

The image 1403 includes images representing movements proposed by the proposing unit 302D from among the plurality of candidate movements in a list format, as in the image 1003 of FIG. 10. Specifically, the image 1403 includes images 1403A to 1403D representing rows of the sweep-out movement, the horizontal leveling movement, the rolling compaction movement, and the broom-turning movement, respectively, from among the plurality of candidate movements, as the proposed movements.

The image 1404 includes images representing target trajectories for respective proposed movements, which are represented in the image 1403, as in the image 1004 of FIG. 10. Specifically, the image 1404 includes images 1404A to 1404D.

The images 1404A to 1404D are the same as the images 1004A to 1004D of FIG. 10, respectively, and thus the description thereof will be omitted.

In this example, a worker W is illustrated in the image 1401 in the image area on which the image 1404A is displayed in a superimposed manner. Therefore, when the movement with the highest reliability (the sweep-out movement) is selected, the attachment AT may come too close to the worker W or the attachment AT may come into contact with the worker W.

To handle such a situation, in the present example, a shaded cursor is represented in the image 1403B corresponding to the horizontal leveling movement, and the image 1404B corresponding to the horizontal leveling movement is represented by a thicker line than the images 1404A, 1404C, and 1404D corresponding to the other movements. That is, in this example, the operator selects a movement (the horizontal leveling movement) different from the sweep-out movement through the input device 52 and causes the shovel 100 to execute the selected movement. This can prevent the attachment AT from coming too close to the worker W or from coming into contact with the worker W.

The image 1405 is the same as the image 1005 in FIG. 10, and thus the description thereof will be omitted.

In this way, in the present example, the target trajectory of the proposed movement is displayed, in a superimposed manner, on the image 1401 representing the situation around the shovel 100, and thus the operator can identify the relationship between the target trajectory and an obstacle such as the worker W in the work site. Therefore, it is possible to improve the safety of the shovel 100 while improving the convenience of the operator and the work efficiency of the shovel 100.

Fifth Example

FIG. 15 is a diagram illustrating a fifth example (screen 1500) of display contents of the display device 50A relating to the movement proposing function of the shovel 100.

Hereinafter, a description will be given focusing on a portion different from the first to fourth examples described above, and a description of the same or corresponding content as the first to fourth examples described above may be simplified or omitted.

The screen 1500 includes images 1501 to 1506.

The image 1501 is an image representing a work target around the shovel 100, as in the image 1001 of FIG. 10.

The image 1502 is an image schematically representing the shovel 100, as in the image 1002 of FIG. 10.

The image 1503 includes images representing movements proposed by the proposing unit 302D from among the plurality of candidate movements in a list format, as in the image 1003 of FIG. 10. Specifically, the image 1503 includes images 1503A to 1503D representing respective rows of the horizontal leveling movement, the sweep-out movement, the rolling compaction movement, and the broom-turning movement from among the plurality of candidate movements as the proposed movements.

The image 1504 includes images representing target trajectories for respective proposed movements, which are represented in the image 1503, as in the image 1004 of FIG. 10. Specifically, the image 1504 includes images 1504A to 1504D.

The images 1504A to 1504D are images representing the target trajectories of the respective proposed movements represented in the image 1503, as follows.

The image 1504A is an image representing a target trajectory of the horizontal leveling movement.

The image 1504B is an image representing a target trajectory of the sweep-out movement.

The image 1504C is an image representing a target trajectory of the rolling compaction movement.

The image 1504D is an image representing a target trajectory of the broom-turning movement.

In this example, the image areas of the image 1501 on which the images 1504A, 1504C, and 1504D are displayed in a superimposed manner are included in the area 1501A in which the ground leveling work has already been completed. Therefore, when the movement with the highest reliability (the horizontal leveling movement corresponding to the image 1504A) is selected, not only is unnecessary work performed on the area for which the ground leveling work has already been completed, but work for restoring that area is also required due to the effect of the unnecessary work. As a result, a decrease in the work efficiency of the shovel 100 or a delay in the progress of the work at the work site may be caused.

In contrast, in the present example, a shaded cursor is represented in the image 1503B corresponding to the sweep-out movement, and the image 1504B corresponding to the sweep-out movement is represented by a thicker line than the images 1504A, 1504C, and 1504D corresponding to the other movements. That is, in this example, the operator selects a movement (sweep-out movement) different from the horizontal leveling movement and causes the shovel 100 to execute the selected movement. Thus, it is possible to prevent a situation in which the shovel 100 performs work on an area where work has already been completed. Therefore, it is possible to prevent a decrease in the work efficiency of the shovel 100, a delay in the progress of the work at the work site, and the like.

In this example, an image indicating an area in which the ground leveling work has been completed (an image in which the area 1501A is covered with oblique lines, or the like) is displayed in a superimposed manner on the area 1501A. Thus, the operator can more reliably recognize that the area 1501A is an area where the ground leveling work is completed. In this case, information on the area in which the work in the work site is completed is distributed from the information processing device 200 to the shovel 100, for example.

The image 1505 is the same as the image 1005 in FIG. 10, and thus the description thereof will be omitted.

In this way, in the present example, the target trajectory of the proposed movement is displayed, in a superimposed manner, on the image 1501 representing the situation around the shovel 100, and thus the operator can identify the relationship between the target trajectory and the completed situation of the work in the work site. Therefore, the operator can select a more appropriate movement or target trajectory in accordance with the completed situation of the work at the work site.

Another Example of Movement Proposing Function

Next, another example of a movement proposing function will be described.

The first and second examples of the movement proposing function described above may be combined, modified, or changed as appropriate.

For example, in the first example or the second example of the movement proposing function described above, the proposing unit 302D may propose only a movement, from among a plurality of candidate movements that can be performed in a predetermined work, and may omit proposing a target trajectory corresponding to the proposed movement.

In the first example and the second example of the movement proposing function described above, and in the modifications of the first and second examples, the proposing unit 302D may propose not only a movement or a target trajectory having a relatively high reliability (suitability) with respect to the shape of the work target around the shovel 100 but may also intentionally propose a movement or a target trajectory having a relatively low reliability (suitability). For example, the movement having a relatively low reliability (suitability) is a movement of the shovel 100 or a target trajectory of the bucket 6 which is estimated based on the trained model LM and has a reliability (suitability) lower than a predetermined reference. Further, the movement having a relatively low reliability (suitability) may be a movement of the shovel 100 or a target trajectory of the bucket 6 estimated based on a trained model different from the trained model LM, that is, a trained model obtained by machine learning using a training data set including inappropriate movements or target trajectories. This makes it possible to reduce the operator's dependence on the movement proposing function and to prompt the operator to use the movement proposing function based on the operator's own appropriate determination. In this case, when a movement with a relatively low reliability (suitability) is selected, the selection may be recorded as a log in the auxiliary storage device 30A of the controller 30 or the like. The log may include identification information on the operator. Thus, for example, the assistance device 150 (the controller 30) can attach a label to an operator who has a high probability of selecting a movement of the shovel 100 or a target trajectory of the bucket 6 with a relatively low reliability (suitability). Therefore, a manager or the like at the work site can manage the use situation of the movement proposing function for each operator by checking the log, the label, or the like thereafter.
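
As a small illustration of the logging described above, the sketch below appends a record, including operator identification information, whenever a movement with a reliability below an assumed reference is selected. The record layout and file-based storage are assumptions for illustration; they do not represent the actual format of the auxiliary storage device 30A.

```python
# Illustrative logging of low-reliability selections with operator identification.
import json
import time

def record_low_reliability_selection(log_path, operator_id, movement, reliability, reference=0.30):
    if reliability >= reference:
        return False                       # selection is not a low-reliability case
    record = {"time": time.time(), "operator": operator_id,
              "movement": movement, "reliability": reliability}
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return True
```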

In the first example, the second example, and the modifications of the movement proposing function described above, a part or all of the functions of the assistance device 150 may be transferred to the remote operation assistance device 300. For example, the function of the proposing unit 302D is transferred to the remote operation assistance device 300. In addition to the function of the proposing unit 302D, the function of the estimation unit 302C may be transferred to the remote operation assistance device 300. In addition to the functions of the proposing unit 302D and the estimation unit 302C, the function of the work target shape acquisition unit 302B may be transferred to the remote operation assistance device 300. In this case, the image data of the imaging device 40 is transmitted from the shovel 100 to the remote operation assistance device 300.

In the first example, the second example, and the modifications of the movement proposing function described above, a part or all of the functions of the assistance device 150 may be transferred to the information processing device 200. For example, the function of the work target shape acquisition unit 302B is transferred to the information processing device 200. In this case, the image data of the imaging device 40 is transmitted from the shovel 100 to the information processing device 200. Further, the function of the estimation unit 302C may be transferred to the information processing device 200 in addition to the function of the work target shape acquisition unit 302B. Furthermore, the function of the proposing unit 302D may be transferred to the information processing device 200 in addition to the functions of the work target shape acquisition unit 302B and the estimation unit 302C. Thus, for example, the portable information processing device 200 carried into the cabin 10 can propose, to the operator, one or more movements from among a plurality of candidate movements of the shovel 100 that can be performed in a predetermined work, or propose a trajectory of such one or more movements.

The assistance device 150 according to the first example and the second example of the movement proposing function, or the modifications thereof described above, may propose, to the operator, one or more movements from among a plurality of candidate movements in a predetermined work of another work machine different from the shovel 100. For example, the other work machine is a forestry machine having a harvester device. In this case, the assistance device 150 may propose one or more movements for trees that are targets for the movements by the harvester device, from among a plurality of candidate movements for trees present at the site.

[Effects of Assistance Device (1)]

Next, the effects of the assistance device according to the present embodiment will be described.

In the present embodiment, the assistance device includes an acquisition unit and a proposing unit. The assistance device is, for example, the assistance device 150. The work machine is, for example, the shovel 100. The acquisition unit is, for example, the work target shape acquisition unit 302B. The proposing unit is, for example, the proposing unit 302D. Specifically, the acquisition unit acquires data relating to a shape (e.g., topography) of the work target around the work machine. The proposing unit proposes, to a user, a movement from among a plurality of candidate movements of the work machine in a predetermined work, based on the data relating to the shape of the work target around the work machine acquired by the acquisition unit.

In the present embodiment, the work machine may include the above-described assistance device.

Thus, the assistance device can propose, to an operator of the work machine, a movement that is more suitable for the shape of the work target around the work machine from among the plurality of candidate movements that can be performed in the predetermined work, for example. Thus, the work machine can be moved more appropriately. Therefore, the work efficiency of the work machine can be improved.

In the present embodiment, the assistance device includes the estimation unit. The estimation unit is, for example, the estimation unit 302C. Specifically, the estimation unit estimates a movement suitable for the shape of the work target around the work machine from among the plurality of candidate movements, based on data relating to the shape of a work target around the work machine acquired by the acquisition unit, using a trained model trained by machine learning with training data relating to movements of the work machine, where the movements of the work machine correspond to the shape of the work target, and are operated by a relatively highly skilled operator. The trained model is, for example, the trained model LM. The proposing unit may propose a movement from among the plurality of candidate movements based on the estimation result of the estimation unit.
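
As a non-limiting illustration, the following Python sketch shows how an estimation of this kind might look when the trained model is treated as a classifier over the candidate movements; the feature extraction, the predict_proba interface, and the movement names are assumptions introduced for the example and are not the actual trained model LM.

    import numpy as np

    CANDIDATE_MOVEMENTS = ["excavate_and_discharge", "sweep_out",
                           "horizontal_leveling", "rolling_compaction", "broom_turning"]

    def estimate_movement(shape_grid, model):
        """Return the candidate movement with the highest estimated suitability.

        shape_grid: 2-D elevation map of the work target around the work machine.
        model: any object exposing predict_proba(features) -> class probabilities.
        """
        shape_grid = np.asarray(shape_grid, dtype=float)
        features = np.array([[shape_grid.mean(), shape_grid.std(),
                              shape_grid.max() - shape_grid.min()]])
        probs = model.predict_proba(features)[0]
        best = int(np.argmax(probs))
        return CANDIDATE_MOVEMENTS[best], float(probs[best])  # movement and its suitability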

Thus, the assistance device can propose a movement more suitable for the shape of the work target around the work machine from among the plurality of candidate movements that can be performed in the predetermined work, using the trained model.

In the present embodiment, the proposing unit may propose, to the user, a plurality of movements from among the plurality of candidate movements, based on the data relating to the shape of the work target around the work machine acquired by the acquisition unit.

Thus, the assistance device can provide the operator with options and prompt the operator to make a determination by his/her own will. Therefore, by reflecting the determination of the operator, the work machine can be operated more appropriately.

In the present embodiment, the plurality of proposed movements may include a movement having a relatively low suitability for the shape of the work target around the work machine from among the plurality of candidate movements.

Thus, the assistance device can prompt the operator to make a determination by his/her own will. Therefore, by reflecting the decision of the operator, the work machine can be operated more appropriately.

In the present embodiment, the proposing unit may propose a movement from among a plurality of candidate movements, based on the data relating to the shape of the work target around the work machine acquired by the acquisition unit, together with a suitability of the proposed movement for the shape of the work target around the work machine.

Thus, the assistance device can provide the operator with information for determining whether to execute the movement. Therefore, the assistance device can prompt the operator to make a more appropriate determination and cause the work machine to make a movement more appropriately.

In the present embodiment, the proposing unit may propose a plurality of movements from among the plurality of candidate movements, based on the data relating to the shape of the work target around the work machine acquired by the acquisition unit, together with a suitability for the shape of the work target around the work machine for each of the plurality of proposed movements.

Thus, the assistance device can provide the operator with a plurality of options relating to the movements of the work machine, and can provide the operator with information for determining the options. Therefore, the assistance device can prompt the operator to make a more appropriate determination and cause the work machine to perform a more appropriate movement.

In the present embodiment, the proposing unit may propose a movement from among a plurality of candidate movements, based on the data relating to the shape of the work target around the work machine acquired by the acquisition unit, together with a trajectory of the work part of the work machine by the movement. The work part is, for example, the bucket 6.

Thus, the assistance device can provide the operator with information for determining whether to execute the movement. Therefore, the assistance device can prompt the operator to make a more appropriate determination and cause the work machine to make a movement more appropriately.

In the present embodiment, the proposing unit may propose a movement from among a plurality of candidate movements, together with a plurality of trajectories of the work part by the proposed movement, based on the data relating to the shape of the work target around the work machine acquired by the acquisition unit.

Thus, the assistance device can provide the operator with a plurality of options relating to the trajectory of the work part corresponding to the proposed movement. Therefore, the assistance device can prompt the operator to make a more appropriate determination and cause the work machine to make a movement more appropriately.

In the present embodiment, the proposing unit may propose a movement from among the plurality of candidate movements, based on the data relating to the shape of the work target around the work machine acquired by the acquisition unit, together with a plurality of trajectories of the work part by the proposed movement and a suitability of the proposed movement for the shape of the work target around the work machine for each of the plurality of trajectories.

Thus, the assistance device can provide the operator with a plurality of options for the trajectory of the work part and can provide the operator with information for determining the options. Therefore, the assistance device can prompt the operator to make a more appropriate determination and cause the work machine to make a movement more appropriately.

In the present embodiment, the assistance device may include a display unit. The display unit is, for example, the display device 50A. The proposing unit may cause the display unit to display the trajectory of the work part by the proposed movement from among the plurality of candidate movements in a superimposed manner on the image representing the situation around the work machine.

Thus, the operator can more easily understand the relationship between the trajectory of the work part and the shape of the work target around the work machine. Therefore, the assistance device can prompt the operator to make a more appropriate determination and cause the work machine to make a movement more appropriately.

In the present embodiment, the proposing unit may cause the display unit to display, in different modes, a trajectory portion that comes into contact with the work target and the other trajectory portions excluding that trajectory portion, in the trajectory by the movement proposed from among the plurality of candidate movements.

This enables the assistance device to assist the operator in grasping perspective (depth) on the image representing the situation around the work machine, and enables the operator to more appropriately identify the trajectory.

In the present embodiment, the assistance device includes the control unit. The control unit is, for example, the movement control unit 302E. Specifically, the control unit may cause the work machine to automatically execute the movement proposed by the proposing unit in response to an input of an instruction from the user.

This makes it possible to further improve the convenience of the user (operator) and to operate the work machine more appropriately.

In the present embodiment, the acquisition unit may acquire the data relating to the shape of the work target around the work machine by predicting the shape of the work target around the work machine after the execution of the movement automatically executed by the control unit.

Thus, the assistance device can propose a movement of the work machine more smoothly when assisting the work of the work machine while repeatedly proposing a movement of the work machine. Therefore, the work efficiency of the work machine can be further improved.

[Functions Relating to Generation of Trajectory of Work Part of Shovel]

Next, functions relating to generation of a trajectory (hereinafter, referred to as a “target trajectory”) of a work part of the shovel 100 will be described with reference to FIGS. 16 to 19 in addition to FIGS. 1 to 5.

Hereinafter, the same reference numerals are applied to the same or corresponding components as those of the above-described movement proposing function, and the description will be made mainly on the parts different from the above-described movement proposing function, and the description of the same or corresponding contents as those of the above-described movement proposing function may be simplified or omitted.

<Functional Configuration>

FIG. 16 is a block diagram illustrating a first example of a functional configuration relating to generation of a target trajectory of the work part of the shovel 100. FIG. 17 is a diagram illustrating an example (a screen 700) of a screen relating to generation of a target trajectory of a work part of the shovel 100, which is displayed on the display device 50A. FIG. 18 is a diagram illustrating another example (screen 800) of the screen relating to the generation of the target trajectory of the work part of the shovel 100, which is displayed on the display device 50A. FIG. 19 is a diagram illustrating still another example (a screen 900) of the screen relating to the generation of the target trajectory of the work part of the shovel 100, which is displayed on the display device 50A.

When the shovel 100 is remotely operated, the same screens as those in FIGS. 17 and 18 are displayed on the remote operation assistance device 300 (display device).

The work part of the shovel 100 is, for example, the claw tip or the back surface of the bucket 6.

The shovel 100 includes an assistance device 150. The assistance device 150 assists the work of the shovel 100.

As illustrated in FIG. 16, the assistance device 150 includes the operation device 26, the controller 30, the imaging device 40, and the output device 50 (display device 50A). In addition, in a case where the shovel 100 is remotely operated, the assistance device 150 may include the communication device 60.

The controller 30 includes a movement log providing unit 301 and a work assistance unit 302 as functional units.

In a case where the activation assistance system SYS includes a plurality of shovels 100, there may be two types of shovels 100: one shovel 100 including the controller 30 having only the movement log providing unit 301 out of the movement log providing unit 301 and the work assistance unit 302, and the other shovel 100 including the controller 30 having only the work assistance unit 302. In this case, the former shovel 100 only has a function of acquiring its own movement log and providing the movement log to the information processing device 200, and the provided movement log is used for the operator's operation assistance function (the function relating to generation of the trajectory of the work part) of the latter shovel 100.

The information processing device 200 includes, as functional units, a movement log acquisition unit 2001, a movement log storage unit 2002, a training data generation unit 2003, a machine learning unit 2004, a trained model storage unit 2005, and a distribution unit 2006.

The movement log providing unit 301 is a functional unit for acquiring a movement log of the shovel 100, which is the original data for implementing the function of generating a target trajectory of the work part of the shovel 100, and for providing the movement log to the information processing device 200. Specifically, the movement log providing unit 301 acquires a movement log obtained when an operator who has long operating experience of the shovel 100 and relatively high skill (hereinafter referred to as a "skilled operator" for convenience) operates the shovel 100, and provides the movement log to the information processing device 200.

The movement log of the shovel 100 includes data relating to the shape of the work target around the shovel 100 and data relating to the movement of the shovel 100 executed with respect to the shape of the work target. The data relating to the shape of the work target around the shovel 100 is, for example, data relating to the topography of the ground of the work site as the work target of the shovel 100. The data relating to the shape of the work target of the shovel 100 is, for example, image data of the imaging device 40 or three dimensional data of the work target obtained from the image data. The data relating to the movement of the shovel 100 is, for example, data representing the operation content of the operator. The data representing the operation content of the operator is, for example, output data of the operation pressure sensor 29 in the case of the hydraulic pilot type operation device 26 or output data (data of an operation signal) of the operation device 26 in the case of the electric operation device 26. The data relating to the movement of the shovel 100 may be data representing the movement state of the shovel 100 actually executed in response to the operation by the operator. The data representing the movement state of the shovel 100 is, for example, data outputted from the sensors S1 to S5 or data relating to the posture state of the shovel 100 acquired from the data outputted from the sensors S1 to S5.
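
For illustration only, a movement-log record of the kind described above might be held as follows; this Python sketch assumes an elevation grid for the shape data and generic containers for the operation signals and posture series, and the field names are hypothetical.

    from dataclasses import dataclass, field
    import numpy as np

    @dataclass
    class MovementLogRecord:
        timestamp: float                     # start time of the movement
        shape_before: np.ndarray             # elevation grid of the work target before the movement
        operation_signals: dict = field(default_factory=dict)  # e.g. lever outputs per actuator
        posture_series: list = field(default_factory=list)     # boom/arm/bucket/turn angles over time

    def make_record(timestamp, shape_before, operation_signals, posture_series):
        """Associate the pre-movement shape with the corresponding movement data."""
        return MovementLogRecord(timestamp, np.asarray(shape_before, dtype=float),
                                 dict(operation_signals), list(posture_series))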

The movement log providing unit 301 includes a movement log recording unit 301A, a movement log storage unit 301B, and a movement log transmission unit 301C.

The movement log recording unit 301A acquires a movement log of the shovel 100 and records the movement log in the movement log storage unit 301B. For example, every time the movement of the shovel 100 is executed, the movement log recording unit 301A records, in the movement log storage unit 301B, data relating to the shape of the work target around the shovel 100 at the start of execution of the movement or immediately before the execution of the movement and data relating to the movement of the shovel 100.

The movement log storage unit 301B stores movement logs of the shovel 100 in an accumulated manner. For example, in the movement log storage unit 301B, data relating to the shape of the work target around the shovel 100 for each movement of the shovel 100 and data relating to the corresponding movement of the shovel 100 are stored in association with each other. For example, the movement log storage unit 301B may accumulate record data representing the correspondence between data relating to the shape of the work target around the shovel 100 for each movement of the shovel 100 and data relating to the corresponding movement of the shovel 100, and a database of movement logs may be constructed.

The movement log in the movement log storage unit 301B, which has been transmitted to the information processing device 200 by the movement log transmission unit 301C described later, may be deleted after the transmission.

The movement log transmission unit 301C transmits the movement log of the shovel 100 stored in the movement log storage unit 301B to the information processing device 200 through the communication device 60. The movement log transmission unit 301C may transmit the record data representing the correspondence relationship between data relating to the shape of the work target around the shovel 100 for each movement of the shovel 100 and data relating to the corresponding movement of the shovel 100 to the information processing device 200.

For example, the movement log transmission unit 301C transmits the movement log of the shovel 100 which is stored in the movement log storage unit 301B and has not been transmitted to the information processing device 200 in response to a signal (hereinafter, referred to as a “transmission request signal”) for requesting transmission of the movement log of the shovel 100, which is received from the information processing device 200. The movement log transmission unit 301C may automatically transmit the movement log of the shovel 100 which is stored in the movement log storage unit 301B and has not been transmitted to the information processing device 200 at a predetermined timing. The predetermined timing is, for example, the time of stopping the activation (the time of turning off the key switch) or the time of starting the activation (the time of turning on the key switch) of the shovel 100.
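
A minimal sketch of such a transmission policy follows, assuming each stored record carries a simple "sent" flag and that send_to_server() stands in for transmission through the communication device 60; both names are hypothetical.

    def transmit_untransmitted(log_store, send_to_server):
        """Transmit only the movement logs that have not yet been transmitted.

        Called either when a transmission request signal is received or at a
        predetermined timing such as key-off; log_store is a list of dicts.
        """
        pending = [rec for rec in log_store if not rec.get("sent", False)]
        if not pending:
            return
        send_to_server(pending)
        for rec in pending:
            rec["sent"] = True  # transmitted records may be deleted later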

The movement log acquisition unit 2001 acquires the movement log of the shovel 100 received from the shovel 100.

The movement log acquisition unit 2001 acquires the movement log of the shovel 100 by transmitting a transmission request signal to the shovel 100 in response to an operation of the user of the information processing device 200 or automatically at a predetermined timing. The movement log acquisition unit 2001 may acquire the movement log of the shovel 100 transmitted from the shovel 100 at a predetermined timing.

The movement log storage unit 2002 stores the movement log of the shovel 100 acquired by the movement log acquisition unit 2001 in an accumulated manner. For example, in the movement log storage unit 2002, as in the case of the movement log storage unit 301B, data relating to the shape of the work target around the shovel 100 for each movement of the shovel 100 and data relating to the corresponding movement of the shovel 100 are stored in association with each other.

The training data generation unit 2003 generates training data for machine learning based on the movement logs of the shovel 100 in the movement log storage unit 2002. The training data generation unit 2003 may automatically generate the training data by batch processing or may generate the training data in response to an input from a user of the information processing device 200. Each piece of training data is a combination of input data, which is data relating to the shape of the work target around the shovel 100, and ground truth output data (hereinafter, "ground truth data"), which represents the trajectory (locus) of the work part of the shovel 100 corresponding to the input data.
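
For illustration, the following Python sketch builds such input/ground-truth pairs from accumulated movement logs; it assumes each record is a mapping with "shape_before" and "posture_series" entries and that a hypothetical toe_position() helper converts one posture sample into a work-part position by forward kinematics.

    import numpy as np

    def build_training_pair(record, toe_position):
        """Return (input_data, ground_truth) for one movement-log record."""
        input_data = np.asarray(record["shape_before"], dtype=float)        # shape of the work target
        trajectory = np.asarray([toe_position(p) for p in record["posture_series"]])
        return input_data, trajectory                                        # trajectory = ground truth data

    def build_training_set(log_store, toe_position):
        return [build_training_pair(rec, toe_position) for rec in log_store]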

The trajectory of the work part of the shovel 100 is generated based on, for example, outputs from the sensors S1 to S5 included in data relating to the movement of the shovel 100.

The machine learning unit 2004 causes the base learning model to perform machine learning based on the set of training data generated by the training data generation unit 2003, and generates a trained model LM. The trained model LM (base learning model) includes, for example, a neural network such as a deep neural network (DNN).

The trained model LM outputs data representing a target trajectory of the work part of the shovel 100 and a prediction probability, with, for example, the type of the movement of the shovel 100 and data relating to the shape of the work target around the shovel 100 as input conditions. The types of movement of the shovel 100 include, for example, an excavating movement, a sweep-out movement, a horizontal leveling movement, a rolling compaction movement, a broom-turning movement, and the like.

The sweep-out movement is, for example, a movement of operating the attachment AT to sweep earth and sand forward with the back surface of the bucket 6 and push the bucket 6 forward along the ground. In the sweep-out movement, for example, the attachment AT performs a lowering movement of the boom 4 and an opening movement of the arm 5.

The horizontal leveling movement is, for example, a movement of operating the attachment AT to cause the claw tip of the bucket 6 to move along the ground in a substantially horizontal manner and pull the claw tip back toward the front, thereby leveling the unevenness of the surface of the ground. In the horizontal leveling movement, for example, the attachment AT performs a raising movement of the boom 4 and a closing movement of the arm 5.

The rolling compaction movement is, for example, a movement of operating the attachment AT such that the ground is pressed with the back surface of the bucket 6. The rolling compaction movement may be a movement of pushing the bucket 6 forward along the ground to sweep out the earth and sand to a predetermined position in front of the bucket 6 with the back surface of the bucket 6 and then pressing the ground at the predetermined position with the back surface of the bucket 6. In the rolling compaction movement, for example, the attachment AT performs a lowering movement of the boom 4 when pressing the ground.

The broom-turning movement is, for example, a movement of operating the upper turning body 3 to turn the bucket 6 to the left and right in a state of being along the ground. The broom-turning movement may be, for example, a movement of pushing the bucket 6 forward while operating the attachment AT and the upper turning body 3 to turn the bucket 6 alternately to the left and right in a state where the bucket 6 is along the ground. In the broom-turning movement, for example, the upper turning body 3 alternately repeats a turning movement to the left and right. In the broom-turning movement, in addition to the left and right alternate turning movement of the upper turning body 3, the lowering movement of the boom 4 and the opening movement of the arm 5 may be performed as in the sweep-out movement.

The prediction probability represents the reliability of the target trajectory of the work part. This is because, as described above, the trained model LM is considered to reflect the movement logs of skilled operators operating the shovel 100, and the higher the prediction probability, the higher the reliability of the target trajectory of the work part. The prediction probability also represents the suitability of the target trajectory of the work part with respect to the shape of the work target around the shovel 100 given as the input conditions. This is because the higher the prediction probability, the higher the possibility that a skilled operator would determine that the candidate movement is suitable for the shape of the work target.
For example, the trained model LM is generated for each work such as the ground leveling work, the slope construction work, and the filling work.
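
As a non-limiting illustration, querying a trained model of this kind might look as follows in Python; the predict() interface, the per-work model dictionary, and the names are assumptions for the example rather than the actual API of the trained model LM.

    import numpy as np

    def propose_trajectory(model, movement_type, shape_grid):
        """Ask the model for a target trajectory and its prediction probability."""
        waypoints, probability = model.predict(movement_type, np.asarray(shape_grid, dtype=float))
        return np.asarray(waypoints, dtype=float), float(probability)  # probability ~ reliability/suitability

    def select_model(models_by_work, work_type):
        """Pick the per-work trained model (e.g. 'ground_leveling', 'slope', 'filling')."""
        return models_by_work[work_type]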

The trained model LM output by the machine learning unit 2004 is stored in the trained model storage unit 2005.

The distribution unit 2006 distributes the trained model LM to the shovel 100.

For example, when the trained model LM is generated by the machine learning unit 2004, the distribution unit 2006 distributes the most recently generated trained model LM to the shovel 100. The distribution unit 2006 may distribute the latest trained model LM of the trained model storage unit 2005 to the shovel 100 in response to a signal for requesting distribution of the trained model LM received from the shovel 100.

The work assistance unit 302 is a functional unit for assisting work of the shovel 100 by an operation by the operator.

The work assistance unit 302 includes a trained model storage unit 302A, a work target shape acquisition unit 302B, a movement selecting unit 302F, a condition setting unit 302G, a trajectory generation unit 302H, a display processing unit 302I, and a movement control unit 302E.

The trained model storage unit 302A stores the trained model LM distributed from the information processing device 200 and received through the communication device 60.

The work target shape acquisition unit 302B acquires data relating to a shape (topography) of a work target around the shovel 100 based on outputs of the imaging device 40 and the distance sensor.

The movement selecting unit 302F selects (the type of) the movement of the shovel 100 from among a plurality of candidate movements, in response to an input from the user (operator) received through the input device 52. In addition, when the shovel 100 is remotely operated, the movement selecting unit 302F may select the movement of the shovel 100 from among a plurality of candidate movements, in response to an input from the user (operator) who uses the remote operation assistance device 300, which is received through the communication device 60.

The condition setting unit 302G sets a precondition relating to generation of a target trajectory of the work part of the shovel 100, in response to an input from a user (operator) received through the input device 52. When the shovel 100 is remotely operated, the condition setting unit 302G may set the precondition relating to the target trajectory of the shovel 100, in response to an input from a user (operator) using the remote operation assistance device 300, which is received through the communication device 60. The condition setting unit 302G may automatically set the precondition without depending on an input from the user. For example, the condition setting unit 302G may automatically set the precondition, based on a trained model generated using, as a training data set, a history of combined data including data relating to a shape of the work target and the precondition set for the shape of the work target. In this case, the condition setting unit 302G may correct the automatically set precondition, in response to an input from the user.

The precondition includes, for example, a point serving as a target in the topography around the shovel 100 (hereinafter referred to as a "target point") during movement of the shovel 100. The target point includes, for example, a point through which the work part passes during movement of the shovel 100, a point corresponding to a place where the earth and sand in the bucket 6 is discharged during movement of the shovel 100, and the like. The precondition may include a posture state of the bucket 6 (a posture angle of the bucket 6) at the target point.

The trajectory generation unit 302H generates a target trajectory of the work part of the shovel 100, based on the data acquired by the work target shape acquisition unit 302B, a target shape of the work target, the movement selected by the movement selecting unit 302F, and the precondition set by the condition setting unit 302G. The target shape of the work target is, for example, a target construction surface, which represents a plane or a curved surface as a construction target formed by work on the work target (the ground at the work site). The target shape of the work target is set by, for example, inputting parameters representing the plane or the curved surface from the user through the input device 52 or the remote operation assistance device 300 (input device). The target shape of the work target may be distributed to the shovel 100 from an external device such as the information processing device 200, for example. The trajectory generation unit 302H applies the trained model LM to the data acquired by the work target shape acquisition unit 302B, the target shape of the work target, the movement selected by the movement selecting unit 302F, and the precondition set by the condition setting unit 302G as the input data. The trajectory generation unit 302H may use the target shape of the work target, the movement selected by the movement selecting unit 302F, and the data acquired by the work target shape acquisition unit 302B as the input data, and may output the target trajectory of the work part from the trained model LM. Then, the trajectory generation unit 302H may optimize the output target trajectory of the work part according to the precondition set by the condition setting unit 302G to generate the target trajectory of the work part.
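
For illustration only, the two-stage procedure mentioned above (obtain a trajectory from the trained model, then adjust it to the preconditions) could be sketched as follows in Python; the predict() signature and the nearest-waypoint snapping used to honor the target points are assumptions and not the actual optimization.

    import numpy as np

    def generate_target_trajectory(model, movement_type, shape_grid, target_shape, target_points):
        """Generate a target trajectory and pull it onto the set target points."""
        waypoints, _ = model.predict(movement_type, shape_grid, target_shape)
        waypoints = np.asarray(waypoints, dtype=float)            # shape (N, 3): x, y, z
        for point in np.asarray(target_points, dtype=float):
            idx = int(np.argmin(np.linalg.norm(waypoints - point, axis=1)))
            waypoints[idx] = point                                 # force passage through the target point
        return waypoints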

The display processing unit 302I causes the display device 50A to display a screen relating to generation of the target trajectory of the work part of the shovel 100 (see FIGS. 17 and 18). The screen relating to the generation of the target trajectory of the work part of the shovel 100 includes, for example, an operation screen for the user (operator) to perform an operation input relating to the movement of the shovel 100 selected by the movement selecting unit 302F and the precondition set by the condition setting unit 302G. The screen relating to the generation of the target trajectory of the work part of the shovel 100 also includes a screen for displaying the target trajectory of the work part of the shovel 100 generated by the trajectory generation unit 302H. In addition, in a case where the shovel 100 is remotely operated, the display processing unit 302I may transmit data on a screen relating to generation of the target trajectory of the work part of the shovel 100 to the remote operation assistance device 300 through the communication device 60. Thus, the display processing unit 302I can cause the remote operation assistance device 300 (display device) to display the screen relating to generation of the target trajectory of the work part of the shovel 100.

For example, as illustrated in FIGS. 17 and 18, the display processing unit 302I causes the display device 50A to display the screen 700 or a screen 800.

As illustrated in FIG. 17, the screen 700 includes images TG, CG, SB, and PB1.

The image TG is an image representing the topography around the shovel 100. The image is generated based on the data acquired by the work target shape acquisition unit 302B. In the present example, the image TG is an image representing the topography around the shovel 100 as viewed from a predetermined viewpoint outside the shovel 100. The predetermined viewpoint can be changed in accordance with, for example, an input from the user (operator) through the input device 52 or the remote operation assistance device 300 (input device).

The image CG is an image representing the shovel 100.

The positional relationship between the images TG and CG is set to be the same as the actual positional relationship between the shovel 100 and the topography around the shovel 100.

The image SB includes images representing candidate movements selectable by the movement selecting unit 302F. In the present example, the image SB includes images SB1 to SB5 representing the candidate movements of the shovel 100 that can be performed in the ground leveling work.

The image SB1 is an operation icon that enables the user to select a combination of the excavating movement and an earth discharging movement of the shovel 100.

The image SB2 is an operation icon that enables the user to select the sweep-out movement of the shovel 100.

The image SB3 is an operation icon that enables the user to select the horizontal leveling movement of the shovel 100.

The image SB4 is an operation icon that enables the user to select the broom-turning movement of the shovel 100.

The image SB5 is an operation icon that enables the user to select the rolling compaction movement of the shovel 100.

The user can designate any one of the images SB1 to SB5 through the input device 52 or the remote operation assistance device 300 (input device) and select the movement of the shovel 100 through the movement selecting unit 302F. In this example, a cursor (shaded area in the drawing) is present on the image SB1, and a state in which a combination of the excavating movement and the earth discharging movement of the shovel 100 is selected is represented.

Note that, in addition to the images SB1 to SB5, an operation icon for the user to select another movement different from the movements corresponding to the images SB1 to SB5 may be displayed on the image SB. In addition, in the image SB, instead of at least one of the images SB1 to SB5, an operation icon for the user to select another movement different from those of the images SB1 to SB5 may be displayed.

In the present example, the image TG includes image areas TG1 and TG2.

The image area TG1 represents a protruded portion on the ground around (in front of) the shovel 100.

The image area TG2 represents a recessed portion on the ground around (in front of) the shovel 100.

In this example, the screen 700 includes images P1 and P2 corresponding to target points.

The image P1 is displayed in a superimposed manner on the image area TG1.

The image P2 is displayed in a superimposed manner on the image area TG2.

For example, the user can set the target points corresponding to the images P1 and P2 through the condition setting unit 302G by designating the image areas TG1 and TG2 through the input device 52 or the remote operation assistance device 300 (input device). The user may be able to set the target points in the entire range of the image TG through the input device 52 or the remote operation assistance device 300 (input device), or may be able to set the target points only in a range that the work part of the bucket 6 can reach within the entire range of the image TG. In the former case, when a target point is set in a range that the work part of the bucket 6 cannot reach within the entire range of the image TG, a display content indicating an error (warning) may be displayed on the screen 700. In the latter case, an image representing the range that the work part of the bucket 6 can reach within the entire range of the image TG may be displayed in a superimposed manner on the image TG. The user may be able to delete the set target points through the input device 52 or the remote operation assistance device 300 (input device).
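
A minimal sketch of the reachability check suggested above follows, assuming the reach of the attachment is approximated by a radius around the turning axis; MAX_REACH_M and the helper name are hypothetical values introduced only for this example.

    import math

    MAX_REACH_M = 9.0  # hypothetical maximum horizontal reach of the work part

    def is_reachable(machine_xy, point_xy, max_reach=MAX_REACH_M):
        """True if the designated target point lies within the assumed reach."""
        dx = point_xy[0] - machine_xy[0]
        dy = point_xy[1] - machine_xy[1]
        return math.hypot(dx, dy) <= max_reach  # otherwise an error (warning) would be displayed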

In this example, the images P1 and P2 are displayed on the screen 700 accompanied by the images RC1 and RC2, respectively.

The image RC1 is an image representing a precondition of the posture angle of the bucket 6 corresponding to the target point corresponding to the image P1.

The image RC2 is an image representing a precondition of the posture angle of the bucket 6 corresponding to the target point corresponding to the image P2.

For example, the user can set the precondition of the posture angle of the bucket 6 corresponding to the images P1 and P2 through the condition setting unit 302G by designating the images RC1 and RC2 through the input device 52 or the remote operation assistance device 300 (input device).

The image PB1 is an operation icon for causing the trajectory generation unit 302H to generate a trajectory of the work part of the bucket 6 in accordance with the movement selected on the screen 700 and the precondition set on the screen 700.

For example, the user can generate the target trajectory of the bucket 6 through the trajectory generation unit 302H by operating the image PB1 through the input device 52 or the remote operation assistance device 300 (input device).

When the image PB1 is operated, the display contents of the display device 50A transition from the screen 700 to the screen 800.

The screen 800 includes images TG, CG, and SB, as in the screen 700. The screen 800 includes the images P1 and P2, as in the screen 700. The screen 800 includes images OG, CG1, and PB2.

An image OG is an image representing a target trajectory.

The image CG1 is an image representing the bucket 6, and is displayed so as to accompany the image OG corresponding to the target trajectory.

In the present example, the image OG represents a target trajectory for implementing the excavating movement to scoop the earth and sand at the target point corresponding to the image P1 and discharging the earth and sand to the target point corresponding to the image P2. The image OG may be expressed such that a trajectory portion where the work part of the bucket 6 comes into contact with the earth and sand can be distinguished from other trajectory portions. For example, in the image OG, the trajectory portion where the work part of the bucket 6 comes into contact with earth and sand and other trajectory portions are displayed in different colors.
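
For illustration, the contact and non-contact trajectory portions could be separated as follows, assuming the ground is given as an elevation function and that contact means the work part is at or below the ground surface; the tolerance value is an assumption introduced for the example.

    import numpy as np

    def split_contact_portions(waypoints, ground_height, tol=0.02):
        """Return a boolean mask: True where the work part contacts the work target."""
        waypoints = np.asarray(waypoints, dtype=float)            # columns: x, y, z
        ground_z = np.asarray([ground_height(x, y) for x, y, _ in waypoints])
        return waypoints[:, 2] <= ground_z + tol                  # contact portions drawn in a different color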

The image PB2 is an operation icon for reproducing, on the screen 800, the movement of the work part of the bucket 6 along the target trajectory corresponding to the image OG as a video (animation).

For example, the user can display a video in which the image CG1 corresponding to the bucket 6 moves along the image OG corresponding to the target trajectory on the screen 800 by operating the image PB2 through the input device 52 or the remote operation assistance device 300 (input device). Therefore, the user can determine whether the target trajectory is appropriate by checking the moving image.

In the moving image, the shape (topography) of the work target after the shovel 100 is operated to move the bucket 6 along the target trajectory may be displayed. That is, the screen 800 may display the shape (topography) of the work target around the shovel 100, which is predicted after the shovel 100 is operated to move the bucket 6 along the target trajectory corresponding to the image OG. Thus, the user can more appropriately determine whether the target trajectory is appropriate by checking the video and the predicted change in the topography.

When the image PB1 is operated, the display contents of the display device 50A transition from the screen 800 to the screen 900.

The screen 900 includes images TG, CG, and SB, as in the screen 800. The screen 900 includes the images P1 and P2, as in the screen 800. The screen 900 includes images OG and CG1. The screen 900 includes an image PB3.

The image PB3 is an operation icon for automatically moving the shovel 100 so as to move the work part of the bucket 6 along the target trajectory corresponding to the image OG.

For example, the user can automatically move the shovel 100 through the movement control unit 302E so that the bucket 6 moves along the target trajectory corresponding to the image OG by operating the image PB3 through the input device 52 or the remote operation assistance device 300 (input device).

Note that the shovel 100 may be automatically moved so that the bucket 6 moves along the target trajectory corresponding to the image OG without the user checking the above-described video. In this case, in addition to the image PB2, an operation icon corresponding to the image PB3 is displayed on the screen 800.

Returning to FIG. 16, the movement control unit 302E operates the shovel 100 so that the work part of the bucket 6 moves along the target trajectory generated by the trajectory generation unit 302H, in response to an input from the user (operator) received through the input device 52. Specifically, the movement control unit 302E can operate the shovel 100 so that the work part of the bucket 6 moves along the target trajectory by controlling the hydraulic control valve 31 while identifying the position of the work part of the bucket 6 from the outputs of the sensors S1 to S5 and the like.

For example, the movement control unit 302E operates the shovel 100 such that the work part of the bucket 6 moves along the target trajectory generated by the trajectory generation unit 302H in response to the input of the instruction to execute the operation from the user.

The movement control unit 302E may operate the shovel 100 so that the work part of the bucket 6 moves along the target trajectory generated by the trajectory generation unit 302H in a manner of assisting the operation by the operator in response to the operation of the operation device 26 or the remote operation signal.
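
As a non-limiting illustration, a trajectory-following loop of this kind might look as follows in Python; current_work_part_position() (derived from the outputs of the sensors S1 to S5) and command_valve() (driving the hydraulic control valve 31) are hypothetical helpers, and the simple proportional command is an assumption for the example.

    import numpy as np

    def follow_trajectory(waypoints, current_work_part_position, command_valve,
                          gain=0.5, tolerance=0.05, max_steps=10000):
        """Drive the work part through the waypoints of the target trajectory."""
        for target in np.asarray(waypoints, dtype=float):
            for _ in range(max_steps):
                pos = np.asarray(current_work_part_position(), dtype=float)
                error = target - pos
                if np.linalg.norm(error) < tolerance:
                    break                       # waypoint reached; advance to the next one
                command_valve(gain * error)     # proportional correction toward the waypoint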

<Processing>

Next, the processing relating to generation of a target trajectory of the work part of the shovel 100 will be described with reference to FIG. 20.

FIG. 20 is a flowchart schematically illustrating an example of the processing relating to generation of a target trajectory of the work part of the shovel 100.

The flowchart of FIG. 20 is repeatedly executed during activation of a function relating to generation of a target trajectory of the work part of the shovel 100, for example. The function relating to the generation of the target trajectory of the work part of the shovel 100 is activated (started) by an input of an instruction from the user, which is received through the input device 52 or the remote operation assistance device 300 (input device).

As illustrated in FIG. 20, in step S302 (an example of an acquiring step), the work target shape acquisition unit 302B acquires data relating to the shape of the work target around the shovel 100 from the imaging device 40.

When the processing of step S302 is completed, the controller 30 proceeds to step S304.

In step S304 (an example of a display step), the display processing unit 302I causes the display device 50A or the remote operation assistance device 300 (display device) to display a setting screen (e.g., the screen 700) including an image representing the topography, based on the data acquired in step S302.

When the processing of step S304 is completed, the controller 30 proceeds to step S306.

In step S306, the movement selecting unit 302F selects one movement from among a plurality of candidate movements of the shovel 100 in response to an input from the user.

When the processing of step S306 is completed, the controller 30 proceeds to step S308.

In step S308 (an example of a setting step), the condition setting unit 302G sets a precondition relating to generation of the target trajectory of the work part of the shovel 100 in response to an input from the user.

When the processing of step S308 is completed, the controller 30 proceeds to step S310.

The order of steps S306 and S308 may be reversed, in response to an input from the user.

In step S310 (an example of a generation step), the trajectory generation unit 302H generates a target trajectory of the work part of the shovel 100 for the movement selected in step S306, under the precondition set in step S308.

When the processing of step S310 is completed, the controller 30 proceeds to step S312.

In step S312, the display processing unit 302I causes the display device 50A or the remote operation assistance device 300 (display device) to display an image representing the target trajectory generated in step S310.

When the processing of step S312 is completed, the controller 30 proceeds to step S314.

In step S314, the controller 30 determines whether an operation input for instructing the shovel 100 to execute the movement of the work part of the bucket 6 along the target trajectory generated in step S310 has been received. The controller 30 proceeds to step S316 when the operation input for instructing the shovel 100 to execute the movement is received, and returns to step S306 when another operation, specifically, an operation for generating the target trajectory again, is received.

In step S316, the movement control unit 302E controls the hydraulic control valve 31 to automatically move the shovel 100 so that the work part of the bucket 6 moves along the target trajectory generated in the processing of the most recent step S310.

When the processing of step S316 is completed, the controller 30 ends the current cycle of the processing of the flowchart.

At the time of completion of the processing of step S316, the shovel 100 (attachment AT) may be in a state where the work part of the bucket 6 is at the end point of the target trajectory, or may be returned to the posture state before the start of the processing of step S314.
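
For illustration only, the control flow of FIG. 20 (steps S302 to S316) can be summarized by the following Python sketch; every helper passed in is a hypothetical stand-in for the corresponding functional unit, and only the branching structure reflects the flowchart.

    def trajectory_generation_cycle(acquire_shape, show_setting_screen, select_movement,
                                    set_preconditions, generate_trajectory, show_trajectory,
                                    wait_for_user_choice, execute_automatically):
        shape = acquire_shape()                                               # S302
        show_setting_screen(shape)                                            # S304
        while True:
            movement = select_movement()                                      # S306
            preconditions = set_preconditions()                               # S308
            trajectory = generate_trajectory(shape, movement, preconditions)  # S310
            show_trajectory(trajectory)                                       # S312
            if wait_for_user_choice() == "execute":                           # S314
                execute_automatically(trajectory)                             # S316
                return
            # otherwise the target trajectory is generated again (back to S306)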

In this way, in the present example, the assistance device 150 (controller 30) can generate a target trajectory suitable for the shape of the work target. Therefore, the work efficiency of the shovel 100 can be improved.

In this example, the assistance device 150 can generate a target trajectory that matches a precondition such as a target point or a posture angle of the bucket 6. Therefore, it is possible to generate a more appropriate target trajectory by reflecting the user's determination or intention based on the shape of the work target or the like.

In this example, the assistance device 150 can automatically move the shovel 100 so that the work part of the bucket 6 moves along the generated target trajectory. Therefore, even an inexperienced operator can cause the shovel 100 to perform an appropriate movement, and as a result, the work efficiency of the shovel 100 can be further improved.

Another Example of Function for Generation of Trajectory of Work Part

Next, another example of a function relating to the generation of the trajectory of the work part will be described.

The embodiments of the functions relating to the generation of the trajectory of the work part described above may be combined with each other, or may be modified or changed as appropriate.

For example, in the above-described embodiment, the trajectory generation unit 302H may generate the target trajectory without using the trained model LM. For example, a trajectory serving as a reference of the work part may be defined in advance for each of the plurality of candidate movements, and the trajectory generation unit 302H may optimize the trajectory serving as a reference of the movement selected by the movement selecting unit 302F according to the shape (topography) of the work target around the shovel 100 and the precondition to generate the target trajectory of the work part.
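
A minimal sketch of this model-free alternative follows, assuming each candidate movement has a predefined reference trajectory that is simply adapted to the current ground height; the reference waypoints and the offsetting rule are illustrative assumptions only.

    import numpy as np

    REFERENCE_TRAJECTORIES = {
        # hypothetical reference waypoints (x, y, z) in machine coordinates
        "horizontal_leveling": np.array([[6.0, 0.0, 0.0], [4.0, 0.0, 0.0], [2.0, 0.0, 0.0]]),
    }

    def adapt_reference(movement_type, ground_height, clearance=0.0):
        """Fit the reference trajectory of the selected movement to the actual topography."""
        ref = REFERENCE_TRAJECTORIES[movement_type].copy()
        for wp in ref:
            wp[2] = ground_height(wp[0], wp[1]) + clearance   # follow the current ground surface
        return ref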

In the embodiment and the modifications thereof described above, data relating to the shape of the work target around the shovel 100 may be acquired, based on data of an imaging device, a distance sensor, or the like installed outside the shovel 100. For example, the shovel 100 receives information from the imaging device or the distance sensor installed in the work site through the communication device 60, and thus the work target shape acquisition unit 302B can acquire information on the shape of the work target around the shovel 100. For example, the shovel 100 receives information from the imaging device or the distance sensor mounted on a drone flying above the work site through the communication device 60, and thus the work target shape acquisition unit 302B can acquire information relating to the shape of the work target around the shovel 100.

The assistance device 150 according to the above-described embodiment and the modifications thereof may generate a trajectory of a work part of another work machine different from the shovel 100.

In the above-described embodiment and the modifications thereof, the assistance device 150 may have the above-described movement proposing function of the work machine in addition to the function relating to the generation of the trajectory of the work part of the work machine.

In the above-described embodiment and the modifications thereof, some or all of the functions of the assistance device 150 may be transferred to the remote operation assistance device 300.

In the above-described embodiment and the modifications thereof, some or all of the functions of the assistance device 150 may be transferred to the information processing device 200.

[Effects of Assistance Device (2)]

Next, effects of the assistance device according to the present embodiment will be described.

For example, a technique is known in which teaching points are set and a trajectory of a work part of a work machine is generated based on the teaching points (e.g., see Japanese Unexamined Patent Application Publication No. 2021-50576).

However, in such a related-art technique, it is necessary to set the teaching points by actually operating the work machine, and as a result, there is a possibility that a substantial amount of time and effort is required for generating the target trajectory of the work part.

In contrast, in the present embodiment, the assistance device includes an acquisition unit, a display unit, a setting unit, and a generation unit. The assistance device is, for example, the assistance device 150. The acquisition unit is, for example, the work target shape acquisition unit 302B. The display unit is, for example, the display device 50A. The setting unit is, for example, the condition setting unit 302G. The generation unit is, for example, the trajectory generation unit 302H. Specifically, the acquisition unit acquires data relating to the topography of the construction target around the work machine. The work machine is, for example, the shovel 100. The display unit displays an image representing the topography of the construction target, based on the data acquired by the acquisition unit. The setting unit sets a point (target point) serving as a target in the topography of the construction target during movement of the work machine. The generation unit generates a trajectory of the work part of the work machine, based on the data acquired by the acquisition unit, the target shape of the construction target, and the point set by the setting unit.

This makes it possible to more easily generate the trajectory of the work part of the work machine. Further, since the target point is set, for example, it is possible to generate a more appropriate trajectory of the work part in which the determination or intention of the user who visually recognizes the shape of the work target around the work machine is reflected. Therefore, the work efficiency of the work machine can be improved.

In the present embodiment, the assistance device may include a selecting unit. The selecting unit is, for example, the movement selecting unit 302F. Specifically, the selecting unit may select one movement from among a plurality of candidate movements of the work machine, in response to an input from the user. The generation unit may generate the trajectory of the work part by the one movement of the work machine, based on the data acquired by the acquisition unit and the point set by the setting unit.

Thus, the assistance device can define a movement of the work machine and generate the trajectory of the work part. Therefore, the work efficiency of the work machine can be further improved.

In the present embodiment, the setting unit may set a point (target point) serving as a target in the topography around the work machine during movement of the work machine and the posture of the work part corresponding to the point, in response to an input from the user.

Thus, the assistance device can generate a more appropriate trajectory of the work part in which the user's determination and intention relating to the posture of the work part are reflected. The work efficiency of the work machine can be further improved.

In the present embodiment, the display unit may display an image representing the trajectory generated by the generation unit on an image representing the topography around the work machine, in a superimposed manner.

This enables the user to visually check the generated trajectory. Further, the user can more appropriately determine the validity of the generated trajectory by visually recognizing the topography around the work machine and the generated trajectory at the same time.

In the present embodiment, the display unit may display a video in which the work part moves along the trajectory generated by the generation unit on an image representing the topography around the work machine, in a superimposed manner.

Thus, the user can more appropriately determine the validity of the generated trajectory by checking the moving image.

In the present embodiment, the display unit may display an image representing a shape of the work target around the work machine, which is predicted after the work part has moved along the trajectory generated by the generation unit.

Thus, the validity of the generated trajectory can be more appropriately determined by checking the shape of the work target after the work part has moved along the generated trajectory.

In the present embodiment, the assistance device may include a control unit that automatically moves the work machine, based on the trajectory generated by the generation unit, in response to an input from the user. The control unit is, for example, the movement control unit 302E.

Thus, even a relatively inexperienced operator can move the work part along the target trajectory. Therefore, the working efficiency can be improved. In addition, the convenience of the user can be improved.

In the present embodiment, the work machine may include the above-described assistance device.

Thus, the work machine can more easily generate the target trajectory and improve the work efficiency.

Although the embodiments of the present disclosure have been described in detail, the present disclosure is not limited to such specific embodiments, and various modifications and changes can be made within the scope of the gist described in the claims.

Claims

1. An assistance device comprising:

circuitry configured to:
acquire data relating to a shape of a work target around a work machine; and
propose, to a user, a movement from among a plurality of candidate movements in a predetermined work of the work machine, based on the acquired data, the plurality of candidate movements being of different types from each other.

2. The assistance device according to claim 1,

wherein when the predetermined work is a ground leveling work at a work site, the plurality of candidate movements include at least two of a sweep-out movement, a horizontal leveling movement, a rolling compaction movement, or a broom-turning movement.

3. The assistance device according to claim 1,

wherein the circuitry is further configured to:
estimate a movement suitable for the shape of the work target around the work machine from among the plurality of candidate movements, based on the acquired data relating to the shape of the work target around the work machine, using a trained model trained by machine learning with training data relating to movements of the work machine, the movements of the work machine corresponding to the shape of the work target and being operated by a relatively highly skilled operator, and
wherein the circuitry proposes the movement from among the plurality of candidate movements, based on an estimation result.

4. The assistance device according to claim 1,

wherein the circuitry proposes, to the user, a plurality of movements from among the plurality of candidate movements, based on the acquired data.

5. The assistance device according to claim 4,

wherein the plurality of proposed movements include a movement having a relatively low suitability for the shape of the work target around the work machine, from among the plurality of candidate movements.

6. The assistance device according to claim 1,

wherein the circuitry proposes one or a plurality of movements from among the plurality of candidate movements, based on the acquired data, together with a suitability of the proposed one or the proposed plurality of movements for the shape of the work target around the work machine.

7. The assistance device according to claim 1,

wherein the circuitry proposes the movement from among the plurality of candidate movements, based on the acquired data, together with one or more trajectories of a work part of the work machine by the proposed movement.

8. The assistance device according to claim 7,

wherein the circuitry proposes the movement from among the plurality of candidate movements, based on the acquired data, together with a plurality of trajectories of the work part by the proposed movement and a suitability for the shape of the work target around the work machine for each of the plurality of trajectories.

9. The assistance device according to claim 7, further comprising:

a display device,
wherein the circuitry causes the display device to display the one or more trajectories of the work part by the proposed movement from among the plurality of candidate movements in a superimposed manner on an image representing a situation around the work machine.

10. The assistance device according to claim 1, wherein the circuitry is further configured to:

cause, when a movement of the work machine is proposed, the work machine to automatically execute the proposed movement, in response to an input of an instruction from the user.

11. The assistance device according to claim 1, further comprising:

a display unit configured to display an image representing the shape of the work target, based on the acquired data,
wherein the circuitry is further configured to:
set a point serving as a target in the shape of the work target during movement of the work machine; and
generate a trajectory of a work part of the work machine, based on the acquired data, a target shape of the work target, and the set point.

12. A work machine comprising:

circuitry configured to:
acquire data relating to a shape of a work target around a work machine; and
propose, to a user, a movement from among a plurality of candidate movements in a predetermined work of the work machine, based on the acquired data, the plurality of candidate movements being of different types from each other.

13. A non-transitory computer-readable recording medium storing a program that, when executed by a processor of an assistance device, causes the assistance device to execute a process, the process including:

acquiring data relating to a shape of a work target around a work machine; and
proposing, to a user, a movement from among a plurality of candidate movements in a predetermined work of the work machine, based on the data acquired in the acquiring, the plurality of candidate movements being of different types from each other.
Patent History
Publication number: 20250019937
Type: Application
Filed: Sep 27, 2024
Publication Date: Jan 16, 2025
Inventors: Ryuji TSUZUKI (Kanagawa), Hiroki ADACHI (Kanagawa), Kosuke HARA (Kanagawa)
Application Number: 18/899,378
Classifications
International Classification: E02F 9/26 (20060101); E02F 3/43 (20060101);