Posture Estimation Method, Posture Estimation Device, And Movable Device
A posture estimation method includes: measuring a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object and storing the measured values in a storage unit; reading the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV from the storage unit and setting the read values as initial setting values during reset; and estimating a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
The present application is based on, and claims priority from JP Application Serial Number 2021-021507, filed Feb. 15, 2021, the disclosure of which is hereby incorporated by reference herein in its entirety.
BACKGROUND

1. Technical Field

The present disclosure relates to a posture estimation method, a posture estimation device, and a movable device.
2. Related Art

A device and a system are known in which an inertial measurement unit (IMU) is attached to an object and a position and a posture of the object are calculated using an output signal of the inertial measurement unit. Since the output signal of the inertial measurement unit has a bias error and an error also occurs in posture calculation, a method is proposed in which a Kalman filter is used to correct these errors and estimate an accurate posture of the object. For example, JP-A-2020-20631 describes a posture estimation method for estimating a posture of an object by correcting predicted posture information on the object based on error information when an angular velocity sensor exceeds an effective measurement range.
However, in an inertial measurement unit equipped with an angular velocity sensor and an acceleration sensor, the X-axis, Y-axis, and Z-axis directions differ depending on the mounting position associated with an operation of an object, and bias errors in these directions occur from the initial state. As a result, the estimation accuracy of the posture of the object may decrease, and high-precision measurement cannot be performed.
SUMMARY

A posture estimation method includes: measuring a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object and storing the measured values in a storage unit; reading the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV from the storage unit and setting the read values as initial setting values during reset; measuring an angular velocity and an acceleration by the angular velocity sensor and the acceleration sensor in a stationary state; updating the bias value BW and the variance value PWW of the angular velocity sensor and the variance value PVV of the acceleration sensor from the initial setting values; and estimating a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
A posture estimation device includes a storage unit that stores a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object, and a processing unit that estimates a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
A movable device includes the posture estimation device described above and a control device configured to control the posture of the object based on posture information on the object estimated by the posture estimation device.
First, a posture estimation device 1 according to the first embodiment will be described with reference to
As shown in
In the present embodiment, as shown in
In the present embodiment, the inertial measurement unit 10 includes an angular velocity sensor 12, an acceleration sensor 14, and a signal processing unit 16. The inertial measurement unit 10 of the present embodiment may have a part of these components changed or removed or have another component added.
The angular velocity sensor 12 measures angular velocities in directions of three axes that intersect with each other and are ideally perpendicular to each other, and outputs analog signals corresponding to magnitudes and directions of the measured angular velocities on the three axes.
The acceleration sensor 14 measures accelerations in the directions of the three axes that intersect with each other and are ideally perpendicular to each other, and outputs analog signals corresponding to magnitudes and directions of the measured accelerations on the three axes.
The signal processing unit 16 performs processing of sampling the output signals of the angular velocity sensor 12 at a predetermined sampling interval Δt to convert the output signals into angular velocity data dω having a digital value. The signal processing unit 16 performs processing of sampling the output signals of the acceleration sensor 14 at the predetermined sampling interval Δt to convert the output signals into acceleration data dα having a digital value.
Ideally, the angular velocity sensor 12 and the acceleration sensor 14 are attached to the inertial measurement unit 10 such that their three axes coincide with the three axes (x-axis, y-axis, z-axis) of a sensor coordinate system that is an orthogonal coordinate system defined for the inertial measurement unit 10. However, an error occurs in the mounting angle in practice. Thus, the signal processing unit 16 performs processing of converting the angular velocity data dω and the acceleration data dα into data in the xyz coordinate system by using a correction parameter that is calculated in advance in accordance with the error in the mounting angle. The signal processing unit 16 also performs processing of correcting the angular velocity data dω and the acceleration data dα for temperature in accordance with temperature characteristics of the angular velocity sensor 12 and the acceleration sensor 14.
A function of A/D conversion or temperature correction may be built into the angular velocity sensor 12 and the acceleration sensor 14.
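For illustration only, the mounting-angle and temperature corrections described above can be sketched as follows; the correction matrix, temperature coefficients, and numerical values are hypothetical placeholders and are not taken from the present disclosure.

```python
import numpy as np

# Hypothetical correction parameters calculated in advance (placeholder values).
M_ALIGN = np.eye(3)                           # 3x3 matrix mapping the sensor axes onto the xyz coordinate system
TEMP_COEFF = np.array([0.001, 0.001, 0.001])  # assumed drift per degree C, per axis
T_REF = 25.0                                  # reference temperature in degrees C

def correct_sample(raw_xyz, temperature_c):
    """Re-align one sampled value onto the xyz coordinate system and compensate for temperature."""
    aligned = M_ALIGN @ np.asarray(raw_xyz, dtype=float)
    return aligned - TEMP_COEFF * (temperature_c - T_REF)

# Example: one angular velocity sample in deg/s at 30 degrees C.
print(correct_sample([0.10, -0.05, 0.02], temperature_c=30.0))
```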
The inertial measurement unit 10 outputs the angular velocity data dω and the acceleration data dα after the processing by the signal processing unit 16 to the processing unit 20 of the posture estimation device 1.
The ROM 30 stores programs for the processing unit 20 to perform various types of processing, and various programs and various types of data for implementing application functions. The ROM 30 stores a bias value BW and a variance value PWW of the angular velocity sensor 12 and a bias value BA and a variance value PVV of the acceleration sensor 14 in a state in which the inertial measurement unit 10 is placed at a mounting position associated with a predetermined operation of the object. The predetermined operation is the most frequently performed one of the operations of the object. The predetermined operation may include a plurality of operations. In this case, the ROM 30 stores the bias values BW, BA and the variance values PWW, PVV according to the plurality of operations, that is, according to the types of the operations.
The RAM 40 is a storage unit that is used as a work area of the processing unit 20, and temporarily stores a program or data read out from the ROM 30 or operation results obtained by the processing unit 20 performing processing in accordance with various programs.
The recording medium 50 is a non-volatile storage unit that stores data required to be preserved for a long term among data generated by processing of the processing unit 20. The recording medium 50 may store programs for the processing unit 20 to perform various types of processing, and various programs or various types of data for implementing application functions.
The processing unit 20 performs various types of processing in accordance with a program stored in the ROM 30 or the recording medium 50 or in accordance with a program received from a server via a network and then stored in the RAM 40 or the recording medium 50. In particular, in the present embodiment, the processing unit 20 executes the program to function as a bias removal unit 22, a posture change amount calculation unit 24, a velocity change amount calculation unit 26, and a posture estimation unit 28, and performs predetermined computation on the angular velocity data dω and the acceleration data dα output at the sampling interval Δt by the inertial measurement unit 10 to perform processing of estimating the posture of the object. Upon receiving a reset instruction from a user, the processing unit 20 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 from the ROM 30 and sets the read values as initial setting values. When the predetermined operation includes a plurality of operations, the processing unit 20 reads the bias values BW, BA and the variance values PWW, PVV corresponding to an operation selected by the user from the plurality of operations from the ROM 30 and sets the read values as the initial setting values.
In the present embodiment, as shown in
The bias removal unit 22 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 from the ROM 30, sets the read values as the initial setting values of a bias error, and then performs processing of calculating the angular velocities on the three axes obtained by removing the bias error from the output of the angular velocity sensor 12 and processing of calculating the accelerations on the three axes obtained by removing the bias error from the output of the acceleration sensor 14.
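A minimal sketch of this bias removal, assuming the stored bias values have already been read from the ROM 30 on reset; the variable names and numerical values are placeholders, not values from the disclosure.

```python
import numpy as np

# Hypothetical initial setting values read from the storage unit on reset.
BW = np.array([0.02, -0.01, 0.03])   # angular velocity bias (deg/s), per axis
BA = np.array([0.05, 0.00, -0.04])   # acceleration bias (m/s^2), per axis

def remove_bias(omega_raw, accel_raw):
    """Return the angular velocities and accelerations on the three axes with the bias error removed."""
    return np.asarray(omega_raw, dtype=float) - BW, np.asarray(accel_raw, dtype=float) - BA

omega, accel = remove_bias([1.00, 0.00, 0.02], [0.10, 0.05, 9.80])
print(omega, accel)
```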
The posture change amount calculation unit 24 calculates a posture change amount of the object based on the output of the angular velocity sensor 12. Specifically, the posture change amount calculation unit 24 performs processing of calculating the posture change amount of the object by approximation with a polynomial expression in which the sampling interval Δt is used as a variable by using the angular velocities on the three axes in which the bias error is removed by the bias removal unit 22.
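The posture change amount over one sampling interval can be expressed as a small rotation quaternion; the sketch below uses a generic low-order Taylor (polynomial) approximation in Δt, which is an assumption standing in for the polynomial expression actually used by the posture change amount calculation unit 24.

```python
import numpy as np

DT = 0.001  # sampling interval Δt in seconds (placeholder)

def posture_change_quaternion(omega_rad_s, dt=DT):
    """Posture change amount over dt as a quaternion [w, x, y, z] from bias-corrected angular velocity."""
    omega = np.asarray(omega_rad_s, dtype=float)
    theta = np.linalg.norm(omega) * dt          # rotation angle over the interval
    w = 1.0 - theta**2 / 8.0                    # polynomial approximation of cos(theta/2)
    s = 0.5 * dt * (1.0 - theta**2 / 24.0)      # polynomial approximation of sin(theta/2)/|omega|
    return np.concatenate(([w], s * omega))

print(posture_change_quaternion([0.1, 0.0, 0.2]))
```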
The velocity change amount calculation unit 26 calculates a velocity change amount of the object based on the output of the angular velocity sensor 12 and the output of the acceleration sensor 14. Specifically, the velocity change amount calculation unit 26 performs processing of calculating the velocity change amount of the object by using the angular velocities on the three axes and accelerations on the three axes in which the bias error is removed by the bias removal unit 22.
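Similarly, a sketch of one way to compute a velocity change amount: rotate the bias-corrected body-frame acceleration into the reference frame using the current posture, remove gravity, and integrate over Δt. The frame conventions and the gravity constant are assumptions, not values specified in the disclosure.

```python
import numpy as np

DT = 0.001                         # sampling interval Δt in seconds (placeholder)
G = np.array([0.0, 0.0, 9.80665])  # assumed gravitational acceleration vector in the reference frame

def quat_to_rot(q):
    """Rotation matrix corresponding to a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def velocity_change(accel_body, q_posture, dt=DT):
    """Velocity change amount over dt from bias-corrected acceleration and the current posture."""
    a_ref = quat_to_rot(q_posture) @ np.asarray(accel_body, dtype=float)
    return (a_ref - G) * dt

print(velocity_change([0.1, 0.0, 9.81], [1.0, 0.0, 0.0, 0.0]))
```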
The posture estimation unit 28 functions as an integration calculation unit 101, a posture information prediction unit 102, an error information update unit 103, a correction coefficient calculation unit 104, a posture information correction unit 105, a normalization unit 106, an error information correction unit 107, a rotational error component removal unit 108, a bias error limitation unit 109, and an error information adjustment unit 110. The posture estimation unit 28 performs processing of estimating the posture of the object with the posture change amount calculated by the posture change amount calculation unit 24 and the velocity change amount calculated by the velocity change amount calculation unit 26. In practice, the posture estimation unit 28 performs processing of estimating a state vector x and an error covariance matrix Σx2 thereof with an extended Kalman filter.
The integration calculation unit 101 performs integration processing of integrating the posture change amount calculated by the posture change amount calculation unit 24 with a previous estimated value of the posture that is corrected by the posture information correction unit 105 and normalized by the normalization unit 106. The integration calculation unit 101 performs integration processing of integrating the velocity change amount calculated by the velocity change amount calculation unit 26 with a previous estimated value of the velocity that is corrected by the posture information correction unit 105 and normalized by the normalization unit 106.
The posture information prediction unit 102 performs processing of predicting posture quaternion q that is posture information on the object using the posture change amount calculated by the posture change amount calculation unit 24. The posture information prediction unit 102 also performs processing of predicting a motion velocity vector v that is velocity information on the object based on the velocity change amount calculated by the velocity change amount calculation unit 26. In practice, the posture information prediction unit 102 performs processing of predicting the state vector x including the posture quaternion q and the motion velocity vector v as components.
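A sketch of this prediction step, assuming the state vector x holds the posture quaternion q and the motion velocity vector v as components; the composition rule below is standard quaternion algebra, not text from the disclosure.

```python
import numpy as np

def quat_mul(q1, q2):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def predict_state(q, v, dq, dv):
    """Predict the posture quaternion and motion velocity vector from the change amounts."""
    return quat_mul(q, dq), np.asarray(v, dtype=float) + np.asarray(dv, dtype=float)

q, v = predict_state([1.0, 0.0, 0.0, 0.0], [0.0, 0.0, 0.0],
                     dq=[0.999999, 0.00005, 0.0, 0.0001], dv=[0.001, 0.0, 0.0])
print(q, v)
```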
The error information update unit 103 performs processing of updating the error covariance matrix Σx2 that is error information based on the output of the angular velocity sensor 12. Specifically, the error information update unit 103 performs processing of updating a posture error of the object with the angular velocities on the three axes in which the bias error is removed by the bias removal unit 22. In practice, the error information update unit 103 performs processing of updating the error covariance matrix Σx2 with the extended Kalman filter.
The rotational error component removal unit 108 performs processing of removing a rotational error component around a reference vector in the error covariance matrix Σx2 that is the error information. Specifically, the rotational error component removal unit 108 performs processing of removing an azimuth error component included in the posture error in the error covariance matrix Σx2 updated by the error information update unit 103. In practice, the rotational error component removal unit 108 performs processing of generating the error covariance matrix Σx2 in which rank limitation of an error covariance matrix Σq2 of the posture and removal of the azimuth error component are performed on the error covariance matrix Σx2.
The error information adjustment unit 110 determines whether the output of the angular velocity sensor 12 is within an effective range. When it determines that the output of the angular velocity sensor 12 is not within the effective range, the error information adjustment unit 110 increases a posture error component in the error covariance matrix Σx2 that is the error information and reduces a correlation component between the posture error component and the other error components in the error covariance matrix Σx2, for example, sets the correlation component to zero. The error information adjustment unit 110 also determines whether the output of the acceleration sensor 14 is within an effective range. When it determines that the output of the angular velocity sensor 12 or the output of the acceleration sensor 14 is not within the corresponding effective range, the error information adjustment unit 110 increases a motion velocity error component in the error covariance matrix Σx2 and reduces a correlation component between the motion velocity error component and the other error components, for example, sets the correlation component to zero. Specifically, in an angular velocity off-scale recovery period after the output of the angular velocity sensor 12 is determined not to be within the effective range, the error information adjustment unit 110 increases the posture error component and the motion velocity error component in the error covariance matrix Σx2 generated by the rotational error component removal unit 108 and reduces the correlation components between these components and the other error components, for example, sets them to zero. In an acceleration off-scale recovery period after the output of the angular velocity sensor 12 is determined to be within the effective range and the output of the acceleration sensor 14 is determined not to be within the effective range, the error information adjustment unit 110 increases the motion velocity error component in the error covariance matrix Σx2 generated by the rotational error component removal unit 108 and reduces the correlation component between the motion velocity error component and the other error components, for example, sets it to zero.
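A sketch of this error-information adjustment for off-scale sensor outputs. The 9-dimensional error-state ordering (posture, motion velocity, gyro bias), the inflation factor, and the zeroing of correlations are assumptions used only to illustrate the idea of increasing the affected error components and reducing their correlations.

```python
import numpy as np

# Assumed error-state ordering (not taken from the disclosure):
# indices 0-2: posture error, 3-5: motion velocity error, 6-8: gyro bias error.
POSTURE = slice(0, 3)
VELOCITY = slice(3, 6)

def adjust_error_covariance(P, gyro_off_scale, accel_off_scale, inflation=100.0):
    """Inflate the affected error components and zero their correlations with the
    remaining state when a sensor output is outside its effective range."""
    P = P.copy()
    blocks = []
    if gyro_off_scale:
        blocks += [POSTURE, VELOCITY]   # angular velocity off-scale affects posture and velocity
    elif accel_off_scale:
        blocks += [VELOCITY]            # acceleration off-scale affects velocity only
    for blk in blocks:
        diag = np.diag(P).copy()
        P[blk, :] = 0.0                 # remove correlations with the other error components
        P[:, blk] = 0.0
        for i in range(blk.start, blk.stop):
            P[i, i] = diag[i] * inflation   # enlarge the corresponding error component
    return P

P = np.eye(9) * 1e-4
print(np.diag(adjust_error_covariance(P, gyro_off_scale=True, accel_off_scale=False)))
```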
The bias error limitation unit 109 performs processing of limiting a bias error component of an angular velocity around the reference vector in the error covariance matrix Σx2 that is the error information. Specifically, the bias error limitation unit 109 performs processing of limiting a vertical component of the bias error of the angular velocity in the error covariance matrix Σx2 generated by the error information adjustment unit 110. In practice, the bias error limitation unit 109 performs processing as follows. The bias error limitation unit 109 determines whether the vertical component of the bias error of the angular velocity exceeds an upper limit value. When the vertical component exceeds the upper limit value, the bias error limitation unit 109 generates the error covariance matrix Σx2 in which limitation is applied such that the vertical component has the upper limit value.
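A sketch of this bias-error limitation, assuming the vertical (around-gravity) gyro bias error sits at a fixed index of the error covariance matrix; the index and the upper limit value are hypothetical.

```python
import numpy as np

GYRO_BIAS_Z = 8               # assumed index of the vertical gyro bias error in the error state
BIAS_VAR_UPPER_LIMIT = 1e-6   # hypothetical upper limit for that variance

def limit_bias_error(P):
    """Clamp the vertical component of the angular velocity bias error to its upper limit."""
    P = P.copy()
    if P[GYRO_BIAS_Z, GYRO_BIAS_Z] > BIAS_VAR_UPPER_LIMIT:
        P[GYRO_BIAS_Z, GYRO_BIAS_Z] = BIAS_VAR_UPPER_LIMIT
    return P

P = np.eye(9) * 1e-5
print(limit_bias_error(P)[GYRO_BIAS_Z, GYRO_BIAS_Z])
```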
The correction coefficient calculation unit 104 performs processing of calculating a correction coefficient based on the error covariance matrix Σx2 that is the error information generated by the bias error limitation unit 109. The correction coefficient determines a correction amount, by the posture information correction unit 105, of the posture quaternion q that is the posture information on the object or the motion velocity vector v that is the velocity information, and a correction amount, by the error information correction unit 107, of the error covariance matrix Σx2 that is the error information. In practice, the correction coefficient calculation unit 104 performs processing of calculating an observation residual Δz, a Kalman coefficient K, and a transformation matrix H.
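The observation residual Δz and the Kalman coefficient K follow standard extended-Kalman-filter algebra, sketched below; the transformation matrix H, observation noise R, and dimensions are placeholders, not the ones defined in the disclosure.

```python
import numpy as np

def correction_coefficients(P, H, R, z_meas, z_pred):
    """Observation residual Δz and Kalman coefficient K for one measurement update."""
    dz = np.asarray(z_meas, dtype=float) - np.asarray(z_pred, dtype=float)  # observation residual
    S = H @ P @ H.T + R                                                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                                          # Kalman coefficient
    return K, dz

# Minimal usage with placeholder matrices (gravity-direction observation assumed sensitive to posture error only).
P = np.eye(9) * 1e-4
H = np.zeros((3, 9)); H[:, 0:3] = np.eye(3)
R = np.eye(3) * 1e-3
K, dz = correction_coefficients(P, H, R, z_meas=[0.0, 0.1, 9.8], z_pred=[0.0, 0.0, 9.80665])
print(K.shape, dz)
```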
The posture information correction unit 105 performs processing of correcting the posture quaternion q that is the posture information on the object predicted by the posture information prediction unit 102 based on the error covariance matrix Σx2 that is the error information. Specifically, the posture information correction unit 105 performs processing of correcting the posture quaternion q by using the error covariance matrix Σx2 generated by the bias error limitation unit 109 and the Kalman coefficient K and an observation residual Δza of the gravitational acceleration calculated by the correction coefficient calculation unit 104 based on a gravitational acceleration vector g that is the reference vector and an acceleration vector a obtained from the output of the acceleration sensor 14. In practice, the posture information correction unit 105 performs processing of correcting the state vector x predicted by the posture information prediction unit 102 with the extended Kalman filter.
The normalization unit 106 performs processing of normalizing the posture quaternion q that is the posture information on the object corrected by the posture information correction unit 105 so that the magnitude thereof remains 1. In practice, the normalization unit 106 performs processing of normalizing the state vector x corrected by the posture information correction unit 105.
The error information correction unit 107 performs processing of correcting the error covariance matrix Σx2 that is the error information. Specifically, the error information correction unit 107 performs processing of correcting the error covariance matrix Σx2 generated by the bias error limitation unit 109 with the extended Kalman filter and the transformation matrix H and the Kalman coefficient K calculated by the correction coefficient calculation unit 104.
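A combined sketch of the three units described above (posture information correction, normalization, and error information correction), assuming a simplified 7-element state [qw, qx, qy, qz, vx, vy, vz] and an additive correction followed by re-normalization; these layout choices are assumptions, not the disclosed formulation.

```python
import numpy as np

def measurement_update(x, P, K, H, dz):
    """Correct the predicted state with K and Δz, re-normalize the posture quaternion so its
    magnitude stays 1, and correct the error covariance with (I - K H) P."""
    x = x + K @ dz
    x[0:4] /= np.linalg.norm(x[0:4])          # normalization of the posture quaternion q
    P = (np.eye(P.shape[0]) - K @ H) @ P      # error information correction
    return x, P

# Minimal usage with placeholder matrices.
x = np.array([1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
P = np.eye(7) * 1e-4
H = np.zeros((3, 7)); H[:, 1:4] = np.eye(3)   # hypothetical observation model
R = np.eye(3) * 1e-3
dz = np.array([0.0, 0.05, -0.02])
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
x, P = measurement_update(x, P, K, H, dz)
print(x[0:4], np.linalg.norm(x[0:4]))
```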
The posture quaternion q that is the posture information on the object estimated by the processing unit 20 may be transmitted to another device via the communication unit 60.
As described above, the posture estimation device 1 of the present embodiment can estimate the posture of the object in a stored predetermined operation of the object from the output of the angular velocity sensor 12 and the output of the acceleration sensor 14 using the Kalman filter based on the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14. Therefore, the bias error in the initial state caused by the difference in the mounting position associated with the predetermined operation of the object can be corrected, and thus the posture of the object can be measured with high accuracy.
1.2 Posture Estimation Method

Next, a posture estimation method for the posture estimation device 1 according to the first embodiment will be described with reference to
As shown in
1.2.1 Measurement Step

First, in step S101, the inertial measurement unit 10 is installed at the mounting position associated with the most frequently performed one of the operations of the object, which is the predetermined operation of the object, to measure the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14.
1.2.2 Storage Step

In step S102, the processing unit 20 stores the measured bias value BW and variance value PWW of the angular velocity sensor 12 and the measured bias value BA and variance value PVV of the acceleration sensor 14 in the ROM 30 as the storage unit.
1.2.3 Initial Value Setting Step

In step S103, upon receiving the reset instruction from the user, the processing unit 20 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 from the ROM 30 and sets the read values as the initial setting values of the bias error.
1.2.4 Angular Velocity and Acceleration Measurement Step

In step S104, the angular velocity and the acceleration are measured by the angular velocity sensor 12 and the acceleration sensor 14 in a stationary state. The measurement time is 200 msec.
1.2.5 Initial Setting Value Updating Step

In step S105, the processing unit 20 adds the bias value BW and the variance value PWW of the angular velocity sensor 12 to the measured angular velocity data dω, and updates the bias error from the initial setting values. Further, the variance value PVV of the acceleration sensor 14 is added to the acceleration data dα, and the bias error is updated from the initial setting values. The bias value BA of the acceleration sensor 14 is not used to update the initial setting values.
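An end-to-end sketch of steps S103 through S105: read the stored values on reset, collect a short stationary measurement, and refine the gyro bias/variance and the acceleration variance while leaving BA at its stored value. The sample rate, storage layout, and the combination rule used below are assumptions, since the exact update expressions are not spelled out here.

```python
import numpy as np

FS = 1000               # assumed sample rate in Hz
STATIONARY_MS = 200     # measurement time in the stationary state (step S104)

# Hypothetical initial setting values read from the ROM on reset (step S103).
stored = {
    "BW": np.array([0.02, -0.01, 0.03]),   # gyro bias
    "PWW": np.array([1e-5, 1e-5, 1e-5]),   # gyro variance
    "BA": np.array([0.05, 0.00, -0.04]),   # accel bias (not updated in step S105)
    "PVV": np.array([2e-4, 2e-4, 2e-4]),   # accel variance
}

def update_from_stationary(stored, gyro_samples, accel_samples):
    """Step S105 sketch: refine BW, PWW, and PVV from the stationary data; BA stays as stored."""
    bw = gyro_samples.mean(axis=0)                     # assumed update rule: stationary mean as new bias
    pww = stored["PWW"] + gyro_samples.var(axis=0)     # assumed combination of stored and measured variance
    pvv = stored["PVV"] + accel_samples.var(axis=0)
    return {"BW": bw, "PWW": pww, "BA": stored["BA"], "PVV": pvv}

# Simulated 200 msec of stationary data for illustration.
n = FS * STATIONARY_MS // 1000
rng = np.random.default_rng(0)
gyro = stored["BW"] + rng.normal(0.0, 3e-3, size=(n, 3))
accel = np.array([0.0, 0.0, 9.80665]) + rng.normal(0.0, 1e-2, size=(n, 3))
print(update_from_stationary(stored, gyro, accel))
```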
1.2.6 Posture Estimation Step

In step S106, the processing unit 20 estimates the posture of the object from the output of the angular velocity sensor 12 and the output of the acceleration sensor 14 using the Kalman filter based on the updated bias value BW, variance value PWW, and variance value PVV and the un-updated bias value BA. The posture quaternion q that is the estimated posture information on the object is transmitted to another device via the communication unit 60.
As described above, the posture estimation method of the present embodiment can estimate the posture of the object from the output of the angular velocity sensor 12 and the output of the acceleration sensor 14 using the Kalman filter based on the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 in the stored predetermined operation of the object. Therefore, the bias error in the initial state caused by the difference in the mounting position associated with the predetermined operation of the object can be corrected, and thus the posture of the object can be measured with high accuracy.
2. Second Embodiment

2.1 Posture Estimation Method

Next, a posture estimation method according to the second embodiment will be described with reference to
The posture estimation method of the present embodiment is the same as that of the first embodiment except that the predetermined operation of the object includes a plurality of operations and the corresponding bias values BW, BA and variance values PWW, PVV are stored in the ROM 30 according to the types of the operations. Differences from the first embodiment described above will be mainly described, and the description of similar matters will be omitted.
The posture estimation method according to the second embodiment will be described with reference to
As shown in
2.1.1 Measurement Step

First, in step S201, the inertial measurement unit 10 is installed at a mounting position associated with a first operation of the object, to measure the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14.
2.1.2 Storage Step

In step S202, the measured bias value BW and variance value PWW of the angular velocity sensor 12 and the measured bias value BA and variance value PVV of the acceleration sensor 14 are stored in the ROM 30 that is the storage unit as the bias values BW, BA and the variance values PWW, PVV of the first operation.
Next, the inertial measurement unit 10 is installed at a mounting position associated with a second operation of the object, and steps S201 and S202 are repeated. Steps S201 and S202 are repeated according to the plurality of operations of the object, that is, according to the types of the operations.
2.1.3 Initial Value Selection Step

After acquiring the bias values BW, BA and the variance values PWW, PVV of the sensors 12 and 14 corresponding to the plurality of operations of the object, in step S203, the processing unit 20 receives an operation selection instruction from a user.
2.1.4 Initial Value Setting Step

In step S204, upon receiving a reset instruction from the user, the processing unit 20 reads the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 associated with the selected operation from the ROM 30 and sets the read values as the initial setting values of the bias error.
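A sketch of how the per-operation values stored in this embodiment might be selected on reset; the operation names, table layout, and numerical values are hypothetical.

```python
import numpy as np

# Hypothetical calibration table with one entry per operation type (built up by repeating steps S201 and S202).
calibration_table = {
    "raise_bucket": {"BW": np.array([0.02, -0.01, 0.03]), "PWW": np.array([1e-5, 1e-5, 1e-5]),
                     "BA": np.array([0.05, 0.00, -0.04]), "PVV": np.array([2e-4, 2e-4, 2e-4])},
    "lower_bucket": {"BW": np.array([0.01, 0.00, 0.02]),  "PWW": np.array([1e-5, 1e-5, 1e-5]),
                     "BA": np.array([0.03, 0.01, -0.02]), "PVV": np.array([2e-4, 2e-4, 2e-4])},
}

def initial_values_for(selected_operation):
    """Steps S203-S204: return the stored values for the operation selected by the user."""
    return calibration_table[selected_operation]

print(initial_values_for("raise_bucket")["BW"])
```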
2.1.5 Angular Velocity and Acceleration Measurement Step

Step S205 is similar to step S104 of the first embodiment, and thus the description thereof is omitted.
2.1.6 Initial Setting Value Updating Step

Step S206 is similar to step S105 of the first embodiment, and thus the description thereof is omitted.
2.1.7 Posture Estimation Step

Step S207 is similar to step S106 of the first embodiment, and thus the description thereof is omitted.
With such a configuration, even though the inertial measurement unit 10 is installed at mounting positions corresponding to a plurality of operations of the object, the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 can be selected according to the types of the operations, and thus the bias error in the initial state can be corrected and the posture of the object can be measured with high accuracy.
3. Third Embodiment

3.1 Movable Device

Next, a movable device 600 including the posture estimation device 1 according to the third embodiment will be described with reference to
The posture estimation device 1 of the above embodiment can be effectively used in posture control of a construction machine.
As shown in
The work mechanism 620 includes a boom 613, an arm 614, a bucket link 616, a bucket 615, a boom cylinder 617, an arm cylinder 618, and a bucket cylinder 619, as the plurality of members. The boom 613 is attached to the front portion of the upper revolving body 611 to move up and down. The arm 614 is attached to a top side of the boom 613 to move up and down. The bucket link 616 is attached to a top side of the arm 614 to be pivotable. The bucket 615 is attached to top sides of the arm 614 and the bucket link 616 to be pivotable. The boom cylinder 617 drives the boom 613. The arm cylinder 618 drives the arm 614. The bucket cylinder 619 drives the bucket 615 through the bucket link 616.
A base end of the boom 613 is supported by the upper revolving body 611 to be pivotable in the up-and-down direction, and is rotationally driven relative to the upper revolving body 611 by expansion and contraction of the boom cylinder 617. An inertial measurement unit 10c functioning as an inertial sensor that detects a motion state of the boom 613 is disposed in the boom 613.
One end of the arm 614 is supported by the top side of the boom 613 to be rotatable. The arm 614 is rotationally driven relative to the boom 613 by expansion and contraction of the arm cylinder 618. An inertial measurement unit 10b functioning as an inertial sensor that detects a motion state of the arm 614 is disposed in the arm 614.
The bucket link 616 and the bucket 615 are supported by the top side of the arm 614 to be pivotable. The bucket link 616 is rotationally driven relative to the arm 614 by expansion and contraction of the bucket cylinder 619, and the bucket 615 is rotationally driven relative to the arm 614 together with the bucket link 616. An inertial measurement unit 10a functioning as an inertial sensor that detects a motion state of the bucket link 616 is disposed in the bucket link 616.
The inertial measurement units 10a, 10b, 10c, and 10d can detect at least one of an angular velocity and an acceleration acting on the members of the work mechanism 620 or the upper revolving body 611. According to mounting positions of the upper revolving body 611, the boom 613, the arm 614, and the bucket 615 that operate differently, the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 are actually measured before mounting. Therefore, the inertial measurement units 10a, 10b, 10c, and 10d can correct bias errors in an initial state. Further, as shown in
Further, as shown in
As a construction machine in which the posture estimation device 1 of the above embodiment is used, for example, a rough terrain crane (crane car), a bulldozer, an excavator loader, a wheel loader, and an aerial work vehicle (lift car) are provided in addition to the hydraulic shovel (jumbo, backhoe, power shovel) exemplified above.
According to the present embodiment, with the posture estimation device 1, it is possible to obtain information on a posture with high accuracy, and thus it is possible to implement appropriate posture control of the movable device 600. According to the movable device 600, the inertial measurement units 10a, 10b, 10c, and 10d having the bias value BW and the variance value PWW of the angular velocity sensor 12 and the bias value BA and the variance value PVV of the acceleration sensor 14 according to the mounting position are mounted on the upper revolving body 611, the boom 613, the arm 614, and the bucket 615 that operate differently, so that the bias error in the initial state can be corrected, and thus the posture of the object can be measured with high accuracy.
In the present embodiment, a four-wheel vehicle such as an agricultural machine and the construction machine described above is used as an example of the movable device in which the posture estimation device 1 is used. However, the movable device may also be a motorcycle, a bicycle, a train, an airplane, a biped robot, a remote-controlled or autonomous aircraft (such as a radio-controlled aircraft, a radio-controlled helicopter, or a drone), a rocket, a satellite, a ship, or an automated guided vehicle (AGV). The predetermined operation described above may be an operation of raising the bucket 615, an operation of lowering the bucket 615, revolving, or movement. The plurality of operations described above may be any two of the operation of raising the bucket 615, the operation of lowering the bucket 615, revolving, movement, and the like. The predetermined operation may be a predetermined operation of a robot arm or a drone. Further, the predetermined operation may be a walking operation of a biped robot. The bias value BW and the variance value PWW of the angular velocity sensor 12 may be different values or the same values among the inertial measurement units 10a, 10b, 10c, and 10d. The movable device 600 has the four inertial measurement units 10a, 10b, 10c, and 10d, but may have two, three, or five or more.
The present disclosure is not limited to the present embodiments, and various modifications can be made within the scope of the gist of the present disclosure.
The above-described embodiments and modifications are merely examples, and the present disclosure is not limited thereto. For example, it is also possible to appropriately combine embodiments and modifications.
The present disclosure includes a configuration substantially the same as the configuration described in the embodiments, for example, a configuration having the same function, method, and result, or a configuration having the same purpose and effect. The present disclosure includes a configuration in which a non-essential portion of the configuration described in the embodiments is replaced. The present disclosure includes a configuration having the same function and effect as the configuration described in the embodiments, or a configuration capable of achieving the same purpose. The present disclosure includes a configuration in which a known technique is added to the configuration described in the embodiments.
Claims
1. A posture estimation method comprising:
- measuring a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object and storing the measured values in a storage unit;
- reading the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV from the storage unit and setting the read values as initial setting values during reset;
- measuring an angular velocity and an acceleration by the angular velocity sensor and the acceleration sensor in a stationary state of the object;
- updating the bias value BW and the variance value PWW of the angular velocity sensor and the variance value PVV of the acceleration sensor from the initial setting values; and
- estimating a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
2. The posture estimation method according to claim 1, wherein
- the predetermined operation is the most frequently performed one of operations of the object.
3. The posture estimation method according to claim 1, wherein
- the predetermined operation includes a plurality of operations, and the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV corresponding to one of the plurality of operations according to a type of the plurality of operations are read from the storage unit.
4. The posture estimation method according to claim 1, wherein
- measurement time in the stationary state is 200 msec.
5. A posture estimation device comprising:
- a storage unit configured to store a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor in a predetermined operation of an object; and
- a processing unit configured to estimate a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
6. The posture estimation device according to claim 5, wherein
- the predetermined operation is the most frequently performed one among operations of the object.
7. The posture estimation device according to claim 5, wherein
- the predetermined operation includes a plurality of operations, and
- the storage unit stores the bias value BW, the variance value PWW, the bias value BA, and the variance value PVV corresponding to the plurality of operations.
8. A posture estimation device comprising:
- a storage unit configured to store, in association with each of a plurality of movements of an object, a bias value BW and a variance value PWW of an angular velocity sensor and a bias value BA and a variance value PVV of an acceleration sensor; and
- a processing unit configured to estimate a posture of the object using a Kalman filter from an output of the angular velocity sensor and an output of the acceleration sensor.
9. A movable device comprising:
- the posture estimation device according to claim 5; and
- a control device configured to control the posture of the object based on posture information on the object estimated by the posture estimation device.
10. A movable device comprising:
- the posture estimation device according to claim 8; and
- a control device configured to control the posture of the object based on posture information on the object estimated by the posture estimation device.
Type: Application
Filed: Feb 14, 2022
Publication Date: Aug 18, 2022
Inventor: Kentaro YODA (Chino)
Application Number: 17/670,577