MOTION ANALYSIS METHOD AND MOTION ANALYSIS DEVICE

In a motion analysis method, an impact point analysis section identifies a posture of a hitting area of sporting equipment at impact using an output of an inertial sensor. A determination section determines a type of a course of a hit ball using information of the posture of the hitting area thus identified. The course of the hit ball is estimated with the posture of the hitting area at impact. The type of the course of the hit ball is determined in accordance with the posture of the hitting area.

Description
BACKGROUND

1. Technical Field

The present invention relates to a motion analysis method and a motion analysis device for a golf swing and so on.

2. Related Art

As described in JP-A-2011-110164 (Document 1), a display method of a swing is commonly known. In this display method, an approach angle θ of the club head is assigned to the x axis, and the face angle φ is assigned to the y axis. In accordance with such a two-dimensional coordinate system, a slice area, a fade area, a straight area, a draw area, and a hook area are displayed on a screen. The face angle φ and the approach angle θ of the club head are measured during the swing action, and a point is plotted in the coordinate system from the face angle φ and the approach angle θ at the impact. Further, JP-A-2008-73210 is an example of a related art document.

In the display method described in Document 1, the swing action is shot by a camera when measuring the face angle φ and the approach angle θ. On this occasion, in the three-dimensional space thus shot, a target line connecting the center of a golf ball and the target to each other is identified. The face angle φ and the approach angle θ are measured based on the target line. When the hit ball deviates from the direction expected by the golfer, this method cannot distinguish whether the deviation was caused by the orientation of the body at address or by the swing action itself. It is conceivable that if each of these influential factors is analyzed separately, the golfer can more efficiently improve his or her own form of the swing.

SUMMARY

An advantage of at least one aspect of the invention is to provide a motion analysis method to identify the hit ball direction from the swing action without being affected by the orientation of the body.

(1) An aspect of the invention relates to a motion analysis method including identifying a posture of a hitting area of sporting equipment at an impact using an output of an inertial sensor, and determining a type of a course of a hit ball using information of the posture of the hitting area identified.

The course of the hit ball is estimated with the posture of the hitting area at impact. The type of the course of the hit ball is determined in accordance with the posture of the hitting area. When determining the type of the course of the ball, the output of the inertial sensor is used. Unlike a target line identified in a three-dimensional space captured by a camera, the output of the inertial sensor can reflect the orientation of the body of the user. Therefore, when the hit ball deviates from the expected hit ball direction, the influence of the orientation of the body can be estimated. The type of the course of the hit ball can be identified from the swing action without being affected by the orientation of the body. Since the analysis of the swing action is performed in such a manner without being affected by the orientation of the body, the user can more efficiently improve the form of the swing.

(2) Another aspect of the invention relates to a motion analysis method including identifying a trajectory of sporting equipment at an impact prior to the impact using an output of an inertial sensor, and determining a type of a course of a hit ball using information of the trajectory identified.

The course of the hit ball is estimated with the trajectory of the sporting equipment at impact. The type of the course of the hit ball is determined in accordance with the trajectory. When determining the type of the course of the ball, the output of the inertial sensor is used. Unlike a target line identified in a three-dimensional space captured by a camera, the output of the inertial sensor can reflect the orientation of the body of the user. Therefore, when the hit ball deviates from the expected hit ball direction, the influence of the orientation of the body can be estimated. The type of the course of the hit ball can be identified from the swing action without being affected by the orientation of the body. Since the analysis of the swing action is performed in such a manner without being affected by the orientation of the body, the user can more efficiently improve the form of the swing.

(3) Still another aspect of the invention relates to a motion analysis method including identifying a posture of a hitting area of sporting equipment at an impact using an output of an inertial sensor, and identifying a trajectory of the sporting equipment at the impact prior to the impact using an output of the inertial sensor, and determining a type of a course of a hit ball using information of the posture of the hitting area identified and the trajectory of the sporting equipment identified.

The course of the hit ball is estimated with the posture of the hitting area and the trajectory at impact. The type of the course of a ball is determined in accordance with the combination of the posture of the hitting area and the trajectory. When determining the type of the course of the ball, the output of the inertial sensor is used. Unlike a target line identified in a three-dimensional space captured by a camera, the output of the inertial sensor can reflect the orientation of the body of the user. Therefore, when the hit ball deviates from the expected hit ball direction, the influence of the orientation of the body can be estimated.

The type of the course of the hit ball can be identified from the swing action without being affected by the orientation of the body. Since the analysis of the swing action is performed in such a manner as described above without being affected by the orientation of the body, the user can more efficiently improve the form of the swing.

(4) The motion analysis method may include identifying an initial posture of the hitting area of the sporting equipment before beginning of a motion using the output of the inertial sensor. On this occasion, according to the motion analysis method, the posture of the hitting area at the impact can be identified relative to the initial posture of the hitting area. In general, before the swing action, the user first takes a posture to check the hitting area of the sporting equipment at the impact position. On that occasion, the initial posture of the hitting area is established. The target place of the hit ball is set in accordance with the initial posture of the hitting area. Since the posture of the hitting area at impact is identified based on the hitting area in such an initial posture, the type of the course of a ball is identified from the swing action without being affected by the orientation of the body of the user. Since the analysis of the swing action is performed in such a manner without being affected by the orientation of the body, the user can more efficiently improve the form of the swing.

(5) The motion analysis method may include identifying an initial position of the hitting area of the sporting equipment before beginning of a motion using the output of the inertial sensor. In general, before the swing action, the user first takes a posture to check the hitting area of the sporting equipment at the impact position. On that occasion, the initial position of the hitting area is established. The target place of the hit ball is set in accordance with the initial position of the hitting area. Since the trajectory at impact is identified based on such an initial position, the type of the course of a ball is identified from the swing action without being affected by the orientation of the body of the user. Since the analysis of the swing action is performed in such a manner without being affected by the orientation of the body, the user can more efficiently improve the form of the swing.

(6) The motion analysis method may include calculating a variation in an angle of the hitting area at the impact relative to the hitting area at the beginning of the motion. According to such calculation of the angle, the courses of the hit ball are finely classified in accordance with the sign and the magnitude of the angle. As a result, the user can more effectively improve the form of the swing.

(7) The motion analysis method may include identifying, when identifying the trajectory, a first coordinate point representing a position of the hitting area at the impact and a second coordinate point representing a position of the hitting area at a sampling point prior to the impact. When calculating the angle of the trajectory, the first coordinate point and the second coordinate point are identified. A vector in the moving direction is identified by a plane (or a line segment) including the first coordinate point and the second coordinate point. In such a manner as described above, the angle of the trajectory can reliably be calculated.

(8) The motion analysis method may include calculating an incident angle of the trajectory at the impact with respect to a line segment orthogonally intersecting with the hitting area at the position of the hitting area at a resting state. According to such calculation of the incident angle, the courses of the hit ball are finely classified in accordance with the sign and the magnitude of the angle. As a result, the user can more effectively improve the form of the swing.
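Paragraphs (6) and (8) describe classifying the course of a ball by the sign and magnitude of the face angle and of the incident (club-path) angle. The following Python sketch illustrates one such classification; the tolerance, the sign conventions, and the course labels are illustrative assumptions, not values taken from this description:

```python
def classify_course(face_angle_deg, path_angle_deg, tol=2.0):
    """Classify a hit-ball course from the face angle and the club-path
    (incident) angle, both measured in the horizontal plane relative to
    the target line. Angles within +/- tol degrees count as square.
    Thresholds and labels are illustrative assumptions."""
    def bucket(a):
        if a > tol:
            return 1       # open face / in-to-out path (assumed sign convention)
        if a < -tol:
            return -1      # closed face / out-to-in path
        return 0           # square face / straight path

    table = {
        (0, 0): "straight",
        (1, 0): "push/fade",
        (-1, 0): "pull/draw",
        (0, 1): "draw tendency",
        (0, -1): "fade tendency",
        (1, 1): "push",
        (-1, -1): "pull",
        (1, -1): "slice",
        (-1, 1): "hook",
    }
    return table[(bucket(face_angle_deg), bucket(path_angle_deg))]
```

A finer grid (as in the matrix display of paragraph (11)) could subdivide each bucket further by magnitude.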

(9) The motion analysis method may include displaying a result obtained by determining a type of the course of the hit ball. When the type of the course of a ball is visually presented, the user can picture the course of a ball type by type. An image of the course of a ball can be communicated to the user more effectively than a simple numerical presentation. It is possible for the user to efficiently improve the form of the swing based on such an image.

(10) The motion analysis method may include displaying the posture of the hitting area represented by one coordinate axis and a state of the trajectory at the impact represented by the other coordinate axis. When the swing action is performed, the course of a ball of the swing action is determined in accordance with the posture of the hitting area and the trajectory. The course of a ball is plotted on the Cartesian coordinates. Therefore, the user can easily recognize the type of the course of a ball in accordance with the posture of the hitting area and the trajectory. The image of the course of a ball can effectively be communicated to the user compared to a simple numerical presentation.

(11) The motion analysis method may include performing matrix representation dividing the one coordinate axis into a plurality of areas, and dividing the other coordinate axis into a plurality of areas. The user can easily recognize the type of the course of a ball in accordance with the posture of the hitting area and the trajectory. The image of the course of a ball can effectively be communicated to the user compared to a simple numerical presentation.

(12) The motion analysis method may include displaying the course of a ball having a straight-ahead direction assigned to a central area of the matrix display. The user can easily recognize the type of the course of a ball in accordance with the posture of the hitting area and the trajectory. The image of the course of a ball can effectively be communicated to the user compared to a simple numerical presentation.

(13) The motion analysis method may include displaying an image including a target area for identifying the course of a hit ball targeted by a user in an overlapping manner. The user can set the target course of a ball prior to the measurement of the swing action. When the swing action is performed, the user can easily observe the similarity or difference between the course of a ball identified by the swing action and the target course of a ball. In such a manner as described above, it is possible for the user to improve the form of the swing through trial and error.

(14) The motion analysis method may include displaying, when displaying plotted points based on a latest swing action, a latest plotted point so as to be visually distinguished from previous plotted points. The previous plotted points remain in the image. Therefore, the user can visually check the history of the posture of the hitting area and the trajectory. When checking the history, the plotted point of the latest course of a ball is visually distinguished from the plotted points of the previous course of a ball. The user can easily extract the plotted point formed in the latest swing action even if the plurality of plotted points remains.

(15) Yet another aspect of the invention relates to a motion analysis device including an impact point analysis section adapted to identify a posture of a hitting area of sporting equipment at an impact, and identify a trajectory of the sporting equipment at the impact prior to the impact using an output of an inertial sensor, and a determination section adapted to determine a type of a course of a hit ball using information of the posture of the hitting area identified and the trajectory of the sporting equipment identified.

BRIEF DESCRIPTION OF THE DRAWINGS

The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.

FIG. 1 is a conceptual diagram schematically showing a configuration of a golf swing analysis device according to an embodiment of the invention.

FIG. 2 is a conceptual diagram schematically showing a relationship between a motion analysis model, and a golfer and a golf club.

FIG. 3 is an enlarged front view schematically showing a structure of a club head.

FIG. 4 is a block diagram schematically showing a configuration of an arithmetic processing circuit according to an embodiment of the invention.

FIG. 5 is a diagram showing a concept of an angle of a face plane and an angle of a trajectory.

FIG. 6 is a diagram showing a specific example of an image.

FIG. 7 is a diagram showing types of a course of a ball.

DESCRIPTION OF AN EXEMPLARY EMBODIMENT

Hereinafter, an embodiment of the invention will be explained with reference to the accompanying drawings. It should be noted that the embodiment explained below does not unreasonably limit the content of the invention as set forth in the appended claims, and all of the constituents set forth in the embodiment are not necessarily essential as means for solving the problem according to the invention.

(1) Configuration of Golf Swing Analysis Device

FIG. 1 schematically shows a configuration of a golf swing analysis device (a motion analysis device) 11 according to an embodiment of the invention. The golf swing analysis device 11 is provided with, for example, an inertial sensor 12. The inertial sensor 12 incorporates an acceleration sensor and a gyro sensor. The acceleration sensor is capable of separately detecting accelerations in three axis directions perpendicular to each other. The gyro sensor is capable of individually detecting angular velocities around the three axes perpendicular to each other. The inertial sensor 12 outputs a detection signal. The detection signal identifies the acceleration and the angular velocity for each of the axes. The acceleration sensor and the gyro sensor detect the acceleration and the angular velocity with relatively high accuracy. The inertial sensor 12 is attached to a golf club (sporting equipment) 13. The golf club 13 is provided with a shaft 13a and a grip 13b. The grip 13b is held by hand. The grip 13b is formed coaxially with the axis of the shaft 13a. A club head 13c is connected to the tip of the shaft 13a. It is desirable that the inertial sensor 12 be attached to the shaft 13a or the grip 13b of the golf club 13. It is sufficient for the inertial sensor 12 to be fixed to the golf club 13 so that it cannot move relative to the golf club 13. Here, when attaching the inertial sensor 12, one of the detection axes of the inertial sensor 12 is aligned with the axis of the shaft 13a.

The golf swing analysis device 11 is provided with an arithmetic processing circuit 14. The inertial sensor 12 is connected to the arithmetic processing circuit 14. In the connection, a predetermined interface circuit 15 is connected to the arithmetic processing circuit 14. The interface circuit 15 can be connected to the inertial sensor 12 with wire, or can also be connected wirelessly to the inertial sensor 12. The arithmetic processing circuit 14 is supplied with the detection signal from the inertial sensor 12.

A storage device 16 is connected to the arithmetic processing circuit 14. The storage device 16 can store, for example, a golf swing analysis software program (a motion analysis program) 17 and related data. The arithmetic processing circuit 14 executes the golf swing analysis software program 17 to realize a golf swing analysis method. The storage device 16 can include a dynamic random access memory (DRAM), a mass-storage unit, a nonvolatile memory, and so on. For example, the DRAM temporarily holds the golf swing analysis software program 17 when performing the golf swing analysis method. The mass-storage unit such as a hard disk drive (HDD) stores the golf swing analysis software program 17 and the data. The nonvolatile memory stores a program and data relatively small in volume such as a basic input and output system (BIOS).

An image processing circuit 18 is connected to the arithmetic processing circuit 14. The arithmetic processing circuit 14 transmits predetermined image data to the image processing circuit 18. A display device 19 is connected to the image processing circuit 18. In the connection, a predetermined interface circuit (not shown) is connected to the image processing circuit 18. The image processing circuit 18 transmits an image signal to the display device 19 in accordance with the image data input. An image identified by the image signal is displayed on a screen of the display device 19. A flat panel display such as a liquid crystal display is used as the display device 19. Here, the arithmetic processing circuit 14, the storage device 16, and the image processing circuit 18 are provided as a computer system.

An input device 21 is connected to the arithmetic processing circuit 14. The input device 21 is provided with at least alphabet keys and a numerical keypad. Character information and numerical information are input to the arithmetic processing circuit 14 from the input device 21. The input device 21 can be formed of, for example, a keyboard. The combination of the computer system and the keyboard can be replaced with, for example, a smart phone.

(2) Motion Analysis Model

The arithmetic processing circuit 14 defines a virtual space. The virtual space is formed of a three-dimensional space. The three-dimensional space represents the real space. As shown in FIG. 2, the three-dimensional space has an absolute reference coordinate system (a global coordinate system) Σxyz. In the three-dimensional space, there is built a three-dimensional motion analysis model 26 in accordance with the absolute reference coordinate system Σxyz. Point constraint by a pivot point 28 (coordinate x) is applied to a rod 27 of the three-dimensional motion analysis model 26. The rod 27 three-dimensionally acts as a pendulum around the pivot point 28. The position of the pivot point 28 can be moved. Here, in accordance with the absolute reference coordinate system Σxyz, the position of the centroid 29 of the rod 27 is identified by the coordinate xg, and the position of the club head 13c is identified by the coordinate xh.

The three-dimensional motion analysis model 26 corresponds to a model of the golf club 13 during the swing. The rod 27 of the pendulum represents the shaft 13a of the golf club 13. The pivot point 28 of the rod 27 represents the grip 13b. The inertial sensor 12 is fixed to the rod 27. In accordance with the absolute reference coordinate system Σxyz, the position of the inertial sensor 12 is identified by the coordinate xs. The inertial sensor 12 outputs an acceleration signal and an angular velocity signal. In the acceleration signal, the acceleration after deduction of the influence of the gravitational acceleration g is identified as $(\ddot{x}_s - g)$. Further, in the angular velocity signal, angular velocities ω1, ω2 are identified.

Similarly, the arithmetic processing circuit 14 fixes a local coordinate system Σs to the inertial sensor 12. The origin of the local coordinate system Σs is set to the origin of the detection axis of the inertial sensor 12. The y axis of the local coordinate system Σs coincides with the axis of the shaft 13a. The x axis of the local coordinate system Σs coincides with the hit ball direction identified by the orientation of the face. Therefore, in accordance with the local coordinate system Σs, the position lsj of the pivot point is identified by (0, lsjy, 0). Similarly, on the local coordinate system Σs, the position lsg of the centroid 29 is identified by (0, lsgy, 0), and the position lsh of the club head 13c is identified by (0, lshy, 0).

As shown in FIG. 3, the arithmetic processing circuit 14 identifies the posture and the position of the face plane 31 on the club head 13c in accordance with the local coordinate system Σs. When identifying the posture and the position, a first measurement point 32 and a second measurement point 33 are set on the face plane 31. The first measurement point 32 and the second measurement point 33 are disposed at positions distant from each other. Here, the first measurement point 32 is located on the heel 34 side end of the face plane 31, and the second measurement point 33 is located on the toe 35 side end of the face plane 31. The first measurement point 32 and the second measurement point 33 are disposed in a horizontal plane 36 parallel to the ground G. Therefore, a line segment 37 connecting the first measurement point 32 and the second measurement point 33 to each other can identify the orientation of the face plane 31 when projected on the ground G. It is sufficient for the first measurement point 32 and the second measurement point 33 to be set on a line segment parallel to a score line 38.
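The orientation of the face plane projected on the ground, as described above, can be sketched as follows; the axis convention (y vertical, so the horizontal plane keeps x and z) follows the absolute reference coordinate system of this description, while the function name and the degree-valued heading are choices of this sketch:

```python
import math

def face_orientation(heel_pt, toe_pt):
    """Heading [deg] of the heel-to-toe face line projected onto the
    horizontal plane. Each point is (x, y, z) with y vertical; the
    projection simply drops the y component."""
    dx = toe_pt[0] - heel_pt[0]
    dz = toe_pt[2] - heel_pt[2]
    return math.degrees(math.atan2(dz, dx))
```

For example, a face line running purely along the x axis yields a heading of 0 degrees.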

(3) Configuration of Arithmetic Processing Circuit

FIG. 4 schematically shows a configuration of the arithmetic processing circuit 14 according to the embodiment. The arithmetic processing circuit 14 is provided with a position detection section 41 and a posture detection section 42. The position detection section 41 and the posture detection section 42 are connected to the inertial sensor 12. The position detection section 41 is supplied with an acceleration signal from the inertial sensor 12. The position detection section 41 calculates the position of the inertial sensor 12 at each sampling point based on the accelerations in three axis directions. When calculating the position, double integration is performed on the acceleration for each of the detection axes. In such a manner as described above, the direction components (an x-axis direction displacement, a y-axis direction displacement, and a z-axis direction displacement) of the displacement are identified for the respective detection axes. The position of the inertial sensor 12 is identified by the origin position of the local coordinate system Σs unique to the inertial sensor 12.
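The double integration performed by the position detection section can be sketched in Python as follows. The trapezoidal scheme and the zero initial velocity are assumptions of this sketch; the text only specifies that double integration is performed on gravity-compensated acceleration:

```python
import numpy as np

def integrate_position(accel, dt):
    """Double-integrate per-axis acceleration samples (N x 3 array,
    gravity already removed) into displacement, assuming the sensor
    starts at rest. Uses cumulative trapezoidal integration."""
    accel = np.asarray(accel, dtype=float)
    vel = np.zeros_like(accel)   # velocity per axis
    pos = np.zeros_like(accel)   # displacement per axis
    for i in range(1, len(accel)):
        vel[i] = vel[i - 1] + 0.5 * (accel[i - 1] + accel[i]) * dt
        pos[i] = pos[i - 1] + 0.5 * (vel[i - 1] + vel[i]) * dt
    return pos
```

In practice, integration drift would need to be controlled, for example by re-zeroing at the rest state described below.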

The posture detection section 42 calculates the posture of the inertial sensor 12 at each sampling point based on the angular velocities around the three axes. In the calculation, a rotation matrix Rs is identified from the angular velocities.

$$R_s = \begin{pmatrix} w^2+x^2-y^2-z^2 & 2(xy-wz) & 2(xz+wy) \\ 2(xy+wz) & w^2-x^2+y^2-z^2 & 2(yz-wx) \\ 2(xz-wy) & 2(yz+wx) & w^2-x^2-y^2+z^2 \end{pmatrix}$$

Here, when identifying the rotation matrix Rs, a quaternion Q is identified.

$$Q = (w, x, y, z),\qquad w = \cos\frac{\theta}{2},\quad x = \frac{\omega_x}{|\vec{\omega}|}\sin\frac{\theta}{2},\quad y = \frac{\omega_y}{|\vec{\omega}|}\sin\frac{\theta}{2},\quad z = \frac{\omega_z}{|\vec{\omega}|}\sin\frac{\theta}{2}$$

Here, the magnitude of the angular velocity is calculated by the following formula.

$$|\vec{\omega}| = \sqrt{\omega_x^2 + \omega_y^2 + \omega_z^2}$$

In this case, the angular velocity [rad/s] measured is expressed by the following formula.

$$\vec{\omega} = (\omega_x, \omega_y, \omega_z)$$

An angular variation θ [rad] per unit time Δt is calculated by the following formula.

$$\theta = |\vec{\omega}|\,\Delta t$$
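The conversion above, from one interval's angular-velocity sample to a unit quaternion and then to the rotation matrix Rs, can be sketched directly in Python (the per-interval integration and the small-rotation cutoff are choices of this sketch):

```python
import math
import numpy as np

def quaternion_from_rate(omega, dt):
    """Unit quaternion Q = (w, x, y, z) for the rotation over one
    sampling interval dt, from angular velocity omega [rad/s],
    following theta = |omega| * dt."""
    wx, wy, wz = omega
    mag = math.sqrt(wx * wx + wy * wy + wz * wz)
    if mag < 1e-12:
        return (1.0, 0.0, 0.0, 0.0)  # no measurable rotation
    theta = mag * dt
    s = math.sin(theta / 2.0) / mag
    return (math.cos(theta / 2.0), wx * s, wy * s, wz * s)

def rotation_matrix(q):
    """Convert Q = (w, x, y, z) to the 3x3 rotation matrix Rs."""
    w, x, y, z = q
    return np.array([
        [w*w + x*x - y*y - z*z, 2*(x*y - w*z),         2*(x*z + w*y)],
        [2*(x*y + w*z),         w*w - x*x + y*y - z*z, 2*(y*z - w*x)],
        [2*(x*z - w*y),         2*(y*z + w*x),         w*w - x*x - y*y + z*z],
    ])
```

For instance, a 90-degree rotation about the z axis maps the x unit vector onto the y unit vector.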

The arithmetic processing circuit 14 is provided with a rest determination section 43 and an impact determination section 44. The rest determination section 43 and the impact determination section 44 are connected to, for example, the inertial sensor 12. The rest determination section 43 identifies the resting state of the golf club 13 based on the output of the inertial sensor 12. When the output of the inertial sensor 12 falls below a threshold value, the rest determination section 43 determines that the golf club 13 is in the resting state. It is sufficient to set the threshold value so that the influence of detection-signal components representing minute vibrations, such as body motion, can be eliminated. When the resting state continues for a predetermined period of time, the rest determination section 43 outputs a rest notification signal. The threshold value can be stored in the storage device 16 in advance, or can be input to the storage device 16 through operation of the input device 21.
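The rest determination described above can be sketched as a threshold test on the sensor-output magnitude; the consecutive-sample count standing in for the "predetermined period of time" is an assumption of this sketch:

```python
def is_at_rest(magnitudes, threshold, min_count):
    """Return True once the sensor-output magnitude stays below
    `threshold` for `min_count` consecutive samples. Both values
    are left to configuration, as in the description."""
    run = 0
    for mag in magnitudes:
        run = run + 1 if mag < threshold else 0
        if run >= min_count:
            return True
    return False
```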

The impact determination section 44 identifies the point of the impact based on the output of the inertial sensor 12. At the moment of the impact, an acceleration and an angular velocity act on the golf club 13 unlike the case of a practice swing. Therefore, the output of the inertial sensor 12 is disturbed at the point of the impact. For example, a high acceleration is observed in a specific direction. The point of the impact can be identified based on a threshold value of the acceleration. When detecting the impact, the impact determination section 44 outputs an impact notification signal. The threshold value can be stored in the storage device 16 in advance, or can be input to the storage device 16 through operation of the input device 21.
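The impact detection described above amounts to finding the first sample whose acceleration exceeds the threshold; a single fixed threshold is the simplest reading of the text, and more robust detectors are certainly possible:

```python
def find_impact_index(accel_magnitudes, threshold):
    """Return the index of the first sample whose acceleration
    magnitude exceeds `threshold`, or None if no impact occurs."""
    for i, a in enumerate(accel_magnitudes):
        if a > threshold:
            return i
    return None
```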

The arithmetic processing circuit 14 is provided with a coordinate conversion section 45. The coordinate conversion section 45 is connected to the position detection section 41, the posture detection section 42, the rest determination section 43, and the impact determination section 44, and is supplied with their outputs. The coordinate conversion section 45 identifies the posture and the position of the face plane 31 of the club head 13c in the absolute reference coordinate system Σxyz identifying the real space. When identifying the posture and the position, the coordinate conversion section 45 identifies the first measurement point 32 and the second measurement point 33 on the face plane 31 in accordance with the local coordinate system Σs. The coordinate values of the first measurement point 32 and the second measurement point 33 can be stored in, for example, the storage device 16 in advance, or can be input to the storage device 16 through operation of the input device 21. The coordinate conversion section 45 performs the coordinate conversion on the coordinate values of the local coordinate system Σs, and then identifies the first measurement point 32 and the second measurement point 33 in accordance with the absolute reference coordinate system Σxyz.

When performing the coordinate conversion, the coordinate conversion section 45 identifies the rotation matrix Rs for each sampling point. The posture variation of the inertial sensor 12 from the beginning of the measurement corresponds to an accumulated value of the rotation matrix Rs from the beginning of the measurement to the point of the calculation. The coordinate conversion section 45 can store the coordinate value of the first measurement point 32 and the coordinate value of the second measurement point 33 after performing the coordinate conversion in a temporary storage memory 46 for each sampling time point.
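The accumulation of per-sample rotation matrices and the conversion of a local measurement point into the absolute reference coordinate system might look like the following; the left-multiplication order and the function names are choices of this sketch, since the text only states that the increments are accumulated:

```python
import numpy as np

def accumulate_rotation(per_sample_R):
    """Accumulate per-sample rotation matrices Rs into the total
    posture change since the beginning of measurement, returning
    the accumulated matrix at every sampling point."""
    total = np.eye(3)
    history = []
    for R in per_sample_R:
        total = R @ total  # apply the newest increment (order is an assumption)
        history.append(total.copy())
    return history

def to_global(R_total, sensor_pos, local_pt):
    """Convert a measurement point given in the sensor-local frame
    into the absolute reference frame."""
    return sensor_pos + R_total @ np.asarray(local_pt, dtype=float)
```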

The arithmetic processing circuit 14 is provided with a rest point analysis section 47 and an impact point analysis section 48. The rest point analysis section 47 and the impact point analysis section 48 are connected to the coordinate conversion section 45. The rest point analysis section 47 and the impact point analysis section 48 are supplied with an output from the coordinate conversion section 45. The coordinate conversion section 45 supplies the rest point analysis section 47 with the coordinate value of the first measurement point 32 and the coordinate value of the second measurement point 33 obtained by the coordinate conversion in response to reception of the rest notification signal from the rest determination section 43. Similarly, the coordinate conversion section 45 supplies the impact point analysis section 48 with the coordinate value of the first measurement point 32 and the coordinate value of the second measurement point 33 obtained by the coordinate conversion in response to reception of the impact notification signal from the impact determination section 44.

The rest point analysis section 47 is provided with a posture identification section 51 and a position identification section 52. The posture identification section 51 identifies the posture of the face plane 31 in the absolute reference coordinate system Σxyz at rest (i.e., at address). When identifying the posture, as shown in, for example, FIG. 5, the posture identification section 51 connects the first measurement point 32 (=rh (0)) and the second measurement point 33 (=rt(0)) at rest to each other with a first line segment L1. The posture of the face plane 31 is identified by the first line segment L1. On this occasion, the first line segment L1 is projected on a horizontal plane (a plane expanding in parallel to the ground G) perpendicular to the y axis in the absolute reference coordinate system Σxyz.

The position identification section 52 identifies a second line segment L2 perpendicular to the face plane 31 in the absolute reference coordinate system Σxyz at rest. The second line segment L2 intersects perpendicularly with the face plane 31 at the first measurement point 32 (=rh (0)). When identifying the second line segment L2, the position identification section 52 identifies the first line segment L1. The position identification section 52 sets the second line segment L2 at the first measurement point 32 in a direction perpendicular to the first line segment L1. The second line segment L2 represents a so-called target line. Specifically, the second line segment L2 indicates a straight-ahead direction leading to the target point of the hit ball. On this occasion, the second line segment L2 is projected on a horizontal plane perpendicular to the y axis in the absolute reference coordinate system Σxyz similarly to the first line segment L1.
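The target-line construction above, a segment perpendicular to the face line in the horizontal plane, can be sketched in two dimensions; which of the two perpendiculars points toward the target depends on setup and handedness, which this sketch leaves to the caller:

```python
import numpy as np

def target_line_direction(heel_xz, toe_xz):
    """Unit vector perpendicular to the face line (the direction of
    the second line segment L2), given heel and toe points already
    projected onto the horizontal plane as 2-D coordinates."""
    heel = np.asarray(heel_xz, dtype=float)
    toe = np.asarray(toe_xz, dtype=float)
    d = toe - heel
    perp = np.array([-d[1], d[0]])  # rotate the face line by 90 degrees
    return perp / np.linalg.norm(perp)
```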

The impact point analysis section 48 is provided with a posture identification section 53 and a trajectory identification section 54. The posture identification section 53 identifies the posture of the face plane 31 in the absolute reference coordinate system Σxyz at impact. When identifying the posture, as shown in, for example, FIG. 5, the posture identification section 53 connects the first measurement point 32 (=rh(imp)) and the second measurement point 33 (=rt(imp)) at impact to each other with a third line segment L3. The posture of the face plane 31 at impact is identified by the third line segment L3. On this occasion, similarly to the above, the third line segment L3 is projected on a horizontal plane perpendicular to the y axis in the absolute reference coordinate system Σxyz.

The trajectory identification section 54 identifies the trajectory of the first measurement point 32 in the absolute reference coordinate system Σxyz at impact. When identifying the trajectory, the trajectory identification section 54 identifies a first coordinate point P1 on the absolute reference coordinate system Σxyz indicating the position rh (imp) of the first measurement point 32 at impact, and a second coordinate point P2 on the absolute reference coordinate system Σxyz indicating the position rh(imp−1) of the first measurement point 32 at a sampling point coming before the impact. Here, the sampling point immediately before the impact is assigned to the second coordinate point P2. The first coordinate point P1 and the second coordinate point P2 are connected to each other with a fourth line segment L4. The direction of the trajectory is identified by the fourth line segment L4. On this occasion, similarly to the above, the fourth line segment L4 is projected on a horizontal plane perpendicular to the y axis in the absolute reference coordinate system Σxyz.
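The trajectory identification can be sketched in the same way; the names are again illustrative, and points are assumed to be plain (x, y, z) tuples with y vertical, as above.

```python
def trajectory_direction(rh_imp, rh_prev):
    """Direction of the fourth line segment L4: from the second coordinate
    point P2 (the sampling point immediately before impact) to the first
    coordinate point P1 (impact), projected on the horizontal plane by
    dropping the vertical y component."""
    p1 = (rh_imp[0], rh_imp[2])
    p2 = (rh_prev[0], rh_prev[2])
    return (p1[0] - p2[0], p1[1] - p2[1])
```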

The arithmetic processing circuit 14 is provided with a determination section 55. The determination section 55 determines the type of the course of the hit ball based on the information of the posture of the face plane 31 and the information of the trajectory of the golf club 13. Here, the determination section 55 includes a face angle calculation section 56 and an incident angle calculation section 57. The face angle calculation section 56 is connected to the posture identification section 51 of the rest point analysis section 47 and the posture identification section 53 of the impact point analysis section 48. The face angle calculation section 56 is supplied with the outputs from the posture identification sections 51, 53. The face angle calculation section 56 calculates an angle (a face angle) φ of the face plane 31 at impact relative to the face plane 31 at rest. When calculating the angle φ, an angle is measured in a horizontal plane of the absolute reference coordinate system Σxyz between the first line segment L1 identified by the posture identification section 51 and the third line segment L3 identified by the posture identification section 53. The face angle calculation section 56 outputs first angle information data. The first angle information data identifies the angle φ of the face plane 31.

The incident angle calculation section 57 is connected to the position identification section 52 of the rest point analysis section 47 and the trajectory identification section 54 of the impact point analysis section 48. The incident angle calculation section 57 is supplied with the outputs from the position identification section 52 and the trajectory identification section 54. The incident angle calculation section 57 calculates the angle θ of the trajectory relative to a line segment orthogonally intersecting with the face plane 31 at the first measurement point 32 of the face plane 31 at rest, namely the second line segment L2. When calculating the angle θ, an angle is measured in a horizontal plane of the absolute reference coordinate system Σxyz between the second line segment L2 identified by the position identification section 52 and the fourth line segment L4 identified by the trajectory identification section 54. The incident angle calculation section 57 outputs second angle information data. The second angle information data identifies the angle θ of the trajectory.
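Both angle calculations reduce to a signed angle between two projected direction vectors. A minimal sketch, assuming an atan2-based measurement (the patent does not specify how the angle is computed, and the sign convention here is an assumption):

```python
import math

def signed_angle(u, v):
    """Signed angle in degrees from 2-D vector u to 2-D vector v,
    positive counterclockwise. The same routine serves for both the
    face angle phi (from L1 to L3) and the incident angle theta
    (from L2 to L4)."""
    cross = u[0] * v[1] - u[1] * v[0]   # z component of the 2-D cross product
    dot = u[0] * v[0] + u[1] * v[1]
    return math.degrees(math.atan2(cross, dot))
```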

The determination section 55 includes an image data generation section 58. The image data generation section 58 is connected to the face angle calculation section 56 and the incident angle calculation section 57. The image data generation section 58 is supplied with the first angle information data and the second angle information data. The image data generation section 58 generates the image data for identifying the image for visually displaying the type of the course of a ball assigned based on the angle of the face plane and the angle of the trajectory thus supplied. When generating the image data, the image data generation section 58 obtains background image data from the storage device 16. The background image data identifies the image of Cartesian coordinates. As shown in, for example, FIG. 6, in the Cartesian coordinates, the angle φ of the face plane 31 is assigned to one coordinate axis (the x axis), and the angle θ of the trajectory is assigned to the other coordinate axis (the y axis). The Cartesian coordinates are divided into nine areas S1 through S9 arranged in a 3×3 matrix. The types of the course of a ball are respectively assigned to the areas S1 through S9. To the center area S5 among the nine areas S1 through S9, there is assigned the course of a ball "Straight" having the straight-ahead direction. Here, for right-handed golfers, in the case in which the angle of the trajectory increases in the positive direction while keeping the angle of the face plane 31 compared to the center area S5, the course of a ball "Push" (the area S4) is assigned, and similarly, in the case in which the angle of the trajectory decreases in the negative direction while keeping the angle of the face plane 31 compared to the center area S5, the course of a ball "Pull" (the area S6) is assigned. In such a manner as described above, the types of the course of a ball are assigned to the areas S4, S5, and S6 located in the three rows of the center column.
In the case in which the angle of the face plane 31 increases in the positive direction while keeping the trajectories of “Push,” “Straight,” and “Pull,” the types of “Push Slice,” “Slice,” and “Fade” are respectively assigned to the areas S1, S2, and S3 located in the three rows of the right column, and in the case in which the angle of the face plane 31 increases in the negative direction while keeping the trajectories of “Push,” “Straight,” and “Pull,” the types of “Draw,” “Hook,” and “Pull Hook” are respectively assigned to the areas S7, S8, and S9 located in the three rows of the left column.
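The assignment of the nine areas described above amounts to a lookup on the signs of the two angles. A sketch for the right-handed layout; treating exactly zero as the boundary of "Straight" (i.e., no dead band) is an assumption, as the patent leaves the thresholds open:

```python
def classify_course(phi, theta):
    """Map the face angle phi and the incident angle theta to the nine
    course types for a right-handed golfer, following the 3x3 layout
    in the text (columns by face angle, rows by trajectory angle)."""
    sign = lambda a: (a > 0) - (a < 0)
    table = {
        ( 1,  1): "Push Slice", ( 1,  0): "Slice",    ( 1, -1): "Fade",
        ( 0,  1): "Push",       ( 0,  0): "Straight", ( 0, -1): "Pull",
        (-1,  1): "Draw",       (-1,  0): "Hook",     (-1, -1): "Pull Hook",
    }
    return table[(sign(phi), sign(theta))]
```

For a left-handed golfer, the same table would be used with the columns mirrored, matching the interchanged assignment described later in the text.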

As shown in FIG. 7, in the case of right-handed golfers, the course 61 of a ball of "Push," the course 62 of a ball of "Fade," and the course 63 of a ball of "Slice" are sequentially identified as the degree of the curve of the hit ball increases outward, and the course 64 of a ball of "Pull," the course 65 of a ball of "Draw," and the course 66 of a ball of "Hook" are sequentially identified as the degree of the curve of the hit ball increases inward.

The image data generation section 58 plots the measurement values on the image of the Cartesian coordinates. As shown in FIG. 6, the plotted image is superimposed on the nine areas S1 through S9. Every time the angle φ of the face plane 31 and the angle θ of the trajectory are identified, the image data generation section 58 forms the image for displaying a plotted point 68a on the Cartesian coordinates in accordance with the angle φ of the face plane 31 and the angle θ of the trajectory. On this occasion, the latest plotted point 68a is drawn so as to be visually distinguished from previous plotted points 68b. Here, the latest plotted point 68a is drawn with a first visual feature, and the previous plotted points 68b are drawn with a second visual feature distinguishable from the first. In such characterization, it is possible to, for example, make the shapes or the colors of the plotted points differ from each other. Here, the latest plotted point 68a is expressed by a "cross" mark, while the previous plotted points 68b corresponding to the history are each expressed by a rectangular mark. When the latest plotted point 68a is drawn with the first feature, the plotted point 68b having been drawn with the first feature until then is redrawn with the second feature. In such a manner as described above, the latest plotted point 68a is distinguished from the other plotted points 68b.

Here, the image data generation section 58 forms an image to be superimposed on the nine areas S1 through S9, and including a target area 69 for identifying the course of a ball targeted by the user. Such a target area 69 can be stored in the storage device 16 through, for example, the operation of the input device 21. In such a manner as described above, the user can show the targeted course of a ball on the image of the Cartesian coordinates.

(4) Operation of Golf Swing Analysis Device

The operation of the golf swing analysis device 11 will briefly be explained. Firstly, the golf swing of a golfer is measured. Prior to the measurement, the necessary information is input from the input device 21 to the arithmetic processing circuit 14. Here, input of the position lsj of the pivot point 28 according to the local coordinate system Σs, the position of the first measurement point 32 and the position of the second measurement point 33, a rotation matrix R0 of the initial posture of the inertial sensor 12, the course of a ball targeted by the golfer, and so on is prompted in accordance with the three-dimensional motion analysis model 26. The information thus input is managed under, for example, specific identifiers. It is sufficient for the identifiers to identify specific golfers.

Prior to the measurement, the inertial sensor 12 is attached to the shaft 13a of the golf club 13. The inertial sensor 12 is fixed to the golf club 13 so as to be unable to be displaced relative to the golf club 13. Here, one of the detection axes of the inertial sensor 12 is aligned with the axis of the shaft 13a. Another of the detection axes of the inertial sensor 12 is aligned with the hit ball direction identified by the orientation of the face plane 31.

Prior to the execution of the golf swing, the measurement by the inertial sensor 12 is started. At the beginning of the operation, the inertial sensor 12 is set to a predetermined position and a predetermined posture. The position and the posture correspond to those identified by the rotation matrix R0 of the initial posture. The inertial sensor 12 continuously measures the acceleration and the angular velocity at predetermined sampling intervals. The sampling intervals define the resolution of the measurement. The detection signal of the inertial sensor 12 is fed to the arithmetic processing circuit 14 in real time. The arithmetic processing circuit 14 receives the signal for identifying the output of the inertial sensor 12.

The golf swing starts with the address, proceeds from the backswing through the downswing to the impact, and then reaches the follow-through and the finish. The golf club 13 is swung. When the golf club 13 is swung, the posture of the golf club 13 varies in accordance with the time axis. The inertial sensor 12 detects the accelerations along the three axes of the local coordinate system Σs and the angular velocities around the three axes of the local coordinate system Σs in accordance with the posture of the golf club 13. The detection signal for identifying the accelerations and the angular velocities is output. The position detection section 41 calculates the position of the inertial sensor 12 in the absolute reference coordinate system Σxyz, namely the position of the origin of the local coordinate system Σs. The posture detection section 42 calculates from the angular velocities the rotation matrix Rs for each sampling point. The coordinate conversion section 45 performs the coordinate conversion on coordinate values of the first measurement point 32 and the second measurement point 33 of the local coordinate system Σs based on the rotation matrix Rs to identify the first measurement point 32 and the second measurement point 33 in the coordinate values of the absolute reference coordinate system Σxyz.
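The coordinate conversion performed by the coordinate conversion section 45 is a rotation followed by a translation. A minimal sketch, assuming the sensor position is already expressed in the absolute reference coordinate system and using plain nested lists instead of a matrix library:

```python
def to_absolute(Rs, sensor_pos, p_local):
    """Convert a measurement point from the sensor-local coordinate
    system into the absolute reference coordinate system: rotate by
    the 3x3 rotation matrix Rs, then translate by the sensor position."""
    rotated = [sum(Rs[i][j] * p_local[j] for j in range(3)) for i in range(3)]
    return tuple(rotated[i] + sensor_pos[i] for i in range(3))
```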

When the golf club 13 is settled at address, the rest detection section 43 detects the resting state of the golf club 13. The rest detection section 43 outputs the rest notification signal. In response to the reception of the rest notification signal, the coordinate conversion section 45 outputs the position signal for identifying the coordinate values of the first measurement point 32 and the second measurement point 33 in accordance with the absolute reference coordinate system Σxyz toward the rest point analysis section 47. The rest point analysis section 47 identifies the posture of the face plane 31 with the first line segment L1 based on the coordinate values of the first measurement point 32 and the second measurement point 33. Similarly, the rest point analysis section 47 identifies the second line segment L2, namely the target line, based on the coordinate values of the first measurement point 32 and the second measurement point 33.

During the swing action, the impact determination section 44 detects the impact. The impact determination section 44 outputs the impact notification signal. In response to the reception of the impact notification signal, the coordinate conversion section 45 outputs, toward the impact point analysis section 48, the position signal for identifying the coordinate values of the first measurement point 32 and the second measurement point 33 at the impact in accordance with the absolute reference coordinate system Σxyz, and the position signal for identifying the coordinate value of the first measurement point 32 at the sampling point immediately before the impact. The impact point analysis section 48 identifies the posture of the face plane 31 with the third line segment L3 based on the coordinate values of the first measurement point 32 and the second measurement point 33. Similarly, the impact point analysis section 48 identifies the fourth line segment L4 based on the first coordinate point P1 and the second coordinate point P2.

The face angle calculation section 56 calculates the angle between the first line segment L1 and the third line segment L3. The angle thus calculated is fed to the image data generation section 58. The incident angle calculation section 57 calculates the angle between the second line segment L2 and the fourth line segment L4. The angle thus calculated is fed to the image data generation section 58. The image data generation section 58 generates the image data for identifying the image of the Cartesian coordinates. As a result, as shown in FIG. 6, the type of the course of a ball is identified for each of the plotted points 68a, 68b.

The course of the hit ball is estimated with the posture of the face plane 31 at impact and the trajectory. The type of the course of a ball is determined in accordance with the combination of the posture of the face plane 31 and the trajectory. When the type of the course of a ball is visually presented, the golfer can image the course of a ball type by type. The image of the course of a ball can effectively be communicated to the golfer compared to a simple numerical presentation. It is possible for the golfer to efficiently improve the form of the swing based on such an image.

In this golf swing analysis device 11, the posture of the face plane 31 and the trajectory are identified at impact. At the identification, the coordinate conversion from the local coordinate system Σs unique to the inertial sensor 12 into the absolute reference coordinate system Σxyz is performed on the output of the inertial sensor 12 based on the rotation matrix Rs. On this occasion, the absolute reference coordinate system Σxyz is identified by the face plane 31 at rest. In general, before the swing action, the golfer rests the golf club 13 at the impact position in a posture for checking the face plane 31. The target line is set in accordance with the posture of the face plane 31. Since the posture of the face plane 31 at impact and the trajectory are identified based on the face plane 31 at such a resting state as described above, the type of the course of a ball is identified from the swing action without being affected by the orientation of the body of the golfer. Since the analysis of the swing action is performed in such a manner as described above without being affected by the orientation of the body, the golfer can more efficiently improve the form of the swing.

In particular, when separating the courses of a ball, the angle of the face plane 31 at impact is calculated relative to the face plane 31 at rest. According to such calculation of the angle, the courses of the hit ball are finely classified in accordance with the sign and the magnitude of the angle. Further, since the angle of the trajectory is calculated relative to the target line determined by the face plane 31 at rest, the courses of the hit ball are similarly finely classified in accordance with the sign and the magnitude of the angle. As a result, the golfer can more effectively improve the form of the swing.

When identifying the trajectory of the first measurement point 32, the first coordinate point P1 on the absolute reference coordinate system Σxyz indicating the position of the first measurement point 32 at impact, and the second coordinate point P2 on the absolute reference coordinate system Σxyz indicating the position of the first measurement point 32 at the sampling point coming before the impact are identified. When calculating the angle between the target line and the trajectory, the first coordinate point P1 and the second coordinate point P2 are identified. A vector in the moving direction is identified by the fourth line segment L4 including the first coordinate point P1 and the second coordinate point P2. In such a manner as described above, the angle of the trajectory is reliably calculated.

As described above, when calculating the angle of the face plane 31 and the angle of the trajectory, the first line segment L1, the second line segment L2, the third line segment L3, and the fourth line segment L4 are projected on a horizontal plane in the absolute reference coordinate system Σxyz. In such a manner as described above, the y axis perpendicular to the horizontal plane is omitted in the calculation process. The calculation process is simplified.

In the Cartesian coordinates identified by the image data, the angle of the face plane 31 is represented by one coordinate axis, and the angle of the trajectory is represented by the other coordinate axis. Moreover, the Cartesian coordinates are divided into nine areas S1 through S9 arranged in the 3×3 matrix, and the types of the course of a ball are individually assigned to the areas S1 through S9. On this occasion, to the center area S5 among the nine areas S1 through S9, there is assigned the course of a ball in the straight-ahead direction. When the swing action is performed, the course of a ball of the swing action is determined in accordance with the angle of the face plane 31 and the angle of the trajectory. The course of a ball is plotted on the Cartesian coordinates. Therefore, the golfer can easily recognize the type of the course of a ball in accordance with the angle of the face plane 31 and the angle of the trajectory. The image of the course of a ball can effectively be communicated to the golfer compared to a simple numerical presentation.

The Cartesian coordinates identified by the image data further include the target area 69 for identifying the course of a ball targeted by the golfer overlapping the nine areas S1 through S9. The golfer can set the target course of a ball prior to the measurement of the swing action. When the swing action is performed, the golfer can easily observe the similarity or difference between the course of a ball identified by the swing action and the target course of a ball. In such a manner as described above, it is possible for the golfer to improve the form of the swing through trial and error.

Further, in the Cartesian coordinates identified by the image data, when displaying the plotted points 68a, 68b in accordance with the angle of the face plane 31 and the angle of the trajectory based on the latest swing action, the latest plotted point 68a is displayed so as to be visually distinguished from the previous plotted points 68b. The previous plotted points 68b remain in the image. Therefore, the golfer can visually check the history of the angle of the face plane 31 and the angle of the trajectory. When checking the history, the plotted point 68a of the latest course of a ball is visually distinguished from the plotted points 68b of the previous courses of a ball. The golfer can easily extract the plotted point 68a formed in the latest swing action even if a plurality of plotted points 68a, 68b remains.

For example, the course of a ball of a left-handed golfer corresponds to a course obtained by interchanging the outside and the inside of the course of a ball of a right-handed golfer. Therefore, it is desired to assign the display corresponding to a left-handed golfer to the areas S1 through S9 of the Cartesian coordinates. For left-handed golfers, the courses of a ball “Pull,” “Straight,” and “Push” are assigned to the areas S4, S5, and S6 located in the three rows of the center column. In the case in which the angle of the face plane 31 increases in the positive direction while keeping the trajectories of “Pull,” “Straight,” and “Push,” the types of “Pull Hook,” “Hook,” and “Draw” are respectively assigned to the areas S1, S2, and S3 located in the three rows of the right column, and in the case in which the angle of the face plane 31 increases in the negative direction while keeping the trajectories of “Pull,” “Straight,” and “Push,” the types of “Fade,” “Slice,” and “Push Slice” are respectively assigned to the areas S7, S8, and S9 located in the three rows of the left column.

Besides the above, the courses of a hit ball can also be classified in a simple manner. For example, the course of a ball can be classified into “Hook,” “Straight,” and “Slice” based on the angle φ of the face plane 31. On this occasion, it is sufficient for the Cartesian coordinates to be divided into three parts along one of the coordinate axes.

Alternatively, the course of a ball can also be classified into “Push,” “Straight,” and “Pull” based on the angle θ of the trajectory. On this occasion, it is sufficient for the Cartesian coordinates to be divided into three parts along one of the coordinate axes.
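Either simplified variant reduces to a one-axis threshold test. A sketch for the face-angle version; the ±1-degree band treated as "Straight" is an assumed threshold, not taken from the patent:

```python
def classify_by_face_angle(phi, band=1.0):
    """Simplified three-way classification using only the face angle phi
    (degrees), dividing one coordinate axis into three parts. A positive
    phi (open face) maps to "Slice" and a negative phi to "Hook" for a
    right-handed golfer."""
    if phi > band:
        return "Slice"
    if phi < -band:
        return "Hook"
    return "Straight"
```

The trajectory-angle variant would be identical in shape, returning "Push," "Straight," and "Pull" from the angle θ instead.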

Further, the notification of the course of a hit ball is not limited to such screen display as described above; a sound or a vibration, for example, can also be used when notifying the type of the course of a ball. It is sufficient that sounds different in type from each other or vibration patterns different in type from each other are assigned for each course of a hit ball. At the notification with the sound, it is sufficient for the golf swing analysis device 11 to be provided with a speaker to be mounted to the golf club 13 or the user. At the notification with the vibration, it is sufficient for the golf swing analysis device 11 to be provided with a vibrator to be mounted to the grip 13b of the golf club 13 or the user.

It should be noted that in the embodiment described hereinabove, each of the functional blocks of the arithmetic processing circuit 14 is realized in accordance with the execution of the golf swing analysis software program 17. It should also be noted that each of the functional blocks can also be realized by hardware without resort to software processing. Besides the above, the golf swing analysis device 11 can also be applied to the swing analysis of sporting equipment (e.g., a tennis racket, a table-tennis racket, and a baseball bat) held by hand and then swung.

Although the present embodiment is hereinabove explained in detail, it should easily be understood by those skilled in the art that it is possible to make a variety of modifications not substantially departing from the novel matters and the advantages of the invention. Therefore, such modified examples are all included in the scope of the invention. For example, a term described at least once with a different term having a broader sense or the same meaning in the specification or the accompanying drawings can be replaced with the different term in any part of the specification or the accompanying drawings. Further, the configurations and the operations of the inertial sensor 12, the golf club 13, the arithmetic processing circuit 14, the three-dimensional motion analysis model 26, and so on are not limited to those explained in the embodiment, but can variously be modified.

The entire disclosure of Japanese Patent Application No. 2013-130656, filed Jun. 21, 2013 is expressly incorporated by reference herein.

Claims

1. A motion analysis method comprising:

identifying a posture of a hitting area of sporting equipment at an impact using an output of an inertial sensor; and
determining a type of a course of a hit ball using information of the posture of the hitting area identified.

2. A motion analysis method comprising:

identifying a trajectory of sporting equipment at an impact prior to the impact using an output of an inertial sensor; and
determining a type of a course of a hit ball using information of the trajectory identified.

3. A motion analysis method comprising:

identifying a posture of a hitting area of sporting equipment at an impact using an output of an inertial sensor;
identifying a trajectory of the sporting equipment at the impact prior to the impact using an output of the inertial sensor; and
determining a type of a course of a hit ball using information of the posture of the hitting area identified and the trajectory of the sporting equipment identified.

4. The motion analysis method according to claim 1, further comprising:

identifying an initial posture of the hitting area of the sporting equipment before beginning of a motion using the output of the inertial sensor; and
identifying the posture of the hitting area at the impact relative to the initial posture of the hitting area.

5. The motion analysis method according to claim 2, further comprising:

identifying an initial position of the hitting area of the sporting equipment before beginning of a motion using output of the inertial sensor.

6. The motion analysis method according to claim 4, further comprising:

calculating a variation in an angle of the hitting area at the impact relative to the hitting area at the beginning of the motion.

7. The motion analysis method according to claim 5, further comprising:

identifying, when identifying the trajectory, a first coordinate point representing a position of the hitting area at the impact and a second coordinate point representing a position of the hitting area at a sampling point prior to the impact.

8. The motion analysis method according to claim 5, further comprising:

calculating an incident angle of the trajectory at the impact with respect to a line segment orthogonally intersecting with the hitting area at the position of the hitting area at a resting state.

9. The motion analysis method according to claim 7, further comprising:

calculating an incident angle of the trajectory at the impact with respect to a line segment orthogonally intersecting with the hitting area at the position of the hitting area at a resting state.

10. The motion analysis method according to claim 3, further comprising:

displaying a result obtained by determining a type of the course of the hit ball.

11. The motion analysis method according to claim 10, further comprising:

displaying the posture of the hitting area represented by one coordinate axis and a state of the trajectory at the impact represented by the other coordinate axis.

12. The motion analysis method according to claim 11, further comprising:

performing matrix representation dividing the one coordinate axis into a plurality of areas, and dividing the other coordinate axis into a plurality of areas.

13. The motion analysis method according to claim 12, further comprising:

displaying the course of a hit ball having a straight-ahead direction assigned to a central area of the matrix display.

14. The motion analysis method according to claim 11, further comprising:

displaying an image including a target area for identifying the course of a hit ball targeted by a user in an overlapping manner.

15. The motion analysis method according to claim 11, further comprising:

displaying, when displaying plotted points based on a latest swing action, a latest plotted point so as to be visually distinguished from previous plotted points.

16. A motion analysis device comprising:

an impact point analysis section adapted to identify a posture of a hitting area of sporting equipment at an impact, and identify a trajectory of the sporting equipment at the impact prior to the impact using an output of an inertial sensor; and
a determination section adapted to determine a type of a course of a hit ball using information of the posture of the hitting area identified and the trajectory of the sporting equipment identified.
Patent History
Publication number: 20140378239
Type: Application
Filed: Jun 17, 2014
Publication Date: Dec 25, 2014
Inventors: Masafumi SATO (Hara-mura), Kazuhiro SHIBUYA (Shiojiri-shi), Kazuo NOMURA (Shiojiri-shi), Kenya KODAIRA (Azumino-shi)
Application Number: 14/306,931
Classifications
Current U.S. Class: Electrical (473/199)
International Classification: A63B 69/36 (20060101);