EXERCISE ANALYSIS DEVICE, EXERCISE ANALYSIS SYSTEM, EXERCISE ANALYSIS METHOD, PROGRAM, AND RECORDING MEDIUM

- SEIKO EPSON CORPORATION

An exercise analysis device includes: a first specifying unit that specifies an inclination of a first axis which lies in a major axis direction of a shaft of an exercise tool at a time of address or a time of impact in a swing, using output of an inertial sensor; an acquisition unit that acquires a first criterion axis to be compared to the first axis; and a comparison unit that compares an inclination of the first axis to an inclination of the first criterion axis.

Description
BACKGROUND

1. Technical Field

The present invention relates to an exercise analysis device, an exercise analysis system, an exercise analysis method, a program, and a recording medium.

2. Related Art

JP-A-2010-82430 discloses that an image is acquired by photographing, from the rear side in a hitting direction, a period from an address state to the end of a swing, and that the image is split into at least three regions by a first straight line passing through a shaft axis of a golf club in the address state and a second straight line intersecting the first straight line and passing through the root of an installed tee and the base of the neck of a golfer.

A posture of a golfer at the time of address (also referred to as an address posture) has an influence on the goodness or badness of a swing. For example, the distance between a golfer and a ball or the way the golfer bends his or her waist can be confirmed from the address posture. When the distance between a golfer and a ball or the way the golfer bends his or her waist is different from usual, there is a possibility of the swing state deteriorating. However, it is difficult for a golfer to objectively see his or her own address posture, and it is also difficult for the golfer to recognize a change in the address posture. For this reason, it is difficult to know why a swing state is not good. Further, it is also difficult to estimate the goodness or badness of a swing based on an address posture.

In particular, with JP-A-2010-82430, a golfer can photograph his or her address posture with a camera and see the address posture on images. However, with JP-A-2010-82430, it is sometimes difficult to install a camera so that the entire golfer is contained in an image, and it is sometimes difficult to visually confirm a difference between current and previous address postures on an image.

SUMMARY

An advantage of some aspects of the invention is that it supports simple estimation of goodness or badness of a swing.

An exercise analysis device according to an aspect of the invention includes: a first specifying unit that specifies an inclination of a first axis which lies in a longitudinal direction of a shaft of an exercise tool at a time of address or a time of impact in a swing, using output of an inertial sensor; an acquisition unit that acquires a first criterion axis to be compared to the first axis; and a comparison unit that compares an inclination of the first axis to an inclination of the first criterion axis. With this configuration, for example, the user can examine a difference in the position or inclination of the first axis (shaft plane) at the time of previous and current address or impact, and thus can estimate goodness and badness of a swing simply and accurately.

The comparison unit may compare magnitudes of the inclination of the first axis and the inclination of the first criterion axis, and the exercise analysis device may further include an output processing unit that performs output according to a comparison result. With this configuration, for example, the user can objectively recognize a difference in the position or inclination of the first axis (shaft plane) at the time of previous and current address or impact, and thus can estimate goodness and badness of a swing simply and accurately.

The output processing unit may generate a signal for making a report to a user when a difference between the inclination of the first axis and the inclination of the first criterion axis exceeds a predetermined value. With this configuration, for example, the user can simply recognize that the position or inclination of the first axis (shaft plane) at the time of previous and current address or impact is considerably different.

The output processing unit may generate a signal for making a different report to a user according to a difference between the inclination of the first axis and the inclination of the first criterion axis. With this configuration, for example, the user can simply recognize how the position or inclination of the first axis (shaft plane) at the time of previous and current address or impact is different.

The output processing unit may generate a signal for making a report to the user by light emission. With this configuration, for example, the user can recognize that the position or inclination of the first axis at the time of previous and current address or impact is different simply and in substantially real time even when the user does not view a screen.

The output processing unit may generate a signal for causing a type of light emission to be different according to the comparison result. With this configuration, for example, the user can recognize how the position or inclination of the first axis at the time of previous and current address or impact is different simply and in substantially real time even when the user does not view a screen.
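As a concrete illustration of the comparison-and-report behavior described above, the following is a minimal Python sketch. The function and field names, the 2-degree threshold, and the light-emission colors are assumptions introduced for illustration only and are not taken from the embodiment.

from dataclasses import dataclass

@dataclass
class Report:
    message: str
    led_color: str  # hypothetical: the type of light emission differs with the result

def compare_inclinations(first_axis_deg, criterion_axis_deg, threshold_deg=2.0):
    # Compare the inclination of the first axis with the first criterion axis
    # and generate a report whose content depends on the size of the difference.
    diff = first_axis_deg - criterion_axis_deg
    if abs(diff) <= threshold_deg:
        return Report("address posture matches the criterion", led_color="green")
    direction = "steeper" if diff > 0 else "shallower"
    color = "yellow" if abs(diff) <= 2 * threshold_deg else "red"
    return Report(f"shaft inclination is {abs(diff):.1f} deg {direction} than the criterion",
                  led_color=color)

print(compare_inclinations(52.0, 48.5))  # difference of 3.5 deg -> yellow report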

The first specifying unit may specify a first imaginary plane including the first axis and a third axis indicating a hitting direction, the acquisition unit may acquire a first criterion imaginary plane including the first criterion axis and the third axis, and the comparison unit may compare an inclination of the first imaginary plane to an inclination of the first criterion imaginary plane. With this configuration, for example, the user can examine a difference in the position or inclination of the first imaginary plane (shaft plane) at the time of previous and current address or impact, and thus can estimate goodness and badness of a swing simply and accurately.

An exercise analysis system according to another aspect of the invention includes: an inertial sensor; a first specifying unit that specifies an inclination of a first axis which lies in a longitudinal direction of a shaft of an exercise tool at a time of address or a time of impact in a swing, using output of the inertial sensor; an acquisition unit that acquires a first criterion axis to be compared to the first axis; and a comparison unit that compares an inclination of the first axis to an inclination of the first criterion axis. With this configuration, for example, the user can examine a difference in the position or inclination of the first axis (shaft plane) at the time of previous and current address or impact, and thus can estimate goodness and badness of a swing simply and accurately.

An exercise analysis method according to still another aspect of the invention includes: specifying an inclination of a first axis which lies in a longitudinal direction of a shaft of an exercise tool at a time of address or a time of impact in a swing, using output of an inertial sensor; acquiring a first criterion axis to be compared to the first axis; and comparing an inclination of the first axis to an inclination of the first criterion axis. Thus, for example, the user can examine a difference in the position or inclination of the first axis (shaft plane) at the time of previous and current address or impact, and thus can estimate goodness and badness of a swing simply and accurately.

The comparing of the inclinations may include comparing magnitudes of the inclination of the first axis and the inclination of the first criterion axis, and performing output according to a comparison result.

In the performing of the output, a signal for making a report to a user may be generated when a difference between the inclination of the first axis and the inclination of the first criterion axis exceeds a predetermined value.

In the performing of the output, a signal for making a different report to a user may be generated according to a difference between the inclination of the first axis and the inclination of the first criterion axis.

In the performing of the output, a signal for making a report to the user by light emission may be generated.

In the performing of the output, a signal for causing a type of light emission to be different may be generated according to the comparison result.

In the specifying of the inclination of the first axis, a first imaginary plane including the first axis and a third axis indicating a hitting direction may be specified. In the acquiring of the first criterion axis, a first criterion imaginary plane including the first criterion axis and the third axis may be acquired. In the comparing of the inclinations, an inclination of the first imaginary plane may be compared to an inclination of the first criterion imaginary plane.

A program according to yet another aspect of the invention causes a computer to implement: specifying an inclination of a first axis which lies in a major axis direction of a shaft of an exercise tool at a time of address or a time of impact in a swing, using output of an inertial sensor; acquiring a first criterion axis to be compared to the first axis; and comparing an inclination of the first axis to an inclination of the first criterion axis. Thus, for example, the user can examine a difference in the position or inclination of the first axis (shaft plane) at the time of previous and current address or impact, and thus can estimate goodness and badness of a swing simply and accurately.

A recording medium according to still yet another aspect of the invention records a program causing a computer to implement: specifying an inclination of a first axis which lies in a longitudinal direction of a shaft of an exercise tool at a time of address or a time of impact in a swing, using output of an inertial sensor; acquiring a first criterion axis to be compared to the first axis; and comparing an inclination of the first axis to an inclination of the first criterion axis.

BRIEF DESCRIPTION OF THE DRAWINGS

The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.

FIG. 1 is a diagram illustrating an overview of an exercise analysis system according to an embodiment of the invention.

FIG. 2 is a diagram illustrating examples of a shaft plane and a Hogan's plane.

FIG. 3 is a block diagram illustrating an example of the configuration of an exercise analysis system.

FIG. 4 is a flowchart illustrating an example of an exercise analysis process.

FIG. 5 is a plan view illustrating a golf club and a sensor unit at the time of stopping of a user when viewed from the negative side of the X axis.

FIG. 6 is a diagram illustrating a cross section obtained by cutting the shaft plane along the YZ plane when viewed from the negative side of the X axis.

FIG. 7 is a diagram illustrating a cross section obtained by cutting the Hogan's plane along the YZ plane when viewed from the negative side of the X axis.

FIG. 8 is a diagram illustrating examples of angular velocities output from the sensor unit.

FIG. 9 is a diagram illustrating an example of a norm of an angular velocity.

FIG. 10 is a diagram illustrating an example of a differential value of the norm of an angular velocity.

FIG. 11 is a diagram illustrating an example of an image including a shaft plane and a Hogan's plane.

FIG. 12 is a diagram illustrating an example of a screen for displaying an address posture.

FIG. 13 is a diagram illustrating examples of a procedure and an image for comparing inclination angles.

FIG. 14 is a diagram illustrating examples of a procedure and an image for comparing V zones.

FIG. 15 is a diagram illustrating an example of an image in which parameters of the V zone are plotted.

DESCRIPTION OF EXEMPLARY EMBODIMENTS

Hereinafter, embodiments of the invention will be described with reference to the drawings. Hereinafter, an exercise analysis system performing analysis of a golf swing will be described as an example.

FIG. 1 is a diagram illustrating an overview of an exercise analysis system according to an embodiment of the invention.

An exercise analysis system 1 includes a sensor unit 10 and an exercise analysis device 20.

The sensor unit 10, serving as an inertial sensor, can measure acceleration generated in each axis direction of three axes and an angular velocity generated around each of the three axes, and is mounted on a golf club 3 which is an exercise tool. For example, the sensor unit 10 is fitted on a part of the shaft of the golf club 3 so that one axis among its three detection axes (the x axis, the y axis, and the z axis), for example, the y axis, conforms to the major axis direction of the shaft. Preferably, the sensor unit 10 is fitted at a position close to the grip, where a shock at the time of a shot is rarely delivered and a large centrifugal force is not applied at the time of a swing. The shaft is the handle portion of the golf club 3 excluding the head and also includes the grip.

A user 2 performs a swing motion of hitting a golf ball (not illustrated) in a pre-decided procedure. For example, the user 2 first holds the golf club 3, takes an address posture so that the major axis (longitudinal direction) of the shaft of the golf club 3 is perpendicular to a target line (for example, a hitting target direction), and stops for a predetermined time or more (for example, 1 second or more). Next, the user 2 performs a swing motion to hit the golf ball (which is also referred to as a shot or a stroke). The posture of address in the present specification includes a posture in a stop state of the user before the start of a swing or a posture in a state in which the user shakes the exercise tool (which is also referred to as waggling) before the start of a swing. The target line refers to any hitting direction and is decided as, for example, a hitting target direction in the embodiment.

While the user 2 performs the motion to hit the golf ball in the above-described procedure, the sensor unit 10 measures triaxial acceleration and triaxial angular velocities at a predetermined period (for example, 1 ms) and sequentially transmits the measurement data to the exercise analysis device 20. The sensor unit 10 may immediately transmit the measurement data, or may store the measurement data in an internal memory and transmit the measurement data at a desired timing such as the end of a swing motion of the user 2. Communication between the sensor unit 10 and the exercise analysis device 20 may be wireless communication or wired communication. Alternatively, the sensor unit 10 may store the measurement data in a recording medium such as a memory card which can be detachably mounted and the exercise analysis device 20 may read the measurement data from the recording medium.

The exercise analysis device 20 analyzes a swing exercise performed with the golf club 3 by the user 2 using the data measured by the sensor unit 10. In particular, in the embodiment, the exercise analysis device 20 specifies a shaft plane (which corresponds to a first imaginary plane or a first axis according to the invention) and a Hogan's plane (which corresponds to a second imaginary plane or a second axis according to the invention) at the time of stopping of the user 2 (the time of address) using the data measured by the sensor unit 10. The exercise analysis device 20 calculates a trajectory of the golf club 3 at a swing after the user 2 starts the swing motion. The exercise analysis device 20 generates image data including the trajectory of the golf club 3, the shaft plane, and the Hogan's plane in the swing of the user 2 and causes a display unit to display an image according to the image data. By displaying the shaft plane and the Hogan's plane, it is possible to recognize a space called a V zone between the shaft plane and the Hogan's plane. The exercise analysis device 20 may be, for example, a portable device such as a smartphone or a personal computer (PC). In FIG. 1, the exercise analysis device 20 is mounted on the waist of the user 2, but the mounted position is not particularly limited. Further, the exercise analysis device 20 may not be mounted on the user 2.

FIG. 2 is a diagram illustrating examples of the shaft plane and the Hogan's plane. In the embodiment, an XYZ coordinate system (global coordinate system) is defined in which a target line indicating a hitting target direction is the X axis, an axis on the horizontal plane perpendicular to the X axis is the Y axis, and the upward vertical direction (the direction opposite to the gravity acceleration) is the Z axis. The X, Y, and Z axes are shown in FIG. 2.

A shaft plane 30 at the time of address of the user 2 is an imaginary plane which includes a first line segment 51 serving as a first axis lying in the major axis direction of the shaft of the golf club 3 and a third line segment 52 serving as a third axis indicating the hitting target direction, and which has four vertexes T1, T2, S1, and S2. In the embodiment, a position 61 of the head (blow portion) of the golf club 3 is set as the origin O (0, 0, 0) of the XYZ coordinate system. The first line segment 51 is a line segment which connects the position 61 (the origin O) of the head of the golf club 3 to a position 62 of the grip end. The third line segment 52 is a line segment which has T1 and T2 on the X axis as both ends, has a length TL, and is centered on the origin O. When the user 2 takes the above-described address posture at the time of the address, the shaft of the golf club 3 is perpendicular to the target line (the X axis). Therefore, the third line segment 52 is a line segment perpendicular to the major axis direction of the shaft of the golf club 3 (it can also be regarded as a line segment that is perpendicular to or intersects the blow surface of the head on which a ball is hit), that is, a line segment perpendicular to the first line segment 51. The shaft plane 30 is specified by calculating the coordinates of the four vertexes T1, T2, S1, and S2 in the XYZ coordinate system. A method of calculating the coordinates of the four vertexes T1, T2, S1, and S2 will be described in detail below.

A Hogan's plane 40 is an imaginary plane which includes the third line segment 52 and a second line segment 53 serving as a second axis, and which has four vertexes T1, T2, H1, and H2. In the embodiment, for example, the second line segment 53 is a line segment which connects the position 61 (which is an example of a blow position) of the head (which is a blow portion) of the golf club 3 to a predetermined position 63 (which is, for example, the position of the base of the neck or the position of one of the right and left shoulders) on a line segment connecting both shoulders of the user 2 to one another. Here, the second line segment 53 may instead be, for example, a line segment which connects the predetermined position 63 to the position of a ball (which is an example of the blow position) at the time of address. The Hogan's plane 40 is specified by calculating the coordinates of the four vertexes T1, T2, H1, and H2 in the XYZ coordinate system. A method of calculating the coordinates of the four vertexes T1, T2, H1, and H2 will be described in detail below.

FIG. 3 is a block diagram illustrating an example of the configuration of an exercise analysis system.

The sensor unit 10 includes a control unit 11, a communication unit 12, an acceleration sensor 13, and an angular velocity sensor 14.

The acceleration sensor 13 measures acceleration generated in each of three mutually intersecting (ideally orthogonal) axis directions and outputs digital signals (acceleration data) according to the magnitudes and directions of the measured triaxial accelerations.

The angular velocity sensor 14 measures an angular velocity generated around each of three mutually intersecting (ideally orthogonal) axes and outputs digital signals (angular velocity data) according to the magnitudes and directions of the measured triaxial angular velocities.

The control unit 11 controls the sensor unit 10 in an integrated manner. The control unit 11 receives the acceleration data and the angular velocity data from the acceleration sensor 13 and the angular velocity sensor 14 and stores them, together with time information, in a storage unit (not illustrated). The control unit 11 generates packet data in conformity with a communication format from the stored measurement data (the acceleration data and the angular velocity data) and the appended time information, and outputs the packet data to the communication unit 12.

The acceleration sensor 13 and the angular velocity sensor 14 are ideally fitted in the sensor unit 10 so that the three axes of each sensor match the three axes (the x axis, the y axis, and the z axis) of the rectangular coordinate system (sensor coordinate system) defined for the sensor unit 10, but errors of the fitting angles actually occur. Accordingly, the control unit 11 performs a process of converting the acceleration data and the angular velocity data into data of the xyz coordinate system, using correction parameters calculated in advance according to the errors of the fitting angles.
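As an illustrative sketch of this fitting-angle correction, the raw triaxial output can be mapped into the xyz sensor coordinate system with a pre-calibrated matrix, as below. The matrix values and function name are placeholders for illustration, not actual correction parameters from the embodiment.

import numpy as np

# Hypothetical correction parameters calculated in advance from the fitting-angle
# errors (the identity matrix if the sensor axes match the xyz axes exactly).
CORRECTION_MATRIX = np.array([
    [0.9998, -0.0175, 0.0000],
    [0.0175,  0.9998, 0.0000],
    [0.0000,  0.0000, 1.0000],
])

def to_sensor_frame(raw_xyz):
    # Convert one raw triaxial sample (acceleration or angular velocity)
    # into the xyz coordinate system defined for the sensor unit.
    return CORRECTION_MATRIX @ np.asarray(raw_xyz, dtype=float)

print(to_sensor_frame([0.0, 9.8, 0.0]))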

The control unit 11 may perform a temperature correction process of the acceleration sensor 13 and the angular velocity sensor 14. Alternatively, a temperature correction function may be embedded in the acceleration sensor 13 and the angular velocity sensor 14.

The acceleration sensor 13 and the angular velocity sensor 14 may output analog signals. In this case, the control unit 11 may perform A/D (analog/digital) conversion on each of an output signal of the acceleration sensor 13 and an output signal of the angular velocity sensor 14, generate measurement data (acceleration data and angular velocity data), and generate packet data for communication using the measurement data.

The communication unit 12 performs, for example, a process of transmitting the packet data received from the control unit 11 to the exercise analysis device 20 or a process of receiving control commands from the exercise analysis device 20 and transmitting the control commands to the control unit 11. The control unit 11 performs various processes according to the control commands.

The exercise analysis device 20 includes a control unit 21, a communication unit 22, an operation unit 23, a storage unit 24, a display unit 25, and a sound output unit 26.

The communication unit 22 performs, for example, a process of receiving the packet data transmitted from the sensor unit 10 and transmitting the packet data to the control unit 21 or a process of transmitting a control command from the control unit 21 to the sensor unit 10.

The operation unit 23 performs a process of acquiring operation data from the user and transmitting the operation data to the control unit 21. The operation unit 23 may be, for example, a touch panel type display, a button, a key, or a microphone.

The storage unit 24 is configured by, for example, any of various IC memories such as a read-only memory (ROM), a flash ROM, and a random access memory (RAM) or a recording medium such as a hard disk or a memory card.

The storage unit 24 stores, for example, programs used for the control unit 21 to perform various calculation processes or control processes, or various programs or data used for the control unit 21 to realize application functions. In particular, in the embodiment, the storage unit 24 stores an exercise analysis program which is read by the control unit 21 to perform an analysis process. The exercise analysis program may be stored in advance in a nonvolatile recording medium. Alternatively, the exercise analysis program may be received from a server via a network by the control unit 21 and may be stored in the storage unit 24.

In the embodiment, the storage unit 24 stores body information of the user 2, club specification information indicating the specification of the golf club 3, and sensor-mounted position information. For example, when the user 2 operates the operation unit 23 to input the body information such as a height, a weight, and a sex, the input body information is stored as body information in the storage unit 24. For example, the user 2 operates the operation unit 23 to input a model number of the golf club 3 (or selects the model number from a model number list) to be used and sets specification information regarding the input model number as the club specification information among pieces of specification information for each model number (for example, information regarding the length of the shaft, the position of the center of gravity, a lie angle, a face angle, a loft angle, and the like) stored in advance in the storage unit 24. For example, when the user 2 operates the operation unit 23 to input a distance between the position at which the sensor unit 10 is mounted and the grip end of the golf club 3, information regarding the input distance is stored as the sensor-mounted position information in the storage unit 24. Alternatively, by mounting the sensor unit 10 at a decided predetermined position (for example, a distance of 20 cm from the grip end), information regarding the predetermined position may be stored in advance as the sensor-mounted position information.
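The following sketch shows one possible way to group the stored information; the field names, units, and default values are assumptions made for illustration and do not come from the specification.

from dataclasses import dataclass

@dataclass
class BodyInfo:
    height_mm: float
    weight_kg: float
    sex: str                                   # "male" or "female"

@dataclass
class ClubSpec:
    shaft_length_mm: float
    center_of_gravity_mm: float
    lie_angle_deg: float
    face_angle_deg: float
    loft_angle_deg: float

@dataclass
class SensorMountInfo:
    distance_from_grip_end_mm: float = 200.0   # e.g. a fixed 20 cm mounting position

body = BodyInfo(height_mm=1700, weight_kg=65, sex="male")
club = ClubSpec(shaft_length_mm=1150, center_of_gravity_mm=750,
                lie_angle_deg=60, face_angle_deg=0, loft_angle_deg=10.5)
mount = SensorMountInfo()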

The storage unit 24 is used as a work area of the control unit 21 and temporarily stores, for example, data input from the operation unit 23 and the results of calculations performed by the control unit 21 according to various programs. The storage unit 24 may also store data that needs to be retained for a long time among the data generated through the processes of the control unit 21.

The display unit 25 displays a processing result of the control unit 21 as text, a graph, a table, animations, or another image. The display unit 25 may be, for example, a cathode ray tube (CRT) display, a liquid crystal display (LCD), an electrophoretic display (EPD), a display using an organic light-emitting diode (OLED), a touch panel type display, or a head-mounted display (HMD). The functions of the operation unit 23 and the display unit 25 may be realized by one touch panel type display.

The sound output unit 26 outputs a processing result of the control unit 21 as audio such as a sound or a buzzer tone. The sound output unit 26 may be, for example, a speaker or a buzzer.

The control unit 21 performs a process of transmitting a control command to the sensor unit 10, various calculation processes on data received from the sensor unit 10 via the communication unit 22, and other various control processes according to various programs. In particular, in the embodiment, the control unit 21 executes an exercise analysis program to function as a sensor information acquisition unit 210, a first imaginary plane specifying unit (which corresponds to a first specifying unit according to the invention) 211, a second imaginary plane specifying unit (which corresponds to a second specifying unit according to the invention) 212, an exercise analysis unit 213, a criterion acquisition unit (which corresponds to an acquisition unit according to the invention) 214, a comparison unit 215, an image generation unit 216, and an output processing unit 217. The first and second specifying units may be realized by separate calculation units or may be realized by the same calculation unit.

The control unit 21 may be realized by a computer that includes a central processing unit (CPU) which is a calculation device, a random access memory (RAM) which is a volatile storage device, a ROM which is a non-volatile storage device, an interface (I/F) circuit connecting the control unit 21 to the other units, a bus mutually connecting these units and the like. The computer may include various dedicated processing circuits such as image processing circuits. The control unit 21 may also be realized by an application specific integrated circuit (ASIC) or the like.

The sensor information acquisition unit 210 receives the packet data received from the sensor unit 10 by the communication unit 22 and acquires the time information and the measurement data from the received packet data. The sensor information acquisition unit 210 stores the acquired time information and measurement data in the storage unit 24 in association with each other.

The first imaginary plane specifying unit 211 performs a process of specifying the first line segment 51 in the major axis direction of the shaft of the golf club 3 at the time of stopping of the user, using the measurement data output by the sensor unit 10. Further, the first imaginary plane specifying unit 211 performs a process of specifying the shaft plane (first imaginary plane) 30 (see FIG. 2) including the first line segment 51 and the third line segment 52 indicating the hitting target direction.

The first imaginary plane specifying unit 211 may calculate the coordinates of the position 62 of the grip end of the golf club 3 using the measurement data output by the sensor unit 10 and specify the first line segment 51 based on the coordinates of the position 62 of the grip end. For example, the first imaginary plane specifying unit 211 may calculate an inclination angle (an inclination relative to the horizontal plane (the XY plane) or the vertical plane (the XZ plane)) of the shaft of the golf club 3, using the acceleration data measured by the acceleration sensor 13 at the time of stopping of the user 2 (the time of the address) and specify the first line segment 51 using the calculated inclination angle and information regarding the length of the shaft included in the club specification information.

The first imaginary plane specifying unit 211 may calculate the width of the shaft plane 30 using the length of an arm of the user 2 based on the body information and the length of the first line segment 51.

The second imaginary plane specifying unit 212 performs a process of specifying the second line segment 53 forming a predetermined angle θ relative to the first line segment 51 specified by the first imaginary plane specifying unit 211, using the hitting target direction (the third line segment 52) as the rotation axis. Further, the second imaginary plane specifying unit 212 performs a process of specifying the Hogan's plane (second imaginary plane) 40 (see FIG. 2) including the second line segment 53 and the third line segment 52.

For example, the second imaginary plane specifying unit 212 performs a process of estimating the predetermined position 63 between the head and the chest of the user 2 at the time of stopping of the user 2 (for example, a position on a line segment connecting both shoulders to one another) using the body information and the measurement data output by the sensor unit 10 and specifying the second line segment 53 connecting the estimated predetermined position 63 to the position 61 of the head (blow portion) of the golf club 3. The second imaginary plane specifying unit 212 then performs a process of specifying the Hogan's plane 40 including the second line segment 53 and the third line segment 52.

The second imaginary plane specifying unit 212 may estimate the predetermined position 63 using the coordinates of the position 62 of the grip end calculated by the first imaginary plane specifying unit 211 and the length of the arm of the user 2 based on the body information. Alternatively, the second imaginary plane specifying unit 212 may calculate the coordinates of the position 62 of the grip end of the golf club 3 using the measurement data output by the sensor unit 10. In this case, the first imaginary plane specifying unit 211 may specify the shaft plane 30 using the coordinates of the position 62 of the grip end calculated by the second imaginary plane specifying unit 212.

The second imaginary plane specifying unit 212 may calculate the width of the Hogan's plane 40 using the length of the arm of the user 2 based on the body information and the length of the first line segment 51.

The exercise analysis unit 213 performs a process of analyzing a swing exercise of the user 2 using the measurement data output by the sensor unit 10. Specifically, the exercise analysis unit 213 first calculates an offset amount included in the measurement data using the measurement data (the acceleration data and the angular velocity data) at the time of stopping of the user 2 (the time of the address), which is stored in the storage unit 24. Next, the exercise analysis unit 213 subtracts the offset amount from the measurement data after the start of a swing, which is stored in the storage unit 24, to correct a bias, and calculates the position and posture of the sensor unit 10 during the swing motion of the user 2 using the bias-corrected measurement data.

For example, the exercise analysis unit 213 calculates the position (initial position) of the sensor unit 10 at the time of stopping of the user 2 (the time of the address) in the XYZ coordinate system (global coordinate system), using the acceleration data measured by the acceleration sensor 13, the club specification information, and the sensor-mounted position information, integrates the subsequent acceleration data, and chronologically calculates a change in the position of the sensor unit 10 from the initial position. Since the user 2 stops at a predetermined address posture, the X coordinate of the initial position of the sensor unit 10 is 0. Further, the y axis of the sensor unit 10 is identical to the major axis direction of the shaft of the golf club 3, and the acceleration sensor 13 measures only the gravity acceleration at the time of stopping of the user 2. Therefore, the exercise analysis unit 213 can calculate an inclination angle of the shaft (an inclination relative to the horizontal plane (the XY plane) or the vertical plane (the XZ plane)), using y-axis acceleration data. Then, the exercise analysis unit 213 can calculate the Y and Z coordinates of the initial position of the sensor unit 10 using the inclination angle of the shaft, the club specification information (the length of the shaft), and the sensor-mounted position information (the distance from the grip end) and specify the initial position of the sensor unit 10. Alternatively, the exercise analysis unit 213 may calculate the coordinates of the initial position of the sensor unit 10 using the coordinates of the position 62 of the grip end of the golf club 3 calculated by the first imaginary plane specifying unit 211 or the second imaginary plane specifying unit 212 and the sensor-mounted position information (the distance from the grip end).

The exercise analysis unit 213 calculates the posture (initial posture) of the sensor unit 10 at the time of stopping of the user 2 (the time of the address) in the XYZ coordinate system (global coordinate system), using the acceleration data measured by the acceleration sensor 13, performs rotation calculation using the angular velocity data subsequently measured by the angular velocity sensor 14, and chronologically calculates a change in the posture from the initial posture of the sensor unit 10. The posture of the sensor unit 10 can be expressed by, for example, rotation angles (a roll angle, a pitch angle, and a yaw angle) around the X axis, the Y axis, and the Z axis, Eulerian angles, quaternions, or the like. At the time of stopping of the user 2, the acceleration sensor 13 measures only the gravity acceleration. Therefore, the exercise analysis unit 213 can specify an angle formed between each of the x, y, and z axes of the sensor unit 10 and the gravity direction using the triaxial acceleration data. Since the user 2 stops at the predetermined address posture, the y axis of the sensor unit 10 lies on the YZ plane at the time of stopping of the user 2, and thus the exercise analysis unit 213 can specify the initial posture of the sensor unit 10.
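A minimal Python sketch of this initial position and initial posture estimation is given below. It assumes that the y axis of the sensor unit lies along the shaft and that only the gravity acceleration is measured at the time of address; the concrete formulas, names, and numbers are assumptions consistent with the geometry of FIG. 5 rather than text quoted from the embodiment.

import math
import numpy as np

G = 9.80665                                   # gravity acceleration [m/s^2]

def initial_position(y_accel, shaft_length, dist_from_grip):
    # Initial sensor position in the global XYZ frame with the head at the origin.
    alpha = math.asin(max(-1.0, min(1.0, y_accel / G)))   # shaft inclination (cf. eq. (1))
    r = shaft_length - dist_from_grip                      # distance from the head along the shaft
    return np.array([0.0, r * math.cos(alpha), r * math.sin(alpha)])

def axis_gravity_angles(accel_xyz):
    # Angles (deg) between each sensor axis and the measured acceleration vector;
    # sign conventions depend on the actual sensor, so treat this as illustrative.
    a = np.asarray(accel_xyz, dtype=float)
    a = a / np.linalg.norm(a)
    return np.degrees(np.arccos(np.clip(a, -1.0, 1.0)))

print(initial_position(y_accel=6.9, shaft_length=1.15, dist_from_grip=0.20))
print(axis_gravity_angles([0.2, 6.9, 6.9]))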

The control unit 11 of the sensor unit 10 may calculate the offset amount of the measurement data and correct the bias of the measurement data or a bias correction function may be embedded in the acceleration sensor 13 and the angular velocity sensor 14. In this case, it is not necessary to correct the bias of the measurement data by the exercise analysis unit 213.

The exercise analysis unit 213 defines an exercise analysis model (a double pendulum model or the like) in consideration of the body information (for example, the height, and thus the arm length, of the user 2), the club specification information (for example, the length of the shaft or the position of the center of gravity), the sensor-mounted position information (the distance from the grip end), features of the golf club 3 (for example, that it is a rigid body), and features of a human body (for example, that the bending direction of a joint is constrained), and then calculates a trajectory of the golf club 3 in a swing of the user 2 using the exercise analysis model and the information regarding the position and posture of the sensor unit 10.

The exercise analysis unit 213 detects a series of motions (also referred to as a “rhythm”) from start to end of a swing of the user 2, for example, start of a swing, a backswing, a top, a downswing, an impact, follow-through, and end of the swing, using time information and the measurement data stored in the storage unit 24. For example, the exercise analysis unit 213 calculates a composite value of the measurement data (the acceleration data or the angular velocity data) output by the sensor unit 10 and specifies a timing (time) of an impact by the user 2 based on the composite value.

Using the exercise analysis model and information regarding the position and posture of the sensor unit 10, the exercise analysis unit 213 may generate a rhythm of a swing from a backswing to follow-through, a head speed, an incident angle (club pass) or a face angle at the time of hitting, shaft rotation (a change amount of face angle during a swing), information regarding a deceleration rate or the like of the golf club 3, or information regarding a variation in each piece of information when the user 2 performs the swing a plurality of times.

The criterion acquisition unit 214 acquires information regarding a swing serving as a criterion (also referred to as a criterion swing) used by the comparison unit 215 to be described below. The information regarding the criterion swing includes, for example, an inclination angle (an inclination angle of the shaft plane) of the first line segment 51 serving as the first axis and an inclination angle (an inclination angle of the Hogan's plane) of the second line segment 53 serving as the second axis at the time of address in the criterion swing. The criterion swing is, for example, a previous swing of the user 2 or a swing of another person such as a professional golfer. For example, the criterion acquisition unit 214 receives setting of the criterion swing via the operation unit 23 from the user 2 and stores the information regarding the criterion swing in the storage unit 24.

The comparison unit 215 compares the criterion swing acquired by the criterion acquisition unit 214 to a swing to be compared (also referred to as a comparison target swing). The comparison unit 215 acquires, as information regarding the comparison target swing, for example, an inclination angle (an inclination angle of the shaft plane) of the first line segment 51 serving as the first axis specified by the first imaginary plane specifying unit 211 and an inclination angle (an inclination angle of the Hogan's plane) of the second line segment 53 serving as the second axis specified by the second imaginary plane specifying unit 212 at the time of address in the comparison target swing. For example, the comparison unit 215 compares an inclination angle of the first line segment 51 of the criterion swing to an inclination angle of the first line segment 51 of the comparison target swing and determines a difference or a magnitude relation between these angles. For example, the comparison unit 215 compares a center angle of a V zone formed by the first line segment 51 and the second line segment 53 of the criterion swing to a center angle of a V zone formed by the first line segment 51 and the second line segment 53 of the comparison target swing and determines a difference or a magnitude relation between these angles.
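The comparison performed here can be sketched as follows; the record structure, the angle names, and the way the V-zone center angle is derived (the mean of the two inclination angles) are assumptions introduced only for illustration.

from dataclasses import dataclass

@dataclass
class SwingAngles:
    shaft_plane_deg: float     # inclination angle of the first line segment (shaft plane)
    hogan_plane_deg: float     # inclination angle of the second line segment (Hogan's plane)

    @property
    def v_zone_center_deg(self):
        # Assumed definition: center angle of the V zone between the two line segments.
        return (self.shaft_plane_deg + self.hogan_plane_deg) / 2.0

def compare_swings(criterion, target):
    return {
        "shaft_plane_diff_deg": target.shaft_plane_deg - criterion.shaft_plane_deg,
        "v_zone_center_diff_deg": target.v_zone_center_deg - criterion.v_zone_center_deg,
        "target_is_steeper": target.shaft_plane_deg > criterion.shaft_plane_deg,
    }

print(compare_swings(SwingAngles(48.0, 62.0), SwingAngles(51.0, 63.0)))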

The image generation unit 216 performs a process of generating image data corresponding to an image of an exercise analysis result displayed on the display unit 25. In particular, the image generation unit 216 generates image data including the shaft plane 30 specified by the first imaginary plane specifying unit 211, the Hogan's plane 40 specified by the second imaginary plane specifying unit 212, and the trajectory of the golf club 3 in a swing (in particular, a downswing) of the user 2, which is calculated by the exercise analysis unit 213. For example, the image generation unit 216 generates polygon data of the shaft plane 30 having the four vertexes T1, T2, S1, and S2 illustrated in FIG. 2 based on information regarding the coordinates of T1, T2, S1, and S2 and generates polygon data of the Hogan's plane 40 having the four vertexes T1, T2, H1, and H2 based on information regarding the coordinates of T1, T2, H1, and H2. The image generation unit 216 generates curved-line data indicating the trajectory of the golf club 3 at the time of a downswing of the user 2. Then, the image generation unit 216 generates image data including the polygon data of the shaft plane 30, the polygon data of the Hogan's plane 40, and the curved-line data indicating the trajectory of the golf club 3.
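As a sketch of the image data assembled here, each imaginary plane can be represented as a quadrilateral of four vertexes and the trajectory as a polyline of head positions; the function names and the vertex ordering are illustrative assumptions, not the actual data format of the embodiment.

import numpy as np

def plane_polygon(t1, t2, far1, far2):
    # One imaginary plane as a 4x3 vertex array, ordered T1 -> T2 -> far2 -> far1
    # so that the quadrilateral is traversed along its outline.
    return np.array([t1, t2, far2, far1], dtype=float)

def build_scene(shaft_vertices, hogan_vertices, head_positions):
    # shaft_vertices / hogan_vertices: (T1, T2, S1, S2) and (T1, T2, H1, H2);
    # head_positions: Nx3 head coordinates sampled during the downswing.
    return {
        "shaft_plane": plane_polygon(*shaft_vertices),
        "hogan_plane": plane_polygon(*hogan_vertices),
        "club_trajectory": np.asarray(head_positions, dtype=float),
    }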

For example, the image generation unit 216 generates image data including the shaft plane 30 at the time of address in the criterion swing and the shaft plane 30 at the time of address in the comparison target swing. For example, the image generation unit 216 generates image data including the shaft plane 30 and the Hogan's plane 40 at the time of address in the criterion swing, and the shaft plane 30 and the Hogan's plane 40 at the time of address in the comparison target swing. For example, the image generation unit 216 generates image data indicating a comparison result of the criterion swing and the comparison target swing determined by the comparison unit 215.

The first imaginary plane specifying unit 211, the second imaginary plane specifying unit 212, the exercise analysis unit 213, the criterion acquisition unit 214, the comparison unit 215, and the image generation unit 216 also perform a process of storing various kinds of calculated information in the storage unit 24.

The output processing unit 217 performs a process of causing the display unit 25 to display various images (including not only an image corresponding to the image data generated by the image generation unit 216 but also text or signs or the like) (that is, generates a signal for performing display). For example, the output processing unit 217 causes the display unit 25 to display the image corresponding to the image data generated by the image generation unit 216 automatically or according to an input operation of the user 2 after a swing motion of the user 2 ends. Alternatively, the sensor unit 10 may include a display unit, the output processing unit 217 may transmit the image data to the sensor unit 10 via the communication unit 22, and various images may be displayed on the display unit of the sensor unit 10.

The output processing unit 217 performs a process of causing the sound output unit 26 to output various kinds of audio (also including sound or a buzzer tone or the like) (that is, generates a signal for performing sound output). For example, the output processing unit 217 reads various kinds of information stored in the storage unit 24 and causes the sound output unit 26 to output audio or sound for exercise analysis automatically or at the time of performing a predetermined input operation after a swing motion of the user 2 ends. Alternatively, the sensor unit 10 may include a sound output unit, the output processing unit 217 may transmit various kinds of audio data or sound data to the sensor unit 10 via the communication unit 22, and the sound output unit of the sensor unit 10 may be caused to output the various kinds of audio or sound.

The exercise analysis device 20 or the sensor unit 10 may include a vibration mechanism and various kinds of information may be converted into vibration information by the vibration mechanism to be presented to the user 2.

FIG. 4 is a flowchart illustrating an example of an exercise analysis process. The control unit 21 executes an exercise analysis program stored in the storage unit 24 to perform the exercise analysis process in the procedure of the flowchart illustrated in FIG. 4.

First, the sensor information acquisition unit 210 acquires the measurement data of the sensor unit 10 (step S10). When the control unit 21 acquires the first measurement data in a swing motion (also including a stopping motion) of the user 2, the control unit 21 may perform processes subsequent to step S20 in real time. Alternatively, after the control unit 21 acquires some or all of the series of measurement data in the swing motion of the user 2 from the sensor unit 10, the control unit 21 may perform the processes subsequent to step S20.

Next, the exercise analysis unit 213 detects a stopping motion (address motion) of the user 2 using the measurement data acquired from the sensor unit 10 (step S20). When the control unit 21 performs the process in real time and detects the stopping motion (address motion), for example, the control unit 21 outputs a predetermined image or audio. Alternatively, the sensor unit 10 may include a light-emitting unit such as a light emitting diode (LED) and blink the light-emitting unit to notify the user 2 that the stopped state has been detected, so that the user 2 confirms the notification and subsequently starts a swing.

Next, the first imaginary plane specifying unit 211 specifies the shaft plane 30 (the first imaginary plane) using the measurement data (the measurement data in the stopping motion (address motion) of the user 2) acquired from the sensor unit 10 and the club specification information (step S30).

Next, the second imaginary plane specifying unit 212 specifies the Hogan's plane 40 (the second imaginary plane) using the measurement data (the measurement data in the stopping motion (address motion) of the user 2) acquired from the sensor unit 10 and the body information (step S40).

Next, the exercise analysis unit 213 calculates the initial position and the initial posture of the sensor unit 10 using the measurement data (the measurement data in the stopping motion (address motion) of the user 2) acquired from the sensor unit 10 (step S50).

Next, the exercise analysis unit 213 detects a series of motions (rhythm) from the start of the swing to the end of the swing using the measurement data acquired from the sensor unit 10 (step S60).

The exercise analysis unit 213 calculates the position and posture of the sensor unit 10 during the swing motion of the user 2 concurrently with the process of step S60 (step S70).

Next, the exercise analysis unit 213 calculates the trajectory of the golf club 3 during the swing motion of the user 2 using the rhythm detected in step S60 and the position and posture of the sensor unit 10 calculated in step S70 (step S80).

Next, the image generation unit 216 generates the image data including the shaft plane specified in step S30, the Hogan's plane specified in step S40, and the trajectory of the golf club calculated in step S80 during the swing motion, and then the output processing unit 217 causes the display unit 25 to display the image data (step S90). Then, the control unit 21 ends the processes of the flowchart illustrated in FIG. 4.

In the flowchart of FIG. 4, the sequence of the processes may be appropriately changed within a possible range.

Next, an example of the process (the process of step S30 in FIG. 4) of specifying the shaft plane (the first imaginary plane) will be described in detail.

As illustrated in FIG. 2, the first imaginary plane specifying unit 211 first calculates the coordinates (0, GY, GZ) of the position 62 of the grip end based on the acceleration data at the time of the stopping measured by the sensor unit 10 and the club specification information by using the position 61 of the head of the golf club 3 as the origin O (0, 0, 0) of the XYZ coordinate system (global coordinate system). FIG. 5 is a plan view illustrating the golf club 3 and the sensor unit 10 at the time of stopping of the user 2 (the time of the address) when viewed from the negative side of the X axis. In FIG. 5, the position 61 of the head of the golf club 3 is the origin O (0, 0, 0) and the coordinates of the position 62 of the grip end are (0, GY, GZ). Since the gravity acceleration G is applied to the sensor unit 10 at the time of stopping of the user 2, a relation between the y-axis acceleration y(0) and an inclination angle (an angle formed by the major axis of the shaft and the horizontal plane (XY plane)) α of the shaft of the golf club 3 is expressed in equation (1).


y(0)=G·sin α  (1)

Accordingly, when L1 is the length of the shaft of the golf club 3 included in the club specification information, GY and GZ are calculated using the length L1 and the inclination angle α of the shaft in equations (2) and (3), respectively.


GY=L1·cos α  (2)


GZ=L1·sin α  (3)
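A short numerical sketch of equations (1) to (3) follows; the acceleration and shaft-length values are made up for illustration only.

import math

G = 9.80665                      # gravity acceleration
y0 = 6.9                         # y-axis acceleration at the time of stopping, y(0) in eq. (1)
L1 = 1.15                        # shaft length from the club specification information [m]

alpha = math.asin(y0 / G)        # eq. (1) solved for the inclination angle alpha
GY = L1 * math.cos(alpha)        # eq. (2)
GZ = L1 * math.sin(alpha)        # eq. (3)
print(f"alpha = {math.degrees(alpha):.1f} deg, grip end = (0, {GY:.3f}, {GZ:.3f})")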

Next, the first imaginary plane specifying unit 211 multiplies the coordinates (0, GY, GZ) of the position 62 of the grip end of the golf club 3 by a scale factor S to calculate the coordinates (0, SY, SZ) of a midpoint S3 of the vertexes S1 and S2 of the shaft plane 30. That is, SY and SZ are calculated using equations (4) and (5).


SY=GY·S  (4)


SZ=GZ·S  (5)

FIG. 6 is a diagram illustrating a cross section obtained by cutting the shaft plane 30 in FIG. 2 along the YZ plane when viewed from the negative side of the X axis. The length (the width of the shaft plane 30 in a direction perpendicular to the X axis) of a line segment connecting the origin O to the midpoint S3 of the vertexes S1 and S2 is S times the length L1 of the first line segment 51. The scale factor S is set to a value so that the trajectory of the golf club 3 during the swing motion of the user 2 falls within the shaft plane 30. For example, when L2 is the length of an arm of the user 2, the scale factor S may be set as in equation (6) so that a width S×L1 in the direction perpendicular to the X axis of the shaft plane 30 is twice a sum of the length L1 of the shaft and the length L2 of the arm.

S=2·(L1+L2)/L1  (6)

The length L2 of the arm of the user 2 has correlation with a height L0 of the user 2. For example, based on statistical information, a correlation equation as in equation (7) is expressed when the user 2 is male, and a correlation equation as in equation (8) is expressed when the user 2 is female.


L2=0.41×L0−45.5 [mm]  (7)


L2=0.46×L0−126.9 [mm]  (8)

Accordingly, the length L2 of the arm of the user is calculated by equation (7) or (8) using the height L0 and sex of the user 2 included in the body information.

Next, the first imaginary plane specifying unit 211 calculates the coordinates (−TL/2, 0, 0) of the vertex T1, the coordinates (TL/2, 0, 0) of the vertex T2, the coordinates (−TL/2, SY, SZ) of the vertex S1, and the coordinates (TL/2, SY, SZ) of the vertex S2 of the shaft plane 30 using the coordinates (0, SY, SZ) of the midpoint S3 calculated as described above and the width (the length of the third line segment 52) TL of the shaft plane 30 in the X axis direction. The width TL in the X axis direction is set to a value so that the trajectory of the golf club 3 during the swing motion of the user 2 falls within the shaft plane 30. For example, the width TL in the X axis direction may be set to be the same as the width S×L1 in the direction perpendicular to the X axis, that is, twice the sum of the length L1 of the shaft and the length L2 of the arm.

The shaft plane 30 is specified based on the coordinates of the four vertexes T1, T2, S1, and S2 calculated in this way.
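Putting equations (2) to (8) together, the four vertexes of the shaft plane can be computed as in the sketch below; the input values are illustrative, and setting TL equal to the width perpendicular to the X axis simply follows the example given above.

import math

def arm_length(height_mm, sex):
    # eq. (7) for a male user, eq. (8) for a female user
    return 0.41 * height_mm - 45.5 if sex == "male" else 0.46 * height_mm - 126.9

def shaft_plane_vertices(alpha_rad, L1_mm, height_mm, sex):
    GY, GZ = L1_mm * math.cos(alpha_rad), L1_mm * math.sin(alpha_rad)   # eqs. (2), (3)
    L2 = arm_length(height_mm, sex)
    S = 2.0 * (L1_mm + L2) / L1_mm                                      # eq. (6)
    SY, SZ = GY * S, GZ * S                                             # eqs. (4), (5)
    TL = 2.0 * (L1_mm + L2)                                             # width in the X axis direction
    T1, T2 = (-TL / 2, 0.0, 0.0), (TL / 2, 0.0, 0.0)
    S1, S2 = (-TL / 2, SY, SZ), (TL / 2, SY, SZ)
    return T1, T2, S1, S2

print(shaft_plane_vertices(math.radians(45), 1150, 1700, "male"))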

Next, an example of the process (the process of step S40 in FIG. 4) of specifying the Hogan's plane (the second imaginary plane) will be described in detail.

First, the second imaginary plane specifying unit 212 estimates the predetermined position 63 on the line segment connecting both shoulders of the user 2 to one another to calculate the coordinates (AX, AY, AZ), using the coordinates (0, GY, GZ) of the position 62 of the grip end of the golf club 3 calculated as described above and the body information of the user 2.

FIG. 7 is a diagram illustrating a cross section obtained by cutting the Hogan's plane 40 in FIG. 2 along the YZ plane when viewed from the negative side of the X axis. In FIG. 7, the midpoint of the line segment connecting both shoulders of the user 2 to one another is set as the predetermined position 63, and the predetermined position 63 is present on the YZ plane. Accordingly, the X coordinate AX of the predetermined position 63 is 0. Then, the second imaginary plane specifying unit 212 estimates that a position obtained by moving the position 62 of the grip end of the golf club 3 by the length L2 of the arm of the user 2 in the positive direction of the Z axis is the predetermined position 63. Accordingly, the Y coordinate AY of the predetermined position 63 is the same as the Y coordinate GY of the position 62 of the grip end, and the Z coordinate AZ of the predetermined position 63 is calculated as a sum of the Z coordinate GZ of the position 62 of the grip end and the length L2 of the arm of the user 2, as in equation (9).


AZ=GZ+L2  (9)

The length L2 of the arm of the user is calculated in equation (7) or (8) using the height L0 and sex of the user 2 included in the body information.

Next, the second imaginary plane specifying unit 212 multiplies the Y coordinate AY and the Z coordinate AZ of the predetermined position 63 by a scale factor H to calculate the coordinates (0, HY, HZ) of a midpoint H3 of the vertexes H1 and H2 of the Hogan's plane 40. That is, HY and HZ are calculated using equations (10) and (11).


HY=AY·H  (10)


HZ=AZ·H  (11)

As illustrated in FIG. 7, a length (a width of the Hogan's plane 40 in a direction perpendicular to the X axis) of a line segment connecting the origin O to the midpoint H3 of the vertexes H1 and H2 is H times the length L3 of the second line segment 53. The scale factor H is set to a value so that the trajectory of the golf club 3 during the swing motion of the user 2 falls within the Hogan's plane 40. For example, the Hogan's plane 40 may have the same shape and size as the shaft plane 30. In this case, since a width H×L3 of the Hogan's plane 40 in the direction perpendicular to the X axis is identical to the width S×L1 of the shaft plane 30 in the direction perpendicular to the X axis and is twice the sum of the length L1 of the shaft of the golf club 3 and the length L2 of the arm of the user 2, the scale factor H may be set as in equation (12).

H=2·(L1+L2)/L3  (12)

The length L3 of the second line segment 53 is calculated from equation (13) using the Y coordinate AY and the Z coordinate AZ of the predetermined position 63.


L3=√(AY²+AZ²)  (13)

Next, the second imaginary plane specifying unit 212 calculates the coordinates (−TL/2, 0, 0) of the vertex T1, the coordinates (TL/2, 0, 0) of the vertex T2, the coordinates (−TL/2, HY, HZ) of the vertex H1, and the coordinates (TL/2, HY, HZ) of the vertex H2 of the Hogan's plane 40 using the coordinates (0, HY, HZ) of the midpoint H3 calculated as described above and the width (the length of the third line segment 52) TL of the Hogan's plane 40 in the X axis direction. The width TL in the X axis direction is set to a value so that the trajectory of the golf club 3 during the swing motion of the user 2 falls within the Hogan's plane 40. In the embodiment, for example, the width TL of the Hogan's plane 40 in the X axis direction may be set to be the same as the width of the shaft plane 30 in the X axis direction, and thus may be set to be twice the sum of the length L1 of the shaft and the length L2 of the arm, as described above.

The Hogan's plane 40 is specified based on the coordinates of the four vertexes T1, T2, H1, and H2 calculated in this way.
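The corresponding sketch for equations (9) to (13) is shown below; it reuses the arm_length() helper from the shaft-plane sketch above, and the input values are illustrative only.

import math

def hogan_plane_vertices(GY, GZ, L1, L2, TL):
    AY, AZ = GY, GZ + L2                       # eq. (9); AX = 0 because position 63 lies on the YZ plane
    L3 = math.hypot(AY, AZ)                    # eq. (13)
    H = 2.0 * (L1 + L2) / L3                   # eq. (12), assuming the same size as the shaft plane
    HY, HZ = AY * H, AZ * H                    # eqs. (10), (11)
    T1, T2 = (-TL / 2, 0.0, 0.0), (TL / 2, 0.0, 0.0)
    H1, H2 = (-TL / 2, HY, HZ), (TL / 2, HY, HZ)
    return T1, T2, H1, H2

print(hogan_plane_vertices(GY=813.2, GZ=813.2, L1=1150, L2=651.5, TL=3603.0))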

Next, an example of the process (the process of step S60 in FIG. 4) of detecting a series of motions (rhythm) from the start of the swing to the end of the swing of the user 2 will be described in detail.

The exercise analysis unit 213 detects a series of motions (rhythm) from the start of the swing to the end of the swing, for example, the start of the swing, a backswing, a top, a downswing, an impact, follow-through, and the end of the swing, using the measurement data acquired from the sensor unit 10. A specific rhythm detection procedure is not particularly limited. For example, the following procedure can be adopted.

First, the exercise analysis unit 213 calculates a sum (referred to as a composite value or a norm) of the magnitudes of the angular velocities around the axes at each time t using the acquired angular velocity data of each time t. The exercise analysis unit 213 may integrate the norm of the angular velocities at each time t by time.
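As a rough sketch of this step, one plausible composite value is the Euclidean norm of the three angular velocity components. The following Python snippet (the function name and the array layout are assumptions, not the embodiment's implementation) computes such a norm for a sequence of samples:

import numpy as np

def angular_velocity_norm(gyro_xyz):
    # gyro_xyz: array of shape (N, 3) holding the angular velocities about the
    # x, y, and z axes (dps) at each time t.
    # Returns the norm n(t) = sqrt(x(t)^2 + y(t)^2 + z(t)^2) for each sample.
    gyro = np.asarray(gyro_xyz, dtype=float)
    return np.sqrt(np.sum(gyro ** 2, axis=1))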

Here, a case will be considered in which the angular velocities around the three axes (the x, y, and z axes) are shown in a graph as illustrated, for example, in FIG. 8 (which is a diagram illustrating examples of angular velocities output from the sensor unit). In FIG. 8, the horizontal axis represents a time (msec) and the vertical axis represents an angular velocity (dps). The norm of the angular velocities is shown in the graph illustrated in, for example, FIG. 9 (which is a diagram illustrating an example of the norm of the angular velocities). In FIG. 9, the horizontal axis represents a time (msec) and the vertical axis represents the norm of the angular velocities. A differential value of the norm of the angular velocities is shown in a graph illustrated in, for example, FIG. 10 (which is a diagram illustrating an example of the differential value of the norm of the angular velocities). In FIG. 10, the horizontal axis represents a time (msec) and the vertical axis represents the differential value of the norm of the angular velocities. FIGS. 8 to 10 are presented to facilitate understanding of the embodiment and do not show accurate values.

The exercise analysis unit 213 detects a timing of an impact in the swing using the calculated norm of the angular velocities. For example, the exercise analysis unit 213 detects a timing at which the norm of the angular velocities is the maximum as the timing of the impact (T5 in FIG. 9). For example, the exercise analysis unit 213 may detect the earlier of the timings at which the differential value of the calculated norm of the angular velocities is the maximum and the minimum as the timing of the impact (T5 in FIG. 10).

For example, the exercise analysis unit 213 detects a timing at which the calculated norm of the angular velocities is the minimum before the impact as a timing of a top of the swing (T3 in FIG. 9). For example, the exercise analysis unit 213 specifies a period in which the norm of the angular velocities is continuously equal to or less than a first threshold value before the impact, as a top period (which is an accumulation period at the top) (T2 to T4 in FIG. 9).

For example, the exercise analysis unit 213 detects a timing at which the norm of the angular velocities is equal to or less than a second threshold value before the top, as a timing of the start of the swing (T1 in FIG. 9).

For example, the exercise analysis unit 213 detects a timing at which the norm of the angular velocities is the minimum after the impact, as a timing of the end (finish) of the swing (T7 in FIG. 9). For example, the exercise analysis unit 213 may detect a first timing at which the norm of the angular velocities is equal to or less than a third threshold value after the impact, as the timing of the end (finish) of the swing. For example, the exercise analysis unit 213 specifies a period in which the norm of the angular velocities is continuously equal to or less than a fourth threshold value after, and close to, the timing of the impact, as a finish period (T6 to T8 in FIG. 9).

In this way, the exercise analysis unit 213 can detect the rhythm of the swing. The exercise analysis unit 213 can specify each period (for example, a backswing period from the start of the swing to the start of the top, a downswing period from the end of the top to the impact, and a follow-through period from the impact to the end of the swing) during the swing by detecting the rhythm.
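The following Python sketch illustrates one plausible implementation of the timing detection described above, operating on a precomputed norm sequence (for example, the output of angular_velocity_norm above). The threshold arguments stand in for the second and third threshold values, and taking the last qualifying timing before the top as the start of the swing is one reading of the text, not the embodiment's definitive procedure:

import numpy as np

def detect_swing_rhythm(norm, start_threshold, finish_threshold):
    # norm: 1-D array of the angular velocity norm at each sampled time.
    norm = np.asarray(norm, dtype=float)

    # Impact: the timing at which the norm is the maximum (T5 in FIG. 9).
    impact = int(np.argmax(norm))

    # Top: the timing at which the norm is the minimum before the impact (T3).
    top = int(np.argmin(norm[:impact])) if impact > 0 else 0

    # Start of the swing: a timing before the top at which the norm is equal to
    # or less than the second threshold value (T1); here, the last such timing.
    before_top = np.flatnonzero(norm[:top] <= start_threshold)
    start = int(before_top[-1]) if before_top.size else 0

    # End (finish) of the swing: the first timing after the impact at which the
    # norm is equal to or less than the third threshold value (T7).
    after_impact = np.flatnonzero(norm[impact:] <= finish_threshold)
    finish = int(impact + after_impact[0]) if after_impact.size else len(norm) - 1

    return start, top, impact, finish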

Next, an example of the process (the process of step S90 in FIG. 4) of generating and displaying the image data indicating the V zone and the trajectory will be described in detail.

FIG. 11 is a diagram illustrating an example of an image including the shaft plane and the Hogan's plane, in which the shaft plane and the Hogan's plane are projected to the YZ plane.

An image 500 includes polygon data 501 indicating the shaft plane 30, polygon data 502 indicating the Hogan's plane 40, and a curved line 503 indicating the trajectory of the golf club 3 at the time of a downswing of the user 2. In the image 500, a V zone which is a space between the polygon data 501 and the polygon data 502 can be recognized.

In FIG. 11, when the V zone is displayed, the V zone need not necessarily be displayed with the planes; only the first line segment 51 (or a straight line along the first line segment 51) included in the shaft plane 30 and the second line segment 53 (or a straight line along the second line segment 53) included in the Hogan's plane 40 may be displayed. The image illustrated in FIG. 11 may be a 3-dimensional image whose display angle (a viewpoint for viewing the image) can be changed according to an operation of the user 2.

Next, a process of comparing the criterion swing to the comparison target swing will be described in detail. Through the above-described exercise analysis process, information regarding each of the previous swings of the user 2 is stored in the storage unit 24.

For example, the output processing unit 217 receives a predetermined operation including designation of a swing from the user 2 via the operation unit 23 and causes the display unit 25 to display a screen 600 illustrated in FIG. 12 (which is a diagram illustrating an example of a screen for displaying an address posture).

The screen 600 includes a swing image field 610, a previous swing button 620, a subsequent swing button 630, a criterion swing registration button 640, an inclination angle comparison button 650, a V zone comparison button 660 and the like.

For example, based on the information regarding the designated swing, the output processing unit 217 displays the image 500 (the polygon data 501 of the shaft plane 30 and the polygon data 502 of the Hogan's plane 40) regarding this swing in the swing image field 610. Of course, the curved line 503 indicating the trajectory of the golf club 3 at the time of the swing may be displayed. For example, when the previous swing button 620 or the subsequent swing button 630 is operated via the operation unit 23, the output processing unit 217 displays the image 500 regarding a previous swing or a subsequent swing of the displayed swing in a predetermined procedure in the swing image field 610.

For example, when the criterion swing registration button 640 is operated via the operation unit 23, the criterion acquisition unit 214 sets the swing displayed in the swing image field 610 as a criterion swing and acquires information regarding the criterion swing.

Of course, a setting procedure of the information regarding the criterion swing is not limited to the foregoing procedure. For example, the criterion acquisition unit 214 may receive selection of a criterion swing among the previous swings stored in the storage unit 24 or an external storage device and acquire information regarding the selected criterion swing. For example, the criterion acquisition unit 214 may receive setting of an acquisition destination (for example, a URL or a file path) of information regarding a criterion swing and acquire the information regarding the criterion swing from the acquisition destination. For example, when a predetermined operation (for example, an operation of tapping the shaft of the golf club 3 a predetermined number of times) is detected based on a signal of at least one of the acceleration sensor 13 and the angular velocity sensor 14 of the sensor unit 10, the criterion acquisition unit 214 may select the most recently analyzed swing as a criterion swing and acquire information regarding the selected swing. For example, information regarding a swing of a golfer other than the user 2, such as a professional golfer, may be accumulated in the storage unit 24 or an external storage device in association with body information of the golfer, and the criterion acquisition unit 214 may receive the body information of the user 2, specify a golfer whose body information (for example, height) is close or identical to that of the user 2, and set the swing of that golfer as a criterion swing.

For example, when the inclination angle comparison button 650 is operated via the operation unit 23, the comparison unit 215 compares an inclination angle of the criterion swing to an inclination angle of the comparison target swing. In the embodiment, the comparison target swing is a swing that is displayed in the swing image field 610. Of course, for example, the comparison unit 215 may receive setting of the comparison target swing via the operation unit 23. The comparison unit 215 acquires the information regarding the criterion swing and the information regarding the comparison target swing.

FIG. 13 is a diagram illustrating examples of a procedure and a screen for comparing inclination angles. The comparison unit 215 compares a shaft plane 30a (an inclination angle αa) of the comparison target swing to a shaft plane 30b (an inclination angle αb) of the criterion swing. Specifically, for example, the comparison unit 215 calculates a difference between the inclination angles of the shaft planes 30a and 30b or determines a magnitude relation between them. The image generation unit 216 generates the image 500 including polygon data 501a of the shaft plane 30a, polygon data 501b of the shaft plane 30b, and a message 510 according to the comparison result obtained by the comparison unit 215 (for example, a message indicating the difference or the magnitude relation between the shaft planes 30a and 30b), and displays the image 500 in the swing image field 610 (see FIG. 12). In the example of FIG. 13, since the inclination angle of the shaft plane 30a is smaller than the inclination angle of the shaft plane 30b, a message “INCLINATION ANGLE IS SMALLER THAN CRITERION” is displayed. Further, it may be determined whether the difference between the inclination angles of the shaft planes 30a and 30b exceeds a predetermined angle, and a message may be displayed when the difference exceeds the predetermined angle.
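A minimal sketch of this inclination angle comparison, assuming the inclination angles are available as numbers and adding an assumed counterpart message for the case in which the inclination angle is larger than the criterion (only the "smaller" message appears in FIG. 13), might look as follows; the function name and the optional predetermined angle argument are hypothetical:

def compare_shaft_plane_inclinations(alpha_target, alpha_criterion, predetermined_angle=None):
    # alpha_target: inclination angle of the shaft plane 30a of the comparison target swing.
    # alpha_criterion: inclination angle of the shaft plane 30b of the criterion swing.
    diff = alpha_target - alpha_criterion
    if predetermined_angle is not None and abs(diff) <= predetermined_angle:
        return diff, None  # difference within the predetermined angle: no message
    if diff < 0:
        return diff, "INCLINATION ANGLE IS SMALLER THAN CRITERION"
    if diff > 0:
        return diff, "INCLINATION ANGLE IS LARGER THAN CRITERION"  # assumed counterpart
    return diff, None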

Referring back to FIG. 12 for description, the comparison unit 215 compares the V zone of the criterion swing to the V zone of the comparison target swing, for example, when the V zone comparison button 660 is operated via the operation unit 23. In the embodiment, the comparison target swing is a swing that is displayed in the swing image field 610. Of course, for example, the comparison unit 215 may receive setting of the comparison target swing via the operation unit 23. The comparison unit 215 acquires the information regarding the criterion swing and the information regarding the comparison target swing.

FIG. 14 is a diagram illustrating examples of a procedure and a screen for comparing V zones. The comparison unit 215 compares the shaft plane 30a (the inclination angle αa) and a Hogan's plane 40a (an inclination angle αaa) of the comparison target swing to the shaft plane 30b (the inclination angle αb) and a Hogan's plane 40b (an inclination angle αbb) of the criterion swing. Specifically, for example, the comparison unit 215 calculates a difference between a center angle (predetermined angle θa) of a V zone formed by the shaft plane 30a and the Hogan's plane 40a and a center angle (predetermined angle θb) of a V zone formed by the shaft plane 30b and the Hogan's plane 40b, or determines a magnitude relation between the two center angles. The image generation unit 216 generates the image 500 including the polygon data 501a of the shaft plane 30a, polygon data 502a of the Hogan's plane 40a, the polygon data 501b of the shaft plane 30b, polygon data 502b of the Hogan's plane 40b, and a message 510 according to the comparison result obtained by the comparison unit 215 (for example, a message indicating the difference or the magnitude relation between the center angles θa and θb), and displays the image 500 in the swing image field 610 (see FIG. 12). In the example of FIG. 14, since the center angle θa is less than the center angle θb, a message “V ZONE IS NARROWER THAN CRITERION” is displayed. It may be determined whether the difference between the center angles θa and θb exceeds a predetermined angle, and a message may be displayed when the difference exceeds the predetermined angle.
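A corresponding sketch for the V zone comparison is shown below. Here the center angle θ is taken as the difference between the inclination angles of the Hogan's plane and the shaft plane of each swing, which is one plausible definition, and the "wider" message is an assumed counterpart to the message in FIG. 14:

def compare_v_zones(alpha_shaft_a, alpha_hogan_a, alpha_shaft_b, alpha_hogan_b,
                    predetermined_angle=None):
    # Center angles of the V zones, taken here as the angle between the Hogan's
    # plane and the shaft plane of each swing.
    theta_a = alpha_hogan_a - alpha_shaft_a  # comparison target swing
    theta_b = alpha_hogan_b - alpha_shaft_b  # criterion swing
    diff = theta_a - theta_b
    message = None
    if predetermined_angle is None or abs(diff) > predetermined_angle:
        if diff < 0:
            message = "V ZONE IS NARROWER THAN CRITERION"
        elif diff > 0:
            message = "V ZONE IS WIDER THAN CRITERION"  # assumed counterpart
    return theta_a, theta_b, message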

In the foregoing example, by displaying the screen 600 on the display unit 25, the user 2 is informed of the comparison result of the criterion swing and the comparison target swing. However, the user 2 may be informed according to another method.

For example, the sensor unit 10 may include a light-emitting unit (also referred to as a display unit) such as an LED. In this case, the output processing unit 217 causes the light-emitting unit to emit light in a light emission form (for example, a light emission pattern or a light emission color) according to the comparison result obtained by the comparison unit 215 (that is, generates a signal for performing the light emission). Specifically, for example, the output processing unit 217 causes the light-emitting unit to emit light in a predetermined light emission form when the difference between the shaft planes 30a and 30b exceeds the predetermined angle. For example, when the inclination angle of the shaft plane 30a is less than that of the shaft plane 30b, the output processing unit 217 causes the light-emitting unit to emit light in a first light emission form. When the inclination angle of the shaft plane 30a is greater than that of the shaft plane 30b, the output processing unit 217 causes the light-emitting unit to emit light in a second light emission form. In this way, the user 2 can simply recognize the magnitude relation of the inclination angle of the shaft plane of the comparison target swing relative to the criterion swing even if the user 2 does not view the screen.

For example, when the difference between the center angles θa and θb exceeds a predetermined angle, the output processing unit 217 causes the light-emitting unit to emit light in a predetermined light emission form. For example, when the center angle θa is less than the center angle θb, the output processing unit 217 causes the light-emitting unit to emit light in a third light emission form. When the center angle θa is greater than the center angle θb, the output processing unit 217 causes the light-emitting unit to emit light in a fourth light emission form. In this way, the user 2 can recognize the magnitude relation of the center angle of the V zone of the comparison target swing relative to the criterion swing simply even if the user 2 does not view the screen.
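One plausible way to combine the light emission examples above into a single selection routine is sketched below; the form identifiers are placeholders, since the concrete light emission patterns and colors are device-specific and are not defined in the embodiment:

def select_light_emission_forms(diff_shaft_inclination, diff_center_angle, predetermined_angle):
    # diff_shaft_inclination: inclination angle of the shaft plane 30a minus that of 30b.
    # diff_center_angle: center angle θa minus center angle θb.
    forms = []
    if abs(diff_shaft_inclination) > predetermined_angle:
        # First form when the comparison target shaft plane is less inclined than
        # the criterion, second form when it is more inclined.
        forms.append("first_form" if diff_shaft_inclination < 0 else "second_form")
    if abs(diff_center_angle) > predetermined_angle:
        # Third form when the V zone is narrower than the criterion, fourth form
        # when it is wider.
        forms.append("third_form" if diff_center_angle < 0 else "fourth_form")
    return forms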

The exercise analysis device 20 may include a light-emitting unit and the output processing unit 217 may cause the light-emitting unit of the exercise analysis device 20 to emit light in a light emission form according to a comparison result obtained by the comparison unit 215.

A timing at which the user 2 is informed of the comparison result using the light-emitting unit included in the sensor unit 10 or the exercise analysis device 20, as described above, is not limited to a timing at which designation of a swing from the user 2 on the screen 600 is received. For example, when the analysis of the swing in the flowchart of FIG. 4 ends (for example, after step S80 ends), the comparison unit 215 may compare a preset criterion swing to a comparison target swing (the swing for which the analysis has just ended) and the output processing unit 217 may cause the light-emitting unit to emit light in a light emission form according to a comparison result obtained by the comparison unit 215. In this way, the user 2 can simply recognize, in substantially real time, the result of comparing a just-finished swing to the criterion swing.

For example, the sensor unit 10 may include a sound output unit such as a speaker. In this case, the output processing unit 217 causes the sound output unit to output sound according to a comparison result obtained by the comparison unit 215. Specifically, for example, when the difference between the shaft planes 30a and 30b exceeds a predetermined angle, the output processing unit 217 causes the sound output unit to output predetermined sound. For example, when the inclination angle of the shaft plane 30a is less than that of the shaft plane 30b, the output processing unit 217 causes the sound output unit to output first sound. When the inclination angle of the shaft plane 30a is greater than that of the shaft plane 30b, the output processing unit 217 causes the sound output unit to output second sound. In this way, the user 2 can simply recognize the magnitude relation of the inclination angle of the shaft plane of the comparison target swing relative to the criterion swing even if the user 2 does not view the screen.

For example, when the difference between the center angles θa and θb exceeds a predetermined angle, the output processing unit 217 causes the sound output unit to output predetermined sound. For example, when the center angle θa is less than the center angle θb, the output processing unit 217 causes the sound output unit to output third sound. When the center angle θa is greater than the center angle θb, the output processing unit 217 causes the sound output unit to output fourth sound. In this way, the user 2 can recognize the magnitude relation of the center angle of the V zone of the comparison target swing relative to the criterion swing simply even if the user 2 does not view the screen.

The exercise analysis device 20 may include a sound output unit 26 and the output processing unit 217 may cause the sound output unit 26 of the exercise analysis device 20 to output sound according to a comparison result obtained by the comparison unit 215.

A timing at which the user 2 is informed of the comparison result using the sound output unit included in the sensor unit 10 or the exercise analysis device 20, as described above, is not limited to a timing at which designation of a swing from the user 2 on the screen 600 is received. For example, when the analysis of the swing in the flowchart of FIG. 4 ends (for example, after step S80 ends), the comparison unit 215 may compare a preset criterion swing to a comparison target swing (the swing for which the analysis has just ended) and the output processing unit 217 may cause the sound output unit to output sound according to a comparison result obtained by the comparison unit 215. In this way, the user 2 can simply recognize, in substantially real time, the result of comparing a just-finished swing to the criterion swing.

In the foregoing examples, one criterion swing and one comparison target swing are displayed on the screen 600, but an image including information regarding a plurality of swings may be displayed.

For example, the comparison unit 215 receives selection of a plurality of swings to be displayed among the previous swings stored in the storage unit 24 via the operation unit 23. The output processing unit 217 causes the display unit 25 to display a screen including an image 700 illustrated in FIG. 15 (which is a diagram illustrating an example of an image in which parameters of the V zone are plotted).

The image 700 includes a graph in which the horizontal axis represents the inclination angle α of the shaft plane and the vertical axis represents the center angle θ of the V zone. The image generation unit 216 acquires the parameters (the inclination angle α of the shaft plane 30 and the center angle θ of the V zone formed by the shaft plane 30 and the Hogan's plane 40) of the V zone of each of the selected swings from the storage unit 24 and generates a graph in which these values are plotted. Here, the image generation unit 216 may draw a dot 710 indicating a recent swing and dots 720 indicating the other swings in different display forms so that the dot 710 and the dots 720 can be distinguished. In the example of FIG. 15, the dot 710 is shown in white and the dots 720 are shown in black.

By displaying such a graph, the user 2 can simply recognize variations in the position, inclination, area, and the like of the V zone of each swing. Further, the user 2 can simply compare the V zone of the recent swing to the V zones of the previous swings and recognize the state of the V zone of the recent swing. For each swing, a graph in which only one of the inclination angle α of the shaft plane and the center angle θ of the V zone is plotted may be generated.
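A minimal sketch of such a plot, using matplotlib and assuming the per-swing parameters have already been read from the storage unit 24, might look as follows; the drawing style of the dot 710 and the dots 720 is only an approximation of FIG. 15:

import matplotlib.pyplot as plt

def plot_v_zone_parameters(shaft_inclinations, center_angles, recent_index):
    # shaft_inclinations, center_angles: per-swing parameters of the V zone.
    # recent_index: index of the recent swing to be highlighted (dot 710).
    others_x = [x for i, x in enumerate(shaft_inclinations) if i != recent_index]
    others_y = [y for i, y in enumerate(center_angles) if i != recent_index]

    plt.scatter(others_x, others_y, c="black", label="previous swings")            # dots 720
    plt.scatter([shaft_inclinations[recent_index]], [center_angles[recent_index]],
                facecolors="white", edgecolors="black", label="recent swing")       # dot 710
    plt.xlabel("inclination angle α of shaft plane")
    plt.ylabel("center angle θ of V zone")
    plt.legend()
    plt.show()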

Here, the criterion acquisition unit 214 may obtain the degree (for example, a standard deviation σ) of the variation in a parameter regarding the V zone of each swing (at least one of the inclination angle of the shaft plane, the inclination angle of the Hogan's plane, and the center angle of the V zone). The comparison unit 215 may specify the V zone whose parameter has a deviated value (for example, a value exceeding the average value of the parameter plus 2σ or less than the average value minus 2σ) and the swing of that V zone. For example, based on the degree of the variation, the comparison unit 215 may determine whether a comparison target swing has the deviated value. In this case, the image generation unit 216 may draw a dot indicating a swing having the deviated value and dots indicating the other swings in different display forms so that the dots can be distinguished from each other. In this way, the user 2 can simply recognize, for example, a swing which is considerably different from a swing at the usual time.

When the analysis of a swing in the flowchart of FIG. 4 ends (for example, after step S80 ends), the criterion acquisition unit 214 may obtain the degree of a variation in a parameter based on the parameter of the V zone of each swing and the comparison unit 215 may specify a V zone having the parameter of a deviated value and a swing of the V zone. For example, based on the degree of the variation, the comparison unit 215 may determine whether an analyzed recent swing (comparison target swing) has the deviated value. In this case, for example, when the parameter of the recent swing has the deviated value, the output processing unit 217 may cause a light-emitting unit to emit light in a light emission form indicating that the parameter of the recent swing has the deviated value or cause a sound output unit to output sound indicating that the parameter of the recent swing has the deviated value. In this way, the user 2 can recognize, for example, whether a swing just finished is considerably different from a swing at the usual time simply and in substantially real time.
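A simple sketch of the deviated-value check, assuming the parameter values of the previous swings are available as a list and using the average value ±2σ criterion mentioned above, could be:

import statistics

def has_deviated_value(previous_values, target_value, n_sigma=2.0):
    # previous_values: parameter values (e.g. the V zone center angles) of the
    # previous swings; target_value: the parameter of the comparison target swing.
    mean = statistics.mean(previous_values)
    sigma = statistics.pstdev(previous_values)
    return (target_value > mean + n_sigma * sigma
            or target_value < mean - n_sigma * sigma)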

In the foregoing examples, the criterion acquisition unit 214 acquires one swing as the criterion swing, but may acquire a plurality of swings as criterion swings.

For example, the criterion acquisition unit 214 acquires information regarding all of the swings stored in the storage unit 24 or swings selected from among them. Then, the criterion acquisition unit 214 calculates an average value of the inclination angles of the shaft plane 30 or an average value of the center angles of the V zones formed by the shaft plane 30 and the Hogan's plane 40 based on the information regarding the plurality of swings, and uses the average values as the inclination angle and the center angle of a criterion swing. In this way, the user 2 can recognize the result of comparison with an average swing rather than with a single specific swing.
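A minimal sketch of building such an averaged criterion, assuming each stored swing provides its shaft plane inclination angle and V zone center angle, might be:

import statistics

def averaged_criterion(swings):
    # swings: iterable of (shaft_plane_inclination, v_zone_center_angle) pairs
    # for the swings used to build the criterion.
    inclinations, center_angles = zip(*swings)
    return statistics.mean(inclinations), statistics.mean(center_angles)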

The embodiment of the invention has been described above. According to the embodiment, since the user can objectively recognize the address posture from the positions and inclinations of the shaft plane and the Hogan's plane, the size of the V zone, and the like, it is possible to estimate the goodness and badness of a swing more simply. Since the user can recognize the positional relation between the trajectory of the golf club and the shaft plane and the Hogan's plane at the time of a swing, it is possible to estimate the goodness and badness of a swing more accurately than in the related art.

According to the embodiment, by imposing the restriction that the user performs address so that the major axis of the shaft of the golf club is perpendicular to the target line, the exercise analysis device can specify the third line segment indicating the target direction of the hitting using the measurement data of the sensor unit at the time of the address. Accordingly, the exercise analysis device can appropriately specify the shaft plane in accordance with the direction of the third line segment.

According to the embodiment, since the predetermined position on the line segment connecting both shoulders of the user in the Hogan's plane is specified in consideration of the body information of the user, the shaft plane and the Hogan's plane can be specified using the measurement data of one sensor unit. It is possible to specify the Hogan's plane suitable for the body shape of the user more accurately than a case in which an imaginary plane obtained by rotating the shaft plane by a predetermined angle (for example, 30°) is specified as the Hogan's plane.

According to the embodiment, the criterion swing and the comparison target swing are compared. Thus, for example, since the user can objectively recognize a difference between the previous and current address postures, the user can estimate goodness and badness of a swing simply and accurately.

According to the embodiment, the inclination angles of the shaft planes of the criterion swing and the comparison target swing are compared. Thus, for example, since the user can objectively recognize a difference between the previous and current inclination angles of the shaft, the user can recognize a distance between his or her body and a ball at the time of address simply and accurately and estimate goodness and badness of a swing.

According to the embodiment, the center angles of the V zones of the criterion swing and the comparison target swing are compared. Thus, for example, since the user can objectively recognize a difference between the areas of the previous and current V zones, the user can recognize a way to bend his or her waist at the time of address simply and accurately and estimate goodness and badness of a swing.

According to the embodiment, since the shaft plane and the Hogan's plane are specified using the sensor unit, it is not necessary to use a large-scale device such as a camera and restriction of a place where a swing is analyzed is small.

The invention is not limited to the above-described embodiments, but may be modified in various forms within the scope of the gist of the invention.

In the foregoing embodiments, the shaft plane or the like of the criterion swing and the shaft plane or the like of the comparison target swing are included in the image 500 in an overlapping manner (see FIGS. 12 and 13), but the shaft planes may be included in different images and these images may be displayed side by side.

In the foregoing embodiments, the criterion acquisition unit 214 may acquire a plurality of swings as criterion swings and the image generation unit 216 may include the shaft planes or the like of the plurality of criterion swings and the shaft plane or the like of the comparison target swing in the image 500 in an overlapping manner. Of course, the shaft planes or the like of the plurality of criterion swings may be included in different images and these images may be displayed side by side. Similarly, a plurality of comparison target swings may be able to be selected.

In the foregoing embodiments, when information regarding swings is managed according to the specifications of clubs, a restriction may be imposed so that the criterion swing and the comparison target swing are compared only in regard to identical or similar club specifications.

In the foregoing embodiments, for example, the inclination angles of the shaft planes or the center angles of the V zones at the time of address are compared to each other for the criterion swing and the comparison target swing. However, the inclination angles of the shaft planes or the center angles of the V zones at other time points during a swing may be specified and compared. For example, the first imaginary plane specifying unit 211 specifies the first line segment 51 and the shaft plane 30 at the time of impact during a swing, using the measurement data output by the sensor unit 10. The second imaginary plane specifying unit 212 specifies the second line segment 53 and the Hogan's plane 40 at the time of the impact during the swing, using the body information and the measurement data output by the sensor unit 10. The comparison unit 215 may then compare the inclination angles of the shaft planes or the center angles of the V zones at the time of impact in regard to the criterion swing and the comparison target swing.

In the foregoing embodiments, the inclination angles of the shaft planes are compared, but the inclination angles of the Hogan's planes may be compared.

In the foregoing embodiments, the second imaginary plane specifying unit 212 specifies the second line segment 53 using the body information and the measurement data output by the sensor unit 10. However, a process of specifying the second line segment 53 connecting the positions 63 and 62 of the head (the blow portion) of the golf club 3 may be performed using the first line segment 51 specified by the first imaginary plane specifying unit 211 and the predetermined angle θ relative to the first line segment 51.

In the foregoing embodiments, the exercise analysis device 20 specifies the shaft plane and the Hogan's plane using the measurement data of the sensor unit 10 mounted on the golf club 3 and calculates the trajectory of the golf club 3 during the swing. Alternatively, for example, the shaft plane and the Hogan's plane may be specified and the trajectory of the golf club 3 may be calculated using the measurement data of a sensor unit 10 mounted on an arm (wrist or the like) of the user 2 in the same method as that of the foregoing embodiments. Alternatively, a plurality of sensor units 10 may be mounted on parts such as the golf club 3 or the arms, shoulders, or the like of the user 2. Then, the shaft plane and the Hogan's plane may be specified and the trajectory of the golf club 3 may be calculated using the measurement data of each of the plurality of sensor units 10.

In the foregoing embodiments, the acceleration sensor 13 and the angular velocity sensor 14 are embedded in the sensor unit 10 to be integrated, but the acceleration sensor 13 and the angular velocity sensor 14 may not be integrated. Alternatively, the acceleration sensor 13 and the angular velocity sensor 14 may not be embedded in the sensor unit 10, but may be mounted directly on the golf club 3 or the user 2. In the foregoing embodiments, the sensor unit 10 and the exercise analysis device 20 are separated, but the sensor unit 10 and the exercise analysis device 20 may be integrated to be mounted on the golf club 3 or the user 2.

In the foregoing embodiments, the exercise analysis device 20 calculates the Z coordinate AZ of the predetermined position 63 on the line segment connecting both shoulders of the user 2 to one another as the sum of the Z coordinate GZ of the position 62 of the grip end and the length L2 of the arm of the user 2, as in equation (9), but another equation may be used. For example, the exercise analysis device 20 may multiply L2 by a coefficient K and add GZ to calculate AZ, as in AZ=GZ+K·L2.

In the foregoing embodiments, the exercise analysis system (the exercise analysis device) analyzing a golf swing has been exemplified. However, the invention can be applied to an exercise analysis system (exercise analysis device) analyzing swings of various exercises of tennis, baseball, and the like.

The above-described embodiments and modification examples are merely examples and the invention is not limited thereto. For example, the embodiments and the modification examples can also be appropriately combined.

The configuration of the exercise analysis system 1 illustrated in FIG. 3 is classified according to main processing content in order to facilitate understanding of the configuration of the exercise analysis system 1. The invention is not limited by the method of classifying the constituent elements or the names of the constituent elements. The configuration of the exercise analysis system 1 can be divided into more constituent elements according to the processing content. One constituent element can also be configured to perform more processes. The process of each constituent element may be performed by one piece of hardware or may be performed by a plurality of pieces of hardware. The assignment of the processes or functions to the constituent elements is not limited to the above description as long as the goal and advantage of the invention can be achieved. In the foregoing embodiments, the sensor unit 10 and the exercise analysis device 20 have been described as separate elements, but the function of the exercise analysis device 20 may be mounted on the sensor unit 10.

Units of processes in the flowchart illustrated in FIG. 4 are divided according to main processing content in order to facilitate understanding of the process of the exercise analysis device 20. The invention is not limited by the method of dividing the units of processes or the names of the units of processes. The process of the exercise analysis device 20 can be divided into more units of processes according to the processing content. One unit of process can also be divided to include more processes. The processing procedure of the foregoing flowchart is not limited to the example illustrated in the drawing.

The entire disclosure of Japanese Patent Application No. 2014-257255, filed Dec. 19, 2014 is expressly incorporated by reference herein.

Claims

1. An exercise analysis device comprising:

a first specifying unit that specifies an inclination of a first axis which lies in a longitudinal direction of a shaft of an exercise tool at a time of address or a time of impact in a swing, using output of an inertial sensor;
an acquisition unit that acquires a first criterion axis to be compared to the first axis; and
a comparison unit that compares an inclination of the first axis to an inclination of the first criterion axis.

2. The exercise analysis device according to claim 1,

wherein the comparison unit compares magnitudes of the inclination of the first axis and the inclination of the first criterion axis, and
wherein the exercise analysis device further comprises:
an output processing unit that performs output according to a comparison result.

3. The exercise analysis device according to claim 2,

wherein the output processing unit generates a signal for making report to a user when a difference between the inclination of the first axis and the inclination of the first criterion axis exceeds a predetermined value.

4. The exercise analysis device according to claim 2,

wherein the output processing unit generates a signal for making different report to a user according to a difference between the inclination of the first axis and the inclination of the first criterion axis.

5. The exercise analysis device according to claim 3,

wherein the output processing unit generates a signal for making report to the user by light emission.

6. The exercise analysis device according to claim 5,

wherein the output processing unit generates a signal for causing a type of light emission to be different according to the comparison result.

7. The exercise analysis device according to claim 1,

wherein the first specifying unit specifies a first imaginary plane including the first axis and a third axis indicating a hitting direction,
wherein the acquisition unit acquires a first criterion imaginary plane including the first criterion axis and the third axis, and
wherein the comparison unit compares an inclination of the first imaginary plane to an inclination of the first criterion imaginary plane.

8. An exercise analysis system comprising:

the exercise analysis device according to claim 1; and
an inertial sensor.

9. An exercise analysis system comprising:

the exercise analysis device according to claim 2; and
an inertial sensor.

10. An exercise analysis system comprising:

the exercise analysis device according to claim 3; and
an inertial sensor.

11. An exercise analysis system comprising:

the exercise analysis device according to claim 4; and
an inertial sensor.

12. An exercise analysis method comprising:

specifying an inclination of a first axis which lies in a major axis direction of a shaft of an exercise tool at a time of address or a time of impact in a swing, using output of an inertial sensor;
acquiring a first criterion axis to be compared to the first axis; and
comparing an inclination of the first axis to an inclination of the first criterion axis.

13. The exercise analysis method according to claim 12,

wherein the comparing of the inclinations includes comparing magnitudes of the inclination of the first axis and the inclination of the first criterion axis, and performing output according to a comparison result.

14. The exercise analysis method according to claim 13,

wherein in the performing of the output, a signal for making report to a user is generated when a difference between the inclination of the first axis and the inclination of the first criterion axis exceeds a predetermined value.

15. The exercise analysis method according to claim 13,

wherein in the performing of the output, a signal for making different report to a user is generated according to a difference between the inclination of the first axis and the inclination of the first criterion axis.

16. The exercise analysis method according to claim 14,

wherein in the performing of the output, a signal for making report to the user by light emission is generated.

17. The exercise analysis method according to claim 16,

wherein in the performing of the output, a signal for causing a type of light emission to be different is generated according to the comparison result.

18. The exercise analysis method according to claim 12,

wherein in the specifying of the inclination of the first axis, a first imaginary plane including the first axis and a third axis indicating a hitting direction is specified,
wherein in the acquiring of the first criterion axis, a first criterion imaginary plane including the first criterion axis and the third axis is acquired, and
wherein in the comparing of the inclinations, an inclination of the first imaginary plane is compared to an inclination of the first criterion imaginary plane.

19. A program causing a computer to implement:

specifying an inclination of a first axis which lies in a longitudinal direction of a shaft of an exercise tool at a time of address or a time of impact in a swing, using output of an inertial sensor;
acquiring a first criterion axis to be compared to the first axis; and
comparing an inclination of the first axis to an inclination of the first criterion axis.

20. A recording medium that records a program causing a computer to implement:

specifying an inclination of a first axis which lies in a longitudinal direction of a shaft of an exercise tool at a time of address or a time of impact in a swing, using output of an inertial sensor;
acquiring a first criterion axis to be compared to the first axis; and
comparing an inclination of the first axis to an inclination of the first criterion axis.
Patent History
Publication number: 20160175680
Type: Application
Filed: Dec 9, 2015
Publication Date: Jun 23, 2016
Applicant: SEIKO EPSON CORPORATION (Tokyo)
Inventor: Masafumi SATO (Hara-mura)
Application Number: 14/963,649
Classifications
International Classification: A63B 69/36 (20060101);