ELECTRONIC DEVICE, DISPLAY METHOD, DISPLAY SYSTEM, AND RECORDING MEDIUM

- SEIKO EPSON CORPORATION

An electronic device includes a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state.

CROSS-REFERENCES TO RELATED APPLICATIONS

This application claims priority to Japanese Patent Application No. 2016-213378, filed Oct. 31, 2016, and Japanese Patent Application No. 2016-213379, filed Oct. 31, 2016, both of which are herein incorporated by reference.

BACKGROUND

1. Technical Field

The present invention relates to an electronic device, a display method, a display system, and a recording medium.

2. Related Art

JP-A-2006-221251 (Patent Literature 1) describes a system for a triathlon in which athletes record passing times using digital pens at the switching gates between the athletic events of swimming, bicycling, and running (a start gate of a bicycle event and a start gate of a running event). Information concerning the recorded times is transmitted to a server via communication terminals, and the server tabulates the passing times at the switching gates for each of the athletes, calculates times of the athletic events, and generates a print including a result sheet.

However, in the system described in Patent Literature 1, the athletes need to perform work to record the passing times using the digital pens at the switching gates of the athletic events. This work takes time, so the times required for switching (transition) between the athletic events (exercise events) increase, which increases the total times of the athletic events.

SUMMARY

An advantage of some aspects of the invention is to provide an electronic device, a display method, a display system, and a recording medium that make it unnecessary for a user to perform work when an exercise event carried out by the user is switched.

The invention can be implemented as the following forms or application examples.

APPLICATION EXAMPLE 1

An electronic device according to this application example includes a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state.

The electronic device according to this application example may include a position-information generating unit configured to generate the position information on the basis of the satellite signal transmitted from the position information satellite.

With the electronic device according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, and the second exercise state in which the user is carrying out the second exercise event. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.

APPLICATION EXAMPLE 2

The electronic device according to the application example may include a notifying unit configured to notify a state discriminated by the processing unit.

The notifying unit may notify the state with at least one of display, sound, and vibration.

With the electronic device according to this application example, since the electronic device notifies the discriminated state, the user can confirm whether the notified state is correct. Alternatively, a person (e.g., a coach) different from the user can confirm the notified state of the user.

APPLICATION EXAMPLE 3

In the electronic device according to the application example, the processing unit may determine on the basis of the position information whether the user passes the first position and whether the user passes the second position, determine that the user is in the first exercise state until the user passes the first position, determine that the user is in the first transition state until the user passes the second position after passing the first position, and determine that the user is in the second exercise state after the user passes the second position.

With the electronic device according to this application example, since a state of the user is switched when the user passes the first position or the second position, it is possible to discriminate the state of the user.
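By way of a non-limiting illustration only, the sketch below (in Python) shows one possible way a processing unit might perform such pass-based discrimination; the haversine distance helper, the 15 m pass radius, and the state names are assumptions introduced for this example and are not specified in the application example.

    import math

    def distance_m(p, q):
        # Approximate great-circle distance in meters between (lat, lon) pairs.
        R = 6371000.0
        lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
        a = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(a))

    def discriminate(state, fix, first_position, second_position, radius_m=15.0):
        # The user is regarded as having "passed" a registered position when a
        # positioning fix falls within radius_m of it (an assumed criterion).
        if state == "first exercise" and distance_m(fix, first_position) < radius_m:
            return "first transition"
        if state == "first transition" and distance_m(fix, second_position) < radius_m:
            return "second exercise"
        return state

Each new positioning fix is fed to discriminate() together with the current state, so the state advances from the first exercise state to the first transition state when the first position is passed, and to the second exercise state when the second position is passed.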

APPLICATION EXAMPLE 4

In the electronic device according to the application example, the processing unit may calculate times respectively required for the first exercise state, the second exercise state, and the first transition state.

With the electronic device according to this application example, it is possible to calculate a time required for carrying out the first exercise event, a time required for switching from the first exercise event to the second exercise event, and a time required for carrying out the second exercise event by the user.

APPLICATION EXAMPLE 5

In the electronic device according to the application example, the plurality of states may include a third exercise state in which the user is carrying out a third exercise event and a second transition state halfway in transition from the second exercise state to the third exercise state, and the processing unit may discriminate the plurality of states on the basis of a third position and a fourth position registered in advance.

With the electronic device according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, the second exercise state in which the user is carrying out the second exercise event, the second transition state halfway in transition from the second exercise state to the third exercise state, and the third exercise state in which the user is carrying out the third exercise event. Therefore, the user does not need to perform work when the exercise event carried out by the user is switched from the first exercise event to the second exercise event and when the exercise event carried out by the user is switched from the second exercise event to the third exercise event.

APPLICATION EXAMPLE 6

In the electronic device according to the application example, the processing unit may determine on the basis of the position information whether the user passes the third position and whether the user passes the fourth position, determine that the user is in the second exercise state until the user passes the third position, determine that the user is in the second transition state until the user passes the fourth position after passing the third position, and determine that the user is in the third exercise state after the user passes the fourth position.

With the electronic device according to this application example, since a state of the user is switched when the user passes the third position or the fourth position, it is possible to discriminate the state of the user.

APPLICATION EXAMPLE 7

In the electronic device according to the application example, the processing unit may calculate times respectively required for the third exercise state and the second transition state.

With the electronic device according to this application example, it is possible to calculate a time required for switching from the second exercise event to the third exercise event and a time required for carrying out the third exercise event by the user.

APPLICATION EXAMPLE 8

In the electronic device according to the application example, the processing unit may discriminate the plurality of states on the basis of at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, the position information, and the first position and the second position.

The electronic device according to this application example may include the first motion sensor or may include the pressure sensor. The electronic device according to this application example may include a position-information generating unit configured to generate the position information on the basis of the satellite signal transmitted from the position information satellite.

The first motion sensor may be an acceleration sensor.

With the electronic device according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event. Therefore, the user does not need to perform work when the exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.

APPLICATION EXAMPLE 9

In the electronic device according to the application example, the processing unit may calculate moving speed on the basis of the position information, determine whether the output signal of the first motion sensor has periodicity, detect a change in pressure on the basis of the output signal of the pressure sensor, and discriminate the plurality of states on the basis of the moving speed, whether the output signal of the first motion sensor has periodicity, and the change in the pressure.

With the electronic device according to this application example, it is possible to recognize a change in speed of the user, a change in exercise of the user, and a change in a peripheral environment of the user and discriminate a state of the user on the basis of the moving speed, whether a waveform of the output signal of the first motion sensor has periodicity, and the change in the pressure.
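As a non-limiting sketch only, the discrimination described in this application example might be organized as below (in Python); the autocorrelation-based periodicity check, the speed and pressure thresholds, and the triathlon-style state names are assumptions introduced for the example.

    import numpy as np

    def has_periodicity(signal, min_lag=10, max_lag=200, threshold=0.6):
        # Assumed criterion: a normalized autocorrelation peak above a threshold
        # within a lag window covering plausible stroke, pedal, or step periods.
        x = np.asarray(signal, dtype=float)
        x = x - x.mean()
        if len(x) <= min_lag or not x.any():
            return False
        ac = np.correlate(x, x, mode="full")[len(x) - 1:]
        ac = ac / ac[0]
        return ac[min_lag:max_lag].max() > threshold

    def discriminate(speed_mps, accel_window, pressure_change_hpa):
        # Illustrative thresholds only: a high moving speed suggests bicycling,
        # a periodic acceleration waveform at moderate speed suggests running,
        # and a clear pressure change (e.g., water pressure on the wrist)
        # suggests swimming; otherwise a transition is assumed.
        if speed_mps > 6.0:
            return "bike"
        if has_periodicity(accel_window) and speed_mps > 1.5:
            return "run"
        if abs(pressure_change_hpa) > 0.5:
            return "swim"
        return "transition"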

APPLICATION EXAMPLE 10

In the electronic device according to the application example, the processing unit may discriminate the plurality of states on the basis of an output signal of a second motion sensor.

The processing unit may determine whether a waveform of the output signal of the second motion sensor has periodicity and discriminate the plurality of states on the basis of whether the waveform has periodicity.

The second motion sensor may be an angular velocity sensor.

With the electronic device according to this application example, it is possible to more accurately discriminate a state of the user on the basis of the position information and at least either one of the output signal of the first motion sensor and the output signal of the pressure sensor and on the basis of the output signal of the second motion sensor.

APPLICATION EXAMPLE 11

In the electronic device according to the application example, the processing unit may discriminate the plurality of states on the basis of an output signal of a temperature sensor.

The processing unit may detect a change in temperature on the basis of the output signal of the temperature sensor and discriminate the plurality of states on the basis of the change in the temperature.

With the electronic device according to this application example, it is possible to more accurately discriminate a state of the user on the basis of the position information and at least either one of the output signal of the first motion sensor and the output signal of the pressure sensor and on the basis of the output signal of the temperature sensor.

APPLICATION EXAMPLE 12

In the electronic device according to the application example, the first exercise event may be swimming, the second exercise event may be bicycling, and the third exercise event may be running.

With the electronic device according to this application example, it is possible to discriminate a state of the user who carries out a triathlon.

APPLICATION EXAMPLE 13

A display method according to this application example includes: discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state; and displaying a discriminated state.

With the display method according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition. Alternatively, a person (e.g., a coach) different from the user can visually recognize a state of the user displayed on a display unit and give advice or the like to the user.

APPLICATION EXAMPLE 14

In the display method according to the application example, when the discriminated state is the first transition state, an image including a flashing object may be displayed.

With the display method according to this application example, it is possible to highlight and display a state during switching from the first exercise event to the second exercise event by the user.

APPLICATION EXAMPLE 15

In the display method according to the application example, when the discriminated state is the first transition state, an elapsed time from a start of the first transition state may be displayed.

With the display method according to this application example, it is possible to cause the user and the like to recognize a time required for switching from the first exercise event to the second exercise event by the user.

APPLICATION EXAMPLE 16

In the display method according to the application example, the elapsed time may be displayed to be comparable with a target time set in advance.

With the display method according to this application example, it is possible to cause the user and the like to recognize whether a time required for switching from the first exercise event to the second exercise event by the user is long or short compared with the set target time.
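As a purely illustrative sketch, the comparison could be rendered as text in which the elapsed time and its signed difference from the preset target appear side by side; the formatting and the function name below are assumptions introduced for the example.

    def format_transition_display(elapsed_s, target_s):
        # Show the elapsed transition time next to the signed difference from a
        # target time set in advance so that the two are directly comparable.
        diff = int(elapsed_s) - int(target_s)
        sign = "+" if diff >= 0 else "-"
        mm, ss = divmod(int(elapsed_s), 60)
        dm, ds = divmod(abs(diff), 60)
        return f"T1 {mm:02d}:{ss:02d} ({sign}{dm:02d}:{ds:02d} vs target)"

    # Example: format_transition_display(95, 80) returns "T1 01:35 (+00:15 vs target)".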

APPLICATION EXAMPLE 17

In the display method according to the application example, the plurality of states may be discriminated on the basis of at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, the position information, and the first position and the second position, and the discriminated state may be displayed.

With the display method according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition. Alternatively, a person (e.g., a coach) different from the user can visually recognize a state of the user displayed on the display unit and give advice or the like to the user.

APPLICATION EXAMPLE 18

A display system according to this application example includes: a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state; and a display unit configured to display a discriminated state.

With the display system according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition. Alternatively, a person (e.g., a coach) different from the user can visually recognize a state of the user displayed on the display unit and give advice or the like to the user.

APPLICATION EXAMPLE 19

A computer-readable recording medium according to an application example 19 records therein a computer program for causing a computer to execute a step of discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state.

With the computer program and the recording medium according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.

APPLICATION EXAMPLE 20

The recording medium according to the application example is a computer-readable recording medium having recorded therein a computer program for causing the computer to execute steps of: determining on the basis of the position information whether the user passes the first position and whether the user passes the second position; determining that the user is in the first exercise state until the user passes the first position; determining that the user is in the first transition state until the user passes the second position after passing the first position; and determining that the user is in the second exercise state after the user passes the second position.

With the recording medium according to this application example, since a state of the user is switched when the user passes the first position or the second position, it is possible to discriminate the state of the user. Therefore, the user does not need to perform work when the exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.

APPLICATION EXAMPLE 21

In the electronic device according to Application Example 1 or 2, a plurality of the first positions and a plurality of the second positions may be registered in advance, and the processing unit may determine on the basis of the position information whether the user passes any one of the plurality of first positions and whether the user passes any one of the plurality of second positions, determine that the user is in the first exercise state until the user passes any one of the plurality of first positions, determine that the user is in the first transition state until the user passes any one of the plurality of second positions after passing any one of the plurality of first positions, and determine that the user is in the second exercise state after the user passes any one of the plurality of second positions.

With the electronic device according to this application example, a state of the user is switched when the user passes any one of the plurality of first positions or any one of the plurality of second positions. Therefore, compared with when only one first position and one second position are registered, it is possible to more accurately discriminate the state of the user.

APPLICATION EXAMPLE 22

In the electronic device according to Application Example 5, a plurality of the third positions and a plurality of the fourth positions may be registered in advance, and the processing unit may determine on the basis of the position information whether the user passes any one of the plurality of third positions and whether the user passes any one of the plurality of fourth positions, determine that the user is in the second exercise state until the user passes any one of the plurality of third positions, determine that the user is in the second transition state until the user passes any one of the plurality of fourth positions after passing any one of the plurality of third positions, and determine that the user is in the third exercise state after the user passes any one of the plurality of fourth positions.

With the electronic device according to this application example, a state of the user is switched when the user passes any one of the plurality of third positions or any one of the plurality of fourth positions. Therefore, compared with when only one third position and one fourth position are registered, it is possible to more accurately discriminate the state of the user.

APPLICATION EXAMPLE 23

A computer program according to this application example causes a computer to execute a step of discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state.

With the computer program according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, and the second exercise state in which the user is carrying out the second exercise event. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.

APPLICATION EXAMPLE 24

An electronic device according to this application example includes a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, a plurality of states including a first exercise state in which a user is carrying out a first exercise event and a second exercise state in which the user is carrying out a second exercise event.

The electronic device according to this application example may include the first motion sensor or may include the pressure sensor. The electronic device according to this application example may include a position-information generating unit configured to generate the position information on the basis of the satellite signal transmitted from the position information satellite.

The first motion sensor may be an acceleration sensor.

With the electronic device according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.

APPLICATION EXAMPLE 25

The electronic device according to the application example 24 may include a notifying unit configured to notify the state discriminated by the processing unit.

The notifying unit may notify the state with at least one of display, sound, and vibration.

With the electronic device according to this application example, since the electronic device notifies the discriminated state, the user can confirm whether the notified state is correct. Alternatively, a person (e.g., a coach) different from the user can confirm the notified state of the user.

APPLICATION EXAMPLE 26

In the electronic device according to the application example 24 or 25, the processing unit may calculate moving speed on the basis of the position information, determine whether a waveform of the output signal of the first motion sensor has periodicity, detect a change in pressure on the basis of the output signal of the pressure sensor, and discriminate the plurality of states on the basis of the moving speed, whether the waveform has periodicity, and the change in the pressure.

With the electronic device according to this application example, it is possible to recognize a change in speed of the user, a change in exercise of the user, and a change in a peripheral environment of the user and discriminate a state of the user on the basis of the moving speed, whether the waveform of the output signal of the first motion sensor has periodicity, and the change in the pressure.

APPLICATION EXAMPLE 27

In the electronic device according to any one of the application examples 24 to 26, the plurality of states may include a third exercise state in which the user is carrying out a third exercise event.

With the electronic device according to this application example, it is possible to discriminate a state of the user who carries out an athletic competition including three or more exercise events.

APPLICATION EXAMPLE 28

In the electronic device according to the application example 27, the plurality of states may include a first transition state halfway in transition from the first exercise state to the second exercise state and a second transition state halfway in transition from the second exercise state to the third exercise state.

With the electronic device according to this application example, it is possible to discriminate whether a state of the user is a state in which the user is carrying out the first exercise event, a state in which the user is switching from the first exercise event to the second exercise event, a state in which the user is carrying out the second exercise event, a state in which the user is switching from the second exercise event to the third exercise event, or a state in which the user is carrying out the third exercise event.

APPLICATION EXAMPLE 29

In the electronic device according to the application example 28, the processing unit may calculate times respectively required for the first exercise state, the second exercise state, the third exercise state, the first transition state, and the second transition state.

With the electronic device according to this application example, it is possible to calculate a time required for carrying out the first exercise event, a time required for switching from the first exercise event to the second exercise event, a time required for carrying out the second exercise event, a time required for switching from the second exercise event to the third exercise event, and a time required for carrying out the third exercise event by the user.

APPLICATION EXAMPLE 30

In the electronic device according to the application example 28 or 29, the first exercise event may be swimming, the second exercise event may be bicycling, and the third exercise event may be running.

With the electronic device according to this application example, it is possible to discriminate a state of the user who carries out a triathlon.

APPLICATION EXAMPLE 31

In the electronic device according to any one of the application examples 24 to 30, the processing unit may discriminate the plurality of states on the basis of an output signal of a second motion sensor.

The processing unit may determine whether a waveform of the output signal of the second motion sensor has periodicity and discriminate the plurality of states on the basis of whether the waveform has periodicity.

The second motion sensor may be an angular velocity sensor.

With the electronic device according to this application example, it is possible to more accurately discriminate a state of the user on the basis of the position information and at least either one of the output signal of the first motion sensor and the output signal of the pressure sensor and on the basis of the output signal of the second motion sensor.

APPLICATION EXAMPLE 32

In the electronic device according to any one of the application examples 24 to 31, the processing unit may discriminate the plurality of states on the basis of an output signal of a temperature sensor.

The processing unit may detect a change in temperature on the basis of the output signal of the temperature sensor and discriminate the plurality of states on the basis of the change in the temperature.

With the electronic device according to this application example, it is possible to more accurately discriminate a state of the user on the basis of the position information and at least either one of the output signal of the first motion sensor and the output signal of the pressure sensor and on the basis of the output signal of the temperature sensor.

APPLICATION EXAMPLE 33

A display method according to this application example includes: discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, a plurality of states including a first exercise state in which a user is carrying out a first exercise event and a second exercise state in which the user is carrying out a second exercise event; and displaying a discriminated state.

With the display method according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition. Alternatively, a person (e.g., a coach) different from the user can visually recognize a state of the user displayed on a display unit and give advice or the like to the user.

APPLICATION EXAMPLE 34

In the display method according to the application example 33, the plurality of states may include a first transition state halfway in transition from the first exercise state to the second exercise state.

With the display method according to this application example, it is possible to discriminate whether a state of the user is a state in which the user is carrying out the first exercise event, a state in which the user is switching from the first exercise event to the second exercise event, or a state in which the user is carrying out the second exercise event and display the discriminated state.

APPLICATION EXAMPLE 35

In the display method according to the application example 34, when the discriminated state is the first transition state, an image including a flashing object may be displayed.

With the display method according to this application example, it is possible to highlight and display a state during switching from the first exercise event to the second exercise event by the user.

APPLICATION EXAMPLE 36

In the display method according to the application example 34 or 35, when the discriminated state is the first transition state, an elapsed time from a start of the first transition state may be displayed.

With the display method according to this application example, it is possible to cause the user and the like to recognize a time required for switching from the first exercise event to the second exercise event by the user.

APPLICATION EXAMPLE 37

In the display method according to the application example 36, the elapsed time may be displayed to be comparable with a target time set in advance.

With the display method according to this application example, it is possible to cause the user and the like to recognize whether a time required for switching from the first exercise event to the second exercise event by the user is long or short compared with the set target time.

APPLICATION EXAMPLE 38

A display system according to this application example includes: a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, a plurality of states including a first exercise state in which a user is carrying out a first exercise event and a second exercise state in which the user is carrying out a second exercise event; and a display unit configured to display a discriminated state.

With the display system according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition. Alternatively, a person (e.g., a coach) different from the user can visually recognize a state of the user displayed on the display unit and give advice or the like to the user.

APPLICATION EXAMPLE 39

A computer program according to this application example causes a computer to execute a step of discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, a plurality of states including a first exercise state in which a user is carrying out a first exercise event and a second exercise state in which the user is carrying out a second exercise event.

APPLICATION EXAMPLE 40

A computer-readable recording medium according to an application example records therein a computer program for causing a computer to execute a step of discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, a plurality of states including a first exercise state in which a user is carrying out a first exercise event and a second exercise state in which the user is carrying out a second exercise event.

With the computer program and the recording medium according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.

BRIEF DESCRIPTION OF THE DRAWINGS

The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.

FIG. 1 is a diagram showing a configuration example of an exercise information management system in a first embodiment.

FIG. 2 is an explanatory diagram concerning an overview of the exercise information management system in the first embodiment.

FIG. 3 is a diagram showing an example concerning a course used for a triathlon and registration of positions in the first embodiment.

FIG. 4 is an example of a functional block diagram of a user terminal.

FIG. 5 is a flowchart for explaining an example of a procedure of a part of processing performed by a processing unit of the user terminal.

FIG. 6 is a flowchart for explaining an example of details of state discrimination processing in the first embodiment.

FIG. 7 is a diagram showing a display example of states of a user that the processing unit causes a display unit to display.

FIG. 8 is a diagram showing an example of images at the time when a state of a user is “first transition”.

FIG. 9 is a diagram showing an example of an image displayed on a display unit of an information terminal.

FIG. 10 is a diagram showing an example of an image displayed on the display unit of the information terminal.

FIG. 11 is a diagram showing an example of an image displayed on the display unit of the information terminal.

FIG. 12 is a diagram showing an example of an image displayed on the display unit of the information terminal.

FIG. 13 is a diagram showing an example concerning registration of positions in a second embodiment.

FIG. 14 is a flowchart for explaining an example of details of state discrimination processing in the second embodiment.

FIG. 15 is a diagram showing an example concerning registration of positions in a third embodiment.

FIG. 16 is a flowchart for explaining an example of details of state discrimination processing in the third embodiment.

FIG. 17 is a diagram showing an example of a course used in a triathlon.

FIG. 18 is a flowchart for explaining an example of details of state discrimination processing in a fourth embodiment.

FIG. 19 is a flowchart for explaining swim determination processing in a fifth embodiment.

FIG. 20 is a flowchart for explaining an example of first transition determination processing in the fifth embodiment.

FIG. 21 is a flowchart for explaining an example of bike determination processing in the fifth embodiment.

FIG. 22 is a flowchart for explaining an example of second transition determination processing in the fifth embodiment.

FIG. 23 is a flowchart for explaining an example of run determination processing in the fifth embodiment.

FIG. 24 is a flowchart for explaining an example of swim determination processing in a sixth embodiment.

FIG. 25 is a flowchart for explaining an example of first transition determination processing in the sixth embodiment.

FIG. 26 is a flowchart for explaining an example of bike determination processing in the sixth embodiment.

FIG. 27 is a flowchart for explaining an example of second transition determination processing in the sixth embodiment.

FIG. 28 is a flowchart for explaining an example of run determination processing in the sixth embodiment.

FIG. 29 is a flowchart for explaining an example of details of state discrimination processing in a seventh embodiment.

FIG. 30 is a flowchart for explaining an example of running movement A determination processing.

FIG. 31 is a flowchart for explaining an example of running movement B determination processing.

FIG. 32 is a flowchart for explaining an example of running movement C determination processing.

FIG. 33 is a flowchart for explaining an example of running movement D determination processing.

FIG. 34 is a diagram showing another example of the images at the time when the state of the user is the “first transition”.

FIG. 35 is a diagram showing another example of the images at the time when the state of the user is the “first transition”.

FIG. 36 is a diagram showing another example of an image at the time when the state of the user is “bike”.

FIG. 37 is a diagram showing another example of the image at the time when the state of the user is the “first transition”.

FIG. 38 is a diagram showing a disposition example of a plurality of pressure sensors in a modification.

DESCRIPTION OF EXEMPLARY EMBODIMENTS

Preferred embodiments of the invention are explained in detail below with reference to the drawings. Note that the embodiments explained below do not unduly limit the contents of the invention described in the appended claims. Not all of the components explained below are essential constituent elements of the invention.

In the following explanation, an exercise information management system that manages a state of exercise of a user who carries out a triathlon as a competition including a plurality of athletic events (exercise events) is explained as an example.

Note that, in an exercise information management system 1 in second to seventh embodiments and modifications, the same components as components in a first embodiment are denoted by the same reference numerals and signs and explanation of the components is omitted or simplified. Differences from the first embodiment or the fourth embodiment are mainly explained.

1. First Embodiment

1-1. Configuration of a System

FIG. 1 is a diagram showing a configuration example of an exercise information management system 1 in a first embodiment. As shown in FIG. 1, the exercise information management system 1 in this embodiment includes a user terminal 3, a server 4, and information terminals 5. The server 4 is connected to a network 6 such as the Internet or a LAN (Local Area Network).

A user 2 carries the user terminal 3 (an example of an “electronic device”) and carries out a triathlon. The user 2 may carry out the triathlon in a tournament or may carry out the triathlon in a practice. The triathlon is configured by three athletic events (exercise events) of swim (swimming), bike (bicycling), and run (running). The user 2 carries out the events in the order of the swim, the bike, and the run.

As shown in FIG. 2, in this embodiment, the user terminal 3 is a wrist-type (wristwatch-type) electronic device and mounted on a wrist or the like of the user 2. The user terminal 3 can receive satellite signals transmitted from GPS (Global Positioning System) satellites 7 (an example of a “position information satellite”) and transmit exercise information of the user 2 to the information terminal 5 (5a). Note that FIG. 2 is a diagram at the time when the user 2 is carrying out the run.

FIG. 3 is a diagram showing an example of a course used for the triathlon. A solid line C1 represents a course of the swim, a broken line C2 represents a course of the bike, and an alternate long and short dash line C3 represents a course of the run. A sign S1 represents a start point of the swim (a start point of the triathlon), a sign S2 represents a start point of the bike, and a sign S3 represents a start point of the run. A sign G1 represents a goal point of the swim, a sign G2 represents a goal point of the bike, and a sign G3 represents a goal point of the run (a goal point of the triathlon). A sign TA represents a transition area.

In the triathlon, for example, an elapsed time from the time when the user 2 starts from the start point S1 of the swim until the user 2 passes the start point S2 of the bike is regarded as the time required for the swim (a swim time). An elapsed time from the time when the user 2 passes the start point S2 of the bike until the user 2 passes the start point S3 of the run is regarded as the time required for the bike (a bike time). An elapsed time from the time when the user 2 passes the start point S3 of the run until the user 2 passes the goal point G3 of the run is regarded as the time required for the run (a run time).

In this case, the swim time includes an elapsed time (a first transition time) from the time when the user 2 passes the goal point G1 of the swim until the user 2 passes the start point S2 of the bike, that is, a sum of the time in which the user 2 moves from the goal point G1 of the swim to the transition area TA, the time required in the transition area TA for a change of equipment (e.g., putting on bicycle shoes, a helmet, sunglasses, and the like) and the like, and the time in which the user 2 moves to the start point S2 of the bike.

Similarly, the bike time includes an elapsed time (a second transition time) from the time when the user 2 passes the goal point G2 of the bike until the user 2 passes the start point S3 of the run, that is, a sum of the time in which the user 2 moves from the goal point G2 of the bike to a changing place in the transition area TA, the time required for a change of equipment (e.g., removing the helmet, the sunglasses, and the bicycle shoes and putting on running shoes) and the like, and the time in which the user 2 moves to the start point S3 of the run. A sum of the swim time, the bike time, and the run time is the total time.
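For concreteness, the accounting above can be written out with illustrative numbers (the values below are assumptions, not measurements from the application):

    # Illustrative per-state elapsed times in seconds.
    T_swim, T_tran1, T_bike, T_tran2, T_run = 1800, 180, 4200, 120, 2400

    swim_time = T_swim + T_tran1    # elapsed from the start S1 until passing S2
    bike_time = T_bike + T_tran2    # elapsed from passing S2 until passing S3
    run_time = T_run                # elapsed from passing S3 until passing G3
    total_time = swim_time + bike_time + run_time  # 8700 s in this example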

In this embodiment, as shown in FIG. 3, before starting the triathlon, the user 2 registers the position (the latitude and the longitude) of the goal point G1 of the swim or the vicinity of the goal point G1 as a position P1 (an example of a “first position”), registers the position of the start point S2 of the bike or the vicinity of the start point S2 as a position P2 (an example of a “second position”), registers the position of the goal point G2 of the bike or the vicinity of the goal point G2 as a position P3 (an example of a “third position”), and registers the position of the start point S3 of the run or the vicinity of the start point S3 as a position P4 (an example of a “fourth position”) in advance in a storing unit 140 (see FIG. 4) of the user terminal 3.

The user 2 may actually go to the goal point G1 of the swim, the start point S2 of the bike, the goal point G2 of the bike, and the start point S3 of the run and operate an operation unit 120 (see FIG. 4) of the user terminal 3 to register the positions (the latitudes and the longitudes) of present places in the storing unit 140 as the positions P1, P2, P3, and P4.

Alternatively, the user 2 may select, in the information terminal 5, positions respectively corresponding to the goal point G1 of the swim, the start point S2 of the bike, the goal point G2 of the bike, and the start point S3 of the run on map data of an area where the triathlon is performed. The user terminal 3 may receive information concerning the selected positions (the latitudes and the longitudes) via a communication unit 170 (see FIG. 4) and register the information in the storing unit 140 as the positions P1, P2, P3, and P4.
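As an illustration only, either registration route might store simple latitude/longitude pairs keyed by the position names; the function names and the coordinate values below are assumptions introduced for the example.

    # Hypothetical registration of the positions P1-P4 in the storing unit,
    # either from the current positioning fix (on-site registration) or from
    # coordinates selected on map data and received via the communication unit.
    registered_positions = {}

    def register_from_fix(name, gps_fix):
        # gps_fix is an assumed (latitude, longitude) tuple from the GPS sensor.
        registered_positions[name] = gps_fix

    def register_from_map(name, latitude, longitude):
        registered_positions[name] = (latitude, longitude)

    # Example: register P1 (swim goal) on site and P2 (bike start) from map data.
    register_from_fix("P1", (35.6581, 139.7414))
    register_from_map("P2", 35.6585, 139.7420)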

When starting the triathlon (when starting the swim in the start point S1), the user 2 performs measurement start operation on the user terminal 3.

The user terminal 3 incorporates a clocking unit 130 (see FIG. 4). The user terminal 3 measures an elapsed time from the measurement start operation, that is, a total elapsed time Ttotal from the time when the user 2 starts the triathlon, and sequentially displays information concerning the measured total elapsed time Ttotal on a display unit 150 (see FIG. 4) or the like (on a real-time basis).

The user terminal 3 discriminates, on the basis of position information obtained on the basis of satellite signals transmitted from the GPS (Global Positioning System) satellites 7 (an example of the “position information satellite”) and the positions P1, P2, P3, and P4 registered in advance, a plurality of states including a state “swim” (an example of a “first exercise state”) in which the user 2 is carrying out the swim (an example of a “first exercise event”), a state “first transition” (an example of a “first transition state”) halfway in transition from the “swim” to the “bike”, a state “bike” (an example of a “second exercise state”) in which the user 2 is carrying out the bike (an example of a “second exercise event”), a state “second transition” (an example of a “second transition state”) halfway in transition from the “bike” to the “run”, and a state “run” (an example of a “third exercise state”) in which the user 2 is carrying out the run (an example of a “third exercise event”). That is, in this embodiment, the user terminal 3 discriminates five states of the “swim”, the “first transition”, the “bike”, the “second transition”, and the “run”.

The user terminal 3 measures an elapsed time Tswim from the start to the end of the “swim”, an elapsed time Ttran1 from the start to the end of the “first transition”, an elapsed time Tbike from the start to the end of the “bike”, an elapsed time Ttran2 from the start to the end of the “second transition”, and an elapsed time Trun from the start to the end of the “run” and sequentially displays information concerning the discriminated states and the measured elapsed times of the states on the display unit 150 or the like (on a real-time basis).
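By way of a non-limiting sketch, the five-state discrimination and the per-state time measurement described above might be organized as follows; the class, its method names, and the use of a monotonic clock are assumptions introduced for the example, and the “pass” detection itself is assumed to be supplied by the positioning processing (for example, a distance-threshold check against the registered positions P1 to P4).

    import time

    STATES = ["swim", "first transition", "bike", "second transition", "run"]

    class StateTimer:
        # Assumed bookkeeping: the state advances each time the positioning
        # data indicates that P1, P2, P3, or P4 has been passed, and the
        # elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun are accumulated.
        def __init__(self, now=time.monotonic):
            self.now = now
            self.index = 0
            self.state_start = now()
            self.elapsed = {}

        @property
        def state(self):
            return STATES[self.index]

        def on_position_passed(self):
            # Call once for each of P1, P2, P3, and P4, in course order.
            t = self.now()
            self.elapsed[self.state] = t - self.state_start
            self.state_start = t
            if self.index < len(STATES) - 1:
                self.index += 1

        def on_measurement_end(self):
            # The measurement end operation closes the "run" state.
            self.elapsed[self.state] = self.now() - self.state_start
            return self.elapsed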

The user terminal 3 generates, on the basis of output signals of various sensors, information such as speed, a pace, a distance, a track, a pulse rate, a heart rate, a pitch, a swimming stroke, and a run stride of the user 2 and causes the incorporated storing unit 140 (see FIG. 4) to sequentially store the information.

While the user 2 is carrying out the triathlon, the user terminal 3 transmits exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun, a discriminated state, speed, a pace, a distance, a track, a pulse rate, a heart rate, a pitch, a swimming stroke, a run stride, and the like) of the user 2 to the information terminal 5 (5a) by short-range wireless communication.

The information terminal 5 (5a) displays the exercise information received from the user terminal 3 on a display unit. The information terminal 5 (5a) is, for example, a head mount display (HMD) worn by the user 2. The user 2 can carry out the triathlon while grasping exercise information displayed on the head mount display. Alternatively, the information terminal 5 (5a) is, for example, a smartphone or a personal computer carried by a coach of the user 2. The coach can provide information such as advice to the user 2, who is carrying out the triathlon, on the basis of the exercise information of the user 2 displayed on the smartphone or the personal computer.

In this embodiment, when ending the triathlon (passing the goal point G3), the user 2 performs measurement end operation on the user terminal 3.

When the measurement end operation is performed, the user terminal 3 ends the measurement processing of the total elapsed time Ttotal, the discrimination processing of the five states of the “swim”, the “first transition”, the “bike”, the “second transition”, and the “run”, and the measurement processing of the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun in the states and causes the incorporated storing unit 140 (see FIG. 4) to store the total elapsed time Ttotal and the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun. The total elapsed time Ttotal stored in the storing unit 140 is equivalent to the “total time” explained above. A sum of the elapsed time Tswim and the elapsed time Ttran1 stored in the storing unit 140 is equivalent to the “swim time” explained above. A sum of the elapsed time Tbike and the elapsed time Ttran2 stored in the storing unit 140 is equivalent to the “bike time” explained above. The elapsed time Trun stored in the storing unit 140 is equivalent to the “run time” explained above. The elapsed time Ttran1 stored in the storing unit 140 is equivalent to the “first transition time” explained above. The elapsed time Ttran2 stored in the storing unit 140 is equivalent to the “second transition time” explained above.

The user terminal 3 is connectable to the network 6 via the information terminal 5 (5b). After the user 2 ends the triathlon, the exercise information stored in the storing unit 140 is transferred to the server 4 via the information terminal 5 (5b) and the network 6 and saved in a storing unit (not shown in the figure) of the server 4. The information terminal 5 (5b) may be, for example, a smartphone or a personal computer.

The information terminal 5 (5c) receives the exercise information of the user 2 saved in the storing unit of the server 4 via the network 6 and displays the exercise information on the display unit. The information terminal 5 (5c) is, for example, a smartphone or a personal computer of the user 2 or a person related to the user 2. The user 2 and the person related to the user 2 can analyze, on the basis of the exercise information displayed on the smartphone or the personal computer, results of triathlons carried out by the user 2 in the past.

1-2. Configuration of the User Terminal

FIG. 4 is an example of a functional block diagram of the user terminal 3. As shown in FIG. 4, the user terminal 3 includes a processing unit 100, a GPS sensor 110, a terrestrial magnetism sensor 111, a pressure sensor 112, an acceleration sensor 113, an angular velocity sensor 114, a pulse sensor 115, a temperature sensor 116, an operation unit 120, a clocking unit 130, a storing unit 140, a display unit 150, a sound output unit 160, a communication unit 170, and a battery 180. However, the user terminal 3 may have a configuration in which some of these components are omitted or changed or other components are added.

The GPS sensor 110 (an example of a “position-information generating unit”) generates position information on the basis of satellite signals transmitted from the GPS satellites 7. For example, the GPS sensor 110 may be a GPS receiver that receives the satellite signals transmitted from the GPS satellites 7 with a not-shown antenna, demodulates a navigation message from the satellite signals, generates positioning data (data such as latitude, longitude, altitude, and a speed vector), which is position information indicating the position and the like of the user terminal 3, on the basis of the navigation message, and outputs the positioning data.

The terrestrial magnetism sensor 111 is a sensor that detects and outputs the magnetic field (the terrestrial magnetism) of the Earth. The terrestrial magnetism sensor 111 generates and outputs a terrestrial magnetism signal indicating magnetic flux densities in three axial directions orthogonal to one another. In the terrestrial magnetism sensor 111, for example, an MR (magnetoresistive) element, an MI (magneto-impedance) element, or a Hall element is used.

The pressure sensor 112 is a sensor that detects and outputs peripheral pressure (air pressure, water pressure, wind pressure, etc.). The pressure sensor 112 includes, for example, a pressure-sensitive element of a vibration type that uses a change in the oscillation frequency of a vibration piece. The pressure-sensitive element is, for example, a piezoelectric vibrator formed of a piezoelectric material such as quartz, lithium niobate, or lithium tantalate. For example, a tuning fork-type vibrator, a dual tuning fork-type vibrator, an AT vibrator (a thickness-shear vibrator), or a SAW resonator is applied. Alternatively, the pressure sensor 112 may be a MEMS-type air pressure sensor manufactured using a semiconductor manufacturing technique. For example, the pressure sensor 112 includes a diaphragm unit deflectively deformed by received pressure and a distortion detecting element that detects the deflection of the diaphragm unit. The diaphragm unit is formed of, for example, silicon. The distortion detecting element is, for example, a piezo-resistance element.

The acceleration sensor 113 detects accelerations in the respective three axial directions crossing (ideally orthogonal to) one another and outputs a signal (an acceleration signal) corresponding to the magnitudes and the directions of the detected three axis accelerations.

The angular velocity sensor 114 detects angular velocities in the respective three axial directions crossing (ideally orthogonal to) one another and outputs a signal (an angular velocity signal) corresponding to the magnitudes and the directions of the measured three axis angular velocities.

Note that at least one of the output signal (the pressure signal) of the pressure sensor 112, the output signal (the acceleration signal) of the acceleration sensor 113, and the output signal (the angular velocity signal) of the angular velocity sensor 114 is used for correcting the information concerning the positions included in the positioning data generated by the GPS sensor 110.

The pulse sensor 115 is a sensor that generates and outputs a signal indicating a pulse of the user 2. The pulse sensor 115 includes, for example, a light source such as an LED (Light Emitting Diode) that emits measurement light having an appropriate wavelength toward a blood vessel under the skin and a light receiving element that detects an intensity change of light that occurs in the blood vessel according to the measurement light. For example, it is possible to measure a pulse rate (the number of pulses per one minute) by processing an intensity change waveform (a pulse wave) of the light with a publicly-known method such as frequency analysis. Note that the heart rate (heart beats per one minute) and the pulse rate are said to be substantially the same unless arrhythmia or a pulse deficit occurs. Therefore, the heart rate can be measured by the pulse sensor 115. As the pulse sensor 115, an ultrasonic sensor that detects contraction of a blood vessel with an ultrasonic wave and measures a pulse rate (a heart rate) may be adopted instead of a photoelectric sensor including a light source and a light receiving element. Alternatively, a sensor that feeds a feeble current from an electrode into the body and measures a pulse rate (a heart rate) may be adopted.
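The pulse-rate calculation by frequency analysis mentioned above can be pictured with the following minimal sketch; it assumes a uniformly sampled pulse-wave signal spanning several seconds and a plausible 0.5 Hz to 4 Hz search band, and it is not the device's actual algorithm.

```python
import numpy as np

def pulse_rate_bpm(pulse_wave: np.ndarray, fs_hz: float) -> float:
    """Estimate a pulse rate (beats per minute) from a pulse-wave signal sampled at fs_hz."""
    x = pulse_wave - np.mean(pulse_wave)           # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_hz)
    band = (freqs >= 0.5) & (freqs <= 4.0)         # assumed pulse-rate band (30 to 240 bpm)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                        # dominant frequency in Hz -> beats per minute
```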

The temperature sensor 116 is a sensor that outputs a signal (a temperature signal) corresponding to an ambient temperature.

The operation unit 120 is configured by, for example, buttons, keys, a microphone, a touch panel, a sound recognizing function (which uses a not-shown microphone), and an action detecting function (which uses the acceleration sensor 113 or the like). The operation unit 120 performs processing for converting an instruction received from the user 2 into an appropriate signal and sending the signal to the processing unit 100.

The clocking unit 130 is configured by, for example, a real time clock (RTC) IC. The clocking unit 130 generates time data such as year, month, day, time, minute, and second and sends the time data to the processing unit 100. Note that the time data may be corrected as appropriate on the basis of time information included in the positioning data generated by the GPS sensor 110.

The storing unit 140 is configured by, for example, one or a plurality of IC (Integrated Circuit) memories. The storing unit 140 includes a ROM (Read Only Memory) in which data such as computer programs are stored, a RAM (Random Access Memory) serving as a work area of the processing unit 100, and a recording medium (a recording medium readable by the user terminal 3 (an example of a computer)) such as a memory card for storing computer programs and data. In the ROM or the recording medium, various computer programs for the processing unit 100 to perform various kinds of calculation processing and control processing, various computer programs and various data for realizing application functions, and the like are stored.

Note that the user terminal 3 may receive various computer programs and various data stored in a recording medium (an optical disk (a CD or a DVD), a magneto-optical disk (MO), a magnetic disk, a hard disk, a magnetic tape, etc.) included in the server 4 or a storing unit via the information terminal 5 (5b) and the network 6 and store the received various computer programs and various data in the storing unit 140 (the RAM).

The display unit 150 is configured by, for example, an LCD (Liquid Crystal Display), an organic EL (Electroluminescence) display, an EPD (Electrophoretic Display), or a touch panel display. The display unit 150 displays various images according to instructions from the processing unit 100. Note that, as the display unit 150, a head-mounted display (HMD) provided separately from the user terminal 3 can also be used.

The sound output unit 160 is configured by, for example, a speaker, a buzzer, or a vibrator. The sound output unit 160 generates various kinds of sound (or vibration) according to instructions from the processing unit 100. Note that, as the sound output unit 160, a bone conduction device provided separately from the user terminal 3 can also be used.

The communication unit 170 performs various kinds of control for establishing data communication between the user terminal 3 and the information terminal 5. The communication unit 170 includes a transceiver corresponding to a short-range wireless communication standard such as Bluetooth (registered trademark) (including BTLE: Bluetooth Low Energy), Wi-Fi (Wireless Fidelity) (registered trademark), Zigbee (registered trademark), NFC (Near Field Communication), or ANT+ (registered trademark) and a connector corresponding to a communication bus standard such as USB (Universal Serial Bus).

The battery 180 supplies electric power to the units configuring the user terminal 3 and is, for example, a rechargeable battery. As a charging type of the battery 180, for example, contactless charging or contact charging (charging performed using a cradle or the like) can be applied. The battery 180 may be an interchangeable battery or may be a solar generation-type battery.

The processing unit 100 (a processor) is configured by, for example, an MPU (Micro Processing Unit), a DSP (Digital Signal Processor), or an ASIC (Application Specific Integrated Circuit). The processing unit 100 executes various kinds of processing on the basis of computer programs stored in the storing unit 140 and signals input from the operation unit 120. The processing by the processing unit 100 includes data processing for output signals of the GPS sensor 110, the terrestrial magnetism sensor 111, the pressure sensor 112, the acceleration sensor 113, the angular velocity sensor 114, the pulse sensor 115, the temperature sensor 116, and the clocking unit 130, display processing for causing the display unit 150 to display an image, sound output processing for causing the sound output unit 160 to output sound, communication processing for performing communication with the information terminal 5 via the communication unit 170, and power control processing for supplying electric power received from the battery 180 to the units.

In particular, in this embodiment, the processing unit 100 performs processing for receiving a signal indicating setting of the positions P1, P2, P3, and P4 from the operation unit 120 or the communication unit 170 and registering the positions P1, P2, P3, and P4 in the storing unit 140.

The processing unit 100 performs, as one kind of the data processing, processing for measuring, on the basis of an output signal of the clocking unit 130, an elapsed time (the total elapsed time Ttotal) after a signal indicating measurement start operation is received from the operation unit 120.

The processing unit 100 performs, as one kind of the data processing, processing for discriminating the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2 on the basis of positioning data generated and output by the GPS sensor 110 (the position information obtained on the basis of satellite signals transmitted from the GPS satellites 7) and the positions P1, P2, P3, and P4 registered in advance in the storing unit 140.

Specifically, the processing unit 100 determines on the basis of the positioning data (the position information) whether the user 2 passes the position P1 and whether the user 2 passes the position P2, determines that the user 2 is in the state “swim” until the user 2 passes the position P1, determines that the user 2 is in the state “first transition” until the user 2 passes the position P2 after passing the position P1, and determines that the user 2 is in the state “bike” after the user 2 passes the position P2.

Further, the processing unit 100 determines on the basis of the positioning data (the position information) whether the user 2 passes the position P3 and whether the user 2 passes the position P4, determines that the user 2 is in the state “bike” until the user 2 passes the position P3 after passing the position P2, determines that the user 2 is in the state “second transition” until the user 2 passes the position P4 after passing the position P3, and determines that the user 2 is in the state “run” after the user 2 passes the position P4.
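To make the passage-based discrimination concrete, the following is a minimal sketch, assuming that the registered positions are held as latitude/longitude pairs and that a passage test (for example, the distance-threshold check sketched later for FIG. 6) is supplied; the class and parameter names are illustrative and are not the claimed implementation. The sketch also records the elapsed time of each state when the state changes, corresponding to Tswim, Ttran1, and so on.

```python
import time

# Ordered waypoints and the state entered after passing each one (first embodiment).
WAYPOINT_SEQUENCE = [
    ("P1", "first transition"),   # goal point G1 of the swim
    ("P2", "bike"),               # start point S2 of the bike
    ("P3", "second transition"),  # goal point G2 of the bike
    ("P4", "run"),                # start point S3 of the run
]

class StateDiscriminator:
    """Minimal sketch of the passage-based state discrimination (names assumed)."""

    def __init__(self):
        self.state = "swim"                      # the state at measurement start
        self.remaining = list(WAYPOINT_SEQUENCE)
        self.state_started_at = time.monotonic()
        self.elapsed = {}                        # per-state elapsed times (Tswim, Ttran1, ...)

    def update(self, user_latlon, registered, passed_fn):
        """Advance the state when the next registered position is passed.

        registered maps waypoint names ("P1" to "P4") to latitude/longitude pairs;
        passed_fn is a passage test such as a distance-threshold check.
        """
        if self.remaining and passed_fn(user_latlon, registered[self.remaining[0][0]]):
            now = time.monotonic()
            self.elapsed[self.state] = now - self.state_started_at
            _, self.state = self.remaining.pop(0)
            self.state_started_at = now
        return self.state
```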

The processing unit 100 performs, as one kind of the data processing, processing for calculating times respectively required for the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2. That is, the processing unit 100 performs processing for measuring, on the basis of an output signal of the clocking unit 130, the elapsed time Tswim of the state “swim”, the elapsed time Ttran1 of the state “first transition”, the elapsed time Tbike of the state “bike”, the elapsed time Ttran2 of the state “second transition”, and the elapsed time Trun of the state “run”.

The processing unit 100 performs, as one kind of the data processing, processing for generating, on the basis of output signals of the GPS sensor 110, the terrestrial magnetism sensor 111, the pressure sensor 112, the acceleration sensor 113, the angular velocity sensor 114, the pulse sensor 115, the temperature sensor 116, and the clocking unit 130, information such as speed, a pace, a distance, a track, a pulse rate, a heart rate, a pitch, a swimming stroke, and a run stride of the user 2 after a signal indicating measurement start operation is received from the operation unit 120 and causing the storing unit 140 to store the information.

The processing unit 100 performs, as one kind of the data processing, processing for ending, when receiving a signal indicating measurement end operation from the operation unit 120, the measurement processing of the total elapsed time Ttotal, the discrimination processing of the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run”, and the measurement processing of the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun of the states and causing the incorporated storing unit 140 to store the total elapsed time Ttotal and the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun.

The processing unit 100 may perform, as one kind of the display processing, processing for causing the display unit 150 to display at least one of the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2. In this case, the display unit 150 functions as a notifying unit that notifies a state discriminated by the processing unit 100.

The processing unit 100 may perform, as one kind of the display processing, processing for causing the display unit 150 to display at least a part of the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of the user 2.

The processing unit 100 may perform, as one kind of the sound output processing, processing for causing the sound output unit 160 to output, as sound, at least one of the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2. In this case, the sound output unit 160 functions as a notifying unit that notifies a state discriminated by the processing unit 100.

The processing unit 100 may perform, as one kind of the sound output processing, processing for causing the sound output unit 160 to output, as sound, at least a part of the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of the user 2.

The processing unit 100 may perform, as one kind of the communication processing, processing for transmitting at least one of the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2 to the information terminals 5 (5a and 5b) via the communication unit 170. In this case, the communication unit 170 functions as a notifying unit that notifies a state discriminated by the processing unit 100.

The processing unit 100 may perform, as one kind of the communication processing, processing for transmitting the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of the user 2 to the information terminals 5 (5a and 5b) via the communication unit 170.

1-3. Procedure of Processing of the User Terminal

FIG. 5 is a flowchart for explaining an example of a procedure of a part of processing performed by the processing unit 100 of the user terminal 3. The processing unit 100 of the user terminal 3 executes a computer program stored in the storing unit 140 (the storage medium, the ROM, or the RAM) to thereby execute the processing in the procedure of the flowchart of FIG. 5.

As shown in FIG. 5, first, the processing unit 100 stays on standby until the processing unit 100 receives a signal indicating measurement start operation from the operation unit 120 (N in step S10). When receiving the signal indicating the measurement start operation (Y in step S10), the processing unit 100 starts generation processing of the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of the user 2 (step S12).

Subsequently, the processing unit 100 performs state discrimination processing for discriminating a state of the user 2 (step S14). Details of the state discrimination processing are explained below.

Subsequently, the processing unit 100 stays on standby until the processing unit 100 receives a signal indicating measurement end operation from the operation unit 120 (N in step S16). When receiving the signal indicating the measurement end operation (Y in step S16), the processing unit 100 ends the generation processing of the exercise information of the user 2 (step S18).

FIG. 6 is a flowchart for explaining an example of details of the state discrimination processing (the processing in step S14 in FIG. 5) in the first embodiment.

As shown in FIG. 6, first, the processing unit 100 sets the state of the user 2 to the “swim” (step S100).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S102) and determines on the basis of the acquired position information and the registered position P1 whether the distance between the position of the user 2 and the position P1 is equal to or smaller than a threshold (step S104). The threshold only has to be decided as appropriate.
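The determination in steps S102 to S104 compares the distance between the user's current position and the registered position P1 with the threshold. One way to compute that distance from latitude and longitude, sketched here under a spherical-Earth assumption (the haversine formula), is the following; the 20 m default threshold is only a placeholder, since the specification leaves the value open.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius; spherical-Earth assumption

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two latitude/longitude points given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def passed(user_latlon, waypoint_latlon, threshold_m: float = 20.0) -> bool:
    """True if the user is within the threshold distance of the registered position."""
    return distance_m(*user_latlon, *waypoint_latlon) <= threshold_m
```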

If the distance between the position of the user 2 and the position P1 is not equal to or smaller than the threshold (N in step S104), the processing unit 100 performs the processing in steps S102 and S104 again. On the other hand, if the distance between the position of the user 2 and the position P1 is equal to or smaller than the threshold (Y in step S104), the processing unit 100 changes the state of the user 2 from the “swim” to the “first transition” (step S106).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S108) and determines on the basis of the acquired position information and the registered position P2 whether the distance between the position of the user 2 and the position P2 is equal to or smaller than the threshold (step S110).

If the distance between the position of the user 2 and the position P2 is not equal to or smaller than the threshold (N in step S110), the processing unit 100 performs the processing in steps S108 and S110 again. On the other hand, if the distance between the position of the user 2 and the position P2 is equal to or smaller than the threshold (Y in step S110), the processing unit 100 changes the state of the user 2 from the “first transition” to the “bike” (step S112).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S114) and determines on the basis of the acquired position information and the registered position P3 whether the distance between the position of the user 2 and the position P3 is equal to or smaller than the threshold (step S116).

If the distance between the position of the user 2 and the position P3 is not equal to or smaller than the threshold (N in step S116), the processing unit 100 performs the processing in steps S114 and S116 again. On the other hand, if the distance between the position of the user 2 and the position P3 is equal to or smaller than the threshold (Y in step S116), the processing unit 100 changes the state of the user 2 from the “bike” to the “second transition” (step S118).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S120) and determines on the basis of the acquired position information and the registered position P4 whether the distance between the position of the user 2 and the position P4 is equal to or smaller than the threshold (step S122).

If the distance between the position of the user 2 and the position P4 is not equal to or smaller than the threshold (N in step S122), the processing unit 100 performs the processing in steps S120 and S122 again. On the other hand, if the distance between the position of the user 2 and the position P4 is equal to or smaller than the threshold (Y in step S122), the processing unit 100 changes the state of the user 2 from the “second transition” to the “run” (step S124).

1-4. Display Method of States of the User

In this embodiment, the user terminal 3 (the processing unit 100) discriminates a plurality of states of the user 2 according to the state discrimination processing (the processing in step S14 in FIG. 5) and causes the display unit 150 to display the discriminated states. FIG. 7 is a diagram showing a display example of states of the user 2 that the processing unit 100 causes the display unit 150 to display while the user 2 is carrying out the triathlon.

As shown in FIG. 7, when a state of the user 2 is the “swim”, the processing unit 100 causes the display unit 150 to display an image A1 including an object OB1 for reminding that the user 2 is carrying out the swim. When the state of the user 2 is the “swim”, the processing unit 100 may cause the display unit 150 to display the elapsed time Tswim from the start of the “swim”. The image A1 includes the total elapsed time Ttotal (0:15:15) serving as a total time and the elapsed time Tswim (0:15:15) of the swim.

When the state of the user 2 is the “first transition”, the processing unit 100 causes the display unit 150 to display an image A2 including an object OB2 for reminding that the user 2 is transitioning from the swim to the bike. When the state of the user 2 is the “first transition”, the processing unit 100 may cause the display unit 150 to display the elapsed time Ttran1 from the start of the “first transition”. The image A2 includes the total elapsed time Ttotal (0:30:12) serving as the total time and the elapsed time Ttran1 (0:01:05) of the first transition.

When the state of the user 2 is the “bike”, the processing unit 100 causes the display unit 150 to display an image A3 including an object OB3 for reminding that the user 2 is carrying out the bike. When the state of the user 2 is the “bike”, the processing unit 100 may cause the display unit 150 to display the elapsed time Tbike from the start of the “bike”. The image A3 includes the total elapsed time Ttotal (1:01:45) serving as the total time and the elapsed time Tbike (0:26:59) of the bike.

When the state of the user 2 is the “second transition”, the processing unit 100 causes the display unit 150 to display an image A4 including an object OB4 for reminding that the user 2 is transitioning from the bike to the run. When the state of the user 2 is the “second transition”, the processing unit 100 may cause the display unit 150 to display the elapsed time Ttran2 from the start of the “second transition”. The image A4 includes the total elapsed time Ttotal (1:32:38) serving as the total time and the elapsed time Ttran2 (0:00:55) of the second transition.

When the state of the user 2 is the “run”, the processing unit 100 causes the display unit 150 to display an image A5 including an object OB5 for reminding that the user 2 is carrying out the run. When the state of the user 2 is the “run”, the processing unit 100 may cause the display unit 150 to display the elapsed time Trun from the start of the “run”. The image A5 includes the total elapsed time Ttotal (2:12:33) serving as the total time and the elapsed time Trun (0:39:22) of the run.

Note that the processing unit 100 may transmit information concerning the images A1 to A5 representing the states of the user 2 to the information terminal 5 (5a) via the communication unit 170 and cause the display unit of the information terminal 5 (5a) to display the images A1 to A5.

In this way, in this embodiment, a display system is configured that includes the processing unit 100, which discriminates the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2, and the display unit 150 of the user terminal 3 or the display unit of the information terminal 5, which displays the discriminated states.

Note that, in order to reduce the total time, it is desirable to reduce the elapsed time Ttran1 of the first transition and the elapsed time Ttran2 of the second transition as much as possible. Therefore, when a discriminated state is the “first transition” or the “second transition”, the processing unit 100 may generate an image including a flashing object and cause the display unit 150 or the display unit of the information terminal 5 (5a) to display the image. Consequently, in FIG. 7, the image A2 displayed when the state of the user 2 is the “first transition” and the image A4 displayed when the state of the user 2 is the “second transition” are more highlighted than the other images A1, A3, and A5. For example, as shown in FIG. 8, in the image A2 displayed when the state of the user 2 is the “first transition”, an object imitating the letter T, which is a part of the object OB2, may be flashed. In the example shown in FIG. 8, the object imitating the letter T is lit for one second (an image A2-1), extinguished for one second (an image A2-2), lit for one second (an image A2-3), and extinguished for one second (an image A2-4). Although not shown in the figure, similarly, in the image A4 displayed when the state of the user 2 is the “second transition”, an object imitating the letter T, which is a part of the object OB4, may be flashed.
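The one-second lit/extinguished cycle of FIG. 8 can be expressed as a simple function of the elapsed time in the transition state; the sketch below only illustrates that timing, with the one-second period taken from the example above.

```python
def transition_icon_visible(elapsed_in_state_s: float, period_s: float = 1.0) -> bool:
    """Return True while the object imitating the letter T should be lit.

    The object is lit for one period, extinguished for one period, and so on,
    matching the images A2-1 to A2-4 in FIG. 8.
    """
    return int(elapsed_in_state_s // period_s) % 2 == 0
```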

1-5. Display Method of Exercise Information

In this embodiment, the user terminal 3 (the processing unit 100) transmits the exercise information of the user 2 generated by the generation processing of exercise information (the processing starting in step S12 and ending in step S18 in FIG. 5) to the information terminals 5 (5a and 5b) via the communication unit 170.

While the user 2 is carrying out the triathlon, the information terminal 5a receives the exercise information of the user 2 from the user terminal 3 (the processing unit 100) and displays at least a part of the exercise information on the display unit.

After the user 2 ends the triathlon, the information terminal 5b receives the exercise information of the user 2 from the user terminal 3 (the processing unit 100) and transfers the exercise information to the server 4 via the network 6. The server 4 saves the exercise information of the user 2 received from the information terminal 5b in the storing unit. Thereafter, the information terminal 5c receives the exercise information of the user 2 saved in the storing unit of the server 4 via the network 6 and displays the exercise information on the display unit. FIGS. 9 to 12 are diagrams showing examples of images displayed on the display unit of the information terminal 5c.

The image shown in FIG. 9 includes information such as trend graphs of an average pace, an altitude, and a heart rate with time plotted on the horizontal axis, the objects indicating the states of the user 2, the elapsed time Ttran1 (62 seconds) of the first transition, and the elapsed time Ttran2 (44 seconds) of the second transition.

The image shown in FIG. 10 includes information such as trend graphs of an average pace, an altitude, and a pitch with time plotted on the horizontal axis, the objects indicating the states of the user 2, the elapsed time Ttran1 (62 seconds) of the first transition, and the elapsed time Ttran2 (44 seconds) of the second transition.

In the image shown in FIG. 11, a pulse rate is classified into five stages of 30 to 100, 101 to 130, 131 to 160, 161 to 190, and 191 to 240. The image includes information concerning times in which the pulse rate is in the stages and ratios of the times.
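The five-stage classification of FIG. 11 can be sketched as a simple binning of pulse-rate samples; the zone boundaries are those listed above, while the uniform sampling interval is an assumption made for this illustration.

```python
# Pulse-rate stages of FIG. 11 (beats per minute).
PULSE_ZONES = [(30, 100), (101, 130), (131, 160), (161, 190), (191, 240)]

def time_per_zone(pulse_samples, sample_interval_s: float):
    """Return (seconds, ratio) spent in each pulse-rate stage of FIG. 11."""
    counts = [0] * len(PULSE_ZONES)
    for bpm in pulse_samples:
        for i, (low, high) in enumerate(PULSE_ZONES):
            if low <= bpm <= high:
                counts[i] += 1
                break
    total = sum(counts) or 1                      # avoid division by zero
    return [(c * sample_interval_s, c / total) for c in counts]
```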

The image shown in FIG. 12 includes information concerning a moving track of the user 2 (an athlete A) and a moving track of another user (e.g., a professional athlete B).

From the images shown in FIGS. 9 to 11, when the time of the user 2 has improved, it is possible, for example, to determine which of the “swim”, the “first transition”, the “bike”, the “second transition”, and the “run” was good, to determine that time efficiency in the “second transition” was better than time efficiency in the “first transition”, or to determine that the pace in the “run” was stable because the user 2 was able to save power in the “bike”. When the user 2 desires to improve the time of the user 2, it is possible, for example, to determine improvement of which of the “swim”, the “first transition”, the “bike”, the “second transition”, and the “run” is most effective, to determine that it is better to reduce the pace in the “bike” in preparation for the “run”, or to determine that the user 2 should repeat practice in a pool to develop basic strength because the pace in the “swim” is unstable.

From the image shown in FIG. 12, for example, it is possible to compare course selections of the user 2 and the other user (e.g., the professional athlete) and determine in which event a loss occurred and at which place (a curve or a straight section) it occurred. Although not shown in the figure, an image for enabling trend graphs of average paces, altitudes, heart rates, and pitches to be compared between the user 2 and the other user (e.g., the professional athlete) may be displayed on the display unit of the information terminal 5c. With such an image, it is possible to determine, for example, that the pace of the other user (the professional athlete) is overwhelmingly higher than the pace of the user 2 or that the user 2 is evenly matched with the other user (the professional athlete) in the swim and the run.

1-6. Action and Effects

As explained above, with the exercise information management system 1 in the first embodiment, on the basis of the positioning data (the position information) of the GPS sensor 110 and the positions P1, P2, P3, and P4 registered in advance, the processing unit 100 of the user terminal 3 determines that the user 2 is in the state “swim” until the user 2 passes the position P1, determines that the user 2 is in the state “first transition” until the user 2 passes the position P2 after passing the position P1, determines that the user 2 is in the state “bike” until the user 2 passes the position P3 after passing the position P2, determines that the user 2 is in the state “second transition” until the user 2 passes the position P4 after passing the position P3, and determines that the user 2 is in the state “run” after the user 2 passes the position P4. That is, since the processing unit 100 automatically discriminates the state of the user 2, the user 2 does not need to perform work when an athletic event carried out by the user 2 is switched from the “swim” to the “bike” and when the athletic event is switched from the “bike” to the “run”. Therefore, the user 2 can concentrate on the triathlon.

With the exercise information management system 1 in the first embodiment, the states of the user 2 discriminated by the processing unit 100 of the user terminal 3 are displayed on the display unit 150 of the user terminal 3 together with the elapsed times. Therefore, the user 2 is capable of recognizing the displayed elapsed times of the states and performing adjustment such as an increase and a reduction in a pace. The states of the user 2 discriminated by the processing unit 100 of the user terminal 3 are displayed on the display unit of the information terminal 5a together with the elapsed times. Therefore, a person (e.g., a coach) carrying the information terminal 5a can recognize the elapsed times of the states of the user 2 and give advice or the like such as an increase or a reduction in a pace to the user 2. In particular, the user 2 and the like can separately recognize a time required for the state “swim” and a time required for the state “first transition”. Therefore, the user 2 and the like can recognize whether a pace of the swim is faster or slower than an assumption and whether a time required for switching from the swim to the bike is longer or shorter than an assumption and appropriately determine whether the pace should be increased or reduced in the bike. Similarly, the user 2 and the like can separately recognize a time required for the state “bike” and a time required for the state “second transition”. Therefore, the user 2 and the like can recognize whether a pace of the bike is faster or slower than an assumption and whether a time required for switching from the bike to the run is longer or shorter than an assumption and appropriately determine whether the pace should be increased or reduced in the run.

2. Second Embodiment

The exercise information management system 1 in a second embodiment is explained below.

In the exercise information management system 1 in the second embodiment, as shown in FIG. 13, before starting a triathlon, the user 2 stores, in advance, in the storing unit 140 of the user terminal 3, a plurality of positions P1 in the goal point G1 of swim or the vicinity of the goal point G1, a plurality of positions P2 in the start point S2 of bike or the vicinity of the start point S2, a plurality of positions P3 in the goal point G2 of the bike or the vicinity of the goal point G2, and a plurality of positions P4 in the start point S3 of run or the vicinity of the start point S3.

The processing unit 100 performs processing for discriminating a plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2 on the basis of positioning data (position information) generated and output by the GPS sensor 110 and the plurality of positions P1, P2, P3, and P4 registered in the storing unit 140 in advance.

Specifically, the processing unit 100 determines on the basis of the positioning data (the position information) whether the user 2 passes any one of the plurality of positions P1 and whether the user 2 passes any one of the plurality of positions P2, determines that the user 2 is in the state “swim” until the user 2 passes any one of the plurality of positions P1, determines that the user 2 is in the state “first transition” until the user 2 passes any one of the plurality of positions P2 after passing any one of the plurality of positions P1, and determines that the user 2 is in the state “bike” after the user 2 passes any one of the plurality of positions P2.

Further, the processing unit 100 determines on the basis of the positioning data (the position information) whether the user 2 passes any one of the plurality of positions P3 and whether the user 2 passes any one of the plurality of positions P4, determines that the user 2 is in the state “bike” until the user 2 passes any one of the plurality of positions P3 after passing any one of the plurality of positions P2, determines that the user 2 is in the state “second transition” until the user 2 passes any one of the plurality of positions P4 after passing any one of the plurality of positions P3, and determines that the user 2 is in the state “run” after the user 2 passes any one of the plurality of positions P4.
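In the second embodiment the passage test succeeds when the user comes within the threshold distance of any one of the registered positions for a gate. A minimal sketch, reusing the hypothetical distance_m helper sketched for the first embodiment:

```python
def passed_any(user_latlon, waypoints, threshold_m: float = 20.0) -> bool:
    """True if the user is within the threshold of any one of the registered positions (e.g., the plurality of positions P1)."""
    return any(distance_m(*user_latlon, *latlon) <= threshold_m for latlon in waypoints)
```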

As in the first embodiment, in the second embodiment, the processing unit 100 executes the computer program stored in the storing unit 140 to thereby, for example, execute the processing in the procedure of the flowchart of FIG. 5.

FIG. 14 is a flowchart for explaining an example of details of the state discrimination processing (the processing in step S14 in FIG. 5) in the second embodiment.

As shown in FIG. 14, first, the processing unit 100 sets a state of the user 2 to the “swim” (step S200).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S202) and determines on the basis of the acquired position information and the registered plurality of positions P1 whether the distance between the position of the user 2 and any one of the plurality of positions P1 is equal to or smaller than a threshold (step S204). The threshold only has to be decided as appropriate.

If the distance between the position of the user 2 and any one of the plurality of positions P1 is not equal to or smaller than the threshold (N in step S204), the processing unit 100 performs the processing in steps S202 and S204 again. On the other hand, if the distance between the position of the user 2 and any one of the plurality of positions P1 is equal to or smaller than the threshold (Y in step S204), the processing unit 100 changes the state of the user 2 from the “swim” to the “first transition” (step S206).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S208) and determines on the basis of the acquired position information and the registered plurality of positions P2 whether the distance between the position of the user 2 and any one of the plurality of positions P2 is equal to or smaller than the threshold (step S210).

If the distance between the position of the user 2 and any one of the plurality of positions P2 is not equal to or smaller than the threshold (N in step S210), the processing unit 100 performs the processing in steps S208 and S210 again. On the other hand, if the distance between the position of the user 2 and any one of the plurality of positions P2 is equal to or smaller than the threshold (Y in step S210), the processing unit 100 changes the state of the user 2 from the “first transition” to the “bike” (step S212).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S214) and determines on the basis of the acquired position information and the registered plurality of positions P3 whether the distance between the position of the user 2 and any one of the plurality of positions P3 is equal to or smaller than the threshold (step S216).

If the distance between the position of the user 2 and any one of the plurality of positions P3 is not equal to or smaller than the threshold (N in step S216), the processing unit 100 performs the processing in steps S214 and S216 again. On the other hand, if the distance between the position of the user 2 and any one of the plurality of positions P3 is equal to or smaller than the threshold (Y in step S216), the processing unit 100 changes the state of the user 2 from the “bike” to the “second transition” (step S218).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S220) and determines on the basis of the acquired position information and the registered plurality of positions P4 whether the distance between the position of the user 2 and any one of the plurality of positions P4 is equal to or smaller than the threshold (step S222).

If the distance between the position of the user 2 and any one of the plurality of positions P4 is not equal to or smaller than the threshold (N in step S222), the processing unit 100 performs the processing in steps S220 and S222 again. On the other hand, if the distance between the position of the user 2 and any one of the plurality of positions P4 is equal to or smaller than the threshold (Y in step S222), the processing unit 100 changes the state of the user 2 from the “second transition” to the “run” (step S224).

The exercise information management system 1 in the second embodiment explained above can achieve the same effects as the effects in the first embodiment.

Further, with the exercise information management system 1 in the second embodiment, the processing unit 100 of the user terminal 3 discriminates the states assuming that the state of the user 2 is switched when the user 2 passes any one of the plurality of positions P1, any one of the plurality of positions P2, any one of the plurality of positions P3, or any one of the plurality of positions P4. Therefore, compared with the first embodiment in which only one each of the positions P1, P2, P3, and P4 is registered, it is possible to more accurately discriminate the state of the user 2.

3. Third Embodiment

The exercise information management system 1 in a third embodiment is explained below.

In the first embodiment, the processing unit 100 calculates the elapsed time Ttran1 from the start to the end of the state “first transition” as a time equivalent to the first transition time (the elapsed time from time when the user 2 passes the goal point G1 of the swim until the user 2 passes the start point S2 of the bike).

In practice, the goal point G1 of the swim and the transition area TA are sometimes far apart. The first transition time is a sum of a time in which the user 2 moves from the goal point G1 of the swim to the transition area TA and a time required for a change of clothes in the transition area TA, movement to the start point S2 (a riding line) of the bike, and the like.

Therefore, in the third embodiment, a state halfway in transition from the “swim” to the “bike” consists of two states, that is, a state “running movement” in which the user 2 is moving from the goal point G1 of the swim to the transition area TA and a state “first transition” in which the user 2 is performing a change of clothes in the transition area TA and movement to the start point S2 (the riding line) of the bike. The processing unit 100 of the user terminal 3 discriminates the two states “running movement” and “first transition” respectively as separate states.

In this embodiment, as shown in FIG. 15, the user 2 registers, in advance, a position P5 in a point E1 of an entrance of the transition area TA or the vicinity of the point E1 present on a moving route from the goal point G1 of the swim in addition to the positions P1, P2, P3 and P4.

The processing unit 100 performs processing for discriminating a plurality of states “swim”, “running movement”, “first transition”, “bike”, “second transition”, and “run” on the basis of positioning data (position information) generated and output by the GPS sensor 110 and the positions P1, P2, P3, P4, and P5 registered in the storing unit 140 in advance. The “running movement” is a state in which the user 2 is moving from the goal point G1 of the swim to the transition area TA.

Specifically, the processing unit 100 determines on the basis of the positioning data (the position information) whether the user 2 passes the position P1 and whether the user 2 passes the position P2, determines that the user 2 is in the state “swim” until the user 2 passes the position P1, determines that the user 2 is in the state “running movement” until the user 2 passes the position P5 after passing the position P1, determines that the user 2 is in the “first transition” until the user 2 passes the position P2 after passing the position P5, and determines that the user 2 is in the state “bike” after the user 2 passes the position P2.

Further, the processing unit 100 determines on the basis of the positioning data (the position information) whether the user 2 passes the position P3 and whether the user 2 passes the position P4, determines that the user 2 is in the state “bike” until the user 2 passes the position P3 after passing the position P2, determines that the user 2 is in the state “second transition” until the user 2 passes the position P4 after passing the position P3, and determines that the user 2 is in the state “run” after the user 2 passes the position P4.
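In terms of the hypothetical waypoint table sketched for the first embodiment, the third embodiment only inserts P5 between P1 and P2 and enters a “running movement” state after P1; a minimal, assumed illustration:

```python
# Ordered waypoints for the third embodiment (FIG. 15); the state names follow the text above.
WAYPOINT_SEQUENCE_3RD = [
    ("P1", "running movement"),   # goal point G1 of the swim -> move toward the transition area TA
    ("P5", "first transition"),   # entrance E1 of the transition area TA
    ("P2", "bike"),               # start point S2 (riding line) of the bike
    ("P3", "second transition"),  # goal point G2 of the bike
    ("P4", "run"),                # start point S3 of the run
]
```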

The processing unit 100 performs, as one kind of the data processing, processing for measuring, on the basis of an output signal of the clocking unit 130, the elapsed time Tswim of the state “swim”, an elapsed time Tmove of the state “running movement”, the elapsed time Ttran1 of the state “first transition”, the elapsed time Tbike of the state “bike”, the elapsed time Ttran2 of the state “second transition”, and the elapsed time Trun of the state “run”.

The processing unit 100 performs, as one kind of the data processing, processing for ending, when receiving a signal indicating measurement end operation from the operation unit 120, the measurement processing of the total elapsed time Ttotal, the discrimination processing of the “swim”, the “running movement”, the “first transition”, the “bike”, the “second transition”, and the “run”, and the measurement processing of the elapsed times Tswim, Tmove, Ttran1, Tbike, Ttran2, and Trun of the states and causing the incorporated storing unit 140 to store the total elapsed time Ttotal and the elapsed times Tswim, Tmove, Ttran1, Tbike, Ttran2, and Trun.

The processing unit 100 may perform, as one kind of the display processing, processing for causing the display unit 150 to display at least one of the plurality of states “swim”, “running movement”, “first transition”, “bike”, “second transition”, and “run” of the user 2. In this case, the display unit 150 functions as a notifying unit that notifies a state discriminated by the processing unit 100.

The processing unit 100 may perform, as one kind of the display processing, processing for causing the display unit 150 to display at least a part of the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Tmove, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of the user 2.

The processing unit 100 may perform, as one kind of the sound output processing, processing for causing the sound output unit 160 to output, as sound, at least one of the plurality of states “swim”, “running movement”, “first transition”, “bike”, “second transition”, and “run” of the user 2. In this case, the sound output unit 160 functions as a notifying unit that notifies a state discriminated by the processing unit 100.

The processing unit 100 may perform, as one kind of the communication processing, processing for transmitting at least one of the plurality of states “swim”, “running movement”, “first transition”, “bike”, “second transition”, and “run” of the user 2 to the information terminals 5 (5a and 5b) via the communication unit 170. In this case, the communication unit 170 functions as a notifying unit that notifies a state discriminated by the processing unit 100.

The processing unit 100 may perform, as one kind of the communication processing, processing for transmitting the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Tmove, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of the user 2 to the information terminals 5 (5a and 5b) via the communication unit 170.

As in the first embodiment, in the third embodiment, the processing unit 100 executes the computer program stored in the storing unit 140 to thereby, for example, execute the processing in the procedure of the flowchart of FIG. 5.

FIG. 16 is a flowchart for explaining an example of details of the state discrimination processing (the processing in step S14 in FIG. 5) in the third embodiment.

As shown in FIG. 16, first, the processing unit 100 sets a state of the user 2 to the “swim” (step S300).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S302) and determines on the basis of the acquired position information and the registered position P1 whether the distance between the position of the user 2 and the position P1 is equal to or smaller than a threshold (step S304). The threshold only has to be decided as appropriate.

If the distance between the position of the user 2 and the position P1 is not equal to or smaller than the threshold (N in step S304), the processing unit 100 performs the processing in steps S302 and S304 again. On the other hand, if the distance between the position of the user 2 and the position P1 is equal to or smaller than the threshold (Y in step S304), the processing unit 100 changes the state of the user 2 from the “swim” to the “running movement (running movement A)” (step S306).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S308) and determines on the basis of the acquired position information and the registered position P5 whether the distance between the position of the user 2 and the position P5 is equal to or smaller than the threshold (step S310).

If the distance between the position of the user 2 and the position P5 is not equal to or smaller than the threshold (N in step S310), the processing unit 100 performs the processing in steps S308 and S310 again. On the other hand, if the distance between the position of the user 2 and the position P5 is equal to or smaller than the threshold (Y in step S310), the processing unit 100 changes the state of the user 2 from the “running movement” to the “first transition” (step S312).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S314) and determines on the basis of the acquired position information and the registered position P2 whether the distance between the position of the user 2 and the position P2 is equal to or smaller than the threshold (step S316).

If the distance between the position of the user 2 and the position P2 is not equal to or smaller than the threshold (N in step S316), the processing unit 100 performs the processing in steps S314 and S316 again. On the other hand, if the distance between the position of the user 2 and the position P2 is equal to or smaller than the threshold (Y in step S316), the processing unit 100 changes the state of the user 2 from the “first transition” to the “bike” (step S318).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S320) and determines on the basis of the acquired position information and the registered position P3 whether the distance between the position of the user 2 and the position P3 is equal to or smaller than the threshold (step S322).

If the distance between the position of the user 2 and the position P3 is not equal to or smaller than the threshold (N in step S322), the processing unit 100 performs the processing in steps S320 and S322 again. On the other hand, if the distance between the position of the user 2 and the position P3 is equal to or smaller than the threshold (Y in step S322), the processing unit 100 changes the state of the user 2 from the “bike” to the “second transition” (step S324).

Subsequently, the processing unit 100 acquires positioning data (position information) from the GPS sensor 110 (step S326) and determines on the basis of the acquired position information and the registered position P4 whether the distance between the position of the user 2 and the position P4 is equal to or smaller than the threshold (step S328).

If the distance between the position of the user 2 and the position P4 is not equal to or smaller than the threshold (N in step S328), the processing unit 100 performs the processing in steps S326 and S328 again. On the other hand, if the distance between the position of the user 2 and the position P4 is equal to or smaller than the threshold (Y in step S328), the processing unit 100 changes the state of the user 2 from the “second transition” to the “run” (step S330).

The exercise information management system 1 in the third embodiment explained above can achieve the same effects as the effects in the first embodiment.

Further, with the exercise information management system 1 in the third embodiment, the processing unit 100 of the user terminal 3 can separately recognize a time required by the user 2 for movement from the goal point G1 of the swim to the transition area TA (a time required for the state “running movement”) and a time required by the user 2 for a change of clothes in the transition area TA, movement to the start point S2 (the riding line) of the bike, and the like (a time required for the state “first transition”). Therefore, the user 2 and the like can grasp, in detail, for example, points that should be improved in switching from the swim to the bike.

Note that, when the transition area TA and the start point S2 of the bike are far apart, similarly, the user 2 may register, in advance, a position P6 in a point of an exit of the transition area TA or the vicinity of the point present on a moving route to the start point S2 of the bike. The processing unit 100 may determine on the basis of positioning data (position information) whether the user 2 passes the position P6, determine that the user 2 is in the state “first transition” until the user 2 passes the position P6, and determine that the user 2 is in a state “running movement B” until the user 2 passes the position P2 after passing the position P6. Similarly, when the goal point G2 of the bike and the transition area TA are far apart, the user 2 may register, in advance, a position P7 in a point of an entrance of the transition area TA or the vicinity of the point present on a moving route from the goal point G2 of the bike. The processing unit 100 may determine on the basis of positioning data (position information) whether the user 2 passes the position P7, determine that the user 2 is in a state “running movement C” until the user 2 passes the position P7 after passing the position P3, and determine that the user 2 is in the state “second transition” after the user 2 passes the position P7. Similarly, when the transition area TA and the start point S3 of the run are far apart, the user 2 may register, in advance, a position P8 in a point of an exit of the transition area TA or the vicinity of the point present on a moving route to the start point S3 of the run. The processing unit 100 may determine on the basis of positioning data (position information) whether the user 2 passes the position P8, determine that the user 2 is in the state “second transition” until the user 2 passes the position P8, and determine that the user 2 is in a state “running movement D” until the user 2 passes the position P4 after passing the position P8.

4. Fourth Embodiment

The exercise information management system 1 in a fourth embodiment is explained below.

In the first embodiment, the processing unit 100 of the user terminal 3 automatically discriminates the state of the user 2 on the basis of the positioning data (the position information) of the GPS sensor 110 and the positions P1, P2, P3, and P4 registered in advance. In the exercise information management system 1 in the fourth embodiment, however, the processing unit 100 discriminates the state of the user 2 on the basis of the position information and at least either one of an output signal of the acceleration sensor 113 (an example of the “first motion sensor”) and an output signal of the pressure sensor 112.

4-1. Configuration of the System

In the exercise information management system 1 in this embodiment, as shown in FIG. 17, when starting a triathlon (when starting the swim in the start point S1), the user 2 performs measurement start operation on the user terminal 3.

The user terminal 3 discriminates, on the basis of position information obtained on the basis of satellite signals transmitted from the GPS (Global Positioning System) satellites 7 (an example of the “position information satellite”) and at least either one of an output signal of the acceleration sensor 113 (an example of the “first motion sensor”) (see FIG. 4) and an output signal of the pressure sensor 112 (see FIG. 4), a plurality of states including a state “swim” (an example of the “first exercise state”) in which the user 2 is carrying out swim (an example of the “first exercise event”), a state “bike” (an example of the “second exercise state”) in which the user 2 is carrying out bike (an example of the “second exercise event”), and a state “run” (an example of the “third exercise state”) in which the user 2 is carrying out run (an example of the “third exercise event”). In this embodiment, the plurality of states discriminated by the user terminal 3 include a state “first transition” (an example of the “first transition state”) halfway in transition from the “swim” to the “bike” and a state “second transition” (an example of the “second transition state”) halfway in transition from the “bike” to the “run”. That is, in this embodiment, the user terminal 3 discriminates five states of the “swim”, the “first transition”, the “bike”, the “second transition”, and the “run”.

4-2. Configuration of the User Terminal

The configuration of the user terminal 3 in this embodiment is the same as the configuration in the embodiments explained above.

In particular, in this embodiment, the processing unit 100 (the processor) performs, as one kind of the data processing, processing for measuring, on the basis of an output signal of the clocking unit 130, an elapsed time (the total elapsed time Ttotal) after the processing unit 100 receives a signal indicating measurement start operation from the operation unit 120.

The processing unit 100 performs, as one kind of the data processing, processing for discriminating the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2 on the basis of positioning data generated and output by the GPS sensor 110 (position information obtained on the basis of satellite signals transmitted from the GPS satellites 7) and at least either one of an output signal of the acceleration sensor 113 (an example of the “first motion sensor”) and an output signal of the pressure sensor 112.

In general, in the swim, a stroke of the arms of the user 2 is regular (has periodicity). Therefore, a waveform of an output signal of the acceleration sensor 113 is regular (has periodicity). Swimming speed (moving speed) of the user 2 is in a predetermined speed range (e.g., approximately 3 km/h). Further, since a state in which the arms of the user 2 are in the air and a state in which the arms of the user 2 are in the water are alternately repeated, the pressure sensor 112 alternately detects an air pressure and a water pressure. In the first transition, since the user 2 is performing a change of clothes and the like, the position of the user 2 hardly changes and the user 2 nearly stops (moving speed is nearly zero). In the bike, running speed (moving speed) of the user 2 is equal to or higher than a predetermined speed (e.g., 20 km/h). Since the user 2 receives wind, the pressure sensor 112 detects a wind pressure. In the second transition, since the user 2 is performing a change of clothes and the like, the position of the user 2 hardly changes and the user 2 nearly stops (moving speed is nearly zero). In the run, an arm swing of the user 2 is regular (has periodicity). Therefore, a waveform of an output signal of the acceleration sensor 113 is regular (has periodicity). Running speed (moving speed) of the user 2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h).

Therefore, the processing unit 100 may calculate moving speed of the user 2 on the basis of positioning data (position information) generated and output by the GPS sensor 110, determine whether a waveform of an output signal of the acceleration sensor 113 has periodicity, detect a change in pressure on the basis of an output signal of the pressure sensor 112, and discriminate the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2 on the basis of the moving speed of the user 2, whether the waveform of the output signal of the acceleration sensor 113 has periodicity, and the change in the pressure.
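
The following is a minimal sketch, in the spirit of the determinations described above, of how such rule-based discrimination could be organized in software. The State names, the speed bands, the discriminate function, and the Boolean feature inputs are illustrative assumptions and not the claimed implementation.

# Illustrative sketch of rule-based state discrimination (fourth-embodiment style).
# Feature names, thresholds, and the State enum are assumptions for illustration only.
from enum import Enum, auto


class State(Enum):
    UNDECIDED = auto()
    SWIM = auto()
    FIRST_TRANSITION = auto()
    BIKE = auto()
    SECOND_TRANSITION = auto()
    RUN = auto()


def discriminate(state, speed_kmh, accel_is_periodic, water_and_air_pressure, wind_pressure):
    """Advance the state one step from the current feature values.

    speed_kmh              -- moving speed derived from GPS positions
    accel_is_periodic      -- True if the acceleration waveform has periodicity
    water_and_air_pressure -- True if both a water pressure and an air pressure are detected
    wind_pressure          -- True if a wind pressure is detected
    """
    if state == State.UNDECIDED:
        if accel_is_periodic and 2.0 <= speed_kmh <= 4.0 and water_and_air_pressure:
            return State.SWIM
    elif state == State.SWIM:
        if speed_kmh < 1.0:                      # user nearly stops
            return State.FIRST_TRANSITION
    elif state == State.FIRST_TRANSITION:
        if speed_kmh >= 20.0 and wind_pressure:
            return State.BIKE
    elif state == State.BIKE:
        if speed_kmh < 1.0:                      # user nearly stops
            return State.SECOND_TRANSITION
    elif state == State.SECOND_TRANSITION:
        if accel_is_periodic and 8.0 <= speed_kmh <= 20.0:
            return State.RUN
    return state


if __name__ == "__main__":
    s = State.UNDECIDED
    # A toy sequence of feature snapshots: swim -> stop -> bike -> stop -> run.
    snapshots = [
        (3.0, True, True, False),
        (0.2, False, False, False),
        (25.0, False, False, True),
        (0.3, False, False, False),
        (12.0, True, False, False),
    ]
    for snap in snapshots:
        s = discriminate(s, *snap)
        print(s.name)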

The processing unit 100 in this embodiment can perform the data processing, the display processing, the sound output processing, and the communication processing as in the embodiments explained above.

4-3. Procedure of Processing of the User Terminal

The processing unit 100 of the user terminal 3 executes a computer program stored in the storing unit 140 (the storage medium, the ROM, or the RAM) to thereby execute the processing in the procedure of the flowchart of FIG. 5.

FIG. 18 is a flowchart for explaining an example of details of the state discrimination processing (the processing in step S14 in FIG. 5) in the fourth embodiment.

As shown in FIG. 18, in this embodiment, the processing unit 100 performs swim determination processing (step S400), first transition determination processing (step S500), bike determination processing (step S600), second transition determination processing (step S700), and run determination processing (step S800) in order.

As explained above, in the swim, the stroke of the arms of the user 2 is regular (has periodicity), the swimming speed of the user 2 is in the predetermined speed range (e.g., approximately 3 km/h), and the state in which the arms of the user 2 are in the air and the state in which the arms of the user 2 are in the water are alternately repeated. Therefore, in the swim determination processing (step S400), if an acceleration waveform (an output waveform of the acceleration sensor 113) is regular (has periodicity) (Y in step S401), if moving speed obtained by differentiating the positions of the user terminal 3 included in positioning data of the GPS sensor 110 is approximately 3 km/h (Y in step S402), and if a water pressure and an air pressure are detected on the basis of an output signal of the pressure sensor 112 (Y in step S403), the processing unit 100 determines that the user 2 is carrying out the swim and changes the state of the user 2 from an undecided state to the “swim” (step S404).

If a cycle in which a voltage of an output signal of the acceleration sensor 113 coincides with a threshold Vt1 is substantially fixed (within a predetermined range) for a predetermined time, the processing unit 100 may determine that an acceleration waveform is regular. Vt1 only has to be decided as appropriate. If moving speed of the user terminal 3 is 3 km/h−α1 or more and 3 km/h+α2 or less, the processing unit 100 may determine that the moving speed is approximately 3 km/h. α1 and α2 only have to be decided as appropriate. Since the water pressure is larger than the air pressure by a predetermined amount or more, when pressure applied to the user terminal 3 calculated using an output signal of the pressure sensor 112 periodically changes and a difference between a maximum value and a minimum value of the pressure is equal to or larger than a threshold Pt1, the processing unit 100 may determine that the water pressure and the air pressure are detected. Pt1 only has to be decided as appropriate.
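
The threshold checks described above could be sketched as follows, assuming a fixed sampling interval. The concrete values of Vt1, Pt1, α1, α2, the sampling rate, and the helper names are hypothetical and chosen only for illustration.

# Illustrative sketch of the threshold checks described above.
# Vt1, Pt1, alpha1/alpha2, and the sampling assumptions are hypothetical values.

def crossing_periods(samples, vt1, dt=0.01):
    """Return the intervals between upward crossings of the threshold vt1
    for a list of sensor samples taken every dt seconds."""
    times = [i * dt for i in range(1, len(samples))
             if samples[i - 1] < vt1 <= samples[i]]
    return [b - a for a, b in zip(times, times[1:])]

def is_regular(samples, vt1=0.5, tolerance=0.1):
    """The waveform is treated as regular when the crossing cycle is
    substantially fixed (spread within a tolerance) over the window."""
    periods = crossing_periods(samples, vt1)
    return len(periods) >= 3 and (max(periods) - min(periods)) <= tolerance

def is_about_3_kmh(speed_kmh, alpha1=1.0, alpha2=1.0):
    return (3.0 - alpha1) <= speed_kmh <= (3.0 + alpha2)

def water_and_air_detected(pressures, pt1=5000.0):
    """Both a water pressure and an air pressure are considered detected when
    the pressure applied to the terminal swings by at least pt1 (Pa)."""
    return (max(pressures) - min(pressures)) >= pt1

if __name__ == "__main__":
    import math
    accel = [math.sin(2 * math.pi * 1.2 * i * 0.01) for i in range(500)]   # ~1.2 Hz stroke
    press = [101325 + (8000 if (i // 80) % 2 else 0) for i in range(500)]  # air/water alternation
    print(is_regular(accel), is_about_3_kmh(3.2), water_and_air_detected(press))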

As explained above, in the first transition, since the user 2 is performing a change of clothes and the like, the position of the user 2 hardly changes. Therefore, in the first transition determination processing (step S500), if the moving speed of the user terminal 3 is nearly zero (the user terminal 3 nearly stops) (Y in step S501), the processing unit 100 determines that the user 2 is in the state of the first transition and changes the state of the user 2 from the “swim” to the “first transition” (step S502).

When the moving speed of the user terminal 3 is equal to or lower than a threshold β1, the processing unit 100 may determine that the user terminal 3 nearly stops. β1 only has to be decided as appropriate.

As explained above, in the bike, the running speed of the user 2 is the predetermined speed (e.g., 20 km/h) or more and the user 2 receives wind. Therefore, in the bike determination processing (step S600), if the moving speed of the user terminal 3 is 20 km/h or more (Y in step S601), and if a wind pressure is detected on the basis of the output signal of the pressure sensor 112 (Y in step S602), the processing unit 100 determines that the user 2 is carrying out the bike and changes the state of the user 2 from the “first transition” to the “bike” (step S603).

As explained above, in the second transition, since the user 2 is performing a change of clothes and the like, the position of the user 2 hardly changes. Therefore, in the second transition determination processing (step S700), if the moving speed of the user terminal 3 is nearly zero (the user terminal 3 nearly stops) (Y in step S701), the processing unit 100 determines that the user 2 is in the state of the second transition and changes the state of the user 2 from the “bike” to the “second transition” (step S702).

As explained above, in the run, the arm swing of the user 2 is regular (has periodicity) and the running speed of the user 2 is in the predetermined speed range (e.g., 8 km/h to 20 km/h). Therefore, in the run determination processing (step S800), if an acceleration waveform (an output waveform of the acceleration sensor 113) is regular (has periodicity) (Y in step S801) and the moving speed of the user terminal 3 is 8 km/h to 20 km/h (Y in step S802), the processing unit 100 determines that the user 2 is carrying out the run and changes the state of the user 2 from the “second transition” to the “run” (step S803).

4-4. Display Method of the State of the User

In this embodiment, as in the embodiments explained above, the user terminal 3 (the processing unit 100) discriminates the plurality of states of the user 2 according to the state discrimination processing (the processing in step S14 in FIG. 5) and causes the display unit 150 to display the discriminated states.

4-5. Display Method of Exercise Information

In this embodiment, as in the embodiments explained above, the user terminal 3 (the processing unit 100) transmits the exercise information of the user 2 generated by the generation processing of exercise information (the processing starting in step S12 and ending in step S18 in FIG. 5) to the information terminals 5 (5a and 5b) via the communication unit 170.

4-6. Action and Effects

As explained above, with the exercise information management system 1 in the fourth embodiment, the processing unit 100 of the user terminal 3 can automatically discriminate the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2 on the basis of the positioning data (the position information) of the GPS sensor 110 and at least either one of the output signal of the acceleration sensor 113 and the output signal of the pressure sensor 112. Therefore, the user 2 does not need to perform work when an athletic event carried out by the user 2 is switched from the “swim” to the “bike” or switched from the “bike” to the “run”. Therefore, the user 2 can concentrate on the triathlon.

With the exercise information management system 1 in the fourth embodiment, the states of the user 2 discriminated by the processing unit 100 of the user terminal 3 are displayed on the display unit 150 of the user terminal 3 together with the elapsed times. Therefore, the user 2 is capable of recognizing the displayed elapsed times of the states and performing adjustment such as an increase and a reduction in a pace. The states of the user 2 discriminated by the processing unit 100 of the user terminal 3 are displayed on the display unit of the information terminal 5a together with the elapsed times. Therefore, a person (e.g., a coach) carrying the information terminal 5a can recognize the elapsed times of the states of the user 2 and give advice or the like such as an increase or a reduction in a pace to the user 2. In particular, the user 2 and the like can separately recognize a time required for the state “swim” and a time required for the state “first transition”. Therefore, the user 2 and the like can recognize whether a pace of the swim is faster or slower than an assumption and whether a time required for switching from the swim to the bike is longer or shorter than an assumption and appropriately determine whether the pace should be increased or reduced in the bike. Similarly, the user 2 and the like can separately recognize a time required for the state “bike” and a time required for the state “second transition”. Therefore, the user 2 and the like can recognize whether a pace of the bike is faster or slower than an assumption and whether a time required for switching from the bike to the run is longer or shorter than an assumption and appropriately determine whether the pace should be increased or reduced in the run.

5. Fifth Embodiment

The exercise information management system 1 in a fifth embodiment is explained below.

In the exercise information management system 1 in the fifth embodiment, a detailed procedure of the state discrimination processing (step S14 in FIG. 5) for discriminating the state of the user 2 by the processing unit 100 of the user terminal 3 is different from the procedure in the fourth embodiment.

In the fifth embodiment, the processing unit 100 performs processing for discriminating a plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2 on the basis of positioning data (position information) generated and output by the GPS sensor 110 and an output signal of the acceleration sensor 113 and an output signal of the pressure sensor 112. As in the fourth embodiment, in the fifth embodiment, the processing unit 100 performs the swim determination processing (step S400), the first transition determination processing (step S500), the bike determination processing (step S600), the second transition determination processing (step S700), and the run determination processing (step S800) in order.

As shown in FIG. 19, as in the fourth embodiment, in the swim determination processing (step S400), if an acceleration waveform (an output waveform of the acceleration sensor 113) is regular (has periodicity) (Y in step S411), if moving speed obtained by differentiating the positions of the user terminal 3 included in positioning data of the GPS sensor 110 is approximately 3 km/h (Y in step S412), and if a water pressure and an air pressure are detected on the basis of an output signal of the pressure sensor 112 (Y in step S413), the processing unit 100 determines that the user 2 is carrying out the swim and changes the state of the user 2 from an undecided state to the “swim” (step S414).

In the first transition, since the user 2 is performing a change of clothes and the like, the movement of the arms of the user 2 is irregular (does not have periodicity) and a waveform of an output signal of the acceleration sensor 113 is irregular (does not have periodicity). The position of the user 2 hardly changes and the user 2 nearly stops (moving speed is nearly zero). Further, since the arms of the user 2 are always in the air, the pressure sensor 112 detects only an air pressure.

Therefore, as shown in FIG. 20, in the first transition determination processing (step S500), if an acceleration waveform (an output waveform of the acceleration sensor 113) is irregular (does not have periodicity) (Y in step S511), if moving speed of the user terminal 3 is nearly zero (the user terminal 3 nearly stops) (Y in S512), and if only an air pressure is detected on the basis of an output signal of the pressure sensor 112 (a water pressure is not detected) (Y in step S513), the processing unit 100 determines that the user 2 is in the state of the first transition and changes the state of the user 2 from the “swim” to the “first transition” (step S514).

When a cycle in which a voltage of an output signal of the acceleration sensor 113 coincides with a threshold Vt2 is not substantially fixed (within a predetermined range) for a predetermined time or when a state in which the voltage is smaller than the threshold Vt2 is continued for a predetermined time, the processing unit 100 may determine that the acceleration waveform is irregular. Vt2 only has to be decided as appropriate. When a state in which pressure applied to the user terminal 3 calculated using an output signal of the pressure sensor 112 is smaller than a threshold Pt2 is continued for a predetermined time, the processing unit 100 may determine that only an air pressure is detected. Pt2 only has to be decided as appropriate.
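
A minimal sketch of the fifth-embodiment style checks is shown below, together with the AND combination used in the first transition determination of FIG. 20. Vt2, Pt2, the window lengths, the nearly-stop threshold, and the function names are hypothetical assumptions for illustration.

# Illustrative sketch of the "irregular waveform" and "air pressure only" checks.
# Vt2, Pt2, and the window lengths are assumed values for this sketch.

def is_irregular(samples, vt2=0.5, dt=0.01, tolerance=0.1, quiet_time=2.0):
    """The waveform is treated as irregular when the crossing cycle is not
    substantially fixed, or when the signal stays below vt2 for quiet_time seconds."""
    times = [i * dt for i in range(1, len(samples))
             if samples[i - 1] < vt2 <= samples[i]]
    periods = [b - a for a, b in zip(times, times[1:])]
    stayed_low = all(s < vt2 for s in samples[-int(quiet_time / dt):])
    not_fixed = len(periods) < 3 or (max(periods) - min(periods)) > tolerance
    return stayed_low or not_fixed

def only_air_pressure(pressures, pt2=105000.0, dt=0.01, hold_time=2.0):
    """Only an air pressure is considered detected when the pressure stays below
    the water-pressure threshold pt2 for hold_time seconds."""
    n = int(hold_time / dt)
    return len(pressures) >= n and all(p < pt2 for p in pressures[-n:])

def first_transition_detected(accel_samples, speed_kmh, pressures):
    """FIG. 20 style AND combination: irregular acceleration waveform,
    nearly stopped, and only an air pressure detected."""
    return is_irregular(accel_samples) and speed_kmh < 1.0 and only_air_pressure(pressures)

if __name__ == "__main__":
    import random
    random.seed(0)
    accel = [random.uniform(-0.2, 0.4) for _ in range(500)]    # fidgeting during a change of clothes
    press = [101325 + random.uniform(-50, 50) for _ in range(500)]
    print(first_transition_detected(accel, 0.3, press))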

In the bike, the movement of the arms of the user 2 is irregular (does not have periodicity). Therefore, a waveform of an output signal of the acceleration sensor 113 is irregular (does not have periodicity). Running speed (moving speed) of the user 2 is predetermined speed (e.g., 20 km/h) or more. Since the user 2 receives wind, the pressure sensor 112 detects a wind pressure. Therefore, as shown in FIG. 21, in the bike determination processing (step S600), if an acceleration waveform (an output waveform of the acceleration sensor 113) is irregular (does not have periodicity) (Y in step S611), if moving speed of the user terminal 3 is 20 km/h or more (Y in step S612), and if a wind pressure is detected on the basis of an output signal of the pressure sensor 112 (Y in step S613), the processing unit 100 determines that the user 2 is carrying out the bike and changes the state of the user 2 from the “first transition” to the “bike” (step S614).

In the second transition, since the user 2 is performing a change of clothes and the like, the movement of the arms of the user 2 is irregular (does not have periodicity) and a waveform of an output signal of the acceleration sensor 113 is irregular (does not have periodicity). The position of the user 2 hardly changes and the user 2 nearly stops (moving speed is nearly zero). Further, since the arms of the user 2 are always in the air, the pressure sensor 112 detects only an air pressure. Therefore, as shown in FIG. 22, in the second transition determination processing (step S700), if an acceleration waveform (an output waveform of the acceleration sensor 113) is irregular (does not have periodicity) (Y in step S711), if moving speed of the user terminal 3 is nearly zero (the user terminal 3 nearly stops) (Y in step S712), and if only an air pressure is detected on the basis of an output signal of the pressure sensor 112 (a water pressure is not detected) (Y in step S713), the processing unit 100 determines that the user 2 is in the state of the second transition and changes the state of the user 2 from the “bike” to the “second transition” (step S714).

In the run, an arm swing of the user 2 is regular (has periodicity). Therefore, a waveform of an output signal of the acceleration sensor 113 is regular (has periodicity). Running speed (moving speed) of the user 2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h). Further, since the arms of the user 2 are always in the air, the pressure sensor 112 detects only an air pressure. Therefore, as shown in FIG. 23, in the run determination processing (step S800), if an acceleration waveform (an output waveform of the acceleration sensor 113) is regular (has periodicity) (Y in step S811), if moving speed of the user terminal 3 is 8 km/h to 20 km/h (Y in step S812), and if only an air pressure is detected on the basis of an output signal of the pressure sensor 112 (a water pressure is not detected) (Y in step S813), the processing unit 100 determines that the user 2 is carrying out the run and changes the state of the user 2 from the “second transition” to the “run” (step S814).

The exercise information management system 1 in the fifth embodiment explained above can achieve the same effects as the effects in the fourth embodiment.

Further, with the exercise information management system 1 in the fifth embodiment, the processing unit 100 of the user terminal 3 discriminates the states assuming that the state of the user 2 is switched when all of a condition concerning positioning data (position information) of the GPS sensor 110, a condition concerning an output signal of the acceleration sensor 113, and a condition concerning an output signal of the pressure sensor 112 are satisfied. Therefore, compared with the fourth embodiment in which the state of the user 2 is switched when only some of the conditions are satisfied, it is possible to more accurately discriminate the state of the user 2.

6. Sixth Embodiment

In the exercise information management system 1 in a sixth embodiment, a detailed procedure of the state discrimination processing (step S14 in FIG. 5) for discriminating a state of the user 2 by the processing unit 100 of the user terminal 3 is different from the procedures in the fourth and fifth embodiments.

In the sixth embodiment, the processing unit 100 performs processing for discriminating a plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2 on the basis of positioning data (position information) generated and output by the GPS sensor 110, at least either one of an output signal of the acceleration sensor 113 and an output signal of the pressure sensor 112, and at least either one of an output signal of the angular velocity sensor 114 (an example of a “second motion sensor”) and an output signal of the temperature sensor 116.

In general, in the swim, a stroke of the arms of the user 2 is regular (has periodicity). Therefore, a waveform of an output signal of the angular velocity sensor 114 is regular (has periodicity). A state in which the arms of the user 2 are in the air and a state in which the arms of the user 2 are in the water are alternately repeated. Therefore, the temperature sensor 116 detects a water temperature. In the first transition, since the user 2 is performing a change of clothes and the like, the movement of the arms of the user 2 is irregular (does not have periodicity) and a waveform of an output signal of the angular velocity sensor 114 is irregular (does not have periodicity). Since the arms of the user 2 are always in the air, the temperature sensor 116 detects an air temperature and a body temperature of the user 2. In the bike, the movement of the arms of the user 2 is irregular (does not have periodicity). Therefore, a waveform of an output signal of the angular velocity sensor 114 is irregular (does not have periodicity). Since the arms of the user 2 are always in the air, the temperature sensor 116 detects an air temperature and a body temperature of the user 2. In the second transition, since the user 2 is performing a change of clothes and the like, the movement of the arms of the user 2 is irregular (does not have periodicity) and a waveform of an output signal of the angular velocity sensor 114 is irregular (does not have periodicity). Since the arms of the user 2 are always in the air, the temperature sensor 116 detects an air temperature and a body temperature of the user 2. In the run, an arm swing of the user 2 is regular (has periodicity). Therefore, a waveform of an output signal of the angular velocity sensor 114 is regular (has periodicity). Since the arms of the user 2 are always in the air, the temperature sensor 116 detects an air temperature and a body temperature of the user 2.

Therefore, in addition to the same processing as the processing in the fifth embodiment, the processing unit 100 may further determine whether a waveform of an output signal of the angular velocity sensor 114 has periodicity and discriminate the plurality of states of the user 2 on the basis of the moving speed of the user 2, whether the waveform of the output signal of the acceleration sensor 113 has periodicity, the change in pressure, and whether the waveform of the output signal of the angular velocity sensor 114 has periodicity. Alternatively, in addition to the same processing as the processing in the fifth embodiment, the processing unit 100 may further detect a change in temperature on the basis of the output signal of the temperature sensor 116 and discriminate the plurality of states of the user 2 on the basis of the moving speed of the user 2, whether the waveform of the output signal of the acceleration sensor 113 has periodicity, the change in pressure, and the change in the temperature. Alternatively, the processing unit 100 may discriminate the plurality of states of the user 2 on the basis of the moving speed of the user 2, whether the waveform of the output signal of the acceleration sensor 113 has periodicity, the change in pressure, whether the waveform of the output signal of the angular velocity sensor 114 has periodicity, and the change in temperature.

As in the fourth and fifth embodiments, in the sixth embodiment, the processing unit 100 performs the swim determination processing (step S400), the first transition determination processing (step S500), the bike determination processing (step S600), the second transition determination processing (step S700), and the run determination processing (step S800) in order.

As explained above, in the swim, the stroke of the arms of the user 2 is regular (has periodicity), the swimming speed of the user 2 is in the predetermined speed range (e.g., approximately 3 km/h), and the state in which the arms of the user 2 are in the air and the state in which the arms of the user 2 are in the water are alternately repeated.

Therefore, as shown in FIG. 24, in the swim determination processing (step S400), first, the processing unit 100 resets a count value of a not-shown counter to 0 (step S421).

Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor 113) is regular (has periodicity) (Y in step S422), the processing unit 100 increments the count value by 1 (step S423).

If moving speed obtained by differentiating the positions of the user terminal 3 included in positioning data of the GPS sensor 110 is approximately 3 km/h (Y in step S424), the processing unit 100 increments the count value by 1 (step S425).

When a water pressure and an air pressure are detected on the basis of an output signal of the pressure sensor 112 (Y in step S426), the processing unit 100 increments the count value by 1 (step S427).

If an angular velocity waveform (an output waveform of the angular velocity sensor 114) is regular (has periodicity) (Y in step S428), the processing unit 100 increments the count value by 1 (step S429). If a cycle in which a voltage of an output signal of the angular velocity sensor 114 coincides with a threshold Vt3 is substantially fixed (within a predetermined range) for a predetermined time, the processing unit 100 may determine that the angular velocity waveform is regular. Vt3 only has to be decided as appropriate.

If a water temperature is detected on the basis of an output signal of the temperature sensor 116 (Y in S430), the processing unit 100 increments the count value by 1 (step S431).

If the count value is less than 3 (N in step S432), the processing unit 100 performs the processing in step S421 and subsequent steps again. If the count value is 3 or more (Y in step S432), the processing unit 100 determines that the user 2 is carrying out the swim and changes the state of the user 2 from the undecided state to the “swim” (step S433).

Note that, in the flowchart of FIG. 24, the order of the determinations in steps S422, S424, S426, S428, and S430 may be changed as appropriate.
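
A minimal sketch of the count-based (voting) determination of FIG. 24 might look like the following. The five Boolean condition inputs are assumed to come from checks like those described earlier, and the function name is an illustrative assumption.

# Illustrative sketch of the count-based (voting) determination of FIG. 24.
# The five condition values are assumed to come from the checks described above.

def swim_vote(accel_periodic, speed_about_3kmh, water_and_air_pressure,
              gyro_periodic, water_temperature):
    """Count how many of the five swim conditions hold; the state is switched to
    "swim" only when three or more of them are satisfied."""
    conditions = [accel_periodic, speed_about_3kmh, water_and_air_pressure,
                  gyro_periodic, water_temperature]
    return sum(1 for c in conditions if c) >= 3

if __name__ == "__main__":
    # GPS momentarily unreliable (speed condition fails) but four other cues agree.
    print(swim_vote(True, False, True, True, True))    # True  -> change the state to "swim"
    # Only two cues agree: keep the current state and re-evaluate from step S421.
    print(swim_vote(True, False, True, False, False))  # False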

As explained above, in the first transition, the user 2 is performing a change of clothes and the like. Therefore, the movement of the arms of the user 2 is irregular (does not have periodicity), the position of the user 2 hardly changes, and the arms of the user 2 are always in the air.

Therefore, as shown in FIG. 25, in the first transition determination processing (step S500), first, the processing unit 100 resets a count value of the not-shown counter to 0 (step S521).

Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor 113) is irregular (does not have periodicity) (Y in step S522), the processing unit 100 increments the count value by 1 (step S523).

If moving speed of the user terminal 3 is nearly zero (the user terminal 3 nearly stops) (Y in step S524), the processing unit 100 increments the count value by 1 (step S525).

If only an air pressure is detected on the basis of an output signal of the pressure sensor 112 (a water pressure is not detected) (Y in step S526), the processing unit 100 increments the count value by 1 (step S527).

If an angular velocity waveform (an output waveform of the angular velocity sensor 114) is irregular (does not have periodicity) (Y in step S528), the processing unit 100 increments the count value by 1 (step S529). If a cycle in which a voltage of an output signal of the angular velocity sensor 114 coincides with a threshold Vt4 is not substantially fixed (within a predetermined range) for a predetermined time or a state in which the voltage is smaller than the threshold Vt4 is continued for a predetermined time, the processing unit 100 may determine that the angular velocity waveform is irregular. Vt4 only has to be decided as appropriate.

If an air temperature and a body temperature of the user 2 are detected on the basis of an output signal of the temperature sensor 116 (Y in S530), the processing unit 100 increments the count value by 1 (step S531).

If the count value is less than 3 (N in step S532), the processing unit 100 performs the processing in step S521 and subsequent steps again. If the count value is 3 or more (Y in step S532), the processing unit 100 determines that the user 2 is in the state of the first transition and changes the state of the user 2 from the “swim” to the “first transition” (step S533).

Note that, in the flowchart of FIG. 25, the order of the determinations in steps S522, S524, S526, S528, and S530 may be changed as appropriate.

As explained above, in the bike, the movement of the arms of the user 2 is irregular (does not have periodicity), the running speed of the user 2 is the predetermined speed (e.g., 20 km/h) or more, the user 2 receives wind, and the arms of the user 2 are always in the air.

Therefore, as shown in FIG. 26, in the bike determination processing (step S600), first, the processing unit 100 resets a count value of the not-shown counter to 0 (step S621).

Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor 113) is irregular (does not have periodicity) (Y in step S622), the processing unit 100 increments the count value by 1 (step S623).

If moving speed of the user terminal 3 is 20 km/h or more (Y in step S624), the processing unit 100 increments the count value by 1 (step S625).

If a wind pressure is detected on the basis of an output signal of the pressure sensor 112 (Y in step S626), the processing unit 100 increments the count value by 1 (step S627).

If an angular velocity waveform (an output waveform of the angular velocity sensor 114) is irregular (does not have periodicity) (Y in step S628), the processing unit 100 increments the count value by 1 (step S629).

If an air temperature and a body temperature of the user 2 are detected on the basis of an output signal of the temperature sensor 116 (Y in step S630), the processing unit 100 increments the count value by 1 (step S631).

If the count value is less than 3 (N in step S632), the processing unit 100 performs the processing in step S621 and subsequent steps again. If the count value is 3 or more (Y in step S632), the processing unit 100 determines that the user 2 is carrying out the bike and changes the state of the user 2 from the “first transition” to the “bike” (step S633).

Note that in the flowchart of FIG. 26, the order of the determinations in steps S622, S624, S626, S628, and S630 may be changed as appropriate.

As explained above, in the second transition, the user 2 is performing a change of clothes and the like. Therefore, the movement of the arms of the user 2 is irregular (does not have periodicity), the position of the user 2 hardly changes, and the arms of the user 2 are always in the air.

As shown in FIG. 27, in the second transition determination processing (step S700), first, the processing unit 100 resets a count value of the not-shown counter to 0 (step S721).

Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor 113) is irregular (does not have periodicity) (Y in step S722), the processing unit 100 increments the count value by 1 (step S723).

If moving speed of the user terminal 3 is nearly zero (the user terminal 3 nearly stops) (Y in step S724), the processing unit 100 increments the count value by 1 (step S725).

If only an air pressure is detected on the basis of an output signal of the pressure sensor 112 (a water pressure is not detected) (Y in step S726), the processing unit 100 increments the count value by 1 (step S727).

If an angular velocity waveform (an output waveform of the angular velocity sensor 114) is irregular (does not have periodicity) (Y in step S728), the processing unit 100 increments the count value by 1 (step S729).

If an air temperature and a body temperature of the user 2 are detected on the basis of an output signal of the temperature sensor 116 (Y in step S730), the processing unit 100 increments the count value by 1 (step S731).

If the count value is less than 3 (N in step S732), the processing unit 100 performs the processing in step S721 and subsequent steps again. If the count value is 3 or more (Y in step S732), the processing unit 100 determines that the user 2 is in the state of the second transition and changes the state of the user 2 from the “bike” to the “second transition” (step S733).

Note that, in the flowchart of FIG. 27, the order of the determinations in steps S722, S724, S726, S728, and S730 may be changed as appropriate.

As explained above, in the run, the arm swing of the user 2 is regular (has periodicity), the running speed of the user 2 is in the predetermined speed range (e.g., 8 km/h to 20 km/h), and the arms of the user 2 are always in the air.

Therefore, as shown in FIG. 28, in the run determination processing (step S800), first, the processing unit 100 resets a count value of the not-shown counter to 0 (step S821).

Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor 113) is regular (has periodicity) (Y in step S822), the processing unit 100 increments the count value by 1 (step S823).

If moving speed of the user terminal 3 is 8 km/h to 20 km/h (Y in step S824), the processing unit 100 increments the count value by 1 (step S825).

If only an air pressure is detected on the basis of an output signal of the pressure sensor 112 (a water pressure is not detected) (Y in step S826), the processing unit 100 increments the count value by 1 (step S827).

If an angular velocity waveform (an output waveform of the angular velocity sensor 114) is regular (has periodicity) (Y in step S828), the processing unit 100 increments the count value by 1 (step S829).

If an air temperature and a body temperature of the user 2 are detected on the basis of an output signal of the temperature sensor 116 (Y in step S830), the processing unit 100 increments the count value by 1 (step S831).

If the count value is less than 3 (N in step S832), the processing unit 100 performs the processing in step S821 and subsequent steps again. If the count value is 3 or more (Y in step S832), the processing unit 100 determines that the user 2 is carrying out the run and changes the state of the user 2 from the “second transition” to the “run” (step S833).

Note that, in the flowchart of FIG. 28, the order of the determinations in steps S822, S824, S826, S828, and S830 may be changed as appropriate.

The exercise information management system 1 in the sixth embodiment explained above can achieve the same effects as the effects in the fourth embodiment or the fifth embodiment.

Further, with the exercise information management system 1 in the sixth embodiment, the processing unit 100 of the user terminal 3 discriminates the states assuming that the state of the user 2 is switched when three or more conditions among a condition concerning positioning data (position information) of the GPS sensor 110, a condition concerning an output signal of the acceleration sensor 113, a condition concerning an output signal of the pressure sensor 112, a condition concerning an output signal of the angular velocity sensor 114, and a condition concerning an output signal of the temperature sensor 116 are satisfied. Therefore, it is possible to more accurately discriminate the state of the user 2.

7. Seventh Embodiment

In the exercise information management system 1 in a seventh embodiment, a detailed procedure of the state discrimination processing (step S14 in FIG. 5) for discriminating the state of the user 2 by the processing unit 100 of the user terminal 3 is different from the procedures in the fourth to sixth embodiments.

In the fourth to sixth embodiments, the processing unit 100 calculates the elapsed time Ttran1 from the start to the end of the state “first transition” as the time equivalent to the first transition time (the elapsed time from the time when the user 2 passes the goal point G1 of the swim until the user 2 passes the start point S2 of the bike). Similarly, the processing unit 100 calculates the elapsed time Ttran2 from the start to the end of the state “second transition” as the time equivalent to the second transition time (the elapsed time from the time when the user 2 passes the goal point G2 of the bike until the user 2 passes the start point S3 of the run).

Actually, the first transition time is a sum of a time in which the user 2 moves from the goal point G1 of the swim to the transition area TA, a time required for a change of clothes and the like in the transition area TA, and a time in which the user 2 moves to the start point S2 (the riding line) of the bike. The second transition time is a sum of a time in which the user 2 moves from the goal point G2 (an alighting line) of the bike to a clothes change place in the transition area TA, a time required for a change of clothes and the like, and a time in which the user 2 moves to the start point S3 of the run.

Therefore, in the seventh embodiment, a state halfway in transition from the “swim” to the “bike” consists of three states, that is, a state “running movement A” in which the user 2 is moving from the goal point G1 of the swim to the transition area TA, a state “first transition” in which the user 2 is performing a change of clothes and the like in the transition area TA, and a state “running movement B” in which the user 2 is moving from the transition area TA to the start point S2 of the bike. The processing unit 100 of the user terminal 3 discriminates the three states “running movement A”, “first transition”, and “running movement B” respectively as separate states. Similarly, a state halfway in transition from the “bike” to the “run” consists of three states, that is, a state “running movement C” in which the user 2 moves from the goal point G2 of the bike to the transition area TA, a state “second transition” in which the user 2 is performing a change of clothes and the like in the transition area TA, and a state “running movement D” in which the user 2 is moving from the transition area TA to the start point S3 of the run. The processing unit 100 of the user terminal 3 discriminates the three states “running movement C”, “second transition”, and “running movement D” respectively as separate states.

That is, in the seventh embodiment, the processing unit 100 performs, as one kind of the data processing, processing for discriminating a plurality of states “swim”, “running movement A”, “first transition”, “running movement B”, “bike”, “running movement C”, “second transition”, “running movement D”, and “run” of the user 2 on the basis of positioning data (position information) of the GPS sensor 110, an output signal of the acceleration sensor 113, an output signal of the pressure sensor 112, an output signal of the angular velocity sensor 114, and an output signal of the temperature sensor 116.

The processing unit 100 performs, as one kind of the data processing, processing for measuring, on the basis of an output signal of the clocking unit 130, the elapsed time Tswim of the state “swim”, an elapsed time TmoveA of the state “running movement A”, the elapsed time Ttran1 of the state “first transition”, an elapsed time TmoveB of the state “running movement B”, the elapsed time Tbike of the state “bike”, an elapsed time TmoveC of the state “running movement C”, the elapsed time Ttran2 of the state “second transition”, an elapsed time TmoveD of the state “running movement D”, and the elapsed time Trun of the state “run”.

The processing unit 100 performs, as one kind of the data processing, processing for ending, when receiving a signal indicating measurement end operation from the operation unit 120, the measurement processing of the total elapsed time Ttotal, the discrimination processing of the “swim”, the “running movement A”, the “first transition”, the “running movement B”, the “bike”, the “running movement C”, the “second transition”, the “running movement D”, and the “run”, and the measurement processing of the elapsed times Tswim, TmoveA, Ttran1, TmoveB, Tbike, TmoveC, Ttran2, TmoveD, and Trun of the states and causing the incorporated storing unit 140 to store the total elapsed time Ttotal and the elapsed times Tswim, TmoveA, Ttran1, TmoveB, Tbike, TmoveC, Ttran2, TmoveD, and Trun.
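
A minimal sketch of the per-state elapsed-time bookkeeping described above is shown below, assuming a monotonic clock source. The class name, method names, and the deterministic clock used in the example are illustrative assumptions, not the claimed implementation.

# Illustrative sketch of per-state elapsed-time bookkeeping for the nine states.
# State names and the timing source are assumptions for this sketch.
import time
from collections import defaultdict


class ElapsedTimeTracker:
    """Accumulates the total elapsed time and the elapsed time of each state."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._start = None
        self._state = None
        self._state_start = None
        self.elapsed = defaultdict(float)   # e.g. elapsed["swim"], elapsed["running movement A"], ...

    def start(self):
        self._start = self._clock()

    def set_state(self, state):
        now = self._clock()
        if self._state is not None:
            self.elapsed[self._state] += now - self._state_start
        self._state, self._state_start = state, now

    def stop(self):
        now = self._clock()
        if self._state is not None:
            self.elapsed[self._state] += now - self._state_start
            self._state = None
        return now - self._start            # total elapsed time Ttotal


if __name__ == "__main__":
    t = [0.0]
    fake_clock = lambda: t[0]               # deterministic clock for the example
    tracker = ElapsedTimeTracker(clock=fake_clock)
    tracker.start()
    for state, duration in [("swim", 1800), ("running movement A", 90),
                            ("first transition", 120), ("running movement B", 60),
                            ("bike", 3600)]:
        tracker.set_state(state)
        t[0] += duration
    total = tracker.stop()
    print(total, dict(tracker.elapsed))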

The processing unit 100 may perform, as one kind of the display processing, processing for causing the display unit 150 to display at least one of the plurality of states “swim”, “running movement A”, “first transition”, “running movement B”, “bike”, “running movement C”, “second transition”, “running movement D”, and “run” of the user 2. In this case, the display unit 150 functions as a notifying unit that notifies a state discriminated by the processing unit 100.

The processing unit 100 may perform, as one kind of the display processing, processing for causing the display unit 150 to display at least a part of the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, TmoveA, Ttran1, TmoveB, Tbike, TmoveC, Ttran2, TmoveD, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of the user 2.

The processing unit 100 may perform, as one kind of the sound output processing, processing for causing the sound output unit 160 to output, as sound, at least one of the plurality of states “swim”, “running movement A”, “first transition”, “running movement B”, “bike”, “running movement C”, “second transition”, “running movement D”, and “run” of the user 2. In this case, the sound output unit 160 functions as a notifying unit that notifies a state discriminated by the processing unit 100.

The processing unit 100 may perform, as one kind of the communication processing, processing for transmitting at least one of the plurality of states “swim”, “running movement A”, “first transition”, “running movement B”, “bike”, “running movement C”, “second transition”, “running movement D”, and “run” of the user 2 to the information terminals 5 (5a and 5b) via the communication unit 170. In this case, the communication unit 170 functions as a notifying unit that notifies a state discriminated by the processing unit 100.

The processing unit 100 may perform, as one kind of the communication processing, processing for transmitting the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, TmoveA, Ttran1, TmoveB, Tbike, TmoveC, Ttran2, TmoveD, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of the user 2 to the information terminals 5 (5a and 5b) via the communication unit 170.

As in the fourth to sixth embodiments, in the seventh embodiment, the processing unit 100 executes the computer program stored in the storing unit 140 to thereby, for example, execute the processing in the procedure of the flowchart of FIG. 5.

FIG. 29 is a flowchart for explaining an example of details of the state discrimination processing (the processing in step S14 in FIG. 5) in the seventh embodiment.

As shown in FIG. 29, the processing unit 100 performs the swim determination processing (step S400), running movement A determination processing (step S450), the first transition determination processing (step S500), running movement B determination processing (step S550), the bike determination processing (step S600), running movement C determination processing (step S650), the second transition determination processing (step S700), running movement D determination processing (step S750), and the run determination processing (step S800) in order. The detailed procedures of the swim determination processing (step S400), the first transition determination processing (step S500), the bike determination processing (step S600), the second transition determination processing (step S700), and the run determination processing (step S800) are the same as the procedures in the sixth embodiment (FIGS. 24 to 28). Therefore, illustration of flowcharts and explanation of the flowcharts are omitted concerning the detailed procedures.

In the state “running movement A” in which the user 2 is moving from the goal point G1 of the swim to the transition area TA, an arm swing of the user 2 is regular (has periodicity), running speed of the user 2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h), and the arms of the user 2 are always in the air.

Therefore, as shown in FIG. 30, in the running movement A determination processing (step S450), first, the processing unit 100 resets a count value of the not-shown counter to 0 (step S451).

Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor 113) is regular (has periodicity) (Y in step S452), the processing unit 100 increments the count value by 1 (step S453).

If moving speed of the user terminal 3 is 8 km/h to 20 km/h (Y in step S454), the processing unit 100 increments the count value by 1 (step S455).

If only an air pressure is detected on the basis of an output signal of the pressure sensor 112 (a water pressure is not detected) (Y in step S456), the processing unit 100 increments the count value by 1 (step S457).

If an angular velocity waveform (an output waveform of the angular velocity sensor 114) is regular (has periodicity) (Y in step S458), the processing unit 100 increments the count value by 1 (step S459).

If an air temperature and a body temperature of the user 2 are detected on the basis of an output signal of the temperature sensor 116 (Y in S460), the processing unit 100 increments the count value by 1 (step S461).

If the count value is less than 3 (N in step S462), the processing unit 100 performs the processing in step S451 and subsequent steps again. If the count value is 3 or more (Y in step S462), the processing unit 100 determines that the user 2 is in the state of the running movement A and changes the state of the user 2 from the “swim” to the “running movement A” (step S463).

Note that, in the flowchart of FIG. 30, the order of the determinations in steps S452, S454, S456, S458, and S460 may be changed as appropriate.

In the state “running movement B” in which the user 2 is moving from the transition area TA to the start point S2 of the bike, since the user 2 grips a handle of a bicycle and runs, the arms of the user 2 slightly vibrate, running speed of the user 2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h), and the arms of the user 2 are always in the air.

Therefore, as shown in FIG. 31, in the running movement B determination processing (step S550), first, the processing unit 100 resets a count value of the not-shown counter to 0 (step S551).

Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor 113) is oscillatory (Y in step S552), the processing unit 100 increments the count value by 1 (step S553). If a cycle in which a voltage of an output signal of the acceleration sensor 113 coincides with a threshold Vt5 is within a predetermined range continuously for a predetermined time, the processing unit 100 may determine that the acceleration waveform is oscillatory. Vt5 only has to be decided as appropriate.

If moving speed of the user terminal 3 is 8 km/h to 20 km/h (Y in step S554), the processing unit 100 increments the count value by 1 (step S555).

If only an air pressure is detected on the basis of an output signal of the pressure sensor 112 (a water pressure is not detected) (Y in step S556), the processing unit 100 increments the count value by 1 (step S557).

If an angular velocity waveform (an output waveform of the angular velocity sensor 114) is irregular (does not have periodicity) (Y in step S558), the processing unit 100 increments the count value by 1 (step S559).

If an air temperature and a body temperature of the user 2 are detected on the basis of an output signal of the temperature sensor 116 (Y in S560), the processing unit 100 increments the count value by 1 (step S561).

If the count value is less than 3 (N in step S562), the processing unit 100 performs the processing in step S551 and subsequent steps again. If the count value is 3 or more (Y in step S562), the processing unit 100 determines that the user 2 is in the state of the running movement B and changes the state of the user 2 from the “first transition” to the “running movement B” (step S563).

Note that, in the flowchart of FIG. 31, the order of the determinations in steps S552, S554, S556, S558, and S560 may be changed as appropriate.
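
A minimal sketch of the “oscillatory” determination described for step S552 above is shown below, assuming a fixed sampling interval; Vt5, the cycle bounds, and the hold time are hypothetical values chosen only for illustration.

# Illustrative sketch of the "oscillatory" waveform check used in steps S552 and S652.
# Vt5 and the cycle bounds are assumed values; a short, roughly constant crossing cycle
# is taken to mean the small vibration from gripping the handlebar while walking.
import math

def is_oscillatory(samples, vt5=0.1, dt=0.01, cycle_min=0.02, cycle_max=0.2, hold_time=2.0):
    times = [i * dt for i in range(1, len(samples))
             if samples[i - 1] < vt5 <= samples[i]]
    periods = [b - a for a, b in zip(times, times[1:])]
    if not periods:
        return False
    covered = times[-1] - times[0]
    return covered >= hold_time and all(cycle_min <= p <= cycle_max for p in periods)

if __name__ == "__main__":
    vib = [0.15 * math.sin(2 * math.pi * 12 * i * 0.01) for i in range(400)]  # ~12 Hz hand vibration
    print(is_oscillatory(vib))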

In the state “running movement C” in which the user 2 is moving from the goal point G2 of the bike to the transition area TA, since the user 2 grips the handle of the bicycle and runs, the arms of the user 2 slightly vibrate, running speed of the user 2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h), and the arms of the user 2 are always in the air.

Therefore, as shown in FIG. 32, in the running movement C determination processing (step S650), first, the processing unit 100 resets a count value of the not-shown counter to 0 (step S651).

Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor 113) is oscillatory (Y in step S652), the processing unit 100 increments the count value by 1 (step S653).

If moving speed of the user terminal 3 is 8 km/h to 20 km/h (Y in step S654), the processing unit 100 increments the count value by 1 (step S655).

If only an air pressure is detected on the basis of an output signal of the pressure sensor 112 (a water pressure is not detected) (Y in step S656), the processing unit 100 increments the count value by 1 (step S657).

If an angular velocity waveform (an output waveform of the angular velocity sensor 114) is irregular (does not have periodicity) (Y in step S658), the processing unit 100 increments the count value by 1 (step S659).

If an air temperature and a body temperature of the user 2 are detected on the basis of an output signal of the temperature sensor 116 (Y in S660), the processing unit 100 increments the count value by 1 (step S661).

If the count value is less than 3 (N in step S662), the processing unit 100 performs the processing in step S651 and subsequent steps again. If the count value is 3 or more (Y in step S662), the processing unit 100 determines that the user 2 is in the state of the running movement C and changes the state of the user 2 from the “bike” to the “running movement C” (step S663).

Note that, in the flowchart of FIG. 32, the order of the determinations in steps S652, S654, S656, S658, and S660 may be changed as appropriate.

In the state “running movement D” in which the user 2 is moving from the transition area TA to the start point S3 of the run, an arm swing of the user 2 is regular (has periodicity), running speed of the user 2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h), and the arms of the user 2 are always in the air.

Therefore, as shown in FIG. 33, in the running movement D determination processing (step S750), first, the processing unit 100 resets a count value of the not-shown counter to 0 (step S751).

Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor 113) is regular (has periodicity) (Y in step S752), the processing unit 100 increments the count value by 1 (step S753).

If moving speed of the user terminal 3 is 8 km/h to 20 km/h (Y in step S754), the processing unit 100 increments the count value by 1 (step S755).

If only an air pressure is detected on the basis of an output signal of the pressure sensor 112 (a water pressure is not detected) (Y in step S756), the processing unit 100 increments the count value by 1 (step S757).

If an angular velocity waveform (an output waveform of the angular velocity sensor 114) is regular (has periodicity) (Y in step S758), the processing unit 100 increments the count value by 1 (step S759).

If an air temperature and a body temperature of the user 2 are detected on the basis of an output signal of the temperature sensor 116 (Y in S760), the processing unit 100 increments the count value by 1 (step S761).

If the count value is less than 3 (N in step S762), the processing unit 100 performs the processing in step S751 and subsequent steps again. If the count value is 3 or more (Y in step S762), the processing unit 100 determines that the user 2 is in the state of the running movement D and changes the state of the user 2 from the “second transition” to the “running movement D” (step S763).

Note that, in the flowchart of FIG. 33, the order of the determinations in steps S752, S754, S756, S758, and S760 may be changed as appropriate.

The exercise information management system 1 in the seventh embodiment explained above can achieve the same effects as the effects in the fourth embodiment, the fifth embodiment, or the sixth embodiment.

Further, with the exercise information management system 1 in the seventh embodiment, the processing unit 100 of the user terminal 3 can separately recognize a time required by the user 2 for movement from the goal point G1 of the swim to the transition area TA (a time required for the state “running movement A”), a time required by the user 2 for a change of clothes and the like in the transition area TA (a time required for the state “first transition”), and a time required by the user 2 for movement from the transition area TA to the start point S2 of the bike (a time required for the state “running movement B”). Therefore, the user 2 and the like can grasp, in detail, for example, points that should be improved in switching from the swim to the bike.

Similarly, with the exercise information management system 1 in the seventh embodiment, the processing unit 100 of the user terminal 3 can separately recognize a time required by the user 2 for movement from the goal point G2 of the bike to the transition area TA (a time required for the state “running movement C”), a time required by the user 2 for a change of clothes and the like in the transition area TA (a time required for the state “second transition”), and a time required by the user 2 for movement from the transition area TA to the start point S3 of the run (a time required for the state “running movement D”). Therefore, the user 2 and the like can grasp, in detail, for example, points that should be improved in switching from the bike to the run.

8. Modifications

The invention is not limited to the embodiments. Various modified implementations are possible within the scope of the gist of the invention. Modifications are explained below. Note that the same components as the components in the embodiments are denoted by the same reference numerals and signs and redundant explanation of the components is omitted.

For example, the processing unit 100 of the user terminal 3 may discriminate the plurality of states of the user 2 and, when the discriminated state is the “first transition” or the “second transition”, generate an image including a plurality of flashing objects and cause the display unit 150 or the display unit of the information terminal 5 (5a) to display the image. In FIGS. 34 and 35, examples of images at the time when the state of the user 2 is the “first transition” in this modification are shown.

In the example shown in FIG. 34, in images A6-1 to A6-5 at the time when the state of the user 2 is the “first transition”, three triangular objects, which are parts of an object OB6 for reminding that the user 2 is transitioning from the swim to the bike, are flashed at cycles different from one another. Specifically, the three triangular objects are extinguished for one second (the image A6-1), subsequently, only the triangular object at the left end is lit for one second (the image A6-2), only two triangular objects from the left end are lit for one second (the image A6-3), the three triangular objects are lit for one second (the image A6-4), and the three triangular objects are extinguished for one second again (the image A6-5). Although not shown in the figure, similarly, in images at the time when the state of the user 2 is the “second transition”, three triangular objects, which are parts of an object for reminding that the user 2 is transitioning from the bike to the run, may be flashed at cycles different from one another.

In the example shown in FIG. 35, in images A7-1 to A7-4 at the time when the state of the user 2 is the “first transition”, three triangular objects, which are parts of an object OB7 for reminding that the user 2 is transitioning from the swim to the bike, are flashed at timings different from one another and at the same cycle. Specifically, only the triangular object at the left end is lit for one second (the image A7-1), subsequently, only the triangular object in the middle is lit for one second (the image A7-2), only the triangular object at the right end is lit for one second (the image A7-3), and only the triangular object at the left end is lit for one second again (the image A7-4). Although not shown in the figure, similarly, in images at the time when the state of the user 2 is the “second transition”, three triangular objects, which are parts of an object for reminding that the user 2 is transitioning from the bike to the run, may be flashed at timings different from one another at the same cycle.
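
A minimal sketch of the two flashing patterns described above is shown below, assuming a one-second frame interval and a simple tuple representation of the three triangular objects; an actual device would drive the display unit 150 or the display unit of the information terminal 5 (5a) rather than print frames.

# Illustrative sketch of the two flashing patterns of FIGS. 34 and 35.
# The one-second frame timing and the tuple representation (left, middle, right lit)
# are assumptions for this sketch.
import itertools

# FIG. 34 style: the three triangles flash at cycles different from one another,
# producing an "extinguished -> 1 lit -> 2 lit -> 3 lit" sweep that repeats.
PATTERN_DIFFERENT_CYCLES = [
    (False, False, False),
    (True,  False, False),
    (True,  True,  False),
    (True,  True,  True),
]

# FIG. 35 style: the three triangles flash at timings different from one another
# and at the same cycle, so exactly one triangle is lit at a time and it rotates.
PATTERN_SAME_CYCLE = [
    (True,  False, False),
    (False, True,  False),
    (False, False, True),
]

def frames(pattern, seconds):
    """Return one frame per second for the requested duration."""
    return list(itertools.islice(itertools.cycle(pattern), seconds))

if __name__ == "__main__":
    for frame in frames(PATTERN_SAME_CYCLE, 6):
        print("".join("▲" if lit else "△" for lit in frame))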

For example, the processing unit 100 of the user terminal 3 may discriminate a plurality of states (e.g., the “swim”, the “first transition”, the “bike”, the “second transition”, and the “run”) of the user 2, generate an image including information indicating a degree to which a discriminated state has ended (a degree to which the discriminated state remains), and cause the display unit 150 or the display unit of the information terminal 5 (5a) to display the image. For example, before starting the triathlon, the user 2 registers, in the storing unit 140 of the user terminal 3, respective kinds of position information of the start point S1 and the goal point G1 of the swim, the start point S2 and the goal point G2 of the bike, and the start point S3 and the goal point G3 of the run. The processing unit 100 can calculate, on the basis of the respective kinds of position information registered in the storing unit 140 and a time series of positioning data (position information) generated and output by the GPS sensor 110, a degree to which a present state of the user 2 has ended (a degree to which the present state remains).

In FIG. 36, an example of an image at the time when the state of the user 2 is the “bike” in this modification is shown. In the example shown in FIG. 36, like the image A3 shown in FIG. 7, an image A8 at the time when the state of the user 2 is the “bike” includes the object OB3, the total elapsed time Ttotal (1:01:45) serving as the total time, and the elapsed time Tbike (0:26:59) of the bike and further includes an object OB8. The object OB8 includes ten rectangular objects arranged in one row. All of the ten rectangular objects are painted in white at a point in time when the user 2 starts the bike. Every time the user 2 advances by a predetermined distance, the rectangular objects are painted in black one by one in order from the left end. When the bike approaches the end (or the bike ends), all of the rectangular objects are painted in black. In the example shown in FIG. 36, the seven rectangular objects are painted in black to indicate that approximately 70% of the entire bike course has been completed (approximately 30% of the entire course remains). The user 2 can view the image A8 and the like displayed on the display unit 150 or receive contact from the coach or the like, who views the image A8 and the like displayed on the display unit of the information terminal 5 (5a), and adjust the subsequent pace.
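
A minimal sketch of how such a ten-segment indicator could be derived from position information is shown below, assuming progress is approximated by the straight-line distance from the start point rather than distance along the course; the coordinates and function names are hypothetical.

# Illustrative sketch of the object OB8 progress indicator: ten segments filled in
# proportion to the distance covered in the current leg. The haversine distance and
# the start/goal coordinates are assumptions for this sketch.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def progress_bar(start, goal, current, segments=10):
    """Return a ten-segment bar, '#' for completed segments and '-' for remaining
    ones, based on how far the current position is along the start-to-goal distance."""
    total = haversine_km(*start, *goal)
    done = haversine_km(*start, *current)
    ratio = 0.0 if total == 0 else min(done / total, 1.0)
    filled = int(round(ratio * segments))
    return "#" * filled + "-" * (segments - filled), ratio

if __name__ == "__main__":
    s2 = (35.000, 139.000)    # hypothetical start point of the bike
    g2 = (35.000, 139.400)    # hypothetical goal point of the bike
    here = (35.000, 139.280)  # current GPS fix
    bar, ratio = progress_bar(s2, g2, here)
    print(bar, f"{ratio:.0%}")   # e.g. #######--- 70%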

For example, the processing unit 100 of the user terminal 3 may discriminate the plurality of states (e.g., the “swim”, the “first transition”, the “bike”, the “second transition”, and the “run”) of the user 2 and, when the discriminated state is the “first transition”, cause the display unit 150 or the display unit of the information terminal 5 (5a) to display the elapsed time Ttran1 from the start of the “first transition” so as to be comparable with a target time set in advance. Similarly, when the discriminated state is the “second transition”, the processing unit 100 may cause the display unit 150 or the display unit of the information terminal 5 (5a) to display the elapsed time Ttran2 from the start of the “second transition” so as to be comparable with a target time set in advance. For example, before starting the triathlon, the user 2 sets target times for the first transition and the second transition, or sets, as the target times, past times of other users (friends, etc.), professional athletes, or the user 2 himself/herself, and registers the target times in the storing unit 140 of the user terminal 3. The processing unit 100 may display the elapsed time Ttran1 and the elapsed time Ttran2 so as to be comparable with the respective target times.

In FIG. 37, an example of an image at the time when the state of the user 2 is the “first transition” is shown. In the example shown in FIG. 37, like the image A2 shown in FIG. 7, an image A9 at the time when the state of the user 2 is the “first transition” includes the total elapsed time Ttotal (0:30:12) serving as the total time and the elapsed time Ttran1 (0:01:05) of the first transition and further includes a target time (0:01:30). Although not shown in the figure, similarly, like the image A4 shown in FIG. 7, the image at the time when the state of the user 2 is the “second transition” includes the object OB4, the total elapsed time Ttotal serving as the total time, and the elapsed time Ttran2 of the second transition and further includes a target time. The user 2 can view the image A9 and the like displayed on the display unit 150 or receive contact from the coach or the like, who views the image A9 and the like displayed on the display unit of the information terminal 5 (5a), grasp whether the elapsed times of the first transition and the second transition are longer or shorter than the target times, and adjust the subsequent pace.
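
For illustration only, a minimal sketch of formatting such a comparable readout of an elapsed transition time against a target time follows; the function name and the output layout are assumptions and do not reflect the actual screen layout of FIG. 37.

def format_transition_readout(elapsed_s, target_s):
    # Formats an elapsed transition time (e.g., Ttran1) so that it can be
    # compared at a glance with the target time registered in advance.
    def hms(t):
        t = int(t)
        return "{}:{:02d}:{:02d}".format(t // 3600, (t % 3600) // 60, t % 60)
    diff = elapsed_s - target_s
    sign = "+" if diff >= 0 else "-"
    return "elapsed {} / target {} ({}{})".format(
        hms(elapsed_s), hms(target_s), sign, hms(abs(diff)))

# Example matching FIG. 37: format_transition_readout(65, 90)
# returns "elapsed 0:01:05 / target 0:01:30 (-0:00:25)".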

For example, in the first embodiment or the second embodiment, the processing unit 100 of the user terminal 3 may change the state of the user 2 from the “swim” to the “running movement A” when the user 2 passes the position P1 (or any one of the plurality of positions P1). When the state of the user 2 is the state “running movement A”, the processing unit 100 may change the state of the user 2 to the “first transition” upon determining, on the basis of positioning data (position information) of the GPS sensor 110, that the user 2 stops. When the state of the user 2 is the state “first transition”, the processing unit 100 may change the state of the user 2 to the “running movement B” upon determining, on the basis of positioning data (position information) of the GPS sensor 110, that the user 2 starts to move. Similarly, the processing unit 100 of the user terminal 3 may change the state of the user 2 from the “bike” to the “running movement C” when the user 2 passes the position P3 (or any one of the plurality of positions P3). When the state of the user 2 is the state “running movement C”, the processing unit 100 may change the state of the user 2 to the “second transition” upon determining, on the basis of positioning data (position information) of the GPS sensor 110, that the user 2 stops. When the state of the user 2 is the state “second transition”, the processing unit 100 may change the state of the user 2 to the “running movement D” upon determining, on the basis of positioning data (position information) of the GPS sensor 110, that the user 2 starts to move.
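
As a purely illustrative sketch of this modified state transition logic, the following may be considered; the class name, the speed threshold, and the assumption that pass detection of P1/P3 and the moving speed are obtained elsewhere from the positioning data are all assumptions not fixed by the embodiments.

STOP_SPEED_MPS = 0.5  # assumed threshold: below this the user 2 is treated as stopped

class TransitionStateMachine:
    # Sketch of the modified discrimination: pass P1 or P3, then stop, then move again.
    # The transitions from "running movement B" to the "bike" and from
    # "running movement D" to the "run" are handled elsewhere (e.g., by the bike
    # determination processing) and are omitted here.

    def __init__(self):
        self.state = "swim"

    def update(self, passed_p1, passed_p3, speed_mps):
        # passed_p1 / passed_p3: whether the user 2 has just passed the registered
        # position P1 / P3, determined elsewhere from the positioning data.
        # speed_mps: moving speed derived from positioning data of the GPS sensor 110.
        moving = speed_mps >= STOP_SPEED_MPS
        if self.state == "swim" and passed_p1:
            self.state = "running movement A"
        elif self.state == "running movement A" and not moving:
            self.state = "first transition"
        elif self.state == "first transition" and moving:
            self.state = "running movement B"
        elif self.state == "bike" and passed_p3:
            self.state = "running movement C"
        elif self.state == "running movement C" and not moving:
            self.state = "second transition"
        elif self.state == "second transition" and moving:
            self.state = "running movement D"
        return self.state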

For example, the processing unit 100 of the user terminal 3 may discriminate the “swim”, the “bike”, and the “run” as the plurality of states of the user 2 and does not need to discriminate the “first transition” and the “second transition”. In this case, for example, the user 2 may register, in advance, a position P11 at the start point S2 of the bike or in the vicinity of the start point S2 and a position P12 at the start point S3 of the run or in the vicinity of the start point S3. The processing unit 100 of the user terminal 3 may determine on the basis of positioning data (position information) of the GPS sensor 110 whether the user 2 passes the position P11 and whether the user 2 passes the position P12, determine that the user 2 is in the state “swim” until the user 2 passes the position P11, determine that the user 2 is in the state “bike” until the user 2 passes the position P12 after passing the position P11, and determine that the user 2 is in the state “run” after the user 2 passes the position P12.
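
For illustration only, a minimal sketch of this simplified discrimination follows; the function name is an assumption, and detection of whether the positions P11 and P12 have been passed is assumed to be performed elsewhere from the positioning data.

def discriminate_three_states(passed_p11, passed_p12):
    # passed_p11 / passed_p12: whether the user 2 has already passed the registered
    # positions P11 (at or near the start point S2 of the bike) and P12 (at or near
    # the start point S3 of the run).
    if not passed_p11:
        return "swim"
    if not passed_p12:
        return "bike"
    return "run"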

For example, in the fourth to seventh embodiments, the processing unit 100 of the user terminal 3 may discriminate the “swim”, the “bike”, and the “run” as the plurality of states of the user 2 and does not need to discriminate the “first transition” and the “second transition”. In this case, for example, the processing unit 100 of the user terminal 3 does not have to perform the first transition determination and the second transition determination.

For example, the user terminal 3 may include a plurality of pressure sensors 112a to 112d. In the bike determination processing, the processing unit 100 may more accurately detect a wind pressure on the basis of output signals of the plurality of pressure sensors 112a to 112d. FIG. 38 is a diagram showing a disposition example of the plurality of pressure sensors 112a to 112d and is a plan view of the user terminal 3. When the user 2 wears the user terminal 3 on the left hand and travels by bike, wind hits the user terminal 3 from the right side in FIG. 38. Therefore, the pressure sensor 112b detects a positive wind pressure, the pressure sensor 112a detects a negative wind pressure, and the pressure sensors 112c and 112d hardly detect a wind pressure. When the user 2 wears the user terminal 3 on the right hand and travels by bike, wind hits the user terminal 3 from the left side in FIG. 38. Therefore, the pressure sensor 112a detects a positive wind pressure, the pressure sensor 112b detects a negative wind pressure, and the pressure sensors 112c and 112d hardly detect a wind pressure. Therefore, the processing unit 100 can more accurately detect a wind pressure on the basis of the output signals of the pressure sensors 112a, 112b, 112c, and 112d.
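
A purely illustrative sketch of how the four wind-pressure readings could be combined is shown below; the differential approach, the threshold value, and the function name are assumptions and are not a description of the actual bike determination processing.

WIND_PRESSURE_THRESHOLD = 5.0  # assumed threshold in pascals; not specified in the embodiments

def detect_wind_pressure(p_a, p_b, p_c, p_d):
    # p_a..p_d: wind-pressure components detected by the pressure sensors 112a to 112d
    # (deviation of each sensor's reading from the ambient static pressure).
    # While riding, 112a and 112b see opposite-signed wind pressures and 112c/112d
    # see almost none, so the 112a/112b differential estimates the wind pressure
    # regardless of which hand the user terminal 3 is worn on.
    differential = abs(p_b - p_a)
    cross = abs(p_c) + abs(p_d)
    wind_detected = differential > WIND_PRESSURE_THRESHOLD and cross < differential
    return wind_detected, differential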

For example, in the sixth embodiment or the seventh embodiment, the processing unit 100 of the user terminal 3 may discriminate the plurality of states of the user 2 without using one of an output signal of the angular velocity sensor 114 and an output signal of the temperature sensor 116.

For example, in the embodiments, the processing unit 100 of the user terminal 3 may perform the state discrimination processing using the terrestrial magnetism sensor 111 as a motion sensor. That is, in the embodiments, the processing unit 100 may perform the state discrimination processing by replacing an output signal of the acceleration sensor 113 with an output signal of the terrestrial magnetism sensor 111 (an example of the “first motion sensor”). In the sixth embodiment or the seventh embodiment, the processing unit 100 may perform the state discrimination processing by replacing an output signal of the angular velocity sensor 114 with an output signal of the terrestrial magnetism sensor 111 (an example of the “second motion sensor”).

For example, in the embodiments, the processing unit 100 of the user terminal 3 discriminates the plurality of states of the user 2 in the triathlon (the swim, the bike, and the run). However, the processing unit 100 may discriminate a plurality of states of the user 2 in any competition including a plurality of athletic events, such as a winter triathlon (snow run, snow bike, and cross-country ski), a duathlon (first run, bike, and second run), an aquathlon (run and swim, or first run, swim, and second run), or a biathlon (cross-country ski and rifle shooting).

For example, at least some of the various sensors (the GPS sensor 110, the terrestrial magnetism sensor 111, the pressure sensor 112, the acceleration sensor 113, the angular velocity sensor 114, the pulse sensor 115, and the temperature sensor 116) do not have to be integrated with the user terminal 3.

For example, in the embodiments, a part of the functions of the server 4 or the information terminal 5 may be mounted on the user terminal 3 or a part of the functions of the user terminal 3 may be mounted on the server 4 or the information terminal 5.

For example, in the embodiments, functions of a publicly-known smartphone, such as a camera function, a call function, and a communication function, or other sensing functions (a humidity sensor, etc.) may be mounted on the user terminal 3 or the information terminal 5.

For example, the user terminal 3 can be configured as, besides the wrist-type electronic device, electronic devices of various types such as an earphone-type electronic device, a finger-ring-type electronic device, a pendant-type electronic device, an electronic device attached to a sports instrument and used, a smartphone, and a head-mounted display (HMD). The user terminal 3 only has to be mounted on a position where an exercise state of the user 2 can be analyzed. The user terminal 3 may be mounted on, besides the wrist, for example, an arm, a waist, a chest, or a foot.

For example, the user terminal 3 or the information terminal 5 may perform notification of information through image display, may perform the notification of information through sound output, vibration, or the like, or may perform the notification of information through a combination of at least two of the image display, the sound output, and the vibration.

For example, in the embodiments explained above, the user terminal 3 performs the various kinds of processing using the satellite signals from the GPS satellites. However, the user terminal 3 may use satellite signals from positioning satellites of a Global Navigation Satellite System (GNSS) other than the GPS or from positioning satellites other than those of a GNSS. For example, the user terminal 3 may use satellite signals from satellites of one or two or more satellite positioning systems such as WAAS (Wide Area Augmentation System), EGNOS (European Geostationary Navigation Overlay Service), QZSS (Quasi-Zenith Satellite System), GLONASS (GLObal Navigation Satellite System), GALILEO, and BeiDou (BeiDou Navigation Satellite System).

The embodiments and the modifications explained above are examples. The invention is not limited to the embodiments and the modifications. For example, it is also possible to combine the embodiments and the modifications as appropriate.

The invention includes configurations substantially the same as the configurations explained in the embodiments (e.g., configurations having the same functions, methods, and results or configurations having the same purposes and effects). The invention includes configurations in which non-essential portions of the configurations explained in the embodiments are replaced. The invention includes configurations that realize the same action and effects as the action and effects of the configurations explained in the embodiments and configurations that can achieve the same objects as the objects of the embodiments. The invention includes configurations in which publicly-known techniques are added to the configurations explained in the embodiments.

Claims

1. An electronic device comprising a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state.

2. The electronic device according to claim 1, further comprising a notifying unit configured to notify a state discriminated by the processing unit.

3. The electronic device according to claim 1, wherein

the processing unit determines on the basis of the position information whether the user passes the first position and whether the user passes the second position, determines that the user is in the first exercise state until the user passes the first position, determines that the user is in the first transition state until the user passes the second position after passing the first position, and determines that the user is in the second exercise state after the user passes the second position.

4. The electronic device according to claim 1, wherein

the processing unit calculates times respectively required for the first exercise state, the second exercise state, and the first transition state.

5. The electronic device according to claim 1, wherein

the plurality of states include a third exercise state in which the user is carrying out a third exercise event and a second transition state halfway in transition from the second exercise state to the third exercise state, and
the processing unit discriminates the plurality of states on the basis of a third position and a fourth position registered in advance.

6. The electronic device according to claim 5, wherein

the processing unit determines on the basis of the position information whether the user passes the third position and whether the user passes the fourth position, determines that the user is in the second exercise state until the user passes the third position, determines that the user is in the second transition state until the user passes the fourth position after passing the third position, and determines that the user is in the third exercise state after the user passes the fourth position.

7. The electronic device according to claim 5, wherein

the processing unit calculates times respectively required for the third exercise state and the second transition state.

8. The electronic device according to claim 1, wherein

the processing unit discriminates the plurality of states on the basis of at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, the position information, and the first position and the second position.

9. The electronic device according to claim 8, wherein

the processing unit calculates moving speed on the basis of the position information, determines whether the output signal of the first motion sensor has periodicity, detects a change in pressure on the basis of the output signal of the pressure sensor, and discriminates the plurality of states on the basis of the moving speed, whether the output signal of the first motion sensor has periodicity, and the change in the pressure.

10. The electronic device according to claim 8, wherein

the processing unit discriminates the plurality of states on the basis of an output signal of a second motion sensor.

11. The electronic device according to claim 8, wherein

the processing unit discriminates the plurality of states on the basis of an output signal of a temperature sensor.

12. The electronic device according to claim 5, wherein

the first exercise event is swimming, the second exercise event is bicycling, and the third exercise event is running.

13. A display method comprising:

discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state; and
displaying a discriminated state.

14. The display method according to claim 13, wherein,

when the discriminated state is the first transition state, an image including a flashing object is displayed.

15. The display method according to claim 13, wherein,

when the discriminated state is the first transition state, an elapsed time from a start of the first transition state is displayed.

16. The display method according to claim 13, wherein

the elapsed time is displayed to be comparable with a target time set in advance.

17. The display method according to claim 13, wherein

the plurality of states are discriminated on the basis of at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, the position information, and the first position and the second position, and the discriminated state is displayed.

18. A display system comprising:

a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state; and
a display unit configured to display a discriminated state.

19. A computer-readable recording medium having recorded therein a computer program for causing a computer to execute a step of discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state.

20. The computer-readable recording medium according to claim 19, further having recorded therein a computer program for causing the computer to execute steps of:

determining on the basis of the position information whether the user passes the first position and whether the user passes the second position;
determining that the user is in the first exercise state until the user passes the first position;
determining that the user is in the first transition state until the user passes the second position after passing the first position; and
determining that the user is in the second exercise state after the user passes the second position.
Patent History
Publication number: 20180117414
Type: Application
Filed: Sep 28, 2017
Publication Date: May 3, 2018
Applicant: SEIKO EPSON CORPORATION (Tokyo)
Inventors: Eiji MIYASAKA (Okaya-shi), Osamu YAMADA (Chino-shi)
Application Number: 15/719,350
Classifications
International Classification: A63B 24/00 (20060101); A63B 71/06 (20060101); A63B 69/00 (20060101);