Method for Operating a Training Device and Training Device

A method for operating a training device is provided. According to the method, a property of a user using the training device during a training session is automatically determined, and a fitness level of the user is automatically determined from the user property. Based on the fitness level of the user, a performance of a virtual opponent is automatically defined, and information representing the performance of the virtual opponent is output to the user.

Description

This application claims priority from U.S. Provisional Application Ser. No. 61/368,412, filed Jul. 28, 2010, and from European Patent Application No. EP 10007620.7, filed Jul. 22, 2010, the disclosures of which are incorporated herein in their entirety.

The present invention relates to a method for operating a training device and especially to a training device adapted to perform the method.

BACKGROUND OF THE INVENTION

Training or exercising, for example running, jogging, Nordic walking or biking, has become very popular in recent years, as it is considered to improve physical health. However, many people who start training abandon it after only a few training sessions due to a lack of motivation. Training in a group may be more motivating, but it requires training at a certain place and at a certain time, which is not always possible, and it requires somebody to organize the group. On the other hand, training in a group may be annoying or frustrating if the fitness levels of the participants vary widely.

Therefore, there is a need for supporting and facilitating a person during a training session, and it is an object of the present invention to provide a method and a device for facilitating and supporting a user during a training session.

SUMMARY OF THE INVENTION

According to an aspect of the present invention, a method for operating a training device is provided. According to the method, a property of a user using the training device during a training session is automatically determined. Furthermore, from the determined user property, a fitness level of the user is automatically determined.

Based on the determined fitness level of the user, a performance of a virtual opponent is automatically defined and information representing the performance of the virtual opponent is output to the user.

The property of the user may comprise, for example, a property determined during a former training session of the user or during a current training session of the user. The property of the user may comprise, for example, a heartbeat rate of the user or a heartbeat characteristic of the heartbeat of the user, for example a heartbeat rate recovery time after a training session, a maximum heartbeat rate during a training session, an average heartbeat rate during a training session, a resting heartbeat rate or a heartbeat rate variability. Furthermore, the property of the user may comprise a breathing rate or a blood pressure of the user. Based on the determined user properties, a generic fitness level of the user, indicating an average fitness, can be derived. Based on this fitness level, the performance of the virtual opponent is defined, for example, in terms of a running speed of the virtual opponent, a biking speed of the virtual opponent or a heartbeat rate of the virtual opponent. The performance of the virtual opponent may additionally be defined depending on the kind of training of the current training session, for example a basic endurance training, a speed training, a speed endurance training and so on. By outputting the information representing the performance of the virtual opponent to the user, the user gets feedback about the user's current performance in relation to the performance of the virtual opponent, which represents the generic fitness level of the user. Therefore, the user competes with the user's own fitness level. This motivates the user to train without being annoyed or frustrated by real opponents having a much lower or higher fitness level.
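
As an illustration only, condensing several heartbeat properties into one generic fitness level could be sketched as follows. The `HeartData` fields, the normalisation ranges and the equal weighting are all hypothetical choices, not part of the described method:

```python
from dataclasses import dataclass

@dataclass
class HeartData:
    resting_rate: float        # resting heartbeat rate in beats per minute
    recovery_drop_1min: float  # bpm drop one minute after exercise stops

def fitness_level(h: HeartData) -> float:
    """Condense heartbeat properties into a generic score in [0, 1].

    A lower resting rate and a faster post-exercise recovery both
    indicate better fitness, so each is normalised against a rough
    population range and the two scores are averaged.
    """
    resting_score = max(0.0, min(1.0, (80.0 - h.resting_rate) / 40.0))  # 40-80 bpm range
    recovery_score = max(0.0, min(1.0, h.recovery_drop_1min / 40.0))    # 0-40 bpm drop
    return 0.5 * resting_score + 0.5 * recovery_score
```

Any comparable scoring known in the art could be substituted; only the idea of mapping several measured properties to a single level matters here.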

According to an embodiment, the performance of the virtual opponent is additionally defined based on a fitness level of another predetermined person. The performance of the virtual opponent is scaled based on the fitness level of the user and the fitness level of the predetermined person. Thus, the user can compete, via the virtual opponent, with another person having a different fitness level, and it depends on the form of the day whether the user performs better than the virtual opponent or not. This may motivate the user to keep on training.
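
A minimal sketch of such scaling, assuming fitness levels are expressed as scores where a higher value means a fitter person; the function name and the linear ratio are illustrative assumptions, not the patent's prescribed formula:

```python
def opponent_pace(user_fitness: float, other_fitness: float,
                  other_pace_kmh: float) -> float:
    """Scale the other person's pace to the user's fitness level.

    If the other person is fitter than the user, the virtual opponent
    runs proportionally slower than the other person actually does, so
    the contest is decided by each person's form on the day rather than
    by the raw difference in fitness.
    """
    return other_pace_kmh * (user_fitness / other_fitness)
```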

According to another embodiment of the present invention, a performance relation is automatically determined based on the current performance of the user in relation to the performance of the virtual opponent, and this performance relation is output to the user as the information representing the performance of the virtual opponent. For example, the performance relation is represented by three-dimensional audio data output to the user. When the user is running or jogging, the three-dimensional audio data may comprise sounds of footsteps which are arranged in relation to the user based on the performance relation. For example, when the performance of the user is lower than the performance of the virtual opponent, sounds of footsteps are output to the user as three-dimensional audio data in such a way that it appears to the user that the opponent is approaching from behind and passes the user. This may motivate the user to speed up, or to continue training to gain a higher fitness level such that the user can keep up with or even pass the virtual opponent the next time. When the performance of the user is higher than the performance of the virtual opponent, three-dimensional audio data is output to the user in such a way that it appears to the user that the user is approaching the virtual opponent and then passes the virtual opponent. However, a mono, stereo or any other kind of audio output may also be used to represent the performance of the opponent.

According to another embodiment, the performance relation is output to the user by a visual representation of the virtual opponent in an augmented reality system. The virtual opponent may be displayed to the user by special eye glasses of an augmented reality system, which overlay an image of the virtual opponent over the reality information of an environment. The virtual opponent is arranged in relation to the user based on the performance relation. For example, when the performance of the user is higher than the performance of the virtual opponent, the virtual opponent may appear as augmentation information running in front of the user, and the user is approaching and finally passing the virtual opponent. In case the performance of the virtual opponent is derived from another predetermined person, as described above, an image of the predetermined person may be displayed as the virtual opponent in the augmented reality system.

According to yet another embodiment, the performance relation is output by a visual representation of the virtual opponent and the user on a display displaying a map. The virtual opponent and the user are arranged on the map based on their performance relation. For example, the virtual opponent and the user start at the same starting point on the map and follow the same track. As long as the performance of the virtual opponent is higher than the performance of the user, a symbol representing the virtual opponent on the track is leading compared to a symbol representing the user on the map. When the user speeds up, the user may pass the virtual opponent, and this is represented accordingly on the map.

By representing the performance of the virtual opponent to the user as described above, the user may get highly motivated to keep on training, or even to train more, in order to keep up with or perform even better than the virtual opponent. Furthermore, each training session can be considered a competition between the user and the virtual opponent, wherein the virtual opponent has in general the same fitness level as the user. This increases fun and motivation during exercising and makes it possible to compete with another person even if the other person is not at the same fitness level.

According to another aspect of the present invention, a training device is provided. The training device comprises a monitoring unit, a processing unit, and an output unit. The monitoring unit is adapted to determine a property of a user using the training device during a training session. The processing unit is adapted to determine a fitness level of the user from the determined user property and to define a performance of a virtual opponent based on the determined fitness level of the user. The output unit is adapted to output information to the user, wherein the information represents the performance of the virtual opponent. The training device may be adapted to perform the above-described method and therefore provides the above-described advantages. The monitoring unit may comprise, for example, any kind of heartbeat monitoring unit, breathing frequency monitoring unit, blood pressure measuring unit or galvanic skin response sensor.

The training device may comprise a mobile device, for example a mobile phone, a mobile navigation system, a mobile music player, or a mobile pulse monitor. Furthermore, the training device may comprise a mobile device with a screen, or one without a screen having audio output only, or may comprise a headset measuring breathing or a heartbeat and providing earphones as the output unit. Such mobile devices are lightweight and robust and are therefore adapted to be carried around by the user during training. Furthermore, mobile devices like mobile phones or mobile music players are already widely used for entertainment during exercising and training. Therefore, integrating the above-described method into the above-described mobile devices can be achieved at low cost.

Although specific features described in the above summary and the following detailed description are described in connection with specific embodiments, it is to be understood that the features of the embodiments can be combined with each other unless noted otherwise.

BRIEF DESCRIPTION OF THE DRAWINGS

The invention will now be described in more detail with reference to the accompanying drawings.

FIG. 1 shows a schematic diagram of a training device according to an embodiment of the present invention.

FIG. 2 shows an output on a display of the training device of FIG. 1 representing a map, a user and a virtual opponent.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

In the following, exemplary embodiments of the present invention will be described in more detail. It is to be understood that the following description is given only for the purpose of illustrating the principles of the invention and is not to be taken in a limiting sense. Rather, the scope of the invention is defined only by the appended claims and is not intended to be limited by the exemplary embodiments described hereinafter.

It is to be understood that the features of the various exemplary embodiments described herein may be combined with each other unless specifically noted otherwise. The same reference signs in the various drawings refer to similar or identical components.

FIG. 1 shows a training device 100. The training device 100 comprises eye glasses 101, a processing unit 102 and a display 103. The eye glasses 101 comprise two eye glass lenses 104 and 105 which are adapted to pass light from an environment in front of the eye glasses 101 through to the eyes of a user wearing the eye glasses 101, and at the same time to display augmentation information, which is generated by the processing unit 102, to the user by overlaying the augmentation information over the reality information of the environment. Therefore, the eye glass lenses 104, 105 are coupled, as shown in FIG. 1, to the processing unit 102. A camera 106 attached to a frame of the eye glasses 101 is also coupled to the processing unit 102 and adapted to capture the reality information comprising an image of the environment in front of the eye glasses 101. Based on the reality information received from the camera 106, the processing unit 102 is adapted to overlay the augmentation information at a position corresponding to the viewing direction of the user wearing the eye glasses 101. The processing unit 102 may be integrated into the frame of the eye glasses 101, or may be integrated into a mobile device the user of the eye glasses 101 is carrying, and may be coupled to the eye glasses 101 via a wired or a wireless connection. Furthermore, at the temple arms of the frame of the eye glasses 101, headphones or ear speakers 107 and 108 are mounted such that each is near an ear of the user wearing the eye glasses 101. The headphones or ear speakers 107, 108 are coupled to the processing unit 102 and adapted to output audio data generated by the processing unit 102. Furthermore, a heartbeat monitoring device is integrated into one of the headphones or ear speakers 107, 108, or into both of them. The heartbeat monitoring device is coupled to the processing unit 102, and the processing unit 102 is adapted to determine, in combination with the heartbeat monitoring device, a heartbeat rate of the user wearing the eye glasses 101. As an alternative, the heartbeat monitoring device may be a heartbeat monitoring belt the user wears around the chest, or a heartbeat monitoring wristband.

In operation, the user wears the eye glasses 101 during a training session, and the processing unit 102 determines a current heartbeat rate of the user via the heartbeat monitoring device integrated into the ear speakers 107, 108, or via the chest heartbeat monitoring belt or the heartbeat monitoring wristband. Based on the determined heartbeat rate, the processing unit determines a fitness level of the user. Preferably, for determining the fitness level, the processing unit may additionally take into account heartbeat rate values of former training sessions, for example a maximum heartbeat rate or an average heartbeat rate of a former training session. Furthermore, the processing unit may take into account a resting heartbeat rate determined before the training session and a heartbeat rate variability indicating the variation of the time intervals between heartbeats. The fitness level may be determined by the processing unit 102 from these heartbeat properties of the user according to methods known in the art. The user may then enter the kind of training which is planned for the training session. For example, the user may want to do a basic endurance training, a speed training or a speed endurance training. Based on the determined fitness level and the kind of training, the processing unit defines a performance of a virtual opponent. For example, if the user has decided to do a basic endurance training, the performance of the virtual opponent may be defined in terms of a heartbeat rate, which may in this case be, for example, 70% of the maximum heartbeat rate of the user. For a speed training, the performance of the virtual opponent may be defined in terms of a heartbeat rate of 90% of the maximum heartbeat rate of the user.
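
The 70% and 90% figures above can be captured in a small lookup table. Only the percentages come from the text; the dictionary keys and the function name are illustrative:

```python
# Fraction of the user's maximum heartbeat rate used as the virtual
# opponent's target rate, per kind of training (values from the example).
TRAINING_INTENSITY = {
    "basic_endurance": 0.70,
    "speed": 0.90,
}

def opponent_heart_rate(max_rate_bpm: float, training: str) -> float:
    """Define the virtual opponent's performance as a target heartbeat rate."""
    return max_rate_bpm * TRAINING_INTENSITY[training]
```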

During the training session, the current performance of the user is compared to the performance of the virtual opponent. As described above, the performance of the virtual opponent may be defined in terms of a heartbeat rate. In this case, the current heartbeat rate of the user is compared to the heartbeat rate defined by the performance of the virtual opponent. As long as the heartbeat rate of the user is lower than the heartbeat rate of the virtual opponent, the processing unit 102 determines that the user is slower than the virtual opponent. In case the heartbeat rate of the user is higher than the heartbeat rate of the virtual opponent, the processing unit determines that the user is faster than the virtual opponent. This relation between the speed of the virtual opponent and the user is represented to the user by one or more of the following representations:

    • Three-dimensional audio representation
    • Visual augmented reality representation
    • Map representation.
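
The comparison rule described above amounts to a simple three-way decision. This sketch assumes the performance of both sides is expressed as a heartbeat rate, as in the example; the function and result names are illustrative:

```python
def performance_relation(user_rate_bpm: float, opponent_rate_bpm: float) -> str:
    """Compare the user's current heartbeat rate with the opponent's target.

    Per the rule in the text: a rate below the opponent's target means
    the user is slower; a rate above it means the user is faster.
    """
    if user_rate_bpm < opponent_rate_bpm:
        return "user_slower"
    if user_rate_bpm > opponent_rate_bpm:
        return "user_faster"
    return "even"
```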

In the three-dimensional audio representation, three-dimensional audio data is output to the user via the headphones or ear speakers 107, 108. The audio data comprises sounds of the steps of the running virtual opponent. Assuming that both the user and the virtual opponent started at the same point, the sounds of the steps of the virtual opponent are output to the user, depending on the performance relation, in such a way that the virtual opponent appears to the user to fall behind if the user is faster than the virtual opponent, to pull ahead if the virtual opponent is faster than the user, or to stay beside the user if the virtual opponent and the user have the same speed.
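
A full three-dimensional rendering would typically use head-related transfer functions, but the passing effect can already be hinted at with constant-power stereo panning. The geometry below (opponent offset one metre to the user's right, position reduced to a front/back gap) is an illustrative assumption, not the patent's implementation:

```python
import math

def footstep_gains(gap_m: float, lateral_m: float = 1.0) -> tuple[float, float]:
    """Left/right channel gains placing the opponent's footstep sounds.

    gap_m > 0: opponent is ahead of the user; gap_m < 0: behind.
    The opponent is rendered slightly to one side (lateral_m), so the
    azimuth, and with it the stereo balance, sweeps as the opponent
    passes the user.
    """
    azimuth = math.atan2(lateral_m, gap_m)   # 0 = straight ahead
    pan = math.sin(azimuth)                  # -1 = full left, +1 = full right
    left = math.sqrt((1.0 - pan) / 2.0)      # constant-power panning law
    right = math.sqrt((1.0 + pan) / 2.0)
    return left, right
```

When the opponent is far ahead or far behind, both channels receive nearly equal gain; at the moment of passing (gap near zero) the sound swings fully to the offset side.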

In the visual augmented reality representation, an image of the opponent is displayed as an augmentation of the reality on the eye glass lenses 104 and 105. For example, if the virtual opponent is running faster than the user, a back view of the virtual opponent will be displayed overlaid on the reality information in front of the user. In case the user has passed the virtual opponent, a front view of the virtual opponent will be displayed as augmentation information whenever the user turns the head and looks behind. Determining the current viewing direction of the user and overlaying the augmentation information over the reality information are known to a person skilled in the art and will therefore not be described in detail.

In the map representation, the user and the virtual opponent are displayed as symbols on the display 103. FIG. 2 shows such a representation on the display 103. On the display 103, a track 201 is displayed. The information about the track may be derived from map information of a global positioning system of the training device 100. On the track 201, the user as well as the virtual opponent are displayed by corresponding symbols, for example the user as a circle 202 and the virtual opponent as a square 203. The movement of the user may be determined by the global positioning system and displayed on the display 103 correspondingly. Depending on the performance relation between the user and the virtual opponent, the symbol 203 of the virtual opponent is arranged on the track 201 in relation to the position of the symbol 202 representing the user.
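
Placing both symbols on the track reduces to interpolating a point at a given distance along a polyline: the user's distance comes from GPS, the opponent's distance is accumulated from its defined speed. This sketch works in planar coordinates and is only an illustration of the idea:

```python
import math

def point_on_track(track: list[tuple[float, float]], dist: float) -> tuple[float, float]:
    """Return the (x, y) point `dist` units along a polyline track.

    Walks the track segment by segment and linearly interpolates within
    the segment that contains the requested distance; distances past the
    end of the track clamp to the final track point.
    """
    remaining = dist
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if remaining <= seg:
            t = remaining / seg if seg else 0.0
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        remaining -= seg
    return track[-1]
```

The same function then positions both the user's symbol 202 and the opponent's symbol 203 from their respective distances covered.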

While exemplary embodiments have been described above, various modifications may be implemented in other embodiments. For example, the performance of the virtual opponent may not only be derived from the fitness level of the user, but may also be derived from a performance of another predetermined person. Thus, the performance of the virtual opponent varies with the current fitness level of the other predetermined person. The fitness level of the other predetermined person may be transmitted to the training device via a wireless connection and the internet or a server.

In case the other predetermined person is also training or exercising, the fitness level of the other predetermined person may directly vary the performance of the virtual opponent. Thus, the user and the other predetermined person may compete with each other although they are not physically co-located. Furthermore, the above-described training device may be used not only for running, jogging or walking, but also for other sports like biking or cross-country skiing. In this case, the representations of the three-dimensional audio sounds and the visual augmented reality have to be adapted accordingly. Furthermore, it is to be understood that the three kinds of representations (three-dimensional audio, augmented reality, and map) are only examples, and other kinds of representations may be used. For example, a pitch or a speed (e.g. beats per minute, BPM) of a song being played back by the training device may be changed. Furthermore, these kinds of representations can be used individually or combined with each other as appropriate.
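
The song-speed variant mentioned above could, for example, derive a playback-rate factor from the pace difference. The linear mapping and the clamping value are illustrative assumptions:

```python
def playback_rate(user_pace_kmh: float, opponent_pace_kmh: float,
                  max_change: float = 0.15) -> float:
    """Playback-speed factor for the song currently being played back.

    Music plays faster when the user trails the virtual opponent (a
    nudge to speed up) and slower when the user leads, clamped to
    +/- max_change so the song stays listenable.
    """
    delta = (opponent_pace_kmh - user_pace_kmh) / max(opponent_pace_kmh, 1e-9)
    return 1.0 + max(-max_change, min(max_change, delta))
```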

Finally, it is to be understood that all the embodiments described above are considered to be comprised by the present invention as it is defined by the appended claims.

Claims

1. A method for operating a training device, comprising:

automatically determining a property of a user using the training device during a training session,
automatically determining a fitness level of the user from the determined user property,
automatically defining a performance of a virtual opponent based on the determined fitness level of the user, and
outputting information representing the performance of the virtual opponent to the user.

2. The method according to claim 1, wherein the property of the user comprises a property of the user determined in a former training session of the user.

3. The method according to claim 1, wherein the property of the user comprises a property of the user determined during a current training session of the user.

4. The method according to claim 1, wherein the property of the user comprises a heart beat rate of the user.

5. The method according to claim 1, wherein the property of the user comprises at least one heart beat characteristic from the group comprising a heart beat rate recovery after a training session, a maximum heart beat rate during a training session, an average heart beat rate during a training session, a resting heart beat rate, and a heart beat rate variability.

6. The method according to claim 1, wherein the performance of the virtual opponent is defined from a fitness level of another predetermined person, wherein the performance of the virtual opponent is scaled based on the fitness level of the user and the fitness level of the predetermined person.

7. The method according to claim 1, wherein the information representing the performance of the virtual opponent comprises a performance relation representing a current performance of the user in relation to the performance of the virtual opponent, wherein the performance relation is automatically determined based on the current performance of the user in relation to the performance of the virtual opponent.

8. The method according to claim 7, wherein the performance relation is output by a three-dimensional audio data output to the user, the three-dimensional audio data comprising sounds of footsteps arranged in relation to the user based on the performance relation.

9. The method according to claim 7, wherein the performance relation is output by a visual representation of the virtual opponent in an augmented reality system, wherein the virtual opponent is arranged as augmentation information in relation to the user based on the performance relation.

10. The method according to claim 7, wherein the performance relation is output by a visual representation of the virtual opponent and the user on a map, wherein the virtual opponent is arranged in relation to the user based on the performance relation.

11. A training device, comprising:

a monitoring unit adapted to determine a property of a user using the training device during a training session,
a processing unit adapted to determine a fitness level of the user from the determined user property and to define a performance of a virtual opponent based on the determined fitness level of the user, and
an output unit adapted to output information representing the performance of the virtual opponent to the user.

12. The training device according to claim 11, wherein the training device comprises a mobile device selected from the group comprising a mobile phone, a mobile navigation system, a mobile music player, and a mobile pulse monitor.

13. The training device according to claim 11, wherein the property of the user comprises a property of the user determined in a former training session of the user.

14. The training device according to claim 11, wherein the property of the user comprises a property of the user determined during a current training session of the user.

15. The training device according to claim 11, wherein the property of the user comprises a heart beat rate of the user.

16. The training device according to claim 11, wherein the property of the user comprises at least one heart beat characteristic from the group comprising a heart beat rate recovery after a training session, a maximum heart beat rate during a training session, an average heart beat rate during a training session, a resting heart beat rate, and a heart beat rate variability.

17. The training device according to claim 11, wherein the performance of the virtual opponent is defined from a fitness level of another predetermined person, wherein the performance of the virtual opponent is scaled based on the fitness level of the user and the fitness level of the predetermined person.

18. The training device according to claim 11, wherein the information representing the performance of the virtual opponent comprises a performance relation representing a current performance of the user in relation to the performance of the virtual opponent, wherein the performance relation is automatically determined based on the current performance of the user in relation to the performance of the virtual opponent.

19. The training device according to claim 18, wherein the performance relation is output by a three-dimensional audio data output to the user, the three-dimensional audio data comprising sounds of footsteps arranged in relation to the user based on the performance relation.

20. The training device according to claim 18, wherein the performance relation is output by a visual representation of the virtual opponent in an augmented reality system, wherein the virtual opponent is arranged as augmentation information in relation to the user based on the performance relation.

21. The training device according to claim 18, wherein the performance relation is output by a visual representation of the virtual opponent and the user on a map, wherein the virtual opponent is arranged in relation to the user based on the performance relation.

Patent History
Publication number: 20120021393
Type: Application
Filed: Jul 21, 2011
Publication Date: Jan 26, 2012
Inventor: Ola Thörn (Lund)
Application Number: 13/187,906
Classifications
Current U.S. Class: Physical Education (434/247)
International Classification: A63B 69/00 (20060101);