EXERCISE EQUIPMENT WITH MOVABLE HANDLE BARS TO SIMULATE STEERING MOTION IN A SIMULATED ENVIRONMENT AND METHODS THEREFOR

- Expresso Fitness Corp.

An apparatus of exercise equipment with movable handle bars to simulate steering motion in a virtual environment and methods therefor are disclosed. One embodiment of the apparatus includes exercise equipment having a display unit and a computing unit, a foot actuator mounted on a frame of the exercise equipment, a seat assembly coupled to a rail mounted on the frame of the exercise equipment, and/or a handle bar coupled to the seat assembly on one side. The handle bar can be programmable or reprogrammable to cause a leftwards steering motion or a rightwards steering motion to be simulated in the virtual environment in response to rotation in a given direction.

Description
CLAIM OF PRIORITY

This application claims priority to U.S. Provisional Patent Application No. 61/083,891 entitled “Cardio-Fitness Station with Virtual-Reality Capability and Enhanced Riding Control”, which was filed on Jul. 25, 2008, the contents of which are incorporated by reference herein.

BACKGROUND

Exercise equipment is ubiquitous in homes and fitness clubs. Exercising in a fitness club or at home has become a preferred way for many people to maintain their exercise routines. However, on exercise equipment where the exercising user sits upright, the user's weight and impact forces are generally focused on the lower back and hip regions. Over the long term, this impact force can potentially cause lower back and hip injuries.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 depicts a diagram of exercise equipment that are network-enabled and are able to establish communication with one another and/or with a host server.

FIG. 2 depicts an example exercise equipment having a movable handle bar coupled to the seat assembly.

FIG. 3 depicts a diagram illustrating how the handle bars can be rotated to cause steering motion to be simulated in the virtual environment.

FIG. 4 depicts a gear shifting unit having one or more switches that can be actuated for gear adjustment and simulation of gear upshifting/downshifting in the virtual environment.

FIG. 5 depicts an example functional block diagram of the computing unit of the exercise equipment that generates a virtual environment through which a user travels during exercise in which steering motion is simulated based on detected handle bar motion.

FIG. 6 depicts an example block diagram of the components of the computing unit of the exercise equipment.

FIG. 7A depicts a flow diagram illustrating example processes for simulating steering motion through a virtual environment based on a difference in displacement angles between first and second handle bars.

FIG. 7B depicts a flow diagram illustrating example processes for configuring the handle bars and switches for gear shifting.

FIG. 7C depicts a flow diagram illustrating example processes for adjusting pedaling resistance and/or simulating gear upshifting/downshifting in the virtual environment.

FIG. 8 depicts an example illustration of a keypad panel.

FIG. 9 depicts an example image shown on the display unit of the exercise equipment.

FIG. 10 shows a diagrammatic representation of a machine in the example form of a computing system within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.

DETAILED DESCRIPTION

The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of the disclosure. However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to “one embodiment” or “an embodiment” in the present disclosure can be, but are not necessarily, references to the same embodiment; such references mean at least one of the embodiments.

Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.

The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. Certain terms that are used to describe the disclosure are discussed below, or elsewhere in the specification, to provide additional guidance to the practitioner regarding the description of the disclosure. For convenience, certain terms may be highlighted, for example using italics and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted. It will be appreciated that the same thing can be said in more than one way.

Consequently, alternative language and synonyms may be used for any one or more of the terms discussed herein, and no special significance is to be placed upon whether or not a term is elaborated or discussed herein. Synonyms for certain terms are provided. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification, including examples of any terms discussed herein, is illustrative only, and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification.

Without intent to further limit the scope of the disclosure, examples of instruments, apparatus, methods and their related results according to the embodiments of the present disclosure are given below. Note that titles or subtitles may be used in the examples for convenience of a reader, which in no way should limit the scope of the disclosure. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In the case of conflict, the present document, including definitions, will control.

Embodiments of the present disclosure include exercise equipment with movable handle bars to simulate steering motion in a simulated environment. In general, embodiments relate to recumbent exercise equipment in which the user sits reclined in a seat assembly of the equipment.

FIG. 1 depicts a diagram of exercise equipment 108A-N that are network-enabled and are able to establish communication with one another and with a host server 100.

The exercise equipment 108A-N can be a recumbent unit. On a recumbent unit, the foot actuator may be displaced laterally from the seat such that a user sits reclined in the seat assembly. In one embodiment, the seat assembly is coupled to a handle bar which is generally movable to simulate steering motion of a virtual body in the virtual environment. In addition, the handle bar may be programmable or reprogrammable (e.g., by an exercising user, system maintainer, via the equipment or via the host server 100) to cause a leftwards steering motion or a rightwards steering motion to be simulated in the virtual environment.

In addition, the exercise equipment 108A-N can be network-enabled to communicate with other equipment or devices over a network (e.g., the Internet). For example, the exercise equipment may communicate with other exercise equipment so that exercising users can compete during an exercise session. In addition, the exercise equipment 108A-N may be able to communicate with the host server 100 or other devices (e.g., a computer or portable device) via network connectivity.

In one embodiment, the hardware and/or software on the equipment 108A-N can be configured and/or programmed remotely using the host server 100 or other devices in communication. Software/firmware upgrades for the exercise equipment 108A-N can be downloaded from the host server 100. In addition, the handle bar may be programmed or reprogrammed remotely through the host server 100 to cause a leftwards steering motion or a rightwards steering motion to be simulated in the virtual environment. The handle bar may also be programmed or reprogrammed locally through settings adjustable via the exercise equipment 108A-N. An example of the exercise equipment 108A-N is illustrated with further reference to FIG. 2.

The client devices 102A-N can be any system and/or device, and/or any combination of devices/systems that is able to establish a networked connection with another device, a server and/or other systems. The client devices 102A-N and exercise equipment 108A-N typically include display or other output functionalities to present data exchanged between the devices to a user. For example, the client devices 102A-N can be, but are not limited to, a server desktop, a desktop computing unit, a computing unit cluster, a mobile computing device such as a notebook, a laptop computing unit, a handheld computing unit, a mobile phone, a smart phone, a PDA, a Blackberry device, a Treo, and/or an iPhone, etc.

In one embodiment, the client devices 102A-N and exercise equipment 108A-N are coupled to a network 106. In some embodiments, the exercise equipment 108A-N may be directly connected to one another. The client devices 102A-N can be used by exercising users or other users (e.g., health care providers, fitness advisors, etc.) to program/reprogram the handle bars or gear shifting switches of a gear shifting unit on exercise equipment.

The network 106, over which the client devices 102A-N and exercise equipment 108A-N communicate, may be a telephonic network, an open network, such as the Internet, or a private network, such as an intranet and/or the extranet. For example, the Internet can provide file transfer, remote login, email, news, RSS, and other services through any known or convenient protocol, such as, but not limited to, the TCP/IP protocol, Open System Interconnections (OSI), FTP, UPnP, iSCSI, NFS, ISDN, PDH, RS-232, SDH, SONET, etc.

The network 106 can be any collection of distinct networks operating wholly or partially in conjunction to provide connectivity to the client devices 102A-N, host server 100, and/or the exercise equipment 108A-N and may appear as one or more networks to the serviced systems and devices. In one embodiment, communications to and from the client devices 102A-N and exercise equipment 108A-N can be achieved by an open network, such as the Internet, or a private network, such as an intranet and/or the extranet. In one embodiment, communications can be achieved by a secure communications protocol, such as secure sockets layer (SSL), or transport layer security (TLS).

In addition, communications can be achieved via one or more wireless networks, such as, but not limited to, one or more of a Local Area Network (LAN), Wireless Local Area Network (WLAN), a Personal area network (PAN), a Campus area network (CAN), a Metropolitan area network (MAN), a Wide area network (WAN), a Wireless wide area network (WWAN), Global System for Mobile Communications (GSM), Personal Communications Service (PCS), Digital Advanced Mobile Phone Service (D-Amps), Bluetooth, Wi-Fi, Fixed Wireless Data, 2G, 2.5G, 3G networks, enhanced data rates for GSM evolution (EDGE), General packet radio service (GPRS), enhanced GPRS, messaging protocols such as TCP/IP, SMS, MMS, extensible messaging and presence protocol (XMPP), real time messaging protocol (RTMP), instant messaging and presence protocol (IMPP), instant messaging, USSD, IRC, or any other wireless data networks or messaging protocols.

The data repository 132 and/or content repository 134 can store software, descriptive data, multimedia content, exercise routes, user data, historical exercise records, user ID, user preferences, user fitness information, user health information, system information of exercise equipment, equipment setting/configuration, drivers, and/or any other data item utilized by other components of the host server 100 and/or the exercise equipment 108A-N for operation.

FIG. 2 depicts an example exercise equipment 200 having a movable handle bar 202A coupled to the seat assembly 204.

The exercise equipment 200 can generally be any type of exercise equipment including but not limited to, cardio-fitness equipment. For example, the exercise equipment 200 is a bicycle and can include a handle bar 202A that is movable. The equipment 200 may further include a seat assembly 204 and a seat-adjustment lever 212. The seat-adjustment lever 212 can be actuated to slide the seat assembly 204 along the rail 210 for position adjustment.

In general, the handle bar 202A is coupled to one side of the seat assembly 204 such that the handle bar 202A can move in conjunction with the seat assembly 204 when a position of the seat assembly 204 is adjusted. The seat assembly can include a seat 206 and optionally a back rest 208. One embodiment of the equipment 200 can further include a second handle bar 202B connected to a second side of the seat assembly 204. For example, the handle bar 202A can be connected to the left side of the seat assembly 204 and the handle bar 202B can be connected to the right side of the seat assembly 204.

Depending on the type of equipment, the exercise equipment 200 can include a foot actuator mounted on a frame 220 of the exercise equipment 200, including but not limited to a stepper assembly, pedals, or the like. In the example of a bicycle, the exercise equipment 200 includes pedals 214. In one embodiment, the exercise bicycle is a recumbent bicycle where the foot actuator 214 is displaced laterally from the seat 206 such that a user can sit reclined in the seat assembly 204.

The exercise equipment 200 can further include a computing unit 250 (e.g., a computer) and/or a display unit 216 coupled to the computing unit 250. The display unit 216 can include, for example, one or more LCD displays. The computing unit 250 can execute one or more instruction sets embodied on a machine-readable (storage) medium. The instruction sets, when executed, can cause the computing unit 250 to generate a virtual environment for display on the display unit 216, for example, while a user is using the equipment 200 to exercise.

The handle bar 202A can be coupled to the seat assembly 204 on one side and can be physically movable to cause the computing unit 250 to simulate a steering motion in the virtual environment. For example, the handle bar 202A can be rotated clockwise or counterclockwise around an axis to simulate the steering motion. The handle bar can be rotated via upwards or downwards motion of a hand of a user sitting on the seat and holding the handle bar. The translation of physical movement of the handle bar 202A and/or the handle bar 202B to simulated steering motion in the virtual environment by the computing unit 250 can be described with further reference to the example of FIG. 3. In general, a magnitude of the steering that is simulated is proportional to a rotation angle of the handle bar 202A from an initial position about the axis. The handle bar and the rotation from an initial position about an axis are illustrated with further reference to the example of FIG. 3.

In one embodiment, the handle bar 202A is programmable or reprogrammable to cause simulation of a leftwards steering motion or a rightwards steering motion in the virtual environment in response to physical rotation in a given direction. For example, clockwise motion of the handle bar is configurable (e.g., by a user or others) to cause simulation of a leftwards steering motion or a rightwards steering motion in the virtual environment. Similarly, counterclockwise motion of the handle bar is programmable to cause simulation of a leftwards steering motion or a rightwards steering motion in the virtual environment.
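
As a non-authoritative illustration of the programmable direction mapping described above, the following sketch shows how a computing unit might translate a single handle bar's rotation angle into a signed steering command; the gain constant, function name, and polarity flag are assumptions for illustration and are not taken from the disclosure.

```python
# Minimal sketch (not the patented implementation): map a handle bar's
# rotation angle to a signed steering command with a user-programmable
# polarity. STEER_GAIN and the polarity flag are illustrative assumptions.

STEER_GAIN = 1.0  # scales rotation angle (degrees) to steering magnitude

def steering_command(angle_deg: float, clockwise_steers_left: bool) -> float:
    """Return a signed steering value: negative steers left, positive steers right.

    angle_deg is the handle bar's rotation from its initial position about the
    axis; positive denotes clockwise rotation, negative counterclockwise.
    """
    polarity = -1.0 if clockwise_steers_left else 1.0
    return polarity * STEER_GAIN * angle_deg

# Reprogramming the polarity flag flips which physical direction steers left.
print(steering_command(10.0, clockwise_steers_left=False))  # positive: steer right
print(steering_command(10.0, clockwise_steers_left=True))   # negative: steer left
```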

In addition, a user may optimize pedaling resistance by adjusting the gear. Each gear can be designated with a number. Other designations, such as low, medium, high, or overdrive, are possible, for example. The user can change the ratio between the rotational velocity of the pedals and the virtual speed of the primary virtual body in order to optimize the force needed to turn the pedals and go forward. For motion at a desired speed up a virtual terrain of a given slope, a lower transmission ratio (or lower gear) provides smaller resistance to pedaling, but requires a higher cadence to achieve a given bicycle speed. By changing the gear, the user is able to adapt the exercise level to the virtual terrain and desired virtual speed of bicycling.
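
A rough worked example of the gear/cadence trade-off described above follows; the wheel circumference, gear ratios, and formula are assumed values for illustration only, not parameters specified by the disclosure.

```python
# Illustrative arithmetic only: virtual speed follows from cadence and the
# selected transmission ratio, so a lower gear requires a higher cadence
# (but less pedaling force) to hold the same virtual speed.

WHEEL_CIRCUMFERENCE_M = 2.1  # assumed virtual wheel circumference, meters

def virtual_speed_mps(cadence_rpm: float, gear_ratio: float) -> float:
    """Virtual bicycle speed in meters per second."""
    wheel_rps = (cadence_rpm / 60.0) * gear_ratio
    return wheel_rps * WHEEL_CIRCUMFERENCE_M

# Two ways to hold roughly 6.6 m/s: a low gear at high cadence, or a higher
# gear at lower cadence with greater resistance per pedal stroke.
print(round(virtual_speed_mps(cadence_rpm=90, gear_ratio=2.1), 2))  # ~6.62 m/s
print(round(virtual_speed_mps(cadence_rpm=63, gear_ratio=3.0), 2))  # ~6.62 m/s
```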

In one embodiment, the exercise equipment 200 further includes a gear shifting unit 211 disposed on the handle bar 202A used to change the gear. The gear shifting unit 211 includes a first switch 207 for upshifting the gear and its actuation can further trigger simulation of gear upshifting. The gear shifting unit 211 may further include a second switch 209 for down shifting the gear and its actuation can further trigger simulation of gear downshifting in the virtual environment.

Detection of actuation of one or more of the switches causes the computing unit 250 to adjust the resistance to pedaling on the exercise equipment and to simulate upshifting or downshifting of the gear in the virtual environment. In general, first and second switches 207 and 209 are programmable or reprogrammable. For example, a first switch 207 on the handle bar 202A can be configured for gear upshifting and the second switch 209 can be configured for gear downshifting; and vice versa. The gear shifting unit 211 is illustrated with further reference to the example of FIG. 4.

In one embodiment, the gear shifting unit 211 includes a two-pole momentary switch whose momentary connection to any one of the two poles is made upon actuation of one or more of the switches 207 and 209. For example, the connection to one of the poles can be interpreted by the computing unit 250 as an increment to a higher gear number, while the momentary connection to the other pole is interpreted by the computing unit 250 as a decrement in the gear number.
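
The following is a hedged sketch of how such a two-pole momentary switch might be interpreted in software; the pole names, gear limits, and function are hypothetical and intended only to illustrate the increment/decrement interpretation.

```python
# Sketch: interpret a momentary connection on one pole as a gear increment and
# a connection on the other pole as a decrement. Pole labels and the gear
# range are assumptions, not values from the disclosure.

MIN_GEAR, MAX_GEAR = 1, 20  # assumed gear number range

def apply_switch_event(gear: int, pole: str) -> int:
    """Return the new gear number after a momentary connection to a pole."""
    if pole == "up":          # e.g., connection made by the upshift switch
        return min(gear + 1, MAX_GEAR)
    if pole == "down":        # e.g., connection made by the downshift switch
        return max(gear - 1, MIN_GEAR)
    return gear               # no connection; gear unchanged

print(apply_switch_event(5, "up"))    # 6
print(apply_switch_event(5, "down"))  # 4
```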

In the exercise equipment 200 shown in the example of FIG. 2, the display unit 216 includes a video monitor (e.g., an LCD or CRT display monitor). In some embodiments, the video monitor of the example of FIG. 2 may be replaced with and/or supplemented by other types of display units. In general, display units include but are not limited to a video monitor, a video monitor with video glasses which, when used jointly, allow the user to view three-dimensional graphics on a single monitor, video goggles that allow the user to experience three-dimensional images (without the use of a monitor), and monitors that employ an array of cylindrical refractive lenses that allow simulated three-dimensional imaging without the use of any glasses or goggles. For example, a head-mounted display may be coupled to the equipment 200 and used in lieu of or in conjunction with the video monitor. In general, the display unit 216 is positioned in plain view of the user while the user is seated on the seat 206.

The exercise equipment 200 may further be coupled to headphones 224 and/or speakers, which are output devices for sound. Other output devices include a mechanism that provides response to the forward-motion actuator on the exercise equipment. For example, for an exercise bicycle the forward-motion actuators are the rotating pedals, while the response from the virtual-reality computing unit manifests itself as a varying resistance to pedal rotation. At a given rotational velocity of the pedals, referred to as cadence, higher resistance to the rotation of the pedals results in increased dissipation of power by the user during exercise. Another example of an output device is a fan located on the front of the exercise equipment whose rotational velocity may be controlled by the computing unit depending on the perceived velocity of the virtual body in the virtual environment. The fan gives the user the perception of wind resistance experienced when moving forward. Another example of an output is the vibration of part of the equipment, for example the seat, which simulates perceived road quality or impact with objects. Yet another example of an output device is resistance to steering, if applicable.

While the user is exercising on the exercise equipment 200, the user can view the images on the display unit 216, listen to sounds coming from the headphones 224 and/or speakers, and/or speak through a microphone. The headphones 224 can be coupled to the equipment 200 through link 226 (e.g., an analog audio link or a digital link). The microphone may be connected to the headphone set 224 and is in communication with the computing unit 250 via link 226 as well.

The exercise equipment 200 can be used by the exercising user to control a virtual body, also referred to as a “primary” virtual body, in a virtual environment. The virtual body's movement can be determined based on the motion of the user while exercising. For example, the image shown on the display unit 216 can include the first-person perspective view of the primary virtual body as it moves in the virtual environment. An example of such an image is shown in FIG. 9. Features of the virtual environment are also further described with reference to the example of FIG. 9.

In one embodiment, the exercise equipment 200 is network-enabled. For example, the exercise equipment 200 can establish wired and/or wireless communications with other exercise equipment, servers, and/or remote storage units via the Internet or other types of networks (e.g., cellular, WiFi, etc.). The exercise equipment 200 includes or is coupled to a biometric sensor 228 which may be in communication with the computing unit 250. The biometric sensor may take any shape depending on what biometric function it senses.

FIG. 3 depicts a diagram illustrating how the handle bars 302A and 302B can be rotated to cause steering motion to be simulated in the virtual environment.

The handle bar 302A attached to exercise equipment can be moved by a user sitting on the seat assembly of the exercise equipment to rotate about an axis. A second handle bar 302B may also be attached to the exercise equipment and be available for physical movement by the user such that the handle bar 302B rotates. Steering motion (e.g., left or right motion of a virtual body in simulated environment), upon detection of physical movement of one or more of the handle bars 302A and 302B, can be simulated in a virtual environment that is depicted on a display unit connected to the exercise equipment on which the user is exercising or using.

In one embodiment, the handle bar 302A and/or 302B is moveable to rotate around an axis 304. Either one of the handle bars or both can be moved to rotate about the axis 304 to trigger simulation of steering motion of a virtual body. The handle bar 302A and/or 302B can be moved from an initial position (e.g., indicated by dashed lines 303A and 303B respectively). The initial position 303A and/or 303B is generally a position where the handle bar 302A and/or 302B rests neutrally when not actuated by a user.

When moved to rotate from the initial position, the handle bar 302A forms an angle αL 305A with the initial position 303A. In one embodiment, the handle bar 302A can be moved to rotate either clockwise or counterclockwise. Alternatively, the handle bar 302A can be rotated in both clockwise (αL>0) and counterclockwise (αL<0) directions relative to the initial position 303A when the user shifts the handle bar 302A up or down, respectively. When not pulled in either direction, the handle bar can return to the initial position (αL=0) (e.g., via spring action). The right handle bar 302B operates similarly: it forms an angle αR relative to the initial position 303B.

The steering of a virtual body within the virtual environment can be determined from the angle formed between the handle bar 302A and the initial position 303A. In addition, the steering can be determined based on the movement of both of the handle bars 302A and 302B. The movement may be quantified by the displacement of the handle bars from their respective initial positions.

For example, a magnitude of the steering can be proportional to a difference (e.g., (αL−αR)) between a first rotation angle αL of the handle bar 302A from the axis 304 and a second rotation angle αR of the handle bar 302B from the axis 304. In general, a positive or negative value of the angle difference is configurable to cause either a leftwards or rightwards steering motion to be simulated in the virtual environment. In one embodiment, the user can set which of two steering polarities to use during a particular exercise session or for the machine.

Since the extent and direction of steering depend on the difference between the rotation angles αL and αR, steering can also be simulated when only one of the handle bars (e.g., 302A or 302B) is moved to rotate—for example by leaving αL=0 and moving only the other handle bar (varying αR). In addition, steering with both hands by depressing one handle bar and pulling the other facilitates a quicker turn than using only one of the handle bars. Moreover, since the steering does not depend on the average position of the handle bars (αR+αL)/2, but only on the difference in the angles (αL−αR), it is possible to move straight ahead in the virtual environment when both handle bars are held at an elevated (or depressed) angle as long as αR=αL, such that users with shorter or longer arms may use the handle bars to achieve equally efficient and comfortable steering.
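
A brief sketch of this differential-steering idea follows; it is illustrative only (the function name, units, and polarity option are assumptions), and is not presented as the patented implementation.

```python
# Steering depends only on the difference between the left and right handle
# bar angles, so holding both bars at the same elevated (or depressed) angle
# still drives the virtual body straight ahead.

def steering_from_angles(alpha_left_deg: float, alpha_right_deg: float,
                         positive_diff_steers_left: bool = True) -> float:
    """Signed steering magnitude computed from the two handle bar angles."""
    diff = alpha_left_deg - alpha_right_deg
    return diff if positive_diff_steers_left else -diff

print(steering_from_angles(0.0, 15.0))    # one-handed steering: only the right bar moved
print(steering_from_angles(-10.0, 15.0))  # two-handed steering: larger difference, quicker turn
print(steering_from_angles(20.0, 20.0))   # both bars elevated equally: 0.0, travel straight
```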

FIG. 4 depicts a switch 407 on the gear shifting unit 400 that can be actuated for gear adjustment and simulation of gear upshifting/downshifting in the virtual environment.

The gear shifting unit 400 can include a switch 407 which can be actuated for adjusting the gear level. Another switch 409 can also be included in the gear shifting unit 400. Although two switches 407 and 409 are shown, any number of switches can be provided on a single gear shifting unit (e.g., unit 400). The switch 407 and/or 409 on the gear shifting unit 400 can be actuated during exercise on the exercise equipment independently or simultaneously for gear adjustment to modify the resistance to pedaling and/or to cause simulation of gear shifting. The switches 407 and/or 409, in one embodiment, are actuated via depressing and releasing actions.

In one embodiment, the exercise equipment is equipped with four switches: a gear-up and a gear-down switch on each of the handle bars. For example, the gear shifting unit 400 features a gear-down switch 407 and a gear-up switch 409. The gear shifting unit 400 can optionally include heart-rate monitor pads 410. The detected actuation (e.g., pressing and releasing) of one of the switches can be interpreted by the computing unit as a request to increment or decrement the gear number. In general, each switch can be configured and/or re-configured/reprogrammed for either gear upshifting or downshifting.

In general, actuating a switch configured for upshifting increments the gear number, increasing the resistance to pedaling, and actuating a switch configured for downshifting decrements the gear number, thus decreasing the resistance to pedaling. The switches can be actuated together to achieve different gear adjustment results and/or various simulation effects. For example, two upshifting switches (e.g., one on each handle bar) can be actuated simultaneously such that the gear number is incremented by more over a given period of time than when only one upshifting switch is actuated. Similarly, two downshifting switches can be actuated simultaneously to decrement the gear number by more than actuating a single switch would. Thus, a user can change gears faster than with a single gear shifting lever or switch, which can be useful for racing in the virtual environment.
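
A minimal sketch of this simultaneous-actuation behavior is shown below; the per-interval step sizes are assumptions chosen for illustration, since the disclosure does not specify how much faster the gear changes.

```python
# Sketch: pressing the upshift (or downshift) switches on both handle bars at
# once changes the gear number by a larger step per polling interval than
# pressing a single switch. Step sizes here are illustrative assumptions.

def gear_step(up_pressed: int, down_pressed: int) -> int:
    """Signed gear change for one polling interval.

    up_pressed / down_pressed count how many upshift or downshift switches
    are actuated at that instant (0, 1, or 2 on a two-handle-bar machine).
    """
    step = 0
    if up_pressed:
        step += 2 if up_pressed >= 2 else 1
    if down_pressed:
        step -= 2 if down_pressed >= 2 else 1
    return step

print(gear_step(up_pressed=1, down_pressed=0))  # +1
print(gear_step(up_pressed=2, down_pressed=0))  # +2, a faster upshift for racing
```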

FIG. 5 depicts an example functional block diagram of the computing unit 550 of the exercise equipment that generates a virtual environment through which a user travels during exercise in which steering motion is simulated based on detected handle bar motion.

The computing unit 550 includes a network interface 502, a virtual environment simulator 504, a steering module 506, and/or a gear shifting module 508. Additional or fewer modules can be included without deviating from the novel art of this disclosure. In addition, each module in the example of FIG. 5 can include any number and combination of sub-modules, and systems, implemented with any combination of hardware and/or software modules.

In the example of FIG. 5, the network interface 502 can be one or more networking devices that enable the computing unit 550 to mediate data in a network with an entity that is external to the computing unit, through any known and/or convenient communications protocol supported by the host and the external entity. The network interface 502 can include one or more of a network adaptor card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, a bridge router, a hub, a digital media receiver, and/or a repeater.

A firewall can, in some embodiments, be included to govern and/or manage permission to access/proxy data in a computer network, and track varying levels of trust between different machines and/or applications. The firewall can be any number of modules having any combination of hardware and/or software components able to enforce a predetermined set of access rights between a particular set of machines and applications, machines and machines, and/or applications and applications, for example, to regulate the flow of traffic and resource sharing between these varying entities. The firewall may additionally manage and/or have access to an access control list which details permissions including, for example, the access and operation rights of an object by an individual, a machine, and/or an application, and the circumstances under which the permission rights stand.

Other network security functions performed or included in the functions of the firewall, can be, for example, but are not limited to, intrusion-prevention, intrusion detection, next-generation firewall, personal firewall, etc. without deviating from the novel art of this disclosure. In some embodiments, the functionalities of the network interface 502 and the firewall are partially or wholly combined and the functions of which can be implemented in any combination of software and/or hardware, in part or in whole.

The network interface 502 includes a communications module or a combination of communications modules communicatively coupled to the network interface 502 to manage a one-way, two-way, and/or multi-way communication sessions over a plurality of communications protocols.

One embodiment of the computing unit 550 includes a virtual environment simulator 504. The virtual environment simulator 504 can be any combination of software agents and/or hardware modules able to generate, simulate, modify, or update a virtual or interactive environment that is shown on a display unit of exercise equipment.

In general, the virtual/interactive environment includes an exercise route in a landscape through which a virtual body moves. Multiple virtual bodies may be present in the virtual environment; however, generally at least one virtual body is controlled by the user exercising on the equipment. The virtual/interactive environment is depicted in FIG. 9, and additional details regarding the features thereof are described with further reference to the example of FIG. 9.

In one embodiment, the simulator 504 simulates steering motion of the virtual body in response to movement of a handle bar coupled to the exercise equipment. For example, the simulator 504 can communicate with the steering module 506 which can detect physical movement/rotation of one or more handle bars (e.g., rotation about an axis) attached to the equipment. In addition, the steering module 506 can determine (e.g., quantify) the displacement due to physical motion/rotation of one or more of the handle bars. For example, the steering module 506 can determine the angle of displacement and/or the direction of the displacement, of one or more of the handle bars. In one embodiment, the angle detector senses the angle/direction of the displacement.

Based on the angle and/or direction of the displacement of a handle bar, the simulator 504 can modify simulation of motion of the virtual body through the virtual environment such that the exercising user perceives steering motion according to the physical movement/rotation of one or more handle bars. Note that angle measurements can also be positive or negative to represent the relative direction of the displacements of the handle bars.

The displacement of one handle bar can be used by the simulator 504 for simulating steering motion. In addition, displacement of each of multiple handle bars can also be used in conjunction by the simulator 504 to simulate steering motion of a virtual body. For example, the steering module 506 can determine the difference between a first rotation angle of the handle bar from the axis and a second rotation angle of the second handle bar from the axis. The simulator module 504 can simulate steering motion based on the difference in rotation angles between two handle bars. Additional handle bars can also be used on a single exercise machine and maneuvered by a user to simulate steering motion while exercising.

In one embodiment, the simulator 504 simulates upshifting or downshifting of a gear in response to actuation of a gear shifting unit. For example, the simulator 504 can communicate with the gear shifting module 508 which can detect actuation of one or more gear shifting units or switches. The gear shifting module 508 can determine whether the activated switches are configured for gear upshifting or gear downshifting. For example, the increment/decrement detector can determine whether the activated switch or combination of switches increments or decrements the gear. In some instances, activating multiple switches at a time can cause the gear to increment more compared to activating one switch. For example, for exercise equipment with a gear shifting unit on each of two handle bars, the upshifting switches on each of the gear shifting units can be actuated (e.g., pressed and released) simultaneously to increment the gear by more than when one switch is activated. Similarly, both downshifting switches can be activated simultaneously to decrement the gear by more.

The simulator module 504 is in communication with the gear shifting module 508 and can accordingly simulate gear upshifting or downshifting in the virtual environment. Note that, in general, assignment of a switch on a gear shifting unit for upshifting or downshifting is user-programmable or re-programmable (e.g., either through the exercise equipment or a remote computer).

The computing unit 550, although illustrated as comprised of distributed components (physically distributed and/or functionally distributed), could be implemented as a collective element. In some embodiments, some or all of the modules, and/or the functions represented by each of the modules can be combined in any convenient or known manner. Furthermore, the functions represented by the modules can be implemented individually or in any combination thereof, partially or wholly, in hardware, software, or a combination of hardware and software.

FIG. 6 depicts an example block diagram of the components of the computing unit 650 of the exercise equipment.

In one embodiment, the computing unit 650 includes a network interface 602, a processing unit 604, a memory unit 606, a storage unit 608, a video processor 610, and/or a TV tuner 612. Additional or fewer units or modules may be included. One example of a suitable network interface 602 has been described in the example of FIG. 5.

In one embodiment, the computing unit 650 further includes a processing unit 604. The data received from the network interface 602 can be input to the processing unit 604. The data that is received can include search queries, content from various content sources, or content from a user content repository. The processing unit 604 can include one or more processors, CPUs, microcontrollers, FPGAs, ASICs, DSPs, or any combination of the above. Data that is input to the computing unit 650 can be processed by the processing unit 604 and output to a display and/or output via a wired or wireless connection to an external device, such as a mobile phone, a portable device, a host or server computer, by way of a communications component.

One embodiment of the computing unit 650 further includes a memory unit 606 and a storage unit 608. The memory unit 606 and the storage unit 608 are, in some embodiments, coupled to the processing unit 604. The memory unit can include volatile and/or non-volatile memory. In dynamically configuring exercise equipment and adjusting characteristics of virtual exercise routes, the processing unit 604 may perform one or more processes related to acquiring user exercise data (e.g., motion parameters and/or biometric parameters) and comparing the exercise data with the user's preferences.

In some embodiments, any portion of or all of the functions described of the various example modules in the computing unit of the example of FIG. 5 can be performed by the processing unit 604. In particular, with reference to the computing unit illustrated in FIG. 5, the functions and techniques executed by the virtual environment simulator 504, the steering module 506, and/or the gear shifting module 508 can be performed via any of the combinations of modules in the control subsystem that are not illustrated, including, but not limited to, the processing unit 604 and/or the memory unit 606.

FIG. 7A depicts a flow diagram illustrating example processes for simulating steering motion through a virtual environment based on a difference in displacement angles between first and second handle bars.

In process 702, the virtual environment is generated and displayed on the display unit of the recumbent exercise bicycle during an exercise session. In general, the virtual environment can be simulated based on detected motion of pedals of the exercise bicycle. In addition, the virtual environment can be simulated to modify the experience of the user while exercising in response to detected physical motion of one or more handle bars attached to the exercise equipment.

For example, the physical motion of the handle bars can be translated to steering motion of a virtual body through the virtual environment. In process 704, a first displacement of a first handle bar of an exercise bicycle (e.g., recumbent bicycle) from an initial position is detected. In process 706, a second displacement of a second handle bar of the recumbent exercise bicycle from an initial position is detected.

In general, motion through the virtual environment can be based individually on the first and second displacements or in combination. For example, a first angle of the displacement of the first handle bar from the initial position can be determined and a second angle of the second displacement of the second handle bar from the initial position can be determined. In process 708, a difference between the first angle and the second angle is computed. In process 710, motion through the virtual environment is simulated based on the computed difference.
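
The processes of FIG. 7A can be pictured as a small per-update routine like the one below; the sensor and return-value interfaces are assumed for illustration and do not correspond to any particular firmware of the equipment.

```python
# Rough mapping of the FIG. 7A processes to a per-update routine, under
# assumed sensor interfaces (illustration only).

class AngleSensor:
    """Stand-in for a handle bar angle sensor; returns degrees from neutral."""
    def __init__(self, angle_deg: float = 0.0):
        self.angle_deg = angle_deg
    def read_angle(self) -> float:
        return self.angle_deg

def steering_update(left: AngleSensor, right: AngleSensor) -> float:
    left_angle = left.read_angle()       # process 704: first displacement detected
    right_angle = right.read_angle()     # process 706: second displacement detected
    diff = left_angle - right_angle      # process 708: difference between the angles
    return diff                          # process 710: value fed to the simulator

print(steering_update(AngleSensor(12.0), AngleSensor(-3.0)))  # 15.0
```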

Note that a positive difference can be associated with either simulation of a left steering motion or a right steering motion in the virtual environment. The user can adjust this setting based on preferences or based on the virtual route through which he/she is riding while exercising. The configuration process is further illustrated with reference to the example of FIG. 7B. In general, the larger the magnitude of the difference, the more steering is simulated in the virtual environment.

FIG. 7B depicts a flow diagram illustrating example processes for configuring the handle bars and switches for gear shifting.

In process 722, a request to configure a handle bar coupled to exercise equipment (e.g., recumbent bicycle) is received. The configuration request can be placed by an exercising user of an exercise machine (e.g., a recumbent exercise bicycle) during exercise or while setting up the machine. In addition, the request can be submitted through a server connected to the exercise machine over a network and provided by an equipment maintainer. In addition, the handle bars may automatically be configured to a default setting when software is installed and/or upgraded either locally or remotely (e.g., via connection to a remote server).

In process 724, clockwise motion of the first handle bar is assigned to cause simulation of a leftwards steering motion or a rightwards steering motion. In process 726, counterclockwise motion of the first handle bar is assigned to cause simulation of a leftwards steering motion or a rightwards steering motion. In process 728, a configuration request of the gear shifting switches is received. The gear shifting switches may be un-configured or configured to default settings. A user can request to configure/program or reconfigure/reprogram the settings for the gear shifting switches. For example, in process 730, a first switch on the first handle bar is configured for gear upshifting. In process 732, a second switch on the first handle bar is configured for gear downshifting.
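
One way to picture the assignments made in processes 724 through 732 is as a small configuration record like the sketch below; the field names and values are hypothetical, since the disclosure does not specify a data format.

```python
# Hypothetical configuration record for the handle bar and switch assignments
# (field names and defaults are illustrative assumptions).

from dataclasses import dataclass

@dataclass
class HandleBarConfig:
    clockwise_steers: str = "right"          # process 724: "left" or "right"
    counterclockwise_steers: str = "left"    # process 726: "left" or "right"
    switch_1_action: str = "upshift"         # process 730: first switch role
    switch_2_action: str = "downshift"       # process 732: second switch role

# A user, or a remote server, could later swap these assignments to reprogram
# the handle bar and its gear shifting switches.
mirrored = HandleBarConfig(clockwise_steers="left", counterclockwise_steers="right",
                           switch_1_action="downshift", switch_2_action="upshift")
print(mirrored)
```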

The switches can be reconfigured to trigger gear upshifting or downshifting for a specific amount of time (e.g., for a certain number of exercise sessions), for a specific type of virtual exercise route, for a type of operating mode, or for use under unspecified types of conditions (e.g., indefinitely until the switches are reconfigured/reprogrammed).

FIG. 7C depicts a flow diagram illustrating example processes for adjusting pedaling resistance and/or simulating gear upshifting/downshifting in the virtual environment.

In process 742, actuation of a switch on the first handle bar is detected. In process 744, it is determined whether the actuated switch is configured for gear upshifting or gear downshifting. If the actuated switch is configured for gear upshifting, in process 746, upshifting of the gear is simulated in the virtual environment. If the actuated switch is configured for gear downshifting, in process 746, downshifting of the gear is simulated in the virtual environment. In addition, the pedaling resistance experienced by a user when exercising on the equipment can, though not necessarily, be adjusted according to whether the actuated switch is configured for gear upshifting or downshifting.

FIG. 8 depicts an example illustration of a keypad panel 800.

The user's selection of exercise parameters and features is performed via the keypad 800 (e.g., keypad 218 in the example of FIG. 2). Selections are scrolled through using the cursor keys 807, and entries are selected using the enter key 808. The speaker or headset volume is controlled using the volume control keys 805. The selection of music or a TV channel may be performed using the numeric keypad 804. Numeric and character entry is also possible through the keypad 804. Keys 801 and 802 can be used to select the exercising mode, and key 803 can be used to switch the TV on/off. The headphone or the headset with microphone can be coupled to port 823.

FIG. 9 depicts an example image shown on the display unit 900 of the exercise equipment, according to one embodiment.

The exercise equipment 200 of the example of FIG. 2 can be used to control a primary virtual body in a simulated virtual environment (e.g., an interactive environment). In one embodiment, the image shown on the display unit 900 (e.g., the display unit 216 of the example of FIG. 2) includes the first-person perspective view of the primary virtual body as it moves in the virtual environment. A view of an example of this image is illustrated in FIG. 9.

In one embodiment, the screen of the display unit 900 shows the image of the virtual environment as it is seen by the primary virtual body. In this embodiment, the user perceives himself or herself as riding a virtual bicycle whose handle bars 901 are visible in the image. The virtual environment can include a virtual countryside with a road 920, a tree 921, distant mountains 928, a house 927, and a pair of bicycles with riders 906.

As the user turns the pedals of the exercise equipment, the primary virtual body moves forward in the direction shown toward the middle of the image. In addition to the mentioned virtual-reality image, the image on the display unit 900 includes an information display overlaid over the virtual-reality image. In one embodiment, the path on which the user is to ride is predetermined at the beginning of the session. It is referred to as the virtual exercise route (VER). The overlaid information may include a map 907 of the virtual exercise route in the virtual landscape and the user's own virtual bicycle position on that path. It may also include a summary of time, total dissipated calories (or Joules), miles traveled, and distance remaining on that virtual exercise route.

The video screen area 929 shows written messages delivered to the user by the computing unit and other innovative functions that are part of the embodiments of this disclosure and will be described further below. The detailed view of the lower part of the image 900 can be referred to as the “heads-up display” and shows exercise information 930: cadence (momentary rotational speed of the pedals measured in revolutions per minute), gear number, virtual slope against which the bicycle is moving (noted as grade), momentary power dissipation by the user, and heart rate can be shown and displayed in the image.

The indicator “cadence” in the user interface can be displayed to show the momentary rotational speed of the pedals measured in revolutions per minute. The gear number indicator in the user interface can correspond to the physical gear number to which the exercise equipment is currently set. Cadence and the current gear number can be used to determine the bicycle speed measured in miles per hour or any other suitable speed units. The speed can be computed in the virtual environment. The position of the primary virtual bicycle in the virtual landscape determines the slope against which the bicycle is moving (when pedals are rotating).

The slope can be indicated by “grade”. From the grade and the speed of the primary virtual bicycle, the virtual reality simulator can compute the resistance that the rider would experience via the physical pedals to simulate a real-life-like experience. This information is communicated to the pedal mechanism, which electronically adjusts the resistance to rotation. With known resistance to rotation and cadence, the program can determine the momentary power dissipation by the user. The instantaneous power and heart rate can be measured in Watts and beats per minute, respectively, and can be shown in the image. The total time spent riding, the total energy dissipated (calories), and miles traveled can also be depicted on the user interface.
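
As a back-of-the-envelope illustration of the last step above (the formula and numbers are assumptions, not taken from the disclosure), the momentary power shown in the heads-up display can be estimated from the pedal resistance torque and the cadence.

```python
# Hedged sketch: power = resistance torque x pedal angular velocity.
import math

def power_watts(resistance_torque_nm: float, cadence_rpm: float) -> float:
    """Momentary power dissipation for a given pedal resistance and cadence."""
    omega = 2.0 * math.pi * cadence_rpm / 60.0   # pedal angular velocity, rad/s
    return resistance_torque_nm * omega

# Example: about 24 N*m of pedal resistance at 80 rpm corresponds to ~200 W.
print(round(power_watts(24.0, 80.0)))  # 201
```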

In one embodiment, the path on which the user is to ride (or, the “exercise route” or “virtual exercise route”) is generally determined at the beginning of the session. In this case, the map and the elevation profile of the exercise route are known. The map is shown schematically, and its elevation profile is also depicted; the momentary position of the virtual bicycle on this virtual exercise route is noted as well. The characteristics of the virtual environment are described in more detail in the text below.

VIRTUAL LANDSCAPE MODEL: The control and power delivered by the user to the bicycle, i.e., the motion parameters captured by sensors of the exercise equipment, are mapped to motion parameters within the virtual world. In one embodiment, the virtual environment and the user's interaction with the virtual environment are designed so that virtual bodies/elements that move through it obey physical laws of motion. In one embodiment, the mapping between the real-world motion parameters and the virtual-world motion parameters is approximately one to one.

For example, if the virtual world is modeled after a real-world landscape and contains a virtual exercise route that is a representation of a real-world path of specified length and elevation challenge, then the user exercising along this virtual exercise route will have to dissipate approximately the same amount of energy, at the same level of exertion, as he would if he were riding the real-world path: bodies have size, mass, moment of inertia, and the landscape has hills, bodies of water, and paths with slopes and surface features that are similar to those occurring in nature. In one embodiment, the virtual landscape is modeled after a real landscape; it is a computing unit-graphics-stylized exercise route through a real-life landscape with hills, valleys, roads, and road obstacles.

In one embodiment, the virtual landscape features landscape, roads, and road obstacles that are entirely fictional, with features and living beings that do not have real-world counterparts, but the objects and the virtual exercise routes still obey real-world physical laws of motion; hence the motion parameters on the exercise equipment are approximately mapped one-to-one to the physical-law parameters in the virtual world. In another embodiment, the virtual environment is a landscape on a different planet that features a weaker gravitational pull; hence the motion parameters captured on the exercise equipment will map to the motion appropriate to the virtual environment and the representation of the primary virtual body in that environment.

In one embodiment, the view of the rendering of the virtual environment provided to the user of the exercise equipment on the display unit is corrected for perspective.

MOTION CONTROL: The motion and location of any virtual body in the virtual environment is determined by its virtual-motion attributes. In an embodiment, the virtual-motion attributes include but are not limited to the mass and moments of inertia of the virtual body, its location, velocity and acceleration, and the position of the steering mechanism (e.g., handle bars), considering that the velocity and the acceleration are vector quantities. The virtual-motion parameters of the primary virtual body are controlled by the motion and biometric parameters acquired from the exercise equipment while it is operated by the user.

HILLS AND VALLEYS: In one embodiment, when the primary virtual vehicle is moving up a hill in the computing unit-generated landscape or any other virtual environment, it requires power proportional to its mass and instantaneous velocity in the virtual environment. In one embodiment, the power delivered to the virtual body towards forward motion in a virtual-environment designed to follow real-world physical laws is substantially equal to the instantaneous power delivered to the exercise station by the user pedaling. In other words, when the primary virtual body moves up a hill in the virtual environment, the difficulty in pedaling for the user exercising increases and the power necessary to surmount the hill in the virtual environment is substantially equal to the power that would be necessary for the user to surmount such a hill in real life on a real bicycle.
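
The power balance described above can be illustrated with a simple gravity-only estimate; the mass, grade, speed, and formula below are assumed values for illustration and ignore rolling resistance and drag.

```python
# Illustrative arithmetic for climbing power in a virtual environment that
# follows real-world physics: P ~= m * g * v * sin(slope).
import math

def climbing_power_w(mass_kg: float, speed_mps: float, grade_percent: float) -> float:
    """Approximate power needed against gravity on the given grade."""
    slope = math.atan(grade_percent / 100.0)
    return mass_kg * 9.81 * speed_mps * math.sin(slope)

# An assumed 85 kg rider-plus-bicycle moving at 5 m/s up a 3% virtual grade
# needs roughly 125 W against gravity, so the pedal resistance can be set so
# the exercising user must deliver substantially that power.
print(round(climbing_power_w(85.0, 5.0, 3.0)))  # ~125
```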

VIRTUAL EXERCISE ROUTE: In one embodiment, the primary virtual body is allowed to move on a specified path in the virtual environment. This path is referred to as the Virtual Exercise Route (VER). In another embodiment, the primary virtual body is allowed to move anywhere through the virtual environment—to ride over the entire virtual terrain. A virtual exercise route is a path in a virtual landscape along which virtual vehicles move, at least one of the virtual vehicles being controlled by the actions of the user exercising on the exercise equipment. A related term is a virtual tour, which is a sequence of events experienced by the user who is sitting on the stationary exercise equipment, pedaling, steering, changing gears and watching images of a virtual environment shown on the video device in front of him or her. The user watches the images on the video device and acts as if he or she is the driver of the virtual vehicle or the runner running through the virtual landscape or along a virtual exercise route.

SHAPE OF ROUTE: On any closed-loop virtual exercise route, the virtual body may get from one arbitrary point on the exercise route to another arbitrary point on the exercise route in at least two ways: by moving forward from one point to the other or by moving backwards. This is the case when the exercise path is a closed loop. In one embodiment, the virtual exercise path has more than two ways in which the virtual vehicle can get from one arbitrary point to another arbitrary point on the virtual exercise route. For example, the virtual path can be shaped as the number eight (8) or any other homotopic shape (two shapes are homotopic if one can be continuously deformed into the other). This means that the virtual exercise path features path branching at which the user can make a selection of which branch of the path she wants to take. In another embodiment, the selection of the path is determined by another source. In another embodiment, the virtual exercise route is a maze. For example, the exercise route can be a tour puzzle containing loops, and the user may have to reach the finish in a given amount of time. In another embodiment, the virtual exercise route is a nonplanar graph. In graph theory, a planar graph is a graph that can be drawn so that no edges intersect in the plane; a nonplanar graph cannot be drawn in the plane without edge intersections. In another embodiment, the virtual exercise route comprises more than one unconnected path. In order to move from one closed path to another, the user is challenged to execute a goal. Reaching the goal transports the user to one of the other unconnected paths. All of the described embodiments related to the shape of the exercise route increase the entertainment potential of the exercise method according to the present disclosure. In one embodiment, the virtual vehicle may move freely on the surface of the virtual environment without being constrained to a path. In another embodiment, the surface of the virtual environment is not simply connected. In topology, a geometrical object or space is called simply connected if it is path-connected and every path between two points can be continuously transformed into every other. An object is simply connected if it consists of one piece and does not have any “holes” that pass all the way through it.

REPRESENTATION OF VIRTUAL VEHICLE: The primary virtual body is the body that experiences sensation in the virtual environment to relay these sensory experiences to the user exercising and the body that exerts action in the virtual world under the control of the user exercising. The primary virtual body is generally associated with a virtual vehicle of some kind and when they are inseparable in the virtual environment, we refer to the primary virtual vehicle and the primary virtual body interchangeably since they are not separable during the course of the exercise. The primary virtual vehicle and its representation in the virtual environment may vary. In one embodiment, the exercise equipment is modeled after a real-life bicycle and the virtual representation of the primary virtual vehicle is that of a computing unit-stylized bicycle. In another embodiment, the exercise equipment is modeled after a row boat and the primary virtual vehicle is represented as a computing unit-stylized row boat. In yet another embodiment, the primary virtual body is a fictitious object moving in an arbitrary imaginary world.

DISPLAY OF BIOMETRIC DATA: In one embodiment, the video device displays current exercise data collected on the user exercising. Exercise data includes motion data and biometric data. An example of biometric and motion data displayed on the video device for the user to observe is shown in FIG. 9. In one embodiment, the video device displays meta-information such as heart-rate zones, perceived exertion level, or similar quantities. In another embodiment, the video device displays target exercise data for the user to observe and attempt to reach. In one embodiment, the exercise data targets have been set by a live coach. In another embodiment, the target exercise data have been set by the computing unit function referred to as the “virtual coach”. In yet another embodiment, the displayed target exercise data have been previously recorded by the same user on the same virtual exercise route (VER).

VIRTUAL REALITY: In the last decade, there has been significant development of virtual reality software and of programs that have virtual reality attributes (inherent characteristics). Virtual reality is an artificial environment that is experienced through sensory stimuli (most often, but not limited to, sights and sounds) provided by a computing unit, and in which one's actions partially determine what happens in the environment.

The essential elements of a virtual-reality system are (a) a computing unit that runs a virtual reality program, (b) a person (“the user”) using the system, and (c) a set of interfaces, some of which receive input from the user and some of which deliver sensory stimuli to the user. The function common to all virtual reality programs is that the computing unit simulates the presence of a virtual body in a virtual environment and that the sensory experiences of that virtual body are delivered to the user through sensory interfaces. In many virtual-reality systems, the user also has the ability to control the actions of the virtual body and hence has an effect on the virtual environment.

A virtual environment may include the activities of multiple virtual bodies as well as activities resulting from natural and artificial (fictitious) phenomena. Consequently, the tasks of the virtual reality program are to (a) simulate the activities of these multiple virtual bodies and phenomena and (b) create the sensory stimuli experienced by one particular virtual body, referred to as the primary virtual body or the recipient of the virtual sensory stimuli. A virtual body within the virtual environment may be controlled by a real-life person, also referred to as the user of the virtual reality system. The virtual body controlled by a user is referred to as the primary virtual body. Multiple users can control a variety of virtual bodies within the same virtual environment, and these bodies can interact within the virtual environment. Interaction with the virtual environment typically refers to causing any action to be performed on any element or body in the virtual environment.

The control of the virtual bodies is realized by capturing the actions of a user with input devices, whose signals are in turn processed by the computing unit. When the primary virtual body is a vehicle or a runner, the control parameters may include information about the direction, velocity, and acceleration of that virtual vehicle. In the following text, the words vehicle and virtual vehicle will be understood to mean any vehicle, virtual runner, or any other virtual creature or virtual machine that moves through the virtual environment.

The sensory experiences of the primary virtual body in the virtual environment are delivered to the user, the sensory recipient, using output devices. Depending on the architecture of the virtual-reality system, the simulation of the activities of multiple objects and virtual bodies may be indistinguishable from the creation of the stimuli experienced by the recipient. For the purpose of this description, computing unit simulation of virtual body activities also encompasses computing unit generation of the stimuli to be delivered to the recipient.

A simulation is the imitative representation of the functioning of one system or process by means of the functioning of another, i.e., a computing unit simulation. As a non-limiting example, a road bicycle ridden through a real landscape is imitatively represented by a computing unit-simulated bicycle riding in a computing unit-simulated landscape. A computing unit-simulated bicycle is also referred to as a virtual bicycle; likewise, real road vehicles, runners, or climbers are represented as virtual vehicles, virtual runners, or virtual climbers. Virtual vehicles need not be representations of real road vehicles; they may also be fictional and do not necessarily have to obey real-world laws of motion.

A related concept is computing unit reconstruction. To reconstruct means to construct again: to establish or assemble anew, or to build up again mentally. A computing unit-reconstructed landscape is a landscape that is modeled, and its image simulated, by a computing unit using a suitable computing unit program. The objects and phenomena appearing within the computing unit-simulated environment are referred to as “virtual” objects and phenomena. In this application, the computing unit-simulated environment utilizes rendering and allows interaction between users and between the user and the virtual objects, elements, and phenomena. It can be referred to as a “simulated interactive environment” or, for short, a “virtual environment”.

Virtual bodies other than the primary virtual body may exist in the virtual environment regardless of whether they can be seen, heard, or interacted with in any other way by the primary virtual body. These virtual bodies are referred to as persistent virtual bodies. Alternatively, virtual bodies that exist in the virtual environment only in the regions where the primary body can see them, hear them, or otherwise interact with them, but that do not exist when they cannot be seen, heard, or interacted with, are regarded as non-persistent.
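
A rough sketch of the persistent/non-persistent distinction follows; the class, the one-dimensional positions, and the distance-based interaction test are assumptions made only for illustration.

```python
# Hypothetical sketch: persistent bodies are always simulated; non-persistent
# bodies are only instantiated while the primary body can interact with them.

from dataclasses import dataclass

@dataclass
class VirtualBody:
    name: str
    position: float        # 1-D position along the route, for simplicity
    persistent: bool

INTERACTION_RANGE = 100.0   # assumed visibility/interaction radius

def bodies_to_simulate(all_bodies, primary_position):
    active = []
    for body in all_bodies:
        in_range = abs(body.position - primary_position) <= INTERACTION_RANGE
        if body.persistent or in_range:
            active.append(body)   # simulated this frame
        # non-persistent bodies out of range simply do not exist this frame
    return active

bodies = [VirtualBody("pace rider", 950.0, persistent=True),
          VirtualBody("spectator", 40.0, persistent=False)]
print([b.name for b in bodies_to_simulate(bodies, primary_position=30.0)])
```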

Typically, a virtual reality program, or a subset of such a program, provides stimuli to one user based on the experiences of the primary virtual body. When two or more users interact in the virtual environment, the virtual-reality system architecture may include two or more computing units, each running its own virtual reality program with its own primary virtual body, and each coupled to all of the other computing units. Some virtual bodies are controlled by a computing unit and exhibit artificial intelligence. Other architectures can also be employed to serve multiple users.

VIEWING ANGLE AND APPEARANCE OF VIRTUAL BODIES: In a virtual reality application, the sensory stimuli from the computing unit are delivered to the user via visual and auditory output devices. This creates the perception for the user that he or she is experiencing the activity experienced by the primary virtual body. If the viewing angle and the sounds delivered to the user are those that the primary virtual body appears to experience in the virtual environment, the experience is referred to as first-person perspective. Most virtual reality programs operate in this perspective, since the user's control of the primary body's activity in the virtual environment is by its very nature a first-person control. One embodiment of the present disclosure employs the first-person perspective. It is also possible to create a situation in which the user controls the virtual body as a first person but observes the actions of this body from a different perspective, namely, from the perspective of a third person located behind or some distance away from the primary body. This case is referred to as third-person perspective. In one embodiment of the present disclosure, the third-person perspective is realized by placing a fixed camera view at some predetermined location within the virtual environment. In yet another embodiment, the user is allowed to choose the location of the camera, and hence to define his or her own third-person perspective angle and location.
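
Purely as an illustrative sketch of the two perspectives (the vector arithmetic, offsets, and function names are assumptions, not the disclosed implementation), a third-person view can be obtained by offsetting the camera behind and above the primary virtual body:

```python
# Hypothetical sketch: first-person vs. third-person camera placement.
import math

def first_person_camera(body_pos, heading_rad):
    """Camera coincides with the primary body, looking along its heading."""
    return body_pos, heading_rad

def third_person_camera(body_pos, heading_rad, back=6.0, up=2.0):
    """Camera placed a fixed distance behind and above the primary body."""
    x, y, z = body_pos
    cam = (x - back * math.cos(heading_rad),
           y - back * math.sin(heading_rad),
           z + up)
    return cam, heading_rad   # still looking in the body's direction of travel

pos, heading = (100.0, 50.0, 0.0), math.radians(30)
print(third_person_camera(pos, heading))
```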

The appearance of virtual bodies in the virtual environment is arbitrary. In one embodiment, the appearance is modeled after a person or a vehicle of choice. In another embodiment, the appearance is selected by the user.

EXERCISE EQUIPMENT AS A CONTROLLER: In the present disclosure, the activity of the primary virtual body is controlled by the user's motion on the exercise equipment and by the user's biometric data captured using biometric sensors that are coupled to the user during exercise. The virtual reality program simulates the motion of a vehicle or a runner moving within the virtual environment. The input devices capture the user's actions that define the direction, velocity, and acceleration of the virtual vehicle. The input devices may sense the activity of various control mechanisms on the exercise equipment, and the biometric monitors may sense the user's physical condition. In the present disclosure, any one or all of the inputs captured by the input devices may be used to affect the activity in the virtual environment. The motion of the virtual vehicle is controlled using movable elements on the exercise equipment that include, but are not limited to, a steering mechanism (e.g. handle bars), motion input devices (e.g. pedals, moving stairs, running track, oars), motion retardants (e.g. brakes), gear shifters, vibration sensors, or body movement sensors.
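
One way to picture this mapping from equipment inputs to virtual-vehicle control is sketched below. The gain constants, gear table, and function names are assumptions; the idea of deriving steering from the two handle bar angles mirrors the handle-bar embodiments described elsewhere in this disclosure.

```python
# Hypothetical sketch: derive virtual-vehicle control values from
# exercise-equipment inputs.  Gains and units are illustrative assumptions.

STEER_GAIN = 0.5       # degrees of virtual steering per degree of bar difference
WHEEL_CIRCUM_M = 2.1   # assumed virtual wheel circumference
GEAR_RATIOS = {1: 1.0, 2: 1.4, 3: 1.9, 4: 2.5}   # gear number -> speed multiplier

def steering_angle(left_bar_deg, right_bar_deg):
    """Steering magnitude proportional to the difference in bar angles."""
    return STEER_GAIN * (left_bar_deg - right_bar_deg)

def virtual_speed_mps(cadence_rpm, gear):
    """Gear number sets the relationship between cadence and virtual velocity."""
    rev_per_s = cadence_rpm / 60.0
    return rev_per_s * GEAR_RATIOS[gear] * WHEEL_CIRCUM_M

print(steering_angle(12.0, 4.0))    # bars rotated unequally -> steer
print(virtual_speed_mps(85.0, 3))   # cadence 85 rpm in gear 3
```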

EXERCISE RECORD: The embodiments of this disclosure allow digital recording (capturing and storing) of the motion of all parts of the exercise equipment, of all acquired biometric data, and of the activity simulated within the virtual environment. The recorded data allow all of the activity occurring on the exercise equipment to be reconstructed at a later time for the purpose of analysis or of reconstructing the activity within the virtual environment. An exercise record is associated with a user and comprises several types of data: exercise session data, exercise preferences, and a fitness record. Exercise session data include biometric data and motion data.
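
The three parts of an exercise record named above can be pictured as a simple container. This is a minimal sketch; the field names and types are hypothetical placeholders, not a disclosed data format.

```python
# Hypothetical sketch: the three parts of an exercise record.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ExerciseSessionData:
    biometric: Dict[str, List[float]] = field(default_factory=dict)   # e.g. heart rate vs. time
    motion: Dict[str, List[float]] = field(default_factory=dict)      # e.g. cadence, bar angle vs. time

@dataclass
class ExerciseRecord:
    user_id: str
    sessions: List[ExerciseSessionData] = field(default_factory=list)
    preferences: Dict[str, float] = field(default_factory=dict)       # targets, personal settings
    fitness_record: Dict[str, str] = field(default_factory=dict)      # long-term analysis/notes

record = ExerciseRecord(user_id="user-001", preferences={"calorie_target": 500.0})
print(record.user_id, record.preferences)
```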

Biometric data is a term used collectively to describe a set of instantaneous biometric parameters acquired during an activity session, the temporal profile (history) of these parameters, and the quantities calculated from these biometric parameters.

Motion data comprises physical quantities that describe the motion of the different parts of the exercise bicycle, quantities that describe the virtual motion and activities of the primary virtual body within the simulated interactive environment for the specific session, and any quantity calculated from these. Motion data is logged during an activity (exercise) session. In one embodiment, motion data comprises a reduced set of the above-mentioned information: for example, rather than logging the velocity and power as a function of time, only the instantaneous position (of a virtual body) versus time is logged.

Examples of motion data are instantaneous cadence, power delivered to the exercise machine, gear number, position of the handle bars, the position, elevation, and velocity profile traversed by the primary virtual body within the computing unit-simulated interactive environment (the virtual exercise route), and the total energy dissipated. Specifically, the pedal rotation velocity (referred to as cadence) and the resistance to pedaling determine the instantaneous power dissipated by the user.

The instantaneous force pushing the pedals (controlled by the user and by the pedal resistance) multiplied by the instantaneous rotational velocity of the pedals equals the instantaneous power delivered by the user to the exercise machine. The power delivered to the exercise machine is expressed in Watts and is integrated (accumulated) by the computing unit. The time integral of power is energy, which is expressed in calories (or Joules). The gear number determines the relationship between the cadence and the virtual velocity of the vehicle in the virtual environment.
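
The power and energy relationships stated above can be written out as a small worked example. The sketch below is illustrative only: it uses a tangential pedal speed in place of the rotational velocity, and the sample values, sampling interval, and calorie conversion are assumptions.

```python
# Hypothetical sketch of the stated relationships:
#   power  [W] = pedal force x pedal speed
#   energy [J] = time integral (accumulation) of power
#   1 kcal ("calorie" on the display) is approximately 4184 J.

def instantaneous_power_w(pedal_force_n: float, pedal_speed_mps: float) -> float:
    return pedal_force_n * pedal_speed_mps

def accumulate_energy_j(power_samples_w, dt_s: float) -> float:
    """Discrete time integral of power over fixed sampling intervals."""
    return sum(p * dt_s for p in power_samples_w)

# One second of samples at 0.25 s intervals while pedaling steadily.
samples = [instantaneous_power_w(180.0, 1.2) for _ in range(4)]   # 216 W each
energy_j = accumulate_energy_j(samples, dt_s=0.25)
print(energy_j, energy_j / 4184.0)   # joules and (approximate) kilocalories
```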

The instantaneous power, cadence, gear number, their histories, and the energy dissipated during one exercise session are some of the motion data displayed on the computing unit screen for the user to see and stored by the computing unit on the storage unit. Rotation (or steering) of the handle bars to a certain angle from their steady-state, un-deflected position is another motion parameter. An exercise or activity session is a process that starts when the user selects the exercise mode and ends when the user requests a stop or abandons the exercise equipment. Biometric session data are similarly logged and include the instantaneous values and their history. Exercise session data (motion and biometric data) are associated with an activity session.

Exercise preferences comprise personal preferences and short-term and long-term fitness targets. For example, exercise preferences may include a target to ride a certain number of miles per week, to burn a certain number of calories per day on average or in total, to maintain the heart rate below a specified number, a target time for traversing a certain exercise route, a target weight, a target glucose level after a specified amount of caloric burn, etc. In one embodiment of the present disclosure, the computing unit program controlling the exercise equipment compares the current exercise-session data to the user's exercise preferences and, as a result, sends messages to the user when his or her fitness target has been reached.
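
A minimal sketch of such a comparison between current session data and stored exercise preferences follows; the dictionary keys, threshold values, and message texts are assumptions made only for illustration.

```python
# Hypothetical sketch: compare current session data against stored exercise
# preferences and emit a message when a fitness target has been reached.

def check_targets(session, preferences):
    messages = []
    if session.get("calories", 0.0) >= preferences.get("calorie_target", float("inf")):
        messages.append("Calorie target reached.")
    if session.get("distance_mi", 0.0) >= preferences.get("weekly_miles", float("inf")):
        messages.append("Weekly mileage target reached.")
    if session.get("heart_rate_bpm", 0.0) > preferences.get("max_heart_rate", float("inf")):
        messages.append("Heart rate above the specified limit.")
    return messages

prefs = {"calorie_target": 500.0, "weekly_miles": 30.0, "max_heart_rate": 160.0}
now = {"calories": 512.0, "distance_mi": 8.4, "heart_rate_bpm": 142.0}
print(check_targets(now, prefs))   # -> ['Calorie target reached.']
```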

The fitness record contains information on the overall performance of the user and on his or her physical condition and capability; it may contain a high-level analysis of one's long-term fitness plan. This information may be of interest to the user's physician or personal trainer (coach).

In one embodiment of the present disclosure, the exercise session data are stored on the storage unit at the end of an exercise session. The stored exercise session data can be loaded into a computing unit with virtual-reality capability, and the exercise session can be reviewed in its entirety. The session may be reviewed from a different perspective: for example, the person reviewing the exercise session may use a third-person perspective, while the person who created the exercise session (the person exercising) used a first-person perspective. This enables the reviewer to assess the exercise from a different point of view and provide feedback to the user.
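
As a rough sketch of replaying a stored session from a reviewer-chosen perspective (the record format, the fixed camera offset, and the function name are assumptions), one might step through the recorded positions of the primary virtual body and derive a camera position for each sample:

```python
# Hypothetical sketch: replay a stored exercise session from a reviewer-chosen
# perspective.  The record format and camera offset are illustrative only.

def replay(session_positions, perspective="third_person", offset=(-6.0, 0.0, 2.0)):
    """Yield one camera position per stored sample of the primary body's path."""
    for x, y, z in session_positions:
        if perspective == "first_person":
            yield (x, y, z)
        else:  # third person: fixed offset behind/above the recorded position
            yield (x + offset[0], y + offset[1], z + offset[2])

stored = [(0.0, 0.0, 0.0), (1.5, 0.2, 0.0), (3.1, 0.5, 0.0)]
for cam in replay(stored):
    print(cam)
```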

FIG. 10 shows a diagrammatic representation of a machine in the example form of a computer system 1000 within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed.

In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.

The machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a laptop computer, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, an iPhone, a Blackberry, a processor, a telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.

While the machine-readable medium or machine-readable storage medium is shown in an exemplary embodiment to be a single medium, the term “machine-readable medium” and “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” and “machine-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the presently disclosed technique and innovation.

In general, the routines executed to implement the embodiments of the disclosure, may be implemented as part of an operating system or a specific application, component, program, object, module or sequence of instructions referred to as “computer programs.” The computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer, and that, when read and executed by one or more processing units or processors in a computer, cause the computer to perform operations to execute elements involving the various aspects of the disclosure.

Moreover, while embodiments have been described in the context of fully functioning computers and computer systems, those skilled in the art will appreciate that the various embodiments are capable of being distributed as a program product in a variety of forms, and that the disclosure applies equally regardless of the particular type of machine or computer-readable media used to actually effect the distribution.

Further examples of machine-readable storage media, machine-readable media, or computer-readable (storage) media include but are not limited to recordable type media such as volatile and non-volatile memory devices, floppy and other removable disks, hard disk drives, optical disks (e.g., Compact Disk Read-Only Memory (CD ROMS), Digital Versatile Disks, (DVDs), etc.), among others, and transmission type media such as digital and analog communication links.

Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” As used herein, the terms “connected,” “coupled,” or any variant thereof, means any connection or coupling, either direct or indirect, between two or more elements; the coupling of connection between the elements can be physical, logical, or a combination thereof. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.

The above detailed description of embodiments of the disclosure is not intended to be exhaustive or to limit the teachings to the precise form disclosed above. While specific embodiments of, and examples for, the disclosure are described above for illustrative purposes, various equivalent modifications are possible within the scope of the disclosure, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative embodiments may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel, or may be performed at different times. Further any specific numbers noted herein are only examples: alternative implementations may employ differing values or ranges.

The teachings of the disclosure provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various embodiments described above can be combined to provide further embodiments.

Any patents and applications and other references noted above, including any that may be listed in accompanying filing papers, are incorporated herein by reference. Aspects of the disclosure can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further embodiments of the disclosure.

These and other changes can be made to the disclosure in light of the above Detailed Description. While the above description describes certain embodiments of the disclosure, and describes the best mode contemplated, no matter how detailed the above appears in text, the teachings can be practiced in many ways. Details of the system may vary considerably in its implementation details, while still being encompassed by the subject matter disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the disclosure with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the disclosure to the specific embodiments disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the disclosure encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the disclosure under the claims.

While certain aspects of the disclosure are presented below in certain claim forms, the inventors contemplate the various aspects of the disclosure in any number of claim forms. For example, while only one aspect of the disclosure is recited as a means-plus-function claim under 35 U.S.C. §112, ¶13, other aspects may likewise be embodied as a means-plus-function claim, or in other forms, such as being embodied in a computer-readable medium. (Any claims intended to be treated under 35 U.S.C. §112, ¶13 will begin with the words “means for”.) Accordingly, the applicant reserves the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the disclosure.

Claims

1. An apparatus, comprising,

an exercise equipment having a display unit and a computing unit;
a foot actuator mounted on a frame of the exercise equipment;
a seat assembly coupled to a rail mounted on the frame of the exercise equipment;
a handle bar coupled to the seat assembly on one side;
wherein, the computing unit is able to execute one or more instruction sets embodied on a machine-readable medium, the one or more instruction sets causing the computing unit to:
generate a virtual environment for display on the display unit;
wherein, the handle bar is physically movable to simulate a steering motion in the virtual environment;
wherein, the handle bar is programmable or reprogrammable to cause a leftwards steering motion or a rightwards steering motion to be simulated in the virtual environment in response to physical rotation of the handle bar in a given direction.

2. An apparatus, comprising,

an exercise equipment having a display unit and a computing unit;
a foot actuator mounted on a frame of the exercise equipment;
a seat assembly coupled to a rail mounted on the frame of the exercise equipment;
a handle bar coupled to the seat assembly on one side;
wherein, the computing unit is able to execute one or more instruction sets embodied on a machine-readable medium, the one or more instruction sets causing the computing unit to:
generate a virtual environment for display on the display unit;
wherein, the handle bar is physically movable to simulate a steering motion in the virtual environment.

3. The apparatus of claim 2, wherein, the exercise equipment is a recumbent bicycle wherein, the foot actuator is displaced laterally from the seat such that a user sits reclined in the seat assembly.

4. The apparatus of claim 2,

wherein, the foot actuator comprises pedals or steppers; and
wherein, the virtual environment is at least further simulated based on detected motion of the foot actuator.

5. The apparatus of claim 2, wherein, the handle bar is physically rotated clockwise or counterclockwise around an axis to simulate the steering motion.

6. The apparatus of claim 5, wherein, the handle bar is physically rotated via upwards or downwards motion of a hand of a user sitting on the seat.

7. The apparatus of claim 5, wherein, clockwise motion of the handle bar is programmable to cause simulation of a leftwards steering motion or a rightwards steering motion in the virtual environment.

8. The apparatus of claim 5, wherein, counterclockwise motion of the handle bar is programmable to cause simulation of a leftwards steering motion or a rightwards steering motion in the virtual environment.

9. The apparatus of claim 1, wherein, the handle bar is programmable or reprogrammable.

10. The apparatus of claim 5, wherein, a magnitude of the steering is proportional to a rotation angle of the handle bar from the axis.

11. The apparatus of claim 2, further comprising, a second handle bar coupled to a second side of the seat assembly.

12. The apparatus of claim 11, wherein, a magnitude of the steering is proportional to a difference between a first rotation angle of the handle bar from the axis and a second rotation angle of the second handle bar from the axis.

13. The apparatus of claim 12, wherein, a positive or negative value of the difference is configurable to cause simulation of a leftwards steering motion in the virtual environment.

14. The apparatus of claim 12, wherein, a positive or negative value of the difference is configurable to cause simulation of a rightwards steering motion in the virtual environment.

15. The apparatus of claim 11, wherein, the side and the second side are the left and right sides of the seat assembly.

16. The apparatus of claim 2, wherein, the seat assembly further comprises a back rest.

17. The apparatus of claim 2, further comprising, a seat-adjustment lever that is actuated to slide the seat assembly along the rail for position adjustment.

18. The apparatus of claim 17, wherein, the handle bar is coupled to the seat assembly such that the handle bar moves in conjunction with the seat assembly when a position of the seat assembly is adjusted.

19. The apparatus of claim 2,

further comprising, a gear shifting unit disposed on the handle bar;
wherein, the gear shifting unit is adjusted to increase or decrease resistance to move the foot actuator.

20. The apparatus of claim 19,

wherein, the gear shifting unit includes a first switch for upshifting the gear and a second switch for downshifting the gear;
wherein, the first and second switches are programmable or reprogrammable.

21. The apparatus of claim 2, wherein, the display unit includes an LCD display.

22. The apparatus of claim 2, wherein the exercise equipment is network-enabled and is connected to a remote server via the Internet.

23. A machine-readable storage medium having stored thereon a set of instructions which when executed perform a method for simulating a virtual environment for display on a display unit of a recumbent exercise bicycle, the method, comprising,

generating the virtual environment and displaying the virtual environment on the display unit of the recumbent exercise bicycle during an exercise session;
detecting a first displacement of a first handle bar of the recumbent exercise bicycle from a first initial position;
detecting a second displacement of a second handle bar of the recumbent exercise bicycle from a second initial position;
simulating motion through the virtual environment based on the first and second displacements.

24. The method of claim 23, further comprising,

determining a first angle of the displacement of the first handle bar from the first initial position;
determining a second angle of the displacement of the second handle bar from the second initial position;
computing a difference between the first angle and the second angle;
wherein, the motion through the virtual environment is simulated based on the difference.

25. The method of claim 23,

further comprising, in response to receiving a configuration request, assigning clockwise motion of the first handle bar to cause simulation of a leftwards steering motion or a rightwards steering motion in the virtual environment;
further comprising, in response to receiving another configuration request, assigning counterclockwise motion of the first handle bar to cause simulation of a leftwards steering motion or a rightwards steering motion in the virtual environment.

26. The method of claim 25, wherein, the configuration request is placed by an exercising user of the recumbent exercise bicycle.

27. The method of claim 23, further comprising, responsive to a request, configuring a first switch on the first handle bar for gear upshifting and configuring a second switch on the first handle bar for gear downshifting.

28. The method of claim 23, further comprising,

detecting actuation of a switch on the first handle bar;
determining whether the actuated switch is configured for gear upshifting or gear downshifting;
incrementing or decrementing resistance to moving pedals of the recumbent exercise bicycle;
accordingly simulating upshifting or downshifting of the gear in the virtual environment.

29. A machine-readable storage medium having stored thereon a set of instructions which when executed perform a method for simulating a virtual environment for display on a display unit of an exercise bicycle, the method, comprising,

generating the virtual environment and displaying the virtual environment on the display unit of the exercise bicycle during an exercise session;
determining a first angle of displacement of the first handle bar from a first initial position;
determining a second angle of displacement of the second handle bar from a second initial position;
computing a difference between the first angle and the second angle;
simulating motion through the virtual environment based on the difference between the first and second angles;
detecting actuation of a switch on the first handle bar;
determining whether the actuated switch is configured for gear upshifting or gear downshifting;
accordingly simulating upshifting or downshifting of the gear in the virtual environment.
Patent History
Publication number: 20100022354
Type: Application
Filed: Jul 22, 2009
Publication Date: Jan 28, 2010
Applicant: Expresso Fitness Corp. (Sunnyvale, CA)
Inventor: John Fisher (Los Gatos, CA)
Application Number: 12/507,709
Classifications
Current U.S. Class: Monitors Exercise Parameter (482/8); Bicycling (482/57)
International Classification: A63B 71/00 (20060101); A63B 22/06 (20060101);