Teleportation Systems and Methods in a Virtual Environment

Provided are systems and methods for teleportation in a virtual environment. One embodiment of such a system can be implemented as: a head mounted display configured to provide an immersive virtual environment; a teleportation device configured to provide navigation in the virtual environment; and at least one feedback device configured to provide a user with information corresponding to movement of the teleportation device within the virtual environment. The system also includes a plurality of input devices configured to generate a plurality of input signals in response to inputs from the user and a computing device configured to receive the plurality of input signals and control the at least one feedback device.

Description
CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority to copending U.S. provisional application entitled, “TELEPORTATION SYSTEMS AND METHODS,” having Ser. No. 60/659,283, filed Mar. 7, 2005, which is entirely incorporated herein by reference.

TECHNICAL FIELD

The present disclosure is generally related to virtual technology and, more particularly, is related to systems and methods for providing user interaction in a virtual environment.

BACKGROUND

Large scale Immersive Virtual Environments (IVEs) are common in current research. Some of the major problems in large scale IVEs, however, are traveling and navigation. These problems have been addressed by input devices such as handheld and fixed station user input devices as well as environment specific devices such as, for example, a virtual reality snowboard. The utilization of these input devices, however, can be awkward or unnatural and may require extensive training, especially if the device offers many degrees of freedom. For example, some previous input devices have required the user to memorize and perform specific coded gestures or sequences of gestures to make virtual environmental changes such as a direction or mode change. In such a device having many degrees of freedom, the user is tasked with memorizing and performing many potentially unnatural tasks and gestures to travel and navigate within a large scale IVE.

SUMMARY

Embodiments of the present disclosure provide a system and method for teleportation in a virtual environment. Briefly described, one embodiment of the system, among others, can be implemented as follows: a head mounted display configured to provide an immersive virtual environment; a teleportation device configured to provide navigation in the virtual environment; at least one feedback device configured to provide a user with information corresponding to movement of the teleportation device within the virtual environment; a plurality of input devices configured to generate a plurality of input signals in response to inputs from the user; and a computing device configured to receive the plurality of input signals and control the at least one feedback device.

Embodiments of the present disclosure can also be viewed as methods for providing teleportation in a virtual environment. In this regard, one embodiment of such a method, among others, can be broadly summarized by the following steps: delivering a video signal, corresponding to a virtual environment, to a user; delivering an audio signal, corresponding to the virtual environment, to the user; receiving a plurality of inputs corresponding to a three-dimensional position for each of a plurality of user physiological features; providing a vibratory feedback, corresponding to the virtual environment, to the user; and directing air towards the user to create a motion sensation.

Other systems, methods, features, and advantages of the present disclosure will be or become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims.

BRIEF DESCRIPTION OF THE DRAWINGS

Many aspects of the disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.

FIG. 1 is a schematic diagram of an embodiment of a system for teleportation in a virtual environment.

FIG. 2 is a schematic diagram of an alternative embodiment of a system for teleportation in a virtual environment.

FIG. 3 is a schematic diagram illustrating a top view of an embodiment of a system for teleportation in a virtual environment.

FIG. 4 is a schematic diagram illustrating a top view of an embodiment of a teleportation device showing exemplary inputs to a directional input component.

FIG. 5 is a schematic diagram illustrating a partial front view of a system for teleportation in a virtual environment.

FIG. 6 is a schematic diagram illustrating a side view of an embodiment of a teleportation device showing exemplary inputs to a directional input component.

FIG. 7 is a schematic diagram illustrating a side view of an alternative embodiment of a teleportation device.

FIG. 8 is a functional block diagram illustrating an embodiment of a control arrangement for a teleportation system as disclosed herein.

FIG. 9 is a block diagram illustrating an embodiment of an architecture for controlling a teleportation system.

FIG. 10 is a block diagram illustrating an embodiment of a method for providing teleportation in a virtual environment.

DETAILED DESCRIPTION

Having summarized various aspects of the present disclosure, reference will now be made in detail to the description of the disclosure as illustrated in the drawings. While the disclosure will be described in connection with these drawings, there is no intent to limit it to the embodiment or embodiments disclosed herein. On the contrary, the intent is to cover all alternatives, modifications and equivalents included within the spirit and scope of the disclosure as defined by the appended claims.

Reference is first made to FIG. 1, which is a schematic diagram of an embodiment of a system 100 for teleportation in an immersive virtual environment. An immersive virtual environment includes multiple sources of feedback for a user to create the sensation that the user is fully immersed in the virtual environment. The system 100 includes a teleportation device 104 that provides for general purpose navigation in virtual environments. The navigation activities can include, for example, traveling from one place to another for exploring and searching within the virtual environment. A user 108 can rotate himself/herself and the teleportation device 104, physically move forward and backward (and up and down), and change the speed of travel.

The system 100 also includes a computing device 102, which can include a processor, memory, and one or more input/output devices, all communicatively coupled via one or more data buses. The computing device 102 is configured to provide data to a head mounted display 114. The head mounted display 114 is configured to communicate video and audio signals to a user 108 using one or more displays and audio output components. The computing device 102 is also configured to receive user position data from user position sensors 112 proximate to different user physiological features. Examples of user physiological features that might provide useful position data include, but are not limited to, the head, hands, arms, feet, and legs. The embodiment of FIG. 1 includes user position sensors 112 at the user's head and hands. In addition to providing three-dimensional position data, the user position sensors 112 can also be used to provide orientation data to the computing device 102.

The computing device 102 is also configured to receive position and orientation data from one or more teleportation device position sensors 116 that are mounted to the teleportation device 104. In this manner, the computing device can render the virtual environment based on the position and orientation of the teleportation device 104.

The teleportation device 104 includes a base 118 configured to optionally support all or a portion of the user 108. The base 118 is attached to a directional input component 110 through a moveable coupling 120. The moveable coupling 120 of this embodiment includes one or more springs configured in modes of compression, tension, or some combination thereof.

The teleportation device 104 also includes a vibratory feedback device 106 configured to be controlled by the computing device 102. The vibratory feedback device 106 is used to deliver sound and/or vibration to the user 108 to simulate varying rates of movement within the virtual environment. In this manner, the sound and/or vibration of the teleportation device 104 in motion is simulated. For example, the vibratory feedback device 106 may be configured to operate at a low frequency and output level when the teleportation device 104 is moving through the virtual environment at a slow speed. Accordingly, the output level and frequency might be increased as the speed of the teleportation device 104 is increased. In some embodiments, the vibratory feedback device 106 can be configured as a subwoofer speaker, for example. Alternatively, or in addition, the vibratory feedback device 106 can be implemented as vibrotactile devices mounted at a variety of points on the teleportation device 104.
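
By way of illustration only, the following Python sketch shows one way such a speed-to-vibration mapping might be computed. The function name, frequency range, and maximum speed are assumptions made for the example, not details of this disclosure.

    def vibration_parameters(speed, max_speed=30.0):
        """Map virtual travel speed to a (frequency_hz, level) pair for the
        vibratory feedback device 106: low frequency and output level at
        slow speeds, both rising as the teleportation device speeds up."""
        ratio = max(0.0, min(speed / max_speed, 1.0))  # clamp to [0, 1]
        frequency_hz = 20.0 + 60.0 * ratio  # assumed: 20 Hz at rest, 80 Hz at top speed
        level = ratio                       # normalized output level for a subwoofer
        return frequency_hz, level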

Reference is now made to FIG. 2, which is a schematic diagram of an alternative embodiment of a system 122 for teleportation in a virtual environment. In addition to the components of the system 100 described above in reference to FIG. 1, the system 122 also includes a position interface 124, configured to communicate with the position sensors 112, 116. Communication between the position interface 124 and the position sensors 112, 116 can be accomplished using any one of a variety of wired or wireless communication technologies. The position interface 124, also referred to as a 3-D tracker, reports the position and orientation of each of the position sensors 112, 116 to the computing device 102.
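
For concreteness, one hypothetical Python representation of a single report from the position interface 124 is sketched below; the type and field names are illustrative assumptions rather than a specification of the interface.

    from dataclasses import dataclass

    @dataclass
    class TrackerSample:
        sensor_id: str      # e.g., "head", "left_hand", or "teleporter" (assumed labels)
        position: tuple     # (x, y, z) in tracker coordinates
        orientation: tuple  # (yaw, pitch, roll) in degrees

    # The position interface would report one TrackerSample per sensor
    # 112, 116 to the computing device 102 on each update cycle.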

The system 122 also includes one or more fans 128 for generating a wind simulation. The fan or fans 128 can be controlled by the computing device 102 through an output device controller 130. The output device controller 130 can include, for example, relays and/or electronic speed controllers to vary the speed and direction of the simulated wind.
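
A minimal Python sketch of such an output device controller follows, assuming a relay board and electronic speed controllers reachable over a serial link (via the pyserial package). The port name and command strings are invented for illustration; an actual controller would use whatever protocol its hardware defines.

    import serial  # pyserial, assumed to drive the relay/speed-controller board

    class OutputDeviceController:
        def __init__(self, port="/dev/ttyUSB0", baud=9600):
            self.link = serial.Serial(port, baud)

        def set_fan(self, fan_id, speed):
            """speed in [0.0, 1.0]: zero opens the fan's relay; anything
            greater closes it and sets the electronic speed controller."""
            if speed <= 0.0:
                self.link.write(f"RELAY {fan_id} OFF\n".encode())
            else:
                self.link.write(f"RELAY {fan_id} ON\n".encode())
                self.link.write(f"SPEED {fan_id} {int(speed * 255)}\n".encode())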

The system 122 can also optionally include a status interface system 125 configured to maintain the status of one or more of the peripheral devices external to the computing device 102. The status interface system 125 can be implemented to replace or supplement either or both of the position interface 124 and the output device controller 130. Additionally, the status interface system 125 includes the functionality to detect the operation of user input devices such as buttons or switches. The status interface system 125 may be implemented in separate units, or as a single unit (e.g., with two cards in it, one corresponding to the switching action function of a relay controller and the other having functionality to detect button presses and releases). The status interface system 125, when implemented as a single unit, may have additional cards corresponding to analog-to-digital conversion (ADC) and digital-to-analog conversion (DAC) to control, for example, fan speed.

Reference is now made to FIG. 3, which is a schematic diagram illustrating a top view of an embodiment of a system for teleportation in a virtual environment. The system 138 includes a teleportation device 104 having a base 118 and a directional input component 110, also referred to as a steering wheel or handle bar. The teleportation device 104 includes a vibratory feedback device 106 and one or more user interface devices configured to allow the user to cause or trigger an operation within the virtual environment. The user interface devices can include switches and buttons, among others. Alternative embodiments may include user interface devices using one or more touch screens.

The user interface devices can include an UP button 140 and a DOWN button 142 for causing the teleportation device 104 to move up or down within the virtual environment. Alternatively, the UP and DOWN functions could be combined into one multiple position switch, for example a three position center return switch. The user interface devices can also include a STOP button 144 configured to cause the teleportation device 104 to stop within the virtual environment. A FLY/DRIVE switch 150 is also included. The FLY/DRIVE switch 150 can be toggled between a fly mode and a drive mode.

Also included are an INC button 154 and a DEC button 156 configured to cause the teleportation device to increase speed or decrease speed, respectively. Like the UP and DOWN functions, the INC and DEC functions can alternatively be combined into a multiple position switch such as a toggle switch. Other alternative embodiments can include throttle and/or handbrake structures that can generate, for example, analog signals to increase or decrease the speed, respectively. The analog signals from a throttle and/or a handbrake may be processed using, for example, analog-to-digital conversion hardware and/or software. A throttle and/or handbrake can also be configured to generate digital signals. For example, devices providing a quadrature pulse output in conjunction with a counter can be used for increasing and decreasing the speed.
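
The quadrature approach mentioned above can be illustrated with conventional two-channel decoding, sketched below in Python. The state table is standard quadrature logic rather than a detail of this disclosure; accumulating its output in a counter raises or lowers the speed setting.

    def quadrature_step(prev_a, prev_b, a, b):
        """Return +1, -1, or 0 for one sample of quadrature channels A and B.
        Forward motion follows the Gray-code sequence 00 -> 01 -> 11 -> 10."""
        transitions = {
            (0, 0, 0, 1): +1, (0, 1, 1, 1): +1, (1, 1, 1, 0): +1, (1, 0, 0, 0): +1,
            (0, 0, 1, 0): -1, (1, 0, 1, 1): -1, (1, 1, 0, 1): -1, (0, 1, 0, 0): -1,
        }
        return transitions.get((prev_a, prev_b, a, b), 0)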

Other user interface devices can be included such as a LIGHTS button 148 for adjusting the lighting levels in the virtual environment. Some embodiments may feature a simple on and off control for the lighting. Other embodiments may include incremental changes in the lighting levels through actuation of the LIGHTS button 148. The teleportation device 104 can also include a DEBUG button 146 configured to allow the user to debug one or more applications running on the computing device 102. For example, a user may experience a situation where he or she cannot move in the virtual environment due to a collision with multiple objects, such as might occur during a glitch in an application's implementation. A user can activate the DEBUG button 146 and disable collision detection temporarily to enable testing of other parts of the application. The teleportation device 104 can also include a JUMP button 152 to permit the vehicle to jump over obstacles in the virtual environment when in drive mode.

The system 138 also includes an example arrangement of fans 128. The fans 128, used independently or in selective combination, can simulate wind that corresponds to motion within the virtual environment. For example, when the teleportation device 104 is traveling to one side or another, the corresponding fan 128 would be activated to simulate wind commensurate with that motion. Also, when the teleportation device 104 is turned or rotated, a three-dimensional position sensor 116 can detect which direction the teleportation device 104 is facing and operate one or more fans 128 corresponding to movement in the new direction.

Brief reference is now made to FIG. 4, which is a schematic diagram illustrating a top view of an embodiment of a teleportation device showing exemplary inputs to a directional input component. The teleportation device 104 includes a base 118 moveably coupled to a directional input component 110. By way of example, when a user rotates the directional input component 110 clockwise, the teleportation device 104 will turn to the right in the virtual environment. Similarly, when a user rotates the directional input component 110 counter-clockwise, the teleportation device 104 will turn to the left in the virtual environment. To cause an upward movement of the teleportation device 104 in the virtual environment, the directional input component 110 is pulled or tilted towards the user. Similarly, to cause a downward movement of the teleportation device 104 in the virtual environment, the directional input component 110 is pushed or tilted away from the user. Alternative embodiments may use a directional input component 110 mounted to a telescopic shaft where the up and down motions are accomplished by manipulating the directional input component in a substantially vertical up and down motion.
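
A Python sketch of this steering mapping follows, assuming the teleportation device position sensor 116 reports a yaw angle (rotation about the vertical axis, clockwise positive) and a pitch angle (positive when tilted toward the user). The dead zone and unit gains are illustrative assumptions.

    def steering_command(yaw_deg, pitch_deg, dead_zone=2.0):
        """Translate the directional input component's orientation into
        turn and climb rates: clockwise rotation turns right; tilting
        toward the user climbs; tilting away descends (FIGS. 4 and 6)."""
        turn_rate = yaw_deg if abs(yaw_deg) > dead_zone else 0.0
        climb_rate = pitch_deg if abs(pitch_deg) > dead_zone else 0.0
        return turn_rate, climb_rate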

Brief reference is now made to FIG. 5, which is a schematic diagram illustrating a partial front view of a system for teleportation in a virtual environment. An arrangement of multiple fans of an embodiment includes an over-the-head fan 210 for simulating, for example, upward movement in the virtual environment. Similarly, the arrangement includes a right side fan 212 and a left side fan 214 for simulating right and left motion, respectively. A left ground fan 218 and a right ground fan 216 can be used to simulate left and right downward movement, respectively. Similarly, a front-of-face fan 220 can be used to simulate forward motion. Each of the fans can be driven at varying speeds to create the sensation of changing speeds within the virtual environment. Additionally, the fans can be used alone or in combination to create varying degrees of speed and directional simulation.
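
One plausible blending rule, sketched below in Python, tags each fan of FIG. 5 with the motion direction it simulates and drives it in proportion to the component of the user's velocity along that direction. The coordinate convention (y up, forward along -z), the fan axes, and the blending rule are assumptions for the example.

    import numpy as np

    FAN_MOTION_AXES = {
        "over_head":    np.array([0.0, 1.0, 0.0]),    # fan 210: upward movement
        "right_side":   np.array([1.0, 0.0, 0.0]),    # fan 212: rightward movement
        "left_side":    np.array([-1.0, 0.0, 0.0]),   # fan 214: leftward movement
        "right_ground": np.array([0.7, -0.7, 0.0]),   # fan 216: right-downward movement
        "left_ground":  np.array([-0.7, -0.7, 0.0]),  # fan 218: left-downward movement
        "front_face":   np.array([0.0, 0.0, -1.0]),   # fan 220: forward movement
    }

    def fan_speeds(velocity, max_speed=30.0):
        """Map the user's velocity in the virtual environment to a
        normalized speed in [0, 1] for each fan."""
        speed = float(np.linalg.norm(velocity))
        if speed == 0.0:
            return {name: 0.0 for name in FAN_MOTION_AXES}
        direction = velocity / speed
        scale = min(speed / max_speed, 1.0)
        # A fan contributes in proportion to how well its simulated motion
        # direction matches the actual direction of travel.
        return {name: scale * max(0.0, float(direction @ axis))
                for name, axis in FAN_MOTION_AXES.items()}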

Brief reference is made to FIG. 6, which is a schematic diagram illustrating a side view of an embodiment of a teleportation device showing exemplary inputs to a directional input component. The teleportation device 104 includes a base 118 coupled to a directional input component 110 via a moveable coupling 120. To direct the teleportation device 104 to move down, the user 108 pushes or tilts the directional input component 110 away from himself/herself. Similarly, to direct the teleportation device 104 to move up, the user 108 pulls or tilts the directional input device 110 towards himself/herself. Alternative embodiments can feature a telescopic arrangement such that the directional input device is moved substantially vertically up and down to direct the upward or downward movement of the teleportation device 104 within the virtual environment.

Brief reference is made to FIG. 7, which is a schematic diagram illustrating a side view of an alternative embodiment of a teleportation device. The teleportation device 104 includes a base 118 attached to a directional input component 110 through a moveable coupling 120. In this embodiment, the moveable coupling 120 is a spring. Additional springs 230 are included to provide force feedback through additional resistance. Multiple springs or other biasing elements can be used independently or in combination to achieve a desired level of force feedback in all or selected axes.

Reference is now made to FIG. 8, which is a functional block diagram illustrating an embodiment of a control arrangement for a teleportation system as disclosed herein. The computer 160 (herein, computer or host computer) communicates with a 3-D tracker 162, a fan/relay controller 176, an eye tracking controller 182, and the status interface 166. Note that in some embodiments, fewer or more components and/or functionality can be implemented. The 3-D tracker 162 provides the position and orientation of the user's head, hands, the teleportation device, etc. to perform the following functionality:

    1. Enable a glove interface 186 and a gesture recognizer 188 to recognize gestures and manipulate 3-D objects in the virtual environment.
    2. Enable a collision detector 194 to detect collisions between the user, the teleportation device, and the 3-D virtual environment.

As shown in FIG. 8, the teleportation system comprises a physics component 196 to simulate gravity, so that the user stays on the ground and not in the middle of the air when operating in the “drive” (as opposed to “fly”) mode. For example, when the user jumps over an obstacle in a virtual environment, he/she lands on the ground in the virtual environment.
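
A minimal Python sketch of this drive-mode gravity behavior follows; the gravitational constant, ground height, and function name are illustrative assumptions rather than details of the physics component 196.

    GRAVITY = -9.8  # m/s^2 in virtual-world units (assumed)

    def step_drive_mode(y, vy, dt, ground_y=0.0):
        """Advance one physics step in drive mode; returns updated (y, vy).
        The vertical velocity is integrated under gravity until the user
        lands, e.g., after jumping over an obstacle."""
        vy += GRAVITY * dt
        y += vy * dt
        if y <= ground_y:          # landed back on the ground
            y, vy = ground_y, 0.0
        return y, vy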

The output of the physics component 196 is fed to a vibrator controller 198 that simulates vibrations, and it also provides input to a graphics generator 190 that drives the 3-D output graphics on a fully immersive head mounted display 202. The graphics generator 190 may retrieve environment data from an environment storage 192. In one embodiment, the head mounted display 202 comprises a display and headphones or speakers. The headphones can be used to hear things or events in the virtual environment, such as a bouncing ball. For example, the teleportation system can simulate circumstances such as a user colliding with another object by activating one or more vibration units 200 to provide tactile simulation. That is, if the user takes a turn at 100 miles/hour or 5 miles/hour in the virtual environment, he/she feels the difference in the wind blowing at him/her, the vibration from the subwoofer, and perhaps the vibration from the vibrotactile devices.

In one embodiment, the host computer 160, the status interface 166, or both, control the wind generator units 180 (on/off) and their speed (how much air they blow). The wind generator units 180 can be driven through a fan speed controller 178 and a fan/relay controller 176. In one embodiment, the host computer 160 also drives a sound generator 172 that simulates the noise generated by the teleportation device and can also serve as a secondary vibration mechanism. The sound generator 172 can be used to drive sound output units 174 using data in a sound data storing unit 170. The status interface 166 can use a switch polling facility 168 to detect button presses (e.g., from user interface devices coupled to activation devices or switches) on user input devices that are attached to the teleportation device, and to send that information to the host computer 160. In some embodiments, the teleportation system may also comprise a speech recognizer 164 that recognizes commands that a user verbally issues.
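
The switch-polling idea behind facility 168 can be sketched in Python as follows. The read function is a stand-in for the actual interface hardware, and all names, the polling period, and the event callback are hypothetical.

    import time

    def poll_switches(read_switch, switch_ids, on_event, period=0.01):
        """Poll each switch every `period` seconds via `read_switch(id)` and
        call `on_event(id, pressed)` whenever a press or release is seen.
        Runs indefinitely, as a polling loop typically would."""
        prev = {sid: False for sid in switch_ids}
        while True:
            for sid in switch_ids:
                state = read_switch(sid)
                if state != prev[sid]:
                    on_event(sid, state)  # e.g., forward to the host computer 160
                    prev[sid] = state
            time.sleep(period)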

In one embodiment, the eye tracking controller 182 can communicate with a separate computer coupled to the host computer through, for example, an output interface 184. The head mounted display 202 comprises a camera that tracks the user's eye. The eye tracking controller 182 determines the coordinates of the eye and further determines what the user is observing in the virtual environment. Such a feature may be useful in games. For example, as an enemy missile approaches, the user can look at the missile, press a button located on the teleportation device, and activate a missile interceptor to destroy the incoming missile.
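
By way of illustration, the Python sketch below turns tracked eye coordinates and a head pose into a world-space gaze ray that could be hit-tested against objects such as the incoming missile. The pinhole-style model and all parameter names are assumptions, not details of this disclosure.

    import numpy as np

    def gaze_ray(eye_xy, head_pos, head_rotation, focal_length=1.0):
        """Build a world-space ray from normalized eye coordinates and the
        head pose reported by the 3-D tracker 162."""
        # Gaze direction in head-local coordinates: x/y from the eye
        # tracker, z along the assumed view axis.
        local = np.array([eye_xy[0], eye_xy[1], focal_length])
        direction = head_rotation @ local  # head_rotation: 3x3 rotation matrix
        return head_pos, direction / np.linalg.norm(direction)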

FIG. 9 is a block diagram illustrating an embodiment of an architecture for controlling a teleportation system. The control computer generally includes a processor 240, memory 242, and one or more input and/or output (I/O) devices 250 (or peripherals) that are communicatively coupled via a local interface 244. The local interface 244 may be, for example, one or more buses or other wired or wireless connections. The local interface 244 may have additional elements such as controllers, buffers (caches), drivers, repeaters, and receivers, to enable communication. Further, the local interface 244 may include address, control, and/or data connections that enable appropriate communication among the aforementioned components.

The processor 240 is a hardware device for executing software, particularly that which is stored in memory. The processor 240 may be any custom made or commercially available processor, a central processing unit (CPU), an auxiliary processor among several processors associated with the processing device, a semiconductor-based microprocessor (in the form of a microchip or chip set), a macroprocessor, or generally any device for executing software instructions.

The memory 242 may include any one or combination of volatile memory elements (e.g., random access memory (RAM)) and nonvolatile memory elements (e.g., ROM, hard drive, etc.). Moreover, the memory 242 may incorporate electronic, magnetic, optical, and/or other types of storage media. Note that the memory 242 may have a distributed architecture in which various components are situated remotely from one another but may be accessed by the processor 240.

The software in memory 242 may include one or more separate programs, each of which comprises an ordered listing of executable instructions for implementing logical functions, such as the logical functions shown in FIG. 8. In the example of FIG. 9, the software in the memory 242 includes control software 246 for providing one or more of the functions shown in FIG. 8 according to an embodiment. The memory 242 may also comprise a suitable operating system (O/S) 248. The operating system 248 essentially controls the execution of other computer programs, such as the control software 246, and provides scheduling, input-output control, file and data management, memory management, and communication control and related services.

The control software 246 is a source program, executable program (object code), script, or any other entity comprising a set of instructions to be performed. The control software 246 can be implemented, in one embodiment, as a distributed network of modules, where one or more of the modules can be accessed by one or more applications or programs or components thereof. In some embodiments, the control software 246 can be implemented as a single module with all of the functionality of the aforementioned modules. When the control software 246 is a source program, the program is translated via a compiler, assembler, interpreter, or the like, which may or may not be included within the memory, so as to operate properly in connection with the operating system 248. Furthermore, the control software 246 can be written in (a) an object oriented programming language, which has classes of data and methods, or (b) a procedural programming language, which has routines, subroutines, and/or functions, for example but not limited to, C, C++, Pascal, Basic, Fortran, Cobol, Perl, Java, and Ada.

The I/O devices 250 may include input devices such as, for example, a keyboard, mouse, scanner, microphone, sensor(s), etc. Furthermore, the I/O devices 250 may also include output devices such as, for example, a printer, display, audio devices, vibration devices, etc. Finally, the I/O devices 250 may further include devices that communicate both inputs and outputs such as, for instance, a modulator/demodulator (modem for accessing another device, system, or network), a radio frequency (RF) or other transceiver, a telephonic interface, a bridge, a router, etc.

When the control computer is in operation, the processor 240 is configured to execute software stored within the memory 242, to communicate data to and from the memory 242, and to generally control operations of the control computer pursuant to the software. The control software 246 and the operating system 248, in whole or in part, but typically the latter, are read by the processor 240, perhaps buffered within the processor 240, and then executed.

It should be noted that the control software 246 can be stored on any computer-readable medium for use by or in connection with any computer-related system or method. In the context of this document, a computer-readable medium is an electronic, magnetic, optical, or other physical device or means that can contain or store a computer program for use by or in connection with a computer related system or method. The control software 246 can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.

In an alternative embodiment, where the functionality of the control software 246 is implemented in hardware, or as a combination of software and hardware, the functionality of the control software 246 can be implemented with any or a combination of the following technologies, which are each well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.; or can be implemented with other technologies now known or later developed.

Reference is now made to FIG. 10, which is a block diagram illustrating an embodiment of a method 300 for providing teleportation in a virtual environment. The method 300 includes the step of delivering a video signal to the user in block 310. The video signal may be delivered using, for example, one or more displays configured in a head mounted device. The video signal provides the user with the visual information corresponding to the virtual environment. The method 300 also includes the step of delivering an audio signal to a user in block 320. The audio signal can be delivered through, for example, headphones or speakers. The audio signal can be used to communicate sounds within the virtual environment that correspond to objects or events.

The method 300 also includes the step of receiving position inputs relating to user physiological features and the teleportation device in block 330. For example, the three-dimensional position and orientation of the hands and head of the user can serve to ensure that the user's position and video signal correspond to the virtual environment. Similarly, by receiving the three-dimensional position and orientation data for the teleportation device, the computer controlling the virtual environment can correctly render the teleportation device in the virtual environment.

A user is provided vibratory feedback in block 340. By providing the vibratory feedback, a user can experience the sounds and vibrations corresponding to different rates of speed and events such as collisions in the virtual environment. Additionally, to further enhance the sensation of motion, air is directed towards the user in block 350. The air is directed at varying rates and from different directions to create the sensation of moving at different speeds and in different directions. Air can be directed using multiple wind generation devices including, for example, fans or blowers. Each wind generation device can be driven independently or in combination at one or more preset speeds or at any speed over a range of speeds. Controlling the wind generation units can be accomplished using relays, electronic speed controllers, electronic motor drives, or any combination thereof.

Any process descriptions or blocks in flow charts should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of an embodiment of the present disclosure in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present disclosure.

It should be emphasized that the above-described embodiments of the present disclosure, particularly any illustrated embodiments, are merely possible examples of implementations, set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) of the disclosure without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of the present disclosure and protected by the following claims.

Claims

1. A system for teleportation in a virtual environment, comprising:

a head mounted display configured to provide an immersive virtual environment;
a teleportation device configured to provide navigation in the virtual environment;
at least one feedback device configured to provide a user with information corresponding to movement of the teleportation device within the virtual environment;
a plurality of input devices configured to generate a plurality of input signals in response to inputs from the user; and
a computing device configured to receive the plurality of input signals and control the at least one feedback device.

2. The system of claim 1, wherein the head mounted display comprises:

a video display configured to provide a video signal corresponding to the virtual environment; and
an audio device configured to provide an audio signal corresponding to the virtual environment.

3. The system of claim 1, wherein the at least one feedback device comprises a fan directed to the user and configured to create a motion sensation.

4. The system of claim 3, further comprising a plurality of fans configured to create the motion sensation in a plurality of directions.

5. The system of claim 1, wherein the at least one feedback device comprises a speaker configured to generate information in the form of an audio signal and a vibratory signal to the user, the audio and vibratory signals configured to create a motion sensation corresponding to changes in the virtual environment.

6. The system of claim 1, wherein the plurality of input devices comprise a plurality of user position sensors configured to provide three-dimensional location data corresponding to a plurality of user physiological features.

7. The system of claim 6, wherein the plurality of user physiological features are selected from the group consisting of: hands, arms, head, and torso.

8. The system of claim 6, wherein one of the plurality of sensors comprises a teleportation device position sensor configured to provide three dimensional location data corresponding to the teleportation device.

9. The system of claim 1, wherein one of the plurality of input devices comprises a user interface device configured to trigger an operation within the virtual environment.

10. The system of claim 9, wherein the user interface device is an electrical switch.

11. The system of claim 9, wherein the operation is selected from the group consisting of: flight mode, lights, stop, move up, move down, and jump.

12. The system of claim 1, further comprising a position interface configured to receive a position sensor input and transmit position and orientation data to the computing device.

13. The system of claim 1, wherein the teleportation device comprises:

a base configured to support at least a portion of the user; and
a directional input portion coupled to the base using a moveable coupling and configured to simulate a directional input member of a personal vehicle.

14. The system of claim 13, wherein the moveable coupling comprises a biasing element.

15. The system of claim 13, wherein the directional input portion is configured to tilt away from the user to cause upward movement in the virtual environment and wherein the directional input portion is configured to tilt toward the user to cause a downward movement in the virtual environment.

16. The system of claim 1, further comprising a means for controlling a plurality of fans with the computing device.

17. A method for providing teleportation in a virtual environment, comprising:

delivering a video signal, corresponding to a virtual environment, to a user;
delivering an audio signal, corresponding to the virtual environment, to the user;
receiving a plurality of inputs corresponding to a three-dimensional position for each of a plurality of user physiological features;
providing a vibratory feedback, corresponding to the virtual environment, to the user; and
directing air towards the user to create a motion sensation.

18. The method of claim 17, wherein the directing comprises varying a fan output to create the motion sensation corresponding to a plurality of velocities.

19. The method of claim 17, further comprising receiving user interface inputs configured to trigger an operation within the virtual environment.

20. The method of claim 19, wherein the operation is selected from the group consisting of: flight mode, lights, stop, move up, move down, and jump.

21. The method of claim 17, further comprising supporting a portion of the user in a configuration consistent with a personal vehicle.

22. The method of claim 21, wherein the personal vehicle comprises a scooter.

23. A system for teleportation in a virtual environment, comprising:

a head mounted display configured to provide a video signal and an audio signal to a user;
a teleportation device configured to support a portion of the user, the teleportation device comprising a base moveably coupled to a directional input component;
a plurality of user position sensors configured to transmit three-dimensional position and orientation data corresponding to a plurality of user physiological features;
a directional input component sensor configured to transmit three-dimensional position and orientation data corresponding to the directional input component of the teleportation device;
a low frequency driver attached to the teleportation device and configured to provide vibratory feedback to the user corresponding to motion in the virtual environment;
a plurality of fans directed at the user and configured to create a motion sensation in a plurality of directions by controlling the output of each of the plurality of fans independently;
a computing device configured to receive a plurality of input signals and generate a plurality of output commands to control a plurality of output devices;
an output device controller, configured to receive a portion of the plurality of output commands and control a portion of the plurality of output devices; and
a position interface device configured to receive signals from the plurality of user position sensors and transmit three-dimensional position and orientation data to the computing device.
Patent History
Publication number: 20080153591
Type: Application
Filed: Mar 7, 2006
Publication Date: Jun 26, 2008
Inventor: Leonidas Deligiannidis (Lexington, MA)
Application Number: 11/816,968
Classifications
Current U.S. Class: Visual (e.g., Enhanced Graphics, Etc.) (463/31)
International Classification: A63F 9/24 (20060101);