SYSTEM AND METHOD FOR A 3D COMPUTER GAME WITH TRUE VECTOR OF GRAVITY
A computer interaction system includes an augmented interaction device and a computer. The augmented interaction device includes a display device that displays augmented reality or virtual reality images, a first tracking mechanism that tracks a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism, and a second tracking mechanism that tracks a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference. The computer includes a processor and a memory, and processes position and orientation information received from the first tracking mechanism, and position and orientation information received from the second tracking mechanism. The computer can be configured to compensate for movement relative to the physical object of the first and second tracking mechanisms, and to output augmented reality or virtual reality information to the display.
This application claims the benefit of U.S. Provisional Application No. 61/322,521, filed Apr. 9, 2010, which is hereby incorporated by reference in its entirety.
FIELD

The present application relates to a computer interaction system and a method of facilitating interaction between an interaction delivery device and a physical object in an environment.
BACKGROUND

Computer games have been developed that account for gravity. An example of a game accounting for gravity is a virtual marble labyrinth game. In a virtual marble labyrinth game, a user tilts a gameboard having one or more virtual balls such that the virtual balls roll about the gameboard as if the virtual balls were real balls on the surface of the gameboard being influenced by a gravitational force and the user's actions. Additionally, the virtual balls can encounter a number of virtual objects that can affect the virtual balls' motion in the same manner as a physical object on the gameboard would affect the physical balls' motion. The objective of the virtual marble labyrinth game is to move the virtual balls to a desired position on the gameboard.
SUMMARY

One aspect of the presently disclosed subject matter provides an interaction delivery device that can include a display device, a first tracking mechanism, and a second tracking mechanism. The display device can be configured to display virtual objects. The first tracking mechanism can be configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism. The second tracking mechanism can be configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference. In certain embodiments, one or more of position information received from the first tracking mechanism and orientation information received from the first tracking mechanism can be used to generate motion data for the physical object. In certain embodiments, one or more of position information received from the second tracking mechanism and orientation information received from the second tracking mechanism can be used to generate adjusted motion data for the physical object. In certain embodiments, the adjusted motion data for the physical object is used to generate virtual object information. In certain embodiments, the virtual object information is received by the display.
Another aspect of the presently disclosed subject matter provides an interaction processing device that can include a processor, a memory, an input unit configured to receive information, and an output unit configured to output information. In certain embodiments, the input unit can be configured to receive at least one of physical object position information (e.g., data, video, images, etc.), physical object orientation information (e.g., data, video, images, etc.), tracking mechanism position information (e.g., data, video, images, etc.), and tracking mechanism orientation information (e.g., data, video, images, etc.). In certain embodiments, the processor can be configured to generate motion data for a physical object using at least one of the physical object position information and the physical object orientation information. In certain embodiments, the processor can be configured to generate adjusted motion data for the physical object using at least one of tracking mechanism position information and tracking mechanism orientation information. In certain embodiments, the adjusted motion data for the physical object can compensate for movement of a tracking mechanism relative to the physical object. In certain embodiments, the processor can be configured to generate virtual object information using the adjusted motion data for the physical object. In certain embodiments, the output unit can be configured to output the virtual object information.
Another aspect of the presently disclosed subject matter provides a computer interaction system that can include an interaction delivery device and a computer. The interaction delivery device can include a display device configured to display one or more of virtual objects, images, videos, and other information or media, a first tracking mechanism configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism, and a second tracking mechanism configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference. The computer can include a processor and a memory, and can be configured to process one or more of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism. Further, in certain embodiments, the computer can be configured to compensate for movement relative to the physical object of at least one of the first tracking mechanism, the second tracking mechanism, and the display device. In some embodiments, the computer can be further configured to output virtual object information to the display device. The computer can be any processing device.
In one embodiment, the first tracking mechanism can be an optical tracking mechanism which can include one or more cameras. Further, the interaction delivery device can be configured to be head-worn and to display one or more of video see-through augmented reality (e.g., video depicting a physical environment viewed through one or more attached cameras that is augmented with additional virtual graphics), virtual reality video (e.g., video depicting an entirely virtual environment), and optical see-through augmented reality (e.g., viewing the physical environment directly, rather than through video, with virtual graphics overlaid on the field of vision by using optical elements such as mirrors, lenses, etc.). The second tracking mechanism can include one or more of a three-axis accelerometer and a magnetometer. Moreover, the second tracking mechanism can be configured to determine a true direction of a natural force, such as gravity or magnetism, acting on the second tracking mechanism. The second tracking mechanism can have a position that is rigidly fixed relative to the first tracking mechanism, or a position that is otherwise known relative to the first tracking mechanism.
In a further embodiment, the second tracking mechanism can comprise a six-degree-of-freedom tracker that can be configured to determine a three-dimensional position of the second tracking mechanism relative to the reference and a three-dimensional orientation of the second tracking mechanism relative to the reference. The reference can be the earth or fixed to the earth. Alternatively, the reference can be fixed to one or more additional physical objects moving relative to the earth.
In another embodiment, the second tracking mechanism can be configured to determine a true direction of a natural force vector. Further, the computer can be configured to simulate motion of virtual objects based on a true direction of a natural force and based on at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism. Alternatively or additionally, the computer can be configured to simulate motion of virtual objects based on a direction of force that is different from the true direction of a natural force and based on at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism.
In a further embodiment, the computer can further include one or more of a game engine and a physics engine, wherein the computer can be configured to simulate virtual objects. For example, the computer can be configured to simulate virtual forces acting, in a true physical direction, on the virtual objects based on one or more of the position information received from the first tracking mechanism, the orientation information received from the first tracking mechanism, the position information received from the second tracking mechanism, the orientation information received from the second tracking mechanism, acceleration information about the second tracking mechanism received from the second tracking mechanism, velocity information about the second tracking mechanism received from the second tracking mechanism, and true physical gravity vector information. The position information received from the first tracking mechanism can include a first position of the physical object relative to the first tracking mechanism at a first time and a second position of the physical object relative to the first tracking mechanism at a second time. The orientation information received from the first tracking mechanism can include a first orientation of the physical object relative to the first tracking mechanism at the first time and a second orientation of the physical object relative to the first tracking mechanism at the second time. The position and orientation information received from the first tracking mechanism can be data, video streams, images, or other forms of information. The position information received from the second tracking mechanism can include a first position of the second tracking mechanism relative to the reference at the first time and a second position of the second tracking mechanism relative to the reference at the second time. The orientation information received from the second tracking mechanism can include a first orientation of the second tracking mechanism relative to the reference at the first time and a second orientation of the second tracking mechanism relative to the reference at the second time.
Additionally, the second tracking mechanism can have a position rigidly fixed, or otherwise known, relative to the first tracking mechanism. A predetermined time can exist between the first time and the second time. Moreover, the computer can be configured to determine a physical object movement vector between the first position of the physical object and the second position of the physical object. The computer can be configured to determine a second tracking mechanism movement vector between the first position of the second tracking mechanism and the second position of the second tracking mechanism. Further, the computer can be configured to calculate an adjusted physical object movement vector by subtracting the second tracking mechanism movement vector from the physical object movement vector if a magnitude of the second tracking mechanism movement vector exceeds a predetermined threshold distance (e.g., the predetermined threshold distance can be set as movement of about 0.02 meters in about 1 second). The computer can be configured to simulate virtual forces acting on the virtual objects based on at least the adjusted physical object movement vector if the magnitude of the second tracking mechanism movement vector exceeds, or is equal to, the predetermined threshold distance. Additionally, the computer can be configured to simulate virtual forces acting on the virtual objects based on at least the physical object movement vector if the magnitude of the second tracking mechanism movement vector is less than the predetermined threshold distance.
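For purpose of explanation and illustration, and not limitation, the movement-vector compensation described above can be sketched in a few lines of Python. The language choice, the numpy dependency, and all function and variable names are assumptions of the sketch, not part of the disclosed subject matter; the threshold mirrors the example value given above.

```python
import numpy as np

THRESHOLD_M = 0.02  # example: about 0.02 meters of tracker motion in about 1 second

def movement_vector_for_simulation(obj_p1, obj_p2, trk_p1, trk_p2,
                                   threshold=THRESHOLD_M):
    """Return the movement vector the simulation should use.

    obj_p1, obj_p2: physical object positions at the first and second times.
    trk_p1, trk_p2: second tracking mechanism positions at the same times.
    """
    obj_move = np.asarray(obj_p2, float) - np.asarray(obj_p1, float)
    trk_move = np.asarray(trk_p2, float) - np.asarray(trk_p1, float)
    if np.linalg.norm(trk_move) >= threshold:
        # The tracking mechanism itself moved: subtract its movement vector
        # to obtain the adjusted physical object movement vector.
        return obj_move - trk_move
    # Tracking mechanism effectively stationary: use the raw movement vector.
    return obj_move
```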
In yet another embodiment, the computer can be configured to determine a physical object rotation tensor between the first orientation of the physical object and the second orientation of the physical object. The computer can be configured to determine a second tracking mechanism rotation tensor between the first orientation of the second tracking mechanism and the second orientation of the second tracking mechanism. Additionally, the computer can be configured to calculate an adjusted physical object rotation tensor by applying a transformation based on a resultant rotation of the second tracking mechanism, if the resultant rotation of the second tracking mechanism exceeds, or is equal to, a predetermined threshold rotation value (e.g., a predetermined threshold rotation value of, for example, 3 degrees in a predetermined time of 1 second). The computer can be configured to simulate natural forces acting on the virtual objects based on at least the adjusted physical object rotation tensor if the resultant rotation of the second tracking mechanism exceeds, or is equal to, the predetermined threshold rotation value. Additionally, the computer can be configured to simulate natural forces acting on the virtual objects based on at least the physical object rotation tensor if the resultant rotation of the second tracking mechanism is less than the predetermined threshold rotation value.
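Similarly, for purpose of illustration only, the rotation compensation can be sketched assuming rotations are represented as 3x3 matrices; the resultant-rotation formula and the pre-multiplication convention used to remove the tracker's rotation are assumptions of the sketch.

```python
import numpy as np

THRESHOLD_DEG = 3.0  # example: 3 degrees in a predetermined time of 1 second

def resultant_rotation_deg(R):
    """Resultant rotation angle, in degrees, of a 3x3 rotation matrix R."""
    cos_theta = (np.trace(R) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def rotation_tensor_for_simulation(R_obj, R_trk, threshold_deg=THRESHOLD_DEG):
    """R_obj: physical object rotation over the interval (tracker coordinates).
    R_trk: second tracking mechanism rotation over the same interval."""
    if resultant_rotation_deg(R_trk) >= threshold_deg:
        # Remove the tracker's own rotation from the object's apparent rotation.
        return R_trk.T @ R_obj
    return R_obj
```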
Another aspect of the presently disclosed subject matter provides a non-transitory computer readable medium having computer readable instructions stored thereon, which, when executed by a computer having a processor to execute a plurality of processes, are configured to cause the processor to perform several functions. In certain embodiments, the computer readable instructions can cause the processor to obtain one or more of first tracking mechanism information and second tracking mechanism information. In certain embodiments, the computer readable instructions can cause the processor to determine one or more of a physical object movement vector using the first tracking mechanism information, a second tracking mechanism movement vector using the second tracking mechanism information, and a direction of a true physical gravity vector relative to the second tracking mechanism. In certain embodiments, the computer readable instructions can cause the processor to simulate motion data of a virtual object. The processor can simulate motion data of the virtual object, based on a true or not true virtual gravity vector, using: (i) an adjusted physical object movement vector if a magnitude of the second tracking mechanism movement vector is greater than or equal to a predetermined threshold distance; (ii) the physical object movement vector if the magnitude of the second tracking mechanism movement vector is less than the predetermined threshold distance; and (iii) the direction of the true physical gravity vector relative to the second tracking mechanism. In certain embodiments, the computer readable instructions can cause the processor to output the motion data of the virtual object to a display.
In one embodiment, the computer readable instructions stored on the non-transitory computer readable medium, when executed by a computer having a processor to execute a plurality of processes, are further configured to cause the processor to determine one or more of a physical object rotation tensor using the first tracking mechanism information, and a second tracking mechanism rotation tensor using the second tracking mechanism information. The computer readable instructions can also cause the processor to simulate motion data of a virtual object using additionally: an adjusted physical object rotation tensor if a resultant rotation of the second tracking mechanism is greater than or equal to a predetermined threshold rotation value, and the physical object rotation tensor if the resultant rotation of the second tracking mechanism is less than the predetermined threshold rotation value.
The first tracking mechanism information can include a first position of a physical object relative to a first tracking mechanism and a first orientation of the physical object at a first time, and a second position of the physical object relative to the first tracking mechanism and a second orientation of the physical object at a second time. The second tracking mechanism information can include a first position of a second tracking mechanism relative to a reference and a first orientation of the second tracking mechanism at the first time, and a second position of the second tracking mechanism relative to the reference and a second orientation of the second tracking mechanism at the second time.
Another aspect of the presently disclosed subject matter provides a method of facilitating interaction between an interaction delivery device and a physical object in an environment, the method including generating one or more virtual objects in the environment; detecting a change in the physical object; determining whether the change in the physical object is based on a change in the state of the virtual objects and the physical object, or both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object; measuring a direction and effect (e.g., magnitude, etc.) of a natural force interacting with the environment; and updating the virtual objects based on a result of the determining and the measuring. In certain embodiments, the detecting can further include: detecting a change in position of the physical object over a given time, and detecting a change in position of the interaction delivery device over the given time. In certain embodiments, the determining can further include determining whether a magnitude of the change in position of the interaction delivery device over the given time exceeds, or is equal to, a predetermined threshold value, determining that the detected change in the physical object is based on a change in the state of the virtual objects and physical object if the change in position of the interaction delivery device over the given time is less than the predetermined threshold value, and determining that the detected change in the physical object is based on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object if the change in position of the interaction delivery device over the given time exceeds or is equal to the predetermined threshold value. The updating can further include updating positions of the virtual objects to simulate motion consistent with the natural force and the detected change in position of the physical object over the given time if the detected change in the physical object is based on a change in the state of the virtual objects and the physical object, and updating positions of the virtual objects to simulate motion consistent with the natural force and the detected change in position of the physical object and adjusted to remove effects caused by the force applied to the interaction delivery device over the given time if the detected change in the physical object is based on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object.
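For purpose of explanation and illustration, and not limitation, the detecting, determining, measuring, and updating steps of this method can be condensed into one sketch. The simplified kinematics (positions only, no stored velocities) and every name below are assumptions of the sketch.

```python
import numpy as np

def update_virtual_objects(positions, dp_object, dp_device, gravity,
                           threshold, dt=0.03):
    """One update cycle: decide whether the interaction delivery device moved,
    then update virtual object positions consistently with the measured
    natural force (here gravity, e.g. np.array([0, 0, -9.81])).
    dp_object / dp_device: changes in position over the given time."""
    dp_object = np.asarray(dp_object, float)
    dp_device = np.asarray(dp_device, float)
    g = np.asarray(gravity, float)
    if np.linalg.norm(dp_device) >= threshold:
        # A force was applied to the delivery device: remove its apparent
        # contribution to the detected change in the physical object.
        dp_object = dp_object - dp_device
    # Carry the (possibly adjusted) object motion through and let the
    # natural force act over the interval.
    return [np.asarray(p, float) + dp_object + 0.5 * g * dt * dt
            for p in positions]
```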
Another aspect of the presently disclosed subject matter provides an interaction system that can include a physical object, at least one virtual object, an interaction delivery device, and a processing device. The interaction delivery device can include: a tracking mechanism that can be configured to track at least one of a position of the physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism; a detecting mechanism configured to detect motion of the detecting mechanism, wherein a position of the detecting mechanism relative to a position of the tracking mechanism is predetermined; and a display configured to display the at least one virtual object. The processing device can be configured to receive physical object position information from the tracking mechanism. The processing device can be configured to receive detecting mechanism motion information from the detecting mechanism. The processing device can be configured to perform at least one of: determining if a magnitude of acceleration of the detecting mechanism is greater than a predetermined acceleration threshold value and generating adjusted physical object position information if the magnitude of acceleration of the detecting mechanism is greater than the predetermined acceleration threshold value; and determining if a magnitude of velocity of the detecting mechanism is greater than a predetermined velocity threshold value and generating adjusted physical object position information if the magnitude of velocity of the detecting mechanism is greater than the predetermined velocity threshold value. The processing device can be configured to generate motion information for the at least one virtual object based on the adjusted physical object position information if the processing device generates the adjusted physical object position information. The processing device can be configured to generate the motion information for the at least one virtual object based on the physical object position information if the processing device does not generate the adjusted physical object position information. The processing device can be configured to output the motion information for the at least one virtual object to the display.
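For purpose of illustration only, the acceleration and velocity threshold tests described above can be sketched as follows; the threshold values and the displacement-subtraction form of the adjustment are hypothetical choices, since the passage leaves the exact adjustment open.

```python
import numpy as np

ACCEL_THRESHOLD = 0.5  # m/s^2 -- hypothetical value
VEL_THRESHOLD = 0.05   # m/s   -- hypothetical value

def detecting_mechanism_moving(accel, vel):
    """True if either the magnitude of acceleration or the magnitude of
    velocity of the detecting mechanism exceeds its predetermined threshold."""
    return (np.linalg.norm(accel) > ACCEL_THRESHOLD or
            np.linalg.norm(vel) > VEL_THRESHOLD)

def object_position_info(raw_pos, detector_displacement, accel, vel):
    """Return adjusted physical object position information when the detecting
    mechanism is judged to be moving, and the raw information otherwise."""
    raw_pos = np.asarray(raw_pos, float)
    if detecting_mechanism_moving(accel, vel):
        return raw_pos - np.asarray(detector_displacement, float)
    return raw_pos
```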
The detecting mechanism can be configured to detect a correct direction and magnitude of a physical gravity vector. Additionally, the processing device can be further configured to generate the motion information for the at least one virtual object additionally based on the correct direction and magnitude of the physical gravity vector. Further, the tracking mechanism can be an optical tracking device. The physical object can be configured to not include attached or embedded electronic devices or components.
In one embodiment of the present aspect of the application, the at least one virtual object can be configured to move according to a virtual gravity vector, wherein the virtual gravity vector is substantially identical to the physical gravity vector. In another embodiment of the present aspect of the application, the at least one virtual object can be configured to move according to a virtual gravity vector, wherein the virtual gravity vector is different from the physical gravity vector.
Another aspect of the presently disclosed subject matter describes an interaction delivery system comprising: a first tracking mechanism rigidly attached to a display device configured to track the position and orientation of a physical object relative to the display device; a second tracking mechanism rigidly attached to the display device, configured to track the absolute orientation of the display device relative to the earth; and a processing device configured to process tracking information output from the first tracking mechanism and tracking information output from the second tracking mechanism, wherein the processing device is configured to simulate motion information of at least one virtual object from the tracking information output from the first tracking mechanism and tracking information output from the second tracking mechanism, wherein the processing device is configured to output the simulated motion information of the at least one virtual object to the display, wherein the display is configured to display the at least one virtual object disposed relative to the physical object, and wherein the at least one virtual object is configured to behave as if acted on by a virtual gravity vector in a same direction as a physical gravity vector acting on the physical object.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and are intended to provide further explanation of the application claimed.
The accompanying drawings, which are incorporated in and constitute part of this specification, are included to illustrate and provide a further understanding of the apparatus and method of the application. Together with the written description, the drawings serve to explain the principles of the application.
DETAILED DESCRIPTION

The disclosed subject matter can be exemplified with, but is not limited to, a virtual marble labyrinth game. The virtual marble labyrinth game has been implemented using a number of technologies. For example, the virtual marble labyrinth game can consist of a hand-held display for displaying the virtual balls and other virtual objects, such as obstacles, bumps, ramps, prizes, walls, pools, holes, channels, etc. The hand-held display is further integrated with a set of built-in accelerometers that can directly measure the direction of gravity relative to the display. An example of this technology in action is available at http://www.youtube.com/watch?v=lipHmNbi7ss. This implementation has several drawbacks: the game graphics are restricted to the hand-held display; the weight and fragility of the gameboard depend upon the technology it contains; and the position and orientation of the player's head relative to the gameboard are not known, so graphics must be generated for an assumed head position and orientation, and cannot respond to changes in actual head position and orientation during gameplay.
Another example of a virtual marble labyrinth game uses a static camera to optically track a passive physical gameboard. This makes it possible to determine a six-degree-of-freedom position and orientation of the gameboard relative to the camera. The virtual objects are displayed on an external monitor, with the virtual objects overlaid on the actual view from the camera. An example of this technology in action is available at http://www.youtube.com/watch?v=L7dC8HU2KJY. In this case, the camera is assumed to be in a specific fixed pose relative to the direction of gravity. This fixed pose can be either a hardwired pose or a pose that is calibrated prior to gameplay. Consequently, changing the position or orientation of the camera during gameplay will be indistinguishable from a complementary change in the position or orientation of the gameboard. This is typically undesirable in gameplay, as it rules out having the camera move, since that would cause an unwanted change in the effective direction of gravity, preventing mobile game play or the use of a head-worn camera. Further, if the camera is rigidly affixed to a moving platform, such as a car, ship, plane, or train, on which the game is being played, the motion of the platform relative to the earth will not be properly taken into account.
Additional examples of a virtual marble labyrinth game use a display with a known or assumed orientation. One implementation allows the user to manipulate a virtual gameboard using an input device such as a mouse. This approach does not provide the natural feel of controlling a virtual gameboard in the same manner as one would feel while controlling a physical gameboard. Another implementation allows the user to manipulate a game controller containing an orientation tracker, effectively changing the direction of gravity in the game, but without changing the visual orientation of the gameboard. An example of this technology is available at http://www.youtube.com/watch?v=BjEKoDW9S-4. In both of these approaches, the gameboard is not held by the player.
Yet another example of the virtual marble labyrinth game uses a gameboard and head-worn unit. The gameboard and the head-worn unit are each individually tracked in six-degrees of freedom relative to a real world reference. This implementation requires that additional infrastructure for tracking the gameboard is added to the environment or added directly to the gameboard. Adding such additional infrastructure to the environment limits where the game can be played. Adding such additional infrastructure to the gameboard makes it heavier or more fragile. In both cases, adding additional infrastructure can make the system more expensive, more complicated, and awkward for the user.
Reference will now be made in detail to embodiments of the disclosed subject matter, examples of which are illustrated in the accompanying drawings and on the internet at http://www.youtube.com/watch?v=6AKgH4On65A.
While, solely for purpose of convenience, the computer interaction system and the method of facilitating interaction between an interaction delivery device and physical objects in an environment presented herein are described in the context of developing and playing augmented reality computer games, use of the systems and methods of the presently disclosed subject matter is not limited thereto. For example, the presently disclosed systems and methods can be employed for other uses, such as, but not limited to, pedagogical computer environments for teaching physics, controls for user interfaces, pedagogical “virtual reality” computer environments for medical training, pilot training, etc. Additionally, the disclosed subject matter can be employed to develop and play entirely virtual games using a physical object as an interface to control virtual objects within an immersive virtual environment.
In accordance with the disclosed subject matter, a computer interaction system is provided. The computer interaction system includes an interaction delivery device and a computer. The interaction delivery device can include a display device configured to display virtual objects, a first tracking mechanism configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism, and a second tracking mechanism configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference. The computer can include a processor and a memory, and can be configured to process at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism. Further, the computer can be configured to compensate for movement relative to the physical object of at least one of the first tracking mechanism and the second tracking mechanism and to output virtual object information to the display.
For purpose of explanation and illustration, and not limitation, an exemplary embodiment of the computer interaction system in accordance with the application is shown in
A similar process occurs in S1-B and S1-D with respect to tracking data output from head tracker 12. Specifically, when head tracker 12 outputs head tracking data to computer 14, the tracking software can instruct processor 17 to process the location of head tracker 12, if necessary, and to determine one or more of the position and orientation of head tracker 12 relative to reference 24. The software can instruct processor 17 to store the position and orientation of head tracker 12 relative to reference 24 in memory 18 at discrete time intervals, each time interval being a predetermined length of time (e.g., 0.03 seconds). S1-B and S1-D can occur separately or simultaneously. If necessary, processor 17 can store information about the relative positions of head tracker 12 and optical trackers 11 in memory 18.
For each successive change in position of board 16 over a discrete time interval, the tracking software can instruct the processor to determine a board movement vector, as shown in S2-A. The processor can also determine a board rotation tensor for the successive change in orientation of board 16 over each discrete time interval as shown in S2-C.
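For purpose of illustration only, S2-A and S2-C can be sketched as follows, assuming each board pose is given as a position vector and a 3x3 orientation matrix; the representation is an assumption of the sketch.

```python
import numpy as np

def board_motion(p1, R1, p2, R2):
    """Board movement vector (S2-A) and board rotation tensor (S2-C) over one
    discrete time interval, from poses of board 16 relative to the trackers."""
    movement_vector = np.asarray(p2, float) - np.asarray(p1, float)
    # Relative rotation T such that T @ R1 == R2.
    rotation_tensor = np.asarray(R2, float) @ np.asarray(R1, float).T
    return movement_vector, rotation_tensor
```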
Further, the tracking software can instruct processor 17 to process one or more of the orientation and position information output by head tracker 12 to determine the position and orientation of head tracker 12 relative to reference 24. Processor 17 can also determine one or more of a head tracker movement vector and a head tracker rotation tensor for each successive change in position and orientation, respectively, of head tracker 12, as shown in S2-B and S2-D. If head tracker 12 is not configured to output information regarding the relative direction of the true physical gravity vector to computer 14, the tracking software can instruct processor 17 to determine the direction of the true physical gravity vector relative to head tracker 12 from the position and orientation of head tracker 12 relative to reference 24, as shown in S2-E.
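For purpose of illustration only, the determination of S2-E can be sketched as follows, assuming the orientation of head tracker 12 is a matrix that maps tracker coordinates to reference coordinates.

```python
import numpy as np

GRAVITY_REF = np.array([0.0, 0.0, -9.81])  # true gravity in reference 24 coordinates

def gravity_relative_to_head_tracker(R_head):
    """S2-E: direction of the true physical gravity vector relative to head
    tracker 12, given its orientation R_head relative to reference 24."""
    return R_head.T @ GRAVITY_REF
```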
The tracking software can also instruct processor 17 to perform a compensation procedure to account for motion of optical sensors 11, as shown in
If processor 17 determines that head tracker 12 has changed position, then processor 17 can determine a resultant rotation from the head tracker rotation tensor and compare the resultant rotation of head tracker 12 with a predetermined threshold rotation value stored in memory 18, as shown in S4-A. This predetermined threshold rotation value stored in memory 18 can be determined based on the particular implementation of the application (e.g., smaller thresholds for complex simulations, such as simulated medical training, and larger thresholds for simple simulations, such as basic physics simulations). If the resultant rotation of head tracker 12 exceeds or is equal to the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has changed both position and orientation. If the resultant rotation of head tracker 12 is less than the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has not changed orientation but has changed position.
If processor 17 determines that head tracker 12 has not changed position, then processor 17 can determine a resultant rotation from the head tracker rotation tensor and compare the resultant rotation of head tracker 12 with a predetermined threshold rotation value stored in memory 18, as shown in S3-C. If the resultant rotation of head tracker 12 exceeds or is equal to the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has changed orientation but not position. If the resultant rotation of head tracker 12 is less than the predetermined threshold rotation value, processor 17 can determine that head tracker 12 has not changed position or orientation.
Alternatively, one or more of the comparisons of S3-A, S3-C, and S4-A can occur simultaneously or as part of the same process. It is not necessary to separately compare relative rotation information and movement information of head tracker 12. Accordingly, relative rotation information and movement information of head tracker 12 can be compared with a combined threshold value.
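For purpose of illustration only, the comparisons of S3-A, S3-C, and S4-A can be collapsed into a single classification routine, as suggested above; the angle formula and parameter names are assumptions of the sketch.

```python
import numpy as np

def classify_head_tracker_motion(move_vec, R_rel, dist_threshold,
                                 rot_threshold_deg):
    """Decide whether head tracker 12 changed position, orientation, both,
    or neither over one discrete time interval.
    move_vec: head tracker movement vector; R_rel: head tracker rotation tensor."""
    position_changed = np.linalg.norm(move_vec) >= dist_threshold
    cos_theta = (np.trace(R_rel) - 1.0) / 2.0
    resultant_deg = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
    orientation_changed = resultant_deg >= rot_threshold_deg
    return position_changed, orientation_changed
```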
If processor 17 determines that both the position and orientation of head tracker 12 have changed, processor 17 can apply adjustments to the corresponding board movement vector and the corresponding board rotation tensor to account for the change in position and orientation of head tracker 12 and optical sensors 11, as described in S5-A. Specifically, because optical sensors 11 are rigidly fixed to head tracker 12, the orientation changes and position changes of optical sensors 11 can be determined by applying the proper adjustments. Here, the adjustments include subtracting the head tracker movement vector from the board movement vector for corresponding discrete time intervals and applying frame rotations to correct for changes in orientation in the proper sequence. This adjustment produces an adjusted board movement vector and an adjusted board rotation tensor which more accurately model motion of board 16.
If processor 17 determines that the position of head tracker 12 has changed but not the orientation, processor 17 can apply adjustments to the corresponding board movement vector to account for the change in position of head tracker 12 and optical sensors 11, as described in S5-B. The position changes of optical sensors 11 can be determined by applying the proper sequence of translations. Here, the adjustments include subtracting the head tracker movement vector from the board movement vector for corresponding discrete time intervals. This adjustment produces an adjusted board movement vector which more accurately models motion of board 16 and retains the previously determined board rotation tensor.
If processor 17 determines that the orientation of head tracker 12 has changed but not the position, processor 17 can apply adjustments to the corresponding board rotation tensor to account for the change in orientation of head tracker 12 and optical sensors 11, as described in S5-C. The orientation changes of optical sensors 11 can be determined by applying the proper sequence of rotations. Here, the adjustments include applying coordinate transformations to the board rotation tensor which remove effects of the resultant rotation of the second tracking mechanism in the proper sequence. This adjustment produces an adjusted board rotation tensor which more accurately models motion of board 16 and retains the previously determined board movement vector.
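For purpose of illustration only, the adjustments of S5-A, S5-B, and S5-C can be combined into one routine, with unchanged quantities passing through untouched; the matrix conventions match the earlier sketches and remain assumptions.

```python
import numpy as np

def adjust_board_motion(board_move, board_rot, head_move, head_rot,
                        position_changed, orientation_changed):
    """S5-A (both changed), S5-B (position only), S5-C (orientation only)."""
    board_move = np.asarray(board_move, float)
    board_rot = np.asarray(board_rot, float)
    if position_changed:
        # Subtract the head tracker movement vector for this interval.
        board_move = board_move - np.asarray(head_move, float)
    if orientation_changed:
        # Remove the head tracker's resultant rotation from the board rotation.
        board_rot = np.asarray(head_rot, float).T @ board_rot
    return board_move, board_rot
```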
If processor 17 determines that both the position and orientation of head tracker 12 have not changed, processor 17 is not instructed to perform an adjustment and proceeds directly to simulation.
If processor 17 has determined an adjusted board movement vector and an adjusted board rotation tensor, processor 17 can use the information regarding the relative direction of the true physical gravity vector determined in S2-E, the adjusted board movement vector, and the adjusted board rotation tensor to simulate the motion of a virtual object 15, such as ball 15a, under a true virtual gravity vector based on the actual motion of board 16, as described in S6-A.
If processor 17 has determined an adjusted board movement vector but has not adjusted the board rotation tensor, processor 17 can use the information about the relative direction of the true physical gravity vector determined in S2-E, the adjusted board movement vector, and the unadjusted board rotation tensor to simulate the motion of a virtual object 15, such as ball 15a, under a true virtual gravity vector based on the actual motion of board 16, as described in S6-B.
If processor 17 has determined an adjusted board rotation tensor but has not adjusted the board movement vector, processor 17 can use the information about the relative direction of the true physical gravity vector determined in S2-E, the unadjusted board movement vector, and the adjusted board rotation tensor to simulate the motion of a virtual object 15, such as ball 15a, under a true virtual gravity vector based on the actual motion of board 16, as described in S6-C.
If processor 17 has not adjusted the board movement vector and the board rotation tensor, processor 17 can use the information about the relative direction of the true physical gravity vector determined in S2-E, the unadjusted board movement vector, and the unadjusted board rotation tensor to simulate the motion of a virtual object 15, such as ball 15a, under a true virtual gravity vector based on the actual motion of board 16, as described in S6-D.
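For purpose of illustration only, one possible S6-style simulation step for ball 15a is sketched below; the board-plane convention (board normal along +z) and the omission of friction, rolling, and collisions are simplifying assumptions.

```python
import numpy as np

def step_ball(ball_pos, ball_vel, g_tracker, R_board, dt=0.03):
    """One simulation step under a true virtual gravity vector.
    g_tracker: gravity direction from S2-E, in tracker coordinates.
    R_board: board orientation in the same coordinates."""
    g_board = np.asarray(R_board, float).T @ np.asarray(g_tracker, float)
    g_tangent = g_board.copy()
    g_tangent[2] = 0.0  # keep only the component in the board plane
    ball_vel = np.asarray(ball_vel, float) + g_tangent * dt
    ball_pos = np.asarray(ball_pos, float) + ball_vel * dt
    return ball_pos, ball_vel
```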
Once processor 17 performs one or more of S6-A, S6-B, S6-C, and S6-D to simulate the motion of a virtual object 15, computer 14 outputs this simulation as augmented reality information or virtual reality information to head-worn unit 10. Accordingly, displays 13 can display a combination of virtual objects 15 and physical objects, such as board 16, in real-time. Alternatively, displays 13 can display only virtual objects 15.
For purpose of explanation and illustration, and not limitation, another exemplary embodiment of the interaction system in accordance with the application is shown in
A similar process occurs in S11-C and S11-D with respect to head motion detector motion information and physical gravity vector information output from head motion detector 126. Specifically, when head motion detector 126 outputs head motion detector motion information and physical gravity vector information to processing device 114, processing device 114 can process the motion information of head motion detector 126, if necessary, and determine one or more of the acceleration and velocity of head motion detector 126 relative to a reference frame. Processing device 114 can store the acceleration and velocity of head motion detector 126 relative to the reference frame at discrete time intervals, each time interval being a predetermined length of time (e.g., 0.03 seconds). S11-C and S11-D can occur separately or simultaneously. If necessary, processing device 114 can store information about the relative positions of head motion detector 126 and optical trackers 11.
For each successive change in position of the gameboard over a discrete time interval, processing device 114 can determine a board movement vector, as shown in S12-A. Processing device 114 can also determine a board rotation tensor for the successive change in orientation of the gameboard over each discrete time interval as shown in S12-B.
Further, in S12-C, processing device 114 can process the motion information output by head motion detector 126 to determine one or more of the velocity and acceleration of head motion detector 126 relative to the reference frame.
Processing device 114 can perform a compensation procedure to account for motion of optical sensors 11, as shown in
If processing device 114 determines that head motion detector 126 has changed at least one of position and orientation as a result of acceleration, processing device 114 can apply adjustments to the corresponding at least one of board movement vector and board rotation tensor to account for the change in at least one of position and orientation of head motion detector 126 and optical sensors 11, as described in S14-A. Specifically, because optical sensors 11 are rigidly fixed to head motion detector 126, or have an otherwise known relative position, the orientation changes and position changes of optical sensors 11 can be determined by applying the proper adjustments. Here, the adjustments can include one or more heuristics for compensation, including, but not limited to: ignoring changes in one or more of board position information and board orientation information over the corresponding discrete time interval when simulating the physics of the virtual reality object, while simulating a physically accurate field of vision. These adjustments produce at least one of an adjusted board movement vector and an adjusted board rotation tensor which more accurately model motion of the gameboard.
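For purpose of illustration only, the "ignore changes" heuristic of S14-A can be sketched as follows; the boolean input and the identity-rotation passthrough are assumptions of the sketch.

```python
import numpy as np

def heuristic_compensation(board_move, board_rot, device_accelerating):
    """While the head motion detector is judged to be accelerating, ignore
    apparent gameboard motion for the physics simulation; the rendering path
    can still use the raw poses to keep the field of vision accurate."""
    if device_accelerating:
        return np.zeros(3), np.eye(3)  # treat the gameboard as stationary
    return board_move, board_rot
```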
Processing device 114 performs a comparison of velocity values in S23 if processing device 114 determines the velocity of head motion detector 126 relative to the reference frame in S12-C. In S23 of
If processing device 114 infers that head motion detector 126 has changed at least one of position and orientation as a result of acceleration, processing device 114 can apply adjustments to the corresponding at least one of board movement vector and board rotation tensor to account for the change in at least one of position and orientation of head motion detector 126 and optical sensors 11, as described in S24-A. Specifically, because optical sensors 11 are rigidly fixed to head motion detector 126, or have an otherwise known relative position, the orientation changes and position changes of optical sensors 11 can be determined by applying the proper adjustments. Here, the adjustments include one or more of subtracting the head motion detector movement vector from the board movement vector for corresponding discrete time intervals and applying frame rotations to correct for changes in orientation in the proper sequence. This adjustment produces at least one of an adjusted board movement vector and an adjusted board rotation tensor which more accurately model motion of the gameboard.
If processing device 114 has adjusted one or more of the board movement vector and the board rotation tensor, processing device 114 can use the information about the relative direction of the true physical gravity vector received in S11-D, one or more of the adjusted board movement vector determined in S14-A or S24-A, and the adjusted board rotation tensor determined in S14-A or S24-A to simulate the motion of virtual objects, under a true virtual gravity vector, based on the actual motion of the gameboard, as described in S15-A (comparing acceleration) or S25-A (comparing velocity). If an adjusted board movement vector is determined in both S14-A and S24-A, the processor can be programmed to select one of the adjusted board movement vector determined in S14-A and the adjusted board movement vector determined in S24-A; that is, the processor can be programmed to implement only one of S15-A and S25-A. If an adjusted board rotation tensor is determined in both S14-A and S24-A, the processor can likewise be programmed to select one of the adjusted board rotation tensor determined in S14-A and the adjusted board rotation tensor determined in S24-A, implementing only one of S15-A and S25-A.
If processing device 114 infers that both the position and orientation of head motion detector 126 have not changed as a result of acceleration (S13) or velocity (S23), processing device 114 does not perform an adjustment and proceeds directly to simulation.
If processing device 114 has not adjusted the board movement vector and the board rotation tensor, processing device 114 can use the information about the relative direction of the true physical gravity vector received in S11-D, the unadjusted board movement vector determined in S12-A, and the unadjusted board rotation tensor determined in S12-B to simulate the motion of virtual objects, under a true virtual gravity vector, based on the actual motion of the gameboard, as described in S15-B (comparing acceleration) or S25-B (comparing velocity).
Once processing device 114 performs one or more of S15-A, S25-A, S15-B and S25-B to simulate the motion of at least one virtual object, processing device 114 outputs this simulation as augmented reality information or virtual reality information to head-worn unit 110. Accordingly, displays 113 can display a combination of virtual objects and physical objects, such as the gameboard, in real-time. Alternatively, displays 113 can display only virtual objects.
As described above, a virtual gravitational force can act on ball 15a as natural gravity would act on a physical object placed on board 16. Thus, even when the head-worn unit 10 moves independently of board 16 or changes orientation with respect to board 16, the virtual force of gravity will appear to act in the true physical gravity vector direction with respect to board 16.
The computer interaction environment is not limited to the embodiments described above. Interaction delivery devices other than head-worn unit 10 can be used in place of or in combination with head-worn unit 10. For example, interaction delivery devices can include tracked physical displays that are held, worn on additional or alternative parts of the body, or mounted on objects in the environment. Alternatively, one or more of the first tracking mechanism, the second tracking mechanism, and each display device, individually or in combination, can be disposed separately from the interaction delivery device.
Further, the first tracking mechanism can be a device other than optical sensors 11 and can include fewer than two optical sensors or other tracking mechanisms. For example, the first tracking mechanism can include depth cameras or acoustic tracking devices, such as sonar devices, etc. Additionally, second tracking mechanisms other than head tracker 12 can be used in place of or in combination with head tracker 12. For example, the second tracking mechanism can be fixed to earth or to an object moving relative to earth, rather than being fixed to or at a predetermined position relative to the first tracking mechanism. Here, the second tracking mechanism can track the first tracking mechanism as the reference. The second tracking mechanism can include one or more of a three-axis accelerometer, a three-axis magnetometer, a three-axis gyroscope, and a full six-degrees-of-freedom tracker. Additionally, the second tracking mechanism can determine the relative direction of a true force vector acting in the force's natural direction (true physical vector direction) other than that of gravity, including, but not limited to, the true vectors of magnetic forces, electrostatic forces, friction, additional artificial forces, etc.
Further, the tracking software can be configured to cause the processor to determine only one of a movement vector and a rotation tensor for each of the board and the second tracking mechanism. Alternatively, the tracking software can be configured to cause the processor to determine both a movement vector and a rotation tensor for each of the board and the second tracking mechanism as described above. Additionally, the tracking software can be configured to cause the processor to apply an adjustment to only movement vectors, only rotation tensors, or some combination of rotation tensors and movement vectors.
Further, the simulated forces are not limited to gravitational forces applied in the true physical gravity vector direction. The virtual gravity vector can be intentionally different from the true physical gravity vector (e.g., a not true virtual gravity vector). For example, the simulated gravitational force can have a different magnitude or act in a direction different from the true physical gravity vector. Additionally, the simulated force can include, but is not limited to, magnetic forces, electrostatic forces, and additional artificial forces in true or not true directions. Thus, the virtual gravity vector can be an intentionally not true virtual gravity vector.
In addition to the specific embodiments and features disclosed herein, this application also incorporates by reference the entire disclosure of each and every patent publication identified herein. This application therefore includes any possible combination of the various features disclosed, incorporated by reference or claimed herein. As such, the particular features presented in the dependent claims and disclosed above can be combined with each other in other manners within the scope of the application such that the application should be recognized as also specifically directed to other embodiments having any other possible combinations. Thus, the foregoing description of specific embodiments of the application has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the application to those embodiments disclosed.
Claims
1. An interaction delivery device comprising:
- a display device configured to display virtual objects;
- a first tracking mechanism configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism; and
- a second tracking mechanism configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference,
- wherein one or more of position information received from the first tracking mechanism and orientation information received from the first tracking mechanism is used to generate motion data for the physical object,
- wherein one or more of position information received from the second tracking mechanism and orientation information received from the second tracking mechanism is used to generate adjusted motion data for the physical object,
- wherein the adjusted motion data for the physical object compensates for movement relative to the physical object of at least one of the first tracking mechanism and the second tracking mechanism, and
- wherein the adjusted motion data for the physical object is used to generate virtual object information, and
- wherein the virtual object information is received by the display.
2. An interaction processing device comprising:
- a processor;
- a memory;
- an input unit configured to receive information; and
- an output unit configured to output information,
- wherein the input unit is configured to receive at least one of physical object position information, physical object orientation information, tracking mechanism position information, and tracking mechanism orientation information,
- wherein the processor is configured to generate motion data for a physical object using at least one of the physical object position information and the physical object orientation information,
- wherein the processor is configured to generate adjusted motion data for the physical object using at least one of tracking mechanism position information and tracking mechanism orientation information,
- wherein the adjusted motion data for the physical object compensates for movement of a tracking mechanism relative to the physical object,
- wherein the processor is configured to generate virtual object information using the adjusted motion data for the physical object, and
- wherein the output unit is configured to output the virtual object information.
3. A computer interaction system, comprising:
- an interaction delivery device comprising: a display device configured to display virtual objects, a first tracking mechanism configured to track one or more of a position of a physical object relative to the first tracking mechanism and an orientation of the physical object relative to the first tracking mechanism, and a second tracking mechanism configured to track one or more of a position of the second tracking mechanism relative to a reference and an orientation of the second tracking mechanism relative to the reference; and
- a computer comprising: a processor, and a memory,
- wherein the computer is configured to process at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism,
- wherein the computer is further configured to compensate for movement relative to the physical object of at least one of the first tracking mechanism and the second tracking mechanism, and
- wherein the computer is further configured to output virtual object information to the display device.
4. The computer interaction system of claim 3, wherein the first tracking mechanism is an optical tracking mechanism.
5. The computer interaction system of claim 4, wherein the interaction delivery device is configured to be head-worn and to display see-through video.
6. The computer interaction system of claim 3, wherein the second tracking mechanism comprises a three-axis accelerometer.
7. The computer interaction system of claim 3, wherein the second tracking mechanism comprises a six-degree-of-freedom tracker configured to determine a three-dimensional position of the second tracking mechanism relative to the reference and a three-dimensional orientation of the second tracking mechanism relative to the reference.
8. The computer interaction system of claim 7, wherein the reference is earth or is fixed to earth.
9. The computer interaction system of claim 7, wherein the reference is fixed to an object moving relative to earth.
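A six-degree-of-freedom sample of the kind recited in claims 7-9 might be represented as follows (a sketch; field names are illustrative assumptions):

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """Hypothetical six-degree-of-freedom sample: 3-D position plus 3-D
    orientation of the second tracking mechanism relative to a chosen
    reference (earth-fixed, or fixed to an object moving relative to earth)."""
    position: np.ndarray   # metres, in reference coordinates
    rotation: np.ndarray   # 3x3 rotation matrix, sensor frame -> reference frame

    def to_reference(self, p_sensor):
        """Map a point from sensor coordinates into reference coordinates."""
        return self.rotation @ np.asarray(p_sensor) + self.position
```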
10. The computer interaction system of claim 3, wherein the second tracking mechanism is configured to determine a true direction of a natural force vector.
11. The computer interaction system of claim 10, wherein the computer is configured to simulate motion of virtual objects based on a true direction of a natural force and based on at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism.
12. The computer interaction system of claim 10, wherein the computer is configured to simulate motion of virtual objects based on a direction of force that is different from the true direction of the natural force and based on at least one of position information received from the first tracking mechanism, orientation information received from the first tracking mechanism, position information received from the second tracking mechanism, and orientation information received from the second tracking mechanism.
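One plausible realization of claims 10-12, assuming the second tracking mechanism reports a sensor-to-world rotation matrix, is to rotate a gravity vector between frames; the names below are assumptions:

```python
import numpy as np

G_WORLD = np.array([0.0, 0.0, -9.81])  # true gravity in an earth-fixed frame, z up

def simulation_force(rot_sensor_to_world, use_true_gravity=True, alt_dir=None):
    """Express the force used by the simulation in the second tracking
    mechanism's coordinates. The true gravity vector is rotated from the
    earth-fixed frame into sensor coordinates; alternatively a different,
    game-defined direction may be substituted (claim 12)."""
    R = np.asarray(rot_sensor_to_world)
    f_world = G_WORLD if use_true_gravity else 9.81 * np.asarray(alt_dir)
    return R.T @ f_world  # world frame -> sensor frame
```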
13. The computer interaction system of claim 3,
- wherein the computer further comprises a physics engine, and
- wherein the computer is configured to simulate virtual objects.
14. The computer interaction system of claim 13,
- wherein the computer is configured to simulate natural forces acting, in the true direction of the natural forces, on the virtual objects based on at least one of the position information received from the first tracking mechanism, the orientation information received from the first tracking mechanism, the position information received from the second tracking mechanism, and the orientation information received from the second tracking mechanism,
- wherein the position information received from the first tracking mechanism comprises a first position of the physical object relative to the first tracking mechanism at a first time and a second position of the physical object relative to the first tracking mechanism at a second time,
- wherein the orientation information received from the first tracking mechanism comprises a first orientation of the physical object relative to the first tracking mechanism at the first time and a second orientation of the physical object relative to the first tracking mechanism at the second time,
- wherein the position information received from the second tracking mechanism comprises a first position of the second tracking mechanism relative to the reference at the first time and a second position of the second tracking mechanism relative to the reference at the second time, and
- wherein the orientation information received from the second tracking mechanism comprises a first orientation of the second tracking mechanism relative to the reference at the first time and a second orientation of the second tracking mechanism relative to the reference at the second time.
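The two-time-sample data recited in claim 14 might be bundled as follows (a sketch; field names are assumptions):

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One synchronized sample of the four quantities recited above."""
    t: float              # sample time, seconds
    obj_pos: np.ndarray   # physical object position, first-tracker coordinates
    obj_rot: np.ndarray   # physical object orientation, 3x3 rotation matrix
    trk_pos: np.ndarray   # second tracker position, reference coordinates
    trk_rot: np.ndarray   # second tracker orientation, 3x3 rotation matrix
```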
15. The computer interaction system of claim 14,
- wherein the second tracking mechanism has a position rigidly fixed relative to the first tracking mechanism,
- wherein a predetermined time exists between the first time and the second time,
- wherein the computer is configured to determine a physical object movement vector between the first position of the physical object and the second position of the physical object,
- wherein the computer is configured to determine a second tracking mechanism movement vector between the first position of the second tracking mechanism and the second position of the second tracking mechanism,
- wherein the computer is configured to calculate an adjusted physical object movement vector if a magnitude of the second tracking mechanism movement vector is greater than or equal to a predetermined threshold distance, and to simulate natural forces acting on the virtual objects based on at least the adjusted physical object movement vector if the magnitude of the second tracking mechanism movement vector is greater than or equal to the predetermined threshold distance, and
- wherein the computer is configured to simulate natural forces acting on the virtual objects based on at least the physical object movement vector if the magnitude of the second tracking mechanism movement vector is less than the predetermined threshold distance.
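The claim-15 selection between raw and adjusted movement vectors might be sketched as below; the 1 cm threshold and the additive compensation are assumptions, and frames are taken as axis-aligned for brevity:

```python
import numpy as np

def movement_for_simulation(obj_p1, obj_p2, trk_p1, trk_p2, threshold_m=0.01):
    """Select the movement vector fed to the physics simulation: compensate
    for the second tracker's ego-motion only when that ego-motion meets
    the predetermined threshold distance."""
    obj_move = np.asarray(obj_p2) - np.asarray(obj_p1)  # physical object movement vector
    trk_move = np.asarray(trk_p2) - np.asarray(trk_p1)  # second tracker movement vector
    if np.linalg.norm(trk_move) >= threshold_m:
        return obj_move + trk_move   # adjusted physical object movement vector
    return obj_move                  # unadjusted movement vector
```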
16. The computer interaction system of claim 14,
- wherein the computer is configured to determine a physical object rotation tensor between the first orientation of the physical object and the second orientation of the physical object,
- wherein the computer is configured to determine a second tracking mechanism rotation tensor between the first orientation of the second tracking mechanism and the second orientation of the second tracking mechanism,
- wherein the computer is configured to calculate an adjusted physical object rotation tensor if a resultant rotation of the second tracking mechanism is greater than or equal to a predetermined threshold rotation value,
- wherein the computer is configured to simulate natural forces acting on the virtual objects based on at least the adjusted physical object rotation tensor if the resultant rotation of the second tracking mechanism is greater than or equal to the predetermined threshold rotation value, and
- wherein the computer is configured to simulate natural forces acting on the virtual objects based on at least the physical object rotation tensor if the resultant rotation of the second tracking mechanism is less than the predetermined threshold rotation value.
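Likewise, the claim-16 selection between raw and adjusted rotation tensors might be sketched as follows; the threshold value and the conjugation used for compensation are assumptions, not the claimed method itself:

```python
import numpy as np

def rotation_angle(R):
    """Resultant rotation angle, in radians, of a 3x3 rotation matrix."""
    return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

def rotation_for_simulation(obj_R1, obj_R2, trk_R1, trk_R2, thresh_rad=0.02):
    """Select the rotation tensor fed to the physics simulation."""
    obj_rel = obj_R2 @ obj_R1.T   # object rotation between samples (tracker frame)
    trk_rel = trk_R2 @ trk_R1.T   # second tracker rotation between samples
    if rotation_angle(trk_rel) >= thresh_rad:
        # Re-express the object's rotation in the fixed reference frame,
        # factoring out the tracker's own rotation.
        return trk_R2 @ obj_rel @ trk_R1.T
    return obj_rel
```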
17. A non-transitory computer readable medium having computer readable instructions stored thereon, which, when executed by a computer having a processor to execute a plurality of processes, are configured to cause the processor to:
- obtain first tracking mechanism information;
- obtain second tracking mechanism information;
- determine a physical object movement vector using the first tracking mechanism information;
- determine a second tracking mechanism movement vector using the second tracking mechanism information;
- determine a direction of a true physical gravity vector relative to the second tracking mechanism;
- simulate motion data of a virtual object using: an adjusted physical object movement vector if a magnitude of the second tracking mechanism movement vector is greater than or equal to a predetermined threshold distance, the physical object movement vector if the magnitude of the second tracking mechanism movement vector is less than the predetermined threshold distance, and the direction of the true physical gravity vector relative to the second tracking mechanism; and
- output the motion data of the virtual object to a display.
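The per-frame simulation step of claim 17, once a movement vector and a gravity direction have been selected, might reduce to something like the following sketch (integration scheme and names are assumptions):

```python
import numpy as np

def simulate_step(ball_pos, ball_vel, move_vec, gravity_dir, dt=1.0 / 60.0):
    """Advance one virtual object by a single frame under true-direction
    gravity, carried along by the selected (raw or adjusted) physical
    object movement vector. Symplectic-Euler integration; all values in
    the second tracking mechanism's coordinates."""
    ball_vel = ball_vel + 9.81 * np.asarray(gravity_dir) * dt   # gravity pull
    ball_pos = ball_pos + ball_vel * dt + np.asarray(move_vec)  # follow the board
    return ball_pos, ball_vel
```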
18. The non-transitory computer readable medium having computer readable instructions stored thereon of claim 17, which, when executed by a computer having a processor to execute a plurality of processes, are further configured to cause the processor to:
- determine a physical object rotation tensor using the first tracking mechanism information;
- determine a second tracking mechanism rotation tensor using the second tracking mechanism information; and
- simulate motion data of the virtual object additionally using: an adjusted physical object rotation tensor if a resultant rotation of the second tracking mechanism is greater than or equal to a predetermined threshold rotation value, and the physical object rotation tensor if the resultant rotation of the second tracking mechanism is less than the predetermined threshold rotation value.
19. The non-transitory computer readable medium having computer readable instructions stored thereon according to claim 18, wherein:
- the first tracking mechanism information comprises: a first position of a physical object relative to a first tracking mechanism and a first orientation of the physical object at a first time, and a second position of the physical object relative to the first tracking mechanism and a second orientation of the physical object at a second time; and
- the second tracking mechanism information comprises: a first position of a second tracking mechanism relative to a reference and a first orientation of the second tracking mechanism at the first time, and a second position of the second tracking mechanism relative to the reference and a second orientation of the second tracking mechanism at the second time.
20. A method of facilitating interaction between an interaction delivery device and a physical object in an environment, the method comprising:
- generating one or more virtual objects in the environment;
- detecting a change in the physical object;
- determining whether the change in the physical object is based on a change in the state of the virtual objects and the physical object, or on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object;
- measuring a direction and effect of a natural force interacting with the environment; and
- updating the virtual objects based on a result of the determining and the measuring.
21. The method of facilitating interaction between an interaction delivery device and a physical object in an environment of claim 20,
- wherein the detecting further comprises: detecting a change in position of the physical object over a given time, and detecting a change in position of the interaction delivery device over the given time;
- wherein the determining further comprises: determining whether a magnitude of the change in position of the interaction delivery device over the given time is greater than or equal to a threshold value, determining that the detected change in the physical object is based on a change in the state of the virtual objects and the physical object if the change in position of the interaction delivery device over the given time is less than the threshold value, and determining that the detected change in the physical object is based on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object if the change in position of the interaction delivery device over the given time is greater than or equal to the threshold value; and
- wherein the updating further comprises: updating positions of the virtual objects to simulate motion consistent with the natural force and the detected change in position of the physical object over the given time if the detected change in the physical object is based on a change in the state of the virtual objects and the physical object, and updating positions of the virtual objects to simulate motion consistent with the natural force and the detected change in position of the physical object and adjusted to remove effects caused by the force applied to the interaction delivery device over the given time if the detected change in the physical object is based on both a force applied to the interaction delivery device and a change in the state of the virtual objects and the physical object.
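The determining step recited above might be sketched as a simple threshold test (the 1 cm threshold value is an assumption):

```python
import numpy as np

def classify_change(device_p1, device_p2, threshold_m=0.01):
    """Decide whether a detected change in the physical object stems from
    the scene state alone, or also from a force applied to the interaction
    delivery device."""
    moved = np.linalg.norm(np.asarray(device_p2) - np.asarray(device_p1))
    if moved >= threshold_m:
        return "device_moved"   # adjust to remove the device's ego-motion
    return "scene_only"         # use the detected change directly
```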
22. An interaction system comprising:
- a physical object;
- at least one virtual object;
- an interaction delivery device comprising: a tracking mechanism configured to track at least one of a position of the physical object relative to the tracking mechanism and an orientation of the physical object relative to the tracking mechanism, a detecting mechanism configured to detect motion of the detecting mechanism, wherein a position of the detecting mechanism relative to a position of the tracking mechanism is predetermined, and a display configured to display the at least one virtual object; and
- a processing device,
- wherein the processing device is configured to receive physical object position information from the tracking mechanism,
- wherein the processing device is configured to receive detecting mechanism motion information from the detecting mechanism,
- wherein the processing device is configured to perform at least one of: determining if a magnitude of acceleration of the detecting mechanism is greater than a predetermined acceleration threshold value and generating adjusted physical object position information if the magnitude of acceleration of the detecting mechanism is greater than the predetermined acceleration threshold value, and determining if a magnitude of velocity of the detecting mechanism is greater than a predetermined velocity threshold value and generating adjusted physical object position information if the magnitude of velocity of the detecting mechanism is greater than the predetermined velocity threshold value,
- wherein the processing device is configured to generate motion information for the at least one virtual object based on the adjusted physical object position information if the processing device generates the adjusted physical object position information,
- wherein the processing device is configured to generate the motion information for the at least one virtual object based on the physical object position information if the processing device does not generate the adjusted physical object position information, and
- wherein the processing device is configured to output the motion information for the at least one virtual object to the display.
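The claim-22 acceleration and velocity tests might be sketched as follows (both threshold values here are assumed, not claimed):

```python
import numpy as np

def needs_adjustment(accel, vel, accel_thresh=0.5, vel_thresh=0.05):
    """Adjust the physical object position information whenever the
    detecting mechanism's acceleration or velocity magnitude exceeds
    its threshold."""
    return (np.linalg.norm(accel) > accel_thresh
            or np.linalg.norm(vel) > vel_thresh)
```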
23. The interaction system of claim 22,
- wherein the detecting mechanism is further configured to detect a correct direction and magnitude of a physical gravity vector, and
- wherein the processing device is further configured to generate the motion information for the at least one virtual object additionally based on the correct direction and magnitude of the physical gravity vector.
24. The interaction system of claim 23,
- wherein the at least one virtual object is configured to move according to a virtual gravity vector, and
- wherein the virtual gravity vector is substantially identical to the physical gravity vector.
25. The interaction system of claim 23,
- wherein the at least one virtual object is configured to move according to a virtual gravity vector, and
- wherein the virtual gravity vector is different from the physical gravity vector.
26. The interaction system of claim 22,
- wherein the tracking mechanism is an optical tracking device and the physical object does not include attached or embedded electronic devices or components.
27. An interaction delivery system comprising:
- a first tracking mechanism rigidly attached to a display device, configured to track the position and orientation of a physical object relative to the display device;
- a second tracking mechanism rigidly attached to the display device, configured to track the absolute orientation of the display device relative to the earth; and
- a processing device configured to process tracking information output from the first tracking mechanism and tracking information output from the second tracking mechanism,
- wherein the processing device is configured to simulate motion information of at least one virtual object from the tracking information output from the first tracking mechanism and tracking information output from the second tracking mechanism,
- wherein the processing device is configured to output the simulated motion information of the at least one virtual object to the display device,
- wherein the display device is configured to display the at least one virtual object disposed relative to the physical object, and
- wherein the at least one virtual object is configured to behave as if acted on by a virtual gravity vector in a same direction as a physical gravity vector acting on the physical object.
Type: Application
Filed: Apr 11, 2011
Publication Date: Oct 13, 2011
Inventors: Steven K. Feiner (New York, NY), Ohan Oda (New York, NY)
Application Number: 13/084,488
International Classification: A63F 9/24 (20060101);