VIRTUAL SENSOR IN A VIRTUAL ENVIRONMENT
A system for generating haptic effects includes a virtual environment having environmental properties, virtual objects, and object property information. A programmatic virtual sensor is placed on a virtual object in the virtual environment. A rendering engine for the virtual environment renders the virtual environment. A module for the virtual sensor receives virtual sensor data including position and time for the sensor and calculates sensor output data including acceleration data and object interaction data for the virtual sensor. A haptic track generator generates a haptic track based on the sensor output data.
This application is a continuation of U.S. patent application Ser. No. 14/584,025, filed on Dec. 29, 2014, the specification of which is hereby incorporated by reference.
FIELD
One embodiment is directed to a virtual sensor. More particularly, one embodiment is directed to a virtual sensor for haptics generation in a virtual environment.
BACKGROUND INFORMATION
Sensor haptics can be used to quickly and realistically generate haptic effects. For example, sensors can be added to real-world objects and gather data about the physics of how the objects interact in their environment. Sensor data can be used to author haptic effects to accompany filmed or authored environments that have similar interactions as those captured with sensor data.
A haptic designer may develop haptic information for playback on one or more haptic output devices based on sensor data. In authored environments, like a three dimensional (“3D”) animated movie or show or 3D-scape game, a haptic designer can incorporate sensor data in developing haptic effects. Haptic effects can be determined using sensor data along with the haptic designer's expertise and experience. The haptic designer can create a haptic track, which when played on a haptic output device will produce haptic effects.
SUMMARY
One embodiment is a system for generating haptic effects in a virtual environment with environmental properties, virtual objects, and object property information. A programmatic virtual sensor is placed on a virtual object in the virtual environment. A rendering engine for the virtual environment renders the virtual environment. A module for the virtual sensor receives virtual sensor data including position and time for the sensor and calculates sensor output data including acceleration data and object interaction data for the virtual sensor. A haptic track generator generates a haptic track based on the sensor output data.
Embodiments allow a designer to place one or more virtual sensors, rather than actual sensors, in an authoring environment. These virtual sensors can gather data within the environment and, using the gathered data, generate a haptic track for playback on a haptic device or for further editing. In some embodiments, an existing haptic effects track can be modified to provide enhanced haptic playback information.
Automatic haptic effect generation at the level of a physics engine using virtual sensors in a video game or other virtual environment can provide another layer of complexity of haptic effects that are altered by a spatialization engine. In addition, virtual sensors can generate entirely new haptic effects on-the-fly in real time rendered environments (or run time) or for later playback in produced rendered environments. Embodiments further include applications that run computer simulations of objects and environments. Other uses also include movie animation, such as in a 3D animated feature film. Because every action in an animated movie must be designed within an authoring tool or framework, simulating sensors to detect collisions or physics in accordance with embodiments enables the auto generation of a haptic track to reduce or remove the need to author one.
Computer readable media may be any available media that can be accessed by processor 22 and includes both volatile and nonvolatile media, removable and non-removable media, and communication media. Communication media may include computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
Processor 22 is further coupled via bus 12 to a display 24, such as a Liquid Crystal Display (“LCD”). A keyboard 26 and a cursor control device 28, such as a computer mouse, are further coupled to bus 12 to enable a user to interface with system 10.
In one embodiment, memory 14 stores software modules that provide functionality when executed by processor 22. The modules include an operating system 15 that provides operating system functionality for system 10. The modules further include a virtual sensor control 16 that provides and processes data from a virtual sensor 19, as disclosed in more detail below. System 10 can be part of a larger system, such as a haptic output system, haptic playback system, or haptic creation system. Therefore, system 10 will typically include one or more additional functional modules 18 to include the additional functionality. A database 17 is coupled to bus 12 to provide centralized storage for modules 16 and 18 and store one or more data sets to support contextual data processing, etc. Some embodiments may not include all of the elements in
A haptic output system (not pictured) can include actuators or other haptic output devices for outputting haptic effects. An actuator, for example, can be any type of motor, including without limitation an Eccentric Rotating Mass (“ERM”) motor, a Linear Resonant Actuator (“LRA”) vibration motor, a piezoelectric motor, or a solenoid actuator. Other types of haptic output devices may be non-mechanical or non-vibratory devices, such as devices that use electrostatic friction (“ESF”) or ultrasonic surface friction (“USF”), devices that induce acoustic radiation pressure with an ultrasonic haptic transducer, devices that use a haptic substrate and a flexible or deformable surface, shape-changing devices that may be attached to a user's body, devices that provide projected haptic output such as a puff of air using an air jet, devices that provide electrical muscle stimulation, etc.
Haptic output devices can be contained in a hand held device such as a phone, tablet, or controller; a wearable device, such as a strap, glove, or clothing article; or directly attached to a user's skin through an adhesive or mechanical device. The haptic output system can decide what haptic effects are to be played and the order in which the effects are played based on high level parameters. In general, the high level parameters that define a particular haptic effect include magnitude, frequency, and duration. Low level parameters such as streaming motor commands could also be used to determine a particular haptic effect. A haptic effect may be considered “dynamic” if it includes some variation of these parameters when the haptic effect is generated or a variation of these parameters based on a user's interaction.
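As a non-limiting illustration of these high level parameters, the following Python sketch models a haptic effect by magnitude, frequency, and duration and makes it “dynamic” by varying magnitude with a user-interaction value; the class and function names are assumptions made for the sketch and are not taken from the disclosure.

    from dataclasses import dataclass

    @dataclass
    class HapticEffect:
        # High level parameters that define a particular haptic effect
        magnitude: float   # normalized strength, 0.0 to 1.0
        frequency: float   # in Hz
        duration: float    # in seconds

    def make_dynamic(effect: HapticEffect, user_pressure: float) -> HapticEffect:
        # A "dynamic" variant: magnitude varies with a user-interaction value
        return HapticEffect(
            magnitude=min(1.0, effect.magnitude * (0.5 + user_pressure)),
            frequency=effect.frequency,
            duration=effect.duration,
        )

    rumble = HapticEffect(magnitude=0.6, frequency=60.0, duration=0.5)
    print(make_dynamic(rumble, user_pressure=0.8))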
At 205, a virtual sensor, such as virtual sensor 19, is placed in a virtual environment. The virtual environment can be, for example, a two dimensional (“2D”) or 3D video game engine, a video authoring program, a 3D modeling environment or workspace, a virtual reality environment, and the like. Virtual sensor 19 can be placed on an object in the virtual environment. The virtual sensor can be a special object within an authoring framework based on a plugin. It can be placed by dragging and dropping the sensor to the object or location on which it will be placed. As the object in the environment moves, the virtual sensor will also move, maintaining a fixed position relative to the object on which it is placed. In some embodiments virtual sensor 19 can be visible in an authoring software environment and invisible in a rendered software environment.
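A minimal Python sketch of such a virtual sensor is shown below, assuming a simple object model in which the sensor stores a fixed offset from its parent object and is flagged visible only in the authoring environment; all identifiers are illustrative and not taken from the disclosure.

    from dataclasses import dataclass

    @dataclass
    class VirtualObject:
        name: str
        position: tuple = (0.0, 0.0, 0.0)    # (x, y, z) in the virtual environment

    @dataclass
    class VirtualSensor:
        # Sensor pinned to a parent object at a fixed local offset
        parent: VirtualObject
        offset: tuple = (0.0, 0.0, 0.0)
        visible_in_authoring: bool = True    # shown in the authoring environment
        visible_in_render: bool = False      # hidden in the rendered environment

        def world_position(self) -> tuple:
            px, py, pz = self.parent.position
            ox, oy, oz = self.offset
            return (px + ox, py + oy, pz + oz)

    ball = VirtualObject("ball", position=(1.0, 2.0, 0.0))
    sensor = VirtualSensor(parent=ball, offset=(0.0, 0.5, 0.0))
    ball.position = (3.0, 2.0, 0.0)      # the object moves...
    print(sensor.world_position())       # ...and the sensor follows: (3.0, 2.5, 0.0)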
At 210, the virtual environment can be rendered. Depending on the type of environment, the rendering can be done in real time, such as with a video game with characters being controlled by controllers, or can be produced such as with a movie with computer generated interactive graphics. As the virtual environment is rendered, virtual sensor 19 will move about the environment either according to a script or according to input from a controller. The object on which virtual sensor 19 is placed can contact other virtual objects, such as characters, walls, floors, and so forth, during rendering.
At 215, virtual sensor data is collected. Data can include positional data (such as location data on three axes), other object data (especially object data from objects that come within a certain proximity of the virtual sensor), time data (a fourth axis), and environmental data such as gravity, temperature, and so forth. From this data, haptic effects can be generated. At 220, haptic effects are generated. For example, if the virtual sensor, such as virtual sensor 19, is moving in a direction over a period of time and then, within a short period of time, changes direction or stops, this can be calculated as a sharp deceleration that qualifies for a particular type of haptic effect.
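One way the deceleration calculation described above might be implemented is sketched below in Python: the virtual sensor's (time, position) samples are differentiated twice, and any acceleration magnitude above an illustrative threshold is treated as a sharp deceleration that could qualify for a haptic effect. The threshold and data layout are assumptions for the sketch.

    def detect_sharp_deceleration(samples, threshold=50.0):
        # samples: list of (t, x, y, z) readings from the virtual sensor.
        # Returns the times at which the acceleration magnitude exceeds the
        # threshold (an illustrative value in the environment's own units).
        events = []
        for i in range(2, len(samples)):
            t0, *p0 = samples[i - 2]
            t1, *p1 = samples[i - 1]
            t2, *p2 = samples[i]
            # finite-difference velocity, then acceleration, per axis
            v1 = [(b - a) / (t1 - t0) for a, b in zip(p0, p1)]
            v2 = [(b - a) / (t2 - t1) for a, b in zip(p1, p2)]
            accel = [(b - a) / (t2 - t1) for a, b in zip(v1, v2)]
            if sum(a * a for a in accel) ** 0.5 > threshold:
                events.append(t2)
        return events

    # A sensor moving steadily and then stopping abruptly:
    track = [(0.0, 0, 0, 0), (0.1, 1, 0, 0), (0.2, 2, 0, 0), (0.3, 2.05, 0, 0)]
    print(detect_sharp_deceleration(track))   # -> [0.3]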
In some embodiments, the virtual sensor, such as virtual sensor 19, can have associated with it particular haptic effects. For example, a developer can place a virtual sensor at a point that will produce a rumble haptic effect and another virtual sensor at the same point (or a different point) that will produce a poke haptic effect, and so forth.
As part of the generation of haptic effects at 220, virtual sensor module 16 can use other information available in the environment to tailor haptic effects more specifically. For example, in a virtual environment where the gravity property has been set to one third of Earth gravity, a haptic effect can be less intense when the sensor detects a change in downward motion, since the gravitational acceleration is lower in that environment than in an environment with Earth gravity. As another example of the type of information available in the environment, a floor can have a property of being rough, and a sensor that moves in close proximity to the floor can be determined to be “rubbing” on the floor, causing an appropriate haptic effect to be produced to more realistically emulate the sensation of rubbing against a rough surface.
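The following Python sketch illustrates two such environment-aware adjustments under simple assumptions not specified in the disclosure: scaling intensity linearly with the environment's gravity, and treating close proximity to a rough floor as rubbing.

    def tailor_intensity(base_intensity, env_gravity, earth_gravity=9.81):
        # Scale a downward-motion effect by the environment's gravity; one third
        # of Earth gravity yields roughly one third of the intensity.
        return base_intensity * (env_gravity / earth_gravity)

    def is_rubbing(height_above_floor, floor_is_rough, proximity=0.02):
        # Treat close, sustained proximity to a rough floor as "rubbing"
        return floor_is_rough and height_above_floor <= proximity

    print(round(tailor_intensity(0.9, env_gravity=9.81 / 3), 2))   # 0.3
    print(is_rubbing(0.01, floor_is_rough=True))                   # True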
At 305, global haptic preferences for the scene of the virtual environment can be set. These can include those environmental properties as discussed above, such as gravity, humidity, in-air versus in-water, and so forth. The properties can be input into the authoring environment via a haptic plugin module in the software. At 310, a model or object in the virtual environment can be selected. At 315, haptic generation can be enabled in the parameters for the object selected in 310. Other parameters in an object can include any type of physical property such as weight, density, material (which could be based on a preset), etc. At 320, the objects or parameters of interaction can be specified. For example, the parameters of interaction can include edge monitoring of an object so that when the edge of the object comes in contact with the edge of another object, a haptic effect is generated. The parameters can further specify how the intensity of the haptic effect can vary depending on the perceived or calculated intensity of the contact between the two objects.
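One possible, purely illustrative representation of the global scene preferences of 305 and the per-object haptic parameters of 310-320 is sketched below in Python; the keys and values are assumptions for the sketch rather than a prescribed format.

    scene_haptics = {           # 305: global haptic preferences for the scene
        "gravity": 9.81,
        "humidity": 0.4,
        "medium": "air",        # e.g. "air" or "water"
    }

    objects = {
        "crate": {
            "haptics_enabled": True,      # 315: enable haptic generation for this object
            "weight": 12.0,
            "material": "wood",           # could be based on a preset
            "interaction": {              # 320: objects or parameters of interaction
                "edge_monitoring": True,
                "intensity_scales_with_contact": True,
            },
        },
    }

    def should_emit_effect(obj_name, edge_contact_detected):
        obj = objects[obj_name]
        return (obj["haptics_enabled"]
                and obj["interaction"]["edge_monitoring"]
                and edge_contact_detected)

    print(should_emit_effect("crate", edge_contact_detected=True))   # True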
At 325, an output haptic track can be identified that specifies where to save or play the haptic effect when the scene is rendered. For example, the haptic track can correspond to a haptic output device if the rendering is done in real time as in a video game, or the haptic track can be a saved format for playing back haptic information on haptic output devices if the rendering produces a video file.
At 330, haptic effect priority in the scene can be specified. For example, certain types of haptic effects can have preference in the scene. A rumble may have a lower priority than a press sensation, so that if an edge of the object contacts another object, the press can be felt distinctly over an accompanying rumble. In another example, haptic effects that could be generated from substantially constant contact with a floor object or the like may receive lower priority, since a constant haptic effect from such contact may diminish the effects generated for other contact in the scene. In embodiments utilizing a virtual sensor, such as virtual sensor 19, where multiple sensors are used, each of the sensors can be assigned a different priority so that sensors with lower priority generate fewer haptic effects, or so that the haptic effects generated based on their virtual sensor data are attenuated.
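A simple priority mixer along these lines is sketched below in Python, assuming lower numbers denote higher priority and that lower-priority effects are attenuated rather than dropped; the attenuation factor is an illustrative assumption.

    def mix_by_priority(effects, attenuation=0.5):
        # effects: list of dicts with "name", "priority" (lower number = higher
        # priority) and "magnitude". Lower-priority effects are attenuated so
        # that, for example, a press reads distinctly over a rumble.
        if not effects:
            return []
        top = min(e["priority"] for e in effects)
        mixed = []
        for e in effects:
            scale = 1.0 if e["priority"] == top else attenuation ** (e["priority"] - top)
            mixed.append({**e, "magnitude": e["magnitude"] * scale})
        return mixed

    queue = [
        {"name": "press",  "priority": 1, "magnitude": 0.8},
        {"name": "rumble", "priority": 2, "magnitude": 0.6},
        {"name": "floor",  "priority": 3, "magnitude": 0.5},   # constant floor contact
    ]
    for e in mix_by_priority(queue):
        print(e["name"], round(e["magnitude"], 3))   # press 0.8, rumble 0.3, floor 0.125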
At 335, the scene is rendered. Rendering can be a production effort, such as with a movie or television show where rendering occurs and is saved to a video file, or rendering can be from a real time rendering engine, such as with a video game where rendering is output directly to a video display.
At 340, haptic effects are generated based on the priorities and preferences discussed in the preceding flow. This is where objects are detected and tracked through the scene. Object edges (or wireframe edge points, if appropriate) can be tracked through the scene, and a haptic effect can be generated based on the tracking and movement of objects through the scene. For example, one algorithm may track a wireframe edge on a first object and a wireframe edge on a second object. The algorithm can note that the first and second object edges come within proximity of approximately the same point along the X, Y, and Z axes. The algorithm can then note that, while within that proximity, the first object edge rapidly accelerates away from the second object edge, and can determine that a haptic effect should be generated for the motion. One of skill in the art can develop algorithms for generating different haptic effects based on the acceleration and deceleration of objects (or virtual sensors), the proximity of the objects (or virtual sensors) to other objects, and environmental factors. Just as with a sensor in the real-world analogue, such an algorithm is based largely on acceleration and position in the scene. One example of how this can work is discussed in greater detail below.
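A hedged Python sketch of such an algorithm follows: it checks whether tracked edge points on two objects come within a proximity threshold and whether the first edge then accelerates rapidly away, in which case an effect would be generated. The thresholds and data layout are assumptions for the sketch.

    def edge_contact_event(edge_a, edge_b, proximity=0.05, accel_threshold=20.0):
        # edge_a, edge_b: lists of (t, x, y, z) samples for a tracked wireframe
        # edge point on each of two objects. Returns True when the edges come
        # within the proximity threshold and the first edge then accelerates
        # rapidly away from that point.
        for i in range(2, min(len(edge_a), len(edge_b))):
            t0, *p0 = edge_a[i - 2]
            t1, *p1 = edge_a[i - 1]
            t2, *p2 = edge_a[i]
            _, *q0 = edge_b[i - 2]
            dist = sum((a - b) ** 2 for a, b in zip(p0, q0)) ** 0.5
            if dist > proximity:
                continue                 # edges not close at this sample
            # edges are close; estimate the first edge's subsequent acceleration
            v1 = [(b - a) / (t1 - t0) for a, b in zip(p0, p1)]
            v2 = [(b - a) / (t2 - t1) for a, b in zip(p1, p2)]
            accel = sum(((b - a) / (t2 - t1)) ** 2 for a, b in zip(v1, v2)) ** 0.5
            if accel > accel_threshold:
                return True
        return False

    edge_on_ball = [(0.0, 0.0, 0, 0), (0.1, 0.01, 0, 0), (0.2, 0.5, 0, 0)]
    edge_on_wall = [(0.0, 0.0, 0, 0), (0.1, 0.0, 0, 0), (0.2, 0.0, 0, 0)]
    print(edge_contact_event(edge_on_ball, edge_on_wall))   # True: contact, then rapid rebound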
At 345, the generated haptics are output to the specified tracks. Haptic effects can be played on haptic output devices or saved to a haptic playback file. Haptic effects generated to specific haptic output devices can also be saved to a file as appropriate.
One example of how the flows of
In another example of how the flows of
At 415, virtual objects can be created or generated in the virtual environment. At 420, the virtual objects can have object properties and parameters assigned, including properties analogous to the real world, such as weight, bounciness, mass, size, radiation/emanation, etc.
At 425, the environment can be run or rendered including virtual objects. The virtual objects and environments are aware of object interaction, such as collision detection, speed, motion, direction, orientation, and perspective. In embodiments using virtual sensors, such as virtual sensor 19, the sensors can serve as proxies for the actual objects or specific points of interest relative to an object. At 430, as part of the rendering or running process for the objects and environment, settings related to rendering or running the environment and objects can be used. Settings can include properties such as frame rate, resolution, numbers of triangles in objects, and so forth.
At 435, a haptic engine can be used to generate appropriate haptic effects based on the actions in the rendered or run environment. The engine can be driven by one or more algorithms based on style or other options. For example, as part of the settings, the perspective of the scene, e.g., that of a viewer/player or another camera angle, can be set, which can alter the types of effects generated. Other options may include a targeted haptic platform and an optional haptic effect prioritization engine or mixer.
At 440, haptic effects can be generated by the haptic engine of 435, based on object or environmental properties. For example, upon rendering, the generation of haptics can be determined by the camera of the scene and by the number and priority of elements specified to print in the mix of haptic effects. In some embodiments, haptic effects can be rendered or previewed while rendering. At 445, the haptic effects can be output to editable or playable formats. Generation and output can be done at run time, such as during game play, animation playback, or on collision detection, to a device capable of rendering the generated haptic effects, such as a haptic output device. The generated output can also be saved into a file storing a haptic track, similar to an audio stem file that isolates specific audio elements to one track.
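As one purely hypothetical example of an editable output format (the disclosure does not name a specific file format), generated effects could be written to a JSON “stem” file as sketched below in Python.

    import json

    def save_haptic_track(effects, path):
        # Write generated effects to a simple JSON "stem" file so they can be
        # edited or played back later.
        with open(path, "w") as f:
            json.dump({"version": 1, "effects": effects}, f, indent=2)

    save_haptic_track(
        [{"t": 1.25, "magnitude": 0.8, "frequency": 60.0, "duration": 0.2}],
        "scene01_haptics.json",
    )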
One example where the flow of
Another example where the flow of
Another example where the flow of
At 510, properties of secondary, tertiary, etc. object(s) are determined along with global or environmental properties. Global or environmental properties can include things such as gravity, light source(s)/intensity, atmospheric conditions, default values for objects, and so forth.
At 515, the physics of the objects in the environment are simulated based on the object and environmental properties. Collision detection, for example, can determine the interaction of objects in the virtual space when they come in contact with each other. It should be noted that physics engines and collision detection algorithms are established technologies in gaming and computer generated graphics. However, the physics of the individual objects as they collide can determine an appropriate haptic response. For example, if one of the objects is a device with a rubber-like material, then just as in real life, the rubber would absorb impact and lessen a haptic response. Described in physics terms, the rubber acts to decrease the absolute value of the acceleration of the object, which in turn decreases the force of the collision. Compared to an impact with a hard object, such as a bowling ball, the force of the rubber collision would be smaller, even for objects of the same mass, because the absolute value of the acceleration (the rate at which the object slows to a stop) would be lower.
At 520, numerical values for the physical interactions of the objects can be determined, such as values for an impact's strength, frequency, attack, and decay. For example, force equals mass times acceleration (F=ma), so a numerical value for strength can be derived from collision data. Similarly, a frequency numerical value can be derived from collision data. Attack and decay can be derived from collision data and the physical properties of objects, such as force per area of impact. As an example of how numerical values can be used, a high force value would lead to a high impact strength. A high frequency of impact, such as two objects sliding along each other and effectively creating many impacts in quick succession, or a low frequency of impact, such as a ball bouncing up and down, can affect whether the determined haptic effect is targeted at a high or low frequency haptic output device (such as an LRA or piezoelectric actuator for high frequency effects, or an ERM for low frequency effects).
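The following Python sketch illustrates these derivations under simple assumptions: strength from F = m·a using an average deceleration, and a frequency classification that routes high-frequency contact to an LRA or piezoelectric actuator and low-frequency contact to an ERM; the cutoff and example values are illustrative.

    def impact_strength(mass, approach_speed, stop_time):
        # F = m * a, with a = dv / dt. A rubber-like object stops over a longer
        # time, so |a| and the resulting force are smaller than for a hard
        # object of the same mass and speed.
        accel = approach_speed / stop_time    # average deceleration magnitude
        return mass * accel

    def target_device(impacts_per_second, cutoff=20.0):
        # Route high-frequency contact (e.g. sliding) to an LRA/piezo actuator
        # and low-frequency contact (e.g. a bouncing ball) to an ERM.
        return "LRA/piezo" if impacts_per_second > cutoff else "ERM"

    print(impact_strength(mass=1.0, approach_speed=5.0, stop_time=0.05))   # hard: 100.0 N
    print(impact_strength(mass=1.0, approach_speed=5.0, stop_time=0.20))   # rubber: 25.0 N
    print(target_device(60.0))   # sliding contact -> LRA/piezo
    print(target_device(2.0))    # bouncing ball   -> ERM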
At 525, numerical values are mapped to haptic effect generation based on render settings. Render settings can include real-time rendering settings such as in a game or settings associated with a production based rendering. Settings can also include a desired output type and/or available or targeted haptic output devices. For example, a haptic effect can include consideration for a targeted type of haptic output device or, if the targeted haptically-enabled device is known (the brand/model of a user's smartphone or gaming system), a tailored haptic effect/track can be created. In some embodiments, multiple haptic tracks can include information in each track specific to a type of haptically-enabled device.
Mapping of numerical values to haptic effects can depend on available haptic output devices, and can model collision impacts. For example, having calculated numerical values for a rubber ball hitting a hand, the numerical values can be mapped to a haptic effect. The haptic effect could include a magnitude, frequency, duration, and rate of decay for one or more haptic output devices.
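A hedged Python sketch of such a mapping is shown below; the normalization range, device frequencies, and duration/decay formulas are placeholder assumptions rather than values from the disclosure.

    def map_to_effect(strength, device="LRA", max_force=200.0):
        # Map a collision-derived strength value to haptic effect parameters.
        magnitude = min(1.0, strength / max_force)     # normalize impact strength
        if device == "LRA":
            frequency = 175.0                          # near a typical LRA resonance
        else:                                          # ERM: speed tracks magnitude
            frequency = 40.0 + 60.0 * magnitude
        duration = 0.05 + 0.002 * strength             # stronger hits ring slightly longer
        decay_rate = 10.0 / max(duration, 0.01)        # shorter effects decay faster
        return {"magnitude": magnitude, "frequency": frequency,
                "duration": duration, "decay_rate": decay_rate}

    print(map_to_effect(strength=100.0, device="ERM"))
    # {'magnitude': 0.5, 'frequency': 70.0, 'duration': 0.25, 'decay_rate': 40.0}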
At 530, a haptic signal can be output or rendered based on the haptic effect generated and parameters (e.g., outputting in real-time, writing to a haptic output track, etc.). At 535, if applicable, prioritization options can be applied to available haptic effects (or prioritization information included in the haptic tracks) on playback of the haptic effects. For example, in some embodiments, the haptic effects can be altered based on the camera angle or other environmental factors.
At 605, a first haptic effect is read. The first haptic effect can be one that was authored, previously automatically generated by engine parameters, or derived from an audio track. At 610, virtual sensor data (or virtual object data) is collected, such as in the flow elements described above (e.g., flow element 215).
At 615, the first haptic effect is modified based on the sensor data to create a second haptic effect. The modification can be a modulation of the first haptic effect, such as pitch shifting the frequency of the original effect based on sensor data from a virtual accelerometer. Another modification can use a collision detection sensor to alter the strength of the effect when appropriate. Modification of the first haptic effect can also be based on a library of real-world, sensor-based haptic effect information. In some embodiments, the second haptic effect can be output to replace the first haptic effect, and in some embodiments, the second haptic effect can be output as a separate haptic track to provide multiple available haptic effect tracks that can be selected depending on changes to the environment or camera angles. One of skill will understand, of course, that a third haptic effect can be created by modifying the first or second haptic effects, and so forth. At 620, the second effect is output and/or played (similar to the outputting of a haptic track as described above).
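The modification of 615 might be sketched in Python as follows, with illustrative scale factors that are assumptions for the sketch: accelerometer data shifts the frequency of the first effect and collision data scales its magnitude, yielding the second effect.

    def modify_effect(first_effect, accel_magnitude, collision_strength=None):
        # Create a second effect by modulating a first one with virtual sensor
        # data: accelerometer readings shift the frequency, and a reported
        # collision scales the magnitude.
        second = dict(first_effect)
        second["frequency"] = first_effect["frequency"] * (1.0 + 0.01 * accel_magnitude)
        if collision_strength is not None:
            second["magnitude"] = min(1.0, first_effect["magnitude"] * (1.0 + collision_strength))
        return second

    first = {"magnitude": 0.5, "frequency": 100.0, "duration": 0.3}
    second = modify_effect(first, accel_magnitude=25.0, collision_strength=0.4)
    print(second)   # can replace the first effect or be written to a separate track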
As disclosed, embodiments incorporate virtual sensor or virtual object tracking in a virtual environment to gather data about the objects (or sensors) in the environment during rendering or at run time. Based on the sensor data and the properties of the environment and the objects, haptic effects are generated to provide an interactive atmosphere for a user.
Several embodiments are specifically illustrated and/or described herein. However, it will be appreciated that modifications and variations of the disclosed embodiments are covered by the above teachings and within the purview of the appended claims without departing from the spirit and intended scope of the invention.
CLAIMS
1-20. (canceled)
21: A system for generating haptic effects comprising:
- a virtual environment including a first virtual object;
- a processor;
- a virtual sensor associated with the first virtual object, wherein the virtual sensor gathers virtual sensor data and the processor is configured to generate a haptic effect based on the virtual sensor data, the haptic effect simulating physics of the first virtual object in the virtual environment and provided by a haptic output device.
22: The system according to claim 21, wherein the virtual sensor data is based on an environmental property of the virtual environment selected from the group consisting of gravity, temperature, light source, light intensity, and humidity.
23: The system according to claim 21, wherein the virtual sensor data is based on an object property of the first virtual object selected from the group consisting of mass, weight, volume, material, density, size, hardness, and texture.
24: The system according to claim 21, wherein the virtual sensor data is based on an interaction of the first virtual object with a second virtual object in the virtual environment.
25: The system according to claim 21, wherein the haptic effect is generated as a second haptic effect subsequent to a first haptic effect.
26: The system according to claim 25, wherein the second haptic effect comprises a modification of the first haptic effect, the modification being based on the virtual sensor data.
27: The system according to claim 21, wherein the virtual sensor is placed on the first virtual object such that the virtual sensor moves with the first virtual object in the virtual environment.
28: The system according to claim 27, wherein the virtual sensor enables sensing across the entire first virtual object in the virtual environment.
29: The system according to claim 27, wherein the first virtual object corresponds to a real object.
30: The system according to claim 29, wherein the real object is a game controller.
31: The system according to claim 30, wherein the game controller is a rifle controller and the virtual sensor is placed on a butt, a grip, or a trigger of the rifle controller in the virtual environment or the virtual sensor is placed on a shoulder, a hand, or a finger of an avatar holding the rifle controller in the virtual environment.
32. A method of generating haptic effects, comprising:
- rendering a virtual environment via a rendering engine, the virtual environment including a first virtual object;
- associating a virtual sensor with the first virtual object;
- receiving virtual sensor data from the virtual sensor;
- calculating sensor output data based on the virtual sensor data;
- generating a haptic effect based on the sensor output data, the haptic effect simulating physics of the first virtual object in the virtual environment; and
- providing the haptic effect by a haptic output device.
33. The method of claim 32, further comprising determining an environmental property of the virtual environment selected from the group consisting of gravity, temperature, light source, light intensity, and humidity, wherein the virtual sensor data is based on the environmental property.
34. The method of claim 32, further comprising determining an object property of the first virtual object selected from the group consisting of mass, weight, volume, material, density, size, hardness, and texture.
35. The method of claim 32, wherein the virtual environment includes a second virtual object and the virtual sensor data is based on an interaction of the first virtual object with the second virtual object in the virtual environment.
36. The method of claim 35, further comprising determining a numerical value for the interaction of the first virtual object with the second virtual object.
37. The method of claim 36, wherein the numerical value is selected from the group consisting of an impact strength, a frequency, an attack, and a decay.
38. The method of claim 36, further comprising mapping the numerical value to the haptic effect.
39. The method of claim 38, wherein the haptic effect includes at least one of a magnitude, a frequency, a duration, and a rate of decay.
40. The method of claim 32, further comprising generating the haptic effect as a second haptic effect subsequent to a first haptic effect, the second haptic effect comprising a modification of the first haptic effect based on the virtual sensor data.
Type: Application
Filed: May 31, 2018
Publication Date: Oct 4, 2018
Inventor: William RIHN (San Jose, CA)
Application Number: 15/994,878