DYNAMIC LIGHTING AND RENDERING TECHNIQUES IMPLEMENTED IN GAMING ENVIRONMENTS

- 3G STUDIOS, INC.

Various aspects described or referenced herein are directed to different methods, systems, and computer program products for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects and/or dynamically movable virtual objects. In at least one embodiment, the dynamic rendering of pixels for a selected virtual object within a given scene may be performed in real-time using predefined light source influence criteria representing the amount of light intensity or light influence which each individual light source (of the scene) has on the pixel being rendered.

Description
RELATED APPLICATION DATA

The present application claims benefit, pursuant to the provisions of 35 U.S.C. §119, of U.S. Provisional Application Ser. No. 61/504,141 (Attorney Docket No. 3GSTP001P), titled “USER BEHAVIOR, SIMULATION AND GAMING TECHNIQUES”, naming Kosta et al. as inventors, and filed Jul. 1, 2011, the entirety of which is incorporated herein by reference for all purposes.

COPYRIGHT NOTICE/PERMISSION

A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever. The following notice applies to the software and data as described below and in the drawings hereto: Copyright© 2010-2012, Dean E. Wolf, All Rights Reserved.

BACKGROUND

The present disclosure relates to gaming environments. More particularly, the present disclosure relates to dynamic lighting and rendering techniques implemented in gaming environments.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a simplified block diagram of a specific example embodiment of a portion of a Computer Network 100.

FIG. 2 is a simplified block diagram of an exemplary gaming machine 200 in accordance with a specific embodiment.

FIG. 3 shows a diagrammatic representation of a machine in the exemplary form of a client (or end user) computer system 300.

FIG. 4 is a simplified block diagram of an exemplary gaming device 400 in accordance with a specific embodiment.

FIG. 5 illustrates an example embodiment of a Server System 580 which may be used for implementing various aspects/features described herein.

FIG. 6 illustrates an example of a functional block diagram of a Server System 600 in accordance with a specific embodiment.

FIGS. 7-14 show various example embodiments illustrating some of the dynamic lighting and rendering techniques described herein.

FIG. 15 shows a flow diagram of a Dynamic Light Influence Rendering Procedure in accordance with a specific embodiment.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

Overview

Various aspects described or referenced herein are directed to different methods, systems, and computer program products for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects and/or dynamically movable virtual objects. In at least one embodiment, the dynamic rendering of pixels for a selected virtual object within a given scene may be performed in real-time using predefined light source influence criteria representing the amount of light intensity or light influence which each individual light source (of the scene) has on the pixel being rendered.

A first aspect is directed to different methods, systems, and computer program products for operating a gaming system. According to different embodiments, the gaming system may be operable to facilitate, enable, initiate, and/or perform one or more of the following operation(s), action(s), and/or feature(s) (or combinations thereof):

    • real-time, dynamic adjustment of lighting characteristics of individual light sources within a rendered scene;
    • real-time dynamic adjustment of rendered lighting characteristics of virtual object(s) (e.g., including both static objects and movable objects) within a rendered scene based on the dynamic adjustment of the lighting characteristics of individual light sources within the rendered scene;
    • real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene;
    • real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene based on the dynamic adjustment of lighting characteristics of individual light sources within the rendered scene;
    • real-time dynamic rendering of lighting and shading characteristics of virtual objects and/or shadows within a given scene based on the lighting characteristics of individual light sources within that scene;
    • real-time projection of properties of reflected light (radiosity) onto virtual objects within a given scene based on the lighting characteristics of individual light sources and objects within that scene;
    • real-time adjustment of lighting intensity, color and falloff;
    • calculation of Light Source Influence criteria for selected features (e.g., virtual objects, pixels, Light Influence Grid Points, object vertices, etc.) of a virtual scene (in at least one embodiment, the Light Source Influence criteria associated with a given pixel characterizes the amount of light intensity or light influence which each distinct, identified light source has on that particular pixel);
    • dynamic calculation (e.g., during runtime) of rendered display characteristics for selected features (e.g., virtual objects, pixels, Light Influence Grid Points, object vertices, etc.) of a virtual scene using predetermined Light Source Influence criteria and using current lighting characteristics of light source(s) within the scene;
    • enabling the rendered color and/or brightness characteristics of one or more pixels, vertices, Light Influence Grid Points and/or other points in space of the scene being rendered to dynamically change over one or more different time intervals, for example, by dynamically adjusting the lighting characteristics (e.g., RGB, brightness, falloff, etc.) of one or more light sources in the scene;
    • dynamic runtime rendering of pixels for a selected virtual object within a given scene using predefined light source influence criteria representing an amount of light intensity or light influence which each individual light source (of the scene) has on the pixel being rendered.

A second aspect is directed to different methods, systems, and computer program products for operating a gaming system. According to different embodiments, the gaming system may be operable to facilitate, enable, initiate, and/or perform one or more of the following operation(s), action(s), and/or feature(s) (or combinations thereof):

    • initiate a first active gaming session at the first gaming system;
    • identify a first virtual scene to be rendered for display during the first active gaming session;
    • identify a first virtual light source associated with the first virtual scene, the first virtual light source having associated therewith a first portion of lighting characteristics;
    • dynamically set the first portion of lighting characteristics to be in accordance with a first set of values;
    • dynamically render and display, during the first active gaming session and using the first portion of lighting characteristics, a first virtual object of the first scene, wherein the first rendered virtual object is displayed in accordance with a first set of rendered lighting characteristics relating to a first visual appearance of the first rendered virtual object;
    • dynamically modify the first portion of lighting characteristics to be in accordance with a second set of values;
    • dynamically adjust the visual appearance of the first rendered virtual object in response to the dynamic modification of the first portion of lighting characteristics, wherein the dynamic adjustment of the visual appearance of the first rendered virtual object includes dynamically calculating, using the second set of values, a modified set of rendered lighting characteristics relating to a modified visual appearance of the first rendered virtual object;
    • identify a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
    • dynamically render and display, during the first active gaming session and using the first portion of lighting characteristics and the second portion of lighting characteristics, the first virtual object of the first scene, wherein the first rendered virtual object is displayed in accordance with a third set of rendered lighting characteristics relating to a third visual appearance of the first rendered virtual object;
    • dynamically modify the second portion of lighting characteristics to be in accordance with a modified set of values;
    • dynamically adjust the visual appearance of the first rendered virtual object in response to the dynamic modification of the second portion of lighting characteristics, wherein the dynamic adjustment of the visual appearance of the first rendered virtual object includes dynamically calculating, using the modified set of values, a second modified set of rendered lighting characteristics relating to a second modified visual appearance of the first rendered virtual object;
    • identify a first pixel associated with the first virtual object;
    • identify first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel;
    • identify second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel;
    • dynamically calculate, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a visual appearance of the first pixel;
    • dynamically calculate, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a composite light intensity of the first pixel;
    • identify a first current color profile associated with the first virtual light source;
    • identify a second current color profile associated with the second virtual light source;
    • dynamically determine, during rendering of the first virtual object, at least one color characteristic of the first pixel using the set of first pixel lighting characteristics and using the first and second current color profiles.

Various objects, features and advantages of the various aspects described or referenced herein will become apparent from the following descriptions of its example embodiments, which descriptions should be taken in conjunction with the accompanying drawings.

Specific Example Embodiments

Various techniques will now be described in detail with reference to a few example embodiments thereof as illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of one or more aspects and/or features described or referenced herein. It will be apparent, however, to one skilled in the art, that one or more aspects and/or features described or referenced herein may be practiced without some or all of these specific details. In other instances, well known process steps and/or structures have not been described in detail in order to not obscure some of the aspects and/or features described or referenced herein.

One or more different inventions may be described in the present application. Further, for one or more of the invention(s) described herein, numerous embodiments may be described in this patent application, and are presented for illustrative purposes only. The described embodiments are not intended to be limiting in any sense. One or more of the invention(s) may be widely applicable to numerous embodiments, as is readily apparent from the disclosure.

These embodiments are described in sufficient detail to enable those skilled in the art to practice one or more of the invention(s), and it is to be understood that other embodiments may be utilized and that structural, logical, software, electrical and other changes may be made without departing from the scope of the one or more of the invention(s). Accordingly, those skilled in the art will recognize that the one or more of the invention(s) may be practiced with various modifications and alterations. Particular features of one or more of the invention(s) may be described with reference to one or more particular embodiments or figures that form a part of the present disclosure, and in which are shown, by way of illustration, specific embodiments of one or more of the invention(s). It should be understood, however, that such features are not limited to usage in the one or more particular embodiments or figures with reference to which they are described. The present disclosure is neither a literal description of all embodiments of one or more of the invention(s) nor a listing of features of one or more of the invention(s) that must be present in all embodiments.

Headings of sections provided in this patent application and the title of this patent application are for convenience only, and are not to be taken as limiting the disclosure in any way.

Devices that are in communication with each other need not be in continuous communication with each other, unless expressly specified otherwise. In addition, devices that are in communication with each other may communicate directly or indirectly through one or more intermediaries.

A description of an embodiment with several components in communication with each other does not imply that all such components are required. To the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of one or more of the invention(s).

Further, although process steps, method steps, algorithms or the like may be described in a sequential order, such processes, methods and algorithms may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described in this patent application does not, in and of itself, indicate a requirement that the steps be performed in that order. The steps of described processes may be performed in any order practical. Further, some steps may be performed simultaneously despite being described or implied as occurring non-simultaneously (e.g., because one step is described after the other step). Moreover, the illustration of a process by its depiction in a drawing does not imply that the illustrated process is exclusive of other variations and modifications thereto, does not imply that the illustrated process or any of its steps are necessary to one or more of the invention(s), and does not imply that the illustrated process is preferred.

When a single device or article is described, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article.

The functionality and/or the features of a device may be alternatively embodied by one or more other devices that are not explicitly described as having such functionality/features. Thus, other embodiments of one or more of the invention(s) need not include the device itself.

Techniques and mechanisms described or referenced herein will sometimes be described in singular form for clarity. However, it should be noted that particular embodiments include multiple iterations of a technique or multiple instantiations of a mechanism unless noted otherwise.

Various aspects described or referenced herein are directed to different methods, systems, and computer program products for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects and/or dynamically movable virtual objects.

FIGS. 7-13 show various example embodiments illustrating some of the dynamic lighting and rendering techniques described herein.

Vertex-Based Lighting Influence

FIG. 7 illustrates one embodiment of an example scene (700) which may be used for illustrating one or more of the vertex-based dynamic lighting techniques described herein. For example, in the specific example embodiment of FIG. 7, a three-dimensional scene 700 is depicted which is to be rendered (e.g., in real-time) on a display screen such as the display of a gaming system. In this example, it is assumed that a stationary virtual object 710 is to be displayed within the scene, and that the object is illuminated by three different light sources, namely L1, L2, and L3. Additionally, in this example embodiment, it is assumed that the object 710 includes five distinct vertices, namely V1-V5.

In at least one embodiment, each light source (L1, L2, L3) may have associated therewith a respective set of unique lighting characteristics, which, for example, may include, but are not limited to, one or more of the following (or combinations thereof):

    • Individually definable Red (R), Green (G), and Blue (B) component (or channel) values (e.g., based on the well known RGB color model);
    • Composite brightness/intensity value;
    • Saturation value;
    • Falloff ratio (e.g., distance vs. attenuation);
    • Specular Highlight color value(s);
    • Radiosity value (e.g., carryover from light bounces).

In at least one embodiment, the RGB color model may be used to describe the lighting characteristics of a given light source by defining or indicating the respective amounts of red, green, and blue components which collectively make up the composite light emitted from that light source. For example, in one embodiment, the lighting characteristics of a given light source may be expressed as an RGB triplet (R,G,B), where the respective value of each component may vary from zero to a defined maximum value. If all the components are at zero the result is black; if all are at maximum, the result is the brightest representable white. According to different embodiments, these ranges may be quantified in several different ways, such as, for example, one or more of the following (or combinations thereof): from 0 to 1, with any fractional value in between; as a percentage, e.g., from 0% to 100%; as integer numbers in the range 0 to 255 (e.g., the range that a single 8-bit byte can offer); as integer numbers in the range 0 to 1023 (e.g., the range able to be represented using 10 bits); as integer numbers in the range 0 to 65535 (e.g., the range able to be represented using 16 bits); etc.
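For purposes of illustration only, the following is a minimal sketch (in Python, with illustrative function names not taken from the disclosure) showing how a normalized component value might be mapped onto the integer ranges described above:

    # Map a normalized RGB component (0.0-1.0) onto one of the integer ranges
    # described above (e.g., 0-255 for 8 bits, 0-1023 for 10 bits, 0-65535 for 16 bits).
    def quantize_component(value, bits=8):
        max_value = (1 << bits) - 1          # 255, 1023, or 65535
        return round(value * max_value)

    # Example: a 40%-intensity component expressed in each range.
    print(quantize_component(0.4, bits=8))    # 102
    print(quantize_component(0.4, bits=10))   # 409
    print(quantize_component(0.4, bits=16))   # 26214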

FIG. 8 shows an example representation of the lighting characteristics associated with light sources L1, L2, L3 (e.g., FIG. 7). FIG. 10 shows an example representation of the lighting characteristics associated with light sources L1, L2, L3 in accordance with an alternate embodiment.

As illustrated in the example embodiment of FIG. 8, the lighting characteristics of a given light source (e.g., L1) may be represented as a composite of individually definable RGB components (e.g., R1, G1, B1). Thus, for example, in one embodiment, using the lighting characteristic data shown in the example embodiment of FIG. 8, the lighting characteristics of light sources L1, L2, L3 may be expressed by the matrices:


L1=[R1,G1,B1];


L2=[R2,G2,B2];


L3=[R3,G3,B3].

As illustrated in the example embodiment of FIG. 10, one or more of the different light sources may also have associated therewith other types of lighting characteristics (e.g., radiosity, ambient occlusion, falloff, etc.) which may be represented by additional fields and/or variables (e.g., 1007, etc.).
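For purposes of illustration, the following is a minimal, hedged sketch (in Python) of how a dynamically adjustable light source profile of the kind represented in FIGS. 8 and 10 might be organized in memory; the field names and example values are illustrative assumptions rather than the actual data layout of the figures:

    from dataclasses import dataclass

    @dataclass
    class LightSourceProfile:
        # Individually definable RGB component values of the light source.
        r: float
        g: float
        b: float
        # Other example lighting characteristics (see FIG. 10).
        brightness: float = 1.0   # composite brightness/intensity value
        saturation: float = 1.0
        falloff: float = 0.0      # distance vs. attenuation ratio
        radiosity: float = 0.0    # carryover from light bounces

    # Example: three light sources L1, L2, L3 for the scene of FIG. 7.
    L1 = LightSourceProfile(r=1.0, g=0.2, b=0.2)
    L2 = LightSourceProfile(r=0.3, g=0.9, b=0.3, falloff=0.5)
    L3 = LightSourceProfile(r=0.2, g=0.2, b=1.0, radiosity=0.1)

Such a profile could, for example, be provided to a graphics processor during game initialization and then adjusted at runtime (e.g., to dim or strobe a light), as discussed further below.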

There are several conventional computer graphic lighting techniques which may be utilized for lighting virtual objects in a rendered image. For example, one popular lighting technique known as “per-pixel lighting” refers to a technique for lighting an image or scene that calculates illumination for each pixel on a rendered image. Many modern video game engineers prefer to implement lighting using per-pixel techniques to achieve increased detail and realism. However, per-pixel lighting is considered to be “costly” or “expensive” in terms of the computational resources required to calculate illumination on a per-pixel basis, which can often introduce latency issues with respect to real-time rendering of scenes. In particular, lighting influence from each light in the scene must be calculated on a per-pixel basis. Color is then derived from the light influence multiplied by the light color. This is high quality but also very computationally intensive.

Accordingly, in an effort to reduce such computational resource and latency issues, some video games employ a lighting technique referred to as “pre-baked” vertex lighting, in which the lighting characteristics at each vertex of a 3D model are pre-calculated based upon a composite of predetermined, static light sources within the scene, and the “pre-baked” lighting characteristics of the vertices are then used to interpolate the resulting values over the model's outer surfaces to calculate the final per-pixel lighting characteristics.

However, one significant limitation of the pre-baked vertex lighting technique is that it does not permit dynamic modification of the lighting characteristics of individual light sources within the scene. For example, referring to the example embodiment of FIG. 7 in which the object 710 is illuminated by light sources L1, L2, and L3, the conventional pre-baked vertex lighting technique is not configured to handle or support dynamic adjustment of the lighting characteristics of individual light sources. Thus, for example, the conventional pre-baked vertex lighting technique is not configured to support the use of light sources exhibiting one or more of the following lighting characteristics (or combinations thereof): dimming; strobing; pulsing; flickering; etc.

In contrast, the various dynamic lighting and rendering techniques described herein may provide functionality for supporting real-time, dynamic adjustment of the lighting characteristics of individual light sources within a rendered scene, and may provide functionality for supporting real-time dynamic adjustment of rendered lighting characteristics of virtual object(s) (e.g., including both static objects and movable objects) within a rendered scene based on the dynamic adjustment of the lighting characteristics of individual light sources within the rendered scene. Moreover, as described in greater detail herein, the dynamic lighting and rendering functionality disclosed herein may be implemented using vertex-based lighting techniques, pixel-based lighting techniques, and/or grid-based lighting techniques.

For purposes of illustration, an example embodiment of the dynamic lighting/rendering technique will now be described by way of example with reference to FIGS. 7-9 of the drawings. In this particular example, as illustrated in the example embodiment of FIG. 7, a three-dimensional scene 700 is depicted which includes a stationary virtual object 710 which is assumed to be illuminated by three different light sources, namely L1, L2, and L3.

Taking into account the various different types of environmental and lighting features, characteristics, properties, and objects (e.g., radiosity, stationary v. dynamic objects, shadows, static v. dynamic light sources, ambient occlusion, falloff, etc.) which may influence lighting within scene portion 700, numerous algorithmic and/or complex computations may be performed (e.g., in advance of real-time game play) to thereby generate a set (or matrix) of Composite Light Intensity value(s) for each vertex (V1-V5) of virtual object 710. In at least one embodiment, the Composite Light Intensity Value for a given vertex (e.g., vertex V1) may be dynamically calculated (e.g., during runtime) using a set of one or more separately defined Light Source Intensity values each representing an amount of light intensity or light influence which a given light source (e.g., L1, L2, L3) has on that particular vertex. Thus, for example, in the example scene 700 which includes three different light sources (L1, L2, L3), the Composite Light Intensity Value I(V1) for vertex V1 may be defined as a set of separate Light Source Intensity values such as:


I(V1)=I(L1@V1)+I(L2@V1)+I(L3@V1),

where:

I(L1@V1) represents the computed amount of light intensity or light influence which light source L1 has upon vertex V1 (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L1's influence at vertex V1);

I(L2@V1) represents the computed amount of light intensity or light influence which light source L2 has upon vertex V1 (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L2's influence at vertex V1); and

I(L3@V1) represents the computed amount of light intensity or light influence which light source L3 has upon vertex V1 (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L3's influence at vertex V1).

Additionally, in at least one embodiment, the value I(L1@V1) (representing the amount of light intensity or light influence which light source L1 has upon vertex V1) may be expressed, for example, as


I(L1@V1)=I(R1@V1)+I(G1@V1)+I(B1@V1),

where:

I(R1@V1) represents the amount of light intensity or light influence which the red component (R1) of light source L1 has upon vertex V1;

I(G1@V1) represents the amount of light intensity or light influence which the green component (G1) of light source L1 has upon vertex V1; and

I(B1@V1) represents the amount of light intensity or light influence which the blue component (B1) of light source L1 has upon vertex V1.

Accordingly, in at least one embodiment, the Composite Light Intensity Value(s) for vertices V1, V2, V3, V4, V5 (FIG. 7) may be represented as:


I(V1)=I(L1@V1)+I(L2@V1)+I(L3@V1);


I(V2)=I(L1@V2)+I(L2@V2)+I(L3@V2);


I(V3)=I(L1@V3)+I(L2@V3)+I(L3@V3);


I(V4)=I(L1@V4)+I(L2@V4)+I(L3@V4);


I(V5)=I(L1@V5)+I(L2@V5)+I(L3@V5);

where:

I(L1@V1)=I(R1@V1)+I(G1@V1)+I(B1@V1);

I(L2@V1)=I(R2@V1)+I(G2@V1)+I(B2@V1);

I(L3@V1)=I(R3@V1)+I(G3@V1)+I(B3@V1);

I(L1@V2)=I(R1@V2)+I(G1@V2)+I(B1@V2);

I(L2@V2)=I(R2@V2)+I(G2@V2)+I(B2@V2);

I(L3@V2)=I(R3@V2)+I(G3@V2)+I(B3@V2);

etc.

In at least one embodiment, the calculated Light Source Influence values may factor in various types of other lighting attributes and/or characteristics described and/or referenced herein. In at least one embodiment, the lighting characteristics relating to the color or hue of a given vertex may be dynamically determined at render time using the Light Source Influence values and current lighting characteristics of the identified light source(s) of the scene. For example, in one embodiment, lighting characteristics relating to the color or hue of a given vertex may be calculated as the sum, over the light sources, of the influence of each light source on the vertex multiplied by that light source's current color. In one embodiment, the light color may not affect the amount of influence which a given light source has on the vertex, but may only affect the resulting color.
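The render-time computation just described may be illustrated with a minimal sketch (the influence values and light colors below are illustrative assumptions, not values taken from the figures):

    # Pre-computed Light Source Influence values for one vertex (e.g., V1 of FIG. 7);
    # these remain static and are not recomputed at runtime.
    influence_at_v1 = {"L1": 0.6, "L2": 0.3, "L3": 0.1}

    # Current colors of each light source, which may be adjusted dynamically
    # (e.g., for dimming, strobing, pulsing, or flickering effects).
    light_colors = {
        "L1": (1.0, 0.2, 0.2),
        "L2": (0.3, 0.9, 0.3),
        "L3": (0.2, 0.2, 1.0),
    }

    def vertex_color(influences, colors):
        # Sum, over all light sources, of (influence on the vertex) * (current light color).
        return tuple(
            sum(influences[name] * colors[name][channel] for name in influences)
            for channel in range(3)
        )

    rgb_v1 = vertex_color(influence_at_v1, light_colors)

If the lighting characteristics of, say, L1 are later modified, re-evaluating vertex_color() with the updated colors yields the dynamically adjusted appearance of the vertex without recomputing the stored influence values.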

Thus, for example, in at least one embodiment, the amount of light intensity or light influence which a given light source has over a pixel (or vertex or other point in space of the scene being rendered) may be pre-calculated and/or recorded as a static value(s), and this information (e.g., the Light Source Influence criteria) may be used to render and/or to dynamically determine the color and/or brightness characteristics of one or more pixels, vertices, Light Influence Grid Points and/or other points in space of the scene being rendered. Moreover, the dynamic lighting and rendering techniques described herein enable the rendered color and/or brightness characteristics of pixels/objects in a given scene to dynamically change over one or more different time intervals, for example, by dynamically adjusting the lighting characteristics (e.g., RGB, brightness, falloff, etc.) of one or more light sources in the scene.

Thus, for example, in one embodiment, the color characteristics for an object may not be determined until the pixels are rendered at runtime, and may be based on the current color attributes of each light source at that moment of rendering. Thus, for example, unlike prior art techniques which may compute and store a static composite light value (RGB) for each vertex, the dynamic lighting and rendering techniques described herein may compute and store the Light Source Influence values (e.g., I(V1), I(V2), I(V3), etc.) characterizing the amount of light intensity or light influence which a given light source has over one or more pixels, vertices, Light Influence Grid Points and/or other points in space of the scene being rendered. Thereafter, at runtime, color characteristics (and/or other display characteristics) for the scene features, pixels, objects, etc. may be dynamically calculated using the Light Source Influence values and the current composite RGB light values of the light sources.

In at least one embodiment, once the Composite Light Intensity Values have been computed for each of the vertices of the virtual object 710, the lighting characteristics of any point or pixel (e.g., 711) on the surface of the virtual object may be relatively quickly calculated (e.g., in real-time) using a vertex-based lighting technique. For example, in one embodiment, the lighting characteristics associated with virtual object point 711 (e.g., at pixel coordinates X1,Y1,Z1) may be determined or calculated by interpolating the Composite Light Intensity Values of the n nearest vertices to the identified point (e.g., the n nearest vertices which are within direct “line of sight” of the identified point, which in this particular example is assumed to be vertices V3, V4, and V5). Thus, for example, in the example scene 700, the lighting characteristics for the point 711 of the virtual object's surface may be calculated and/or determined (e.g., in real-time), for example, by performing one or more of the following operations (or combinations thereof): identifying selected vertices of the virtual object to be used for interpolating the lighting characteristics of the selected point, and calculating a Composite Light Intensity Value for the identified point (711) using a vertex-based lighting technique wherein the lighting characteristics of the identified point are interpolated based on the relative amount of lighting influence which each of the identified vertices (e.g., V3, V4, V5) is determined to have on the selected point. For example, in at least one embodiment, the Composite Light Intensity Value for the identified point 711 may be determined according to:


I(X1,Y1,Z1)=a*I(V3)+b*I(V4)+c*I(V5),

where a, b, c are weighted variables representing interpolations of the relative influences of each identified vertex (V3, V4, V5). In at least one embodiment, the values assigned to weighted variables a, b, c may be based, at least in part, on the respective distance of each identified vertex to the identified coordinate (X1,Y1,Z1); and where a+b+c=1. In at least one embodiment, the weighted values for a, b, c may be determined by normalizing the relative distances of each identified vertex to the identified coordinate. For example, in at least one embodiment, the weighted values for a, b, c may be determined according to:


dSUM=dA+dB+dC;


a=dA/dSUM;


b=dB/dSUM;


c=dC/dSUM.
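As a worked illustration of the interpolation just described (using the distance-normalized weights above and illustrative numeric values):

    # Composite Light Intensity Values for the three nearest vertices V3, V4, V5
    # (illustrative values only).
    I_V3, I_V4, I_V5 = 0.7, 0.4, 0.9

    # Distances dA, dB, dC from point 711 (at X1,Y1,Z1) to vertices V3, V4, V5.
    dA, dB, dC = 2.0, 3.0, 5.0

    # Weights normalized so that a + b + c = 1, per the formulas above.
    dSUM = dA + dB + dC
    a, b, c = dA / dSUM, dB / dSUM, dC / dSUM

    # Interpolated Composite Light Intensity Value at point 711.
    I_711 = a * I_V3 + b * I_V4 + c * I_V5   # 0.2*0.7 + 0.3*0.4 + 0.5*0.9 = 0.71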

In at least one embodiment, the values I(V3), I(V4), and I(V5) (representing the Composite Light Intensity Values for vertices V3, V4, V5, respectively) may be expressed as a function of the amount of light intensity or light influence which each of the respective light sources L1, L2, and L3 has upon vertices V3, V4, V5, such as, for example:


I(V3)=I(L1@V3)+I(L2@V3)+I(L3@V3);


I(V4)=I(L1@V4)+I(L2@V4)+I(L3@V4);


I(V5)=I(L1@V5)+I(L2@V5)+I(L3@V5).

Alternatively, in some embodiments, the values I(V3), I(V4), and I(V5) (representing the Composite Light Intensity Values for vertices V3, V4, V5, respectively) may be expressed as a function of the amount of light intensity or light influence which each of the composite red, green, and blue channels (as determined from light sources L1, L2, and L3) has upon vertices V3, V4, V5, such as, for example:

I(V3) = R(L1@V3)+R(L2@V3)+R(L3@V3) + G(L1@V3)+G(L2@V3)+G(L3@V3) + B(L1@V3)+B(L2@V3)+B(L3@V3)

I(V4) = R(L1@V4)+R(L2@V4)+R(L3@V4) + G(L1@V4)+G(L2@V4)+G(L3@V4) + B(L1@V4)+B(L2@V4)+B(L3@V4)

I(V5) = R(L1@V5)+R(L2@V5)+R(L3@V5) + G(L1@V5)+G(L2@V5)+G(L3@V5) + B(L1@V5)+B(L2@V5)+B(L3@V5)

where, for example:

R(L1@V3)+R(L2@V3)+R(L3@V3) represents a “red channel” composite light influence for vertex V3, which may be calculated based on the respective amounts of red component light influence which each of the light sources L1, L2, and L3 has on vertex V3;

G(L1@V3)+G(L2@V3)+G(L3@V3) represents a “green channel” composite light influence for vertex V3, which may be calculated based on the respective amounts of green component light influence which each of the light sources L1, L2, and L3 has on vertex V3; and

B(L1@V3)+B(L2@V3)+B(L3@V3) represents a “blue channel” composite light influence for vertex V3, which may be calculated based on the respective amounts of blue component light influence which each of the light sources L1, L2, and L3 has on vertex V3.

In at least one embodiment, the shape of the virtual object may influence the number of vertices of that object which are used for interpolating the lighting characteristics of any given pixel on the surface of the virtual object. In some embodiments, there may be upper and/or lower limits to the number of vertices which may be used for interpolation of lighting characteristics of the virtual object's surface pixels.

In at least one embodiment, using the dynamic lighting and rendering techniques described herein, each (or selected ones) of the light sources in a given scene (and/or a given game) may have associated therewith a respective, dynamically adjustable light source profile which, for example, may be provided to a graphics processor (e.g., during game initialization) and which may be used to facilitate real-time, dynamic adjustment of the lighting characteristics of individual light sources within a rendered scene, and/or may be used to facilitate real-time dynamic adjustment of rendered lighting characteristics of virtual object(s) (e.g., including both static objects and movable objects) within a rendered scene based on the dynamic adjustment of the lighting characteristics of individual light sources within the rendered scene. According to different embodiments, the dynamically adjustable light source profile(s) may include at least one matrix or table of values (e.g., FIG. 8, FIG. 10, etc.) representing different lighting characteristics of that particular light source.

In some embodiments, using the dynamic lighting and rendering techniques described herein, each (or selected ones) of the virtual object(s) of a given scene (and/or a given game) may have associated therewith a respective dynamic light influence profile which, for example, may be provided to a graphics processor (e.g., during game initialization) and which may be used to facilitate real-time dynamic adjustment of rendered lighting characteristics of that virtual object within a given scene based on the dynamic adjustment of the lighting characteristics of individual light sources within that scene. According to different embodiments, the dynamic light influence profile(s) may include at least one matrix or table of values representing different lighting characteristics of that particular virtual object.

For example, FIG. 9 shows an example representation of a Light Source Influence Table 900 in accordance with a specific embodiment. In the specific example embodiment of FIG. 9, Light Source Influence Table 900 may be used to facilitate real-time dynamic adjustment of rendered lighting characteristics of virtual object 710 (FIG. 7) based on the dynamic adjustment of the lighting characteristics of individual light sources (e.g., L1, L2, L3) within that scene. As illustrated in the example embodiment of FIG. 9, each of the vertices V1, V2, V3, V4, V5 of the virtual object may have associated therewith a respective set of Light Source Influence criteria (e.g., 910) which characterizes the relative amount of light intensity or light influence which each light source (L1, L2, L3) has upon a given vertex. In at least one embodiment, calculation of the Light Source Influence criteria may occur prior to runtime, and may take into account various types of environmental and lighting features, characteristics, properties, objects, and shadows which may affect the degree of light source influence (e.g., of each light source) at a given vertex.
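For purposes of illustration, a Light Source Influence Table of the kind shown in FIG. 9 might be represented in memory along the following lines (a hedged sketch; the numeric influence values are illustrative only):

    # Rows: vertices V1-V5 of virtual object 710; columns: light sources L1-L3.
    # Each entry is the pre-computed relative influence of that light source on that vertex.
    light_source_influence_table = {
        "V1": {"L1": 0.60, "L2": 0.25, "L3": 0.15},
        "V2": {"L1": 0.50, "L2": 0.30, "L3": 0.20},
        "V3": {"L1": 0.20, "L2": 0.55, "L3": 0.25},
        "V4": {"L1": 0.10, "L2": 0.40, "L3": 0.50},
        "V5": {"L1": 0.05, "L2": 0.35, "L3": 0.60},
    }

    # At render time, the row for a given vertex may be combined with the current
    # light colors (e.g., as in the vertex_color() sketch above) to determine the
    # vertex's rendered color, even as individual light sources are dimmed or strobed.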

In at least one embodiment, light influence characteristics for one or more vertices may be defined relative to a virtual object or to a group of virtual objects (or other types of groupings) in a manner which facilitates the calculation of dynamic light influence characteristics of rotating or moving virtual objects or groups of virtual objects.

In at least one embodiment, the dynamic lighting and rendering techniques described herein may be advantageously leveraged and used to support dynamic adjustment of the characteristics of individual light sources, which, for example, may provide or enable additional light source features, functionalities, and/or characteristics such as, for example, one or more of the following (or combinations thereof):

    • dimming;
    • strobing;
    • pulsing;
    • flickering;
    • dynamic changes in hue;
    • dynamic changes in brightness;
    • and/or other types of lighting characteristics which may be dynamically changed or adjusted over one or more time intervals.

According to different embodiments, such additional light source features, functionalities, and/or characteristics may be advantageously used (e.g., by game software developers) to provide new types of visual effect(s) in one or more scenes, such as, for example, one or more of the following (or combinations thereof):

    • Sparks emitting from equipment;
    • Strobe lights in an urban setting;
    • Police or emergency lights;
    • Power-loss or fire alarm lights in a building;
    • Eerie effects in sci-fi or horror genres;
    • Enhanced simulation for training or entertainment;
    • and/or other types of visual effects which may advantageously leverage one or more of the dynamic lighting and rendering techniques described herein.

Pixel-Based and Grid Point Based Lighting Influence

FIG. 13 illustrates one embodiment of an example scene (1300) which may be used for illustrating one or more of the grid-point based dynamic lighting techniques described herein. For example, in the specific example embodiment of FIG. 13, a two-dimensional scene 1300 is depicted which may be rendered (e.g., in real-time) on a display screen such as the display of a gaming system. In this example, it is assumed that a movable virtual object 1310 is to be displayed within the scene, and that the object is illuminated by three different light sources, namely L1, L2, and L3. Additionally, in this example embodiment, it is assumed that the scene 1300 has associated therewith a plurality of non-visible Light Influence Grid Points (1302) which may be used for facilitating real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene. Additionally, according to different embodiments, at least a portion of the Light Influence Grid Points 1302 may be used to enable and/or provide functionality for supporting real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene based on the dynamic adjustment of lighting characteristics of individual light sources within the rendered scene.

In some embodiments, using the dynamic lighting and rendering techniques described herein, each (or selected ones) of Light Influence Grid Points of a given scene may have associated therewith a respective set of predefined Light Source Influence criteria, which for example, may be provided to a graphics processor (e.g., during game initialization), and which may be used to facilitate real-time dynamic rendering of lighting and shading characteristics of virtual objects and/or shadows within a given scene based on the lighting characteristics of individual light sources within that scene. According to different embodiments, the Light Source Influence criteria may include at least one matrix or table of values representing different Light Source Influence criteria.

FIG. 11 shows an example representation of a Light Source Influence Table 1100 in accordance with a specific embodiment. In the specific example embodiment of FIG. 11, Light Source Influence Table 1100 may be used to facilitate dynamic rendering of lighting and shading characteristics of virtual object 1310 (FIG. 13) based on the dynamic influence of the lighting characteristics of individual light sources (e.g., L1, L2, L3) within that scene. As illustrated in the example embodiment of FIG. 11, each of the Light Influence Grid Points (e.g., A, B, C, D, etc.) has associated therewith a respective set of Light Source Influence criteria (e.g., 1110) which characterizes the relative amount of light intensity or light influence which each light source (L1, L2, L3) has upon a given Light Influence Grid Point. In at least one embodiment, calculation of the Light Source Influence criteria may occur prior to runtime, and may take into account various types of environmental and lighting features, characteristics, properties, virtual objects, and shadows which may affect the degree of light source influence (e.g., of each light source) at a given Light Influence Grid Point.

For example, taking into account the various different types of environmental and lighting features, characteristics, properties, and objects (e.g., radiosity, stationary objects, shadows, light sources, ambient occlusion, etc.) which may influence lighting within scene portion 1300, numerous algorithmic and/or complex computations may be performed (e.g., in advance of real-time game play) to thereby generate a set (or matrix) of Composite Light Intensity value(s) for each Light Influence Grid Point 1302 represented in scene portion 1300. In at least one embodiment, the Composite Light Intensity Value(s) for a given Light Influence Grid Point (e.g., Light Influence Grid Point A) may be dynamically calculated (e.g., during runtime) using a set of one or more separately defined Light Source Intensity values each representing an amount of light intensity or light influence which a given light source (e.g., L1, L2, L3) has on that particular Light Influence Grid Point. Thus, for example, in the example scene 1300 which includes three different light sources (L1, L2, L3), the Composite Light Intensity Value for Light Influence Grid Point A, I(A), may be defined as a set of separate Light Source Intensity values such as:


I(A)=I(L1@A)+I(L2@A)+I(L3@A),

where:

I(L1@A) represents the computed amount of light intensity or light influence which light source L1 has upon Light Influence Grid Point A (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L1's influence at Light Influence Grid Point A);

I(L2@A) represents the computed amount of light intensity or light influence which light source L2 has upon Light Influence Grid Point A (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L2's influence at Light Influence Grid Point A); and

I(L3@A) represents the computed amount of light intensity or light influence which light source L3 has upon Light Influence Grid Point A (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and shadows which affect Light Source L3's influence at Light Influence Grid Point A).

Additionally, in at least one embodiment, the value I(L1@A) (representing the amount of light intensity or light influence which light source L1 has upon Light Influence Grid Point A) may be expressed, for example, as


I(L1@A)=I(R1@A)+I(G1@A)+I(B1@A),

where:

I(R1@A) represents the amount of light intensity or light influence which the red component (R1) of light source L1 has upon Light Influence Grid Point A;

I(G1@A) represents the amount of light intensity or light influence which the green component (G1) of light source L1 has upon Light Influence Grid Point A; and

I(B1@A) represents the amount of light intensity or light influence which the blue component (B1) of light source L1 has upon Light Influence Grid Point A.

Accordingly, in at least one embodiment, the Composite Light Intensity Value(s) for Light Influence Grid Points A, B, C, D (FIG. 13) may be represented as:


I(A)=I(L1@A)+I(L2@A)+I(L3@A);


I(B)=I(L1@B)+I(L2@B)+I(L3@B);


I(C)=I(L1@C)+I(L2@C)+I(L3@C);


I(D)=I(L1@D)+I(L2@D)+I(L3@D);

where:

I(L1@A)=I(R1@A)+I(G1@A)+I(B1@A);

I(L2@A)=I(R2@A)+I(G2@A)+I(B2@A);

I(L3@A)=I(R3@A)+I(G3@A)+I(B3@A);

I(L1@B)=I(R1@B)+I(G1@B)+I(B1@B);

I(L2@B)=I(R2@B)+I(G2@B)+I(B2@B);

I(L3@B)=I(R3@B)+I(G3@B)+I(B3@B);

etc.

In at least one embodiment, once the Composite Light Intensity Values have been computed for each of the Light Influence Grid Points in the scene, the lighting characteristics of a virtual object moving within the scene may be relatively quickly calculated (e.g., in real-time) using a Light Influence Grid Point Influence Interpolation technique wherein the lighting characteristics of the virtual object (e.g., 1310) are interpolated based on the Composite Light Intensity Values of the n nearest Light Influence Grid Points to the virtual object (at a given point in time). In at least one embodiment, the lighting characteristics of the virtual object (e.g., 1310) may be interpolated based on the respective distances of each of the identified Light Influence Grid Points to the virtual object (at that specific time). Thus, for example, in the example scene 1300, if at time T1 it is determined that a representative pixel (1311) of virtual object 1310 is located at coordinates (X1,Y1), the lighting characteristics of the identified pixel 1311 of virtual object 1310 may be calculated and/or determined (e.g., in real-time), for example, by performing one or more of the following operations (or combinations thereof):

    • Identifying the virtual object to be rendered;
    • Identifying a region (e.g., 1311) of the virtual object to be rendered;
    • Identifying at least one representative coordinate of the identified virtual object region (e.g., virtual object pixel 1311 @ X1, Y1).
    • Determining the n nearest Light Influence Grid Points to the identified virtual object pixel coordinate. For example, in a 2D environment such as that illustrated in scene portion 1300, the value for n may be set equal to 4, meaning that the 4 nearest Light Influence Grid Points to the pixel located at X1, Y1 may be identified as Light Influence Grid Points A, B, C, D. In other embodiments, the value n may be set equal to other integers such as 2, 6, 8, etc. For example, in some 3D environment embodiments, the value n may be set equal to 8.
    • Determining the respective linear distances between each of the identified Light Influence Grid Points and the identified virtual object pixel coordinate. As illustrated in the example embodiment shown at 1300, the distances between the identified virtual object pixel coordinate 1311 and Light Influence Grid Points A, B, C, D are indicated as dA, dB, dC, dD, respectively.
    • Calculating Composite Light Intensity Value(s) for the identified virtual object pixel coordinate (e.g., I(X1,Y1)) using a Light Influence Grid Point Influence Interpolation technique wherein the lighting characteristics of the identified coordinate are interpolated based on the relative amount of lighting influence which each of the identified Grid Coordinates (e.g., A, B, C, D) is determined to have on the identified coordinate based on the respective distance of each identified Light Influence Grid Point to the identified virtual object pixel coordinate. For example, in at least one embodiment, the Composite Light Intensity Value(s) for the identified coordinate 1311 may be determined according to:


I(X1,Y1)=a*I(A)+b*I(B)+c*I(C)+d*I(D),

      • where a, b, c, d are weighted variables representing interpolations of the relative influences of each identified Light Influence Grid Point. In at least one embodiment, the values assigned to weighted variables a, b, c, d may be based, at least in part, on the respective distance of each identified Light Influence Grid Point to the identified virtual object pixel coordinate (X1,Y1); and where a+b+c+d=1. For example, in at least one embodiment, the weighted values for a, b, c, d may be determined by normalizing the relative distances of each identified Light Influence Grid Point to the identified coordinate. Thus, for example, in one embodiment, the weighted values for a, b, c, d may be determined according to:


dSUM=dA+dB+dC+dD


a=dA/dSUM


b=dB/dSUM


c=dC/dSUM


d=dD/dSUM

In at least one embodiment, the Light Influence Grid Point Influence Interpolation technique may include calculating separate Composite Light Intensity Value(s) for each light source affecting or influencing the identified virtual object pixel coordinate 1311. For example, in at least one embodiment, the Composite Light Intensity Value(s) for the identified pixel coordinate 1311 may be determined according to:


I(L1@(X1,Y1))=a*I(L1@A)+b*I(L1@B)+c*I(L1@C)+d*I(L1@D)


I(L2@(X1,Y1))=a*I(L2@A)+b*I(L2@B)+c*I(L2@C)+d*I(L2@D)


I(L3@(X1,Y1))=a*I(L3@A)+b*I(L3@B)+c*I(L3@C)+d*I(L3@D)
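A minimal sketch of the per-light interpolation equations above follows (the grid-point influence values and weights are illustrative assumptions):

    # Pre-computed per-light influence values at the four nearest Light Influence
    # Grid Points A, B, C, D (e.g., from a table such as FIG. 11).
    grid_influence = {
        "A": {"L1": 0.8, "L2": 0.1, "L3": 0.1},
        "B": {"L1": 0.6, "L2": 0.3, "L3": 0.1},
        "C": {"L1": 0.4, "L2": 0.4, "L3": 0.2},
        "D": {"L1": 0.2, "L2": 0.5, "L3": 0.3},
    }

    # Weights a, b, c, d for grid points A-D, normalized so they sum to 1.
    weights = {"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}

    def influence_at_pixel(light_name):
        # Interpolated influence of one light source at pixel coordinate (X1, Y1).
        return sum(weights[p] * grid_influence[p][light_name] for p in weights)

    I_L1 = influence_at_pixel("L1")   # corresponds to I(L1@(X1,Y1))
    I_L2 = influence_at_pixel("L2")   # corresponds to I(L2@(X1,Y1))
    I_L3 = influence_at_pixel("L3")   # corresponds to I(L3@(X1,Y1))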

In some embodiments, the lighting characteristics of the other portions/regions of the identified virtual object (1310) may be dynamically determined (e.g., in real-time) using the lighting characteristics, properties and/or attributes of the identified virtual object (1310) and the lighting characteristics of the identified Light Influence Grid Points.

In at least one embodiment, the spacing between Light Influence Grid Points may be configured or designed to match the pixel spacing of the display screen for which the scene is to be rendered and displayed. In some embodiments, spacing between Light Influence Grid Points may be configured or designed to match pixel spacing based on one or more of the following (or combinations thereof): camera angle, position, perspective, etc. In at least one embodiment, the granularity of the Light Influence Grid Array for a given scene may be adjusted (or may be based upon) the relative size(s) of the virtual objects to be displayed in that particular scene, and/or may be based upon other desired criteria.

Alternatively, in some embodiments, the values I(A), I(B), I(C), and I(D) (representing the Composite Light Intensity Values for Light Influence Grid Points A, B, C, D respectively) may be expressed as a function of the amount of light intensity or light influence which each of the composite red, green, and blue channels (as determined from light sources L1, L2, and L3) has upon Light Influence Grid Points A, B, C, D such as, for example:

I(A) = R(L1@A)+R(L2@A)+R(L3@A) + G(L1@A)+G(L2@A)+G(L3@A) + B(L1@A)+B(L2@A)+B(L3@A)

I(B) = R(L1@B)+R(L2@B)+R(L3@B) + G(L1@B)+G(L2@B)+G(L3@B) + B(L1@B)+B(L2@B)+B(L3@B)

I(C) = R(L1@C)+R(L2@C)+R(L3@C) + G(L1@C)+G(L2@C)+G(L3@C) + B(L1@C)+B(L2@C)+B(L3@C)

I(D) = R(L1@D)+R(L2@D)+R(L3@D) + G(L1@D)+G(L2@D)+G(L3@D) + B(L1@D)+B(L2@D)+B(L3@D)

where, for example:

R(L1@A)+R(L2@A)+R(L3@A) represents a “red channel” composite light influence for Light Influence Grid Point A, which may be calculated based on the respective amounts of red component light influence which each of the light sources L1, L2, and L3 has on Light Influence Grid Point A;

G(L1@A)+G(L2@A)+G(L3@A) represents a “green channel” composite light influence for Light Influence Grid Point A, which may be calculated based on the respective amounts of green component light influence which each of the light sources L1, L2, and L3 has on Light Influence Grid Point A; and

B(L1@A)+B(L2@A)+B(L3@A) represents a “blue channel” composite light influence for Light Influence Grid Point A, which may be calculated based on the respective amounts of blue component light influence which each of the light sources L1, L2, and L3 has on Light Influence Grid Point A.

In some embodiments, the lighting characteristics of the identified virtual object pixel coordinate 1311 (of virtual object 1310) may be calculated and/or determined (e.g., in real-time), for example, by determining the separate RGB channel values for the identified virtual object pixel coordinate using the respective RGB component values associated with each influencing light source (e.g., L1, L2, L3). For example, in the example scene portion 1300, separate RGB channel values for the identified virtual object pixel coordinate 1311 may be calculated according to:


R(X1,Y1)=R(L1@(X1,Y1))+R(L2@(X1,Y1))+R(L3@(X1,Y1));


G(X1,Y1)=G(L1@(X1,Y1))+G(L2@(X1,Y1))+G(L3@(X1,Y1));


B(X1,Y1)=B(L1@(X1,Y1))+B(L2@(X1,Y1))+B(L3@(X1,Y1));

where:

R(L1@(X1,Y1))=R1*I(L1@(X1,Y1))

G(L1@(X1,Y1))=G1*I(L1@(X1,Y1))

B(L1@(X1,Y1))=B1*I(L1@(X1,Y1))

R(L2@(X1,Y1))=R2*I(L2@(X1,Y1))

G(L2@(X1,Y1))=G2*I(L2@(X1,Y1))

B(L2@(X1,Y1))=B2*I(L2@(X1,Y1))

R(L3@(X1,Y1))=R3*I(L3@(X1,Y1))

G(L3@(X1,Y1))=G3*I(L3@(X1,Y1))

B(L3@(X1,Y1))=B3*I(L3@(X1,Y1))
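The per-channel combination above may be illustrated with a short sketch (the numeric values are illustrative; the interpolated influences could be produced by the grid-point sketch shown earlier):

    # Interpolated influence of each light source at pixel (X1, Y1).
    influence = {"L1": 0.56, "L2": 0.29, "L3": 0.15}

    # Current RGB component values (R1,G1,B1), (R2,G2,B2), (R3,G3,B3) of the
    # light sources, which may be adjusted dynamically during game play.
    light_rgb = {
        "L1": (1.0, 0.2, 0.2),
        "L2": (0.3, 0.9, 0.3),
        "L3": (0.2, 0.2, 1.0),
    }

    # Per-channel pixel values R(X1,Y1), G(X1,Y1), B(X1,Y1).
    R = sum(influence[l] * light_rgb[l][0] for l in influence)
    G = sum(influence[l] * light_rgb[l][1] for l in influence)
    B = sum(influence[l] * light_rgb[l][2] for l in influence)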

In at least one embodiment, the shape of the virtual object may influence the number of Light Influence Grid Points which are used for interpolating the lighting characteristics of any given pixel on the surface of the virtual object. In some embodiments, there may be upper and/or lower limits to the number of Light Influence Grid Points which may be used for interpolation of lighting characteristics of the virtual object's surface pixels.

FIG. 12 illustrates one embodiment of an example scene (1200) which may be used for illustrating one or more of the dynamic lighting and rendering techniques described herein. For example, in some embodiments, the lighting characteristics of a virtual object moving through shadows (e.g., 1213, which, for example, may be created by static virtual objects such as 1210) may be dynamically calculated and/or determined (e.g., in real-time), for example, by determining the separate RGB channel values for the pixel coordinate (e.g., 1212) using the respective RGB component values associated with each influencing light source (e.g., L1, L2, L3). In at least one embodiment, lighting characteristics for shadows of stationary objects may be “pre-baked” into the Light Influence Grid Points of a given scene. In at least one embodiment, one or more tables and/or matrices may be used to represent shadow map lighting influence characteristics. In some embodiments, a plurality of different shadow maps may be generated and/or stored. In one embodiment, separate R-Channel, G-Channel and B-Channel shadow maps may be generated for a given scene.

For example, referring to the example scene of FIG. 12, separate R-Channel, G-Channel and B-Channel shadow map layers may be generated according to:

Red Channel Shadow Map Layer: R(X1,Y1,Z1)=R(L1@X1,Y1,Z1)+R(L2@X1,Y1,Z1)+R(L3@X1,Y1,Z1)

Green Channel Shadow Map Layer: G(X1,Y1,Z1)=G(L1@X1,Y1,Z1)+G(L2@X1,Y1,Z1)+G(L3@X1,Y1,Z1)

Blue Channel Shadow Map Layer: B(X1,Y1,Z1)=B(L1@X1,Y1,Z1)+B(L2@X1,Y1,Z1)+B(L3@X1,Y1,Z1)

where, for example:

R(L1@X1,Y1,Z1)+R(L2@X1,Y1,Z1)+R(L3@X1,Y1,Z1) represents a “red channel” composite light influence for pixel coordinate 1212, which may be calculated based on the respective amounts of red component light influence which each of the light sources L1, L2, and L3 has on pixel coordinate 1212;

G(L1@X1,Y1,Z1)+G(L2@X1,Y1,Z1)+G(L3@X1,Y1,Z1) represents a “green channel” composite light influence for pixel coordinate 1212, which may be calculated based on the respective amounts of green component light influence which each of the light sources L1, L2, and L3 has on pixel coordinate 1212; and

B(L1@X1,Y1,Z1)+B(L2@X1,Y1,Z1)+B(L3@X1,Y1,Z1) represents a “blue channel” composite light influence for pixel coordinate 1212, which may be calculated based on the respective amounts of blue component light influence which each of the light sources L1, L2, and L3 has on pixel coordinate 1212.
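
By way of a non-limiting sketch of the shadow map layer construction above (the per-light values are hypothetical), each channel layer may be formed by summing the corresponding color-component influences of L1, L2, and L3 at a given coordinate:

    # Illustrative sketch: building R-Channel, G-Channel, and B-Channel shadow map
    # layer values for one scene coordinate (X1, Y1, Z1).
    def shadow_map_layers(per_light_rgb_influence):
        """per_light_rgb_influence: list of (r, g, b) influence tuples, one per light."""
        r_layer = sum(r for r, _, _ in per_light_rgb_influence)
        g_layer = sum(g for _, g, _ in per_light_rgb_influence)
        b_layer = sum(b for _, _, b in per_light_rgb_influence)
        return r_layer, g_layer, b_layer

    # Example: coordinate (X1, Y1, Z1) lies in a shadow cast with respect to L2,
    # so L2's contributions to every channel are near zero at that coordinate.
    R_layer, G_layer, B_layer = shadow_map_layers([(0.40, 0.35, 0.30),   # L1
                                                   (0.02, 0.01, 0.01),   # L2 (shadowed)
                                                   (0.10, 0.10, 0.10)])  # L3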

In at least one embodiment, the gaming system may use the Red Channel to store I(L1@X1,Y1,Z1), the Green Channel to store I(L2@X1,Y1,Z1), etc. In this way, at render time or runtime, the current RGB component color characteristics of each light source may be used to calculate the real-time color characteristics of the rendered pixel, for example, according to:

RGB(X1,Y1,Z1)=(Red Channel@X1,Y1,Z1*L1(RGB))+(Green Channel@X1,Y1,Z1*L2(RGB))+(Blue Channel@X1,Y1,Z1*L3(RGB))
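
A minimal sketch of this channel-packing approach, assuming hypothetical names (one pre-baked influence value per light source is stored in the R, G, and B channels of a map texel, and the lights' current colors are applied at render time):

    # Illustrative sketch: combining per-light influences packed into a texel's
    # R, G, and B channels with each light source's current RGB color.
    def shade_from_packed_influences(texel, light_colors):
        """texel: (i1, i2, i3) pre-baked influences of L1, L2, L3 at this coordinate.
        light_colors: current (R, G, B) color of L1, L2, L3, respectively."""
        r = g = b = 0.0
        for influence, (R, G, B) in zip(texel, light_colors):
            r += influence * R
            g += influence * G
            b += influence * B
        return (r, g, b)

    # Example: the texel stores I(L1)=0.7 in its red channel, I(L2)=0.2 in its green
    # channel, and I(L3)=0.1 in its blue channel; each light's color may change at runtime.
    rgb = shade_from_packed_influences((0.7, 0.2, 0.1),
                                       [(1.0, 0.8, 0.6), (0.1, 0.2, 1.0), (1.0, 1.0, 1.0)])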

According to different embodiments, at least a portion of the various types of functions, operations, actions, and/or other features provided by the dynamic lighting and rendering procedures described herein may be implemented at one or more gaming system(s), at one or more server system(s), and/or combinations thereof.

In at least one embodiment, the dynamic rendering of pixels for a selected virtual object within a given scene may be performed in real-time using predefined light source influence criteria representing the amount of light intensity or light influence which each distinct light source (of the scene) has on each (or selected ones) of the pixels of the virtual object being rendered. In at least one embodiment, lighting characteristics for pixels of a selected static object (within a given scene) may be “pre-baked” into the pixels associated with that object. In at least one embodiment, the lighting characteristics for pixels of a selected static object may be “pre-baked” into the texture map (or texture-related pixels) associated with that object.
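
As a non-limiting sketch of such pre-baking (the texel layout, the "albedo" field, and the numeric values are hypothetical assumptions introduced only for illustration), each texture-related pixel of a static object may carry pre-computed per-light influence values alongside its base color:

    # Illustrative sketch: a static object's texture data with pre-baked per-light
    # influences, combined at render time with the lights' current colors.
    baked_texture = {
        # (u, v) texel: base color plus pre-computed influence of each scene light
        (0, 0): {"albedo": (0.8, 0.7, 0.6), "influence": {"L1": 0.9, "L2": 0.1, "L3": 0.0}},
        (0, 1): {"albedo": (0.8, 0.7, 0.6), "influence": {"L1": 0.4, "L2": 0.5, "L3": 0.1}},
    }

    def shade_texel(texel, light_rgb):
        """Combine the texel's pre-baked influences with the lights' current colors."""
        a = texel["albedo"]
        lit = [sum(texel["influence"][name] * rgb[ch] for name, rgb in light_rgb.items())
               for ch in range(3)]
        return tuple(a[ch] * lit[ch] for ch in range(3))

    rgb = shade_texel(baked_texture[(0, 0)],
                      {"L1": (1.0, 0.9, 0.8), "L2": (0.2, 0.4, 1.0), "L3": (0.5, 0.5, 0.5)})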

FIG. 14 shows an example representation of a Pixel-related Light Source Influence Table 1400 in accordance with a specific embodiment. In the specific example embodiment of FIG. 14, Light Source Influence Table 1400 includes Light Source Influence criteria for selected pixels (e.g., Pixels E-J) of scene 1200 (FIG. 12) characterizing the amount of light intensity or light influence which each distinct light source (e.g., L1, L2, L3, FIG. 12) has on each of the selected pixels. According to different embodiments, the selected pixels may include, but are not limited to, one or more of the following types of pixels (or combinations thereof):

    • pixels (e.g., E, F, G) associated with one or more static objects within the scene;
    • pixels (e.g., H, I, J) associated with one or more shadows within the scene;
    • pixels associated with one or more points in space within the scene;
    • pixels associated with one or more texture maps for static objects within the scene;
    • etc.

As illustrated in the example embodiment of FIG. 14, each of the identified pixels (e.g., E, F, G, H, I, J, etc.) has associated therewith a respective set of Light Source Influence criteria which characterizes the relative amount of light intensity or light influence which each light source (L1, L2, L3) has upon a given pixel. In at least one embodiment, calculation of the Light Source Influence criteria may occur prior to runtime, and may take into account various types of environmental and lighting features, characteristics, properties, virtual objects, and pixels which may affect the degree of light source influence (e.g., of each light source) at a given pixel.

For example, taking into account the various different types of environmental and lighting features, characteristics, properties, and objects (e.g., radiosity, stationary objects, pixels, light sources, ambient occlusion, etc.) which may influence lighting within scene portion 1200, numerous algorithmic and/or complex computations may be performed (e.g., in advance of real-time game play) to thereby generate a set (or matrix) of Composite Light Intensity value(s) for each pixel 1202 represented in scene portion 1200. In at least one embodiment, the Composite Light Intensity Value(s) for a given pixel (e.g., pixel F) may be dynamically calculated (e.g., during runtime) using a set of one or more separately defined Light Source Intensity values each representing an amount of light intensity or light influence which a given light source (e.g., L1, L2, L3) has on that particular pixel. Thus, for example, in the example scene 1200 which includes three different light sources (L1, L2, L3), the Composite Light Intensity Value for pixel F (I(F)) may be defined as a set of separate Light Source Intensity values such as:


I(F)=I(L1@F),I(L2@F),I(L3@F),

where:

I(L1@F) represents the computed amount of light intensity or light influence which light source L1 has upon pixel F (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and pixels which affect Light Source L1's influence at pixel F);

I(L2@F) represents the computed amount of light intensity or light influence which light source L2 has upon pixel F (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and pixels which affect Light Source L2's influence at pixel F); and

I(L3@F) represents the computed amount of light intensity or light influence which light source L3 has upon pixel F (taking into account the various types of environmental and lighting features, characteristics, properties, objects, and pixels which affect Light Source L3's influence at pixel F).
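
A minimal sketch of such a table, assuming hypothetical influence values (the layout loosely mirrors the kind of table shown in FIG. 14, but the numbers are not taken from any figure):

    # Illustrative sketch: a Pixel-related Light Source Influence Table keyed by
    # pixel identifier, holding one pre-computed influence value per light source.
    light_source_influence_table = {
        # pixel: {light source: pre-computed influence at that pixel}
        "E": {"L1": 0.80, "L2": 0.15, "L3": 0.05},
        "F": {"L1": 0.55, "L2": 0.30, "L3": 0.10},
        "G": {"L1": 0.20, "L2": 0.60, "L3": 0.20},
        "H": {"L1": 0.05, "L2": 0.10, "L3": 0.00},   # e.g., a shadow pixel
    }

    # I(F) expressed as a set of separate Light Source Intensity values.
    I_F = (light_source_influence_table["F"]["L1"],
           light_source_influence_table["F"]["L2"],
           light_source_influence_table["F"]["L3"])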

Additionally, in at least one embodiment, the value I(L1@F) (representing the amount of light intensity or light influence which light source L1 has upon pixel F) may be expressed, for example, as


I(L1@F)=I(R1@F)+I(G1@F)+I(B1@F),

where:

I(R1@F) represents the amount of light intensity or light influence which the red component (R1) of light source L1 has upon pixel F;

I(G1@F) represents the amount of light intensity or light influence which the green component (G1) of light source L1 has upon pixel F; and

I(B1@F) represents the amount of light intensity or light influence which the blue component (B1) of light source L1 has upon pixel F.

Similarly, the value I(L2@F) (representing the amount of light intensity or light influence which light source L2 has upon pixel F) may be expressed, for example, as


I(L2@F)=I(R2@F)+I(G2@F)+I(B2@F),

where:

I(R2@F) represents the amount of light intensity or light influence which the red component (R2) of light source L2 has upon pixel F;

I(G2@F) represents the amount of light intensity or light influence which the green component (G2) of light source L2 has upon pixel F; and

I(B2@F) represents the amount of light intensity or light influence which the blue component (B2) of light source L2 has upon pixel F.
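
By way of a non-limiting sketch of this per-channel decomposition (the numeric values are hypothetical, chosen here to be consistent with the table sketch above):

    # Illustrative sketch: one light source's influence at pixel F decomposed into
    # red, green, and blue component influences, as in I(L1@F)=I(R1@F)+I(G1@F)+I(B1@F).
    per_channel_influence_at_F = {
        "L1": {"R": 0.25, "G": 0.20, "B": 0.10},   # I(R1@F), I(G1@F), I(B1@F)
        "L2": {"R": 0.10, "G": 0.15, "B": 0.05},   # I(R2@F), I(G2@F), I(B2@F)
    }

    I_L1_at_F = sum(per_channel_influence_at_F["L1"].values())   # = 0.55
    I_L2_at_F = sum(per_channel_influence_at_F["L2"].values())   # = 0.30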

FIG. 15 shows a flow diagram of a Dynamic Light Influence Rendering Procedure in accordance with a specific embodiment. According to different embodiments, at least a portion of the various types of functions, operations, actions, and/or other features provided by the Dynamic Light Influence Rendering Procedure may be implemented at one or more client system(s), at one or more server system(s), and/or combinations thereof.

In at least one embodiment, the Dynamic Light Influence Rendering Procedure may be operable to perform and/or implement various types of functions, operations, actions, and/or other features such as one or more of those described and/or referenced herein.

In at least one embodiment, the Dynamic Light Influence Rendering Procedure may be operable to utilize and/or generate various different types of data and/or other types of information when performing specific tasks and/or operations. This may include, for example, input data/information and/or output data/information. For example, in at least one embodiment, the Dynamic Light Influence Rendering Procedure may be operable to access, process, and/or otherwise utilize information from one or more different types of sources, such as, for example, one or more local and/or remote memories, devices and/or systems. Additionally, in at least one embodiment, the Dynamic Light Influence Rendering Procedure may be operable to generate one or more different types of output data/information, which, for example, may be stored in memory of one or more local and/or remote devices and/or systems. Examples of different types of input data/information and/or output data/information which may be accessed and/or utilized by the Dynamic Light Influence Rendering Procedure may include, but are not limited to, one or more of those described and/or referenced herein.

In at least one embodiment, a given instance of the Dynamic Light Influence Rendering Procedure may access and/or utilize information from one or more associated databases. In at least one embodiment, at least a portion of the database information may be accessed via communication with one or more local and/or remote memory devices. Examples of different types of data which may be accessed by the Dynamic Light Influence Rendering Procedure may include, but are not limited to, one or more of those described and/or referenced herein.

According to specific embodiments, multiple instances or threads of the Dynamic Light Influence Rendering Procedure may be concurrently implemented and/or initiated via the use of one or more processors and/or other combinations of hardware and/or hardware and software. For example, in at least some embodiments, various aspects, features, and/or functionalities of the Dynamic Light Influence Rendering Procedure may be performed, implemented and/or initiated by one or more of the various systems, components, systems, devices, procedures, processes, etc., described and/or referenced herein.

According to different embodiments, one or more different threads or instances of the Dynamic Light Influence Rendering Procedure may be initiated in response to detection of one or more conditions or events satisfying one or more different types of minimum threshold criteria for triggering initiation of at least one instance of the Dynamic Light Influence Rendering Procedure. Various examples of conditions or events which may trigger initiation and/or implementation of one or more different threads or instances of the Dynamic Light Influence Rendering Procedure may include, but are not limited to, one or more of those described and/or referenced herein.

According to different embodiments, one or more different threads or instances of the Dynamic Light Influence Rendering Procedure may be initiated and/or implemented manually, automatically, statically, dynamically, concurrently, and/or combinations thereof. Additionally, different instances and/or embodiments of the Dynamic Light Influence Rendering Procedure may be initiated at one or more different time intervals (e.g., during a specific time interval, at regular periodic intervals, at irregular periodic intervals, upon demand, etc.).

In at least one embodiment, initial configuration of a given instance of the Dynamic Light Influence Rendering Procedure may be performed using one or more different types of initialization parameters. In at least one embodiment, at least a portion of the initialization parameters may be accessed via communication with one or more local and/or remote memory devices. In at least one embodiment, at least a portion of the initialization parameters provided to an instance of the Dynamic Light Influence Rendering Procedure may correspond to and/or may be derived from the input data/information.

In the specific example embodiment of FIG. 15, it is assumed that the Dynamic Light Influence Rendering Procedure is being used to facilitate the dynamic rendering (e.g., at runtime) of pixels for a selected virtual object within a given scene using predefined light source influence criteria representing the amount of light intensity or light influence which each distinct light source (of the scene) has on all (or selected) pixels of the virtual object being rendered.

As shown at 1502, it is assumed that a specific virtual scene is identified. In at least one embodiment, the static virtual objects and/or light sources of the scene are also identified. For purposes of illustration, it is assumed that the identified scene is scene 1200 of FIG. 12, which is assumed to include a first static object 1210, and which is assumed to include three distinct light sources L1, L2, L3.

As shown at 1504, Light Source Influence criteria for selected features of the scene (e.g., virtual objects, pixels, Light Influence Grid Points, object vertices, etc.) are calculated. In at least one embodiment, the Light Source Influence criteria associated with a given pixel characterize the amount of light intensity or light influence which each distinct, identified light source has on that particular pixel.

In at least one embodiment, calculation of the Light Source Influence criteria may be performed in advance of runtime. In at least one embodiment, the calculated Light Source Influence criteria may be “pre-baked” into one or more selected features (e.g., pixels, Light Influence Grid Points, object vertices, texture maps, etc.) of the scene.

As shown at 1506, a virtual object to be rendered within the identified scene at runtime may be identified.

As shown at 1508, the current lighting characteristics of the identified light source(s) are identified. Examples of such current lighting characteristics may include, but are not limited to, one or more of the following (or combinations thereof): hue, falloff, brightness/intensity, etc.

As shown at 1510, display characteristics for the pixels of the identified virtual object may be dynamically calculated (e.g., during runtime) using the pre-calculated Light Source Influence criteria and current lighting characteristics of the identified light source(s).

As shown at 1512, the identified virtual object may be dynamically rendered and displayed at runtime using the dynamically calculated pixel display characteristics.
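
The steps 1502-1512 described above may be sketched end-to-end, for example, as follows (a minimal, non-limiting illustration; the names are hypothetical, and a trivial distance-based influence function stands in for the full pre-runtime Light Source Influence computation, which may account for occlusion, radiosity, and other effects):

    # Illustrative end-to-end sketch of the Dynamic Light Influence Rendering Procedure.
    from dataclasses import dataclass

    @dataclass
    class Light:
        name: str
        position: tuple      # (x, y) position within the scene
        rgb: tuple           # current (R, G, B) color components

    def influence(light, pixel):
        """Stand-in for the pre-runtime Light Source Influence computation."""
        dx, dy = pixel[0] - light.position[0], pixel[1] - light.position[1]
        return 1.0 / (1.0 + dx * dx + dy * dy)

    # 1502: identify the scene's light sources (static objects are not modeled here).
    lights = [Light("L1", (0, 0), (1.0, 0.9, 0.8)),
              Light("L2", (5, 0), (0.2, 0.4, 1.0)),
              Light("L3", (0, 5), (0.6, 0.6, 0.6))]

    # 1504/1506: pre-calculate Light Source Influence criteria for the pixels of
    # the virtual object identified for rendering.
    object_pixels = [(1, 1), (1, 2), (2, 2)]
    criteria = {p: {l.name: influence(l, p) for l in lights} for p in object_pixels}

    # 1508-1512: at runtime, combine the pre-baked influences with the lights'
    # current color characteristics and render/display each pixel.
    for p in object_pixels:
        r = sum(criteria[p][l.name] * l.rgb[0] for l in lights)
        g = sum(criteria[p][l.name] * l.rgb[1] for l in lights)
        b = sum(criteria[p][l.name] * l.rgb[2] for l in lights)
        print(p, (round(r, 3), round(g, 3), round(b, 3)))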

FIG. 1 illustrates a simplified block diagram of a specific example embodiment of a portion of a Gaming Network 100. As described in greater detail herein, different embodiments of computer networks may be configured, designed, and/or operable to provide various different types of operations, functionalities, and/or features generally relating to Dynamic lighting and rendering technology. Further, as described in greater detail herein, many of the various operations, functionalities, and/or features of the Gaming Network(s) disclosed herein may enable or provide different types of advantages and/or benefits to different entities interacting with the Gaming Network(s).

According to different embodiments, the Gaming Network 100 may include a plurality of different types of components, devices, modules, processes, systems, etc., which, for example, may be implemented and/or instantiated via the use of hardware and/or combinations of hardware and software. For example, as illustrated in the example embodiment of FIG. 1, the Gaming Network 100 may include one or more of the following types of systems, components, devices, processes, etc. (or combinations thereof):

    • Server System(s) 120—In at least one embodiment, the Server System(s) may be operable to perform and/or implement various types of functions, operations, actions, and/or other features such as those described or referenced herein (e.g., such as those illustrated and/or described with respect to FIG. 6). In at least one embodiment, Server System 120 may be configured or designed to include Dynamic Lighting and Rendering Component(s) for providing functionality relating to one or more of the Dynamic lighting and rendering aspects disclosed herein.
    • Publisher/Content Provider System component(s) 140
    • Client Computer System(s) 130
    • 3rd Party System(s) 150
    • Internet & Cellular Network(s) 110
    • Remote Database System(s) 180
    • Remote Server System(s)/Service(s) 170, which, for example, may include, but are not limited to, one or more of the following (or combinations thereof):
      • Content provider servers/services
      • Media Streaming servers/services
      • Database storage/access/query servers/services
      • Financial transaction servers/services
      • Payment gateway servers/services
      • Electronic commerce servers/services
      • Event management/scheduling servers/services
      • Etc.
    • Gaming Device(s) 160, which, for example, may include, but are not limited to, one or more of the following (or combinations thereof): gaming machines, kiosks, consumer devices, smart phones, video game consoles, personal computer systems, electronic display systems, etc. In at least one embodiment, the gaming device(s) may be operable to perform and/or implement various types of functions, operations, actions, and/or other features such as those described or referenced herein (e.g., such as those illustrated and/or described with respect to FIG. 4). In at least one embodiment, a gaming device may be configured or designed as a gaming console which, for example, may be configured or designed to provide users/players with access to various types of games, such as, for example, one or more of the following (or combinations thereof): multi-player online games (e.g., World of Warcraft™, The Sims Online™, etc.); single-player online games, etc.; games accessible via consumer-type game consoles such as Microsoft XBOX™, Sony Playstation™, Nintendo WII™, etc.; wager-based games accessible via one or more casino gaming networks and/or other types of gaming networks; and/or other types of games which may be accessible to users/players via one or more other types of systems and/or networks.
    • etc.

In at least one embodiment, a gaming device may be operable to detect gross motion or gross movement of a user. For example, in one embodiment, a gaming device may include motion detection component(s) which may be operable to detect gross motion or gross movement of a user's body and/or appendages such as, for example, hands, fingers, arms, head, etc.

According to different embodiments, at least some Gaming Network(s) may be configured, designed, and/or operable to provide a number of different advantages and/or benefits and/or may be operable to initiate, and/or enable various different types of operations, functionalities, and/or features, such as, for example, one or more of those described or referenced herein.

According to different embodiments, at least a portion of the various types of functions, operations, actions, and/or other features provided by the Gaming Network 100 may be implemented at one or more client system(s), at one or more server system(s), and/or combinations thereof.

According to different embodiments, the Gaming Network may be operable to utilize and/or generate various different types of data and/or other types of information when performing specific tasks and/or operations. This may include, for example, input data/information and/or output data/information. For example, in at least one embodiment, the Gaming Network may be operable to access, process, and/or otherwise utilize information from one or more different types of sources, such as, for example, one or more local and/or remote memories, devices and/or systems. Additionally, in at least one embodiment, the Gaming Network may be operable to generate one or more different types of output data/information, which, for example, may be stored in memory of one or more local and/or remote devices and/or systems. Examples of different types of input data/information and/or output data/information which may be accessed and/or utilized by the Gaming Network may include, but are not limited to, one or more of those described and/or referenced herein.

According to specific embodiments, multiple instances or threads of the Gaming Network may be concurrently implemented and/or initiated via the use of one or more processors and/or other combinations of hardware and/or hardware and software. For example, in at least some embodiments, various aspects, features, and/or functionalities of the Gaming Network may be performed, implemented and/or initiated by one or more of the various systems, components, systems, devices, procedures, processes, etc., described and/or referenced herein.

In at least one embodiment, a given instance of the Gaming Network may access and/or utilize information from one or more associated databases. In at least one embodiment, at least a portion of the database information may be accessed via communication with one or more local and/or remote memory devices. Examples of different types of data which may be accessed by the Gaming Network may include, but are not limited to, one or more of those described and/or referenced herein.

According to different embodiments, one or more different threads or instances of the Gaming Network may be initiated in response to detection of one or more conditions or events satisfying one or more different types of minimum threshold criteria for triggering initiation of at least one instance of the Gaming Network. Various examples of conditions or events which may trigger initiation and/or implementation of one or more different threads or instances of the Gaming Network may include, but are not limited to, one or more of those described and/or referenced herein.

It will be appreciated that the Gaming Network of FIG. 1 is but one example from a wide range of Gaming Network embodiments which may be implemented. Other embodiments of the Gaming Network (not shown) may include additional, fewer, and/or different components/features than those illustrated in the example Gaming Network embodiment of FIG. 1.

Generally, the dynamic lighting and rendering techniques described herein may be implemented in hardware and/or hardware+software. For example, they can be implemented in an operating system kernel, in a separate user process, in a library package bound into network applications, on a specially constructed machine, or on a network interface card. In a specific embodiment, various aspects described herein may be implemented in software such as an operating system or in an application running on an operating system.

Hardware and/or software+hardware hybrid embodiments of the dynamic lighting and rendering techniques described herein may be implemented on a general-purpose programmable machine selectively activated or reconfigured by a computer program stored in memory. Such a programmable machine may include, for example, mobile or handheld computing systems, PDAs, smart phones, notebook computers, tablets, netbooks, desktop computing systems, server systems, cloud computing systems, network devices, etc.

FIG. 2 is a simplified block diagram of an exemplary gaming machine 200 in accordance with a specific embodiment. As illustrated in the embodiment of FIG. 2, gaming machine 200 includes at least one processor 210, at least one interface 206, and memory 216.

In one implementation, processor 210 and master game controller 212 are included in a logic device 213 enclosed in a logic device housing. The processor 210 may include any conventional processor or logic device configured to execute software allowing various configuration and reconfiguration tasks such as, for example: a) communicating with a remote source via communication interface 206, such as a server that stores authentication information or games; b) converting signals read by an interface to a format corresponding to that used by software or memory in the gaming machine; c) accessing memory to configure or reconfigure game parameters in the memory according to indicia read from the device; d) communicating with interfaces, various peripheral devices 222 and/or I/O devices; e) operating peripheral devices 222 such as, for example, card readers, paper ticket readers, etc.; f) operating various I/O devices such as, for example, displays 235, input devices 230; etc. For instance, the processor 210 may send messages including game play information to the displays 235 to inform players of cards dealt, wagering information, and/or other desired information.

The gaming machine 200 also includes memory 216 which may include, for example, volatile memory (e.g., RAM 209), non-volatile memory 219 (e.g., disk memory, FLASH memory, EPROMs, etc.), unalterable memory (e.g., EPROMs 208), etc. The memory may be configured or designed to store, for example: 1) configuration software 214 such as all the parameters and settings for a game playable on the gaming machine; 2) associations 218 between configuration indicia read from a device with one or more parameters and settings; 3) communication protocols allowing the processor 210 to communicate with peripheral devices 222 and I/O devices 211; 4) a secondary memory storage device 215 such as a non-volatile memory device, configured to store gaming software related information (the gaming software related information and memory may be used to store various audio files and games not currently being used and invoked in a configuration or reconfiguration); 5) communication transport protocols (such as, for example, TCP/IP, USB, Firewire, IEEE1394, Bluetooth, IEEE 802.11x (IEEE 802.11 standards), hiperlan/2, HomeRF, etc.) for allowing the gaming machine to communicate with local and non-local devices using such protocols; etc. In one implementation, the master game controller 212 communicates using a serial communication protocol. A few examples of serial communication protocols that may be used to communicate with the master game controller include but are not limited to USB, RS-232 and Netplex (a proprietary protocol developed by IGT, Reno, Nev.).

A plurality of device drivers 242 may be stored in memory 216. Examples of different types of device drivers may include device drivers for gaming machine components, device drivers for peripheral components 222, etc. Typically, the device drivers 242 utilize a communication protocol of some type that enables communication with a particular physical device. The device driver abstracts the hardware implementation of a device. For example, a device driver may be written for each type of card reader that may be potentially connected to the gaming machine. Examples of communication protocols used to implement the device drivers include Netplex, USB, Serial, Ethernet 275, Firewire, I/O debouncer, direct memory map, serial, PCI, parallel, RF, Bluetooth™, near-field communications (e.g., using near-field magnetics), 802.11 (WiFi), etc. Netplex is a proprietary IGT standard while the others are open standards. According to a specific embodiment, when one type of a particular device is exchanged for another type of the particular device, a new device driver may be loaded from the memory 216 by the processor 210 to allow communication with the device. For instance, one type of card reader in gaming machine 200 may be replaced with a second type of card reader where device drivers for both card readers are stored in the memory 216.

In some embodiments, the software units stored in the memory 216 may be upgraded as needed. For instance, when the memory 216 is a hard drive, new games, game options, various new parameters, new settings for existing parameters, new settings for new parameters, device drivers, and new communication protocols may be uploaded to the memory from the master game controller 212 or from some other external device. As another example, when the memory 216 includes a CD/DVD drive including a CD/DVD designed or configured to store game options, parameters, and settings, the software stored in the memory may be upgraded by replacing a first CD/DVD with a second CD/DVD. In yet another example, when the memory 216 uses one or more flash memory 219 or EPROM 208 units designed or configured to store games, game options, parameters, settings, the software stored in the flash and/or EPROM memory units may be upgraded by replacing one or more memory units with new memory units which include the upgraded software. In another embodiment, one or more of the memory devices, such as the hard-drive, may be employed in a game software download process from a remote software server.

In some embodiments, the gaming machine 200 may also include various authentication and/or validation components 244 which may be used for authenticating/validating specified gaming machine components such as, for example, hardware components, software components, firmware components, information stored in the gaming machine memory 216, etc. Examples of various authentication and/or validation components are described in U.S. Pat. No. 6,620,047, titled, “ELECTRONIC GAMING APPARATUS HAVING AUTHENTICATION DATA SETS,” incorporated herein by reference in its entirety for all purposes.

Peripheral devices 222 may include several device interfaces such as, for example: transponders 254, wire/wireless power distribution components 258, input device(s) 230, sensors 260, audio and/or video devices 262 (e.g., cameras, speakers, etc.), wireless communication components 256, gaming device function control components 262, side wagering management components 264, etc.

Sensors 260 may include, for example, optical sensors, pressure sensors, RF sensors, Infrared sensors, image sensors, thermal sensors, biometric sensors, etc. Such sensors may be used for a variety of functions such as, for example, detecting the presence and/or identity of various persons (e.g., players, casino employees, etc.), devices (e.g., gaming devices), and/or systems within a predetermined proximity to the gaming machine. In one implementation, at least a portion of the sensors 260 and/or input devices 230 may be implemented in the form of touch keys selected from a wide variety of commercially available touch keys used to provide electrical control signals. Alternatively, some of the touch keys may be implemented as touch sensors, such as those provided by a touchscreen display. For example, in at least one implementation, the gaming machine player displays and/or gaming device displays may include input functionality for allowing players to provide desired information (e.g., game play instructions and/or other input) to the gaming machine, game table and/or other gaming system components using the touch keys and/or other player control sensors/buttons. Additionally, such input functionality may also be used for allowing players to provide input to other devices in the casino gaming network (such as, for example, player tracking systems, side wagering systems, etc.).

Wireless communication components 256 may include one or more communication interfaces having different architectures and utilizing a variety of protocols such as, for example, 802.11 (WiFi), 802.15 (including Bluetooth™), 802.16 (WiMax), 802.22, Cellular standards such as CDMA, CDMA2000, WCDMA, Radio Frequency (e.g., RFID), Infrared, Near Field Magnetic communication protocols, etc. The communication links may transmit electrical, electromagnetic or optical signals which carry digital data streams or analog signals representing various types of information.

Power distribution components 258 may include, for example, components or devices which are operable for providing wired or wireless power to other devices. For example, in one implementation, the power distribution components 258 may include a magnetic induction system which is adapted to provide wireless power to one or more gaming devices near the gaming machine. In one implementation, a gaming device docking region may be provided which includes a power distribution component that is able to recharge a gaming device without requiring metal-to-metal contact.

In at least one embodiment, gaming device function control components 262 may be operable to control operating mode selection functionality, features, and/or components associated with one or more gaming devices (e.g., 250). In at least one embodiment, gaming device function control components 262 may be operable to remotely control and/or configure components of one or more gaming devices 250 based on various parameters and/or upon detection of specific events or conditions such as, for example: time of day, player activity levels; location of the gaming device; identity of gaming device user; user input; system override (e.g., emergency condition detected); proximity to other devices belonging to same group or association; proximity to specific objects, regions, zones, etc.

In at least one embodiment, side wagering management components 264 may be operable to manage side wagering activities associated with one or more side wager participants. Side wagering management components 264 may also be operable to manage or control side wagering functionality associated with one or more gaming devices 250. In accordance with at least one embodiment, side wagers may be associated with specific events in a wager-based game that are uncertain at the time the side wager is made. The events may also be associated with particular players, gaming devices (e.g., EGMs), game themes, bonuses, denominations, and/or paytables. In embodiments where the wager-based game is being played by multiple players, in one embodiment, the side wagers may be made by participants who are not players of the game, and who are thus at least one level removed from the actual play of the game.

In instances where side wagers are made on events that depend at least in part on the skill of a particular player, it may be beneficial to provide observers (e.g., side wager participants) with information which is useful for determining whether a particular side wager should be placed, and/or for helping to determine the amount of such side wager. In at least one embodiment, side wagering management components 264 may be operable to manage and/or facilitate data access to player ratings, historical game play data, historical payout data, etc. For example, in one embodiment, a player rating for a player of the wager-based game may be computed based on historical data associated with past play of the wager-based game by that player in accordance with a pre-determined algorithm. The player rating for a particular player may be displayed to other players and/or observers, possibly at the option (or permission) of the player. By using player ratings in the consideration of making side wagers, decisions by observers to make side wagers on certain events need not be made completely at random. Player ratings may also be employed by the players themselves to aid them in determining potential opponents, for example.

Dynamic Lighting and Rendering Component(s) 292 may be configured or designed to provide functionality for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects, dynamically movable virtual objects, shadow mapping, etc. In at least one embodiment, the Dynamic Lighting and Rendering Component(s) 292 may be configured or designed to facilitate, enable, initiate, and/or perform one or more of the following operation(s), action(s), and/or feature(s) (or combinations thereof):

    • real-time, dynamic adjustment of lighting characteristics of individual light sources within a rendered scene.
    • real-time dynamic adjustment of rendered lighting characteristics of virtual object(s) (e.g., including both static objects and movable objects) within a rendered scene based on the dynamic adjustment of the lighting characteristics of individual light sources within the rendered scene.
    • facilitating real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene.
    • real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene based on the dynamic adjustment of lighting characteristics of individual light sources within the rendered scene.
    • real-time dynamic rendering of lighting and shading characteristics of virtual objects and/or shadows within a given scene based on the lighting characteristics of individual light sources within that scene.
    • real-time properties of reflected light (radiosity) projected onto virtual objects within a given scene based on the lighting characteristics of individual light sources and objects within that scene.
    • real-time adjustment of lighting intensity, color and falloff.
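
By way of further non-limiting illustration of the real-time adjustment operations listed above (the class and function names are hypothetical), changing an individual light source's color or intensity at runtime may propagate to objects shaded from pre-baked influence values without recomputing those influences:

    # Illustrative sketch: real-time adjustment of an individual light source.
    class SceneLight:
        def __init__(self, rgb, intensity=1.0):
            self.rgb, self.intensity = rgb, intensity

        def adjust(self, rgb=None, intensity=None):
            if rgb is not None:
                self.rgb = rgb
            if intensity is not None:
                self.intensity = intensity

    def shade(pixel_influences, lights):
        """pixel_influences: one pre-baked influence value per light, for one pixel."""
        return tuple(sum(i * light.intensity * light.rgb[ch]
                         for i, light in zip(pixel_influences, lights))
                     for ch in range(3))

    # Example: dimming and re-coloring L1 at runtime changes the rendered pixel
    # without recomputing the pre-baked influence values (0.7, 0.3).
    L1, L2 = SceneLight((1.0, 0.9, 0.8)), SceneLight((0.2, 0.4, 1.0))
    before = shade((0.7, 0.3), [L1, L2])
    L1.adjust(rgb=(1.0, 0.2, 0.2), intensity=0.5)
    after = shade((0.7, 0.3), [L1, L2])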

In other embodiments (not shown) other peripheral devices include: player tracking devices, card readers, bill validator/paper ticket readers, etc. Such devices may each comprise resources for handling and processing configuration indicia such as a microcontroller that converts voltage levels for one or more scanning devices to signals provided to processor 210. In one embodiment, application software for interfacing with peripheral devices 222 may store instructions (such as, for example, how to read indicia from a portable device) in a memory device such as, for example, non-volatile memory, hard drive or a flash memory.

In at least one implementation, the gaming machine may include card readers such as used with credit cards, or other identification code reading devices to allow or require player identification in connection with play of the card game and associated recording of game action. Such a user identification interface can be implemented in the form of a variety of magnetic card readers commercially available for reading user-specific identification information. The user-specific information can be provided on specially constructed magnetic cards issued by a casino, or magnetically coded credit cards or debit cards frequently used with national credit organizations such as VISA™, MASTERCARD™, banks and/or other institutions.

The gaming machine may include other types of participant identification mechanisms which may use a fingerprint image, eye blood vessel image reader, or other suitable biological information to confirm identity of the user. Still further it is possible to provide such participant identification information by having the dealer manually code in the information in response to the player indicating his or her code name or real name. Such additional identification could also be used to confirm credit use of a smart card, transponder, and/or player's gaming device.

It will be apparent to those skilled in the art that other memory types, including various computer readable media, may be used for storing and executing program instructions pertaining to the operation of the EGMs described herein. Because such information and program instructions may be employed to implement the systems/methods described herein, example embodiments may relate to machine-readable media that include program instructions, state information, etc. for performing various operations described herein. Examples of machine-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM). Example embodiments may also be embodied in a carrier wave traveling over an appropriate medium such as airwaves, optical lines, electric lines, etc. Examples of program instructions include both machine code, such as produced by a compiler, and files including higher level code that may be executed by the computer using an interpreter.

FIG. 3 shows a diagrammatic representation of a machine in the exemplary form of a client (or end user) computer system 300 within which a set of instructions, for causing the machine to perform any one or more of the methodologies discussed herein, may be executed. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.

The exemplary computer system 300 includes a processor 302 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 304 and a static memory 306, which communicate with each other via a bus 308. The computer system 300 may further include a video display unit 310 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 300 also includes an alphanumeric input device 312 (e.g., a keyboard), a user interface (UI) navigation device 314 (e.g., a mouse), a disk drive unit 316, a signal generation device 318 (e.g., a speaker) and a network interface device 320.

The disk drive unit 316 includes a machine-readable medium 322 on which is stored one or more sets of instructions and data structures (e.g., software 324) embodying or utilized by any one or more of the methodologies or functions described herein. The software 324 may also reside, completely or at least partially, within the main memory 304 and/or within the processor 302 during execution thereof by the computer system 300, the main memory 304 and the processor 302 also constituting machine-readable media.

The software 324 may further be transmitted or received over a network 326 via the network interface device 320 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).

While the machine-readable medium 322 is shown in an exemplary embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals. Although an embodiment of the present invention has been described with reference to specific exemplary embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

According to various embodiments, Client Computer System 300 may include a variety of components, modules and/or systems for providing various types of functionality. For example, in at least one embodiment, Client Computer System 300 may include a web browser application which is operable to process, execute, and/or support the use of scripts (e.g., JavaScript, AJAX, etc.), Plug-ins, executable code, virtual machines, vector-based web animation (e.g., Adobe Flash), etc.

In at least one embodiment, the web browser application may be configured or designed to instantiate components and/or objects at the Client Computer System in response to processing scripts, instructions, and/or other information received from a remote server such as a web server. Examples of such components and/or objects may include, but are not limited to, one or more of the following (or combinations thereof):

    • UI Components such as those illustrated, described, and/or referenced herein.
    • Database Components such as those illustrated, described, and/or referenced herein.
    • Processing Components such as those illustrated, described, and/or referenced herein.
    • Other Components which, for example, may include components for facilitating and/or enabling the Client Computer System to perform and/or initiate various types of operations, activities, functions such as those described herein.

In at least one embodiment, Client Computer System 300 may be configured or designed to include Dynamic Lighting and Rendering functionality for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects, dynamically movable virtual objects, shadow mapping, etc. In at least one embodiment, Client Computer System 300 may include Dynamic Lighting and Rendering Component(s), which, for example, may be configured or designed to facilitate, enable, initiate, and/or perform one or more of the following operation(s), action(s), and/or feature(s) (or combinations thereof):

    • real-time, dynamic adjustment of lighting characteristics of individual light sources within a rendered scene.
    • real-time dynamic adjustment of rendered lighting characteristics of virtual object(s) (e.g., including both static objects and movable objects) within a rendered scene based on the dynamic adjustment of the lighting characteristics of individual light sources within the rendered scene.
    • facilitating real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene.
    • real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene based on the dynamic adjustment of lighting characteristics of individual light sources within the rendered scene.
    • real-time dynamic rendering of lighting and shading characteristics of virtual objects and/or shadows within a given scene based on the lighting characteristics of individual light sources within that scene.
    • real-time properties of reflected light (radiosity) projected onto virtual objects within a given scene based on the lighting characteristics of individual light sources and objects within that scene.
    • real-time adjustment of lighting intensity, color and falloff.

FIG. 4 is a simplified block diagram of an exemplary gaming device 400 in accordance with a specific embodiment. In at least one embodiment, the gaming device may be configured or designed to include hardware components and/or hardware+software components for enabling or implementing at least a portion of the various dynamic lighting and rendering techniques described and/or referenced herein.

According to specific embodiments, various aspects, features, and/or functionalities of the gaming device may be performed, implemented and/or initiated by one or more of the following types of systems, components, systems, devices, procedures, processes, etc. (or combinations thereof): Processor(s) 410; Device Drivers 442; Memory 416; Interface(s) 406; Power Source(s)/Distribution 443; Geolocation module 446; Display(s) 435; I/O Devices 430; Audio/Video device(s) 439; Peripheral Devices 431; Motion Detection module 440; User Identification/Authentication module 447; Client App Component(s) 460; Other Component(s) 468; UI Component(s) 462; Database Component(s) 464; Processing Component(s) 466; Software/Hardware Authentication/Validation 444; Wireless communication module(s) 445; Information Filtering module(s) 449; Operating mode selection component 448; Speech Processing module 454; Scanner/Camera 452; OCR Processing Engine 456; Dynamic Lighting and Rendering Component(s); etc.

As illustrated in the example of FIG. 4, gaming device 400 may include a variety of components, modules and/or systems for providing various types of functionality. For example, as illustrated in FIG. 4, gaming device 400 may include gaming device Application components (e.g., 460), which, for example, may include, but are not limited to, one or more of the following (or combinations thereof):

    • UI Components 462 such as those illustrated, described, and/or referenced herein.
    • Database Components 464 such as those illustrated, described, and/or referenced herein.
    • Processing Components 466 such as those illustrated, described, and/or referenced herein.
    • Other Components 468 which, for example, may include components for facilitating and/or enabling the gaming device to perform and/or initiate various types of operations, activities, functions such as those described herein.

In at least one embodiment, the gaming device Application component(s) may be operable to perform and/or implement various types of functions, operations, actions, and/or other features such as, for example, one or more of those described and/or referenced herein.

According to specific embodiments, multiple instances or threads of the gaming device Application component(s) may be concurrently implemented and/or initiated via the use of one or more processors and/or other combinations of hardware and/or hardware and software. For example, in at least some embodiments, various aspects, features, and/or functionalities of the gaming device Application component(s) may be performed, implemented and/or initiated by one or more of the various systems, components, systems, devices, procedures, processes, etc., described and/or referenced herein.

According to different embodiments, one or more different threads or instances of the gaming device Application component(s) may be initiated in response to detection of one or more conditions or events satisfying one or more different types of minimum threshold criteria for triggering initiation of at least one instance of the gaming device Application component(s). Various examples of conditions or events which may trigger initiation and/or implementation of one or more different threads or instances of the gaming device Application component(s) may include, but are not limited to, one or more of those described and/or referenced herein.

In at least one embodiment, a given instance of the gaming device Application component(s) may access and/or utilize information from one or more associated databases. In at least one embodiment, at least a portion of the database information may be accessed via communication with one or more local and/or remote memory devices. Examples of different types of data which may be accessed by the gaming device Application component(s) may include, but are not limited to, one or more of those described and/or referenced herein.

According to different embodiments, gaming device 400 may further include, but is not limited to, one or more of the following types of components, modules and/or systems (or combinations thereof):

    • At least one processor 410. In at least one embodiment, the processor(s) 410 may include one or more commonly known CPUs which are deployed in many of today's consumer electronic devices, such as, for example, CPUs or processors from the Motorola or Intel family of microprocessors, etc. In an alternative embodiment, at least one processor may be specially designed hardware for controlling the operations of the client system. In a specific embodiment, a memory (such as non-volatile RAM and/or ROM) also forms part of CPU. When acting under the control of appropriate software or firmware, the CPU may be responsible for implementing specific functions associated with the functions of a desired network device. The CPU preferably accomplishes all these functions under the control of software including an operating system, and any appropriate applications software.
    • Memory 416, which, for example, may include volatile memory (e.g., RAM), non-volatile memory (e.g., disk memory, FLASH memory, EPROMs, etc.), unalterable memory, and/or other types of memory. In at least one implementation, the memory 416 may include functionality similar to at least a portion of functionality implemented by one or more commonly known memory devices such as those described herein and/or generally known to one having ordinary skill in the art. According to different embodiments, one or more memories or memory modules (e.g., memory blocks) may be configured or designed to store data, program instructions for the functional operations of the client system and/or other information relating to the functionality of the various dynamic lighting and rendering techniques described herein. The program instructions may control the operation of an operating system and/or one or more applications, for example. The memory or memories may also be configured to store data structures, metadata, timecode synchronization information, audio/visual media content, asset file information, keyword taxonomy information, advertisement information, and/or information/data relating to other features/functions described herein. Because such information and program instructions may be employed to implement at least a portion of the dynamic lighting and rendering techniques described herein, various aspects described herein may be implemented using machine readable media that include program instructions, state information, etc. Examples of machine-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM). Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
    • Interface(s) 406 which, for example, may include wired interfaces and/or wireless interfaces. In at least one implementation, the interface(s) 406 may include functionality similar to at least a portion of functionality implemented by one or more computer system interfaces such as those described herein and/or generally known to one having ordinary skill in the art. For example, in at least one implementation, the wireless communication interface(s) may be configured or designed to communicate with selected electronic game tables, computer systems, remote servers, other wireless devices (e.g., PDAs, cell phones, player tracking transponders, etc.), etc. Such wireless communication may be implemented using one or more wireless interfaces/protocols such as, for example, 802.11 (WiFi), 802.15 (including Bluetooth™), 802.16 (WiMax), 802.22, Cellular standards such as CDMA, CDMA2000, WCDMA, Radio Frequency (e.g., RFID), Infrared, Near Field Magnetics, etc.
    • Device driver(s) 442. In at least one implementation, the device driver(s) 442 may include functionality similar to at least a portion of functionality implemented by one or more computer system driver devices such as those described herein and/or generally known to one having ordinary skill in the art.
    • At least one power source (and/or power distribution source) 443. In at least one implementation, the power source may include at least one mobile power source (e.g., battery) for allowing the client system to operate in a wireless and/or mobile environment. For example, in one implementation, the power source 443 may be implemented using a rechargeable, thin-film type battery. Further, in embodiments where it is desirable for the device to be flexible, the power source 443 may be designed to be flexible.
    • Geolocation module 446 which, for example, may be configured or designed to acquire geolocation information from remote sources and use the acquired geolocation information to determine information relating to a relative and/or absolute position of the client system.
    • Motion detection component 440 for detecting motion or movement of the client system and/or for detecting motion, movement, gestures and/or other input data from user. In at least one embodiment, the motion detection component 440 may include one or more motion detection sensors such as, for example, MEMS (Micro Electro Mechanical System) accelerometers, that can detect the acceleration and/or other movements of the client system as it is moved by a user.
    • User Identification/Authentication module 447. In one implementation, the User Identification module may be adapted to determine and/or authenticate the identity of the current user or owner of the client system. For example, in one embodiment, the current user may be required to perform a log-in process at the client system in order to access one or more features. Alternatively, the client system may be adapted to automatically determine the identity of the current user based upon one or more external signals such as, for example, an RFID tag or badge worn by the current user which provides a wireless signal to the client system for determining the identity of the current user. In at least one implementation, various security features may be incorporated into the client system to prevent unauthorized users from accessing confidential or sensitive information.
    • One or more display(s) 435. According to various embodiments, such display(s) may be implemented using, for example, LCD display technology, OLED display technology, and/or other types of conventional display technology. In at least one implementation, display(s) 435 may be adapted to be flexible or bendable. Additionally, in at least one embodiment the information displayed on display(s) 435 may utilize e-ink technology (such as that available from E Ink Corporation, Cambridge, Mass., www.eink.com), or other suitable technology for reducing the power consumption of information displayed on the display(s) 435.
    • One or more user I/O Device(s) 430 such as, for example, keys, buttons, scroll wheels, cursors, touchscreen sensors, audio command interfaces, magnetic strip reader, optical scanner, etc.
    • Audio/Video device(s) 439 such as, for example, components for displaying audio/visual media which, for example, may include cameras, speakers, microphones, media presentation components, wireless transmitter/receiver devices for enabling wireless audio and/or visual communication between the client system 400 and remote devices (e.g., radios, telephones, computer systems, etc.). For example, in one implementation, the audio system may include componentry for enabling the client system to function as a cell phone or two-way radio device.
    • Other types of peripheral devices 431 which may be useful to the users of various client systems, such as, for example: PDA functionality; memory card reader(s); fingerprint reader(s); image projection device(s); social networking peripheral component(s); etc.
    • Information filtering module(s) 449 which, for example, may be adapted to automatically and dynamically generate, using one or more filter parameters, filtered information to be displayed on one or more displays of the gaming device. In one implementation, such filter parameters may be customizable by the player or user of the device. In some embodiments, information filtering module(s) 449 may also be adapted to display, in real-time, filtered information to the user based upon a variety of criteria such as, for example, geolocation information, casino data information, player tracking information, etc.
    • Wireless communication module(s) 445. In one implementation, the wireless communication module 445 may be configured or designed to communicate with external devices using one or more wireless interfaces/protocols such as, for example, 802.11 (WiFi), 802.15 (including Bluetooth™), 802.16 (WiMax), 802.22, Cellular standards such as CDMA, CDMA2000, WCDMA, Radio Frequency (e.g., RFID), Infrared, Near Field Magnetics, etc.
    • Software/Hardware Authentication/validation components 444 which, for example, may be used for authenticating and/or validating local hardware and/or software components, hardware/software components residing at a remote device, game play information, wager information, user information and/or identity, etc. Examples of various authentication and/or validation components are described in U.S. Pat. No. 6,620,047, titled, “ELECTRONIC GAMING APPARATUS HAVING AUTHENTICATION DATA SETS,” incorporated herein by reference in its entirety for all purposes.
    • Operating mode selection component 448 which, for example, may be operable to automatically select an appropriate mode of operation based on various parameters and/or upon detection of specific events or conditions such as, for example: the gaming device's current location; identity of current user; user input; system override (e.g., emergency condition detected); proximity to other devices belonging to same group or association; proximity to specific objects, regions, zones, etc. Additionally, the gaming device may be operable to automatically update or switch its current operating mode to the selected mode of operation. The gaming device may also be adapted to automatically modify accessibility of user-accessible features and/or information in response to the updating of its current mode of operation.
    • Scanner/Camera Component(s) (e.g., 452) which may be configured or designed for use in scanning identifiers and/or other content from other devices and/or objects such as for example: gaming device displays, computer displays, static displays (e.g., printed on tangible mediums), etc.
    • OCR Processing Engine (e.g., 456) which, for example, may be operable to perform image processing and optical character recognition of images such as those captured by a gaming device camera, for example.
    • Speech Processing module (e.g., 454) which, for example, may be operable to perform speech recognition, and may be operable to perform speech-to-text conversion.
    • Dynamic Lighting and Rendering Component(s) 492 may be configured or designed to provide functionality for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects, dynamically movable virtual objects, shadow mapping, etc. In at least one embodiment, the Dynamic Lighting and Rendering Component(s) 492 may be configured or designed to facilitate, enable, initiate, and/or perform one or more of the following operation(s), action(s), and/or feature(s) (or combinations thereof), a brief illustrative sketch of which follows this list:
      • real-time, dynamic adjustment of lighting characteristics of individual light sources within a rendered scene.
      • real-time dynamic adjustment of rendered lighting characteristics of virtual object(s) (e.g., including both static objects and movable objects) within a rendered scene based on the dynamic adjustment of the lighting characteristics of individual light sources within the rendered scene.
      • facilitating real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene.
      • real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene based on the dynamic adjustment of lighting characteristics of individual light sources within the rendered scene.
      • real-time dynamic rendering of lighting and shading characteristics of virtual objects and/or shadows within a given scene based on the lighting characteristics of individual light sources within that scene.
    • Etc.
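By way of a non-limiting illustration (this sketch is not part of the original specification), the following minimal example shows one way that predefined per-pixel light source influence values could drive real-time shading of a virtual object as the lighting characteristics of individual light sources are adjusted. All names, data layouts, and numeric values are assumptions introduced solely for this example.

```python
# Illustrative sketch only: names and data layout are assumptions, not taken
# from the specification. Each light's color is weighted by a predefined
# per-pixel "influence" value and by the light's current (adjustable) intensity.

from dataclasses import dataclass

@dataclass
class LightSource:
    color: tuple        # RGB, each channel in [0.0, 1.0]
    intensity: float    # dynamically adjustable at runtime

def shade_pixel(base_color, lights, influences):
    """Combine each light's contribution for one pixel of a virtual object.

    base_color  -- the object's unlit RGB color at this pixel
    lights      -- LightSource instances present in the scene
    influences  -- predefined per-pixel influence weight for each light
    """
    r = g = b = 0.0
    for light, influence in zip(lights, influences):
        scale = light.intensity * influence
        r += base_color[0] * light.color[0] * scale
        g += base_color[1] * light.color[1] * scale
        b += base_color[2] * light.color[2] * scale
    # Clamp to the displayable range
    return (min(r, 1.0), min(g, 1.0), min(b, 1.0))

# Example: dimming light 0 at runtime changes the rendered pixel in real time.
lights = [LightSource((1.0, 0.9, 0.8), 1.0), LightSource((0.2, 0.3, 1.0), 0.5)]
pixel_before = shade_pixel((0.6, 0.6, 0.6), lights, influences=[0.7, 0.3])
lights[0].intensity = 0.2  # dynamic, real-time adjustment of a light source
pixel_after = shade_pixel((0.6, 0.6, 0.6), lights, influences=[0.7, 0.3])
```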

According to a specific embodiment, the gaming device may be adapted to implement at least a portion of the features associated with the mobile game service system described in U.S. patent application Ser. No. 10/115,164, which is now U.S. Pat. No. 6,800,029, issued Oct. 5, 2004 (previously incorporated by reference in its entirety). For example, in one embodiment, the gaming device may be comprised of a hand-held game service user interface device (GSUID) and a number of input and output devices. The GSUID is generally comprised of a display screen which may display a number of game service interfaces. These game service interfaces are generated on the display screen by a microprocessor of some type within the GSUID. Examples of a hand-held GSUID which may accommodate the game service interfaces are manufactured by Symbol Technologies, Incorporated of Holtsville, N.Y.

The game service interfaces may be used to provide a variety of game service transactions and gaming operations services. The game service interfaces may include a login interface, an input/output interface, a transaction reconciliation interface, a ticket validation interface, a prize services interface, a food services interface, an accommodation services interface, a gaming operations interface, a multi-game/multi-denomination meter data transfer interface, etc. Each interface may be accessed via a main menu with a number of sub-menus that allow a game service representative to access the different display screens relating to the particular interface. Using the different display screens within a particular interface, the game service representative may perform various operations needed to provide a particular game service. For example, the login interface may allow the game service representative to enter a user identification of some type and verify the user identification with a password. When the display screen is a touch screen, the user may enter the user/operator identification information on a display screen comprising the login interface using the input stylus and/or using the input buttons. Using a menu on the display screen of the login interface, the user may select other display screens relating to the login and registration process. For example, another display screen obtained via a menu on a display screen in the login interface may allow the GSUID to scan a fingerprint of the game service representative for identification purposes or scan the fingerprint of a game player.

The user identification information and user validation information may allow the game service representative to access all or some subset of the available game service interfaces available on the GSUID. For example, certain users, after logging into the GSUID (e.g., entering a user identification and valid user identification information), may be able to access a variety of different interfaces, such as, for example, one or more of: input/output interface, communication interface, food services interface, accommodation services interface, prize service interface, gaming operation services interface, transaction reconciliation interface, voice communication interface, gaming device performance or metering data transfer interface, etc.; and perform a variety of services enabled by such interfaces. Other users may only be able to access the award ticket validation interface and perform EZ pay ticket validations. The GSUID may also output game service transaction information to a number of different devices (e.g., card reader, printer, storage devices, gaming machines and remote transaction servers, etc.).
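As a non-limiting illustration (not part of the original specification), the following sketch shows one way a validated user identification could gate which game service interfaces the GSUID exposes; the role names, interface identifiers, and directory structure are assumptions introduced solely for this example.

```python
# Illustrative sketch only: role names and interface identifiers are
# assumptions, not taken from the specification. A validated login unlocks
# only the subset of game service interfaces assigned to the user's role.

INTERFACES_BY_ROLE = {
    "supervisor": {
        "input_output", "communication", "food_services",
        "accommodation_services", "prize_services", "gaming_operations",
        "transaction_reconciliation", "voice_communication",
        "metering_data_transfer",
    },
    "ticket_runner": {"award_ticket_validation"},
}

def accessible_interfaces(user_id, password, directory):
    """Return the interfaces available to a user after login validation.

    directory maps user_id -> (password, role); an empty set means the
    login failed and no game service interfaces are unlocked.
    """
    record = directory.get(user_id)
    if record is None or record[0] != password:
        return set()
    return INTERFACES_BY_ROLE.get(record[1], set())

# Example: a supervisor sees the full menu, a ticket runner sees only
# ticket validation, and a bad password unlocks nothing.
directory = {"gsr42": ("s3cret", "supervisor"), "gsr77": ("pass", "ticket_runner")}
print(accessible_interfaces("gsr42", "s3cret", directory))
print(accessible_interfaces("gsr77", "pass", directory))
print(accessible_interfaces("gsr42", "wrong", directory))
```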

In addition to the features described above, various embodiments of gaming devices described herein may also include additional functionality for displaying, in real-time, filtered information to the user based upon a variety of criteria such as, for example, geolocation information, casino data information, player tracking information, etc.

FIG. 5 illustrates an example embodiment of a Server System 580 which may be used for implementing various aspects/features described herein. In at least one embodiment, the Server System 580 includes at least one network device 560, and at least one storage device 570 (such as, for example, a direct attached storage device). In one embodiment, Server System 580 may be suitable for implementing at least some of the dynamic lighting and rendering techniques described herein.

According to one embodiment, network device 560 may include a master central processing unit (CPU) 562, interfaces 568, and a bus 567 (e.g., a PCI bus). When acting under the control of appropriate software or firmware, the CPU 562 may be responsible for implementing specific functions associated with the functions of a desired network device. For example, when configured as a server, the CPU 562 may be responsible for analyzing packets; encapsulating packets; forwarding packets to appropriate network devices; instantiating various types of virtual machines, virtual interfaces, virtual storage volumes, virtual appliances; etc. The CPU 562 preferably accomplishes at least a portion of these functions under the control of software including an operating system (e.g., Linux), and any appropriate system software (such as, for example, AppLogic™ software).

CPU 562 may include one or more processors 563 such as, for example, one or more processors from the AMD, Motorola, Intel and/or MIPS families of microprocessors. In an alternative embodiment, processor 563 may be specially designed hardware for controlling the operations of Server System 580. In a specific embodiment, a memory 561 (such as non-volatile RAM and/or ROM) also forms part of CPU 562. However, there may be many different ways in which memory could be coupled to the system. Memory block 561 may be used for a variety of purposes such as, for example, caching and/or storing data, programming instructions, etc.

The interfaces 568 may typically be provided as interface cards (sometimes referred to as “line cards”). Alternatively, one or more of the interfaces 568 may be provided as on-board interface controllers built into the system motherboard. Generally, they control the sending and receiving of data packets over the network and sometimes support other peripherals used with the Server System 580. Among the interfaces that may be provided are FC interfaces, Ethernet interfaces, frame relay interfaces, cable interfaces, DSL interfaces, token ring interfaces, Infiniband interfaces, and the like. In addition, various very high-speed interfaces may be provided, such as fast Ethernet interfaces, Gigabit Ethernet interfaces, ATM interfaces, HSSI interfaces, POS interfaces, FDDI interfaces, ASI interfaces, DHEI interfaces and the like. Other interfaces may include one or more wireless interfaces such as, for example, 802.11 (WiFi) interfaces, 802.15 interfaces (including Bluetooth™), 802.16 (WiMax) interfaces, 802.22 interfaces, Cellular standards such as CDMA interfaces, CDMA2000 interfaces, WCDMA interfaces, TDMA interfaces, Cellular 3G interfaces, etc.

Generally, one or more interfaces may include ports appropriate for communication with the appropriate media. In some cases, they may also include an independent processor and, in some instances, volatile RAM. The independent processors may control such communications intensive tasks as packet switching, media control and management. By providing separate processors for the communications intensive tasks, these interfaces allow the master microprocessor 562 to efficiently perform routing computations, network diagnostics, security functions, etc.

In at least one embodiment, some interfaces may be configured or designed to allow the Server System 580 to communicate with other network devices associated with various local area networks (LANs) and/or wide area networks (WANs). Other interfaces may be configured or designed to allow network device 560 to communicate with one or more direct attached storage device(s) 570.

Although the system shown in FIG. 5 illustrates one specific network device described herein, it is by no means the only network device architecture on which one or more embodiments can be implemented. For example, an architecture having a single processor that handles communications as well as routing computations, etc. may be used. Further, other types of interfaces and media could also be used with the network device.

Regardless of the network device's configuration, it may employ one or more memories or memory modules (such as, for example, memory block 565, which, for example, may include random access memory (RAM)) configured to store data, program instructions for the general-purpose network operations and/or other information relating to the functionality of the various dynamic lighting and rendering techniques described herein. The program instructions may control the operation of an operating system and/or one or more applications, for example. The memory or memories may also be configured to store data structures, and/or other specific non-program information described herein.

Because such information and program instructions may be employed to implement the systems/methods described herein, one or more embodiments relate to machine readable media that include program instructions, state information, etc. for performing various operations described herein. Examples of machine-readable storage media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media such as floptical disks; and hardware devices that may be specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM). Some embodiments may also be embodied in transmission media such as, for example, a carrier wave travelling over an appropriate medium such as airwaves, optical lines, electric lines, etc. Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.

FIG. 6 illustrates an example of a functional block diagram of a Server System 600 in accordance with a specific embodiment. In at least one embodiment, the Server System 600 may be operable to perform and/or implement various types of functions, operations, actions, and/or other features such as, for example, one or more of those illustrated, described, and/or referenced herein.

In at least one embodiment, the Server System may include a plurality of components operable to perform and/or implement various types of functions, operations, actions, and/or other features such as, for example, one or more of the following (or combinations thereof):

    • Context Interpreter (e.g., 602) which, for example, may be operable to automatically and/or dynamically analyze contextual criteria relating to one or more detected event(s) and/or condition(s), and automatically determine or identify one or more contextually appropriate response(s) based on the contextual interpretation of the detected event(s)/condition(s). According to different embodiments, examples of contextual criteria which may be analyzed may include, but are not limited to, one or more of the following (or combinations thereof): location-based criteria (e.g., geolocation of client device, geolocation of agent device, etc.); time-based criteria; identity of Client user; identity of Agent user; user profile information; transaction history information; recent user activities; proximate business-related criteria (e.g., criteria which may be used to determine whether the client device is currently located at or near a recognized business establishment such as a bank, gas station, restaurant, supermarket, etc.); etc.
    • Time Synchronization Engine (e.g., 604) which, for example, may be operable to manage universal time synchronization (e.g., via NTP and/or GPS).
    • Search Engine (e.g., 628) which, for example, may be operable to search for transactions, logs, items, accounts, options in the TIS databases.
    • Configuration Engine (e.g., 632) which, for example, may be operable to determine and handle configuration of various customized configuration parameters for one or more devices, component(s), system(s), process(es), etc.
    • Time Interpreter (e.g., 618) which, for example, may be operable to automatically and/or dynamically modify or change identifier activation and expiration time(s) based on various criteria such as, for example, time, location, transaction status, etc.
    • Authentication/Validation Component(s) (e.g., 647) (password, software/hardware info, SSL certificates) which, for example, may be operable to perform various types of authentication/validation tasks such as, for example, one or more of the following (or combinations thereof): verifying/authenticating devices; verifying passwords, passcodes, SSL certificates, biometric identification information, and/or other types of security-related information; verifying/validating activation and/or expiration times; etc. In one implementation, the Authentication/Validation Component(s) may be adapted to determine and/or authenticate the identity of the current user or owner of the mobile client system. For example, in one embodiment, the current user may be required to perform a log in process at the mobile client system in order to access one or more features. In some embodiments, the mobile client system may include biometric security components which may be operable to validate and/or authenticate the identity of a user by reading or scanning the user's biometric information (e.g., fingerprints, face, voice, eye/iris, etc.). In at least one implementation, various security features may be incorporated into the mobile client system to prevent unauthorized users from accessing confidential or sensitive information.
    • Transaction Processing Engine (e.g., 622) which, for example, may be operable to handle various types of transaction processing tasks such as, for example, one or more of the following (or combinations thereof): identifying/determining transaction type; determining which payment gateway(s) to use; associating databases information to identifiers; etc.
    • OCR Processing Engine (e.g., 634) which, for example, may be operable to perform image processing and optical character recognition of images such as those captured by a gaming device camera, for example.
    • Database Manager (e.g., 626) which, for example, may be operable to handle various types of tasks relating to database updating, database management, database access, etc. In at least one embodiment, the Database Manager may be operable to manage TISS databases, gaming device Application databases, etc.
    • Log Component(s) (e.g., 610) which, for example, may be operable to generate and manage transactions history logs, system errors, connections from APIs, etc.
    • Status Tracking Component(s) (e.g., 612) which, for example, may be operable to automatically and/or dynamically determine, assign, and/or report updated transaction status information based, for example, on the state of the transaction. In at least one embodiment, the status of a given transaction may be reported as one or more of the following (or combinations thereof): Completed, Incomplete, Pending, Invalid, Error, Declined, Accepted, etc.
    • Gateway Component(s) (e.g., 614) which, for example, may be operable to facilitate and manage communications and transactions with external Payment Gateways.
    • Web Interface Component(s) (e.g., 608) which, for example, may be operable to facilitate and manage communications and transactions with TIS web portal(s).
    • API Interface(s) to Server System(s) (e.g., 646) which, for example, may be operable to facilitate and manage communications and transactions with other Server System(s).
    • API Interface(s) to 3rd Party Server System(s) (e.g., 648) which, for example, may be operable to facilitate and manage communications and transactions with 3rd Party Server System(s).
    • At least one processor 610. In at least one embodiment, the processor(s) 610 may include one or more commonly known CPUs which are deployed in many of today's consumer electronic devices, such as, for example, CPUs or processors from the Motorola or Intel family of microprocessors, etc. In an alternative embodiment, at least one processor may be specially designed hardware for controlling the operations of the mobile client system. In a specific embodiment, a memory (such as non-volatile RAM and/or ROM) also forms part of the CPU. When acting under the control of appropriate software or firmware, the CPU may be responsible for implementing specific functions associated with the functions of a desired network device. The CPU preferably accomplishes all these functions under the control of software including an operating system, and any appropriate applications software.
    • Memory 616, which, for example, may include volatile memory (e.g., RAM), non-volatile memory (e.g., disk memory, FLASH memory, EPROMs, etc.), unalterable memory, and/or other types of memory. In at least one implementation, the memory 616 may include functionality similar to at least a portion of functionality implemented by one or more commonly known memory devices such as those described herein and/or generally known to one having ordinary skill in the art. According to different embodiments, one or more memories or memory modules (e.g., memory blocks) may be configured or designed to store data, program instructions for the functional operations of the mobile client system and/or other information relating to the functionality of the various Mobile Transaction techniques described herein. The program instructions may control the operation of an operating system and/or one or more applications, for example. The memory or memories may also be configured to store data structures, metadata, identifier information/images, and/or information/data relating to other features/functions described herein. Because such information and program instructions may be employed to implement at least a portion of the Gaming Network techniques described herein, various aspects described herein may be implemented using machine readable media that include program instructions, state information, etc. Examples of machine-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM). Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
    • Interface(s) 606 which, for example, may include wired interfaces and/or wireless interfaces. In at least one implementation, the interface(s) 606 may include functionality similar to at least a portion of functionality implemented by one or more computer system interfaces such as those described herein and/or generally known to one having ordinary skill in the art.
    • Device driver(s) 642. In at least one implementation, the device driver(s) 642 may include functionality similar to at least a portion of functionality implemented by one or more computer system driver devices such as those described herein and/or generally known to one having ordinary skill in the art.
    • One or more display(s) 635. According to various embodiments, such display(s) may be implemented using, for example, LCD display technology, OLED display technology, and/or other types of conventional display technology. In at least one implementation, display(s) 635 may be adapted to be flexible or bendable. Additionally, in at least one embodiment the information displayed on display(s) 635 may utilize e-ink technology (such as that available from E Ink Corporation, Cambridge, Mass., www.eink.com), or other suitable technology for reducing the power consumption of information displayed on the display(s) 635.
    • Email Server Component(s) 636, which, for example, may be configured or designed to provide various functions and operations relating to email activities and communications.
    • Web Server Component(s) 637, which, for example, may be configured or designed to provide various functions and operations relating to web server activities and communications.
    • Messaging Server Component(s) 638, which, for example, may be configured or designed to provide various functions and operations relating to text messaging and/or other social network messaging activities and/or communications.
    • Dynamic Lighting and Rendering Component(s) 692, which, for example, may be configured or designed to provide functionality for facilitating dynamic, real-time adjustment and rendering of lighting, shading, and/or other display characteristics associated with stationary virtual objects, dynamically movable virtual objects, shadow mapping, etc. In at least one embodiment, the Dynamic Lighting and Rendering Component(s) 692 may be configured or designed to facilitate, enable, initiate, and/or perform one or more of the following operation(s), action(s), and/or feature(s) (or combinations thereof), a brief illustrative sketch of which follows this list:
      • real-time, dynamic adjustment of lighting characteristics of individual light sources within a rendered scene.
      • real-time dynamic adjustment of rendered lighting characteristics of virtual object(s) (e.g., including both static objects and movable objects) within a rendered scene based on the dynamic adjustment of the lighting characteristics of individual light sources within the rendered scene.
      • facilitating real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene.
      • real-time dynamic rendering and shading of virtual object(s) (e.g., including both static objects and movable objects) and/or shadows within a rendered scene based on the dynamic adjustment of lighting characteristics of individual light sources within the rendered scene.
      • real-time dynamic rendering of lighting and shading characteristics of virtual objects and/or shadows within a given scene based on the lighting characteristics of individual light sources within that scene.
      • real-time rendering of reflected light (radiosity) properties projected onto virtual objects within a given scene based on the lighting characteristics of individual light sources and objects within that scene.
      • real-time adjustment of lighting intensity, color and falloff.
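As a non-limiting illustration (not part of the original specification), the following sketch shows one way real-time adjustment of a light source's intensity, color, and falloff could be applied when computing that light's contribution at a surface point; the function name, parameters, and falloff model are assumptions introduced solely for this example.

```python
# Illustrative sketch only: function and parameter names are assumptions.
# It combines a simple distance falloff with a dynamically adjustable
# intensity and color, per the bullet list above.

import math

def attenuated_contribution(light_pos, light_color, light_intensity,
                            falloff_radius, surface_pos):
    """Return the RGB contribution of one light at a surface point.

    light_intensity, light_color, and falloff_radius can all be changed
    frame-to-frame to realize the real-time adjustments described above.
    """
    dx = surface_pos[0] - light_pos[0]
    dy = surface_pos[1] - light_pos[1]
    dz = surface_pos[2] - light_pos[2]
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Simple linear falloff: full strength at the light, zero at falloff_radius
    falloff = max(0.0, 1.0 - distance / falloff_radius)
    scale = light_intensity * falloff
    return tuple(c * scale for c in light_color)

# Example: the same surface point lit before and after the light is dimmed
# and its falloff radius tightened during play.
p = (2.0, 0.0, 1.0)
before = attenuated_contribution((0.0, 3.0, 0.0), (1.0, 0.85, 0.7), 1.0, 10.0, p)
after = attenuated_contribution((0.0, 3.0, 0.0), (1.0, 0.85, 0.7), 0.4, 5.0, p)
```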

Although several example embodiments of one or more aspects and/or features have been described in detail herein with reference to the accompanying drawings, it is to be understood that aspects and/or features are not limited to these precise embodiments, and that various changes and modifications may be effected therein by one skilled in the art without departing from the scope or spirit of the invention(s) as defined, for example, in the appended claims.

Claims

1. A gaming system in a gaming network, comprising:

a gaming controller;
memory;
a first display;
at least one interface for communicating with at least one other device in the gaming network;
the gaming system being operable to:
initiate a first active gaming session at the gaming system;
identify a first virtual scene to be rendered for display during the first active gaming session;
identify a first virtual light source associated with the first virtual scene, the first virtual light source having associated therewith a first portion of lighting characteristics;
dynamically set the first portion of lighting characteristics to be in accordance with a first set of values;
dynamically render and display, during the first active gaming session and using the first portion of lighting characteristics, a first virtual object of the first scene, wherein the first rendered virtual object is displayed in accordance with a first set of rendered lighting characteristics relating to a first visual appearance of the first rendered virtual object;
dynamically modify the first portion of lighting characteristics to be in accordance with a second set of values; and
dynamically adjust the visual appearance of the first rendered virtual object in response to the dynamic modification of the first portion of lighting characteristics, wherein the dynamic adjustment of the visual appearance of the first rendered virtual object includes dynamically calculating, using the second set of values, a modified set of rendered lighting characteristics relating to a modified visual appearance of the first rendered virtual object.

2. The system of claim 1 being further operable to:

identify a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
dynamically render and display, during the first active gaming session and using the first portion of lighting characteristics and second portion of lighting characteristics, the first virtual object of the first scene, wherein the first rendered virtual object is displayed in accordance with a third set of rendered lighting characteristics relating to a third visual appearance of the first rendered virtual object;
dynamically modify the second portion of lighting characteristics to be in accordance with a modified set of values; and
dynamically adjust the visual appearance of the first rendered virtual object in response to the dynamic modification of the second portion of lighting characteristics, wherein the dynamic adjustment of the visual appearance of the first rendered virtual object includes dynamically calculating, using the modified set of values, a second modified set of rendered lighting characteristics relating to a second modified visual appearance of the first rendered virtual object.

3. The system of claim 1 being further operable to:

identify a first pixel associated with the first virtual object;
identify first light source influence criteria relating to the first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel.

4. The system of claim 1 being further operable to:

identify a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
identify a first pixel associated with the first virtual object;
identify first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel;
identify second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel; and
dynamically calculate, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a visual appearance of the first pixel.

5. The system of claim 1 being further operable to:

identify a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
identify a first pixel associated with the first virtual object;
identify first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel;
identify second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel; and
dynamically calculate, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a composite light intensity of the first pixel.

6. The system of claim 1 being further operable to:

identify a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
identify a first pixel associated with the first virtual object;
identify first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel;
identify second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel;
identify a first current color profile associated with the first virtual light source;
identify a second current color profile associated with the second virtual light source;
dynamically calculate, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a composite light intensity of the first pixel; and
dynamically determine, during rendering of the first virtual object, at least one color characteristic of the first pixel using the set of first pixel lighting characteristics and using the first and second current color profiles.

7. A computer implemented method for operating a gaming system in a gaming network, the method comprising:

identifying a first virtual scene to be rendered for display during a first active gaming session at the gaming system;
identifying a first virtual light source associated with the first virtual scene, the first virtual light source having associated therewith a first portion of lighting characteristics;
dynamically setting the first portion of lighting characteristics to be in accordance with a first set of values;
dynamically rendering and displaying, during the first active gaming session and using the first portion of lighting characteristics, a first virtual object of the first scene, wherein the first rendered virtual object is displayed in accordance with a first set of rendered lighting characteristics relating to a first visual appearance of the first rendered virtual object;
dynamically modifying the first portion of lighting characteristics to be in accordance with a second set of values; and
dynamically adjusting the visual appearance of the first rendered virtual object in response to the dynamic modification of the first portion of lighting characteristics, wherein the dynamic adjustment of the visual appearance of the first rendered virtual object includes dynamically calculating, using the second set of values, a modified set of rendered lighting characteristics relating to a modified visual appearance of the first rendered virtual object.

8. The method of claim 7 further comprising:

identifying a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
dynamically rendering and displaying, during the first active gaming session and using the first portion of lighting characteristics and second portion of lighting characteristics, the first virtual object of the first scene, wherein the first rendered virtual object is displayed in accordance with a third set of rendered lighting characteristics relating to a third visual appearance of the first rendered virtual object;
dynamically modifying the second portion of lighting characteristics to be in accordance with a modified set of values; and
dynamically adjusting the visual appearance of the first rendered virtual object in response to the dynamic modification of the second portion of lighting characteristics, wherein the dynamic adjustment of the visual appearance of the first rendered virtual object includes dynamically calculating, using the modified set of values, a second modified set of rendered lighting characteristics relating to a second modified visual appearance of the first rendered virtual object.

9. The method of claim 7 further comprising:

identifying a first pixel associated with the first virtual object;
identifying first light source influence criteria relating to the first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel.

10. The method of claim 7 further comprising:

identifying a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
identifying a first pixel associated with the first virtual object;
identifying first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel;
identifying second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel; and
dynamically calculating, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a visual appearance of the first pixel.

11. The method of claim 7 further comprising:

identifying a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
identifying a first pixel associated with the first virtual object;
identifying first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel;
identifying second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel; and
dynamically calculating, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a composite light intensity of the first pixel.

12. The method of claim 7 further comprising:

identifying a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
identifying a first pixel associated with the first virtual object;
identifying first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel;
identifying second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel;
identifying a first current color profile associated with the first virtual light source;
identifying a second current color profile associated with the second virtual light source;
dynamically calculating, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a composite light intensity of the first pixel; and
dynamically determining, during rendering of the first virtual object, at least one color characteristic of the first pixel using the set of first pixel lighting characteristics and using the first and second current color profiles.

13. A computer program product for operating a gaming system in a gaming network, the computer program product comprising:

a computer usable medium having computer readable code embodied therein, the computer readable code comprising:
computer code for identifying a first virtual scene to be rendered for display during a first active gaming session at the gaming system;
computer code for identifying a first virtual light source associated with the first virtual scene, the first virtual light source having associated therewith a first portion of lighting characteristics;
computer code for dynamically setting the first portion of lighting characteristics to be in accordance with a first set of values;
computer code for dynamically rendering and displaying, during the first active gaming session and using the first portion of lighting characteristics, a first virtual object of the first scene, wherein the first rendered virtual object is displayed in accordance with a first set of rendered lighting characteristics relating to a first visual appearance of the first rendered virtual object;
computer code for dynamically modifying the first portion of lighting characteristics to be in accordance with a second set of values; and
computer code for dynamically adjusting the visual appearance of the first rendered virtual object in response to the dynamic modification of the first portion of lighting characteristics, wherein the dynamic adjustment of the visual appearance of the first rendered virtual object includes dynamically calculating, using the second set of values, a modified set of rendered lighting characteristics relating to a modified visual appearance of the first rendered virtual object.

14. The computer program product of claim 13 further comprising:

computer code for identifying a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
computer code for dynamically rendering and displaying, during the first active gaming session and using the first portion of lighting characteristics and second portion of lighting characteristics, the first virtual object of the first scene, wherein the first rendered virtual object is displayed in accordance with a third set of rendered lighting characteristics relating to a third visual appearance of the first rendered virtual object;
computer code for dynamically modifying the second portion of lighting characteristics to be in accordance with a modified set of values; and
computer code for dynamically adjusting the visual appearance of the first rendered virtual object in response to the dynamic modification of the second portion of lighting characteristics, wherein the dynamic adjustment of the visual appearance of the first rendered virtual object includes dynamically calculating, using the modified set of values, a second modified set of rendered lighting characteristics relating to a second modified visual appearance of the first rendered virtual object.

15. The computer program product of claim 13 further comprising:

computer code for identifying a first pixel associated with the first virtual object;
computer code for identifying first light source influence criteria relating to the first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel.

16. The computer program product of claim 13 further comprising:

computer code for identifying a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
computer code for identifying a first pixel associated with the first virtual object;
computer code for identifying first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel;
computer code for identifying second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel; and
computer code for dynamically calculating, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a visual appearance of the first pixel.

17. The computer program product of claim 13 further comprising:

computer code for identifying a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
computer code for identifying a first pixel associated with the first virtual object;
computer code for identifying first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel;
computer code for identifying second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel; and
computer code for dynamically calculating, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a composite light intensity of the first pixel.

18. The computer program product of claim 13 further comprising:

computer code for identifying a second virtual light source associated with the first virtual scene, the second virtual light source having associated therewith a second portion of lighting characteristics;
computer code for identifying a first pixel associated with the first virtual object;
computer code for identifying first light source influence criteria relating to the identified first pixel, wherein the first light source influence criteria characterizes a first amount of light intensity or light influence which the first virtual light source has over the first pixel;
computer code for identifying second light source influence criteria relating to the first pixel, wherein the second light source influence criteria characterizes a second amount of light intensity or light influence which the second virtual light source has over the first pixel;
computer code for identifying a first current color profile associated with the first virtual light source;
computer code for identifying a second current color profile associated with the second virtual light source;
computer code for dynamically calculating, during rendering of the first virtual object and using the first and second light source influence criteria, a set of first pixel lighting characteristics relating to a composite light intensity of the first pixel; and
computer code for dynamically determining, during rendering of the first virtual object, at least one color characteristic of the first pixel using the set of first pixel lighting characteristics and using the first and second current color profiles.
Patent History
Publication number: 20130005458
Type: Application
Filed: Jul 2, 2012
Publication Date: Jan 3, 2013
Applicant: 3G STUDIOS, INC. (Reno, NV)
Inventors: JAMES PETER KOSTA (Gardnerville, NV), Dylan S. Petty (Reno, NV)
Application Number: 13/540,581
Classifications
Current U.S. Class: Visual (e.g., Enhanced Graphics, Etc.) (463/31)
International Classification: A63F 13/00 (20060101);