Bounding plane-based techniques for improved sample test efficiency in image rendering
A method for reducing the number of samples tested for rendering a screen space region of an image includes constructing a trajectory of a primitive in a three dimensional coordinate system, the coordinate system including a screen space dimension, a lens dimension and a time dimension. A bounding volume is constructed for a screen space region which is to be rendered, the bounding volume overlapping a portion of the screen space region. The bounding volume is defined according to a plurality of bounding planes which extend in the three dimensional coordinate system, whereby the bounding planes are determined as a function of the trajectory of the primitive. One or more sample points which are located within the screen space region, and which are not overlapped by the bounding volume are excluded from testing.
This application is concurrently filed with the following commonly-owned patent applications, each of which is incorporated by reference in its entirety for all purposes:
- U.S. patent application Ser. No. 13/168,765, filed Jun. 24, 2011, entitled “System and Method for Improved Sample Test Efficiency in Image Rendering,” and
- U.S. patent application Ser. No. 13/168,771, filed Jun. 24, 2011, entitled “Bounding Box-Based Techniques for Improved Sample Test Efficiency in Image Rendering.”
The present invention relates to image rendering, and more particularly to improving sample test efficiency in image rendering applications.
The rendering of a high quality image relies upon an accurate color computation for each pixel forming the image. The accuracy of this color computation is improved by distributing sample points across each pixel, testing which sample points are overlapped by a primitive which is to be rendered in the image, and computing a color for the pixel based upon those overlapped and non-overlapped sample points.
Sample testing algorithms (sometimes referred to as “point in polygon tests”) determine which samples of a screen space region (usually a pixel) are overlapped by a primitive. The quality of such algorithms can be measured by their “sample test efficiency” (STE), the ratio of the number of sample points overlapped by a primitive to the number of sample points tested for a given screen space region, e.g., a pixel. A high STE indicates an efficient sample testing algorithm, as a high percentage of the tested sample points were actually or possibly overlapped by the primitive.
Techniques for improving STE are useful in the contexts of motion blur and depth of field rendering effects, as both types of effects involve a primitive potentially traversing a large number of pixels, resulting in a potentially large number of sample points which have to be considered for testing.
Motion blur results when the camera and/or geometry move while the virtual camera shutter is open. While the motion can theoretically be arbitrary during the exposure of a frame, it has been observed in the film industry that vertex motion can often be satisfactorily simplified by assuming linear motion between shutter open (t=0) and shutter close (t=1).
In stochastic rasterization, the frame buffer is generalized so that each sample has additional properties in addition to the screen-space (x,y) position. In order to support motion blur, a time value is assigned to each frame buffer sample. In absence of motion, the frame buffer behaves exactly as it does currently, providing spatial antialiasing. With motion, a sample is updated only when a triangle overlaps the sample at the time of the sample.
The prior art describes several ways of interpolating a triangle to a specified time. One approach is described in “The Accumulation Buffer: Hardware Support for High Quality Rendering,” P. Haeberli and K. Akeley, Proc. SIGGRAPH 1990, pp. 309-318, and in “Data-Parallel Rasterization of Micropolygons with Defocus and Motion Blur,” K. Fatahalian, E. Luong, S. Boulos, K. Akeley, W. Mark, and P. Hanrahan, Proc. High Performance Graphics 2009. This approach involves interpolating the vertices of a primitive in homogeneous clip space before triangle setup, and therefore a separate triangle setup/rendering pass is required for each distinct time. While simple to implement, this approach may not scale to a large number of samples per pixel, and the image quality can suffer due to a fixed (typically small) set of unique time values.
A second conventional approach has been to identify the screen-space bounds of the “time-continuous triangle” (TCT) for the entire exposure time, and then test all samples in all covered pixels by interpolating the triangle to the current sample's time, as described in “Stochastic rasterization using time-continuous triangles,” T. Akenine-Möller, J. Munkberg, and J. Hasselgren, Proc. Graphics Hardware 2009. Possible implementations include at least time-continuous edge functions (about 3× the cost of traditional 2D edges) and ray-triangle intersection. TCTs offer high image quality because a unique time value can be assigned to each sample, but an accompanying disadvantage is low STE. When a triangle moves quickly, it can cover a relatively large region on the screen, yet at the same time it is expected to cover an approximately constant number of samples regardless of motion. STE therefore degrades drastically for fast motion, and can be as low as 1% in realistic cases.
A third approach is described in U.S. Pat. No. 4,897,806, whereby exposure time is split into several strata (typically, the number of strata equals the number of samples per pixel), and the above-mentioned second approach is invoked for each stratum. This improves STE significantly, but the efficiency of the solution is not optimal for the low sampling densities typically encountered in real-time graphics (4-16 samples/pixel).
In view of the shortcomings of the conventional approaches, a new method for providing improved sample test efficiency in image rendering is needed.
SUMMARY
A system, method, and computer program product for reducing the number of samples tested for rendering a screen space region of an image is presented herein. The method includes constructing a trajectory of a primitive in a three dimensional coordinate system, the coordinate system including a screen space dimension, a lens dimension and a time dimension. A bounding volume is constructed for a screen space region which is to be rendered, the bounding volume overlapping a portion of the screen space region. The bounding volume is defined according to a plurality of bounding planes which extend in the three dimensional coordinate system, whereby the bounding planes are determined as a function of the trajectory of the primitive. One or more sample points which are located within the screen space region, and which are not overlapped by the bounding volume, are excluded from testing.
The foregoing method finds particular application in the rendering of images, an exemplary method of which includes the aforementioned operations, and the additional operations of identifying a screen space region which is to be rendered, testing sample points which are located within the screen space region and which are overlapped by the bounding volume of the primitive, and rendering the screen space region based upon the tested sample points.
These and other features of the invention will be better understood in view of the following drawings and detailed description of exemplary embodiments.
For clarity, previously described features retain their reference indices in subsequent figures.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
Current hardware rasterizers rely on approximations of a pinhole camera and infinitesimal exposure time for efficiency, and accordingly motion blur and depth of field effects are missing from the rendered images. If a finite aperture and exposure time are assumed, the screen-space vertex positions become dependent on the amount of motion during the exposure time as well as the amount of defocus blur. The latter is a simple function of depth, and its shape is typically a circle. An exemplary model is
VertexPosition(t,u,v) = VertexPosition(t=0,u=0,v=0) + MotionVector*t + CircleOfConfusionRadius*(u,v)  eq. (1)
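The affine model of eq. (1) can be sketched as follows. This is an illustrative reading of the equation only; the names `Vertex` and `position_at`, and the flat field layout, are assumptions rather than anything from the source.

```python
# Sketch of the vertex position model of eq. (1): screen-space position
# is affine in time t and lens coordinates (u, v).
from dataclasses import dataclass

@dataclass
class Vertex:
    x0: float   # screen-space x at t=0, u=0, v=0
    y0: float   # screen-space y at t=0, u=0, v=0
    mvx: float  # motion vector, x component (per unit shutter time)
    mvy: float  # motion vector, y component
    coc: float  # circle-of-confusion radius (defocus blur)

def position_at(vtx: Vertex, t: float, u: float, v: float):
    """Screen-space (x, y) for shutter time t in [0,1] and lens position (u, v)."""
    x = vtx.x0 + vtx.mvx * t + vtx.coc * u
    y = vtx.y0 + vtx.mvy * t + vtx.coc * v
    return (x, y)
```

Because both coordinates are affine in (t, u) and (t, v) respectively, the image of a vertex in the (t,u,x) or (t,v,y) coordinate systems discussed below is a plane.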
The typical approach for rasterizing triangles defined by three such vertices consists of two stages. At the first stage, all pixels that may be covered for some combination of lens position and time during exposure are identified. This is typically implemented by computing minimum and maximum x and y bounds from the equation above, for valid ranges of t, u and v. In certain cases it can be beneficial to determine a tighter but more complicated bounding shape, especially when motion blur is much stronger than defocus blur.
At the second stage, the pixels that fall within the bounding shape are identified, and the sample points included therein are tested for coverage. Each sample has a predefined (u,v,t) in addition to the traditional (x,y) subpixel offset. The actual coverage test can be implemented in various ways, for example using a ray-triangle intersection test, or high-dimensional edge functions in a pixel shader. These tests are computationally expensive to execute, and thus it is advantageous to exclude as many samples as possible without compromising correctness. A triangle can be expected to cover roughly a constant number of samples regardless of the amount of motion or defocus, and yet the screen-space bounds can grow significantly due to motion and defocus, leading to a significant amount of redundant work and low sample test efficiency (STE).
The commonly-owned, concurrently filed U.S. patent application Ser. No. 13/168,765, filed Jun. 24, 2011, entitled “System and Method for Improved Sample Test Efficiency in Image Rendering” and U.S. patent application Ser. No. 13/168,771, filed Jun. 24, 2011, entitled “Bounding Box-Based Techniques for Improved Sample Test Efficiency in Image Rendering” each present methods for computing t-, u- and v-bounds [tmin,tmax],[umin,umax] and [vmin,vmax] for a particular screen space region, e.g., a pixel/tile on the screen. Once these parameter bounds are known, the number of samples tested can be reduced to those that are within the computed non-screen space (u,v,t) bounds.
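A sketch of how such precomputed per-tile parameter bounds cut down the candidate set follows; the function name and the (x, y, u, v, t) tuple layout are illustrative assumptions, not the referenced applications' actual interfaces.

```python
def cull_by_parameter_bounds(samples, t_bounds, u_bounds, v_bounds):
    """Keep only samples whose (u, v, t) lie inside the per-tile bounds.

    Each sample is an (x, y, u, v, t) tuple. Only the surviving samples
    proceed to the expensive point-in-primitive coverage test.
    """
    (t0, t1), (u0, u1), (v0, v1) = t_bounds, u_bounds, v_bounds
    return [s for s in samples
            if t0 <= s[4] <= t1 and u0 <= s[2] <= u1 and v0 <= s[3] <= v1]
```

For example, with t bounds [0, 0.5], a sample at t = 0.9 is skipped without any coverage test being run.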
However, while motion or defocus can be optimized independently, concurrent motion and defocus may cause reduced efficiency. This is because motion (t) and defocus (u,v) both cause apparent movement on the screen, which means that the same object point may project to the same (x,y) screen location from multiple different (u,v,t) coordinates, effectively coupling the t axis with the u and v axes. This multiple-valuedness inflates the [tmin,tmax], [umin,umax] and [vmin,vmax] bounds, resulting in an increased t range to account for the worst-case effect of the lens. Similarly, increased u, v ranges would be needed to account for the worst-case effect of motion, leading to less than ideal STE.
To address this problem, the construction of a bounding volume in two new coordinate systems, (t,u,x) and (t,v,y), is proposed. A bounding volume is defined by an upper bounding plane and a lower bounding plane in each of the new (t,u,x) and (t,v,y) coordinate systems.
Under the assumptions made in Equation (1), the x coordinate of a given object point is an affine function of t and u, i.e., its image in (t,u,x) coordinates is a plane. Similarly, the y coordinate of the object point is an affine function of t and v, and its image in (t,v,y) coordinates is a plane. The skilled person will appreciate that other models exist for determining vertex positions of a primitive, and the present invention is equally applicable to those models.
For background, consider a triangle which undergoes motion in screen space between shutter open (t=0) and shutter close (t=1).
The triangle's motion is further characterized as having a trajectory 120 extending between the triangles 110₀ and 110₁. As used herein, the trajectory 120 determines a region on the screen for any (u,v,t) coordinates, such that the screen space region bounds the primitive at those particular (u,v,t) coordinates. As such, the trajectory 120 includes screen space dimensions (x,y) and a non-screen space dimension (u, v, or t), and more particularly in the present invention two non-screen space dimensions, either (u,t) or (v,t). As shown, the trajectory 120 defines the possible screen space paths which the triangle 110 can take during the time span t=0 to t=1. The triangle 110 may include lens dimension coordinates (u,v) alternatively or in addition to the time dimension (t). As used herein, the term “screen space” refers to the conventionally-known screen space of the image, the space being conventionally defined in the (x,y) coordinate system. The term “non-screen space” refers to a space which is not included in the screen space of the image. Examples of such spaces/dimensions include time (t) and lens dimensions (u,v).
As shown, the trajectory 120 intersects a screen space region 130 (referred to herein as a “tile” for brevity), which may be any region in the screen space which is to be rendered, e.g., a pixel whose color/reflectance/transmittance is to be determined. The trajectory's intersection of the tile 130 indicates that the triangle may (but not necessarily) intersect the tile.
It will be understood by the skilled person that the foregoing example applies equally to a description of a primitive with respect to horizontal and vertical lens dimensions (u,v), as viewing the primitive from different lens positions (u,v) will result in apparent motion of the primitive in the screen space, such apparent motion intersecting the tile 130. In this case, each frame buffer sample point is further indexed with a lens position (u, v). Depth of field is related to motion blur in the sense that when the viewing position on the lens is moved to the left or the right, this causes only horizontal movement of the vertices, and similarly for vertical movement. That is, a change in the horizontal lens coordinate (u) results in apparent movement of the primitive only in the horizontal direction, and a change in vertical lens coordinate (v) results in apparent movement of the primitive only in the vertical direction.
A trajectory of the triangle's motion in the x-plane can be determined from the
Although not shown, the primitive's trajectory intersects a screen space region (in this instance, along the x-plane), and thus the testing of samples within the region is necessary to render the tile 130. Illustratively, a bounding volume corresponding to the region between the upper and lower trajectory portions is constructed by limiting the x,y coordinates to lie within the screen space region which is to be rendered.
As above in
Although not shown, the screen space trajectory intersects a screen space region (in this instance, along the y-plane), and thus the testing of samples within the region is necessary to render the tile 130. Illustratively, a bounding volume corresponding to the region located between the upper and lower trajectory portions is constructed by limiting the x,y coordinates to lie within the screen space region which is to be rendered.
The TUX,LO, TUX,HI, TVY,LO and TVY,HI bounding planes 422, 424, 442 and 444 have the form:
A_TUX,LO*t + B_TUX,LO*u + x + C_TUX,LO >= 0  eq. (2)
A_TUX,HI*t + B_TUX,HI*u - x + C_TUX,HI >= 0  eq. (3)
A_TVY,LO*t + B_TVY,LO*v + y + C_TVY,LO >= 0  eq. (4)
A_TVY,HI*t + B_TVY,HI*v - y + C_TVY,HI >= 0  eq. (5)
where A, B and C are constants determined during triangle setup. Because TUX,LO bounds the x coordinates of the vertices from below and TUX,HI from above, it is known that any visibility sample (x, y, u, v, t) for which the TUX,LO plane equation evaluates to a negative value cannot possibly be covered by the triangle, and similarly for the other planes. This gives a fast method for early culling of samples using dot products before a more precise coverage test. Once a sample is determined to lie within the halfspaces defined by all four planes, the complete coverage test is performed using ray-triangle intersection or high-dimensional edge functions.
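A minimal sketch of this early-culling test follows, assuming the plane coefficients are supplied as (A, B, C) triples in the form of eqs. (2)-(5); the function name and argument layout are illustrative.

```python
def passes_bounding_planes(x, y, u, v, t, tux_lo, tux_hi, tvy_lo, tvy_hi):
    """Conservative cull: if any of the four plane equations (2)-(5)
    evaluates to a negative value, the sample cannot be covered by the
    triangle. Each plane is an (A, B, C) triple of setup-time constants."""
    A, B, C = tux_lo
    if A * t + B * u + x + C < 0:    # eq. (2): x bounded from below
        return False
    A, B, C = tux_hi
    if A * t + B * u - x + C < 0:    # eq. (3): x bounded from above
        return False
    A, B, C = tvy_lo
    if A * t + B * v + y + C < 0:    # eq. (4): y bounded from below
        return False
    A, B, C = tvy_hi
    if A * t + B * v - y + C < 0:    # eq. (5): y bounded from above
        return False
    return True  # survives culling; run the exact coverage test
```

For instance, for a point moving as x = 2t with y confined to [0, 1] (planes x - 2t >= 0, 2t - x >= 0, y >= 0, 1 - y >= 0), a sample at (x, t) = (0.2, 0.5) fails eq. (2) and is culled without any ray-triangle test.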
The bounding planes can also be seen as an X-Y screen bounding box that is affinely parameterized by u, v and t: after substituting a fixed (u, v, t) triplet into the plane equations, the remaining linear inequalities determine conservative X and Y bounds for the image of the triangle.
The axis-aligned coordinate ranges for a pixel at (X,Y), as presented in the commonly-owned, concurrently filed U.S. patent application Ser. No. 13/168,771, filed Jun. 24, 2011, entitled “Bounding Box-Based Techniques for Improved Sample Test Efficiency in Image Rendering,” can be computed by intersecting the TUX (respectively, TVY) planes with the slab X <= x <= X+1 (respectively, Y <= y <= Y+1) and finding the minimum and maximum t, u (respectively, t, v) values along this intersection over the (t, u) rectangle. This can be imagined as slicing the vertex surfaces shown in
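One way to realize this slab intersection for the TUX pair is to note that the slab [X, X+1] overlaps the plane-bounded x interval exactly where two linear halfplane constraints hold in (t, u), and then clip the (t, u) unit square against them. The sketch below makes that concrete; the clipping helper is a standard Sutherland-Hodgman step and the function names are assumptions, not the referenced application's actual procedure.

```python
def clip(poly, a, b, c):
    """Sutherland-Hodgman step: keep the region where a*t + b*u + c >= 0."""
    out = []
    for i in range(len(poly)):
        p, q = poly[i], poly[(i + 1) % len(poly)]
        fp = a * p[0] + b * p[1] + c
        fq = a * q[0] + b * q[1] + c
        if fp >= 0:
            out.append(p)
        if (fp >= 0) != (fq >= 0):  # edge crosses the boundary line
            s = fp / (fp - fq)
            out.append((p[0] + s * (q[0] - p[0]), p[1] + s * (q[1] - p[1])))
    return out

def tu_bounds(Alo, Blo, Clo, Ahi, Bhi, Chi, X):
    """[tmin,tmax] and [umin,umax] for pixel column x in [X, X+1].

    Lower plane (eq. 2): Alo*t + Blo*u + x + Clo >= 0, i.e. x >= -(Alo*t + Blo*u + Clo).
    Upper plane (eq. 3): Ahi*t + Bhi*u - x + Chi >= 0, i.e. x <= Ahi*t + Bhi*u + Chi.
    The slab overlaps that interval iff the upper bound reaches X and the
    lower bound does not exceed X+1 -- two halfplanes in (t, u)."""
    poly = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]  # (t,u) unit square
    poly = clip(poly, Ahi, Bhi, Chi - X)        # upper x bound >= X
    poly = clip(poly, Alo, Blo, Clo + X + 1.0)  # lower x bound <= X+1
    if not poly:
        return None  # primitive cannot touch this pixel column at any (t, u)
    ts = [p[0] for p in poly]
    us = [p[1] for p in poly]
    return (min(ts), max(ts)), (min(us), max(us))
```

For the x = 2t example, pixel column X = 1 is only reachable for t in [0.5, 1], which is exactly what the clipped polygon reports.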
While equation (1) assumes a constant per-vertex circle of confusion, the methods described herein also apply when the per-vertex circles of confusion are allowed to vary linearly in time. In particular, setting up the bounding planes can still be done using the same procedure that looks at the x and y coordinates of the vertices only at the corners of the (t,u,v) cube.
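A sketch of such a corner-based setup for the lower (t,u,x) plane follows. Because each vertex's x is affine in (t, u), the minimum over vertices is concave, so a plane placed at or below the four corner minima lies below the triangle's x everywhere in the unit square. The slope choice here (averaged corner differences) is one illustrative heuristic, not the source's exact setup procedure.

```python
def lower_tux_plane(corner_min_x):
    """Conservative lower bounding plane x >= a*t + b*u + c from the
    minimum vertex x at the four corners of the (t, u) unit square.

    corner_min_x: dict mapping each (t, u) corner in {0,1}^2 to the
    minimum vertex x there. In the form of eq. (2), A = -a, B = -b, C = -c.
    """
    f = corner_min_x
    a = 0.5 * ((f[1, 0] - f[0, 0]) + (f[1, 1] - f[0, 1]))  # average slope in t
    b = 0.5 * ((f[0, 1] - f[0, 0]) + (f[1, 1] - f[1, 0]))  # average slope in u
    # Lower the offset until the plane is at or below every corner minimum.
    c = min(f[tc, uc] - a * tc - b * uc for tc in (0, 1) for uc in (0, 1))
    return a, b, c
```

The upper plane is set up symmetrically from the per-corner maxima, raising the offset instead of lowering it.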
As a further example, the above equations 2-5 are similar to the typical 2D edge equations (Ax+By+D >= 0). Accordingly, it would be possible to extend the fixed-function edge function evaluators in a graphics pipeline to process plane equations 2-5 described herein. For example, the coefficients of the (t,u,x) and (t,v,y) bounding planes, as defined in Equations 2, 3, 4, and 5, can be represented using a few bits, for example, 8 bits. In this sense, the simplifications are consistent with existing 2D edge equations, in which x and y have limited subpixel resolution.
In a first exemplary embodiment of operation 602, a trajectory is constructed of one or more vertices of the primitive in a (t,u,x) coordinate system, for example, by forming a vertex surface for each of a plurality of vertices of the primitive exemplified in the description of
In a second exemplary embodiment of operation 602, a trajectory is constructed of one or more vertices of the primitive in a (t,v,y) coordinate system, for example, by forming a vertex surface for each of a plurality of vertices of the primitive exemplified in the description of
Illustratively, the method 600 is carried out for each of a plurality of primitives in order to improve the sample test efficiency for determining the color/transparency/reflectance of a tile (e.g., a pixel) which is overlapped by the plurality of primitives. As a further example, multiple primitives overlapping a particular region/pixel may be processed in parallel, whereby multiple instances of operations 602 and 604 (one instance for each primitive) are carried out concurrently.
In an exemplary application, the method of
In one embodiment, system 800 is operable to reduce the number of samples tested for rendering a region of an image in accordance with the present invention. In this embodiment, the system 800 includes a processor 822 operable to perform one or more of the operations described for
As readily appreciated by those skilled in the art, the described processes and operations may be implemented in hardware, software, firmware or a combination of these implementations as appropriate. In addition, some or all of the described processes and operations may be carried out as a computer-implemented method, or as computer readable instruction code resident on a computer readable medium, the instruction code operable to control a computer or other such programmable device to carry out the intended functions. The computer readable medium on which the instruction code resides may take various forms, for example, a removable disk, volatile or non-volatile memory, etc.
In a particular embodiment of the invention, a memory (which may be included locally within the processor 822 or globally within system 800) is operable to store instructions for performing any of the operations described for
The terms “a” or “an” are used to refer to one, or more than one feature described thereby. Furthermore, the term “coupled” or “connected” refers to features which are in communication with each other, either directly, or via one or more intervening structures or substances. The sequence of operations and actions referred to in method flowcharts are exemplary, and the operations and actions may be conducted in a different sequence, as well as two or more of the operations and actions conducted concurrently. Reference indicia (if any) included in the claims serve to refer to one exemplary embodiment of a claimed feature, and the claimed feature is not limited to the particular embodiment referred to by the reference indicia. The scope of the claimed feature shall be that defined by the claim wording as if the reference indicia were absent therefrom. All publications, patents, and other documents referred to herein are incorporated by reference in their entirety. To the extent of any inconsistent usage between any such incorporated document and this document, usage in this document shall control.
The foregoing exemplary embodiments of the invention have been described in sufficient detail to enable one skilled in the art to practice the invention, and it is to be understood that the embodiments may be combined. The described embodiments were chosen in order to best explain the principles of the invention and its practical application to thereby enable others skilled in the art to best utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined solely by the claims appended hereto.
Claims
1. A method for reducing the number of sample points tested for rendering a screen space tile of an image, the method comprising:
- constructing a trajectory of a primitive in a three dimensional coordinate system comprising a screen space dimension, a lens dimension, and a time dimension;
- determining low and high bounding planes in the three dimensional coordinate system for the screen space tile of the image by using three dimensional points of the trajectory of the primitive;
- constructing a bounding volume for the screen space tile which is to be rendered by using the low and high bounding planes in the three dimensional coordinate system for the screen space tile, wherein the bounding volume extends between the low and high bounding planes in the three dimensional coordinate system that are determined as a function of the trajectory of the primitive, and wherein the bounding volume overlaps a portion of the screen space tile; and
- excluding from testing, one or more sample points which are located within the screen space tile, and which are not overlapped by the bounding volume.
2. The method of claim 1, wherein said constructing a trajectory comprises
- constructing a trajectory of one or more vertices of the primitive in a (t,u,x) coordinate system.
3. The method of claim 2,
- wherein the (t,u,x) coordinate system includes a (t,u) rectangle extending across the x-dimensional space of the (t,u,x) coordinate system, and
- wherein said constructing a bounding volume comprises: determining a minimum x-coordinate and a maximum x-coordinate at each of the four corners of the (t,u) rectangle; fitting a plane to the minimum x-coordinates to form a lower (t,u,x) bounding plane; and fitting a plane to the maximum x-coordinates to form an upper (t,u,x) bounding plane.
4. The method of claim 3, wherein each sample point comprises indices in the (t,u,x) coordinate system, and wherein said excluding from testing comprises
- excluding from testing one or more sample points having an index which does not lie between the lower (t,u,x) bounding plane and the upper (t,u,x) bounding plane.
5. The method of claim 1, wherein said constructing a trajectory comprises
- constructing a trajectory of one or more vertices of the primitive in a (t,v,y) coordinate system.
6. The method of claim 5,
- wherein the (t,v,y) coordinate system includes a (t,v) rectangle extending across the y-dimensional space of the (t,v,y) coordinate system, and
- wherein said constructing a bounding volume comprises: determining a minimum y-coordinate and a maximum y-coordinate at each of the four corners of the (t,v) rectangle; fitting a plane to the minimum y-coordinates to form a lower (t,v,y) bounding plane; and fitting a plane to the maximum y-coordinates to form an upper (t,v,y) bounding plane.
7. The method of claim 6, wherein each sample point comprises indices in the (t,v,y) coordinate system, and wherein said excluding from testing comprises
- excluding from testing one or more sample points having an index which does not lie between the lower (t,v,y) bounding plane and the upper (t,v,y) bounding plane.
8. A method for rendering a screen space tile of an image, comprising:
- identifying a screen space tile which is to be rendered;
- constructing a trajectory of a primitive in a three dimensional coordinate system comprising a screen space dimension, a lens dimension, and a time dimension;
- determining low and high bounding planes in the three dimensional coordinate system for the screen space tile of the image by using three dimensional points of the trajectory of the primitive;
- constructing a bounding volume for the screen space tile by using the low and high bounding planes in the three dimensional coordinate system for the screen space tile, wherein the bounding volume extends between the low and high bounding planes in the three dimensional coordinate system that are determined as a function of the trajectory of the primitive, and wherein the bounding volume overlaps a portion of the screen space tile;
- excluding from testing, one or more sample points which are located within the screen space tile, and which are not overlapped by the bounding volume;
- testing the sample points which are located within the screen space tile and which are overlapped by the bounding volume; and
- rendering the screen space tile of the image based upon the tested sample points.
9. The method of claim 8, wherein said constructing a trajectory comprises
- constructing a trajectory of one or more vertices of the primitive in a (t,u,x) coordinate system.
10. The method of claim 9,
- wherein the (t,u,x) coordinate system includes a (t,u) rectangle extending across the x-dimensional space of the (t,u,x) coordinate system, and
- wherein said constructing a bounding volume comprises: determining a minimum x-coordinate and a maximum x-coordinate at each of the four corners of the (t,u) rectangle; fitting a plane to the minimum x-coordinates to form a lower (t,u,x) bounding plane; and fitting a plane to the maximum x-coordinates to form an upper (t,u,x) bounding plane.
11. The method of claim 10, wherein each sample point comprises indices in the (t,u,x) coordinate system, and wherein said excluding from testing comprises
- excluding from testing one or more sample points having an index which does not lie between the lower (t,u,x) bounding plane and the upper (t,u,x) bounding plane.
12. The method of claim 8, wherein said constructing a trajectory comprises
- constructing a trajectory of one or more vertices of the primitive in a (t,v,y) coordinate system.
13. The method of claim 12,
- wherein the (t,v,y) coordinate system includes a (t,v) rectangle extending across the y-dimensional space of the (t,v,y) coordinate system, and
- wherein said constructing a bounding volume comprises: determining a minimum y-coordinate and a maximum y-coordinate at each of the four corners of the (t,v) rectangle; fitting a plane to the minimum y-coordinates to form a lower (t,v,y) bounding plane; and fitting a plane to the maximum y-coordinates to form an upper (t,v,y) bounding plane.
14. The method of claim 13, wherein each sample point comprises indices in the (t,v,y) coordinate system, and wherein said excluding from testing comprises
- excluding from testing one or more sample points having an index which does not lie between the lower (t,v,y) bounding plane and the upper (t,v,y) bounding plane.
15. A system operable to reduce the number of sample points tested for rendering a screen space tile of an image, the system including a processor operable to:
- construct a trajectory of a primitive in a three dimensional coordinate system comprising a screen space dimension, a lens dimension, and a time dimension;
- determine low and high bounding planes in the three dimensional coordinate system for the screen space tile of the image by using three dimensional points of the trajectory of the primitive;
- construct a bounding volume for the screen space tile which is to be rendered by using the low and high bounding planes in the three dimensional coordinate system for the screen space tile, wherein the bounding volume extends between the low and high bounding planes in the three dimensional coordinate system that are determined as a function of the trajectory of the primitive, and wherein the bounding volume overlaps a portion of the screen space tile; and
- exclude from testing, one or more sample points which are located within the screen space tile, and which are not overlapped by the bounding volume.
16. The system of claim 15, wherein in accordance with the processor operable to construct a trajectory, the processor is operable to construct a trajectory of one or more vertices of the primitive in a (t,u,x) coordinate system.
17. The system of claim 16,
- wherein the (t,u,x) coordinate system includes a (t,u) rectangle extending across the x-dimensional space of the (t,u,x) coordinate system, and
- wherein in accordance with the processor operable to construct a bounding volume, the processor is operable to: determine a minimum x-coordinate and a maximum x-coordinate at each of the four corners of the (t,u) rectangle; fit a plane to the minimum x-coordinates to form a lower (t,u,x) bounding plane; and fit a plane to the maximum x-coordinates to form an upper (t,u,x) bounding plane.
18. The system of claim 17, wherein each sample point comprises indices in the (t,u,x) coordinate system, and wherein in accordance with the processor operable to exclude from testing, the processor is operable to exclude from testing one or more sample points having an index which does not lie between the lower (t,u,x) bounding plane and the upper (t,u,x) bounding plane.
19. The system of claim 15, wherein in accordance with the processor operable to construct a trajectory, the processor is operable to construct a trajectory of one or more vertices of the primitive in a (t,v,y) coordinate system.
20. The system of claim 19,
- wherein the (t,v,y) coordinate system includes a (t,v) rectangle extending across the y-dimensional space of the (t,v,y) coordinate system, and
- wherein in accordance with the processor operable to construct a bounding volume, the processor is operable to: determine a minimum y-coordinate and a maximum y-coordinate at each of the four corners of the (t,v) rectangle; fit a plane to the minimum y-coordinates to form a lower (t,v,y) bounding plane; and fit a plane to the maximum y-coordinates to form an upper (t,v,y) bounding plane.
21. The system of claim 20, wherein each sample point comprises indices in the (t,v,y) coordinate system, and wherein in accordance with the processor operable to exclude from testing, the processor is operable to exclude from testing one or more sample points having an index which does not lie between the lower (t,v,y) bounding plane and the upper (t,v,y) bounding plane.
22. A system operable to render a screen space tile of an image, the system including a processor operable to:
- identify a screen space tile which is to be rendered;
- construct a trajectory of a primitive in a three dimensional coordinate system comprising a screen space dimension, a lens dimension, and a time dimension;
- determine low and high bounding planes in the three dimensional coordinate system for the screen space tile of the image by using three dimensional points of the trajectory of the primitive;
- construct a bounding volume for the screen space tile by using the low and high bounding planes in the three dimensional coordinate system for the screen space tile, wherein the bounding volume extends between the low and high bounding planes in the three dimensional coordinate system that are determined as a function of the trajectory of the primitive, and wherein the bounding volume overlaps a portion of the screen space tile;
- exclude from testing, one or more sample points which are located within the screen space tile, and which are not overlapped by the bounding volume;
- test the sample points which are located within the screen space tile and which are overlapped by the bounding volume; and
- render the screen space tile of the image based upon the tested sample points.
23. The system of claim 22, wherein in accordance with the processor operable to construct a trajectory, the processor is operable to construct a trajectory of one or more vertices of the primitive in a (t,u,x) coordinate system.
24. The system of claim 23,
- wherein the (t,u,x) coordinate system includes a (t,u) rectangle extending across the x-dimensional space of the (t,u,x) coordinate system, and
- wherein in accordance with the processor operable to construct a bounding volume, the processor is operable to: determine a minimum x-coordinate and a maximum x-coordinate at each of the four corners of the (t,u) rectangle; fit a plane to the minimum x-coordinates to form a lower (t,u,x) bounding plane; and fit a plane to the maximum x-coordinates to form an upper (t,u,x) bounding plane.
25. The system of claim 24, wherein each sample point comprises indices in the (t,u,x) coordinate system, and wherein in accordance with the processor operable to exclude from testing, the processor is operable to exclude from testing one or more sample points having an index which does not lie between the lower (t,u,x) bounding plane and the upper (t,u,x) bounding plane.
26. The system of claim 22, wherein in accordance with the processor operable to construct a trajectory, the processor is operable to construct a trajectory of one or more vertices of the primitive in a (t,v,y) coordinate system.
27. The system of claim 26,
- wherein the (t,v,y) coordinate system includes a (t,v) rectangle extending across the y-dimensional space of the (t,v,y) coordinate system, and
- wherein in accordance with the processor operable to construct a bounding volume, the processor is operable to: determine a minimum y-coordinate and a maximum y-coordinate at each of the four corners of the (t,v) rectangle; fit a plane to the minimum y-coordinates to form a lower (t,v,y) bounding plane; and fit a plane to the maximum y-coordinates to form an upper (t,v,y) bounding plane.
28. The system of claim 27, wherein each sample point comprises indices in the (t,v,y) coordinate system, and wherein in accordance with the processor operable to exclude from testing, the processor is operable to exclude from testing one or more sample points having an index which does not lie between the lower (t,v,y) bounding plane and the upper (t,v,y) bounding plane.
29. A computer program product, resident on a non-transitory computer-readable medium, and operable to store instructions for reducing the number of sample points tested for rendering a screen space tile of an image, the computer program product comprising:
- instruction code for constructing a trajectory of a primitive in a three dimensional coordinate system comprising a screen space dimension, a lens dimension, and a time dimension;
- instruction code for determining low and high bounding planes in the three dimensional coordinate system for the screen space tile of the image by using three dimensional points of the trajectory of the primitive;
- instruction code for constructing a bounding volume for the screen space tile which is to be rendered by using the low and high bounding planes in the three dimensional coordinate system for the screen space tile, wherein the bounding volume extends between the low and high bounding planes in the three dimensional coordinate system that are determined as a function of the trajectory of the primitive, and wherein the bounding volume overlaps a portion of the screen space tile; and
- instruction code for excluding from testing, one or more sample points which are located within the screen space tile, and which are not overlapped by the bounding volume.
30. A computer program product, resident on a non-transitory computer-readable medium, and operable to store instructions for rendering a screen space tile of an image, the computer program product comprising:
- instruction code for identifying a screen space tile which is to be rendered;
- instruction code for constructing a trajectory of a primitive in a three dimensional coordinate system comprising a screen space dimension, a lens dimension, and a time dimension;
- instruction code for determining low and high bounding planes in the three dimensional coordinate system for the screen space tile of the image by using three dimensional points of the trajectory of the primitive;
- instruction code for constructing a bounding volume for the screen space tile by using the low and high bounding planes in the three dimensional coordinate system for the screen space tile, wherein the bounding volume extends between the low and high bounding planes in the three dimensional coordinate system that are determined as a function of the trajectory of the primitive, and wherein the bounding volume overlaps a portion of the screen space tile;
- instruction code for excluding from testing, one or more sample points which are located within the screen space tile, and which are not overlapped by the bounding volume;
- instruction code for testing the sample points which are located within the screen space tile and which are overlapped by the bounding volume; and
- instruction code for rendering the screen space tile of the image based upon the tested sample points.
4897806 | January 30, 1990 | Cook et al. |
5113493 | May 12, 1992 | Crosby |
5222203 | June 22, 1993 | Obata |
5239624 | August 24, 1993 | Cook et al. |
5289565 | February 22, 1994 | Smith et al. |
5299298 | March 29, 1994 | Elmquist et al. |
5357579 | October 18, 1994 | Buchner et al. |
5384667 | January 24, 1995 | Beckwith |
5402534 | March 1995 | Yeomans |
5465119 | November 7, 1995 | Demos |
5684935 | November 4, 1997 | Demesa, III et al. |
5729672 | March 17, 1998 | Ashton |
5737027 | April 7, 1998 | Demos |
5809219 | September 15, 1998 | Pearce et al. |
5870096 | February 9, 1999 | Anjyo et al. |
5973700 | October 26, 1999 | Taylor et al. |
5982385 | November 9, 1999 | Fish et al. |
6034667 | March 7, 2000 | Barrett |
6211882 | April 3, 2001 | Pearce et al. |
6300956 | October 9, 2001 | Apodaca et al. |
6618048 | September 9, 2003 | Leather |
6636214 | October 21, 2003 | Leather et al. |
6700586 | March 2, 2004 | Demers |
6707458 | March 16, 2004 | Leather et al. |
6717577 | April 6, 2004 | Cheng et al. |
6720975 | April 13, 2004 | Dietrich, Jr. |
6811489 | November 2, 2004 | Shimizu et al. |
6867781 | March 15, 2005 | Van Hook et al. |
6885384 | April 26, 2005 | Deering et al. |
6999100 | February 14, 2006 | Leather et al. |
7002591 | February 21, 2006 | Leather et al. |
7034828 | April 25, 2006 | Drebin et al. |
7050066 | May 23, 2006 | Ohta |
7061502 | June 13, 2006 | Law et al. |
7075545 | July 11, 2006 | Van Hook et al. |
7119813 | October 10, 2006 | Hollis et al. |
7133041 | November 7, 2006 | Kaufman et al. |
7133047 | November 7, 2006 | Pallister |
7136081 | November 14, 2006 | Gritz et al. |
7176919 | February 13, 2007 | Drebin et al. |
7184059 | February 27, 2007 | Fouladi et al. |
7187379 | March 6, 2007 | Keller |
7196710 | March 27, 2007 | Fouladi et al. |
7205999 | April 17, 2007 | Leather |
7230618 | June 12, 2007 | Keller |
7307638 | December 11, 2007 | Leather et al. |
7307640 | December 11, 2007 | Demers et al. |
7317459 | January 8, 2008 | Fouladi et al. |
7362332 | April 22, 2008 | Gritz |
7446780 | November 4, 2008 | Everitt et al. |
7453460 | November 18, 2008 | Keller |
7453461 | November 18, 2008 | Keller |
7477261 | January 13, 2009 | Pallister |
7483010 | January 27, 2009 | Bai et al. |
7499054 | March 3, 2009 | Keller |
7538772 | May 26, 2009 | Fouladi et al. |
7576748 | August 18, 2009 | Van Hook et al. |
7616200 | November 10, 2009 | Heinrich et al. |
7623726 | November 24, 2009 | Georgiev |
7697010 | April 13, 2010 | Pallister |
7701461 | April 20, 2010 | Fouladi et al. |
7742060 | June 22, 2010 | Maillot |
7961970 | June 14, 2011 | Georgiev |
7973789 | July 5, 2011 | Cook |
7995069 | August 9, 2011 | Van Hook et al. |
8098255 | January 17, 2012 | Fouladi et al. |
8970584 | March 3, 2015 | Aila et al. |
20030083850 | May 1, 2003 | Schmidt et al. |
20030234789 | December 25, 2003 | Gritz |
20060101242 | May 11, 2006 | Siu et al. |
20070046686 | March 1, 2007 | Keller |
20080001961 | January 3, 2008 | Roimela et al. |
20080244241 | October 2, 2008 | Barraclough et al. |
20090167763 | July 2, 2009 | Waechter et al. |
20110090337 | April 21, 2011 | Klomp et al. |
20120218264 | August 30, 2012 | Clarberg et al. |
20120293515 | November 22, 2012 | Clarberg et al. |
20120327071 | December 27, 2012 | Laine et al. |
20130321420 | December 5, 2013 | Laine et al. |
1856805 | November 2006 | CN |
2012115711 | August 2012 | WO |
- Hou, Q., et al, “Micropolygon Ray Tracing with Defocus and Motion Blur,” ACM Transactions on Graphics (TOG), vol. 29, Article 64, Jul. 2010, pp. 1-10.
- Laine, S., et al., “Clipless Dual-Space Bounds for Faster Stochastic Rasterization,” ACM Transactions on Graphics (TOG), vol. 30, Issue 4, Article 106, Jul. 2011, pp. 1-6.
- P. Haeberli and K. Akeley, “The Accumulation Buffer: Hardware Support for High Quality Rendering,” In Proc. SIGGRAPH 1990, pp. 309-318.
- Tomas Akenine-Möller, Jacob Munkberg, and Jon Hasselgren, “Stochastic rasterization using time-continuous triangles,” Proc. Graphics Hardware 2009.
- Kayvon Fatahalian, Edward Luong, Solomon Boulos, Kurt Akeley, William R. Mark, and Pat Hanrahan, “Data-Parallel Rasterization of Micropolygons with Defocus and Motion Blur,” Proc. High Performance Graphics 2009.
- Möller, et al., “Stochastic Rasterization Using Time-Continuous Triangles,” ACM, Jan. 2007, pp. 1-11.
- McGuire, et al; “Hardware-Accelerated Global Illumination by Image Space Photon Mapping”, HPG 2009, New Orleans, Louisiana, Aug. 1-3, 2009.
- Linsen, et al; “Splat-based Ray Tracing of Point Clouds”, Journal of WSCG 15: 51-58, 2007.
- Schaufler, et al; “Ray Tracing Point Sampled Geometry”, In Proceedings of the Eurographics Workshop on Rendering Techniques 2000, pp. 319-328, London, UK, 2000. Springer-Verlag.
Type: Grant
Filed: Jun 24, 2011
Date of Patent: Sep 29, 2015
Assignee: NVIDIA CORPORATION (Santa Clara, CA)
Inventors: Jaakko Lehtinen (Helsinki), Timo Aila (Helsinki), Samuli Laine (Helsinki)
Primary Examiner: Ryan R Yang
Application Number: 13/168,773
International Classification: G06T 15/00 (20110101); G09G 5/00 (20060101); G06T 11/20 (20060101);