Patents Assigned to DreamWorks Animation LLC
-
Publication number: 20140267237
Abstract: Systems and methods for rendering three-dimensional images using a level graph are provided. The level graph is accessed, comprising a first node, a second node, and a target node. The second and target nodes are descendants of the first node. The first node comprises first scene description data, the second node comprises first variation data, and the target node comprises second variation data. The target node is selected for computation. The ancestors of the target node are determined; the first node and the second node are ancestors of the target node. A linearization of the ancestors is determined, comprising an order. A scene description is initialized using the first scene description data. The first variation data is applied to the scene description, based on the linearization. The second variation data is applied to the scene description to produce a final scene description. An image is rendered using the final scene description.
Type: Application
Filed: March 15, 2013
Publication date: September 18, 2014
Applicant: DREAMWORKS ANIMATION LLC
Inventors: Peter MCNERNEY, Evan P. SMYTH, Robert Giles WILSON, Greg HEFLIN, Jeff BEALL, Jonathan GIBBS, Mike HAMLER, Benoit GAGNON
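A minimal sketch of the evaluation flow this abstract describes, assuming a hypothetical `Node` structure with dictionary-valued data: the target's ancestors are linearized so that ancestors precede descendants, the base scene description seeds the result, and each node's variation data is applied in that order.
```python
class Node:
    def __init__(self, name, data=None, parents=()):
        self.name = name
        self.data = data or {}       # scene description data or variation data
        self.parents = list(parents)

def linearize_ancestors(target):
    """Collect the target's ancestors, ordered so parents precede children."""
    order, seen = [], set()
    def visit(node):
        if node.name in seen:
            return
        seen.add(node.name)
        for parent in node.parents:
            visit(parent)
        order.append(node)
    for parent in target.parents:
        visit(parent)
    return order

def resolve_scene(target):
    """Initialize from the first node's scene description, then apply variations."""
    ancestors = linearize_ancestors(target)
    scene = dict(ancestors[0].data)     # first node: base scene description data
    for node in ancestors[1:]:          # intermediate ancestors: variation data
        scene.update(node.data)
    scene.update(target.data)           # target node: second variation data
    return scene

base = Node("base", {"light_intensity": 1.0, "camera": "main"})
variant = Node("wet_fur", {"light_intensity": 0.7}, parents=[base])
shot = Node("shot_12", {"camera": "closeup"}, parents=[variant])
print(resolve_scene(shot))  # {'light_intensity': 0.7, 'camera': 'closeup'}
```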
-
Publication number: 20140267357
Abstract: A computer-enabled method for shading locations for use in rendering a computer-generated scene having one or more objects represented by a point cloud. The method involves selecting a shading location, selecting a set of points from the point cloud, rasterizing the points onto a raster shape positioned at the shading location, and shading the shading location. The raster shape has varying texel densities based on characteristics of the points in the point cloud, such that the texel density varies on different surfaces of the raster shape, on different areas of the same surface, or both.
Type: Application
Filed: March 15, 2013
Publication date: September 18, 2014
Applicant: DreamWorks Animation LLC
Inventor: DreamWorks Animation LLC
-
Publication number: 20140267083
Abstract: Systems and methods for manipulating a virtual three-dimensional (3D) object in a virtual 3D space are provided. A representation of the 3D object is displayed on a display. A non-hemispherical arcball having a surface is determined. The non-hemispherical arcball is associated with the representation of the 3D object. A pointing device is detected at a first position and at a second position. The first position of the pointing device is translated onto a first location on the surface of the non-hemispherical arcball. The second position of the pointing device is translated onto a second location on the surface of the non-hemispherical arcball. A rotation of the representation of the 3D object is displayed on the display, the rotation based on a path of travel between the first location and the second location along the surface of the non-hemispherical arcball.
Type: Application
Filed: March 15, 2013
Publication date: September 18, 2014
Applicant: DREAMWORKS ANIMATION LLC
Inventors: Morgwn Quin MCCARTY, Alexander P. POWELL
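The abstract does not spell out the shape of the non-hemispherical arcball, so the sketch below uses a familiar stand-in (a spherical cap blended into a hyperbolic sheet) just to show how two translated pointer locations become a rotation axis and angle; the surface function and radius are assumptions.
```python
import numpy as np

def to_surface(p2d, radius=1.0):
    """Map a 2D pointer position onto a non-hemispherical control surface:
    a spherical cap near the center, a hyperbolic sheet farther out."""
    x, y = p2d
    d2 = x * x + y * y
    r2 = radius * radius
    if d2 <= r2 / 2.0:
        z = np.sqrt(r2 - d2)             # on the sphere
    else:
        z = r2 / (2.0 * np.sqrt(d2))     # on the hyperbolic sheet
    return np.array([x, y, z])

def rotation_between(p_start, p_end, radius=1.0):
    """Rotation (axis, angle) implied by the path between two surface locations."""
    a = to_surface(p_start, radius)
    b = to_surface(p_end, radius)
    a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    axis = np.cross(a, b)
    angle = np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))
    return axis, angle

axis, angle = rotation_between((0.1, 0.0), (0.4, 0.2))
print(axis, np.degrees(angle))
```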
-
Publication number: 20140267307
Abstract: Computer animation tools for viewing, in multiple contexts, the effect of changes to a computer animation are disclosed. An artist configures multiple visual displays in the user interface of a computer animation system. A visual display shows one or more frames of computer animation. An artist configures a visual display to reflect a specific context. For example, the artist may assign a particular virtual viewpoint of a scene to a particular visual display. Once visual displays are configured, the artist changes a configuration of the computer animation. For example, the artist may change the lighting parameters of a scene. In response, the visual displays show the visual effects of the configuration (e.g., lighting parameters) change under corresponding contexts (e.g., different virtual camera viewpoints). Using multiple visual displays, which may be displayed side-by-side, an artist can view the effects of her configuration changes in the various contexts.
Type: Application
Filed: March 15, 2013
Publication date: September 18, 2014
Applicant: DreamWorks Animation LLC
Inventors: Tsuey Jin Liou, Evan P. Smyth, Andrew Phillip Pearce, Peter McNerney
-
Publication number: 20140267252
Abstract: Using curves to emulate soft body deformation in a computer-generated character is disclosed. A method can include accessing a reference model mapped to one or more deformation curves for the character. The reference model can include a mesh of vertices representing a soft body layer of the character. The deformation curve can include multiple sample points selected for mapping. Each mesh vertex on the model can be mapped to each sample point on the curve to establish a relationship between them for deformation. The method can also include receiving a movement of one or more sample points on the curve to a desired deformation position. The method can further include calculating primary and secondary movements of the mesh vertices on the model based on the movements of sample points. The method can move the mesh vertices as calculated to a desired deformation position and output the reference model with the moved vertices for rendering to emulate the soft body deformation of the character.
Type: Application
Filed: March 15, 2013
Publication date: September 18, 2014
Applicant: DREAMWORKS ANIMATION LLC
Inventors: Michael HUTCHINSON, Guido ZIMMERMANN, Robert HELMS
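A toy one-dimensional sketch of the vertex-to-curve mapping described above: each mesh vertex stores normalized inverse-distance weights to every sample point on the deformation curve (computed at bind time), and moving the sample points moves the vertices. The weighting scheme and the "secondary" follow-through gain are placeholders, not the patent's actual formulation.
```python
def bind_weights(vertices, samples):
    """Map every vertex to every curve sample point with normalized weights."""
    weights = []
    for v in vertices:
        raw = [1.0 / (abs(v - s) + 1e-3) for s in samples]
        total = sum(raw)
        weights.append([r / total for r in raw])
    return weights

def deform(vertices, weights, sample_displacements, secondary_gain=0.15):
    """Primary movement: weighted copy of the sample-point displacements.
    Secondary movement: a small extra gain standing in for follow-through."""
    moved = []
    for v, w in zip(vertices, weights):
        primary = sum(wi * di for wi, di in zip(w, sample_displacements))
        secondary = secondary_gain * primary
        moved.append(v + primary + secondary)
    return moved

verts = [0.0, 0.5, 1.0, 1.5, 2.0]        # mesh vertices along the curve
samples = [0.0, 1.0, 2.0]                # curve sample points
w = bind_weights(verts, samples)
print(deform(verts, w, sample_displacements=[0.0, 0.3, 0.0]))
```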
-
Publication number: 20140270561
Abstract: Data representing animated hair in a computer generated imagery (CGI) scene may be compressed by treating hair data as arrays of parameters. Hair data parameters may include control vertices, hair color, hair radius, and the like. A principal component analysis (PCA) may be performed on the arrays of hair data. PCA may yield new basis vectors, varying in length, with the largest basis vector corresponding to a new dimension with the largest variance in hair data. The hair data may be quantized based on the varying lengths of new basis vectors. The number of bits allocated for quantizing each new dimension corresponding to each new basis vector may be determined based on the relative lengths of new basis vectors, with more bits allocated to dimensions corresponding to longer basis vectors. The quantized hair data may be bit-packed and then compressed using lossless entropy encoding.
Type: Application
Filed: March 14, 2013
Publication date: September 18, 2014
Applicant: DreamWorks Animation LLC
Inventor: Mark Jeffrey MATTHEWS
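A rough NumPy sketch of the PCA and bit-allocation steps. The allocation rule (bits in proportion to each basis vector's share of the total spread), the bit budget, and the uniform quantizer are illustrative assumptions; bit-packing and entropy coding are omitted.
```python
import numpy as np

def pca_quantize(hair_data, total_bits_per_hair=32):
    """PCA on per-hair parameter vectors, then allocate quantization bits to
    each principal dimension in proportion to its spread (longer basis vector,
    more bits). Illustrative only; the patent's exact rules are not shown."""
    mean = hair_data.mean(axis=0)
    centered = hair_data - mean
    # Principal component analysis via SVD; rows of vt are the new basis vectors.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    coords = centered @ vt.T                     # hair data in the new basis
    spread = s / s.sum()                         # relative length of each basis vector
    bits = np.maximum(1, np.round(spread * total_bits_per_hair).astype(int))
    levels = (2 ** bits) - 1
    lo, hi = coords.min(axis=0), coords.max(axis=0)
    scale = np.where(hi > lo, hi - lo, 1.0)
    quantized = np.round((coords - lo) / scale * levels).astype(np.int64)
    return quantized, bits, (mean, vt, lo, scale, levels)

hairs = np.random.default_rng(0).normal(size=(1000, 12))  # e.g. 4 control vertices x 3
q, bits, codec = pca_quantize(hairs)
print(bits, q.shape)
```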
-
Patent number: 8730239
Abstract: Embodiments relate to a computer-implemented method of providing a transition between first and second regions within a virtual scene, where the first and second regions are rendered using different methods and are connected to one another along a border line. The second region features sharply diminishing illumination from the border line. The method includes adding an overlay of additional illumination to the first region so as to make the illumination in portions of the first region that are close to the border line similar to that of portions of the second region that are close to the border line. The method also includes shifting a position, on which the calculation of the illumination of the second region is based, away from the first region.
Type: Grant
Filed: March 12, 2012
Date of Patent: May 20, 2014
Assignee: DreamWorks Animation LLC
Inventors: Bruce Nunzio Tartaglia, Doug Cooper, Pablo Valle, Michael McNeill
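A toy one-dimensional version of the overlay step, purely to illustrate the idea: illumination samples in the first region near the border are pushed toward the illumination the second region shows at its side of the border line. The falloff width and linear blend are assumptions, not values from the patent.
```python
def add_border_overlay(region_a, region_b_border_value, falloff_width):
    """region_a is a list of illumination samples ordered from far-from-border
    to at-the-border. An overlay term ramps up near the border so the last
    samples approach region B's illumination at its side of the border line."""
    n = len(region_a)
    result = []
    for i, value in enumerate(region_a):
        dist_from_border = n - 1 - i            # 0 at the border line
        if dist_from_border < falloff_width:
            blend = 1.0 - dist_from_border / falloff_width
            value = value + blend * (region_b_border_value - value)
        result.append(value)
    return result

# Region A is dim (0.2); region B's brightly lit border value is 0.9.
print(add_border_overlay([0.2] * 8, 0.9, falloff_width=4))
```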
-
Publication number: 20140111441
Abstract: A touch-sensitive surface for a computer animator to create or modify a computer-generated image includes processes for differentiating between click and drag operations. The included processes also beneficially reduce input errors. When a touch object (e.g., a finger or stylus) touches the drawing table, information regarding the duration of the touch and the movement of the touch is used to determine whether the touch input represents a graphical user interface click or a drag operation.
Type: Application
Filed: October 18, 2012
Publication date: April 24, 2014
Applicant: DreamWorks Animation LLC
Inventor: Alexander P. POWELL
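A small sketch of the click-versus-drag decision described above, using duration and travel thresholds. The threshold values and the touch-tuple format are placeholders, not figures from the patent.
```python
import time

# Illustrative thresholds; the actual values and heuristics are not given here.
MAX_CLICK_SECONDS = 0.25
MAX_CLICK_TRAVEL_PIXELS = 6.0

def classify_touch(touch_down, touch_up):
    """Classify a touch as a 'click' or a 'drag' from its duration and movement.
    Each argument is (timestamp_seconds, x, y)."""
    t0, x0, y0 = touch_down
    t1, x1, y1 = touch_up
    duration = t1 - t0
    travel = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if duration <= MAX_CLICK_SECONDS and travel <= MAX_CLICK_TRAVEL_PIXELS:
        return "click"
    return "drag"

start = time.monotonic()
print(classify_touch((start, 100.0, 100.0), (start + 0.1, 102.0, 101.0)))  # click
print(classify_touch((start, 100.0, 100.0), (start + 0.4, 180.0, 140.0)))  # drag
```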
-
Publication number: 20140085312
Abstract: Systems and processes for rendering fractures in an object are provided. In one example, a surface representation of an object may be converted into a volumetric representation of the object. The volumetric representation of the object may be divided into volumetric representations of two or more fragments. The volumetric representations of the two or more fragments may be converted into surface representations of the two or more fragments. Additional information associated with attributes of adjacent fragments may be used to convert the volumetric representations of the two or more fragments into surface representations of the two or more fragments. The surface representations of the two or more fragments may be displayed.
Type: Application
Filed: November 5, 2013
Publication date: March 27, 2014
Applicant: DreamWorks Animation LLC
Inventors: Akash GARG, Kyle MAXWELL, David LIPTON
-
Publication number: 20140035918
Abstract: Bounded-parallax constraints are determined for the placement of a pair of stereoscopic cameras within a computer-generated scene. A minimum scene depth is calculated based on the distance from the pair of cameras to a nearest point of interest in the computer-generated scene. A near-parallax value is also calculated based on the focal length and the minimum scene depth. Calculating the near-parallax value includes selecting a baseline stereo-setting entry from a set of baseline stereo-setting entries; each entry of the set includes a recommended scene depth, a recommended focal length, and a recommended near-parallax value. For the selected baseline stereo-setting entry, the recommended scene depth corresponds to the minimum scene depth and the recommended focal length corresponds to the focal length. The near-parallax value and far-parallax value are stored as the bounded-parallax constraints for the placement of the pair of stereoscopic cameras.
Type: Application
Filed: March 14, 2013
Publication date: February 6, 2014
Applicant: DreamWorks Animation LLC
Inventors: Philip MCNALLY, Matthew LOW
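A hypothetical version of the table-lookup step: a baseline stereo-setting entry is chosen whose recommended scene depth and focal length best correspond to the shot's minimum scene depth and focal length, and its recommended near-parallax value is returned. The table contents and the distance metric are made up for illustration.
```python
# (recommended_scene_depth, recommended_focal_length_mm, recommended_near_parallax)
STEREO_SETTINGS = [
    (1.0,  24.0, -30.0),
    (2.0,  35.0, -22.0),
    (5.0,  50.0, -14.0),
    (10.0, 85.0,  -8.0),
]

def near_parallax(min_scene_depth, focal_length):
    """Pick the baseline entry whose recommended depth and focal length are
    closest to the shot's minimum scene depth and focal length."""
    def distance(entry):
        depth, focal, _ = entry
        return abs(depth - min_scene_depth) + abs(focal - focal_length)
    _, _, recommended_near = min(STEREO_SETTINGS, key=distance)
    return recommended_near

print(near_parallax(min_scene_depth=4.2, focal_length=48.0))  # -14.0
```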
-
Publication number: 20140035931
Abstract: Systems and processes are described below relating to evaluating a dependency graph having one or more temporally dependent variables. The temporally dependent variables may include variables that may be used to evaluate the dependency graph at a frame other than that at which the temporally dependent variable was evaluated. One example process may include tracking the temporal dirty state for each temporally dependent variable using a temporal dependency list. This list may be used to determine which frames, if any, should be reevaluated when a request to evaluate a dependency graph for a particular frame is received. This advantageously reduces the amount of time and computing resources needed to reevaluate a dependency graph.
Type: Application
Filed: August 2, 2013
Publication date: February 6, 2014
Applicant: DreamWorks Animation LLC
Inventors: Alexander P. Powell, Alex M. Wells
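One way the temporal dependency list might be sketched: a per-variable set of dirty frames, intersected with the frames a request actually touches. The class name, the offset representation, and the example values are assumptions rather than the patent's data structures.
```python
class TemporalDependencyList:
    """Tracks, per temporally dependent variable, which frames are dirty."""

    def __init__(self):
        self.dirty_frames = {}   # variable name -> set of dirty frame numbers

    def mark_dirty(self, variable, frame):
        self.dirty_frames.setdefault(variable, set()).add(frame)

    def mark_clean(self, variable, frame):
        self.dirty_frames.get(variable, set()).discard(frame)

    def frames_to_reevaluate(self, variable, requested_frame, offsets):
        """Frames that must be recomputed before the requested frame can be
        evaluated, given the temporal offsets the variable reads from
        (e.g. offsets=(-1,) for a variable that looks one frame back)."""
        needed = {requested_frame + off for off in offsets} | {requested_frame}
        return sorted(needed & self.dirty_frames.get(variable, set()))

deps = TemporalDependencyList()
for frame in range(1, 11):
    deps.mark_dirty("hair_sim_state", frame)
deps.mark_clean("hair_sim_state", 4)
print(deps.frames_to_reevaluate("hair_sim_state", 5, offsets=(-1,)))  # [5]
```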
-
Publication number: 20140035903
Abstract: A computer-implemented method for smoothing a stereo parameter for a computer-animated film sequence. A timeline for the film sequence is obtained, the timeline comprising a plurality of time entries. A stereo parameter distribution is obtained, wherein the stereo parameter distribution comprises one stereo parameter value for at least two time entries of the plurality of time entries, and wherein the stereo parameter value corresponds to a stereo setting associated with a pair of stereoscopic cameras configured to produce a stereoscopic image of the computer-animated film sequence. Depending on a statistical measurement of the stereo parameter distribution, either a static scene parameter is calculated or a set of smoothed parameter values is calculated.
Type: Application
Filed: March 13, 2013
Publication date: February 6, 2014
Applicant: DreamWorks Animation LLC
Inventors: Philip MCNALLY, Matthew Low
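A toy rendering of the branch described above: if the parameter's spread over the timeline is small relative to its mean, a single static value is returned; otherwise a smoothed per-frame series is returned. The statistical measurement (population standard deviation), the threshold, and the moving-average kernel are illustrative choices, not the patent's.
```python
from statistics import mean, pstdev

def smooth_stereo_parameter(values, variation_threshold=0.05):
    """Return either one static scene parameter or a smoothed series,
    depending on a statistical measurement of the distribution."""
    spread = pstdev(values)
    if spread <= variation_threshold * abs(mean(values)):
        return mean(values)                       # static scene parameter
    smoothed = []
    for i in range(len(values)):
        window = values[max(0, i - 1): i + 2]     # 3-tap moving average
        smoothed.append(sum(window) / len(window))
    return smoothed

print(smooth_stereo_parameter([10.0, 10.1, 9.9, 10.0]))     # -> single static value
print(smooth_stereo_parameter([2.0, 8.0, 3.0, 9.0, 2.5]))   # -> smoothed per-frame values
```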
-
Publication number: 20140036037
Abstract: A computer-implemented method for placing a window object within a computer-generated scene. The computer-generated scene includes a pair of stereoscopic cameras adapted to capture an image of at least one computer-generated object and the window object. A left portion and right portion of the image along the left and right edges of the image are obtained. The nearest computer-generated object to the pair of stereoscopic cameras within the left and right portions of the image is identified. The window object is placed between the identified computer-generated object and the stereoscopic cameras at an offset distance from the identified computer-generated object.
Type: Application
Filed: March 13, 2013
Publication date: February 6, 2014
Applicant: DreamWorks Animation LLC
Inventors: Philip MCNALLY, Matthew LOW
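A simplified sketch of the placement logic, with objects reduced to a name, a screen-space x coordinate, and a distance from the cameras; the edge-strip width and the offset value are placeholders.
```python
def place_window_object(objects, image_width, edge_fraction=0.1, offset=0.2):
    """Look only at objects whose screen-space x falls in the left or right
    edge strips of the image, take the one nearest the cameras, and place the
    window object slightly in front of it. `objects` is a list of
    (name, screen_x, distance_from_cameras)."""
    strip = edge_fraction * image_width
    in_edges = [o for o in objects
                if o[1] <= strip or o[1] >= image_width - strip]
    if not in_edges:
        return None
    name, _, distance = min(in_edges, key=lambda o: o[2])
    window_distance = max(distance - offset, 0.0)
    return name, window_distance

objects = [("tree", 40.0, 3.0), ("barn", 960.0, 7.5), ("hero", 512.0, 2.0)]
print(place_window_object(objects, image_width=1024))  # nearest edge object: 'tree'
```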
-
Publication number: 20140036036
Abstract: A computer-implemented method for determining a user-defined stereo effect for a computer-animated film sequence. A stereo-volume value for a timeline of the film sequence is obtained, wherein the stereo-volume value represents a percentage of parallax at the respective time entry. A stereo-shift value for the timeline is also obtained, wherein the stereo-shift value represents a distance across one of: an area associated with a sensor of a pair of stereoscopic cameras adapted to create the film sequence; and a screen adapted to depict a stereoscopic image of the computer-generated scene. A script-adjusted near-parallax value and a script-adjusted far-parallax value are calculated.
Type: Application
Filed: March 13, 2013
Publication date: February 6, 2014
Applicant: DreamWorks Animation LLC
Inventors: Philip MCNALLY, Matthew LOW
-
Publication number: 20140035908
Abstract: Systems and processes are described below relating to evaluating a dependency graph to render three-dimensional (3D) graphics using constraints. Two virtual 3D objects are accessed in a virtual 3D space. A constraint relationship request is received, which identifies the first object as a parent and the second object as a child. The technique verifies whether the graphs of the objects are compatible for being constrained to one another. The first object is evaluated to determine its translation, rotation, and scale. The second object is similarly evaluated based on the translation, rotation, and scale of the first object. An image is rendered depicting at least a portion of the first virtual 3D object and at least a portion of the second virtual 3D object.
Type: Application
Filed: August 2, 2013
Publication date: February 6, 2014
Applicant: DreamWorks Animation LLC
Inventors: Alexander P. POWELL, Esteban D. PAPP, Alex M. WELLS
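A minimal 2D stand-in for the constraint evaluation: the parent's translate/rotate/scale is evaluated first, and the child's local transform is then evaluated relative to it. Production rigs use full 3D transforms and dependency-graph nodes; this only shows the composition order.
```python
import math

def compose(parent, child):
    """Each transform is (tx, ty, rotation_radians, uniform_scale).
    Returns the child's world-space transform constrained to the parent."""
    ptx, pty, prot, pscale = parent
    ctx, cty, crot, cscale = child
    cos_r, sin_r = math.cos(prot), math.sin(prot)
    # Child translation expressed in the parent's rotated, scaled frame.
    wx = ptx + pscale * (cos_r * ctx - sin_r * cty)
    wy = pty + pscale * (sin_r * ctx + cos_r * cty)
    return (wx, wy, prot + crot, pscale * cscale)

parent_world = (10.0, 0.0, math.pi / 2, 2.0)   # parent: translated, rotated 90 deg, scaled 2x
child_local = (1.0, 0.0, 0.0, 1.0)             # child: one unit along the parent's local x
print(compose(parent_world, child_local))       # ~ (10.0, 2.0, 1.5708, 2.0)
```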
-
Publication number: 20140035922
Abstract: Systems and processes are provided for a tool that visualizes parallel dependency graph evaluation in computer animation. Runtime evaluation data of a parallel dependency graph may be collected, including the start time and stop time for each node in the graph. The visualization tool may process the data to generate performance visualizations as well as other analysis features. Performance visualizations may illustrate the level of concurrency over time during parallel dependency graph evaluation. Performance visualizations may be generated by graphing node blocks according to node start time and stop time as well as the level of concurrency at a given time to illustrate parallelism. Performance visualizations may enable character technical directors, character riggers, programmers, and other users to evaluate how well parallelism is expressed in parallel dependency graphs in computer animation.
Type: Application
Filed: March 12, 2013
Publication date: February 6, 2014
Applicant: DreamWorks Animation LLC
Inventors: Martin Peter WATT, Brendan Duncan
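A small sketch of the quantity such a visualization plots: given each node's recorded start and stop times, the level of concurrency over time falls out of a sweep over the sorted events. The input format is an assumption.
```python
def concurrency_profile(node_intervals):
    """Compute the level of concurrency over time from per-node (start, stop)
    evaluation times. Returns a list of (time, active_node_count) change points."""
    events = []
    for start, stop in node_intervals:
        events.append((start, +1))
        events.append((stop, -1))
    events.sort()
    active, profile = 0, []
    for t, delta in events:
        active += delta
        profile.append((t, active))
    return profile

# Four node evaluations recorded during one graph evaluation: (start, stop) in ms.
nodes = [(0.0, 4.0), (1.0, 3.0), (1.5, 5.0), (4.5, 6.0)]
print(concurrency_profile(nodes))
```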
-
Publication number: 20140035917
Abstract: A computer-implemented method for determining bounded-parallax constraints for the placement of a pair of stereoscopic cameras within a computer-generated scene. An initial near-parallax value is determined based on the focal length and a minimum scene depth. An initial far-parallax value is determined based on the focal length. A scaled near-parallax value and a scaled far-parallax value are calculated based on the initial near-parallax value, the initial far-parallax value, and a range of focal lengths. A creative near-parallax value is calculated based on a stereo-shift value and the product of a stereo-volume and the scaled near-parallax value. A creative far-parallax value is calculated based on the stereo-shift value and the product of the stereo-volume and the scaled far-parallax value. The creative near-parallax value and the creative far-parallax value are stored as the bounded-parallax constraints for the placement of the pair of stereoscopic cameras.
Type: Application
Filed: March 13, 2013
Publication date: February 6, 2014
Applicant: DreamWorks Animation LLC
Inventors: Philip MCNALLY, Matthew LOW
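One plausible reading of the final step, written as a formula: the stereo-volume (a percentage of parallax) scales each scaled parallax bound and the stereo-shift offsets it. The abstract names the inputs but not the exact combination, so treat this as a sketch.
```python
def creative_parallax(scaled_near, scaled_far, stereo_volume, stereo_shift):
    """Combine the scaled parallax bounds with the stereo-volume and
    stereo-shift; the additive/multiplicative form here is an assumption."""
    creative_near = stereo_shift + stereo_volume * scaled_near
    creative_far = stereo_shift + stereo_volume * scaled_far
    return creative_near, creative_far

# 80% stereo volume, a small positive shift, parallax bounds in pixels.
print(creative_parallax(scaled_near=-20.0, scaled_far=30.0,
                        stereo_volume=0.8, stereo_shift=2.0))  # (-14.0, 26.0)
```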
-
Publication number: 20140036039
Abstract: A computer-implemented method for determining a user-defined stereo effect for a computer-generated scene. A set of bounded-parallax constraints including a near-parallax value and a far-parallax value is obtained. A stereo-volume value is obtained, wherein the stereo-volume value represents a percentage of parallax. A stereo-shift value is also obtained, wherein the stereo-shift value represents a distance across one of: an area associated with a camera sensor of a pair of stereoscopic cameras adapted to film the computer-generated scene; and a screen adapted to depict a stereoscopic image of the computer-generated scene. A creative near-parallax value is calculated based on the stereo-shift value, the stereo-volume, and the near-parallax value. A creative far-parallax value is also calculated based on the stereo-shift value and the product of the stereo-volume and the far-parallax value.
Type: Application
Filed: March 14, 2013
Publication date: February 6, 2014
Applicant: DreamWorks Animation LLC
Inventors: Philip MCNALLY, Matthew LOW
-
Publication number: 20140036038
Abstract: Techniques for determining scaled-parallax constraints used for the placement of a pair of stereoscopic cameras within a computer-generated scene. A set of bounded-parallax constraints including a near-parallax value and a far-parallax value is obtained, along with a lower-bound value and an upper-bound value for a range of focal lengths. Scaled near-parallax and scaled far-parallax values are calculated, the calculation depending on whether the focal length is greater than, less than, or within the range of focal lengths.
Type: Application
Filed: March 13, 2013
Publication date: February 6, 2014
Applicant: DreamWorks Animation LLC
Inventors: Philip McNally, Matthew Low
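An illustrative version of the three-case scaling: below the focal-length range the parallax bound is damped, above it the full bound is used, and within it the factor ramps linearly. The specific factors and the linear ramp are assumptions; only the case split comes from the abstract.
```python
def scale_parallax(parallax, focal_length, focal_low, focal_high):
    """Scale a parallax bound according to where the focal length falls
    relative to the [focal_low, focal_high] range."""
    if focal_length <= focal_low:
        factor = 0.5                    # short lenses: damp the parallax bound
    elif focal_length >= focal_high:
        factor = 1.0                    # long lenses: use the full bound
    else:                               # within the range: ramp between the two
        t = (focal_length - focal_low) / (focal_high - focal_low)
        factor = 0.5 + 0.5 * t
    return factor * parallax

for focal in (20.0, 50.0, 120.0):
    print(focal, scale_parallax(parallax=-24.0, focal_length=focal,
                                focal_low=28.0, focal_high=100.0))
```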
-
Publication number: 20130335406
Abstract: Locations are shaded for use in rendering a computer-generated scene having one or more objects represented by a point cloud. A hierarchy for the point cloud is obtained. The point cloud includes a plurality of points, and the hierarchy has a plurality of clusters of points of the point cloud. A location to shade is selected. A first cluster from the plurality of clusters is selected; the first cluster represents a first set of points in the point cloud. An importance weight for the first cluster is determined. A render-quality criterion for the first cluster is determined based on the importance weight. Whether the first cluster meets the render-quality criterion is determined based on a render-quality parameter for the first cluster. In response to the first cluster meeting the render-quality criterion, the location is shaded based on an indication of light emitted from the first cluster.
Type: Application
Filed: June 18, 2012
Publication date: December 19, 2013
Applicant: DreamWorks Animation LLC
Inventor: Eric TABELLION
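A sketch of the cluster traversal with made-up data structures: each cluster's importance weight here comes from its solid angle as seen from the shading location, that weight sets how strict the render-quality criterion is, and clusters that meet the criterion contribute their emitted light directly while the rest are refined into their children. All field names, the criterion formula, and the example values are assumptions.
```python
import math

def shade_location(location, clusters, base_quality=0.05):
    """Accumulate radiance at a shading location from point-cloud clusters."""
    radiance = 0.0
    for cluster in clusters:
        dx = cluster["center"][0] - location[0]
        dy = cluster["center"][1] - location[1]
        dz = cluster["center"][2] - location[2]
        dist2 = dx * dx + dy * dy + dz * dz
        solid_angle = cluster["area"] / max(dist2, 1e-8)
        importance = solid_angle                          # importance weight
        criterion = base_quality / max(importance, 1e-8)  # stricter when important
        if cluster["error"] <= criterion:                 # cluster meets the criterion
            radiance += cluster["emitted"] * solid_angle / math.pi
        else:                                             # otherwise refine the cluster
            radiance += shade_location(location, cluster["children"], base_quality)
    return radiance

clusters = [
    {"center": (0.0, 0.0, 5.0), "area": 1.0, "emitted": 2.0, "error": 0.01, "children": []},
    {"center": (3.0, 0.0, 1.0), "area": 4.0, "emitted": 1.0, "error": 0.5, "children": [
        {"center": (2.5, 0.0, 1.0), "area": 2.0, "emitted": 1.0, "error": 0.05, "children": []},
        {"center": (3.5, 0.0, 1.0), "area": 2.0, "emitted": 1.0, "error": 0.05, "children": []},
    ]},
]
print(shade_location((0.0, 0.0, 0.0), clusters))
```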