DreamWorks Patent Applications

DreamWorks Animation patent applications pending before the United States Patent and Trademark Office (USPTO).

  • Publication number: 20140267357
    Abstract: A computer-enabled method for shading locations for use in rendering a computer-generated scene having one or more objects represented by a point cloud. The method involves selecting a shading location, selecting a set of points from the point cloud, rasterizing the points onto a raster shape positioned at the shading location, and shading the shading location. The raster shape has varying texel densities based on characteristics of the points in the point cloud, such that the texel density varies across different surfaces of the raster shape, across different areas of the same surface, or both.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
  • Publication number: 20140267354
    Abstract: A lighting correction filter for selectively correcting lighting in computer animation is disclosed. The lighting correction filter can select a computer-generated object having one or more lighting attributes. The selected object can be a portion of an object, an entire object, a portion of a computer-generated scene, or an entire scene. The filter can then set lighting correction values for the lighting attributes of the selected object. The lighting correction values can be color values, exposure values, or both. The filter can apply the lighting correction values to the selected object's lighting attributes to effect a lighting correction in the object prior to rendering.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Stephen Bailey
  • Publication number: 20140267312
    Abstract: A rail manipulator indicates the possible range(s) of movement of a part of a computer-generated character in a computer animation system. The rail manipulator obtains a model of the computer-generated character. The model may be a skeleton structure of bones connected at joints. The interconnected bones may constrain the movements of one another. When an artist selects one of the bones for movement, the rail manipulator determines the range of movement of the selected bone. The determination may be based on the position and/or the ranges of movements of other bones in the skeleton structure. The range of movement is displayed on-screen to the artist, together with the computer-generated character. In this way, the rail manipulator directly communicates to the artist the degree to which a portion of the computer-generated character can be moved, in response to the artist's selection of the portion of the computer-generated character.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Alexander P. Powell
  • Publication number: 20140267237
    Abstract: Systems and methods for rendering three-dimensional images using a level graph are provided. The level graph is accessed, comprising a first node, a second node, and a target node. The second and target nodes are descendants of the first node. The first node comprises first scene description data, the second node comprises first variation data, and the target node comprises second variation data. The target node is selected for computation. Target node ancestors are determined. The first node and the second node are ancestors of the target node. A linearization of the ancestors is determined, comprising an order. A scene description is initialized using the first scene description data. The first variation is applied to the scene description, based on the linearization. The second variation is applied to the scene description to produce a final scene description. An image is rendered using the final scene description.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Peter McNerney, Evan P. Smyth, Robert Giles Wilson, Greg Heflin, Jeff Beall, Jonathan Gibbs, Mike Hamler, Benoit Gagnon
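
The evaluation described in 20140267237 is, at its core, a topological sort of the target node's ancestors followed by in-order application of their variation data. Below is a minimal sketch of that idea in Python; the dictionary-based scene description and all node and attribute names are illustrative assumptions, not structures from the filing.

```python
from graphlib import TopologicalSorter

# Hypothetical level graph: each node carries base scene data or a
# "variation" (a dict of attribute overrides); edges run parent -> child.
nodes = {
    "root":   {"scene": {"sphere.radius": 1.0, "sphere.color": "red"}},
    "second": {"variation": {"sphere.color": "blue"}},
    "target": {"variation": {"sphere.radius": 2.0}},
}
parents = {"root": [], "second": ["root"], "target": ["second"]}

def ancestors_of(node):
    """Collect every ancestor of `node` by walking parent edges."""
    seen, stack = set(), list(parents[node])
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(parents[n])
    return seen

def evaluate(target):
    """Linearize the target's ancestors, then apply variations in order."""
    wanted = ancestors_of(target) | {target}
    order = TopologicalSorter(
        {n: [p for p in parents[n] if p in wanted] for n in wanted}
    ).static_order()
    scene = {}
    for n in order:  # initialize from base data, then layer variations
        scene.update(nodes[n].get("scene", {}))
        scene.update(nodes[n].get("variation", {}))
    return scene  # final scene description, ready for rendering

print(evaluate("target"))  # {'sphere.radius': 2.0, 'sphere.color': 'blue'}
```
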
  • Publication number: 20140111441
    Abstract: A touch-sensitive surface for a computer animator to create or modify a computer-generated image includes processes for differentiating between click and drag operations. The included processes also beneficially reduce input errors. When a touch object (e.g., finger or stylus) touches the drawing table, information regarding the duration and movement of the touch is used to determine whether the touch input represents a graphical-user-interface click or a drag operation.
    Type: Application
    Filed: October 18, 2012
    Publication date: April 24, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Alexander P. Powell
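
A sketch of the click/drag discrimination in 20140111441: it reduces to two thresholds, touch duration and touch travel. The threshold values below are invented for illustration; the application does not publish concrete numbers.

```python
MAX_CLICK_SECONDS = 0.25   # hypothetical: longer touches count as drags
MAX_CLICK_TRAVEL_PX = 8.0  # hypothetical: farther travel counts as a drag

def classify_touch(duration_s, start_xy, end_xy):
    """Classify a completed touch as a GUI click or a drag operation."""
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    travel = (dx * dx + dy * dy) ** 0.5
    if duration_s <= MAX_CLICK_SECONDS and travel <= MAX_CLICK_TRAVEL_PX:
        return "click"
    return "drag"

print(classify_touch(0.10, (100, 100), (102, 101)))  # click
print(classify_touch(0.40, (100, 100), (180, 140)))  # drag
```
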
  • Publication number: 20140085312
    Abstract: Systems and processes for rendering fractures in an object are provided. In one example, a surface representation of an object may be converted into a volumetric representation of the object. The volumetric representation of the object may be divided into volumetric representations of two or more fragments. The volumetric representations of the two or more fragments may be converted into surface representations of the two or more fragments. Additional information associated with attributes of adjacent fragments may be used to convert the volumetric representations of the two or more fragments into surface representations of the two or more fragments. The surface representations of the two or more fragments may be displayed.
    Type: Application
    Filed: November 5, 2013
    Publication date: March 27, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Akash Garg, Kyle Maxwell, David Lipton
  • Publication number: 20140035917
    Abstract: A computer-implemented method for determining bounded-parallax constraints for the placement of a pair of stereoscopic cameras within a computer-generated scene. An initial near-parallax value is determined based on the focal length and a minimum scene depth. An initial far-parallax value is determined based on the focal length. A scaled near-parallax value and scaled far-parallax value are calculated based on the initial near-parallax value, initial far-parallax value, and a range of focal lengths. A creative near-parallax value is calculated based on a stereo-shift value and the product of a stereo-volume and the scaled near-parallax value. A creative far-parallax value is calculated based on the stereo-shift value and the product of the stereo-volume and the scaled far-parallax value. The creative near-parallax value and the creative far-parallax value are stored as the bounded-parallax constraints for the placement of the pair of stereoscopic cameras.
    Type: Application
    Filed: March 13, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
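
Reading the last two steps of 20140035917 literally, each creative value is the stereo-shift plus the product of the stereo-volume and the corresponding scaled value. A sketch under that reading; the function and variable names are mine, not the filing's:

```python
def creative_parallax(scaled_near, scaled_far, stereo_volume, stereo_shift):
    """Combine a stereo-volume (a parallax fraction, e.g. 0.8 for 80%) and a
    stereo-shift (a screen-space offset) with the scaled parallax bounds."""
    creative_near = stereo_shift + stereo_volume * scaled_near
    creative_far = stereo_shift + stereo_volume * scaled_far
    return creative_near, creative_far

# Example: compress the stereo volume to 80% and shift it by 2 units.
near, far = creative_parallax(scaled_near=-20.0, scaled_far=30.0,
                              stereo_volume=0.8, stereo_shift=2.0)
print(near, far)  # -14.0 26.0
```
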
  • Publication number: 20140036037
    Abstract: A computer-implemented method for placing a window object within a computer-generated scene. The computer-generated scene includes a pair of stereoscopic cameras adapted to capture an image of at least one computer-generated object and the window object. Left and right portions of the image, along its left and right edges, are obtained. The nearest computer-generated object to the pair of stereoscopic cameras within the left and right portions of the image is identified. The window object is placed between the identified computer-generated object and the stereoscopic cameras, at an offset distance from the identified computer-generated object.
    Type: Application
    Filed: March 13, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
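
The placement rule in 20140036037 can be pictured against a rendered depth image: scan narrow strips along the left and right edges for the nearest depth, then place the window object slightly closer to the cameras. The strip width and offset below are assumptions for illustration:

```python
def place_window(depth_image, strip_px=16, offset=0.5):
    """depth_image: 2D list of camera-space depths (row-major).
    Returns the depth at which to place the floating-window object."""
    nearest = float("inf")
    for row in depth_image:
        edges = row[:strip_px] + row[-strip_px:]  # left and right strips
        nearest = min(nearest, min(edges))
    return max(nearest - offset, 0.0)  # window sits in front of that object

# Toy 4x8 depth image; the nearest edge depth is 3.0, so the window is at 2.5.
img = [[3.0, 9.0, 9.0, 9.0, 9.0, 9.0, 9.0, 8.0]] * 4
print(place_window(img, strip_px=2, offset=0.5))  # 2.5
```
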
  • Publication number: 20140036039
    Abstract: A computer-implemented method for determining a user-defined stereo effect for a computer-generated scene. A set of bounded-parallax constraints including a near-parallax value and a far-parallax value is obtained. A stereo-volume value is obtained, wherein the stereo-volume value represents a percentage of parallax. A stereo-shift value is also obtained, wherein the stereo-shift value represents a distance across one of: an area associated with a camera sensor of a pair of stereoscopic cameras adapted to film the computer-generated scene; and a screen adapted to depict a stereoscopic image of the computer-generated scene. A creative near-parallax value is calculated based on the stereo-shift value, the stereo-volume, and the near-parallax value. A creative far-parallax value is also calculated based on the stereo-shift value and the product of the stereo-volume and the far-parallax value.
    Type: Application
    Filed: March 14, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
  • Publication number: 20140035918
    Abstract: Bounded-parallax constraints are determined for the placement of a pair of stereoscopic cameras within a computer-generated scene. A minimum scene depth is calculated based on the distance from the pair of cameras to a nearest point of interest in the computer-generated scene. A near-parallax value is also calculated based on the focal length and the minimum scene depth. Calculating the near-parallax value includes selecting a baseline stereo-setting entry from a set of stereo-setting entries, where each entry includes a recommended scene depth, a recommended focal length, and a recommended near-parallax value. For the selected baseline stereo-setting entry, the recommended scene depth corresponds to the minimum scene depth, and the recommended focal length corresponds to the focal length. The near-parallax value and far-parallax value are stored as the bounded-parallax constraints for the placement of the pair of stereoscopic cameras.
    Type: Application
    Filed: March 14, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
  • Publication number: 20140035931
    Abstract: Systems and processes are described below relating to evaluating a dependency graph having one or more temporally dependent variables. The temporally dependent variables may include variables that may be used to evaluate the dependency graph at a frame other than that at which the temporally dependent variable was evaluated. One example process may include tracking the temporal dirty state for each temporally dependent variable using a temporal dependency list. This list may be used to determine which frames, if any, should be reevaluated when a request to evaluate a dependency graph for a particular frame is received. This advantageously reduces the amount of time and computing resources needed to reevaluate a dependency graph.
    Type: Application
    Filed: August 2, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Alexander P. Powell, Alex M. Wells
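
A minimal sketch of the dirty tracking in 20140035931: each temporally dependent variable records the frames at which its cached value went stale, and an evaluation request recomputes only those frames. The class below is my own simplification, not the filing's data structure:

```python
class TemporalDependencyList:
    """Per-variable record of which frames hold stale cached values."""

    def __init__(self):
        self.dirty_frames = {}  # variable name -> set of stale frames

    def mark_dirty(self, variable, frame):
        self.dirty_frames.setdefault(variable, set()).add(frame)

    def mark_clean(self, variable, frame):
        self.dirty_frames.get(variable, set()).discard(frame)

    def frames_to_reevaluate(self, variable, requested_frame):
        """Stale frames at or before the request, e.g. for a variable that
        reads its own value one frame back and so needs clean history."""
        stale = self.dirty_frames.get(variable, set())
        return sorted(f for f in stale if f <= requested_frame)

deps = TemporalDependencyList()
deps.mark_dirty("hair_sim.velocity", 10)
deps.mark_dirty("hair_sim.velocity", 11)
print(deps.frames_to_reevaluate("hair_sim.velocity", 12))  # [10, 11]
```
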
  • Publication number: 20140036038
    Abstract: Techniques for determining scaled-parallax constraints used for the placement of a pair of stereoscopic cameras within a computer-generated scene. A set of bounded-parallax constraints including a near-parallax value and a far-parallax value is obtained, along with a lower-bound value and an upper-bound value for a range of focal lengths. Scaled near-parallax and scaled far-parallax values are calculated, the calculation depending on whether the focal length is greater than, less than, or within the range of focal lengths.
    Type: Application
    Filed: March 13, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
  • Publication number: 20140035922
    Abstract: Systems and processes providing a tool for visualizing parallel dependency graph evaluation in computer animation are provided. Runtime evaluation data of a parallel dependency graph may be collected, including the start time and stop time for each node in the graph. The visualization tool may process the data to generate performance visualizations as well as other analysis features. Performance visualizations may illustrate the level of concurrency over time during parallel dependency graph evaluation. Performance visualizations may be generated by graphing node blocks according to node start time and stop time as well as the level of concurrency at a given time to illustrate parallelism. Performance visualizations may enable character technical directors, character riggers, programmers, and other users to evaluate how well parallelism is expressed in parallel dependency graphs in computer animation.
    Type: Application
    Filed: March 12, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Martin Peter Watt, Brendan Duncan
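
The concurrency-over-time graph in 20140035922 falls out of a standard event sweep over the collected node timings: +1 at each node start, -1 at each stop. A sketch with invented timing data:

```python
def concurrency_profile(intervals):
    """intervals: (start, stop) times, one pair per dependency-graph node.
    Returns [(time, active_node_count)] suitable for a step plot."""
    events = []
    for start, stop in intervals:
        events.append((start, +1))
        events.append((stop, -1))
    events.sort()  # at equal times, stops (-1) sort before starts (+1)
    profile, active = [], 0
    for time, delta in events:
        active += delta
        profile.append((time, active))
    return profile

# Three node evaluations: two overlap, so concurrency peaks at 2.
runs = [(0.0, 4.0), (1.0, 3.0), (4.0, 6.0)]
print(concurrency_profile(runs))
# [(0.0, 1), (1.0, 2), (3.0, 1), (4.0, 0), (4.0, 1), (6.0, 0)]
```
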
  • Publication number: 20140035908
    Abstract: Systems and processes are described below relating to evaluating a dependency graph to render three-dimensional (3D) graphics using constraints. Two virtual 3D objects are accessed in a virtual 3D space. A constraint relationship request is received, which identifies the first object as a parent and the second object as a child. The technique verifies whether the graphs of the objects are compatible for being constrained to one another. The first object is evaluated to determine its translation, rotation, and scale. The second object is similarly evaluated based on the translation, rotation, and scale of the first object. An image is rendered depicting at least a portion of the first virtual 3D object and at least a portion of the second virtual 3D object.
    Type: Application
    Filed: August 2, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Alexander P. Powell, Esteban D. Papp, Alex M. Wells
  • Publication number: 20140035903
    Abstract: A computer-implemented method for smoothing a stereo parameter for a computer-animated film sequence. A timeline for the film sequence is obtained, the timeline comprising a plurality of time entries. A stereo parameter distribution is obtained, wherein the stereo parameter distribution comprises one stereo parameter value for at least two time entries of the plurality of time entries, and wherein the stereo parameter value corresponds to a stereo setting associated with a pair of stereoscopic cameras configured to produce a stereoscopic image of the computer-animated film sequence. Depending on a statistical measurement of the stereo parameter distribution, either a static scene parameter is calculated, or a set of smoothed parameter values is calculated.
    Type: Application
    Filed: March 13, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
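
The branch in 20140035903 (static value versus smoothed curve) hinges on an unspecified "statistical measurement". The sketch below guesses standard deviation for that measurement and a moving average for the smoother; both choices and the threshold are assumptions:

```python
from statistics import mean, pstdev

def smooth_stereo_parameter(values, spread_threshold=0.05, window=5):
    """values: one stereo-parameter sample per time entry of the timeline.
    Low spread -> a single static value; otherwise a moving average."""
    if pstdev(values) < spread_threshold:
        return [mean(values)] * len(values)   # static scene parameter
    half = window // 2
    return [mean(values[max(0, i - half):i + half + 1])
            for i in range(len(values))]      # smoothed parameter values

print(smooth_stereo_parameter([1.00, 1.01, 0.99, 1.00]))  # flat value
print(smooth_stereo_parameter([1.0, 3.0, 1.0, 3.0, 1.0]))  # averaged curve
```
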
  • Publication number: 20140036036
    Abstract: A computer-implemented method for determining a user-defined stereo effect for a computer-animated film sequence. A stereo-volume value for a timeline of the film sequence is obtained, wherein the stereo-volume value represents a percentage of parallax at each time entry of the timeline. A stereo-shift value for the timeline is also obtained, wherein the stereo-shift value represents a distance across one of: an area associated with a sensor of a pair of stereoscopic cameras adapted to create the film sequence; and a screen adapted to depict a stereoscopic image of the computer-generated scene. A script-adjusted near-parallax value and a script-adjusted far-parallax value are calculated.
    Type: Application
    Filed: March 13, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
  • Publication number: 20130335406
    Abstract: Locations are shaded for use in rendering a computer-generated scene having one or more objects represented by a point cloud. A hierarchy for the point cloud is obtained. The point cloud includes a plurality of points. The hierarchy has a plurality of clusters of points of the point cloud. A location is selected to shade. A first cluster from the plurality of clusters is selected. The first cluster represents a first set of points in the point cloud. An importance weight for the first cluster is determined. A render-quality criterion for the first cluster is determined based on the importance weight. Whether the first cluster meets the render-quality criterion is determined based on a render-quality parameter for the first cluster. In response to the first cluster meeting the render-quality criterion, the location is shaded based on an indication of light emitted from the first cluster.
    Type: Application
    Filed: June 18, 2012
    Publication date: December 19, 2013
    Applicant: DreamWorks Animation LLC
    Inventor: Eric Tabellion
  • Publication number: 20130194279
    Abstract: A system for performing graphics processing is disclosed. A dependency graph comprising interconnected nodes is accessed. Each node has output attributes and the dependency graph receives input attributes. A first list is accessed, which includes a dirty status for each dirty output attribute of the dependency graph. A second list is accessed, which associates one of the input attributes with output attributes that are affected by the one input attribute. A third list is accessed, which associates one of the output attributes with output attributes that affect the one output attribute. An evaluation request for a requested output attribute is received. A set of output attributes are selected for evaluation based on being specified in the first list as dirty and being specified in the third list as associated with the requested output attribute. The set of output attributes are evaluated.
    Type: Application
    Filed: September 6, 2012
    Publication date: August 1, 2013
    Applicant: DreamWorks Animation LLC
    Inventors: Martin Watt, Alexander P. Powell
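
A compact sketch of the three-list scheme in 20130194279: a dirty set (list one), an input-to-affected-outputs map (list two), and an output-to-contributors map (list three); evaluation touches only attributes that are both dirty and relevant to the request. All names and structures are illustrative, not from the filing:

```python
# List 1: dirty output attributes.
dirty = {"elbow.angle", "wrist.pos"}
# List 2: input attribute -> output attributes it affects.
affected_by_input = {"shoulder.rot": ["elbow.angle", "wrist.pos"]}
# List 3: output attribute -> output attributes that feed into it.
contributors = {"wrist.pos": ["elbow.angle"], "elbow.angle": []}

def touch_input(name):
    """Changing an input just dirties downstream outputs; nothing runs yet."""
    dirty.update(affected_by_input.get(name, []))

def evaluate(requested, compute, cache):
    """Evaluate `requested`, recomputing its dirty contributors first."""
    for dep in contributors.get(requested, []):
        evaluate(dep, compute, cache)
    if requested in dirty or requested not in cache:
        cache[requested] = compute(requested)
        dirty.discard(requested)
    return cache[requested]

cache = {}
print(evaluate("wrist.pos", lambda n: f"value({n})", cache), dirty)
# value(wrist.pos) set()
```
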
  • Publication number: 20130088497
    Abstract: A skin deformation system for use in computer animation is disclosed. The skin deformation system accesses the skeleton structure of a computer generated character, and accesses a user's identification of features of the skeleton structure that may affect a skin deformation. The system also accesses the user's identification of a weighting strategy. Using the identified weighting strategy and identified features of the skeleton structure, the skin deformation system determines the degree to which each feature identified by the user may influence the deformation of a skin of the computer generated character. The skin deformation system may incorporate secondary operations including bulge, slide, scale, and twist into the deformation of a skin. Information relating to a deformed skin may be stored by the skin deformation system so that the information may be used to produce a visual image for a viewer.
    Type: Application
    Filed: October 7, 2011
    Publication date: April 11, 2013
    Applicant: DreamWorks Animation LLC
    Inventors: Paul Carmen DiLorenzo, Matthew Christopher Gong, Arthur D. Gregory
  • Publication number: 20130063363
    Abstract: A drawing table for an animator to hand create or modify a computer-generated image includes a display and a fused fiber optic plate. The display is configured to display the computer-generated image on a top surface. The fused fiber optic plate of bundled optical fibers has an input surface and an output surface. The input surface is optically bonded to the top surface of the display. When the computer-generated image is displayed on the display, the fused fiber optic plate is configured to relay the computer-generated image from the input surface to the output surface.
    Type: Application
    Filed: September 9, 2011
    Publication date: March 14, 2013
    Applicant: DreamWorks Animation LLC
    Inventors: Edwin R. Leonard, Hans T. Ku
  • Publication number: 20130027407
    Abstract: An animated special effect is modeled using a fluid dynamics framework system. The system accepts volumetric data as input. Input volumetric data may represent the initial state of an animated special effect. Input volumetric data may also represent sources, sinks, external forces, and/or other influences on the animated special effect. In addition, the system accepts input parameters related to fluid dynamics modeling. The input volumes and parameters are applied to the incompressible Navier-Stokes equations as modifications to the initial state of the animated special effect, as modifications to the forcing term of a pressure equation, or in the computations of other types of forces that influence the solution. The input volumetric data may be composited with other volumetric data using a scalar blending field. The solution of the incompressible Navier-Stokes equations models the motion of the animated special effect.
    Type: Application
    Filed: July 27, 2011
    Publication date: January 31, 2013
    Applicant: DreamWorks Animation LLC
    Inventor: Ronald D. Henderson
  • Publication number: 20130002671
    Abstract: A computer-animated scene illuminated by indirect light is shaded. The scene comprises sample locations on a surface element of an object in the scene. A point cloud representation of the scene is generated. Optionally, an importance map of the scene, based on the point cloud representation, is generated. The importance map is generated by rasterizing one or more points in the point cloud and designating areas of interest based on the energy value of the one or more points in the point cloud. A ray tracing engine is biased based on the importance map. The biased ray tracing engine calculates the path of a ray from the sample locations in the scene to an area of interest. The scene is shaded using the output from the biased ray tracing engine.
    Type: Application
    Filed: June 30, 2011
    Publication date: January 3, 2013
    Applicant: DreamWorks Animation LLC
    Inventors: Chris F. Armsden, Bruce Tartaglia
  • Publication number: 20120169757
    Abstract: Embodiments relate to a computer-implemented method of providing a transition between first and second regions within a virtual scene, where the first and second regions are rendered using different methods and are connected to one another along a border line. The second region features a sharply diminishing illumination from the border line. The method includes adding an overlay of additional illumination to the first region so as to make the illumination in portions of the first region that are close to the border line similar to that of portions of the second region that are close to the border line. The method also includes shifting a position on which calculation of the illumination of the second region is based away from the first region.
    Type: Application
    Filed: March 12, 2012
    Publication date: July 5, 2012
    Applicant: DreamWorks Animation LLC
    Inventors: Bruce Nunzio Tartaglia, Doug Cooper, Pablo Valle, Michael McNeill
  • Publication number: 20120026172
    Abstract: To generate a skin-attached element on a skin surface of an animated character, a region of the skin surface within a predetermined distance from a skin-attached element root position is deformed to form a lofted skin according to one of a plurality of constraint surfaces, where each of the plurality of constraint surfaces does not intersect with each other. A sublamina mesh surface constrained to the lofted skin is created. A two-dimensional version of the skin-attached element is projected onto the sublamina mesh surface. The lofted skin is reverted back to a state of the skin surface prior to the deformation of the region of the skin surface.
    Type: Application
    Filed: July 27, 2010
    Publication date: February 2, 2012
    Applicant: DreamWorks Animation LLC
    Inventors: Andrew J. Weber, Galen Gerald Gornowicz
  • Publication number: 20120001909
    Abstract: Systems and processes for rendering fractures in an object are provided. In one example, a surface representation of an object may be converted into a volumetric representation of the object. The volumetric representation of the object may be divided into volumetric representations of two or more fragments. The volumetric representations of the two or more fragments may be converted into surface representations of the two or more fragments. Additional information associated with attributes of adjacent fragments may be used to convert the volumetric representations of the two or more fragments into surface representations of the two or more fragments. The surface representations of the two or more fragments may be displayed.
    Type: Application
    Filed: July 13, 2010
    Publication date: January 5, 2012
    Applicant: DreamWorks Animation LLC
    Inventors: Akash Garg, Kyle Maxwell, David Lipton
  • Publication number: 20110043521
    Abstract: A computer-enabled method for rendering a scene of objects representing physical objects includes projecting a first plurality of rays against a scene and aggregating a second plurality of rays that intersect a bounding volume, wherein the bounding volume encloses an object of the scene, and wherein the second plurality of rays is a portion of the first plurality of rays. The method further includes determining intersections of the second plurality of aggregated rays with the object when the number of aggregated rays exceeds a predetermined value. The method also includes rendering the scene based on the determined intersections of the rays with the object. The second plurality of rays may be aggregated in a bounding volume aggregate data structure for processing.
    Type: Application
    Filed: August 18, 2009
    Publication date: February 24, 2011
    Applicant: DreamWorks Animation LLC
    Inventor: Evan P. Smyth
  • Publication number: 20110018881
    Abstract: In rendering a computer-generated animation sequence, pieces of animation corresponding to shots of the computer-generated animation sequence are obtained. Measurements of action in the shots are obtained. Frame rates for the shots, which can differ from shot to shot, are determined based on the measurements of action. The shots are rendered based on the determined frame rates, and the rendered shots are stored with frame rate information indicating the frame rates used in rendering them.
    Type: Application
    Filed: July 27, 2009
    Publication date: January 27, 2011
    Applicant: DreamWorks Animation LLC
    Inventor: Erik Nash
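
A per-shot frame-rate choice like the one in 20110018881 can be as simple as thresholding the action measurement; the thresholds and rates below are invented for illustration:

```python
def frame_rate_for_shot(action):
    """Map a shot's measured action (normalized to [0, 1], e.g. mean pixel
    motion per frame) to a rendering frame rate. Values are hypothetical."""
    if action < 0.2:
        return 24   # quiet dialogue shot
    if action < 0.6:
        return 36   # moderate motion
    return 60       # fast action

shots = {"s010": 0.05, "s020": 0.45, "s030": 0.90}
print({shot: frame_rate_for_shot(a) for shot, a in shots.items()})
# {'s010': 24, 's020': 36, 's030': 60}
```
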
  • Publication number: 20100182326
    Abstract: In computer-enabled key frame animation, a method and associated system for rigging a character so as to provide a large range of motion with great fluidity. The rigging uses a character body that moves along a path or freely as needed. The nodes in the body and path are not physically connected but are linked for performing a particular task. This task-driven behavior of the nodes, which may allow them to re-organize themselves into different configurations in order to perform a common duty, implies a variable geometry for the entire dynamic structure. In some regard, the nodes can be said to be intelligent.
    Type: Application
    Filed: February 19, 2010
    Publication date: July 22, 2010
    Applicant: DreamWorks Animation LLC
    Inventor: Haggai Goldfarb
  • Publication number: 20090254293
    Abstract: Embodiments are directed to modifying an existing scheme for providing translucent illumination in order to take account of subsurface scattering. The color of a selected point of a translucent object can be determined using existing methods. The existing methods need not take subsurface scattering into account. Then, a contribution to the color at the selected point due to subsurface scattering may be calculated. The contribution due to subsurface scattering may be calculated based on a photon map. Embodiments of the invention also include the use of different types of photon maps. In some embodiments, a standard photon map may be used. In other embodiments, a photon map may be defined in a manner similar to a depth map. Thus, the entries of a photon map may be defined in terms of an angle from a light source and a distance between an object's surface and a light source.
    Type: Application
    Filed: April 2, 2008
    Publication date: October 8, 2009
    Applicant: DreamWorks Animation LLC
    Inventors: Bruce Nunzio Tartaglia, Alexander P. Powell
  • Publication number: 20090207176
    Abstract: The surface of a body of water can be animated by deconstructing a master wave model into several layer models and then reconstructing the layer models to form an optimized wave model. A wave model is obtained, which describes the wave surfaces in a body of water. The wave model comprises a range of wave model frequencies over a given area. Primary, secondary, and tertiary layer models are constructed based on portions of the wave model frequencies. An optimized wave model is constructed by combining the primary, secondary, and tertiary layer models. A wave surface point location is determined within the given area. A wave height value is computed for the wave surface point location using the optimized wave model. The wave height value associated with the surface point location is stored.
    Type: Application
    Filed: January 20, 2009
    Publication date: August 20, 2009
    Applicant: DreamWorks Animation LLC
    Inventor: Galen Gerald Gornowicz
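
A toy version of the layering in 20090207176: split a spectrum of wave components into frequency bands, treat each band as a layer, and sum the layers to get a height at any surface point. The sinusoidal components and band cuts below are stand-ins; the abstract does not specify the underlying wave model:

```python
import math

# Toy "master wave model": (frequency, amplitude) pairs over a wide range.
master = [(0.1, 3.0), (0.4, 1.2), (2.0, 0.5), (8.0, 0.1)]

def split_layers(model, cuts=(0.5, 4.0)):
    """Partition components into primary/secondary/tertiary bands."""
    primary = [c for c in model if c[0] < cuts[0]]
    secondary = [c for c in model if cuts[0] <= c[0] < cuts[1]]
    tertiary = [c for c in model if c[0] >= cuts[1]]
    return primary, secondary, tertiary

def height(layers, x, t=0.0):
    """Optimized wave model: sum every layer's contribution at point x."""
    return sum(amp * math.sin(freq * (x - t))
               for layer in layers for freq, amp in layer)

layers = split_layers(master)
print(round(height(layers, x=5.0), 4))  # wave height at surface point x=5
```
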
  • Publication number: 20090128561
    Abstract: A tinted color value is produced for a surface of an object in a computer generated scene. The surface is illuminated by a light source having a lighting color value associated with the light source. A first reflected color value is calculated for the surface. The first reflected color value is calculated based on an assumption that the surface is illuminated by white light rather than the lighting color value associated with the light source. A desaturated color value is computed using the first reflected color value. A tinted color value is computed by combining the desaturated color value with the lighting color value associated with the light source. The tinted color value is stored.
    Type: Application
    Filed: November 20, 2007
    Publication date: May 21, 2009
    Applicant: DreamWorks Animation LLC
    Inventors: Douglas W. Cooper, Ben H. Kwa
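
Read literally, the pipeline in 20090128561 is: shade under white light, desaturate the result, then tint by the light's color. The sketch below uses a luminance-based desaturation, which is my assumption; the abstract does not name the desaturation method:

```python
def luminance(rgb):
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 weights

def desaturate(rgb, amount):
    """Blend a color toward its own luminance (amount=1 is fully gray)."""
    y = luminance(rgb)
    return tuple(c + amount * (y - c) for c in rgb)

def tinted_color(reflected_under_white, light_color, desat_amount=0.7):
    """Desaturate the white-light shading result, then tint by the light."""
    d = desaturate(reflected_under_white, desat_amount)
    return tuple(dc * lc for dc, lc in zip(d, light_color))

surface_under_white = (0.8, 0.3, 0.2)  # shaded as if lit by white light
warm_light = (1.0, 0.85, 0.6)
print(tinted_color(surface_under_white, warm_light))
```
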
  • Publication number: 20090096803
    Abstract: Embodiments of the invention relate to rendering translucent objects. According to some embodiments, the color of a pixel of a translucent object that is not directly illuminated by a light source can be determined by decaying the illumination contributed by the light source according to a predefined decay function. The decay function may be, for example, an exponential decay function. The decay function may be evaluated based on an initial illumination contributed by the light source and a transmittance distance. In some embodiments, the initial color of the pixel is decayed instead of the illumination. Also disclosed is modifying the renderings of different regions of an object that were rendered using different methods, in order to remove sharp contrasts between these regions.
    Type: Application
    Filed: October 16, 2007
    Publication date: April 16, 2009
    Applicant: DreamWorks Animation LLC
    Inventors: Bruce Nunzio Tartaglia, Doug Cooper, Pablo Valle, Michael McNeill
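
The decay step in 20090096803 is the familiar exponential falloff with distance traveled inside the object. A sketch assuming a single decay constant per object (the abstract leaves the exact function open beyond "for example, an exponential decay function"):

```python
import math

def transmitted_illumination(initial, distance, decay_rate):
    """Decay the light entering a translucent object over the transmittance
    distance d: I = I0 * exp(-k * d)."""
    return initial * math.exp(-decay_rate * distance)

# A pixel on the far side of a 2-unit-thick object, lit at intensity 1.0.
print(round(transmitted_illumination(1.0, distance=2.0, decay_rate=0.8), 4))
# 0.2019
```
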
  • Publication number: 20090091575
    Abstract: Animating strands (such as long hair) for movies, videos, etc. is accomplished using computer graphics by use of differential algebraic equations. Each strand is subject to simulation by defining its motion path and then evaluating the dynamic forces acting on the strand. Collision detection with any objects is performed, and collision response forces are evaluated. Then, for each frame, a differential algebraic equation solver is invoked to simulate the strands.
    Type: Application
    Filed: October 4, 2007
    Publication date: April 9, 2009
    Applicant: DreamWorks Animation LLC
    Inventors: Silviu Borac, Sunil S. Hadap
  • Publication number: 20080297519
    Abstract: The present invention deforms hairs from a reference pose based on one or more of the following: magnet position and/or orientation; local reference space position (e.g., a character's head or scalp); and several profile curves and variables. In one embodiment, after an initial deformation is determined, it is refined in order to simulate collisions, control hair length, and reduce the likelihood of hairs penetrating the surface model. The deformed hairs can be rendered to create a frame. This procedure can be performed multiple times, using different inputs, to create different hair deformations. These different inputs can be generated based on interpolations of existing inputs. Frames created using these deformations can then be displayed in sequence to produce an animation. The invention can be used to animate any tubular or cylindrical structure protruding from a surface.
    Type: Application
    Filed: May 27, 2008
    Publication date: December 4, 2008
    Applicant: DreamWorks Animation LLC
    Inventors: Nicolas Scapel, Terran J. Boylan, Daniel Lee Dawson
  • Publication number: 20080278491
    Abstract: Embodiments of the present invention are directed to rendering computer graphics using an augmented direct light model which approximates the effect of indirect light in shadows. More specifically, a shadow illuminator light source is provided. The shadow illuminator light source is associated with an ordinary, or primary, light source and is used to provide illumination in areas which are in shadow with respect to the primary light source. The shadow illuminator provides illumination only to areas which are considered to be in shadow with respect to its associated light source. Thus, the shadow illuminator may be used to approximate the effects of indirect light.
    Type: Application
    Filed: May 8, 2007
    Publication date: November 13, 2008
    Applicant: DreamWorks Animation LLC
    Inventors: Bruce Nunzio Tartaglia, Philippe Denis
  • Publication number: 20080266292
    Abstract: A computer generated character is decorated with skin-attached features in computer graphics by defining a skin surface of the computer generated character. The skin surface is defined using a set of one or more connected parametric surfaces. Feature locations for the features are placed on the defined skin surface. Guide locations for guides are placed on the defined skin surface. The skin surface is partitioned into a plurality of cells. Each cell has a set of vertices. The set of vertices for each cell is a set of the guide locations. Interpolation weights are determined for the feature locations using the guide locations and the plurality of cells.
    Type: Application
    Filed: April 27, 2007
    Publication date: October 30, 2008
    Applicant: DreamWorks Animation LLC
    Inventors: Galen G. Gornowicz, Gokhan Kisacikoglu
  • Publication number: 20080266308
    Abstract: Skin-attached features are placed on a computer generated character by defining a set of placement points on at least a portion of a skin surface of the computer generated character. For each placement point, a radius is defined for the placement point. For each placement point, a density value is determined for the placement point. The density value is a sum of weighted overlaps with neighboring placement points within the radius of the placement point. The weighted overlaps are functions of the radius of the placement point. The number of placement points in the set of placement points is reduced based on the density values.
    Type: Application
    Filed: April 27, 2007
    Publication date: October 30, 2008
    Applicant: DreamWorks Animation LLC
    Inventors: Galen G. Gornowicz, Gokhan Kisacikoglu
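
A sketch of the density-driven reduction in 20080266308: each placement point's density is a sum of weighted overlaps with neighbors inside its radius, and the densest points are culled until all densities are acceptable. The linear falloff weight and greedy culling order are my assumptions:

```python
import math

def densities(points, radius):
    """Density per point: sum of neighbor overlaps within `radius`, weighted
    so closer neighbors count more (linear falloff over the radius)."""
    out = []
    for i, (xi, yi) in enumerate(points):
        d = 0.0
        for j, (xj, yj) in enumerate(points):
            if i != j:
                dist = math.hypot(xi - xj, yi - yj)
                if dist < radius:
                    d += 1.0 - dist / radius  # weighted overlap
        out.append(d)
    return out

def reduce_points(points, radius, max_density):
    """Greedily drop the densest point until every density is acceptable."""
    pts = list(points)
    while pts:
        dens = densities(pts, radius)
        worst = max(range(len(pts)), key=dens.__getitem__)
        if dens[worst] <= max_density:
            break
        pts.pop(worst)
    return pts

cluster = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.0), (5.0, 5.0)]
print(reduce_points(cluster, radius=1.0, max_density=1.0))
# [(0.0, 0.0), (0.2, 0.0), (5.0, 5.0)]
```
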
  • Publication number: 20080186707
    Abstract: In a video or other screen display apparatus, a surround to the actual screen is provided whose light output is variable, in order to compensate for the effect of ambient (room) illumination on the apparent contrast and chroma of the displayed image. The relationship between the ambient light level and the surround illumination is an inverse power function. This makes the viewer perceive that the entire room is brighter than it actually is, resulting in a desirable change in the perception of brightness. The chromaticity of the surround is also variable, to allow a match to the calibrated white point of the video display. The apparatus includes an ambient light sensor whose output signal is provided to a control system driving the illuminated surround.
    Type: Application
    Filed: February 5, 2007
    Publication date: August 7, 2008
    Applicant: DreamWorks Animation LLC
    Inventors: Hans Ku, Jonathan Egstad, John Hanashiro, Karl Rasche
  • Publication number: 20080174600
    Abstract: A method and apparatus for high quality soft shadows for area lights in cinematic lighting, for use in computer graphics such as computer-enabled animation. The method is an extension of traditional shadow maps, so it has the advantage of image-based shadow methods: the complexity of the present method is independent of geometric complexity. It uses multilayer translucent shadow maps, which can produce high quality soft shadows for scenes with extremely complex geometry, fur, and volume objects. Instead of the traditional sampling and filtering of shadow maps, the present method computes the shadow factor by ray tracing the multilayer translucent shadow map. The result is soft-edged shadows of quality similar to that achieved by conventional stochastic ray tracing, but at much lower computational cost.
    Type: Application
    Filed: February 16, 2007
    Publication date: July 24, 2008
    Applicant: DreamWorks Animation LLC
    Inventor: Feng Xie
  • Publication number: 20070270092
    Abstract: A method of animating feather elements includes: specifying initial positions for a skin surface and for feather elements; specifying positions for the skin surface at an animated time; determining a feather-ordering sequence for placing the feather elements on the skin surface; determining positions for skirt elements that provide spatial extensions for the skin surface at the animated time; determining positions for feather-proxy elements that provide spatial extensions for the feather elements at the animated time; and determining positions for the feather elements at the animated time by extracting the feather elements from the feather-proxy elements. The feather-proxy elements are determined from the skirt elements according to the feather-ordering sequence, and the feather-proxy elements satisfy a separation criterion for avoiding intersections between the feather-proxy elements.
    Type: Application
    Filed: April 21, 2006
    Publication date: November 22, 2007
    Applicant: DreamWorks Animation LLC
    Inventors: Galen Gornowicz, Andrew Weber
  • Publication number: 20070097125
    Abstract: A computer-based animation method and system for deforming animated characters (people, animals, etc.) using a volume preserving and collision resolution process for the animation. Parts of a character are deformed by a soft mesh deformation to emulate skin deformation due to flesh and muscle movement driven by the rigid body animation along the character's joints. Resolution of the interpenetration or volume loss of deformed parts is needed for smooth and realistic animation. The present method and system enable automatic collision resolution, allowing local deformations on two intersecting parts of a character so the parts no longer interpenetrate, while preserving the volume and general shape of the two parts, e.g., the character's torso and a limb.
    Type: Application
    Filed: October 28, 2005
    Publication date: May 3, 2007
    Applicant: DreamWorks Animation LLC
    Inventors: Feng Xie, Nick Foster, Peter Farson