Patents Assigned to DreamWorks Animation LLC
-
Publication number: 20110043521
Abstract: A computer-enabled method for rendering a scene of objects representing physical objects includes projecting a first plurality of rays against a scene and aggregating a second plurality of rays that intersect a bounding volume, wherein the bounding volume encloses an object of the scene, and wherein the second plurality of rays is a portion of the first plurality of rays. The method further includes determining or computing intersections of the second plurality of aggregated rays with the object when the number of the second plurality of aggregated rays exceeds a predetermined value. The method also includes rendering the scene based on the determined intersections of the rays with the object. The second plurality of rays may be aggregated in a bounding volume aggregate data structure for processing.
Type: Application
Filed: August 18, 2009
Publication date: February 24, 2011
Applicant: DreamWorks Animation LLC
Inventor: Evan P. Smyth
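The batching idea in this abstract can be sketched in a few lines: rays known to hit a bounding volume are queued in an aggregate, and the object intersection test runs over the whole batch only once the queue passes a threshold. The unit-sphere object, class names, and threshold below are illustrative assumptions, not details from the patent.

```python
import math

def intersect_unit_sphere(origin, direction):
    """Ray/unit-sphere intersection; direction assumed normalized."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - 1.0
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

class BoundingVolumeAggregate:
    def __init__(self, flush_threshold):
        self.flush_threshold = flush_threshold  # the "predetermined value"
        self.queued_rays = []
        self.hits = []                          # (ray, t) pairs found so far

    def enqueue(self, ray):
        """Aggregate a ray known to intersect this bounding volume."""
        self.queued_rays.append(ray)
        if len(self.queued_rays) >= self.flush_threshold:
            self.flush()

    def flush(self):
        """Intersect every queued ray with the enclosed object in one batch."""
        for origin, direction in self.queued_rays:
            t = intersect_unit_sphere(origin, direction)
            if t is not None:
                self.hits.append(((origin, direction), t))
        self.queued_rays.clear()
```

Deferring the per-object tests this way lets an implementation amortize the cost of paging geometry in: the object is touched once per batch rather than once per ray.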
-
Publication number: 20110018881
Abstract: In rendering a computer-generated animation sequence, pieces of animation corresponding to shots of the computer-generated animation sequence are obtained. Measurements of action in the shots are obtained. Frame rates for the shots, which can differ from shot to shot, are determined based on the measurements of action. The shots are rendered at the determined frame rates. The rendered shots are stored together with frame rate information indicating the frame rates used in rendering them.
Type: Application
Filed: July 27, 2009
Publication date: January 27, 2011
Applicant: DreamWorks Animation LLC
Inventor: Erik Nash
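A minimal sketch of the per-shot pipeline above: measure action, map it to a frame rate, and record the chosen rate with each rendered shot. The action metric, thresholds, and candidate rates here are arbitrary assumptions for illustration; the patent does not specify them.

```python
def measure_action(motion_magnitudes):
    """A simple action metric: mean per-frame motion magnitude."""
    return sum(motion_magnitudes) / len(motion_magnitudes)

def choose_frame_rate(action, thresholds=(0.2, 0.6), rates=(12, 24, 48)):
    """Low action -> lower rate (cheaper); high action -> higher rate (smoother)."""
    for threshold, rate in zip(thresholds, rates):
        if action < threshold:
            return rate
    return rates[-1]

def render_shots(shots):
    """shots: list of (per-frame motion magnitudes, duration in seconds).
    Returns the frame rate info stored with each rendered shot."""
    out = []
    for motion, duration_s in shots:
        rate = choose_frame_rate(measure_action(motion))
        out.append({"frame_rate": rate, "frames": int(rate * duration_s)})
    return out
```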
-
Patent number: 7872654
Abstract: The present invention deforms hairs from a reference pose based on one or more of the following: magnet position and/or orientation; local reference space position (e.g., a character's head or scalp); and several profile curves and variables. In one embodiment, after an initial deformation is determined, it is refined in order to simulate collisions, control hair length, and reduce the likelihood of hairs penetrating the surface model. The deformed hairs can be rendered to create a frame. This procedure can be performed multiple times, using different inputs, to create different hair deformations. These different inputs can be generated based on interpolations of existing inputs. Frames created using these deformations can then be displayed in sequence to produce an animation. The invention can be used to animate any tubular or cylindrical structure protruding from a surface.
Type: Grant
Filed: May 27, 2008
Date of Patent: January 18, 2011
Assignee: DreamWorks Animation LLC
Inventors: Nicolas Scapel, Terran Boylan, Daniel Lee Dawson
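The magnet-driven deformation can be sketched as pulling each strand vertex toward a magnet point, weighted by a distance falloff and by a profile along the strand so the root stays anchored. Both profile shapes below are illustrative stand-ins for the patent's "several profile curves and variables".

```python
import math

def deform_strand(points, magnet, strength, radius):
    """points: strand vertices root-to-tip; magnet: attractor position.
    Returns deformed vertices; a sketch, not the patented refinement steps
    (collision handling, length control, surface penetration) that follow."""
    out = []
    n = len(points)
    for i, (x, y, z) in enumerate(points):
        mx, my, mz = magnet
        d = math.dist((x, y, z), magnet)
        falloff = max(0.0, 1.0 - d / radius)   # distance profile curve
        along = i / (n - 1) if n > 1 else 1.0  # root (i=0) stays fixed
        w = strength * falloff * along
        out.append((x + w * (mx - x), y + w * (my - y), z + w * (mz - z)))
    return out
```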
-
Patent number: 7859546
Abstract: Skin-attached features are placed on a computer generated character by defining a set of placement points on at least a portion of a skin surface of the computer generated character. For each placement point, a radius is defined for the placement point. For each placement point, a density value is determined for the placement point. The density value is a sum of weighted overlaps with neighboring placement points within the radius of the placement point. The weighted overlaps are functions of the radius of the placement point. The number of placement points in the set of placement points is reduced based on the density values.
Type: Grant
Filed: April 27, 2007
Date of Patent: December 28, 2010
Assignee: DreamWorks Animation LLC
Inventors: Galen G. Gornowicz, Gokhan Kisacikoglu
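The density-based thinning can be sketched in 2D: each point's density sums a radius-dependent overlap weight over neighbors inside its radius, and the densest points are culled until all densities fall under a cap. The linear weight function and the greedy stopping rule are illustrative choices; the patent only requires that the weights be functions of the radius.

```python
import math

def density(points, radii, i):
    """Sum of weighted overlaps with neighbours inside point i's radius."""
    px, py = points[i]
    r = radii[i]
    total = 0.0
    for j, (qx, qy) in enumerate(points):
        if j == i:
            continue
        d = math.hypot(px - qx, py - qy)
        if d < r:
            total += 1.0 - d / r   # overlap weight, a function of the radius
    return total

def thin(points, radii, max_density):
    """Greedily remove the highest-density point until every density fits."""
    pts, rs = list(points), list(radii)
    while pts:
        dens = [density(pts, rs, i) for i in range(len(pts))]
        worst = max(range(len(pts)), key=dens.__getitem__)
        if dens[worst] <= max_density:
            break
        del pts[worst]
        del rs[worst]
    return pts
```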
-
Patent number: 7782324
Abstract: In computer enabled key frame animation, a method and associated system for rigging a character so as to provide a large range of motion with great fluidity of motion. The rigging uses a character body that moves along a path or freely as needed. The nodes in the body and path are not physically connected but are linked for performing a particular task. This task-driven behavior of the nodes, which may allow them to re-organize into different configurations in order to perform a common duty, implies a variable geometry for the entire dynamic structure. In this regard the nodes can be said to be intelligent.
Type: Grant
Filed: November 23, 2005
Date of Patent: August 24, 2010
Assignee: DreamWorks Animation LLC
Inventor: Haggai Goldfarb
-
Publication number: 20100182326
Abstract: In computer enabled key frame animation, a method and associated system for rigging a character so as to provide a large range of motion with great fluidity of motion. The rigging uses a character body that moves along a path or freely as needed. The nodes in the body and path are not physically connected but are linked for performing a particular task. This task-driven behavior of the nodes, which may allow them to re-organize into different configurations in order to perform a common duty, implies a variable geometry for the entire dynamic structure. In this regard the nodes can be said to be intelligent.
Type: Application
Filed: February 19, 2010
Publication date: July 22, 2010
Applicant: DreamWorks Animation LLC
Inventor: Haggai Goldfarb
-
Patent number: 7649535
Abstract: A method of animating feather elements includes: specifying initial positions for a skin surface and for feather elements; specifying positions for the skin surface at an animated time; determining a feather-ordering sequence for placing the feather elements on the skin surface; determining positions for skirt elements that provide spatial extensions for the skin surface at the animated time; determining positions for feather-proxy elements that provide spatial extensions for the feather elements at the animated time; and determining positions for the feather elements at the animated time by extracting the feather elements from the feather-proxy elements. The feather-proxy elements are determined from the skirt elements according to the feather-ordering sequence, and the feather-proxy elements satisfy a separation criterion for avoiding intersections between the feather-proxy elements.
Type: Grant
Filed: April 21, 2006
Date of Patent: January 19, 2010
Assignee: DreamWorks Animation LLC
Inventors: Galen Gerald Gornowicz, Andrew John Weber
-
Publication number: 20090254293
Abstract: Embodiments are directed to modifying an existing scheme for providing translucent illumination in order to take account of subsurface scattering. The color of a selected point of a translucent object can be determined using existing methods. The existing methods need not take subsurface scattering into account. Then, a contribution to the color at the selected point due to subsurface scattering may be calculated. The contribution due to subsurface scattering may be calculated based on a photon map. Embodiments of the invention also include the use of different types of photon maps. In some embodiments, a standard photon map may be used. In other embodiments, a photon map may be defined in a manner similar to a depth map. Thus, the entries of a photon map may be defined in terms of an angle from a light source and a distance between an object's surface and a light source.
Type: Application
Filed: April 2, 2008
Publication date: October 8, 2009
Applicant: DreamWorks Animation LLC
Inventors: Bruce Nunzio Tartaglia, Alexander P. Powell
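The additive structure described above (existing shade plus a photon-map subsurface term) can be sketched with a standard photon-map gather: photons near the shaded point contribute energy attenuated by distance through the material, normalized over the gather disc. The gather radius, exponential attenuation, and scalar colors are illustrative simplifications.

```python
import math

def sss_contribution(point, photons, gather_radius, absorption):
    """photons: list of (position, power). Returns the scattering term."""
    total = 0.0
    for pos, power in photons:
        d = math.dist(point, pos)
        if d < gather_radius:
            total += power * math.exp(-absorption * d)
    # normalise by the gather disc area, as a classic photon-map estimate does
    return total / (math.pi * gather_radius ** 2)

def shade(base_color, point, photons, gather_radius=1.0, absorption=2.0):
    """Existing translucent shade (base_color) plus the subsurface term."""
    return base_color + sss_contribution(point, photons, gather_radius, absorption)
```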
-
Publication number: 20090207176
Abstract: The surface of a body of water can be animated by deconstructing a master wave model into several layer models and then reconstructing the layer models to form an optimized wave model. A wave model is obtained, which describes the wave surfaces in a body of water. The wave model is comprised of a range of wave model frequencies over a given area. Primary, secondary, and tertiary layer models are constructed based on portions of the wave model frequencies. An optimized wave model is constructed by combining the primary, secondary, and tertiary layer models. A wave surface point location is determined within the given area. A wave height value is computed for the wave surface point location using the optimized wave model. The wave height value that is associated with the surface point location is stored.
Type: Application
Filed: January 20, 2009
Publication date: August 20, 2009
Applicant: DreamWorks Animation LLC
Inventor: Galen Gerald Gornowicz
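A minimal sketch of the layering: partition a spectral wave model's sinusoidal components into three frequency bands, then evaluate height at a point as the sum of the per-layer heights. The band edges and the 1D sinusoid representation are illustrative assumptions.

```python
import math

def split_layers(components, edges=(0.5, 2.0)):
    """components: list of (frequency, amplitude, phase). Returns the
    primary (low-frequency swells), secondary, and tertiary (fine ripples)
    layer models."""
    lo, mid = edges
    primary = [c for c in components if c[0] < lo]
    secondary = [c for c in components if lo <= c[0] < mid]
    tertiary = [c for c in components if c[0] >= mid]
    return primary, secondary, tertiary

def layer_height(layer, x, t=0.0):
    """Height contribution of one layer at position x, time t."""
    return sum(a * math.sin(f * x + p + f * t) for f, a, p in layer)

def optimized_height(layers, x, t=0.0):
    """Combined (optimized) model: sum the per-layer heights."""
    return sum(layer_height(layer, x, t) for layer in layers)
```

Splitting by band lets each layer be sampled, cached, or level-of-detail'd independently (e.g. dropping the tertiary ripples far from camera) while the combined height still matches the master model.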
-
Patent number: 7545379
Abstract: A computer-based animation method and system for deforming animated characters (people, animals, etc.) using a volume preserving and collision resolution process for the animation. Parts of a character are deformed by a soft mesh deformation to emulate skin deformation due to flesh and muscle movement driven by the rigid body animation along the character's joints. Resolution of the interpenetration or volume loss of deformed parts is needed for smooth and realistic animation. The present method and system enable automatic collision resolution, allowing local deformations on two intersecting parts of a character so the parts no longer interpenetrate, while preserving the volume and general shape of the two parts, e.g., the character's torso and a limb.
Type: Grant
Filed: October 28, 2005
Date of Patent: June 9, 2009
Assignee: DreamWorks Animation LLC
Inventors: Feng Xie, Nick Foster, Peter Farson
-
Publication number: 20090128561
Abstract: A tinted color value is produced for a surface of an object in a computer generated scene. The surface is illuminated by a light source having a lighting color value associated with the light source. A first reflected color value is calculated for the surface. The first reflected color value is calculated based on an assumption that the surface is illuminated by white light rather than the lighting color value associated with the light source. A desaturated color value is computed using the first reflected color value. A tinted color value is computed by combining the desaturated color value with the lighting color value associated with the light source. The tinted color value is stored.
Type: Application
Filed: November 20, 2007
Publication date: May 21, 2009
Applicant: DreamWorks Animation LLC
Inventors: Douglas W. Cooper, Ben H. Kwa
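The three-step pipeline above maps directly to code: shade under white light, desaturate the result, then combine with the light's color. The Rec. 709 luma coefficients and the component-wise multiply used for "combining" are reasonable assumptions, not choices stated in the abstract.

```python
def reflect_white(surface_albedo):
    """First reflected color, assuming a white light of unit intensity."""
    return surface_albedo  # (r, g, b) in [0, 1]

def desaturate(color, amount=1.0):
    """Blend each channel toward the Rec. 709 luma of the color."""
    r, g, b = color
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return tuple(c + amount * (luma - c) for c in color)

def tinted_color(surface_albedo, light_color, desat=1.0):
    """Combine the desaturated white-light shade with the light's color."""
    d = desaturate(reflect_white(surface_albedo), desat)
    return tuple(dc * lc for dc, lc in zip(d, light_color))
```

Shading under white light first means a strongly colored light cannot zero out a complementary-colored surface entirely, which is the practical point of the tinting scheme.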
-
Publication number: 20090096803
Abstract: Embodiments of the invention relate to rendering translucent objects. According to some embodiments, the color of a pixel of a translucent object that is not directly illuminated by a light source can be determined by decaying the illumination contributed by the light source according to a predefined decay function. The decay function may be, for example, an exponential decay function. The decay function may be evaluated based on an initial illumination contributed by the light source, and a transmittance distance. In some embodiments, the initial color of the pixel is decayed instead of the illumination. Also disclosed is modifying the renderings of different regions of an object which have been rendered using different methods in order to remove sharp contrasts between these regions.
Type: Application
Filed: October 16, 2007
Publication date: April 16, 2009
Applicant: DreamWorks Animation LLC
Inventors: Bruce Nunzio Tartaglia, Doug Cooper, Pablo Valle, Michael McNeill
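The exponential-decay example from the abstract is compact enough to write out: illumination at an unlit point falls off with the distance the light travels through the material. The density constant and scalar-color simplification are illustrative.

```python
import math

def decayed_illumination(initial_illumination, transmittance_distance, density=1.5):
    """Exponential decay evaluated from the initial illumination and the
    transmittance distance, per the abstract's example decay function."""
    return initial_illumination * math.exp(-density * transmittance_distance)

def shade_unlit_point(albedo, light_intensity, transmittance_distance):
    """Color of a point not directly lit: albedo times decayed light."""
    return tuple(a * decayed_illumination(light_intensity, transmittance_distance)
                 for a in albedo)
```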
-
Publication number: 20090091575
Abstract: Animating strands (such as long hair) for movies, videos, etc. is accomplished using computer graphics by use of differential algebraic equations. Each strand is subject to simulation by defining its motion path, then evaluating dynamic forces acting on the strand. Collision detection with any objects is performed, and collision response forces are evaluated. Then for each frame a differential algebraic equations solver is invoked to simulate the strands.
Type: Application
Filed: October 4, 2007
Publication date: April 9, 2009
Applicant: DreamWorks Animation LLC
Inventors: Silviu Borac, Sunil S. Hadap
-
Publication number: 20080297519
Abstract: The present invention deforms hairs from a reference pose based on one or more of the following: magnet position and/or orientation; local reference space position (e.g., a character's head or scalp); and several profile curves and variables. In one embodiment, after an initial deformation is determined, it is refined in order to simulate collisions, control hair length, and reduce the likelihood of hairs penetrating the surface model. The deformed hairs can be rendered to create a frame. This procedure can be performed multiple times, using different inputs, to create different hair deformations. These different inputs can be generated based on interpolations of existing inputs. Frames created using these deformations can then be displayed in sequence to produce an animation. The invention can be used to animate any tubular or cylindrical structure protruding from a surface.
Type: Application
Filed: May 27, 2008
Publication date: December 4, 2008
Applicant: DreamWorks Animation LLC
Inventors: Nicolas Scapel, Terran J. Boylan, Daniel Lee Dawson
-
Publication number: 20080278491
Abstract: Embodiments of the present invention are directed to rendering computer graphics using an augmented direct light model which approximates the effect of indirect light in shadows. More specifically, a shadow illuminator light source is provided. The shadow illuminator light source is associated with an ordinary, or primary, light source and is used to provide illumination in areas which are in shadow with respect to the primary light source. The shadow illuminator provides illumination only to areas which are considered to be in shadow with respect to the light source the shadow illuminator is associated with. Thus, the shadow illuminator may be used to approximate the effects of indirect light.
Type: Application
Filed: May 8, 2007
Publication date: November 13, 2008
Applicant: DreamWorks Animation LLC
Inventors: Bruce Nunzio Tartaglia, Philippe Denis
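The gating logic above reduces to a visibility test: the shadow illuminator contributes only where the primary light is occluded. The scalar intensities and the caller-supplied visibility predicate below are illustrative; a real renderer would query its shadow maps here.

```python
def shade_point(albedo, point, primary_intensity, shadow_intensity,
                visible_to_primary):
    """Augmented direct lighting: primary light where visible, the paired
    shadow-illuminator fill only where the primary light is occluded."""
    if visible_to_primary(point):
        light = primary_intensity   # directly lit: primary only
    else:
        light = shadow_intensity    # in shadow: fill light stands in for bounce
    return albedo * light
```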
-
Publication number: 20080266292
Abstract: A computer generated character is decorated with skin-attached features in computer graphics by defining a skin surface of the computer generated character. The skin surface is defined using a set of one or more connected parametric surfaces. Feature locations for the features are placed on the defined skin surface. Guide locations for guides are placed on the defined skin surface. The skin surface is partitioned into a plurality of cells. Each cell has a set of vertices. The set of vertices for each cell is a set of the guide locations. Interpolation weights are determined for the feature locations using the guide locations and the plurality of cells.
Type: Application
Filed: April 27, 2007
Publication date: October 30, 2008
Applicant: DreamWorks Animation LLC
Inventors: Galen G. Gornowicz, Gokhan Kisacikoglu
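Once a feature's cell is known, its interpolation weights come from the guide locations at that cell's vertices. Normalized inverse-distance weighting, used below, is one simple scheme; the abstract does not specify the weighting function, so treat it as an assumption.

```python
import math

def interpolation_weights(feature, cell_guides, eps=1e-9):
    """cell_guides: the guide (x, y) locations at the feature's cell
    vertices. Returns weights that sum to 1."""
    inv = [1.0 / (math.dist(feature, g) + eps) for g in cell_guides]
    total = sum(inv)
    return [w / total for w in inv]

def interpolate_attribute(weights, guide_values):
    """Blend a per-guide attribute (e.g. groom direction or length)
    onto the feature location."""
    return sum(w * v for w, v in zip(weights, guide_values))
```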
-
Publication number: 20080266308
Abstract: Skin-attached features are placed on a computer generated character by defining a set of placement points on at least a portion of a skin surface of the computer generated character. For each placement point, a radius is defined for the placement point. For each placement point, a density value is determined for the placement point. The density value is a sum of weighted overlaps with neighboring placement points within the radius of the placement point. The weighted overlaps are functions of the radius of the placement point. The number of placement points in the set of placement points is reduced based on the density values.
Type: Application
Filed: April 27, 2007
Publication date: October 30, 2008
Applicant: DreamWorks Animation LLC
Inventors: Galen G. Gornowicz, Gokhan Kisacikoglu
-
Publication number: 20080186707
Abstract: In a video or other screen display apparatus, a surround to the actual screen is provided whose light output is variable in order to compensate for the effect of ambient (room) illumination on apparent contrast and chroma of the displayed image. The relationship between ambient light level and the surround illumination is an inverse power function. This provides the effect of making the viewer perceive that the entire room is brighter than it actually is, resulting in a desirable change in his perception of brightness. Additionally, the chromaticity of the surround is variable to allow a match to the calibrated white point of the video display. The apparatus includes an ambient light sensor whose output signal is provided to a control system driving the illuminated surround.
Type: Application
Filed: February 5, 2007
Publication date: August 7, 2008
Applicant: DreamWorks Animation LLC
Inventors: Hans Ku, Jonathan Egstad, John Hanashiro, Karl Rasche
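The control law is simple: the surround's output is an inverse power function of the sensed ambient level, clamped to the surround's maximum. The scale, exponent, and clamp below are illustrative; the abstract specifies only that the mapping is an inverse power function.

```python
def surround_level(ambient, scale=1.0, exponent=0.5, max_out=1.0):
    """Map the ambient sensor reading to a surround drive level via an
    inverse power function: dimmer room -> brighter surround."""
    if ambient <= 0.0:
        return max_out                      # darkness: drive at full output
    return min(max_out, scale * ambient ** -exponent)
```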
-
Publication number: 20080174600
Abstract: A method and apparatus for high quality soft shadows for area lights in cinematic lighting for use in computer graphics, such as computer enabled animation. The method is an extension of traditional shadow maps, so it has the advantage of image-based shadow methods: the complexity of the present method is independent of geometric complexity. It uses multilayer translucent shadow maps, which can be used to produce high quality soft shadows for scenes with extremely complex geometry, fur, and volume objects. Instead of the traditional sampling and filtering of shadow maps, the present method computes the shadow factor by ray tracing the multilayer translucent shadow map. The result is soft edged shadows of quality similar to that achieved by conventional stochastic ray tracing, but at much lower computational cost.
Type: Application
Filed: February 16, 2007
Publication date: July 24, 2008
Applicant: DreamWorks Animation LLC
Inventor: Feng Xie
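The core lookup can be sketched as walking one texel's layer list: each translucent layer the light crosses before reaching the shaded depth multiplies the transmittance by one minus that layer's opacity. The (depth, opacity) layer representation is an illustrative simplification of a multilayer translucent shadow map; the patent's shadow factor additionally traces rays across texels for area-light softness.

```python
def shadow_factor(layers, depth):
    """layers: (layer_depth, opacity) pairs for one texel, sorted by depth
    from the light. Returns the fraction of light transmitted to a point
    at the given depth."""
    transmittance = 1.0
    for layer_depth, opacity in layers:
        if layer_depth >= depth:
            break                           # layers behind the point don't occlude
        transmittance *= (1.0 - opacity)
    return transmittance
```

Because occluders are stored as partial opacities rather than a single binary depth, fur and volumes attenuate light gradually instead of casting hard-edged, aliased shadows.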
-
Publication number: 20070270092
Abstract: A method of animating feather elements includes: specifying initial positions for a skin surface and for feather elements; specifying positions for the skin surface at an animated time; determining a feather-ordering sequence for placing the feather elements on the skin surface; determining positions for skirt elements that provide spatial extensions for the skin surface at the animated time; determining positions for feather-proxy elements that provide spatial extensions for the feather elements at the animated time; and determining positions for the feather elements at the animated time by extracting the feather elements from the feather-proxy elements. The feather-proxy elements are determined from the skirt elements according to the feather-ordering sequence, and the feather-proxy elements satisfy a separation criterion for avoiding intersections between the feather-proxy elements.
Type: Application
Filed: April 21, 2006
Publication date: November 22, 2007
Applicant: DreamWorks Animation LLC
Inventors: Galen Gornowicz, Andrew Weber