Patents Assigned to DreamWorks Animation LLC
  • Patent number: 9087406
    Abstract: Techniques for determining scaled-parallax constraints used for the placement of a pair of stereoscopic cameras within a computer-generated scene. A set of bounded-parallax constraints including a near-parallax value and a far-parallax value is obtained, along with lower-bound and upper-bound values for a range of focal lengths. Scaled near-parallax and scaled far-parallax values are calculated, the calculation depending on whether the focal length is greater than, less than, or within the range of focal lengths.
    Type: Grant
    Filed: March 13, 2013
    Date of Patent: July 21, 2015
    Assignee: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
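A minimal Python sketch of the three-branch focal-length test the 9087406 abstract describes; the proportional scaling rules used below the lower bound and above the upper bound, and the pass-through inside the range, are illustrative assumptions rather than the claimed formulas.

```python
def scale_parallax(near_parallax, far_parallax, focal_length,
                   lower_bound, upper_bound):
    """Return (scaled_near, scaled_far) for the three cases described in
    the abstract: focal length below, within, or above the bounded range
    of focal lengths."""
    if focal_length < lower_bound:
        # Assumption: scale proportionally toward the lower bound.
        factor = focal_length / lower_bound
    elif focal_length > upper_bound:
        # Assumption: scale proportionally toward the upper bound.
        factor = focal_length / upper_bound
    else:
        # Within the range: leave the bounded-parallax constraints unchanged.
        factor = 1.0
    return near_parallax * factor, far_parallax * factor


if __name__ == "__main__":
    print(scale_parallax(near_parallax=-10.0, far_parallax=25.0,
                         focal_length=35.0, lower_bound=24.0, upper_bound=70.0))
```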
  • Patent number: 9082223
    Abstract: Systems and methods for manipulating a virtual three-dimensional (3D) object in a virtual 3D space are provided. A representation of the 3D object is displayed on a display. A non-hemispherical arcball having a surface is determined. The non-hemispherical arcball is associated with the representation of the 3D object. A pointing device is detected at a first position and at a second position. The first position of the pointing device is translated onto a first location on the surface of the non-hemispherical arcball. The second position of the pointing device is translated onto a second location on the surface of the non-hemispherical arcball. A rotation of the representation of the 3D object is displayed on the display, the rotation based on a path of travel between the first location and the second location along the surface of the non-hemispherical arcball.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: July 14, 2015
    Assignee: DreamWorks Animation LLC
    Inventors: Morgwn Quin McCarty, Alexander P. Powell
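A sketch of the arcball interaction described in 9082223, assuming the well-known sphere-plus-hyperbolic-sheet blend as a stand-in non-hemispherical surface; `to_surface` and `rotation_between` are hypothetical helper names, not names from the patent.

```python
import numpy as np

RADIUS = 1.0  # arcball radius in normalized screen units

def to_surface(x, y, r=RADIUS):
    """Map a 2D pointer position onto a non-hemispherical arcball surface.
    Stand-in surface: a sphere near the center blended with a hyperbolic
    sheet farther out, so the surface is defined for every pointer
    position, not just over a hemisphere."""
    d2 = x * x + y * y
    if d2 <= r * r / 2.0:
        z = np.sqrt(r * r - d2)          # spherical part
    else:
        z = (r * r / 2.0) / np.sqrt(d2)  # hyperbolic part
    p = np.array([x, y, z])
    return p / np.linalg.norm(p)

def rotation_between(p0, p1):
    """Axis-angle rotation taking surface point p0 to p1."""
    axis = np.cross(p0, p1)
    angle = np.arccos(np.clip(np.dot(p0, p1), -1.0, 1.0))
    return axis, angle

# First and second pointer positions, translated onto the surface.
first = to_surface(0.1, 0.0)
second = to_surface(0.6, 0.4)
axis, angle = rotation_between(first, second)
print("rotate", np.degrees(angle), "degrees about axis", axis)
```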
  • Patent number: 9076262
    Abstract: A computer-implemented method for determining a user-defined stereo effect for a computer-animated film sequence. A stereo-volume value for a timeline of the film sequence is obtained, wherein the stereo-volume value represents a percentage of parallax at each time entry of the timeline. A stereo-shift value for the timeline is also obtained, wherein the stereo-shift value represents a distance across either an area associated with a sensor of a pair of stereoscopic cameras adapted to create the film sequence or a screen adapted to depict a stereoscopic image of the computer-generated scene. A script-adjusted near-parallax value and a script-adjusted far-parallax value are calculated.
    Type: Grant
    Filed: March 13, 2013
    Date of Patent: July 7, 2015
    Assignee: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
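A small sketch of obtaining per-time-entry stereo-volume and stereo-shift values for a timeline, as in 9076262; the keyed-curve representation and linear interpolation are assumptions, and the parallax adjustment itself is sketched under the 9070222 entry below.

```python
import bisect

def value_at(keys, time):
    """Linearly interpolate a keyed curve (sorted list of (time, value)
    pairs) at a given time entry. The interpolation scheme is an
    assumption; the abstract only says per-time-entry values are obtained."""
    times = [t for t, _ in keys]
    i = bisect.bisect_left(times, time)
    if i == 0:
        return keys[0][1]
    if i == len(keys):
        return keys[-1][1]
    (t0, v0), (t1, v1) = keys[i - 1], keys[i]
    return v0 + (v1 - v0) * (time - t0) / (t1 - t0)

# Stereo volume (percent of parallax) and stereo shift keyed over a shot.
stereo_volume_keys = [(0, 100.0), (48, 60.0), (96, 100.0)]
stereo_shift_keys = [(0, 0.0), (96, 2.0)]

frame = 72
print(value_at(stereo_volume_keys, frame), value_at(stereo_shift_keys, frame))
```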
  • Publication number: 20150187113
    Abstract: Systems and methods for performing MOS skin deformations are provided. In one example process, the in vector of a MOS transform may be manually configured by a user. In another example process, a slide/bulge operation may be configured to depend on two or more MOS transforms. Each of the MOS transforms may be assigned a weight that represents the transform's contribution to the overall slide/bulge. In yet another example process, a bulge operation for a MOS vertex may be performed in a direction orthogonal to the attached MOS curve regardless of the direction of the attachment vector. In yet another example process, a ghost transform may be inserted into a MOS closed curve and used to calculate skin deformations associated with the first transform of the MOS closed curve.
    Type: Application
    Filed: December 31, 2013
    Publication date: July 2, 2015
    Applicant: DreamWorks Animation LLC
    Inventors: Mark R. Rubin, Robert Lloyd Helms, Arthur D. Gregory, Peter Dean Farson, Matthew Christopher Gong, Michael Scott Hutchinson
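A hedged sketch of a slide operation driven by two weighted MOS transforms, as the 20150187113 abstract describes; the transform representation (an origin plus a unit in-vector) and the normalized blending rule are assumptions for illustration only.

```python
import numpy as np

def weighted_slide(vertex, transforms, weights, slide_amount):
    """Offset a skin vertex along a blend of the in-vectors of several MOS
    transforms, each weighted by its contribution to the slide."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()          # normalize contributions
    blended_dir = np.zeros(3)
    for (origin, in_vec), w in zip(transforms, weights):
        blended_dir += w * (in_vec / np.linalg.norm(in_vec))
    blended_dir /= np.linalg.norm(blended_dir)
    return np.asarray(vertex) + slide_amount * blended_dir

# Two transforms contribute 75% and 25% of the slide, respectively.
transforms = [
    (np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])),
    (np.array([1.0, 0.0, 0.0]), np.array([0.7, 0.7, 0.0])),
]
print(weighted_slide([0.5, 0.2, 0.0], transforms,
                     weights=[0.75, 0.25], slide_amount=0.1))
```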
  • Patent number: 9070222
    Abstract: A computer-implemented method for determining bounded-parallax constraints for the placement of a pair of stereoscopic cameras within a computer-generated scene. An initial near-parallax value is determined based on the focal length and a minimum scene depth. An initial far-parallax value is determined based on a focal length. A scaled near-parallax value and scaled far-parallax value are calculated based on the initial near-parallax value, initial far-parallax value, and a range of focal lengths. A creative near-parallax value is calculated based on a stereo-shift value and the product of a stereo-volume and the scaled near-parallax value. A creative far-parallax value is calculated based on the stereo-shift value and the product of the stereo-volume and the scaled far-parallax value. The creative near-parallax value and the creative far-parallax value are stored as the bounded-parallax constraints for the placement of the pair of stereoscopic cameras.
    Type: Grant
    Filed: March 13, 2013
    Date of Patent: June 30, 2015
    Assignee: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
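A sketch of the creative-parallax step spelled out in the 9070222 abstract; the abstract says the values are "based on" the stereo-shift and the volume-scaled parallax, and the additive combination below is an assumed reading of that.

```python
def creative_parallax(scaled_near, scaled_far, stereo_volume, stereo_shift):
    """Combine a stereo-shift value with the product of a stereo-volume and
    the scaled parallax values. The additive combination is an assumption;
    the abstract only says 'based on'."""
    creative_near = stereo_shift + stereo_volume * scaled_near
    creative_far = stereo_shift + stereo_volume * scaled_far
    return creative_near, creative_far

# Store the result as the bounded-parallax constraints for camera placement.
bounded_constraints = creative_parallax(scaled_near=-8.0, scaled_far=22.0,
                                        stereo_volume=0.8, stereo_shift=1.0)
print(bounded_constraints)
```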
  • Patent number: 9064345
    Abstract: Data representing animated hair in a computer generated imagery (CGI) scene may be compressed by treating hair data as arrays of parameters. Hair data parameters may include control vertices, hair color, hair radius, and the like. A principal component analysis (PCA) may be performed on the arrays of hair data. PCA may yield new basis vectors, varying in length, with the largest basis vector corresponding to a new dimension with the largest variance in hair data. The hair data may be quantized based on the varying lengths of new basis vectors. The number of bits allocated for quantizing each new dimension corresponding to each new basis vector may be determined based on the relative lengths of new basis vectors, with more bits allocated to dimensions corresponding to longer basis vectors. The quantized hair data may be bit-packed and then compressed using lossless entropy encoding.
    Type: Grant
    Filed: March 14, 2013
    Date of Patent: June 23, 2015
    Assignee: DreamWorks Animation LLC
    Inventor: Mark Jeffrey Matthews
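A compact NumPy sketch of the compression pipeline in 9064345: PCA via SVD, bit allocation proportional to basis-vector length, and uniform quantization. The bit-allocation heuristic, the toy data, and the omitted bit-packing/entropy-coding stage are simplifications.

```python
import numpy as np

def compress_hair(hair_params, total_bits=32, min_bits=2):
    """hair_params: (num_hairs, num_params) array of per-hair parameters
    (flattened control vertices, radius, color, ...)."""
    mean = hair_params.mean(axis=0)
    centered = hair_params - mean

    # PCA: new basis vectors, longest first (largest variance in hair data).
    _, singular_values, basis = np.linalg.svd(centered, full_matrices=False)
    coords = centered @ basis.T                     # data in the new basis

    # Allocate more quantization bits to dimensions with longer basis
    # vectors (larger singular values), fewer to the rest.
    share = singular_values / singular_values.sum()
    bits = np.maximum(min_bits, np.round(share * total_bits)).astype(int)

    # Uniform quantization per dimension.
    lo, hi = coords.min(axis=0), coords.max(axis=0)
    scale = (2 ** bits - 1) / np.maximum(hi - lo, 1e-12)
    quantized = np.round((coords - lo) * scale).astype(np.uint32)

    # Bit-packing and lossless entropy encoding (e.g. zlib) would follow here.
    return quantized, bits, (mean, basis, lo, scale)

hairs = np.random.default_rng(0).normal(size=(1000, 12))  # toy hair data
quantized, bits, _ = compress_hair(hairs)
print("bits per PCA dimension:", bits)
```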
  • Patent number: 8982157
    Abstract: To generate a skin-attached element on a skin surface of an animated character, a region of the skin surface within a predetermined distance from a skin-attached element root position is deformed to form a lofted skin according to one of a plurality of constraint surfaces, where the constraint surfaces do not intersect one another. A sublamina mesh surface constrained to the lofted skin is created. A two-dimensional version of the skin-attached element is projected onto the sublamina mesh surface. The lofted skin is then reverted to the state of the skin surface prior to the deformation.
    Type: Grant
    Filed: July 27, 2010
    Date of Patent: March 17, 2015
    Assignee: DreamWorks Animation LLC
    Inventors: Andrew J. Weber, Galen Gerald Gornowicz
  • Patent number: 8952958
    Abstract: A computer-implemented method for defining a range of bounding parameter values that satisfy perceptual constraints for a stereoscopically filmed computer-generated scene. A user selection of a bounding parameter from a set of scene parameters is received. Values for the scene parameters that were not selected as the bounding parameter are obtained. A first bounding value for the bounding parameter is calculated based on a first perceptual constraint and on the values of the scene parameters that were not selected. A second bounding value for the bounding parameter is calculated based on a second perceptual constraint and on the values of the scene parameters that were not selected. The first and second bounding values define the minimum and maximum of a range of values and are stored.
    Type: Grant
    Filed: March 18, 2013
    Date of Patent: February 10, 2015
    Assignee: DreamWorks Animation LLC
    Inventors: Matthew Low, Donald Greenberg, Philip McNally
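A sketch of deriving a bounding-parameter range from two perceptual constraints, as in 8952958, with the convergence distance chosen as the bounding parameter. The parallel-camera parallax model and the particular constraints (a far-depth divergence limit and a near-depth negative-parallax limit) are illustrative assumptions, not the claimed math.

```python
def convergence_range(interaxial, focal_length, sensor_width, screen_width,
                      near_depth, far_depth, max_neg_parallax, max_pos_parallax):
    """Min and max convergence distance given the other scene parameters.
    Uses the standard parallel-camera screen-parallax model
        P(z) = k * t * (1/c - 1/z),  k = focal_length * screen_width / sensor_width."""
    k = focal_length * screen_width / sensor_width
    kt = k * interaxial
    # Constraint 1: positive parallax at far_depth must not exceed max_pos_parallax.
    c_min = 1.0 / (max_pos_parallax / kt + 1.0 / far_depth)
    # Constraint 2: negative parallax at near_depth must not exceed max_neg_parallax.
    c_max = 1.0 / (1.0 / near_depth - max_neg_parallax / kt)
    return c_min, c_max

# Millimeter focal length and sensor, meter depths and interaxial,
# centimeter screen width and parallax limits.
print(convergence_range(interaxial=0.02, focal_length=35.0, sensor_width=36.0,
                        screen_width=1000.0, near_depth=2.0, far_depth=50.0,
                        max_neg_parallax=5.0, max_pos_parallax=6.5))
```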
  • Patent number: 8866813
    Abstract: A computer-animated scene illuminated by indirect light is shaded. The scene comprises sample locations on surface elements of objects in the scene. A point cloud representation of the scene is generated. Optionally, an importance map of the scene, based on the point cloud representation, is generated. The importance map is generated by rasterizing one or more points in the point cloud and designating areas of interest based on the energy values of those points. A ray tracing engine is biased based on the importance map. The biased ray tracing engine calculates the paths of rays from the sample locations in the scene to the areas of interest. The scene is shaded using the output from the biased ray tracing engine.
    Type: Grant
    Filed: June 30, 2011
    Date of Patent: October 21, 2014
    Assignee: DreamWorks Animation LLC
    Inventors: Chris F. Armsden, Bruce Tartaglia
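A sketch of building an importance map from point-cloud energies and using it to bias ray directions, in the spirit of 8866813; the azimuth/elevation binning and the bin-sampling scheme are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy point cloud: positions and per-point energy values (indirect light).
points = rng.uniform(-5, 5, size=(500, 3))
energy = rng.exponential(1.0, size=500)

def importance_map(sample_loc, points, energy, bins=(16, 8)):
    """Rasterize point-cloud energy into an azimuth/elevation histogram as
    seen from the shading sample location; the histogram plays the role of
    the importance map."""
    d = points - sample_loc
    az = np.arctan2(d[:, 1], d[:, 0])                      # [-pi, pi]
    el = np.arcsin(d[:, 2] / np.linalg.norm(d, axis=1))    # [-pi/2, pi/2]
    hist, _, _ = np.histogram2d(az, el, bins=bins,
                                range=[[-np.pi, np.pi], [-np.pi / 2, np.pi / 2]],
                                weights=energy)
    return hist / hist.sum()

def biased_ray_directions(imp_map, n_rays):
    """Bias ray tracing by drawing direction bins in proportion to the
    importance map, so more rays head toward areas of interest."""
    flat = imp_map.ravel()
    chosen = rng.choice(flat.size, size=n_rays, p=flat)
    return np.unravel_index(chosen, imp_map.shape)   # (az_bin, el_bin) per ray

imp = importance_map(np.zeros(3), points, energy)
az_bins, el_bins = biased_ray_directions(imp, n_rays=8)
print(list(zip(az_bins, el_bins)))
```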
  • Publication number: 20140270561
    Abstract: Data representing animated hair in a computer generated imagery (CGI) scene may be compressed by treating hair data as arrays of parameters. Hair data parameters may include control vertices, hair color, hair radius, and the like. A principal component analysis (PCA) may be performed on the arrays of hair data. PCA may yield new basis vectors, varying in length, with the largest basis vector corresponding to a new dimension with the largest variance in hair data. The hair data may be quantized based on the varying lengths of new basis vectors. The number of bits allocated for quantizing each new dimension corresponding to each new basis vector may be determined based on the relative lengths of new basis vectors, with more bits allocated to dimensions corresponding to longer basis vectors. The quantized hair data may be bit-packed and then compressed using lossless entropy encoding.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Mark Jeffrey Matthews
  • Publication number: 20140267357
    Abstract: A computer-enabled method for shading locations for use in rendering a computer-generated scene having one or more objects represented by a point cloud. The method involves selecting a shading location, selecting a set of points from the point cloud, rasterizing the points onto a raster shape positioned at the shading location, and shading the shading location. The raster shape has varying texel densities based on characteristics of the points in the point cloud, such that the texel density varies on different surfaces of the raster shape, on different areas of the same surface, or both.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: DreamWorks Animation LLC
  • Publication number: 20140267239
    Abstract: Systems and methods for rendering three-dimensional images by instancing scene description data using a hierarchy are provided. A hierarchy is accessed. The hierarchy comprises a first node and an instance node. The first node is a predecessor to a subtree of one or more nodes and the first node is associated with a first scene description data object. The instance node is a leaf of the hierarchy. The instance node has a parent node and the instance node is associated with a second scene description data object. The parent node has successor nodes other than the instance node. An instancing instruction of the instance node is read. The instancing instruction comprises information identifying the first node. An instance of the subtree of one or more nodes is merged at a location in the hierarchy of the instance node. An image is rendered based on the merged instance of the subtree.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Robert Giles Wilson, David Moore, Nick Long
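A toy sketch of the subtree-instancing idea in 20140267239: an instance node is a leaf holding an instancing instruction that names another node, and a copy of that node's subtree is merged at the instance node's location before rendering. The node class and merge routine are illustrative, not the patented data model.

```python
class Node:
    """Minimal hierarchy node: children plus optional scene description
    data (a string here) or an instancing instruction naming another node."""
    def __init__(self, name, data=None, instance_of=None):
        self.name, self.data, self.instance_of = name, data, instance_of
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

def copy_subtree(node):
    clone = Node(node.name, node.data)
    clone.children = [copy_subtree(c) for c in node.children]
    return clone

def merge_instances(node, registry):
    """Replace each instance node with a copy of the subtree rooted at the
    node named by its instancing instruction, merged at its location."""
    for i, child in enumerate(node.children):
        if child.instance_of is not None:
            node.children[i] = copy_subtree(registry[child.instance_of])
        else:
            merge_instances(child, registry)

def render(node, depth=0):
    print("  " * depth + f"{node.name}: {node.data}")
    for c in node.children:
        render(c, depth + 1)

# Build a hierarchy with a reusable "tree" asset and an instance of it.
root = Node("root")
tree = root.add(Node("tree", data="trunk + leaves geometry"))
forest = root.add(Node("forest"))
forest.add(Node("tree_instance", instance_of="tree"))

merge_instances(root, registry={"tree": tree})
render(root)
```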
  • Publication number: 20140267237
    Abstract: Systems and methods for rendering three-dimensional images using a level graph are provided. The level graph is accessed, comprising a first node, a second node, and a target node. The second and target nodes are descendants of the first node. The first node comprises first scene description data, the second node comprises first variation data, and the target node comprises second variation data. The target node is selected for computation. Target node ancestors are determined. The first node and the second node are ancestors of the target node. A linearization of the ancestors is determined, comprising an order. A scene description is initialized using the first scene description data. The first variation is applied to the scene description, based on the linearization. The second variation is applied to the scene description to produce a final scene description. An image is rendered using the final scene description.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Peter McNerney, Evan P. Smyth, Robert Giles Wilson, Greg Heflin, Jeff Beall, Jonathan Gibbs, Mike Hamler, Benoit Gagnon
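A sketch of the level-graph evaluation in 20140267237, using Python's graphlib for the linearization; representing scene descriptions and variations as dicts merged in ancestor order is an assumption made to keep the example small.

```python
from graphlib import TopologicalSorter

# Level graph: each node carries either base scene description data or
# variation data (plain dicts here) and lists its parents.
nodes = {
    "base":   {"parents": [],                 "data": {"sky": "blue", "fog": 0.1}},
    "sunset": {"parents": ["base"],           "data": {"sky": "orange"}},
    "shot12": {"parents": ["base", "sunset"], "data": {"fog": 0.4}},
}

def compute(target):
    """Initialize the scene description from the root data, then apply the
    variations of the target's ancestors (and the target itself) in a
    linearized topological order to produce the final scene description."""
    # Collect the target and all of its ancestors.
    needed, stack = set(), [target]
    while stack:
        n = stack.pop()
        if n not in needed:
            needed.add(n)
            stack.extend(nodes[n]["parents"])
    # Linearize: every node comes after all of its ancestors.
    order = TopologicalSorter({n: nodes[n]["parents"] for n in needed}).static_order()
    scene = {}
    for n in order:
        scene.update(nodes[n]["data"])   # base data first, then each variation
    return scene

print(compute("shot12"))   # {'sky': 'orange', 'fog': 0.4}
```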
  • Publication number: 20140280323
    Abstract: Systems and processes provide network clients on various platforms a customized file system experience on demand while managing files (e.g., computer animation files) across a variety of storage devices spread across a network of arbitrary size (local area networks, wide area networks, worldwide networks, the world wide web, etc.). Clients may specify a set of requirements for an instantiation of a file system interface or object for a given application. Such requirements may include storage location, file quality, capacity, scale, permanence, speed, and the like. The system may then provide to the client a customized file system interface with particular hardware resources allocated to satisfy the designated file system requirements. The file system interface may coordinate file delivery, allocation, tracking, transportation, caching, deletion, and the like. The system may manage and allocate hardware resources ranging from a local client computer to distant hard drive banks across the world.
    Type: Application
    Filed: March 13, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Michael Christian Seales
  • Publication number: 20140267249
    Abstract: Systems and processes for contouring 2D shadow characters in 3D CGI scenes are provided. A simplified drawing surface may be added to a CGI scene and displayed from a first perspective to approximate a major surface where a shadow character may be located. A drawn shadow character may be received on the simplified drawing surface. A naturally-cast reference shadow of a corresponding 3D modeled character may be provided on the drawing surface to aid artists in developing the shadow character. An image of the drawn shadow character may be captured from a second perspective at the primary light source. The simplified drawing surface and drawn shadow character may be removed from the scene. The captured shadow character image may be projected into the scene from the second perspective, contouring naturally to object surfaces. The scene, including the shadow character, may be captured from a third perspective.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Douglas Wayne Cooper, Robyn Nicholas Rindge
  • Publication number: 20140267083
    Abstract: Systems and methods for manipulating a virtual three-dimensional (3D) object in a virtual 3D space are provided. A representation of the 3D object is displayed on a display. A non-hemispherical arcball having a surface is determined. The non-hemispherical arcball is associated with the representation of the 3D object. A pointing device is detected at a first position and at a second position. The first position of the pointing device is translated onto a first location on the surface of the non-hemispherical arcball. The second position of the pointing device is translated onto a second location on the surface of the non-hemispherical arcball. A rotation of the representation of the 3D object is displayed on the display, the rotation based on a path of travel between the first location and the second location along the surface of the non-hemispherical arcball.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Morgwn Quin McCarty, Alexander P. Powell
  • Publication number: 20140267288
    Abstract: A system for partitioning a set of assets, where each asset represents a computer-generated object associated with a computer-generated scene. A dependency graph comprising a plurality of interconnected nodes, including an organizer node, is accessed. The set of assets identified by an input of at least one predicate test of the organizer node is accessed. It is determined whether the at least one predicate test can be evaluated using the set of assets. If the at least one predicate test can be evaluated, one or more partition assets are identified and passed to a next node. If the at least one predicate test cannot be evaluated, a conservative set of assets is identified and passed to the next node, wherein the conservative set of assets is the same set of assets identified by the input of the predicate test.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Evan P. Smyth
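A minimal sketch of the organizer-node behavior in 20140267288: evaluate the predicate test when possible, otherwise pass along the conservative set, i.e. the same assets the predicate test received. Modeling an unevaluable predicate as a missing dictionary key is an assumption.

```python
def partition(assets, predicate):
    """Partition a set of assets with an organizer-node-style predicate
    test, falling back to the conservative set when the test cannot be
    evaluated (here, a KeyError on a missing attribute)."""
    try:
        return {a["name"] for a in assets if predicate(a)}
    except KeyError:
        return {a["name"] for a in assets}   # conservative set

# Assets are plain dicts standing in for computer-generated objects.
assets = [
    {"name": "tree_01", "material": "bark"},
    {"name": "rock_07", "material": "granite"},
]

print(partition(assets, lambda a: a["material"] == "bark"))   # {'tree_01'}
# 'bbox' is not resolvable yet, so the conservative set is passed along.
print(partition(assets, lambda a: a["bbox"][0] > 10.0))       # both assets
```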
  • Publication number: 20140267309
    Abstract: Systems and methods for rendering an image using a render setup graph are provided. The render setup graph may be used to configure and manage lighting configuration data as well as external processes used to render the computer-generated image. The render setup graph may include a dependency graph having nodes interconnected by edges along which objects and object configuration data may be passed between nodes. The nodes may be used to provide a source of objects and object configuration data, configure visual effects of an object, partition a set of objects, call external processes, perform data routing functions within the graph, and the like. In this way, the render setup graph may advantageously be used to organize configuration data and execution of processes for rendering an image.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Robert Giles Wilson, Evan P. Smyth, Mark Lee, Max Requenes, Peter McNerney
  • Publication number: 20140267354
    Abstract: A lighting correction filter for selectively correcting lighting in computer animation is disclosed. The lighting correction filter can select a computer-generated object having one or more lighting attributes. The selected object can be a portion of an object, an entire object, a portion of a computer-generated scene, or an entire scene. The filter can then set lighting correction values for the lighting attributes of the selected object. The lighting correction values can be color values, exposure values, or both. The filter can apply the lighting correction values to the selected object's lighting attributes to effect a lighting correction in the object prior to rendering.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Stephen Bailey
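A small sketch of a lighting correction filter as described in 20140267354, applying per-channel color gains and an exposure offset to a selected object's lighting attributes before rendering; the attribute layout and the gain-times-2^exposure rule are assumptions.

```python
def apply_lighting_correction(obj, color_gain=(1.0, 1.0, 1.0), exposure=0.0):
    """Return a copy of the selected object with its lighting attribute
    corrected: each color channel is scaled by its gain and by
    2 ** exposure (exposure expressed in stops)."""
    scale = 2.0 ** exposure
    corrected = dict(obj)
    corrected["light_color"] = tuple(
        c * g * scale for c, g in zip(obj["light_color"], color_gain))
    return corrected

teapot = {"name": "teapot", "light_color": (0.8, 0.7, 0.6)}
# Warm the object slightly and brighten it by half a stop.
print(apply_lighting_correction(teapot, color_gain=(1.1, 1.0, 0.9), exposure=0.5))
```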
  • Publication number: 20140267312
    Abstract: A rail manipulator indicates the possible range(s) of movement of a part of a computer-generated character in a computer animation system. The rail manipulator obtains a model of the computer-generated character. The model may be a skeleton structure of bones connected at joints. The interconnected bones may constrain the movements of one another. When an artist selects one of the bones for movement, the rail manipulator determines the range of movement of the selected bone. The determination may be based on the position and/or the ranges of movements of other bones in the skeleton structure. The range of movement is displayed on-screen to the artist, together with the computer-generated character. In this way, the rail manipulator directly communicates to the artist the degree to which a portion of the computer-generated character can be moved, in response to the artist's selection of the portion of the computer-generated character.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Alexander P. Powell
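A sketch of computing a rail for a selected bone, as in 20140267312: the reachable arc of the bone's tip about its joint, limited by the joint's angle range, sampled as points an interface could draw alongside the character. The planar single-joint setup is assumed purely for illustration.

```python
import numpy as np

def movement_rail(joint_position, bone_length, angle_limits, samples=16):
    """Sample the on-screen rail for a selected bone: the arc its tip can
    sweep about the joint, constrained to the joint's angle limits
    (degrees)."""
    lo, hi = np.radians(angle_limits)
    angles = np.linspace(lo, hi, samples)
    return np.stack([joint_position[0] + bone_length * np.cos(angles),
                     joint_position[1] + bone_length * np.sin(angles)], axis=1)

# Forearm attached at the elbow, allowed to flex between -5 and 140 degrees.
elbow = np.array([1.0, 0.0])
rail = movement_rail(elbow, bone_length=0.8, angle_limits=(-5.0, 140.0))
print(rail.round(2))   # points to draw as the rail for the selected bone
```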