DreamWorks Patent Applications

Patent applications filed by DreamWorks Animation and published by the U.S. Patent and Trademark Office (USPTO).

  • Publication number: 20180025527
    Abstract: A skin deformation system for use in computer animation is disclosed. The skin deformation system accesses the skeleton structure of a computer generated character, and accesses a user's identification of features of the skeleton structure that may affect a skin deformation. The system also accesses the user's identification of a weighting strategy. Using the identified weighting strategy and identified features of the skeleton structure, the skin deformation system determines the degree to which each feature identified by the user may influence the deformation of a skin of the computer generated character. The skin deformation system may incorporate secondary operations including bulge, slide, scale and twist into the deformation of a skin. Information relating to a deformed skin may be stored by the skin deformation system so that the information may be used to produce a visual image for a viewer.
    Type: Application
    Filed: October 2, 2017
    Publication date: January 25, 2018
    Applicant: DreamWorks Animation L.L.C.
    Inventors: Paul Carmen DILORENZO, Matthew Christopher GONG, Arthur D. GREGORY
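The weighted per-feature influence described in this abstract resembles classic linear blend skinning. The sketch below is a minimal illustration of that general technique, not the patented system; the function name and the assumption that weights are precomputed per vertex are mine.

```python
import numpy as np

def blend_skin(rest_points, bone_transforms, weights):
    """Deform skin vertices by a weighted blend of bone transforms.

    rest_points:     (V, 3) vertex positions in the rest pose
    bone_transforms: list of B 4x4 matrices, one per influencing feature
    weights:         (V, B) per-vertex influence weights (rows sum to 1)
    """
    V = rest_points.shape[0]
    homog = np.hstack([rest_points, np.ones((V, 1))])  # (V, 4) homogeneous coords
    deformed = np.zeros((V, 3))
    for b, M in enumerate(bone_transforms):
        # Each feature moves every vertex; the weight scales its contribution.
        moved = (homog @ M.T)[:, :3]
        deformed += weights[:, b:b + 1] * moved
    return deformed
```

A weighting strategy, in these terms, is whatever rule fills in the `weights` matrix; secondary operations such as bulge or twist would be applied on top of this base deformation.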
  • Publication number: 20180018804
    Abstract: A rail manipulator indicates the possible range(s) of movement of a part of a computer-generated character in a computer animation system. The rail manipulator obtains a model of the computer-generated character. The model may be a skeleton structure of bones connected at joints. The interconnected bones may constrain the movements of one another. When an artist selects one of the bones for movement, the rail manipulator determines the range of movement of the selected bone. The determination may be based on the position and/or the ranges of movements of other bones in the skeleton structure. The range of movement is displayed on-screen to the artist, together with the computer-generated character. In this way, the rail manipulator directly communicates to the artist the degree to which a portion of the computer-generated character can be moved, in response to the artist's selection of the portion of the computer-generated character.
    Type: Application
    Filed: September 28, 2017
    Publication date: January 18, 2018
    Applicant: DreamWorks Animation L.L.C.
    Inventor: Alexander P. POWELL
  • Publication number: 20170316605
    Abstract: A method for generating stereoscopic images includes obtaining image data comprising a plurality of sample points. A direction, a color value, and a depth value are associated with each sample point. The directions and depth values are relative to a common origin. A mesh is generated by displacing the sample points from the origin. The sample points are displaced in the associated directions by distances representative of the corresponding depth values. The image data is mapped to the mesh such that the color values associated with the sample points are mapped to the mesh at the corresponding directions. A first image of the mesh is generated from a first perspective, and a second image of the mesh is generated from a second perspective. The first and second images of the mesh may be caused to be displayed to provide an illusion of depth.
    Type: Application
    Filed: July 14, 2017
    Publication date: November 2, 2017
    Applicant: DreamWorks Animation LLC
    Inventor: Brad Kenneth HERMAN
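The core geometric step of this abstract, displacing each sample from a common origin along its direction by its depth, can be sketched in a few lines. This is an illustrative reconstruction under my own naming; the actual meshing and two-perspective rendering are not shown.

```python
import numpy as np

def displace_samples(directions, depths, origin=np.zeros(3)):
    """Build mesh vertices by pushing each sample out from a common origin
    along its associated direction, by a distance encoding its depth value.

    directions: (N, 3) vectors from the origin (normalized here)
    depths:     (N,) depth values per sample point
    """
    dirs = directions / np.linalg.norm(directions, axis=1, keepdims=True)
    return origin + dirs * depths[:, None]
```

Rendering the resulting mesh from two horizontally offset virtual cameras then yields the left- and right-eye images whose parallax produces the illusion of depth.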
  • Publication number: 20170309253
    Abstract: A method of scheduling and performing computations for generating an interactive computer-generated animation on behalf of a client device to achieve a desired quality of service includes generating a computational configuration of computations that, when performed, produce the computer-generated animation with the desired quality of service. The configuration includes an identification of a first computation that outputs first data, a first start time for the first computation, and a first end time, where the first computation is to end before the first end time. The configuration also includes an identification of a second computation that depends on the first data, and a second start time for the second computation. The first computation is performed in response to an occurrence of the first start time and the second computation is performed in response to an occurrence of the second start time.
    Type: Application
    Filed: July 6, 2017
    Publication date: October 26, 2017
    Applicant: DreamWorks Animation LLC
    Inventor: Evan P. SMYTH
  • Publication number: 20170287197
    Abstract: Computer animation tools for viewing, in multiple contexts, the effect of changes to a computer animation are disclosed. An artist configures multiple visual displays in the user interface of a computer animation system. A visual display shows one or more frames of computer animation. An artist configures a visual display to reflect a specific context. For example, the artist may assign a particular virtual viewpoint of a scene to a particular visual display. Once visual displays are configured, the artist changes a configuration of the computer animation. For example, the artist may change the lighting parameters of a scene. In response, the visual displays show the visual effects of the configuration (e.g., lighting parameters) change under corresponding contexts (e.g., different virtual camera viewpoints). Using multiple visual displays, which may be displayed side-by-side, an artist can view the effects of her configuration changes in the various contexts.
    Type: Application
    Filed: April 12, 2017
    Publication date: October 5, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Tsuey Jin LIOU, Evan P. SMYTH, Andrew Philip PEARCE, Peter MCNERNEY
  • Publication number: 20170278290
    Abstract: Systems and processes providing a tool for visualizing parallel dependency graph evaluation in computer animation are provided. Runtime evaluation data of a parallel dependency graph may be collected, including the start time and stop time for each node in the graph. The visualization tool may process the data to generate performance visualizations as well as other analysis features. Performance visualizations may illustrate the level of concurrency over time during parallel dependency graph evaluation. Performance visualizations may be generated by graphing node blocks according to node start time and stop time as well as the level of concurrency at a given time to illustrate parallelism. Performance visualizations may enable character technical directors, character riggers, programmers, and other users to evaluate how well parallelism is expressed in parallel dependency graphs in computer animation.
    Type: Application
    Filed: June 9, 2017
    Publication date: September 28, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Martin Peter WATT, Brendan DUNCAN
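Computing the level of concurrency over time from per-node start and stop times is a standard sweep over sorted events. The following is a minimal sketch of that idea under assumed names, not the patented visualization tool itself.

```python
def concurrency_profile(intervals):
    """Given (start, stop) evaluation times for each node, return a list of
    (time, active_count) steps describing how many nodes run concurrently."""
    events = []
    for start, stop in intervals:
        events.append((start, 1))   # node begins evaluating
        events.append((stop, -1))   # node finishes evaluating
    events.sort()
    profile, active = [], 0
    for t, delta in events:
        active += delta
        if profile and profile[-1][0] == t:
            profile[-1] = (t, active)  # merge simultaneous events
        else:
            profile.append((t, active))
    return profile
```

A performance visualization would plot this step function, with higher plateaus indicating better-expressed parallelism in the dependency graph.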
  • Publication number: 20170213076
    Abstract: A method for evaluating a facial performance using facial capture of two users includes obtaining a reference set of facial performance data representing a first user's facial capture; obtaining a facial capture of a second user; extracting a second set of facial performance data based on the second user's facial capture; calculating at least one matching metric based on a comparison of the reference set of facial performance data to the second set of facial performance data; and displaying an indication of the at least one matching metric on a display.
    Type: Application
    Filed: January 17, 2017
    Publication date: July 27, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Emmanuel C. FRANCISCO, Demian GORDON, Elvin KORKUTI
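The abstract does not specify the matching metric; one plausible choice for comparing two sets of facial performance data (e.g., blendshape weights or landmark coordinates flattened into vectors) is cosine similarity. The sketch below is that assumed metric, not the patented method.

```python
import math

def matching_metric(reference, candidate):
    """Score how closely a second performer's facial data matches a
    reference capture, as cosine similarity between feature vectors;
    1.0 indicates a perfect match, 0.0 no correlation."""
    dot = sum(r * c for r, c in zip(reference, candidate))
    norm = (math.sqrt(sum(r * r for r in reference)) *
            math.sqrt(sum(c * c for c in candidate)))
    return dot / norm if norm else 0.0
```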
  • Publication number: 20170206696
    Abstract: Systems and methods for automatically animating a character based on an existing corpus of animation are described. The character may be from a previously produced feature animated film, and the data used for training may be the data used to animate the character in the film. A low-dimensional embedding for subsets of the existing animation corresponding to different semantic labels may be learned by mapping high-dimensional rig control parameters to a latent space. A particle model may be used to move within the latent space, thereby generating novel animations corresponding to the space's semantic label, such as a pose. Bridges may link a first pose of a first model within the latent space that is similar to a second pose of a second model of the space. Animations corresponding to transitions between semantic labels may be generated by creating animation paths that traverse a bridge from one model into another.
    Type: Application
    Filed: January 18, 2017
    Publication date: July 20, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Stephen BAILEY, Martin WATT, Bo MORGAN, James O'BRIEN
  • Publication number: 20170169555
    Abstract: An electronic device with a display screen provides drawing directions to guide a user to create artwork on a physical medium. The electronic device displays a first drawing direction for drawing a portion of a subject on a physical medium, and prompts a user for a user input indicating completion of the first drawing direction by the user. Upon receiving the prompted user input, the electronic device displays a second drawing direction for drawing another portion of the subject on the physical medium. The subject may be based on a computer-animated movie title. The first drawing direction may include a representation of a virtual host, which is also based on a computer-animated character from a computer-animated movie title.
    Type: Application
    Filed: November 21, 2016
    Publication date: June 15, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Scott LAROCCA, Campbell MCGROUTHER
  • Publication number: 20170098327
    Abstract: One exemplary process for animating hair includes receiving data representing a plurality of hairs and a plurality of objects in a timestep of a frame of animation. A first tree is populated to represent kinematic objects of the plurality of objects and a second tree is populated to represent dynamic objects of the plurality of objects based on the received data. A first elasticity preconditioner is created to represent internal elastic energy of the plurality of hairs based on the received data. Based on the first tree and the second tree, a first set of potential contacts is determined between two or more hairs of the plurality of hairs or between one or more hairs of the plurality of hairs and one or more objects of the plurality of objects. Positions of the plurality of hairs are determined based on the first set of potential contacts and the first elasticity preconditioner.
    Type: Application
    Filed: September 13, 2016
    Publication date: April 6, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Galen G. GORNOWICZ, Silviu BORAC
  • Publication number: 20160343167
    Abstract: A virtual reality system includes a platform, a headset, a mount, and a control unit. The headset includes a motion-sensing unit and a display unit configured to display a video of a virtual environment. The mount is positioned on the platform and configured to releasably engage the headset. While the headset is engaged with the mount, the headset is positioned in a first position. While the headset is disengaged from the mount, the headset is positioned in a second position. The control unit is connected to the headset and configured to receive first data representing the first position and associate the first position with a predetermined first perspective of the virtual environment. The control unit is also configured to receive second data representing the second position, determine a second perspective of the virtual environment corresponding to the second position, and provide video of the virtual environment from the second perspective.
    Type: Application
    Filed: May 17, 2016
    Publication date: November 24, 2016
    Applicant: DreamWorks Animation LLC
    Inventors: Brad Kenneth HERMAN, St. John COLÓN
  • Publication number: 20150331597
    Abstract: A graphical user interface (GUI) for training includes, in some embodiments, a first group of icons arranged about a first axis, where the first group of icons corresponds to computer-generated animation concepts. The GUI also includes a second group of icons arranged about a second axis that intersects the first axis at a particular icon along the first axis. The second group of icons corresponds to videos that illustrate the computer-generated animation concept associated with the particular icon on the first axis. The GUI can also include a third group of icons arranged about a third axis that intersects the first axis at another icon along the first axis. Horizontal correspondence between icons along the second and third axes indicates logical relationships between the corresponding training content.
    Type: Application
    Filed: May 15, 2014
    Publication date: November 19, 2015
    Applicant: DreamWorks Animation LLC
    Inventor: Hoyt Lee NG
  • Publication number: 20150221119
    Abstract: One exemplary process for animating hair includes receiving data representing a plurality of hairs and a plurality of objects in a timestep of a frame of animation. A first tree is populated to represent kinematic objects of the plurality of objects and a second tree is populated to represent dynamic objects of the plurality of objects based on the received data. A first elasticity preconditioner is created to represent internal elastic energy of the plurality of hairs based on the received data. Based on the first tree and the second tree, a first set of potential contacts is determined between two or more hairs of the plurality of hairs or between one or more hairs of the plurality of hairs and one or more objects of the plurality of objects. Positions of the plurality of hairs are determined based on the first set of potential contacts and the first elasticity preconditioner.
    Type: Application
    Filed: February 3, 2014
    Publication date: August 6, 2015
    Applicant: DREAMWORKS ANIMATION LLC
    Inventors: Galen G. GORNOWICZ, Silviu BORAC
  • Publication number: 20150187113
    Abstract: Systems and methods for performing MOS skin deformations are provided. In one example process, the in vector of a MOS transform may be manually configured by a user. In another example process, a slide/bulge operation may be configured to depend on two or more MOS transforms. Each of the MOS transforms may be assigned a weight that represents the transform's contribution to the overall slide/bulge. In yet another example process, a bulge operation for a MOS vertex may be performed in a direction orthogonal to the attached MOS curve regardless of the direction of the attachment vector. In yet another example process, a ghost transform may be inserted into a MOS closed curve and used to calculate skin deformations associated with the first transform of the MOS closed curve.
    Type: Application
    Filed: December 31, 2013
    Publication date: July 2, 2015
    Applicant: DreamWorks Animation LLC
    Inventors: Mark R. Rubin, Robert Lloyd Helms, Arthur D. Gregory, Peter Dean Farson, Matthew Christopher Gong, Michael Scott Hutchinson
  • Publication number: 20140270561
    Abstract: Data representing animated hair in a computer generated imagery (CGI) scene may be compressed by treating hair data as arrays of parameters. Hair data parameters may include control vertices, hair color, hair radius, and the like. A principal component analysis (PCA) may be performed on the arrays of hair data. PCA may yield new basis vectors, varying in length, with the largest basis vector corresponding to a new dimension with the largest variance in hair data. The hair data may be quantized based on the varying lengths of new basis vectors. The number of bits allocated for quantizing each new dimension corresponding to each new basis vector may be determined based on the relative lengths of new basis vectors, with more bits allocated to dimensions corresponding to longer basis vectors. The quantized hair data may be bit-packed and then compressed using lossless entropy encoding.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Mark Jeffrey MATTHEWS
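The variance-driven bit allocation this abstract describes can be illustrated with PCA via SVD: dimensions with longer basis vectors (larger variance) receive more quantization bits. This is a simplified sketch under my own allocation rule (bits proportional to each singular value's share), not the patented codec.

```python
import numpy as np

def pca_bit_allocation(hair_data, total_bits=32):
    """Run PCA on an array of per-hair parameter vectors and split a bit
    budget across the new dimensions, giving more quantization bits to
    dimensions with larger variance (longer basis vectors)."""
    centered = hair_data - hair_data.mean(axis=0)
    # Singular values are proportional to the standard deviation along
    # each principal axis of the hair data.
    _, s, _ = np.linalg.svd(centered, full_matrices=False)
    share = s / s.sum()
    # At least one bit per dimension, more for higher-variance axes.
    return np.maximum(1, np.round(share * total_bits)).astype(int)
```

After quantization, the bit-packed data would be passed to a lossless entropy coder as the abstract describes.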
  • Publication number: 20140267252
    Abstract: Using curves to emulate soft body deformation in a computer-generated character is disclosed. A method can include accessing a reference model mapped to one or more deformation curves for the character. The reference model can include a mesh of vertices representing a soft body layer of the character. The deformation curve can include multiple sample points selected for mapping. Each mesh vertex on the model can be mapped to each sample point on the curve to establish a relationship between them for deformation. The method can also include receiving a movement of one or more sample points on the curve to a desired deformation position. The method can further include calculating primary and secondary movements of the mesh vertices on the model based on the movements of sample points. The method can move the mesh vertices as calculated to a desired deformation position and output the reference model with the moved vertices for rendering to emulate the soft body deformation of the character.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DREAMWORKS ANIMATION LLC
    Inventors: Michael HUTCHINSON, Guido ZIMMERMANN, Robert HELMS
  • Publication number: 20140267357
    Abstract: A computer-enabled method for shading locations for use in rendering a computer-generated scene having one or more objects represented by a point cloud. The method involves selecting a shading location, selecting a set of points from the point cloud, rasterizing the points onto a raster shape positioned at the shading location, where the raster shape has varying texel densities that are based on characteristics of the points in the point cloud, such that the texel density varies on different surfaces of the raster shape or on different areas of the same surface or both, and shading the shading location.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: DreamWorks Animation LLC
  • Publication number: 20140280323
    Abstract: Systems and processes provide network clients on various platforms a customized file system experience on demand while managing files (e.g., computer animation files) across a variety of storage devices spread across a network of arbitrary size (local area networks, wide area networks, worldwide networks, the world wide web, etc.). Clients may specify a set of requirements for an instantiation of a file system interface or object for a given application. Such requirements may include storage location, file quality, capacity, scale, permanence, speed, and the like. The system may then provide to the client a customized file system interface with particular hardware resources allocated to satisfy the designated file system requirements. The file system interface may coordinate file delivery, allocation, tracking, transportation, caching, deletion, and the like. The system may manage and allocate hardware resources ranging from a local client computer to distant hard drive banks across the world.
    Type: Application
    Filed: March 13, 2013
    Publication date: September 18, 2014
    Applicant: DREAMWORKS ANIMATION LLC
    Inventor: Michael Christian SEALES
  • Publication number: 20140267291
    Abstract: Preservation and reuse of intermediate data generated in a render setup graph for computer animation is disclosed. A processing node in the graph can generate intermediate data and, rather than send it directly to a downstream node in the graph, preserve it for reuse during subsequent processing. As a result, a downstream processing node can reuse the preserved intermediate data, rather than wait while the intermediate data is generated by the processing node in realtime. An intermediate data file management module can manage this process by storing the generated intermediate data in a file for preservation, retrieving the stored intermediate data from the file for reuse, optimizing the file storage location for speed and efficiency, and facilitating sharing of the intermediate data during collaboration between users.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DREAMWORKS ANIMATION LLC
    Inventors: Evan P. SMYTH, Peter MCNERNEY
  • Publication number: 20140267239
    Abstract: Systems and methods for rendering three-dimensional images by instancing scene description data using a hierarchy are provided. A hierarchy is accessed. The hierarchy comprises a first node and an instance node. The first node is a predecessor to a subtree of one or more nodes and the first node is associated with a first scene description data object. The instance node is a leaf of the hierarchy. The instance node has a parent node and the instance node is associated with a second scene description data object. The parent node has successor nodes other than the instance node. An instancing instruction of the instance node is read. The instancing instruction comprises information identifying the first node. An instance of the subtree of one or more nodes is merged at a location in the hierarchy of the instance node. An image is rendered based on the merged instance of the subtree.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DREAMWORKS ANIMATION LLC
    Inventors: Robert Giles WILSON, David MOORE, Nick LONG
  • Publication number: 20140267307
    Abstract: Computer animation tools for viewing, in multiple contexts, the effect of changes to a computer animation are disclosed. An artist configures multiple visual displays in the user interface of a computer animation system. A visual display shows one or more frames of computer animation. An artist configures a visual display to reflect a specific context. For example, the artist may assign a particular virtual viewpoint of a scene to a particular visual display. Once visual displays are configured, the artist changes a configuration of the computer animation. For example, the artist may change the lighting parameters of a scene. In response, the visual displays show the visual effects of the configuration (e.g., lighting parameters) change under corresponding contexts (e.g., different virtual camera viewpoints). Using multiple visual displays, which may be displayed side-by-side, an artist can view the effects of her configuration changes in the various contexts.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Tsuey Jin Liou, Evan P. Smyth, Andrew Phillip Pearce, Peter McNerney
  • Publication number: 20140267237
    Abstract: Systems and methods for rendering three-dimensional images using a level graph are provided. The level graph is accessed, comprising a first node, a second node, and a target node. The second and target nodes are descendants of the first node. The first node comprises first scene description data, the second node comprises first variation data, and the target node comprises second variation data. The target node is selected for computation. Target node ancestors are determined. The first node and the second node are ancestors of the target node. A linearization of the ancestors is determined, comprising an order. A scene description is initialized using the first scene description data. The first variation is applied to the scene description, based on the linearization. The second variation is applied to the scene description to produce a final scene description. An image is rendered using the final scene description.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DREAMWORKS ANIMATION LLC
    Inventors: Peter MCNERNEY, Evan P. SMYTH, Robert Giles WILSON, Greg HEFLIN, Jeff BEALL, Jonathan GIBBS, Mike HAMLER, Benoit GAGNON
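The linearize-then-apply flow in this abstract can be sketched for the simple case where the level graph is a tree (each node has one parent); the patent's graph may be more general. All names here are illustrative.

```python
def linearize_ancestors(parents, target):
    """Return the target node's ancestors in root-to-leaf order, so
    variations are applied from the base scene description downward.

    parents: dict mapping each node to its parent (root maps to None)."""
    chain = []
    node = parents.get(target)
    while node is not None:
        chain.append(node)
        node = parents.get(node)
    return list(reversed(chain))

def build_scene(base, variations, parents, target):
    """Initialize the scene from the root's description, then overlay each
    ancestor's variation data in linearized order, ending with the target."""
    scene = dict(base)
    for node in linearize_ancestors(parents, target) + [target]:
        scene.update(variations.get(node, {}))
    return scene
```

The final dictionary plays the role of the "final scene description" handed to the renderer.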
  • Publication number: 20140267249
    Abstract: Systems and processes for contouring 2D shadow characters in 3D CGI scenes are provided. A simplified drawing surface may be added to a CGI scene and displayed from a first perspective to approximate a major surface where a shadow character may be located. A drawn shadow character may be received on the simplified drawing surface. A naturally-cast reference shadow of a corresponding 3D modeled character may be provided on the drawing surface to aid artists in developing the shadow character. An image of the drawn shadow character may be captured from a second perspective at the primary light source. The simplified drawing surface and drawn shadow character may be removed from the scene. The captured shadow character image may be projected into the scene from the second perspective, contouring naturally to object surfaces. The scene, including the shadow character, may be captured from a third perspective.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Douglas Wayne COOPER, Robyn Nicholas RINDGE
  • Publication number: 20140267083
    Abstract: Systems and methods for manipulating a virtual three-dimensional (3D) object in a virtual 3D space are provided. A representation of the 3D object is displayed on a display. A non-hemispherical arcball having a surface is determined. The non-hemispherical arcball is associated with the representation of the 3D object. A pointing device is detected at a first position and at a second position. The first position of the pointing device is translated onto a first location on the surface of the non-hemispherical arcball. The second position of the pointing device is translated onto a second location on the surface of the non-hemispherical arcball. A rotation of the representation of the 3D object is displayed on the display, the rotation based on a path of travel between the first location and the second location along the surface of the non-hemispherical arcball.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DREAMWORKS ANIMATION LLC
    Inventors: Morgwn Quin MCCARTY, Alexander P. POWELL
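One well-known non-hemispherical arcball surface is Bell's virtual trackball, a sphere near the center blended into a hyperbolic sheet farther out, so the mapping stays defined when the pointer leaves the sphere's silhouette. The sketch below uses that surface as an assumed example; the patent's surface may differ.

```python
import math

def to_arcball_surface(x, y, radius=1.0):
    """Map a 2D pointer position to a 3D point on a non-hemispherical
    arcball surface: spherical near the center, hyperbolic outside."""
    d2 = x * x + y * y
    r2 = radius * radius
    if d2 <= r2 / 2.0:
        z = math.sqrt(r2 - d2)           # on the sphere
    else:
        z = r2 / (2.0 * math.sqrt(d2))   # on the hyperbolic sheet
    return (x, y, z)

def rotation_axis_angle(p1, p2):
    """Rotation taking surface point p1 to p2: axis from the cross
    product, angle from the dot product of the normalized points."""
    ax = (p1[1] * p2[2] - p1[2] * p2[1],
          p1[2] * p2[0] - p1[0] * p2[2],
          p1[0] * p2[1] - p1[1] * p2[0])
    n1 = math.sqrt(sum(c * c for c in p1))
    n2 = math.sqrt(sum(c * c for c in p2))
    cosang = max(-1.0, min(1.0, sum(a * b for a, b in zip(p1, p2)) / (n1 * n2)))
    return ax, math.acos(cosang)
```

Applying the resulting axis-angle rotation to the displayed 3D object reproduces the drag-to-rotate interaction the abstract describes.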
  • Publication number: 20140267352
    Abstract: A system and method for computing a rendered image of a computer-generated object in a computer-generated scene. A dependency graph is accessed, the dependency graph including a plurality of interconnected nodes including a look-selector node. An asset is accessed at an input to the look-selector node. The asset includes a plurality of looks for the computer-generated object, each look of the plurality of looks corresponding to a different visual appearance of the computer-generated object. At the look-selector node, an active look is selected from the plurality of looks. The active look is passed to a next node of the dependency graph. The rendered image of the computer-generated object is computed having a visual appearance that corresponds to the active look.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: DREAMWORKS ANIMATION LLC
    Inventor: Evan P. SMYTH
  • Publication number: 20140267309
    Abstract: Systems and methods for rendering an image using a render setup graph are provided. The render setup graph may be used to configure and manage lighting configuration data as well as external processes used to render the computer-generated image. The render setup graph may include a dependency graph having nodes interconnected by edges along which objects and object configuration data may be passed between nodes. The nodes may be used to provide a source of objects and object configuration data, configure visual effects of an object, partition a set of objects, call external processes, perform data routing functions within the graph, and the like. In this way, the render setup graph may advantageously be used to organize configuration data and execution of processes for rendering an image.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Robert Giles WILSON, Evan P. Smyth, Mark Lee, Max Requenes, Peter McNerney
  • Publication number: 20140267354
    Abstract: A lighting correction filter for selectively correcting lighting in computer animation is disclosed. The lighting correction filter can select a computer-generated object having one or more lighting attributes. The selected object can be a portion of an object, an entire object, a portion of a computer-generated scene, or an entire scene. The filter can then set lighting correction values for the lighting attributes of the selected object. The lighting correction values can be color values, exposure values, or both. The filter can apply the lighting correction values to the selected object's lighting attributes to effect a lighting correction in the object prior to rendering.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DREAMWORKS ANIMATION LLC
    Inventor: Stephen BAILEY
  • Publication number: 20140267312
    Abstract: A rail manipulator indicates the possible range(s) of movement of a part of a computer-generated character in a computer animation system. The rail manipulator obtains a model of the computer-generated character. The model may be a skeleton structure of bones connected at joints. The interconnected bones may constrain the movements of one another. When an artist selects one of the bones for movement, the rail manipulator determines the range of movement of the selected bone. The determination may be based on the position and/or the ranges of movements of other bones in the skeleton structure. The range of movement is displayed on-screen to the artist, together with the computer-generated character. In this way, the rail manipulator directly communicates to the artist the degree to which a portion of the computer-generated character can be moved, in response to the artist's selection of the portion of the computer-generated character.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Alexander P. Powell
  • Publication number: 20140267277
    Abstract: Systems and methods for rendering three-dimensional images using a render setup graph are provided. A dependency graph is accessed. The dependency graph comprises a plurality of supplier nodes, a multiplexer node, and a plurality of graphlet nodes. The plurality of supplier nodes is accessed. The supplier nodes each have an output of a first type. These outputs are connected to the multiplexer node. A graphlet is accessed. The graphlet comprises the plurality of graphlet nodes. An output of the multiplexer node connects to the graphlet by connecting to an input of one node of the plurality of graphlet nodes. The multiplexer is configured to generate an instance of the graphlet for each supplier node connected to the multiplexer node. An image is rendered utilizing the accessed graphlet.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Peter MCNERNEY, Evan P. Smyth
  • Publication number: 20140267344
    Abstract: Systems and methods for partitioning a set of animation objects using a node in a render setup graph are provided. The render setup graph may be used to configure and manage lighting configuration data as well as external processes used to render the computer-generated image. The render setup graph may include a dependency graph having nodes interconnected by edges along which objects and object configuration data may be passed between nodes. The nodes may be used to provide a source of objects and object configuration data, configure visual effects of an object, partition a set of objects, call external processes, perform data routing functions within the graph, and the like. The objects can be partitioned based on attributes of the objects and associated configuration data. In this way, the render setup graph may advantageously be used to organize configuration data and execution of processes for rendering an image.
    Type: Application
    Filed: March 15, 2013
    Publication date: September 18, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Robert Giles WILSON, Evan P. Smyth, Mark Lee, Max Requenes, Peter McNerney
  • Publication number: 20140267288
    Abstract: A system for partitioning a set of assets, where each asset represents a computer-generated object associated with a computer-generated scene. A dependency graph comprising a plurality of interconnected nodes including an organizer node is accessed. The set of assets identified by an input of a predicate test of the organizer node is accessed. It is determined whether the at least one predicate test can be evaluated using the set of assets. If the at least one predicate test can be evaluated, one or more partition assets are identified and passed to a next node. If the at least one predicate test cannot be evaluated, a conservative set of assets is identified and passed to the next node, wherein the conservative set of assets is the same set of assets identified by the input of the predicate test.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: DREAMWORKS ANIMATION LLC
    Inventor: Evan P. SMYTH
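    The organizer node's evaluable-versus-conservative behavior can be sketched as follows. The `organizer_partition` name and the dictionary asset shape are illustrative assumptions, not the patented implementation.

    ```python
    def organizer_partition(assets, predicate, can_evaluate):
        """If the predicate test can be evaluated for the incoming assets,
        pass the matching partition to the next node; otherwise pass the
        conservative set -- the same assets the predicate's input identified."""
        if can_evaluate(assets):
            return [asset for asset in assets if predicate(asset)]
        return list(assets)

    assets = [{"name": "rock", "visible": True},
              {"name": "tree", "visible": False}]
    is_visible = lambda a: a["visible"]
    evaluable = lambda s: all("visible" in a for a in s)
    visible_assets = organizer_partition(assets, is_visible, evaluable)
    ```

    When an asset lacks the attribute the predicate needs, the whole input set flows downstream unchanged, which is safe (conservative) for later nodes.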
  • Publication number: 20140111441
    Abstract: A touch-sensitive surface for a computer animator to create or modify a computer-generated image includes processes for differentiating between click and drag operations. The included processes also beneficially reduce input errors. When a touch object (e.g., finger or stylus) touches the drawing table, information regarding the duration of the touch and the movement of the touch are used to determine whether the touch input represents a graphical user interface click or a drag operation.
    Type: Application
    Filed: October 18, 2012
    Publication date: April 24, 2014
    Applicant: DreamWorks Animation LLC
    Inventor: Alexander P. POWELL
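    The duration-plus-movement test the abstract describes reduces to a small classifier. The threshold values here are illustrative assumptions, not values from the patent.

    ```python
    def classify_touch(duration_s, distance_px,
                       max_click_time=0.25, max_click_move=5.0):
        """Classify a touch as a click or a drag from its duration and
        total movement: a short, nearly stationary touch is a click."""
        if duration_s <= max_click_time and distance_px <= max_click_move:
            return "click"
        return "drag"
    ```

    Requiring both a short duration and small movement is what suppresses accidental drags from a finger that wobbles slightly during a tap.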
  • Publication number: 20140085312
    Abstract: Systems and processes for rendering fractures in an object are provided. In one example, a surface representation of an object may be converted into a volumetric representation of the object. The volumetric representation of the object may be divided into volumetric representations of two or more fragments. The volumetric representations of the two or more fragments may be converted into surface representations of the two or more fragments. Additional information associated with attributes of adjacent fragments may be used to convert the volumetric representations of the two or more fragments into surface representations of the two or more fragments. The surface representations of the two or more fragments may be displayed.
    Type: Application
    Filed: November 5, 2013
    Publication date: March 27, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Akash GARG, Kyle MAXWELL, David LIPTON
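    The middle step of the pipeline above (dividing a volumetric representation into fragments) can be sketched with a voxel set and a cutting plane. The `fracture` name is hypothetical, and the surface-to-volume and volume-to-surface conversions are omitted.

    ```python
    def fracture(voxels, plane_x):
        """Divide a volumetric (voxel-set) representation into two fragment
        volumes at the cutting plane x = plane_x; converting each fragment
        back to a surface mesh would follow and is omitted here."""
        left = {v for v in voxels if v[0] < plane_x}
        return left, voxels - left

    solid = {(x, 0, 0) for x in range(4)}
    left_fragment, right_fragment = fracture(solid, 1.5)
    ```

    Because the split is performed on the volume rather than the surface, the two fragments partition the original solid exactly, with no lost or duplicated material.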
  • Publication number: 20140036038
    Abstract: Techniques for determining scaled-parallax constraints used for the placement of a pair of stereoscopic cameras within a computer-generated scene. A set of bounded-parallax constraints including a near-parallax value and a far-parallax value is also obtained along with a lower-bound value and upper-bound value for a range of focal lengths. Scaled near-parallax and scaled far-parallax values are calculated, the calculation depending on whether the focal length is greater than, less than, or within the range of focal lengths.
    Type: Application
    Filed: March 13, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
  • Publication number: 20140035931
    Abstract: Systems and processes are described below relating to evaluating a dependency graph having one or more temporally dependent variables. The temporally dependent variables may include variables that may be used to evaluate the dependency graph at a frame other than that at which the temporally dependent variable was evaluated. One example process may include tracking the temporal dirty state for each temporally dependent variable using a temporal dependency list. This list may be used to determine which frames, if any, should be reevaluated when a request to evaluate a dependency graph for a particular frame is received. This advantageously reduces the amount of time and computing resources needed to reevaluate a dependency graph.
    Type: Application
    Filed: August 2, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Alexander P. Powell, Alex M. Wells
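    The temporal dependency list the abstract describes can be sketched as a per-variable set of dirty frames. The class and method names are hypothetical; the idea is only that an evaluation request consults the list to find which frames actually need recomputing.

    ```python
    class TemporalDirtyList:
        """Track, per temporally dependent variable, which frames are
        dirty, so only those frames are reevaluated on a new request."""
        def __init__(self):
            self._dirty = {}

        def mark_dirty(self, variable, frame):
            self._dirty.setdefault(variable, set()).add(frame)

        def frames_to_reevaluate(self, variable):
            """Frames whose cached results must be recomputed."""
            return sorted(self._dirty.get(variable, ()))

        def mark_clean(self, variable, frames):
            self._dirty.get(variable, set()).difference_update(frames)

    tracker = TemporalDirtyList()
    tracker.mark_dirty("hair_sim", 3)
    tracker.mark_dirty("hair_sim", 7)
    ```

    Variables with no dirty frames cost nothing on a request, which is the time and resource saving the abstract claims.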
  • Publication number: 20140035917
    Abstract: A computer-implemented method for determining bounded-parallax constraints for the placement of a pair of stereoscopic cameras within a computer-generated scene. An initial near-parallax value is determined based on the focal length and a minimum scene depth. An initial far-parallax value is determined based on a focal length. A scaled near-parallax value and scaled far-parallax value are calculated based on the initial near-parallax value, initial far-parallax value, and a range of focal lengths. A creative near-parallax value is calculated based on a stereo-shift value and the product of a stereo-volume and the scaled near-parallax value. A creative far-parallax value is calculated based on the stereo-shift value and the product of the stereo-volume and the scaled far-parallax value. The creative near-parallax value and the creative far-parallax value are stored as the bounded-parallax constraints for the placement of the pair of stereoscopic cameras.
    Type: Application
    Filed: March 13, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip MCNALLY, Matthew LOW
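    The creative-parallax step above states each value is based on a stereo-shift value and the product of a stereo-volume and a scaled parallax value. A minimal arithmetic sketch, assuming the combination is additive (the patent does not spell out the operator in this abstract):

    ```python
    def creative_parallax(scaled_near, scaled_far, stereo_volume, stereo_shift):
        """Combine the stereo-shift with the product of the stereo-volume
        and each scaled parallax value (additive combination assumed)."""
        creative_near = stereo_shift + stereo_volume * scaled_near
        creative_far = stereo_shift + stereo_volume * scaled_far
        return creative_near, creative_far
    ```

    The stereo-volume acts as a percentage that compresses or expands the parallax range, while the stereo-shift slides the whole range toward or away from the screen plane.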
  • Publication number: 20140035908
    Abstract: Systems and processes are described below relating to evaluating a dependency graph to render three-dimensional (3D) graphics using constraints. Two virtual 3D objects are accessed in a virtual 3D space. A constraint relationship request is received, which identifies the first object as a parent and the second object as a child. The technique verifies whether the graphs of the objects are compatible for being constrained to one another. The first object is evaluated to determine its translation, rotation, and scale. The second object is similarly evaluated based on the translation, rotation, and scale of the first object. An image is rendered depicting at least a portion of the first virtual 3D object and at least a portion of the second virtual 3D object.
    Type: Application
    Filed: August 2, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Alexander P. POWELL, Esteban D. PAPP, Alex M. WELLS
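    The parent/child evaluation order in the abstract (evaluate the parent's transform, then evaluate the child based on it) can be sketched as transform composition. This simplification uses translation and uniform scale only and omits rotation; the names are illustrative.

    ```python
    def constrain(parent, child_local):
        """Compose the child's local transform with the parent's so the
        child follows the parent: world = parent applied to local. Each
        transform is a (translation, uniform_scale) pair."""
        parent_t, parent_s = parent
        child_t, child_s = child_local
        world_t = tuple(p + parent_s * c for p, c in zip(parent_t, child_t))
        return world_t, parent_s * child_s

    # A child one unit up and right of a parent that is itself offset
    # and scaled by two.
    child_world = constrain(((1.0, 0.0, 0.0), 2.0), ((1.0, 1.0, 0.0), 0.5))
    ```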
  • Publication number: 20140036039
    Abstract: A computer-implemented method for determining a user-defined stereo effect for a computer-generated scene. A set of bounded-parallax constraints including a near-parallax value and a far-parallax value is obtained. A stereo-volume value is obtained, wherein the stereo-volume value represents a percentage of parallax. A stereo-shift value is also obtained, wherein the stereo-shift value represents a distance across one of: an area associated with a camera sensor of a pair of stereoscopic cameras adapted to film the computer-generated scene; and a screen adapted to depict a stereoscopic image of the computer-generated scene. A creative near-parallax value is calculated based on the stereo-shift value, the stereo-volume, and the near-parallax value. A creative far-parallax value is also calculated based on the stereo-shift value and the product of the stereo-volume and the far-parallax value.
    Type: Application
    Filed: March 14, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip MCNALLY, Matthew LOW
  • Publication number: 20140035903
    Abstract: A computer-implemented method for smoothing a stereo parameter for a computer-animated film sequence. A timeline for the film sequence is obtained, the timeline comprising a plurality of time entries. A stereo parameter distribution is obtained, wherein the stereo parameter distribution comprises one stereo parameter value for at least two time entries of the plurality of time entries, and wherein the stereo parameter value corresponds to a stereo setting associated with a pair of stereoscopic cameras configured to produce a stereoscopic image of the computer-animated film sequence. Depending on a statistical measurement of the stereo parameter distribution, either a static scene parameter is calculated, or a set of smoothed parameter values is calculated.
    Type: Application
    Filed: March 13, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip MCNALLY, Matthew Low
  • Publication number: 20140035918
    Abstract: Bounded-parallax constraints are determined for the placement of a pair of stereoscopic cameras within a computer-generated scene. A minimum scene depth is calculated based on the distance from the pair of cameras to a nearest point of interest in the computer-generated scene. A near-parallax value is also calculated based on the focal length and the minimum scene depth. Calculating the near-parallax value includes selecting a baseline stereo-setting entry from a set of stereo-setting entries; each entry of the set includes a recommended scene depth, a recommended focal length, and a recommended near-parallax value. For the selected baseline stereo-setting entry, the recommended scene depth corresponds to the minimum scene depth, and the recommended focal length corresponds to the focal length. The near-parallax value and far-parallax value are stored as the bounded-parallax constraints for the placement of the pair of stereoscopic cameras.
    Type: Application
    Filed: March 14, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip MCNALLY, Matthew LOW
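    The table-driven selection the abstract describes can be sketched as a lookup over stereo-setting entries. Both the table values and the nearest-match interpretation of "corresponds" are assumptions for illustration.

    ```python
    # Hypothetical baseline table: (recommended scene depth, recommended
    # focal length, recommended near-parallax). Values are illustrative.
    STEREO_SETTINGS = [
        (1.0, 35.0, -0.010),
        (5.0, 50.0, -0.006),
        (20.0, 85.0, -0.003),
    ]

    def near_parallax(min_scene_depth, focal_length, table=STEREO_SETTINGS):
        """Select the baseline entry whose recommended depth and focal
        length best correspond to the scene (nearest match assumed) and
        return its recommended near-parallax value."""
        entry = min(table, key=lambda e: abs(e[0] - min_scene_depth)
                                         + abs(e[1] - focal_length))
        return entry[2]
    ```

    Encoding stereographer-approved settings in a table lets the system pick a sensible starting parallax automatically instead of requiring per-shot tuning.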
  • Publication number: 20140035922
    Abstract: Systems and processes providing a tool for visualizing parallel dependency graph evaluation in computer animation are provided. Runtime evaluation data of a parallel dependency graph may be collected, including the start time and stop time for each node in the graph. The visualization tool may process the data to generate performance visualizations as well as other analysis features. Performance visualizations may illustrate the level of concurrency over time during parallel dependency graph evaluation. Performance visualizations may be generated by graphing node blocks according to node start time and stop time as well as the level of concurrency at a given time to illustrate parallelism. Performance visualizations may enable character technical directors, character riggers, programmers, and other users to evaluate how well parallelism is expressed in parallel dependency graphs in computer animation.
    Type: Application
    Filed: March 12, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Martin Peter WATT, Brendan Duncan
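    The concurrency-over-time curve such a visualization plots can be derived from per-node start and stop times with a standard sweep-line pass; this sketch computes the curve only, not the rendering of node blocks.

    ```python
    def concurrency_profile(intervals):
        """Sweep-line over (start, stop) node evaluation times, returning
        (time, active_node_count) at each event boundary."""
        events = []
        for start, stop in intervals:
            events.append((start, 1))   # node begins evaluating
            events.append((stop, -1))   # node finishes
        events.sort()
        level, profile = 0, []
        for time, delta in events:
            level += delta
            profile.append((time, level))
        return profile

    profile = concurrency_profile([(0.0, 2.0), (1.0, 3.0), (1.5, 2.5)])
    ```

    The peak of the resulting curve shows how much parallelism the dependency graph actually achieved, which is the figure riggers and programmers want to inspect.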
  • Publication number: 20140036036
    Abstract: A computer-implemented method for determining a user-defined stereo effect for a computer-animated film sequence. A stereo-volume value for a timeline of the film sequence is obtained, wherein the stereo-volume value represents a percentage of parallax at each time entry of the timeline. A stereo-shift value for the timeline is also obtained, wherein the stereo-shift value represents a distance across one of: an area associated with a sensor of a pair of stereoscopic cameras adapted to create the film sequence; and a screen adapted to depict a stereoscopic image of the computer-generated scene. A script-adjusted near-parallax value and a script-adjusted far-parallax value are calculated.
    Type: Application
    Filed: March 13, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip MCNALLY, Matthew LOW
  • Publication number: 20140036037
    Abstract: A computer-implemented method for placing a window object within a computer-generated scene. The computer-generated scene includes a pair of stereoscopic cameras adapted to capture an image of at least one computer-generated object and the window object. A left portion and right portion of the image along the left and right edges of the image are obtained. The nearest computer-generated object to the pair of stereoscopic cameras within the left and right portions of the image is identified. The window object is placed between the identified computer-generated object and the stereoscopic cameras at an offset distance from the identified computer-generated object.
    Type: Application
    Filed: March 13, 2013
    Publication date: February 6, 2014
    Applicant: DreamWorks Animation LLC
    Inventors: Philip MCNALLY, Matthew LOW
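    The placement rule in the abstract reduces to a small calculation once the nearest object in the edge strips is known. The `place_window` name and the depths-as-distances representation are illustrative assumptions.

    ```python
    def place_window(edge_object_depths, offset):
        """Place the window object at `offset` in front of the nearest
        computer-generated object found in the left/right edge strips,
        i.e. between that object and the cameras."""
        return min(edge_object_depths) - offset

    # Depths (from the cameras) of objects seen in the edge strips.
    window_depth = place_window([4.0, 6.0, 3.0], 0.5)
    ```

    Keeping the window in front of whatever touches the frame edges prevents the edge-violation artifact where an object appears closer than the screen surround.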
  • Publication number: 20130335406
    Abstract: Locations are shaded for use in rendering a computer-generated scene having one or more objects represented by a point cloud. A hierarchy for the point cloud is obtained. The point cloud includes a plurality of points. The hierarchy has a plurality of clusters of points of the point cloud. A location is selected to shade. A first cluster from the plurality of clusters is selected. The first cluster represents a first set of points in the point cloud. An importance weight for the first cluster is determined. A render-quality criterion for the first cluster is determined based on the importance weight. Whether the first cluster meets the render-quality criterion is determined based on a render-quality parameter for the first cluster. In response to the first cluster meeting the render-quality criterion, the location is shaded based on an indication of light emitted from the first cluster.
    Type: Application
    Filed: June 18, 2012
    Publication date: December 19, 2013
    Applicant: DreamWorks Animation LLC
    Inventor: Eric TABELLION
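    The cluster-based shading loop can be sketched as a hierarchy traversal: use a cluster's aggregate emission when it meets the quality criterion, otherwise descend to its children. The importance weight here (projected size, radius over distance) is an illustrative choice, not the patented formula.

    ```python
    import math

    def shade(location, cluster, quality):
        """Shade `location` from a cluster hierarchy: when a cluster's
        importance weight meets the render-quality criterion, or the
        cluster is a leaf, use its aggregate emission; otherwise descend."""
        distance = math.dist(location, cluster["center"])
        importance = cluster["radius"] / max(distance, 1e-9)
        if importance <= quality or not cluster["children"]:
            return cluster["emission"]
        return sum(shade(location, child, quality)
                   for child in cluster["children"])

    leaves = [
        {"center": (0.0, 0.0, 0.0), "radius": 0.1, "emission": 1.0, "children": []},
        {"center": (1.0, 0.0, 0.0), "radius": 0.1, "emission": 2.0, "children": []},
    ]
    root = {"center": (0.5, 0.0, 0.0), "radius": 5.0, "emission": 3.5,
            "children": leaves}
    ```

    Distant locations are shaded from a coarse aggregate while nearby ones descend to finer clusters, which is what keeps the cost sublinear in the number of points.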
  • Publication number: 20130194279
    Abstract: A system for performing graphics processing is disclosed. A dependency graph comprising interconnected nodes is accessed. Each node has output attributes and the dependency graph receives input attributes. A first list is accessed, which includes a dirty status for each dirty output attribute of the dependency graph. A second list is accessed, which associates one of the input attributes with output attributes that are affected by the one input attribute. A third list is accessed, which associates one of the output attributes with output attributes that affect the one output attribute. An evaluation request for a requested output attribute is received. A set of output attributes are selected for evaluation based on being specified in the first list as dirty and being specified in the third list as associated with the requested output attribute. The set of output attributes are evaluated.
    Type: Application
    Filed: September 6, 2012
    Publication date: August 1, 2013
    Applicant: DreamWorks Animation LLC
    Inventors: Martin WATT, Alexander P. Powell
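    The selection rule in the abstract combines two of the three lists: an output is evaluated when it is dirty and when the requested output depends on it. A minimal sketch, with hypothetical attribute names:

    ```python
    def select_for_evaluation(requested, dirty, depends_on):
        """Evaluate the outputs that are marked dirty (first list) and
        that the requested output depends on (third list), including the
        requested output itself."""
        needed = set(depends_on.get(requested, ())) | {requested}
        return needed & dirty

    depends_on = {"final_pose": {"skin_deform", "skeleton_solve"}}
    dirty = {"skin_deform", "fx_sim", "final_pose"}
    to_evaluate = select_for_evaluation("final_pose", dirty, depends_on)
    ```

    Intersecting the two lists means clean attributes and dirty-but-irrelevant attributes are both skipped, so a request pays only for the work it actually needs.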
  • Publication number: 20130088497
    Abstract: A skin deformation system for use in computer animation is disclosed. The skin deformation system accesses the skeleton structure of a computer generated character, and accesses a user's identification of features of the skeleton structure that may affect a skin deformation. The system also accesses the user's identification of a weighting strategy. Using the identified weighting strategy and identified features of the skeleton structure, the skin deformation system determines the degree to which each feature identified by the user may influence the deformation of a skin of the computer generated character. The skin deformation system may incorporate secondary operations including bulge, slide, scale and twist into the deformation of a skin. Information relating to a deformed skin may be stored by the skin deformation system so that the information may be used to produce a visual image for a viewer.
    Type: Application
    Filed: October 7, 2011
    Publication date: April 11, 2013
    Applicant: DreamWorks Animation LLC
    Inventors: Paul Carmen DILORENZO, Matthew Christopher Gong, Arthur D. Gregory
  • Publication number: 20130063363
    Abstract: A drawing table for an animator to hand create or modify a computer-generated image includes a display and a fused fiber optic plate. The display is configured to display the computer-generated image on a top surface. The fused fiber optic plate of bundled, optical fibers has an input surface and an output surface. The input surface is optically bonded to the top surface of the display. When the computer-generated image is displayed on the display, the fused fiber optic plate is configured to relay the computer-generated image from the input surface to the output surface.
    Type: Application
    Filed: September 9, 2011
    Publication date: March 14, 2013
    Applicant: DreamWorks Animation LLC
    Inventors: Edwin R. LEONARD, Hans T. Ku
  • Publication number: 20130027407
    Abstract: An animated special effect is modeled using a fluid dynamics framework system. The fluid dynamics framework for animated special effects system accepts volumetric data as input. Input volumetric data may represent the initial state of an animated special effect. Input volumetric data may also represent sources, sinks, external forces, and/or other influences on the animated special effect. In addition, the system accepts input parameters related to fluid dynamics modeling. The input volumes and parameters are applied to the incompressible Navier-Stokes equations as modifications to the initial state of the animated special effect, as modifications to the forcing term of a pressure equation, or in the computations of other types of forces that influence the solution. The input volumetric data may be composited with other volumetric data using a scalar blending field. The solution of the incompressible Navier-Stokes equations models the motion of the animated special effect.
    Type: Application
    Filed: July 27, 2011
    Publication date: January 31, 2013
    Applicant: DreamWorks Animation LLC
    Inventor: Ronald D. HENDERSON
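    The scalar-blending-field compositing step the abstract mentions can be sketched as a per-voxel linear interpolation between two volumes; the solver itself is out of scope here, and the flat-list volume representation is an illustrative simplification.

    ```python
    def composite_volumes(base, other, blend):
        """Composite two volumetric fields with a scalar blending field:
        per-voxel linear interpolation, the blended result then serving
        as input to the incompressible Navier-Stokes solve."""
        return [b * (1.0 - t) + o * t for b, o, t in zip(base, other, blend)]

    # blend = 0 keeps the base field, blend = 1 takes the other field,
    # values in between mix the two.
    blended = composite_volumes([0.0, 1.0, 4.0], [1.0, 1.0, 2.0], [0.5, 0.0, 1.0])
    ```

    A spatially varying blend field lets an artist confine an influence (a source or force) to part of the domain with a soft falloff rather than a hard boundary.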
  • Publication number: 20130002671
    Abstract: A computer-animated scene illuminated by indirect light is shaded. The scene is comprised of sample locations on a surface element of an object in the scene. A point cloud representation of the scene is generated. Optionally, an importance map of the scene, based on the point cloud representation, is generated. The importance map is generated by rasterizing one or more points in the point cloud and designating areas of interest based on the energy value of the one or more points in the point cloud. A ray tracing engine is biased, based on the importance map. The biased ray tracing engine calculates the path of a ray from the sample locations in the scene to an area of interest. The scene is shaded using the output from the biased ray tracing engine.
    Type: Application
    Filed: June 30, 2011
    Publication date: January 3, 2013
    Applicant: DreamWorks Animation LLC
    Inventors: Chris F. ARMSDEN, Bruce Tartaglia
  • Publication number: 20120169757
    Abstract: Embodiments relate to a computer-implemented method of providing a transition between first and second regions within a virtual scene, where the first and second regions are rendered using different methods and are connected to one another along a border line. The second region features a sharply diminishing illumination from the border line. The method includes adding an overlay of additional illumination to the first region so as to make the illumination in portions of the first region that are close to the border line similar to that of portions of the second region that are close to the border line. The method also includes shifting a position on which calculation of the illumination of the second region is based away from the first region.
    Type: Application
    Filed: March 12, 2012
    Publication date: July 5, 2012
    Applicant: DreamWorks Animation LLC
    Inventors: Bruce Nunzio TARTAGLIA, Doug Cooper, Pablo Valle, Michael McNeill