Patents Assigned to DreamWorks Animation LLC
  • Publication number: 20170316605
    Abstract: A method for generating stereoscopic images includes obtaining image data comprising a plurality of sample points. A direction, a color value, and a depth value are associated with each sample point. The directions and depth values are relative to a common origin. A mesh is generated by displacing the sample points from the origin. The sample points are displaced in the associated directions by distances representative of the corresponding depth values. The image data is mapped to the mesh such that the color values associated with the sample points are mapped to the mesh at the corresponding directions. A first image of the mesh is generated from a first perspective, and a second image of the mesh is generated from a second perspective. The first and second images of the mesh may be caused to be displayed to provide an illusion of depth.
    Type: Application
    Filed: July 14, 2017
    Publication date: November 2, 2017
    Applicant: DreamWorks Animation LLC
    Inventor: Brad Kenneth HERMAN
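    The abstract's core step, displacing sample points from a common origin along their directions by their depth values and then rendering the resulting mesh from two perspectives, can be sketched as follows. All names (`Sample`, `displace`, `project`) and the pinhole projection are illustrative assumptions, not details from the patent.

```python
# Minimal sketch of the displacement and two-perspective steps described
# in the abstract; data layout and projection model are assumptions.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Sample:
    direction: Tuple[float, float, float]  # unit vector from the common origin
    color: Tuple[float, float, float]
    depth: float

def displace(samples: List[Sample]) -> List[Tuple[float, float, float]]:
    """Displace each sample from the origin along its direction by its depth."""
    return [tuple(c * s.depth for c in s.direction) for s in samples]

def project(vertex, eye_x: float, focal: float = 1.0):
    """Project a mesh vertex for a camera offset horizontally by eye_x."""
    x, y, z = vertex
    return ((x - eye_x) * focal / z, y * focal / z)

samples = [Sample((0.0, 0.0, 1.0), (1.0, 1.0, 1.0), 5.0)]
mesh = displace(samples)
left = [project(v, -0.03) for v in mesh]   # first perspective
right = [project(v, +0.03) for v in mesh]  # second perspective
```

    The horizontal offset between the two projections of the same vertex is what produces the illusion of depth when the two images are shown to separate eyes.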
  • Publication number: 20170309253
    Abstract: A method of scheduling and performing computations for generating an interactive computer-generated animation on behalf of a client device to achieve a desired quality of service includes generating a computational configuration of computations that, when performed, produce the computer-generated animation with the desired quality of service. The configuration includes an identification of a first computation that outputs first data, a first start time for the first computation, and a first end time, where the first computation is to end before the first end time. The configuration also includes an identification of a second computation that depends on the first data, and a second start time for the second computation. The first computation is performed in response to an occurrence of the first start time and the second computation is performed in response to an occurrence of the second start time.
    Type: Application
    Filed: July 6, 2017
    Publication date: October 26, 2017
    Applicant: DreamWorks Animation LLC
    Inventor: Evan P. SMYTH
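    The configuration described above (computations with start times, deadlines, and data dependencies) can be sketched as a small schedule executor. The class and function names are invented for illustration; the patent's actual scheduling machinery is not specified in the abstract.

```python
# Illustrative sketch of a start-time-driven computational configuration;
# names and the sorted-execution simulation are assumptions.
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

@dataclass
class Computation:
    name: str
    start_time: float
    run: Callable[[Dict[str, object]], object]
    depends_on: Optional[str] = None   # name of the computation whose output is needed
    end_time: Optional[float] = None   # the computation is to end before this time

def execute_schedule(computations: List[Computation]) -> Dict[str, object]:
    """Perform each computation in response to its start time (simulated by sorting)."""
    results: Dict[str, object] = {}
    for comp in sorted(computations, key=lambda c: c.start_time):
        if comp.depends_on is not None and comp.depends_on not in results:
            raise RuntimeError(f"{comp.name} scheduled before its input {comp.depends_on}")
        results[comp.name] = comp.run(results)
    return results

schedule = [
    Computation("simulate", 0.0, lambda r: [1, 2, 3], end_time=1.0),
    Computation("render", 1.0, lambda r: sum(r["simulate"]), depends_on="simulate"),
]
outputs = execute_schedule(schedule)
```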
  • Publication number: 20170287197
    Abstract: Computer animation tools for viewing, in multiple contexts, the effect of changes to a computer animation are disclosed. An artist configures multiple visual displays in the user interface of a computer animation system. A visual display shows one or more frames of computer animation. An artist configures a visual display to reflect a specific context. For example, the artist may assign a particular virtual viewpoint of a scene to a particular visual display. Once visual displays are configured, the artist changes a configuration of the computer animation. For example, the artist may change the lighting parameters of a scene. In response, the visual displays show the visual effects of the configuration (e.g., lighting parameters) change under corresponding contexts (e.g., different virtual camera viewpoints). Using multiple visual displays, which may be displayed side-by-side, an artist can view the effects of her configuration changes in the various contexts.
    Type: Application
    Filed: April 12, 2017
    Publication date: October 5, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Tsuey Jin LIOU, Evan P. SMYTH, Andrew Philip PEARCE, Peter MCNERNEY
  • Publication number: 20170278290
    Abstract: Systems and processes providing a tool for visualizing parallel dependency graph evaluation in computer animation are provided. Runtime evaluation data of a parallel dependency graph may be collected, including the start time and stop time for each node in the graph. The visualization tool may process the data to generate performance visualizations as well as other analysis features. Performance visualizations may illustrate the level of concurrency over time during parallel dependency graph evaluation. Performance visualizations may be generated by graphing node blocks according to node start time and stop time as well as the level of concurrency at a given time to illustrate parallelism. Performance visualizations may enable character technical directors, character riggers, programmers, and other users to evaluate how well parallelism is expressed in parallel dependency graphs in computer animation.
    Type: Application
    Filed: June 9, 2017
    Publication date: September 28, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Martin Peter WATT, Brendan DUNCAN
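    The concurrency-over-time measure at the heart of this visualization can be computed directly from per-node start/stop timestamps with a sweep over events. This is a generic sweep-line sketch, not the patent's implementation.

```python
# Sketch: derive the level of concurrency over time from per-node
# start/stop timestamps collected during parallel dependency graph evaluation.
from typing import Dict, List, Tuple

def concurrency_profile(nodes: Dict[str, Tuple[float, float]]) -> List[Tuple[float, int]]:
    """Return (time, active_node_count) at every start/stop event."""
    events = []
    for start, stop in nodes.values():
        events.append((start, +1))   # a node begins evaluating
        events.append((stop, -1))    # a node finishes
    events.sort()
    profile, active = [], 0
    for time, delta in events:
        active += delta
        profile.append((time, active))
    return profile

nodes = {"deform": (0.0, 2.0), "skin": (1.0, 3.0), "blend": (1.5, 2.5)}
profile = concurrency_profile(nodes)
peak = max(count for _, count in profile)   # best parallelism achieved
```

    Graphing `profile` as a step function yields exactly the kind of parallelism plot the abstract describes.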
  • Patent number: 9734798
    Abstract: A method of scheduling and performing computations for generating an interactive computer-generated animation on behalf of a client device to achieve a desired quality of service includes generating a computational configuration of computations that, when performed, produce the computer-generated animation with the desired quality of service. The configuration includes an identification of a first computation that outputs first data, a first start time for the first computation, and a first end time, where the first computation is to end before the first end time. The configuration also includes an identification of a second computation that depends on the first data, and a second start time for the second computation. The first computation is performed in response to an occurrence of the first start time and the second computation is performed in response to an occurrence of the second start time.
    Type: Grant
    Filed: March 17, 2015
    Date of Patent: August 15, 2017
    Assignee: DreamWorks Animation LLC
    Inventor: Evan P. Smyth
  • Patent number: 9721385
    Abstract: A method for generating stereoscopic images includes obtaining image data comprising a plurality of sample points. A direction, a color value, and a depth value are associated with each sample point. The directions and depth values are relative to a common origin. A mesh is generated by displacing the sample points from the origin. The sample points are displaced in the associated directions by distances representative of the corresponding depth values. The image data is mapped to the mesh such that the color values associated with the sample points are mapped to the mesh at the corresponding directions. A first image of the mesh is generated from a first perspective, and a second image of the mesh is generated from a second perspective. The first and second images of the mesh may be caused to be displayed to provide an illusion of depth.
    Type: Grant
    Filed: February 10, 2015
    Date of Patent: August 1, 2017
    Assignee: DreamWorks Animation LLC
    Inventor: Brad Kenneth Herman
  • Publication number: 20170213076
    Abstract: A method for evaluating a facial performance using facial capture of two users includes obtaining a reference set of facial performance data representing a first user's facial capture; obtaining a facial capture of a second user; extracting a second set of facial performance data based on the second user's facial capture; calculating at least one matching metric based on a comparison of the reference set of facial performance data to the second set of facial performance data; and displaying an indication of the at least one matching metric on a display.
    Type: Application
    Filed: January 17, 2017
    Publication date: July 27, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Emmanuel C. FRANCISCO, Demian GORDON, Elvin KORKUTI
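    The abstract does not specify the matching-metric formula, so the sketch below uses root-mean-square error between two vectors of facial control values mapped to a [0, 1] score, purely as an illustrative stand-in.

```python
# Hedged sketch of one possible matching metric between a reference facial
# capture and a second user's capture; the RMSE formula is an assumption.
import math
from typing import Sequence

def matching_metric(reference: Sequence[float], candidate: Sequence[float]) -> float:
    """Return a score in [0, 1]; 1.0 means the captures match exactly."""
    if len(reference) != len(candidate):
        raise ValueError("facial performance vectors must be the same length")
    rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(reference, candidate)) / len(reference))
    return 1.0 / (1.0 + rmse)

reference = [0.2, 0.8, 0.5]   # e.g. extracted control values for the first user
candidate = [0.2, 0.8, 0.5]
score = matching_metric(reference, candidate)
```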
  • Publication number: 20170206696
    Abstract: Systems and methods for automatically animating a character based on an existing corpus of animation are described. The character may be from a previously produced feature animated film, and the data used for training may be the data used to animate the character in the film. A low-dimensional embedding for subsets of the existing animation corresponding to different semantic labels may be learned by mapping high-dimensional rig control parameters to a latent space. A particle model may be used to move within the latent space, thereby generating novel animations corresponding to the space's semantic label, such as a pose. Bridges may link a first pose of a first model within the latent space that is similar to a second pose of a second model of the space. Animations corresponding to transitions between semantic labels may be generated by creating animation paths that traverse a bridge from one model into another.
    Type: Application
    Filed: January 18, 2017
    Publication date: July 20, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Stephen BAILEY, Martin WATT, Bo MORGAN, James O'BRIEN
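    The "particle model moving within the latent space" can be pictured as a damped random walk; decoding each visited point back to rig controls would yield novel poses. The dimensions, damping constant, and noise scale below are invented for the example, and no learned embedding or decoder is included.

```python
# Illustrative particle-in-latent-space sketch; all constants are assumptions.
import random
from typing import List

def step_particle(pos, vel, damping=0.9, noise=0.1, rng=None):
    """Advance a particle through the latent space with damped random drift."""
    rng = rng or random.Random(0)
    vel = [damping * v + noise * (rng.random() - 0.5) for v in vel]
    pos = [p + v for p, v in zip(pos, vel)]
    return pos, vel

def generate_path(steps: int) -> List[List[float]]:
    """A latent-space trajectory; decoding each point to rig control
    parameters would produce animation with the space's semantic label."""
    rng = random.Random(42)
    pos, vel, path = [0.0, 0.0], [0.0, 0.0], []
    for _ in range(steps):
        pos, vel = step_particle(pos, vel, rng=rng)
        path.append(list(pos))
    return path

path = generate_path(10)
```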
  • Patent number: 9703469
    Abstract: A touch-sensitive surface for a computer animator to create or modify a computer-generated image includes processes for differentiating between click and drag operations. The included processes also beneficially reduce input errors. When a touch object (e.g., finger or stylus) touches the drawing table, information regarding the duration of the touch and the movement of the touch is used to determine whether the touch input represents a graphical user interface (GUI) click or a drag operation.
    Type: Grant
    Filed: October 18, 2012
    Date of Patent: July 11, 2017
    Assignee: DreamWorks Animation LLC
    Inventor: Alexander P. Powell
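    The duration-and-movement test can be sketched in a few lines. The specific threshold values below are assumptions; the patent does not publish its constants in the abstract.

```python
# Sketch of the click-vs-drag decision from touch duration and movement;
# threshold constants are illustrative assumptions.
import math

MAX_CLICK_SECONDS = 0.25     # touches held longer than this are not clicks
MAX_CLICK_TRAVEL = 5.0       # pixels of movement tolerated within a click

def classify_touch(duration_s, start_xy, end_xy):
    """Return 'click' or 'drag' from touch duration and total movement."""
    travel = math.dist(start_xy, end_xy)
    if duration_s <= MAX_CLICK_SECONDS and travel <= MAX_CLICK_TRAVEL:
        return "click"
    return "drag"
```

    Requiring both conditions is what reduces input errors: a brief touch that wanders, or a stationary touch that lingers, is treated as a drag rather than an accidental click.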
  • Patent number: 9691171
    Abstract: Systems and processes providing a tool for visualizing parallel dependency graph evaluation in computer animation are provided. Runtime evaluation data of a parallel dependency graph may be collected, including the start time and stop time for each node in the graph. The visualization tool may process the data to generate performance visualizations as well as other analysis features. Performance visualizations may illustrate the level of concurrency over time during parallel dependency graph evaluation. Performance visualizations may be generated by graphing node blocks according to node start time and stop time as well as the level of concurrency at a given time to illustrate parallelism. Performance visualizations may enable character technical directors, character riggers, programmers, and other users to evaluate how well parallelism is expressed in parallel dependency graphs in computer animation.
    Type: Grant
    Filed: March 12, 2013
    Date of Patent: June 27, 2017
    Assignee: DreamWorks Animation LLC
    Inventors: Martin Peter Watt, Brendan Duncan
  • Publication number: 20170169555
    Abstract: An electronic device with a display screen provides drawing directions to guide a user to create artwork on a physical medium. The electronic device displays a first drawing direction for drawing a portion of a subject on a physical medium, and prompts a user for a user input indicating completion of the first drawing direction by the user. Upon receiving the prompted user input, the electronic device displays a second drawing direction for drawing another portion of the subject on the physical medium. The subject may be based on a computer-animated movie title. The first drawing direction may include a representation of a virtual host, which is also based on a computer-animated character from a computer-animated movie title.
    Type: Application
    Filed: November 21, 2016
    Publication date: June 15, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Scott LAROCCA, Campbell MCGROUTHER
  • Patent number: 9659398
    Abstract: Computer animation tools for viewing, in multiple contexts, the effect of changes to a computer animation are disclosed. An artist configures multiple visual displays in the user interface of a computer animation system. A visual display shows one or more frames of computer animation. An artist configures a visual display to reflect a specific context. For example, the artist may assign a particular virtual viewpoint of a scene to a particular visual display. Once visual displays are configured, the artist changes a configuration of the computer animation. For example, the artist may change the lighting parameters of a scene. In response, the visual displays show the visual effects of the configuration (e.g., lighting parameters) change under corresponding contexts (e.g., different virtual camera viewpoints). Using multiple visual displays, which may be displayed side-by-side, an artist can view the effects of her configuration changes in the various contexts.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: May 23, 2017
    Assignee: DreamWorks Animation LLC
    Inventors: Tsuey Jin Liou, Evan P. Smyth, Andrew Phillip Pearce, Peter McNerney
  • Patent number: 9626787
    Abstract: Systems and methods for rendering three-dimensional images using a render setup graph are provided. A dependency graph is accessed. The dependency graph comprises a plurality of supplier nodes, a multiplexer node, and a plurality of graphlet nodes. The plurality of supplier nodes is accessed. The supplier nodes each have an output of a first type. These outputs are connected to the multiplexer node. A graphlet is accessed. The graphlet comprises the plurality of graphlet nodes. An output of the multiplexer node connects to the graphlet by connecting to an input of one node of the plurality of graphlet nodes. The multiplexer is configured to generate an instance of the graphlet for each supplier node connected to the multiplexer node. An image is rendered utilizing the accessed graphlet.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: April 18, 2017
    Assignee: DreamWorks Animation LLC
    Inventors: Peter McNerney, Evan P. Smyth
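    The multiplexer's behavior, one graphlet instance generated per connected supplier node, reduces to a simple mapping. The function and data names here are invented for illustration.

```python
# Sketch of the multiplexer node: instantiate the graphlet once per
# supplier output; names are assumptions.
from typing import Callable, List

def multiplex(supplier_outputs: List[object],
              graphlet: Callable[[object], object]) -> List[object]:
    """Generate an instance of the graphlet for each supplier output."""
    return [graphlet(output) for output in supplier_outputs]

suppliers = ["char_A", "char_B", "char_C"]   # each supplier emits one output of a shared type
instances = multiplex(suppliers, lambda obj: f"shaded({obj})")
```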
  • Publication number: 20170098327
    Abstract: One exemplary process for animating hair includes receiving data representing a plurality of hairs and a plurality of objects in a timestep of a frame of animation. A first tree is populated to represent kinematic objects of the plurality of objects and a second tree is populated to represent dynamic objects of the plurality of objects based on the received data. A first elasticity preconditioner is created to represent internal elastic energy of the plurality of hairs based on the received data. Based on the first tree and the second tree, a first set of potential contacts is determined between two or more hairs of the plurality of hairs or between one or more hairs of the plurality of hairs and one or more objects of the plurality of objects. Positions of the plurality of hairs are determined based on the first set of potential contacts and the first elasticity preconditioner.
    Type: Application
    Filed: September 13, 2016
    Publication date: April 6, 2017
    Applicant: DreamWorks Animation LLC
    Inventors: Galen G. GORNOWICZ, Silviu BORAC
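    The "first set of potential contacts" is a broad-phase collision query. The patent builds trees over kinematic and dynamic objects; the flat pairwise bounding-box test below is a simplified stand-in that shows the same idea.

```python
# Broad-phase sketch: find potential hair/hair and hair/object contacts by
# overlapping axis-aligned bounding boxes (a flat test stands in for the
# patent's kinematic and dynamic trees).
from typing import Dict, List, Tuple

AABB = Tuple[float, float, float, float, float, float]  # xmin, ymin, zmin, xmax, ymax, zmax

def overlaps(a: AABB, b: AABB) -> bool:
    return all(a[i] <= b[i + 3] and b[i] <= a[i + 3] for i in range(3))

def potential_contacts(hairs: Dict[str, AABB],
                       objects: Dict[str, AABB]) -> List[Tuple[str, str]]:
    """Candidate pairs to pass on to narrow-phase contact resolution."""
    names = list(hairs)
    pairs = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
             if overlaps(hairs[a], hairs[b])]
    pairs += [(h, o) for h in hairs for o in objects
              if overlaps(hairs[h], objects[o])]
    return pairs

hairs = {"h1": (0, 0, 0, 1, 1, 1), "h2": (0.5, 0.5, 0.5, 2, 2, 2)}
objects = {"head": (10, 10, 10, 11, 11, 11)}
contacts = potential_contacts(hairs, objects)
```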
  • Patent number: 9589382
    Abstract: Systems and methods for rendering an image using a render setup graph are provided. The render setup graph may be used to configure and manage lighting configuration data as well as external processes used to render the computer-generated image. The render setup graph may include a dependency graph having nodes interconnected by edges along which objects and object configuration data may be passed between nodes. The nodes may be used to provide a source of objects and object configuration data, configure visual effects of an object, partition a set of objects, call external processes, perform data routing functions within the graph, and the like. In this way, the render setup graph may advantageously be used to organize configuration data and execution of processes for rendering an image.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: March 7, 2017
    Assignee: DreamWorks Animation LLC
    Inventors: Robert Giles Wilson, Evan P. Smyth, Mark Lee, Max Requenes, Peter McNerney
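    The render setup graph's essential mechanic, nodes passing objects and configuration data along edges in dependency order, can be sketched as a small topological evaluator. Node names and payloads are invented for the example.

```python
# Minimal dependency-graph sketch: each node is evaluated after all of its
# upstream nodes, and outputs flow along edges; names are assumptions.
from typing import Callable, Dict, List, Tuple

def evaluate(nodes: Dict[str, Callable[[List[object]], object]],
             edges: List[Tuple[str, str]]) -> Dict[str, object]:
    """Evaluate every node once its inputs are available (Kahn's algorithm)."""
    inputs = {name: [] for name in nodes}
    remaining = {name: sum(1 for _, dst in edges if dst == name) for name in nodes}
    ready = [n for n, c in remaining.items() if c == 0]
    results: Dict[str, object] = {}
    while ready:
        name = ready.pop()
        results[name] = nodes[name](inputs[name])
        for src, dst in edges:
            if src == name:
                inputs[dst].append(results[name])
                remaining[dst] -= 1
                if remaining[dst] == 0:
                    ready.append(dst)
    return results

nodes = {
    "source": lambda _: ["teapot", "floor"],                   # supplies scene objects
    "partition": lambda ins: [o for o in ins[0] if o != "floor"],
    "render": lambda ins: f"rendered {ins[0]}",                # stands in for an external process
}
edges = [("source", "partition"), ("partition", "render")]
results = evaluate(nodes, edges)
```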
  • Patent number: 9582918
    Abstract: A computer-implemented method for determining a user-defined stereo effect for a computer-generated scene. A set of bounded-parallax constraints including a near-parallax value and a far-parallax value is obtained. A stereo-volume value is obtained, wherein the stereo-volume value represents a percentage of parallax. A stereo-shift value is also obtained, wherein the stereo-shift value represents a distance across one of: an area associated with a camera sensor of a pair of stereoscopic cameras adapted to film the computer-generated scene; and a screen adapted to depict a stereoscopic image of the computer-generated scene. A creative near-parallax value is calculated based on the stereo-shift value, the stereo-volume, and the near-parallax value. A creative far-parallax value is also calculated based on the stereo-shift value and the product of the stereo-volume and the far-parallax value.
    Type: Grant
    Filed: March 14, 2013
    Date of Patent: February 28, 2017
    Assignee: DreamWorks Animation LLC
    Inventors: Philip McNally, Matthew Low
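    One plausible reading of the abstract's arithmetic is shown below: the parallax range is scaled by the stereo-volume and offset by the stereo-shift. The abstract does not give exact formulas, so this combination is an assumption.

```python
# Hedged sketch of the creative parallax calculation; the exact formulas
# are not published in the abstract, so this arithmetic is an assumption.
def creative_parallax(shift, volume, near, far):
    """Scale the parallax range by the stereo-volume, then offset by the shift."""
    creative_near = shift + volume * near
    creative_far = shift + volume * far
    return creative_near, creative_far

# Example: 0.2% shift, 50% stereo volume, parallax bounds of -1% and +3%.
near_c, far_c = creative_parallax(shift=0.002, volume=0.5, near=-0.01, far=0.03)
```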
  • Patent number: 9514562
    Abstract: Systems and methods for partitioning a set of animation objects using a node in a render setup graph are provided. The render setup graph may be used to configure and manage lighting configuration data as well as external processes used to render the computer-generated image. The render setup graph may include a dependency graph having nodes interconnected by edges along which objects and object configuration data may be passed between nodes. The nodes may be used to provide a source of objects and object configuration data, configure visual effects of an object, partition a set of objects, call external processes, perform data routing functions within the graph, and the like. The objects can be partitioned based on attributes of the objects and associated configuration data. In this way, the render setup graph may advantageously be used to organize configuration data and execution of processes for rendering an image.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: December 6, 2016
    Assignee: DreamWorks Animation LLC
    Inventors: Robert Giles Wilson, Evan P. Smyth, Mark Lee, Max Requenes, Peter McNerney
  • Patent number: 9514560
    Abstract: Systems and methods for using hierarchical tags to create a computer-generated animation are provided. The hierarchical tags may be used to organize, identify, and select animation assets in order to configure animation parameters used to render a computer-generated image. The hierarchical tags may be used to display representations of animation assets for selection. A hierarchy based on the hierarchical tags may be represented by a tree structure. The hierarchical tags may be used as part of a rule to partition animation assets. In this way, the hierarchical tags may advantageously be used to identify, organize, and select animation assets and perform animation processes.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: December 6, 2016
    Assignee: DreamWorks Animation LLC
    Inventors: Peter McNerney, Evan P. Smyth
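    Selecting animation assets through a tag hierarchy can be sketched with slash-delimited tag paths: selecting a tag selects every asset tagged at or below it in the tree. The tag and asset names are invented for illustration.

```python
# Sketch: hierarchical tags as slash-delimited paths; a tag selects every
# asset tagged at or beneath it. Names are assumptions.
from typing import Dict, List

def select_assets(assets: Dict[str, str], tag: str) -> List[str]:
    """Return assets whose tag equals `tag` or is nested beneath it."""
    prefix = tag + "/"
    return sorted(name for name, t in assets.items()
                  if t == tag or t.startswith(prefix))

assets = {
    "po": "characters/pandas",
    "shifu": "characters/masters",
    "valley": "sets/china",
}
selected = select_assets(assets, "characters")
```

    A partitioning rule as mentioned in the abstract could be expressed the same way: each top-level tag defines one partition of the asset set.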
  • Publication number: 20160343167
    Abstract: A virtual reality system includes a platform, a headset, a mount, and a control unit. The headset includes a motion-sensing unit and a display unit configured to display a video of a virtual environment. The mount is positioned on the platform and configured to releasably engage the headset. While the headset is engaged with the mount, the headset is positioned in a first position. While the headset is disengaged from the mount, the headset is positioned in a second position. The control unit is connected to the headset and configured to receive first data representing the first position and associate the first position with a predetermined first perspective of the virtual environment. The control unit is also configured to receive second data representing the second position, determine a second perspective of the virtual environment corresponding to the second position, and provide video of the virtual environment from the second perspective.
    Type: Application
    Filed: May 17, 2016
    Publication date: November 24, 2016
    Applicant: DreamWorks Animation LLC
    Inventors: Brad Kenneth HERMAN, St. John COLÓN
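    The control unit's logic, serve a fixed first perspective while the headset sits in the mount, and a tracked perspective once it is removed, reduces to a small state decision. Names and the orientation payload are illustrative assumptions.

```python
# Sketch of the control unit's perspective selection; names are assumptions.
from typing import Optional, Tuple

FIRST_PERSPECTIVE = (0.0, 0.0, 0.0)   # predetermined view while docked in the mount

def choose_perspective(engaged_with_mount: bool,
                       tracked_orientation: Optional[Tuple[float, float, float]]):
    """Return the virtual-environment perspective to render for the headset."""
    if engaged_with_mount:
        return FIRST_PERSPECTIVE          # first position -> predetermined first perspective
    if tracked_orientation is None:
        raise ValueError("motion-sensing data required once headset leaves the mount")
    return tracked_orientation            # second position -> corresponding perspective

docked_view = choose_perspective(True, None)
free_view = choose_perspective(False, (0.1, 0.4, 0.0))
```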
  • Patent number: 9460553
    Abstract: Locations are shaded for use in rendering a computer-generated scene having one or more objects represented by a point cloud. A hierarchy for the point cloud is obtained. The point cloud includes a plurality of points. The hierarchy has a plurality of clusters of points of the point cloud. A location is selected to shade. A first cluster from the plurality of clusters is selected. The first cluster represents a first set of points in the point cloud. An importance weight for the first cluster is determined. A render-quality criterion for the first cluster is determined based on the importance weight. Whether the first cluster meets the render-quality criterion is determined based on a render-quality parameter for the first cluster. In response to the first cluster meeting the render-quality criterion, the location is shaded based on an indication of light emitted from the first cluster.
    Type: Grant
    Filed: June 18, 2012
    Date of Patent: October 4, 2016
    Assignee: DreamWorks Animation LLC
    Inventor: Eric Tabellion
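    The importance-driven traversal can be sketched as a recursion: a cluster is used directly as a single emitter when its quality criterion is met, and refined into its children otherwise. The solid-angle importance measure and falloff below are illustrative assumptions.

```python
# Sketch of importance-weighted point-cloud shading; the importance measure
# (projected size over distance) and light falloff are assumptions.
from dataclasses import dataclass, field
from typing import List, Tuple
import math

@dataclass
class Cluster:
    center: Tuple[float, float, float]
    radius: float
    emitted: float                      # light emitted by this cluster
    children: List["Cluster"] = field(default_factory=list)

def shade(location: Tuple[float, float, float],
          cluster: Cluster, quality: float = 0.1) -> float:
    """Accumulate light at `location`, refining any cluster that subtends
    too large an angle to stand in as a single emitter."""
    dist = math.dist(location, cluster.center)
    importance = cluster.radius / max(dist, 1e-9)
    if importance <= quality or not cluster.children:
        return cluster.emitted / (dist * dist)   # use the cluster as one emitter
    return sum(shade(location, child, quality) for child in cluster.children)

leaves = [Cluster((0, 0, 10), 0.5, 1.0), Cluster((0, 0, 12), 0.5, 1.0)]
root = Cluster((0, 0, 11), 1.5, 2.0, children=leaves)
radiance = shade((0, 0, 0), root)   # root fails the criterion, so its leaves are used
```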