Lucasfilm Patents

Lucasfilm Ltd. produced the Star Wars and Indiana Jones motion pictures. The company was acquired by the Walt Disney Company in 2012.

Lucasfilm Patents by Type
  • Patent number: 10692288
    Abstract: A method may include capturing a first image of a physical environment using a mobile device. The mobile device may include a physical camera and a display. The method may also include receiving a second image from a content provider system. The second image may be generated by the content provider system by rendering a view from a virtual camera in a virtual environment. The virtual environment may represent at least a portion of the physical environment. A location of the virtual camera in the virtual environment may correspond to a location of the physical camera in the physical environment. The second image may include a view of a computer-generated object. The method may additionally include generating a third image by compositing the first image and the second image, and causing the third image to be displayed on the display of the mobile device.
    Type: Grant
    Filed: June 27, 2017
    Date of Patent: June 23, 2020
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Nicholas Rasmussen, Michael Koperwas, Earle M. Alexander, IV
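The patent above describes compositing a camera image of the physical environment with a rendered view from a matching virtual camera. The abstract does not disclose how the compositing is done; the alpha blend below is an assumed, conventional approach, and all names (`composite`, `cg_alpha`) are illustrative only.

```python
import numpy as np

def composite(camera_img, cg_img, cg_alpha):
    """Alpha-blend a rendered CG layer over a camera image.

    camera_img, cg_img: float arrays of shape (H, W, 3) with values in [0, 1].
    cg_alpha: float array of shape (H, W) in [0, 1]; 1 = fully CG.
    Returns the "third image" of the abstract: CG over live footage.
    """
    a = cg_alpha[..., None]  # broadcast alpha over the color channels
    return cg_img * a + camera_img * (1.0 - a)
```

A 50% alpha over a 0.2-gray camera frame and a 0.8-gray CG frame yields a uniform 0.5 gray, as expected from the blend equation.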
  • Publication number: 20200143592
    Abstract: An immersive content presentation system and techniques that can detect and correct lighting artifacts caused by movements of one or more taking cameras in a performance area that includes multiple displays (e.g., LED or LCD displays). The techniques include capturing, with a camera, a plurality of images of a performer performing in a performance area at least partially surrounded by one or more displays presenting images of a virtual environment. The images of the virtual environment within a frustum of the camera are updated on the one or more displays based on movement of the camera, while images of the virtual environment outside of the frustum of the camera are not. The techniques further include generating content based on the plurality of captured images.
    Type: Application
    Filed: November 6, 2019
    Publication date: May 7, 2020
    Applicant: LUCASFILM ENTERTAINMENT COMPANY LTD. LLC
    Inventors: Roger CORDES, Richard BLUFF, Lutz LATTA
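The two publications above distinguish display regions inside the taking camera's frustum (updated with camera motion) from regions outside it (left static for ambient lighting). As a rough 2-D, top-down sketch of that classification, with all function and parameter names assumed rather than taken from the patent:

```python
import math

def in_frustum(cam_pos, cam_dir_deg, fov_deg, point):
    """Return True if `point` lies inside the camera's horizontal frustum.

    2-D simplification: cam_pos and point are (x, y) tuples, cam_dir_deg
    is the viewing direction in degrees, fov_deg the full field of view.
    """
    dx, dy = point[0] - cam_pos[0], point[1] - cam_pos[1]
    angle = math.degrees(math.atan2(dy, dx))
    diff = (angle - cam_dir_deg + 180.0) % 360.0 - 180.0  # signed angle to axis
    return abs(diff) <= fov_deg / 2.0

def tiles_to_update(cam_pos, cam_dir_deg, fov_deg, tile_centers):
    # Only tiles the taking camera sees get perspective-correct updates;
    # the rest keep showing the static environment for lighting/reflections.
    return [t for t in tile_centers if in_frustum(cam_pos, cam_dir_deg, fov_deg, t)]
```

With the camera at the origin looking down +x with a 90° field of view, a tile at (1, 0) is updated while one at (0, 1) is not.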
  • Publication number: 20200145644
    Abstract: An immersive content presentation system and techniques that can detect and correct lighting artifacts caused by movements of one or more taking cameras in a performance area that includes multiple displays (e.g., LED or LCD displays). The techniques include capturing, with a camera, a plurality of images of a performer performing in a performance area at least partially surrounded by one or more displays presenting images of a virtual environment. The images of the virtual environment within a frustum of the camera are updated on the one or more displays based on movement of the camera, while images of the virtual environment outside of the frustum of the camera are not. The techniques further include generating content based on the plurality of captured images.
    Type: Application
    Filed: November 6, 2019
    Publication date: May 7, 2020
    Applicant: LUCASFILM ENTERTAINMENT COMPANY LTD. LLC
    Inventors: Roger CORDES, Nicholas RASMUSSEN, Kevin WOOLEY, Rachel ROSE
  • Patent number: 10627908
    Abstract: A system and method for controlling a view of a virtual reality (VR) environment via a computing device with a touch sensitive surface are disclosed. In some examples, a user may be enabled to augment the view of the VR environment by providing finger gestures to the touch sensitive surface. In one example, the user is enabled to call up a menu in the view of the VR environment. In one example, the user is enabled to switch the view of the VR environment displayed on a device associated with another user to a new location within the VR environment. In some examples, the user may be enabled to use the computing device to control a virtual camera within the VR environment and have various information regarding one or more aspects of the virtual camera displayed in the view of the VR environment presented to the user.
    Type: Grant
    Filed: September 30, 2015
    Date of Patent: April 21, 2020
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Darby Johnston, Ian Wakelin
  • Patent number: 10602200
    Abstract: Systems and techniques are provided for switching between different modes of a media content item. A media content item may include a movie that has different modes, such as a cinematic mode and an interactive mode. For example, a movie may be presented in a cinematic mode that does not allow certain user interactions with the movie. The movie may be switched to an interactive mode during any point of the movie, allowing a viewer to interact with various aspects of the movie. The movie may be displayed using different formats and resolutions depending on which mode the movie is being presented in.
    Type: Grant
    Filed: March 31, 2016
    Date of Patent: March 24, 2020
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Andrew Grant, Lutz Markus Latta, Ian Wakelin, Darby Johnston, John Gaeta
  • Patent number: 10600245
    Abstract: Systems and techniques are provided for switching between different modes of a media content item. A media content item may include a movie that has different modes, such as a cinematic mode and an interactive mode. For example, a movie may be presented in a cinematic mode that does not allow certain user interactions with the movie. The movie may be switched to an interactive mode during any point of the movie, allowing a viewer to interact with various aspects of the movie. The movie may be displayed using different formats and resolutions depending on which mode the movie is being presented in.
    Type: Grant
    Filed: May 28, 2015
    Date of Patent: March 24, 2020
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Lutz Markus Latta, Ian Wakelin, Darby Johnston, Andrew Grant, John Gaeta
  • Patent number: 10594786
    Abstract: Views of a virtual environment can be displayed on mobile devices in a real-world environment simultaneously for multiple users. The users can operate selection devices in the real-world environment that interact with objects in the virtual environment. Virtual characters and objects can be moved and manipulated using selection shapes. A graphical interface can be instantiated and rendered as part of the virtual environment. Virtual cameras and screens can also be instantiated to create storyboards, backdrops, and animated sequences of the virtual environment. These immersive experiences with the virtual environment can be used to generate content for users and for feature films.
    Type: Grant
    Filed: October 11, 2017
    Date of Patent: March 17, 2020
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Jose Perez, III, Peter Dollar, Barak Moshe
  • Patent number: 10553036
    Abstract: Views of a virtual environment can be displayed on mobile devices in a real-world environment simultaneously for multiple users. The users can operate selection devices in the real-world environment that interact with objects in the virtual environment. Virtual characters and objects can be moved and manipulated using selection shapes. A graphical interface can be instantiated and rendered as part of the virtual environment. Virtual cameras and screens can also be instantiated to create storyboards, backdrops, and animated sequences of the virtual environment. These immersive experiences with the virtual environment can be used to generate content for users and for feature films.
    Type: Grant
    Filed: October 11, 2017
    Date of Patent: February 4, 2020
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Jose Perez, III, Peter Dollar, Barak Moshe
  • Patent number: 10489958
    Abstract: A system includes a computing device that includes a memory configured to store instructions. The system also includes a processor configured to execute the instructions to perform a method that includes receiving multiple representations of an object. Each of the representations includes position information of the object and corresponds to an instance in time. For at least one of the representations, the method includes defining a contour that represents a movable silhouette of a surface feature of the object. The method also includes producing a deformable model of the surface of the object from the defined contour and from the at least one representation of the object.
    Type: Grant
    Filed: August 1, 2017
    Date of Patent: November 26, 2019
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Ronald Mallet, Yuting Ye, Michael Koperwas, Adrian R. Goldenthal, Kiran S. Bhat
  • Patent number: 10484824
    Abstract: A method may include causing first content to be displayed on a display device. The method may also include determining a location of a mobile device relative to the display device. In some embodiments, the mobile device may be positioned such that the first content is visible to a viewer of the mobile device. The method may additionally include causing second content to be displayed on the mobile device such that the second content is layered over the first content.
    Type: Grant
    Filed: June 27, 2016
    Date of Patent: November 19, 2019
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: John Gaeta, Michael Koperwas, Nicholas Rasmussen
  • Patent number: 10423234
    Abstract: A system and method facilitating a user to manipulate a virtual reality (VR) environment are disclosed. The user may provide an input via a touch sensitive surface of a computing device associated with the user to bind a virtual object in the VR environment to the computing device. The user may then move and/or rotate the computing device to cause the bound virtual object to move and/or rotate in the VR environment accordingly. In some examples, the bound virtual object may cast a ray into the VR environment. The movement and/or rotation of the virtual object controlled by the computing device in those examples can change the direction of the ray. In some examples, the virtual object may include a virtual camera. In those examples, the user may move and/or rotate the virtual camera in the VR environment by moving and/or rotating the computing device.
    Type: Grant
    Filed: September 30, 2015
    Date of Patent: September 24, 2019
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Darby Johnston, Ian Wakelin
  • Patent number: 10403019
    Abstract: A multi-channel tracking pattern is provided along with techniques and systems for performing motion capture using the multi-channel tracking pattern. The multi-channel tracking pattern includes a plurality of shapes having different colors on different portions of the pattern. The portions with the unique shapes and colors allow a motion capture system to track motion of an object bearing the pattern across a plurality of video frames.
    Type: Grant
    Filed: February 11, 2016
    Date of Patent: September 3, 2019
    Assignee: LUCASFILM ENTERTAINMENT COMPANY
    Inventor: John Levin
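The tracking-pattern patent above relies on uniquely colored shapes to follow an object across video frames. A minimal sketch of the matching step, assuming detected blobs are reduced to positions and average colors (the blob-detection stage, marker names, and tolerance are all illustrative, not from the patent):

```python
def track_markers(frames, palette, tol=30.0):
    """Track uniquely colored markers across frames by nearest-color match.

    frames: list of dicts mapping blob position (x, y) -> observed (r, g, b).
    palette: dict mapping marker name -> reference (r, g, b) color.
    Returns marker name -> list of positions, one per frame (None if unseen).
    """
    def dist(c1, c2):
        return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

    tracks = {name: [] for name in palette}
    for blobs in frames:
        for name, ref in palette.items():
            # Closest blob in color space; reject matches beyond the tolerance.
            best = min(blobs, key=lambda p: dist(blobs[p], ref), default=None)
            if best is not None and dist(blobs[best], ref) <= tol:
                tracks[name].append(best)
            else:
                tracks[name].append(None)
    return tracks
```

Because each pattern region has a distinct color, correspondence between frames falls out of the color match alone, with no need for temporal search.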
  • Patent number: 10373342
    Abstract: Views of a virtual environment can be displayed on mobile devices in a real-world environment simultaneously for multiple users. The users can operate selections devices in the real-world environment that interact with objects in the virtual environment. Virtual characters and objects can be moved and manipulated using selection shapes. A graphical interface can be instantiated and rendered as part of the virtual environment. Virtual cameras and screens can also be instantiated to created storyboards, backdrops, and animated sequences of the virtual environment. These immersive experiences with the virtual environment can be used to generate content for users and for feature films.
    Type: Grant
    Filed: October 11, 2017
    Date of Patent: August 6, 2019
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Jose Perez, III, Peter Dollar, Barak Moshe
  • Patent number: 10321117
    Abstract: A method of generating unrecorded camera views may include receiving a plurality of 2-D video sequences of a subject in a real 3-D space, where each 2-D video sequence may depict the subject from a different perspective. The method may also include generating a 3-D representation of the subject in a virtual 3-D space, where a geometry and texture of the 3-D representation may be generated based on the 2D video sequences, and the motion of the 3-D representation in the virtual 3-D space is based on motion of the subject in the real 3-D space. The method may additionally include generating a 2-D video sequence of the motion of the 3D representation using a virtual camera in the virtual 3-D space where the perspective of the virtual camera may be different than the perspectives of the plurality of 2-D video sequences.
    Type: Grant
    Filed: August 25, 2014
    Date of Patent: June 11, 2019
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Hilmar Koch, Ronald Mallet, Kim Libreri, Paige Warner, Mike Sanders, John Gaeta
  • Publication number: 20190122374
    Abstract: Embodiments of the disclosure provide systems and methods for motion capture to generate content (e.g., motion pictures, television programming, videos, etc.). An actor or other performing being can have multiple markers on his or her face that are essentially invisible to the human eye, but that can be clearly captured by camera systems of the present disclosure. Embodiments can capture the performance using two different camera systems, each of which can observe the same performance but capture different images of that performance. For instance, a first camera system can capture the performance within a first light wavelength spectrum (e.g., visible light spectrum), and a second camera system can simultaneously capture the performance in a second light wavelength spectrum different from the first spectrum (e.g., invisible light spectrum such as the IR light spectrum). The images captured by the first and second camera systems can be combined to generate content.
    Type: Application
    Filed: August 13, 2018
    Publication date: April 25, 2019
    Applicant: Lucasfilm Entertainment Company Ltd.
    Inventors: Leandro Estebecorena, John Knoll, Stephane Grabli, Per Karefelt, Pablo Helman, John M. Levin
  • Publication number: 20190124244
    Abstract: Embodiments of the disclosure provide systems and methods for motion capture to generate content (e.g., motion pictures, television programming, videos, etc.). An actor or other performing being can have multiple markers on his or her face that are essentially invisible to the human eye, but that can be clearly captured by camera systems of the present disclosure. Embodiments can capture the performance using two different camera systems, each of which can observe the same performance but capture different images of that performance. For instance, a first camera system can capture the performance within a first light wavelength spectrum (e.g., visible light spectrum), and a second camera system can simultaneously capture the performance in a second light wavelength spectrum different from the first spectrum (e.g., invisible light spectrum such as the IR light spectrum). The images captured by the first and second camera systems can be combined to generate content.
    Type: Application
    Filed: August 13, 2018
    Publication date: April 25, 2019
    Applicant: Lucasfilm Entertainment Company Ltd.
    Inventors: John Knoll, Leandro Estebecorena, Stephane Grabli, Per Karefelt, Pablo Helman, John M. Levin
  • Patent number: 10269165
    Abstract: A system includes a computing device that includes a memory and a processor configured to execute instructions to perform a method that includes receiving multiple representations of one or more expressions of an object. Each representation includes position information attained from one or more images of the object. The method also includes producing an animation model from one or more groups of controls that respectively define each of the one or more expressions of the object as provided by the multiple representations. Each control of each group of controls has an adjustable value that defines the geometry of at least one shape of a portion of the respective expression of the object. Producing the animation model includes producing one or more corrective shapes if the animation model is incapable of accurately presenting the one or more expressions of the object as provided by the multiple representations.
    Type: Grant
    Filed: January 30, 2012
    Date of Patent: April 23, 2019
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Kiran S. Bhat, Michael Koperwas, Rachel M. Rose, Jung-Seung Hong, Frederic P. Pighin, Christopher David Twigg, Cary Phillips, Steve Sullivan
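The patent above produces corrective shapes when the animation model cannot accurately reproduce a captured expression. A common way to frame this (an assumption here; the patent does not specify the solver) is a least-squares fit of control weights over a shape basis, with the residual becoming the corrective shape:

```python
import numpy as np

def fit_expression(basis, target):
    """Least-squares fit of control weights for one captured expression.

    basis: (n_shapes, n_points) array of blendshape deltas from neutral.
    target: (n_points,) array of observed deltas from the neutral pose.
    Returns (weights, corrective), where `corrective` is the residual shape
    that, added to the weighted basis, reproduces the target exactly.
    """
    weights, *_ = np.linalg.lstsq(basis.T, target, rcond=None)
    corrective = target - basis.T @ weights  # what the rig cannot express
    return weights, corrective
```

When the target lies in the span of the basis the corrective shape is (numerically) zero; otherwise it captures exactly the detail the existing controls miss.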
  • Patent number: 10269169
    Abstract: In one general aspect, a method is described. The method includes generating a positional relationship between one or more support structures having at least one motion capture mark and at least one virtual structure corresponding to geometry of an object to be tracked and positioning the support structures on the object to be tracked. The support structures have sufficient rigidity that, if there are multiple marks, the marks on each support structure maintain substantially fixed distances from each other in response to movement by the object. The method also includes determining an effective quantity of ray traces between one or more camera views and one or more marks on the support structures, and estimating an orientation of the virtual structure by aligning the determined effective quantity of ray traces with a known configuration of marks on the support structures.
    Type: Grant
    Filed: August 9, 2016
    Date of Patent: April 23, 2019
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Steve Sullivan, Colin Davidson, Michael Sanders, Kevin Wooley
  • Patent number: 10147219
    Abstract: Performance capture systems and techniques are provided for capturing a performance of a subject and reproducing an animated performance that tracks the subject's performance. For example, systems and techniques are provided for determining control values for controlling an animation model to define features of a computer-generated representation of a subject based on the performance. A method may include obtaining input data corresponding to a pose performed by the subject, the input data including position information defining positions on a face of the subject. The method may further include obtaining an animation model for the subject that includes adjustable controls that control the animation model to define facial features of the computer-generated representation of the face, and matching one or more of the positions on the face with one or more corresponding positions on the animation model.
    Type: Grant
    Filed: February 3, 2017
    Date of Patent: December 4, 2018
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Kiran Bhat, Michael Koperwas, Jeffery Yost, Ji Hun Yu, Sheila Santos
  • Patent number: 10142561
    Abstract: A handheld device includes: an input control configured to control and modify a virtual scene including a virtual camera; and a display that shows a representation of the controlled and modified virtual scene generated by the virtual camera. A system includes: a computer system configured to execute program instructions for generating a virtual scene including a virtual camera; and a handheld device configured to communicate with the computer system for controlling and modifying the virtual scene, the handheld device comprising: an input control configured to control and modify the virtual scene; and a display that shows a representation of the controlled and modified virtual scene generated by the virtual camera.
    Type: Grant
    Filed: March 20, 2017
    Date of Patent: November 27, 2018
    Assignee: Lucasfilm Entertainment Company Ltd.
    Inventors: Spencer Reynolds, Michael Sanders, Kevin Wooley, Steve Sullivan, Adam Schnitzer
  • Patent number: 10109062
    Abstract: Systems and methods are provided for a non-coherent point tracking process that allows unknown camera motion to be estimated. One or more edges can be identified in images captured by a camera when shooting a scene. For each of the identified edges in the images, at least one tracking object can be placed arbitrarily on the edge. The positions of tracking objects in the images can then be used to estimate a camera motion. In some embodiments, two tracking objects can be placed arbitrarily on the edge to represent the edge and move along the edge arbitrarily from image to image where the edge appears. Multiple such edges can be identified in the images, and camera motions in multiple directions can be estimated based on the identified edges and combined to obtain a combined camera motion indicating the camera's movement in a 3D space when shooting the scene.
    Type: Grant
    Filed: September 30, 2016
    Date of Patent: October 23, 2018
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventor: Douglas Moore
  • Patent number: 10078917
    Abstract: A method may include rendering a first view of a three-dimensional (3-D) virtual scene comprising a view of first content being displayed on a virtual display device from a location in the 3-D virtual scene. The method may also include rendering a second view comprising one or more content objects. The second view may be rendered from the location in the 3-D virtual scene, and the second view may include a view of the display device as would be seen through a pair of augmented-reality glasses that display the one or more content objects. The method may additionally include generating a composite view by combining the first view and the second view. The method may further include causing the composite view to be displayed on a virtual-reality headset.
    Type: Grant
    Filed: October 10, 2016
    Date of Patent: September 18, 2018
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: John Gaeta, Michael Koperwas, Nicholas Rasmussen
  • Patent number: 10055874
    Abstract: In some embodiments a method of transferring facial expressions from a subject to a computer-generated character is provided where the method includes receiving positional information from a motion capture session of the subject representing a performance having facial expressions to be transferred to the computer-generated character, receiving a first animation model that represents the subject, and receiving a second animation model that represents the computer-generated character. Each of the first and second animation models can include a plurality of adjustable controls that define geometries of the model and that can be adjusted to present different facial expressions on the model, and where the first and second animation models are designed so that setting the same values for the same set of adjustable controls in each model generates similar facial poses on the models.
    Type: Grant
    Filed: August 20, 2015
    Date of Patent: August 21, 2018
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Ji Hun Yu, Michael Koperwas, Jeffrey Bruce Yost, Sheila Santos, Kiran S. Bhat
  • Patent number: 9858700
    Abstract: Systems and techniques are provided for transferring changes in animation data between geometric models of a character having different resolutions. For example, systems and techniques are provided for transferring changes in geometric properties between the geometric models. A method may include obtaining a first geometric model of the character and a second geometric model of the character, the geometric models having different resolutions with different numbers of data points. The method may further include determining one or more correspondences between data points of the first geometric model and data points of the second geometric model. The correspondences include one or more data points of the first geometric model that overlap with one or more data points of the second geometric model.
    Type: Grant
    Filed: May 13, 2015
    Date of Patent: January 2, 2018
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Rachel Rose, Yuting Ye, Scott Jones
  • Patent number: 9854176
    Abstract: Systems and techniques for dynamically capturing and reconstructing lighting are provided. The systems and techniques may be based on a stream of images capturing the lighting within an environment as a scene is shot. Reconstructed lighting data may be used to illuminate a character in a computer-generated environment as the scene is shot. For example, a method may include receiving a stream of images representing lighting of a physical environment. The method may further include compressing the stream of images to reduce an amount of data used in reconstructing the lighting of the physical environment and may further include outputting the compressed stream of images for reconstructing the lighting of the physical environment using the compressed stream, the reconstructed lighting being used to render a computer-generated environment.
    Type: Grant
    Filed: January 24, 2014
    Date of Patent: December 26, 2017
    Assignee: Lucasfilm Entertainment Company Ltd.
    Inventors: Michael Sanders, Kiran Bhat, Curt Isamu Miyashiro, Jason Snell, Stephane Grabli
  • Patent number: 9818201
    Abstract: Methods and systems efficiently apply known distortion, such as of a camera and lens, to source image data to produce data of an output image with the distortion. In an embodiment, an output image field is segmented into regions so that on each segment the distortion function is approximately linear, and segmentation data is stored in a quadtree. The distortion function is applied to the segmented image field to produce a segmented rendered distortion image (SRDI) and a corresponding look-up table. To distort a source image, a location in the output image field is selected, and the uniquely colored segment at the same location in the SRDI is found. The look-up table provides the local linear inverse of the distortion function, which is applied to determine from where in the source image to take image texture data for the distorted output image.
    Type: Grant
    Filed: December 22, 2014
    Date of Patent: November 14, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventor: Ronald Mallet
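The distortion patent above segments the output image field so that the distortion function is approximately linear on each segment, storing the segmentation in a quadtree. As an illustrative sketch of that subdivision criterion (the midpoint-versus-corner-average test, tolerance, and depth limit below are assumptions, not the patent's disclosed method):

```python
def build_quadtree(distort, x0, y0, x1, y1, tol=0.5, depth=0, max_depth=6):
    """Segment [x0,x1] x [y0,y1] so `distort` is near-linear on each cell.

    distort: function (x, y) -> (u, v) mapping output to source coordinates.
    A cell is accepted when the distortion of its center is within `tol` of
    the average of its four corners (the bilinear value at the center);
    otherwise it is split into four children. Leaves are (x0, y0, x1, y1)
    tuples; internal nodes are four-element lists of children.
    """
    corners = [distort(x0, y0), distort(x1, y0), distort(x0, y1), distort(x1, y1)]
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    lin = tuple(sum(c[i] for c in corners) / 4.0 for i in range(2))
    actual = distort(cx, cy)
    err = max(abs(a - b) for a, b in zip(actual, lin))
    if err <= tol or depth >= max_depth:
        return (x0, y0, x1, y1)  # leaf: distortion is locally linear enough
    return [build_quadtree(distort, xa, ya, xb, yb, tol, depth + 1, max_depth)
            for xa, ya, xb, yb in ((x0, y0, cx, cy), (cx, y0, x1, cy),
                                   (x0, cy, cx, y1), (cx, cy, x1, y1))]
```

A globally linear distortion yields a single leaf, while a quadratic (lens-like) distortion forces subdivision, matching the patent's premise that each leaf can carry a local linear inverse in a look-up table.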
  • Patent number: 9811941
    Abstract: Methods, systems and computer program products pertaining to simulating liquid bodies are presented. The subject matter of this document can be embodied in a method that includes obtaining one or more data arrays representing low frequency spatial features of a simulated volume of liquid, and up-sampling the one or more data arrays to produce corresponding high resolution data arrays. The method also includes obtaining procedural data representing high frequency spatial features of a simulated liquid surface, and modifying the one or more high resolution data arrays using the procedural data to produce corresponding modified data arrays that reflect both the high frequency and the low frequency spatial features of the simulated volume of liquid.
    Type: Grant
    Filed: April 30, 2013
    Date of Patent: November 7, 2017
    Assignee: Lucasfilm Entertainment Company Ltd.
    Inventors: Frederick E. Hankins, Nicholas Grant Rasmussen, William Geiger
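The liquid-simulation patent above up-samples low-frequency simulation data and modifies it with procedural high-frequency detail. A 1-D toy version of that pipeline, with the linear up-sampling and sine-wave detail chosen here purely for illustration:

```python
import math

def upsample_with_detail(low, factor, amp=0.05, freq=2.0):
    """Up-sample a 1-D height array and add procedural high-frequency waves.

    low: list of floats (low-frequency simulated surface heights).
    factor: integer up-sampling factor (linear interpolation between samples).
    amp, freq: amplitude and frequency of the procedural sine detail.
    """
    high = []
    for i in range(len(low) - 1):
        for j in range(factor):
            t = j / factor
            base = low[i] * (1 - t) + low[i + 1] * t  # linear up-sample
            x = i + t
            high.append(base + amp * math.sin(2 * math.pi * freq * x))
    high.append(low[-1])
    return high
```

With the detail amplitude set to zero this reduces to plain interpolation, which makes the split between simulated low frequencies and procedural high frequencies easy to see.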
  • Patent number: 9792479
    Abstract: A method of motion capture may include accessing a 3D model of a subject, and associating the 3D model of the subject with a 2D representation of the subject in a plurality of frames. The method may also include identifying a change to the 2D representation of the subject between two or more of the plurality of frames, and deforming the 3D model in a virtual 3D space. In some embodiments, the deforming may be based on the identified change to the 2D representation and at least one constraint restricting how the 3D model can be deformed.
    Type: Grant
    Filed: January 29, 2014
    Date of Patent: October 17, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Ronald Mallet, Kiran S. Bhat, Kevin Wooley
  • Patent number: 9779538
    Abstract: A method may include presenting a scene from linear content on one or more display devices in an immersive environment, and receiving, from a user within the immersive environment, input to change an aspect of the scene. The method may also include accessing 3-D virtual scene information previously used to render the scene, and changing the 3-D virtual scene information according to the changed aspect of the scene. The method may additionally include rendering the 3-D virtual scene to incorporate the changed aspect, and presenting the rendered scene in real time in the immersive user environment.
    Type: Grant
    Filed: December 15, 2014
    Date of Patent: October 3, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Mike Sanders, Kim Libreri, Nicholas Grant Rasmussen, John Gaeta
  • Patent number: 9781354
    Abstract: Among other aspects, a computer-implemented method includes: receiving at least one command in a computer system from a handheld device; positioning a virtual camera and controlling a virtual scene according to the command; and in response to the command, generating an output to the handheld device for displaying a view of the virtual scene as controlled on a display of the handheld device, the view captured by the virtual camera as positioned.
    Type: Grant
    Filed: April 12, 2016
    Date of Patent: October 3, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Kevin Wooley, Michael Sanders, Steve Sullivan, Spencer Reynolds, Brian Cantwell
  • Patent number: 9747716
    Abstract: A system includes a computing device that includes a memory configured to store instructions. The system also includes a processor configured to execute the instructions to perform a method that includes receiving multiple representations of an object. Each of the representations includes position information of the object and corresponds to an instance in time. For at least one of the representations, the method includes defining a contour that represents a movable silhouette of a surface feature of the object. The method also includes producing a deformable model of the surface of the object from the defined contour and from the at least one representation of the object.
    Type: Grant
    Filed: March 15, 2013
    Date of Patent: August 29, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Ronald Mallet, Yuting Ye, Michael Koperwas, Adrian R. Goldenthal, Kiran S. Bhat
  • Patent number: 9734624
    Abstract: A method of compressing a deep image representation may include receiving a deep image, where the deep image may include multiple pixels, and where each pixel in the deep image may include multiple samples. The method may also include compressing the deep image by combining samples in each pixel that are associated with the same primitives. This process may be repeated on a pixel-by-pixel basis. Some embodiments may use primitive IDs to match pixels to primitives through the rendering and compositing process.
    Type: Grant
    Filed: April 30, 2014
    Date of Patent: August 15, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventor: Shijun Haw
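The deep-image patent above compresses a pixel by combining samples associated with the same primitive, using primitive IDs. A sketch of that per-pixel merge, assuming premultiplied colors and "over"-style alpha accumulation (both assumptions; the patent abstract does not specify the blend math):

```python
def compress_pixel(samples):
    """Merge deep-image samples within one pixel by primitive ID.

    samples: list of dicts with keys 'prim', 'depth', 'color', 'alpha'.
    Samples sharing a primitive ID collapse into one sample whose color
    accumulates premultiplied, whose alpha composes "over", and whose
    depth is the nearest of the group. Returns samples sorted by depth.
    """
    merged = {}
    for s in samples:
        if s['prim'] not in merged:
            merged[s['prim']] = {'prim': s['prim'], 'depth': s['depth'],
                                 'color': s['color'] * s['alpha'],
                                 'alpha': s['alpha']}
        else:
            g = merged[s['prim']]
            g['depth'] = min(g['depth'], s['depth'])
            g['color'] += s['color'] * s['alpha']            # premultiplied
            g['alpha'] = g['alpha'] + s['alpha'] * (1.0 - g['alpha'])  # "over"
    return sorted(merged.values(), key=lambda g: g['depth'])
```

Repeating this pixel by pixel bounds the sample count per pixel by the number of distinct primitives covering it, which is the source of the compression.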
  • Patent number: 9734615
    Abstract: An animation analyzer is configured to receive an animation sequence and to identify a subsample of the frames that are to be rendered. A rendering engine is configured to render the subsample of the frames. The rendering engine is further configured to identify the frames that have not been rendered and to generate in-betweens for the frames that have not been rendered. The rendering engine is further configured to assemble the subsample of frames and the in-betweens into a video sequence depicting the animation sequence.
    Type: Grant
    Filed: March 14, 2013
    Date of Patent: August 15, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: John Knoll, Victor Schutz, IV, Mark Nettleton
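    The subsample-and-interpolate idea can be sketched as follows. The `render` callable and the linear blend are stand-ins for a real renderer and in-between generator; this is a hedged illustration, not the patent's method:

```python
def render_with_inbetweens(frames, step, render):
    """Render every `step`-th frame; interpolate the frames in between.

    `frames` is a list of animation inputs; `render` maps an input to a
    numeric image value (a stand-in for a full renderer).
    """
    rendered = {i: render(frames[i]) for i in range(0, len(frames), step)}
    last = len(frames) - 1
    if last not in rendered:
        rendered[last] = render(frames[last])  # always render the final frame
    keys = sorted(rendered)
    out = []
    for i in range(len(frames)):
        if i in rendered:
            out.append(rendered[i])
        else:
            # linear in-between from the nearest rendered neighbours
            lo = max(k for k in keys if k < i)
            hi = min(k for k in keys if k > i)
            t = (i - lo) / (hi - lo)
            out.append(rendered[lo] * (1 - t) + rendered[hi] * t)
    return out

# only indices 0, 2, 4 are "rendered"; 1 and 3 are in-betweens
print(render_with_inbetweens([0, 10, 20, 30, 40], 2, float))
```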
  • Patent number: 9710972
    Abstract: A method may include displaying, on one or more display devices in a virtual-reality environment, a visual representation of a 3-D virtual scene from the perspective of a subject location in the virtual-reality environment. The method may also include displaying, on the one or more display devices, a chroma-key background with the visual representation. The method may further include recording, using a camera, an image of the subject in the virtual-reality environment against the chroma-key background.
    Type: Grant
    Filed: September 11, 2014
    Date of Patent: July 18, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Mike Sanders, Kim Libreri, Nick Rasmussen, John Gaeta
  • Patent number: 9710966
    Abstract: Methods are disclosed for the computer generation of data for images that include hair, fur, or other strand-like material. A volume for the hair is specified, having a plurality of surfaces. A fluid flow simulation is performed within the volume, with a first surface of the volume being a source area through which fluid is simulated to enter the volume, and a second surface being an exit surface through which fluid is simulated as exiting the volume. The fluid flow simulation may be used to produce fluid flow lines, such as from a velocity vector field for the fluid. Fluid flow lines are selected, and image data of hairs that follow the fluid flow lines are generated. Other embodiments include generating animation sequences by generating images wherein the volume and surfaces vary between frames.
    Type: Grant
    Filed: September 16, 2014
    Date of Patent: July 18, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Stephen D. Bowline, Nicholas Grant Rasmussen
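    The flow-line step can be sketched as forward Euler integration through a velocity field. This assumes the fluid simulation has already produced a velocity field (here a simple callable); it is a toy guide-curve tracer, not the patented simulation itself:

```python
def trace_flow_line(start, velocity, step, n_steps):
    """Trace one flow line through a velocity field by Euler integration.

    `velocity` maps an (x, y, z) point to a velocity vector; the returned
    polyline can serve as a guide curve along which hairs are generated.
    """
    pts = [start]
    p = start
    for _ in range(n_steps):
        v = velocity(p)
        p = tuple(pi + step * vi for pi, vi in zip(p, v))
        pts.append(p)
    return pts

# constant upward flow: the guide hair points straight up from its root
line = trace_flow_line((0.0, 0.0, 0.0), lambda p: (0.0, 1.0, 0.0),
                       step=0.25, n_steps=4)
print(line[-1])  # ends 1.0 unit above the start
```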
  • Patent number: 9704290
    Abstract: A method may include receiving a plurality of objects from a 3-D virtual scene. The plurality of objects may be arranged in a hierarchy. The method may also include generating a plurality of identifiers for the plurality of objects. The plurality of identifiers may include a first identifier for a first object in the plurality of objects, and the identifier may be generated based on a position of the first object in the hierarchy. The method may additionally include performing a rendering operation on the plurality of objects to generate a deep image. The deep image may include a plurality of samples that correspond to the first object. The method may further include propagating the plurality of identifiers through the rendering operation such that each of the plurality of samples in the deep image that correspond to the first object are associated with the identifier.
    Type: Grant
    Filed: September 30, 2014
    Date of Patent: July 11, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Shijun Haw, Xavier Bernasconi
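    Generating an identifier from an object's position in the hierarchy might look like the following sketch, where the ID is the tuple of child indices from the root (the node encoding and helper name are assumptions for illustration, not the patented scheme):

```python
def assign_ids(node, path=()):
    """Assign each object an ID derived from its place in the hierarchy.

    A node is (name, [children]). The ID is the tuple of child indices
    from the root, so every deep-image sample rendered from an object can
    carry a stable reference back to it.
    """
    name, children = node
    ids = {name: path}
    for i, child in enumerate(children):
        ids.update(assign_ids(child, path + (i,)))
    return ids

scene = ("root", [("ship", [("hull", []), ("engine", [])]), ("planet", [])])
print(assign_ids(scene))
```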
  • Patent number: 9704229
    Abstract: A method of rendering stereo images includes receiving a first image of an object comprising rendered color information. The first image of the object is rendered using a first camera position. The method also includes receiving a second image of the object. The second image of the object is rendered using a second camera position, and pixels in the second image comprise locations of corresponding pixels in the first image. The method additionally includes coloring the pixels in the second image using colors of the corresponding pixels in the first image to generate a third image. The first image and the third image are stereo images for a 3-D presentation.
    Type: Grant
    Filed: February 17, 2016
    Date of Patent: July 11, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Victor Schutz, Patrick Conran
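    The color-lookup step can be illustrated with a toy sketch, where the second view stores, per pixel, the coordinates of its corresponding pixel in the first image (data layout and hole handling are assumptions, not the patented method):

```python
def color_second_view(second_view, first_colors):
    """Color a second-camera render by looking up colors in the first.

    `second_view` holds, per pixel, the (row, col) of the corresponding
    pixel in the first image, or None where nothing corresponds.
    `first_colors` is the first image's color grid.
    """
    out = []
    for row in second_view:
        out_row = []
        for ref in row:
            if ref is None:
                out_row.append((0, 0, 0))  # hole: no correspondence
            else:
                r, c = ref
                out_row.append(first_colors[r][c])
        out.append(out_row)
    return out

first = [[(255, 0, 0), (0, 255, 0)]]
second = [[(0, 1), None]]
print(color_second_view(second, first))
```

    Reusing the first eye's shading this way avoids re-rendering color for the second eye; only the correspondences need to be rendered.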
  • Publication number: 20170195527
    Abstract: A handheld device includes: an input control configured to control and modify a virtual scene including a virtual camera; and a display that shows a representation of the controlled and modified virtual scene generated by the virtual camera. A system includes: a computer system configured to execute program instructions for generating a virtual scene including a virtual camera; and a handheld device configured to communicate with the computer system for controlling and modifying the virtual scene, the handheld device comprising: an input control configured to control and modify the virtual scene; and a display that shows a representation of the controlled and modified virtual scene generated by the virtual camera.
    Type: Application
    Filed: March 20, 2017
    Publication date: July 6, 2017
    Applicant: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Spencer Reynolds, Michael Sanders, Kevin Wooley, Steve Sullivan, Adam Schnitzer
  • Publication number: 20170178382
    Abstract: A multi-channel tracking pattern is provided along with techniques and systems for performing motion capture using the multi-channel tracking pattern. The multi-channel tracking pattern includes a plurality of shapes having different colors on different portions of the pattern. The portions with the unique shapes and colors allow a motion capture system to track motion of an object bearing the pattern across a plurality of video frames.
    Type: Application
    Filed: February 11, 2016
    Publication date: June 22, 2017
    Applicant: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventor: John Levin
  • Patent number: 9684993
    Abstract: A method includes receiving a first motion path for an object, where an orientation of the object is not aligned with the first motion path for the object for at least a portion of the first motion path. The method also includes receiving a first motion path for a virtual camera and determining a speed of the object along the first motion path for the object. The method additionally includes calculating a second motion path for the object based on the speed of the object along the first motion path for the object and the orientation of the object, where the orientation of the object is aligned with the second motion path. The method further includes calculating a second motion path for the virtual camera based on a difference between the first motion path of the object and the second motion path of the object.
    Type: Grant
    Filed: September 23, 2015
    Date of Patent: June 20, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventor: David Weitzberg
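    The "determining a speed of the object along the first motion path" step can be sketched for a sampled path as finite differences between consecutive samples (a toy illustration under an assumed path representation, not the patented computation):

```python
import math

def speeds_along_path(points, dt=1.0):
    """Per-segment speed of an object moving along a sampled motion path.

    `points` are positions at uniform time steps of length `dt`; the speed
    over each segment is the distance covered divided by the time step.
    """
    return [math.dist(a, b) / dt for a, b in zip(points, points[1:])]

# the object covers a 3-4-5 triangle's hypotenuse, then holds still
path = [(0.0, 0.0), (3.0, 4.0), (3.0, 4.0)]
print(speeds_along_path(path))  # [5.0, 0.0]
```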
  • Patent number: 9672417
    Abstract: In one aspect, a computer-implemented method of motion capture includes tracking the motion of a dynamic object bearing a pattern configured such that a first portion of the pattern is tracked at a first resolution and a second portion of the pattern is tracked at a second resolution. The method further includes causing data representing the motion to be stored to a computer readable medium.
    Type: Grant
    Filed: December 31, 2015
    Date of Patent: June 6, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY, LTD.
    Inventors: Kevin Wooley, Ronald Mallet
  • Patent number: 9665966
    Abstract: Representing a connection between objects in a simulation includes identifying, on a source object to be used in a simulation process, a source point that is available for creating a connection to another object. On a target object, a target point is identified for use in creating the connection. At least one of the source object and the target object is a rigid object. A spring element having respective ends at the source point and the target point is created, and at least one property thereof is set. A system includes an animation module, and a simulation module that performs a simulation process involving creating, and setting a property of, a spring element having respective ends at a source point on the source object and a target point on the target object.
    Type: Grant
    Filed: December 1, 2006
    Date of Patent: May 30, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Brice Criswell, Karin Cooper, Don Hatch, James Robert Tooley
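    A spring element with ends at a source and a target point can be sketched with Hooke's law (function name, signature, and the force model are illustrative assumptions, not the patented simulation):

```python
import math

def spring_force(src, tgt, rest_length, stiffness):
    """Force the spring exerts on the source point.

    Pulls the source toward the target when the spring is stretched past
    its rest length, and pushes it away when compressed.
    """
    delta = [t - s for s, t in zip(src, tgt)]
    dist = math.sqrt(sum(d * d for d in delta))
    if dist == 0.0:
        return (0.0, 0.0, 0.0)  # coincident ends: no defined direction
    magnitude = stiffness * (dist - rest_length)
    return tuple(magnitude * d / dist for d in delta)

# spring stretched to twice its rest length pulls the source toward +x
print(spring_force((0.0, 0.0, 0.0), (2.0, 0.0, 0.0),
                   rest_length=1.0, stiffness=10.0))
```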
  • Publication number: 20170148201
    Abstract: Performance capture systems and techniques are provided for capturing a performance of a subject and reproducing an animated performance that tracks the subject's performance. For example, systems and techniques are provided for determining control values for controlling an animation model to define features of a computer-generated representation of a subject based on the performance. A method may include obtaining input data corresponding to a pose performed by the subject, the input data including position information defining positions on a face of the subject. The method may further include obtaining an animation model for the subject that includes adjustable controls that control the animation model to define facial features of the computer-generated representation of the face, and matching one or more of the positions on the face with one or more corresponding positions on the animation model.
    Type: Application
    Filed: February 3, 2017
    Publication date: May 25, 2017
    Applicant: Lucasfilm Entertainment Company Ltd.
    Inventors: Kiran Bhat, Michael Koperwas, Jeffery Yost, Ji Hun Yu, Sheila Santos
  • Patent number: 9641830
    Abstract: Methods and systems are disclosed for calibrating a camera using a calibration target apparatus that contains at least one fiducial marking on a planar surface. All planar markings on the apparatus are distinguishable from one another. Parameters of the camera are inferred from at least one image of the calibration target apparatus. In some embodiments, pixel coordinates of identified fiducial markings in an image are used with geometric knowledge of the apparatus to calculate camera parameters.
    Type: Grant
    Filed: April 8, 2014
    Date of Patent: May 2, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Ronald Mallet, Jason Snell, Jeff Saltzman, Douglas Moore, Paige Warner
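    A minimal instance of inferring a camera parameter from fiducial pixel coordinates plus geometric knowledge is the pinhole focal-length relation below. It is a deliberately simplified fronto-parallel case (the real calibration handles full pose and distortion, which this sketch does not attempt):

```python
def estimate_focal_length(pixel_span, world_span, distance):
    """Pinhole relation for a fronto-parallel planar target.

    Two fiducials a known physical distance apart (`world_span`, metres),
    viewed at a known `distance` (metres), appear `pixel_span` pixels
    apart; similar triangles give the focal length in pixels.
    """
    return pixel_span * distance / world_span

# markers 0.5 m apart, 2 m away, imaged 200 px apart -> f = 800 px
print(estimate_focal_length(pixel_span=200.0, world_span=0.5, distance=2.0))
```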
  • Patent number: 9626786
    Abstract: A handheld device includes: an input control configured to control and modify a virtual scene including a virtual camera; and a display that shows a representation of the controlled and modified virtual scene generated by the virtual camera. A system includes: a computer system configured to execute program instructions for generating a virtual scene including a virtual camera; and a handheld device configured to communicate with the computer system for controlling and modifying the virtual scene, the handheld device comprising: an input control configured to control and modify the virtual scene; and a display that shows a representation of the controlled and modified virtual scene generated by the virtual camera.
    Type: Grant
    Filed: December 30, 2010
    Date of Patent: April 18, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Spencer Reynolds, Michael Sanders, Kevin Wooley, Steve Sullivan, Adam Schnitzer
  • Publication number: 20170084072
    Abstract: A method includes receiving a first motion path for an object, where an orientation of the object is not aligned with the first motion path for the object for at least a portion of the first motion path. The method also includes receiving a first motion path for a virtual camera and determining a speed of the object along the first motion path for the object. The method additionally includes calculating a second motion path for the object based on the speed of the object along the first motion path for the object and the orientation of the object, where the orientation of the object is aligned with the second motion path. The method further includes calculating a second motion path for the virtual camera based on a difference between the first motion path of the object and the second motion path of the object.
    Type: Application
    Filed: September 23, 2015
    Publication date: March 23, 2017
    Applicant: LUCASFILM ENTERTAINMENT COMPANY, LTD.
    Inventor: David Weitzberg
  • Patent number: 9600742
    Abstract: Performance capture systems and techniques are provided for capturing a performance of a subject and reproducing an animated performance that tracks the subject's performance. For example, systems and techniques are provided for determining control values for controlling an animation model to define features of a computer-generated representation of a subject based on the performance. A method may include obtaining input data corresponding to a pose performed by the subject, the input data including position information defining positions on a face of the subject. The method may further include obtaining an animation model for the subject that includes adjustable controls that control the animation model to define facial features of the computer-generated representation of the face, and matching one or more of the positions on the face with one or more corresponding positions on the animation model.
    Type: Grant
    Filed: May 5, 2015
    Date of Patent: March 21, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Kiran Bhat, Michael Koperwas, Jeffery Yost, Ji Hun Yu, Sheila Santos
  • Publication number: 20170046865
    Abstract: Systems and techniques are provided for performing animation motion capture of objects within an environment. For example, a method may include obtaining input data including a three-dimensional point cloud of the environment. The three-dimensional point cloud is generated using a three-dimensional laser scanner including multiple laser emitters and multiple laser receivers. The method may further include obtaining an animation model for an object within the environment. The animation model includes a mesh, an animation skeleton rig, and adjustable controls that control the animation skeleton rig to define a position of one or more faces of the mesh. The method may further include determining a pose of the object within the environment. Determining a pose includes fitting the one or more faces of the mesh to one or more points of a portion of the three-dimensional point cloud. The portion of the three-dimensional point cloud corresponds to the object in the environment.
    Type: Application
    Filed: August 14, 2015
    Publication date: February 16, 2017
    Applicant: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventor: Brian Cantwell
  • Patent number: 9558578
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for animation. An animation application creates an animation environment. An input device receives input from a user to the animation application. An output device displays output to the user of the animation application. The animation application is configured to have a mode of operation that includes displaying, through the output device, a 3D animation view of the animation environment overlain by a 2D edit view of the animation environment.
    Type: Grant
    Filed: March 14, 2013
    Date of Patent: January 31, 2017
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Adam Schnitzer, Max S-Han Chen, Domenico Porcino, Louise Rasmussen, Greg James, Jonathan Stone, Steve Sullivan, Kent Oberheu
  • Patent number: 9519976
    Abstract: A stereoscopic camera calibration target includes: first illumination points on a first surface; second illumination points on a second surface, the first and second surfaces being planar, parallel to each other, and spaced from each other; and circuitry that sets a strobe frequency of the first and second illumination points. A method includes: moving a calibration target in front of a stereoscopic camera, the calibration target comprising first points on a first surface and second points on a second surface, the first and second surfaces being planar, parallel to each other, and spaced from each other; capturing, using the stereoscopic camera, an image sequence of the calibration target, the image sequence comprising pairs of left and right images of at least some of the first and second points; determining a calibration value for the stereoscopic camera using the image sequence; and processing the image sequence using the calibration value.
    Type: Grant
    Filed: January 28, 2011
    Date of Patent: December 13, 2016
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventor: Jeffrey Saltzman