Patents Assigned to Lucasfilm Entertainment Company Ltd.
  • Patent number: 11514653
    Abstract: An immersive content presentation system can capture the motion or position of a performer in a real-world environment. A game engine can be modified to receive the position or motion of the performer and identify predetermined gestures or positions that can be used to trigger actions in a 3-D virtual environment, such as generating a digital effect, transitioning virtual assets through an animation graph, adding new objects, and so forth. Views of the 3-D environment can be rendered, and composited views can be generated. Information for constructing the composited views can be streamed to numerous display devices in many different physical locations using a customized communication protocol. Multiple real-world performers can interact with virtual objects through the game engine in a shared mixed-reality experience.
    Type: Grant
    Filed: October 6, 2021
    Date of Patent: November 29, 2022
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Roger Cordes, David Brickhill
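    Illustrative sketch (not the patented implementation): one way a capture pipeline could map recognized performer poses to engine-side actions, in the spirit of the abstract above. The Pose structure, joint names, gesture predicate, and action names are all assumptions made for illustration.
    ```python
    from dataclasses import dataclass
    from typing import Callable, Dict, List, Tuple

    @dataclass
    class Pose:
        # joint name -> (x, y, z) world position for one captured mocap frame
        joints: Dict[str, Tuple[float, float, float]]

    def is_arms_raised(pose: Pose) -> bool:
        """Example gesture test: both wrists above the head joint (y-up)."""
        head_y = pose.joints["head"][1]
        return (pose.joints["left_wrist"][1] > head_y and
                pose.joints["right_wrist"][1] > head_y)

    # Registry mapping gesture predicates to the action they trigger in the 3-D scene.
    GESTURE_ACTIONS: List[Tuple[Callable[[Pose], bool], str]] = [
        (is_arms_raised, "spawn_digital_effect"),
    ]

    def process_frame(pose: Pose, trigger: Callable[[str], None]) -> None:
        """Call once per captured frame; fires the engine-side action for every matched gesture."""
        for predicate, action in GESTURE_ACTIONS:
            if predicate(pose):
                trigger(action)
    ```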
  • Patent number: 11508125
    Abstract: Systems and techniques are provided for switching between different modes of a media content item. A media content item may include a movie that has different modes, such as a cinematic mode and an interactive mode. For example, a movie may be presented in a cinematic mode that does not allow certain user interactions with the movie. The movie may be switched to an interactive mode during any point of the movie, allowing a viewer to interact with various aspects of the movie. The movie may be displayed using different formats and resolutions depending on the mode in which the movie is being presented.
    Type: Grant
    Filed: March 12, 2020
    Date of Patent: November 22, 2022
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Lutz Markus Latta, Ian Wakelin, Darby Johnston, Andrew Grant, John Gaeta
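    A minimal sketch of the mode-switching idea described above, assuming a simple per-mode render-settings table; the mode names, resolutions, and Playback class are illustrative and not taken from the patent.
    ```python
    from dataclasses import dataclass

    @dataclass
    class RenderSettings:
        width: int
        height: int
        allow_input: bool   # whether viewer interaction is accepted in this mode

    # Each mode carries its own display format and resolution, per the abstract above.
    MODES = {
        "cinematic":   RenderSettings(3840, 2160, allow_input=False),
        "interactive": RenderSettings(1920, 1080, allow_input=True),
    }

    class Playback:
        def __init__(self) -> None:
            self.mode = "cinematic"
            self.timecode = 0.0     # seconds into the movie

        def switch_mode(self, mode: str) -> RenderSettings:
            """Switch modes at any point without losing the current timeline position."""
            if mode not in MODES:
                raise ValueError(f"unknown mode: {mode}")
            self.mode = mode
            return MODES[mode]
    ```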
  • Publication number: 20220345234
    Abstract: Some implementations of the disclosure relate to using a model trained on mixing console data of sound mixes to automate the process of sound mix creation. In one implementation, a non-transitory computer-readable medium has executable instructions stored thereon that, when executed by a processor, cause the processor to perform operations comprising: obtaining a first version of a sound mix; extracting first audio features from the first version of the sound mix; obtaining mixing metadata; automatically calculating with a trained model, using at least the mixing metadata and the first audio features, mixing console features; and deriving a second version of the sound mix using at least the mixing console features calculated by the trained model.
    Type: Application
    Filed: April 21, 2021
    Publication date: October 27, 2022
    Applicant: Lucasfilm Entertainment Company Ltd. LLC
    Inventors: Stephen Morris, Scott Levine, Nicolas Tsingos
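    A hedged sketch of the pipeline the abstract outlines: extract audio features from a first mix, combine them with mixing metadata, have a trained model predict mixing-console features, and apply those features to derive a second mix. The feature choices, the per-channel-gain interpretation of console features, and the model.predict interface (a scikit-learn-style regressor) are assumptions.
    ```python
    import numpy as np

    def extract_audio_features(mix: np.ndarray, sample_rate: int) -> np.ndarray:
        """Toy features: per-channel RMS level and spectral centroid (mix: channels x samples)."""
        rms = np.sqrt(np.mean(mix ** 2, axis=-1))
        spectrum = np.abs(np.fft.rfft(mix, axis=-1))
        freqs = np.fft.rfftfreq(mix.shape[-1], d=1.0 / sample_rate)
        centroid = (spectrum * freqs).sum(axis=-1) / (spectrum.sum(axis=-1) + 1e-9)
        return np.concatenate([rms, centroid])

    def derive_second_mix(mix: np.ndarray, sample_rate: int,
                          mixing_metadata: np.ndarray, model) -> np.ndarray:
        """Predict per-channel console gains from features plus metadata and apply them."""
        features = np.concatenate([extract_audio_features(mix, sample_rate), mixing_metadata])
        gains = model.predict(features[None, :])[0]     # e.g., a scikit-learn regressor
        return mix * gains[:, None]                     # apply predicted fader levels per channel
    ```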
  • Patent number: 11403769
    Abstract: A motion capture tracking device comprising a base portion including a first alignment feature, a first magnetic element, and an attachment mechanism operative to mechanically couple the base portion to a rod, a detachable end cap configured to be removably mated with the base portion, and a plurality of motion capture markers coupled to the end cap. The detachable end cap can include a second alignment feature and a second magnetic element, such that, during a mating event in which the detachable end cap is coupled to the base portion, the second alignment feature cooperates with the first alignment feature to ensure that the base portion and detachable end cap are mated in accordance with a unique registration and the second magnetic element cooperates with the first magnetic element to magnetically retain the detachable end cap in physical contact with the base portion upon completion of the mating event.
    Type: Grant
    Filed: September 1, 2020
    Date of Patent: August 2, 2022
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventor: Paige M. Warner
  • Patent number: 11308693
    Abstract: A method of edge loop selection includes accessing a polygon mesh; receiving a selection of a first edge connected to a first non-four-way intersection vertex; receiving, after receiving the selection of the first edge, a selection of a second edge connected to the first non-four-way intersection vertex; in response to receiving a command invoking an edge loop selection process: evaluating a topological relationship between the first edge and the second edge; determining a rule for processing a non-four-way intersection vertex based on the topological relationship between the first edge and the second edge; and completing an edge loop by, from the second edge, processing each respective four-way intersection vertex by choosing a middle edge as a next edge at the respective four-way intersection vertex, and processing each respective non-four-way intersection vertex based on the rule.
    Type: Grant
    Filed: July 16, 2020
    Date of Patent: April 19, 2022
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventor: Colette Mullenhoff
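    A sketch of the edge-loop walk the abstract describes: starting from an edge, keep choosing the "middle" edge at every four-way intersection vertex until the loop closes or a non-four-way vertex is reached (where the derived rule would take over). The mesh adjacency API (mesh.edges_at returning edges in cyclic order around a vertex) is hypothetical.
    ```python
    from typing import List, Optional, Tuple

    Edge = Tuple[int, int]   # an edge as a pair of vertex indices

    def opposite_edge(mesh, vertex: int, incoming: Edge) -> Optional[Edge]:
        """At a four-way vertex, return the edge 'straight across' from the incoming edge."""
        edges: List[Edge] = mesh.edges_at(vertex)   # hypothetical: edges in cyclic order
        if len(edges) != 4:
            return None                             # non-four-way vertex: defer to the chosen rule
        i = edges.index(incoming)
        return edges[(i + 2) % 4]                   # the middle of the three remaining edges

    def walk_edge_loop(mesh, start: Edge) -> List[Edge]:
        """Extend a loop from `start`, crossing straight through every four-way vertex."""
        loop, edge = [start], start
        vertex = start[1]                           # walk away from start[0]
        while True:
            nxt = opposite_edge(mesh, vertex, edge)
            if nxt is None or nxt == start:
                break                               # stopped at a non-four-way vertex or closed the loop
            loop.append(nxt)
            vertex = nxt[1] if nxt[0] == vertex else nxt[0]
            edge = nxt
        return loop
    ```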
  • Publication number: 20220067948
    Abstract: A motion capture tracking device comprising a base portion including a first alignment feature, a first magnetic element, and an attachment mechanism operative to mechanically couple the base portion to a rod, a detachable end cap configured to be removably mated with the base portion, and a plurality of motion capture markers coupled to the end cap. The detachable end cap can include a second alignment feature and a second magnetic element, such that, during a mating event in which the detachable end cap is coupled to the base portion, the second alignment feature cooperates with the first alignment feature to ensure that the base portion and detachable end cap are mated in accordance with a unique registration and the second magnetic element cooperates with the first magnetic element to magnetically retain the detachable end cap in physical contact with the base portion upon completion of the mating event.
    Type: Application
    Filed: September 1, 2020
    Publication date: March 3, 2022
    Applicant: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventor: Paige M. Warner
  • Publication number: 20220058870
    Abstract: Some implementations of the disclosure are directed to techniques for facial reconstruction from a sparse set of facial markers. In one implementation, a method comprises: obtaining data comprising a captured facial performance of a subject with a plurality of facial markers; determining a three-dimensional (3D) bundle corresponding to each of the plurality of facial markers of the captured facial performance; using at least the determined 3D bundles to retrieve, from a facial dataset comprising a plurality of facial shapes of the subject, a local geometric shape corresponding to each of the plurality of facial markers; and merging the retrieved local geometric shapes to create a facial reconstruction of the subject for the captured facial performance.
    Type: Application
    Filed: November 5, 2021
    Publication date: February 24, 2022
    Applicant: Lucasfilm Entertainment Company Ltd. LLC
    Inventors: Matthew Cong, Ronald Fedkiw, Lana Lan
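    Illustrative sketch only: for each marker's triangulated 3-D bundle, retrieve the local geometric patch from a per-marker shape dataset that best matches it, then merge the patches into one reconstruction. The nearest-centroid retrieval and the simple stacking merge are stand-ins, not the published method.
    ```python
    from typing import Dict, List

    import numpy as np

    def retrieve_local_shape(bundle: np.ndarray, candidates: List[np.ndarray]) -> np.ndarray:
        """Pick the candidate local patch (N x 3 vertices) whose centroid is closest to the 3-D bundle."""
        distances = [np.linalg.norm(patch.mean(axis=0) - bundle) for patch in candidates]
        return candidates[int(np.argmin(distances))]

    def reconstruct_face(bundles: Dict[str, np.ndarray],
                         dataset: Dict[str, List[np.ndarray]]) -> np.ndarray:
        """Retrieve one local patch per marker and merge them into a single point set."""
        patches = [retrieve_local_shape(bundle, dataset[marker])
                   for marker, bundle in bundles.items()]
        return np.vstack(patches)   # a real merge would blend overlapping regions smoothly
    ```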
  • Patent number: 11238619
    Abstract: Views of a virtual environment can be displayed on mobile devices in a real-world environment simultaneously for multiple users. The users can operate selection devices in the real-world environment that interact with objects in the virtual environment. Virtual characters and objects can be moved and manipulated using selection shapes. A graphical interface can be instantiated and rendered as part of the virtual environment. Virtual cameras and screens can also be instantiated to create storyboards, backdrops, and animated sequences of the virtual environment. These immersive experiences with the virtual environment can be used to generate content for users and for feature films.
    Type: Grant
    Filed: March 16, 2020
    Date of Patent: February 1, 2022
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Jose Perez, III, Peter Dollar, Barak Moshe
  • Publication number: 20220005279
    Abstract: An immersive content presentation system and associated techniques can detect and correct lighting artifacts caused by movements of one or more taking cameras in a performance area consisting of multiple displays (e.g., LED or LCD displays). The techniques include capturing, with a camera, a plurality of images of a performer performing in a performance area at least partially surrounded by one or more displays presenting images of a virtual environment. The images of the virtual environment within a frustum of the camera are updated on the one or more displays based on movement of the camera, while images of the virtual environment outside of the frustum of the camera are not updated based on movement of the camera. The techniques further include generating content based on the plurality of captured images.
    Type: Application
    Filed: September 22, 2021
    Publication date: January 6, 2022
    Applicant: LUCASFILM ENTERTAINMENT COMPANY LTD. LLC
    Inventors: Roger Cordes, Nicholas Rasmussen, Kevin Wooley, Rachel Rose
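    A minimal sketch, assuming a standard pinhole view/projection model, of the inner-frustum test implied by the abstract: wall regions that project inside the taking camera's frustum are re-rendered as the camera moves, while regions outside it are left unchanged. The region and callback interfaces are hypothetical.
    ```python
    import numpy as np

    def in_frustum(point_world: np.ndarray, view: np.ndarray, proj: np.ndarray) -> bool:
        """True if a world-space point projects inside the camera's clip volume."""
        clip = proj @ view @ np.append(point_world, 1.0)
        if clip[3] <= 0.0:
            return False                           # behind the camera
        ndc = clip[:3] / clip[3]
        return bool(np.all(np.abs(ndc) <= 1.0))

    def update_wall(regions, camera_view, camera_proj, render_inner, keep_outer):
        """Re-render only the wall regions whose centers fall inside the taking camera's frustum."""
        for region in regions:
            if in_frustum(region.center, camera_view, camera_proj):
                render_inner(region)   # content that tracks the camera's movement
            else:
                keep_outer(region)     # static outer-frustum content, left unchanged
    ```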
  • Publication number: 20210407199
    Abstract: A method of edge loop selection includes accessing a polygon mesh; receiving a selection of a first edge connected to a first non-four-way intersection vertex; receiving, after receiving the selection of the first edge, a selection of a second edge connected to the first non-four-way intersection vertex; in response to receiving a command invoking an edge loop selection process: evaluating a topological relationship between the first edge and the second edge; determining a rule for processing a non-four-way intersection vertex based on the topological relationship between the first edge and the second edge; and completing an edge loop by, from the second edge, processing each respective four-way intersection vertex by choosing a middle edge as a next edge at the respective four-way intersection vertex, and processing each respective non-four-way intersection vertex based on the rule.
    Type: Application
    Filed: July 16, 2020
    Publication date: December 30, 2021
    Applicant: Lucasfilm Entertainment Company Ltd.
    Inventor: Colette Mullenhoff
  • Publication number: 20210407174
    Abstract: A method of rendering an image includes receiving information of a virtual camera, including a camera position and a camera orientation defining a virtual screen; receiving information of a target screen, including a target screen position and a target screen orientation defining a plurality of pixels, each respective pixel corresponding to a respective UV coordinate on the target screen; for each respective pixel of the target screen: determining a respective XY coordinate of a corresponding point on the virtual screen based on the camera position, the camera orientation, the target screen position, the target screen orientation, and the respective UV coordinate; tracing one or more rays from the virtual camera through the corresponding point on the virtual screen toward a virtual scene; and estimating a respective color value for the respective pixel based on incoming light from virtual objects in the virtual scene that intersect the one or more rays.
    Type: Application
    Filed: June 30, 2020
    Publication date: December 30, 2021
    Applicant: Lucasfilm Entertainment Company Ltd.
    Inventors: Nicholas Walker, David Weitzberg, André Mazzone
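    A hedged per-pixel sketch of the method summarized above: each UV coordinate on the target screen is mapped to a world-space point, and a ray from the virtual camera through that point (equivalently, through the corresponding point on the virtual screen) is traced into the scene to estimate the pixel color. The scene.trace interface is hypothetical.
    ```python
    import numpy as np

    def screen_point_world(screen_origin, screen_u_axis, screen_v_axis, uv):
        """World-space position of a UV coordinate on the target screen."""
        return screen_origin + uv[0] * screen_u_axis + uv[1] * screen_v_axis

    def render_pixel(camera_position, screen_origin, screen_u_axis, screen_v_axis, uv, scene):
        """Trace one ray from the virtual camera through the screen point for this pixel."""
        point = screen_point_world(screen_origin, screen_u_axis, screen_v_axis, uv)
        direction = point - camera_position
        direction = direction / np.linalg.norm(direction)
        return scene.trace(origin=camera_position, direction=direction)   # estimated color
    ```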
  • Patent number: 11200752
    Abstract: In at least one embodiment, an immersive content generation system may receive a first user input that defines a three-dimensional (3D) volume within a performance area. In at least one embodiment, the system may capture a plurality of images of an object in the performance area using a camera, wherein the object is at least partially surrounded by one or more displays presenting images of a virtual environment. In at least one embodiment, the system may receive a second user input to adjust a color value of a virtual image of the object as displayed in the images in the virtual environment. In at least one embodiment, the system may perform a color correction pass for the displayed images of the virtual environment. In at least one embodiment, the system may generate content based on the plurality of captured images that are corrected via the color correction pass.
    Type: Grant
    Filed: August 21, 2020
    Date of Patent: December 14, 2021
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Roger Cordes, Lutz Latta
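    Illustrative only: a simple stand-in for the color correction pass described above, applying a user-chosen gain to samples of the displayed virtual environment whose world positions fall inside a user-defined 3-D box. The array shapes and the per-channel gain representation are assumptions.
    ```python
    import numpy as np

    def color_correct(points_world: np.ndarray, colors: np.ndarray,
                      box_min: np.ndarray, box_max: np.ndarray,
                      gain: np.ndarray) -> np.ndarray:
        """Scale the RGB of every sample whose world-space position lies inside the box."""
        inside = np.all((points_world >= box_min) & (points_world <= box_max), axis=1)
        corrected = colors.copy()
        corrected[inside] = np.clip(corrected[inside] * gain, 0.0, 1.0)
        return corrected
    ```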
  • Patent number: 11170571
    Abstract: Some implementations of the disclosure are directed to techniques for facial reconstruction from a sparse set of facial markers. In one implementation, a method comprises: obtaining data comprising a captured facial performance of a subject with a plurality of facial markers; determining a three-dimensional (3D) bundle corresponding to each of the plurality of facial markers of the captured facial performance; using at least the determined 3D bundles to retrieve, from a facial dataset comprising a plurality of facial shapes of the subject, a local geometric shape corresponding to each of the plurality of facial markers; and merging the retrieved local geometric shapes to create a facial reconstruction of the subject for the captured facial performance.
    Type: Grant
    Filed: November 15, 2019
    Date of Patent: November 9, 2021
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD. LLC
    Inventors: Matthew Cong, Ronald Fedkiw, Lana Lan
  • Publication number: 20210342971
    Abstract: A method of content production includes generating a survey of a performance area that includes a point cloud representing a first physical object, in a survey graph hierarchy, constraining the point cloud and a taking camera coordinate system as child nodes of an origin of a survey coordinate system, obtaining virtual content including a first virtual object that corresponds to the first physical object, applying a transformation to the origin of the survey coordinate system so that at least a portion of the point cloud that represents the first physical object is substantially aligned with a portion of the virtual content that represents the first virtual object, displaying the first virtual object on one or more displays from a perspective of the taking camera, capturing, using the taking camera, one or more images of the performance area, and generating content based on the one or more images.
    Type: Application
    Filed: April 13, 2021
    Publication date: November 4, 2021
    Applicant: Lucasfilm Entertainment Company Ltd.
    Inventors: Douglas G. Watkins, Paige M. Warner, Dacklin R. Young
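    A sketch under assumed simplifications: estimate the rigid transform (Kabsch algorithm) that best aligns surveyed points on the physical object with matching points on the virtual object, then apply it at the survey coordinate system's origin so its child nodes (the point cloud and the taking-camera coordinate system) move together. Point correspondences are assumed to be given; the publication does not specify this solver.
    ```python
    import numpy as np

    def rigid_align(survey_pts: np.ndarray, virtual_pts: np.ndarray):
        """Rotation R and translation t such that R @ p + t maps survey points onto virtual points."""
        sc, vc = survey_pts.mean(axis=0), virtual_pts.mean(axis=0)
        H = (survey_pts - sc).T @ (virtual_pts - vc)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflection
        R = Vt.T @ D @ U.T
        t = vc - R @ sc
        return R, t
    ```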
  • Patent number: 11145125
    Abstract: An immersive content presentation system can capture the motion or position of a performer in a real-world environment. A game engine can be modified to receive the position or motion of the performer and identify predetermined gestures or positions that can be used to trigger actions in a 3-D virtual environment, such as generating a digital effect, transitioning virtual assets through an animation graph, adding new objects, and so forth. Views of the 3-D environment can be rendered, and composited views can be generated. Information for constructing the composited views can be streamed to numerous display devices in many different physical locations using a customized communication protocol. Multiple real-world performers can interact with virtual objects through the game engine in a shared mixed-reality experience.
    Type: Grant
    Filed: September 13, 2018
    Date of Patent: October 12, 2021
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Roger Cordes, David Brickhill
  • Patent number: 11132838
    Abstract: An immersive content presentation system and associated techniques can detect and correct lighting artifacts caused by movements of one or more taking cameras in a performance area consisting of multiple displays (e.g., LED or LCD displays). The techniques include capturing, with a camera, a plurality of images of a performer performing in a performance area at least partially surrounded by one or more displays presenting images of a virtual environment. The images of the virtual environment within a frustum of the camera are updated on the one or more displays based on movement of the camera, while images of the virtual environment outside of the frustum of the camera are not updated based on movement of the camera. The techniques further include generating content based on the plurality of captured images.
    Type: Grant
    Filed: November 6, 2019
    Date of Patent: September 28, 2021
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD. LLC
    Inventors: Roger Cordes, Richard Bluff, Lutz Latta
  • Patent number: 11132837
    Abstract: An immersive content presentation system and associated techniques can detect and correct lighting artifacts caused by movements of one or more taking cameras in a performance area consisting of multiple displays (e.g., LED or LCD displays). The techniques include capturing, with a camera, a plurality of images of a performer performing in a performance area at least partially surrounded by one or more displays presenting images of a virtual environment. The images of the virtual environment within a frustum of the camera are updated on the one or more displays based on movement of the camera, while images of the virtual environment outside of the frustum of the camera are not updated based on movement of the camera. The techniques further include generating content based on the plurality of captured images.
    Type: Grant
    Filed: November 6, 2019
    Date of Patent: September 28, 2021
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD. LLC
    Inventors: Roger Cordes, Nicholas Rasmussen, Kevin Wooley, Rachel Rose
  • Patent number: 11113885
    Abstract: An immersive content presentation system can capture the motion or position of a performer in a real-world environment. A game engine can be modified to receive the position or motion of the performer and identify predetermined gestures or positions that can be used to trigger actions in a 3-D virtual environment, such as generating a digital effect, transitioning virtual assets through an animation graph, adding new objects, and so forth. Views of the 3-D environment can be rendered, and composited views can be generated. Information for constructing the composited views can be streamed to numerous display devices in many different physical locations using a customized communication protocol. Multiple real-world performers can interact with virtual objects through the game engine in a shared mixed-reality experience.
    Type: Grant
    Filed: September 13, 2018
    Date of Patent: September 7, 2021
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Roger Cordes, David Brickhill
  • Patent number: 11107195
    Abstract: An immersive content production system may capture a plurality of images of a physical object in a performance area using a taking camera. The system may determine an orientation and a velocity of the taking camera with respect to the physical object in the performance area. A user may select a first amount of motion blur exhibited by the images of the physical object based on a desired motion effect. The system may determine a correction to apply to a virtual object based at least in part on the orientation and the velocity of the taking camera and the desired motion blur effect. The system may also detect the distances from the taking camera to a physical object and from the taking camera to the virtual display. The system may use these distances to generate a corrected circle of confusion for the virtual images on the display.
    Type: Grant
    Filed: August 21, 2020
    Date of Patent: August 31, 2021
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Roger Cordes, Lutz Latta
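    A sketch using the standard thin-lens circle-of-confusion formula; treating the correction as the intended defocus minus the defocus the lens already produces at the wall distance is an assumption about how such a correction might be computed, not necessarily the patented method.
    ```python
    def circle_of_confusion(aperture_diameter: float, focal_length: float,
                            focus_distance: float, subject_distance: float) -> float:
        """Thin-lens blur-circle diameter for a subject at subject_distance (consistent units)."""
        return (aperture_diameter * focal_length * abs(subject_distance - focus_distance)
                / (subject_distance * (focus_distance - focal_length)))

    def extra_blur_for_wall(aperture_diameter, focal_length, focus_distance,
                            virtual_distance, wall_distance):
        """Blur to render into the wall image so a virtual object displayed at wall_distance
        appears as defocused as it would at its intended virtual_distance."""
        wanted = circle_of_confusion(aperture_diameter, focal_length, focus_distance, virtual_distance)
        already = circle_of_confusion(aperture_diameter, focal_length, focus_distance, wall_distance)
        return max(wanted - already, 0.0)
    ```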
  • Patent number: 11099654
    Abstract: A system and method for controlling a view of a virtual reality (VR) environment via a computing device with a touch sensitive surface are disclosed. In some examples, a user may be enabled to augment the view of the VR environment by providing finger gestures to the touch sensitive surface. In one example, the user is enabled to call up a menu in the view of the VR environment. In one example, the user is enabled to switch the view of the VR environment displayed on a device associated with another user to a new location within the VR environment. In some examples, the user may be enabled to use the computing device to control a virtual camera within the VR environment and have various information regarding one or more aspects of the virtual camera displayed in the view of the VR environment presented to the user.
    Type: Grant
    Filed: April 17, 2020
    Date of Patent: August 24, 2021
    Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
    Inventors: Darby Johnston, Ian Wakelin