Patents Assigned to Lucasfilm
-
Patent number: 10600245
Abstract: Systems and techniques are provided for switching between different modes of a media content item. A media content item may include a movie that has different modes, such as a cinematic mode and an interactive mode. For example, a movie may be presented in a cinematic mode that does not allow certain user interactions with the movie. The movie may be switched to an interactive mode at any point during the movie, allowing a viewer to interact with various aspects of the movie. The movie may be displayed using different formats and resolutions depending on the mode in which it is being presented.
Type: Grant
Filed: May 28, 2015
Date of Patent: March 24, 2020
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Lutz Markus Latta, Ian Wakelin, Darby Johnston, Andrew Grant, John Gaeta
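The abstract describes selecting a presentation format and resolution based on the active mode. A minimal sketch of that idea follows; the mode names, resolutions, and `RenderSettings` fields are illustrative assumptions, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class RenderSettings:
    width: int
    height: int
    interactive: bool  # whether viewer input is routed to the scene

# Hypothetical per-mode settings; the patent does not specify concrete values.
MODE_SETTINGS = {
    "cinematic": RenderSettings(width=3840, height=2160, interactive=False),
    "interactive": RenderSettings(width=1920, height=1080, interactive=True),
}

def switch_mode(current_time_sec: float, new_mode: str) -> RenderSettings:
    """Switch presentation mode at any point in the movie and return the
    display settings to use from that timestamp onward."""
    if new_mode not in MODE_SETTINGS:
        raise ValueError(f"unknown mode: {new_mode}")
    settings = MODE_SETTINGS[new_mode]
    print(f"t={current_time_sec:.1f}s: {new_mode} "
          f"({settings.width}x{settings.height}, interactive={settings.interactive})")
    return settings

# Example: the viewer enters interactive mode 95 seconds into the movie.
switch_mode(95.0, "interactive")
```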
-
Patent number: 10594786
Abstract: Views of a virtual environment can be displayed on mobile devices in a real-world environment simultaneously for multiple users. The users can operate selection devices in the real-world environment that interact with objects in the virtual environment. Virtual characters and objects can be moved and manipulated using selection shapes. A graphical interface can be instantiated and rendered as part of the virtual environment. Virtual cameras and screens can also be instantiated to create storyboards, backdrops, and animated sequences of the virtual environment. These immersive experiences with the virtual environment can be used to generate content for users and for feature films.
Type: Grant
Filed: October 11, 2017
Date of Patent: March 17, 2020
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Jose Perez, III, Peter Dollar, Barak Moshe
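As a rough illustration of the "selection shapes" idea, the sketch below tests whether a ray from a user's selection device intersects a spherical selection shape around a virtual character. The spherical shape and function names are assumptions for illustration; the patent does not commit to this geometry.

```python
import numpy as np

def ray_hits_selection_sphere(ray_origin, ray_dir, center, radius):
    """Return True if a ray from a selection device intersects a spherical
    selection shape placed around a virtual object."""
    ray_origin = np.asarray(ray_origin, dtype=float)
    ray_dir = np.asarray(ray_dir, dtype=float)
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    to_center = np.asarray(center, dtype=float) - ray_origin
    # Closest approach of the (forward-only) ray to the sphere center.
    t = max(np.dot(to_center, ray_dir), 0.0)
    closest = ray_origin + t * ray_dir
    return np.linalg.norm(np.asarray(center, dtype=float) - closest) <= radius

# Example: a selection device at the origin pointing down +Z selects a
# character whose selection sphere sits 3 m in front of the user.
print(ray_hits_selection_sphere([0, 0, 0], [0, 0, 1],
                                center=[0.1, 0.0, 3.0], radius=0.5))
```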
-
Patent number: 10553036
Abstract: Views of a virtual environment can be displayed on mobile devices in a real-world environment simultaneously for multiple users. The users can operate selection devices in the real-world environment that interact with objects in the virtual environment. Virtual characters and objects can be moved and manipulated using selection shapes. A graphical interface can be instantiated and rendered as part of the virtual environment. Virtual cameras and screens can also be instantiated to create storyboards, backdrops, and animated sequences of the virtual environment. These immersive experiences with the virtual environment can be used to generate content for users and for feature films.
Type: Grant
Filed: October 11, 2017
Date of Patent: February 4, 2020
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Jose Perez, III, Peter Dollar, Barak Moshe
-
Patent number: 10489958
Abstract: A system includes a computing device that includes a memory configured to store instructions. The system also includes a processor configured to execute the instructions to perform a method that includes receiving multiple representations of an object. Each of the representations includes position information of the object and corresponds to an instance in time. For at least one of the representations, the method includes defining a contour that represents a movable silhouette of a surface feature of the object. The method also includes producing a deformable model of the surface of the object from the defined contour and from the at least one representation of the object.
Type: Grant
Filed: August 1, 2017
Date of Patent: November 26, 2019
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Ronald Mallet, Yuting Ye, Michael Koperwas, Adrian R. Goldenthal, Kiran S. Bhat
-
Patent number: 10484824
Abstract: A method may include causing first content to be displayed on a display device. The method may also include determining a location of a mobile device relative to the display device. In some embodiments, the mobile device may be positioned such that the first content is visible to a viewer of the mobile device. The method may additionally include causing second content to be displayed on the mobile device such that the second content is layered over the first content.
Type: Grant
Filed: June 27, 2016
Date of Patent: November 19, 2019
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: John Gaeta, Michael Koperwas, Nicholas Rasmussen
-
Patent number: 10423234
Abstract: A system and method for facilitating user manipulation of a virtual reality (VR) environment are disclosed. The user may provide an input via a touch-sensitive surface of a computing device associated with the user to bind a virtual object in the VR environment to the computing device. The user may then move and/or rotate the computing device to cause the bound virtual object to move and/or rotate in the VR environment accordingly. In some examples, the bound virtual object may cast a ray into the VR environment. The movement and/or rotation of the virtual object controlled by the computing device in those examples can change the direction of the ray. In some examples, the virtual object may include a virtual camera. In those examples, the user may move and/or rotate the virtual camera in the VR environment by moving and/or rotating the computing device.
Type: Grant
Filed: September 30, 2015
Date of Patent: September 24, 2019
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Darby Johnston, Ian Wakelin
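One way to picture the "bind, then move/rotate" interaction is the sketch below: the device's pose delta since binding is applied to the bound object, and a pointing ray is re-derived from the object's new orientation. The matrix handling, forward axis, and function names are assumptions, not the patent's method.

```python
import numpy as np

def rotation_z(angle_rad):
    """Rotation about the Z axis as a 3x3 matrix."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def apply_device_delta(obj_pos, obj_rot, device_delta_pos, device_delta_rot):
    """Move/rotate a bound virtual object by the device's pose change."""
    return obj_pos + device_delta_pos, device_delta_rot @ obj_rot

def cast_ray(obj_pos, obj_rot):
    """Cast a ray from the bound object along its (assumed) forward axis +Y."""
    return obj_pos, obj_rot @ np.array([0.0, 1.0, 0.0])

# Example: the user turns the device 30 degrees and moves it 0.2 m forward;
# the bound object (e.g. a virtual camera) follows, and its ray turns with it.
pos, rot = apply_device_delta(np.zeros(3), np.eye(3),
                              np.array([0.0, 0.2, 0.0]), rotation_z(np.radians(30)))
origin, direction = cast_ray(pos, rot)
print(origin, direction)
```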
-
Patent number: 10403019
Abstract: A multi-channel tracking pattern is provided along with techniques and systems for performing motion capture using the multi-channel tracking pattern. The multi-channel tracking pattern includes a plurality of shapes having different colors on different portions of the pattern. The portions with the unique shapes and colors allow a motion capture system to track motion of an object bearing the pattern across a plurality of video frames.
Type: Grant
Filed: February 11, 2016
Date of Patent: September 3, 2019
Assignee: LUCASFILM ENTERTAINMENT COMPANY
Inventor: John Levin
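A very small sketch of tracking a distinctly colored pattern region across frames: threshold each frame on a target color and record the centroid of matching pixels. The color tolerance and frame layout are illustrative assumptions, not the patented pattern design.

```python
import numpy as np

def track_color_marker(frame_rgb, target_rgb, tolerance=30):
    """Return the (row, col) centroid of pixels close to a target marker color,
    or None if the colored region is not visible in this frame."""
    diff = np.abs(frame_rgb.astype(int) - np.array(target_rgb, dtype=int))
    mask = np.all(diff <= tolerance, axis=-1)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Example: a synthetic 100x100 frame containing a red pattern region.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:50, 60:70] = (255, 0, 0)
print(track_color_marker(frame, target_rgb=(255, 0, 0)))  # ~ (44.5, 64.5)
```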
-
Patent number: 10373342
Abstract: Views of a virtual environment can be displayed on mobile devices in a real-world environment simultaneously for multiple users. The users can operate selection devices in the real-world environment that interact with objects in the virtual environment. Virtual characters and objects can be moved and manipulated using selection shapes. A graphical interface can be instantiated and rendered as part of the virtual environment. Virtual cameras and screens can also be instantiated to create storyboards, backdrops, and animated sequences of the virtual environment. These immersive experiences with the virtual environment can be used to generate content for users and for feature films.
Type: Grant
Filed: October 11, 2017
Date of Patent: August 6, 2019
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Jose Perez, III, Peter Dollar, Barak Moshe
-
Patent number: 10321117
Abstract: A method of generating unrecorded camera views may include receiving a plurality of 2-D video sequences of a subject in a real 3-D space, where each 2-D video sequence may depict the subject from a different perspective. The method may also include generating a 3-D representation of the subject in a virtual 3-D space, where a geometry and texture of the 3-D representation may be generated based on the 2-D video sequences, and the motion of the 3-D representation in the virtual 3-D space is based on motion of the subject in the real 3-D space. The method may additionally include generating a 2-D video sequence of the motion of the 3-D representation using a virtual camera in the virtual 3-D space, where the perspective of the virtual camera may be different than the perspectives of the plurality of 2-D video sequences.
Type: Grant
Filed: August 25, 2014
Date of Patent: June 11, 2019
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Hilmar Koch, Ronald Mallet, Kim Libreri, Paige Warner, Mike Sanders, John Gaeta
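The core of rendering an unrecorded view is placing a virtual camera at a new perspective and projecting the reconstructed 3-D geometry into it. Below is a minimal pinhole-projection sketch; the intrinsics and camera pose are made-up illustrative values, not parameters from the patent.

```python
import numpy as np

def project_points(points_3d, camera_rot, camera_pos, focal, cx, cy):
    """Project 3-D world-space points (N x 3) through a pinhole virtual camera."""
    # World -> camera coordinates.
    cam = (np.asarray(points_3d, dtype=float) - camera_pos) @ camera_rot.T
    # Perspective divide onto the image plane.
    u = focal * cam[:, 0] / cam[:, 2] + cx
    v = focal * cam[:, 1] / cam[:, 2] + cy
    return np.stack([u, v], axis=1)

# Example: view two reconstructed points from a virtual camera placed 5 m
# back along -Z, a perspective none of the real cameras recorded.
pts = np.array([[0.0, 0.0, 0.0], [0.5, 0.2, 0.0]])
uv = project_points(pts, camera_rot=np.eye(3),
                    camera_pos=np.array([0.0, 0.0, -5.0]),
                    focal=1000.0, cx=960.0, cy=540.0)
print(uv)
```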
-
Publication number: 20190122374
Abstract: Embodiments of the disclosure provide systems and methods for motion capture to generate content (e.g., motion pictures, television programming, videos, etc.). An actor or other performing being can have multiple markers on his or her face that are essentially invisible to the human eye, but that can be clearly captured by camera systems of the present disclosure. Embodiments can capture the performance using two different camera systems, each of which can observe the same performance but capture different images of that performance. For instance, a first camera system can capture the performance within a first light wavelength spectrum (e.g., visible light spectrum), and a second camera system can simultaneously capture the performance in a second light wavelength spectrum different from the first spectrum (e.g., invisible light spectrum such as the IR light spectrum). The images captured by the first and second camera systems can be combined to generate content.
Type: Application
Filed: August 13, 2018
Publication date: April 25, 2019
Applicant: Lucasfilm Entertainment Company Ltd.
Inventors: Leandro Estebecorena, John Knoll, Stephane Grabli, Per Karefelt, Pablo Helman, John M. Levin
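A toy illustration of combining the two simultaneous captures: bright spots in a hypothetical IR frame are located and their positions overlaid on the visible-spectrum frame. The threshold, data layout, and overlay color are assumptions for illustration only.

```python
import numpy as np

def find_ir_markers(ir_frame, threshold=200):
    """Return (row, col) coordinates of bright marker pixels in an IR frame."""
    rows, cols = np.nonzero(ir_frame > threshold)
    return list(zip(rows.tolist(), cols.tolist()))

def overlay_markers(visible_rgb, marker_pixels, color=(0, 255, 0)):
    """Burn detected IR marker positions into the visible-spectrum frame."""
    out = visible_rgb.copy()
    for r, c in marker_pixels:
        out[r, c] = color
    return out

# Example: a 4x4 IR frame with one bright marker pixel at (1, 2), combined
# with the visible frame captured at the same instant.
ir = np.zeros((4, 4), dtype=np.uint8)
ir[1, 2] = 255
visible = np.zeros((4, 4, 3), dtype=np.uint8)
combined = overlay_markers(visible, find_ir_markers(ir))
print(find_ir_markers(ir), combined[1, 2])
```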
-
Publication number: 20190124244
Abstract: Embodiments of the disclosure provide systems and methods for motion capture to generate content (e.g., motion pictures, television programming, videos, etc.). An actor or other performing being can have multiple markers on his or her face that are essentially invisible to the human eye, but that can be clearly captured by camera systems of the present disclosure. Embodiments can capture the performance using two different camera systems, each of which can observe the same performance but capture different images of that performance. For instance, a first camera system can capture the performance within a first light wavelength spectrum (e.g., visible light spectrum), and a second camera system can simultaneously capture the performance in a second light wavelength spectrum different from the first spectrum (e.g., invisible light spectrum such as the IR light spectrum). The images captured by the first and second camera systems can be combined to generate content.
Type: Application
Filed: August 13, 2018
Publication date: April 25, 2019
Applicant: Lucasfilm Entertainment Company Ltd.
Inventors: John Knoll, Leandro Estebecorena, Stephane Grabli, Per Karefelt, Pablo Helman, John M. Levin
-
Patent number: 10269165
Abstract: A system includes a computing device that includes a memory and a processor configured to execute instructions to perform a method that includes receiving multiple representations of one or more expressions of an object. Each representation includes position information attained from one or more images of the object. The method also includes producing an animation model from one or more groups of controls that respectively define each of the one or more expressions of the object as provided by the multiple representations. Each control of each group of controls has an adjustable value that defines the geometry of at least one shape of a portion of the respective expression of the object. Producing the animation model includes producing one or more corrective shapes if the animation model is incapable of accurately presenting the one or more expressions of the object as provided by the multiple representations.
Type: Grant
Filed: January 30, 2012
Date of Patent: April 23, 2019
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Kiran S. Bhat, Michael Koperwas, Rachel M. Rose, Jung-Seung Hong, Frederic P. Pighin, Christopher David Twigg, Cary Phillips, Steve Sullivan
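The "corrective shape" idea can be pictured as keeping whatever part of a captured expression the control-driven model cannot reproduce. The linear blendshape formulation below is an illustrative assumption, not necessarily the patented construction.

```python
import numpy as np

def solve_controls(neutral, shapes, target):
    """Least-squares control values for a linear shape model
    (vertex positions flattened to 1-D arrays)."""
    basis = np.stack([s - neutral for s in shapes], axis=1)
    weights, *_ = np.linalg.lstsq(basis, target - neutral, rcond=None)
    return weights, neutral + basis @ weights

def corrective_shape(target, reconstruction, tolerance=1e-3):
    """If the controls cannot hit the captured expression, keep the residual
    as a corrective shape; otherwise no corrective is needed."""
    residual = target - reconstruction
    return residual if np.linalg.norm(residual) > tolerance else None

# Toy example: a 2-vertex (6-value) face with one control shape; the captured
# expression is slightly off-model, so a corrective shape is produced.
neutral = np.zeros(6)
shapes = [np.array([1.0, 0, 0, 0, 0, 0])]
target = np.array([0.8, 0.1, 0, 0, 0, 0])
w, recon = solve_controls(neutral, shapes, target)
print(w, corrective_shape(target, recon))
```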
-
Patent number: 10269169
Abstract: In one general aspect, a method is described. The method includes generating a positional relationship between one or more support structures having at least one motion capture mark and at least one virtual structure corresponding to geometry of an object to be tracked, and positioning the support structures on the object to be tracked. The support structures have sufficient rigidity that, if there are multiple marks, the marks on each support structure maintain substantially fixed distances from each other in response to movement by the object. The method also includes determining an effective quantity of ray traces between one or more camera views and one or more marks on the support structures, and estimating an orientation of the virtual structure by aligning the determined effective quantity of ray traces with a known configuration of marks on the support structures.
Type: Grant
Filed: August 9, 2016
Date of Patent: April 23, 2019
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Steve Sullivan, Colin Davidson, Michael Sanders, Kevin Wooley
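Estimating the virtual structure's orientation from observed marks can be illustrated with a standard rigid-alignment step (a Kabsch-style SVD fit of observed marker positions to the known configuration). This is a generic technique shown for intuition; the patent aligns ray traces and its exact procedure may differ.

```python
import numpy as np

def estimate_rotation(known_marks, observed_marks):
    """Best-fit rotation aligning a known, rigid marker configuration to
    observed marker positions (both N x 3), via the Kabsch/SVD method."""
    a = known_marks - known_marks.mean(axis=0)
    b = observed_marks - observed_marks.mean(axis=0)
    u, _, vt = np.linalg.svd(a.T @ b)
    d = np.sign(np.linalg.det(vt.T @ u.T))   # guard against reflections
    return vt.T @ np.diag([1.0, 1.0, d]) @ u.T

# Example: marks on a rigid support structure rotated 90 degrees about Z.
known = np.array([[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0], [1.0, 1.0, 0]])
rot_true = np.array([[0.0, -1.0, 0], [1.0, 0.0, 0], [0, 0, 1.0]])
observed = known @ rot_true.T
print(np.round(estimate_rotation(known, observed), 3))
```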
-
Patent number: 10147219
Abstract: Performance capture systems and techniques are provided for capturing a performance of a subject and reproducing an animated performance that tracks the subject's performance. For example, systems and techniques are provided for determining control values for controlling an animation model to define features of a computer-generated representation of a subject based on the performance. A method may include obtaining input data corresponding to a pose performed by the subject, the input data including position information defining positions on a face of the subject. The method may further include obtaining an animation model for the subject that includes adjustable controls that control the animation model to define facial features of the computer-generated representation of the face, and matching one or more of the positions on the face with one or more corresponding positions on the animation model.
Type: Grant
Filed: February 3, 2017
Date of Patent: December 4, 2018
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Kiran Bhat, Michael Koperwas, Jeffery Yost, Ji Hun Yu, Sheila Santos
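A minimal sketch of "determining control values" so the model's points match captured facial positions, here as a simple gradient-descent fit of a linear control model. The model structure, step size, and 0–1 control range are assumptions, not the patented solver.

```python
import numpy as np

def fit_controls(captured, neutral, shape_deltas, steps=200, lr=0.5):
    """Adjust control values so the animation model's points match captured
    facial positions (gradient descent on squared point distance)."""
    weights = np.zeros(len(shape_deltas))
    basis = np.stack(shape_deltas, axis=1)        # (n_points*3, n_controls)
    for _ in range(steps):
        residual = neutral + basis @ weights - captured
        weights -= lr * basis.T @ residual        # gradient of 0.5*||residual||^2
        weights = np.clip(weights, 0.0, 1.0)      # keep controls in a valid range
    return weights

# Toy example: one "smile" control moving two face points; the captured pose
# corresponds to about 70% of the control's full range.
neutral = np.zeros(6)
smile = np.array([0.0, 1.0, 0, 0.0, -1.0, 0])
captured = 0.7 * smile
print(np.round(fit_controls(captured, neutral, [smile]), 3))  # ~ [0.7]
```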
-
Patent number: 10142561
Abstract: A handheld device includes: an input control configured to control and modify a virtual scene including a virtual camera; and a display that shows a representation of the controlled and modified virtual scene generated by the virtual camera. A system includes: a computer system configured to execute program instructions for generating a virtual scene including a virtual camera; and a handheld device configured to communicate with the computer system for controlling and modifying the virtual scene, the handheld device comprising: an input control configured to control and modify the virtual scene; and a display that shows a representation of the controlled and modified virtual scene generated by the virtual camera.
Type: Grant
Filed: March 20, 2017
Date of Patent: November 27, 2018
Assignee: Lucasfilm Entertainment Company Ltd.
Inventors: Spencer Reynolds, Michael Sanders, Kevin Wooley, Steve Sullivan, Adam Schnitzer
-
Patent number: 10109062
Abstract: Systems and methods are provided for a non-coherent point tracking process that allows unknown camera motion to be estimated. One or more edges can be identified in images captured by a camera when shooting a scene. For each identified edge in the images, at least one tracking object can be placed arbitrarily on the edge. The positions of tracking objects in the images can then be used to estimate a camera motion. In some embodiments, two tracking objects can be placed arbitrarily on the edge to represent the edge and move along the edge arbitrarily from image to image where the edge appears. Multiple such edges can be identified in the images, and camera motions in multiple directions can be estimated based on the identified edges and combined to obtain a combined camera motion indicating the camera's movement in a 3D space when shooting the scene.
Type: Grant
Filed: September 30, 2016
Date of Patent: October 23, 2018
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventor: Douglas Moore
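Because a tracking object may slide freely along its edge, each edge only constrains motion across itself. The sketch below is a deliberately simplified 2-D analogue of that idea: several differently oriented edges each report displacement along their normal, and a least-squares solve combines them into one image-plane motion. The patent estimates full camera motion in 3-D; this is only an intuition aid.

```python
import numpy as np

def estimate_image_motion(edge_normals, normal_displacements):
    """Estimate 2-D image-plane motion from edge-tracking observations.
    Each edge constrains motion only along its unit normal; multiple
    differently oriented edges are combined in a least-squares solve."""
    n = np.asarray(edge_normals, dtype=float)          # (num_edges, 2)
    d = np.asarray(normal_displacements, dtype=float)  # (num_edges,)
    motion, *_ = np.linalg.lstsq(n, d, rcond=None)
    return motion

# Example: a horizontal edge (normal pointing up) and a vertical edge
# (normal pointing right) observe displacements of 2 px and 3 px.
print(estimate_image_motion([[0.0, 1.0], [1.0, 0.0]], [2.0, 3.0]))  # ~ [3. 2.]
```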
-
Patent number: 10078917
Abstract: A method may include rendering a first view of a three-dimensional (3-D) virtual scene comprising a view of first content being displayed on a virtual display device from a location in the 3-D virtual scene. The method may also include rendering a second view comprising one or more content objects. The second view may be rendered from the location in the 3-D virtual scene, and the second view may include a view of the display device as would be seen through a pair of augmented-reality glasses that display the one or more content objects. The method may additionally include generating a composite view by combining the first view and the second view. The method may further include causing the composite view to be displayed on a virtual-reality headset.
Type: Grant
Filed: October 10, 2016
Date of Patent: September 18, 2018
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: John Gaeta, Michael Koperwas, Nicholas Rasmussen
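The "generate a composite view by combining the first view and the second view" step can be illustrated with a standard alpha "over" composite of the rendered AR layer onto the first view. The frame sizes and pixel values below are illustrative only.

```python
import numpy as np

def composite_over(foreground_rgba, background_rgb):
    """Standard 'over' composite of a rendered content-object layer (RGBA)
    onto the first view (RGB)."""
    fg = foreground_rgba[..., :3].astype(float)
    alpha = foreground_rgba[..., 3:4].astype(float) / 255.0
    bg = background_rgb.astype(float)
    return (fg * alpha + bg * (1.0 - alpha)).astype(np.uint8)

# Example: a half-transparent red content object over a grey first view.
fg = np.zeros((2, 2, 4), dtype=np.uint8)
fg[..., 0] = 255          # red
fg[..., 3] = 128          # ~50% alpha
bg = np.full((2, 2, 3), 100, dtype=np.uint8)
print(composite_over(fg, bg)[0, 0])  # roughly [177, 49, 49]
```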
-
Patent number: 10055874
Abstract: In some embodiments a method of transferring facial expressions from a subject to a computer-generated character is provided where the method includes receiving positional information from a motion capture session of the subject representing a performance having facial expressions to be transferred to the computer-generated character, receiving a first animation model that represents the subject, and receiving a second animation model that represents the computer-generated character. Each of the first and second animation models can include a plurality of adjustable controls that define geometries of the model and that can be adjusted to present different facial expressions on the model, and where the first and second animation models are designed so that setting the same values for the same set of adjustable controls in each model generates similar facial poses on the models.
Type: Grant
Filed: August 20, 2015
Date of Patent: August 21, 2018
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Ji Hun Yu, Michael Koperwas, Jeffrey Bruce Yost, Sheila Santos, Kiran S. Bhat
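Because the two animation models are designed so the same control values produce similar poses, the transfer itself reduces to copying values for the controls both rigs share. A minimal sketch follows; the control names are invented for illustration.

```python
def transfer_expression(source_controls, target_rig_controls):
    """Copy control values solved on the subject's animation model onto the
    computer-generated character's model, for controls both rigs share."""
    transferred = dict(target_rig_controls)
    for name, value in source_controls.items():
        if name in transferred:
            transferred[name] = value
    return transferred

# Hypothetical control names; a real rig would expose its own set.
subject_pose = {"jaw_open": 0.35, "brow_raise_l": 0.8, "smile": 0.1}
creature_rig = {"jaw_open": 0.0, "brow_raise_l": 0.0, "smile": 0.0, "ear_twitch": 0.0}
print(transfer_expression(subject_pose, creature_rig))
```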
-
Patent number: 9858700
Abstract: Systems and techniques are provided for transferring changes in animation data between geometric models of a character having different resolutions. For example, systems and techniques are provided for transferring changes in geometric properties between the geometric models. A method may include obtaining a first geometric model of the character and a second geometric model of the character, the geometric models having different resolutions with different numbers of data points. The method may further include determining one or more correspondences between data points of the first geometric model and data points of the second geometric model. The correspondences include one or more data points of the first geometric model that overlap with one or more data points of the second geometric model.
Type: Grant
Filed: May 13, 2015
Date of Patent: January 2, 2018
Assignee: LUCASFILM ENTERTAINMENT COMPANY LTD.
Inventors: Rachel Rose, Yuting Ye, Scott Jones
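A rough sketch of the correspondence-then-transfer idea: build correspondences between the two resolutions (here, simple nearest-vertex matching, which is a stand-in for the patent's overlapping-point correspondences) and push the coarse model's animation change through them onto the fine model.

```python
import numpy as np

def nearest_correspondences(coarse_points, fine_points):
    """For each fine-resolution vertex, the index of the nearest coarse vertex."""
    diffs = fine_points[:, None, :] - coarse_points[None, :, :]
    return np.argmin(np.linalg.norm(diffs, axis=-1), axis=1)

def transfer_deltas(coarse_rest, coarse_animated, fine_rest):
    """Apply the coarse model's animation change to the fine model through
    the per-vertex correspondences."""
    idx = nearest_correspondences(coarse_rest, fine_rest)
    return fine_rest + (coarse_animated - coarse_rest)[idx]

# Toy example: a 2-vertex coarse model and a 3-vertex fine model.
coarse_rest = np.array([[0.0, 0, 0], [1.0, 0, 0]])
coarse_anim = coarse_rest + np.array([[0.0, 0.2, 0], [0.0, 0.0, 0]])
fine_rest = np.array([[0.0, 0, 0], [0.1, 0, 0], [0.9, 0, 0]])
print(transfer_deltas(coarse_rest, coarse_anim, fine_rest))
```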
-
Patent number: 9854176
Abstract: Systems and techniques for dynamically capturing and reconstructing lighting are provided. The systems and techniques may be based on a stream of images capturing the lighting within an environment as a scene is shot. Reconstructed lighting data may be used to illuminate a character in a computer-generated environment as the scene is shot. For example, a method may include receiving a stream of images representing lighting of a physical environment. The method may further include compressing the stream of images to reduce an amount of data used in reconstructing the lighting of the physical environment and may further include outputting the compressed stream of images for reconstructing the lighting of the physical environment using the compressed stream, the reconstructed lighting being used to render a computer-generated environment.
Type: Grant
Filed: January 24, 2014
Date of Patent: December 26, 2017
Assignee: Lucasfilm Entertainment Company Ltd.
Inventors: Michael Sanders, Kiran Bhat, Curt Isamu Miyashiro, Jason Snell, Stephane Grabli
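One simple way to picture "compressing the stream of images to reduce the amount of data" before lighting reconstruction is block-averaging each captured environment frame; the patent does not specify a compression scheme, so the block size and layout below are assumptions for illustration.

```python
import numpy as np

def downsample_lighting_frame(frame, block=8):
    """Block-average an environment-lighting frame to cut the amount of data
    carried per frame before lighting reconstruction."""
    h, w, c = frame.shape
    h, w = h - h % block, w - w % block          # crop to a multiple of the block
    view = frame[:h, :w].reshape(h // block, block, w // block, block, c)
    return view.mean(axis=(1, 3))

# Example: a 64x64 capture reduced to an 8x8 grid of averaged light values.
frame = np.random.rand(64, 64, 3).astype(np.float32)
small = downsample_lighting_frame(frame)
print(frame.nbytes, "->", small.nbytes)
```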