Patents Assigned to Weta Digital Limited
-
Patent number: 11328437
Abstract: Methods and systems for defocusing a rendered computer-generated image are presented. Pixel values for a pixel array are determined from a scene description. A blur amount and a blur transparency value are determined for each pixel based on a lens function, representing a lens shape and/or effect, and on the pixel's depth. A convolution range comprising pixels adjacent to the pixel is determined based on the blur amount. A blend color value is determined for the pixel based on the pixel's color value, the color values of pixels in the convolution range, and the blur transparency value. The blend color value is scaled, and a modified pixel color value is determined from the scaled blend color values.
Type: Grant
Filed: October 30, 2020
Date of Patent: May 10, 2022
Assignee: Weta Digital Limited
Inventor: Peter Hillman
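A minimal, illustrative sketch (not taken from the patent) of this style of depth-driven defocus: each pixel's blur radius comes from a stand-in lens function of depth, and the pixel is blended with its convolution-range neighbours. The `lens_blur_radius` helper, the focal-plane constant, and the transparency heuristic are all assumptions.

```python
import numpy as np

def lens_blur_radius(depth, focal_depth=5.0, max_radius=4):
    # Stand-in lens function: blur grows with distance from an assumed focal plane.
    return int(min(max_radius, abs(depth - focal_depth)))

def defocus(image, depth, focal_depth=5.0):
    """Blend each pixel with its convolution-range neighbours based on depth."""
    h, w, _ = image.shape
    out = np.empty_like(image)
    for y in range(h):
        for x in range(w):
            r = lens_blur_radius(depth[y, x], focal_depth)
            if r == 0:
                out[y, x] = image[y, x]              # in focus: keep the original color
                continue
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            blend = image[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
            alpha = min(1.0, r / 4.0)                # crude stand-in for a blur transparency value
            out[y, x] = (1.0 - alpha) * image[y, x] + alpha * blend
    return out

# Usage: a 16x16 RGB image with a front-to-back depth ramp.
img = np.random.rand(16, 16, 3)
depth = np.linspace(1.0, 10.0, 16 * 16).reshape(16, 16)
blurred = defocus(img, depth)
```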
-
Patent number: 11308644
Abstract: The present description relates to recalibration of a sensor device for performance capture by detecting a miscalibration problem with the sensor device and assessing the problem. A recalibration system includes sensor devices initially calibrated at a recording site. After a change at the recording site, a failure to match virtual rays projected from one sensor device with virtual rays projected from an active marker is detected. In response to determining the failure, the active marker is signaled to emit a unique display of light. The failure of the rays to match is assessed based on whether sensor devices capture the unique display of light. Three-dimensional (3-D) coordinates of the active marker are reconstructed from marker data of the calibrated sensor devices. The problematic sensor device is recalibrated based on the assessment, using the 3-D coordinates of the active marker reconstructed from marker data of the remaining calibrated sensor devices, without stopping the recording.
Type: Grant
Filed: July 16, 2021
Date of Patent: April 19, 2022
Assignee: WETA DIGITAL LIMITED
Inventors: Dejan Momcilovic, Jake Botting
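A minimal sketch (not from the patent) of the two numerical pieces such a system needs: reconstructing a marker's 3-D position from the still-calibrated sensors, and measuring how far a suspect sensor's observation is from where that position says it should be. The pinhole projection matrices and the example values are assumptions.

```python
import numpy as np

def triangulate(proj_mats, points_2d):
    """Least-squares (DLT) reconstruction of a 3-D marker from calibrated sensors."""
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.stack(rows))
    X = vt[-1]
    return X[:3] / X[3]

def reprojection_error(P, point_3d, point_2d):
    """Pixel distance between an observed marker and where this sensor says it should be."""
    proj = P @ np.append(point_3d, 1.0)
    return np.linalg.norm(proj[:2] / proj[2] - np.asarray(point_2d))

# Usage sketch: two calibrated sensors observe the active marker; a third is checked.
P0 = np.hstack([np.eye(3), np.zeros((3, 1))])
P1 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
marker = np.array([0.1, 0.2, 5.0, 1.0])
obs = [(P @ marker)[:2] / (P @ marker)[2] for P in (P0, P1)]
X = triangulate([P0, P1], obs)
err = reprojection_error(P1, X, obs[1])   # a large error would flag a miscalibrated sensor
```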
-
Patent number: 11302052
Abstract: An aspect provides a computer-implemented method for constructing evaluation logic associated with an animation software package. The method comprises receiving at least one software module, the at least one software module including at least one evaluator; writing the at least one software module to at least one executable code object; and maintaining data for the at least one software module in a contiguous block of memory for use by the software module.
Type: Grant
Filed: July 19, 2021
Date of Patent: April 12, 2022
Assignee: WETA DIGITAL LIMITED
Inventors: Niall J. Lenihan, Richard Chi Lei, Sander van der Steen
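A minimal sketch (not from the patent) of what keeping a module's evaluator data in one contiguous block can look like: a single allocation is carved into named views, so every evaluator reads and writes the same block of memory. The class and slot names are assumptions.

```python
import numpy as np

class EvaluatorBlock:
    """Holds all parameter values for a module's evaluators in one contiguous buffer."""

    def __init__(self, capacity):
        self._data = np.zeros(capacity, dtype=np.float64)  # one contiguous allocation
        self._slots = {}                                   # name -> slice into the block
        self._used = 0

    def register(self, name, size):
        self._slots[name] = slice(self._used, self._used + size)
        self._used += size

    def values(self, name):
        return self._data[self._slots[name]]               # a view into the block, no copy

# Usage: two evaluators of the same module share one block.
block = EvaluatorBlock(capacity=16)
block.register("translate", 3)
block.register("rotate", 3)
block.values("translate")[:] = (1.0, 2.0, 3.0)
block.values("rotate")[:] = (0.0, 90.0, 0.0)
```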
-
Publication number: 20220103750
Abstract: An imagery processing system determines alternative pixel color values for pixels of captured imagery where the alternative pixel color values are obtained from alternative sources. A main imagery capture device, such as a camera, captures main imagery, such as still images and/or video sequences, of a live action scene. Alternative devices capture imagery of the live action scene, in some spectra and form, and that alternative imagery is processed to provide user-selectable alternatives for pixel ranges from the main imagery.
Type: Application
Filed: December 9, 2021
Publication date: March 31, 2022
Applicant: Weta Digital Limited
Inventors: Kimball D. Thurston, III, Peter M. Hillman
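A minimal sketch (not from the patent) of offering an alternative source for a pixel range: a selected region of the main plate is replaced with the corresponding pixels from an alternative capture device, assuming the alternative imagery has already been registered to the main camera. Function and variable names are assumptions.

```python
import numpy as np

def replace_pixel_range(main_frame, alt_frame, region):
    """Return a copy of main_frame with the selected region taken from alt_frame."""
    y0, y1, x0, x1 = region
    out = main_frame.copy()
    out[y0:y1, x0:x1] = alt_frame[y0:y1, x0:x1]   # user-selected alternative for this range
    return out

# Usage: patch an overexposed area of the main plate from an alternative capture.
main = np.random.rand(8, 8, 3)
alt = np.random.rand(8, 8, 3)
fixed = replace_pixel_range(main, alt, region=(2, 5, 2, 5))
```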
-
Patent number: 11288496
Abstract: The present description relates to light patterns used in a live action scene of a visual production to encode information associated with objects in the scene, such as movement and position of the objects. A data capture system includes active markers that emit light of a particular wavelength in predefined strobing patterns. In some implementations, the active markers are instructed to emit an assigned signature pattern of light through a signal controller sending signals to a control unit. Various components are synchronized such that pulsing of light corresponds to time slices and particular frames captured by the performance capture system. The data representing the pattern is embedded in illuminated and blank frames. Frames showing the light pattern are analyzed to extract information about the active markers, such as identification of the active markers and objects to which they are attached.
Type: Grant
Filed: May 28, 2021
Date of Patent: March 29, 2022
Assignee: WETA DIGITAL LIMITED
Inventors: Dejan Momcilovic, Jake Botting
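A minimal sketch (not from the patent) of decoding such a signature pattern: each synchronized time slice contributes one bit depending on whether the marker is illuminated or blank in that frame, and the resulting bit string is matched against the assigned patterns. The threshold, pattern table, and names are assumptions.

```python
def decode_marker_id(frame_brightness, patterns, threshold=0.5):
    """Match a per-frame brightness sequence at a marker's image location
    against the assigned signature patterns."""
    bits = tuple(1 if b > threshold else 0 for b in frame_brightness)
    for marker_id, signature in patterns.items():
        if bits == signature:
            return marker_id
    return None   # no assigned pattern matched

# Usage: marker "wrist_L" strobes 1,0,1,1 over four synchronized time slices.
patterns = {"wrist_L": (1, 0, 1, 1), "wrist_R": (1, 1, 0, 1)}
brightness = [0.9, 0.1, 0.8, 0.7]              # measured per captured frame
print(decode_marker_id(brightness, patterns))  # -> "wrist_L"
```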
-
Publication number: 20220092301
Abstract: The present description relates to light patterns used in a live action scene of a visual production to encode information associated with objects in the scene, such as movement and position of the objects. A data capture system includes active markers that emit light of a particular wavelength in predefined strobing patterns. In some implementations, the active markers are instructed to emit an assigned signature pattern of light through a signal controller sending signals to a control unit. Various components are synchronized such that pulsing of light corresponds to time slices and particular frames captured by the performance capture system. The data representing the pattern is embedded in illuminated and blank frames. Frames showing the light pattern are analyzed to extract information about the active markers, such as identification of the active markers and objects to which they are attached.
Type: Application
Filed: December 2, 2021
Publication date: March 24, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
-
Patent number: 11282233
Abstract: Embodiments facilitate the calibration of cameras in a live action scene. In some embodiments, a system receives images of the live action scene from a plurality of cameras. The system further receives reference point data generated from a performance capture system, where the reference point data is based on at least three reference points, where the at least three reference points are positioned within the live action scene, and where distances between the at least three reference points are predetermined. The system further determines a location and orientation of each camera based on the reference point data.
Type: Grant
Filed: February 25, 2021
Date of Patent: March 22, 2022
Assignee: WETA DIGITAL LIMITED
Inventors: Dejan Momcilovic, Jake Botting
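A minimal sketch (not from the patent) of recovering a camera's location and orientation from known reference points, here using OpenCV's generic PnP solver; that solver wants at least four correspondences, so four known points are used, and the reference positions, intrinsics, and "true" pose are made-up values for the demonstration.

```python
import numpy as np
import cv2

# Reference-point positions reported by the performance capture system (metres).
reference_points_3d = np.array([[0.0, 0.0, 0.0],
                                [1.0, 0.0, 0.0],
                                [0.0, 1.0, 0.0],
                                [0.0, 0.0, 1.0]], dtype=np.float32)

# Assumed pinhole intrinsics for the camera being calibrated.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]], dtype=np.float32)

# Simulated observations: project the reference points with a known "true" pose.
rvec_true = np.array([0.1, -0.2, 0.05], dtype=np.float32)
tvec_true = np.array([0.3, -0.1, 4.0], dtype=np.float32)
observed_2d, _ = cv2.projectPoints(reference_points_3d, rvec_true, tvec_true, K, None)

# Recover the camera pose from the 3-D/2-D correspondences.
ok, rvec, tvec = cv2.solvePnP(reference_points_3d, observed_2d, K, None)
R, _ = cv2.Rodrigues(rvec)                 # camera orientation
camera_position = (-R.T @ tvec).ravel()    # camera location in the scene frame
```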
-
Patent number: 11281477
Abstract: An example method facilitates adjusting or enhancing performance of a process of a graphics program or animation software package, and includes providing a first User Interface (UI) control for allowing user assignment or selection of one or more processor types, e.g., Graphics Processing Unit (GPU) or Central Processing Unit (CPU), and/or associated memory types, e.g., GPU memory and/or CPU memory, to one or more computing resources, such as variables and/or associated functions or evaluators. A drop-down menu or other control may be provided in a first UI to allow for user specification or assignment of one or more computing resources, e.g., CPU or GPU processors and/or memory, to one or more variables, data structures, associated functions, or other executable code. In a specific implementation, one UI control facilitates user specification of one or more evaluators of a plugin, wherein the one or more evaluators are usable by a host application of the plugin.
Type: Grant
Filed: July 19, 2021
Date of Patent: March 22, 2022
Assignee: WETA DIGITAL LIMITED
Inventors: Niall J. Lenihan, Richard Chi Lei, Sander van der Steen
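A minimal sketch (not from the patent) of how a selection made in such a UI control might be recorded: each variable or evaluator is mapped to a processor type and a memory type. The enum, dataclass, and example names are assumptions.

```python
from dataclasses import dataclass
from enum import Enum

class Processor(Enum):
    CPU = "cpu"
    GPU = "gpu"

@dataclass
class ResourceAssignment:
    """One row of what a drop-down selection in the UI might produce."""
    name: str             # variable, function, or evaluator
    processor: Processor  # where it executes
    memory: Processor     # where its backing buffer lives

# Usage: the user pins a deformer evaluator to the GPU and a pose cache to CPU memory.
assignments = [
    ResourceAssignment("skin_deformer", Processor.GPU, Processor.GPU),
    ResourceAssignment("pose_cache", Processor.CPU, Processor.CPU),
]
gpu_evaluators = [a.name for a in assignments if a.processor is Processor.GPU]
```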
-
Publication number: 20220086308
Abstract: Implementations provide a wearable article for a performance capture system. In some implementations, a wearable article includes one or more regions, where the one or more regions are configured to be worn on at least a portion of a body of a user, and where at least one of the one or more regions is configured to hold performance capture equipment in predetermined positions. In some implementations, the wearable article also includes a plurality of mounting mechanisms coupled to the one or more regions for mounting reference markers to be used for position determination. In some implementations, the wearable article also includes a plurality of fastening mechanisms coupled to the one or more regions for fastening devices and accessories for controlling the reference markers.
Type: Application
Filed: November 24, 2021
Publication date: March 17, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
-
Publication number: 20220076451
Abstract: Embodiments facilitate the calibration of cameras in a live action scene. In some embodiments, a system receives images of the live action scene from a plurality of cameras. The system further receives reference point data generated from a performance capture system, where the reference point data is based on a plurality of reference points coupled to a plurality of extensions coupled to a base, where the plurality of reference points are in a non-linear arrangement, and where distances between reference points are predetermined. The system further computes reference point data generated from a performance capture system and based on the distances. The system further computes a location and orientation of each camera in the live action scene based on the reference point data.
Type: Application
Filed: February 25, 2021
Publication date: March 10, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
-
Publication number: 20220076450
Abstract: Embodiments facilitate the calibration of cameras in a live action scene. In some embodiments, a system receives images of the live action scene from a plurality of cameras. The system further receives reference point data generated from a performance capture system, where the reference point data is based on at least three reference points, where the at least three reference points are positioned within the live action scene, and where distances between the at least three reference points are predetermined. The system further determines a location and orientation of each camera based on the reference point data.
Type: Application
Filed: February 25, 2021
Publication date: March 10, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
-
Publication number: 20220076452
Abstract: Embodiments facilitate the calibration of cameras in a live action scene. In some embodiments, a system receives images of the live action scene from a plurality of cameras. The system further receives reference point data generated from a performance capture system, where the reference point data is based on at least three reference points, where the at least three reference points are attached to a linear form, and where distances between the at least three reference points are predetermined. The system further locates the at least three reference points in one or more of the images. The system further computes one or more ratios of the distances between each adjacent pair of reference points of the at least three reference points in the one or more images. The system further determines a location and orientation of each camera based on the reference point data.
Type: Application
Filed: February 25, 2021
Publication date: March 10, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
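A minimal sketch (not from the patent) of the ratio computation this abstract describes: distances between adjacent detected reference points are turned into ratios, which are scale-invariant and can therefore be compared against the known physical spacing of the markers on the linear form. The example coordinates and spacing are assumptions.

```python
import numpy as np

def adjacent_distance_ratios(points_2d):
    """Ratios of the distances between each adjacent pair of detected reference points,
    ordered along the linear form."""
    pts = np.asarray(points_2d, dtype=float)
    gaps = np.linalg.norm(np.diff(pts, axis=0), axis=1)  # distance between neighbours
    return gaps[:-1] / gaps[1:]

# Usage: three markers spaced 10 cm and then 20 cm apart, seen foreshortened in an image.
observed = [(100.0, 50.0), (130.0, 50.0), (190.0, 50.0)]
print(adjacent_distance_ratios(observed))   # ~[0.5], matching the known 10:20 spacing
```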
-
Patent number: 11270490
Abstract: A method for generating one or more visual representations of an object colliding with an interface between a simulated fluid and a material. The method includes obtaining shape and movement data of a bulk fluid and an object, identifying an interface where the bulk fluid covers a portion of the object, generating an emitted fluid at the interface, and generating shape and movement data of the emitted fluid interacting with the object.
Type: Grant
Filed: February 24, 2021
Date of Patent: March 8, 2022
Assignee: Weta Digital Limited
Inventor: Alexey Stomakhin
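A minimal sketch (not from the patent) of the emission step, with the bulk fluid approximated by a flat water surface: object surface samples that are covered by the fluid and close to the interface spawn secondary particles that inherit the object's motion. The flat-surface assumption, thresholds, and names are all illustrative.

```python
import numpy as np

def emit_at_interface(object_points, object_velocities, fluid_height, spray_speed=0.5):
    """Emit secondary-fluid particles where the bulk fluid covers the object."""
    emitted = []
    for p, v in zip(object_points, object_velocities):
        covered = p[2] < fluid_height                    # point is under the fluid surface
        near_interface = abs(p[2] - fluid_height) < 0.1  # and close to the interface
        if covered and near_interface:
            emitted.append({
                "position": p.copy(),
                "velocity": v + np.array([0.0, 0.0, spray_speed]),  # object motion plus spray
            })
    return emitted

# Usage: a few surface samples of an object slapping into water at z = 0.
pts = np.array([[0.0, 0.0, -0.05], [0.2, 0.0, 0.3], [0.4, 0.0, -0.5]])
vels = np.tile(np.array([0.0, 0.0, -2.0]), (3, 1))
spray = emit_at_interface(pts, vels, fluid_height=0.0)   # one particle, at the interface
```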
-
Publication number: 20220067968
Abstract: Embodiments facilitate the calibration of cameras in a live action scene using drones with multiple cameras. In some embodiments, a method configures a plurality of reference cameras to observe at least one portion of the live action scene. The method further configures at least one first camera coupled to an apparatus to observe one or more moving objects in the live action scene. The method further configures at least one second camera coupled to the at least one apparatus to observe at least three known reference points located in the live action scene. The method further receives reference point data in association with the at least one second camera, where the reference point data is based on the at least three known reference points. The method further computes a location and an orientation of the at least one first camera and the at least one second camera based on the reference point data.
Type: Application
Filed: December 11, 2020
Publication date: March 3, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
-
Publication number: 20220067970
Abstract: Embodiments facilitate the calibration of cameras in a live action scene using fixed cameras and drones. In some embodiments, a method configures a plurality of reference cameras to observe at least three known reference points located in the live action scene and to observe one or more reference points associated with one or more moving cameras having unconstrained motion. The method further configures the one or more moving cameras to observe one or more moving objects in the live action scene. The method further receives reference point data in association with one or more reference cameras of the plurality of reference cameras, where the reference point data is based on the at least three known reference points and the one or more reference points associated with the one or more moving cameras.
Type: Application
Filed: December 11, 2020
Publication date: March 3, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
-
Publication number: 20220067972
Abstract: The present description relates to recalibration of a sensor device for performance capture by detecting a miscalibration problem with the sensor device and assessing the problem. A recalibration system includes sensor devices initially calibrated at a recording site. After a change at the recording site, a failure to match virtual rays projected from one sensor device with virtual rays projected from an active marker is detected. In response to determining the failure, the active marker is signaled to emit a unique display of light. The failure of the rays to match is assessed based on whether sensor devices capture the unique display of light. Three-dimensional (3-D) coordinates of the active marker are reconstructed from marker data of the calibrated sensor devices. The problematic sensor device is recalibrated based on the assessment, using the 3-D coordinates of the active marker reconstructed from marker data of the remaining calibrated sensor devices, without stopping the recording.
Type: Application
Filed: July 16, 2021
Publication date: March 3, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
-
Publication number: 20220067969
Abstract: Embodiments facilitate the calibration of cameras in a live action scene using drones. In some embodiments, a method configures a plurality of reference cameras to observe at least one portion of the live action scene. The method further configures one or more moving cameras having unconstrained motion to observe one or more moving objects in the live action scene and to observe at least three known reference points associated with the plurality of reference cameras. The method further receives reference point data in association with the one or more moving cameras, where the reference point data is based on the at least three known reference points. The method further computes a location and an orientation of each moving camera of the one or more moving cameras based on one or more of the reference point data and one or more locations of one or more reference cameras of the plurality of reference cameras.
Type: Application
Filed: December 11, 2020
Publication date: March 3, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
-
Publication number: 20220053108
Abstract: Embodiments provide a wearable article for a performance capture system. In some embodiments, a wearable article includes one or more regions, where the one or more regions are configured to be worn on at least a portion of a body of a user, where the one or more regions have a first pliability and a second pliability, where the first pliability and the second pliability are different pliabilities, and where at least one of the one or more regions is configured to hold devices in predetermined positions while maintaining shape and respective pliability. In some embodiments, the wearable article also includes a plurality of mounting mechanisms coupled to the one or more regions for mounting one or more reference markers to be used for position determination.
Type: Application
Filed: April 30, 2021
Publication date: February 17, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
-
Publication number: 20220050497
Abstract: Embodiments provide a wearable article with channels for a performance capture system. In some embodiments, a wearable article includes one or more regions of the wearable article configured to be worn on at least a portion of a body of a user. In some embodiments, the wearable article also includes at least one of the one or more regions comprising at least one base layer and at least one secondary layer configured to form at least one connection passage between the at least one base layer and the at least one secondary layer. In some embodiments, the at least one connection passage is configured to provide access for flexible cable connections between at least one reference marker and one or more other reference markers or a control unit. In some embodiments, the at least one connection passage is configured to allow movement of a flexible cable within the connection passage in response to movement of the user.
Type: Application
Filed: April 30, 2021
Publication date: February 17, 2022
Applicant: Weta Digital Limited
Inventors: Dejan Momcilovic, Jake Botting
-
Method for controlling digital feather growth between two manifolds in a computer simulated creature
Patent number: 11250609
Abstract: Realistic feather growth may be represented between two surface manifolds in a modeling system. To perform the feather growth, a feather groom for a plurality of feathers between an inner shell of a creature and an outer shell of the creature is received. An inner manifold for the inner shell and an outer manifold for the outer shell are determined, along with a plurality of follicle points and a plurality of tip points. A first surface contour definition for the inner manifold and a second surface contour definition for the outer manifold are determined and used to determine a volumetric vector field between the inner manifold and the outer manifold. Thereafter, the plurality of feathers is generated between the inner manifold and the outer manifold using the follicle points, the tip points, and the volumetric vector field.
Type: Grant
Filed: May 14, 2021
Date of Patent: February 15, 2022
Assignee: Weta Digital Limited
Inventor: Christoph Sprenger
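A minimal sketch (not from the patent) of growing one feather shaft by stepping from its follicle point toward its tip point through a volumetric vector field; the field here is a toy blend of the straight follicle-to-tip direction with a sideways swirl, and all names and constants are assumptions.

```python
import numpy as np

def grow_feather(follicle, tip, vector_field, steps=10):
    """Trace a feather shaft from its follicle point on the inner manifold
    toward its tip point on the outer manifold by stepping through the field."""
    p = np.asarray(follicle, dtype=float)
    tip = np.asarray(tip, dtype=float)
    curve = [p.copy()]
    step_len = np.linalg.norm(tip - p) / steps
    for _ in range(steps):
        d = vector_field(p, tip)
        p = p + step_len * d / np.linalg.norm(d)
        curve.append(p.copy())
    return np.array(curve)

def toy_field(p, tip):
    # Toy volumetric field: mostly toward the tip, with a little swirl around the z axis.
    d = (tip - p) + 0.3 * np.array([-p[1], p[0], 0.0])
    return d if np.linalg.norm(d) > 1e-8 else np.array([0.0, 0.0, 1.0])

shaft = grow_feather(follicle=[0.1, 0.0, 0.0], tip=[0.0, 0.0, 1.0], vector_field=toy_field)
```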