Patents by Inventor Nathaniel Christopher Dirksen
Nathaniel Christopher Dirksen has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240087200
Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
Type: Application
Filed: November 10, 2023
Publication date: March 14, 2024
Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
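The layered-animation idea in this abstract, a nonverbal gesture applied on top of a primary animation each frame, can be illustrated with a minimal Python sketch. The pose representation, joint names, and function names below are hypothetical illustrations, not code or terminology from the patent.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    joints: dict  # joint name -> rotation angle in degrees (simplified to 1-DOF)

def blend_layers(base: Pose, nonverbal: Pose, weight: float) -> Pose:
    """Additively layer a nonverbal gesture (e.g. a head nod) on top of
    the primary animation's pose, scaled by a blend weight."""
    out = dict(base.joints)
    for joint, delta in nonverbal.joints.items():
        out[joint] = out.get(joint, 0.0) + weight * delta
    return Pose(out)

# Per frame: evaluate both clips, then layer the gesture over the base.
walk = Pose({"hip": 12.0, "neck": 0.0})   # primary animation pose
nod = Pose({"neck": -15.0})               # nonverbal gesture as a delta pose
frame_pose = blend_layers(walk, nod, weight=0.5)
```

Because the gesture is stored as per-joint deltas, it can run simultaneously with any primary animation without overwriting joints the gesture does not touch.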
-
Patent number: 11915342
Abstract: Systems, methods, and non-transitory computer-readable media can obtain data associated with a computer-based experience. The computer-based experience can be based on interactive real-time technology. At least one virtual camera can be configured within the computer-based experience in a real-time engine. Data associated with an edit cut of the computer-based experience can be obtained based on content captured by the at least one virtual camera. A plurality of shots that correspond to two-dimensional content can be generated from the edit cut of the computer-based experience in the real-time engine. Data associated with a two-dimensional version of the computer-based experience can be generated with the real-time engine based on the plurality of shots. The two-dimensional version can be rendered based on the generated data.
Type: Grant
Filed: July 15, 2022
Date of Patent: February 27, 2024
Assignee: Baobab Studios Inc.
Inventors: Mikhail Stanislavovich Solovykh, Wei Wang, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
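The pipeline this abstract describes, turning an edit cut of virtual-camera captures into discrete shots and rendering those shots as a two-dimensional version, can be sketched in Python. The data model and function names are hypothetical illustrations under simplified assumptions (each cut is a camera name plus a frame range), not code from the patent.

```python
from dataclasses import dataclass

@dataclass
class Shot:
    camera: str
    start_frame: int
    end_frame: int

def shots_from_edit_cut(edit_cut):
    """Turn an edit cut, here an ordered list of (camera, in-frame, out-frame)
    cuts, into discrete two-dimensional shots for the engine to render."""
    return [Shot(cam, lo, hi) for cam, lo, hi in edit_cut]

def render_two_dimensional(shots, render_frame):
    """Render every frame of every shot through its per-shot virtual camera."""
    return [render_frame(shot.camera, frame)
            for shot in shots
            for frame in range(shot.start_frame, shot.end_frame + 1)]

# A two-cut edit: a wide shot followed by a close-up.
cut = [("cam_wide", 0, 2), ("cam_close", 3, 4)]
frames = render_two_dimensional(shots_from_edit_cut(cut),
                                lambda cam, f: f"{cam}@{f}")
```

Keeping the shot list as plain data means the same edit cut can be re-rendered whenever the underlying interactive content changes.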
-
Patent number: 11694381
Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
Type: Grant
Filed: March 12, 2021
Date of Patent: July 4, 2023
Assignee: Baobab Studios Inc.
Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
-
Patent number: 11636640
Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
Type: Grant
Filed: March 12, 2021
Date of Patent: April 25, 2023
Assignee: Baobab Studios Inc.
Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
-
Publication number: 20230005192
Abstract: Systems, methods, and non-transitory computer-readable media can obtain data associated with a computer-based experience. The computer-based experience can be based on interactive real-time technology. At least one virtual camera can be configured within the computer-based experience in a real-time engine. Data associated with an edit cut of the computer-based experience can be obtained based on content captured by the at least one virtual camera. A plurality of shots that correspond to two-dimensional content can be generated from the edit cut of the computer-based experience in the real-time engine. Data associated with a two-dimensional version of the computer-based experience can be generated with the real-time engine based on the plurality of shots. The two-dimensional version can be rendered based on the generated data.
Type: Application
Filed: July 15, 2022
Publication date: January 5, 2023
Inventors: Mikhail Stanislavovich Solovykh, Wei Wang, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
-
Patent number: 11403787
Abstract: Systems, methods, and non-transitory computer-readable media can obtain data associated with a computer-based experience. The computer-based experience can be based on interactive real-time technology. At least one virtual camera can be configured within the computer-based experience in a real-time engine. Data associated with an edit cut of the computer-based experience can be obtained based on content captured by the at least one virtual camera. A plurality of shots that correspond to two-dimensional content can be generated from the edit cut of the computer-based experience in the real-time engine. Data associated with a two-dimensional version of the computer-based experience can be generated with the real-time engine based on the plurality of shots. The two-dimensional version can be rendered based on the generated data.
Type: Grant
Filed: October 22, 2020
Date of Patent: August 2, 2022
Assignee: Baobab Studios Inc.
Inventors: Mikhail Stanislavovich Solovykh, Wei Wang, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
-
Publication number: 20210201552
Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
Type: Application
Filed: March 12, 2021
Publication date: July 1, 2021
Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
-
Publication number: 20210142546
Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
Type: Application
Filed: January 20, 2021
Publication date: May 13, 2021
Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
-
Publication number: 20210125382
Abstract: Systems, methods, and non-transitory computer-readable media can obtain data associated with a computer-based experience. The computer-based experience can be based on interactive real-time technology. At least one virtual camera can be configured within the computer-based experience in a real-time engine. Data associated with an edit cut of the computer-based experience can be obtained based on content captured by the at least one virtual camera. A plurality of shots that correspond to two-dimensional content can be generated from the edit cut of the computer-based experience in the real-time engine. Data associated with a two-dimensional version of the computer-based experience can be generated with the real-time engine based on the plurality of shots. The two-dimensional version can be rendered based on the generated data.
Type: Application
Filed: October 22, 2020
Publication date: April 29, 2021
Inventors: Mikhail Stanislavovich Solovykh, Wei Wang, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
-
Publication number: 20210125306
Abstract: Systems, methods, and non-transitory computer-readable media can obtain data associated with a computer-based experience. The computer-based experience can be based on interactive real-time technology. At least one virtual camera can be configured within the computer-based experience in an animation creation application. A plurality of shots that correspond to two-dimensional content can be generated from an edit cut of content captured by the at least one virtual camera in the animation creation application. Data associated with a two-dimensional version of the computer-based experience can be generated in a real-time engine based on the plurality of shots. The two-dimensional version can be rendered based on the generated data.
Type: Application
Filed: October 22, 2020
Publication date: April 29, 2021
Inventors: Mikhail Stanislavovich Solovykh, Wei Wang, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
-
Patent number: 10937219
Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
Type: Grant
Filed: July 25, 2018
Date of Patent: March 2, 2021
Assignee: Baobab Studios Inc.
Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
-
Patent number: 10810780
Abstract: Systems, methods, and non-transitory computer-readable media can receive virtual model information associated with a virtual deformable geometric model. The virtual model information comprises a complex rig comprising a plurality of transforms and a first plurality of vertices defined by a default model, and a simplified rig comprising a second plurality of transforms and a second plurality of vertices. The second plurality of vertices correspond to the first plurality of vertices defined by the default model. The simplified rig and the complex rig are deformed based on an animation to be applied to the virtual deformable geometric model. A set of offset data is calculated. The set of offset data comprises, for each vertex in the first plurality of vertices, an offset between the vertex and a corresponding vertex in the second plurality of vertices.
Type: Grant
Filed: July 25, 2018
Date of Patent: October 20, 2020
Assignee: Baobab Studios Inc.
Inventors: Michael Scott Hutchinson, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
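The offset calculation this abstract describes, a per-vertex delta between a deformed complex rig and a deformed simplified rig, can be sketched in Python. The vertex lists and function names are hypothetical illustrations assuming a one-to-one vertex correspondence, not code from the patent.

```python
def vertex_offsets(complex_verts, simple_verts):
    """For each deformed vertex of the complex rig, compute its offset from
    the corresponding deformed vertex of the simplified rig."""
    return [(cx - sx, cy - sy, cz - sz)
            for (cx, cy, cz), (sx, sy, sz) in zip(complex_verts, simple_verts)]

def apply_offsets(simple_verts, offsets):
    """Replaying the cheap simplified rig plus the stored offsets
    reproduces the expensive complex deformation."""
    return [(sx + ox, sy + oy, sz + oz)
            for (sx, sy, sz), (ox, oy, oz) in zip(simple_verts, offsets)]

# Both rigs deformed by the same animation frame (two vertices shown).
complex_rig = [(1.0, 2.0, 0.5), (0.0, 1.5, 0.0)]
simple_rig = [(1.0, 1.8, 0.5), (0.1, 1.5, 0.0)]
offsets = vertex_offsets(complex_rig, simple_rig)
restored = apply_offsets(simple_rig, offsets)  # matches the complex rig
```

The appeal of this kind of scheme is that the offsets can be computed offline, so at runtime only the lightweight simplified rig needs to be evaluated.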
-
Publication number: 20190035132
Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
Type: Application
Filed: July 25, 2018
Publication date: January 31, 2019
Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
-
Publication number: 20190035130
Abstract: Systems, methods, and non-transitory computer-readable media can receive virtual model information associated with a virtual deformable geometric model. The virtual model information comprises a complex rig comprising a plurality of transforms and a first plurality of vertices defined by a default model, and a simplified rig comprising a second plurality of transforms and a second plurality of vertices. The second plurality of vertices correspond to the first plurality of vertices defined by the default model. The simplified rig and the complex rig are deformed based on an animation to be applied to the virtual deformable geometric model. A set of offset data is calculated. The set of offset data comprises, for each vertex in the first plurality of vertices, an offset between the vertex and a corresponding vertex in the second plurality of vertices.
Type: Application
Filed: July 25, 2018
Publication date: January 31, 2019
Inventors: Michael Scott Hutchinson, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios