Patents by Inventor Nathaniel Christopher Dirksen

Nathaniel Christopher Dirksen has filed for patents to protect the following inventions. This listing includes published patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Brief illustrative code sketches of the core techniques described in these abstracts follow the listing.

  • Publication number: 20240087200
    Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
    Type: Application
    Filed: November 10, 2023
    Publication date: March 14, 2024
    Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
  • Patent number: 11915342
    Abstract: Systems, methods, and non-transitory computer-readable media can obtain data associated with a computer-based experience. The computer-based experience can be based on interactive real-time technology. At least one virtual camera can be configured within the computer-based experience in a real-time engine. Data associated with an edit cut of the computer-based experience can be obtained based on content captured by the at least one virtual camera. A plurality of shots that correspond to two-dimensional content can be generated from the edit cut of the computer-based experience in the real-time engine. Data associated with a two-dimensional version of the computer-based experience can be generated with the real-time engine based on the plurality of shots. The two-dimensional version can be rendered based on the generated data.
    Type: Grant
    Filed: July 15, 2022
    Date of Patent: February 27, 2024
    Assignee: Baobab Studios Inc.
    Inventors: Mikhail Stanislavovich Solovykh, Wei Wang, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
  • Patent number: 11694381
    Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
    Type: Grant
    Filed: March 12, 2021
    Date of Patent: July 4, 2023
    Assignee: Baobab Studios Inc.
    Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
  • Patent number: 11636640
    Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
    Type: Grant
    Filed: March 12, 2021
    Date of Patent: April 25, 2023
    Assignee: Baobab Studios Inc.
    Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
  • Publication number: 20230005192
    Abstract: Systems, methods, and non-transitory computer-readable media can obtain data associated with a computer-based experience. The computer-based experience can be based on interactive real-time technology. At least one virtual camera can be configured within the computer-based experience in a real-time engine. Data associated with an edit cut of the computer-based experience can be obtained based on content captured by the at least one virtual camera. A plurality of shots that correspond to two-dimensional content can be generated from the edit cut of the computer-based experience in the real-time engine. Data associated with a two-dimensional version of the computer-based experience can be generated with the real-time engine based on the plurality of shots. The two-dimensional version can be rendered based on the generated data.
    Type: Application
    Filed: July 15, 2022
    Publication date: January 5, 2023
    Inventors: Mikhail Stanislavovich Solovykh, Wei Wang, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
  • Patent number: 11403787
    Abstract: Systems, methods, and non-transitory computer-readable media can obtain data associated with a computer-based experience. The computer-based experience can be based on interactive real-time technology. At least one virtual camera can be configured within the computer-based experience in a real-time engine. Data associated with an edit cut of the computer-based experience can be obtained based on content captured by the at least one virtual camera. A plurality of shots that correspond to two-dimensional content can be generated from the edit cut of the computer-based experience in the real-time engine. Data associated with a two-dimensional version of the computer-based experience can be generated with the real-time engine based on the plurality of shots. The two-dimensional version can be rendered based on the generated data.
    Type: Grant
    Filed: October 22, 2020
    Date of Patent: August 2, 2022
    Assignee: Baobab Studios Inc.
    Inventors: Mikhail Stanislavovich Solovykh, Wei Wang, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
  • Publication number: 20210201552
    Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
    Type: Application
    Filed: March 12, 2021
    Publication date: July 1, 2021
    Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
  • Publication number: 20210142546
    Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
    Type: Application
    Filed: January 20, 2021
    Publication date: May 13, 2021
    Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
  • Publication number: 20210125382
    Abstract: Systems, methods, and non-transitory computer-readable media can obtain data associated with a computer-based experience. The computer-based experience can be based on interactive real-time technology. At least one virtual camera can be configured within the computer-based experience in a real-time engine. Data associated with an edit cut of the computer-based experience can be obtained based on content captured by the at least one virtual camera. A plurality of shots that correspond to two-dimensional content can be generated from the edit cut of the computer-based experience in the real-time engine. Data associated with a two-dimensional version of the computer-based experience can be generated with the real-time engine based on the plurality of shots. The two-dimensional version can be rendered based on the generated data.
    Type: Application
    Filed: October 22, 2020
    Publication date: April 29, 2021
    Inventors: Mikhail Stanislavovich Solovykh, Wei Wang, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
  • Publication number: 20210125306
    Abstract: Systems, methods, and non-transitory computer-readable media can obtain data associated with a computer-based experience. The computer-based experience can be based on interactive real-time technology. At least one virtual camera can be configured within the computer-based experience in an animation creation application. A plurality of shots that correspond to two-dimensional content can be generated from an edit cut of content captured by the at least one virtual camera in the animation creation application. Data associated with a two-dimensional version of the computer-based experience can be generated in a real-time engine based on the plurality of shots. The two-dimensional version can be rendered based on the generated data.
    Type: Application
    Filed: October 22, 2020
    Publication date: April 29, 2021
    Inventors: Mikhail Stanislavovich Solovykh, Wei Wang, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
  • Patent number: 10937219
    Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
    Type: Grant
    Filed: July 25, 2018
    Date of Patent: March 2, 2021
    Assignee: Baobab Studios Inc.
    Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
  • Patent number: 10810780
    Abstract: Systems, methods, and non-transitory computer-readable media can receive virtual model information associated with a virtual deformable geometric model. The virtual model information comprises a complex rig comprising a plurality of transforms and a first plurality of vertices defined by a default model, and a simplified rig comprising a second plurality of transforms and a second plurality of vertices. The second plurality of vertices correspond to the first plurality of vertices defined by the default model. The simplified rig and the complex rig are deformed based on an animation to be applied to the virtual deformable geometric model. A set of offset data is calculated. The set of offset data comprises, for each vertex in the first plurality of vertices, an offset between the vertex and a corresponding vertex in the second plurality of vertices.
    Type: Grant
    Filed: July 25, 2018
    Date of Patent: October 20, 2020
    Assignee: Baobab Studios Inc.
    Inventors: Michael Scott Hutchinson, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
  • Publication number: 20190035132
    Abstract: Systems, methods, and non-transitory computer-readable media can identify a virtual character being presented to a user within a real-time immersive environment. A first animation to be applied to the virtual character is determined. A nonverbal communication animation to be applied to the virtual character simultaneously with the first animation is determined. The virtual character is animated in real-time based on the first animation and the nonverbal communication animation.
    Type: Application
    Filed: July 25, 2018
    Publication date: January 31, 2019
    Inventors: Nathaniel Christopher Dirksen, Michael Scott Hutchinson, Eric Richard Darnell, Lawrence David Cutler, Daniel Tomas Steamer, Apostolos Lerios
  • Publication number: 20190035130
    Abstract: Systems, methods, and non-transitory computer-readable media can receive virtual model information associated with a virtual deformable geometric model. The virtual model information comprises a complex rig comprising a plurality of transforms and a first plurality of vertices defined by a default model, and a simplified rig comprising a second plurality of transforms and a second plurality of vertices. The second plurality of vertices correspond to the first plurality of vertices defined by the default model. The simplified rig and the complex rig are deformed based on an animation to be applied to the virtual deformable geometric model. A set of offset data is calculated. The set of offset data comprises, for each vertex in the first plurality of vertices, an offset between the vertex and a corresponding vertex in the second plurality of vertices.
    Type: Application
    Filed: July 25, 2018
    Publication date: January 31, 2019
    Inventors: Michael Scott Hutchinson, Nathaniel Christopher Dirksen, Lawrence David Cutler, Apostolos Lerios
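
The nonverbal-communication animation entries above (e.g., patent numbers 11694381, 11636640, and 10937219) describe determining a first animation and a nonverbal communication animation and applying both to a virtual character simultaneously in real time. The sketch below is a minimal illustration of one way such layering could work, assuming a simple per-joint linear blend; the Pose, blend_nonverbal_layer, and sample_clip names are hypothetical and are not taken from the patents.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# A joint pose as (rotation_euler_xyz, translation_xyz); purely illustrative.
JointPose = Tuple[Tuple[float, float, float], Tuple[float, float, float]]

@dataclass
class Pose:
    joints: Dict[str, JointPose]

def blend_nonverbal_layer(primary: Pose, nonverbal: Pose, weight: float) -> Pose:
    """Blend a nonverbal-communication layer (e.g., a glance or a head nod)
    on top of the primary animation pose, joint by joint.

    The linear per-channel blend is an assumption for illustration only;
    the patents do not specify a particular blending scheme."""
    out: Dict[str, JointPose] = {}
    for joint, (rot_a, trans_a) in primary.joints.items():
        if joint in nonverbal.joints:
            rot_b, trans_b = nonverbal.joints[joint]
            rot = tuple(a + weight * (b - a) for a, b in zip(rot_a, rot_b))
            trans = tuple(a + weight * (b - a) for a, b in zip(trans_a, trans_b))
            out[joint] = (rot, trans)
        else:
            # Joints the nonverbal layer does not touch keep the primary pose.
            out[joint] = (rot_a, trans_a)
    return Pose(out)

# Per-frame usage (sample_clip is a hypothetical clip sampler):
# primary_pose = sample_clip(walk_clip, t)
# gesture_pose = sample_clip(head_nod_clip, t)
# final_pose = blend_nonverbal_layer(primary_pose, gesture_pose, weight=0.6)
```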
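
Several entries (e.g., patent numbers 11403787 and 11915342, and publications 20210125382 and 20210125306) describe configuring virtual cameras within an interactive real-time experience, obtaining an edit cut from the content they capture, generating a plurality of shots, and rendering a two-dimensional version with a real-time engine (or, in one variant, generating the shots in an animation creation application). The following is a minimal data-flow sketch, assuming simple dataclasses for cameras, shots, and the edit cut, and a hypothetical render_frame callback standing in for the engine; none of these names come from the patents.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class VirtualCamera:
    name: str
    fov_degrees: float

@dataclass
class Shot:
    camera: VirtualCamera
    start_time: float   # seconds into the experience
    end_time: float

@dataclass
class EditCut:
    shots: List[Shot]

def render_two_dimensional_version(
    cut: EditCut,
    frame_rate: float,
    render_frame: Callable[[VirtualCamera, float], bytes],
) -> List[bytes]:
    """Walk the edit cut shot by shot and ask the (hypothetical) real-time
    engine to render each frame from that shot's virtual camera, producing
    an ordered list of 2D frames for the two-dimensional version."""
    frames: List[bytes] = []
    dt = 1.0 / frame_rate
    for shot in cut.shots:
        t = shot.start_time
        while t < shot.end_time:
            frames.append(render_frame(shot.camera, t))
            t += dt
    return frames
```

The per-shot loop mirrors the abstract's ordering: the edit cut determines which camera covers which time range, and the engine renders only what each shot needs.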
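
Patent number 10810780 and publication 20190035130 describe deforming a complex rig and a simplified rig with the same animation and computing, for each vertex of the complex rig's default model, an offset to the corresponding simplified-rig vertex. The sketch below assumes both deformed meshes are given as (N, 3) NumPy arrays with row-wise correspondence; the function name and array layout are illustrative assumptions, not the patented representation.

```python
import numpy as np

def compute_offsets(complex_vertices: np.ndarray,
                    simplified_vertices: np.ndarray) -> np.ndarray:
    """Per-vertex offsets between a deformed complex rig and the corresponding
    vertices of a deformed simplified rig.

    Both arrays are shaped (N, 3), with row i of simplified_vertices
    corresponding to row i of complex_vertices. Returns an (N, 3) array of
    offsets such that simplified + offsets reproduces the complex deformation."""
    if complex_vertices.shape != simplified_vertices.shape:
        raise ValueError("vertex arrays must have matching shapes")
    return complex_vertices - simplified_vertices

# At runtime the cheap simplified rig can be deformed and the stored offsets
# added back to approximate the complex result (deform_simplified is hypothetical):
# approx_vertices = deform_simplified(animation) + offsets
```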