Patents by Inventor Mianlun Zheng

Mianlun Zheng has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11972052
    Abstract: Human interactive texture generation and search systems and methods are described. A deep convolutional generative adversarial network is used for mapping information in a latent space into texture models. An interactive evolutionary computation algorithm for searching a texture through an evolving latent space driven by human preference is also described. Advantages of a generative model and an evolutionary computation are combined to realize a controllable and bounded texture tuning process under the guidance of human preferences. Additionally, a fully haptic user interface is described, which can be used to evaluate the systems and methods in terms of their efficiency and accuracy of searching and generating new virtual textures that are closely representative of given real textures.
    Type: Grant
    Filed: April 21, 2022
    Date of Patent: April 30, 2024
    Assignee: UNIVERSITY OF SOUTHERN CALIFORNIA
    Inventors: Shihan Lu, Heather Culbertson, Matthew Fontaine, Mianlun Zheng
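    The abstract above describes searching a generative model's latent space with an evolutionary algorithm driven by human preference. The following is a minimal, hypothetical sketch of that search loop, not the patented system: a scripted fitness function stands in for the human preference ratings, the latent dimension and all parameters are illustrative, and the generative texture model itself is omitted.

    ```python
    import random

    LATENT_DIM = 8
    TARGET = [0.5] * LATENT_DIM  # stand-in for the latent code of a target real texture

    def preference(z):
        # Scripted proxy for a human rating: higher means closer to the target.
        return -sum((a - b) ** 2 for a, b in zip(z, TARGET))

    def evolve(generations=50, pop_size=16, sigma=0.2, seed=0):
        rng = random.Random(seed)
        # Initial population of latent vectors.
        pop = [[rng.uniform(-1.0, 1.0) for _ in range(LATENT_DIM)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=preference, reverse=True)
            parents = pop[: pop_size // 4]          # keep the preferred candidates
            # Mutate preferred candidates to form the next generation.
            pop = [[g + rng.gauss(0.0, sigma) for g in rng.choice(parents)]
                   for _ in range(pop_size)]
            pop[: len(parents)] = parents           # elitism: never lose the best
        return max(pop, key=preference)

    best = evolve()
    ```

    In the patented setting, `preference` would be replaced by a person rating textures rendered from each latent vector through the generative network, which is what bounds the tuning process to plausible textures.
    
    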
  • Patent number: 11830138
    Abstract: Various disclosed embodiments are directed to estimating that a first vertex of a patch will change from a first position to a second position (the second position being at least partially indicative of secondary motion) based at least in part on one or more features of: primary motion data, one or more material properties, and constraint data associated with the patch. Such estimation can be made for some or all of the patches of an entire volumetric mesh in order to accurately predict the overall secondary motion of an object. This functionality, among others described herein, resolves the inaccuracies, reduces the computer resource consumption, and improves the user experience of existing technologies.
    Type: Grant
    Filed: March 19, 2021
    Date of Patent: November 28, 2023
    Assignee: ADOBE INC.
    Inventors: Duygu Ceylan Aksit, Mianlun Zheng, Yi Zhou
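    The abstract above describes predicting a vertex's secondary-motion offset from per-patch features (primary motion, material properties, constraint data). The sketch below is illustrative only and is not Adobe's patented model: the tiny MLP's weights are random, and the feature layout is an assumption; in practice such a network would be trained on simulated ground-truth secondary motion.

    ```python
    import math
    import random

    def make_mlp(n_in, n_hidden, n_out, seed=0):
        """Build a one-hidden-layer MLP with random weights (illustrative only)."""
        rng = random.Random(seed)
        w1 = [[rng.gauss(0.0, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
        w2 = [[rng.gauss(0.0, 0.5) for _ in range(n_hidden)] for _ in range(n_out)]
        def forward(x):
            h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in w1]
            return [sum(w * hi for w, hi in zip(row, h)) for row in w2]
        return forward

    # Assumed feature layout for one patch: 3D primary-motion displacement,
    # a normalized material stiffness, and a constrained-vertex flag.
    predict_offset = make_mlp(n_in=5, n_hidden=16, n_out=3)

    features = [0.1, -0.2, 0.05,  # primary motion of the patch
                0.8,              # material stiffness (normalized)
                0.0]              # 1.0 if the vertex is pinned by a constraint
    offset = predict_offset(features)                 # predicted secondary motion
    first_position = [1.0, 2.0, 3.0]
    second_position = [p + d for p, d in zip(first_position, offset)]
    ```

    Running one such predictor per patch over the whole volumetric mesh, as the abstract describes, yields the object's overall secondary motion without a full physics solve.
    
    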
  • Publication number: 20220382375
    Abstract: Human interactive texture generation and search systems and methods are described. A deep convolutional generative adversarial network is used for mapping information in a latent space into texture models. An interactive evolutionary computation algorithm for searching a texture through an evolving latent space driven by human preference is also described. Advantages of a generative model and an evolutionary computation are combined to realize a controllable and bounded texture tuning process under the guidance of human preferences. Additionally, a fully haptic user interface is described, which can be used to evaluate the systems and methods in terms of their efficiency and accuracy of searching and generating new virtual textures that are closely representative of given real textures.
    Type: Application
    Filed: April 21, 2022
    Publication date: December 1, 2022
    Inventors: Shihan Lu, Heather Culbertson, Matthew Fontaine, Mianlun Zheng
  • Publication number: 20220301262
    Abstract: Various disclosed embodiments are directed to estimating that a first vertex of a patch will change from a first position to a second position (the second position being at least partially indicative of secondary motion) based at least in part on one or more features of: primary motion data, one or more material properties, and constraint data associated with the patch. Such estimation can be made for some or all of the patches of an entire volumetric mesh in order to accurately predict the overall secondary motion of an object. This functionality, among others described herein, resolves the inaccuracies, reduces the computer resource consumption, and improves the user experience of existing technologies.
    Type: Application
    Filed: March 19, 2021
    Publication date: September 22, 2022
    Inventors: Duygu Ceylan Aksit, Mianlun Zheng, Yi Zhou