Patents by Inventor Jason Zhi LIANG
Jason Zhi LIANG has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11507844
Abstract: The technology disclosed proposes a novel asynchronous evaluation strategy (AES) that increases throughput of evolutionary algorithms by continuously maintaining a queue of K individuals ready to be sent to the worker nodes for evaluation and evolving the next generation once a fraction Mi of the K individuals have been evaluated by the worker nodes, where Mi << K. A suitable value for Mi is determined experimentally, balancing diversity and efficiency. The technology disclosed is extended to coevolution of deep neural network supermodules and blueprints in the form of AES for cooperative evolution of deep neural networks (CoDeepNEAT-AES). Applied to the image captioning domain, a threefold speedup is observed on 200 graphics processing unit (GPU) worker nodes, demonstrating that the disclosed AES and CoDeepNEAT-AES are promising techniques for evolving complex systems with long and variable evaluation times.
Type: Grant
Filed: March 7, 2018
Date of Patent: November 22, 2022
Assignee: Cognizant Technology Solutions U.S. Corporation
Inventors: Jason Zhi Liang, Hormoz Shahrzad, Babak Hodjat, Risto Miikkulainen
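The queue-based idea in the abstract can be sketched with a toy problem: keep K candidates queued, and as soon as M (with M << K) evaluations return, breed replacements from the best results so far instead of waiting for the whole generation. Everything below (the function name `aes_maximize`, the mutation scheme, the elite size) is an illustrative stand-in, not the patented implementation:

```python
import random

def aes_maximize(fitness, k=20, m=4, generations=50, seed=0):
    """Toy AES loop: a queue of K candidates; once M << K results
    "arrive", evolve replacements from the evaluated pool."""
    rng = random.Random(seed)
    queue = [rng.uniform(-10, 10) for _ in range(k)]
    evaluated = []  # (candidate, fitness) pairs returned by "workers"
    for _ in range(generations):
        rng.shuffle(queue)                # workers finish out of order
        batch, queue = queue[:m], queue[m:]
        evaluated += [(x, fitness(x)) for x in batch]
        evaluated.sort(key=lambda p: p[1], reverse=True)
        parents = [x for x, _ in evaluated[:5]]
        # Refill the queue with mutated copies of the current best.
        queue += [rng.choice(parents) + rng.gauss(0, 0.5) for _ in range(m)]
    return evaluated[0]

# Maximize a simple quadratic peaked at x = 3.
best, score = aes_maximize(lambda x: -(x - 3.0) ** 2)
```

The point of the sketch is the control flow: procreation never blocks on the slowest evaluations, which is where the claimed throughput gain on long, variable evaluation times comes from.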
-
Patent number: 11250327
Abstract: The technology disclosed relates to evolving deep neural network structures. A deep neural network structure includes a plurality of modules with submodules and interconnections among the modules and the submodules. In particular, the technology disclosed relates to storing candidate genomes that identify respective values for a plurality of hyperparameters of a candidate genome. The hyperparameters include global topology hyperparameters, global operational hyperparameters, local topology hyperparameters, and local operational hyperparameters. It further includes evolving the hyperparameters by training, evaluating, and procreating the candidate genomes and corresponding modules and submodules.
Type: Grant
Filed: October 26, 2017
Date of Patent: February 15, 2022
Assignee: Cognizant Technology Solutions U.S. Corporation
Inventors: Jason Zhi Liang, Risto Miikkulainen
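The four hyperparameter categories and the train/evaluate/procreate cycle can be illustrated with a minimal genome-evolution loop. The genome fields, mutation operator, and the stand-in fitness function are all hypothetical; a real system would train and validate a network built from each genome:

```python
import random

def random_genome(rng):
    """Illustrative genome with global/local, topology/operational fields."""
    return {
        "global_topology":    {"num_modules": rng.randint(1, 4)},
        "global_operational": {"learning_rate": 10 ** rng.uniform(-4, -1)},
        "local_topology":     {"layers_per_module": rng.randint(1, 3)},
        "local_operational":  {"dropout": rng.uniform(0.0, 0.5)},
    }

def mutate(genome, rng):
    child = {k: dict(v) for k, v in genome.items()}
    d = child["local_operational"]["dropout"] + rng.gauss(0, 0.05)
    child["local_operational"]["dropout"] = min(0.5, max(0.0, d))
    return child

def evolve(fitness, population_size=10, generations=20, seed=1):
    rng = random.Random(seed)
    population = [random_genome(rng) for _ in range(population_size)]
    for _ in range(generations):
        # Evaluate, keep the fittest half, procreate to refill.
        scored = sorted(population, key=fitness, reverse=True)
        elite = scored[: population_size // 2]
        population = elite + [mutate(rng.choice(elite), rng) for _ in elite]
    return max(population, key=fitness)

# Stand-in fitness: prefer dropout near 0.2 (a proxy for validation accuracy).
best = evolve(lambda g: -abs(g["local_operational"]["dropout"] - 0.2))
```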
-
Patent number: 11250328
Abstract: The technology disclosed relates to evolving a deep neural network based solution to a provided problem. In particular, it relates to providing an improved cooperative evolution technique for deep neural network structures. It includes creating blueprint structures that include a plurality of supermodule structures. The supermodule structures include a plurality of modules. The modules are neural networks. A first loop of evolution executes at the blueprint level. A second loop of evolution executes at the supermodule level. Further, multiple mini-loops of evolution execute at each of the subpopulations of the supermodules. The first loop, the second loop, and the mini-loops execute in parallel.
Type: Grant
Filed: October 26, 2017
Date of Patent: February 15, 2022
Assignee: Cognizant Technology Solutions U.S. Corporation
Inventors: Jason Zhi Liang, Risto Miikkulainen
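The two-level structure (a blueprint loop over combinations, a supermodule loop over the pieces being combined) can be sketched with scalars standing in for supermodules. This is a toy serial version; the names, mutation rates, and the scalar "quality" stand-in are assumptions, and the patent's loops run in parallel:

```python
import random

def coevolve(generations=40, n_blueprints=8, n_supermodules=5, seed=2):
    rng = random.Random(seed)
    # Supermodules stand in as scalar "quality" values; real ones are networks.
    supermodules = [rng.uniform(0.0, 0.5) for _ in range(n_supermodules)]
    # A blueprint is a list of supermodule indices to assemble.
    blueprints = [[rng.randrange(n_supermodules) for _ in range(3)]
                  for _ in range(n_blueprints)]
    fitness = lambda bp: sum(supermodules[i] for i in bp)
    for _ in range(generations):
        # First loop: evolve blueprints (which supermodules to combine).
        blueprints.sort(key=fitness, reverse=True)
        elite = blueprints[: n_blueprints // 2]
        blueprints = elite + [
            [i if rng.random() < 0.7 else rng.randrange(n_supermodules)
             for i in rng.choice(elite)]
            for _ in elite]
        # Second loop: improve supermodules used by elite blueprints
        # (hill-climb the stand-in quality; real systems mutate architectures).
        for i in {i for bp in elite for i in bp}:
            supermodules[i] = min(1.0, supermodules[i] + abs(rng.gauss(0, 0.02)))
    best = max(blueprints, key=fitness)
    return best, fitness(best)

best_bp, best_fit = coevolve()
```

The cooperative aspect is that a supermodule's worth is only visible through the blueprints that assemble it, so the two populations must evolve against each other.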
-
Patent number: 11030529
Abstract: Evolution and coevolution of neural networks via multitask learning is described. The foundation is (1) the original soft ordering, which uses a fixed architecture for the modules and a fixed routing (i.e., network topology) that is shared among all tasks. This architecture is then extended in two ways with CoDeepNEAT: (2) by coevolving the module architectures (CM), and (3) by coevolving both the module architectures and a single shared routing for all tasks (CMSR). An alternative evolutionary process (4) keeps the module architecture fixed, but evolves a separate routing for each task during training (CTR). Finally, approaches (2) and (4) are combined into (5), where both modules and task routing are coevolved (CMTR).
Type: Grant
Filed: December 13, 2018
Date of Patent: June 8, 2021
Assignee: Cognizant Technology Solutions U.S. Corporation
Inventors: Jason Zhi Liang, Elliot Meyerson, Risto Miikkulainen
-
Patent number: 11003994
Abstract: A system and method for evolving a deep neural network structure that solves a provided problem includes: a memory storing a candidate supermodule genome database having a pool of candidate supermodules having values for hyperparameters for identifying a plurality of neural network modules in the candidate supermodule and further storing fixed multitask neural networks; a training module that assembles and trains N enhanced fixed multitask neural networks and trains each enhanced fixed multitask neural network using training data; an evaluation module that evaluates a performance of each enhanced fixed multitask neural network using validation data; a competition module that discards supermodules in accordance with assigned fitness values and saves others in an elitist pool; an evolution module that evolves the supermodules in the elitist pool; and a solution harvesting module providing for deployment of a selected one of the enhanced fixed multitask neural networks, instantiated with supermodules selected fr…
Type: Grant
Filed: December 7, 2018
Date of Patent: May 11, 2021
Assignee: Cognizant Technology Solutions U.S. Corporation
Inventors: Jason Zhi Liang, Elliot Meyerson, Risto Miikkulainen
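The competition module described here, which discards weak supermodules and keeps the rest in an elitist pool, amounts to truncation selection. A minimal sketch, with `compete` and its capacity parameter as illustrative names only:

```python
import heapq

def compete(elitist_pool, candidates, fitness, capacity):
    """Toy competition module: score the current elitist pool together with
    new candidates, discard the weak, keep the best `capacity` supermodules."""
    scored = [(fitness(c), c) for c in elitist_pool + candidates]
    return [c for _, c in heapq.nlargest(capacity, scored, key=lambda p: p[0])]

# Integers stand in for supermodules; fitness is their value.
pool = compete([3, 7], [1, 9, 4, 8, 2], fitness=lambda x: x, capacity=3)
# pool -> [9, 8, 7]
```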
-
Publication number: 20200143243
Abstract: An evolutionary AutoML framework called LEAF optimizes hyperparameters, network architectures, and the size of the network. LEAF makes use of both evolutionary algorithms (EAs) and distributed computing frameworks. A multiobjective evolutionary algorithm is used to maximize the performance and minimize the complexity of the evolved networks simultaneously by calculating the Pareto front given a group of individuals that have been evaluated for multiple objectives.
Type: Application
Filed: November 1, 2019
Publication date: May 7, 2020
Applicant: Cognizant Technology Solutions U.S. Corporation
Inventors: Jason Zhi Liang, Elliot Meyerson, Risto Miikkulainen
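The Pareto-front calculation mentioned in the abstract keeps exactly the individuals not dominated by any other, i.e. no other individual is at least as good on both objectives and strictly better on one. A minimal sketch for the two objectives named here (maximize performance, minimize complexity); the example numbers are invented:

```python
def pareto_front(individuals):
    """Return nondominated (performance, complexity) pairs:
    performance is maximized, complexity is minimized."""
    front = []
    for perf, comp in individuals:
        dominated = any(p >= perf and c <= comp and (p > perf or c < comp)
                        for p, c in individuals)
        if not dominated:
            front.append((perf, comp))
    return front

# Hypothetical networks as (accuracy, parameter count in millions).
front = pareto_front([(0.90, 5.0), (0.92, 8.0), (0.88, 8.0), (0.95, 20.0)])
# (0.88, 8.0) is dominated by (0.92, 8.0); the other three survive.
```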
-
Publication number: 20190180186
Abstract: A system and method for evolving a deep neural network structure that solves a provided problem includes: a memory storing a candidate supermodule genome database having a pool of candidate supermodules having values for hyperparameters for identifying a plurality of neural network modules in the candidate supermodule and further storing fixed multitask neural networks; a training module that assembles and trains N enhanced fixed multitask neural networks and trains each enhanced fixed multitask neural network using training data; an evaluation module that evaluates a performance of each enhanced fixed multitask neural network using validation data; a competition module that discards supermodules in accordance with assigned fitness values and saves others in an elitist pool; an evolution module that evolves the supermodules in the elitist pool; and a solution harvesting module providing for deployment of a selected one of the enhanced fixed multitask neural networks, instantiated with supermodules selected fr…
Type: Application
Filed: December 7, 2018
Publication date: June 13, 2019
Applicant: Sentient Technologies (Barbados) Limited
Inventors: Jason Zhi Liang, Elliot Meyerson, Risto Miikkulainen
-
Publication number: 20190180188
Abstract: Evolution and coevolution of neural networks via multitask learning is described. The foundation is (1) the original soft ordering, which uses a fixed architecture for the modules and a fixed routing (i.e., network topology) that is shared among all tasks. This architecture is then extended in two ways with CoDeepNEAT: (2) by coevolving the module architectures (CM), and (3) by coevolving both the module architectures and a single shared routing for all tasks (CMSR). An alternative evolutionary process (4) keeps the module architecture fixed, but evolves a separate routing for each task during training (CTR). Finally, approaches (2) and (4) are combined into (5), where both modules and task routing are coevolved (CMTR).
Type: Application
Filed: December 13, 2018
Publication date: June 13, 2019
Applicant: Cognizant Technology Solutions U.S. Corporation
Inventors: Jason Zhi Liang, Elliot Meyerson, Risto Miikkulainen
-
Publication number: 20180260713
Abstract: The technology disclosed proposes a novel asynchronous evaluation strategy (AES) that increases throughput of evolutionary algorithms by continuously maintaining a queue of K individuals ready to be sent to the worker nodes for evaluation and evolving the next generation once a fraction Mi of the K individuals have been evaluated by the worker nodes, where Mi << K. A suitable value for Mi is determined experimentally, balancing diversity and efficiency. The technology disclosed is extended to coevolution of deep neural network supermodules and blueprints in the form of AES for cooperative evolution of deep neural networks (CoDeepNEAT-AES). Applied to the image captioning domain, a threefold speedup is observed on 200 graphics processing unit (GPU) worker nodes, demonstrating that the disclosed AES and CoDeepNEAT-AES are promising techniques for evolving complex systems with long and variable evaluation times.
Type: Application
Filed: March 7, 2018
Publication date: September 13, 2018
Applicant: SENTIENT TECHNOLOGIES (BARBADOS) LIMITED
Inventors: Jason Zhi LIANG, Hormoz SHAHRZAD, Babak HODJAT, Risto MIIKKULAINEN
-
Publication number: 20180114115
Abstract: The technology disclosed relates to evolving deep neural network structures. A deep neural network structure includes a plurality of modules with submodules and interconnections among the modules and the submodules. In particular, the technology disclosed relates to storing candidate genomes that identify respective values for a plurality of hyperparameters of a candidate genome. The hyperparameters include global topology hyperparameters, global operational hyperparameters, local topology hyperparameters, and local operational hyperparameters. It further includes evolving the hyperparameters by training, evaluating, and procreating the candidate genomes and corresponding modules and submodules.
Type: Application
Filed: October 26, 2017
Publication date: April 26, 2018
Applicant: Sentient Technologies (Barbados) Limited
Inventors: Jason Zhi LIANG, Risto MIIKKULAINEN
-
Publication number: 20180114116
Abstract: The technology disclosed relates to evolving a deep neural network based solution to a provided problem. In particular, it relates to providing an improved cooperative evolution technique for deep neural network structures. It includes creating blueprint structures that include a plurality of supermodule structures. The supermodule structures include a plurality of modules. The modules are neural networks. A first loop of evolution executes at the blueprint level. A second loop of evolution executes at the supermodule level. Further, multiple mini-loops of evolution execute at each of the subpopulations of the supermodules. The first loop, the second loop, and the mini-loops execute in parallel.
Type: Application
Filed: October 26, 2017
Publication date: April 26, 2018
Inventors: Jason Zhi LIANG, Risto MIIKKULAINEN