Patents by Inventor Zhihua Wu

Zhihua Wu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240128518
    Abstract: An electrode assembly and a lithium-ion electric roll having the same are provided. The electrode assembly includes: a first electrode unit; and a first anti-puncture cushion; in which the first electrode unit includes a first electrode sheet, a second electrode sheet, and a separator; the second electrode sheet comprises a second top edge and a second bottom edge along the length direction of the first electrode unit; and an edge of the first anti-puncture cushion extends beyond the second electrode sheet from the second top edge or the second bottom edge along the length direction of the first electrode unit.
    Type: Application
    Filed: December 22, 2023
    Publication date: April 18, 2024
    Applicant: DONGGUAN AMPEREX TECHNOLOGY LIMITED
    Inventors: Junliang ZHU, Haibing WANG, Tongming DONG, Wenqiang CHENG, Baohua CHEN, Shufeng WU, Wei YANG, Zhihua QIN, Meina LIN
  • Publication number: 20240105983
    Abstract: A stacking device may include a negative delivery structure, a separator delivery structure, a first heater and a positive delivery structure. The negative delivery structure may be configured to deliver a negative electrode sheet. The separator delivery structure may be configured to deliver separators to two sides of the negative electrode sheet such that the separators are attached to the negative electrode sheet. The first heater may be arranged downstream of the separator delivery structure and configured to heat the negative electrode sheet and the separators. The positive delivery structure may be arranged downstream of the first heater and configured to deliver positive electrode sheets to the two sides of the negative electrode sheet such that the positive electrode sheets are attached to the separators.
    Type: Application
    Filed: December 8, 2023
    Publication date: March 28, 2024
    Applicant: CONTEMPORARY AMPEREX TECHNOLOGY CO., LIMITED
    Inventors: Ruhu LIAO, Gang ZENG, Zhiyang WU, Jianlei WANG, Ya DAI, Zhihua WEN
  • Publication number: 20240082406
    Abstract: The present invention provides a benzoheterocycle-substituted tetrahydroisoquinoline compound, and in particular relates to a compound shown in formula (I) and a pharmaceutically acceptable salt thereof, and to use of the compound for the treatment of chronic kidney disease.
    Type: Application
    Filed: December 17, 2021
    Publication date: March 14, 2024
    Inventors: Shuchun GUO, Jun FAN, Nan WU, Zhihua FANG, Wenqiang SHI, Yang LIU, Jianbiao PENG, Haibing GUO
  • Publication number: 20230206024
    Abstract: A resource allocation method, including: determining a neural network model to be allocated resources, and determining a set of devices capable of providing resources for the neural network model; determining, based on the set of devices and the neural network model, a first set of evaluation points including a first number of evaluation points, each of which corresponds to one resource allocation scheme and the resource use cost corresponding to that scheme; updating and iterating the first set of evaluation points to obtain a second set of evaluation points including a second number of evaluation points, each of which corresponds to one resource allocation scheme and the resource use cost corresponding to that scheme, the second number being greater than the first number; and selecting the resource allocation scheme with the minimum resource use cost from the second set of evaluation points as the resource allocation scheme for allocating resources to the neural network model.
    Type: Application
    Filed: August 19, 2022
    Publication date: June 29, 2023
    Inventors: Ji Liu, Zhihua Wu, Danlei Feng, Chendi Zhou, Minxu Zhang, Xinxuan Wu, Xuefeng Yao, Dejing Dou, Dianhai Yu, Yanjun Ma
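
The search loop in this abstract (seed a small set of candidate allocation schemes, grow it by iteration, keep the cheapest) can be pictured in a few lines. Below is a minimal, hypothetical Python sketch; the mutation rule and the `resource_cost` model are assumptions for illustration, not the patent's definitions.

```python
# Minimal sketch of the evaluation-point search described above; the device
# names, the mutation rule, and `resource_cost` are illustrative stand-ins.
import random

def resource_cost(scheme, devices):
    # Toy cost: per-device load divided by capacity; a real system would
    # account for memory, compute, and communication cost.
    load = {d: 0 for d in devices}
    for layer, d in enumerate(scheme):
        load[d] += layer + 1
    return sum(l / devices[d] for d, l in load.items())

def allocate(num_layers, devices, first_num=8, second_num=32, iters=20):
    names = list(devices)
    # First set of evaluation points: random allocation schemes, each paired
    # implicitly with its resource use cost via resource_cost().
    points = [tuple(random.choice(names) for _ in range(num_layers))
              for _ in range(first_num)]
    for _ in range(iters):
        # Update and iterate: mutate existing schemes until the set holds the
        # (larger) second number of evaluation points.
        while len(points) < second_num:
            base = list(random.choice(points))
            base[random.randrange(num_layers)] = random.choice(names)
            points.append(tuple(base))
        points = sorted(set(points), key=lambda s: resource_cost(s, devices))
        points = points[:first_num]  # keep the cheapest schemes as seeds
    # Select the scheme with minimum resource use cost.
    return min(points, key=lambda s: resource_cost(s, devices))

print(allocate(num_layers=6, devices={"gpu0": 4.0, "gpu1": 2.0, "cpu0": 1.0}))
```
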
  • Publication number: 20230206080
    Abstract: A model training system includes at least one first cluster and a second cluster communicating with the at least one first cluster. The at least one first cluster is configured to acquire a sample data set, generate training data according to the sample data set, and send the training data to the second cluster; and the second cluster is configured to train a pre-trained model according to the training data sent by the at least one first cluster.
    Type: Application
    Filed: March 7, 2023
    Publication date: June 29, 2023
    Inventors: Shuohuan WANG, Weibao GONG, Zhihua WU, Yu SUN, Siyu DING, Yaqian HAN, Yanbin ZHAO, Yuang LIU, Dianhai YU
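
As a rough illustration of the division of labor described above, the toy sketch below has a "first cluster" that converts raw samples into (input, label) pairs and a "second cluster" that consumes them to fine-tune a one-parameter "pre-trained model". The queue hand-off and the model are illustrative stand-ins, not the patent's architecture.

```python
# Toy sketch of the two-cluster split: data generation on one side, training
# on the other, connected by a channel.
from queue import Queue

class FirstCluster:
    """Acquires a sample data set and turns it into training data."""
    def __init__(self, channel: Queue):
        self.channel = channel
    def run(self, sample_data_set):
        for sample in sample_data_set:
            self.channel.put((sample, sample * 2))  # toy (input, label) pair

class SecondCluster:
    """Trains a pre-trained model on data sent by the first cluster(s)."""
    def __init__(self, channel: Queue, weight=1.0):
        self.channel, self.weight = channel, weight  # "pre-trained" weight
    def train(self, steps, lr=0.01):
        for _ in range(steps):
            x, y = self.channel.get()
            grad = (self.weight * x - y) * x  # gradient of squared error
            self.weight -= lr * grad
        return self.weight

channel = Queue()
FirstCluster(channel).run(sample_data_set=[1.0, 2.0, 3.0] * 10)
print(SecondCluster(channel, weight=0.5).train(steps=30))  # approaches 2.0
```
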
  • Publication number: 20230206075
    Abstract: A method for distributing network layers in a neural network model includes: acquiring a to-be-processed neural network model and a computing device set; generating a target number of distribution schemes according to the network layers in the to-be-processed neural network model and the computing devices in the computing device set, the distribution schemes including correspondences between the network layers and the computing devices; combining, according to the device types of the computing devices, the network layers corresponding to the same device type in each distribution scheme into one stage, to obtain a combination result for each distribution scheme; obtaining an adaptive value of each distribution scheme according to its combination result; and determining a target distribution scheme from the distribution schemes according to their respective adaptive values, and taking the target distribution scheme as the distribution result of the network layers in the to-be-processed neural network model.
    Type: Application
    Filed: November 21, 2022
    Publication date: June 29, 2023
    Applicant: BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD.
    Inventors: Ji LIU, Zhihua WU, Danlei FENG, Minxu ZHANG, Xinxuan WU, Xuefeng YAO, Beichen MA, Dejing DOU, Dianhai YU, Yanjun MA
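
The scheme search above can be illustrated compactly. In the sketch below, the adaptive value (negated heaviest-stage load, favoring balanced stages) and the merging of *consecutive* same-type layers into a stage are assumptions; the patent only specifies that same-device-type layers are combined and that schemes are ranked by an adaptive value.

```python
# Hypothetical sketch: enumerate random layer-to-device schemes, merge runs of
# same-device-type layers into stages, and pick the scheme with the best
# (toy) adaptive value.
import random
from itertools import groupby

def make_schemes(layers, devices, target_number):
    # Each scheme maps every network layer to one computing device.
    return [{l: random.choice(devices) for l in layers}
            for _ in range(target_number)]

def combine_into_stages(scheme, layers, device_type):
    # Merge consecutive layers whose devices share a device type into a stage.
    typed = [(l, device_type[scheme[l]]) for l in layers]
    return [[l for l, _ in group]
            for _, group in groupby(typed, key=lambda p: p[1])]

def adaptive_value(stages, layer_cost):
    # Toy fitness: balanced stages are better, so penalize the heaviest stage.
    return -max(sum(layer_cost[l] for l in stage) for stage in stages)

layers = ["conv1", "conv2", "fc1", "fc2"]
devices = ["gpu0", "gpu1", "cpu0"]
device_type = {"gpu0": "gpu", "gpu1": "gpu", "cpu0": "cpu"}
layer_cost = {"conv1": 4, "conv2": 3, "fc1": 2, "fc2": 1}

schemes = make_schemes(layers, devices, target_number=50)
best = max(schemes, key=lambda s: adaptive_value(
    combine_into_stages(s, layers, device_type), layer_cost))
print(best)
```
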
  • Publication number: 20230169351
    Abstract: A distributed training method based on end-to-end adaptation, a device and a storage medium are provided. The method includes: obtaining slicing results by slicing a model to be trained; obtaining an attribute of the computing resources allocated to the model for training by parsing the computing resources, in which the computing resources are determined based on the computing resource requirement of the model, the computing resources occupied by other models being trained, and the idle computing resources, and the attribute of the computing resources represents at least one of a topology relation and a task processing capability of the computing resources; determining a distribution strategy for each of the slicing results on the computing resources based on the attribute of the computing resources; and performing distributed training on the model using the computing resources based on the distribution strategy.
    Type: Application
    Filed: December 1, 2022
    Publication date: June 1, 2023
    Applicant: BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD.
    Inventors: Haifeng Wang, Zhihua Wu, Dianhai Yu, Yanjun Ma, Tian Wu
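
A rough sketch of the three steps named in this abstract follows: slicing, parsing resource attributes, and deriving a distribution strategy. All function names and the placement heuristic (heavier slices go to more capable devices) are illustrative assumptions, not the patent's method.

```python
# Hypothetical end-to-end sketch: slice a layer list, summarize resource
# attributes, and map slices to devices.
def slice_model(model_layers, num_slices):
    # Slice the model to be trained into contiguous groups of layers.
    k, m = divmod(len(model_layers), num_slices)
    out, i = [], 0
    for s in range(num_slices):
        j = i + k + (1 if s < m else 0)
        out.append(model_layers[i:j])
        i = j
    return out

def parse_resources(resources):
    # "Attribute" of the resources: a task-processing capability per device
    # plus a trivial topology (devices on the same host are close).
    return {d: {"capability": spec["tflops"], "host": spec["host"]}
            for d, spec in resources.items()}

def distribution_strategy(slices, attrs):
    # Place heavier slices on more capable devices.
    ranked = sorted(attrs, key=lambda d: -attrs[d]["capability"])
    order = sorted(range(len(slices)), key=lambda i: -len(slices[i]))
    placement = {idx: ranked[rank % len(ranked)]
                 for rank, idx in enumerate(order)}
    return {tuple(slices[i]): dev for i, dev in placement.items()}

layers = [f"layer{i}" for i in range(7)]
resources = {"gpu0": {"tflops": 19.5, "host": "a"},
             "gpu1": {"tflops": 9.7, "host": "b"}}
print(distribution_strategy(slice_model(layers, 2), parse_resources(resources)))
```
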
  • Publication number: 20220382441
    Abstract: A method and apparatus for constructing a virtual assembly, and a computer-readable storage medium are provided. The method includes: receiving a cutting operation instruction input on a substrate by a user, displaying, on the substrate, a cutting path indicated by the cutting operation instruction, and cutting the substrate into at least two parts; and, upon receiving an assembly instruction input by the user, assembling the at least two parts and displaying the virtual assembly formed by the assembly. Since the parts are determined by the cutting path indicated by the user's cutting operation instruction, the "parts" in the method are not limited by material or shape, and the virtual assembly formed by assembling them is not limited by materials, parts, space, etc. Construction of a virtual assembly is therefore highly flexible, improving the user experience.
    Type: Application
    Filed: August 10, 2022
    Publication date: December 1, 2022
    Inventors: Xin HUANG, Zhihua Wu, Jiaqi Fan, Shentao Wang
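
As a toy illustration of the cut-then-assemble flow, the sketch below cuts a rectangular "substrate" along a straight vertical path and reassembles the two parts at user-chosen offsets; a real implementation would handle arbitrary cutting paths and shapes.

```python
# Toy cut-and-assemble flow on a rectangle; the straight cut and the offset
# "assembly instruction" are simplifying assumptions.
from dataclasses import dataclass

@dataclass
class Part:
    corners: list  # polygon corners as (x, y) tuples

def cut_substrate(width, height, cut_x):
    # The cutting path is a vertical line x = cut_x across the substrate.
    left = Part([(0, 0), (cut_x, 0), (cut_x, height), (0, height)])
    right = Part([(cut_x, 0), (width, 0), (width, height), (cut_x, height)])
    return [left, right]

def assemble(parts, offsets):
    # Assembly instruction: translate each part by its (dx, dy) offset.
    return [Part([(x + dx, y + dy) for x, y in p.corners])
            for p, (dx, dy) in zip(parts, offsets)]

parts = cut_substrate(width=10, height=4, cut_x=6)
assembly = assemble(parts, offsets=[(0, 0), (-6, 4)])  # stack right part on top
for p in assembly:
    print(p.corners)
```
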
  • Publication number: 20220374704
    Abstract: The disclosure provides a neural network training method and apparatus, an electronic device, a medium and a program product, and relates to the field of artificial intelligence, in particular to the fields of deep learning and distributed learning.
    Type: Application
    Filed: December 21, 2021
    Publication date: November 24, 2022
    Applicant: Beijing Baidu Netcom Science Technology Co., Ltd.
    Inventors: Danlei FENG, Long LIAN, Dianhai YU, Xuefeng YAO, Xinxuan WU, Zhihua WU, Yanjun MA
  • Publication number: 20220374713
    Abstract: The present disclosure provides a method and apparatus for performing distributed training on a deep learning model. The method may include: generating a distributed computation view based on data information of a to-be-trained deep learning model; generating a cluster resource view based on property information of a cluster hardware resource corresponding to the to-be-trained deep learning model; determining a target segmentation strategy of a distributed training task based on the distributed computation view and the cluster resource view; and performing distributed training on the to-be-trained deep learning model based on the target segmentation strategy.
    Type: Application
    Filed: August 3, 2022
    Publication date: November 24, 2022
    Inventors: Zhihua WU, Dianhai YU, Yulong AO, Weibao GONG
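
The two "views" can be pictured as plain data summaries feeding a strategy rule. In the sketch below, both views and the fits-in-memory decision rule are illustrative assumptions, not the patent's actual strategy search.

```python
# Hypothetical sketch: build a computation view and a resource view, then
# choose a segmentation strategy from the two.
def computation_view(model):
    # Distributed computation view: per-operator compute and activation sizes.
    return {op: {"flops": f, "activation_mb": a} for op, (f, a) in model.items()}

def resource_view(cluster):
    # Cluster resource view: per-device memory and interconnect bandwidth.
    return {dev: {"mem_gb": m, "bw_gbps": b} for dev, (m, b) in cluster.items()}

def segmentation_strategy(comp, res):
    # Toy rule: if total activations fit on one device, use data parallelism;
    # otherwise segment the model across devices (pipeline parallelism).
    total_act = sum(v["activation_mb"] for v in comp.values()) / 1024
    max_mem = max(v["mem_gb"] for v in res.values())
    return "data_parallel" if total_act < max_mem else "pipeline_parallel"

model = {"embed": (1e9, 4096), "attn": (8e9, 8192), "mlp": (6e9, 8192)}
cluster = {"gpu0": (16, 300), "gpu1": (16, 300)}
print(segmentation_strategy(computation_view(model), resource_view(cluster)))
```
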
  • Publication number: 20220061013
    Abstract: There is provided a method comprising receiving at least one measured signal characteristic from a user equipment, the user equipment being located at a user equipment location; comparing the at least one measured signal characteristic to at least one of a plurality of signal characteristics, each signal characteristic being associated with a respective measurement point; and determining, based on the comparing, a probability that the user equipment location is a first location.
    Type: Application
    Filed: September 17, 2018
    Publication date: February 24, 2022
    Inventors: Jun WANG, Gang SHEN, Liuhai LI, Liang CHEN, Kan LIN, Zhihua WU, Chaojun XU, Jiexing GAO
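
One common way to turn such fingerprint comparisons into a location probability is a Gaussian likelihood over signal distances, sketched below; the weighting scheme is an assumption, since the abstract only requires that the comparison yields a probability.

```python
# Hypothetical fingerprint-localization sketch: compare a measured RSSI vector
# against stored signatures per measurement point, then normalize to
# probabilities.
import math

fingerprints = {           # measurement point -> stored RSSI per base station
    "pointA": [-60.0, -72.0, -80.0],
    "pointB": [-75.0, -61.0, -79.0],
    "pointC": [-82.0, -78.0, -63.0],
}

def location_probabilities(measured, sigma=6.0):
    # Likelihood of each point under a Gaussian noise model, then normalize.
    weights = {}
    for point, sig in fingerprints.items():
        d2 = sum((m - s) ** 2 for m, s in zip(measured, sig))
        weights[point] = math.exp(-d2 / (2 * sigma ** 2))
    total = sum(weights.values())
    return {p: w / total for p, w in weights.items()}

probs = location_probabilities(measured=[-62.0, -70.0, -81.0])
print(max(probs, key=probs.get), probs)  # most probable location: pointA
```
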
  • Publication number: 20220058222
    Abstract: The present disclosure provides a method of processing information, an apparatus of processing information, a method of recommending information, an electronic device, and a storage medium. The method includes: obtaining a tree structure parameter of a tree structure, wherein the tree structure is configured to index an object set used for recommendation; obtaining a classifier parameter of a classifier, wherein the classifier is configured to sequentially predict, from the top layer of the tree structure to the bottom layer, the preference node set in each layer whose probability of being preferred by a user ranks highest, and the preference node set of each layer below the top layer is determined based on the preference node set of the previous layer; and constructing a recalling model based on the tree structure parameter and the classifier parameter.
    Type: Application
    Filed: November 3, 2021
    Publication date: February 24, 2022
    Inventors: Mo CHENG, Dianhai YU, Lin MA, Zhihua WU, Daxiang DONG, Wei TANG
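
The layer-by-layer prediction described here is essentially a beam search over the tree index. The sketch below walks an implicit binary tree, keeping the top-k nodes per layer among the children of the previous layer's survivors; `classifier_score` is a stand-in for the learned per-node preference model.

```python
# Beam search over a tree index, as a sketch of the layer-wise recall above.
import random

def children(node, depth, max_depth):
    # Implicit complete binary tree: children of node n are 2n+1 and 2n+2.
    return [] if depth == max_depth else [2 * node + 1, 2 * node + 2]

def classifier_score(user, node):
    # Stand-in for the learned probability that the user prefers this node.
    random.seed(hash((user, node)) % (2**32))
    return random.random()

def recall(user, max_depth=4, beam=2):
    frontier = [0]  # top layer of the tree
    for depth in range(max_depth):
        # Candidate set of this layer: children of the previous layer's
        # preference node set; keep the beam highest-scoring nodes.
        cand = [c for n in frontier for c in children(n, depth, max_depth)]
        frontier = sorted(cand, key=lambda n: classifier_score(user, n))[-beam:]
    return frontier  # leaf nodes correspond to recalled items

print(recall(user="u42"))
```
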
  • Publication number: 20220036241
    Abstract: The present disclosure discloses a method, an apparatus and a storage medium for training a deep learning framework, and relates to artificial intelligence fields such as deep learning and big data processing. The specific implementation solution is: acquiring, when a target task meets a training start condition, at least one task node in a current task node cluster that meets a preset opening condition; judging whether the number of such task nodes is greater than or equal to a preset number; synchronously training the deep learning framework of the target task on the at least one task node according to sample data if that number is greater than or equal to the preset number; and acquiring the synchronously trained target deep learning framework when the target task meets a training completion condition.
    Type: Application
    Filed: October 14, 2021
    Publication date: February 3, 2022
    Applicant: BEIJING BAIDU NETCOM SCIENCE TECHNOLOGY CO., LTD.
    Inventors: Tianjian He, Dianhai Yu, Zhihua Wu, Daxiang Dong, Yanjun Ma
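
The gating logic in this abstract (collect eligible nodes, check the count against a preset number, then train synchronously) can be sketched as below; node discovery and the "synchronous" averaging step are simulated with toy values, and all names are illustrative.

```python
# Hypothetical sketch of gated synchronous training across task nodes.
def eligible_nodes(cluster, opening_condition):
    # Acquire the task nodes in the current cluster that meet the condition.
    return [n for n, state in cluster.items() if opening_condition(state)]

def train_framework(cluster, preset_number, steps):
    nodes = eligible_nodes(cluster, lambda s: s["idle_mem_gb"] >= 8)
    if len(nodes) < preset_number:
        raise RuntimeError("not enough task nodes to start synchronous training")
    params = 0.0
    for _ in range(steps):
        # Synchronous step: every node computes a toy gradient, and all
        # gradients are averaged before the shared parameters are updated.
        grads = [params - cluster[n]["target"] for n in nodes]
        params -= 0.5 * sum(grads) / len(grads)
    return params  # the synchronously trained result

cluster = {"node1": {"idle_mem_gb": 12, "target": 3.0},
           "node2": {"idle_mem_gb": 16, "target": 5.0},
           "node3": {"idle_mem_gb": 4,  "target": 9.0}}
print(train_framework(cluster, preset_number=2, steps=50))  # approaches 4.0
```
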
  • Publication number: 20210374542
    Abstract: The invention discloses a method and an apparatus for updating parameters of a multi-task model. The method includes: obtaining a training sample set, in which the training sample set comprises a plurality of samples and the task to which each sample belongs; putting each sample sequentially into a corresponding sample queue according to the task to which it belongs; when the number of samples in a sample queue reaches a training data requirement, training, with the samples in that queue, the shared network layer of the multi-task model and the target sub-network layer of the tasks associated with the queue, so as to generate a model parameter update gradient corresponding to those tasks; and updating the parameters of the shared network layer and the target sub-network layer in a parameter server according to the model parameter update gradient.
    Type: Application
    Filed: August 9, 2021
    Publication date: December 2, 2021
    Applicant: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY CO., LTD.
    Inventors: Wenhui ZHANG, Dianhai YU, Zhihua WU
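
The queueing rule above can be sketched in a few lines: samples are routed into per-task queues, and an update is produced only once a queue holds a full batch, touching the shared parameters and that task's parameters. The "parameter server" here is just a dict and the gradients are toy values; both are illustrative assumptions.

```python
# Sketch of per-task sample queues gating shared/task-specific updates.
from collections import defaultdict

BATCH = 3
queues = defaultdict(list)
param_server = {"shared": 0.0, "task_a": 0.0, "task_b": 0.0}

def toy_gradients(batch, task):
    # Stand-in for backprop through the shared layer and the task sub-network.
    g = sum(batch) / len(batch)
    return {"shared": g, task: g}

def feed(sample, task, lr=0.1):
    queues[task].append(sample)                 # route the sample by its task
    if len(queues[task]) >= BATCH:              # training data requirement met
        batch, queues[task] = queues[task][:BATCH], queues[task][BATCH:]
        for name, grad in toy_gradients(batch, task).items():
            param_server[name] -= lr * grad     # update shared + task params

for sample, task in [(1.0, "task_a"), (2.0, "task_b"), (3.0, "task_a"),
                     (4.0, "task_a"), (5.0, "task_b"), (6.0, "task_b")]:
    feed(sample, task)
print(param_server)
```
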
  • Patent number: D954502
    Type: Grant
    Filed: May 16, 2021
    Date of Patent: June 14, 2022
    Inventor: Zhihua Wu
  • Patent number: D1016710
    Type: Grant
    Filed: October 27, 2021
    Date of Patent: March 5, 2024
    Assignee: CITIC Dicastal Co., Ltd.
    Inventors: Zhichong Liu, Zuo Xu, Hanqi Wu, Zhihua Zhu
  • Patent number: D1016711
    Type: Grant
    Filed: October 27, 2021
    Date of Patent: March 5, 2024
    Assignee: CITIC Dicastal Co., Ltd.
    Inventors: Zhichong Liu, Zuo Xu, Hanqi Wu, Zhihua Zhu
  • Patent number: D1018416
    Type: Grant
    Filed: October 27, 2021
    Date of Patent: March 19, 2024
    Assignee: CITIC Dicastal Co., Ltd.
    Inventors: Zhichong Liu, Zuo Xu, Hanqi Wu, Zhihua Zhu
  • Patent number: D1018421
    Type: Grant
    Filed: October 27, 2021
    Date of Patent: March 19, 2024
    Assignee: CITIC Dicastal Co., Ltd.
    Inventors: Zhichong Liu, Zuo Xu, Hanqi Wu, Zhihua Zhu
  • Patent number: D1024494
    Type: Grant
    Filed: December 6, 2022
    Date of Patent: April 30, 2024
    Inventor: Zhihua Wu