Patents by Inventor Hanxiao Liu

Hanxiao Liu has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240104262
    Abstract: A method for scaling a vehicle crash dummy model includes: obtaining a height ratio, a girth ratio, and a mass ratio of each part of a body shape of a target population relative to a reference population; calculating part scaling ratios of the target population to the reference population in a height direction and in each direction in a girth plane according to the height ratio, the girth ratio, and the mass ratio; scaling simulation models of each part of a dummy simulation model of the reference population according to each scaling ratio; and assembling the scaled models of each part to obtain a dummy simulation model of the target population. This improves how faithfully the dummy's shape simulates a real human body.
    Type: Application
    Filed: March 27, 2023
    Publication date: March 28, 2024
    Applicants: CHINA AUTOMOTIVE TECHNOLOGY AND RESEARCH CENTER CO., LTD, CATARC AUTOMOTIVE TEST CENTER (TIANJIN) CO., LTD
    Inventors: Zhixin LIU, Zhixin WU, Weidong LIU, Hong ZHENG, Kai WANG, Hong CHEN, Bingxu DUAN, Hanxiao ZHANG
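    Illustrative sketch (not from the patent): the scaling step above, in numpy, assuming a simple relation in which the height ratio sets the height-direction scale and the girth ratio sets the girth-plane scale, with the mass ratio used only as a consistency check; the function names and the exact relation among the three ratios are assumptions for illustration.
      import numpy as np

      def part_scale_factors(height_ratio, girth_ratio, mass_ratio):
          # Height-direction scale follows the height ratio; girth-plane
          # scale follows the girth ratio. For uniform density, mass should
          # then scale by roughly lam_xy**2 * lam_z; the patent's method
          # reconciles the measured mass ratio with these geometric scales.
          lam_z, lam_xy = height_ratio, girth_ratio
          implied_mass_ratio = lam_xy ** 2 * lam_z
          return lam_z, lam_xy, implied_mass_ratio

      def scale_part(vertices, lam_z, lam_xy):
          # vertices: (N, 3) array of one part's (x, y, z) mesh coordinates.
          return vertices * np.array([lam_xy, lam_xy, lam_z])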
  • Patent number: 11907853
    Abstract: A computer-implemented method for automatically determining a neural network architecture represents a neural network architecture as a data structure defining a hierarchical set of directed acyclic graphs in multiple levels. Each graph has an input, an output, and a plurality of nodes between the input and the output. At each level, a corresponding set of the nodes are connected pairwise by directed edges which indicate operations performed on outputs of one node to generate an input to another node. Each level is associated with a corresponding set of operations. At the lowest level, the operations associated with each edge are selected from a set of primitive operations. The method includes repeatedly generating new sample neural network architectures by modifying previously generated architectures, and evaluating their fitness. Each modification is performed by selecting a level, selecting two nodes at that level, and modifying, removing, or adding an edge between those nodes according to operations associated with lower levels of the hierarchy.
    Type: Grant
    Filed: October 26, 2018
    Date of Patent: February 20, 2024
    Assignee: DeepMind Technologies Limited
    Inventors: Chrisantha Thomas Fernando, Karen Simonyan, Koray Kavukcuoglu, Hanxiao Liu, Oriol Vinyals
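    Illustrative sketch (not from the patent): the mutation step described above, for a toy hierarchy encoded as Python dicts; the data layout, the PRIMITIVES list, and the treatment of edge removal as a "none" operation are assumptions for illustration.
      import random

      # Level 0 holds primitive operations; each higher level is a list of
      # DAG "motifs" mapping an ordered node pair (i, j), i < j, to an
      # operation index drawn from the level below.
      PRIMITIVES = ["conv3x3", "conv1x1", "maxpool3x3", "identity", "none"]

      def mutate(hierarchy):
          # Pick a level above the primitives, pick two nodes in one of its
          # motifs, and rewrite the edge between them to a random operation
          # from the level below ("none" removes the edge; assigning an op
          # to a previously absent pair adds one).
          level = random.randrange(1, len(hierarchy))
          motif = random.choice(hierarchy[level])
          i, j = sorted(random.sample(range(motif["n_nodes"]), 2))
          n_ops = len(PRIMITIVES) if level == 1 else len(hierarchy[level - 1])
          motif["edges"][(i, j)] = random.randrange(n_ops)
          return hierarchy

      hierarchy = [
          None,  # level 0: see PRIMITIVES
          [{"n_nodes": 4, "edges": {(0, 1): 0, (1, 2): 2, (2, 3): 3}}],
      ]
      mutate(hierarchy)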
  • Publication number: 20230359862
    Abstract: A computer-implemented method for performing computer vision with reduced computational cost and improved accuracy can include obtaining, by a computing system including one or more computing devices, input data comprising an input tensor having one or more dimensions, providing, by the computing system, the input data to a machine-learned convolutional attention network, the machine-learned convolutional attention network including two or more network stages, and, in response to providing the input data to the machine-learned convolutional attention network, receiving, by the computing system, a machine-learning prediction from the machine-learned convolutional attention network. The convolutional attention network can include at least one attention block, wherein the attention block includes a relative attention mechanism, the relative attention mechanism including the sum of a static convolution kernel with an adaptive attention matrix.
    Type: Application
    Filed: July 19, 2023
    Publication date: November 9, 2023
    Inventors: Zihang Dai, Mingxing Tan, Quoc V. Le, Hanxiao Liu
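    Illustrative sketch (not from the patent): the relative attention mechanism named above, in numpy, with the static convolution kernel realized as an input-independent relative-position bias added to the adaptive query-key logits before the softmax; the shapes and the single-head setup are assumptions for illustration.
      import numpy as np

      def relative_attention(q, k, v, rel_bias):
          # q, k, v: (seq, dim); rel_bias: (seq, seq) static kernel indexed
          # by relative position and shared across inputs. The logits are
          # the sum of the adaptive content term and the static kernel.
          logits = q @ k.T / np.sqrt(q.shape[-1]) + rel_bias
          weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
          weights /= weights.sum(axis=-1, keepdims=True)
          return weights @ v

      seq, dim = 8, 16
      rng = np.random.default_rng(0)
      q, k, v = (rng.normal(size=(seq, dim)) for _ in range(3))
      table = rng.normal(size=2 * seq - 1)  # 1-D relative-position table
      rel_bias = np.array([[table[i - j + seq - 1] for j in range(seq)]
                           for i in range(seq)])
      out = relative_attention(q, k, v, rel_bias)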
  • Patent number: 11755883
    Abstract: A computer-implemented method for performing computer vision with reduced computational cost and improved accuracy can include obtaining, by a computing system including one or more computing devices, input data comprising an input tensor having one or more dimensions, providing, by the computing system, the input data to a machine-learned convolutional attention network, the machine-learned convolutional attention network including two or more network stages, and, in response to providing the input data to the machine-learned convolutional attention network, receiving, by the computing system, a machine-learning prediction from the machine-learned convolutional attention network. The convolutional attention network can include at least one attention block, wherein the attention block includes a relative attention mechanism, the relative attention mechanism including the sum of a static convolution kernel with an adaptive attention matrix.
    Type: Grant
    Filed: May 27, 2022
    Date of Patent: September 12, 2023
    Assignee: GOOGLE LLC
    Inventors: Zihang Dai, Hanxiao Liu, Mingxing Tan, Quoc V. Le
  • Publication number: 20230176840
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for compiler optimizations using a compiler optimization network. One of the methods includes receiving an input program, wherein the input program defines a graph of operation modules, wherein each node in the graph is a respective operation module, and each edge between nodes in the graph represents one operation module receiving the output generated by another operation module. The input program is processed by a compiler optimization network comprising a graph-embedding network that is configured to encode operation features and operation dependencies of the operation modules of the input program into a graph embedding representation and a policy network that is configured to generate an optimization action for each of one or more nodes encoded in the graph embedding representation.
    Type: Application
    Filed: June 7, 2021
    Publication date: June 8, 2023
    Inventors: Yanqi Zhou, Sudip Roy, Amirali Abdolrashidi, Daniel Lin-Kit Wong, Chao Ma, Qiumin Xu, Hanxiao Liu, Phitchaya Mangpo Phothilimthana, Shen Wang, Anna Darling Goldie, Azalia Mirhoseini, James Laudon
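    Illustrative sketch (not from the patent): a toy version of the two pieces named above, in numpy; a few rounds of message passing stand in for the graph-embedding network, and a per-node softmax head stands in for the policy network. All shapes, names, and the choice of action (e.g., a device assignment per operation) are assumptions for illustration.
      import numpy as np

      def graph_embed(features, adjacency, w, steps=2):
          # Toy message passing: mix each node's features with the mean of
          # its neighbors' features, then apply a shared linear map + ReLU.
          deg = adjacency.sum(axis=-1, keepdims=True).clip(min=1)
          h = features
          for _ in range(steps):
              h = np.maximum((h + adjacency @ h / deg) @ w, 0.0)
          return h

      def policy(h, w_out):
          # Per-node distribution over candidate optimization actions.
          logits = h @ w_out
          e = np.exp(logits - logits.max(axis=-1, keepdims=True))
          return e / e.sum(axis=-1, keepdims=True)

      rng = np.random.default_rng(0)
      n, d, n_actions = 5, 8, 3
      feats = rng.normal(size=(n, d))
      adj = np.triu(rng.integers(0, 2, size=(n, n)), k=1).astype(float)  # DAG
      h = graph_embed(feats, adj, 0.1 * rng.normal(size=(d, d)))
      actions = policy(h, rng.normal(size=(d, n_actions))).argmax(axis=-1)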
  • Publication number: 20230154161
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using memory-optimized contrastive learning to train image encoder and text encoder neural networks.
    Type: Application
    Filed: November 16, 2022
    Publication date: May 18, 2023
    Inventors: Hieu Hy Pham, Zihang Dai, Golnaz Ghiasi, Hanxiao Liu, Wei Yu, Mingxing Tan, Quoc V. Le
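    Illustrative sketch (not from the patent): the underlying contrastive objective, in numpy, as a symmetric cross-entropy over the in-batch image-text similarity matrix; the memory optimizations that are the patent's actual subject are not shown, and the temperature value is an assumption.
      import numpy as np

      def contrastive_loss(img_emb, txt_emb, temperature=0.07):
          # img_emb, txt_emb: (batch, dim); matched pairs share a row index.
          img = img_emb / np.linalg.norm(img_emb, axis=-1, keepdims=True)
          txt = txt_emb / np.linalg.norm(txt_emb, axis=-1, keepdims=True)
          logits = img @ txt.T / temperature

          def xent(l):
              # Cross-entropy with the diagonal (matched pair) as the target.
              l = l - l.max(axis=-1, keepdims=True)
              logp = l - np.log(np.exp(l).sum(axis=-1, keepdims=True))
              return -np.diag(logp).mean()

          return 0.5 * (xent(logits) + xent(logits.T))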
  • Publication number: 20230121404
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for searching for an architecture for an activation-normalization layer to be included in a neural network to replace a set of layers that receive a layer input comprising a plurality of values, apply one or more normalization operations to the values in the layer input to generate a normalized layer input, and apply an element-wise activation function to the normalized layer input to generate a layer output.
    Type: Application
    Filed: February 8, 2021
    Publication date: April 20, 2023
    Inventors: Hanxiao Liu, Quoc V. Le, Andrew Brock, Karen Simonyan
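    Illustrative sketch (not from the patent): the fixed normalization-plus-activation composition that the searched layer replaces, in numpy; the search itself, and what it discovers, are the patent's subject and are not shown. Batch-axis statistics and ReLU are assumptions for illustration.
      import numpy as np

      def norm_then_activate(x, gamma, beta, eps=1e-5):
          # x: (batch, features). Normalize the layer input, then apply an
          # element-wise activation; the searched architectures replace this
          # fixed composition with learned alternatives.
          mean = x.mean(axis=0, keepdims=True)
          var = x.var(axis=0, keepdims=True)
          normalized = (x - mean) / np.sqrt(var + eps) * gamma + beta
          return np.maximum(normalized, 0.0)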
  • Publication number: 20220405579
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for selecting a neural network to perform a particular machine learning task while satisfying a set of constraints.
    Type: Application
    Filed: March 3, 2021
    Publication date: December 22, 2022
    Inventors: Jiahui Yu, Pengchong Jin, Hanxiao Liu, Gabriel Mintzer Bender, Pieter-Jan Kindermans, Mingxing Tan, Xiaodan Song, Ruoming Pang, Quoc V. Le
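    Illustrative sketch (not from the patent): one simple way to select a network under constraints, filtering candidates that satisfy every limit and keeping the most accurate survivor; the candidate properties, constraint form, and selection rule are assumptions for illustration.
      def select_network(candidates, constraints):
          # candidates: dicts of measured properties plus a quality score;
          # constraints: property name -> maximum allowed value.
          feasible = [c for c in candidates
                      if all(c[k] <= limit for k, limit in constraints.items())]
          if not feasible:
              raise ValueError("no candidate satisfies every constraint")
          return max(feasible, key=lambda c: c["accuracy"])

      best = select_network(
          [{"accuracy": 0.78, "latency_ms": 12.0, "params_m": 5.1},
           {"accuracy": 0.81, "latency_ms": 19.0, "params_m": 7.4}],
          {"latency_ms": 15.0, "params_m": 8.0},
      )  # -> the first candidate; the second violates the latency budget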
  • Publication number: 20220383119
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing a machine learning task on a network input to generate a network output. One of the systems includes an attention neural network configured to perform the machine learning task. The attention neural network includes one or more attention layers that each include a squared ReLU activation layer, a depth-wise convolution layer, or both.
    Type: Application
    Filed: May 27, 2022
    Publication date: December 1, 2022
    Inventors: David Richard So, Quoc V. Le, Hanxiao Liu, Wojciech Andrzej Manke, Zihang Dai, Noam M. Shazeer
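    Illustrative sketch (not from the patent): the two components named above, in numpy; the causal padding and the single-filter-per-channel layout are assumptions for illustration.
      import numpy as np

      def squared_relu(x):
          # max(x, 0) ** 2, the activation named in the abstract.
          return np.maximum(x, 0.0) ** 2

      def depthwise_conv1d(x, kernels):
          # x: (seq, channels); kernels: (k, channels), one filter per
          # channel, applied causally along the sequence dimension.
          k = kernels.shape[0]
          padded = np.pad(x, ((k - 1, 0), (0, 0)))
          return np.stack([(padded[t:t + k] * kernels).sum(axis=0)
                           for t in range(x.shape[0])])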
  • Publication number: 20220383069
    Abstract: A computer-implemented method for performing computer vision with reduced computational cost and improved accuracy can include obtaining, by a computing system including one or more computing devices, input data comprising an input tensor having one or more dimensions, providing, by the computing system, the input data to a machine-learned convolutional attention network, the machine-learned convolutional attention network including two or more network stages, and, in response to providing the input data to the machine-learned convolutional attention network, receiving, by the computing system, a machine-learning prediction from the machine-learned convolutional attention network. The convolutional attention network can include at least one attention block, wherein the attention block includes a relative attention mechanism, the relative attention mechanism including the sum of a static convolution kernel with an adaptive attention matrix.
    Type: Application
    Filed: May 27, 2022
    Publication date: December 1, 2022
    Inventors: Zihang Dai, Hanxiao Liu, Mingxing Tan, Quoc V. Le
  • Publication number: 20220367052
    Abstract: Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing a machine learning task on a network input to generate a network output. In one aspect, one of the systems includes a neural network configured to perform the machine learning task, the neural network including one or more blocks that each include a feedforward spatial transformation unit.
    Type: Application
    Filed: May 16, 2022
    Publication date: November 17, 2022
    Inventors: Hanxiao Liu, David Richard So, Quoc V. Le, Zihang Dai
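    Illustrative sketch (not from the patent): one plausible feedforward spatial transformation unit, in numpy, which splits the channels, mixes one half across the sequence (spatial) dimension with a learned linear map, and uses the result to gate the other half; the split-and-gate layout and the near-identity initialization are assumptions for illustration.
      import numpy as np

      def spatial_gating_unit(x, w_spatial, b_spatial=1.0):
          # x: (seq, channels). The (seq, seq) map w_spatial mixes across
          # positions, not channels, so the unit is feedforward yet spatial.
          u, v = np.split(x, 2, axis=-1)
          return u * (w_spatial @ v + b_spatial)

      seq, ch = 8, 16
      rng = np.random.default_rng(0)
      x = rng.normal(size=(seq, ch))
      w = np.eye(seq) + 0.01 * rng.normal(size=(seq, seq))  # near identity
      out = spatial_gating_unit(x, w)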
  • Publication number: 20200293899
    Abstract: A computer-implemented method for automatically determining a neural network architecture represents a neural network architecture as a data structure defining a hierarchical set of directed acyclic graphs in multiple levels. Each graph has an input, an output, and a plurality of nodes between the input and the output. At each level, a corresponding set of the nodes are connected pairwise by directed edges which indicate operations performed on outputs of one node to generate an input to another node. Each level is associated with a corresponding set of operations. At the lowest level, the operations associated with each edge are selected from a set of primitive operations. The method includes repeatedly generating new sample neural network architectures by modifying previously generated architectures, and evaluating their fitness. Each modification is performed by selecting a level, selecting two nodes at that level, and modifying, removing, or adding an edge between those nodes according to operations associated with lower levels of the hierarchy.
    Type: Application
    Filed: October 26, 2018
    Publication date: September 17, 2020
    Inventors: Chrisantha Thomas Fernando, Karen Simonyan, Koray Kavukcuoglu, Hanxiao Liu, Oriol Vinyals