Patents by Inventor Yi Tay

Yi Tay has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO). Because the listing is matched by name, it also includes filings by similarly named inventors (e.g., Shao-Yi Tai, Feng-Yi Tai, Chiao-Yi Tai, Pao-Yi Tai, Mei Yi Tay).

  • Patent number: 12346793
    Abstract: A system for performing a machine learning task on a network input is described. The system includes one or more computers and one or more storage devices storing instructions that, when executed by the one or more computers, cause the one or more computers to implement (i) multiple sorting networks in which each sorting network is configured to sort vector blocks in a sequence of vector blocks to generate a sorted sequence of vector blocks; and (ii) a sorting attention neural network configured to perform the machine learning task on the input sequence by executing multiple sorting attention mechanisms using the sorting networks.
    Type: Grant
    Filed: February 8, 2021
    Date of Patent: July 1, 2025
    Assignee: Google LLC
    Inventors: Yi Tay, Liu Yang, Donald Arthur Metzler, Jr., Dara Bahri, Da-Cheng Juan
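The block-sorting step this abstract describes can be sketched minimally. The following toy is an illustrative assumption, not the patented system: it scores fixed-size vector blocks with an arbitrary norm-based rule (the patent's sorting networks are learned components), sorts the blocks, and runs plain dot-product attention over the sorted sequence.

```python
import numpy as np

def sort_blocks(seq, block_size):
    """Split `seq` (n, d) into fixed-size blocks and sort blocks by a score.

    The mean-norm score is a stand-in for a learned sorting network.
    """
    blocks = seq.reshape(-1, block_size, seq.shape[-1])
    scores = np.linalg.norm(blocks, axis=-1).mean(axis=-1)  # one score per block
    order = np.argsort(scores)
    return blocks[order], order

def attention(x):
    """Plain single-head dot-product self-attention with identity projections."""
    scores = x @ x.T / np.sqrt(x.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

rng = np.random.default_rng(0)
seq = rng.normal(size=(8, 4))           # 8 tokens, dimension 4
blocks, order = sort_blocks(seq, 2)     # 4 blocks of 2 tokens, sorted
out = attention(blocks.reshape(8, 4))   # attend over the sorted sequence
```

A real system would run several such sorting attention mechanisms in parallel, each with its own learned sorting network.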
  • Publication number: 20250174907
    Abstract: A multiplexer and an antenna array module applying the multiplexer are provided. The antenna array module includes an array antenna and a beamforming module, and the multiplexer includes a first end and at least two second ends extending toward the first end. One of the first end or the at least two second ends is connected to the array antenna, and the other is connected to the beamforming module; the first end and the at least two second ends are configured to conduct signals between the array antenna and the beamforming module.
    Type: Application
    Filed: November 27, 2024
    Publication date: May 29, 2025
    Inventors: Lung-Ta Chang, Chia-Hsien Chen, Shao-Yi Tai, Ping-Chi Kao, Peng-Hao Huang, Tsung-Bin Chang
  • Publication number: 20250174880
    Abstract: A multiplexer and an antenna array module applying the multiplexer are provided. The antenna array module includes an array antenna and a circuit board having a plurality of layers, and the multiplexer includes a first end, at least two second ends, a connecting portion, and a plurality of conductive portions. The first end and the second ends are arranged on the same layer of the circuit board. The connecting portion is connected between the first end and the second ends and is arranged on a different layer of the circuit board from the layer on which the first end and the at least two second ends are located. The connecting portion is connected to the first end and the second ends through the conductive portions. One of the first end or the at least two second ends is connected to the array antenna for conducting signals of the array antenna.
    Type: Application
    Filed: November 28, 2024
    Publication date: May 29, 2025
    Inventors: Lung-Ta Chang, Chia-Hsien Chen, Shao-Yi Tai, Ping-Chi Kao, Peng-Hao Huang, Tsung-Bin Chang
  • Publication number: 20250165469
    Abstract: Provided are systems and methods for training and/or use of a machine learning model that can directly predict one or more resources that are responsive to a query as an output of the model. In particular, the present disclosure demonstrates that information retrieval can be accomplished with a single machine learning model (e.g., that has a neural network architecture such as, for example, a Transformer architecture) in which all information about the corpus is encoded in the parameters of the model. To this end, the present disclosure introduces the Differentiable Search Index (DSI), a new paradigm that learns a query-to-result (e.g., in text-to-text format) model that will map queries (e.g., text strings) directly to relevant resource identifiers (“docids”).
    Type: Application
    Filed: February 9, 2023
    Publication date: May 22, 2025
    Inventors: Yi Tay, Vinh Quoc Tran, William Weston Cohen, Donald Arthur Metzler, Jr.
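The text-to-text setup the abstract describes, where one model maps queries directly to docids, can be made concrete at the data level. The corpus, queries, and docid strings below are invented for illustration; a real DSI system would train a Transformer on these pairs rather than store them in dictionaries.

```python
# Two kinds of text-to-text training examples feed the same seq2seq model:
# indexing examples memorize the corpus, retrieval examples map queries to docids.
corpus = {
    "doc_17": "transformers use self-attention over token sequences",
    "doc_42": "gear sets transmit torque between concentric shafts",
}
queries = {"what mechanism do transformers use": "doc_17"}

def indexing_examples(corpus):
    # Input is the document text; target is its identifier.
    return [(text, docid) for docid, text in corpus.items()]

def retrieval_examples(queries):
    # Input is the query; target is the relevant identifier.
    return [(q, docid) for q, docid in queries.items()]

training_data = indexing_examples(corpus) + retrieval_examples(queries)
```

At inference time the trained model decodes a docid string token by token, so all corpus knowledge lives in its parameters.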
  • Publication number: 20250156756
    Abstract: An example method for pretraining a machine-learned model is provided. The example method includes obtaining a plurality of different combinations of configuration parameters of a pretraining objective framework. The example method includes generating, using the pretraining objective framework, a plurality of corrupted training examples from one or more training examples, wherein the plurality of corrupted training examples are respectively generated according to the plurality of different combinations. The example method includes inputting the plurality of corrupted training examples into the machine-learned model, wherein the machine-learned model is configured to generate uncorrupted subportions corresponding to corrupted subportions of the corrupted training examples. The example method includes obtaining, from the machine-learned model, a plurality of outputs respectively generated by the machine-learned model based on the plurality of corrupted training examples.
    Type: Application
    Filed: December 30, 2022
    Publication date: May 15, 2025
    Inventors: Yi Tay, Mostafa Dehghani
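The "different combinations of configuration parameters" in this abstract can be sketched as span corruption run under several settings. The parameter values, sentinel format, and masking rule below are assumptions for illustration, not the patented framework.

```python
import random

# Each configuration fixes a mean span length and a corruption rate; the
# same example is corrupted once per configuration.
CONFIGS = [
    {"mean_span": 1, "corrupt_rate": 0.15},  # short spans, light corruption
    {"mean_span": 4, "corrupt_rate": 0.15},  # longer spans
    {"mean_span": 4, "corrupt_rate": 0.50},  # aggressive corruption
]

def corrupt(tokens, mean_span, corrupt_rate, rng):
    """Mask random spans; return (corrupted input, list of (sentinel, span))."""
    tokens = list(tokens)
    budget = max(1, round(len(tokens) * corrupt_rate))
    masked, targets = [], []
    i = 0
    s = 0
    while i < len(tokens):
        if budget > 0 and rng.random() < corrupt_rate * 2:
            span = min(mean_span, budget, len(tokens) - i)
            targets.append((f"<extra_{s}>", tokens[i:i + span]))
            masked.append(f"<extra_{s}>")
            s += 1
            budget -= span
            i += span
        else:
            masked.append(tokens[i])
            i += 1
    if not targets:  # guarantee at least one corrupted span
        targets.append(("<extra_0>", [masked[-1]]))
        masked[-1] = "<extra_0>"
    return masked, targets

rng = random.Random(0)
sentence = "the quick brown fox jumps over the dog".split()
examples = [corrupt(sentence, **cfg, rng=rng) for cfg in CONFIGS]
```

The model is then trained to emit each masked span given its sentinel, and the mixture of configurations exposes it to several corruption regimes at once.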
  • Patent number: 12275258
    Abstract: A printing device and the ribbon mounting mechanism thereof are disclosed. The ribbon mounting mechanism is used for mounting a first ribbon comprising a first core and a second ribbon comprising a second core having a diameter smaller than that of the first core. The ribbon mounting mechanism includes a first shaft supporting the first core of the first ribbon; a second shaft supporting the second core of the second ribbon, wherein the second shaft and the first shaft are concentrically disposed; a torque generator comprising a third shaft; a first gear set connecting the first shaft and the third shaft, and torque generated by the torque generator is transmitted to the first shaft through the first gear set; and a second gear set connecting the second shaft and the third shaft, and torque generated by the torque generator is transmitted to the second shaft through the second gear set.
    Type: Grant
    Filed: May 28, 2023
    Date of Patent: April 15, 2025
    Assignee: GODEX INTERNATIONAL CO., LTD
    Inventors: Feng-Yi Tai, Ching-Yang Chou, Che-Fu Hsu, Chun-Chang Tu
  • Publication number: 20250096091
    Abstract: A substrate comprising at least one dielectric layer, and a plurality of interconnects located at least partially in the at least one dielectric layer, wherein the plurality of interconnects include a plurality of via interconnects, and wherein the plurality of via interconnects include a first via interconnect comprising a first via wall that is approximately vertical.
    Type: Application
    Filed: September 19, 2023
    Publication date: March 20, 2025
    Inventors: Chiao-Yi Tai, Joan Rey Villarba Buot, Hong Bok We
  • Publication number: 20250017847
    Abstract: Implantable depots for delivering therapeutic agents and associated systems and methods are provided. In some embodiments, an implantable depot for treating pain in a subject after a surgical procedure includes a therapeutic region having a first polymer and an analgesic agent, a first control region including a second polymer, and a second control region including a third polymer. The first and second control regions can cover first and second surfaces of the therapeutic region to inhibit release of the analgesic agent therefrom. The depot can include one or more holes extending through the first and second control regions and the therapeutic region to form one or more exposed portions. When implanted in the subject, the implantable depot can release the analgesic agent from a lateral surface of the therapeutic region between the first and second surfaces, and from the exposed portions of the therapeutic region.
    Type: Application
    Filed: September 30, 2022
    Publication date: January 16, 2025
    Inventors: Jackie Joe Hancock, Daniel Boon Lim Seet, Cynthia R. Lee, Koon Kiat Teu, Gregg M. Bishop, Ming Siew Lim, Alicia Mui Shen Ng, Patrick H. Ruane, Mukhtiar Singh, James Su, Mei Yi Tay
  • Publication number: 20240403636
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for executing and training a multi-modal, multi-task self-attention neural network.
    Type: Application
    Filed: October 5, 2022
    Publication date: December 5, 2024
    Inventors: Valerii Likhosherstov, Mostafa Dehghani, Anurag Arnab, Krzysztof Marcin Choromanski, Mario Lucic, Yi Tay
  • Publication number: 20240391263
    Abstract: A printing device and the ribbon mounting mechanism thereof are disclosed. The ribbon mounting mechanism is used for mounting a first ribbon comprising a first core and a second ribbon comprising a second core having a diameter smaller than that of the first core. The ribbon mounting mechanism includes a first shaft supporting the first core of the first ribbon; a second shaft supporting the second core of the second ribbon, wherein the second shaft and the first shaft are concentrically disposed; a torque generator comprising a third shaft; a first gear set connecting the first shaft and the third shaft, and torque generated by the torque generator is transmitted to the first shaft through the first gear set; and a second gear set connecting the second shaft and the third shaft, and torque generated by the torque generator is transmitted to the second shaft through the second gear set.
    Type: Application
    Filed: May 28, 2023
    Publication date: November 28, 2024
    Inventors: Feng-Yi Tai, Ching-Yang Chou, Che-Fu Hsu, Chun-Chang Tu
  • Publication number: 20240289552
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing a machine learning task on an input sequence of characters that has a respective character at each of a plurality of character positions to generate a network output. One of the systems includes a neural network configured to perform the machine learning task, the neural network comprising a gradient-based sub-word tokenizer and an output neural network. The gradient-based sub-word tokenizer is configured to apply a learned, i.e., flexible, sub-word tokenization strategy to the input sequence of characters to generate a sequence of latent sub-word representations. The output neural network is configured to process the latent sub-word representations to generate the network output for the task.
    Type: Application
    Filed: May 27, 2022
    Publication date: August 29, 2024
    Inventors: Yi Tay, Dara Bahri, Donald Arthur Metzler, Jr., Hyung Won Chung, Jai Prakash Gupta, Sebastian Nikolas Ruder, Simon Baumgartner, Vinh Quoc Tran, Zhen Qin
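A learned, differentiable sub-word tokenization as described above can be sketched as soft pooling over candidate character blocks. Everything here, the candidate block sizes, the scoring vector, and the mean-pooling rule, is an assumption for illustration; the patented tokenizer learns its scores end to end by gradient descent rather than using random weights.

```python
import numpy as np

def soft_subwords(chars, block_sizes, score_w):
    """chars: (n, d) character embeddings -> (n, d) latent sub-word reps.

    For each candidate block size, mean-pool the block containing each
    position; then mix the candidates with a softmax over per-position
    scores, so the "tokenization" is soft and differentiable.
    """
    n, d = chars.shape
    candidates = []
    for b in block_sizes:
        pooled = np.stack([chars[(i // b) * b:(i // b) * b + b].mean(axis=0)
                           for i in range(n)])
        candidates.append(pooled)
    cand = np.stack(candidates)              # (B, n, d)
    scores = cand @ score_w                  # (B, n): score per size per position
    weights = np.exp(scores) / np.exp(scores).sum(axis=0, keepdims=True)
    return (weights[..., None] * cand).sum(axis=0)

rng = np.random.default_rng(0)
chars = rng.normal(size=(12, 8))             # 12 characters, dimension 8
latent = soft_subwords(chars, block_sizes=(1, 2, 4), score_w=rng.normal(size=8))
```

Because the mixing weights are a softmax over scores, gradients flow through the tokenization choice, which is what makes the strategy learnable.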
  • Publication number: 20240256965
    Abstract: An example method for training a machine-learned sequence processing model includes obtaining a plurality of training examples for training the machine-learned sequence processing model. For each respective training example of the plurality of training examples, the example method includes: obtaining a respective query associated with the respective training example; inputting the respective query to the machine-learned sequence processing model; obtaining, from the machine-learned sequence processing model a response to the respective query and a trace of intermediate states from the respective query to the response; evaluating the response using a ground truth response associated with the respective training example; evaluating the trace using a ground truth trace associated with the respective training example; and updating one or more parameters of the machine-learned sequence processing model based on the evaluation of the response and based on the evaluation of the trace.
    Type: Application
    Filed: January 26, 2024
    Publication date: August 1, 2024
    Inventors: Hyung Won Chung, Barret Zoph, Dengyong Zhou, Liam Fedus, Shayne Longpre, Le Hou, Yi Tay, Jason Weng Wei, Siddhartha Brahma, Quoc V. Le
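The dual supervision this abstract describes, evaluating both the final response and the trace of intermediate states, can be sketched with stand-in scoring. The exact-match scores and the weighting below are illustrative assumptions; a real system would use token-level losses over model outputs.

```python
def evaluate(predicted, ground_truth):
    """Toy per-step evaluation: fraction of steps that match exactly."""
    matches = sum(p == g for p, g in zip(predicted, ground_truth))
    return matches / max(len(ground_truth), 1)

def training_signal(response, trace, gt_response, gt_trace, trace_weight=0.5):
    """Combine response-level and trace-level evaluations into one signal."""
    response_score = evaluate([response], [gt_response])
    trace_score = evaluate(trace, gt_trace)
    return (1 - trace_weight) * response_score + trace_weight * trace_score

signal = training_signal(
    response="9",
    trace=["3 * 3 = 9", "answer is 9"],
    gt_response="9",
    gt_trace=["3 * 3 = 9", "answer is 9"],
)
```

Parameter updates would then follow the combined signal, so the model is rewarded for producing correct intermediate reasoning, not just a correct final answer.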
  • Publication number: 20240256964
    Abstract: An example method includes obtaining a pretrained machine-learned model that was initially pretrained using a pretraining dataset and further pretraining the model by generating, using a pretraining objective framework, a plurality of corrupted training examples from one or more training examples obtained from the pretraining dataset. A first set of one or more training examples can be corrupted according to a first set of configuration parameters of the pretraining objective framework. A second set can be corrupted according to a second set of configuration parameters of the pretraining objective framework. The example method includes inputting the plurality of corrupted training examples into the model; obtaining, from the model, a plurality of outputs respectively generated by the model based on the plurality of corrupted training examples; and updating one or more parameters of the model based on an evaluation of the plurality of outputs.
    Type: Application
    Filed: January 26, 2024
    Publication date: August 1, 2024
    Inventors: Yi Tay, Mostafa Dehghani
  • Publication number: 20240169184
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating output sequences using auto-regressive decoder neural networks. In particular, during generation, adaptive early exiting is used to reduce the time required to generate the output sequence.
    Type: Application
    Filed: January 29, 2024
    Publication date: May 23, 2024
    Inventors: Tal Schuster, Adam Joshua Fisch, Jai Prakash Gupta, Mostafa Dehghani, Dara Bahri, Vinh Quoc Tran, Yi Tay, Donald Arthur Metzler, Jr.
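The adaptive early exiting described in this abstract can be made concrete as confidence-thresholded layer skipping. The random linear "layers", the tanh nonlinearity, and the 0.9 threshold below are assumptions purely to make the control flow runnable; the patented method applies this inside a trained auto-regressive decoder.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def decode_token(h, layers, readout, threshold=0.9):
    """Return (token, layers_used) using confidence-based early exit.

    After each layer, read off a token distribution; if the top probability
    clears the threshold, skip the remaining layers for this position.
    """
    for depth, layer in enumerate(layers, start=1):
        h = np.tanh(layer @ h)
        probs = softmax(readout @ h)
        if probs.max() >= threshold:      # confident enough: exit early
            return int(probs.argmax()), depth
    return int(probs.argmax()), len(layers)

rng = np.random.default_rng(0)
layers = [rng.normal(size=(16, 16)) for _ in range(6)]
readout = rng.normal(size=(32, 16)) * 3   # sharp logits so early exits can fire
token, used = decode_token(rng.normal(size=16), layers, readout)
```

Easy tokens exit after few layers and hard tokens use the full depth, which is where the generation-time savings come from.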
  • Patent number: 11886976
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating output sequences using auto-regressive decoder neural networks. In particular, during generation, adaptive early exiting is used to reduce the time required to generate the output sequence.
    Type: Grant
    Filed: July 14, 2023
    Date of Patent: January 30, 2024
    Assignee: Google LLC
    Inventors: Tal Schuster, Adam Joshua Fisch, Jai Prakash Gupta, Mostafa Dehghani, Dara Bahri, Vinh Quoc Tran, Yi Tay, Donald Arthur Metzler, Jr.
  • Publication number: 20240020516
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for generating output sequences using auto-regressive decoder neural networks. In particular, during generation, adaptive early exiting is used to reduce the time required to generate the output sequence.
    Type: Application
    Filed: July 14, 2023
    Publication date: January 18, 2024
    Inventors: Tal Schuster, Adam Joshua Fisch, Jai Prakash Gupta, Mostafa Dehghani, Dara Bahri, Vinh Quoc Tran, Yi Tay, Donald Arthur Metzler, Jr.
  • Publication number: 20230244938
    Abstract: An example method for pretraining a machine-learned model is provided. The example method includes obtaining a plurality of different combinations of configuration parameters of a pretraining objective framework. The example method includes generating, using the pretraining objective framework, a plurality of corrupted training examples from one or more training examples, wherein the plurality of corrupted training examples are respectively generated according to the plurality of different combinations. The example method includes inputting the plurality of corrupted training examples into the machine-learned model, wherein the machine-learned model is configured to generate uncorrupted subportions corresponding to corrupted subportions of the corrupted training examples. The example method includes obtaining, from the machine-learned model, a plurality of outputs respectively generated by the machine-learned model based on the plurality of corrupted training examples.
    Type: Application
    Filed: January 27, 2023
    Publication date: August 3, 2023
    Inventors: Jason Weng Wei, Dengyong Zhou, Xuezhi Wang, Dale Eric Schuurmans, Quoc V. Le, Maarten Paul Bosma, Ed Huai-Hsin Chi, Olivier Jean Andrè Bousquet, Le Hou, Charles Aloysius Sutton, Nathanael Martin Schärli, Nathan Kemp Sekiguchi Scales, Augustus Quadrozzi Odena, Sharan Ajit Narang, Guy Gur-Ari Krakover, Aakanksha Chowdhery, David Martin Dohan, Aitor Lewkowycz, Henryk Michalewski, Jiageng Luan, David J. Bieber, Jacob Austin, Anders Johan Andreassen, Maxwell Isaac Nye, Yi Tay, Mostafa Dehghani
  • Publication number: 20220406992
    Abstract: Some embodiments relate to a memory device. The memory device includes a substrate comprising an inter-metal dielectric layer having a metal line, a dielectric layer over the substrate, a bottom electrode via through the dielectric layer and in contact with the metal line, a bottom electrode over the bottom electrode via, a magnetic tunneling junction (MTJ) element over the bottom electrode, and a top electrode over the MTJ element. A center portion of the bottom electrode directly above the bottom electrode via is thicker than an edge portion of the bottom electrode.
    Type: Application
    Filed: April 21, 2022
    Publication date: December 22, 2022
    Inventors: Yi-Cheng Chu, Chung-Te Lin, Kai-Wen Cheng, Han-Ting Tsai, Jung-Tsan Tsai, Pao-Yi Tai, Chien-Hua Huang
  • Publication number: 20220383120
    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a neural network having a plurality of network parameters. One of the methods includes obtaining an unlabeled training input from a set of unlabeled training data; processing the unlabeled training input to generate a first embedding; generating a corrupted version of the unlabeled training input, comprising determining a proper subset of the feature dimensions and, for each feature dimension that is in the proper subset of feature dimensions, applying a corruption to the respective feature in the feature dimension using one or more feature values sampled from a marginal distribution of the feature dimension as specified in the set of unlabeled training data; processing the corrupted version of the unlabeled training input to generate a second embedding; and determining an update to the current values of the plurality of network parameters.
    Type: Application
    Filed: May 27, 2022
    Publication date: December 1, 2022
    Inventors: Dara Bahri, Donald Arthur Metzler, Jr., Hanxi Heinrich Jiang, Yi Tay
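The corruption step in this abstract, replacing a proper subset of feature dimensions with values sampled from each dimension's marginal distribution over the unlabeled data, can be sketched directly. The dataset and the corruption fraction below are illustrative assumptions; the full method goes on to embed both views and contrast them.

```python
import random

def corrupt_example(example, dataset, corrupt_fraction, rng):
    """Return a copy of `example` with some features resampled from marginals."""
    n_dims = len(example)
    k = max(1, int(n_dims * corrupt_fraction))
    dims = rng.sample(range(n_dims), k)   # proper subset of feature dimensions
    corrupted = list(example)
    for d in dims:
        # The marginal of dimension d is that column's empirical values.
        corrupted[d] = rng.choice([row[d] for row in dataset])
    return corrupted

rng = random.Random(0)
data = [[1.0, 10.0, 100.0], [2.0, 20.0, 200.0], [3.0, 30.0, 300.0]]
view = corrupt_example(data[0], data, corrupt_fraction=0.3, rng=rng)
```

Sampling from per-dimension marginals keeps each corrupted feature realistic on its own while breaking its correlation with the rest of the example, which is what makes the corrupted view a useful training pair for the original.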
  • Patent number: 11440694
    Abstract: A test tube preparation device is disclosed in which a labeling device includes a linking module that links a positioning unit. The positioning unit holds a tube body and finely adjusts the position of the tube body relative to the label generating module of the surface treating device to carry out label generation, label conveyance, tube labeling, and tube delivery. The device effectively integrates generating the label with applying it to the tube body before and after labeling, improving the quality of both the tube body and the label. A test tube preparation method is also provided.
    Type: Grant
    Filed: December 8, 2019
    Date of Patent: September 13, 2022
    Inventors: Chien-Hua Chen, Feng-Yi Tai