Patents by Inventor Jianfeng Gao

Jianfeng Gao has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250228137
    Abstract: The present disclosure relates to the field of microelectronic manufacturing technology, in particular to a SOT-MRAM memory cell and a method of manufacturing a SOT-MRAM memory cell. The SOT-MRAM memory cell includes a bottom electrode layer, a magnetic tunnel junction, an antiferromagnetic layer and a top electrode layer provided sequentially from bottom to top, where the magnetic tunnel junction includes a free layer, a tunneling layer and a pinning layer, and the bottom electrode layer is a stack of an odd number of layers, including at least one W metal layer and at least one Ta metal layer.
    Type: Application
    Filed: December 16, 2024
    Publication date: July 10, 2025
    Inventors: Jianfeng GAO, Meiyin YANG, Weibing LIU, Tao YANG, Junfeng LI, Jun LUO
  • Publication number: 20250191930
    Abstract: The present disclosure discloses a deep ultraviolet lithography method, a lithography pattern and a semiconductor structure, relating to the field of deep ultraviolet lithography technology. The deep ultraviolet lithography method includes: when a depth of field of the deep ultraviolet lithography is less than a height of a step of the substrate, dividing a pattern to be photoetched into at least two portions according to the distribution of the step, where each of the at least two portions corresponds to an on-step pattern or an off-step pattern of the step; assigning the at least two portions of the pattern to be photoetched to at least two masks respectively; and simultaneously baking and developing the at least two masks after they are exposed sequentially.
    Type: Application
    Filed: November 4, 2024
    Publication date: June 12, 2025
    Inventors: Xiaobin He, Junfeng Li, Tingting Li, Jinbiao Liu, Jianfeng Gao, Tao Yang, Jun Luo
  • Publication number: 20250165792
    Abstract: This document relates to training of machine learning models such as neural networks. One example method involves providing a machine learning model having one or more layers and associated parameters and performing a pretraining stage on the parameters of the machine learning model to obtain pretrained parameters. The example method also involves performing a tuning stage on the machine learning model by using labeled training examples to tune the pretrained parameters. The tuning stage can include performing noise adjustment of the labeled training examples to obtain noise-adjusted training examples. The tuning stage can also include adjusting the pretrained parameters based at least on the labeled training examples and the noise-adjusted training examples to obtain adapted parameters. The example method can also include outputting a tuned machine learning model having the adapted parameters.
    Type: Application
    Filed: January 22, 2025
    Publication date: May 22, 2025
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Xiaodong LIU, Jianfeng GAO, Pengcheng HE, Weizhu CHEN
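The tuning stage described in the abstract above can be sketched in a few lines. Everything below is illustrative: the one-weight logistic model, the hyperparameters, and the data are stand-ins, not the patented system; only the pattern of pairing each labeled example with a noise-adjusted copy follows the abstract.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def tune(w, b, data, noise_std=0.1, lr=0.5, epochs=200):
    """Gradient steps on each labeled example and a noise-adjusted copy of it."""
    for _ in range(epochs):
        for x, y in data:
            for xv in (x, x + random.gauss(0.0, noise_std)):  # clean + noisy view
                p = sigmoid(w * xv + b)
                w -= lr * (p - y) * xv
                b -= lr * (p - y)
    return w, b

# "Pretrained" parameters (stand-ins) adapted on a toy 1-D labeled task.
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]
w, b = tune(0.0, 0.0, data)

def predict(x):
    return int(sigmoid(w * x + b) > 0.5)
```

The noise-adjusted copies act as a simple consistency regularizer: the tuned model is pushed to make the same prediction in a small neighborhood of each labeled point.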
  • Patent number: 12299579
    Abstract: This document relates to training of machine learning models. One example method involves providing a machine learning model having one or more mapping layers. The one or more mapping layers can include at least a first mapping layer configured to map components of pretraining examples into first representations in a space. The example method also includes performing a pretraining stage on the one or more mapping layers using the pretraining examples. The pretraining stage can include adding noise to the first representations of the components of the pretraining examples to obtain noise-adjusted first representations. The pretraining stage can also include performing a self-supervised learning process to pretrain the one or more mapping layers using at least the first representations of the pretraining examples and the noise-adjusted first representations of the pretraining examples.
    Type: Grant
    Filed: September 26, 2023
    Date of Patent: May 13, 2025
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Xiaodong Liu, Hao Cheng, Yu Wang, Jianfeng Gao, Weizhu Chen, Pengcheng He, Hoifung Poon
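The pretraining stage above, adding noise to the representations and then learning self-supervised from both views, can be illustrated with a toy linear mapping layer. The denoising-reconstruction objective and all constants here are assumptions for illustration, not the claimed method.

```python
import random

random.seed(1)

def pretrain(examples, noise_std=0.3, lr=0.02, steps=2000):
    """Pretrain a toy one-weight mapping layer with noise on its representations."""
    w, v = 0.5, 0.5  # encoder (mapping layer) and decoder weights
    for _ in range(steps):
        x = random.choice(examples)
        h = w * x                                   # first representation
        h_noisy = h + random.gauss(0.0, noise_std)  # noise-adjusted representation
        x_hat = v * h_noisy                         # self-supervised reconstruction
        err = x_hat - x
        v -= lr * err * h_noisy
        w -= lr * err * v * x  # backprop through encoder; noise is additive
    return w, v

w, v = pretrain([1.0, 2.0, -1.0, -2.0])
```

After pretraining, the encoder/decoder pair approximately inverts itself on clean inputs, with a slight shrinkage induced by the representation noise.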
  • Publication number: 20250149339
    Abstract: A conformal boron doping method for a three-dimensional structure includes the steps of: removing a natural oxide layer on a surface of a silicon-based three-dimensional substrate; forming a buffer layer on the surface of the silicon-based three-dimensional substrate; forming a boron oxide thin film on the alumina buffer layer; covering a surface of the boron oxide thin film with a passivation layer; and driving boron impurities from the boron oxide into the silicon-based three-dimensional substrate through the buffer layer by using laser or rapid annealing, to dope the silicon-based three-dimensional substrate. Selecting suitable boron source precursors and oxidants solves the problems of difficult nucleation and of boron oxide films failing to grow beyond a certain thickness. By selecting alumina as the passivation layer, it is possible to protect the boron oxide thin film from being damaged, and thus achieve damage-free diffusion doping during laser or rapid annealing processes.
    Type: Application
    Filed: December 28, 2023
    Publication date: May 8, 2025
    Inventors: Jianfeng Gao, Shuai Yang, Jinbiao Liu, Weibing Liu, Junfeng Li, Jun Luo, Jinjuan Xiang
  • Publication number: 20250120108
    Abstract: A method for fabricating a GAA nanosheet structure, comprising: forming at least two channel layers and at least one sacrificial layer alternately stacked on a substrate to form a channel stack; forming, on the substrate, a dummy gate astride the channel stack; forming a first sidewall on a surface of the dummy gate; etching the sacrificial layer to form a recess at a side surface of the channel stack; forming a second sidewall within the recess; forming a source and a drain at two sides of the channel stack; in response to a channel layer being in contact with the dummy gate, etching the dummy gate and the channel layer to expose the at least one sacrificial layer, and then etching the at least one sacrificial layer to form a space for manufacturing a surrounding gate; and forming a metallic surrounding gate in the space.
    Type: Application
    Filed: November 27, 2023
    Publication date: April 10, 2025
    Inventors: Na ZHOU, Junjie LI, Jianfeng GAO, Tao YANG, Junfeng LI, Jun LUO
  • Publication number: 20250081530
    Abstract: A semiconductor device and a method for manufacturing the same are provided. The method comprises: providing a substrate; forming a fin, a dummy gate, a first spacer, and a hard mask on a surface of the substrate; etching the substrate to form a groove located directly beneath the fin and running through a second spacer; forming, in the groove, a filling layer made of an insulating dielectric material whose thermal conductivity is higher than that of the substrate; removing the second spacer through etching; removing two opposite ends of each sacrificial layer to form cavities; filling the cavities to form inner spacers; forming a source and a drain on the substrate; forming a first dielectric layer; planarizing the first dielectric layer to expose the dummy gate; removing the dummy gate to release a channel comprising conductive nanosheets; and forming a surrounding gate surrounding the conductive nanosheets.
    Type: Application
    Filed: November 27, 2023
    Publication date: March 6, 2025
    Inventors: Junjie LI, Enxu LIU, Na ZHOU, Jianfeng GAO, Junfeng LI, Yongliang LI, Jun LUO, Wenwu WANG
  • Patent number: 12242971
    Abstract: This document relates to training of machine learning models such as neural networks. One example method involves providing a machine learning model having one or more layers and associated parameters and performing a pretraining stage on the parameters of the machine learning model to obtain pretrained parameters. The example method also involves performing a tuning stage on the machine learning model by using labeled training examples to tune the pretrained parameters. The tuning stage can include performing noise adjustment of the labeled training examples to obtain noise-adjusted training examples. The tuning stage can also include adjusting the pretrained parameters based at least on the labeled training examples and the noise-adjusted training examples to obtain adapted parameters. The example method can also include outputting a tuned machine learning model having the adapted parameters.
    Type: Grant
    Filed: January 29, 2020
    Date of Patent: March 4, 2025
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Xiaodong Liu, Jianfeng Gao, Pengcheng He, Weizhu Chen
  • Publication number: 20250063713
    Abstract: The present disclosure provides a memory with a three-dimensional vertical structure and a manufacturing method. The memory includes: a semiconductor substrate, a first isolation layer, a first transistor and a second transistor. The first transistor includes a first source layer, a second isolation layer, a first drain layer, a third isolation layer, and a first through hole penetrating to the first source layer. A first active layer, a first gate dielectric layer and a first gate layer are on an inner sidewall of the first through hole. The second transistor includes a fourth isolation layer, a second source layer, a fifth isolation layer, and a second through hole penetrating to the first gate layer. A second active layer, a second gate dielectric layer and a second gate layer are on an inner sidewall of the second through hole. The second through hole is surrounded by the first through hole.
    Type: Application
    Filed: August 9, 2024
    Publication date: February 20, 2025
    Applicant: INSTITUTE OF MICROELECTRONICS, CHINESE ACADEMY OF SCIENCES
    Inventors: Jianfeng GAO, Weibing LIU, Junjie LI, Na ZHOU, Tao YANG, Junfeng LI, Jun LUO
  • Patent number: 12223269
    Abstract: A method for training a language model comprises (a) receiving vectorized training data as input to a multitask pretraining problem; (b) generating modified vectorized training data based on the vectorized training data, according to an upstream data embedding; (c) emitting pretraining output based on the modified vectorized training data, according to a downstream data embedding equivalent to the upstream data embedding; and (d) adjusting the upstream data embedding and the downstream data embedding by computing, based on the pretraining output, a gradient of the upstream data embedding disentangled from a gradient of the downstream data embedding, thereby advancing the multitask pretraining problem toward a pretrained state.
    Type: Grant
    Filed: May 18, 2022
    Date of Patent: February 11, 2025
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Pengcheng He, Jianfeng Gao, Weizhu Chen
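Step (d) above, applying separately computed gradients to an upstream and a downstream embedding that start out equivalent, can be shown schematically. The dictionary-based embeddings and the gradient values here are hypothetical stand-ins.

```python
def disentangled_step(upstream, downstream, grad_up, grad_down, lr=0.1):
    """Apply each embedding's own gradient rather than one shared, summed gradient."""
    new_up = {k: upstream[k] - lr * grad_up.get(k, 0.0) for k in upstream}
    new_down = {k: downstream[k] - lr * grad_down.get(k, 0.0) for k in downstream}
    return new_up, new_down

# Both embeddings start out equivalent, as in the claim, then diverge because
# their gradients are computed and applied separately.
up = {"tok": 1.0}
down = {"tok": 1.0}
up, down = disentangled_step(up, down, grad_up={"tok": 0.4}, grad_down={"tok": -0.2})
```

Had the gradients been entangled (summed into one update applied to a single shared table), the two embeddings could never move in opposite directions as they do here.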
  • Publication number: 20250040226
    Abstract: The present disclosure provides a semiconductor device and a method of manufacturing a semiconductor device. The semiconductor device includes: a substrate; an insulating layer provided with a plurality of trenches extending in a first direction; a first electrode layer and a second electrode layer, where a spacing region is provided between the first electrode layer and the second electrode layer; a semiconductor layer covering bottom portions and sidewalls of all channel trenches, where the channel trenches are at least a part of trench bodies of the trenches located in the spacing region; a gate dielectric layer covering a surface of the semiconductor layer in the channel trenches on a side away from the bottom portions and the sidewalls of the channel trenches; a gate layer, where at least a part of the channel trenches are fully filled with the gate layer.
    Type: Application
    Filed: July 26, 2024
    Publication date: January 30, 2025
    Inventors: Junjie LI, Gaobo XU, Na ZHOU, Chenchen ZHANG, Jianfeng GAO, Yihong LU, Tao YANG, Junfeng LI, Jun LUO, Rui CHEN
  • Publication number: 20250006822
    Abstract: A method for manufacturing a gate-all-around TFET device. The method comprises: forming, on a substrate, a channel stack comprising channel layer(s) and sacrificial layer(s) that alternate with each other; forming, on the substrate, a dummy gate astride the channel stack; forming a first spacer at a surface of the dummy gate; etching the sacrificial layer(s) to form recesses on side surfaces of the channel stack; forming second spacers in the recesses, respectively; fabricating a source and a drain separately, where a region for fabricating the source is shielded by a dielectric material when fabricating the drain, and a region for fabricating the drain is shielded by another dielectric material when fabricating the source; etching the dummy gate and the sacrificial layer(s) to form a space for a surrounding gate; and fabricating a surrounding dielectric-metal gate in the space.
    Type: Application
    Filed: November 27, 2023
    Publication date: January 2, 2025
    Inventors: Na Zhou, Junjie Li, Jianfeng Gao, Tao Yang, Junfeng Li, Jun Luo
  • Publication number: 20240419960
    Abstract: Generally discussed herein are devices, systems, and methods for backpropagation of a discrete latent variable.
    Type: Application
    Filed: June 19, 2023
    Publication date: December 19, 2024
    Inventors: Liyuan LIU, Chengyu DONG, Xiaodong LIU, Bin YU, Jianfeng GAO
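The abstract is terse, so the following shows only the general problem the entry addresses: a hard threshold producing a discrete latent variable has zero gradient almost everywhere. The straight-through estimator sketched here is one standard workaround, not necessarily the patented technique.

```python
import math

def forward(logit):
    """Produce a relaxed probability and a discrete (binary) latent from it."""
    prob = 1.0 / (1.0 + math.exp(-logit))  # relaxed, differentiable value
    z = 1.0 if prob > 0.5 else 0.0         # discrete latent: hard threshold
    return prob, z

def backward_straight_through(grad_z):
    # The hard threshold has zero gradient almost everywhere; the
    # straight-through estimator passes grad_z back unchanged, as if the
    # threshold were the identity function.
    return grad_z

prob, z = forward(0.8)
grad_logit = backward_straight_through(0.5)
```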
  • Publication number: 20240362418
    Abstract: A technique supplements a language model with knowledge information retrieved from external sources. The technique operates by: receiving a query; receiving knowledge information based on the query; generating original model-input information that includes the query and the knowledge information; and presenting the original model-input information to the language model. The technique further includes: receiving an original response from the language model; generating a usefulness measure that identifies usefulness of the original response; and determining whether the usefulness measure satisfies a prescribed test. Upon determining that the usefulness measure does not satisfy the test, the technique includes: generating revised model-input information that includes feedback information; presenting the revised model-input information to the language model; and receiving a revised response from the language model.
    Type: Application
    Filed: April 28, 2023
    Publication date: October 31, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Baolin PENG, Michel GALLEY, Hao CHENG, Pengcheng HE, Nguyen Hung BACH, Weizhu CHEN, Jianfeng GAO
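The control loop in the abstract maps naturally onto code. The retriever, model, and usefulness scorer below are toy stubs (assumptions); only the loop structure follows the abstract: score the response, and if it fails the prescribed test, resend the input augmented with feedback.

```python
def answer_with_feedback(query, retrieve, model, usefulness, threshold, max_rounds=3):
    knowledge = retrieve(query)  # receive knowledge information based on the query
    model_input = {"query": query, "knowledge": knowledge, "feedback": None}
    response = model(model_input)  # original response
    for _ in range(max_rounds):
        score = usefulness(response)  # usefulness measure
        if score >= threshold:        # prescribed test satisfied
            return response
        # revised model-input information includes feedback information
        model_input["feedback"] = f"low usefulness ({score}); revise using the knowledge"
        response = model(model_input)  # revised response
    return response

# Toy stubs standing in for the retriever, the language model, and the scorer.
def retrieve(query):
    return ["retrieved fact"]

def model(model_input):
    return "grounded answer" if model_input["feedback"] else "draft"

def usefulness(response):
    return 1.0 if "grounded" in response else 0.2

result = answer_with_feedback("example query", retrieve, model, usefulness, threshold=0.8)
```

With these stubs, the first draft fails the usefulness test, the feedback round produces a grounded revision, and the loop terminates.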
  • Publication number: 20240346295
    Abstract: This document relates to architectures and training procedures for multi-task machine learning models, such as neural networks. One example method involves providing a multi-task machine learning model having one or more shared layers and two or more task-specific layers. The method can also involve performing a pretraining stage on the one or more shared layers using one or more unsupervised prediction tasks.
    Type: Application
    Filed: May 3, 2024
    Publication date: October 17, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Weizhu CHEN, Pengcheng HE, Xiaodong LIU, Jianfeng GAO
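Structurally, the abstract describes shared layers feeding two or more task-specific heads. A minimal sketch, with toy one-weight "layers" standing in for real network layers:

```python
def shared_layers(x, w_shared):
    return w_shared * x  # representation shared across all tasks

def task_head(h, w_task):
    return w_task * h    # task-specific layer on top of the shared representation

# Hypothetical weights: one shared layer, two task heads.
w_shared, w_task_a, w_task_b = 2.0, 1.0, -1.0
h = shared_layers(3.0, w_shared)
out_a = task_head(h, w_task_a)  # task A prediction
out_b = task_head(h, w_task_b)  # task B prediction
```

The pretraining stage in the abstract would train only `w_shared` (via unsupervised prediction tasks), leaving the task-specific heads to be fit per task afterward.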
  • Publication number: 20240311656
    Abstract: A technique performs the task of knowledge-graph completion in a manner that is both scalable and resource efficient. In some implementations, the technique identifies a source entity having a source-target relation that connects the source entity to a yet-to-be-determined target entity. The technique also identifies a source-entity data item that provides a passage of source-entity text pertaining to the source entity. The technique uses a machine-trained encoder model to map the source-entity data item to source-entity encoded information. The technique then predicts an identity of the target entity based on the source-entity encoded information, and based on predicate encoded information that encodes the source-target relation. In some implementations, the technique also predicts the target entity based on a consideration of one or more neighboring entities that are connected to the source entity and their respective source-to-neighbor relations.
    Type: Application
    Filed: March 16, 2023
    Publication date: September 19, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Xiaodong LIU, Jian JIAO, Hao CHENG, Sanxing CHEN, Jianfeng GAO
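The scoring idea in the abstract, combining the encoded source-entity text with the encoded relation and ranking candidate targets, can be sketched as follows. The 2-D character-sum "encoder" and the candidate vectors are assumptions for illustration; the patent uses a machine-trained encoder model.

```python
def encode(text):
    """Toy 2-D encoder (character-sum hashing), standing in for a trained encoder."""
    v = [0.0, 0.0]
    for i, ch in enumerate(text):
        v[i % 2] += ord(ch) / 1000.0
    return v

def score(source_text, relation, target_vec):
    src = encode(source_text)  # source-entity encoded information
    rel = encode(relation)     # predicate encoded information
    q = [src[0] + rel[0], src[1] + rel[1]]  # combine entity and relation encodings
    return q[0] * target_vec[0] + q[1] * target_vec[1]

# Predict the target entity for (source, relation, ?) by ranking candidates.
candidates = {"e1": [1.0, 0.0], "e2": [0.0, 1.0]}
best = max(candidates,
           key=lambda t: score("source-entity text", "relation", candidates[t]))
```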
  • Publication number: 20240281705
    Abstract: The disclosed concepts relate to pretraining of machine learning models. One example method involves performing separate optimization of a first machine learning model and a second machine learning model. The first machine learning model can be optimized based at least on first predictions and the second machine learning model can be optimized based at least on second predictions. The first predictions can represent predictions of masked values in first sequences of values, and the second predictions can represent whether or not values in the first sequences were replaced with different values predicted by the first machine learning model.
    Type: Application
    Filed: June 21, 2023
    Publication date: August 22, 2024
    Applicant: Microsoft Technology Licensing, LLC
    Inventors: Xiaodong LIU, Chengyu DONG, Lucas LIU, Hao CHENG, Jianfeng GAO
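The two-model setup in the abstract (a first model fills masked positions; a second model learns to label each position as original or replaced) can be sketched with trivial placeholder "models". The random filler and the integer vocabulary are assumptions for illustration only.

```python
import random

random.seed(2)

def mask(seq, rate=0.5):
    """Replace some values with a mask marker (None)."""
    return [None if random.random() < rate else v for v in seq]

def generator_fill(masked, vocab):
    # first model: predict a value for each masked position (random stand-in)
    return [random.choice(vocab) if v is None else v for v in masked]

def discriminator_labels(original, filled):
    # second model's training targets: was this position replaced?
    return [o != f for o, f in zip(original, filled)]

original = [1, 2, 3, 4]
masked = mask(original)
filled = generator_fill(masked, vocab=[1, 2, 3, 4])
labels = discriminator_labels(original, filled)
```

Per the abstract, the two models are then optimized separately: the first on its masked-value predictions, the second on these replaced-or-not labels.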
  • Patent number: D1060017
    Type: Grant
    Filed: September 19, 2024
    Date of Patent: February 4, 2025
    Inventor: Jianfeng Gao
  • Patent number: D1068039
    Type: Grant
    Filed: October 23, 2024
    Date of Patent: March 25, 2025
    Inventor: Jianfeng Gao
  • Patent number: D1073878
    Type: Grant
    Filed: October 22, 2024
    Date of Patent: May 6, 2025
    Inventor: Jianfeng Gao