Patents by Inventor Ashish Verma

Ashish Verma has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230100713
    Abstract: Embodiments of the disclosure are directed to advanced integrated circuit (IC) structure fabrication and, in particular, IC structures with an improved two-dimensional (2D) channel architecture. Other embodiments may be disclosed or claimed.
    Type: Application
    Filed: September 24, 2021
    Publication date: March 30, 2023
    Inventors: Chelsey DOROW, Kevin P. O'BRIEN, Carl H. NAYLOR, Kirby MAXEY, Sudarat LEE, Ashish Verma PENUMATCHA, Uygar E. AVCI
  • Publication number: 20230098467
    Abstract: Thin film transistors having a spin-on two-dimensional (2D) channel material are described. In an example, an integrated circuit structure includes a first device layer including a first two-dimensional (2D) material layer above a substrate. The first 2D material layer includes molybdenum, sulfur, sodium and carbon. A second device layer including a second 2D material layer is above the substrate. The second 2D material layer includes tungsten, selenium, sodium and carbon.
    Type: Application
    Filed: September 24, 2021
    Publication date: March 30, 2023
    Inventors: Carl H. NAYLOR, Kirby MAXEY, Kevin P. O'BRIEN, Chelsey DOROW, Sudarat LEE, Ashish Verma PENUMATCHA, Shriram SHIVARAMAN, Uygar E. AVCI, Patrick THEOFANIS, Charles MOKHTARZADEH, Matthew V. METZ, Scott B. CLENDENNING
  • Publication number: 20230100505
    Abstract: Embodiments disclosed herein include transistor devices and methods of forming such devices. In an embodiment, a transistor device comprises a first channel, wherein the first channel comprises a semiconductor material and a second channel above the first channel, wherein the second channel comprises the semiconductor material. In an embodiment, a first spacer is between the first channel and the second channel, and a second spacer is between the first channel and the second channel. In an embodiment, a first gate dielectric is over a surface of the first channel that faces the second channel, and a second gate dielectric is over a surface of the second channel that faces the first channel. In an embodiment, the first gate dielectric is physically separated from the second gate dielectric.
    Type: Application
    Filed: September 24, 2021
    Publication date: March 30, 2023
    Inventors: Ashish Verma PENUMATCHA, Sarah ATANASOV, Seung Hoon SUNG, Rahul RAMAMURTHY, I-Cheng TUNG, Uygar E. AVCI, Matthew V. METZ, Jack T. KAVALIEROS, Chia-Ching LIN, Kaan OGUZ
  • Patent number: 11616130
    Abstract: Techniques and mechanisms to provide electrical insulation between a gate and a channel region of a non-planar circuit device. In an embodiment, the gate structure, and insulation spacers at opposite respective sides of the gate structure, each extend over a semiconductor fin structure. In a region between the insulation spacers, a first dielectric layer extends conformally over the fin, and a second dielectric layer adjoins and extends conformally over the first dielectric layer. A third dielectric layer, adjoining the second dielectric layer and the insulation spacers, extends under the gate structure. Of the first, second and third dielectric layers, the third dielectric layer is conformal to respective sidewalls of the insulation spacers. In another embodiment, the second dielectric layer is of dielectric constant which is greater than that of the first dielectric layer, and equal to or less than that of the third dielectric layer.
    Type: Grant
    Filed: March 25, 2019
    Date of Patent: March 28, 2023
    Assignee: Intel Corporation
    Inventors: Seung Hoon Sung, Jack Kavalieros, Ian Young, Matthew Metz, Uygar Avci, Devin Merrill, Ashish Verma Penumatcha, Chia-Ching Lin, Owen Loh
  • Publication number: 20230086499
    Abstract: Thin film transistors having fin structures integrated with two-dimensional (2D) channel materials are described. In an example, an integrated circuit structure includes a plurality of insulator fins above a substrate. A two-dimensional (2D) material layer is over the plurality of insulator fins. A gate dielectric layer is on the 2D material layer. A gate electrode is on the gate dielectric layer. A first conductive contact is on the 2D material layer adjacent to a first side of the gate electrode. A second conductive contact is on the 2D material layer adjacent to a second side of the gate electrode, the second side opposite the first side.
    Type: Application
    Filed: September 20, 2021
    Publication date: March 23, 2023
    Inventors: Kirby MAXEY, Ashish Verma PENUMATCHA, Kevin P. O'BRIEN, Chelsey DOROW, Uygar E. AVCI, Sudarat LEE, Carl NAYLOR, Tanay GOSAVI
  • Publication number: 20230090093
    Abstract: Thin film transistors having semiconductor structures integrated with two-dimensional (2D) channel materials are described. In an example, an integrated circuit structure includes a two-dimensional (2D) material layer above a substrate. A gate stack is above the 2D material layer, the gate stack having a first side opposite a second side. A semiconductor structure including germanium is included, the semiconductor structure laterally adjacent to and in contact with the 2D material layer adjacent the first side of the gate stack. A first conductive structure is adjacent the first side of the gate stack, the first conductive structure over and in direct electrical contact with the semiconductor structure. The semiconductor structure is intervening between the first conductive structure and the 2D material layer. A second conductive structure is adjacent the second side of the gate stack, the second conductive structure over and in direct electrical contact with the 2D material layer.
    Type: Application
    Filed: September 20, 2021
    Publication date: March 23, 2023
    Inventors: Ashish Verma PENUMATCHA, Uygar E. AVCI, Chelsey DOROW, Tanay GOSAVI, Chia-Ching LIN, Carl NAYLOR, Nazila HARATIPOUR, Kevin P. O'BRIEN, Seung Hoon SUNG, Ian A. YOUNG, Urusa ALAAN
  • Publication number: 20230088101
    Abstract: Thin film transistors having edge-modulated two-dimensional (2D) channel material are described. In an example, an integrated circuit structure includes a device layer including a two-dimensional (2D) material layer above a substrate, the 2D material layer including a center portion and first and second edge portions, the center portion consisting essentially of molybdenum or tungsten and of sulfur or selenium, and the first and second edge portions including molybdenum or tungsten and including tellurium.
    Type: Application
    Filed: September 22, 2021
    Publication date: March 23, 2023
    Inventors: Carl H. NAYLOR, Kirby MAXEY, Kevin P. O'BRIEN, Chelsey DOROW, Sudarat LEE, Ashish Verma PENUMATCHA, Uygar E. AVCI, Matthew V. METZ, Scott B. CLENDENNING
  • Publication number: 20230087668
    Abstract: Thin film transistors having strain-inducing structures integrated with two-dimensional (2D) channel materials are described. In an example, an integrated circuit structure includes a two-dimensional (2D) material layer above a substrate. A gate stack is on the 2D material layer, the gate stack having a first side opposite a second side. A first gate spacer is on the 2D material layer and adjacent to the first side of the gate stack. A second gate spacer is on the 2D material layer and adjacent to the second side of the gate stack. The first gate spacer and the second gate spacer induce a strain on the 2D material layer. A first conductive structure is on the 2D material layer and adjacent to the first gate spacer. A second conductive structure is on the 2D material layer and adjacent to the second gate spacer.
    Type: Application
    Filed: September 21, 2021
    Publication date: March 23, 2023
    Inventors: Chelsey DOROW, Kevin P. O'BRIEN, Carl NAYLOR, Kirby MAXEY, Sudarat LEE, Ashish Verma PENUMATCHA, Uygar E. AVCI
  • Patent number: 11605624
    Abstract: Described is a resonator that uses ferroelectric (FE) material in a capacitive structure. The resonator includes a first plurality of metal lines extending in a first direction; an array of capacitors comprising ferroelectric material; a second plurality of metal lines extending in the first direction, wherein the array of capacitors is coupled between the first and second plurality of metal lines; and circuitry to switch polarization of at least one capacitor of the array of capacitors. The switching of polarization regenerates acoustic waves. In some embodiments, the acoustic mode of the resonator is isolated by phononic gratings formed all around the resonator from metal lines above and adjacent to the FE-based capacitors.
    Type: Grant
    Filed: January 2, 2019
    Date of Patent: March 14, 2023
    Assignee: Intel Corporation
    Inventors: Tanay Gosavi, Chia-ching Lin, Raseong Kim, Ashish Verma Penumatcha, Uygar Avci, Ian Young
  • Publication number: 20230069913
    Abstract: Techniques for utilizing model and hyperparameter optimization for multi-objective machine learning are disclosed. In one example, a method comprises the following steps. One of (i) a plurality of hyperparameter optimization operations and (ii) a plurality of model parameter optimization operations is performed to generate a first solution set. The other of the two pluralities of operations is performed to generate a second solution set. At least a portion of the first solution set and at least a portion of the second solution set are combined to generate a third solution set. (A hedged Python sketch of this flow follows this entry.)
    Type: Application
    Filed: September 9, 2021
    Publication date: March 9, 2023
    Inventors: Aswin Kannan, Vaibhav Saxena, Anamitra Roy Choudhury, Yogish Sabharwal, Parikshit Ram, Ashish Verma, Saurabh Manish Raje
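The abstract of 20230069913 describes an alternating search: optimize hyperparameters to get one solution set, optimize model parameters to get another, then merge portions of the two. The sketch below is a minimal illustration of that flow under assumptions of our own: a two-objective (accuracy, latency) search, random sampling in place of real training, and helper names (`evaluate`, `pareto_front`, and so on) that are not from the application.

```python
import random

def evaluate(config):
    """Stand-in for training a model under `config` and scoring it on two
    objectives: accuracy (maximize) and latency (minimize)."""
    return (random.random(), random.random())

def pareto_front(solutions):
    """Keep the non-dominated solutions of a multi-objective solution set."""
    front = []
    for cfg, (acc, lat) in solutions:
        dominated = any(
            a >= acc and l <= lat and (a, l) != (acc, lat)
            for _, (a, l) in solutions
        )
        if not dominated:
            front.append((cfg, (acc, lat)))
    return front

def hyperparam_pass(n_trials):
    """First solution set: vary hyperparameters under a fixed training recipe."""
    solutions = []
    for _ in range(n_trials):
        cfg = {"lr": 10 ** random.uniform(-4, -1), "layers": random.randint(2, 8)}
        solutions.append((cfg, evaluate(cfg)))
    return solutions

def model_param_pass(seed_configs, n_trials):
    """Second solution set: refine around the seeds (approximated here by
    re-evaluating perturbed copies of each seed configuration)."""
    solutions = []
    for cfg in seed_configs:
        for _ in range(n_trials):
            refined = dict(cfg, lr=cfg["lr"] * random.uniform(0.5, 2.0))
            solutions.append((refined, evaluate(refined)))
    return solutions

first = hyperparam_pass(20)
second = model_param_pass([cfg for cfg, _ in pareto_front(first)], 5)
third = pareto_front(first + second)   # combined solution set, as in the claim
print(len(third), "non-dominated solutions")
```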
  • Publication number: 20230065198
    Abstract: A memory device, an integrated circuit component including an array of the memory devices, and an integrated device assembly including the integrated circuit component. The memory device includes a first electrode; a second electrode including an antiferromagnetic (AFM) material; and a memory stack including: a first layer adjacent the second electrode and including a multilayer stack of adjacent layers comprising ferromagnetic materials; a second layer adjacent the first layer; and a third layer adjacent the second layer at one side thereof, and adjacent the first electrode at another side thereof, the second layer between the first layer and the third layer, the third layer including a ferromagnetic material. The memory device may correspond to a magnetic tunnel junction (MTJ) magnetic random access memory bit cell, and the memory stack may correspond to an MTJ device.
    Type: Application
    Filed: September 2, 2021
    Publication date: March 2, 2023
    Applicant: Intel Corporation
    Inventors: Ian Alexander Young, Dmitri Evgenievich Nikonov, Chia-Ching Lin, Tanay A. Gosavi, Ashish Verma Penumatcha, Kaan Oguz, Punyashloka Debashis
  • Publication number: 20230058938
    Abstract: A pbit (probabilistic bit) device, in one embodiment, includes a first field-effect transistor (FET) that includes a source region, a drain region, a source electrode on the source region, a drain electrode on the drain region, a channel region between the source and drain regions, a dielectric layer on a surface over the channel region, an electrode layer above the dielectric layer, and a ferroelectric (FE) material layer between the dielectric layer and the electrode layer. The pbit device also includes a second FET comprising a source electrode, a drain electrode, and a gate electrode. The drain electrode of the second FET is connected to the drain electrode of the first FET.
    Type: Application
    Filed: August 23, 2021
    Publication date: February 23, 2023
    Applicant: Intel Corporation
    Inventors: Punyashloka Debashis, Dmitri Evgenievich Nikonov, Hai Li, Chia-Ching Lin, Raseong Kim, Tanay A. Gosavi, Ashish Verma Penumatcha, Uygar E. Avci, Marko Radosavljevic, Ian Alexander Young
  • Patent number: 11586932
    Abstract: A computer-implemented machine learning model training method and resulting machine learning model. One embodiment of the method may comprise receiving training data at a computer memory; and training, on a computer processor, a machine learning model on the received training data using a plurality of batch sizes to produce a trained model. The training may include calculating a plurality of activations during a forward pass of the training and discarding at least some of the calculated plurality of activations after the forward pass of the training. (A hedged PyTorch sketch of this technique follows this entry.)
    Type: Grant
    Filed: March 10, 2020
    Date of Patent: February 21, 2023
    Assignee: International Business Machines Corporation
    Inventors: Saurabh Goyal, Anamitra Roy Choudhury, Yogish Sabharwal, Ashish Verma
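The claims of 11586932 read like activation recomputation (often called gradient checkpointing) combined with a schedule of varying batch sizes. Below is a hedged PyTorch sketch of that combination, not the patented implementation; the model and batch-size schedule are invented for illustration, and `use_reentrant=False` assumes a reasonably recent PyTorch.

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Invented toy model: four blocks whose activations we decline to store.
blocks = nn.ModuleList(
    [nn.Sequential(nn.Linear(64, 64), nn.ReLU()) for _ in range(4)]
)
head = nn.Linear(64, 10)
opt = torch.optim.SGD(
    list(blocks.parameters()) + list(head.parameters()), lr=0.01
)
loss_fn = nn.CrossEntropyLoss()

batch_sizes = [8, 32, 128]   # an illustrative "plurality of batch sizes"
for step, bs in enumerate(batch_sizes * 5):
    x = torch.randn(bs, 64)
    y = torch.randint(0, 10, (bs,))
    h = x
    for blk in blocks:
        # checkpoint() discards the block's intermediate activations after
        # the forward pass and recomputes them during backward, trading
        # compute for the memory headroom that the larger batches need.
        h = checkpoint(blk, h, use_reentrant=False)
    loss = loss_fn(head(h), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```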
  • Patent number: 11586475
    Abstract: One embodiment provides a method, including: receiving at least one deep learning job for scheduling and running on a distributed system comprising a plurality of nodes; receiving a batch size range indicating a minimum batch size and a maximum batch size that can be utilized for running the at least one deep learning job; determining a plurality of runtime estimations for running the at least one deep learning job; creating a list of optimal combinations of (i) batch sizes and (ii) numbers of the plurality of nodes for running both (a) the at least one deep learning job and (b) current deep learning jobs; and scheduling the at least one deep learning job at the distributed system, responsive to identifying, by utilizing the list, that the distributed system has necessary processing resources for running both (iii) the at least one deep learning job and (iv) the current deep learning jobs. (A hedged Python sketch of this scheduling flow follows this entry.)
    Type: Grant
    Filed: February 28, 2020
    Date of Patent: February 21, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Saurav Basu, Vaibhav Saxena, Yogish Sabharwal, Ashish Verma, Jayaram Kallapalayam Radhakrishnan
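As a rough illustration of 11586475, the sketch below enumerates (batch size, node count) combinations inside a job's declared batch-size range, ranks them by a toy runtime estimate, and admits the job only if enough nodes remain free for the jobs already running. The runtime model and all names are assumptions, not the patented scheduler.

```python
from itertools import product

def estimate_runtime(batch_size, nodes, samples=1_000_000):
    """Toy runtime model (an assumption): smaller batches mean more steps,
    and per-step cost shrinks with nodes plus a fixed sync overhead."""
    steps = samples / batch_size
    return steps * (1.0 / nodes + 0.05)   # arbitrary units

def plan(job, total_nodes, free_nodes):
    """Pick the fastest feasible (batch size, node count) combination that
    leaves the currently running jobs' nodes untouched; defer otherwise."""
    min_bs, max_bs = job["batch_range"]
    candidates = sorted({min_bs, (min_bs + max_bs) // 2, max_bs})
    combos = sorted(
        (estimate_runtime(bs, n), bs, n)
        for bs, n in product(candidates, range(1, total_nodes + 1))
    )
    for eta, bs, n in combos:
        if n <= free_nodes:   # processing resources remain for current jobs
            return {"batch_size": bs, "nodes": n, "eta": eta}
    return None               # defer the job

print(plan({"batch_range": (32, 256)}, total_nodes=8, free_nodes=3))
```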
  • Publication number: 20230015895
    Abstract: Methods, systems, and computer program products for accelerating inference of transformer-based models are provided herein. A computer-implemented method includes obtaining a machine learning model comprising a plurality of transformer blocks, a task, and a natural language dataset; generating a compressed version of the machine learning model based on the task and the natural language dataset, wherein the generating comprises: obtaining at least one set of tokens, wherein each token in the set corresponds to one of the items in the natural language dataset, identifying and removing one or more redundant output activations of different ones of the plurality of transformer blocks for the at least one set of tokens, and adding one or more input activations corresponding to the one or more removed output activations into the machine learning model at subsequent ones of the plurality of the transformer blocks; and outputting the compressed version of the machine learning model to at least one user. (A hedged PyTorch sketch of one reading of this idea follows this entry.)
    Type: Application
    Filed: July 12, 2021
    Publication date: January 19, 2023
    Inventors: Saurabh Goyal, Anamitra Roy Choudhury, Saurabh Manish Raje, Venkatesan T. Chakaravarthy, Yogish Sabharwal, Ashish Verma
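One plausible reading of 20230015895 is progressive token elimination: after each transformer block, drop the output activations of low-salience tokens so later blocks process shorter sequences. The PyTorch sketch below illustrates that reading; the norm-based scoring rule and the keep schedule are assumptions of ours, not the patented method.

```python
import torch
import torch.nn as nn

encoder = nn.ModuleList(
    [nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
     for _ in range(4)]
)
keep_schedule = [128, 96, 64, 32]   # tokens retained after each block (assumed)

def compress_forward(x):
    """x: (batch, seq_len, d_model). Prunes tokens between blocks."""
    for block, k in zip(encoder, keep_schedule):
        x = block(x)
        if x.size(1) > k:
            # Score tokens by activation norm and keep the top-k; the dropped
            # tokens' outputs play the role of the "redundant output
            # activations" that the abstract describes removing.
            scores = x.norm(dim=-1)                          # (batch, seq_len)
            idx = scores.topk(k, dim=1).indices.sort(dim=1).values
            x = x.gather(1, idx.unsqueeze(-1).expand(-1, -1, x.size(-1)))
    return x

out = compress_forward(torch.randn(2, 128, 64))
print(out.shape)   # torch.Size([2, 32, 64])
```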
  • Patent number: 11551145
    Abstract: Systems, computer-implemented methods, and computer program products that can facilitate switching a model training process from a ground truth training phase to an adversarial training phase based on performance of a model trained in the ground truth training phase are provided. According to an embodiment, a system can comprise a memory that stores computer executable components and a processor that executes the computer executable components stored in the memory. The computer executable components can comprise an analysis component that identifies a performance condition of a model trained in a model training process. The computer executable components can further comprise a trainer component that switches the model training process from a ground truth training process to an adversarial training process based on the identified performance condition. (A hedged Python sketch of this switch follows this entry.)
    Type: Grant
    Filed: February 5, 2020
    Date of Patent: January 10, 2023
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Sidharth Gupta, Parijat Dube, Ashish Verma
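A minimal sketch of the phase switch in 11551145, under our own assumptions: the "performance condition" is modeled as a validation-loss plateau, and the training epochs are random-number stubs so the control flow stays self-contained. The actual condition and objectives in the patent may differ.

```python
import random

def supervised_epoch(model, data):
    """Stub for one ground-truth training epoch (e.g. loss against labels)."""
    return random.random()

def adversarial_epoch(model, data):
    """Stub for one adversarial training epoch (generator vs. discriminator)."""
    return random.random()

def validate(model, data):
    """Stub validation loss."""
    return random.random()

def plateaued(history, window=5, tol=1e-3):
    """Assumed performance condition: validation loss improved by less than
    `tol` over the last `window` epochs."""
    return (
        len(history) > window
        and history[-window - 1] - min(history[-window:]) < tol
    )

def train(model, data, epochs=50):
    phase, val_history = "ground_truth", []
    for _ in range(epochs):
        if phase == "ground_truth":
            supervised_epoch(model, data)
        else:
            adversarial_epoch(model, data)
        val_history.append(validate(model, data))
        if phase == "ground_truth" and plateaued(val_history):
            phase = "adversarial"   # the switch the claims describe
    return phase

print(train(model=None, data=None))
```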
  • Patent number: 11532439
    Abstract: Described is an ultra-dense ferroelectric memory. The memory is fabricated using a patterning method that applies atomic layer deposition with selective dry and/or wet etch to increase memory density at a given via opening. A ferroelectric capacitor in one example comprises: a first structure (e.g., first electrode) comprising metal; a second structure (e.g., a second electrode) comprising metal; and a third structure comprising ferroelectric material, wherein the third structure is between and adjacent to the first and second structures, wherein a portion of the third structure is interdigitated with the first and second structures to increase surface area of the third structure. The increased surface area allows for higher memory density.
    Type: Grant
    Filed: March 7, 2019
    Date of Patent: December 20, 2022
    Assignee: Intel Corporation
    Inventors: Chia-Ching Lin, Sou-Chi Chang, Nazila Haratipour, Seung Hoon Sung, Ashish Verma Penumatcha, Jack Kavalieros, Uygar E. Avci, Ian A. Young
  • Publication number: 20220374763
    Abstract: Techniques for distributed federated learning leverage a multi-layered defense strategy to provide for reduced information leakage. In lieu of aggregating model updates centrally, an aggregation function is decentralized into multiple independent and functionally-equivalent execution entities, each running within its own trusted execution environment (TEE). The TEEs enable confidential and remote-attestable federated aggregation. Preferably, each aggregator entity runs within an encrypted virtual machine that supports runtime in-memory encryption. Each party remotely authenticates the TEE before participating in the training. By using multiple decentralized aggregators, parties are enabled to partition their respective model updates at model-parameter granularity, and can map single weights to a specific aggregator entity. Parties also can dynamically shuffle fragmentary model updates at each training iteration to further obfuscate the information dispatched to each aggregator execution entity. (A hedged Python sketch of the partition-and-shuffle idea follows this entry.)
    Type: Application
    Filed: May 18, 2021
    Publication date: November 24, 2022
    Applicant: International Business Machines Corporation
    Inventors: Zhongshu Gu, Jayaram Kallapalayam Radhakrishnan, Ashish Verma, Enriquillo Valdez, Pau-Chen Cheng, Hani Talal Jamjoom, Kevin Eykholt
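A minimal sketch of the decentralization idea shared by 20220374763 and 20220374762: each party splits its model update at weight granularity across several aggregators, reshuffling the assignment each round with a shared per-round seed so matching indices land on the same aggregator. TEE attestation and encryption are omitted, and all names here are illustrative.

```python
import random

NUM_AGGREGATORS = 3

def partition_update(update, round_seed):
    """Split one party's flat weight update across aggregators at weight
    granularity. A shared per-round seed keeps every party's index-to-
    aggregator mapping aligned; changing the seed each round reshuffles
    the fragments."""
    rng = random.Random(round_seed)
    fragments = [{} for _ in range(NUM_AGGREGATORS)]
    for idx, value in enumerate(update):
        fragments[rng.randrange(NUM_AGGREGATORS)][idx] = value
    return fragments

def aggregate(fragments_from_parties):
    """One aggregator averages only the weight indices routed to it."""
    sums, counts = {}, {}
    for frag in fragments_from_parties:
        for idx, value in frag.items():
            sums[idx] = sums.get(idx, 0.0) + value
            counts[idx] = counts.get(idx, 0) + 1
    return {idx: sums[idx] / counts[idx] for idx in sums}

# Two parties, one round; the global update is the union of all aggregators'
# outputs, shown here for aggregator 0 only.
p1 = partition_update([0.1, 0.2, 0.3, 0.4], round_seed=7)
p2 = partition_update([0.3, 0.1, 0.5, 0.2], round_seed=7)
print(aggregate([p1[0], p2[0]]))
```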
  • Publication number: 20220374762
    Abstract: Techniques for distributed federated learning leverage a multi-layered defense strategy to provide for reduced information leakage. In lieu of aggregating model updates centrally, an aggregation function is decentralized into multiple independent and functionally-equivalent execution entities, each running within its own trusted execution environment (TEE). The TEEs enable confidential and remote-attestable federated aggregation. Preferably, each aggregator entity runs within an encrypted virtual machine that supports runtime in-memory encryption. Each party remotely authenticates the TEE before participating in the training. By using multiple decentralized aggregators, parties are enabled to partition their respective model updates at model-parameter granularity, and can map single weights to a specific aggregator entity. Parties also can dynamically shuffle fragmentary model updates at each training iteration to further obfuscate the information dispatched to each aggregator execution entity.
    Type: Application
    Filed: May 18, 2021
    Publication date: November 24, 2022
    Applicant: International Business Machines Corporation
    Inventors: Jayaram Kallapalayam Radhakrishnan, Ashish Verma, Zhongshu Gu, Enriquillo Valdez, Pau-Chen Cheng, Hani Talal Jamjoom
  • Publication number: 20220358358
    Abstract: Methods, systems, and computer program products for accelerating inference of neural network models via dynamic early exits are provided herein. A computer-implemented method includes determining a plurality of candidate exit points of a neural network model; obtaining a plurality of outputs of the neural network model for data samples in a target dataset, wherein the plurality of outputs comprises early outputs of the neural network model from the plurality of candidate exit points and regular outputs of the neural network model; and determining, based at least in part on the plurality of outputs, a set of one or more exit points from the plurality of candidate exit points that is dependent on the target dataset. (A hedged PyTorch sketch of early-exit inference follows this entry.)
    Type: Application
    Filed: May 4, 2021
    Publication date: November 10, 2022
    Inventors: Saurabh Manish Raje, Saurabh Goyal, Anamitra Roy Choudhury, Yogish Sabharwal, Ashish Verma
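The sketch below illustrates early-exit inference in the spirit of 20220358358: exit heads sit at candidate points, only a dataset-dependent subset is enabled, and a sample leaves the network once an enabled exit's confidence clears a threshold. The confidence rule, the threshold, and the hard-coded exit selection are assumptions for illustration.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
blocks = nn.ModuleList(
    [nn.Sequential(nn.Linear(32, 32), nn.ReLU()) for _ in range(4)]
)
exits = nn.ModuleList([nn.Linear(32, 10) for _ in range(4)])
enabled = {1, 3}     # exit points assumed selected for the target dataset
THRESHOLD = 0.6      # assumed confidence threshold

def infer(x):
    """Returns (predicted class, index of the block that produced it)."""
    h = x
    for i, blk in enumerate(blocks):
        h = blk(h)
        if i in enabled:
            probs = exits[i](h).softmax(dim=-1)
            conf, pred = probs.max(dim=-1)
            if conf.item() > THRESHOLD:
                return pred.item(), i                        # early output
    probs = exits[-1](h).softmax(dim=-1)
    return probs.argmax(dim=-1).item(), len(blocks) - 1      # regular output

print(infer(torch.randn(1, 32)))
```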