Patents by Inventor Lalit Gupta

Lalit Gupta has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20230315770
    Abstract: A process includes obtaining a document, determining a set of vectors based on a count of n-grams of the document, and determining a first set of information based on the document using a first set of neural networks. The process includes selecting a text section of the document using a second set of neural networks, and selecting a code template from a plurality of code templates based on the first set of information and the text section. The process includes determining an entity identifier, a value of a conditional statement, a second set of information, and a third set of information based on the text section, the first set of information, and the code template. The process includes generating a first set of program code based on the entity identifier, the value, the second set of information, and the third set of information.
    Type: Application
    Filed: June 8, 2023
    Publication date: October 5, 2023
    Inventors: Jaya Prakash Narayana Gutta, Sharad Malhautra, Lalit Gupta
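The code-generation pipeline described in the abstract above can be pictured with a minimal Python sketch: n-gram count vectors over the document, a stand-in for the neural networks that select a text section and a code template, and template filling to emit program code. Everything here (the `threshold_rule` template, `select_section`, the hard-coded entity and value) is an illustrative assumption, not the claimed implementation.

```python
# Illustrative sketch only -- not the claimed implementation.
# Stages: n-gram count vectors -> section/template selection -> code generation.
from collections import Counter
from string import Template

def ngram_counts(text: str, n: int = 2) -> Counter:
    """Count word n-grams; stands in for the 'set of vectors based on a count of n-grams'."""
    tokens = text.lower().split()
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

# Hypothetical code templates keyed by the kind of clause detected in a text section.
CODE_TEMPLATES = {
    "threshold_rule": Template(
        "def check_${entity}(record):\n"
        "    return record['${field}'] ${op} ${value}\n"
    ),
}

def select_section(document: str) -> str:
    """Stand-in for the second set of neural networks: pick the sentence that looks like a rule."""
    sentences = [s.strip() for s in document.split('.') if s.strip()]
    return max(sentences, key=lambda s: ('must' in s) + ('exceed' in s))

def generate_code(document: str) -> str:
    _vectors = ngram_counts(document)            # first-stage features (unused in this toy)
    section = select_section(document)           # selected text section
    template = CODE_TEMPLATES["threshold_rule"]  # selected code template
    # The entity identifier, conditional value, etc. would come from further models;
    # here they are hard-coded placeholders.
    return template.substitute(entity="payment", field="amount", op=">", value="1000")

print(generate_code("Each payment must not exceed 1000 dollars. Other text."))
```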
  • Patent number: 11776591
    Abstract: Various implementations described herein refer to a method for providing single port memory with multiple different banks having a first bank and a second bank that is different than the first bank. The method may include coupling multiple wordlines to the single port memory including coupling a first wordline to the first bank and coupling a second wordline to the second bank. The method may include performing multiple memory access operations concurrently in the single port memory.
    Type: Grant
    Filed: September 26, 2019
    Date of Patent: October 3, 2023
    Assignee: Arm Limited
    Inventors: Lalit Gupta, Bo Zheng, El Mehdi Boujamaa, Fakhruddin Ali Bohra
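A behavioral sketch of the idea in the abstract above, assuming a simple two-bank organization: because each bank has its own wordline, a single-port memory can service two access operations in the same cycle as long as they target different banks. This models only the scheduling behavior, not the circuit.

```python
# Behavioral sketch (not the patented circuit): a single-port memory split into
# two banks, each driven by its own wordline, servicing two accesses per cycle
# when they target different banks.
class TwoBankMemory:
    def __init__(self, words_per_bank: int = 4):
        self.banks = [[0] * words_per_bank, [0] * words_per_bank]

    def access_pair(self, op_a, op_b):
        """Each op is (bank, row, 'read'/'write', value). Both run in one 'cycle'
        only if they target different banks; otherwise the pair is rejected."""
        if op_a[0] == op_b[0]:
            raise ValueError("bank conflict: operations must target different banks")
        return tuple(self._do(op) for op in (op_a, op_b))

    def _do(self, op):
        bank, row, kind, value = op
        if kind == "write":
            self.banks[bank][row] = value
            return None
        return self.banks[bank][row]

mem = TwoBankMemory()
mem.access_pair((0, 1, "write", 42), (1, 2, "write", 7))
print(mem.access_pair((0, 1, "read", None), (1, 2, "read", None)))  # (42, 7)
```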
  • Patent number: 11763880
    Abstract: Various implementations described herein are related to a device having memory architecture having multiple bitcell arrays. The device may include column multiplexer circuitry coupled to the memory architecture via multiple bitlines for read access operations. The column multiplexer circuitry may perform read access operations in the multiple bitcell arrays via the bitlines based on a sense amplifier enable signal and a read multiplexer signal. The device may include control circuitry that provides the read multiplexer signal to the column multiplexer circuitry based on a clock signal and the sense amplifier enable signal so that the column multiplexer circuitry is able to perform the read access operations.
    Type: Grant
    Filed: August 11, 2020
    Date of Patent: September 19, 2023
    Assignee: Arm Limited
    Inventors: Lalit Gupta, Fakhruddin Ali Bohra, Shri Sagar Dwivedi, Vidit Babbar
  • Patent number: 11742001
    Abstract: Various implementations described herein are related to a device having memory circuitry having an array of memory cells. The device may include output circuitry coupled to the memory circuitry, and the output circuitry may have a first set of multiplexers that receives column data from the array of memory cells and provides first multiplexed output data. The device may include output interface circuitry coupled to the output circuitry, and the output interface circuitry may have a second set of multiplexers that receives the first multiplexed output data from the output circuitry and selectively provides second multiplexed output data based on a configurable mode of multiplexed operation.
    Type: Grant
    Filed: April 28, 2020
    Date of Patent: August 29, 2023
    Assignee: Arm Limited
    Inventors: Fakhruddin Ali Bohra, Lalit Gupta, Shri Sagar Dwivedi
  • Publication number: 20230267992
    Abstract: A static random access memory (SRAM) or other bit-storing cell arrangement includes memory cells arranged in banks and a hierarchical bitline structure including local bitlines for subsets of the memory banks and a global bitline spanning the subsets. A keeper circuit for the global bitline is replaced by bias circuitry on output transistors of the memory cells.
    Type: Application
    Filed: February 23, 2022
    Publication date: August 24, 2023
    Applicant: NVIDIA Corp.
    Inventors: Lalit Gupta, Stefan P. Sywyk, Andreas Jon Gotterba, Jesse Wang
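A minimal behavioral sketch of the hierarchical bitline read described above, with the analog keeper/bias behavior deliberately left out: each subset of cells shares a local bitline, and only the selected subset drives the global bitline.

```python
# Behavioral sketch of a hierarchical bitline read (the bias/keeper circuitry is
# not modeled): each subset shares a local bitline, and the global bitline
# reflects whichever local bitline the selected subset drives.
def read_bit(cells_by_subset, subset_select, row_select):
    # A stored '1' in the accessed row pulls the precharged local bitline low.
    local_bitlines_low = [subset[row_select] == 1 for subset in cells_by_subset]
    # The global bitline spans all subsets, but only the selected subset's
    # output transistor drives it.
    global_bitline_low = local_bitlines_low[subset_select]
    return 1 if global_bitline_low else 0

cells = [[0, 1, 0, 0], [1, 1, 0, 1]]                   # two subsets, four rows each
print(read_bit(cells, subset_select=1, row_select=3))  # -> 1
```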
  • Patent number: 11720615
    Abstract: A process includes obtaining a document, determining a set of vectors based on a count of n-grams of the document, and determining a first set of information based on the document using a first set of neural networks. The process includes selecting a text section of the document using a second set of neural networks, and selecting a code template from a plurality of code templates based on the first set of information and the text section. The process includes determining an entity identifier, a value of a conditional statement, a second set of information, and a third set of information based on the text section, the first set of information, and the code template. The process includes generating a first set of program code based on the entity identifier, the value, the second set of information, and the third set of information.
    Type: Grant
    Filed: July 29, 2022
    Date of Patent: August 8, 2023
    Assignee: DSilo Inc.
    Inventors: Jaya Prakash Narayana Gutta, Sharad Malhautra, Lalit Gupta
  • Publication number: 20230195767
    Abstract: Some embodiments may perform operations of a process that includes obtaining a natural language text document and using a machine learning model to generate a set of attributes based on a set of machine-learning-model-generated classifications in the document. The process may include performing hierarchical data extraction operations to populate the attributes, where different machine learning models may be used in sequence. The process may include using a pre-trained Bidirectional Encoder Representations from Transformers (BERT) model augmented with a pooling operation to determine a BERT output via a multi-channel transformer model to generate vectors on a per-sentence level or other per-text-section level. The process may include using a finer-grain model to extract quantitative or categorical values of interest, where the context of the per-sentence level may be retained for the finer-grain model.
    Type: Application
    Filed: February 10, 2023
    Publication date: June 22, 2023
    Inventors: Jaya Prakash Narayana Gutta, Sharad Malhautra, Lalit Gupta
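The per-sentence vector step described above can be sketched with a pretrained BERT and mean pooling. The library (Hugging Face `transformers`) and the model name are assumptions for illustration; the publication does not specify an implementation.

```python
# Sketch (library and model choice are assumptions, not from the publication):
# per-sentence vectors from a pretrained BERT with a mean-pooling step.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def sentence_vectors(sentences):
    enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state         # (batch, tokens, 768)
    mask = enc["attention_mask"].unsqueeze(-1)          # ignore padding tokens
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1) # mean pooling per sentence

vecs = sentence_vectors(["The lease term is 24 months.", "Rent is due monthly."])
print(vecs.shape)  # torch.Size([2, 768]); a finer-grain model would consume these vectors
```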
  • Publication number: 20230197127
    Abstract: To mitigate pulse shape degradation along a signal route, the signal is driven from two ends. One end of the route is loaded and the other is relatively unloaded. The loaded route and unloaded route may traverse two different metal layers on a printed circuit board. The two routes may thus be related such that the unloaded route imparts less RC distortion to the signal than the loaded route does.
    Type: Application
    Filed: December 20, 2021
    Publication date: June 22, 2023
    Applicant: NVIDIA Corp.
    Inventors: Lalit Gupta, Andreas Jon Gotterba, Jesse Wang
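The claim that the unloaded route distorts the signal less can be made concrete with a lumped Elmore-delay estimate; the single-segment model and the numbers below are illustrative assumptions, not values from the publication.

```python
# Illustrative lumped-RC (Elmore) comparison, not extracted from the publication:
# the unloaded route sees a smaller RC product, so its edge arrives less distorted.
def elmore_delay(r_per_mm, c_per_mm, length_mm, c_load):
    r_total = r_per_mm * length_mm
    c_wire = c_per_mm * length_mm
    # Single-pi approximation: wire capacitance lumped at midpoint plus load at the end.
    return r_total * (c_wire / 2 + c_load)

loaded   = elmore_delay(r_per_mm=50, c_per_mm=0.2e-12, length_mm=10, c_load=2e-12)   # heavily tapped
unloaded = elmore_delay(r_per_mm=50, c_per_mm=0.2e-12, length_mm=10, c_load=0.1e-12) # lightly tapped
print(f"loaded ~{loaded*1e12:.0f} ps, unloaded ~{unloaded*1e12:.0f} ps")
```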
  • Publication number: 20230096857
    Abstract: Some embodiments may obtain a natural language question, determine a context of the natural language question, and generate a first vector based on the natural language question using encoder neural network layers. Some embodiments may access a data table comprising column names, generate vectors based on the column names, and determine attention scores based on the vectors. Some embodiments may update the vectors based on the attention scores, generate a second vector based on the natural language question, and determine a set of strings comprising a name from the column names and a database language operator based on the vectors. Some embodiments may determine a value based on the determined database language operator and the name, using a transformer neural network model. Some embodiments may generate a query based on the set of strings and the value.
    Type: Application
    Filed: December 2, 2022
    Publication date: March 30, 2023
    Inventors: Jaya Prakash Narayana Gutta, Sharad Malhautra, Lalit Gupta
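A toy sketch of the query-generation flow described above: score column-name vectors against a question vector, pick a column and a database language operator, and assemble the query string. The character-count embedding and the hard-coded operator and value are placeholders standing in for the encoder, attention, and transformer models in the abstract.

```python
# Toy sketch (embeddings, operator, and value are placeholders, not the claimed models).
import math

def embed(text):
    """Crude bag-of-characters embedding standing in for the encoder layers."""
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord('a')] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def attention_scores(question_vec, column_vecs):
    """Dot products between the question vector and each column-name vector."""
    return [sum(q * c for q, c in zip(question_vec, col)) for col in column_vecs]

def generate_query(question, table, columns):
    q_vec = embed(question)
    scores = attention_scores(q_vec, [embed(c) for c in columns])
    column = columns[scores.index(max(scores))]   # highest-scoring column name
    operator, value = ">", 1000                   # a value-prediction model would supply these
    return f"SELECT * FROM {table} WHERE {column} {operator} {value}"

print(generate_query("Which invoices have an amount above 1000?",
                     "invoices", ["invoice_date", "amount", "customer_name"]))
```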
  • Patent number: 11588477
    Abstract: Various implementations described herein are directed to an integrated circuit having clock generation circuitry that receives an input clock signal and provides a first clock signal having a first pulse width. The integrated circuit includes first pulse-stretching circuitry coupled between the clock generation circuitry and input latch control circuitry. The first pulse-stretching circuitry receives the first clock signal and provides a second clock signal to the input latch control circuitry based on an enable signal. The second clock signal has a second pulse width that is at least greater than the first pulse width. The integrated circuit may include second pulse-stretching circuitry coupled between the clock generation circuitry and read-write circuitry. The second pulse-stretching circuitry provides a third clock signal to the read-write circuitry based on the enable signal. The third clock signal has a third pulse width that is at least greater than the first pulse width.
    Type: Grant
    Filed: December 21, 2020
    Date of Patent: February 21, 2023
    Assignee: Arm Limited
    Inventors: Shri Sagar Dwivedi, Fakhruddin Ali Bohra, Lalit Gupta, Yew Keong Chong, Gus Yeung
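One common way to realize pulse stretching is to OR a pulse with a delayed copy of itself, gated by an enable signal. The discrete-time sketch below illustrates that technique only; it is not the claimed circuitry.

```python
# Discrete-time sketch of one common pulse-stretching technique (OR the pulse
# with a delayed copy of itself), gated by an enable signal.
def stretch_pulse(clk, delay, enable=True):
    if not enable:
        return list(clk)                           # pass the clock through unchanged
    delayed = [0] * delay + list(clk[:-delay] if delay else clk)
    return [a | b for a, b in zip(clk, delayed)]   # OR widens the high phase by 'delay'

clk = [0, 1, 1, 0, 0, 0, 0, 0]                     # 2-sample-wide pulse
print(stretch_pulse(clk, delay=2))                 # [0, 1, 1, 1, 1, 0, 0, 0] -> wider pulse
```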
  • Patent number: 11580150
    Abstract: Some embodiments may perform operations of a process that includes obtaining a natural language text document and using a machine learning model to generate a set of attributes based on a set of machine-learning-model-generated classifications in the document. The process may include performing hierarchical data extraction operations to populate the attributes, where different machine learning models may be used in sequence. The process may include using a pre-trained Bidirectional Encoder Representations from Transformers (BERT) model augmented with a pooling operation to determine a BERT output via a multi-channel transformer model to generate vectors on a per-sentence level or other per-text-section level. The process may include using a finer-grain model to extract quantitative or categorical values of interest, where the context of the per-sentence level may be retained for the finer-grain model.
    Type: Grant
    Filed: July 29, 2022
    Date of Patent: February 14, 2023
    Assignee: Dsilo, Inc.
    Inventors: Jaya Prakash Narayana Gutta, Sharad Malhautra, Lalit Gupta
  • Publication number: 20230038529
    Abstract: A process includes obtaining a document, determining a set of vectors based on a count of n-grams of the document, and determining a first set of information based on the document using a first set of neural networks. The process includes selecting a text section of the document using a second set of neural networks, and selecting a code template from a plurality of code templates based on the first set of information and the text section. The process includes determining an entity identifier, a value of a conditional statement, a second set of information, and a third set of information based on the text section, the first set of information, and the code template. The process includes generating a first set of program code based on the entity identifier, the value, the second set of information, and the third set of information.
    Type: Application
    Filed: July 29, 2022
    Publication date: February 9, 2023
    Inventors: Jaya Prakash Narayana Gutta, Sharad Malhautra, Lalit Gupta
  • Publication number: 20230044421
    Abstract: Various implementations described herein are related to a device having memory circuitry activated by a power-gated supply. The device may include level shifting circuitry that receives a switch control signal in a first voltage domain, shifts the switch control signal in the first voltage domain to a second voltage domain, and provides the switch control signal in the second voltage domain. The device may include power-gating circuitry activated by the switch control signal in the second voltage domain, and the power-gating circuitry may provide the power-gated supply to the memory circuitry to trigger activation of the memory circuitry with the power-gated supply when activated by the switch control signal in the second voltage domain.
    Type: Application
    Filed: October 10, 2022
    Publication date: February 9, 2023
    Inventors: Lalit Gupta, Cyrille Nicolas Dray, El Mehdi Boujamaa
  • Patent number: 11574660
    Abstract: In a particular implementation, a circuit comprises: a memory array including a plurality of bit cells, where each of the bit cells is coupled to a respective bit path; a first multiplexer comprising a plurality of column address locations, where each of the plurality of column address locations is coupled to the memory array and corresponds to a respective bit path capacitance; and a variable capacitance circuit coupled to a reference path and configured to substantially match reference path capacitance to each of the respective bit path capacitances.
    Type: Grant
    Filed: August 11, 2020
    Date of Patent: February 7, 2023
    Assignee: Arm Limited
    Inventors: Lalit Gupta, Nimish Sharma, Hetansh Pareshbhai Shah, Bo Zheng
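The capacitance-matching idea above can be sketched as choosing how many unit capacitors to switch onto the reference path so that it tracks the bit-path capacitance of the selected column address; the capacitor values below are placeholders, not figures from the patent.

```python
# Sketch of the capacitance-matching idea (all values are placeholders): switch
# enough unit capacitors onto the reference path to track the selected bit path.
UNIT_CAP_F = 0.5e-15                     # one selectable unit capacitor
BIT_PATH_CAP_F = {0: 3.1e-15, 1: 4.2e-15, 2: 5.0e-15, 3: 6.3e-15}  # per column address

def reference_cap_setting(column_address):
    target = BIT_PATH_CAP_F[column_address]
    units = round(target / UNIT_CAP_F)   # number of unit caps to enable
    return units, units * UNIT_CAP_F     # setting and resulting reference capacitance

for col in BIT_PATH_CAP_F:
    units, c_ref = reference_cap_setting(col)
    print(f"column {col}: enable {units} units -> C_ref ~ {c_ref*1e15:.1f} fF")
```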
  • Publication number: 20230037077
    Abstract: Some embodiments may perform operations of a process that includes obtaining a natural language text document and using a machine learning model to generate a set of attributes based on a set of machine-learning-model-generated classifications in the document. The process may include performing hierarchical data extraction operations to populate the attributes, where different machine learning models may be used in sequence. The process may include using a pre-trained Bidirectional Encoder Representations from Transformers (BERT) model augmented with a pooling operation to determine a BERT output via a multi-channel transformer model to generate vectors on a per-sentence level or other per-text-section level. The process may include using a finer-grain model to extract quantitative or categorical values of interest, where the context of the per-sentence level may be retained for the finer-grain model.
    Type: Application
    Filed: July 29, 2022
    Publication date: February 2, 2023
    Inventors: Jaya Prakash Narayana Gutta, Sharad Malhautra, Lalit Gupta
  • Publication number: 20220406371
    Abstract: A machine memory includes multiple memory cells. Word lines, each with at least one word line driver, are coupled to the memory cells along rows. The word line drivers of at least some adjacent pairs of the word lines are coupled together by a pull-down transistor, in a manner that reduces read disturb of the memory cells.
    Type: Application
    Filed: June 17, 2021
    Publication date: December 22, 2022
    Applicant: NVIDIA Corp.
    Inventors: Lalit Gupta, Andreas Jon Gotterba, Jesse Wang, Stefan P. Sywyk
  • Patent number: 11520815
    Abstract: Some embodiments may obtain a natural language question, determine a context of the natural language question, and generate a first vector based on the natural language question using encoder neural network layers. Some embodiments may access a data table comprising column names, generate vectors based on the column names, and determine attention scores based on the vectors. Some embodiments may update the vectors based on the attention scores, generate a second vector based on the natural language question, and determine a set of strings comprising a name from the column names and a database language operator based on the vectors. Some embodiments may determine a value based on the determined database language operator and the name, using a transformer neural network model. Some embodiments may generate a query based on the set of strings and the value.
    Type: Grant
    Filed: July 29, 2022
    Date of Patent: December 6, 2022
    Assignee: Dsilo, Inc.
    Inventors: Jaya Prakash Narayana Gutta, Sharad Malhautra, Lalit Gupta
  • Patent number: 11468943
    Abstract: Various implementations described herein are related to a device having memory circuitry activated by a power-gated supply. The device may include level shifting circuitry that receives a switch control signal in a first voltage domain, shifts the switch control signal in the first voltage domain to a second voltage domain, and provides the switch control signal in the second voltage domain. The device may include power-gating circuitry activated by the switch control signal in the second voltage domain, and the power-gating circuitry may provide the power-gated supply to the memory circuitry to trigger activation of the memory circuitry with the power-gated supply when activated by the switch control signal in the second voltage domain.
    Type: Grant
    Filed: July 29, 2020
    Date of Patent: October 11, 2022
    Assignee: Arm Limited
    Inventors: Lalit Gupta, Cyrille Nicolas Dray, El Mehdi Boujamaa
  • Patent number: 11386937
    Abstract: Various implementations described herein refer to a method for providing single port memory with a bitcell array arranged in columns and rows. The method may include coupling a wordline to the single port memory including coupling the wordline to the columns of the bitcell array. The method may include performing multiple memory access operations concurrently in the single port memory including performing a read operation in one column of the bitcell array using the wordline while performing a write operation in another column of the bitcell array using the wordline, or performing a write operation in one column of the bitcell array using the wordline while performing a read operation in another column of the bitcell array using the same wordline.
    Type: Grant
    Filed: October 12, 2019
    Date of Patent: July 12, 2022
    Assignee: Arm Limited
    Inventors: Lalit Gupta, Nicolaas Klarinus Johannes Van Winkelhoff, Bo Zheng, El Mehdi Boujamaa, Fakhruddin Ali Bohra
  • Patent number: 11277133
    Abstract: Various implementations described herein are related to a device having level shifter circuitry configured to receive isolation control signals in a first voltage domain and provide an output signal in a second voltage domain that is different than the first voltage domain. The device may include isolation logic circuitry configured to receive a data input signal in the first voltage domain and then provide the isolation control signals to the level shifter circuitry in the first voltage domain based on the data input signal. The isolation logic circuitry may include control passgates that enable the data input signal to propagate to the level shifter circuitry via the isolation control signals.
    Type: Grant
    Filed: August 21, 2020
    Date of Patent: March 15, 2022
    Assignee: Arm Limited
    Inventors: Lalit Gupta, El Mehdi Boujamaa, Tirdad Anthony Takeshian