Patents by Inventor Atul Lele

Atul Lele has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240103811
Abstract: In one example, a neural network processor comprises an input data register, a weights register, a computing engine configurable to perform multiplication and accumulation (MAC) operations between input data elements of a range of input precisions and weight elements of a range of weight precisions, and a controller. The controller is configured to: receive a first indication of a particular input precision and a second indication of a particular weight precision, and configure the computing engine based on the first and second indications. The controller is also configured to, responsive to an instruction: fetch input data elements and weight elements to the computing engine; and perform, using the computing engine configured based on the first and second indications, MAC operations between the input data elements at the particular input precision and the weight elements at the particular weight precision to generate intermediate output data elements.
    Type: Application
    Filed: July 20, 2023
    Publication date: March 28, 2024
    Applicant: Texas Instruments Incorporated
    Inventors: Mahesh M Mehendale, Atul Lele, Nagendra Gulur, Hetul Sanghvi, Srinivasa BS Chakravarthy
  • Publication number: 20240103875
    Abstract: In one example, a neural network processor comprises a memory interface, an instruction buffer, a weights buffer, an input data register, a weights register, an output data register, a computing engine, and a controller. The controller is configured to: receive a first instruction from the instruction buffer; responsive to the first instruction, fetch input data elements from the memory interface to the input data register, and fetch weight elements from the weights buffer to the weights register. The controller is also configured to: receive a second instruction from the instruction buffer; and responsive to the second instruction: fetch the input data elements and the weight elements from, respectively, the input data register and the weights register to the computing engine; and perform, using the computing engine, computation operations between the input data elements and the weight elements to generate output data elements.
    Type: Application
    Filed: July 20, 2023
    Publication date: March 28, 2024
    Applicant: Texas Instruments Incorporated
    Inventors: Mahesh M Mehendale, Nagendra Gulur, Srinivasa BS Chakravarthy, Atul Lele, Hetul Sanghvi
  • Publication number: 20240104361
Abstract: In one example, a neural network processor comprises a computing engine and a post-processing engine, the post-processing engine configurable to perform different post-processing operations for a range of output precisions and a range of weight precisions. The neural network processor further comprises a controller configured to: receive a first indication of a particular output precision, a second indication of a particular weight precision, and post-processing parameters; and configure the post-processing engine based on the first and second indications and the post-processing parameters. The controller is further configured to, responsive to a first instruction, perform, using the computing engine, multiplication and accumulation operations between input data elements and weight elements to generate intermediate data elements.
    Type: Application
    Filed: July 20, 2023
    Publication date: March 28, 2024
    Applicant: Texas Instruments Incorporated
    Inventors: Mahesh M Mehendale, Hetul Sanghvi, Nagendra Gulur, Atul Lele, Srinivasa BS Chakravarthy
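The abstracts above summarize patent claims for hardware, not software, but the configurable-precision MAC described in publication 20240103811 can be illustrated with a short sketch. All names here (`quantize`, `mac`, the bit-width parameters) are assumptions chosen for illustration, not identifiers from the patent.

```python
# Illustrative sketch only: the patent describes a hardware computing engine
# whose MAC datapath is configured for a chosen input precision and a chosen
# weight precision before the operation runs.

def quantize(values, bits):
    """Clamp integers to the signed two's-complement range of `bits` bits."""
    lo, hi = -(1 << (bits - 1)), (1 << (bits - 1)) - 1
    return [max(lo, min(hi, v)) for v in values]

def mac(inputs, weights, input_bits, weight_bits):
    """Multiply-accumulate after 'configuring' for the two precisions.

    The two bit-width arguments stand in for the first and second
    indications the controller receives in the abstract.
    """
    x = quantize(inputs, input_bits)
    w = quantize(weights, weight_bits)
    # Accumulate into a wide intermediate, as a hardware accumulator would.
    return sum(a * b for a, b in zip(x, w))
```

For example, an 8-bit-input / 4-bit-weight configuration and a 16-bit-input / 8-bit-weight configuration reuse the same engine with different clamping ranges.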
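Publication 20240103875 describes a two-instruction flow: a first instruction fetches operands from the memory interface and weights buffer into registers, and a second instruction feeds those registers to the computing engine. A minimal sketch of that split, with the class and method names invented here for illustration:

```python
# Hedged sketch of the two-instruction flow; lists stand in for the
# hardware memory interface, buffers, and registers named in the abstract.

class Controller:
    def __init__(self, memory, weights_buffer):
        self.memory = memory                  # stands in for the memory interface
        self.weights_buffer = weights_buffer  # pre-loaded weights buffer
        self.input_reg = []                   # input data register
        self.weight_reg = []                  # weights register
        self.output_reg = []                  # output data register

    def load(self, addr, n):
        """First instruction: fetch operands into the registers."""
        self.input_reg = self.memory[addr:addr + n]
        self.weight_reg = self.weights_buffer[:n]

    def compute(self):
        """Second instruction: feed the registers to the computing engine."""
        self.output_reg = [a * b for a, b in
                           zip(self.input_reg, self.weight_reg)]
        return self.output_reg
```

Separating fetch from compute this way lets the next operand load overlap the current computation, which is a plausible motivation for the two-instruction design, though the abstract itself does not state one.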
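Publication 20240104361 adds a post-processing engine configured by an output precision and post-processing parameters. The abstract does not specify the operations, so the shift-and-clamp requantization below is only an assumed example of the kind of step such an engine might perform on wide intermediate accumulations.

```python
# Assumed example only: requantize wide intermediate MAC results down to a
# configured output precision via an arithmetic right shift and a clamp.

def post_process(intermediate, output_bits, shift):
    """Scale intermediates by 2**-shift, then clamp to `output_bits` bits."""
    lo, hi = -(1 << (output_bits - 1)), (1 << (output_bits - 1)) - 1
    return [max(lo, min(hi, v >> shift)) for v in intermediate]
```

Here `output_bits` plays the role of the first indication and `shift` stands in for the post-processing parameters the controller receives in the abstract.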