Patents by Inventor Sudhakar Kamma

Sudhakar Kamma has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250094277
    Abstract: Methods and apparatus for low-latency, fail-operational Time Sensitive Networking. An apparatus includes an error detection and path switching (EDPS) circuit comprising circuitry to read data from volatile memory, detect whether the read data is errant, the errant data including one or more uncorrectable errors, and send a message to access correct data corresponding to the errant data stored in a non-volatile memory device, the message identifying the correct data to be returned to the EDPS circuit. The apparatus receives the correct data and enables the correct data to be read by circuitry coupled to the EDPS circuit. The EDPS circuit includes Error Correction Code (ECC) logic to detect uncorrectable errors, detect that the data has no ECC errors, and detect and correct single-bit errors to obtain corrected data. Data with no ECC errors and corrected data are immediately made available for reading by the circuitry coupled to the EDPS circuit. (An illustrative code sketch of this read path follows this entry.)
    Type: Application
    Filed: December 3, 2024
    Publication date: March 20, 2025
    Inventors: Kishore KASICHAINULA, Thierry BEAUMONT, Dhanumjai PASUMARTHY, Sudhakar KAMMA
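    As a rough illustration only, the following is a minimal C sketch of the read path described in the abstract above: clean or single-bit-corrected data is returned immediately, while uncorrectable data triggers a fetch of the correct copy held in non-volatile memory. The function names and the two stubbed platform hooks are assumptions made for illustration; this is not the patented EDPS design.

      #include <stdint.h>
      #include <string.h>

      typedef enum { ECC_OK, ECC_SINGLE_BIT_CORRECTED, ECC_UNCORRECTABLE } ecc_status_t;

      /* Assumed platform hooks, stubbed so the sketch stands alone. */
      static ecc_status_t ecc_check_and_correct(uint8_t *data, size_t len)
      {
          (void)data; (void)len;
          return ECC_OK;                     /* real ECC logic would decode and correct */
      }

      static void fetch_correct_copy_from_nvm(uint64_t addr, uint8_t *out, size_t len)
      {
          (void)addr; (void)out; (void)len;  /* real hook would message the non-volatile device */
      }

      /* Clean or single-bit-corrected data is made readable immediately;
       * uncorrectable data triggers a fetch of the correct copy from NVM. */
      static void edps_read(uint64_t addr, const uint8_t *volatile_copy,
                            uint8_t *out, size_t len)
      {
          memcpy(out, volatile_copy, len);
          switch (ecc_check_and_correct(out, len)) {
          case ECC_OK:
          case ECC_SINGLE_BIT_CORRECTED:
              return;
          case ECC_UNCORRECTABLE:
              fetch_correct_copy_from_nvm(addr, out, len);
              return;
          }
      }
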
  • Publication number: 20240403259
    Abstract: Methods and apparatus relating to techniques for data compression. In an example, an apparatus comprises a processor to receive a data compression instruction for a memory segment and, in response to the data compression instruction, to compress a sequence of identical memory values when the sequence is determined to have a length that exceeds a threshold. Other embodiments are also disclosed and claimed. (An illustrative code sketch of this run-length scheme follows this entry.)
    Type: Application
    Filed: July 22, 2024
    Publication date: December 5, 2024
    Applicant: Intel Corporation
    Inventors: Abhishek R. Appu, Altug Koker, Aravindh Anantaraman, Elmoustapha Ould-Ahmed-Vall, Joydeep Ray, Mike MacPherson, Valentin Andrei, Nicolas Galoppo Von Borries, Varghese George, Subramaniam Maiyuran, Vasanth Ranganathan, Jayakrishna P S, Pattabhiraman K, Sudhakar Kamma
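    As a rough illustration of the compression step described in the abstract above (replacing a run of identical memory values only when the run length exceeds a threshold), here is a minimal C sketch; the 3-byte token format and all names are assumptions, not the claimed implementation.

      #include <stddef.h>
      #include <stdint.h>

      #define RUN_MARKER 0xFFu   /* assumed token header; escaping of literal 0xFF bytes is omitted */

      /* Runs of identical bytes longer than `threshold` become a 3-byte token
       * (marker, value, length); shorter runs are copied through unchanged.
       * Caller sizes `out` at least as large as `in` and uses threshold >= 3. */
      static size_t compress_runs(const uint8_t *in, size_t n, uint8_t *out, size_t threshold)
      {
          size_t i = 0, o = 0;
          while (i < n) {
              size_t run = 1;
              while (i + run < n && in[i + run] == in[i] && run < 255)
                  run++;
              if (run > threshold) {
                  out[o++] = RUN_MARKER;       /* token header   */
                  out[o++] = in[i];            /* repeated value */
                  out[o++] = (uint8_t)run;     /* run length     */
              } else {
                  for (size_t k = 0; k < run; k++)
                      out[o++] = in[i + k];    /* literal copy   */
              }
              i += run;
          }
          return o;                            /* compressed size */
      }
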
  • Publication number: 20240329130
    Abstract: Techniques for debug of compute logic (e.g., part of a system on a chip) are described. In some examples, a plurality of infield testing debug and status registers store information about the execution of an infield test, and a plurality of infield control debug registers control the infield test, wherein access to the infield testing debug registers and the infield control debug registers is programmable. (An illustrative code sketch of such a register layout follows this entry.)
    Type: Application
    Filed: March 30, 2023
    Publication date: October 3, 2024
    Inventors: Rakesh KANDULA, Vasubabu RAVIPATI, Thierry BEAUMONT, Dhanumjai PASUMARTHY, Sudhakar KAMMA
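    One plausible way to picture the register arrangement described in the abstract above is a memory-mapped block of status and control registers guarded by a programmable access field. The C layout below is a guess for illustration only; the offsets, field names, and access bits are invented, not taken from the application.

      #include <stdint.h>

      /* Hypothetical layout: status registers report infield-test execution,
       * control registers configure and start the test, and access_ctrl
       * programs which agents may touch the block. */
      typedef struct {
          volatile uint32_t test_status[4];    /* progress, pass/fail, error codes */
          volatile uint32_t test_control[4];   /* start, mode, coverage selection  */
          volatile uint32_t access_ctrl;       /* programmable access permissions  */
      } infield_debug_regs_t;

      enum { ACCESS_DEBUGGER = 1u << 0, ACCESS_FIRMWARE = 1u << 1 };

      /* Program which agents are allowed to use the debug registers. */
      static inline void set_debug_access(infield_debug_regs_t *regs, uint32_t agents)
      {
          regs->access_ctrl = agents;
      }
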
  • Patent number: 12093210
    Abstract: Methods and apparatus relating to techniques for data compression. In an example, an apparatus comprises a processor to receive a data compression instruction for a memory segment and, in response to the data compression instruction, to compress a sequence of identical memory values when the sequence is determined to have a length that exceeds a threshold. Other embodiments are also disclosed and claimed.
    Type: Grant
    Filed: March 14, 2020
    Date of Patent: September 17, 2024
    Assignee: Intel Corporation
    Inventors: Abhishek R. Appu, Altug Koker, Aravindh Anantaraman, Elmoustapha Ould-Ahmed-Vall, Joydeep Ray, Mike Macpherson, Valentin Andrei, Nicolas Galoppo Von Borries, Varghese George, Subramaniam Maiyuran, Vasanth Ranganathan, Jayakrishna P S, K Pattabhiraman, Sudhakar Kamma
  • Patent number: 11593260
    Abstract: An apparatus to facilitate memory data compression is disclosed. The apparatus includes a memory having a plurality of banks to store main data and metadata associated with the main data, and a memory management unit (MMU) coupled to the plurality of banks to perform a hash function that computes indices into virtual address locations in memory for the main data and the metadata, and to adjust the metadata virtual address locations so that each adjusted metadata virtual address location resides in the bank storing the associated main data. (An illustrative code sketch of this placement scheme follows this entry.)
    Type: Grant
    Filed: March 30, 2021
    Date of Patent: February 28, 2023
    Assignee: Intel Corporation
    Inventors: Abhishek R. Appu, Altug Koker, Joydeep Ray, Niranjan Cooray, Prasoonkumar Surti, Sudhakar Kamma, Vasanth Ranganathan
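    The placement idea in the abstract above, keeping each piece of metadata in the same memory bank as its main data, can be sketched roughly in C as follows; the hash function and the adjustment step are placeholders, not the patented MMU logic.

      #include <stdint.h>

      #define NUM_BANKS 8u
      #define PAGE_SIZE 4096u

      /* Placeholder hash from a virtual address to a bank index. */
      static uint32_t bank_hash(uint64_t vaddr)
      {
          uint64_t page = vaddr / PAGE_SIZE;
          return (uint32_t)((page ^ (page >> 3)) % NUM_BANKS);
      }

      /* Adjust a metadata virtual address until it hashes to the same bank as
       * the main data it describes, so both live in one bank. */
      static uint64_t place_metadata(uint64_t main_vaddr, uint64_t meta_vaddr)
      {
          uint32_t target = bank_hash(main_vaddr);
          while (bank_hash(meta_vaddr) != target)
              meta_vaddr += PAGE_SIZE;         /* step one page at a time */
          return meta_vaddr;
      }
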
  • Patent number: 11574382
    Abstract: Examples described herein relate to a decompression engine that can request compressed data to be transferred over a memory bus. In some cases, the memory bus has a width that requires multiple data transfers to deliver the requested data. When requested data is to be presented in order to the decompression engine, a re-order buffer can be used to store entries of data. When a head-of-line entry is received, the entry can be provided to the decompression engine. When the last entry in a group of one or more entries is received, all entries in the group are presented in order to the decompression engine. In some examples, the decompression engine can borrow memory resources allocated for use by another memory client to expand the size of the re-order buffer available for use. For example, a memory client with excess capacity and the slowest growth rate can be chosen to borrow memory resources from. (An illustrative code sketch of the in-order release behavior follows this entry.)
    Type: Grant
    Filed: September 3, 2021
    Date of Patent: February 7, 2023
    Assignee: Intel Corporation
    Inventors: Abhishek R. Appu, Eric G. Liskay, Prasoonkumar Surti, Sudhakar Kamma, Karthik Vaidyanathan, Rajasekhar Pantangi, Altug Koker, Abhishek Rhisheekesan, Shashank Lakshminarayana, Priyanka Ladda, Karol A. Szerszen
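    The in-order release behavior described in the abstract above can be modeled with a small re-order buffer in which out-of-order bus returns are held until the head-of-line slot fills. The C structure and names below are illustrative assumptions, and the borrowing of another client's memory is omitted.

      #include <stdbool.h>
      #include <stdint.h>
      #include <stdio.h>

      #define ROB_SLOTS 8u

      /* Hypothetical re-order buffer: bus returns may arrive out of order,
       * but entries are handed to the decompression engine strictly in order. */
      typedef struct {
          uint64_t data[ROB_SLOTS];
          bool     valid[ROB_SLOTS];
          unsigned head;               /* next slot to present in order */
      } reorder_buf_t;

      /* Record a return tagged with its slot, then drain every consecutive
       * valid entry starting at the head so the engine sees them in order. */
      static void rob_insert(reorder_buf_t *rob, unsigned slot, uint64_t value)
      {
          rob->data[slot]  = value;
          rob->valid[slot] = true;
          while (rob->valid[rob->head]) {
              printf("present slot %u (data %llu) to decompressor\n",
                     rob->head, (unsigned long long)rob->data[rob->head]);
              rob->valid[rob->head] = false;
              rob->head = (rob->head + 1) % ROB_SLOTS;
          }
      }
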
  • Publication number: 20220129265
    Abstract: Methods and apparatus relating to techniques for data compression. In an example, an apparatus comprises a processor to receive a data compression instruction for a memory segment and, in response to the data compression instruction, to compress a sequence of identical memory values when the sequence is determined to have a length that exceeds a threshold. Other embodiments are also disclosed and claimed.
    Type: Application
    Filed: March 14, 2020
    Publication date: April 28, 2022
    Applicant: Intel Corporation
    Inventors: Abhishek R. Appu, Altug Koker, Aravindh Anantaraman, Elmoustapha Ould-Ahmed-Vall, Joydeep Ray, Mike Macpherson, Valentin Andrei, Nicolas Galoppo Von Borries, Varghese George, Subramaniam Maiyuran, Vasanth Ranganathan, Jayakrishna P S, K Pattabhiraman, Sudhakar Kamma
  • Publication number: 20220058765
    Abstract: Examples described herein relate to a decompression engine that can request compressed data to be transferred over a memory bus. In some cases, the memory bus has a width that requires multiple data transfers to deliver the requested data. When requested data is to be presented in order to the decompression engine, a re-order buffer can be used to store entries of data. When a head-of-line entry is received, the entry can be provided to the decompression engine. When the last entry in a group of one or more entries is received, all entries in the group are presented in order to the decompression engine. In some examples, the decompression engine can borrow memory resources allocated for use by another memory client to expand the size of the re-order buffer available for use. For example, a memory client with excess capacity and the slowest growth rate can be chosen to borrow memory resources from.
    Type: Application
    Filed: September 3, 2021
    Publication date: February 24, 2022
    Inventors: Abhishek R. APPU, Eric G. LISKAY, Prasoonkumar SURTI, Sudhakar KAMMA, Karthik VAIDYANATHAN, Rajasekhar PANTANGI, Altug KOKER, Abhishek RHISHEEKESAN, Shashank LAKSHMINARAYANA, Priyanka LADDA, Karol A. SZERSZEN
  • Patent number: 11113783
    Abstract: Examples described herein relate to a decompression engine that can request compressed data to be transferred over a memory bus. In some cases, the memory bus has a width that requires multiple data transfers to deliver the requested data. When requested data is to be presented in order to the decompression engine, a re-order buffer can be used to store entries of data. When a head-of-line entry is received, the entry can be provided to the decompression engine. When the last entry in a group of one or more entries is received, all entries in the group are presented in order to the decompression engine. In some examples, the decompression engine can borrow memory resources allocated for use by another memory client to expand the size of the re-order buffer available for use. For example, a memory client with excess capacity and the slowest growth rate can be chosen to borrow memory resources from.
    Type: Grant
    Filed: November 13, 2019
    Date of Patent: September 7, 2021
    Assignee: Intel Corporation
    Inventors: Abhishek R. Appu, Eric G. Liskay, Prasoonkumar Surti, Sudhakar Kamma, Karthik Vaidyanathan, Rajasekhar Pantangi, Altug Koker, Abhishek Rhisheekesan, Shashank Lakshminarayana, Priyanka Ladda, Karol A. Szerszen
  • Publication number: 20210255951
    Abstract: An apparatus to facilitate memory data compression is disclosed. The apparatus includes a memory having a plurality of banks to store main data and metadata associated with the main data, and a memory management unit (MMU) coupled to the plurality of banks to perform a hash function that computes indices into virtual address locations in memory for the main data and the metadata, and to adjust the metadata virtual address locations so that each adjusted metadata virtual address location resides in the bank storing the associated main data.
    Type: Application
    Filed: March 30, 2021
    Publication date: August 19, 2021
    Applicant: Intel Corporation
    Inventors: Abhishek R. Appu, Altug Koker, Joydeep Ray, Niranjan Cooray, Prasoonkumar Surti, Sudhakar Kamma, Vasanth Ranganathan
  • Publication number: 20210142438
    Abstract: Examples described herein relate to a decompression engine that can request compressed data to be transferred over a memory bus. In some cases, the memory bus has a width that requires multiple data transfers to deliver the requested data. When requested data is to be presented in order to the decompression engine, a re-order buffer can be used to store entries of data. When a head-of-line entry is received, the entry can be provided to the decompression engine. When the last entry in a group of one or more entries is received, all entries in the group are presented in order to the decompression engine. In some examples, the decompression engine can borrow memory resources allocated for use by another memory client to expand the size of the re-order buffer available for use. For example, a memory client with excess capacity and the slowest growth rate can be chosen to borrow memory resources from.
    Type: Application
    Filed: November 13, 2019
    Publication date: May 13, 2021
    Inventors: Abhishek R. APPU, Eric G. LISKAY, Prasoonkumar SURTI, Sudhakar KAMMA, Karthik VAIDYANATHAN, Rajasekhar PANTANGI, Altug KOKER, Abhishek RHISHEEKESAN, Shashank LAKSHMINARAYANA, Priyanka LADDA, Karol A. SZERSZEN
  • Patent number: 10983906
    Abstract: An apparatus to facilitate memory data compression is disclosed. The apparatus includes a memory having a plurality of banks to store main data and metadata associated with the main data, and a memory management unit (MMU) coupled to the plurality of banks to perform a hash function that computes indices into virtual address locations in memory for the main data and the metadata, and to adjust the metadata virtual address locations so that each adjusted metadata virtual address location resides in the bank storing the associated main data.
    Type: Grant
    Filed: March 18, 2019
    Date of Patent: April 20, 2021
    Assignee: Intel Corporation
    Inventors: Abhishek R. Appu, Altug Koker, Joydeep Ray, Niranjan Cooray, Prasoonkumar Surti, Sudhakar Kamma, Vasanth Ranganathan
  • Publication number: 20200301826
    Abstract: An apparatus to facilitate memory data compression is disclosed. The apparatus includes a memory having a plurality of banks to store main data and metadata associated with the main data, and a memory management unit (MMU) coupled to the plurality of banks to perform a hash function that computes indices into virtual address locations in memory for the main data and the metadata, and to adjust the metadata virtual address locations so that each adjusted metadata virtual address location resides in the bank storing the associated main data.
    Type: Application
    Filed: March 18, 2019
    Publication date: September 24, 2020
    Applicant: Intel Corporation
    Inventors: Abhishek R. Appu, Altug Koker, Joydeep Ray, Niranjan Cooray, Prasoonkumar Surti, Sudhakar Kamma, Vasanth Ranganathan