Patents by Inventor Manish Mukul

Manish Mukul has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11907588
    Abstract: Aspects of the invention include identifying a first subsystem and a second subsystem of a plurality of subsystems respectively storing first compressed data and second compressed data, wherein the first compressed data and the second compressed data are fragments of requested data. A compression method used to compress the first compressed data and the second compressed data is identified. A first accelerator of the first subsystem and a second accelerator of the second subsystem are identified. The first compressed data is offloaded from a first local memory of the first subsystem to the first accelerator, and the second compressed data is offloaded from a second local memory of the second subsystem to the second accelerator, wherein offloading comprises providing a decompression method for the first compressed data and the second compressed data. (A hedged sketch of this offload scheme appears after this listing.)
    Type: Grant
    Filed: November 15, 2021
    Date of Patent: February 20, 2024
    Assignee: International Business Machines Corporation
    Inventors: Vishnupriya R, Mehulkumar J. Patel, Manish Mukul
  • Publication number: 20230153034
    Abstract: Aspects of the invention include identifying a first subsystem and a second subsystem of a plurality of subsystems respectively storing first compressed data and second compressed data, wherein the first compressed data and the second compressed data are fragments of requested data. A compression method used to compress the first compressed data and the second compressed data is identified. A first accelerator of the first subsystem and a second accelerator of the second subsystem are identified. The first compressed data is offloaded from a first local memory of the first subsystem to the first accelerator, and the second compressed data is offloaded from a second local memory of the second subsystem to the second accelerator, wherein offloading comprises providing a decompression method for the first compressed data and the second compressed data.
    Type: Application
    Filed: November 15, 2021
    Publication date: May 18, 2023
    Inventors: Vishnupriya R, Mehulkumar J. Patel, Manish Mukul
  • Patent number: 11144207
    Abstract: Embodiments herein describe using compression engines in a processor subsystem to compress only the data fragments stored locally. That is, an application may be allocated a buffer whose physical memory is spread across multiple processor subsystems. Rather than asking a single actor (e.g., a single host processor or compression engine) to compress all the fragments of the buffer, a compression library can instead instruct the individual compression engines in each of the processor subsystems to compress only the fragments stored in local memory in the same processor subsystem. Doing so leverages the memory affinity between the compression engines and the local memory, which can reduce the overall time required to perform compression. (A hedged sketch of this affinity-based scheme appears after this listing.)
    Type: Grant
    Filed: November 7, 2019
    Date of Patent: October 12, 2021
    Assignee: International Business Machines Corporation
    Inventors: Vishnupriya R, Manish Mukul, Mehulkumar Patel
  • Publication number: 20210141535
    Abstract: Embodiments herein describe using compression engines in a processor subsystem to compress only the data fragments stored locally. That is, an application may be allocated a buffer whose physical memory is spread across multiple processor subsystems. Rather than asking a single actor (e.g., a single host processor or compression engine) to compress all the fragments of the buffer, a compression library can instead instruct the individual compression engines in each of the processor subsystems to compress only the fragments stored in local memory in the same processor subsystem. Doing so leverages the memory affinity between the compression engines and the local memory, which can reduce the overall time required to perform compression.
    Type: Application
    Filed: November 7, 2019
    Publication date: May 13, 2021
    Inventors: Vishnupriya R, Manish Mukul, Mehulkumar Patel
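
The first pair of entries (patent 11907588 and publication 20230153034) describes offloading decompression of data fragments to an accelerator in the same subsystem as the local memory that holds each fragment. The sketch below is a minimal, hypothetical Python illustration of that idea; the SubsystemAccelerator class, the two-fragment layout, and the use of zlib as the stand-in decompression method are assumptions for illustration, not details taken from the filings.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

# Hypothetical model: each subsystem holds one compressed fragment of the
# requested data in its local memory and has an accelerator that can
# decompress that fragment in place.
class SubsystemAccelerator:
    def __init__(self, name, local_memory):
        self.name = name
        self.local_memory = local_memory  # compressed fragment stored locally

    def decompress(self, method):
        # The offload request names the decompression method to use; only
        # zlib is modeled here as a stand-in.
        if method != "zlib":
            raise ValueError(f"unsupported method: {method}")
        return zlib.decompress(self.local_memory)

def fetch_requested_data(subsystems, method="zlib"):
    # Offload each fragment to the accelerator of the subsystem that stores
    # it, so no compressed fragment crosses subsystem boundaries first.
    with ThreadPoolExecutor(max_workers=len(subsystems)) as pool:
        fragments = list(pool.map(lambda s: s.decompress(method), subsystems))
    return b"".join(fragments)

if __name__ == "__main__":
    parts = (b"fragment-one " * 50, b"fragment-two " * 50)
    subsystems = [
        SubsystemAccelerator("subsystem-0", zlib.compress(parts[0])),
        SubsystemAccelerator("subsystem-1", zlib.compress(parts[1])),
    ]
    assert fetch_requested_data(subsystems) == parts[0] + parts[1]
```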
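
The second pair of entries (patent 11144207 and publication 20210141535) describes a compression library that asks each processor subsystem's compression engine to compress only the buffer fragments resident in that subsystem's local memory, exploiting memory affinity instead of funneling the whole buffer through a single engine. The sketch below follows the same hedged conventions; the CompressionEngine class and the even split of the buffer across subsystems are hypothetical simplifications, since a real allocator would follow the actual page placement.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

# Hypothetical model: a buffer's physical pages are spread across several
# processor subsystems, and each subsystem has its own compression engine.
class CompressionEngine:
    def __init__(self, subsystem_id):
        self.subsystem_id = subsystem_id

    def compress_local(self, fragment):
        # Each engine compresses only the fragment held in its own
        # subsystem's local memory, preserving memory affinity.
        return self.subsystem_id, zlib.compress(fragment)

def compress_buffer(buffer, num_subsystems):
    # Split the buffer into per-subsystem fragments (an even split is assumed
    # here purely for illustration).
    step = -(-len(buffer) // num_subsystems)  # ceiling division
    fragments = [buffer[i:i + step] for i in range(0, len(buffer), step)]
    engines = [CompressionEngine(i) for i in range(len(fragments))]
    with ThreadPoolExecutor(max_workers=len(engines)) as pool:
        return list(pool.map(lambda pair: pair[0].compress_local(pair[1]),
                             zip(engines, fragments)))

if __name__ == "__main__":
    original = b"example buffer contents " * 1000
    compressed = compress_buffer(original, 4)
    restored = b"".join(zlib.decompress(blob) for _, blob in compressed)
    assert restored == original
```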