Patents by Inventor Narasinga Rao MINISKAR

Narasinga Rao MINISKAR has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Brief illustrative code sketches of the mechanisms described in several of these abstracts appear after the listing.

  • Patent number: 11693663
    Abstract: Methods and apparatus for managing circular queues are disclosed. A pointer designates an index position of a particular queue element and contains an additional pointer state, whereby two pointer values (split indexes) can designate the same index position. Front and rear pointers are respectively managed by dequeue and enqueue logic. The front pointer state and rear pointer state distinguish full and empty queue states when both pointers designate the same index position. Asynchronous dequeue and enqueue operations are supported, no lock is required, and no queue entry is wasted. Hardware and software embodiments for numerous applications are disclosed.
    Type: Grant
    Filed: October 13, 2021
    Date of Patent: July 4, 2023
    Assignee: UT-Battelle, LLC
    Inventors: Narasinga Rao Miniskar, Frank Y. Liu, Jeffrey S. Vetter
  • Patent number: 11593644
    Abstract: The present disclosure discloses a method and apparatus for determining the memory requirement for processing a DNN model on a device. The method includes receiving a DNN model for an input, wherein the DNN model includes a plurality of processing layers. The method includes generating a network graph of the DNN model. The method includes creating a colored network graph of the DNN model based on the identified execution order of the plurality of processing layers. The colored network graph indicates assignment of at least one memory buffer for storing at least one output of at least one processing layer. The method includes determining at least one buffer reuse overlap possibility across the plurality of processing layers. Based on the determined at least one buffer reuse overlap possibility, the method includes determining and assigning the memory required for processing the DNN model.
    Type: Grant
    Filed: August 8, 2018
    Date of Patent: February 28, 2023
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Narasinga Rao Miniskar, Sirish Kumar Pasupuleti, Raj Narayana Gadde, Ashok Vishnoi, Vasanthakumar Rajagopal, Chandra Kumar Ramasamy
  • Publication number: 20220188155
    Abstract: Apparatus and methods are disclosed for scheduling tasks in a heterogeneous computing environment. Coarse scheduling of a received task-set is performed centrally, with tasks dispatched to respective processing resources including one or more accelerators. At each accelerator, sub-tasks of a received task are identified, scheduled, and executed. Data-transfer and computation sub-tasks can be pipelined. The accelerator operates using small tiles of local data, which are transferred to or from a large shared reservoir of main memory. Sub-task scheduling can be customized to each accelerator; coarse task scheduling can work on larger tasks; both can be efficient. Simulations demonstrate large improvements in makespan and/or circuit area. Disclosed technologies are scalable and can be implemented in varying combinations of hard-wired or software modules. These technologies are widely applicable to high-performance computing, image classification, media processing, wireless coding, encryption, and other fields.
    Type: Application
    Filed: December 3, 2021
    Publication date: June 16, 2022
    Applicant: UT-Battelle, LLC
    Inventors: Narasinga Rao Miniskar, Frank Y. Liu, Aaron R. Young, Jeffrey S. Vetter, Dwaipayan Chakraborty
  • Publication number: 20220129275
    Abstract: Methods and apparatus for managing circular queues are disclosed. A pointer designates an index position of a particular queue element and contains an additional pointer state, whereby two pointer values (split indexes) can designate the same index position. Front and rear pointers are respectively managed by dequeue and enqueue logic. The front pointer state and rear pointer state distinguish full and empty queue states when both pointers designate the same index position. Asynchronous dequeue and enqueue operations are supported, no lock is required, and no queue entry is wasted. Hardware and software embodiments for numerous applications are disclosed.
    Type: Application
    Filed: October 13, 2021
    Publication date: April 28, 2022
    Applicant: UT-Battelle, LLC
    Inventors: Narasinga Rao Miniskar, Frank Y. Liu, Jeffrey S. Vetter
  • Publication number: 20200257972
    Abstract: The present disclosure discloses a method and apparatus for determining the memory requirement for processing a DNN model on a device. The method includes receiving a DNN model for an input, wherein the DNN model includes a plurality of processing layers. The method includes generating a network graph of the DNN model. The method includes creating a colored network graph of the DNN model based on the identified execution order of the plurality of processing layers. The colored network graph indicates assignment of at least one memory buffer for storing at least one output of at least one processing layer. The method includes determining at least one buffer reuse overlap possibility across the plurality of processing layers. Based on the determined at least one buffer reuse overlap possibility, the method includes determining and assigning the memory required for processing the DNN model.
    Type: Application
    Filed: August 8, 2018
    Publication date: August 13, 2020
    Applicant: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Narasinga Rao MINISKAR, Sirish Kumar PASUPULETI, Raj Narayana GADDE, Ashok VISHNOI, Vasanthakumar RAJAGOPAL, Chandra Kumar RAMASAMY
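
The split-index circular queue of patent 11693663 (and publication 20220129275) can be illustrated with a small sketch. This is not the patented hardware or software design; it is a minimal single-threaded Python model, assuming a fixed capacity and pointers that range over twice the capacity, so the extra wrap state distinguishes a full queue from an empty one when both pointers land on the same slot. The class and method names are invented for illustration.

```python
# Sketch of the split-index idea: pointer values live in [0, 2*capacity),
# so the slot index is pointer % capacity and the extra wrap state is
# pointer // capacity. In a concurrent setting, only dequeue logic writes
# `front` and only enqueue logic writes `rear`, which is why no lock is
# needed; this toy model is single-threaded.

class SplitIndexQueue:
    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = [None] * capacity
        self.front = 0   # advanced by dequeue logic
        self.rear = 0    # advanced by enqueue logic

    def is_empty(self):
        # Same pointer value: same slot and same wrap state.
        return self.front == self.rear

    def is_full(self):
        # Same slot but different wrap state: every slot is occupied.
        return (self.rear % self.capacity == self.front % self.capacity
                and self.rear != self.front)

    def enqueue(self, item):
        if self.is_full():
            raise IndexError("queue full")
        self.slots[self.rear % self.capacity] = item
        self.rear = (self.rear + 1) % (2 * self.capacity)

    def dequeue(self):
        if self.is_empty():
            raise IndexError("queue empty")
        item = self.slots[self.front % self.capacity]
        self.front = (self.front + 1) % (2 * self.capacity)
        return item


q = SplitIndexQueue(4)
for i in range(4):
    q.enqueue(i)
assert q.is_full() and not q.is_empty()      # all 4 slots usable, none wasted
assert [q.dequeue() for _ in range(4)] == [0, 1, 2, 3]
assert q.is_empty()
```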
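Patent 11593644 (and publication 20200257972) describes coloring a DNN network graph so that layer outputs can reuse memory buffers. The sketch below assumes a simple greedy interval-overlap heuristic rather than the disclosed coloring procedure: compute the live interval of each layer's output from the execution order, reuse a buffer whenever intervals do not overlap, and sum the resulting buffer sizes. The layer names, sizes, and toy graph are hypothetical.

```python
# Rough sketch of buffer-reuse planning for a DNN execution order.
# Each layer's output stays live from the step that produces it until
# the last step that consumes it; non-overlapping outputs share a buffer.

def plan_buffers(order, consumers, out_size):
    """order: layer names in execution order
    consumers: layer -> layers that read its output
    out_size: layer -> output size in bytes
    Returns (layer -> buffer id, total bytes needed)."""
    step = {name: i for i, name in enumerate(order)}
    interval = {
        name: (step[name],
               max([step[c] for c in consumers.get(name, [])] + [step[name]]))
        for name in order
    }
    buffers = []        # per buffer: (last step in use, max size seen so far)
    assignment = {}
    for name in order:
        start, end = interval[name]
        for b, (busy_until, size) in enumerate(buffers):
            if busy_until < start:           # no overlap: reuse this buffer
                buffers[b] = (end, max(size, out_size[name]))
                assignment[name] = b
                break
        else:                                # overlaps everything: new buffer
            assignment[name] = len(buffers)
            buffers.append((end, out_size[name]))
    return assignment, sum(size for _, size in buffers)


# Hypothetical graph: conv1 -> relu1 -> conv2 -> add (add also reads relu1).
order = ["conv1", "relu1", "conv2", "add"]
consumers = {"conv1": ["relu1"], "relu1": ["conv2", "add"], "conv2": ["add"]}
sizes = {"conv1": 4096, "relu1": 4096, "conv2": 2048, "add": 2048}
assignment, total = plan_buffers(order, consumers, sizes)
print(assignment, total)   # conv2 reuses conv1's buffer; total 10240 bytes
```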
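Publication 20220188155 describes coarse central scheduling of tasks onto accelerators combined with per-accelerator sub-task scheduling that pipelines data-transfer and compute sub-tasks over small tiles. The sketch below is a toy Python model under those assumptions, not the disclosed scheduler: the coarse level greedily dispatches each task to the earliest-free accelerator, and each accelerator's task time follows the standard two-stage pipeline bound for overlapped transfer and compute. All task parameters are made up.

```python
# Toy two-level scheduling model: coarse dispatch of whole tasks, plus a
# per-task estimate that assumes tile i+1's transfer overlaps tile i's compute.

def accelerator_makespan(n_tiles, transfer_t, compute_t):
    """Time to finish one task split into n_tiles on one accelerator,
    with double-buffered transfer/compute overlap."""
    if n_tiles == 0:
        return 0.0
    return transfer_t + (n_tiles - 1) * max(transfer_t, compute_t) + compute_t

def coarse_schedule(tasks, n_accelerators):
    """tasks: list of (n_tiles, transfer_t, compute_t).
    Greedily dispatch each task to the accelerator that frees up earliest."""
    finish = [0.0] * n_accelerators
    placement = []
    for task in tasks:
        a = min(range(n_accelerators), key=lambda i: finish[i])
        finish[a] += accelerator_makespan(*task)
        placement.append(a)
    return placement, max(finish)


tasks = [(8, 1.0, 2.0), (4, 2.0, 1.0), (16, 0.5, 0.5), (8, 1.0, 1.0)]
placement, makespan = coarse_schedule(tasks, n_accelerators=2)
print(placement, makespan)   # e.g. [0, 1, 1, 0] with makespan 26.0
```

Overlapping transfers with computation is what lets each accelerator work on small local tiles while the coarse scheduler reasons only about larger tasks, which is the division of labor the abstract describes.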