Patents by Inventor Madhu Gumma

Madhu Gumma has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 7925899
    Abstract: Assertions of sentinel nodes within a device may be used to calculate device power consumption at runtime. These power estimates may be used to estimate a device temperature or leakage power, and/or may be used to make device throttling decisions. (See the first sketch after this listing.)
    Type: Grant
    Filed: December 29, 2005
    Date of Patent: April 12, 2011
    Assignee: Intel Corporation
    Inventors: Madhu Gumma, Anand Ramachandran, Eric Samson
  • Publication number: 20070157035
    Abstract: Assertions of sentinel nodes within a device may be used to calculate device power consumption at runtime. These power estimates may be used to estimate a device temperature or leakage power, and/or may be used to make device throttling decisions.
    Type: Application
    Filed: December 29, 2005
    Publication date: July 5, 2007
    Inventors: Madhu Gumma, Anand Ramachandran, Eric Samson
  • Publication number: 20060018330
    Abstract: Provided are a method, system, and program for managing memory requests for logic blocks or clients of a device. In one embodiment, busses are separated by the type of data they carry. In another aspect, data transfers are decoupled from the memory requests that initiate them. In another aspect, clients competing for busses are arbitrated, and selected memory requests may be assigned a programmable, higher priority than other memory operations of a similar type. (See the second sketch after this listing.)
    Type: Application
    Filed: June 30, 2004
    Publication date: January 26, 2006
    Inventors: Ashish Choubal, Madhu Gumma, Christopher Foulds, Mohannad Noah
  • Publication number: 20050210202
    Abstract: Provided are a method, system, and program for managing Input/Output (I/O) requests in a cache memory system. A request is received to access data at a memory address in a first memory device, wherein data in the first memory device is cached in a second memory device. If the requested data is not in the second memory device, a determination is made as to whether to fetch the requested data from the first memory device into the second memory device. If the decision is not to fetch, the requested data is accessed in the first memory device and the second memory device is bypassed to execute the request. (See the third sketch after this listing.)
    Type: Application
    Filed: March 19, 2004
    Publication date: September 22, 2005
    Inventors: Ashish Choubal, Christopher Foulds, Madhu Gumma, Quang Le
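
The first two entries above (patent 7925899 and publication 20070157035) describe estimating a device's runtime power from assertions of sentinel nodes and using the estimate for temperature or leakage estimation and for throttling decisions. The sketch below is an illustrative reconstruction only, not the patented implementation: the node names, per-assertion energy weights, leakage model, and throttle threshold are all hypothetical.

```python
import math

# Illustrative sketch only: estimate device power from sentinel-node
# assertion counts sampled over an interval, then make a throttling
# decision. All constants below are hypothetical.

# Hypothetical energy (joules) attributed to one assertion of each
# sentinel node, e.g. from pre-silicon characterization.
NODE_ENERGY_WEIGHTS = {
    "alu_issue": 2.0e-9,
    "cache_access": 1.5e-9,
    "bus_transfer": 3.0e-9,
}

THROTTLE_THRESHOLD_WATTS = 5.0  # hypothetical power budget


def estimate_dynamic_power(assertion_counts, interval_s):
    """Weighted sum of assertion counts divided by the sampling interval."""
    energy_j = sum(NODE_ENERGY_WEIGHTS[node] * count
                   for node, count in assertion_counts.items())
    return energy_j / interval_s


def estimate_leakage_power(temperature_c, base_leakage_w=0.5, k=0.02):
    """Toy exponential leakage-vs-temperature model (illustrative only)."""
    return base_leakage_w * math.exp(k * (temperature_c - 25.0))


def should_throttle(assertion_counts, interval_s, temperature_c):
    total_w = (estimate_dynamic_power(assertion_counts, interval_s)
               + estimate_leakage_power(temperature_c))
    return total_w > THROTTLE_THRESHOLD_WATTS


# Example: counts sampled over a 1 ms window with the die at 60 C.
counts = {"alu_issue": 1_200_000, "cache_access": 800_000, "bus_transfer": 300_000}
print(should_throttle(counts, 1e-3, 60.0))  # True: ~5.5 W exceeds the 5 W budget
```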
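
Publication 20060018330 describes arbitrating memory requests from competing clients, separating busses by the type of data they carry, decoupling data transfers from the requests that initiate them, and giving selected requests a programmable higher priority. The sketch below illustrates only the arbitration-with-programmable-priority idea; the client names, priority values, and heap-based arbiter are assumptions, since the abstract does not specify an arbitration algorithm.

```python
import heapq
import itertools

# Illustrative sketch only: arbitrate memory requests from multiple
# clients, with a programmable priority boost for selected clients and
# the eventual data transfer routed to a bus chosen by data type.


class MemoryArbiter:
    def __init__(self):
        self._queue = []                  # (negated priority, order, request)
        self._order = itertools.count()   # FIFO tie-break among equal priorities
        self.priority_overrides = {}      # programmable: client name -> boost

    def submit(self, client, op, address, priority=0):
        # Apply the programmable boost, if one is configured for this client.
        priority += self.priority_overrides.get(client, 0)
        request = {"client": client, "op": op, "address": address}
        heapq.heappush(self._queue, (-priority, next(self._order), request))

    def grant(self):
        """Select the next request. The data transfer itself is decoupled:
        it would later be scheduled on a bus separated by data type."""
        if not self._queue:
            return None
        _, _, request = heapq.heappop(self._queue)
        bus = "read_data_bus" if request["op"] == "read" else "write_data_bus"
        return request, bus


arbiter = MemoryArbiter()
arbiter.priority_overrides["display_engine"] = 10   # e.g. favor latency-sensitive traffic
arbiter.submit("texture_unit", "read", 0x1000)
arbiter.submit("display_engine", "read", 0x2000)
print(arbiter.grant())   # the boosted display_engine request is granted first
```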
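
Publication 20050210202 describes deciding, when requested data is not in a second (cache) memory device, whether to fetch it into that cache or to bypass the cache and access the first (backing) memory device directly. The sketch below is a minimal illustration under an assumed size-based heuristic; the abstract does not say how the fetch-or-bypass decision is actually made.

```python
# Illustrative sketch only: on a miss in the second memory device (cache),
# decide whether to fetch the requested data into it or to bypass it and
# read the first memory device (backing store) directly. The size-based
# bypass heuristic and the constants are hypothetical.

CACHE_LINE_BYTES = 64
BYPASS_SIZE_BYTES = 4096   # assumed: large streaming reads skip the cache


class CachedMemory:
    def __init__(self, backing):
        self.backing = backing   # "first memory device"
        self.cache = {}          # "second memory device": address -> cached line

    def read(self, address, length):
        if address in self.cache:                 # hit: serve from the cache
            return self.cache[address][:length]
        if length >= BYPASS_SIZE_BYTES:           # decide not to fetch: bypass
            return self.backing[address:address + length]
        # Otherwise fetch a line into the cache and serve the request from it.
        line = self.backing[address:address + max(length, CACHE_LINE_BYTES)]
        self.cache[address] = line
        return line[:length]


backing = bytes(range(256)) * 64                  # 16 KiB backing store
dev = CachedMemory(backing)
print(len(dev.read(0, 8192)), len(dev.cache))     # 8192 0 -> bypassed, nothing cached
print(len(dev.read(128, 16)), len(dev.cache))     # 16 1  -> fetched into the cache
```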