Patents by Inventor Mariama Ndoye

Mariama Ndoye is named as an inventor on the patent filings listed below. The listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9164915
    Abstract: Embodiments of the disclosure include a method for reserving large pages in a large frame area (LFAREA) of a main memory. The method includes pre-scanning a plurality of storage increments, counting the number of available large pages that are online, and issuing a message that indicates that number. The method also includes receiving and parsing an LFAREA request that specifies a target number of large pages to be reserved, and calculating an optimal number of large pages to reserve based on the target number, the number of available pages, and a system limit. The method then determines whether the LFAREA request is valid and can be satisfied, scans the storage increments, and reserves the optimal number of pages in the LFAREA.
    Type: Grant
    Filed: January 15, 2013
    Date of Patent: October 20, 2015
    Assignee: International Business Machines Corporation
    Inventors: Alfred F. Foster, David Hom, Charles E. Mari, Matthew J. Mauriello, Robert Miller, Jr., Mariama Ndoye, Scott B. Tuttle, Elpida Tzortzatos
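    Sketch: A minimal C illustration of the reservation flow summarized in patent 9164915 above (pre-scan and count, clamp the target to availability and a system limit, validate, then scan and reserve). This is not the patented implementation; the structure and function names, the SYSTEM_LIMIT value, and the two-pass scan are assumptions made purely for illustration.
      /* Hypothetical sketch of the LFAREA reservation flow in patent 9164915. */
      #include <stdio.h>

      #define SYSTEM_LIMIT 1024          /* assumed cap on reservable large pages */

      struct storage_increment {
          int online;                    /* non-zero if the increment is online */
          int free_large_pages;          /* large pages currently available in it */
      };

      /* Pre-scan: count available large pages across online increments and
       * issue a message reporting that number. */
      static size_t prescan(const struct storage_increment *inc, size_t n)
      {
          size_t available = 0;
          for (size_t i = 0; i < n; i++)
              if (inc[i].online)
                  available += (size_t)inc[i].free_large_pages;
          printf("LFAREA pre-scan: %zu large pages available\n", available);
          return available;
      }

      /* Reserve pages for the LFAREA: clamp the target by availability and the
       * system limit, reject an unsatisfiable request, then scan and reserve. */
      static size_t lfarea_reserve(struct storage_increment *inc, size_t n, size_t target)
      {
          size_t available = prescan(inc, n);
          size_t optimal = target;
          if (optimal > available)    optimal = available;
          if (optimal > SYSTEM_LIMIT) optimal = SYSTEM_LIMIT;
          if (target == 0 || optimal == 0)
              return 0;                  /* invalid request or nothing to reserve */

          size_t reserved = 0;
          for (size_t i = 0; i < n && reserved < optimal; i++) {
              if (!inc[i].online)
                  continue;
              while (inc[i].free_large_pages > 0 && reserved < optimal) {
                  inc[i].free_large_pages--;   /* take one large page */
                  reserved++;
              }
          }
          return reserved;
      }

      int main(void)
      {
          struct storage_increment mem[] = { {1, 300}, {0, 500}, {1, 200} };
          printf("reserved %zu large pages\n", lfarea_reserve(mem, 3, 400));
          return 0;
      }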
  • Patent number: 8966220
    Abstract: Embodiments of the disclosure include a method for optimizing large page processing. The method includes receiving an indication that a real memory includes a first page. The first page includes a plurality of smaller pages. The method also includes determining a page frame table entry associated with a first smaller page of the first page and storing data associated with the first page in the page frame table entry associated with the first smaller page. The page frame table entry associated with the first smaller page of the first page is a data repository for the plurality of smaller pages of the first page.
    Type: Grant
    Filed: January 15, 2013
    Date of Patent: February 24, 2015
    Assignee: International Business Machines Corporation
    Inventors: Alfred F. Foster, David Hom, Charles E. Mari, Matthew J. Mauriello, Robert Miller, Jr., Mariama Ndoye, Scott B. Tuttle, Elpida Tzortzatos
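    Sketch: A minimal C illustration of the idea in patent 8966220 above: the page frame table entry (PFTE) of the first small page of a large page serves as the data repository for the whole large page. The structure names and the 256-to-1 small-to-large page geometry are assumptions, not details taken from the patent.
      /* Hypothetical sketch: first small frame's PFTE holds the large-page data. */
      #include <stdio.h>
      #include <stdbool.h>

      #define SMALL_PAGES_PER_LARGE 256   /* assumed geometry: 1 MB / 4 KB */

      struct pfte {                       /* one entry per small real frame */
          bool part_of_large_page;
          int  large_page_owner;          /* meaningful only in the first frame's entry */
          int  large_page_refcount;
      };

      /* Record that the frames starting at 'first' form one large page,
       * keeping all large-page data in the first frame's PFTE. */
      static void record_large_page(struct pfte *table, size_t first, int owner)
      {
          for (size_t i = 0; i < SMALL_PAGES_PER_LARGE; i++)
              table[first + i].part_of_large_page = true;
          table[first].large_page_owner    = owner;   /* repository entry */
          table[first].large_page_refcount = 1;
      }

      /* Any frame in the large page maps back to the repository entry. */
      static struct pfte *repository_for(struct pfte *table, size_t frame)
      {
          size_t first = frame - (frame % SMALL_PAGES_PER_LARGE);
          return &table[first];
      }

      int main(void)
      {
          static struct pfte table[1024];
          record_large_page(table, 256, 42);
          printf("owner via frame 300: %d\n",
                 repository_for(table, 300)->large_page_owner);  /* prints 42 */
          return 0;
      }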
  • Patent number: 8868876
    Abstract: Dedicated large page memory pools are provided to, at least in part, facilitate access to large pages. The large page memory is managed by establishing multiple large page memory pools, each including a number of large pages, and dedicating each pool to a respective processor of the computing environment, so that the processors can concurrently access pages from their respective pools.
    Type: Grant
    Filed: December 28, 2011
    Date of Patent: October 21, 2014
    Assignee: International Business Machines Corporation
    Inventors: Alfred F. Foster, David Hom, Charles E. Mari, Matthew J. Mauriello, Robert Miller, Jr., Mariama Ndoye, Michael G. Spiegel, Peter G. Sutton, Scott B. Tuttle, Elpida Tzortzatos, Chun-Kwan K. Yee
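    Sketch: A minimal C illustration of the per-processor pools described in patent 8868876 above: one large page pool per processor, so allocations on different processors do not contend for a single shared pool. The pool sizes, locking scheme, and names are assumptions for illustration only.
      /* Hypothetical sketch of dedicated per-processor large page pools. */
      #include <pthread.h>
      #include <stdio.h>

      #define NUM_CPUS       4
      #define POOL_CAPACITY 64

      struct large_page_pool {
          pthread_mutex_t lock;      /* only contended by its own processor */
          int free_pages;            /* large pages left in this processor's pool */
      };

      static struct large_page_pool pools[NUM_CPUS];

      static void init_pools(void)
      {
          for (int cpu = 0; cpu < NUM_CPUS; cpu++) {
              pthread_mutex_init(&pools[cpu].lock, NULL);
              pools[cpu].free_pages = POOL_CAPACITY;
          }
      }

      /* Allocate a large page from the pool dedicated to the calling processor. */
      static int alloc_large_page(int cpu)
      {
          struct large_page_pool *p = &pools[cpu % NUM_CPUS];
          int ok = 0;
          pthread_mutex_lock(&p->lock);
          if (p->free_pages > 0) {
              p->free_pages--;
              ok = 1;
          }
          pthread_mutex_unlock(&p->lock);
          return ok;
      }

      int main(void)
      {
          init_pools();
          printf("cpu 2 allocation %s\n", alloc_large_page(2) ? "succeeded" : "failed");
          return 0;
      }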
  • Patent number: 8799611
    Abstract: Allocation of pages of memory is managed in computing environments that include multiple sized memory pools. Responsive to a request for a page of memory, one or more memory pools are searched for an available frame of memory to service the request. The search uses a predefined order of search, which includes multiple types of memory pools in a specific order based on the requested size of the page of memory.
    Type: Grant
    Filed: May 5, 2011
    Date of Patent: August 5, 2014
    Assignee: International Business Machines Corporation
    Inventors: Alfred F. Foster, David Hom, Charles E. Mari, Matthew J. Mauriello, Robert Miller, Jr., Mariama Ndoye, Michael G. Spiegel, Peter G. Sutton, Scott B. Tuttle, Elpida Tzortzatos, Chun Kwan K. Yee
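    Sketch: A minimal C illustration of the predefined search order described in patent 8799611 above: depending on the requested page size, several types of memory pools are searched in a specific order until a free frame is found. The pool types and their orderings here are illustrative assumptions, not the actual orders claimed by the patent.
      /* Hypothetical sketch of a size-dependent, predefined pool search order. */
      #include <stdio.h>

      enum pool_type { POOL_4K, POOL_1M_PAGEABLE, POOL_1M_FIXED, POOL_COUNT };

      static int free_frames[POOL_COUNT] = { 0, 3, 5 };   /* sample pool contents */

      /* Predefined search orders, terminated by -1. */
      static const int order_small[] = { POOL_4K, POOL_1M_PAGEABLE, POOL_1M_FIXED, -1 };
      static const int order_large[] = { POOL_1M_PAGEABLE, POOL_1M_FIXED, -1 };

      /* Return the pool that satisfied the request, or -1 if none could. */
      static int get_frame(int want_large_page)
      {
          const int *order = want_large_page ? order_large : order_small;
          for (int i = 0; order[i] >= 0; i++) {
              if (free_frames[order[i]] > 0) {
                  free_frames[order[i]]--;     /* take a frame from this pool */
                  return order[i];
              }
          }
          return -1;                           /* no pool could service the request */
      }

      int main(void)
      {
          printf("4K request served from pool %d\n", get_frame(0));
          printf("1M request served from pool %d\n", get_frame(1));
          return 0;
      }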
  • Patent number: 8793444
    Abstract: Large page memory pools are managed. Thresholds are used to determine if the number of pages in a large page memory pool is to be adjusted. If the number of pages is to be increased, a particular technique is provided for adding additional pages to the pool. Further, if there are too many pages in the pool, one or more pages may be removed.
    Type: Grant
    Filed: May 5, 2011
    Date of Patent: July 29, 2014
    Assignee: International Business Machines Corporation
    Inventors: Alfred F. Foster, David Hom, Charles E. Mari, Matthew J. Mauriello, Robert Miller, Jr., Mariama Ndoye, Michael G. Spiegel, Peter G. Sutton, Scott B. Tuttle, Elpida Tzortzatos, Chun Kwan K. Yee
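    Sketch: A minimal C illustration of the threshold-driven pool management described in patent 8793444 above: grow the pool when free pages fall below a low threshold, shrink it when they exceed a high threshold. The threshold values, the refill step, and the helper functions are assumptions for illustration only.
      /* Hypothetical sketch of threshold-based large page pool adjustment. */
      #include <stdio.h>

      #define LOW_THRESHOLD   16
      #define HIGH_THRESHOLD 128
      #define REFILL_STEP     32

      static int pool_free_pages = 10;

      /* Hypothetical helpers that would obtain or release backing frames. */
      static int  acquire_pages(int n) { return n; }   /* pretend all succeed */
      static void release_pages(int n) { (void)n; }

      static void adjust_pool(void)
      {
          if (pool_free_pages < LOW_THRESHOLD) {
              /* Too few pages: grow the pool by a fixed step. */
              pool_free_pages += acquire_pages(REFILL_STEP);
          } else if (pool_free_pages > HIGH_THRESHOLD) {
              /* Too many pages: shrink back down to the high threshold. */
              release_pages(pool_free_pages - HIGH_THRESHOLD);
              pool_free_pages = HIGH_THRESHOLD;
          }
      }

      int main(void)
      {
          adjust_pool();
          printf("pool now holds %d free large pages\n", pool_free_pages); /* 42 */
          return 0;
      }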
  • Publication number: 20140201493
    Abstract: Embodiments of the disclosure include a method for optimizing large page processing. The method includes receiving an indication that a real memory includes a first page. The first page includes a plurality of smaller pages. The method also includes determining a page frame table entry associated with a first smaller page of the first page and storing data associated with the first page in the page frame table entry associated with the first smaller page. The page frame table entry associated with the first smaller page of the first page is a data repository for the plurality of smaller pages of the first page.
    Type: Application
    Filed: January 15, 2013
    Publication date: July 17, 2014
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Alfred F. Foster, David Hom, Charles E. Mari, Matthew J. Mauriello, Robert Miller, Jr., Mariama Ndoye, Scott B. Tuttle, Elpida Tzortzatos
  • Publication number: 20140201496
    Abstract: Embodiments of the disclosure include a method for reserving large pages in a large frame area (LFAREA) of a main memory. The method includes pre-scanning a plurality of storage increments, counting the number of available large pages that are online, and issuing a message that indicates that number. The method also includes receiving and parsing an LFAREA request that specifies a target number of large pages to be reserved, and calculating an optimal number of large pages to reserve based on the target number, the number of available pages, and a system limit. The method then determines whether the LFAREA request is valid and can be satisfied, scans the storage increments, and reserves the optimal number of pages in the LFAREA.
    Type: Application
    Filed: January 15, 2013
    Publication date: July 17, 2014
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Alfred F. Foster, David Hom, Charles E. Mari, Matthew J. Mauriello, Robert Miller, Jr., Mariama Ndoye, Scott B. Tuttle, Elpida Tzortzatos
  • Publication number: 20140075142
    Abstract: A computer system includes memory and a processor configured to manage memory allocation. The processor executes a memory allocation request to allocate a portion of the memory to an application by determining whether the size of the request is less than a first pre-defined size. If the request is smaller than that pre-defined size, the processor searches virtual memory for a free allocated memory area at least as large as the request.
    Type: Application
    Filed: September 13, 2012
    Publication date: March 13, 2014
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: David Hom, James H. Mulder, Mariama Ndoye, Michael G. Spiegel, Elpida Tzortzatos
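    Sketch: A minimal C illustration of the allocation path summarized in publication 20140075142 above: a request smaller than a pre-defined size is satisfied from an existing free area of already-allocated virtual memory. The cutoff value, the free-area list, and the handling of larger requests are assumptions made for illustration.
      /* Hypothetical sketch of the small-request allocation check. */
      #include <stdio.h>

      #define FIRST_PREDEFINED_SIZE 4096   /* assumed small-request cutoff (bytes) */

      struct free_area { size_t size; int in_use; };

      static struct free_area areas[] = { {256, 1}, {1024, 0}, {8192, 0} };

      /* Return the index of a free area that can hold 'size' bytes, or -1. */
      static int allocate(size_t size)
      {
          if (size >= FIRST_PREDEFINED_SIZE)
              return -1;   /* large requests would take a different path (not shown) */

          for (size_t i = 0; i < sizeof areas / sizeof areas[0]; i++) {
              if (!areas[i].in_use && areas[i].size >= size) {
                  areas[i].in_use = 1;    /* reuse this free allocated area */
                  return (int)i;
              }
          }
          return -1;
      }

      int main(void)
      {
          printf("500-byte request -> area %d\n", allocate(500));   /* area 1 */
          return 0;
      }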
  • Publication number: 20130173880
    Abstract: Dedicated large page memory pools are provided to, at least in part, facilitate access to large pages. The large page memory is managed by establishing multiple large page memory pools, each including a number of large pages, and dedicating each pool to a respective processor of the computing environment, so that the processors can concurrently access pages from their respective pools.
    Type: Application
    Filed: December 28, 2011
    Publication date: July 4, 2013
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Alfred F. Foster, David Hom, Charles E. Mari, Matthew J. Mauriello, Robert Miller, Jr., Mariama Ndoye, Michael G. Spiegel, Peter G. Sutton, Scott B. Tuttle, Elpida Tzortzatos, Chun-Kwan K. Yee
  • Publication number: 20120284483
    Abstract: Allocation of pages of memory is managed in computing environments that include multiple sized memory pools. Responsive to a request for a page of memory, one or more memory pools are searched for an available frame of memory to service the request. The search uses a predefined order of search, which includes multiple types of memory pools in a specific order based on the requested size of the page of memory.
    Type: Application
    Filed: May 5, 2011
    Publication date: November 8, 2012
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Alfred F. Foster, David Hom, Charles E. Mari, Matthew J. Mauriello, Robert Miller, Jr., Mariama Ndoye, Michael G. Spiegel, Peter G. Sutton, Scott B. Tuttle, Elpida Tzortzatos, Chun Kwan K. Yee
  • Publication number: 20120284479
    Abstract: Large page memory pools are managed. Thresholds are used to determine if the number of pages in a large page memory pool is to be adjusted. If the number of pages is to be increased, a particular technique is provided for adding additional pages to the pool. Further, if there are too many pages in the pool, one or more pages may be removed.
    Type: Application
    Filed: May 5, 2011
    Publication date: November 8, 2012
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Alfred F. Foster, David Hom, Charles E. Mari, Matthew J. Mauriello, Robert Miller, Jr., Mariama Ndoye, Michael G. Spiegel, Peter G. Sutton, Scott B. Tuttle, Elpida Tzortzatos, Chun-Kwan K. Yee
  • Patent number: 6807588
    Abstract: A sectioned ordered queue in an information handling system comprises a plurality of queue sections arranged in order from a first queue section to a last queue section. Each queue section contains one or more queue entries that correspond to available ranges of real storage locations and are arranged in order from a first queue entry to a last queue entry. Each queue section and each queue entry in the queue sections has a weight factor defined for it. Each queue entry has an effective weight factor formed by combining the weight factor defined for the queue section with the weight factor defined for the queue entry. A new entry is added to the last queue section to indicate a newly available corresponding storage location, and one or more queue entries are deleted from the first section of the queue to indicate that the corresponding storage locations are no longer available.
    Type: Grant
    Filed: February 27, 2002
    Date of Patent: October 19, 2004
    Assignee: International Business Machines Corporation
    Inventors: Tri M. Hoang, Tracy D. Butler, Danny R. Sutherland, David B. Emmes, Mariama Ndoye, Elpida Tzortzatos
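    Sketch: A minimal C illustration of the sectioned ordered queue of patent 6807588 above: entries (available real-storage ranges) are grouped into ordered sections, each section and each entry carries a weight factor, and an entry's effective weight combines the two; new entries are appended to the last section and removals come from the first. The fixed sizes, the additive combination of weights, and all names are assumptions for illustration only.
      /* Hypothetical sketch of a sectioned ordered queue with weight factors. */
      #include <stdio.h>

      #define NUM_SECTIONS        3
      #define ENTRIES_PER_SECTION 4

      struct queue_entry { long start, end; int weight; int used; };

      struct queue_section {
          int weight;                                    /* section weight factor */
          struct queue_entry entries[ENTRIES_PER_SECTION];
      };

      static struct queue_section sections[NUM_SECTIONS] = {
          { .weight = 100 }, { .weight = 200 }, { .weight = 300 }
      };

      /* Effective weight = section weight combined with entry weight. */
      static int effective_weight(int s, int e)
      {
          return sections[s].weight + sections[s].entries[e].weight;
      }

      /* Add a newly available storage range as an entry in the last section. */
      static int add_available_range(long start, long end, int weight)
      {
          struct queue_section *last = &sections[NUM_SECTIONS - 1];
          for (int e = 0; e < ENTRIES_PER_SECTION; e++) {
              if (!last->entries[e].used) {
                  last->entries[e] = (struct queue_entry){ start, end, weight, 1 };
                  return e;
              }
          }
          return -1;                                     /* last section is full */
      }

      /* Remove the first used entry from the first section (range consumed). */
      static void remove_from_first_section(void)
      {
          for (int e = 0; e < ENTRIES_PER_SECTION; e++) {
              if (sections[0].entries[e].used) {
                  sections[0].entries[e].used = 0;
                  return;
              }
          }
      }

      int main(void)
      {
          int e = add_available_range(0x100000, 0x1FFFFF, 7);
          printf("new entry effective weight: %d\n",
                 effective_weight(NUM_SECTIONS - 1, e));  /* 300 + 7 = 307 */
          remove_from_first_section();
          return 0;
      }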
  • Publication number: 20030163644
    Abstract: A sectioned ordered queue in an information handling system comprises a plurality of queue sections arranged in order from a first queue section to a last queue section. Each queue section contains one or more queue entries that correspond to available ranges of real storage locations and are arranged in order from a first queue entry to a last queue entry. Each queue section and each queue entry in the queue sections has a weight factor defined for it. Each queue entry has an effective weight factor formed by combining the weight factor defined for the queue section with the weight factor defined for the queue entry. A new entry is added to the last queue section to indicate a newly available corresponding storage location, and one or more queue entries are deleted from the first section of the queue to indicate that the corresponding storage locations are no longer available.
    Type: Application
    Filed: February 27, 2002
    Publication date: August 28, 2003
    Applicant: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: Tri M. Hoang, Tracy D. Butler, Danny R. Sutherland, David B. Emmes, Mariama Ndoye, Elpida Tzortzatos