Patents by Inventor Kevin T. Lim

Kevin T. Lim has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10990533
    Abstract: A system and method for retrieving cached data are disclosed herein. The system includes a cache server including a local memory and a table residing on the local memory, wherein the table is used to identify data objects corresponding to cached data. The system also includes the data objects residing on the local memory, wherein the data objects contain pointers to the cached data. The system further includes a remote memory communicatively coupled to the cache server through an Input-Output (I/O) connection, wherein the cached data resides on the remote memory.
    Type: Grant
    Filed: June 22, 2018
    Date of Patent: April 27, 2021
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Kevin T. Lim, Alvin AuYoung
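
The cache-related records in this listing (patent 10990533 above and the related application and patent further down) share one idea: keep the lookup table and the pointer-holding data objects in the cache server's local memory, and keep the cached payloads in remote memory reached over an I/O connection. A minimal Python sketch of that split follows; all class and method names are illustrative, not taken from the patent.

```python
# Minimal sketch (hypothetical names): the index and pointer objects live
# locally on the cache server, while the cached payloads live in a "remote
# memory" reached through an I/O-like get/put interface.

class RemoteMemory:
    """Stands in for memory reached over an I/O connection."""
    def __init__(self):
        self._blocks = {}
        self._next = 0

    def put(self, payload: bytes) -> int:
        addr = self._next
        self._next += len(payload)
        self._blocks[addr] = payload
        return addr

    def get(self, addr: int) -> bytes:
        return self._blocks[addr]

class CacheServer:
    def __init__(self, remote: RemoteMemory):
        self.remote = remote
        self.table = {}                         # local table: key -> data object

    def insert(self, key: str, payload: bytes) -> None:
        addr = self.remote.put(payload)         # payload goes to remote memory
        self.table[key] = {"addr": addr, "size": len(payload)}  # pointer stays local

    def lookup(self, key: str):
        obj = self.table.get(key)               # local table lookup only
        if obj is None:
            return None
        return self.remote.get(obj["addr"])     # one remote fetch for the payload

cache = CacheServer(RemoteMemory())
cache.insert("user:42", b"profile-bytes")
assert cache.lookup("user:42") == b"profile-bytes"
```
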
  • Patent number: 10728171
    Abstract: Disclosed herein are a system, non-transitory computer readable medium, and method for governing communications of a bare metal guest in a cloud network. A network interface handles packets of data in accordance with commands by a control agent.
    Type: Grant
    Filed: April 30, 2013
    Date of Patent: July 28, 2020
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Jeffrey Clifford Mogul, Jose Renato G. Santos, Yoshio Turner, Kevin T. Lim
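
For the entry above, a toy Python model of the arrangement it describes: a control agent installs rules on a network interface, and the interface then handles each packet from the bare-metal guest according to those rules. The rule format and all names are assumptions made for illustration.

```python
# Toy model (illustrative names, not from the patent): a control agent
# installs rules on a network interface, which then allows or drops each
# packet from a bare-metal guest according to those rules.

class NetworkInterface:
    def __init__(self):
        self.rules = []                      # (predicate, action) pairs set by the agent

    def install_rule(self, predicate, action: str) -> None:
        self.rules.append((predicate, action))

    def handle(self, packet: dict) -> str:
        for predicate, action in self.rules:
            if predicate(packet):
                return action                # first matching rule wins
        return "drop"                        # default-deny

class ControlAgent:
    """Pushes policy onto the interface on behalf of the cloud provider."""
    def __init__(self, nic: NetworkInterface):
        nic.install_rule(lambda p: p.get("dst_port") == 443, "allow")
        nic.install_rule(lambda p: p.get("dst") == "169.254.169.254", "drop")

nic = NetworkInterface()
ControlAgent(nic)
print(nic.handle({"dst": "10.0.0.5", "dst_port": 443}))        # allow
print(nic.handle({"dst": "169.254.169.254", "dst_port": 80}))  # drop
```
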
  • Patent number: 10467116
    Abstract: Methods, systems, and computer-readable and executable instructions are provided for checkpointing using a field programmable gate array (FPGA). Checkpointing using FPGA can include checkpointing data within a region of a server's contents to memory and monitoring the checkpointed data using the FPGA.
    Type: Grant
    Filed: June 8, 2012
    Date of Patent: November 5, 2019
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Kevin T. Lim, Alvin AuYoung
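
The entry above describes checkpointing a region of a server's contents and monitoring the checkpointed data with an FPGA. The sketch below models the same flow in plain Python, with a digest comparison standing in for the FPGA's monitoring; names are illustrative only.

```python
# Illustrative software stand-in: checkpoint a region of a server's memory,
# then monitor it by comparing a digest of the live region against the digest
# taken at checkpoint time. (The patent places the monitoring in an FPGA;
# here it is ordinary Python for clarity.)

import hashlib

def checkpoint(region: bytearray):
    snapshot = bytes(region)                       # copy of the region
    return snapshot, hashlib.sha256(snapshot).digest()

def monitor(region: bytearray, digest: bytes) -> bool:
    """Return True if the region still matches the checkpointed digest."""
    return hashlib.sha256(bytes(region)).digest() == digest

region = bytearray(b"server state ...")
snapshot, digest = checkpoint(region)

region[0:6] = b"SERVER"                            # simulate a later modification
if not monitor(region, digest):
    region[:] = snapshot                           # roll back to the checkpoint
assert bytes(region) == snapshot
```
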
  • Patent number: 10402324
    Abstract: According to an example, a processor generates a memory access request and sends the memory access request to a memory module. The processor receives data from the memory module in response to the memory access request when a memory device in the memory module for the memory access request is busy and unable to execute the memory access request.
    Type: Grant
    Filed: October 31, 2013
    Date of Patent: September 3, 2019
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Kevin T. Lim, Sheng Li, Parthasarathy Ranganathan, William C. Hallowell
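
One way to picture the entry above: the memory module can still answer a read while the targeted memory device is busy, for example out of a module-level buffer. That buffer interpretation, and every name below, is an assumption made for illustration.

```python
# Sketch under an assumption: the memory module keeps a small module-level
# buffer of recently accessed addresses, so it can still return data for a
# read even while the backing memory device is busy.

class MemoryModule:
    def __init__(self):
        self.device = {}          # backing memory device contents
        self.device_busy = False  # e.g. a refresh or other long internal operation
        self.buffer = {}          # module-level copy of recently touched lines

    def write(self, addr: int, value: int) -> None:
        self.device[addr] = value
        self.buffer[addr] = value

    def read(self, addr: int) -> int:
        if self.device_busy:
            # Device cannot execute the access; answer from the buffer instead.
            return self.buffer[addr]
        return self.device[addr]

module = MemoryModule()
module.write(0x100, 42)
module.device_busy = True        # device enters a busy period
assert module.read(0x100) == 42  # the processor still receives the data
```
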
  • Patent number: 10216659
    Abstract: An example system includes a memory controller; a memory bus coupled to the memory controller; and a dual inline memory module (DIMM) coupled to the memory controller through the memory bus. The DIMM includes a dynamic random access memory (DRAM) portion; a storage portion; and a gate array portion coupled to the memory bus to detect memory access signals and to store information related to the memory access signals on the storage portion.
    Type: Grant
    Filed: May 30, 2014
    Date of Patent: February 26, 2019
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Jim W. Brainard, Hubert E. Brinkmann, Jr., Kevin T. Lim, Mitchel E. Wright, Raghavan V. Venugopal, Reza M. Bacchus
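
A minimal Python model of the DIMM described above: a DRAM portion services accesses, while a stand-in for the gate-array portion observes each memory access signal and records it in the storage portion. Names are illustrative.

```python
# Minimal model (illustrative only): a DIMM whose gate-array portion watches
# memory access signals and records them in the storage portion, alongside an
# ordinary DRAM portion that services the accesses.

class LoggingDIMM:
    def __init__(self):
        self.dram = {}       # DRAM portion
        self.storage = []    # storage portion: log of observed accesses

    def _observe(self, op: str, addr: int) -> None:
        self.storage.append((op, addr))     # the gate-array portion's job

    def write(self, addr: int, value: int) -> None:
        self._observe("WR", addr)
        self.dram[addr] = value

    def read(self, addr: int) -> int:
        self._observe("RD", addr)
        return self.dram[addr]

dimm = LoggingDIMM()
dimm.write(0x2000, 7)
dimm.read(0x2000)
print(dimm.storage)   # [('WR', 8192), ('RD', 8192)]
```
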
  • Patent number: 10152247
    Abstract: A technique includes acquiring a plurality of write requests from at least one memory controller and logging information associated with the plurality of write requests in persistent storage. The technique includes applying the plurality of write requests atomically as a group to persistent storage.
    Type: Grant
    Filed: January 23, 2014
    Date of Patent: December 11, 2018
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Sheng Li, Jishen Zhao, Jichuan Chang, Parthasarathy Ranganathan, Alistair Veitch, Kevin T. Lim, Mark Lillibridge
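
The entry above logs a group of write requests to persistent storage and then applies them atomically as a group. The sketch below shows that log-first, apply-as-a-group pattern in ordinary Python file terms; the file names and the JSON format are assumptions for illustration.

```python
# Sketch of the idea in software terms: durably log the whole group of write
# requests first, then apply them so that either all of them or none of them
# take effect.

import json, os

LOG_PATH = "writes.log"       # hypothetical log file standing in for persistent storage

def apply_atomically(data_path: str, writes) -> None:
    # 1. Log the whole group of writes durably before touching the data.
    with open(LOG_PATH, "w") as log:
        json.dump(writes, log)
        log.flush()
        os.fsync(log.fileno())

    # 2. Apply the group; a crash here can be redone from the log.
    with open(data_path) as f:
        data = json.load(f)
    for key, value in writes:
        data[key] = value
    with open(data_path + ".tmp", "w") as f:
        json.dump(data, f)
        f.flush()
        os.fsync(f.fileno())
    os.replace(data_path + ".tmp", data_path)   # atomic rename: all-or-nothing

    # 3. The group is durable; the log entry is no longer needed.
    os.remove(LOG_PATH)

with open("data.json", "w") as f:
    json.dump({}, f)
apply_atomically("data.json", [("a", "1"), ("b", "2")])
```
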
  • Patent number: 10127282
    Abstract: A bit vector for a Bloom filter is determined by performing one or more hash function operations on a set of ternary content addressable memory (TCAM) words. A TCAM array is partitioned into a first portion to store the bit vector for the Bloom filter and a second portion to store the set of TCAM words. The TCAM array can be searched using a search word by performing the one or more hash function operations on the search word to generate a hashed search word and determining whether bits at specified positions of the hashed search word match bits at corresponding positions of the bit vector stored in the first portion of the TCAM array before searching the second portion of the TCAM array with the search word.
    Type: Grant
    Filed: April 30, 2014
    Date of Patent: November 13, 2018
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Sheng Li, Kevin T. Lim, Dejan S. Milojicic, Paolo Faraboschi
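
The entry above checks a Bloom-filter bit vector before searching the stored words themselves. The Python sketch below reproduces that pre-filter idea in software, with a plain list scan standing in for searching the second portion of the TCAM array; the hash choices and sizes are arbitrary.

```python
# Illustrative sketch: build a Bloom-filter bit vector over the stored words,
# and consult it before doing the full (expensive) search.

import hashlib

M = 256          # bits in the Bloom filter (the "first portion")
K = 3            # number of hash functions

def _positions(word: bytes):
    return [int.from_bytes(hashlib.sha256(bytes([i]) + word).digest()[:4], "big") % M
            for i in range(K)]

def build_filter(words):
    bits = set()
    for w in words:
        bits.update(_positions(w))
    return bits

def search(words, bits, query: bytes) -> bool:
    # Cheap pre-check: if any Bloom bit is clear, the word cannot be present.
    if not all(p in bits for p in _positions(query)):
        return False
    # Only now search the stored words (the "second portion").
    return query in words

words = [b"alpha", b"bravo", b"charlie"]
bits = build_filter(words)
assert search(words, bits, b"bravo") is True
assert search(words, bits, b"zulu") is False   # pre-check usually rejects this without a full search
```
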
  • Publication number: 20180300249
    Abstract: A system and method for retrieving cached data are disclosed herein. The system includes a cache server including a local memory and a table residing on the local memory, wherein the table is used to identify data objects corresponding to cached data. The system also includes the data objects residing on the local memory, wherein the data objects contain pointers to the cached data. The system further includes a remote memory communicatively coupled to the cache server through an Input-Output (I/O) connection, wherein the cached data resides on the remote memory.
    Type: Application
    Filed: June 22, 2018
    Publication date: October 18, 2018
    Inventors: Kevin T. Lim, Alvin AuYoung
  • Patent number: 10019371
    Abstract: A system and method for retrieving cached data are disclosed herein. The system includes a cache server including a local memory and a table residing on the local memory, wherein the table is used to identify data objects corresponding to cached data. The system also includes the data objects residing on the local memory, wherein the data objects contain pointers to the cached data. The system further includes a remote memory communicatively coupled to the cache server through an Input-Output (I/O) connection, wherein the cached data resides on the remote memory.
    Type: Grant
    Filed: April 27, 2012
    Date of Patent: July 10, 2018
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Kevin T. Lim, Alvin AuYoung
  • Publication number: 20180074959
    Abstract: According to an example, a node-based computing device includes memory nodes communicatively coupled to a processor node. The memory nodes may form a main memory address space for the processor node. The processor node may establish a virtual circuit through memory nodes. The virtual circuit may dedicate a path within the memory nodes. The processor node may then communicate a message through the virtual circuit. The memory nodes may forward the message according to the path dedicated by the virtual circuit.
    Type: Application
    Filed: July 22, 2014
    Publication date: March 15, 2018
    Inventors: Sheng Li, Jishen Zhao, Kevin T. Lim, Paolo Faraboschi
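
For the application above, a small Python model of a virtual circuit: a processor node dedicates a path through memory nodes, and each node forwards a message along the next hop recorded for that circuit. The names and the setup interface are illustrative.

```python
# Sketch (illustrative names): a processor node sets up a virtual circuit, a
# dedicated path of memory nodes, and each node forwards a message to the
# next hop recorded for that circuit.

class MemoryNode:
    def __init__(self, name: str):
        self.name = name
        self.next_hop = {}          # circuit id -> next MemoryNode (None at the end)
        self.delivered = []

    def setup(self, circuit: int, nxt) -> None:
        self.next_hop[circuit] = nxt

    def forward(self, circuit: int, message: str) -> None:
        nxt = self.next_hop[circuit]
        if nxt is None:
            self.delivered.append(message)      # end of the path
        else:
            nxt.forward(circuit, message)

# The processor node establishes circuit 7 along a dedicated path A -> B -> C.
a, b, c = MemoryNode("A"), MemoryNode("B"), MemoryNode("C")
a.setup(7, b); b.setup(7, c); c.setup(7, None)
a.forward(7, "read block 12")
assert c.delivered == ["read block 12"]
```
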
  • Patent number: 9767070
    Abstract: One embodiment is a storage system having one or more compute blades to generate and use data and one or more memory blades to generate a computational result. The computational result is generated by a computational function that transforms the data generated and used by the one or more compute blades. One or more storage devices are in communication with and remotely located from the one or more compute blades. The one or more storage devices store and serve the data for the one or more compute blades.
    Type: Grant
    Filed: November 6, 2009
    Date of Patent: September 19, 2017
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Jichuan Chang, Kevin T. Lim, Parthasarathy Ranganathan
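
A very small Python sketch of the idea in the entry above: rather than returning raw data, a memory blade applies a computational function to the data it holds and returns the transformed result. The interface shown is an assumption for illustration.

```python
# Minimal sketch (illustrative names): a memory blade applies a computational
# function to stored data and returns the result instead of the raw bytes.

class MemoryBlade:
    def __init__(self, storage: dict):
        self.storage = storage

    def compute(self, key: str, function):
        # Transform the data where it lives, rather than shipping it back raw.
        return function(self.storage[key])

storage = {"readings": [3, 1, 4, 1, 5, 9, 2, 6]}
blade = MemoryBlade(storage)
print(blade.compute("readings", lambda xs: sum(xs) / len(xs)))   # 3.875
```
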
  • Publication number: 20170199831
    Abstract: An example system includes a memory controller; a memory bus coupled to the memory controller; and a dual inline memory module (DIMM) coupled to the memory controller through the memory bus. The DIMM includes a dynamic random access memory (DRAM) portion; a storage portion; and a gate array portion coupled to the memory bus to detect memory access signals and to store information related to the memory access signals on the storage portion.
    Type: Application
    Filed: May 30, 2014
    Publication date: July 13, 2017
    Inventors: Jim W. Brainard, Hubert E. Brinkmann, Kevin T. Lim, Mitchel E. Wright, Raghavan V. Venugopal, Reza M. Bacchus
  • Patent number: 9690692
    Abstract: A replace operation is performed in relation to a priority queue. The priority queue has trees and elements. A first element stores a value having a greatest priority of any value stored in any element and in any tree. Each tree corresponds to one of the elements.
    Type: Grant
    Filed: October 31, 2014
    Date of Patent: June 27, 2017
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Muhuan Huang, Kevin T. Lim
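
The entry above centers on a replace operation on a priority queue whose first element holds the greatest-priority value. The Python sketch below shows such a replace, popping the top and inserting a new value in one step, using the standard-library heap; the patent's tree-per-element organization is not reproduced.

```python
# Illustrative sketch: "replace" removes the highest-priority element and
# inserts a new value in a single combined operation. Python's heapq is a
# min-heap, so priorities are negated here.

import heapq

class PriorityQueue:
    def __init__(self, values):
        self._heap = [-v for v in values]
        heapq.heapify(self._heap)

    def top(self) -> int:
        return -self._heap[0]                      # greatest-priority value

    def replace(self, new_value: int) -> int:
        # Pop the current top and push new_value in one step.
        return -heapq.heapreplace(self._heap, -new_value)

q = PriorityQueue([5, 17, 3, 9])
assert q.top() == 17
old = q.replace(11)           # 17 comes out, 11 goes in
assert old == 17 and q.top() == 11
```
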
  • Patent number: 9575889
    Abstract: A memory server providing remote memory for servers independent from the memory server. The memory server includes memory modules and a page table. A memory controller for the memory server allocates memory in the memory modules for each of the servers and manages remote memory accesses for the servers. The page table includes entries identifying the memory module and locations in the memory module storing data for the servers.
    Type: Grant
    Filed: July 3, 2008
    Date of Patent: February 21, 2017
    Assignee: Hewlett Packard Enterprise Development LP
    Inventors: Jichuan Chang, Parthasarathy Ranganathan, Kevin T. Lim
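
For the entry above, a Python sketch of a memory server that allocates pages in its memory modules on behalf of independent servers and keeps a page table recording, per server and page, which module and which location hold the data. The page size, interface, and names are illustrative.

```python
# Sketch with illustrative names: a memory server allocates pages in its
# memory modules for remote servers and tracks them in a page table.

PAGE_SIZE = 4096

class MemoryServer:
    def __init__(self, num_modules: int, pages_per_module: int):
        self.modules = [bytearray(pages_per_module * PAGE_SIZE) for _ in range(num_modules)]
        self.free = [(m, p) for m in range(num_modules) for p in range(pages_per_module)]
        self.page_table = {}       # (server_id, virtual_page) -> (module, page_slot)

    def allocate(self, server_id: int, virtual_page: int) -> None:
        self.page_table[(server_id, virtual_page)] = self.free.pop(0)

    def write(self, server_id: int, virtual_page: int, data: bytes) -> None:
        module, slot = self.page_table[(server_id, virtual_page)]
        self.modules[module][slot * PAGE_SIZE : slot * PAGE_SIZE + len(data)] = data

    def read(self, server_id: int, virtual_page: int, length: int) -> bytes:
        module, slot = self.page_table[(server_id, virtual_page)]
        return bytes(self.modules[module][slot * PAGE_SIZE : slot * PAGE_SIZE + length])

remote = MemoryServer(num_modules=2, pages_per_module=4)
remote.allocate(server_id=1, virtual_page=0)
remote.write(1, 0, b"hello")
assert remote.read(1, 0, 5) == b"hello"
```
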
  • Publication number: 20170046395
    Abstract: A bit vector for a Bloom filter is determined by performing one or more hash function operations on a set of ternary content addressable memory (TCAM) words. A TCAM array is partitioned into a first portion to store the bit vector for the Bloom filter and a second portion to store the set of TCAM words. The TCAM array can be searched using a search word by performing the one or more hash function operations on the search word to generate a hashed search word and determining whether bits at specified positions of the hashed search word match bits at corresponding positions of the bit vector stored in the first portion of the TCAM array before searching the second portion of the TCAM array with the search word.
    Type: Application
    Filed: April 30, 2014
    Publication date: February 16, 2017
    Inventors: Sheng Li, Kevin T. Lim, Dejan S. Milojicic, Paolo Faraboschi
  • Publication number: 20160342351
    Abstract: A technique includes acquiring a plurality of write requests from at least one memory controller and logging information associated with the plurality of write requests in persistent storage. The technique includes applying the plurality of write requests atomically as a group to persistent storage.
    Type: Application
    Filed: January 23, 2014
    Publication date: November 24, 2016
    Inventors: Sheng Li, Jishen Zhao, Jichuan Chang, Parthasarathy Ranganathan, Alistair Veitch, Kevin T. Lim, Mark Lillibridge
  • Publication number: 20160275014
    Abstract: According to an example, a processor generates a memory access request and sends the memory access request to a memory module. The processor receives data from the memory module in response to the memory access request when a memory device in the memory module for the memory access request is busy and unable to execute the memory access request.
    Type: Application
    Filed: October 31, 2013
    Publication date: September 22, 2016
    Inventors: Kevin T. Lim, Sheng Li, Parthasarathy Ranganathan, William C. Hallowell
  • Publication number: 20160267015
    Abstract: A method for mapping virtual memory pages to physical memory pages is described. The method includes receiving a mapping of a virtual memory page to multiple physical memory pages, detecting a request for a transaction to be performed on data contained in the multiple physical memory pages, in which the transaction includes a number of data updates, determining which of the number of multiple physical memory pages contains a latest version of the data to be updated by the transaction, updating a physical memory page by performing the transaction within a physical memory page among the multiple physical memory pages that does not contain the latest version of the data, and updating an indication of which of the physical memory pages contains the latest version of the data pertaining to the transaction.
    Type: Application
    Filed: October 29, 2013
    Publication date: September 15, 2016
    Inventors: Sheng Li, Jishen Zhao, Jichuan Chang, Parthasarathy Ranganathan, Alistair Veitch, Kevin T. Lim
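
The application above maps a virtual page to multiple physical pages and applies a transaction to the copy that does not hold the latest version, then updates the latest-version indicator. A two-copy Python sketch of that scheme follows; the names are illustrative.

```python
# Illustrative sketch: each virtual page maps to two physical pages. A
# transaction is applied to the copy that does NOT hold the latest version,
# and only then is the "latest" indicator flipped, so the old version
# survives until the update is complete.

class TwoCopyPage:
    def __init__(self, size: int = 8):
        self.copies = [bytearray(size), bytearray(size)]
        self.latest = 0                      # index of the copy holding the latest data

    def read(self) -> bytes:
        return bytes(self.copies[self.latest])

    def transact(self, updates: dict) -> None:
        stale = 1 - self.latest
        self.copies[stale][:] = self.copies[self.latest]   # start from the latest version
        for offset, value in updates.items():
            self.copies[stale][offset] = value             # apply the transaction off to the side
        self.latest = stale                                # a single switch publishes the update

page = TwoCopyPage()
page.transact({0: 0xAA, 1: 0xBB})
assert page.read()[:2] == bytes([0xAA, 0xBB])
```
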
  • Publication number: 20160239211
    Abstract: Example implementations relate to performing active memory operations. In example implementations, a memory controller may be programmed such that the memory controller allocates more time for a standard memory operation than required by a timing specification of a memory communicatively coupled to the memory controller. Extra time that is allocated for the standard memory operation may be identified. An active memory operation may be performed during the extra time.
    Type: Application
    Filed: September 30, 2013
    Publication date: August 18, 2016
    Inventors: Kevin T. Lim, Naveen Muralimanohar
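
A toy Python model of the application above: the controller is programmed with more time for a standard operation than the timing specification requires, and the extra time is used for an active memory operation. All cycle counts and names are made-up illustrations.

```python
# Toy model (numbers and names are illustrative): the controller allows more
# cycles for a standard operation than the memory's timing specification
# requires; the difference is slack in which an active operation can run.

SPEC_CYCLES = 10          # cycles the timing specification actually requires
PROGRAMMED_CYCLES = 14    # cycles the controller was programmed to allow

def service_read(active_op=None) -> int:
    used = SPEC_CYCLES                      # the standard read itself
    slack = PROGRAMMED_CYCLES - SPEC_CYCLES
    if active_op is not None and active_op["cycles"] <= slack:
        used += active_op["cycles"]         # perform the active operation in the slack
    return used

# A 3-cycle in-memory increment fits within the 4 cycles of slack.
assert service_read({"name": "increment", "cycles": 3}) <= PROGRAMMED_CYCLES
```
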
  • Publication number: 20160125008
    Abstract: A replace operation is performed in relation to a priority queue. The priority queue has trees and elements. A first element stores a value having a greatest priority of any value stored in any element and in any tree. Each tree corresponds to one of the elements.
    Type: Application
    Filed: October 31, 2014
    Publication date: May 5, 2016
    Applicant: Hewlett-Packard Development Company, L.P.
    Inventors: Muhuan Huang, Kevin T. Lim