Patents by Inventor Trung A. Diep
Trung A. Diep has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11921642
Abstract: A cache memory includes cache lines to store information. The stored information is associated with physical addresses that include first, second, and third distinct portions. The cache lines are indexed by the second portions of respective physical addresses associated with the stored information. The cache memory also includes one or more tables, each of which includes respective table entries that are indexed by the first portions of the respective physical addresses. The respective table entries in each of the one or more tables are to store indications of the second portions of respective physical addresses associated with the stored information.
Type: Grant
Filed: November 14, 2022
Date of Patent: March 5, 2024
Assignee: RAMBUS INC.
Inventors: Trung Diep, Hongzhong Zheng
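The scheme in this abstract can be illustrated with a small sketch: a physical address is split into three portions, cache lines are indexed by the second portion, and a side table indexed by the first portion records which second portions currently hold lines for it. The field widths, table layout, and class names below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the three-portion addressing scheme. Field widths
# are assumed (64-byte lines, 1024 cache lines), not from the patent.
OFFSET_BITS = 6   # third portion: byte offset within a line
INDEX_BITS = 10   # second portion: selects the cache line
# first portion: the remaining high-order address bits

def split_address(paddr):
    third = paddr & ((1 << OFFSET_BITS) - 1)
    second = (paddr >> OFFSET_BITS) & ((1 << INDEX_BITS) - 1)
    first = paddr >> (OFFSET_BITS + INDEX_BITS)
    return first, second, third

class PortionIndexedCache:
    def __init__(self):
        self.lines = {}   # second portion -> (first portion, data)
        self.table = {}   # first portion -> set of second portions cached for it

    def fill(self, paddr, data):
        first, second, _ = split_address(paddr)
        self.lines[second] = (first, data)
        self.table.setdefault(first, set()).add(second)

    def lookup(self, paddr):
        first, second, _ = split_address(paddr)
        # The table lets the cache check, from the first portion alone,
        # whether this index could hold a matching line.
        if second not in self.table.get(first, set()):
            return None
        stored_first, data = self.lines.get(second, (None, None))
        return data if stored_first == first else None
```

A lookup consults the per-first-portion table before touching the line itself, which is one way the claimed "indications of the second portions" could be used.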
-
Publication number: 20230153251
Abstract: The disclosed embodiments relate to a computer system with a cache memory that supports tagless addressing. During operation, the system receives a request to perform a memory access, wherein the request includes a virtual address. In response to the request, the system performs an address-translation operation, which translates the virtual address into both a physical address and a cache address. Next, the system uses the physical address to access one or more levels of physically addressed cache memory, wherein accessing a given level of physically addressed cache memory involves performing a tag-checking operation based on the physical address. If the access to the one or more levels of physically addressed cache memory fails to hit on a cache line for the memory access, the system uses the cache address to directly index a cache memory, wherein directly indexing the cache memory does not involve performing a tag-checking operation and eliminates the tag storage overhead.
Type: Application
Filed: November 22, 2022
Publication date: May 18, 2023
Inventors: Hongzhong Zheng, Trung A. Diep
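The tagless-addressing flow described in this abstract can be sketched in a few lines: one translation step yields both a physical address and a cache address, the physically addressed level is probed with a tag check, and on a miss the large cache is directly indexed with no tag check at all. The dictionary-based structures and names here are illustrative stand-ins, not the patent's implementation.

```python
# Minimal sketch of the tagless-addressing access path. The translation
# table, upper-level cache, and directly indexed cache are all assumed
# toy structures.
translation = {}   # virtual address -> (physical address, cache slot)
l1 = {}            # physically addressed level: lookup here models a tag check
big_cache = []     # directly indexed cache: slot -> data, no tags stored

def access(vaddr):
    # A single address-translation step yields BOTH addresses.
    phys, slot = translation[vaddr]
    if phys in l1:              # tag-checking lookup in the upper level
        return l1[phys]
    return big_cache[slot]      # miss: direct index, no tag check needed
```

Because the cache address comes out of translation, the directly indexed level never has to store or compare tags, which is the overhead the abstract says is eliminated.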
-
Publication number: 20230142048
Abstract: A cache memory includes cache lines to store information. The stored information is associated with physical addresses that include first, second, and third distinct portions. The cache lines are indexed by the second portions of respective physical addresses associated with the stored information. The cache memory also includes one or more tables, each of which includes respective table entries that are indexed by the first portions of the respective physical addresses. The respective table entries in each of the one or more tables are to store indications of the second portions of respective physical addresses associated with the stored information.
Type: Application
Filed: November 14, 2022
Publication date: May 11, 2023
Inventors: Trung Diep, Hongzhong Zheng
-
Patent number: 11537531
Abstract: The disclosed embodiments relate to a computer system with a cache memory that supports tagless addressing. During operation, the system receives a request to perform a memory access, wherein the request includes a virtual address. In response to the request, the system performs an address-translation operation, which translates the virtual address into both a physical address and a cache address. Next, the system uses the physical address to access one or more levels of physically addressed cache memory, wherein accessing a given level of physically addressed cache memory involves performing a tag-checking operation based on the physical address. If the access to the one or more levels of physically addressed cache memory fails to hit on a cache line for the memory access, the system uses the cache address to directly index a cache memory, wherein directly indexing the cache memory does not involve performing a tag-checking operation and eliminates the tag storage overhead.
Type: Grant
Filed: December 8, 2020
Date of Patent: December 27, 2022
Assignee: Rambus Inc.
Inventors: Hongzhong Zheng, Trung A. Diep
-
Patent number: 11500781
Abstract: A cache memory includes cache lines to store information. The stored information is associated with physical addresses that include first, second, and third distinct portions. The cache lines are indexed by the second portions of respective physical addresses associated with the stored information. The cache memory also includes one or more tables, each of which includes respective table entries that are indexed by the first portions of the respective physical addresses. The respective table entries in each of the one or more tables are to store indications of the second portions of respective physical addresses associated with the stored information.
Type: Grant
Filed: November 30, 2020
Date of Patent: November 15, 2022
Assignee: RAMBUS INC.
Inventors: Trung Diep, Hongzhong Zheng
-
Publication number: 20210232507
Abstract: A cache memory includes cache lines to store information. The stored information is associated with physical addresses that include first, second, and third distinct portions. The cache lines are indexed by the second portions of respective physical addresses associated with the stored information. The cache memory also includes one or more tables, each of which includes respective table entries that are indexed by the first portions of the respective physical addresses. The respective table entries in each of the one or more tables are to store indications of the second portions of respective physical addresses associated with the stored information.
Type: Application
Filed: November 30, 2020
Publication date: July 29, 2021
Inventors: Trung Diep, Hongzhong Zheng
-
Publication number: 20210157742
Abstract: The disclosed embodiments relate to a computer system with a cache memory that supports tagless addressing. During operation, the system receives a request to perform a memory access, wherein the request includes a virtual address. In response to the request, the system performs an address-translation operation, which translates the virtual address into both a physical address and a cache address. Next, the system uses the physical address to access one or more levels of physically addressed cache memory, wherein accessing a given level of physically addressed cache memory involves performing a tag-checking operation based on the physical address. If the access to the one or more levels of physically addressed cache memory fails to hit on a cache line for the memory access, the system uses the cache address to directly index a cache memory, wherein directly indexing the cache memory does not involve performing a tag-checking operation and eliminates the tag storage overhead.
Type: Application
Filed: December 8, 2020
Publication date: May 27, 2021
Inventors: Hongzhong Zheng, Trung A. Diep
-
Patent number: 10891241
Abstract: The disclosed embodiments relate to a computer system with a cache memory that supports tagless addressing. During operation, the system receives a request to perform a memory access, wherein the request includes a virtual address. In response to the request, the system performs an address-translation operation, which translates the virtual address into both a physical address and a cache address. Next, the system uses the physical address to access one or more levels of physically addressed cache memory, wherein accessing a given level of physically addressed cache memory involves performing a tag-checking operation based on the physical address. If the access to the one or more levels of physically addressed cache memory fails to hit on a cache line for the memory access, the system uses the cache address to directly index a cache memory, wherein directly indexing the cache memory does not involve performing a tag-checking operation and eliminates the tag storage overhead.
Type: Grant
Filed: October 2, 2018
Date of Patent: January 12, 2021
Assignee: Rambus Inc.
Inventors: Hongzhong Zheng, Trung A. Diep
-
Patent number: 10853261
Abstract: A cache memory includes cache lines to store information. The stored information is associated with physical addresses that include first, second, and third distinct portions. The cache lines are indexed by the second portions of respective physical addresses associated with the stored information. The cache memory also includes one or more tables, each of which includes respective table entries that are indexed by the first portions of the respective physical addresses. The respective table entries in each of the one or more tables are to store indications of the second portions of respective physical addresses associated with the stored information.
Type: Grant
Filed: October 11, 2018
Date of Patent: December 1, 2020
Assignee: RAMBUS INC.
Inventors: Trung Diep, Hongzhong Zheng
-
Patent number: 10515010
Abstract: Embodiments are disclosed for replacing one or more pages of a memory to level wear on the memory. In one embodiment, a system includes a page fault handling function and a memory address mapping function. Upon receipt of a page fault, the page fault handling function maps an evicted virtual memory address to a stressed page and maps a stressed virtual memory address to a free page using the memory address mapping function.
Type: Grant
Filed: March 29, 2018
Date of Patent: December 24, 2019
Assignee: Rambus Inc.
Inventors: Trung Diep, Eric Linstadt
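The remapping this abstract describes can be sketched directly: on a page fault, the handler hands the stressed (heavily worn) physical page to the evicted virtual address and moves the stressed virtual address onto a free page, spreading writes across the memory. The page-table and wear-count bookkeeping below are illustrative assumptions about how such a handler might track state.

```python
# Hypothetical wear-leveling swap on a page fault. page_table maps
# virtual addresses to physical page numbers; wear counts writes per
# physical page. Both structures are assumed, not from the patent.
def handle_page_fault(page_table, wear, evicted_va, free_page):
    # Pick the virtual address whose current physical page is most worn.
    stressed_va = max(page_table, key=lambda va: wear[page_table[va]])
    stressed_page = page_table[stressed_va]
    page_table[evicted_va] = stressed_page   # evicted VA -> stressed page
    page_table[stressed_va] = free_page      # stressed VA -> free page
```

The effect is that the hottest page is handed to a freshly faulted-in mapping while the hot mapping lands on an unworn page, which is one way the claimed leveling could play out.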
-
Publication number: 20190179768
Abstract: A cache memory includes cache lines to store information. The stored information is associated with physical addresses that include first, second, and third distinct portions. The cache lines are indexed by the second portions of respective physical addresses associated with the stored information. The cache memory also includes one or more tables, each of which includes respective table entries that are indexed by the first portions of the respective physical addresses. The respective table entries in each of the one or more tables are to store indications of the second portions of respective physical addresses associated with the stored information.
Type: Application
Filed: October 11, 2018
Publication date: June 13, 2019
Inventors: Trung Diep, Hongzhong Zheng
-
Publication number: 20190102318
Abstract: The disclosed embodiments relate to a computer system with a cache memory that supports tagless addressing. During operation, the system receives a request to perform a memory access, wherein the request includes a virtual address. In response to the request, the system performs an address-translation operation, which translates the virtual address into both a physical address and a cache address. Next, the system uses the physical address to access one or more levels of physically addressed cache memory, wherein accessing a given level of physically addressed cache memory involves performing a tag-checking operation based on the physical address. If the access to the one or more levels of physically addressed cache memory fails to hit on a cache line for the memory access, the system uses the cache address to directly index a cache memory, wherein directly indexing the cache memory does not involve performing a tag-checking operation and eliminates the tag storage overhead.
Type: Application
Filed: October 2, 2018
Publication date: April 4, 2019
Inventors: Hongzhong Zheng, Trung A. Diep
-
Patent number: 10133676
Abstract: Embodiments related to a cache memory that supports tagless addressing are disclosed. Some embodiments receive a request to perform a memory access, wherein the request includes a virtual address. In response, the system performs an address-translation operation, which translates the virtual address into both a physical address and a cache address. Next, the system uses the physical address to access one or more levels of physically addressed cache memory, wherein accessing a given level of physically addressed cache memory involves performing a tag-checking operation based on the physical address. If the access to the one or more levels of physically addressed cache memory fails to hit on a cache line for the memory access, the system uses the cache address to directly index a cache memory, wherein directly indexing the cache memory does not involve performing a tag-checking operation and eliminates the tag storage overhead.
Type: Grant
Filed: July 25, 2011
Date of Patent: November 20, 2018
Assignee: Rambus Inc.
Inventors: Hongzhong Zheng, Trung A. Diep
-
Patent number: 10102140
Abstract: A cache memory includes cache lines to store information. The stored information is associated with physical addresses that include first, second, and third distinct portions. The cache lines are indexed by the second portions of respective physical addresses associated with the stored information. The cache memory also includes one or more tables, each of which includes respective table entries that are indexed by the first portions of the respective physical addresses. The respective table entries in each of the one or more tables are to store indications of the second portions of respective physical addresses associated with the stored information.
Type: Grant
Filed: December 28, 2016
Date of Patent: October 16, 2018
Assignee: RAMBUS INC.
Inventors: Trung Diep, Hongzhong Zheng
-
Publication number: 20180285259
Abstract: Embodiments are disclosed for replacing one or more pages of a memory to level wear on the memory. In one embodiment, a system includes a page fault handling function and a memory address mapping function. Upon receipt of a page fault, the page fault handling function maps an evicted virtual memory address to a stressed page and maps a stressed virtual memory address to a free page using the memory address mapping function.
Type: Application
Filed: March 29, 2018
Publication date: October 4, 2018
Inventors: Trung Diep, Eric Linstadt
-
Patent number: 10037433
Abstract: Methods and systems described herein may perform a word-level encryption and a sentence-level encryption of one or more documents. The word-level encryption and the sentence-level encryption may be performed with an encryption key generated by a client device. A document indexer is stored in the one or more storage networks. The document indexer includes encrypted word frequencies and encrypted word position identifiers based on the encrypted words of the one or more encrypted documents. The client device receives search terms and encrypts the search terms with the encryption key. The one or more encrypted documents are identified in the one or more storage networks based on searching with the encrypted search terms and at least one of the encrypted word frequencies and/or the encrypted word position identifiers.
Type: Grant
Filed: November 4, 2015
Date of Patent: July 31, 2018
Assignee: NTT DOCOMO INC.
Inventors: Trung Diep, Pero Subasic
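The searchable-encryption flow in this abstract can be sketched as follows: words are encrypted deterministically on the client with a key it holds, the server-side indexer stores frequencies and positions keyed by the encrypted words, and a search encrypts its terms the same way before matching. HMAC-SHA256 here is a stand-in for whatever word-level cipher the patent uses, and all names and the key are illustrative.

```python
# Illustrative sketch of encrypted indexing and search. HMAC is an
# assumed stand-in for the patent's word-level encryption.
import hashlib
import hmac
from collections import Counter

KEY = b"client-held-key"   # hypothetical key generated by the client device

def enc_word(word):
    # Deterministic keyed encryption so equal words yield equal tokens.
    return hmac.new(KEY, word.lower().encode(), hashlib.sha256).hexdigest()

def build_indexer(document):
    words = document.split()
    freq = Counter(enc_word(w) for w in words)       # encrypted word frequencies
    positions = {}                                   # encrypted word position identifiers
    for i, w in enumerate(words):
        positions.setdefault(enc_word(w), []).append(i)
    return freq, positions   # stored server-side; plaintext never leaves the client

def search(indexer, term):
    freq, positions = indexer
    token = enc_word(term)   # the search term is encrypted with the same key
    return freq.get(token, 0), positions.get(token, [])
```

The storage network only ever sees tokens, frequencies, and positions, which matches the abstract's claim that documents are identified by searching with encrypted terms.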
-
Patent number: 9934142
Abstract: Embodiments are disclosed for replacing one or more pages of a memory to level wear on the memory. In one embodiment, a system includes a page fault handling function and a memory address mapping function. Upon receipt of a page fault, the page fault handling function maps an evicted virtual memory address to a stressed page and maps a stressed virtual memory address to a free page using the memory address mapping function.
Type: Grant
Filed: July 11, 2016
Date of Patent: April 3, 2018
Assignee: Rambus, Inc.
Inventors: Trung Diep, Eric Linstadt
-
Publication number: 20170206168
Abstract: A cache memory includes cache lines to store information. The stored information is associated with physical addresses that include first, second, and third distinct portions. The cache lines are indexed by the second portions of respective physical addresses associated with the stored information. The cache memory also includes one or more tables, each of which includes respective table entries that are indexed by the first portions of the respective physical addresses. The respective table entries in each of the one or more tables are to store indications of the second portions of respective physical addresses associated with the stored information.
Type: Application
Filed: December 28, 2016
Publication date: July 20, 2017
Inventors: Trung Diep, Hongzhong Zheng
-
Patent number: 9569359
Abstract: A cache memory includes cache lines to store information. The stored information is associated with physical addresses that include first, second, and third distinct portions. The cache lines are indexed by the second portions of respective physical addresses associated with the stored information. The cache memory also includes one or more tables, each of which includes respective table entries that are indexed by the first portions of the respective physical addresses. The respective table entries in each of the one or more tables are to store indications of the second portions of respective physical addresses associated with the stored information.
Type: Grant
Filed: February 22, 2012
Date of Patent: February 14, 2017
Assignee: Rambus Inc.
Inventors: Trung Diep, Hongzhong Zheng
-
Publication number: 20170010964
Abstract: Embodiments are disclosed for replacing one or more pages of a memory to level wear on the memory. In one embodiment, a system includes a page fault handling function and a memory address mapping function. Upon receipt of a page fault, the page fault handling function maps an evicted virtual memory address to a stressed page and maps a stressed virtual memory address to a free page using the memory address mapping function.
Type: Application
Filed: July 11, 2016
Publication date: January 12, 2017
Inventors: Trung Diep, Eric Linstadt