Patents by Inventor Aaron Tsai
Aaron Tsai has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 12141076
Abstract: Disclosed herein is a virtual cache and method in a processor for supporting multiple threads on the same cache line. The processor is configured to support virtual memory and multiple threads. The virtual cache directory includes a plurality of directory entries, each of which is associated with a cache line. Each cache line has a corresponding tag. The tag includes a logical address, an address space identifier, a real address bit indicator, and a per-thread validity bit for each thread that accesses the cache line. When a subsequent thread determines that the cache line is valid for that thread, the validity bit for that thread is set without invalidating any validity bits for other threads.
Type: Grant
Filed: August 18, 2023
Date of Patent: November 12, 2024
Assignee: International Business Machines Corporation
Inventors: Markus Helms, Christian Jacobi, Ulrich Mayer, Martin Recktenwald, Johannes C. Reichart, Anthony Saporito, Aaron Tsai
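The per-thread validity mechanism described above lends itself to a short illustration. The C++ sketch below is a hypothetical simplification, not the patented design: the field widths, the four-thread SMT width, and all identifiers are assumptions. It shows the central idea that a later thread validating the line sets only its own bit, leaving the other threads' validity bits intact.

```cpp
#include <bitset>
#include <cstdint>
#include <iostream>

constexpr int kThreads = 4; // assumed SMT width

// Hypothetical directory tag, loosely following the abstract's fields.
struct VirtualCacheTag {
    uint64_t logical_address;    // tag portion of the logical address
    uint16_t address_space_id;   // ASID
    bool     real_address;       // real-address indicator bit
    std::bitset<kThreads> valid; // one validity bit per thread
};

// A later thread that verifies the line sets only its own bit;
// validity bits already held by other threads are untouched.
void validate_for_thread(VirtualCacheTag& tag, int thread_id) {
    tag.valid.set(thread_id);
}

int main() {
    VirtualCacheTag tag{0xABCD00, 7, false, {}};
    validate_for_thread(tag, 0);                      // thread 0 installs the line
    validate_for_thread(tag, 2);                      // thread 2 later validates the same line
    std::cout << "valid bits: " << tag.valid << '\n'; // prints 0101
}
```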
-
Patent number: 12099845
Abstract: In an approach, responsibility for reissuing a fetch micro-operation is allocated to a reissue queue subsequent to a cache miss corresponding to a cache and the fetch micro-operation. Responsive to the higher-level cache returning data to the cache, an issue selection algorithm of the issue queue is overridden to prioritize reissuing the fetch micro-operation.
Type: Grant
Filed: June 16, 2022
Date of Patent: September 24, 2024
Assignee: International Business Machines Corporation
Inventors: Jonathan Ting Hsieh, Gregory William Alexander, Aaron Tsai, Yossi Shapira
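A minimal sketch of the reissue-priority override, assuming a software model of the queue (the MicroOp and ReissueQueue names and the oldest-first default policy are illustrative assumptions, not details from the patent): when the higher-level cache returns data, the matching fetch micro-operation is flagged and issues ahead of the normal selection order.

```cpp
#include <deque>
#include <iostream>
#include <string>

// Hypothetical micro-operation entry in a reissue queue.
struct MicroOp {
    std::string name;
    bool prioritized = false; // set when higher-level cache returns data
};

struct ReissueQueue {
    std::deque<MicroOp> ops;

    // Normal selection picks the oldest entry; a prioritized fetch
    // overrides that policy and issues first.
    MicroOp select() {
        for (auto it = ops.begin(); it != ops.end(); ++it) {
            if (it->prioritized) { MicroOp op = *it; ops.erase(it); return op; }
        }
        MicroOp op = ops.front(); ops.pop_front(); return op;
    }

    // Data-return event from the higher-level cache.
    void on_data_return(const std::string& fetch_name) {
        for (auto& op : ops)
            if (op.name == fetch_name) op.prioritized = true;
    }
};

int main() {
    ReissueQueue q;
    q.ops.push_back({"add.1"});
    q.ops.push_back({"fetch.7"});         // missed in the cache earlier
    q.on_data_return("fetch.7");          // higher-level cache returned the line
    std::cout << q.select().name << '\n'; // fetch.7 issues ahead of add.1
}
```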
-
Patent number: 11977486
Abstract: A computer system includes a processor core and a memory system in signal communication with the processor core. The memory system includes a first cache and a second cache. The first cache is arranged at a first level of a hierarchy in the memory system and is configured to store a plurality of first-cache entries. The second cache is arranged at a second level of the hierarchy that is lower than the first level, and stores a plurality of second-cache entries. The first cache maintains a directory that contains information for each of the first-cache entries. The second cache maintains a shadow pointer directory (SPD) that includes one or more SPD entries mapping each of the first-cache entries to a corresponding second-cache entry at a lower-level cache location.
Type: Grant
Filed: April 4, 2022
Date of Patent: May 7, 2024
Assignee: International Business Machines Corporation
Inventors: Ashraf ElSharif, Richard Joseph Branciforte, Gregory William Alexander, Deanna Postles Dunn Berger, Timothy Bronson, Aaron Tsai, Taylor J. Pritchard, Markus Kaltenbach, Christian Jacobi, Michael A. Blake
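A shadow pointer directory of this kind can be modeled as a map from first-cache entry coordinates to second-cache coordinates. The sketch below is a rough software analogue under assumed (set, way) addressing; a hardware SPD would be an indexed array, not a std::map.

```cpp
#include <cstdint>
#include <iostream>
#include <map>
#include <utility>

// Hypothetical coordinates of a cache entry: (congruence class, way).
using EntryLoc = std::pair<uint32_t, uint32_t>;

// Shadow pointer directory kept by the lower-level (second) cache: for
// every line resident in the first cache, it records where the
// corresponding copy lives in the second cache.
struct ShadowPointerDirectory {
    std::map<EntryLoc, EntryLoc> l1_to_l2;

    void install(EntryLoc l1, EntryLoc l2) { l1_to_l2[l1] = l2; }
    void evict(EntryLoc l1)                { l1_to_l2.erase(l1); }
};

int main() {
    ShadowPointerDirectory spd;
    spd.install({3, 1}, {112, 6}); // L1 set 3, way 1 -> L2 set 112, way 6
    auto l2 = spd.l1_to_l2.at({3, 1});
    std::cout << "L2 set " << l2.first << ", way " << l2.second << '\n';
}
```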
-
Publication number: 20240104021
Abstract: Embodiments are for processor cross-core cache line contention management. A computer-implemented method includes sending a cross-invalidate command to one or more caches based on receiving a cache state change request for a cache line in a symmetric multiprocessing system, and determining a retry delay based on receiving a cross-invalidate reject response from at least one of the one or more caches. The computer-implemented method also includes waiting until a retry delay period associated with the retry delay has elapsed to resend the cross-invalidate command to the one or more caches, and granting the cache state change request for the cache line based on receiving a cross-invalidate accept response from the one or more caches.
Type: Application
Filed: September 23, 2022
Publication date: March 28, 2024
Inventors: Michael Joseph Cadigan, Jr., Gregory William Alexander, Deanna Postles Dunn Berger, Timothy Bronson, Chung-Lung K. Shum, Aaron Tsai
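The reject-then-retry flow can be sketched as follows. The backoff policy, the attempt limit, and the RemoteCache stand-in are all assumptions for illustration; the claim only requires that the command is resent after a retry delay period and granted once every cache accepts.

```cpp
#include <chrono>
#include <cstdint>
#include <iostream>
#include <thread>
#include <vector>

enum class XiResponse { Accept, Reject };

// Hypothetical stand-in for a remote cache: rejects the first
// cross-invalidate and accepts the retry.
struct RemoteCache {
    int pending = 1;
    XiResponse cross_invalidate(uint64_t /*line*/) {
        return pending-- > 0 ? XiResponse::Reject : XiResponse::Accept;
    }
};

// Grant a cache-state change once every cache accepts the XI,
// backing off for a retry delay after any rejection.
bool request_state_change(std::vector<RemoteCache>& caches, uint64_t line) {
    using namespace std::chrono_literals;
    auto delay = 10us; // assumed base retry delay
    for (int attempt = 0; attempt < 8; ++attempt) {
        bool all_accept = true;
        for (auto& c : caches)
            if (c.cross_invalidate(line) == XiResponse::Reject)
                all_accept = false;
        if (all_accept) return true;        // grant the state change request
        std::this_thread::sleep_for(delay); // wait out the retry delay period
        delay *= 2;                         // assumed backoff policy
    }
    return false;
}

int main() {
    std::vector<RemoteCache> caches(2);
    std::cout << (request_state_change(caches, 0x80) ? "granted" : "denied") << '\n';
}
```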
-
Publication number: 20240070075
Abstract: A lower-level cache managing cross-core invalidation (XI) snapshots in a shared-memory multiprocessing system, wherein the management of XI snapshots reduces the number of required snapshots while allowing shared lower-level caches, comprising: the lower-level cache maintaining a respective response sync state for at least one processor in a plurality of processors, signifying that a line may have been changed by another processor since last fetched by a requesting processor.
Type: Application
Filed: August 23, 2022
Publication date: February 29, 2024
Inventors: Richard Joseph Branciforte, Gregory William Alexander, Timothy Bronson, Deanna Postles Dunn Berger, Akash V. Giri, Aaron Tsai
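One plausible reading of the per-processor response sync state is a stale flag per processor per line, cleared on that processor's fetch and set when any other processor stores to the line. The sketch below encodes that reading; the structure and all names are assumptions, not taken from the publication.

```cpp
#include <iostream>
#include <vector>

// Hypothetical per-line tracking in a shared lower-level cache: for each
// processor, remember whether another processor may have changed the
// line since that processor last fetched it.
struct LineSyncState {
    std::vector<bool> stale; // indexed by processor id

    explicit LineSyncState(int cpus) : stale(cpus, false) {}

    void on_fetch(int cpu) { stale[cpu] = false; } // requester now holds a fresh copy
    void on_store(int cpu) {                       // a writer stales every other view
        for (int i = 0; i < static_cast<int>(stale.size()); ++i)
            if (i != cpu) stale[i] = true;
    }
    // An XI snapshot is only needed for processors whose view is stale,
    // reducing the number of required snapshots.
    bool needs_snapshot(int cpu) const { return stale[cpu]; }
};

int main() {
    LineSyncState line(4);
    line.on_fetch(0);
    line.on_store(1);
    std::cout << std::boolalpha << line.needs_snapshot(0) << '\n'; // true
}
```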
-
Patent number: 11907124
Abstract: Aspects include using a shadow copy of a level 1 (L1) cache in a cache hierarchy. A method includes maintaining the shadow copy of the L1 cache in the cache hierarchy. The maintaining includes updating the shadow copy of the L1 cache with memory content changes to the L1 cache a number of pipeline cycles after the L1 cache is updated with the memory content changes.
Type: Grant
Filed: March 31, 2022
Date of Patent: February 20, 2024
Assignee: International Business Machines Corporation
Inventors: Yair Fried, Aaron Tsai, Eyal Naor, Christian Jacobi, Timothy Bronson, Chung-Lung K. Shum
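The delayed-update behavior can be modeled with a small pending-write queue: each L1 store is replayed into the shadow copy a fixed number of cycles later. The three-cycle delay and all names below are illustrative assumptions.

```cpp
#include <cstdint>
#include <deque>
#include <iostream>
#include <unordered_map>

// Hypothetical shadow of an L1 cache that absorbs each store a fixed
// number of pipeline cycles after the L1 itself was updated.
struct ShadowL1 {
    static constexpr int kDelayCycles = 3; // assumed pipeline delay

    struct PendingWrite { uint64_t addr; uint32_t data; int due_cycle; };

    std::unordered_map<uint64_t, uint32_t> shadow;
    std::deque<PendingWrite> pending;
    int cycle = 0;

    void l1_store(uint64_t addr, uint32_t data) {
        pending.push_back({addr, data, cycle + kDelayCycles});
    }
    void tick() { // advance one pipeline cycle, applying writes that are due
        ++cycle;
        while (!pending.empty() && pending.front().due_cycle <= cycle) {
            shadow[pending.front().addr] = pending.front().data;
            pending.pop_front();
        }
    }
};

int main() {
    ShadowL1 s;
    s.l1_store(0x40, 7);
    for (int i = 0; i < ShadowL1::kDelayCycles; ++i) s.tick();
    std::cout << s.shadow.at(0x40) << '\n'; // 7, visible three cycles later
}
```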
-
Patent number: 11892949
Abstract: A method and a system detect a cache line as a potential or confirmed hot cache line based on receiving an intervention of a processor associated with a fetch of the cache line. The method and system include suppressing an action of operations associated with the hot cache line. A related method and system detect an intervention and, in response, communicate an intervention notification to another processor. An alternative method and system detect a hot data object associated with an intervention event of an application. The method and system can suppress actions of operations associated with the hot data object. An alternative method and system can detect and communicate an intervention associated with a data object.
Type: Grant
Filed: January 13, 2023
Date of Patent: February 6, 2024
Assignee: International Business Machines Corporation
Inventors: Christian Zoellin, Christian Jacobi, Chung-Lung K. Shum, Martin Recktenwald, Anthony Saporito, Aaron Tsai
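A hot-line detector in this spirit might count interventions per cache line and suppress optional operations once a threshold confirms the line as hot. The threshold value and the tracker below are assumptions for illustration, not the patented policy.

```cpp
#include <cstdint>
#include <iostream>
#include <unordered_map>

// Hypothetical hot-line tracker: a line that repeatedly arrives via
// processor intervention is marked potential, then confirmed hot.
struct HotLineTracker {
    static constexpr int kConfirmThreshold = 2; // assumed threshold
    std::unordered_map<uint64_t, int> interventions;

    // Called when a fetch is satisfied by another processor's intervention.
    void on_intervention(uint64_t line) { ++interventions[line]; }

    bool is_potential(uint64_t line) const {
        auto it = interventions.find(line);
        return it != interventions.end() && it->second >= 1;
    }
    bool is_confirmed(uint64_t line) const {
        auto it = interventions.find(line);
        return it != interventions.end() && it->second >= kConfirmThreshold;
    }
    // Operations such as prefetches could consult this before acting.
    bool suppress_action(uint64_t line) const { return is_confirmed(line); }
};

int main() {
    HotLineTracker t;
    t.on_intervention(0x1000);
    t.on_intervention(0x1000);
    std::cout << std::boolalpha << t.suppress_action(0x1000) << '\n'; // true
}
```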
-
Patent number: 11880304
Abstract: To facilitate efficient processing of contended cache lines, a cache controller that is associated with a requestor receives a fetch request for data from the requestor. The fetch request is associated with a cache scope designation. If the data is in a high-level cache (e.g., the L1 cache) associated with the requestor, the cache controller returns the requested data to the requestor. If the data is not in the high-level cache, if the data is not within the cache pool identified by the cache scope of search designation, and/or if obtaining the data is contentious, the controller returns a cache miss, undeliverable data, and a request-done instruction to the requestor. Such a scheme permits address contention events only when the requestor deems them necessary and/or important. As such, address contention events, performance penalties, latencies, increased execution times, and inefficient use of resources may be diminished.
Type: Grant
Filed: May 24, 2022
Date of Patent: January 23, 2024
Assignee: International Business Machines Corporation
Inventors: Taylor J. Pritchard, Aaron Tsai, Richard Joseph Branciforte, Ashraf ElSharif, Gregory William Alexander, Deanna Postles Dunn Berger, Michael Fee
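The controller policy can be summarized in a few lines: serve the fetch when the line is within the designated scope and uncontended, otherwise answer with a miss, undeliverable data, and request done. The scope levels, flags, and placeholder payload below are assumptions, not the patent's encoding.

```cpp
#include <cstdint>
#include <iostream>
#include <optional>

// Hypothetical scope levels a requestor may attach to a fetch.
enum class CacheScope { L1Only, UpToL2, UpToL3 };

enum class FetchStatus { Hit, MissRequestDone };

struct FetchResult {
    FetchStatus status;
    std::optional<uint32_t> data; // empty => undeliverable data
};

// Serve the fetch if the line is within the designated cache scope and
// not contended; otherwise return a miss with undeliverable data and
// mark the request done.
FetchResult fetch(uint64_t /*addr*/, CacheScope scope,
                  bool in_l1, bool in_l2, bool contended) {
    if (in_l1) return {FetchStatus::Hit, 0xDEADBEEF}; // placeholder payload
    bool in_scope = (scope != CacheScope::L1Only) && in_l2;
    if (!in_scope || contended)
        return {FetchStatus::MissRequestDone, std::nullopt};
    return {FetchStatus::Hit, 0xDEADBEEF};            // placeholder payload
}

int main() {
    auto r = fetch(0x80, CacheScope::L1Only, false, true, false);
    std::cout << (r.status == FetchStatus::MissRequestDone
                      ? "miss, request done" : "hit") << '\n';
}
```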
-
Publication number: 20230409331
Abstract: In an approach, responsibility for reissuing a fetch micro-operation is allocated to a reissue queue subsequent to a cache miss corresponding to a cache and the fetch micro-operation. Responsive to the higher-level cache returning data to the cache, an issue selection algorithm of the issue queue is overridden to prioritize reissuing the fetch micro-operation.
Type: Application
Filed: June 16, 2022
Publication date: December 21, 2023
Inventors: Jonathan Ting Hsieh, Gregory William Alexander, Aaron Tsai, Yossi Shapira
-
Publication number: 20230401161
Abstract: Disclosed herein is a virtual cache and method in a processor for supporting multiple threads on the same cache line. The processor is configured to support virtual memory and multiple threads. The virtual cache directory includes a plurality of directory entries, each of which is associated with a cache line. Each cache line has a corresponding tag. The tag includes a logical address, an address space identifier, a real address bit indicator, and a per-thread validity bit for each thread that accesses the cache line. When a subsequent thread determines that the cache line is valid for that thread, the validity bit for that thread is set without invalidating any validity bits for other threads.
Type: Application
Filed: August 18, 2023
Publication date: December 14, 2023
Inventors: Markus Helms, Christian Jacobi, Ulrich Mayer, Martin Recktenwald, Johannes C. Reichart, Anthony Saporito, Aaron Tsai
-
Publication number: 20230385195
Abstract: To facilitate efficient processing of contended cache lines, a cache controller that is associated with a requestor receives a fetch request for data from the requestor. The fetch request is associated with a cache scope designation. If the data is in a high-level cache (e.g., the L1 cache) associated with the requestor, the cache controller returns the requested data to the requestor. If the data is not in the high-level cache, if the data is not within the cache pool identified by the cache scope of search designation, and/or if obtaining the data is contentious, the controller returns a cache miss, undeliverable data, and a request-done instruction to the requestor. Such a scheme permits address contention events only when the requestor deems them necessary and/or important. As such, address contention events, performance penalties, latencies, increased execution times, and inefficient use of resources may be diminished.
Type: Application
Filed: May 24, 2022
Publication date: November 30, 2023
Inventors: Taylor J. Pritchard, Aaron Tsai, Richard Joseph Branciforte, Ashraf ElSharif, Gregory William Alexander, Deanna Postles Dunn Berger, Michael Fee
-
Publication number: 20230315631
Abstract: Aspects include using a shadow copy of a level 1 (L1) cache in a cache hierarchy. A method includes maintaining the shadow copy of the L1 cache in the cache hierarchy. The maintaining includes updating the shadow copy of the L1 cache with memory content changes to the L1 cache a number of pipeline cycles after the L1 cache is updated with the memory content changes.
Type: Application
Filed: March 31, 2022
Publication date: October 5, 2023
Inventors: Yair Fried, Aaron Tsai, Eyal Naor, Christian Jacobi, Timothy Bronson, Chung-Lung K. Shum
-
Publication number: 20230315633
Abstract: A computer system includes a processor core and a memory system in signal communication with the processor core. The memory system includes a first cache and a second cache. The first cache is arranged at a first level of a hierarchy in the memory system and is configured to store a plurality of first-cache entries. The second cache is arranged at a second level of the hierarchy that is lower than the first level, and stores a plurality of second-cache entries. The first cache maintains a directory that contains information for each of the first-cache entries. The second cache maintains a shadow pointer directory (SPD) that includes one or more SPD entries mapping each of the first-cache entries to a corresponding second-cache entry at a lower-level cache location.
Type: Application
Filed: April 4, 2022
Publication date: October 5, 2023
Inventors: Ashraf ElSharif, Richard Joseph Branciforte, Gregory William Alexander, Deanna Postles Dunn Berger, Timothy Bronson, Aaron Tsai, Taylor J. Pritchard, Markus Kaltenbach, Christian Jacobi, Michael A. Blake
-
Patent number: 11775445
Abstract: Disclosed herein is a virtual cache and method in a processor for supporting multiple threads on the same cache line. The processor is configured to support virtual memory and multiple threads. The virtual cache directory includes a plurality of directory entries, each of which is associated with a cache line. Each cache line has a corresponding tag. The tag includes a logical address, an address space identifier, a real address bit indicator, and a per-thread validity bit for each thread that accesses the cache line. When a subsequent thread determines that the cache line is valid for that thread, the validity bit for that thread is set without invalidating any validity bits for other threads.
Type: Grant
Filed: October 13, 2020
Date of Patent: October 3, 2023
Assignee: International Business Machines Corporation
Inventors: Markus Helms, Christian Jacobi, Ulrich Mayer, Martin Recktenwald, Johannes C. Reichart, Anthony Saporito, Aaron Tsai
-
Publication number: 20230281132
Abstract: Embodiments are for a special tracking pool enhancement for core L1 address invalidates. An invalidate request is designated to fill an entry in a queue in a local cache of a processor core, the queue including a first allocation associated with processing any type of invalidate request and a second allocation associated with processing invalidate requests that do not require a response in order for a controller to be made available, the entry being in the second allocation. Responsive to designating the invalidate request to fill the entry in the queue in the local cache, the state of the controller that made the invalidate request is changed to available, based at least in part on the entry being in the second allocation.
Type: Application
Filed: March 4, 2022
Publication date: September 7, 2023
Inventors: Deanna Postles Dunn Berger, Gregory William Alexander, Richard Joseph Branciforte, Aaron Tsai, Markus Kaltenbach
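The two-allocation queue can be sketched as two counters: entries placed in the special (no-response) pool free their controller immediately, while general-pool entries leave it busy. The pool sizes and names below are illustrative assumptions.

```cpp
#include <iostream>

// Hypothetical invalidate queue split into two allocations: a general
// pool for any invalidate, and a special pool for invalidates that need
// no response before their controller can be freed.
struct InvalidateQueue {
    int general_free = 8; // assumed pool sizes
    int special_free = 4;

    // Returns true if the requesting controller may immediately be
    // marked available (entry landed in the special, no-response pool).
    bool enqueue(bool needs_response) {
        if (!needs_response && special_free > 0) {
            --special_free;
            return true;  // controller state -> available right away
        }
        if (general_free > 0) {
            --general_free;
            return false; // controller must wait for a response
        }
        return false;     // queue full; the request would be retried
    }
};

int main() {
    InvalidateQueue q;
    bool freed = q.enqueue(/*needs_response=*/false);
    std::cout << (freed ? "controller available" : "controller busy") << '\n';
}
```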
-
Patent number: 11748266
Abstract: Embodiments are for a special tracking pool enhancement for core L1 address invalidates. An invalidate request is designated to fill an entry in a queue in a local cache of a processor core, the queue including a first allocation associated with processing any type of invalidate request and a second allocation associated with processing invalidate requests that do not require a response in order for a controller to be made available, the entry being in the second allocation. Responsive to designating the invalidate request to fill the entry in the queue in the local cache, the state of the controller that made the invalidate request is changed to available, based at least in part on the entry being in the second allocation.
Type: Grant
Filed: March 4, 2022
Date of Patent: September 5, 2023
Assignee: International Business Machines Corporation
Inventors: Deanna Postles Dunn Berger, Gregory William Alexander, Richard Joseph Branciforte, Aaron Tsai, Markus Kaltenbach
-
Publication number: 20230153244
Abstract: A method and a system detect a cache line as a potential or confirmed hot cache line based on receiving an intervention of a processor associated with a fetch of the cache line. The method and system include suppressing an action of operations associated with the hot cache line. A related method and system detect an intervention and, in response, communicate an intervention notification to another processor. An alternative method and system detect a hot data object associated with an intervention event of an application. The method and system can suppress actions of operations associated with the hot data object. An alternative method and system can detect and communicate an intervention associated with a data object.
Type: Application
Filed: January 13, 2023
Publication date: May 18, 2023
Inventors: Christian Zoellin, Christian Jacobi, Chung-Lung K. Shum, Martin Recktenwald, Anthony Saporito, Aaron Tsai
-
Patent number: 11605140
Abstract: Various aspects of the subject technology relate to systems, methods, and machine-readable media for an interactive analytics platform responsive to data inquiries. These aspects include identifying a request for energy consumption data of an energy user from a user in a live interaction with the energy user. A utility bill, as well as energy consumption data associated with the energy user, may be retrieved using an authentication of the energy user. Energy consumption factors that influence the billing amount of the utility bill may be identified based on the energy consumption data. An interface with specific billing insights that correspond to the energy consumption factors is generated. The specific billing insights provide an explanation as to why the billing amount of the utility bill exceeded the cost expectation of the energy user. The interface mapped with the utility bill may be provided for display.
Type: Grant
Filed: March 27, 2020
Date of Patent: March 14, 2023
Assignee: OPower, Inc.
Inventors: Lawrence Han, Suryaveer Singh Lodha, Daniel Bloomfield Ramagem, Michael Ian Kristopher Eaton, Nowell Boardman Strite, Aaron Tsai Otani, Khanh Duc Nguyen
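A billing insight of the kind described might be generated by comparing each consumption factor's cost against the prior bill and surfacing the largest increases. The sketch below is only a guess at that logic; the Factor fields and the threshold are assumptions, not the platform's actual method.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Hypothetical consumption factor and a naive insight generator: factors
// whose cost rose the most since the prior bill become billing insights.
struct Factor { std::string name; double prev_cost; double curr_cost; };

std::vector<std::string> billing_insights(const std::vector<Factor>& factors,
                                          double threshold) {
    std::vector<std::string> insights;
    for (const auto& f : factors) {
        double delta = f.curr_cost - f.prev_cost;
        if (delta > threshold) // only notable increases become insights
            insights.push_back(f.name + " added $" + std::to_string(delta) +
                               " versus your last bill");
    }
    return insights;
}

int main() {
    std::vector<Factor> f{{"heating", 40.0, 95.0}, {"lighting", 12.0, 13.0}};
    for (const auto& s : billing_insights(f, 10.0)) std::cout << s << '\n';
}
```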
-
Patent number: 11586542
Abstract: A method and a system detect a cache line as a potential or confirmed hot cache line based on receiving an intervention of a processor associated with a fetch of the cache line. The method and system include suppressing an action of operations associated with the hot cache line. A related method and system detect an intervention and, in response, communicate an intervention notification to another processor. An alternative method and system detect a hot data object associated with an intervention event of an application. The method and system can suppress actions of operations associated with the hot data object. An alternative method and system can detect and communicate an intervention associated with a data object.
Type: Grant
Filed: April 9, 2021
Date of Patent: February 21, 2023
Assignee: International Business Machines Corporation
Inventors: Christian Zoellin, Christian Jacobi, Chung-Lung K. Shum, Martin Recktenwald, Anthony Saporito, Aaron Tsai
-
Patent number: 11403222
Abstract: Disclosed herein is a method for operating access to a cache memory via an effective address comprising a tag field and a cache line index field. The method comprises splitting the tag field into a first group of bits and a second group of bits. The line index bits and the first group of bits are searched in the set directory. A set identifier is generated indicating the set containing the respective cache line of the effective address. The set identifier, the line index bits, and the second group of bits are searched in the validation directory. In response to determining the presence of the cache line in the set based on the second search, a hit signal is generated.
Type: Grant
Filed: October 13, 2020
Date of Patent: August 2, 2022
Assignee: International Business Machines Corporation
Inventors: Christian Jacobi, Ulrich Mayer, Martin Recktenwald, Anthony Saporito, Aaron Tsai
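The two-step lookup can be traced in a toy model: the set directory is probed with the line index plus the first tag group to obtain a set identifier, and the validation directory is then probed with that identifier, the line index, and the second tag group to raise the hit signal. The packed-key encoding and all names below are purely illustrative assumptions.

```cpp
#include <cstdint>
#include <iostream>
#include <unordered_map>

// Hypothetical two-step lookup: the effective-address tag is split into
// two groups; the set directory yields a candidate set, and the
// validation directory confirms the hit.
struct SplitDirectoryCache {
    // key: (line_index, tag_group1) -> set identifier
    std::unordered_map<uint64_t, uint32_t> set_directory;
    // key: (set_id, line_index, tag_group2) -> line valid
    std::unordered_map<uint64_t, bool> validation_directory;

    // Pack lookup fields into flat keys (widths are arbitrary here).
    static uint64_t key2(uint64_t a, uint64_t b) { return (a << 20) | b; }
    static uint64_t key3(uint64_t a, uint64_t b, uint64_t c) {
        return (a << 40) | (b << 20) | c;
    }

    bool lookup(uint64_t line_index, uint64_t tag1, uint64_t tag2) {
        auto set = set_directory.find(key2(line_index, tag1));
        if (set == set_directory.end()) return false; // no candidate set
        auto hit = validation_directory.find(key3(set->second, line_index, tag2));
        return hit != validation_directory.end() && hit->second; // hit signal
    }
};

int main() {
    SplitDirectoryCache c;
    c.set_directory[SplitDirectoryCache::key2(5, 0x1A)] = 2;
    c.validation_directory[SplitDirectoryCache::key3(2, 5, 0x3C)] = true;
    std::cout << std::boolalpha << c.lookup(5, 0x1A, 0x3C) << '\n'; // true
}
```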