Patents by Inventor Aaron Tsai

Aaron Tsai has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240104021
    Abstract: Embodiments are for processor cross-core cache line contention management. A computer-implemented method includes sending a cross-invalidate command to one or more caches based on receiving a cache state change request for a cache line in a symmetric multiprocessing system, and determining a retry delay based on receiving a cross-invalidate reject response from at least one of the one or more caches. The computer-implemented method also includes waiting until a retry delay period associated with the retry delay has elapsed before resending the cross-invalidate command to the one or more caches, and granting the cache state change request for the cache line based on receiving a cross-invalidate accept response from the one or more caches.
    Type: Application
    Filed: September 23, 2022
    Publication date: March 28, 2024
    Inventors: Michael Joseph Cadigan, Jr., Gregory William Alexander, Deanna Postles Dunn Berger, Timothy Bronson, Chung-Lung K. Shum, Aaron Tsai
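
The retry-delay flow described in this abstract (broadcast a cross-invalidate, back off after a reject, grant the state change once every cache accepts) can be illustrated with a short sketch. This is not the patented implementation; PeerCache, compute_retry_delay, and the exponential back-off policy are hypothetical stand-ins.

```python
import random
import time

class PeerCache:
    """Hypothetical peer cache that sometimes rejects a cross-invalidate (XI)."""
    def receive_xi(self, line):
        return "accept" if random.random() < 0.7 else "reject"

def compute_retry_delay(attempt):
    # illustrative policy only: back off a little more after each reject
    return 0.001 * (2 ** attempt) * random.uniform(0.5, 1.0)

def request_state_change(caches, line, max_attempts=8):
    """Grant a cache state change only after all peer caches accept the XI."""
    for attempt in range(max_attempts):
        responses = [cache.receive_xi(line) for cache in caches]
        if all(r == "accept" for r in responses):
            return True                           # grant the state change request
        time.sleep(compute_retry_delay(attempt))  # wait out the retry delay period
    return False                                  # still rejected after all retries

if __name__ == "__main__":
    print(request_state_change([PeerCache() for _ in range(4)], line=0x40))
```
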
  • Publication number: 20240070075
    Abstract: A lower-level cache manages cross-core invalidation (XI) snapshots in a shared-memory multiprocessing system, where the management of XI snapshots reduces the number of required snapshots while still allowing shared lower-level caches. The lower-level cache maintains a respective response sync state for at least one processor in a plurality of processors, signifying that a line may have been changed by another processor since it was last fetched by a requesting processor.
    Type: Application
    Filed: August 23, 2022
    Publication date: February 29, 2024
    Inventors: Richard Joseph Branciforte, Gregory William Alexander, Timothy Bronson, Deanna Postles Dunn Berger, Akash V. Giri, Aaron Tsai
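
As a rough illustration of the per-processor response sync state described above, the toy model below has a shared lower-level cache remember which cores may hold a stale copy of a line; a snapshot-style refresh is needed only when the requesting core is in that set. Class and method names are assumptions, not the patent's design.

```python
from collections import defaultdict

class SharedLowerLevelCache:
    """Toy per-core 'may be stale' bookkeeping for each cache line."""

    def __init__(self, num_cores):
        self.num_cores = num_cores
        # stale[line] = set of cores whose last-fetched copy may be out of date
        self.stale = defaultdict(set)

    def record_write(self, line, writer_core):
        # another core changed the line, so every other core's copy may be stale
        self.stale[line] = {c for c in range(self.num_cores) if c != writer_core}

    def fetch(self, line, core):
        needs_snapshot = core in self.stale[line]  # refresh only if possibly stale
        self.stale[line].discard(core)             # this core is now current
        return needs_snapshot
```

With this bookkeeping, a core whose copy is known to be current can skip the cross-invalidation snapshot entirely, which is the reduction in required snapshots the abstract refers to.
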
  • Patent number: 11907124
    Abstract: Aspects include using a shadow copy of a level 1 (L1) cache in a cache hierarchy. A method includes maintaining the shadow copy of the L1 cache in the cache hierarchy. The maintaining includes updating the shadow copy of the L1 cache with memory content changes to the L1 cache a number of pipeline cycles after the L1 cache is updated with the memory content changes.
    Type: Grant
    Filed: March 31, 2022
    Date of Patent: February 20, 2024
    Assignee: International Business Machines Corporation
    Inventors: Yair Fried, Aaron Tsai, Eyal Naor, Christian Jacobi, Timothy Bronson, Chung-Lung K. Shum
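
A minimal sketch of the delayed shadow copy described in this abstract: writes land in the L1 immediately and are replayed into the shadow copy a fixed number of pipeline cycles later. The class name, the dictionary-based caches, and the three-cycle delay are illustrative assumptions.

```python
from collections import deque

class ShadowedL1:
    def __init__(self, delay_cycles=3):
        self.l1 = {}
        self.shadow = {}
        self.delay_cycles = delay_cycles
        self.pending = deque()   # (apply_at_cycle, address, value)
        self.cycle = 0

    def write(self, address, value):
        self.l1[address] = value   # L1 is updated right away
        self.pending.append((self.cycle + self.delay_cycles, address, value))

    def tick(self):
        self.cycle += 1
        # the shadow copy catches up delay_cycles after the L1 update
        while self.pending and self.pending[0][0] <= self.cycle:
            _, address, value = self.pending.popleft()
            self.shadow[address] = value
```
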
  • Patent number: 11892949
    Abstract: A method and a system detect a cache line as a potential or confirmed hot cache line based on receiving an intervention of a processor associated with a fetch of the cache line. The method and system include suppressing an action of operations associated with the hot cache line. A related method and system detect an intervention and, in response, communicate an intervention notification to another processor. An alternative method and system detect a hot data object associated with an intervention event of an application. The method and system can suppress actions of operations associated with the hot data object. An alternative method and system can detect and communicate an intervention associated with a data object.
    Type: Grant
    Filed: January 13, 2023
    Date of Patent: February 6, 2024
    Assignee: International Business Machines Corporation
    Inventors: Christian Zoellin, Christian Jacobi, Chung-Lung K. Shum, Martin Recktenwald, Anthony Saporito, Aaron Tsai
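
A compact sketch of the hot-line idea in this abstract: count interventions per cache line, flag a line as hot past a threshold, and suppress selected operations for hot lines. The threshold and the set of suppressed operations are hypothetical, not taken from the patent.

```python
from collections import Counter

class HotLineTracker:
    def __init__(self, hot_threshold=4):
        self.intervention_counts = Counter()
        self.hot_threshold = hot_threshold

    def record_intervention(self, line):
        # called when another processor intervenes on a fetch of this line
        self.intervention_counts[line] += 1

    def is_hot(self, line):
        return self.intervention_counts[line] >= self.hot_threshold

    def should_suppress(self, line, operation):
        # e.g. skip speculative work that would only add to the contention
        return self.is_hot(line) and operation in {"prefetch", "speculative_fetch"}
```
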
  • Patent number: 11880304
    Abstract: To facilitate efficient processing of contended cache lines, a cache controller that is associated with a requestor receives a fetch request for data from the requestor. The fetch request is associated with a cache scope designation. If the data is in a high-level cache (e.g., the L1 cache) associated with the requestor, the cache controller returns the requested data to the requestor. If the data is not in the high-level cache, if the data is not within the cache pool identified by the cache scope of search designation, and/or if obtaining the data is contentious, the controller returns a cache miss, undeliverable data, and a request-done instruction to the requestor. Such a scheme permits address contention events when the requestor deems them necessary and/or important. As such, address contention events and the associated costs (degraded performance, latencies, increased execution times, and inefficient use of resources) may be diminished.
    Type: Grant
    Filed: May 24, 2022
    Date of Patent: January 23, 2024
    Assignee: International Business Machines Corporation
    Inventors: Taylor J. Pritchard, Aaron Tsai, Richard Joseph Branciforte, Ashraf ElSharif, Gregory William Alexander, Deanna Postles Dunn Berger, Michael Fee
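
The scoped-fetch behavior in this abstract can be sketched as follows: the controller only searches caches inside the requested scope and returns a miss (rather than chasing the line) when the data is out of scope or contended. The Scope levels, dictionary caches, and contention check are illustrative assumptions.

```python
from enum import Enum

class Scope(Enum):
    L1 = 1
    L2 = 2
    L3 = 3

def scoped_fetch(address, scope, caches, contended_lines):
    """caches maps a Scope level to a dict of address -> data."""
    for level in Scope:
        if level.value > scope.value:
            break                          # outside the requested cache scope
        if address in caches[level]:
            if address in contended_lines and level is not Scope.L1:
                break                      # contentious: don't fight for the line
            return ("hit", caches[level][address])
    # corresponds to the cache miss / undeliverable data / request done response
    return ("miss", None)
```
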
  • Publication number: 20230409331
    Abstract: In an approach, responsibility for reissuing a fetch micro-operation is allocated to a reissue queue subsequent to a cache miss corresponding to a cache and the fetch micro-operation. Responsive to higher level cache returning data to the cache, an issue selection algorithm of the issue queue is overridden to prioritize reissuing the fetch micro-operation.
    Type: Application
    Filed: June 16, 2022
    Publication date: December 21, 2023
    Inventors: Jonathan Ting Hsieh, Gregory William Alexander, Aaron Tsai, Yossi Shapira
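
A small sketch of the reissue-priority idea: a fetch micro-operation that missed is parked, and once its data returns the normal selection order is overridden so it reissues first. The queue structure and method names are hypothetical.

```python
class ReissueQueue:
    def __init__(self):
        self.entries = []          # micro-ops waiting to (re)issue, oldest first
        self.prioritized = set()   # micro-ops whose data has just returned

    def park_after_miss(self, uop):
        self.entries.append(uop)

    def data_returned(self, uop):
        self.prioritized.add(uop)  # override normal selection for this fetch

    def select_next(self):
        for uop in self.entries:
            if uop in self.prioritized:
                self.entries.remove(uop)
                self.prioritized.discard(uop)
                return uop         # the returned fetch wins over older entries
        return self.entries.pop(0) if self.entries else None
```
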
  • Publication number: 20230401161
    Abstract: Disclosed herein is a virtual cache and method in a processor for supporting multiple threads on the same cache line. The processor is configured to support virtual memory and multiple threads. The virtual cache directory includes a plurality of directory entries, each of which is associated with a cache line. Each cache line has a corresponding tag. The tag includes a logical address, an address space identifier, a real address bit indicator, and a per-thread validity bit for each thread that accesses the cache line. When a subsequent thread determines that the cache line is valid for that thread, the validity bit for that thread is set without invalidating the validity bits of any other threads.
    Type: Application
    Filed: August 18, 2023
    Publication date: December 14, 2023
    Inventors: Markus Helms, Christian Jacobi, Ulrich Mayer, Martin Recktenwald, Johannes C. Reichart, Anthony Saporito, Aaron Tsai
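
The per-thread validity bits described in this abstract can be illustrated with a tiny tag structure: validating the line for one thread sets only that thread's bit and leaves the others untouched. Field names and the thread count are assumptions for illustration.

```python
from dataclasses import dataclass, field

NUM_THREADS = 4

@dataclass
class DirectoryTag:
    logical_address: int
    address_space_id: int
    real_address_bit: bool = False
    # one validity bit per hardware thread that may access the line
    thread_valid: list = field(default_factory=lambda: [False] * NUM_THREADS)

def validate_for_thread(tag: DirectoryTag, thread_id: int) -> None:
    # set this thread's bit; no other thread's bit is invalidated
    tag.thread_valid[thread_id] = True

def is_valid_for_thread(tag: DirectoryTag, thread_id: int) -> bool:
    return tag.thread_valid[thread_id]
```
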
  • Publication number: 20230385195
    Abstract: To facilitate efficient processing of contended cache lines, a cache controller that is associated with a requestor receives a fetch request for data from the requestor. The fetch request is associated with a cache scope designation. If the data is in a high-level cache (e.g., the L1 cache) associated with the requestor, the cache controller returns the requested data to the requestor. If the data is not in the high-level cache, if the data is not within the cache pool identified by the cache scope of search designation, and/or if obtaining the data is contentious, the controller returns a cache miss, undeliverable data, and a request-done instruction to the requestor. Such a scheme permits address contention events when the requestor deems them necessary and/or important. As such, address contention events and the associated costs (degraded performance, latencies, increased execution times, and inefficient use of resources) may be diminished.
    Type: Application
    Filed: May 24, 2022
    Publication date: November 30, 2023
    Inventors: Taylor J. Pritchard, Aaron Tsai, Richard Joseph Branciforte, Ashraf ElSharif, Gregory William Alexander, Deanna Postles Dunn Berger, Michael Fee
  • Publication number: 20230315631
    Abstract: Aspects include using a shadow copy of a level 1 (L1) cache in a cache hierarchy. A method includes maintaining the shadow copy of the L1 cache in the cache hierarchy. The maintaining includes updating the shadow copy of the L1 cache with memory content changes to the L1 cache a number of pipeline cycles after the L1 cache is updated with the memory content changes.
    Type: Application
    Filed: March 31, 2022
    Publication date: October 5, 2023
    Inventors: Yair Fried, Aaron Tsai, Eyal Naor, Christian Jacobi, Timothy Bronson, Chung-Lung K. Shum
  • Publication number: 20230315633
    Abstract: A computer system includes a processor core and a memory system in signal communication with the processor core. The memory system includes a first cache and a second cache. The first cache is arranged at a first level of a hierarchy in the memory system and is configured to store a plurality of first-cache entries. The second cache is arranged at a second level of the hierarchy that is lower than the first level, and stores a plurality of second-cache entries. The first cache maintains a directory that contains information for each of the first-cache entries. The second cache maintains a shadow pointer directory (SPD) that includes one or more SPD entries that map each of the first-cache entries to a corresponding second-cache entry at a lower-level cache location.
    Type: Application
    Filed: April 4, 2022
    Publication date: October 5, 2023
    Inventors: Ashraf ElSharif, Richard Joseph Branciforte, Gregory William Alexander, Deanna Postles Dunn Berger, Timothy Bronson, Aaron Tsai, Taylor J. Pritchard, Markus Kaltenbach, Christian Jacobi, Michael A. Blake
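
A rough sketch of the shadow pointer directory (SPD): the second cache keeps a map from each first-cache entry to the second-cache location holding the same line, so it can find the backing entry without a full search. The (set, way) keying is an assumption for illustration.

```python
class ShadowPointerDirectory:
    def __init__(self):
        # (first_cache_set, first_cache_way) -> (second_cache_set, second_cache_way)
        self.spd = {}

    def install(self, l1_set, l1_way, l2_set, l2_way):
        self.spd[(l1_set, l1_way)] = (l2_set, l2_way)

    def lookup(self, l1_set, l1_way):
        # lets the second cache locate (e.g. to invalidate) the entry backing
        # a given first-cache entry
        return self.spd.get((l1_set, l1_way))

    def evict_first_cache_entry(self, l1_set, l1_way):
        self.spd.pop((l1_set, l1_way), None)
```
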
  • Patent number: 11775445
    Abstract: Disclosed herein is a virtual cache and method in a processor for supporting multiple threads on the same cache line. The processor is configured to support virtual memory and multiple threads. The virtual cache directory includes a plurality of directory entries, each of which is associated with a cache line. Each cache line has a corresponding tag. The tag includes a logical address, an address space identifier, a real address bit indicator, and a per-thread validity bit for each thread that accesses the cache line. When a subsequent thread determines that the cache line is valid for that thread, the validity bit for that thread is set without invalidating the validity bits of any other threads.
    Type: Grant
    Filed: October 13, 2020
    Date of Patent: October 3, 2023
    Assignee: International Business Machines Corporation
    Inventors: Markus Helms, Christian Jacobi, Ulrich Mayer, Martin Recktenwald, Johannes C. Reichart, Anthony Saporito, Aaron Tsai
  • Publication number: 20230281132
    Abstract: Embodiments are for special tracking pool enhancement for core L1 address invalidates. An invalidate request is designated to fill an entry in a queue in a local cache of a processor core, the queue including a first allocation associated with processing any type of invalidate request and a second allocation associated with processing an invalidate request not requiring a response in order for a controller to be made available, the entry being in the second allocation. Responsive to designating the invalidate request to fill the entry in the queue in the local cache, a state of the controller that made the invalidate request is changed to available based at least in part on the entry being in the second allocation.
    Type: Application
    Filed: March 4, 2022
    Publication date: September 7, 2023
    Inventors: Deanna Postles Dunn Berger, Gregory William Alexander, Richard Joseph Branciforte, Aaron Tsai, Markus Kaltenbach
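
The two-allocation queue in this abstract can be sketched as below: invalidate requests that need no response go to a dedicated pool, and filling an entry there frees the requesting controller immediately. Pool sizes, class names, and the fallback behavior are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class InvalidateRequest:
    address: int
    needs_response: bool

@dataclass
class Controller:
    state: str = "busy"

class InvalidateQueue:
    def __init__(self, general_slots=8, no_response_slots=4):
        self.general = [None] * general_slots          # first allocation: any request
        self.no_response = [None] * no_response_slots  # second allocation

    def enqueue(self, request, controller):
        pool = self.general if request.needs_response else self.no_response
        for i, slot in enumerate(pool):
            if slot is None:
                pool[i] = request
                if pool is self.no_response:
                    controller.state = "available"     # controller freed at once
                return True
        return False   # chosen pool full; a real design might fall back or retry
```
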
  • Patent number: 11748266
    Abstract: Embodiments are for special tracking pool enhancement for core L1 address invalidates. An invalidate request is designated to fill an entry in a queue in a local cache of a processor core, the queue including a first allocation associated with processing any type of invalidate request and a second allocation associated with processing an invalidate request not requiring a response in order for a controller to be made available, the entry being in the second allocation. Responsive to designating the invalidate request to fill the entry in the queue in the local cache, a state of the controller that made the invalidate request is changed to available based at least in part on the entry being in the second allocation.
    Type: Grant
    Filed: March 4, 2022
    Date of Patent: September 5, 2023
    Assignee: International Business Machines Corporation
    Inventors: Deanna Postles Dunn Berger, Gregory William Alexander, Richard Joseph Branciforte, Aaron Tsai, Markus Kaltenbach
  • Publication number: 20230153244
    Abstract: A method and a system detect a cache line as a potential or confirmed hot cache line based on receiving an intervention of a processor associated with a fetch of the cache line. The method and system include suppressing an action of operations associated with the hot cache line. A related method and system detect an intervention and, in response, communicate an intervention notification to another processor. An alternative method and system detect a hot data object associated with an intervention event of an application. The method and system can suppress actions of operations associated with the hot data object. An alternative method and system can detect and communicate an intervention associated with a data object.
    Type: Application
    Filed: January 13, 2023
    Publication date: May 18, 2023
    Inventors: Christian Zoellin, Christian Jacobi, Chung-Lung K. Shum, Martin Recktenwald, Anthony Saporito, Aaron Tsai
  • Patent number: 11605140
    Abstract: Various aspects of the subject technology relate to systems, methods, and machine-readable media for an interactive analytics platform responsive to data inquiries. These aspects include identifying a request for the energy consumption data of an energy user (the second user), received from a first user during a live interaction with that energy user. A utility bill, as well as energy consumption data associated with the second user, may be retrieved using an authentication of the second user. Energy consumption factors that influence the billing amount of the utility bill may be identified based on the energy consumption data. An interface with specific billing insights that correspond to the energy consumption factors is generated. The specific billing insights explain why the billing amount of the utility bill exceeded a cost expectation of the second user. The interface, mapped to the utility bill, may be provided for display.
    Type: Grant
    Filed: March 27, 2020
    Date of Patent: March 14, 2023
    Assignee: OPower, Inc.
    Inventors: Lawrence Han, Suryaveer Singh Lodha, Daniel Bloomfield Ramagem, Michael Ian Kristopher Eaton, Nowell Boardman Strite, Aaron Tsai Otani, Khanh Duc Nguyen
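
As a loose sketch of the insight-generation step described in this abstract, the function below attributes a bill overage to the usage categories with the largest increases and attaches a short explanation to each. Everything here (field names, the attribution rule, units) is a hypothetical illustration, not OPower's method.

```python
def generate_billing_insights(bill, usage_by_category, expected_amount):
    """bill: {'amount': float}; usage_by_category: {name: {'increase': float}} (kWh)."""
    overage = bill["amount"] - expected_amount
    insights = []
    # attribute the overage to the categories whose usage grew the most
    for category, usage in sorted(usage_by_category.items(),
                                  key=lambda kv: kv[1]["increase"], reverse=True):
        if usage["increase"] <= 0:
            continue
        insights.append({
            "factor": category,
            "explanation": (f"{category} usage rose by {usage['increase']} kWh this "
                            f"period, contributing to the ${overage:.2f} overage."),
        })
    return insights
```
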
  • Patent number: 11586542
    Abstract: A method and a system detect a cache line as a potential or confirmed hot cache line based on receiving an intervention of a processor associated with a fetch of the cache line. The method and system include suppressing an action of operations associated with the hot cache line. A related method and system detect an intervention and, in response, communicate an intervention notification to another processor. An alternative method and system detect a hot data object associated with an intervention event of an application. The method and system can suppress actions of operations associated with the hot data object. An alternative method and system can detect and communicate an intervention associated with a data object.
    Type: Grant
    Filed: April 9, 2021
    Date of Patent: February 21, 2023
    Assignee: International Business Machines Corporation
    Inventors: Christian Zoellin, Christian Jacobi, Chung-Lung K. Shum, Martin Recktenwald, Anthony Saporito, Aaron Tsai
  • Patent number: 11403222
    Abstract: Disclosed herein is a method for operating access to a cache memory via an effective address comprising a tag field and a cache line index field. The method comprises splitting the tag field into a first group of bits and a second group of bits. The line index bits and the first group of bits are searched in a set directory, and a set identifier is generated indicating the set containing the respective cache line of the effective address. The set identifier, the line index bits, and the second group of bits are then searched in a validation directory. In response to determining, based on this second search, that the cache line is present in the set, a hit signal is generated.
    Type: Grant
    Filed: October 13, 2020
    Date of Patent: August 2, 2022
    Assignee: International Business Machines Corporation
    Inventors: Christian Jacobi, Ulrich Mayer, Martin Recktenwald, Anthony Saporito, Aaron Tsai
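
A minimal sketch of the two-step lookup in this abstract: the set directory is probed with the line index plus the first group of tag bits to pick a set, and the validation directory then confirms the hit with the second group. Bit widths and the dictionary-based directories are illustrative assumptions.

```python
def split_tag(tag_bits, split_point):
    first_group = tag_bits >> split_point                 # upper tag bits
    second_group = tag_bits & ((1 << split_point) - 1)    # lower tag bits
    return first_group, second_group

def lookup(effective_address, set_directory, validation_directory,
           index_bits=6, tag_split=10):
    line_index = effective_address & ((1 << index_bits) - 1)
    tag = effective_address >> index_bits
    first_group, second_group = split_tag(tag, tag_split)

    set_id = set_directory.get((line_index, first_group))       # which set holds it?
    if set_id is None:
        return None                                             # miss
    if validation_directory.get((set_id, line_index, second_group)):
        return set_id                                            # hit signal
    return None
```
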
  • Publication number: 20210232502
    Abstract: A method and a system detect a cache line as a potential or confirmed hot cache line based on receiving an intervention of a processor associated with a fetch of the cache line. The method and system include suppressing an action of operations associated with the hot cache line. A related method and system detect an intervention and, in response, communicate an intervention notification to another processor. An alternative method and system detect a hot data object associated with an intervention event of an application. The method and system can suppress actions of operations associated with the hot data object. An alternative method and system can detect and communicate an intervention associated with a data object.
    Type: Application
    Filed: April 9, 2021
    Publication date: July 29, 2021
    Inventors: Christian Zoellin, Christian Jacobi, Chung-Lung K. Shum, Martin Recktenwald, Anthony Saporito, Aaron Tsai
  • Patent number: 11010298
    Abstract: A method and a system detect a cache line as a potential or confirmed hot cache line based on receiving an intervention of a processor associated with a fetch of the cache line. The method and system include suppressing an action of operations associated with the hot cache line. A related method and system detect an intervention and, in response, communicate an intervention notification to another processor. An alternative method and system detect a hot data object associated with an intervention event of an application. The method and system can suppress actions of operations associated with the hot data object. An alternative method and system can detect and communicate an intervention associated with a data object.
    Type: Grant
    Filed: January 17, 2020
    Date of Patent: May 18, 2021
    Assignee: International Business Machines Corporation
    Inventors: Christian Zoellin, Christian Jacobi, Chung-Lung K. Shum, Martin Recktenwald, Anthony Saporito, Aaron Tsai
  • Patent number: 11010307
    Abstract: A method, a computer system, and a computer program product perform a directory lookup in a first level cache for requested cache line data. A first processor core can detect that the requested cache line data is not found in a plurality of sets of data in the first level cache and detect that existing cache line data stored in a least recently used data set in the first level cache is in an exclusive state, wherein the existing cache line data stored in the least recently used data set is to be overwritten by the requested cache line data retrieved from a second level cache. Furthermore, the first processor core can send a request for the requested cache line data and a physical address of the least recently used data set to the second level cache and execute additional instructions based on the first level cache and data retrieved from the second level cache.
    Type: Grant
    Filed: December 2, 2019
    Date of Patent: May 18, 2021
    Assignee: International Business Machines Corporation
    Inventors: Deanna P. D. Berger, Christian Jacobi, Martin Recktenwald, Yossi Shapira, Aaron Tsai
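
A brief sketch of the miss path described above: on an L1 directory miss, the least recently used way is chosen as the victim, and if that victim is held exclusive its physical address is sent to the second-level cache along with the fetch. The data structures and the L2Stub are hypothetical stand-ins.

```python
class L2Stub:
    """Stand-in second-level cache; a real one would use the victim address to
    handle the exclusive line about to be overwritten in the first level cache."""
    def fetch(self, tag, victim_physical_address=None):
        return f"data_for_{tag:#x}"

def l1_fetch(l1_set, requested_tag, l2):
    # directory lookup across the ways (sets of data) of the L1 set
    for way in l1_set:
        if way["tag"] == requested_tag:
            return way["data"]                        # hit

    # miss: pick the least recently used way as the victim
    victim = max(l1_set, key=lambda way: way["lru_age"])
    hint = victim["phys_addr"] if victim["state"] == "exclusive" else None

    # send the fetch plus, when the victim is exclusive, its physical address
    data = l2.fetch(requested_tag, victim_physical_address=hint)
    victim.update(tag=requested_tag, data=data, state="shared", lru_age=0)
    return data
```
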