Patents by Inventor John G. Favor

John G. Favor has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11803638
    Abstract: In order to mitigate side channel attacks that exploit speculative store-to-load forwarding, a store dependence predictor is used to prevent store-to-load forwarding if the load and store instructions do not have a matching translation context (TC). In one design, a store queue (SQ) stores the TC—a function of the privilege mode (PM), address space identifier (ASID), and/or virtual machine identifier (VMID)—of each store and conditions store-to-load forwarding on matching store and load TCs. In another design, a memory dependence predictor (MDP) disambiguates predictions of store-to-load forwarding based on the load instruction's TC. In each design, the MDP or SQ does not predict or allow store-to-load forwarding for loads whose addresses, but not their TCs, match an MDP entry.
    Type: Grant
    Filed: February 25, 2021
    Date of Patent: October 31, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventor: John G. Favor
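
The forwarding gate described in the entry above can be pictured with a short software model. This is an illustrative sketch only, not code from the patent; the TranslationContext and StoreQueueEntry types and the try_forward helper are hypothetical names chosen for the example.

```cpp
#include <cstdint>
#include <optional>
#include <vector>

// Hypothetical translation context: privilege mode, ASID, and VMID.
struct TranslationContext {
    uint8_t  pm;
    uint16_t asid;
    uint16_t vmid;
    bool operator==(const TranslationContext&) const = default;
};

// Each store queue entry records the store's address and data together with
// the TC under which the store executed.
struct StoreQueueEntry {
    uint64_t           address;
    uint64_t           data;
    TranslationContext tc;
};

// Forwarding requires both an address match and a TC match against the
// youngest older store; on a TC mismatch the load does not receive forwarded
// data and instead waits for the store to commit.
std::optional<uint64_t> try_forward(const std::vector<StoreQueueEntry>& sq,
                                    uint64_t load_addr,
                                    const TranslationContext& load_tc) {
    for (auto it = sq.rbegin(); it != sq.rend(); ++it) {   // youngest store first
        if (it->address == load_addr) {
            if (it->tc == load_tc)
                return it->data;   // safe to forward
            return std::nullopt;   // address matches, TC differs: no forwarding
        }
    }
    return std::nullopt;           // no older store to the same address
}
```

The point of the sketch is that an address match alone is no longer sufficient: a TC mismatch falls back to the non-forwarding path rather than speculatively supplying data across a privilege or address-space boundary.
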
  • Patent number: 11803637
    Abstract: A processor and a method are disclosed that mitigate side channel attacks (SCAs) that exploit store-to-load forwarding operations. In one embodiment, the processor detects a translation context change from a first translation context (TC) to a second TC and responsively disallows store-to-load forwarding until all store instructions older than the TC change are committed. The TC comprises an address space identifier (ASID), a virtual machine identifier (VMID), a privilege mode (PM) or a combination of two or more of the ASID, VMID and PM, or a derivative thereof, such as a TC hash, TC generation value, or a RobID associated with the last TC-updating instruction. In other embodiments, TC generation values of load and store instructions are compared or RobIDs of the load and store instructions are compared with the RobID associated with the last TC-updating instruction. If the instructions' RobIDs straddle the TC boundary, store-to-load forwarding is not allowed.
    Type: Grant
    Filed: February 25, 2021
    Date of Patent: October 31, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventor: John G. Favor
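
One of the variants described in the entry above compares the RobIDs of the load and store against the RobID of the last TC-updating instruction. A minimal sketch of that straddle check follows; it assumes monotonically increasing RobIDs (real hardware must also handle wrap-around), and all names are hypothetical.

```cpp
#include <cstdint>

// Hypothetical bookkeeping: the RobID of the most recent TC-updating
// instruction marks the translation-context boundary.
struct TcBoundary {
    uint64_t last_tc_change_rob_id = 0;
};

// Forwarding is disallowed when the store and load straddle the boundary,
// i.e. the store is older than the TC change and the load is younger.
// RobIDs are assumed monotonically increasing; wrap-around is ignored here.
bool forwarding_allowed(const TcBoundary& b,
                        uint64_t store_rob_id,
                        uint64_t load_rob_id) {
    bool store_older_than_change  = store_rob_id < b.last_tc_change_rob_id;
    bool load_younger_than_change = load_rob_id  > b.last_tc_change_rob_id;
    return !(store_older_than_change && load_younger_than_change);
}
```
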
  • Patent number: 11803640
    Abstract: An out-of-order and speculative execution microprocessor that mitigates side channel attacks includes a cache memory and fill request generation logic that generates a request to fill the cache memory with a cache line implicated by a memory address that misses in the cache memory. At least one execution pipeline receives first and second load operations, detects a condition in which the first load generates a need for an architectural exception, the second load misses in the cache memory, and the second load is newer in program order than the first load, and prevents state of the cache memory from being affected by the miss of the second load by inhibiting the fill request generation logic from generating a fill request for the second load or by canceling the fill request for the second load if the fill request generation logic has already generated the fill request for the second load.
    Type: Grant
    Filed: August 27, 2020
    Date of Patent: October 31, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventors: John G. Favor, Srivatsan Srinivasan
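
A rough software model of the fill-suppression idea in the entry above. It is an assumption-laden sketch, not the patent's implementation: InFlightLoad, FillRequest, allow_fill, and cancel_younger_fills are invented names, and program order is approximated with monotonically increasing RobIDs.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical tracking of in-flight loads; rob_id approximates program order.
struct InFlightLoad {
    uint64_t rob_id;
    bool     raised_exception;   // load needs an architectural exception
};

struct FillRequest {
    uint64_t rob_id;             // the load that generated this fill
    uint64_t line_addr;
};

// Suppress a new fill request for a missing load if any older load has
// raised an architectural exception, so the doomed speculative miss cannot
// change cache state.
bool allow_fill(const std::vector<InFlightLoad>& loads, uint64_t missing_load_rob_id) {
    return std::none_of(loads.begin(), loads.end(), [&](const InFlightLoad& l) {
        return l.raised_exception && l.rob_id < missing_load_rob_id;
    });
}

// Alternatively, cancel fill requests already generated for loads younger
// than the excepting load.
void cancel_younger_fills(std::vector<FillRequest>& pending, uint64_t excepting_rob_id) {
    pending.erase(std::remove_if(pending.begin(), pending.end(),
                                 [&](const FillRequest& f) { return f.rob_id > excepting_rob_id; }),
                  pending.end());
}
```
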
  • Patent number: 11797673
    Abstract: A superscalar out-of-order speculative execution microprocessor mitigates side channel attacks that attempt to exploit speculation windows within which instructions dependent in their execution upon a result of a load instruction may speculatively execute before being flushed because the load instruction raises an architectural exception. A load unit signals an abort request, among other potential abort requests, to control logic in response to detecting that a load instruction causes a need for an architectural exception. The control logic initiates an abort process as soon as the control logic determines that the abort request from the load unit is highest priority among any other concurrently received abort requests and determines a location of the exception-causing load instruction within the program order of outstanding instructions. To perform the abort process, the control logic flushes from the pipeline all instructions dependent upon a result of the exception-causing load instruction.
    Type: Grant
    Filed: March 17, 2021
    Date of Patent: October 24, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventors: John G. Favor, Srivatsan Srinivasan
  • Patent number: 11789871
    Abstract: A microprocessor prevents same address load-load ordering violations. Each load queue entry holds a load physical memory line address (PMLA) and an indication of whether a load instruction has completed execution. The microprocessor fills a line specified by a fill PMLA into a cache entry and snoops the load queue with the fill PMLA, either before the fill or in an atomic manner with the fill with respect to the ability of the filled entry to be hit upon by any load instruction, to determine whether the fill PMLA matches load PMLAs in load queue entries associated with load instructions that have completed execution and there are other load instructions in the load queue that have not completed execution. The microprocessor, if the condition is true, flushes at least the other load instructions in the load queue that have not completed execution.
    Type: Grant
    Filed: May 18, 2022
    Date of Patent: October 17, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventors: John G. Favor, Srivatsan Srinivasan
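
The snoop-on-fill check described in the entry above can be sketched as follows. This is an illustrative model only; LoadQueueEntry and fill_requires_flush are hypothetical names, and the atomicity between the snoop and the fill that the abstract requires is not modeled.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical load queue entry: the physical memory line address (PMLA)
// the load accessed and whether the load has completed execution.
struct LoadQueueEntry {
    uint64_t pmla;
    bool     completed;
};

// On a cache fill, snoop the load queue: if a completed load matches the
// fill PMLA while other loads are still outstanding, a younger completed
// load may have observed stale data, so the not-yet-completed loads are
// flushed to preserve same-address load-load ordering.
bool fill_requires_flush(const std::vector<LoadQueueEntry>& lq, uint64_t fill_pmla) {
    bool completed_match = std::any_of(lq.begin(), lq.end(),
        [&](const LoadQueueEntry& e) { return e.completed && e.pmla == fill_pmla; });
    bool incomplete_present = std::any_of(lq.begin(), lq.end(),
        [](const LoadQueueEntry& e) { return !e.completed; });
    return completed_match && incomplete_present;
}
```
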
  • Publication number: 20230315838
    Abstract: A processor and a method are disclosed that mitigate side channel attacks (SCAs) that exploit store-to-load forwarding operations. In one embodiment, the processor detects a translation context change from a first translation context (TC) to a second TC and responsively disallows store-to-load forwarding until all store instructions older than the TC change are committed. The TC comprises an address space identifier (ASID), a virtual machine identifier (VMID), a privilege mode (PM) or a combination of two or more of the ASID, VMID and PM, or a derivative thereof, such as a TC hash, TC generation value, or a RobID associated with the last TC-updating instruction. In other embodiments, TC generation values of load and store instructions are compared or RobIDs of the load and store instructions are compared with the RobID associated with the last TC-updating instruction. If the instructions' RobIDs straddle the TC boundary, store-to-load forwarding is not allowed.
    Type: Application
    Filed: June 9, 2023
    Publication date: October 5, 2023
    Inventor: John G. Favor
  • Patent number: 11762999
    Abstract: A microprocessor for mitigating side channel attacks includes a memory subsystem that receives a load operation that specifies a load address. The memory subsystem includes a virtually-indexed, virtually-tagged data cache memory (VIVTDCM) comprising entries that hold translation information. The memory subsystem also includes a data translation lookaside buffer (DTLB) comprising entries that hold physical address translations and translation information. The processor performs speculative execution of instructions and executes instructions out of program order. The memory system allows non-inclusion with respect to translation information between the VIVTDCM and the DTLB such that, for instances in time, translation information associated with the load address is present in the VIVTDCM and absent in the DTLB.
    Type: Grant
    Filed: October 6, 2020
    Date of Patent: September 19, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventors: John G. Favor, Srivatsan Srinivasan
  • Patent number: 11755732
    Abstract: A processor is disclosed that mitigates side channel attacks that exploit speculative store-to-load forwarding. The processor includes logic that conditions store-to-load forwarding of uncommitted store data in the store queue from an uncommitted store instruction to the load instruction upon circumstances associated with a translation context (TC) change or update. The TC comprises an address space identifier (ASID), a virtual machine identifier (VMID), a privilege mode (PM) or a combination of two or more of the ASID, VMID and PM or a derivative thereof. The logic is embedded or associated with any of several structures, such as a store queue (SQ), a memory dependence predictor (MDP), or a reorder buffer (ROB).
    Type: Grant
    Filed: February 25, 2021
    Date of Patent: September 12, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventor: John G. Favor
  • Patent number: 11755731
    Abstract: A processor for mitigating side channel attacks includes units that perform fetch, decode, and execution of instructions and pipeline control logic. The processor performs speculative and out-of-order execution of the instructions. The units detect and notify the pipeline control logic of events that cause a change from a first translation context (TC) to a second TC. In response, the pipeline control logic prevents speculative execution of instructions that are dependent in their execution on the change to the second TC until all instructions that are dependent on the first TC have completed execution, which may involve stalling their dispatch until all first-TC-dependent instructions have at least completed execution, or tagging them and dispatching them to execution schedulers but preventing them from starting execution until all first-TC-dependent instructions have at least completed execution.
    Type: Grant
    Filed: July 23, 2020
    Date of Patent: September 12, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventors: John G. Favor, David S. Oliver
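
A minimal sketch of the dispatch-stall variant described in the entry above, assuming a simple counter of old-TC instructions still in flight; the TcChangeState type and may_dispatch helper are hypothetical names for illustration.

```cpp
#include <cstdint>

// Hypothetical bookkeeping: how many instructions that depend on the old
// translation context are still in flight after a TC change is observed.
struct TcChangeState {
    uint64_t old_tc_inflight = 0;
};

// Dispatch gate: instructions that follow the TC change wait until every
// old-TC instruction has completed execution; in the tagging variant the
// same condition instead gates the start of execution after dispatch.
bool may_dispatch(const TcChangeState& s, bool instr_follows_tc_change) {
    return !instr_follows_tc_change || s.old_tc_inflight == 0;
}
```
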
  • Patent number: 11733972
    Abstract: A microprocessor that mitigates side channel attacks. The microprocessor includes a data cache memory and a load unit that receive a load operation that specifies a load address. The processor performs speculative execution of instructions and executes instructions out of program order. The load unit detects that the load operation does not have permission to access the load address or that the load address specifies a location for which a valid address translation does not currently exist and provides random load data as a result of the execution of the load operation.
    Type: Grant
    Filed: October 6, 2020
    Date of Patent: August 22, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventors: John G. Favor, Srivatsan Srinivasan
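
The random-data response described in the entry above might be modeled as below. This is an illustrative sketch only; TranslationResult and load_result are invented names, and a real load unit would source the random value from hardware rather than a software PRNG.

```cpp
#include <cstdint>
#include <random>

// Hypothetical outcome of the load's translation and permission checks.
struct TranslationResult {
    bool     valid;      // a valid address translation exists
    bool     readable;   // the load is permitted to read the location
    uint64_t data;       // data returned when the access is legal
};

// When the load faults, return random data instead of real data (which
// would leak) or a stall (which is a timing tell); the instruction is
// later flushed when the exception is taken, so the random value never
// becomes architecturally visible.
uint64_t load_result(const TranslationResult& t, std::mt19937_64& rng) {
    if (!t.valid || !t.readable)
        return rng();    // random filler for the faulting speculative load
    return t.data;
}
```
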
  • Patent number: 11734426
    Abstract: A microprocessor for mitigating side channel attacks includes a memory subsystem having at least a data cache memory and configured to receive a load operation that specifies a load address. The processor performs speculative execution of instructions and executes instructions out of program order. The memory subsystem, in response to detecting that the load address misses in the data cache memory: detects a condition in which the load address specifies a location for which a valid address translation does not currently exist or permission to read from the location is not allowed, and prevents cache line data implicated by the missing load address from being filled into the data cache memory in response to detection of the condition.
    Type: Grant
    Filed: October 6, 2020
    Date of Patent: August 22, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventors: John G. Favor, Srivatsan Srinivasan
  • Publication number: 20230244778
    Abstract: A physically-tagged data cache memory mitigates side channel attacks by using a translation context (TC). With each entry allocation, control logic uses the received TC to perform the allocation, and with each access uses the received TC in a hit determination. The TC includes an address space identifier (ASID), virtual machine identifier (VMID), a privilege mode (PM) or translation regime (TR), or combination thereof. The TC is included in a tag of the allocated entry. Alternatively, or additionally, the TC is included in the set index to select a set of entries of the cache memory. Also, the TC may be hashed with address index bits to generate a small tag also included in the allocated entry used to generate an access early miss indication and way select.
    Type: Application
    Filed: April 3, 2023
    Publication date: August 3, 2023
    Inventors: John G. Favor, Srivatsan Srinivasan
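
One of the options listed in the entry above, folding the TC into the set index, can be sketched roughly as follows. The hash, field widths, and names are arbitrary choices for illustration, not the patent's design.

```cpp
#include <cstdint>

// Hypothetical geometry: 64 sets of 64-byte lines; arbitrary TC fields.
constexpr unsigned kSetBits = 6;
constexpr uint64_t kSetMask = (1u << kSetBits) - 1;

struct TranslationContext {
    uint16_t asid;
    uint16_t vmid;
    uint8_t  pm;
};

// Fold a hash of the TC into the set index so the same line maps to
// different sets under different translation contexts, disturbing the
// shared indexing a prime-and-probe style attack relies on.
uint32_t set_index(uint64_t paddr, const TranslationContext& tc) {
    uint64_t addr_index = (paddr >> 6) & kSetMask;   // strip the line offset
    uint64_t tc_hash    = (tc.asid * 0x9E37u) ^ (uint64_t(tc.vmid) << 3) ^ tc.pm;
    return static_cast<uint32_t>((addr_index ^ tc_hash) & kSetMask);
}
```
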
  • Patent number: 11687466
    Abstract: A virtually-indexed and virtually-tagged cache has E entries each holding a memory line at a physical memory line address (PMLA), a tag of a virtual memory line address (VMLA), and permissions of a memory page that encompasses the PMLA. A directory having E corresponding entries is physically arranged as R rows by C columns, where R × C = E. Each directory entry holds a directory tag comprising hashes of corresponding portions of a page address portion of the VMLA whose tag is held in the corresponding cache entry. In response to a translation lookaside buffer management instruction (TLBMI), the microprocessor generates a target tag comprising hashes of corresponding portions of a TLBMI-specified page address. For each directory row, the microprocessor compares, for each directory entry of the row, the target and directory tags to generate a match indicator used to invalidate the corresponding cache entry.
    Type: Grant
    Filed: May 24, 2022
    Date of Patent: June 27, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventors: John G. Favor, Srivatsan Srinivasan
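
A rough sketch of the hashed-directory-tag comparison described in the entry above. The hash function, tag widths, and names are illustrative assumptions; the useful property is that the comparison is cheap and that hash collisions can only over-invalidate, never under-invalidate, so the TLB-management instruction remains correct.

```cpp
#include <cstdint>

// Hypothetical hashed directory tag: small hashes of two portions of the
// virtual page address; widths and the hash itself are arbitrary here.
struct DirTag {
    uint8_t h0;
    uint8_t h1;
    bool operator==(const DirTag&) const = default;
};

uint8_t hash8(uint64_t bits) {
    return static_cast<uint8_t>((bits * 0x9Eu) ^ (bits >> 7));
}

DirTag make_dir_tag(uint64_t virtual_page_address) {
    return { hash8(virtual_page_address & 0xFFFFF), hash8(virtual_page_address >> 20) };
}

// For each directory row, every entry's tag is compared in parallel with a
// target tag built from the TLBMI-specified page address; a match
// invalidates the corresponding cache entry.
bool should_invalidate(const DirTag& entry_tag, const DirTag& target_tag) {
    return entry_tag == target_tag;
}
```
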
  • Patent number: 11625479
    Abstract: A data cache memory mitigates side channel attacks in a processor that comprises the data cache memory and that includes a translation context (TC). A first input receives a virtual memory address. A second input receives the TC. Control logic, with each allocation of an entry of the data cache memory, uses the received virtual memory address and the received TC to perform the allocation of the entry. The control logic also, with each access of the data cache memory, uses the received virtual memory address and the received TC in a correct determination of whether the access hits in the data cache memory. The TC includes a virtual machine identifier (VMID), or a privilege mode (PM) or a translation regime (TR), or both the VMID and the PM or the TR.
    Type: Grant
    Filed: August 27, 2020
    Date of Patent: April 11, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventors: John G. Favor, Srivatsan Srinivasan
  • Patent number: 11620377
    Abstract: A physically-tagged data cache memory mitigates side channel attacks by using a translation context (TC). With each entry allocation, control logic uses the received TC to perform the allocation, and with each access uses the received TC in a hit determination. The TC includes an address space identifier (ASID), virtual machine identifier (VMID), a privilege mode (PM) or translation regime (TR), or combination thereof. The TC is included in a tag of the allocated entry. Alternatively, or additionally, the TC is included in the set index to select a set of entries of the cache memory. Also, the TC may be hashed with address index bits to generate a small tag also included in the allocated entry used to generate an access early miss indication and way select.
    Type: Grant
    Filed: August 27, 2020
    Date of Patent: April 4, 2023
    Assignee: Ventana Micro Systems Inc.
    Inventors: John G. Favor, Srivatsan Srinivasan
  • Publication number: 20220358047
    Abstract: A microprocessor prevents same address load-load ordering violations. Each load queue entry holds a load physical memory line address (PMLA) and an indication of whether a load instruction has completed execution. The microprocessor fills a line specified by a fill PMLA into a cache entry and snoops the load queue with the fill PMLA, either before the fill or in an atomic manner with the fill with respect to the ability of the filled entry to be hit upon by any load instruction, to determine whether the fill PMLA matches load PMLAs in load queue entries associated with load instructions that have completed execution and there are other load instructions in the load queue that have not completed execution. The microprocessor, if the condition is true, flushes at least the other load instructions in the load queue that have not completed execution.
    Type: Application
    Filed: May 18, 2022
    Publication date: November 10, 2022
    Inventors: John G. Favor, Srivatsan Srinivasan
  • Publication number: 20220358043
    Abstract: A microprocessor includes a physically-indexed-and-tagged second-level set-associative cache. Each cache entry is uniquely identified by a set index and a way number. Each entry of a write-combine buffer (WCB) holds write data to be written to a write physical memory address, a portion of which is a write physical line address. Each WCB entry also holds a write physical address proxy (PAP) for the write physical line address. The write PAP specifies the set index and the way number of the cache entry into which a cache line specified by the write physical line address is allocated. In response to receiving a store instruction that is being committed and that specifies a store PAP, the WCB compares the store PAP with the write PAP of each WCB entry and requires a match as a necessary condition for merging store data of the store instruction into a WCB entry.
    Type: Application
    Filed: July 8, 2021
    Publication date: November 10, 2022
    Inventors: John G. Favor, Srivatsan Srinivasan
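
The merge condition described in the entry above, requiring a write PAP match before combining store data into a WCB entry, might be modeled as below. All types and the find_merge_candidate helper are hypothetical names for illustration.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical physical address proxy (PAP): the L2 set index and way of
// the entry holding the line, standing in for the full physical line address.
struct PhysicalAddressProxy {
    uint16_t set_index;
    uint8_t  way;
    bool operator==(const PhysicalAddressProxy&) const = default;
};

struct WcbEntry {
    PhysicalAddressProxy pap;          // which L2 entry the write targets
    uint8_t              data[64];     // write-combined data for one line
    uint64_t             byte_valid;   // one valid bit per byte of the line
};

// A matching write PAP is a necessary condition for merging a committing
// store's data into an existing WCB entry; with no match, a new entry is
// allocated instead.
WcbEntry* find_merge_candidate(std::vector<WcbEntry>& wcb,
                               const PhysicalAddressProxy& store_pap) {
    for (auto& e : wcb)
        if (e.pap == store_pap)
            return &e;
    return nullptr;
}
```

Comparing compact PAPs rather than full physical line addresses keeps the per-entry storage and comparison logic small, which is the motivation the abstract gives for proxying the address with an L2 set index and way.
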
  • Publication number: 20220358210
    Abstract: A method and system for mitigating side channel attacks (SCA) that exploit speculative store-to-load forwarding is described. The method comprises conditioning store-to-load forwarding on the memory dependence predictor (MDP) being trained for that load instruction. Training involves identifying situations in which store-to-load forwarding could have been performed but was not, and conversely, identifying situations in which store-to-load forwarding was performed but resulted in an error.
    Type: Application
    Filed: January 13, 2022
    Publication date: November 10, 2022
    Inventors: John G. Favor, Srivatsan Srinivasan
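
A small sketch of the training idea in the entry above, assuming a per-load-PC saturating counter; the table organization, thresholds, and names are illustrative assumptions, not the patent's design.

```cpp
#include <algorithm>
#include <cstdint>
#include <unordered_map>

// Hypothetical MDP entry keyed by load PC: a small saturating counter that
// must reach a threshold before the load is considered "trained".
struct MdpEntry {
    int confidence = 0;
};

class MemoryDependencePredictor {
public:
    // Store-to-load forwarding is permitted only for trained loads.
    bool allow_forwarding(uint64_t load_pc) const {
        auto it = table_.find(load_pc);
        return it != table_.end() && it->second.confidence >= kThreshold;
    }

    // Training: a missed forwarding opportunity raises confidence; a
    // forwarding that later proved erroneous lowers it.
    void train(uint64_t load_pc, bool forwarding_would_have_been_correct) {
        MdpEntry& e = table_[load_pc];
        if (forwarding_would_have_been_correct)
            e.confidence = std::min(e.confidence + 1, kMax);
        else
            e.confidence = std::max(e.confidence - 2, 0);
    }

private:
    static constexpr int kThreshold = 2;
    static constexpr int kMax = 3;
    std::unordered_map<uint64_t, MdpEntry> table_;
};
```
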
  • Publication number: 20220358045
    Abstract: A microprocessor includes a physically-indexed physically-tagged second-level set-associative cache. A set index and a way uniquely identify each entry. A load/store unit, during store/load instruction execution, detects that first and second portions of store/load data are to be written/read to/from different first and second lines of memory specified by first and second store physical memory line addresses, and writes to a store/load queue entry first and second store physical address proxies (PAPs) for the first and second store physical memory line addresses (and, in the store execution case, all of the store data). The first and second store PAPs comprise respective set indexes and ways that uniquely identify respective entries of the second-level cache that hold respective copies of the respective first and second lines of memory. The entries of the store queue are absent storage for holding the first and second store physical memory line addresses.
    Type: Application
    Filed: May 18, 2022
    Publication date: November 10, 2022
    Inventors: John G. Favor, Srivatsan Srinivasan
  • Publication number: 20220358209
    Abstract: A method and system for mitigating side channel attacks (SCA) that exploit speculative store-to-load forwarding is described. The method comprises ensuring that the physical load and store addresses match and/or that permissions are present before speculatively performing store-to-load forwarding. Various improvements maintain a short load-store pipeline, including usage of a virtual level-one data cache (DL1), usage of an inclusive physical level-two data cache (DL2), storage and lookup of physical data address equivalents in the DL1, and use of a memory dependence predictor (MDP) to speed up or replace store queue camming of load data addresses against store data addresses.
    Type: Application
    Filed: September 10, 2021
    Publication date: November 10, 2022
    Inventors: John G. Favor, Srivatsan Srinivasan