Patents by Inventor Lucas GARCIA
Lucas GARCIA has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240330066
Abstract: A method and a system for automated performance of capacity allocation, brokerage, placement, and provisioning of compute, network, and storage resources are provided. The method includes: receiving a first data set that relates to resource requirements of a user; retrieving, from a memory, a second data set that relates to resource availability; analyzing the first data set and the second data set in order to determine a proposed allocation of resources and a proposed timing that corresponds to the proposed allocation; and provisioning the resources to the user based on the proposed allocation and the proposed timing. A machine learning model that is trained by using historical resource allocation data may be applied to the first data set and the second data set in order to perform the analysis.
Type: Application
Filed: May 17, 2023
Publication date: October 3, 2024
Applicant: JPMorgan Chase Bank, N.A.
Inventors: Jessie RINCON-PAZ, Francine SHEPHARD, Navin NAGARAJAIAH, Tijelino J BRAVO, Louis FLORES, Andres Lucas GARCIA FIORINI, Anmol P MEHTA, Nisha KAW, Rajesh GUNTHA, Shiv GURUSWAMY, Joseph E LEIDEMER
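The abstract describes a match-and-provision flow: compare a user's resource requirements against current availability, propose an allocation and a start time, then provision. The sketch below is a minimal illustration of that flow; the class and function names are hypothetical, and a simple earliest-fit search stands in for the trained machine learning model mentioned in the abstract.

```python
# Hypothetical sketch of the capacity-brokerage flow in 20240330066.
# Names (ResourceRequest, PoolSnapshot, propose_allocation) are illustrative,
# not taken from the filing; greedy earliest-fit replaces the ML model.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ResourceRequest:          # first data set: what the user needs
    compute: int
    network: int
    storage: int

@dataclass
class PoolSnapshot:             # second data set: what is currently free
    name: str
    compute: int
    network: int
    storage: int
    free_from: datetime

def propose_allocation(req: ResourceRequest, pools: list[PoolSnapshot]):
    """Pick the earliest-available pool that satisfies the request."""
    for pool in sorted(pools, key=lambda p: p.free_from):
        if (pool.compute >= req.compute and
                pool.network >= req.network and
                pool.storage >= req.storage):
            return pool.name, pool.free_from        # proposed allocation + timing
    return None, None                               # no capacity available

# Usage: provision against the earliest pool that fits.
pools = [PoolSnapshot("east-1", 64, 10, 500, datetime.now() + timedelta(hours=2)),
         PoolSnapshot("west-2", 128, 40, 2000, datetime.now())]
print(propose_allocation(ResourceRequest(32, 8, 100), pools))
```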
-
Patent number: 11720494
Abstract: Apparatuses and methods relating to controlling cache evictions are disclosed. Processing circuitry which executes instructions out-of-order is provided with a private cache into which blocks of data are copied from a shared storage location to which the processing circuitry shares access. The processing circuitry also has a read-after-read buffer, into which an entry is allocated when out-of-order execution of a load instruction occurs, comprising an address accessed by the load instruction. The address remains as a valid entry in the read-after-read buffer until the load instruction is committed. Eviction of an eviction candidate block of data from the private cache to the shared storage location is controlled in dependence on whether the eviction candidate block of data has a corresponding valid entry in the read-after-read buffer.
Type: Grant
Filed: March 11, 2022
Date of Patent: August 8, 2023
Assignee: Arm Limited
Inventors: Yohan Fernand Fargeix, Lucas Garcia, Luca Nassi, Albin Pierrick Tonnerre
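In outline, out-of-order loads leave their addresses in a read-after-read (RAR) buffer until they commit, and the cache consults that buffer before evicting a block. The sketch below models one plausible policy, deferring eviction while a matching entry is still valid; the abstract only says eviction is "controlled in dependence on" the buffer, so the direction of the decision, the class names, and the 64-byte block size are assumptions.

```python
# Minimal sketch of an eviction check in the spirit of 11720494 (assumed policy).
BLOCK_MASK = ~0x3F  # 64-byte cache blocks (assumed block size)

class RARBuffer:
    def __init__(self):
        self.entries = {}                    # load address -> valid flag

    def allocate(self, addr):
        """Out-of-order execution of a load records its address."""
        self.entries[addr] = True

    def commit(self, addr):
        """The load committing removes its entry."""
        self.entries.pop(addr, None)

    def has_valid_entry_for_block(self, block_addr):
        return any((a & BLOCK_MASK) == block_addr for a in self.entries)

def may_evict(block_addr, rar: RARBuffer) -> bool:
    """Defer eviction of a private-cache block still covered by a valid RAR entry."""
    return not rar.has_valid_entry_for_block(block_addr)

# Usage: the block is protected until the speculative load commits.
rar = RARBuffer()
rar.allocate(0x1040)                          # load executed out of order
print(may_evict(0x1040 & BLOCK_MASK, rar))    # False: eviction deferred
rar.commit(0x1040)
print(may_evict(0x1040 & BLOCK_MASK, rar))    # True: safe to evict
```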
-
Patent number: 11163691
Abstract: Examples of the present disclosure relate to an apparatus comprising processing circuitry to perform data processing operations, storage circuitry to store data for access by the processing circuitry, address translation circuitry to maintain address translation data for translating virtual memory addresses into corresponding physical memory addresses, and prefetch circuitry. The prefetch circuitry is arranged to prefetch first data into the storage circuitry in anticipation of the first data being required for performing the data processing operations. The prefetching comprises, based on a prediction scheme, predicting a first virtual memory address associated with the first data, accessing the address translation circuitry to determine a first physical memory address corresponding to the first virtual memory address, and retrieving the first data based on the first physical memory address corresponding to the first virtual memory address.
Type: Grant
Filed: June 25, 2019
Date of Patent: November 2, 2021
Assignee: ARM LIMITED
Inventors: Stefano Ghiggini, Natalya Bondarenko, Damien Guillaume Pierre Payet, Lucas Garcia
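The key sequence is: predict a virtual address, translate it, then fetch by physical address. The sketch below illustrates that sequence under assumed details, namely a simple stride predictor (the abstract leaves the prediction scheme open), a 4 KB page size, and plain dictionaries standing in for the translation circuitry, cache, and memory.

```python
# Illustrative model of the prefetch flow in 11163691 (details assumed).
PAGE_SHIFT = 12  # assumed 4 KB pages

def prefetch_next(last_va, stride, tlb, cache, memory):
    predicted_va = last_va + stride                       # prediction scheme (assumed: stride)
    vpn, offset = predicted_va >> PAGE_SHIFT, predicted_va & 0xFFF
    ppn = tlb.get(vpn)                                    # address translation circuitry
    if ppn is None:
        return False                                      # no translation: skip the prefetch
    pa = (ppn << PAGE_SHIFT) | offset
    cache[pa] = memory.get(pa)                            # retrieve first data by physical address
    return True

# Usage with a hypothetical mapping VPN 0x12 -> PPN 0x80.
tlb = {0x12: 0x80}
memory = {(0x80 << PAGE_SHIFT) | 0x340: "payload"}
cache = {}
print(prefetch_next(0x12300, 0x40, tlb, cache, memory), cache)
```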
-
Patent number: 11138119
Abstract: There is provided an apparatus that includes storage circuitry. The storage circuitry is made up of a plurality of sets, each of the sets having at least one storage location. Receiving circuitry receives an access request that includes an input address. Lookup circuitry obtains a plurality of candidate sets that correspond to an index part of the input address. The lookup circuitry determines a selected storage location from the candidate sets using an access policy. The access policy causes the lookup circuitry to iterate through the candidate sets to attempt to locate an appropriate storage location. The appropriate storage location is accessed in response to the appropriate storage location being found.
Type: Grant
Filed: January 15, 2019
Date of Patent: October 5, 2021
Assignee: Arm Limited
Inventors: Damien Guillaume Pierre Payet, Natalya Bondarenko, Florent Begon, Lucas Garcia
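The mechanism derives several candidate sets from the index part of the address and walks them under an access policy until a suitable location turns up. The sketch below shows one such policy, first a matching tag, then the first empty way; the two-way index derivation, the bit fields, and the "first hit, else first invalid" rule are illustrative assumptions rather than details from the patent.

```python
# Sketch of a multi-candidate lookup in the spirit of 11138119 (policy assumed).
NUM_SETS = 256

def candidate_sets(address):
    index = (address >> 6) & 0xFF                          # index part of the input address
    return [index, (index * 0x9E37 + 1) % NUM_SETS]        # two candidate sets (assumed scheme)

def lookup(storage, address):
    """Iterate through candidate sets until an appropriate location is found."""
    tag = address >> 14
    for set_idx in candidate_sets(address):                # pass 1: look for a hit
        for way, loc in enumerate(storage[set_idx]):
            if loc is not None and loc["tag"] == tag:
                return set_idx, way
    for set_idx in candidate_sets(address):                # pass 2: first empty location
        for way, loc in enumerate(storage[set_idx]):
            if loc is None:
                return set_idx, way
    return None                                            # nothing appropriate found

# Usage: a 256-set structure with four locations per set, initially empty.
storage = [[None] * 4 for _ in range(NUM_SETS)]
print(lookup(storage, 0x12345))                            # -> first empty way of a candidate set
```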
-
Patent number: 10956206
Abstract: A technique is described for managing a cache structure in a system employing transactional memory. The apparatus comprises processing circuitry to perform data processing operations in response to instructions, the processing circuitry comprising transactional memory support circuitry to support execution of a transaction, and a cache structure comprising a plurality of cache entries for storing data for access by the processing circuitry. Each cache entry has associated therewith an allocation tag, and allocation tag control circuitry is provided to control use of a plurality of allocation tags and to maintain an indication of a current state of each of those allocation tags.
Type: Grant
Filed: April 2, 2019
Date of Patent: March 23, 2021
Assignee: Arm Limited
Inventors: Damien Guillaume Pierre Payet, Lucas Garcia, Natalya Bondarenko, Stefano Ghiggini
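The abstract only states that each cache entry carries an allocation tag and that control circuitry tracks the state of each tag. The sketch below makes that concrete under assumed semantics: a tag is claimed per transaction and entries allocated under an aborted transaction's tag are invalidated in bulk; the state names and the abort behaviour are illustrative, not taken from the filing.

```python
# Rough sketch of per-entry allocation tags, assumptions as stated above.
from enum import Enum

class TagState(Enum):
    FREE = 0
    ACTIVE = 1      # entries with this tag were allocated inside a live transaction
    STABLE = 2      # entries with this tag are not tied to a live transaction

class AllocationTagControl:
    def __init__(self, num_tags=4):
        self.state = {t: TagState.FREE for t in range(num_tags)}

    def start_transaction(self):
        tag = next(t for t, s in self.state.items() if s is TagState.FREE)
        self.state[tag] = TagState.ACTIVE
        return tag

    def commit(self, tag):
        self.state[tag] = TagState.STABLE

    def abort(self, tag, cache_entries):
        # Invalidate every entry allocated under the aborted transaction's tag.
        for entry in cache_entries:
            if entry.get("alloc_tag") == tag:
                entry["valid"] = False
        self.state[tag] = TagState.FREE
```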
-
Patent number: 10783031
Abstract: An apparatus comprises processing circuitry, transactional memory support circuitry and a cache. The processing circuitry processes threads of data processing, and the transactional memory support circuitry supports execution of a transaction within a thread, including tracking a read set of addresses, comprising addresses accessed by read instructions within the transaction. A transaction comprises instructions for which the processing circuitry is configured to prevent commitment of the results of speculatively executed instructions until the transaction has completed. The cache has a plurality of entries, each associated with an address and specifying a replaceable-information value for that address that comprises information for which, outside of the transaction, processing would be functionally correct even if the information was incorrect.
Type: Grant
Filed: August 20, 2018
Date of Patent: September 22, 2020
Assignee: Arm Limited
Inventors: Damien Guillaume Pierre Payet, Lucas Garcia, Natalya Bondarenko, Stefano Ghiggini
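Per the abstract, each cache entry carries a "replaceable-information" value, i.e. a hint that processing can tolerate being wrong about outside a transaction. The hedged sketch below treats that field as a read-set marker while a transaction is live, which is one plausible reading rather than a confirmed detail; an external write to a marked address aborts the transaction so speculative results never commit.

```python
# Hedged sketch of read-set tracking around 10783031 (interpretation assumed).
class TxCache:
    def __init__(self):
        self.entries = {}          # address -> {"data": ..., "repl_info": 0}
        self.in_tx = False
        self.aborted = False

    def tx_begin(self):
        self.in_tx, self.aborted = True, False

    def tx_read(self, addr):
        entry = self.entries.setdefault(addr, {"data": 0, "repl_info": 0})
        if self.in_tx:
            entry["repl_info"] = 1            # mark address as part of the read set
        return entry["data"]

    def external_write(self, addr, value):
        entry = self.entries.get(addr)
        if entry:
            if self.in_tx and entry["repl_info"]:
                self.aborted = True           # conflict with the transaction's read set
            entry["data"] = value

    def tx_commit(self):
        ok = not self.aborted
        for e in self.entries.values():
            e["repl_info"] = 0                # replaceable info can be safely cleared
        self.in_tx = False
        return ok

# Usage: a conflicting external write forces the transaction to fail.
c = TxCache()
c.tx_begin()
c.tx_read(0x100)
c.external_write(0x100, 42)
print(c.tx_commit())                          # False: transaction must be retried
```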
-
Patent number: 10776274
Abstract: Data processing circuitry comprises a cache memory to cache a subset of data elements from a main memory; a processing element to execute program code to access data elements having respective memory addresses, the processing element being configured to access the data elements in the cache memory and, in the case of a cache miss, to fetch the data elements from the main memory; prefetch circuitry, responsive to an access to a current data element, to initiate prefetching into the cache memory of a data element at a memory address defined by a current offset value relative to the address of the current data element; and offset value selection circuitry comprising: an address table to store memory addresses for which a data element accessed by the processing element resulted in a cache miss or an access to a previously prefetched data element; and detector circuitry to detect, for each of a group of candidate offset values, one or more respective metrics representing a proportion of a set of data element accesses …
Type: Grant
Filed: March 2, 2018
Date of Patent: September 15, 2020
Assignee: Arm Limited
Inventors: Lucas Garcia, Geoffray Matthieu Lacourba, Natalya Bondarenko, Nathanael Premillieu
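The selection logic described here records addresses that missed or hit previously prefetched data, then scores each candidate offset by how often the recorded address at (access address − offset) appears. A minimal sketch of that scoring, assuming line-granularity addresses and an arbitrary candidate range:

```python
# Minimal offset-scoring sketch for 10776274 (candidates and window assumed).
def score_candidate_offsets(address_table, recent_accesses, candidates=range(1, 9)):
    """For each candidate offset, count accesses whose address minus the offset
    is in the table of misses / prefetched-data hits; pick the best offset."""
    scores = {d: 0 for d in candidates}
    table = set(address_table)
    for line_addr in recent_accesses:
        for d in candidates:
            if line_addr - d in table:
                scores[d] += 1
    best = max(scores, key=scores.get)
    return best, scores

# Example: accesses marching forward by 2 lines make offset 2 win.
table = [100, 102, 104, 106]
accesses = [102, 104, 106, 108]
print(score_candidate_offsets(table, accesses))   # -> (2, {...})
```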
-
Patent number: 10769069
Abstract: Data processing circuitry comprises a cache memory to cache a subset of data elements from a main memory; a processing element to execute program code to access data elements having respective memory addresses, the processing element being configured to access the data elements in the cache memory and, in the case of a cache miss, to fetch the data elements from the main memory; prefetch circuitry, responsive to an access to a current data element, to initiate prefetching into the cache memory of a data element at a memory address defined by a current offset value relative to the address of the current data element; offset value selection circuitry comprising: an address table to store memory addresses for which a data element accessed by the processing element resulted in a cache miss or an access to a previously prefetched data element; detector circuitry to detect, for each of a group of candidate offset values, one or more respective metrics representing a proportion of a set of data element accesses which …
Type: Grant
Filed: March 2, 2018
Date of Patent: September 8, 2020
Assignee: Arm Limited
Inventors: Natalya Bondarenko, Lucas Garcia, Geoffray Matthieu Lacourba
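This closely related filing covers the same offset-selection prefetcher from the issue side: each demand access triggers a prefetch at the current offset, while misses and hits on previously prefetched lines feed the address table used for the next selection round. The companion sketch below is illustrative only; the table size, class name, and issue_prefetch callback are assumptions.

```python
# Companion sketch for 10769069: issuing prefetches at the current offset.
class OffsetPrefetcher:
    def __init__(self, current_offset=1, table_size=64):
        self.current_offset = current_offset
        self.address_table = []               # misses / hits on prefetched data
        self.table_size = table_size

    def on_access(self, line_addr, was_miss, hit_prefetched, issue_prefetch):
        if was_miss or hit_prefetched:
            self.address_table.append(line_addr)
            if len(self.address_table) > self.table_size:
                self.address_table.pop(0)     # keep a bounded history
        issue_prefetch(line_addr + self.current_offset)

# Usage: a miss at line 100 records the address and prefetches line 102.
pf = OffsetPrefetcher(current_offset=2)
pf.on_access(100, was_miss=True, hit_prefetched=False,
             issue_prefetch=lambda line: print("prefetch line", line))
```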
-
Publication number: 20200065257
Abstract: Examples of the present disclosure relate to an apparatus comprising processing circuitry to perform data processing operations, storage circuitry to store data for access by the processing circuitry, address translation circuitry to maintain address translation data for translating virtual memory addresses into corresponding physical memory addresses, and prefetch circuitry. The prefetch circuitry is arranged to prefetch first data into the storage circuitry in anticipation of the first data being required for performing the data processing operations. The prefetching comprises, based on a prediction scheme, predicting a first virtual memory address associated with the first data, accessing the address translation circuitry to determine a first physical memory address corresponding to the first virtual memory address, and retrieving the first data based on the first physical memory address corresponding to the first virtual memory address.
Type: Application
Filed: June 25, 2019
Publication date: February 27, 2020
Inventors: Stefano GHIGGINI, Natalya BONDARENKO, Damien Guillaume Pierre PAYET, Lucas GARCIA
-
Publication number: 20200057692
Abstract: An apparatus comprises processing circuitry, transactional memory support circuitry and a cache. The processing circuitry processes threads of data processing, and the transactional memory support circuitry supports execution of a transaction within a thread, including tracking a read set of addresses, comprising addresses accessed by read instructions within the transaction. A transaction comprises instructions for which the processing circuitry is configured to prevent commitment of the results of speculatively executed instructions until the transaction has completed. The cache has a plurality of entries, each associated with an address and specifying a replaceable-information value for that address that comprises information for which, outside of the transaction, processing would be functionally correct even if the information was incorrect.
Type: Application
Filed: August 20, 2018
Publication date: February 20, 2020
Inventors: Damien Guillaume Pierre PAYET, Lucas GARCIA, Natalya BONDARENKO, Stefano GHIGGINI
-
Publication number: 20190347124
Abstract: A technique is described for managing a cache structure in a system employing transactional memory. The apparatus comprises processing circuitry to perform data processing operations in response to instructions, the processing circuitry comprising transactional memory support circuitry to support execution of a transaction, and a cache structure comprising a plurality of cache entries for storing data for access by the processing circuitry. Each cache entry has associated therewith an allocation tag, and allocation tag control circuitry is provided to control use of a plurality of allocation tags and to maintain an indication of a current state of each of those allocation tags.
Type: Application
Filed: April 2, 2019
Publication date: November 14, 2019
Inventors: Damien Guillaume Pierre PAYET, Lucas GARCIA, Natalya BONDARENKO, Stefano GHIGGINI
-
Patent number: 10445241
Abstract: Data processing circuitry comprises a processing element to execute successive iterations of program code to access a set of data elements in memory, each iteration accessing one or more respective data elements of the set; a data element structure memory to store a memory address relationship between the data elements of the set; and prefetch circuitry, responsive to an access by a current program code iteration to a current data element of the set, to detect, using the memory address relationship stored in the data element structure memory, a memory address defining a subsequent data element to be accessed by a next program iteration and to initiate prefetching of at least a portion of the subsequent data element from memory.
Type: Grant
Filed: March 6, 2018
Date of Patent: October 15, 2019
Assignee: ARM Limited
Inventors: Lucas Garcia, Laurent Claude Desnogues, Adrien Pesle, Vincenzo Consales
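Here the prefetcher remembers how the elements of a set relate in memory and uses that relationship to prefetch the element the next loop iteration will touch. In the loose sketch below the relationship is modelled as a single inter-element offset learned from consecutive accesses, which is just one way the "data element structure memory" could be represented; the names and callback are illustrative.

```python
# Loose sketch of next-iteration prefetching in the spirit of 10445241.
class StructurePrefetcher:
    def __init__(self):
        self.last_addr = None
        self.element_offset = None            # data element structure memory (assumed form)

    def on_iteration_access(self, elem_addr, issue_prefetch):
        if self.last_addr is not None:
            self.element_offset = elem_addr - self.last_addr   # learn the relationship
        if self.element_offset is not None:
            issue_prefetch(elem_addr + self.element_offset)    # next iteration's element
        self.last_addr = elem_addr

# Usage: after two accesses the prefetcher anticipates each following element.
sp = StructurePrefetcher()
for addr in (0x1000, 0x1040, 0x1080):
    sp.on_iteration_access(addr, lambda a: print("prefetch", hex(a)))
```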
-
Publication number: 20190278709
Abstract: Data processing circuitry comprises a processing element to execute successive iterations of program code to access a set of data elements in memory, each iteration accessing one or more respective data elements of the set; a data element structure memory to store a memory address relationship between the data elements of the set; and prefetch circuitry, responsive to an access by a current program code iteration to a current data element of the set, to detect, using the memory address relationship stored in the data element structure memory, a memory address defining a subsequent data element to be accessed by a next program iteration and to initiate prefetching of at least a portion of the subsequent data element from memory.
Type: Application
Filed: March 6, 2018
Publication date: September 12, 2019
Inventors: Lucas GARCIA, Laurent Claude DESNOGUES, Adrien PESLE, Vincenzo CONSALES
-
Publication number: 20190272233
Abstract: Data processing circuitry comprises a cache memory to cache a subset of data elements from a main memory; a processing element to execute program code to access data elements having respective memory addresses, the processing element being configured to access the data elements in the cache memory and, in the case of a cache miss, to fetch the data elements from the main memory; prefetch circuitry, responsive to an access to a current data element, to initiate prefetching into the cache memory of a data element at a memory address defined by a current offset value relative to the address of the current data element; and offset value selection circuitry comprising: an address table to store memory addresses for which a data element accessed by the processing element resulted in a cache miss or an access to a previously prefetched data element; and detector circuitry to detect, for each of a group of candidate offset values, one or more respective metrics representing a proportion of a set of data element accesses …
Type: Application
Filed: March 2, 2018
Publication date: September 5, 2019
Inventors: Lucas GARCIA, Geoffray Matthieu LACOURBA, Natalya BONDARENKO, Nathanael PREMILLIEU
-
Publication number: 20190272234
Abstract: Data processing circuitry comprises a cache memory to cache a subset of data elements from a main memory; a processing element to execute program code to access data elements having respective memory addresses, the processing element being configured to access the data elements in the cache memory and, in the case of a cache miss, to fetch the data elements from the main memory; prefetch circuitry, responsive to an access to a current data element, to initiate prefetching into the cache memory of a data element at a memory address defined by a current offset value relative to the address of the current data element; offset value selection circuitry comprising: an address table to store memory addresses for which a data element accessed by the processing element resulted in a cache miss or an access to a previously prefetched data element; detector circuitry to detect, for each of a group of candidate offset values, one or more respective metrics representing a proportion of a set of data element accesses which …
Type: Application
Filed: March 2, 2018
Publication date: September 5, 2019
Inventors: Natalya BONDARENKO, Lucas GARCIA, Geoffray Matthieu LACOURBA
-
Publication number: 20190220414
Abstract: There is provided an apparatus that includes storage circuitry. The storage circuitry is made up of a plurality of sets, each of the sets having at least one storage location. Receiving circuitry receives an access request that includes an input address. Lookup circuitry obtains a plurality of candidate sets that correspond to an index part of the input address. The lookup circuitry determines a selected storage location from the candidate sets using an access policy. The access policy causes the lookup circuitry to iterate through the candidate sets to attempt to locate an appropriate storage location. The appropriate storage location is accessed in response to the appropriate storage location being found.
Type: Application
Filed: January 15, 2019
Publication date: July 18, 2019
Inventors: Damien Guillaume Pierre PAYET, Natalya BONDARENKO, Florent BEGON, Lucas GARCIA