Patents by Inventor Takahito Hirano
Takahito Hirano has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10983914
Abstract: A tag match determination unit determines, in response to an acquisition request for predetermined data, whether the predetermined data is present in a primary cache. When the predetermined data is not present in the primary cache, a move-in buffer outputs the acquisition request for the predetermined data to a secondary cache management unit or the storage device, and holds determination purpose information based on state information on a predetermined area that stores the predetermined data. When a response acquired from the secondary cache management unit or the storage device is of a predetermined type, a storage processing unit determines, based on the determination purpose information, whether or not to acquire the state information stored in the primary cache; invalidates the predetermined area when it determines not to acquire the state information; and stores, in the predetermined area, the predetermined data included in the response.
Type: Grant
Filed: May 22, 2019
Date of Patent: April 20, 2021
Assignee: Fujitsu Limited
Inventor: Takahito Hirano
-
Patent number: 10552331
Abstract: An arithmetic processing device includes a memory access request issuance unit and a cache that includes a cache memory for tags and data and a move-in buffer control unit that issues a move-in request for the data of a memory access request when a cache miss occurs. On a cache miss, the move-in buffer control unit acquires a move-in buffer and issues the move-in request when the memory access request has the same index as a move-in request already registered in the move-in buffer and the number of registered move-in requests with that index is less than the number of ways; it does not acquire the move-in buffer and does not issue the move-in request when the memory access request has the same index and the number of registered move-in requests with that index has reached the number of ways.
Type: Grant
Filed: August 23, 2017
Date of Patent: February 4, 2020
Assignee: Fujitsu Limited
Inventors: Yuki Kamikubo, Noriko Takagi, Takahito Hirano
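The acquisition rule described in this abstract can be sketched in a few lines of Python (an illustrative model only; the class and attribute names such as `MoveInBufferControl` and `num_ways` are assumptions, not taken from the patent):

```python
class MoveInBufferControl:
    """Minimal sketch of the move-in buffer acquisition rule: a new move-in
    for a given cache index is issued only while fewer than num_ways
    move-ins with that index are already in flight."""

    def __init__(self, num_ways):
        self.num_ways = num_ways  # associativity (number of ways) of the cache
        self.pending = []         # cache indexes of registered move-in requests

    def try_acquire(self, index):
        # Count move-in requests already registered for the same cache index.
        same_index = sum(1 for i in self.pending if i == index)
        if same_index < self.num_ways:
            self.pending.append(index)  # acquire a buffer entry, issue move-in
            return True
        return False  # every way for this index already has a move-in in flight
```

Capping in-flight move-ins at the number of ways prevents the buffer from holding more fills for one index than the set could ever accommodate.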
-
Publication number: 20190391924
Abstract: A tag match determination unit determines, in response to an acquisition request for predetermined data, whether the predetermined data is present in a primary cache. When the predetermined data is not present in the primary cache, a move-in buffer outputs the acquisition request for the predetermined data to a secondary cache management unit or the storage device, and holds determination purpose information based on state information on a predetermined area that stores the predetermined data. When a response acquired from the secondary cache management unit or the storage device is of a predetermined type, a storage processing unit determines, based on the determination purpose information, whether or not to acquire the state information stored in the primary cache; invalidates the predetermined area when it determines not to acquire the state information; and stores, in the predetermined area, the predetermined data included in the response.
Type: Application
Filed: May 22, 2019
Publication date: December 26, 2019
Applicant: Fujitsu Limited
Inventor: Takahito Hirano
-
Publication number: 20180095886
Abstract: An arithmetic processing device includes a memory access request issuance unit and a cache that includes a cache memory for tags and data and a move-in buffer control unit that issues a move-in request for the data of a memory access request when a cache miss occurs. On a cache miss, the move-in buffer control unit acquires a move-in buffer and issues the move-in request when the memory access request has the same index as a move-in request already registered in the move-in buffer and the number of registered move-in requests with that index is less than the number of ways; it does not acquire the move-in buffer and does not issue the move-in request when the memory access request has the same index and the number of registered move-in requests with that index has reached the number of ways.
Type: Application
Filed: August 23, 2017
Publication date: April 5, 2018
Applicant: Fujitsu Limited
Inventors: Yuki Kamikubo, Noriko Takagi, Takahito Hirano
-
Publication number: 20170300322
Abstract: An arithmetic processing device includes: an instruction control circuit; a primary cache circuit that includes a primary cache memory and a first buffer; and a secondary cache memory. The primary cache circuit is configured so that, when the instruction control circuit issues a first instruction for registering the data of a cache line in the secondary cache memory without accessing the main memory, and the data corresponding to a first address designated as the access target of the first instruction is not stored in the primary cache memory, the primary cache circuit stores the first address in the first buffer and issues the first instruction to the secondary cache memory.
Type: Application
Filed: April 4, 2017
Publication date: October 19, 2017
Applicant: Fujitsu Limited
Inventors: Takahito Hirano, Noriko Takagi
-
Patent number: 9410815
Abstract: An entry point card displayed within a mapping application viewport may display context and other data, based on a calendar appointment and other information, that the user would predictably want to search for upon opening the mapping application. Using appointment information from a calendar application, an entry point card might display the time the user must leave his current location in order to make the appointment on time. Or, using a history of the user's routine errands or trips, the entry point card may display predicted information; for example, the card may display the amount of time to get to work, or other information.
Type: Grant
Filed: March 20, 2015
Date of Patent: August 9, 2016
Assignee: Google Inc.
Inventors: Takahito Hirano, Ryo Kawaguchi, Masanori Goto, Koichi Suematsu, Pawel Szczepanski, Takahiro Kosakai, Naoto Kaneko, Taj J. Campbell, Peter Foo, Kaori Kozai
-
Publication number: 20150052307
Abstract: A primary cache controller of a core unit arbitrates non-cache requests from an instruction controller: a non-cache write request in thread 0 and a non-cache read request in thread 1. When both requests under arbitration become issuable, that is, when a response has been obtained for a preceding thread-0 non-cache write request that was issued before the thread-0 non-cache write request under arbitration, the thread-1 non-cache read request is issued with priority, so that the low-priority non-cache read request is not kept waiting.
Type: Application
Filed: July 22, 2014
Publication date: February 19, 2015
Inventor: Takahito Hirano
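The priority rule sketched in this abstract can be modeled compactly (a simplified Python illustration; the dict-based request representation and the function name `arbitrate` are assumptions, not from the publication):

```python
def arbitrate(requests):
    """Pick the next non-cache request to issue: among requests that are
    currently issuable, a read is given priority over a write, so the
    low-priority read is not left waiting behind a stream of writes."""
    issuable = [r for r in requests if r["issuable"]]
    reads = [r for r in issuable if r["op"] == "read"]
    if reads:
        return reads[0]
    return issuable[0] if issuable else None
```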
-
Patent number: 8856478
Abstract: A processor holds, in each of a plurality of cache lines, part of the data held in a main memory unit. The processor also holds, in each of the cache lines, a tag address used to search for the data held in that cache line and a flag indicating the validity of that data. The processor executes a cache line fill instruction on the cache line corresponding to a specified address. Upon execution of the cache line fill instruction, the processor registers predetermined data in the cache line of the cache memory unit whose tag address corresponds to the specified address and validates the flag in that cache line.
Type: Grant
Filed: December 22, 2010
Date of Patent: October 7, 2014
Assignee: Fujitsu Limited
Inventors: Takahito Hirano, Iwao Yamazaki
-
Patent number: 8806102
Abstract: A cache system includes a primary cache memory configured to exchange data with a computation unit. The primary cache memory includes multi-port memory units, each including a storing unit that stores unit data of a first data size, a writing unit that simultaneously writes sequentially input unit data to consecutive locations of the storing unit, and an outputting unit that reads out and outputs unit data written in the storing unit. When data of a second data size, an arbitrary multiple of the first data size and segmented into unit data, is written to the primary cache memory, the data is stored in different multi-port memory units: part of the sequential unit data is written to one subset of the multi-port memory units and the rest to another subset.
Type: Grant
Filed: January 25, 2011
Date of Patent: August 12, 2014
Assignee: Fujitsu Limited
Inventor: Takahito Hirano
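One way the wide-store distribution described here could look is a round-robin striping of sequential unit data across banks (an illustrative sketch only; the patent does not specify this exact interleaving, and the names `stripe_write` and `num_banks` are assumptions):

```python
def stripe_write(data_units, num_banks):
    """Distribute consecutive unit data across multi-port memory banks so
    that a store wider than one unit lands in different memory units and
    the per-bank writes can proceed in parallel."""
    banks = [[] for _ in range(num_banks)]
    for i, unit in enumerate(data_units):
        banks[i % num_banks].append(unit)  # unit i goes to bank i mod num_banks
    return banks
```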
-
Patent number: 8806318
Abstract: A fault analyzing circuit has: a comparing circuit that compares the fault data stored in a storage area in which a fault has occurred with the data of an alternation register; and a position identifying circuit that identifies the error bit position from the comparison result of the comparing circuit.
Type: Grant
Filed: July 31, 2012
Date of Patent: August 12, 2014
Assignee: Fujitsu Limited
Inventor: Takahito Hirano
-
Patent number: 8533565
Abstract: A cache memory controlling unit includes a plurality of STBs for holding 8-byte store data received from an execution unit, a plurality of WBs, a DATA-RAM, an FCDR, and an ECC-RAM. The cache memory controlling unit obtains the data-not-to-be-stored from the DATA-RAM, stores it in the FCDR, and merges it with the data-to-be-stored in the store data output from the execution unit and held in the STBs or the WBs, generating new store data. The cache memory controlling unit then writes the new store data to the DATA-RAM, generates an ECC from the new store data, and writes the ECC to the ECC-RAM.
Type: Grant
Filed: December 18, 2009
Date of Patent: September 10, 2013
Assignee: Fujitsu Limited
Inventors: Takashi Miura, Iwao Yamazaki, Takahito Hirano
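The read-merge-write flow described in this abstract can be sketched as follows (an illustrative Python model; the function name, the byte-offset interface, and especially the XOR check code, which stands in for the real ECC, are all assumptions):

```python
def merge_and_ecc(line_data, store_data, offset):
    """Merge store data into a cache line and compute a check code over the
    whole merged line: bytes outside the store window come from the line
    already in the DATA-RAM (the data-not-to-be-stored), the store data is
    spliced in, and the check code covers the complete result."""
    merged = (line_data[:offset]
              + store_data
              + line_data[offset + len(store_data):])
    check = 0
    for b in merged:
        check ^= b  # placeholder parity byte, not an actual ECC
    return merged, check
```

Computing the code over the full merged line, rather than over the store data alone, is what makes the read-out of the data-not-to-be-stored necessary.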
-
Publication number: 20130067282
Abstract: A fault analyzing circuit has: a comparing circuit that compares the fault data stored in a storage area in which a fault has occurred with the data of an alternation register; and a position identifying circuit that identifies the error bit position from the comparison result of the comparing circuit.
Type: Application
Filed: July 31, 2012
Publication date: March 14, 2013
Applicant: Fujitsu Limited
Inventor: Takahito Hirano
-
Patent number: 8286053
Abstract: A reading apparatus reads, from a storage device, data from which an error correcting code is to be generated. An error determining unit reads the data from the storage device and determines whether a read error has occurred in the data. When the error determining unit determines that a read error has occurred, a reading unit re-reads the same data from the storage device.
Type: Grant
Filed: July 28, 2008
Date of Patent: October 9, 2012
Assignee: Fujitsu Limited
Inventor: Takahito Hirano
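The detect-and-re-read behavior can be sketched as a small retry loop (illustrative only; the callback interface and the retry limit are assumptions, not taken from the patent, which describes hardware units rather than software):

```python
def read_with_retry(read, has_no_error, max_retries=3):
    """Read data from a storage device; if a read error is detected,
    re-read the same data before it is used to generate the error
    correcting code.  Gives up after max_retries attempts."""
    for _ in range(max_retries):
        data = read()
        if has_no_error(data):
            return data
    raise IOError("read error persisted after re-reads")
```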
-
Patent number: 8127205
Abstract: A correct error correction code can be generated even if a RAM error occurs after it is confirmed that the cache line data for storage contains no errors but before the store data is written to the cache memory (RAM). Before the store data is written, the cache line data for storage is saved in a register; the store data is then written to the cache memory, the saved contents of the register are merged with the store data, and an error correction code is generated from the result of the merger.
Type: Grant
Filed: September 28, 2007
Date of Patent: February 28, 2012
Assignee: Fujitsu Limited
Inventors: Takahito Hirano, Takashi Miura, Iwao Yamazaki
-
Publication number: 20110197013
Abstract: A cache system includes a primary cache memory configured to exchange data with a computation unit. The primary cache memory includes multi-port memory units, each including a storing unit that stores unit data of a first data size, a writing unit that simultaneously writes sequentially input unit data to consecutive locations of the storing unit, and an outputting unit that reads out and outputs unit data written in the storing unit. When data of a second data size, an arbitrary multiple of the first data size and segmented into unit data, is written to the primary cache memory, the data is stored in different multi-port memory units: part of the sequential unit data is written to one subset of the multi-port memory units and the rest to another subset.
Type: Application
Filed: January 25, 2011
Publication date: August 11, 2011
Applicant: Fujitsu Limited
Inventor: Takahito Hirano
-
Publication number: 20110161600
Abstract: A processor holds, in each of a plurality of cache lines, part of the data held in a main memory unit. The processor also holds, in each of the cache lines, a tag address used to search for the data held in that cache line and a flag indicating the validity of that data. The processor executes a cache line fill instruction on the cache line corresponding to a specified address. Upon execution of the cache line fill instruction, the processor registers predetermined data in the cache line of the cache memory unit whose tag address corresponds to the specified address and validates the flag in that cache line.
Type: Application
Filed: December 22, 2010
Publication date: June 30, 2011
Applicant: Fujitsu Limited
Inventors: Takahito Hirano, Iwao Yamazaki
-
Publication number: 20100107038
Abstract: A cache memory controlling unit includes a plurality of STBs for holding 8-byte store data received from an execution unit, a plurality of WBs, a DATA-RAM, an FCDR, and an ECC-RAM. The cache memory controlling unit obtains the data-not-to-be-stored from the DATA-RAM, stores it in the FCDR, and merges it with the data-to-be-stored in the store data output from the execution unit and held in the STBs or the WBs, generating new store data. The cache memory controlling unit then writes the new store data to the DATA-RAM, generates an ECC from the new store data, and writes the ECC to the ECC-RAM.
Type: Application
Filed: December 18, 2009
Publication date: April 29, 2010
Applicant: Fujitsu Limited
Inventors: Takashi Miura, Iwao Yamazaki, Takahito Hirano
-
Patent number: 7617379
Abstract: To enable an address translation buffer (TLB, Translation Lookaside Buffer) to be shared between plural threads without generating undesirable multi-hits in an information processor that operates in multi-thread mode, the invention comprises: an address translation buffer that stores address translation pairs together with thread information; a retriever that retrieves, from the address translation buffer, an address translation pair whose virtual address is identical to the virtual address to be translated into a physical address; a determination unit that determines, when the retriever retrieves plural address translation pairs, whether two or more items of thread information corresponding to the plural address translation pairs are identical; and a multi-hit controller that suppresses the output of a multi-hit and directs execution of the address translation if the determination unit determines that the thread information differs.
Type: Grant
Filed: November 15, 2004
Date of Patent: November 10, 2009
Assignee: Fujitsu Limited
Inventors: Takahito Hirano, Iwao Yamazaki, Tsuyoshi Motokurumada
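The multi-hit suppression rule can be sketched as follows (a simplified Python model; the dict-based entry format and the function name `tlb_lookup` are illustrative assumptions):

```python
def tlb_lookup(tlb, vaddr, thread):
    """Shared-TLB lookup: collect entries whose virtual address matches.
    When several entries hit but every one belongs to a different thread,
    the multi-hit is suppressed and the requesting thread's own entry is
    used for translation; a genuine duplicate within one thread is still
    reported as a multi-hit."""
    hits = [e for e in tlb if e["vaddr"] == vaddr]
    if len(hits) > 1:
        threads = {e["thread"] for e in hits}
        if len(threads) == len(hits):  # all thread IDs differ: suppress multi-hit
            hits = [e for e in hits if e["thread"] == thread]
        else:
            raise RuntimeError("multi-hit: duplicate entries within a thread")
    return hits[0]["paddr"] if hits else None
```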
-
Publication number: 20080294961
Abstract: A reading apparatus reads, from a storage device, data from which an error correcting code is to be generated. An error determining unit reads the data from the storage device and determines whether a read error has occurred in the data. When the error determining unit determines that a read error has occurred, a reading unit re-reads the same data from the storage device.
Type: Application
Filed: July 28, 2008
Publication date: November 27, 2008
Applicant: Fujitsu Limited
Inventor: Takahito Hirano
-
Publication number: 20080163029
Abstract: A correct error correction code can be generated even if a RAM error occurs after it is confirmed that the cache line data for storage contains no errors but before the store data is written to the cache memory (RAM). Before the store data is written, the cache line data for storage is saved in a register; the store data is then written to the cache memory, the saved contents of the register are merged with the store data, and an error correction code is generated from the result of the merger.
Type: Application
Filed: September 28, 2007
Publication date: July 3, 2008
Applicant: Fujitsu Limited
Inventors: Takahito Hirano, Takashi Miura, Iwao Yamazaki