Cache With Multi-port Tag Or Data Arrays (epo) Patents (Class 711/E12.048)
-
Patent number: 10437666
Abstract: In accordance with an embodiment of the invention, an IC device is disclosed. In the embodiment, the IC device includes an array of bit cells of static random-access memory (SRAM), a multi-level digitization module configured to generate a value in a range of values from a bit cell in the array of bit cells, the range of values including more than two discrete values, an output buffer configured to store the generated values, and an error correction code (ECC) decoder configured to output error-corrected values based on the stored values.
Type: Grant
Filed: August 6, 2015
Date of Patent: October 8, 2019
Assignee: NXP B.V.
Inventors: Nur Engin, Ajay Kapoor
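The multi-level readout and ECC stages described in this abstract can be sketched in miniature. This is a toy model, not NXP's design: the four voltage levels and the trivial 3x repetition code below stand in for whatever digitization thresholds and ECC the patent actually uses.

```python
from collections import Counter

LEVELS = [0.0, 1.0, 2.0, 3.0]  # four discrete values per cell, i.e. two bits

def digitize(voltage):
    """Multi-level digitization: map an analog cell readout to the nearest
    of the four discrete levels, yielding a value in range(4)."""
    return min(range(len(LEVELS)), key=lambda i: abs(LEVELS[i] - voltage))

def ecc_decode(stored_values):
    """Stand-in ECC decoder: majority vote over a 3x repetition code,
    which corrects any single bad cell in the triple."""
    return Counter(stored_values).most_common(1)[0][0]

# One corrupted cell (0.2 V where ~2 V was stored) is voted out:
corrected = ecc_decode([digitize(v) for v in (2.1, 1.9, 0.2)])
```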
-
Patent number: 8959291
Abstract: Described embodiments provide a multi-port memory system that has a plurality of data memory banks and an equal number of mapping memory banks, each of the data memory banks corresponding to one of the mapping memory banks. The multi-port memory reads, from the mapping memory bank selected by a read logical bank number, a read physical bank number identifying the data memory bank from which data is to be read. The memory system also calculates, from at least one physical bank number read from the mapping memory banks other than the one selected by the read logical bank number, a write physical bank number indicating which data memory bank is to be written. The calculation uses a hash of the physical bank numbers, such as an Exclusive-OR. This arrangement allows simultaneous read/write access to the memory with fixed latency.
Type: Grant
Filed: December 21, 2010
Date of Patent: February 17, 2015
Assignee: LSI Corporation
Inventor: Ting Zhou
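The Exclusive-OR trick in this abstract can be illustrated with a toy model: N logical banks mapped onto N+1 physical banks, where the one spare bank falls out of an XOR over the mapping entries. All names here are invented for illustration; a real design tracks mappings per address range rather than one word per bank.

```python
class XorBankedMemory:
    """Toy model: each bank holds a single word, so one read and one write
    can always proceed together by steering the write to the spare bank."""

    def __init__(self, n_logical=4):
        self.n_physical = n_logical + 1        # one extra physical bank
        self.mapping = list(range(n_logical))  # logical -> physical bank
        self.banks = [0] * self.n_physical

    def _spare_bank(self):
        # The XOR of all physical bank numbers is a constant, so XOR-ing in
        # the currently mapped banks leaves exactly the unmapped (spare) one.
        spare = 0
        for p in range(self.n_physical):
            spare ^= p
        for p in self.mapping:
            spare ^= p
        return spare

    def read_and_write(self, read_log, write_log, value):
        """Service one read and one write concurrently (assumes the two
        logical bank numbers differ)."""
        data = self.banks[self.mapping[read_log]]  # read port
        spare = self._spare_bank()
        self.banks[spare] = value                  # write never collides
        self.mapping[write_log] = spare            # remap; old bank is freed
        return data
```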
-
Patent number: 8930643
Abstract: A multi-port memory having an additional control bus for passing commands between ports has individual ports that can be configured to respond either to a command received from an external control bus or to a command received from the additional control bus. This facilitates various combinations of ports to vary the bandwidth or latency of the memory, tailoring performance characteristics to differing applications.
Type: Grant
Filed: June 9, 2014
Date of Patent: January 6, 2015
Assignee: Micron Technology, Inc.
Inventors: Dan Skinner, J. Thomas Pawlowski
-
Patent number: 8769213
Abstract: A multi-port memory having an additional control bus for passing commands between ports has individual ports that can be configured to respond either to a command received from an external control bus or to a command received from the additional control bus. This facilitates various combinations of ports to vary the bandwidth or latency of the memory, tailoring performance characteristics to differing applications.
Type: Grant
Filed: August 24, 2009
Date of Patent: July 1, 2014
Assignee: Micron Technology, Inc.
Inventors: Dan Skinner, J. Thomas Pawlowski
-
Patent number: 8719506
Abstract: In an embodiment, a memory port controller (MPC) is coupled to a memory port and receives transactions from processors and from a coherency port (ACP) used by one or more peripheral devices that may be cache coherent. The transactions include various quality of service (QoS) parameters. If a high-priority QoS transaction is received on the ACP, the MPC may push previous (lower-priority) transactions until the high-priority transaction can be completed. The MPC may maintain a count of outstanding high-priority QoS transactions. The L2 interface controller and ACP controller may send increment and decrement events based on processing the high-priority QoS transactions, and the MPC may push the memory transactions when the count is non-zero. In an embodiment, the MPC may continue pushing transactions until the L2 interface controller informs the MPC that the earlier transactions have been completed.
Type: Grant
Filed: November 21, 2011
Date of Patent: May 6, 2014
Assignee: Apple Inc.
Inventor: Jason M. Kassoff
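The counting scheme in this abstract can be sketched as follows. The class and method names are illustrative, not Apple's, and a single shared queue stands in for the MPC's real transaction tracking.

```python
from collections import deque

class MemoryPortController:
    """Sketch of the outstanding-count mechanism, not an implementation."""

    def __init__(self):
        self.outstanding_high_qos = 0  # count of in-flight high-priority txns
        self.queue = deque()           # transactions awaiting the memory port

    def enqueue(self, txn, high_priority=False):
        self.queue.append(txn)
        if high_priority:
            self.outstanding_high_qos += 1  # increment event (e.g. from ACP)

    def complete_high_qos(self):
        self.outstanding_high_qos -= 1      # decrement event (e.g. from L2)

    def cycle(self):
        """Issue transactions for one step: normally one at a time, but while
        the count is non-zero, push the whole backlog so the high-priority
        transaction can complete."""
        issued = []
        if self.outstanding_high_qos > 0:
            while self.queue:
                issued.append(self.queue.popleft())
        elif self.queue:
            issued.append(self.queue.popleft())
        return issued
```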
-
Publication number: 20130326131
Abstract: A security context management system within a security accelerator that can operate with high-latency memories and provide line-rate processing for several security protocols. The method hides the memory latencies by having the processing engines work in a pipelined fashion. It is designed to auto-fetch security contexts from external memory, and allows any number of simultaneous security connections by caching only a limited number of contexts on-chip and fetching others as needed. The module fetches and associates a security context with each ingress packet, and populates the security context RAM with data from the external memory.
Type: Application
Filed: May 29, 2012
Publication date: December 5, 2013
Applicant: TEXAS INSTRUMENTS INCORPORATED
Inventors: Amritpal Singh Mundra, Denis Beaudoin, Eric Lasmana
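The on-chip caching with auto-fetch described above can be modeled as an LRU cache that pulls missing contexts from external memory on demand. The names and the capacity below are hypothetical stand-ins for the hardware.

```python
from collections import OrderedDict

class ContextCache:
    """Sketch of the on-chip security-context RAM with auto-fetch; the
    capacity and fetch callback are stand-ins for the real hardware."""

    def __init__(self, capacity, fetch_from_external):
        self.capacity = capacity
        self.fetch = fetch_from_external
        self.ram = OrderedDict()           # conn_id -> security context

    def get(self, conn_id):
        if conn_id in self.ram:
            self.ram.move_to_end(conn_id)  # keep hot contexts resident
            return self.ram[conn_id]
        ctx = self.fetch(conn_id)          # auto-fetch on a miss
        self.ram[conn_id] = ctx
        if len(self.ram) > self.capacity:
            self.ram.popitem(last=False)   # evict the coldest context
        return ctx

fetched = []
def external_fetch(conn_id):
    fetched.append(conn_id)                # models an external-memory read
    return ("ctx", conn_id)

cache = ContextCache(capacity=2, fetch_from_external=external_fetch)
for conn in (1, 2, 1, 3, 2):               # getting 3 evicts 2; 2 refetches
    cache.get(conn)
```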
-
Publication number: 20130132682
Abstract: In an embodiment, a memory port controller (MPC) is coupled to a memory port and receives transactions from processors and from a coherency port (ACP) used by one or more peripheral devices that may be cache coherent. The transactions include various QoS parameters. If a high-priority QoS transaction is received on the ACP, the MPC may push previous (lower-priority) transactions until the high-priority transaction can be completed. The MPC may maintain a count of outstanding high-priority QoS transactions. The L2 interface controller and ACP controller may send increment and decrement events based on processing the high-priority QoS transactions, and the MPC may push the memory transactions when the count is non-zero. In an embodiment, the MPC may continue pushing transactions until the L2 interface controller informs the MPC that the earlier transactions have been completed (e.g., by passing an upgrade token to the MPC).
Type: Application
Filed: November 21, 2011
Publication date: May 23, 2013
Inventor: Jason M. Kassoff
-
Patent number: 8255424
Abstract: A system, storage medium, and method for structuring data are provided. The system connects to a storage device that stores original data. The method obtains the original data from the storage device, and stores the original data in the form of character strings into a buffer memory according to end of file-line (EOF) tags. The method further constructs data arrays to store the character strings, and arranges each of the data arrays into a data matrix. In addition, the method classifies each of the data arrays in the data matrix according to properties of the character strings, arranges the classified data arrays into a data file, and stores the data file into the buffer memory.
Type: Grant
Filed: September 14, 2009
Date of Patent: August 28, 2012
Assignee: Hon Hai Precision Industry Co., Ltd.
Inventors: Shen-Chun Li, Yung-Chieh Chen, Shou-Kuo Hsu
-
Patent number: 8001334
Abstract: A method and system for sharing banks of memory in a multi-port memory device between components is provided. The multi-port memory device includes multiple ports to which components of a system are attached, and multiple banks of memory within the multi-port memory device that are shared by each of the ports. A bank availability pin is added to each port for each bank of memory. The bank availability pin is signaled when the bank is available to a particular port and unsignaled when the bank is unavailable. Thus, the multi-port memory device can be shared by several components simultaneously with only a small amount of additional hardware to support the sharing. Also provided are methods for refreshing the banks of memory.
Type: Grant
Filed: December 6, 2007
Date of Patent: August 16, 2011
Assignee: Silicon Image, Inc.
Inventor: Dongyun Lee
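A software model of the per-port, per-bank availability signal might look like this. The API is invented for illustration; in the patent the mechanism is a hardware pin, not a method call.

```python
class MultiPortMemory:
    """Toy model of the bank-availability pins; acquire()/release() stand in
    for the pin being signaled or unsignaled for a (port, bank) pair."""

    def __init__(self, n_ports=2, n_banks=4):
        # avail[port][bank] models the bank-availability pin for that pair
        self.avail = [[True] * n_banks for _ in range(n_ports)]

    def acquire(self, port, bank):
        if not self.avail[port][bank]:
            return False                      # pin unsignaled: bank is busy
        for other in range(len(self.avail)):  # bank now busy for other ports
            if other != port:
                self.avail[other][bank] = False
        return True

    def release(self, port, bank):
        for p in range(len(self.avail)):      # pin signaled again for all
            self.avail[p][bank] = True
```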
-
Publication number: 20080256297
Abstract: A device that includes multiple processors connected to multiple level-one cache units. The device also includes a multi-port high-level cache unit that includes a first modular interconnect, a second modular interconnect, and multiple high-level cache paths, wherein the multiple high-level cache paths comprise multiple concurrently accessible interleaved high-level cache units. Conveniently, the device also includes at least one non-cacheable path. A method for retrieving information from a cache includes: concurrently receiving, by a first modular interconnect of a multi-port high-level cache unit, requests to retrieve information. The method is characterized by providing information from at least two of the multiple high-level cache paths if at least two high-level cache hits occur, and providing information via a second modular interconnect if a high-level cache miss occurs.
Type: Application
Filed: November 17, 2005
Publication date: October 16, 2008
Applicant: Freescale Semiconductor, Inc.
Inventors: Ron Bercovich, Odi Dahan, Norman Goldstein, Yehuda Nowogrodski
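The concurrency condition here, that requests can be served together only when they land on different interleaved cache paths, can be sketched as a toy scheduler. The path count and line size are assumptions, not values from the publication.

```python
N_PATHS = 4        # assumed number of interleaved high-level cache paths
LINE_BYTES = 64    # assumed cache-line size

def cache_path(addr):
    """Interleave on line-address bits so consecutive lines map to
    different paths."""
    return (addr // LINE_BYTES) % N_PATHS

def serve_concurrently(addrs):
    """Toy scheduler: greedily group requests so that each group touches
    distinct paths and could be serviced in the same cycle."""
    cycles = []
    for addr in addrs:
        for group in cycles:
            if all(cache_path(addr) != cache_path(b) for b in group):
                group.append(addr)
                break
        else:
            cycles.append([addr])   # conflicts with every group: new cycle
    return cycles
```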
-
Publication number: 20080162818
Abstract: A cache-memory control apparatus controls a level-1 (L1) cache and a level-2 (L2) cache having a cache line divided into a plurality of sub-lines for storing data from the L1 cache. The cache-memory control apparatus includes a control-flag adding unit, an L1-cache control unit, and an L2-cache control unit. The control-flag adding unit provides an SP flag to each of the sub-lines. The L1-cache control unit acquires an access virtual address and, when there is no data at the access virtual address, outputs an L2 cache-access address to the L2-cache control unit. The L2-cache control unit switches the SP flag based on a virtual page number in an L1 index and a physical page number in an L2 index. Based on the SP flag, the corresponding sub-line is written back to the L1 cache.
Type: Application
Filed: October 31, 2007
Publication date: July 3, 2008
Applicant: Fujitsu Limited
Inventors: Tomoyuki Okawa, Hiroyuki Kojima, Hideki Sakata, Masaki Ukai
-
Publication number: 20080040552
Abstract: The occurrence of a failure in either an operational processor or a standby processor is monitored, and when a failure occurs in the operational processor, a switch to the standby processor is made. The cache memory of each processor has a plurality of ports through which data can be read and written simultaneously. The cache memory controller of the operational processor transfers an update for the cache memory to the cache memory of the standby processor using a port different from the port used for updating. The cache memory controller of the standby processor writes the received update into its cache memory using a port different from the port used for updating.
Type: Application
Filed: August 1, 2007
Publication date: February 14, 2008
Applicant: FUJITSU LIMITED
Inventors: Eiichi TSUIJI, Noaki Kawasaki, Kunio Yamaguchi, Kazunori Uemura, Ryouko Tamura
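The dual-port mirroring idea reduces to this: the local update uses one port while the forwarded copy arrives at the standby on another, so mirroring never blocks the active processor. A minimal sketch, with invented class and method names and dicts standing in for the cache arrays:

```python
class DualPortCache:
    """Each cache is modeled as a dict; update() models writes arriving on
    one port and apply_remote() models mirrored writes on the other port."""

    def __init__(self):
        self.data = {}
        self.peer = None     # the other processor's cache, once paired

    def update(self, addr, value):
        self.data[addr] = value                 # port A: local update
        if self.peer is not None:
            self.peer.apply_remote(addr, value)

    def apply_remote(self, addr, value):
        self.data[addr] = value                 # port B: mirrored update

active, standby = DualPortCache(), DualPortCache()
active.peer = standby
active.update(0x10, 99)   # standby now holds the same data for failover
```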