Patents by Inventor Amin Ansari

Amin Ansari has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200104692
    Abstract: A method of exploiting activation sparsity in deep neural networks is described. The method includes retrieving an activation tensor and a weight tensor where the activation tensor is a sparse activation tensor. The method also includes generating a compressed activation tensor comprising non-zero activations of the activation tensor, where the compressed activation tensor has fewer columns than the activation tensor. The method further includes processing the compressed activation tensor and the weight tensor to generate an output tensor.
    Type: Application
    Filed: September 28, 2018
    Publication date: April 2, 2020
    Inventors: Rexford HILL, Aaron LAMB, Michael GOLDFARB, Amin ANSARI, Christopher LOTT
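    Illustrative sketch (not from the patent; the function name, shapes, and NumPy usage are assumptions): one way to picture the compression step in the abstract above is to drop the all-zero columns of a 2-D activation matrix, together with the matching weight rows, before the matrix multiply.

        import numpy as np

        def compress_and_multiply(activations, weights):
            # Keep only the columns of the activation matrix that hold a non-zero value,
            # drop the matching rows of the weight matrix, then multiply.
            nonzero_cols = np.any(activations != 0, axis=0)
            compressed_act = activations[:, nonzero_cols]   # fewer columns than the input tensor
            compressed_w = weights[nonzero_cols, :]          # rows paired with the kept columns
            return compressed_act @ compressed_w             # same result as activations @ weights

        act = np.array([[1.0, 0.0, 2.0, 0.0],
                        [0.0, 0.0, 3.0, 0.0]])
        w = np.arange(20, dtype=float).reshape(4, 5)
        assert np.allclose(compress_and_multiply(act, w), act @ w)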
  • Publication number: 20200089497
    Abstract: Systems and methods for minimizing control variance overhead in a dataflow processor include receiving a generating instruction specifying at least an acknowledge predicate based on a first number, a second number, and a first value, wherein a true branch comprises the first number of consumer instructions of the generating instruction based on the first value, used as a first predicate, being true; and a false branch comprises the second number of consumer instructions of the generating instruction based on the first value, used as the first predicate, being false. The acknowledge predicate is evaluated to be a selected number, which is the first number if the first value is true, or the second number if the first value is false. The generating instruction is fired upon the selected number of acknowledge arcs being received from the true branch or the false branch.
    Type: Application
    Filed: September 18, 2018
    Publication date: March 19, 2020
    Inventors: Rakesh KOMURAVELLI, Amin ANSARI, Ramesh Chandra CHAUHAN, Karamvir CHATHA
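    Illustrative sketch (not from the patent; the toy producer model and all names are assumptions): the acknowledge predicate selects how many acknowledgements the generating instruction must collect, depending on which branch the predicate value took, before it may fire again.

        class GeneratingInstruction:
            def __init__(self, true_count, false_count):
                self.true_count = true_count      # number of consumers on the true branch
                self.false_count = false_count    # number of consumers on the false branch
                self.acks_received = 0
                self.predicate = None

            def fire(self, predicate):
                # Firing records the predicate value and resets the acknowledge count.
                self.predicate = predicate
                self.acks_received = 0

            def acknowledge(self):
                self.acks_received += 1

            def may_fire_again(self):
                # The acknowledge predicate evaluates to the selected number of arcs.
                expected = self.true_count if self.predicate else self.false_count
                return self.acks_received >= expected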
  • Patent number: 10296076
    Abstract: Supply voltage droop management circuits for reducing or avoiding supply voltage droops are disclosed. A supply voltage droop management circuit includes an interrupt circuit configured to receive event signals generated by a functional circuit. Each event signal corresponds to an operational event that occurs in the functional circuit and increases the load current demand on the power supply powering the functional circuit, causing supply voltage droop. The interrupt circuit is configured to generate an interrupt signal in response to a received event signal. A memory includes an operational event-frequency table whose entries each associate an operational event with a target frequency. Operating the functional circuit at the target frequency reduces the load current demand on the power supply, and thus the supply voltage droop.
    Type: Grant
    Filed: May 16, 2016
    Date of Patent: May 21, 2019
    Assignee: QUALCOMM Incorporated
    Inventors: Javid Jaffari, Amin Ansari
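    Illustrative sketch (not from the patent; the event names, frequencies, and function name are made up): the operational event-frequency table maps each event to a target frequency, and handling the interrupt throttles the functional circuit to that frequency to cut load-current demand.

        # Hypothetical operational event-frequency table (entries are invented for illustration).
        EVENT_FREQUENCY_TABLE = {
            "vector_unit_wakeup": 600_000_000,    # target frequency in Hz
            "cache_bank_power_up": 800_000_000,
        }

        def handle_event_interrupt(event, current_frequency):
            # Look up the target frequency for the event; operating at or below it
            # reduces load-current demand and therefore supply voltage droop.
            target = EVENT_FREQUENCY_TABLE.get(event, current_frequency)
            return min(current_frequency, target)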
  • Publication number: 20190087713
    Abstract: The present disclosure describes methods, computer-readable media, and apparatuses for operating neural networks. For example, a first apparatus may receive a set of sparse weight vectors. The first apparatus may compress the set of sparse weight vectors to produce a compressed set of sparse weight vectors. The first apparatus may operate a neural network based on the compressed set of sparse weight vectors. In another example, a second apparatus may receive a set of sparse weight vectors. The second apparatus may perform a sparse computation based on the set of sparse weight vectors, and the performance of the sparse computation may produce one or more partial sums. The second apparatus may operate a neural network based at least in part on the one or more partial sums.
    Type: Application
    Filed: September 20, 2018
    Publication date: March 21, 2019
    Inventors: Aaron LAMB, Rexford HILL, Amin ANSARI
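    Illustrative sketch (not from the patent; the (index, value) representation and names are assumptions): compressing a sparse weight vector to index/value pairs lets the partial sums skip the zero weights entirely.

        def compress_weights(weights):
            # Keep only the non-zero weights, each paired with its original index.
            return [(i, w) for i, w in enumerate(weights) if w != 0.0]

        def sparse_dot(compressed_weights, activations):
            # Partial sums accumulate over the surviving (index, value) pairs only.
            return sum(w * activations[i] for i, w in compressed_weights)

        w = [0.0, 0.5, 0.0, 0.0, 2.0]
        x = [1.0, 2.0, 3.0, 4.0, 5.0]
        assert sparse_dot(compress_weights(w), x) == 0.5 * 2.0 + 2.0 * 5.0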
  • Patent number: 10176147
    Abstract: Multi-processor core three-dimensional (3D) integrated circuits (ICs) (3DICs) and related methods are disclosed. In aspects disclosed herein, ICs are provided that include a central processing unit (CPU) having multiple processor cores ("cores") to improve performance. To further improve CPU performance, the multiple cores can also be designed to communicate with each other to offload workloads and/or share resources for parallel processing, but at the cost of communication overhead from passing data through interconnects, which have an associated latency. To mitigate this communication overhead inefficiency, aspects disclosed herein provide the CPU with its multiple cores in a 3DIC.
    Type: Grant
    Filed: March 7, 2017
    Date of Patent: January 8, 2019
    Assignee: QUALCOMM Incorporated
    Inventors: Kambiz Samadi, Amin Ansari, Yang Du
  • Patent number: 10120581
    Abstract: Aspects for generating compressed data streams with lookback pre-fetch instructions are disclosed. A data compression system is provided and configured to receive and compress an uncompressed data stream as part of a lookback-based compression scheme. The data compression system determines if a current data block was previously compressed. If so, the data compression system is configured to insert a lookback instruction corresponding to the current data block into the compressed data stream. Each lookback instruction includes a lookback buffer index that points to an entry in a lookback buffer where decompressed data corresponding to the data block will be stored during a separate decompression scheme. Once the data blocks have been compressed, the data compression system is configured to move a lookback buffer index of each lookback instruction in the compressed data stream into a lookback pre-fetch instruction located earlier than the corresponding lookback instruction in the compressed data stream.
    Type: Grant
    Filed: March 30, 2016
    Date of Patent: November 6, 2018
    Assignee: QUALCOMM Incorporated
    Inventors: Richard Senior, Amin Ansari, Vito Remo Bica, Jinxia Bai
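    Illustrative sketch (not from the patent; the token names, the repeat-detection rule, and the fixed pre-fetch distance are assumptions): repeated blocks become lookback references into a buffer of already-seen data, and a second pass hoists each lookback buffer index into a pre-fetch instruction placed earlier in the compressed stream.

        def compress_with_lookback(blocks, prefetch_distance=2):
            # Pass 1: replace previously seen blocks with LOOKBACK references into a
            # buffer of already-seen data.
            stream, seen = [], {}
            for block in blocks:
                if block in seen:
                    stream.append(("LOOKBACK", seen[block]))
                else:
                    seen[block] = len(seen)          # index in the decompressor's lookback buffer
                    stream.append(("LITERAL", block))
            # Pass 2: hoist each lookback buffer index into a PREFETCH instruction
            # located prefetch_distance positions earlier in the compressed stream.
            prefetches = {}
            for pos, (kind, payload) in enumerate(stream):
                if kind == "LOOKBACK":
                    target = max(0, pos - prefetch_distance)
                    prefetches.setdefault(target, []).append(("PREFETCH", payload))
            out = []
            for pos, instruction in enumerate(stream):
                out.extend(prefetches.get(pos, []))
                out.append(instruction)
            return out

        print(compress_with_lookback(["A", "B", "C", "A", "B"]))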
  • Publication number: 20180260360
    Abstract: Multi-processor core three-dimensional (3D) integrated circuits (ICs) (3DICs) and related methods are disclosed. In aspects disclosed herein, ICs are provided that include a central processing unit (CPU) having multiple processor cores ("cores") to improve performance. To further improve CPU performance, the multiple cores can also be designed to communicate with each other to offload workloads and/or share resources for parallel processing, but at the cost of communication overhead from passing data through interconnects, which have an associated latency. To mitigate this communication overhead inefficiency, aspects disclosed herein provide the CPU with its multiple cores in a 3DIC.
    Type: Application
    Filed: March 7, 2017
    Publication date: September 13, 2018
    Inventors: Kambiz Samadi, Amin Ansari, Yang Du
  • Patent number: 9998143
    Abstract: A system for data decompression may include a processor coupled to a remote memory having a remote dictionary stored thereon and coupled to a decompression logic having a local memory with a local dictionary. The processor may decompress data during execution by accessing the local dictionary, and if necessary, the remote dictionary.
    Type: Grant
    Filed: August 5, 2016
    Date of Patent: June 12, 2018
    Assignee: QUALCOMM Incorporated
    Inventors: Richard Senior, Amin Ansari, Jinxia Bai, Vito Bica
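    Illustrative sketch (not from the patent; the dictionary layout and all names are assumptions): each compressed token is resolved against the small local dictionary first and falls back to the remote dictionary only on a miss.

        def resolve_token(token, local_dictionary, remote_dictionary):
            # Fast path: the local dictionary held in the decompression logic's memory.
            if token in local_dictionary:
                return local_dictionary[token]
            # Slow path: fall back to the dictionary stored in remote memory.
            return remote_dictionary[token]

        def decompress(tokens, local_dictionary, remote_dictionary):
            return b"".join(resolve_token(t, local_dictionary, remote_dictionary) for t in tokens)

        local = {0: b"the ", 1: b"and "}
        remote = {0: b"the ", 1: b"and ", 2: b"memory "}
        assert decompress([0, 2, 1], local, remote) == b"the memory and "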
  • Patent number: 9823854
    Abstract: Aspects disclosed relate to a priority-based access of compressed memory lines in a processor-based system. In an aspect, a memory access device in the processor-based system receives a read access request for memory. If the read access request is higher priority, the memory access device uses the logical memory address of the read access request as the physical memory address to access the compressed memory line. However, if the read access request is lower priority, the memory access device translates the logical memory address of the read access request into one or more physical memory addresses in memory space left by the compression of higher priority lines. In this manner, the efficiency of higher priority compressed memory accesses is improved by removing a level of indirection otherwise required to find and access compressed memory lines.
    Type: Grant
    Filed: March 18, 2016
    Date of Patent: November 21, 2017
    Assignee: QUALCOMM Incorporated
    Inventors: Andres Alejandro Oportus Valenzuela, Amin Ansari, Richard Senior, Nieyan Geng, Anand Janakiraman, Gurvinder Singh Chhabra
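    Illustrative sketch (not from the patent; the data structures and names are assumptions): a high-priority read uses its logical address directly as the physical address, while a low-priority read is translated into the physical addresses left free by compressing the high-priority lines.

        def resolve_read(logical_address, priority, low_priority_translation):
            if priority == "high":
                # No indirection: the logical address is used as the physical address.
                return [logical_address]
            # Low priority: translate into one or more physical addresses carved out
            # of the space freed by compressing the higher-priority lines.
            return low_priority_translation[logical_address]

        translation = {0x2000: [0x0040, 0x0048]}
        assert resolve_read(0x1000, "high", translation) == [0x1000]
        assert resolve_read(0x2000, "low", translation) == [0x0040, 0x0048]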
  • Publication number: 20170329391
    Abstract: Supply voltage droop management circuits for reducing or avoiding supply voltage droops are disclosed. A supply voltage droop management circuit includes an interrupt circuit configured to receive event signals generated by a functional circuit. Each event signal corresponds to an operational event that occurs in the functional circuit and increases the load current demand on the power supply powering the functional circuit, causing supply voltage droop. The interrupt circuit is configured to generate an interrupt signal in response to a received event signal. A memory includes an operational event-frequency table whose entries each associate an operational event with a target frequency. Operating the functional circuit at the target frequency reduces the load current demand on the power supply, and thus the supply voltage droop.
    Type: Application
    Filed: May 16, 2016
    Publication date: November 16, 2017
    Inventors: Javid Jaffari, Amin Ansari
  • Publication number: 20170285939
    Abstract: Aspects for generating compressed data streams with lookback pre-fetch instructions are disclosed. A data compression system is provided and configured to receive and compress an uncompressed data stream as part of a lookback-based compression scheme. The data compression system determines if a current data block was previously compressed. If so, the data compression system is configured to insert a lookback instruction corresponding to the current data block into the compressed data stream. Each lookback instruction includes a lookback buffer index that points to an entry in a lookback buffer where decompressed data corresponding to the data block will be stored during a separate decompression scheme. Once the data blocks have been compressed, the data compression system is configured to move a lookback buffer index of each lookback instruction in the compressed data stream into a lookback pre-fetch instruction located earlier than the corresponding lookback instruction in the compressed data stream.
    Type: Application
    Filed: March 30, 2016
    Publication date: October 5, 2017
    Inventors: Richard Senior, Amin Ansari, Vito Remo Bica, Jinxia Bai
  • Publication number: 20170269851
    Abstract: Aspects disclosed relate to a priority-based access of compressed memory lines in a processor-based system. In an aspect, a memory access device in the processor-based system receives a read access request for memory. If the read access request is higher priority, the memory access device uses the logical memory address of the read access request as the physical memory address to access the compressed memory line. However, if the read access request is lower priority, the memory access device translates the logical memory address of the read access request into one or more physical memory addresses in memory space left by the compression of higher priority lines. In this manner, the efficiency of higher priority compressed memory accesses is improved by removing a level of indirection otherwise required to find and access compressed memory lines.
    Type: Application
    Filed: March 18, 2016
    Publication date: September 21, 2017
    Inventors: Andres Alejandro Oportus Valenzuela, Amin Ansari, Richard Senior, Nieyan Geng, Anand Janakiraman, Gurvinder Singh Chhabra
  • Publication number: 20170262367
    Abstract: Systems, methods, and computer programs are disclosed for allocating memory in a hybrid parallel/serial memory system. One method comprises configuring a memory address map for a multi-rank memory system with a dedicated serial access region in a first memory rank and a dedicated parallel access region in a second memory rank. A request is received for a virtual memory page. If the request comprises a performance hint, the virtual memory page is selectively assigned to a free physical page in either the dedicated serial access region in the first memory rank or the dedicated parallel access region in the second memory rank.
    Type: Application
    Filed: March 11, 2016
    Publication date: September 14, 2017
    Inventors: Dexter Tamio Chun, Yanru Li, Javid Jaffari, Amin Ansari
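    Illustrative sketch (not from the patent; which region a hinted request prefers, and all names, are assumptions): the memory map dedicates one region per rank, and the allocator steers a page request toward the serial or parallel access region depending on the performance hint.

        SERIAL_REGION = "rank0_serial"       # dedicated serial access region (assumed placement)
        PARALLEL_REGION = "rank1_parallel"   # dedicated parallel access region (assumed placement)

        def assign_physical_page(free_pages, performance_hint):
            # A hinted request is steered to the parallel access region; otherwise the
            # serial access region is tried first (the preference order is an assumption).
            preferred = PARALLEL_REGION if performance_hint else SERIAL_REGION
            fallback = SERIAL_REGION if performance_hint else PARALLEL_REGION
            for region in (preferred, fallback):
                if free_pages[region]:
                    return region, free_pages[region].pop()
            raise MemoryError("no free physical pages")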
  • Patent number: 9747038
    Abstract: Systems and methods are disclosed for a hybrid parallel-serial memory access by a system on chip (SoC). The SoC is electrically coupled to the memory by both a parallel access channel and a separate serial access channel. A request for access to the memory is received. In response to receiving the request to access the memory, a type of memory access is identified. A determination is then made whether to access the memory with the serial access channel. In response to the determination to access the memory with the serial access channel, a first portion of the memory is accessed with the parallel access channel, and a second portion of the memory is accessed with the serial access channel.
    Type: Grant
    Filed: December 2, 2015
    Date of Patent: August 29, 2017
    Assignee: QUALCOMM Incorporated
    Inventors: Javid Jaffari, Amin Ansari, Rodolfo Beraha
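    Illustrative sketch (not from the patent; the access-type policy, the even split, and the callable channel interfaces are assumptions): once the serial channel is chosen for a request, one portion of the transfer goes over the parallel channel and the remainder over the separate serial channel.

        def access_memory(address, size, access_type, parallel_channel, serial_channel):
            # Decide whether this type of access should also use the serial channel
            # (the policy here is an assumption, not the patent's rule).
            use_serial = access_type in ("bulk_read", "bulk_write")
            if not use_serial:
                return parallel_channel(address, size)
            # Split the transfer: first portion over the parallel channel, the rest
            # over the serial channel.
            half = size // 2
            return parallel_channel(address, half) + serial_channel(address + half, size - half)

    Here parallel_channel and serial_channel stand in for driver callbacks that return the bytes read; the concatenation of their results reassembles the full response.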
  • Publication number: 20170160928
    Abstract: Systems and methods are disclosed for a hybrid parallel-serial memory access by a system on chip (SoC). The SoC is electrically coupled to the memory by both a parallel access channel and a separate serial access channel. A request for access to the memory is received. In response to receiving the request to access the memory, a type of memory access is identified. A determination is then made whether to access the memory with the serial access channel. In response to the determination to access the memory with the serial access channel, a first portion of the memory is accessed with the parallel access channel, and a second portion of the memory is accessed with the serial access channel.
    Type: Application
    Filed: December 2, 2015
    Publication date: June 8, 2017
    Inventors: Javid Jaffari, Amin Ansari, Rodolfo Beraha
  • Publication number: 20170017576
    Abstract: Aspects include computing devices, systems, and methods for generating a cache memory configuration. A server may apply machine learning to context data. The server may determine a cache memory configuration relating to the context data for a cache memory of a computing device and predict execution of an application on the computing device. Aspects also include computing devices, systems, and methods for configuring a cache memory of the computing device. The computing device may classify a plurality of cache memory configurations, related to a predicted application execution, based on at least a hardware data threshold and a first hardware data. The computing device may select a first cache memory configuration from the plurality of cache memory configurations in response to the first cache memory configuration being classified for the first hardware data, and configure the cache memory at runtime based on the first cache memory configuration.
    Type: Application
    Filed: July 16, 2015
    Publication date: January 19, 2017
    Inventors: Rosario Cammarota, Kishore Yalamanchili, Amin Ansari, Amrit Kumar Panda, Rodolfo Giacomo Beraha
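    Illustrative sketch (not from the patent; the classification rule, the dictionary-based configuration format, and all names are assumptions): the device classifies candidate cache configurations against a hardware data threshold and configures the cache at runtime from the first configuration classified for the observed hardware data.

        def classify_configurations(configurations, hardware_data, hardware_data_threshold):
            # Toy rule: a configuration is a candidate when the observed hardware data
            # stays within the threshold recorded for it (the rule is an assumption).
            return [c for c in configurations
                    if hardware_data <= hardware_data_threshold.get(c["name"], float("inf"))]

        def configure_cache_at_runtime(configurations, hardware_data, hardware_data_threshold):
            candidates = classify_configurations(configurations, hardware_data, hardware_data_threshold)
            return candidates[0] if candidates else None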
  • Publication number: 20160344406
    Abstract: A system for data decompression may include a processor coupled to a remote memory having a remote dictionary stored thereon and coupled to a decompression logic having a local memory with a local dictionary. The processor may decompress data during execution by accessing the local dictionary, and if necessary, the remote dictionary.
    Type: Application
    Filed: August 5, 2016
    Publication date: November 24, 2016
    Inventors: Richard SENIOR, Amin ANSARI, Jinxia BAI, Vito BICA
  • Publication number: 20160248441
    Abstract: A system for data decompression may include a processor coupled to a remote memory having a remote dictionary stored thereon and coupled to a decompression logic having a local memory with a local dictionary. The processor may decompress data during execution by accessing the local dictionary, and if necessary, the remote dictionary.
    Type: Application
    Filed: February 19, 2015
    Publication date: August 25, 2016
    Inventors: Richard SENIOR, Amin ANSARI, Jinxia BAI, Vito BICA
  • Patent number: 9413386
    Abstract: A system for data decompression may include a processor coupled to a remote memory having a remote dictionary stored thereon and coupled to a decompression logic having a local memory with a local dictionary. The processor may decompress data during execution by accessing the local dictionary, and if necessary, the remote dictionary.
    Type: Grant
    Filed: February 19, 2015
    Date of Patent: August 9, 2016
    Assignee: QUALCOMM Incorporated
    Inventors: Richard Senior, Amin Ansari, Jinxia Bai, Vito Bica
  • Patent number: 9344114
    Abstract: Data compression systems, methods, and computer program products are disclosed. For each successive input word of an input stream, it is determined whether the input word matches an entry in a lookback table. The lookback table is updated in response to the input word. Input words may be of a number of data types, including zero runs and full or partial matches with an entry in the lookback table. A codeword is generated by entropy encoding a data type corresponding to the input word. The lookback table may be indexed by the position of the input word in the input stream.
    Type: Grant
    Filed: August 21, 2015
    Date of Patent: May 17, 2016
    Assignee: QUALCOMM Incorporated
    Inventors: Adrienne Milner, Amin Ansari, Richard Senior, Vito Remo Bica
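    Illustrative sketch (not from the patent; the data-type rules, table size, and the partial-match test are assumptions): each input word is classified as a zero, a full or partial match against the lookback table entry at its stream position, or a literal, and the table is updated as the stream is scanned; a real encoder would then entropy-code the data-type tags into codewords.

        def classify_word(word, table_entry):
            # Toy data-type classification for one 32-bit input word.
            if word == 0:
                return "ZERO"
            if table_entry == word:
                return "FULL_MATCH"
            if table_entry is not None and (table_entry >> 16) == (word >> 16):
                return "PARTIAL_MATCH"            # upper half matches the table entry
            return "LITERAL"

        def compress(words, table_size=64):
            lookback_table = {}
            stream = []
            for position, word in enumerate(words):
                index = position % table_size                 # table indexed by stream position
                stream.append((classify_word(word, lookback_table.get(index)), word))
                lookback_table[index] = word                  # update the table with the input word
            return stream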