Patents by Inventor Kenneth Duong

Kenneth Duong has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11361213
    Abstract: Some embodiments provide a neural network inference circuit for implementing a neural network that includes multiple computation nodes at multiple layers. Each of a set of the computation nodes includes (i) a linear function that includes a dot product of input values and weight values and (ii) a non-linear activation function. The neural network inference circuit includes (i) a set of dot product circuits to compute dot products for the plurality of computation nodes and (ii) at least one computation node post-processing circuit to (a) receive a dot product for a computation node computed by the set of dot product circuits, (b) compute a result of the linear function for the computation node based on the dot product, and (c) use a lookup table to compute the non-linear activation function of the computation node from the result of the linear function to determine an output of the computation node.
    Type: Grant
    Filed: December 6, 2018
    Date of Patent: June 14, 2022
    Assignee: PERCEIVE CORPORATION
    Inventors: Kenneth Duong, Jung Ko, Steven L. Teig
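The dot-product-plus-lookup-table flow this abstract describes can be sketched behaviorally in software. This is an illustrative model only, not the patented circuit; the table size, input range, and the choice of `tanh` as the activation are all assumptions:

```python
import numpy as np

def build_activation_lut(fn, num_entries=257, in_min=-8.0, in_max=8.0):
    # Precompute a table of (input, output) pairs so the nonlinearity
    # never has to be evaluated directly at inference time.
    xs = np.linspace(in_min, in_max, num_entries)
    return xs, fn(xs)

def node_output(inputs, weights, bias, lut_xs, lut_ys):
    # Linear function: dot product of inputs and weights, plus a bias.
    linear = np.dot(inputs, weights) + bias
    # Non-linear activation via nearest-entry table lookup.
    idx = int(np.abs(lut_xs - linear).argmin())
    return float(lut_ys[idx])

xs, ys = build_activation_lut(np.tanh)
out = node_output(np.array([0.5, -1.0, 2.0]),
                  np.array([0.2, 0.4, 0.1]),
                  bias=0.1, lut_xs=xs, lut_ys=ys)
```

A real post-processing circuit would index the table with quantized values rather than a nearest-neighbor search, but the division of labor is the same: the dot product circuits produce the linear result, and the lookup table supplies the nonlinearity.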
  • Patent number: 11347297
    Abstract: For a neural network inference circuit that executes a neural network including multiple computation nodes at multiple layers for which data is stored in a plurality of memory banks, some embodiments provide a method for dynamically putting memory banks into a sleep mode of operation to conserve power. The method tracks the accesses to individual memory banks and, if a certain number of clock cycles elapse with no access to a particular memory bank, sends a signal to the memory bank indicating that it should operate in a sleep mode. Circuit components involved in dynamic memory sleep, in some embodiments, include a core RAM pipeline, a core RAM sleep controller, a set of core RAM bank select decoders, and a set of core RAM memory bank wrappers.
    Type: Grant
    Filed: May 30, 2019
    Date of Patent: May 31, 2022
    Assignee: PERCEIVE CORPORATION
    Inventors: Jung Ko, Kenneth Duong, Steven L. Teig
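The dynamic sleep scheme above amounts to an idle counter per memory bank. A rough behavioral model of the sleep controller (the bank count and idle threshold are illustrative, not taken from the patent):

```python
class BankSleepController:
    """Track idle cycles per memory bank; signal sleep after a threshold."""

    def __init__(self, num_banks, idle_threshold):
        self.idle = [0] * num_banks
        self.asleep = [False] * num_banks
        self.threshold = idle_threshold

    def clock(self, accessed_banks):
        # Advance one clock cycle; accessed_banks is the set of banks
        # touched this cycle. An access resets the counter and wakes
        # the bank; enough consecutive idle cycles put it to sleep.
        for bank in accessed_banks:
            self.idle[bank] = 0
            self.asleep[bank] = False
        for bank in range(len(self.idle)):
            if bank not in accessed_banks:
                self.idle[bank] += 1
                if self.idle[bank] >= self.threshold:
                    self.asleep[bank] = True

ctl = BankSleepController(num_banks=4, idle_threshold=3)
for _ in range(3):
    ctl.clock({0})   # bank 0 stays busy; banks 1-3 accumulate idle cycles
```

After three idle cycles, the controller asserts the sleep signal for banks 1-3 while bank 0 remains awake.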
  • Patent number: 11341397
    Abstract: Some embodiments provide a method for a neural network inference circuit (NNIC) that implements a neural network including multiple computation nodes at multiple layers. Each computation node includes a dot product of input values and weight values and a set of post-processing operations. The method retrieves a set of weight values and a set of input values for a computation node from a set of memories of the NNIC. The method computes a dot product of the retrieved sets of weight values and input values. The method performs the post-processing operations for the computation node on a result of the dot product computation to compute an output value for the computation node. The method stores the output value in the set of memories. No intermediate results of the dot product or the set of post-processing operations are stored in any RAM of the NNIC during the computation.
    Type: Grant
    Filed: December 6, 2018
    Date of Patent: May 24, 2022
    Assignee: PERCEIVE CORPORATION
    Inventors: Kenneth Duong, Jung Ko, Steven L. Teig
  • Publication number: 20220108161
    Abstract: Some embodiments provide a three-dimensional (3D) circuit structure that has two or more vertically stacked bonded layers with a machine-trained network on at least one bonded layer. As described above, each bonded layer can be an IC die or an IC wafer in some embodiments with different embodiments encompassing different combinations of wafers and dies for the different bonded layers. The machine-trained network in some embodiments includes several stages of machine-trained processing nodes with routing fabric that supplies the outputs of earlier stage nodes to drive the inputs of later stage nodes. In some embodiments, the machine-trained network is a neural network and the processing nodes are neurons of the neural network. In some embodiments, one or more parameters associated with each processing node (e.g., each neuron) is defined through machine-trained processes that define the values of these parameters in order to allow the machine-trained network (e.g.
    Type: Application
    Filed: October 13, 2021
    Publication date: April 7, 2022
    Inventors: Steven L. Teig, Kenneth Duong
  • Patent number: 11295200
    Abstract: Some embodiments provide a method for a neural network inference circuit that executes a neural network including multiple nodes. The method loads a first set of weight values into a first set of weight value buffers, a second set of weight values into a second set of weight value buffers, a first set of input values into a first set of input value buffers, and a second set of input values into a second set of input value buffers. In a first clock cycle, the method computes a first dot product of the first set of weight values and the first set of input values. In a second clock cycle, the method computes a second dot product of the second set of weight values and the second set of input values. The method adds the first and second dot products to compute a dot product for the node.
    Type: Grant
    Filed: December 6, 2018
    Date of Patent: April 5, 2022
    Assignee: PERCEIVE CORPORATION
    Inventors: Jung Ko, Kenneth Duong, Steven L. Teig
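Arithmetically, the two-cycle scheme splits one node's dot product into two partial dot products, computed from separate buffer sets on consecutive cycles and then summed. A minimal sketch with the buffering and clocking abstracted away:

```python
import numpy as np

def two_cycle_dot_product(weights, inputs):
    # Split the node's weights and inputs across two buffer sets.
    half = len(weights) // 2
    w1, w2 = weights[:half], weights[half:]
    x1, x2 = inputs[:half], inputs[half:]
    partial1 = np.dot(w1, x1)   # computed in the first clock cycle
    partial2 = np.dot(w2, x2)   # computed in the second clock cycle
    return partial1 + partial2  # added to form the node's dot product

w = np.array([1.0, 2.0, 3.0, 4.0])
x = np.array([0.5, 0.5, 0.5, 0.5])
result = two_cycle_dot_product(w, x)   # equals np.dot(w, x)
```

The point of the hardware arrangement is that the same dot product circuitry is reused across cycles for a node wider than the circuit, at the cost of latency rather than area.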
  • Publication number: 20220068890
    Abstract: Some embodiments of the invention provide a three-dimensional (3D) circuit that is formed by vertically stacking two or more integrated circuit (IC) dies to at least partially overlap. In this arrangement, several circuit blocks defined on each die (1) overlap with other circuit blocks defined on one or more other dies, and (2) electrically connect to these other circuit blocks through connections that cross one or more bonding layers that bond one or more pairs of dies. In some embodiments, the overlapping, connected circuit block pairs include pairs of computation blocks and pairs of computation and memory blocks. The connections that cross bonding layers to electrically connect circuit blocks on different dies are referred to below as z-axis wiring or connections. This is because these connections traverse completely or mostly in the z-axis of the 3D circuit, with the x-y axes of the 3D circuit defining the planar surface of the IC die substrate or interconnect layers.
    Type: Application
    Filed: September 14, 2021
    Publication date: March 3, 2022
    Inventors: Steven L. Teig, Ilyas Mohammed, Kenneth Duong, Javier DeLaCruz
  • Patent number: 11250326
    Abstract: Some embodiments provide a method for compiling a neural network (NN) program for an NN inference circuit (NNIC) that includes multiple partial dot product computation circuits (PDPCCs) for computing dot products between weight values and input values. The method receives an NN definition with multiple nodes. The method assigns a group of filters to specific PDPCCs. Each filter is assigned to a different set of the PDPCCs. When a filter does not have enough weight values equal to zero for a first set of PDPCCs to which the filter is assigned to compute dot products for nodes that use the filter, the method divides the filter between the first set and a second set of PDPCCs. The method generates program instructions for instructing the NNIC to execute the NN by using the first and second PDPCCs to compute dot products for the nodes that use the filter.
    Type: Grant
    Filed: December 6, 2018
    Date of Patent: February 15, 2022
    Assignee: PERCEIVE CORPORATION
    Inventors: Jung Ko, Kenneth Duong, Steven L. Teig
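The compiler decision described above — split a filter when it lacks enough zero weights for one circuit set — can be sketched as follows. The `capacity` parameter (nonzero weights one circuit set can accept), the greedy split strategy, and the zero-padding are illustrative assumptions, not the patent's algorithm:

```python
def assign_filter(filter_weights, capacity):
    # If the filter's nonzero weights fit one set of partial dot product
    # computation circuits, assign it whole; otherwise divide it between
    # a first and a second set, padding each part with zeros so the two
    # partial filters sum elementwise to the original.
    nonzeros = sum(1 for w in filter_weights if w != 0)
    if nonzeros <= capacity:
        return [filter_weights]
    first, second, count = [], [], 0
    for w in filter_weights:
        if w != 0 and count < capacity:
            first.append(w)
            second.append(0)
            count += 1
        else:
            first.append(0)
            second.append(w)
    return [first, second]

# A filter with 4 nonzero weights against a capacity of 2 must be split.
parts = assign_filter([3, 0, 1, 4, 0, 2], capacity=2)
```

The generated program would then instruct both circuit sets to compute partial dot products for nodes that use this filter, with the results combined downstream.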
  • Publication number: 20220020741
    Abstract: A microelectronic circuit structure comprises a stack of bonded layers comprising a bottom layer and at least one upper layer. At least one of the upper layers comprises an oxide layer having a back surface and a front surface closer to the bottom layer than the back surface, and a plurality of FD-SOI transistors built on the front surface. At least a first back gate line and a second back gate line extend separate from each other above the back surface for independently providing a first back gate bias to a first group of transistors and a second back gate bias to a second different group of transistors.
    Type: Application
    Filed: August 24, 2021
    Publication date: January 20, 2022
    Inventors: Javier A. Delacruz, David Edward Fisch, Kenneth Duong, Xu Chang, Liang Wang
  • Patent number: 11222257
    Abstract: Some embodiments provide a neural network inference circuit (NNIC) for executing a neural network. The NNIC includes a first circuit that outputs dot products for computation nodes of a first set of neural network layers, that include dot product computations of sets of weight values with sets of input values. The NNIC also includes a second circuit that outputs values for computation nodes of a second set of neural network layers, that apply a set of calculations that do not include dot products to sets of input values. The NNIC also includes a selection circuit that selects a dot product output from the first circuit when a current layer being processed by the NNIC belongs to the first set of layers, and selects a non-dot product output from the second circuit when the current layer belongs to the second set of layers.
    Type: Grant
    Filed: August 21, 2019
    Date of Patent: January 11, 2022
    Assignee: PERCEIVE CORPORATION
    Inventors: Jung Ko, Kenneth Duong, Steven L. Teig
  • Patent number: 11210586
    Abstract: Some embodiments provide a method for a neural network inference circuit that executes a neural network including multiple computation nodes at multiple layers. Each computation node of a set of the computation nodes includes a dot product of input values and weight values. The method reads a set of encoded weight data for a set of weight values from a memory of the neural network inference circuit. The method decodes the encoded weight data to generate decoded weight data for the set of weight values. The method stores the decoded weight data in a buffer. The method uses the decoded weight data to execute a set of computation nodes. Each computation node of the set of computation nodes includes a dot product between the set of weight values and a different set of input values.
    Type: Grant
    Filed: June 28, 2019
    Date of Patent: December 28, 2021
    Assignee: PERCEIVE CORPORATION
    Inventors: Kenneth Duong, Jung Ko, Steven L. Teig
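The decode-once, reuse-many-times pattern in this abstract can be modeled simply: encoded weight data is read from memory, decoded into a buffer, and that buffer then serves every computation node that shares the filter. The 2-bit ternary codebook below is a hypothetical encoding for illustration, not the patent's scheme:

```python
import numpy as np

# Hypothetical codebook: 2-bit codes index into {0, +1, -1} weight values.
CODEBOOK = {0b00: 0.0, 0b01: 1.0, 0b10: -1.0}

def decode_weights(encoded):
    # Decode compact weight codes into a buffer of actual weight values.
    return np.array([CODEBOOK[c] for c in encoded])

def run_nodes(encoded_weights, input_sets):
    # Decode once into a weight buffer, then reuse it for a set of
    # computation nodes, each a dot product of the same weights with a
    # different set of input values.
    buffer = decode_weights(encoded_weights)
    return [float(np.dot(buffer, x)) for x in input_sets]

outs = run_nodes([0b01, 0b10, 0b00, 0b01],
                 [np.array([1.0, 2.0, 3.0, 4.0]),
                  np.array([0.5, 0.5, 0.5, 0.5])])
```

Amortizing the decode cost over many input sets is what makes storing weights in encoded form cheap at execution time.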
  • Patent number: 11205115
    Abstract: Some embodiments provide a neural network inference circuit (NNIC) for implementing a neural network that includes multiple computation nodes at multiple layers. Each of a set of the computation nodes includes a dot product of input values and weight values. The NNIC includes multiple dot product core circuits for computing multiple partial dot products and a set of channel circuits connecting the core circuits. The set of channel circuits includes (i) a dot product bus for aggregating the partial dot products to compute dot products for computation nodes of the neural network, (ii) one or more post-processing circuits for performing additional computation operations on the dot products to compute outputs for the computation nodes, and (iii) an output bus for providing the computed outputs of the computation nodes to the core circuits for the core circuits to use as inputs for subsequent computation nodes.
    Type: Grant
    Filed: December 6, 2018
    Date of Patent: December 21, 2021
    Assignee: PERCEIVE CORPORATION
    Inventors: Kenneth Duong, Jung Ko, Steven L. Teig
  • Patent number: 11176450
    Abstract: Some embodiments provide a three-dimensional (3D) circuit structure that has two or more vertically stacked bonded layers with a machine-trained network on at least one bonded layer. As described above, each bonded layer can be an IC die or an IC wafer in some embodiments with different embodiments encompassing different combinations of wafers and dies for the different bonded layers. The machine-trained network in some embodiments includes several stages of machine-trained processing nodes with routing fabric that supplies the outputs of earlier stage nodes to drive the inputs of later stage nodes. In some embodiments, the machine-trained network is a neural network and the processing nodes are neurons of the neural network. In some embodiments, one or more parameters associated with each processing node (e.g., each neuron) is defined through machine-trained processes that define the values of these parameters in order to allow the machine-trained network (e.g.
    Type: Grant
    Filed: December 31, 2017
    Date of Patent: November 16, 2021
    Assignee: Xcelsis Corporation
    Inventors: Steven L. Teig, Kenneth Duong
  • Patent number: 11170289
    Abstract: Some embodiments provide a neural network inference circuit (NNIC) for executing a neural network that includes multiple computation nodes, that include dot products, at multiple layers. The NNIC includes multiple dot product core circuits and a bus, including one or more aggregation circuits, that connects the core circuits. Each core circuit includes (i) a set of memories for storing multiple input values and multiple weight values and (ii) a set of adder tree circuits for computing dot products of sets of input values and sets of weight values stored in the set of memories. For a particular computation node, at least two of the core circuits compute partial dot products using input values and weight values stored in the memories of the respective core circuits and at least one of the aggregation circuits of the bus combines the partial dot products to compute the dot product for the computation node.
    Type: Grant
    Filed: December 6, 2018
    Date of Patent: November 9, 2021
    Assignee: PERCEIVE CORPORATION
    Inventors: Kenneth Duong, Jung Ko, Steven L. Teig
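Functionally, spreading one node across cores and combining on the bus is a partitioned dot product. A sketch with the adder trees and aggregation circuits abstracted to functions (the core count is an arbitrary choice here):

```python
import numpy as np

def core_partial(core_weights, core_inputs):
    # One dot product core: an adder tree over the slice of weights and
    # inputs stored in that core's local memories.
    return np.dot(core_weights, core_inputs)

def aggregate(weights, inputs, num_cores=2):
    # Split the node's weights/inputs across cores, compute partial dot
    # products locally, then sum them as the bus aggregation circuit does.
    w_slices = np.array_split(weights, num_cores)
    x_slices = np.array_split(inputs, num_cores)
    partials = [core_partial(w, x) for w, x in zip(w_slices, x_slices)]
    return sum(partials)

total = aggregate(np.array([1.0, 2.0, 3.0, 4.0]),
                  np.array([1.0, 1.0, 1.0, 1.0]))
```

The partition keeps each core's memory traffic local; only the scalar partials travel on the shared bus.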
  • Patent number: 11157670
    Abstract: An integrated circuit and a method for designing an IC where the smallest repeatable block is selected, designed and tested to span across multiple die levels. The block is configured to be timing closed at the block level thereby reducing the overall complexity of the design and avoiding the limiting effects of the constrained EDA tools. The block may subsequently be repeated on multiple die to be stacked in an IC.
    Type: Grant
    Filed: May 22, 2020
    Date of Patent: October 26, 2021
    Assignee: Xcelsis Corporation
    Inventors: Javier A Delacruz, Eric Nequist, Jung Ko, Kenneth Duong
  • Patent number: 11152336
    Abstract: Some embodiments of the invention provide a three-dimensional (3D) circuit that is formed by vertically stacking two or more integrated circuit (IC) dies to at least partially overlap. In this arrangement, several circuit blocks defined on each die (1) overlap with other circuit blocks defined on one or more other dies, and (2) electrically connect to these other circuit blocks through connections that cross one or more bonding layers that bond one or more pairs of dies. In some embodiments, the overlapping, connected circuit block pairs include pairs of computation blocks and pairs of computation and memory blocks. The connections that cross bonding layers to electrically connect circuit blocks on different dies are referred to below as z-axis wiring or connections. This is because these connections traverse completely or mostly in the z-axis of the 3D circuit, with the x-y axes of the 3D circuit defining the planar surface of the IC die substrate or interconnect layers.
    Type: Grant
    Filed: March 27, 2020
    Date of Patent: October 19, 2021
    Assignee: Xcelsis Corporation
    Inventors: Steven L. Teig, Ilyas Mohammed, Kenneth Duong, Javier DeLaCruz
  • Patent number: 11127738
    Abstract: A microelectronic circuit structure comprises a stack of bonded layers comprising a bottom layer and at least one upper layer. At least one of the upper layers comprises an oxide layer having a back surface and a front surface closer to the bottom layer than the back surface, and a plurality of FD-SOI transistors built on the front surface. At least a first back gate line and a second back gate line extend separate from each other above the back surface for independently providing a first back gate bias to a first group of transistors and a second back gate bias to a second different group of transistors.
    Type: Grant
    Filed: February 9, 2018
    Date of Patent: September 21, 2021
    Assignee: Xcelsis Corporation
    Inventors: Javier A. Delacruz, David Edward Fisch, Kenneth Duong, Xu Chang, Liang Wang
  • Publication number: 20210263995
    Abstract: Some embodiments provide an IC for implementing a machine-trained network with multiple layers. The IC includes a set of circuits to compute a dot product of (i) a first number of input values computed by other circuits of the IC and (ii) a set of predefined weight values, several of which are zero, with a weight value for each of the input values. The set of circuits includes (i) a dot product computation circuit to compute the dot product based on a second number of inputs and (ii) for each input value, at least two sets of wires for providing the input value to at least two of the dot product computation circuit inputs. The second number is less than the first number. Each input value with a corresponding weight value that is not equal to zero is provided to a different one of the dot product computation circuit inputs.
    Type: Application
    Filed: May 10, 2021
    Publication date: August 26, 2021
    Inventors: Kenneth Duong, Jung Ko, Steven L. Teig
  • Patent number: 11049013
    Abstract: Some embodiments provide a neural network inference circuit for executing a neural network that includes multiple computation nodes at multiple layers. Each of a set of the computation nodes includes a dot product of input values and weight values. The neural network inference circuit includes (i) a first set of memory units allocated to storing input values during execution of the neural network and (ii) a second set of memory units storing encoded weight value data. The weight value data is encoded such that less than one bit of memory is used per weight value of the neural network.
    Type: Grant
    Filed: June 28, 2019
    Date of Patent: June 29, 2021
    Assignee: PERCEIVE CORPORATION
    Inventors: Kenneth Duong, Jung Ko, Steven L. Teig
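The under-one-bit-per-weight figure is plausible once weights are sparse and drawn from a small value set. As an illustrative calculation (not from the patent), the Shannon entropy of a ternary weight distribution with 90% zeros is well under one bit per weight:

```python
import math

def bits_per_weight(p_zero):
    # Shannon entropy of a ternary weight distribution in which zeros
    # occur with probability p_zero and the +1/-1 values split the
    # remainder evenly. This bounds the average bits per weight an
    # ideal entropy coder would need.
    p_pos = p_neg = (1 - p_zero) / 2
    return -sum(p * math.log2(p) for p in (p_zero, p_pos, p_neg) if p > 0)

h = bits_per_weight(0.90)   # 90% of weights are zero
```

At 90% sparsity the entropy is roughly 0.57 bits per weight, so an encoding that approaches this bound can store the network in less than one bit of memory per weight.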
  • Patent number: 11003736
    Abstract: Some embodiments provide an IC for implementing a machine-trained network with multiple layers. The IC includes a set of circuits to compute a dot product of (i) a first number of input values computed by other circuits of the IC and (ii) a set of predefined weight values, several of which are zero, with a weight value for each of the input values. The set of circuits includes (i) a dot product computation circuit to compute the dot product based on a second number of inputs and (ii) for each input value, at least two sets of wires for providing the input value to at least two of the dot product computation circuit inputs. The second number is less than the first number. Each input value with a corresponding weight value that is not equal to zero is provided to a different one of the dot product computation circuit inputs.
    Type: Grant
    Filed: July 9, 2020
    Date of Patent: May 11, 2021
    Assignee: PERCEIVE CORPORATION
    Inventors: Kenneth Duong, Jung Ko, Steven L. Teig
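Wiring each input value to at least two circuit inputs gives the mapper freedom to steer every nonzero-weight value onto a distinct input of the smaller dot product circuit. A greedy software sketch of that routing (the wiring map and the greedy strategy are made-up illustrations, not the patented mapping):

```python
def route_inputs(weights, wiring):
    # Each input value i can physically reach the circuit inputs listed
    # in wiring[i] (at least two options per value). Only values with
    # nonzero weights need a slot, and each must get a distinct one.
    used, assignment = set(), {}
    for i, w in enumerate(weights):
        if w == 0:
            continue
        for slot in wiring[i]:
            if slot not in used:
                used.add(slot)
                assignment[i] = slot
                break
        else:
            raise ValueError(f"no free circuit input for value {i}")
    return assignment

# Four input values feeding a dot product circuit with only two inputs;
# two of the four weights are nonzero, so both values can be routed.
wiring = {0: [0, 1], 1: [1, 0], 2: [0, 1], 3: [1, 0]}
assignment = route_inputs([0.5, 0.0, 0.0, -0.25], wiring)
```

Because a large fraction of the weights are predefined zeros, a circuit with fewer inputs than input values still receives every value that actually contributes to the dot product.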
  • Patent number: 10977338
    Abstract: Some embodiments provide a method for executing a portion of a node of a machine-trained network. The method receives (i) multiple input values computed by previous nodes of the machine-trained network and (ii) for each of the input values, a corresponding predefined weight value. Each of the weight values is zero, a positive value, or a negation of the positive value. To compute a dot product of the input values with the weight values, the method passes to an adder circuit the input value for each input value with a corresponding positive weight value, the value zero for each input value with a corresponding weight value of zero, and a binary inversion of the input value for each input value with a corresponding negative weight value. After the adder circuit adds the values passed to it, the method adds an additional value based on the number of negative weight values.
    Type: Grant
    Filed: September 3, 2018
    Date of Patent: April 13, 2021
    Assignee: PERCEIVE CORPORATION
    Inventors: Kenneth Duong, Jung Ko, Steven L. Teig
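The bit-inversion trick in this abstract rests on the two's-complement identity -x == ~x + 1: passing the binary inversion of each negative-weight input and then adding one per negative weight after the adder completes each negation. A sketch with weights restricted to {-1, 0, +1} and an illustrative 16-bit datapath width:

```python
MASK = 0xFFFF  # illustrative 16-bit datapath width

def ternary_dot(inputs, weights):
    # Dot product with ternary weights. Positive-weight inputs pass
    # through; zero weights contribute zero; negative-weight inputs are
    # bit-inverted (~x == -x - 1 in two's complement). Adding the count
    # of negative weights after the adder tree supplies the missing +1
    # for each inversion, completing the negations.
    total = 0
    negatives = 0
    for x, w in zip(inputs, weights):
        if w > 0:
            total += x
        elif w < 0:
            total += (~x) & MASK
            negatives += 1
    total += negatives
    return total & MASK

# 5*(+1) + 3*(-1) + 7*0 = 2
result = ternary_dot([5, 3, 7], [1, -1, 0])
```

This lets the hardware avoid true negation (inversion plus carry) on every input: the inversions are free wiring, and all the +1 corrections collapse into a single constant added once per node.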