Patents by Inventor Evangelos Stavros Eleftheriou
Evangelos Stavros Eleftheriou has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11907380
Abstract: In an approach, a process stores a matrix of multibit values for a computation in an analog multiply-accumulate unit including at least one crossbar array of binary analog memory cells connected between respective pairs of word- and bit-lines of the array, where: bits of each multibit value are stored in cells connected along a word-line, and corresponding bits of values in a column of the matrix are stored in cells connected along a bit-line. In each of one or more computation stages for a cryptographic element, the process supplies a set of polynomial coefficients of an element bitwise to respective word-lines of the unit to obtain analog accumulation signals on the respective bit-lines. The process converts the analog signals to digital. The process processes the digital signals obtained from successive bits of the polynomial coefficients in each of the stages to obtain a computation result for the cryptographic element.
Type: Grant
Filed: May 17, 2021
Date of Patent: February 20, 2024
Assignee: International Business Machines Corporation
Inventors: Nandakumar Sasidharan Rajalekshmi, Flavio A. Bergamaschi, Evangelos Stavros Eleftheriou
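The bit-serial multiply-accumulate scheme this abstract describes can be sketched in software. The model below is an illustrative reconstruction, not the patented circuit: weights are decomposed into bit planes (one plane per column of binary cells), input coefficients are applied one bit per computation stage, and the per-bit-line accumulations (ADC readings in hardware) are recombined by digital shift-and-add.

```python
import numpy as np

def bit_sliced_mvm(matrix, coeffs, nbits_w, nbits_x):
    """Simulate a bit-sliced analog multiply-accumulate unit.

    Each multibit matrix value occupies nbits_w binary cells along a
    word-line; coefficients are supplied bitwise over nbits_x stages,
    and the bit-line accumulations are recombined digitally.
    """
    # Decompose weights into bit planes: W = sum_j 2^j * W_j
    w_planes = [(matrix >> j) & 1 for j in range(nbits_w)]
    result = np.zeros(matrix.shape[1], dtype=np.int64)
    for i in range(nbits_x):                # one computation stage per input bit
        x_bits = (coeffs >> i) & 1          # bit i of every coefficient
        for j, wp in enumerate(w_planes):
            # Analog accumulation on a bit-line = binary dot product
            acc = x_bits @ wp               # would be an ADC reading in hardware
            result += acc << (i + j)        # digital shift-and-add
    return result

W = np.array([[3, 1], [2, 5]])
x = np.array([4, 7])
assert np.array_equal(bit_sliced_mvm(W, x, 3, 3), x @ W)
```

Because each stage only needs binary inputs and binary cells, the analog array never has to represent multibit values directly; precision comes from the digital recombination.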
-
Patent number: 11803737
Abstract: The present disclosure relates to a neural network system comprising: a controller including a processing unit configured to execute a spiking neural network, and an interface connecting the controller to an external memory. The controller is configured for executing the spiking neural network, the executing comprising generating read instructions and/or write instructions. The interface is configured for: generating read weighting vectors according to the read instructions, coupling read signals, representing the read weighting vectors, into input lines of the memory, thereby retrieving data from the memory, generating write weighting vectors according to the write instructions, coupling write signals, representing the write weighting vectors, into output lines of the memory, thereby writing data into the memory.
Type: Grant
Filed: July 2, 2020
Date of Patent: October 31, 2023
Assignee: International Business Machines Corporation
Inventors: Thomas Bohnstingl, Angeliki Pantazi, Stanislaw Andrzej Wozniak, Evangelos Stavros Eleftheriou
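The weighting-vector read/write interface can be illustrated with a simple memory-matrix model. This is an assumed formulation (dense memory, linear addressing) in the spirit of attention-based memory interfaces, not the patented hardware: a read weighting vector forms a weighted combination of memory rows, and a write weighting vector distributes a data vector across rows.

```python
import numpy as np

def read(memory, w_read):
    """Couple read signals (a weighting vector) into the memory's lines."""
    return w_read @ memory            # weighted combination of memory rows

def write(memory, w_write, data):
    """Couple write signals into the memory, adding data at weighted rows."""
    memory += np.outer(w_write, data)

M = np.zeros((4, 3))                  # toy external memory: 4 rows of width 3
w = np.array([0.0, 1.0, 0.0, 0.0])    # one-hot write weighting vector
write(M, w, np.array([1.0, 2.0, 3.0]))
assert np.allclose(read(M, w), [1.0, 2.0, 3.0])
```

With soft (non-one-hot) weighting vectors the same two operations blend reads and writes across several rows, which is what makes the interface differentiable end to end.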
-
Patent number: 11714999
Abstract: Neuromorphic methods, systems and devices are provided. The embodiment may include a neuromorphic device which may comprise a crossbar array structure and an analog circuit. The crossbar array structure may include N input lines and M output lines interconnected at junctions via N×M electronic devices, each of which, in preferred embodiments, includes a memristive device. The input lines may comprise N1 first input lines and N2 second input lines. The first input lines may be connected to the M output lines via N1×M first devices of said electronic devices. Similarly, the second input lines may be connected to the M output lines via N2×M second devices of said electronic devices. The analog circuit may be configured to program the electronic devices so that the first devices store synaptic weights and the second devices store neuronal states.
Type: Grant
Filed: November 15, 2019
Date of Patent: August 1, 2023
Assignee: International Business Machines Corporation
Inventors: Thomas Bohnstingl, Angeliki Pantazi, Evangelos Stavros Eleftheriou
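The partitioning of input lines described here can be modeled in a few lines. This is a toy linear-readout sketch under assumed dimensions (N1 = 3, N2 = 2, M = 4), not the patented device: one block of devices holds synaptic weights, the other holds neuronal states, and both drive the shared output lines.

```python
import numpy as np

N1, N2, M = 3, 2, 4                   # assumed line counts for illustration
W_syn = np.full((N1, M), 0.5)         # first devices: synaptic weights
W_state = np.zeros((N2, M))           # second devices: neuronal states

def output(x_syn, x_state):
    """Currents on the M output lines from both groups of input lines."""
    return x_syn @ W_syn + x_state @ W_state

y = output(np.array([1.0, 2.0, 3.0]), np.array([1.0, 1.0]))
assert np.allclose(y, 3.0)            # (1+2+3) * 0.5 on every output line
```

Keeping neuronal states in a second device group lets the same crossbar read-out accumulate both synaptic and state contributions in one analog step.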
-
Patent number: 11604976
Abstract: In a hardware-implemented approach for operating a neural network system, a neural network system is provided comprising a controller, a memory, and an interface connecting the controller to the memory, where the controller comprises a processing unit configured to execute a neural network and the memory comprises a neuromorphic memory device with a crossbar array structure that includes input lines and output lines interconnected at junctions via electronic devices. The electronic devices of the neuromorphic memory device are programmed to incrementally change states by coupling write signals into the input lines based on: write instructions received from the controller and write vectors generated by the interface. Data is retrieved from the neuromorphic memory device, according to a multiply-accumulate operation, by coupling read signals into one or more of the input lines of the neuromorphic memory device based on: read instructions from the controller and read vectors generated by the interface.
Type: Grant
Filed: April 29, 2020
Date of Patent: March 14, 2023
Assignee: International Business Machines Corporation
Inventors: Thomas Bohnstingl, Angeliki Pantazi, Stanislaw Andrzej Wozniak, Evangelos Stavros Eleftheriou
-
Patent number: 11531898
Abstract: Methods and apparatus are provided for training an artificial neural network having a succession of neuron layers with interposed synaptic layers each having a respective set of N-bit fixed-point weights {w} for weighting signals propagated between its adjacent neuron layers, via an iterative cycle of signal propagation and weight-update calculation operations. Such a method includes, for each synaptic layer, storing a plurality p of the least-significant bits of each N-bit weight w in digital memory, and storing the next n-bit portion of each weight w in an analog multiply-accumulate unit comprising an array of digital memory elements. Each digital memory element comprises n binary memory cells for storing respective bits of the n-bit portion of a weight, where n ≥ 1 and (p + n + m) = N, where m ≥ 0 corresponds to a defined number of most-significant zero bits in weights of the synaptic layer.
Type: Grant
Filed: May 16, 2019
Date of Patent: December 20, 2022
Assignee: International Business Machines Corporation
Inventors: Manuel Le Gallo-Bourdeau, Riduan Khaddam-Aljameh, Lukas Kull, Pier Andrea Francese, Thomas H. Toifl, Abu Sebastian, Evangelos Stavros Eleftheriou
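The (p, n, m) bit partition of each N-bit weight can be made concrete with a small sketch. This is an illustrative bit-manipulation model only, under the constraint from the abstract that p + n + m = N with the top m bits zero:

```python
def split_weight(w, p, n, m, N):
    """Split an N-bit fixed-point weight: p LSBs go to digital memory,
    the next n bits map to n binary cells of the analog unit, and the
    m most-significant bits are zero by assumption (p + n + m == N)."""
    assert p + n + m == N and n >= 1 and m >= 0
    assert w < (1 << (p + n)), "top m bits must be zero"
    lsb_digital = w & ((1 << p) - 1)                       # digital memory part
    analog_bits = [(w >> (p + k)) & 1 for k in range(n)]   # n binary cells
    return lsb_digital, analog_bits

def recombine(lsb_digital, analog_bits, p):
    """Reassemble the full weight from its two storage locations."""
    w = lsb_digital
    for k, b in enumerate(analog_bits):
        w |= b << (p + k)
    return w

w = 0b0101101                        # 7-bit weight with m = 1 leading zero
lo, hi = split_weight(w, p=3, n=3, m=1, N=7)
assert recombine(lo, hi, p=3) == w
```

The point of the split is that only the n most significant nonzero bits need to sit in the (fast but imprecise) analog unit, while the p low-order bits retain full digital precision for gradient accumulation.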
-
Publication number: 20220366059
Abstract: In an approach, a process stores a matrix of multibit values for a computation in an analog multiply-accumulate unit including at least one crossbar array of binary analog memory cells connected between respective pairs of word- and bit-lines of the array, where: bits of each multibit value are stored in cells connected along a word-line, and corresponding bits of values in a column of the matrix are stored in cells connected along a bit-line. In each of one or more computation stages for a cryptographic element, the process supplies a set of polynomial coefficients of an element bitwise to respective word-lines of the unit to obtain analog accumulation signals on the respective bit-lines. The process converts the analog signals to digital. The process processes the digital signals obtained from successive bits of the polynomial coefficients in each of the stages to obtain a computation result for the cryptographic element.
Type: Application
Filed: May 17, 2021
Publication date: November 17, 2022
Inventors: Nandakumar Sasidharan Rajalekshmi, Flavio A. Bergamaschi, Evangelos Stavros Eleftheriou
-
Publication number: 20220350514
Abstract: A memory controller circuit for mapping data of a convolutional neural network to a physical memory is disclosed. The memory controller circuit comprises a receiving unit to receive a selection parameter value, and a mapping unit to map pixel values of one layer of the convolutional neural network to memory words of the physical memory according to one of a plurality of mapping schemas, wherein the mapping depends on the received selection parameter value.
Type: Application
Filed: April 28, 2021
Publication date: November 3, 2022
Inventors: Martino Dazzi, Pier Andrea Francese, Abu Sebastian, Evangelos Stavros Eleftheriou
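A parameter-selected mapping unit can be sketched as a function that dispatches on the selection value. The two schemas below (contiguous packing and strided packing) are hypothetical examples chosen for illustration; the publication's actual schemas are not specified in this abstract.

```python
def map_layer(pixels, word_size, schema):
    """Map a layer's flat pixel list to memory words per the selection value."""
    if schema == 0:      # schema 0: row-major packing, consecutive pixels per word
        return [pixels[i:i + word_size]
                for i in range(0, len(pixels), word_size)]
    elif schema == 1:    # schema 1: strided packing, pixel i -> word i % n_words
        n_words = -(-len(pixels) // word_size)   # ceiling division
        return [pixels[w::n_words] for w in range(n_words)]
    raise ValueError("unknown selection parameter value")

px = list(range(8))
assert map_layer(px, 4, 0) == [[0, 1, 2, 3], [4, 5, 6, 7]]
assert map_layer(px, 4, 1) == [[0, 2, 4, 6], [1, 3, 5, 7]]
```

Selecting the schema per layer lets the controller match the memory-word layout to each layer's access pattern (e.g., channel-major versus pixel-major reads).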
-
Patent number: 11430524
Abstract: The present disclosure relates to a storage device comprising a memory element. The memory element may comprise a changeable physical quantity for storing information. The physical quantity may be in a drifted state. The memory element may be configured for setting the physical quantity to an initial state. Furthermore, the memory element may comprise a drift of the physical quantity from the initial state to the drifted state. The initial state of the physical quantity may be computable by means of an initialization function. The initialization function may be dependent on a target state of the physical quantity, and the target state of the physical quantity may be approximately equal to the drifted state of the physical quantity.
Type: Grant
Filed: October 30, 2020
Date of Patent: August 30, 2022
Assignee: International Business Machines Corporation
Inventors: Thomas Bohnstingl, Angeliki Pantazi, Stanislaw Andrzej Wozniak, Evangelos Stavros Eleftheriou
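The idea of an initialization function that pre-compensates drift can be shown with a toy model. The power-law drift below is an assumption for illustration (it is a common model for conductance drift in phase-change memory), not necessarily the model used in the patent:

```python
def drifted(initial, t, nu=0.05, t0=1.0):
    """Toy power-law drift of the stored physical quantity over time t."""
    return initial * (t / t0) ** nu

def initialization(target, t, nu=0.05, t0=1.0):
    """Initialization function: invert the drift model so that the
    drifted state at read time t lands on the target state."""
    return target / (t / t0) ** nu

target, t_read = 10.0, 1000.0
init = initialization(target, t_read)     # write this value...
assert abs(drifted(init, t_read) - target) < 1e-9   # ...read back the target
```

Writing the pre-compensated initial state means no correction is needed at read time, as long as the drift model and read-time estimate hold.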
-
Patent number: 11386319
Abstract: Methods and apparatus are provided for training an artificial neural network, having a succession of neuron layers with interposed synaptic layers each storing a respective set of weights {w} for weighting signals propagated between its adjacent neuron layers, via an iterative cycle of signal propagation and weight-update calculation operations. Such a method includes, for at least one of the synaptic layers, providing a plurality Pl of arrays of memristive devices, each array storing the set of weights of that synaptic layer Sl in respective memristive devices, and, in a signal propagation operation, supplying respective subsets of the signals to be weighted by the synaptic layer Sl in parallel to the Pl arrays. The method also includes, in a weight-update calculation operation, calculating updates to respective weights stored in each of the Pl arrays in dependence on signals propagated by the neuron layers.
Type: Grant
Filed: March 14, 2019
Date of Patent: July 12, 2022
Assignee: International Business Machines Corporation
Inventors: Manuel Le Gallo-Bourdeau, Nandakumar Sasidharan Rajalekshmi, Christophe Piveteau, Irem Boybat Kara, Abu Sebastian, Evangelos Stavros Eleftheriou
-
Publication number: 20220172058
Abstract: Training a neural network that comprises nodes and weighted connections between selected ones of the nodes is described herein. A function of a desired activity and a current activity during training results in a feedback signal used for adjusting weight values of the connections. For a weight-value update cycle, the process determines an importance value for various nodes based on current weight values of the connections, and determines an adjustment of the feedback signal specific to each weight value of the connections by combining a gradient value derived from the feedback signal for a connection with the corresponding element of the adjustment matrix. The updates are applied to the connections during update cycles.
Type: Application
Filed: November 30, 2020
Publication date: June 2, 2022
Inventors: Giorgia Dellaferrera, Stanislaw Andrzej Wozniak, Angeliki Pantazi, Evangelos Stavros Eleftheriou
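An importance-modulated update can be sketched as an element-wise scaling of the feedback gradient. The specific importance measure below (column-wise weight magnitude, normalized) is a hypothetical choice for illustration; the abstract does not specify how importance values are derived.

```python
import numpy as np

W = np.array([[0.2, -1.0],
              [0.4,  0.5]])              # current weight values of the connections

# Importance per target node, derived from current weights (assumed measure)
importance = np.abs(W).sum(axis=0)
# Adjustment matrix: one scaling element per weight value
adjustment = np.tile(importance / importance.max(), (W.shape[0], 1))

grad = np.ones_like(W)                   # gradient derived from the feedback signal
update = grad * adjustment               # gradient combined with adjustment element
W = W - 0.1 * update                     # apply updates during the update cycle
assert adjustment[0, 1] == 1.0           # most important node gets the full update
```

Scaling the feedback gradient per connection lets updates concentrate on connections feeding important nodes while damping the rest.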
-
Patent number: 11348002
Abstract: Methods and apparatus are provided for training an artificial neural network having a succession of layers of neurons interposed with layers of synapses. A set of crossbar arrays of memristive devices, connected between row and column lines, implements the layers of synapses. Each memristive device stores a weight for a synapse interconnecting a respective pair of neurons in successive neuron layers. The training method includes performing forward propagation, backpropagation and weight-update operations of an iterative training scheme by applying input signals, associated with respective neurons, to row or column lines of the set of arrays to obtain output signals on the other of the row or column lines, and storing digital signal values corresponding to the input and output signals. The weight-update operation is performed by calculating digital weight-correction values for respective memristive devices, and applying programming signals to those devices to update the stored weights.
Type: Grant
Filed: June 29, 2018
Date of Patent: May 31, 2022
Assignee: International Business Machines Corporation
Inventors: Irem Boybat Kara, Evangelos Stavros Eleftheriou, Manuel Le Gallo-Bourdeau, Nandakumar Sasidharan Rajalekshmi, Abu Sebastian
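The forward/backward/update cycle on a crossbar can be illustrated with a software stand-in. Here the matrix G plays the role of the memristive conductances; the training target (driving the layer's output to zero) and all dimensions are assumptions made for a self-contained example, not the patented scheme:

```python
import numpy as np

rng = np.random.default_rng(0)
G = rng.normal(0.0, 0.1, (4, 3))      # conductances between row and column lines

def forward(x):                        # signals on row lines -> column-line outputs
    return x @ G

def backward(err):                     # backpropagation through the same array
    return err @ G.T

def weight_update(x, err, lr=0.1):
    """Digital weight-correction values drive the programming step."""
    global G
    G -= lr * np.outer(x, err)

x = np.array([1.0, 0.5, -0.5, 0.25])
y0 = forward(x)
assert backward(y0).shape == (4,)      # error signal mapped back to row lines
for _ in range(50):
    weight_update(x, forward(x))       # drive the layer output toward zero
assert np.linalg.norm(forward(x)) < 0.01 * np.linalg.norm(y0)
```

In hardware, `forward` and `backward` reuse the same array in the two directions, which is what makes the crossbar attractive for training, not just inference.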
-
Publication number: 20220138540
Abstract: The present disclosure relates to an integrated circuit comprising a first neuromorphic neuron apparatus. The first neuromorphic neuron apparatus comprises an input and an accumulation block having a state variable for performing an inference task on the basis of input data comprising a temporal sequence. The first neuromorphic neuron apparatus may be switchable between a first mode and a second mode. The accumulation block may be configured to perform an adjustment of the state variable using a current input signal of the first neuromorphic neuron apparatus and a decay function indicative of a decay behavior of the apparatus. The state variable may be dependent on one or more previously received input signals of the first neuromorphic neuron apparatus.
Type: Application
Filed: October 30, 2020
Publication date: May 5, 2022
Inventors: Angeliki Pantazi, Milos Stanisavljevic, Stanislaw Andrzej Wozniak, Thomas Bohnstingl, Evangelos Stavros Eleftheriou
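The accumulation block's state dynamics can be sketched with a minimal leaky-integrator model. The exponential decay factor below is an assumed form of the decay function, chosen because it is the standard leaky-integrate neuron dynamic; the publication's exact function is not given in this abstract.

```python
def run_neuron(inputs, decay=0.9, state=0.0):
    """State variable: decayed previous state plus the current input signal."""
    states = []
    for x in inputs:
        state = decay * state + x    # adjustment via decay function and input
        states.append(state)
    return states

s = run_neuron([1.0, 0.0, 0.0])
assert abs(s[2] - 0.81) < 1e-12      # a single input decays geometrically
```

The dependence of the state on all previously received inputs is what lets a single neuron integrate evidence over a temporal sequence.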
-
Publication number: 20220139464
Abstract: The present disclosure relates to a storage device comprising a memory element. The memory element may comprise a changeable physical quantity for storing information. The physical quantity may be in a drifted state. The memory element may be configured for setting the physical quantity to an initial state. Furthermore, the memory element may comprise a drift of the physical quantity from the initial state to the drifted state. The initial state of the physical quantity may be computable by means of an initialization function. The initialization function may be dependent on a target state of the physical quantity, and the target state of the physical quantity may be approximately equal to the drifted state of the physical quantity.
Type: Application
Filed: October 30, 2020
Publication date: May 5, 2022
Inventors: Thomas Bohnstingl, Angeliki Pantazi, Stanislaw Andrzej Wozniak, Evangelos Stavros Eleftheriou
-
Publication number: 20220101117
Abstract: A computer-implemented method, system, and computer program product to solve a cognitive task that includes learning abstract properties. One embodiment may comprise accessing datasets that characterize the abstract properties. The accessed datasets may then be inputted into a first neural network to generate first embeddings. Pairs of the first embeddings generated may be formed, which correspond to pairs of the datasets. Data corresponding to the pairs formed may then be inputted into a second neural network, which may be executed to generate second embeddings. The latter may capture relational properties of the pairs of the datasets. A third neural network may be subsequently executed, based on the second embeddings generated, to obtain output values. One or more abstract properties of the datasets are learned based on the output values obtained, in order to solve the cognitive task.
Type: Application
Filed: September 29, 2020
Publication date: March 31, 2022
Inventors: Giovanni Cherubini, Hlynur Freyr Jonsson, Evangelos Stavros Eleftheriou
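The three-stage pipeline (per-dataset embeddings, pairwise relational embeddings, output values) can be shown with toy stand-ins. The three functions below are deliberately simple placeholders for the three neural networks; everything about them is assumed for illustration.

```python
def first_net(dataset):                  # stands in for the first neural network
    """Dataset -> first embedding (toy features: sum and range)."""
    return [sum(dataset), max(dataset) - min(dataset)]

def second_net(emb_a, emb_b):            # stands in for the second neural network
    """Pair of first embeddings -> relational (second) embedding."""
    return [a - b for a, b in zip(emb_a, emb_b)]

def third_net(rel_emb):                  # stands in for the third neural network
    """Second embedding -> output value scoring the abstract relation."""
    return sum(abs(v) for v in rel_emb)

datasets = [[1, 2, 3], [2, 4, 6]]
first = [first_net(d) for d in datasets]             # stage 1
score = third_net(second_net(first[0], first[1]))    # stages 2 and 3
assert score > 0    # the datasets differ, so the relation is non-trivial
```

Separating per-item encoding from pairwise relation encoding is what allows the system to generalize an abstract property across new dataset pairs.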
-
Patent number: 11250107
Abstract: The present disclosure relates to a method for executing a computation task composed of at least one set of operations, where subsets of pipelineable operations of the set of operations are determined in accordance with a pipelining scheme. A single routine may be created for enabling execution of the determined subsets of operations by a hardware accelerator. The routine has, as arguments, a value indicative of input data and values of configuration parameters of the computation task, where a call of the routine causes a scheduling of the subsets of operations on the hardware accelerator in accordance with the values of the configuration parameters. Upon receiving input data of the computation task, the routine may be called to cause the hardware accelerator to perform the computation task in accordance with the scheduling.
Type: Grant
Filed: July 15, 2019
Date of Patent: February 15, 2022
Assignee: International Business Machines Corporation
Inventors: Christophe Piveteau, Nikolas Ioannou, Igor Krawczuk, Manuel Le Gallo-Bourdeau, Abu Sebastian, Evangelos Stavros Eleftheriou
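The single-routine pattern can be sketched as a factory that bakes the pipelined subsets into one callable whose arguments carry the input handle and configuration values. All names and the stage functions here are hypothetical; a real implementation would dispatch stages to the accelerator instead of running them in Python.

```python
def make_routine(subsets):
    """Build a single routine that schedules the pipelined subsets."""
    def routine(input_data, **config):
        result = input_data
        for stage in subsets:            # scheduled per the pipelining scheme
            result = stage(result, config)   # config steers each stage
        return result
    return routine

# Hypothetical pipelineable subsets of a computation task
subsets = [lambda x, c: x * c["scale"],
           lambda x, c: x + c["offset"]]

routine = make_routine(subsets)
assert routine(10, scale=3, offset=4) == 34
```

Exposing one routine with configuration arguments means callers never re-derive the schedule; changing `scale` or `offset` reconfigures the same pipelined execution.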
-
Publication number: 20220027727
Abstract: The invention is notably directed to a computer-implemented method for training parameters of a recurrent neural network. The network comprises one or more layers of neuronal units. Each neuronal unit has an internal state, which may also be denoted as unit state. The method comprises providing training data comprising an input signal and an expected output signal to the recurrent neural network. The method further comprises computing, for each neuronal unit, a spatial gradient component and a temporal gradient component. The method further comprises updating the temporal and the spatial gradient component for each neuronal unit at each time instance of the input signal. The spatial and the temporal gradient components may be computed independently of each other. The invention further concerns a neural network and a related computer program product.
Type: Application
Filed: June 5, 2021
Publication date: January 27, 2022
Inventors: Thomas Bohnstingl, Stanislaw Andrzej Wozniak, Angeliki Pantazi, Evangelos Stavros Eleftheriou
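The spatial/temporal split can be illustrated with a toy leaky unit whose temporal component is an eligibility trace carried forward through the internal state, combined at each time step with the spatial component (the per-step error). The dynamics and constants below are assumptions for illustration, in the spirit of forward-propagating gradient methods.

```python
def gradient(xs, errors, alpha=0.5):
    """Accumulate dL/dw for a unit s_t = alpha*s_{t-1} + w*x_t.

    e is the temporal gradient component (eligibility trace ds_t/dw),
    updated at each time instance; errors[t] is the spatial component
    dL/ds_t; the two are computed independently and combined per step.
    """
    e, g = 0.0, 0.0
    for x, err in zip(xs, errors):
        e = alpha * e + x      # temporal component, forward in time
        g += err * e           # combine with spatial component
    return g

g = gradient([1.0, 1.0], [0.0, 1.0])
assert g == 1.5                # e_2 = 0.5*1.0 + 1.0, weighted by err_2 = 1
```

Because the trace runs forward in time, no backpropagation through the full sequence is needed; each unit only keeps its own running components.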
-
Publication number: 20220004851
Abstract: The present disclosure relates to a neural network system comprising: a controller including a processing unit configured to execute a spiking neural network, and an interface connecting the controller to an external memory. The controller is configured for executing the spiking neural network, the executing comprising generating read instructions and/or write instructions. The interface is configured for: generating read weighting vectors according to the read instructions, coupling read signals, representing the read weighting vectors, into input lines of the memory, thereby retrieving data from the memory, generating write weighting vectors according to the write instructions, coupling write signals, representing the write weighting vectors, into output lines of the memory, thereby writing data into the memory.
Type: Application
Filed: July 2, 2020
Publication date: January 6, 2022
Inventors: Thomas Bohnstingl, Angeliki Pantazi, Stanislaw Andrzej Wozniak, Evangelos Stavros Eleftheriou
-
Patent number: 11188825
Abstract: A computer-implemented method of mixed-precision deep learning with multi-memristive synapses may be provided. The method comprises representing each synapse of an artificial neural network by a combination of a plurality of memristive devices, wherein each of the plurality of memristive devices of each of the synapses contributes to an overall synaptic weight with a related device significance, accumulating a weight gradient ΔW for each synapse in a high-precision variable, and performing a weight update to one of the synapses using an arbitration scheme for selecting a respective memristive device, according to which a threshold value related to the high-precision variable for performing the weight update is set according to the device significance of the respective memristive device selected by the arbitration scheme.
Type: Grant
Filed: January 9, 2019
Date of Patent: November 30, 2021
Assignee: International Business Machines Corporation
Inventors: Irem Boybat Kara, Manuel Le Gallo-Bourdeau, Nandakumar Sasidharan Rajalekshmi, Abu Sebastian, Evangelos Stavros Eleftheriou
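The accumulate-and-arbitrate update can be sketched with a small class. The round-robin arbitration, the specific significances [4, 2, 1], and unit programming pulses are all assumptions made for a runnable illustration; only the overall structure (high-precision accumulator, significance-scaled threshold, per-device updates) comes from the abstract.

```python
class MultiMemristiveSynapse:
    """One synapse built from several memristive devices with significances."""

    def __init__(self, significances):
        self.sig = significances                 # e.g. [4, 2, 1]
        self.dev = [0.0] * len(significances)    # device conductances
        self.chi = 0.0                           # high-precision gradient accumulator
        self.turn = 0                            # round-robin arbitration pointer

    @property
    def weight(self):
        """Overall synaptic weight: significance-weighted device sum."""
        return sum(s * g for s, g in zip(self.sig, self.dev))

    def accumulate(self, grad):
        self.chi += grad
        d = self.turn % len(self.sig)            # device chosen by arbitration
        threshold = self.sig[d]                  # threshold set by its significance
        while abs(self.chi) >= threshold:
            pulse = 1.0 if self.chi > 0 else -1.0
            self.dev[d] += pulse                 # one programming pulse
            self.chi -= pulse * threshold        # drain the accumulator
        self.turn += 1

syn = MultiMemristiveSynapse([4, 2, 1])
for _ in range(8):
    syn.accumulate(1.5)                          # total gradient = 12
assert syn.weight + syn.chi == 12.0              # nothing lost: weight + residual
```

Because each pulse on device d moves the weight by exactly its significance, the accumulator residual plus the realized weight always equals the total accumulated gradient.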
-
Publication number: 20210342672
Abstract: In a hardware-implemented approach for operating a neural network system, a neural network system is provided comprising a controller, a memory, and an interface connecting the controller to the memory, where the controller comprises a processing unit configured to execute a neural network and the memory comprises a neuromorphic memory device with a crossbar array structure that includes input lines and output lines interconnected at junctions via electronic devices. The electronic devices of the neuromorphic memory device are programmed to incrementally change states by coupling write signals into the input lines based on: write instructions received from the controller and write vectors generated by the interface. Data is retrieved from the neuromorphic memory device, according to a multiply-accumulate operation, by coupling read signals into one or more of the input lines of the neuromorphic memory device based on: read instructions from the controller and read vectors generated by the interface.
Type: Application
Filed: April 29, 2020
Publication date: November 4, 2021
Inventors: Thomas Bohnstingl, Angeliki Pantazi, Stanislaw Andrzej Wozniak, Evangelos Stavros Eleftheriou
-
Publication number: 20210150327
Abstract: Neuromorphic methods, systems and devices are provided. The embodiment may include a neuromorphic device which may comprise a crossbar array structure and an analog circuit. The crossbar array structure may include N input lines and M output lines interconnected at junctions via N×M electronic devices, each of which, in preferred embodiments, includes a memristive device. The input lines may comprise N1 first input lines and N2 second input lines. The first input lines may be connected to the M output lines via N1×M first devices of said electronic devices. Similarly, the second input lines may be connected to the M output lines via N2×M second devices of said electronic devices. The analog circuit may be configured to program the electronic devices so that the first devices store synaptic weights and the second devices store neuronal states.
Type: Application
Filed: November 15, 2019
Publication date: May 20, 2021
Inventors: Thomas Bohnstingl, Angeliki Pantazi, Evangelos Stavros Eleftheriou