Patents by Inventor Hamed Rezazadegan Tavakoli

Hamed Rezazadegan Tavakoli has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240146938
    Abstract: Various embodiments provide an apparatus, a method and a computer program product for end-to-end learned predictive coding of media frames. An example apparatus includes at least one processor; and at least one non-transitory memory including computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform: encode or decode one or more media frames for at least one neural network; wherein an inter-frame codec is applied to at least one media frame of the one or more media frames; and wherein a first decoded reference frame and a second decoded reference frame refer to reference frames for the at least one media frame.
    Type: Application
    Filed: March 9, 2022
    Publication date: May 2, 2024
    Inventors: Nannan ZOU, Honglei ZHANG, Francesco CRICRÌ, Hamed REZAZADEGAN TAVAKOLI, Ramin GHAZNAVI YOUVALARI
  • Publication number: 20240022787
    Abstract: A method is provided for defining a metadata box of a neural network representation (NNR) item data, wherein the NNR item data comprises an NNR bitstream; and defining an association between the NNR item data and an NNR configuration by using a configuration item property, wherein the NNR configuration item property comprises information about stored NNR item data. Corresponding apparatuses and computer program products are also provided.
    Type: Application
    Filed: October 5, 2021
    Publication date: January 18, 2024
    Inventors: Emre AKSU, Miska HANNUKSELA, Francesco CRICRÌ, Hamed REZAZADEGAN TAVAKOLI
  • Publication number: 20240013046
    Abstract: A method is provided for computing predetermined loss terms based on original data and decoded data; training one or more neural networks of a system by using the predetermined loss terms; updating weights for one or more of other loss terms; and determining trade-offs between predetermined objectives of the system. Corresponding apparatuses and computer program products are also provided.
    Type: Application
    Filed: September 2, 2021
    Publication date: January 11, 2024
    Inventors: Nam LE, Francesco CRICRÌ, Honglei ZHANG, Hamed REZAZADEGAN TAVAKOLI, Ramin GHAZNAVI YOUVALARI
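A minimal Python sketch (not taken from the patent) of the kind of procedure publication 20240013046 describes: predetermined loss terms are computed from original and decoded data, a model is trained on their weighted sum, and the weight of another loss term is updated to steer the trade-off between objectives. The toy linear "codec", the two loss terms, and the weight-update heuristic are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
original = rng.normal(size=(64, 8))          # toy "original data"
W = rng.normal(scale=0.1, size=(8, 8))       # toy learnable "codec" (linear map)

loss_weights = {"distortion": 1.0, "rate_proxy": 0.1}   # assumed trade-off weights

def decode(x, W):
    return x @ W                              # stand-in for encode/decode

def loss_terms(x, x_hat, W):
    return {
        "distortion": np.mean((x - x_hat) ** 2),   # fidelity objective
        "rate_proxy": np.mean(np.abs(W)),          # stand-in for a rate objective
    }

lr = 0.05
for step in range(200):
    x_hat = decode(original, W)
    terms = loss_terms(original, x_hat, W)
    total = sum(loss_weights[k] * v for k, v in terms.items())

    # Gradient of the weighted loss w.r.t. W (derived analytically for this toy model).
    grad = (loss_weights["distortion"] * 2.0 / original.shape[0]
            * original.T @ (x_hat - original) / original.shape[1]
            + loss_weights["rate_proxy"] * np.sign(W) / W.size)
    W -= lr * grad

    # Assumed heuristic: periodically nudge the weight of the secondary loss term
    # to shift the trade-off between the two objectives.
    if step % 50 == 49:
        loss_weights["rate_proxy"] *= 0.9

print({k: round(v, 4) for k, v in terms.items()}, "total:", round(total, 4))
```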
  • Publication number: 20230325644
    Abstract: An apparatus comprising: at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the apparatus at least to: maintain a first parameter update tree that tracks residuals of weight updates of a machine learning model; maintain a second parameter update tree that tracks the weight updates of the machine learning model; pass the first parameter update tree and the residuals to an encoder; receive a first bitstream generated for the residuals from the encoder; pass the second parameter update tree and the weight updates to the encoder; receive a second bitstream generated for the weight updates from the encoder; and determine whether to signal to a decoder the first bitstream generated for the residuals or the second bitstream generated for the weight updates.
    Type: Application
    Filed: April 11, 2023
    Publication date: October 12, 2023
    Inventors: Homayun AFRABANDPEY, Hamed REZAZADEGAN TAVAKOLI, Francesco Cricri, Honglei Zhang, Goutham Rangu, Emre Baris Aksu
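A hedged sketch of the bitstream-selection idea in publication 20230325644: two histories are kept, one tracking residuals of weight updates and one tracking the weight updates themselves, both are passed through an encoder, and whichever bitstream is cheaper is signalled to the decoder. The zlib stand-in encoder, the simplified list-based "trees", and the smaller-is-signalled rule are assumptions for illustration.

```python
import zlib
import numpy as np

rng = np.random.default_rng(1)

def encode(arr: np.ndarray) -> bytes:
    """Stand-in encoder: quantize and deflate (not the codec of the patent)."""
    q = np.round(arr * 128).astype(np.int8)
    return zlib.compress(q.tobytes())

# Simplified "parameter update trees": here just ordered histories of what each tracks.
residual_tree, update_tree = [], []

predicted_update = np.zeros(1000)             # assumed predictor: previous update
for round_idx in range(5):
    weight_update = predicted_update + rng.normal(scale=0.01, size=1000)
    residual = weight_update - predicted_update

    residual_tree.append(residual)
    update_tree.append(weight_update)

    bitstream_residual = encode(residual)     # first bitstream, for the residuals
    bitstream_update = encode(weight_update)  # second bitstream, for the weight updates

    # Decide which bitstream to signal to the decoder: the cheaper of the two.
    use_residual = len(bitstream_residual) <= len(bitstream_update)
    print(f"round {round_idx}: residual {len(bitstream_residual)} B, "
          f"full {len(bitstream_update)} B -> signal "
          f"{'residual' if use_residual else 'full update'}")

    predicted_update = weight_update          # predictor for the next round
```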
  • Publication number: 20230269387
    Abstract: In example embodiments, an apparatus, a method, and a computer program product are provided. An example apparatus includes processing circuitry; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the processing circuitry, cause the apparatus at least to: overfit a neural network on each media item, from a batch of media items, for a number of iterations to obtain an overfitted neural network model for each media item; evaluate the overfitted neural network model on each media item to obtain evaluation errors; and update parameters of the neural network based on the evaluation errors.
    Type: Application
    Filed: June 11, 2021
    Publication date: August 24, 2023
    Inventors: Francesco CRICRÌ, Hamed REZAZADEGAN TAVAKOLI, Honglei ZHANG, Nannan ZOU
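A small sketch of the training loop outlined in publication 20230269387: for each media item in a batch, a copy of the shared model is overfitted for a fixed number of iterations, the overfitted model is evaluated on that item, and the shared parameters are then updated based on the evaluation errors. The toy reconstruction objective and the specific meta-update rule (moving toward the per-item overfitted parameters, weighted by evaluation error) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
batch = [rng.normal(size=(16, 4)) for _ in range(8)]   # toy "media items"

theta = np.zeros((4, 4))                               # shared model parameters
inner_lr, meta_lr, inner_steps = 0.1, 0.5, 5

def loss_and_grad(theta, item):
    """Toy reconstruction objective: item @ theta should reproduce item."""
    pred = item @ theta
    err = pred - item
    return np.mean(err ** 2), 2.0 * item.T @ err / err.size

for epoch in range(20):
    eval_errors, overfitted = [], []
    for item in batch:
        phi = theta.copy()
        for _ in range(inner_steps):                   # overfit on this single item
            _, g = loss_and_grad(phi, item)
            phi -= inner_lr * g
        eval_err, _ = loss_and_grad(phi, item)         # evaluate the overfitted model
        eval_errors.append(eval_err)
        overfitted.append(phi)

    # Assumed meta-update: move the shared parameters toward the per-item overfitted
    # parameters, weighted by how well each overfitted model evaluated.
    weights = 1.0 / (np.array(eval_errors) + 1e-8)
    weights /= weights.sum()
    target = sum(w * phi for w, phi in zip(weights, overfitted))
    theta += meta_lr * (target - theta)

print("mean evaluation error:", float(np.mean(eval_errors)))
```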
  • Publication number: 20230232015
    Abstract: An apparatus comprising: at least one processor; and at least one non-transitory memory storing instructions that, when executed by the at least one processor, cause the apparatus at least to: receive a signal, the signal comprising a sparse signal; perform residual coding on the signal; perform predictive coding on the signal; determine a residual, the residual comprising a residual of the signal and a base signal or a residual of an approximation and the base signal, the approximation being an approximation of the signal; and determine whether to transmit the residual or the signal over a communication channel.
    Type: Application
    Filed: January 17, 2023
    Publication date: July 20, 2023
    Inventors: Homayun Afrabandpey, Hamed Rezazadegan Tavakoli, Francesco Cricrì, Honglei Zhang, Goutham Rangu
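A sketch, under stated assumptions, of the decision described in publication 20230232015: a residual is formed between a sparse signal and a base signal (or between an approximation of the signal and the base signal), and the apparatus decides whether to transmit the residual or the signal itself. The zlib-based stand-in codec and the "cheapest candidate wins" rule are illustrative assumptions.

```python
import zlib
import numpy as np

rng = np.random.default_rng(3)

def compress(arr: np.ndarray) -> bytes:
    return zlib.compress(np.round(arr * 256).astype(np.int16).tobytes())

base_signal = rng.normal(size=2048)                    # assumed to be known at both ends
signal = base_signal.copy()
signal[rng.choice(2048, size=20, replace=False)] += 1  # sparse deviation from the base

approximation = np.round(signal * 4) / 4               # assumed coarse approximation of the signal

residual_of_signal = signal - base_signal              # residual of the signal and the base signal
residual_of_approx = approximation - base_signal       # residual of the approximation and the base signal

candidates = {
    "signal": compress(signal),
    "residual(signal, base)": compress(residual_of_signal),
    "residual(approx, base)": compress(residual_of_approx),
}
choice = min(candidates, key=lambda k: len(candidates[k]))   # assumed rule: cheapest wins
for name, bits in candidates.items():
    print(f"{name}: {len(bits)} bytes")
print("transmit:", choice)
```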
  • Publication number: 20230217028
    Abstract: In example embodiments, an apparatus, a method, and a computer program product are provided. The apparatus comprises at least one processor; and at least one non-transitory memory including computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform: determine a processing order of building blocks to encode or decode a media item; and determine a number of processing steps required to encode or decode the media item; wherein the processing order of building blocks and the number of processing steps are determined based on a content of the media item by using a guided probability model based on a neural network.
    Type: Application
    Filed: June 3, 2021
    Publication date: July 6, 2023
    Inventors: Honglei ZHANG, Francesco CRICRÌ, Emre Baris AKSU, Hamed REZAZADEGAN TAVAKOLI
  • Publication number: 20230209092
    Abstract: In example embodiments, an apparatus, a method, and a computer program product are provided. The apparatus includes at least one processor; and at least one non-transitory memory including computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform: encode or decode a high-level bitstream syntax for at least one neural network; wherein the high-level bitstream syntax comprises at least one information unit, wherein the at least one information unit comprises syntax definitions for the at least one neural network or a portion of the at least one neural network; and wherein a serialized bitstream comprises one or more of the at least one information units.
    Type: Application
    Filed: April 13, 2021
    Publication date: June 29, 2023
    Inventors: Francesco CRICRÌ, Miska Matias HANNUKSELA, Emre Baris AKSU, Hamed REZAZADEGAN TAVAKOLI
  • Publication number: 20230196072
    Abstract: Various embodiments provide an apparatus, a method, and a computer program product. An example apparatus includes at least one processor; and at least one non-transitory memory comprising computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform: determine a subset of parameters to overfit from a set of candidate parameters of a decoder side neural network to be overfitted (OPs); wherein the subset of parameters to overfit is smaller than the set of candidate parameters to be overfitted; and overfit the determined subset of parameters.
    Type: Application
    Filed: December 16, 2021
    Publication date: June 22, 2023
    Inventors: Nannan ZOU, Francesco CRICRÌ, Honglei ZHANG, Hamed REZAZADEGAN TAVAKOLI, Jani LAINEMA, Miska Matias HANNUKSELA
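A minimal sketch of the idea in publication 20230196072: only a subset of the candidate parameters of a decoder side neural network is overfitted, the subset being smaller than the full candidate set. The gradient-magnitude selection rule, the toy model, and the 10% budget are assumptions introduced for illustration, not the selection criterion claimed by the patent.

```python
import numpy as np

rng = np.random.default_rng(4)
content = rng.normal(size=(32, 6))

theta = rng.normal(scale=0.1, size=(6, 6))   # candidate parameters (OPs) of a toy decoder-side model

def grad(theta, x):
    err = x @ theta - x                      # toy reconstruction objective
    return 2.0 * x.T @ err / err.size

# Assumed selection rule: pick the 10% of candidate parameters with the largest
# gradient magnitude on this content; only those are overfitted (and would be signalled).
g = grad(theta, content)
k = max(1, theta.size // 10)
idx = np.unravel_index(np.argsort(np.abs(g), axis=None)[-k:], theta.shape)
mask = np.zeros_like(theta, dtype=bool)
mask[idx] = True

for _ in range(100):                         # overfit only the selected subset
    g = grad(theta, content)
    theta[mask] -= 0.1 * g[mask]

print(f"overfitted {mask.sum()} of {theta.size} candidate parameters")
```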
  • Publication number: 20230186054
    Abstract: Various embodiments provide an apparatus, a method, and a computer program product. The apparatus includes at least one processor; and at least one non-transitory memory comprising computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform: organize a plurality of decoder side neural networks based on one or more task categories or one or more tasks; and select a decoder side neural network based at least on the one or more task categories or the one or more tasks.
    Type: Application
    Filed: December 15, 2021
    Publication date: June 15, 2023
    Inventors: Francesco Cricrì, Honglei Zhang, Miska Matias Hannuksela, Hamed Rezazadegan Tavakoli, Nam Hai Le, Ramin Ghaznavi Youvalari, Jukka Ilari Ahonen, Emre Baris Aksu
  • Publication number: 20230169372
    Abstract: Various embodiments provide an apparatus, a method, and a computer program product. An apparatus includes at least one processor; and at least one non-transitory memory including computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to: perform an overfitting operation, at an encoder side, to obtain an overfitted probability model, wherein overfitting comprises one or more training operations applied to a probability model, wherein one or more parameters of the probability model are trained; use the overfitted probability model to provide probability estimates to a lossless codec or a substantially lossless codec for encoding data or a portion of the data; and signal information to a decoder on whether to perform the overfitting operation at the decoder side.
    Type: Application
    Filed: December 1, 2021
    Publication date: June 1, 2023
    Inventors: Nannan ZOU, Francesco CRICRÌ, Honglei ZHANG, Hamed REZAZADEGAN TAVAKOLI, Jani LAINEMA, Miska Matias HANNUKSELA
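A sketch of the encoder-side overfitting described in publication 20230169372, assuming a simple categorical symbol model: the probability model is fitted (overfitted) to the data being coded, the resulting estimates determine the ideal code length an entropy coder would need, and a flag indicates whether the decoder side should perform the same overfitting. The counting-based model and the bit-count comparison are stand-ins for the patent's probability model and codec.

```python
import numpy as np

rng = np.random.default_rng(5)
data = rng.choice(4, size=5000, p=[0.7, 0.2, 0.08, 0.02])   # symbols to code losslessly

generic_model = np.full(4, 0.25)                 # probability model before overfitting

# Overfitting operation at the encoder side: fit the model's parameters to this data.
counts = np.bincount(data, minlength=4)
overfitted_model = (counts + 1) / (counts.sum() + 4)         # Laplace-smoothed estimates

def expected_bits(probs, data):
    """Ideal code length an entropy coder would need with these probability estimates."""
    return float(-np.sum(np.log2(probs[data])))

bits_generic = expected_bits(generic_model, data)
bits_overfitted = expected_bits(overfitted_model, data)
print(f"generic model:    {bits_generic:.0f} bits")
print(f"overfitted model: {bits_overfitted:.0f} bits")

# Signal to the decoder whether the overfitting operation should also be performed
# at the decoder side (how the decoder obtains or rebuilds the model is outside this sketch).
perform_overfitting_at_decoder = bits_overfitted < bits_generic
print("signal decoder-side overfitting:", perform_overfitting_at_decoder)
```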
  • Publication number: 20230154054
    Abstract: Various embodiments provide an apparatus, a method, and a computer program product. An example apparatus includes at least one processor; and at least one non-transitory memory comprising computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to iteratively perform the following until a stopping criterion is met: provide a finetuning driving content (FDC) or a content derived from the FDC to a decoder side neural network (DSNN); compute an output of the DSNN as a processed FDC; compute a loss based on the processed FDC and approximated ground truth data (AGT) associated with the FDC; compute an update to the DSNN; and apply the computed update to the DSNN.
    Type: Application
    Filed: November 7, 2022
    Publication date: May 18, 2023
    Inventors: Francesco CRICRÌ, Honglei ZHANG, Miska Matias HANNUKSELA, Hamed REZAZADEGAN TAVAKOLI, Nam Hai LE, Ramin GHAZNAVI YOUVALARI, Jukka AHONEN, Emre Baris AKSU
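A compact sketch of the iterative loop in publication 20230154054: finetuning driving content (FDC) is fed to a decoder side neural network (DSNN), the processed FDC is compared against approximated ground truth data (AGT), an update is computed and applied, and the loop stops when a criterion is met. The toy linear DSNN, the MSE loss, and the loss-threshold stopping criterion are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

fdc = rng.normal(size=(32, 8))                    # finetuning driving content (FDC)
agt = fdc * 1.5 + 0.2                             # approximated ground truth (AGT) for the FDC

dsnn = np.eye(8) * 0.1                            # toy decoder-side model (DSNN)
bias = np.zeros(8)
lr, loss_threshold, max_iters = 0.05, 1e-3, 10_000   # assumed stopping criterion

for it in range(max_iters):
    processed_fdc = fdc @ dsnn + bias             # output of the DSNN on the FDC
    err = processed_fdc - agt
    loss = float(np.mean(err ** 2))               # loss between processed FDC and AGT
    if loss < loss_threshold:                     # stopping criterion met
        break

    # Compute and apply an update to the DSNN (plain gradient descent here).
    grad_w = 2.0 * fdc.T @ err / err.size
    grad_b = 2.0 * err.mean(axis=0) / err.shape[1]
    dsnn -= lr * grad_w
    bias -= lr * grad_b

print(f"stopped after {it + 1} iterations, loss {loss:.2e}")
```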
  • Publication number: 20230112309
    Abstract: A method, an apparatus, and a computer program product are provided. An example method includes defining an enhancement message comprising at least one of the following: an identifying number for identifying a post-processing filter; a mode identity (idc) field used for indicating association of a post-processing filter with the identifying number; a flag for specifying the enhancement message being used for a current layer; and a payload byte comprising a bitstream; and using the enhancement message for at least one of specifying a neural network that is used as a post-processing filter or cancelling a use of a previous post-processing filter with the same identifying number.
    Type: Application
    Filed: September 23, 2022
    Publication date: April 13, 2023
    Inventors: Miska Matias HANNUKSELA, Emre Baris AKSU, Francesco CRICRÌ, Hamed REZAZADEGAN TAVAKOLI
  • Patent number: 11575938
    Abstract: Data may be encoded to minimize distortion after decoding, but the quality required for presentation of the decoded data to a machine and the quality required for presentation to a human may be different. To accommodate different quality requirements, video data may be encoded to produce a first set of encoded data and a second set of encoded data, where the first set may be decoded for use by one of a machine consumer or a human consumer, and a combination of the first set and the second set may be decoded for use by the other of a machine consumer or a human consumer. The first and second set may be produced with a neural encoder and a neural decoder, and/or may be produced with the use of prediction and transform neural network modules. A human-targeted structure and a machine-targeted structure may produce the sets of encoded data.
    Type: Grant
    Filed: December 30, 2020
    Date of Patent: February 7, 2023
    Assignee: Nokia Technologies Oy
    Inventors: Hamed Rezazadegan Tavakoli, Francesco Cricri, Miska Matias Hannuksela, Emre Baris Aksu, Honglei Zhang, Nam Le
  • Patent number: 11558628
    Abstract: An apparatus includes circuitry configured to: partition an input tensor into one or more block tensors; partition at least one of the block tensors into one or more continuation bands, the one or more continuation bands being associated with a caching counter having a value; store the one or more continuation bands in a cache managed using a cache manager; retrieve, prior to a convolution or pooling operation on a current block tensor, the one or more continuation bands of a previous block tensor from the cache that are adjacent to a current block tensor; concatenate the retrieved continuation bands with the current block tensor; apply the convolution or pooling operation on the current block tensor after the concatenation; decrease the respective caching counter value of the retrieved continuation bands; and clear the continuation bands from the cache when its respective caching counter reaches a value of zero.
    Type: Grant
    Filed: December 13, 2021
    Date of Patent: January 17, 2023
    Assignee: Nokia Technologies Oy
    Inventors: Honglei Zhang, Francesco Cricri, Hamed Rezazadegan Tavakoli, Jani Lainema, Emre Aksu, Nannan Zou
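A 1-D sketch of the continuation-band caching in patent 11558628: the input is partitioned into blocks, the overlap each block needs from its predecessor is kept in a cache with a caching counter, the band is retrieved and concatenated before the convolution on the next block, the counter is decremented, and the band is cleared when the counter reaches zero. The 1-D signal and the single-use counter value are simplifying assumptions; the block-wise result matches the convolution over the unpartitioned input.

```python
import numpy as np

rng = np.random.default_rng(7)
signal = rng.normal(size=64)
kernel = np.array([0.25, 0.5, 0.25])
block_size = 16
overlap = len(kernel) - 1                  # samples a block needs from its predecessor

cache = {}                                 # cache manager: consuming block's start index -> (band, caching counter)

outputs = []
for i in range(0, len(signal), block_size):
    block = signal[i:i + block_size]       # current block tensor

    # Retrieve the continuation band of the previous block, if cached, and
    # concatenate it in front of the current block before the convolution.
    if i in cache:
        band, counter = cache.pop(i)
        block = np.concatenate([band, block])
        counter -= 1                       # decrease the band's caching counter
        if counter > 0:                    # keep the band only while its counter is non-zero
            cache[i] = (band, counter)

    outputs.append(np.convolve(block, kernel, mode="valid"))

    # Store this block's continuation band for the next block (counter = 1 use).
    cache[i + block_size] = (block[-overlap:], 1)

blockwise = np.concatenate(outputs)
full = np.convolve(signal, kernel, mode="valid")
print("matches full convolution:", np.allclose(blockwise, full))
```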
  • Publication number: 20220335269
    Abstract: An apparatus includes circuitry configured to: receive a plurality of compressed residual local weight updates from a plurality of respective institutes with a plurality of respective first parameters, the first parameters used to determine a plurality of respective predicted local weight updates; determine a plurality of local weight updates or a plurality of adjusted local weight updates based on the plurality of compressed residual local weight updates and the plurality of respective predicted local weight updates; aggregate the plurality of determined local weight updates or the plurality of adjusted local weight updates to generate an intended global weight update, and update a model on a server based at least on the intended global weight update, the model used to perform a task; and transfer a compressed residual global weight update to the institutes with a second parameter, the second parameter used to determine a predicted global weight update.
    Type: Application
    Filed: April 11, 2022
    Publication date: October 20, 2022
    Inventors: Honglei Zhang, Hamed Rezazadegan Tavakoli, Francesco Cricri, Homayun Afrabandpey, Goutham Rangu, Emre Baris Aksu
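A server-side sketch of the scheme in publication 20220335269: compressed residual local weight updates arrive from several institutes, each local update is reconstructed as a predicted update plus its residual, the reconstructed updates are aggregated into a global update that refreshes the server model, and a compressed residual global update is transferred back. Using the previous global update as the predictor for both the first and second parameters, averaging as the aggregation, and the coarse-quantization "compressor" are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
num_institutes, dim = 3, 100

server_model = np.zeros(dim)
prev_global_update = np.zeros(dim)

def compress(x):                              # stand-in compressor: coarse quantization
    return np.round(x * 64) / 64

for comm_round in range(5):
    # Each institute predicts its local update from a shared first parameter
    # (here simply the previous global update) and sends only a compressed residual.
    received = []
    for _ in range(num_institutes):
        predicted_local = prev_global_update             # derived from the first parameter
        true_local = predicted_local + rng.normal(scale=0.1, size=dim)
        received.append(compress(true_local - predicted_local))

    # Server reconstructs the local updates and aggregates them.
    local_updates = [prev_global_update + r for r in received]
    global_update = np.mean(local_updates, axis=0)        # assumed aggregation: average
    server_model += global_update                         # update the model on the server

    # Transfer a compressed residual global update with a second parameter
    # (again the previous global update serving as the predictor).
    predicted_global = prev_global_update
    residual_global = compress(global_update - predicted_global)
    prev_global_update = predicted_global + residual_global

print("server model norm after 5 rounds:", float(np.linalg.norm(server_model)))
```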
  • Publication number: 20220256227
    Abstract: An example method is provided to include receiving a media bitstream comprising one or more media units and a first enhancement information message, wherein the first enhancement information message comprises at least two independently parsable structures, a first independently parsable structure comprising information about at least one purpose of one or more neural networks (NNs) to be applied to the one or more media units, and a second independently parsable structure comprising or identifying one or more neural networks; decoding the one or more media units; and using the one or more neural networks to enhance or filter one or more frames of the decoded one or more media units, based on the at least one purpose. Corresponding apparatuses and computer program products are also provided.
    Type: Application
    Filed: February 3, 2022
    Publication date: August 11, 2022
    Inventors: Hamed REZAZADEGAN TAVAKOLI, Francesco CRICRÌ, Emre Baris AKSU, Miska Matias HANNUKSELA
  • Patent number: 11412266
    Abstract: An apparatus includes at least one processor; and at least one non-transitory memory including computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to perform: encode or decode a high-level bitstream syntax for at least one neural network; wherein the high-level bitstream syntax comprises at least one information unit having metadata or compressed neural network data of a portion of the at least one neural network; and wherein a serialized bitstream comprises one or more of the at least one information unit.
    Type: Grant
    Filed: January 4, 2021
    Date of Patent: August 9, 2022
    Assignee: Nokia Technologies Oy
    Inventors: Emre Baris Aksu, Miska Matias Hannuksela, Hamed Rezazadegan Tavakoli, Francesco Cricri
  • Patent number: 11375204
    Abstract: An apparatus includes at least one processor; and at least one non-transitory memory including computer program code; wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to: decode encoded data to generate decoded data, the encoded data having a bitrate lower than that of original data, and extract features from the decoded data; decode encoded residual features to generate decoded residual features; and generate enhanced decoded features as a result of combining the decoded residual features with the features extracted from the decoded data.
    Type: Grant
    Filed: March 31, 2021
    Date of Patent: June 28, 2022
    Assignee: Nokia Technologies Oy
    Inventors: Honglei Zhang, Hamed Rezazadegan Tavakoli, Francesco Cricri, Miska Matias Hannuksela, Emre Aksu, Nam Le
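A sketch of the decoder-side pipeline in patent 11375204, with toy stand-ins for the codec and the feature extractor: low-bitrate encoded data is decoded and features are extracted from it, encoded residual features are decoded separately, and the two are combined into enhanced decoded features. The quantization-based "codecs" and the fixed linear feature extractor are assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)

original = rng.normal(size=(16, 16))
feature_extractor = rng.normal(size=(16, 4)) / 4          # stand-in feature network

# Low-bitrate branch: coarse "encoding" of the data, then decoding and feature extraction.
decoded = np.round(original * 2) / 2                       # decoded data (lossy, low bitrate)
features_from_decoded = decoded @ feature_extractor

# Residual-feature branch: the residual between features of the original and features of
# the decoded data is "encoded" (finer quantization here) and then decoded.
features_original = original @ feature_extractor
decoded_residual_features = np.round((features_original - features_from_decoded) * 64) / 64

# Combine to obtain enhanced decoded features.
enhanced_features = features_from_decoded + decoded_residual_features

err_plain = np.abs(features_original - features_from_decoded).mean()
err_enhanced = np.abs(features_original - enhanced_features).mean()
print(f"feature error without residual: {err_plain:.4f}, with residual: {err_enhanced:.4f}")
```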
  • Publication number: 20220191524
    Abstract: An apparatus includes circuitry configured to: partition an input tensor into one or more block tensors; partition at least one of the block tensors into one or more continuation bands, the one or more continuation bands being associated with a caching counter having a value; store the one or more continuation bands in a cache managed using a cache manager; retrieve, prior to a convolution or pooling operation on a current block tensor, the one or more continuation bands of a previous block tensor from the cache that are adjacent to a current block tensor; concatenate the retrieved continuation bands with the current block tensor; apply the convolution or pooling operation on the current block tensor after the concatenation; decrease the respective caching counter value of the retrieved continuation bands; and clear the continuation bands from the cache when its respective caching counter reaches a value of zero.
    Type: Application
    Filed: December 13, 2021
    Publication date: June 16, 2022
    Inventors: Honglei ZHANG, Francesco Cricri, Hamed Rezazadegan Tavakoli, Jani Lainema, Emre Aksu, Nannan Zou