Patents by Inventor Daniel Chibotero

Daniel Chibotero has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
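Two themes recur in the abstracts below, structured sparsity for reduced weight memory and functional safety mechanisms for ANN processors; short illustrative code sketches of both follow the listing.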

  • Patent number: 11874900
    Abstract: Novel and useful system and methods of functional safety mechanisms for use in an artificial neural network (ANN) processor. The mechanisms can be deployed individually or in combination to provide a desired level of safety in neural networks. Multiple strategies are applied involving redundancy by design, redundancy through spatial mapping as well as self-tuning procedures that modify static (weights) and monitor dynamic (activations) behavior. The NN processor incorporates functional safety concepts which reduce the risk that a failure occurring during operation goes unnoticed. The mechanisms function to detect and promptly flag and report the occurrence of an error, with some mechanisms capable of correction as well. The safety mechanisms cover data stream fault detection, software defined redundant allocation, cluster interlayer safety, cluster intralayer safety, layer control unit (LCU) instruction addressing, weights storage safety, and neural network intermediate results storage safety.
    Type: Grant
    Filed: September 29, 2020
    Date of Patent: January 16, 2024
    Inventors: Guy Kaminitz, Ori Katz, Or Danon, Daniel Chibotero, Roi Seznayov, Nir Engelberg, Avi Baum, Itai Resh
  • Patent number: 11811421
    Abstract: Novel and useful system and methods of several functional safety mechanisms for use in an artificial neural network (ANN) processor. The mechanisms can be deployed individually or in combination to provide a desired level of safety in neural networks. Multiple strategies are applied involving redundancy by design, redundancy through spatial mapping as well as self-tuning procedures that modify static (weights) and monitor dynamic (activations) behavior. The mechanisms address ANN system level safety in situ, as a system level strategy tightly coupled with the processor architecture. The NN processor incorporates several functional safety concepts that function to detect and promptly flag and report an error with some mechanisms capable of correction as well.
    Type: Grant
    Filed: September 29, 2020
    Date of Patent: November 7, 2023
    Inventors: Guy Kaminitz, Roi Seznayov, Daniel Chibotero, Ori Katz, Nir Engelberg, Yuval Adelstein, Or Danon, Avi Baum
  • Patent number: 11615297
    Abstract: A novel and useful system and method of improved power performance and lowered memory requirements for an artificial neural network based on packing memory utilizing several structured sparsity mechanisms. The invention applies to neural network (NN) processing engines adapted to implement mechanisms to search for structured sparsity in weights and activations, resulting in a considerably reduced memory usage. The sparsity guided training mechanism synthesizes and generates structured sparsity weights. A compiler mechanism within a software development kit (SDK) manipulates structured weight domain sparsity to generate a sparse set of static weights for the NN. The structured sparsity static weights are loaded into the NN after compilation and utilized by both the structured weight domain sparsity mechanism and the structured activation domain sparsity mechanism.
    Type: Grant
    Filed: May 21, 2020
    Date of Patent: March 28, 2023
    Inventors: Avi Baum, Or Danon, Daniel Chibotero
  • Patent number: 11551028
    Abstract: A novel and useful system and method of improved power performance and lowered memory requirements for an artificial neural network based on packing memory utilizing several structured sparsity mechanisms. The invention applies to neural network (NN) processing engines adapted to implement mechanisms to search for structured sparsity in weights and activations, resulting in a considerably reduced memory usage. The sparsity guided training mechanism synthesizes and generates structured sparsity weights. A compiler mechanism within a software development kit (SDK) manipulates structured weight domain sparsity to generate a sparse set of static weights for the NN. The structured sparsity static weights are loaded into the NN after compilation and utilized by both the structured weight domain sparsity mechanism and the structured activation domain sparsity mechanism.
    Type: Grant
    Filed: May 21, 2020
    Date of Patent: January 10, 2023
    Inventors: Avi Baum, Or Danon, Daniel Chibotero, Gilad Nahor
  • Patent number: 11544545
    Abstract: A novel and useful system and method of improved power performance and lowered memory requirements for an artificial neural network based on packing memory utilizing several structured sparsity mechanisms. The invention applies to neural network (NN) processing engines adapted to implement mechanisms to search for structured sparsity in weights and activations, resulting in a considerably reduced memory usage. The sparsity guided training mechanism synthesizes and generates structured sparsity weights. A compiler mechanism within a software development kit (SDK) manipulates structured weight domain sparsity to generate a sparse set of static weights for the NN. The structured sparsity static weights are loaded into the NN after compilation and utilized by both the structured weight domain sparsity mechanism and the structured activation domain sparsity mechanism.
    Type: Grant
    Filed: May 21, 2020
    Date of Patent: January 3, 2023
    Inventors: Avi Baum, Or Danon, Daniel Chibotero, Gilad Nahor
  • Publication number: 20220101043
    Abstract: Novel and useful system and methods of several functional safety mechanisms for use in an artificial neural network (ANN) processor. The mechanisms can be deployed individually or in combination to provide a desired level of safety in neural networks. Multiple strategies are applied involving redundancy by design, redundancy through spatial mapping as well as self-tuning procedures that modify static (weights) and monitor dynamic (activations) behavior. The various mechanisms of the present invention address ANN system level safety in situ, as a system level strategy that is tightly coupled with the processor architecture. The NN processor incorporates several functional safety concepts which reduce the risk that a failure occurring during operation goes unnoticed. The mechanisms function to detect and promptly flag and report the occurrence of an error, with some mechanisms capable of correction as well.
    Type: Application
    Filed: September 29, 2020
    Publication date: March 31, 2022
    Inventors: Ori Katz, Roi Seznayov, Daniel Chibotero, Avi Baum, Guy Kaminitz, Amir Shmul, Nir Engelberg, Yuval Adelstein, Or Danon
  • Publication number: 20220100601
    Abstract: Novel and useful system and methods of several functional safety mechanisms for use in an artificial neural network (ANN) processor. The mechanisms can be deployed individually or in combination to provide a desired level of safety in neural networks. Multiple strategies are applied involving redundancy by design, redundancy through spatial mapping as well as self-tuning procedures that modify static (weights) and monitor dynamic (activations) behavior. The various mechanisms of the present invention address ANN system level safety in situ, as a system level strategy that is tightly coupled with the processor architecture. The NN processor incorporates several functional safety concepts which reduce the risk that a failure occurring during operation goes unnoticed. The mechanisms function to detect and promptly flag and report the occurrence of an error, with some mechanisms capable of correction as well.
    Type: Application
    Filed: September 29, 2020
    Publication date: March 31, 2022
    Inventors: Avi Baum, Daniel Chibotero, Roi Seznayov, Or Danon, Ori Katz, Guy Kaminitz
  • Publication number: 20220101042
    Abstract: Novel and useful system and methods of several functional safety mechanisms for use in an artificial neural network (ANN) processor. The mechanisms can be deployed individually or in combination to provide a desired level of safety in neural networks. Multiple strategies are applied involving redundancy by design, redundancy through spatial mapping as well as self-tuning procedures that modify static (weights) and monitor dynamic (activations) behavior. The various mechanisms of the present invention address ANN system level safety in situ, as a system level strategy that is tightly coupled with the processor architecture. The NN processor incorporates several functional safety concepts which reduce the risk that a failure occurring during operation goes unnoticed. The mechanisms function to detect and promptly flag and report the occurrence of an error, with some mechanisms capable of correction as well.
    Type: Application
    Filed: September 29, 2020
    Publication date: March 31, 2022
    Inventors: Guy Kaminitz, Ori Katz, Or Danon, Daniel Chibotero, Roi Seznayov, Nir Engelberg, Avi Baum, Itai Resh
  • Publication number: 20220103186
    Abstract: Novel and useful system and methods of several functional safety mechanisms for use in an artificial neural network (ANN) processor. The mechanisms can be deployed individually or in combination to provide a desired level of safety in neural networks. Multiple strategies are applied involving redundancy by design, redundancy through spatial mapping as well as self-tuning procedures that modify static (weights) and monitor dynamic (activations) behavior. The various mechanisms of the present invention address ANN system level safety in situ, as a system level strategy that is tightly coupled with the processor architecture. The NN processor incorporates several functional safety concepts which reduce the risk that a failure occurring during operation goes unnoticed. The mechanisms function to detect and promptly flag and report the occurrence of an error, with some mechanisms capable of correction as well.
    Type: Application
    Filed: September 29, 2020
    Publication date: March 31, 2022
    Inventors: Guy Kaminitz, Roi Seznayov, Daniel Chibotero, Ori Katz, Nir Engelberg, Yuval Adelstein, Or Danon, Avi Baum
  • Patent number: 11263077
    Abstract: Novel and useful system and methods of several functional safety mechanisms for use in an artificial neural network (ANN) processor. The mechanisms can be deployed individually or in combination to provide a desired level of safety in neural networks. Multiple strategies are applied involving redundancy by design, redundancy through spatial mapping as well as self-tuning procedures that modify static (weights) and monitor dynamic (activations) behavior. The various mechanisms of the present invention address ANN system level safety in situ, as a system level strategy that is tightly coupled with the processor architecture. The NN processor incorporates several functional safety concepts which reduce the risk that a failure occurring during operation goes unnoticed. The mechanisms function to detect and promptly flag and report the occurrence of an error, with some mechanisms capable of correction as well.
    Type: Grant
    Filed: September 29, 2020
    Date of Patent: March 1, 2022
    Inventors: Roi Seznayov, Guy Kaminitz, Daniel Chibotero, Ori Katz, Amir Shmul, Yuval Adelstein, Nir Engelberg, Or Danon, Avi Baum
  • Patent number: 11237894
    Abstract: Novel and useful system and methods of several functional safety mechanisms for use in an artificial neural network (ANN) processor. The mechanisms can be deployed individually or in combination to provide a desired level of safety in neural networks. Multiple strategies are applied involving redundancy by design, redundancy through spatial mapping as well as self-tuning procedures that modify static (weights) and monitor dynamic (activations) behavior. The various mechanisms of the present invention address ANN system level safety in situ, as a system level strategy that is tightly coupled with the processor architecture. The NN processor incorporates several functional safety concepts which reduce the risk that a failure occurring during operation goes unnoticed. The mechanisms function to detect and promptly flag and report the occurrence of an error, with some mechanisms capable of correction as well.
    Type: Grant
    Filed: September 29, 2020
    Date of Patent: February 1, 2022
    Inventors: Avi Baum, Roi Seznayov, Daniel Chibotero, Ori Katz, Guy Kaminitz, Nir Engelberg, Yuval Adelstein, Itai Resh, Or Danon
  • Patent number: 11221929
    Abstract: Novel and useful system and methods of several functional safety mechanisms for use in an artificial neural network (ANN) processor. The mechanisms can be deployed individually or in combination to provide a desired level of safety in neural networks. Multiple strategies are applied involving redundancy by design, redundancy through spatial mapping as well as self-tuning procedures that modify static (weights) and monitor dynamic (activations) behavior. The various mechanisms of the present invention address ANN system level safety in situ, as a system level strategy that is tightly coupled with the processor architecture. The NN processor incorporates several functional safety concepts which reduce the risk that a failure occurring during operation goes unnoticed. The mechanisms function to detect and promptly flag and report the occurrence of an error, with some mechanisms capable of correction as well.
    Type: Grant
    Filed: September 29, 2020
    Date of Patent: January 11, 2022
    Inventors: Ori Katz, Avi Baum, Guy Kaminitz, Daniel Chibotero, Or Danon, Roi Seznayov, Itai Resh
  • Publication number: 20200285892
    Abstract: A novel and useful system and method of improved power performance and lowered memory requirements for an artificial neural network based on packing memory utilizing several structured sparsity mechanisms. The invention applies to neural network (NN) processing engines adapted to implement mechanisms to search for structured sparsity in weights and activations, resulting in a considerably reduced memory usage. The sparsity guided training mechanism synthesizes and generates structured sparsity weights. A compiler mechanism within a software development kit (SDK) manipulates structured weight domain sparsity to generate a sparse set of static weights for the NN. The structured sparsity static weights are loaded into the NN after compilation and utilized by both the structured weight domain sparsity mechanism and the structured activation domain sparsity mechanism.
    Type: Application
    Filed: May 21, 2020
    Publication date: September 10, 2020
    Applicant: Hailo Technologies Ltd.
    Inventors: Avi Baum, Or Danon, Daniel Chibotero, Gilad Nahor
  • Publication number: 20200285949
    Abstract: A novel and useful system and method of improved power performance and lowered memory requirements for an artificial neural network based on packing memory utilizing several structured sparsity mechanisms. The invention applies to neural network (NN) processing engines adapted to implement mechanisms to search for structured sparsity in weights and activations, resulting in a considerably reduced memory usage. The sparsity guided training mechanism synthesizes and generates structured sparsity weights. A compiler mechanism within a software development kit (SDK) manipulates structured weight domain sparsity to generate a sparse set of static weights for the NN. The structured sparsity static weights are loaded into the NN after compilation and utilized by both the structured weight domain sparsity mechanism and the structured activation domain sparsity mechanism.
    Type: Application
    Filed: May 21, 2020
    Publication date: September 10, 2020
    Applicant: Hailo Technologies Ltd.
    Inventors: Avi Baum, Or Danon, Daniel Chibotero, Gilad Nahor
  • Publication number: 20200285950
    Abstract: A novel and useful system and method of improved power performance and lowered memory requirements for an artificial neural network based on packing memory utilizing several structured sparsity mechanisms. The invention applies to neural network (NN) processing engines adapted to implement mechanisms to search for structured sparsity in weights and activations, resulting in a considerably reduced memory usage. The sparsity guided training mechanism synthesizes and generates structured sparsity weights. A compiler mechanism within a software development kit (SDK) manipulates structured weight domain sparsity to generate a sparse set of static weights for the NN. The structured sparsity static weights are loaded into the NN after compilation and utilized by both the structured weight domain sparsity mechanism and the structured activation domain sparsity mechanism.
    Type: Application
    Filed: May 21, 2020
    Publication date: September 10, 2020
    Applicant: Hailo Technologies Ltd.
    Inventors: Avi Baum, Or Danon, Daniel Chibotero
  • Publication number: 20200279133
    Abstract: A novel and useful system and method of improved power performance and lowered memory requirements for an artificial neural network based on packing memory utilizing several structured sparsity mechanisms. The invention applies to neural network (NN) processing engines adapted to implement mechanisms to search for structured sparsity in weights and activations, resulting in a considerably reduced memory usage. The sparsity guided training mechanism synthesizes and generates structured sparsity weights. A compiler mechanism within a software development kit (SDK) manipulates structured weight domain sparsity to generate a sparse set of static weights for the NN. The structured sparsity static weights are loaded into the NN after compilation and utilized by both the structured weight domain sparsity mechanism and the structured activation domain sparsity mechanism.
    Type: Application
    Filed: May 21, 2020
    Publication date: September 3, 2020
    Applicant: Hailo Technologies Ltd.
    Inventors: Avi Baum, Or Danon, Daniel Chibotero
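
The structured sparsity abstracts above (patents 11615297, 11551028, and 11544545 and the related applications) describe a compiler mechanism in an SDK that turns weight-domain sparsity into a packed set of static weights. The Python sketch below is a minimal, hypothetical illustration of that idea, not Hailo Technologies' actual SDK or hardware; the 4x4 tile size, the magnitude-based pruning rule, and all function names are assumptions made for illustration.

    # Hypothetical sketch of structured weight-domain sparsity: prune whole
    # tiles, then pack only the surviving tiles plus their coordinates so the
    # static weights occupy less memory. Not Hailo's SDK; names are invented.
    import numpy as np

    BLOCK = 4  # tile edge length; structured sparsity drops whole tiles, not single weights

    def prune_to_block_sparsity(weights: np.ndarray, keep_ratio: float) -> np.ndarray:
        """Zero out the BLOCKxBLOCK tiles with the smallest L2 norm, keeping
        roughly keep_ratio of them (a stand-in for the sparsity-guided training
        step that synthesizes structured sparsity weights)."""
        rows, cols = weights.shape
        assert rows % BLOCK == 0 and cols % BLOCK == 0
        tiles = weights.reshape(rows // BLOCK, BLOCK, cols // BLOCK, BLOCK)
        norms = np.linalg.norm(tiles, axis=(1, 3))        # one Frobenius norm per tile
        threshold = np.quantile(norms, 1.0 - keep_ratio)  # tiles below this are dropped
        mask = (norms >= threshold)[:, None, :, None]
        return (tiles * mask).reshape(rows, cols)

    def pack_sparse_blocks(weights: np.ndarray):
        """Pack only the nonzero tiles plus their (row, col) tile indices, the
        'compiler' step that emits a sparse set of static weights."""
        rows, cols = weights.shape
        tiles = weights.reshape(rows // BLOCK, BLOCK, cols // BLOCK, BLOCK).transpose(0, 2, 1, 3)
        payload, index = [], []
        for i in range(tiles.shape[0]):
            for j in range(tiles.shape[1]):
                if np.any(tiles[i, j]):
                    index.append((i, j))
                    payload.append(tiles[i, j])
        return np.stack(payload), index

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        w = rng.standard_normal((16, 16)).astype(np.float32)
        sparse_w = prune_to_block_sparsity(w, keep_ratio=0.25)
        payload, index = pack_sparse_blocks(sparse_w)
        packed_bytes = payload.nbytes + len(index) * 2 * 2   # assume 2-byte tile coordinates
        print(f"kept {len(index)} of {w.size // (BLOCK * BLOCK)} tiles; "
              f"{w.nbytes} B dense vs ~{packed_bytes} B packed")

In this toy run roughly a quarter of the tiles survive pruning, so the packed payload is about a quarter of the dense weight memory, which is the memory-packing effect the abstracts describe.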
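
The functional safety abstracts (patents 11874900, 11811421, 11263077, 11237894, and 11221929 and the related applications) name mechanisms such as weights storage safety and redundant allocation that detect and flag errors during operation. The sketch below is a hypothetical software analogue of two of those ideas, not the patented processor mechanisms: a checksum over the stored weights flags silent corruption of weights memory, and running a computation twice and comparing the results stands in for redundancy by design. The class and function names are assumptions for illustration.

    # Hypothetical software analogue of two safety mechanisms: a checksum over
    # stored weights (weights storage safety) and duplicate-and-compare
    # execution (redundancy by design). Names are invented for illustration.
    import zlib
    import numpy as np

    def crc32_of(array: np.ndarray) -> int:
        """Checksum of a weight tensor as stored, recomputed later to detect
        silent corruption of the weights memory."""
        return zlib.crc32(array.tobytes())

    class SafeLayer:
        def __init__(self, weights: np.ndarray):
            self.weights = weights
            self.ref_crc = crc32_of(weights)  # reference checksum taken at load time

        def forward(self, x: np.ndarray) -> np.ndarray:
            # Weights storage safety: flag the error if the stored weights no
            # longer match the checksum recorded when they were loaded.
            if crc32_of(self.weights) != self.ref_crc:
                raise RuntimeError("weights memory corruption detected")
            # Redundancy by design: run the computation twice and compare, so a
            # transient fault in one pass is detected rather than going unnoticed.
            y1 = x @ self.weights
            y2 = x @ self.weights
            if not np.array_equal(y1, y2):
                raise RuntimeError("redundant computation mismatch detected")
            return y1

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        layer = SafeLayer(rng.standard_normal((8, 4)).astype(np.float32))
        x = rng.standard_normal((1, 8)).astype(np.float32)
        print("output ok:", layer.forward(x).shape)
        # Simulate a single-bit fault in the weights memory; the checksum flags it.
        layer.weights.view(np.uint8)[0] ^= 0x01
        try:
            layer.forward(x)
        except RuntimeError as err:
            print("flagged:", err)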