Patents by Inventor Lukas Enderich

Lukas Enderich has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240177004
    Abstract: A method for training an artificial neural network. The method includes the following steps: providing training data for training the artificial neural network; detecting at least one specification regarding available resources; ascertaining a cost function that, in addition to an actual learning task, also takes into account the at least one specification regarding available resources; and training the artificial neural network on the basis of the provided training data using the ascertained cost function.
    Type: Application
    Filed: November 27, 2023
    Publication date: May 30, 2024
    Inventor: Lukas Enderich
  • Publication number: 20220277200
    Abstract: A method for training a trainable module that maps input variables onto output variables through an internal processing chain. A learning data set is provided including learning values of the input variables and associated learning values of the output variables. A list of discrete values is provided from which the parameters characterizing the internal processing chain are to be selected, the discrete values being selected such that they can be stored without loss of quality. The learning values are mapped by the trainable module onto assessment values of the output variables. A cost function is evaluated that characterizes deviations of the assessment values of the output variables from the learning values and of at least one parameter of the internal processing chain from at least one discrete value in the list. At least one parameter of the internal processing chain is adjusted to improve the value of the cost function.
    Type: Application
    Filed: August 6, 2020
    Publication date: September 1, 2022
    Inventors: Fabian Timm, Lukas Enderich
  • Publication number: 20220076124
    Abstract: A method for compressing a neural network. The method includes: defining a maximum complexity of the neural network; ascertaining a first cost function; ascertaining a second cost function, which characterizes a deviation of the current complexity of the neural network from the defined maximum complexity; training the neural network in such a way that the sum of the first and the second cost functions is optimized as a function of the parameters of the neural network; and removing those weightings whose assigned scaling factor is smaller than a predefined threshold value.
    Type: Application
    Filed: August 6, 2021
    Publication date: March 10, 2022
    Inventors: Fabian Timm, Lukas Enderich
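
The resource-aware cost function described in publication 20240177004 can be pictured with a minimal PyTorch sketch. This is only an illustrative reading of the abstract, not the claimed method: the memory budget, the L1-based memory surrogate, the toy data, and all hyperparameters below are assumptions.

```python
# Hypothetical sketch: a cost function that combines the actual learning task with a
# penalty derived from a "detected" resource specification (here, an assumed memory budget).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Assumed training data (the abstract does not prescribe a task or dataset).
x = torch.randn(256, 16)
y = torch.randint(0, 4, (256,))

model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 4))
task_loss = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Specification regarding available resources: an assumed memory budget in bytes.
memory_budget_bytes = 2_000.0
bytes_per_weight = 4.0  # float32

def resource_penalty(m: nn.Module) -> torch.Tensor:
    """Differentiable surrogate for memory usage: L1 mass of the weights scaled to
    bytes, penalized only where it exceeds the budget (an illustrative choice)."""
    l1_mass = sum(p.abs().sum() for p in m.parameters())
    estimated_bytes = bytes_per_weight * l1_mass
    return torch.relu(estimated_bytes - memory_budget_bytes)

resource_weight = 1e-4  # assumed trade-off between task term and resource term

for epoch in range(20):
    optimizer.zero_grad()
    logits = model(x)
    # Cost function = learning task + resource-specification term, as in the abstract.
    loss = task_loss(logits, y) + resource_weight * resource_penalty(model)
    loss.backward()
    optimizer.step()

print(f"final combined loss: {loss.item():.4f}")
```

The penalty here is one of many possible surrogates; any differentiable estimate of the constrained resource could take its place in the combined cost.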
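
Publication 20220277200 describes a cost function that penalizes both the output error and the distance of the parameters from a list of discrete values. The sketch below illustrates that idea under assumed choices: the five-entry value list, the squared-distance penalty, the regression task, and the final snapping step are not taken from the abstract.

```python
# Hypothetical sketch: train a module so that its parameters are drawn toward a list of
# discrete values while still fitting the learning data.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Learning data set: learning values of the input and output variables (assumed regression task).
x = torch.randn(256, 8)
y = torch.randn(256, 1)

module = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(module.parameters(), lr=1e-3)
output_loss = nn.MSELoss()

# Assumed list of discrete values from which the parameters are to be selected.
discrete_values = torch.tensor([-0.5, -0.25, 0.0, 0.25, 0.5])

def quantization_penalty(m: nn.Module) -> torch.Tensor:
    """Squared distance of every parameter to its nearest entry in the discrete list."""
    penalty = torch.zeros(())
    for p in m.parameters():
        dist = (p.unsqueeze(-1) - discrete_values) ** 2    # distance to each list entry
        penalty = penalty + dist.min(dim=-1).values.sum()  # keep only the nearest entry
    return penalty

penalty_weight = 1e-3  # assumed weighting of the discretization term

for epoch in range(50):
    optimizer.zero_grad()
    assessment = module(x)  # learning values mapped onto assessment values
    # Cost function: deviation of assessment values from learning values, plus deviation
    # of the parameters from the discrete value list, as in the abstract.
    loss = output_loss(assessment, y) + penalty_weight * quantization_penalty(module)
    loss.backward()
    optimizer.step()

# After training, snap each parameter to its nearest discrete value for compact storage.
with torch.no_grad():
    for p in module.parameters():
        idx = (p.unsqueeze(-1) - discrete_values).abs().argmin(dim=-1)
        p.copy_(discrete_values[idx])
```

Because the penalty pulls every parameter toward the list during training, the final snapping step changes the module's behavior far less than quantizing an unconstrained network would.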
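
For publication 20220076124, the sketch below shows one possible reading of the two-cost-function compression scheme: per-channel scaling factors play the role of the "weightings", an L1 surrogate stands in for the network's complexity, and scaling factors below a threshold are zeroed out after training. All of these concrete choices are assumptions for illustration, not the claimed method.

```python
# Hypothetical sketch: jointly optimize a task cost and a complexity-gap cost, then remove
# weightings whose learned scaling factor falls below a predefined threshold.
import torch
import torch.nn as nn

torch.manual_seed(0)

class ScaledLinear(nn.Module):
    """Linear layer whose output channels carry learnable scaling factors."""
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.scale = nn.Parameter(torch.ones(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.linear(x) * self.scale

model = nn.Sequential(ScaledLinear(16, 64), nn.ReLU(), ScaledLinear(64, 4))

x = torch.randn(256, 16)
y = torch.randint(0, 4, (256,))

first_cost = nn.CrossEntropyLoss()  # first cost function: task performance
max_complexity = 32.0               # defined maximum complexity (assumed unit: L1 mass of scales)
threshold = 1e-2                    # predefined threshold for removing scaling factors

def current_complexity(m: nn.Module) -> torch.Tensor:
    """Differentiable surrogate for the current complexity: L1 mass of all scaling factors."""
    return sum(mod.scale.abs().sum() for mod in m.modules() if isinstance(mod, ScaledLinear))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(50):
    optimizer.zero_grad()
    # Second cost function: deviation of the current complexity from the defined maximum.
    second_cost = torch.relu(current_complexity(model) - max_complexity)
    loss = first_cost(model(x), y) + second_cost  # sum of the first and second cost functions
    loss.backward()
    optimizer.step()

# Remove those weightings whose assigned scaling factor is smaller than the threshold.
with torch.no_grad():
    for mod in model.modules():
        if isinstance(mod, ScaledLinear):
            keep = mod.scale.abs() >= threshold
            mod.scale[~keep] = 0.0
            mod.linear.weight[~keep] = 0.0  # zero the pruned output channels
            mod.linear.bias[~keep] = 0.0

pruned = sum((mod.scale == 0).sum().item() for mod in model.modules() if isinstance(mod, ScaledLinear))
print(f"pruned scaling factors: {pruned}")
```

In this reading, the second cost only becomes active once the complexity surrogate exceeds the defined maximum, so the network is free to use its full capacity as long as it stays within the prescribed budget.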