Patents by Inventor Christian Haase-Schuetz

Christian Haase-Schuetz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240078472
    Abstract: An iterative method for determining training data for a primary model to solve a primary recognition task. The iterative method includes a) providing at least one labeled training sample, b) training the primary model with the at least one labeled training sample, c) providing at least one labeled test sample, and d) evaluating a recognition performance of the primary model using the labeled test sample on the primary recognition task. The iterative method further includes, depending on a result of evaluating the recognition performance, either (i) re-performing parts a), b), c), and d) of the iterative method, or (ii) ending the iterative method.
    Type: Application
    Filed: September 6, 2023
    Publication date: March 7, 2024
    Inventors: Christian Haase-Schuetz, Heinz Hertlein, Joscha Liedtke, Oliver Rogalla
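
The loop claimed in 20240078472 can be made concrete with a short sketch. Everything below is an assumption for illustration only: the callables label_samples, train, label_test_samples, and evaluate, as well as the stopping threshold, stand in for steps a) through d) of the abstract and for the decision to either re-perform them or end the method.

```python
def iterative_training_data_determination(
    label_samples,       # hypothetical: returns newly labeled training samples (step a)
    train,               # hypothetical: trains the primary model on all labeled samples (step b)
    label_test_samples,  # hypothetical: returns labeled test samples (step c)
    evaluate,            # hypothetical: returns recognition performance on the test set (step d)
    target_performance=0.95,
    max_rounds=10,
):
    """Sketch of the iterative method: repeat a)-d) until the evaluated
    recognition performance is good enough, then end the method."""
    training_set = []
    for _ in range(max_rounds):
        training_set += label_samples()           # a) provide labeled training samples
        model = train(training_set)               # b) train the primary model
        test_set = label_test_samples()           # c) provide labeled test samples
        performance = evaluate(model, test_set)   # d) evaluate recognition performance
        if performance >= target_performance:     # decide: end or re-perform a)-d)
            break
    return model, training_set
```
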
  • Publication number: 20220284287
    Abstract: An artificial neural network (ANN), including processing layers which are each configured to process input quantities in accordance with trainable parameters of the ANN to form output quantities. At least one normalizer is inserted into at least one processing layer and/or between at least two processing layers. The normalizer includes a transformation element configured to transform input quantities directed into the normalizer into one or more input vectors, using a predefined transformation. The normalizer also includes a normalizing element configured to normalize the input vector(s) using a normalization function, to form one or more output vectors. The normalization function has at least two different regimes and changes between the regimes as a function of a norm of the input vector at a point and/or in a range, whose position is a function of a predefined parameter. The normalizer also includes an inverse transformation element.
    Type: Application
    Filed: July 28, 2020
    Publication date: September 8, 2022
    Inventors: Christian Haase-Schuetz, Frank Schmidt, Torsten Sachse
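
One possible reading of the two-regime normalizer in 20220284287 is sketched below. The threshold c plays the role of the predefined parameter that positions the regime change; treating the transformation element as simple flattening, and regime 2 as projection onto the norm-c sphere, are assumptions made here for illustration, not the patented design.

```python
import numpy as np

def two_regime_normalize(x, c=1.0):
    """Hypothetical two-regime normalization function: below the norm threshold c
    the vector passes through unchanged (regime 1); above it the vector is rescaled
    to norm c (regime 2). The switch point is set by the predefined parameter c."""
    n = np.linalg.norm(x)
    if n <= c:           # regime 1: small-norm inputs are left as they are
        return x
    return x * (c / n)   # regime 2: large-norm inputs are projected onto the norm-c sphere

def normalizer(inputs, c=1.0):
    """Sketch of the normalizer block: transform the layer's input quantities into
    vectors, normalize each one, and transform back. Here the 'transformation' is
    simply flattening to 1-D vectors, which is an assumption, not the patented choice."""
    vectors = [np.ravel(q) for q in inputs]                        # transformation element
    normalized = [two_regime_normalize(v, c) for v in vectors]     # normalizing element
    return [v.reshape(np.shape(q)) for v, q in zip(normalized, inputs)]  # inverse transformation

# usage sketch: the first input exceeds norm 1 and is rescaled, the second passes through
outputs = normalizer([np.array([[3.0, 4.0]]), np.array([[0.1, 0.2]])], c=1.0)
```
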
  • Publication number: 20220156517
    Abstract: The present disclosure relates to a method for generating training data for a recognition model for recognizing objects in sensor data of a vehicle. First sensor data and second sensor data are input into a learning algorithm. The first sensor data comprise measurements of a first surroundings sensor. The second sensor data comprise measurements of a second surroundings sensor. Using the learning algorithm, a training data generation model is generated that generates measurements of the second surroundings sensor assigned to measurements of the first surroundings sensor. First simulation data are input into the training data generation model. The first simulation data comprise simulated measurements of the first surroundings sensor. Second simulation data are generated as the training data based on the first simulation data using the training data generation model. The second simulation data comprise simulated measurements of the second surroundings sensor.
    Type: Application
    Filed: November 18, 2021
    Publication date: May 19, 2022
    Inventors: Christian Haase-Schuetz, Heinz Hertlein, Joscha Liedtke
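
The sensor-to-sensor pipeline in 20220156517 can be illustrated with a minimal sketch. A least-squares linear map stands in for the unspecified learning algorithm, and all names and shapes below are hypothetical.

```python
import numpy as np

def fit_sensor_mapping(first_sensor_data, second_sensor_data):
    """Hypothetical stand-in for the learning algorithm: fit a linear map from
    measurements of the first surroundings sensor to measurements of the second.
    The abstract does not specify the model class; a least-squares fit is used
    here only to make the pipeline concrete."""
    X = np.asarray(first_sensor_data)
    Y = np.asarray(second_sensor_data)
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W  # the "training data generation model"

def generate_training_data(model, first_simulation_data):
    """Apply the training data generation model to simulated first-sensor
    measurements to obtain simulated second-sensor measurements (the training data)."""
    return np.asarray(first_simulation_data) @ model

# usage sketch with made-up shapes: 100 paired measurements, 4 and 3 channels
rng = np.random.default_rng(0)
first_real, second_real = rng.normal(size=(100, 4)), rng.normal(size=(100, 3))
mapping = fit_sensor_mapping(first_real, second_real)
simulated_second = generate_training_data(mapping, rng.normal(size=(10, 4)))
```
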
  • Publication number: 20210224646
    Abstract: A method for generating labels for a data set. The method includes: providing an unlabeled data set comprising a number of unlabeled data; generating initial labels for the data of the unlabeled data set; providing the initial labels as nth labels where n=1; performing an iterative process, where an nth iteration of the iterative process comprises the following steps for every n = 1, 2, 3, …, N: training a model as an nth trained model using a labeled data set, the labeled data set being given by a combination of the data of the unlabeled data set with the nth labels; predicting nth predicted labels for the unlabeled data of the unlabeled data set by using the nth trained model; determining (n+1)th labels from a set of labels comprising at least the nth predicted labels.
    Type: Application
    Filed: December 21, 2020
    Publication date: July 22, 2021
    Inventors: Achim Feyerabend, Alexander Blonczewski, Christian Haase-Schuetz, Elena Pancera, Heinz Hertlein, Jinquan Zheng, Joscha Liedtke, Marianne Gaul, Rainer Stal, Srinandan Krishnamoorthy
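
The iterative labeling scheme in 20210224646 is essentially a self-training loop; a minimal sketch follows. The callables generate_initial_labels, train_model, and predict are hypothetical placeholders, and taking the (n+1)th labels to be exactly the nth predictions is just one simple instance of the claimed "determined from a set of labels comprising at least the nth predicted labels".

```python
def iterative_label_generation(
    unlabeled_data,
    generate_initial_labels,  # hypothetical: produces the initial (1st) labels
    train_model,              # hypothetical: trains a model on (data, labels)
    predict,                  # hypothetical: predicts labels with a trained model
    num_iterations=3,
):
    """Sketch of the iterative labeling process: the nth model is trained on the
    nth labels, and its predictions become the basis for the (n+1)th labels."""
    labels = generate_initial_labels(unlabeled_data)   # initial labels, provided as 1st labels
    for n in range(1, num_iterations + 1):
        model = train_model(unlabeled_data, labels)    # nth trained model
        predicted = predict(model, unlabeled_data)     # nth predicted labels
        labels = predicted                             # (n+1)th labels (simplest choice)
    return labels, model
```
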
  • Publication number: 20210216912
    Abstract: A device and a computer-implemented method for data-efficient active machine learning. Annotated data are provided. A model is trained for a classification of the data as a function of the annotated data. The model trained in this way is calibrated, as a function of the annotated data, with regard to confidence for a correctness of the classification of the annotated data by the model. For unannotated data, the confidence for the correctness of the classification of the unannotated data is determined, using the model calibrated in this way. From the unannotated data, the data whose confidence satisfies a criterion are acquired for the active machine learning.
    Type: Application
    Filed: January 6, 2021
    Publication date: July 15, 2021
    Inventors: Christian Haase-Schuetz, Peter Moeller
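
A minimal sketch of the calibrate-then-acquire idea in 20210216912, assuming scikit-learn is available. The logistic-regression classifier, the cross-validated sigmoid calibration, and the "confidence below a threshold" criterion are illustrative assumptions; the abstract only requires some calibrated confidence and some acquisition criterion.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.calibration import CalibratedClassifierCV

def acquire_uncertain_samples(annotated_X, annotated_y, unannotated_X, threshold=0.7):
    """Train a classifier on the annotated data, calibrate its confidence via
    cross-validation on that same data, then select the unannotated samples whose
    calibrated confidence falls below the threshold."""
    # train and calibrate the classification model on the annotated data
    calibrated = CalibratedClassifierCV(LogisticRegression(max_iter=1000), cv=3)
    calibrated.fit(annotated_X, annotated_y)
    # confidence = calibrated probability of the predicted class
    confidence = calibrated.predict_proba(unannotated_X).max(axis=1)
    # acquire the unannotated data whose confidence satisfies the criterion
    return unannotated_X[confidence < threshold]

# usage sketch with synthetic data
rng = np.random.default_rng(0)
X, y = rng.normal(size=(60, 5)), rng.integers(0, 2, size=60)
picked = acquire_uncertain_samples(X, y, rng.normal(size=(20, 5)))
```
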
  • Publication number: 20210216869
    Abstract: A device and a computer-implemented method for data-efficient active machine learning. Annotated data are provided. A model is trained for a classification of the data as a function of the annotated data. For the unannotated data, values of an acquisition function are determined, and the data whose acquisition-function values satisfy a criterion are acquired for the active machine learning. An autocorrelation is determined via a feature representation for a sample from the unannotated data to be assessed, in particular from at least one layer of the model. The value of the acquisition function of this sample is determined as a function of a root mean square via the autocorrelation, in particular in at least one dimension.
    Type: Application
    Filed: January 6, 2021
    Publication date: July 15, 2021
    Inventor: Christian Haase-Schuetz
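
One hedged reading of the acquisition function in 20210216869 is sketched below: compute the autocorrelation of a one-dimensional feature representation of the sample and score it by the root mean square over that autocorrelation. The centering step, the "full" correlation mode, and the "value above a threshold" acquisition criterion are assumptions for illustration only.

```python
import numpy as np

def acquisition_value(feature_vector):
    """Hypothetical acquisition function: take a 1-D feature representation of the
    sample (e.g. activations of one model layer), compute its autocorrelation along
    that dimension, and return the root mean square over the autocorrelation."""
    f = np.asarray(feature_vector, dtype=float)
    f = f - f.mean()                                 # center before correlating
    autocorr = np.correlate(f, f, mode="full")       # autocorrelation in one dimension
    return float(np.sqrt(np.mean(autocorr ** 2)))    # root mean square over the autocorrelation

def acquire(features_of_unannotated_samples, threshold):
    """Acquire the unannotated samples whose acquisition-function values satisfy the
    criterion (here, hypothetically: value above a threshold)."""
    values = [acquisition_value(f) for f in features_of_unannotated_samples]
    return [i for i, v in enumerate(values) if v > threshold]
```
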
  • Publication number: 20210192345
    Abstract: A method and a device for generating labeled data, for example training data, in particular for a neural network.
    Type: Application
    Filed: December 10, 2020
    Publication date: June 24, 2021
    Inventors: Christian Haase-Schuetz, Heinz Hertlein, Rainer Stal
  • Publication number: 20210117787
    Abstract: A method for training a machine learning model for determining a quality grade of data sets from each of a plurality of sensors. The sensors are configured to generate surroundings representations. The method includes: providing data sets of each of the sensors from corresponding surroundings representations; providing attribute data of ground truth objects of the surroundings representations; determining a quality grade of the respective data set of each of the sensors using a metric, the metric comparing at least one variable, which is determined using the respective data set, with at least one attribute datum of at least one associated ground truth object of the surroundings representation; and training the machine learning model using the data sets of each of the sensors and the respectively assigned determined quality grades.
    Type: Application
    Filed: October 5, 2020
    Publication date: April 22, 2021
    Inventors: Rainer Stal, Christian Haase-Schuetz, Heinz Hertlein
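
The quality-grading step in 20210117787 can be sketched as follows. Using object positions as the compared variable and mapping the mean position error into a grade between 0 and 1 are illustrative assumptions; the abstract only requires a metric that compares a variable determined from the data set with attribute data of the associated ground-truth objects, followed by training a model on the graded data sets.

```python
import numpy as np

def quality_grade(estimated_positions, ground_truth_positions):
    """Hypothetical metric: compare a variable determined from the sensor data set
    (here, estimated object positions) with the attribute data of the associated
    ground-truth objects, and turn the mean position error into a grade in [0, 1]."""
    errors = np.linalg.norm(
        np.asarray(estimated_positions) - np.asarray(ground_truth_positions), axis=1
    )
    return float(1.0 / (1.0 + errors.mean()))  # 1.0 = perfect agreement, approaches 0 for large errors

def build_training_set(sensor_data_sets, estimates_per_set, ground_truth_per_set):
    """Pair every sensor data set with its determined quality grade; a downstream
    machine learning model (not sketched here) would then be trained on these pairs."""
    grades = [
        quality_grade(est, gt)
        for est, gt in zip(estimates_per_set, ground_truth_per_set)
    ]
    return list(zip(sensor_data_sets, grades))
```
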