Patents by Inventor Till EGGERS

Till EGGERS has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240095911
    Abstract: The present disclosure relates to image processing or computer vision techniques. A computer-implemented method is provided for determining a damage status of a physical object, the method comprising the steps of receiving a surface image of the physical object; and providing a pre-trained machine learning model to derive property values from the received surface image, wherein each property value is indicative of a damage index at a respective location, and wherein the property values are preferably usable for monitoring and/or controlling a production process of the physical object. In this way, local defects can be identified reliably and with sufficient accuracy to apply chemical products in suitable amounts.
    Type: Application
    Filed: March 31, 2022
    Publication date: March 21, 2024
    Inventors: Rahul TANEJA, Kamran SIAL, Till EGGERS, Margret KEUPER, Ramon NAVARRA-MESTRE, Sebastian FISCHER, Mike SCHARNER, Javier ROMERO RODRIGUEZ, Francisco Manuel POLO LOPEZ, Andres MARTIN PALMA
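    A minimal sketch of the idea in 20240095911 above: a fully convolutional regression network maps a surface image to one damage index per location. The tiny network, its random weights, the input size, and the treatment threshold are all hypothetical stand-ins for the pre-trained model the abstract refers to.
    ```python
    # Sketch (PyTorch): per-location damage indices from a surface image.
    # Network, weights, and threshold are illustrative assumptions.
    import torch
    import torch.nn as nn

    damage_net = nn.Sequential(              # 3-channel image -> 1-channel damage map
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 1),                 # one damage index per pixel location
        nn.Sigmoid(),                        # damage index in [0, 1]
    )

    surface_image = torch.rand(1, 3, 256, 256)        # placeholder received image
    with torch.no_grad():
        damage_map = damage_net(surface_image)[0, 0]  # (256, 256) property values

    # Hypothetical downstream use: flag locations whose damage index exceeds a
    # threshold so a production process could dose chemical products locally.
    needs_treatment = damage_map > 0.5
    print(f"{int(needs_treatment.sum())} of {damage_map.numel()} locations flagged")
    ```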
  • Publication number: 20240021273
    Abstract: Computer-implemented method for providing training data for a machine learning algorithm for image classifying a pathogen infestation of a plant, comprising the steps: providing image data of a plant or a plant part infested with a pathogen; providing genetic result data of the plant or the plant part to which the image data referred comprising at least information about the type of pathogen; labeling the image data with the genetic result data.
    Type: Application
    Filed: November 12, 2021
    Publication date: January 18, 2024
    Inventors: Isabella SIEPE, Kristina BUSCH, Till EGGERS, Ramon NAVARRA-MESTRE, Egon HADEN, Jessica ARNHOLD, Sebastian FISCHER, Andres MARTIN PALMA, Christian KLUKAS, Swetlana FRIEDEL, Bastian STUERMER-STEPHAN, Stefan HAHN, Stefan TRESCH
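    A minimal sketch of the labeling step described in 20240021273 above, assuming the genetic result data arrive as a CSV keyed by sample id; the file naming scheme, CSV columns, and JSON output format are hypothetical.
    ```python
    # Sketch: label plant images with genetic (e.g., assay) results for CNN training.
    # File names, columns, and output format are illustrative assumptions.
    import csv
    import json
    from pathlib import Path

    def build_labels(image_dir: str, genetics_csv: str, out_file: str) -> None:
        # genetic result data: sample_id -> pathogen type determined for that sample
        with open(genetics_csv, newline="") as f:
            pathogen_by_sample = {row["sample_id"]: row["pathogen"]
                                  for row in csv.DictReader(f)}

        records = []
        for img in sorted(Path(image_dir).glob("*.jpg")):
            sample_id = img.stem.split("_")[0]   # assumed "<sample_id>_<n>.jpg" naming
            if sample_id in pathogen_by_sample:
                records.append({"image": str(img),
                                "label": pathogen_by_sample[sample_id]})

        Path(out_file).write_text(json.dumps(records, indent=2))

    # Example call (hypothetical paths):
    # build_labels("images/", "genetic_results.csv", "training_labels.json")
    ```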
  • Publication number: 20230351743
    Abstract: Quantifying plant infestation is performed by estimating the number of biological objects (132) on parts (122) of a plant (112). A computer (202) receives a plant-image (412) taken from a particular plant (112). The computer (202) uses a first convolutional neural network (262/272) to derive a part-image (422) that shows a part of the plant. The computer (202) splits the part-image into tiles and uses a second network to process the tiles into density maps. The computer (202) combines the density maps into a combined density map with the dimensions of the part-image and integrates the pixel values to obtain an estimated number of objects for the part. Object classes (132(1), 132(2)) can be differentiated to fine-tune the quantification and identify class-specific countermeasures.
    Type: Application
    Filed: September 29, 2020
    Publication date: November 2, 2023
    Inventors: Aitor ALVAREZ GILA, Amaia Maria Ortiz Barredo, David Roldan Lopez, Javier Romero Rodriguez, Corinna Maria Spangler, Christian Klukas, Till Eggers, Jone Echazarra Huguet, Ramon Navarra Mestre, Artzai Picon Ruiz, Aranzazu Bereciartua Perez
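    A minimal sketch of the tile / density-map / integration step described in 20230351743 above. The tiny density network is a hypothetical stand-in for the trained second network, and the part-image is assumed to be already cropped by the first network.
    ```python
    # Sketch (PyTorch): tile the part-image, estimate per-tile density maps,
    # stitch them back, and integrate pixel values to count objects.
    import torch
    import torch.nn as nn

    TILE = 64
    density_net = nn.Sequential(                  # per-tile density estimator (stand-in)
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 1), nn.ReLU(),           # non-negative density per pixel
    )

    part_image = torch.rand(3, 256, 256)          # placeholder part-image (e.g., a leaf)
    _, H, W = part_image.shape

    combined = torch.zeros(H, W)                  # density map in the part-image dimension
    with torch.no_grad():
        for y in range(0, H, TILE):
            for x in range(0, W, TILE):
                tile = part_image[:, y:y + TILE, x:x + TILE].unsqueeze(0)
                combined[y:y + TILE, x:x + TILE] = density_net(tile)[0, 0]

    count = combined.sum().item()                 # integrating the density ~ object count
    print(f"estimated objects on this part: {count:.1f}")
    ```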
  • Publication number: 20230230373
    Abstract: Computer-implemented method and system (100) for estimating vegetation coverage in a real-world environment. The system receives an RGB image (91) of a real-world scenery (1) with one or more plant elements (10) of one or more plant species. At least one channel of the RGB image (91) is provided to a semantic regression neural network (120) which is trained to estimate at least a near-infrared channel (NIR) from the RGB image. The system obtains an estimate of the near-infrared channel (NIR) by applying the semantic regression neural network (120) to the at least one RGB channel (91). A multi-channel image (92) comprising at least one of the R-, G-, B-channels (R, G, B) of the RGB image and the estimated near-infrared channel (NIR), is provided as test input (TI1) to a semantic segmentation neural network (130) trained with multi-channel images to segment the test input (TI1) into pixels associated with plant elements and pixels not associated with plant elements.
    Type: Application
    Filed: May 7, 2021
    Publication date: July 20, 2023
    Inventors: Artzai PICON RUIZ, Miguel GONZALEZ SAN EMETERIO, Aranzazu BERECIARTUA-PEREZ, Laura GOMEZ ZAMANILLO, Carlos Javier JIMENEZ RUIZ, Javier ROMERO RODRIGUEZ, Christian KLUKAS, Till EGGERS, Jone ECHAZARRA HUGUET, Ramon NAVARRA-MESTRE
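    A minimal sketch of the two-network chain described in 20230230373 above: estimate an NIR channel from RGB, then segment the 4-channel image into plant and non-plant pixels. Both tiny networks are hypothetical stand-ins for the trained semantic regression and semantic segmentation networks.
    ```python
    # Sketch (PyTorch): RGB -> estimated NIR, then RGB+NIR -> plant mask.
    import torch
    import torch.nn as nn

    rgb_to_nir = nn.Sequential(                   # semantic regression: RGB -> NIR
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
    )
    segmenter = nn.Sequential(                    # segmentation on the 4-channel input
        nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 1),                      # logit: plant vs. non-plant per pixel
    )

    rgb = torch.rand(1, 3, 128, 128)              # placeholder RGB image of the scenery
    with torch.no_grad():
        nir = rgb_to_nir(rgb)                     # estimated near-infrared channel
        multi_channel = torch.cat([rgb, nir], dim=1)   # R, G, B + estimated NIR
        plant_mask = segmenter(multi_channel) > 0      # pixels of plant elements

    coverage = plant_mask.float().mean().item()
    print(f"estimated vegetation coverage: {coverage:.1%}")
    ```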
  • Publication number: 20230141945
    Abstract: In performing a computer-implemented method to quantify biotic damage in leaves of crop plants, the computer receives a plant-image (410) showing a crop plant: the aerial part of the plant, with stem, branches, and leaves, and the ground on which the plant is placed. A segmenter module obtains a segmented plant-image, a contiguous set of pixels that shows the aerial part within a contour (460A), the contour (460A) having a margin region (458) that partially shows the ground. The computer uses a convolutional neural network that processes the segmented plant-image by regression to obtain a damage degree, the convolutional neural network having been trained by processing damage-annotated segmented plant-images.
    Type: Application
    Filed: March 5, 2021
    Publication date: May 11, 2023
    Inventors: Aranzazu BERECIARTUA-PEREZ, Artzai PICON RUIZ, Corinna Maria SPANGLER, Christian KLUKAS, Till EGGERS, Ramon NAVARRA-MESTRE, Jone ECHAZARRA HUGUET
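    A minimal sketch of the regression step described in 20230141945 above: a segmented (masked) plant-image is mapped to a single damage degree. The small CNN is a hypothetical stand-in for the trained network, and the segmenter output is simulated with a placeholder mask.
    ```python
    # Sketch (PyTorch): masked plant-image -> scalar damage degree by regression.
    import torch
    import torch.nn as nn

    damage_regressor = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, 1), nn.Sigmoid(),           # damage degree in [0, 1]
    )

    plant_image = torch.rand(1, 3, 224, 224)      # placeholder plant-image
    aerial_mask = torch.zeros(1, 1, 224, 224)     # placeholder segmenter output
    aerial_mask[..., 20:200, 40:180] = 1.0        # contiguous region: aerial part + margin

    segmented = plant_image * aerial_mask         # keep only the contour region
    with torch.no_grad():
        damage_degree = damage_regressor(segmented).item()
    print(f"estimated damage degree: {damage_degree:.2f}")
    ```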
  • Publication number: 20230100268
    Abstract: To quantify biotic damage in leaves of crop plants, a computer receives (701A) a leaf-image taken from a particular crop plant. The leaf-image shows at least one of the leaves of the particular crop plant. Using a first convolutional neural network (CNN, 262), the computer processes the leaf-image to derive a segmented leaf-image (422) being a contiguous set of pixels that show a main leaf of the particular plant completely. The first CNN has been trained by a plurality of leaf-annotated leaf-images (601A), wherein the leaf-images are annotated to identify main leaves (461). Using a second CNN (272), the computer processes the segmented single-leaf image by regression to obtain a damage degree (432).
    Type: Application
    Filed: March 15, 2021
    Publication date: March 30, 2023
    Inventors: Aranzazu BERECIARTUA-PEREZ, Artzai PICON RUIZ, Corinna Maria SPANGLER, Christian KLUKAS, Till EGGERS, Jone ECHAZARRA HUGUET, Ramon NAVARRA-MESTRE
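    A minimal sketch of the main-leaf selection idea in 20230100268 above: keep the largest connected region of a predicted leaf mask as the main leaf, then regress damage on that region alone. The probability mask is synthetic and stands in for the first CNN's output; the regression step would follow as in the sketch for 20230141945 above.
    ```python
    # Sketch (OpenCV/NumPy): largest connected component of a leaf mask = main leaf.
    import cv2
    import numpy as np

    leaf_prob = np.zeros((200, 200), dtype=np.float32)   # placeholder first-CNN output
    leaf_prob[30:170, 20:120] = 0.9                      # large blob: the main leaf
    leaf_prob[40:60, 150:190] = 0.8                      # small blob: a background leaf

    binary = (leaf_prob > 0.5).astype(np.uint8)
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    main = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))  # largest non-background label
    main_leaf_mask = (labels == main).astype(np.uint8)

    leaf_image = np.random.rand(200, 200, 3).astype(np.float32)  # placeholder leaf-image
    single_leaf = leaf_image * main_leaf_mask[..., None]         # segmented leaf-image
    # single_leaf would then go to the second CNN (272) for damage-degree regression.
    print("main leaf pixels:", int(main_leaf_mask.sum()))
    ```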
  • Publication number: 20230071265
    Abstract: A computer generates a training set with annotated images (473) to train a convolutional neural network (CNN). The computer receives leaf-images showing leaves and biological objects such as insects in a first color-coding (413-A), changes the color-coding of the pixels to a second color-coding and thereby enhances the contrast (413-C), assigns pixels in the second color-coding to binary values (413-D), differentiates areas with contiguous pixels in the first binary value into non-insect areas and insect areas by an area-size criterion (413-E), identifies pixel-coordinates of the insect areas with rectangular tile-areas (413-F), and annotates the leaf-images in the first color-coding by assigning the pixel-coordinates to corresponding tile-areas. The annotated images are then used to train the CNN for quantifying plant infestation by estimating the number of biological objects such as insects on the leaves of plants.
    Type: Application
    Filed: February 19, 2021
    Publication date: March 9, 2023
    Inventors: Aranzazu BERECIARTUA-PEREZ, Artzai PICON RUIZ, Aitor ALVAREZ GILA, Jone ECHAZARRA HUGUET, Till EGGERS, Christian KLUKAS, Ramon NAVARRA-MESTRE, Laura GOMEZ ZAMANILLO
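    A minimal sketch of the annotation pipeline in 20230071265 above: change the color-coding, binarize, separate insect from non-insect areas by an area-size criterion, and emit rectangular annotations. The HSV conversion, thresholds, and area bounds are illustrative assumptions, not the patented parameters.
    ```python
    # Sketch (OpenCV/NumPy): automatic insect annotations from a leaf image.
    import cv2
    import numpy as np

    leaf_bgr = np.zeros((200, 300, 3), dtype=np.uint8)
    leaf_bgr[:] = (60, 180, 90)                         # synthetic green leaf background
    leaf_bgr[50:58, 70:80] = (40, 40, 40)               # dark blob ~ insect
    leaf_bgr[120:124, 200:205] = (40, 40, 40)           # smaller blob ~ insect

    hsv = cv2.cvtColor(leaf_bgr, cv2.COLOR_BGR2HSV)     # second color-coding
    _, binary = cv2.threshold(hsv[:, :, 2], 100, 255, cv2.THRESH_BINARY_INV)  # dark pixels

    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    MIN_AREA, MAX_AREA = 10, 500                        # area-size criterion (hypothetical)
    annotations = []
    for i in range(1, num):                             # skip background label 0
        if MIN_AREA <= stats[i, cv2.CC_STAT_AREA] <= MAX_AREA:   # insect-sized area
            x, y, w, h = (int(stats[i, c]) for c in
                          (cv2.CC_STAT_LEFT, cv2.CC_STAT_TOP,
                           cv2.CC_STAT_WIDTH, cv2.CC_STAT_HEIGHT))
            annotations.append({"x": x, "y": y, "w": w, "h": h})

    print(annotations)                                  # rectangular tile-areas for training
    ```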
  • Publication number: 20230017425
    Abstract: A computer-implemented method, computer program product and computer system (100) for determining the impact of herbicides on crop plants (11) in an agricultural field (10). The system includes an interface (110) to receive an image (20) with at least one crop plant representing a real world situation in the agricultural field (10) after herbicide application. An image pre-processing module (120) rescales the received image (20) to a rescaled image (20a) matching the size of an input layer of a first fully convolutional neural network (CNN1) referred to as the first CNN. The first CNN is trained to segment the rescaled image (20a) into crop (11) and non-crop (12, 13) portions, and provides a first segmented output (20s1) indicating the crop portions (20c) of the rescaled image with pixels belonging to representations of crop.
    Type: Application
    Filed: November 24, 2020
    Publication date: January 19, 2023
    Inventors: Aranzazu Bereciartua-Perez, Artzai Picon Ruiz, Javier Romero Rodriguez, Juan Manuel Contreras Gallardo, Rainer Oberst, Hikal Khairy Shohdy Gad, Gerd Kraemer, Christian Klukas, Till Eggers, Jone Echazarra Huguet, Ramon Navarra-Mestre
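    A minimal sketch of the pre-processing and first segmentation stage described in 20230017425 above: rescale the field image to the input size of a fully convolutional network, then split pixels into crop and non-crop. The tiny FCN and the input size are hypothetical stand-ins for CNN1.
    ```python
    # Sketch (PyTorch): rescale to the FCN input size, segment crop vs. non-crop.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    INPUT_SIZE = (256, 256)                       # assumed size of CNN1's input layer
    cnn1 = nn.Sequential(                         # "first CNN": crop / non-crop segmenter
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 2, 1),                      # 2 logits per pixel: non-crop, crop
    )

    field_image = torch.rand(1, 3, 480, 640)      # placeholder post-application field image
    rescaled = F.interpolate(field_image, size=INPUT_SIZE,
                             mode="bilinear", align_corners=False)

    with torch.no_grad():
        logits = cnn1(rescaled)                   # first segmented output
        crop_mask = logits.argmax(dim=1) == 1     # pixels belonging to crop representations

    print(f"crop pixels: {int(crop_mask.sum())} / {crop_mask.numel()}")
    ```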
  • Publication number: 20220327815
    Abstract: A computer-implemented method, computer program product and computer system (100) for identifying weeds in a crop field using a dual-task convolutional neural network (120) having a topology with an intermediate module (121) that executes a classification task associated with a first loss function (LF1), and with a semantic segmentation module (122) that executes a segmentation task with a second, different loss function (LF2). The intermediate module and the segmentation module are trained together, taking into account the first and second loss functions (LF1, LF2).
    Type: Application
    Filed: September 3, 2020
    Publication date: October 13, 2022
    Inventors: Artzai PICON RUIZ, Miguel LINARES DE LA PUERTA, Christian KLUKAS, Till EGGERS, Rainer OBERST, Juan Manuel CONTRERAS GALLARDO, Javier ROMERO RODRIGUEZ, Hikal Khairy Shohdy GAD, Gerd KRAEMER, Jone ECHAZARRA HUGUET, Ramon NAVARRA-MESTRE, Miguel GONZALEZ SAN EMETERIO
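    A minimal sketch of the dual-task idea in 20220327815 above: a shared encoder feeds a classification head (loss LF1) and a segmentation head (loss LF2), and both losses are summed so the two modules are trained together. The architecture, loss choices, and equal weighting are illustrative assumptions.
    ```python
    # Sketch (PyTorch): joint training of a classification and a segmentation head.
    import torch
    import torch.nn as nn

    class DualTaskNet(nn.Module):
        def __init__(self, num_classes: int = 3):
            super().__init__()
            self.encoder = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
            self.classifier = nn.Sequential(       # intermediate module: classification
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, num_classes))
            self.segmenter = nn.Conv2d(16, 1, 1)   # segmentation module: weed mask logits

        def forward(self, x):
            feats = self.encoder(x)
            return self.classifier(feats), self.segmenter(feats)

    model = DualTaskNet()
    lf1, lf2 = nn.CrossEntropyLoss(), nn.BCEWithLogitsLoss()   # LF1, LF2 (assumed)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    images = torch.rand(4, 3, 64, 64)              # placeholder training batch
    class_labels = torch.randint(0, 3, (4,))
    weed_masks = torch.randint(0, 2, (4, 1, 64, 64)).float()

    class_logits, seg_logits = model(images)
    loss = lf1(class_logits, class_labels) + lf2(seg_logits, weed_masks)  # joint loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(f"combined loss: {loss.item():.3f}")
    ```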
  • Publication number: 20220230305
    Abstract: A computer-implemented method, computer program product and computer system (100) for detecting plant diseases. The system stores a convolutional neural network (120) trained with a multi-crop dataset. The convolutional neural network (120) has an extended topology comprising an image branch (121) based on a classification convolutional neural network for classifying the input images according to plant disease specific features, a crop identification branch (122) for adding plant species information, and a branch integrator for integrating the plant species information with each input image. The plant species information (20) specifies the crop on the respective input image (10). The system receives a test input comprising an image (10) of a particular crop (1) showing one or more particular plant disease symptoms, and further receives a respective crop identifier (20) associated with the test input via an interface (110).
    Type: Application
    Filed: May 14, 2020
    Publication date: July 21, 2022
    Inventors: Artzai PICON, Matthias NACHTMANN, Maximilian SEITZ, Patrick MOHNKE, Ramon NAVARRA-MESTRE, Alexander JOHANNES, Till EGGERS, Amaia Maria ORTIZ BARREDO, Aitor ALVAREZ-GILA, Jone ECHAZARRA HUGUET
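    A minimal sketch of the extended topology described in 20220230305 above: an image branch extracts disease features, a crop-identification branch embeds the crop identifier, and a branch integrator fuses both before classification. Branch sizes and the number of crops/diseases are illustrative assumptions.
    ```python
    # Sketch (PyTorch): image branch + crop-id branch + branch integrator.
    import torch
    import torch.nn as nn

    class CropAwareDiseaseNet(nn.Module):
        def __init__(self, num_crops: int = 5, num_diseases: int = 10):
            super().__init__()
            self.image_branch = nn.Sequential(            # classification CNN backbone
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten())
            self.crop_branch = nn.Embedding(num_crops, 8)      # plant species information
            self.integrator = nn.Linear(16 + 8, num_diseases)  # integrator + classifier

        def forward(self, image, crop_id):
            fused = torch.cat([self.image_branch(image), self.crop_branch(crop_id)], dim=1)
            return self.integrator(fused)

    model = CropAwareDiseaseNet()
    test_image = torch.rand(1, 3, 128, 128)            # image showing disease symptoms
    crop_id = torch.tensor([2])                        # crop identifier for that image
    print(model(test_image, crop_id).argmax(dim=1))    # predicted disease class
    ```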
  • Patent number: 11037291
    Abstract: A system (100), method and computer program product for determining plant diseases. The system includes an interface module (110) configured to receive an image (10) of a plant, the image (10) including a visual representation (11) of at least one plant element (1). A color normalization module (120) is configured to apply a color constancy method to the received image (10) to generate a color-normalized image. An extractor module (130) is configured to extract one or more image portions (11e) from the color-normalized image wherein the extracted image portions (11e) correspond to the at least one plant element (1).
    Type: Grant
    Filed: April 19, 2017
    Date of Patent: June 15, 2021
    Assignee: BASF SE
    Inventors: Alexander Johannes, Till Eggers, Artzai Picon, Aitor Alvarez-Gila, Amaya Maria Ortiz Barredo, Ana Maria Diez-Navajas
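    A minimal sketch of the normalization and extraction stages described in this patent (and in the related application 20200320682 below). Gray-world is used here as one common example of a color constancy method, and a green-hue threshold stands in for the extractor module; both choices are illustrative, not the claimed method.
    ```python
    # Sketch (NumPy/OpenCV): gray-world color normalization, then extract a plant portion.
    import cv2
    import numpy as np

    def gray_world(image: np.ndarray) -> np.ndarray:
        """Color-normalize by scaling each channel so its mean matches the global mean."""
        img = image.astype(np.float32)
        scale = img.mean() / img.mean(axis=(0, 1))       # per-channel gain
        return np.clip(img * scale, 0, 255).astype(np.uint8)

    plant_bgr = np.zeros((200, 300, 3), dtype=np.uint8)
    plant_bgr[:] = (120, 110, 100)                       # synthetic background
    plant_bgr[60:140, 80:220] = (60, 170, 70)            # greenish plant element

    normalized = gray_world(plant_bgr)                   # color-normalized image
    hsv = cv2.cvtColor(normalized, cv2.COLOR_BGR2HSV)
    plant_mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))  # green-ish pixels

    ys, xs = np.nonzero(plant_mask)
    if ys.size:                                          # extracted image portion (bounding box)
        portion = normalized[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
        print("extracted portion shape:", portion.shape)
    ```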
  • Publication number: 20200320682
    Abstract: A system (100), method and computer program product for determining plant diseases. The system includes an interface module (110) configured to receive an image (10) of a plant, the image (10) including a visual representation (11) of at least one plant element (1). A color normalization module (120) is configured to apply a color constancy method to the received image (10) to generate a color-normalized image. An extractor module (130) is configured to extract one or more image portions (11e) from the color-normalized image wherein the extracted image portions (11e) correspond to the at least one plant element (1).
    Type: Application
    Filed: April 19, 2017
    Publication date: October 8, 2020
    Inventors: Johannes ALEXANDER, Till EGGERS, Artzai PICON, Aitor ALVAREZ-GILA, Amaya Maria ORTIZ BARREDO, Ana Maria DIEZ-NAVAJAS