Patents by Inventor Christian KLUKAS

Christian KLUKAS has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240021273
    Abstract: Computer-implemented method for providing training data for a machine learning algorithm for image-based classification of a pathogen infestation of a plant, comprising the steps of: providing image data of a plant or a plant part infested with a pathogen; providing genetic result data of the plant or plant part to which the image data refers, comprising at least information about the type of pathogen; and labeling the image data with the genetic result data.
    Type: Application
    Filed: November 12, 2021
    Publication date: January 18, 2024
    Inventors: Isabella SIEPE, Kristina BUSCH, Till EGGERS, Ramon NAVARRA-MESTRE, Egon HADEN, Jessica ARNHOLD, Sebastian FISCHER, Andres MARTIN PALMA, Christian KLUKAS, Swetlana FRIEDEL, Bastian STUERMER-STEPHAN, Stefan HAHN, Stefan TRESCH
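The labeling step in the entry above pairs each image with the pathogen type confirmed by a genetic assay of the same plant or plant part. Below is a minimal Python sketch of that pairing; the record layouts and field names (ImageRecord, GeneticResult, plant_id) are illustrative assumptions, not anything specified in the application.

```python
# Minimal sketch (assumed data layout): pair plant images with genetic assay
# results so that the pathogen type found by the assay becomes the image label.
from dataclasses import dataclass

@dataclass
class ImageRecord:
    plant_id: str        # identifies the plant or plant part shown
    path: str            # location of the image file

@dataclass
class GeneticResult:
    plant_id: str        # same identifier used when the sample was assayed
    pathogen: str        # pathogen type determined by the genetic assay

def label_images(images, results):
    """Attach the genetically confirmed pathogen type to each image."""
    by_plant = {r.plant_id: r.pathogen for r in results}
    labeled = []
    for img in images:
        pathogen = by_plant.get(img.plant_id)
        if pathogen is not None:          # skip images without a confirmed result
            labeled.append((img.path, pathogen))
    return labeled

# The resulting (path, label) pairs can then serve as training data for an
# image classifier (hypothetical example values).
training_set = label_images(
    [ImageRecord("plant-7", "plant-7_leaf.png")],
    [GeneticResult("plant-7", "Zymoseptoria tritici")],
)
```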
  • Publication number: 20230230373
    Abstract: Computer-implemented method and system (100) for estimating vegetation coverage in a real-world environment. The system receives an RGB image (91) of a real-world scenery (1) with one or more plant elements (10) of one or more plant species. At least one channel of the RGB image (91) is provided to a semantic regression neural network (120) which is trained to estimate at least a near-infrared channel (NIR) from the RGB image. The system obtains an estimate of the near-infrared channel (NIR) by applying the semantic regression neural network (120) to the at least one RGB channel (91). A multi-channel image (92), comprising at least one of the R-, G-, B-channels (R, G, B) of the RGB image and the estimated near-infrared channel (NIR), is provided as test input (TI1) to a semantic segmentation neural network (130) trained with multi-channel images to segment the test input (TI1) into pixels associated with plant elements and pixels not associated with plant elements.
    Type: Application
    Filed: May 7, 2021
    Publication date: July 20, 2023
    Inventors: Artzai PICON RUIZ, Miguel GONZALEZ SAN EMETERIO, Aranzazu BERECIARTUA-PEREZ, Laura GOMEZ ZAMANILLO, Carlos Javier JIMENEZ RUIZ, Javier ROMERO RODRIGUEZ, Christian KLUKAS, Till EGGERS, Jone ECHAZARRA HUGUET, Ramon NAVARRA-MESTRE
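A minimal PyTorch sketch of the two-stage pipeline described in publication 20230230373 above: a regression network estimates an NIR channel from the RGB channels, the estimate is stacked onto the RGB image, and a segmentation network labels plant versus non-plant pixels. The layer choices, sizes, and class names (NIRRegressor, PlantSegmenter) are assumptions for illustration, not the networks claimed in the application.

```python
import torch
import torch.nn as nn

class NIRRegressor(nn.Module):
    """Per-pixel regression from 3 RGB channels to 1 estimated NIR channel."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=3, padding=1),
        )
    def forward(self, rgb):
        return self.net(rgb)

class PlantSegmenter(nn.Module):
    """Binary segmentation over a 4-channel (R, G, B, estimated NIR) input."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 2, kernel_size=1),   # logits: plant / not plant
        )
    def forward(self, multi_channel):
        return self.net(multi_channel)

rgb = torch.rand(1, 3, 256, 256)               # one RGB test image
nir = NIRRegressor()(rgb)                      # estimated NIR channel
test_input = torch.cat([rgb, nir], dim=1)      # 4-channel multi-channel image
mask_logits = PlantSegmenter()(test_input)     # per-pixel plant/non-plant scores
```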
  • Publication number: 20230141945
    Abstract: In performing a computer-implemented method to quantify biotic damage in leaves of crop-plants, the computer receives a plant-image (410) showing a crop-plant, showing the aerial part of the plant with stem, branches, and leaves, and showing the ground on which the plant is placed. A segmenter module obtains a segmented plant-image, a contiguous set of pixels that shows the contour (460A) of the aerial part, the contour (460A) having a margin region (458) that partially shows the ground. The computer uses a convolutional neural network that processes the segmented plant-image by regression to obtain a damage degree, the convolutional neural network having been trained by processing damage-annotated segmented plant-images.
    Type: Application
    Filed: March 5, 2021
    Publication date: May 11, 2023
    Inventors: Aranzazu BERECIARTUA-PEREZ, Artzai PICON RUIZ, Corinna Maria SPANGLER, Christian KLUKAS, Till EGGERS, Ramon NAVARRA-MESTRE, Jone ECHAZARRA HUGUET
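A conceptual sketch of the pipeline in publication 20230141945 above: segment the aerial part of the plant, then estimate a scalar damage degree by regression. The placeholder segmenter and the DamageRegressor architecture are assumptions for illustration; the application does not specify these details.

```python
import torch
import torch.nn as nn

class DamageRegressor(nn.Module):
    """Maps a segmented plant-image to a single damage-degree value."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(8, 1)
    def forward(self, x):
        return self.head(self.features(x).flatten(1))   # e.g. damage in percent

def quantify_damage(plant_image, segmenter, regressor):
    """Segment the aerial part of the plant, then regress a damage degree."""
    mask = segmenter(plant_image)       # 1 where the aerial part is, 0 elsewhere
    segmented = plant_image * mask      # keep plant pixels, zero out the rest
    return regressor(segmented)

# Placeholder segmenter: a brightness threshold standing in for the trained module.
segmenter = lambda img: (img.mean(dim=1, keepdim=True) > 0.5).float()
plant_image = torch.rand(1, 3, 224, 224)
damage = quantify_damage(plant_image, segmenter, DamageRegressor())
```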
  • Publication number: 20230100268
    Abstract: To quantify biotic damage in leaves of crop plants, a computer receives (701A) a leaf-image taken from a particular crop plant. The leaf-image shows at least one of the leaves of the particular crop plant. Using a first convolutional neural network (CNN, 262), the computer processes the leaf-image to derive a segmented leaf-image (422) being a contiguous set of pixels that show a main leaf of the particular plant completely. The first CNN has been trained by a plurality of leaf-annotated leaf-images (601A), wherein the leaf-images are annotated to identify main leaves (461). Using a second CNN (272), the computer processes the single-leaf-image by regression to obtain a damage degree (432).
    Type: Application
    Filed: March 15, 2021
    Publication date: March 30, 2023
    Inventors: Aranzazu BERECIARTUA-PEREZ, Artzai PICON RUIZ, Corinna Maria SPANGLER, Christian KLUKAS, Till EGGERS, Jone ECHAZARRA HUGUET, Ramon NAVARRA-MESTRE
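A minimal sketch of the two-CNN arrangement in publication 20230100268 above, shown as a single training step: the first CNN yields a mask of the main leaf, and a second CNN is trained by regression against annotated damage degrees. The placeholder architectures, the MSE loss, and masking by multiplication are illustrative assumptions, not the models described in the application.

```python
import torch
import torch.nn as nn

first_cnn = nn.Sequential(                      # main-leaf segmentation (placeholder)
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(), nn.Conv2d(8, 1, 1), nn.Sigmoid()
)
second_cnn = nn.Sequential(                     # damage-degree regression (placeholder)
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 1)
)

optimizer = torch.optim.Adam(second_cnn.parameters(), lr=1e-3)
mse = nn.MSELoss()

leaf_image = torch.rand(4, 3, 224, 224)         # batch of leaf-images
damage_label = torch.rand(4, 1)                 # annotated damage degrees (0..1)

mask = first_cnn(leaf_image)                    # segmented main leaf
prediction = second_cnn(leaf_image * mask)      # regression on the masked leaf
loss = mse(prediction, damage_label)            # only the regressor is updated here
loss.backward()
optimizer.step()
```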
  • Publication number: 20230071265
    Abstract: A computer generates a training set with annotated images (473) to train a convolutional neural network (CNN). The computer receives leaf-images showing leaves and biological objects such as insects in a first color-coding (413-A), changes the color-coding of the pixels to a second color-coding and thereby enhances the contrast (413-C), assigns pixels in the second color-coding to binary values (413-D), differentiates areas of contiguous pixels having the first binary value into non-insect areas and insect areas by an area-size criterion (413-E), identifies pixel-coordinates of the insect areas with rectangular tile-areas (413-F), and annotates the leaf-images in the first color-coding by assigning the pixel-coordinates to corresponding tile-areas. The annotated images are then used to train the CNN to quantify plant infestation by estimating the number of biological objects such as insects on the leaves of plants.
    Type: Application
    Filed: February 19, 2021
    Publication date: March 9, 2023
    Inventors: Aranzazu BERECIARTUA-PEREZ, Artzai PICON RUIZ, Aitor ALVAREZ GILA, Jone ECHAZARRA HUGUET, Till EGGERS, Christian KLUKAS, Ramon NAVARRA-MESTRE, Laura GOMEZ ZAMANILLO
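A sketch of the annotation pipeline in publication 20230071265 above using OpenCV: switch to a second color-coding to enhance contrast, binarize, keep connected areas whose size fits an insect, and record their bounding tiles. The LAB color space, the Otsu threshold, and the area limits are assumptions standing in for the steps labeled 413-C through 413-F in the abstract.

```python
import cv2
import numpy as np

def annotate_insects(image_bgr, min_area=20, max_area=2000):
    """Return bounding tiles (x, y, w, h) of insect-sized dark areas on a leaf."""
    lab = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB)         # second color-coding
    lightness = lab[:, :, 0]                                  # contrast-enhancing channel
    _, binary = cv2.threshold(lightness, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    tiles = []
    for i in range(1, n):                                     # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if min_area <= area <= max_area:                      # area-size criterion
            x, y = stats[i, cv2.CC_STAT_LEFT], stats[i, cv2.CC_STAT_TOP]
            w, h = stats[i, cv2.CC_STAT_WIDTH], stats[i, cv2.CC_STAT_HEIGHT]
            tiles.append((x, y, w, h))
    return tiles

# The tiles can be stored as annotations on the original image and used to
# train a detector that counts insects on the leaves (stand-in image below).
image = np.full((480, 640, 3), 200, dtype=np.uint8)
print(annotate_insects(image))
```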
  • Publication number: 20220327815
    Abstract: A computer-implemented method, computer program product, and computer system (100) for identifying weeds in a crop field using a dual-task convolutional neural network (120) having a topology with an intermediate module (121) to execute a classification task associated with a first loss function (LF1), and with a semantic segmentation module (122) to execute a segmentation task with a second, different loss function (LF2). The intermediate module and the segmentation module are trained together, taking into account the first and second loss functions (LF1, LF2).
    Type: Application
    Filed: September 3, 2020
    Publication date: October 13, 2022
    Inventors: Artzai PICON RUIZ, Miguel LINARES DE LA PUERTA, Christian KLUKAS, Till EGGERS, Rainer OBERST, Juan Manuel CONTRERAS GALLARDO, Javier ROMERO RODRIGUEZ, Hikal Khairy Shohdy GAD, Gerd KRAEMER, Jone ECHAZARRA HUGUET, Ramon NAVARRA-MESTRE, Miguel GONZALEZ SAN EMETERIO
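A minimal PyTorch sketch of the dual-task topology in publication 20220327815 above: a shared backbone feeds a classification head (loss LF1) and a segmentation head (loss LF2), and one combined loss trains both modules together. The layer sizes, the cross-entropy losses, and the unweighted sum of the two losses are assumptions for illustration, not the specific topology claimed in the application.

```python
import torch
import torch.nn as nn

class DualTaskNet(nn.Module):
    def __init__(self, num_classes=3):
        super().__init__()
        self.backbone = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.classifier = nn.Sequential(                      # intermediate module
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, num_classes))
        self.segmenter = nn.Conv2d(16, 2, 1)                  # crop vs. weed pixels
    def forward(self, x):
        feats = self.backbone(x)
        return self.classifier(feats), self.segmenter(feats)

model = DualTaskNet()
lf1, lf2 = nn.CrossEntropyLoss(), nn.CrossEntropyLoss()      # LF1 and LF2

images = torch.rand(2, 3, 128, 128)
class_labels = torch.tensor([0, 2])                           # image-level labels
pixel_labels = torch.zeros(2, 128, 128, dtype=torch.long)     # per-pixel labels

class_logits, seg_logits = model(images)
loss = lf1(class_logits, class_labels) + lf2(seg_logits, pixel_labels)
loss.backward()                                               # one step trains both tasks
```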