Patents by Inventor Jan Martin Lesniak

Jan Martin Lesniak has filed for patents to protect the following inventions. This listing includes published patent applications as well as patents granted by the United States Patent and Trademark Office (USPTO). Illustrative code sketches of the two method families described in the abstracts follow the listing.

  • Patent number: 11748981
    Abstract: A method for indicating how a cancer patient will respond to a predetermined therapy relies on spatial statistical analysis of classes of cell centers in a digital image of tissue of the cancer patient. The cell centers are detected in the image of stained tissue of the cancer patient. For each cell center, an image patch that includes the cell center is extracted from the image. A feature vector is generated based on each image patch using a convolutional neural network. A class is assigned to each cell center based on the feature vector associated with each cell center. A score is computed for the image of tissue by performing spatial statistical analysis based on classes of the cell centers. The score indicates how the cancer patient will respond to the predetermined therapy. The predetermined therapy is recommended to the patient if the score is larger than a predetermined threshold.
    Type: Grant
    Filed: April 27, 2022
    Date of Patent: September 5, 2023
    Assignee: AstraZeneca Computational Pathology GmbH
    Inventors: Guenter Schmidt, Nicolas Brieu, Ansh Kapil, Jan Martin Lesniak
  • Publication number: 20220254020
    Abstract: A method for indicating how a cancer patient will respond to a predetermined therapy relies on spatial statistical analysis of classes of cell centers in a digital image of tissue of the cancer patient. The cell centers are detected in the image of stained tissue of the cancer patient. For each cell center, an image patch that includes the cell center is extracted from the image. A feature vector is generated based on each image patch using a convolutional neural network. A class is assigned to each cell center based on the feature vector associated with each cell center. A score is computed for the image of tissue by performing spatial statistical analysis based on classes of the cell centers. The score indicates how the cancer patient will respond to the predetermined therapy. The predetermined therapy is recommended to the patient if the score is larger than a predetermined threshold.
    Type: Application
    Filed: April 27, 2022
    Publication date: August 11, 2022
    Inventors: Guenter Schmidt, Nicolas Brieu, Ansh Kapil, Jan Martin Lesniak
  • Patent number: 11348231
    Abstract: A method for indicating how a cancer patient will respond to a predetermined therapy relies on spatial statistical analysis of classes of cell centers in a digital image of tissue of the cancer patient. The cell centers are detected in the image of stained tissue of the cancer patient. For each cell center, an image patch that includes the cell center is extracted from the image. A feature vector is generated based on each image patch using a convolutional neural network. A class is assigned to each cell center based on the feature vector associated with each cell center. A score is computed for the image of tissue by performing spatial statistical analysis based on classes of the cell centers. The score indicates how the cancer patient will respond to the predetermined therapy. The predetermined therapy is recommended to the patient if the score is larger than a predetermined threshold.
    Type: Grant
    Filed: December 6, 2019
    Date of Patent: May 31, 2022
    Assignee: AstraZeneca Computational Pathology GmbH
    Inventors: Guenter Schmidt, Nicolas Brieu, Ansh Kapil, Jan Martin Lesniak
  • Publication number: 20200184641
    Abstract: A method for indicating how a cancer patient will respond to a predetermined therapy relies on spatial statistical analysis of classes of cell centers in a digital image of tissue of the cancer patient. The cell centers are detected in the image of stained tissue of the cancer patient. For each cell center, an image patch that includes the cell center is extracted from the image. A feature vector is generated based on each image patch using a convolutional neural network. A class is assigned to each cell center based on the feature vector associated with each cell center. A score is computed for the image of tissue by performing spatial statistical analysis based on classes of the cell centers. The score indicates how the cancer patient will respond to the predetermined therapy. The predetermined therapy is recommended to the patient if the score is larger than a predetermined threshold.
    Type: Application
    Filed: December 6, 2019
    Publication date: June 11, 2020
    Inventors: Guenter Schmidt, Nicolas Brieu, Ansh Kapil, Jan Martin Lesniak
  • Patent number: 10565479
    Abstract: A method for identifying blurred areas in digital images of stained tissue involves artificially blurring a learning tile and then training a pixel classifier to correctly classify each pixel as belonging either to the learning tile or to a blurred copy. A learning tile is first selected from a digital image of stained tissue. The learning tile is copied and blurred by applying a filter to each pixel. The pixel classifier is trained to correctly classify each pixel as belonging either to the learning tile or to the blurred, copied learning tile. The pixel classifier then classifies each pixel of the entire digital image as most likely resembling either the learning tile or the blurred learning tile. The digital image is segmented into blurred and unblurred areas based on the pixel classification. The blurred areas and the unblurred areas of the digital image are identified on a graphical user interface.
    Type: Grant
    Filed: October 4, 2019
    Date of Patent: February 18, 2020
    Assignee: Definiens GmbH
    Inventor: Jan Martin Lesniak
  • Publication number: 20200034651
    Abstract: A method for identifying blurred areas in digital images of stained tissue involves artificially blurring a learning tile and then training a pixel classifier to correctly classify each pixel as belonging either to the learning tile or to a blurred copy. A learning tile is first selected from a digital image of stained tissue. The learning tile is copied and blurred by applying a filter to each pixel. The pixel classifier is trained to correctly classify each pixel as belonging either to the learning tile or to the blurred, copied learning tile. The pixel classifier then classifies each pixel of the entire digital image as most likely resembling either the learning tile or the blurred learning tile. The digital image is segmented into blurred and unblurred areas based on the pixel classification. The blurred areas and the unblurred areas of the digital image are identified on a graphical user interface.
    Type: Application
    Filed: October 4, 2019
    Publication date: January 30, 2020
    Inventor: Jan Martin Lesniak
  • Patent number: 10438096
    Abstract: A method for identifying blurred areas in digital images of stained tissue involves artificially blurring a learning tile and then training a pixel classifier to correctly classify each pixel as belonging either to the learning tile or to a blurred copy. A learning tile is first selected from a digital image of stained tissue. The learning tile is copied and blurred by applying a filter to each pixel. The pixel classifier is trained to correctly classify each pixel as belonging either to the learning tile or to the blurred, copied learning tile. The pixel classifier then classifies each pixel of the entire digital image as most likely resembling either the learning tile or the blurred learning tile. The digital image is segmented into blurred and unblurred areas based on the pixel classification. The blurred areas and the unblurred areas of the digital image are identified on a graphical user interface.
    Type: Grant
    Filed: December 27, 2016
    Date of Patent: October 8, 2019
    Assignee: Definiens AG
    Inventor: Jan Martin Lesniak
  • Publication number: 20180182099
    Abstract: A method for identifying blurred areas in digital images of stained tissue involves artificially blurring a learning tile and then training a pixel classifier to correctly classify each pixel as belonging either to the learning tile or to a blurred copy. A learning tile is first selected from a digital image of stained tissue. The learning tile is copied and blurred by applying a filter to each pixel. The pixel classifier is trained to correctly classify each pixel as belonging either to the learning tile or to the blurred, copied learning tile. The pixel classifier then classifies each pixel of the entire digital image as most likely resembling either the learning tile or the blurred learning tile. The digital image is segmented into blurred and unblurred areas based on the pixel classification. The blurred areas and the unblurred areas of the digital image are identified on a graphical user interface.
    Type: Application
    Filed: December 27, 2016
    Publication date: June 28, 2018
    Inventor: Jan Martin Lesniak
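
The first four entries above share a single abstract describing one method family: detect cell centers in a stained-tissue image, extract a patch around each center, embed each patch with a convolutional neural network, assign each center a class from its feature vector, and compute a spatial-statistics score that is compared against a threshold. The sketch below is a minimal illustration of that kind of pipeline, not the claimed method. It assumes NumPy, PyTorch, and torchvision (0.13 or later) are available, and the patch size, the stand-in ResNet-18 backbone, the nearest-prototype classifier, the neighbourhood statistic, and the 0.5 threshold are all hypothetical placeholders.

```python
# Illustrative sketch only: patch extraction -> CNN embedding -> class assignment
# -> spatial statistic -> threshold. Every concrete choice here is a placeholder.
import numpy as np
import torch
import torchvision


def extract_patches(image: np.ndarray, centers: np.ndarray, size: int = 64) -> torch.Tensor:
    """Cut a size x size patch around every (row, col) cell center."""
    half = size // 2
    padded = np.pad(image, ((half, half), (half, half), (0, 0)), mode="reflect")
    patches = [padded[r:r + size, c:c + size] for r, c in centers]
    # Stack to an NCHW float tensor scaled to [0, 1].
    return torch.from_numpy(np.stack(patches)).permute(0, 3, 1, 2).float() / 255.0


def embed_patches(patches: torch.Tensor) -> torch.Tensor:
    """One feature vector per patch from a stand-in (untrained) CNN backbone."""
    backbone = torchvision.models.resnet18(weights=None)
    backbone.fc = torch.nn.Identity()  # keep the 512-d embedding
    backbone.eval()
    with torch.no_grad():
        return backbone(patches)


def classify(features: torch.Tensor, prototypes: torch.Tensor) -> np.ndarray:
    """Assign each cell center the class of its nearest prototype vector."""
    dists = torch.cdist(features, prototypes)  # (n_cells, n_classes)
    return dists.argmin(dim=1).numpy()


def spatial_score(centers: np.ndarray, classes: np.ndarray,
                  class_a: int = 0, class_b: int = 1, radius: float = 50.0) -> float:
    """Toy spatial statistic: fraction of class-A cells with a class-B cell within radius."""
    a, b = centers[classes == class_a], centers[classes == class_b]
    if len(a) == 0 or len(b) == 0:
        return 0.0
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return float((d.min(axis=1) < radius).mean())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.integers(0, 255, size=(512, 512, 3), dtype=np.uint8)  # stand-in tissue tile
    centers = rng.integers(64, 448, size=(40, 2))                     # stand-in cell centers
    feats = embed_patches(extract_patches(image, centers))
    prototypes = torch.randn(2, feats.shape[1])                       # stand-in class prototypes
    classes = classify(feats, prototypes)
    score = spatial_score(centers, classes)
    print("score:", score, "-> recommend therapy" if score > 0.5 else "-> do not recommend")
```

In practice the embedding network, the class prototypes, and the decision threshold would come from training data; the untrained backbone and random prototypes here only mark where those pieces would plug in.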
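The remaining entries share a second abstract describing a blur-detection method: copy a learning tile from the image, blur the copy, train a pixel classifier to separate original from blurred pixels, then classify every pixel of the whole image and segment it into blurred and unblurred areas. The sketch below illustrates that general idea under stated assumptions, not the patented procedure. It assumes NumPy, SciPy, and scikit-learn, and the hand-picked sharpness features, the random-forest classifier, the Gaussian blur strength, and the tile location are all hypothetical stand-ins.

```python
# Illustrative sketch only: train a per-pixel classifier on a learning tile versus
# an artificially blurred copy of it, then flag blur-like pixels in the full image.
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier


def pixel_features(gray: np.ndarray) -> np.ndarray:
    """Per-pixel sharpness features: |Laplacian|, gradient magnitude, local std. dev."""
    lap = np.abs(ndimage.laplace(gray))
    grad = np.hypot(ndimage.sobel(gray, axis=0), ndimage.sobel(gray, axis=1))
    local_mean = ndimage.uniform_filter(gray, size=5)
    local_sq_mean = ndimage.uniform_filter(gray ** 2, size=5)
    local_std = np.sqrt(np.maximum(local_sq_mean - local_mean ** 2, 0.0))
    return np.stack([lap, grad, local_std], axis=-1).reshape(-1, 3)


def train_blur_classifier(tile: np.ndarray, sigma: float = 2.0) -> RandomForestClassifier:
    """Train on the tile's own pixels (label 0) versus a blurred copy of them (label 1)."""
    blurred = ndimage.gaussian_filter(tile, sigma=sigma)
    X = np.vstack([pixel_features(tile), pixel_features(blurred)])
    y = np.concatenate([np.zeros(tile.size), np.ones(blurred.size)])
    clf = RandomForestClassifier(n_estimators=30, random_state=0)
    clf.fit(X, y)
    return clf


def blur_mask(image: np.ndarray, clf: RandomForestClassifier) -> np.ndarray:
    """Classify every pixel of the full image; True where it resembles the blurred copy."""
    labels = clf.predict(pixel_features(image))
    return labels.reshape(image.shape).astype(bool)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.random((256, 256))                                    # stand-in grayscale image
    image[:, 128:] = ndimage.gaussian_filter(image[:, 128:], 3.0)     # make the right half blurry
    tile = image[:64, :64]                                            # learning tile from a sharp area
    clf = train_blur_classifier(tile)
    mask = blur_mask(image, clf)
    print(f"{mask.mean():.0%} of pixels flagged as blurred")
```

Training on a tile against its own artificially blurred copy means no manually annotated blur examples are needed, which is the core idea the abstracts describe; the segmented blurred and unblurred areas would then be shown on a graphical user interface.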