Patents by Inventor Lily Peng

Lily Peng has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11983912
    Abstract: A method for training a pattern recognizer to identify regions of interest in unstained images of tissue samples is provided. Pairs of images of tissue samples are obtained, each pair including an unstained image of a given tissue sample and a stained image of the given tissue sample. An annotation (e.g., drawing operation) is then performed on the stained image to indicate a region of interest. The annotation information, in the form of a mask surrounding the region of interest, is then applied to the corresponding unstained image. The unstained image and mask are then supplied to train a pattern recognizer. The trained pattern recognizer can then be used to identify regions of interest within novel unstained images.
    Type: Grant
    Filed: September 7, 2018
    Date of Patent: May 14, 2024
    Assignee: VERILY LIFE SCIENCES LLC
    Inventors: Martin Stumpe, Lily Peng
  • Publication number: 20230419694
    Abstract: A machine learning predictor model is trained to generate a prediction of the appearance of a tissue sample stained with a special stain, such as an IHC stain, from an input image that is either unstained or stained with H&E. Training data takes the form of thousands of pairs of precisely aligned images, one of which is an image of a tissue specimen stained with H&E or unstained, and the other of which is an image of the tissue specimen stained with the special stain. The model can be trained to predict special stain images for a multitude of different tissue types and special stain types. In use, an input image, e.g., an H&E image of a given tissue specimen at a particular magnification level, is provided to the model, and the model generates a prediction of the appearance of the tissue specimen as if it were stained with the special stain. The predicted image is provided to a user and displayed, e.g., on a pathology workstation.
    Type: Application
    Filed: September 7, 2023
    Publication date: December 28, 2023
    Applicant: Verily Life Sciences LLC
    Inventors: Martin Stumpe, Philip Nelson, Lily Peng
  • Patent number: 11783603
    Abstract: A machine learning predictor model is trained to generate a prediction of the appearance of a tissue sample stained with a special stain, such as an IHC stain, from an input image that is either unstained or stained with H&E. Training data takes the form of thousands of pairs of precisely aligned images, one of which is an image of a tissue specimen stained with H&E or unstained, and the other of which is an image of the tissue specimen stained with the special stain. The model can be trained to predict special stain images for a multitude of different tissue types and special stain types. In use, an input image, e.g., an H&E image of a given tissue specimen at a particular magnification level, is provided to the model, and the model generates a prediction of the appearance of the tissue specimen as if it were stained with the special stain. The predicted image is provided to a user and displayed, e.g., on a pathology workstation.
    Type: Grant
    Filed: March 7, 2018
    Date of Patent: October 10, 2023
    Assignee: VERILY LIFE SCIENCES LLC
    Inventors: Martin Stumpe, Philip Nelson, Lily Peng
  • Patent number: 11783604
    Abstract: A method for generating a ground truth mask for a microscope slide having a tissue specimen placed thereon includes a step of staining the tissue specimen with hematoxylin and eosin (H&E) staining agents. A first magnified image of the H&E stained tissue specimen is obtained, e.g., with a whole slide scanner. The H&E staining agents are then washed from the tissue specimen. A second, different stain is applied to the tissue specimen, e.g., a special stain such as an IHC stain. A second magnified image of the tissue specimen stained with the second, different stain is obtained. The first and second magnified images are then registered to each other. An annotation (e.g., drawing operation) is then performed on either the first or the second magnified image so as to form a ground truth mask, the ground truth mask taking the form of a closed polygon region enclosing tumor cells present in either the first or second magnified image.
    Type: Grant
    Filed: January 11, 2018
    Date of Patent: October 10, 2023
    Assignee: Google LLC
    Inventors: Lily Peng, Martin Stumpe
  • Patent number: 11379516
    Abstract: A system for searching for similar medical images includes a reference library in the form of a multitude of medical images, at least some of which are associated with metadata including clinical information relating to the specimen or patient associated with the medical images. A computer system is configured as a search tool for receiving an input image query from a user. The computer system is trained to find one or more medical images in the reference library that are similar to the input image. The reference library is represented as an embedding of each of the medical images projected in a feature space having a plurality of axes, wherein the embedding is characterized by two aspects of a similarity ranking: (1) visual similarity, and (2) semantic similarity, such that neighboring images in the feature space are visually similar and semantic information is represented by the axes of the feature space.
    Type: Grant
    Filed: March 29, 2018
    Date of Patent: July 5, 2022
    Assignee: Google LLC
    Inventors: Lily Peng, Martin Stumpe, Daniel Smilkov, Jason Hipp
  • Publication number: 20220148169
    Abstract: One example method for AI prediction of prostate cancer outcomes involves receiving an image of prostate tissue; assigning Gleason pattern values to one or more regions within the image using an artificial intelligence Gleason grading model, the model trained to identify Gleason patterns on a patch-by-patch basis in a prostate tissue image; determining relative areal proportions of the Gleason patterns within the image; assigning at least one of a risk score or risk group value to the image based on the determined relative areal proportions; and outputting at least one of the risk score or the risk group value.
    Type: Application
    Filed: November 8, 2021
    Publication date: May 12, 2022
    Applicant: Verily Life Sciences LLC
    Inventors: Craig Mermel, Yun Liu, Naren Manoj, Matthew Symonds, Martin Stumpe, Lily Peng, Kunal Nagpal, Ellery Wulczyn, Davis Foote, David F. Steiner, Po-Hsuan Cameron Chen
  • Patent number: 11170897
    Abstract: A method, system, and machine for assisting a pathologist in identifying the presence of tumor cells in lymph node tissue are disclosed. A digital image of lymph node tissue at a first magnification (e.g., 40×) is subdivided into a multitude of rectangular “patches.” A likelihood-of-malignancy score is then determined for each of the patches. The score is obtained by analyzing pixel data from the patch (e.g., pixel data centered on and including the patch) using a computer system programmed as an ensemble of deep neural network pattern recognizers, each operating on a different magnification level of the patch. A representation or “heatmap” of the slide is generated.
    Type: Grant
    Filed: February 23, 2017
    Date of Patent: November 9, 2021
    Assignee: Google LLC
    Inventors: Martin Christian Stumpe, Lily Peng, Yun Liu, Krishna K. Gadepalli, Timo Kohlberger
  • Publication number: 20210064845
    Abstract: A method for training a pattern recognizer to identify regions of interest in unstained images of tissue samples is provided. Pairs of images of tissue samples are obtained, each pair including an unstained image of a given tissue sample and a stained image of the given tissue sample. An annotation (e.g., drawing operation) is then performed on the stained image to indicate a region of interest. The annotation information, in the form of a mask surrounding the region of interest, is then applied to the corresponding unstained image. The unstained image and mask are then supplied to train a pattern recognizer. The trained pattern recognizer can then be used to identify regions of interest within novel unstained images.
    Type: Application
    Filed: September 7, 2018
    Publication date: March 4, 2021
    Inventors: Martin Stumpe, Lily Peng
  • Publication number: 20210019342
    Abstract: A system for searching for similar medical images includes a reference library in the form of a multitude of medical images, at least some of which are associated with metadata including clinical information relating to the specimen or patient associated with the medical images. A computer system is configured as a search tool for receiving an input image query from a user. The computer system is trained to find one or more medical images in the reference library that are similar to the input image. The reference library is represented as an embedding of each of the medical images projected in a feature space having a plurality of axes, wherein the embedding is characterized by two aspects of a similarity ranking: (1) visual similarity, and (2) semantic similarity, such that neighboring images in the feature space are visually similar and semantic information is represented by the axes of the feature space.
    Type: Application
    Filed: March 29, 2018
    Publication date: January 21, 2021
    Inventors: Lily Peng, Martin Stumpe, Daniel Smilkov, Jason Hipp
  • Publication number: 20200394825
    Abstract: A machine learning predictor model is trained to generate a prediction of the appearance of a tissue sample stained with a special stain, such as an IHC stain, from an input image that is either unstained or stained with H&E. Training data takes the form of thousands of pairs of precisely aligned images, one of which is an image of a tissue specimen stained with H&E or unstained, and the other of which is an image of the tissue specimen stained with the special stain. The model can be trained to predict special stain images for a multitude of different tissue types and special stain types. In use, an input image, e.g., an H&E image of a given tissue specimen at a particular magnification level, is provided to the model, and the model generates a prediction of the appearance of the tissue specimen as if it were stained with the special stain. The predicted image is provided to a user and displayed, e.g., on a pathology workstation.
    Type: Application
    Filed: March 7, 2018
    Publication date: December 17, 2020
    Inventors: Martin Stumpe, Philip Nelson, Lily Peng
  • Publication number: 20200372235
    Abstract: A method for generating a ground truth mask for a microscope slide having a tissue specimen placed thereon includes a step of staining the tissue specimen with hematoxylin and eosin (H&E) staining agents. A first magnified image of the H&E stained tissue specimen is obtained, e.g., with a whole slide scanner. The H&E staining agents are then washed from the tissue specimen. A second, different stain is applied to the tissue specimen, e.g., a special stain such as an IHC stain. A second magnified image of the tissue specimen stained with the second, different stain is obtained. The first and second magnified images are then registered to each other. An annotation (e.g., drawing operation) is then performed on either the first or the second magnified image so as to form a ground truth mask, the ground truth mask taking the form of a closed polygon region enclosing tumor cells present in either the first or second magnified image.
    Type: Application
    Filed: January 11, 2018
    Publication date: November 26, 2020
    Inventors: Lily Peng, Martin Stumpe
  • Publication number: 20200066407
    Abstract: A method, system, and machine for assisting a pathologist in identifying the presence of tumor cells in lymph node tissue are disclosed. A digital image of lymph node tissue at a first magnification (e.g., 40×) is subdivided into a multitude of rectangular “patches.” A likelihood-of-malignancy score is then determined for each of the patches. The score is obtained by analyzing pixel data from the patch (e.g., pixel data centered on and including the patch) using a computer system programmed as an ensemble of deep neural network pattern recognizers, each operating on a different magnification level of the patch. A representation or “heatmap” of the slide is generated.
    Type: Application
    Filed: February 23, 2017
    Publication date: February 27, 2020
    Inventors: Martin Christian Stumpe, Lily Peng, Yun Liu, Krishna K. Gadepalli, Timo Kohlberger
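
The areal-proportion step described in the abstract for publication 20220148169 (determining the relative areal proportions of Gleason patterns across patches, then deriving a risk score) can be sketched in Python as follows. This is a minimal illustration only: the function names, the per-pattern weights, and the per-patch encoding are hypothetical and are not taken from the patent, which derives its risk score from a trained model.

```python
from collections import Counter

def relative_areal_proportions(patch_patterns):
    """Share of graded patches assigned to each Gleason pattern.

    patch_patterns: one entry per patch; None marks benign/background
    patches excluded from the proportions.
    """
    counts = Counter(p for p in patch_patterns if p is not None)
    total = sum(counts.values())
    return {pattern: n / total for pattern, n in counts.items()}

def risk_score(proportions, weights=None):
    """Illustrative weighted risk score (weights are made up)."""
    weights = weights or {3: 1.0, 4: 2.0, 5: 3.0}
    return sum(weights.get(p, 0.0) * share for p, share in proportions.items())

# Hypothetical per-patch Gleason patterns from a grading model.
patches = [3, 3, 4, 4, 4, 5, None, None]
props = relative_areal_proportions(patches)   # {3: 1/3, 4: 1/2, 5: 1/6}
score = risk_score(props)                     # 11/6
```

A risk group value could then be assigned by thresholding the score, as the abstract describes.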
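
Similarly, the patch-subdivision and heatmap assembly described in the abstract for patent 11170897 might look like the sketch below. The scoring function here is a trivial stand-in for the ensemble of deep neural network pattern recognizers in the patent; all names and the patch size are illustrative assumptions.

```python
import numpy as np

def tile(slide, size):
    """Subdivide a slide array (H, W, 3) into non-overlapping square patches."""
    rows, cols = slide.shape[0] // size, slide.shape[1] // size
    return [[slide[i * size:(i + 1) * size, j * size:(j + 1) * size]
             for j in range(cols)]
            for i in range(rows)]

def heatmap(slide, score_fn, size=128):
    """Score every patch and assemble the scores into a 2-D heatmap."""
    return np.array([[score_fn(patch) for patch in row]
                     for row in tile(slide, size)])

# Stand-in scorer: mean darkness as a proxy for a per-patch
# likelihood-of-malignancy score.
demo = np.zeros((256, 256, 3), dtype=np.uint8)
hm = heatmap(demo, lambda p: 1.0 - p.mean() / 255.0, size=128)
# hm is a 2x2 grid of patch scores
```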