Patents by Inventor Ingrid Carlbom

Ingrid Carlbom has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11514569
    Abstract: The method according to the invention utilizes a color decomposition of histological tissue image data to derive a density map. The density map corresponds to the portion of the image data that contains the stain/tissue combination corresponding to the stroma, and at least one gland is extracted from said density map. The glands are obtained by a combination of a mask and a seed for each gland derived by adaptive morphological operations, and the seed is grown to the boundaries of the mask. The method may also derive an epithelial density map used to remove small objects not corresponding to epithelial tissue. The epithelial density map may further be utilized to improve the identification of glandular regions in the stromal density map. The segmented gland is extracted from the tissue data utilizing the grown seed as a mask. The gland is then classified according to its associated features.
    Type: Grant
    Filed: March 28, 2018
    Date of Patent: November 29, 2022
    Assignee: CADESS.AI AB
    Inventors: Christophe Avenel, Ingrid Carlbom
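The seed-and-mask step described in this abstract — a liberal mask covering candidate gland regions, a conservative seed marking gland cores, and the seed grown to the mask boundary — corresponds to classical morphological reconstruction. The following is a minimal sketch of that one step, not the patented method: the synthetic density map, the fixed thresholds, and the erosion-based seed are illustrative stand-ins for the color decomposition and adaptive operations the patent describes.

```python
import numpy as np
from scipy import ndimage

# Hypothetical density map: two bright blobs on a dark background,
# standing in for a stromal stain-density map from color decomposition.
density = np.zeros((40, 40))
density[5:15, 5:15] = 1.0    # gland A
density[25:35, 20:38] = 0.8  # gland B

# Mask: liberal threshold covering candidate gland regions.
mask = density > 0.3
# Seed: conservative marker of gland cores (here, an erosion of the mask).
seed = ndimage.binary_erosion(mask, iterations=2)

# Grow each seed to the boundary of its mask (morphological reconstruction).
glands = ndimage.binary_propagation(seed, mask=mask)

# Label connected components to extract individual glands for classification.
labels, n = ndimage.label(glands)
print(n)  # 2 glands recovered
```

`binary_propagation` dilates the seed repeatedly while clamping to the mask, so each gland is recovered exactly up to its mask boundary while spurious mask regions with no seed are discarded.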
  • Patent number: 11488401
    Abstract: The method of the present invention classifies the nuclei in prostate tissue images with a trained deep learning network and uses said nuclear classification to classify regions, such as glandular regions, according to their malignancy grade. The method according to the present disclosure also trains a deep learning network to identify the category of each nucleus in prostate tissue image data, said category representing the malignancy grade of the tissue surrounding the nuclei. The method of the present disclosure automatically segments the glands and identifies the nuclei in a prostate tissue data set. Said segmented glands are assigned a category by at least one domain expert, and said category is then used to automatically assign a category to each nucleus corresponding to the category of said nucleus' surrounding tissue. A multitude of windows, each said window surrounding a nucleus, comprises the training data for the deep learning network.
    Type: Grant
    Filed: November 28, 2018
    Date of Patent: November 1, 2022
    Assignee: CADESS.AI AB
    Inventors: Christophe Avenel, Ingrid Carlbom
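The training-data construction in this abstract — a window around each nucleus, labeled with the expert-assigned grade of the surrounding gland — can be sketched as follows. This is an illustrative outline only: the random image, the nucleus coordinates, the `grade_map` array, and the window half-width are hypothetical stand-ins, and the deep learning network itself is omitted.

```python
import numpy as np

# Hypothetical inputs: a tissue image, detected nucleus centroids, and a
# label map in which each pixel carries the grade of its surrounding gland.
rng = np.random.default_rng(0)
image = rng.random((100, 100, 3))            # stand-in for a stained tissue image
nuclei = [(20, 30), (50, 50), (80, 10)]      # nucleus centers as (row, col)
grade_map = np.zeros((100, 100), dtype=int)  # 0 = benign everywhere ...
grade_map[40:100, :] = 3                     # ... except one higher-grade region

HALF = 8  # half-width of the training window around each nucleus

patches, labels = [], []
for r, c in nuclei:
    window = image[r - HALF:r + HALF, c - HALF:c + HALF]
    if window.shape[:2] == (2 * HALF, 2 * HALF):  # skip nuclei too near the border
        patches.append(window)
        # Each nucleus inherits the category of its surrounding tissue.
        labels.append(int(grade_map[r, c]))

print(len(patches), labels)  # 3 [0, 3, 3]
```

The resulting `(patches, labels)` pairs are the kind of per-nucleus windows the abstract describes feeding to a deep learning classifier.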
  • Publication number: 20200293748
    Abstract: The method of the present invention classifies the nuclei in prostate tissue images with a trained deep learning network and uses said nuclear classification to classify regions, such as glandular regions, according to their malignancy grade. The method according to the present disclosure also trains a deep learning network to identify the category of each nucleus in prostate tissue image data, said category representing the malignancy grade of the tissue surrounding the nuclei. The method of the present disclosure automatically segments the glands and identifies the nuclei in a prostate tissue data set. Said segmented glands are assigned a category by at least one domain expert, and said category is then used to automatically assign a category to each nucleus corresponding to the category of said nucleus' surrounding tissue. A multitude of windows, each said window surrounding a nucleus, comprises the training data for the deep learning network.
    Type: Application
    Filed: November 28, 2018
    Publication date: September 17, 2020
    Inventors: Christophe AVENEL, Ingrid CARLBOM
  • Publication number: 20200134817
    Abstract: The method according to the invention utilizes a color decomposition of histological tissue image data to derive a density map. The density map corresponds to the portion of the image data that contains the stain/tissue combination corresponding to the stroma, and at least one gland is extracted from said density map. The glands are obtained by a combination of a mask and a seed for each gland derived by adaptive morphological operations, and the seed is grown to the boundaries of the mask. The method may also derive an epithelial density map used to remove small objects not corresponding to epithelial tissue. The epithelial density map may further be utilized to improve the identification of glandular regions in the stromal density map. The segmented gland is extracted from the tissue data utilizing the grown seed as a mask. The gland is then classified according to its associated features.
    Type: Application
    Filed: March 28, 2018
    Publication date: April 30, 2020
    Inventors: Christophe AVENEL, Ingrid CARLBOM
  • Publication number: 20070294061
    Abstract: An acoustic modeling system and an acoustic modeling method use beam tracing techniques that accelerate computation of significant acoustic reverberation paths in a distributed virtual environment. The acoustic modeling system and method perform a priority-driven beam tracing to construct a beam tree data structure representing “early” reverberation paths between avatar locations by performing a best-first traversal of a cell adjacency graph that represents the virtual environment. To further accelerate reverberation path computations, the acoustic modeling system and method according to one embodiment perform a bi-directional beam tracing algorithm that combines sets of beams traced from pairs of avatar locations to efficiently find viable acoustic reverberation paths.
    Type: Application
    Filed: November 17, 2006
    Publication date: December 20, 2007
    Applicant: Agere Systems Incorporated
    Inventors: Ingrid Carlbom, Thomas Funkhouser
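The "priority-driven" traversal in this abstract expands the most significant reverberation paths first by doing a best-first walk over the cell adjacency graph. The sketch below shows only that ordering idea with a heap-based best-first search over a toy graph; the graph, the per-portal attenuation values, and the strength cutoff are invented for illustration, and the actual beam geometry and beam tree construction of the patent are not modeled.

```python
import heapq

# Hypothetical cell adjacency graph: each cell lists (neighbor, attenuation)
# for the portal connecting the two cells of the virtual environment.
adjacency = {
    "A": [("B", 0.9), ("C", 0.5)],
    "B": [("A", 0.9), ("D", 0.8)],
    "C": [("A", 0.5), ("D", 0.4)],
    "D": [("B", 0.8), ("C", 0.4)],
}

def best_first_paths(source, min_strength=0.3):
    """Expand paths in order of decreasing strength, so the most
    significant ('early') reverberation paths are found first."""
    best = {}
    heap = [(-1.0, source, (source,))]  # max-heap via negated strength
    while heap:
        neg_s, cell, path = heapq.heappop(heap)
        strength = -neg_s
        if cell in best:
            continue  # a stronger path to this cell was already settled
        best[cell] = (strength, path)
        for nbr, atten in adjacency[cell]:
            s = strength * atten
            if nbr not in best and s >= min_strength:
                heapq.heappush(heap, (-s, nbr, path + (nbr,)))
    return best

paths = best_first_paths("A")
print(paths["D"])  # strongest path from A to D goes through B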
  • Publication number: 20050018058
    Abstract: An omnidirectional video camera captures images of the environment while moving along several intersecting paths forming an irregular grid. These paths define the boundaries of a set of image loops within the environment. For arbitrary viewpoints within each image loop, a 4D plenoptic function may be reconstructed from the group of images captured at the loop boundary. For an observer viewpoint, a strip of pixels is extracted from an image in the loop in front of the observer and paired with a strip of pixels extracted from another image on the opposite side of the image loop. A new image is generated for an observer viewpoint by warping pairs of such strips of pixels according to the 4D plenoptic function, blending each pair, and then stitching the resulting strips of pixels together.
    Type: Application
    Filed: May 7, 2004
    Publication date: January 27, 2005
    Inventors: Daniel Aliaga, Ingrid Carlbom
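The final step described in this abstract pairs a strip of pixels from an image ahead of the observer with a strip from the opposite side of the image loop, then blends the warped pair. The sketch below shows only the blend, with the warping omitted; the strip contents and the observer parameter `t` are hypothetical, and this is not the patented reconstruction of the 4D plenoptic function.

```python
import numpy as np

# Hypothetical stand-ins for two pixel strips extracted from images
# captured on opposite sides of an image loop (already warped).
front_strip = np.full((64, 1, 3), 200, dtype=np.float64)  # image ahead of observer
back_strip = np.full((64, 1, 3), 100, dtype=np.float64)   # image across the loop

def blend_strips(front, back, t):
    """Cross-fade a pair of warped strips; t is the observer's relative
    position between the two capture paths (0 = front, 1 = back)."""
    return (1.0 - t) * front + t * back

# Observer one quarter of the way across the loop.
novel_strip = blend_strips(front_strip, back_strip, 0.25)
print(novel_strip[0, 0, 0])  # 175.0
```

Stitching many such blended strips side by side yields the novel view for the observer viewpoint, as the abstract describes.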