Patents by Inventor Ingrid B. Carlbom

Ingrid B. Carlbom has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9779503
    Abstract: A method measures the efficacy of a stain/tissue combination for histological tissue image data comprising a set of pixels. The measures of efficacy comprise a statistical measure of clustering performance of the different stain/tissue combinations in the Maxwellian Chromaticity plane, and a representative density of the stain/tissue combinations in the density maps. The method captures histological tissue image data using a light absorbent stain; transforms the histological tissue image data to optical density data; projects the optical density data to the Maxwellian Chromaticity plane; identifies at least one reference color using a statistical technique; inverts the reference color to form a color mixing matrix; derives a density map; derives a statistical measure for the clustering performance of the different stain/tissue combinations in the Maxwellian Chromaticity plane; and finds a representative density of the stain/tissue combinations in the density maps.
    Type: Grant
    Filed: February 28, 2017
    Date of Patent: October 3, 2017
    Assignee: CADESS MEDICAL AB
    Inventors: Jimmy C. Azar, Christer Busch, Ingrid B. Carlbom, Milan Gavrilovic
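
The processing chain in the abstract above (optical-density transform, projection to the Maxwellian chromaticity plane, and a clustering-performance statistic) can be sketched in a few lines of numpy. This is only an illustrative reading of the abstract, not the patented implementation: the white reference i0, the triangle coordinates, and the Fisher-style scatter ratio used as the "statistical measure" are all assumptions introduced here.

```python
import numpy as np

def optical_density(rgb, i0=255.0, eps=1e-6):
    """Beer-Lambert transform of a stained-tissue RGB image: OD = -log10(I / I0)."""
    return -np.log10(np.clip(rgb.astype(float), eps, None) / i0)

def maxwell_chromaticity(od, eps=1e-6):
    """Project 3-channel optical-density vectors onto the Maxwellian chromaticity
    plane: normalise each OD vector to the simplex r + g + b = 1, then map the
    simplex onto an equilateral (Maxwell) triangle in 2-D."""
    bary = od / np.clip(od.sum(axis=-1, keepdims=True), eps, None)
    g, b = bary[..., 1], bary[..., 2]
    return np.stack((g + 0.5 * b, (np.sqrt(3) / 2) * b), axis=-1)

def clustering_separation(points_2d, labels):
    """One plausible clustering-performance statistic: the ratio of between-class
    to within-class scatter of the labelled stain/tissue classes in the plane."""
    overall = points_2d.mean(axis=0)
    between = within = 0.0
    for lab in np.unique(labels):
        cls = points_2d[labels == lab]
        between += len(cls) * np.sum((cls.mean(axis=0) - overall) ** 2)
        within += np.sum((cls - cls.mean(axis=0)) ** 2)
    return between / within
```

Here the labels would come from pixels annotated with their stain/tissue class; a larger scatter ratio would indicate better-separated stain/tissue clusters, i.e. a more effective stain/tissue combination.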
  • Publication number: 20170169568
    Abstract: A method measures the efficacy of a stain/tissue combination for histological tissue image data comprising a set of pixels. The measures of efficacy comprise a statistical measure of clustering performance of the different stain/tissue combinations in the Maxwellian Chromaticity plane, and a representative density of the stain/tissue combinations in the density maps. The method captures histological tissue image data using a light absorbent stain; transforms the histological tissue image data to optical density data; projects the optical density data to the Maxwellian Chromaticity plane; identifies at least one reference color using a statistical technique; inverts the reference color to form a color mixing matrix; derives a density map; derives a statistical measure for the clustering performance of the different stain/tissue combinations in the Maxwellian Chromaticity plane; and finds a representative density of the stain/tissue combinations in the density maps.
    Type: Application
    Filed: February 28, 2017
    Publication date: June 15, 2017
    Inventors: Jimmy C. Azar, Christer Busch, Ingrid B. Carlbom, Milan Gavrilovic

  • Patent number: 9607374
    Abstract: The method according to the invention generates a color decomposition of histological tissue image data into density maps, where each density map corresponds to the portion of the original image data that contains one stain/tissue combination. After a microscope captures histological tissue image data from a tissue sample that has been stained with at least one stain, the said stain or stains being light absorbent, the method derives from the histological tissue image data a color mixing matrix with the columns of the matrix being color reference vectors. From this mixing matrix the method derives a density map, when the mixing matrix is well-conditioned, by applying a pseudo-inverse of the mixing matrix to the optical density data, and when the mixing matrix is medially-conditioned, by applying a piece-wise pseudo-inverse of the mixing matrix to the optical density data. The method according to the invention also generates stain/tissue combination efficacy measures.
    Type: Grant
    Filed: November 9, 2012
    Date of Patent: March 28, 2017
    Assignee: CADESS MEDICAL AB
    Inventors: Jimmy C. Azar, Christer Busch, Ingrid B. Carlbom, Milan Gavrilovic
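
For the well-conditioned case, the decomposition described above amounts to applying the Moore-Penrose pseudo-inverse of the mixing matrix to the optical-density image. A minimal numpy sketch, assuming the reference-color columns of the mixing matrix are already known, might look like this; the piece-wise pseudo-inverse for the medially-conditioned case is not shown, and the conditioning cut-off is an arbitrary assumption.

```python
import numpy as np

def density_maps(rgb, mixing_matrix, i0=255.0, eps=1e-6):
    """Decompose a stained-tissue RGB image into one density map per stain/tissue
    combination.  mixing_matrix is 3 x N with one optical-density reference color
    per column; the pseudo-inverse is applied when the matrix is well conditioned."""
    od = -np.log10(np.clip(rgb.astype(float), eps, None) / i0)   # H x W x 3
    h, w, _ = od.shape
    if np.linalg.cond(mixing_matrix) > 1e6:                      # arbitrary cut-off
        raise ValueError("mixing matrix is poorly conditioned; "
                         "a piece-wise pseudo-inverse would be needed")
    pinv = np.linalg.pinv(mixing_matrix)                         # N x 3
    return (od.reshape(-1, 3) @ pinv.T).reshape(h, w, -1)        # H x W x N
```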
  • Publication number: 20140314301
    Abstract: The method according to the invention generates a color decomposition of histological tissue image data into density maps, where each density map corresponds to the portion of the original image data that contains one stain/tissue combination. After a microscope captures histological tissue image data from a tissue sample that has been stained with at least one stain, the said stain or stains being light absorbent, the method derives from the histological tissue image data a color mixing matrix with the columns of the matrix being color reference vectors. From this mixing matrix the method derives a density map, when the mixing matrix is well-conditioned, by applying a pseudo-inverse of the mixing matrix to the optical density data, and when the mixing matrix is medially-conditioned, by applying a piece-wise pseudo-inverse of the mixing matrix to the optical density data. The method according to the invention also generates stain/tissue combination efficacy measures.
    Type: Application
    Filed: November 9, 2012
    Publication date: October 23, 2014
    Inventors: Jimmy C. Azar, Christer Busch, Ingrid B. Carlbom, Milan Gavrilovic

  • Patent number: 8214179
    Abstract: An acoustic modeling system and an acoustic modeling method use beam tracing techniques that accelerate computation of significant acoustic reverberation paths in a distributed virtual environment. The acoustic modeling system and method perform a priority-driven beam tracing to construct a beam tree data structure representing “early” reverberation paths between avatar locations by performing a best-first traversal of a cell adjacency graph that represents the virtual environment. To further accelerate reverberation path computations, the acoustic modeling system and method according to one embodiment perform a bi-directional beam tracing algorithm that combines sets of beams traced from pairs of avatar locations to efficiently find viable acoustic reverberation paths.
    Type: Grant
    Filed: November 17, 2006
    Date of Patent: July 3, 2012
    Assignee: Agere Systems Inc.
    Inventors: Ingrid B. Carlbom, Thomas A. Funkhouser
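
The priority-driven ("best-first") traversal of the cell adjacency graph can be illustrated with a small heap-based sketch. The adjacency dictionary and the priority callable below are assumptions introduced for illustration; the actual system additionally clips a pyramidal beam through each portal and combines beams traced from both avatar locations, which this sketch omits.

```python
import heapq
import itertools

def best_first_paths(adjacency, source_cell, priority, max_paths=10_000):
    """Priority-driven (best-first) traversal of a cell adjacency graph.

    adjacency maps a cell id to a list of (neighbor_cell, portal) pairs, and
    priority is a caller-supplied score for a partial path (lower = expand
    sooner), e.g. accumulated path length from the source avatar."""
    order = itertools.count()                     # tie-breaker for the heap
    frontier = [(0.0, next(order), source_cell, [source_cell])]
    paths = []
    while frontier and len(paths) < max_paths:
        cost, _, cell, path = heapq.heappop(frontier)
        paths.append((cost, path))
        for neighbor, portal in adjacency.get(cell, []):
            if neighbor in path:                  # skip trivial cycles in this sketch
                continue
            new_path = path + [neighbor]
            heapq.heappush(
                frontier,
                (priority(new_path, portal), next(order), neighbor, new_path))
    return paths
```

A lower score expands a partial path sooner, so the shortest (earliest) reverberation paths are discovered first, which is the point of the best-first ordering.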
  • Patent number: 7146296
    Abstract: An acoustic modeling system and an acoustic modeling method use beam tracing techniques that accelerate computation of significant acoustic reverberation paths in a distributed virtual environment. The acoustic modeling system and method perform a priority-driven beam tracing to construct a beam tree data structure representing “early” reverberation paths between avatar locations by performing a best-first traversal of a cell adjacency graph that represents the virtual environment. To further accelerate reverberation path computations, the acoustic modeling system and method according to one embodiment perform a bi-directional beam tracing algorithm that combines sets of beams traced from pairs of avatar locations to efficiently find viable acoustic reverberation paths.
    Type: Grant
    Filed: August 7, 2000
    Date of Patent: December 5, 2006
    Assignee: Agere Systems Inc.
    Inventors: Ingrid B. Carlbom, Thomas A. Funkhouser
  • Patent number: 7027049
    Abstract: An omnidirectional video camera captures images of the environment while moving along several intersecting paths forming an irregular grid. These paths define the boundaries of a set of image loops within the environment. For arbitrary viewpoints within each image loop, a 4D plenoptic function may be reconstructed from the group of images captured at the loop boundary. For an observer viewpoint, a strip of pixels is extracted from an image in the loop in front of the observer and paired with a strip of pixels extracted from another image on the opposite side of the image loop. A new image is generated for an observer viewpoint by warping pairs of such strips of pixels according to the 4D plenoptic function, blending each pair, and then stitching the resulting strips of pixels together.
    Type: Grant
    Filed: May 7, 2004
    Date of Patent: April 11, 2006
    Assignee: Lucent Technologies Inc.
    Inventors: Daniel G. Aliaga, Ingrid B. Carlbom
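
The strip pairing and blending step can be caricatured as a per-column cross-fade between the image ahead of the observer and the image on the opposite side of the loop. The column pairs and the blend weight t below are hypothetical inputs, and the warp of each strip by the reconstructed 4-D plenoptic function is omitted, so this is only a toy sketch of the final blending and stitching stage.

```python
import numpy as np

def blend_strips(front_img, back_img, col_front, col_back, t):
    """Cross-fade one pixel column from the image in front of the observer with
    the matching column from the image on the opposite side of the loop, using
    the observer's relative position t in [0, 1] as the blend weight."""
    a = front_img[:, col_front, :].astype(float)
    b = back_img[:, col_back, :].astype(float)
    return ((1.0 - t) * a + t * b).astype(front_img.dtype)

def synthesize_view(front_img, back_img, column_pairs, t):
    """Stitch the blended strips (one per column pair) into a new image."""
    cols = [blend_strips(front_img, back_img, cf, cb, t) for cf, cb in column_pairs]
    return np.stack(cols, axis=1)
```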
  • Patent number: 6831643
    Abstract: An omnidirectional video camera captures images of the environment while moving along several intersecting paths forming an irregular grid. These paths define the boundaries of a set of image loops within the environment. For arbitrary viewpoints within each image loop, a 4D plenoptic function may be reconstructed from the group of images captured at the loop boundary. For an observer viewpoint, a strip of pixels is extracted from an image in the loop in front of the observer and paired with a strip of pixels extracted from another image on the opposite side of the image loop. A new image is generated for an observer viewpoint by warping pairs of such strips of pixels according to the 4D plenoptic function, blending each pair, and then stitching the resulting strips of pixels together.
    Type: Grant
    Filed: April 16, 2002
    Date of Patent: December 14, 2004
    Assignee: Lucent Technologies Inc.
    Inventors: Daniel G. Aliaga, Ingrid B. Carlbom
  • Patent number: 6751322
    Abstract: A system and method for acoustic modeling partitions an input 3D spatial model into convex cells, and constructs a cell adjacency data structure representing the neighbor relationships between adjacent cells. For each sound source located in the spatial environment, convex pyramidal beams are traced through the input spatial model via recursive depth-first traversal of the cell-adjacency graph. During beam tracing, a beam tree data structure is constructed to encode propagation paths, which may include specular reflection, transmission, diffuse reflection, and diffraction events, from the source location to regions of the input spatial model. The beam tree data structure is then accessed for real-time computation and auralization of propagation paths to an arbitrary receiver location.
    Type: Grant
    Filed: October 2, 1998
    Date of Patent: June 15, 2004
    Assignee: Lucent Technologies Inc.
    Inventors: Ingrid B. Carlbom, Gary W. Elko, Thomas A. Funkhouser, Sarma V. Pingali, Man Mohan Sondhi, James Edward West
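
A minimal sketch of the beam tree data structure and its depth-first construction over a cell adjacency graph is given below. The adjacency encoding and the event labels are assumptions, and the geometric clipping of each beam against the portal polygons, which is the heart of the method, is left out.

```python
from dataclasses import dataclass, field

@dataclass
class BeamNode:
    """One node of a beam tree: the cell the beam currently occupies, the
    propagation event that produced it (transmission, specular reflection,
    diffraction, ...), and the beams spawned from it."""
    cell: int
    event: str = "source"
    children: list = field(default_factory=list)

def build_beam_tree(adjacency, source_cell, max_depth=4):
    """Depth-first construction of a beam tree over a cell adjacency graph.
    adjacency maps a cell id to (neighbor_cell, event) pairs, where the event
    labels how the beam crosses the shared boundary."""
    def recurse(cell, event, depth):
        node = BeamNode(cell, event)
        if depth < max_depth:
            for neighbor, ev in adjacency.get(cell, []):
                node.children.append(recurse(neighbor, ev, depth + 1))
        return node
    return recurse(source_cell, "source", 0)
```

At auralization time, propagation paths to a receiver are read off the stored tree by collecting the nodes whose beams contain the receiver location, which is what allows real-time updates for an arbitrary, moving receiver.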
  • Publication number: 20030004694
    Abstract: A paraboloidal catadioptric camera is calibrated by relaxing the assumption of an ideal system to account for perspective projection, radial distortion, and mirror misalignment occurring within the camera system. Calibration points, which are small and visually distinct objects, are distributed at fixed locations within an environment. Omnidirectional images are captured by the catadioptric camera at different locations of the environment. Data points are obtained by identifying the location of the calibration points in each captured image. An optimization algorithm best-fits the data points to a perspective camera model in order to derive parameters, which are used to calibrate the catadioptric camera.
    Type: Application
    Filed: May 29, 2002
    Publication date: January 2, 2003
    Inventors: Daniel G. Aliaga, Ingrid B. Carlbom
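
The best-fit step can be illustrated with a generic nonlinear least-squares reprojection fit. The simplified perspective model below (two focal lengths, principal point, one radial-distortion coefficient) is an assumption introduced for illustration; the patented calibration additionally models the paraboloidal mirror and its misalignment.

```python
import numpy as np
from scipy.optimize import least_squares

def project(params, points_3d):
    """Simplified perspective model with first-order radial distortion."""
    fx, fy, cx, cy, k1 = params
    x = points_3d[:, 0] / points_3d[:, 2]
    y = points_3d[:, 1] / points_3d[:, 2]
    d = 1.0 + k1 * (x**2 + y**2)            # radial distortion factor
    return np.column_stack((fx * x * d + cx, fy * y * d + cy))

def calibrate(points_3d, observed_2d, initial=(500, 500, 320, 240, 0.0)):
    """Best-fit the model parameters to the observed calibration-point image
    locations by nonlinear least squares on the reprojection error."""
    def residuals(p):
        return (project(p, points_3d) - observed_2d).ravel()
    return least_squares(residuals, np.asarray(initial, dtype=float)).x
```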
  • Publication number: 20020176635
    Abstract: An omnidirectional video camera captures images of the environment while moving along several intersecting paths forming an irregular grid. These paths define the boundaries of a set of image loops within the environment. For arbitrary viewpoints within each image loop, a 4D plenoptic function may be reconstructed from the group of images captured at the loop boundary. For an observer viewpoint, a strip of pixels is extracted from an image in the loop in front of the observer and paired with a strip of pixels extracted from another image on the opposite side of the image loop. A new image is generated for an observer viewpoint by warping pairs of such strips of pixels according to the 4D plenoptic function, blending each pair, and then stitching the resulting strips of pixels together.
    Type: Application
    Filed: April 16, 2002
    Publication date: November 28, 2002
    Inventors: Daniel G. Aliaga, Ingrid B. Carlbom
  • Patent number: 6441846
    Abstract: A method and apparatus for deriving performance statistics from real time tracking of a sporting event. The method according to the present invention includes a step of obtaining a spatio-temporal trajectory corresponding to the motion of an athlete and based on real time tracking of the athlete. The trajectory is then broken down so that performance information corresponding to the motion of the athlete (such as speed, distance covered, acceleration, etc.) can be derived with respect to time. The information so obtained can be stored in a database or the like for later retrieval or can be used to graphically supplement a video broadcast of a sporting event. The apparatus includes a device for obtaining the trajectory, a computational device for obtaining the performance information based on the obtained trajectory, and a statistical device for compiling the performance information.
    Type: Grant
    Filed: June 22, 1998
    Date of Patent: August 27, 2002
    Assignee: Lucent Technologies Inc.
    Inventors: Ingrid B. Carlbom, Yves D. Jean, Sarma V G K Pingali
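
Given a spatio-temporal trajectory, the performance quantities named in the abstract reduce to finite differences; a small numpy sketch, assuming (t, x, y) samples in seconds and metres, is shown below.

```python
import numpy as np

def performance_stats(trajectory):
    """Derive speed, distance covered, and acceleration from a spatio-temporal
    trajectory given as an (N, 3) array of (t, x, y) samples."""
    t = trajectory[:, 0]
    xy = trajectory[:, 1:]
    dt = np.diff(t)
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # distance per sample
    speed = step / dt                                    # m/s between samples
    accel = np.diff(speed) / dt[1:]                      # m/s^2
    return {
        "distance_covered": float(step.sum()),
        "mean_speed": float(speed.mean()),
        "max_speed": float(speed.max()),
        "acceleration": accel,
    }
```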
  • Patent number: 6233007
    Abstract: A method and apparatus for tracking objects used in connection with athletic activities or sporting events, especially balls, pucks, and the like. The method includes the steps of differencing present and previous frames of a video image including, for example, the ball to obtain motion regions, converting the motion regions to HSV color space, extracting the region corresponding to the ball based on empirical color data about the ball, obtaining a motion vector based on the motion of the ball region from a previous frame to the current frame, and updating the ball trajectory based on the newest motion vector obtained. The method also preferably includes a step of identifying completed trajectories based on preset constraints. The method is preferably expanded by using at least one pair of cameras to provide a three-dimensional trajectory, and sometimes preferably expanded further by using a plurality of cameras, especially a plurality of pairs of cameras.
    Type: Grant
    Filed: June 22, 1998
    Date of Patent: May 15, 2001
    Assignee: Lucent Technologies Inc.
    Inventors: Ingrid B. Carlbom, Yves D. Jean, Sarma VGK Pingali
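
A single iteration of the tracking loop described above might be sketched as follows; the HSV bounds stand in for the "empirical color data about the ball", and the frame-differencing threshold is an arbitrary assumption.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def track_ball_step(prev_frame, cur_frame, hsv_lo, hsv_hi, trajectory,
                    diff_thresh=0.1):
    """One tracking iteration on float RGB frames in [0, 1]: difference
    consecutive frames to find motion regions, convert to HSV, keep pixels in
    the ball's HSV range, take their centroid as the new position, and extend
    the trajectory with the newest motion vector."""
    motion = np.abs(cur_frame - prev_frame).max(axis=-1) > diff_thresh
    hsv = rgb_to_hsv(cur_frame)
    in_color = np.all((hsv >= hsv_lo) & (hsv <= hsv_hi), axis=-1)
    ys, xs = np.nonzero(motion & in_color)
    if xs.size == 0:
        return trajectory, None                 # ball not detected this frame
    position = np.array([xs.mean(), ys.mean()])
    motion_vector = position - trajectory[-1] if trajectory else None
    trajectory.append(position)
    return trajectory, motion_vector
```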
  • Patent number: 6141041
    Abstract: A method and apparatus for deriving an occupancy map reflecting an athlete's coverage of a playing area based on real time tracking of a sporting event. The method according to the present invention includes a step of obtaining a spatio-temporal trajectory corresponding to the motion of an athlete and based on real time tracking of the athlete. The trajectory is then mapped over the geometry of the playing area to determine a playing area occupancy map indicating the frequency with which the athlete occupies certain areas of the playing area, or the time spent by the athlete in certain areas of the playing area. The occupancy map is preferably color coded to indicate different levels of occupancy in different areas of the playing area, and the color coded map is then overlaid onto an image (such as a video image) of the playing area.
    Type: Grant
    Filed: June 22, 1998
    Date of Patent: October 31, 2000
    Assignee: Lucent Technologies Inc.
    Inventors: Ingrid B. Carlbom, Yves D. Jean, Sarma V G K Pingali
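
The occupancy map itself is essentially a 2-D histogram of the trajectory over a grid covering the playing area; a minimal sketch, assuming (x, y) positions in metres and a known court size, is given below. Weighting each sample by its time step would give time spent per area rather than visit counts, and the resulting grid can then be color coded and overlaid on video of the playing area.

```python
import numpy as np

def occupancy_map(trajectory, court_size, bins=(20, 10)):
    """Accumulate an (N, 2) array of (x, y) positions into a grid covering the
    playing area, so each cell counts how often the athlete occupied it."""
    x, y = trajectory[:, 0], trajectory[:, 1]
    hist, _, _ = np.histogram2d(
        x, y, bins=bins, range=[[0, court_size[0]], [0, court_size[1]]]
    )
    return hist
```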