Patents by Inventor Ingrid B. Carlbom
Ingrid B. Carlbom has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 9779503
Abstract: A method measures the efficacy of a stain/tissue combination for histological tissue image data comprising a set of pixels. The measures of efficacy comprise a statistical measure of clustering performance of the different stain/tissue combinations in the Maxwellian Chromaticity plane, and a representative density of the stain/tissue combinations in the density maps. The method captures histological tissue image data using a light absorbent stain; transforms the histological tissue image data to optical density data; projects the optical density data to the Maxwellian Chromaticity plane; identifies at least one reference color using a statistical technique; inverts the reference color to form a color mixing matrix; derives a density map; derives a statistical measure for the clustering performance of the different stain/tissue combinations in the Maxwellian Chromaticity plane; and finds a representative density of the stain/tissue combinations in the density maps.
Type: Grant
Filed: February 28, 2017
Date of Patent: October 3, 2017
Assignee: CADESS MEDICAL AB
Inventors: Jimmy C. Azar, Christer Busch, Ingrid B. Carlbom, Milan Gavrilovic
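The first two steps the abstract of patent 9779503 lists, converting image data to optical density and projecting it onto the Maxwellian chromaticity plane, can be sketched roughly as follows. This is an illustrative reading, not the patented method: the function names, the white level `i0`, and the use of a simple unit-sum (barycentric) normalization as the chromaticity projection are all assumptions.

```python
import numpy as np

def optical_density(rgb, i0=255.0):
    """Beer-Lambert transform OD = -log10(I / I0); intensities are clipped
    to at least 1 so the logarithm stays finite (i0 is an assumed white level)."""
    rgb = np.clip(np.asarray(rgb, dtype=float), 1.0, i0)
    return -np.log10(rgb / i0)

def maxwellian_chromaticity(od):
    """Project OD vectors onto the Maxwellian chromaticity plane by
    normalizing each vector to unit component sum (barycentric coordinates),
    discarding overall density and keeping only hue/saturation information."""
    s = od.sum(axis=-1, keepdims=True)
    s[s == 0] = 1.0  # avoid division by zero for pure-white pixels
    return od / s
```

In this reading, pixels stained by the same stain/tissue combination cluster around one point in the chromaticity plane regardless of stain density, which is what makes the clustering-based efficacy measure possible.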
-
Publication number: 20170169568
Abstract: A method measures the efficacy of a stain/tissue combination for histological tissue image data comprising a set of pixels. The measures of efficacy comprise a statistical measure of clustering performance of the different stain/tissue combinations in the Maxwellian Chromaticity plane, and a representative density of the stain/tissue combinations in the density maps. The method captures histological tissue image data using a light absorbent stain; transforms the histological tissue image data to optical density data; projects the optical density data to the Maxwellian Chromaticity plane; identifies at least one reference color using a statistical technique; inverts the reference color to form a color mixing matrix; derives a density map; derives a statistical measure for the clustering performance of the different stain/tissue combinations in the Maxwellian Chromaticity plane; and finds a representative density of the stain/tissue combinations in the density maps.
Type: Application
Filed: February 28, 2017
Publication date: June 15, 2017
Inventors: Jimmy C. Azar, Christer Busch, Ingrid B. Carlbom, Milan Gavrilovic
-
Patent number: 9607374
Abstract: The method according to the invention generates a color decomposition of histological tissue image data into density maps, where each density map corresponds to the portion of the original image data that contains one stain/tissue combination. After a microscope captures histological tissue image data from a tissue sample that has been stained with at least one stain, the said stain or stains being light absorbent, the method derives from the histological tissue image data a color mixing matrix with the columns of the matrix being color reference vectors. From this mixing matrix the method derives a density map, when the mixing matrix is well-conditioned, by applying a pseudo-inverse of the mixing matrix to the optical density data, and when the mixing matrix is medially-conditioned, by applying a piece-wise pseudo-inverse of the mixing matrix to the optical density data. The method according to the invention also generates stain/tissue combination efficacy measures.
Type: Grant
Filed: November 9, 2012
Date of Patent: March 28, 2017
Assignee: CADESS MEDICAL AB
Inventors: Jimmy C. Azar, Christer Busch, Ingrid B. Carlbom, Milan Gavrilovic
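The well-conditioned case in the abstract of patent 9607374, applying a pseudo-inverse of the color mixing matrix to the optical density data, can be sketched as below. The function name and the clipping of small negative densities to zero are assumptions; the piece-wise pseudo-inverse for the medially-conditioned case is not shown.

```python
import numpy as np

def stain_density_maps(od_pixels, mixing_matrix):
    """Unmix optical-density pixels into per-stain density maps.

    mixing_matrix has one column per stain/tissue combination (the color
    reference vectors); its Moore-Penrose pseudo-inverse maps each 3-channel
    OD pixel to a vector of stain densities, one value per density map.
    """
    unmix = np.linalg.pinv(mixing_matrix)  # shape: (n_stains, 3)
    densities = od_pixels @ unmix.T        # shape: (n_pixels, n_stains)
    return np.clip(densities, 0.0, None)   # tiny negative values are noise
```

For example, with a hypothetical two-stain mixing matrix whose columns are OD reference vectors, pixels synthesized as known density mixtures are recovered exactly, since the matrix has full column rank.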
-
Publication number: 20140314301
Abstract: The method according to the invention generates a color decomposition of histological tissue image data into density maps, where each density map corresponds to the portion of the original image data that contains one stain/tissue combination. After a microscope captures histological tissue image data from a tissue sample that has been stained with at least one stain, the said stain or stains being light absorbent, the method derives from the histological tissue image data a color mixing matrix with the columns of the matrix being color reference vectors. From this mixing matrix the method derives a density map, when the mixing matrix is well-conditioned, by applying a pseudo-inverse of the mixing matrix to the optical density data, and when the mixing matrix is medially-conditioned, by applying a piece-wise pseudo-inverse of the mixing matrix to the optical density data. The method according to the invention also generates stain/tissue combination efficacy measures.
Type: Application
Filed: November 9, 2012
Publication date: October 23, 2014
Inventors: Jimmy C. Azar, Christer Busch, Ingrid B. Carlbom, Milan Gavrilovic
-
Patent number: 8214179
Abstract: An acoustic modeling system and an acoustic modeling method use beam tracing techniques that accelerate computation of significant acoustic reverberation paths in a distributed virtual environment. The acoustic modeling system and method perform a priority-driven beam tracing to construct a beam tree data structure representing "early" reverberation paths between avatar locations by performing a best-first traversal of a cell adjacency graph that represents the virtual environment. To further accelerate reverberation path computations, the acoustic modeling system and method according to one embodiment perform a bi-directional beam tracing algorithm that combines sets of beams traced from pairs of avatar locations to efficiently find viable acoustic reverberation paths.
Type: Grant
Filed: November 17, 2006
Date of Patent: July 3, 2012
Assignee: Agere Systems Inc.
Inventors: Ingrid B. Carlbom, Thomas A. Funkhouser
-
Patent number: 7146296
Abstract: An acoustic modeling system and an acoustic modeling method use beam tracing techniques that accelerate computation of significant acoustic reverberation paths in a distributed virtual environment. The acoustic modeling system and method perform a priority-driven beam tracing to construct a beam tree data structure representing "early" reverberation paths between avatar locations by performing a best-first traversal of a cell adjacency graph that represents the virtual environment. To further accelerate reverberation path computations, the acoustic modeling system and method according to one embodiment perform a bi-directional beam tracing algorithm that combines sets of beams traced from pairs of avatar locations to efficiently find viable acoustic reverberation paths.
Type: Grant
Filed: August 7, 2000
Date of Patent: December 5, 2006
Assignee: Agere Systems Inc.
Inventors: Ingrid B. Carlbom, Thomas A. Funkhouser
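A toy version of the best-first traversal of a cell adjacency graph, the core ordering idea behind the two beam-tracing patents above (8214179 and 7146296), might look like the following. This is a strong simplification: plain graph nodes with unit edge costs stand in for spatial cells and pyramidal beams, so only the priority-driven, cheapest-first expansion that finds "early" paths first is illustrated.

```python
import heapq

def best_first_paths(adjacency, source, receiver, max_cost=4):
    """Expand partial paths cheapest-first with a priority queue, so that
    low-order ("early") reverberation paths between source and receiver
    are discovered before more expensive ones.  Real beam tracing would
    clip a beam against cell boundaries at every step (omitted here)."""
    heap = [(0, [source])]
    found = []
    while heap:
        cost, path = heapq.heappop(heap)
        if path[-1] == receiver:
            found.append((cost, path))
            continue
        if cost >= max_cost:  # bound the search depth
            continue
        for neighbor in adjacency.get(path[-1], []):
            heapq.heappush(heap, (cost + 1, path + [neighbor]))
    return found
```

Because the queue is ordered by cost, the returned list is sorted from earliest (shortest) to latest path, which is the property the beam tree exploits for real-time auralization.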
-
Patent number: 7027049
Abstract: An omnidirectional video camera captures images of the environment while moving along several intersecting paths forming an irregular grid. These paths define the boundaries of a set of image loops within the environment. For arbitrary viewpoints within each image loop, a 4D plenoptic function may be reconstructed from the group of images captured at the loop boundary. For an observer viewpoint, a strip of pixels is extracted from an image in the loop in front of the observer and paired with a strip of pixels extracted from another image on the opposite side of the image loop. A new image is generated for an observer viewpoint by warping pairs of such strips of pixels according to the 4D plenoptic function, blending each pair, and then stitching the resulting strips of pixels together.
Type: Grant
Filed: May 7, 2004
Date of Patent: April 11, 2006
Assignee: Lucent Technologies Inc.
Inventors: Daniel G. Aliaga, Ingrid B. Carlbom
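Of the warp-blend-stitch pipeline in patent 7027049 above, the blend step can be sketched as a simple cross-fade between a strip from the image ahead of the observer and the paired strip from the opposite side of the loop. The linear weighting by a parameter `t` is an assumption for illustration; the patented method first warps both strips according to the reconstructed 4D plenoptic function, which is omitted here.

```python
import numpy as np

def blend_strips(front_strip, back_strip, t):
    """Cross-fade two paired pixel strips, weighted by the observer's
    normalized position t in [0, 1] between the two loop boundaries:
    t = 0 returns the front strip, t = 1 the back strip."""
    front = np.asarray(front_strip, dtype=float)
    back = np.asarray(back_strip, dtype=float)
    return (1.0 - t) * front + t * back
```

Stitching the blended strips side by side for every viewing direction would then assemble the novel image for the observer viewpoint.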
-
Patent number: 6831643
Abstract: An omnidirectional video camera captures images of the environment while moving along several intersecting paths forming an irregular grid. These paths define the boundaries of a set of image loops within the environment. For arbitrary viewpoints within each image loop, a 4D plenoptic function may be reconstructed from the group of images captured at the loop boundary. For an observer viewpoint, a strip of pixels is extracted from an image in the loop in front of the observer and paired with a strip of pixels extracted from another image on the opposite side of the image loop. A new image is generated for an observer viewpoint by warping pairs of such strips of pixels according to the 4D plenoptic function, blending each pair, and then stitching the resulting strips of pixels together.
Type: Grant
Filed: April 16, 2002
Date of Patent: December 14, 2004
Assignee: Lucent Technologies Inc.
Inventors: Daniel G. Aliaga, Ingrid B. Carlbom
-
Patent number: 6751322
Abstract: A system and method for acoustic modeling partitions an input 3D spatial model into convex cells, and constructs a cell adjacency data structure representing the neighbor relationships between adjacent cells. For each sound source located in the spatial environment, convex pyramidal beams are traced through the input spatial model via recursive depth-first traversal of the cell-adjacency graph. During beam tracing, a beam tree data structure is constructed to encode propagation paths, which may include specular reflection, transmission, diffuse reflection, and diffraction events, from the source location to regions of the input spatial model. The beam tree data structure is then accessed for real-time computation and auralization of propagation paths to an arbitrary receiver location.
Type: Grant
Filed: October 2, 1998
Date of Patent: June 15, 2004
Assignee: Lucent Technologies Inc.
Inventors: Ingrid B. Carlbom, Gary W. Elko, Thomas A. Funkhouser, Sarma V. Pingali, Man Mohan Sondhi, James Edward West
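The beam tree of patent 6751322 above, built by recursive depth-first traversal of the cell adjacency graph, can be sketched in miniature as follows. The node layout and the single "transmission" event label are assumptions; a real implementation would clip each pyramidal beam against cell boundaries, record reflection and diffraction events, and prune beams that become empty.

```python
class BeamNode:
    """One node of a beam tree: the cell a beam currently occupies and the
    propagation event (e.g. transmission) that produced it."""
    def __init__(self, cell, event, parent=None):
        self.cell, self.event, self.parent = cell, event, parent
        self.children = []

def build_beam_tree(adjacency, source_cell, max_depth=3):
    """Depth-first traversal of the cell adjacency graph from the source
    cell, recording propagation paths as a tree.  Each root-to-node path
    encodes one candidate propagation path from the source."""
    root = BeamNode(source_cell, 'source')
    def recurse(node, depth):
        if depth == max_depth:
            return
        for neighbor in adjacency.get(node.cell, []):
            child = BeamNode(neighbor, 'transmission', node)
            node.children.append(child)
            recurse(child, depth + 1)
    recurse(root, 0)
    return root
```

Once built, such a tree can be queried for any receiver location by collecting the nodes whose cell contains the receiver and walking their parent links back to the source, which is what enables the real-time auralization the abstract mentions.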
-
Publication number: 20030004694
Abstract: A paraboloidal catadioptric camera is calibrated by relaxing the assumption of an ideal system to account for perspective projection, radial distortion, and mirror misalignment occurring within the camera system. Calibration points, which are small and visually distinct objects, are distributed at fixed locations within an environment. Omnidirectional images are captured by the catadioptric camera at different locations of the environment. Data points are obtained by identifying the location of the calibration points in each captured image. An optimization algorithm best-fits the data points to a perspective camera model in order to derive parameters, which are used to calibrate the catadioptric camera.
Type: Application
Filed: May 29, 2002
Publication date: January 2, 2003
Inventors: Daniel G. Aliaga, Ingrid B. Carlbom
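The best-fit step in publication 20030004694 above can be illustrated, in drastically reduced form, by fitting a single radial distortion coefficient to observed calibration points by least squares. The one-parameter model r_obs = r(1 + k r^2) and closed-form solution are stand-ins chosen for brevity; the actual optimization fits a full perspective camera model including mirror misalignment.

```python
import numpy as np

def fit_radial_distortion(ideal_r, observed_r):
    """Least-squares fit of one radial distortion coefficient k in the
    assumed model r_obs = r * (1 + k * r**2).  The residual is linear in k,
    so the normal equation gives k in closed form."""
    ideal_r = np.asarray(ideal_r, dtype=float)
    observed_r = np.asarray(observed_r, dtype=float)
    x = ideal_r ** 3             # coefficient of k
    y = observed_r - ideal_r     # observed radial displacement
    return float(x @ y) / float(x @ x)
```

The same pattern, minimizing reprojection error of known calibration points over the model parameters, generalizes to the multi-parameter camera model the abstract describes, though a nonlinear solver is then required.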
-
Publication number: 20020176635
Abstract: An omnidirectional video camera captures images of the environment while moving along several intersecting paths forming an irregular grid. These paths define the boundaries of a set of image loops within the environment. For arbitrary viewpoints within each image loop, a 4D plenoptic function may be reconstructed from the group of images captured at the loop boundary. For an observer viewpoint, a strip of pixels is extracted from an image in the loop in front of the observer and paired with a strip of pixels extracted from another image on the opposite side of the image loop. A new image is generated for an observer viewpoint by warping pairs of such strips of pixels according to the 4D plenoptic function, blending each pair, and then stitching the resulting strips of pixels together.
Type: Application
Filed: April 16, 2002
Publication date: November 28, 2002
Inventors: Daniel G. Aliaga, Ingrid B. Carlbom
-
Method and apparatus for deriving novel sports statistics from real time tracking of sporting events
Patent number: 6441846
Abstract: A method and apparatus for deriving performance statistics from real time tracking of a sporting event. The method according to the present invention includes a step of obtaining a spatio-temporal trajectory corresponding to the motion of an athlete and based on real time tracking of the athlete. The trajectory is then broken down so that performance information corresponding to the motion of the athlete (such as speed, distance covered, acceleration, etc.) can be derived with respect to time. The information so obtained can be stored in a database or the like for later retrieval or can be used to graphically supplement a video broadcast of a sporting event. The apparatus includes a device for obtaining the trajectory, a computational device for obtaining the performance information based on the obtained trajectory, and a statistical device for compiling the performance information.
Type: Grant
Filed: June 22, 1998
Date of Patent: August 27, 2002
Assignee: Lucent Technologies Inc.
Inventors: Ingrid B. Carlbom, Yves D. Jean, Sarma V G K Pingali
-
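The kind of per-athlete statistics patent 6441846 derives from a spatio-temporal trajectory (speed, distance covered, acceleration) can be sketched with finite differences over sampled positions. The sampling layout and function name are assumptions for illustration.

```python
import numpy as np

def trajectory_stats(times, positions):
    """Derive distance covered, speed, and acceleration from a sampled
    spatio-temporal trajectory by finite differences.

    times     : 1-D array of timestamps in seconds
    positions : (n, 2) array of x, y positions (e.g. meters on the court)
    """
    t = np.asarray(times, dtype=float)
    p = np.asarray(positions, dtype=float)
    steps = np.linalg.norm(np.diff(p, axis=0), axis=1)  # per-interval distance
    dt = np.diff(t)
    speed = steps / dt
    accel = np.diff(speed) / dt[1:]
    return {'distance': float(steps.sum()),
            'speed': speed,
            'acceleration': accel}
```

Statistics computed this way per time interval can then be accumulated in a database or rendered as graphical overlays on the broadcast, as the abstract describes.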
Patent number: 6233007
Abstract: A method and apparatus for tracking objects used in connection with athletic activities or sporting events, especially balls, pucks, and the like. The method includes the steps of differencing present and previous frames of a video image including, for example, a ball to obtain motion regions, converting the motion regions to HSV color space, extracting the region corresponding to the ball based on empirical color data about the ball, obtaining a motion vector based on the motion of the ball region from a previous frame to the current frame, and updating the ball trajectory based on the newest motion vector obtained. The method also preferably includes a step of identifying completed trajectories based on preset constraints. The method is preferably expanded on by using at least one pair of cameras to provide a three-dimensional trajectory, and sometimes preferably expanded on by using a plurality of cameras, especially a plurality of pairs of cameras.
Type: Grant
Filed: June 22, 1998
Date of Patent: May 15, 2001
Assignee: Lucent Technologies Inc.
Inventors: Ingrid B. Carlbom, Yves D. Jean, Sarma VGK Pingali
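The first step of the tracking pipeline in patent 6233007, frame differencing to obtain motion regions, can be sketched as below. The threshold value is an assumption; the subsequent steps (HSV conversion, color-based extraction of the ball region, and trajectory update) are omitted.

```python
import numpy as np

def motion_regions(prev_frame, cur_frame, threshold=25):
    """Mark as motion every pixel whose intensity changed by more than the
    threshold between consecutive grayscale frames, yielding the candidate
    motion regions that later color analysis narrows down to the ball."""
    diff = np.abs(cur_frame.astype(int) - prev_frame.astype(int))
    return diff > threshold
```

Running this on consecutive frames leaves only moving objects (the ball, players) in the mask, which is why the ball can then be isolated by its empirical color alone.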
-
Patent number: 6141041
Abstract: A method and apparatus for deriving an occupancy map reflecting an athlete's coverage of a playing area based on real time tracking of a sporting event. The method according to the present invention includes a step of obtaining a spatio-temporal trajectory corresponding to the motion of an athlete and based on real time tracking of the athlete. The trajectory is then mapped over the geometry of the playing area to determine a playing area occupancy map indicating the frequency with which the athlete occupies certain areas of the playing area, or the time spent by the athlete in certain areas of the playing area. The occupancy map is preferably color coded to indicate different levels of occupancy in different areas of the playing area, and the color coded map is then overlaid onto an image (such as a video image) of the playing area.
Type: Grant
Filed: June 22, 1998
Date of Patent: October 31, 2000
Assignee: Lucent Technologies Inc.
Inventors: Ingrid B. Carlbom, Yves D. Jean, Sarma V G K Pingali
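The mapping of a trajectory over the playing-area geometry in patent 6141041 amounts to binning trajectory samples into a spatial grid. A minimal sketch, assuming uniform sampling in time (so counts are proportional to time spent) and an axis-aligned rectangular field; the grid resolution and field dimensions are illustrative:

```python
import numpy as np

def occupancy_map(trajectory, field_size, grid=(10, 10)):
    """Bin a 2-D trajectory into a grid over the playing area, counting how
    many samples fall in each cell.  With uniform time sampling the counts
    are proportional to time spent per cell; color-coding this array and
    overlaying it on a video image of the field would complete the method."""
    xy = np.asarray(trajectory, dtype=float)
    width, height = field_size
    occ, _, _ = np.histogram2d(xy[:, 0], xy[:, 1],
                               bins=grid, range=[[0, width], [0, height]])
    return occ
```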