Patents by Inventor Ramesh Raskar

Ramesh Raskar has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20130100339
    Abstract: In exemplary implementations of this invention, a set of two scanning mirrors scans the one-dimensional field of view of a streak camera across a scene. The mirrors move continuously while the camera takes streak images. Alternatively, the mirrors may move only between image captures. An illumination source or other captured event is synchronized with the camera so that every streak image captures a different view of the scene. The scanning ensures that different parts of the scene are captured.
    Type: Application
    Filed: October 7, 2012
    Publication date: April 25, 2013
    Applicant: Massachusetts Institute of Technology
    Inventors: Ramesh Raskar, Andreas Velten
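    A minimal Python sketch of the capture loop described in the entry above. The galvo and streak-camera driver objects and their methods (set_angle, arm, read_streak) are hypothetical placeholders for illustration, not an interface from the patent application.

      import numpy as np

      def scan_and_capture(galvo, camera, n_rows=64, angle_range=(-5.0, 5.0)):
          """Step a two-mirror scanner across the scene, grabbing one streak image
          per mirror position; the pulsed illumination is assumed to be hardware-
          synchronized to the camera trigger. Returns a list of streak images."""
          streaks = []
          for angle in np.linspace(angle_range[0], angle_range[1], n_rows):
              galvo.set_angle(angle)                # point the 1D field of view at a new row
              camera.arm()                          # next light pulse also starts the streak sweep
              streaks.append(camera.read_streak())  # (time x space) image for this row
          return streaks
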
  • Publication number: 20130100250
    Abstract: In exemplary implementations of this invention, a 3D range camera “looks around a corner” to image a hidden object, using light that has bounced (reflected) off of a diffuse reflector. The camera can recover the 3D structure of the hidden object.
    Type: Application
    Filed: October 7, 2012
    Publication date: April 25, 2013
    Applicant: Massachusetts Institute of Technology
    Inventors: Ramesh Raskar, Andreas Velten
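    A simplified flatland backprojection sketch of this kind of reconstruction: each time-of-flight measurement constrains the hidden point to an ellipse whose foci are the illuminated wall spot and the observed wall spot, and votes are accumulated on a grid. The 2D geometry, tolerance, and variable names are illustrative assumptions, not the patent application's algorithm.

      import numpy as np

      C = 3e8  # speed of light, m/s

      def backproject(measurements, grid_x, grid_y, sigma=0.01):
          """measurements: list of (laser_spot, sensor_spot, bounce_time) where the
          spots are (x, y) points on the visible wall and bounce_time is the
          spot-to-hidden-object-to-spot travel time in seconds (wall legs already
          subtracted). Returns a heatmap; bright cells are likely hidden-surface points."""
          X, Y = np.meshgrid(grid_x, grid_y)
          heat = np.zeros_like(X)
          for (lx, ly), (sx, sy), t in measurements:
              d = np.hypot(X - lx, Y - ly) + np.hypot(X - sx, Y - sy)  # ellipse: d = c*t
              heat += np.exp(-((d - C * t) ** 2) / (2 * sigma ** 2))   # ~1 cm tolerance
          return heat
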
  • Patent number: 8366003
    Abstract: In an illustrative implementation of this invention, an optical pattern that encodes binary data is printed on a transparency. For example, the pattern may comprise data matrix codes. A lenslet is placed at a distance equal to its focal length from the optical pattern, and thus collimates light from the optical pattern. The collimated light travels to a conventional camera. For example, the camera may be meters distant. The camera takes a photograph of the optical pattern at a time that the camera is not focused on the scene that it is imaging, but instead is focused at infinity. Because the light is collimated, however, a focused image is captured at the camera's focal plane. The binary data in the pattern may include information regarding the object to which the optical pattern is affixed and information from which the camera's pose may be calculated.
    Type: Grant
    Filed: July 16, 2010
    Date of Patent: February 5, 2013
    Assignee: Massachusetts Institute of Technology
    Inventors: Ankit Mohan, Ramesh Raskar, Shinsaku Hiura, Quinn Smithwick, Grace Woo
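    A back-of-the-envelope note on why the tiny pattern becomes readable at a distance: the lenslet collimates the light, so a printed feature of size p subtends an angle of roughly p / f_lenslet regardless of distance, and a camera focused at infinity images it at that angle times its own focal length. The numbers below are illustrative assumptions.

      # Approximate on-sensor size of one printed feature for a distant,
      # infinity-focused camera; camera distance drops out because the light is collimated.
      p_feature = 15e-6   # printed feature size on the transparency (assumed), m
      f_lenslet = 5e-3    # lenslet focal length (assumed), m
      f_camera  = 50e-3   # camera focal length (assumed), m

      angle = p_feature / f_lenslet        # angular size of the feature, radians
      size_on_sensor = angle * f_camera    # = 150 micrometers for these numbers
      print(f"feature spans about {size_on_sensor * 1e6:.0f} um on the sensor")
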
  • Publication number: 20130027668
    Abstract: In exemplary implementations, this invention is a tool for subjective assessment of the visual acuity of a human eye. A microlens or pinhole array is placed over a high-resolution display. The eye is brought very near to the device. Patterns are displayed on the screen under some of the lenslets or pinholes. Using interactive software, a user causes the patterns that the eye sees to appear to be aligned. The software allows the user to move the apparent position of the patterns. This apparent motion is achieved by pre-warping the position and angle of the ray-bundles exiting the lenslet display. As the user aligns the apparent position of the patterns, the amount of pre-warping varies. The amount of pre-warping required in order for the user to see what appears to be a single, aligned pattern indicates the lens aberration of the eye.
    Type: Application
    Filed: April 22, 2011
    Publication date: January 31, 2013
    Inventors: Vitor Pamplona, Manuel Menezes de Oliveira Neto, Ankit Mohan, Ramesh Raskar
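    A simplified geometric conversion from the measured alignment shift to a spherical refractive error, under a flatland thin-pinhole model: if the two beams enter the eye through pupil points separated by a and the display sits a distance t behind the pinhole plane, a relative display shift of delta makes the beams appear to diverge from a point at distance a*t/delta, i.e. an error of delta/(a*t) diopters. Both the model and the example numbers are assumptions for illustration; the device's actual calibration may differ.

      def spherical_error_diopters(shift_m, pinhole_sep_m, display_gap_m):
          """Refractive error (diopters) implied by the display shift needed for the
          user to see the two patterns as aligned, in a simplified flatland model."""
          if shift_m == 0:
              return 0.0                           # no pre-warp needed: no spherical error
          far_point_m = pinhole_sep_m * display_gap_m / shift_m
          return 1.0 / far_point_m                 # diopters = 1 / far-point distance (m)

      # Example (assumed numbers): 3 mm pinhole spacing, display 20 mm behind the
      # pinholes, 0.12 mm shift needed for alignment -> 2 D in this sign convention.
      print(spherical_error_diopters(0.12e-3, 3e-3, 20e-3))
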
  • Publication number: 20120300062
    Abstract: In exemplary implementations of this invention, a time of flight camera (ToF camera) can estimate the location, motion and size of a hidden moving object, even though (a) the hidden object cannot be seen directly (or through mirrors) from the vantage point of the ToF camera (including the camera's illumination source and sensor), and (b) the object is in a visually cluttered environment. The hidden object is an NLOS (non-line-of-sight) object. The time of flight camera comprises a streak camera and a laser. In these exemplary implementations, the motion and absolute locations of NLOS moving objects in cluttered environments can be estimated through tertiary reflections of pulsed illumination, using relative time differences of arrival at an array of receivers. Also, the size of NLOS moving objects can be estimated by backprojecting extrema of NLOS moving object time responses.
    Type: Application
    Filed: May 23, 2012
    Publication date: November 29, 2012
    Applicant: Massachusetts Institute of Technology
    Inventors: Rohit Pandharkar, Andreas Velten, Ramesh Raskar
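    A minimal sketch of locating a hidden reflector from relative times of arrival at several receivers by grid search: the candidate position whose predicted arrival-time differences best match the measured ones wins. The planar geometry and variable names are illustrative assumptions, not the estimator claimed in the application.

      import numpy as np

      C = 3e8  # speed of light, m/s

      def locate_by_tdoa(rx_positions, tdoa_vs_rx0, grid_x, grid_y):
          """rx_positions: (N, 2) receiver coordinates (m). tdoa_vs_rx0: (N,) measured
          arrival times relative to receiver 0 (s), so element 0 is zero. Returns the
          grid point minimizing the mismatch of relative path delays."""
          X, Y = np.meshgrid(grid_x, grid_y)
          dists = [np.hypot(X - rx[0], Y - rx[1]) for rx in rx_positions]
          err = np.zeros_like(X)
          for k in range(1, len(rx_positions)):
              err += ((dists[k] - dists[0]) / C - tdoa_vs_rx0[k]) ** 2
          i, j = np.unravel_index(np.argmin(err), err.shape)
          return X[i, j], Y[i, j]
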
  • Publication number: 20120206694
    Abstract: In exemplary implementations of this invention, cataracts in the human eye are assessed and mapped by measuring the perceptual impact of forward scattering on the foveal region. The same method can be used to measure scattering/blocking media inside lenses of a camera. Close-range anisotropic displays create collimated beams of light to scan through sub-apertures, scattering light as it strikes a cataract. User feedback is accepted and analyzed, to generate maps for opacity, attenuation, contrast and sub-aperture point-spread functions (PSFs). Optionally, the PSF data is used to reconstruct the individual's cataract-affected view.
    Type: Application
    Filed: February 14, 2012
    Publication date: August 16, 2012
    Applicant: Massachusetts Institute of Technology
    Inventors: Ramesh Raskar, Vitor Pamplona, Erick Passos, Jan Zizka
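    A small sketch of the optional last step, reconstructing an approximation of the cataract-affected view by blurring a sharp image with a point-spread function assembled from the per-sub-aperture measurements. The weighting scheme and the scipy-based convolution are assumptions for illustration, not the application's reconstruction.

      import numpy as np
      from scipy.signal import fftconvolve

      def affected_view(sharp_image, subaperture_psfs, attenuations):
          """sharp_image: 2D array. subaperture_psfs: small 2D kernels, one per scanned
          pupil sub-aperture. attenuations: per-sub-aperture transmission in [0, 1]
          estimated from user feedback. Returns the simulated cataract-affected view."""
          psf = np.zeros_like(subaperture_psfs[0], dtype=float)
          for kernel, a in zip(subaperture_psfs, attenuations):
              psf += a * np.asarray(kernel, dtype=float)   # clearer sub-apertures pass more light
          psf /= psf.sum()
          return fftconvolve(sharp_image, psf, mode="same")
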
  • Patent number: 8229244
    Abstract: Embodiments of the invention describe a method for reducing blur in an image of a scene. First, we acquire a set of images of the scene, wherein each image includes an object having a blur associated with a point spread function (PSF); together these PSFs form a set that is suitable for a null-filling operation. Next, we jointly invert the set of images and the set of PSFs to produce an output image having reduced blur.
    Type: Grant
    Filed: March 30, 2009
    Date of Patent: July 24, 2012
    Assignee: Mitsubishi Electric Research Laboratories, Inc.
    Inventors: Amit Agrawal, Yi Xu, Ramesh Raskar
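    A minimal frequency-domain sketch of the joint inversion idea: spectral zeros of one PSF are filled by the others, so a per-frequency least-squares combination over the whole set stays well conditioned. This is a generic multi-image Wiener-style solver written for illustration, not necessarily the patent's exact method.

      import numpy as np

      def joint_deblur(images, psfs, eps=1e-3):
          """images: equally sized blurred 2D arrays; psfs: the matching blur kernels
          (kernel origin at index (0, 0)). Returns a single deblurred estimate."""
          shape = images[0].shape
          num = np.zeros(shape, dtype=complex)
          den = np.full(shape, eps)                # small regularizer for uncovered frequencies
          for y, k in zip(images, psfs):
              K = np.fft.fft2(k, s=shape)          # zero-padded PSF spectrum
              num += np.conj(K) * np.fft.fft2(y)
              den += np.abs(K) ** 2
          return np.real(np.fft.ifft2(num / den))
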
  • Publication number: 20120140131
    Abstract: In exemplary implementations of this invention, two LCD screens display a multi-view 3D image that has both horizontal and vertical parallax, and that does not require a viewer to wear any special glasses. Each pixel in the LCDs can take on any value: the pixel can be opaque, transparent, or any shade in between. For regions of the image that are adjacent to a step function (e.g., a depth discontinuity) and not adjacent to a sharp corner, the screens display local parallax barriers comprising many small slits. The barriers and the slits tend to be oriented perpendicular to the local angular gradient of the target light field. In some implementations, the display is optimized to minimize the Euclidean distance between the desired light field and the actual light field that is produced. Weighted, non-negative matrix factorization (NMF) is used for this optimization.
    Type: Application
    Filed: December 1, 2011
    Publication date: June 7, 2012
    Applicant: Massachusetts Institute of Technology
    Inventors: Douglas Lanman, Matthew Hirsch, Yun Hee Kim, Szymon Jakubczak, Ramesh Raskar
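    A toy sketch of the factorization idea for a single pair of layer patterns: the target light field, arranged as a nonnegative matrix with rows indexed by one layer's pixels and columns by the other's, is approximated by the outer product of two nonnegative transmittance vectors using standard multiplicative NMF updates. Weighting, time multiplexing over several frame pairs, and display calibration are omitted; this is an illustrative simplification, not the optimization in the application.

      import numpy as np

      def rank1_nmf(L, n_iter=200, eps=1e-9):
          """L: nonnegative (front-layer pixels x rear-layer pixels) target light field.
          Returns nonnegative vectors f, g with L approximately equal to outer(f, g)."""
          rng = np.random.default_rng(0)
          f = rng.random(L.shape[0]) + eps
          g = rng.random(L.shape[1]) + eps
          for _ in range(n_iter):
              f *= (L @ g) / (f * (g @ g) + eps)   # multiplicative updates keep f, g nonnegative
              g *= (L.T @ f) / (g * (f @ f) + eps)
          return f, g
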
  • Publication number: 20120075423
    Abstract: In illustrative implementations of this invention, multi-path analysis of transient illumination is used to reconstruct scene geometry, even of objects that are occluded from the camera. An ultrafast camera system is used. It comprises a photo-sensor (e.g., accurate in the picosecond range), a pulsed illumination source (e.g. a femtosecond laser) and a processor. The camera emits a very brief light pulse that strikes a surface and bounces. Depending on the path taken, part of the light may return to the camera after one, two, three or more bounces. The photo-sensor captures the returning light bounces in a three-dimensional time image I(x,y,t) for each pixel. The camera takes different angular samples from the same viewpoint, recording a five-dimensional STIR (Space Time Impulse Response). A processor analyzes onset information in the STIR to estimate pairwise distances between patches in the scene, and then employs isometric embedding to estimate patch coordinates.
    Type: Application
    Filed: September 29, 2010
    Publication date: March 29, 2012
    Applicant: Massachusetts Institute of Technology
    Inventors: Ahmed Kirmani, Ramesh Raskar, James Davis
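    A compact sketch of the final embedding step: given estimated pairwise distances between scene patches, classical multidimensional scaling recovers 3D patch coordinates up to a rigid transform. This is the textbook construction, offered as an illustration of isometric embedding rather than the application's specific solver.

      import numpy as np

      def classical_mds(D, dim=3):
          """D: (n, n) matrix of pairwise patch distances. Returns (n, dim) coordinates
          whose mutual distances approximate D (up to rotation, translation, reflection)."""
          n = D.shape[0]
          J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
          B = -0.5 * J @ (D ** 2) @ J              # double-centered squared distances
          w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
          idx = np.argsort(w)[::-1][:dim]          # keep the largest eigenvalues
          return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
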
  • Publication number: 20120057040
    Abstract: Provided are an apparatus and method for processing a light field image that is acquired and processed using a mask to spatially modulate a light field. The apparatus includes a lens, a mask to spatially modulate 4D light field data of a scene passing through the lens to include wideband information on the scene, a sensor to detect a 2D image corresponding to the spatially modulated 4D light field data, and a data processing unit to recover the 4D light field data from the 2D image to generate an all-in-focus image.
    Type: Application
    Filed: May 6, 2011
    Publication date: March 8, 2012
    Inventors: Byung Kwan Park, Ghulam Ahmed Kirmani, Ramesh Raskar, Rohit Pandharkar
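    A flatland sketch of the recovery step under a heterodyne reading of such a mask: the mask lays spectral slices of the 2D (position x angle) light field side by side in the 1D sensor spectrum, so recovery amounts to cutting the sensor's Fourier transform into tiles and stacking them before an inverse transform. Tile count, ordering, and normalization are illustrative assumptions and would have to match the actual mask design.

      import numpy as np

      def recover_flatland_light_field(sensor_row, n_angles):
          """sensor_row: 1D captured scanline whose length is a multiple of n_angles.
          Returns an (n_angles, n_x) flatland light-field estimate."""
          S = np.fft.fftshift(np.fft.fft(sensor_row))
          n_x = sensor_row.size // n_angles
          tiles = S.reshape(n_angles, n_x)         # adjacent spectral tiles become angular rows
          return np.real(np.fft.ifft2(np.fft.ifftshift(tiles)))
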
  • Publication number: 20110316968
    Abstract: A single camera acquires an input image of a scene as observed in an array of spheres, wherein pixels in the input image corresponding to each sphere form a sphere image. A set of virtual cameras is defined for each sphere on a line joining a center of the sphere and a center of projection of the camera, wherein each virtual camera has a different virtual viewpoint and an associated cone of rays, appearing as a circle of pixels on its virtual image plane. A projective texture mapping of each sphere image is applied to all of the virtual cameras on the virtual image plane to produce a virtual camera image comprising a circle of pixels. Each virtual camera image for each sphere is then projected to a refocusing geometry using a refocus viewpoint to produce a wide-angle lightfield view; these views are averaged to produce a refocused wide-angle image.
    Type: Application
    Filed: June 29, 2010
    Publication date: December 29, 2011
    Inventors: Yuichi Taguchi, Amit K. Agrawal, Ashok N. Veeraraghavan, Srikumar Ramalingam, Ramesh Raskar
  • Patent number: 8009192
    Abstract: An optical receiver is arranged at a location in a scene. The optical receiver includes a photo sensor configured to detect spatio-temporal modulated optical signals directed at the scene from a set of spatially dispersed optical transmitters, and to convert the optical signals from each of the optical transmitters to a corresponding electronic signal. The electronic signals can be analyzed to determine geometric properties of the location in the scene.
    Type: Grant
    Filed: May 17, 2006
    Date of Patent: August 30, 2011
    Assignee: Mitsubishi Electric Research Laboratories, Inc.
    Inventors: Ramesh Raskar, Hideaki Nii, Jay W. Summet, Yong Zhao, Paul H. Dietz, Jonathan Westhues, Michael Noland, Erich Bruns, Shree Nayar, Vlad Branzoi
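    A small sketch of turning a receiver's photosensor time series into a location, assuming, as one concrete possibility not stated in the abstract, that the transmitters project a Gray-coded binary sequence: threshold each frame's reading into a bit, then convert the Gray code into a coordinate index.

      def decode_gray_samples(samples, threshold):
          """samples: per-frame photosensor readings, most significant bit first.
          Returns the integer coordinate encoded by the Gray-coded pattern sequence."""
          value, prev = 0, 0
          for s in samples:
              bit = 1 if s > threshold else 0
              prev ^= bit                    # Gray -> binary: XOR with previous binary bit
              value = (value << 1) | prev
          return value
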
  • Publication number: 20110191073
    Abstract: In an exemplary implementation of this invention, light from a scattering scene passes through a spatial light attenuation pattern and strikes a sensor plane of a camera. Based on said camera's measurements of the received light, a processing unit calculates angular samples of the received light. Light that strikes the sensor plane at certain angles comprises both scattered and directly transmitted components; whereas light that strikes at other angles comprises solely scattered light. A processing unit calculates a polynomial model for the intensity of scattered-only light that falls at the latter angles, and further estimates the direct-only component of the light that falls at the former angles. Further, a processing unit may use the estimated direct component to calculate a reconstructed 3D shape, such as a 3D shape of a finger vein pattern, using an algebraic reconstruction technique.
    Type: Application
    Filed: February 4, 2010
    Publication date: August 4, 2011
    Applicant: Massachusetts Institute of Technology
    Inventors: Jaewon Kim, Ramesh Raskar
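    A 1D sketch of the separation step: fit a low-order polynomial to the intensities measured at angles that carry only scattered light, predict the scattered contribution at the angles that carry both components, and subtract to estimate the direct component. The polynomial order and clipping are assumptions for illustration.

      import numpy as np

      def estimate_direct(angles, intensity, scatter_only_mask, order=3):
          """angles, intensity: 1D angular samples for one sensor location.
          scatter_only_mask: True where the measurement contains scattered light only.
          Returns the estimated direct-only component at every angle."""
          coeffs = np.polyfit(angles[scatter_only_mask], intensity[scatter_only_mask], order)
          scatter_model = np.polyval(coeffs, angles)             # smooth scattered component everywhere
          return np.clip(intensity - scatter_model, 0.0, None)   # direct light is nonnegative
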
  • Patent number: 7983487
    Abstract: A method and system determines a 3D pose of an object in a scene. Depth edges are determined from a set of images acquired of a scene including multiple objects while varying illumination in the scene. The depth edges are linked to form contours. The images are segmented into regions according to the contours. An occlusion graph is constructed using the regions. The occlusion graph includes a source node representing an unoccluded region of an unoccluded object in the scene. The contour associated with the unoccluded region is compared with a set of silhouettes of the objects, in which each silhouette has a known pose. The known pose of the best-matching silhouette is selected as the pose of the unoccluded object.
    Type: Grant
    Filed: November 7, 2007
    Date of Patent: July 19, 2011
    Assignee: Mitsubishi Electric Research Laboratories, Inc.
    Inventors: Amit K. Agrawal, Ramesh Raskar
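    A brief sketch of the final matching step, using a one-directional chamfer distance as one common way to compare the extracted contour against stored silhouettes of known pose; this generic formulation is a stand-in for whatever comparison the patent actually claims.

      import numpy as np
      from scipy.ndimage import distance_transform_edt

      def best_matching_pose(region_contour, silhouettes):
          """region_contour: binary edge image of the unoccluded region's contour.
          silhouettes: list of (binary_contour_image, pose) templates of the same size.
          Returns the pose whose silhouette has the lowest mean chamfer distance."""
          dist_to_contour = distance_transform_edt(~region_contour.astype(bool))
          best_pose, best_score = None, np.inf
          for template, pose in silhouettes:
              score = dist_to_contour[template.astype(bool)].mean()  # mean distance of template edges
              if score < best_score:
                  best_pose, best_score = pose, score
          return best_pose
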
  • Patent number: 7965936
    Abstract: A camera acquires a 4D light field of a scene. The camera includes a lens and a sensor. A mask is arranged in a straight optical path between the lens and the sensor. The mask includes an attenuation pattern to spatially modulate the 4D light field acquired of the scene by the sensor. The pattern has a low spatial frequency when the mask is arranged near the lens, and a high spatial frequency when the mask is arranged near the sensor.
    Type: Grant
    Filed: June 30, 2010
    Date of Patent: June 21, 2011
    Assignee: Mitsubishi Electric Research Laboratories, Inc.
    Inventors: Ramesh Raskar, Amit K. Agrawal
  • Patent number: 7957007
    Abstract: A projector that illuminates a scene with multiplexed light patterns includes a passive physical mask, and a set of spatially dispersed optical emitters arranged behind the physical mask. The optical emitters are modulated to project a set of unique optical light patterns.
    Type: Grant
    Filed: May 17, 2006
    Date of Patent: June 7, 2011
    Assignee: Mitsubishi Electric Research Laboratories, Inc.
    Inventors: Ramesh Raskar, Paul H. Dietz
  • Publication number: 20110017826
    Abstract: In an illustrative implementation of this invention, an optical pattern that encodes binary data is printed on a transparency. For example, the pattern may comprise data matrix codes. A lenslet is placed at a distance equal to its focal length from the optical pattern, and thus collimates light from the optical pattern. The collimated light travels to a conventional camera. For example, the camera may be meters distant. The camera takes a photograph of the optical pattern at a time that the camera is not focused on the scene that it is imaging, but instead is focused at infinity. Because the light is collimated, however, a focused image is captured at the camera's focal plane. The binary data in the pattern may include information regarding the object to which the optical pattern is affixed and information from which the camera's pose may be calculated.
    Type: Application
    Filed: July 16, 2010
    Publication date: January 27, 2011
    Applicant: Massachusetts Institute of Technology
    Inventors: Ankit Mohan, Ramesh Raskar, Shinsaku Hiura, Quinn Smithwick, Grace Woo
  • Publication number: 20110019056
    Abstract: A bidirectional screen alternately switches between a display mode showing conventional graphics and a capture mode in which the LCD backlight is disabled and the LCD displays a pinhole array or a tiled-broadband code. A large-format image sensor is placed behind the liquid crystal layer. Together, the image sensor and LCD function as a mask-based light field camera, capturing an array of images equivalent to that produced by an array of cameras spanning the display surface. The recovered multi-view orthographic imagery is used to passively estimate the depth of scene points from focus.
    Type: Application
    Filed: November 20, 2009
    Publication date: January 27, 2011
    Applicant: Massachusetts Institute of Technology
    Inventors: Matthew Hirsch, Ramesh Raskar, Henry Holtzman, Douglas Lanman
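    A short sketch of the depth-from-focus step: given a stack of views synthetically refocused at several candidate depths, pick per pixel the depth at which a local focus measure (here, a smoothed squared Laplacian response) peaks. The focus measure and window size are assumptions for illustration.

      import numpy as np
      from scipy.ndimage import laplace, uniform_filter

      def depth_from_focus(refocus_stack, depths):
          """refocus_stack: (n_depths, H, W) images refocused at the given depths.
          Returns an (H, W) map of the depth with the sharpest local focus per pixel."""
          sharpness = [uniform_filter(laplace(img.astype(float)) ** 2, size=9)
                       for img in refocus_stack]
          best = np.argmax(np.stack(sharpness), axis=0)
          return np.asarray(depths)[best]
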
  • Publication number: 20100265386
    Abstract: A camera acquires a 4D light field of a scene. The camera includes a lens and a sensor. A mask is arranged in a straight optical path between the lens and the sensor. The mask includes an attenuation pattern to spatially modulate the 4D light field acquired of the scene by the sensor. The pattern has a low spatial frequency when the mask is arranged near the lens, and a high spatial frequency when the mask is arranged near the sensor.
    Type: Application
    Filed: June 30, 2010
    Publication date: October 21, 2010
    Inventors: Ramesh Raskar, Amit K. Agrawal
  • Publication number: 20100259670
    Abstract: In exemplary implementations of this invention, a lens and sensor of a camera are intentionally destabilized (i.e., shifted relative to the scene being imaged) in order to create defocus effects. That is, actuators in a camera move a lens and a sensor, relative to the scene being imaged, while the camera takes a photograph. This motion simulates a larger aperture size (shallower depth of field). Thus, by translating a lens and a sensor while taking a photo, a camera with a small aperture (such as a cell phone or small point-and-shoot camera) may simulate the shallow DOF that can be achieved with a professional SLR camera. This invention may be implemented in such a way that programmable defocus effects may be achieved. Also, approximately depth-invariant defocus blur size may be achieved over a range of depths in some embodiments of this invention.
    Type: Application
    Filed: April 12, 2010
    Publication date: October 14, 2010
    Applicant: Massachusetts Institute of Technology
    Inventors: Ankit Mohan, Douglas Lanman, Shinsaku Hiura, Ramesh Raskar
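    A sketch of the equivalent synthetic-aperture view of this idea: frames captured from laterally shifted viewpoints, re-aligned so that the chosen focal plane registers exactly, average into a photo whose out-of-plane content blurs as if taken through a much larger aperture. The shift-and-add formulation and its parameters are assumptions for illustration, not the in-exposure optical implementation the abstract describes.

      import numpy as np
      from scipy.ndimage import shift as nd_shift

      def synthetic_shallow_dof(images, baselines_m, focus_disparity_px_per_m):
          """images: list of (H, W) frames captured from laterally shifted positions.
          baselines_m: lateral offset of each frame relative to the first (m).
          focus_disparity_px_per_m: pixel shift per meter of baseline that registers
          the desired focal plane. Returns the averaged shallow-depth-of-field image."""
          acc = np.zeros_like(images[0], dtype=float)
          for img, b in zip(images, baselines_m):
              dx = b * focus_disparity_px_per_m            # align the focal plane across frames
              acc += nd_shift(img.astype(float), shift=(0.0, -dx), order=1)
          return acc / len(images)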