Patents by Inventor Ramesh Raskar

Ramesh Raskar has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10191154
    Abstract: In some implementations, scene depth is extracted from the dual frequency of a cross-correlation signal. A camera may illuminate a scene with amplitude-modulated light, sweeping the modulation frequency. For each modulation frequency in the sweep, each camera pixel may measure a cross-correlation of incident light and of a reference electrical signal. Each pixel may output a vector of cross-correlation measurements acquired by the pixel during a sweep. A computer may perform an FFT on this vector, identify a dual frequency at the second-largest peak in the resulting power spectrum, and calculate scene depth as the speed of light times this dual frequency, divided by four times pi. In some cases, the two signals being cross-correlated have the same phase as each other during each cross-correlation measurement.
    Type: Grant
    Filed: February 13, 2017
    Date of Patent: January 29, 2019
    Assignee: Massachusetts Institute of Technology
    Inventors: Achuta Kadambi, James Schiel, Ayush Bhandari, Ramesh Raskar, Vage Taamazyan
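
To make the depth-recovery pipeline above concrete, here is a minimal numpy sketch. The sweep step, sample count, and simulated distance are illustrative assumptions; only the final formula (depth = speed of light × dual frequency / 4π) comes from the abstract.

```python
import numpy as np

C = 3e8  # speed of light, m/s

# Hypothetical sweep: one pixel's cross-correlation samples at each
# modulation frequency, simulated for a scene point 5 m away.
delta_f = 1e6                       # assumed sweep step, Hz
n = 1024                            # assumed number of sweep samples
f = np.arange(n) * delta_f          # modulation frequencies
d_true = 5.0                        # metres
sweep = np.cos(2 * np.pi * f * (2 * d_true / C))  # pixel's sweep vector

# FFT of the sweep vector. Skipping the DC bin and taking the largest
# remaining peak corresponds to the abstract's "second largest peak".
power = np.abs(np.fft.rfft(sweep)) ** 2
bins = np.fft.rfftfreq(n, d=delta_f)     # conjugate axis, in seconds
peak = 1 + np.argmax(power[1:])

# Abstract's formula, with the dual frequency in angular units:
omega_dual = 2 * np.pi * bins[peak]
depth = C * omega_dual / (4 * np.pi)
print(f"recovered depth = {depth:.2f} m")  # ~5 m
```
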
  • Patent number: 10190983
    Abstract: A light source may illuminate a scene with pulsed light that is pulsed non-periodically. The scene may include fluorescent material that fluoresces in response to the pulsed light. The pulsed light signal may comprise a maximum length sequence or Gold sequence. A lock-in time-of-flight sensor may take measurements of light returning from the scene. A computer may, for each pixel in the sensor, perform a Discrete Fourier Transform on measurements taken by the pixel, in order to calculate a vector of complex numbers for the pixel. Each complex number in the vector may encode phase and amplitude of incident light at the pixel and may correspond to measurements taken at a given time interval during the pulsed light signal. A computer may, based on phase of the complex numbers for a pixel, calculate fluorescence lifetime and scene depth of a scene point that corresponds to the pixel.
    Type: Grant
    Filed: April 14, 2017
    Date of Patent: January 29, 2019
    Assignee: Massachusetts Institute of Technology
    Inventors: Ayush Bhandari, Christopher Barsi, Achuta Kadambi, Ramesh Raskar
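
Per pixel, the abstract reduces to reading depth and lifetime out of DFT phases. Below is a hedged numpy/scipy sketch that assumes the standard frequency-domain fluorescence model (a travel-time delay plus an arctan(ωτ) phase lag per harmonic); the scene values, harmonic choices, and solver are illustrative, not taken from the patent.

```python
import numpy as np
from scipy.optimize import fsolve

C = 3e8  # speed of light, m/s

# Assumed scene for one pixel: a fluorescent point 1 m away with a
# 4 ns lifetime. The pixel's lock-in time series is synthesised so
# that each harmonic carries a travel delay plus an arctan(w * tau)
# fluorescence phase lag.
d_true, tau_true = 1.0, 4e-9
n, dt = 256, 1e-9                     # samples, sampling period (s)
k = np.array([5, 10])                 # two harmonic bins (assumed)
w = 2 * np.pi * k / (n * dt)          # their angular frequencies
phase = (2 * d_true / C) * w + np.arctan(w * tau_true)
t = np.arange(n) * dt
signal = np.cos(w[0] * t + phase[0]) + np.cos(w[1] * t + phase[1])

# DFT of the pixel's measurements -> vector of complex numbers whose
# phase at each harmonic encodes both depth and lifetime.
measured = np.angle(np.fft.fft(signal)[k])

# Solve the two-harmonic phase model for depth (m) and lifetime (ns).
def residual(params):
    d, tau_ns = params
    return (2 * d / C) * w + np.arctan(w * tau_ns * 1e-9) - measured

d_est, tau_ns_est = fsolve(residual, x0=[0.5, 1.0])
print(d_est, tau_ns_est)  # ~1.0 (m), ~4.0 (ns)
```
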
  • Patent number: 10149136
    Abstract: In one embodiment, a method includes detecting, by a first computing device associated with a first user, a triggering event to initiate a communication session with a second computing device associated with a second user, where the first computing device includes one or more wireless transceivers and one or more sensors; determining an initial trust score for the second computing device; sensing physical interactions between the users using the one or more sensors; adjusting the trust score for the second computing device based at least on the sensed physical interactions; and sending a message to the second computing device if the adjusted trust score for the second computing device satisfies a first threshold.
    Type: Grant
    Filed: January 10, 2018
    Date of Patent: December 4, 2018
    Assignee: Facebook, Inc.
    Inventors: Sai Sri Sathya, Ramesh Raskar
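
The claim describes a simple gate: adjust a trust score with each sensed interaction, then send only if a threshold is met. A minimal sketch follows; the initial score, per-interaction weights, and threshold are invented placeholders, not values from the patent.

```python
TRUST_THRESHOLD = 0.8  # assumed "first threshold"

def should_send_message(initial_trust: float, interactions: list[float]) -> bool:
    """Adjust the trust score for the second device by each sensed
    physical interaction, then gate the message on the result."""
    trust = initial_trust
    for weight in interactions:  # e.g. a handshake sensed by an accelerometer
        trust = min(1.0, trust + weight)
    return trust >= TRUST_THRESHOLD

# A device starting at 0.5 trust, after two sensed interactions:
print(should_send_message(0.5, [0.2, 0.15]))  # True
```
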
  • Patent number: 10105049
    Abstract: A projector and one or more optical components project a light pattern that scans at least a portion of an anterior segment of an eye of a user, while one or more cameras capture images of the anterior segment. During each scan, different pixels in the projector emit light at different times, causing the light pattern to repeatedly change orientation relative to the eye and thus to illuminate multiple different cross-sections of the anterior segment. The cameras capture images of each cross-section from a total of at least two different vantage points relative to the head of the user. The position of the projector, optical components and cameras relative to the head of the user remains substantially constant throughout each entire scan.
    Type: Grant
    Filed: January 19, 2016
    Date of Patent: October 23, 2018
    Inventors: Shantanu Sinha, Hyunsung Park, Albert Redo-Sanchez, Matthew Everett Lawson, Nickolaos Savidis, Pushyami Rachapudi, Ramesh Raskar, Vincent Patalano, II
  • Publication number: 20180259455
    Abstract: A light source may illuminate a scene with pulsed light that is pulsed non-periodically. The scene may include fluorescent material that fluoresces in response to the pulsed light. The pulsed light signal may comprise a maximum length sequence or Gold sequence. A lock-in time-of-flight sensor may take measurements of light returning from the scene. A computer may, for each pixel in the sensor, perform a Discrete Fourier Transform on measurements taken by the pixel, in order to calculate a vector of complex numbers for the pixel. Each complex number in the vector may encode phase and amplitude of incident light at the pixel and may correspond to measurements taken at a given time interval during the pulsed light signal. A computer may, based on phase of the complex numbers for a pixel, calculate fluorescence lifetime and scene depth of a scene point that corresponds to the pixel.
    Type: Application
    Filed: April 14, 2017
    Publication date: September 13, 2018
    Inventors: Ayush Bhandari, Christopher Barsi, Achuta Kadambi, Ramesh Raskar
  • Publication number: 20180259454
    Abstract: A light source may illuminate a scene with amplitude-modulated light. The scene may include fluorescent material. The amplitude modulation may be periodic, and the frequency of the amplitude modulation may be swept. During the sweep, a time-of-flight sensor may take measurements of light returning from the scene. A computer may calculate, for each pixel in the sensor, a vector of complex numbers. Each complex number in the vector may encode phase and amplitude of light incident at the pixel and may correspond to measurements taken at a given frequency in the sweep. A computer may, based on phase of the complex numbers for a pixel, calculate fluorescence lifetime and scene depth of a scene point that corresponds to the pixel.
    Type: Application
    Filed: April 14, 2017
    Publication date: September 13, 2018
    Inventors: Ayush Bhandari, Christopher Barsi, Achuta Kadambi, Ramesh Raskar
  • Publication number: 20180168440
    Abstract: An otoscope may project a temporal sequence of phase-shifted fringe patterns onto an eardrum, while a camera in the otoscope captures images. A computer may calculate a global component of these images. Based on this global component, the computer may output an image of the middle ear and eardrum. This image may show middle ear structures, such as the stapes and incus. Thus, the otoscope may “see through” the eardrum to visualize the middle ear. The otoscope may project another temporal sequence of phase-shifted fringe patterns onto the eardrum, while the camera captures additional images. The computer may subtract a fraction of the global component from each of these additional images. Based on the resulting direct-component images, the computer may calculate a 3D map of the eardrum.
    Type: Application
    Filed: December 20, 2017
    Publication date: June 21, 2018
    Inventors: Anshuman Das, Ramesh Raskar
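
The separation step reads like the classic direct/global decomposition under high-frequency structured light, so the sketch below uses that technique (an assumption; the patent's exact separation may differ). Per pixel, the minimum over the phase-shifted fringe images contains only globally scattered light, while the maximum adds the direct component.

```python
import numpy as np

def separate_global_direct(stack):
    """stack: (N, H, W) images captured under N phase-shifted,
    high-frequency fringe patterns (Nayar-style separation, assumed)."""
    l_max = stack.max(axis=0)
    l_min = stack.min(axis=0)
    direct = l_max - l_min   # light returned directly by the eardrum surface
    glob = 2.0 * l_min       # light scattered from behind it (middle ear)
    return glob, direct

# Per the abstract, an image of the middle ear comes from the global
# component, and a 3D map of the eardrum from direct-component images:
#   direct_images = stack - alpha * glob   # alpha: an assumed fraction
```
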
  • Patent number: 10003725
    Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front-end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
    Type: Grant
    Filed: January 3, 2018
    Date of Patent: June 19, 2018
    Assignee: Massachusetts Institute of Technology
    Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
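
Since the arrival time of calibration light encodes each fiber's front-end position, de-shuffling reduces to a permutation. A minimal sketch, assuming a 1D bundle and a monotonic time-to-position sweep (both assumptions; all names are invented):

```python
import numpy as np

def build_fiber_map(arrival_times):
    """Calibration: the time at which pulse light enters a fiber
    correlates directly with the fiber's front-end position, so
    sorting back-end fibers by arrival time recovers front-end order."""
    return np.argsort(arrival_times)

def deshuffle(back_end_pixels, fiber_map):
    """Reorder back-end pixel values into front-end (scene) order."""
    return back_end_pixels[fiber_map]

# Hypothetical 5-fiber bundle: ToF-measured arrival times, in ns
times = np.array([3.2, 1.1, 4.8, 2.0, 0.5])
shuffled = np.array([10, 20, 30, 40, 50])   # back-end pixel values
print(deshuffle(shuffled, build_fiber_map(times)))  # [50 20 40 10 30]
```
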
  • Publication number: 20180131851
    Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front-end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
    Type: Application
    Filed: January 3, 2018
    Publication date: May 10, 2018
    Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
  • Publication number: 20180113321
    Abstract: A sample may be illuminated in such a way that light passes through the sample, reflects from a set of reflectors, passes through the sample again and travels to a light sensor. The reflectors may be staggered in depth beneath the sample, each reflector being at a different depth. Light reflecting from each reflector, respectively, may arrive at the light sensor during a different time interval than that in which light reflecting from other reflectors arrives—or may have a different phase than that of light reflecting from the other reflectors. The light sensor may separately measure light reflecting from each reflector, respectively. The reflectors may be extremely small, and the separate reflections from the different reflectors may be combined in a super-resolved image. The super-resolved image may have a spatial resolution that is better than that indicated by the diffraction limit.
    Type: Application
    Filed: October 23, 2017
    Publication date: April 26, 2018
    Inventors: Barmak Heshmat Dehkordi, Albert Redo-Sanchez, Gordon Moseley Andrews, Ramesh Raskar
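
In effect, the depth stagger converts lateral sub-pixel positions into distinct arrival times, so a time-resolved measurement can be re-interleaved onto a finer grid. The sketch below assumes four reflectors, a 1D scan, and perfect time binning, all for illustration only.

```python
import numpy as np

n_reflectors, low_res = 4, 16
# time_binned[k]: the image formed from light arriving in time bin k,
# i.e. from reflector k alone (random stand-in data).
time_binned = np.random.default_rng(0).random((n_reflectors, low_res))

# Each reflector samples the same diffraction-limited spots at a
# different lateral sub-position, so interleaving the separately
# measured returns yields a line sampled 4x finer.
super_resolved = np.empty(n_reflectors * low_res)
for k in range(n_reflectors):
    super_resolved[k::n_reflectors] = time_binned[k]
print(super_resolved.shape)  # (64,) from four 16-sample measurements
```
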
  • Patent number: 9897699
    Abstract: A time-of-flight camera images an object around a corner or through a diffuser. In the case of imaging around a corner, light from a hidden target object reflects off a diffuse surface and travels to the camera. Points on the diffuse surface function as virtual sensors. In the case of imaging through a diffuser, light from the target object is transmitted through a diffusive medium and travels to the camera. Points on a surface of the diffusive medium that is visible to the camera function as virtual sensors. In both cases, a computer represents phase and intensity measurements taken by the camera as a system of linear equations and solves a linear inverse problem to (i) recover an image of the target object, or (ii) compute a 3D position for each point in a set of points on an exterior surface of the target object.
    Type: Grant
    Filed: July 9, 2015
    Date of Patent: February 20, 2018
    Assignee: Massachusetts Institute of Technology
    Inventors: Achuta Kadambi, Hang Zhao, Boxin Shi, Ayush Bhandari, Ramesh Raskar
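
A tiny sketch of the recovery step: the measurements are modeled as a linear system y = A x and the hidden scene x is recovered by solving the inverse problem. The transport matrix here is a random stand-in (in practice it would be built from the geometry of the visible wall), and Tikhonov regularization is one common choice for such ill-conditioned systems, assumed here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))  # assumed: 200 measurements, 50 unknowns
x_true = rng.random(50)             # hidden scene (stand-in)
y = A @ x_true + 0.01 * rng.standard_normal(200)  # noisy phase/intensity data

# Regularized least squares (Tikhonov):
lam = 1e-2
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(50), A.T @ y)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))  # small
```
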
  • Patent number: 9894254
    Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front-end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
    Type: Grant
    Filed: May 10, 2016
    Date of Patent: February 13, 2018
    Assignee: Massachusetts Institute of Technology
    Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
  • Publication number: 20170372201
    Abstract: A deep neural network may be trained on the data of one or more entities, also known as Alices. An outside computing entity, also known as a Bob, may assist in these computations without receiving access to Alices' data. Data privacy may be preserved by employing a “split” neural network. The network may comprise an Alice part and a Bob part. The Alice part may comprise at least three neural layers, and the Bob part may comprise at least two neural layers. When training on data of an Alice, that Alice may input her data into the Alice part, perform forward propagation through the Alice part, and then pass the output activations of the final layer of the Alice part to Bob. Bob may then forward propagate through the Bob part. Similarly, backpropagation may proceed backwards through the Bob part and then through the Alice part of the network.
    Type: Application
    Filed: June 22, 2017
    Publication date: December 28, 2017
    Inventors: Otkrist Gupta, Ramesh Raskar
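
The forward/backward hand-off is easy to see in code. Below is a minimal single-process PyTorch sketch of the split (layer sizes, optimiser, and data are illustrative; in a real deployment Alice and Bob run on separate machines and exchange only the cut-layer tensors):

```python
import torch
import torch.nn as nn

# Alice holds the data and the first layers; Bob holds the rest.
alice_net = nn.Sequential(nn.Linear(32, 64), nn.ReLU(),
                          nn.Linear(64, 64), nn.ReLU(),
                          nn.Linear(64, 16))        # at least three layers
bob_net = nn.Sequential(nn.Linear(16, 16), nn.ReLU(),
                        nn.Linear(16, 2))           # at least two layers
opt = torch.optim.SGD(list(alice_net.parameters()) +
                      list(bob_net.parameters()), lr=0.1)

x, labels = torch.randn(8, 32), torch.randint(0, 2, (8,))

# Alice forward-propagates and passes only the final-layer activations.
smashed = alice_net(x)
sent = smashed.detach().requires_grad_()  # the tensor that crosses the wire

# Bob finishes the forward pass, then backpropagates through his part.
loss = nn.functional.cross_entropy(bob_net(sent), labels)
loss.backward()

# Bob returns the gradient at the cut; Alice resumes backpropagation.
smashed.backward(sent.grad)
opt.step()
```

Note that Alice's raw data `x` never reaches Bob; he sees only `sent` and returns `sent.grad`.
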
  • Patent number: 9844323
    Abstract: In exemplary implementations of this invention, a bi-ocular apparatus presents visual stimuli to one eye of a human subject in order to relax that eye, while measuring refractive aberration of the subject's other eye. Alternately, a monocular device presents stimuli to relax an eye while testing the same eye. The apparatus induces eye relaxation by displaying virtual objects at varying apparent distances from the subject. For example, the apparatus may do so by (i) changing the distance between a backlit film and a lens; (ii) using extra lenses; (iii) using an adaptive lens that changes power; (iv) selecting distinct positions in a progressive or multi-focal-length lens; (v) selecting distinct optical depths by fiber-optic illumination; (vi) displaying a 3D virtual image at any given apparent depth; or (vii) displaying both a warped version of the real world and a test image at the same time.
    Type: Grant
    Filed: July 20, 2013
    Date of Patent: December 19, 2017
    Assignee: Massachusetts Institute of Technology
    Inventors: Vitor Pamplona, Ramesh Raskar
  • Publication number: 20170331990
    Abstract: An open-ended, incoherent bundle of optical fibers transmits light from a nearby scene. A camera captures images of the back end of the fiber bundle. Because the fiber bundle is incoherent, the captured image is shuffled, in the sense that the relative position of pixels in the image differs from the relative position of the scene regions that correspond to the pixels. Calibration is performed in order to map from the front-end positions to the back-end positions of the fibers. In the calibration, pulses of light are delivered, in such a way that the time at which light reflecting from a given pulse enters a given fiber directly correlates to the position of the front end of the given fiber. A time-of-flight sensor takes measurements indicative of these time signatures. Based on the map obtained from calibration, a computer de-shuffles the image.
    Type: Application
    Filed: May 10, 2016
    Publication date: November 16, 2017
    Inventors: Barmak Heshmat Dehkordi, Ik Hyun Lee, Hisham Bedri, Ramesh Raskar
  • Patent number: 9778363
    Abstract: In illustrative implementations, a time-of-flight camera robustly measures scene depths, despite multipath interference. The camera emits amplitude-modulated light. An FPGA sends at least two electrical signals, the first to control modulation of the radiant power of a light source and the second a reference signal to control modulation of pixel gain in a light sensor. These signals are identical, except for time delays. The signals comprise binary codes that are m-sequences or other broadband codes. The correlation waveform is not sinusoidal. During measurements, only one fundamental modulation frequency is used. One or more computer processors solve a linear system by deconvolution, in order to recover an environmental function. Sparse deconvolution is used if the scene has only a few objects at a finite depth. Another algorithm, such as Wiener deconvolution, is used if the scene has global illumination or a scattering medium.
    Type: Grant
    Filed: October 24, 2014
    Date of Patent: October 3, 2017
    Assignee: Massachusetts Institute of Technology
    Inventors: Achuta Kadambi, Refael Whyte, Ayush Bhandari, Lee Streeter, Christopher Barsi, Adrian Dorrington, Ramesh Raskar
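
The recovery the abstract describes is a deconvolution of the measured correlation waveform by the known code. Here is a hedged numpy sketch of the Wiener branch (the code, its length, and the SNR constant are illustrative assumptions):

```python
import numpy as np

def wiener_deconvolve(measured, kernel, snr=100.0):
    """Recover the environmental function from one pixel's correlation
    waveform, given the known correlation kernel of the broadband code."""
    M, K = np.fft.fft(measured), np.fft.fft(kernel)
    return np.real(np.fft.ifft(np.conj(K) * M / (np.abs(K) ** 2 + 1.0 / snr)))

# Toy multipath scene: a direct bounce plus a weaker second return.
n = 63
code = np.where(np.random.default_rng(1).random(n) > 0.5, 1.0, -1.0)
kernel = np.real(np.fft.ifft(np.abs(np.fft.fft(code)) ** 2))  # code autocorr
env = np.zeros(n); env[5] = 1.0; env[20] = 0.4                # two depths
measured = np.real(np.fft.ifft(np.fft.fft(kernel) * np.fft.fft(env)))

print(np.argsort(wiener_deconvolve(measured, kernel))[-2:])   # bins 20 and 5
```
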
  • Publication number: 20170248532
    Abstract: For each X-ray path through a tissue, numerous trials are conducted. In each trial, X-ray photons are emitted along the path until a Geiger-mode avalanche photodiode “clicks”. A temporal average—i.e., the average amount of time elapsed before a “click” occurs—is calculated. This temporal average is, in turn, used to estimate a causal intensity of X-ray light that passes through the tissue along the path and reaches the diode. Based on the causal intensities for multiple paths, a computer generates computed tomography (CT) images or 2D digital radiographic images. The causal intensities used to create the images are estimated from temporal statistics, and not from conventional measurements of intensity at a pixel. X-ray dosage needed for imaging is dramatically reduced as follows: a “click” of the photodiode triggers negative feedback that causes the system to halt irradiation of the tissue along a path, until the next trial begins.
    Type: Application
    Filed: October 29, 2015
    Publication date: August 31, 2017
    Inventors: Achuta Kadambi, Ramesh Raskar, Rajiv Gupta, Adam Pan
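
The statistical core is compact: with Poisson photon arrivals, the waiting time until the first "click" is exponential with rate proportional to the transmitted intensity, so averaging the waiting times over many trials estimates the reciprocal of that intensity. A minimal sketch (the rate and trial count are assumed values):

```python
import numpy as np

rng = np.random.default_rng(0)
true_intensity = 2.5e3  # clicks per second along one path (assumed)

# Each trial: irradiate the path until the diode clicks, record the time.
waiting_times = rng.exponential(1.0 / true_intensity, size=10_000)

# Temporal average -> causal intensity estimate for this path.
estimated_intensity = 1.0 / waiting_times.mean()
print(estimated_intensity)  # ~2.5e3

# Repeating per path yields the intensities from which the CT or 2D
# radiographic image is computed; the click itself halts irradiation
# along the path until the next trial, reducing dose.
```
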
  • Publication number: 20170234985
    Abstract: In some implementations, scene depth is extracted from the dual frequency of a cross-correlation signal. A camera may illuminate a scene with amplitude-modulated light, sweeping the modulation frequency. For each modulation frequency in the sweep, each camera pixel may measure a cross-correlation of incident light and of a reference electrical signal. Each pixel may output a vector of cross-correlation measurements acquired by the pixel during a sweep. A computer may perform an FFT on this vector, identify a dual frequency at the second-largest peak in the resulting power spectrum, and calculate scene depth as the speed of light times this dual frequency, divided by four times pi. In some cases, the two signals being cross-correlated have the same phase as each other during each cross-correlation measurement.
    Type: Application
    Filed: February 13, 2017
    Publication date: August 17, 2017
    Inventors: Achuta Kadambi, James Schiel, Ayush Bhandari, Ramesh Raskar, Vage Taamazyan
  • Publication number: 20170212059
    Abstract: An imaging system images near-field objects with focused microwave or terahertz radiation. Multiple antennas emit microwave or terahertz radiation, such that the radiation varies in frequency over time, illuminates a near-field object, reflects from the near-field object, and travels to a passive aperture. For example, the passive aperture may comprise a dielectric lens or a parabolic reflector. The passive aperture focuses, onto a spatial region, the microwave or terahertz radiation that reflected from the near-field object. One or more antennas take measurements, in the spatial region, of the microwave or terahertz radiation that reflected from the near-field object. A computer calculates, based on the measurements, an image of the near-field object and depth information regarding the near-field object.
    Type: Application
    Filed: September 15, 2016
    Publication date: July 27, 2017
    Inventors: Gregory Charvat, Andrew Temme, Micha Feigin-Almon, Ramesh Raskar, Hisham Bedri
  • Patent number: 9662014
    Abstract: A retinal imaging device includes a camera, a light source, a projector, an I/O device and a computer. The projector emits two sets of light rays, such that one set of rays lies on the exterior surface of a first cone and the other set lies on the exterior surface of a second cone. The user adjusts the position of his or her eye relative to the camera until the rays form a full, undistorted target image on the retina. This full, undistorted image is seen only when the pupil of the eye is positioned in the intersection of the first and second cones, and the eye is thus aligned with the camera. The user provides input, via the I/O device, that the user is seeing this image. The computer then instructs the camera to capture retinal images and the light source to simultaneously illuminate the retina.
    Type: Grant
    Filed: April 14, 2016
    Date of Patent: May 30, 2017
    Assignee: Massachusetts Institute of Technology
    Inventors: Tristan Swedish, Karin Roesch, Ramesh Raskar