Patents by Inventor Taeg Sang Cho

Taeg Sang Cho is named as an inventor on the following patents and patent applications. This listing includes applications that are still pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8451338
    Abstract: Object motion during camera exposure often leads to noticeable blurring artifacts. Proper elimination of this blur is challenging because the blur kernel is unknown, varies over the image as a function of object velocity, and destroys high frequencies. In the case of motions along a 1D direction (e.g. horizontal), applicants show that these challenges can be addressed using a camera that moves during the exposure. Through the analysis of motion blur as space-time integration, applicants show that a parabolic integration (corresponding to constant sensor acceleration) leads to motion blur that is not only invariant to object velocity, but preserves image frequency content nearly optimally. That is, static objects are degraded relative to their image from a static camera, but all moving objects within a given range of motions reconstruct well.
    Type: Grant
    Filed: March 28, 2008
    Date of Patent: May 28, 2013
    Assignee: Massachusetts Institute of Technology
    Inventors: Anat Levin, Peter Sand, Taeg Sang Cho, Fredo Durand, William T. Freeman
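    The velocity-invariance claim in this abstract follows from a simple identity: if the sensor follows a parabola x(t) = a·t², then an object moving at constant velocity v traces the relative path r(t) = v·t − a·t² = −a·(t − v/2a)² + v²/4a, i.e. a time- and space-shifted copy of the v = 0 parabola. The blur kernel (time spent per unit image position) therefore has the same shape for every in-range velocity; only its center shifts. The sketch below is an illustrative numerical reconstruction of that idea, not the patented implementation; the parameter values (a = 1, exposure [−1, 1]) are arbitrary choices for the demonstration.

    ```python
    import numpy as np

    def blur_kernel(v, a=1.0, T=1.0, n=200_001):
        """Blur kernel for an object with velocity v under parabolic
        sensor motion x(t) = a*t**2, estimated as a histogram of the
        relative position r(t) = v*t - a*t**2 over the exposure [-T, T]."""
        t = np.linspace(-T, T, n)
        r = v * t - a * t**2
        # density=True normalizes the histogram to integrate to 1.
        hist, edges = np.histogram(r, bins=500, range=(-2.0, 0.5), density=True)
        return hist, edges

    # Kernels for a static object and one moving at v = 0.4.
    k0, edges = blur_kernel(0.0)
    k1, _ = blur_kernel(0.4)

    # The v = 0.4 trajectory is the v = 0 trajectory shifted by
    # v**2/(4a) = 0.04 in space, i.e. exactly 8 bins of width 0.005.
    # Away from the exposure boundaries, the two kernels agree once
    # that shift is undone -- the blur is invariant to object velocity.
    interior0 = k0[300:390]        # positions s in [-0.5, -0.05]
    interior1 = k1[308:398]        # same positions shifted by +0.04
    ```

    Because the kernel shape is known in advance and shared by all in-range velocities, a single non-blind deconvolution can restore every moving object, which is the practical payoff the abstract describes.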
  • Patent number: 8340463
    Abstract: Systems and methods for performing image editing operations may divide an input image into overlapping patches and assign those patches to locations in a reconstructed output image such that visual artifacts are minimized. The methods may use belief propagation to compute a joint probability for the assignment of active patch labels to output image nodes. The computation may include an exclusivity term, steering the solution such that each patch is preferably only used once in the output image. The methods may include a pre-computation of a pruned list of candidate patches for placing next to each patch in the output image, dependent on local evidence (e.g., color, intensity, or user-driven placement) for each patch. The pre-computation may include determining groupings of patches, each forming a highly compatible loop of neighboring patches for a given candidate patch. The methods may be implemented as program instructions executable by a CPU and/or GPU.
    Type: Grant
    Filed: November 26, 2008
    Date of Patent: December 25, 2012
    Assignee: Adobe Systems Incorporated
    Inventors: Taeg Sang Cho, Shmuel Avidan, William T. Freeman
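    The "local evidence" and pruned candidate lists in this abstract rest on a pairwise compatibility measure between overlapping patches. A minimal sketch of that piece, assuming a simple sum-of-squared-differences cost on the overlap strip (an illustrative stand-in, not the patented formulation; function names are my own):

    ```python
    import numpy as np

    def extract_patches(img, size=8, overlap=2):
        """Divide an image into overlapping size x size patches."""
        step = size - overlap
        patches = []
        for y in range(0, img.shape[0] - size + 1, step):
            for x in range(0, img.shape[1] - size + 1, step):
                patches.append(img[y:y + size, x:x + size])
        return patches

    def right_compatibility(p, q, overlap=2):
        """SSD between p's right overlap strip and q's left strip;
        lower cost means q fits better immediately to the right of p."""
        return float(np.sum((p[:, -overlap:] - q[:, :overlap]) ** 2))

    def pruned_candidates(patches, K=2, overlap=2):
        """For each patch, the K best right-neighbor candidates.
        The diagonal is excluded, echoing the exclusivity idea that a
        patch should not be placed next to a copy of itself."""
        costs = np.array([[right_compatibility(p, q, overlap)
                           for q in patches] for p in patches])
        np.fill_diagonal(costs, np.inf)
        return np.argsort(costs, axis=1)[:, :K]
    ```

    In the patented system these pruned lists feed a belief-propagation solver that assigns patch labels to output-image nodes; precomputing the short candidate lists is what keeps that message passing tractable.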
  • Patent number: 8233739
    Abstract: Systems and methods for performing image editing operations may use patch transforms and inverse patch transforms to reconstruct output images from input images and to refine them using patch jittering such that visual artifacts are repaired. The methods may include generating one or more jittered versions of patch(es) initially assigned to nodes of the output image and using them as candidate patches for a refined image. Jittered versions of patches may be shifted by a small number of pixels in one or more directions. The number of jittered versions and amount of jittering exhibited by each may be configurable (e.g., programmatically or by a user) and/or may be dependent on the amount of overlap between the patches. Belief propagation may be used to replace patches in the output image with jittered versions in the refined image. The methods may be implemented as program instructions executable on a CPU and/or GPU.
    Type: Grant
    Filed: November 26, 2008
    Date of Patent: July 31, 2012
    Assignee: Adobe Systems Incorporated
    Inventors: Taeg Sang Cho, Shmuel Avidan, William T. Freeman
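    The jittering step this abstract describes can be sketched in a few lines: generate candidate patches shifted by a small number of pixels around an initial placement, then keep the one whose overlap seam with an already-placed neighbor is cheapest. This is an illustrative reconstruction under a plain SSD seam cost (the patent uses belief propagation to make the replacement decision); the function names are my own.

    ```python
    import numpy as np

    def jittered_versions(img, y, x, size=8, max_shift=2):
        """All patches within +/- max_shift pixels of (y, x), clipped to
        the image bounds. Returns (position, patch) pairs."""
        out = []
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                yy = min(max(y + dy, 0), img.shape[0] - size)
                xx = min(max(x + dx, 0), img.shape[1] - size)
                out.append(((yy, xx), img[yy:yy + size, xx:xx + size]))
        return out

    def best_jitter(left_patch, candidates, overlap=2):
        """Pick the jittered candidate whose left strip best matches the
        right strip of the patch already placed to its left."""
        costs = [float(np.sum((left_patch[:, -overlap:] - p[:, :overlap]) ** 2))
                 for _, p in candidates]
        return candidates[int(np.argmin(costs))]
    ```

    The number of jittered versions and the shift amount correspond to the configurable parameters the abstract mentions (`max_shift` here); larger overlaps tolerate larger jitters before seams become visible.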
  • Publication number: 20090244300
    Abstract: Object motion during camera exposure often leads to noticeable blurring artifacts. Proper elimination of this blur is challenging because the blur kernel is unknown, varies over the image as a function of object velocity, and destroys high frequencies. In the case of motions along a 1D direction (e.g. horizontal), applicants show that these challenges can be addressed using a camera that moves during the exposure. Through the analysis of motion blur as space-time integration, applicants show that a parabolic integration (corresponding to constant sensor acceleration) leads to motion blur that is not only invariant to object velocity, but preserves image frequency content nearly optimally. That is, static objects are degraded relative to their image from a static camera, but all moving objects within a given range of motions reconstruct well.
    Type: Application
    Filed: March 28, 2008
    Publication date: October 1, 2009
    Applicant: Massachusetts Institute of Technology
    Inventors: Anat Levin, Peter Sand, Taeg Sang Cho, Fredo Durand, William T. Freeman