ADAPTIVE FILTER DEMOSAICIZING FOR SUPER RESOLUTION

An apparatus and method for demosaicing sampled color values are provided. The method includes storing image information for a plurality of images, selecting an intended viewpoint that is within a composite of image information across the plurality of images, and reconstructing color information for at least one color using image information from the composite of image information across the plurality of images that corresponds to the intended viewpoint.

Description
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application claims the benefit under 35 U.S.C. §119(e) of a U.S. Provisional application filed on Jun. 12, 2014 in the U.S. Patent and Trademark Office and assigned Ser. No. 62/011,311, the entire disclosure of which is hereby incorporated by reference.

TECHNICAL FIELD

The present disclosure relates to an apparatus and method for processing an image. More particularly, the present disclosure relates to an apparatus and method for demosaicing an image using sampled color values of a plurality of images.

BACKGROUND

Mobile terminals are developed to provide wireless communication between users. As technology has advanced, mobile terminals now provide many additional features beyond simple telephone conversation. For example, mobile terminals are now able to provide image and video capture. As a result of the ubiquity of mobile terminals, image and video capture has become increasingly popular. Consequently, various image processing techniques are used to provide a user with an accurate representation of the image intended to be captured.

In order to provide a more accurate representation of the color of the image intended to be captured, a technique commonly referred to as demosaicing may be performed. Demosaicing refers to the digital imaging process used to reconstruct a full color image from color samples output from an image detector.

Accordingly, there is a need for an apparatus and method for providing an improved representation of the image intended to be captured. Further, there is a need for an apparatus and method for demosaicing color values sampled during image capture.

The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.

SUMMARY

Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide an apparatus and method for demosaicing sampled color values.

In accordance with an aspect of the present disclosure, a method for demosaicing sampled color values is provided. The method includes storing image information for a plurality of images, selecting an intended viewpoint that is within a composite of image information across the plurality of images, and reconstructing color information for at least one color using image information from the composite of image information across the plurality of images that corresponds to the intended viewpoint.

In accordance with another aspect of the present disclosure, an apparatus for demosaicing sampled color values is provided. The apparatus includes a storage unit, and at least one processor configured to store image information for a plurality of images, to select an intended viewpoint that is within a composite of image information across the plurality of images, and to reconstruct color information for at least one color using image information from the composite of image information across the plurality of images that corresponds to the intended viewpoint.

In accordance with another aspect of the present disclosure, a method for demosaicing sampled color values is provided. The method includes storing image information for an intended image, determining the subpixels of the intended image for which color information of a particular color is captured, generating an adaptive filter to demosaic the image information for the intended image according to the subpixels of the intended image for which color information of a particular color is captured, and demosaicing the image information for the intended image using the adaptive filter.

In accordance with another aspect of the present disclosure, an apparatus for demosaicing sampled color values is provided. The apparatus includes a storage unit, and at least one processor configured to store image information for an intended image, to determine subpixels of the intended image for which color information of a particular color is captured, to generate an adaptive filter to demosaic the image information for the intended image according to the subpixels of the intended image for which color information of a particular color is captured, and to demosaic the image information for the intended image using the adaptive filter.

Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the disclosure.

BRIEF DESCRIPTION OF THE DRAWINGS

The above and other aspects, features, and advantages of various embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:

FIG. 1 illustrates an array of photosites of a photo detector according to the related art;

FIG. 2A illustrates an array of photosites of a photo detector according to the related art;

FIG. 2B illustrates red color values sampled at a plurality of photosites using an image detector having an array of photosites of a photo detector such as, for example, the array of photosites illustrated in FIG. 2A according to the related art;

FIG. 3 illustrates a plurality of images according to an embodiment of the present disclosure;

FIGS. 4A, 4B, 4C, and 4D illustrate a plurality of subpixels according to an embodiment of the present disclosure;

FIG. 5 illustrates a flowchart of a method of determining color values according to an embodiment of the present disclosure;

FIG. 6 illustrates a flowchart of a method of determining color values according to an embodiment of the present disclosure;

FIG. 7A illustrates a flowchart of a method of demosaicing according to an embodiment of the present disclosure;

FIG. 7B illustrates a plurality of subpixels according to an embodiment of the present disclosure;

FIG. 7C illustrates a filter according to an embodiment of the present disclosure;

FIG. 7D illustrates a plurality of subpixels according to an embodiment of the present disclosure;

FIG. 7E illustrates a filter according to an embodiment of the present disclosure;

FIG. 8 illustrates pseudo code for a method of demosaicing according to an embodiment of the present disclosure; and

FIG. 9 illustrates a block diagram schematically illustrating a configuration of an electronic device according to an embodiment of the present disclosure.

Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.

DETAILED DESCRIPTION

The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the disclosure. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.

The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure are provided for illustration purpose only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.

It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.

By the term “substantially” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.

According to various embodiments of the present disclosure, an electronic device may include communication functionality. For example, an electronic device may be a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook PC, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an mp3 player, a mobile medical device, a camera, a wearable device (e.g., a Head-Mounted Device (HMD), electronic clothes, electronic braces, an electronic necklace, an electronic appcessory, an electronic tattoo, or a smart watch), and/or the like.

According to various embodiments of the present disclosure, an electronic device may be a smart home appliance with communication functionality. A smart home appliance may be, for example, a television, a Digital Video Disk (DVD) player, an audio, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washer, a dryer, an air purifier, a set-top box, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a gaming console, an electronic dictionary, an electronic key, a camcorder, an electronic picture frame, and/or the like.

According to various embodiments of the present disclosure, an electronic device may be a medical device (e.g., Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, Computed Tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), an automotive infotainment device, a naval electronic device (e.g., naval navigation device, gyroscope, or compass), an avionic electronic device, a security device, an industrial or consumer robot, and/or the like.

According to various embodiments of the present disclosure, an electronic device may be furniture, part of a building/structure, an electronic board, electronic signature receiving device, a projector, various measuring devices (e.g., water, electricity, gas or electro-magnetic wave measuring devices), and/or the like that include communication functionality.

According to various embodiments of the present disclosure, an electronic device may be any combination of the foregoing devices. In addition, it will be apparent to one having ordinary skill in the art that an electronic device according to various embodiments of the present disclosure is not limited to the foregoing devices.

Various embodiments of the present disclosure include an apparatus and method for demosaicing sampled color values. In addition, various embodiments of the present disclosure include an apparatus and method for determining color values.

An image is represented by a number of areas called pixels. Each pixel is associated with a color that should be substantially reproduced by a set of subpixels in a display. According to the related art, each subpixel displays a primary color. For example, each subpixel according to the related art is associated with some hue and saturation. Other colors may be obtained by mixing primary colors. Each pixel is mapped into a set of one or more subpixels which are to display the color of the pixel.

In some displays, each repeating set of subpixels includes a subpixel for each primary color. The subpixels are small and are spaced closely together to provide a desired resolution. This structure is not cost-effective, however, because it does not match the resolution of human vision: humans are more perceptive to luminance differences than to chromatic differences. Therefore, some displays map an input pixel into a subpixel repeating set that does not include only the subpixels of each primary color. The chromatic resolution is reduced, but the luminance resolution remains high. One such display may be an RGBW display.

An image detector has a plurality of photo detectors used to sample an image. Each of the plurality of photo detectors may sample (e.g., capture) a value for a single color. For example, each of the plurality of photo detectors may be configured with a color filter. According to the related art, a Color Filter Array (CFA) or a Color Filter Mosaic (CFM) is an array or mosaic of color filters disposed above the plurality of photo detectors.

Each of the plurality of photo detectors may be located at a photosite of the image detector. The photosite refers to the spatial location at which a color may be sampled by a photo detector. The array or mosaic of color filters may be disposed above the plurality of photo detectors such that each photo detector has a single corresponding color filter. Accordingly, each photosite may have a corresponding sampled value for a single color.

Each of the photosites may be mapped or otherwise correspond to a subpixel of the image (e.g., when displayed on a display). Accordingly, each subpixel of the image may have a corresponding sampled value for a single color. Because each subpixel of the image does not have sampled values for all colors, an image represented by the sampled color values at each subpixel may appear pixelated with a disjointed color representation of the intended image. In other words, the image represented by the sampled color values at each subpixel may be an inaccurate representation of the image intended to be captured.

In order to provide a more accurate representation of the color of the image intended to be captured, a technique commonly referred to as demosaicing may be performed. Demosaicing refers to the digital imaging process used to reconstruct a full color image from sampled color values output from an image detector. Because each photo detector has a spatial footprint in the image detector, the image detector is unable to capture a color value for every color at each respective photosite thereof. The reconstruction of the full color image using color samples output from the respective photo detectors constituting the image detector may use interpolation or other numerical methods to determine values for each color (e.g., red, green, and blue) at each subpixel.

FIG. 1 illustrates an array of photosites of a photo detector according to the related art.

Referring to FIG. 1, an array of photosites 100 comprising red, green, blue, and white photosites is illustrated. As an example, the red, green, blue, and white photosites may be created by placing an RGBW filter over the photo detectors respectively corresponding to the photosites.

The array of photosites 100 may correspond to a plurality of pixels. For example, as illustrated in FIG. 1, the plurality of pixels are distinguished from one another by a solid line. Each of the plurality of pixels may include a plurality of subpixels. For example, each of the plurality of pixels may include four subpixels. The plurality of subpixels in each of the plurality of pixels are distinguished from one another by a dotted line in FIG. 1.

If an image detector comprises an RGBW filter, then the plurality of subpixels constituting a pixel corresponding to the respective groups of photosites of the image detector may include a white subpixel, a red subpixel, a blue subpixel, and a green subpixel. As illustrated in FIG. 1, a white subpixel is denoted by ‘W,’ a red subpixel is denoted by ‘R,’ a blue subpixel is denoted by ‘B,’ and a green subpixel is denoted by ‘G.’
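As a non-limiting illustration, the repeating RGBW pattern of FIG. 1 can be sketched in Python. The exact spatial order of the W, R, B, and G subpixels within each 2x2 pixel is an assumption made here for illustration only; the figure does not fix a single layout.

```python
import numpy as np

# Assumed 2x2 RGBW tile: W and R in the top row, B and G in the
# bottom row. Each pixel thus contains one subpixel of each color.
TILE = np.array([["W", "R"],
                 ["B", "G"]])

def cfa_mask(rows, cols):
    """Tile the 2x2 pattern to cover a rows x cols grid of subpixels."""
    reps = (rows // 2 + 1, cols // 2 + 1)
    return np.tile(TILE, reps)[:rows, :cols]

# A 6x6 subpixel grid corresponds to a 3x3 grid of pixels, as in FIG. 1.
mask = cfa_mask(6, 6)
```

Any photosite at an even row and even column then samples white, and the pattern repeats with period two in both directions.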

The white subpixels may respectively sample a white color value in response to a request to capture an image. The white subpixels may correspond to a subpixel 105a, a subpixel 110a, a subpixel 115a, a subpixel 120a, a subpixel 125a, a subpixel 130a, a subpixel 135a, a subpixel 140a, and a subpixel 145a.

The red subpixels may respectively sample a red color value in response to a request to capture an image. The red subpixels may correspond to a subpixel 105b, a subpixel 110b, a subpixel 115b, a subpixel 120b, a subpixel 125b, a subpixel 130b, a subpixel 135b, a subpixel 140b, and a subpixel 145b.

The blue subpixels may respectively sample a blue color value in response to a request to capture an image. The blue subpixels may correspond to a subpixel 105c, a subpixel 110c, a subpixel 115c, a subpixel 120c, a subpixel 125c, a subpixel 130c, a subpixel 135c, a subpixel 140c, and a subpixel 145c.

The green subpixels may respectively sample a green color value in response to a request to capture an image. The green subpixels may correspond to a subpixel 105d, a subpixel 110d, a subpixel 115d, a subpixel 120d, a subpixel 125d, a subpixel 130d, a subpixel 135d, a subpixel 140d, and a subpixel 145d.

The image detector may have an integer number of rows of pixels, and an integer number of columns of pixels. The number of rows of pixels and the number of columns of pixels may be equal or may be different.

FIG. 2A illustrates an array of photosites of a photo detector according to the related art.

Referring to FIG. 2A, an array of photosites 200 comprising red, green and blue photosites is illustrated. As an example, the red, green, and blue photosites may be created by placing a filter (e.g., an RGB filter) over the photo detectors respectively corresponding to the photosites.

Similar to the array of photosites 100 illustrated in FIG. 1, the array of photosites 200 may correspond to a plurality of pixels. For example, as illustrated in FIG. 2A, the plurality of pixels are distinguished from one another by a solid line. Each of the plurality of pixels may include a plurality of subpixels. For example, each of the plurality of pixels may include four subpixels. The plurality of subpixels in each of the plurality of pixels are distinguished from one another by a dotted line in FIG. 2A.

If an image detector comprises an RGB filter, then the plurality of subpixels corresponding to the respective groups of photosites of the image detector may include a red subpixel, a blue subpixel, and two green subpixels. A pixel may include two green subpixels because, although the RGB pixel includes only three colors, there are certain benefits associated with designing an array of photo detectors, and thus photosites (respectively corresponding to subpixels of the display), in groups whose sizes are powers of two. The color green is selected as the color to be repeated in the pixel because the human eye has been determined to perceive luminance best through the color green.

As illustrated in FIG. 2A, a red subpixel is denoted by ‘R,’ a blue subpixel is denoted by ‘B,’ and a green subpixel is denoted by ‘G.’

The red photosites corresponding to red subpixels may respectively sample a red color value in response to a request to capture an image. The red subpixels may correspond to a subpixel 205a, a subpixel 210a, a subpixel 215a, a subpixel 220a, a subpixel 225a, a subpixel 230a, a subpixel 235a, a subpixel 240a, and a subpixel 245a.

The green photosites corresponding to green subpixels may respectively sample a green color value in response to a request to capture an image. The green subpixels may correspond to a subpixel 205b, a subpixel 205c, a subpixel 210b, a subpixel 210c, a subpixel 215b, a subpixel 215c, a subpixel 220b, a subpixel 220c, a subpixel 225b, a subpixel 225c, a subpixel 230b, a subpixel 230c, a subpixel 235b, a subpixel 235c, a subpixel 240b, a subpixel 240c, a subpixel 245b, and a subpixel 245c.

The blue photosites corresponding to blue subpixels may respectively sample a blue color value in response to a request to capture an image. The blue subpixels may correspond to a subpixel 205d, a subpixel 210d, a subpixel 215d, a subpixel 220d, a subpixel 225d, a subpixel 230d, a subpixel 235d, a subpixel 240d, and a subpixel 245d.
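The sampling pattern of FIG. 2A can be sketched as a Bayer-style mosaic in which each photosite retains only one of the three color channels. Assigning red to the 'a' position, green to the 'b' and 'c' positions, and blue to the 'd' position within each 2x2 pixel is an assumption for illustration; `mosaic` is a hypothetical helper name, not part of the disclosure.

```python
import numpy as np

# Assumed channel layout per 2x2 pixel: channel indices 0 = R, 1 = G, 2 = B.
BAYER = np.array([[0, 1],
                  [1, 2]])

def mosaic(rgb):
    """Sample one color value per photosite from a full H x W x 3 image,
    returning the single-plane mosaic and the per-site channel index."""
    h, w, _ = rgb.shape
    chan = np.tile(BAYER, (h // 2 + 1, w // 2 + 1))[:h, :w]
    rows, cols = np.indices((h, w))
    return rgb[rows, cols, chan], chan
```

The returned mosaic is exactly the kind of single-value-per-subpixel data that the demosaicing methods discussed below must reconstruct into a full color image.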

FIG. 2B illustrates red color values sampled at a plurality of photosites using an image detector having an array of photosites of a photo detector such as, for example, the array of photosites illustrated in FIG. 2A according to the related art.

Referring to FIG. 2B, red color values are sampled at the plurality of subpixels corresponding to the plurality of photosites of the array of photosites 200 at which a red color filter is overlaid with the corresponding photo detector. For example, if an image is captured using a photo detector corresponding to the array of pixels illustrated in FIG. 2A, then a red color value may be sampled at photosites respectively corresponding to a subpixel 205a, a subpixel 210a, a subpixel 215a, a subpixel 220a, a subpixel 225a, a subpixel 230a, a subpixel 235a, a subpixel 240a, and a subpixel 245a.

As illustrated in FIG. 2B, a red color value is sampled at only one subpixel for each pixel. Therefore, as discussed above, in order to provide a more accurate color representation of the image intended to be captured, the red color values at the subpixels for which a red color value is not sampled may be estimated. For example, the red color values at the subpixels for which a red color value is not sampled may be estimated using nearby sampled color values. For instance, the red color value of the subpixel 205d may be estimated using the red color values sampled at a subpixel 205a, a subpixel 210a, a subpixel 215a, a subpixel 220a, and/or the like. As discussed above, such a process may be referred to as demosaicing.

Conversely, because the subpixel 205a has only a sampled red color value associated therewith, a blue color value and a green color value may be estimated at the subpixel 205a using sampled blue color values and sampled green color values, respectively, at surrounding subpixels.

According to the related art, demosaicing methods have included, for example, pixel replication, bilinear interpolation and median interpolation. In pixel replication, each missing value is taken from the neighbor to the left, above, or diagonally above and left, whichever is nearest. Bilinear interpolation offers some improvement over pixel replication with a moderate increase in complexity. In the bilinear interpolation method, each missing value is calculated based on an average of the neighboring pixel values, horizontally, vertically and/or diagonally. Median interpolation, which is a nonlinear interpolation method, offers the best results among these three algorithms (pixel replication, bilinear and median), especially when there are defective pixels, but has the maximum complexity. Median interpolation has two steps. First, missing values having four diagonal neighbors are interpolated using the median of those four values. Second, the remaining missing pixels are interpolated by the median of north, south, east, and west neighbors.
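The neighbor-averaging step common to these related-art methods can be sketched as a single-plane fill in which each missing value becomes the average of its known horizontal, vertical, and diagonal neighbors. This is an illustrative simplification of bilinear interpolation, not the claimed method; a full demosaicer would run one such pass per color plane.

```python
import numpy as np

def bilinear_fill(values, known):
    """One bilinear-style interpolation pass over a single color plane.

    values: H x W array holding sampled values (entries where known is
            False are ignored as inputs).
    known:  H x W boolean array marking which subpixels were sampled.
    """
    h, w = values.shape
    out = values.astype(float).copy()
    for y in range(h):
        for x in range(w):
            if known[y, x]:
                continue  # keep sampled values untouched
            acc, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if (dy or dx) and 0 <= yy < h and 0 <= xx < w and known[yy, xx]:
                        acc += values[yy, xx]
                        n += 1
            if n:
                out[y, x] = acc / n  # average of available neighbors
    return out
```

Pixel replication would instead copy the nearest known neighbor, and median interpolation would replace the average with the median of the diagonal (then axial) neighbors.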

The demosaicing methods according to the related art are limited to the use of sampled color values from a single image. In other words, the demosaicing methods according to the related art are limited to the use of sampled color values from a single captured image to estimate the color values at subpixels for which a color value has not been sampled in order to reconstruct an accurate color representation of the image intended to be captured.

According to various embodiments of the present disclosure, an adaptive method of demosaicing is provided. The adaptive method of demosaicing may be used when there is a variable amount of information available. For example, an adaptive filter is used to interpolate color values from available surrounding color values. In the simplest case, the adaptive method of demosaicing will produce the same results as demosaicing methods according to the related art. However, if more sampled color values are available (e.g., through a plurality of captured images), then the adaptive method of demosaicing uses the adaptive filter to take advantage of the larger number of sampled color values.
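One way to realize such an adaptive filter is normalized convolution: at each subpixel the filter weights are renormalized over whichever samples happen to be available, so the same filter degrades gracefully to the single-image case and exploits extra samples when present. The sketch below is illustrative only and is not the claimed implementation.

```python
import numpy as np

def adaptive_interpolate(samples, available, weights):
    """Normalized convolution over a single color plane.

    samples:   H x W array of sampled values.
    available: H x W boolean mask of which subpixels have a sample.
    weights:   (2r+1) x (2r+1) filter footprint.
    """
    h, w = samples.shape
    kh, kw = weights.shape
    ry, rx = kh // 2, kw // 2
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for dy in range(-ry, ry + 1):
                for dx in range(-rx, rx + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w and available[yy, xx]:
                        wgt = weights[dy + ry, dx + rx]
                        num += wgt * samples[yy, xx]
                        den += wgt
            # Renormalize by the weight mass of the samples actually present,
            # so the estimate adapts to however many samples are available.
            out[y, x] = num / den if den else 0.0
    return out
```

With a single image the mask matches the CFA pattern and the result reduces to conventional weighted interpolation; with samples fused from several images, more entries of `available` are true and less estimation is needed.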

According to various embodiments of the present disclosure, a plurality of images may be used as a source for sampled color values for at least a subset of subpixels of an intended image. For example, a plurality of images that capture a same viewpoint may comprise different color values sampled at a same spatial location (e.g., respectively corresponding to an applicable subpixel of an image). According to various embodiments of the present disclosure, a particular spatial location of the viewpoint may have more than one sampled color value for a particular color (e.g., a spatial location of the viewpoint may have two sampled red color values each from a different image) and/or may have more than one sampled color value for different colors (e.g., a spatial location of the viewpoint may have a sampled red color value, a sampled blue color value, a sampled green color value, and/or the like).

According to various embodiments of the present disclosure, a color representation of an image (e.g., of a particular viewpoint) may be reconstructed (e.g., demosaiced) using sampled color values from a plurality of images that substantially capture a same viewpoint. For example, the plurality of images may have at least partially overlapping spatial coverage of the viewpoint. The plurality of images that substantially capture the same viewpoint may be contemporaneously captured by at least one image detector. Alternatively, the plurality of images that substantially capture the same viewpoint may be captured by different image detectors (e.g., image detectors disposed in different electronic devices, distinct image detectors disposed in a same electronic device, and/or the like).

According to various embodiments of the present disclosure, an electronic device may be configured to contemporaneously capture one or more images of substantially a same viewpoint. For example, an electronic device may include one or more cameras (e.g., image detectors) each of which captures an image. Each of the one or more cameras may contemporaneously capture substantially a same viewpoint, thereby capturing one or more images of the viewpoint.

According to various embodiments of the present disclosure, the electronic device may include a camera that is configured to contemporaneously capture substantially a same viewpoint. For example, a camera may be configured to capture a viewpoint using a burst image capture feature. The burst image capture feature may correspond to a feature according to which the camera rapidly captures a series of images of substantially a same viewpoint. A user may engage a burst image capture feature by holding down an image capture button on an electronic device. The number of images captured during the burst image capture feature may be related to the length of time that a user holds the image capture button. As another example, a camera may contemporaneously capture substantially a same viewpoint in response to a series of distinct commands to capture an image. The user may press (e.g., tap) an image capture button more than once to capture substantially a same viewpoint. As another example, an electronic device may capture a plurality of images that contemporaneously capture substantially a same viewpoint when a camera application is executed. For example, as the camera is operated to capture an image, the electronic device may display an image to be captured on a display screen (e.g., in a viewfinder). The electronic device may store a series of images corresponding to the image displayed on the display screen in a buffer.

According to various embodiments of the present disclosure, the electronic device may determine whether a plurality of images correspond to a plurality of images of substantially a same viewpoint. For example, the electronic device may perform an analysis to determine whether at least a portion of two or more images of the plurality of images include substantially a same viewpoint (e.g., within a preset and/or configurable threshold statistical relevancy, and/or the like). The electronic device may determine whether two or more of a plurality of images stored on the electronic device correspond to contemporaneous captures using metadata associated with the electronic files of the two or more of the plurality of images (e.g., a date created field, a date modified field, and/or the like).

According to various embodiments of the present disclosure, two or more of the plurality of images that substantially capture the same viewpoint may be spatially offset with each other. As an example, a spatial offset between any two of the plurality of images may be an integer multiple of a pixel and/or a subpixel (e.g., two of the plurality of images may be translated so as to be offset by 8 subpixels, or the like). As another example, a spatial offset between any two of the plurality of images may be a fractional multiple of a pixel and/or a subpixel (e.g., two of the plurality of images may be translated so as to be offset by 2.5 subpixels, or the like).
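For the integer-offset case, mapping samples from an offset image onto the reference subpixel grid can be sketched as follows. The dictionary representation and the `register_offset` name are assumptions for illustration; fractional offsets would additionally require resampling, which is omitted here.

```python
def register_offset(samples, dy, dx, height, width):
    """Place samples from an image offset by (dy, dx) whole subpixels
    onto a height x width reference grid.

    samples: dict mapping (y, x) subpixel coordinates to sampled values.
    Returns only the samples that land inside the reference grid.
    """
    placed = {}
    for (y, x), value in samples.items():
        ry, rx = y + dy, x + dx
        if 0 <= ry < height and 0 <= rx < width:
            placed[(ry, rx)] = value  # samples outside the grid are dropped
    return placed
```

Because the offset shifts which subpixels of the reference grid receive samples, each offset image can contribute color values at positions the reference image left unsampled.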

According to various embodiments of the present disclosure, a spatial offset between any two of the plurality of images may be caused by human error when the electronic device is used to capture the plurality of images. For example, human error may be introduced into the capture of the plurality of images in the form of a shaking or vibration of the electronic device during image capture (e.g., a user is generally unable to hold an electronic device completely so as to be motionless).

According to various embodiments of the present disclosure, a spatial offset may be generated between any two of the plurality of images by introducing a vibration and/or motion of the image detector (e.g., the electronic device) during image capture. For example, the electronic device may include a vibration unit (e.g., a motor, or the like) to introduce motion of the electronic device across the image capture of any two or more of the plurality of images. A generated vibration across the image capture of any two or more of the plurality of images may be beneficial, for example, when the electronic device (e.g., the camera) is positioned in a holding unit (e.g., a tripod) during image capture.

According to various embodiments of the present disclosure, motion of the camera (e.g., the electronic device) during image capture across any two or more of the plurality of images may increase the sampling of any one or more colors for at least a portion of the intended viewpoint (e.g., the viewpoint of the image to be captured). According to various embodiments of the present disclosure, motion may be introduced to the camera (e.g., the electronic device) during image capture to enhance the sampling of color values across the intended viewpoint. For example, the sampling of any single color may be enhanced by a motion of the camera during image capture (e.g., a motion across any two or more of the plurality of images) because a single color may be sampled at a greater number of spatial locations of the viewpoint. The increased sampling of color values for any one or more colors reduces the need to rely on estimated color values, and thus on interpolation (e.g., demosaicing), at the corresponding spatial locations of the viewpoint. A sampled color value of a color at a spatial location of the viewpoint may generally be considered a more accurate representation of the color of the viewpoint at that spatial location than an estimated (e.g., interpolated) color value of the same.

According to various embodiments of the present disclosure, the plurality of images that substantially capture the same viewpoint may be normalized. For example, image processing may be performed on at least a subset of the plurality of images that substantially capture the same viewpoint in order to align the plurality of images. The plurality of images that substantially capture the same viewpoint may be processed to align the plurality of images on a common plane (e.g., the plurality of images is processed to be co-planar). In other words, the plurality of images that substantially capture the same viewpoint may be processed to be co-planar with at least a portion of each of the plurality of images overlapping with an intended viewpoint (e.g., the viewpoint of the image to be captured).

FIG. 3 illustrates a plurality of images according to an embodiment of the present disclosure.

Referring to FIG. 3, a first image 310 may capture a first viewpoint 315, a second image 320 may capture a second viewpoint 325, and a third image 330 may capture a third viewpoint 335.

According to various embodiments of the present disclosure, the first viewpoint 315, the second viewpoint 325, and the third viewpoint 335 may each relate to an intended viewpoint (e.g., the viewpoint of the image to be captured). For example, each of the first viewpoint 315, the second viewpoint 325, and the third viewpoint 335 may at least partially overlap with an intended viewpoint (e.g., the viewpoint of the image to be captured). In other words, the first image 310, the second image 320, and the third image 330 may capture substantially a same viewpoint.

Any two or more of the first image 310, the second image 320, and the third image 330 may be co-planar. The first image 310, the second image 320, and the third image 330 may be normalized so as to be co-planar (e.g., to be co-planar representations of substantially the same viewpoint).

According to various embodiments of the present disclosure, the electronic device may select one of the first image 310, the second image 320, and the third image 330 to use as a reference viewpoint and/or to which the remaining of the first image 310, the second image 320, and the third image 330 are to be normalized so as to be co-planar with the selected one of the first image 310, the second image 320, and the third image 330.

According to various embodiments of the present disclosure, the electronic device may select the one of the first image 310, the second image 320, and the third image 330 according to user input.

According to various embodiments of the present disclosure, the electronic device may select the one of the first image 310, the second image 320, and the third image 330 according to a determination as to which of the first image 310, the second image 320, and the third image 330 was captured in the middle based on a time at which the images were captured. The image captured in the middle based on the time at which the images were captured may be assumed to be a most accurate image capture of the intended viewpoint based on an assumption that the plurality of images are centered around the median.

According to various embodiments of the present disclosure, the electronic device may select the one of the first image 310, the second image 320, and the third image 330 according to a determination as to which of the first image 310, the second image 320, and the third image 330 was captured first based on a time at which the images were captured.

According to various embodiments of the present disclosure, the electronic device may select the one of the first image 310, the second image 320, and the third image 330 according to a determination as to which of the first image 310, the second image 320, and the third image 330 has a greater amount of spatial overlap with the remainder of the first image 310, the second image 320, and the third image 330 (e.g., based on the number of the plurality of images that at least partially overlap with a given image, or based on the amount of aggregate spatial overlap of a given image with the remainder of the images). As illustrated in FIG. 3, the first image 310 has the greatest number of overlapping images. The second image 320 and the third image 330 at least partially spatially overlap with the first image 310. In contrast, only the first image 310 at least partially spatially overlaps with the second image 320, and only the first image 310 at least partially spatially overlaps with the third image 330. Similarly, the first image 310 has the greatest amount of aggregate spatial overlap with the remaining images (e.g., the second image 320 and the third image 330). The aggregate spatial overlap of the first image 310 is the sum of (i) the spatial overlap of the second image 320 with the first image 310, and (ii) the spatial overlap of the third image 330 with the first image 310.
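The overlap-based selection described above can be sketched as follows. This is a minimal illustration, assuming each viewpoint is modeled as an axis-aligned rectangle; the names `overlap_area` and `select_reference`, and the rectangle coordinates, are hypothetical and not taken from the disclosure.

```python
def overlap_area(a, b):
    # Area of intersection of two axis-aligned rectangles (x0, y0, x1, y1).
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0, w) * max(0, h)

def select_reference(rects):
    # Pick the image whose aggregate spatial overlap with the rest is greatest.
    def aggregate(i):
        return sum(overlap_area(rects[i], r)
                   for j, r in enumerate(rects) if j != i)
    return max(range(len(rects)), key=aggregate)

# Viewpoints loosely mirroring FIG. 3: the first image overlaps both others,
# while the second and third images overlap only the first.
first, second, third = (2, 2, 8, 8), (0, 0, 4, 4), (6, 6, 10, 10)
reference = select_reference([first, second, third])  # index 0, the first image
```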

According to various embodiments of the present disclosure, the selected one of the first image 310, the second image 320, and the third image 330 may be used as a spatial reference. The spatial locations corresponding to the subpixels of the selected one of the first image 310, the second image 320, and the third image 330 may be used as a reference for associating sampled color values of one or more colors (or image characteristics) to a spatial location of the intended viewpoint. For example, the spatial locations corresponding to the subpixels of the selected one of the first image 310, the second image 320, and the third image 330 may be used as a reference for forming a composite of the sampled color values of the intended image.

According to various embodiments of the present disclosure, the spatial location of the viewpoints of the non-selected images (e.g., the second image 320 and the third image 330) of the plurality of images may be normalized against the selected one of the plurality of images (e.g., the first image 310). For example, the spatial location of the sample color values of the non-selected images of the plurality of images (e.g., the second image 320 and the third image 330) may be aligned with the spatial location of the sample color values of the selected one of the plurality of images (e.g., the first image 310).

According to various embodiments of the present disclosure, if the subpixels of the non-selected images (e.g., the second image 320 and the third image 330) do not align with the subpixels of the selected one of the plurality of images (e.g., the first image 310) when the spatial location of the viewpoints of the non-selected images (e.g., the second image 320 and the third image 330) are aligned with the viewpoint of the selected one of the plurality of images (e.g., the first image 310), then the electronic device may determine a sampled color value that is to be associated with at least one spatial location of a corresponding at least one subpixel of the selected image. The electronic device may determine the sampled color value that is to be associated with at least one spatial location of a corresponding at least one subpixel of the selected image using at least the sampled color value from a corresponding subpixel of the non-selected image for which subpixels thereof do not align with the subpixels of the selected image. For example, the electronic device may estimate the color value that is to be associated with at least one spatial location of a corresponding at least one subpixel of the selected image using an interpolation (or other numerical method) using sampled color values of the non-selected image for which subpixels thereof do not align with the subpixels of the selected image.
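One common numerical method for the interpolation mentioned above is bilinear interpolation, sketched below. This is an illustrative possibility, not the disclosure's prescribed method; the function name and sample values are hypothetical.

```python
def bilinear(v00, v10, v01, v11, fx, fy):
    # Blend the four surrounding samples of a non-aligned image according to
    # the fractional offset (fx, fy) of the selected subpixel between them.
    top = v00 * (1 - fx) + v10 * fx
    bottom = v01 * (1 - fx) + v11 * fx
    return top * (1 - fy) + bottom * fy

# A selected subpixel falling a quarter of the way between four samples of a
# non-aligned image (the sample values are illustrative):
value = bilinear(100, 120, 110, 130, 0.25, 0.25)  # 107.5
```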

According to various embodiments of the present disclosure, if the subpixels of the non-selected images (e.g., the second image 320 and the third image 330) do not align with the subpixels of the selected one of the plurality of images (e.g., the first image 310) when the spatial location of the viewpoints of the non-selected images (e.g., the second image 320 and the third image 330) are aligned with the viewpoint of the selected one of the plurality of images (e.g., the first image 310), then the electronic device may align the non-selected images with the selected image according to the nearest subpixel alignment (e.g., such that the spatial location of the viewpoint may be slightly unaligned).

FIGS. 4A, 4B, 4C, and 4D illustrate a plurality of subpixels according to an embodiment of the present disclosure.

Referring to FIGS. 4A, 4B, 4C, and 4D, sampling of red color values from different images is described in an example for which the subpixels of the respective images are assumed to be aligned with one another. Further, FIGS. 4A, 4B, and 4C illustrate sampled red color values from a plurality of images that are at least partially overlapped with one another such as, for example, the first image 310, the second image 320, and the third image 330 illustrated in FIG. 3. The intended viewpoint (e.g., spatial coverage of the image) is assumed to be the viewpoint of the first image 310 illustrated in FIG. 3.

Referring to FIG. 4A, an array of subpixels 400 is illustrated. In particular, the subpixels for which red is sampled in the capture of the first image 310 are illustrated. During capture of the first image 310, red is sampled at a subpixel 405a, a subpixel 410a, a subpixel 415a, a subpixel 420a, a subpixel 425a, a subpixel 430a, a subpixel 435a, a subpixel 440a, and a subpixel 445a.

Referring to FIG. 4B, an array of subpixels 400 is illustrated. In particular, the subpixels of the second image 320 that overlap with the intended viewpoint and for which red is sampled in the capture of the second image 320 are illustrated. During capture of the second image 320, red is sampled at a subpixel 425b, a subpixel 430b, a subpixel 440b, and a subpixel 445b.

Referring to FIG. 4C, an array of subpixels 400 is illustrated. In particular, the subpixels of the third image 330 that overlap with the intended viewpoint and for which red is sampled in the capture of the third image 330 are illustrated. During capture of the third image 330, red is sampled at a subpixel 420c, a subpixel 425c, a subpixel 435c, and a subpixel 440c.

Referring to FIG. 4D, an array of subpixels 400 corresponding to the intended image or viewpoint is illustrated. According to various embodiments of the present disclosure, a mapping of sampled color values for at least one color to the intended image or viewpoint is generated. According to various embodiments of the present disclosure, an image may be demosaiced using a composite of sampling color values from a plurality of images.

According to the related art, an image or viewpoint corresponding to the array of subpixels 400 would be demosaiced using the array of subpixels 400 illustrated in FIG. 4A. In other words, the image or viewpoint corresponding to the array of subpixels 400 would be demosaiced using sampled color values from a single captured image. In contrast, according to various embodiments of the present disclosure, the intended image or viewpoint corresponding to the array of subpixels 400 may be demosaiced (e.g., the color may be reconstructed) taking into account the sampled color values at applicable subpixels for a plurality of images that substantially capture a same viewpoint. As illustrated in FIG. 4D, an intended image or viewpoint may be reconstructed using the composite sampled color values from the first image 310, the second image 320, and the third image 330 that overlap with the intended image or viewpoint. The composite sampled red color values for the array of subpixels 400 include 17 sampled red color values. In contrast, as illustrated in FIG. 4A, the sampled red color values used to demosaic the array of subpixels 400 according to the related art would have merely 9 sampled red color values.

The composite sampled red color values for the array of subpixels 400 comprises the sampled red color values from the first image 310, the second image 320, and the third image 330 respectively at a subpixel 405a, a subpixel 410a, a subpixel 415a, a subpixel 420a, a subpixel 420c, a subpixel 425a, a subpixel 425b, a subpixel 425c, a subpixel 430a, a subpixel 430b, a subpixel 435a, a subpixel 435c, a subpixel 440a, a subpixel 440b, a subpixel 440c, a subpixel 445a, and a subpixel 445b.
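The composite described above can be sketched as a mapping from subpixel location to every sample taken there across the aligned images. The subpixel coordinates and color values below are hypothetical placeholders, not the values of FIGS. 4A through 4D.

```python
# Hypothetical sampled red values keyed by subpixel (row, col) for three
# aligned images; a location sampled in more than one image collects
# more than one value in the composite.
image1 = {(0, 0): 120, (0, 2): 130, (2, 0): 110}
image2 = {(0, 2): 134, (2, 2): 140}
image3 = {(2, 0): 112, (2, 2): 138}

composite = {}
for samples in (image1, image2, image3):
    for loc, value in samples.items():
        composite.setdefault(loc, []).append(value)

# (0, 0) carries one sample; (0, 2), (2, 0), and (2, 2) each carry two.
```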

Comparing the array of subpixels 400 illustrated in FIG. 4D to the array of subpixels 400 illustrated in FIG. 4A, the array of subpixels 400 illustrated in FIG. 4B, or the array of subpixels 400 illustrated in FIG. 4C, the composite of sampled red color values of FIG. 4D results in less of a need to estimate red color values. The composite of sampled red color values illustrated in FIG. 4D reduces the number of subpixels for which a red color value may require estimation (e.g., the subpixels which do not have a sampled red color value from the portions of the first image 310, the second image 320, and the third image 330 that overlap with the intended image or viewpoint).

According to various embodiments of the present disclosure, if two or more images of the plurality of images that substantially capture a same viewpoint have a sampled color value for a given color at a subpixel of the intended image or viewpoint, then the color value associated with the particular subpixel of the intended image or viewpoint for the composite sampled color values may be an average of the various sampled color values for the particular subpixel from the two or more images. According to various embodiments of the present disclosure, the average of the various sampled color values for the particular subpixel from the two or more images may be a weighted average. For example, the average of the various sampled color values may be weighted according to an extent to which an image associated with a respective sampled color value is processed to normalize or otherwise align the image with a reference plane or image. As another example, the average of the various sampled color values may be weighted according to the extent to which the subpixels of the corresponding image align with the reference image when the spatial locations of the image are aligned with the intended image or viewpoint.
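The averaging of multiple samples at one subpixel can be sketched as below. The function name, sample values, and the 0.5 weight are illustrative assumptions; the disclosure only requires that the combination be a (possibly weighted) average.

```python
def fuse_samples(samples, weights=None):
    # Combine the samples taken of one subpixel into a single color value;
    # with no weights this is a plain average, otherwise a weighted one.
    if weights is None:
        weights = [1.0] * len(samples)
    return sum(s * w for s, w in zip(samples, weights)) / sum(weights)

plain = fuse_samples([130, 134])  # 132.0
# Suppose the second image required more warping during normalization, so
# its sample is trusted less (the 0.5 weight is illustrative):
weighted = fuse_samples([130, 134], [1.0, 0.5])  # approximately 131.33
```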

According to various embodiments of the present disclosure, the electronic device may determine the subpixels of the intended image or viewpoint for which a particular color is sampled across the plurality of images that substantially capture the same viewpoint. For example, the electronic device may determine the subpixels of the intended image or viewpoint for which the composite sampled color value of a particular color has a sampled value. The electronic device may use a sum buffer to determine the subpixels of the intended image or viewpoint for which a particular color is sampled across the plurality of images that substantially capture the same viewpoint. The sum buffer may have a field (e.g., a bit) associated with each subpixel of the intended image.

According to various embodiments of the present disclosure, the field associated with a particular subpixel of the intended image may be incremented for each sampled color value of the particular color across the plurality of images that substantially capture the same viewpoint. For example, the field associated with a particular subpixel of the intended image may be incremented so as to reflect the aggregate number of sampled color values for the particular color that exist at the particular subpixel across the plurality of images that substantially capture the same viewpoint.
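The incrementing sum buffer described above can be sketched as a per-subpixel counter. The buffer dimensions and sample locations below are hypothetical.

```python
# Sum buffer for a 4x4 intended image: one counter per subpixel, incremented
# once for each sampled red value across the (hypothetical) aligned images.
rows, cols = 4, 4
sum_buffer = [[0] * cols for _ in range(rows)]

red_samples_per_image = [
    [(0, 0), (0, 2), (2, 0), (2, 2)],   # image 1
    [(0, 2), (2, 2)],                   # image 2
]
for samples in red_samples_per_image:
    for r, c in samples:
        sum_buffer[r][c] += 1

# sum_buffer[0][2] is 2 (sampled in both images), while sum_buffer[1][1]
# is 0, marking a subpixel whose red value must be estimated.
```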

According to various embodiments of the present disclosure, the field associated with a particular subpixel of the intended image may be a binary representation indicating whether any of the plurality of images that substantially capture the same viewpoint include a sampled color value of the particular color for the particular subpixel. For example, the field associated with a particular subpixel of the intended image may be set to one if the composite sampled color values of the particular color (e.g., or mapping thereof) has a sampled color value of the particular color.

According to various embodiments of the present disclosure, the electronic device may determine the subpixels of the intended image or viewpoint for which a particular color value is missing (e.g., the subpixels for which a value may be estimated using surrounding sampled color values of the particular color from, for example, the composite sampled color values of the particular color).

FIG. 5 illustrates a flowchart of a method of determining color values according to an embodiment of the present disclosure.

Referring to FIG. 5, at operation 510, the electronic device captures a plurality of images that substantially capture the same viewpoint. As an alternative, the electronic device may receive one or more of the plurality of images that substantially capture the same viewpoint from another electronic device (e.g., a counterpart electronic device). The electronic device may store the plurality of images (e.g., in a storage unit, a frame buffer, and/or the like).

At operation 520, the electronic device aligns the corresponding images. For example, the electronic device aligns the plurality of images that substantially capture the same viewpoint so as to align the spatial location of the plurality of images. The electronic device may process one or more of the plurality of images to make the plurality of images co-planar before aligning the spatial location of the plurality of images.

At operation 530, the electronic device determines a coverage of each color for subpixels in the intended image. For example, the electronic device may use a sum buffer to determine which of the subpixels of the intended image have a sampled color value of a particular color associated therewith. In other words, the electronic device may use a sum buffer to determine the subpixels of the intended image or viewpoint for which a particular color is sampled across the plurality of images used to reconstruct the color representation of the intended image. Alternatively or additionally, the electronic device may determine which of the subpixels of the intended image do not have a sampled color value of a particular color associated therewith. The electronic device may use a sum buffer to determine the subpixels of the intended image or viewpoint for which a particular color has not been sampled across the plurality of images used to reconstruct the color representation of the intended image.

At operation 540, the electronic device reconstructs a color representation of the intended image. For example, the electronic device may estimate a color value for the subpixels of the intended image or viewpoint for which a particular color has not been sampled across the plurality of images used to reconstruct the color representation of the intended image. The electronic device may estimate the color value of a particular color for such subpixels using at least one sampled color value of the particular color from at least one of the plurality of images. The electronic device may interpolate color values for the subpixels that do not have a sampled color value across the plurality of images using at least one of the sampled color values from at least one of the plurality of images.

According to various embodiments of the present disclosure, the electronic device may allocate a weighting to each of the sampled color values across the plurality of images for the estimation of the color value. For example, the electronic device may weigh the sampled color values for a particular color at subpixels spatially closer to the subpixel for which the color value is being estimated more heavily than the sampled color values for the particular color at subpixels relatively more distant from the subpixel for which the color value is being estimated.

Referring back to FIG. 4D, the subpixel 445d does not have a sampled red color value across the plurality of images used to reconstruct the color representation of the array of subpixels 400. According to various embodiments of the present disclosure, the electronic device may estimate a red color value for the subpixel 445d using at least one or more of the sampled red color values (e.g., the sampled red color values at a subpixel 405a, a subpixel 410a, a subpixel 415a, a subpixel 420a, a subpixel 420c, a subpixel 425a, a subpixel 425b, a subpixel 425c, a subpixel 430a, a subpixel 430b, a subpixel 435a, a subpixel 435c, a subpixel 440a, a subpixel 440b, a subpixel 440c, a subpixel 445a, and a subpixel 445b). As discussed above, the electronic device may weigh the sampled red color values differently in order to estimate the color value at the subpixel 445d. For example, the electronic device may associate a higher weighting to the sampled red color values at the subpixel 445a and the subpixel 445b than to the sampled red color value at the subpixel 405a.
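The distance-based weighting described above can be sketched with an inverse-distance weighted average. This is one possible weighting scheme, not necessarily the one used by the disclosure; the subpixel coordinates and color values are illustrative.

```python
import math

def estimate_missing(target, samples):
    # Inverse-distance weighted estimate of a color at `target` from
    # (location, value) pairs sampled at other subpixels.
    weights = [1.0 / math.dist(target, loc) for loc, _ in samples]
    values = [v for _, v in samples]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Two nearby samples and one distant sample (values are illustrative); the
# estimate stays near the nearby samples because their weights dominate,
# analogous to weighing 445a and 445b more heavily than 405a.
samples = [((7, 6), 100), ((6, 7), 104), ((0, 0), 200)]
estimate = estimate_missing((7, 7), samples)
```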

According to various embodiments of the present disclosure, the electronic device may store a filter used for determining an estimated value at a subpixel that does not have a sampled color value of a particular color across the plurality of images used to reconstruct the color representation of the intended image. The filter may be a filter table. The electronic device may generate coefficients in the filter table. The coefficients may correspond to the respective weighting to be associated with the color values (e.g., the sampled color values) for the subpixels surrounding the subpixel for which a color value is to be estimated. The coefficients may be normalized such that the sum of all the coefficients in the filter table is equal to one. The electronic device may use the filter (e.g., the filter table) to calculate a weighted average of the nearby surrounding sampled color values of the particular color that are present in order to estimate the color value for the particular color.

According to various embodiments of the present disclosure, the electronic device may determine (e.g., estimate) a color value of a particular color for every subpixel of the intended image for which the particular color has not been sampled across the plurality of images used to reconstruct the color representation of the intended image. According to various embodiments of the present disclosure, the electronic device may perform such a determination for every color (e.g., red, green, and blue).

At operation 550, the electronic device may render the image. The electronic device may render the image using at least one or more of the sampled color values across the plurality of images used to reconstruct the image and using the estimated color values (e.g., the color values determined at operation 540).

FIG. 6 illustrates a flowchart of a method of determining color values according to an embodiment of the present disclosure.

Referring to FIG. 6, at operation 605, the electronic device may capture a plurality of images that substantially capture the same viewpoint. As an alternative, the electronic device may receive one or more of the plurality of images that substantially capture the same viewpoint from another electronic device (e.g., a counterpart electronic device). The electronic device may store the plurality of images (e.g., in a storage unit, a frame buffer, and/or the like).

At operation 610, the electronic device may determine whether a subpixel has a sampled color value for a particular color across the plurality of images used to reconstruct the intended image.

If the electronic device determines that the subpixel has a sampled color value for the particular color across the plurality of images at operation 610, then the electronic device may proceed to operation 615 at which the electronic device determines whether the subpixel has more than one sampled color value for the particular color across the plurality of images. For example, the electronic device may determine whether more than one of the plurality of images has a sampled color value for the particular color.

If the electronic device determines that the subpixel does not have more than one sampled color value for the particular color across the plurality of images at operation 615, then the electronic device may proceed to operation 620 at which the electronic device may determine the color value to be the sampled color value from one of the plurality of images. Thereafter, the electronic device may proceed to operation 625 at which the electronic device stores the corresponding sampled color value for the particular color as the color value for the particular color at the subpixel. Thereafter, the electronic device may proceed to operation 650.

In contrast, if the electronic device determines that the subpixel has more than one sampled color value for the particular color across the plurality of images at operation 615, then the electronic device may proceed to operation 630 at which the electronic device determines an average value of the sampled color values at the subpixel for the particular color across the plurality of images. Thereafter, the electronic device may proceed to operation 635 at which the electronic device may store the average value of the particular color as the color value for the particular color at the subpixel. Thereafter, the electronic device may proceed to operation 650.

In contrast, if the electronic device determines that the subpixel does not have a sampled color value of the particular color across the plurality of images at operation 610, then the electronic device may proceed to operation 640 at which the electronic device may determine the color value of the particular color at the subpixel. The electronic device may determine the color value by estimating a color value using sampled color values of the particular color at other subpixels (e.g., surrounding subpixels) across the plurality of images. The electronic device may determine the color value of the particular color by interpolation using sampled color values of the particular color at other subpixels of the intended image across the plurality of images. The electronic device may weigh the sampled color values of the particular color at one or more subpixels across the plurality of images more heavily than the sampled color values of the particular color at other subpixels across the plurality of images. Thereafter, the electronic device may store the determined color value of the particular color at the subpixel. Thereafter, the electronic device may proceed to operation 650.
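The per-subpixel branching of operations 610 through 640 can be sketched as follows. The function name and sample values are illustrative; the fallback estimator stands in for whatever interpolation the device applies at operation 640.

```python
def resolve_subpixel(samples, estimate):
    # Mirrors FIG. 6 for one subpixel and one color: no samples -> use the
    # estimator (operation 640); exactly one sample -> use it (operation
    # 620); several samples -> average them (operation 630).
    if not samples:
        return estimate()
    if len(samples) == 1:
        return samples[0]
    return sum(samples) / len(samples)

one = resolve_subpixel([128], None)           # single sample used directly
several = resolve_subpixel([120, 130], None)  # averaged to 125.0
none = resolve_subpixel([], lambda: 99)       # falls back to the estimator
```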

At operation 650, the electronic device may render the image using the corresponding color values for every color at the subpixels of the intended image. According to various embodiments of the present disclosure, the electronic device may perform further image processing on the mapping of color values to subpixels of the intended image.

FIG. 7A illustrates a flowchart of a method of demosaicing according to an embodiment of the present disclosure.

Referring to FIG. 7A, at operation 710, an electronic device generates a sum buffer. According to various embodiments of the present disclosure, the electronic device may sum the number of sampled color values for a particular color for each subpixel of the intended image across the plurality of images. The electronic device may determine the number of color values for the particular color that are sampled across the plurality of images for each subpixel of the intended image. According to various embodiments of the present disclosure, the electronic device may populate a sum buffer with an indication as to which subpixels of the intended image have at least one corresponding sampled color value for a particular color across the plurality of images and/or an indication as to which subpixels of the intended image do not have at least one corresponding sampled color value for a particular color across the plurality of images.

At operation 720, the electronic device may generate a filter. The electronic device may generate a filter in the form of a filter table. The filter may include coefficients corresponding to a weighting to be applied to at least a subset of the subpixels of the intended image. The coefficients in the filter may be normalized such that the sum of the coefficients is equal to one. According to various embodiments of the present disclosure, the electronic device may generate the filter for each subpixel of the intended image that does not have a corresponding sampled color value of a particular color across the plurality of images. According to various embodiments of the present disclosure, the electronic device may generate the filter for all subpixels of the intended image.

FIG. 7B illustrates a plurality of subpixels according to an embodiment of the present disclosure.

Referring to FIG. 7B, an array of subpixels of at least an area of an intended image is illustrated. The array of subpixels corresponds to a composite of sampled red color values across a plurality of images to be used to reconstruct a color representation of the intended image. The subpixels for which a red value has been captured (e.g., sampled) across the plurality of images are denoted by ‘R’. Conversely, the subpixels for which no red value has been captured (e.g., sampled) across the plurality of images are denoted by ‘0’.

According to various embodiments of the present disclosure, the electronic device may generate a filter using the array of subpixels illustrated in FIG. 7B. The electronic device may populate or otherwise generate a sum buffer to determine the subpixels for which no red value has been captured across the plurality of images. As an example, the sum buffer may be populated with ‘1’ for fields corresponding to the subpixels for which a red color value has been captured (e.g., the subpixels having a value ‘R’ in the array of subpixels). As another example, the sum buffer may be populated with ‘0’ for fields corresponding to the subpixels for which a red value has not been captured (e.g., the subpixels having a value ‘0’ in the array of subpixels). According to various embodiments of the present disclosure, the sum buffer may be used to identify the subpixels for which a color value needs to be reconstructed (e.g., estimated).

FIG. 7C illustrates a filter according to an embodiment of the present disclosure.

Referring to FIG. 7C, the electronic device may generate a filter using the array of subpixels, the sum buffer, or the like. Each field in the filter corresponds to a subpixel. As illustrated in FIG. 7C, the coefficients corresponding to subpixels for which no red color value has been captured are equal to zero, and the coefficients corresponding to the subpixels for which a red color value has been captured are 0.5.

According to various embodiments of the present disclosure, the filter may be generated in relation to a specific subpixel. For example, the filter illustrated in FIG. 7C may be generated in relation to subpixel 725 of FIG. 7B. As illustrated in FIG. 7C, the sampled red color values are equally weighted. The sampled red color values may be equally weighted because the sampled red color values are equidistant from the subpixel 725. The electronic device may normalize the filter illustrated in FIG. 7C. The filter may be normalized by summing the coefficients in the filter and dividing each coefficient by the sum of all the coefficients in the filter.
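The normalization described above can be sketched as below. The 3x3 layout and the 0.5 coefficients are illustrative stand-ins for the filter of FIG. 7C, which this text does not reproduce exactly.

```python
# Un-normalized filter akin to FIG. 7C: coefficient 0.5 where a red value
# was sampled around the target subpixel, 0 elsewhere (layout illustrative).
filter_table = [
    [0.0, 0.5, 0.0],
    [0.5, 0.0, 0.5],
    [0.0, 0.5, 0.0],
]

total = sum(sum(row) for row in filter_table)
normalized = [[c / total for c in row] for row in filter_table]
# Every non-zero coefficient becomes 0.25 and the coefficients sum to one,
# so the filter computes an unbiased average of four equidistant samples.
```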

According to various embodiments of the present disclosure, when there are more sampled color values of a particular color than the results of a single image capture, a method of reconstructing the color representation at a subpixel (e.g., demosaicing the image) uses all of the nearby sampled color values, but weighs the sampled color values closer to the subpixel to be estimated more heavily than the values that are farther away.

FIG. 7D illustrates a plurality of subpixels according to an embodiment of the present disclosure.

Referring to FIG. 7D, an array of subpixels of at least an area of an intended image is illustrated. The array of subpixels corresponds to a composite of sampled blue color values across a plurality of images to be used to reconstruct a color representation of the intended image. The subpixels for which a blue value has been captured (e.g., sampled) across the plurality of images are denoted by ‘B’. Conversely, the subpixels for which no blue value has been captured (e.g., sampled) across the plurality of images are denoted by ‘0’.

According to various embodiments of the present disclosure, the electronic device may generate a filter using the array of subpixels illustrated in FIG. 7D. The electronic device may populate or otherwise generate a sum buffer to determine the subpixels for which no blue value has been captured across the plurality of images. As an example, the sum buffer may be populated with ‘1’ for fields corresponding to the subpixels for which a blue color value has been captured (e.g., the subpixels having a value ‘B’ in the array of subpixels). As another example, the sum buffer may be populated with ‘0’ for fields corresponding to the subpixels for which a blue value has not been captured (e.g., the subpixels having a value ‘0’ in the array of subpixels). According to various embodiments of the present disclosure, the sum buffer may be used to identify the subpixels for which a color value needs to be reconstructed (e.g., estimated).

FIG. 7E illustrates a filter according to an embodiment of the present disclosure.

Referring to FIG. 7E, the electronic device may generate a filter using the array of subpixels, the sum buffer, or the like. Each field in the filter corresponds to a subpixel. As illustrated in FIG. 7E, the coefficients corresponding to the subpixels for which no blue color value has been captured are equal to zero. The coefficients corresponding to the subpixels for which a blue color value has been captured may be weighted according to a distance between the corresponding subpixel having the sampled blue color value and the subpixel for which a blue color value is being determined (e.g., estimated).

According to various embodiments of the present disclosure, the filter may be generated in relation to a specific subpixel. For example, the filter illustrated in FIG. 7E may be generated in relation to a subpixel 735 of FIG. 7D. As illustrated in FIG. 7E, each group of subpixels having sampled blue color values that are equidistant from the subpixel for which the blue color value is being determined is equally weighted. However, a subpixel having a sampled blue color value that is relatively closer to the subpixel for which the blue color value is being determined may have a coefficient corresponding to a higher weight than another subpixel having a sampled blue color value that is relatively farther away from the subpixel for which the blue color value is being determined.
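A distance-weighted filter of the kind described above may be sketched as follows. The inverse-distance weighting function, the `power` parameter, and the function name are assumptions for illustration; the disclosure only requires that equidistant samples receive equal weight and closer samples receive higher weight:

```python
import numpy as np

def distance_weighted_filter(sampled_mask, center, power=1.0):
    """Build a normalized filter whose coefficients fall off with the
    distance between each sampled subpixel and the target subpixel.

    `power` controls how quickly the weight decays with distance;
    inverse-distance weighting is one plausible choice, not the
    specific weighting of FIG. 7E.
    """
    filt = np.zeros(sampled_mask.shape)
    cy, cx = center
    for (y, x), sampled in np.ndenumerate(sampled_mask):
        if sampled and (y, x) != (cy, cx):
            dist = np.hypot(y - cy, x - cx)
            filt[y, x] = 1.0 / dist ** power  # closer samples weigh more
    # Normalize so that the coefficients sum to one.
    return filt / filt.sum()
```

With this construction, two samples at the same distance from the target subpixel receive identical coefficients, while a nearer sample receives a larger coefficient than a farther one.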

Referring back to FIG. 7A, at operation 730, the electronic device may determine a corresponding color value for a subpixel of the intended image that does not have a corresponding sampled color value of a particular color across the plurality of images. The electronic device may determine the corresponding color value for each subpixel of the intended image that does not have a corresponding sampled color value of the particular color across the plurality of images.

According to various embodiments of the present disclosure, the electronic device may determine the corresponding color value for a subpixel of the intended image that does not have a corresponding sampled color value of a particular color across the plurality of images using an applicable filter table (e.g., a filter table generated for use with the particular subpixel). According to various embodiments of the present disclosure, the electronic device may determine the corresponding color value for the subpixel by multiplying the corresponding sampled color values of the other subpixels (e.g., the surrounding subpixels) for which a corresponding color value has been captured (e.g., sampled) by the corresponding coefficient for that subpixel. Thereafter, the sum of the products of the coefficients of the filter and the corresponding sampled color values may be determined to be the color value at the subpixel of the intended image that does not have a corresponding sampled color value of the particular color.
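The multiply-and-sum step described above may be sketched as follows; the sampled values and the normalized coefficients are hypothetical:

```python
import numpy as np

def estimate_color_value(color_plane, filt):
    """Estimate the missing color value at a subpixel by multiplying
    each surrounding sampled color value by its filter coefficient
    and summing the products."""
    return float(np.sum(color_plane * filt))

# Hypothetical sampled color values (zeros where no sample exists)
# and a normalized filter generated for the center subpixel.
plane = np.array([
    [100.0, 0.0, 120.0],
    [  0.0, 0.0,   0.0],
    [140.0, 0.0, 160.0],
])
filt = np.array([
    [0.25, 0.0, 0.25],
    [0.0,  0.0, 0.0],
    [0.25, 0.0, 0.25],
])
value = estimate_color_value(plane, filt)  # (100 + 120 + 140 + 160) / 4
```

Because the filter is normalized, the result is simply the weighted average of the four sampled values, here 130.0.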

At operation 740, the electronic device may store the determined (e.g., estimated) color value for the corresponding color at the applicable subpixel of the intended image. The electronic device may store the determined color values for each respective color at each respective subpixel that does not have a corresponding sampled color value across the plurality of images. Thereafter, the electronic device may reconstruct the image. For example, the electronic device may render the image using the sampled color values and the estimated color values. According to various embodiments of the present disclosure, the electronic device may perform image processing on the image (e.g., the electronic device may further process the mapping of color values at the subpixels of the intended image) before rendering the image.

According to various embodiments of the present disclosure, the electronic device may use a similar method for estimating other characteristics of the intended image at the subpixel level. The electronic device may use measured values of a particular characteristic across a plurality of images to determine (e.g., estimate) a corresponding value of the particular characteristic at subpixels for which no measurement exists.

FIG. 8 illustrates pseudo code for a method of demosaicing according to an embodiment of the present disclosure.

Referring to FIG. 8, the illustrated pseudo code shows how the demosaicing is performed. The function “demo(color, x, y)” (e.g., demo(R,x,y)) corresponds to a function call to perform a demosaicing of a particular color at a particular subpixel denoted by the coordinates (x,y). The demosaic function may use an interpolation method or another numerical method for determining a corresponding color value at the particular subpixel. For example, the demosaic function may perform a pixel replication, a bilinear interpolation, a median interpolation, and/or the like.

As illustrated in FIG. 8, the function “demosaic” loops over all of the subpixels. The demosaic function may read the average pixel from the pipeline. The demosaic function may also read the sum buffer count from the “num” buffer for each pixel. If the sum buffer count is non-zero, then the average pixel is used as is (e.g., demosaicing is not performed on subpixels for which at least one sampled color value is captured across the plurality of images). In contrast, if the sum buffer count is zero, then demosaicing is determined to be required and the function “demo” is called to perform a demosaicing procedure.
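The loop described above may be sketched as follows. The data structures (dicts keyed by coordinates), the argument names, and the stand-in interpolation function are assumptions for illustration; FIG. 8 itself is not reproduced here:

```python
def demosaic(avg_pixels, num_buffer, demo):
    """Sketch of the FIG. 8 loop: keep the averaged pixel wherever at
    least one sample was captured (non-zero sum-buffer count), and
    call the interpolation function `demo` wherever the count is zero.
    """
    out = {}
    for (x, y), avg in avg_pixels.items():
        if num_buffer[(x, y)] != 0:
            out[(x, y)] = avg         # use the average pixel as is
        else:
            out[(x, y)] = demo(x, y)  # reconstruct by interpolation
    return out

# Two subpixels: one sampled (count 1), one unsampled (count 0).
result = demosaic(
    {(0, 0): 10, (1, 0): 0},   # averaged pixel values from the pipeline
    {(0, 0): 1,  (1, 0): 0},   # sum-buffer counts from the "num" buffer
    lambda x, y: 99,           # stand-in for the demo() interpolation
)
```

Here `result[(0, 0)]` keeps the averaged value 10, while `result[(1, 0)]` is filled in by the interpolation function.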

FIG. 8 further illustrates the generation of a filter.

FIG. 9 is a block diagram schematically illustrating a configuration of an electronic device according to an embodiment of the present disclosure.

Referring to FIG. 9, an electronic device 900 may include a control unit 910, a storage unit 920, an image processing unit 930, a display unit 940, an input unit 950, and a communication unit 960.

According to various embodiments of the present disclosure, the electronic device 900 comprises at least one control unit 910. The at least one control unit 910 may be configured to operatively control the electronic device 900. For example, the at least one control unit 910 may control operation of the various components or units included in the electronic device 900. The at least one control unit 910 may transmit a signal to the various components included in the electronic device 900 and control a signal flow between internal blocks of the electronic device 900. The at least one control unit 910 may be or otherwise include at least one processor. The at least one control unit 910 may include an Application Processor (AP), and/or the like.

The storage unit 920 may be configured to store user data, and the like, as well as a program which performs operating functions according to various embodiments of the present disclosure. The storage unit 920 may include a non-transitory computer-readable storage medium. As an example, the storage unit 920 may store a program for controlling general operation of the electronic device 900, an Operating System (OS) which boots the electronic device 900, and an application program for performing other optional functions such as a camera function, a sound replay function, an image or video replay function, a signal strength measurement function, a route generation function, image processing, and the like. Further, the storage unit 920 may store user data generated according to use of the electronic device 900, such as, for example, a text message, a game file, a music file, a movie file, and the like. According to various embodiments of the present disclosure, the storage unit 920 may store an application or a plurality of applications that individually or in combination operate a camera unit (not shown) to capture (e.g., contemporaneously) one or more images of substantially the same viewpoint, and/or the like. According to various embodiments of the present disclosure, the storage unit 920 may store an application or a plurality of applications that individually or in combination operate the image processing unit 930 or the control unit 910 to determine which subpixels across the plurality of images have a corresponding sampled color value of a particular color, to determine which subpixels across the plurality of images do not have a corresponding sampled color value of a particular color, to estimate the color value at a subpixel that does not have a corresponding sampled color value of the particular color across the plurality of images, to render the image, and/or the like.
The storage unit 920 may store an application or a plurality of applications that individually or in combination operate the control unit 910 and the communication unit 960 to communicate with a counterpart electronic device to receive one or more images from the counterpart electronic device, and/or the like. The storage unit 920 may store an application or a plurality of applications that individually or in combination operate the display unit 940 to display a graphical user interface, an image, a video, and/or the like.

The image processing unit 930 may be configured to process image data, images, and/or the like. The image processing unit 930 may include a Sub Pixel Rendering (SPR) unit (not shown), a demosaicing unit (not shown), and/or the like. In the alternative or in addition, the image processing unit 930 may be configured to perform demosaicing of image data and/or images, SPR, and/or the like. The image processing unit 930 may be configured to determine which subpixels of an intended image have a corresponding sampled color value for a particular color, to determine which subpixels of an intended image do not have a corresponding sampled color value for a particular color, to determine (e.g., estimate) a color value of a particular color for the subpixels that do not have a corresponding sampled color value of the particular color, and/or the like.

The display unit 940 displays information inputted by a user or information to be provided to a user, as well as various menus of the electronic device 900. For example, the display unit 940 may provide various screens according to use of the electronic device 900, such as an idle screen, a message writing screen, a calling screen, a route planning screen, and the like. According to various embodiments of the present disclosure, the display unit 940 may display an interface which the user may manipulate, or via which the user may otherwise enter inputs through a touch screen, to enter selection of a function relating to the signal strength of the electronic device 900. The display unit 940 can be formed as a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, an Active Matrix Organic Light Emitting Diode (AMOLED) display, and the like. However, various embodiments of the present disclosure are not limited to these examples. Further, the display unit 940 can perform the function of the input unit 950 if the display unit 940 is formed as a touch screen.

The input unit 950 may include input keys and function keys for receiving user input. For example, the input unit 950 may include input keys and function keys for receiving an input of numbers or various sets of letter information, setting various functions, and controlling functions of the electronic device 900. For example, the input unit 950 may include a calling key for requesting a voice call, a video call request key for requesting a video call, a termination key for requesting termination of a voice call or a video call, a volume key for adjusting output volume of an audio signal, a direction key, and the like. In particular, according to various embodiments of the present disclosure, the input unit 950 may transmit to the at least one control unit 910 signals related to the operation of a camera unit (not shown), to selection of an image, to selection of a viewpoint, and/or the like. Such an input unit 950 may be formed by one or a combination of input means such as a touch pad, a touchscreen, a button-type key pad, a joystick, a wheel key, and the like.

The communication unit 960 may be configured for communicating with other electronic devices and/or networks. According to various embodiments of the present disclosure, the communication unit 960 may be configured to communicate using various communication protocols and various communication transceivers. For example, the communication unit 960 may be configured to communicate via Bluetooth technology, NFC technology, WiFi technology, 2G technology, 3G technology, LTE technology, or another wireless technology, and/or the like.

It will be appreciated that various embodiments of the present disclosure according to the claims and description in the specification can be realized in the form of hardware, software or a combination of hardware and software.

Any such software may be stored in a non-transitory computer readable storage medium. The non-transitory computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present disclosure.

Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), Digital Versatile Disc (DVD), magnetic disk or magnetic tape or the like. It will be appreciated that the storage devices and storage media are various embodiments of non-transitory machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement various embodiments of the present disclosure. Accordingly, various embodiments provide a program comprising code for implementing apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.

According to various embodiments of the present disclosure, one or more images may be used to reconstruct a color representation of an intended image. Because more information may be obtained from a plurality of images of an intended viewpoint than otherwise obtained from a single image of the intended viewpoint, various embodiments of the present disclosure may use such additional information to reconstruct the intended image (e.g., intended viewpoint). According to the various embodiments of the present disclosure, in the absence of a plurality of images that substantially capture the same viewpoint, a method of reconstructing the color representation of the intended image may be consistent with the results of a normal demosaicing process which uses a single image.

While the disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents.

Claims

1. A method for demosaicing sampled color values, the method comprising:

storing image information for a plurality of images;
selecting an intended viewpoint that is within a composite of image information across the plurality of images; and
reconstructing color information for at least one color using image information from the composite of image information across the plurality of images that corresponds to the intended viewpoint.

2. The method of claim 1, wherein the intended viewpoint corresponds to the spatial area that is to be represented in an intended image to be generated using the reconstructed color information.

3. The method of claim 1, further comprising:

aligning the plurality of images.

4. The method of claim 1, further comprising:

determining the subpixels of the intended viewpoint for which color information of a particular color is captured among the plurality of images.

5. The method of claim 4, wherein the determining of the subpixels of the intended viewpoint for which color information of a particular color is captured among the plurality of images comprises:

populating a sum buffer with an indication of whether a particular subpixel of the intended viewpoint has a sampled color value for a particular color among the plurality of images.

6. The method of claim 1, further comprising:

determining the subpixels of the intended viewpoint for which color information of a particular color is not sampled among the plurality of images.

7. The method of claim 6, wherein the reconstructing of the color information for at least one color using image information from the composite of image information across the plurality of images that corresponds to the intended viewpoint comprises:

determining a color value for the subpixels of the intended viewpoint that are determined to not have corresponding color information of the particular color sampled among the plurality of images.

8. The method of claim 7, wherein the determining of the color value for the subpixels of the intended viewpoint that are determined to not have corresponding color information of the particular color sampled among the plurality of images comprises:

performing an interpolation using at least one sampled color value of the particular color from among the composite of image information across the plurality of images.

9. The method of claim 1, wherein the plurality of images were captured contemporaneously by one or more image detectors.

10. The method of claim 1, wherein the intended viewpoint corresponds to the viewpoint of one of the plurality of images.

11. The method of claim 1, further comprising:

rendering, on a display unit, the reconstructed color information as an image.

12. A non-transitory computer-readable storage medium storing instructions that, when executed, cause at least one processor to perform the method of claim 1.

13. An apparatus for demosaicing sampled color values, the apparatus comprising:

a storage unit; and
at least one processor configured to store image information for a plurality of images, to select an intended viewpoint that is within a composite of image information across the plurality of images, and to reconstruct color information for at least one color using image information from the composite of image information across the plurality of images that corresponds to the intended viewpoint.

14. The apparatus of claim 13, wherein the intended viewpoint corresponds to the spatial area that is to be represented in an intended image to be generated using the reconstructed color information.

15. The apparatus of claim 13, wherein the at least one processor is further configured to align the plurality of images.

16. The apparatus of claim 13, wherein the at least one processor is further configured to determine the subpixels of the intended viewpoint for which color information of a particular color is captured among the plurality of images.

17. The apparatus of claim 16, wherein the at least one processor is further configured to populate a sum buffer, stored in the storage unit, with an indication of whether a particular subpixel of the intended viewpoint has a sampled color value for a particular color among the plurality of images.

18. The apparatus of claim 13, wherein the at least one processor is further configured to determine the subpixels of the intended viewpoint for which color information of a particular color is not sampled among the plurality of images.

19. The apparatus of claim 18, wherein the at least one processor is further configured to determine a color value for the subpixels of the intended viewpoint that are determined to not have corresponding color information of the particular color sampled among the plurality of images.

20. The apparatus of claim 19, wherein the at least one processor is further configured to perform an interpolation using at least one sampled color value of the particular color from among the composite of image information across the plurality of images.

21. The apparatus of claim 13, wherein the plurality of images were captured contemporaneously by one or more image detectors.

22. The apparatus of claim 13, wherein the intended viewpoint corresponds to the viewpoint of one of the plurality of images.

23. The apparatus of claim 13, further comprising:

a display unit,
wherein the at least one processor is further configured to render, on the display unit, the reconstructed color information as an image.

24. A method for demosaicing sampled color values, the method comprising:

storing image information for an intended image;
determining subpixels of the intended image for which color information of a particular color is captured;
generating an adaptive filter to demosaic the image information for the intended image according to the subpixels of the intended image for which color information of a particular color is captured; and
demosaicing the image information for the intended image using the adaptive filter.

25. The method of claim 24, wherein demosaicing the image information for the intended image using the adaptive filter comprises:

applying the adaptive filter to a particular subpixel for which color information of a particular color is not captured.

26. The method of claim 25, wherein the generating of the adaptive filter comprises:

allocating a weighting of color information for subpixels that are in an area surrounding the particular subpixel and for which color information of the particular color is captured.

27. An apparatus for demosaicing sampled color values, the apparatus comprising:

a storage unit; and
at least one processor configured to store image information for an intended image, to determine subpixels of the intended image for which color information of a particular color is captured, to generate an adaptive filter to demosaic the image information for the intended image according to the subpixels of the intended image for which color information of a particular color is captured, and to demosaic the image information for the intended image using the adaptive filter.

28. The apparatus of claim 27, wherein the at least one processor is further configured to apply the adaptive filter to a particular subpixel for which color information of a particular color is not captured.

29. The apparatus of claim 28, wherein the at least one processor is further configured to allocate a weighting of color information for subpixels that are in an area surrounding the particular subpixel and for which color information of the particular color is captured.

Patent History
Publication number: 20150363913
Type: Application
Filed: Jun 13, 2014
Publication Date: Dec 17, 2015
Inventor: Michel Francis HIGGINS (Duncans Mills, CA)
Application Number: 14/304,315
Classifications
International Classification: G06T 3/40 (20060101); G06T 5/50 (20060101); G06T 7/40 (20060101); G06T 7/00 (20060101);