Image processing device, image display system, image processing method and program therefor

- Sony Corporation

An image processing device according to the present invention includes a motion vector detecting unit, a response time information storage unit, a compensation processing unit and an output unit. The motion vector detecting unit detects a motion vector of image data. The response time information storage unit stores response time information, representing the time from when a driving voltage is applied to a display device until an image with the corresponding tone is displayed, in association with a tone variation value. The compensation processing unit compensates a pixel value in the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed, based on the image data, the motion vector and the response time information. The output unit outputs the image data compensated by the compensation processing unit to the display device.

Description
CROSS-REFERENCE TO RELATED APPLICATION

The present invention contains subject matter related to Japanese Patent Application JP 2007-326342 filed in the Japan Patent Office on Dec. 18, 2007, the entire contents of which are incorporated herein by reference.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to an image processing device which processes externally input image data and outputs it to a hold-type display device, an image display system including the processing device, an image processing method and a program therefor.

2. Description of the Related Art

In recent years, flat panel displays such as LCDs (Liquid Crystal Displays) have become widespread, replacing CRTs (Cathode Ray Tubes).

In motion display, unlike an impulse-type display device such as a CRT, a hold-type display device such as an LCD keeps displaying all pixels constituting an image, for a period from when an instruction to display a predetermined one of a plurality of frames or fields (hereinafter referred to as a "frame") constituting a motion image is issued until an instruction to display the next frame is issued. Thus, in the hold-type display device, an issue is the occurrence of motion blur in a moving object, such as blurring at the leading edge, tailing at the trailing edge, and a delay in the perceived position, caused by an Eye-Trace Integration effect (an afterimage characteristic of the human retina when following a moving picture). In the LCD in particular, this motion blur is considered to occur easily due to the slow response speed of the liquid crystal.

To address this issue, an overdrive technique has been provided for restraining the motion blur by improving the response characteristic of the LCD. To improve the response characteristic to a step input in the LCD, the overdrive technique applies, for example, a voltage greater than the target voltage corresponding to a specified brightness value, so as to accelerate the brightness transition in the frame where the input signal variation first occurs. With this overdrive technique, the response speed of the liquid crystal can be increased in the half-tone region, attaining a motion blur restraining effect. A technique has also been proposed for restraining the motion blur more effectively by changing the waveform of the applied voltage in accordance with a motion vector in each frame, using the overdrive technique (see, for example, JP-A No. 2005-43864).
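
As a rough illustration of the general overdrive idea (not the specific method of JP-A No. 2005-43864), the drive level for the first frame after an input change can be pushed past the target. Real panels use a calibrated lookup table indexed by the previous and target tones; the linear gain below is only an illustrative stand-in:

```python
def overdrive(prev_tone: int, target_tone: int, gain: float = 0.5) -> int:
    """Push the drive level past the target to accelerate the tone transition.

    A real panel uses a calibrated (previous tone, target tone) lookup
    table; the linear 'gain' here is an illustrative stand-in.
    """
    boosted = target_tone + gain * (target_tone - prev_tone)
    # The applicable range is limited, which is why the technique loses
    # effectiveness near black (0) and white (255).
    return max(0, min(255, round(boosted)))
```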

SUMMARY OF THE INVENTION

However, an issue with the overdrive technique is that a voltage high enough to accelerate the response speed of the liquid crystal may not be applicable, because of the limited voltage range applicable to the liquid crystal. As a result, the motion blur restraining effect may not be sufficiently attained, for example, when the target voltage for a black display or a white display is near the limit of the voltage range (i.e., for tone variations in a high tone range or a low tone range).

In a liquid crystal display driven in VA mode, the liquid crystal shows different characteristics between the rise and the decay. The molecular alignment takes a long time to change when rising from level "0" (e.g. black). Thus, an issue is that, given the response speed of the liquid crystal, a specified brightness transition may not be completed within one frame using only the overdrive technique.

Recently, a double-speed operation technique has been developed for displaying image data on the LCD, in which a frame to be displayed is time-divided in order to reduce the Eye-Trace Integration effect, and an interpolated image between frames is obtained based on a motion vector of the input image in order to increase the motion picture display frequency using a plurality of sub-fields.

However, if the display frequency is increased, the driving frequency of the display driver which drives the display device increases as well. This may result in issues such as insufficient charging, an increase in the number of IC or connector terminals, an increase in substrate area, heat generation, an increase in EMI (Electro Magnetic Interference), and an increase in cost.

The present invention has been made in consideration of the above issues. It is desirable to restrain an increase in cost, to reduce an Eye-Trace Integration effect, to improve a response characteristic of image display in all tone variations, and to restrain the motion blur, in an image processing device which processes externally input image data and outputs it to a hold-type display device, an image display system including the processing device, an image processing method and a program therefor.

According to an embodiment of the present invention, there is provided an image processing device which processes externally input image data and outputs display image data to a hold-type display device, including: a motion vector detecting unit which detects a motion vector of the input image data; a response time information storage unit which stores response time information, representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value; a compensation processing unit which compensates a pixel value in the image data for each pixel, in a frame which is one frame ahead of a frame to be displayed by the display device, based on the image data, the motion vector and the response time information; and an output unit which outputs the image data compensated by the compensation processing unit to the display device.

The image processing device may further include an edge detecting unit which detects an edge from the input image data, based on the motion vector.

The compensation processing unit may determine whether to perform a compensation process for the pixel value, in accordance with a detection result of the edge detecting unit.

The compensation processing unit may decide whether to perform the compensation process in accordance with an edge direction of an edge part detected by the edge detecting unit.

Further, at this time, the compensation processing unit may decide to perform the compensation process, when it is determined that the edge part detected by the edge detecting unit is in a rise area from a low tone to a high tone based on the edge direction, and may decide not to perform the compensation process, when it is determined that the edge part is in a decay area from a high tone to a low tone based on the edge direction.
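
Under the assumptions of the preceding paragraphs, this decision rule reduces to a comparison of the tones on either side of the detected edge along the motion direction. A minimal sketch with illustrative names:

```python
def should_compensate(tone_behind: int, tone_ahead: int) -> bool:
    """Decide whether to compensate a detected edge.

    tone_behind/tone_ahead are the tones on the two sides of the edge
    along the motion direction (an illustrative representation).
    A rise (low -> high tone) needs help because the liquid crystal
    rises slowly; a decay (high -> low tone) responds quickly, so
    compensation is skipped.
    """
    return tone_ahead > tone_behind
```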

The compensation processing unit may include: a compensation range setting unit which sets a compensation range for compensating a pixel value in the image data based on the motion vector; a filter setting unit which sets a characteristic of a filter for compensating the pixel value in the image data, based on the image data, the motion vector and the response time information, so that, when the display device displays the frame to be displayed, an image is displayed with the tone set based on the image data; and a filter processing unit which compensates the pixel value of each pixel within the compensation range by filtering the image data, in the frame that is one frame ahead of the frame to be displayed by the display device, with a filter having the characteristic set by the filter setting unit.

The image processing device having the compensation processing unit may further include an edge detecting unit which detects an edge from the input image data based on the motion vector.

At this time, the compensation processing unit may further include a selecting unit which selects either the image data whose pixel value has been compensated by the filter processing unit or the image data whose pixel value has not been compensated by the filter processing unit, in accordance with a detection result of the edge detecting unit.

The selecting unit may select either the image data whose pixel value has been compensated or the image data whose pixel value has not been compensated, in accordance with an edge direction of an edge part detected by the edge detecting unit.

Further, at this time, the selecting unit may select the image data whose pixel value has been compensated when it is determined, based on the edge direction, that the edge part detected by the edge detecting unit is in a rise area from a low tone to a high tone, and may select the image data whose pixel value has not been compensated when it is determined, based on the edge direction, that the edge part is in a decay area from a high tone to a low tone.

The filter setting unit may change the number of taps of the filter in accordance with the motion vector value detected by the motion vector detecting unit.

The filter may, for example, be a moving average filter.
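
A minimal sketch of such a filter stage, with the tap count tied to the motion vector magnitude as described above (illustrative, one scan line; not the patented filter design):

```python
import numpy as np

def moving_average_compensate(row: np.ndarray, motion_px: int) -> np.ndarray:
    """Filter one scan line with a moving average whose tap count follows
    the magnitude of the motion vector (e.g. 4 px/frame -> 4 taps)."""
    taps = max(1, abs(int(motion_px)))
    kernel = np.full(taps, 1.0 / taps)
    # 'same'-size output; boundary handling is simplified here (the
    # outside replacement unit described below addresses this).
    return np.convolve(row, kernel, mode="same")
```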

The compensation processing unit may further include an outside replacement unit which replaces the values outside the compensation range in the input image data, using a maximum value and a minimum value of the tone of the image data, and the filter processing unit may filter the image data processed by the outside replacement unit, with the filter.
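
One plausible reading of this outside replacement, sketched below: before filtering, the samples outside the compensation range are clamped to the maximum/minimum tones detected inside it, so the filter output near the range boundaries reflects only the edge being compensated (a rising edge moving right is assumed; names are illustrative):

```python
import numpy as np

def replace_outside(row: np.ndarray, start: int, end: int) -> np.ndarray:
    """Clamp samples outside the compensation range [start, end) to the
    minimum/maximum tone found inside it (one plausible reading of the
    outside replacement unit; assumes a rising edge moving right)."""
    inside = row[start:end]
    out = row.astype(float).copy()
    out[:start] = inside.min()  # before the edge: the low tone
    out[end:] = inside.max()    # past the edge: the high tone
    return out
```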

On the other hand, the compensation processing unit may include: an interpolated image generating unit which generates interpolated image data corresponding to an interpolated image to be inserted between two continuous frames, based on the image data and the motion vector; a display timing information generating unit which generates display timing information representing a timing at which the interpolated image is displayed after a predetermined period of time, based on the response time information; and an image synthesizing unit which synthesizes the generated display timing information with the input image data.

According to another embodiment of the present invention, there is provided an image display system which includes an image processing device, which processes externally input image data, and a hold-type display device, which displays the image data processed by and input from the image processing device, wherein: the image processing device includes a motion vector detecting unit which detects a motion vector of the input image data, a response time information storage unit which stores response time information, representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value, a compensation processing unit which compensates a pixel value in the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information, and an output unit which outputs the image data compensated by the compensation processing unit to the display device; and the display device includes an image display unit which displays an image corresponding to the image data input from the image processing device, and a display controlling unit which controls driving of the image display unit based on the image data input from the image processing device.

According to still another embodiment of the present invention, there is provided an image processing method for processing externally input image data and generating image data to be output to a hold-type display device, the method including the steps of: detecting a motion vector of the input image data; extracting response time information from a response time information storage unit which stores the response time information, representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value; compensating a pixel value of the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and outputting the compensated image data to the display device.

According to still a further embodiment of the present invention, there is provided a program for controlling a computer to function as an image processing device which processes externally input image data and outputs it to a display device performing hold-type driving, the program including: a motion vector detecting function which detects a motion vector of the input image data; a response time storage function for storing response time information, representing the time from when a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value; a compensation processing function for compensating a pixel value in the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and an outputting function for outputting the image data compensated by the compensation processing function to the display device.

As described above, according to the image processing device, image display system, image processing method and program of the present invention, in a so-called hold-type display device, it is possible to restrain the motion blur caused in a moving object by an Eye-Trace Integration effect, such as blurring at the leading edge, tailing at the trailing edge, and a delay in the perceived position, thus improving the quality of a moving image. Double-speed operation requires a change in the display device itself. According to the present invention, however, no change needs to be made in the display device, so the cost of the display device does not increase. Further, the motion blur can be sufficiently restrained for tone variations in regions other than the half-tone region, which may not be improved using the overdrive technique. Particularly, in a display device using a display element with a slow response speed, the greater the difference in response times due to tone variation, the greater the motion blur restraining effect.

According to the embodiments of the present invention, it is possible to restrain an increase in cost, to reduce the Eye-Trace Integration effect, to improve a response characteristic of image display in all tone variations, and to restrain the motion blur, in an image processing device which processes externally input image data and outputs it to a hold-type display device, an image display system including the processing device, an image processing method and a program therefor.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an explanatory diagram showing an example of a response waveform of a liquid crystal, when a pulse signal is input to a general VA mode liquid crystal;

FIG. 2 is an explanatory diagram for explaining an example of the relationship between Eye-Trace Integration Effect and a motion blur in a hold-type display device;

FIG. 3 is an explanatory diagram for explaining an example of the relationship between Eye-Trace Integration Effect and a motion blur in a hold-type display device;

FIG. 4 is an explanatory diagram for explaining an example of the relationship between Eye-Trace Integration Effect and a motion blur in a hold-type display device;

FIG. 5 is an explanatory diagram for explaining an example of the relationship between Eye-Trace Integration Effect and a motion blur in a hold-type display device;

FIG. 6 is an explanatory diagram schematically showing an example of an image processing method in an image processing device according to the present invention;

FIG. 7A is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device;

FIG. 7B is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device;

FIG. 7C is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device;

FIG. 7D is an explanatory diagram showing an example of an operation waveform when a step waveform is input to a hold-type display device;

FIG. 8A is an explanatory diagram showing an example of an input signal that is input to an image processing device according to the present invention;

FIG. 8B is an explanatory diagram showing an example of an output signal that is output from an image processing device according to the present invention;

FIG. 8C is an explanatory diagram showing an example of an output signal that is output from an image processing device according to the present invention;

FIG. 9 is an explanatory diagram showing a variation in the spatial direction of the intensity of light accumulated on the retina of a user who has seen a hold-type display device displaying an image based on an output signal that is output from an image processing device according to the present invention;

FIG. 10 is a block diagram showing a functional configuration of an image processing device according to an embodiment of the present invention;

FIG. 11 is a block diagram showing a functional configuration of a display device according to the embodiment;

FIG. 12 is a block diagram showing a functional configuration of a compensation processing unit according to the embodiment;

FIG. 13 is an explanatory diagram for explaining functions of a high frequency detecting unit according to the embodiment;

FIG. 14 is an explanatory diagram showing an example of setting filter characteristics in accordance with a filter setting unit according to the embodiment;

FIG. 15 is an explanatory diagram showing an example of setting filter characteristics in accordance with the filter setting unit according to the embodiment;

FIG. 16 is a block diagram showing a hardware configuration of the image processing device according to the embodiment;

FIG. 17 is a flowchart showing the processing flow of an image processing method according to the embodiment; and

FIG. 18 is a flowchart showing a concrete example of a compensation method according to the embodiment.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.

Improvement of Motion Blur

Before explaining the preferred embodiments of the present invention, the present inventors first explain how they arrived at the image processing device according to the present invention as a device for improving motion blur in a hold-type display device, such as a liquid crystal display device.

As described above, in the hold-type display device, an object in motion may cause motion blur, such as a blur at the leading edge, tailing at the trailing edge, and a delay in the perceived position. In the related art, this has been considered a result of the delay in the response time of the display element, such as a liquid crystal. Thus, an overdrive technique has been utilized as a means for improving the motion blur in the hold-type display device. The overdrive technique can accelerate the response time of the display element, such as a liquid crystal.

However, the delay in the response time of the display element, such as a liquid crystal, is not the only cause of motion blur in the hold-type display device. Another major cause is the Eye-Trace Integration Effect, that is, the afterimage characteristic of the human retina when tracing a moving image. Thus, the motion blur in the hold-type display device cannot be sufficiently restrained using only a general overdrive technique, which considers only the delay in the response time of the display element, such as a liquid crystal.

According to the image processing device disclosed in Patent Document 1, filed earlier by the present applicant, the motion blur in the hold-type display device can be sufficiently restrained by considering not only the response time of the liquid crystal but also the Eye-Trace Integration Effect when utilizing the overdrive technique.

The overdrive technique can accelerate the response time of the display element for tone variations in the halftone region. However, with this technique, a sufficiently high voltage may not be applicable to the display element when the target voltage for a white display or a black display is near the limit of the applicable voltage range. In such cases, it is difficult to sufficiently achieve the effect of accelerating the response time of the display element.

In a liquid crystal display device using a VA mode driving technique, the molecular alignment takes a long time to change when rising from level "0" (e.g. black). Thus, the response may not be completed within one frame using only the overdrive technique.

Explanations will now be made to the response characteristics of the liquid crystal in the case where, by way of example, a pulse signal is input to a general VA mode liquid crystal, with reference to FIG. 1. FIG. 1 is an explanatory diagram showing an example of a response waveform of a liquid crystal when a pulse signal is input to the general VA mode liquid crystal. In FIG. 1, the vertical axis represents the tone of the liquid crystal, while the horizontal axis represents the time. Further, in FIG. 1, a solid line represents the response waveform L of the liquid crystal generated when a pulse signal P of one frame period, represented by a broken line, is input to the general VA mode liquid crystal.

As shown in FIG. 1, in the VA mode liquid crystal, the response characteristics differ between the rise and the decay. In the rise, the liquid crystal responds along a VT curve, causing a delay between the signal input and the response to it. In the decay, on the other hand, the liquid crystal does not respond along a VT curve, so little delay occurs. In particular, as shown in the region U enclosed by the broken-line circle in FIG. 1, it is apparent that a long delay occurs in the response time in the rise from a low tone (e.g. level 0). It is also apparent that, in the rise, the response time differs greatly depending on the tone difference at the time the signal is input.

The present inventors have further examined the relationship between the Eye-Trace Integration Effect and the motion blur in the hold-type display device. They have found that the motion blur can be effectively restrained in the hold-type display device by controlling the application of the driving voltage in accordance with the response time of the display element, such as a liquid crystal, based on the difference in response times between tones. As a result, they have arrived at the present invention.

Eye-Trace Integration Effect

Explanations will now be made to the relationship between the Eye-Trace Integration Effect examined by the present inventors and the motion blur in the hold-type display device, with reference to FIG. 2 to FIG. 5. FIG. 2 to FIG. 5 are explanatory diagrams each for explaining an example of the relationship between the Eye-Trace Integration Effect and the motion blur in the hold-type display device.

In the following explanations, a liquid crystal display device will be described as a hold-type display device by way of example. A predetermined one of a plurality of pixels included in a frame or field (hereinafter simply referred to as a “frame” for easy explanation) corresponds to each of the display elements (liquid crystals in this embodiment) included in the display screen of the liquid crystal display device.

It is assumed that the image to be handled has a solid-color background, and that the image of stepwise changes moves at a constant speed. Under this assumption, if the Eye-Trace Integration is traced, the brightness along the trace is a periodic function, so the Eye-Trace Integration simply needs to be evaluated over one frame. For ease of calculation, in this embodiment, the brightness change at the edge of the image (edge part) is assumed to be vertical.

Whether the improvement of the motion blur in the hold-type display device has reached a target quality can be determined by whether an effect equal to or greater than the Eye-Trace Integration Effect of an LCD driven at 120 Hz, as a result of double-speed operation of a 60 Hz driving system, can be obtained. Criteria for the target quality include: the steepness of the perceptual boundary (leading edge and trailing edge) in the Eye-Trace Integration; and the delay at the half-value point (half of the maximum brightness) of the attained brightness.

FIG. 2 to FIG. 5 show an example of a case wherein an image of stepwise changes moves by 4 pixels per frame from left to right on the display screen of the liquid crystal display device. The upper illustrations of FIG. 2 to FIG. 5 show a waveform of an input image signal input to the liquid crystal display device. The middle illustrations of FIG. 2 to FIG. 5 show a time transition of an output level (brightness) of the liquid crystal, when the image based on the input image signal of the upper illustrations is displayed on the liquid crystal display device. The lower illustrations thereof show the intensity of light (i.e. the Eye-Trace Integration Effect) introduced to the retina of a user's eye, when the user (a person) sees an image displayed on the liquid crystal display device.

In the middle illustrations of FIG. 2 to FIG. 5, the position in the horizontal direction represents the position (in the spatial direction) of each of the pixels included in each frame. In the same illustrations, the position in the vertical downward direction represents the time transition. In the middle illustrations of FIG. 2 to FIG. 5, one liquid crystal corresponds to one pixel, the intensity of gray tone represents the output level of each liquid crystal, and the reference symbols "0F" and "1F" identify the number of each frame.

Further, in the lower illustrations of FIG. 2 to FIG. 5, the position in the horizontal direction represents the position (in the spatial direction) on the retina of the user's eye at a point of time tb in the middle illustration. In the same illustrations, the position in the vertical upward direction represents the intensity of light introduced to the retina of the user's eye. That is, the areas S1, S2, S3 and S4 correspond to the integration results of the intensity of light at each position on the retina of the user's eye, and are the results of the Eye-Trace Integration. More particularly, in the middle illustrations of FIG. 2 to FIG. 5, the oblique arrows toward the lower right represent the movement of the user's eye. A predetermined level of light, output from the liquid crystal at the position through which each of the oblique arrows passes, enters the user's retina at each moment between a time ta and the time tb. The light entering at each moment is sequentially accumulated on the user's retina, and the intensity of the accumulated light (the integrated value of the level of the entered light) is what is perceived at the time tb.
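
The Eye-Trace Integration just described can be reproduced numerically: follow the eye along the motion trajectory for one frame and accumulate the luminance it sees at each moment. A minimal sketch, assuming a uniform eye speed and discrete time steps:

```python
import numpy as np

def eye_trace_integration(lum: np.ndarray, speed_px_per_frame: float) -> np.ndarray:
    """Accumulate, per retinal position, the light seen while the eye
    tracks the motion over one frame (the lower illustrations of
    FIG. 2 to FIG. 5).

    lum[t, x] is the displayed luminance at time step t and pixel x,
    with T time steps spanning one frame.
    """
    T, X = lum.shape
    acc = np.zeros(X)
    for t in range(T):
        shift = speed_px_per_frame * t / T   # eye position offset at time t
        xs = np.arange(X) + shift            # sample along the oblique trace
        acc += np.interp(xs, np.arange(X), lum[t])
    return acc / T
```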

Explanations will now be made to the relationship between the Eye-Trace Integration Effect examined by the present inventors and the motion blur in the hold-type display device, based on the illustrations of FIG. 2 to FIG. 5.

FIG. 2 shows the relationship between the Eye-Trace Integration Effect and the motion blur, when an input image signal having the waveform shown in the upper illustration (an input image signal corresponding to the frame 1F of the illustration) is input at the time tb to an ideal hold-type display device whose display element (e.g. a liquid crystal) has a response time of 0.

As shown in FIG. 2, in the display device using an ideal hold-type element, the response time for a step input is 0. The output level of the liquid crystal instantaneously reaches the brightness (target brightness) corresponding to the input image signal, realizing a quick response of the liquid crystal. However, the Eye-Trace Integration Effect occurs even in the ideal hold-type device, resulting in a motion blur of four pixels, corresponding to the movement speed of the stepwise input image.

FIG. 3 shows the relationship between the Eye-Trace Integration effect and the motion blur, when an input image signal having a waveform (an input image signal corresponding to the frame 1F of the illustration) shown in the upper illustration is input at the time tb, to a general liquid crystal display device (LCD).

As shown in FIG. 3, a general LCD has a slow response speed for a step input, with a response time of about one frame until the target brightness is reached. The LCD performs hold-type driving, thus causing the Eye-Trace Integration Effect. When a step input is applied to the general LCD, the Eye-Trace Integration Effect is added to the response time based on the response speed of the liquid crystal. This results in a motion blur of eight pixels, twice the movement speed of the stepwise input image.

FIG. 4 shows the relationship between the Eye-Trace Integration Effect and the motion blur, when an input image signal (an input image signal corresponding to the frame 1F in the illustration) having the waveform shown in the upper illustration is input at the time tb to an LCD which performs a double-speed operation (doubling the motion picture display frequency). That is, this LCD displays an image interpolated based on a motion vector in each of two sub-fields into which one frame is divided.

As shown in FIG. 4, there is no difference in the response speed of the liquid crystal between an LCD performing double-speed operation and a general LCD. In the LCD performing the double-speed operation, one frame is divided into two sub-fields, and the interpolated image is displayed in each of the sub-fields. The hold time for one input image signal is thus reduced by half, reducing the Eye-Trace Integration effect. As a result, the motion blur is reduced to five pixels as a whole. As described above, whether the improvement of the motion blur in the hold-type display device has reached the target quality can be determined by whether the motion blur is equal to or smaller than the five-pixel motion blur of the LCD performing the double-speed operation.

FIG. 5 shows an example of the relationship between the Eye-Trace Integration Effect and the motion blur, when an input image signal having the waveform shown in the upper illustration (an input image signal corresponding to the frame 1F in the illustration) is input at the time tb to the image processing device according to the present invention, and the processed signal is displayed on the hold-type display device.

In the image processing device according to the present invention, response time information is stored in association with the brightness change. This response time information represents the time from when a driving voltage for displaying an image with a target brightness is applied to the hold-type display device until the display device displays an image with the brightness corresponding to this driving voltage. The image processing device compensates the brightness value of each pixel included in the frame to be displayed, for each pixel, in the frame (0F in this embodiment) ahead of the frame (1F in this embodiment) to be displayed, i.e. at the time ta, based on the response time information and the motion vector of the input image. This compensation is performed so that each pixel reaches the target brightness in the frame (1F) to be displayed. In the example of FIG. 5, for the pixels displayed first in the frame (1F) to be displayed (the right four pixels), the image processing device adjusts the voltage applied to the liquid crystal corresponding to each pixel at the point of 0F, and adjusts the output level of the liquid crystal for each pixel (see the stair-like part of the output level of the liquid crystal at the point of 0F). As a result, each pixel in the frame (1F) to be displayed reaches the target brightness.

Accordingly, in the frame (0F) ahead of the frame (1F) to be displayed, an optimum voltage is applied in advance to the liquid crystal corresponding to each pixel (the pixel value is compensated), thereby remarkably reducing the Eye-Trace Integration Effect, in consideration of the response time of the liquid crystal until each pixel included in the frame to be displayed reaches the target brightness. As a result, as shown in FIG. 5, the motion blur is reduced to two pixels as a whole. This is clearly a greater motion blur restraining effect than that of the LCD which performs the double-speed operation. In the present invention, the pixel value is compensated on a per-pixel basis, making the technique suitable even for high pixel counts, as in a high-resolution display. Further, as in the VA-mode liquid crystal, the greater the difference in response times due to tone variation, and the higher the movement speed of the moving object (the motion vector value), the greater the motion blur restraining effect of the compensation process.

Accordingly, an image processed by the image processing device according to the present invention and displayed on the hold-type display device attains a greater motion blur restraining effect than that of the LCD performing the double-speed operation. In the LCD performing the double-speed operation, the interpolated image is synthesized with the input image, dividing the frame into a plurality of sub-fields so as to increase the frame rate, and reducing the hold time so as to restrain the motion blur. In the image processing device according to the present invention, the interpolation is performed in the spatial direction rather than the time direction, based on a motion vector, and the interpolation result is converted from a spatial variation to a time variation based on the response time information, thereby obtaining the effect of a pseudo increase in the frame rate. As a result, in the hold-type display device, the motion picture response characteristic is improved, and the motion blur can be restrained.

General View of Image Processing Method of the Present Invention

Explanations will now be made to the general view of an example of an image processing method in the image processing device according to the present invention, with reference to FIG. 6. FIG. 6 is an explanatory diagram schematically showing an example of the image processing method in the image processing device according to the present invention.

As shown in FIG. 6, if input image data is input to an image processing device 100, the image processing device 100 compares the input image data corresponding to the input frame to be displayed with image data corresponding to a frame that is one frame ahead of the frame to be displayed and stored in a memory 5-1 of the image processing device 100, so as to detect a motion vector of the input image (S11). The detected motion vector is used in the next step (S13) for generating an interpolated image. The detected motion vector is used also in the following compensation process or an overdrive process, and may be stored in the memory 5-1 as needed.

The image processing device generates an interpolated image to be inserted between the frame to be displayed and the frame that is one frame ahead thereof, based on the motion vector detected in step S11 (S13). Upon generation of this interpolated image, the motion picture display frequency is doubled (from 60 Hz to 120 Hz in a general LCD). The generated interpolated image is used in the next compensation step (S15), and may be stored in the memory 5-1. This interpolated image generating step (S13) is not indispensable in this embodiment. Even if the motion picture display frequency (frame rate) is not increased, a motion blur restraining effect can be sufficiently attained in the hold-type display device by performing the compensation step (S15) described below.

The image processing device generates compensation information for displaying the interpolated image generated in step S13 after a predetermined period of time, so that an image with the target brightness is displayed in the frame to be displayed, based on the motion vector detected in step S11 and the response time information stored in a lookup table (LUT) 5-2. The image processing device synthesizes this compensation information with the input image data, so as to generate compensated image data whose pixel values have been compensated (S15). The generated compensated image data is used in the next overdrive process (S17). This compensation step (S15) is performed in the frame ahead of the frame to be displayed. If step S13 is not performed (i.e. if no interpolated image is generated), the image processing device directly obtains, in step S15, the compensated pixel value for displaying the image of the target brightness in the frame to be displayed, without using the interpolated image, based on the motion vector detected in step S11 and the response time information stored in the lookup table (LUT) 5-2. After that, the image processing device generates compensated image data based on the obtained compensated pixel value.

The image processing device performs an overdrive process for the compensated image data, using the input image data stored in the memory 5-1 and the compensated image data generated in step S15 (S17). As a result, display image data to be displayed on the hold-type display device can be generated.
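
The flow S11 to S17 can be summarized as a pipeline skeleton. The following sketch uses toy stand-ins for each step (a global 1-D motion estimate, a response-time-weighted pre-compensation, a linear overdrive); none of these toy bodies is the patented algorithm, and all names are illustrative:

```python
import numpy as np

def process_frame(curr: np.ndarray, state: dict, response_lut) -> np.ndarray:
    """Skeleton of the FIG. 6 flow on one scan line (S13, the optional
    interpolation step, is omitted, as the text permits)."""
    curr = curr.astype(float)
    prev = state.get("prev", curr)

    # S11: toy global motion vector via minimum sum of absolute differences.
    mv = min(sorted(range(-8, 9), key=abs),
             key=lambda s: float(np.abs(np.roll(prev, s) - curr).sum()))

    # S15: pre-compensate in the frame ahead of display -- start each slow
    # tone transition early, weighted by the response time that
    # response_lut reports for its tone variation (normalized to 0..1).
    if mv != 0:  # compensation applies to moving areas
        rt = np.array([min(1.0, response_lut(abs(c - p)))
                       for p, c in zip(prev, curr)])
        compensated = prev + (curr - prev) * rt
    else:
        compensated = curr

    # S17: simple overdrive past the compensated value.
    display = np.clip(compensated + 0.5 * (compensated - prev), 0.0, 255.0)

    state["prev"] = curr
    return display
```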

Explanations will now be made to an operation waveform when a step waveform is input to the hold-type display device with reference to FIG. 7A to FIG. 7D. FIG. 7A to FIG. 7D are explanatory diagrams each showing an example of an operation waveform when a step waveform is input to the hold-type display device. In FIG. 7A to 7D, the vertical direction indicates the brightness of each pixel included in a frame, while the horizontal direction indicates the position of each pixel (spatial direction) included in a frame. In FIG. 7A to FIG. 7D, the areas partitioned by broken lines are referred to as units each including a plurality of pixels (four pixels in this embodiment).

FIG. 7A shows a waveform of a step signal input to a general LCD. As shown in FIG. 7A, the input step signal has an edge part on the right end of an N-th unit. Note that the height of this edge is the target brightness in the frame to be displayed.

FIG. 7B shows an operation waveform when a step signal is input to an LCD that adopts an overdrive system. As shown in FIG. 7B, according to the overdrive system, a voltage greater than the target voltage for displaying the image of the target brightness on the display device is applied, for example, in the frame where an input variation first occurs, so as to accelerate the brightness transition. Hence, in the N-th unit, the brightness is greater than the target brightness. Note, however, that a general overdrive system does not detect the movement of a moving object in the frame (i.e. the motion vector), and the voltage is applied uniformly, without regard to the motion vector. Thus, the N-th unit has a uniform brightness that is greater than the target brightness as a whole (each of the pixels included in the N-th unit has an equal brightness).

FIG. 7C shows an operation waveform when a step signal is input to an LCD adopting a system for adjusting the voltage to be applied based on a motion vector at the time of performing an overdrive operation, as described in Patent Document 1. As shown in FIG. 7C, according to this system, the motion vector of an input image is detected when applying a voltage greater than the target voltage, and the voltage to be applied is adjusted for each pixel based on the detected motion vector. As a result, the motion blur restraining effect can be improved in the hold-type display device, as compared with a general overdrive system.

However, as described above, because there is a certain limit on the range of the voltage that can be applied to the liquid crystal, an issue occurs. For example, when the target voltage for a black display or a white display is near the limit of the voltage range (i.e. in the case of tone variation in a high tone range or a low tone range), a voltage high enough to accelerate the response speed of the liquid crystal may not be applied, and the motion blur restraining effect may not be sufficiently attained. In consideration of this, according to the present invention, the compensation process described in step S15 of FIG. 6 is performed.

FIG. 7D shows an example of an operation waveform when a step signal is input to the image processing device according to the image processing method of the present invention. As shown in FIG. 7D, according to the system of the present invention, the brightness value of each pixel constituting the frame to be displayed is compensated in a frame that is one frame ahead of the frame to be displayed, based on the response time information and the motion vector of the input image. This compensation is performed so that the target brightness is attained at each pixel in the frame to be displayed. As a result, the brightness does not drop vertically from a high value to a low value at the edge part of the step signal; rather, an operation waveform is attained in which the brightness gradually decreases in a stair-like manner in accordance with the response speed of the liquid crystal. In addition to the image processing method of the present invention, FIG. 7D shows the operation waveform when an overdrive system is adopted in consideration of the motion vector. In the present invention, however, the overdrive system may be adopted only as needed, and is not necessarily adopted.

Subsequently, explanations will now be made to the operation of the compensation process in the image processing device according to the present invention, referring to the waveforms of an input signal input to the image processing device and an output signal output from the image processing device, with reference to FIG. 8A to FIG. 8C and FIG. 9. FIG. 8A is an explanatory diagram showing an example of the input signal input to the image processing device according to the present invention. FIG. 8B and FIG. 8C are explanatory diagrams each showing an example of the output signal output from the image processing device according to the present invention. FIG. 9 is an explanatory diagram showing the variation in the spatial direction of the intensity of light accumulated on the retina of a user who has seen the hold-type display device displaying an image based on the output signal output from the image processing device according to the present invention.

In FIG. 8A to FIG. 8C, the position in the horizontal direction shows the position of each pixel (in the spatial direction) constituting the frame, while the position in the vertical direction shows the brightness level output from the display device. In FIG. 8A to FIG. 8C, the areas partitioned by broken lines represent the pixels constituting the frames. In the following explanations, it is assumed that the input signal input to the image processing device has a step waveform, and that the input image based on the signal of this step waveform has a motion vector of 4 dot/v.

The signal of the step waveform having the edge parts shown in FIG. 8A is input to the image processing device. As described above, this step signal moves from left to right in the illustration at a speed of 4 dot/v. Before this step signal is input, a black display is given on the display device, and the display shifts to a white display upon input of this step signal.

As shown in FIG. 8B, in the image processing device according to the present invention, in response to this input step signal, a voltage is applied in advance to the rise part so that the brightness level gradually decreases, in accordance with the response characteristics of the liquid crystal, in order to attain a smooth rise of the hold-type display element (liquid crystal or the like) (compensation process). This process is especially important in the rise from a black display. At this time, the range in which a voltage is applied in advance is determined based on the motion vector value. In this embodiment, for example, a voltage is applied in advance in the pixel range of 4 dots, corresponding to the motion vector value (4 dot/v). When a voltage is applied in advance, the voltage value to be applied may be set for each pixel. For example, as shown in FIG. 8B, a voltage may be applied so that the brightness level gradually decreases in a stair-like manner, or so that the brightness level gradually decreases in a straight line rather than in a stair-like manner. To realize a smooth rise, it is more preferable that the brightness level decrease in a straight line.
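
The pre-applied ramp of FIG. 8B can be sketched directly: the v pixels that the rising edge will sweep during the next frame are raised in advance, with levels falling off with distance from the edge. A minimal sketch (linear step profile; names are illustrative):

```python
import numpy as np

def pre_ramp(row: np.ndarray, edge_x: int, v: int, target: float) -> np.ndarray:
    """Apply, one frame in advance, a stair-like ramp over the v pixels
    ahead of a rising edge moving left to right (v is the motion vector
    in dots per frame, e.g. 4 for 4 dot/v, per FIG. 8B)."""
    out = row.astype(float).copy()
    for i in range(v):
        x = edge_x + 1 + i
        if x < out.size:
            # Levels fall off with distance from the edge; the text notes
            # a straight-line profile is preferable for a smooth rise.
            out[x] = max(out[x], target * (v - i) / (v + 1))
    return out
```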

FIG. 8C shows an operation waveform when the overdrive technique disclosed in Patent Document 1 is applied to the compensated image data in the image processing device according to the present invention. In this case, as shown in FIG. 8C, a cone-shaped signal is output upon application of the overdrive technique. As a voltage greater than the target voltage is applied by the overdrive technique, the voltage is greater than the voltage value applied in advance for the compensation process. Thus, the brightness level is entirely greater than in the case of FIG. 8B (the case of only the compensation process of the present invention).

FIG. 9 shows the variation in the spatial direction of the intensity of light accumulated on the user's retina when the display operation described with reference to FIG. 8A to FIG. 8C is performed. That is, when neither the overdrive technique nor the compensation process of the present invention is performed, the brightness level of the light accumulated on the user's retina does not reach the brightness level of the input step signal, as shown by the curved chain double-dashed line. Thus, a long delay occurs in the display, and a motion blur occurs in the hold-type display device. When only the overdrive technique is performed, there is a small difference between the brightness level of the input step signal and the brightness level of the light accumulated on the user's retina, as shown by the curved broken line. Though the delay in the display is a little shorter, a delay still occurs, so the motion blur restraining effect is not sufficiently attained. When both the overdrive technique and the compensation process of the present invention are performed, the brightness level of the light accumulated on the user's retina reaches the brightness level of the input step signal, as shown by the curved solid line, and the brightness level varies gently rather than steeply. As a result, the Eye-Trace Integration Effect can be sufficiently restrained, and a great motion blur restraining effect is realized in the hold-type display device.

Configuration of Image Display System According to an Embodiment of the Present Invention

Now, explanations will specifically be made to a functional configuration of an image display system 10 according to an embodiment of the present invention, as a system for realizing the above-described functions. FIG. 10 is a block diagram showing the functional configuration of the image processing device 100 constituting the image display system 10 according to this embodiment. FIG. 11 is a block diagram showing a functional configuration of a display device 200 included in the image display system 10 according to this embodiment.

As shown in FIG. 10 and FIG. 11, the image display system 10 according to this embodiment includes the image processing device 100 and the hold-type display device 200. The image processing device 100 processes externally input image data so as to output image data to be displayed. The display device 200 actually displays an image, based on the display image data input from the image processing device 100. Here, the "system" indicates an item including a plurality of logically aggregated devices (functions), and does not express whether each of the plurality of devices (functions) is included in the same casing. Thus, for example, as in a TV receiver, the image processing device 100 and the display device 200 constituting the image display system 10 may be incorporated together so as to be handled as one unit, or the display device 200 may be handled as a separate casing. Explanations will now specifically be made to the functional configuration of each of the image processing device 100 and the display device 200 constituting this image display system 10.

Configuration of Image Processing Device 100

As shown in FIG. 10, the image processing device 100 according to this embodiment includes an input image data storage unit 110, a motion vector detecting unit 120, a response time information storage unit 130, a compensation processing unit 140 and an output unit 160.

The input image data storage unit 110 stores input image data that is externally input to the image processing device 100, in association with each of a plurality of continuous frames. More specifically, for example, when input image data for displaying an image in a frame to be displayed is input to the image processing device 100, the data is stored in the input image data storage unit 110. When input image data for displaying an image in the next frame to be displayed is input to the image processing device 100, the input image data in the frame ahead thereof is kept stored, and that data is used for detecting the motion vector by the motion vector detecting unit 120. The input image data stored in the input image data storage unit 110 may be deleted sequentially, oldest data first, as needed.

When the input image data in a frame to be displayed is input, the motion vector detecting unit 120 extracts, for example, the input image data in the frame that is one frame ahead of the frame to be displayed from the input image data storage unit 110. The motion vector detecting unit 120 compares the input image data in the frame to be displayed with the input image data in the frame that is one frame ahead thereof, identifies an object moving in the displayed image, and detects a motion vector of the input image data in the frame to be displayed based on the movement direction of this object and its distance. As in this embodiment, the motion vector detecting unit 120 may be one constituent element of the image processing device 100, or it may be one constituent element of a unit external to the image processing device 100, such as an MPEG decoder, an IP converter, or the like. In the latter case, the motion vector of the input image data is detected separately by the unit external to the image processing device 100, and is input to the image processing device 100.

The response time information storage unit 130 stores the time from when a driving voltage is applied to the display device 200 until the display device 200 displays an image with the tone corresponding to the driving voltage (i.e. response time information representing the response time of the hold-type display device), in association with the tone variation value of the display device 200. The response time information may be stored in the response time information storage unit 130 in the form of, for example, a lookup table (LUT), in which the tone variation value and the response time of the display element are stored in association with each other. In another form of storing the response time information in the response time information storage unit 130, a function indicating the relationship between the tone variation value and the response time of the display element is obtained in advance, and this function is stored in the response time information storage unit 130. In this case, the input image data in the frame to be displayed is compared with the input image data in the frame ahead of the frame to be displayed, so as to obtain the tone variation value of each pixel. The obtained tone variation value is converted into response time information using the function stored in the response time information storage unit 130. Such storage can be realized with hardware, such as a RAM, a ROM, or the like.
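
A minimal sketch of the two storage forms just described, a lookup table keyed by the tone variation value and a function fitted in advance; the numeric values are placeholders, not measured panel data:

```python
# Form 1: lookup table, tone variation value -> response time (ms).
# Placeholder values, not measured panel data.
RESPONSE_LUT = {0: 0.0, 32: 4.0, 64: 6.5, 128: 12.0, 255: 25.0}

def response_time_lut(delta_tone: int) -> float:
    """Nearest-key lookup into the stored table."""
    key = min(RESPONSE_LUT, key=lambda k: abs(k - delta_tone))
    return RESPONSE_LUT[key]

# Form 2: a function fitted in advance to the measured relationship.
def response_time_fn(delta_tone: int, a: float = 0.1, b: float = 1.0) -> float:
    """Placeholder fitted curve: response time grows with tone variation."""
    return a * delta_tone + b
```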

The compensation processing unit 140 compensates the pixel value of the input image data, for each pixel included in one frame, in the frame that is one frame ahead of the frame to be displayed, based on the input image data extracted from the input image data storage unit 110, the motion vector detected by the motion vector detecting unit 120, and the response time information extracted from the response time information storage unit 130. As a result of this compensation, the image data to be displayed is generated, and the generated image data is output to the output unit 160.

The compensation processing unit 140 may include an interpolated image generating unit (not illustrated), a display-timing information generating unit (not illustrated) and an image synthesizing unit (not illustrated). The interpolated image generating unit generates an interpolated image to be inserted between input frames, based on the input image data and the motion vector. The display-timing information generating unit generates display-timing information representing the timing at which the interpolated image is displayed after a predetermined period of time, based on the response time information. The image synthesizing unit synthesizes the generated display-timing information with the input image data. In this configuration, the interpolated image generating unit generates an interpolated image in the spatial direction rather than the time direction, based on the motion vector. The display-timing information generating unit can change the interpolated image into display-timing information based on the difference between the response times of the display elements in accordance with the display-tone variation, thereby converting from the spatial direction to the time direction. Thus, by synthesizing the display-timing information with the input image data, the same effect as in the case where an interpolated image in the time direction is generated can be attained (i.e. the effect of a pseudo increase in the frame rate), using the interpolated image in the spatial direction that can easily be generated based on the motion vector.
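
A rough sketch of this spatial-to-time conversion under simplifying assumptions (1-D lines, a global motion vector, a response_lut callable as in the earlier sketch); it is meant only to show the direction of the conversion, not the patented units:

```python
import numpy as np

def spatial_interpolation(prev: np.ndarray, mv: int) -> np.ndarray:
    """Generate an interpolated line by shifting the previous line by half
    the motion vector -- interpolation in the spatial direction."""
    return np.roll(prev.astype(float), mv // 2)

def display_timing(interp: np.ndarray, target: np.ndarray, response_lut) -> np.ndarray:
    """Convert the spatial interpolation into display-timing information:
    per-pixel lead times (how early each pixel must be driven so that
    the panel's own response places the interpolated value mid-frame)."""
    return np.array([response_lut(abs(i - t))
                     for i, t in zip(interp, target.astype(float))])
```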

As an alternative to the above-described configuration, the pixel value may be compensated directly using a spatial filter, such as a moving average filter, without generating the interpolated image. The functional configuration of this latter case will be described later.

The output unit 160 accepts the display image data input from the compensation processing unit 140, and outputs it to the display device 200.

(Configuration of Compensation Processing Unit 140)

The functional configuration of the above-described compensation processing unit 140 will more specifically be described with reference to FIG. 12. FIG. 12 is a block diagram showing the functional configuration of the compensation processing unit 140 according to this embodiment.

As shown in FIG. 12, the compensation processing unit 140 includes a compensation range setting unit 141, a maximum/minimum value detecting unit 142, an edge detecting unit 143, a high frequency detecting unit 144, an outside replacement unit 145, a filter setting unit 146, a filter processing unit 147, a gain adjusting unit 148, a selecting unit 149 and a synthesizing unit 150.

The compensation range setting unit 141 sets a compensation range for compensating the pixel value in the input image data, based on a motion vector input from the motion vector detecting unit 120. Specifically, the compensation range setting unit 141 detects an area with a movement in the input image data (a part corresponding to a moving object), and sets the pixels in that area as the compensation range. The compensation range setting unit 141 transmits information regarding the set compensation range and information regarding the input motion vector to the maximum/minimum value detecting unit 142, the edge detecting unit 143, the high frequency detecting unit 144 and the filter setting unit 146.
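As a minimal sketch (the per-pixel motion field and its granularity are assumptions, not part of the embodiment), the compensation range can be expressed as a mask over moving pixels:

    import numpy as np

    def compensation_range(mv_field: np.ndarray) -> np.ndarray:
        """Mark pixels belonging to a moving area as the compensation range.

        mv_field holds a per-pixel motion value (dot/v); zero means still.
        """
        return mv_field != 0  # boolean mask of pixels to be compensated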

The maximum/minimum value detecting unit 142 detects the maximum and minimum values of the input image data (input signal) within the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141. The information regarding the maximum and minimum values of the detected input signal is transmitted to the edge detecting unit 143 and the outside replacement unit 145.

The edge detecting unit 143 detects an edge part(s) in the input image data (input signal), based on the information regarding the compensation range transmitted from the compensation range setting unit 141, the information regarding the input motion vector, and the information regarding the maximum/minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142. The edge detecting unit 143 detects not only the position of the edge (edge part), but also the edge direction of the edge part (whether it is a variation from a low tone to a high tone, or from a high tone to a low tone). From the detected edge direction, it can be determined whether the response of the display element is at the rise or at the decay. Information regarding the detected edge part and the edge direction is transmitted to the selecting unit 149.
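A sketch of such direction-aware edge detection on one scan line, assuming horizontal motion (the one-dimensional simplification is an assumption):

    import numpy as np

    def detect_edges(row: np.ndarray, mask: np.ndarray):
        """Return (position, direction) pairs for edges in the compensation range.

        direction +1 means a rise (low tone to high tone); -1 means a decay
        (high tone to low tone).
        """
        grad = np.diff(row.astype(int))
        return [(x, 1 if g > 0 else -1)
                for x, g in enumerate(grad) if g != 0 and mask[x]]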

The high frequency detecting unit 144 detects a high-frequency signal in the spatial frequency of the input image data within the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141. In this case, a high-frequency signal is one whose half wavelength (½ wavelength) fits within a range narrower than the compensation range, as shown in FIG. 13; that is, the unit detects, as a high-frequency signal, a signal whose wavelength is shorter than twice the compensation range. This is because, with such a high-frequency signal, both a rise area and a decay area exist within the compensation range, which interferes with an adequate compensation process. The detected high-frequency signal is output to the gain adjusting unit 148, and is used for the gain adjustment after the process performed by the filter processing unit 147.
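The criterion above reduces to checking whether both a rise and a decay occur inside the compensation range; a sketch under that reading:

    import numpy as np

    def is_high_frequency(window: np.ndarray) -> bool:
        """Flag a compensation-range window whose signal wavelength is shorter
        than twice the window: both a rise and a decay occur inside it."""
        grad = np.diff(window.astype(int))
        return bool((grad > 0).any() and (grad < 0).any())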

The outside replacement unit 145 performs outside replacement for the input image data (input signal) using its maximum and minimum values, based on the information regarding the maximum and minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142. The replaced input image data (input signal) is transmitted to the filter processing unit 147.
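The description does not spell out the replacement rule; one plausible reading, sketched here as an assumption, is that values outside the compensation range are clipped to the detected maximum and minimum so that the subsequent filter sees a clean step:

    import numpy as np

    def outside_replace(row: np.ndarray, mask: np.ndarray) -> np.ndarray:
        """Replace the signal outside the compensation range using its
        maximum and minimum values (clipping rule is an assumption)."""
        lo, hi = int(row[mask].min()), int(row[mask].max())
        out = row.copy()
        outside = ~mask
        out[outside] = np.clip(row[outside], lo, hi)
        return out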

The filter setting unit 146 sets the characteristics of the spatial filter for compensating the pixel value in the input image data, in such a manner that an image with the tone set based on the input image data is displayed when the display device 200 displays the frame to be displayed. This setting is based on the input image data, the information regarding the compensation range and the motion vector transmitted from the compensation range setting unit 141, and the response time information extracted from the response time information storage unit 130. The filter is applied only to the pixels within the compensation range. The spatial filter of this embodiment may be a moving average filter, such as a low-pass filter (LPF) or the like. The filter characteristics according to this embodiment include, for example, the area to be filtered and the number of taps of the filter. Such filter characteristics can be realized by appropriately setting the filter coefficients of the filter matrix. Information regarding the set filter characteristics is transmitted to the filter processing unit 147.

Explanations will now be made to an example of setting the filter characteristics with reference to FIG. 14 and FIG. 15. FIG. 14 and FIG. 15 are explanatory diagrams each showing an example of setting the filter characteristics by the filter setting unit 146 according to this embodiment.

FIG. 14 shows an example of setting different filter characteristics between the rise and the decay of the display element (liquid crystal or the like). In this example, the filter is applied only to the rise area of the edge. FIG. 14 shows four kinds of step signals that move from left to right in the illustration, as input signals. The signals differ from each other in maximum value (maximum brightness), minimum value (minimum brightness) and edge height (difference between the maximum and minimum values). In FIG. 14, the values "255" and "0" indicate the brightness values of each pixel.

As shown in FIG. 14, although different compensation values are given to the pixels in accordance with the tone variations (differences between the maximum and minimum brightness values), the filter may be applied only to the rise area of the edge. Specifically, though not illustrated in FIG. 14, the filter setting unit 146, for example, acquires information regarding the edge direction detected by the edge detecting unit 143, determines whether the edge part is at the rise area or at the decay area based on the direction of the tone variation, and can set the filter characteristics so that they are applied only when the edge part is determined to be at the rise area.

FIG. 15 shows an example of setting the number of taps of the spatial filter in accordance with the motion vector value of the input image data. In this example, the number of taps of the filter is changed in proportion to the motion vector value. In FIG. 15, four kinds of step signals move from left to right in the illustration by different movement values (motion vector values), as input signals. From left to right, the step signals are those of: a still image (movement value 0 dot/v); a movement value of 2 dot/v; a movement value of 4 dot/v; and a movement value of 6 dot/v. The values "255" and "0" in FIG. 15 indicate the brightness values of each pixel.

In the example of FIG. 15, the filter setting unit 146 sets filter characteristics whose number of taps equals the motion vector value (number of pixels) of the input image data (e.g. the number of taps is 2 if the movement value is 2 dot/v). Accordingly, the greater the motion vector value of the input image signal (the higher the movement speed), the greater the number of taps of the filter, and the more precisely the compensation process of the pixel value can be performed. Thus, according to the image processing device 100 of this embodiment, the greater the motion vector value of the input image data, the more effectively the motion blur can be restrained in the hold-type display device 200.
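A sketch of the tap rule of FIG. 15, assuming a one-dimensional moving average along the motion direction:

    import numpy as np

    def moving_average_filter(row: np.ndarray, motion_value: int) -> np.ndarray:
        """Apply a moving average whose tap count equals the motion value
        (e.g. 2 taps for 2 dot/v); a still image is returned unchanged."""
        taps = max(abs(int(motion_value)), 1)
        if taps == 1:
            return row.astype(float)
        kernel = np.ones(taps) / taps
        return np.convolve(row.astype(float), kernel, mode="same")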

The filter processing unit 147 filters the outside-replaced input image data transmitted from the outside replacement unit 145, using a filter having the filter characteristics set by the filter setting unit 146, in the frame that is one frame ahead of the frame to be displayed by the display device 200. By so doing, the pixel values of the pixels in the compensation range are compensated. The input image data whose pixel values have been compensated is transmitted to the gain adjusting unit 148. Though the filter processing unit 147 of this embodiment filters the outside-replaced input image data, it need not necessarily do so, and may filter the input image data itself.

To avoid an error in high-frequency areas, the gain adjusting unit 148 performs gain adjustment for the compensated input image data transmitted from the filter processing unit 147, based on the high-band signal transmitted from the high frequency detecting unit 144. The input image data after gain adjustment is transmitted to the selecting unit 149.
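The text describes the gain adjustment only as avoiding errors where a high-band signal exists; a minimal sketch in which the compensation is attenuated at flagged pixels (the blend rule and the gain value are assumptions):

    import numpy as np

    def gain_adjust(original: np.ndarray, filtered: np.ndarray,
                    high_freq: np.ndarray, gain: float = 0.0) -> np.ndarray:
        """Suppress the compensation where a high-band signal was flagged.

        high_freq is a per-pixel boolean flag; gain=0.0 discards the
        compensation entirely at those pixels.
        """
        out = filtered.astype(float).copy()
        out[high_freq] = ((1.0 - gain) * original[high_freq]
                          + gain * filtered[high_freq])
        return out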

The selecting unit 149 accepts as inputs: the information regarding the edge part and the edge direction transmitted from the edge detecting unit 143; the input image data whose pixel value has been compensated, transmitted from the filter processing unit 147; and the input image data whose pixel value has not been compensated, extracted from the input image data storage unit 110. The selecting unit 149 selects either the input image data whose pixel value has been compensated by the filter processing unit 147 or the input image data whose pixel value has not been compensated, in accordance with the input information regarding the edge part and the edge direction. When the selecting unit 149 selects the compensated (i.e. filter-processed) input image data, it outputs that data to the synthesizing unit 150. More particularly, when it is determined from the edge direction that the edge part is at the rise area from a low tone to a high tone, the selecting unit 149 selects the input image data whose pixel value has been compensated; when it is determined that the edge part is at the decay area from a high tone to a low tone, the selecting unit 149 selects the input image data whose pixel value has not been compensated. By this processing, only the rise area is filtered, as explained with reference to FIG. 14.
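The selection rule itself is a one-line decision; as a sketch:

    def select_output(edge_direction: int, compensated, uncompensated):
        """Pick the compensated data only for a rise edge (+1), per FIG. 14."""
        return compensated if edge_direction > 0 else uncompensated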

In this embodiment, the selecting unit 149 is provided downstream of the filter processing unit 147, accepts as inputs both the input image data filtered by the filter processing unit 147 and the externally input image data itself, and selects either of them. However, the configuration is not limited to this. For example, the selecting unit 149 may determine in advance, before the filter processing unit 147 performs the filter process, whether to perform the filter process, and the filter process may be performed only when the selecting unit 149 determines that it is to be performed (for example, when the edge part is determined to be at the rise area).

When the filter-processed input image data is input from the selecting unit 149, the synthesizing unit 150 synthesizes the externally input image data itself (data which has not been filter-processed) with the filter-processed input image data, and outputs the result to the output unit 160. When no filter-processed input image data is input from the selecting unit 149, the synthesizing unit 150 outputs the externally input image data itself, which has not been filter-processed, to the output unit 160.

(Configuration of Display Device 200)

The functional configuration of the image processing device 100 has specifically been described above. The configuration of the display device 200 will now be explained with reference to FIG. 11. As shown in FIG. 11, the display device 200 is a hold-type display device, and includes an image display unit 210, a source driver 220, a gate driver 230 and a display controlling unit 240.

The image display unit 210 displays an image corresponding to display image data input from the image processing device 100. The image display unit 210 is, for example, a dot matrix type display in an m×n matrix arrangement. Specific examples of the image display unit 210 are an active matrix type OLED (Organic Light Emitting Diode) display using an a-Si (amorphous silicon) TFT, an LCD and the like.

The source driver 220 and the gate driver 230 are driving units for driving the image display unit 210 in an m×n matrix arrangement. The source driver 220 supplies a data line 221 with a data signal, while the gate driver 230 supplies a scanning line 231 with a select signal (address signal).

The display controlling unit 240 controls driving of the image display unit 210 (driving of the source driver 220 and the gate driver 230), based on the display image data input from the image processing device 100. More specifically, the display controlling unit 240 outputs a control signal to be supplied to each driver (the source driver 220 and the gate driver 230) at an appropriate timing, based on the display image data (video signal) obtained from the image processing device 100.

The examples of the functions of the image processing device 100 and the display device 200 according to this embodiment have been explained above. Each of the above-described constituent elements may be formed using a widely used member or circuit, or may be formed with hardware specialized for the function of the constituent element. Each function of the constituent elements may also be executed by a CPU or the like. Thus, the applicable configuration may be changed appropriately in accordance with the technical level at the time this embodiment is implemented.

Hardware Configuration of Image Processing Device 100

Explanations will now be made to a hardware configuration of the image processing device 100 according to this embodiment with reference to FIG. 16. FIG. 16 is a block diagram showing the hardware configuration of the image processing device according to this embodiment.

The image processing device 100 mainly includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, a RAM (Random Access Memory) 905, a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.

The CPU 901 functions as an arithmetic device and a control device, and controls all or part of the operations of the image processing device 100 in accordance with various programs stored in the ROM 903, the RAM 905, the storage device 919 or a removable recording medium 927. The ROM 903 stores programs and arithmetic parameters used by the CPU 901. The RAM 905 temporarily stores programs used in execution by the CPU 901 and parameters that change appropriately during that execution. These are connected with each other through the host bus 907, which includes an internal bus such as a CPU bus.

The host bus 907 is connected to the external bus 911, such as a PCI (Peripheral Component Interconnect/interface) bus, through the bridge 909.

The input device 915 is an operational unit, such as a mouse, a keyboard, a touch panel, a button, a switch or a lever, and is operated by users. The input device 915 may be a remote control unit (a so-called remote control) using infrared rays or another electric wave, or may be an external connection unit 929, such as a cell phone, a PDA or the like, corresponding to the operations of the image processing device 100. Further, the input device 915 generates an input signal based on information input by the user using the above-described operational unit, and includes an input control circuit for outputting the signal to the CPU 901. By operating the input device 915, the user of the image processing device 100 can input various data to the image processing device 100 and instruct it to perform processing operations.

The output device 917 includes a device that can visually or aurally inform the user of acquired information. The output device 917 may, for example, be a display device such as a CRT display device, a liquid crystal display device, a plasma display device, an EL display device or a lamp, an audio output device such as a speaker or headphones, a printer device, a cell phone, a facsimile, or the like. Specifically, the display device displays various information, such as image data, in text form or image form. The audio output device converts audio data into a voice, and outputs the converted voice.

The storage device 919 is a device for data storage, and is configured as an example of a storage unit of the image processing device 100 according to this embodiment. The storage device 919 includes a magnetic storage unit device, such as an HDD (Hard Disk Drive), etc., a semiconductor storage device, an optical storage device, or a magneto-optical storage device. This storage device 919 stores programs to be executed by the CPU 901, various data, and externally acquired image signal data.

The drive 921 is a reader/writer for a storage medium, and is incorporated in, or externally attached to, the image processing device 100. The drive 921 reads information recorded on a removable recording medium 927, such as a mounted magnetic disk, optical disk, magneto-optical disk or semiconductor memory, and outputs the information to the RAM 905. The drive 921 can also write records to the removable recording medium 927. The removable recording medium 927 may, for example, be a DVD medium, an HD-DVD medium, a Blu-ray medium, a CompactFlash (CF) (registered trademark), a memory stick or an SD memory card (Secure Digital memory card). The removable recording medium 927 may also be an IC card (Integrated Circuit card) having a contactless IC chip installed thereon, or an electronic unit.

The connection port 923 is a port for directly connecting a unit to the image processing device 100, such as a USB (Universal Serial Bus) port, an IEEE1394 port (such as i.Link), a SCSI (Small Computer System Interface) port, an RS-232C port, an optical audio terminal or the like. Upon connection of the external connection unit 929 to this connection port 923, the image processing device 100 directly acquires image signal data from the external connection unit 929, and provides the external connection unit 929 with image signal data.

The communication device 925 is a communication interface including a communication device or the like for connecting to a communication network 10. The communication device 925 is, for example, a communication card for a wired or wireless LAN (Local Area Network), Bluetooth or WUSB (Wireless USB), a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various types of communication. This communication device 925 can transmit and receive an image signal and the like to and from the Internet or another communication unit. The communication network 10 connected to the communication device 925 includes a network connected through a cable or a wireless system, and may include the Internet, a home LAN, infrared communication or satellite communication.

According to the above-described configuration, the image processing device 100 can acquire information regarding an input image signal from various information sources, such as the external connection unit 929 connected to the connection port 923 or the communication network 10, and can transmit the image signal to the display device 200.

The hardware configuration of the display device 200 according to this embodiment is substantially the same as that of the image processing device 100, thus will not be explained hereinbelow.

The example of the hardware configuration capable of realizing the functions of the image processing device 100 and the display device 200 according to this embodiment has been explained above. Each of the above-described constituent elements may include a widely used member, or may include hardware specialized for the function of the constituent element. Thus, the applicable configuration may be changed appropriately in accordance with the technical level at the time this embodiment is implemented.

Processing Flow of Image Processing Method According to an Embodiment of the Present Invention

The configuration of the image processing device 100 and the display device 200 according to this embodiment has been described above in detail. Now, an image processing method according to this embodiment, using the image processing device 100 having such a configuration, will be explained with reference to FIG. 17. FIG. 17 is a flowchart showing the processing flow of the image processing method according to this embodiment.

The image processing method according to this embodiment is to process input image data externally input to the image processing device 100, thereby generating display image data output to the hold-type display device 200.

Specifically, as shown in FIG. 17, when the input image data is externally input to the image processing device 100, the input image data is stored in the input image data storage unit 110 (S101), and is also input to the motion vector detecting unit 120.

When input image data in a frame to be displayed is input to the motion vector detecting unit 120, the motion vector detecting unit 120 extracts, for example, input image data in a frame that is one frame ahead of the frame to be displayed, from the input image data storage unit 110. The motion vector detecting unit 120 compares the input image data in the frame to be displayed with the input image data in the frame that is one frame ahead thereof, identifies an object moving in the displayed image, and detects a motion vector of the input image data in the frame to be displayed based on the object's movement direction and movement distance (S103). The detected motion vector is transmitted to the compensation processing unit 140 and the like.
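The description does not fix the detection algorithm; exhaustive block matching is a common choice and is sketched here as an assumption (block size and search radius are hypothetical):

    import numpy as np

    def block_motion_vector(prev: np.ndarray, cur: np.ndarray,
                            y: int, x: int, block: int = 8, search: int = 7):
        """Estimate one block's motion by exhaustive block matching.

        Returns the (dy, dx) offset from the current block to its best match
        in the previous frame, minimizing the sum of absolute differences.
        """
        ref = cur[y:y + block, x:x + block].astype(int)
        best, best_mv = None, (0, 0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                yy, xx = y + dy, x + dx
                if (yy < 0 or xx < 0 or yy + block > prev.shape[0]
                        or xx + block > prev.shape[1]):
                    continue
                cand = prev[yy:yy + block, xx:xx + block].astype(int)
                sad = int(np.abs(ref - cand).sum())  # sum of absolute differences
                if best is None or sad < best:
                    best, best_mv = sad, (dy, dx)
        return best_mv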

When the input image data in the frame to be displayed is externally input, the compensation processing unit 140 extracts response time information corresponding to a tone variation value of each pixel in the frame to be displayed from the response time information storage unit 130 (S105). The compensation processing unit 140 performs a compensation process for compensating a pixel value in the input image data, for each pixel included in the frame that is one frame ahead of the frame to be displayed, based on the externally input image data, the motion vector input from the motion vector detecting unit 120 and the response time information extracted from the response time information storage unit 130 (S107). As a result of this compensation process, the display image data is generated, and the compensation processing unit 140 outputs the generated display image data to the output unit 160 (S109).

When the display image data is input from the compensation processing unit 140, the output unit 160 outputs the input display image data to the display device 200 (S111).
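The flow from S101 to S111 can be summarized as the following skeleton; compensate and output are injected callables standing in for the compensation processing unit 140 and the output unit 160, and block_motion_vector is the hypothetical sketch given above:

    def display_pipeline(prev_frame, cur_frame, compensate, output):
        """Skeleton of S101-S111 for one frame and one block position."""
        stored = cur_frame.copy()                              # S101: store input data
        mv = block_motion_vector(prev_frame, cur_frame, 0, 0)  # S103: detect motion vector
        display_data = compensate(stored, mv)                  # S105-S109: compensation process
        output(display_data)                                   # S111: output to display device 200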

Explanations will now be made to a specific example of a compensation process step according to this embodiment (S107) with reference to FIG. 18. FIG. 18 is a flowchart showing a specific example of the compensation process according to this embodiment.

As shown in FIG. 18, when input image data is externally input to the compensation processing unit 140 (S201), the compensation range setting unit 141 sets a compensation range for compensating the pixel value in the input image data, based on the motion vector input from the motion vector detecting unit 120 (S203). Specifically, the compensation range setting unit 141 detects an area with a movement in the input image data (a part corresponding to a moving object), and sets the pixels in that area as the compensation range. Further, the compensation range setting unit 141 transmits information regarding the set compensation range and information regarding the input motion vector to the maximum/minimum value detecting unit 142, the edge detecting unit 143, the high frequency detecting unit 144, the filter setting unit 146 and the like.

The maximum/minimum value detecting unit 142 detects the maximum and minimum values of the input image data (input signal) in the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141 (S205). Further, the maximum/minimum value detecting unit 142 transmits information regarding the detected maximum and minimum values of the input signal to the edge detecting unit 143 and the outside replacement unit 145.

The edge detecting unit 143 detects an edge area in the input image data (input signal), based on the information regarding the compensation range transmitted from the compensation range setting unit 141, the information regarding the input motion vector and the information regarding the maximum/minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142 (S207). At this time, the edge detecting unit 143 detects not only the position of the edge (edge part), but also the edge direction in the edge part (whether it is a variation direction from a low tone to a high tone, or a variation direction from a high tone to a low tone). Further, the edge detecting unit 143 transmits information regarding the detected edge part and edge direction to the selecting unit 149.

The high frequency detecting unit 144 detects a high-frequency signal in the spatial frequency of the input image data in the compensation range, based on the information regarding the compensation range transmitted from the compensation range setting unit 141 (S209). In this case, a high-frequency signal is one whose half wavelength (½ wavelength) fits within a range narrower than the compensation range; that is, the detecting unit detects, as a high-band signal, a signal whose wavelength is shorter than twice the compensation range. This is because, with such a high-band signal, both a rise area and a decay area exist in the compensation range, which interferes with an adequate compensation process. The high frequency detecting unit 144 outputs the detected high-band signal to the gain adjusting unit 148, and the output high-band signal is used for the gain adjustment after the process performed by the filter processing unit 147.

The outside replacement unit 145 performs outside replacement for the input image data (input signal) using its maximum and minimum values, based on the information regarding the maximum and minimum values of the input signal transmitted from the maximum/minimum value detecting unit 142 (S211). Further, the outside replacement unit 145 transmits the replaced input image data (input signal) to the filter processing unit 147.

When the input image data in the frame to be displayed is externally input, and the information regarding the compensation range and the motion vector are transmitted from the compensation range setting unit 141, the filter setting unit 146 extracts, from the response time information storage unit 130, the response time information corresponding to the tone variation value of each pixel in the frame to be displayed (S213).

The filter setting unit 146 sets the characteristics of a spatial filter for compensating the pixel value in the input image data so that an image having the tone set based on the input image data is displayed when the display device 200 displays the frame to be displayed, based on the input image data, the information regarding the compensation range, the motion vector and the response time information (S215). The spatial filter in this embodiment may be a moving average filter, such as a low-pass filter (LPF) or the like. The filter characteristics of this embodiment may include the area to be filtered and the number of taps of the filter. Such filter characteristics can be realized by appropriately setting the filter coefficients of the filter matrix. Further, the filter setting unit 146 transmits information regarding the filter characteristics thus set to the filter processing unit 147.

The filter processing unit 147 performs a filter process for compensating the pixel value of each pixel positioned in the compensation range, by applying a filter having the filter characteristics set by the filter setting unit 146 to the outside-replaced input image data transmitted from the outside replacement unit 145, in the frame that is one frame ahead of the frame to be displayed by the display device 200 (S217). Further, the filter processing unit 147 transmits the input image data whose pixel values have been compensated to the gain adjusting unit 148. The filter processing unit 147 according to this embodiment applies the filter to the outside-replaced input image data; however, the filter need not necessarily be applied to the outside-replaced input image data, and may be applied to the input image data itself.

In order to avoid an error in high-frequency areas, the gain adjusting unit 148 performs gain adjustment for the compensated input image data transmitted from the filter processing unit 147, based on the high-band signal transmitted from the high frequency detecting unit 144 (S219). Further, the gain adjusting unit 148 transmits the input image data after gain adjustment to the selecting unit 149.

Upon input of the input image data whose pixel value has been compensated, transmitted from the filter processing unit 147, and the input image data whose pixel value has not been compensated, extracted from the input image data storage unit 110, the selecting unit 149 selects either one of them, in accordance with the information regarding the edge part and the edge direction detected by the edge detecting unit 143. In a specific process, the selecting unit 149 determines whether the edge part is at the rise area from a low tone to a high tone or at the decay area from a high tone to a low tone, based on the edge direction (S221).

As a result of this determination, when it is determined that the edge part of the input image data is at the rise area, the selecting unit 149 selects the input image data whose pixel value has been compensated (S223), and outputs the input image data whose pixel value has been compensated (filter processed) (S225).

As a result of the determination of step S221, when it is determined that the edge part of the input image data is at the decay area, the selecting unit 149 selects the input image data whose pixel value has not been compensated (S227).

Finally, when the filter-processed input image data is input from the selecting unit 149, the synthesizing unit 150 synthesizes the externally input image data itself (not filter-processed) with the filter-processed input image data (S229), and outputs the result to the output unit 160 (S231). When no filter-processed input image data is input, the synthesizing unit 150 outputs the externally input image data itself, which has not been filter-processed, to the output unit 160 (S233).

In this embodiment, the selection process by the selecting unit 149 is performed after the filter process by the filter processing unit 147, and the selecting unit 149 selects either the filter-processed input image data or the externally input image data. However, the timing of this process is not limited to the above. For example, before the filter processing unit 147 performs the filter process, the selecting unit 149 may determine in advance whether to perform the filter process, and the filter process may be performed only when the selecting unit 149 determines that it is to be performed (for example, when it is determined that the edge part is at the rise area).
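Chaining the hypothetical sketches given earlier in this description, steps S201 to S231 can be walked through in one dimension as follows; a single horizontal motion value for the whole row is assumed, and the rise-only selection of FIG. 14 is used:

    import numpy as np

    def compensation_process(cur_row: np.ndarray, motion_value: int) -> np.ndarray:
        """One-dimensional chain of S201-S231 built from the sketches above."""
        mv_field = np.full(cur_row.shape, motion_value)
        mask = compensation_range(mv_field)                       # S203
        if not mask.any():
            return cur_row.astype(float)                          # still image: no compensation
        replaced = outside_replace(cur_row, mask)                 # S205, S211
        filtered = moving_average_filter(replaced, motion_value)  # S213-S217
        hf = np.zeros(cur_row.shape, dtype=bool)
        hf[mask] = is_high_frequency(cur_row[mask])               # S209
        adjusted = gain_adjust(cur_row, filtered, hf)             # S219
        rises = [x for x, d in detect_edges(cur_row, mask) if d > 0]  # S207, S221
        return adjusted if rises else cur_row.astype(float)       # S223-S233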

It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims

1. An image processing device which processes externally input image data and outputs display image data to a hold-type display device, comprising:

a motion vector detecting unit which detects a motion vector of the input image data;
a response time information storage unit which stores response time information representing a time since a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value;
a compensation processing unit which compensates a pixel value in the image data for each pixel, in a frame which is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and
an output unit which outputs the image data after compensated by the compensation processing unit to the display device.

2. The image processing device according to claim 1, further comprising an edge detecting unit which detects an edge from the input image data, based on the motion vector.

3. The image processing device according to claim 2, wherein

the compensation processing unit determines whether to perform a compensation process for the pixel value, in accordance with a detection result of the edge detecting unit.

4. The image processing device according to claim 3, wherein

the compensation processing unit decides whether to perform the compensation process in accordance with an edge direction of an edge part detected by the edge detecting unit.

5. The image processing device according to claim 4, wherein

the compensation processing unit decides to perform the compensation process, when it is determined that the edge part detected by the edge detecting unit is in a rise area from a low tone to a high tone based on the edge direction, and decides not to perform the compensation process, when it is determined that the edge part is in a decay area from a high tone to a low tone based on the edge direction.

6. The image processing device according to claim 1, wherein the compensation processing unit includes:

a compensation range setting unit which sets a compensation range for compensating a pixel value in the image data based on the motion vector;
a filter setting unit which sets a characteristic of a filter for compensating the pixel value in the image data so as to display an image with a tone corresponding to a tone set based on the image data when the display device displays the frame to be displayed, based on the image data, the motion vector and the response time information; and
a filter processing unit which compensates a pixel value of the pixel within the compensation range by filtering the image data with a filter having the characteristic set by the filter setting unit, in the frame that is one frame ahead of the frame to be displayed by the display device.

7. The image processing device according to claim 6, further comprising an edge detecting unit which detects an edge from the input image data based on the motion vector.

8. The image processing device according to claim 7, wherein the compensation processing unit further includes a selecting unit which selects either one of the image data whose pixel value has been compensated by the filter processing unit and the image data whose pixel value has not been compensated by the filter processing unit, in accordance with a detection result of the edge detecting unit.

9. The image processing device according to claim 8, wherein

the selecting unit selects either one of image data whose pixel value has been compensated and image data whose pixel value has not been compensated, in accordance with an edge direction of an edge part detected by the edge detecting unit.

10. The image processing device according to claim 9, wherein

the selecting unit selects the image data whose pixel value has been compensated, when it is determined that the edge part detected by the edge detecting unit is in a rise area from a low tone to a high tone, and selects the image data whose pixel value has not been compensated, when it is determined that the edge part is in a decay area from a high tone to a low tone, based on the edge direction.

11. The image processing device according to claim 6, wherein

the filter setting unit changes a number of taps of the filter in accordance with a motion vector value detected by the motion vector detecting unit.

12. The image processing device according to claim 6, wherein

the filter is a moving average filter.

13. The image processing device according to claim 12, wherein

the compensation processing unit further includes an outside replacement unit which performs outside replacement for the input image data, using a maximum value and a minimum value of a tone of the image data, and
the filter processing unit filters the image data after processed by the outside replacement unit, with the filter.

14. An image display system comprising:

an image processing device, processing externally input image data; and
a hold-type display device, displaying the image data processed by the image processing device and input from the image processing device; wherein:
the image processing device includes
a motion vector detecting unit which detects a motion vector of the input image data,
a response time information storage unit which stores response time information representing a time since a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value,
a compensation processing unit which compensates a pixel value in the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information, and
an output unit which outputs the image data after compensated by the compensation processing unit to the display device; and
the display device includes:
an image display unit which displays an image corresponding to the image data input from the image processing device, and
a display controlling unit which controls driving of the image display unit based on the image data input by the image processing device.

15. An image processing method for processing externally input image data and generating image data to be output to a hold-type display device, the method comprising the steps of:

detecting a motion vector of the input image data;
extracting response time information from a response time information storage unit which stores the response time information representing a time since a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value;
compensating a pixel value of the image data for each pixel in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and
outputting the image data after compensated to the display device.

16. A program for controlling a computer to function as an image processing device which processes externally input image data and outputs it to a display device performing hold-type driving, the program comprising:

a motion vector detecting function which detects a motion vector of the input image data;
a response time storage function for storing response time information representing a time since a driving voltage is applied to the display device until the display device displays an image with a tone corresponding to the driving voltage, in association with a tone variation value;
a compensation processing function for compensating a pixel value in the image data for each pixel, in a frame that is one frame ahead of a frame to be displayed by the display device, based on the input image data, the motion vector and the response time information; and
an outputting function for outputting the image data after compensated by the compensation processing unit to the display device.
Patent History
Publication number: 20090153743
Type: Application
Filed: Dec 17, 2008
Publication Date: Jun 18, 2009
Applicant: Sony Corporation (Tokyo)
Inventor: Kenji Arashima (Tokyo)
Application Number: 12/316,837
Classifications
Current U.S. Class: Motion Vector Generation (348/699); Video Display (348/739); 348/E05.062; 348/E05.133
International Classification: H04N 5/14 (20060101); H04N 5/66 (20060101);