SYSTEMS FOR PROVIDING IMAGE OR VIDEO TO BE DISPLAYED BY PROJECTIVE DISPLAY SYSTEM AND FOR DISPLAYING OR PROJECTING IMAGE OR VIDEO BY A PROJECTIVE DISPLAY SYSTEM
A system for providing image or video to be displayed by a projective display system includes an encoding subsystem and a packing subsystem. The encoding subsystem is configured to encode at least one image or video of a subject to generate encoded image data. The packing subsystem is coupled to the encoding subsystem, and configured to pack the encoded image data with projection configuration information regarding the projective display system to generate packed image data. The projective display system comprises a projection source device and a projection surface; the projection source device projects the image or video to the projection surface according to the packed image data.
This application claims the benefit of U.S. Provisional Application No. 62/003,260, filed on May 27, 2014, and U.S. Provisional Application No. 62/034,952, filed on Aug. 8, 2014. The entire contents of the related applications are incorporated herein by reference.
TECHNICAL FIELD
The present invention relates generally to projective display, and more particularly, to a transmission system including a capture and transmitting system and a display and receiving system that is capable of generating images of a subject and having the images displayed by a projective display system.
BACKGROUND
A stereo display is a display device capable of conveying depth perception to the viewer and reproducing real-world viewing experiences. A stereo display can be implemented with different technologies; however, each current technology has its own disadvantages. Stereoscopic display technology requires the viewer to be positioned in a well-defined spot to experience the 3D visual effect, and it halves both the effective horizontal pixel count viewable by each eye and the luminance delivered to each eye. In addition, glasses-free stereoscopic display is desirable, but current glasses-free implementations lead to a poor user experience. Holographic display technology offers a great viewing experience, but its cost and size are too great for mobile devices.
SUMMARY
According to a first aspect of the present invention, a system for providing image or video to be displayed by a projective display system is provided. The system comprises an encoding subsystem and a packing subsystem. The encoding subsystem is configured to encode at least one image or video of a subject to generate encoded image data. The packing subsystem is coupled to the encoding subsystem, and configured to pack the encoded image data with projection configuration information regarding the projective display system to generate packed image data. The projective display system comprises a projection source device and a projection surface; the projection source device projects the image or video to the projection surface according to the packed image data.
According to a second aspect of the present invention, a system for displaying or projecting image or video by a projective display system is provided. The system comprises a de-packing subsystem, a decoding subsystem and a projection display subsystem. The de-packing subsystem is configured to derive packed image data and de-pack the derived packed image data to obtain encoded image data and projection configuration information. The decoding subsystem is coupled to the de-packing subsystem, and configured to decode the encoded image data to generate at least one image or video. The projection display subsystem is coupled to the decoding subsystem, and configured to project or display the at least one image or video by a projection source component of the projective display system according to the projection configuration information.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
Certain terms are used throughout the following descriptions and claims to refer to particular system components. As one skilled in the art will appreciate, manufacturers may refer to a component by different names. This document does not intend to distinguish between components that differ in name but not in function. In the following discussion and in the claims, the terms “include”, “including”, “comprise”, and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . ” The terms “couple” and “coupled” are intended to mean either an indirect or a direct electrical connection. Thus, if a first device couples to a second device, that connection may be through a direct electrical connection, or through an indirect electrical connection via other devices and connections.
Projective Display System
Single-View Configuration
The projection source component 122 is configured to project at least one source image. In some embodiments, the projection source component 122 may be an organic light-emitting diode display panel, a liquid crystal display panel, or any other passive or active display panel. In some other embodiments, the projection source component 122 may be a solid-state (laser or LED) light or any other type of light source. The projection processor 124 is configured to adaptively control an image adjustment on an input image to generate the source image. One purpose of the projection processor 124 is to maintain a projection quality of the projective display system 100. The projection processor 124 could be implemented with a general-purpose processor or dedicated hardware. Please note that the position of the projection processor 124 in
The optical adjustment element 140 may be rotatably attached to, and detachable from, the base 110. The optical adjustment element 140 may optically adjust the forming of the virtual image on the projection surface 130. In various embodiments of the present invention, the optical adjustment element 140 could be a single lens or a compound lens.
Multi-View Configuration
The projection surface 130 has four viewable sides P1-P4. The polyhedral projection surface 130 may be formed by folding the projection surface 130 of
Each of the optical adjustment elements 140 is detachable from the base 110. Hence, in various embodiments of the present invention, not every optical adjustment element 140 shown in
According to one embodiment of the present invention, a processing system for the projective display system 100 is provided. The processing system of the present invention may include a capture and transmitting system 300 as illustrated by
The capture and transmitting system 300 may include a capture subsystem 310, an encoding subsystem 320, and a packing subsystem 330. Packed image data generated by the capture and transmitting system 300 may be stored in a storage device 340 or be sent to a channel coding and modulation device 350 or any other device. Afterwards, the coded and modulated data stream may be sent to the display and receiving system 400 with wired or wireless transmission.
The display and receiving system 400 includes a display subsystem 410, a decoding subsystem 420, and a de-packing subsystem 430. The display and receiving system 400 receives the packed data generated by the capture and transmitting system 300 through the storage device 340 or wired/wireless transmission. A channel decoding and demodulation device 600 channel-decodes and demodulates the packed image data that is processed by the channel coding and modulation device 350.
Operations of the capture and transmitting system 300 and the display and receiving system 400 will be illustrated later in further detail.
Capture Subsystem
The capture subsystem 310 of the capture and transmitting system 300 may include one or multiple camera devices. Please refer to
In one embodiment, the camera devices of the electronic devices 310_1-310_4 may utilize wide-angle or fish-eye lenses to cover a wide scene.
The number of the views for capturing the subject can be determined according to the number of the viewable sides of the projection surface 130. For example, in the embodiment of
The capture subsystem 310 may include a single camera device. In such a case, the single camera device of the capture subsystem 310 needs to shoot the subject several times, once from each view, to satisfy the projective display system 100 in the multi-view configuration.
In one embodiment, the capture subsystem 310 could generate images or videos by utilizing a graphic engine to render 3D objects in a single view or multiple views.
If the projection surface 130 is in the single-view configuration, the flow goes to step 640, where the camera device(s) of the capture subsystem 310 is instructed to shoot once in front of the subject. If the projection surface 130 is in the multi-view configuration, the flow may go to step 650, where it is determined whether the subject is still. If so, the flow may go to step 660; otherwise, the flow may go to step 674, where a notice message may be shown reminding the user that the capture has failed, because when the subject is moving it is difficult to derive images of the subject in multiple views.
At step 660, it is determined whether the depth information of the subject has been obtained. If so, the flow may go to step 672, where depth synthesis is performed. Through the depth synthesis, the images of the subject in other views can be generated according to the depth information. If not, the flow may go to step 670, where multiple shots are conducted by the camera device(s) to capture the images or videos of the subject from multiple views. Afterwards, the flow may go to step 680, where multiple view synthesis is performed. The multiple view synthesis may combine the images or videos captured from different views into one, or adjust the relative positions of the images or videos captured from different views on the projection source component 122. For example, images or videos captured from different views could be combined into a single image or video like the patterns shown by
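The combining operation in the multiple view synthesis step can be pictured with a small sketch. The 2x2 grid layout, the function name, and the use of NumPy arrays are assumptions made only for illustration; the disclosure does not fix a particular arrangement pattern.

```python
import numpy as np

def combine_views_2x2(views):
    """Combine four equally sized view images (H x W x C arrays)
    into one composite source image arranged as a 2x2 grid.
    The 2x2 arrangement is an illustrative assumption."""
    assert len(views) == 4, "this sketch expects exactly four views"
    top = np.hstack((views[0], views[1]))     # left/right of top row
    bottom = np.hstack((views[2], views[3]))  # left/right of bottom row
    return np.vstack((top, bottom))           # stack the two rows
```

For instance, four 480x640x3 frames would yield a single 960x1280x3 composite that can be placed on the projection source component as one image.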
The encoding subsystem 320 of the capture and transmitting system 300 encodes the captured images generated by the capture subsystem 310 with respect to a single view or multiple views.
If the projection surface 130 is in the single-view configuration, the flow may go to step 750; if the projection surface 130 is in the multi-view configuration, the flow may go to step 740, where multi-view coding may be conducted to remove data redundancy between captured images/videos (by camera device) or rendered images/videos (by graphic engine) with respect to multiple views. In step 750, it is determined whether the projection processor 124 of the projection source device 120 has been applied before. If the projection processor 124 has been applied, this means the projection processor 124 has already obtained the depth information of the subject. Therefore, if so, the flow may go to step 760, in which shape/depth coding is performed, where the projection processor 124 provides the shape/depth mapping information to the encoding subsystem 320 for performing the shape/depth encoding; otherwise, the flow may go to step 770, in which texture coding is performed.
In one embodiment, the shape/depth coding performed in step 760 may include MPEG-4 shape coding. In the case of image encoding, the texture coding may comprise JPEG, GIF, PNG, the still profile of MPEG-1, MPEG-2, MPEG-4, WMV, AVS, H.261, H.263, H.264, H.265, VP6, VP8, VP9, any other texture coding, or a combination thereof. In the case of video encoding, the texture coding may comprise motion JPEG, MPEG-1, MPEG-2, MPEG-4, WMV, AVS, H.261, H.263, H.264, H.265, VP6, VP8, VP9, any other texture coding, or a combination thereof.
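The branch structure of steps 740-770 can be summarized as follows; the function name and the string labels are hypothetical and merely mirror the flow described above, not an interface defined by the disclosure.

```python
def select_encoding_steps(multi_view, depth_available):
    """Return the ordered list of encoding steps the flow applies."""
    steps = []
    if multi_view:
        # Step 740: multi-view coding removes data redundancy
        # between captured or rendered views.
        steps.append("multi_view_coding")
    if depth_available:
        # Step 760: the projection processor supplies shape/depth
        # mapping information (e.g. MPEG-4 shape coding).
        steps.append("shape_depth_coding")
    else:
        # Step 770: plain texture coding (e.g. JPEG, H.264, VP9).
        steps.append("texture_coding")
    return steps
```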
The decoding subsystem 420 in the display and receiving system 400 is used to decode the encoded data generated by the encoding subsystem 320 based on how the data is encoded as described above.
Packing/De-Packing Subsystem
After the captured or rendered images are encoded by the encoding subsystem 320 to generate the encoded image data, the packing subsystem 330 of the capture and transmitting system 300 packs the encoded image data with projection configuration information. The projection configuration information may be in the form of H.264 SEI (Supplemental Enhancement Information) or any other configuration format. In one embodiment, the projection configuration information includes at least one of the number of the viewable sides of the projection surface 130 and the enablement of the projection function of the projective display system 100. An example of the information to be packed with the encoded image data is illustrated in
The universally unique identifier (UUID) identifies the following fields, which carry information of the number of the viewable sides of the projection surface 130 and information of the enablement of the projection function of the projective display system 100. In this embodiment, the UUID is 128 bits long, the field corresponding to the number of the viewable sides of the projection surface 130 is one byte long, and the field corresponding to the enablement of the projection function of the projective display system 100 is also one byte long. However, this is just one possible way to implement packing the projection configuration information with the encoded image data, rather than a limitation of the present invention.
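A minimal sketch of this field layout (a 16-byte UUID followed by the two one-byte fields) is shown below. The all-zero UUID is a placeholder and the function names are assumptions; an actual implementation would carry this payload in an H.264 SEI user-data-unregistered message rather than as a bare byte string.

```python
import struct
import uuid

# Placeholder identifier; the actual 128-bit UUID an implementation
# registers for this payload type is not specified by the disclosure.
PROJ_CFG_UUID = uuid.UUID(int=0)

def pack_projection_config(num_viewable_sides, projection_enabled):
    """Pack the payload: 16-byte UUID, a one-byte viewable-side
    count, and a one-byte projection-function enable flag."""
    return PROJ_CFG_UUID.bytes + struct.pack(
        "BB", num_viewable_sides, 1 if projection_enabled else 0)

def unpack_projection_config(payload):
    """De-pack the payload: check the UUID, then read both fields."""
    if payload[:16] != PROJ_CFG_UUID.bytes:
        raise ValueError("unknown payload UUID")
    sides, enabled = struct.unpack("BB", payload[16:18])
    return sides, bool(enabled)
```

The de-packing side mirrors the packing side exactly, which is why the same 18-byte layout serves both subsystems.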
The de-packing subsystem 430 of the display and receiving system 400 derives the packed image data from the storage device 340 or from the wired/wireless transmission between the capture and transmitting system 300 and the display and receiving system 400. Accordingly, the de-packing subsystem 430 de-packs the received packed image data to derive the encoded image data (which will be decoded by the decoding subsystem 420) and the projection configuration information.
Display Subsystem
The display subsystem 410 of the display and receiving system 400 is configured to display captured or rendered images/videos provided by the capture subsystem 310 on one or more source areas, such as source areas SA1-SA4 on the projection source component 122. The number of the source areas on which the display subsystem 410 displays the images is determined according to the number of the viewable sides of the projection surface 130 (which could be provided by the de-packing subsystem 430). For example, in the embodiment of
For different source areas of the projection source component 122, the display subsystem 410 could project or display identical or different images thereon. This depends on the number of views for capturing the subject and the number of the viewable sides of the projection surface.
When the number of views for capturing the subject is smaller than the number of the viewable sides of the projection surface, the display subsystem 410 projects or displays duplicated captured images/videos on the different source areas of the projection source component 122 of the projection source device 120. For example, if the image is captured in a single view, the number of views for capturing the subject is 1. Hence, if the captured image is projected to the projection surface 130 in
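The duplication behavior can be sketched as a simple assignment of views to source areas. Repeating the views cyclically is an assumption made for illustration, since the disclosure only states that duplicated images/videos are displayed when there are fewer views than viewable sides.

```python
def assign_views_to_areas(views, num_areas):
    """Return one view per source area, repeating the captured
    views as needed when there are fewer views than areas."""
    if not views:
        raise ValueError("at least one captured view is required")
    # Cycle through the available views to fill every source area.
    return [views[i % len(views)] for i in range(num_areas)]
```

With a single front view and four viewable sides, every source area receives the same image, matching the single-view example above.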
Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least an implementation. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment. Thus, although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that claimed subject matter may not be limited to the specific features or acts described. Rather, the specific features and acts are disclosed as sample forms of implementing the claimed subject matter.
Circuits in the embodiments of the invention may include function that may be implemented as software executed by a processor, hardware circuits or structures, or a combination of both. The processor may be a general-purpose or dedicated processor. The software may comprise programming logic, instructions or data to implement certain functions for an embodiment of the invention. The software may be stored in a medium accessible by a machine or computer-readable medium, such as read-only memory (ROM), random-access memory (RAM), magnetic disk (e.g., floppy disk and hard drive), optical disk (e.g., CD-ROM) or any other data storage medium. In one embodiment of the invention, the media may store programming instructions in a compressed and/or encrypted format, as well as instructions that may have to be compiled or installed by an installer before being executed by the processor. Alternatively, an embodiment of the invention may be implemented as specific hardware components that contain hard-wired logic, a field programmable gate array, a complex programmable logic device, or an application-specific integrated circuit, for performing the recited function, or by any combination of programmed general-purpose computer components and custom hardware components.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
Claims
1. A system for providing image or video to be displayed by a projective display system, comprising:
- an encoding subsystem, configured to encode at least one image or video of a subject to generate encoded image data; and
- a packing subsystem, coupled to the encoding subsystem, and configured to pack the encoded image data with projection configuration information regarding the projective display system to generate packed image data,
- wherein the projective display system comprises a projection source device and a projection surface, the projection source device projects the image or video to the projection surface according to the packed image data.
2. The system of claim 1, further comprising a capture subsystem, configured to generate the at least one image or video of a subject in a single view or multiple views.
3. The system of claim 2, wherein the capture subsystem comprises a plurality of camera devices.
4. The system of claim 3, wherein the camera devices are disposed on different electronic devices.
5. The system of claim 2, wherein the capture subsystem captures a plurality of the images or videos of the subject and combines images or videos in a specific arrangement pattern.
6. The system of claim 2, wherein the capture subsystem captures a plurality of images or videos of the subject and the encoding subsystem encodes the images or the videos to reduce data redundancy between the images or videos.
7. The system of claim 1, further comprising a graphic engine configured to render at least one object in single view or multiple views to generate the at least one image or video.
8. The system of claim 1, wherein the encoding subsystem performs shape/depth encoding to encode the at least one image or video.
9. The system of claim 1, wherein the projection configuration information indicates the number of viewable sides of a projection surface of the projective display system or enablement of projection function of the projective display system.
10. A system for displaying or projecting image or video by a projective display system, comprising:
- a de-packing subsystem, configured to derive packed image data and de-pack the derived packed image data to obtain encoded image data and projection configuration information;
- a decoding subsystem, coupled to the de-packing subsystem, and configured to decode the encoded image data to generate at least one image or video; and
- a projection display subsystem, coupled to the decoding subsystem, and configured to project or display the at least one image or video by a projection source component of the projective display system according to the projection configuration information, and a projection surface of the projective display system mirrors or partially reflects the at least one projected or displayed image.
11. The system of claim 10, wherein the projection configuration information indicates the number of viewable sides of a projection surface of the projective display system or enablement of projection function of the projective display system.
12. The system of claim 10, wherein the decoding subsystem decodes the encoded image data to generate a plurality of images or videos of a subject with respect to different views, and the display subsystem simultaneously displays the images or videos on a plurality of different areas of the projection source component.
13. The system of claim 10, wherein when a number of views of images of the subject is smaller than the number of viewable sides of a projection surface of the projective display system, the projection display subsystem projects or displays an identical image on several areas of the projection source component.
14. The system of claim 10, wherein the de-packing subsystem derives the packed image data from a storage device.
15. The system of claim 10, wherein the de-packing subsystem derives the packed image data from a wired/wireless transmission between the capture and transmitting system and the display and receiving system.
Type: Application
Filed: May 27, 2015
Publication Date: Jun 9, 2016
Inventors: Tsu-Ming Liu (Hsinchu City), Chih-Kai Chang (Taichung City), Chi-Cheng Ju (Hsinchu City), Chih-Ming Wang (Hsinchu County)
Application Number: 14/904,700