Method and Apparatus for Generating Dynamic Real-Time 3D Environment Projections

Methods and apparatus for generating dynamic real-time environment projections. In an exemplary embodiment, a method for generating a dynamic real-time 3D environment projection includes acquiring a real-time 2D image of an environment, and projecting the real-time 2D image of the environment onto a 3D shape to generate a 3D environment projection. In an exemplary embodiment, an apparatus that generates a dynamic real-time 3D environment projection includes an image receiver that acquires a real-time 2D image of an environment, and a projector that projects the real-time 2D image of the environment onto a 3D shape to generate a 3D environment projection.

Description
CLAIM TO PRIORITY

This application claims the benefit of priority based upon U.S. Provisional Patent Application having Application No. 62/527,778, filed on Jun. 30, 2017, and entitled “GENERATION AND USE OF DYNAMIC REAL-TIME ENVIRONMENT MAPS,” which is hereby incorporated herein by reference in its entirety.

FIELD

The present invention relates to the operation of image processing systems. More specifically, the present invention relates to the processing of images derived from a surrounding environment.

BACKGROUND

Software user interfaces have evolved significantly over the past forty years. They have progressed through command-driven terminals, mouse-driven 2D graphical user interfaces, and touch-driven 2D graphical user interfaces. In each generation of software user interface, the computer has displayed information via forms and effects crafted by a UI designer or developer. Regions of color, brightness, and contrast are crafted in such a way as to imply, for example, depth and lighting. Such effects help the user visually understand and organize large quantities of information. To date, such effects have been implemented statically in two dimensions. However, graphical user interfaces do not, for example, react to the lighting of the environment in which the computer is used.

Therefore, it would be desirable to have a mechanism that allows characteristics of the surrounding environment to be utilized in visual displays to support a variety of display applications.

SUMMARY

In various exemplary embodiments, methods and apparatus are provided for generating dynamic real-time 3D environment projections that provide a way to utilize characteristics of the surrounding environment in visual displays to support a variety of display applications. For example, the projections establish congruity between a graphical user interface and the physical environment in which both a user and a device reside. Such congruity would reduce cognitive load, improve immersiveness, and generally diminish the boundary between man and machine, thereby making computing devices generally easier to use.

In an exemplary embodiment, one or more (or a stream of) real-time 2D images are acquired. The images are acquired from one or more image capture devices and represent a wide view of a region surrounding the image capture device(s). The images are applied to a 3D shape that results in a dynamic real-time 3D projection that depicts the environment surrounding the image capture device. The dynamic real-time 3D environment projection can be used to support a variety of device applications and user interface controls, such as rendering transparent elements through which the environment behind the device may be seen, reflections, and/or image based illumination.

In an exemplary embodiment, a method is provided for generating dynamic real-time 3D environment projections. The method includes acquiring a stream of real-time 2D images, and projecting each real-time 2D image onto a 3D shape to generate a dynamic real-time 3D environment projection.

In an exemplary embodiment, a method is provided for generating dynamic real-time 3D environment projections. The method includes acquiring a stream of real-time 2D images and projecting the 2D images into an image or set of images representing one or more of six faces of a cube map to generate a dynamic real-time 3D environment projection.

In an exemplary embodiment, a method is provided for generating dynamic real-time 3D environment projections. The method includes acquiring a stream of real-time 2D images and using the 2D images as a lookup table to generate a dynamic real-time 3D environment projection.

In an exemplary embodiment, an apparatus is provided that generates dynamic real-time 3D environment projections. The apparatus includes an image receiver that acquires a stream of real-time 2D images, and a projector that projects each real-time 2D image onto a 3D shape to generate a dynamic real-time 3D environment projection.

Additional features and benefits of the exemplary embodiments of the present invention will become apparent from the detailed description, figures and claims set forth below.

BRIEF DESCRIPTION OF THE DRAWINGS

The exemplary embodiments of the present invention will be understood more fully from the detailed description given below and from the accompanying drawings of various embodiments of the invention, which should not be taken to limit the invention to the specific embodiments, but are for explanation and understanding only.

FIG. 1 shows devices comprising exemplary embodiments of an environment projection system;

FIG. 2 shows a device that includes an exemplary embodiment of the environment projection system;

FIG. 3 shows a detailed exemplary embodiment of the environment projection system shown in FIG. 2;

FIG. 4 shows a detailed exemplary embodiment of the image sensors and the image receiver shown in FIG. 3;

FIG. 5 shows a detailed exemplary embodiment of the projector shown in FIG. 3;

FIG. 6 shows a detailed exemplary embodiment of an image sensor;

FIG. 7A shows exemplary embodiments of 3D shapes for use with the embodiments of the environment projection system;

FIG. 7B shows an exemplary embodiment of a 3D shape represented as a 3D mesh for use with the embodiments of the environment projection system;

FIG. 8 shows a diagram illustrating exemplary operation of an embodiment of the environment projection system;

FIG. 9 shows a diagram illustrating exemplary operation of an embodiment of the environment projection system; and

FIG. 10 shows an exemplary embodiment of a method for generating real-time 3D environment projections.

DETAILED DESCRIPTION

The purpose of the following detailed description is to provide an understanding of one or more embodiments of the present invention. Those of ordinary skill in the art will realize that the following detailed description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure and/or description.

In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be understood that in the development of any such actual implementation, numerous implementation-specific decisions may be made in order to achieve the developer's specific goals, such as compliance with application and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another. Moreover, it will be understood that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of engineering for those of ordinary skill in the art having the benefit of the embodiments of this disclosure.

Various exemplary embodiments illustrated in the drawings may not be drawn to scale. Rather, the dimensions of the various features may be expanded or reduced for clarity. In addition, some of the drawings may be simplified for clarity. Thus, the drawings may not depict all of the components of a given apparatus (e.g., device) or method. The same reference indicators will be used throughout the drawings and the following detailed description to refer to the same or like parts.

FIG. 1 shows devices 100 comprising exemplary embodiments of an environment projection system (EPS). For example, the EPS operates to generate real-time 3D environment projections based on 2D images taken of the surrounding environment. For example, the devices shown include tablet computer 102, notebook computer 104, cell phone 106, and smart phone 108. It should be noted that embodiments of the EPS are suitable for use with virtually any type of device to generate real-time 3D environment projections and are not limited to the devices shown. For example, the EPS also is suitable for use in automobile dashboard systems, billboards, and stadium big screens.

FIG. 2 shows a device 200 that includes an exemplary embodiment of an environment projection system (EPS) 202. For example, the EPS 202 includes a 3D projection (3DP) unit 204 and image sensors 206. The image sensors 206 operate to acquire real-time 2D images of the environment surrounding the device 200. The 3D projection unit 204 operates to receive the real-time 2D images and generate dynamic real-time 3D environment projections that can be utilized, stored, and displayed by the device 200. The dynamic real-time 3D environment projections provide a visualization of the changing environment around the device 200. Since the environment projections are derived from the real-time images acquired from the image sensors 206, changes in the orientation and/or position of the image sensors 206 (e.g., when the device 200 is moved) result in corresponding changes to the environment projections.

FIG. 3 shows a detailed exemplary embodiment of an environment projection system 300. For example, the EPS 300 is suitable for use as the EPS 202 shown in FIG. 2. The EPS 300 comprises one or more image sensors 302 and a 3D projection unit 304. The image sensors 302 comprise one or more high-resolution image sensors that output real-time 2D images. For example, each image sensor can output a stream of real-time 2D image frames at 30 frames per second (fps) (or other suitable frame rate). The stream of 2D images output from the image sensors 302 is shown at 312.

In an exemplary embodiment, the 3D projection unit 304 includes an image receiver 306, projector 308 and memory 310. The image receiver 306 receives one or more real-time images 312 from the image sensors 302 and processes these images into a real-time 2D image stream 314 that is passed to the projector 308. For example, if the image stream 312 comprises images from multiple image sensors, the image receiver 306 operates to combine these images into the real-time 2D image stream 314. For example, the image receiver 306 may stitch together multiple images to generate the real-time 2D image stream 314 that provides a 360-degree field of view around the image sensors 302.
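For purposes of illustration only, the following non-limiting sketch (written in Python and assuming the NumPy library) suggests how an image receiver might combine frames from two opposing wide-angle sensors into a single 360° frame, under the simplifying assumption that each sensor's output has already been unwarped into one half of an equirectangular panorama. The function name combine_halves is hypothetical, and a practical image receiver would additionally align, blend, and color-match the seam regions.

```python
import numpy as np

def combine_halves(front_half: np.ndarray, back_half: np.ndarray) -> np.ndarray:
    """Combine two pre-unwarped half-panoramas (H x W/2 x 3 each) into a
    single 360-degree equirectangular frame (H x W x 3).

    Deliberately simplified stand-in for full stitching: a real image
    receiver would also align, blend, and color-match the seams.
    """
    if front_half.shape != back_half.shape:
        raise ValueError("halves must have identical dimensions")
    return np.concatenate([front_half, back_half], axis=1)

# Example: two synthetic 960x960 half-panoramas become one 960x1920 frame.
front = np.zeros((960, 960, 3), dtype=np.uint8)
back = np.full((960, 960, 3), 255, dtype=np.uint8)
panorama = combine_halves(front, back)
print(panorama.shape)  # (960, 1920, 3)
```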

The projector 308 obtains a selected 3D shape 318 from a plurality of 3D shapes 316 stored in the memory 310. In an exemplary embodiment, the selected 3D shape is in the form of a 3D mesh. The projector 308 projects the received real-time 2D images 314 onto the selected 3D mesh to generate real-time 3D environment projections 320 that are stored in the memory 310 as indicated at 322. The projector 308 operates in real-time (e.g., at least equal to the frame rate of the 2D image stream) so that changes in images of the real-time 2D image stream 314 are immediately reflected in the generated real-time 3D environment projections 320. In an exemplary embodiment, the projector 308 also outputs the real-time 3D environment projections 324 for display or other purposes.

FIG. 4 shows detailed exemplary embodiments of the image sensors 302 and the image receiver 306 shown in FIG. 3. In an exemplary embodiment, the image sensors 302 comprise one or more image sensors that capture images of the environment (or region) surrounding the device to which the image sensors 302 are mounted. In an exemplary embodiment, the image sensors 302 comprise one or more camera sensors that are arranged in such a way as to maximally cover the field of view (up to and even beyond 360°). For example, in one embodiment, the image sensors 302 comprise two opposing camera sensors, each with a 180° field of view, that cover a full sphere encompassing the device to which the image sensors 302 are mounted. In an exemplary embodiment, the implementation of two camera sensors, each with a 180° field of view, enables a bona fide 360° field of view to be obtained.

In various exemplary embodiments, the image sensors may include but are not limited to high-resolution (HD) cameras, video cameras (e.g., outputting 30-60 fps), color or black-and-white cameras, and/or cameras having special lenses (e.g., wide-angle or fisheye). If two cameras each having a 180° field of view are used, they may be placed in opposition to each other to obtain a 360° field of view. Other configurations include four cameras each with a 90° field of view to obtain a 360° field of view, or multiple cameras with asymmetrical fields of view that are combined to obtain a 360° field of view.

In an exemplary embodiment, the image receiver 306 comprises an image sensor interface (I/F) 402, image controller 404, and image output I/F 406. The image sensor I/F 402 comprises logic, registers, storage elements, and/or discrete components that operate to receive image data from the image sensors 302 and to pass this image data to the image controller 404.

In an exemplary embodiment, the image controller 404 comprises a processor, CPU, gate array, programmable logic, registers, logic, and/or discrete components that operate to receive real-time images from the image sensors 302 provided by the image sensor I/F 402. The image controller 404 operates to process those images into a real-time 2D image stream that is output to the image output interface 406. For example, the image sensors 302 may include multiple image sensors that each output real-time 2D images. The image controller 404 operates to combine these multiple real-time images into a real-time 2D image stream where each image provides a wide field of view around the image sensors 302. For example, each image may provide a 360° field of view around the image sensors 302. In an embodiment, the image controller 404 operates to stitch together multiple images received from the image sensors 302 to form the real-time 2D output image stream 410. In one embodiment, the image controller 404 includes a memory 408 to facilitate combining images from multiple image sensors.

Once acquisition and processing of the image sensor data is complete, the image controller 404 outputs the real-time 2D image stream 410 to the image output I/F 406, which generates the real-time 2D image stream 314 output. For example, as shown in FIG. 3, the real-time 2D image stream 314 is output from the image receiver 306 to the projector 308.

FIG. 5 shows a detailed exemplary embodiment of the projector 308 and memory 310 shown in FIG. 3. In an exemplary embodiment, the projector 308 comprises an image input I/F 502, projection processor 504, and a projection output I/F 506.

In an exemplary embodiment, the image input I/F 502 comprises at least one of programmable logic, registers, memory, and/or discrete components that operate to receive real-time 2D images 314 from the image receiver 306 and pass these images to the projection processor 504. In an exemplary embodiment, the received images may be stored or buffered by the image input I/F 502.

In an exemplary embodiment, the projection processor 504 comprises at least one of a processor, CPU, gate array, programmable logic, memory, registers, logic, and/or discrete components that operate to receive a stream of real-time 2D images from the image input I/F 502. The projection processor 504 projects those images to generate real-time 3D environment projections that are output to the projection output I/F 506.

In an exemplary embodiment, the projection processor 504 retrieves a selected 3D shape 318 from the plurality of shapes 316 stored in the memory 310. The projection processor 504 operates to project the received 2D images onto the 3D shape to generate the real-time 3D environment projections.

UV Mapping

In an exemplary embodiment, the projection processor 504 performs a process known as UV mapping that utilizes the 2D images as a texture. The process of UV mapping is known in the art and will not be described in detail here. However, using the UV mapping process, the projection processor 504 projects selected portions of the texture onto the polygons that form the 3D shape to generate the real-time 3D environment projections. This process is further described with respect to FIG. 7B.
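For purposes of illustration only, the following non-limiting Python/NumPy sketch suggests one way the texture lookup used during UV mapping might be performed: each UV coordinate carried by a mesh vertex (or fragment) is bilinearly sampled from the real-time 2D environment image used as the texture. The function name sample_texture is hypothetical, and the sketch omits the rasterization and interpolation steps a full renderer would perform.

```python
import numpy as np

def sample_texture(texture: np.ndarray, uv: np.ndarray) -> np.ndarray:
    """Bilinearly sample an H x W x 3 texture at N UV coordinates in [0, 1]^2.

    In a UV-mapped projection, each mesh vertex (or fragment) carries a UV
    coordinate; its color is looked up from the real-time 2D environment
    image used as the texture.
    """
    h, w = texture.shape[:2]
    x = uv[:, 0] * (w - 1)
    y = uv[:, 1] * (h - 1)
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    x1, y1 = np.minimum(x0 + 1, w - 1), np.minimum(y0 + 1, h - 1)
    fx, fy = (x - x0)[:, None], (y - y0)[:, None]
    top = texture[y0, x0] * (1 - fx) + texture[y0, x1] * fx
    bottom = texture[y1, x0] * (1 - fx) + texture[y1, x1] * fx
    return top * (1 - fy) + bottom * fy

# Example: color three mesh vertices from a real-time frame.
frame = np.random.rand(480, 960, 3)
vertex_uvs = np.array([[0.25, 0.5], [0.5, 0.1], [0.9, 0.9]])
print(sample_texture(frame, vertex_uvs).shape)  # (3, 3)
```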

Cube Mapping

In an exemplary embodiment, the projection processor 504 generates the real-time 3D environment projections by generating one or more of the six faces of a cube map. This is achieved through a process of sampling the 2D environment image or images and re-expressing them using, for example, a 2D projection scheme such as spherical, cylindrical, conic, or azimuthal, as is appropriate for the particular lens configuration in use. For example, two image sensors placed both in opposition to one another and axially aligned may use a spherical projection in order to convert the captured images into components of a cube map. Once the cube map is generated, it may be provided to rendering applications to be used as a reflection source, sky box, or similar.
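For purposes of illustration only, the following non-limiting Python/NumPy sketch suggests how one face of a cube map might be generated from a full-sphere equirectangular environment image by a spherical projection. The function name cube_face_from_equirect is hypothetical, only the +Z face is shown, and bilinear filtering and seam handling are omitted.

```python
import numpy as np

def cube_face_from_equirect(env: np.ndarray, face: str, size: int = 256) -> np.ndarray:
    """Resample a full-sphere equirectangular environment image (H x W x 3)
    into one face of a cube map using a spherical projection.

    Only the +Z ("front") face is shown; other faces differ only in how the
    per-pixel direction vector is constructed.
    """
    if face != "front":
        raise NotImplementedError("sketch covers only the +Z face")
    h, w = env.shape[:2]
    # Pixel grid of the face, mapped to [-1, 1] x [-1, 1] on the z = 1 plane.
    a = (np.arange(size) + 0.5) / size * 2.0 - 1.0
    xs, ys = np.meshgrid(a, -a)
    zs = np.ones_like(xs)
    norm = np.sqrt(xs**2 + ys**2 + zs**2)
    dx, dy, dz = xs / norm, ys / norm, zs / norm
    # Direction -> spherical angles -> equirectangular pixel coordinates.
    lon = np.arctan2(dx, dz)              # [-pi, pi]
    lat = np.arcsin(dy)                   # [-pi/2, pi/2]
    u = ((lon / (2 * np.pi) + 0.5) * (w - 1)).astype(int)
    v = ((0.5 - lat / np.pi) * (h - 1)).astype(int)
    return env[v, u]

env_image = np.random.rand(512, 1024, 3)
front_face = cube_face_from_equirect(env_image, "front")
print(front_face.shape)  # (256, 256, 3)
```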

Lookup Table

In an exemplary embodiment, the projection processor 504 generates the real-time 3D projections by using the acquired real-time 2D image or images of the environment as a lookup table whereby data from the environment images is used in subsequent rendering steps. For example, referencing data derived from the environment image(s) in the scene camera's rendering procedure could result in a sky being rendered that depicts the environment, but without first creating an intermediate texture, cube map, sky box, or other intermediate representation. Similar operations may be used to generate other effects, such as reflections. Mipmaps, convolutions, or other intermediate transformations of the environment image(s) may also be used to create real-time 3D environment projections.
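For purposes of illustration only, the following non-limiting Python/NumPy sketch suggests how the environment image might be used directly as a lookup table during rendering: view (or sample) directions are converted straight to image coordinates, with no intermediate texture, cube map, or sky box. The function name env_lookup is hypothetical.

```python
import numpy as np

def env_lookup(env: np.ndarray, directions: np.ndarray) -> np.ndarray:
    """Treat the equirectangular environment image as a lookup table:
    map unit direction vectors (N x 3) straight to colors, with no
    intermediate texture, cube map, or sky box.
    """
    h, w = env.shape[:2]
    dx, dy, dz = directions[:, 0], directions[:, 1], directions[:, 2]
    lon = np.arctan2(dx, dz)
    lat = np.arcsin(np.clip(dy, -1.0, 1.0))
    u = ((lon / (2 * np.pi) + 0.5) * (w - 1)).astype(int)
    v = ((0.5 - lat / np.pi) * (h - 1)).astype(int)
    return env[v, u]

# Example: shade a few sky samples whose camera rays look toward +Z.
env_image = np.random.rand(512, 1024, 3)
rays = np.array([[0.0, 0.3, 1.0], [0.1, 0.5, 1.0], [-0.1, 0.7, 1.0]])
rays /= np.linalg.norm(rays, axis=1, keepdims=True)
print(env_lookup(env_image, rays).shape)  # (3, 3)
```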

Environment Reflections

In an exemplary embodiment, the projection processor 504 generates the real-time 3D environment projections in the form of reflections of the environment on elements in a user interface. In this way, user interface elements may be expressed with particular degrees of “shininess” where the notion of “shine” is achieved by rendering the environment upon the surface of the user interface element. Designers may control the degree of shininess, anywhere from a mirror-like finish to a matte finish, by, for example, convolving the environment image before it is rendered upon the surface of the user interface element.
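For purposes of illustration only, the following non-limiting Python/NumPy sketch suggests how a degree of shininess might be controlled by convolving the environment image before it is used as a reflection source (a simple box blur stands in for the convolution), and how a reflection direction might be computed for the lookup. The function names blur_environment and reflect are hypothetical; a production renderer would typically use a proper convolution or a pre-filtered mip chain.

```python
import numpy as np

def blur_environment(env: np.ndarray, radius: int) -> np.ndarray:
    """Box-blur the environment image before it is used as a reflection
    source.  A radius of 0 leaves a mirror-like finish; larger radii
    approximate an increasingly matte finish.
    """
    if radius == 0:
        return env
    k = 2 * radius + 1
    padded = np.pad(env, ((radius, radius), (radius, radius), (0, 0)), mode="wrap")
    out = np.zeros_like(env)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + env.shape[0], dx:dx + env.shape[1]]
    return out / (k * k)

def reflect(view: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Reflect a view direction about a surface normal (both unit vectors)."""
    return view - 2.0 * np.dot(view, normal) * normal

env_image = np.random.rand(128, 256, 3)
matte_env = blur_environment(env_image, radius=4)    # softer, matte highlights
mirror_env = blur_environment(env_image, radius=0)   # sharp, mirror-like reflections
r = reflect(np.array([0.0, -1.0, 1.0]) / np.sqrt(2), np.array([0.0, 1.0, 0.0]))
print(matte_env.shape, r)  # (128, 256, 3) [0.  0.70710678  0.70710678]
```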

Light Mapped Illumination

In an exemplary embodiment, the projection processor 504 generates the real-time 3D environment projections in the form of light mapped, image-based illumination of a user interface. Because image-based lighting is a scheme whereby light may be rendered as coming from many directions, and each direction may take on the characteristic color of the light in the environment, the user interface is lit by a high-fidelity representation of the lighting of the environment.
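For purposes of illustration only, the following non-limiting Python/NumPy sketch suggests how a diffuse, image-based lighting term might be estimated from the environment image by cosine-weighted Monte Carlo sampling over the hemisphere about a surface normal. The function name diffuse_irradiance is hypothetical, and normalization constants are omitted for brevity.

```python
import numpy as np

def diffuse_irradiance(env: np.ndarray, normal: np.ndarray, samples: int = 512) -> np.ndarray:
    """Approximate the diffuse (cosine-weighted) light arriving at a surface
    with the given unit normal by Monte Carlo sampling of the equirectangular
    environment image.  Each sampled direction contributes the characteristic
    color of the environment's light from that direction.
    """
    h, w = env.shape[:2]
    rng = np.random.default_rng(0)
    # Uniformly sample directions on the sphere; keep the hemisphere about the normal.
    d = rng.normal(size=(samples, 3))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    cosines = d @ normal
    d, cosines = d[cosines > 0], cosines[cosines > 0]
    lon = np.arctan2(d[:, 0], d[:, 2])
    lat = np.arcsin(np.clip(d[:, 1], -1.0, 1.0))
    u = ((lon / (2 * np.pi) + 0.5) * (w - 1)).astype(int)
    v = ((0.5 - lat / np.pi) * (h - 1)).astype(int)
    return (env[v, u] * cosines[:, None]).mean(axis=0)

env_image = np.random.rand(256, 512, 3)
up_facing = diffuse_irradiance(env_image, np.array([0.0, 1.0, 0.0]))
print(up_facing)  # average light color arriving from above
```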

As the real-time 3D environment projections are generated, they are stored in the memory 310 as indicated at 322. The projection output I/F 506 also outputs the real-time 3D environment projections 324 to other systems or device applications.

FIG. 6 shows a detailed exemplary embodiment of an image sensor 600 for use with the environment projection system. For example, the image sensor 600 is suitable for use as part of the image sensors 302 shown in FIG. 3. The image sensor 600 comprises a sensor body 602 that houses an image sensor 606 that is covered by a lens 604. For example, the lens 604 may be a hemispherical dome lens. However, other factors such as cost and form factor may affect the choice of lens design for a given implementation.

In this particular embodiment, the lens 604 operates to provide a wide field of view of the surrounding environment that is captured by the image sensor 606. In other embodiments, different sensor/lens combinations are used to acquire a wide field of view of the surrounding environment. Evaluation of a particular sensor/lens configuration should consider the accuracy of the system's ability to project the image onto the surface of the image sensor 606 with minimal aberration. For example, faceted lenses may not be suitable as they may introduce aberration inherent to the way they capture light.

FIG. 7A shows exemplary 3D shapes 700 for use with the embodiments of the environment projection system. For example, the EPS is operable to utilize 3D shapes such as a hemispherical dome 702, dodecahedron 704, tetrahedron 706, and cube 708. It should be noted that the EPS is suitable for use with other types of 3D shapes and is not limited to utilizing only the shapes shown in FIG. 7A. For example, in an embodiment, the 3D shapes 700 may comprise a sphere, a box, a tetrahedron, a pyramid, a cone, and a capsule.

FIG. 7B shows an exemplary embodiment of the 3D shape 702 represented as a 3D mesh 710 for use with the embodiments of the environment projection system. For example, the 3D mesh 710 may be stored in the memory 310 for use by the projection processor 504 to generate the real-time 3D environment projections 320. In an exemplary embodiment, the 3D mesh 710 comprises polygon shapes (e.g., polygon 712) arranged and configured to form a hemispherical dome onto which real-time 2D images will be projected by the projection processor 504. For example, in an exemplary embodiment, the projection processor 504 performs a UV mapping process that utilizes the 2D images as a texture. Using the UV mapping process, the projection processor 504 projects a selected portion of the texture into each of the polygons of the 3D mesh 710 to generate the real-time 3D environment projections.
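For purposes of illustration only, the following non-limiting Python/NumPy sketch suggests how a hemispherical dome such as the 3D mesh 710 might be constructed as a triangle mesh with per-vertex UV coordinates suitable for the UV mapping process described above. The function name dome_mesh and the latitude/longitude UV layout are illustrative choices, not requirements.

```python
import numpy as np

def dome_mesh(rings: int = 8, segments: int = 16):
    """Build a hemispherical dome as a triangle mesh with per-vertex UV
    coordinates, ready for UV-mapping a real-time 2D environment image.
    Returns (vertices, uvs, triangles).
    """
    verts, uvs = [], []
    for r in range(rings + 1):
        phi = (np.pi / 2) * r / rings            # 0 at zenith, pi/2 at horizon
        for s in range(segments + 1):
            theta = 2 * np.pi * s / segments
            verts.append([np.sin(phi) * np.cos(theta),
                          np.cos(phi),
                          np.sin(phi) * np.sin(theta)])
            uvs.append([s / segments, r / rings])  # simple lat/long UV layout
    tris = []
    for r in range(rings):
        for s in range(segments):
            a = r * (segments + 1) + s
            b = a + segments + 1
            tris.append([a, b, a + 1])
            tris.append([a + 1, b, b + 1])
    return np.array(verts), np.array(uvs), np.array(tris)

v, uv, t = dome_mesh()
print(v.shape, uv.shape, t.shape)  # (153, 3) (153, 2) (256, 3)
```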

FIG. 8 shows a diagram 800 illustrating exemplary operation of an embodiment of the environment projection system. For example, the operations illustrated in the diagram 800 are discussed with reference to FIG. 5.

In a first operation, the projection processor 504 selects a 3D shape from the 3D shapes 316 stored in the memory 310. For example, a hemispherical shape 802 is selected. In an exemplary embodiment, the shape 802 is stored as a 3D mesh in the memory 310. In another embodiment, the projection processor 504 obtains a 3D shape from the memory 310 and converts this shape into a mesh representation. In a second operation, a real-time 2D image is acquired. For example, the image sensors 302 capture the image shown at 804. In a third operation, the projection processor 504 performs a projection to project the 2D image onto the selected 3D shape as shown at 806. In an exemplary embodiment, the projection processor 504 performs a UV mapping operation to generate a real-time 3D environment projection 808, which is stored in a memory (e.g., memory 310) in a fourth operation. The projection may also be output to other programs or devices using the projection output I/F 506. Since this is a real-time process, the next image in the stream of 2D images is acquired in a fifth operation and the process is repeated. Thus, real-time 3D environment projections are generated as a continuous stream corresponding to the captured real-time 2D images.

FIG. 9 shows a diagram 900 illustrating exemplary operation of an embodiment of the environment projection system. The diagram 900 shows real-time 2D images 902 and corresponding real-time 3D environment projections 904. In an exemplary embodiment, the real-time images 902 are acquired by image sensors 302, sent to the image receiver 306 and processed by the projector 308.

At time T0, a first real-time 2D image 906 is acquired that captures an environment surrounding the image sensors 302. The 2D image 906 includes trees, a building, clouds, and the sun 912. The projector 308 projects this first image onto the 3D mesh 710 to generate a first 3D environment projection 906A also shown at time T0. As illustrated in the environment projection 906A, the initial location of the sun is indicated at 912.

At time Tx, another real-time 2D image 908 of the surroundings is acquired that includes trees, a building, clouds, and the sun 912. The projector 308 projects this image onto the 3D mesh 710 to generate a 3D environment projection 908A shown at time Tx. As illustrated in the 2D image 908 and environment projection 908A, the location of the sun is indicated at 912.

At time Tn, another real-time 2D image 910 is acquired that includes trees, a building, clouds, and the sun 912. The projector 308 projects this image onto the 3D mesh 710 to generate a 3D environment projection 910A shown at time Tn. As illustrated in the 2D image 910 and environment projection 910A, the location of the sun is indicated at 912.

Thus, the diagram 900 illustrates how movement of objects (e.g., the sun 912) in the real-time 2D image stream is reflected in movement of those same objects in the corresponding 3D environment projections.

FIG. 10 shows an exemplary embodiment of a method 1000 for generating real-time 3D environment projections in accordance with exemplary embodiments of the present invention. For example, the method 1000 is suitable for use with the EPS 300 shown in FIG. 3.

At block 1002, a 3D shape is selected. For example, the projector 308 selects a 3D shape from the 3D shapes 316 stored in the memory 310. In an exemplary embodiment, the 3D shapes include a sphere, a tetrahedron, a box, and/or any other suitable 3D shape. In an exemplary embodiment, the shape selection is performed during an initialization phase or configuration phase of the operation of the EPS 300. In another exemplary embodiment, the selection of the 3D shape is performed during runtime. In an exemplary embodiment, the 3D shapes 316 are stored in the memory 310 as 3D mesh shapes.

At block 1004, a real-time 2D image is acquired. For example, in an exemplary embodiment, the 2D image is acquired from one or more image sensors 302. For example, the image sensors can be part of a camera system attached to a hand-held device. In one embodiment, the acquired image provides a 360° field of view of the region surrounding the location of the image sensors. In an exemplary embodiment, the image sensors 302 output images at a frame rate of 30 fps.

At block 1006, an optional operation is performed to combine images from multiple sensors into the acquired real-time 2D image. For example, if several images are acquired by multiple image sensors, these images are combined into one image by connecting the images together or otherwise stitching the images to form one real-time 2D image. In an exemplary embodiment, the image controller 404 performs this operation.

At block 1008, the real-time 2D image is projected onto the selected 3D shape to generate a real-time 3D environment projection. In an exemplary embodiment, the projection processor 504 performs a UV mapping operation to project the 2D images onto the 3D shape. In other embodiments, the projection processor 504 performs one or more of Cube Mapping, Lookup Table processing, Environment Reflections processing or Light Mapped Illumination processing as described above, to generate the real-time 3D environment projection.

At block 1010, the real-time 3D environment projection is stored in a memory. For example, the projector 308 stores the real-time 3D environment projection into the memory 310.

At block 1012, the environment projection is output. For example, the real-time 3D environment projection is output for display or is passed to another device or application by the projection output I/F 506. The method then proceeds to block 1004 to acquire the next real-time 2D image for processing.
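For purposes of illustration only, the following non-limiting Python/NumPy sketch summarizes the loop of blocks 1002-1012: the 3D shape is selected once, and then, for each frame, a real-time 2D image is acquired, projected, and stored or output before the method returns to block 1004. The function name run_projection_loop is hypothetical, and simple placeholders stand in for the image receiver, projector, and memory described above.

```python
import numpy as np

def run_projection_loop(capture_frame, project, store, frames: int = 3):
    """Skeleton of the loop of blocks 1002-1012: select the 3D shape once,
    then for each frame acquire a 2D image, project it onto the shape, and
    store/output the projection before repeating.
    """
    # Placeholder UV set standing in for the selected 3D mesh (see FIG. 7B).
    mesh_uvs = np.random.rand(153, 2)
    for _ in range(frames):
        frame = capture_frame()                # block 1004: acquire real-time 2D image
        projection = project(frame, mesh_uvs)  # block 1008: project onto the 3D shape
        store(projection)                      # blocks 1010/1012: store and output

run_projection_loop(
    capture_frame=lambda: np.random.rand(480, 960, 3),
    # Nearest-neighbor per-vertex lookup stands in for full UV mapping.
    project=lambda f, uvs: f[(uvs[:, 1] * (f.shape[0] - 1)).astype(int),
                             (uvs[:, 0] * (f.shape[1] - 1)).astype(int)],
    store=lambda p: print("projection per-vertex colors:", p.shape),
)
```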

Thus, the method 1000 operates to generate a real-time 3D environment projection in accordance with exemplary embodiments of the present invention. It should be noted that although the method 1000 describes specific operations, these operations may be changed, modified, rearranged, added to, and subtracted from within the scope of the embodiments.

While particular embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from these exemplary embodiments of the present invention and their broader aspects. Therefore, the appended claims are intended to encompass within their scope all such changes and modifications as are within the true spirit and scope of these exemplary embodiments of the present invention.

Claims

1. A method, comprising:

acquiring a real-time two-dimensional (2D) image of an environment; and
projecting the real-time 2D image of the environment onto a 3D shape to generate a real-time three-dimensional (3D) environment projection.

2. The method of claim 1, wherein the operation of projecting comprises UV-mapping the real-time 2D image of the environment onto the 3D shape.

3. The method of claim 1, wherein the operation of projecting comprises processing the real-time 2D image of the environment to generate one or more of six faces of a cube map.

4. The method of claim 1, wherein the operation of projecting comprises using the real-time 2D image of the environment as a lookup table to render the environment in a 3D scene.

5. The method of claim 1, wherein the operation of acquiring comprises acquiring the real-time 2D image of the environment from one or more image sensors.

6. The method of claim 5, wherein the operation of acquiring comprises acquiring the real-time 2D image of the environment to form a 360° field of view.

7. The method of claim 1, further comprising storing the real-time 2D image of the environment in a memory.

8. The method of claim 1, further comprising storing the 3D environment projection in a memory.

9. The method of claim 1, further comprising selecting the 3D shape from a plurality of 3D shapes.

10. The method of claim 9, wherein the 3D shape is selected from a set of 3D shapes comprising a sphere, a box, a tetrahedron, a pyramid, a cone, and a capsule.

11. The method of claim 1, further comprising performing the method on at least one of a handheld device, desktop computer, and laptop computer.

12. The method of claim 11, further comprising displaying the environment projection on a device display.

13. An apparatus, comprising:

an image receiver that acquires a real-time two-dimensional (2D) image of an environment; and
a projector that projects the real-time 2D image of the environment onto a three-dimensional (3D) shape to generate a real-time 3D environment projection.

14. The apparatus of claim 13, wherein the projector performs UV-mapping to project the real-time 2D image of the environment onto the 3D shape to generate the 3D environment projection.

15. The apparatus of claim 13, wherein the image receiver acquires the real-time 2D image of the environment from one or more image sensors.

16. The apparatus of claim 15, wherein the one or more image sensors capture the real-time 2D image of the environment in a 360° field of view.

17. The apparatus of claim 13, further comprising a memory that stores the real-time 2D image of the environment.

18. The apparatus of claim 17, wherein the memory stores the 3D environment projection.

19. The apparatus of claim 13, wherein the projector selects the 3D shape from a plurality of 3D shapes.

20. The apparatus of claim 13, wherein the 3D shape is selected from a set of 3D shapes comprising a sphere, a box, a tetrahedron, a pyramid, a cone, and a capsule.

21. The apparatus of claim 13, wherein the apparatus is located in at least one of a handheld device, desktop computer, and laptop computer.

22. The apparatus of claim 21, wherein the projector outputs the 3D environment projection for display on a device display.

23. The apparatus of claim 13, wherein the projector processes the real-time 2D image of the environment to generate one or more of six faces of a cube map.

24. The apparatus of claim 13, wherein the projector uses the real-time 2D image of the environment as a lookup table to render the environment in a 3D scene.

Patent History
Publication number: 20190007672
Type: Application
Filed: Aug 31, 2017
Publication Date: Jan 3, 2019
Inventor: Bobby Gene Burrough (San Jose, CA)
Application Number: 15/693,076
Classifications
International Classification: H04N 13/04 (20060101); H04N 9/31 (20060101);