Optical configurations for head worn computing
Aspects of the present invention relate to optical systems in head worn computing.
This application is a continuation of the following U.S. patent application, which is hereby incorporated by reference in its entirety:
U.S. non-provisional application Ser. No. 14/172,901, entitled Optical Configurations for Head Worn Computing, filed Feb. 4, 2014.
This application also is a continuation-in-part of the following U.S. patent applications, which are hereby incorporated by reference in their entirety:
U.S. non-provisional application Ser. No. 14/163,646, entitled Optical Configurations for Head Worn Computing, filed Jan. 24, 2014; and
U.S. non-provisional application Ser. No. 14/160,377, entitled Optical Configurations for Head Worn Computing, filed Jan. 21, 2014.
BACKGROUND
1. Field of the Invention
This invention relates to head worn computing. More particularly, this invention relates to optical systems used in head worn computing.
2. Description of Related Art
Wearable computing systems have been developed and are beginning to be commercialized. Many problems persist in the wearable computing field that need to be resolved before wearable computers can meet the demands of the market.
SUMMARY
Aspects of the present invention relate to optical systems in head worn computing. Aspects relate to the management of “off” pixel light. Aspects relate to absorbing “off” pixel light. Aspects relate to improved see-through transparency of the HWC optical path to the surrounding environment. Aspects relate to improved image contrast, brightness, sharpness and other image quality through the management of stray light. Aspects relate to eye imaging through “off” pixels. Aspects relate to security compliance and security compliance tracking through eye imaging. Aspects relate to guest access of a HWC through eye imaging. Aspects relate to providing system and software access based on eye imaging.
These and other systems, methods, objects, features, and advantages of the present invention will be apparent to those skilled in the art from the following detailed description of the preferred embodiment and the drawings. All documents mentioned herein are hereby incorporated in their entirety by reference.
Embodiments are described with reference to the following Figures. The same numbers may be used throughout to reference like features and components that are shown in the Figures:
While the invention has been described in connection with certain preferred embodiments, other embodiments would be understood by one of ordinary skill in the art and are encompassed herein.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT(S)
Aspects of the present invention relate to head-worn computing (“HWC”) systems. HWC involves, in some instances, a system that mimics the appearance of head-worn glasses or sunglasses. The glasses may be a fully developed computing platform, such as including computer displays presented in each of the lenses of the glasses to the eyes of the user. In embodiments, the lenses and displays may be configured to allow a person wearing the glasses to see the environment through the lenses while also seeing, simultaneously, digital imagery, which forms an overlaid image that is perceived by the person as a digitally augmented image of the environment, or augmented reality (“AR”).
HWC involves more than just placing a computing system on a person's head. The system may need to be designed as a lightweight, compact and fully functional computer display, such as wherein the computer display includes a high resolution digital display that provides a high level of immersion comprised of the displayed digital content and the see-through view of the environmental surroundings. User interfaces and control systems suited to the HWC device may be required that are unlike those used for a more conventional computer such as a laptop. For the HWC and associated systems to be most effective, the glasses may be equipped with sensors to determine environmental conditions, geographic location, relative positioning to other points of interest, objects identified by imaging and movement by the user or other users in a connected group, and the like. The HWC may then change the mode of operation to match the conditions, location, positioning, movements, and the like, in a method generally referred to as a contextually aware HWC. The glasses also may need to be connected, wirelessly or otherwise, to other systems either locally or through a network. Controlling the glasses may be achieved through the use of an external device, automatically through contextually gathered information, through user gestures captured by the glasses sensors, and the like. Each technique may be further refined depending on the software application being used in the glasses. The glasses may further be used to control or coordinate with external devices that are associated with the glasses.
Referring to
We will now describe each of the main elements depicted on
The HWC 102 is a computing platform intended to be worn on a person's head. The HWC 102 may take many different forms to fit many different functional requirements. In some situations, the HWC 102 will be designed in the form of conventional glasses. The glasses may or may not have active computer graphics displays. In situations where the HWC 102 has integrated computer displays the displays may be configured as see-through displays such that the digital imagery can be overlaid with respect to the user's view of the environment 114. There are a number of see-through optical designs that may be used, including ones that have a reflective display (e.g. LCoS, DLP), emissive displays (e.g. OLED, LED), hologram, TIR waveguides, and the like. In embodiments, lighting systems used in connection with the display optics may be solid state lighting systems, such as LED, OLED, quantum dot, quantum dot LED, etc. In addition, the optical configuration may be monocular or binocular. It may also include vision corrective optical components. In embodiments, the optics may be packaged as contact lenses. In other embodiments, the HWC 102 may be in the form of a helmet with a see-through shield, sunglasses, safety glasses, goggles, a mask, fire helmet with see-through shield, police helmet with see-through shield, military helmet with see-through shield, utility form customized to a certain work task (e.g. inventory control, logistics, repair, maintenance, etc.), and the like.
The HWC 102 may also have a number of integrated computing facilities, such as an integrated processor, integrated power management, communication structures (e.g. cell net, WiFi, Bluetooth, local area connections, mesh connections, remote connections (e.g. client server, etc.)), and the like. The HWC 102 may also have a number of positional awareness sensors, such as GPS, electronic compass, altimeter, tilt sensor, IMU, and the like. It may also have other sensors such as a camera, rangefinder, hyper-spectral camera, Geiger counter, microphone, spectral illumination detector, temperature sensor, chemical sensor, biologic sensor, moisture sensor, ultrasonic sensor, and the like.
The HWC 102 may also have integrated control technologies. The integrated control technologies may be contextual based control, passive control, active control, user control, and the like. For example, the HWC 102 may have an integrated sensor (e.g. camera) that captures user hand or body gestures 116 such that the integrated processing system can interpret the gestures and generate control commands for the HWC 102. In another example, the HWC 102 may have sensors that detect movement (e.g. a nod, head shake, and the like) including accelerometers, gyros and other inertial measurements, where the integrated processor may interpret the movement and generate a control command in response. The HWC 102 may also automatically control itself based on measured or perceived environmental conditions. For example, if it is bright in the environment the HWC 102 may increase the brightness or contrast of the displayed image. In embodiments, the integrated control technologies may be mounted on the HWC 102 such that a user can interact with it directly. For example, the HWC 102 may have a button(s), touch capacitive interface, and the like.
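As a concrete illustration of the ambient-brightness example above, the following is a minimal sketch assuming hypothetical sensor and display driver hooks; read_ambient_lux and set_display_brightness are illustrative names, not interfaces from this disclosure:

```python
# Hypothetical sketch of contextual brightness control for an HWC.
# `read_ambient_lux` and `set_display_brightness` stand in for whatever
# sensor and display drivers an actual HWC 102 would expose.

def brightness_for_ambient(lux: float) -> float:
    """Map an ambient illuminance (lux) to a display brightness in [0.1, 1.0]."""
    # Indoor lighting is roughly 100-1000 lux; direct sun is ~100,000 lux.
    if lux < 10:      # dark room / night
        return 0.1
    if lux < 1000:    # typical indoor
        return 0.4
    if lux < 10000:   # overcast outdoors
        return 0.7
    return 1.0        # bright sunlight: maximum brightness/contrast

def update_display(read_ambient_lux, set_display_brightness):
    # Increase brightness when the environment is bright, as described above.
    set_display_brightness(brightness_for_ambient(read_ambient_lux()))
```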
As described herein, the HWC 102 may be in communication with external user interfaces 104. The external user interfaces may come in many different forms. For example, a cell phone screen may be adapted to take user input for control of an aspect of the HWC 102. The external user interface may be a dedicated UI, such as a keyboard, touch surface, button(s), joystick, and the like. In embodiments, the external controller may be integrated into another device such as a ring, watch, bike, car, and the like. In each case, the external user interface 104 may include sensors (e.g. IMU, accelerometers, compass, altimeter, and the like) to provide additional input for controlling the HWC 102.
As described herein, the HWC 102 may control or coordinate with other local devices 108. The external devices 108 may be an audio device, visual device, vehicle, cell phone, computer, and the like. For instance, the local external device 108 may be another HWC 102, where information may then be exchanged between the separate HWCs 108.
Similar to the way the HWC 102 may control or coordinate with local devices 108, the HWC 102 may control or coordinate with remote devices 112, such as the HWC 102 communicating with the remote devices 112 through a network 110. Again, the remote device 112 may take many forms. Included in these forms is another HWC 102. For example, each HWC 102 may communicate its GPS position such that all the HWCs 102 know where all of the other HWCs 102 are located.
The light that is provided by the polarized light source 302, which is subsequently reflected by the reflective polarizer 310 before it reflects from the DLP 304, will generally be referred to as illumination light. The light that is reflected by the “off” pixels of the DLP 304 is reflected at a different angle than the light reflected by the “on” pixels, so that the light from the “off” pixels is generally directed away from the optical axis of the field lens 312 and toward the side of the upper optical module 202 as shown in
The DLP 304 operates as a computer controlled display and is generally thought of as a MEMS device. The DLP pixels are comprised of small mirrors that can be directed. The mirrors generally flip from one angle to another angle. The two angles are generally referred to as states. When light is used to illuminate the DLP the mirrors will reflect the light in a direction depending on the state. In embodiments herein, we generally refer to the two states as “on” and “off,” which is intended to depict the condition of a display pixel. “On” pixels will be seen by a viewer of the display as emitting light because the light is directed along the optical axis and into the field lens and the associated remainder of the display system. “Off” pixels will be seen by a viewer of the display as not emitting light because the light from these pixels is directed to the side of the optical housing and into a light trap or light dump where the light is absorbed. The pattern of “on” and “off” pixels produces image light that is perceived by a viewer of the display as a computer generated image. Full color images can be presented to a user by sequentially providing illumination light in primary colors such as red, green and blue, where the sequence is presented in a recurring cycle that is faster than the user can perceive as separate images; as a result, the user perceives a full color image comprised of the sum of the sequential images. Bright pixels in the image are provided by pixels that remain in the “on” state for the entire time of the cycle, while dimmer pixels in the image are provided by pixels that switch between the “on” state and “off” state within the time of the cycle, or frame time when in a video sequence of images.
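The gray-scale behavior just described amounts to pulse-width modulation of the mirror states within each frame. The sketch below illustrates only that duty-cycle principle; real DLP controllers schedule binary bit-planes rather than equal time slices, and the slot count here is an arbitrary illustrative choice:

```python
# Sketch of DLP-style gray scale: a pixel's perceived brightness is the
# fraction of the frame time its mirror spends in the "on" state.

FRAME_SLOTS = 8  # time slices per frame (illustrative; real controllers use bit-planes)

def mirror_schedule(level: int) -> list[bool]:
    """Per-slot mirror states (True = "on") for a brightness level 0..FRAME_SLOTS."""
    assert 0 <= level <= FRAME_SLOTS
    return [slot < level for slot in range(FRAME_SLOTS)]

# A full-brightness pixel stays "on" for the entire frame time:
assert mirror_schedule(8) == [True] * 8
# A half-brightness pixel switches between "on" and "off" within the frame:
assert sum(mirror_schedule(4)) == 4
```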
The configuration illustrated in
The configuration illustrated in
Critical angle = arcsin(1/n)   (Eqn. 1)
Where the critical angle is the angle beyond which the illumination light is reflected from the internal surface when the internal surface comprises an interface from a solid with a higher refractive index (n) to air with a refractive index of 1 (e.g. for an interface of acrylic, with a refractive index of n=1.5, to air, the critical angle is 41.8 degrees; for an interface of polycarbonate, with a refractive index of n=1.59, to air, the critical angle is 38.9 degrees). Consequently, the TIR wedge 418 is associated with a thin air gap 408 along the internal surface to create an interface between a solid with a higher refractive index and air. By choosing the angle of the light source 404 relative to the DLP 402 in correspondence with the angle of the internal surface of the TIR wedge 418, illumination light is turned toward the DLP 402 at an angle suitable for providing image light 414 as reflected from “on” pixels. The illumination light is provided to the DLP 402 at approximately twice the angle of the pixel mirrors in the DLP 402 that are in the “on” state, such that after reflecting from the pixel mirrors, the image light 414 is directed generally along the optical axis of the field lens. Depending on the state of the DLP pixels, the illumination light from “on” pixels may be reflected as image light 414 which is directed towards a field lens and a lower optical module 204, while illumination light reflected from “off” pixels (generally referred to herein as “dark” state light, “off” pixel light or “off” state light) 410 is directed in a separate direction, where it may be trapped and not used for the image that is ultimately presented to the wearer's eye.
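Eqn. 1 can be checked numerically; the short snippet below reproduces the acrylic figure quoted above (for polycarbonate at n=1.59 the computed value rounds to about 39.0 degrees, marginally above the 38.9 degrees quoted):

```python
import math

def critical_angle_deg(n: float) -> float:
    """Critical angle (degrees) for a solid of index n against air (n=1), per Eqn. 1."""
    return math.degrees(math.asin(1.0 / n))

print(f"acrylic (n=1.50): {critical_angle_deg(1.50):.1f} deg")        # 41.8
print(f"polycarbonate (n=1.59): {critical_angle_deg(1.59):.1f} deg")  # ~39.0 (quoted as 38.9)
```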
The light trap for the dark state light 410 may be located along the optical axis defined by the direction of the dark state light 410 and in the side of the housing, with the function of absorbing the dark state light. To this end, the light trap may be comprised of an area outside of the cone of image light 414 from the “on” pixels. The light trap is typically made up of materials that absorb light, including coatings of black paint or other light absorbing materials, to prevent scattered dark state light from degrading the image perceived by the user. In addition, the light trap may be recessed into the wall of the housing or include masks or guards to block scattered light and prevent the light trap from being viewed adjacent to the displayed image.
The embodiment of
The embodiment illustrated in
The angles of the faces of the wedge set 456 correspond to the angles needed to provide illumination light 452 at the angle required by the DLP mirrors when in the “on” state, so that the reflected image light 414 is reflected from the DLP along the optical axis of the field lens. The wedge set 456 provides an interior interface where a reflective polarizer film can be located to redirect the illumination light 452 toward the mirrors of the DLP 402. The wedge set also provides a matched wedge on the opposite side of the reflective polarizer 450 so that the image light 414 from the “on” pixels exits the wedge set 456 substantially perpendicular to the exit surface, while the dark state light from the “off” pixels 410 exits at an oblique angle to the exit surface. As a result, the image light 414 is substantially unrefracted upon exiting the wedge set 456, while the dark state light from the “off” pixels 410 is substantially refracted upon exiting the wedge set 456 as shown in
By providing a solid transparent matched wedge set, the flatness requirement on the interface is relaxed, because variations in flatness have a negligible effect as long as they are within the cone angle of the illuminating light 452, which can be f/2.2 with a 26 degree cone angle. In a preferred embodiment, the reflective polarizer is bonded between the matched internal surfaces of the wedge set 456 using an optical adhesive so that Fresnel reflections at the interfaces on either side of the reflective polarizer 450 are reduced. The optical adhesive can be matched in refractive index to the material of the wedge set 456, and the pieces of the wedge set 456 can all be made from the same material, such as BK7 glass or cast acrylic. The wedge material can also be selected to have low birefringence to reduce non-uniformities in brightness. The wedge set 456 and the quarter wave film 454 can also be bonded to the DLP 402 to further reduce Fresnel reflection losses at the DLP interface. In addition, since the image light 414 is substantially normal to the exit surface of the wedge set 456, the flatness of that surface is not critical to maintaining the wavefront of the image light 414, so that high image quality can be obtained in the displayed image without requiring very tightly toleranced flatness on the exit surface.
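The quoted cone angle follows from the f-number through the standard geometric relation, full cone angle = 2·arctan(1/(2·f#)); a quick numerical check:

```python
import math

def cone_angle_deg(f_number: float) -> float:
    """Full cone angle (degrees) of a light cone with the given f-number."""
    return 2 * math.degrees(math.atan(1 / (2 * f_number)))

print(f"{cone_angle_deg(2.2):.0f} deg")  # ~26 deg for f/2.2, matching the text
```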
A yet further embodiment of the invention, which is not illustrated, combines the embodiments illustrated in
The combiner 602 may include a holographic pattern, to form a holographic mirror. If a monochrome image is desired, there may be a single wavelength reflection design for the holographic pattern on the surface of the combiner 602. If the intention is to have multiple colors reflected from the surface of the combiner 602, a multiple wavelength holographic mirror may be included on the combiner surface. For example, in a three-color embodiment, where red, green and blue pixels are generated in the image light, the holographic mirror may be reflective to wavelengths substantially matching the wavelengths of the red, green and blue light provided by the light source. This configuration can be used as a wavelength specific mirror where pre-determined wavelengths of light from the image light are reflected to the user's eye. This configuration may also be made such that substantially all other wavelengths in the visible spectrum pass through the combiner element 602 so the user has a substantially clear view of the surroundings when looking through the combiner element 602. The transparency between the user's eye and the surroundings may be approximately 80% when using a combiner that is a holographic mirror. Holographic mirrors can be made using lasers to produce interference patterns in the holographic material of the combiner, where the wavelengths of the lasers correspond to the wavelengths of light that are subsequently reflected by the holographic mirror.
In another embodiment, the combiner element 602 may include a notch mirror comprised of a multilayer coated substrate wherein the coating is designed to substantially reflect the wavelengths of light provided by the light source and substantially transmit the remaining wavelengths in the visible spectrum. For example, in the case where red, green and blue light is provided by the light source to enable full color images to be provided to the user, the notch mirror is a tristimulus notch mirror wherein the multilayer coating is designed to reflect narrow bands of red, green and blue light that are matched to what is provided by the light source, and the remaining visible wavelengths are transmitted through the coating to enable a view of the environment through the combiner. In another example where monochrome images are provided to the user, the notch mirror is designed to reflect a single narrow band of light that is matched to the wavelength range of the light provided by the light source while transmitting the remaining visible wavelengths to enable a see-thru view of the environment. The combiner 602 with the notch mirror would operate, from the user's perspective, in a manner similar to the combiner that includes a holographic pattern on the combiner element 602. The combiner, with the tristimulus notch mirror, would reflect the “on” pixels to the eye because of the match between the reflective wavelengths of the notch mirror and the color of the image light, and the wearer would be able to see the surroundings with high clarity. The transparency between the user's eye and the surroundings may be approximately 80% when using the tristimulus notch mirror. In addition, the image provided by the upper optical module 202 with the notch mirror combiner can provide higher contrast images than the holographic mirror combiner due to less scattering of the imaging light by the combiner.
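The approximately 80% see-through figure is consistent with simple band counting over the visible spectrum. The back-of-envelope check below assumes a ~20 nm reflective band per notch purely for illustration; the text itself specifies only the ~80% result, not band widths:

```python
# Back-of-envelope see-through estimate for a tristimulus notch mirror.
# ASSUMPTION for illustration: each notch reflects a ~20 nm band; the
# patent text does not specify band widths, only the ~80% result.

VISIBLE_NM = 700 - 400   # ~300 nm of visible spectrum
NOTCHES = 3              # red, green and blue reflective bands
NOTCH_WIDTH_NM = 20      # assumed reflective band per notch

transmitted = 1 - NOTCHES * NOTCH_WIDTH_NM / VISIBLE_NM
print(f"estimated see-through transparency: {transmitted:.0%}")  # 80%
```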
Light can escape through the combiner 602 and may produce face glow as the light is generally directed downward onto the cheek of the user. When using a holographic mirror combiner or a tristimulus notch mirror combiner, the escaping light can be trapped to avoid face glow. In embodiments, if the image light is polarized before the combiner, a linear polarizer can be laminated, or otherwise associated, to the combiner, with the transmission axis of the polarizer oriented relative to the polarized image light so that any escaping image light is absorbed by the polarizer. In embodiments, the image light would be polarized to provide S polarized light to the combiner for better reflection. As a result, the linear polarizer on the combiner would be oriented to absorb S polarized light and pass P polarized light. This provides the preferred orientation of polarized sunglasses as well.
If the image light is unpolarized, a microlouvered film such as a privacy filter can be used to absorb the escaping image light while providing the user with a see-thru view of the environment. In this case, the absorbance or transmittance of the microlouvered film depends on the angle of the light: steep angle light is absorbed while light at a shallower angle is transmitted. For this reason, in an embodiment, the combiner with the microlouvered film is angled at greater than 45 degrees to the optical axis of the image light (e.g. the combiner can be oriented at 50 degrees so the image light from the field lens is incident on the combiner at an oblique angle).
While many of the embodiments of the present invention have been referred to as upper and lower modules containing certain optical components, it should be understood that the image light and dark light production and management functions described in connection with the upper module may be arranged to direct light in other directions (e.g. upward, sideward, etc.). In embodiments, it may be preferred to mount the upper module 202 above the wearer's eye, in which case the image light would be directed downward. In other embodiments it may be preferred to produce light from the side of the wearer's eye, or from below the wearer's eye. In addition, the lower optical module is generally configured to deliver the image light to the wearer's eye and allow the wearer to see through the lower optical module, which may be accomplished through a variety of optical components.
Another aspect of the present invention relates to eye imaging. In embodiments, a camera is used in connection with an upper optical module 202 such that the wearer's eye can be imaged using pixels in the “off” state on the DLP.
In embodiments, the eye imaging camera may image the wearer's eye at a moment in time when there are enough “off” pixels to achieve the required eye image resolution. In another embodiment, the eye imaging camera collects eye image information from “off” pixels over time and forms a time lapsed image. In another embodiment, a modified image is presented to the user wherein enough “off” state pixels are included that the camera can obtain the desired resolution and brightness for imaging the wearer's eye, and the eye image capture is synchronized with the presentation of the modified image.
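A minimal sketch of the synchronization logic for the first of these variants follows; the camera interface and the 30% threshold are hypothetical stand-ins, not values from this disclosure:

```python
# Hypothetical sketch: capture an eye image only when the displayed frame
# contains enough "off" pixels for the camera to see the eye through them.

MIN_OFF_FRACTION = 0.3  # illustrative threshold, not from the patent

def maybe_capture_eye(frame_off_mask, camera):
    """frame_off_mask: 2D iterable of booleans, True where a pixel is "off"."""
    flat = [p for row in frame_off_mask for p in row]
    off_fraction = sum(flat) / len(flat)
    if off_fraction >= MIN_OFF_FRACTION:
        # Synchronize the exposure with the frame that has enough "off" pixels.
        return camera.capture()
    return None  # wait for a frame with more "off" pixels
```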
The eye imaging system may be used for security systems. The HWC may not allow access to the HWC or other system if the eye is not recognized (e.g. through eye characteristics including retina or iris characteristics, etc.). The HWC may be used to provide constant security access in some embodiments. For example, the eye security confirmation may be a continuous, near-continuous, real-time, quasi real-time, periodic, etc. process so the wearer is effectively constantly being verified as known. In embodiments, the HWC may be worn and eye security tracked for access to other computer systems.
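The continuous verification behavior could be structured as a periodic loop along the following lines; every interface shown (capture, feature extraction, matching, locking) is a hypothetical placeholder for the corresponding HWC subsystem:

```python
import time

VERIFY_INTERVAL_S = 5.0  # illustrative period for quasi real-time verification

def security_loop(capture_eye_image, extract_features, enrolled_features,
                  match, lock_hwc):
    """Keep verifying the wearer; lock the HWC if the eye is not recognized.
    All callables are hypothetical stand-ins for HWC subsystems."""
    while True:
        image = capture_eye_image()
        if image is None or not match(extract_features(image), enrolled_features):
            lock_hwc()  # deny access until a known eye is verified again
            return
        time.sleep(VERIFY_INTERVAL_S)
```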
The eye imaging system may be used for control of the HWC. For example, a blink, wink, or particular eye movement may be used as a control mechanism for a software application operating on the HWC or associated device.
The eye imaging system may be used in a process that determines how or when the HWC 102 delivers digitally displayed content to the wearer. For example, the eye imaging system may determine that the user is looking in a direction, and then the HWC may change the resolution in an area of the display or provide some content that is associated with something in the environment that the user may be looking at. Alternatively, the eye imaging system may identify different users and change the displayed content or enabled features provided to the user. Users may be identified from a database of users' eye characteristics, either located on the HWC 102 or remotely located on the network 110 or on a server 112. In addition, the HWC may identify a primary user or a group of primary users from eye characteristics, wherein the primary user(s) are provided with an enhanced set of features and all other users are provided with a different set of features. Thus, in this use case, the HWC 102 uses identified eye characteristics to either enable features or not, and eye characteristics need only be analyzed in comparison to a relatively small database of individual eye characteristics.
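The identify-then-enable logic might be sketched as below, with a small enrolled database mapping eye signatures to primary or guest feature sets; the database layout and feature names are illustrative assumptions:

```python
# Hypothetical sketch: map identified eye characteristics to a feature set.
# The enrolled database is small and may live on the HWC 102 itself or be
# fetched from the network 110 / server 112, as described above.

PRIMARY_FEATURES = {"full_display", "camera", "apps", "admin"}
GUEST_FEATURES = {"basic_display"}

def features_for_wearer(eye_signature, enrolled: dict, is_close_match) -> set:
    """enrolled maps user_id -> (stored eye signature, is_primary flag)."""
    for user_id, (stored_sig, is_primary) in enrolled.items():
        if is_close_match(eye_signature, stored_sig):
            return PRIMARY_FEATURES if is_primary else GUEST_FEATURES
    return set()  # unrecognized wearer: no features enabled
```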
Another aspect of the present invention relates to the generation of peripheral image lighting effects for a person wearing a HWC. In embodiments, a solid state lighting system (e.g. LED, OLED, etc.), or other lighting system, may be included inside the optical elements of a lower optical module 204. The solid state lighting system may be arranged such that lighting effects outside of a field of view (FOV) of the presented digital content are presented to create an immersive effect for the person wearing the HWC. To this end, the lighting effects may be presented to any portion of the HWC that is visible to the wearer. The solid state lighting system may be digitally controlled by an integrated processor on the HWC. In embodiments, the integrated processor will control the lighting effects in coordination with digital content that is presented within the FOV of the HWC. For example, a movie, picture, game, or other content, may be displayed or playing within the FOV of the HWC. The content may show a bomb blast on the right side of the FOV and at the same moment, the solid state lighting system inside of the lower module optics may flash quickly in concert with the FOV image effect. The effect need not be fast; it may be more persistent, to indicate, for example, a general glow or color on one side of the user. The solid state lighting system may be color controlled, with red, green and blue LEDs, for example, such that color control can be coordinated with the digitally presented content within the field of view.
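One plausible way to coordinate such lighting with FOV content is to sample the edges of each rendered frame and drive the LEDs on the corresponding side with the average edge color, in the manner of ambient TV backlighting; the LED interface below is a hypothetical stand-in:

```python
# Hypothetical sketch: set peripheral LEDs from the average color of the
# corresponding edge of the frame rendered in the FOV.

def edge_average(frame, side: str, band: int = 10):
    """frame: list of rows of (r, g, b) tuples; average a `band`-pixel edge strip."""
    if side == "left":
        pixels = [px for row in frame for px in row[:band]]
    else:  # "right"
        pixels = [px for row in frame for px in row[-band:]]
    n = len(pixels)
    return tuple(sum(px[i] for px in pixels) // n for i in range(3))

def update_peripheral_lighting(frame, set_led_color):
    # A bright effect on the right of the FOV drives the right-side LEDs, etc.
    set_led_color("left", edge_average(frame, "left"))
    set_led_color("right", edge_average(frame, "right"))
```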
In the embodiment illustrated in
Another aspect of the present invention relates to the mitigation of light escaping from the space between the wearer's face and the HWC itself. Another aspect of the present invention relates to maintaining a controlled lighting environment in proximity to the wearer's eyes. In embodiments, both the maintenance of the lighting environment and the mitigation of light escape are accomplished by including a removable and replaceable flexible shield for the HWC. The removable and replaceable shield can be provided for one eye or both eyes, in correspondence to the use of the displays for each eye. For example, in a night vision application, the display to only one eye could be used for night vision while the display to the other eye is turned off to provide good see-thru when moving between areas where visible light is available and dark areas where night vision enhancement is needed.
In embodiments, an opaque front light shield 1412 may be included and the digital content may include images of the surrounding environment such that the wearer can visualize the surrounding environment. One eye may be presented with night vision environmental imagery and this eye's surrounding environment optical path may be covered using an opaque front light shield 1412. In other embodiments, this arrangement may be associated with both eyes.
Another aspect of the present invention relates to automatically configuring the lighting system(s) used in the HWC 102. In embodiments, the display lighting and/or effects lighting, as described herein, may be controlled in a manner suitable for when an eye cover 1408 is attached to or removed from the HWC 102. For example, at night, when the light in the environment is low, the lighting system(s) in the HWC may go into a low light mode to further control the amount of stray light escaping from the HWC and the areas around the HWC. Covert operations at night, while using night vision or standard vision, may require a solution which prevents as much escaping light as possible, so a user may clip on the eye cover(s) 1408 and then the HWC may go into a low light mode. In some embodiments, the HWC may enter the low light mode when the eye cover 1408 is attached only if the HWC identifies that the environment is in low light conditions (e.g. through environmental light level sensor detection). In embodiments, the low light level may be determined to be at an intermediate point between full and low light, dependent on environmental conditions.
Another aspect of the present invention relates to automatically controlling the type of content displayed in the HWC when eye covers 1408 are attached to or removed from the HWC. In embodiments, when the eye cover(s) 1408 is attached to the HWC, the displayed content may be restricted in amount or in color content. For example, the display(s) may go into a simple content delivery mode to restrict the amount of information displayed. This may be done to reduce the amount of light produced by the display(s). In an embodiment, the display(s) may change from color displays to monochrome displays to reduce the amount of light produced. In an embodiment, the monochrome lighting may be red to limit the impact on the wearer's eyes and maintain an ability to see better in the dark.
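A sketch combining the two eye-cover-driven behaviors above (lighting mode and content mode) follows; the attachment detection, the lux threshold and the mode names are illustrative assumptions:

```python
# Hypothetical sketch of eye-cover-driven mode control, combining the low
# light mode and the restricted/monochrome content mode described above.

LOW_LIGHT_LUX = 50  # illustrative threshold for "low light conditions"

def select_display_mode(eye_cover_attached: bool, ambient_lux: float) -> dict:
    if eye_cover_attached and ambient_lux < LOW_LIGHT_LUX:
        return {
            "lighting": "low",      # minimize stray/escaping light
            "content": "simple",    # restrict the amount displayed
            "palette": "mono_red",  # red monochrome preserves dark adaptation
        }
    if eye_cover_attached:
        return {"lighting": "medium", "content": "simple", "palette": "color"}
    return {"lighting": "auto", "content": "full", "palette": "color"}
```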
Although embodiments of HWC have been described in language specific to features, systems, computer processes and/or methods, the appended claims are not necessarily limited to the specific features, systems, computer processes and/or methods described. Rather, the specific features, systems, computer processes and/or methods are disclosed as non-limited example implementations of HWC. All documents referenced herein are hereby incorporated by reference.
Claims
1. A head-worn computer, comprising:
- a. a housing including a mounting platform adapted to hold a see-through optics facility in front of an eye of a person when the person is wearing the head-worn computer, wherein the see-through optics facility also functions as a computer display for the person;
- b. an image light production facility, supported by the mounting platform, including a mirror support platform including a plurality of multi-positional mirrors on a front surface of the platform;
- c. a lighting facility positioned adjacent to the mirror support platform and adapted to produce a cone of illumination light along an optical axis directed away from the multi-positional mirrors and towards a substantially flat partially reflective surface, wherein the substantially flat partially reflective surface is angled to reflect the illumination light such that the multi-positional mirrors are substantially uniformly illuminated;
- d. wherein each of the multi-positional mirrors has a first state positioned to reflect a first portion of the illumination light, forming image light, on an optical axis in-line with the see-through optics facility, and a second state positioned to reflect a second portion of the illumination light, forming dark light, on an optical axis off-line with the see-through optics facility and in-line with a light absorption facility;
- e. wherein the image light and the dark light are substantially transmitted by the partially reflective surface;
- f. the light absorption facility is positioned to terminate the off-line optical axis within the housing and adapted to prevent dark light from being reflected to the eye of the person; and
- g. a solid optic that reflects a substantial portion of the illumination light, and transmits a substantial portion of the image light and the dark light;
- wherein the solid optic includes a corrective wedge with an exit surface that is oriented perpendicular to the in-line optical axis, so the image light is substantially unrefracted and the dark light is refracted away from the optical axis in-line with the see-through optics facility.
2. The head-worn computer of claim 1, further comprising a redirection wedge to redirect the dark light to a position in proximity with a side of the mirror support platform.
3. The head-worn computer of claim 2 wherein the redirection wedge includes a thin air gap to provide total internal reflection of the dark light.
4. A head-worn computer, comprising:
- a. a housing including a mounting platform adapted to hold a see-through optics facility in front of an eye of a person when the person is wearing the head-worn computer, wherein the see-through optics facility also functions as a computer display for the person;
- b. an image light production facility, supported by the mounting platform, including a mirror support platform including a plurality of multi-positional mirrors on a front surface of the platform;
- c. a lighting facility positioned adjacent to the mirror support platform and adapted to produce a cone of illumination light along an optical axis directed away from the multi-positional mirrors and towards a substantially flat partially reflective surface, wherein the substantially flat partially reflective surface is angled to reflect the illumination light such that the multi-positional mirrors are substantially uniformly illuminated;
- d. wherein each of the multi-positional mirrors has a first state positioned to reflect a first portion of the illumination light, forming image light, on an optical axis in-line with the see-through optics facility, and a second state positioned to reflect a second portion of the illumination light, forming dark light, on an optical axis off-line with the see-through optics facility and in-line with a light absorption facility;
- e. wherein the image light and the dark light are substantially transmitted by the partially reflective surface; and
- f. the light absorption facility is positioned to terminate the off-line optical axis within the housing and adapted to prevent dark light from being reflected to the eye of the person and wherein the light absorption facility comprises an absorptive light trap at the end of a tunnel.
Type: Grant
Filed: Mar 11, 2014
Date of Patent: Nov 24, 2015
Patent Publication Number: 20150205108
Assignee: Osterhout Group, Inc. (San Francisco, CA)
Inventors: John N. Border (Eaton, NH), John D. Haddick (Mill Valley, CA)
Primary Examiner: Claire X Pappas
Assistant Examiner: Robert Stone
Application Number: 14/205,267
International Classification: G02B 27/01 (20060101); G02B 26/08 (20060101); G02B 27/00 (20060101); G02B 5/30 (20060101);