Proximity Screen Display and User Interface

- Broadcom Corporation

A communication device is disclosed that includes a proximity screen display that includes one or more imaging elements that are configured and arranged around a periphery of a display area of the proximity screen display and/or integrated within the display area. The one or more integrated imaging elements are configured and arranged to sense light in their field of view. The one or more integrated imaging elements provide one or more various sensing signals whose magnitudes depend upon an amount of light sensed in their field of view. The communication device can adjust various parameters of video data and/or image data that is being displayed by the display area in response to the various sensing signals.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims the benefit of U.S. Provisional Patent Appl. No. 61/549,495, filed Oct. 20, 2011, which is incorporated herein by reference in its entirety.

BACKGROUND

1. Field of Disclosure

The present disclosure relates generally to a proximity screen display for use in a communication device, and more specifically to integration of integrated imaging elements within the proximity screen display for use in adjusting various parameters of video data and/or image data that is being displayed by a display area of the proximity screen display.

2. Related Art

A conventional communication device includes a conventional touch screen display, such as resistive, surface acoustic wave, capacitive, infrared, optical imaging, dispersive signal technology, or acoustic pulse recognition touch screen display to provide some examples, which operates as an interface between the communication device and a user of the communication device. The conventional touch screen display operates as an output device for displaying image and/or video data for the user. Additionally, the conventional touch screen display operates as an input device for receiving command data, control data, and/or other data from the user of the communication device. Most often, the conventional touch screen display includes an integrated virtual keyboard, also referred to as an on-screen integrated virtual keyboard, for receiving the command data, control data, and/or other data from the user of the communication device.

The continued evolution of silicon semiconductor fabrication technologies has reduced a size of the conventional communication device as well as a size of the conventional touch screen display and its integrated virtual keyboard. As a result, the alphanumeric keys of the integrated virtual keyboard have also decreased in size, thereby making use of the integrated virtual keyboard more difficult. For example, users with larger hands can have difficulty in selecting from among the alphanumeric keys, leading to erroneous keys being selected. As another example, the conventional communication device may not be properly oriented when the integrated virtual keyboard is in use, leading to erroneous keys being selected. In this other example, a user of the conventional communication device oftentimes holds the conventional communication device at an angle, which leads to erroneous keys being selected.

Manufacturers have developed various error correction techniques within the conventional communication device to correct for errors that result from erroneous keys being selected. These error correction techniques conventionally include sophisticated algorithms to interpolate a word or phrase from an erroneous word or phrase that was entered on the integrated virtual keyboard.

BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES

Embodiments of the disclosure are described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the leftmost digit(s) of a reference number identifies the drawing in which the reference number first appears.

FIG. 1 illustrates a block diagram of an exemplary communication device according to an exemplary embodiment of the present disclosure;

FIG. 2A illustrates a first block diagram of an exemplary configuration and arrangement of perimeter imaging elements surrounding a screen to form a proximity screen display of the communication device according to an exemplary embodiment of the present disclosure;

FIG. 2B illustrates a second block diagram of an exemplary configuration and arrangement of integrated imaging elements within the communication device according to an exemplary embodiment of the present disclosure;

FIG. 3A illustrates a first exemplary integration of the imaging elements within a screen of the communication device according to an exemplary embodiment of the present disclosure;

FIG. 3B illustrates a second exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure;

FIG. 3C illustrates a third exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure;

FIG. 4 illustrates an integrated imaging element within the communication device according to an exemplary embodiment of the present disclosure;

FIG. 5A illustrates a single pixel element of a flexible organic light-emitting diode screen display according to an exemplary embodiment of the present disclosure;

FIG. 5B illustrates a single pixel element of a flexible organic light-emitting diode proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure;

FIG. 6A illustrates a single pixel element of an electronic paper, e-paper, or electronic ink screen display according to an exemplary embodiment of the present disclosure;

FIG. 6B illustrates a single pixel element of an electronic paper, e-paper, or electronic ink proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure;

FIG. 7A illustrates a single pixel element of a liquid crystal screen display according to an exemplary embodiment of the present disclosure;

FIG. 7B illustrates a single pixel element of a liquid crystal proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure;

FIG. 8 illustrates an exemplary proximity screen display and proximity screen display interface that can be implemented within the communication device according to an exemplary embodiment of the present disclosure;

FIG. 9 is a first flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure; and

FIG. 10 is a second flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure.

The disclosure will now be described with reference to the accompanying drawings. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. The drawing in which an element first appears is indicated by the leftmost digit(s) in the reference number.

DETAILED DESCRIPTION OF THE DISCLOSURE

The following Detailed Description refers to accompanying drawings to illustrate exemplary embodiments consistent with the disclosure. References in the Detailed Description to “one exemplary embodiment,” “an exemplary embodiment,” “an example exemplary embodiment,” etc., indicate that the exemplary embodiment described can include a particular feature, structure, or characteristic, but every exemplary embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same exemplary embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an exemplary embodiment, it is within the knowledge of those skilled in the relevant art(s) to effect such feature, structure, or characteristic in connection with other exemplary embodiments whether or not explicitly described.

The exemplary embodiments described herein are provided for illustrative purposes, and are not limiting. Other exemplary embodiments are possible, and modifications can be made to the exemplary embodiments within the spirit and scope of the disclosure. Therefore, the Detailed Description is not meant to limit the disclosure. Rather, the scope of the disclosure is defined only in accordance with the following claims and their equivalents.

Embodiments of the disclosure can be implemented in hardware, firmware, software, or any combination thereof. Embodiments of the disclosure can also be implemented as instructions stored on a machine-readable medium, which can be read and executed by one or more processors. A machine-readable medium can include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device). For example, a machine-readable medium can include non-transitory machine-readable mediums such as read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; and others. As another example, the machine-readable medium can include transitory machine-readable medium such as electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Further, firmware, software, routines, instructions can be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc.

The following Detailed Description of the exemplary embodiments will so fully reveal the general nature of the disclosure that others can, by applying knowledge of those skilled in relevant art(s), readily modify and/or adapt for various applications such exemplary embodiments, without undue experimentation, without departing from the spirit and scope of the disclosure. Therefore, such adaptations and modifications are intended to be within the meaning and plurality of equivalents of the exemplary embodiments based upon the teaching and guidance presented herein. It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by those skilled in relevant art(s) in light of the teachings herein.

For purposes of this discussion, the term “module” shall be understood to include at least one of software, firmware, and hardware (such as one or more circuits, microchips, or devices, or any combination thereof), and any combination thereof. In addition, it will be understood that each module can include one, or more than one, component within an actual device, and each component that forms a part of the described module can function either cooperatively or independently of any other component forming a part of the module. Conversely, multiple modules described herein can represent a single component within an actual device. Further, components within a module can be in a single device or distributed among multiple devices in a wired or wireless manner.

Overview

The following Detailed Description describes a communication device that includes a proximity screen display that includes one or more imaging elements that are configured and arranged around a periphery of a display area of the proximity screen display and/or integrated within the display area. The one or more integrated imaging elements are configured and arranged to sense light in their field of view. The one or more integrated imaging elements provide one or more various sensing signals whose magnitudes depend upon an amount of light sensed in their field of view. The communication device can adjust various parameters of video data and/or image data that is being displayed by the display area in response to the various sensing signals.

Exemplary Communication Device

FIG. 1 illustrates a block diagram of an exemplary communication device according to an exemplary embodiment of the present disclosure. A communication device 100 communicates information, such as audio data, video data, image data, command data, control data and/or other data to provide some examples, between a near-end user and a far-end user over various wired and/or wireless communication networks. The communication device 100 can represent a mobile communication device, such as a cellular phone or a smartphone, a mobile computing device, such as a tablet computer or a laptop computer, or any other electronic device that is capable of communicating information over communication networks that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. The communication device 100 can communicate information that is received from the far-end user, as well as information that is generated by the communication device 100, to the near-end user using a proximity screen display. Additionally, the near-end user can communicate information to the far-end user, as well as information to the communication device 100, using the proximity screen display. The communication device 100 includes a communication module 102, a host processor 104, a proximity screen display 106, a proximity screen display interface 108, and a communication interface 110.

The communication module 102 can include a Bluetooth module, a Global Position System (GPS) module, a cellular module, a wireless local area network (WLAN) module, a near field communication (NFC) module, a radio frequency identification (RFID) module and/or a wireless power transfer (WPT) module. The Bluetooth module, the cellular module, the WLAN module, the NFC module, and the RFID module provide wireless communication between the communication device 100 and other Bluetooth, other cellular, other WLAN, other NFC, and other RFID capable communication devices, respectively, in accordance with various communication standards or protocols. These various communication standards or protocols can include various cellular communication standards such as a third Generation Partnership Project (3GPP) Long Term Evolution (LTE) communications standard, a fourth generation (4G) mobile communications standard, or a third generation (3G) mobile communications standard, various networking protocols such as a Worldwide Interoperability for Microwave Access (WiMAX) communications standard or a Wi-Fi communications standard, and various NFC/RFID communications protocols such as ISO 1422, ISO/IEC 14443, ISO/IEC 15693, ISO/IEC 18000, or FeliCa to provide some examples. The GPS module receives various signals from various satellites to determine location information for the communication device 100. The WPT module supports wireless transmission of power between the communication device 100 and another WPT capable communication device.

The host processor 104 controls overall operation and/or configuration of the communication device 100. The host processor 104 can receive and/or process information from a user interface such as an alphanumeric keypad, a microphone, a mouse, a speaker, and/or from other electrical devices or host devices that are coupled to the communication device 100. The host processor 104 can provide this information to the communication module 102 and/or the proximity screen display interface 108. Additionally, the host processor 104 can receive and/or process information from the communication module 102 and/or the proximity screen display interface 108. The host processor 104 can provide this information to the user interface, to other electrical devices or host devices, and/or to the communication module 102 and/or the proximity screen display interface 108. Further, the host processor 104 can execute one or more applications such as Short Message Service (SMS) for text messaging, electronic mailing, and/or audio and/or video recording to provide some examples, and/or software applications such as a calendar and/or a phone book to provide some examples.

The proximity screen display 106 represents an electronic visual display that can detect a presence and/or a location of a touch that is proximate to its display area. The proximity screen display 106 includes a display area to provide information from the proximity screen display interface 108 to the near-end user. Additionally, the proximity screen display 106 includes one or more integrated imaging elements that are integrated within and/or approximate to the display area to provide information from the near-end user to the proximity screen display interface 108. The one or more integrated imaging elements are configured and arranged to detect a presence and/or a location of a touch from the near-end user. The touch can represent a physical touching of the display area by the near-end user and/or by other passive objects available to the near-end user, such as a stylus to provide an example or proximity of the near-end user and/or the other passive objects to the display area.

The proximity screen display interface 108 can be configured via user setup and/or via an application programming interface (API) to respond to certain objects and classes of objects, e.g., a finger, fingers, hands, stylus, pencil, etc. Via user setup, any object within a user's grasp can be held in front of the proximity screen display 106 for imaging and thereafter can be used as the primary means of interacting through the proximity screen display interface 108. This setup can apply to all applications and operating system interactions on the communication device 100, or it can apply to a single application or to only the operating system. Similarly, a user wearing gloves can easily train the proximity screen display interface 108 to recognize certain gloved hand and finger movements as the input via a similar setup process. Thereafter, any time the proximity screen display interface 108 recognizes a trained or default input element, a pop-up window can appear on the proximity screen display 106 to prompt the user to accept an automatic input device setup, which can be made without having to retrain.

Moreover, although full contact of a finger, hand, gloved hand, stylus, or other input object with the surface of the proximity screen display interface 108 may be required, it need not be. Through setup and the software programming interface, a particular input element can be defined to have a working range. For example, full contact can be required for a pointer finger arrangement for one application. In another application, and for perhaps a stylus, passing within two (2) centimeters of the screen surface within a defined speed range will be characterized as contact. Similarly, a double click tapping motion carried out within ten (10) centimeters of the screen surface, even without actual contact, will be recognized and applied as a full contact double click.
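The per-element working-range behavior described above can be sketched as a small classifier. This is a minimal illustration only; the names (`WorkingRange`, `is_contact`) and the units are assumptions, and an actual implementation would be driven by the setup process and API described above.

```python
from dataclasses import dataclass


@dataclass
class WorkingRange:
    """Hypothetical per-input-element range settings (names are illustrative)."""
    max_distance_cm: float               # how far from the screen still counts as contact
    min_speed_cm_s: float = 0.0          # defined speed range, lower bound
    max_speed_cm_s: float = float("inf")  # defined speed range, upper bound


def is_contact(distance_cm: float, speed_cm_s: float, rng: WorkingRange) -> bool:
    """Treat a non-touching approach as contact when it falls inside the
    input element's configured working range."""
    return (distance_cm <= rng.max_distance_cm
            and rng.min_speed_cm_s <= speed_cm_s <= rng.max_speed_cm_s)


# Pointer finger for one application: only actual contact counts.
finger = WorkingRange(max_distance_cm=0.0)
# Stylus for another application: within 2 cm at a defined speed range counts.
stylus = WorkingRange(max_distance_cm=2.0, min_speed_cm_s=1.0, max_speed_cm_s=20.0)
```

In this sketch, the ten-centimeter double-click behavior would simply be another `WorkingRange` bound to the double-click gesture rather than to single key strikes.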

A full set of default behaviors for a default set of input types and classes may be preloaded. Adjustments to the underlying settings, whether preloaded or not, can be made over time based on actual interactions. For example, a first attempt at a particular input element motion that is slightly outside a working range, or slightly outside of what has been defined, may not be recognized; a re-attempt, though, may be recognized. Once the first attempt is effectively recognized via the subsequent re-attempt, modifications and adjustments can be made so that subsequent interactions similar to the first attempt will be recognized.

For example, conventional touch screens typically found in tablet and smartphone devices do not support touch typing. The proximity screen display interface 108 supports such interactions by “looking” at the fingers and finger motions of one or both sets of fingers being placed on the screen surface. Such looking involves repeated capture of images to form a sequence of images which together comprise video. By analyzing the sequence in real time, touch typing motions can be recognized such as (i) movement of the fingers away from and toward the screen (change in size indicating distance away), (ii) changes in finger shape (which indicate a pressure associated with a keystroke), (iii) lighting characteristics and shading, (iv) movement velocities/accelerations, (v) relocation, and (vi) rates of change of all the above.
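As a rough illustration of cue (i) above, the change in a fingertip's apparent size across consecutive captured frames can indicate motion toward or away from the screen. The helper below is a hypothetical sketch under that one cue only, not the disclosed method:

```python
def approach_direction(sizes_px):
    """Infer finger motion relative to the screen from apparent fingertip
    sizes (in pixels) over consecutive frames: a growing apparent size
    indicates the finger is approaching the screen."""
    if len(sizes_px) < 2:
        return "unknown"
    delta = sizes_px[-1] - sizes_px[0]
    if delta > 0:
        return "toward"
    if delta < 0:
        return "away"
    return "steady"
```

A real interface would fuse this with the other cues listed above (shape change, shading, velocities, and their rates of change) rather than rely on size alone.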

This touch typing mode can be activated when a user decides to begin typing by merely bringing their finger set in a typing configuration toward the screen. By the time the fingers reach the screen, a keypad will be configured to fit appropriately thereunder without forcing the user to find finger to key placement. Thus, a user can begin typing without looking at the keys to make sure the fingers are maintaining their alignment. Instead, the keys will automatically be aligned, sized and positioned to fit the user. With this configuration, small or large hands with a more natural finger positioning can be easily accommodated. The touch typing mode may also be activated by a software application, user interaction with a field that requires typing input, or by any other gesture input (whether full contact or not) and via voice recognized commands.
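The automatic alignment of keys to the user's approaching fingers might be sketched as follows. The function name, the home-row key labels, and the pixel coordinates are illustrative assumptions, not part of the disclosure:

```python
def fit_home_row(fingertips, key_labels=("a", "s", "d", "f")):
    """Center a key under each detected fingertip (x, y) so the user's
    natural hand position defines key placement; key width is taken from
    the median spacing between neighboring fingertips."""
    pts = sorted(fingertips)                      # left to right by x
    xs = [p[0] for p in pts]
    gaps = [b - a for a, b in zip(xs, xs[1:])] or [40.0]
    width = sorted(gaps)[len(gaps) // 2]          # median fingertip gap
    return {label: {"center": (x, y), "width": width}
            for label, (x, y) in zip(key_labels, pts)}
```

With this kind of fitting, small or large hands, and a more natural (non-horizontal) finger positioning, are accommodated because each key is placed where that user's finger already rests.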

Of course, instead of touch typing, a user may select a thumb typing configuration or pointer finger input mode as an overall default or on an application by application basis. Thumb typing may be represented by one or more thumb typing modes. Some of these modes might be tailored to interact using a more traditional full contact mode with rather fixed key offerings. Other thumb typing modes may take advantage of the non-contact and user-specific-tailoring aspects of the present disclosure. For example, thumb typing with or without gloves, small or big thumbs, short or long thumbs, and short thumb range in x, y, z directions or regions can all be taken into account in adjusting and tailoring an effective interface for a particular user. Similarly, an elevated, non-contact typing mode (i.e., a “hovering non-contact typing mode”) may recognize fingers with non-contact typing motions. A hovering contact typing mode may also be selected wherein a finger needs full contact to be recognized as a key depression. No matter what typing mode is selected, the key recognition range and associated finger behaviors can be accounted for dynamically to support a given user and user's situation.

Spelling and grammar tools running within underlying software (operating system or application) on the host processor 104 (or any other dedicated processing circuitry, perhaps within the proximity screen display interface 108) assist in such dynamic tailoring no matter what the typing mode happens to be. For example, while in a hovering contact typing mode, spelling and grammatical mistakes can be recognized from stored finger motions. Some users often strike an “a” instead of a “q” after typing a “w.” Likewise, some users may hold their fingers in a hovering and striking position in a configuration somewhat off of a normal horizontal alignment, leading to other typing errors. Other finger motions might not be identified, as the finger range of motion may be impaired for yet another user. By analysis of mistakes over time, as identified with spelling and grammar tools and unsolicited user corrections, along with evaluation of both the strokes and hand positions associated with the mistakes and the spatial characteristics of a keyboard layout, a comfortable, effective, and accurate typing input interface can be established for each user.
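One simple way to realize such mistake-driven tailoring is to shift a key's effective hit-region center toward where a given user's finger actually lands when that key is intended, using strikes that the spelling tools later corrected. The sketch below is hypothetical; the function name and the fixed blending weight are assumptions:

```python
def adapt_key_center(intended_center, corrected_strikes, weight=0.2):
    """Blend a key's hit-region center toward the mean landing point of
    strikes that spelling/grammar tools (or the user) later corrected to
    this key, so future strikes by this user register correctly."""
    if not corrected_strikes:
        return intended_center
    mx = sum(p[0] for p in corrected_strikes) / len(corrected_strikes)
    my = sum(p[1] for p in corrected_strikes) / len(corrected_strikes)
    x, y = intended_center
    return (x + weight * (mx - x), y + weight * (my - y))
```

Applied to the “a”-for-“q” example above, the “q” hit region would drift slightly toward the “a” position for that user, while other users' layouts remain unchanged.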

Other types of full, partial and no-contact user interfacing can also be established through the proximity screen display interface 108. For example, non-contact gestures such as placing all fingertips together then opening the hand to reveal a palm might be recognized to perform a function such as returning to a desktop. Mapping of such and any other gestures, whether involving some aspect of contact or not, to a particular function on the communication device 100 can be managed via default offerings and in a training based setup fashion. Such gestures can be made with any object or body part and can be married with other forms of user input. An example of such a marriage is a gesture plus a detected voice command (carried out simultaneously or in a sequence) being used to trigger performance of a function.

Although illustrated as solely an image capture based interface, the proximity screen display interface 108 and associated screen may be fitted with full-contact-only touch screen technology. With such a configuration, both the visual recognition aspects and full contact detection approaches can be used together in a reinforcing way, or be selected as operational alternatives wherein only one may be powered down or up per user or application software command.

There are various ways for capturing images and video sequences in proximity of the proximity screen display 106, such as those discussed in more detail below. No matter what approach is used, because the image and video output is not intended for consumption by a human eye, capture and storage approaches may follow an entirely different set of goals and requirements. For example, recognizing a finger and finger position accurately may not require full HD (High Definition) color, contrast and resolution. Similarly, the image capture rate need not be fixed nor occur at typical film frame rates. Instead, for each particular design embodiment, the goals are much simpler and can be serviced with perhaps lower cost image processing elements, although more complex and demanding requirements could still be met (especially when high speed gesture and motion detection is needed). Likewise, storage may take the form of “stick-man” representations of body parts or other input elements. For example, in a compressed form, fingers and hands might be replaced with a data structure representing each joint in each hand along with related motion and position data. Within such or another data structure, fingertip contact might be represented by an estimated contact time, pressure and duration. Thus, by best-fit correlation of hand motion data to a target set of known motions or gestures, a user's input behaviors can be identified.

During training and when attempting to correct for identified mistakes, such data can be characterized for a particular user as matching a particular input target. At perhaps ten (10) frames per second capture in grey-scale, and with a conversion to a data structure of a pivot point, length between pivot points (e.g., joints), and so on, such user input capture, whether or not full contact is made, may be sufficient for some embodiments. If, however, the image capture quality of the proximity screen display 106 is high enough, the full screen or portions thereof may be used for capturing images and video that is sufficient for consumption by the human eye. In such embodiments, the imaging capabilities of the screen serve dual purposes (imaging for the eye and imaging for user input interfacing).
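The compressed “stick-man” storage and best-fit correlation described above might look like the following sketch. The data-structure fields and gesture names are assumptions for illustration; an actual embodiment could store motion and pressure data alongside these geometric samples:

```python
from dataclasses import dataclass


@dataclass
class FingerJoint:
    """One compressed joint sample: pivot angle plus the segment length
    between this pivot point and the next (field names are illustrative)."""
    angle_deg: float
    length_mm: float


def hand_distance(sample, template):
    """Sum of squared differences between a captured joint sequence and a
    stored gesture template, used for best-fit correlation."""
    return sum((a.angle_deg - b.angle_deg) ** 2 +
               (a.length_mm - b.length_mm) ** 2
               for a, b in zip(sample, template))


def best_fit_gesture(sample, templates):
    """Return the name of the known gesture whose template best matches
    the captured hand data."""
    return min(templates, key=lambda name: hand_distance(sample, templates[name]))
```

Because only a handful of angles and lengths are stored per frame, this representation is far cheaper than retaining full grey-scale images while still supporting recognition of a user's input behaviors.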

The one or more integrated imaging elements are configured and arranged to sense light in their field of view. The one or more integrated imaging elements provide one or more various sensing signals whose magnitudes depend upon an amount of light sensed in their field of view to the proximity screen display interface 108. For example, the sensing signals have a first magnitude when the one or more integrated imaging elements are exposed to a bright light, a second magnitude when the one or more integrated imaging elements are exposed to a dark light, and a third magnitude that varies between the first magnitude and the second magnitude as the amount of light sensed by the one or more integrated imaging elements varies between the bright light and the dark light.

The proximity screen display interface 108 communicates information between the communication module 102, the host processor 104, and the proximity screen display 106. The proximity screen display interface 108 provides various control signals to the proximity screen display 106 for configuration of its display area to display the information. For example, the proximity screen display interface 108 can provide various control signals to the proximity screen display 106 for configuration of its display area to display image data or video data received from the communication module 102 and/or the host processor 104. Additionally, the proximity screen display interface 108 can interpret the various sensing signals provided by the proximity screen display 106 to determine the presence and/or the location of the near-end user and/or the other passive objects. For example, the proximity screen display interface 108 can interpolate, from the magnitudes of the various sensing signals, an image of an environment surrounding the display area to determine the presence and/or the location of the near-end user and/or the other passive objects. As another example, the proximity screen display interface 108 can compare various images of the environment surrounding the display area at different instances in time to determine movement of the near-end user and/or the other passive objects. Further, the proximity screen display interface 108 can recognize specific portions of the object, such as one or more fingers of a hand of the near-end user to provide an example, from one or more images of the environment surrounding the display area. The proximity screen display interface 108 can assign various control and/or command data to different specific portions of the object and provide the respective control and/or command data to the communication module 102 and/or the host processor 104 when a respective specific portion of the object has been recognized. Yet further, the proximity screen display interface 108 can adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the information provided by the communication module 102 and/or the host processor 104 in response to the various sensing signals provided by the proximity screen display 106. For example, the proximity screen display interface 108 can adjust the zoom, resolution, pitch, roll, and/or yaw of image data, such as an image of an integrated virtual keyboard to provide an example, or video data to align the image data or video data with the movement of the near-end user and/or the other passive objects.
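The frame-comparison step, in which images interpolated from the sensing-signal magnitudes at different instants are compared to detect movement, can be sketched as below. The row-of-rows image layout, the normalization of magnitudes to a 0..1 range, and the change threshold are all assumptions for illustration:

```python
def frames_differ(prev, curr, threshold=0.1):
    """Compare two sensing-signal 'images' (rows of per-element magnitudes,
    each assumed normalized to 0..1) captured at different instances in
    time, and report which elements saw a significant change in sensed
    light, indicating movement in their field of view."""
    changed = []
    for r, (prev_row, curr_row) in enumerate(zip(prev, curr)):
        for c, (p, v) in enumerate(zip(prev_row, curr_row)):
            if abs(v - p) > threshold:
                changed.append((r, c))
    return changed
```

The interface could then cluster the changed element positions to track an approaching finger or stylus across successive captures.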

The communication interface 110 routes various communications between the communications module 102, the host processor 104, and the proximity screen display interface 108. These communications can include various digital signals, such as one or more commands and/or data to provide some examples, various analog signals, such as direct current (DC) currents and/or voltages to provide some examples, or any combination thereof. The communication interface 110 can be implemented as a series of wired and/or wireless interconnections between the communications module 102, the host processor 104, and the proximity screen display interface 108. The interconnections of the communication interface 110 can be arranged to form a parallel interface to route communications between the communications module 102, the host processor 104, and the proximity screen display interface 108 in parallel, a serial interface to route communications between the communications module 102, the host processor 104, and the proximity screen display interface 108 in series, or any combination thereof.

Exemplary Configurations and Arrangements of Integrated Imaging Elements Within the Exemplary Communication Device

As discussed above, the one or more integrated imaging elements sense changes in light resulting from the movement of the near-end user and/or the other passive objects in their field of view. The exemplary configurations and arrangements of the one or more integrated imaging elements to be discussed below are for illustrative purposes only. Those skilled in the relevant art(s) will recognize that other configurations and arrangements of the one or more integrated imaging elements are possible without departing from the spirit and scope of the present disclosure.

The various proximity screen displays and associated assemblies described herein can include any suitable number of integrated imaging elements, ranging from a single integrated imaging element up to many thousands of integrated imaging elements; even larger numbers are possible as will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. For example, one or more imaging elements can be positioned outside the perimeter of the screen surface. Other embodiments may use imaging elements integrated into the screen itself, while yet others position the imaging elements on top of or behind the screen. No matter what the configuration of a particular embodiment, the resulting proximity screen display and associated processing hardware and software act in concert to provide user output and user input interfacing in a common, spatially mapped interface arrangement.

FIG. 2A illustrates a first block diagram of an exemplary configuration and arrangement of perimeter imaging elements surrounding a screen to form a proximity screen display of the communication device according to an exemplary embodiment of the present disclosure. A proximity screen display 200 includes integrated imaging elements 202.1 through 202.i that are configured and arranged around a periphery of a display area 204. Although the integrated imaging elements 202.1 through 202.i are illustrated as being configured and arranged around the periphery of the display area 204 in a uniform manner, this is for illustrative purposes only. Those skilled in the relevant art(s) will recognize that the integrated imaging elements 202.1 through 202.i can be configured and arranged around the periphery of the display area 204 in any suitable manner without departing from the spirit and scope of the present disclosure. The proximity screen display 200 can represent an exemplary embodiment of the proximity screen display 106.

As shown in FIG. 2A, the integrated imaging elements 202.1 through 202.i can be configured and arranged around the periphery of the display area 204 in a uniform manner to form rows 206 and/or columns 208 of the integrated imaging elements 202.1 through 202.i. The integrated imaging elements 202.1 through 202.i and the display area 204 can be formed onto a common chip or die or separate chips or dies. Additionally, the integrated imaging elements 202.1 through 202.i and the display area 204 can be integrated within a mechanical housing of a communication device, such as the communication device 100. The mechanical housing can include various openings or cutouts to accommodate the integrated imaging elements 202.1 through 202.i.

FIG. 2B illustrates a second block diagram of an exemplary configuration and arrangement of integrated imaging elements within the communication device according to an exemplary embodiment of the present disclosure. A proximity screen display 210 includes integrated imaging elements 212.1 through 212.i that are configured and arranged in a uniform manner to form a matrix that is integrated within a display area 214. As shown in FIG. 2B, the integrated imaging elements 212.1 through 212.i can be configured and arranged in rows 216 and/or columns 218 to form the matrix that is integrated within the display area 214. The matrix shown in FIG. 2B is merely illustrative; those skilled in the relevant art(s) will recognize that the size of the matrix can be increased or reduced depending on the design goals of a particular embodiment. Moreover, the array size may bear no relationship or a full relationship with the underlying pixel array size and layout. The proximity screen display 210 can represent an exemplary embodiment of the proximity screen display 106.
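The uniform matrix of FIG. 2B can be illustrated with a short sketch (hypothetical names; a uniform stride is assumed purely for illustration) that computes where imaging elements would sit within a pixel array when one element is placed every few pixels in each direction:

```python
def imaging_element_positions(rows, cols, stride):
    """Return (row, col) positions for imaging elements integrated within
    a rows x cols pixel array, placing one element every `stride` pixels
    in each direction, offset toward the center of each stride cell.
    This yields a uniform matrix similar to the arrangement of FIG. 2B."""
    return [(r, c)
            for r in range(stride // 2, rows, stride)
            for c in range(stride // 2, cols, stride)]

# An 8 x 8 pixel patch with one imaging element per 4 x 4 pixel cell:
positions = imaging_element_positions(rows=8, cols=8, stride=4)
print(positions)  # [(2, 2), (2, 6), (6, 2), (6, 6)]
```

A non-uniform embodiment would simply supply its own position list instead of deriving one from a fixed stride.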

Other configurations and arrangements of the proximity screen display 106 are also possible without departing from the spirit and scope of the present disclosure. For example, some configurations and arrangements of the proximity screen display 106 can include a first set of integrated imaging elements, similar to the integrated imaging elements 202.1 through 202.i, around the periphery of its display area and a second set of integrated imaging elements, similar to the integrated imaging elements 212.1 through 212.i, integrated within the display area. As another example, some configurations and arrangements of the proximity screen display 106 can include integrated imaging elements, similar to the integrated imaging elements 202.1 through 202.i and/or the integrated imaging elements 212.1 through 212.i, which are configured and arranged around the periphery of the display area or integrated within the display area, respectively, in a non-uniform manner. In this other example, the rows and/or the columns of these integrated imaging elements can include different numbers of integrated imaging elements. As a further example, some configurations and arrangements of the proximity screen display 106 can include integrated imaging elements, similar to the integrated imaging elements 202.1 through 202.i and/or the integrated imaging elements 212.1 through 212.i, that are configured and arranged to form any geometric shape around the periphery of its display area and/or integrated within the display area.

Exemplary Integration of the Integrated Imaging Elements Within the Exemplary Communication Device

One or more integrated imaging elements, such as the integrated imaging elements 202.1 through 202.i or the one or more integrated imaging elements 212.1 through 212.i to provide an example, can be integrated within a proximity screen display, such as the proximity screen display 106, the proximity screen display 200, or the proximity screen display 210 to provide some examples, of a communication device, such as the communication device 100 to provide an example. The one or more integrated imaging elements can be formed onto and/or within the proximity screen display around the periphery of a display area in a similar manner as the display area 204 and/or integrated within the display area in a similar manner as the display area 214.

FIG. 3A illustrates a first exemplary integration of the imaging elements within a screen of the communication device according to an exemplary embodiment of the present disclosure. One or more integrated imaging elements 302.1 through 302.k can be formed onto a substrate 304 of a proximity screen display 300. The proximity screen display 300 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 200, or the proximity screen display 210 to provide some examples. As such, the integrated imaging elements 302.1 through 302.k can represent an exemplary embodiment of the imaging elements 202.1 through 202.i or the imaging elements 212.1 through 212.i.

As shown in FIG. 3A, the proximity screen display 300 can be formed onto a single substrate or multiple substrates that are communicatively coupled to each other. The substrate 304 can represent a portion of the single substrate, one of the multiple substrates, or a portion of one of the multiple substrates. The substrate 304 can be integrated within a mechanical housing 306 of a communication device, such as the communication device 100 to provide an example. The mechanical housing 306 can include various openings 308.1 through 308.k to accommodate the integrated imaging elements 302.1 through 302.k. The openings 308.1 through 308.k can be physical hole-type openings or merely comprise transparent material areas through which imaging can be conducted. The integrated imaging elements 302.1 through 302.k may comprise single photodetector imagers or more complex photodetector arrays with associated lensing, depending on the particular embodiment. Although shown as having a substantially three-dimensional shape, the integrated imaging elements 302.1 through 302.k can instead take on the structure, shape, and architecture of a mostly flat imaging structure.

FIG. 3B illustrates a second exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure. Similar to the proximity screen display 300, a proximity screen display 310 includes the one or more integrated imaging elements 302.1 through 302.k that are formed onto the substrate 304. However, as an alternative to the mechanical housing 306, a protective coating of transparent material 312 can be placed onto the substrate 304 of the communication device to protect the integrated imaging elements 302.1 through 302.k.

FIG. 3C illustrates a third exemplary integration of the imaging elements within the screen of the communication device according to an exemplary embodiment of the present disclosure. One or more integrated imaging elements 316.1 through 316.k can be integrated within one or more integrated circuit layers 318.1 through 318.m of a substrate 320 of a proximity screen display 314. The proximity screen display 314 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 200, or the proximity screen display 210 to provide some examples. As such, the integrated imaging elements 316.1 through 316.k can represent an exemplary embodiment of the imaging elements 202.1 through 202.i or the imaging elements 212.1 through 212.i.

As shown in FIG. 3C, the proximity screen display 314 can be formed onto a single substrate or multiple substrates that are communicatively coupled to each other. The substrate 320 can represent a portion of the single substrate, one of the multiple substrates, or a portion of one of the multiple substrates. The one or more integrated imaging elements 316.1 through 316.k can be integrated within the one or more integrated circuit layers 318.1 through 318.m of the substrate 320. Typically, the one or more integrated imaging elements 316.1 through 316.k are formed onto an integrated circuit layer 318.1 that represents a substrate of semiconductor material, which is often flexible. The one or more integrated imaging elements 316.1 through 316.k as well as a display area of the proximity screen display 314 are formed using various integrated circuit layers between the integrated circuit layers 318.1 and 318.m. Optionally, a flexible transparent cover can be formed onto the one or more integrated imaging elements 316.1 through 316.k as the integrated circuit layer 318.m.

Exemplary Integrated Imaging Elements Within the Exemplary Communication Device

As discussed above, the one or more integrated imaging elements sense changes in light resulting from the movement of the near-end user and/or the other passive objects in their field of view. Typically, the one or more integrated imaging elements are implemented using various photosensor and/or photodetector devices to provide an example, which convert energy of the light into electrical energy, such as current or voltage, by a photovoltaic effect. The photosensor and/or photodetector devices can include active pixel element sensors, light emitting diodes, optical detectors, photoresistors, photovoltaic cells, photodiodes, phototransistors, and/or other devices that are capable of converting the energy of the light into the electrical energy that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. These various photosensor and/or photodetector devices can be integrated around the periphery, in a similar manner as the imaging elements 202.1 through 202.i, of a display area of a proximity screen display, such as the proximity screen display 106, the proximity screen display 200, or the proximity screen display 210 to provide some examples, and/or integrated within the display area in a similar manner as the integrated imaging elements 212.1 through 212.i.

FIG. 4 illustrates an integrated imaging element within the communication device according to an exemplary embodiment of the present disclosure. As shown in FIG. 4, the integrated imaging element is implemented using a photovoltaic device 400. When photons of sufficient energy strike the photovoltaic device 400, they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. These negatively charged electrons move toward a cathode of the photovoltaic device 400 and/or the positively charged electron holes move toward an anode of the photovoltaic device 400, producing a current and/or voltage. This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106.
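The photovoltaic sensing described above can be approximated numerically. The sketch below (hypothetical names and parameter values, not part of the disclosed embodiments) checks the "photons of sufficient energy" condition against a junction bandgap and estimates the resulting photocurrent:

```python
# Physical constants
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
Q = 1.602176634e-19  # elementary charge, C (also J per eV)

def photocurrent(wavelength_m, photons_per_s, bandgap_ev, quantum_eff=0.5):
    """Estimate the current produced by a photovoltaic imaging element.

    Photons whose energy h*c/lambda exceeds the bandgap each free one
    electron-hole pair (scaled by a quantum efficiency); lower-energy
    photons contribute nothing, matching the 'photons of sufficient
    energy' condition in the description above.
    """
    photon_energy_ev = H * C / (wavelength_m * Q)
    if photon_energy_ev < bandgap_ev:
        return 0.0
    return photons_per_s * quantum_eff * Q  # amperes

# 550 nm green light (~2.25 eV per photon) on a 1.1 eV junction produces
# a current; 1200 nm infrared (~1.03 eV per photon) does not.
i_green = photocurrent(550e-9, photons_per_s=1e12, bandgap_ev=1.1)
i_ir = photocurrent(1200e-9, photons_per_s=1e12, bandgap_ev=1.1)
```

The magnitude of this current is what the proximity screen display interface 108 reads as a sensing signal; the quantum efficiency here is a placeholder for device-specific behavior.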

As shown in FIG. 4, the photovoltaic device 400 includes a transparent conducting cathode layer 402, an optional buffer layer 404, a donor-acceptor layer 406, and a transparent conducting anode layer 408 that are configured and arranged as a planar heterojunction, although a bulk heterojunction (BHJ) or an ordered heterojunction (OHJ) can be used, onto a transparent substrate 410. The transparent conducting cathode layer 402 represents a cathode of the photovoltaic device 400 that attracts negatively charged electrons from the optional buffer layer 404 and/or the donor-acceptor layer 406 when the photons of sufficient energy strike the photovoltaic device 400. The attracting of the negatively charged electrons to the transparent conducting cathode layer 402 can produce the current and/or the voltage which is indicative of an intensity of the photons striking the photovoltaic device 400.

The optional buffer layer 404 is an intrinsic semiconductor layer, also called an undoped semiconductor layer or i-type semiconductor layer, which represents a semiconductor layer without any significant impurity atoms. In some situations, the optional buffer layer 404 can be implemented with the donor-acceptor layer 406 to form a p-type, intrinsic, n-type (PIN) semiconductor structure. In these situations, the negatively charged electrons and/or the positively charged electron holes from the donor-acceptor layer 406 accumulate within the optional buffer layer 404 when the photons of sufficient energy strike the photovoltaic device 400. A current can flow between the transparent conducting cathode layer 402 and the transparent conducting anode layer 408 when a sufficient number of the negatively charged electrons and/or the positively charged electron holes have accumulated in the optional buffer layer 404.

The donor-acceptor layer 406 can be doped with impurity atoms of an acceptor type, such as boron or aluminum to provide some examples, that are capable of accepting an electron and/or doped with impurity atoms of a donor type, such as phosphorus, arsenic, or antimony to provide some examples, that are capable of donating an electron. In some situations, a first portion of the donor-acceptor layer 406 is doped with the impurity atoms of the acceptor type and a second portion of the donor-acceptor layer 406 is doped with the impurity atoms of the donor type to form a p-n junction. The donor-acceptor layer 406 provides negatively charged electrons to the transparent conducting cathode layer 402 and/or positively charged electron holes to the transparent conducting anode layer 408 when the photons of sufficient energy strike the photovoltaic device 400, causing a current to flow between the transparent conducting cathode layer 402 and the transparent conducting anode layer 408.

The transparent conducting anode layer 408 represents an anode of the photovoltaic device 400 that attracts positively charged electron holes from the optional buffer layer 404 and/or the donor-acceptor layer 406 when the photons of sufficient energy strike the photovoltaic device 400. The attracting of the positively charged electron holes to the transparent conducting anode layer 408 can produce the current and/or the voltage which is indicative of an intensity of the photons striking the photovoltaic device 400.

Exemplary Integrations of the Exemplary Integrated Imaging Elements Within Various Types of Proximity Screen Displays

As discussed above, an integrated imaging element, such as one of the integrated imaging elements 212.1 through 212.i, one of the integrated imaging elements 316.1 through 316.k, or the photovoltaic device 400 to provide some examples, can be integrated within a display area, such as the display area 214 to provide an example, of a proximity screen display, such as the proximity screen display 106, the proximity screen display 210, or the proximity screen display 314 to provide some examples. These various proximity screen displays can be placed in a side-by-side configuration with various imaging elements, such as the photovoltaic device 400 to provide an example, to form the proximity screen display 210. Optionally, a transparent flexible cover can be placed onto the proximity screen display 210 to cover the various proximity screen displays and the various imaging elements. For example, a 10K array of imaging elements can be placed in a middle of a full high-definition array of pixel elements of these proximity screen displays to form the proximity screen display 210. As another example, arrays of imaging elements can be interdigitated with arrays of pixel elements of these proximity screen displays to form a checkerboard-like configuration. Other ratios of imaging elements to pixel elements, and of imaging element arrays to pixel element arrays, can be selected depending on the goals of the specific design embodiment.
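The checkerboard-like interdigitation mentioned above can be sketched as follows (hypothetical names; 'I' marks an imaging element array position and 'P' a pixel element array position):

```python
def checkerboard_layout(rows, cols):
    """Build an interdigitated layout in which arrays of imaging elements
    ('I') alternate with arrays of pixel elements ('P') in a checkerboard
    pattern, as in the configuration described above."""
    return [['I' if (r + c) % 2 == 0 else 'P' for c in range(cols)]
            for r in range(rows)]

for row in checkerboard_layout(4, 4):
    print(''.join(row))
# IPIP
# PIPI
# IPIP
# PIPI
```

Other ratios of imaging elements to pixel elements would replace the alternating `(r + c) % 2` rule with a sparser placement rule.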

Alternatively, or in addition to, various imaging elements, such as the photovoltaic device 400 to provide an example, can be placed in an on-top configuration with the various proximity screen displays. In this configuration, the various imaging elements are placed on top of the various proximity screen displays. In some situations, various layers can be shared between the various imaging elements and the various proximity screen displays, such as a flexible transparent cover and/or a transparent substrate.

The discussion to follow describes various proximity screen displays, such as an organic light-emitting diode proximity screen display, an electronic paper, e-paper, or electronic ink proximity screen display, or a liquid crystal display to provide some examples. The discussion then describes integration of the imaging element within these various proximity screen displays to form exemplary integrations of the integrated imaging elements.

FIG. 5A illustrates a single pixel element of a flexible organic light-emitting diode proximity screen display according to an exemplary embodiment of the present disclosure. An organic light-emitting diode proximity screen display 500 includes one or more pixel elements that are configured and arranged to form a display area. Each of the one or more pixel elements includes one or more layers of one or more organic compounds which emit light in response to an electric current. The one or more layers of the one or more organic compounds are positioned between two electrodes on a substrate. The organic light-emitting diode proximity screen display 500 can represent a bottom emission device that uses a transparent or semi-transparent bottom electrode to emit the light through a transparent substrate or a top emission device that uses a transparent or semi-transparent top electrode to directly emit the light. A single pixel element of the organic light-emitting diode proximity screen display 500 includes an optional flexible transparent cover 502, a transparent conducting cathode 504, an optional electron transport layer 506, one or more organic emission layers 508, an optional hole transport layer 510, and a transparent conducting anode 512 that are formed on a flexible substrate 514 such as a substrate of polyethylene terephthalate to provide an example.

The optional flexible transparent cover 502 represents a protective coating of transparent material, similar to the protective coating of transparent material 312, that can be placed onto the flexible substrate 514 to protect the transparent conducting cathode 504, the optional electron transport layer 506, the one or more organic emission layers 508, the optional hole transport layer 510, and the transparent conducting anode 512.

The transparent conducting cathode 504 provides a current of electrons when a voltage at the transparent conducting anode 512 is positive with respect to the transparent conducting cathode 504. This current of electrons injects electrons into lowest unoccupied molecular orbitals (LUMO) of the one or more organic emission layers 508 at the optional electron transport layer 506 and withdraws electrons from highest occupied molecular orbitals (HOMO) of the one or more organic emission layers 508 at the optional hole transport layer 510, forming electron holes.

The optional electron transport layer 506 can be doped with impurity atoms of a donor type, such as phosphorus, arsenic, or antimony to provide some examples, which are capable of donating an electron. The optional electron transport layer 506 provides excess carrier electrons to the one or more organic emission layers 508 as the current flows through the optional electron transport layer 506 from the transparent conducting cathode 504 to the transparent conducting anode 512.

The one or more organic emission layers 508 provide various electrostatic forces to bring the electrons and the holes towards each other, whereupon the electrons and the holes recombine to form a bound state of the electron and hole, often referred to as an exciton. The decay of the exciton results in a relaxation of the energy levels of the electron, accompanied by emission of radiation whose frequency is in the visible region. The one or more organic emission layers 508 can include organometallic chelates, fluorescent and phosphorescent dyes, and/or conjugated dendrimers to provide some examples.
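The relationship between the exciton energy and the emitted visible radiation follows lambda = h*c/E. A brief sketch (hypothetical names, for illustration only) of that conversion:

```python
# Physical constants
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
Q = 1.602176634e-19  # J per eV

def emission_wavelength_nm(exciton_energy_ev):
    """Wavelength of the radiation emitted when an exciton of the given
    energy decays: lambda = h*c / E, converted to nanometers."""
    return H * C / (exciton_energy_ev * Q) * 1e9

# A ~2.3 eV exciton decays with emission near 539 nm (green), squarely
# within the visible region described above.
wavelength = emission_wavelength_nm(2.3)
assert 380 < wavelength < 750  # visible band
```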

The optional hole transport layer 510 can be doped with impurity atoms of an acceptor type, such as boron or aluminum to provide some examples, that are capable of accepting an electron. The optional hole transport layer 510 provides excess carrier holes to the one or more organic emission layers 508 as the current flows through the optional hole transport layer 510 from the transparent conducting cathode 504 to the transparent conducting anode 512.

The transparent conducting anode 512 receives the current of electrons when the voltage at the transparent conducting anode 512 is positive with respect to the transparent conducting cathode 504. In some situations, the transparent conducting anode 512 can be shared with other anode electrodes of other pixel elements of the organic light-emitting diode proximity screen display 500.

FIG. 5B illustrates a single pixel element of a flexible organic light-emitting diode proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure. An organic light-emitting diode proximity screen display 520 includes one or more pixel elements that are configured and arranged to form a display area. Each of the one or more pixel elements includes one or more layers of one or more organic compounds which emit light in response to an electric current. The one or more layers of the one or more organic compounds are positioned between two electrodes on a substrate. The organic light-emitting diode proximity screen display 520 also includes an integrated imaging element that is integrated within the display area. The organic light-emitting diode proximity screen display 520 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 210, or the proximity screen display 314 to provide some examples.

A single pixel element of the organic light-emitting diode proximity screen display 520 includes an imaging element 524 and an organic light-emitting diode touch screen pixel element 526. The imaging element 524 senses changes in light 522 resulting from the movement of the near-end user and/or the other passive objects in its field of view. When photons of sufficient energy of the light 522 strike the imaging element 524, they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. The negatively charged electrons move from a photovoltaic absorption layer 530 toward a transparent conducting cathode 528 which represents a cathode of the imaging element 524. Similarly, the positively charged electron holes move from the photovoltaic absorption layer 530 toward a shared transparent conducting electrode 532 which represents an anode of the imaging element 524. In an exemplary embodiment, the photovoltaic absorption layer 530 can be implemented in a substantially similar manner using the optional buffer layer 404 and/or the donor-acceptor layer 406. The movement of the negatively charged electrons toward the transparent conducting cathode 528 and the movement of the positively charged electron holes toward the shared transparent conducting electrode 532 produce a current and/or voltage. This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106.
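The sensing of changes in light can be sketched as a simple frame-to-frame comparison of sensing-signal magnitudes (hypothetical names and threshold; a real device would apply calibration and noise filtering):

```python
def detect_movement(previous, current, threshold=0.05):
    """Compare two successive sets of sensing-signal magnitudes, one per
    integrated imaging element, and report the indices of elements whose
    change in sensed light is large enough to indicate movement of the
    near-end user or another passive object in their field of view."""
    return [i for i, (p, c) in enumerate(zip(previous, current))
            if abs(c - p) > threshold]

prev = [0.50, 0.50, 0.50, 0.50]
curr = [0.50, 0.80, 0.20, 0.51]   # elements 1 and 2 see an object pass by
print(detect_movement(prev, curr))  # [1, 2]
```

The index list tells the interface which part of the display area the object is over, which is the spatial information used for the adjustments described earlier.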

The organic light-emitting diode touch screen pixel element 526 includes one or more layers of one or more organic compounds which emit light in response to an electric current. The organic light-emitting diode touch screen pixel element 526 can be implemented in a similar manner as a pixel element of the organic light-emitting diode proximity screen display 500. As such, the organic light-emitting diode touch screen pixel element 526 includes the optional electron transport layer 506, the one or more organic emission layers 508, the optional hole transport layer 510, and the transparent conducting anode 512 with the transparent conducting cathode 504 of the organic light-emitting diode proximity screen display 500 being replaced by the shared transparent conducting electrode 532.

The shared transparent conducting electrode 532 provides a current of electrons when a voltage at the transparent conducting anode 512 is positive with respect to the shared transparent conducting electrode 532. This current of electrons injects electrons into lowest unoccupied molecular orbitals (LUMO) of the one or more organic emission layers 508 at the optional electron transport layer 506 and withdraws electrons from highest occupied molecular orbitals (HOMO) of the one or more organic emission layers 508 at the optional hole transport layer 510, forming electron holes. Various electrostatic forces bring the electrons and the holes towards each other in the one or more organic emission layers 508, whereupon the electrons and the holes recombine to form the exciton. The decay of the exciton results in a relaxation of the energy levels of the electron, accompanied by emission of radiation whose frequency is in the visible region.

FIG. 6A illustrates a single pixel element of an electronic paper, e-paper, or electronic ink screen display according to an exemplary embodiment of the present disclosure. An electronic paper, e-paper, or electronic ink screen display 600 includes one or more pixel elements that are configured and arranged to form a display area. Each of the one or more pixel elements includes various charged pigment particles in a layer of liquid polymer that can be re-configured and re-arranged by applying various electric fields to two electrodes on a substrate. Applying a negative charge to a top electrode repels white pigment particles to a bottom of the layer of liquid polymer forcing black pigment particles to a top of the layer of liquid polymer which results in a black appearance. Similarly, applying a positive charge to the top electrode repels black pigment particles to the bottom of the layer of liquid polymer forcing white pigment particles to the top of the layer of liquid polymer which results in a white appearance. A single pixel element of the electronic paper, e-paper, or electronic ink screen display 600 includes the optional flexible transparent cover 502, a top transparent conducting electrode 602, one or more liquid polymer layers 604, and a bottom transparent conducting electrode 606 that are formed on the flexible substrate 514.

The top transparent conducting electrode 602 attracts positively charged pigment particles to a top of the one or more liquid polymer layers 604 and repels negatively charged pigment particles to a bottom of the one or more liquid polymer layers 604 when a negative charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606. Likewise, the top transparent conducting electrode 602 repels the positively charged pigment particles to the bottom of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the top of the one or more liquid polymer layers 604 when a positive charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606.

The one or more liquid polymer layers 604 include one or more layers of various liquid polymers that suspend the positively charged pigment particles and negatively charged pigment particles until a charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606. The positively charged pigment particles represent black pigment particles and the negatively charged pigment particles represent white pigment particles. The pixel element of the electronic paper, e-paper, or electronic ink screen display 600 will have a black appearance when the black pigment particles are attracted to the top transparent conducting electrode 602 and the white pigment particles are repelled to the bottom transparent conducting electrode 606. Likewise, the pixel element of the electronic paper, e-paper, or electronic ink screen display 600 will have a white appearance when the white pigment particles are attracted to the top transparent conducting electrode 602 and the black pigment particles are repelled to the bottom transparent conducting electrode 606.

The bottom transparent conducting electrode 606 repels the positively charged pigment particles to the top of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the bottom of the one or more liquid polymer layers 604 when the negative charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606. Likewise, the bottom transparent conducting electrode 606 attracts the positively charged pigment particles to the bottom of the one or more liquid polymer layers 604 and repels the negatively charged pigment particles to the top of the one or more liquid polymer layers 604 when the positive charge is applied between the top transparent conducting electrode 602 and the bottom transparent conducting electrode 606.
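The electrode-polarity behavior described above can be summarized in a small model (hypothetical names; it encodes only the positive-black/negative-white pigment convention used in this description):

```python
def pixel_appearance(top_electrode_charge):
    """Model the e-paper pixel behavior described above: positively
    charged (black) and negatively charged (white) pigment particles
    migrate under the field between the two electrodes, and the particles
    attracted to the top electrode set the visible appearance."""
    if top_electrode_charge < 0:
        return 'black'   # negative top attracts positive (black) particles
    if top_electrode_charge > 0:
        return 'white'   # positive top attracts negative (white) particles
    return 'unchanged'   # no field applied: particles remain suspended

print(pixel_appearance(-1))  # black
print(pixel_appearance(+1))  # white
```

Because the particles stay put when no field is applied, an e-paper pixel holds its last appearance without drawing power, which is why the zero-charge case returns 'unchanged'.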

FIG. 6B illustrates a single pixel element of an electronic paper, e-paper, or electronic ink proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure. An electronic paper, e-paper, or electronic ink proximity screen display 620 includes one or more pixel elements that are configured and arranged to form a display area. The electronic paper, e-paper, or electronic ink proximity screen display 620 also includes an integrated imaging element that is integrated within the display area. The electronic paper, e-paper, or electronic ink proximity screen display 620 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 210, or the proximity screen display 314 to provide some examples.

A single pixel element of the electronic paper, e-paper, or electronic ink proximity screen display 620 includes an imaging element 622 and an electronic paper, e-paper, or electronic ink touch screen pixel element 624. The imaging element 622 senses changes in the light 522 resulting from the movement of the near-end user and/or the other passive objects in its field of view. When photons of sufficient energy of the light 522 strike the imaging element 622, they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. The negatively charged electrons move from the photovoltaic absorption layer 530 toward a transparent conducting shared electrode 630, which represents a cathode of the imaging element 622. Similarly, the positively charged electron holes move from the photovoltaic absorption layer 530 toward the transparent conducting anode 512, which represents an anode of the imaging element 622. The movement of the negatively charged electrons toward the transparent conducting shared electrode 630 and the movement of the positively charged electron holes toward the transparent conducting anode 512 produce a current and/or voltage. This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106.

Additionally, the single pixel element of the electronic paper, e-paper, or electronic ink proximity screen display 620 includes a first conducting element 626 and a second conducting element 628. The first conducting element 626 attracts positively charged pigment particles to a first side of the one or more liquid polymer layers 604 and repels negatively charged pigment particles to a second side of the one or more liquid polymer layers 604 when a negative charge is applied between the first conducting element 626 and the second conducting element 628. Similarly, the second conducting element 628 repels the positively charged pigment particles to the first side of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the second side of the one or more liquid polymer layers 604 when the negative charge is applied between the first conducting element 626 and the second conducting element 628. The attracting of the positively charged pigment particles to the first side and of the negatively charged pigment particles to the second side allows the light 522 to pass through the one or more liquid polymer layers 604 to strike the imaging element 622.

The electronic paper, e-paper, or electronic ink touch screen pixel element 624 includes various charged pigment particles in a layer of liquid polymer that can be re-configured and re-arranged by applying various electric fields to two electrodes on the substrate. The electronic paper, e-paper, or electronic ink touch screen pixel element 624 can be implemented in a similar manner as a pixel element of the electronic paper, e-paper, or electronic ink screen display 600. As such, the electronic paper, e-paper, or electronic ink touch screen pixel element 624 includes the top transparent conducting electrode 602 and the one or more liquid polymer layers 604, with the bottom transparent conducting electrode 606 of the electronic paper, e-paper, or electronic ink screen display 600 being replaced by the transparent conducting shared electrode 630.

The transparent conducting shared electrode 630 repels the positively charged pigment particles to the top of the one or more liquid polymer layers 604 and attracts the negatively charged pigment particles to the bottom of the one or more liquid polymer layers 604 when the negative charge is applied between the top transparent conducting electrode 602 and the transparent conducting shared electrode 630. Likewise, the transparent conducting shared electrode 630 attracts the positively charged pigment particles to the top of the one or more liquid polymer layers 604 and repels the negatively charged pigment particles to the bottom of the one or more liquid polymer layers 604 when the positive charge is applied between the top transparent conducting electrode 602 and the transparent conducting shared electrode 630.

FIG. 7A illustrates a single pixel element of a liquid crystal screen display. A liquid crystal screen display 700 includes one or more pixel elements that are configured and arranged to form a display area. Each of the one or more pixel elements includes one or more layers of liquid crystal material aligned between two electrodes. Before an electric field is applied between the two electrodes, the alignment directions of the liquid crystal material at the two electrodes are perpendicular to each other, and so molecules of the liquid crystal material arrange themselves in a helical structure, or twist. When a voltage applied between the two electrodes is large enough, the liquid crystal molecules in a center of the layers of liquid crystal material are almost completely untwisted and the polarization of the incident light is not rotated as it passes through the liquid crystal material. This light will then be mainly polarized perpendicular to a horizontal polarizing filter, and thus be blocked, and the pixel element will appear black. By controlling the voltage applied between the two electrodes, light can be allowed to pass through in varying amounts, thus constituting different levels of gray. A light source, such as a backlight or a reflector to provide some examples, is often included to reflect the light passed through the pixel element.

A single pixel element of the liquid crystal screen display 700 includes the optional flexible transparent cover 502, a vertical axis polarizing filter 702, a top transparent conducting electrode 704, one or more liquid crystal layers 706, a horizontal axis polarizing filter 708, and a bottom transparent conducting electrode 710 that are formed on the flexible substrate 514. The vertical axis polarizing filter 702 is configured to pass vertical components of the light passing through it while absorbing and/or reflecting horizontal components. The horizontal axis polarizing filter 708 is configured to pass horizontal components of the light passing through it while absorbing and/or reflecting vertical components.

The one or more liquid crystal layers 706 contain liquid crystals that twist and untwist at varying degrees to allow light to pass through when a voltage is applied between the top transparent conducting electrode 704 and the bottom transparent conducting electrode 710. The liquid crystals untwist, changing the polarization of the light passing through them, in proportion to the voltage applied between the top transparent conducting electrode 704 and the bottom transparent conducting electrode 710. By properly adjusting the level of the voltage applied between the top transparent conducting electrode 704 and the bottom transparent conducting electrode 710, almost any gray level or transmission can be achieved.
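The relationship between applied voltage and gray level described above can be sketched with a simple linear model. The threshold and saturation voltages below are assumed values for illustration, not taken from the disclosure:

```python
def transmission(voltage: float, v_threshold: float = 1.0, v_sat: float = 3.0) -> float:
    """Approximate normalized transmission of the twisted-nematic pixel.

    Below the threshold voltage the liquid crystals remain fully twisted and
    the pixel transmits fully; above saturation they are fully untwisted and
    the pixel appears black. In between, transmission falls roughly linearly,
    producing intermediate gray levels.
    """
    if voltage <= v_threshold:
        return 1.0
    if voltage >= v_sat:
        return 0.0
    return 1.0 - (voltage - v_threshold) / (v_sat - v_threshold)
```

For example, a voltage midway between the assumed threshold and saturation values yields roughly 50% transmission, a mid-gray.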

FIG. 7B illustrates a single pixel element of a liquid crystal proximity screen display that is integrated with an integrated imaging element according to an exemplary embodiment of the present disclosure. A liquid crystal proximity screen display 720 includes one or more pixel elements that are configured and arranged to form a display area. The liquid crystal proximity screen display 720 also includes an integrated imaging element that is integrated within the display area. The liquid crystal proximity screen display 720 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 210, or the proximity screen display 314 to provide some examples.

A single pixel element of the liquid crystal proximity screen display 720 includes the vertical axis polarizing filter 702, the one or more liquid crystal layers 706, the horizontal axis polarizing filter 708, the bottom transparent conducting electrode 710, and a transparent conducting shared electrode 722 that are configured and arranged on the flexible substrate 514 to form a pixel element of the liquid crystal proximity screen display 720. This pixel element of the liquid crystal proximity screen display 720 can be implemented in a similar manner as a pixel element of the liquid crystal screen display 700 with the top transparent conducting electrode 704 of the liquid crystal screen display 700 being replaced by the transparent conducting shared electrode 722. The liquid crystals of the one or more liquid crystal layers 706 twist and untwist at varying degrees to allow light to pass through when a voltage is applied between the transparent conducting shared electrode 722 and the bottom transparent conducting electrode 710.

The single pixel element of the liquid crystal proximity screen display 720 also includes the transparent conducting cathode 528, the photovoltaic absorption layer 530, and the transparent conducting shared electrode 722 that are configured and arranged to form an imaging element. The imaging element senses changes in the light 522 resulting from the movement of the near-end user and/or the other passive objects in its field of view. When photons of sufficient energy of the light 522 strike the imaging element, they excite electrons, thereby creating free negatively charged electrons and/or positively charged electron holes. The negatively charged electrons move from the photovoltaic absorption layer 530 toward the transparent conducting shared electrode 722, which represents a cathode of the imaging element. Similarly, the positively charged electron holes move from the photovoltaic absorption layer 530 toward the transparent conducting anode 512, which represents an anode of the imaging element. The movement of the negatively charged electrons toward the transparent conducting shared electrode 722 and the movement of the positively charged electron holes toward the transparent conducting anode 512 produce a current and/or voltage. This current and/or voltage can represent an exemplary embodiment of one of the various sensing signals as described above in conjunction with the proximity screen display 106.

Exemplary Implementation of a Proximity Screen Display Interface that can be Implemented Within the Exemplary Communication Device

FIG. 8 illustrates an exemplary proximity screen display and proximity screen display interface that can be implemented within the communication device according to an exemplary embodiment of the present disclosure. A proximity screen display interface 800 provides various control signals to a proximity screen display 802 for configuration of its display area to display information from a host processor, such as the host processor 104 to provide an example, and/or a communication module, such as the communication module 102 to provide an example. Additionally, the proximity screen display interface 800 can interpret various sensing signals provided by the proximity screen display 802 to determine the presence and/or the location of the near-end user and/or the other passive objects. Further, the proximity screen display interface 800 can adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the information provided by the communication module and/or the host processor in response to the various sensing signals provided by the proximity screen display 802. The proximity screen display interface 800 can represent an exemplary embodiment of the proximity screen display interface 108 and the proximity screen display 802 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 200, the proximity screen display 210, the proximity screen display 300, the proximity screen display 310, the proximity screen display 314, the organic light-emitting diode proximity screen display 520, the electronic paper, e-paper, or electronic ink proximity screen display 620, the liquid crystal proximity screen display 720, or any combination thereof.

The proximity screen display interface 800 includes a touch screen controller 804, a display area driver module 806, and an integrated imaging element driver module 808. The touch screen controller 804 controls overall operation and/or configuration of the proximity screen display 802. As shown in FIG. 8, the touch screen controller 804 receives information 850 from the host processor and/or the communication module. For example, the information 850 can include video and/or image data to be displayed by the proximity screen display 802. As another example, the information 850 can include command and/or control data to control the operation and/or configuration of the proximity screen display 802. This command and/or control data can include backlight parameters, contrast parameters, brightness parameters, sharpness parameters, color parameters, tint parameters, refresh rate parameters, aspect ratio parameters, and/or resolution parameters to control the displaying of the video and/or image data within a display area of the proximity screen display 802. The touch screen controller 804 provides video and/or image data 852 to the display area driver module 806 that is to be displayed in accordance with the command and/or control data. Additionally, the command and/or control data can include sensing rate parameters or sensing scheme parameters to control the operation and/or configuration of the integrated imaging elements that are configured and arranged around a periphery of the display area and/or within the proximity screen display. The touch screen controller 804 provides imaging command and/or control 854 to the integrated imaging element driver module 808 to control the operation and/or configuration of the integrated imaging elements. 
Further, the proximity screen display interface 800 can adjust various parameters, such as the zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the video and/or image data in response to the sensing signals 856 provided by the proximity screen display 802.

The display area driver module 806 processes the video and/or image data 852 to provide display area control signals 860 for displaying of the video and/or image data 852 on the display area of the proximity screen display 802. Typically, the display area of the proximity screen display 802 is configured and arranged as multiple rows and/or columns of pixel elements that are configured and arranged as a matrix. Each of the pixel elements can display a pixel of red, green, blue, black, white, or any combination thereof by applying a current and/or a voltage to its respective electrodes. A first electrode of each of the pixel elements in each row is coupled to each other. Similarly, a second electrode of each of the pixel elements in each column is coupled to each other. The display area driver module 806 provides various currents and/or voltages to the rows and/or the columns of the matrix to configure the multiple rows and/or columns of pixel elements to display the video and/or image data 852.
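The row/column matrix addressing described above can be sketched briefly: because each row shares a first electrode and each column shares a second electrode, a frame is displayed by selecting one row at a time and driving all column lines with that row's pixel levels. The function name and data representation are illustrative, not from the disclosure:

```python
def scan_matrix(frame):
    """Return the sequence of (row_index, column_levels) drive operations
    needed to display `frame`, given as a list of rows of pixel levels."""
    ops = []
    for r, row in enumerate(frame):
        # Select row r's shared electrode, then apply every column voltage.
        ops.append((r, list(row)))
    return ops
```

For a 2 x 2 frame, `scan_matrix([[1, 0], [0, 1]])` yields one drive operation per row of the matrix.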

The integrated imaging element driver module 808 provides integrated imaging element control signals 862 to configure the integrated imaging elements to sense light in their field of view. Typically, the integrated imaging elements are controlled in a substantially opposite manner as the pixel elements. For example, when a pixel element displays its respective pixel of red, green, blue, black, white, or any combination thereof, namely active or turned “ON”, its respective integrated imaging element is configured to be inactive or turned “OFF”. Likewise, when an integrated imaging element is sensing the light in its field of view, namely active or turned “ON”, its respective pixel element is configured to be inactive or turned “OFF”. In some situations, the integrated imaging element and its respective pixel element can be duty cycled to switch between their inactive or active configurations at a sufficient rate without affecting an appearance of the video and/or image data 852 to the near-end user. In other situations, the touch screen controller 804 determines which of the pixel elements are to be inactive based upon the video and/or image data 852. In these other situations, the touch screen controller 804 provides the imaging command and/or control 854 to cause integrated imaging elements that correspond to these inactive pixel elements to be active to sense the light in their field of view. In an exemplary embodiment, the integrated imaging elements of the proximity screen display 802 are configured and arranged as multiple rows and/or columns that are configured and arranged as a matrix.

In this exemplary embodiment, a first electrode of each of the integrated imaging elements in each row is coupled to each other and a second electrode of each of the integrated imaging elements in each column is coupled to each other. In this exemplary embodiment, the integrated imaging element driver module 808 provides various currents and/or voltages to the rows and/or the columns of the matrix to configure the multiple rows and/or columns of integrated imaging elements to sense the light in their field of view to provide the sensing signals 856. Generally, the magnitudes of the sensing signals 856 depend upon an amount of light sensed in their field of view.
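The complementary ON/OFF control described above, where an integrated imaging element is active exactly when its co-located pixel element is inactive, and the duty-cycled variant that alternates the two roles, can be sketched as follows. Function names are illustrative assumptions:

```python
def select_active_imagers(pixel_active):
    """Given a list of booleans marking which pixel elements are active
    ("ON"), return the matching activation list for the co-located
    integrated imaging elements (each the opposite of its pixel)."""
    return [not p for p in pixel_active]

def duty_cycle_phase(frame_index):
    """Alternate between displaying and sensing on even/odd frames, fast
    enough that the displayed image appears unaffected to the user."""
    return "display" if frame_index % 2 == 0 else "sense"
```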

The proximity screen display 802 can include one or more pixel elements that are configured and arranged in multiple rows and/or columns to form a display area for displaying of the video and/or image data 852. The proximity screen display 802 also includes integrated imaging elements that are configured and arranged around the periphery of the display area and/or within the proximity screen display 802 to sense the light in their field of view. The proximity screen display 802 can include more, fewer, or the same number of integrated imaging elements as pixel elements. The integrated imaging elements provide various voltages and/or currents as the sensing signals 856 that are indicative of the light in their field of view. For example, the sensing signals 856 have a first magnitude when the one or more integrated imaging elements are exposed to a bright light, a second magnitude when the one or more integrated imaging elements are exposed to a dark light, and a third magnitude that varies between the first magnitude and the second magnitude as the amount of light sensed by the integrated imaging elements varies between the bright light and the dark light.
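The first/second/third magnitude behavior described above amounts to a monotonic mapping from sensed light to signal magnitude. A minimal sketch, assuming a linear mapping and normalized units (both assumptions for illustration, not from the disclosure):

```python
def sensing_signal(light_level: float, bright_mag: float = 1.0, dark_mag: float = 0.0) -> float:
    """Map a normalized light level in [0, 1] to a sensing-signal magnitude
    between the dark-light and bright-light magnitudes."""
    light_level = max(0.0, min(1.0, light_level))  # clamp to valid range
    return dark_mag + (bright_mag - dark_mag) * light_level
```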

Exemplary Operation of a Proximity Screen Display Interface that can be Implemented Within the Exemplary Proximity Screen Display Interface

As discussed above, a proximity screen display interface, such as the proximity screen display interface 108 or the proximity screen display interface 800 to provide some examples, can adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of video data or image data, provided by a communication module, such as the communication module 102 to provide an example, and/or a host processor, such as the host processor 104 to provide an example. The proximity screen display interface can adjust the various parameters of the video data or the image data in response to the presence and/or the location of the touch from the near-end user in relation to a proximity screen display, such as the proximity screen display 106, the proximity screen display 200, the proximity screen display 210, the proximity screen display 300, the proximity screen display 310, the proximity screen display 314, the organic light-emitting diode proximity screen display 520, the electronic paper, e-paper, or electronic ink proximity screen display 620, the liquid crystal proximity screen display 720, or the proximity screen display 802 to provide some examples.

FIG. 9 is a flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure. The disclosure is not limited to this operational description. Rather, it will be apparent to persons skilled in the relevant art(s) that other operational control flows are within the scope and spirit of the present disclosure. The following discussion describes the steps in FIG. 9.

At step 950, the operational control flow receives video data, image data, command data, and/or control data for displaying the image and/or the video data on to a display area 904 of the proximity screen display 900. The proximity screen display 900 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 200, the proximity screen display 210, the proximity screen display 300, the proximity screen display 310, the proximity screen display 314, the organic light-emitting diode proximity screen display 520, the electronic paper, e-paper, or electronic ink proximity screen display 620, the liquid crystal proximity screen display 720, or the proximity screen display 802 to provide some examples. However, those skilled in the relevant art(s) will recognize that the proximity screen display 900 can also be implemented using any conventional proximity screen display, such as any conventional resistive, surface acoustic wave, capacitive, infrared, optical imaging, dispersive signal technology, or acoustic pulse recognition proximity screen display to provide some examples, that is capable of detecting a presence or a location of an object without departing from the spirit and scope of the present disclosure.

A proximity screen display interface, such as the proximity screen display interface 108 or the proximity screen display interface 800 to provide some examples, receives the video data, image data, command data, and/or control data for displaying the image and/or the video data on to a display area 904 of a proximity screen display 900.

At step 952, the operational control flow displays the image and/or the video data on to the display area 904 in accordance with the command data and/or control data. Specifically, the proximity screen display interface provides various display area control signals, such as the various display area control signals 860 to provide an example, in response to the video data, image data, command data, and/or control data to configure and arrange the display area 904 to display the video and/or image data on the display area.

At step 954, the operational control flow detects a presence or a location of an object, such as a finger 902 of a user, which is proximate to the display area 904. Specifically, the proximity screen display 900 can include one or more integrated imaging elements that can be integrated around a periphery of the display area 904 in a similar manner as the imaging elements 202.1 through 202.i and/or integrated within the display area in a similar manner as the integrated imaging elements 212.1 through 212.i. The one or more integrated imaging elements are configured and arranged to sense light in their field of view.

The proximity screen display interface provides various integrated imaging element control signals, such as the integrated imaging element control signals 862 to provide an example, to configure the one or more integrated imaging elements to sense light in their field of view. The one or more integrated imaging elements provide various sensing signals, such as the sensing signals 856 to provide an example, whose magnitudes depend upon an amount of light sensed in their field of view at different locations within the proximity screen display. The proximity screen display interface can interpolate an image of an environment surrounding the display area from the magnitudes of the various sensing signals to detect the presence or the location of the object. Additionally, the proximity screen display interface can compare various images of the environment surrounding the display area 904 at different instances in time to determine movement of the object within the environment.
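The comparison of images at different instances in time described above can be sketched as simple frame differencing over the interpolated sensing-signal grid. The names and the change threshold are illustrative assumptions:

```python
def detect_motion(frame_a, frame_b, threshold=0.2):
    """Compare two interpolated sensing-signal images (lists of lists of
    normalized light levels, taken at different instants) and return the set
    of (row, col) cells whose level changed by more than `threshold`,
    indicating movement of an object within the field of view."""
    changed = set()
    for r, (row_a, row_b) in enumerate(zip(frame_a, frame_b)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                changed.add((r, c))
    return changed
```

A cell that darkens sharply between frames, for instance where a finger now shadows an imaging element, is reported as changed.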

Further, the proximity screen display interface can recognize specific portions of the object, such as one or more fingers of a hand of the user, from one or more images of the environment surrounding the display area 904. The proximity screen display interface can assign various control and/or command data to different specific portions of the object and provide the respective control and/or command data when a respective specific portion of the object has been recognized. For example, the proximity screen display interface can provide control and/or command data to scroll down with no hypertext jumping or zooming upon recognition of the right thumb, or control and/or command data to select hypertext links and invoke zooming upon recognition of the right index finger.
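The assignment of control and/or command data to recognized fingers, per the example above, can be sketched as a lookup table. The key names and field names are hypothetical, chosen only to mirror the thumb/index-finger example:

```python
# Illustrative finger-to-command mapping: the right thumb scrolls without
# hypertext jumping or zooming; the right index finger selects hypertext
# links and invokes zooming.
FINGER_COMMANDS = {
    "right_thumb": {"scroll": True, "hypertext": False, "zoom": False},
    "right_index": {"scroll": False, "hypertext": True, "zoom": True},
}

def commands_for(finger: str) -> dict:
    """Return the command/control data assigned to a recognized finger,
    or an all-off default for an unrecognized portion of the object."""
    return FINGER_COMMANDS.get(
        finger, {"scroll": False, "hypertext": False, "zoom": False}
    )
```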

At step 956, the operational control flow adjusts the image and/or the video data as being displayed on the display area 904 in response to detecting the presence or the location of the object. Specifically, the proximity screen display interface can adjust the various display area control signals to adjust various image parameters, such as zoom, resolution, pitch, roll, and/or yaw to provide some examples, of the image and/or the video data as being displayed on the display area 904. For example, the proximity screen display interface can adjust the various parameters to enlarge or to zoom into a coincidental portion 902 of the image and/or the video data, such as one or more alphanumeric keys of an integrated virtual keyboard to provide an example, as being displayed on the display area 904 that coincides with the location of the object. In this example, the coincidental portion 902 of the image and/or the video data can appear to be larger to the user.

Thereafter, the operational control flow reverts to step 952 to display the image and/or the video data.

As mentioned previously, various full or non-contact modes can be automatically identified by the proximity screen display 900 and/or its corresponding display interface such as the proximity screen display interface 108 or the proximity screen display interface 800 to provide some examples. Such modes can also be selected by a user via a setup process. One such mode involves a non-contact “click” selection using a pointer finger. Such mode can be established as a factory default or user defined, trained and configured to cause a particular software function to trigger, such as launching a software API (application program interface) On_Click( ) type function. In particular, when a pointer finger comes within approximately fifteen (15) centimeters of the screen surface, a bordered circle of a larger size appears on the screen at a corresponding x-y location. Within this circle, underlying pixel based visual graphics can be set to a particular user selected magnification, which may be set to any degree of magnification or to 100% (or otherwise turned off). As the pointer finger moves closer, the magnification can be set to scale up or down or merely stay the same. Likewise, the circle size can be made to change or stay the same. When passing over an active input element (button, down arrow, text field or widget), the circle can be made to stabilize or to reflect free movement in the x-y-z directions. With stabilization, even a user with shaky hands can find a target input element and stay thereon long enough to recognize same and carry out perhaps a double click motion without ever touching the screen. The stabilization may involve centering and appropriate zooming along with a temporary dwell time that may be user configured or calculated for each user through training or through captured behaviors. The stabilization can be overridden when the pointer finger motion appears to be intentional and not within a range of a particular human's jittering.
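A minimal sketch of the non-contact “click” mode's state logic follows. The roughly fifteen-centimeter threshold comes from the description above; the jitter-range parameter, function name, and return encoding are assumptions for illustration:

```python
def hover_state(distance_cm, over_input_element, jitter_cm, jitter_range_cm=0.5):
    """Decide the hover-circle and stabilization state for a pointer finger.

    The bordered circle appears once the finger is within ~15 cm of the
    screen surface. Over an active input element, the circle stabilizes
    unless the finger motion exceeds the user's jitter range (i.e., the
    motion appears intentional, which overrides stabilization).
    """
    if distance_cm > 15.0:
        return {"circle": False, "stabilized": False}
    stabilized = over_input_element and jitter_cm <= jitter_range_cm
    return {"circle": True, "stabilized": stabilized}
```

In practice the jitter range would be user configured or learned through training, as the description notes.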

FIG. 10 is a second flowchart of exemplary operational steps of the proximity screen display and proximity screen display interface according to an exemplary embodiment of the present disclosure. The disclosure is not limited to this operational description. Rather, it will be apparent to persons skilled in the relevant art(s) that other operational control flows are within the scope and spirit of the present disclosure. The following discussion describes the steps in FIG. 10.

At step 950, the operational control flow receives the video data, image data, command data, and/or control data for displaying the image and/or the video data on to a display area 1004 of the proximity screen display 1000. The proximity screen display 1000 can represent an exemplary embodiment of the proximity screen display 106, the proximity screen display 200, the proximity screen display 210, the proximity screen display 300, the proximity screen display 310, the proximity screen display 314, the organic light-emitting diode proximity screen display 520, the electronic paper, e-paper, or electronic ink proximity screen display 620, the liquid crystal proximity screen display 720, or the proximity screen display 802 to provide some examples. However, those skilled in the relevant art(s) will recognize that the proximity screen display 1000 can also be implemented using any conventional proximity screen display, such as any conventional resistive, surface acoustic wave, capacitive, infrared, optical imaging, dispersive signal technology, or acoustic pulse recognition proximity screen display to provide some examples, that is capable of detecting a presence or a location of an object without departing from the spirit and scope of the present disclosure.

At step 952, the operational control flow displays the image and/or the video data on to the display area 1004 in accordance with the command data and/or control data.

At step 1050, the operational control flow detects a presence or a location of an object, such as one or more hands 1002 of a user, which is proximate to the display area 1004 in a substantially similar manner as described in step 954.

At step 1052, the operational control flow adjusts the image and/or the video data as being displayed on the display area 1004 in response to detecting the presence or the location of the object. Specifically, the proximity screen display interface can adjust the various display area control signals to adjust the various parameters of the image and/or the video data as being displayed on the display area 1004. For example, the proximity screen display interface can adjust the various parameters to enlarge or to zoom into coincidental portions 1006 of the image and/or the video data as being displayed on the display area 1004 that coincide with various portions of the object, such as one or more fingers of the one or more hands 1002. In this example, the coincidental portions 1006 of the image and/or the video data can appear to be larger to the user. As another example, the proximity screen display interface can adjust the various parameters to adjust an orientation 1008 of the image and/or the video data as being displayed on the display area 1004 to coincide with the various portions of the object. This other example is particularly useful when the image and/or the video data correspond to an integrated virtual keyboard. This other example allows the orientation of the integrated virtual keyboard to be adjusted to coincide with the one or more fingers of the one or more hands 1002.
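One simple way to realize the orientation adjustment described above is to anchor the integrated virtual keyboard at the centroid of the detected finger locations. This is a sketch under that assumption; the disclosure does not prescribe a particular placement rule, and the function name is hypothetical:

```python
def keyboard_origin(finger_positions):
    """Return the (x, y) centroid of the detected finger locations, usable
    as the origin about which the virtual keyboard layout is positioned so
    that it coincides with the fingers of the user's hands."""
    xs = [p[0] for p in finger_positions]
    ys = [p[1] for p in finger_positions]
    n = len(finger_positions)
    return (sum(xs) / n, sum(ys) / n)
```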

As discussed previously, the proximity screen display 1000 and supporting hardware and software within the illustrated tablet device can support various typing modes, including touch typing (full finger contact with finger lifts and presses), finger hover with fingertip screen contact only on keystrokes, hover without contact but with keystroke-like finger motions, and so on. Thumb typing and hunt-and-peck styles can also be selected, and even with hunt and peck, other objects can be used for the pecking (e.g., a stylus, pencil, or broom handle). Typing input can even support one-handed typing, missing digits, and unusual typing preferences and layouts. Regardless of the typing configuration, user-specific tailoring is supported through training, ongoing monitoring of typing success (spelling, grammar, corrections), and the associated hand and finger positions, motions, and characterizations thereof.

To illustrate such tailoring, consider a situation wherein a user brings their hands into full contact with the tablet as illustrated. Instead of forcing a keyboard fit on the user, the proximity screen display 1000 and supporting hardware and software respond to detecting the approach and construct a keyboard layout that fits the locations of the fingers in their natural typing-readiness configuration (again as shown). This keyboard layout may change during the approach as fingers move, or may only change (or be created) upon contact. Thereafter, as the user types, corrects, makes mistakes, and so on, a finger striking range can be identified, which may also account for keystroke sequencing. From such information, key sizing may be adjusted on a key-by-key basis. For example, the active area for the letter "q" may need to be much larger than the active area for the letter "a," and both may be ellipsoidal or take other shapes better matched to natural finger movement. Visual keyboard elements can then be placed to match the active key area shapes, though they need not be. The visual keyboard shape and locations can be constructed to parallel how a user typically types or to help tighten up problem areas where a user often makes mistakes.
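One simple way to realize the "keyboard that fits the fingers" idea above is to anchor home-row key centers to the detected resting positions of the eight typing fingers. The following Python sketch is a hypothetical illustration under that assumption; the function name `fit_home_row` and the left-to-right key ordering are not from the disclosure.

```python
# Hypothetical sketch: construct a home-row layout fitted to where the
# user's fingers actually rest, rather than forcing a fixed grid.
# Left-to-right key order and all names are illustrative assumptions.

HOME_ROW = ["a", "s", "d", "f", "j", "k", "l", ";"]

def fit_home_row(finger_positions):
    """Map each detected fingertip (x, y) to a home-row key by x-order,
    so each key's center lands under its natural typing finger.
    Returns a dict of {key: (x, y) center}."""
    if len(finger_positions) != len(HOME_ROW):
        raise ValueError("expected one position per home-row finger")
    ordered = sorted(finger_positions)  # tuples sort left-to-right by x
    return dict(zip(HOME_ROW, ordered))
```

Per-key active-area sizing and shaping (e.g., the larger, ellipsoidal "q" region) could then be layered on top of these fitted centers as strike statistics accumulate.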

To further correct such mistakes, real-time predictive spelling can be turned on. This feature involves predicting the next letter that will be typed (based on spelling and grammar rules) and which letters will not be typed. When such letters occupy adjacent regions of a keyboard layout, one of the keys can be made to favor another. For example, when ten (10) percent of the time a user hits an upper region associated with their typical letter "a" strokes and a lower region associated with their typical letter "q" strokes, then "a" could receive favored treatment as perhaps the most likely spelling prediction. Subsequent keys might, of course, change the prediction, resulting in a swapping event from "a" to "q." Thus, to avoid confusion, several modes of operation might be selected. In a first mode, both letters are presented before one is finalized based on subsequent typing entry and on spelling and grammar considerations for an entire word or word sequence, which either verifies or conflicts with the prediction. In a second mode, the most likely letter is presented and visually swapped if it proves incorrect. In a third mode, both letter options are withheld until enough further letters are received to converge on one or the other. The same approach applies to more than two candidate letters, based on their nearness in the physical keyboard layout.
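The key-favoring rule can be sketched with a toy language model: when a strike lands in a region shared by two adjacent keys, choose the candidate the spelling model rates more likely given what was typed so far. This is a hypothetical illustration; the tiny bigram table, its probabilities, and the name `resolve_strike` are all invented for the example.

```python
# Hypothetical sketch of predictive key disambiguation. The bigram
# table below holds toy values of P(next letter | previous letter);
# a real system would use full spelling/grammar models as described.

BIGRAM = {
    ("u", "a"): 0.02,   # "ua" is plausible (e.g., "usual")
    ("u", "q"): 0.001,  # "uq" is rare in English
    ("q", "u"): 0.9,    # "q" is almost always followed by "u"
}

def resolve_strike(prev_letter, candidates):
    """Return the candidate letter most likely to follow prev_letter,
    so an ambiguous strike on the "a"/"q" border favors the better
    spelling prediction."""
    return max(candidates,
               key=lambda c: BIGRAM.get((prev_letter, c), 0.0))
```

Under the first and second modes described above, this prediction would drive which letter is presented (or presented first), subject to later swapping if subsequent letters contradict it.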

Each finger contact area for a given key is represented by statistical variation data for each particular user. Such statistics, and the associated predictions, resizing, and relative locations, can be extended to three dimensions and can be associated with any element of each hand's digits (e.g., joints). Thus, for example, a particular user's hand motion associated with a pinky stretch for a "q" might be much more important than the landing spot.
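A minimal two-dimensional version of this per-user statistical variation data can be sketched as a Gaussian model per key: record the mean landing point and per-axis variance of a user's historical strikes, then score each new contact against every key's statistics. The class and function names (`KeyStats`, `classify`) and the independent-axis Gaussian assumption are illustrative choices, not taken from the disclosure.

```python
# Hypothetical sketch: model each user's strikes for a key as a 2-D
# Gaussian (mean landing point, per-axis variance) and classify a new
# contact by which key's statistics best explain it.
import math

class KeyStats:
    def __init__(self, points):
        n = len(points)
        self.mx = sum(p[0] for p in points) / n
        self.my = sum(p[1] for p in points) / n
        # Fall back to unit variance for degenerate (all-identical) data.
        self.vx = sum((p[0] - self.mx) ** 2 for p in points) / n or 1.0
        self.vy = sum((p[1] - self.my) ** 2 for p in points) / n or 1.0

    def log_likelihood(self, x, y):
        # Independent-axis Gaussian log density, constant terms dropped.
        return (-(x - self.mx) ** 2 / (2 * self.vx)
                - (y - self.my) ** 2 / (2 * self.vy)
                - 0.5 * math.log(self.vx * self.vy))

def classify(contact, key_models):
    """Return the key whose historical strike statistics best explain
    the observed contact point."""
    x, y = contact
    return max(key_models,
               key=lambda k: key_models[k].log_likelihood(x, y))
```

Extending the state from (x, y) landing points to three-dimensional trajectories of joints, as the paragraph above contemplates, would use the same maximum-likelihood structure over richer feature vectors.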

Such characterizations and predictions apply equally to other types of keyboard input modes as well as to non-keyboard input element interactions involving full contact, no contact, and partial contact motions of hands and fingers, other body parts, and held objects.

CONCLUSION

It is to be appreciated that the Detailed Description section, and not the Abstract section, is intended to be used to interpret the claims. The Abstract section can set forth one or more, but not all, exemplary embodiments of the disclosure, and thus is not intended to limit the disclosure and the appended claims in any way.

The disclosure has been described above with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.

It will be apparent to those skilled in the relevant art(s) that various changes in form and detail can be made therein without departing from the spirit and scope of the disclosure. Thus the disclosure should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims

1. A proximity screen display system for a communications device, comprising:

a proximity display having a plurality of pixel elements and a plurality of imaging elements, the plurality of pixel elements and the plurality of imaging elements respectively having a first structure and a second structure, the first structure and the second structure having at least one common portion; and
a controller configured to manage sharing of the at least one common portion to permit operation of both the plurality of pixel elements and the plurality of imaging elements.

2. The proximity screen display system of claim 1, wherein the plurality of pixel elements are configured and arranged as a plurality of rows and columns to form a matrix, a first electrode of each of the plurality of pixel elements in each row in the matrix being coupled to each other and a second electrode of each of the plurality of pixel elements in each column in the matrix being coupled to each other, and

wherein the controller is further configured to provide currents or voltages to the plurality of rows and columns of the matrix to configure the plurality of rows and columns to display the image.

3. The proximity screen display system of claim 1, wherein the controller is further configured to determine inactive pixel elements from among the plurality of pixel elements that are inactive when the image is displayed by the proximity screen display and to cause imaging elements from among the plurality of imaging elements that correspond to the inactive pixel elements to sense the light.

4. The proximity screen display system of claim 1, wherein the controller is further configured to duty cycle the plurality of pixel elements and the plurality of imaging elements to switch between their inactive or active configurations such that when an imaging element from among the plurality of imaging elements senses the light, its respective pixel element from among the plurality of pixel elements does not illuminate its respective color pixel.

5. The proximity screen display system of claim 1, wherein the controller is further configured to adjust an image parameter of a portion of the image in response to the plurality of sensing signals.

6. The proximity screen display system of claim 5, wherein the portion of the image is an integrated virtual keyboard.

7. The proximity screen display system of claim 6, wherein the controller is further configured to cause the proximity screen display to zoom into an alphanumeric key of an integrated virtual keyboard that coincides with a location of the object in response to the plurality of sensing signals.

8. The proximity screen display system of claim 1, wherein the controller is further configured to interpolate an image of an environment surrounding the proximity screen display in response to the plurality of sensing signals to detect a location of an object.

9. The proximity screen display system of claim 8, wherein the controller is further configured to cause the pixel elements that coincide with the object from among the plurality of pixel elements to adjust their illumination in response to the plurality of sensing signals.

10. The proximity screen display system of claim 1, wherein magnitudes of the plurality of sensing signals depend upon an amount of the light sensed by the plurality of imaging elements.

11. A proximity screen display for a communications device, comprising:

a plurality of pixel elements configured and arranged as a plurality of first rows; and
a plurality of imaging elements configured and arranged as a plurality of second rows,
wherein the plurality of first rows is interdigitated with the plurality of second rows.

12. The proximity screen display of claim 11, wherein the plurality of first rows is interdigitated with the plurality of second rows to form a checkerboard-like configuration.

13. The proximity screen display of claim 11, wherein the plurality of pixel elements comprises at least one of:

a plurality of pixel elements of a flexible organic light-emitting diode screen display;
a plurality of pixel elements of an electronic paper, e-paper, or electronic ink proximity screen display; or
a plurality of pixel elements of a liquid crystal proximity screen display.

14. The proximity screen display of claim 11, wherein at least one of the plurality of imaging elements comprises:

a photovoltaic device.

15. The proximity screen display of claim 14, wherein the photovoltaic device comprises:

a transparent conducting cathode layer configured to attract negatively charged electrons from a donor-acceptor layer when photons of sufficient energy strike the photovoltaic device; and
a transparent conducting anode layer configured to attract positively charged electron holes from the donor-acceptor layer when the photons of sufficient energy strike the photovoltaic device.

16. The proximity screen display of claim 15, wherein the donor-acceptor layer is configured to provide the negatively charged electrons to the transparent conducting cathode layer or the positively charged electron holes to the transparent conducting anode layer when the photons of sufficient energy strike the photovoltaic device, causing a current to flow between the transparent conducting cathode layer and the transparent conducting anode layer.

17. A method for operating a proximity screen display, comprising:

displaying, by a plurality of pixel elements, an image on a display area of a proximity screen display;
capturing, by a plurality of imaging elements within the proximity screen display, a sequence of images containing user input elements, at least some of which are non-contact but in proximity to the proximity screen display;
analyzing at least one image from among the sequence of images; and
adjusting a visual interface element to conform to the user input elements.

18. The method of claim 17, wherein the visual interface element is a keyboard, and wherein the adjusting comprises:

adjusting parameters to enlarge or to zoom into coincidental portions of the keyboard that coincide with the user input elements.

19. The method of claim 17, wherein the capturing comprises:

capturing the sequence of images containing at least one of: touch typing having full user contact with the proximity screen display; touch typing having user hover with keystroking contact with the proximity screen display, or touch typing having hover without contact with the proximity screen display but with keystroke-like user motions.

20. The method of claim 17, wherein the adjusting comprises:

causing a bordered circle to appear on the proximity screen display at a corresponding x-y location in response to the analyzing.
Patent History
Publication number: 20130100026
Type: Application
Filed: Oct 19, 2012
Publication Date: Apr 25, 2013
Applicant: Broadcom Corporation (Irvine, CA)
Application Number: 13/655,910
Classifications
Current U.S. Class: Including Keyboard (345/168); Including Optical Detection (345/175)
International Classification: G06F 3/042 (20060101); G06F 3/02 (20060101);