Method and apparatus for tomographic touch imaging and interactive system using same
A touch screen in the form of a panel is capable of conducting signals, and a tomograph including signal flow ports is positioned adjacent the panel with the signal flow ports arrayed around the border of the panel at discrete locations. Signals are introduced into the panel to pass from each discrete border location to a plurality of other discrete border locations, where they are detected and tomographically processed to determine whether any change occurred to the signals, due to the panel being touched, during signal passage through the panel, and therefrom to determine any local area on the panel where a change occurred. The tomograph computes and outputs a signal indicative of a panel touch and its location, which can be shown on a display.
This application claims the priority and benefit of previously filed Argentinean Patent Application No. 20070105651 filed on Dec. 17, 2007, in the name of Victor Manuel Suarez Rovere, which is hereby incorporated by reference in its entirety.
BACKGROUND OF THE INVENTION

1. Field of the Invention
This invention relates to a method and apparatus for a tomographic touch imaging sensor and an interactive system using the same. More particularly, the invention relates to a method and apparatus for touch sensitive devices for computing systems or for controlling actions of associated devices or equipment, and still more particularly, to devices that detect and/or process simultaneously multiple touch interactions (by fingers or other objects) at distinct locations on a touch-sensitive surface.
2. Prior Related Art
Many types of input devices are presently available for performing operations in a computing system, such as buttons or keys, mice, trackballs, touch panels, joysticks, touch screens and the like. Touch screens, in particular, are becoming increasingly popular because of their ease and versatility of operation, as well as their declining price. Touch screens can include a touch panel, which can be a clear panel with a touch-sensitive surface. The touch panel can be positioned in front of a display screen so that the touch-sensitive surface covers the viewable area of the display screen. Touch screens can allow a user to make selections and move a cursor by simply touching the display screen via a finger or stylus. In general, the touch screen can recognize the touch and position of the touch on the display screen, and the computing system can interpret the touch and thereafter perform an action based on the touch event.
One limitation of many conventional touch panel technologies is that they are only capable of reporting a single point or touch event, even when multiple objects simultaneously come into contact with the sensing surface. That is, they lack the ability to track multiple points of contact at the same time. Thus, even when two points are touched, these conventional devices can only identify a single location, which is typically the average between the two contacts (e.g. a conventional touchpad on a notebook computer provides such functionality). This single-point identification is a function of the way these devices provide a value representative of the touch point, which is generally by providing an average resistance or capacitance value.
Another limitation of most touch panels, besides being incapable of reporting a plurality of touch points that occur simultaneously, is that they provide information about touch coordinates only. Most known touch panels cannot provide a complete representation of the details of all shapes contacting the panel, because the methods of detection often amount to just triangulation of touch points. Thus, a need exists for a better and fuller representation of touch interactions. The present invention proposes a method and apparatus for achieving this, for example, by providing a pixelated image representative of all the touch areas and their shapes. This ability to provide a full representation of all touch interactions is sometimes called “true-touch”.
SUMMARY OF THE INVENTION

Accordingly, it is a principal object of my invention to provide a method and apparatus for tomographic touch imaging and interactive system using same, especially for computing systems but also usable to control actions of other devices, and still more particularly, to devices that detect and/or process simultaneously multiple touch interactions (by fingers or other objects) at distinct locations on a touch-sensitive surface. The foregoing is accomplished by combining a touch panel with a tomograph in a unique way, and by using this combination as a touch imaging sensor and/or as an interactive system in a computer system.
The present invention applies techniques from the field of tomography, but instead of using them for medical purposes, the tomographic techniques are used to make a computer input device. Specifically, the input device is a touch device capable of multi-touch operation, oriented to control the forthcoming new generation of software applications that propose a new paradigm for Human Computer Interaction.
Since tomography is used to determine inner information from the borders of a panel, and because invisible signals are used, it is possible to make a transparent touch panel with almost perfect light transmission properties. The transparent properties make the touch sensor useful for working with a display, as a touch screen. Other such input devices need to place complex elements on the panel that obstruct vision. Because those devices need at least some transparency, yet cannot reach 100%, they are difficult to manufacture, which affects costs.
The present invention can be implemented with a very clear and homogeneous material, like a simple sheet of acrylic. Also, because the panel can be very thin, a slim and small system including a flat display can be made, thus enabling the system to be portable.
Several objects and advantages of my invention are listed as follows.
- a) To provide a device that is sensitive to touch and able to detect objects when they apply subtle pressure. If the touching object is a finger of a user, the invention can provide tactile feedback. This is in contrast to proximity sensors, which can be activated when the user isn't touching anything.
- b) To provide a touch sensitive device that can provide a digital representation of nearly all areas where a touch interaction is occurring. This is in contrast to devices that provide location information for just a few touch interactions simultaneously.
- c) To provide a touch sensitive device that can be thin, thus having a large relative surface area with respect to volume.
- d) To provide a touch sensitive device that can have a transparent touch panel so that it can include a display visible through the panel.
- e) To provide a touch sensitive device that can be interactive by providing output information related to touch interactions.
- f) To provide a thin touch sensitive device that can be flat, or non-planar (e.g. domed) in 3-dimensional space.
- g) To provide, in some embodiments, a touch sensitive device that can detect a zero-pressure contact.
- h) To provide, in some embodiments, a touch sensitive device that can discriminate more than one level of pressure among different touch interactions that occur simultaneously.
- i) To provide, in some embodiments, a touch sensitive device that can be robust to some external unwanted interfering signals.
- j) To provide, in some embodiments, a touch sensitive device that can have a transparent touch area of quite low absorption with respect to visible light, useful to make the power requirements of a possibly attached display less demanding.
- k) To provide, in some embodiments, a touch sensitive device that can be constructed with elements relatively easy to manufacture.
Further objects and advantages of my invention will become apparent from the following detailed description of preferred embodiments of the invention when taken in conjunction with the appended drawings.
Referring now to the drawings, and particularly to the embodiment of the present invention, a touch device is illustrated in
Surrounding the conductive panel 10, there is an optical tomograph 30. In the context of the present invention, “tomograph” means, as usually defined, a device that can image from the borders. The tomograph is controlled to emit light signals 20 by light emitters 34 through signal flow ports, and also detects light signals by detectors 32.
The term “signal flow port” refers to both emitters and receivers (detectors); a signal flow port physically acts either as an emitter or as a receiver. For example, an electrode in the electrical part of the embodiment can be a single device that serves both purposes, emitter and/or receiver. Also, an optoelectronic device like an LED can be used as a detector, although with lower performance. A signal flow port here is therefore not necessarily a distinct material element housing an emitter or detector. In the context of this invention, a signal flow port is a term used to refer interchangeably to an emitter, to a detector, or to a device that may accomplish both roles. Thus, the port houses an emitter and/or detector and may be virtual or real.
The signal flow ports (emitters and detectors) are positioned at the edge 22 of the panel 10. The light signals 20 introduced into the panel 10 by emitters 34 at their associated signal flow ports penetrate or pass through the panel 10, in the manner described, and are detected by detectors 32 at their associated signal flow ports after circulating or passing through the panel 10. The tomograph 30 generates a tomogram representative of the panel conductivity, which is a two dimensional representation of the calculated or estimated conductivity on many imaginary or notional regions 40 defined on the panel surface 12, as shown by the array on
An exemplary shape 50 of an interesting and quite challenging touch interaction is shown on
The tomograph 30 of the touch device as illustrated on
The tomograph 30 also includes an emission pattern controller 60 operatively coupled to each emitter 34, for example, using any conventional set of communication channels 62, shown in
The emission pattern controller 60 can energize, upon requirement or according to any predetermined program, any desired combination or set of said emitters 34. Each emitter 34 can be energized at any desired emission level from at least two possible values. The configuration of a set of emitters and their levels defines an emission pattern. A representative emission pattern, according to the invention, can be to have one determinate or preselected emitter 34 active while leaving the rest inactive, and then to step serially through the emitters 34, one at a time.
The signal sampler 64 can measure, upon requirement or according to any predetermined program, the signal level received on each detector 32. A representative signal sampler 64, according to the invention, can have one or more Analog to Digital Converters (ADC) and also may have one or more analog and/or digital multiplexers.
The data processor 66 can be incorporated in any known computer system, and is adapted to command the emission pattern controller 60 to energize the emitters 34 using a desired sequence of patterns. Also, the emission pattern controller 60 and/or signal sampler 64 may function based on a determinate or preselected program, without being commanded by the data processor, for example as a digital logic circuit.
By operating the emission pattern controller 60 and the signal sampler 64, an unreconstructed tomogram can be acquired.
To operate the emission pattern controller 60, a set of emission patterns has to be defined; for example, a sequence of patterns may be to activate, in turn or serially, different emitters 34. Each emission pattern is active at a different time, forming a sequence. While each emission pattern is in effect, an associated set of detectors 32 has to be defined to measure the effects of the active emissions. Then, the data processor 66 acquires from the signal sampler 64 the measured values from the associated set of detectors 32, the set being typically all or a substantial subset of the detectors 32. The measured values form part of the unreconstructed tomogram. Thus, after applying all emission patterns and obtaining measurements from the detectors, the unreconstructed tomogram is obtained. An example of two emission patterns, their associated detectors, and a representation of the signal alterations is shown in the graph of
The data processor 66 may be a microcontroller, microprocessor, FPGA, ASIC, or special purpose electronic circuit, or even part of a complete computer system. Although it may be desirable for the data processor 66 to include a data memory containing a computer program, it is possible to practice the invention by performing most computations with just a digital logic circuit. For example, a tomographic reconstruction calculation, as known in the art, can be executed with just an integrated circuit. That integrated circuit can lack a specific “memory”, because the computation can be hardwired. The invention contemplates a model using a programmable logic circuit (FPGA) that has a memory, but without any “computer program” as such stored in the memory.
In a typical embodiment, the data processor 66 commands the emission pattern controller 60 to activate one determinate emitter 34, then acquires measurements from all detectors 32, and then repeats these steps serially for the rest of the emitters 34, one by one. Thus, if the number of emitters 34 is Ne and the number of detector ports is Nd, a vector of Ne×Nd elements is acquired. One can call this vector the projection vector, or one can call it a measurements vector, or just “b”. This acquired vector, not to be confused with a matrix, is sometimes called in the context of Computer Tomography (CT) the “unreconstructed tomogram” or sometimes the “sinogram”.
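As a concrete illustration of this acquisition loop, the following minimal sketch builds the measurement vector b in Python. The hardware-facing functions set_emitter and read_detector are placeholders invented for this sketch; a real device would drive the emission pattern controller 60 and signal sampler 64 instead.

```python
import numpy as np

rng = np.random.default_rng(0)

def set_emitter(index: int, level: int) -> None:
    """Placeholder for the emission pattern controller 60 (hardware-specific)."""
    pass

def read_detector(index: int) -> float:
    """Placeholder for the signal sampler 64; returns one ADC reading."""
    return float(rng.random())  # stand-in for a measured light level

def acquire_unreconstructed_tomogram(num_emitters: int, num_detectors: int) -> np.ndarray:
    """Energize one emitter at a time and read every detector (builds vector b)."""
    b = np.empty(num_emitters * num_detectors)
    for e in range(num_emitters):
        set_emitter(e, level=1)                 # one emitter active, rest off
        for d in range(num_detectors):
            b[e * num_detectors + d] = read_detector(d)
        set_emitter(e, level=0)
    return b

b = acquire_unreconstructed_tomogram(num_emitters=32, num_detectors=32)
print(b.size)  # 1024 == Ne * Nd measurements
```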
Having the unreconstructed tomogram with Ne×Nd potentially independent values, the amount of information available for reconstruction is thus in quadratic relation to the number of signal flow ports. So the number of potentially independently recognized touch interaction regions can be in accordance with that quadratic relation, and not in linear relation to the number of emitters 34 or detectors 32. Thus, the number of potentially discriminated regions can be quite large. Hence, the foregoing is a sample of the potential of the present invention to discriminate not just a few simultaneous touch coordinates, but many.
Having the unreconstructed tomogram, a reconstructed tomogram can be calculated by the use of a Tomographic Reconstruction algorithm. The reconstructed tomogram, or just tomogram, is the representation of the features of the tomographed “phantoms”, a phantom being a particular “test” or “active” touch interaction.
In a tomographic reconstruction algorithm, a linear or nonlinear model of the tomographic system may be used to transform the unreconstructed tomogram. If a linear model is chosen, the system may be modeled by matrix equations and solved by linear matrix operations in a known manner.
A typical equation used to model tomographic systems is Ax=b, where A corresponds to the “system matrix”, as it is sometimes called, and x corresponds to the reconstructed tomogram: a vector that represents the regions' conductivity values, not to be confused with a matrix just because the regions usually form a grid. The variable b is the unreconstructed tomogram, a vector containing the measured projection values reaching the detectors 32 for each emission pattern. The system matrix, in the context of CT, represents the transformation that the typical linear tomographic system accomplishes to generate the projection data. A typical linear tomograph thus works by transforming b into x, taking matrix A into account as a transformation operator.
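For reference, and assuming the linear model just described with Nr defined reconstruction regions, the dimensions of the quantities in Ax=b are as follows (a standard formulation, stated here for concreteness):

```latex
A\,x = b, \qquad
A \in \mathbb{R}^{(N_e N_d) \times N_r}, \quad
x \in \mathbb{R}^{N_r}, \quad
b \in \mathbb{R}^{N_e N_d}
```

Each entry of A gives the contribution of a unit conductivity change in one region to one emitter-detector measurement, consistent with the row and column interpretation given later for the system matrix.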
To be able to calculate a two-dimensional tomogram adequately, one needs to have defined appropriate locations for each emitter 34 and detector 32, which typically surround the panel 10 on its edge(s). Also, one needs to define imaginary or notional locations and shapes of a set or array of regions 40 on the surface 12 of the panel 10, where an estimation of conductivity is going to be calculated. The regions 40 are sometimes called “reconstruction regions”. In a typical embodiment, the regions 40 are rectangular and are disposed next to each other in an appropriate imaginary or notional grid of pixels.
Also, to be able to calculate an adequate tomogram, the regions have to be geometrically defined in a way that each region is affected by more than two non-parallel line integrals, each line integral being defined as the path of an actually measured signal between a determined pair of emitter 34 and detector 32. This is to accomplish, for each region, adequate “angle density”, as it is usually called. In a diffusion tomograph the concept of a line integral may geometrically resemble not a line but a sort of “field”, as shown on
A determinate signal corresponding to a line integral may be altered in more than one point of its path without being substantially obstructed by a single touch, as depicted in
Those skilled in the art will understand that the requirements of high angle density for each region and the existence of an integral function for tomographic reconstruction are supported by the Projection Slice Theorem, which is related to the Radon Transform. Basically, the projection-slice theorem tells us that if one had an infinite number of one-dimensional projections of an object taken at an infinite number of angles, one could perfectly reconstruct the original object, f(x,y). Note the need for multiple angles (as many as possible in a non-theoretical device) and the need for “projection” (i.e., integration data) along the line. However, the present invention is not limited by this theorem.
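For reference, a standard statement of the theorem (not part of the original disclosure): the one-dimensional Fourier transform of a projection of f(x,y) at angle θ equals the slice of the two-dimensional Fourier transform of f along the same angle through the origin:

```latex
p_\theta(s) = \int_{-\infty}^{\infty}
  f(s\cos\theta - t\sin\theta,\; s\sin\theta + t\cos\theta)\,dt,
\qquad
\mathcal{F}_1\{p_\theta\}(\omega) = \hat{f}(\omega\cos\theta,\; \omega\sin\theta)
```

Here p_θ is the Radon projection of f, and F₁ denotes the one-dimensional Fourier transform along the projection coordinate s.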
It is well known that a finger interacting with a waveguide, as in the presented embodiments, makes better and broader optical contact as more pressure is applied. This effect occurs because, together with other causes, the skin of the fingerprints gets more deformed, so the contact area per unit area, and thus the average conductivity, is altered to a higher degree. This higher change of conductivity can be interpreted as more pressure and used to take distinctive actions. For example, an icon of a user interface can be selected with light pressure but activated with more pressure.
Also, as with any measurement instrument, repeated measurements can be made over time to obtain a time-varying measurement, so taking a series of tomograms over time is useful for determining how touch interactions change over time, and that information may be used to detect touch events or gestures, with algorithms well known in the art.
Another embodiment of the touch device, as illustrated in
Another embodiment of the touch device, as illustrated on
The transformation of the information representative of touch interactions into information for the output device may be of the type commonly used in Graphical User Interfaces (GUI). For example, the transformation could be to show an icon on the display with an alternative color when a touch interaction event is detected on an area inside the icon, so a user can have the sensation of touching the icon.
The output device 70 may typically be a display such as a Liquid Crystal Display (LCD). A device with a display behind a transparent panel to form an interactive system is one of the main objectives of the invention. However, the touch sensor could be used without a display, for example, just to generate sounds.
The tomogram representative of regions where touch interactions occur may be like a pixelated image representing the conductivity of each region, but an event-based GUI may be designed to accept just coordinates of touch interactions or more complex gestures instead of levels of activity on determinate regions. So an algorithm to translate activity on each region into coordinates of touch interactions or gestures is useful and contemplated by the invention. The algorithm for determination of touch events is often called “Blob Detection and Tracking” and is well known in the art. It is used by some systems that calculate and track touch locations by applying computer vision to an image of the touch regions captured with a video camera.
In another embodiment of the touch device according to
In another embodiment of the touch device the panel may be of a material acting as a dielectric, and the tomograph may be a Capacitance Tomograph, where the touch of a finger or other object can alter the dielectric properties of the panel.
In another embodiment of the touch device the panel may be of a material acting as a vibration conducting medium, and the tomograph may have acoustic or piezoelectric transducers as ports, in a way similar to how an ultrasonic tomograph for prenatal imaging is implemented. The acoustic impedance of the panel can be altered by the touch of a finger or other object.
Operation
The operation of the touch screen device included in an embodiment of an interactive device (the method of the invention) is shown in the flow chart of
Steps S1 to S7 are carried out as described in the blocks of
Steps S8 to S22 are carried out as described in the blocks of
Steps S18 to S22 are carried out as described in the blocks in
Step S1 is for defining positions and shapes of the imaginary or notional reconstruction regions over the sensitive surface. The regions typically form a pixelated grid, but each region can have its own shape, which can be polygonal, or can even define a surface of more than one disjoint shape.
Step S2 is for defining the set of emission and detector ports to be involved in unreconstructed tomogram acquisition, and also for selecting their positions. The port positions may typically be in the surroundings of the sensitive panel, for example alternating an emitter with a detector at regular intervals. Also, the positions may be near just some portions of the panel's surroundings, for example on just a subset of edges of a polygonal panel. Where there are no ports, the panel may have reflective materials so signals can bounce there (be reflected) and reach other ports.
Step S3 is for defining a sequence of emission patterns, where the sequence typically contains a linearly independent set of emission patterns, for example, the activation in sequence of just one emitter. Another set of patterns may include patterns that, interpreted as a matrix, have a substantially high rank so that different emission patterns can be discriminated. Matrix rank is well known to those of ordinary skill in the art. Another example of patterns can be the code sequences normally used for CDMA, a well known multiplexing technique. The objective is to permit multiplexing and adequate discrimination of the different patterns.
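The rank criterion can be checked numerically. The sketch below compares the one-emitter-at-a-time patterns (an identity matrix) with a Hadamard-derived 0/1 code set, used here as an assumed stand-in for the CDMA-style codes mentioned above:

```python
import numpy as np
from scipy.linalg import hadamard

num_emitters = 8

# One-emitter-at-a-time patterns: the identity matrix, trivially full rank.
one_at_a_time = np.eye(num_emitters)

# A CDMA-like alternative (assumption: rows of a Hadamard matrix mapped to
# 0/1 emission levels); several emitters are active per pattern, yet the
# patterns remain separable because the matrix keeps full rank.
codes = (hadamard(num_emitters) + 1) // 2

for patterns in (one_at_a_time, codes):
    print(np.linalg.matrix_rank(patterns))  # prints 8 for both
```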
Step S4 is for defining for each emission pattern a set of associated detector ports. Once an emission pattern is active, the signal reaches a set of detector ports and some of them are selected for measurement. The set of ports associated with an emission pattern may be selected taking into account the level of the signal normally received.
Step S5 is for taking a reference unreconstructed tomogram that will be used to calculate the differences in conductivity when any touch is occurring. The panel has a default conductivity when no touch is affecting it, and the measurements are for determining the changes produced by touch interactions.
Step S6 is for calculating the system matrix A, assuming a linear model of the tomographic system. A possible way of calculating the system matrix to be further used to calculate the tomogram is to make X an identity matrix in the equation AX=B, where B is a matrix of measurements: a set of unreconstructed tomogram acquisitions. If X is the identity, then A=B; thus A may be calculated by touching each reconstruction region in sequence, with the corresponding shape and “unit pressure”, and measuring the projection at each touch step so as to construct B. This set of measurements results in the system matrix A. This procedure, as explained above, will be readily evident to a person of ordinary skill in the art.
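A minimal sketch of this calibration follows, assuming the acquisition helper from the earlier sketch and hypothetical touch_region/release_region actuators (for example, a mechanical arm applying a "unit pressure" touch to one region at a time); none of these function names come from the disclosure.

```python
import numpy as np

def calibrate_system_matrix(num_regions, acquire, touch_region, release_region):
    """Build A column by column: one "unit pressure" touch per region (Step S6)."""
    b0 = acquire()                       # reference acquisition, no touch (Step S5)
    A = np.empty((b0.size, num_regions))
    for r in range(num_regions):
        touch_region(r)                  # unit touch on region r only
        A[:, r] = acquire() - b0         # column = change w.r.t. the reference
        release_region(r)
    return A, b0
```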
Step S7 is for calculating the pseudoinverse A′ of the system matrix by using a Truncated Singular Value Decomposition (TSVD) algorithm. A typical algorithm for pseudoinverse calculation is provided by the function pinv of the MATLAB software from MathWorks. The parameter that controls the truncation may be used to adjust the results provided by the reconstruction algorithm, for example to make a better estimation of the conductivity values by using the truncation parameter to reduce the resulting reconstruction noise. A way of estimating the truncation parameter is to simulate the system with respect to its noise performance and optimize the parameter so as to minimize the noise figure.
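MATLAB's pinv is named in the text; in Python the analogous call is NumPy's pinv, whose rcond argument plays the role of the truncation parameter (a sketch, not the patent's implementation; note that NumPy's tolerance is relative to the largest singular value, whereas MATLAB's tol is absolute):

```python
import numpy as np

def truncated_pseudoinverse(A: np.ndarray, tol: float) -> np.ndarray:
    """TSVD pseudoinverse: singular values below tol times the largest
    singular value are discarded by NumPy's pinv."""
    return np.linalg.pinv(A, rcond=tol)

A = np.random.default_rng(0).normal(size=(64, 25))  # stand-in system matrix
A_pinv = truncated_pseudoinverse(A, tol=0.4)        # 0.4: see the best mode below
print(A_pinv.shape)  # (25, 64)
```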
The parameters for tomographic reconstruction, which include the definition of regions, the positions of ports, and the sequence of emission patterns with their associated detectors, need to be selected in a way that permits adequate reconstruction. A way of evaluating the suitability of the parameters is to calculate the associated system matrix A, assuming a linear model of the tomographic problem, and then evaluate characteristics of the matrix A. For example, a criterion could be to evaluate whether the matrix rank is close to the number of defined regions, with a condition number that characterizes a “well-conditioned” problem. If, after evaluating the suitability of the tomographic parameters, they don't meet the defined criteria, the parameters can be redefined until the criteria are met. For example, fewer and bigger regions may be defined. Condition number, rank, and matrix conditioning are concepts well understood by those skilled in the art.
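A sketch of this suitability check; the numeric thresholds below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def parameters_are_suitable(A: np.ndarray, num_regions: int) -> bool:
    """Rank close to the region count, and a condition number small enough
    to characterize a well-conditioned problem (thresholds are assumptions)."""
    rank = np.linalg.matrix_rank(A)
    cond = np.linalg.cond(A)          # 2-norm condition number, via SVD
    print(f"rank={rank} (regions={num_regions}), condition number={cond:.1f}")
    return rank >= 0.95 * num_regions and cond < 1e3
```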
Step S8 represents the actions of a user who performs control operations through one or more touch interactions that can occur simultaneously over the sensitive surface.
Step S9 is for initializing a vector b for holding the subsequently acquired unreconstructed tomogram.
Step S10 is for energizing emitter ports by configuring the pattern controller with an emission pattern. In each step of the loop, each emission pattern is selected, in sequence, from the set of defined emission patterns and applied.
Step S11 is for measuring the signal levels reaching the set of detector ports associated with the active emission pattern. To measure signal levels with some rejection of possible interfering signals, for example external light in the optical embodiment, a double reading can be done. For example, the signal levels reaching the detectors with all emitters off can be subtracted from the values measured when the emission pattern is active.
An embodiment can discard some of the acquired values according to a criterion, or simply not measure the value. For example, emitting from a determinate emitter and measuring from a detector in close proximity may be of little use. Other criteria to discard measurements may include discarding emitter-detector combinations that involve a weak signal between them because of, for example, their relative angle of incidence. Most emitters and receivers respond weakly to signals at high angles with respect to the perpendicular. Also, the measured values can be weighted by associating a weighting factor with each measured value, an operation useful for reducing overall reconstruction noise.
Step S12 is for subtracting from this value the corresponding reference value taken from vector b0. So when no touch is active, the resulting value will be zero plus the possible measurement noise. In Step S13, the obtained values are appended to the vector b. As shown in Step S14, the steps from S10 to S13 are repeated until the unreconstructed tomogram is completely acquired.
Step S15 is for preprocessing the unreconstructed tomogram, for example to linearize it by applying a nonlinear operation, in a similar way that most X-ray tomography devices first apply a logarithm to linearize the cumulative effect of the successive attenuations that tissue imposes on X-rays. Step S16 is for calculating the reconstructed tomogram x by solving the equation x=A′b, where A′ is the pseudoinverse of the system matrix and b is the unreconstructed tomogram. Step S17 is for post-processing the reconstructed tomogram, for example with a noise gate that assigns zero to values that are below some threshold. The threshold can be set just above normal peak noise values.
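A minimal sketch of Steps S15 to S17 follows. The preprocessing is system-dependent (the text gives X-ray log-linearization only as an analogy), so the identity is used here as a placeholder assumption:

```python
import numpy as np

def reconstruct(A_pinv: np.ndarray, b: np.ndarray, threshold: float) -> np.ndarray:
    b_lin = b                       # S15: preprocess / linearize (placeholder)
    x = A_pinv @ b_lin              # S16: reconstructed tomogram x = A'b
    x[x < threshold] = 0.0          # S17: noise gate, set just above peak noise
    return x
```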
Step S18 is for recognizing touch interactions, taking into account the reconstructed tomogram, and generating touch events. To recognize events, a set of previously acquired tomograms may also be used. Algorithms for recognition of touch events and gestures are well known in the art, are sometimes called “Blob Detection and Tracking”, and may be based on algorithms such as the Watershed algorithm. The kinds of events recognized can be a new touch appearing, a touch disappearing, an existing touch moving, or other gestures. A gesture can be a “pinch” gesture, which is produced by touching the panel in two locations and separating the fingers. That pinch gesture can be used, for example, to control the zoom factor in a map application. A touch event can be associated with a coordinate that can have more resolution than that provided by the reconstructed tomogram. A set of values around an area where the touch is occurring can be processed to obtain a sub-pixel coordinate. For example, given a 3×3 pixel area, the coordinates of the pixel centers in that area can be averaged, weighting each coordinate by the value associated with each pixel, obtaining a coordinate with sub-pixel resolution.
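A sketch of the sub-pixel computation just described, for a 3×3 window around a peak pixel (it assumes the peak is not on the tomogram border and that the noise gate has left a nonzero window sum):

```python
import numpy as np

def subpixel_centroid(x_img: np.ndarray, row: int, col: int):
    """Weighted average of pixel-center coordinates over a 3x3 window."""
    window = x_img[row - 1:row + 2, col - 1:col + 2]
    rows, cols = np.mgrid[row - 1:row + 2, col - 1:col + 2]
    total = window.sum()                       # nonzero after the noise gate
    return ((rows * window).sum() / total,     # weighted average of the
            (cols * window).sum() / total)     # pixel-center coordinates
```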
Alternatively, another embodiment of the touch device operation may skip the touch event detection and generation, transforming the reconstructed tomogram by DSP algorithms or other methods to control the output device. For example, the output device may be a sound generation device. The application to be controlled may be a synthesizer, using touch interactions to alter synthesizer parameters. As an example, interactions on the horizontal axis may be mapped to pitch and interactions on the vertical axis to harmonics levels, thus forming a multi-spectral instrument.
Step S19 is for modifying the state of a user interface program in response to the touch events. For example, if the user interface has a model of a virtual button, that button's state becomes active if a touch interaction is detected in the area within the virtual button perimeter.
Step S20 is for generating the output information representative of the altered state of the UI program. For example, in the virtual button case, the output information is one that maps to different colors according to the button activation state.
Step S21 is for showing output information on the display adapted to that information through a transparent implementation of the touch sensitive panel.
Finally, the output of Step S21 is passed to Step S22, which is the user interface receiving feedback of the touch control operations. The user sees the results on the display. Further interactions from the user can be processed by repeating steps S8 to S22.
The best mode contemplated for practicing the present invention, as of the filing of this application, is an interactive system with the optical FTIR-based embodiment. This embodiment permits the use of a panel with almost perfect light transmission properties, the detection of touch interactions with zero pressure and also with above-zero pressure, and the use of a very low-cost panel, along with other manufacturing advantages, like ease of mounting. This embodiment is detailed as follows.
A conventional LCD display is used as an output device. On top of the display there is a clear panel made of a sheet of acrylic less than a few millimeters thick, where infrared light passes from the borders. The objective is to make the screen visible through the panel. The panel is in mechanical contact with the screen by means of two-sided tape placed on the borders of the screen top, outside its active visible area. To prevent the display from causing unwanted frustration of the TIR effect on the panel's bottom surface, the tape is reflective or mirror-like. The tape is thick enough that if mechanical deformation occurs when pressure is applied with the fingers, the panel does not touch the screen. The panel shape is similar to a rectangle, but octagonal with short diagonal corners, similar to the panel shown on
Surrounding the entire panel perimeter, there are infrared LEDs used as emitters and phototransistors used as detectors, touching the panel's borders, so infrared light can enter and exit the panel. The LEDs and phototransistors alternate, each emitter between two detectors and vice versa. The emitters and detectors are spaced apart at regular intervals on each panel edge, separated by about one millimeter. The emitters and detectors are soldered on different Printed Circuit Boards, one for each panel edge.
The emitters are connected to a circuit that works like an LED matrix controller that permits activating any LED. The possible activation levels for each LED are two: no current, or the maximum current allowed for the emitters. Since each LED is active for just a short period of time, more than the nominal continuous rate of emission is possible, as in conventional LED matrices.
The phototransistor detectors have infrared filters matching the emission spectrum of the emitters. Each phototransistor makes its current flow through a resistance to convert the current into a voltage. The detectors are grouped in groups of eight elements, and each group is connected to an 8-to-1 analog multiplexer.
Each analog multiplexer is associated with a 14-bit Analog to Digital Converter, and the digital outputs of the ADCs are connected to a digital multiplexer, so each ADC conversion value can be addressed. The digital multiplexer is bus-based, so each non-selected ADC is in a high-impedance state while the addressed ADC is in a low-impedance state with respect to the bus.
The LED matrix controller and the digital multiplexer are connected to an FPGA that controls them and is able to execute a Tomographic Reconstruction Algorithm.
The FPGA is connected to a CPU adapted to run an application with a User Interface. The CPU is also connected to the display.
On the panel's top surface a pixelated matrix is defined. The total area matches the screen's visible area. The pixels are square, with a side size matching the emitter and detector separation. A vector representing an unreconstructed tomogram is acquired by energizing each emitter in turn and, while each emitter is active, taking a measurement from all detectors. To optimize acquisition time, all ADCs are commanded in parallel to take the measurement using a selected multiplexer channel, repeating this for the eight acquisition channels. The measurements are sent to the FPGA through the bus.
Each time an emitter is energized and measurements are taken, another set of measurements is taken with all emitters deactivated. This measurement is to read the ambient light interference level to be subtracted from the measurement with the emission pattern active. That measurement is taken many times because the ambient light can vary; for example, most light bulbs oscillate at 120 Hz. To estimate the interference level at the time of the measurement with the emitter active, an average of the interference measurements taken before and after is calculated. This takes into account the fact that phototransistors are not as fast as other optoelectronic devices like PIN photodiodes, so a delay is inserted before each measurement to account for the phototransistor time response.
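A sketch of this interference rejection; read_all_detectors and set_emission_pattern are hypothetical stand-ins for the device's bus commands, not names from the disclosure:

```python
import numpy as np

def measure_with_ambient_rejection(read_all_detectors, set_emission_pattern,
                                   pattern) -> np.ndarray:
    dark_before = read_all_detectors()     # all emitters off
    set_emission_pattern(pattern)          # activate, then allow the
    active = read_all_detectors()          # phototransistor settling delay
    set_emission_pattern(None)             # all emitters off again
    dark_after = read_all_detectors()
    # interference estimated at the time of the active reading
    return active - (dark_before + dark_after) / 2.0
```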
With the tomogram acquisition as described, a base measurement, representative of the tomogram acquired when no touch interactions occur, is taken at the factory and stored with the device in a provided memory. That base measurement is called b0. This measurement is taken on every touch device produced, to account for possible variations in each specific device's properties. For example, not all LEDs in a batch produce the same level of light emission under the same conditions. So this measurement accounts for that possible variation.
A linear model of the tomographic system is assumed. The system matrix A is also calculated at the factory, by methods used in Algebraic Reconstruction Methods. Each column of matrix A is the unreconstructed tomogram taken when a determinate “unity” touch interaction is in effect. This unreconstructed tomogram is the change with respect to the base tomogram b0. The “unity” touch interaction is an interaction in a determinate region, with a shape and position in accordance with that region. The columns of matrix A thus represent the reconstruction regions. Each “unity” touch interaction can be made by actually touching the panel with a mechanical arm, or by simulating the effect. Those skilled in the art will know how to simulate the effect, but a method is proposed anyway. Since each row of system matrix A represents a determinate emitter-detector pair, for each emitter-detector pair the row represents all regions associated with that pair. The regions correspond to a pixelated grid and each region has an associated coefficient, so a row can be viewed like a 2D grayscale image canvas. To calculate each coefficient, an anti-aliased “white” line is drawn between the emitter and detector on a “black” background, taking into account the emitter and detector coordinates. White corresponds to the value 1 and black to 0. The “gray” level of each canvas pixel is then associated with the corresponding row coefficient. Then each row coefficient is multiplied by the value of the corresponding pair from the base tomogram b0, to account for the actual signal levels involved in the specific device implementation. A rectangular system matrix A is used, with more rows than columns, so more measurements are taken than the number of regions. More measurements than regions are useful for more precision in the tomographic reconstruction. It should be noted that in all steps for tomographic reconstruction, information associated with some emitter-detector pairs is not taken into account. Some are discarded according to the following criterion: whether the dot product of the corresponding row of the A matrix is below a threshold. The threshold is selected in a way that the rank of the A matrix isn't much below the region count.
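A sketch of the simulated system-matrix row just described. A simple sampled line stands in for the anti-aliased draw (an assumption made for brevity); coordinates are in region-grid pixels and are assumed to lie inside the grid:

```python
import numpy as np

def simulated_row(grid_shape, emitter_xy, detector_xy, samples=200):
    """One row of A for an emitter-detector pair: per-region coverage weights
    obtained by sampling the line between the two ports on the region grid."""
    canvas = np.zeros(grid_shape)
    (x0, y0), (x1, y1) = emitter_xy, detector_xy
    for t in np.linspace(0.0, 1.0, samples):
        ix = min(int(x0 + t * (x1 - x0)), grid_shape[1] - 1)
        iy = min(int(y0 + t * (y1 - y0)), grid_shape[0] - 1)
        canvas[iy, ix] += 1.0 / samples     # accumulate "gray" coverage
    return canvas.ravel()                   # row of A, before scaling by b0
```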
Having the system matrix A, a pseudoinverse A′ is calculated by the Truncated SVD method, using MATLAB's pinv function. The tolerance parameter is calculated as follows. A value, say 1, is selected. Then, by computer simulation, the reconstruction noise for a set of simulated touch interactions is calculated. Based on the noise properties of the simulation results for each region, the value is lowered or raised, so an optimization algorithm is run. The objective is to have the noise lowered and, if possible, equally distributed over all the regions. A practical value is sometimes around 0.4. The pseudoinverse A′ is also stored on the device for later recall.
Once all parameters for tomographic reconstruction are calculated, the device enters an interactive loop that runs continuously.
A tomogram b is acquired as described, with the base tomogram b0 subtracted, so if no touch interactions are occurring, the b vector contains zeros or just normal noise values. A reconstructed tomogram x is calculated by solving the equation x=A′b on the FPGA, by techniques well known to those skilled in the art. The reconstructed tomogram, as represented by vector x, is interpreted as an image representing touch interactions, with each pixel corresponding to a reconstruction region.
The tomogram x is transmitted to the CPU, where it is post-processed by a noise gate, thus zeroing the values below some threshold. The threshold is by default fixed just above the normal reconstruction noise. The default threshold value is also calculated at the factory, but a noise control knob is provided to users who can run the configuration software.
On the CPU, a set of current and previous tomograms is analyzed to find touch events and gestures. The software analyzes the events by executing a Blob Detection and Tracking algorithm. The algorithm can be selected by the user with a configuration application, where adapted versions of the software TOUCHLIB, http://www.nuigroup.com/touchlib/ and OPENTOUCH, http://code.google.com/p/opentouch/ are offered as options.
After events or gestures are recognized, they are sent via the TUIO protocol to applications adapted to work with that protocol. An example application is a GPS mapping application.
The application running on the CPU generates visual information as feedback that is sent to the display to be seen through the panel by the user, that visual response thus being feedback of the control operations introduced by touching the touch device of the present invention.
Although the present invention has been described regarding several illustrative embodiments, numerous variations and modifications will be apparent to those skilled in the art that do not depart from the teachings herein. Such variations and modifications are deemed to fall within the purview of the appended claims.
Claims
1. A touch sensitive human-computer interaction device comprising:
- a waveguide having a touch surface;
- a plurality of light emitters optically coupled at input locations around the periphery of the waveguide touch surface to emit light signals for transmission through the waveguide by the internal reflection effect, wherein one or more touch interactions on the touch surface of the waveguide alters the light conductivity of the waveguide in the location(s) of the one or more touch interactions causing frustration of the internal reflection effect;
- a plurality of detectors optically coupled at output locations around the periphery of the waveguide touch surface to receive the light signals along respective sensing signal paths defined between each respective pair of light emitter and detector, and arranged with a device comprising an analog to digital converter to output respective signals on the basis of the light signals received;
- wherein each output signal is representative of the level of frustration of the internal reflection caused by one or more touch interactions along the respective sensing signal paths, and wherein each output signal is alterable by each of a plurality of simultaneous touch interactions along a single sensing signal path of the respective signal paths; and
- a data processor operatively coupled to the device comprising the analog to digital converter and arranged to process said output signals according to a reconstruction algorithm to obtain a digital representation of said light conductivity and, based on the reconstruction, to output a signal for controlling a visualization device,
- wherein the reconstruction algorithm is a Computerized Tomography reconstruction algorithm.
2. The touch sensitive human-computer interaction device according to claim 1, wherein the level of each said output signals is further alterable according to the level of pressure of each touch interaction and said reconstruction accordingly represents said respective pressure levels.
3. The touch sensitive human-computer interaction device according to claim 1, wherein said processing further discriminates interior properties of the one or more touch interactions.
4. The touch sensitive human-computer interaction device according to claim 1, wherein said processing further discriminates the light conductivity at a plurality of contact areas defined on the touch surface.
5. The touch sensitive human-computer interaction device according to claim 1, wherein said Computerized Tomography reconstruction algorithm is implemented according to the Projection-Slice theorem.
6. The touch sensitive human-computer interaction device according to claim 1, wherein said processing further provides location information of the one or more touch interactions relative to the touch surface.
7. The touch sensitive human-computer interaction device according to claim 1, further including the visualization device, wherein the waveguide is transparent and the visualization device is located on the opposite side of the waveguide to the touch surface.
8. The touch sensitive human-computer interaction device according to claim 1, wherein the waveguide is curved.
9. The touch sensitive human-computer interaction device according to claim 1, wherein the waveguide is flexible.
10. The touch sensitive human-computer interaction device according to claim 1, wherein said signal paths collectively define more than two angles.
11. The touch sensitive human-computer interaction device according to claim 1, further comprising a plurality of analog to digital converters.
12. The touch sensitive human-computer interaction device according to claim 11, wherein the plurality of analog to digital converters operate in parallel.
13. A method of controlling a visualization device using a touch sensitive human-computer interaction device comprising:
- a waveguide having a touch surface;
- a plurality of light emitters optically coupled at input locations around the periphery of the waveguide touch surface to emit light signals for transmission through the waveguide by the internal reflection effect, wherein one or more touch interactions on the touch surface of the waveguide alters the light conductivity of the waveguide in the location(s) of the one or more touch interactions causing frustration of the internal reflection effect;
- a plurality of detectors optically coupled at output locations around the periphery of the waveguide touch surface to receive the light signals along respective sensing signal paths defined between each respective pair of light emitter and detector, and arranged with a device comprising an analog to digital converter to output respective signals on the basis of the light signals received;
- wherein each output signal is representative of the level of frustration of the internal reflection caused by one or more touch interactions along the respective sensing signal paths, and wherein each output signal is alterable by each of a plurality of simultaneous touch interactions along a single sensing signal path of the respective signal paths;
- and a data processor operatively coupled to the device comprising the analog to digital converter;
- wherein the method includes the steps of the data processor:
- processing said output signals according to a reconstruction algorithm to obtain a digital representation of said light conductivity and,
- outputting, based on the reconstruction, a signal for controlling the visualization device,
- wherein the reconstruction algorithm is a Computerized Tomography reconstruction algorithm.
14. The method according to claim 13, wherein the level of each said output signals is further alterable according to the level of pressure of each touch interaction and said reconstruction accordingly represents said respective pressure levels.
15. The method according to claim 13, wherein said processing further discriminates the light conductivity at a plurality of contact areas defined on the touch surface.
16. The method according to claim 13, wherein said processing further provides location information of the one or more touch interactions relative to the touch surface.
17. The method according to claim 13, wherein the touch sensitive human-computer interaction device includes the visualization device, and the waveguide is transparent and the visualization device is located on the opposite side of the waveguide to the touch surface.
18. The method according to claim 13, wherein said signal paths collectively define more than two angles.
19. The method according to claim 13, further comprising a plurality of analog to digital converters.