Touch Screen Apparatus for Recognizing a Touch Gesture
A touch-based user interface system comprising an arrangement for scanning a tactile sensor array to produce a corresponding array of measurement values that are presented to a processor for computation. The computation produces a plurality of running sums created from selected measurement values or functions of selected measurement values. A post-scan computation algorithm derives at least three independently-adjustable interactive control parameters responsive to at least displacements or angles of the contact of a single area of threshold contact or threshold proximity. The system provides output control signals responsive to the independently-adjustable interactive control parameters. In one aspect of the invention, an algorithmic element provides for handling of regions of threshold contact or threshold proximity having non-convex shapes. In another aspect of the invention, an algorithmic element calculates the rate of change of one or more of the independently-adjustable interactive control parameters. Other aspects of the invention include shape and gesture recognition.
This application is a continuation of U.S. application Ser. No. 15/090,219, filed Apr. 4, 2016 (the '219 application). The '219 application is a continuation of U.S. application Ser. No. 13/473,525, filed May 16, 2012 (the '525 application), now U.S. Pat. No. 9,304,677, issued Apr. 5, 2016. The '525 application is a continuation of U.S. application Ser. No. 11/761,978, filed Jun. 12, 2007, which is a continuation of U.S. application Ser. No. 09/812,400, filed Mar. 19, 2001, now U.S. Pat. No. 7,786,370, issued Aug. 31, 2010, which is a division of U.S. application Ser. No. 09/313,533, filed May 15, 1999, now U.S. Pat. No. 6,610,917, issued Aug. 26, 2003, which claims benefit of priority of U.S. provisional application Ser. No. 60/085,713, filed May 15, 1998.
This application is also related to U.S. application Ser. No. 13/470,725, filed May 14, 2012.
FIELD OF INVENTION
The present invention relates generally to a control system, and in particular, to a tactile input controller for controlling an associated system.
SUMMARY OF THE INVENTION
Touchpad user interfaces for controlling external systems such as computers, machinery, and process environments via at least three independent control signals. The touchpad may be operated by hand, other parts of the body, or inanimate objects. Such an interface affords a wide range of uses in computer applications, machine and process control, and assistance to the disabled. In one embodiment simple contact position-sensing touchpads, producing control signals responsive to a contact region, are enhanced to provide several independent control signals. Enhancements may include velocity sensors, pressure sensors, and electronic configurations measuring contact region widths. Touch-screens positioned over visual displays may be adapted. According to other aspects pressure-sensor array touchpads are combined with image processing to responsively calculate parameters from contact regions. Six independent control parameters can be derived from each region of contact. These may be easily manipulated by a user. In one implementation, smaller pressure-sensor arrays are combined with data acquisition and processing into a chip that can be tiled in an array.
The above and other aspects, features and advantages of the present invention will become more apparent upon consideration of the following description of preferred embodiments taken in conjunction with the accompanying drawing figures, wherein:
Described herein are two kinds of novel touch-pads. Null/contact touchpads are contact-position sensing devices that normally are in a null state unless touched and produce a control signal when touched whose signal value corresponds to typically one unique position on the touch-pad. A first enhancement is the addition of velocity and/or pressure sensing. A second enhancement is the ability to either discern each dimensional-width of a single contact area or, alternatively, independently discern two independent contact points in certain types of null/contact controllers. A third possible enhancement is that of employing a touch-screen instance of null/contact touch pad and positioning it over a video display.
The invention also provides for a pressure-sensor array touch-pad. A pressure-sensor array touch-pad of appropriate sensitivity range, appropriate “pixel” resolution, and appropriate physical size is capable of measuring pressure gradients of many parts of the human hand or foot simultaneously. A pressure-sensor array touch-pad can be combined with image processing to assign parameterized interpretations to measured pressure gradients and output those parameters as control signals. The pressure-sensor “pixels” of a pressure-sensor array are interfaced to a data acquisition stage; the data acquisition stage looks for sensor pixel pressure measurement values that exceed a low-level noise-rejection/deformity-rejection threshold; contiguous regions of sufficiently high pressure values are defined; the full collection of region boundaries are subjected to classification tests; various parameters are derived from each independent region; and these parameters are assigned to the role of specific control signals which are then output to a signal routing, processing, and synthesis entity.
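The thresholding step just described can be sketched as follows. This is an illustrative sketch, not part of the original disclosure; the function name and data layout are assumptions.

```python
def acquire(frame, threshold):
    """Scan a frame (a 2-D list of pressure measurement values) and note each
    sensor pixel whose value exceeds the low-level noise-rejection/
    deformity-rejection threshold, together with its address (row, col)."""
    noted = []
    for r, row in enumerate(frame):
        for c, value in enumerate(row):
            if value > threshold:
                noted.append(((r, c), value))
    return noted
```

Downstream stages may consume this "raw" list of (address, value) pairs, or fold it directly into running calculations as described later.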
It is possible to derive a very large number of independent control parameters which are easily manipulated by the operating user. For example, six degrees of freedom can be recovered from the contact of a single finger. A whole hand posture can yield 17 instantaneously and simultaneously measurable parameters which are independently adjustable per hand. The recognized existence and/or derived parameters from postures and gestures may be assigned to specific outgoing control signal formats and ranges. The hand is used throughout as an example, but it is understood that the foot or even other body regions, animal regions, objects, or physical phenomena can replace the role of the hand.
It will be evident to one of ordinary skill in the art that it is advantageous to have large numbers of instantaneously and simultaneously measurable parameters which are independently adjustable. For instance, a symbol in a 2-D CAD drawing can be richly interactively selected and installed or edited in moments as opposed to tens to hundreds of seconds as is required by mouse manipulation of parameters one or two at a time and the necessary mode-changes needed to change the mouse action interpretation. As a result, said touch-pad has applications in computer workstation control, general real-time machine control, computer data entry, and computer simulation environments.
Various hardware implementations are possible. A particularly advantageous implementation would be to implement a small pressure-sensor array together with data acquisition and a small processor into a single chip package that can be laid as tiles in a larger array.
Null/Contact Touch-Pads
Distinguished from panel controls and sensors are what will be termed null/contact touch-pads. This is a class of contact-position sensing devices that normally are in a null state unless touched and produce a control signal when touched whose signal value corresponds to typically one unique position on the touch-pad. Internal position sensing mechanisms may be resistive, capacitive, optical, standing wave, etc. Examples of these devices include one-dimensional-sensing ribbon controllers found on early music synthesizers, two-dimensional-sensing pads such as the early Kawala pad and more modern mini-pads found on some lap-top computers, and two-dimensional-sensing see-through touch-screens often employed in public computer kiosks.
The null condition, when the pad is untouched, requires and/or provides the opportunity for special handling. Some example ways to handle the untouched condition include:
- sample-hold (hold values issued last time sensor was touched, as does a joystick)
- bias (issue maximal-range value, minimal-range value, mid-range value, or other value)
- touch-detect on another channel (i.e., a separate out-of-band “gate” channel).
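The untouched-state policies listed above can be sketched in a few lines. This is an illustrative sketch, not part of the original disclosure; the class and parameter names are assumptions.

```python
class NullContactPad:
    """Models a null/contact touch-pad output under a chosen null policy."""

    def __init__(self, policy="sample_hold", bias_value=0.5):
        self.policy = policy          # "sample_hold" or "bias"
        self.bias_value = bias_value  # fixed value issued under the bias policy
        self.last_value = None        # most recent touched value (for sample-hold)

    def read(self, raw_position):
        """raw_position is None when the pad is untouched.

        Returns (value, gate): gate is the separate out-of-band
        touch-detect channel.
        """
        if raw_position is not None:
            self.last_value = raw_position
            return raw_position, True
        if self.policy == "sample_hold":
            # Hold the value issued the last time the sensor was touched.
            held = self.last_value if self.last_value is not None else self.bias_value
            return held, False
        # "bias": issue a maximal-range, minimal-range, mid-range, or other
        # fixed value while untouched.
        return self.bias_value, False
```

A joystick-like sample-hold pad and a biased pad differ only in what `read(None)` returns.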
Additional enhancements can be added to the adaptation of null/contact touch-pad controllers as instrument elements. A first enhancement is the addition of velocity and/or pressure sensing. This can be done via global impact and/or pressure-sensors. An extreme of this is implementation of the null/contact touch-pad controller as a pressure-sensor array; this special case and its many possibilities are described later.
A second enhancement is the ability to either discern each dimensional-width of a single contact area or alternatively, independently discern two independent contact points in certain types of null/contact controllers.
Referring to
The value of the voltage drop then equals a value in proportion to the distance separating the extremes of the wide and/or multiple contact points. By subtracting the actual voltage across the entire resistive element from its nominal (no-contact) value, a control voltage proportional to the distance separating the extremes of the wide and/or multiple contact points is generated. Simultaneously, the voltage difference between that of the contact plate/wire and that of the end of the resistive element closest to an external contact point is still proportional to the distance from said end to said external contact point. Using at most simple op-amp summing and/or differential amplifiers, a number of potential control voltages can be derived; for example one or more of these continuously-valued signals:
- value of distance difference between external contact points (or “width,” as described above via constant current source, nominal reference voltage, and differential amplifier)
- center of a non-trivial-width region (obtained by simple averaging, i.e., sum with gain of ½)
- value of distance difference between one end of the resistive element and the closest external contact point (simple differential amplifier)
- value of distance difference between the other end of the resistive element and the other external contact point (sum above voltage with “width” voltage with appropriate sign).
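The arithmetic behind the four continuously-valued signals listed above is simple enough to state directly. The following is an illustrative sketch, not part of the original disclosure; the function and parameter names are assumptions, and all voltages are taken as proportional to distance along the element.

```python
def resistive_pad_signals(v_total_nominal, v_total_measured, v_near_end):
    """Derive the listed control signals from a shorted resistive element.

    v_total_nominal:  voltage across the full element with no contact (reference)
    v_total_measured: actual voltage across the element under contact
    v_near_end:       voltage between the contact plate/wire and the nearer end
    """
    # Span shorted out by the wide and/or multiple contact points ("width").
    width = v_total_nominal - v_total_measured
    # Distance from one end of the element to the closest external contact point.
    near = v_near_end
    # Other extreme: sum the above voltage with the "width" voltage.
    far = near + width
    # Center of a non-trivial-width region: simple averaging (gain of 1/2).
    center = near + width / 2.0
    return {"width": width, "near": near, "far": far, "center": center}
```

With a 10 V nominal reference, an 8 V measured drop, and 3 V to the nearer end, the width signal is 2 V and the center falls at 4 V.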
Further, through use of simple threshold comparators, specific thresholds of shorted resistive-element length can be interpreted as, for example, any of a single point contact, a recognized contact region width, two points of contact, etc., producing corresponding discrete-valued control signals. The detection of a width can be treated as a contact event for a second parameter analogous to the single contact detection event described at the beginning. Some example usages of these various continuous and discrete signals are:
- existence of widths or multiple contact points may be used to trigger events or timbre changes
- degree of widths may be used to control degrees of modulation or timbre changes
- independent measurement of each extremal contact point from the same end of the resistive element can be used to independently control two parameters. In the simplest form, one parameter is always larger than another; in more complex implementations, the trajectories of each contact point can be tracked (using a differentiator and controlled parameter assignment switch); as long as they never simultaneously touch, either parameter can vary and be larger or smaller than the other.
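The trajectory-tracking idea in the last bullet — letting either contact point become the larger or smaller as long as they never coincide — can be sketched as a nearest-trajectory assignment. This is an illustrative sketch, not part of the original disclosure; the function name and tie-breaking rule are assumptions.

```python
def assign_extremes(prev_a, prev_b, low, high):
    """Assign the newly measured extremes (low, high) to parameters a and b.

    Follows each contact point's trajectory: the pairing minimizing total
    movement from the previous values is chosen, so the two parameters may
    cross in value over time (provided the points never simultaneously touch).
    """
    straight = abs(low - prev_a) + abs(high - prev_b)
    crossed = abs(high - prev_a) + abs(low - prev_b)
    return (low, high) if straight <= crossed else (high, low)
```

When parameter a was previously above parameter b, the new upper extreme is assigned to a; otherwise the assignment switch flips.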
It is understood that analogous approaches may be applied to other null/contact touchpad technologies such as capacitive or optical.
A third possible enhancement is that of employing a touch-screen instance of null/contact touch-pad and positioning it over a video display. The video display could for example provide dynamically assigned labels, abstract spatial cues, spatial gradients, line-of-sight cues for fixed or motor-controlled lighting, etc. which would be valuable for use in conjunction with the adapted null/contact touch-pad controller.
These various methods of adapted null/contact touch-pad elements can be used stand-alone or arranged in arrays. In addition, they can be used as a component or addendum to instruments featuring other types of instrument elements.
Pressure-Sensor Array Touch-Pads
The invention provides for use of a pressure-sensor array arranged as a touch-pad together with associated image processing. As with the null/contact controller, these pressure-sensor array touch-pads may be used stand-alone or organized into an array of such pads.
It is noted that the inventor's original vision of the pressure-sensor array touch-pad described below was for applications not only in music but also in computer data entry, computer simulation environments, and real-time machine control, applications to which it clearly can also apply.
A pressure-sensor array touch-pad of appropriate sensitivity range, appropriate “pixel” resolution, and appropriate physical size is capable of measuring pressure gradients of many parts of the flexibly-rich human hand or foot simultaneously.
The pressure-sensor “pixels” of a pressure-sensor array touch-pad 1300 are interfaced to a data acquisition stage 1301. The interfacing method may be fully parallel but in practice may be advantageously scanned at a sufficiently high rate to give good dynamic response to rapidly changing human touch gestures. To avoid the need for a buffer amplifier for each pressure-sensor pixel, electrical design may carefully balance parasitic capacitance of the scanned array with the electrical characteristics of the sensors and the scan rates; electrical scanning frequencies can be reduced by partitioning the entire array into distinct parts that are scanned in parallel so as to increase the tolerance for address settling times and other limiting processes.
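The benefit of partitioning the array into parts scanned in parallel can be quantified with a small calculation. This is an illustrative sketch, not part of the original disclosure; the function name and parameters are assumptions.

```python
def partition_scan_rate(rows, cols, frame_rate_hz, num_partitions):
    """Per-partition pixel-address rate when the array is split into
    num_partitions equal parts scanned in parallel.

    Splitting the array divides the address rate each scan chain must
    sustain, increasing the tolerance for address settling times and
    other rate-limiting processes.
    """
    pixels = rows * cols
    assert pixels % num_partitions == 0, "partitions must tile the array evenly"
    full_rate = pixels * frame_rate_hz  # addresses/s for a single scan chain
    return full_rate / num_partitions
```

For example, a 32x32 array scanned at 100 frames/s needs 102,400 addresses/s from one chain, but only 25,600 from each of four parallel partitions.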
Alternatively, the pressure-sensor array 1300 may be fabricated in such a way that buffer amplifier arrays can be inexpensively attached to the sensor array 1300, or the sensors may be such that each contains its own buffer amplifier; under these conditions, design restrictions on scanning can be relaxed and scanning can operate at higher speeds. Although the pressure sensors are likely to be analog in nature, a further enhancement would be to use digital-output pressure-sensor elements or sub-arrays.
The data acquisition stage 1301 looks for sensor pixel pressure measurement values that exceed a low-level noise-rejection/deformity-rejection threshold. The sufficiently high pressure value of each such sensor pixel is noted along with the relative physical location of that pixel (known via the pixel address). This noted information may be stored “raw” for later processing and/or may be subjected to simple boundary tests and then folded into appropriate running calculations as will be described below. In general, the pressure values and addresses of sufficiently high pressure value pixels are presented to a sequence of processing functions which may be performed on the noted information:
- contiguous regions of sufficiently high pressure values are defined (a number of simple run-time adjacency tests can be used; many are known—see for example [Ronse; Viberg; Shapiro; Hara])
- the full collection of region boundaries are subjected to classification tests; in some cases a given contiguous region may be split into a plurality of tangent or co-bordered independently recognized regions
- various parameters are derived from each independent region, for example geometric center, center of pressure, average pressure, total size, angle-of-rotation-from reference for non-round regions, second-order and higher-order geometric moments, second-order and higher-order pressure moments, etc.
- assignment of these parameters to the role of specific control signals (note events, control parameters, etc.) which are then output to a signal routing, processing, and synthesis entity; for example, this may be done in the form of MIDI messages.
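The first and third steps above — defining contiguous regions via a run-time adjacency test and deriving per-region parameters — can be sketched together. This is an illustrative sketch, not part of the original disclosure; the 4-adjacency flood fill and the parameter names are assumptions, and only a few of the listed parameters (geometric center, center of pressure, total size, average pressure) are shown.

```python
def regions_and_parameters(noted):
    """Group noted (address, value) pixels into contiguous regions using a
    simple 4-adjacency test, then derive example parameters per region."""
    values = dict(noted)
    unvisited = set(values)
    regions = []
    while unvisited:
        seed = unvisited.pop()
        stack, region = [seed], [seed]
        while stack:                      # flood fill over 4-neighbors
            r, c = stack.pop()
            for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if nb in unvisited:
                    unvisited.remove(nb)
                    stack.append(nb)
                    region.append(nb)
        regions.append(region)

    params = []
    for region in regions:
        n = len(region)
        total = sum(values[p] for p in region)
        geo = (sum(r for r, _ in region) / n,
               sum(c for _, c in region) / n)
        cop = (sum(r * values[(r, c)] for r, c in region) / total,
               sum(c * values[(r, c)] for r, c in region) / total)
        params.append({"size": n, "avg_pressure": total / n,
                       "geometric_center": geo, "center_of_pressure": cop})
    return params
```

Higher-order geometric and pressure moments follow the same pattern of running sums over each region's pixels.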
Because of the number of processes involved in such a pipeline, it is advantageous to follow a data acquisition stage 1301 with one or more additional processing stages 1303, 1305, 1309, and 1311. Of the four example processing functions just listed, the first three fall in the character of image processing. It is also possible to do a considerable amount of the image processing steps actually within the data acquisition step, namely any of simple adjacency tests and folding selected address and pressure measurement information into running sums or other running pre-calculations later used to derive the aforementioned parameters. The latter method can be greatly advantageous as it can significantly collapse the amount of data to be stored.
Regardless of whether portions of the image processing are done within or beyond the data acquisition stage, there are various hardware implementations possible. One hardware approach would involve very simple front-end scanned data acquisition hardware and a single high-throughput microprocessor/signal-processor chip. Alternatively, an expanded data acquisition stage may be implemented in high-performance dedicated function hardware and this would be connected to a lower performance processor chip. A third, particularly advantageous implementation would be to implement a small pressure-sensor array together with data acquisition and a small processor into a single low-profile chip package that can be laid as tiles in a nearly seamless larger array. In such an implementation all image processing could in fact be done via straightforward partitions into message-passing distributed algorithms.
One or more individual chips could direct output parameter streams to an output processor which would organize and/or assign parameters to output control channels, perhaps in a programmable manner under selectable stored program control. A tiled macro array of such “sensor mini-array” chips could be networked by a tapped passive bus, one- or two-dimensional mode active bus daisy-chain, a potentially expandable star-wired centralized message passing chip or subsystem, or other means.
Creating a large surface from such “tile chips” will aid in the serviceability of the surface. Since these chips can be used as tiles to build a variety of shapes, it is therefore possible to leverage a significant manufacturing economy-of-scale so as to minimize cost and justify more extensive feature development. Advanced seating and connector technologies, as used in laptops and other high-performance miniature consumer electronics, can be used to minimize the separation between adjacent chip “tiles” and resultant irregularities in the tiled-surface smoothness. A tiled implementation may also include a thin rugged flexible protective film that separates the sensor chips from the outside world.
With the perfection of a translucent pressure-sensor array, it further becomes possible for such arrays to be laid atop aligned visual displays such as LCDs, fluorescent, plasma, CRTs, etc. as was discussed above for null/contact touch-pads. The displays can be used to label areas of the sensor array, illustrate gradients, etc. Note that in the “tile chip” implementation, monochrome or color display areas may indeed be built into each chip.
Returning now to the concept of a pressure-sensor array touch-pad large enough for hand-operation: examples of hand contact that may be recognized, example methods for how these may be translated into control parameters, and examples of how these all may be used are now described. In the below the hand is used throughout as an example, but it is understood that the foot or even other body regions, animal regions, objects, or physical phenomena can replace the role of the hand in these illustrative examples.
Relatively simple pattern recognition software can be used to discern these and other hand contact patterns which will be termed “postures.” The pattern recognition working together with simple image processing may, further, derive a very large number of independent control parameters which are easily manipulated by the operating user. In many cases it may be advantageous to train a system to the particulars of a specific person's hand(s) and/or specific postures. In other situations the system may be designed to be fully adaptive and adjust to a person's hand automatically. In practice, for the widest range of control and accuracy, both training and ongoing adaptation may be useful. Further, the recognized postures described thus far may be combined in sequence with specific dynamic variations among them (such as a finger flick, double-tap, etc.) and as such may be also recognized and thus treated as an additional type of recognized pattern; such sequential dynamics among postures will be termed “gestures.”
The admission of gestures further allows for the derivation of additional patterns such as the degree or rate of variation within one or more of the gesture dynamics. Finally, the recognized existence and/or derived parameters from postures and gestures may be assigned to specific outgoing control signal formats and ranges. Any training information and/or control signal assignment information may be stored and recalled for one or more players via stored program control.
For each recognized pattern, the amount of information that can be derived as parameters is in general very high. For the human hand or foot, there are typically artifacts, such as shape variation due to elastic tissue deformation, that permit recovery of up to all six degrees of freedom allowed in an object's orientation in 3-space.
In general, other and more complex hand contacts, such as use of two fingers, the whole hand, etc. forfeit some of these example degrees of freedom but often introduce others. For example, in the quite constrained case of a whole hand posture, the fingers and thumb can exert pressure independently (5 parameters), the finger and thumb separation angles can be varied (4 parameters), the finger ends 1504a can exert pressure independently from the middle 1504b and inner 1504c segments (4 parameters), the palm can independently vary its applied pressure (1 parameter) while independently tilting/rocking in two directions (2 parameters), and the thumb can curl (1 parameter), yielding 17 instantaneously and simultaneously measurable parameters which are independently adjustable per hand. Complex contact postures may also be viewed as, or decomposed into, component sub-postures (for example here, as flat-finger contact, palm contact, and thumb contact) which would then derive parameters from each posture independently. For such complex contact postures, recognition as a larger compound posture which may then be decomposed allows for the opportunity to decouple and/or renormalize the parameter extraction in recognition of the special considerations associated with and constraints imposed by specific complex contact postures.
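The 17-parameter count for the whole-hand posture can be tallied directly from the components enumerated above. This is an illustrative sketch, not part of the original disclosure; the component names and grouping are assumptions.

```python
# Per-component counts of the independently adjustable parameters the text
# enumerates for a whole-hand posture (names are illustrative).
WHOLE_HAND_PARAMETERS = {
    "digit_pressures": 5,       # fingers and thumb pressing independently
    "separation_angles": 4,     # finger and thumb separation angles
    "segment_pressures": 4,     # finger ends vs. middle/inner segments
    "palm_pressure": 1,         # palm's independently applied pressure
    "palm_tilt_rock": 2,        # tilting/rocking in two directions
    "thumb_curl": 1,            # thumb curl
}

def total_parameters(components):
    """Sum the per-component parameter counts of a compound posture."""
    return sum(components.values())
```

Decomposing a compound posture into sub-postures amounts to splitting this table and deriving each group's parameters independently.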
It is noted that the derived parameters may be pre-processed for specific uses. One example of this would be the quantization of a parameter into two or more discrete steps; these could for example be sequentially interpreted as sequential notes of a scale or melody. Another example would be that of warping a parameter range as measured to one with a more musically expressive layout.
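Both pre-processing examples — quantization into discrete steps and range warping — can be sketched in a few lines. This is an illustrative sketch, not part of the original disclosure; the function names and the particular warp curve are assumptions.

```python
def quantize(value, steps, lo=0.0, hi=1.0):
    """Quantize a continuous parameter in [lo, hi) into discrete steps,
    e.g. sequentially interpreted as sequential notes of a scale."""
    step = int((value - lo) / (hi - lo) * steps)
    return min(max(step, 0), steps - 1)  # clamp to the valid step range

def warp(value, gamma=2.0):
    """Warp a normalized parameter range as measured into a more
    expressive layout (here a simple power curve as an example)."""
    return value ** gamma
```

A `gamma` above 1 expands resolution at the low end of the range, one below 1 expands the high end; the musically useful curve would be chosen per application.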
Next, examples of the rich metaphorical aspects of interacting with the pressure-sensor array touch-pad are illustrated. In many cases there may be one or more natural geometric metaphor(s) applicable, such as associating left-right position, left-right twisting, or left-right rotation with stereo panning, or in associating overall pressure with volume or spectral complexity. In more abstract cases, there may be pairs of parameters that go together—here, for example with a finger end, it may be natural to associate one parameter pair with (left/right and forward/backward) contact position and another parameter pair with (left/right and forward/backward) twisting/rocking. In this latter example there is available potential added structure in the metaphor by viewing the twisting/rocking plane as being superimposed over the position plane. The superposition aspect of the metaphor can be viewed as an index, or as an input-plane/output-plane distinction for a two-input/two-output transformation, or as two separated processes which may be caused to converge or morph according to additional overall pressure, or in conjunction with a dihedral angle of intersection between two independent processes, etc.
Next, examples of the rich syntactical aspects of interacting with the pressure-sensor array touch-pad are illustrated. Some instruments have particular hand postures naturally associated with their playing. It is natural then to recognize these classical hand-contact postures and derive control parameters that match and/or transcend how a classical player would use these hand positions to evoke and control sound from the instrument. Further, some postures could be recognized either in isolation or in gestural-context as being ones associated with (or assigned to) percussion effects while remaining postures may be associated with accompanying melodies or sound textures.
As an additional syntactic aspect, specific hand postures and/or gestures may be mapped to specific selected assignments of control signals in ways affiliated with specific purposes. For example, finger ends may be used for one collection of sound synthesis parameters, thumb for a second potentially partially overlapping collection of sound synthesis parameters, flat fingers for a third partially-overlapping collection, wrist for a fourth, cuff for a fifth, and fist for a sixth. In this case it may be natural to move the hand through certain connected sequences of motions; for example: little finger end, still in contact, dropping to flat-finger contact, then dropping to either palm directly or first to cuff and then to palm, then moving to wrist, all never breaking contact with the touch-pad. Such permissible sequences of postures that can be executed sequentially without breaking contact with the touch-pad will be termed “continuous grammars.”
Under these circumstances it is useful to set up parameter assignments, and potentially associated context-sensitive parameter renormalizations, that work in the context of selected (or all available) continuous grammars. For example, as the hand contact evolves as being recognized as one posture and then another, parameters may be smoothly handed-over in interpretation from one posture to another without abrupt changes, while abandoned parameters either hold their last value or return to a default value (instantly or via a controlled envelope).
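The hold-last-value and return-to-default behaviors for an abandoned parameter can be sketched as a simple release envelope. This is an illustrative sketch, not part of the original disclosure; the function name and the linear envelope shape are assumptions.

```python
def release_envelope(last_value, default, t, release_time):
    """Value of an abandoned parameter t seconds after its posture was
    abandoned within a continuous grammar.

    release_time == 0 means hold the last value indefinitely; otherwise the
    parameter returns to its default via a linear controlled envelope.
    """
    if release_time == 0:
        return last_value          # hold the last value
    frac = min(t / release_time, 1.0)
    return last_value + (default - last_value) * frac
```

A smoothed hand-over between postures can use the same idea in reverse, ramping a newly adopted parameter in from the abandoned posture's last value so interpretation changes without abrupt jumps.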
Now a number of example applications of the pressure-sensor array touch-pad are provided. It is known to be possible and valuable to use the aforementioned pressure-sensor array touch-pad, implicitly containing its associated data acquisition, processing, and assignment elements, for many applications such as general machine control and computer workstation control. One example of machine control is in robotics: here a finger might be used to control a hazardous material robot hand as follows:
- left/right position: left/right hand position
- in/out position: in/out hand position
- in/out rock: up/down hand position
- rotation: hand grip approach angle
- overall pressure: grip strength
- left/right twist: gesture to lock or release current grip from pressure control
A computer workstation example may involve a graphical Computer-Aided Design application currently requiring intensive mouse manipulation of parameters one or two at a time:
- left/right position: left/right position of a selected symbol in a 2-D CAD drawing
- in/out position: up/down position of a selected symbol in 2-D CAD drawing
- left/right twist: symbol selection—left/right motion through 2-D palette
- in/out rock: symbol selection—up/down motion through 2-D palette
- rotation: rotation of selected symbol in the drawing
- overall pressure: sizing by steps
- tap of additional finger: lock selection into drawing or unlock for changes
- tap of thumb: undo
- palm: toggle between add new object and select existing object
Clearly a symbol can be richly interactively selected and installed or edited in moments as opposed to tens to hundreds of seconds as is required by mouse manipulation of parameters one or two at a time and the necessary mode-changes needed to change the mouse action interpretation.
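Assignment tables like the CAD example above amount to routing derived parameters to application actions. The following is an illustrative sketch, not part of the original disclosure; all names are hypothetical.

```python
# Hypothetical assignment table mirroring the CAD example above:
# derived touch parameter -> application action.
CAD_ASSIGNMENTS = {
    "left_right_position": "move_symbol_x",
    "in_out_position": "move_symbol_y",
    "left_right_twist": "palette_x",
    "in_out_rock": "palette_y",
    "rotation": "rotate_symbol",
    "overall_pressure": "size_by_steps",
    "extra_finger_tap": "toggle_lock",
    "thumb_tap": "undo",
    "palm": "toggle_add_select",
}

def route(parameters, assignments=CAD_ASSIGNMENTS):
    """Translate a dict of derived parameters into (action, value) control
    signals using the assignment table; unassigned parameters are dropped."""
    return [(assignments[name], value)
            for name, value in parameters.items() if name in assignments]
```

Swapping in a different table (e.g. the robotics mapping given earlier) re-purposes the same derived parameters for another application, which is the point of stored-program control signal assignment.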
Touch-Pad Array
Touch-pad instrument elements, such as null/contact types and pressure-sensor array types described earlier, can be used in isolation or arrays to create electronic controller instruments. The touch-pad(s) may be advantageously supplemented with panel controls such as push buttons, sliders, knobs as well as impact sensors for velocity-controlled triggering of percussion or pitched note events. If one or more of the touch-pads is transparent (as in the case of a null/contact touch screen overlay) one or more video, graphics, or alphanumeric displays 2711 may be placed under a given pad or group of pads.
All publications and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication or patent application was specifically and individually indicated to be incorporated by reference. The invention now being fully described, it will be apparent to one of ordinary skill in the art that many changes and modifications can be made thereto without departing from its spirit or scope.
REFERENCES CITED
The following references are cited in this patent application using the format of the first one or two authors' last name(s) within square brackets “[ ]”, with multiple references within a pair of square brackets separated by semicolons “;”.
[Ronse] Ronse, Christian and Devijver, Pierre A., Connected Components in Binary Images: the Detection Problem, John Wiley & Sons, Inc., New York, 1984.
[Viberg] Viberg, Mats, Subspace Fitting Concepts in Sensor Array Processing, Linköping Studies in Science and Technology, Dissertations No. 27, Linköping, Sweden, 1989.
[Shapiro] Shapiro, Larry S., Affine Analysis of Image Sequences, Cambridge University Press, 1995.
[Hara] Hara, Yoshiko, “Matsushita demos multilayer MPEG-4 compression”, Electronic Engineering Times, Apr. 19, 1999.
Claims
1. (canceled)
2. An apparatus comprising:
- a) a display screen;
- b) a transparent sensor array positioned to detect touch-based interaction on the display screen, the transparent sensor array having a plurality of sensors, each sensor responding to contact with the apparatus; and
- c) at least two separate processors that are configured to: i) acquire data from the plurality of sensors, ii) use the acquired data to identify a sequence of postures forming a continuous grammar, each posture having a contiguous region of detected contact, iii) identify a touch gesture by detecting specific dynamic variations in the sequence of postures, iv) derive a control parameter for the touch gesture by performing calculations based on the data acquired from the plurality of sensors, and v) assign the control parameter to a control signal; wherein one of the at least two separate processors is configured to acquire data from the plurality of sensors, and further wherein another of the at least two separate processors is an output processor configured to assign the control parameter to the control signal.
3. The apparatus of claim 2, wherein the transparent sensor array is positioned over the display screen.
4. The apparatus of claim 2, wherein the output processor receives the control parameter from another processor.
5. The apparatus of claim 2, wherein the output processor dynamically assigns the control parameter to the control signal in a programmable manner under stored program control.
6. The apparatus of claim 5, wherein the output processor assigns the control parameter to the control signal using stored control signal assignment information.
7. The apparatus of claim 6, wherein the control parameter is assigned to the control signal using programming that dynamically selects between a first control assignment and a second control assignment.
8. The apparatus of claim 2, wherein the output processor assigns the control parameter to the control signal using stored control signal assignment information.
9. The apparatus of claim 8, wherein the control parameter is assigned to the control signal using programming that dynamically selects between a first control assignment and a second control assignment.
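Claims 5 through 9 describe an output processor that uses stored control signal assignment information to select dynamically between a first and a second control assignment. One plausible reading is a mode-keyed lookup table, sketched below; the table contents and mode names are assumptions for illustration only.

```python
# Illustrative sketch of stored control-signal assignment information
# that dynamically selects between two control assignments, as in
# claims 5-9. Table entries and mode names are assumed, not claimed.

ASSIGNMENT_TABLE = {          # stored control signal assignment information
    "map_mode":  {"x": "pan",    "y": "tilt"},    # first control assignment
    "text_mode": {"x": "cursor", "y": "scroll"},  # second control assignment
}

def assign(parameter_name, value, mode):
    """Dynamically select an assignment under stored program control."""
    signal = ASSIGNMENT_TABLE[mode][parameter_name]
    return (signal, value)

print(assign("x", 0.7, "map_mode"))    # ('pan', 0.7)
print(assign("x", 0.7, "text_mode"))   # ('cursor', 0.7)
```

The same derived parameter thus drives different control signals depending on the active mode, which is what makes the assignment "programmable" rather than fixed in hardware.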
10. The apparatus of claim 2, wherein each posture has a plurality of contiguous regions of detected contact.
11. The apparatus of claim 10, wherein each contiguous region of detected contact comprises a sub-posture associated with a different portion of a human hand contacting the apparatus.
12. The apparatus of claim 11, wherein each sub-posture is associated with a separate finger on the human hand.
13. The apparatus of claim 2, wherein the touch gesture is a finger flick.
14. The apparatus of claim 2, wherein the touch gesture is identified through dynamic positional changes in the sequence of postures.
15. The apparatus of claim 2, wherein the touch gesture is identified through rotational movement detected in the sequence of postures.
16. The apparatus of claim 2, wherein the touch gesture corresponds to interactions with visual content displayed on the display screen.
17. The apparatus of claim 16, wherein the visual content comprises dynamically assigned labels.
18. The apparatus of claim 2, wherein the control parameter reflects a velocity associated with the dynamic variations in the sequence of postures.
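Claim 18's velocity parameter can be pictured by reducing each posture in the sequence to a centroid and differencing consecutive centroids. The sketch below is one plausible reading; the fixed 60 Hz scan interval and the example path are assumptions, not part of the claim.

```python
# Minimal sketch of a velocity control parameter over a sequence of
# postures: each posture is reduced to a centroid, and velocity is the
# frame-to-frame displacement divided by an assumed scan interval.

def velocity(centroids, dt=1.0 / 60):    # assumed 60 Hz scan rate
    """Per-frame velocity vectors from a sequence of posture centroids."""
    return [((x1 - x0) / dt, (y1 - y0) / dt)
            for (x0, y0), (x1, y1) in zip(centroids, centroids[1:])]

path = [(10.0, 5.0), (12.0, 5.0), (15.0, 5.0)]
print(velocity(path))    # [(120.0, 0.0), (180.0, 0.0)]
```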
19. The apparatus of claim 2, wherein the contiguous region of detected contact is identified through a run-time adjacency test.
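Claim 19's "run-time adjacency test" resembles classic connected-component labeling of the thresholded sensor image (cf. the [Ronse] reference above). The flood-fill sketch below is one plausible reading under that assumption, not the patent's algorithm; the 4-connectivity choice and threshold are illustrative.

```python
# One plausible reading of a run-time adjacency test: flood-fill
# grouping of above-threshold sensor cells into 4-connected regions.

def contiguous_regions(frame, threshold=0.5):
    """Group above-threshold cells into 4-connected contiguous regions."""
    rows, cols = len(frame), len(frame[0])
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen or frame[r][c] < threshold:
                continue
            stack, region = [(r, c)], []
            seen.add((r, c))
            while stack:
                y, x = stack.pop()
                region.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and (ny, nx) not in seen
                            and frame[ny][nx] >= threshold):
                        seen.add((ny, nx))
                        stack.append((ny, nx))
            regions.append(region)
    return regions

frame = [[1, 1, 0],
         [0, 0, 0],
         [0, 1, 1]]
print(len(contiguous_regions(frame)))    # 2
```

Each resulting region corresponds to one contiguous region of detected contact; under claims 10-12, multiple regions per frame would map to sub-postures for separate fingers.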
20. An apparatus comprising:
- a) a display screen;
- b) a transparent sensor array positioned to detect touch-based interaction with the display screen, the sensor array having a plurality of sensors, each sensor producing data responsive to contact with the apparatus; and
- c) at least two separate processors that are configured to: i) acquire the data from the plurality of sensors, ii) use the acquired data to identify postures based upon a detected contact pattern found in the data, iii) identify a touch gesture by detecting specific dynamic variations in the postures, iv) derive a control parameter for the touch gesture, and v) assign the control parameter to a control signal; wherein one of the at least two separate processors is configured to acquire data from the plurality of sensors, and further wherein another of the at least two separate processors is an output processor configured to assign the control parameter to the control signal.
21. The apparatus of claim 20, wherein the output processor receives the control parameter from outside the output processor before assigning the control parameter to the control signal.
22. The apparatus of claim 20, wherein the output processor assigns the control parameter to the control signal in a programmable manner under stored program control.
23. The apparatus of claim 22, wherein the output processor assigns the control parameter to the control signal using stored control signal assignment information.
24. The apparatus of claim 23, wherein the control parameter is assigned to the control signal using programming that dynamically selects between a first control assignment and a second control assignment.
25. The apparatus of claim 20, wherein the touch gesture is a finger flick.
26. The apparatus of claim 20, wherein the touch gesture corresponds to interactions with visual content displayed on the display screen.
Type: Application
Filed: Apr 7, 2016
Publication Date: Aug 4, 2016
Applicant: Advanced Touchscreen and Gestures Technologies, LLC (San Antonio, TX)
Inventor: Lester F. Ludwig (San Antonio, TX)
Application Number: 15/092,961