OBJECT CHARACTERISATION FOR TOUCH DISPLAYS

An optical IR touch sensing apparatus configured to determine, based on output signals of light detectors, a light energy value for each light path across a touch surface, and to generate a transmission value for each light path based on the light energy value. A processor is then configured to process the transmission values to determine a region around an object reference point on the touch surface and a set of light paths intersecting the region. By performing statistical analysis of the set of light paths, characteristics of the object may be determined.

Description
INCORPORATION BY REFERENCE TO ANY PRIORITY APPLICATIONS

Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57.

BACKGROUND OF THE INVENTION

Field of the Invention

The present disclosure relates to techniques for detecting and identifying objects on a touch surface.

Description of the Related Art

To an increasing extent, touch-sensitive panels are being used for providing input data to computers, electronic measurement and test equipment, gaming devices, etc. The panel may be provided with a graphical user interface (GUI) for a user to interact with using e.g. a pointer, stylus or one or more fingers. The GUI may be fixed or dynamic. A fixed GUI may e.g. be in the form of printed matter placed over, under or inside the panel. A dynamic GUI can be provided by a display screen integrated with, or placed underneath, the panel or by an image being projected onto the panel by a projector.

There are numerous known techniques for providing touch sensitivity to the panel, e.g. by using cameras to capture light scattered off the point(s) of touch on the panel, by using cameras to directly observe the objects interacting with the panel, by incorporating resistive wire grids, capacitive sensors, strain gauges, etc. into the panel.

In one category of touch-sensitive panels known as ‘above surface optical touch systems’ and known from e.g. U.S. Pat. No. 4,459,476, a plurality of optical emitters and optical receivers are arranged around the periphery of a touch surface to create a grid of intersecting light paths (otherwise known as detection lines) above the touch surface. Each light path extends between a respective emitter/receiver pair. An object that touches the touch surface will block or attenuate some of the light paths. Based on the identity of the receivers detecting a blocked light path, a processor can determine the location of the intercept between the blocked light paths.

For most touch systems, a user may place a finger onto the surface of a touch panel to register a touch. Alternatively, a stylus may be used. A stylus is typically a pen shaped object with at least one end configured to be pressed against the surface of the touch panel. An example of a stylus according to the prior art is shown in FIG. 2. Use of a stylus 60 may provide improved selection accuracy and pointer precision over a simple finger touch. This can be due to the engineered stylus tip 62 providing a smaller and/or more regular contact surface with the touch panel than is possible with a human finger. Also, muscular control of an entire hand in a pen holding position can be more precise than a single finger for the purposes of pointer control due to lifelong training in the use of pens and pencils.

PCT/SE2016/051229 describes an optical IR touch sensing apparatus configured to determine a position of a touching object on the touch surface and an attenuation value corresponding to the attenuation of the light resulting from the object touching the touch surface. Using these values, the apparatus can differentiate between different types of objects, including multiple stylus tips, fingers, and palms. The differentiation between the object types may be determined by a function that takes into account how the attenuation of a touching object varies across the touch surface, compensating for e.g. light field height, detection line density, detection line angular density, etc.

For larger objects applied to the touch surface, such as palms and board erasers, it is possible to use an attenuation map of the touch surface to determine an approximate shape of the object. For example, where an optical IR touch sensing apparatus is used, an attenuation map may be generated showing an area on the touch surface where the light is highly attenuated. The shape of an attenuated area may then be used to identify the position and shape of the touching object. In a technique known according to the prior art, a rough shape of the large object can be determined by identifying all points with an attenuation above a threshold value. An approximate centroid and orientation of the large object may then be determined using the image moments of the identified points. Such techniques are described in "Image analysis via the general theory of moments" by Michael Reed Teague. Once the centroid and orientation of the large object are determined, the width and height of the large object (e.g. a board eraser) can be found by determining the extent of the identified pixels in the direction of the orientation angle and the normal of the orientation angle.

However, for smaller objects, use of an attenuation map to determine object characteristics like size, orientation, and shape becomes very difficult due to the low resolution of the attenuation map. In particular, a stylus tip may present only a few pixels of interaction on an attenuation map.

Therefore, what is needed is a method of determining object characteristics that overcomes the above limitations.

SUMMARY OF THE INVENTION

It is an objective of the disclosure to at least partly overcome one or more of the above-identified limitations of the prior art.

One or more of these objectives, as well as further objectives that may appear from the description below, are at least partly achieved by means of a method for data processing, a computer readable medium, devices for data processing, and a touch-sensing apparatus according to the independent claims, embodiments thereof being defined by the dependent claims.

A first embodiment provides a touch sensing apparatus, comprising: a touch surface, a plurality of emitters arranged around the periphery of the touch surface to emit beams of light such that one or more objects touching the touch surface cause an attenuation or occlusion of the light; a plurality of light detectors arranged around the periphery of the touch surface to receive light from the plurality of emitters on a plurality of light paths, wherein each light detector is arranged to receive light from more than one emitter; and a processing element configured to: determine, based on output signals of the light detectors, a transmission value for each light path; process the transmission values to determine an object reference point on the touch surface where the light is attenuated or occluded by an object, determine a region around the object reference point, determine a plurality of light paths intersecting the region, determine a statistical measure for each of at least one light path variable of the plurality of light paths intersecting the region, including at least the transmission values of the light paths, and determine one or more characteristics of the object in dependence on the at least one statistical measure.

A second embodiment provides a method in a touch sensing apparatus, said touch sensing apparatus comprising: a touch surface, a plurality of emitters arranged around the periphery of the touch surface to emit beams of light such that one or more objects touching the touch surface cause an attenuation or occlusion of the light; and a plurality of light detectors arranged around the periphery of the touch surface to receive light from the plurality of emitters on a plurality of light paths, wherein each light detector is arranged to receive light from more than one emitter; said method comprising: determining, based on output signals of the light detectors, a transmission value for each light path; processing the transmission values to determine an object reference point on the touch surface where the light is attenuated or occluded by an object, determining a region around the object reference point, determining a plurality of light paths intersecting the region, determining a statistical measure of values for each of at least one light path variable of the plurality of light paths intersecting the region, including at least the transmission values of the light paths, and determining one or more characteristics of the object in dependence on the at least one statistical measure.

BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the invention will now be described in more detail with reference to the accompanying schematic drawings.

FIG. 1 is a top plan view of an optical touch apparatus.

FIG. 2 shows a cross-section of an above-surface-type IR optical touch apparatus according to the prior art.

FIG. 3 shows a cross-section of an FTIR-type IR optical touch apparatus according to the prior art.

FIG. 4 is a flow chart showing a process for determining characteristics of an interacting object.

FIG. 5 shows a top-down view of a touch surface with an applied stylus tip and finger.

FIGS. 6a-6d show a sequence of steps for determining a plurality of detection lines intersecting a region around a touching object.

FIG. 7a shows the set of detection lines passing through a region around a finger and a subset of detection lines interacting with the finger.

FIG. 7b shows the set of detection lines passing through a region around a stylus tip and a subset of detection lines interacting with the stylus tip.

FIG. 8a shows a frequency distribution of transmission values for detection lines passing through a region around a finger.

FIG. 8b shows a frequency distribution of transmission values for detection lines passing through a region around a stylus tip.

FIG. 9a shows a frequency distribution of transmission values for detection lines passing through a region around a finger including a threshold value.

FIG. 9b shows a frequency distribution of transmission values for detection lines passing through a region around a stylus tip including a threshold value.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

The present disclosure relates to optical touch panels and the use of techniques for providing touch sensitivity to a display apparatus. Throughout the description the same reference numerals are used to identify corresponding elements.

In addition to their ordinary meanings, the following terms can also have the meanings set out below:

A “touch object” or “touching object” is a physical object that touches, or is brought in sufficient proximity to, a touch surface so as to be detected by one or more sensors in the touch system. The physical object may be animate or inanimate.

An “interaction” occurs when the touch object affects a parameter measured by the sensor.

A “touch” denotes a point of interaction as seen in the interaction pattern.

A “light field” is the light flowing between an emitter and a corresponding detector. Although an emitter may generate a large amount of light in many directions, only the light measured by a detector from an emitter defines the light field for the emitter and detector.

FIG. 1 is a top plan view of an optical touch apparatus which may correspond to the IR optical touch apparatus of FIG. 2. Emitters 30a are distributed around the periphery of touch surface 20, to project light across the touch surface 20 of touch panel 10. Detectors 30b are distributed around the periphery of touch surface 20, to receive part of the propagating light. The light from each of emitters 30a will thereby propagate to a number of different detectors 30b on a plurality of light paths 50.

Light paths 50 may conceptually be represented as “detection lines” that extend across the touch surface 20 to the periphery of touch surface 20 between pairs of emitters 30a and detectors 30b, as shown in FIG. 1. Thus, the detection lines 50 correspond to a projection of the light paths 50 onto the touch surface 20. Thereby, the emitters 30a and detectors 30b collectively define a grid of detection lines 50 (“detection grid”) on the touch surface 20, as seen in the top plan view of FIG. 1. The spacing of intersections in the detection grid defines the spatial resolution of the touch-sensitive apparatus 100, i.e. the smallest object that can be detected on the touch surface 20. The width of the detection line is a function of the width of the emitters and corresponding detectors. A wide detector detecting light from a wide emitter provides a wide detection line with broader surface coverage, minimising the gaps between detection lines, which provide no touch coverage. Disadvantages of wide detection lines may include reduced touch precision, worse point separation, and a lower signal-to-noise ratio.

In one embodiment, the light paths are a set of virtual light paths converted from the actual light paths via an interpolation step. Such an interpolation step is described in PCT publication WO2011/139213. The virtual light paths may be configured so as to match the requirements of certain CT algorithms, viz. algorithms that are designed for processing-efficient and/or memory-efficient and/or precise tomographic reconstruction of an interaction field. In this embodiment, any characteristics of the object are determined from a statistical measure of the virtual light paths intersecting the region.

As used herein, the emitters 30a may be any type of device capable of emitting radiation in a desired wavelength range, for example a diode laser, a VCSEL (vertical-cavity surface-emitting laser), an LED (light-emitting diode), an incandescent lamp, a halogen lamp, etc. The emitters 30a may also be formed by the end of an optical fibre. The emitters 30a may generate light in any wavelength range. The following examples presume that the light is generated in the infrared (IR), i.e. at wavelengths above about 750 nm. Analogously, the detectors 30b may be any device capable of converting light (in the same wavelength range) into an electrical signal, such as a photo-detector, a CCD device, a CMOS device, etc.

The detectors 30b collectively provide an output signal, which is received and sampled by a signal processor 140. The output signal contains a number of sub-signals, also denoted “transmission values”, each representing the energy of light received by one of light detectors 30b from one of light emitters 30a. Depending on implementation, the signal processor 140 may need to process the output signal for separation of the individual transmission values. The transmission values represent the received energy, intensity or power of light received by the detectors 30b on the individual detection lines 50. Whenever an object touches a detection line 50, the received energy on this detection line is decreased or “attenuated”. Where an object blocks the entire width of the detection line of an above-surface system, the detection line will be fully attenuated or occluded.

FIG. 2 shows a cross-section of an IR optical touch apparatus according to the prior art. In the example apparatus shown in FIG. 2, object 60 having tip 62 will attenuate light propagating along at least one light path 50. In the example shown in FIG. 2, object 60 may even fully occlude the light on at least one light path 50.

In one embodiment, the touch apparatus is arranged according to FIG. 2. Light emitted by emitters 30a is transmitted through transmissive panel 10 in a manner that does not cause the light to undergo total internal reflection (TIR) within transmissive panel 10. Instead, the light exits transmissive panel 10 through touch surface 20 and is reflected by reflector surface 80 of edge reflector 70 to travel along a path 50 in a plane parallel with touch surface 20. The light will then continue until deflected by reflector surface 80 of the edge reflector 70 at an opposing or adjacent edge of the transmissive panel 10, whereupon the light will be deflected back down through transmissive panel 10 and onto detectors 30b. An object 60 (optionally having object tip 62) touching surface 20 will occlude light paths 50 that intersect with the location of the object on the surface, resulting in an attenuated light signal received at detector 30b.

In one embodiment, the top edge of reflector surface 80 is 2 mm above touch surface 20. This results in a light field 90 which is 2 mm deep. A 2 mm deep field is advantageous for this embodiment as it minimizes the distance that the object needs to travel into the light field to reach the touch surface and maximally attenuate the light. The smaller the distance, the shorter the time between the object entering the light field and contacting the surface. This is particularly advantageous for differentiating between large objects entering the light field slowly and small objects entering the light field quickly. A large object entering the light field will initially cause a similar attenuation as a smaller object fully extended into the light field. The shorter the distance the objects have to travel, the fewer frames are required before a representative attenuation signal for each object can be observed. This effect is particularly apparent when the light field is between 0.5 mm and 2 mm deep.

In an alternative embodiment shown in FIG. 3, the transmitted light illuminates a touch surface 20 from within the panel 10. The panel 10 is made of solid material in one or more layers and may have any shape. The panel 10 defines an internal radiation propagation channel, in which light propagates by internal reflections. The propagation channel is defined between the boundary surfaces of the panel 10, where the top surface allows the propagating light to interact with touching objects 60 and thereby defines the touch surface 20. This is achieved by injecting the light into the panel 10 such that the light is reflected by total internal reflection (TIR) in the touch surface 20 as it propagates through the panel 10. The light may be reflected by TIR in the bottom surface or against a reflective coating thereon. In this embodiment, an object 60 may be brought in contact with the touch surface 20 to interact with the propagating light at the point of touch. In this interaction, part of the light may be scattered by the object 60, part of the light may be absorbed by the object 60, and part of the light may continue to propagate in its original direction across the panel 10. Thus, the touching object 60 causes a local frustration of the total internal reflection, which leads to a decrease in the energy (or, equivalently, power or intensity) of the transmitted light.

The signal processor 140 may be configured to process the transmission values so as to determine a property of the touching objects, such as a position (e.g. in a x,y coordinate system), a shape, or an area. This determination may involve a straight-forward triangulation based on the attenuated detection lines, e.g. as disclosed in U.S. Pat. No. 7,432,893 and WO2010/015408, or a more advanced processing to recreate a distribution of attenuation values (for simplicity, referred to as an “attenuation pattern”) across the touch surface 20, where each attenuation value represents a local degree of light attenuation. The attenuation pattern may be further processed by the signal processor 140 or by a separate device (not shown) for determination of a position, shape or area of touching objects. The attenuation pattern may be generated e.g. by any available algorithm for image reconstruction based on transmission values, including tomographic reconstruction methods such as Filtered Back Projection, FFT-based algorithms, ART (Algebraic Reconstruction Technique), SART (Simultaneous Algebraic Reconstruction Technique), etc. Alternatively, the attenuation pattern may be generated by adapting one or more basis functions and/or by statistical methods such as Bayesian inversion. Examples of such reconstruction functions designed for use in touch determination are found in WO2009/077962, WO2011/049511, WO2011/139213, WO2012/050510, and WO2013/062471, all of which are incorporated herein by reference.

For the purposes of brevity, the term ‘signal processor’ is used throughout to describe one or more processing components for performing the various stages of processing required between receiving the signal from the detectors through to outputting a determination of touch, including touch co-ordinates, touch properties, etc. Although the processing stages of the present disclosure may be carried out on a single processing unit (with a corresponding memory unit), the disclosure is also intended to cover multiple processing units and even remotely located processing units. In an embodiment, the signal processor 140 can include one or more hardware processors 130 and a memory 120. The hardware processors can include, for example, one or more computer processing units. The hardware processors can also include microcontrollers and/or application specific circuitry such as ASICs and FPGAs. The flowcharts and functions discussed herein can be implemented as programming instructions stored, for example, in the memory 120 or a memory of the one or more hardware processors. The programming instructions can be implemented in machine code, C, C++, JAVA, or any other suitable programming languages. The signal processor 140 can execute the programming instructions and accordingly execute the flowcharts and functions discussed herein.

FIG. 4 shows a flow diagram according to an embodiment.

In step 410 of FIG. 4, the signal processor 140 receives and samples output signals from detectors 30b.

In step 420, the output signals are processed for determination of the transmission values (or ‘transmission signals’). As described above, the transmission values represent the received energy, intensity or power of light received by the detectors 30b on the individual detection lines 50.
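
By way of illustration, a minimal Python sketch of this normalisation, assuming hypothetical arrays of sampled and reference energies per detection line (the source does not prescribe a specific formula):

```python
import numpy as np

def transmission_values(sampled_energy, reference_energy):
    # One entry per detection line (emitter/detector pair).
    # transmission ~ 1.0 for an undisturbed line, ~0.0 when fully occluded.
    sampled = np.asarray(sampled_energy, dtype=float)
    reference = np.asarray(reference_energy, dtype=float)
    # Guard against division by zero on dead detection lines.
    return np.divide(sampled, reference,
                     out=np.zeros_like(sampled), where=reference > 0)
```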

In step 430, the signal processor 140 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface. In an embodiment, the signal processor 140 is configured to process the transmission values to generate a two-dimensional attenuation map of the attenuation field across the touch surface, i.e. a spatial distribution of attenuation values, in which each touching object typically appears as a region of changed attenuation. From the attenuation map, two-dimensional touch data may be extracted and one or more touch locations may be identified. The transmission values may be processed according to a tomographic reconstruction algorithm to generate the two-dimensional attenuation map of the attenuation field.
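
For a feel of how such a map can arise, here is a deliberately crude, illustrative back-projection sketch in Python: each detection line smears its attenuation (1 − transmission) over the grid cells it crosses. It stands in for, and is much simpler than, the tomographic reconstruction algorithms referenced above:

```python
import numpy as np

def attenuation_map(lines, grid_w, grid_h, n_samples=64):
    # lines: iterable of ((x0, y0), (x1, y1), transmission),
    # with endpoints already expressed in grid coordinates.
    acc = np.zeros((grid_h, grid_w))
    hits = np.zeros((grid_h, grid_w))
    for (x0, y0), (x1, y1), t in lines:
        attenuation = 1.0 - t
        for s in np.linspace(0.0, 1.0, n_samples):
            x = int(round(x0 + s * (x1 - x0)))
            y = int(round(y0 + s * (y1 - y0)))
            if 0 <= x < grid_w and 0 <= y < grid_h:
                acc[y, x] += attenuation
                hits[y, x] += 1
    # Average over all lines crossing each pixel; touching objects show
    # up as regions of raised attenuation.
    return np.divide(acc, hits, out=np.zeros_like(acc), where=hits > 0)
```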

In one embodiment, the signal processor 140 may be configured to generate an attenuation map for the entire touch surface. In an alternative embodiment, the signal processor 140 may be configured to generate an attenuation map for a sub-section of the touch surface, the sub-section being selected according to one or more criteria determined during processing of the transmission values.

In an alternative embodiment, the signal processor 140 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface by determining intersections between attenuated or occluded detection lines, i.e. by triangulation. In yet another embodiment, the signal processor 140 is configured to process the transmission values to determine the presence of one or more touching objects on the touch surface using non-linear touch detection techniques such as those described in US patent application publication 20150130769 or 20150138105.

In step 440, the signal processor 140 is configured to determine an object reference point 250 for each touching object 210, 220. As shown in FIG. 5, finger 210 and stylus 220 are applied to touch surface 20. Object reference point 250′ is determined for finger 210. Similarly, object reference point 250″ is determined for stylus 220.

In one embodiment, an image moment is applied to the attenuation map, or to a sub-region of the attenuation map, to determine a centroid of a detected touching object, for use as the object reference point. For example, for a scalar attenuation map with pixel intensities I(x, y), the raw image moments M_ij are calculated by:

M_ij = Σ_x Σ_y x^i · y^j · I(x, y)

The centroid of the image moment may be calculated as:


{x̄, ȳ} = {M_10/M_00, M_01/M_00}

The object reference point 250 is then set to the co-ordinates of the centroid of the image moment.
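
A minimal sketch of this computation, assuming the attenuation map (or a sub-region of it) is available as a 2-D array:

```python
import numpy as np

def object_reference_point(att_map):
    # Raw image moments M_ij = sum_x sum_y x^i * y^j * I(x, y).
    ys, xs = np.mgrid[0:att_map.shape[0], 0:att_map.shape[1]]
    m00 = att_map.sum()
    if m00 == 0:
        return None                    # no interaction in this sub-region
    m10 = (xs * att_map).sum()
    m01 = (ys * att_map).sum()
    return (m10 / m00, m01 / m00)      # centroid (x, y)
```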

In another embodiment, signal processor 140 is configured to determine an object reference point 250 within the interaction area of the touching object by determining a local maximum (i.e. the point of highest attenuation) in the area of the attenuation map covered by the object. The identified maximum may be further processed for determination of a touch shape and a center position, e.g. by fitting a two-dimensional second-order polynomial or a Gaussian bell shape to the attenuation values, or by finding the ellipse of inertia of the attenuation values. There are also numerous other techniques as is well known in the art, such as clustering algorithms, edge detection algorithms, standard blob detection, water shedding techniques, flood fill techniques, etc. Step 440 results in a collection of peak data, which may include values of position, attenuation, size, and shape for each detected peak. The attenuation value may be calculated from a maximum attenuation value or a weighted sum of attenuation values within the peak shape.

In another embodiment, signal processor 140 is configured to determine an object reference point 250 within the interaction area of a large touching object by selecting a point at random within the boundary of the touching object.

In an embodiment in which touching objects are identified using intersections between attenuated or occluded detection lines, i.e. by triangulation, the object reference point is set to the intersection point or average of intersection points, including a weighted average determined in dependence on the attenuation of the detection lines used for computing the intersection points.

In step 450, a region 200 is determined around object 210, 220. The region corresponds to an area of the touch surface at the point of, and surrounding, an object interacting with the touch surface. In one embodiment, region 200 may be a circular area, centred on object reference point 250 and having radius R. Radius R may be a predetermined length. Alternatively, radius R may be dynamically determined in dependence on properties of the touching object, including the contact area of the touching object or a pressure exerted by the touching object on the touch surface. Other embodiments are envisioned in which the region has an alternative shape, e.g. a rectangular region defined by a width and a height, with object reference point 250 at its centre. Similarly, an ellipse may be used, defined by a width and a height, with object reference point 250 at its centre.

In step 460, a set of detection lines intersecting region 200 is determined. In an embodiment where region 200 is a circular area, centred on object reference point 250 and having radius R, the set of detection lines intersecting region 200 is determined to be the set of detection lines passing within distance R of the object reference point 250.

An embodiment of step 460 is now described. This embodiment is recognised as one of numerous possible solutions for determining detection lines intersecting region 200.

1) The emitter/detector pairs forming each detection line are analysed in a counterclockwise direction. As shown in FIG. 6a, the detection line from the first emitter e0 on the bottom side of the touch surface and the first detector d0 on the right side is the first detection line to be analysed. For the purposes of clear explanation, the touch system shown in FIG. 6a shows only emitters along left and bottom edges and detectors along the right and top edges. However, it is understood that the present concepts may be applied to touch systems having a variety of emitter and detector geometries including interleaved emitter and detector arrangements.

The detector counter is then incremented in the counterclockwise direction (i.e. di+1) and the detection line between emitter e0 and the incremented detector di+1 is analysed. This loop continues, and the detection lines from the emitter are therefore analysed in a counterclockwise pattern until a detection line is identified that passes sufficiently close to the object reference point 250, i.e. until distance 255 is within the specified radius R. In FIG. 6a, this is the detection line 170. Measuring distance 255 is preferably achieved using the dot product:


s = dot product(normal[e0−di], object reference point − detection line position[e0−di])

where s is the closest distance from the object reference point to the detection line.
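
A sketch of this test, assuming 2-D coordinates for the emitter, detector and object reference point; the function name is illustrative:

```python
import numpy as np

def signed_distance(emitter_xy, detector_xy, point_xy):
    # s = dot(unit normal of detection line, point - position on line),
    # i.e. the quantity called distance 255 in the text.
    e, d, p = (np.asarray(v, dtype=float)
               for v in (emitter_xy, detector_xy, point_xy))
    direction = (d - e) / np.linalg.norm(d - e)
    normal = np.array([-direction[1], direction[0]])   # unit normal
    return float(np.dot(normal, p - e))

# The detection line intersects the circular region 200 when
# abs(signed_distance(e, d, object_reference_point)) <= R.
```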

Other search sequences are envisaged including a binary search, or root-finding algorithm, such as secant method or Newton's method.

FIG. 6a also shows detection line angle ϕ, the use of which is described below.

In embodiments where region 200 is non-circular, other techniques for determining intersection of the region by the detection line may be used, e.g. ray/polygon intersection algorithms as known in the art.

As shown in FIG. 6b, the loop then continues and the detection lines from the emitter continue to be analysed in a counterclockwise pattern, identifying all of the detection lines passing within distance R of the object reference point 250 until a detection line is identified that does not pass within distance R of the object reference point 250. All of the detection lines D0 are defined as the set of detection lines from emitter e0 intersecting region 200. Of this set, the most clockwise detection line is dcw,0 and the most counterclockwise detection line is dccw,0.

For all detection lines D0, the transmission values and reference values are determined. In one embodiment, the reference values are an estimated background transmission value for the detection line without any touching objects present. In an alternative embodiment, the reference values can be a transmission value of the detection line recorded at a previous time, e.g. within the preceding 500 ms. Alternatively, the reference values can be an average of transmission values over a period of time, e.g. over the last 500 ms. Such averaging techniques are described in U.S. Pat. No. 9,377,884.
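
A sketch of such a running reference, using an exponential forget average; the forget factor shown is an assumed, illustrative value:

```python
def update_reference(reference, transmission, forget=0.25):
    # Exponential forget average: the reference tracks slow drift in the
    # signal while a fresh touch still shows up as a drop in transmission.
    if reference is None:              # first frame: seed the reference
        return transmission
    return (1.0 - forget) * reference + forget * transmission
```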

As shown in FIG. 6c, the next step is to move on to the next emitter to determine detection lines from the next emitter that intersect region 200.

As the emitters/detectors are processed in a circular order, a geometric consequence is that the detection line defined by [ej+1, dk] will be further away from the region 200 than [ej, dk]. Therefore, in a preferable configuration, when detection lines for the next emitter in the counterclockwise direction are analysed, the first detection line to be analysed may be [ej+1, dcw,j], with the analysis then continuing in a counterclockwise direction. As an alternative, the next detection line to be analysed may be determined using a binary search or a root-finding algorithm. As shown in FIG. 6c, once [e0, dcw,0] is determined to be the most clockwise detection line to intersect region 200 from emitter e0, detection line [e1, dcw,0], shown in the figure as detection line 172, is an effective detection line to start the next loop with. This allows a significant reduction in the number of computations required to determine the set of intersecting detection lines.

As shown in FIG. 6d, the loop then continues and the detection lines from the next emitter continue to be analysed in a counterclockwise pattern, identifying all of the detection lines passing within distance R of the object reference point 250 until a detection line is identified that does not pass within distance R of the object reference point 250. All of the detection lines D1 are defined as the set of detection lines from emitter e1 intersecting region 200. Of this set, the most clockwise detection line is dcw,1 and the most counterclockwise detection line is dccw,1.

The above steps are repeated for every emitter until every detection line intersecting with region 200 is determined. It is noted that the order in which detection lines are analysed is arbitrary. It is possible to start with fixed emitters or detectors when searching for intersecting detection lines.
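
The sweep might be sketched as follows, reusing the signed_distance helper from above. The sketch assumes a flat counterclockwise ordering of emitters and detectors without wrap-around; a real implementation must handle the circular indexing of elements around the perimeter:

```python
def intersecting_detection_lines(emitters, detectors, ref_point, R):
    # Collect every detection line [e, d] passing within R of ref_point.
    result = []
    start = 0                          # index of d_cw from previous emitter
    for e in emitters:
        found_first = False
        for k in range(start, len(detectors)):
            if abs(signed_distance(e, detectors[k], ref_point)) <= R:
                if not found_first:
                    start = k          # d_cw for this emitter; next sweep
                    found_first = True # starts here (geometric shortcut)
                result.append((e, detectors[k]))
            elif found_first:
                break                  # passed d_ccw: done with this emitter
    return result
```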

In step 470 of FIG. 4, the signal processor 140 is configured to determine characteristics of the touching object in dependence on the set of detection lines intersecting region 200 around the touching object.

FIG. 7a shows a set of detection lines that intersect region 200′ surrounding finger 210. Non-interacting detection lines 240 intersect region 200′ but do not interact with finger 210 in a significant way. Interacting detection lines 230 intersect region 200′ and are also attenuated or occluded by finger 210.

FIG. 7b shows a similar arrangement to FIG. 7a but where the interacting object is a stylus tip instead of a finger. In FIG. 7b, a set of detection lines are shown that intersect region 200″. Non-interacting detection lines 240 intersect region 200″ but do not interact with stylus tip 220 in a significant way. Interacting detection lines 230 intersect region 200″ and are also attenuated or occluded by stylus tip 220.

Although FIGS. 7a and 7b show the detection lines as thin lines, an actual embodiment may have much wider detection lines in the plane of the touch surface. For a system such as that shown in FIG. 2, where an object may fully occlude light, an un-occluded portion of the detection line may still be received at the detector, so the detection line appears to be only attenuated and not occluded. Therefore, in an embodiment of a touch system of the type shown in FIG. 2, a detection line may be determined to be an interacting detection line if it is attenuated by the object by between 30% and 100%. In an embodiment of a touch system of the type shown in FIG. 3, a detection line may be determined to be an interacting detection line if it is attenuated by the object by between 0.5% and 10%. The ‘interaction’ threshold should be higher than the expected noise on the respective detection line.

From a visual inspection of FIGS. 7a and 7b, it is clear that the ratio of interacting detection lines 230 to non-interacting detection lines 240 is greater for a finger object than for a stylus object. Therefore, in one embodiment, an object type may be determined in dependence on at least one statistical measure of variables of the detection lines.

FIGS. 8a and 8b show histograms of attenuation values (where the attenuation values represent the drop in transmission of the signal, e.g. attenuation = 1 − transmission) for the interacting detection lines. FIG. 8a shows values of interacting detection lines 230 for finger 210; four separate finger touch events are shown. FIG. 8b shows values of interacting detection lines 230 for stylus tip 220; four separate stylus touch events are shown. From FIGS. 8a and 8b, it is clear that fingers have a negative skew while styli tend to have a more positive skew (skew as defined per its normal meaning in statistics). One can also note that there is almost no tail in the distribution of detection lines with high attenuation for the fingers, while there is a distinct tail in the distribution of detection lines for the pens. In the embodiment used to produce the histograms, the attenuation is computed using the current transmission signals and an exponential forget average of 3-4 previous transmission signals.

In one embodiment, the attenuation values capture changes in attenuation on a relatively short time scale, i.e. during the touch-down event. Such an attenuation map is described in U.S. Pat. No. 9,377,884.

FIGS. 9a and 9b show histograms of attenuation values for the interacting detection lines with corresponding first threshold levels. FIG. 9a shows values of interacting detection lines 230 for finger 210. FIG. 9b shows values of interacting detection lines 230 for stylus tip 220.

The first threshold in FIGS. 9a and 9b is computed as:

first threshold = factor · Σ_n (frequency(n) · attenuation(n)) / Σ_n frequency(n), where factor = 2

The threshold factor may be adjusted in dependence on temporal information of the interactions of the touch system. In one embodiment, where a plurality of styli have been recently identified in an area, the threshold for detecting styli in that area may be reduced to make stylus classification more likely. Where a plurality of fingers have been recently identified in an area, the factor may be increased for determinations made in that area to make finger classification more likely. The factor may also be adjusted to ensure better performance when several proximal touches are detected, due to some detection lines passing through more than one object.

In one embodiment, a first threshold is used to find a ratio of detection lines above and below the first threshold. This ratio is small for fingers and higher for pens. In the example results of FIGS. 8a, 8b, 9a, and 9b, a second threshold is located between the finger ratio of approximately 0.02 and the pen ratio of approximately 0.08, i.e. at 0.05. The second threshold is then used to determine the object type, i.e. an object having a set of detection lines with a ratio above the second threshold may be determined to be a stylus (or a finger if below the second threshold). In an alternative embodiment, the first threshold is computed in dependence on an attenuation peak from an attenuation map, e.g. the first threshold is set to a value corresponding to the peak value multiplied by a typical finger size.
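
Combining the first-threshold formula with the ratio test, a sketch of the classification; the factor of 2 follows the formula above, while the second threshold of 0.05 is the example value from the text:

```python
import numpy as np

def classify_object(attenuations, factor=2.0, second_threshold=0.05):
    # attenuations: 1 - transmission for each interacting detection line.
    att = np.asarray(attenuations, dtype=float)
    # first threshold = factor * weighted mean of the histogram, which for
    # raw per-line samples reduces to factor * mean attenuation.
    first_threshold = factor * att.mean()
    above = np.count_nonzero(att > first_threshold)
    below = max(np.count_nonzero(att <= first_threshold), 1)
    ratio = above / below              # ~0.02 for fingers, ~0.08 for pens
    return "stylus" if ratio > second_threshold else "finger"
```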

For systems where the detection line width is similar to that of the pen, reconstructed peaks of the same attenuation (touches and pens) have different attenuation histograms. Since a finger is generally bigger, it will have a lower attenuation per detection line (if the reconstructed attenuation is the same) than a pen (which attenuates fewer detection lines), even though the reconstructed attenuation value may end up at the same level.

In one embodiment, the ratio of attenuated detection lines (whose attenuation is above a threshold) to the total number of detection lines passing through the region may be used to determine an object type. For example, if all detection lines that pass within 5 mm of the touch point are analysed, a finger can be expected to affect almost all of the detection lines (most fingers are larger than 10 mm in diameter). A stylus tip with a 2-4 mm contact diameter will only affect around 10-70% of the detection lines, depending on the width of the detection line. Consequently, in an embodiment, the object type may be determined to be a finger where the ratio of the number of affected detection lines to the total number of intersecting detection lines exceeds 0.7.

In other embodiments, the statistical measure may comprise the symmetry, skewness, kurtosis, mode, support, head, tail, mean, median, variance or standard deviation of a variable of the set of intersecting detection lines.

In some embodiments, characteristics of the object may be determined in dependence on a plurality of the statistical measures. In one example, object type and an orientation of the object are determined in dependence on the statistical measure of at least the angle of the light path in the plane of the touch surface (shown as φ in FIG. 6a) and the transmission value of the light path.

In some embodiments, at least one statistical measure is a multivariate statistical measure of values for a plurality of light path variables of the set of intersecting light paths, e.g. a combination of the median and the skewness of the attenuation values may be used to determine object type. Alternatively, variance and median values may be used to determine object type. In an alternative example, an orientation of the object is determined in dependence on the statistical measure of the angle of the light path in the plane of the touch surface and the transmission value of the light path.

A True Centre Point

A true centre point of a touch object (as opposed to object reference point 250) can now be found as the solution to the following over-determined set of linear equations, solved using normal equations.

For each of the interacting detection lines 230, a normal vector (having unit length) is determined as well as a position on the respective detection line (which can be the geometrical position of either emitter or detector or some other point).

For each detection line passing through the region we get one “weighted” equation:

0 = attenuation · dot product(normal[ej−di], true centre point − detection line position[ej−di])

where normal is the normal vector and detection line position[ej−di] is a position along the detection line. All of the linear equations are then solved together to determine a centre position. Using the attenuation as weight when solving the normal equations eliminates the need to threshold the affected vs unaffected detection lines when computing the centre point in this fashion.

This technique also allows a centre position to be determined for regular shapes, oblongs, etc.
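
A sketch of this solve; numpy's least-squares routine is used in place of explicitly formed normal equations, which is mathematically equivalent for this system:

```python
import numpy as np

def true_centre_point(normals, line_points, attenuations):
    # Each interacting detection line i contributes one weighted equation
    #   0 = a_i * dot(n_i, p - q_i)  =>  a_i * (n_i . p) = a_i * (n_i . q_i)
    # with unknown centre p, unit normal n_i, line point q_i, weight a_i.
    n = np.asarray(normals, dtype=float)       # shape (N, 2)
    q = np.asarray(line_points, dtype=float)   # shape (N, 2)
    a = np.asarray(attenuations, dtype=float)  # shape (N,)
    A = n * a[:, None]                         # weighted rows a_i * n_i
    b = a * np.einsum('ij,ij->i', n, q)        # a_i * (n_i . q_i)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p                                   # (x, y) of true centre point
```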

Geometric characteristics of the object may also be determined in dependence on the one or more statistical measures, including length, width, radii, orientation in the plane of the touch surface, and shape.

Orientation of an Elongated Touching Object

In one embodiment, all determined detection lines for all emitters are analysed to determine their angle φ (phi), defined as the angle between the normal to the detection line and the touch surface x-axis 400, and the shortest distance from the true centre point to the detection line. Given all detection lines passing through the region, a minimum average (over a small phi-region) of attenuation × (shortest distance from detection line to true centre point) provides the orientation of an elongated object.
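
A sketch of this estimate, assuming per-line angles, centre distances and attenuations are already available; the bin count is an assumed value:

```python
import numpy as np

def elongated_orientation(phis, centre_distances, attenuations, n_bins=36):
    # Line orientation is periodic with pi, so fold angles into [0, pi).
    phi = np.asarray(phis, dtype=float) % np.pi
    score = np.asarray(attenuations) * np.abs(centre_distances)
    edges = np.linspace(0.0, np.pi, n_bins + 1)
    idx = np.clip(np.digitize(phi, edges) - 1, 0, n_bins - 1)
    means = np.full(n_bins, np.inf)
    for b in range(n_bins):
        if np.any(idx == b):
            means[b] = score[idx == b].mean()
    best = int(np.argmin(means))
    # Lines aligned with the long axis pass close to the centre, so the
    # minimising bin gives the orientation of the elongated object.
    return 0.5 * (edges[best] + edges[best + 1])
```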

Object Boundary Lines

A boundary line may be determined as the detection line with the largest magnitude distance from centre point 250 where the attenuation is above a threshold. The characteristics of the selected boundary line will provide useful information about the characteristics of object 210, 220. First, where the object is substantially rectangular, the length (i.e. the major axis) of the object may be determined in dependence on a vector defining the shortest distance from the boundary line to the true centre point. As the object is rectangular, the magnitude of the vector may be assumed to be half of the length. Therefore, the length of the object may be determined to be twice the magnitude of the vector.

Furthermore, the angle of the vector also defines the orientation angle of the rectangular object. The angle phi of the vector defines the wide axis of the object. Consequently, the angle of the narrow axis of the rectangle may be defined as phi ± π/2. Using phi ± π/2, we can also use the distance between the boundary line located at phi ± π/2 and the true centre point to determine the width of the object. Similar to above, the width of the object may be determined to be twice the magnitude of the vector of the boundary line located at phi ± π/2.

In one embodiment, the phi and length values for the object are determined using an average of a plurality of the highest values.
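
A sketch of these boundary-line computations under the rectangular-object assumption; the attenuation threshold and angular tolerance are illustrative values:

```python
import numpy as np

def rectangle_from_boundary_lines(phis, centre_distances, attenuations,
                                  att_threshold=0.05, tol=0.2):
    phi = np.asarray(phis, dtype=float) % np.pi
    dist = np.abs(np.asarray(centre_distances, dtype=float))
    strong = np.asarray(attenuations) >= att_threshold
    if not np.any(strong):
        return None
    # Boundary line: the attenuated line farthest from the true centre.
    i = np.flatnonzero(strong)[np.argmax(dist[strong])]
    orientation, length = phi[i], 2.0 * dist[i]
    # Width from boundary lines roughly at phi +/- pi/2 (wrapped mod pi).
    delta = (phi - (orientation + np.pi / 2)) % np.pi
    perpendicular = strong & (np.minimum(delta, np.pi - delta) < tol)
    width = 2.0 * dist[perpendicular].max() if np.any(perpendicular) else 0.0
    return length, width, orientation
```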

In another embodiment, a touch system is provided that includes a touch surface, a display, a touch sensor configured to detect one or more objects touching the touch surface and generate a touch signal, and a processing element configured to: determine a position of the one or more objects in dependence on the touch signal, determine whether an object is an eraser in dependence on the touch signal, and output a user interface to the display, wherein the user interface is configured to display one or more interaction objects and wherein the user interface is controlled via the one or more objects on the touch surface, wherein an erase function may only be applied to the user interface by means of an object determined to be an eraser. The eraser may have a rectangular surface for application to the touch surface, allowing the touch system to easily identify the shape of the eraser, either according to the above techniques or techniques otherwise known to the skilled person. In a classroom environment where a teacher and children are interacting with a digital whiteboard and where erasing objects on the digital whiteboard is only permitted by means of the physical eraser, it is surprisingly difficult for a child to accidentally or deliberately simulate the shape of a rectangular eraser on the touch surface using their fingers and hands. Therefore, it is advantageously possible to prevent a child from erasing objects (e.g. ink, text, or geometric shapes) on the digital whiteboard without using the eraser object, i.e. without the teacher's authorization.

In the embodiment above, the user interface may be a canvas or whiteboard application. Furthermore, the one or more interaction objects may comprise ink, text, or geometric shapes. The one or more interaction objects may be added to the user interface by means of a non-eraser object type applied to the touch surface. The erase function may remove interaction objects from the user interface at a position on the user interface corresponding to the position of the eraser on the touch surface.

Claims

1. A touch sensing apparatus, comprising:

a touch surface,
a plurality of emitters, arranged around the periphery of the touch surface, configured to emit beams of light such that one or more objects touching the touch surface cause an attenuation or occlusion of light;
a plurality of detectors, arranged around the periphery of the touch surface, configured to receive light from the plurality of emitters on a plurality of light paths, wherein each detector in the plurality of detectors is arranged to receive light from more than one emitter in the plurality of emitters; and
a hardware processor configured to: determine, based on output signals from the plurality of detectors, a plurality of transmission values, each of the plurality of transmission values corresponding to each of the plurality of light paths; determine an object reference point on the touch surface where the light is attenuated or occluded by an object based on the plurality of transmission values; determine an area on the touch surface including the object reference point; determine one or more light paths of the plurality of light paths intersecting the area; determine a numerical measure based on the determined one or more light paths intersecting the area, and determine one or more characteristics of the object based on the numerical measure.

2. The touch sensing apparatus of claim 1, further comprising a light transmissive panel defining the touch surface and an opposite surface, wherein the emitters are configured to introduce light into the panel for propagation by internal reflection between the touch surface and the opposite surface, and the detectors are configured to receive the light propagating in the panel.

3. The touch sensing apparatus of claim 1, wherein the emitters are configured to transmit the beams of light above the touch surface and the detectors are configured to receive said beams of light travelling above the touch surface.

4. The touch sensing apparatus of claim 1, wherein processing the transmission values to determine the object reference point on the touch surface comprises processing the transmission values according to an image reconstruction algorithm to determine areas of the touch surface where the light is attenuated or occluded by an object, and selecting an object reference point at a position on the touch surface corresponding to an area of occlusion or high attenuation of the light.

5. The touch sensing apparatus of claim 4, wherein the image reconstruction algorithm is an algorithm for transmission tomography.

6. The touch sensing apparatus of claim 1, wherein processing the transmission values to determine the object reference point on the touch surface comprises triangulation of attenuated or occluded light paths.

7. The touch sensing apparatus of claim 1, wherein the region is defined as a circular region with a radius R from the object reference point at the centre.

8. The touch sensing apparatus of claim 7, wherein the plurality of light paths intersecting the region are determined to be the plurality of light paths passing within radius R of the object reference point.

9. The touch sensing apparatus of claim 1, wherein the at least one light path variable may further comprise:

an angle of the light path in the plane of the touch surface,
a closest distance from object reference point to the light path,
a noise value for the light path,
a validity status of the light path,
a width of the light path in the plane of the touch surface.

10. The touch sensing apparatus of claim 1, wherein the at least one statistical measure is a ratio of values above a first threshold to values below the first threshold.

11. The touch sensing apparatus of claim 10, wherein the first threshold value is determined in dependence on a determination of the attenuation or occlusion of the light at the object reference point.

12. The touch sensing apparatus of claim 1, wherein the at least one statistical measure comprises: symmetry, skewness, kurtosis, mode, support, head, tail, mean, median, variance or standard deviation.

13. The touch sensing apparatus of claim 1, wherein the one or more characteristics of the object are determined in dependence on a plurality of statistical measures.

14. The touch sensing apparatus of claim 13, wherein an object type and an orientation of the object are determined in dependence on the statistical measure of at least the angle of the light path in the plane of the touch surface and the transmission value of the light path.

15. The touch sensing apparatus of claim 1, wherein the at least one statistical measure is a multivariate statistical measure of values for each of at least two light path variables of the plurality of light paths intersecting the region.

16. The touch sensing apparatus of claim 15, wherein an orientation of the object is determined in dependence on the statistical measure of the angle of the light path in the plane of the touch surface and the transmission value of the light path.

17. The touch sensing apparatus of claim 1, wherein the one or more characteristics of the object determined in dependence on the at least one statistical measure comprise object type.

18. The touch sensing apparatus of claim 17, wherein the touch sensing apparatus is configured to differentiate between a finger and at least one stylus.

19. The touch sensing apparatus of claim 1, wherein the touch sensing apparatus is configured to determine at least one geometric characteristic of the object from: length, width, radii, orientation in the plane of the touch surface, shape.

20. A method in a touch sensing apparatus, said touch sensing apparatus comprising:

a touch surface,
a plurality of emitters arranged around the periphery of the touch surface, configured to emit beams of light such that one or more objects touching the touch surface cause an attenuation or occlusion of the light; and
a plurality of light detectors, arranged around the periphery of the touch surface, configured to receive light from the plurality of emitters on a plurality of light paths, wherein each detector in the plurality of detectors is arranged to receive light from more than one emitter in the plurality of emitters;
said method comprising: determining, based on output signals from the plurality of light detectors, a plurality of transmission values, each of the plurality of transmission values corresponding to each of the plurality of light paths; determining an object reference point on the touch surface where the light is attenuated or occluded by an object based on the plurality of transmission values; determining an area on the touch surface including the object reference point; determining one or more light paths of the plurality of light paths intersecting the area; determining a numerical measure based on the determined one or more light paths intersecting the area; and determining one or more characteristics of the object based on the numerical measure.
Patent History
Publication number: 20180275830
Type: Application
Filed: Mar 19, 2018
Publication Date: Sep 27, 2018
Inventors: Tomas Christiansson (Torna-Hallestad), Kristofer Jakobson (Malmo), Nicklas Ohlsson (Bunkeflostrand), Mattias Krus (Lund), Magnus Hollström (Lund)
Application Number: 15/925,329
Classifications
International Classification: G06F 3/042 (20060101);