SYSTEM AND METHOD FOR MULTI-MODE COMMAND INPUT
A controlling device has a moveable touch sensitive panel positioned above a plurality of switches. When the controlling device senses an activation of at least one of the plurality of switches caused by a movement of the touch sensitive panel resulting from an input at an input location upon the touch sensitive surface, the controlling device responds by transmitting a signal to an appliance, wherein the signal is reflective of the input location upon the touch sensitive surface.
This application is a continuation of, and claims the benefit of, U.S. Application No. 15/902,007, filed on Feb. 22, 2018, which application is a continuation of, and claims the benefit of, U.S. Application No. 12/645,037, filed on Dec. 22, 2009, the disclosures of which are incorporated herein by reference in their entirety.
BACKGROUND
Controlling devices for use in issuing commands to entertainment and other appliances, for example remote controls, and the features and functionality provided by such controlling devices are well known in the art. Traditionally, user input means on such controlling devices has comprised a series of buttons each of which may result in the transmission of a specific command when activated. Increasingly in today’s environment, such controlling devices must be used to interact with displayed menu systems, browse web pages, manipulate pointers, and perform other similar activities which may require directional control input, e.g., to scroll displayed information on a screen, to move a pointer, to control a game activity or avatar, to zoom in or out, to control functions such as fast forward or slow motion, or the like (such activities collectively referred to hereinafter as “navigation”). Although certain navigation functions may be performed using conventional controlling device input mechanisms, such as a group of up, down, left, and right arrow keys, in many instances the user experience may be improved by the provision of an input mechanism which is better suited to this type of activity. Additionally, multi-functional use of this input mechanism may further improve user experience by reducing the number of keys or buttons on a controlling device.
SUMMARY
The following generally describes a system and method for providing improved user input functionality on a controlling device. To this end, in addition to a conventional key matrix for receiving button inputs as is well known in the art, a controlling device may be provided with input means such as for example a resistive or capacitive touch sensor, etc., whereby motion and/or pressure by a user’s finger may be translated into navigation commands to be transmitted to a target controlled device. These commands may be applied at the target device to control operations such as scrolling a menu, movement of a cursor on the screen, motion of a game object, etc., as appropriate for a particular application. Furthermore, in addition to, or when not required for, the performance of navigation functions, the touch sensitive input means may be adapted to provide for conventional keypress input operations, such as for example without limitation a numeric keypad in an illustrative embodiment.
A better understanding of the objects, advantages, features, properties and relationships of the invention will be obtained from the following detailed description and accompanying drawings which set forth illustrative embodiments and which are indicative of the various ways in which the principles of the invention may be employed.
For a better understanding of the various aspects of the invention, reference may be had to preferred embodiments shown in the attached drawings in which:
Turning now to
With reference to
As will be understood by those skilled in the art, some or all of the memories 202, 204, 206 may include executable instructions (collectively, the program memory) that are intended to be executed by the processor 200 to control the operation of the remote control 100, as well as data which serves to define to the operational software the necessary control protocols and command values for use in transmitting command signals to controllable appliances (collectively, the command data). In this manner, the processor 200 may be programmed to control the various electronic components within the remote control 100, e.g., to monitor the key matrix 216, to cause the transmission of signals, etc. The non-volatile read/write memory 206, for example an EEPROM, battery-backed RAM, FLASH, Smart Card, memory stick, or the like, may additionally be provided to store setup data and parameters as necessary. While the memory 204 is illustrated and described as a ROM memory, memory 204 may also comprise any type of readable media, such as ROM, FLASH, EEPROM, or the like. Preferably, the memories 204 and 206 are non-volatile or battery-backed such that data is not required to be reloaded after battery changes. In addition, the memories 202, 204 and 206 may take the form of a chip, a hard disk, a magnetic disk, an optical disk, and/or the like. Still further, it will be appreciated that some or all of the illustrated memory devices may be physically combined (for example, a single FLASH memory may be logically partitioned into different portions to support the functionality of memories 204 and 206 respectively), and/or may be physically incorporated within the same IC chip as the microprocessor 200 (a so-called “microcontroller”) and, as such, they are shown separately in
To cause the controlling device 100 to perform an action, the controlling device 100 may be adapted to be responsive to events, such as a sensed user interaction with the key matrix 216, touchpad 218, etc. In response to an event, appropriate instructions within the program memory (hereafter the “operating program”) may be executed. For example, when a function key is actuated on the controlling device 100, the controlling device 100 may retrieve from the command data stored in memory 202, 204, 206 a command value and control protocol corresponding to the actuated function key and, where necessary, current device mode, and will use the retrieved command data to transmit to an intended target appliance, e.g., STB 104, a command in a format recognizable by that appliance to thereby control one or more functional operations of that appliance. It will be appreciated that the operating program can be used not only to cause the transmission of commands and/or data to the appliances, but also to perform local operations. While not limiting, local operations that may be performed by the controlling device 100 may include displaying information/data, favorite channel setup, macro key setup, function key relocation, etc. Examples of local operations can be found in U.S. Pat. Nos. 5,481,256, 5,959,751, and 6,014,092.
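By way of illustration only, the event-to-command dispatch just described may be sketched as follows. The key names, device modes, protocol identifiers, and command values below are hypothetical and are not drawn from any actual command data library:

```python
# Illustrative sketch (not actual remote control firmware): dispatching a key
# event to a command lookup keyed by (function key, current device mode).
# All names, protocols, and command values here are hypothetical.

COMMAND_DATA = {
    # (key, device_mode) -> (protocol, command_value)
    ("power", "TV"): ("NEC", 0x08),
    ("power", "STB"): ("RC5", 0x0C),
    ("volume_up", "TV"): ("NEC", 0x10),
}

def handle_key_event(key, device_mode, transmit):
    """Retrieve the command value and control protocol corresponding to the
    actuated function key and current device mode, then hand the result to
    the supplied transmitter function."""
    entry = COMMAND_DATA.get((key, device_mode))
    if entry is None:
        return False  # no command mapped for this key in this mode
    protocol, value = entry
    transmit(protocol, value)
    return True
```

In use, `transmit` would wrap the IR or RF output stage; here it is left as a callable so the lookup logic can be exercised in isolation.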
In some embodiments, controlling device 100 may be of the universal type, that is, provisioned with a library comprising a multiplicity of command codes and protocols suitable for controlling various appliances. In such cases, for selecting sets of command data to be associated with the specific appliances to be controlled (hereafter referred to as a setup procedure), data may be entered into the controlling device 100 that serves to identify each intended target appliance by its make, and/or model, and/or type. The data may typically be entered via activation of those keys that are also used to cause the transmission of commands to an appliance, preferably the keys that are labeled with numerals. Such data allows the controlling device 100 to identify the appropriate command data set within the library of command data that is to be used to transmit recognizable commands in formats appropriate for such identified appliances. The library of command data may represent a plurality of controllable appliances of different types and manufacture, a plurality of controllable appliances of the same type but different manufacture, a plurality of appliances of the same manufacture but different type or model, etc., or any combination thereof as appropriate for a given embodiment. In conventional practice as is well known in the art, such data used to identify an appropriate command data set may take the form of a numeric setup code (obtained, for example, from a printed list of manufacturer names and/or models with corresponding code numbers, from a support Web site, etc.). Alternative setup procedures known in the art include scanning bar codes, sequentially transmitting a predetermined command in different formats until a target appliance response is detected, interaction with a Web site culminating in downloading of command data and/or setup codes to the controlling device, etc.
Since such methods for setting up a controlling device to command the operation of specific home appliances are well-known, these will not be described in greater detail herein. Nevertheless, for additional information pertaining to setup procedures, the reader may turn, for example, to U.S. Pat. Nos. 4,959,810, 5,614,906, or 6,225,938 all of like assignee and incorporated herein by reference in their entirety.
In keeping with the teachings of this invention, controlling device 100 may include input means for accepting user touch input to be translated into navigation commands. In an exemplary embodiment, input means 218 may take the form of a multiple-electrode capacitive touch sensor. In this form, input means 218 may accept finger sliding gestures on either axis for translation into navigation step commands in an X or Y direction, as well as finger pressure at, for example, the cardinal points and center area for translation into discrete commands, for example equivalent to a conventional keypad’s four arrow keys and a select key, all as will be described in further detail hereafter.
Turning to
In a first input mode, a user may slide a finger across the surface of the touch surface, e.g., keycap 304, to cause navigation command output, for example as described in co-pending U.S. Pat. Application 12/552,761, of like assignee and incorporated herein by reference in its entirety. Such navigation step commands resulting from finger sliding gestures may be reported to a target appliance using any convenient transmission protocol, IR or RF, as known in the art. In general, such reports may include information representative of both direction and speed of the input gesture. Since exemplary gesture interpretation and reporting techniques are presented in the above referenced ‘761 application, for the sake of brevity these will not be repeated herein.
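Although the gesture interpretation of the ‘761 application is not reproduced here, the general idea of reporting both direction and speed of a slide may be sketched as follows. The coordinate convention (Y increasing toward the top of the pad), the report format, and the unit of speed are all assumptions for illustration:

```python
# Illustrative interpretation of a finger slide into a navigation report
# carrying direction and speed. The report format and conventions are
# assumptions, not taken from the referenced '761 application. Y is assumed
# to increase toward the top of the touch pad.

def gesture_to_report(dx, dy, dt_ms):
    """dx, dy: net finger travel in sensor units; dt_ms: gesture duration
    in milliseconds. The dominant axis of travel determines the direction."""
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
        distance = abs(dx)
    else:
        direction = "up" if dy > 0 else "down"
        distance = abs(dy)
    speed = distance / max(dt_ms, 1)  # sensor units per millisecond
    return {"direction": direction, "speed": speed}
```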
In a second input mode, which may be used in conjunction with or separately from finger slide input, a user may press downwards 322 anywhere upon the touch surface, e.g., acrylic keycap 304. As illustrated, this will result in compression of one or more of the underlying silicone rubber buttons 310 through 313, for example button 310′ as shown in
By way of further example, if conventional keypress decoding based only on the status of silicone rubber buttons 310 through 313 were to be employed in this example and user finger pressure was applied at location 324, it will be appreciated that the circuits associated with either or both of buttons 310 and 313 may be completed individually or collectively in either order and within a short time of one another, which may lead to uncertainty as to the exact location of the actuating finger. Likewise, considering for a moment an alternate embodiment in which the silicone buttons are dispensed with and the touch input pad is fixedly mounted in the controlling device casing, the decoding function of the controlling device operating program may in this instance be required to distinguish between a finger tap action and the commencement or termination of a finger slide action. Accordingly, it will be appreciated that in the exemplary embodiment presented, finger press detection and finger position detection are advantageously performed separately in the manner described above, which may result in a more robust and reliable overall detection mechanism. Further, the provision of keypad elements as part of such a floating touch sensor may also result in improved user tactile feedback.
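The two-part decode described above, in which the underlying switches report only that a press occurred while the touch sensor supplies the finger position, may be sketched as follows. The function and data names are hypothetical; note that which particular switch closed, or how many, is deliberately ignored when locating the finger:

```python
# Sketch of separate press detection and position detection: the switches
# under the floating touch panel indicate only that the panel was depressed,
# while the touch sensor supplies the finger's X,Y coordinates. Names and
# the returned event format are hypothetical.

def decode_press(switch_states, touch_xy):
    """switch_states: iterable of booleans, one per underlying switch.
    touch_xy: (x, y) from the touch sensor, or None if no finger is sensed.
    Returns a press event located by the sensor, or None."""
    if not any(switch_states):
        return None  # panel not depressed: no keypress event
    if touch_xy is None:
        return None  # depressed with no sensed position: treat as spurious
    # Which switch(es) closed is irrelevant to locating the finger.
    return {"event": "press", "x": touch_xy[0], "y": touch_xy[1]}
```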
Certain embodiments of controlling device 100 may support multiple modes of operation of touch input area 106. By way of example, with reference to
Turning now to
By way of more detailed example, the flowchart of
If the actuated key is not the “1-2-3” button, at step 610 the operating program of controlling device 100 may next determine if the actuated key is one of the group 310 through 313 associated with touch sensor assembly 302, 304. If not, the key input may represent a conventional button, for example “volume up” 406, and may be processed at step 612. Since such conventional key decoding and command output are well known in the art, for the sake of brevity this aspect of controlling device 100 and associated operating program will not be discussed further herein.
If however, the operating program of controlling device 100 determines that the actuated key is one or more of the group 310 through 313, at step 614 the “X” and “Y” coordinates of the user’s actuating finger position may be ascertained from touch sensor 302. Next, in order to establish the interpretation to be applied to these values, at step 616 the operating program of controlling device 100 may determine if touch pad input is currently to be interpreted as digit entry or as navigation entry. If navigation entry is the current operational mode, then at step 618 the reported X,Y coordinates may be interpreted according to a five zone model 506 illustrated in
For example, with reference to the bottom row of Table 1, i.e., when reported Y coordinate is in the range 0 through 4:
As will be evident from an examination of Table 1, similar algorithms may be symmetrically applied to the other possible ranges of X and Y to resolve these values as locations within the five zone pattern 506 of
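One possible realization of the five zone resolution is sketched below. Since Table 1 is not reproduced here, the band boundaries are assumptions: reported coordinates are taken to range 0 through 12 on each axis, with the bottom band corresponding to Y values 0 through 4 as in the example above, and Y = 0 taken as the bottom edge of the pad:

```python
# Hypothetical five-zone decode: reported X and Y (assumed 0-12 on each
# axis, Y = 0 at the bottom edge) resolve to one of "up", "down", "left",
# "right", or "select". The band boundaries are assumptions, since Table 1
# is not reproduced here.

def five_zone(x, y):
    def band(v):
        # Low, middle, or high band of the axis (0, 1, or 2).
        if v <= 4:
            return 0
        if v <= 8:
            return 1
        return 2

    bx, by = band(x), band(y)
    if bx == 1 and by == 1:
        return "select"  # center zone
    # Off-center: the axis with the larger deflection from center wins.
    if abs(x - 6) >= abs(y - 6):
        return "right" if bx == 2 else "left"
    return "up" if by == 2 else "down"
```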
If however, the operating program of controlling device 100 determines at step 616 that digit, i.e., numeric key, entry is the current operational mode, then at step 620 the reported X,Y coordinates may be interpreted according to the twelve zone model 520 illustrated in
After determining the requested appliance command function in the manner described above, at step 622 the operating program of controlling device 100 may transmit the indicated command to the target appliance. In certain embodiments, actuation of the numeric “Enter” key 408 may be defined to also cause controlling device 100 to exit the digit entry mode. In such embodiments, at step 624 it may be determined if the command just issued was “Enter” in which case processing continues at step 608 in order to clear the digit entry mode status, whereafter processing of the key matrix input is complete.
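The mode handling around steps 616 through 624 may be sketched as a small state holder in which transmitting the “Enter” command also clears the digit entry mode. The class and method names are hypothetical, and the toggle is assumed to correspond to the “1-2-3” button behavior described earlier:

```python
# Hypothetical sketch of digit-entry mode handling: a flag tracks whether
# touch pad input is interpreted as digit entry, and transmitting "Enter"
# while in that mode also clears the flag, reverting the pad to navigation
# interpretation. Names are invented for illustration.

class TouchPadMode:
    def __init__(self):
        self.digit_entry = False

    def toggle_digit_entry(self):
        # Assumed to model the "1-2-3" button toggling the entry mode.
        self.digit_entry = not self.digit_entry

    def transmit(self, command, send):
        """Send the command, then exit digit entry mode if it was 'enter'."""
        send(command)
        if self.digit_entry and command == "enter":
            self.digit_entry = False  # Enter also exits digit entry mode
```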
Turning now to
While various concepts have been described in detail, it will be appreciated by those skilled in the art that various modifications and alternatives to those concepts could be developed in light of the overall teachings of the disclosure. For example, while the exemplary embodiment presented above utilizes a silicone rubber keypad as an actuation element for the floating touch sensor, it will be appreciated that various other mechanisms such as metallic dome switches, micro switches, flexible leaf contacts, etc. may be successfully utilized in other embodiments.
Further, while described in the context of functional modules and illustrated using block diagram format, it is to be understood that, unless otherwise stated to the contrary, one or more of the described functions and/or features may be integrated in a single physical device and/or a software module, or one or more functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary for an enabling understanding of the invention. Rather, the actual implementation of such modules would be well within the routine skill of an engineer, given the disclosure herein of the attributes, functionality, and inter-relationship of the various functional modules in the system. Therefore, a person skilled in the art, applying ordinary skill, will be able to practice the invention set forth in the claims without undue experimentation. It will be additionally appreciated that the particular concepts disclosed are meant to be illustrative only and not limiting as to the scope of the invention which is to be given the full breadth of the appended claims and any equivalents thereof.
All publications cited within this document are hereby incorporated by reference in their entirety.
Claims
1. A method for remotely controlling one or more devices and/or a user interface, the method comprising:
- in response to a discrete user input event being detected at a user input element of a remote control: determining a location on a surface of the user input element of the remote control at which the discrete user input event was detected; determining whether the discrete user input event is a click event or a touch event; selecting from a library of control commands stored in a memory of the remote control a control command based on whether the discrete user input event is a click event or a touch event and based only on the location on the surface of the user input element at which the discrete user input event was detected; and causing the control command to be executed; wherein a threshold associated with a depression of the user input element is used to determine whether the discrete user input event is a click event or a touch event.
2. The method as recited in claim 1, wherein determining the location on the surface of the user input element of the remote control at which the discrete user input event was detected comprises determining an X/Y location of a touch upon the touch sensing device.
3. The method as recited in claim 1, wherein the threshold associated with the depression comprises a threshold associated with a depression of a metallic dome underlying the user input element.
4. The method as recited in claim 1, wherein the control command comprises a graphical user interface navigational control command when the discrete user input event is determined to be a touch event.
5. The method as recited in claim 1, comprising causing indicia to be displayed on a display device for indicating to a user a plurality of control commands transmittable from the remote control via user interaction with the input element.
6. The method as recited in claim 1, further comprising enabling a user to selectively map different control commands from the library of control commands to different user input events receivable via the user input element.
7. The method as recited in claim 1, further comprising enabling a user to selectively map different control commands from the library of control commands to different locations on the surface of the user input element.
8. The method as recited in claim 1, wherein control commands transmitted from the remote control are executable through interaction with the graphical user interface when displayed on a display screen.
9. The method as recited in claim 1, wherein the control command is a command for remotely controlling a controlled device and the method further comprising executing the control command at the controlled device.
10. The method as recited in claim 1, wherein a unique control command is mapped to each of plural locations on the surface of the user input element for each of at least a click input event and a touch input event.
11. A remote control system for remotely controlling one or more devices and/or a user interface, the remote control system comprising:
- a plurality of user input elements, each of the user input elements configured to receive a user input event;
- a plurality of sensors associated with a one of the plurality of user input elements, the sensors being configured to generate sensor data in response to a discrete user input event being received at the one of the plurality of user input elements;
- user input event detection logic configured to receive the sensor data and identify whether the discrete user input event received at the one of the plurality of user input elements was a click event or a touch event, or another user input event and to identify a location on a surface of the one of the plurality of user input elements at which the click event or the touch event was received; and
- command selection logic configured to use only the location on the surface of the one of the plurality of user input elements at which the click event or the touch event was received to select a first control command from a library of control commands stored in a memory of the remote control to be executed in response to determining that the user input event received at the one of the plurality of user input elements was a click event and to select from the library of control commands a second control command to be executed in response to determining that the user input event received at the one of the plurality of user input elements was a touch event;
- wherein a threshold associated with a depression of the user input element is used to determine whether the discrete user input event is a click event or a touch event.
12. The remote control system as recited in claim 11, wherein the one of the plurality of user input elements comprises a touch sensing device and wherein the user input event detection logic identifies the location on the surface of the one of the plurality of user input elements at which the click event or the touch event was received by determining an X/Y location of a touch upon the touch sensing device.
13. The remote control system as recited in claim 11, wherein the threshold associated with the depression comprises a threshold associated with a depression of a metallic dome underlying the one of the plurality of user input elements.
14. The remote control system as recited in claim 11, wherein the second control command comprises a graphical user interface navigational control command.
15. The remote control system as recited in claim 11, wherein control commands transmitted from the remote control system are executable through interaction with the graphical user interface when displayed on a display screen.
16. The remote control system as recited in claim 11, wherein the first and/or second control command is a command for remotely controlling a controlled device and the first and/or second control command is executed at the controlled device.
17. The remote control system as recited in claim 11, wherein a unique control command is mapped to each of a plurality of locations on the surface of the one of the plurality of user input elements for each of at least a click input event and a touch input event.
18. The remote control system as recited in claim 12, further comprising remote control user customization logic, the remote control user customization logic being configured to enable a user to selectively map different control commands from the library of control commands to different locations on the surface of the one of the plurality of user input elements.
19. A remote control system for remotely controlling one or more devices and/or a user interface, the remote control system comprising:
- a remote control comprising: a plurality of user input buttons, at least one of the user input buttons configured to receive a user input event and comprising at least one metal dome and a printed circuit board; a plurality of sensors being coupled to the at least one of the user input buttons of the plurality of user input buttons, the plurality of sensors configured to generate sensor data in response to a user input event being received at the at least one of the user input buttons of the plurality of user input buttons; a memory storing a plurality of command values; and user input event detection logic configured to receive the sensor data and to immediately identify whether the user input event received at the at least one of the user input buttons of the plurality of user input buttons was a click event or a touch event, wherein the user input event detection logic identifies that the user input event is the click event based on receiving sensor data indicating that the at least one metal dome is depressed such that it forms an electrical connection on the printed circuit board; and command selection logic configured to include a first one of the plurality of command values in a first control command transmission in immediate response to the user input event detection logic determining that the user input event received at the at least one of the user input buttons of the plurality of user input buttons was the click event and to include a second one of the plurality of command values in a second control command transmission in immediate response to the user input event detection logic determining that the user input event received at the at least one of the user input buttons of the plurality of user input buttons was the touch event.
20. The remote control system of claim 19, wherein the command selection logic comprises part of the remote control.
21. The remote control system of claim 19, further comprising device control command execution logic, wherein the first and/or second control command transmission is a command for remotely controlling a controlled device, and wherein the device control command execution logic causes the first and/or second control command to be executed at the controlled device.
22. The remote control system of claim 19, wherein the at least one of the user input buttons of the plurality of user input buttons is a click pad having a plurality of sensors coupled thereto at corresponding sensor positions of the click pad, the click pad configured to receive a user input event at each of the sensor positions.
23. The remote control system of claim 22, wherein a different one of the plurality of command values is mapped to each of the sensor positions of the click pad for each of at least a click input event and a touch input event.
24. The remote control system of claim 19, further comprising remote control user customization logic, the remote control user customization logic being configured to enable the user to selectively map different ones of the plurality of command values to different user input events.
25. The remote control system of claim 19, wherein the first one of the plurality of command values included in the first control command transmission is further based on a type for a controllable appliance that is intended to receive the first control command transmission.
26. The remote control system of claim 19, wherein the first one of the plurality of command values included in the first control command transmission is further based on a brand of a controllable appliance that is intended to receive the first control command transmission.
27. The remote control system of claim 19, wherein the second one of the plurality of command values included in the second control command transmission is further based on a type for a controllable appliance that is intended to receive the second control command transmission.
28. The remote control system of claim 19, wherein the second one of the plurality of command values included in the second control command transmission is further based on a brand of a controllable appliance that is intended to receive the second control command transmission.
Type: Application
Filed: Sep 29, 2022
Publication Date: Jun 22, 2023
Inventors: Arsham Hatambeiki (Scottsdale, AZ), Jeffrey Kohanek (Westminster, CA), Pamela Eichler Keiles (Long Beach, CA)
Application Number: 17/955,756