EVENT DISPOSITION CONTROL FOR MOBILE COMMUNICATIONS DEVICE
A method, system, and medium are provided for responding to an event by way of a mobile communications device. One embodiment of the method includes receiving at the mobile communications device an indication of an occurrence of an event; presenting an informational element (e.g., a slideable graphical-user-interface control) on a display of the device that presents contextually relevant data that is related to the event; and receiving user input that disposes the informational element to one of at least two drop zones that are defined by respective portions of the display and that are respectively associated with certain actions such that moving the informational element to one of the at least two drop zones invokes a certain response that is consistent with the corresponding drop zone.
This Application claims the benefit of, and expressly incorporates by reference, U.S. Provisional Application No. 61/040,149, filed on Mar. 28, 2008.
The following five applications are related by subject matter, one of which is the instant application, and the other four are hereby expressly incorporated by reference herein: 1) “EVENT DISPOSITION CONTROL FOR MOBILE COMMUNICATIONS DEVICE” having attorney docket number 5534/SPRI.140035; 2) “PERSISTENT EVENT-MANAGEMENT ACCESS IN A MOBILE COMMUNICATIONS DEVICE” having attorney docket number 5535/SPRI.140036; 3) “LIST-POSITION LOCATOR” having attorney docket number 5537/SPRI.140038; 4) “CORRECTING DATA INPUTTED INTO A MOBILE COMMUNICATIONS DEVICE” having attorney docket number 5538/SPRI.140039; 5) “PHYSICAL FEEDBACK TO INDICATE OBJECT DIRECTIONAL SLIDE” having attorney docket number 5536/SPRI.140037.
SUMMARY

The present invention is defined by the claims below, not this summary. We offer a high-level overview of embodiments of the invention here for that reason: to provide an introductory overview of the disclosure that follows.
In one aspect, a method is provided for responding to an event by way of a mobile communications device. The method includes, incident to an occurrence of the event, presenting an informational element on a display of the mobile communications device. The informational element might take the form of a slideable control that can be manipulated by touch actions. The method also includes presenting descriptive information in the informational element that is contextually related to the event, and providing at least two drop zones that each consume an area of the display. The first drop zone is associated with a first action, and the second drop zone is associated with a second action. If the informational element is moved to the first drop zone, then the first action occurs. If the informational element is moved to the second drop zone, then the second action occurs. If the informational element is not moved to either the first or second drop zone, then a default action is allowed to occur that is different from the first and second actions.
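By way of illustration only, the following sketch (written in Kotlin, using hypothetical names such as DropZone and EventDisposition that are not part of any actual device software) shows one way the dispatch just described could be expressed: each drop zone maps to its own action, and a release outside both zones falls through to the default action.

    enum class DropZone { FIRST, SECOND, NONE }

    class EventDisposition(
        private val firstAction: () -> Unit,   // e.g., answer an incoming call
        private val secondAction: () -> Unit,  // e.g., dismiss the call immediately
        private val defaultAction: () -> Unit  // e.g., let the ringing time out
    ) {
        // Invoked when the user releases the informational element.
        fun onRelease(zone: DropZone) = when (zone) {
            DropZone.FIRST -> firstAction()    // dropped in the first drop zone
            DropZone.SECOND -> secondAction()  // dropped in the second drop zone
            DropZone.NONE -> defaultAction()   // not moved to either drop zone
        }
    }

    fun main() {
        val incomingCall = EventDisposition(
            firstAction = { println("Answering call") },
            secondAction = { println("Sending call to voicemail") },
            defaultAction = { println("Allowing the ring to time out") }
        )
        incomingCall.onRelease(DropZone.FIRST) // prints "Answering call"
    }

The example actions are assumptions chosen to mirror the incoming-call scenario discussed below; any event type could be wired to the same dispatch.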
In a second aspect, a method of responding to an event by way of a mobile communications device includes receiving at the mobile communications device an indication of an occurrence of an event; presenting an informational element (e.g., a slideable graphical-user-interface control) on a display of the device that presents contextually relevant data that is related to the event; and receiving user input that disposes the informational element to one of at least two drop zones that are defined by respective portions of the display and that are respectively associated with certain actions such that moving the informational element to one of the at least two drop zones invokes a certain response that is consistent with the corresponding drop zone.
In another aspect, a mobile communications device includes a touchscreen display that presents a graphical user interface that includes at least two drop zones, which consume opposite portions of the display. The mobile communications device is configured to initiate a corresponding action incident to any of the drop zones sufficiently receiving an informational element. A processor is also included that helps facilitate a presentation of the informational element, which is presented based on an occurrence of an event, and which includes descriptive information that presents data associated with the event such that the corresponding action can be initiated in response to a drop zone sufficiently receiving the informational element.
Illustrative embodiments of the present invention are described in detail below with reference to the attached drawing figures, which are incorporated by reference herein and wherein:
Throughout the description of the present invention, several acronyms and shorthand notations are used to aid the understanding of certain concepts pertaining to the associated system and services. These acronyms and shorthand notations are intended for the purpose of providing an easy methodology of communicating the ideas expressed herein and are not meant to limit the scope of the present invention.
Further, various technical terms are used throughout this description. An illustrative resource that fleshes out various aspects of these terms can be found in Newton's Telecom Dictionary by H. Newton, 24th Edition (2008).
Embodiments of the present invention may be embodied as, among other things: a method, system, or computer-program product. Accordingly, the embodiments may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware. In one embodiment, the present invention takes the form of a computer-program product that includes computer-useable instructions embodied on one or more computer-readable media.
Computer-readable media include both volatile and nonvolatile media, removable and nonremovable media, and contemplate media readable by a database, a switch, and various other network devices. By way of example, and not limitation, computer-readable media comprise media implemented in any method or technology for storing information. Examples of stored information include computer-useable instructions, data structures, program modules, and other data representations. Media examples include, but are not limited to, information-delivery media, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD), holographic media or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage, and other magnetic storage devices. These technologies can store data momentarily, temporarily, or permanently.
Storage components 112 may take the form of the aforementioned computer-readable media. As with all of the illustrative components of
Processors 116 facilitate a flow of information among all or a portion of the components shown in
Touchscreen display 118 provides one avenue of inputting data into device 100. In one embodiment, touchscreen display 118 takes the form of a resistive touch screen, but in some embodiments, it might be capacitive. Touchscreen display 118 receives input by way of touch actions that cause an object to come in contact with touchscreen display 118. An illustrative example includes a user utilizing his or her finger to tap or use some other form of touch action to interact with mobile device 100. Other items such as a stylus, fingernail, etc. may be used to provide input to mobile device 100 by way of touchscreen display 118. Other illustrative touch actions include a sliding motion as well as multipoint touches.
Radios 120 facilitate the communication of wireless communication signals to and from mobile device 100. Illustrative protocols that can be utilized in connection with an embodiment of the present invention include CDMA, TDMA, GSM, GPRS, EVDO, etc. The radios facilitate wireless communications between the device and a national or even global telecommunications network.
Input/output ports 122 provide a way for mobile device 100 to interact with other peripheral components. Illustrative input/output ports include an ear-piece or headphone jack, a USB port, an infrared port, and the like. Different input/output ports could be provided as is needed to facilitate communication of other peripheral components.
Vibrating component 124 enables mobile device 100 to experience a vibrating action incident to an occurrence of different events. Vibrating component 124 may take on a variety of forms, such as a motor that operates with an offset mass. In one embodiment, vibrating component 124 takes the form of a haptics motor. Vibrating component 124 includes the ability to operate at various frequencies, which can be controlled by way of different software or hardware mechanisms of mobile device 100. That is, instead of mere playback of a vibrating action, vibrating component 124 can respond in real time to a varying stimulus.
Power supply 126 may also take on a variety of forms ranging from a battery to a charging mechanism to other forms of power sources that serve to provide power to mobile device 100.
The selected components of mobile device 100 are meant to be illustrative in nature, and the various lower-level details of the components are not elaborated on so as to not obscure the present invention. Clearly, some of the components may be absent in some embodiments of the present invention, and additional components not shown may also be part of mobile device 100. Attempting to show all of the various components of mobile device 100 would obscure certain novel aspects, and we will refrain from such elaboration at least for the sake of brevity.
Persistent Event-Management Access in a Mobile Communications Device

An aspect of an embodiment of the present invention includes an ability to view notification of new events with minimal effort by way of a mobile communications device. In one embodiment, a way to provide notification of new events (such as receiving a voicemail, instant message, etc.) is to provide a dynamically updateable list of items that indicates a change in status of an event, wherein the list is viewable by touching a predetermined region on a touchscreen device that is part of the mobile communications device.
Turning now to
Touchscreen display 212 also includes a predetermined region 214 that includes one or more dynamic icons 216 which indicate a state associated with the mobile communications device 210. As used herein, a state associated with the mobile communications device 210 may include any configuration of settings, features, event notifications, etc. associated with the functionality of the device 210. For example, in one embodiment, dynamic icons 216 may indicate that, as part of the state of the device 210, the battery power is at 100%, the radio signal is at a certain strength, the time of day, or any number of other state variables known in the art. We do not mean to limit the nature of the dynamic icons displayed in the predetermined region 214 herein, and it will be appreciated by those skilled in the art that any number of icons can be included to provide information of any kind.
In some embodiments, the predetermined region may be provided without including icons, and in other embodiments, the predetermined region may be provided in any area of the touchscreen display 212. For clarity and consistency, we illustrate predetermined region 214 as being provided at the top of touchscreen display 212 but do not mean to limit the arrangement to that illustrated herein. In an embodiment of the present invention, predetermined region 214 may be persistently presented. That is, regardless of what is being displayed on the touchscreen display 212, the predetermined region 214 is always displayed. Additionally, regardless of any type of application that is running, or any other type of functionality that is being presented on display 212, the predetermined region 214 may be visible. In various embodiments, predetermined region 214 is persistently visible, even when transitioning between a first and second screen displayed on the touchscreen display 212. It should be evident that the functionality associated with the display of predetermined region 214 is generally independent of many other aspects of the mobile communications device 210.
In further embodiments, exceptions may be made for the persistent display of predetermined region 214 such as, for example, when an application is being utilized that requires all of the screen real estate of display 212. Such an application may correspond to a text input mode presented in landscape orientation or portrait orientation, an audio/video presentation, and the like. Various modifications of such an exception may be made, including having no exception, and any one or combination of such configurations is intended to be within the ambit of the present invention.
Predetermined region 214 may also include a dynamic icon 218 that indicates whether a new event has occurred. As used herein, a new event may include things such as receiving a phone call, message (e.g., a voicemail, an email, a text message, and the like), initiating an alarm, receiving a file (e.g., a picture, an audio file, a video file, and the like), or arriving at a time associated with a calendared event, which may be an event that, upon occurrence, triggers an alarm, notification, or reminder to be presented. In one embodiment, dynamic icon 218 may be a particular image corresponding to a particular event. For example, dynamic icon 218 may appear as a phone handset when a new call is received.
In another embodiment, as shown in
In an embodiment of the present invention, predetermined region 214 is capable of receiving touch input. It will be appreciated by those skilled in the art that touchscreen display 212 may include numerous areas or regions (including the entire touchscreen display 212) that are capable of receiving touch input, and that the functionality provided by touch input at a particular location of touchscreen display 212 may vary, depending on particular applications, underlying operating systems, and the like. In an embodiment, predetermined region 214 may be configured in such a way that touch input to the predetermined region 214 always produces the same type of functionality. For example, as described further below, touch input to predetermined region 214 may always result in the presentation of a list of items.
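As a nonlimiting sketch of this always-the-same behavior (again in Kotlin, with hypothetical names; an actual device would route such calls through its own UI toolkit), a simple hit test could gate every touch against the bounds of the predetermined region and invoke the same callback whenever the touch lands inside it:

    data class Touch(val x: Int, val y: Int)

    class PredeterminedRegion(
        private val left: Int, private val top: Int,
        private val right: Int, private val bottom: Int,
        private val showStatusListing: () -> Unit  // the single behavior the region produces
    ) {
        private fun contains(t: Touch) = t.x in left..right && t.y in top..bottom

        // Regardless of which application or screen is active, a touch inside
        // the region always produces the same result.
        fun onTouch(t: Touch): Boolean {
            if (!contains(t)) return false
            showStatusListing()
            return true
        }
    }

The bounds and the showStatusListing callback are assumptions of this sketch, not features of any particular operating system.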
Turning now to
As shown in
In an embodiment, the presentation displayed incident to receiving touch input in the predetermined region 214 may be superimposed over the top of whatever is being displayed or interacted with on the touchscreen display 212 such that the presentation does not interfere with anything associated with that which is being displayed on display 212. Also, the presentation may be displayed without having to first (or concurrently) transition away from the screen being displayed previous to receiving touch input to predetermined region 214.
For example, a contacts menu may be displayed on touchscreen display 212 before touch input is received in the predetermined region 214. A user may be interacting with the contacts menu such as by creating a new contact, selecting an existing contact, editing an entry and the like, when the user decides to provide touch input to the predetermined region 214. In an embodiment, upon receiving that touch input, the listing 222 may be superimposed over the top of the contacts menu without changing the state of any functionality or display characteristics associated with the contacts menu. Thus, when the user causes a further touch input to the predetermined region, in an embodiment, the list 222 may be removed from the display, revealing the contacts menu, which may be displayed in exactly the same state that it was before the user interacted with the predetermined region 214. This way, new events or other information may be viewed without interrupting the functionality of a current application, and without having to exit or transition from a current screen to another.
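One minimal way to model this superimposition, assuming a hypothetical Screen type whose state must survive untouched, is sketched below; the point is simply that showing and hiding the listing never mutates the underlying screen.

    class Screen(val name: String, var scrollPosition: Int = 0)

    class OverlayManager {
        private var listingVisible = false

        // Each touch to the predetermined region toggles the overlay. The
        // underlying screen object is never modified, so its state (scroll
        // position, a partially created contact, etc.) survives intact.
        fun onPredeterminedRegionTouch(current: Screen, listing: List<String>) {
            listingVisible = !listingVisible
            if (listingVisible) {
                println("Superimposing over ${current.name}: $listing")
            } else {
                println("Revealing ${current.name} at scroll ${current.scrollPosition}")
            }
        }
    }
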
Additionally, touch input by way of predetermined region 214 may result in a presentation of information 224 associated with one or more of the dynamic icons 216. Although both the list 222 and the information 224 are illustrated in
Turning to
Moreover, the contents of the information 224 may be somewhat static, completely static, somewhat dynamic, or completely dynamic. For instance, in an embodiment, the information 224 may always contain images of icons and text describing what the icons represent. In other embodiments, the icons and associated text provided may vary depending on which icons are presented in the predetermined region. In another embodiment, the information 224 may present a duplicate icon along with text explaining the status of the functionality represented by the icon, which may vary depending on the state of the device 210. In still a further embodiment, text may be provided along with a check box that, when checked, indicates that the subject matter of the text is relevant to the current state of the device 210. Any number of combinations of the above examples, including other possibilities and configurations, are possible and are intended to be within the ambit of the present invention.
With continued reference to
That is, in one case a user may be interacting with a screen that lists text messages received. While interacting with or viewing that screen, a new text message may be received. In this case, the user may be notified of the receipt of the new text message by seeing an indication on the screen with which the user is interacting. In one embodiment, this new message is not included as a new event in the listing 222. However, in a similar embodiment, if a new voicemail is received while the user is interacting within the text message screen, the new voicemail may be included in the listing 222 as a new event.
In other embodiments, an event may be included in the list 222 so long as the event has not been acted on by a user. That is, so long as a user does not view, reply to, interact with, or in some other way confirm receipt or notification of the event, the event may be included in the list 222. In still further embodiments, the list 222 may be adapted to be configured by a user such that the user can determine the conditions under which an event will be included in the list 222. Similarly, any number of possibilities exist for the type of status of an event that may be reflected in the listing 222 of statuses, as will be understood by the description above.
In an embodiment, the list may include any number of items 230, 232, 234, 236, 238, 240 that present statuses of events. The items 230, 232, 234, 236, 238, 240 may be dynamically updateable, in that their presentation may vary as new events occur, as statuses of events change, etc. In an embodiment, the list 222 only includes items 230, 232, 234, 236, 238, 240 that correspond to a particular status such as “new.” Thus, for example, if the device 210 has received (or has observed the occurrence of) only a single new event, the list 222 may present only that single item. In another embodiment, the list 222 may present a set of statuses for events wherein the set of events represented remains consistent, but the notification of the statuses changes. This feature may also be one that is inherent to the functionality of the device 210, or in other embodiments may be a feature that is configurable by a user or service provider.
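A compact model of such a dynamically updateable listing, under the assumption (one of several embodiments above) that only events not yet acted on are shown, might look like the following sketch; the event kinds and the grouping into count-style items are illustrative.

    enum class EventStatus { NEW, ACTED_ON }

    data class DeviceEvent(val kind: String, var status: EventStatus = EventStatus.NEW)

    class StatusListing {
        private val events = mutableListOf<DeviceEvent>()

        fun record(kind: String) { events.add(DeviceEvent(kind)) }          // a new event occurred
        fun markActedOn(event: DeviceEvent) { event.status = EventStatus.ACTED_ON }

        // Only events the user has not yet viewed, replied to, or otherwise
        // confirmed appear as items in the listing.
        fun items(): List<String> =
            events.filter { it.status == EventStatus.NEW }
                  .groupBy { it.kind }
                  .map { (kind, group) -> "${group.size} new $kind" }
    }
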
As shown in
In an embodiment, items 230, 232, 234, 236, 238, 240 are capable of being interacted with by touch input, which can include focus interactions or even motion gestures. That is, each of items 230, 232, 234, 236, 238, 240 may include a particular region or area of the touchscreen display 212 that is capable of receiving touch input. Upon receiving touch input to an item 230, 232, 234, 236, 238, 240, further information may be presented. That further information, in an embodiment, corresponds to the event or events associated with the item that received touch input. For example, incident to receiving touch input to the item 232 associated with receipt of a new email, further information such as the time, sender, subject, etc., of the email may be provided. In another embodiment, touch input to item 232 may result in the presentation of the text of the new email. In still further embodiments, touch input to item 232 may cause the presentation of an email inbox, listing all or a portion of received email messages. It will be appreciated that any number of the above, combinations of the above, or other configurations may be utilized in presenting further information incident to receiving touch input to an item.
To recapitulate, we have described an aspect of the invention that relates to performing a method of presenting on a user interface of a mobile communications device a persistently visible predetermined region capable of receiving touch input. With reference to
The first set of information may include any information associated with a feature, aspect, functionality, option, etc., of the mobile communications device. For example, the first screen may include a menu that presents information in the form of items that correspond to further menus, lists, applications, or other features. We do not mean to limit the scope of the first set of information as used herein, and recognize that the first set of information may be anything displayable on a display device of a mobile communications device, and may include text, objects, items, graphics, and the like.
A step 322 includes transitioning to a second screen that presents a second set of information while presenting the predetermined region. The second set of information, like the first set of information, may include anything displayable on a mobile communications device. Additionally, the predetermined region is persistently viewable during and after the transition. At a step 324, a touch input is received to the predetermined region. Incident to receiving the touch input, at a step 326, a listing of a set of statuses of events is presented. As indicated above, the listing may be presented without affecting the functionality of the second screen, and may be superimposed over the second set of information, such that a subsequent touch input may remove the listing, revealing the second set of information associated with the second screen.
With reference to
At a step 342, a persistently visible predetermined region capable of receiving touch input is presented across each of the screens such that the predetermined region is never unavailable when any one of the screens is presented. At a step 344, touch input is received to the predetermined region, and incident to receiving that touch input, at a step 346, a set of statuses of events is displayed on a display of a mobile communications device.
With reference to
In a final illustrative step, step 356, a dynamically updateable list is presented which includes at least one item that conveys a status of an event. If an event experiences a change from a first state to a second state, the change is reflected in the list. It should be understood, in light of the description above, that a change in status as described herein may include any number of status changes related to an event such as the receipt of a new event, an aging of a previous event by a certain duration, an updated event, an occurrence of a calendared event, and the like.
Correcting Data Inputted into a Mobile Communications Device
An aspect of an embodiment of the present invention includes an ability to input and correct data inputted into a mobile communications device using a touchscreen. In one embodiment, a way to input data (e.g., text, character strings, etc.) is to provide a modal keypad on a touchscreen display of a mobile communications device that is capable of receiving touch input. Character strings and other text inputted by way of the modal keypad are checked against recognized character strings, and an option for automatically correcting incorrectly inputted character strings is provided.
Turning to
Touchscreen display 412 includes a region 414 that contains various icons, indicators, and the like. In one embodiment, the region 414 is a persistently visible predetermined region capable of receiving touch input, as described above. In another embodiment, region 414 may be adapted to contain any number of icons or indicators of any kind. In a further embodiment, touchscreen display 412 does not contain a region 414.
Touchscreen display 412 is shown to display a modal keypad 416. As used herein, a modal keypad includes a set of objects that are displayed on a touchscreen display 412 of a mobile communications device 410, where each of the set of objects represents a letter, number, punctuation mark, or other character or set of characters. The modal keypad 416 may also include other objects representing buttons that have other functionality associated therewith such as, for example, a space bar, a return key, and a caps-lock button, as shown in row 418 of
Each of the objects displayed as part of the modal keypad 416 is capable of receiving touch input so that a user may interact with the objects in a conventional manner. In an embodiment, touch input to an object representing a character or characters causes that character or characters to appear as text on a viewing screen 422 provided on the touchscreen display 412. Viewing screen 422 displays characters 424 inputted by way of touch actions to the modal keypad 416 and may be oriented in any number of manners. Although viewing screen 422 is shown, in
With continued reference to
Turning briefly to
Additionally, whether the modal keypad is oriented as in
Returning to
For instance, in one embodiment, an application such as an input method application or predictive text application may compare the character string to recognized character strings as the user enters the character string, and the interpretation thereof may continuously change in pace with the user's entering of data, finally resting on a most likely interpretation when the user has completed entering data. For example, one such application that may be used or modified for this purpose is the XT9 Smart Input predictive text solution, available from Nuance Communications, Inc. of Burlington, Mass. In one embodiment, the continuous results of the accuracy may be displayed on the touchscreen display 412 as the user enters the character string. In another embodiment, the continuous results may not be displayed on the touchscreen display 412, and only the final interpretation may be displayed. In yet another embodiment, none of the interpretations or corrections may be displayed, in which case the exact characters that the user has entered will be displayed. It will be appreciated by those skilled in the art that the correction application may be configured in any other manner, so long as it is configured to compare, in some way, user-entered character strings to recognized character strings. Recognized character strings may include correctly spelled words, commonly used words (although incorrectly spelled), or any other type of word or character string that is included in the dictionary database. In some embodiments, as described further below, one or more of the recognized character strings contained in a dictionary database may be entered into the database by a user or other individual, application, program, or functionality.
In any of the embodiments described above, when a user-entered character string is received, the device 410 may determine one or more suggestions for replacing the user-entered character string. In an embodiment, suggestions for replacing the user-entered character string may be determined only in the case where the user-entered character string is not a recognized character string. In other embodiments, suggestions may be provided regardless of whether the user-entered character string is recognized. In still further embodiments, the determination of suggestions may be dependent upon any number of factors, which may be established by a programmer, an application, or even a user, for example. Suggestions may include other character strings containing similar characters, character arrangements, and the like, and may be further based upon other factors such as context, dialect, grammar, and the like. Any number of suggestions may be determined. In one embodiment, for example, one or two suggestions may be determined. In another embodiment, three or more suggestions may be determined. In various embodiments, the number of suggestions determined may also be based upon various factors such as, for example, those described above.
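By way of example only, replacement suggestions could be ranked by a simple edit-distance comparison against the recognized character strings in the dictionary database, as in the sketch below; commercial predictive-text engines such as the XT9 solution mentioned above use their own, more sophisticated methods, so this is merely one assumed approach.

    // Classic Levenshtein edit distance between two character strings.
    fun editDistance(a: String, b: String): Int {
        val d = Array(a.length + 1) { IntArray(b.length + 1) }
        for (i in 0..a.length) d[i][0] = i
        for (j in 0..b.length) d[0][j] = j
        for (i in 1..a.length) for (j in 1..b.length) {
            val cost = if (a[i - 1] == b[j - 1]) 0 else 1
            d[i][j] = minOf(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
        }
        return d[a.length][b.length]
    }

    // Suggest the closest recognized character strings; an already recognized
    // entry needs no suggestions (one of the embodiments described above).
    fun suggestionsFor(entered: String, dictionary: Set<String>, limit: Int = 3): List<String> =
        if (entered in dictionary) emptyList()
        else dictionary.sortedBy { editDistance(entered, it) }.take(limit)
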
In an embodiment of the present invention, after determining a suggestion for replacing a character string, the user-entered character string may be marked with a visual indication that a suggestion for replacement thereof has been determined. In an embodiment, as shown in
In another embodiment, an automatic-correction function is provided. As shown in
In an embodiment, the suggested character string 430, 432 that replaces the user-entered character string may be marked with a visual indication that the suggested character string 430, 432 has replaced the user-entered character string 426, 428. The visual indication may take any form, as described above with respect to the user-entered character string. In an embodiment, for example, as shown in
Additionally, in an embodiment, an input region 434 is provided that is associated with the visual indication, whether the visual indication corresponds to a user-entered character string 426, or a suggested character string 430. The input region 434 is capable of receiving touch input. Upon receiving touch input to the input region 434, which may be defined by any amount of space surrounding and/or including the character string having the visual indication, a set of alternative suggestions for replacing the character string may be provided. In an embodiment of the present invention, cursor 429 may be unaffected by user input to the input region 434. That is, even though a user may interact with the input region 434, the cursor 429 will remain in the position in which it was before the user interacted with input region 434, thus allowing for a user to perform editing and be able to rapidly return to inputting data.
Turning now to
It should be understood that, although we illustrate four alternative suggested character strings, any number of alternative suggested character strings may be provided. For example, in an embodiment, one, two, or three alternative suggested character strings may be provided. In a further embodiment, five or more alternative character strings may be provided.
In another embodiment, a button 484 capable of receiving touch input may be provided for allowing a user to select an active or inactive state corresponding to the automatic-correction function. In other words, a user, by interacting with a button 484, may be able to turn the automatic-correction function on or off. It will be appreciated that the automatic-correction function may also be toggled between the active and inactive states in any number of other ways. For example, in one embodiment, the option for setting the auto-correction function to a particular state may be provided in a menu such as, for example, a settings or options menu. In other embodiments, the option for selecting between active and inactive states for the automatic-correction function may be presented in a predetermined region of the touchscreen. In a further embodiment, for example, the option may be available as a series of keystrokes or by a particular touch input anywhere or in a certain region of the touchscreen display. It will be further appreciated by those skilled in the art that any number of additional buttons, options, and the like may be provided for allowing a user to perform any number of other functions by interacting therewith.
In an embodiment, each of the alternative suggested character strings 474, 476 has at least one character in common with the user-entered character string. In other embodiments, the alternative suggested character strings 474, 476 include the user-entered character string 470.
In another embodiment, the set of alternative suggested character strings 474, 476 includes a suggested character string that is used to automatically replace the user-entered character string, in which case the character string 470 may actually have been replaced by a character string such as suggested character string 476. Each of the alternative suggested character strings 474, 476 may also have an associated input region 475 capable of receiving touch input. In an embodiment, incident to receiving touch input to an alternative suggested character string 474, 476 by way of an associated input region 475, the user-entered character string 470 is replaced by the alternative suggested character string with which the input region is associated.
For example, as shown in
In an embodiment, the user interface presents a “learn” button 480 and an “unlearn” button 482, as shown in
In an embodiment of the present invention, upon receiving touch input to the “unlearn” button 482, a suggested character string may be removed from the dictionary database. For example, a user may spell a name incorrectly several times, causing the incorrect spelling to be automatically learned by device 410. The user may then, upon receiving that spelling as a suggested character-string, interact with an “unlearn” button 482, causing the incorrect spelling to be removed from the dictionary database. This removal may be permanent or temporary, and it will be appreciated that such a function may be accompanied by functionality that accomplishes the reverse, such as an “undo” functionality. In various embodiments, the options described above (i.e., adding character strings to a database and removing character strings from a database), may be accomplished by means other than presenting a “learn” and “unlearn” button as illustrated. In various embodiments, other types of objects may be presented on the screen for a user to interact with. In other embodiments, rules may be established such that one or both of the functionalities are invoked incident to some sequence of events such as, for example, the repetitious use of a particular character string. In still further embodiments, a user may interact directly with the character string as it is represented on the screen in order to add to or remove from a dictionary database.
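As a rough sketch of the “learn”/“unlearn” behavior, the in-memory set below merely stands in for whatever persistent store an actual device would use; the class and method names are hypothetical.

    class DictionaryDatabase(initial: Collection<String> = emptyList()) {
        private val recognized = initial.toMutableSet()

        fun isRecognized(s: String): Boolean = s in recognized

        // "Learn": add a character string so it is treated as recognized later.
        fun learn(s: String) { recognized.add(s) }

        // "Unlearn": remove a string that was added, for example a misspelling
        // the device picked up through repeated use.
        fun unlearn(s: String) { recognized.remove(s) }
    }
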
To recapitulate, we have described an aspect of the invention that relates to performing a method of inputting data into a mobile communications device having a touchscreen. With reference to
At a step 522, the user-entered character string is automatically replaced with a suggested character string. In various embodiments, the suggested character string is a correctly spelled word that has at least one character in common with the user-entered character string. In other embodiments, the suggested character string may be a correctly spelled word, and in some embodiments, the suggested character string may be an incorrectly spelled word. Generally, the suggested character string is one that is contained within a dictionary database associated with the mobile communications device.
Continuing with
Accordingly, at step 530, touch input is received by way of the input region, and incident to receiving that touch input, as shown at step 532, a set of alternative character strings is presented. The set of alternative character strings includes suggestions for replacing the suggested character string. It will be understood that as suggested replacements for the suggested character string, the alternative character strings may also include alternative suggestions for replacing the user-entered character string. In an embodiment, each of the alternative character strings has at least one character in common with the user-entered character string. In another embodiment, each of the alternative character strings may have at least one character in common with the suggested character string. In further embodiments, the set of alternative character strings may include either or both of the user-entered character string and the suggested character string. In still further embodiments, an input region is provided which corresponds to each of the alternative character strings, such that a user may be able to “tap” on one of the alternative character strings to cause some functionality such as, for example, to cause the selected alternative character string to replace the suggested character string.
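The tap-to-replace behavior at step 532 could be reduced to something like the following sketch, in which the alternatives presented may include the user-entered and suggested strings themselves; the names are illustrative and not tied to any particular toolkit.

    data class Correction(
        val userEntered: String,          // what the user actually typed
        val suggested: String,            // the automatic replacement
        val alternatives: List<String>    // other candidate replacements
    )

    // The set presented to the user may include both original strings.
    fun alternativesFor(c: Correction): List<String> =
        (listOf(c.userEntered, c.suggested) + c.alternatives).distinct()

    // Tapping an alternative replaces the suggested string in the entered text.
    fun applyChoice(enteredText: String, c: Correction, chosen: String): String =
        enteredText.replaceFirst(c.suggested, chosen)
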
Turning to
At a step 538, as shown in
At a step 542, a suggested character string is automatically determined. In an embodiment, the suggested character string is a recognized character string from the dictionary database and is selected as a suggested replacement for the user-entered character string. The decision diamond 544 indicates an internal determination of whether the automatic-correction is in an active or inactive state, the state generally being dependent upon the user selection of step 536. If the automatic-correction function is in an active state, the illustrative method of
At a step 548, the suggested character string is marked with a visual indication that the suggested character string replaced the user-entered character string. As described above, this visual indication may take any form suitable such as, for example, underlining, bolding, and the like. Further, as illustrated at a step 550, that visual indication is maintained until a touch input is received that corresponds to an instruction to remove the visual indication. Such an instruction may be presented in any manner, so long as it is prompted by some purposeful touch input to the device. In an embodiment, an input region capable of receiving touch input and that corresponds to the visual indication may be provided. In one embodiment, incident to receiving touch input by way of the input region, a set of alternative character strings may be presented as suggestions for replacing the suggested (or the user-entered) character string. Additionally, in another embodiment, an option to select an inactive state associated with the automatic-correction function may again be presented at any time.
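As a compact sketch of this part of the flow (with illustrative names only; the comments refer to the steps described above), the replacement is applied only when automatic correction is active, and the visual indication persists until a purposeful touch clears it:

    data class Entry(var text: String, var marked: Boolean = false)

    class AutoCorrector(var active: Boolean = true) {
        fun onWordCompleted(entry: Entry, suggestion: String?) {
            if (!active || suggestion == null) return  // inactive state: leave text as entered
            entry.text = suggestion                    // replace with the suggested string
            entry.marked = true                        // step 548: add the visual indication
        }

        // Step 550: the indication is removed only by a purposeful touch input.
        fun onIndicationTouched(entry: Entry) { entry.marked = false }
    }
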
With continued reference to
With reference to
As described above, with reference to
At a step 572, a touch input representing a selection of the option to remove the suggested character string from the dictionary database is received. In another embodiment, an option to add the user-entered character string, or any other character string, to the dictionary database may be received. In a further embodiment, an option to take no action may be received, and in still a further embodiment, no option may be received. In the illustrative embodiment of
As referenced above, another aspect of an embodiment of the invention includes the ability to arrive at a desired position in an ordered list of items on a display. The ordered list of items is stored in a mobile communications device. In one embodiment of the invention, a user may select, by way of a touch action, a selectable option, which causes a positional indicator to appear on the display. The positional indicator is responsive to the movement of the selectable option, and indicates a corresponding position among the ordered list of items.
Referring to
In some embodiments, a touch action may be accomplished by using a touch-sensitive surface that may not be located on the display 610, but that may be located on the side of the mobile device in the form of a touch-sensitive strip. In other embodiments, the mobile device may include an optical sensor that is capable of detecting the motion of an object, such as a finger, fingernail, or stylus, through imaging. In addition, biometric fingerprint sensors may be used that detect motion of a fingerprint across the optical detector.
Numerals 612 and 614 represent the width and length, respectively, of the display 610. When the mobile device is vertically oriented, so that the user may view the display 610 in a portrait view, as represented by display 610 in
Display 610 has a user interface 616, which is illustrated as a plurality of icons and data. We have included a portion of an exemplary user interface 616 in
As mentioned above, an embodiment of the invention includes the ability to navigate through and reach a desired position among an ordered list of items. This may be achieved by moving a selectable option 618 in a direction allowed for by the mobile device. In embodiments, the selectable option 618 is capable of being moved by a touch action in a first direction, such as along the length 614 of display 610. Here, the user may move the selectable option 618 in this direction to navigate through an ordered list of items, as described above. In these and other embodiments, selectable option 618 is also capable of being moved in a second direction, such as along the width 612 of display 610. In these embodiments, the user may have located the desired item within the ordered list of items, but there may be a subset list of items associated with the item that the user is able to navigate through. The user may then move selectable option 618, for example by a touch action, along the width 612 of display 610 to navigate through the subset list of items.
It should be noted that the selectable option 618 may indicate the overall list position in many embodiments. For example, if the ordered list of items is alpha sorted, or in alphabetical order, there may not be 26 evenly spaced regions (e.g., one evenly spaced region for each letter of the alphabet) in relation to the movement of the selectable option 618. In these embodiments, for instance, if the alpha sorted list includes only items beginning with the letter A through D, the selectable option 618, because it indicates the current location within the overall list of items, may appear at or near the bottom of the display 610 once the end of the list has been reached, even though the end of the list may be an item beginning with D, but not Z.
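A minimal sketch of this overall-position mapping follows; the pixel values and the short A-through-D contact list are assumptions used only to mirror the example above.

    fun listIndexFor(optionOffsetPx: Int, trackLengthPx: Int, listSize: Int): Int {
        // Map the option's travel to a fraction of the whole list, not to 26
        // evenly spaced letter bands.
        val fraction = optionOffsetPx.coerceIn(0, trackLengthPx).toDouble() / trackLengthPx
        return (fraction * (listSize - 1)).toInt()
    }

    fun main() {
        val contacts = listOf("Aaron", "Abby", "Bill", "Carla", "Dana") // A through D only
        val track = 400 // pixels of travel available to the selectable option

        // Dragging to the bottom reaches the last item ("Dana"), even though the
        // list ends at D rather than Z, because position tracks the overall list.
        println(contacts[listIndexFor(400, track, contacts.size)]) // Dana
        println(contacts[listIndexFor(0, track, contacts.size)])   // Aaron
    }
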
As described above, the selectable option 618 allows the user to navigate through an ordered list of items to reach a desired position among this list. To give the user an indication of the relationship between the position of the selectable option 618 and the current location in the ordered list of items, we have provided a positional indicator 620 that dynamically displays an indication of the current location within the ordered list of items. Examples of an ordered list of items, as discussed above, include, but are not limited to, contacts, photographs, videos, music, sports teams, and the like, all of which may be stored in the mobile device. Depending on how the items are arranged within the ordered list of items, the items may be in alphabetical order, numerical order, or ordered by date, time, size, or any other way that items may be arranged in a list.
As an example of the above, the ordered list of items is a list of contacts saved in the mobile device, and the contacts are ordered alphabetically by first name. Here, when the user selects the selectable option 618 by some type of touch action as described above, the positional indicator 620 is presented on the display 610. In one embodiment, the positional indicator 620 is presented on the display 610 incident to the user moving the selectable option 618 in any direction allowable by the mobile device. In another embodiment, the positional indicator is presented on the display 610 at some set time after the selectable option 618 is moved by the user's touch action. In still another embodiment, the positional indicator 620 is presented on the display 610 incident to the user's touch action, even before the selectable option has been moved. In various aspects of each embodiment described, the positional indicator 620 may gradually appear on display 610, so that the brightness of positional indicator 620 is displayed gradually. Positional indicator 620 may also gradually disappear in this same fashion.
Positional indicator 620, as previously described, dynamically indicates a corresponding position within the ordered list of items. Positional indicators 620A, 620B, 620C, 620D, and 620E represent various examples of the indication that may be provided to the user when moving the selectable option, as described above. As shown in
In another embodiment, the list may be a list of sports teams, such as a list of baseball teams, football teams, soccer teams, basketball teams, and the like. In this embodiment, the positional indicator, such as positional indicator 620C may present the logo representing the team where the selectable option is located. In one aspect of this embodiment, each team may have one or more items associated with it, such as recent scores, schedules, statistics, and the like, so that moving the selectable option 618 until the desired team logo appears on the positional indicator 620C provides the user with an efficient way to locate this information.
In some instances, the ordered list of items may include pictures, videos, or other items that can be ordered by size. In accordance with this embodiment, we have shown positional indicator 620D with “20 MB” presented on it, representing the file size of one or more items in the ordered list corresponding to the current position of selectable option 618. This may give the user an easy and efficient way to locate a certain item, such as a picture or video, if the file size is known or can be approximated. Moving to positional indicator 620E, we show an embodiment that presents “Jan. 1, 2000,” representative of a date that may appear on a positional indicator if the ordered list of items may be ordered by date. Such lists may include events (e.g., calendar-related items), pictures, videos, and the like. As mentioned above, the positional indicator may take on a form consistent with the items in the ordered list, and as such, other indication information may be presented on the positional indicator other than what has been described above. As shown by numeral 621, these are contemplated to be within the scope of the present invention.
As we previously discussed, the width 612 and length 614 of the display are shown in
Now referring to
Display 720 illustrates a second position of the selectable option 722 as a result of the user selecting and moving it in a first direction, for example, in a downward direction along the length of display 720, as shown. Moving selectable option 722 to a second position has caused positional indicator 724 to display a different letter corresponding to the new position of selectable option 722 in relation to the location among the ordered list of items. For instance, as the letter “G” is shown on positional indicator 724, one or more items within the ordered list may start with the letter “G.” In this instance, if there are no items that start with the letter “G,” “G” would not be presented on the positional indicator, which would instead skip to the next letter that is associated with the next item in the ordered list. In another instance, however, even if there are no items starting with “G,” the letter “G” may still appear on the positional indicator.
In continued reference to
We have included display 750 to illustrate an instance where the selectable option 752 has been moved to the bottom or near the bottom of display 750, wherein its location corresponds to the last or one of the last items within the ordered list. Here, the letter “Z” is presented on positional indicator 754, indicating to the user that the end or near the end of the ordered list has been reached. The embodiment described above easily and efficiently allows a user to reach a desired item within an ordered list of items.
In the embodiments described above with respect to moving the selectable option, presenting the positional indicator, and reaching a desired location within an ordered list of items, it should be noted that while the ordered list of items may be visible on the display while the positional indicator is presented to the user, the ordered list of items may not change its position as the selectable option is moved on the display. In order to save processing power, the positional indicator dynamically displays the current position within the ordered list of items, but the ordered list of items remains in a constant position or state until the user reverses the touch action, for example, by removing the object (e.g., finger, fingernail, stylus) from the selectable option.
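A sketch of that division of labor, with hypothetical names, could look like the following: dragging updates only the small positional indicator, and the ordered list is repositioned only when the touch action is reversed.

    class ListLocator(private val items: List<String>) {
        private var pendingIndex = 0

        // Called repeatedly as the selectable option moves; cheap, since it only
        // redraws the small indicator (here, the first letter of the item).
        fun onDrag(index: Int): String {
            pendingIndex = index.coerceIn(0, items.lastIndex)
            return items[pendingIndex].first().uppercase()   // e.g., "G"
        }

        // Called when the user lifts the finger; only now does the list move.
        fun onRelease(): String = items[pendingIndex]
    }
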
To recapitulate, we have described an aspect of the invention that relates to performing a method of reaching a position in an ordered list of items on a display. With reference to
The user may, at any time, reverse the touch action by, for example, removing the object used to select the selectable option. For example, if the user used his or her finger to provide the touch action, the user may lift the finger from the display to reverse the touch action. Reversing the touch action causes an item in the ordered list to be presented on the display. In some embodiments, the item presented on the display is the first item in the ordered list that starts with the letter presented on the positional indicator when the touch action is reversed. For example, if the positional indicator displays an “M” at the time that the touch action is reversed, the first item starting with “M” may be the first item listed on the display.
In some embodiments of
We previously mentioned that while the selectable option is being moved by the user, which causes the display of the positional indicator, the ordered list of items in the background may still be viewable. The ordered list, however, may become dimmed to focus the user's attention on the positional indicator, rather than on the ordered list. In an embodiment, the ordered list is gradually dimmed when the positional indicator is presented on the display, and may gradually revert to its original brightness once the touch action is reversed, as described above. In addition, in some embodiments, the presentation of the positional indicator may also be gradual, so that the positional indicator is first dimmed, and gradually becomes brighter on the display. The same would occur when the touch action is reversed, wherein the positional indicator may gradually disappear from the display.
With reference to
We have also mentioned that the user, upon locating an item using the selectable option and the positional indicator, may wish to locate an item in a subset list of items for the located item. One or more of the items in the ordered list may have an associated subset list of items, so that the positional indicator may be moved in a different direction, such as a second direction, to allow the user to navigate through the subset list of items and reach a desired location within this list as well. As with reversing the touch action described above, a user may cause an item within the subset list to be presented on the display by reversing the touch action (e.g., releasing the finger used to move the selectable option), wherein the item is categorized in the subset list at a position consistent with the positional indicator.
Event Disposition Control for Mobile Communications Device

Another aspect of an embodiment of the invention includes an ability to respond to various events by way of a mobile communications device. In one embodiment, a way to respond to a given event (such as receiving a call, receiving a voicemail, etc.) is to provide an informational element, which can take the form of a graphical user interface (GUI) object, that presents information related to the event and is also a slideable object that can be moved by a user into a drop zone to give rise to a desired action.
Turning now to
Blown up for detail, a user interface 914 is shown as being presentable on display 912. Various types of events may give rise to seeking user input. Illustrative events include things such as receiving a phone call; receiving a message (e.g., a voicemail, an email, a text message, and the like); initiating an alarm; receiving a file (e.g., a picture, an audio file, a video file, and the like); or arriving at a time associated with a calendared event. For example, ten o'clock on a Monday morning may arrive, which triggers a reminder to be presented.
In one embodiment, an informational element 916 is presented on display 912. Informational element 916 serves as an event summary that includes descriptive information that is contextually related to the event. Thus, for example, when mobile device 910 receives an incoming call, informational element 916 might present caller-identifying information that helps a user to determine an identification of the incoming caller. In the case of an alarm, informational element 916 might present textual data that describes the event associated with the alarm. We will describe this aspect in greater detail below. In some embodiments, our technology of utilizing sliding and drop zones helps prevent unintentional actions, such as accidentally unlocking the mobile device.
As mentioned, incident to an occurrence of some event, informational element 916 is presented on display 912. This affords the option to a user to move informational element 916 to one of at least two drop zones, including a first drop zone 918 and a second drop zone 920. Upward movement is indicated by dotted arrow 922, and downward movement is indicated by dotted arrow 924. Informational element 916 both presents information describing or related to the event that gave rise to its presentation and is also slideable by way of touch actions into either of the first or second drop zones 918, 920.
Although we say “into” the drop zones, we do not mean to imply that informational element 916 needs to be moved wholly within a drop zone. This is not the case. In fact, in some embodiments, mobile device 910 tracks an amount of movement away from an initial starting position. If informational element 916 is moved beyond a threshold distance away from the initial starting position in a certain direction, then it will be deemed to have been dropped in one of the two drop zones. Similarly, if informational element 916 is moved beyond a threshold distance toward another drop zone, then it will be deemed to have been released into the other drop zone. In still other embodiments, informational element 916 will be deemed to have been dropped into one of the drop zones if it is moved within a certain proximity of that drop zone. Thus, it might be the case that if an upper portion of informational element 916 crosses some threshold area associated with first drop zone 918, then it will be considered to have been moved to the first drop zone. The same applies to the second drop zone.
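One assumed way to implement the threshold test described here is to compare how far the informational element has traveled from its starting position against a fixed distance, as in the sketch below; the 120-pixel threshold and the vertical orientation are illustrative only.

    enum class Drop { FIRST_ZONE, SECOND_ZONE, NONE }

    fun classifyRelease(startY: Float, releaseY: Float, thresholdPx: Float = 120f): Drop {
        val delta = releaseY - startY
        return when {
            delta <= -thresholdPx -> Drop.FIRST_ZONE   // moved far enough upward
            delta >= thresholdPx -> Drop.SECOND_ZONE   // moved far enough downward
            else -> Drop.NONE                          // released short of both zones
        }
    }

A proximity-based variant, as also contemplated above, would instead compare the element's edge against the bounds of each drop zone rather than against a travel distance.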
In some embodiments, a user may use his or her finger or a stylus or some other object to indicate a desire to move informational element 916. One way that motion of informational element 916 might be tracked is for mobile device 910 to map a general pressure point to a specific point or line. This specific point or line can then be used to track how far informational element 916 is moved along display 912.
Each of the drop zones 918 and 920 is associated with a certain action. That is, if informational element 916 is moved to first drop zone 918 then a first action occurs, and if informational element 916 is dragged to second drop zone 920, then a second action occurs. In one embodiment, if informational element 916 is not dragged to either first drop zone 918 or to second drop zone 920, then a default action is allowed to occur. This default action is different from either the first or second actions. Being different, though, does not necessarily contemplate an entirely different action, but also includes a temporal differentiation. For example, if a user of device 910 receives an incoming call, the device might ring for a certain period of time, after which the notification will stop. But this only occurs after a certain amount of time; for example, six rings or 25 seconds (arbitrary examples). It may be the case that the second action, associated with second drop zone 920, is to dismiss a call, which, in effect, stops the incoming-call notification. Although it may be true that in both situations the incoming-call notifications were stopped, the second action is still different from the default action because the default action would have allowed the call-notification operations to continue for a certain amount of time, but the second action (which is associated with second drop zone 920) stopped the call-notification operations immediately.
Turning now to
As mentioned, descriptive information 932 is related to the incoming event that gave rise to the presentation of informational element 916. For example, if the relevant event were an incoming call, then graphic 934 might present a picture of the incoming caller, and textual information 936 might present indicia identifying the incoming caller. If the incoming event were an alarm, then graphic 934 might take the form of a static bell or some other indicator that a user might readily associate with an alarm. Graphic 934 can also be customized by a user so that the user can determine what gets presented incident to an occurrence of some event.
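The following sketch illustrates, under assumed event fields and icon names, how descriptive content might be selected contextually for element 916, with a user-customized graphic taking precedence when one has been configured.

```python
# A small sketch of choosing contextually relevant content for element 916
# based on the event type. The event fields, icon filenames, and customization
# scheme are assumptions made for illustration.

def build_descriptive_info(event, user_graphics=None):
    user_graphics = user_graphics or {}
    if event["type"] == "incoming_call":
        graphic = user_graphics.get("incoming_call", event.get("caller_photo"))
        text = event.get("caller_name") or event.get("caller_number")
    elif event["type"] == "alarm":
        graphic = user_graphics.get("alarm", "bell_icon.png")
        text = event.get("alarm_label", "Alarm")
    else:
        graphic = user_graphics.get(event["type"], "generic_icon.png")
        text = event.get("summary", "")
    return {"graphic": graphic, "text": text}  # graphic 934 and textual info 936
```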
The type of descriptive information that gets presented varies with the nature of the instant event. We show some illustrative events in
As we have discussed so far, an incoming event might give rise to a presentation of informational element 916B. In such a situation, a user has the option of moving informational element 916B to one of at least two drop zones, which will trigger a certain action to be performed. We have shown some illustrative actions such as answering a call, opening attachments, or snoozing an alarm. If a user desires to carry out what might be opposites of the types of actions listed in
Turning to
One way that this aspect of the invention is helpful is that a sliding motion, especially a slide to a specific area, is difficult to cause accidentally. Thus, if mobile device 910 is in a user's pocket or handbag and a certain event occurs, then by utilizing embodiments of the present invention, unintended actions can be avoided. That is, a user is unlikely to accidentally complete the sequence necessary to give rise to an action, especially when that sequence includes initially tapping a certain portion of display 912 so as to activate or direct focus to informational element 916, and then dragging informational element 916 to one of the two drop zones 918 or 920.
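A minimal sketch of this two-step gating follows: a touch must first land within element 916's bounds before subsequent movement is evaluated against the drop rules, so stray contact in a pocket or handbag is ignored. The state names and the drop-resolution callable are assumptions for illustration.

```python
# Sketch of the tap-then-drag sequence described above. A drag only counts once
# the element has received an initial touch (focus); otherwise movement is ignored.

class SlideGate:
    IDLE, FOCUSED = range(2)

    def __init__(self, element_bounds, resolve_drop):
        self.state = self.IDLE
        self.element_bounds = element_bounds  # (left, top, right, bottom) of element 916
        self.resolve_drop = resolve_drop      # callable taking (start_y, current_y) -> zone or None
        self.start_y = None

    def on_touch_down(self, x, y):
        left, top, right, bottom = self.element_bounds
        if left <= x <= right and top <= y <= bottom:
            self.state = self.FOCUSED         # tap landed on the element: give it focus
            self.start_y = y

    def on_touch_move(self, x, y):
        if self.state != self.FOCUSED:
            return None                       # stray contact (pocket, handbag) is ignored
        return self.resolve_drop(self.start_y, y)

    def on_touch_up(self):
        self.state = self.IDLE
```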
In some embodiments, more than two drop zones can be provided. For example, and with reference to
To recapitulate, we have described an aspect of the invention that relates to performing a method of responding to an event by way of a mobile communications device. With reference to
Decision diamond 1116 indicates different actions that might occur as the informational element 916 is moved in various ways. As mentioned, if the informational element is moved to the first drop zone, then mobile device 910 will perform the first action, as indicated by numeral 1118. If the informational element is moved to the second drop zone, then mobile device 910 will perform the second action, as indicated by numeral 1120. If the informational element is not dragged to either drop zone, then a default action will be allowed to occur, which is referenced by numeral 1122. We have mentioned that this default action is different in at least some respects from either of the first or second actions, including a difference that may be defined in a temporal aspect.
Illustrative positive actions include answering an incoming call, observing an incoming message, viewing an incoming file, viewing an incoming media attachment, opening a calendar-related item, and the like. Illustrative negative-type actions include dismissing an incoming call, sending an incoming call to voicemail, dismissing an incoming message, dismissing an incoming file, silencing an alarm, or the like. What some people refer to as positive actions, other people may refer to as negative actions.
With reference to
A final illustrative step 1154 includes receiving user input that disposes the informational element to one of at least two drop zones that are defined by respective portions of the device's display and that are respectively associated with certain actions such that moving the informational element to one of the at least two drop zones invokes a certain response that is consistent with the corresponding drop zone. We have also mentioned how the response can vary based on either the nature of the event or based on how the user has customized what actions should flow incident to the informational element being moved to a respective drop zone.
Physical Feedback to Indicate Object Directional Slide

As previously alluded to, another aspect of the invention includes an ability to enhance vibrational feedback so that it can provide information on tasks that are being performed. An illustrative task is a drag task. In this aspect of the invention, directional slide can be indicated by vibrational response.
Consider
There may be some cases where it is desirable for a user to be able to determine a direction that object 1212 is being moved without looking at the display of mobile device 1210. Moreover, there may be cases where a user has limited visual acuity, or may even be blind, such that viewing the display of mobile device 1210 is not even possible. Rather than making a device such as mobile communications device 1210 unavailable to such users, we provide a way to indicate directional movement by providing vibrational feedback.
For example, in one embodiment, as object 1212 is moved upward 1218, an intensity of a vibrational response increases. Similarly, as object 1212 is moved in a downward direction 1220, an intensity of a vibrational response of device 1210 decreases. As previously alluded to, “intensity” can include magnitude and/or frequency.
As previously mentioned, this object 1212 can include descriptive information that describes attributes of the event that gave rise to its presentation.
The vibrational response that is provided incident to moving object 1212 is dynamic in nature and occurs in real time: it can continuously vary as the location of object 1212 varies. It is continuous in that whatever response is being provided continues to be provided until a desired outcome is reached. Thus, rhythmic or pulsing responses are contemplated within the scope of “continuous.”
In some embodiments, the vibrational response continues until a desired task is completed. For example, completing a task might be to drop object 1212 into one of the two drop zones 1214 or 1216. In some embodiments, the intensity of the vibrational response continues along some pattern as object 1212 is moved along a first direction. In some embodiments, the vibrational response changes drastically once object 1212 is moved to drop zone 1214. In some embodiments, changing drastically means ceasing the vibrational response. In other embodiments, changing drastically means providing a vibrational response that is inconsistent with the pattern that had been followed as object 1212 was being moved along the first direction.
As mentioned, moving object 1212 in an upward direction might cause an increase in intensity of the vibrational response. An increase in intensity can include an increase in magnitude and/or an increase in frequency. An increase in magnitude would mean that a small vibrational response would turn into a larger vibrational response; for example, a gentle ping might turn into a more robust thud. Another way that intensity can be increased is by increasing frequency: mobile device 1210 might buzz more frequently the farther object 1212 is moved in an upward direction. That is, the frequency of the vibrations can increase as object 1212 is moved in a first direction.
Everything we have said can apply to movement in the second direction, but with the opposite effect. That is, moving object 1212 in a downward direction 1220 might cause a decrease in intensity of the vibrational response: a frequency decrease might occur, or a magnitude decrease might occur.
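As a non-limiting sketch, the mapping from displacement to intensity might look like the following, where upward travel raises both the magnitude and the pulse rate of the vibration and downward travel lowers them. The numeric ranges and the vibrate() interface are assumptions, not a description of any particular device.

```python
# A minimal sketch of the direction-to-intensity mapping described above.
# "Intensity" is realized here as both a magnitude and a pulse rate; the
# full-scale travel and clamping limits are illustrative assumptions.

def vibration_for_displacement(dy_up, base_magnitude=0.4, base_rate_hz=2.0):
    """dy_up: upward displacement of object 1212 in pixels (negative when moved down)."""
    scale = dy_up / 200.0                                   # assumed full-scale travel of ~200 px
    magnitude = min(1.0, max(0.05, base_magnitude + 0.6 * scale))
    rate_hz = min(10.0, max(0.5, base_rate_hz + 8.0 * scale))
    return magnitude, rate_hz

def drive_vibrator(vibrate, dy_up):
    """vibrate(magnitude, rate_hz) is a hypothetical hardware call."""
    magnitude, rate_hz = vibration_for_displacement(dy_up)
    vibrate(magnitude, rate_hz)
```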
Turning now to
Reference numeral 1242 marks an illustrative starting location. Thus, say an incoming call was received by mobile device 1210. In this example, object 1212 would be presented, and mobile device 1210 would vibrate with an intensity consistent with a level denoted by reference numeral 1242. As object 1212 is moved toward first drop zone 1214, the vibrational intensity might increase consistent with the upper portion 1244 of curve 1230. As object 1212 is moved in a different direction, vibrational intensity might decrease consistent with a lower portion 1246 of curve 1230.
In one embodiment, curve 1230 is monotonically increasing. That is, each instance of intensity is greater than the previous when moving in a rightward direction. It may be the case that curve 1230 is monotonically increasing only over a certain range of values. For example, perhaps curve 1230 is monotonically increasing only over the range between a first dashed line 1248 and a second dashed line 1250. It may be the case that beyond this range, curve 1230, although not shown in this way, takes on a drastically different pattern. For example, for displacements beyond marker 1250, perhaps the vibrational intensity drops off markedly, which would translate to a steep decline in curve 1230 after marker 1250. It might also be the case that the vibrational intensity markedly increases after marker 1250.
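A curve of the kind just described might be approximated as in the sketch below: a monotonic ramp between two assumed markers, followed by a marked break in the pattern (here, the intensity dropping to zero) once the displacement passes the upper marker. All breakpoints and values are illustrative assumptions.

```python
# Sketch of an intensity curve like curve 1230: monotonically increasing over a
# working range, with a drastic change beyond the upper marker to signal that
# the object has reached an area of interest such as a drop zone.

LOWER_MARKER = -100.0   # displacement assumed to correspond to dashed line 1248
UPPER_MARKER = 100.0    # displacement assumed to correspond to dashed line 1250

def intensity_on_curve(displacement):
    if displacement < LOWER_MARKER:
        return 0.05                                    # floor below the working range
    if displacement <= UPPER_MARKER:
        # Monotonic ramp between the two markers, from 0.1 up to 1.0.
        t = (displacement - LOWER_MARKER) / (UPPER_MARKER - LOWER_MARKER)
        return 0.1 + 0.9 * t
    # Beyond the upper marker: a marked change that breaks the pattern,
    # telling the user the object has reached the drop zone.
    return 0.0
```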
One way that such marked changes might be used can be seen in connection with
Similarly, moving object 1212 along a lower portion 1268 of curve 1254 might continue along a certain pattern until a point 1270 is reached, after which it may make a marked change, which is intended to be shown by reference numerals 1272 as well as numeral 1274.
The general shape of curve 1254 does not need to follow the shape that is shown in
At a step 1312, the object is made moveable on the touchscreen such that it can be moved by way of a touch interaction. Thus, for example, a user can use his or her thumb to move object 1212. As the user moves object 1212 in different directions, different vibrational responses are provided dynamically and in real time so that a user can perceive a direction that object 1212 is being moved by perceiving the vibrational response. Thus, at a step 1314, in real time, an intensity of an output of the vibration component is continuously varied to cause a vibrational response of the mobile communications device such that movement of the object in a first direction causes a first continuous vibrational output, and movement of the object in a second direction causes a second continuous vibrational output.

We have mentioned that in one embodiment, the intensity can monotonically increase as object 1212 is progressively more displaced along a first direction such that every advancement of the object in the first direction produces a vibrational output that is greater than when the object was in any preceding position. In some embodiments, this occurs up until a certain threshold, after which the vibrational intensity changes in an inconsistent way so as to denote that an object such as object 1212 has moved into a certain area of interest such as a drop zone.

Of course, embodiments of the present invention do not need to rely on dual drop zones as we have shown. Rather, the scope of this aspect of the invention is widely applicable to any graphical user interface object that is to be moved in some direction. Directionality of that object can be made to correspond to a vibrational intensity such that the directionality of the object can be perceived by perceiving the intensity of the corresponding vibrational output. In some embodiments, processor 116 is configured to help coordinate the vibrational response to a movement of an object such as object 1212 so that movement of the object can be deciphered without (or with) physically viewing display 118. This vibrational response is not a mere playback of a prerecorded action; rather, it is a dynamically created vibrational response that is created based on a movement of the object. Thus, more than just playing back a response incident to an occurrence of some event, embodiments of the invention provide for a real-time creation of a vibrational response that occurs based on real-time events. In some embodiments, vibrational outputs follow respectively consistent patterns; namely, for some ranges they progressively increase or progressively decrease.
Turning now to
At a step 1324, in real time, an intensity of a vibrational response is varied consistent with a movement of the object such that the movement of the object in a first direction results in a first vibrational output, and movement of the object in a second direction results in a second vibrational output.
Many different arrangements of the various components depicted, as well as components not shown, are possible without departing from the spirit and scope of the present invention. Embodiments of the present invention have been described with the intent to be illustrative rather than restrictive. Alternative embodiments that do not depart from its scope will become apparent to those skilled in the art. A skilled artisan may develop alternative means of implementing the aforementioned improvements without departing from the scope of the present invention.
It will be understood that certain features and subcombinations are of utility and may be employed without reference to other features and subcombinations and are contemplated within the scope of the claims. Not all steps listed in the various figures need be carried out in the specific order described.
Claims
1. One or more computer-readable media having computer-useable instructions embodied thereon for performing a method of responding to an event by way of a mobile communications device, the method comprising:
- incident to an occurrence of the event, presenting an informational element on a display of the mobile communications device, wherein the informational element takes the form of a slideable control that can be manipulated by touch actions;
- presenting descriptive information in the informational element that is contextually related to the event such that the descriptive information is related to the event;
- providing at least two drop zones, a first drop zone and a second drop zone, that each consume an area of the display, wherein the first drop zone is associated with a first action, and wherein the second drop zone is associated with a second action;
- such that, (1) if the informational element is moved to the first drop zone, then the first action occurs, (2) if the informational element is moved to the second drop zone, then the second action occurs, and (3) if the informational element is not moved to either the first or second drop zone, then allowing a default action to occur that is different from the first or second actions.
2. The media of claim 1, wherein the event includes one or more of the following:
- receiving a phone call;
- receiving a message, including one or more of a voicemail, an email, or a text message;
- initiating an alarm;
- receiving a file, including one or more of a picture, an audio file, or a video file; or
- arriving at a time associated with a calendared event.
3. The media of claim 2, wherein the descriptive information includes one or more of the following:
- a caller's phone number;
- a picture associated with a caller; or
- a filename of a file.
4. The media of claim 1, wherein the first drop zone is in an upper portion of the display, and wherein the second drop zone is in a lower portion of the display.
5. The media of claim 1, wherein the first action occurs incident to a moving of the informational element to the first drop zone, and wherein the second action occurs incident to a moving of the informational element to the second drop zone.
6. The media of claim 5, wherein moving the informational element to a corresponding drop zone includes:
- providing a positional indicator that indicates an initial position of the informational element; and one or more of, (1) moving the informational element a threshold distance away from a starting position such that the positional indicator is moved the threshold distance away from the initial position; or (2) moving the informational element within a threshold proximity of the corresponding drop zone.
7. The media of claim 6, wherein moving the informational element within a threshold proximity of the corresponding drop zone includes moving at least a portion of the informational element into the drop zone.
8. The media of claim 1, wherein the first and second actions are related to the event that gave rise to the presentation of the informational element.
9. The media of claim 8, wherein the first action is not always the same, and wherein the first action differs according to the event such that the first action is contextually relevant to the event.
10. The media of claim 9, wherein the first action is a positive action, and wherein the second action is a negative action or vice versa.
11. The media of claim 10, wherein the positive action includes one or more of:
- answering an incoming call;
- observing an incoming message;
- viewing an incoming file;
- viewing an incoming media attachment; and
- opening a calendar-related item.
12. The media of claim 10, wherein the negative action includes one or more of:
- dismissing an incoming call;
- sending an incoming call to voicemail;
- dismissing an incoming message;
- dismissing an incoming file;
- silencing an alarm.
13. The media of claim 1, wherein the first or second action is a user-defined action.
14. The media of claim 1, further comprising
- providing a third drop zone and a fourth drop zone, which are respectively associated with a third action and a fourth action such that when the informational element is moved to the third drop zone, then the third action occurs, and when the informational element is moved to the fourth drop zone, then the fourth action occurs.
15. One or more computer-readable media having computer-useable instructions embodied thereon for performing a method of responding to an event by way of a mobile communications device, the method comprising:
- receiving at the mobile communications device an indication of an occurrence of an event;
- presenting an informational element on a display of the device, wherein the informational element is a slideable graphical-user-interface control (GUI control) that presents contextually relevant data that is related to the event; and
- receiving user input that disposes the informational element to one of at least two drop zones that are defined by respective portions of the display and that are respectively associated with certain actions such that moving the informational element to one of the at least two drop zones invokes a certain response that is consistent with the corresponding drop zone.
16. The media of claim 15, wherein the contextually relevant data includes textual data that presents information associated with the event.
17. The media of claim 16, wherein the certain response is selected from a plurality of possible responses and is determined by a nature of the event such that the nature of the event determines the certain response, and that the certain response is variable and varies according to the nature of the event.
18. The media of claim 17, wherein at least one of the plurality of possible responses is a customizable response that is customizable by a user of the device.
19. A mobile communications device comprising:
- a touchscreen display that presents a graphical user interface that includes at least two drop zones, which consume opposite portions of the display, wherein the mobile communications device is configured to initiate a corresponding action incident to any of the at least two drop zones sufficiently receiving an informational element; and
- a processor that helps facilitate a presentation of the informational element, which is presented based on an occurrence of an event, and which includes descriptive information that presents data associated with the event such that the corresponding action can be initiated in response to sufficiently receiving an informational element.
20. The mobile communications device of claim 19, wherein the corresponding action is one of at least two actions, and wherein taking no action is not one of the at least two actions, such that if the informational element is not moved to any of the at least two drop zones, then no action occurs.
Type: Application
Filed: Mar 28, 2008
Publication Date: Oct 1, 2009
Applicant: SPRINT COMMUNICATIONS COMPANY L.P. (Overland Park, KS)
Inventors: Michael T. Lundy (Olathe, KS), Mathew Jay Van Orden (Leawood, KS)
Application Number: 12/058,445
International Classification: H04M 11/04 (20060101); H04Q 7/22 (20060101); H04M 1/21 (20060101); H04M 1/23 (20060101);