USER INTERFACE FOR MOBILE COMPUTER UNIT
An electronic device including a display, sensors detecting contact above the display, a processor receiving information from the sensors, and a user interface accessing a plurality of applications, each application running on the electronic device in an activated state and in a non-activated state whereby, in its activated state, the application presents a graphical user interface (GUI) on the display and runs interactively via the GUI, and, in its non-activated state, the application presents a gadget on the display and runs non-interactively to present, within the gadget, dynamically generated information related to the application, wherein the user interface initializes each application in its non-activated state when the electronic device is turned on, and alters the layout when an application transitions to its activated state, by displacing some gadgets from in the display to out of the display, and replacing the displaced gadgets with a window for the activated application's GUI.
This application is a continuation of U.S. Ser. No. 12/486,033, filed on Jun. 17, 2009, now U.S. Pat. No. 9,164,654, entitled USER INTERFACE FOR MOBILE COMPUTER UNIT, by inventors Magnus Goertz and Joseph Shain. U.S. Ser. No. 12/486,033 is a continuation-in-part of U.S. Ser. No. 10/315,250, now U.S. Pat. No. 8,095,879, filed on Dec. 10, 2002, entitled USER INTERFACE FOR MOBILE HANDHELD COMPUTER UNIT by inventor Magnus Goertz. U.S. Ser. No. 12/486,033 claims priority from provisional application No. 61/132,469, filed on Jun. 19, 2008, entitled KEYPAD FOR CHINESE CHARACTERS by inventors Magnus Goertz, Robert Pettersson, Staffan Gustafsson and Johann Gerell.
FIELD OF THE INVENTION
The field of the present invention is user interfaces for electronic devices and, more particularly, touch screen user interfaces.
BACKGROUND OF THE INVENTION
Touch screens provide user interfaces through which a user enters input to a computing device by touching a screen at a selected location, with a stylus or with his finger.
Conventional touch screens are limited as to the types of user inputs that they can recognize. For example, conventional touch screens are unable to distinguish between a soft tap and a hard press. In some prior art embodiments users initially select an item on the screen, and then subsequently activate the selected item. However, because prior art touch screens do not distinguish between degrees of pressure, the user is required to remove his finger or stylus from the screen and activate his selection with a second tap. It would be advantageous to produce touch screens that distinguish between varying amounts of pressure applied to the screen so that a user can select an item by touching its location on the screen, and then activate the item by applying additional pressure to the touch location without having to first remove his finger or stylus from the screen.
In conventional touch screens the keys are often small relative to the touch area, especially in handheld devices. The keys are also often situated close together. This can make it difficult to determine which key is being pressed by the user. It would be advantageous to clearly indicate to the user which key the user has selected, and furthermore, to allow the user to accept or reject the selected key without first removing his finger or stylus from the screen in order to perform a second tap.
Current user interfaces are basic and often require navigation through a series of menus in order to perform a desired operation. The present invention presents a user interface that is versatile in providing the user with many options, while requiring only a few selections to activate a desired function. To further enhance user experience, certain functions are performed automatically without requiring the user to enter a selection.
SUMMARY OF THE DESCRIPTION
Aspects of the present invention relate to user interfaces designed for use with a touch screen. The present invention relates to computer readable media storing computer programs with computer program code, which, when read by a computer unit, allows the computer to present a user interface for the computer unit.
In accordance with embodiments of the present invention, the computer unit features a touch sensitive display area. According to preferred embodiments, an initial display configuration presents a plurality of gadgets on the display. These gadgets are small areas of the screen that indicate the functions the gadget will perform when activated by the user. When a gadget is activated, typically by the user touching the area of the display on which the gadget is displayed, the gadget increases in size and provides the user with icons and information about the gadget's functions. Significantly, the gadget does not cover the entire display area. Thus, when a gadget, or even a plurality of gadgets, is active, the primary display of all available gadgets is still accessible. This primary display can be compared to a desktop in computer operating system user interfaces. However, this primary display in the user interface of the present invention is not the same as a desktop, where active windows can cover icons on the desktop. In the present invention, gadgets are arranged in such a manner that open gadgets do not cover other gadgets. Rather, when an open gadget expands in size, other gadgets are shifted to make room for the expanded gadget. This allows the user to scroll the primary display, or desktop, to view any gadget. In the context of the present invention, this primary display area that includes both open and closed gadgets is called the home window. The user scrolls the home window to view gadgets that are shifted beyond the viewable area of the display. An expanded, or activated, gadget has an expanded window, but often that window is not large enough to display everything contained in the gadget. To view contents of the gadget not displayed in the visible portion of the gadget window, the user scrolls the gadget window. Thus, two different scroll operations are provided: scrolling the home window and scrolling a gadget window.
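The gadget-shifting layout rule described above can be illustrated with a short sketch. The Gadget class, the gadget names and the pixel heights below are hypothetical illustrations, not part of the disclosed implementation:

```python
# Sketch of the home-window layout rule: an expanded gadget never covers
# its neighbors; instead, every gadget below it is shifted down to make
# room. Heights and names are illustrative assumptions.

class Gadget:
    def __init__(self, name, closed_height, open_height):
        self.name = name
        self.closed_height = closed_height
        self.open_height = open_height
        self.active = False

def layout(gadgets):
    """Return (name, y_offset) pairs; offsets grow as gadgets expand."""
    y, placed = 0, []
    for g in gadgets:
        placed.append((g.name, y))
        y += g.open_height if g.active else g.closed_height
    return placed

gadgets = [Gadget("clock", 40, 120), Gadget("weather", 40, 200),
           Gadget("mp3", 40, 160)]
before = layout(gadgets)
gadgets[0].active = True      # user taps the clock gadget to activate it
after = layout(gadgets)
# The weather and mp3 gadgets are pushed down by the clock's extra
# 80 pixels; they are shifted, never covered.
```

Because displaced gadgets remain in the (now larger) home window rather than being hidden behind the expanded gadget, the user can always scroll to reach them.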
According to one embodiment, scrolling is executed by gliding a finger or stylus along the touch screen to shift the active display area of the home window or of the gadget. The scrolling affects the home window if the finger glide began in an area of the screen that does not belong to an active gadget; the scroll affects an active gadget window if the finger glide began inside that active gadget window.
Various embodiments of the invention support several methods of scrolling a window. According to one embodiment, scrolling is done when the user touches the display inside the window area, for example at an edge of the window, or on an icon, such as an arrow or scrollbar, indicating a scroll operation. According to another embodiment, scrolling is done by the user touching the window with a finger or stylus and then gliding the finger or stylus along the touch sensitive screen in a direction indicating the desired direction of the scroll. When the content of the home display is larger in two dimensions than the actual display screen, this operation is like panning an image or map. When it is larger along only one axis (e.g., only vertically), scrolling is constrained to that axis even when the glide is not parallel to it.
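The one-axis constraint can be sketched as projecting the glide onto the scrollable axis. The function and its parameters below are illustrative assumptions, not a disclosed implementation:

```python
def scroll_delta(glide_dx, glide_dy, content_w, content_h, view_w, view_h):
    """Map a finger glide to a scroll offset. When the content exceeds
    the view in both dimensions, the glide pans freely, like panning a
    map; when the content exceeds the view along one axis only, the
    glide is projected onto that axis, even if the glide is diagonal."""
    dx = glide_dx if content_w > view_w else 0
    dy = glide_dy if content_h > view_h else 0
    return dx, dy

# Content taller than the view but not wider: a diagonal glide
# scrolls vertically only.
assert scroll_delta(15, -40, 320, 900, 320, 480) == (0, -40)
# Content larger in both dimensions: the glide pans in both axes.
assert scroll_delta(15, -40, 640, 900, 320, 480) == (15, -40)
```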
Another aspect of the present invention relates to computer readable media storing a computer program with computer program code, which, when read by a mobile handheld computer unit, allows the computer to present a user interface for the mobile handheld computer unit. The user interface features a touch sensitive area in which representations of a plurality of keys are displayed, and each key is mapped to a corresponding location in the touch sensitive area at which the representation of the key is displayed. A key in this context includes, inter alia, alphabetic keys such as in a QWERTY keypad, numeric keys and also icons representing programs or functions. A key is selected, but not activated, when an object touches the corresponding location. This intermediate status of being selected but not activated enables the user to subsequently activate a desired key while avoiding activation of a neighboring key that was selected but is not intended. A selected key is activated when the object touching it applies additional pressure to the key location.
According to preferred embodiments of the invention, when a key is selected, the user interface generates a secondary representation of the key, such as a callout balloon containing the key representation. The callout balloon is placed away from the key location (being touched) so that the user can easily view which key is selected without lifting his finger. According to another embodiment, an audio representation of the selected key is generated so the user hears which key was selected.
According to still further features in preferred embodiments of the invention, the user touches the screen (with a finger or stylus) at a first location, for example selecting a first key. The user then glides his finger or stylus over the screen to additional locations. At each additional location a new key is selected and the previously selected key is deselected. The user can activate any selected key by applying additional pressure to the screen. The user does not have to remove the object from the screen to glide and select additional keys even after activating a first key.
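This glide-to-select, press-to-activate interaction can be sketched as a simple event loop. The event names, the coordinates and the key_at() hit test below are illustrative assumptions:

```python
# Sketch of the select/activate interaction: gliding moves the
# selection, extra pressure activates, and the finger never leaves
# the screen. Event names and key geometry are illustrative.

def key_at(x, y):
    # Hypothetical hit test mapping a touch location to a key label
    # on a grid of 40x40-pixel keys.
    return "key(%d,%d)" % (x // 40, y // 40)

def run(events):
    selected, activated = None, []
    for kind, *args in events:
        if kind in ("touch", "glide"):
            # A new key is selected; the previous one is deselected.
            selected = key_at(*args)
        elif kind == "press" and selected is not None:
            # Additional pressure activates the current selection.
            activated.append(selected)
    return selected, activated

# Touch one key, glide to a neighbor, press to activate, glide on and
# press again: two activations without ever lifting the finger.
sel, act = run([("touch", 10, 10), ("glide", 50, 10), ("press",),
                ("glide", 90, 10), ("press",)])
```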
Additional touch pressure is detected in various ways according to several embodiments. According to one embodiment, the touch sensitive area is a light-based touch screen operable to detect different levels of touch pressure. For example, light-based touch screens typically include a calculating unit operable to identify the size, shape and contours of an area being touched based on a pattern of obstructed light. See applicant's U.S. patent application Ser. No. 10/494,055, now U.S. Pat. No. 7,880,732, titled ON A SUBSTRATE FORMED OR RESTING DISPLAY ARRANGEMENT, the contents of which are incorporated herein by reference. When a finger or flexible object is used as a touch object, as additional pressure is applied to the touch surface, the contact area of the finger or object touching the screen increases. Thus, additional pressure is detected as an increase in the size and contours of the contact area.
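A minimal sketch of this contact-area heuristic follows; the sampling scheme and the growth threshold are illustrative assumptions, not values disclosed herein:

```python
# Sketch of pressure detection on a light-based touch screen: pressing
# harder flattens the fingertip, enlarging the pattern of obstructed
# light, so a growing contact area is read as added pressure.
# The 30% growth threshold is an illustrative assumption.

def is_pressing(area_samples, growth_threshold=1.3):
    """Return True when the touched area has grown enough, relative to
    the area measured at first contact, to count as a press."""
    if len(area_samples) < 2:
        return False
    return area_samples[-1] >= area_samples[0] * growth_threshold

assert not is_pressing([52, 54, 55])   # light touch: area roughly stable
assert is_pressing([52, 60, 75])       # added pressure: area grows ~44%
```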
Alternatively, or in combination with the above, the touch sensitive area features both a touch screen operable to identify a touch location on the screen and a pressure sensor operable to detect pressure applied to the screen but not sensitive to the location of the object applying the pressure.
Other aspects of the present invention relate to convenient arrangement and function of icons to perform popular functions within a user interface. Thus, a camera gadget features a multimedia messaging service (MMS) button facilitating sending an active photo in an MMS message; a keylock gadget locks the computer and displays an instrumental keypad for entering a musical code to unlock the computer; and a reporting gadget displays information for a first period of time and is then automatically deactivated. Several reporting gadgets are provided, including a gadget that displays the time of day, a gadget that displays a weather forecast, and a gadget that displays stock market information.
According to still further features in preferred embodiments, the reporting gadget continues to display its information for a second period of time if the gadget is touched during the first period of time; that is, the automatic deactivation after the first period of time is canceled.
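The timer behavior of the reporting gadget can be sketched as follows, assuming (as one possible reading) that the second period is counted from the touch; the durations are illustrative:

```python
# Sketch of the reporting gadget's display timer: the gadget shows its
# information for a first period, and a touch during that period
# cancels the automatic deactivation, granting a second viewing
# period. All times are in seconds and are illustrative assumptions.

def deactivation_time(activated_at, first_period, second_period,
                      touch_time=None):
    deadline = activated_at + first_period
    if touch_time is not None and touch_time < deadline:
        # Touched during the first period: extend display for the
        # second period, measured here from the touch.
        deadline = touch_time + second_period
    return deadline

assert deactivation_time(0.0, 5.0, 10.0) == 5.0                  # untouched
assert deactivation_time(0.0, 5.0, 10.0, touch_time=3.0) == 13.0 # extended
assert deactivation_time(0.0, 5.0, 10.0, touch_time=6.0) == 5.0  # too late
```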
The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
Touch screen keypads offer great flexibility in keypad interface design—flexibility that cannot be achieved with electro-mechanical keypads. Custom-designed keypads can be generated on a touch screen, where the markings on each key and the functions that each key provides are optimized for a designated application. Moreover, touch screen keypads can change modes, from one pad of keys and associated functions to a different pad of keys and associated functions. Custom keypads are of particular advantage for multi-lingual applications.
A general description of touch screen keypad interfaces, in accordance with embodiments of the present invention, and several examples thereof, are described in detail hereinbelow.
Embodiments of the present invention relate to improved keypads for inputting Chinese characters using XT9 stroke input, and using Chinese Pinyin. XT9 stroke input builds Chinese characters using six basic strokes, and offers selection of possible characters and phrases based on a set of strokes that have been input. Chinese Pinyin uses Latin characters that transliterate a sound or a syllable, in combination with a digit that represents an intonation or inflection. E.g., ma in a high level tone is m-a-1, and ma in a rising tone is m-a-2.
Reference is now made to
In accordance with an embodiment of the present invention, keypad 100 is generated and displayed on a touch screen. Keypad 100 has fewer than the standard 12 keys in a touch pad, allowing more room on screen for displaying characters.
Further in accordance with an embodiment of the present invention, the keys of keypad 100 are customized so that they contain only relevant information. For example, a prior art keypad displays a digit, 3-4 characters, and a basic Chinese stroke, all inside one key, even though in XT9 stroke mode the basic Chinese stroke is the only useful one. The custom keys of the present invention display only the basic Chinese strokes, or the strokes and numbers, but no characters.
There are two types of key presses supported by keypad 100—regular and long. A regular key press adds the stroke shown on the key to the series of strokes 101-106 already pressed. As strokes are successively entered, a numbered array 121 of Chinese characters or phrases is dynamically displayed along the top of the keypad. These characters or phrases are predicted based on the key presses already entered. In order to select one of the numbered elements of array 121, the user performs a long key press on that number. Alternatively, the user may keep entering strokes until only one option remains.
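The stroke-by-stroke narrowing of the predicted characters can be sketched as prefix filtering over a stroke dictionary. The stroke names and the tiny dictionary below are illustrative assumptions, not actual XT9 data:

```python
# Sketch of XT9-style stroke prediction: each entered stroke narrows
# the candidate list to characters whose stroke sequence begins with
# the strokes typed so far. The dictionary is a tiny illustrative
# stand-in for a real XT9 database.

STROKE_DICT = {
    "horizontal,vertical": "十",
    "horizontal,vertical,horizontal": "土",
    "horizontal,horizontal,vertical": "干",
}

def predict(strokes_entered):
    prefix = ",".join(strokes_entered)
    return [ch for seq, ch in STROKE_DICT.items()
            if seq == prefix or seq.startswith(prefix + ",")]

# One stroke: all three candidates remain; further strokes narrow the
# list until the user selects a candidate or one option remains.
assert len(predict(["horizontal"])) == 3
assert predict(["horizontal", "horizontal"]) == ["干"]
```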
Often, however, more predicted characters or phrases exist than can be displayed along the top of the keypad. The N2 phone, manufactured by Neonode of Stockholm, Sweden, has a joystick button at the bottom of the phone. Twitching the joystick up/down displays different sets of predicted characters or phrases. When the intended character or phrase is displayed and associated with a given digit, a long press on that digit serves to select the intended character or phrase.
Reference is now made to
In accordance with an embodiment of the present invention, keypad 200 uses accent characters, rather than digits, to convey an intended intonation or inflection. Further in accordance with an embodiment of the present invention, keypad 200 displays only information relevant for Pinyin input on each key; no Chinese basic strokes are shown.
Pinyin input involves two types of selection. A user enters a Latin transliteration of an intended word using the Latin keypad input (12 keys). For each key, several letters are possible, and the list of predicted Latin syllables based on the current sequence of keypad presses is displayed. Twitching the joystick right or left selects the desired combination. In addition, a series of predicted Chinese characters or phrases is displayed, and a character or phrase is selected by a long press on its respective digit. Twitching the joystick up/down displays other predicted Chinese characters or phrases. Entering a space after a series of letters indicates the end of the previous character or phrase.
In accordance with an embodiment of the present invention, the user is able to combine stroke and Pinyin input, and compose a sequence of at least two Chinese characters using XT9 stroke input for at least one character and Pinyin input for at least one other character. The user switches between XT9 stroke input mode and Pinyin input mode by performing a sweeping motion in relation to the touch screen, such as, inter alia, sweeping a finger across the top of the touch screen. The series of at least two Chinese characters may be a text message, a name, a data entry, or any other such input.
Further in accordance with an embodiment of the present invention, the user is able to compose a series of at least one Chinese character and at least one non-Chinese term, wherein the non-Chinese term includes at least one Latin character, digit, emoticon, punctuation mark, another non-Chinese symbol, or any combination thereof. The series is composed by switching input modes, among alphabetic, Chinese and digit input, by sweeping across the top of the touch screen. For example, the user may input at least one Chinese character using either Pinyin or stroke input, or a combination thereof. The user may then perform a sweeping motion in relation to the touch screen, such as sweeping a finger across the top of the touch screen, to change the input mode to English. In this mode, the keypad presents Latin characters, and the user proceeds to input Latin characters using the Latin keypad displayed on the touch screen. Alternatively, the user may repeat a series of sweeping motions; e.g., the user sweeps a finger across the top of the touch screen repeatedly, changing the input mode with each sweeping motion, until a digit keypad is displayed and digit input mode is active. The user may then proceed to enter at least one digit, adding the at least one digit to the series of Chinese characters already contained in the message. It will thus be appreciated that the user may switch between different input modes while composing a single message, a command, a name, a data entry or another such input, including at least two different types of characters, in a simple and convenient manner.
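The sweep-driven cycling through input modes can be sketched as advancing through a fixed list, one mode per sweep. The particular mode list is an illustrative assumption:

```python
# Sketch of sweep-driven mode switching: each sweep across the top of
# the screen advances to the next keypad in a fixed cycle, so no
# on-screen key is spent on mode switching. The mode list is an
# illustrative assumption.

MODES = ["stroke", "pinyin", "latin", "digits"]

def next_mode(current):
    return MODES[(MODES.index(current) + 1) % len(MODES)]

mode = "stroke"
mode = next_mode(mode)    # one sweep: Pinyin input
mode = next_mode(mode)    # second sweep: Latin keypad
mode = next_mode(mode)    # third sweep: digit keypad
mode = next_mode(mode)    # fourth sweep wraps back to stroke input
```

Repeating the sweep thus reaches any mode in the cycle, which is how a user mixes stroke, Pinyin, Latin and digit input within a single message.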
Further in accordance with an embodiment of the present invention, a keypad displaying emoticons is displayed. In this mode, the user may select an emoticon to be entered into the text of a message, or such other input.
Yet further in accordance with an embodiment of the present invention, drawings, including inter alia, emoticons, are constructed in a similar manner to XT9 stroke input. In this mode, the user interface displays the basic building blocks for the drawing, such as a curve, a semicolon, a circle, and other symbols. As the user taps multiple symbols, possible drawings or emoticons that can be formed using the selected elements are displayed, and the user may either select the desired complete drawing or emoticon from the displayed list, or may continue entering additional building blocks until only one option remains. This mode of input is convenient as the number and size of the keys presented is optimized for the number of available building blocks, and each key only displays information relevant for the active input mode.
Embodiments of the present invention provide methods and systems for enabling multiple input modes, whereby the screen display in each input mode is optimized for that mode. Optimizations include (i) configuring the number of keys displayed, (ii) configuring the size, position and shape of the keys in relation to the screen, (iii) configuring the size, position and shape of the area of the display showing text already entered, (iv) configuring the size, position and shape of the area of the display showing possible completions for the current character, phrase or symbol, and (v) displaying only at least one character, symbol, digit or other figure that is relevant to the active input mode on each key.
Embodiments of the present invention also provide methods and systems for enabling multiple input modes and switching between the input modes by performing a sweeping motion in relation to the screen. These methods and systems are easier and more convenient than using a menu interface to switch input modes. Additionally, these methods do not use up screen space on a dedicated key for switching between input modes; as such, screen space may be used for information related to the current input mode and the current text. These methods and systems also enable switching between many input modes, by repeating the sweeping motion to advance through the input modes. In the N2 phone manufactured by Neonode AB, this mode of switching active keypads is used with standard 12-key keypads, for example switching between numeric, Swedish and English keypads; however, switching the layout of the keys is not used in the Neonode N2 phone. Moreover, other, prior art methods of switching between input modes require more than one action to advance through a list of input modes.
Reference is now made to FIGS. 3 and 9-45, which include several exemplary touch screen interfaces. Shown in
Problem: it is difficult to hit the intended item on a touch display because, as the finger moves toward or away from the surface, it also tends to shift in the X-Y direction by mistake, especially during one-handed use. According to embodiments of the present invention, the solution to this problem is to use a conventional touch technology to find the object's X-Y position, add a tactile or graphical indication of where the finger is, and add a force sensor to the display to read the activation. Thus, according to embodiments of the present invention, selection and subsequent activation of an on-screen element, such as, inter alia, a gadget, a letter, a key, a button or an icon, is implemented by two primary features. The first feature, referred to as pressure sensing, provides the touch screen with one or more pressure sensors operative to determine contact with the screen by an external object such as a finger. In one embodiment, the user places a finger or a stylus over the desired on-screen element to select it, as illustrated in
The second feature, referred to as callout balloons, indicates a selected on-screen element to the user by displaying a graphic, such as by (i) enlarging the on-screen element, (ii) displaying a callout balloon with a copy of the on-screen element inside, or (iii) a combination thereof. The callout balloon is illustrated in
In accordance with an embodiment of the present invention, touch screens for phones captioned “Power ON/Power OFF” are shown in
Additionally in accordance with an embodiment of the present invention, touch screens for phones captioned “Key lock” are shown in
Further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Key lock high security” is shown in
Yet further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Home/return from application” is shown in
Moreover in accordance with an embodiment of the present invention, a touch screen for a phone is shown in
In
In
In
Additionally in accordance with an embodiment of the present invention, a touch screen for a phone is shown in
After the user scrolls to reveal additional gadgets and a specific time limit has elapsed, e.g., 3 seconds, the screen automatically returns to a default display, hiding recently exposed gadgets and revealing previously hidden gadgets. Such return to default display is of advantage for revealed gadgets that do not require extended viewing, such as, inter alia, a clock or a weather report. The return to default display obviates the need for the user to perform an action in order to return the screen to its initial display. Activating the home key after scrolling also returns the screen to its initial display, showing the originally displayed gadgets and hiding the recently revealed gadgets. Such return to initial display is graphically presented (i) by restoring the original display at once, (ii) by fading in the original display over the current scrolled display, (iii) by gradually displacing the scrolled display with the original display, (iv) by graphically scrolling the display in reverse to the original display, or (v) by a combination of such presentations. The user puts a finger on the scroll bar handle on the screen and drags down to display more gadgets down the page, and uses the Home button to get back to the home area. As an optional feature, the user puts a finger on the scroll bar handle and drags up to display upper gadgets that are less frequently used; the screen is automatically scrolled back three seconds after the finger is removed.
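The automatic return to the default display can be sketched as a simple timeout check; the 3-second delay follows the example above, and the function and its parameters are illustrative assumptions:

```python
# Sketch of the automatic return-to-default behavior: after a scroll,
# a timer restores the default display unless the home key restores
# it first. Times are in seconds; the 3-second delay follows the
# example in the text.

RETURN_DELAY = 3.0

def display_state(scroll_end_time, now, home_pressed=False):
    if home_pressed:
        return "default"      # home key restores the display at once
    if now - scroll_end_time >= RETURN_DELAY:
        return "default"      # timer expired: scroll back automatically
    return "scrolled"         # recently revealed gadgets still shown

assert display_state(10.0, 11.5) == "scrolled"
assert display_state(10.0, 13.2) == "default"
assert display_state(10.0, 10.5, home_pressed=True) == "default"
```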
Further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Fav 5” is shown in
Further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “History” is shown in
Yet further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Telephone” is shown in
Moreover in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Active call” is shown in
Additionally in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Volume” is shown in
Additionally in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Mp3 player” is shown in
Further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Operator ad” is shown in
Yet further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Video” is shown in
Yet further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Camera” is shown in
Yet further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Messaging SMS” is shown in
The effect of the scrolling activity is illustrated in
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention as set forth in the appended claims. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
Claims
1-20. (canceled)
21. An electronic device comprising:
- a housing;
- a display mounted in said housing;
- one or more sensors mounted in said housing detecting contact above said display;
- a processor mounted in said housing, connected to said display and receiving information from said sensors; and
- a user interface accessing a plurality of applications, each application running on the electronic device in an activated state and in a non-activated state whereby, in its activated state, the application presents a graphical user interface (GUI) on said display and runs interactively via the GUI, and, in its non-activated state, the application presents a gadget on said display and runs non-interactively to present, within the gadget, dynamically generated information related to the application, wherein the user interface: initializes each application in its non-activated state when the electronic device is turned on; arranges the gadgets in a layout that is larger than said display, whereby some of the gadgets are in the display and others of the gadgets are out of the display; causes an application in its non-activated state to transition to its activated state when said sensors detect tapping that application's gadget; and alters the layout when an application transitions to its activated state, comprising: displacing some, but not all, of the gadgets from in said display to out of said display; and replacing the displaced gadgets with a window for the activated application's GUI.
22. The electronic device of claim 21, wherein the user interface prevents two different applications from running in their activated states simultaneously.
23. The electronic device of claim 21, wherein the plurality of applications comprises a reporting application that, upon activation, displays information for a first period of time within a window for the reporting application's GUI, and then automatically transitions to its non-activated state, and wherein during the automatic transitioning, the user interface:
- replaces, within the layout, the window for the reporting application's GUI with the reporting application's gadget; and
- moves other gadgets within the layout to fill space previously occupied by that window.
24. The electronic device of claim 23, wherein the reporting application, in its non-activated state, dynamically displays a time of day within its gadget.
25. The electronic device of claim 23, wherein the reporting application, in its non-activated state, dynamically displays a weather forecast within its gadget.
26. The electronic device of claim 23, wherein the reporting application, in its non-activated state, dynamically displays stock market information within its gadget.
27. The electronic device of claim 23, wherein, in response to said sensors detecting a touch of the window for the reporting application's GUI where information is being displayed, during the first period of time, the reporting application remains in its activated state for a second period of time.
28. The electronic device of claim 21, further comprising a home button on said housing, connected to said processor, wherein an initial portion of the layout is presented on said display when the electronic device is turned on, wherein the user interface pans the layout within said display to bring some of the gadgets into said display or to move some of the gadgets out of said display, in response to a multi-step gesture detected by said sensors comprising (i) an object touching a gadget displayed on said display, and then (ii) the object gliding along said display away from the touched location, and wherein the user interface restores the initial portion of the layout to said display in response to actuation of the home button.
29. The electronic device of claim 28 wherein the user interface causes an activated application to transition to its non-activated state, in response to actuation of the home button.
30. The electronic device of claim 28, wherein the user interface powers off the electronic device in response to prolonged actuation of the home button.
31. The electronic device of claim 28, wherein the user interface pans the layout within said display in a specific direction when the gliding in the multi-step gesture is not parallel to that direction.
32. The electronic device of claim 21, wherein the plurality of applications comprises a music player application that, in its non-activated state, dynamically displays within its gadget information about a current song.
33. The electronic device of claim 21, wherein the plurality of applications comprises a telephone application, comprising a keypad and an address book, that, in its non-activated state, displays within its gadget information about previous calls.
34. The electronic device of claim 21, wherein the window for an activated application's GUI covers most of the display.
35. The electronic device of claim 21, wherein the user interface displaces gadgets in only one direction within the layout in order to accommodate space for the window for an activated application's GUI.
36. The electronic device of claim 21, wherein the user interface displaces gadgets within the layout on one side of the window for an activated application's GUI, and does not displace gadgets within the layout on another side of that window, in order to accommodate space for that window.
37. The electronic device of claim 21, wherein the housing is a mobile phone housing and the device is a mobile phone.
38. A non-transitory computer readable medium storing program code which, when executed by a processor of an electronic device comprising a display, the processor receiving information from one or more sensors operative to detect contact above the display, causes a user interface to access a plurality of applications, each application running on the electronic device in an activated state and in a non-activated state whereby, in its activated state, the application presents a graphical user interface (GUI) and runs interactively via the GUI, and, in its non-activated state, the application presents a gadget and runs non-interactively to present, within the gadget, dynamically generated information related to the application, wherein the user interface:
- initializes each application in its non-activated state when the electronic device is turned on;
- arranges the gadgets in a layout that is larger than the display, whereby some of the gadgets are in the display and others of the gadgets are out of the display;
- causes an application in its non-activated state to transition to its activated state, when the sensors detect tapping that application's gadget; and
- alters the layout when an application transitions to its activated state, comprising: displacing some, but not all, of the gadgets from in the display to out of the display; and replacing the displaced gadgets with a window for the activated application's GUI.
39. The computer readable medium of claim 38, wherein the user interface prevents two different applications from running in their activated states simultaneously.
40. The computer readable medium of claim 38, wherein the plurality of applications comprises a reporting application that, upon activation, displays information for a first period of time within the window for the reporting application's GUI, and then automatically transitions to its non-activated state, and wherein during the automatic transitioning, the user interface:
- replaces, within the layout, the window for the reporting application's GUI with the reporting application's gadget; and
- moves other gadgets within the layout to fill space previously occupied by that window.
41. The computer readable medium of claim 40, wherein the reporting application, in its non-activated state, dynamically displays a time of day within its gadget.
42. The computer readable medium of claim 40, wherein the reporting application, in its non-activated state, dynamically displays a weather forecast within its gadget.
43. The computer readable medium of claim 40, wherein the reporting application, in its non-activated state, dynamically displays stock market information within its gadget.
44. The computer readable medium of claim 40, wherein, in response to the sensors detecting a touch of the window where the information is being displayed, during the first period of time, the reporting application remains in its activated state for a second period of time.
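Claims 40 and 44 describe a timed activation cycle: a reporting application displays information for a first period, then automatically transitions back to its non-activated state, unless a touch during the first period keeps it activated for a second period. A minimal sketch of that state machine, with all class and attribute names being illustrative rather than taken from the patent:

```python
class ReportingApp:
    """Hypothetical sketch of the timed behavior in claims 40 and 44."""

    def __init__(self, first_period: float, second_period: float):
        self.first_period = first_period
        self.second_period = second_period
        self.active = False
        self.activated_until = 0.0

    def activate(self, now: float) -> None:
        # Upon activation, information is displayed for a first period.
        self.active = True
        self.activated_until = now + self.first_period

    def on_touch(self, now: float) -> None:
        # Claim 44: a touch of the window during the first period keeps
        # the application activated for a second period.
        if self.active and now < self.activated_until:
            self.activated_until = now + self.second_period

    def tick(self, now: float) -> None:
        # Claim 40: automatic transition to the non-activated state; the
        # window is replaced by the gadget and other gadgets fill the space.
        if self.active and now >= self.activated_until:
            self.active = False
```

With a first period of 5 and a second period of 10, a touch at time 4 would extend activation until time 14, after which the automatic transition occurs.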
45. The computer readable medium of claim 38, wherein the plurality of applications comprises a music player application that, in its non-activated state, dynamically displays within its gadget information about a current song.
46. The computer readable medium of claim 38, wherein the plurality of applications comprises a telephone application, comprising a keypad and an address book, that, in its non-activated state, displays within its gadget information about previous calls.

47. The computer readable medium of claim 38, wherein the user interface displaces gadgets within the layout on one side of the window for an application's GUI and does not displace gadgets within the layout on another side of that window, in order to accommodate space for that window.
48. An electronic device comprising:
- a housing;
- a display mounted in said housing;
- a touch sensor detecting contact above said display;
- a force sensor distinguishing between different amounts of pressure applied to said display from above;
- a processor mounted in said housing connected to said display, receiving information from said touch sensor and from said force sensor; and
- a user interface comprising a plurality of displayed activatable controls, wherein the user interface selects a control in response to the touch sensor detecting an object touching that control on said display, and wherein the user interface activates a thus-selected control in response to the force sensor detecting an increase in pressure applied to the display by the object from above.
49. The electronic device of claim 48, wherein the user interface displays a callout balloon containing a copy of a thus-selected control, outside the location touched by the object.
50. The electronic device of claim 49, wherein the location of the callout balloon is configurable to be displayed on one side of the object touching the control when that object is a finger of the right hand, and to be displayed on an opposite side of the object touching the control when that object is a finger of the left hand.
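Claims 48 through 50 describe a two-stage interaction: touching a control selects it and displays a callout balloon containing a copy of the control away from the finger, and a subsequent increase in pressure, detected by the force sensor, activates the selected control; the balloon's side is configurable for right- or left-handed use. The following sketch assumes separate touch and pressure callbacks; `PressureUI` and its methods are hypothetical names for illustration only.

```python
class PressureUI:
    """Hypothetical sketch of the select/activate behavior in claims 48-50."""

    def __init__(self, handedness: str = "right"):
        self.handedness = handedness  # claim 50: configurable balloon side
        self.selected = None
        self.activated = None
        self.last_pressure = 0.0

    def on_touch(self, control: str, pressure: float) -> str:
        # Claim 48: touching a control selects it.
        self.selected = control
        self.last_pressure = pressure
        # Claims 49-50: a callout balloon with a copy of the control is
        # shown beside the finger, on the side opposite the touching hand.
        side = "left" if self.handedness == "right" else "right"
        return f"balloon('{control}') shown on {side} of finger"

    def on_pressure(self, pressure: float) -> None:
        # Claim 48: an increase in applied pressure, detected by the
        # force sensor, activates the previously selected control.
        if self.selected is not None and pressure > self.last_pressure:
            self.activated = self.selected
        self.last_pressure = pressure
```

In this reading, a light touch never activates anything by itself; only the detected pressure increase commits the action, which is the distinction the force sensor of claim 48 makes possible.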
Type: Application
Filed: Oct 17, 2015
Publication Date: Apr 7, 2016
Inventors: Magnus Goertz (Lidingo), Joseph Shain (Rehovot)
Application Number: 14/886,048