TOUCH SCREEN ELECTRONIC DEVICE AND ASSOCIATED USER INTERFACE
An electronic device including a display, sensors detecting contact above the display, a processor receiving information from the sensors, and a user interface accessing a plurality of applications, each application running on the electronic device in an activated state and in a non-activated state whereby, in its activated state, the application presents a graphical user interface (GUI) on the display and runs interactively via the GUI, and, in its non-activated state, the application presents a gadget on the display and runs non-interactively to present, within the gadget, dynamically generated information related to the application, wherein the user interface initializes each application in its non-activated state when the electronic device is turned on, and alters the layout when an application transitions to its activated state, by displacing some gadgets from within the display to outside the display, and replacing the displaced gadgets with a window for the activated application's GUI.
This application claims priority to and is a continuation of U.S. patent application Ser. No. 14/886,048, which is a continuation of U.S. patent application Ser. No. 12/486,033, now U.S. Pat. No. 9,164,654, which claims priority to U.S. Provisional Patent Application No. 61/132,469 and is a continuation-in-part of U.S. patent application Ser. No. 10/315,250, now U.S. Pat. No. 8,095,879, the contents of all of which are incorporated herein by reference in their entirety.
FIELD OF THE INVENTION
The field of the present invention is user interfaces for electronic devices and, more particularly, touch screen user interfaces.
BACKGROUND OF THE INVENTION
Touch screens provide user interfaces through which a user enters input to a computing device by touching the screen at a selected location with a stylus or a finger.
Conventional touch screens are limited as to the types of user inputs that they can recognize. For example, conventional touch screens are unable to distinguish between a soft tap and a hard press. In some prior art embodiments users initially select an item on the screen, and then subsequently activate the selected item. However, because prior art touch screens do not distinguish between degrees of pressure, the user is required to remove his finger or stylus from the screen and activate his selection with a second tap. It would be advantageous to produce touch screens that distinguish between varying amounts of pressure applied to the screen so that a user can select an item by touching its location on the screen, and then activate the item by applying additional pressure to the touch location without having to first remove his finger or stylus from the screen.
In conventional touch screens the keys are often small relative to the touch area, especially in handheld devices. The keys are also often situated close together. This can make it difficult to determine which key is being pressed by the user. It would be advantageous to clearly indicate to the user which key the user has selected, and furthermore, to allow the user to accept or reject the selected key without first removing his finger or stylus from the screen in order to perform a second tap.
Current user interfaces are basic and often require navigation through a series of menus in order to perform a desired operation. The present invention presents a user interface that is versatile in providing the user with many options, while requiring only few selections to activate a desired function. To further enhance user experience, certain functions are performed automatically without requiring the user to enter a selection.
SUMMARY OF THE DESCRIPTION
Aspects of the present invention relate to user interfaces designed for use with a touch screen. The present invention relates to computer readable media storing computer programs with computer program code, which, when read by a computer unit, allows the computer to present a user interface for the computer unit.
In accordance with embodiments of the present invention, the computer unit features a touch sensitive display area. According to preferred embodiments, an initial display configuration presents a plurality of gadgets on the display. Each gadget is a small area of the screen that indicates which functions the gadget will perform when activated by the user. When a gadget is activated, typically by the user touching the area of the display on which the gadget is displayed, the gadget increases in size and provides the user with icons and information about the gadget's functions. Significantly, the gadget does not cover the entire display area. Thus, when a gadget, or even a plurality of gadgets, is active, the primary display of all available gadgets is still accessible. This primary display can be compared to a desktop in computer operating system user interfaces. However, this primary display in the user interface of the present invention is not the same as a desktop, where active windows can cover icons on the desktop. In the present invention, gadgets are arranged in a manner such that open gadgets do not cover other gadgets. Rather, when an open gadget expands in size, other gadgets are shifted to make room for the expanded gadget. This allows the user to scroll the primary display or desktop to view any gadget. In the context of the present invention, this primary display area that includes both open and closed gadgets is called the home window. The user scrolls the home window to view gadgets that are shifted beyond the viewable area of the display. An expanded, or activated, gadget has an expanded window, but often the visible portion of that window is not large enough to display everything contained in it. To view contents of the gadget not displayed in the visible portion of the gadget window, the user scrolls the gadget window. Thus, two different scroll operations are provided: scrolling the home window and scrolling a gadget window.
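The layout behavior described above, in which an expanding gadget shifts its neighbors rather than covering them, can be sketched as follows. This is a minimal illustrative Python sketch; the `Gadget` class and `reflow` function are hypothetical names introduced for illustration, not part of the described device.

```python
class Gadget:
    def __init__(self, name, height):
        self.name = name
        self.height = height
        self.y = 0  # top coordinate within the home window

def reflow(gadgets, gap=4):
    """Stack gadgets vertically so that no gadget covers another."""
    y = 0
    for g in gadgets:
        g.y = y
        y += g.height + gap
    return y  # total home window height (may exceed the display height)

gadgets = [Gadget("clock", 40), Gadget("weather", 40), Gadget("mp3", 40)]
reflow(gadgets)

# Activating the weather gadget enlarges it; reflowing pushes "mp3"
# further down instead of letting the expanded window cover it.
gadgets[1].height = 120
total = reflow(gadgets)
```

Because the total height returned by the reflow may exceed the display height, scrolling the home window, as described above, brings the shifted gadgets into view.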
According to one embodiment, scrolling is executed by gliding a finger or stylus along the touch screen to shift the active display area of the home window or of the gadget. The scrolling affects the home window if the finger glide began in an area of the screen that does not belong to an active gadget; the scroll affects an active gadget window if the finger glide began inside that active gadget window.
Various embodiments of the invention support several methods of scrolling a window. According to one embodiment, scrolling is done when the user touches the display inside the window area, for example at an edge of the window, or on an icon, such as an arrow or scrollbar, indicating a scroll operation. According to another embodiment, scrolling is done by the user touching the window with a finger or stylus and then gliding the finger or stylus along the touch sensitive screen in a direction indicating the desired direction of the scroll. When the content of the home display is larger than the actual display screen in both dimensions, this operation is like panning an image or map. When the content is larger along only one axis (e.g., only vertically), scrolling is confined to that axis even when the glide is not exactly parallel to it.
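The distinction drawn above between two-dimensional panning and single-axis scrolling can be illustrated with a short sketch. The function name and parameters are illustrative assumptions; the sketch only shows the constraint logic, not the actual device implementation.

```python
def scroll_delta(glide_dx, glide_dy, content_w, content_h, view_w, view_h):
    """Pan freely when content overflows both axes; otherwise confine
    the scroll to the single overflowing axis, even for diagonal glides."""
    dx = glide_dx if content_w > view_w else 0
    dy = glide_dy if content_h > view_h else 0
    return dx, dy
```

For example, a diagonal glide over content that overflows only vertically produces a purely vertical scroll, while the same glide over content larger in both dimensions pans in both directions.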
Another aspect of the present invention relates to computer readable media storing a computer program with computer program code, which, when read by a mobile handheld computer unit, allows the computer to present a user interface for the mobile handheld computer unit. The user interface features a touch sensitive area in which representations of a plurality of keys are displayed, and each key is mapped to a corresponding location in the touch sensitive area at which the representation of the key is displayed. A key in this context includes, inter alia, alphabetic keys such as in a QWERTY keypad, numeric keys and also icons representing programs or functions. A key is selected, but not activated, when an object touches the corresponding location. This intermediate status of being selected but not activated makes it easier for the user to subsequently activate a desired key and avoid activating a neighboring key that the user selected but does not wish to activate. A selected key is activated when the object touching it applies additional pressure to the key location.
According to preferred embodiments of the invention, when a key is selected, the user interface generates a secondary representation of the key, such as a callout balloon containing the key representation. The callout balloon is placed away from the key location (being touched) so that the user can easily view which key is selected without lifting his finger. According to another embodiment, an audio representation of the selected key is generated so the user hears which key was selected.
According to still further features in preferred embodiments of the invention, the user touches the screen (with a finger or stylus) at a first location, for example selecting a first key. The user then glides his finger or stylus over the screen to additional locations. At each additional location a new key is selected and the previously selected key is deselected. The user can activate any selected key by applying additional pressure to the screen. The user does not have to remove the object from the screen to glide and select additional keys even after activating a first key.
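The select-by-glide and activate-by-pressure behavior described above can be sketched as a small state tracker. All names here (`KeypadTracker` and its methods) are illustrative assumptions; the source does not specify the device logic at this level of detail.

```python
class KeypadTracker:
    """Tracks which key is under the touch point: a key is selected on
    touch or glide, and activated only when extra pressure is applied."""

    def __init__(self, key_at_location):
        self.key_at_location = key_at_location  # maps (x, y) -> key or None
        self.selected = None
        self.activated = []

    def on_touch(self, x, y):
        # Selecting the key under the new position deselects the prior key.
        self.selected = self.key_at_location(x, y)

    on_glide = on_touch  # gliding re-selects continuously

    def on_pressure(self):
        # The finger need not be lifted between activations.
        if self.selected is not None:
            self.activated.append(self.selected)
```

In this sketch the user can activate a key, keep gliding to select further keys, and activate those as well, all in a single continuous touch.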
Additional touch pressure is detected in various ways according to several embodiments. According to one embodiment, the touch sensitive area is a light-based touch screen operable to detect different levels of touch pressure. For example, light-based touch screens typically include a calculating unit operable to identify the size, shape and contours of an area being touched based on a pattern of obstructed light. See applicant's U.S. patent application Ser. No. 10/494,055, now U.S. Pat. No. 7,880,732, titled ON A SUBSTRATE FORMED OR RESTING DISPLAY ARRANGEMENT, the contents of which are incorporated herein by reference. When a finger or flexible object is used as a touch object, the contact area of the finger or object touching the screen increases as additional pressure is applied to the touch surface. Thus, additional pressure is detected when an increase in the contours of the contact area is detected.
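A minimal sketch of detecting additional pressure from growth of the contact area follows, assuming the touch screen reports the size of the obstructed-light contour. The 1.3 growth ratio is an assumed threshold for illustration, not a value given in the description.

```python
def added_pressure(prev_area, curr_area, threshold=1.3):
    """A fingertip flattens as it presses harder, so the obstructed-light
    contour grows; a growth ratio above the (assumed) threshold is read
    as a hard press rather than a mere selection touch."""
    return curr_area > prev_area * threshold
```

A soft touch followed by a modest area fluctuation would leave the key merely selected, while a pronounced growth of the contact area activates it.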
Alternatively, or in combination with the above, the touch sensitive area features both a touch screen operable to identify a touch location on the screen and a pressure sensor operable to detect pressure applied to the screen but not sensitive to the location of the object applying the pressure.
Other aspects of the present invention relate to convenient arrangement and function of icons to perform popular functions within a user interface. Thus, a camera gadget features a multimedia messaging service (MMS) button facilitating sending an active photo in an MMS message; a keylock gadget locks the computer and displays an instrumental keypad for entering a musical code to unlock the computer; a reporting gadget displays information for a first period of time and is then automatically deactivated. Several reporting gadgets are provided, including a gadget that displays the time of day, a gadget that displays a weather forecast, and a gadget that displays stock market information.
According to still further features in preferred embodiments, the reporting gadget continues to display its information for a second period of time if the gadget is touched during the first period of time. That is, the automatic deactivation after the first period of time is canceled.
The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
APPENDIX A is a copy of the original specification of U.S. Ser. No. 10/315,250 as filed on Dec. 10, 2002.
DETAILED DESCRIPTION
Touch screen keypads offer great flexibility in keypad interface design—flexibility that cannot be achieved with electro-mechanical keypads. Custom design keypads can be generated on a touch screen, where the markings on each key and the functions that each key provides are optimized for a designated application. Moreover, touch screen keypads can change modes, from one pad of keys and associated functions to a different pad of keys and associated functions. Custom keypads are of particular advantage for multi-lingual applications.
A general description of touch screen keypad interfaces, in accordance with embodiments of the present invention, and several examples thereof, are described in detail hereinbelow.
Embodiments of the present invention relate to improved keypads for inputting Chinese characters using XT9 stroke input, and using Chinese Pinyin. XT9 stroke input builds Chinese characters using six basic strokes, and offers selection of possible characters and phrases based on a set of strokes that have been input. Chinese Pinyin uses Latin characters that transliterate a sound or a syllable, in combination with a digit that represents an intonation or inflection. E.g., ma in a high level tone is m-a-1, and ma in a rising tone is m-a-2.
Reference is now made to
In accordance with an embodiment of the present invention, keypad 100 is generated and displayed on a touch screen. Keypad 100 has fewer than the standard 12 keys in a touch pad, allowing more room on screen for displaying characters.
Further in accordance with an embodiment of the present invention, the keys of keypad 100 are customized so that they contain only relevant information. For example, a prior art keypad displays a digit, 3-4 characters, and a basic Chinese stroke, all inside one key, even though in XT9 stroke mode only the basic Chinese stroke is useful. The custom keys of the present invention display only the basic Chinese strokes, or the strokes and numbers, but no characters.
There are two types of key presses supported by keypad 100—regular and long. A regular key press adds the stroke shown on the key to the series of strokes 101-106 already pressed. As strokes are successively entered, a numbered array 121 of Chinese characters or phrases is dynamically displayed along the top of the keypad. These characters or phrases are predicted based on the key presses already entered. In order to select one of the numbered elements of array 121, the user performs a long key press on that number. Alternatively, the user may keep entering strokes until only one option remains.
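The narrowing of predicted characters as strokes are entered can be sketched with a prefix lookup. The tiny dictionary below is a hypothetical stand-in (with `h` for a horizontal stroke and `v` for a vertical stroke); the real XT9 engine and its stroke data are proprietary and differ from this sketch.

```python
# Hypothetical stroke-sequence dictionary: keys are stroke codes,
# values are the characters they build (illustrative entries only).
DICTIONARY = {
    "h": "一",
    "hh": "二",
    "hhh": "三",
    "hv": "十",
}

def predict(strokes_entered):
    """Return candidate characters whose stroke sequence starts with
    the strokes entered so far; each added stroke narrows the list."""
    return [char for seq, char in DICTIONARY.items()
            if seq.startswith(strokes_entered)]
```

As in the description above, the user may keep entering strokes until only one candidate remains, or perform a long press to pick a candidate from the displayed array earlier.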
Often, however, more predicted characters or phrases exist than can be displayed along the top of the keypad. The N2 phone, manufactured by Neonode of Stockholm, Sweden, has a joystick button at the bottom of the phone. Twitching the joystick up/down displays different sets of predicted characters or phrases. When the intended character or phrase is displayed and associated with a given digit, a long press on that digit serves to select the intended character or phrase.
Reference is now made to
In accordance with an embodiment of the present invention, keypad 200 uses accent characters, rather than digits, to convey an intended intonation or inflection. Further in accordance with an embodiment of the present invention, keypad 200 displays only information relevant for Pinyin input on each key; no Chinese basic strokes are shown.
There are two types of Pinyin input. A user enters a Latin transliteration of an intended word using the Latin keypad input (12 keys). For each key, several letters are possible. The list of predicted Latin syllables based on the current sequence of keypad presses is displayed. Twitching the joystick right or left selects the desired combination. Also, a series of predicted Chinese characters or phrases is displayed and selected by a long press on a respective digit. Twitching the joystick up/down displays other predicted Chinese characters or phrases. Entering a space after a series of letters indicates the end of a previous character or phrase.
In accordance with an embodiment of the present invention, the user is able to combine stroke and Pinyin input, and compose a sequence of at least two Chinese characters using XT9 stroke input for at least one character and Pinyin input for at least one other character. The user switches between XT9 stroke input mode and Pinyin input mode by performing a sweeping motion in relation to the touch screen, such as, inter alia, sweeping a finger across the top of the touch screen. The series of at least two Chinese characters may be a text message, a name, a data entry, or any other such input.
Further in accordance with an embodiment of the present invention, the user is able to compose a series of at least one Chinese character and at least one non-Chinese term, wherein the non-Chinese term includes at least one Latin character, digit, emoticon, punctuation mark, another non-Chinese symbol, or any combination thereof. The series is composed by sweeping across the top of the touch screen to switch input modes between alphabetic, Chinese and digit input. For example, the user may input at least one Chinese character using either Pinyin or stroke input, or a combination thereof. The user may then perform a sweeping motion in relation to the touch screen, such as sweeping a finger across the top of the touch screen, to change the input mode to English. In this mode, the keypad presents Latin characters, and the user proceeds to input Latin characters using the Latin keypad displayed on the touch screen. Alternatively, the user may repeat a series of sweeping motions; e.g., the user sweeps a finger across the top of the touch screen, repeatedly, changing the input mode with each sweeping motion, until a digit keypad is displayed and digit input mode is active. The user may then proceed to enter at least one digit, adding the at least one digit to the series of Chinese characters already contained in the message. It will thus be appreciated that the user may switch between different input modes while composing a single message, a command, a name, a data entry or another such input, including at least two different types of characters, in an easy, simple and convenient manner.
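The repeated-sweep mode cycling described above reduces to advancing through a ring of input modes, one step per gesture. The mode names and class below are illustrative assumptions, not taken from the description.

```python
# Hypothetical ring of input modes; each sweep advances one step.
INPUT_MODES = ["xt9_stroke", "pinyin", "english", "digits"]

class ModeSwitcher:
    def __init__(self):
        self.index = 0

    @property
    def mode(self):
        return INPUT_MODES[self.index]

    def on_sweep(self):
        """Each sweep across the top of the screen advances one mode,
        wrapping around after the last mode."""
        self.index = (self.index + 1) % len(INPUT_MODES)
```

Because the cycle wraps, the user can always reach any mode by repeating the same gesture, without a menu and without a dedicated on-screen switching key.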
Further in accordance with an embodiment of the present invention, a keypad displaying emoticons is displayed. In this mode, the user may select an emoticon to be entered into the text of a message, or such other input.
Yet further in accordance with an embodiment of the present invention, drawings, including, inter alia, emoticons, are constructed in a similar manner to XT9 stroke input. In this mode, the user interface displays the basic building blocks for the drawing, such as a curve, a semicolon, a circle, and other symbols. As the user taps multiple symbols, possible drawings or emoticons that can be formed using the selected elements are displayed, and the user may either select the desired complete drawing or emoticon from the displayed list, or may continue entering additional building blocks until only one option remains. This mode of input is convenient because the number and size of the keys presented are optimized for the number of available building blocks, and each key displays only information relevant for the active input mode.
Embodiments of the present invention provide methods and systems for enabling multiple input modes, whereby the screen display in each input mode is optimized for that mode. Optimizations include (i) configuring the number of keys displayed, (ii) configuring the size, position and shape of the keys in relation to the screen, (iii) configuring the size, position and shape of the area of the display showing text already entered, (iv) configuring the size, position and shape of the area of the display showing possible completions for the current character, phrase or symbol, and (v) displaying only at least one character, symbol, digit or other figure that is relevant to the active input mode on each key.
Embodiments of the present invention also provide methods and systems for enabling multiple input modes and switching between the input modes by performing a sweeping motion in relation to the screen. These methods and systems are easier and more convenient than using a menu interface to switch input modes. Additionally, these methods do not use up screen space for a mode-switching key and, as such, screen space may be used for information related to the current input mode and the current text. These methods and systems also enable switching between many input modes, by repeating the sweeping motions to advance through the input modes. In the N2 phone manufactured by Neonode AB, this mode of switching active keypads is used with standard 12-key keypads, for example, inter alia, switching between numeric, Swedish and English keypads. However, switching the layout of the keys is not used in the Neonode N2 phone. Moreover, other, prior art methods of switching between input modes require more than one action to advance through a list of input modes.
Reference is now made to
A known problem is that it is difficult to hit the intended item on a touch display, because in moving the finger to and from the surface it is easy to also move it in the X-Y direction by mistake, especially in one-handed use. According to embodiments of the present invention, the solution to this problem is to use a conventional touch technology to find the object's X-Y position, add a tactile or graphical indication of where the finger is, and add a force sensor to the display to read the activation. Thus, according to embodiments of the present invention, selection and subsequent activation of an on-screen element, such as, inter alia, a gadget, a letter, a key, a button or an icon, is implemented by two primary features. The first feature, referred to as pressure sensing, provides the touch screen with one or more pressure sensors operative to determine contact with the screen by an external object such as a finger. In one embodiment, the user places a finger or a stylus over the desired on-screen element to select it, as illustrated in
The second feature, referred to as callout balloons, indicates a selected on-screen element to the user by displaying a graphic, such as by (i) enlarging the on-screen element, (ii) displaying a callout balloon with a copy of the on-screen element inside, or (iii) a combination thereof. The callout balloon is illustrated in
In accordance with an embodiment of the present invention, touch screens for phones captioned “Power ON/Power OFF” are shown in
Additionally in accordance with an embodiment of the present invention, touch screens for phones captioned “Key lock” are shown in
Further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Key lock high security” is shown in
Yet further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Home/return from application” is shown in
Moreover in accordance with an embodiment of the present invention, a touch screen for a phone is shown in
In
In
In
Additionally in accordance with an embodiment of the present invention, a touch screen for a phone is shown in
After the user scrolls to reveal additional gadgets and a specific time limit has elapsed, e.g., 3 seconds, the screen automatically returns to a default display, hiding recently exposed gadgets and revealing previously hidden gadgets. Such return to default display is of advantage for revealed gadgets that do not require extended viewing, such as, inter alia, a clock or a weather report. The return to default display obviates the need for the user to perform an action in order to return the screen to its initial display. Activating the home key after scrolling also returns the screen to its initial display, showing the originally displayed gadgets and hiding the recently revealed gadgets. Such return to initial display is graphically presented (i) by returning the original display completely, (ii) by fading in the original display over the current scrolled display, (iii) by gradually displacing the scrolled display with the original display, (iv) by graphically scrolling the display in reverse to the original display, or (v) by a combination of such presentations. The user puts a finger on the scroll bar handle and drags down to display more gadgets down the page, and uses the Home button to get back to the home area. As an optional feature, the user puts a finger on the scroll bar handle and drags up to display upper gadgets that are less frequently used. The screen is automatically scrolled back 3 seconds after the finger is removed.
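The automatic return to the default display can be sketched as a simple timeout on the scroll offset. The class and method names are illustrative assumptions; passing the current time in explicitly is a sketch convenience, not a described feature.

```python
class HomeWindow:
    TIMEOUT = 3.0  # seconds, per the "e.g., 3 seconds" example above

    def __init__(self):
        self.offset = 0          # current scroll offset; 0 = default display
        self.last_touch = None   # time of the most recent scroll touch

    def scroll(self, offset, now):
        self.offset = offset
        self.last_touch = now

    def tick(self, now):
        """Called periodically; snaps back to the default display once
        the timeout has elapsed with no further touch."""
        if self.last_touch is not None and now - self.last_touch >= self.TIMEOUT:
            self.offset = 0
            self.last_touch = None
```

Touching the screen again before the timeout refreshes `last_touch`, which mirrors the earlier feature in which touching a reporting gadget extends its display period.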
Further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Fav 5” is shown in
Further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “History” is shown in
Yet further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Telephone” is shown in
The “Telephone” gadget is shown shaded in
Moreover in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Active call” is shown in
Additionally in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Volume” is shown in
Additionally in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Mp3 player” is shown in
Further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Operator ad” is shown in
Yet further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Video” is shown in
Yet further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Camera” is shown in
Yet further in accordance with an embodiment of the present invention, a touch screen for a phone captioned “Messaging SMS” is shown in
The effect of the scrolling activity is illustrated in
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention as set forth in the appended claims. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
Claims
1-20. (canceled)
21. A portable wireless computer system, comprising:
- a top surface comprising a touch sensitive display, the top surface containing no movable button;
- a processor;
- a transceiver for sending and receiving wireless signals over a communications network;
- a solid-state memory storing computer instructions configured to: enable the portable wireless computer system to run a plurality of applications, the applications comprising a telephone, chat or SMS, a calculator, a camera, an alarm, a clock, a music player, and email; and cause the portable wireless computer system (a) to highlight a first item corresponding to a current position of an object gliding over a linear list of items on said display and to select a second item from the list at least partially based on where the object stops gliding and is lifted from said display, and (b) to present a plurality of communication options on said display for a given contact, wherein the communication options comprise call, email, and chat or SMS; and
- a housing surrounding said display and enclosing said processor, said transceiver, and said solid-state memory.
22. The portable wireless computer system of claim 21, wherein the plurality of applications comprise: (a) a first application providing options on said display for editing, deleting and sending a picture, (b) a second application and a third application capable of running simultaneously, the second application capable of being presented on top of the third application on said display, and (c) a fourth application and a fifth application capable of running simultaneously, the fourth application being a music player, and the fifth application being email, chat or SMS.
23. The portable wireless computer system of claim 22, wherein the portable wireless computer system is a mobile phone, and the computer instructions are configured to enable the portable wireless computer system to present a user interface, the user interface comprising at least two states, namely: (a) a tap-present state, wherein a plurality of tap-activatable icons for activating a plurality of corresponding pre-designated applications, functions, services, settings or tasks are present, each of the plurality of pre-designated applications, functions, services, settings or tasks being activated in response to a tap on its corresponding icon, and (b) a tap-absent state, wherein the plurality of tap-activatable icons are absent, the tap-absent state configured to be transitioned to the tap-present state in response to a multi-step user gesture comprising the object: (i) touching an edge area of said display, and then (ii) gliding on said display away from the edge area.
24. The portable wireless computer system of claim 23, wherein the plurality of tap-activatable icons represent applications comprising a telephone, chat or SMS, a camera, a music player and email.
25. The portable wireless computer system of claim 22, wherein the computer instructions are configured to activate a function in response to a multi-step user gesture comprising the object touching said display at a location corresponding to a demarcated representation of the function followed by the object gliding away from the location along said display, wherein the maximum diagonal dimension of the representation is less than a thumb's width, the representation represents only the function, and is not relocated during the multi-step user gesture, and the function is not activated differently based on a direction of the gliding.
26. The portable wireless computer system of claim 25, wherein the function is a member of the group consisting of an application, and a menu for configuring services or settings for an operations system or an application, and wherein the maximum diagonal dimension of the representation is less than one inch.
27. The portable wireless computer system of claim 26, wherein the computer instructions are configured (a) to enable the portable wireless computer system to scroll content on said display in response to the object touching a first location on said display and gliding up or down on said display from the first location, and (b) to enable the portable wireless computer system to move an application, a function, a service or a setting one step forward or backward or to close or remove an application, a function, a service or a setting on said display in response to the object touching a second location on said display and gliding to the right or to the left from the second location.
28. The portable wireless computer system of claim 25, wherein the function activated in response to the multi-step user gesture presents one or more alphanumeric characters in a keyboard user interface on said display.
29. The portable wireless computer system of claim 22, wherein the computer instructions are configured to enable the portable wireless computer system:
- (a) to enable a graphical user interface for accessing a plurality of gadgets, each gadget comprising an area containing at least a portion of dynamically generated data related to that gadget, wherein the user interface: (i) arranges the plurality of gadgets in a layout that is larger than said display, whereby some of the gadgets are in said display and others of the gadgets are out of said display; (ii) expands one of the gadgets in said display beyond an edge of said display to show more data therein by shifting other gadgets; and (iii) pans the layout within said display to bring some of the gadgets into said display and/or to move some of the gadgets out of said display, in response to the object touching one of the gadgets on said display, and then the object gliding along said display away from the touched location; and
- (b) to detect an object touching with hard pressure an on-screen element selected from the group consisting of a gadget, a letter, a key, a button or an icon and then to activate the on-screen element, thereby resulting in: (i) enlarging the on-screen element, (ii) displaying a callout balloon with a copy of the on-screen element inside, or (iii) a combination thereof.
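The gadget user interface recited in claim 29(a) — a layout larger than the display, with gadgets panned in and out of view by a touch-and-glide gesture — can be illustrated with a minimal sketch. The class, its field names, and the one-dimensional pixel model below are illustrative assumptions for exposition only; they are not part of the claims.

```python
class GadgetLayout:
    """Illustrative model of claim 29(a): a horizontal row of gadgets
    wider than the display, panned by a touch-and-glide gesture."""

    def __init__(self, gadget_widths, display_width):
        self.gadget_widths = gadget_widths  # width of each gadget, in pixels
        self.display_width = display_width  # width of the visible display
        self.offset = 0                     # leftmost visible layout x-coordinate

    def layout_width(self):
        return sum(self.gadget_widths)

    def pan(self, glide_dx):
        """Shift the layout by the glide distance (negative = glide left),
        clamped so the layout edges never pan past the display edges."""
        max_offset = max(0, self.layout_width() - self.display_width)
        self.offset = min(max_offset, max(0, self.offset - glide_dx))

    def visible_gadgets(self):
        """Indices of gadgets at least partially inside the display,
        i.e. 'some of the gadgets are in said display and others are out'."""
        visible, x = [], 0
        for i, width in enumerate(self.gadget_widths):
            if x + width > self.offset and x < self.offset + self.display_width:
                visible.append(i)
            x += width
        return visible
```

For example, four 100-pixel gadgets on a 250-pixel display initially show gadgets 0–2; gliding left by 100 pixels pans gadget 0 out of the display and gadget 3 into it, matching the claim's "bring some of the gadgets into said display and/or move some of the gadgets out of said display."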
30. A mobile phone system, comprising:
- a top surface comprising a touch sensitive display, the top surface having no movable button;
- a processor coupled to said display;
- a transceiver for sending and receiving wireless signals over a communications network;
- a solid-state memory storing computer instructions configured to enable the mobile phone system: (a) to activate a function in response to a first multi-step user gesture comprising an object touching an area corresponding to a demarcated representation of the function followed by gliding away from the area on said display, wherein the demarcated representation represents only the function and does not relocate during the first multi-step user gesture, and the function is not activated differently based on a direction of the gliding, (b) to present a plurality of functions, applications, services or settings in response to a second multi-step user gesture comprising the object touching an edge area of said display followed by gliding within said display away from the edge area, (c) to highlight a first item corresponding to a current position of the object gliding over a linear list of items on said display and to select a second item from the list at least partially based on where the object stops gliding and is lifted from said display, (d) to run a first application wherein a touch keyboard presented on said display enables a user to edit or save text in the first application, (e) to run a second application configured to provide options for editing, deleting and sending a picture on said display, (f) to present a plurality of communication options for a given contact, the communication options comprising call, email, and chat or SMS, (g) to scroll content on said display in response to the object touching a first location on said display and gliding up or down on said display from the first location, and (h) to move an application, a function, a service or a setting one step forward or backward or to close or remove an application, a function, a service or a setting on said display in response to the object touching a second location on said display and gliding to the right or to the left from the second location; and
- a housing surrounding said display and enclosing said processor, said transceiver, and said solid-state memory.
31. The mobile phone system of claim 30, wherein the computer instructions are configured to enable the mobile phone system to present a user interface, the user interface comprising at least two states, namely, (a) a tap-present state, wherein a plurality of tap-activatable icons for activating a plurality of corresponding pre-designated applications, functions, services, settings or tasks are present, each of the plurality of pre-designated applications, functions, services, settings or tasks being activated in response to a tap on its corresponding icon, and (b) a tap-absent state, wherein the plurality of tap-activatable icons are absent, the tap-absent state configured to be transitioned to the tap-present state in response to a third multi-step user gesture comprising: the object (i) touching an edge area of said display, and then (ii) gliding on said display away from the edge area.
32. The mobile phone system of claim 31, wherein the plurality of tap-activatable icons represent applications comprising a telephone, chat or SMS, a camera, a music player and email.
33. The mobile phone system of claim 30, wherein the computer instructions are configured to enable the mobile phone system to run a plurality of applications, the applications comprising: (a) a telephone, chat or SMS, a calculator, a camera, an alarm, a clock, a music player, and email; (b) a third application and a fourth application capable of running simultaneously, the third application capable of being presented on top of the fourth application on said display; and (c) a fifth application and a sixth application capable of running simultaneously, the fifth application being a music player and the sixth application being email, chat or SMS.
34. The mobile phone system of claim 30, wherein the text saved in the first application is configured for use as an address, a telephone number, or a message in a phone call, email, chat or SMS.
35. The mobile phone system of claim 30, wherein the function activated in response to the first multi-step user gesture enables an alphanumeric character to be entered using a keyboard presented on said display.
36. The mobile phone system of claim 30, wherein the computer instructions are configured to enable the mobile phone system:
- (1) to enable a graphical user interface for accessing a plurality of gadgets, each gadget comprising an area containing at least a portion of dynamically generated data related to that gadget, wherein the user interface: (a) arranges the plurality of gadgets in a layout that is larger than said display, whereby some of the gadgets are in said display and others of the gadgets are out of said display; (b) expands one of the gadgets in said display to show more data therein by shifting other gadgets; and (c) pans the layout within said display to bring some of the gadgets into said display and/or to move some of the gadgets out of said display, in response to (i) the object touching one of the gadgets on said display, and then (ii) the object gliding along said display away from the touched location; and
- (2) to detect an object touching with hard pressure an on-screen element from the group consisting of a gadget, a letter, a key, a button or an icon and then to activate the on-screen element, thereby resulting in (a) enlarging the on-screen element, (b) displaying a callout balloon with a copy of the on-screen element inside, or (c) a combination thereof.
37. A mobile phone system, comprising:
- a top surface comprising a touch sensitive display, the top surface containing no movable button;
- a processor;
- a transceiver for sending and receiving wireless signals over a communications network;
- a solid-state memory storing computer instructions configured to enable the mobile phone system: (1) to highlight a first item corresponding to a current position of an object gliding over a linear list of items on said display and to select a second item from the list at least partially based on where the object stops gliding and thereafter is lifted from said display, and (2) to provide a user interface, the user interface comprising at least two states, namely: (a) a tap-present state, wherein a plurality of tap-activatable icons for activating a plurality of corresponding pre-designated applications, functions, services, settings or tasks are present, each of the plurality of pre-designated applications, functions, services, settings or tasks being activated in response to a tap on its corresponding icon, and (b) a tap-absent state, wherein the plurality of tap-activatable icons are absent, the tap-absent state configured to be transitioned to the tap-present state in response to a first multi-step user gesture comprising: the object (i) touching an edge area of said display, and then (ii) gliding on said display away from the edge area; and
- a housing surrounding said display and enclosing said processor, said transceiver, and said solid-state memory.
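The glide-to-highlight, lift-to-select behavior over a linear list — recited in element (1) of claim 37 and element (c) of claim 30 — can be sketched with a fixed-row-height model. The function names and the 40-pixel row height are illustrative assumptions, not part of the claims.

```python
ROW_HEIGHT = 40  # assumed height of each list row, in pixels

def highlighted_index(glide_y, items):
    """First item: highlighted according to the object's current
    position while it glides over the linear list."""
    i = int(glide_y // ROW_HEIGHT)
    return min(max(i, 0), len(items) - 1)

def select_on_lift(lift_y, items):
    """Second item: selected at least partially based on where the
    object stops gliding and is lifted from the display."""
    return items[highlighted_index(lift_y, items)]
```

With rows ["call", "email", "chat"], gliding at y = 10 highlights "call", gliding on to y = 85 highlights "chat", and lifting at y = 45 selects "email" — the highlighted item tracks the glide, while selection is committed only on lift.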
38. The mobile phone system of claim 37, wherein the computer instructions are further configured to enable the mobile phone system to run a plurality of applications, the applications comprising a phone, chat or SMS, a calculator, a camera, an alarm, a clock, a music player, and email, the plurality of applications comprising: (a) a first application wherein a touch keyboard presented on said display enables a user to edit or save text in the first application, (b) a second application providing options for editing, deleting and sending a picture on said display, (c) a third application and a fourth application capable of running simultaneously, the third application capable of being presented on top of the fourth application on said display, (d) a fifth application and a sixth application capable of running simultaneously, the fifth application being a music player, and the sixth application being email, chat or SMS, and (e) a seventh application providing a plurality of communication options for a given contact, the communication options comprising call, email, and chat or SMS.
39. The mobile phone system of claim 38, wherein the instructions are configured to enable the mobile phone system: (a) to scroll content on said display in response to the object touching a first location on said display and gliding up or down on said display from the first location, (b) to move an application, a function, a service or a setting one step forward or backward or to close or remove an application, a function, a service or a setting on said display in response to the object touching a second location on said display and gliding to the right or to the left from the second location, and (c) to activate a function in response to a second multi-step user gesture comprising the object touching an area corresponding to a demarcated representation of the function followed by gliding away from the area on said display, wherein the demarcated representation represents only the function and does not relocate during the second multi-step user gesture, and the function is not activated differently based on a direction of the gliding.
40. The mobile phone system of claim 39, wherein the computer instructions are configured to enable the mobile phone system:
- (a) to enable a graphical user interface for accessing a plurality of gadgets, each gadget comprising an area containing at least a portion of dynamically generated data related to that gadget, wherein the user interface: (i) arranges the plurality of gadgets in a layout that is larger than said display, whereby some of the gadgets are in said display and others of the gadgets are out of said display; (ii) expands one of the gadgets in said display to show more data therein by shifting other gadgets; and (iii) pans the layout within said display to bring some of the gadgets into said display and/or to move some of the gadgets out of said display, in response to the object touching one of the gadgets on said display, and then the object gliding along said display away from the touched location; and
- (b) to detect an object hard-pressing an on-screen element from the group consisting of a gadget, a letter, a key, a button or an icon, and then to activate the on-screen element, thereby resulting in (i) enlarging the on-screen element, (ii) displaying a callout balloon with a copy of the on-screen element inside, or (iii) a combination thereof.
Type: Application
Filed: Feb 20, 2020
Publication Date: Jun 18, 2020
Inventors: Magnus Goertz (Lidingo), Joseph Shain (Rehovot)
Application Number: 16/796,880