User Interface Devices
A method and apparatus for a user interface ("UI") having multiple motion dots capable of detecting user inputs are disclosed. In one embodiment, a digital processing system includes a first motion dot, a second motion dot, and a device. The first motion dot can be attached to a first location of a user's body and the second motion dot may be attached to a second location of the user's body. The first motion dot, for example, includes accelerometers able to identify a physical location of the first motion dot, and the second motion dot also includes accelerometers capable of detecting an input generated based on the relative physical position between the first motion dot and the second motion dot. The device, which is logically coupled to the second motion dot via a wireless connection, is configured to store the input in a local storage.
The present application is a continuation application of co-pending U.S. patent application Ser. No. 11/952,428, entitled "User Interface Devices," filed on Dec. 7, 2007, which is hereby incorporated by reference in its entirety.
FIELD
The exemplary embodiment(s) of the present invention relates to the field of electronic communication devices. More specifically, the exemplary embodiment(s) of the present invention relates to user interface devices for portable devices.
BACKGROUND
With the increasing popularity of handheld or portable devices such as the iPhone®, Blackberry®, PDAs (Personal Digital Assistants), cellular phones, and the like, handheld devices are not only becoming more powerful, with sophisticated networking functionality, but also more compact. While portable devices can typically access ubiquitous information, such as e-mail, instant messages, VoIP (Voice over IP), video, photos, and the like, the user interface ("UI") devices for such portable devices have become less intuitive and more troublesome. Various currently available UI devices, such as touch pads (PDAs and the iPhone®) or miniature keyboards (Blackberry®), are less user-friendly and less intuitive. For example, touch pads allow larger screen areas than keyboards and provide a direct-manipulation UI.
A problem associated with a typical touch pad is that the user's hand and fingers obscure the screen when the user tries to touch the pad. For example, selecting text with a finger on a portable screen can be cumbersome.
Another problem associated with a touch pad is that it is limited to a two-dimensional implementation. For example, it can be confusing to differentiate a scrolling gesture (touching the screen and moving a finger) from a navigation gesture (simply touching the screen).
SUMMARY
A user interface ("UI") device having multiple motion dots capable of detecting user inputs is disclosed. In one embodiment, the UI device includes a first motion dot and a second motion dot. The first motion dot is capable of attaching to a first finger and the second motion dot is configured to attach to a second finger. The first finger, in one example, is a thumb and the second finger is an index finger. The first motion dot includes multiple accelerometers used for identifying the physical location of the first motion dot. In an alternative embodiment, the first motion dot further includes a gyroscope to monitor the orientation of the first motion dot attached to a finger. The second motion dot, which is logically coupled to the first motion dot via a wireless communications network, is capable of detecting a user input in response to a relative physical position between the first and the second motion dots.
Additional features and benefits of the exemplary embodiment(s) of the present invention will become apparent from the detailed description, figures and claims set forth below.
The exemplary embodiment(s) of the present invention will be understood more fully from the detailed description given below and from the accompanying drawings of various embodiments of the invention, which, however, should not be taken to limit the invention to the specific embodiments, but are for explanation and understanding only.
Embodiments of the present invention are described herein in the context of a method, system, and apparatus for communicating with a portable device using multiple motion dots.
Those of ordinary skill in the art will realize that the following detailed description of the present invention is illustrative only and is not intended to be in any way limiting. Other embodiments of the present invention will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will now be made in detail to implementations of the exemplary embodiments of the present invention as illustrated in the accompanying drawings. The same reference indicators (or numbers) will be used throughout the drawings and the following detailed description to refer to the same or like parts.
In the interest of clarity, not all of the standard hardware and routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, such as compliance with application- and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of engineering for those of ordinary skill in the art having the benefit of this disclosure.
A user interface (“UI”) device having multiple motion dots capable of detecting user inputs is disclosed. In one embodiment, the UI device includes a first motion dot and a second motion dot. The first motion dot is capable of attaching to a first finger and the second motion dot is configured to attach to a second finger. The first finger, in one example, is a thumb and the second finger is an index finger. The first motion dot includes multiple accelerometers used for identifying the physical location of the first motion dot. In an alternative embodiment, the first motion dot further includes a gyroscope to detect the orientation of the first motion dot. The second motion dot, which is logically coupled to the first motion dot via a wireless communications network, is capable of detecting a user input in response to a relative physical position or relative motion between the first and the second motion dots.
Computer or handheld device 102 includes an antenna 108 and a display screen 112, wherein display screen 112 shows a cursor 110. Display screen 112 may be a touch screen, a flat panel display, and the like. Device 102 is configured to communicate with motion dots 104 and 106 via wireless signals 120-122 over a short-range wireless communications network. The short-range wireless communications network includes a personal area network ("PAN") and/or a WLAN. A PAN includes, but is not limited to, Bluetooth, Ultra-Wideband (UWB), ZigBee, WiMax, or Ambient networks. Alternatively, device 102 also supports one or more types of communication protocols, such as TCP/IP, UDP, HTTP, SNMP, cellular (GPRS, CDMA, GSM, CDPD, 2.5G, 3G, etc.), and/or other ad-hoc/mesh network technologies. It should be noted that an advantage of using existing networking infrastructures is to enhance flexibility and to reduce implementation cost. Device 102 can be a PDA, a cellular phone, a portable device, a laptop computer, a smart phone, an iPhone®, a television, and the like.
Motion dots 104-106 form a user interface ("UI") device used to communicate with device 102 via the wireless communications networks 120-122. The motion dots, in one embodiment, have unique identifiers ("IDs") that allow device 102 to differentiate between the dots. The unique ID may be encapsulated into the input data, acceleration data, or motion data transmitted between each motion dot and device 102. Each computer or portable device recognizes the source of the data from the ID and processes the motion data in accordance with that source. Unique IDs, for example, should prevent interference when motion dots from two or more computing devices are used close together. Unique IDs may further allow collaborative work from multiple sets of motion dots with a single handheld device. For example, if two or more users, each outfitted with a pair of motion dots, were to work on a handheld device collaboratively, the device could distinguish and interpret each user's gestures and assign the appropriate command/action. In this way, two or more users can collaboratively arrange objects on the same display screen. In an alternative embodiment, collaboration can involve sharing motion information with one or more display devices. For example, two or more users use motion dots to collaborate while looking at their own screens. The collaboration feature is particularly advantageous for users located at different geographic locations. For instance, if the participants of a remote sales meeting were all outfitted with motion dots, they could collaboratively manipulate charts and graphs to present the information to interested parties.
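As a rough illustration of how a unique ID might be encapsulated with motion data, the sketch below defines a hypothetical packet layout in C; the field names, widths, and handler function are illustrative assumptions rather than anything specified in the disclosure.

```c
#include <stdint.h>

/* Hypothetical packet layout: every report carries the transmitting
 * dot's unique ID so the receiving device can tell the dots (and, for
 * collaboration, the users) apart. Field names and widths are
 * assumptions for illustration only. */
typedef struct {
    uint16_t user_id;      /* identifies the collaborating user      */
    uint16_t dot_id;       /* unique identifier of this motion dot   */
    uint32_t timestamp_ms; /* sample time on the dot's clock         */
    int16_t  accel_x;      /* raw acceleration samples along x, y, z */
    int16_t  accel_y;
    int16_t  accel_z;
} motion_packet_t;

/* On the receiving device, route each packet to the state kept for
 * the (user, dot) pair that produced it. */
void handle_packet(const motion_packet_t *pkt)
{
    /* look up per-dot state keyed by (pkt->user_id, pkt->dot_id)
     * and process the motion data according to its source */
    (void)pkt;
}
```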
Motion dots 104-106 include antennas and fastening mechanisms 114-116, wherein motion dots 104-106 may be attached to two fingers. In one embodiment, fastening mechanisms 114-116 of motion dots 104-106 include ring configurations capable of attaching to fingers. It should be noted that other attachment mechanisms for motion dots 104-106 are available, such as adhesive backing or hook-and-loop (i.e., Velcro) securing means.
Each motion dot is capable of communicating with handheld device 102 and/or a nearby neighboring motion dot. For example, motion dot 104 uses a short-range communications network 120 to communicate with handheld device 102 and uses another short-range communications network 124 to communicate with a nearby neighboring motion dot 106. Motion dots 104-106, in one embodiment, use short-range communications network 120 to identify a relative geographic (or physical) position between motion dots 104-106. It should be noted that more motion dots may be used. For example, motion dots may be placed on all ten fingers, depending on the application intended by the user.
In operation, motion dots 104-106 are attached to a user's fingers, such as the thumb and index finger, and are capable of detecting the relative movements between the fingers. Upon detecting the movements of the fingers, motion dots 104-106 identify one or more user inputs and subsequently forward the user inputs to handheld device 102 via wireless communications networks 120-122. Motion dots 104-106 are a remote UI device that allows inputs to be entered independently of handheld device 102.
An advantage of using the motion dots as a UI device is to remove an input mechanism, such as a touch pad or "mini-keyboard," from a handheld device. Another advantage of using the motion dots is to allow a user to enter inputs in a more natural, intuitive three-dimensional (3D) space.
Having briefly described one embodiment of a remote UI device and the short-range wireless network environment in which the present exemplary embodiment operates, the remainder of the description turns to the individual components in more detail.
Main memory 204, which may include multiple levels of cache memories, stores frequently used data and instructions. Main memory 204 may be RAM (random access memory), MRAM (magnetic RAM), or flash memory. Static memory 206 may be a ROM (read-only memory), which is coupled to bus 211, for storing static information and/or instructions. Bus control unit 205 is coupled to buses 211-212 and controls which component, such as main memory 204 or processor 202, can use the bus. Bus control unit 205 manages the communications between bus 211 and bus 212. Mass storage memory 207, which may be a magnetic disk, an optical disk, a hard disk drive, a floppy disk, a CD-ROM, and/or flash memory for storing large amounts of data, may optionally be coupled to bus 211.
I/O unit 220 includes a display 221 and optionally includes keyboard 222, cursor control device 223, and communication device 225. Display device 221 may be a liquid crystal device, cathode ray tube (“CRT”), touch-screen display, or other suitable display device. Communication device 225 is coupled to bus 211 for accessing information from remote computers or servers, such as server 104 or other computers, through wide-area network or wireless communications networks. Communication device 225 may include a modem or a network interface device, or other similar devices that facilitate communication between portable computer 200 and the network. Motion dot module 230 is configured to communicate with one or more motion dots connected through one or more wireless communications networks.
Accelerometer 310, in one embodiment, is a three-dimensional ("3D") accelerometer capable of measuring acceleration or movement along the x, y, and z axes. Accelerometer 310 includes an x-axis accelerometer, a y-axis accelerometer, and/or a z-axis accelerometer and is capable of detecting a 3D physical movement relating to motion dot 300. Physical movement includes a geographic 3D motion. The accelerometers included in the motion dots capture a relative position and motion/acceleration in space, along the x, y, and z axes, of the thumb, forefinger, and/or hand. The data detected by accelerometer 310, for example, is processed by microcontroller 308.
Microcontroller 308 on the motion dot, in one embodiment, decides when to gather the motion data, when to transmit the data, and when to conserve power. Upon receipt of movement information from accelerometer 310, microcontroller 308 identifies the relative physical position with respect to a home position or reference point. For instance, microcontroller 308 tracks and updates the home position continuously, whereby a relative physical position for a pair of motion dots can be accurately obtained. It should be noted that a relative physical position is easier to obtain than an absolute physical position.
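A minimal sketch of how relative position might be derived from the accelerometer samples is shown below, assuming a simple double integration against a periodically re-zeroed home position; real firmware would need filtering and drift compensation, and the structure and function names are illustrative assumptions.

```c
typedef struct { float x, y, z; } vec3_t;

/* Per-dot state: velocity and displacement measured from the current
 * home (reference) position. A simplified sketch; drift correction
 * and filtering are omitted. */
typedef struct {
    vec3_t vel;  /* integrated velocity                 */
    vec3_t pos;  /* displacement from the home position */
} dot_state_t;

/* Integrate one accelerometer sample (m/s^2) taken dt seconds after
 * the previous one. */
static void integrate_sample(dot_state_t *s, vec3_t accel, float dt)
{
    s->vel.x += accel.x * dt;   s->vel.y += accel.y * dt;   s->vel.z += accel.z * dt;
    s->pos.x += s->vel.x * dt;  s->pos.y += s->vel.y * dt;  s->pos.z += s->vel.z * dt;
}

/* Re-zero the home position, e.g. whenever the hand returns to the
 * default pose, so only displacement since the last update matters. */
static void reset_home(dot_state_t *s)
{
    s->vel = (vec3_t){0.0f, 0.0f, 0.0f};
    s->pos = (vec3_t){0.0f, 0.0f, 0.0f};
}

/* The relative physical position between two dots is the difference
 * of their displacements from the shared home position. */
static vec3_t relative_position(const dot_state_t *a, const dot_state_t *b)
{
    return (vec3_t){ a->pos.x - b->pos.x,
                     a->pos.y - b->pos.y,
                     a->pos.z - b->pos.z };
}
```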
Microcontroller 308 also provides power management to control power consumption and the data transmission mode. To conserve power, microcontroller 308 may take discrete snapshots, such as at 1/10th-of-a-second intervals, for data transmission. Moreover, microcontroller 308 may instruct motion dot 300 to enter a sleep state between finger movements for power conservation. The radio transmission of RF device 304, which transmits acceleration data via antenna 303, may also be controlled by microcontroller 308.
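The duty cycling described above could take roughly the following form; the 100 ms snapshot interval follows the 1/10th-of-a-second example in the text, while the idle timeout and the hardware routines (motion_detected, transmit_snapshot, enter_sleep_until_motion, millis) are placeholders assumed only for this sketch.

```c
#include <stdbool.h>
#include <stdint.h>

#define SNAPSHOT_INTERVAL_MS 100u   /* ~1/10th of a second between transmissions */
#define IDLE_TIMEOUT_MS      2000u  /* illustrative idle threshold before sleep  */

/* Hardware-specific stand-ins assumed for this sketch. */
extern bool     motion_detected(void);
extern void     transmit_snapshot(void);
extern void     enter_sleep_until_motion(void);
extern uint32_t millis(void);

void power_managed_loop(void)
{
    uint32_t last_tx = 0, last_motion = 0;

    for (;;) {
        uint32_t now = millis();

        if (motion_detected())
            last_motion = now;

        /* Transmit discrete snapshots at a moderate rate rather than
         * streaming continuously. */
        if (now - last_tx >= SNAPSHOT_INTERVAL_MS) {
            transmit_snapshot();
            last_tx = now;
        }

        /* Sleep between finger movements to conserve the battery. */
        if (now - last_motion >= IDLE_TIMEOUT_MS)
            enter_sleep_until_motion();
    }
}
```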
Power supply 312 may include a disposable battery, a rechargeable battery, a solar battery, a kinetic energy generator, or a combination of batteries and kinetic generators. The rechargeable battery can be charged via charging connector 306, while the kinetic energy generator is capable of generating power from movements and/or velocities. In one embodiment, power supply 312 includes a power management component to regulate dot power consumption. For example, to minimize power consumption, information is transmitted between the motion dots and the device at a moderate rate rather than at a constant or continuous rate.
Adhesive mechanism 314 includes an attaching mechanical form that is capable of attaching to an object. For example, adhesive mechanism 314 may attach dot 300 to a thumbnail. Also, adhesive mechanism 314 may be a ring shape that fits onto a finger or a cot shape that fits over the tip of a finger. Other attachment methods, such as Velcro or adhesive, may be used to perform a similar fastening function.
Dot 300, in one embodiment, further includes a gyroscope and a casing apparatus. The gyroscope is used to sense the orientation of dot 300; for example, the gyroscope can detect rotational motion. The casing apparatus encloses or encapsulates dot 300 in a small case, such as 10×10×4 millimeters. Potting material such as polymers may be used as filler to fill various gaps between the components in dot 300.
Motion dots 104-106 are logically coupled with device 102 via a wireless communications network such as a PAN. The wireless communications network facilitates information transfer between motion dots 104-106 and device 102. For example, a graphical object or cursor 110 is displayed on display screen 112 in response to inputs entered via motion dots 104-106. It should be noted that device 102 is configured to perform various functions, such as cursor movements and image displays, in accordance with the inputs received from two or more remote motion dots.
Motion dots 104 and 106, as shown in the accompanying figure, together with device 102 form system 400.
System 400, in one embodiment, is configured to identify an input or inputs from a motion or a sequence of motions. A sequence of motions includes finger joining, finger release, lifting, putting, panning, picking, and the like. In addition, these motions may represent one or more input commands, such as home positioning, cursor pointing, file opening, and file magnification. For example, when a cursor control input is detected, cursor 110 will move in accordance with the movement of the fingers. For instance, as the hand moves diagonally up and to the user's right, cursor 110 will move toward the upper right hand corner of display screen 112. Similarly, as the hand moves diagonally down and to the user's left, cursor 110 will move toward the bottom left corner of screen 112.
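One way the device might map recognized motions to commands, and hand movement to cursor movement, is sketched below; the gesture and command names and the gain factor are illustrative assumptions rather than elements of the disclosure.

```c
typedef enum {
    GESTURE_HOME, GESTURE_JOIN, GESTURE_RELEASE,
    GESTURE_LIFT, GESTURE_PUT, GESTURE_PAN, GESTURE_PICK
} gesture_t;

typedef enum {
    CMD_NONE, CMD_HOME_POSITION, CMD_CURSOR_POINT,
    CMD_OPEN_FILE, CMD_MAGNIFY_FILE
} command_t;

/* Map a recognized motion (or short sequence of motions) to an input
 * command; the particular pairings here are only examples. */
static command_t command_for(gesture_t g)
{
    switch (g) {
    case GESTURE_HOME: return CMD_HOME_POSITION;
    case GESTURE_PAN:  return CMD_CURSOR_POINT;
    case GESTURE_PICK: return CMD_OPEN_FILE;
    case GESTURE_LIFT: return CMD_MAGNIFY_FILE;
    default:           return CMD_NONE;
    }
}

/* Move the cursor with the hand: a diagonal up-and-right hand motion
 * gives a positive dx and a positive upward dy, which maps to a
 * smaller y in screen coordinates (screen y grows downward), sending
 * the cursor toward the upper right corner of the display. */
static void move_cursor(float hand_dx, float hand_dy_up, float gain,
                        int *cursor_x, int *cursor_y)
{
    *cursor_x += (int)(hand_dx    *  gain);
    *cursor_y += (int)(hand_dy_up * -gain);
}
```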
An advantage of using system 400 is that it separates the UI device from the portable device. The UI device is moved from the portable device to the user's fingers, which allows the hand or fingers to work free of the touch screen or the device itself. Motion dots 104-106, having multiple 3D accelerometers and wireless capabilities, can be attached to the user's fingernails and, for example, communicate with device 102 via a short-distance radio frequency network.
Another advantage of using system 400 is to add or leverage 3D input commands, such as the "picking" and "placing" command gestures. The motion measured by motion dots 104 and 106 may be three-dimensional, while display screen 112 renders objects and movements in two dimensions (2D). Using techniques such as object size and shading, objects may be rendered in a simulated 3D.
A sensed motion, for example, is a relative motion between two motion dots. Finger relative motion is one of the natural motions that can be detected. Finger relative motion is, for example, any combination of finger motions that creates independent motions of the fingers. For instance, brushing the thumb with the opposing finger creates a relative motion along the line of the thumb, which may be a natural gesture for list scrolling. Finger joining is another natural motion that can be detected. Finger joining, for example, brings the two fingers together so that they become joined in their motion. Finger joining starts with finger relative motion and ends with the fingers moving together. An example of finger joining is a picking gesture. A finger joining is detected when the motion dots report substantially the same accelerations. Conversely, finger release may be the opposite of finger joining.
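A finger joining detector along the lines described above might compare the two dots' acceleration vectors over a short window, as in the sketch below; the tolerance and window length are tuning assumptions, not values taken from the disclosure.

```c
#include <math.h>
#include <stdbool.h>

typedef struct { float x, y, z; } vec3_t;

#define JOIN_TOLERANCE 0.5f  /* m/s^2; illustrative tolerance         */
#define JOIN_WINDOW    8     /* consecutive matching samples required */

/* Joined fingers move as one, so their dots report substantially the
 * same accelerations. */
static bool accelerations_match(vec3_t a, vec3_t b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return sqrtf(dx * dx + dy * dy + dz * dz) < JOIN_TOLERANCE;
}

/* Call once per sample pair; returns true once a finger-joining
 * gesture is detected. Finger release can be detected symmetrically
 * when the accelerations stop matching. */
static bool detect_finger_join(vec3_t thumb_accel, vec3_t index_accel)
{
    static int match_count = 0;

    if (accelerations_match(thumb_accel, index_accel))
        match_count++;
    else
        match_count = 0;

    return match_count >= JOIN_WINDOW;
}
```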
Furthermore, if a cursor is moved to an object on the display screen prior to performing a "picking" motion, the portable device may translate this series of motions as grabbing or picking a file or object. Continuing with this example, while holding index finger 502 and thumb 504 together, the user moves her hand orthogonally to the horizontal plane of the object. The processor may be programmed to translate this series of motions as lifting the object up from the horizontal plane.
Motion dots 702 and 704 are held slightly apart from each other as well as slightly away from palm 706. Representation 700, in one embodiment, is a default position or a home position for the beginning of various hand gestures. It should be noted that the home position may be updated continuously or set at a predefined fixed relative position.
Touch screens, touch pads, and flat panel displays are limited to 2D. Accelerometers, however, allow the detection of motion in 3D. For example, a finger joining gesture may be followed by a lifting motion (joining on one plane and lifting in a direction orthogonal to that plane). As such, using a 2D medium to represent a 3D motion can be a challenge. A handheld device using a pair of remote UI devices may also use an adaptive algorithm to learn certain user-specific gestures to enhance performance.
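The adaptive algorithm is not detailed in the text; one plausible, minimal form is to nudge a gesture threshold toward what a particular user actually produces each time a gesture is confirmed, as sketched below with an assumed exponential-moving-average update.

```c
/* Nudge a per-user gesture threshold (e.g. the join tolerance used
 * earlier) toward the value observed whenever the gesture is
 * confirmed. The learning rate is an illustrative assumption: 0 keeps
 * the threshold fixed, 1 adopts the new observation outright. */
static float adapt_threshold(float threshold, float observed, float rate)
{
    return threshold + rate * (observed - threshold);
}
```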
The exemplary embodiment(s) of the present invention includes various processing steps, which will be described below. The steps of the embodiments may be embodied in machine or computer executable instructions. The instructions can be used to cause a general purpose or special purpose system, which is programmed with the instructions, to perform the steps of the present invention. Alternatively, the steps of the present invention may be performed by specific hardware components that contain hard-wired logic for performing the steps, or by any combination of programmed computer components and custom hardware components. While embodiments of the present invention will be described with reference to the Internet, the method and apparatus described herein are equally applicable to other network infrastructures or other data communications environments.
At block 1104, the process activates the first motion dot and the second motion dot. In one embodiment, a third motion dot is attached to a third finger and it is activated. Each motion dot, for example, may include a microprocessor, a memory, a power supply, one or more accelerometers, and so on. After block 1104, the process proceeds to the next block.
At block 1106, the process establishes a wireless communications network between the first motion dot and the second motion dot. For example, the wireless communications network may be a wireless personal area network ("PAN") for providing communications between the first motion dot, the second motion dot, and the portable electronic device. It should be noted that a PAN is a wireless network used for communications among portable devices. The coverage of a PAN is generally several feet, and it is used for intrapersonal communications. The process, in one embodiment, recognizes multiple unique identifiers for multiple motion dots, wherein the unique identifiers facilitate collaborative work. After block 1106, the process moves to the next block.
At block 1108, the process identifies a relative reference position between the first motion dot and the second motion dot. For example, a home position may be set using a finger joining motion, wherein the process senses a finger joining motion between the first motion dot and the second motion dot. After block 1108, the process moves to the next block.
At block 1110, the process detects a user input in response to relative motion between the first motion dot and the second motion dot with respect to the relative reference position. In one embodiment, the process detects finger movements in accordance with 3D accelerometers. Alternatively, the process identifies one of a finger relative motion, a finger joining motion, a finger release motion, a finger lifting motion, a putting motion, and a scrolling motion. After block 1110, the process moves to the next block.
At block 1112, the process performs a task in a portable electronic device in response to the user input. The process further displays a pictorial image on a display in the portable electronic device representing the relative motion. The process also instructs the portable device to enter a sleep mode, or to wake up from the sleep mode, to conserve power. The process further establishes a wireless communications network among the first motion dot, the second motion dot, and the third motion dot. For example, a relative motion is a motion among the first, second, and third motion dots with respect to the relative reference position.
While particular embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from this invention and its broader aspects. Therefore, the appended claims are intended to encompass within their scope all such changes and modifications as are within the true spirit and scope of the exemplary embodiment(s) of the present invention.
Claims
1. A digital processing system comprising:
- a first motion dot operable to attach to a first location of a user's body, wherein the first motion dot includes a plurality of accelerometers for identifying a physical location of the first motion dot;
- a second motion dot logically coupled to the first motion dot and configured to attach to a second location of the user's body, wherein the second motion dot having a plurality of accelerometers is capable of detecting a first input generated based on a first relative physical position between the first motion dot and the second motion dot; and
- a device logically coupled to the second motion dot via a wireless connection and configured to store the first input in a local storage.
2. The device of claim 1, further comprising a handheld device logically coupled to the second motion dot, wherein the handheld device is able to receive the first input via a wireless communications network.
3. The device of claim 1, wherein the second motion dot is able to generate a second input generated based on a second relative physical position between the first motion dot and the second motion dot.
4. The device of claim 1, wherein the first motion dot includes:
- a transceiver capable of facilitating wireless communications;
- a battery coupled to the transceiver and configured to supply power; and
- a controller coupled to the battery and configured to control communications.
5. The device of claim 1, wherein the second motion dot includes an x-axis accelerometer, a y-axis accelerometer, and a z-axis accelerometer for identifying a relative physical position between the first motion dot and the second motion dot.
6. The device of claim 5, wherein the first motion dot is attached to the left foot of the user and the second motion dot is attached to the right foot of the user.
7. The device of claim 6, wherein the device is able to log a series of input data representing the user's feet movement detected by the first and the second motion dots.
8. The device of claim 1, wherein the first motion dot includes a first gyroscope to identify orientation of the first motion dot and the second motion dot includes a second gyroscope to identify orientation of the second motion dot.
9. The device of claim 1, wherein the first motion dot is attached to the user's right foot and the second motion dot is attached to the user's left arm.
10. The device of claim 1, wherein the first motion dot is attached to the user's right index finger and the second motion dot is attached to the user's right thumb.
11. The device of claim 1, wherein the device is able to log a series of input data representing the user's movement detected by the first and the second motion dots.
12. The device of claim 1, wherein the first motion dot is attached to a first place located on a user's leg, and the second motion dot is attached to a second place located on a foot of the user's leg.
13. The device of claim 1, wherein the first motion dot is attached to a first place located above a knee of a user's leg, and the second motion dot is attached to a second place located below the knee.
14. The device of claim 1, wherein the first motion dot is attached to a user's arm at a first place located above an elbow of the user's arm; and the second motion dot is attached to a second place located below the elbow.
15. A method for providing system interface comprising:
- attaching a first motion dot to a first location of an object and attaching a second motion dot to a second location of the object;
- activating the first motion dot and the second motion dot and establishing a wireless communications network between the first motion dot, the second motion dot, and a digital processing device;
- generating a first input based on a first relative reference position between the first motion dot and the second motion dot; and
- recording the first input in a local storage memory of the digital processing device.
16. The method of claim 15, further comprising generating a second input based on a second relative reference position between the first motion dot and the second motion dot; and storing the second input in the local storage memory.
17. The method of claim 15,
- wherein attaching a first motion dot to a first location of an object includes attaching the first motion dot to a user's left arm; and
- wherein attaching a second motion dot to a second location of the object includes attaching the second motion dot to the user's right arm.
18. The method of claim 15, further comprising:
- attaching a third motion dot to a third location of the object and activating the third motion dot;
- establishing a wireless communications network between the first motion dot and the second motion dot and the third motion dot; and
- detecting relative motion among the first motion dot, the second motion dot, and the third motion dot with respect to the first relative reference position.
19. The method of claim 15, wherein establishing a wireless communications network includes activating a wireless personal area network for providing communications between the first motion dot, the second motion dot, and the digital processing device.
20. The method of claim 17, wherein generating a first input based on a first relative reference position includes detecting the user's arm movement sensed by the first motion dot and the second motion dot.
21. The method of claim 20, further comprising retrieving input data from the local storage memory of the digital processing device and regenerating data representing a sequence of the user's arm movements based on the retrieved input data.
22. A computing system comprising:
- a first motion dot operable to attach to a first hand, wherein the first motion dot includes multiple accelerometers for identifying a physical location of the first motion dot;
- a second motion dot logically coupled to the first motion dot and configured to attach to a second hand, wherein the second motion dot having multiple accelerometers is capable of generating data based on relative physical positions between the first motion dot and the second motion dot; and
- a digital processing controller wirelessly coupled to the second motion dot and configured to receive the data from the second motion dot and store the data in a local storage memory.
23. The system of claim 22, further comprising a third motion dot logically coupled to the second motion dot and configured to attach to a user's first leg, wherein the third motion dot is configured to detect a user input in response to a relative motion between the second motion dot and the third motion dot.
24. The system of claim 23, further comprising a fourth motion dot logically coupled to the second motion dot and configured to attach to a user's second leg, wherein the first, second, third, and fourth motion dots are able to sense and record data relating to corresponding movements of the user's hands and legs.
25. The system of claim 22, wherein the first motion dot includes:
- a transceiver capable of facilitating wireless communications;
- a battery coupled to the transceiver and configured to supply power; and
- a controller coupled to the battery and configured to control communications.
Type: Application
Filed: Dec 31, 2012
Publication Date: May 16, 2013
Inventor: Robert Welland (Seattle, WA)
Application Number: 13/731,552
International Classification: G06F 3/01 (20060101);