WEARABLE DEVICE

An intelligent wristband system is disclosed, the system comprising a wearable wristband configured to be worn by a user; a control unit within the wristband; and a sensor configured to detect at least one gesture made by the user, the at least one gesture indicating instructions to be performed by the control unit, the control unit configured to translate the at least one gesture into a specific command for an action to occur within the wristband system.

Description
RELATED APPLICATIONS

This application is a Continuation of PCT Patent Application No. PCT/GB2014/052693, having International filing date of Sep. 4, 2014, which claims the benefit of priority of United Kingdom Patent Application Nos. 1400225.7, filed on Jan. 7, 2014, 1315764.9, filed on Sep. 4, 2013, and 1315765.6, filed on Sep. 4, 2013, and U.S. Provisional Patent Application Nos. 61/874,219, filed on Sep. 5, 2013, and 61/874,107, filed on Sep. 5, 2013. The contents of all of the above applications are incorporated by reference as if fully set forth herein.

BACKGROUND

Technical Field

This invention relates to an intelligent wristband and life management environment, and more specifically, to an intelligent wristband system and life management environment that makes recommendations for a user based on the user's real-time physical, social, and biotelemetric activity.

Description of the Related Art

In recent years, stress levels have been increasing for American adults, and many people report that they do not manage or reduce stress well. Stress can be caused by a difficulty to effectively multi-task with respect to activities such as work, exercise, nutritional intake, travel, and social engagements. Systems and methods for lowering stress and improving individuals' lifestyles are desirable.

Current intelligent wristwatches monitor and display a user's physical activity through sensors such as a pedometer, an elevation detector, or a heart rate monitor. However, these technologies are not necessarily used to lower an individual's stress level or make meaningful recommendations to the user regarding future activities or lifestyle choices. Further, these technologies are limited in the information that they can provide to a user, and do not provide an integrated life management solution allowing a user to better manage and improve multiple aspects of his lifestyle, including health, social, stress, schedule, organization, productivity, and overall well-being. Additionally, health monitoring systems are generally user-centric, in the sense that they do not allow third parties to analyze a user's physical activity, social activity, and biotelemetric data to suggest and/or schedule actions that are beneficial to a user's health.

There does not currently exist an intelligent wristband that takes into account a user's physical activity, social activity, and biotelemetric data to make recommendations to the user. There is a need for systems and methods for an intelligent wristband and life management environment to effectively act as a personal assistant by managing the user's schedule and making lifestyle recommendations to ultimately reduce an individual's stress.

BRIEF DESCRIPTION OF THE DRAWINGS

The disclosed examples have other advantages and features which will be more readily apparent from the following detailed description of the invention and the appended claims, when taken in conjunction with the accompanying drawings.

The invention will now be described, purely by way of example, with reference to the accompanying drawings, in which:

FIG. 1A illustrates a perspective view of a wristband system, according to one example.

FIG. 1B illustrates another alternative perspective view of a wristband system, according to one example.

FIG. 2A is a high-level block diagram illustrating an environment for life management including a client device, assistant device, scale device, and life management system, according to one embodiment.

FIG. 2B is a high-level block diagram illustrating an environment for life management including a client device, assistant device, scale device, and life management system, according to another example.

FIG. 3 is a high-level block diagram illustrating an example of a computer, according to one example.

FIG. 4A is a high-level block diagram illustrating a detailed view of various units within the client device, according to one example.

FIG. 4B is a high-level block diagram illustrating a detailed view of various units within the client device, according to another example.

FIGS. 4C and 4D show a test board mounted on a person's wrist.

FIG. 4E shows a cross-section of an example wearable device.

FIG. 4F shows a voltage/time plot of an example single stimuli impulse wave.

FIG. 5 is a high-level block diagram illustrating a detailed view of various modules within the life management system, according to one example.

FIG. 6 is a flowchart illustrating steps performed by the wristband system, according to one example.

FIGS. 7A-7E illustrate examples of gestures made by a wearing user (7A and 7E) or examples of user interfaces displayed on a wristband system (7B-7D).

FIGS. 8A-8B illustrate alternative examples of user interfaces displayed on a wristband system.

FIG. 9A illustrates a perspective view of a scale device, according to one embodiment.

FIG. 9B illustrates another alternative perspective view of a scale device, according to one embodiment.

FIGS. 10A and 10B show example interactions with the user's calendar through a life management system.

FIGS. 11A-11M show examples of gestures made by the user to control wristband functions.

Device for Providing Alerts

FIG. 12 is a schematic cross-sectional view of a device according to one embodiment of the invention.

FIG. 13 is a diagram showing an exemplary stimulus wave for the electrode of the device of the invention.

FIG. 14 is block diagram of the device of FIG. 12.

Life Management System

FIG. 15 is a high-level block diagram illustrating an embodiment of a life management environment including a life management system connected by a network to a client device, a scale platform, a third party service provider, and an assistant device.

FIG. 16 is a high-level block diagram illustrating a detailed view of the life management system according to one embodiment.

FIG. 17 illustrates an example of a user interface displayed by an assistant device and/or client device showing a detailed user snapshot associated with a user of a life management system according to an embodiment.

FIG. 18 illustrates an example of a user interface displayed by an assistant device and/or a client device showing a basic user snapshot associated with a user of a life management system according to an embodiment.

FIG. 19 illustrates an example of a user interface displayed by a client device showing a portion of snapshot information associated with a user of a life management system according to an embodiment.

FIG. 20 illustrates another example of a user interface displayed by a client device showing portions of snapshot information associated with a user of a life management system according to an embodiment.

FIG. 21 illustrates an example of a user interface displayed by a client device showing basic snapshot information in a calendar associated with a user of a life management system according to an embodiment.

FIG. 22 illustrates an example of a user interface displayed by a client device showing an expanded general calendar entry associated with a user of a life management system according to an embodiment.

FIG. 23 illustrates an example of a card displayed by a client device to an associated user of a life management system according to an embodiment.

FIG. 24 is a flowchart illustrating the process for generating a card using snapshot information associated with a user of a life management system according to one embodiment.

FIG. 25 is a flowchart illustrating the process for generating and managing mood information associated with a user of a life management system according to one embodiment.

FIG. 26 illustrates another example of a user interface displayed by an assistant device showing a detailed user snapshot associated with a user of a life management system according to an embodiment.

FIG. 27 illustrates another example of a user interface displayed by a client device showing a portion of snapshot information associated with a user of a life management system according to an embodiment.

FIGS. 28A-C illustrate an example of a user interface displayed by a client device and/or an assistant device showing basic snapshot information in a calendar associated with a user of a life management system according to an embodiment.

Calendars and Moods

FIG. 29 shows a schematic overview of a data management system;

FIG. 30 illustrates data aggregation and processing in the data management system;

FIG. 31 further illustrates data aggregation and processing;

FIGS. 32 and 33 show exemplary architectures of the data management system;

FIG. 34 displays an exemplary Graphical User Interface (GUI) relating to a calendar tool integrated into the data management system;

FIG. 35 shows a calendar tool scheduling a new activity for a user;

FIG. 36 shows an exemplary GUI with calendar and available activity alternatives;

FIG. 37 shows a flow diagram of an out-of-the-box experience (OOBE) for integrating events between users;

FIG. 38 shows a flow diagram of an out-of-the-box experience (OOBE) for integrating contacts between users;

FIG. 39 shows a cost splitting functionality of a data management system;

FIG. 40 shows an exemplary output from the data aggregation and processing system;

FIG. 41 shows the processing of a mood; and

FIG. 42 illustrates communication between users on the basis of a mood certificate, derived from a mood.

The term ‘embodiment’ as used in this description is synonymous with the term ‘example’; features, functionality and/or advantages which are nominally attributed to one example being applicable to other examples.

The figures depict various examples of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.

SUMMARY

Intelligent Wristband

The invention provides an intelligent wristband and life management environment, and methods of processing data from the intelligent wristband and life management environment. The system includes a wearable wristband with a display screen and one or more sensors. The sensors can include sensor types that capture various types of data (e.g., biotelemetry readings from the user or physical activity of the user) and/or that detect user gestures that control the wristband function (e.g., a roll and cue gesture that includes a lift and roll of the wrist to activate or wake up the wristband, or a wave and shake gesture that includes a shaking motion to bring the wristband back to the home screen). The user can also view a variety of different types of information on the wristband, such as the biotelemetry or activity data collected by the sensors, his calendar, reminders, recommendations, social data, and news, among other information associated with daily life management.

Within the life management environment, the user can also connect across a network to share information with a variety of external devices, such as the user's mobile phone or computer, or devices operated by third parties. For example, a health or fitness coach of the user can receive biotelemetry or physical data from the user, can analyze this data, and can send fitness recommendations to the wristband or control certain settings of the wristband. The user's executive assistant can manage the user's schedule and provide reminders to the wristband. The user can also use a concierge service associated with the wristband that can help the user with any questions and needs the user might have. The life management solution also includes a scale that can automatically detect the user when the user is wearing the wristband and directly share readings that the user can immediately view on the wristband. Together, these features provide an overall integrated life management solution that allows the user to reduce stress, improve health, and better manage wellness and lifestyle.

According to one aspect of the invention there is provided an intelligent wristband system comprising: a wearable wristband configured to be worn by a user; a control unit within the wristband; a sensor configured to detect at least one gesture made by the user, the at least one gesture indicating instructions to be performed by the control unit, the control unit configured to translate the at least one gesture into a specific command for an action to occur within the wristband system.

According to another aspect of the invention there is provided a method of processing data from an intelligent wristband system, comprising: detecting at least one gesture made by the user, the at least one gesture indicating instructions to be performed by the wristband system; translating the at least one gesture into a specific command for an action to occur within the wristband system.

Preferably, the method comprises recognizing a user of the wristband system, preferably based on the at least one biotelemetric function associated with the user.

According to another aspect of the invention there is provided a wristband configured to be worn by a user, the wristband comprising: a control unit; and a sensor configured to detect at least one gesture made by the user, the at least one gesture indicating instructions to be performed by the control unit, the control unit configured to translate the at least one gesture into a specific command for an action to occur within the wristband system.

In another embodiment, the intelligent wristband system includes a wearable wristband configured to be worn by a user, a control unit within the wristband, a number of sensors, and a display screen for displaying information to the user, such as that collected by the sensors. This system includes a first sensor coupled to the wristband and configured to detect at least one gesture made by the user, where the gesture can indicate instructions to be performed by the control unit. This system also includes a second sensor coupled to the wristband and configured to detect at least one biotelemetric function associated with the user. The control unit is configured to generate biotelemetry data from the detected biotelemetric function. In addition, a third sensor is coupled to the wristband and configured to detect at least one physical activity associated with the user. The control unit is configured to generate physical activity data from the detected physical activity. A curved electronic display screen coupled to the wristband displays at least a portion of the biotelemetry data and physical activity data for viewing by the user.

In an embodiment, the gesture made by the user detected by the first sensor of the wristband system includes one or more of the following: (a) the user lifting and rolling his wrist, (b) tapping the display screen, (c) making a swiping motion across the display screen, (d) tapping the display screen with two fingers, (e) pressing on the display screen, (f) making a circular motion on the display screen with a finger, and (g) shaking the wristband system. In a further embodiment, the user lifting and rolling his wrist indicates an instruction to the control unit to activate the display screen. In a further embodiment, the user shaking the wristband indicates an instruction to the control unit to display a home screen user interface. In another embodiment, the sensitivity of the first sensor is customizable by the user to recognize the user's gesture.
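The gesture-to-command translation described above can be sketched as a simple lookup performed by the control unit. A minimal illustration, assuming hypothetical gesture and command names that are not part of the claimed system:

```python
# Hypothetical gesture-to-command table; names are illustrative only.
GESTURE_COMMANDS = {
    "roll_and_cue": "activate_display",    # lift and roll of the wrist wakes the screen
    "wave_and_shake": "show_home_screen",  # shaking returns to the home screen
    "single_tap": "select_item",
    "swipe": "scroll",
    "two_finger_tap": "go_back",
}

def translate_gesture(gesture: str) -> str:
    """Translate a detected gesture into a specific command for the
    wristband system; unrecognized gestures produce no action."""
    return GESTURE_COMMANDS.get(gesture, "no_op")
```

A real control unit would additionally apply the user-customizable sensitivity threshold before a raw sensor reading is classified as one of these gestures.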

In another embodiment, the biotelemetric function detected by the second sensor of the wristband system includes one or more of the following: (a) heart rate, (b) calories burned, (c) blood pressure, (d) skin temperature, (e) hydration level, and (f) galvanic skin response. In another embodiment, the physical activity detected by the third sensor of the wristband system includes one or more of the following: (a) steps taken by the user, (b) stairs climbed by the user, (c) physical movement by the user, (d) speed of the user, and (e) sleep patterns of the user.

In an embodiment, the control unit of the wristband can send the biotelemetry data or physical activity data through a network to a computing device, such as the user's mobile phone, tablet, or home computer, or a device used by a third party (e.g., a doctor, health or fitness coach, a concierge service associated with the wristband, an executive or personal assistant of the user, etc.). A software platform within the computing device is configured to analyze the biotelemetry data and physical activity data. At least a portion of the analyzed biotelemetry data and physical activity data is viewable on a display screen of the computing device or viewable on the wristband display screen. In another embodiment, the communications interface can receive social data regarding the user that was analyzed by the computing device, and can be viewed by the user on the wristband display screen. In a further embodiment, at least a portion of the analyzed biotelemetry data and physical activity data is accessible to a second user, such as a health coach or a personal assistant, on the computing device. In one embodiment, the wristband can receive information submitted from the second user via the computing device, such that the information is viewable on the display screen of the wristband. Furthermore, the communications interface can receive information sent from the computing device from the second user.

In an embodiment, the wristband may recognize the user based on biotelemetry data associated with the user. In another embodiment, the control unit may determine the user's mood based on the user's biotelemetric functions. For example, a high heart rate may indicate that the user is feeling stressed.
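The heart-rate-based mood determination mentioned above could be implemented as a simple threshold heuristic. The thresholds and mood labels below are assumptions for illustration, not values from the specification:

```python
def infer_mood(heart_rate_bpm: float, resting_rate_bpm: float = 60.0) -> str:
    """Toy mood heuristic: a heart rate well above the user's resting
    rate is taken to indicate stress; thresholds are illustrative."""
    if heart_rate_bpm > resting_rate_bpm * 1.5:
        return "stressed"
    if heart_rate_bpm < resting_rate_bpm * 1.1:
        return "calm"
    return "neutral"
```

In practice the control unit would likely combine several biotelemetric functions (e.g., galvanic skin response and skin temperature) rather than rely on heart rate alone.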

In one embodiment, the wristband system may wirelessly couple to a scale, which can calculate one or more of the following: (a) a weight of the user, (b) a BMI of the user, (c) a body fat percentage of the user, and (d) a heart rate of the user. In a further embodiment, the scale can recognize the user based on a galvanic skin response associated with the user. In a further embodiment, a display screen of the scale is only visible when the scale is set to an “ON” setting. In another embodiment, the scale can send data through a network back to the wristband, or to a software platform on a computing device. The wristband or software platform can analyze the calculated measurement and provide the measurement for display. In another embodiment, the scale communicates with an interface coupled to the control unit of the wristband, which can send biotelemetry data or physical activity data from the wristband to the scale for display.
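Of the scale measurements listed above, BMI follows directly from the standard definition (weight in kilograms divided by the square of height in metres), which the scale or software platform could compute as:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Standard body mass index: weight in kilograms divided by
    the square of height in metres."""
    return weight_kg / (height_m ** 2)
```

For example, a 70 kg user who is 1.75 m tall has a BMI of about 22.9. Height would be a stored profile value, since the scale itself measures only weight and skin response.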

In an embodiment, the wristband system includes a Global Positioning System (GPS) that can determine the location of the wearing user. In another embodiment, the control unit of the wristband is in communication with the GPS, and can use the GPS to show the location of the user on a map on the display screen, or to recommend an exercise course for the user. In an embodiment, the control unit may provide instructions to the display screen to display a schedule or a message for the user. In some embodiments, the control unit can generate an exercise recommendation based on the physical activity data of a user, or generate another type of recommendation based on the biotelemetry data and physical activity data of the user.
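To recommend or track an exercise course from GPS fixes as described above, the control unit would need the distance between successive coordinates. A self-contained sketch using the standard haversine great-circle formula (the function name and radius constant are ordinary conventions, not claimed elements):

```python
import math

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two GPS fixes,
    e.g. to measure progress along a recommended exercise course."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

Summing this over consecutive fixes yields distance covered, from which the speed reported by the third sensor could also be derived.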

In another embodiment, the biotelemetry data and physical activity data are used to calculate one or more of the following scores: (a) a sleep efficiency of the user, (b) a stress level of the user, (c) an exercise intensity of the user, (d) an activity level of the user, (e) a self-control level of the user, and (f) a general wellness level of the user. In a further embodiment, the display screen of the wristband is configured to display one or more of the scores to the wearing user.
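Of the scores listed above, sleep efficiency has a commonly used definition (the percentage of time in bed actually spent asleep) that the control unit could compute directly from the sleep-pattern data of the third sensor; the other scores would require heuristics not detailed in this summary:

```python
def sleep_efficiency(minutes_asleep: float, minutes_in_bed: float) -> float:
    """Sleep efficiency as commonly defined: the percentage of
    time in bed actually spent asleep."""
    return 100.0 * minutes_asleep / minutes_in_bed
```

For example, 420 minutes asleep out of 480 minutes in bed gives an efficiency of 87.5%.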

According to another aspect of the present invention there is provided an intelligent wristband system comprising: a wearable wristband configured to be worn by a user; a control unit within the wristband; a first sensor coupled to the wristband and configured to detect at least one gesture made by the user, the at least one gesture indicating instructions to be performed by the control unit; a second sensor coupled to the wristband and configured to detect at least one biotelemetric function associated with the user, the control unit configured to generate biotelemetry data from the detected at least one biotelemetric function; a third sensor coupled to the wristband and configured to detect at least one physical activity associated with the user, the control unit configured to generate physical activity data from the detected at least one physical activity; and a (preferably curved) electronic display screen coupled to the wristband, wherein at least a portion of the biotelemetry data and physical activity data is viewable on the screen.

Preferably, the gesture is selected from the group consisting of: (a) a user lifting a wrist of the user and rolling the wrist, (b) a user tapping the display screen, (c) a user making a swiping motion across the display screen, (d) a user tapping the display screen with two fingers, (e) a user pressing on the display screen, (f) a user making a circular motion on the display screen with a finger, and (g) a user shaking the wristband system.

Preferably, the gesture sensor is capable of detecting a lifting of a wrist of the user and rolling the wrist, and wherein the display screen is configured to become active in response to the detection.

Preferably, the gesture sensor is capable of detecting a shake of the wristband, and wherein the display screen is configured to display a home screen user interface in response to the detection.

Preferably, the sensitivity of the first sensor is customizable by the user to recognize the at least one gesture.

Preferably, the at least one biotelemetric function is selected from the group consisting of: (a) heart rate, (b) calories burned, (c) blood pressure, (d) skin temperature, (e) hydration level, and (f) galvanic skin response.

Preferably, the at least one physical activity is selected from the group consisting of: (a) steps taken by the user, (b) stairs climbed by the user, (c) physical movement by the user, (d) speed of the user, and (e) sleep patterns of the user.

Preferably, the system further comprises a communications interface in communication with the control unit and configured to send at least a portion of the biotelemetry data or physical activity data through a network to a computing device, wherein a software platform within the computing device is configured to analyze the biotelemetry data and physical activity data, and wherein at least a portion of the analyzed biotelemetry data and physical activity data is viewable on a display screen.

Preferably, the computing device is a mobile device.

Preferably, the communications interface is configured to receive social data regarding the user that was analyzed by the computing device, at least a portion of the analyzed social data being viewable on the display screen.

Preferably, the at least a portion of the analyzed biotelemetry data and physical activity data is accessible to a second user on the computing device.

Preferably, the second user is a health coach or a personal assistant.

Preferably, the wearable wristband is capable of receiving information submitted from the second user via the computing device, and wherein the information is viewable on the display screen.

Preferably, the communications interface is configured to receive information sent from the computing device from the second user.

Preferably, the wristband system is configured to recognize the user based on the at least one biotelemetric function associated with the user.

Preferably, the control unit is configured to determine a mood of the user based on one or more biotelemetric functions detected by the second sensor.

Preferably, the wristband system is configured for wireless coupling to a scale, the scale configured to calculate a measurement for at least one of the group consisting of (a) a weight of the user, (b) a BMI of the user, (c) a body fat percentage of the user, and (d) a heart rate of the user.

Preferably, the scale is configured to recognize the user based on a galvanic skin response associated with the user.

Preferably, the scale comprises a display screen, and wherein the display screen is only visible when the scale is set to an “ON” setting.

Preferably, the scale is configured to send at least a portion of the calculated measurement through a network to a computing device, wherein a software platform on the computing device is configured to analyze the calculated measurement, and wherein the software platform displays at least a portion of the analyzed calculated measurement on a display screen.

Preferably, the scale is configured to send at least a portion of the calculated measurement through a network to the wearable wristband, the control unit configured to analyze the calculated measurement, and wherein at least a portion of the analyzed calculated measurement is viewable on the screen.

Preferably, the system further comprises a communications interface coupled to the control unit and configured to send at least a portion of the biotelemetry data or physical activity data to the scale, wherein the scale is configured to display the at least a portion of the biotelemetry data or physical activity data on a display screen of the scale.

Preferably, the system further comprises a Global Positioning System (GPS) in communication with the control unit, the GPS configured to determine the geospatial location of the user.

Preferably, the control unit is configured to use the GPS to recommend an exercise course for the user.

Preferably, the display screen is configured to depict the geospatial location of the user on a map displayed on the display screen.

Preferably, the control unit is configured to provide instructions to the display screen to display a schedule of the user.

Preferably, the control unit is configured to provide instructions to the display screen to display a message for the user.

Preferably, the control unit is configured to generate a physical activity recommendation for the user based on the detected at least one physical activity.

Preferably, the control unit is configured to generate a recommendation for the user based on at least a portion of the biotelemetry data and physical activity data.

Preferably, at least a portion of the biotelemetry data or physical activity data is used to determine at least one of a calculation of (a) a sleep efficiency of the user, (b) a stress level of the user, (c) an exercise intensity of the user, (d) an activity level of the user, (e) a self-control level of the user, and (f) a general wellness level of the user.

Preferably, the display screen is configured to provide the at least one calculation for display to the user.

Preferably, the system further comprises an electric feedback stimulator configured to deliver an electric shock from the wristband in response to an instruction from the control unit.

Preferably, a frequency, duration, or voltage of the electric shock is customizable.

Preferably, the electric feedback stimulator is configured to deliver the electric shock from the wristband at a specific time.

Preferably, the electric feedback stimulator is configured to deliver the electric shock from the wristband based on the biotelemetry data or physical activity data.

Preferably, the electric feedback stimulator is configured to deliver the electric shock from the wristband based on an instruction from the user or from another user.

Preferably, the electric shock is combined with a vibration from a haptic feedback stimulator of the wristband, and wherein the combined electric shock and vibration is delivered from the wristband based on an instruction from the control unit.

Preferably, the system further comprises a haptic feedback stimulator configured to deliver a vibration from the wristband in response to an instruction from the control unit.

Preferably, a frequency, duration, or amplitude of the vibration is customizable.

Preferably, the haptic feedback stimulator is configured to deliver the vibration from the wristband at a specific time.

Preferably, the haptic feedback stimulator is configured to deliver the vibration from the wristband based on the biotelemetry or physical activity data.

Preferably, the haptic feedback stimulator is configured to deliver the vibration from the wristband based on an instruction from the user or from another user.

According to another aspect of the invention, there is provided an intelligent wristband system comprising: a wearable wristband configured to be worn by a user; a control unit within the wristband; a sensor configured to detect at least one gesture made by the user, the at least one gesture indicating instructions to be performed by the control unit, the control unit configured to translate the at least one gesture into a specific command for an action to occur within the wristband system, wherein each different gesture corresponds to a different specific command; and a (preferably curved) electronic display screen coupled to the wristband.

Preferably, the sensor is configured to detect a gesture comprising lifting of a wrist of the user and rolling the wrist.

Preferably, the control unit is configured to activate the display screen in response to the gesture of lifting the wrist of the user and rolling the wrist.

Preferably, the sensor is configured to detect a shake of the wristband.

Preferably, the control unit is configured to display a home screen user interface on the display screen in response to the gesture of shaking the wristband.

Preferably, the gesture is selected from the group consisting of: (a) a user lifting a wrist of the user and rolling the wrist, (b) a user tapping the display screen, (c) a user making a swiping motion across the display screen, (d) a user tapping the display screen with two fingers, (e) a user pressing on the display screen, (f) a user making a circular motion on the display screen with a finger, and (g) a user shaking the wristband system.

According to another aspect of the present invention, there is provided a method of processing data from an intelligent wristband system, comprising: recognizing a user of the wristband system; detecting at least one gesture made by the user, the at least one gesture indicating instructions to be performed by the wristband system; detecting at least one biotelemetric function associated with the user and using a control unit to generate biotelemetry data from the detected at least one biotelemetric function; detecting at least one physical activity associated with the user and using the control unit to generate physical activity data from the detected at least one physical activity; and displaying at least a portion of the biotelemetry data and physical activity data on a (preferably curved) electronic display screen of the wristband system.

Preferably, the gesture is selected from the group consisting of: (a) a user lifting a wrist of the user and rolling the wrist, (b) a user tapping the display screen, (c) a user making a swiping motion across the display screen, (d) a user tapping the display screen with two fingers, (e) a user pressing on the display screen, (f) a user making a circular motion on the display screen with a finger, and (g) a user shaking the wristband system.

Preferably, the gesture made by the user comprises lifting a wrist of the user and rolling the wrist.

Preferably, the method further comprises receiving instructions from the user to customize the sensitivity of the sensor to recognize the at least one gesture.

Preferably, the at least one biotelemetric function is selected from the group consisting of: (a) heart rate, (b) calories burned, (c) blood pressure, (d) skin temperature, (e) brain activity, (f) hydration level, and (g) galvanic skin response.

Preferably, the at least one physical activity is selected from the group consisting of: (a) steps taken by the user, (b) stairs climbed by the user, (c) physical movement by the user, (d) pace of the user, and (e) sleep patterns of the user.

Preferably, the method further comprises sending at least a portion of the biotelemetry data or physical activity data through a network to a computing device, wherein a software platform within the computing device analyzes the biotelemetry data and physical activity data, and wherein at least a portion of the analyzed biotelemetry data and physical activity data is viewed on a display screen.

Preferably, a second user uses the software platform to access the at least a portion of the analyzed biotelemetry data and physical activity data.

Preferably, the second user is a health coach or a personal assistant.

Preferably, the method further comprises receiving information from the software platform used by the second user, and displaying the information received to the user.

Preferably, the method further comprises recognizing the user based on the at least one biotelemetric function associated with the user.

Preferably, the method further comprises wirelessly communicating with a scale, the scale configured to calculate a measurement for at least one of (a) the weight of the user, (b) the BMI of the user, (c) the body fat percentage of the user, and (d) the heart rate of the user.

Preferably, the scale is configured to recognize the user based on a galvanic skin response associated with the user.

Preferably, the scale sends at least a portion of the calculated measurement through a network to a computing device, wherein a software platform on the computing device analyzes the calculated measurement, and wherein the software platform displays at least a portion of the analyzed calculated measurement on a display screen.

Preferably, the method further comprises receiving from the scale at least a portion of the calculated measurement through a network and displaying information about the calculated measurement to the user.

Preferably, the method further comprises sending to the scale at least a portion of the biotelemetry data or physical activity data, wherein the scale displays the at least a portion of the biotelemetry data or physical activity data on a display screen of the scale.

Preferably, the method further comprises determining the geospatial location of the user with a Global Positioning System (GPS).

Preferably, the method further comprises recommending an exercise course for the user using the GPS.

Preferably, the method further comprises displaying the geospatial location of the user on a map displayed on the display screen.

Preferably, the method further comprises displaying a schedule of the user on the display screen.

Preferably, the method further comprises displaying a message for the user on the display screen.

Preferably, the control unit is configured to use the detected at least one physical activity to make a recommendation to the user regarding the at least one physical activity.

Preferably, the method further comprises making a recommendation to the user based on at least a portion of the biotelemetry data and physical activity data.

Preferably, the method further comprises analyzing at least a portion of the biotelemetry data or physical activity data to determine at least one calculation of: (a) a sleep efficiency of the user, (b) a stress level of the user, (c) an exercise intensity of the user, (d) an activity level of the user, (e) a self-control level of the user, and (f) a general wellness level of the user.
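As one concrete example of such a calculation, sleep efficiency is conventionally defined as the fraction of time in bed actually spent asleep. The specification does not fix a formula, so the definition below is an assumption rather than the claimed method.

```python
def sleep_efficiency(minutes_asleep: float, minutes_in_bed: float) -> float:
    """Conventional sleep-efficiency metric: percentage of time in bed
    spent asleep. An illustrative assumption; the specification leaves
    the exact calculation open."""
    if minutes_in_bed <= 0:
        raise ValueError("minutes_in_bed must be positive")
    return 100.0 * minutes_asleep / minutes_in_bed
```

For example, a user asleep for 420 of 480 minutes in bed would score 87.5%, a value suitable for display on the wristband's screen.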

Preferably, the method further comprises providing the at least one calculation for display on the display screen.

Preferably, the method further comprises delivering an electric shock from an electric feedback stimulator in the wristband in response to an instruction from the control unit.

Preferably, a frequency, duration, or voltage of the electric shock is customizable.

Preferably, the electric feedback stimulator is configured to deliver the electric shock from the wristband at a specific time.

Preferably, the electric feedback stimulator is configured to deliver the electric shock from the wristband based on the biotelemetry data or physical activity data.

Preferably, the electric feedback stimulator is configured to deliver the electric shock from the wristband based on an instruction from the user or from another user.

Preferably, the electric shock is combined with a vibration from a haptic feedback stimulator of the wristband, and wherein the combined electric shock and vibration is delivered from the wristband based on an instruction from the control unit.

Preferably, the method further comprises delivering a vibration from a haptic feedback stimulator in the wristband in response to an instruction from the control unit.

Preferably, a frequency, duration, or amplitude of the vibration is customizable.

Preferably, the haptic feedback stimulator is configured to deliver the vibration from the wristband at a specific time.

Preferably, the haptic feedback stimulator is configured to deliver the vibration from the wristband based on the biotelemetry or physical activity data.

Preferably, the haptic feedback stimulator is configured to deliver the vibration from the wristband based on an instruction from the user or from another user.

According to another aspect of the present invention, there is provided a method of processing data from an intelligent wristband system, comprising: detecting a movement of the wristband, the movement comprising a gesture of a user wearing the wristband, the gesture indicating instructions to be performed by the wristband system, wherein each different gesture corresponds to different instructions for the wristband system; responsive to detecting the movement, activating a display screen of the wristband system, the display screen displaying information; and receiving an interaction with the display screen by the user, wherein the information displayed on the display screen is changed based on the interaction.

According to another aspect of the present invention, there is provided a method of processing data from a scale and an intelligent wristband system, comprising: detecting, by a wristband system worn by a user, that the user is interacting with a scale to determine body metric data; responsive to the detection, communicating with the scale to retrieve at least a portion of the body metric data determined by the scale; and providing the at least a portion of the body metric data retrieved from the scale for display to the user on a display of the wristband system.

Preferably, the body metric data includes a measurement for at least one of (a) the weight of the user, (b) the BMI of the user, (c) the body fat percentage of the user, and (d) the heart rate of the user.

Preferably, the method further comprises transmitting data to the scale for display to the user on a display of the scale.

Device for Providing Alerts

Broadly, the invention provides a device for providing alert signals to a user, the device comprising electrodes arranged to contact the user's skin so as to deliver a low intensity electric shock to the user.

According to the present invention, there is provided a wearable device for providing alert signals to a user, the device being adapted to be secured adjacent the skin of the user and comprising a controller adapted to receive an alert instruction corresponding to an alert event, and at least one electrode arranged to be in contact with the skin in use of the device, the controller being arranged to actuate the electrode(s) to supply an alert signal in response to the alert instruction, the alert signal comprising at least one pulse to cause a sensation in the skin of the user, in which, in response to the alert instruction, the controller is configured to select one of a plurality of alert signals in dependence upon the alert event.

Preferably, the controller comprises a wireless receiver for receiving an alert instruction corresponding to an alert event from a remote device, preferably a mobile phone.

Preferably, each alert signal corresponds to one of a plurality of different alert events, and each alert signal varies from each other signal by a variation of at least one of: a pulse count, a pulse frequency, a pulse duration, and a pulse voltage. Alternatively the alerts may vary from each other by a combination with another type of alert stimulus such as a light, sound or vibration.

Thus the device may comprise a vibrating element actuable by the controller, such as are known in the art, and thus each alert signal may include a combination of at least one shock pulse and a vibration pulse in order to provide further selectable alert signal configurations. The device may also include other alert outputs actuable by the controller, such as a light or a speaker for providing further alert signal combinations.

Thus the device can alert the user to different events, such as receiving a call or text message, etc., by delivering a mild electric shock or sequence of shocks to the user. Thus the user may receive a silent and discreet notification which is undetectable by others, and is able to recognize the event being notified by the number, frequency or intensity of the shocks delivered by the device. The user may also recognize alerts by the combination of at least one shock with other stimuli such as vibrations, light or sounds. Thus the plurality of alerts may correspond to a vocabulary of commands or words which the user is able to understand.
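Such a vocabulary might be represented as a mapping from alert events to distinct pulse patterns. The event names and parameter values below are illustrative assumptions; the specification defines parameter ranges, not specific assignments.

```python
# Hypothetical alert-signal vocabulary: each event maps to a distinct
# pulse pattern. Counts, frequencies, and voltages are illustrative
# values chosen within the ranges discussed in the text.
ALERT_VOCABULARY = {
    "incoming_call":  {"pulses": 3, "frequency_hz": 10, "voltage_v": 25},
    "text_message":   {"pulses": 1, "frequency_hz": 5,  "voltage_v": 20},
    "calendar_event": {"pulses": 2, "frequency_hz": 2,  "voltage_v": 22},
}

def signal_for(event: str) -> dict:
    """Select the alert signal corresponding to an alert event, as the
    controller described above might."""
    return ALERT_VOCABULARY[event]
```

Because each pattern differs in at least one of pulse count, frequency, or voltage, the user can distinguish events by sensation alone.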

In some embodiments, the device may be adapted to provide alerts at a plurality of alert levels, differing in characteristics such as type and intensity. These may be used to indicate urgency, whether for particular events or in order to escalate an alert in the absence of user response, say within a certain time. For example, a series of alerts may be provided, wherein an initial alert by discreet electric shock, if ignored by the user, may be followed by a larger shock, which, if further ignored by the user, may be followed by a less discreet vibration or an audible alert. Alternatively, combinations of different types of alert may be used together or simultaneously to indicate urgency or escalation. In some embodiments, a series of de-escalating alerts may be used, wherein an initial ‘strong’ alert, for example an audio alert, may be followed by successive ‘weak’ alerts, say vibrations or shocks, so as to serve as a gentle and discreet reminder to the user of an alert outstanding.
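The escalation sequence described above can be sketched as walking an ordered list of alerts until the user responds. This is a minimal sketch under the assumption that acknowledgement is modeled as a callback; the specification does not prescribe this structure.

```python
def escalate(alerts, acknowledged):
    """Deliver an ordered series of alerts, stopping at the first one
    the user acknowledges. `alerts` is ordered weakest-to-strongest for
    escalation (or strongest-to-weakest for the de-escalating variant);
    `acknowledged` is a callable returning True once the user responds.
    Returns the alerts actually delivered."""
    delivered = []
    for alert in alerts:
        delivered.append(alert)
        if acknowledged(alert):
            break
    return delivered
```

Reversing the list order yields the de-escalating variant, where a strong initial alert is followed by successively gentler reminders.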

Preferably, the device is configurable by or on behalf of the user to select an alert signal to correspond to a particular alert event, or to select a preconfigured set of signals corresponding to a set of events. Thus the user may choose a preferred sensation (or combination of sensations) to be associated with each event. Furthermore a user may select a preferred set of alerts. For example, a user with hearing impairment may select a set of alerts which do not include sounds.

The event may be an event which occurs on a wireless communication device such as a mobile device, for example a mobile phone, the event being a communication event such as receiving a call or message. The event may also be generated by a software application on the wireless communication device or other wireless enabled device such as a desktop or laptop computer, for example a calendar notification.

The device may include one or more sensors, for example environmental sensors such as a light sensor, an accelerometer, a gyro, or a GPS device, or physiological sensors such as a galvanic skin response sensor (GSR), heart rate monitor, etc. Preferably the electrodes may also serve as GSR plates.

Thus the device may provide alerts in relation to environmental events, such as proximity to a location, or in relation to physiological events such as a raised heart rate, or in relation to a combination of two or more of environmental, physiological and communication events. For example, the device may provide an alert comprising an instruction to take an action such as travel to a meeting, take a rest from an activity, or take exercise during a break between appointments, based on a combination of calendar information and information from the device sensors such as location and heart rate.

Furthermore the controller may select an alert signal in relation to an alert event in dependence upon an output of the one or more sensors. Thus the device may change the alert signals to be appropriate to a condition of the user such as the user's state of activity and/or physiological state such as heart rate. For example, when the sensor such as an accelerometer senses that the user is still, the controller may select ‘silent’ alerts such as shocks and/or vibrations, and when the sensor senses that the user is walking or running, the controller may select more intense alerts such as sounds or more intensive shocks. Sets of alert signals appropriate to various sensed conditions may be preconfigured or may be programmable by the user.
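The sensor-dependent selection just described amounts to choosing an alert mode from the sensed condition. In the sketch below, the activity measure and the 0.5 threshold are arbitrary illustrative assumptions.

```python
def select_alert_mode(accelerometer_activity: float) -> str:
    """Choose an alert style from sensed activity, per the passage
    above: silent alerts (shocks/vibrations) when the user is still,
    more intense alerts (sounds, stronger shocks) when moving.
    The 0.5 threshold is an arbitrary illustrative value."""
    if accelerometer_activity < 0.5:
        return "shock_or_vibration"
    return "sound_or_strong_shock"
```

In a fuller implementation, sets of alert signals for various sensed conditions could be preconfigured or user-programmable, as the text notes.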

Conveniently the device is in the form of a wristband having a releasable strap. The pulse is of an intensity such that the wearer is aware of the sensation, but it is not uncomfortable. For example, the pulse may have a voltage of up to about 45V, preferably above about 20V, and a current of up to about 5 mA, preferably above about 1 mA. The pulse may have a duration of between about 0.2 and 1 μs, and a repeat frequency of between about 1 and 100 Hz. Thus the intensity of the or each pulse causes a mild sensation similar to a light touch or tap. Any of the voltage, current, pulse duration and frequency may be variable.

In another aspect, the invention provides a wearable device for providing alert signals to a user, the device being adapted to be secured adjacent the skin of the user and comprising at least one sensor for sensing a parameter relating to the condition of the user, a controller adapted to receive an alert instruction corresponding to an alert event, the controller being arranged to actuate the device to supply an alert signal to the user in response to the alert instruction, the controller being configured to select one of a plurality of alert signals in dependence upon the alert event and the sensed parameter.

In another aspect, the invention provides an alert system comprising: a wearable device for providing alert signals to a user; a translation module adapted to convert an alert event into an alert to be provided to a user of said wearable device in dependence on at least one user related factor.

Preferably, the system further comprises a user input device adapted to receive input from a user as to the manner in which an event is converted into an alert.

Preferably, the translation module is adapted to convert said alert event into an alert signal which is a distinct combination of shocks, vibrations, and/or sounds.

Preferably, the wearable device comprises at least one electrode arranged to be in contact with the skin in use of the device for providing an alert stimulus; and said alert is said alert stimulus.

Preferably, the at least one user-related factor comprises one or more of: location; physiological state; movements; environment; activity; pre-set settings; and user priorities.

Preferably the wearable device comprises the wearable device as described herein.

Life Management System

Client devices associated with users of a life management system may collect data (e.g., biotelemetry data and/or activity data) about the user and provide it to a life management system. In certain embodiments, the life management system may perform certain actions based on the received data. Additionally, in some embodiments, the life management system may analyze the received data and/or make portions of the data accessible to an assistant associated with the user. The assistant may then perform certain actions and/or make recommendations based on the data. Additionally, in some embodiments, the life management system may utilize the received information to determine a mood of the user. The life management system, in accordance with the user's user controls, may then communicate the mood information to other users of the life management system. Additionally, in some embodiments, the user may configure user controls such that the life management system and/or the assistant may perform certain actions when one or more criteria are met.

According to one aspect of the present invention there is provided a method for making a recommendation for a user of a life management system, comprising: receiving user data relating to a user, the user data comprising biotelemetry data and activity data collected about the user wearing a client device; generating snapshot information using information from a group comprising: the biotelemetry data, activity data, social data associated with the user, and user profile information associated with the user; generating a recommendation using portions of the snapshot information; updating the snapshot information with the recommendation; and making a recommendation associated with the snapshot information in accordance with user controls associated with the user.

Preferably, the biotelemetry data comprises information related to at least one of: heart rate, calories burned, blood pressure, skin temperature, brain activity, hydration level, galvanic skin response, a pedometer count, an optical skin and blood vessel dilation measurement, a blood glucose level, a blood oxygen level, a blood alcohol level, an electrocardiogram, an electroencephalogram, an electromyogram, a respiration rate, a measure of stress, a number of steps taken, a measure of calories used, a measure of activity, a movement from an accelerometer, a movement from a gyroscope, a response to mechanical or electrical stimuli, an environment temperature, an ambient ultraviolet light level, and an ambient CO2 level.

Preferably, the activity data comprises information related to at least one of: steps taken, stairs climbed, exercise intensity, pace, sleep pattern, and sleep duration.

Preferably, the social data comprises information related to at least one of: a calendar for the user, an interest of the user, one or more connections to the user, and a location of the user.

Preferably, generating a recommendation using portions of the biotelemetry data and portions of the snapshot information, further comprises: determining values for one or more health parameters using portions of the snapshot information; and developing a recommendation based in part on a comparison between one or more of the values and corresponding threshold values.
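The threshold-comparison step above can be sketched as follows. The parameter names and threshold values are assumptions for illustration; the specification does not fix any particular parameters or limits.

```python
# Illustrative threshold comparison for generating recommendations.
# Parameter names and threshold values are hypothetical.
THRESHOLDS = {"resting_heart_rate": 80, "stress_level": 7, "hydration": 0.4}

def recommend(values: dict) -> list:
    """Compare health-parameter values derived from the snapshot
    information against threshold values and emit a recommendation
    for each parameter that is out of range."""
    recs = []
    if values.get("resting_heart_rate", 0) > THRESHOLDS["resting_heart_rate"]:
        recs.append("take a rest")
    if values.get("stress_level", 0) > THRESHOLDS["stress_level"]:
        recs.append("schedule a break")
    if values.get("hydration", 1.0) < THRESHOLDS["hydration"]:
        recs.append("have a glass of water")
    return recs
```

The resulting recommendations could then be attached to the snapshot information and surfaced to the user or to an assistant.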

Preferably, generating a recommendation using portions of the biotelemetry data and portions of the snapshot information, further comprises: analyzing a calendar associated with the user to identify a time slot for a recommended activity; and updating the recommendation with the time slot.

Preferably, generating a recommendation using portions of the snapshot information, further comprises: analyzing a calendar associated with the user and another calendar associated with a different user who is connected to the user, to identify a time slot for an activity; and updating the recommendation with the time slot.
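Identifying a mutually free time slot across two users' calendars reduces to intersecting their free intervals. The sketch below models busy periods as sorted, non-overlapping (start, end) pairs in fractional hours; this representation is an assumption, not the specification's data model.

```python
def free_slots(busy, day_start, day_end):
    """Return free (start, end) intervals in a day, given sorted,
    non-overlapping busy intervals in fractional hours."""
    slots, cursor = [], day_start
    for start, end in busy:
        if start > cursor:
            slots.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < day_end:
        slots.append((cursor, day_end))
    return slots

def shared_slot(busy_a, busy_b, day_start, day_end, length):
    """Find the earliest slot of at least `length` hours that is free
    in both users' calendars, or None if no such slot exists."""
    for sa, ea in free_slots(busy_a, day_start, day_end):
        for sb, eb in free_slots(busy_b, day_start, day_end):
            start, end = max(sa, sb), min(ea, eb)
            if end - start >= length:
                return (start, start + length)
    return None
```

For instance, with one user busy 9–10 and 12–13 and the other busy 9–11, the earliest shared one-hour slot in a 9–17 day is 11–12; the recommendation could then be updated with that slot.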

Preferably, the recommendation comprises a nutritional recommendation, an exercise-related recommendation, a scheduling recommendation, a travel-related recommendation, a shopping recommendation (e.g., purchase a good and/or service), a sleeping recommendation, a suggestion to add a particular calendar entry, an advertisement for a good, an advertisement for a service, a suggestion to improve one or more health parameters associated with the user, one or more advertisements that facilitate improving the one or more health parameters, one or more tips based on the user's activities (e.g., have a glass of water, take a break, etc.), or some combination thereof.

Preferably, the recommendation is based on at least one of an interest of the user, a schedule of the user, a location of the user, or health of the user.

Preferably the method further comprises: receiving a request from an assistant to the user for a portion of the snapshot information associated with the user; providing the portion of the snapshot information to the assistant in accordance with user controls associated with the user; and updating the snapshot information using actionable information received from the assistant.

Preferably, the assistant is a health coach, a personal assistant, or a customer service representative.

Preferably, the actionable information comprises: a new recommendation for the user, modifying the recommendation, adding a calendar entry to the user's calendar, purchasing a good, or purchasing a service.

Preferably, providing the portion of the snapshot information to the assistant in accordance with user controls associated with the user, comprises: providing a graphical user interface to the assistant that displays one or more health parameters, a momentum score for the user, analytical graphs for each of the displayed health parameters, and the recommendation.

Preferably, providing the portion of the snapshot information to the assistant in accordance with user controls associated with the user, comprises: providing a graphical user interface to the assistant that displays one or more calendar entries, one or more emails, and a momentum score for the user.

Preferably the method further comprises providing a graphical user interface to the user that displays information selected from the group consisting of: calendar information associated with the user, one or more health parameters associated with the user, analytical graphs for each of the displayed health parameters, the recommendation, and a momentum score.

Preferably, the method further comprises generating a card for presentation to the user, wherein the card includes the recommendation; and providing the card to the client device.

Preferably, the card also includes an icon related to a health parameter and the recommendation.

Preferably, generating a recommendation using portions of the snapshot information, further comprises: determining a value for a momentum score using portions of the snapshot information.

According to another aspect of the present invention, there is provided a method for making a recommendation for a user of a life management system, comprising: receiving user data relating to a user, the user data comprising biotelemetry data and activity data relating to the user; generating mood information using information from a group consisting of: the biotelemetry data, the activity data, social data associated with the user, and user profile information associated with the user; and providing, in accordance with user controls associated with the user, the mood information to a different user.

Preferably, the method further comprises designating the user as a trusted user in accordance with user controls associated with the user, such that the trusted user may provide mood information to other users.

Preferably, the different user provides the mood information to the user.

According to yet another aspect of the present invention there is provided a method for making a recommendation for a user of a life management system, comprising: receiving one or more user controls associated with the user, that establish one or more criteria for performing an associated action; determining whether the one or more criteria have been met; responsive to the determination that the one or more criteria are met, automatically performing the action, the action including generating an entry within a calendar of the user, the entry reserving a time slot in the calendar associated with the action to be taken; and providing, for display, the entry to the user.
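The criteria-triggered action in this aspect can be sketched as evaluating user-configured rules against the current snapshot and reserving a calendar entry for each rule that fires. Representing controls as (predicate, entry) pairs is an assumption for illustration.

```python
def apply_user_controls(controls, snapshot, calendar):
    """Evaluate user-configured criteria against the current snapshot
    and, for each rule whose criteria are met, automatically reserve a
    calendar entry. `controls` is a list of (criteria_met, entry) pairs,
    where `criteria_met` is a predicate over the snapshot; this
    structure is a hypothetical sketch, not the claimed data model."""
    for criteria_met, entry in controls:
        if criteria_met(snapshot):
            calendar.append(entry)
    return calendar
```

For example, a control whose criterion is "fewer than 5,000 steps today" could automatically reserve an evening-walk entry, which is then provided to the user for display.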

According to yet another aspect of the present invention there is provided a system for making a recommendation for a user of a life management system, comprising: receiving user data relating to a user, the data comprising biotelemetry data and activity data collected about the user; generating snapshot information using information from a group consisting of: the biotelemetry data, activity data, social data associated with the user, and user profile information associated with the user; generating a recommendation using portions of the snapshot information; updating the snapshot information with the recommendation; and making a recommendation associated with the snapshot information in accordance with the user controls associated with the user.

Preferably, the user data is received from a client device worn by the user, said client device being configured to collect biotelemetry data and activity data about the user.

Preferably, the biotelemetry data comprises information related to at least one of: heart rate, calories burned, blood pressure, skin temperature, brain activity, hydration level, galvanic skin response, a pedometer count, an optical skin and blood vessel dilation measurement, a blood glucose level, a blood oxygen level, a blood alcohol level, an electrocardiogram, an electroencephalogram, an electromyogram, a respiration rate, a measure of stress, a number of steps taken, a measure of calories used, a measure of activity, a movement from an accelerometer, a movement from a gyroscope, a response to mechanical or electrical stimuli, an environment temperature, an ambient ultraviolet light level, and an ambient CO2 level.

Preferably, the activity data comprises information related to at least one of: steps taken, stairs climbed, exercise intensity, pace, sleep pattern, and sleep duration.

Preferably, the social data comprises information related to at least one of: a calendar for the user, an interest of the user, one or more connections to the user, and a location of the user.

Preferably, generating a recommendation using portions of the biotelemetry data and portions of the snapshot information, further comprises: determining values for one or more health parameters using portions of the snapshot information; and developing a recommendation based in part on a comparison between one or more of the values and corresponding threshold values.

Preferably, generating a recommendation using portions of the biotelemetry data and portions of the snapshot information, further comprises: analyzing a calendar associated with the user to identify a time slot for a recommended activity; and updating the recommendation with the time slot.

Preferably, generating a recommendation using portions of the snapshot information, further comprises: analyzing a calendar associated with the user and another calendar associated with a different user who is connected to the user, to identify a time slot for an activity; and updating the recommendation with the time slot.

Preferably, the recommendation comprises a nutritional recommendation, an exercise-related recommendation, a scheduling recommendation, a travel-related recommendation, a shopping recommendation (e.g., purchase a good and/or service), a sleeping recommendation, a suggestion to add a particular calendar entry, an advertisement for a good, an advertisement for a service, a suggestion to improve one or more health parameters associated with the user, one or more advertisements that facilitate improving the one or more health parameters, one or more tips based on the user's activities (e.g., have a glass of water, take a break, etc.), or some combination thereof.

Preferably, the recommendation is based on at least one of an interest of the user, a schedule of the user, a location of the user, or health of the user.

Preferably, the method further comprises receiving a request from an assistant, for a portion of the snapshot information associated with the user; providing the portion of the snapshot information to the assistant in accordance with user controls associated with the user; and updating the snapshot information using actionable information received from the assistant.

Preferably, the request from an assistant is received from an assistant device associated with the assistant, and wherein the portion of the snapshot information is provided to the assistant device.

Preferably, the assistant is a health coach, a personal assistant, or a customer service representative.

Preferably, the actionable information comprises: a new recommendation for the user, modifying the recommendation, adding a calendar entry to the user's calendar, purchasing a good, or purchasing a service.

Preferably, providing the portion of the snapshot information to the assistant device in accordance with user controls associated with the user, comprises: providing a graphical user interface to the assistant that displays one or more health parameters, a momentum score for the user, analytical graphs for each of the displayed health parameters, and the recommendation.

Preferably, providing the portion of the snapshot information to the assistant in accordance with user controls associated with the user, comprises: providing a graphical user interface to the assistant that displays one or more calendar entries, one or more emails, and a momentum score for the user.

Preferably, the method further comprises providing a graphical user interface to the user that displays information selected from the group consisting of: calendar information associated with the user, one or more health parameters associated with the user, analytical graphs for each of the displayed health parameters, the recommendation, and a momentum score.

Preferably, the method further comprises generating a card for presentation to the user, wherein the card includes the recommendation; and providing the card to the client device.

Preferably, the card also includes an icon related to a health parameter and the recommendation.

Preferably, generating a recommendation using portions of the snapshot information, further comprises: determining a value for a momentum score using portions of the snapshot information.

According to a yet further aspect of the present invention there is provided a system for making a recommendation for a user of a life management system, comprising: receiving user data relating to a user, the data comprising biotelemetry data and activity data collected about the user; generating mood information using information from a group consisting of: the biotelemetry data, the activity data, social data associated with the user, and user profile information associated with the user; and providing, in accordance with user controls associated with the user, the mood information to a different user.

Preferably, the system further comprises: designating the user as a trusted user in accordance with user controls associated with the user, such that the trusted user may provide mood information to other users.

Preferably, the different user provides the mood information to the user.

Preferably, the user data is received from a client device worn by the user, the client device being configured to collect biotelemetry data and activity data.

According to a yet further aspect of the present invention there is provided a system for making a recommendation for a user of a life management system, comprising: receiving one or more user controls, from the user, that establish one or more criteria for performing an associated action; determining whether the one or more criteria have been met; responsive to the determination that the one or more criteria are met, automatically performing the action, the action including generating an entry within a calendar of the user, the entry reserving a time slot in the calendar associated with the action to be taken; and providing, for display, the entry to the user.

Preferably, the one or more user controls are received from a client device associated with the user, and the generated entry is provided to the client device for display.

According to a yet further aspect of the present invention there is provided a computer-implemented method comprising: receiving data from a client device worn by a user, the data comprising biotelemetry data and activity data collected about a user wearing the client device; generating snapshot information using information from a group consisting of: the biotelemetry data, activity data, social data associated with the user, and user profile information associated with the user; generating a recommendation using portions of the snapshot information; updating the snapshot information with the recommendation; and executing a recommendation associated with the snapshot information in accordance with the user controls associated with the user.

Preferably, the biotelemetry data comprises information related to at least one of: heart rate, calories burned, blood pressure, skin temperature, brain activity, hydration level, and galvanic skin response.

Preferably, the biotelemetry data comprises information related to at least one of: heart rate, calories burned, blood pressure, skin temperature, brain activity, hydration level, galvanic skin response, a pedometer count, an optical skin and blood vessel dilation measurement, a blood glucose level, a blood oxygen level, a blood alcohol level, an electrocardiogram, an electroencephalogram, an electromyogram, a respiration rate, a measure of stress, a number of steps taken, a measure of calories used, a measure of activity, a movement from an accelerometer, a movement from a gyroscope, a response to mechanical or electrical stimuli, an environment temperature, an ambient ultraviolet light level, and an ambient CO2 level.

Preferably, the activity data comprises information related to at least one of: steps taken, stairs climbed, exercise intensity, pace, sleep pattern, and sleep duration.

Preferably, the social data comprises information related to at least one of: a calendar for the user, an interest of the user, one or more connections to the user in the life management system, and a location of the user.

Preferably, generating a recommendation using portions of the biotelemetry data and portions of the snapshot information, further comprises: determining values for one or more health parameters using portions of the snapshot information; and developing a recommendation based in part on a comparison between one or more of the values and corresponding threshold values.
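The threshold comparison recited above can be sketched as follows. This is a minimal illustrative example only: the parameter names, threshold values, and recommendation text are assumptions for the sketch, not the disclosed implementation.

```python
# Hypothetical thresholds for health parameters derived from the snapshot
# information; names and values are illustrative assumptions.
THRESHOLDS = {"heart_rate": 100, "stress": 7}

def develop_recommendations(snapshot):
    """Return a recommendation for each health parameter whose value
    exceeds its corresponding threshold value."""
    recommendations = []
    for parameter, threshold in THRESHOLDS.items():
        value = snapshot.get(parameter)
        if value is not None and value > threshold:
            recommendations.append(
                f"{parameter} is {value} (threshold {threshold}): consider taking a break"
            )
    return recommendations
```

In this sketch a recommendation is produced only when a value crosses its threshold, so a snapshot with all parameters in range yields no recommendations.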

Preferably, generating a recommendation using portions of the biotelemetry data and portions of the snapshot information, further comprises: analyzing a calendar associated with the user to identify a time slot for a recommended activity; and updating the recommendation with the time slot.

Preferably, generating a recommendation using portions of the snapshot information, further comprises: analyzing a calendar associated with the user and another calendar associated with a different user of the life management system who is connected to the user, to identify a time slot for an activity; and updating the recommendation with the time slot.

Preferably, the recommendation comprises a nutritional recommendation, an exercise-related recommendation, a scheduling recommendation, a travel-related recommendation, a shopping recommendation (e.g., purchase a good and/or service), a sleeping recommendation, a suggestion to add a particular calendar entry, an advertisement for a good, an advertisement for a service, a suggestion to improve one or more health parameters associated with the user, one or more advertisements that facilitate improving the one or more health parameters, one or more tips based on the user's activities (e.g., have a glass of water, take a break, etc.), or some combination thereof.

Preferably, the recommendation is based on at least one of an interest of the user, a schedule of the user, a location of the user, or health of the user.

Preferably the method further comprises: receiving a request from an assistant device, associated with an assistant, for a portion of the snapshot information associated with the user; providing the portion of the snapshot information to the assistant device in accordance with user controls associated with the user; and updating the snapshot information using actionable information received from the assistant device.

Preferably, the assistant is a health coach, a personal assistant, or a customer service representative.

Preferably, the actionable information comprises: a new recommendation for the user, modifying the recommendation, adding a calendar entry to the user's calendar, purchasing a good, or purchasing a service.

Preferably, providing the portion of the snapshot information to the assistant device in accordance with user controls associated with the user, comprises: providing a graphical user interface to the assistant that displays one or more health parameters, a momentum score for the user, analytical graphs for each of the displayed health parameters, and the recommendation.

Preferably, providing the portion of the snapshot information to the assistant device in accordance with user controls associated with the user, comprises: providing a graphical user interface to the assistant that displays one or more calendar entries, one or more emails, and a momentum score for the user.

Preferably the method further comprises: providing a graphical user interface to the user that displays information selected from the group consisting of: calendar information associated with the user, one or more health parameters associated with the user, analytical graphs for each of the displayed health parameters, the recommendation, and a momentum score.

Preferably the method further comprises: generating a card for presentation to the user, wherein the card includes the recommendation; and providing the card to the client device.

Preferably, the card also includes an icon related to a health parameter and the recommendation.

Preferably, generating a recommendation using portions of the snapshot information, further comprises: determining a value for a momentum score using portions of the snapshot information.

According to a yet further aspect of the present invention there is provided a computer-implemented method comprising: receiving data from a client device worn by a user in association with a life management system, the data comprising biotelemetry data and activity data collected about a user wearing the client device; generating mood information using information from a group consisting of: the biotelemetry data, the activity data, social data associated with the user, and user profile information associated with the user; and providing, in accordance with user controls associated with the user, the mood information to a different client device associated with another user of the life management system.

Preferably the method further comprises: designating a user of the life management system a trusted user in accordance with user controls associated with the user, such that the trusted user may provide mood information to other users of the life management system.

Preferably, a client device associated with the another user provides the mood information to the requesting client device.

According to a yet further aspect of the present invention there is provided a computer-implemented method comprising: receiving one or more user controls, from a user of a life management system via a client device associated with the user, that establish one or more criteria for performing an associated action; determining whether the one or more criteria have been met; responsive to the determination that the one or more criteria are met, automatically performing the action, the action including generating an entry within a calendar of the user, the entry reserving a time slot in the calendar associated with the action to be taken; and providing, for display, the entry to the client device associated with the user.
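The criteria-triggered calendar action described above can be illustrated with a short sketch. The `Calendar` container, the entry fields, and the function name are hypothetical; the sketch only shows the pattern of testing a criterion and, when it is met, reserving a time slot in the user's calendar.

```python
from dataclasses import dataclass, field

@dataclass
class Calendar:
    """Minimal stand-in for a user's calendar: a list of reserved entries."""
    entries: list = field(default_factory=list)

def apply_user_control(calendar, criterion_met, action_name, start, duration_min):
    """If the user-defined criterion is met, automatically generate an entry
    reserving a time slot for the associated action; otherwise do nothing."""
    if not criterion_met:
        return None
    entry = {"action": action_name, "start": start, "duration_min": duration_min}
    calendar.entries.append(entry)
    return entry  # the entry would then be provided to the client device for display
```

A real system would evaluate the criterion against live biotelemetry or activity data; here it is passed in as a boolean for clarity.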

In one embodiment, data is received from a client device worn by a user, the data comprising biotelemetry data and activity data collected about a user wearing the client device. Snapshot information is generated using information from a group consisting of: the biotelemetry data, activity data, social data associated with the user, and user profile information associated with the user. A recommendation is generated using portions of the snapshot information, and the snapshot information is updated with the recommendation. The recommendation associated with the snapshot information is executed in accordance with the user controls associated with the user.

In another embodiment, data is received from a client device worn by a user in association with a life management system, the data comprising biotelemetry data and activity data collected about a user wearing the client device. Mood information is generated using information from a group consisting of: the biotelemetry data, the activity data, social data associated with the user, and user profile information associated with the user. The mood information is provided, in accordance with user controls associated with the user, to a different client device associated with another user of the life management system.

In yet another embodiment, one or more user controls are received from a user of a life management system via a client device associated with the user, that establish one or more criteria for performing an associated action. It is then determined whether the one or more criteria have been met, and responsive to the determination that the one or more criteria are met, the action is automatically performed where the action includes generating an entry within a calendar of the user, and the entry reserves a time slot in the calendar associated with the action to be taken.

According to one aspect of the present invention there is provided a computer-implemented method comprising: receiving data from a client device worn by a user, the data comprising biotelemetry data and activity data collected about a user wearing the client device; generating snapshot information using information from a group consisting of: the biotelemetry data, activity data, social data associated with the user, and user profile information associated with the user; generating a recommendation using portions of the snapshot information; updating the snapshot information with the recommendation; and executing a recommendation associated with the snapshot information in accordance with the user controls associated with the user.

Preferably, the biotelemetry data comprises information related to at least one of: heart rate, calories burned, blood pressure, skin temperature, brain activity, hydration level, and galvanic skin response.

Preferably, the activity data comprises information related to at least one of: steps taken, stairs climbed, exercise intensity, pace, sleep pattern, and sleep duration.

Preferably, the social data comprises information related to at least one of: a calendar for the user, an interest of the user, one or more connections to the user in the life management system, and a location of the user.

Preferably, generating a recommendation using portions of the biotelemetry data and portions of the snapshot information, further comprises: determining values for one or more health parameters using portions of the snapshot information; and developing a recommendation based in part on a comparison between one or more of the values and corresponding threshold values.

Preferably, generating a recommendation using portions of the biotelemetry data and portions of the snapshot information, further comprises: analyzing a calendar associated with the user to identify a time slot for a recommended activity; and updating the recommendation with the time slot.

Preferably, generating a recommendation using portions of the snapshot information, further comprises: analyzing a calendar associated with the user and another calendar associated with a different user of the life management system who is connected to the user, to identify a time slot for an activity; and updating the recommendation with the time slot.

Preferably, the recommendation comprises a nutritional recommendation, an exercise-related recommendation, a scheduling recommendation, a travel-related recommendation, a shopping recommendation (e.g., purchase a good and/or service), a sleeping recommendation, a suggestion to add a particular calendar entry, an advertisement for a good, an advertisement for a service, a suggestion to improve one or more health parameters associated with the user, one or more advertisements that facilitate improving the one or more health parameters, one or more tips based on the user's activities (e.g., have a glass of water, take a break, etc.), or some combination thereof.

Preferably, the recommendation is based on at least one of an interest of the user, a schedule of the user, a location of the user, or health of the user.

Preferably, the method further comprises receiving a request from an assistant device, associated with an assistant, for a portion of the snapshot information associated with the user; providing the portion of the snapshot information to the assistant device in accordance with user controls associated with the user; and updating the snapshot information using actionable information received from the assistant device.

Preferably, the assistant is a health coach, a personal assistant, or a customer service representative.

Preferably, the actionable information comprises: a new recommendation for the user, modifying the recommendation, adding a calendar entry to the user's calendar, purchasing a good, or purchasing a service.

Preferably, providing the portion of the snapshot information to the assistant device in accordance with user controls associated with the user, comprises: providing a graphical user interface to the assistant that displays one or more health parameters, a momentum score for the user, analytical graphs for each of the displayed health parameters, and the recommendation.

Preferably, providing the portion of the snapshot information to the assistant device in accordance with user controls associated with the user, comprises: providing a graphical user interface to the assistant that displays one or more calendar entries, one or more emails, and a momentum score for the user.

Preferably, the method further comprises providing a graphical user interface to the user that displays information selected from the group consisting of: calendar information associated with the user, one or more health parameters associated with the user, analytical graphs for each of the displayed health parameters, the recommendation, and a momentum score.

Preferably, the method further comprises generating a card for presentation to the user, wherein the card includes the recommendation; and providing the card to the client device.

Preferably, the card also includes an icon related to a health parameter and the recommendation.

Preferably, generating a recommendation using portions of the snapshot information, further comprises: determining a value for a momentum score using portions of the snapshot information.

According to another aspect of the present invention there is provided a computer-implemented method comprising: receiving data from a client device worn by a user in association with a life management system, the data comprising biotelemetry data and activity data collected about a user wearing the client device; generating mood information using information from a group consisting of: the biotelemetry data, the activity data, social data associated with the user, and user profile information associated with the user; and providing, in accordance with user controls associated with the user, the mood information to a different client device associated with another user of the life management system.

Preferably, the method further comprises designating a user of the life management system a trusted user in accordance with user controls associated with the user, such that the trusted user may provide mood information to other users of the life management system.

Preferably, a client device associated with the another user provides the mood information to the requesting client device.

According to another aspect of the present invention there is provided a computer-implemented method comprising: receiving one or more user controls, from a user of a life management system via a client device associated with the user, that establish one or more criteria for performing an associated action; determining whether the one or more criteria have been met; responsive to the determination that the one or more criteria are met, automatically performing the action, the action including generating an entry within a calendar of the user, the entry reserving a time slot in the calendar associated with the action to be taken; and providing, for display, the entry to the client device associated with the user.

Moods

According to the invention, there is provided a method of processing information relating to a user state, the method comprising determining a state of a user and affecting one or more elements external to the user in dependence on the user state, wherein the user state is determined from user biometric data in combination with user online activity.

Preferably, the user state is further determined from telemetry data. Preferably, the user state comprises a mood.

Preferably, biometric data comprises one or more of: a galvanic skin response, a blood pressure, a heart rate (pulse), a pedometer count, an optical skin and blood vessel dilation measurement, a skin hydration measurement, a blood glucose level, a blood oxygen level, a blood alcohol level, an electrocardiogram, an electroencephalogram, an electromyogram, a respiration rate, a skin temperature, a measure of stress, a number of steps taken, a measure of calories used, a measure of activity, a movement from an accelerometer, a movement from a gyroscope, a response to mechanical or electrical stimuli, an environment temperature, an ambient ultraviolet light level, touch, taste, smell, age, weight, height, fitness level, body-mass index, appearance rating, demographic data and an ambient CO2 level.

Preferably, user online activity comprises one or more of: user calendar activity, user browser, email, social media use. Preferably, user online activity comprises one or more of: user calendar activity, user browser, email, social media use, internet searches based on all names of individuals for a group of users, media setting preferences, playlists and bright colours, location, proximity to places, systems and people, time of day, week and year, status of day such as holiday, and affiliations to external groups.
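The combination of biometric data and online activity to determine a user state (mood) can be illustrated with a minimal heuristic sketch. The signal names, thresholds, and three-way labels below are assumptions made for illustration; they are not the disclosed determination method.

```python
def infer_mood(heart_rate, skin_conductance, calendar_load, social_posts):
    """Classify a coarse user state from biometric signals combined with
    online activity, using simple illustrative heuristics.

    heart_rate, skin_conductance: biometric inputs (bpm, normalized 0-1).
    calendar_load: appointments today; social_posts: posts today (online activity).
    """
    arousal = heart_rate > 90 or skin_conductance > 0.6   # biometric evidence
    busy = calendar_load > 5 or social_posts == 0         # online-activity evidence
    if arousal and busy:
        return "stressed"
    if busy:
        return "busy"
    return "relaxed"
```

A production system would fuse many more of the listed signals (GSR, calendar, email, location, and so on); the point here is only that the state is a joint function of biometric and online-activity inputs.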

Preferably, the method further comprises the feedback of information regarding the user state to the user.

Preferably each invention further comprises adapting one or more aspects of the user environment in response to the user state. Preferably the inventions further comprise adapting a computer user interface, according to the user state. Preferably the inventions further comprise adapting a combination of user interfaces, according to the user state. Preferably, adapting a computer user interface comprises using a style sheet adapted in dependence on the state of the user.

Preferably, the method further comprises forwarding information regarding the user state to another user and/or creating sets of users with user states that correlate with each other. Preferably each invention further comprises forwarding information regarding the user state or set of users state to another user via a trusted third party and/or an untrusted third party.

Preferably, the method further comprises the issuance and transmission of a certificate associated with a user to establish the identity and state of the user or sets of users. Preferably, the certificate is time-limited.

Preferably, the method further comprises forwarding the information regarding the user state to a set of users defined by role. Preferably each invention further comprises forwarding the information regarding the user state to a set of users of the same role as the user. Alternatively, each invention further comprises forwarding the information regarding the user state to a set of users of a different role to that of the user.

Preferably, information regarding the user state is used to establish communications with the user. Preferably, communications settings are configured automatically based on the information regarding the user state.

Preferably, the method further comprises assigning a user an access profile, wherein access to mood dependent data is restricted to users having a certain access profile. Preferably each invention further comprises grouping users together according to their user state, wherein mood dependent data is provided only to selected groups of users having a certain user state.

Preferably, the user is a group of individuals. Preferably each invention further comprises determining that the user is a group of individuals; assigning weighting to indicate relative importance of individuals in the group; and applying an algorithm to calculate a collective user state and/or mood of the user, taking into account the weighting of each individual in the group.
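The weighted collective-state calculation for a group can be sketched as a weighted average. Reducing mood to a single numeric score is an illustrative assumption; the disclosed algorithm is not specified to this level of detail.

```python
def collective_mood(individuals):
    """Weighted average of individual mood scores for a group.

    `individuals` is a list of (mood_score, weight) pairs, where the weight
    indicates the relative importance of that individual in the group.
    """
    total_weight = sum(w for _, w in individuals)
    if total_weight == 0:
        raise ValueError("weights must not all be zero")
    return sum(score * w for score, w in individuals) / total_weight
```

With equal weights this reduces to a plain mean; increasing one member's weight pulls the collective state toward that member's score.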

Preferably, the method further comprises stimulating one or more user senses that comprise the biometric data to achieve a change of user state. Preferably, the user senses in the biometric data comprise audible, visual, smell, touch, taste and temperature. Preferably each invention further comprises stimulating one or more of the user senses according to a predetermined rate of change profile.

Preferably, the method further comprises detecting the presence of a user in a room using a near field communication (NFC) antenna in a cell phone. Preferably each invention further comprises detecting the presence of a user in a room by detecting personal physical attributes of the user using one or more sensors in the room. Preferably each invention further comprises affecting one or more elements external to the user in the room upon detection of the user.

Preferably, the data relating to the user is received from a client device worn by the user, the client device being configured to collect biotelemetry data and activity data relating to the user.

According to the invention there is also provided a system for processing information relating to a user state, the system comprising: means for determining a state of a user; and means for affecting one or more elements external to the user in dependence on the user state, wherein the user state is determined from user biometric data in combination with user online activity.

According to the invention there is further provided a method of processing information relating to a user state, the method comprising: determining a state of a user; and affecting one or more elements external to the user according to the determined user state, wherein the user state is determined at least in part by an entertainment profile of the user.

According to the invention there is still further provided a system of processing information relating to a user state, the system comprising: means for determining a state of a user; and affecting one or more elements external to the user according to the determined user state, wherein the user state is determined at least in part by an entertainment profile of the user.

Preferably, an entertainment profile comprises media preferences including at least one of the following: audible, visual, time, location and genre.

Preferably, the user state is further determined by characteristics associated with the user. Preferably, characteristics comprise one or more items relating to biometrics, calendar items, email, social network activity, time, date, location, external events, playlists.

Preferably, the entertainment profile of a user can be used by another user to affect elements external to that another user.

According to the invention, there is further provided a method of detecting a user's presence in a room and selecting or adapting media being played through a home entertainment system according to the user's profile and/or mood.

According to the invention there is still further provided a system for detecting a user's presence in a room and selecting or adapting media being played through a home entertainment system according to the user's profile and/or mood.

Calendars

According to one aspect of the invention, there is provided a system for managing a calendar, comprising means for determining parameters associated with existing appointments, means for receiving user data relating to a user of the calendar and means for scheduling a new activity for a user in dependence on said user data and said parameters.

Preferably, the user data is at least one of: biometric data; location data; online activity data; and mood data. Preferably, the user data is provided by at least one of a sensor, a linked remote processor, and/or a local processor.

Preferably, the user data comprises user settings, preferably including at least one of: user preferences; user travel preferences; user purchase information; and user activity preferences. Preferably, the system is further adapted to enable a user to create and/or modify user settings.

Preferably, biometric data comprises one or more of: a galvanic skin response, a blood pressure, a heart rate (pulse), a pedometer count, an optical skin and blood vessel dilation measurement, a skin hydration measurement, a blood glucose level, a blood oxygen level, a blood alcohol level, an electrocardiogram, an electroencephalogram, an electromyogram, a respiration rate, a skin temperature, a measure of stress, a number of steps taken, a measure of calories used, a measure of activity, a movement from an accelerometer, a movement from a gyroscope, a response to mechanical or electrical stimuli, an environment temperature, an ambient ultraviolet light level, and an ambient CO2 level.

Preferably, the system further comprises a sensor for providing biometric data. Preferably, the system further comprises a wearable device incorporating the sensor for providing biometric data.

Preferably, location data comprises one or more of: a location; location-dependent data; local weather data; local traffic data; local time; local services information; and local public transport data.

Preferably, the system further comprises a sensor for providing a location and/or a means for receiving location-dependent data, preferably the sensor being a GPS sensor.

Preferably, online activity data comprises one or more of: user calendar activity, user browser, email and social media use. Preferably, the system further comprises a means for receiving online activity data.

Preferably, the parameters associated with existing appointments comprise one or more of: an appointment start time, an appointment end time, an appointment duration, an appointment location, an appointment attendee, an appointment invitee, an appointment purpose, an appointment category, appointment circumstance information, and further appointment information.

Preferably, the system further comprises a means for receiving appointment parameters. Preferably, the means for receiving appointment parameters comprises a communication link to a remote processor and/or an input interface for a user.

Preferably, scheduling a new activity for a user comprises one or more of: determining a suitable appointment for a new activity; determining a requirement for a new activity; determining the priority of a new activity; creating a new appointment for a new activity; determining further information in relation to a new activity; providing activity alternatives for user selection; determining an optimum activity alternative; executing an external booking for a new activity; executing a purchase for a new activity; seeking third-party approval for a new activity; providing third-party notification of a new activity; and providing information relating to the new activity.

Preferably, determining a suitable appointment for a new activity comprises determining an available time between a current time and a next appointment start time; and determining whether the available time is sufficient for an activity, preferably a wellbeing promoting activity.
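The available-time check described above can be sketched directly. Times are expressed in minutes, and the buffer is an illustrative assumption for transitions before and after the activity.

```python
def suitable_gap(now_min, next_start_min, activity_duration_min, buffer_min=5):
    """Return True if the time between now and the next appointment start
    is sufficient for the activity, allowing a small buffer on each side."""
    available = next_start_min - now_min
    return available >= activity_duration_min + 2 * buffer_min
```

For example, a 30-minute wellbeing activity fits a 60-minute gap but not a 30-minute gap once the buffers are counted.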

Preferably, determining a suitable appointment for a new activity comprises determining if an available time is within an activity-free period. Preferably, the activity-free period is determined in dependence on the user data, and preferably biometric data. Preferably, an activity-free period can be entered into the user calendar through user interaction.

Preferably, determining a requirement for a new activity comprises determining a first appointment location of a first existing appointment, and a second appointment location of a second existing appointment subsequent to the first appointment, and determining that a travel activity is required from the first to the second appointment location.

Preferably, determining a requirement for a new activity comprises determining from an appointment parameter that an associated activity is required. Preferably, the appointment parameter is an appointment location and the associated activity is travel to the appointment location. Preferably, the appointment parameter is an appointment required item and the associated activity is procuring the required item.
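Deriving a required travel activity from two consecutive appointments, as recited above, can be sketched as follows. The appointment fields and the plain string comparison of locations are simplifying assumptions.

```python
def required_travel(first_appt, second_appt):
    """Return a travel activity when two consecutive appointments have
    different locations; otherwise no travel activity is required."""
    if first_appt["location"] != second_appt["location"]:
        return {
            "activity": "travel",
            "from": first_appt["location"],
            "to": second_appt["location"],
            "arrive_by": second_appt["start"],  # travel must finish before this
        }
    return None
```

A fuller implementation would also estimate travel duration and check it against the gap between the two appointments.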

Preferably, scheduling a new activity for a user comprises determining a user propensity for a new activity. Preferably, a user propensity for a new activity takes into account a combination of subjective factors and objective factors. Preferably, a subjective factor is specific to a particular user. Preferably, a subjective factor is estimated in dependence on the user data. Preferably, a subjective factor is estimated in dependence on a user preference input.

Preferably, providing activity alternatives for user selection comprises evaluation of user data; and determining suitable new activities in dependence on user data.

Preferably, determining an optimum activity alternative comprises evaluation of a control factor. Preferably, the control factor includes, individually or in combination, optionally weighted, one, some, or all of the following: a cost, a user wellbeing factor, a user stress level, a user mood, a user propensity, a maximum duration of the second appointment, and a user preference.
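Selecting an optimum activity alternative from weighted control factors can be sketched as a weighted scoring function. The factor names, weights, and the higher-is-better convention are illustrative assumptions.

```python
def optimum_alternative(alternatives, weights):
    """Pick the alternative with the highest weighted control-factor score.

    `alternatives` maps an alternative name to a dict of factor scores
    (higher is better); `weights` maps factor names to relative importance.
    Unweighted factors contribute nothing.
    """
    def score(factors):
        return sum(weights.get(f, 0) * v for f, v in factors.items())
    return max(alternatives, key=lambda name: score(alternatives[name]))
```

Cost-like factors (where lower is better) would be negated or inverted before scoring; that normalization is omitted here for brevity.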

Preferably, providing information relating to the new activity comprises at least one of: providing a route or travel directions to the new activity location; providing address and/or contact information for the new activity venue; sending travel directions to a mobile device; and/or printing travel directions.

Preferably, scheduling a new activity for a user comprises evaluating whether the user has engaged a scheduled activity; and if not, then updating the scheduled activity to provide an alternative activity. Preferably, user location information can be used to provide the alternative activity at a location convenient to the user.

Preferably, the system further comprises providing access to the calendar associated with the user to an auxiliary user. Preferably, the system further comprises restricting the auxiliary user from and/or permitting the auxiliary user to adapt the calendar associated with the user. Preferably, the system further comprises applying a restriction and/or permission associated with a first auxiliary user to a second auxiliary user for delegation.

Preferably, the system further comprises a graphical user interface configured for user interaction with the system.

Preferably, the system further comprises payment means for performing a payment transaction associated with an appointment in a calendar. Preferably, the payment means is further adapted to provide evidence of a transaction to the user or to an auxiliary user and/or to a remote and/or local processor. Preferably, the remote and/or local processor is arranged to perform accounting functions. Preferably, the payment transaction relates to the user's contribution toward a group event.

According to another aspect of the invention, there is provided a system for managing a calendar, the system comprising means for determining parameters associated with existing appointments, means for receiving data relating to a user of the calendar, and means for performing a payment transaction as a result of an appointment in the calendar, wherein the payment transaction relates to the user's contribution toward a group event.

Preferably, the or a remote and/or local processor performs a payment transaction on behalf of a plurality of users.

Preferably, the appointment in the calendar comprises data relating to one or more of: cost sharing information; a total appointment cost; a maximum number of appointment participants; a fixed number of participants; a fixed cost per participant; a variable cost per participant, optionally dependent on a number of participants; a payment means; a payment deadline; a late payment consequence; a payment failure provision; a minimum event commitment requirement; and an oversubscription provision.
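As a sketch of how an appointment might carry this cost-sharing data, consider the hypothetical structure below; the field names and the variable-cost rule (total cost split evenly across current participants) are assumptions for illustration, not drawn from the specification.

```python
from dataclasses import dataclass, field

@dataclass
class GroupAppointment:
    """Hypothetical appointment record carrying cost-sharing data."""
    total_cost: float
    max_participants: int
    participants: list = field(default_factory=list)
    payment_deadline: str = ""

    def cost_per_participant(self):
        # Variable cost per participant, dependent on the number joined.
        if not self.participants:
            return self.total_cost
        return self.total_cost / len(self.participants)

booking = GroupAppointment(total_cost=120.0, max_participants=6)
booking.participants.extend(["alice", "bob", "carol"])
share = booking.cost_per_participant()
```

With three participants each share is 40.0; a fixed-cost-per-participant scheme would simply replace the division with a constant.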

Preferably, performance of a payment transaction is deferred following acceptance of an invitation to an appointment in the calendar. Preferably, the appointment is cancelled if an insufficient number of users accept the invitation by a predetermined acceptance deadline, and optionally wherein no payment transaction is performed. Preferably, the appointment is cancelled if an insufficient level of funds is committed by users that accepted the invitation by a predetermined acceptance deadline, and optionally wherein no payment transaction is performed. Preferably, a payment transaction is refunded if the appointment is cancelled.

Preferably, a payment transaction for a deposit payment is performed at acceptance of an invitation, and optionally a further payment transaction is deferred following acceptance of the invitation. Preferably, in the case of the user failing to perform a further payment transaction a deposit payment is not refunded as a payment failure provision.
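Taken together, the acceptance-deadline and deposit provisions above might behave as in this sketch. All thresholds and the exact forfeiture rule are assumptions made for the example: a cancelled appointment refunds everything paid, while a user who accepted but never paid the balance forfeits the deposit.

```python
def settle_appointment(acceptances, min_acceptances, deposits, balances_paid):
    """Sketch of the cancellation/refund provisions described above.

    `deposits` and `balances_paid` map user -> amount paid. If too few
    users accept by the deadline the appointment is cancelled and all
    payments are refunded; otherwise a user who fails to pay the balance
    forfeits the deposit (payment failure provision).
    """
    refunds = {}
    if len(acceptances) < min_acceptances:
        for user in acceptances:
            refunds[user] = deposits.get(user, 0.0) + balances_paid.get(user, 0.0)
        return "cancelled", refunds
    for user in acceptances:
        if balances_paid.get(user, 0.0) == 0.0:
            refunds[user] = 0.0  # deposit forfeited as payment failure provision
    return "confirmed", refunds

status, refunds = settle_appointment(
    acceptances=["alice", "bob"],
    min_acceptances=2,
    deposits={"alice": 10.0, "bob": 10.0},
    balances_paid={"alice": 30.0, "bob": 0.0},
)
```

Here the appointment proceeds (two acceptances meet the minimum), and bob's unpaid balance means his deposit is retained rather than refunded.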

Preferably, the appointment in the calendar comprises an indicator of whether a minimum event commitment requirement is fulfilled or not, and preferably wherein the indicator is a colour of the appointment in the calendar. Preferably, the system comprises means for providing a reminder in relation to performing a payment transaction relating to an appointment in the calendar.

Preferably, the oversubscription provision relates to the remote and/or local processor performing a further payment transaction on behalf of the plurality of users. Preferably, the oversubscription provision relates to allocation of a cancellation to one or more of the users that accepted the invitation.

Preferably, the system further comprises means for prioritising users that accepted the invitation for allocation of a cancellation to one or more low-priority users, and preferably wherein the prioritising occurs in dependence on user data, including biometric data; location data; online activity data; and/or mood data. Preferably, the prioritising occurs in dependence on date of invitation acceptance; date of user performing a payment transaction; a random factor; and/or a user-selected priority.
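One way the prioritising might work is sketched below; the ranking keys (user-selected priority first, then earlier acceptance, then earlier payment) are an example ordering, not a rule from the specification.

```python
def allocate_cancellations(accepted, overflow):
    """Return the `overflow` lowest-priority users to be dropped.

    A user-selected priority dominates; earlier acceptance and earlier
    payment each raise a user's priority within the same level.
    """
    ranked = sorted(
        accepted,
        key=lambda u: (u.get("user_priority", 0), -u["accepted_day"], -u["paid_day"]),
        reverse=True,
    )
    return [u["name"] for u in ranked[-overflow:]]

# Hypothetical acceptances: carol accepted and paid last.
users = [
    {"name": "alice", "accepted_day": 1, "paid_day": 1},
    {"name": "bob", "accepted_day": 2, "paid_day": 3},
    {"name": "carol", "accepted_day": 3, "paid_day": 4},
]
dropped = allocate_cancellations(users, overflow=1)
```

With one oversubscribed place, the latest acceptor (carol) receives the cancellation; biometric, location, or mood data could be folded into the sort key in the same manner.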

According to another aspect of the invention there is provided a method of managing a calendar, the method comprising: determining parameters associated with existing appointments; receiving user data relating to a user of the calendar; and scheduling a new activity for a user in dependence on said user data and said parameters.

According to the present invention there is also provided a method of managing a calendar, the method comprising: determining parameters associated with existing appointments; receiving user data relating to a user of the calendar; and performing a payment transaction as a result of an appointment in the calendar, wherein the payment transaction relates to the user's contribution toward a group event.

According to a further aspect of the invention, there is provided a method of configuring a user account, comprising detecting when the user is arranging an event to be attended by selected participants, determining whether there is a pre-existing event arranged by one of the selected participants, determining whether there is a group and/or team associated with the pre-existing event and offering the user the option to become a member of the group and/or team, optionally to attend the pre-existing event.

Preferably, the method further comprises making the user a member of the team and/or group. Preferably, the method further comprises adding the members of the team to the user's list of contacts and/or list of teams and/or groups. Preferably, the method further comprises adding calendar entries for the team to the calendar of the user.

According to a still further aspect of the invention, there is provided a method of configuring a user account, comprising detecting when the user is adding a new contact, determining whether the new contact is a member of a pre-existing team and/or group and offering the user the option to become a member of the pre-existing team and/or group.

Preferably, the method further comprises making the user a member of the team and/or group. Preferably, the method further comprises adding the members of the team to the user's list of contacts and/or adding the user to the team members' list of contacts. Preferably, the method further comprises adding calendar entries for the team to the calendar of the user.

According to the present invention there is also provided a method of co-ordinating user interfaces between first and second user devices, the method comprising: receiving a signal at a first device from a second device, the signal relating to the status of at least one application on the second device; and adjusting the status of a corresponding application on the user interface of the first device.

Preferably, the status of the application on the second device is at least one of: selected, pinned and popular. Preferably, adjusting the status comprises promoting the prominence of the corresponding application on the user interface of the first device, optionally activating the corresponding application on the first device. Preferably, adjusting the status comprises duplicating the status of the application on the second device on the corresponding application on the user interface of the first device.
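The status-mirroring step might be sketched as follows; the status values follow the paragraph above, but the promotion rule (incrementing a prominence counter) and the dictionary shape are invented for illustration.

```python
def apply_status_signal(ui, signal):
    """Mirror an application's status from a second device onto this UI.

    `ui` maps app name -> {"status": ..., "prominence": int}. A "pinned"
    or "popular" status promotes the app's prominence on this device;
    a "selected" status also activates the corresponding application.
    """
    app = ui.setdefault(signal["app"], {"status": "idle", "prominence": 0})
    app["status"] = signal["status"]      # duplicate the remote status locally
    if signal["status"] in ("pinned", "popular"):
        app["prominence"] += 1            # promote on the local user interface
    app["active"] = signal["status"] == "selected"
    return ui

ui = {"calendar": {"status": "idle", "prominence": 0}}
apply_status_signal(ui, {"app": "calendar", "status": "pinned"})
```

After the signal, the calendar application is pinned and promoted on the first device without having been activated.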

According to the present invention there is also provided a method of configuring a user interface of a user device, the method comprising: determining a telemetry parameter related to the device; determining an item of biometric data relating to the user of the device; and adjusting the user interface in dependence on the telemetry and biometric parameters in order to promote a desired behaviour of the user.

Preferably, the telemetry parameter relates to at least one of: phone usage, tariff plan, and API system wide information or API system logs events. Preferably, the API system wide information comprises at least one of: battery state; MSISDN; network name; signal strength; network type; service area; roaming state; mobile network state; IMEI; IMEI SV; MAC address (Wi-Fi); Bluetooth address; uptime; activity name; network MCC; network MNC; phone model; OS version; firmware version; kernel version; build number; software version; device locale; list of installed applications; memory information; GPS last position; and display manufacturer. Preferably, the API system logs events comprise at least one of: camera state; screen actions; alarm indications; Wi-Fi state; application crash log; camera usage; screen orientation; call start; call number; SMS sent; SMS number; email accounts information (sent, received and other); and browser history (visited sites).

Preferably, the biometric data relates to at least one of: a galvanic skin response, a blood pressure, a heart rate (pulse), a pedometer count, an optical skin and blood vessel dilation measurement, a skin hydration measurement, a blood glucose level, a blood oxygen level, a blood alcohol level, an electrocardiogram, an electroencephalogram, an electromyogram, a respiration rate, a skin temperature, a measure of stress, a number of steps taken, a measure of calories used, a measure of activity, a movement from an accelerometer, a movement from a gyroscope, a response to mechanical or electrical stimuli, an environment temperature, an ambient ultraviolet light level, and an ambient CO2 level.

Preferably, the method further comprises: determining a utility activity of the user; and adjusting the user interface in dependence also on the utility activity of the user. Preferably, the utility activity comprises at least one of: Calendar events, emails, Facebook activity, Twitter activity and Browsing activity.

According to another aspect of the invention, there is provided a method of facilitating use of a calendar comprising receiving user biometric data and user online activity data in relation to a user; and adapting a calendar associated with the user in response to the user biometric data and user online activity data.

Preferably, the method further comprises providing access to the calendar associated with the user to an auxiliary user. Preferably, the method further comprises restricting the auxiliary user from and/or permitting the auxiliary user to adapt the calendar associated with the user. Preferably, the method further comprises applying a restriction and/or permission associated with a first auxiliary user to a second auxiliary user for delegation.

Preferably, the method further comprises receiving user device information from a user device associated with the calendar, and adapting the calendar in response to the user device information. Preferably, the user device information is user location information from a user device location detector. Preferably, adapting the calendar in response to the user location information comprises setting a user local time. Preferably, adapting the calendar in response to the user device information comprises transferring calendar functionality from the user device to another device. Preferably, the transferred calendar functionality is maintenance of a to-do list and/or receiving user biometric data and/or user online activity data. Preferably, adapting the calendar in response to the user device information comprises adapting a display associated with the calendar.

Preferably, the method further comprises receiving user device information from a first user device associated with the calendar, and adapting a second user device associated with the calendar in response to the first user device information. Preferably, adapting a second user device comprises adapting the calendar display in the second user device in dependence on a user selection relating to the calendar display in the first user device.

Preferably, the user device information is user device battery status information. Preferably, when a user device battery reaches a predetermined level of charge, an auxiliary user is alerted to the state of the battery level.

Preferably, the method further comprises receiving a calendar appointment with at least one invited user; determining that the invited user has a pre-existing calendar appointment that corresponds to the received calendar appointment; and synchronising the calendar appointment by adding the user as invitee to the pre-existing calendar appointment. Preferably, the method further comprises receiving user approval of a proposed synchronisation. Preferably, the method further comprises adding contact information of one or more further users associated with the pre-existing calendar appointment to the calendar.
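The synchronisation step above can be sketched as a simple matching pass; the matching rule used here (identical title and start time) is an assumption, chosen only to make the example concrete.

```python
def synchronise(calendar, incoming, user):
    """Add `user` as invitee to a matching pre-existing appointment.

    If no appointment in `calendar` corresponds to `incoming`, the new
    appointment is appended instead of being merged.
    """
    for appt in calendar:
        if (appt["title"], appt["start"]) == (incoming["title"], incoming["start"]):
            if user not in appt["invitees"]:
                appt["invitees"].append(user)
            return appt
    calendar.append(incoming)
    return incoming

calendar = [{"title": "Team run", "start": "2014-09-04T18:00", "invitees": ["bob"]}]
incoming = {"title": "Team run", "start": "2014-09-04T18:00", "invitees": []}
matched = synchronise(calendar, incoming, "alice")
```

The incoming appointment is recognised as the pre-existing one, so alice is added as an invitee and no duplicate entry is created; a user-approval prompt would sit between matching and merging.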

Further details relating to various aspects of the present invention are described in the following patent applications, the contents of which are hereby incorporated by reference in their entirety:

United Kingdom Patent Application No. 1315765.6, titled “Processing system and method”, filed Sep. 4, 2013;

United Kingdom Patent Application No. 1400225.7, titled “Processing system and method”, filed Jan. 7, 2014;

United Kingdom Patent Application No. 1315764.9, titled “Device for providing alerts”, filed Sep. 4, 2013;

U.S. Provisional Patent Application No. 61/874,107, titled “Intelligent Wristband and Life Management Environment” filed on Sep. 5, 2013;

U.S. Provisional Patent Application No. 61/874,219, titled “Life Management System”, filed on Sep. 5, 2013; and

four PCT applications filed on the same day as the present application by the same applicant titled “Processing system and method” (three applications, with agent references P41407 WO, P41407WO-01 and P43674WO), and “Device for providing alerts” (agent reference P41406WO) respectively.

Any feature in any of the abovementioned documents may be combined with any feature described herein in any appropriate combination.

The invention extends to any novel aspects or features described and/or illustrated herein. Further features of the invention are characterised by the other independent and dependent claims.

Any feature in one aspect of the invention may be applied to other aspects of the invention, in any appropriate combination. In particular, method aspects may be applied to apparatus aspects, and vice versa.

Furthermore, features implemented in hardware may be implemented in software, and vice versa. Any reference to software and hardware features herein should be construed accordingly.

Any apparatus feature as described herein may also be provided as a method feature, and vice versa. As used herein, means plus function features may be expressed alternatively in terms of their corresponding structure, such as a suitably programmed processor and associated memory.

It should also be appreciated that particular combinations of the various features described and defined in any aspects of the invention can be implemented and/or supplied and/or used independently.

The invention extends to methods and/or apparatus substantially as herein described with reference to the accompanying drawings.

DETAILED DESCRIPTION

The figures and the following description relate to preferred embodiments by way of illustration only. It should be noted from the following discussion that alternative embodiments of the structures and methods disclosed herein will be readily recognized as viable alternatives that may be employed without departing from the principles of what is claimed.

Reference will now be made in detail to several embodiments, examples of which are illustrated in the accompanying figures. It is noted that wherever practicable similar or like reference numbers may be used in the figures and may indicate similar or like functionality. The figures depict embodiments of the disclosed system (or method) for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.

A wristband system includes an intelligent wearable wristband. The wristband comprises one or more sensors, a curved electronic display screen, and a control unit. The wristband determines physical activity data, biotelemetry data, and/or gesture data of a wearing user, and provides recommendations for the user on the curved display screen of the wristband based on this data. The wristband is connected through a network to a life management system, which can analyze data from the wristband, along with other data for the user (such as social data) and input from a second user, to make recommendations for the user.

FIGS. 1A and 1B illustrate different perspective views of a wearable intelligent wristband 100. An intelligent wristband 100 is configured to be worn by a user. The wristband 100 is generally worn around a user's wrist, but can be worn anywhere on a user's body including around an arm or leg. The wristband 100 or components thereof can also be coupled to a user's waist, chest, head, or any other body part. The wristband 100 may be made of any solid substance, such as plastic, metal, glass, silicone, or rubber. Furthermore, the wristband 100 may include various input mechanisms, such as buttons, switches, dials, and touch-screen mechanisms. FIGS. 1A and B show just one example of a wristband design, though a variety of other band shapes and designs and screen shapes and designs are also possible.

The wristband 100 includes an electronic display screen 102. The screen 102 could be an LED display, an OLED display, a plasma display, a liquid crystal display (LCD), a dot matrix display, or any other type of screen display. In one embodiment, the display screen 102 is elongated and/or curved to match the curvature of the wristband 100. The display screen 102 may have a length covering a portion of the wristband 100, or the display screen 102 may curve all the way around the wristband 100. Also, the display screen 102 may include touch-screen capabilities for interaction from a wearing user.

The wristband 100 includes one or more sensors (shown and described regarding FIGS. 4A-F). In an embodiment, a sensor is configured to detect physical activity by a wearing user. In another embodiment, a sensor is configured to detect biotelemetry data for a wearing user. In yet another embodiment, a sensor is configured to detect gestures made by a wearing user. The data detected by the one or more sensors is processed by a control unit (shown and described regarding FIGS. 4A-F) of the wristband 100. The control unit is configured to control the operations of the wristband, including the receiving, processing, and transmitting sensor data and other data. In one embodiment, the control unit is a processor. The wristband 100 can show a representation of this data on the display screen 102. The wristband 100 can also display a date, time, or other information on the display screen 102.

Furthermore, the wristband 100 can be connected to one or more external devices, such as a computing device, a scale, another wristband, or another monitoring device. In one embodiment, the wristband and the one or more external devices are connected through a network. In a further embodiment, the network is a wireless network. In another embodiment, the wristband 100 and the one or more external devices are connected directly through a cable or other connector, such as a USB connector.

FIGS. 2A-B illustrate a life management environment 200 including a client device 202, an assistant device 204, a scale device 206, and a life management system 208, according to one embodiment. A network 210 connects the client device 202, assistant device 204, scale device 206, and life management system 208. Only one instance of each device and system is shown; however, there can be more than one of each entity in some embodiments, and some embodiments may lack one or more of the entities. Other entities not shown here may also connect via the network 210. Moreover, the environment may also include multiple networks.

The client device 202 is associated with a wearing user of the wristband system. There may be one or more client devices 202 associated with a single wearing user of a wristband system in the system 200. In one embodiment, the client device 202 is a wristband 100. In other embodiments, the client device 202 is another wristband belonging to a third party, or a distinct monitoring device such as a second wristband for the wearing user. In still another embodiment, the client device 202 is a computer such as a mobile device, desktop computer, or laptop computer. In some embodiments, multiple of these client devices 202 may connect to the network 210, such that the user has one or more wearable devices (e.g., a wristband 100), a mobile phone, a tablet, a personal computer, among other devices that connect to and share information with one another or with other devices across the network 210. The one or more client devices 202 can also exchange data with the life management system 208, assistant device 204, and scale device 206. For example, a wristband 100 client device 202 provides capabilities for detecting physical activity data, biotelemetry data, or other data from a wearing user, and provides capabilities for generating scores and recommendations for the wearing user. The data can also be transmitted to the life management system 208, assistant device 204, or scale device 206 for further analysis and/or display. Furthermore, recommendations and other information can be received from the life management system 208, assistant device 204, or scale device 206 and processed by the client device 202 to be displayed to a wearing user.

In some embodiments, one or more client devices 202 (e.g., mobile phone, tablet, computer, etc.) execute an application designed to interact with and analyze data from a wristband worn by the user, and to allow the user to interact with the life management system 208. For example, a mobile application running on a user's mobile phone might include a calendar that allows the user to manage his schedule, leverage external data such as traffic or weather to help manage his time, exchange messages, track health/wellness, and interact with other users or utilize social data from social networking sites, among other capabilities. Any of this data can be shared amongst other components of the life management environment 200, including being provided for display to the user on a wristband. As another example, a web portal can be provided to allow the user to access his data (e.g., sensor data collected, scheduling, health management, etc.) via his computer. In one embodiment, the client device 202 can execute a browser application to enable interaction between the client device 202 and the life management system 208 via the network 210. In another embodiment, a client device 202 interacts with the life management system 208 through an application programming interface (API) running on a native operating system of the client device 202, such as IOS® or ANDROID™.

The assistant device 204 is associated with an assistant of a wearing user of a wristband 100. The assistant may be any third party associated with the wearing user, such as a doctor, health or fitness coach, a concierge, an executive or personal assistant, or a friend of the user. In an embodiment, the assistant device 204 is a computer such as a mobile device, desktop computer, or laptop computer. In another embodiment, the assistant device 204 is a wristband 100. There may be one or more assistant devices 204 associated with each wearing user (or with each client device 202). The assistant device 204 exchanges data with the life management system 208, client device 202, and scale device 206. For example, an assistant device 204 may receive information regarding a wearing user, such as social data, biotelemetry data, physical activity data, and other data. An assistant using the assistant device 204 may analyze this information and provide a recommendation to the wearing user based on the information.

The scale device 206 is associated with a scale for taking certain measurements associated with a user. The scale device 206 exchanges data with the life management system 208, client device 202, and assistant device 204. The scale device 206 can be a scale that detects the weight of a user. In some embodiments, the scale 206 also detects the body mass index (BMI), body fat percentage, or heart rate of the user. In an embodiment, the scale 206 detects that a wearing user of a wristband 100 is currently using the scale 206. The scale 206 then detects the weight of the user along with other body metric information, and transmits this information to the client device 202, assistant device 204, or life management system 208. Likewise, the scale 206 may receive information from another device about the wearing user currently using the scale 206, and provide for display on the scale 206 some of the received information (such as biotelemetry data or physical activity data). In some embodiments, the device 206 can be another smart device for tracking the fitness of a user, including exercise equipment, body fat analyzers, etc.

The life management system 208 exchanges data with the client device 202, assistant device 204, and scale device 206. The client device 202, assistant device 204, and scale device 206 can all access the life management system 208 through the network 210, and can provide data to the life management system 208. This data can include social data, recommendations for a wearing user from an assistant, biotelemetry data, physical activity data, body metric data, or other data. The life management system 208 can analyze the data and recommendations through a software platform, and transmit at least a portion of the analysis as a recommendation or other information to the client device 202, assistant device 204, or scale device 206 for display.

In some embodiments, some or all of the functions of the life management system 208 are provided via a cloud computing environment. In one example, the cloud represents a personal cloud for the user on which user data can be stored and accessed and which includes one or more engines for analysis of the user data, including using machine learning algorithms that process user biotelemetry data and associate this with the context under which this data is received (e.g., when the user's schedule shows that he is busy and so may be under stress) to identify patterns that can be used for better analysis of user data in the future and provision of better advice to the user. In some embodiments, the cloud also incorporates data from other sources, such as social networking sites, travel sites (regarding a user's travel schedule), health sites (tracking a user's health), calendar/email/contact management sites or programs (for incorporating a user's schedule), and so forth.
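A minimal sketch of the context-association idea, assuming the simplest possible scheme: each biotelemetry sample is tagged with whether the calendar marked the user busy at that time, and per-context averages are compared. The sample values and the busy/free split are invented; a production engine would use machine learning rather than averaging.

```python
def context_averages(samples, busy_slots):
    """Average a biometric reading separately for busy and free contexts.

    `samples` is a list of (hour, reading) pairs; `busy_slots` is the set
    of hours the user's schedule marks as busy.
    """
    buckets = {"busy": [], "free": []}
    for hour, pulse in samples:
        buckets["busy" if hour in busy_slots else "free"].append(pulse)
    return {ctx: sum(v) / len(v) for ctx, v in buckets.items() if v}

# Hypothetical pulse samples (hour of day, beats per minute).
samples = [(9, 88), (10, 92), (14, 70), (20, 64)]
busy_slots = {9, 10}  # hours the calendar marks as busy
averages = context_averages(samples, busy_slots)
```

The elevated busy-context average (90 vs. 67 bpm here) is the kind of pattern the cloud engines could learn from to give better advice, e.g. suggesting a break before a busy block.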

The network 210 facilitates communications between the client device 202, the assistant device 204, the life management system 208, and the scale device 206. In an embodiment, the network 210 uses standard communications technologies and/or protocols. For example, the network 210 can include links using technologies such as Ethernet, 3G, worldwide interoperability for microwave access (WiMAX), and digital subscriber line (DSL), etc. Networking protocols used on the network can include, for example, multiprotocol label switching (MPLS), the transmission control protocol/Internet protocol (TCP/IP), the hypertext transfer protocol (HTTP), or the file transfer protocol (FTP), etc. The data exchanged over the network can be represented using technologies and/or formats including the hypertext markup language (HTML), or the extensible markup language (XML), etc. Some or all of the links can be encrypted by conventional encryption technologies such as secure sockets layer (SSL) or virtual private networks (VPNs), etc. In some embodiments, the client device 202, assistant device 204, scale device 206, and life management system 208 can use custom and/or dedicated data communications technologies instead of, or in addition to, the ones described above.

FIG. 2B shows the relationship between various aspects of the system 208 (“Life management”). The system 208 may perform the following tasks:

    • 1. Collate digital information about the User from multiple sources
    • 2. Analyze said data to provide insights
    • 3. Merge insights and additional services into Life Management
    • 4. Provide browser 210, mobile app 212 and device user interfaces to Life Management

Life Management may be used by persons with the following roles (Roles):

    • 1. The User
    • 2. The User's Executive Assistant (“EA”)
    • 3. Customer Relationship Management (“CRM”)
    • 4. Personal Trainer (PT)

These persons (Roles) have access to Life Management using any combination of three user interfaces, namely browser, mobile app and/or devices.

Life Management is a calendar-based service that merges external data 214 (e.g. calendar), sensor data 216 (e.g. pulse), and external services 218 (e.g. travel booking) to provide insights and additional services for Roles.

Further detail relating to ‘Life Management’ is provided below with reference to FIGS. 15-28.

The above description provides just one example of the life management environment 200. The life management environment 200 and life management system 208 are also described in detail in U.S. Provisional Patent Application No. 61/874,219, filed on Sep. 5, 2013, with inventors George Arriola, Kouji Kodera, and Peter Karsten, entitled “Life Management System,” which is hereby incorporated by reference in its entirety.

FIG. 3 is a high-level block diagram illustrating an example of a computing device, or computer 300, according to one embodiment. The client device 202, assistant device 204, or scale device 206 could each be a computer. Furthermore, a computer 300 can be used by a wearing user of a wristband system, an assistant, or any other party to access the life management system via the network 210. The computer 300 includes at least one control unit 302 coupled to a chipset 304. The chipset 304 includes a memory controller hub 320 and an input/output (I/O) controller hub 322. A memory 306 and a graphics adapter 312 are coupled to the memory controller hub 320, and a display 318 is coupled to the graphics adapter 312. A storage device 308, keyboard 310, pointing device 314, and network adapter 316 are coupled to the I/O controller hub 322. Other embodiments of the computer 300 have different architectures.

The storage device 308 is a non-transitory computer-readable storage medium such as a hard drive, compact disk read-only memory (CD-ROM), DVD, or a solid-state memory device. The memory 306 holds instructions and data used by the control unit 302. The pointing device 314 is a mouse, touch screen, or other type of pointing device, and is used in combination with the keyboard 310 to input data into the computer system 300. The graphics adapter 312 displays images and other information on the display 318. The network adapter 316 couples the computer system 300 to one or more computer networks 210.

The computer 300 is adapted to execute computer program modules for providing functionality described herein. As used herein, the term "module" refers to computer program logic used to provide the specified functionality; a module can be implemented in hardware, firmware, and/or software. In one embodiment, program modules are stored on the storage device 308, loaded into the memory 306, and executed by the control unit 302.

The types of computer 300 used in the system 200 of FIGS. 2A-B can vary depending upon the embodiment and the processing power required for the uses. For example, a computer 300 could be a desktop computer, laptop computer, tablet, or mobile telephone. Computers 300 can lack some of the components described above, such as keyboards 310, graphics adapters 312, and displays 318. As an example, a wearing user may use a computer comprising a mobile telephone with a touch screen but no physical keyboard.

FIGS. 4A-F are high-level block diagrams illustrating a detailed view of various modules within the client device 202, according to one embodiment. In the embodiment of FIGS. 4A-F, the client device 202 is represented by a wearable device, such as a wristband 100. Thus, the components of the wristband 100 are illustrated in FIGS. 4A-F. As illustrated, the client device includes a physical activity sensor 402, a biotelemetry sensor 404, a gesture sensor 406, a geospatial location sensor 408, a control unit 410, a score calculator 412, a recommendation unit 414, a display unit 416, a communications interface 418, and an electric/haptic feedback unit 420. Some embodiments of the client device 202 have different and/or other units than the ones described herein. Similarly, the functions can be distributed among the units in accordance with other embodiments in a different manner than is described here.

The physical activity sensor 402 detects physical activities performed by a wearing user of a wristband 100, and generates physical activity data therefrom. The physical activity sensor detects these activities by using a plurality of sensors that monitor movements of the wristband 100, thus detecting movements of the wearing user. For example, the physical activity sensor 402 may act as a pedometer and monitor the number of steps taken by the user. A change in altitude is also detectable, such that the physical activity sensor 402 can monitor the number of stairs climbed by the user. The physical activity sensor 402 may also detect when the user is resting or sleeping, and can measure a sleep pattern of the user. For example, the sensor 402 can detect whether the user is tossing and turning during sleep, or having a more restful sleep. Other examples of physical activities detectable by the physical activity sensor 402 include exercise intensity performed by the user, and a pace or speed of the user. The physical activity sensor 402 can also include multiple sensors, such as a different sensor for each different type of activity data collected.
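As one illustration of pedometer-style step detection from wristband motion, the following Python sketch counts upward threshold crossings in 3-axis accelerometer magnitude. The threshold value and sample data are hypothetical assumptions for illustration, not the disclosed algorithm.

```python
# Hypothetical sketch: counting steps as upward crossings of an
# acceleration-magnitude threshold. Threshold and samples are illustrative.

def magnitude(ax, ay, az):
    """Magnitude of a 3-axis accelerometer sample (m/s^2)."""
    return (ax * ax + ay * ay + az * az) ** 0.5

def count_steps(magnitudes, threshold=11.0):
    """Count peaks that cross `threshold` on the way up; each upward
    crossing is treated as one step."""
    steps = 0
    above = False
    for m in magnitudes:
        if not above and m > threshold:
            steps += 1
            above = True
        elif above and m < threshold:
            above = False
    return steps

# Simulated walk: magnitude oscillates around gravity (~9.8 m/s^2).
samples = [9.8, 12.0, 9.5, 12.3, 9.4, 11.9, 9.8, 12.1, 9.7]
print(count_steps(samples))  # 4
```

A production pedometer would add filtering and debounce windows; this sketch only shows the basic peak-counting idea.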

Physical activity data can be displayed on the wristband 100 to a wearing user as raw numbers regarding physical activities performed by the user. Alternatively, the score calculator 412 and recommendation unit 414 can analyze the data to generate scores and recommendations regarding the physical activity data for display to the wearing user. Furthermore, the data can be sent to the life management system 208, which analyzes the data along with other data received at the life management system 208, and transmits this analyzed data back to the wristband 100 for display to the wearing user.

The biotelemetry sensor 404 detects physiological functions of a wearing user of a wristband 100, and generates biotelemetry data therefrom. The biotelemetry sensor 404 detects these physiological functions by using a plurality of sensors that recognize various biotelemetric functions. For example, the biotelemetry sensor 404 can detect a heart rate of the user, calories burned by the user, blood pressure of the user, skin temperature of the user, hydration level of the user, and galvanic skin response of the user. In one embodiment, biotelemetry data for a user can be used by the wristband to identify a specific user. The biotelemetry sensor 404 can also include multiple sensors, such as a different sensor for each different type of biotelemetry data collected. Biotelemetry data can be displayed on the wristband 100 to a wearing user as raw numbers regarding physiological functions of the user. Alternatively, the score calculator 412 and recommendation unit 414 can analyze the data to generate scores and recommendations regarding the biotelemetry data for display to the wearing user. Furthermore, the data can be sent to the life management system 208, which analyzes the data along with other data received at the life management system 208, and transmits this analyzed data back to the wristband 100 for display to the wearing user.

The gesture sensor 406 recognizes one or more gestures made by a wearing user, a gesture indicating an instruction for the wristband 100. The gesture sensor 406 uses a plurality of sensors to recognize various distinct gestures of the wearing user, based on movement of the wristband 100, or based on a detection of the wearing user touching the display screen 102 of the wristband 100. In one embodiment, if the user is wearing the wristband 100 around the user's wrist, the gesture sensor 406 can detect when the user lifts his wrist and rolls it towards his body, as if to check the time. This gesture may instruct the wristband 100 to be activated from a sleep mode or to turn on from an off mode. In another embodiment, the gesture sensor 406 can detect when the user shakes the wristband 100, which may instruct the wristband 100 to display a home screen user interface. Other gestures that can be recognized by the gesture sensor 406 include a user making a swiping motion with his finger across the display screen 102 of the wristband 100, a user tapping the display screen 102 of the wristband 100 with one finger, two fingers, or three fingers, a user pressing on the display screen 102 of the wristband 100 with one finger, two fingers, or three fingers, or a user making a circular motion on the display screen 102 of the wristband 100 with one finger.

Different gestures made by a wearing user may instruct the wristband 100 to perform one or more tasks, including, for example, turning the wristband 100 power on, activating a display, selecting a function, moving between user interfaces on the display, entering a settings user interface, scrolling within the display, zooming in or out of the display, resetting a user interface, exiting a user interface, or displaying a home screen user interface. In one embodiment, the wearing user can customize the sensitivity of the gesture sensor 406, so that the gesture sensors 406 have weaker or stronger capabilities for detecting gestures performed by the wearing user, depending on the customized sensitivity. In some embodiments, the user can customize what instruction a particular gesture provides to the wristband, such that a tap with two fingers might control the wristband differently for different users depending on the customization.
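The per-user customizable mapping of gestures to instructions described above can be sketched as a dispatch table with user overrides. The gesture names and command names below are hypothetical labels chosen for illustration.

```python
# Hypothetical sketch: a default gesture-to-command mapping that each
# wearing user may override, so the same gesture (e.g. a two-finger tap)
# can control the wristband differently for different users.

DEFAULT_GESTURES = {
    "wrist_roll": "wake_from_sleep",
    "shake": "show_home_screen",
    "tap_1": "select",
    "tap_2": "back",
    "swipe": "scroll",
    "circle": "zoom",
}

class GestureMapper:
    def __init__(self, overrides=None):
        # Per-user customization overrides the defaults.
        self.mapping = dict(DEFAULT_GESTURES)
        self.mapping.update(overrides or {})

    def command_for(self, gesture):
        # Unrecognized gestures are ignored rather than raising.
        return self.mapping.get(gesture, "ignore")

m = GestureMapper(overrides={"tap_2": "show_settings"})
print(m.command_for("shake"))  # show_home_screen
print(m.command_for("tap_2"))  # show_settings
```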

The geospatial location sensor 408 detects geospatial location data of a wearing user. A Global Positioning System (GPS) within the wristband 100 is used to determine the geospatial location of the wearing user. The resulting geospatial location data can be used to provide the wearing user's location on a map on the display screen of the wristband 100. The geospatial location data can also be used by the recommendation unit 414 or life management system 208 to recommend directions, an exercise course, traffic conditions, or nearby restaurants/bars/stores for the wearing user.
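One way the geospatial location data might feed a "nearby restaurants/bars/stores" recommendation is a simple great-circle distance filter. The haversine formula below is standard; the place names, coordinates, and radius are illustrative assumptions.

```python
import math

# Hypothetical sketch: filtering points of interest within a radius of
# the wearing user's GPS fix, using the haversine great-circle distance.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby(user_loc, places, radius_km=1.0):
    """Return names of places within `radius_km` of the user."""
    return [name for name, (lat, lon) in places.items()
            if haversine_km(user_loc[0], user_loc[1], lat, lon) <= radius_km]

places = {"cafe": (51.5010, -0.1250), "gym": (51.5200, -0.1000)}
print(nearby((51.5007, -0.1246), places))  # ['cafe']
```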

The sensors described above can include various different sensor types. Some examples include an accelerometer (e.g., 3 axis), a gyroscope, a light sensor, a heart rate monitor, a thermometer, a galvanic skin response sensor, an optical blood flow monitoring sensor, an oxygen sensor, an optical skin profile sensor, an electrical signal monitoring sensor, an altimeter, an audio sensor or microphone, a compass or magnetometer, a weight sensor, a body fat sensor, an air quality sensor, a pedometer, a hydration or perspiration sensor, a brain activity sensor, a blood pressure sensor, a motion sensor, among others.

The control unit 410 receives, processes, stores, and displays data from the physical activity sensor 402, biotelemetry sensor 404, gesture sensor 406, geospatial location sensor 408, and from other external sources. The external sources may include the assistant device 204, scale device 206, life management system 208, or other client devices 202 such as another monitoring device or a third-party wristband. In addition, the life management system 208 may acquire social data from external sources, such as social networking websites, email services, calendar services, contact management services, and so forth. Social data includes a calendar or schedule of the wearing user, and other information about the wearing user, such as the wearing user's demographic data or personal interests. The life management system 208 can analyze this data and transmit the analyzed data to the wristband 100, where the control unit 410 processes and displays the data on the display screen 102. In another example, a wearing user may use a scale device 206 to determine a weight of the user. The scale device 206 can transmit this weight data to the wristband 100, where the control unit 410 processes and displays the data on the display screen 102. Furthermore, the control unit 410 can process and display data that was collected from the wristband system itself, such as physical activity data and biotelemetry data.

The control unit 410 also transmits data from the wristband system through the network 210 to the life management system 208 for further analysis. After the life management system 208 has analyzed data transmitted by the control unit 410, the life management system 208 may send a recommendation or other information back to the wristband 100 for display to a wearing user. This recommendation is processed by the control unit 410 before it is displayed on the display screen 102 of the wristband system for the user. In an embodiment, the control unit 410 saves all of the data received from the life management system 208 or any other source, and can process historical data for a wearing user for display.

The score calculator module 412 determines one or more scores for a wearing user based at least in part on a portion of the data processed by the control unit 410 at the wristband 100. For example, the physical activity data and biotelemetry data from the wristband 100 can be used to calculate a score. Also, external data received by the control unit 410, such as a schedule of a wearing user, a recommendation from an assistant, or weight and BMI data from a scale device, can all be used to calculate a score. In some embodiments, a score represents a sleep efficiency of the wearing user, a stress level of the wearing user, an exercise intensity of the wearing user, a general activity level of the wearing user, a self-control or motivational level of the wearing user, or a general wellness level of the wearing user.

In an embodiment, a score is determined by combining different types of data for a wearing user. For example, a stress level score of the wearing user may be determined by combining at least a portion of the social data (schedule) and biotelemetry data (heart rate) of the wearing user. In another example, an exercise intensity score of the wearing user may be determined by combining at least a portion of the biotelemetry data (heart rate) and physical activity data (pace) of the wearing user. The scores may be provided for display on the display screen of the wristband. The scores may be provided by the life management system 208 for display to the wearing user at a client device 202, an assistant of the wearing user at an assistant device 204, or another third party.
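The combination of schedule and heart-rate data into a stress level score described above can be sketched as a weighted blend of normalized inputs. The 0-100 scale, equal weights, and heart-rate range below are hypothetical assumptions, not the disclosed scoring method.

```python
# Hypothetical sketch: a stress level score combining social data
# (schedule density) and biotelemetry data (resting heart rate).
# Weights, ranges, and the 0-100 scale are illustrative assumptions.

def stress_score(meetings_today, resting_hr_bpm,
                 max_meetings=10, hr_low=50, hr_high=100):
    """Blend schedule load and resting heart rate into a 0-100 score."""
    # Normalize each input to [0, 1], clamping out-of-range values.
    schedule = min(meetings_today / max_meetings, 1.0)
    heart = min(max((resting_hr_bpm - hr_low) / (hr_high - hr_low), 0.0), 1.0)
    return round(100 * (0.5 * schedule + 0.5 * heart))

print(stress_score(meetings_today=8, resting_hr_bpm=90))  # 80
```

An exercise intensity score could be blended the same way from heart rate and pace.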

The recommendation unit 414 makes recommendations to the wearing user, based at least in part on a portion of the data processed by the control unit 410 at the wristband 100, and/or one or more scores determined by the score calculator module 412. In an embodiment, if one or more scores from the score calculator module 412 or the life management system 208 surpass a particular threshold, the recommendation unit 414 makes a specific recommendation to the wearing user. For example, if a stress level score of the wearing user is above a threshold level, the recommendation unit 414 may recommend that the user take a 15 minute break from work, or that the user go jogging for an hour. In some embodiments, the recommendation unit 414 displays recommendations from third parties, such as a doctor or health coach using an assistant device to interact with the client device.

In another embodiment, the recommendation unit 414 analyzes a portion of the data processed or received at the client device 202, and makes a recommendation based at least in part on the data. In a further embodiment, the recommendation unit 414 makes a recommendation if a portion of the data surpasses a particular threshold. For example, if the hydration levels of a user are detected by the biotelemetry sensor as being relatively low, the recommendation unit 414 may recommend that the user drink some water. In another embodiment, the recommendation unit 414 may use general data from an external source to generate a recommendation. For example, depending on the time of day, or weather conditions, the recommendation unit 414 could generate a recommendation for the user to wear a jacket or bring an umbrella when going outdoors.
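The threshold-driven recommendations above (high stress score, low hydration, weather conditions) can be sketched as a small rule table. The specific thresholds and message texts are illustrative; only the examples themselves come from the description.

```python
# Hypothetical sketch: recommendation rules that fire when a metric
# crosses a threshold. Thresholds and wording are illustrative.

RULES = [
    # (metric key, predicate, recommendation text)
    ("stress_score", lambda v: v > 70, "Take a 15 minute break from work."),
    ("hydration_pct", lambda v: v < 40, "Drink some water."),
    ("rain_forecast", lambda v: bool(v), "Bring an umbrella when going outdoors."),
]

def recommendations(metrics):
    """Return the recommendations triggered by the supplied metrics."""
    return [msg for key, test, msg in RULES
            if key in metrics and test(metrics[key])]

print(recommendations({"stress_score": 82, "hydration_pct": 35}))
# ['Take a 15 minute break from work.', 'Drink some water.']
```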

In an embodiment, the recommendation unit 414 analyzes a portion of the data processed or received at the client device 202 to determine a mood for the wearing user. For example, the recommendation unit 414 may determine that the wearing user is feeling “happy” based on an above-average heart rate and relaxed schedule of the wearing user. In another example, the recommendation unit 414 may determine that the wearing user is feeling “stressed out” based on a busy schedule and low sleep efficiency of the wearing user.

Any or all of the generated data, scores, and recommendations can be provided for display on the display unit 416 of the wristband. The display unit 416 includes the elongated display screen 102, on which a user may view the data or manipulate the data using touch-screen mechanisms. Furthermore, any or all of the generated data, scores, and recommendations can be provided to the life management system 208 or to other devices in the life management environment 200 through a communications interface 418. Similarly, data can be received by the client device 202 via the communications interface 418. The communications interface 418 is communicatively coupled to the control unit 410.

The electric/haptic feedback unit 420 delivers electric and/or haptic stimuli to a wearing user. The electric/haptic feedback unit 420 includes an electric feedback stimulator, which is configured to deliver small electric shocks from the wristband to a wearing user. The electric/haptic feedback unit 420 also includes a haptic feedback stimulator, which is configured to deliver vibrations from the wristband to the wearing user. In an embodiment, the electric/haptic feedback unit 420 is configured to deliver a simultaneous combination of electric shocks and vibrations to a wearing user.

The electric shocks and/or vibrations may be manually set to serve as notifications or reminders for the wearing user. A user may set the electric shocks and/or vibrations at the wristband, from another client device 202, or from an assistant device 204. For example, a user may use the wristband to set an electric shock to be delivered at 7 AM, which serves as an alarm clock for the user. In another example, an assistant may provide instructions from an assistant device 204 for the wristband to deliver a vibration accompanying a message sent by the assistant to the wearing user.

The electric shocks and/or vibrations may also be automatically delivered to the wearing user. For example, the electric/haptic feedback unit 420 may deliver an electric shock to the wearing user 15 minutes before every meeting on the user's calendar. In another example, the electric/haptic feedback unit 420 may deliver a vibration to the wearing user every time the wearing user's heart rate goes above 120 bpm. The electric shocks can vary in frequency, duration, or voltage, and the vibrations can vary in frequency, duration, or amplitude. These variations are customizable by the wearing user or an assistant, or are automatically determined by the wristband or life management system 208. For example, an assistant may customize a vibration so that each time the assistant sends a message to the wearing user's wristband, the message is accompanied by three long vibrations.
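The automatic triggers above (a shock 15 minutes before each calendar meeting, a vibration when heart rate exceeds 120 bpm) can be sketched as a periodic check. The data representation (minutes since midnight, event name strings) is a hypothetical simplification.

```python
# Hypothetical sketch: deciding which electric/haptic feedback events
# are due right now, given the user's calendar and current heart rate.
# Times are minutes since midnight; representations are illustrative.

def due_alerts(now_min, meetings_min, heart_rate_bpm, lead_min=15):
    """Return feedback events due at `now_min`."""
    alerts = []
    for m in meetings_min:
        # Shock exactly `lead_min` minutes before each meeting.
        if m - now_min == lead_min:
            alerts.append("electric_shock")
    if heart_rate_bpm > 120:
        alerts.append("vibration")
    return alerts

# A 10:00 meeting (600 min); it is now 09:45 (585) and HR is 125 bpm.
print(due_alerts(585, [600, 840], 125))  # ['electric_shock', 'vibration']
```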

FIG. 4B provides one example embodiment of the electric/haptic feedback unit 420 in association with a wristband system showing example componentry. The device 202 is represented in FIG. 4B by a development board 422 (such as a Qualcomm Intrinsyc Dragonboard Development Board (“Dragonboard”)) connected by a 2 m RS232 wire to a board with components (the “Test Board”). Together, the items shown in FIG. 4B form the “Device” 202. An analogue-to-digital converter (ADC), preferably a 16-bit ADC, may be provided between the ambient light sensor 424 and the controller 410. A reset button and a Serial Wire Debug (SWD) interface may also be provided.

The Device 202 is connected to Life Management 208 (for example, via network 210). Roles can use a Life Management browser interface to set Device parameters and operate Device 202 as described below.

The following parameters can be set by Roles, either in a test environment or in use:

    • 1. Select Dragonboard 422 and Test Case (the following)
    • 2. Voltage range for Galvanic Skin Response (GSR) measurements
      • a. Minimum voltage from 0.05 mV in steps of 0.05 mV to 1 mV
      • b. Maximum voltage from 1 mV in steps of 0.05 mV to 10 mV
    • 3. Accelerometer sensitivity
    • 4. Gyro sensitivity (Acc. and Gyro will later be combined for gestures)
    • 5. Optical skin reading sensitivity
    • 6. Small electric shock sequences (for alerts regarding Events, such as pending calendar items, biometrics such as high heart rate, or alerts from one Role(s) to another Role(s))
      • a. Define shock:
        • i. Name shock
        • ii. Set shock frequency
        • iii. Set shock duration
        • iv. Set voltage
      • b. Define shock alert
        • i. <name 1>, <time 1>, <name 2>, <time 2>, <name 3> . . .
      • c. Test on user with pads linked to test board
    • 7. Vibra (e.g., mechanical vibration)
      • a. Define vibra:
        • i. Name vibra
        • ii. Set vibra frequency
        • iii. Set vibra duration
        • iv. Set amplitude
      • b. Define vibra alert
        • i. <name 1>, <time 1>, <name 2>, <time 2>, <name 3> . . .
      • c. Test on user with vibra on test board

Combinations of eVibra (electric shocks) and Vibra (vibration) may be used for such alerts. In one embodiment, one user interface (e.g. mobile phone or browser) is used to set such eVibra and/or Vibra [combinations] for another device (e.g. Greedo).

    • 1. Combining biometric data with data such as calendar, email or social network information to generate Combined Alerts
    • 2. Using Combined Alerts to trigger eVibra and/or Vibra [combination] alerts
    • 3. Allowing Role(s) to create custom eVibra and/or Vibra sequences for such alerts
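The `<name 1>, <time 1>, <name 2>, . . .` alert definitions above interleave named eVibra/Vibra patterns with gaps. A playback schedule for such a sequence can be sketched as follows; the pattern library and millisecond units are hypothetical assumptions.

```python
# Hypothetical sketch: expanding a "<name>, <time>, <name>, ..." alert
# sequence into a playback schedule. Named patterns alternate with gap
# times (milliseconds here); the pattern definitions are illustrative.

PATTERNS = {
    "short_shock": {"kind": "eVibra", "duration_ms": 1, "voltage_v": 25},
    "long_buzz": {"kind": "Vibra", "duration_ms": 500, "amplitude": 0.8},
}

def expand_alert(sequence):
    """Turn ['short_shock', 200, 'long_buzz'] into (start_ms, name) tuples."""
    t, schedule = 0, []
    for item in sequence:
        if isinstance(item, str):
            p = PATTERNS[item]
            schedule.append((t, item))
            t += p["duration_ms"]
        else:
            t += item  # a <time> entry: gap before the next pattern
    return schedule

print(expand_alert(["short_shock", 200, "long_buzz"]))
# [(0, 'short_shock'), (201, 'long_buzz')]
```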

Data Output

Test data can be collected into the Life Management database on a per test board basis.

The browser outputs shall include:

    • 1. Dragonboard found, RS232 found, Test Board found indicators
    • 2. GSR voltage sequences
    • 3. Accelerometer and Gyro readout sequences
    • 4. Optical skin reading sequences
    • 5. Power consumption characteristics for each sensor individually

Life Management has access to the data.

The Test Board may be mounted on a person's (Role's) wrist as shown in FIG. 4C. The Test Board can be placed on the person's (Role's) wrist by wrapping a Velcro band around the edges of the board and wrist as shown in FIG. 4D.

If the development board 422 (for example, the Dragonboard) does not directly support Inter-Integrated Circuit (I2C), and some components use Pulse Width Modulation (PWM) and Serial Port Interface (SPI), the Dragonboard may be connected to the Test Board using RS232, with a local microcontroller on the Test Board that talks to each component. The provision of I2C interfaces on the Test Board for individual sensors is therefore not a requirement.
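With the Dragonboard bridged to the Test Board over RS232, commands to individual sensors would need some framing convention. The frame layout below (sync byte, sensor id, command, 16-bit argument, additive checksum) is purely a hypothetical illustration of such a convention, not a disclosed protocol.

```python
import struct

# Hypothetical sketch: a framed command the development board might send
# over RS232 to the Test Board's local microcontroller, which then talks
# PWM/SPI/I2C to each component. The layout is an illustrative assumption.

SYNC = 0xA5  # fixed start-of-frame marker

def build_frame(sensor_id, command, arg):
    """Pack sync, sensor id, command, and a 16-bit big-endian argument,
    then append a one-byte additive checksum."""
    body = struct.pack(">BBBH", SYNC, sensor_id, command, arg)
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

frame = build_frame(sensor_id=0x03, command=0x01, arg=500)
print(frame.hex())  # a5030101f49e
```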

In one example, the test board has the following features/functionality:

    • 1. The test board has I2C connectivity and uses the same components as are intended for Greedo (e.g., a wristband device that might be worn by a user). The sensor manufacturers' Android drivers shall be used in the Dragonboard. It may not be possible to analyse the sensors' sleep/power management with this setup.
    • 2. The test board has two metal plates (round, 5 mm in diameter and 5 cm between them) that are similar to the ones intended for Greedo. The GSR voltage ranges defined under Browser Interface (above) are supported. Like Greedo, the resolution is 16 bit. The plates are mounted under the board (on the skin).
    • 3. Same accelerometer, gyro and I2C interface as intended for Greedo
    • 4. The test board has the same optical skin reading components and sensor distances as intended for Greedo, mounted under the board (on the skin).
    • 5. The test board supports dual functionality for the GSR leads/plates. The second function (beyond GSR) is to provide small electric stimuli (“eVibra”) to the user. The ranges supported by the test board shall be configurable using I2C and at least:
      • a. Voltage U ≤ 20 to 45 V (expected to be a smaller variable voltage in Greedo)
      • b. Current I ≤ 5 mA (expected to be a smaller variable current in Greedo)
      • c. Stimuli active output time (stimuli time) of 0.20 to 1 µs (see FIG. 4F)
      • d. Stimuli repeat frequency f_repeat = 1 to 100 Hz (expected to be variable in Greedo)

FIG. 4E shows an example cross-section of the device described above and FIG. 4F shows a voltage/time plot of an example single stimulus impulse wave. Each square wave (impulse) is a stimulus. The output time, number of impulses, and impulse frequency can be controlled: for example, 5 impulses in a row, each impulse 1 µs long, with a 1 s gap between impulses. Such a result can be achieved by driving a step-up DC/DC converter with the PWM output of an STM32 controller.

    • 6. There is a vibra on the test board, of the same type as intended for Greedo
    • 7. There is an ambient light sensor on the test board, of the same type as intended for Greedo
    • 8. Display and touch (capacitive and/or Qitec) may be tested separately
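The controllable impulse train described with reference to FIG. 4F (N impulses of a given active length, separated by gaps) can be sketched as a timing schedule. The function below is an illustrative model only; it computes the on/off edges a PWM driver would need to produce.

```python
# Hypothetical sketch: on/off edge times for a square-wave stimulus
# train of N impulses. Units are microseconds; values are illustrative.

def impulse_edges(n_impulses, impulse_us, gap_us):
    """Return (on_time, off_time) pairs for each impulse.
    Example from the description: 5 impulses, 1 us each, 1 s gap."""
    edges = []
    t = 0.0
    for _ in range(n_impulses):
        edges.append((t, t + impulse_us))
        t += impulse_us + gap_us
    return edges

train = impulse_edges(5, 1, 1_000_000)
print(train[0], train[-1])  # (0.0, 1.0) (4000004.0, 4000005.0)
```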

The following sensor components may be used in the device (not including some aspects, such as the electric stimuli components):

Photodiode (1 Component)

EVERLIGHT PD60-48C/TR8 TEMP-00061682

LEDs for optical sensors (2)

EVERLIGHT 19-217/G7C-AL1M2B/3T SE031M2B330

Digital Microphone (1)

ADI ADMP441ACEZ-RL S0M441AC000

6-Axis Sensor (1)

INVENSENSE MPU-6050 SA0U6050080

eCompass (1)

AKM AK8963C SA08963C130

ALS Sensor (1)

SITRONIX STK2207-018 SA007018020

Altimeter (1)

Bosch BMP180 SA0MP180020

Haptic Driver (1)

TI DRV2603RUNR SA02603R080

Optical sensor (1)

TI AFE4400RHAR SA04400R080

GSR Sensor (1)

ADI AD5933YRSZ-REEL7 SA05933Y040

Further details relating to the device are described in more detail below with reference to FIGS. 12-14 in the section titled ‘Device for providing alerts’.

FIG. 5 is a high-level block diagram illustrating a detailed view of various modules within the life management system 208 according to one embodiment. As used herein, the term “module” refers to computer program logic used to provide a specified functionality. The life management system 208 is a system that exchanges data with the client device 202, assistant device 204, or scale device. As illustrated, the life management system 208 includes a wristband data module 502, a scale data module 504, a social data module 506, an assistant module 508, a score calculation module 510, and a recommendation module 512. Some embodiments of the life management system 208 have different and/or other modules than the ones described herein. Similarly, the functions can be distributed among the modules in accordance with other embodiments in a different manner than is described here.

The wristband data module 502 receives and analyzes data about a wearing user from the client device 202, or wristband 100. The wristband data module 502 can receive biotelemetry data, physical activity data, geospatial location data, or other data from the client device. The scale data module 504 receives and analyzes data about a wearing user from the scale device 206. The scale data module 504 can receive body metric data such as a weight of the wearing user, a body mass index (BMI) of the wearing user, a body fat percentage of the wearing user, or a heart rate of the wearing user.

The social data module 506 receives and analyzes social data regarding a wearing user. Social data may include a calendar and schedule of the wearing user, interests of the wearing user, demographic information regarding the wearing user (such as relationship status, family history, birthplace, schools attended, friends, etc.), nutritional intake by the user, or any other information about the wearing user. The social data module 506 may receive social data as input from the wearing user at a client device 202, from an assistant device 204, or collected from an external source, such as a social networking website.

The assistant module 508 receives and analyzes data regarding a wearing user sent from an assistant device 204. An assistant may view information regarding a wearing user at the life management system 208, such as the wearing user's biotelemetry data, social data, or scores generated by the score calculation module 510, allowing the assistant to determine recommendations and other information for the user. An assistant can be any individual, or the assistant can be a computing device. In some embodiments, more than one assistant may be associated with the user. An assistant can view at least a portion of the biotelemetry data, physical activity data, and/or social data for the user. The assistant can then use the computing device to make one or more recommendations to the user.

For example, an assistant can act as a personal assistant, concierge, health coach, and/or personal trainer. When data for a user is displayed on a computing device of the assistant or third-party wristband worn by the assistant, the assistant may view the data and make a recommendation to the user regarding exercise, relaxation, sleep schedule, social schedule, or anything else. In one embodiment, the assistant may send a motivational or instructional message to the user. For example, if the assistant sees that the heart rate of the user does not increase enough during the user's exercise regimen, the assistant may send a message to a wristband 100 of the user to increase exercise intensity. In another embodiment, an assistant can act as a personal assistant and schedule meetings on the user's calendar, or provide other services based on the user's needs. For example, the assistant may see from the user's calendar that the user is attending a destination wedding in one month. The assistant may use a travel agency to book the flight for the wearing user, and send the user a notification regarding the flight to the wristband 100 for display. An extension of this concept is for the system to block out a time slot in the user's calendar, define the desired air travel, retrieve a few example flights, pick a suitable flight, and automatically obtain tickets, with the user automatically checked in by the system. In such an example, the device may require eCommerce functionality/permissions so as to be able to book and pay for real-world assets such as flight tickets. This example is described in more detail below with reference to FIGS. 10A and 10B.

In some embodiments, the user can sign up for one or more services associated with his wristband that connect him to certain third parties, such as well-being specialists, doctors, coaches, fitness trainers, dietitians, weight loss coaches, sleep specialists, etc. who can work one-on-one with the user to improve his well being. This or another service could also include a concierge that provides answers to the user's questions, gets the user tickets to events and restaurant reservations, helps the user with issues with the wristband, among other services. A further service could include a dedicated personal or executive assistant for management of a user's schedule and performing the functions that a secretary might normally perform. A variety of other services to help the user with life management are also possible.

The score calculation module 510 determines scores for a wearing user based at least in part on a portion of the data transmitted by the wristband 100, or data transmitted from another external device such as an assistant device 204. In some embodiments, a score represents a sleep efficiency of the wearing user, a stress level of the wearing user, an exercise intensity of the wearing user, a general activity level of the wearing user, a self-control or motivational level of the wearing user, or a general wellness level of the wearing user.

In an embodiment, a score is determined by combining different types of data for a wearing user. For example, a stress level score of the wearing user may be determined by combining at least a portion of the social data (schedule) and biotelemetry data (heart rate) of the wearing user. In another example, an exercise intensity score of the wearing user may be determined by combining at least a portion of the biotelemetry data (heart rate) and physical activity data (pace) of the wearing user. The scores may be transmitted to the wristband 100 from the score calculation module 510 for display on the display screen 102.

The recommendation module 512 generates recommendations for a wearing user based at least in part on a portion of the data transmitted by the wristband 100 or from another external device such as an assistant device 204, and/or one or more scores determined by the score calculation module 510. In one embodiment, if one or more scores from the score calculation module 510 surpass a particular threshold, the recommendation module 512 makes a specific recommendation for the wearing user. For example, if a stress level score of the wearing user is above a threshold level, the recommendation module 512 may recommend that the user take a 15 minute break from work, or that the user go jogging for an hour.

In some embodiments, the recommendation module 512 generates recommendations automatically without instructions from an assistant. In other embodiments, one or more assistant devices 204 use data provided by the life management system 208 or scores determined by the score calculation module 510 to determine a recommendation to be transmitted to a wristband 100 of the wearing user for display. The recommendation module can also take into account historical data and data trends of the user to make recommendations. For example, if the wearing user has a history of engaging in intense exercise, the recommendation module 512 may recommend increasing exercise intensity by a particular amount. The recommendation module 512 can also use general data from an external source to make a recommendation. For example, depending on the time of day, or weather conditions, the recommendation module 512 could generate a recommendation for the user to wear a jacket or bring an umbrella when going outdoors.

In an embodiment, the recommendation module 512 analyzes a portion of the data processed or received at the client device 202 to determine a mood for the wearing user. For example, the recommendation module 512 may determine that the wearing user is feeling “happy” based on an above-average heart rate and relaxed schedule of the wearing user. In another example, the recommendation module 512 may determine that the wearing user is feeling “stressed out” based on a busy schedule and low sleep efficiency of the wearing user.
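The mood determinations above ("happy" from an above-average heart rate and a relaxed schedule, "stressed out" from a busy schedule and low sleep efficiency) can be sketched as a rule-based classifier. The cutoffs and labels are hypothetical assumptions for illustration.

```python
# Hypothetical sketch: classifying a wearing user's mood from heart
# rate, schedule load, and sleep efficiency. Cutoffs are illustrative.

def determine_mood(heart_rate_bpm, meetings_today, sleep_efficiency):
    """Return a coarse mood label from three inputs; sleep_efficiency
    is a 0-1 fraction of time asleep that was restful."""
    busy = meetings_today >= 6
    if busy and sleep_efficiency < 0.6:
        return "stressed out"
    if heart_rate_bpm > 75 and not busy:
        return "happy"
    return "neutral"

print(determine_mood(80, 2, 0.9))  # happy
print(determine_mood(70, 8, 0.5))  # stressed out
```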

This part of the description focuses on how a user's mood can drive how the world interacts with the user. A basic example is adapting all user interfaces in the user's life to the user's mood; this example can be generalized and structured. The result is “Mood-as-a-Service”.

This section sets out the environment. The system (“Life Management”) performs these tasks:

    • 1. Collate digital information about the User from multiple sources
    • 2. Analyze said data to provide insights
    • 3. Merge insights and additional services into Life Management
    • 4. Provide browser, mobile app and device user interfaces to Life Management

In one example, Life Management is used by one or more of the ‘Roles’ as described above. In one example, a change in mood measured for one Role causes user interface changes for the same Role. An equivalent action may occur for a particular group of Roles (for example a user and the user's assistant). Examples of a user interface change include the LEDs in the device (‘Greedo’) display's light pipe changing color based on Galvanic Skin Response (GSR) measurements performed by the device, or a colour change (e.g. a background colour change). This could (in one example) signal the mood of the user to other people, for example: “Check this out—she's happy today!”.

In another example, mood changes measured for one Role(s) may cause user interface changes for other Role(s), or an interface change for a whole world-facing website, for example:

    • i. “Check this out—the guys at zero° are happy today!” (e.g. the mood of a group of people resulting in a change of interface on a website).
    • ii. “My assistant is in a really bad mood today!” (e.g. the mood of a person changing the interface on a user's device).

In another example, mood changes measured in one system causes user interface changes (e.g. color) in another user interface, for example:

    • iii. “Just opening the site with my browser shows me my mood. Is it a cookie?”

Life Management

The Roles have access to Life Management using any combination of three user interfaces, namely browser, mobile app and devices.

Life Management is a calendar-based service that merges external data (e.g. calendar), sensor data (e.g. pulse), and external services (e.g. travel booking) to provide insights and additional services for Roles.

Moods

Moods can be calculated based on biometric data, calendar, email, and social media data.

    • Biometric data may include items such as Galvanic Skin Response (GSR), movement from accelerometer(s) and gyro(s), optical skin and blood vessel dilation measurements, and responses to stimuli such as Vibras and eVibras (e.g., electric and haptic feedback stimulations—electric shocks and vibrations).

Ways of Making Moods Available

In a further example, an (online) resource is provided that publishes a Role's mood for any user interface. Using such a resource, the Role's user interface can be modified based on their mood without the need for locally stored or created interfaces, for example:

“Ok, it's time to give Alice a user interface. What's her mood as we speak? I'll use a web service to fetch her mood from the Trusted Third Party Mood Service. Here comes an XML+CSS with recommendations based on how she is feeling right now. Got it. Ok, based on that, let's create a user interface that speaks clearly, with no nonsense, to her. She is under pressure.”
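The scenario above, in which a fetched mood drives interface selection, might be sketched as follows. The payload format and style vocabulary are illustrative assumptions; the specification does not prescribe a particular mood document format:

```python
import json

def ui_style_for_mood(mood_payload: str) -> dict:
    """Map a mood document fetched from a mood service to UI hints."""
    mood = json.loads(mood_payload)["mood"]
    styles = {
        "under pressure": {"tone": "clear, no nonsense", "background": "muted"},
        "happy": {"tone": "playful", "background": "bright"},
    }
    return styles.get(mood, {"tone": "neutral", "background": "default"})

# e.g. a payload the Trusted Third Party Mood Service might return for Alice
payload = '{"role": "Alice", "mood": "under pressure"}'
```

Applying `ui_style_for_mood(payload)` would yield the clear, no-nonsense presentation suggested in the example.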

Moods can be made available on a peer-to-peer basis, through a trusted mood web or through a Trusted Third Party (“TTP”).

A trusted third party is an entity which facilitates interactions between two parties who both trust the third party. Hence a trusted mood party (“TMP”) is an entity that facilitates the secure transfer of moods from one Role(s) to another Role(s). TMPs may use third-party architecture not necessarily explicitly associated with the present system.

The TMP model enables the relying parties to use this trust to know the mood of each counterpart in their interactions.

The TMP model allows the creation of mood certificates (“MCs”) to guarantee a mood for a period of time and/or until conditions change. The certificate authority would act as a mood certificate authority (“MCA”) and issue a digital mood certificate to one of the two parties in the next example. The MCA then becomes the Trusted Mood Party for that certificate's issuance. Such an arrangement creates a ‘trusted mood web’.
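A mood certificate guaranteeing a mood for a period of time could, as a non-limiting sketch, be represented as a simple record with a validity check. The field names and validity rule are assumptions for illustration:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class MoodCertificate:
    """Illustrative mood certificate ('MC'); fields are assumptions."""
    subject: str            # the Role whose mood is certified
    mood: str
    issuer: str             # the mood certificate authority ('MCA')
    issued_at: datetime
    valid_for: timedelta    # the period for which the mood is guaranteed

    def is_valid(self, now: datetime) -> bool:
        # the certificate guarantees the mood only within its validity period
        return self.issued_at <= now < self.issued_at + self.valid_for
```

A relying party would check `is_valid` before trusting the certified mood, and conditions changing could revoke the certificate early.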

Likewise, interactions that need third-party recordation would also need a third-party repository service of some kind.

An example:

Suppose Alice and Bob wish to communicate, each knowing securely what the mood of the other party is. The TMP knows Bob, or is otherwise willing to vouch that his mood (typically expressed in a mood certificate) describes the person indicated in that certificate, in this case Bob. In discussions, this third party is often called Trent. Trent gives the certificate to Alice, who then uses it to send messages to Bob based on knowing Bob's mood. Alice can trust this mood to be Bob's if she trusts Trent. In such discussions, it is simply assumed that she has valid reasons to do so (of course, there is the issue of Alice and Bob being able to properly identify Trent as Trent and not someone impersonating Trent).

If Role(s) become TMPs, we have a trusted mood web in which Role(s) digitally sign each other's mood certificates, and do so only if they are confident that the mood and the Role(s) belong together. A mood signing party is one way of combining a get-together with some certificate signing. Nonetheless, doubt and caution remain sensible, as some users have been careless in signing others' certificates.

However, trusting humans, or their organizational creations, can be risky.

Further details in relation to the ‘moods’ aspect are provided below with reference to FIGS. 29-42 in the section titled ‘Calendars and moods’.

FIG. 6 is a flowchart illustrating steps performed by a client device 202 to display a recommendation to a wearing user, according to one embodiment. Other embodiments of the client device may perform different or additional steps, and may perform the steps in different orders. In the first step, a wristband system detects physiological functions and physical activity from a wearing user, and generates biotelemetry data and physical activity data therefrom 602. This data is transmitted through a network 210 to a life management system 604. The life management system 208 analyzes the data. In some cases the data is combined with other data, such as social data from an external source, body metric data from the scale device 206, or data from an assistant at an assistant device 204. The life management system 208 determines a recommendation for the wearing user based on the analyzed data, and sends the recommendation back to the wristband system 606. The control unit 410 of the wristband 100 then processes this recommendation and displays the recommendation to the wearing user on the display screen 102 of the wristband system 608.

FIGS. 7A and 7E illustrate examples of gestures made by a wearing user, and FIGS. 7B-7D illustrate examples of user interfaces displayed on a wristband system, in an example scenario of a wearing user using the wristband system. Further detail relating to gestures is described below with reference to FIGS. 11A-11M. FIG. 7A illustrates a gesture that the wearing user may make to activate the display screen 102 out of a sleep mode. Here, the gesture involves the wearing user lifting a wrist wearing the wristband system, and rolling the wrist towards the wearing user. After the display screen 102 is activated, the user may decide to use a combination of gestures to view a calendar of the wearing user on the display screen 102 of the wristband system. FIG. 7B shows an embodiment of a calendar that is displayed to a wearing user. The calendar may be sent from a life management system 208, at which the wearing user or an assistant has provided input regarding the wearing user's schedule. Here, the wearing user is viewing a 3-Day View of a calendar, and focusing on a schedule for Monday, 5/27.

A wearing user may receive various updates or recommendations from the life management system 208. FIG. 7C shows an example recommendation for display to a wearing user. In this case, the life management system 208 uses the wearing user's calendar (social data) to determine the time and location of the user's next meeting. The life management system 208 also uses geospatial location data from the geospatial location sensor 408 on the wristband 100 to determine a route to take to get to the meeting, and to determine the current traffic conditions. Finally, the life management system 208 uses general data from an external source to determine the current weather conditions. Using all of this data, the life management system 208 sends the user a recommendation to leave immediately for the wearing user's next meeting, due to traffic and weather conditions.
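The “leave now” recommendation above combines the meeting time with travel, traffic, and weather data. As a non-limiting sketch, the decision rule might look like the following; the rule itself and the delay inputs are assumptions for illustration:

```python
from datetime import datetime, timedelta

def should_leave_now(now: datetime, meeting_time: datetime,
                     base_travel: timedelta, traffic_delay: timedelta,
                     weather_delay: timedelta) -> bool:
    """Recommend leaving immediately when the estimated travel time,
    including traffic and weather delays, consumes the time remaining
    before the meeting."""
    total_travel = base_travel + traffic_delay + weather_delay
    return now + total_travel >= meeting_time
```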

Further details in relation to the ‘calendars’ aspect are provided below with reference to FIGS. 29-42 in the section titled ‘Calendars and moods’.

Another example recommendation is illustrated in FIG. 7D. In this embodiment, the wristband system displays to a wearing user a “wellness alert,” providing a recommendation to relax and hydrate. This recommendation may be generated by analyzing the wearing user's heart rate (biotelemetry data) and the wearing user's hydration levels (biotelemetry data), and determining that the wearing user is stressed and/or dehydrated. The display also shows a recommendation to visit “your favorite juice bar, Jamba Juice”. This recommendation may be generated by analyzing the wearing user's interests (social data), and determining that the wearing user enjoys Jamba Juice. The recommendation may also be generated because the wearing user often frequents Jamba Juice, and the life management system 208 has determined automatically that the wearing user enjoys Jamba Juice.

FIG. 7E illustrates another gesture that a wearing user can make that is recognized by the gesture sensor 406. This gesture includes a user shaking the wrist wearing the wristband. For example, after viewing a recommendation, the wearing user may shake the wrist wearing the wristband to change the display from the recommendation user interface to a home screen.

FIGS. 8A-8B illustrate alternative examples of user interfaces displayed on a wristband system, in another example scenario of a wearing user using the wristband 100. The user interface in FIG. 8A shows a recommendation transmitted to the wristband 100 by the life management system 208. The recommendation is to train for a marathon by running a 7 mile course at 3 pm. The life management system may use a combination of the user's calendar (social data), interests (social data), current location (geospatial location data), and physical activity data history to generate the recommendation. Alternatively, an assistant of the user may have viewed the user's calendar, current location, and other information (such as the assistant's own knowledge that the user is training for a marathon, or the assistant's own knowledge of the user's exercise capabilities) to send the recommendation to the wearing user.

The life management system can send a further recommendation to the wearing user as the user is exercising. For example, the user may have taken the recommendation for marathon training at 3 pm. FIG. 8B shows a recommendation transmitted to the wristband 100 by the life management system 208 to increase the user's pace for the next mile. The life management system could have used the user's current pace (physical activity data) and the user's historical pace data to determine that the user should increase the user's running pace.

FIGS. 9A and 9B illustrate different perspective views of a scale device 206. FIG. 9A shows a perspective view of the scale device 206 in an “OFF” position. The scale 206 may be made of any solid substance, including metal, plastic, glass, silicone, or rubber. In one embodiment, at least a portion of the scale 206 is made from aluminum. In a further embodiment, the surface of the scale 206 is made from a brushed metal, such as brushed aluminum. The scale 206 includes a display screen 902. The screen 902 could be an LED display, an OLED display, a plasma display, a liquid crystal display (LCD), a dot matrix display, or any other type of screen display. In an embodiment, the display screen 902 is not visible unless the scale is in an “ON” position. In a further embodiment, a lighted display on the scale 206 or on the display screen 902 of the scale 206 is activated when the scale is in an “ON” position, and is not activated when the scale 206 is in an “OFF” position.

FIG. 9B shows an alternative perspective view of a scale device 206, in which the scale 206 is in an “ON” position and displaying body metric data on the display screen 902. This body metric data includes a weight of the user (85 kg) and a body fat percentage of the user (15%). In one embodiment, body metric data for the user is displayed on the display screen 902 when the user is standing on the scale 206. The scale 206 can be configured to automatically send the body metric data through a network 210 to a client device 202, such as a wristband 100 of the user, for display to the user. In an embodiment, the body metric data is sent to a wristband 100 of the user for display as the user is using the scale 206. Therefore, the user can use a scale 206 and immediately view body metric data from the scale at the wristband display screen 102 while the user is using or standing on the scale (or immediately after use of the scale). In another embodiment, biotelemetry data, physical activity data, or other data from the wristband 100 can be sent to the scale 206 for display 902 to the user on the scale display.

The above description is included to illustrate the operation of the preferred embodiments and is not meant to limit the scope of the invention. The scope of the invention is to be limited only by the following claims. From the above discussion, many variations will be apparent to one skilled in the relevant art.

The text of the abstract is repeated here so as to form part of the description. An intelligent wristband system and life management environment includes a wristband to be worn by a user. The wristband contains a control unit, a curved display screen, and multiple sensors to detect biotelemetry data and physical activity data for the wearing user. Another sensor within the wristband detects one or more distinct gestures made by the wearing user, such that the gestures indicate instructions to be performed by the control unit of the wristband. The biotelemetry and physical activity data is sent to a communications interface, which determines scores for the user based on the received data and data transmitted from external sources such as a scale device, or from a second user such as a personal assistant or health coach. The scores can be used to make lifestyle recommendations or instructions for the wearing user, which can be displayed on the wristband.

FIGS. 10A and 10B illustrate non-limiting example embodiments of interactions with the user's calendar through the life management system. Life Management has three main themes called Live, Act and Do as shown in FIG. 10A.

These themes allow a Role to create Space in Life Management for an Action.

Space can be defined as any sufficient combination of the requesting party's identity, time span, size span, model type(s), price span, brand(s), description, material etc.

For example, Space can be a calendar start-stop time, flight (e.g. SFO-JFK), lowest price. Alternatively Space could be Nike shorts, white, L, men's. Or blood pressure high, location NYC.

Having created Space, the next task is to assign an Action.

The Action may be defined before the Space, in parallel or within the relevant Space, and/or after the Space is created. The Action may be automatic, automatic based on parameters, and/or manual.

For example, Life Management may automatically book a flight based on the lowest price preference for SFO-JFK in the time slot indicated.

The Action and/or the Space can generate notification to any Role(s) or identity related to a Role.

For example, the receipt may be copied to the EA and the User, while the PT receives information about the time spent flying, and the User's husband receives notification that dinner in the vicinity of SFO is not possible that day.

The Action and the Space can have different States prior to, during and after the Space.

For example, alerting a PT that a User's blood pressure is high (an Action) may be inactive while the User's blood pressure is low, active as of the blood pressure becoming high (start of Space), and urgently active when the blood pressure becomes very high (end of Space). And a Space may be ended by an Action taking place, such as a purchase ending the need to make a purchase.
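The States of the blood-pressure Action in this example might, as a non-limiting sketch, be expressed as a simple mapping from a measurement to a State. The particular thresholds are assumptions, not part of the specification:

```python
def alert_state(systolic_mmhg: float) -> str:
    """Illustrative States for the blood-pressure alerting Action."""
    if systolic_mmhg < 140:
        return "inactive"        # blood pressure low: before the Space
    if systolic_mmhg < 180:
        return "active"          # blood pressure high: start of the Space
    return "urgently active"     # blood pressure very high: end of the Space
```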

A number of other examples and/or features are provided below in relation to the system:

A system as described (“Life Management”) that collates data related to a Role(s).

A system that allows Role(s) to manage Role(s) collated data.

A system that allows the delegation of rights from a Role (e.g. Executive(s)) to a role(s) (e.g. Executive Assistant(s)) for managing the Role(s) collated data.

A system that allows a Role (e.g. EA) to create Space and Actions for Roles(s) (e.g. self and/or User).

A system that allows a Role (e.g. EA) to set Space and Action parameters for Role(s) (e.g. self and/or User).

    • a. For example, blocking out the time from 1 pm to 7 pm for air travel

A system that uses data related to Role(s) to set Space and Action parameters for Role(s).

    • a. For example, using calendar and/or GPS locations to conclude that flights are needed, and blocking out the time from 1 pm to 7 pm for air travel

A system that allows a Role to manually trigger steps in a process using Space and Actions (e.g. approve a flight booking).

    • a. For example, approving a flight that has been selected by a Role

A system that automatically triggers steps in a process using Space and Actions (e.g. approve a flight booking).

    • a. For example, making a purchase that meets the criteria automatically
    • b. For example, automatically inserting frequent flyer card details into the booking

A system that uses parameters in one part of the system to set parameters in another part of the system.

    • a. For example, a GPS location collected from a phone can be used to set User(s) time parameters online and in an online watch
    • b. For example, high stress levels observed in an online watch can be used to change travel options for maximum convenience instead of lowest cost.
    • c. For example, low battery indication in a phone can be used to set an online watch to poll for information more often, and alert fellow Role(s) about the phone's battery being low

A system that uses parameters in one part of the system to shift tasks to another part of the system

    • a. For example, a TO DO list may reside in a phone, and be moved to an online watch if the phone's battery is running low
    • b. A GPS measurement in a phone can drive the choice of image shown online (e.g. sunrise in Seattle)

FIGS. 11A-M illustrate non-limiting example embodiments of gestures made by the user to control wristband functions.

FIG. 11A is an example of the ‘natural lift with wrist rotation’ gesture. Raising one's arms with a rotation of the wrist triggers sensors to wake and display current watch functions and present biometric information. The motion may be calibrated by moving from the rest position to the glance position. The gesture may be parameterised by the starting rest position, the ending glance position, the time duration between the start and end positions, and the time duration at the glance position.
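The parameterisation of the ‘natural lift with wrist rotation’ gesture described above might, as a non-limiting sketch, be checked against calibrated values as follows. The duration windows and position labels are illustrative assumptions:

```python
def matches_natural_lift(start_pos: str, end_pos: str,
                         lift_duration_s: float, hold_duration_s: float) -> bool:
    """Check a candidate motion against the gesture's parameters:
    starting rest position, ending glance position, time between the
    two positions, and dwell time at the glance position."""
    return (
        start_pos == "rest"
        and end_pos == "glance"
        and 0.2 <= lift_duration_s <= 1.5   # assumed rest-to-glance window
        and hold_duration_s >= 0.5          # assumed minimum dwell at glance
    )
```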

FIG. 11B shows an embodiment of the device from three different perspectives.

FIG. 11C shows an embodiment of the device fastened to the wrist of a user.

FIG. 11D shows a ‘single tap’ gesture. This may be used to select or activate a state.

FIG. 11E shows a ‘swipe’ gesture. By swiping left or right the user may move between modules.

It may be calibrated from the glance position by swiping left or right. (Parameters: platform general I/O same as Jelly Bean/Key lime pie implementation)

FIG. 11F shows the ‘two finger tap/ttap’ gesture. This may act as a back button to back out of a given state. It may be calibrated from the glance position by tapping the screen with two fingers and parameterised by the time duration to complete the two finger tap/ttap, the time lag between the first and second finger taps, the force of each individual finger tap and the distance between finger contacts.

FIG. 11G indicates a ‘long press’ gesture. This is used to dive into the settings mode for the platform or module. Performing the ‘two finger tap’ or ‘shake’ gestures exits the settings mode. It may be calibrated from the glance position by performing a single tap and holding it. (Parameters: platform general I/O same as Jelly Bean/Key lime pie implementation)

FIG. 11H shows possible paths taken by a finger of a user to use the ‘single-touch, drag jog shuttle circular’ gesture. These may be performed either clockwise or anticlockwise. This gesture may be used to scrub biotelemetry data points. It may be calibrated from the glance position, and parameterised by the distance of the user's finger from the screen center when performing the gesture, the force applied by the user's finger and the radial displacement between the initial and final positions of the user's finger.

FIG. 11I shows an example of the ‘single-touch, drag’ gesture, with the dashed circle indicating the initial touch of the user's finger and the arrows indicating possible directions of motion of the user's finger after the initial touch. This gesture may be used to slide the display around within movable areas within modules. It may be calibrated from the glance position by performing a single tap plus a drag. (Parameters: platform general I/O same as Jelly Bean/Key lime pie implementation)

FIG. 11J shows examples of the ‘single touch, pinch and spread’ gestures. The left diagram shows an example of the ‘spread’ gesture, with the dashed circles indicating the final locations of two of the user's fingers and the arrows indicating the directions of motion of the user's fingers after the initial touch. The right diagram shows an example of the ‘pinch’ gesture, with the dashed circles indicating the initial positions of the two fingers the user is using to perform the gesture and the arrows indicating the subsequent direction of motion of those fingers. These gestures may be used to zoom in and out of large content. The gesture may be calibrated from the glance position. (Parameters: platform general I/O same as Jelly Bean/Key lime pie implementation)

FIG. 11K shows an example of the ‘natural turning shake of your wrist’ or ‘home/shake reset’ gestures, which may be used as a reset to the very initial state or glance home. They may be calibrated from the glance position by shaking the wrist and parameterised by the starting glance position, a count of the number of shakes performed by the user, the displacement (distance) of the shakes and the time duration to complete the shakes.

FIG. 11L shows a representation of the ‘Vibe language’ as described above: a silent communication language for time, alarms, notifications and other custom events.

FIG. 11M shows various symbolic representations of the gestures in other figures. The single tap gesture is shown by 1, the double tap gesture by 2, the long press gesture by 3, the single touch drag gesture by 4, the single-touch, drag jog shuttle circular gesture by 5 and the vibe language by 6.

Definition of Terms/Gestures in 3D Space.

Note: all definitions are from the perspective of an observer facing the user's front.

    • The X axis extends left and right, bisecting the user's front and back.
    • Positive X is right of the user's center (to the user's left).
    • A move in the positive X direction is a move to the right of the current X position.
    • Negative X is left of the user's center (to the user's right).
    • A move in the negative X direction is a move to the left of the current X position.
    • The Y axis extends up and down, bisecting the user's top and bottom.
    • Positive Y is above the user's center of mass.
    • A move in the positive Y direction is a move upwards from the current Y position.
    • Negative Y is below the user's center of mass.
    • A move in the negative Y direction is a move downwards from the current Y position.
    • The Z axis extends forwards and backwards, bisecting the user's left and right.
    • Positive Z is in front of the user.
    • A move in the positive Z direction is a move forwards (towards the observer) from the current Z position.
    • Negative Z is behind the user.
    • A move in the negative Z direction is a move backwards (away from the observer) from the current Z position.
    • The center of mass.
    • The center point of the user in all three dimensions.
    • For an average human this is located just below the belly button/at the geometric center.
    • When the user is standing naturally, arms hanging loosely at sides, with glance worn on the user's left wrist, glance will be positioned at:
    • X=(bodyWidth/2)+(space between forearm and left ribs)+(forearmWidth/2).
    • Y=0//Assuming glance is positioned at approximately the user's center of mass in the y direction.
    • Z=0.
    • Rotation.
    • Rotation is angular displacement in regards to a reference axis.
    • 360°=2π radians.
    • 270°=3π/2 radians.
    • 180°=π radians.
    • 90°=π/2 radians.
    • 45°=π/4 radians.
    • 0°=0 radians.
    • X rotation is rolling or flipping head over heels.
    • Positive X rotation starts with the head moving in the positive Z direction and the feet moving in the negative Z direction.
    • Y rotation is spinning like an ice skater.
    • Positive Y rotation starts with the left arm moving in the positive Z direction and the right arm moving in the negative Z direction.
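The glance position formula given in the definitions above can be written directly as code. This is a non-limiting sketch; the parameter names and example measurements are illustrative, and units are arbitrary length units:

```python
def glance_position_x(body_width: float, forearm_gap: float,
                      forearm_width: float) -> float:
    """X coordinate of the device ('glance') worn on the left wrist,
    per X = (bodyWidth / 2) + (space between forearm and left ribs)
    + (forearmWidth / 2). Y = 0 and Z = 0 when the user is standing
    naturally with arms hanging loosely at the sides."""
    return (body_width / 2) + forearm_gap + (forearm_width / 2)
```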

Device for Providing Alerts

Referring to FIG. 12, the wearable device may be in the form of a wristband 100 having a strap 3 with a releasable fastening, such as hook and loop fasteners, for attaching it to the user's wrist. The wristband 100 includes a pair of electrodes 4 which are positioned spaced apart on the inner surface 6 of the band 100 so as to be in contact with the user's skin. The electrodes 4 may for example take the form of stainless steel pads.

The band 100 is conveniently relatively thin, for example around 8 mm thick. The electrodes may be relatively small, for example about 5 mm in diameter, and spaced around 50 mm apart. The electrodes 4 may also serve as galvanic skin response (GSR) plates for measuring a physiological state of the wearer, for example having 16 bit resolution.

A vibrating element 8 such as is known in the art may be mounted inside the device for example adjacent one of the electrodes 4. The device may also include other outputs such as a speaker and a light.

A controller 10 (or 410, FIGS. 4A and 4B) is mounted inside the device, and includes a receiver for receiving wireless signals. The controller 10 is in communication with the electrodes 4 and each of the other output elements for actuating them according to a selected alert signal in response to receipt of an alert instruction.

The device 100 may also include a display panel 12 which partly wraps around the outer surface 14 thereof, and which may be a touch screen for receiving user inputs to control and/or interact with the device 100. The controller may be configured to receive inputs from the touch screen and from other sensors mounted in the device 100. The device 100 may for example include an ambient light sensor, an accelerometer, a gyroscope, or a GPS device. The controller may include a wireless transmitter for transmitting output signals from the touch screen, the sensors and the electrodes. Further details relating to other aspects of the device, such as gesture detection, biotelemetry monitoring, physical activity monitoring, control and alerts (and combinations thereof), are described in U.S. Provisional Patent Application No. 61/874,107, titled “Intelligent Wristband and Life Management Environment,” filed on Sep. 5, 2013; and in a PCT application filed by the same applicant on the same day as the present application, titled “Wearable device”. All of these documents are hereby incorporated by reference in their entirety. The technology and functionality of the device described in either of these documents may be incorporated into the device described herein.

The device 100 may also be used in conjunction with the system as described in United Kingdom Patent Application No. 1400225.7, titled “Processing system and method”, filed Jan. 7, 2014; U.S. Provisional Patent Application No. 61/874,219, titled “Life Management System”, filed on Sep. 5, 2013; or any of the three PCT applications filed by the same applicant and on the same day as the present application both titled “Processing system and method” (agent references P41407 WO, P41407WO-01 and P43674WO). All of these documents are hereby incorporated by reference in their entirety.

The controller is configured to actuate the electrodes by providing a stimulus pulse as shown in FIG. 13, or a sequence of such pulses. For example, each pulse is a square wave pulse 16. The pulse may have the following characteristics: U <= 20 to 45 V, I <= 5 mA, active output time 0.20-1 μs, repeat frequency f_repeat = 1-100 Hz. All of these characteristics, or selected ones of them, may be variable. By varying the characteristics of the pulse or sequence of pulses, an alert signal corresponding to a particular alert event may be provided. For example: 5 impulses in a row, each impulse 1 μs long with a 1 s gap between impulses, may correspond to an incoming call. This can be provided using a step-up DC/DC converter with the controller output.
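The encoding of an alert event as a pulse sequence, such as the incoming-call example above, might be sketched as a simple record. The field names are assumptions for illustration; the parameter ranges follow the characteristics stated in the text:

```python
from dataclasses import dataclass

@dataclass
class PulseSequence:
    """One alert signal expressed as a pulse train (voltage <= 45 V,
    current <= 5 mA, pulse width 0.20-1 us, repeat rate 1-100 Hz)."""
    count: int              # number of impulses in a row
    pulse_width_us: float   # active output time per impulse
    gap_s: float            # gap between impulses

# The example from the description: 5 impulses, 1 us each, 1 s gaps
INCOMING_CALL = PulseSequence(count=5, pulse_width_us=1.0, gap_s=1.0)
```

Distinct alert events would map to distinct `PulseSequence` values with different counts, widths, and gaps.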

Referring also to FIG. 14, the controller may be in wireless communication with a device such as a mobile device 18, for example the user's mobile phone, which includes an Internet browser 20 (or 210, FIG. 2B) and a software application 22 (or 212, FIG. 2B) which provides calendar and/or email functionality. The mobile device 18 may further be in communication with an external information system (EIS) 23, which communicates with the application 22 and/or the browser 20.

The controller 10 is in communication with various sensors which may include an ambient light sensor 24, haptic driver 26, GSR sensor 28, gyro/accelerometer 30, altimeter 32 and optical skin reader 34, for receiving input signals from such sensors. The controller 10 is also in communication with various output elements for providing alert signals to the user, including a pulse generator 36, speaker 38, light 40 and vibrating element 42.

In use controller 10 receives alert instructions wirelessly from the mobile device 18. The instruction is to provide an alert signal corresponding to an alert event which needs to be notified to the user. The controller actuates the pulse generator 36 to provide a pulse or sequence of pulses to the electrodes 4 according to the alert signal which is required. The controller may also actuate one or more of the other output elements 38 to 42 to provide further alerts such as a vibration, sound, light or sequence of such alerts in combination with the pulse provided to the electrodes 4. The user thus receives an alert signal indicating a particular event.

The device may be programmable by the user, for example using a touch screen, to select an alert signal corresponding to any or each alert event. Various alert signals or sets of alert signals may be preconfigured for selection, or may be customized by the user, such as combinations of alerts, for example alternating electric shocks and vibrations to alert the user that they have received a new email. In one example, the device is programmable offline via a web portal connectable to the device 100, and the programmed instructions are subsequently downloaded to the device 100. Furthermore, a markup language is available to facilitate programming of the device 100. This may effectively take the form of a translation module which converts events, commands or words into alert signals which are distinct combinations of shocks, vibrations, sounds etc. The translation may be preset or personalized for the user. There may be a plurality of different preset vocabularies appropriate to different people or circumstances, such as signals without sounds for those that are hearing impaired.
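The translation module described above, converting events into distinct alert-signal combinations, might be sketched as a vocabulary lookup. The event names, signal names, and the specific vocabulary below are illustrative assumptions:

```python
# An assumed preset vocabulary with signals without sounds,
# as suggested for hearing-impaired users.
HEARING_IMPAIRED_VOCAB = {
    "new_email": ["shock", "vibrate", "shock", "vibrate"],  # alternating alerts
    "incoming_call": ["vibrate"] * 5,
}

def translate(event: str, vocab: dict) -> list:
    """Convert an event into its alert-signal sequence for this user,
    falling back to a single vibration for unmapped events."""
    return vocab.get(event, ["vibrate"])
```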

The instruction may relate to an alert event generated by the software application which communicates with the EIS 23. The alert event may be a communication event on the mobile device, or a data event such as a calendar event. The alert event may further be a combination of such an event and a sensor event received from the controller, such as a location indication. In one example, alerts are generated from social media sources via the browser 20, application 22, and/or EIS 23.

The device may thus provide alerts to a user indicating a variety of situations. For example, where the current time and location of the user are known, and the time and location of the meeting are known, the travel time to the location may be calculated and the user may receive an alert indicating the need to begin travelling to the meeting. The user may further receive alerts relating to other environmental or physiological conditions or events.
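By way of illustration only, the travel-time alert described above can be sketched in a few lines. The function names, the five-minute buffer, and the assumption that the travel time arrives pre-computed (e.g., from a routing service given the user's location and the meeting venue) are hypothetical, not part of the disclosure:

```python
from datetime import datetime, timedelta

def departure_alert_time(meeting_start, travel_minutes, buffer_minutes=5):
    """Return the time at which to alert the user to begin travelling.

    travel_minutes is assumed to come from a routing service given the
    user's current location and the meeting location.
    """
    return meeting_start - timedelta(minutes=travel_minutes + buffer_minutes)

def should_alert(now, meeting_start, travel_minutes, buffer_minutes=5):
    """True once the current time reaches the departure alert time."""
    return now >= departure_alert_time(meeting_start, travel_minutes, buffer_minutes)

# Example: a 30-minute journey to a 10:00 meeting triggers the alert at 09:25.
meeting = datetime(2014, 9, 4, 10, 0)
print(departure_alert_time(meeting, travel_minutes=30))  # 2014-09-04 09:25:00
```

In practice the travel-time estimate would be refreshed as the user's sensed location changes, so the alert time moves with the user.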

The device may also provide different alert signals or sets of alert signals in dependence upon various conditions such as the user's sensed physiological state or movements or environment, or a combination of these. Thus if the user is sitting still in a quiet room the controller may select alerts suitable for that situation, such as a low intensity or silent alert or a set of such alerts, whereas a user running along a busy street may receive higher intensity alerts. Furthermore, the type of alert may be modified in dependence upon various conditions such as the user's sensed physiological state or movements or environment, or a combination of these. For example, an electric shock alert may be provided rather than a sound alert if it is detected that the user is in a meeting, or an alert may comprise multiple alerts, sounds, vibrations etc. if the event is considered to have a high priority (e.g. an incoming email from your patent attorney). The user may set such priorities and the associated alerts (for example, using the markup language) so as to customise the device, or the priorities and associated alerts may be pre-set, or the device may learn the relative priorities of various events over time based upon the user's reaction to the same or similar events.
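The context-dependent selection above can be expressed as a small rule function. The modality names, thresholds, and rules below are illustrative assumptions only; the device could derive `in_meeting` from the calendar and `moving` from the gyro/accelerometer 30:

```python
def select_alert(priority, in_meeting=False, ambient_noise_db=40.0, moving=False):
    """Pick alert modalities from sensed context (illustrative rules only)."""
    if in_meeting:
        # Silent modalities only: a mild electric pulse rather than sound.
        return ["pulse"] if priority < 2 else ["pulse", "vibration"]
    if moving or ambient_noise_db > 70.0:
        # Noisy or active context: raise the intensity of the alert set.
        return ["vibration", "sound"] if priority < 2 else ["vibration", "sound", "light"]
    # Quiet, stationary context: a single low-intensity alert suffices.
    return ["vibration"]
```

A learning variant would adjust these rules over time from the user's reactions rather than hard-coding them.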

Generally, embodiments of the invention may include one or more of the following features:

    • Using small electric shocks (“eVibra”) to alert a user about events (events such as pending calendar items, biometrics such as high heart rate, or alerts from one user to another user)
    • Using eVibra sequences for such alerts
    • Example
      • Define Shock:
        • Name shock
        • Set shock frequency
        • Set shock duration
        • Set voltage
      • Define Shock Alert
        • <name 1>, <time 1>, <name 2>, <time 2>, <name 3> . . .
    • Using mechanical vibrations (“Vibra”) sequences for such alerts
    • Example:
      • Define Vibra:
        • Name Vibra
        • Set Vibra frequency
        • Set Vibra duration
        • Set amplitude
      • Define Vibra Alert
        • <name 1>, <time 1>, <name 2>, <time 2>, <name 3> . . .
    • Using combinations of eVibra and Vibra for such alerts
    • Using one user interface (e.g. mobile phone or browser) to set such eVibra and/or Vibra (combinations) for another device (e.g. a wearable device)
    • Combining biometric data with data such as calendar, email or social network information to generate combined alerts
    • Using combined alerts to trigger eVibra and/or Vibra (combination) alerts

Allowing users to create custom eVibra and/or Vibra sequences for such alerts via, for example, a markup language.
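A minimal sketch of how such a markup might be parsed follows. The line-based syntax, field names, and units are invented for illustration; they are not the markup language of the disclosure:

```python
# Hypothetical textual form of the "Define Shock" / "Define Shock Alert"
# structure: named shocks with parameters, then alert sequences that
# alternate shock names and gap times in milliseconds.
ALERT_SPEC = """
shock short frequency=50 duration_ms=100 voltage=12
shock long frequency=50 duration_ms=400 voltage=12
alert new_email = short 200 short 200 long
"""

def parse_spec(text):
    """Parse shock definitions and alert sequences from the sketch syntax."""
    shocks, alerts = {}, {}
    for line in text.strip().splitlines():
        parts = line.split()
        if parts[0] == "shock":
            # e.g. {"frequency": "50", "duration_ms": "100", "voltage": "12"}
            shocks[parts[1]] = dict(p.split("=") for p in parts[2:])
        elif parts[0] == "alert":
            # Everything after "=" is the sequence of names and gap times.
            alerts[parts[1]] = parts[3:]
    return shocks, alerts

shocks, alerts = parse_spec(ALERT_SPEC)
```

A real translation module would additionally validate the parameters against hardware limits before downloading them to the device 100.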

Life Management System

The Figures (FIGS.) and the following description describe certain embodiments by way of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein. Reference will now be made in detail to several embodiments, examples of which are illustrated in the accompanying figures. It is noted that wherever practicable similar or like reference numbers may be used in the figures and may indicate similar or like functionality.

FIG. 15 is a high-level block diagram illustrating an embodiment of a life management environment 101 including a life management system 140 (or 208 as shown in FIGS. 2A and 2B) connected by a network 110 (or 210 as shown in FIGS. 2A and 2B) to a client device 105, a scale platform 115, a third party service provider 120, and an assistant device 125. Here, only one client device 105, one scale platform 115, one third party service provider 120, one assistant device 125, and one life management system 140 are illustrated, but there may be multiple instances of each of these entities. For example, there may be thousands or millions of client devices 105 in communication with multiple life management systems 140.

The network 110 provides a communication infrastructure between the client device 105, the scale platform 115, the third party service provider 120, the assistant device 125, and the life management system 140. The network 110 is typically the Internet, but may be any network, including but not limited to a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile wired or wireless network, a private network, or a virtual private network. In one embodiment, the network 110 uses standard communications technologies and/or protocols. For example, the network 110 includes communication links using technologies such as Ethernet, 802.11, worldwide interoperability for microwave access (WiMAX), 3G, 4G, code division multiple access (CDMA), digital subscriber line (DSL), etc. Examples of networking protocols used for communicating via the network 110 include multiprotocol label switching (MPLS), transmission control protocol/Internet protocol (TCP/IP), hypertext transport protocol (HTTP), simple mail transfer protocol (SMTP), and file transfer protocol (FTP). Data exchanged over the network 110 may be represented using any suitable format, such as hypertext markup language (HTML) or extensible markup language (XML). In some embodiments, all or some of the communication links of the network 110 may be encrypted using any suitable technique or techniques.

The scale platform 115 is a computing device that measures physical data associated with the user. Physical data is data that is indicative of the health of the user. Physical data may include, for example, body mass index, body fat percentage, heart rate, air quality measurement, or some combination thereof. In some embodiments, the scale platform 115 is a scale that when activated by the user (e.g., stood upon) measures the physical data associated with the user. The scale platform 115 communicates physical data to the client device 105 via the network 110, some wireless connection (e.g., WiFi, Bluetooth, etc.), or some combination thereof. Additionally, in some embodiments, the scale platform 115 may communicate physical data to, and receive software updates from, the life management system 140 via the network 110.

The third party service provider 120 comprises one or more computer servers offering goods and/or services that the life management system 140 may offer to its users. Goods and/or services may include, for example, travel services, entertainment services, health services (e.g., gym membership), dining services, consumer goods, some other service, some other good, or some combination thereof. The services can be offered by a third party that is separate from the life management system 140 or, in some embodiments, can be offered by the life management system 140 itself.

The assistant device 125 is a computing device that allows an assistant to interact with snapshot information associated with one or more users of the life management system. An assistant is a third party that views snapshot information associated with a user of the life management system and recommends some action for the user based on the snapshot information. The assistant device 125 communicates the recommended action to the life management system 140. In some embodiments, an assistant may act as a health coach, concierge, personal assistant, well-being specialist, doctor, fitness trainer, dietitian, weight loss coach, sleep specialist, behavioral coach, habit coach, some other type of coach, or some combination thereof.

The client device 105 is a computing device that executes computer program modules which allow a user to interact with the life management system 140. The client device 105 is a computing device capable of receiving user input as well as transmitting and/or receiving data via the network 110. In one embodiment, a client device 105 is a conventional computer system, such as a desktop or laptop computer. Alternatively, a client device 105 may be a device having computer functionality, such as a personal digital assistant (PDA), a mobile telephone, a wearable computing device (e.g., GOOGLE® Glass, wristband, etc.), a smartphone, or another suitable device. In one embodiment, the client device 105 is an intelligent wristband or other wearable device that collects data about the user wearing the device via sensors and provides an interface on which the user can interact with the wristband and the data.

A client device 105 is configured to communicate via the network 110. In one embodiment, a client device 105 executes an application allowing a user of the client device 105 to interact with the life management system 140, the third party service provider 120, or some combination thereof. For example, a client device 105 executes a browser application to enable interaction between the client device 105 and the life management system 140 via the network 110. In another embodiment, a client device 105 interacts with the life management system 140, the third party service provider 120, or both, through an application programming interface (API) running on a native operating system of the client device 105, such as IOS® or ANDROID™.

Additionally, in some embodiments, the client device 105 has biotelemetry monitoring capabilities, physical activity monitoring capabilities, or some combination thereof. For example, a wearable computing device (e.g., a wristband) provides biotelemetry data and/or activity data as further described in U.S. Provisional Patent Application No. 61/874,107, titled “Intelligent Wristband and Life Management Environment,” filed on Sep. 5, 2013 and in a PCT application filed by the same applicant and on the same day as the present application titled “Intelligent Wristband and Life Management Environment”, which are both hereby incorporated by reference in their entirety. In some embodiments, the biotelemetry monitoring capabilities may be provided via one or more peripheral devices (e.g., heart rate monitor) that are coupled to the client device 105. Biotelemetry data may include physical data, calories burned by the user, blood pressure of the user, skin temperature of the user, hydration level of the user, galvanic skin response of the user, brain activity of the user, sleep pattern of the user (e.g., duration and efficiency), or some combination thereof. In some embodiments, the client device 105 is configured to monitor biotelemetry data associated with the user (e.g., via one or more sensors on a wristband), and provide the biotelemetry data to the life management system 140. Activity data is data related to physical activities of the user. Activity data may include, for example, steps taken, stairs climbed, exercise intensity, pace, sleep pattern, sleep duration, some other activity, or some combination thereof. In some embodiments, the client device 105 has means for alerting the user, for example by delivering mild electric shocks to the user via at least one electrode and/or a vibration unit, as further described in United Kingdom application No. 1315764.9, titled “Device for providing alerts”, filed Sep. 4, 2013 and a PCT application filed by the same applicant and on the same day as the present application titled “Device for providing alerts”, which are both hereby incorporated by reference in their entirety.

The client device 105 is configured to present snapshot information to the user. Snapshot information describes different and/or possible aspects of the life of the user. Snapshot information may include, e.g., biotelemetry data, activity data, user profile information, social data, one or more calendar cards, one or more health parameters (e.g., momentum level of the user, stress score, mood information, etc.), one or more recommendations, or some combination thereof. Additionally, the client device 105 is configured to enable the user to interact with the snapshot information, via, for example, a graphical user interface. Additionally, in some embodiments, the client device 105 is configured to present one or more cards to the user. A card presents portions of snapshot information (e.g., recommendations), advertisements, or some combination thereof, to a user of the client device 105. Additionally, in some embodiments, the client device 105 is configured to perform an action based on mood information received from the life management system 140. For example, the client device 105 may adjust colors associated with its display based on the mood information, or may change the display to include different types of information based on mood (e.g., a user who is stressed may receive a simplified display with less information, a user who is angry may receive a display that incorporates no items on the topic associated with his anger, a user who is sad may receive uplifting, brightly colored messages, and so forth).

The client device 105 is configured to communicate with the scale platform 115, the assistant device 125, the third party service provider 120, the life management system 140, or some combination thereof, via the network 110. Additionally, in some embodiments, the client device 105 is configured to receive physical data from the scale platform 115 via a wireless connection, e.g., WiFi, Bluetooth, etc. The client device 105 is configured to receive snapshot information, cards, advertisements, or some combination thereof, from the life management system 140. Additionally, in some embodiments, the client device 105 is configured to send biotelemetry data and/or activity data, via the network 110, to the life management system 140. Additionally, in some embodiments, the client device 105 may send requests for one or more services from the third party service provider 120, via the network 110.

The life management system 140 generates snapshot information using social data associated with a user of the life management system 140, biotelemetry data associated with the user, user profile information associated with the user, activity data associated with the user, or some combination thereof. Social data describes activities and connections of the user. Social data may include, for example, a calendar associated with a user of the life management system, emails between the user and other users of the life management system 140, information from a social networking system associated with the user, connections between the user and other users of the life management system 140, or some combination thereof. In some embodiments, the life management system 140 is configured to provide some or all of the snapshot information to the client device 105.

In some embodiments, the life management system 140 provides some or all of the snapshot information associated with a user to the assistant device 125 of an assistant who is associated with the user. The life management system 140 is configured to receive one or more recommended actions from the assistant device 125. The life management system 140 is configured to adjust the snapshot information using the one or more recommended actions and provide some or all of the adjusted snapshot information to the client device 105. In some embodiments, all or a portion of the snapshot information is provided directly from the assistant device 125 to the client device 105 with limited or no involvement of the life management system 140.

As discussed in detail below, the life management system 140 is configured to interact with the third party service provider 120 to provide one or more services for users of the life management system 140. For example, the life management system 140 may select and provide one or more advertisements to the client device 105 based on targeting criteria of the advertisements and snapshot information associated with the user. In another example, the life management system 140 may interact with the third party service provider 120 to obtain a particular service and/or good for the user based on the snapshot information associated with the user (e.g., book a massage for the user when he is stressed).

In some embodiments, the life management system 140 is configured to calculate a mood of a user using some or all of the snapshot information associated with the user. The life management system 140 may then communicate mood information associated with one user to another user of the life management system 140. The other user, now aware of the mood of the user, may decide whether to interact with the user.

In some embodiments, the life management system 140 and/or other entities of the life management environment 101 may perform one or more of the functions described in Appendixes A-B.

FIG. 16 is a high-level block diagram illustrating a detailed view of the life management system 140 according to one embodiment. The life management system 140 comprises modules including a user profile store 205, an advertisement store 211, a social interaction module 215, a snapshot generation module 220, a recommendation module 225, a management interaction module 230, a card generation module 235, an action module 240, and a mood module 245. Some embodiments of the life management system 140 have different modules than those described here. Similarly, the functions can be distributed among the modules in a different manner than is described here.

Each user of the life management system 140 is associated with a user profile, which is stored in the user profile store 205. Information stored in the user profile is known as user profile information. A user profile includes declarative information about the user that was explicitly shared by the user and may also include profile information inferred by the life management system 140. Examples of information stored in a user profile include login and password information, biographic, demographic, and other types of descriptive information, type of employment, health descriptors (e.g., diet, exercise level, smoker/nonsmoker, medication allergies, etc.), educational history, gender, location, or other medical information associated with the user, user controls, or some combination thereof.

User controls control how some or all of snapshot information (e.g., mood information, user profile information, biotelemetry data, activity data, recommendations, etc.) associated with the user may be used by the life management environment 101. In some embodiments, user controls are used to set authorization levels for assistants and/or the action module 240. An authorization level determines whether the associated entity (e.g., assistant and/or life management system 140) is able to perform an action without first getting express approval from the user. Authorization levels may be set to a high level, such that an entity must obtain express approval from a user to perform an action (e.g., purchase a good/service, add a calendar entry, etc.), or be set to a low level, such that an entity may perform an action without first getting express approval from the user. Additionally, different entities may have different authorization levels, and the same entity may have different authorization levels depending on the type of action involved.

Additionally, in some embodiments, authorization levels for entities may be customized based on the type of action and one or more criteria selected by the user. Criteria may include, for example, identity of the requesting entity, identity of some other entity, time span, size of a good, model of a good, price range of a good/service, brand of a good/service, price of a good/service, travel destination, preferred mode of transportation (e.g., airline, train, car, etc.), preferred transportation carrier, preferred merchant, preferred method of payment, preferred good/service, geographic location, portions of a user's biotelemetry data, portions of a user's activity data, portions of a user's physical data, portions of a user's social data, portions of a user's user profile information, or some combination thereof. For example, a user may configure the user controls such that an entity generally has a low authorization level, but if one or more particular criteria are met, the entity's authorization level is high.

Additionally, in some embodiments, the user controls may be configured to enable the action module 240 to perform certain actions when one or more criteria are met. User controls may define one or more actions that may be performed if the one or more criteria are met. The actions may be performed automatically, manually, or automatically and subject to additional conditions. An action may include, for example, purchase of a good, purchase of a service, adding a calendar entry, modifying a calendar entry, deleting a calendar entry, shifting tasks between entities in the life management environment 101, shifting data between entities in the life management environment 101, using conditions in one part of the life management environment 101 to set conditions in another part of the life management environment 101, automatically performing one or more actions in accordance with a user's user controls, or some combination thereof. The user controls may be configured such that the user selects an action before or after identifying one or more criteria associated with the action.
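The criteria-based authorization described above can be sketched as a small rule check. The data shapes, level names, and the example price criterion are illustrative assumptions rather than a disclosed implementation:

```python
def authorized(entity, action, context, user_controls):
    """Decide whether an entity may act without express user approval.

    user_controls maps (entity, action) to a base authorization level
    plus optional criteria that elevate it; entities with no rule
    default to a high level (express approval required).
    """
    rule = user_controls.get((entity, action), {"level": "high"})
    level = rule["level"]
    for criterion in rule.get("elevate_if", []):
        if criterion(context):
            level = "low"  # low level: may proceed without approval
    return level == "low"

# Example: assistant purchases generally need approval, except under $20.
controls = {
    ("assistant", "purchase"): {
        "level": "high",
        "elevate_if": [lambda ctx: ctx.get("price", float("inf")) < 20.0],
    }
}
```

This mirrors the paragraph above where an entity generally has a low (or high) authorization level that flips when particular criteria are met.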

The advertisement store 211 stores one or more advertisement requests for goods and/or services. The advertisement store 211 receives one or more advertisement requests from the third party service provider 120, some other advertiser, an ad exchange, or some combination thereof. An advertisement request includes advertisement content and a bid amount. The advertisement content is text, image, audio, video, or any other suitable data presented to a user. In various embodiments, the advertisement content also includes a landing page specifying a network address to which a user is directed when the advertisement is accessed. The bid amount is associated with an advertisement by an advertiser and is used to determine an expected value, such as monetary compensation, provided by an advertiser (e.g., the third party service provider 120) to the life management system 140 if the advertisement is presented to a user, if the advertisement receives a user interaction, or based on any other suitable condition. For example, the bid amount specifies a monetary amount that the life management system 140 receives from the advertiser if the advertisement is displayed, and the expected value is determined by multiplying the bid amount by a probability of the advertisement being accessed by a user.
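The expected-value computation stated above (bid amount multiplied by the probability of the advertisement being accessed) is direct to express; the selection helper and dictionary shape are assumptions for illustration:

```python
def expected_value(bid_amount, p_access):
    """Expected value of showing an ad: the bid amount multiplied by
    the probability of the advertisement being accessed by the user."""
    return bid_amount * p_access

def select_ad(ad_requests):
    """Pick the ad request with the highest expected value
    (hypothetical helper; each request carries a bid and an access
    probability estimate)."""
    return max(ad_requests, key=lambda ad: expected_value(ad["bid"], ad["p_access"]))
```

For example, a $1.00 bid with a 0.9 access probability outranks a $5.00 bid with a 0.1 access probability.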

Additionally, an advertisement request may include one or more targeting criteria specified by the advertiser. Targeting criteria included in an advertisement request specifies one or more characteristics of users eligible to be presented with advertisement content in the ad request. For example, targeting criteria are used to identify users having snapshot information satisfying at least one of the targeting criteria.

The social interaction module 215 maintains social data associated with users of the life management system 140. The social interaction module 215 is able to interact with, in accordance with a user's user controls, a user's email (external and/or internal to the life management system 140), calendar (external and/or internal to the life management system 140), connections in the life management system 140, connections to an external social networking system, or some combination thereof. The social interaction module 215 generates social data using information collected from users' calendars internal to the life management system 140, calendars external to the life management system 140, likes and/or dislikes within the life management system 140, likes and/or dislikes on external social networking systems, received emails (external/and or internal to the life management system 140), geo-location of client devices 105, or some combination thereof.

The social interaction module 215 interacts with calendars associated with users of the life management system 140. Each calendar includes one or more calendar entries. The social interaction module 215 may update (i.e., create, delete, or modify) calendar entries associated with one or more calendars based on instructions from the client device 105 associated with the user, the management interaction module 230, the action module 240, or some combination thereof. A calendar entry is associated with one or more information fields. An information field may include various information items associated with the entry, e.g., date, time period, name, description, reminder information (e.g., alert user 30 minutes prior to event), location information (e.g., address and/or map), attendee information (names and/or profile pictures of parties to the event), mood information for one or more participants (e.g., before, during, and/or after event), document attachments, biotelemetry data associated with one or more users associated with a calendar entry, or some combination thereof. In some embodiments, the social interaction module 215 may send notifications to participants associated with calendar entries. Further detail relating to calendar aspects is described in United Kingdom Patent Application No. 1400225.7, titled “Processing system and method”, filed Jan. 7, 2014 and a PCT application filed by the same applicant on the same day as the present application titled “Processing system and method”. Both of these documents are hereby incorporated by reference in their entirety.

The snapshot generation module 220 generates snapshot information associated with one or more users of the life management system 140. In some embodiments, the snapshot generation module 220 generates one or more health parameters for a user based in part on some, or all of, the social data associated with the user, the biotelemetry data associated with the user, the activity data associated with the user, user profile information associated with the user, or some combination thereof. In some embodiments, a health score represents a sleep efficiency of the user, a stress level of the user, an exercise intensity of the user, a general activity level of the user, a self-control or motivational level of the user, blood pressure of the user, obesity of the user, a momentum level of the user, mood information of the user, or some combination thereof. For example, a stress level score of the wearing user may be determined by combining at least a portion of the social data (e.g., schedule) and biotelemetry data (e.g., heart rate) of the wearing user. In another example, an exercise intensity score of the wearing user may be determined by analyzing portions of the biotelemetry data (e.g., heart rate) and activity data (e.g., pace). The momentum level of the user describes the overall well being of the user. In some embodiments, the momentum score may be generated based on a user's performance relating to assigned goals. For example, an assistant may assign certain tasks (e.g., run two miles weekly) to the user. The snapshot generation module 220 may determine a momentum level associated with the user based on the user's performance of the assigned tasks. Additionally, in some embodiments, the momentum score may be further increased if the user goes above and beyond the assigned tasks (e.g., running four miles weekly). Additionally, in some embodiments, the momentum level may take into account the performance of other users who were assigned the same task. In this manner, the momentum score may take into account the relative differences between the user's performance and that of other users of the life management system 140.
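One plausible momentum computation consistent with the paragraph above is sketched below. The weights, the bonus for exceeding the assignment, and the peer-relative adjustment are all invented for illustration; the disclosure does not specify a formula:

```python
def momentum_level(completed, assigned, peer_rates=(), bonus_ratio=1.0):
    """Illustrative momentum score from assigned-task performance.

    completed/assigned measures performance on assigned tasks;
    bonus_ratio > 1 models going above and beyond the assignment
    (e.g., running four miles when two were assigned), and peer_rates
    shifts the score relative to other users assigned the same task.
    """
    if assigned == 0:
        return 0.0
    rate = completed / assigned
    score = min(rate, 1.0)
    if bonus_ratio > 1.0:
        score += 0.1 * (bonus_ratio - 1.0)  # exceed-the-goal bonus
    if peer_rates:
        peer_avg = sum(peer_rates) / len(peer_rates)
        score += 0.1 * (rate - peer_avg)  # relative standing among users
    return round(score, 3)
```

Completing all assigned tasks yields the baseline score; exceeding them or outperforming peers raises it.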

Mood information of a user is an estimation of the emotional state of the user based on an analysis of the social data associated with the user, biotelemetry data associated with the user, activity data associated with the user, user profile information associated with the user, or some combination thereof. Mood information may indicate, for example, that a user is excited, happy, sad, fatigued, angry, grumpy, or any other emotional state capable of being inferred by the snapshot generation module 220. For example, mood information may be determined from a user's heart rate and galvanic skin response. In some embodiments, mood information may be determined via manual entry by the user of the user's mood. In some embodiments, the mood information may include a numerical score.

Additionally, in some embodiments, the snapshot generation module 220 may prompt a user to identify their current mood. In some embodiments, the snapshot generation module 220 may perform a machine learning algorithm using the received feedback, social data associated with the user, biotelemetry data associated with the user, activity data associated with the user, user profile information associated with the user, or some combination thereof. For example, social data, biotelemetry data, activity data, user profile information, feedback, or some combination thereof, can be considered input signals that are analyzed by the machine learning algorithm. The machine learning algorithm can be trained on a set of signals associated with users of known moods that correspond to particular data taken from the social data, the biotelemetry data, the activity data, the user profile information, or some combination thereof. Once the machine learning algorithm has been trained on a known data set, the algorithm can be used for determining mood information based on a user's social data, biotelemetry data, activity data, user profile information, or some combination thereof. Further detail relating to ‘mood’ aspects is described in United Kingdom Patent Application No. 1400225.7, titled “Processing system and method”, filed Jan. 7, 2014 and a PCT application filed by the same applicant on the same day as the present application titled “Processing system and method”. Both of these documents are hereby incorporated by reference in their entirety.
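The disclosure does not name a particular machine learning algorithm, so a nearest-centroid classifier stands in here as a minimal sketch of training on labeled mood signals and then predicting from new signals. The feature choice (heart rate, galvanic skin response), the labels, and the tiny training set are all assumptions:

```python
def train_centroids(samples):
    """Train a nearest-centroid mood classifier.

    samples: list of (feature_vector, mood_label) pairs, e.g.
    ([heart_rate, gsr], "calm"); returns the mean feature vector
    (centroid) per mood label.
    """
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def predict_mood(centroids, features):
    """Return the mood label whose centroid is nearest to the features."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Tiny illustrative training set: (heart_rate, gsr) -> known mood.
data = [([60, 0.2], "calm"), ([62, 0.25], "calm"),
        ([95, 0.8], "stressed"), ([100, 0.9], "stressed")]
centroids = train_centroids(data)
```

User feedback from the mood prompt would supply the labeled samples, letting the model be retrained and personalized over time.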

The recommendation module 225 generates one or more recommendations for users of the life management system 140 using the generated snapshot information, advertisement requests, or some combination thereof. A recommendation is a suggestion to the user to perform some action.

A recommendation may include, for example, a nutritional recommendation, an exercise-related recommendation, a scheduling recommendation, a travel-related recommendation, a shopping recommendation (e.g., purchase a good and/or service), a sleeping recommendation, a suggestion to add a particular calendar entry, an advertisement for a good, an advertisement for a service, a suggestion to improve one or more health parameters associated with the user, one or more advertisements that facilitate improving the one or more health parameters, one or more tips based on the user's activities (e.g., have a glass of water, take a break, etc.), or some combination thereof. The recommendation module 225 is configured to update the snapshot information with the one or more generated recommendations.

In some embodiments, the recommendation module 225 analyzes some or all of the generated snapshot information to develop recommendations to improve one or more health parameters associated with the user. In some embodiments, the one or more health parameters (e.g., sleep efficiency, a stress level, an exercise intensity, etc.) have corresponding activities of a particular category that when performed by a user generally have a beneficial effect on the corresponding health parameter. For example, a stress level score may have certain corresponding stress reduction activities (e.g., exercise, diet, increased sleep, etc.). In some embodiments, the recommendation module 225 determines whether one or more health parameters for users of the life management system 140 are below a threshold value. The recommendation module 225 may automatically calculate the threshold values, receive the threshold values from the user, receive threshold values from an assistant associated with the user, or some combination thereof. If a health score is below the threshold value, the recommendation module 225 may recommend one or more corresponding activities (e.g., stress reduction activities to reduce stress of the user). The recommendation module 225 may analyze the social data of the user to suggest time slots available to perform the one or more recommended activities.
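
The threshold check described above could be sketched as follows. The parameter names, threshold values, and activity lists are assumptions made for illustration; the disclosure only specifies that parameters below a threshold trigger recommendations of corresponding activities.

```python
# Illustrative sketch: health parameters scoring below their thresholds
# trigger recommendations of the activities associated with that parameter.
# Names and values are assumed, not from the disclosure.

CORRESPONDING_ACTIVITIES = {
    "stress": ["exercise", "diet", "increased sleep"],
    "sleep_efficiency": ["earlier bedtime", "reduced screen time"],
}

def recommend(health_params, thresholds):
    """Return activities for every parameter scoring below its threshold."""
    recs = []
    for name, score in health_params.items():
        if score < thresholds.get(name, 0):
            recs.extend(CORRESPONDING_ACTIVITIES.get(name, []))
    return recs

# Stress score of 40 is below the (assumed) threshold of 50, so its
# corresponding stress-reduction activities are suggested.
print(recommend({"stress": 40, "sleep_efficiency": 85},
                {"stress": 50, "sleep_efficiency": 75}))
```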

In some embodiments, the recommendation module 225 may identify one or more advertisements in the advertisement store 211 that are associated with the one or more recommended activities. The recommendation module 225 may analyze the advertisement content, advertisement targeting criteria, user snapshot information, or some combination thereof, to determine what effect the service/product advertised may have on one or more health parameters of the user. The recommendation module 225 may then generate one or more recommendations using the appropriate advertisements. For example, a recommendation to reduce the stress level of the user may be generated that includes an advertisement for a massage.

Additionally, in some embodiments, the recommendation module 225 may analyze snapshot information associated with users of the life management system 140 that are connected to the user, in accordance with the users' user controls, to generate one or more recommendations for the user and the user's connections. For example, the recommendation module 225 may determine that both the user and a connection of the user need to get more exercise, and that they are both free Saturday afternoon. The recommendation module 225 may generate a recommendation for both the user and the connection to the user to participate in the same spin class occurring on Saturday at 2:00 pm. Additionally, in some embodiments, the recommendation module 225 may take into account the geographic locations of the user, connections to the user, the recommended activity, or some combination thereof in developing the one or more recommendations.

The management interaction module 230 associates users of the life management system 140 with one or more assistants. Assistants may be automatically associated to users of the life management system 140, manually selected by the users, or some combination thereof. Additionally, in some embodiments, users of the life management system 140 may configure one or more user controls, e.g., via the client device 105, that control how much snapshot information is to be provided to the assistants, third party service providers 120, and other users of the life management system 140.

The management interaction module 230 receives requests for snapshot information associated with users of the life management system 140 from one or more assistant devices 125, client devices 105, or some combination thereof. The management interaction module 230 provides some or all of the requested snapshot information to the requesting one or more assistant devices 125, one or more client devices 105, or some combination thereof, in accordance with user controls of users associated with the requested snapshot information.

In some embodiments, portions of snapshot information provided to an assistant device 125, a client device 105, or both, may be displayed via one or more graphical user interfaces (“GUIs”). In one embodiment, one or more GUIs display some or all of the snapshot information associated with the user, e.g., an avatar, a momentum gauge indicating a momentum level, a user name, social data (e.g., calendar information), one or more health parameters, one or more analytical graphs of one or more health parameters, one or more recommendations, portions of biotelemetry data, portions of activity data, some other portion of snapshot information, mood information, portions of a user's calendar, or some combination thereof. Additionally, in some embodiments, the one or more GUIs display some or all of the snapshot information associated with a plurality of users. For example, an assistant may be able to concurrently view snapshot information associated with different users of the life management system 140. Discussed in detail below with respect to FIGS. 17-22 are example embodiments of some of the one or more GUIs.

Additionally, in some embodiments, multiple GUIs may be concurrently displayed to the user, as shown, for example, in FIGS. 28A-C. For example, a GUI presenting a calendar in a monthly format may allow the user to select a specific week, and/or day. Responsive to the selection of a particular week/day, a second GUI may concurrently be presented to the user that displays a detailed view of the selected area. In some embodiments, the detailed view may be responsive to additional selections that request the life management system 140 perform an additional action. For example, a single GUI may initially be presented to the user that displays a user's calendar for a particular month. Responsive to the user selecting a particular day within the month, additional details of the week including the selected day are concurrently presented in a second GUI. Additionally, in some embodiments, the selected day may also display an expanded view of one or more calendar entries in relation to other calendar entries displayed for the rest of the week. Responsive to a selection of a particular calendar entry, one or more additional details may also be displayed. For example, for a calendar entry associated with travel, there can be an option to select flights. Responsive to a selection of the select flights option, the GUI may display one or more possible flight options.

An assistant may analyze the information presented via the one or more GUIs to develop actionable information associated with the user. Actionable information is information associated with an action or recommendation that has been approved by an assistant associated with the user. Actionable information may be, for example, a recommendation generated by the recommendation module 225, a recommendation generated by the recommendation module 225 but modified by the assistant, a new recommendation created by the assistant, instructions to perform an action authorized by the assistant, a message (e.g., text, image, video, or some combination thereof) to the user, instructions to purchase a service or good from the third party service provider 120, instructions to update social data (e.g., calendar) associated with a user, or some combination thereof. For example, a health coach viewing a portion of snapshot information associated with the user may recommend an additional workout, the type of workout, the location of the workout, a proposed time for the workout, etc., via the assistant device 125. The management interaction module 230 receives the actionable information from the assistant device 125.

Additionally, in some embodiments, actionable information may include a container or package of information that an assistant may push to the user's client device 105. The container or package of information may include, for example, details like expected traffic during a particular time, suggestions of routes to take to drive to a meeting and avoid traffic, a pick-up location for others the user may need to pick up to take to the meeting, travel data associated with the meeting (flight, hotel information), suggested restaurants near the meeting location, suggested stores at which to pick up supplies for the meeting along the way, and links that provide more information associated with the meeting, among a variety of other pieces of information; a package of information about wellness (e.g., a list of all of the things a user should bring to his massage appointment, and suggestions of nearby places the user can pick up additional items needed); some other actionable information; or some combination thereof. Thus, the calendar entries/invites provided can include a rich, animated data set that is actionable.

The card generation module 235 generates one or more cards for presentation to one or more users of the life management system 140 using the users' associated snapshot information, actionable information, one or more advertisements, one or more recommendations, or some combination thereof. The card generation module 235 may generate a card, e.g., when an event associated with a calendar entry is set to begin in a certain period of time, one or more health parameters of the user are below a certain threshold, to suggest an advertisement for a good or service potentially of interest to the user, responsive to a request from an assistant device 125, responsive to a request from an event owner of a calendar entry, etc. The card generation module 235 provides the one or more cards generated for a user of the life management system 140 to the client device 105 associated with the user.

A card may include, for example, a card identifier, a general recommendation, one or more problem details, one or more recommendation details, a reminder (e.g., movie starts in 10 minutes), or some combination thereof. A card identifier identifies the type of card. For example, a card identifier may be a momentum alert, a calendar update, you may like this, event reminder, ‘Be Fit,’ ‘Be Effective,’ ‘Be Aware,’ some other action item, etc. A ‘Be Fit’ card identifier is associated with cards relating to physical and/or nutritional tasks (e.g., physical activity, heart rate, sleep, blood pressure, doctor's visit, eating habits, etc.). A ‘Be Effective’ identifier is associated with cards relating to social and/or occupational tasks (e.g., quality time with friends & family, meeting new people, volunteering, donating, work—life balance, efficient meetings, being punctual, referring friends, etc.). A ‘Be Aware’ identifier is associated with cards relating to emotional and/or environmental tasks (e.g., positive attitude, time management, breathing exercises, setting priorities, life balance, turning off lights, recycling, voting, volunteering, etc.). The general recommendation is a summary of the recommendation. General recommendations may be, for example, take a quick break, take a nap, drink water, reduce your calorie intake, or any other message that generally describes the recommendation. The one or more problem details include specific snapshot information (e.g., heart rate, hydration, sleep pattern, etc.). In some embodiments, the one or more problem details may include one or more icons that correspond to snapshot information (e.g., portions of biotelemetry data, activity data, health parameters, etc.) that is outside a preferred range. An icon may be, for example, a heart with an arrow pointing upward to represent a rapid heart rate, a water droplet with a downward facing arrow to represent low hydration, etc. 
Additionally, the card generation module 235 may alter the icons (e.g., change color, shape, etc.) to indicate the present state of the user. The one or more recommendation details propose possible solutions to the identified problem. For example, if a user is dehydrated, the recommendation details may include a suggestion to drink water, visit the user's favorite juice bar, etc. Additionally, in some embodiments, the one or more recommendation details may present one or more advertisements associated with solving the problem (e.g., a discount coupon for a local juice bar). The card generation module 235 is configured to provide one or more cards associated with a user of the life management system 140 to their associated client device 105. An example card is discussed below with regard to FIG. 23.
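
The card fields described above (identifier, general recommendation, problem details, recommendation details) could be sketched as a minimal data structure. The field names, the hydration scale, and the trigger condition are illustrative assumptions; the disclosure's hydration example is reused as the scenario.

```python
# Hypothetical sketch of a card: identifier, general recommendation,
# problem details (with an out-of-range icon), and recommendation details.
# Field names and the hydration scale are assumed for illustration.

from dataclasses import dataclass, field

@dataclass
class Card:
    identifier: str                  # e.g. "Be Fit", "momentum alert"
    general_recommendation: str      # short summary, e.g. "drink water"
    problem_details: list = field(default_factory=list)
    recommendation_details: list = field(default_factory=list)

def generate_hydration_card(hydration, preferred_min):
    """Build a card only when hydration falls below the preferred range."""
    if hydration >= preferred_min:
        return None
    return Card(
        identifier="Be Fit",
        general_recommendation="drink water",
        problem_details=[("hydration", hydration, "droplet, arrow down")],
        recommendation_details=["drink water", "visit your favorite juice bar"],
    )

card = generate_hydration_card(hydration=0.4, preferred_min=0.6)
print(card.identifier, "-", card.general_recommendation)
```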

The action module 240 performs one or more actions, in accordance with the user controls of a user of the life management system 140, using the snapshot information, the actionable information, requests from one or more client devices 105, or some combination thereof. The action module 240 may, for example, instruct the social interaction module 215 to update a user's social data (e.g., calendar), coordinate with the third party service provider 120 to obtain a good and/or service, utilize snapshot information associated with the user in obtaining a good and/or service for the user, shift tasks among different entities in the life management environment 101, use conditions in one part of the life management environment 101 to set conditions in another part of the life management environment 101, automatically perform one or more actions in accordance with a user's user controls, perform some other action, or some combination thereof, based in part on snapshot information and/or actionable information associated with the user.

As noted above, in one embodiment, the action module 240 is able to use conditions in one part of the life management environment 101 to set conditions in another part of the life management environment 101. For example, the action module 240 may use geographic location information associated with a client device 105 associated with a user to set time parameters for another client device 105 (e.g., a wearable computer or wristband) associated with the user. In another embodiment, the action module 240 automatically changes one or more recommendations and/or scheduled events based on one or more health parameters associated with the user. For example, the action module 240 may recommend to the user a direct flight versus a flight with a connection if one or more health parameters of the user are below an associated threshold. In an additional embodiment, the action module 240 may notify an assistant associated with the user when the power level of the client device 105 associated with the user drops below a power threshold value.

As noted above, in one embodiment, the action module 240 is able to shift tasks between entities in the life management environment 101. For example, the action module 240 may automatically shift data among different client devices 105 associated with a user of the life management system 140. In one embodiment, the action module 240 monitors the power levels of a plurality of client devices 105 associated with a user. The action module 240 detects when the power level of one client device 105 drops below a threshold value (e.g., indicating a low battery level). The action module 240 may then automatically transfer some data items (e.g., a To Do list) from the client device 105 with the low power level to another client device 105 associated with the user that has an adequate power level. In another embodiment, the action module 240 may select an image, and push the selected image to a client device 105 based on the geographic location information associated with the client device 105. For example, sending an image of a sunrise in Seattle to a client device 105 that is located in Seattle.
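
The power-based task shifting described above could be sketched as follows. The threshold value, device record layout, and target-selection rule (best-charged device) are assumptions for illustration only.

```python
# Hypothetical sketch of task shifting: when a device's power drops below a
# threshold, movable data items (e.g. a To Do list) transfer to an
# associated device with adequate power. Fields and values are assumed.

LOW_POWER_THRESHOLD = 0.15  # assumed 15% battery threshold

def shift_tasks(devices):
    """devices: dicts with 'power' and 'data' keys, all for the same user."""
    low = [d for d in devices if d["power"] < LOW_POWER_THRESHOLD]
    ok = [d for d in devices if d["power"] >= LOW_POWER_THRESHOLD]
    for src in low:
        if not ok:
            break  # no adequately powered device to receive the data
        dst = max(ok, key=lambda d: d["power"])  # pick best-charged target
        dst["data"].extend(src["data"])
        src["data"].clear()
    return devices

wristband = {"name": "wristband", "power": 0.08, "data": ["To Do list"]}
phone = {"name": "phone", "power": 0.80, "data": []}
shift_tasks([wristband, phone])
print(phone["data"])   # the To Do list has moved to the phone
```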

In some embodiments, the action module 240 determines whether a conflict exists between locations associated with calendar entries in a calendar. A conflict occurs when there is not enough time to travel between one location associated with one calendar entry to another location associated with a different calendar entry. The action module 240 determines a route based on location information associated with the calendar entries. The action module 240 calculates optimal travel time between the locations associated with different calendar entries using the determined route, the locations of the events described in the calendar entries, the location of the client device 105 associated with the user, travel conditions, weather data, or some combination thereof. If the calculated travel time is such that the user is projected to not arrive at the estimated time, the action module notifies the user. Additionally, in some embodiments the action module updates the calculated travel time in real time. If the calculated travel time is such that the user is projected to not arrive at the estimated time, the action module may instruct the social interaction module 215 and/or the card generation module 235 to notify the user, notify other participants in the calendar event, or some combination thereof.
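
The conflict check described above could be sketched like this. The travel-time estimator is a deliberate stub (straight-line distance at an assumed average speed); a real system would use routes, traffic conditions, and weather data as the paragraph describes.

```python
# Illustrative sketch of the calendar conflict check: a conflict exists
# when the estimated travel time between two consecutive calendar entries'
# locations exceeds the gap between them. All numbers are assumptions.

def travel_minutes(loc_a, loc_b, speed_kmh=30):
    """Stub estimator: straight-line distance (km) at an assumed speed."""
    dx, dy = loc_b[0] - loc_a[0], loc_b[1] - loc_a[1]
    distance_km = (dx * dx + dy * dy) ** 0.5
    return distance_km / speed_kmh * 60

def find_conflicts(entries):
    """entries: list of (start_min, end_min, (x_km, y_km)) sorted by start."""
    conflicts = []
    for prev, nxt in zip(entries, entries[1:]):
        gap = nxt[0] - prev[1]  # free minutes between the two entries
        if travel_minutes(prev[2], nxt[2]) > gap:
            conflicts.append((prev, nxt))
    return conflicts

meetings = [
    (540, 600, (0, 0)),   # 9:00-10:00 downtown
    (610, 670, (20, 0)),  # 10:10-11:10, 20 km away: ~40 min drive, 10 min gap
]
print(len(find_conflicts(meetings)))   # 1 conflict detected
```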

The mood module 245 monitors mood information associated with various users of the life management system 140. The mood module 245 provides mood information associated with a user, in accordance with the user's user controls, to one or more entities of the life management environment 101. In some embodiments, the mood information may be associated with different moods of the user over a particular period of time (e.g., hourly, daily for the last week, etc.). In some embodiments, the mood information is automatically pushed to, e.g., one or more associated assistants (e.g., to the assistant device 125), one or more client devices 105 associated with the user, one or more client devices 105 associated with users who are connected to the user, or some combination thereof. Additionally, in some embodiments, a client device 105 associated with some other user and/or an assistant device 125 may request the mood of the user from the mood module 245. The other user and/or assistant may then make a determination on whether to interact with the user based on the mood information associated with the user.
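
The user-control gate on mood requests could be sketched as a simple allow-list check. The control structure (a per-user `share_mood_with` set) is a hypothetical assumption; the disclosure only requires that mood information be released in accordance with the user's user controls.

```python
# Hypothetical sketch: mood information is returned to a requester only if
# the user's controls permit it. The allow-list structure is assumed.

def get_mood(mood_store, user_controls, user_id, requester_id):
    """Return the user's mood if the requester is permitted, else None."""
    allowed = user_controls.get(user_id, {}).get("share_mood_with", set())
    if requester_id in allowed:
        return mood_store.get(user_id)
    return None

moods = {"alice": "stressed"}
controls = {"alice": {"share_mood_with": {"assistant_1"}}}
print(get_mood(moods, controls, "alice", "assistant_1"))  # permitted
print(get_mood(moods, controls, "alice", "stranger"))     # None: blocked
```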

Additionally, in some embodiments, a user may indicate, via their user controls, that one or more other users of the life management system 140 are trusted users. The mood module 245 then designates the one or more other users as trusted users, such that the trusted users may provide mood information to other users of the life management system 140.

FIG. 17 illustrates an example of a user interface 301 displayed by the assistant device 125 and/or client device 105 showing a detailed user snapshot associated with a user of the life management system 140 according to an embodiment. The user interface 301 includes an avatar of the user 305, a momentum gauge 311, a name of the user 315, calendar information 321, analytical graphs 325, recommendations 330, and health parameters 335. The momentum gauge 311 presents a momentum score associated with the user. The calendar information 321 displays one or more calendar entries associated with the user. In this example, the calendar entries run from left to right, such that time progresses throughout the day (i.e., May 24, 2013) from left to right. Additionally, in this example, the user has one calendar entry 340 scheduled for a particular time, and is double booked with calendar entries 345 and 350 over a different time slot. In some embodiments, the interface 301 includes information about multiple users for an assistant, such as all of the users on the team supported by an executive assistant or being managed by a health coach. The assistant can thus view summaries of each person's schedule, review each person's momentum gauge, have access to each person's current biotelemetry data and/or activity data, among other data. The assistant can use this information in assisting each user. For example, if member 1 of the assistant's team is doing poorly in terms of momentum or has mood data showing he is in a bad mood, the assistant can cancel some of member 1's meetings and schedule a yoga session. If member 2's data shows her heart rate has been elevated for a period of time, the assistant can send a suggestion regarding relaxation techniques. Additionally, alternative views exist, for example, as shown in FIG. 26, which illustrates another example of a user interface displayed by an assistant device 125 and/or client device 105 showing a detailed user snapshot associated with a user of a life management system according to an embodiment.

FIG. 18 illustrates an example of a user interface 400 displayed by the assistant device 125 and/or a client device 105 showing a basic user snapshot associated with a user of the life management system 140 according to an embodiment. The user interface 400 includes an avatar of the user 405, a momentum gauge 411, a name of the user 415, calendar information 421, and emails 425.

FIG. 19 illustrates an example of a user interface 500 displayed by a client device 105 showing a portion of snapshot information associated with a user of the life management system 140 according to an embodiment. The user interface 500 includes a momentum gauge 505, general information 511, calendar entries 515, detailed views 520 of corresponding calendar entries 515, and a recommendation 525. In some embodiments, a user interface may also include mood information as shown in FIG. 27.

FIG. 20 illustrates another example of a user interface 600 displayed by a client device 105 showing portions of snapshot information associated with the user of a life management system 140 according to an embodiment. The user interface 600 includes a health parameter section 605, analytical graph section 610, a date section 615, a health parameters details section 620, a sleep information section 625, and a time aggregate control 630. In this example, the health parameter section 605 displays a plurality of health parameter identifiers 635a-635d, their associated scores 640a-640d, and graphical indications 645a-645d of the scores 640a-640d. In this example, the analytical graph section 610 includes graphs 650a-650d, where graph 650a is associated with the health parameter identifier 635a, graph 650b is associated with the health parameter identifier 635b, graph 650c is associated with the health parameter identifier 635c, and graph 650d is associated with the health parameter identifier 635d. The health parameters details section 620 includes additional information about the displayed health parameters. The time aggregate control 630 allows a user to select what time interval is used to calculate the displayed snapshot information. For example, in this example a day is selected, and the associated snapshot information is taken from a single day (i.e., May 29, 2013). If the user were to select some other time interval, e.g., a week, the displayed snapshot information would cover a week's time (e.g., the last seven days—or some other seven day period selected by the user).

FIG. 21 illustrates an example of a user interface 700 displayed by a client device 105 showing basic snapshot information in a calendar associated with a user of the life management system 140 according to an embodiment. The user interface 700 includes a plurality of general calendar entries 705, a view control 710, and a time indicator 715. The user interface 700 may receive a selection from the user to expand one or more general calendar entries 705. Additionally, the user interface 700 may receive a selection from the user to view a larger or smaller section (e.g., a single day, a week, a month, etc.) of the calendar via the view control 710. The time indicator 715 graphically illustrates the local time (e.g., 8:30 am). In addition, the calendar entries 705 in FIG. 21 correspond with the tabs 515 shown in the FIG. 19 calendar summary view. For example, the top four tabs each labeled 515 in FIG. 19 correspond with the top four calendar entries 705 on the left-most calendar column in FIG. 21. The break in the calendar of FIG. 21 from a little before 2 pm until 3 pm is illustrated in the view of FIG. 19 as an empty spot before the fifth tab labeled 515. Thus, the user can view the interface of FIG. 19 and see in short-hand using tabs 515 what is coming up on his calendar and when he has breaks, alongside the view of his momentum gauge 505 showing his current level of momentum (bar 505 moves up or down corresponding to momentum level) and a summary of some of the next few calendar items coming up. The user can use the FIG. 21 interface to see the details of the calendar entries 705 corresponding to the tabs 515.

FIG. 22 illustrates an example of a user interface 800 displayed by a client device 105 showing an expanded general calendar entry associated with a user of the life management system 140 according to an embodiment. The user interface 800 displays the expanded general calendar entry 805. The expanded calendar entry 805 may include portions of snapshot information associated with the general calendar entry. FIG. 22 illustrates that a variety of different types of actionable information can be included in each calendar entry or invite, and this information might be provided by an assistant who has pushed this calendar entry out to the user's client device 105. The assistant can create a container or package of information that includes details like expected traffic during that time, suggestions of routes to take to drive to meeting and avoid traffic, a pick-up location for others the user may need to pick up to take to the meeting, travel data associated with the meeting (flight, hotel information), suggested restaurants near the meeting location, suggested stores at which to pick up supplies for the meeting along the way, links that provide more information associated with the meeting, among a variety of other pieces of information. In another example, the calendar entry is provided by a physician or coach of the user, and it includes a package of information about momentum, such as a list of all of the things a user should bring to his massage appointment, and suggestions of nearby places the user can pick up additional items needed. Thus, the calendar entries/invites provided can include a rich, animated data set that is actionable.

FIG. 23 illustrates an example of a card 900 displayed by a client device 105 to an associated user of the life management system 140, according to an embodiment. The card 900 includes a card identifier 905, a general recommendation 910, problem details 915, and recommendation details 920.

FIG. 24 is a flowchart illustrating the process for generating a card using snapshot information associated with a user of the life management system 140 according to one embodiment. In one embodiment, the process of FIG. 24 is performed by the life management system 140. Other entities may perform some or all of the steps of the process in other embodiments. Likewise, embodiments may include different and/or additional steps, or perform the steps in different orders.

The life management system 140 receives 1010 biotelemetry data and/or activity data from a client device 105 associated with the user. The life management system 140 generates snapshot information using the biotelemetry data, the activity data, social data associated with the user, user profile information associated with the user, or some combination thereof.

The life management system 140 generates 1010 a recommendation using portions of the snapshot information, an advertisement request, or some combination thereof. In some embodiments, the life management system 140 generates a recommendation to improve one or more health parameters associated with the user using portions of the snapshot information. For example, the life management system may determine that a user is overly stressed. The life management system 140 may then select an advertisement associated with reducing stress (e.g., for a massage, bed and breakfast, etc.). The life management system 140 then may incorporate the advertisement and a suggestion to reduce the user's stress level in a recommendation.

The life management system 140 updates 1015 the snapshot information with the one or more recommendations. If a request is received 1020 from an assistant device 125 for a portion of the snapshot information associated with the user, the life management system 140 provides 1025 the portion of snapshot information to the assistant device 125 in accordance with user controls associated with the user. The information may be presented via, for example, one or more graphical user interfaces that display portions of the received snapshot information. The assistant associated with the assistant device 125 may then analyze the snapshot information to develop actionable information. For example, the assistant may recommend an item to the user, want to purchase a good/service for the user, etc. The life management system 140 receives 1030 actionable information from the assistant device 125.

The life management system 140 updates 1035 the snapshot information using the actionable information. For example, the life management system 140 may update one or more recommendations in the snapshot information based on the actionable information (e.g., add one or more new recommendations, modify an existing recommendation, etc.).

The life management system 140 executes 1040 a recommendation associated with the snapshot information in accordance with the user controls associated with the user. In embodiments where the life management system 140 does not have a sufficient authorization level to perform the action, the life management system 140 requests approval from the client device 105 to perform the action. Similarly, in embodiments where the action was requested by the assistant and the assistant does not have a sufficient authorization level to perform the action, the life management system 140 requests approval from the client device 105 to perform the action.

The life management system 140 generates 1045 a card using the snapshot information. For example, the card may present a health parameter that has a value below a certain threshold, a recommendation on how to improve the health parameter, and an associated advertisement for a good/service that may facilitate improvement of the health parameter. In another example, the card may be a request for the user to approve a suggested change to their calendar. In another example, the card may contain information reminding the user about a scheduled calendar entry. Additionally, in some embodiments where the life management system 140 has already executed the action, the card may be a notification that the action has been executed by the life management system 140 and/or the assistant.

The life management system 140 provides 1055, for display to the user, the card to the client device 105. In embodiments where the card requests user approval of an action or recommendation described by the card, if the life management system 140 receives approval from the user, the process moves to step 1040.

FIG. 25 is a flowchart illustrating the process for generating and managing mood information associated with a user of the life management system 140 according to one embodiment. In one embodiment, the process of FIG. 25 is performed by the life management system 140. Other entities may perform some or all of the steps of the process in other embodiments. Likewise, embodiments may include different and/or additional steps, or perform the steps in different orders.

The life management system 140 receives 1110 biotelemetry data and/or activity data from a client device 105 associated with the user. The life management system 140 generates 1120 mood information using the biotelemetry data, activity data, social data associated with the user, user profile information associated with the user, or some combination thereof.

The life management system 140 provides, in accordance with user controls associated with the user, the mood information to a different client device 105 associated with another user of the life management system 140. In one embodiment, the life management system 140 may automatically push mood information associated with the user to other users of the life management system 140 in accordance with the user's user controls. In other embodiments, other users of the life management system 140 may request mood information associated with a user from the life management system 140 and/or some other trusted third party. The trusted third party and/or the life management system 140 then provide, in accordance with the user's user controls, the mood information to the requesting client device 105.
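The mood generation of step 1120 can be sketched as a simple blend of normalised signals; this is an illustrative assumption only, since the disclosure does not specify a particular formula, and the signal names, ranges and weights below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Mood:
    score: float   # 0.0 (low) .. 1.0 (high)
    label: str

def generate_mood(heart_rate: float, gsr: float, steps_today: int) -> Mood:
    # Normalise each signal to 0..1; the ranges are illustrative.
    hr_calm = max(0.0, min(1.0, (100 - heart_rate) / 40))   # 60 bpm -> 1.0
    gsr_calm = max(0.0, min(1.0, 1.0 - gsr))                # low GSR -> calm
    active = max(0.0, min(1.0, steps_today / 10_000))
    # Weighted blend of biotelemetry and activity data (weights assumed).
    score = 0.4 * hr_calm + 0.3 * gsr_calm + 0.3 * active
    label = "positive" if score >= 0.6 else "neutral" if score >= 0.4 else "stressed"
    return Mood(round(score, 2), label)
```

Social data and user profile information could enter the blend as further weighted terms; they are omitted here for brevity.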

Additional Configuration Considerations

The foregoing description of the embodiments has been presented for the purpose of illustration; it is not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible in light of the above disclosure.

Some portions of this description describe embodiments in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times, to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.

Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.

Embodiments may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer-readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.

Embodiments may also relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer-readable storage medium and may include any embodiment of a computer program product or other data combination described herein.

The text of the abstract of one priority document is hereby repeated. A life management system receives data from a client device worn by a user, the data comprising biotelemetry data and activity data collected about a user wearing the client device. The life management system generates snapshot information using information from a group consisting of: the biotelemetry data, activity data, social data associated with the user, and user profile information associated with the user. The life management system generates a recommendation using portions of the snapshot information, and updates the snapshot information with the recommendation. The life management system executes a recommendation associated with the snapshot information in accordance with the user controls associated with the user.

Calendars and Moods

FIG. 29 illustrates an overview of a data management system 1000. The system 1000 comprises data inputs 1011 from a variety of sources which relate to a user. A portion of the input data 1011 is obtained from a sensing device used by the user. The sensing device generates sensor data 1021, which is sent as biometric data relating to the user. External Information Systems (EIS) 1031 provide external data relating to the user and/or other users of the system 1000.

In one example, the data inputs are obtained from any of the following sources:

    • Sensor data 1021 can include: pulse rate, pedometer, Galvanic Skin Response (GSR), accelerometer, gyroscope, optical skin and blood vessel dilation, calories used, activity, stress, hydration, skin temperature, environment temperature, blood pressure, blood oxygen level, blood glucose level, electrocardiogram, electroencephalogram, electromyogram, respiration, ambient ultraviolet light, ambient CO2 level, and blood alcohol level.
    • External data 1041 can include calendar, email, and contact information for example from sources such as Google Apps, Microsoft Exchange, iCloud.
    • External data 1041 can include wellness and fitness data for example from sources such as Fitbit, Jawbone, Nike+, Withings, BodyMedia, MapMyRun.
    • External data 1041 can include travel data for example from sources such as TripIt, TripCase.
    • External data 1041 can include contacts and current sentiment information for example from social media sources such as Facebook, Twitter, LinkedIn, as well as general browsing activity.
    • External services 1050 can include a weather information provider, travel, accommodation and/or restaurant booking services.

In a further example, data received by the data aggregator and processor 1060, includes:

    • data from a user device, including system-wide information, such as: user phone usage; tariff plan; phone battery state; MSISDN; network name; signal strength; network type; service area; roaming state; mobile network state; IMEI; IMEI SV; MAC address (Wi-Fi); Bluetooth address; uptime; activity name (information about running applications and other user actions); network mobile country code; network mobile network code; phone model; operating system version; firmware version; kernel version; build number; software version; device locale; list of installed applications; memory information; global positioning system (GPS) last position; display manufacturer.
    • data from a user device, including system logs events, such as: camera state; screen actions (in order to determine user actions, for example: clicking on an application icon to run an application, selecting a widget on a home screen, in order to determine the most useful options that a user runs when using a phone device); alarm indications; Wi-Fi state; application crash log; camera usage; screen orientation; call start; call number; SMS sent; SMS number; E-mail accounts information (sent, received and other); and browser history (visited sites).

The input data 1011 is received and collated by a data aggregator and processor 1060, comprising various interconnected servers (as illustrated in FIGS. 32 and 33), arranged to compile and analyse the input data 1011. In part, the role of the data aggregator and processor 1060 is to generate actions, triggers or prompts based on the analysed data for users, which are communicated to users via devices 1070, mobile applications 1080 and/or browsers 1090 (via an online web portal, to which users have access).

Different individuals can access and interact with a user's data in the data management system 1000. The different individuals are grouped according to the purpose of their interaction with the user and the user's data, which allows definition of permitted interactions for such an individual participating in a particular role. In an example, the individuals (also referred to as “Roles”) with access to user's data include:

    • the user;
    • the user's executive assistant;
    • a customer relationship manager (CRM) assigned to the user by the system operator; and
    • the user's personal trainer or lifestyle coach.

The individuals, including the user as well as auxiliary users (e.g. assistant, line manager, coach, etc.), have access to the system 1000 using any combination of interfaces including (web) browsers 1090, mobile apps 1080 and dedicated devices 1070.

FIG. 30 demonstrates a flow diagram 2000 showing the overall format of the data management system 1000. In a first step, multi-source data pertaining to a variety of types of information, including External Information Systems (EIS) 1031 and sensor data (such as biometric data) 1021, is recorded. The data from various sources is subsequently communicated to the data aggregator and processor 1060 for compilation and analysis of the data. The analysed data and/or actions generated by the data aggregator and processor 1060 are output to user devices 1070, mobile apps 1080 and/or browsers 1090.

FIG. 31 further illustrates the process of data aggregation and processing by a data management system 1000. Input data 1011, from various sources, which might include calendar, email, contacts and appointments; social networks and current user sentiment; biometric data; upcoming travel; and the user's activity, wellness and fitness, is input into the system 1000 via one or more application programming interface (API) 2020. The data is aggregated and subsequently processed 2030, thereby outputting processed data 2040. The user is able to control the extent to which the processed data 2040 is output, for example the degree to which data is shared amongst other users and/or commercial bodies.

FIG. 32 shows an overview of the architecture of a data management system 1000. In this example, sensor data 1011 is obtained by a personal monitoring device 3000, such as a biotelemetric device for measuring and recording biological parameters including heart rate, blood pressure, glucose and oxygen levels. In addition, the personal monitoring device 3000 might also measure and record activity parameters, for example by using pedometers, accelerometers and/or a GPS system. The personal monitoring device 3000 is preferably a wearable device (such as the device 100 described above) that incorporates a number of sensors and further functionality.

The data obtained by the personal monitoring device 3000 is communicated, preferably wirelessly (e.g. via Bluetooth), to a network-enabled computing device 3010, such as a personal computer or mobile smartphone. The computing device 3010 relays the data, via a network (for example, the internet or a local connection), from the personal monitoring device 3000 to the data aggregator and processor 1060. The sensor data 1011 is recorded and associated with the user from whom it originated. Data from the personal monitoring device 3000 may also be communicated directly to the data aggregator and processor 1060, if the personal monitoring device 3000 incorporates suitable data communication means.

External Information Systems (EIS) data 1031 is also received and/or collated by the data aggregator and processor 1060 and recorded. The EIS user data 1031 is obtained from multiple sources, including information derived from calendars, schedules (including travel details), email, contacts, exercise platforms, social media platforms, and/or publicly available information sources (such as a traffic report provider at the user's location).

The input EIS data 1031, sensor data 1011 and/or any miscellaneous data related to the user are aggregated and processed by a data analysis module in a distributed computing network or cloud-based computing system. The output of the data analysis module is recorded and associated with the user from which the input data originated. Furthermore, the data analysis module generates actions and outputs to the user. The output data is accessible by the user and/or auxiliary users (that are associated with the user) via an application or web portal, for example via a computing device 3010. The data aggregator and processor 1060 and the user interact with one another via an interface provided by an online portal or browser, a dedicated device 3010, or software such as a mobile software application or a computer programme.

Miscellaneous user data includes for example any user-defined rules, data from auxiliary users and/or advertising targeted to the user.

Further functionality can be provided by the system. For example, an indication of low battery on a user's mobile phone is used to trigger an action for an online browser application to poll for information more often instead of the mobile phone. At the same time the system can alert auxiliary users about the low battery of the mobile phone. In another example a “to do” list stored in a mobile phone is moved to cloud-based storage if the mobile phone's battery is running low.

FIG. 33 shows a further exemplary overview of the architecture of the data management system 1000.

A system for managing a calendar in the form of a calendar tool will now be described. Based on existing calendar entries (appointments) and associated appointment parameters in a user's calendar and user data, a calendar tool performs actions to schedule a new activity for a user. The new activity scheduled by the calendar tool is associated with a time, either before, after or during an existing appointment. The calendar tool thus populates a calendar.

Existing appointments are defined by appointment parameters including a time or timespan and further appointment information. Such further appointment information includes a number of different kinds of information, such as parties to the appointment (including invitees and attendees), appointment location, nature of the appointment, and circumstances of the appointment.

User data (based on which the calendar tool performs scheduling actions in order to schedule a new activity for a user) provides further information relating to the user of the calendar. User data can include for example user biometric data, user location data, user online activity data, user mood data, and user settings selected by the user. User settings might include (but are not limited to) user preferences, travel preferences, purchase information and activity preferences.

Appointment parameters and user data can be provided by user input (or by auxiliary user input), or by input from other sources, such as EIS data 1031.

The calendar tool can be implemented in the data aggregator and processor 1060, or it can be implemented locally on a user device in a local software application. Parts of the calendar tool can be distributed between the data aggregator and processor 1060 and a user device.

The calendar tool performs an action to do with scheduling a new activity for a user in response to one or more calendar entries and commences from within a calendar application. Some examples of possible actions to do with scheduling a new activity for a user are:

    • suggesting routes to arrive at a location for an appointment, and providing maps
    • suggesting activities to enhance the user's wellbeing in time available until the next appointment
    • using an eCommerce service to purchase an airplane ticket
    • listing flower shops in the vicinity of a host's home in preparation for an invitation to dinner

A wide variety of factors can be taken into account in scheduling the activity for the user. Some examples of such factors include:

    • whether or not approval is required prior to execution of a purchase
    • who can approve which actions
    • user metrics (including historical metrics, current metrics, and predicted future metrics) such as a measure of the exercise a user has undertaken in the past five days according to sensor data
    • user activity (including historical activity, current activity, and predicted future activity) such as a recent increase in visits to a particular theatre
    • other user information, such as whether or not a user participates in a loyalty scheme for an airline, and if so what are the user's frequent flyer card details
    • information relating to the user's location, such as a weather forecast or traffic congestion information

In FIG. 34, an exemplary Graphical User Interface (GUI) shows a three-day display of a user's calendar 4000 with various existing appointments, as extracted from the EIS 1031. The data aggregator and processor 1060 identifies a time slot 4010 in the calendar which has no appointments associated and determines that there is potential for a space 4050 to be added to the user's calendar for a new activity, at a time between the end of a first appointment 4020 and the start of a second appointment 4030.

In the context of the present invention, a “space” is a period to which the new activity for a user being scheduled by the calendar tool relates. For example, this period can be the time between two appointments, or the time until a later, subsequent appointment. A space can include a definition of some or all of the factors the calendar tool takes into account to schedule a new activity for a user.

In the exemplary GUI shown in FIG. 34, a first appointment 4020 blocks out a time space for a meeting in New York from 1:15 PM until 3:15 PM, and a second appointment 4030 blocks out a time space on the next day for an exhibition in San Francisco from 3:00 PM onwards. From these two appointments, the data aggregator and processor 1060 is able to:

    • determine that between the end of a first appointment 4020 and the start of a second appointment 4030 travel from New York to San Francisco is necessary
    • define desired travel details, such as by air, direct, departing after 4:30 PM New York local time and arriving before 2:00 PM San Francisco local time on the next day
    • obtain a selection of suitable flights
    • select the best flight (for example depending on price, or by user selection)
    • book tickets (automatically and without further user engagement)
    • complete check-in for the user prior to the flight (automatically and without further user engagement).

The data aggregator and processor 1060 collates information including length of the time-span between the first appointment 4020 and second appointment 4030, the current user location and relative location of the first appointment 4020 and second appointment 4030, as well as biometric data. The aggregated information is processed by the data aggregator and processor 1060 and a suitable space 4050 for the new travel activity is determined and output to the user's calendar 4000. In an example (not shown) a list of suitable spaces is provided for user selection in order to take user preference into account.
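The identification of a space 4050 between two appointments can be sketched as follows. This is a minimal, purely illustrative sketch: the tuple representation of appointments and the minimum-duration threshold are assumptions, not part of the disclosure.

```python
from datetime import datetime, timedelta

def find_spaces(appointments, min_duration=timedelta(hours=2)):
    """appointments: list of (start, end) datetime pairs, sorted by start.

    Returns the gaps between consecutive appointments that are long
    enough to host a new activity (cf. time slot 4010 / space 4050)."""
    spaces = []
    for (_, end_first), (start_next, _) in zip(appointments, appointments[1:]):
        if start_next - end_first >= min_duration:
            spaces.append((end_first, start_next))
    return spaces
```

Applied to the example above, the gap between the end of the New York meeting and the start of the San Francisco exhibition would be returned as a candidate space for the travel activity.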

A space can depend on and include information such as the requesting party's identity, a time span, a magnitude, a model type, a price range, brand(s), a description, a material, selection criteria, etc. For example, a space may be defined by:

    • a calendar start-stop time, flight SFO-JFK, lowest price.
    • now until an appointment for playing basketball; Nike shorts, white, L, men's
    • now, blood pressure high, location NYC

An action to be performed by the calendar tool in order to schedule a new activity for a user can be defined in the absence of a space (for example for template scheduling actions or default scheduling actions). An action to be performed by the calendar tool can be defined specific to a particular space and/or as part of a calendar entry. Actions to be performed by the calendar tool can be defined before the space is defined, in parallel with the definition of the space, or after the space is created. An action may be defined as part of a particular space, or external to a particular space. The action may be automatic, automatic based on parameters or manual.

FIG. 35 shows, further to FIG. 34, the scheduling of a new activity for a user based on a determined space. In the example shown, the data aggregator and processor 1060 determines that a space 4050 is available and an action can be initiated. In the illustrated example, the new user activity is travel, and the scheduling actions relate to the necessary travel arrangements.

The data aggregator and processor 1060 determines that, given the timespan of the events either side of the space, their geographic separation (New York and San Francisco) and/or the GPS location of the user, air travel is needed between the two events. The data aggregator and processor 1060 identifies travel arrangement information 4060 in order to determine suitable flights for the space 4050.

Once a selection of alternative travel arrangements is found, an optimum is selected, for example based on biometric parameters. In one example, high stress levels observed in the user are used to select a travel alternative for maximum convenience instead of lowest cost.
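The stress-dependent selection of a travel alternative can be sketched as a weighted utility, with the weight shifted towards convenience as observed stress rises. The field names, scales and weighting rule below are illustrative assumptions only.

```python
def select_flight(flights, stress_level):
    """flights: list of dicts with 'price' and 'convenience' (0..1).
    stress_level: 0..1; high stress favours convenience over lowest cost."""
    w_convenience = 0.3 + 0.6 * stress_level   # 0.3 when calm, 0.9 when stressed
    w_price = 1.0 - w_convenience
    max_price = max(f["price"] for f in flights)

    def utility(flight):
        cheapness = 1.0 - flight["price"] / max_price
        return w_convenience * flight["convenience"] + w_price * cheapness

    return max(flights, key=utility)
```

Under this sketch, a calm user is routed to the cheapest acceptable option, while a user showing high stress levels is routed to the most convenient one.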

The data aggregator and processor 1060 determines whether the calendar tool takes further actions. For example, in a determination step 4070, the flight can be scheduled in the user calendar and various scheduling actions, such as booking, check-in and notification can be performed automatically and without further user engagement.

The identification of travel arrangement information 4060 and the step 4070 of determining further scheduling actions is performed subject to the user's rules 4080 for handling automatic processes. Such exemplary rules include:

    • Book features of flight depending on the user's biometrics, e.g. if tired or stressed book business class.
    • Commercial rules, such as upper and lower limits on cost or selecting lowest cost flights.
    • Preference of flight times—if available, select early morning flights as opposed to late evening flights.
    • Automatically book flights; otherwise, a user manually selects preferences and rules and/or approves the flight booking.
    • Identify auxiliary users with permission to approve flight bookings, or with permission to select a wellbeing promoting activity.
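The rules 4080 above can be sketched as a small decision function. This is a hypothetical illustration; the dictionary keys, thresholds and decision strings are assumptions introduced for the sketch.

```python
def apply_booking_rules(flight, user):
    """flight: dict with 'price' and 'departure_hour'.
    user: dict of user rules/biometrics; returns a list of decisions."""
    # Commercial rule: an upper cost limit forces manual approval.
    if flight["price"] > user.get("max_price", float("inf")):
        return ["require manual approval"]
    decisions = []
    # Biometric rule: book business class if the user is tired or stressed.
    if user.get("stress", 0.0) > 0.7 or user.get("tired", False):
        decisions.append("book business class")
    # Time preference: favour early-morning departures where available.
    if flight["departure_hour"] < 9:
        decisions.append("auto-book early flight")
    else:
        decisions.append("auto-book")
    return decisions
```

A fuller implementation would also consult the permissions of auxiliary users identified to approve bookings.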

Scheduling actions are performed by the calendar tool using various External Information Systems (EIS) information and autofill functions. In one example, frequent flyer card details are automatically inserted into the flight booking process.

The data aggregator and processor 1060 can output a notification 4094 to the user and other auxiliary users, such as the user's executive assistant (EA). In another example, a user's personal trainer (PT) receives information about the time spent flying, and the user's partner receives notification that dinner in the vicinity of New York is not possible when the user is scheduled to fly to San Francisco.

In the example shown in FIG. 35, the data aggregator and processor 1060 automatically schedules a flight in the space from 8:00 AM to 3:00 PM, blocks out that space in the calendar 4000, and makes the flight booking.

A new activity that is scheduled by the calendar tool can have different states, such as high priority and low priority. The state of an activity can change as time progresses. In one example a user activity is to exercise, and the associated calendar tool action is to schedule an exercise. This action is inactive while the user's blood pressure is low, active when the blood pressure becomes high, and urgently active when the blood pressure becomes very high, in which case an additional calendar tool action is to generate an alert to the user's coach that the user's blood pressure is high.
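The blood-pressure example above can be sketched as a small state function; the systolic thresholds (in mmHg) are illustrative assumptions, not values taken from the disclosure.

```python
def exercise_action_state(systolic_bp):
    """Map a systolic blood pressure reading to the state of the
    'schedule an exercise' action and its associated calendar tool actions."""
    if systolic_bp < 130:
        return ("inactive", [])
    if systolic_bp < 160:
        return ("active", ["schedule exercise"])
    # Very high blood pressure: also alert the user's coach.
    return ("urgently active", ["schedule exercise", "alert coach"])
```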

A space can be ended by an action being performed by the calendar tool. For example, an automated on-line purchase being executed ends a space for making a purchase prior to an appointment.

In another example, the user has time in his or her calendar before the next appointment, and the calendar tool determines that the user has not taken many steps that day; the calendar tool identifies a park near to the GPS location of the user's device, suggests a walk in the park to the user and provides directions to the park. In another example, the user's surrounding temperature is detected to be high, in response to which the calendar tool suggests places where cold drinks are available.

Recommendations or suggestions are generated (and/or appointments are made) and provided by the system to the user according to a propensity factor that quantifies the benefit to the user from engaging or partaking in a possible activity. The propensity factor is based on an estimate of a user's desire to engage in the possible activity, and can take subjective factors (e.g. likeliness of enjoyment, estimate of benefit to overall wellbeing) as well as objective factors (e.g. travel time, cost information) into account. Subjective factors are specific to that particular user, and may be different for different users. Subjective factors are indicative of the user's individual attitudes and preferences. Conversely objective factors are not specific to or dependent on that particular user, and in particular are not dependent on the user's individual attitudes and preferences.

Subjective factors may be estimated based on past user behaviour and activity decisions (e.g. user previously chose jogging rather than swimming), or based on user input (e.g. user prefers giving quirky gifts over conventional gifts), or based on analysis of user data associated with an activity (e.g. reduced stress indicator levels in 3-day period following a ski break). Parameters taken into account to estimate a propensity factor can include user data such as biometric data, location data, online activity data and mood data.

Examples of parameters taken into account to estimate a propensity factor include:

    • calendar appointment density
    • user blood pressure
    • recent snowfall in a favoured ski resort
    • travel distance required for the possible activity
    • duration since last occurrence of similar activity
    • user activity history
    • other parameters

Many other parameters can be taken into account to estimate a propensity factor.
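The blending of subjective and objective factors into a propensity factor might be sketched as follows. The disclosure does not specify a formula, so the weighted-mean form, the factor names and the default weight are assumptions made for illustration.

```python
def propensity(subjective, objective, w_subjective=0.6):
    """Each argument maps a factor name to a score in 0..1, where higher
    is more favourable to the activity. Returns a propensity in 0..1."""
    def mean(scores):
        return sum(scores.values()) / len(scores) if scores else 0.5
    # Blend user-specific (subjective) and user-independent (objective) factors.
    return w_subjective * mean(subjective) + (1 - w_subjective) * mean(objective)
```

In the skiing example, strong subjective factors (enjoyment of skiing, recent snowfall at a favoured resort) can outweigh middling objective factors (travel time, cost), yielding a high overall propensity.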

By blending objective and subjective factors a more meaningful propensity for an activity can be determined than if an objective factor alone, for example physical distance or cost, is considered. For example, excellent weather conditions can outweigh the inconvenience of travelling to a ski resort for a day for a user that particularly enjoys skiing. In another example attendance at a comedy event the day after attendance at a funeral might be appropriate for a user with a propensity for distraction, but might be inappropriate for a user with a propensity for thoughtfulness.

In another example the system determines a propensity to invest effort into purchasing a gift prior to an invitation to dinner at a friend's house. Subjective factors that may affect the propensity can for example include: a level of effort the user previously invested on a similar occasion; the amount of time the user spent preparing for the appointment; whether the user cancelled a previous appointment; or whether a previous appointment was cancelled by the other party. Objective factors that may affect the propensity can for example include: physical distance, traffic conditions, and whether a particular retailer is a luxury or high-end retailer, and hence more expensive. Depending on the determined propensity, the suggested action may be to purchase a gift at a convenience store on the way, or may be to make a detour to a department store to purchase a gift.

For a given available timeslot, the benefit of an activity to the user can thereby be optimised with a high degree of sophistication. As a result of subjective factors being taken into account, the activity can be truly tailored to a specific user, and hence provide greater benefit to that user.

The system can also provide alerts for recommendations or suggestions according to user activity settings. For example, if a user defines an activity-free ‘bed time’ period as 10 pm to 8 am, then no alerts are generated during that period. The system can determine if the user is in an activity-free period in dependence on biometric data, for example by using motion, breathing and pulse data to determine if a user is still asleep (and not available for activities) or awake (and so available for activities) in the morning. Other activity-free periods may be defined, such as a ‘work time’ (which the system may determine if the user is in a particular location defined as his work place) or a ‘break’ where no activities are desired for a day.
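The activity-free period check, including the wrap past midnight for a 10 pm to 8 am ‘bed time’, can be sketched as below; treating sleep state as a simple boolean override of the clock is an illustrative simplification.

```python
from datetime import time

def in_activity_free_period(now, start=time(22, 0), end=time(8, 0)):
    """True if the clock time 'now' falls in the user-defined quiet period."""
    if start <= end:
        return start <= now < end
    return now >= start or now < end   # period spans midnight

def should_alert(now, asleep):
    # Biometric data (motion, breathing, pulse) can override the clock:
    # no alerts while the user is determined to be still asleep.
    return not in_activity_free_period(now) and not asleep
```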

The system can also update suggested activities as time progresses, for example suggesting a jog in a favoured but distant park at first and, if the user's location has not changed to the suggested park after half an hour, suggesting a jog in the local neighbourhood park instead.

FIG. 36 shows an exemplary graphical user interface (GUI) illustrating a user's calendar output 4000 generated by the data aggregator and processor 1060. A travel activity 5010 to Tokyo is scheduled into the calendar 4000. The calendar 4000 accounts for the travel time, time zone, daylight saving and crossing of the International Date Line when computing the time span of the scheduled travel activity 5010. Existing appointments following the travel activity 5010 are listed according to the local time of the user. In one example, the data aggregator and processor 1060 is arranged to adapt the calendar to local time according to the user's location as derived from a GPS instrument integrated in a user device.
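The time-span computation behind the travel activity 5010 reduces to arithmetic on timezone-aware datetimes, which handles time zone offsets and the International Date Line automatically. The departure city, flight duration and fixed UTC offsets below are illustrative assumptions; a real implementation would use an IANA time zone database so that daylight saving is also covered.

```python
from datetime import datetime, timedelta, timezone

PDT = timezone(timedelta(hours=-7))   # illustrative: US Pacific, summer offset
JST = timezone(timedelta(hours=9))    # Tokyo

def arrival_local(depart, flight_time, dest_tz):
    """Return the arrival time expressed in the destination's local time."""
    return (depart + flight_time).astimezone(dest_tz)

# Hypothetical example: depart 4 Sep, 5:00 PM local; 11-hour flight to Tokyo.
depart = datetime(2014, 9, 4, 17, 0, tzinfo=PDT)
arrive = arrival_local(depart, timedelta(hours=11), JST)
```

The traveller departs on the 4th and lands the evening of the 5th local time, so the calendar entry spans the date line correctly without any special-case logic.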

A further exemplary graphical user interface (GUI) 4060 is also illustrated in FIG. 36, in which possible flight options are identified, as determined by the data aggregator and processor 1060, according to the time constraints of the travel activity 5010 being scheduled. A user can, if desired, book flights via this GUI 4060.

FIG. 37 shows an exemplary process executed by the data aggregator and processor 1060 to manage events and teams and/or groups in order to improve a user's out-of-the-box experience. In a first step 6010, a user creates a new event and selects a group of participants for the event. The data aggregator and processor 1060 determines whether there is an overlapping event previously created by one of the selected participants 6030, by searching the user data of the selected participants 6020.

If the data aggregator and processor 1060 determines 6030 that there is no overlapping event, the event is created and the selected participants are notified 6040. If an overlapping event is identified by the data aggregator and processor 1060, then the user creating the new event is notified that there is a pre-existing overlapping event and a query is generated as to whether the user would like to join the team associated with the pre-existing event 6050. If the user declines, the new event is created 6040. Conversely, if the user wishes to join the pre-existing event, the user is added to the team associated with the pre-existing event and the team is added to a list of teams and/or groups associated with the user 6060, thereby allowing the user to access calendar events associated with the team in their own calendar 6070. The team members are also added to the user's list of contacts 6080.

Overlapping events are identified by the data aggregator and processor 1060 in dependence on the temporal and/or geographic coincidence of two or more events. In addition, two or more events may be deemed to be overlapping due to the nature of the events, such as the intended activity. In one example, two events are determined to overlap if a baseball match is the intended activity of both events.
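The overlap determination 6030 can be sketched as a predicate combining temporal coincidence with place or activity; the event dictionary fields are illustrative assumptions.

```python
def events_overlap(a, b, require_same_activity=False):
    """a, b: dicts with comparable 'start'/'end' and optional
    'location'/'activity' fields. Events overlap if they coincide in
    time and share either a location or an intended activity."""
    time_overlap = a["start"] < b["end"] and b["start"] < a["end"]
    same_place = a.get("location") == b.get("location")
    same_activity = a.get("activity") == b.get("activity")
    if require_same_activity:
        return time_overlap and same_activity
    return time_overlap and (same_place or same_activity)
```

Under this sketch, two events whose intended activity is the same baseball match are deemed overlapping even if created by different participants.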

FIG. 38 shows a flow diagram of the process used to integrate contacts, teams and events in order to rapidly grow a new user's list of contacts, teams and events. In a further example, the data aggregator and processor 1060 is arranged to integrate new users with pre-existing teams and/or groups. A user creates and adds a new contact 7010, for example via an EIS interfaced system. By searching user data associated with the new contact 7020, the data aggregator and processor 1060 proceeds to identify any pre-existing teams and/or groups common to both the user and their new contact 7030. If such teams and/or groups are identified, then the data aggregator and processor 1060 adds the new user to the teams and/or groups 7040 and adds the events for these teams and/or groups to the user's calendar 7050. The members of the teams and/or groups with which the user is now associated are added as new contacts for the user 7060, and the process repeats for the new contacts. In this viral manner the data aggregator and processor 1060 quickly assembles the teams, contacts and events for users, and in particular for new users.

Conversely, no further action is taken if the data aggregator and processor 1060 determines that the user is not associated with any teams 7070.
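The FIG. 38 loop above can be sketched as a breadth-first traversal; the data shapes (a per-contact directory of teams, and sets for contacts, teams and events) are assumptions for illustration:

```python
def integrate_contact(user, contact_name, directory):
    """Breadth-first 'viral' growth: each new contact may bring teams,
    those teams bring events and members, and each newly discovered
    member is processed in turn (cf. FIG. 38, steps 7010-7060)."""
    queue = [contact_name]
    while queue:
        contact = queue.pop(0)
        user["contacts"].add(contact)
        for team in directory.get(contact, []):
            if team["name"] in user["teams"]:
                continue                            # already a member
            user["teams"].add(team["name"])         # join team (step 7040)
            user["events"].update(team["events"])   # add its events (step 7050)
            for member in team["members"]:          # members become contacts (7060)
                if member != user["name"] and member not in user["contacts"]:
                    queue.append(member)
    return user
```

The visited checks (team already joined, member already a contact) are what keep the viral expansion from looping forever.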

In order to facilitate organisation and management of events with a number of participants, the data management system 1000 can provide cost splitting functionality. To do so, cost sharing information is associated with the appointment for the event in question. An event may, in this context, include a group purchase, for example of a gift. Depending on the nature of the event, cost sharing information can include, for example:

    • Total cost
    • Maximum number of participants (attendees)
    • Fixed number of participants
    • Fixed cost per participant
    • Variable cost per participant dependent on number of participants
    • Variable cost per participant freely selectable by each individual participant
    • Variable cost per participant selectable by each individual participant between an upper limit and a lower limit
    • Payment means
    • Payment deadline
    • Late payment consequence
    • Payment failure provisions
    • Minimum event commitment (e.g. minimum payment requirement, minimum number of participants)
    • Oversubscription provisions
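One possible way to associate such cost sharing information with an appointment is as a structured record; all field names below are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CostSharingInfo:
    """Hypothetical encoding of the cost sharing information listed above."""
    total_cost: Optional[float] = None
    max_participants: Optional[int] = None
    fixed_cost_per_participant: Optional[float] = None
    contribution_min: Optional[float] = None      # lower limit for free choice
    contribution_max: Optional[float] = None      # upper limit for free choice
    payment_means: tuple = ("PayPal", "credit card", "debit card", "cash")
    payment_deadline: Optional[str] = None        # e.g. "7 January"
    min_commitment: Optional[int] = None          # minimum number of participants
    late_payment_surcharge: float = 0.0

    def cost_per_participant(self, n: int) -> float:
        # Fixed cost per participant if set; otherwise variable cost
        # dependent on the number of participants.
        if self.fixed_cost_per_participant is not None:
            return self.fixed_cost_per_participant
        return self.total_cost / n
```

With the FIG. 39 chalet figures (total £200), the estimated share is £20 at full occupancy of 10 and £40 at the minimum commitment of 5.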

FIG. 39 shows an exemplary process executed by the data aggregator and processor 1060 to schedule/manage events and teams and/or groups that are sharing the costs associated with an event. The event in this example is a ski break with a chalet rented for a group of participants. When the appointment for the event is created, the event organiser creating the appointment specifies that the event will cost £200 in total (for renting a chalet), and the estimated cost per participant is £20 (for occupancy of 10 in the chalet). A payment date of 7 January, by which an invitee must pay in order to participate, is set in the calendar. Payment options are specified to allow payment by PayPal, credit card, debit card, or cash payment. The minimum commitment for the event to take place is specified to be 5 participants (in which case the cost per participant is £40).

The event organiser invites a number of invitees that are potential event participants. The invitees that accept the invitation to the event complete the payment process and become committed participants. The payment process can be embedded in the calendar interface, or it can be performed in a linked external payment interface. In an alternative, the invitee can accept the invitation and defer payment. The event may have a payment window associated, and/or a payment deadline by which payments are to be made by the participants. In another example where the event is the group purchase of a gift, the participants may choose their payment contribution freely, or between a set upper and lower limit.

The appointment may disclose information regarding the different invitees and participants to all other invitees and participants, or only to the event organiser, or it may be fully anonymous with only the data management system 1000 collecting the group information. For example, for the purchase of a joint birthday gift the participants' contributions may be disclosed to all invitees, or only to the event organiser, or only the sum of the contributions may be disclosed to the event organiser. In another example, for the rental of a chalet the appointment can list participants that have committed to the event as well as invitees to the event, or the appointment can omit this information until after an invitee has committed to the event, or this information can be revealed to the organiser only.

If there is insufficient commitment to the event as time progresses, for example if for a joint gift costing £100 only £50 has been committed, or if for the chalet with minimum occupancy of 5 only 4 participants have committed, this may be indicated in the calendar entry (appointment) representing the event, for example by its colour.

At specific times (e.g. a month before the event is scheduled) the data management system 1000 can provide reminders to invitees regarding the event, reminders to participants regarding under- or overpayment, and/or reminders to the event organiser regarding the event (e.g. minimum commitment not yet reached). As the pay by date approaches, additional reminders may be sent, for example to notify the event organiser of insufficient funds and/or to remind invitees that have accepted the invitation but not yet made a payment to do so.

In one example, the participants enter their payment details when they complete the payment process (or optionally pay a deposit), but payment (or optionally full payment) is only taken once the minimum commitment level is reached. For example for the rental of a chalet, payment is only taken on the payment deadline if the minimum commitment level of 5 participants is reached, and the total cost is divided between the number of participants (e.g. for 6 participants each pays £33.33). In this case, once the payment is taken, receipts are sent to the participants.
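The conditional capture and equal split described above can be sketched as follows (function name and data shapes are assumptions):

```python
def settle_event(total_cost, participants, min_commitment):
    """Funds are captured only if the minimum commitment level is reached
    on the payment deadline; the total cost is then divided equally
    between the actual participants."""
    if len(participants) < min_commitment:
        return None        # no payment taken; the organiser may be notified
    share = round(total_cost / len(participants), 2)
    return {name: share for name in participants}
```

This reproduces the chalet example: with 6 of a possible 10 places committed and a minimum of 5, the £200 cost is split into shares of £33.33 each.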

In one example, the event organiser can determine provisions in the case of a payment occurring late or failing for a particular would-be participant. For example the data management system 1000 can automatically provide a notification to the would-be participant, and the event organiser can be informed. A deposit may be retained and the would-be participant may be considered as non-participating. A surcharge may be added for late payment.

The data management system 1000 can automatically place a transaction, for example in order to book the chalet with the chalet owner with the funds from the participants' payments. In another example a case of wine as a joint gift is ordered with the funds from the participants' payments. Notification may be provided to the event organiser, and/or to the participants, that the booking has been made, and/or reminders may be provided to invitees that have not accepted the invitation.

In one example, the event organiser can determine provisions in the case of oversubscription to the event. For example, if £150 for a joint gift is raised by participants instead of the minimum of £100 for a case of wine, then an additional bottle of cognac may be purchased, or a more expensive case of wine may be purchased, depending on the provisions determined by the event organiser. In another example, if 18 participants commit to (e.g. accepted and paid for) the rental of a chalet with maximum occupancy of 10, then a second chalet is booked, and the cost for the two chalets is divided between the participants. In another example, would-be participants are allocated a place in the event on a priority basis. A number of factors can be taken into account to determine priority, for example the time when an invitee committed to the event, whether a would-be participant has paid yet or not, an organiser-defined invitee priority level, and/or a random factor for a lottery of available places. Data relating to the would-be participant may be taken into account, for example biometric data; location data; online activity data; and/or mood data. For example, somebody may be prioritised for a football tournament if they have a status of ‘top player’.
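The priority-based allocation described above might be scored along these lines; the particular weights and factors are assumptions for illustration:

```python
import random

def allocate_places(invitees, capacity):
    """Hypothetical priority scoring for an oversubscribed event:
    an organiser-defined priority level, completed payment and earlier
    commitment all raise priority, with a random component acting as
    a lottery between otherwise equal invitees."""
    def score(inv):
        return (inv.get("organiser_priority", 0) * 100   # organiser-defined level
                + (50 if inv.get("paid") else 0)         # payment completed
                - inv["commit_order"]                    # earlier commitment wins
                + random.random())                       # lottery tie-breaker
    ranked = sorted(invitees, key=score, reverse=True)
    return ranked[:capacity], ranked[capacity:]          # (placed, waitlist)
```

Because the random term is less than 1, it only decides between invitees whose other factors are exactly equal.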

Refund of payment can be provided to participants that receive a cancellation and do not receive a place at the event.

FIG. 40 shows an exemplary output from the data aggregator and processor 1060 as viewed via a web portal. The output shows upcoming calendar events (e.g. schedule, transport, agenda and reminder), biometric data (e.g. heart rate, GSR, hydration, output) and EIS data (e.g. blocks providing information). An action or alert may be generated by the data aggregator and processor 1060 advising the user to leave their current appointment soon on the basis of external service data relating to slow traffic conditions.

The data aggregator and processor 1060 is arranged to control multiple devices in the system 1000. In one example, a user, with a smart phone having installed on it a set of applications, such as Golf, Running, Cycling, Sailing and Tennis, pins an app to the home screen of the smart phone interface. The decision to pin an app to the home screen on the mobile phone causes the same app to be pinned to the home screen, or otherwise be made more prominent, on the user's other devices, such as a laptop or tablet device. In a further example, the user accesses a set of apps on a remote mobile device, such as a tablet, and chooses to use only some of this set of apps. On another device, such as the laptop, the icons shown to the user are selected to be the same as those used on the mobile device.

In another example, a GPS measurement tool in a user device drives the choice of image shown in a browser, a software application, or a device. For example, if the GPS determines the user's location to be in Paris, and the device time is near sunset, then an image of the sun setting behind the Eiffel Tower is shown in the device display.

Generally, embodiments of the invention may include one or more of the following features:

    • A system as described (“Life Management”) that collates data related to a Role(s)
    • A system that allows Role(s) to manage Role(s) collated data
    • A system that allows the delegation of rights from a Role (e.g. Executive(s)) to a Role(s) (e.g. Executive Assistant(s)) for managing the Role(s) collated data
    • A system that allows a Role (e.g. Executive Assistant) to create Space and Actions for Roles(s) (e.g. self and/or User).
    • A system that allows a Role (e.g. Executive Assistant) to set Space and Action parameters for Role(s) (e.g. self and/or User).
      • For example, blocking out the time from 1 pm to 7 pm for air travel
    • A system that uses data related to Role(s) to set Space and Action parameters for Role(s).
      • For example, using calendar and/or GPS locations to conclude that flights are needed, and blocking out the time from 1 pm to 7 pm for air travel
    • A system that allows a Role to manually trigger steps in a process using Space and Actions (e.g. approve a flight booking).
      • For example, approving a flight that has been selected by a Role
    • A system that automatically triggers steps in a process using Space and Actions (e.g. approve a flight booking).
      • For example, making a purchase that meets the criteria automatically
      • For example, automatically inserting frequent flyer card details into the booking
    • A system that uses parameters in one part of the system to set parameters in another part of the system
      • For example, a GPS location collected from a phone can be used to set User(s) time parameters online and in an online watch
      • For example, high stress levels observed in an online watch can be used to change travel options for maximum convenience instead of lowest cost
      • For example, low battery indication in a phone can be used to set an online watch to poll for information more often, and alert fellow Role(s) about the phone's battery being low
    • A system that uses parameters in one part of the system to shift tasks to another part of the system
      • For example, a TO DO list may reside in a phone, and be moved to an online watch if the phone's battery is running low
      • A GPS measurement in a phone can drive the choice of image shown online (sunrise in Seattle)
    • Configuring the user interface on one device based on a combination of other device selections and external parameters.

Moods

In some embodiments, whether in combination with other features described herein or provided separately, there are presented apparatus for and methods of providing user state or mood related feedback to a user, more particularly to determining a state or “mood” of a user and in dependence on that state affecting one or more elements external to the user. This may allow a user's mood to drive how the world interacts with the user, in a basic example adapting all user interfaces (UI) in the user's life to the user's mood, and changing the colours of LEDs on a user's device in response to a mood.

User interfaces may be any user interface, including but not limited to any combination of an app, browsed page, smart watch, smart phone, computer screen, multi-room entertainment system, in-store media surroundings, night club bass-smoke dance entertainment system, in-car entertainment system or other user interfaces. A user may be an individual user or a group of users.

The invention may, for example, be used for such purposes as:

    • to (passively) feed back mood information directly to the user, thereby informing the user of a mood they may not be aware they are experiencing, such as through a choice of audio/visual entertainment;
    • to (actively) seek to alter one or more aspects of the user's present or future behaviour and/or environment in response to the user mood, thereby acting to accommodate or change the mood of the user;
    • to entertain a group of users based on another group's moods;
    • to entertain a group of users based on the group's future moods; and
    • to inform other parties of the mood of the user, including in some embodiments to allow the exchange between (preferably trusted) parties of information regarding the mood of the user.

This may in some instances be viewed as providing “Mood-as-a-Service”.

Generally, moods can be determined or calculated based on biometric data, such as Galvanic Skin Response (GSR), touch, taste, smell, movement from accelerometer(s) and gyro(s), optical skin and blood vessel dilation measurements, and responses to mechanical or electrical stimuli such as vibrations, electric shocks, “Vibras” and “eVibras”, as well as age, weight, height, fitness level, body-mass index, appearance rating, demographic data, or combinations thereof.

Moods may also be determined from other user data, such as calendar activity, email, app usage, social media and, ideally, other online activity data including but not limited to searches based on all names of individuals for a group of users, media setting preferences (such as loud sound), playlists and bright colours, location, proximity to places, systems and people, time of day, week and year, status of day such as holiday, and affiliations to external groups such as the Marilyn Monroe Fan Club. For example, the level of calendar population (densely populated or sparsely populated with appointments, and also how early and how late in a day appointments extend) can give an indication of a stress level.
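As a sketch, the calendar-density stress indication mentioned above could be computed as follows; the normalisation and the weighting of out-of-hours appointments are assumptions:

```python
def stress_level(appointments, day_start=8, day_end=18):
    """Illustrative stress indicator from calendar population alone:
    density of appointments within normal hours, plus a penalty for
    how early and how late in the day appointments extend.
    Appointments are (start_hour, end_hour) pairs; result is 0..1."""
    if not appointments:
        return 0.0
    hours_booked = sum(end - start for start, end in appointments)
    density = min(hours_booked / (day_end - day_start), 1.0)
    earliest = min(start for start, _ in appointments)
    latest = max(end for _, end in appointments)
    # Hours spilling outside the normal working day add to the score.
    spill = max(day_start - earliest, 0) + max(latest - day_end, 0)
    return min(density + 0.05 * spill, 1.0)
```

A sparsely populated calendar thus yields a low score, while a densely populated one with early and late appointments saturates at 1.0.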

The mood-related aspects described herein may be provided as part of a system with architecture as shown in FIGS. 29 to 33, for example implemented in data aggregator and processor 1060.

FIG. 41 shows in overview an example of the determination of and feedback arising from a user mood. User 8010 is shown generating via various interaction channels 8020 a plurality of output data (including biometric, social media and calendar data) which are processed by data processor 1060 and a user state or mood 8030 determined. The mood 8030 is used by mood processor 8040 to determine appropriate feedback 8050 which is relayed either directly to the user 8010 or via one of the various interaction channels 8020.

Optionally, the mood 8030 is made available via a sharing module 8060 to other parties 8070. Feedback from the other parties 8070 may then be relayed to the user 8010 as above, either directly to the user 8010 or via one of the various interaction channels 8020.

A mood can also be calculated when the user consists of several individuals, in which case an algorithm can use all available data, possibly with weighting to favour individuals or groups within groups, and possibly with weighting to favour some aspects such as location or age.

Mood Availability/Sharing

User moods can be shared or otherwise made available between users on a peer-to-peer basis, through a trusted mood web or through a Trusted Third Party (“TTP”). Moods and user state each include all characteristics and data related to a user as listed above, whether historical, current, projected into the future or desired in the future. And as stated previously, a user can be a group or sub-group of individuals.

This may allow for enhanced communication between parties, as it will be known beforehand what mood the other party is in and therefore allow for the communication to be tailored appropriately. A known mood for one user can also allow tailoring of suggestions and actions, such as offering movie tickets, buying a share or making a bet, for the same user or another user.

Generally, a trusted third party is an entity which facilitates interactions between two parties who both trust the third party. Hence a trusted mood party (“TMP”) is an entity that facilitates the secure transfer of moods from one user or type of user or Role(s) to another. The TMP model therefore enables two parties to use such a trust in order to know each other's mood.

In some embodiments, this trust is based on the exchange of certificates; specifically, the TMP model allows the creation of a mood certificate (“MC”) to effectively “guarantee” a mood for a period of time and/or until conditions change. The certificate authority would act as a mood certificate authority (“MCA”) and issue a digital mood certificate to one of the two parties. The MCA then becomes the Trusted Mood Party in respect of that issued certificate.

Likewise, interactions that need third-party recordation make use of a third-party repository service.

FIG. 42 shows an example of mood availability/sharing. Suppose users Alice (“A”) 9010 and Bob (“B”) 9020 wish to communicate, but only on the basis of knowing (preferably, securely) what the mood of the other party is. The TMP 9030 either knows Bob 9020 or is otherwise willing to vouch that Bob's mood 9034 (typically expressed in a mood certificate 9040) describes the person indicated in that certificate, in this case, Bob 9020.

In discussions, this third person (the TMP 9030) is often called Trent (“T”) 9030. Bob's mood 9034 and/or a mood certificate 9040 associated with Bob's mood 9034 is sent to Trent 9030. Trent 9030 gives the mood certificate 9040 to Alice 9010, who then uses it to send messages 9050 to Bob 9020 based on knowing Bob's mood 9034. Alice 9010 can trust this mood to be Bob's if she trusts Trent 9030. In such discussions, it is simply assumed that she has valid reasons to do so (of course there is the issue of Alice and Bob being able to identify Trent 9030 properly as Trent and not someone impersonating Trent).
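The certificate exchange above can be sketched as follows. For brevity this sketch uses a symmetric HMAC where a real Trusted Mood Party would use a public-key digital signature; all names and the certificate layout are assumptions:

```python
import hashlib
import hmac
import json
import time

TRENT_KEY = b"trent-secret"   # known only to the Trusted Mood Party

def issue_mood_certificate(subject, mood, valid_for_s=3600):
    """Trent vouches for a user's mood by signing it, 'guaranteeing'
    the mood for a period of time (cf. the mood certificate, MC)."""
    body = {"subject": subject, "mood": mood,
            "expires": time.time() + valid_for_s}
    payload = json.dumps(body, sort_keys=True).encode()
    sig = hmac.new(TRENT_KEY, payload, hashlib.sha256).hexdigest()
    return {"body": body, "sig": sig}

def verify_mood_certificate(cert):
    """Alice trusts the certificate if the signature checks out and
    the certificate has not expired."""
    payload = json.dumps(cert["body"], sort_keys=True).encode()
    expected = hmac.new(TRENT_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, cert["sig"])
            and cert["body"]["expires"] > time.time())
```

Tampering with the mood in the body invalidates the signature, which is what lets Alice trust the mood without trusting Bob directly.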

In some embodiments, rather than relying on dedicated mood-certificate-issuing TMPs, users may themselves control the use of mood certificates. If users or Role(s) become TMPs, we have a trusted mood web in which Role(s) digitally sign each other's mood certificates, and do so only if they are confident that the appropriate mood and the Role(s) belong together. A mood signing party is one way of combining a get-together with some certificate signing. Nonetheless, doubt and caution remain sensible, as some users may have been careless in signing others' certificates.

Generally, embodiments of the invention may include one or more of the following features:

    • Mood based user interface (UI)
      • Mood changes measured for one Role(s) causes user interface changes for the same Role(s) i.e. ensuring all users of a particular Role(s) are notified of a change of mood of one of their number
        • LEDs in a device's display change colour based on biometric indicators such as Galvanic Skin Response (GSR) measurements performed by a sensor (optionally built into the user device).
        • Notification messages, whether visual and/or audio e.g. “Check this out—she's happy today!”
      • Mood changes measured for one Role(s) causes user interface changes for other Role(s) or whole world facing website i.e. relaying mood changes to specific groups of users, making the mood information available or broadcast
        • “Check this out—the guys at Company X are happy today!”
        • “My assistant is in a really bad mood today!”
      • Mood changes measured in one system causes user interface changes (e.g. colour) in another user interface i.e. relaying mood information to a specific user
        • “Just opening the site with my browser shows me my mood.”
    • Mood Style Sheet
      • Online resource that publishes a Role(s) mood for any user interface
        • For example, to present a user interface to a user, the user's current mood is identified by calling a web service to fetch the user's mood from a Trusted Third Party Mood Service. The Trusted Third Party Mood Service supplies XML+CSS+MIDI sound setting data with recommendations based on how the user is feeling right now. Based on that data the user interface is adapted. For example, if the user is under pressure, then a user interface is created that speaks clearly, with no nonsense, to the user. This may allow a stressed user to focus on important information, whether by demoting less important information and/or by digesting and presenting the important information more simply.
        • A style sheet may include CSS or audio-visual preferences.
    • Trusted Mood Party, comprising elements such as:
      • Peer-to-peer
      • Third party architecture
      • Trusted mood web

The following are examples of uses of mood information:

    • One might initiate a phone call, and see the mood of the person being called as the person answers.
      • mood data of communication participants may be supplied before, during setup, during, and/or after communicating with the person or group of persons.
      • communication settings (e.g. volume) may be adapted automatically based on communication participant moods.
    • Mood dependent data (i.e. “mood data”) may be supplied only to users with certain moods indicated
      • Useful for targeted advertising or sales.
    • Mood-dependent data may be supplied only to approved users, or to a pre-defined selection (or level) of approved users (e.g. you and your best friends can see my mood).
    • A group of people may walk into a restaurant, causing a change in the audio-visual entertainment as a result.
    • A restaurant owner may be shown graphically the mood of his guests and a desired mood graph for his guests, and can thereby determine or set a time by which his guests should have the desired mood.
      • One or more user senses (e.g. audio-visual, smell, touch, temperature, and taste) may be stimulated or otherwise influenced to achieve a change of mood.
      • A mood may be described as a mood profile with dimensions of each parameter used.
      • A user sense (e.g. audio-visual, smell, touch, temperature, and taste) may be stimulated or otherwise influenced to achieve:
        • a mood change towards a given mood profile.
        • a mood change at a certain rate of mood change.
        • a mood change towards a given profile at a certain rate of mood change.
        • a mood change towards a given profile at multiple rates of mood change, one for each mood dimension.

Home Entertainment

The following applies, in one example, to a multi-room entertainment system where users can interact with an entertainment system in each room separately. The term “home entertainment” also encompasses entertainment in a broader sense, such as in cars, on motorcycles, and in public places such as shops, offices and train stations.

A user state or mood may also be determined from the combination of online services used by a user, for example a web browser search combined with the use of previously unused social media/interaction apps. Moods may also be determined by adding data from an entertainment profile, which may include preferences such as:

    • Media Types
      • Sound preferences
      • Visual preferences
      • Touch, smell and taste differences
    • Preferences driven by time (“only in the mornings”) and location
    • Media Settings
      • e.g. loud, bass, dark
    • Preferences driven by participants
      • e.g. only play funk when I am alone

A user will also have associated characteristics, which may include:

    • Digital footprint/status
      • Biometrics
      • Calendar items
      • Email
      • Social network activity
      • Time of day
      • Location (e.g. GPS)
      • Time of week (e.g. Sunday)
      • Time of year (e.g. holiday)
      • External conditions (e.g. Elvis Presley Revival Week)
      • 3rd party playlists
    • Derivative data, calculated by analyzing digital footprint, such as
      • Physical strength
      • Psychological strength
      • Social strength
    • Group adherence (e.g. has to be an Elvis fan)

Group moods can also drive the selection of home entertainment, wherein group moods are, ideally, the aggregate of individual moods:

    • average of all individual moods
    • weighted average of all individual moods,
      • e.g. following a given sub-group's moods when calculating moods
    • historic moods
    • projected future moods
    • desired future moods, based on current moods
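A sketch of such aggregation follows, treating each mood as a single scalar for simplicity (the disclosure also allows multi-dimensional mood profiles); the weighting scheme is an illustrative assumption:

```python
def group_mood(moods, weights=None):
    """Aggregate individual mood scores into a group mood: either a
    plain average, or a weighted average (e.g. following a given
    sub-group by assigning its members larger weights)."""
    if weights is None:
        weights = [1.0] * len(moods)     # plain average of all individuals
    total = sum(weights)
    return sum(m * w for m, w in zip(moods, weights)) / total
```

Historic, projected and desired moods could be aggregated the same way by substituting the corresponding mood series for the current scores.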

This may allow the moods of a user or user group to influence/drive how the world interacts with the user or user group(s). A simple example might be adapting music played through a home entertainment system to the user's mood, and/or changing the colours of related displays. The media provided in a room can be any media that can be sensed by the user, such as sound.

A user's presence in a room can be detected using personal equipment such as an NFC antenna in a cell phone, or personal physical attributes such as weight or appearance detected by sensors in the room. A “room” is not limited to the room in which a user or user group is physically present; it may, for example, be an adjacent room or a remote room.

Ideally, a system will automatically detect that a user is present in a room, and adjust the media provided accordingly. A user's entertainment profile and/or characteristics could be used to select automatically the media provided in a room.

For a group of users, a weighting might be assigned to each user's entertainment profile and/or characteristics when selecting media provided in a room containing the group of users. For example, the entertainment profile and/or characteristics of the “host” user might be assigned a weighting of 30%, with the entertainment profiles and/or characteristics of the remaining users in the group having a collective weighting of 70%. A suitable algorithm can be used to adjust the weightings and aggregate the entertainment profiles and/or characteristics of the individual users.

Furthermore, a user's entertainment profile might be adjusted to complement another user's entertainment profile and/or characteristics. Ideally, the entertainment profile of another user can be used to drive the user's entertainment profile, such that media is selected as if that other user were present, even if they are not.

Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the disclosure be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the disclosure, which is set forth in the following claims.

It will be understood that the present invention has been described above purely by way of example, and modifications of detail can be made within the scope of the invention.

Each feature disclosed in the description, and (where appropriate) the claims and drawings may be provided independently or in any appropriate combination.

Reference numerals appearing in the claims are by way of illustration only and shall have no limiting effect on the scope of the claims.

Claims

1-105. (canceled)

106. An intelligent wristband system comprising:

a wearable wristband configured to be worn by a user;
a control unit within the wristband; and
a sensor configured to detect at least one gesture made by the user, the at least one gesture indicating instructions to be performed by the control unit, the control unit configured to translate the at least one gesture into a specific command for an action to occur within the wristband system.

107. The wristband system of claim 106 wherein each different gesture corresponds to a different specific command.

108. The wristband system of claim 106 wherein the sensitivity of the gesture detection sensor is customizable by the user to recognize the at least one gesture.

109. The wristband system of claim 106, wherein the gesture is selected from the group consisting of: (a) a user lifting a wrist of the user and rolling the wrist, (b) a user tapping the display screen, (c) a user making a swiping motion across the display screen, (d) a user tapping the display screen with two fingers, (e) a user pressing on the display screen, (f) a user making a circular motion on the display screen with a finger, and (g) a user shaking the wristband system.

110. The wristband system of claim 106, comprising a display screen, preferably wherein the display screen is curved.

111. The wristband system of claim 106 comprising a sensor coupled to the wristband and configured to detect at least one biotelemetric function and/or at least one physical activity associated with the user, the control unit configured to generate biotelemetry data and/or physical activity data from the detected at least one biotelemetric function and/or at least one physical activity.

112. The wristband system of claim 111, wherein the at least one biotelemetric function is selected from the group consisting of: (a) heart rate, (b) calories burned, (c) blood pressure, (d) skin temperature, (e) hydration level, and (f) galvanic skin response.

113. The wristband system of claim 111, wherein the at least one physical activity is selected from the group consisting of: (a) steps taken by the user, (b) stairs climbed by the user, (c) physical movement by the user, (d) speed of the user, and (e) sleep patterns of the user.

114. The wristband system of claim 111, wherein the wristband system is configured to recognize the user based on the at least one biotelemetric function associated with the user.

115. The wristband system of claim 111, wherein the control unit is configured to determine a mood of the user based on one or more biotelemetric functions associated with the user.

116. The wristband system of claim 111, wherein at least a portion of the biotelemetry data or physical activity data is used to determine at least one of a calculation of (a) a sleep efficiency of the user, (b) a stress level of the user, (c) an exercise intensity of the user, (d) an activity level of the user, (e) a self-control level of the user, and (f) a general wellness level of the user.

117. The wristband system of claim 1, further comprising a communications interface in communication with the control unit and configured to send sensor data through a network to a computing device, wherein a software platform within the computing device is configured to analyze the sensor data, and wherein at least a portion of the analyzed sensor data is viewable on a display screen, preferably wherein the computing device is a mobile device.

118. The wristband system of claim 117, wherein the sensor data comprises biotelemetry data or physical activity data.

119. The wristband system of claim 117, wherein the communications interface is configured to receive social data regarding the user that was analyzed by the computing device, at least a portion of the analyzed social data being viewable on the display screen.

120. The wristband system of claim 118, wherein the at least a portion of the analyzed biotelemetry data and/or physical activity data is accessible to a second user on the computing device, preferably wherein the second user is a health coach or a personal assistant.

121. The wristband system of claim 120, wherein the communications interface is configured to receive information sent from the computing device from the second user and wherein the information is viewable on the display screen.

122. The wristband system of claim 1, wherein the wristband system is configured for wireless coupling to a scale, the scale configured to calculate a measurement for at least one of the group consisting of (a) a weight of the user, (b) a BMI of the user, (c) a body fat percentage of the user, and (d) a heart rate of the user, preferably wherein the scale is configured to send at least a portion of the calculated measurement through a network to the wearable wristband, the control unit configured to analyze the calculated measurement, and wherein at least a portion of the analyzed calculated measurement is viewable on the screen.
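Among the scale measurements listed in claim 122, BMI has a standard definition: weight in kilograms divided by height in metres squared. A minimal sketch follows; note that the height parameter is an assumed user-profile input (a scale alone cannot measure height), and nothing here reflects the patent's actual implementation.

```python
# Conventional BMI formula; the height input is an assumed profile value,
# not something the claimed scale measures.

def body_mass_index(weight_kg: float, height_m: float) -> float:
    """Standard BMI: weight in kilograms divided by height in metres squared."""
    if height_m <= 0:
        raise ValueError("height must be positive")
    return weight_kg / (height_m ** 2)

print(round(body_mass_index(70.0, 1.75), 1))  # 70 kg at 1.75 m -> 22.9
```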

123. The wristband system of claim 1, further comprising an electric feedback stimulator configured to deliver an electric shock from the wristband in response to an instruction from the control unit.

124. A wristband configured to be worn by a user, the wristband comprising:

a control unit; and
a sensor configured to detect at least one gesture made by the user, the at least one gesture indicating instructions to be performed by the control unit, the control unit configured to translate the at least one gesture into a specific command for an action to occur within the wristband.

125. A method of processing data from an intelligent wristband system, comprising:

detecting at least one gesture made by a user, the at least one gesture indicating instructions to be performed by the wristband system; and
translating the at least one gesture into a specific command for an action to occur within the wristband system.
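The two steps of the method of claim 125 — detect a gesture, then translate it into a specific command — can be sketched as a dispatch table. The gesture names below follow the examples enumerated in claim 109, but the command strings and the table itself are hypothetical illustrations, not the patent's disclosed mapping.

```python
# Hypothetical gesture-to-command dispatch for the method of claim 125.
# Gesture names echo claim 109; the command strings are assumptions.

GESTURE_COMMANDS = {
    "wrist_lift_and_roll": "WAKE_DISPLAY",
    "single_tap": "SELECT",
    "swipe": "NEXT_SCREEN",
    "two_finger_tap": "BACK",
    "press": "CONTEXT_MENU",
    "circular_motion": "SCROLL",
    "shake": "DISMISS_NOTIFICATION",
}

def translate_gesture(gesture: str) -> str:
    """Translate a detected gesture into a specific command; ignore unknown gestures."""
    return GESTURE_COMMANDS.get(gesture, "NO_OP")

print(translate_gesture("wrist_lift_and_roll"))  # WAKE_DISPLAY
print(translate_gesture("clap"))                 # NO_OP (unrecognized)
```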
Patent History
Publication number: 20170031449
Type: Application
Filed: Mar 3, 2016
Publication Date: Feb 2, 2017
Inventors: Peter KARSTEN (Cookham), George Arriola (San Francisco, CA), Kouji Kodera (Mercer Island, WA)
Application Number: 15/059,636
Classifications
International Classification: G06F 3/01 (20060101); G06F 19/00 (20060101); A61B 5/0205 (20060101); G06F 3/0488 (20060101); G06F 3/0346 (20060101); A61B 5/00 (20060101); H04Q 9/00 (20060101); G06F 1/16 (20060101);