Devices and methods for health tracking and providing information for improving health
In one aspect, a device includes at least one storage medium bearing instructions executable by a processor, and at least one processor configured for accessing the storage medium to execute the instructions to configure the processor for receiving input pertaining to at least a first health parameter, monitoring at least one biometric of a user, and determining whether the user's biometric conforms to the first health parameter. The instructions also configure the processor for providing an indication that the biometric conforms to the first health parameter in response to determining that the user's biometric conforms to the first health parameter, and providing a recommendation for conforming to the first health parameter in response to determining that the user's biometric does not conform to the first health parameter.
The present application relates generally to digital ecosystems configured to track a user's health-related biometrics.
II. BACKGROUND OF THE INVENTION
Society is becoming increasingly health-conscious. However, there are currently no adequately robust tools for providing diagnostics and recommendations for improving one's health and fitness.
SUMMARY OF THE INVENTION
Accordingly, present principles recognize that a variety of health monitoring devices and/or sensors may be used to help a user improve their health and fitness by e.g. performing calculations for reaching health targets, providing exercise goals, and providing interactive functions that make reaching such goals more enjoyable.
Thus, in a first aspect a device includes at least one computer readable storage medium bearing instructions executable by a processor, and at least one processor configured for accessing the computer readable storage medium to execute the instructions to configure the processor for receiving input pertaining to at least a first health parameter, monitoring at least one biometric of a user, and determining whether the user's biometric conforms to the first health parameter. The instructions also configure the processor for providing an indication that the biometric conforms to the first health parameter and providing information pertaining to a reference biometric in response to determining that the user's biometric conforms to the first health parameter, and providing a recommendation for conforming to the first health parameter in response to determining that the user's biometric does not conform to the first health parameter. The reference biometric is of the same biometric type as the user's biometric.
The user's biometric may be monitored at least in part based on signals from one or more biometric sensors configured to gather biometric information from the user. In some embodiments, the input pertaining to the first health parameter may be received from the user.
Also in some embodiments, the information pertaining to the reference biometric may include the reference biometric. The reference biometric may be derived by the device from information from a public health agency website and/or from a government website. Furthermore, the reference biometric may be a biometric average of plural persons of the same age and gender as the user.
If desired, determining whether the user's biometric conforms to the first health parameter may include comparing the user's biometric against the first health parameter and determining whether the user's biometric is within a threshold of the first health parameter.
Also if desired, the indication and/or recommendation may be provided on a user interface (UI) that may be presented on a different device than the device executing the instructions. The UI may include a link to a website pertaining to health information. The recommendation may include an instruction for the user to alter the user's physical activity in at least one respect, and/or may include an indication of sustenance to consume. In some embodiments, the biometric type may be one of blood oxygen level, glucose level, sodium level, and resting heart rate.
In another aspect, a method includes receiving a biometric target from a person, receiving at least one signal from a biometric sensor sensing a biometric of the person, determining whether the biometric target has been reached based at least in part on the signal, and providing at least a first indication that the biometric target has not been reached responsive to a determination that the biometric target has not been reached.
In still another aspect, a device includes at least one computer readable storage medium bearing instructions executable by a processor, and at least one processor configured for accessing the computer readable storage medium to execute the instructions to configure the processor for receiving at least one physical fitness target from a user, determining at least one biometric range for which at least one biometric of the user is to reach to conform to the physical fitness target, and providing a fitness plan to the user to reach the physical fitness target.
The details of the present invention, both as to its structure and operation, can best be understood in reference to the accompanying drawings, in which like reference numerals refer to like parts, and in which:
This disclosure relates generally to consumer electronics (CE) device based user information. With respect to any computer systems discussed herein, a system may include server and client components, connected over a network such that data may be exchanged between the client and server components. The client components may include one or more computing devices including televisions (e.g. smart TVs, Internet-enabled TVs), computers such as laptops and tablet computers, and mobile devices including smart phones and additional examples discussed below. These client devices may employ, as non-limiting examples, operating systems from Apple, Google, or Microsoft. A Unix operating system may be used. These operating systems can execute one or more browsers such as a browser made by Microsoft or Google or Mozilla or other browser program that can access web applications hosted by the Internet servers over a network such as the Internet, a local intranet, or a virtual private network.
As used herein, instructions refer to computer-implemented steps for processing information in the system. Instructions can be implemented in software, firmware or hardware; hence, illustrative components, blocks, modules, circuits, and steps are set forth in terms of their functionality.
A processor may be any conventional general purpose single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines and registers and shift registers. Moreover, any logical blocks, modules, and circuits described herein can be implemented or performed, in addition to a general purpose processor, in or by a digital signal processor (DSP), a field programmable gate array (FPGA) or other programmable logic device such as an application specific integrated circuit (ASIC), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be implemented by a controller or state machine or a combination of computing devices.
Any software and/or applications described by way of flow charts and/or user interfaces herein can include various sub-routines, procedures, etc. It is to be understood that logic divulged as being executed by e.g. a module can be redistributed to other software modules and/or combined together in a single module and/or made available in a shareable library.
Logic when implemented in software, can be written in an appropriate language such as but not limited to C# or C++, and can be stored on or transmitted through a computer-readable storage medium such as a random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), compact disk read-only memory (CD-ROM) or other optical disk storage such as digital versatile disc (DVD), magnetic disk storage or other magnetic storage devices including removable thumb drives, etc. A connection may establish a computer-readable medium. Such connections can include, as examples, hard-wired cables including fiber optics and coaxial wires and twisted pair wires. Such connections may include wireless communication connections including infrared and radio.
In an example, a processor can access information over its input lines from data storage, such as the computer readable storage medium, and/or the processor accesses information wirelessly from an Internet server by activating a wireless transceiver to send and receive data. Data typically is converted from analog signals to digital by circuitry between the antenna and the registers of the processor when being received and from digital to analog when being transmitted. The processor then processes the data through its shift registers to output calculated data on output lines, for presentation of the calculated data on the CE device.
Components included in one embodiment can be used in other embodiments in any appropriate combination. For example, any of the various components described herein and/or depicted in the Figures may be combined, interchanged or excluded from other embodiments.
“A system having at least one of A, B, and C” (likewise “a system having at least one of A, B, or C” and “a system having at least one of A, B, C”) includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.
Before describing the figures in detail, computer ecosystems are first discussed.
Two general types of computer ecosystems exist: vertical and horizontal computer ecosystems. In the vertical approach, virtually all aspects of the ecosystem are associated with the same company (e.g. produced by the same manufacturer) and are specifically designed to seamlessly interact with one another. Horizontal ecosystems, on the other hand, integrate aspects such as hardware and software that are created by differing entities into one unified ecosystem. The horizontal approach allows for a greater variety of input from consumers and manufacturers, increasing the capacity for novel innovations and adaptations to changing demands. Regardless, it is to be understood that some digital ecosystems, including those referenced herein, may embody characteristics of both the horizontal and vertical ecosystems described above.
Accordingly, it is to be further understood that these ecosystems may be used while engaged in physical activity to e.g. provide inspiration, goal fulfillment and/or achievement, automated coaching/training, health and exercise analysis, convenient access to data, group sharing (e.g. of fitness data), and increased accuracy of health monitoring, all in a stylish and entertaining manner. Further still, the devices disclosed herein are understood to be capable of making diagnostic determinations based on data from various sensors, such as those described below.
Now specifically referring to the example system 10 shown in the figures, it may include a consumer electronics (CE) device 12 in accordance with present principles.
Accordingly, to undertake such principles the CE device 12 can include some or all of the components described below.
In addition to the foregoing, the CE device 12 may also include one or more input ports 26 such as, e.g., a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to the CE device 12 for presentation of audio from the CE device 12 to a user through the headphones. The CE device 12 may further include one or more tangible computer readable storage mediums 28 such as disk-based or solid state storage, it being understood that the computer readable storage medium 28 may not be a carrier wave. Also in some embodiments, the CE device 12 can include a position or location receiver such as but not limited to a GPS receiver and/or altimeter 30 that is configured to e.g. receive geographic position information from at least one satellite and provide the information to the processor 24, and/or determine an altitude at which the CE device 12 is disposed in conjunction with the processor 24. However, it is to be understood that another suitable position receiver other than a GPS receiver and/or altimeter may be used in accordance with present principles to e.g. determine the location of the CE device 12 in e.g. all three dimensions.
Continuing the description of the CE device 12, in some embodiments the CE device 12 may include one or more cameras 32 that may be, e.g., a thermal imaging camera, a digital camera such as a webcam, and/or a camera integrated into the CE device 12 and controllable by the processor 24 to gather pictures/images and/or video in accordance with present principles. Also included on the CE device 12 may be a Bluetooth transceiver 34 and other Near Field Communication (NFC) element 36 for communication with other devices using Bluetooth and/or NFC technology, respectively. An example NFC element can be a radio frequency identification (RFID) element.
Further still, the CE device 12 may include one or more motion sensors 37 (e.g., an accelerometer, gyroscope, cyclometer, magnetic sensor, infrared (IR) motion sensors such as passive IR sensors, an optical sensor, a speed and/or cadence sensor, a gesture sensor (e.g. for sensing gesture commands), etc.) providing input to the processor 24. The CE device 12 may include still other sensors such as e.g. one or more climate sensors 38 (e.g. barometers, humidity sensors, wind sensors, light sensors, temperature sensors, etc.) and/or one or more biometric sensors 40 (e.g. heart rate sensors and/or heart monitors, calorie counters, blood pressure sensors, perspiration sensors, odor and/or scent detectors, fingerprint sensors, facial recognition sensors, iris and/or retina detectors, DNA sensors, oxygen sensors (e.g. blood oxygen sensors and/or VO2 max sensors), glucose and/or blood sugar sensors, sodium sensors, sleep sensors (e.g. a sleep tracker), pedometers and/or speed sensors, body temperature sensors, nutrient and metabolic rate sensors, voice sensors, lung input/output and other cardiovascular sensors, mood sensors, and still other sensors for providing biometrics of the biometric types discussed herein, etc.) also providing input to the processor 24. In addition to the foregoing, it is noted that in some embodiments the CE device 12 may also include a kinetic energy harvester 42 to e.g. charge a battery (not shown) powering the CE device 12.
Still referring to the figures, in addition to the CE device 12 the system 10 may include other devices bearing biometric and/or other sensors, such as headphones/ear buds 46, clothing 48, an exercise machine 50, and a kiosk 52, as well as at least one server 54.
Thus, for instance, the headphones/ear buds 46 may include a heart rate sensor configured to sense a person's heart rate when the person is wearing the headphones. The clothing 48 may include sensors such as perspiration sensors, climate sensors, and heart sensors for measuring the intensity of a person's workout. The exercise machine 50 may include a camera mounted on a portion thereof for gathering facial images of a user, so that the machine 50 may determine whether a particular facial expression is indicative of a user struggling to keep the pace set by the exercise machine 50, and/or an NFC element to e.g. pair the machine 50 with the CE device 12 and hence access a database of preset workout routines. The kiosk 52 may include an NFC element permitting entry to a person authenticated as being authorized for entry based on input received from a complementary NFC element (such as e.g. the NFC element 36 on the device 12). Also note that all of the devices described above may communicate with one another in accordance with present principles.
Now in reference to the afore-mentioned at least one server 54, it includes at least one processor 56, at least one tangible computer readable storage medium 58 (such as disk-based or solid state storage) that may not be a carrier wave, and at least one network interface 60 that, under control of the processor 56, allows for communication with the other CE devices of the system 10 in accordance with present principles.
Accordingly, in some embodiments the server 54 may be an Internet server, may facilitate fitness coordination and/or data exchange between CE devices in accordance with present principles, may provide information to one or more CE devices in accordance with present principles, and may include and perform "cloud" functions such that the CE devices of the system 10 may access a "cloud" environment via the server 54 in example embodiments to e.g. access a fitness plan in accordance with present principles and/or stream music to listen to while exercising.
Turning now to the example logic flow chart, the logic begins at block 100 where it receives input pertaining to at least one health parameter and/or target (e.g. from the user), which may but need not directly correspond to the output of a biometric sensor.
For instance, a BMI indication may not necessarily directly correspond to the output of a biometric sensor, but can be derived from sensor outputs by e.g. dividing the user's weight by the square of the user's height. Thus, the user's weight (e.g. and even height) may be provided by a biometric sensor such as a weight sensor (e.g. a scale), but the computation itself to derive the user's BMI by dividing the user's weight by the square of the user's height may be undertaken by the device undertaking the present logic.
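The BMI derivation just described can be sketched as follows (a minimal Python illustration; the function name and sample values are assumptions for clarity, not part of the disclosure):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Derive BMI from raw sensor outputs: weight divided by height squared."""
    if height_m <= 0:
        raise ValueError("height must be positive")
    return weight_kg / (height_m ** 2)

# e.g. a 70 kg user who is 1.75 m tall
print(round(bmi(70.0, 1.75), 1))  # 22.9
```

In practice the weight (and possibly height) inputs would come from the biometric sensors described above, with the arithmetic performed on the CE device or server.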
In any case, after block 100 the logic proceeds to block 102 where the logic receives a (e.g. desired) time in which the user is to and/or wishes to reach the parameter(s) and/or target(s). The time may be subsequently used to e.g. present an alarm at the time responsive to the parameter and/or target not being reached by the time (e.g. at or within one month) in accordance with present principles. In any case, after block 102 the logic proceeds to block 104 where the logic receives at least one signal from a biometric sensor sensing a biometric of the user in accordance with present principles. E.g., input may be received from a smart wrist band including plural biometric sensors of different types and provided to another device such as e.g. a smart phone of the user. Additionally or alternatively at block 104, the logic monitors the user's biometrics (by e.g. generating and updating a history of received biometric information, and/or processing and/or analyzing received biometric information) for making determinations as described herein based on the received data.
Regardless, after block 104 the logic proceeds to decision diamond 106, where the logic determines whether the biometric information that has been received conforms to the parameter(s) and/or has reached the target(s). The logic may do so by e.g. comparing biometric data associated with the user from the biometric sensor(s) against the first health parameter to determine whether the user's biometric is within a threshold of the first health parameter. Thus, for instance, the logic may, e.g. every tenth of a second, determine whether the parameter and/or target has been reached by comparing the most recently received biometric information to the parameter and/or target and determining whether it is within a threshold thereof. Note that parameters and targets in accordance with present principles may be a specific number and/or value, and/or may be a number and/or value range, such that e.g. a target range for healthy sodium levels in a person's body may be determined at block 106 to have or have not been reached.
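The comparison at diamond 106 can be sketched in Python (a minimal illustration; the function names and the sample heart-rate figures are assumptions, not part of the disclosure):

```python
def conforms(biometric: float, parameter: float, threshold: float) -> bool:
    """True when the user's biometric is within `threshold` of the parameter."""
    return abs(biometric - parameter) <= threshold

def conforms_to_range(biometric: float, low: float, high: float) -> bool:
    """Parameters may also be ranges, e.g. a healthy sodium-level range."""
    return low <= biometric <= high

# a resting heart rate target of 60 bpm with an assumed +/- 5 bpm threshold
print(conforms(63.0, 60.0, 5.0))             # True
print(conforms_to_range(72.0, 60.0, 100.0))  # True
```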
Thus, an affirmative determination at diamond 106 causes the logic to move to block 108 where the logic provides an indication (e.g. on a user interface (UI)) that the user's biometric(s) conforms to the parameter(s) and/or target(s). Also at block 108, the logic may provide (e.g. health and fitness) information on the UI such as e.g. a reference biometric as will be discussed further below, links to health websites and/or health information, etc. Note that the indication e.g. as provided on a UI may be presented on the device providing the biometric information such as e.g. a smart bracelet, and/or may be provided on another device in communication with the device providing the biometric information such as e.g. a smart phone of the user.
Still in reference to diamond 106, should a negative determination be made thereat, the logic instead proceeds to block 110 where the logic provides a recommendation for conforming to the first health parameter and/or an indication that target(s) has not been reached. The recommendation and/or indication that the target(s) has not been reached may be presented on a UI in accordance with present principles, where the UI may be presented on the device providing the biometric information such as e.g. a smart bracelet, and/or may be presented on another device in communication with the device providing the biometric information such as e.g. a smart phone of the user.
The UI presented at block 110 may include e.g. health and fitness information, one or more reference biometrics, links to health websites and/or health information, one or more instructions for the user to alter the user's physical activity and/or behavior in at least one respect to conform to the parameter and/or reach the target (e.g. engage in, and/or refrain from, a particular activity), one or more instructions for the user to alter the user's eating habits in at least one respect to conform to the parameter and/or reach the target (e.g. consume certain foods or drink, and/or refrain from consuming certain foods or drink), etc.
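The branch from diamond 106 into blocks 108 and 110 can be sketched as follows (a hedged Python illustration; the function name and message wording are assumptions, and a real device would render these on a UI rather than return strings):

```python
def evaluate_reading(name: str, reading: float, target: float, threshold: float) -> str:
    """Blocks 104-110 in miniature: compare the latest sensor reading to the
    target; return a conformance indication or a recommendation."""
    if abs(reading - target) <= threshold:
        # block 108: biometric conforms to the parameter/target
        return f"{name} of {reading} conforms to the target of {target}"
    # block 110: provide a recommendation for conforming
    return (f"{name} of {reading} misses the target of {target}: "
            "consider altering physical activity or eating habits")

print(evaluate_reading("resting heart rate", 78, 60, 5))
```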
Referring back to the reference biometric from above, a reference biometric in accordance with present principles may be e.g. a particular number, parameter, statistic, average, etc. for the biometric type of the respective user biometric that is commonly accepted (e.g. within the medical community) as being healthy, a "healthy normal," and/or within a normal range, as indicated at e.g. a health website providing such information that is accessed by the device undertaking the present logic. For instance, the reference biometric may be a biometric average of plural persons of the same age and gender as the user.
Now in reference to the second example logic flow chart, the logic begins at block 120, where it receives at least one physical fitness target from a user in accordance with present principles.
After receiving at least one physical fitness target from a user, such as at block 100 and/or block 120, the logic determines at block 122 at least one biometric range that one or more biometrics of the user are to reach to conform to the user's fitness target. The fitness target may be, e.g., a desired BMI, a desired blood oxygen level, a desired oxygen consumption amount, a desired blood glucose level, a desired blood sodium level, a desired (e.g. resting) heart rate, a desired blood pressure, a desired core body temperature, a desired ratio of calorie intake to calorie usage, etc.
Also at block 122, the logic derives and/or determines a fitness plan, routine, outline, fitness program, and/or scheme in accordance with present principles for undertaking actions and/or refraining from actions to progress toward conforming to the desired parameter and/or target. Thus, e.g., the fitness plan may be e.g. one or more (e.g. a set) of activities to regularly perform and/or perform at intervals (e.g. workout routines (e.g. and times) to engage in with the plan indicating details e.g. workout by workout (and/or day by day) for a total number of workouts, diets and/or meal plans providing detailed information meal by meal for a total number of meals, etc.). The information indicated in the plan may be determined based on e.g. accessing a data table correlating exercises and/or dieting with one or more biometric types for thus improving biometrics of the biometric types, which may then be incorporated into the plan by the device and provided to the user.
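The data-table lookup described above can be sketched as follows (a hedged Python illustration; the table contents and function name are invented for demonstration and are not specified by the disclosure):

```python
# Hypothetical data table correlating biometric types with exercises and/or
# dietary changes understood to improve them (entries are illustrative only).
IMPROVEMENT_TABLE = {
    "resting heart rate": ["30 min jogging, 3x per week", "reduce caffeine"],
    "blood glucose":      ["20 min brisk walk after meals", "limit added sugar"],
    "sodium level":       ["avoid processed snacks"],
}

def build_fitness_plan(biometric_types: list[str]) -> list[str]:
    """Assemble a plan by pulling the table entries for each biometric type
    the user's target implicates."""
    plan: list[str] = []
    for biometric in biometric_types:
        plan.extend(IMPROVEMENT_TABLE.get(biometric, []))
    return plan

print(build_fitness_plan(["resting heart rate", "sodium level"]))
```

The assembled plan would then be provided to the user, e.g. on one of the UIs discussed below.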
In addition to or in lieu of the foregoing, the fitness plan may be e.g. an indication of the total time a particular activity or activities are to be undertaken to reach the user's physical fitness target. Such a time may be determined by e.g. accessing an average (e.g. reference) time to incrementally alter a biometric of the given biometric type (e.g. stored at a server with which the device undertaking the present logic communicates).
Regardless, after accessing such information, the logic may determine how many increments the particular biometric of the user is from the biometric range/target, and hence the time to reach the target. For example, the logic may take the user's current biometric (and/or the most recent biometric sensor output for that biometric) and subtract it from the targeted biometric to arrive at a first number. The first number is then divided by the increment value/constant for the average time to incrementally alter the biometric type (as discussed above) to determine a second number: the number of increments the user's biometric is from the biometric target. The time to reach the target may then be calculated by multiplying the second number by the average time to incrementally alter the biometric type. In some embodiments, the number of increments and/or the time to reach the target may be presented to the user on one of the UIs discussed herein and/or as part of a fitness plan in accordance with present principles.
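The increment arithmetic above can be expressed compactly (a minimal Python sketch; the function name and the sample figures, e.g. "2 weeks per bpm", are assumptions, not values given by the disclosure):

```python
def time_to_target(current: float, target: float,
                   increment: float, avg_time_per_increment: float):
    """Return (number of increments from target, estimated time to reach it).

    increments = |target - current| / increment
    time       = increments * average time to alter the biometric by one increment
    """
    increments = abs(target - current) / increment
    return increments, increments * avg_time_per_increment

# e.g. current resting heart rate 75 bpm, target 65 bpm, where lowering the
# rate by 1 bpm (the increment) takes an assumed 2 weeks on average
n, weeks = time_to_target(75.0, 65.0, 1.0, 2.0)
print(n, weeks)  # 10.0 20.0
```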
Continuing the detailed description, an example UI 130 is shown that may include a column 132 of input boxes for entering e.g. the user's current biometrics and/or desired targets.
Still in reference to the UI 130, it may also include a selector element 156 that is selectable to automatically, without further user input responsive thereto, cause the CE device to e.g. communicate with one or more biometric sensors to receive current biometric information pertaining to the user and e.g. auto-fill one or more of the respective input boxes in column 132, such as a current blood pressure should a current blood pressure input box be presented in the column 132 (though not actually shown).
Now in reference to the example UI 160, it may include a column 164 listing one or more biometric types along with e.g. the user's current biometrics and corresponding reference biometrics in accordance with present principles.
In addition to the foregoing, the UI 160 may also include a section 170 providing information on activities in which to partake to improve the user's biometrics even further and/or bring them even closer to the reference biometrics. For instance, the section 170 may provide instructions for the user to partake in one or more additional physical activities or behaviors, participate in those or other physical activities for a longer duration, eat particular foods and/or food classes/groups, etc. Likewise, a section 172 providing information on activities in which to not partake to improve the user's biometrics even further and/or bring them even closer to the reference biometrics is shown. The section 172 may include e.g. instructions for the user to refrain from one or more activities or behaviors, refrain from eating particular foods and/or food classes/groups, etc. Furthermore, note that although not shown, each respective piece of information in either column 170 or 172 may include an indication of the biometric type for which the information pertains and e.g. seeks to improve. For instance, a “Don't eat pickles” instruction in column 172 may be accompanied by the following: “This will improve your sodium levels.”
Still in reference to the UI 160, it may also include a health information section 174 providing one or more pieces of health information that may or may not be unique and/or tailored to the user, such as e.g. health news on recently released health studies, selectable links to health-related websites that may be selected from the UI 160 to automatically without further user input cause the CE device to access the website and present it on the CE device, general health information pertaining to one or more of the biometric types from the column 164, etc. Last, the exemplary UI 160 includes one or more share selector elements 176 that are selectable to automatically without further user input share (e.g. a screen shot of) the UI 160 and/or information associated therewith over a social networking site corresponding to the selected element such as e.g. Facebook or Twitter.
Moving on to the example UI 180, it may present e.g. a fitness plan in accordance with present principles, along with indications of the user's current biometrics and/or progress relative to the plan.
Still in reference to the UI 180, a selector element 192 may also be presented thereon. The selector element 192 may be selectable to automatically, without further user input responsive thereto, cause an updated fitness plan to be generated and/or presented on the CE device relative to a previously determined plan. Thus, the updated plan may be generated by the CE device responsive to determining that at least one of the user's current biometrics does not conform to the respective target, or even that the user's current biometric fails to conform to where the non-updated plan estimated that biometric would be at that point in time had the user acted in conformance with the initial plan. For instance, a user's body may burn calories at a slower rate for a given exercise than the reference rate at which the general public is estimated to burn calories for that exercise. In such an instance, the logic may determine, based on input from one or more biometric sensors, that the user needs to engage in further exercise to burn the same amount of calories as the reference biometric, and therefore that the user's plan should be adjusted accordingly to still reach the reference biometric and/or the user's target given the user's relatively lower calorie-burning output.
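One simple way to realize this adjustment is to scale the planned duration by the ratio of the reference burn rate to the user's measured burn rate (a hedged Python sketch; the disclosure does not specify this formula, and the function name and rates are assumptions):

```python
def adjusted_duration(planned_minutes: float,
                      reference_burn_rate: float,
                      user_burn_rate: float) -> float:
    """Scale a planned exercise duration so a slower-burning user still burns
    the same total calories as the reference assumption."""
    if user_burn_rate <= 0:
        raise ValueError("burn rate must be positive")
    return planned_minutes * (reference_burn_rate / user_burn_rate)

# reference assumption: 10 kcal/min; this user (per sensor history): 8 kcal/min
print(adjusted_duration(30.0, 10.0, 8.0))  # 37.5
```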
Continuing the description of the UI 180, it may also include a section 194 providing information on activities in which to partake to improve the user's biometrics to thus bring them into conformance with the plan and/or reference biometric in accordance with present principles. Note further that the section 194 may include selector elements corresponding to respective entries in the section 194 that e.g. provide interactive features.
For instance, an entry instructing the user to eat more vegetables may be accompanied by a "see recipes" selector element 196 selectable to automatically, without further user input responsive thereto, cause the CE device to access (e.g. from a cooking website) and then present on the CE device one or more recipes that use the indicated food as an ingredient. In the present example, selection of the selector element 196 may cause one or more recipes requiring vegetables to be presented so that the user may e.g. prepare a meal including vegetables and thus attempt to conform to the indication of food to eat to improve the user's biometric(s).
Likewise, a location selector element 198 may be presented that is selectable to automatically, without further user input responsive thereto, cause the CE device to access (e.g. from a list of gyms on the Internet) and then present on the CE device indications of one or more nearby gyms and/or fitness facilities (e.g. based on GPS coordinates determined by the CE device using a GPS receiver thereon). The element 198 may be included next to an indication of a physical activity to undertake, so that the user may e.g. go to one of the indicated facilities to engage in the physical activity. Before moving on, also note that a section 200 providing information on activities in which to not partake to improve the user's biometrics in accordance with present principles is shown.
Still in reference to the UI 180, it may also include a health information section 202 in accordance with present principles for providing one or more pieces of health information that may or may not be unique and/or tailored to the user. Last, the exemplary UI 180 includes one or more share selector elements 204 in accordance with present principles that are selectable to automatically without further user input share (e.g. a screen shot of) the UI 180 and/or information associated therewith over a social networking site corresponding to the selected element such as e.g. Facebook or Twitter.
Now in reference to both
Without reference to any particular figure, it is to be understood that exercise plans in accordance with present principles may in some embodiments be generated at a server and provided to a CE device rather than being generated at the CE device itself.
Furthermore, present principles recognize that although much of the foregoing has been directed specifically to exercise routines, present principles may apply not only to exercising but also to other activities not necessarily commonly construed as “exercises,” such as walking down the street or sitting with a particular posture at a desk.
Accordingly, it may now be appreciated that as health and wellness are increasingly at the forefront of everyday life, present principles meet the increased desire of users to know more about how well their bodies are functioning and what changes can be made to optimize their biometrics and bodily functions, to thus keep them as close to ideal and/or healthy as possible.
Thus, in one embodiment present principles may be undertaken by a wearable diagnostics smart device that is configured to track and monitor e.g. full bodily functions including, but not limited to, temperature, blood pressure, oxygen consumption, calories in and out, sugar levels, sodium levels, sleep patterns, mood/energy (e.g. using a mood sensor), etc. The wearable smart device, with sensors measuring biometrics of the user, may help a user to set and track goals, as well as warn the user when the user's levels are off and/or out of sync (e.g. with healthy normals). With this information, the user may subsequently make informed changes to improve their health by e.g. following instructions and/or recommendations provided by the wearable smart device. Furthermore, the information provided to the user (e.g. on a UI such as the UIs 160 and 180 described above) may change and/or be reported out in real time (e.g. biometric information) in accordance with present principles, and can also sync with other devices to provide more robust reporting (e.g. for viewing on more than one of the user's devices, such as a tablet, even if the biometrics were taken by the wearable device). Such devices may also e.g. provide recommendations to the user such as types of foods, recipes, exercises, fitness facilities, etc.
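The "warn the user when the user's levels are out of sync with healthy normals" behavior reduces to a range check per biometric type. A short sketch, in which the ranges and message strings are illustrative placeholders and not medical guidance from the patent:

```python
# Hypothetical "healthy normals" per biometric type; a real device might
# derive these from a public health agency website or a reference population.
HEALTHY_RANGES = {
    "resting_heart_rate": (60, 100),    # beats per minute
    "body_temperature":   (36.1, 37.2), # degrees Celsius
}

def check_biometric(name: str, value: float) -> str:
    """Compare a sensed biometric against its healthy range and report."""
    lo, hi = HEALTHY_RANGES[name]
    if lo <= value <= hi:
        return f"{name} OK ({value})"
    return f"{name} out of range ({value}); target {lo}-{hi}"

print(check_biometric("resting_heart_rate", 72))   # conforms
print(check_biometric("body_temperature", 38.0))   # triggers a warning
```

In a fuller implementation the out-of-range branch would also surface a recommendation (food, recipe, exercise, or facility), as the UIs described above do.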
Thus, such a wearable smart device may be thought of as a portable diagnostics device that tracks and reports physical activity for a user, tracks consumption and operating levels, and provides recommendations on food and activities to help balance and optimize overall health and wellness. Things that may be tracked include e.g. calories in (e.g. consumed) and out (e.g. used or burned), temperature, blood pressure, heart rate, oxygen consumption, sugar and sodium levels, etc.
Before concluding the detailed description, it is to be understood that although e.g. an application for undertaking present principles may be vended with a CE device, present principles also apply in instances where one or more of the foregoing features and/or an application including software code for undertaking present principles may be e.g. downloaded from a server to a device over a network such as the Internet.
While the particular DEVICES AND METHODS FOR HEALTH TRACKING AND PROVIDING INFORMATION FOR IMPROVING HEALTH is herein shown and described in detail, it is to be understood that the subject matter which is encompassed by the present invention is limited only by the claims.
Claims
1. A device comprising:
- at least one computer memory that is not a transitory signal and that comprises instructions executable by at least one processor for:
- receiving input pertaining to at least a first health parameter;
- monitoring at least one biometric of a user;
- determining whether the user's biometric conforms to the first health parameter;
- in response to determining that the user's biometric conforms to the first health parameter, providing an indication that the biometric conforms to the first health parameter and providing information pertaining to a reference biometric, the reference biometric being of the same biometric type as the user's biometric;
- in response to determining that the user's biometric does not conform to the first health parameter, providing a recommendation for conforming to the first health parameter; and
- providing an indication of at least one specific sustenance to refrain from consuming to reach a biometric target.
2. The device of claim 1, wherein the input pertaining to the first health parameter is received from the user, and wherein the user's biometric does not conform to the first health parameter at or around the time of receiving the input.
3. The device of claim 1, where the information pertaining to the reference biometric includes the reference biometric, and wherein the reference biometric is derived by the device from information from a public health agency website.
4. The device of claim 1, where the information pertaining to the reference biometric includes the reference biometric, and wherein the reference biometric is derived by the device from information from a government website.
5. The device of claim 1, where the information pertaining to the reference biometric includes the reference biometric, and wherein the reference biometric is a biometric average of plural persons of the same age and gender as the user.
6. The device of claim 1, wherein the determining whether the user's biometric conforms to the first health parameter includes comparing the user's biometric against the first health parameter and determining whether the user's biometric is within a threshold of the first health parameter.
7. The device of claim 1, wherein the device is a first device, and wherein the indication is provided on a user interface (UI) presented on a second device different from the first device.
8. The device of claim 1, wherein the device is a first device, wherein the recommendation is provided on a user interface (UI) presented on a second device different from the first device, and wherein the UI includes a link to a website pertaining to health information.
9. The device of claim 1, wherein the biometric type is one of: blood oxygen level, glucose level, sodium level, and resting heart rate.
10. The device of claim 1, wherein the recommendation includes an instruction for the user to alter the user's physical activity in at least one respect.
11. The device of claim 1, wherein the recommendation includes an indication of sustenance to consume.
12. The device of claim 1, wherein the user's biometric is monitored at least in part based on signals from one or more biometric sensors configured to gather biometric information from the user.
13. A method, comprising:
- receiving a biometric target from a person;
- receiving at least one signal from a biometric sensor sensing a biometric of the person;
- based at least in part on the signal, determining whether the biometric target has been reached; and
- responsive to a determination that the biometric target has not been reached, providing at least a first indication that the biometric target has not been reached; and
- providing an indication of sustenance to refrain from consuming to reach the biometric target.
14. The method of claim 13, further comprising:
- responsive to the determination that the biometric target has not been reached, providing at least one instruction to the person for reaching the biometric target.
15. The method of claim 14, wherein the instruction includes an indication of physical activity in which to refrain from engaging to reach the biometric target.
16. The method of claim 13, wherein the first indication is an alarm presentable on a device associated with the user at a time indicated by the user at which the alarm is to be presented responsive to the biometric target not being reached by the time.
17. The method of claim 13, wherein the target is a biometric range for a biometric type.
18. A device, comprising:
- at least one computer memory that is not a transitory signal and that comprises instructions executable by at least one processor for:
- receiving at least one physical fitness target from a user;
- determining at least one biometric range for which at least one biometric of the user is to reach to conform to the physical fitness target; and
- providing a fitness plan to the user to reach the physical fitness target at least in part by:
- accessing an average time to incrementally alter the biometric type of the biometric when undertaking a particular activity;
- determining a number of increments the biometric of the user is from the biometric range; and
- multiplying the average time by the number to render a total time, and wherein the fitness plan includes an indication of the total time the particular activity is estimated to be undertaken to reach the physical fitness target.
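The estimate recited in claim 18 (average time to alter the biometric by one increment, multiplied by the number of increments the user's biometric is from the target range) can be illustrated with a short sketch; the numbers and function name are hypothetical:

```python
import math

def estimated_total_time(current, target_low, target_high,
                         increment, avg_minutes_per_increment):
    """Claim-18-style estimate: increments away from range x avg time each."""
    if target_low <= current <= target_high:
        return 0.0  # already within the biometric range
    # Distance from the nearer edge of the target range.
    gap = (target_low - current) if current < target_low else (current - target_high)
    increments = math.ceil(gap / increment)
    return increments * avg_minutes_per_increment

# E.g. resting heart rate of 90 bpm, target range 60-80 bpm, activity that
# lowers it ~2 bpm per increment at ~60 minutes per increment:
print(estimated_total_time(90, 60, 80, 2, 60))  # → 300
```

With those hypothetical numbers the user is five increments from the range, so the fitness plan would indicate roughly 300 minutes of the particular activity.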
4278095 | July 14, 1981 | Lapeyre |
4566461 | January 28, 1986 | Lubell et al. |
4625962 | December 2, 1986 | Street |
4708337 | November 24, 1987 | Shyu |
4728100 | March 1, 1988 | Smith |
4869497 | September 26, 1989 | Stewart et al. |
4916628 | April 10, 1990 | Kugler |
4920969 | May 1, 1990 | Suzuki et al. |
5072458 | December 17, 1991 | Suzuki |
5111818 | May 12, 1992 | Suzuki et al. |
5207621 | May 4, 1993 | Koch et al. |
5277197 | January 11, 1994 | Church et al. |
5314389 | May 24, 1994 | Dotan |
5410472 | April 25, 1995 | Anderson |
5433683 | July 18, 1995 | Stevens |
5454770 | October 3, 1995 | Stevens |
5474083 | December 12, 1995 | Church et al. |
5474090 | December 12, 1995 | Begun et al. |
5516334 | May 14, 1996 | Easton |
5524637 | June 11, 1996 | Erickson |
5579777 | December 3, 1996 | Suga |
5598849 | February 4, 1997 | Browne |
5704067 | January 6, 1998 | Brady |
5706822 | January 13, 1998 | Khavari |
5857939 | January 12, 1999 | Kaufman |
5921891 | July 13, 1999 | Browne |
6013007 | January 11, 2000 | Root et al. |
6032108 | February 29, 2000 | Seiple et al. |
6042519 | March 28, 2000 | Shea |
6050924 | April 18, 2000 | Shea |
6101443 | August 8, 2000 | Kato et al. |
6106297 | August 22, 2000 | Pollak et al. |
6171218 | January 9, 2001 | Shea |
6198431 | March 6, 2001 | Gibson |
6220865 | April 24, 2001 | Macri et al. |
6231527 | May 15, 2001 | Sol |
6244988 | June 12, 2001 | Delman |
6251048 | June 26, 2001 | Kaufman |
6259944 | July 10, 2001 | Margulis et al. |
6296595 | October 2, 2001 | Stark et al. |
6447424 | September 10, 2002 | Ashby et al. |
6447425 | September 10, 2002 | Keller et al. |
6464618 | October 15, 2002 | Shea |
6497638 | December 24, 2002 | Shea |
6500100 | December 31, 2002 | Harrell |
6515593 | February 4, 2003 | Stark et al. |
6582342 | June 24, 2003 | Kaufman |
6601016 | July 29, 2003 | Brown et al. |
6605044 | August 12, 2003 | Bimbaum |
6638198 | October 28, 2003 | Shea |
6659916 | December 9, 2003 | Shea |
6659946 | December 9, 2003 | Batchelor et al. |
6672991 | January 6, 2004 | O'Malley |
6702719 | March 9, 2004 | Brown et al. |
6746371 | June 8, 2004 | Brown et al. |
6749537 | June 15, 2004 | Hickman |
6786848 | September 7, 2004 | Yamashita et al. |
6793607 | September 21, 2004 | Neil |
6863641 | March 8, 2005 | Brown et al. |
6866613 | March 15, 2005 | Brown et al. |
6882883 | April 19, 2005 | Condie et al. |
6997882 | February 14, 2006 | Parker et al. |
7024369 | April 4, 2006 | Brown et al. |
7056265 | June 6, 2006 | Shea |
7057551 | June 6, 2006 | Vogt |
7070539 | July 4, 2006 | Brown et al. |
7128693 | October 31, 2006 | Brown et al. |
7192401 | March 20, 2007 | Saalasti et al. |
7223215 | May 29, 2007 | Bastyr |
7227468 | June 5, 2007 | Florio |
7245254 | July 17, 2007 | Vogt |
7328612 | February 12, 2008 | Jämsen et al. |
7351187 | April 1, 2008 | Seliber |
7370763 | May 13, 2008 | Pascucci |
7376423 | May 20, 2008 | Sakanaba |
7438670 | October 21, 2008 | Gray et al. |
7507183 | March 24, 2009 | Anderson et al. |
7586418 | September 8, 2009 | Cuddihy et al. |
7617615 | November 17, 2009 | Martorell et al. |
7633076 | December 15, 2009 | Huppi et al. |
7664292 | February 16, 2010 | Van Den et al. |
7683252 | March 23, 2010 | Oliver et al. |
7699752 | April 20, 2010 | Anderson et al. |
7728214 | June 1, 2010 | Oliver et al. |
7786856 | August 31, 2010 | O'Brien |
7840031 | November 23, 2010 | Albertson et al. |
7841966 | November 30, 2010 | Aaron et al. |
7857730 | December 28, 2010 | Dugan |
7894849 | February 22, 2011 | Kass |
7931563 | April 26, 2011 | Shaw et al. |
7951046 | May 31, 2011 | Barber, Jr. |
7966230 | June 21, 2011 | Brown |
7979136 | July 12, 2011 | Young et al. |
7996080 | August 9, 2011 | Hartman et al. |
8021270 | September 20, 2011 | D'eredita |
8029410 | October 4, 2011 | Shea |
8047965 | November 1, 2011 | Shea |
8057360 | November 15, 2011 | Shea |
8062182 | November 22, 2011 | Somers |
8092346 | January 10, 2012 | Shea |
8103762 | January 24, 2012 | Duberry |
8109858 | February 7, 2012 | Redmann |
8125314 | February 28, 2012 | Fithian |
8157730 | April 17, 2012 | LeBoeuf et al. |
8162802 | April 24, 2012 | Berg |
8182424 | May 22, 2012 | Heckerman |
8199014 | June 12, 2012 | Kindeberg |
8204786 | June 19, 2012 | Leboeuf et al. |
8219191 | July 10, 2012 | Hartman et al. |
8277377 | October 2, 2012 | Quy |
8317658 | November 27, 2012 | Dorogusker et al. |
8333874 | December 18, 2012 | Currie |
8343012 | January 1, 2013 | Redmann |
8360785 | January 29, 2013 | Park et al. |
8360935 | January 29, 2013 | Olsen et al. |
8371990 | February 12, 2013 | Shea |
8406085 | March 26, 2013 | Sakita |
8435177 | May 7, 2013 | Lanfermann et al. |
8452413 | May 28, 2013 | Young et al. |
8467860 | June 18, 2013 | Salazar et al. |
8491446 | July 23, 2013 | Hinds et al. |
8512209 | August 20, 2013 | Guidi et al. |
8512548 | August 20, 2013 | Bar-or et al. |
8514067 | August 20, 2013 | Hyde et al. |
8597093 | December 3, 2013 | Engelberg et al. |
8795138 | August 5, 2014 | Yeh et al. |
8825445 | September 2, 2014 | Hoffman et al. |
20010020143 | September 6, 2001 | Stark et al. |
20020028730 | March 7, 2002 | Kaufman |
20020072932 | June 13, 2002 | Swamy |
20020082142 | June 27, 2002 | Cannon et al. |
20020108000 | August 8, 2002 | Iori et al. |
20020128119 | September 12, 2002 | Arai |
20020142887 | October 3, 2002 | O'Malley |
20020156392 | October 24, 2002 | Arai et al. |
20030028116 | February 6, 2003 | Bimbaum |
20030064860 | April 3, 2003 | Yamashita et al. |
20030171188 | September 11, 2003 | Neil |
20030171189 | September 11, 2003 | Kaufman |
20030211916 | November 13, 2003 | Capuano |
20030226695 | December 11, 2003 | Mault |
20040058908 | March 25, 2004 | Keller et al. |
20040077462 | April 22, 2004 | Brown et al. |
20040117214 | June 17, 2004 | Shea |
20050010425 | January 13, 2005 | Chen et al. |
20050070809 | March 31, 2005 | Acres |
20050075214 | April 7, 2005 | Brown et al. |
20050083846 | April 21, 2005 | Bahl |
20050163346 | July 28, 2005 | Van Den et al. |
20050177059 | August 11, 2005 | Koivumaa et al. |
20050209002 | September 22, 2005 | Blythe et al. |
20050233861 | October 20, 2005 | Hickman et al. |
20050272561 | December 8, 2005 | Cammerata |
20060020216 | January 26, 2006 | Oishi et al. |
20060025282 | February 2, 2006 | Redmann |
20060032315 | February 16, 2006 | Saalastic et al. |
20060058156 | March 16, 2006 | Cohen et al. |
20060094570 | May 4, 2006 | Schneider |
20060107822 | May 25, 2006 | Bowen |
20060111621 | May 25, 2006 | Coppi et al. |
20060111944 | May 25, 2006 | Sirmans et al. |
20060113381 | June 1, 2006 | Hochstein et al. |
20060240959 | October 26, 2006 | Huang |
20060252602 | November 9, 2006 | Brown et al. |
20060281976 | December 14, 2006 | Juang et al. |
20060288846 | December 28, 2006 | Logan |
20070021269 | January 25, 2007 | Shum |
20070033068 | February 8, 2007 | Rao et al. |
20070033069 | February 8, 2007 | Rao et al. |
20070042868 | February 22, 2007 | Fisher et al. |
20070060446 | March 15, 2007 | Asukai et al. |
20070083092 | April 12, 2007 | Rippo et al. |
20070083095 | April 12, 2007 | Rippo et al. |
20070113725 | May 24, 2007 | Oliver et al. |
20070113726 | May 24, 2007 | Oliver et al. |
20070146116 | June 28, 2007 | Kimbrell |
20070173377 | July 26, 2007 | Jamsen et al. |
20070213608 | September 13, 2007 | Brown |
20070219059 | September 20, 2007 | Schwartz et al. |
20070249467 | October 25, 2007 | Hong et al. |
20070249468 | October 25, 2007 | Chen |
20070266065 | November 15, 2007 | Rosenberg |
20070271065 | November 22, 2007 | Gupta et al. |
20070275825 | November 29, 2007 | O'Brien |
20070300185 | December 27, 2007 | Macbeth et al. |
20080045384 | February 21, 2008 | Matsubara et al. |
20080051919 | February 28, 2008 | Sakai et al. |
20080098876 | May 1, 2008 | Kuo et al. |
20080103022 | May 1, 2008 | Dvorak et al. |
20080110115 | May 15, 2008 | French |
20080146890 | June 19, 2008 | Leboeuf et al. |
20080146892 | June 19, 2008 | LeBoeuf et al. |
20080147502 | June 19, 2008 | Baker |
20080153670 | June 26, 2008 | McKirdy et al. |
20080162186 | July 3, 2008 | Jones |
20080170123 | July 17, 2008 | Albertson et al. |
20080176713 | July 24, 2008 | Olivera et al. |
20080182723 | July 31, 2008 | Aaron et al. |
20080204225 | August 28, 2008 | Kitchen |
20080220941 | September 11, 2008 | Shaw et al. |
20080262918 | October 23, 2008 | Wiener |
20090044687 | February 19, 2009 | Sorber |
20090105047 | April 23, 2009 | Guidi et al. |
20090131224 | May 21, 2009 | Yuen |
20090131759 | May 21, 2009 | Sims et al. |
20090138488 | May 28, 2009 | Shea |
20090149131 | June 11, 2009 | Young et al. |
20090150175 | June 11, 2009 | Young et al. |
20090233771 | September 17, 2009 | Quatrochi et al. |
20090247366 | October 1, 2009 | Frumer |
20090247368 | October 1, 2009 | Chiang |
20090258758 | October 15, 2009 | Hickman et al. |
20090275442 | November 5, 2009 | Nissila |
20090287103 | November 19, 2009 | Pillai |
20090292178 | November 26, 2009 | Ellis et al. |
20090293298 | December 3, 2009 | Martorell et al. |
20090298426 | December 3, 2009 | Helvick |
20100017114 | January 21, 2010 | Tehan et al. |
20100035726 | February 11, 2010 | Fisher et al. |
20100056341 | March 4, 2010 | Ellis et al. |
20100099437 | April 22, 2010 | Moerdijk |
20100120585 | May 13, 2010 | Quy |
20100134257 | June 3, 2010 | Puleston et al. |
20100167876 | July 1, 2010 | Cheng |
20100185062 | July 22, 2010 | Salazar et al. |
20100186078 | July 22, 2010 | Napoli et al. |
20100190607 | July 29, 2010 | Widerman et al. |
20100216603 | August 26, 2010 | Somers |
20100217099 | August 26, 2010 | LeBoeuf et al. |
20100222178 | September 2, 2010 | Shea |
20100222181 | September 2, 2010 | Shea |
20100234699 | September 16, 2010 | Lanfermann et al. |
20100273610 | October 28, 2010 | Johnson |
20110001827 | January 6, 2011 | Ortiz et al. |
20110015039 | January 20, 2011 | Shea |
20110015041 | January 20, 2011 | Shea |
20110035184 | February 10, 2011 | Aaron et al. |
20110059825 | March 10, 2011 | Mcgown |
20110098112 | April 28, 2011 | LeBoeuf et al. |
20110098583 | April 28, 2011 | Pandia et al. |
20110106627 | May 5, 2011 | LeBoeuf et al. |
20110137191 | June 9, 2011 | Kinnunen |
20110152696 | June 23, 2011 | Ryan |
20110165996 | July 7, 2011 | Paulus et al. |
20110165998 | July 7, 2011 | Lav et al. |
20110179068 | July 21, 2011 | O'Brien |
20110195780 | August 11, 2011 | Lu |
20110195819 | August 11, 2011 | Shaw et al. |
20110205697 | August 25, 2011 | Callicoat et al. |
20110212688 | September 1, 2011 | Griffin et al. |
20110230142 | September 22, 2011 | Young et al. |
20110246908 | October 6, 2011 | Akram et al. |
20110263385 | October 27, 2011 | Shea et al. |
20110275042 | November 10, 2011 | Warman et al. |
20110288381 | November 24, 2011 | Bartholomew et al. |
20110319228 | December 29, 2011 | Shea |
20120010478 | January 12, 2012 | Kinnunen et al. |
20120058859 | March 8, 2012 | Elsom-Cook et al. |
20120077580 | March 29, 2012 | Mahajan et al. |
20120096249 | April 19, 2012 | Rubin et al. |
20120108395 | May 3, 2012 | Shea |
20120129138 | May 24, 2012 | Redmann |
20120130630 | May 24, 2012 | Tang et al. |
20120142429 | June 7, 2012 | Muller |
20120178431 | July 12, 2012 | Gold |
20120184871 | July 19, 2012 | Jang et al. |
20120190502 | July 26, 2012 | Paulus et al. |
20120203081 | August 9, 2012 | Leboeuf et al. |
20120226111 | September 6, 2012 | Leboeuf et al. |
20120226112 | September 6, 2012 | Leboeuf et al. |
20120271143 | October 25, 2012 | Aragones et al. |
20120271913 | October 25, 2012 | Tallgren et al. |
20120283855 | November 8, 2012 | Hoffman |
20120308192 | December 6, 2012 | Chung et al. |
20130032634 | February 7, 2013 | McKirdy |
20130046477 | February 21, 2013 | Hyde et al. |
20130089842 | April 11, 2013 | Shea |
20130090213 | April 11, 2013 | Amini et al. |
20130090565 | April 11, 2013 | Quy |
20130095459 | April 18, 2013 | Tran |
20130110265 | May 2, 2013 | Rahko et al. |
20130130213 | May 23, 2013 | Burbank et al. |
20130155251 | June 20, 2013 | Moravchik |
20130178960 | July 11, 2013 | Sheehan et al. |
20130217541 | August 22, 2013 | Shea |
20130217542 | August 22, 2013 | Shea |
20130217543 | August 22, 2013 | Shea |
20130218309 | August 22, 2013 | Napolitano |
20130225369 | August 29, 2013 | Fisbein et al. |
20130304377 | November 14, 2013 | Hende |
20130312589 | November 28, 2013 | Macpherson |
20130325326 | December 5, 2013 | Blumenberg et al. |
20140000322 | January 2, 2014 | Williams |
20140013344 | January 9, 2014 | Taxier |
20140089672 | March 27, 2014 | Luna et al. |
20140124570 | May 8, 2014 | Franklin |
20140248996 | September 4, 2014 | Adel |
20140316701 | October 23, 2014 | Cardonha et al. |
20150081210 | March 19, 2015 | Yeh et al. |
H08241496 | September 1996 | JP |
H10281790 | October 1998 | JP |
2003131785 | May 2003 | JP |
2005224318 | August 2005 | JP |
2007075172 | March 2007 | JP |
2007322172 | December 2007 | JP |
2008242063 | October 2008 | JP |
2009041964 | February 2009 | JP |
2009142333 | July 2009 | JP |
2009194670 | August 2009 | JP |
2010088886 | April 2010 | JP |
2012108801 | June 2012 | JP |
2012189415 | October 2012 | JP |
2013043061 | March 2013 | JP |
2013050454 | March 2013 | JP |
2009075493 | June 2009 | WO |
2012070019 | May 2012 | WO |
2012176193 | December 2012 | WO |
2013055380 | April 2013 | WO |
- Julia Anne Framel, Aravind Babu Asam, Guru Prashanth Balasubramanian, Takeshi Suzuki, Charles D. Hedrick Jr., “User Device Position Indication for Security and Distributed Race Challenges”, File History of related U.S. Appl. No. 13/644,044, filed Oct. 3, 2012.
- Sabrina Tai-Chen Yeh, David Andrew Young, “Altering Exercise Routes Based on Device Determined Information”, file history of related U.S. Appl. No. 14/037,286, filed Sep. 25, 2013.
- Sabrina Tai-Chen Yeh, Steven Friedlander, David Andrew Young, “Nonverbal Audio Cues During Physical Activity”, file history of related U.S. Appl. No. 14/037,278, filed Sep. 25, 2013.
- Sabrina Tai-Chen Yeh, Takashi Hironaka, David Andrew Young, Steven Friedlander, “Quick Login to User Profile on Exercise Machine”, file history of related U.S. Appl. No. 14/037,263, filed Sep. 25, 2013.
- Sabrina Tai-Chen Yeh, David Andrew Young, Steven Friedlander, “Determine Exercise Routes Based on Device Determined Information”, file history of related U.S. Appl. No. 14/037,276, filed Sep. 25, 2013.
- Sabrina Tai-Chen Yeh, Steven Friedlander, David Andrew Young, “Synchronized Exercise Buddy Headphones”, file history of related U.S. Appl. No. 14/037,267, filed Sep. 25, 2013.
- Sabrina Tai-Chen Yeh, David Andrew Young, Takashi Hironaka, Steven Friedlander, “Presenting Audio Based on Biometrics Parameters”, file history of related U.S. Appl. No. 14/037,271, filed Sep. 25, 2013.
- Sabrina Tai-Chen Yeh, Steven Friedlander, David Andrew Young, “Presenting Audio Video on Biometrics Parameters”, file history of related U.S. Appl. No. 14/037,252, filed Sep. 25, 2013.
- Sabrina Tai-Chen Yeh, Jenny Therese Fredriksson, “Combining Data Sources to Provide Accurate Effort Monitoring”, file history of related U.S. Appl. No. 14/037,224, filed Sep. 25, 2013.
- Sabrina Tai-Chen Yeh, Jenny Therese Fredriksson, “Intelligent Device Mode Shifting Based on Activity”, file history of related U.S. Appl. No. 14/037,228, filed Sep. 25, 2013.
- Sabrina Tai-Chen Yeh, David Andrew Young, Steven Friedlander, Determining Exercise Routes Based on Device Determined Information, related U.S. Appl. No. 14/037,276. Non-Final Office Action dated Mar. 31, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, Steven Friedlander, Determining Exercise Routes Based on Device Determined Information, related U.S. Appl. No. 14/037,276, Applicant's response to Non-Final Office Action filed Apr. 1, 2015.
- Julia Anne Framel, Aravind Babu Asam, Guru Prashanth Balasubramanian, “User Device Position Indication for Security and Distributed Race Challenges”, related U.S. Appl. No. 14/261,075, Applicant's response to Non-Final Office Action filed Mar. 30, 2015.
- Sabrina Tai-Chen Yeh, Takashi Hironaka, David Andrew Young, Steven Friedlander, “Quick Login to User Profile on Exercise Machine”, related U.S. Appl. No. 14/037,263, Non-Final Office Action dated May 27, 2015.
- Sabrina Tai-Chen Yeh, Takashi Hironaka, David Andrew Young, Steven Friedlander, “Quick Login to User Profile on Exercise Machine”, related U.S. Appl. No. 14/037,263, Applicant's response to Non-Final Office Action filed May 29, 2015.
- Julie Anne Framel, Aravind Babu Asam, Guru Prashanth Balasubramanian, Takeshi Suzuki, Charles D. Hedrick Jr.; “User Device Position Indication for Security and Distributed Race Challenges” related U.S. Appl. No. 13/644,044 non-final office action dated Apr. 14, 2014.
- Julie Anne Framel, Aravind Babu Asam, Guru Prashanth Balasubramanian, Takeshi Suzuki, Charles D. Hedrick Jr.; “User Device Position Indication for Security and Distributed Race Challenges” related U.S. Appl. No. 13/644,044 applicants response to non-final office action filed Apr. 24, 2014.
- Sabrina Tai-Chen Yeh, Steven Friedlander, David Andrew Young, “Presenting Audio Based on Biometrics Parameters”, related U.S. Appl. No. 14/037,252, Non-Final Office Action dated Aug. 4, 2015.
- Julia Anne Framel, Aravind Babu Asam, Guru Prashanth Balasubramanian, Takeshi Suzuki, Charles D. Hedrick, “User Device Position Indication for Security and Distributed Race Challenges”, related U.S. Appl. No. 14/261,075, Non-Final Office Action dated Mar. 23, 2015.
- Sabrina Tai-Chen Yeh, Jenny Therese Fredriksson, “Combining Data Sources to Provide Accurate Effort Monitoring” related U.S. Appl. No. 14/255,663 non-final office action dated Sep. 12, 2014.
- Sabrina Tai-Chen Yeh, Jenny Therese Fredriksson, “Combining Data Sources to Provide Accurate Effort Monitoring” related U.S. Appl. No. 14/255,663 applicants response to the non-final office action filed Oct. 30, 2014.
- Sabrina Tai-Chen Yeh, David Andrew Young, Steven Friedlander, “Determining Exercise Routes Based on Device Determined Information”, related U.S. Appl. No. 14/037,276, Final Office Action dated Jun. 15, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, Steven Friedlander, “Determining Exercise Routes Based on Device Determined Information”, related U.S. Appl. No. 14/037,276, Applicant's response to Final Office Action filed Jun. 17, 2015.
- Judith A. Markowitz, “Voice Biometrics: Who Are You? Your voice alone can be used to verify your personal identity, unobtrusively and invisibly.” Sep. 2000, vol. 43, No. 9, Communications of the ACM. http://www.web2.utc.edu/˜djy471/documents/voice-biometrics-p66-markowitz.pdf.
- Veli-Jussi Raitila, “Tag, you're it—NFC in a home environment”, TKK T-110.5190 Seminar on Internetworking, Finland, pp. 1-6, 2007. http://www.tml.tkk.fi/Publications/C/23/papers/Raitila—final.pdf.
- Performtek Sensor Technology, “PerformTek Technology, Monitor Fitness Metrics Using Earbud Sensor Technology” http://www.valencell.com/preformtek-sensor-technology, website printed Sep. 17, 2013.
- Steve Silverman, “Biometrics Exercises”, http://www.livestrong.com/article/282962-biometrics-exercises/, Mar. 31, 2011.
- Sabrina Tai-Chen Yeh, David Andrew Young, Takashi Hironaka, Steven Friedlander, “Nonverbal Audio Cues During Physical Activity”, related U.S. Appl. No. 14/037,271, Non-Final Office Action dated Jul. 2, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, Takashi Hironaka, Steven Friedlander, “Nonverbal Audio Cues During Physical Activity”, related U.S. Appl. No. 14/037,271, Applicant's response to Non-Final Office Action filed Jul. 13, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, “Altering Exercise Based on Device Determined Information”, related U.S. Appl. No. 14/037,286, Non-Final Office Action dated Aug. 28, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, “Altering Exercise Based on Device Determined Information”, related U.S. Appl. No. 14/037,286, Applicant's response to Non-Final Office Action filed Sep. 8, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, Steven Friedlander, “Quick Login to User Profile on Exercise Machine”, related U.S. Appl. No. 14/037,263, Final Office Action dated Aug. 20, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, Steven Friedlander, “Quick Login to User Profile on Exercise Machine”, related U.S. Appl. No. 14/037,263, Applicant's response to Final Office Action filed Aug. 25, 2015.
- Sabrina Tai-Chen Yeh, Jenny Therese Fredriksson, “Combining Data Sources to Provide Accurate Effort Monitoring”, related U.S. Appl. No. 14/255,663, Final Office Action dated Oct. 2, 2015.
- Sabrina Tai-Chen Yeh, Jenny Therese Fredriksson, “Combining Data Sources to Provide Accurate Effort Monitoring”, related U.S. Appl. No. 14/255,663, Applicant's response to Final Office Action filed Oct. 2, 2015.
- Sabrina Tai-Chen Yeh, Steven Friedlander, David Andrew Young, “Nonverbal Audio Cues During Physical Activity”, related U.S. Appl. No. 14/037,278 non-final office action dated Oct. 22, 2015.
- Sabrina Tai-Chen Yeh, Steven Friedlander, David Andrew Young, “Nonverbal Audio Cues During Physical Activity”, related U.S. Appl. No. 14/037,278 applicants response to non-final office action filed Oct. 23, 2015.
- Sabrina Tai-Chen Yeh, Jenny Therese Fredriksson, “Intelligent Device Mode Shifting Based on Activity”, related U.S. Appl. No. 14/037,228, non-final office action dated Oct. 26, 2015.
- Sabrina Tai-Chen Yeh, Jenny Therese Fredriksson, “Intelligent Device Mode Shifting Based on Activity”, related U.S. Appl. No. 14/037,228, applicant's response to non-final office action filed Oct. 27, 2015.
- Sabrina Tai-Chen Yeh, Takashi Hironaka, David Andrew Young, Steven Friedlander, “Quick Login to User Profile on Exercise Machine”, related U.S. Appl. No. 14/037,263, Non-Final Office Action dated Nov. 3, 2015.
- Sabrina Tai-Chen Yeh, Takashi Hironaka, David Andrew Young, Steven Friedlander, “Quick Login to User Profile on Exercise Machine”, related U.S. Appl. No. 14/037,263, Applicant's response to Non-Final Office Action filed Nov. 4, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, “Altering Exercise Based on Device Determined Information”, related U.S. Appl. No. 14/037,286, Final Office Action dated Dec. 3, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, Takashi Hironaka, Steven Friedlander, “Presenting Audio Based on Biometrics Parameters”, related U.S. Appl. No. 14/037,271, Applicant's response to Non-Final Office Action filed Nov. 25, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, Takashi Hironaka, Steven Friedlander, “Presenting Audio Based on Biometrics Parameters”, related U.S. Appl. No. 14/037,271, Final Office Action dated Nov. 24, 2015.
- Sabrina Tai-Chen Yeh, David Andrew Young, “Altering Exercise Routes Based on Device Determined Information”, related U.S. Appl. No. 14/037,286. Applicant's response to Final Office Action filed Dec. 8, 2015.
Type: Grant
Filed: Jan 22, 2014
Date of Patent: Feb 23, 2016
Patent Publication Number: 20150206413
Assignee: Sony Corporation (Tokyo)
Inventor: Jason Michael Warner (San Diego, CA)
Primary Examiner: Anh V La
Application Number: 14/160,871
International Classification: G08B 23/00 (20060101); G06Q 50/22 (20120101); A61B 5/11 (20060101); A61B 5/00 (20060101); A61B 5/0205 (20060101); G06F 19/00 (20110101); A61B 5/024 (20060101); A61B 5/145 (20060101); A63B 24/00 (20060101); G06F 3/16 (20060101); G06F 3/01 (20060101);