INTELLIGENT ACTIVITY MONITOR

- FitLinxx, Inc.

Apparatuses and methods are disclosed for identifying with a single, small activity monitor a particular type of activity from among a plurality of different activities. The monitor may include a multi-axis accelerometer and microcontroller configured to combine and process accelerometer data so as to generate features representative of an activity. The features may be processed to identify a particular activity (e.g., running, biking, swimming) from among a plurality of different activities that may include activities not performed by a human subject. The activity monitor may be configured to further process the features to calculate various parameters characterizing the activity (e.g., duration of activity, total steps, distance traveled, intensity of activity, and calories burned). The monitor may identify a location on a subject where the monitor is worn, and provide a measure of the quality of the processed data or calculated parameters. The activity monitor may be configured to execute a self-calibration routine for an activity that is based on a temporary cessation of motion during the activity. Low energy paradigms are used to extend battery-powered operation of the activity monitor.

Description
RELATED APPLICATIONS

This application claims the benefit of U.S. provisional application Ser. No. 61/566,528 titled “Intelligent Activity Monitoring,” filed on Dec. 2, 2011, which is incorporated herein by reference in its entirety.

FIELD

This disclosure relates generally to apparatuses and methods for detecting, identifying, and/or analyzing motion data obtained by an activity monitor that is configured to sense one or more activities that may be representative of human and/or non-human movement.

BACKGROUND

There currently exist small commercial devices that can be worn by a user to monitor an activity or two similar types of user activity. As an example, the FitLinxx® ActiPed+ (available from FitLinxx, Shelton, Conn., USA) is a small unit that can be clipped to a shoe and used to monitor walking and running activities by the user. When a user walks or runs, an on-board accelerometer outputs data that is stored on the unit for subsequent transmission to a computer system. The computer system can analyze the data to calculate various activity parameters (e.g., duration of activity, total steps, distance traveled, and calories burned). The calculated data can be stored, so that the user can easily maintain a record of exercise regimens.

SUMMARY OF EXAMPLE EMBODIMENTS

The embodiments relate to a small activity monitor configured to identify a particular type of activity from among a plurality of different activities. The monitor may include a single- or multi-axis accelerometer and at least one microcontroller configured to process accelerometer data so as to generate characteristic features representative of an activity. The features may be processed to identify a particular activity (e.g., running, biking, swimming) from among a plurality of different activities. The plurality of different activities may include activities not performed by a human subject. The activity monitor may be configured to further process the features to calculate various parameters characterizing the activity (e.g., duration of activity, total steps, distance traveled, intensity of activity, and calories burned). In some embodiments, the monitor may be configured to identify a location on a subject where the monitor is worn, and provide a measure of the quality of the processed data and/or calculated parameters. The activity monitor may be configured to execute a self-calibration routine for an activity that is based on a temporary cessation of motion during the activity. Low energy paradigms may be used to extend battery-powered operation of the activity monitor.

In some embodiments, an intelligent activity monitor may, for example, comprise a single or multi-axis accelerometer and at least one microcontroller or microprocessor. The term “microprocessor” is used herein to refer to either a microcontroller or microprocessor. The accelerometer and microprocessor may, for example, be disposed in a small package that can be supported by a subject in motion (e.g., affixed to an appendage or torso of an animate object, affixed to an item worn by an animate object, or affixed to a moving inanimate object). In some embodiments, the microprocessor may, for example, be configured to combine derivatives of acceleration data from at least two axes of the multi-axis accelerometer to form a combined acceleration-derivative value when the accelerometer is supported by the subject in motion. The microprocessor may, for example, be configured to compute acceleration-derivative values for one or more axes of an accelerometer by calculating time derivatives of acceleration data received from the accelerometer. A bit stream of acceleration-derivative values or combined acceleration-derivative values may be formed and processed by the microprocessor to determine one or more characteristic features from the bit stream. The characteristic features may be analyzed to determine an activity from among a plurality of different activities.

According to a first embodiment of an activity monitor, the activity monitor may comprise an accelerometer and a microprocessor configured to receive acceleration data from the accelerometer. The microprocessor may be configured to compute derivatives of first acceleration data received from a first axis of the accelerometer, and to form first acceleration-derivative values from the first acceleration data when the accelerometer is supported by a subject in motion. The microprocessor may further be configured to process the acceleration-derivative values to identify an activity of the subject from among a plurality of activities that may or may not be performed by the subject.

According to a first aspect of the first embodiment, the activity monitor may further include a power source, a transceiver for transmitting and receiving data, and power-management circuitry. The power management circuitry may be configured to sense an activity level (e.g., an amount of motion) of the activity monitor and apply power to or remove power from at least the microprocessor responsive to the sensed activity level of the activity monitor. The activity monitor may further include a strap or a clip for attaching the activity monitor to an article of clothing or strapping the activity monitor to a subject.

According to a second aspect, the activity identified by the activity monitor may be one activity from among a plurality of activities identifiable by the activity monitor. The plurality of activities may include two or more activities selected from the following group: walking, running, biking, swimming, using a type of exercise machine, rowing, cross-country skiing, jumping-jacks, sit-ups, push-ups, pull-ups, and jumping rope. In some implementations, the plurality of activities identifiable by the activity monitor may include a falsified human activity and/or an activity that a human is not capable of performing.

According to a third aspect, the accelerometer of the activity monitor may be a multi-axis accelerometer, and the microprocessor is further configured to compute derivatives of acceleration data received from each axis of the multi-axis accelerometer to form multi-axis acceleration-derivative values and to combine the multi-axis acceleration-derivative values according to the following relation:

D_n = \sum_{i=1}^{m} D_{i,n}

where Dn represents a combined acceleration-derivative value, Di,n represents an nth computed acceleration-derivative value for an ith axis of the multi-axis accelerometer, and m represents the number of accelerometer axes whose values are combined.
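
By way of illustration only, the following Python sketch (using hypothetical function names and sample values not taken from this disclosure) shows one way the above relation might be realized: the time derivative on each axis is approximated by successive sample differences, and the per-axis derivative values are summed to form each combined value Dn.

```python
def acceleration_derivatives(samples):
    """Approximate time derivatives of one axis as successive sample differences.

    samples: acceleration values for a single accelerometer axis, taken at a
    fixed sampling interval (the constant interval is factored out).
    """
    return [samples[n] - samples[n - 1] for n in range(1, len(samples))]


def combined_derivatives(axes):
    """Combine per-axis derivative values according to D_n = sum_i D_{i,n}.

    axes: list of per-axis sample lists, e.g., [x_samples, y_samples, z_samples].
    Returns the stream of combined acceleration-derivative values.
    """
    per_axis = [acceleration_derivatives(a) for a in axes]
    length = min(len(d) for d in per_axis)
    return [sum(d[n] for d in per_axis) for n in range(length)]


# Hypothetical three-axis samples taken at the same instants.
x = [0.1, 0.3, 0.6, 0.5]
y = [0.0, 0.2, 0.1, 0.4]
z = [1.0, 1.0, 0.9, 1.1]
print(combined_derivatives([x, y, z]))  # approximately [0.4, 0.1, 0.4]
```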

According to a fourth aspect, the activity monitor's microprocessor is configured to form the first acceleration-derivative values into a first stream of data, and to process the first stream of data to identify at least one characteristic feature of the first stream of data. The microprocessor may further be configured to provide the at least one characteristic feature to a fuzzy inference engine, and to analyze the at least one characteristic feature with the fuzzy inference engine to identify one activity from among a plurality of different activities. The identifying of at least one characteristic feature from the first stream of data may comprise extracting peak values and/or peak widths from the first data stream, and/or may comprise determining maximum and minimum values of the first data stream. The microprocessor may further be configured to provide the first acceleration data and/or the at least one characteristic feature to one of a plurality of activity analysis engines based on the identification of the one activity, and to calculate, with the one activity analysis engine, parameters characterizing the activity. The parameters may include one or more parameters selected from the following group: a measure of pace of the activity, a measure of energy expended during the activity, a measure of distance traveled during the activity, and a duration of the activity. The microprocessor may be further configured to calculate at least one quality metric for one or more of the parameters; the at least one quality metric may indicate a reliability of the one or more computed parameters.
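
Purely as a non-limiting sketch (Python, with a hypothetical peak threshold and feature names not taken from this disclosure), the characteristic features mentioned above (peak values, peak widths, and the maximum and minimum of the stream) might be extracted from a block of acceleration-derivative values as follows; the resulting feature set would then be passed to the fuzzy inference engine.

```python
def extract_features(stream, peak_threshold=0.5):
    """Extract simple characteristic features from a block of derivative values.

    Returns peak values, peak widths (in samples), and the block's max/min.
    peak_threshold is a hypothetical level above which a contiguous run of
    samples is treated as a single peak.
    """
    peaks, widths = [], []
    run_max, run_len = None, 0
    for value in stream:
        if value > peak_threshold:
            run_len += 1
            run_max = value if run_max is None else max(run_max, value)
        elif run_len:                      # a peak just ended
            peaks.append(run_max)
            widths.append(run_len)
            run_max, run_len = None, 0
    if run_len:                            # peak still open at the end of the block
        peaks.append(run_max)
        widths.append(run_len)
    return {
        "peak_values": peaks,
        "peak_widths": widths,
        "max": max(stream),
        "min": min(stream),
    }
```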

Any of the foregoing aspects and features of the aspects related to the first embodiment of the activity monitor may be implemented in any combination with the first embodiment to form various additional embodiments of intelligent activity monitors that are considered to be within the scope of the invention.

Various methods associated with an intelligent activity monitor are also described. According to a first method embodiment, a method for analyzing accelerometer data comprises computing, by at least one microprocessor, derivatives of first acceleration data received from an accelerometer to form first acceleration-derivative values. The first acceleration data may be representative of acceleration along a first axis of the accelerometer. The method may further include processing the first acceleration-derivative values to identify an activity sensed by the accelerometer.

According to a first aspect of the first method embodiment, the method may further comprise computing derivatives of at least second acceleration data received from the accelerometer to form at least second acceleration-derivative values. The at least second acceleration data may be representative of acceleration along at least a second axis of the accelerometer. The method may further include combining the first and at least second acceleration-derivative values according to the following relation:

D_n = \sum_{i=1}^{m} D_{i,n}

where Dn represents a combined acceleration-derivative value, Di,n represents an nth computed acceleration-derivative value for an ith axis of the multi-axis accelerometer, and m represents the number of accelerometer axes whose values are combined.

According to a second aspect, the accelerometer may be disposed in an activity monitor, and the method may further comprise clipping the activity monitor to an article of clothing or attaching the activity monitor to a subject. The method may also comprise forming a first data stream of the first acceleration-derivative values over a first period of time, identifying at least one characteristic feature of the first data stream, providing the at least one characteristic feature to a fuzzy inference engine, and analyzing the at least one characteristic feature with the fuzzy inference engine to identify one activity from among a plurality of different activities. Identifying at least one characteristic feature of the first data stream may comprise extracting peak values and/or peak widths from the first data stream, determining maximum and minimum values of the first data stream, and/or calculating an average value of acceleration from the first acceleration data for the first period of time.
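
As a non-limiting illustration of how a fuzzy inference engine might evaluate such characteristic features, the following Python sketch uses trapezoidal membership functions with hypothetical shapes, feature names, and activity entries (none of which are taken from this disclosure); the activity whose features jointly yield the highest membership is selected.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, rises to 1 on [b, c], falls to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)


# Hypothetical membership functions per activity, keyed by feature name.
MEMBERSHIP = {
    "walking": {"peak_rate_hz": (0.5, 1.0, 2.5, 3.0), "mean_peak": (0.2, 0.4, 1.0, 1.5)},
    "running": {"peak_rate_hz": (2.0, 2.5, 4.0, 5.0), "mean_peak": (1.0, 1.5, 3.0, 4.0)},
}


def identify_activity(features):
    """Return the activity with the highest combined (min-rule) membership."""
    scores = {}
    for activity, funcs in MEMBERSHIP.items():
        memberships = [trapezoid(features[name], *params)
                       for name, params in funcs.items()]
        scores[activity] = min(memberships)      # fuzzy AND across features
    best = max(scores, key=scores.get)
    return best, scores[best]


print(identify_activity({"peak_rate_hz": 2.7, "mean_peak": 1.8}))  # ('running', 1.0)
```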

According to a third aspect, the method may further include determining that the at least one characteristic feature is representative of an activity not recognized by the fuzzy inference engine. For example, the fuzzy inference engine may not be able to recognize the activity from among the plurality of activities that can be recognized by the fuzzy inference engine. The method may further include receiving machine-readable instructions and data enabling the fuzzy inference engine to subsequently recognize and identify an activity corresponding to the at least one characteristic feature for which an activity could not previously be recognized by the fuzzy inference engine. Accordingly, the fuzzy inference engine may be configured to be trained by a user to recognize new activities.

According to a fourth aspect, the fuzzy inference engine may be configured to use historical data specific to a user in evaluating the at least one characteristic feature when identifying one activity from among the plurality of different activities. User-specific historical data may enable higher reliability in the identification of an activity.

According to a fifth aspect, the method may further include providing the first acceleration data and/or the at least one characteristic feature to one of a plurality of activity analysis engines based on the identification of the one activity, and calculating, with the one activity analysis engine, parameters characterizing the activity. The parameters may include one or more parameters selected from the following group: a measure of pace of the activity, a measure of energy expended during the activity, a measure of distance traveled during the activity, and a duration of the activity. The method may further include calculating at least one quality metric for one or more of the parameters. The at least one quality metric may indicate a reliability of the one or more parameters.
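
For illustration only, a walking analysis engine of the kind described above might compute such parameters and an accompanying quality metric as in the following Python sketch; the stride length, calorie constant, and quality heuristic are placeholders, not values taken from this disclosure.

```python
def analyze_walking(step_times_s, stride_length_m=0.75, kcal_per_step=0.04):
    """Hypothetical walking engine: derive activity parameters and a quality metric.

    step_times_s: timestamps (in seconds) of detected steps during the activity.
    stride_length_m and kcal_per_step are placeholder calibration constants.
    """
    if len(step_times_s) < 2:
        raise ValueError("need at least two detected steps")
    duration_s = step_times_s[-1] - step_times_s[0]
    steps = len(step_times_s)
    distance_m = steps * stride_length_m
    pace_spm = 60.0 * steps / duration_s              # steps per minute
    calories = steps * kcal_per_step

    # Quality metric: regular step intervals suggest reliable detection.
    intervals = [t2 - t1 for t1, t2 in zip(step_times_s, step_times_s[1:])]
    mean_interval = sum(intervals) / len(intervals)
    spread = max(intervals) - min(intervals)
    quality = max(0.0, 1.0 - spread / mean_interval)  # 1.0 regular, 0.0 erratic

    return {"duration_s": duration_s, "steps": steps, "distance_m": distance_m,
            "pace_spm": pace_spm, "calories": calories, "quality": quality}
```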

Any of the foregoing aspects and features of the aspects related to the first method embodiment may be implemented in any combination with the first method embodiment to form various additional method embodiments for intelligent activity monitors that are considered to be within the scope of the invention.

The identified activity may or may not be an activity performed by an animate subject. In some embodiments, the plurality of different activities may include two or more activities selected from the following group: walking, running, biking, swimming, using a type of exercise machine, rowing, cross-country skiing, jumping-jacks, sit-ups, push-ups, pull-ups, and jumping rope. According to some embodiments, the plurality of different activities may include falsified human activities and/or activities that a human is not capable of performing.

In additional embodiments, for example, a method may comprise receiving first motion data from an accelerometer of an activity monitor supported by a human performing an activity, and processing the received first motion data, by at least one microprocessor, to identify a type of activity performed by the human. In certain embodiments, such a method may, for example, further comprise receiving second motion data from the accelerometer, and processing the received second motion data, by the at least one microprocessor, to identify the type of motion as a non-human activity. In some embodiments, the non-human activity may, for example, be identified as a specific type of activity. Such a non-human activity may, for example, be a type of activity executed to mimic or falsely represent a human activity.

In some embodiments, a method for processing data from an activity monitor may, for example, comprise receiving motion data from an accelerometer of the activity monitor, wherein the activity monitor is attached to a subject. In certain embodiments, such a method may, for example, further include processing the received motion data, by at least one microprocessor, to determine a type of activity, and determining that the activity monitor is attached to the subject at a first location. Moreover, in some embodiments, the method may additionally or alternatively comprise compensating the received motion data with historical calibration data so that the received motion data becomes representative of the activity when the activity monitor is attached to the subject at a second location.

In some embodiments, a method of automatic calibration may be employed that comprises, for example, acts of receiving motion data from an accelerometer of the activity monitor, wherein the activity monitor is attached to a subject, and processing the received motion data, by at least one microprocessor, to determine a stopping of motion of the activity monitor in at least one direction. In certain embodiments, such automatic calibration may, for example, further comprise receiving subsequent data from the accelerometer for a time window subsequent to the stopping of the motion, and processing the subsequent data to determine a calibration for use in the activity monitor to improve the accuracy of activity monitor calculations that are based on received motion data.
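
A minimal sketch of such an automatic calibration, written in Python and assuming only that the accelerometer reports roughly constant values (gravity alone) while motion is stopped, might detect a quiet window and derive per-axis rest values; the window length and tolerance are hypothetical.

```python
def detect_quiet_window(xyz_samples, window=32, tolerance=0.05):
    """Return the first run of `window` samples whose per-axis spread stays
    within `tolerance` g, i.e., a temporary cessation of motion."""
    for start in range(len(xyz_samples) - window + 1):
        block = xyz_samples[start:start + window]
        if all(max(s[axis] for s in block) - min(s[axis] for s in block) <= tolerance
               for axis in range(3)):
            return block
    return None


def calibrate(xyz_samples):
    """During a quiet window only gravity should be present; use the mean of each
    axis over that window as its rest value for correcting later calculations."""
    block = detect_quiet_window(xyz_samples)
    if block is None:
        return None                      # no cessation of motion found
    return tuple(sum(s[axis] for s in block) / len(block) for axis in range(3))
```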

The foregoing and other aspects, embodiments, and features of the present teachings can be more fully understood from the following description in conjunction with the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

The skilled artisan will understand that the figures, described herein, are for illustration purposes only. It is to be understood that in some instances various aspects of the invention may be shown exaggerated or enlarged to facilitate an understanding of the invention. In the drawings, like reference characters generally refer to like features (e.g., functionally similar and/or structurally similar elements) throughout the various figures. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the teachings. The drawings are not intended to limit the scope of the present teachings in any way.

FIG. 1A depicts an illustrative example of an intelligent activity monitor 100 supported by a subject 50. In some embodiments, the activity monitor may be supported at locations on the subject other than the foot or ankle.

FIG. 1B depicts examples of components that may be included in an activity monitor such as that shown in FIG. 1A;

FIG. 1C is a block diagram illustrating examples of selected electrical components that may be included in an activity monitor such as that shown in FIGS. 1A-B;

FIG. 2 is an illustrative example of a state diagram for low-energy operation of an activity monitor;

FIG. 3A depicts an example of an architecture for data flow and data handling elements of an activity monitor;

FIG. 3B depicts another example of an architecture for data flow and data handling elements of an activity monitor;

FIGS. 4A-4E depict illustrative examples of multi-axis accelerometer data and combined acceleration-derivative data for various types of activities;

FIGS. 5A-5C illustrate examples of membership functions that may be used in fuzzy-logic identification of activities;

FIGS. 6A-6C depict illustrative examples of multi-axis accelerometer data and combined acceleration-derivative data for a walking activity with the monitor supported at different locations on a subject;

The features and advantages of the present invention will become more apparent from the detailed description set forth below when taken in conjunction with the drawings.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

I. Overview

We have recognized there currently exists no commercial product for detecting, identifying, and analyzing, in a small single unit, different types of human activities (e.g., running, swimming, biking, etc.), as well as for detecting and identifying non-human activities. We have appreciated, moreover, that there has been significant research and development in the areas of pattern matching, neural networks, and gesture recognition in recent years. Such technology may be used for image recognition, speech recognition, etc., and may find applications in activity recognition by a machine. We have recognized, however, that in a vast majority of these “machine recognition” cases, the algorithms can be computationally expensive and may require advanced signal processing methods which would be difficult to implement on a small processor that could be incorporated in a small package worn by a user, such as a package sized and configured to be worn on a user's wrist. Though integer math may be used at the expense of accuracy for such calculations as one method to reduce computational burden, the energy consumption requirements by a processor configured to carry out such recognition algorithms still may be prohibitively large and require a user to either frequently recharge the device or replace its batteries.

This disclosure relates to a small, low-power, electronic activity monitor that may be attached to a user, and that may, for example, be configured to detect and identify a specific activity type from among a plurality of different activity types. Also disclosed are data-processing architectures and algorithms that may be used, for example, to process data generated by such an activity monitor, as well as low-power operation paradigms for the activity monitor. In certain embodiments, an on-board microcontroller may, for example, receive and process data from a tri-axis accelerometer by employing a data-reduction algorithm that combines outputs from multiple axes of the accelerometer to identify a specific type of activity, including both human and non-human activities, from among a plurality of activities. In some implementations, the data-reduction algorithm that is employed may, for example, reduce the computational load on the processor and reduce overall power consumption of the activity monitor when identifying activity types and/or analyzing data from an identified activity.

We have recognized that a small, intelligent activity monitor that is configured, for example, to distinguish between a plurality of different user activities may be useful for a wide variety of applications including, but not limited to, fitness monitoring and evaluation, health and medical applications (e.g., patient monitoring or tracking rehabilitation progress), validating fitness-based health-insurance discounts, monitoring animal activity, monitoring human special service activities (e.g., monitoring activities of fire, rescue, or police personnel). We have also recognized that for some applications, for such a device to be widely adopted, it is preferable that the device not be overly cumbersome or hindering when attached to a user, that it have a long battery life, and that it be able to distinguish between a plurality of human as well as non-human activities. Further, we have determined that it may be advantageous for some applications for the data reported by the activity monitor to be accompanied by a measure of quality or confidence of the data, so that partial credit may be given for an activity rather than an all-or-nothing approach.

We have also recognized that in at least some circumstances useful results may be obtained from an activity monitor when it is worn or supported at any of various locations on a person (e.g., on the leg, below the knee, on the shoe, on the belt, in pants pockets, in shirt pockets, and clipped to chest straps or women's undergarments, etc.) and that the data signals produced by such a sensor may vary considerably depending upon the location of the device on the person. Locations at which the activity monitor may be worn may affect the measured signals, for example, both in shape and amplitude. We have realized that signals received from the activity monitor may be processed, for example, to determine where on the body the device is being worn in addition to determining the type of activity. By knowing where the device is located on the person's body, the quality or usefulness of the measured data may, for example, be improved algorithmically.

The constraints of a small yet sophisticated device with a long battery life may pose difficult technical challenges for an activity monitor. For example, sophisticated hardware and computational power to distinguish between similar and different types of activities via machine recognition, small size (e.g., size of a wristwatch or smaller), and low power consumption may be considerations for certain applications. In order to meet these challenges, an illustrative embodiment of an intelligent activity monitor described herein employs a combination of low-power techniques, a small on-board and variable clock-rate processor, and specially developed data-processing algorithms, and is thus suitable for a wide variety of applications as described above.

Various embodiments of apparatus, methods and systems for an intelligent activity monitor are described in further detail in the following sections of the specification. In overview and with reference to FIG. 1A, an intelligent activity monitor 100 may, as shown, comprise a small electronic device that can be attached to or supported by a subject 50, and that may be configured to identify a type of activity from among a plurality of different activities that may be performed by the subject, and to process data received during an identified activity to calculate one or more parameters representative of the activity. In various embodiments, the subject 50 may be human or non-human, animate or inanimate. In certain embodiments, the activity monitor 100 may be supported by a subject 50 in any suitable manner (e.g., strapped to an ankle or wrist like a watch, or attached to an article of clothing worn by the subject, clipped to a spoke on a wheel) at any suitable location on a subject. There may, in some embodiments, be a wide variety of different activities that can be identified by the activity monitor (e.g., walking, running, biking, swimming, using an elliptical workout machine, rowing, cross-country skiing, performing jumping jacks, performing athletic drills, and jumping rope). In some embodiments, the intelligent activity monitor 100 may additionally or alternatively be configured to identify “fake” or falsified activities, e.g., activities alleged to be human activities that are not common to or not capable of being performed by a human subject 50.

In some embodiments, the intelligent activity monitor may, for example, comprise an accelerometer capable of generating one or more data streams representative of motion of the activity monitor and a programmable microprocessor along with machine-readable instructions adapted to analyze the one or more data streams to identify the type of activity performed by the subject 50 as well as identify a fake activity, and to process data received during an identified activity. In certain embodiments, the intelligent activity monitor 100 may further be configured to communicate (e.g., exchange data wirelessly or via a cabled communication port) with a remote device such as a computer or data-processing device in a network of computers. In some embodiments, the intelligent activity monitor may additionally be adapted for low-power operation such that it can operate for days in some embodiments, months in some embodiments, or even years in some embodiments, between recharging or replacement of its power source.

There are a wide variety of applications for which various embodiments of the intelligent activity monitor may be useful. Examples of such applications include, but are not limited to, a monitor for general health and fitness purposes, monitoring rehabilitation activities of a human or animal subject, an aide to monitor training regimens for competitive and/or elite athletes, an aide to monitor training regimens for animals trained for competition, a monitor for exercise validation in incentive-based health-insurance discount programs, a motion monitor for inanimate moving objects, gait analysis, and workplace occupational conformance.

II. Example Apparatus

Referring now to FIG. 1B, an exploded view of an illustrative example of an activity monitor 100 that may be employed in certain embodiments is shown. The activity monitor 100 may, for example, comprise an enclosure that includes a first cover 170 and a second cover 172. The first and second covers may be formed from any suitable materials including, but not limited to, metals and plastics or combinations thereof. As one example, the first cover 170 may be a molded plastic and the second cover 172 a corrosion-resistant metal. The first and second covers may be fastened together by any suitable means to form a water-tight seal and to enclose a power source 105 and electronic circuitry 180 of the activity monitor. A clip or strap 174 may be disposed on or attached to a surface of one of the covers so that the activity monitor 100 may be attached to or supported by a subject or machine (e.g., strapped to a wrist, ankle, or appendage; clipped to an article of clothing; or strapped or clipped to a movable portion of a machine).

As shown, the electronic circuitry 180 may comprise a combination of circuit elements 182 disposed on a printed circuit board. In various embodiments, the circuit elements 182 may, for example, include a combination of integrated circuit (IC) chips, application-specific integrated circuit (ASIC) chips, at least one microcontroller or microprocessor, micro-electrical-mechanical system (MEMS) devices, resistors, capacitors, inductors, diodes, light-emitting diodes, transistors, and/or conductive circuit traces, etc. A microcontroller or microprocessor may, for example, coordinate and manage operation of the monitor's electronic circuitry. In some embodiments, the electronic circuitry 180 may further include at least one radio-frequency (RF) antenna 185 for use in sending and receiving RF communication signals.

FIG. 1C depicts an example embodiment of internal circuitry 102 that may be used in an intelligent activity monitor 100 in further detail. As shown, the monitor's circuitry may, for example, comprise a source of power 105 (e.g., a battery or energy-scavenging chip) and a wake-up and power-management circuit 150 that provide and manage power to an accelerometer 130, a microprocessor or microcontroller 110, memory 120, and a transceiver 140. The microcontroller 110 may be coupled to the wake-up circuit, the accelerometer, the memory, and the transceiver. The microcontroller may be configured to receive and process acceleration data from the accelerometer 130, to read and write data to memory 120, and to send and receive data from the transceiver 140. The wake-up circuit 150 may be adapted to sense when the activity monitor 100 is not in use, and in response, reduce power consumption of the internal circuitry 102. The wake-up circuit may be further adapted to sense when the activity monitor 100 is placed in use, and in response, activate one or more elements of the internal circuitry 102.

In some embodiments, the microprocessor or microcontroller 110 may, for example, comprise a low-power, 8-bit microcontroller configured to draw low power in sleep-mode operation, and capable of operating at multiple millions of instructions per second (MIPS) when activated. One example of a suitable microcontroller is the 8051F931 microcontroller available from Silicon Laboratories Inc. of Austin, Tex., though any other suitable microcontroller or microprocessor may alternatively be employed in other embodiments. The microcontroller 110 may, for example, include various types of on-board memory (e.g., flash memory, SRAM, and XRAM) for storing data and/or machine-readable instructions, and may be clocked by an internal oscillator or external oscillator. In some embodiments, the microcontroller may, for example, be clocked by an internal high-frequency oscillator (e.g., an oscillator operating at about 25 MHz or higher) when the microcontroller is active and processing data, and alternatively clocked by a low-frequency external oscillator when the microcontroller is substantially inactive and in sleep mode. The clocking of the microcontroller at low frequency may, for example, reduce power consumption by the microcontroller during sleep mode.

In various embodiments, the microcontroller 110 may be configured to receive acceleration data from accelerometer 130 and process the received data according to pre-programmed machine-readable instructions. The microcontroller 110 may, for example, be configured to receive analog and/or digital input data, and may include on-board analog-to-digital and digital-to-analog converters and on-board timers or clocks. In some embodiments, the microcontroller may be further configured to receive power through wake-up and power management circuitry 150. The microcontroller may, for example, cooperatively operate with or comprise a portion of power management circuitry 150, and assist in the activating and deactivating of one or more circuit elements within the activity monitor.

In some embodiments, the microcontroller 110 may be configured to be operable at a number of different clock frequencies. When operating at a low clock frequency, the microcontroller will typically consume less power than when operating at a high clock frequency. In some embodiments, the microcontroller may, for example, be configured to be in a “sleep” mode and operating at a low clock frequency when there is no motion of activity monitor 100, and to be cycled through several operating states when motion of the activity monitor 100 is detected. An example of how a microcontroller may be cycled in such manner will be described below in reference to FIG. 2. As one example, when in sleep mode, the microcontroller may sample data at a rate less than 10 Hz and draw less than about 30 microamps.

In some embodiments, accelerometer 130 may comprise a multi-axis accelerometer configured to sense acceleration along at least two substantially orthogonal spatial directions. The accelerometer 130 may, for example, comprise a three-axis accelerometer based on micro-electro-mechanical systems (MEMS) technology. In some implementations, one or more single-axis accelerometers may additionally or alternatively be used. In some embodiments, the accelerometer 130 may be configured to provide one or more analog data-stream outputs (e.g., X, Y, Z data outputs corresponding to each axis of the accelerometer) that are each representative of a magnitude and direction of acceleration along a respective axis. One example of a suitable accelerometer is the Kionix model KXSC7 accelerometer available from Kionix Inc., Ithaca, N.Y. The accelerometer 130 may, for example, provide analog output data that may later be converted to digital data, or may provide digital output data representative of acceleration values.

The accelerometer 130 may be characterized by several parameters. Among these parameters may, for example, be a sensitivity value and a sampling rate value. As examples, the accelerometer's analog sensitivity may be between about 100 millivolts (mV) per gravitational value (100 mV/G) and about 200 mV/G in some embodiments, between about 200 mV/G and about 400 mV/G in some embodiments, between about 400 mV/G and about 800 mV/G in some embodiments, and yet between about 800 mV/G and about 1600 mV/G in some embodiments. When configured to provide a digital output, the sampling rate of the accelerometer may, for example, be between about 10 samples per second per axis (10 S/sec-A) and about 20 S/sec-A in some embodiments, between about 20 S/sec-A and about 40 S/sec-A in some embodiments, between about 40 S/sec-A and about 80 S/sec-A in some embodiments, between about 80 S/sec-A and about 160 S/sec-A in some embodiments, between about 160 S/sec-A and about 320 S/sec-A in some embodiments, and yet between about 320 S/sec-A and about 640 S/sec-A in some embodiments. It will be appreciated that in some embodiments the higher sampling rates may improve the quality of the measured accelerations.

It will be appreciated that, in some embodiments, an accelerometer 130 may be combined with one or more analog-to-digital converters to provide digital output data representative of acceleration values at sampling rates described above. When digital output data is provided by an accelerometer, the accelerometer's sensitivity may be expressed in units of bits per gravitational constant (b/G). As examples, an accelerometer providing digital output data may have a sensitivity of more than about 2 b/G in some embodiments, more than about 4 b/G in some embodiments, more than about 6 b/G in some embodiments, more than about 8 b/G in some embodiments, more than about 10 b/G in some embodiments, more than about 12 b/G in some embodiments, or even higher values in some embodiments.
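
As a simple illustration of how the sensitivity figures above might be applied (Python, with placeholder supply voltage, zero-g offset, resolution, and counts-per-g values that are not device specifications), raw analog or digital accelerometer readings could be converted to units of g as follows.

```python
def analog_counts_to_g(adc_counts, vref_mv=3000.0, adc_bits=10,
                       sensitivity_mv_per_g=400.0, zero_g_mv=1500.0):
    """Convert an ADC reading of an analog accelerometer output to g.

    Assumes an output centered at zero_g_mv with the stated sensitivity;
    all numeric defaults here are placeholders, not device specifications.
    """
    millivolts = adc_counts * vref_mv / (2 ** adc_bits - 1)
    return (millivolts - zero_g_mv) / sensitivity_mv_per_g


def digital_counts_to_g(counts, counts_per_g=64):
    """Convert a signed digital accelerometer count to g (here, 64 counts per g)."""
    return counts / counts_per_g


print(analog_counts_to_g(784))     # approximately +2.0 g with these placeholders
print(digital_counts_to_g(-128))   # -2.0 g at 64 counts per g
```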

In some embodiments, the activity monitor 100 may include memory 120 that is external to and accessible to the microcontroller 110. The memory 120 may be any one of or combination of the following types of memory: RAM, SRAM, DRAM, ROM, flash memory. The memory 120 may, for example, be used to store and/or buffer raw data from accelerometer 130, machine-readable instructions for microcontroller 110, program data used by the microcontroller for processing accelerometer data, and/or activity data resulting from the processing of accelerometer data. In some embodiments, the memory 120 may additionally or alternatively be used to store diagnostic information about the health of the activity monitor, e.g., battery life, error status, etc., and/or physical parameters about the device, e.g., memory size, gravitational sensitivity, weight, battery model, processor speed, version of operating software, user interface requirements, etc. In some embodiments, the memory may also be used to store information pertinent to a user, e.g., user weight, height, gender, age, training goals, specific workout plans, activity-specific data for a user that may be used to identify an activity performed by the user or process data representative of an identified activity.

In some embodiments, the memory 120 may additionally or alternatively be used to store data structures and/or code received from an external device, e.g., via a wired or wireless link. The data structures and/or code may, for example, be used to update one or more data processing applications used by the activity monitor. For example, one type of data structure may be data representative of an acceleration data pattern that may be used to identify a specific type of activity not previously recognized by the activity monitor, e.g., a new activity or an activity that is specific to an individual user of the activity monitor. As another example, a data structure may comprise a membership function, described below, defined for a new activity or redefined for an identifiable activity. The data structure may, for example, include one or more sample accelerometer traces obtained from the activity and/or may comprise identification data (e.g., membership functions) resulting from the processing of the accelerometer traces that may be used in an algorithm executed by the activity monitor 100 to identify the activity. Further, in some embodiments, the memory 120 may be used to store updates and/or replacements to algorithms executed by the activity monitor. The stored data structures and algorithms may, for example, be used to reprogram and/or expand the functionality of the activity monitor 100 to identify new activities or activities not previously recognized by the activity monitor and/or improve the accuracy or confidence of results calculated for identified activities.

In some embodiments, the memory 120 may also be used to store calibration and/or conversion data that is used by the microcontroller 110 to characterize detected activities. Calibration data may, for example, be used to improve the accuracy of detected activity parameters, e.g., stride length, speed. Conversion data may, for example, be used to convert a detected activity into an amount of expended human energy, e.g., calories burned, metabolic equivalents, etc.

In some embodiments, the activity monitor 100 may include a transceiver 140 and/or one or more data communication ports (e.g., a USB port, an RF communication port, a Bluetooth port) for communicating data between the activity monitor and an external device such as a computer, tablet, cell phone, portable communication device, or data processor, any of which may be configured to communicate with other similar devices in a network such as the world-wide web or a local area network. The activity monitor 100 may, for example, be configured to communicate via the transceiver 140 through a wired or wireless port to any device or combination of devices selected from the following list: a personal computer, laptop computer, tablet computer, PDA, a watch, an MP3 player, an iPod, a mobile phone, a medical device such as a blood glucose meter, blood pressure monitor, or INR meter, an electronic interactive gaming apparatus, and an automobile system. Data retrieved from the memory 120 or to be stored in memory 120 may, for example, be communicated between the activity monitor 100 and an external device via the transceiver 140. In some embodiments, data transmitted from the activity monitor 100 may be configured for routing to a data service device adapted to process data from the activity monitor.

In some embodiments, power for the internal electronics of the activity monitor 100 may be provided by a battery 105 and managed by a wake-up and power-management circuit 150. The battery may, for example, comprise one or more lithium-type batteries that may be rechargeable or replaceable. As an example, a single lithium coin or button cell 3-volt battery having a capacity of about 230 mAh may be used (model CR2032 available from Renata SA of Itingen, Switzerland), though any suitable type of battery may alternatively be used in other embodiments. In some embodiments, the activity monitor may include power-generation or energy-harvesting hardware (e.g., a piezo-electric material or electric generator configured to convert mechanical motion into electric current, a solar cell, or an RF or thermal converter), the output of which may be stored in a battery or charge-storage component such as a supercapacitor. In some implementations, generated electrical current may be provided to a storage component via a diode bridge. One example of a suitable energy-harvesting device is the MEC225 microenergy cell available from Infinite Power Solutions, Inc. of Littleton, Colo. In some embodiments, power-generation components may be used in combination with a rechargeable battery as a source of power for the activity monitor 100.

In some embodiments, wake-up and power-management circuitry 150 may include a motion sensor 152 that, in combination with the wake-up and power-management circuitry 150, identifies when the activity monitor 100 is being moved in a manner that may be representative of an activity to be monitored. The wake-up and power-management circuitry 150 may, for example, comprise logic and control circuitry to enable, disable, reduce and/or increase power to various circuit elements shown in FIG. 1C. Logic and control circuitry for the wake-up and power-management circuitry may, for example, comprise machine-readable instructions and utilized hardware of the microcontroller 110, or may comprise machine-readable instructions and utilized hardware of an application specific integrated circuit.

In some embodiments, the motion sensor 152 may comprise one or more force sensitive switches, e.g., a piezo element configured to generate an electric signal representative of an amount of acceleration that the activity monitor experiences. In other embodiments, the motion sensor 152 may additionally or alternatively comprise one or more contact switches that close a circuit, or open a circuit, when the activity monitor is subjected to an acceleration, e.g., a “ball-in-tube” switch that is not force sensitive. Wake-up may, for example, be initiated when a frequency of switch closures exceeds a pre-selected value. In other embodiments, the sensor 152 may additionally or alternatively comprise one or more force-sensitive contact switches that close only when the activity monitor undergoes acceleration in excess of a pre-selected value.
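
A minimal sketch of the closure-frequency criterion (Python; the closure count and time window below are hypothetical, not values from this disclosure) might count switch closures within a sliding interval and signal wake-up when the pre-selected rate is exceeded.

```python
from collections import deque


class WakeupDetector:
    """Signal wake-up when switch closures exceed a rate threshold.

    closures_per_window and window_s are placeholder values, not taken
    from this disclosure.
    """

    def __init__(self, closures_per_window=4, window_s=2.0):
        self.closures_per_window = closures_per_window
        self.window_s = window_s
        self.timestamps = deque()

    def on_closure(self, t_s):
        """Record a switch closure at time t_s; return True if wake-up is due."""
        self.timestamps.append(t_s)
        # Drop closures that have fallen out of the sliding window.
        while self.timestamps and t_s - self.timestamps[0] > self.window_s:
            self.timestamps.popleft()
        return len(self.timestamps) >= self.closures_per_window


detector = WakeupDetector()
for t in (0.1, 0.6, 1.1, 1.6):           # four closures within two seconds
    woke = detector.on_closure(t)
print(woke)                               # True
```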

In some embodiments, the wake-up and power-management circuitry 150 may be configured to cycle the activity monitor through a plurality of operational states, as depicted in FIG. 2 for example. The operational state to which the activity monitor is cycled may, for example, depend upon motion detected by the wake-up and power-management circuitry. Some of the operational states may, for example, be power-conserving or low-power states.

As shown in FIG. 2, in some embodiments, there may be one low-power or no-power state 210 and one or more powered operational states 230, 250, 270. The powered operational states may, for example, include low- or power-conserving states. As illustrated, the activity monitor 100 may move from any one state to any other state along paths 220, 240, 260, 280, 215. In some embodiments, there may be paths in addition to or in lieu of those shown in FIG. 2, e.g., directly from step detect state 270 to wake qualify state 230.

In some embodiments, when the activity monitor 100 is inactive (exhibiting no motion or motion less than a first pre-selected limit or threshold), the activity monitor may operate in sleep mode 210. In some embodiments, sleep mode may consume no power. In other implementations, however, sleep mode may consume low power, e.g., about 1 microamp or less. The low power may, for example, be provided to the motion sensor 152 for detecting a motion of the activity monitor 100. In some embodiments, no power is provided to any or all of the accelerometer 130, the microcontroller 110, memory 120, and transceiver 140 while in sleep mode. When sufficient motion has been detected via the motion sensor 152, the activity monitor 100 may be moved to a wake qualify state 230.

In some embodiments, sufficient motion for moving the monitor out of the sleep state 210 may be detected via motion sensor 152 according to an amplitude and/or frequency of a signal from the motion sensor. For example, when the motion sensor comprises one or more piezo-electric elements, a signal greater than a pre-defined signal value may be used to identify sufficient motion of the activity monitor 100 and move the monitor to wake qualify state 230. In another example, the motion sensor 152 may comprise one or more contact switches, and a pre-defined number of switch openings, or closings, per pre-defined time interval may be used to identify sufficient motion of the activity monitor and move the monitor to wake qualify state 230.

In some embodiments, when in the wake qualify state 230, low levels of power (e.g., less than about 30 microamps) are consumed by the activity monitor 100. Power may, for example, be provided to the accelerometer and to the microcontroller so that data from the accelerometer may be processed while in the wake qualify state 230. Power may, in some embodiments, also be provided to memory 120 while in the wake qualify state. In some embodiments, when in the wake qualify state 230, the system may be clocked at a low rate to conserve power. For example, the microcontroller 110 may be clocked at a low frequency and/or the accelerometer sampled at a low frequency (e.g., less than about 10 Hz) to obtain and process data from the accelerometer. The data may, for example, be processed to determine whether a predefined second threshold has been exceeded. In some embodiments, the second threshold may comprise a pre-defined amount of force or acceleration that the activity monitor 100 is subjected to, and may include a further parameter, e.g., a frequency of occurrence of measured values exceeding the pre-defined amount of acceleration. When the second threshold is crossed, the activity monitor may, for example, be moved to a step qualify state 250. If the second threshold is not crossed within a pre-defined period of time, the activity monitor may, for example, return to sleep mode 210.

In some embodiments, when in a step qualify state 250, more power is provided to the accelerometer 130 and/or microcontroller 110, so that a greater amount of data processing can occur. The accelerometer and/or microcontroller may, for example, be operated at a higher clock frequency so that a greater amount of data may be obtained from the accelerometer and processed by the microcontroller for a given time interval, as compared with the wake qualify state 230. The power consumed by the activity monitor in step qualify mode may be on the order of a few hundred microamps, e.g., between about 100 microamps and about 500 microamps in some embodiments. The data-collection rate may, for example, be increased to a normal operating rate in step qualify mode 250. The accelerometer data may, for example, be sampled at several hundred Hertz (e.g., 256 Hz) or higher values. In some embodiments, when in the step qualify mode 250, the activity monitor 100 may analyze detected acceleration data to determine whether the data is representative of an activity that may be a recognizable activity for the monitor 100, e.g., walking, running, swimming, jumping rope, etc. In some embodiments, if it is determined by the microcontroller that the activity may be recognizable, the activity monitor may be moved to a step detect state 270. In some embodiments, if it is determined that there is insufficient activity in a pre-defined amount of time to be recognizable by the microcontroller, then the activity monitor may return to wake qualify mode 230.

In some embodiments, when in step detect mode 270, the activity monitor may be placed in full operation. In this mode, power may, for example, be provided to transceiver 140, in addition to the other operational components so that communications with an external device may be carried out. In some embodiments, normal operating clock frequencies and sampling rates may be used for operating the accelerometer and microcontroller, so that activity detection and full data processing may be carried out. Power consumption in step detect mode 270 may, for example, be several hundred microamps, e.g., between about 200 microamps and about 600 microamps.

In some embodiments, each operational state other than sleep mode 210 may include a provision for returning the activity monitor directly to sleep mode, e.g., along state paths 215 as indicated in FIG. 2. For example, each operational state 230, 250, 270 may be configured to return the activity monitor 100 to sleep mode 210 if there is a termination of incoming data or of processed data parameters associated with incoming data. In some embodiments, the activity monitor may additionally or alternatively be configured to return to sleep mode upon a system glitch or crash, e.g., a freezing of the microcontroller. A return to sleep mode may, for example, be used to reset the activity monitor.
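
For illustration only, the operational states and paths of FIG. 2 might be organized as a simple table-driven state machine, as in the following Python sketch; the event names are hypothetical labels for the threshold crossings, recognitions, and timeouts described above.

```python
SLEEP, WAKE_QUALIFY, STEP_QUALIFY, STEP_DETECT = (
    "sleep", "wake_qualify", "step_qualify", "step_detect")

# (state, event) -> next state; events are hypothetical names for the
# threshold crossings and timeouts described above.
TRANSITIONS = {
    (SLEEP, "motion_sensed"): WAKE_QUALIFY,                 # path 220
    (WAKE_QUALIFY, "second_threshold"): STEP_QUALIFY,       # path 240
    (WAKE_QUALIFY, "timeout"): SLEEP,
    (STEP_QUALIFY, "activity_recognizable"): STEP_DETECT,   # path 260
    (STEP_QUALIFY, "timeout"): WAKE_QUALIFY,                # path 280
    (STEP_DETECT, "data_stopped"): SLEEP,                   # path 215
}


def next_state(state, event):
    """Return the next operational state; unknown events leave the state
    unchanged, except that 'reset' returns to sleep from any state (paths 215)."""
    if event == "reset":
        return SLEEP
    return TRANSITIONS.get((state, event), state)


state = SLEEP
for event in ("motion_sensed", "second_threshold", "activity_recognizable", "data_stopped"):
    state = next_state(state, event)
    print(state)
```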

III. Overview of Data Processing

This section provides an overview of data handling and data processing architectures that may be implemented with various embodiments of the activity monitor 100. With reference to the block diagram of FIG. 3A, an illustrative example of a data handling and processing architecture 300 is shown. Another example of such an architecture is depicted in FIG. 3B. It should be appreciated that other embodiments may include fewer, additional, or different elements than those shown in FIGS. 3A-3B. Moreover, it should be appreciated that, in some embodiments, one or more depicted elements may be combined into a single unit providing equivalent functionality of both units depicted separately.

As illustrated in FIG. 3A, in some embodiments, the accelerometer 130 may be configured to output a data stream 133 of values representative of acceleration detected by the accelerometer along at least one direction of motion. The data stream 133 may, for example, comprise acceleration values representative of acceleration measured along respective X-, Y-, and Z-axes of motion defined by the accelerometer 130. In some embodiments, the data stream 133 of acceleration values may be provided to both a feature generator 310 and a preprocessor 305. The preprocessor 305 may, for example, pre-process the data stream 133 from the accelerometer, as described in the following section, to produce a stream of acceleration-derivative values 307 that is provided to the feature generator 310.

In certain embodiments, the feature generator 310 may process the received data stream 133 of acceleration values and the stream of acceleration-derivative values 307 to produce one or more characteristic features 312 that may be provided to an inference engine 320 and optionally to a buffer 325. The characteristic features may, for example, be used by the inference engine 320 to identify a type of activity sensed by the accelerometer 130 from among a plurality of types of activities when the accelerometer is supported by a subject in motion. In some embodiments, upon the identification of activity, the inference engine 320 may provide a control signal 322 to a multiplexor 330 to route the characteristic features 312 to an appropriate activity engine 340-1, 340-2, . . . , 340-n for further data analysis. Each activity engine may, for example, be configured to process received characteristic features according to activity-specific (e.g., running, walking, swimming, biking, etc.) algorithms to calculate one or more parameters descriptive of the activity. Illustrative examples of the one or more parameters include, but are not limited to, a measure of intensity or pace of the activity, a measure of energy expended during the activity, a measure of distance traveled during the activity, and a duration of the activity.
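
As a structural sketch only (Python, with hypothetical stand-ins for the blocks of FIG. 3A), the flow from preprocessor 305 through feature generator 310, inference engine 320, and a selected activity engine might be wired together as follows; unrecognized feature sets fall through to the data service.

```python
class Pipeline:
    """Skeleton of the FIG. 3A data flow; each stage is a caller-supplied function."""

    def __init__(self, preprocess, generate_features, infer_activity, activity_engines):
        self.preprocess = preprocess                 # acceleration -> derivative stream
        self.generate_features = generate_features   # raw + derivatives -> features
        self.infer_activity = infer_activity         # features -> activity name or None
        self.activity_engines = activity_engines     # activity name -> analysis function

    def handle_block(self, acceleration_block):
        derivatives = self.preprocess(acceleration_block)
        features = self.generate_features(acceleration_block, derivatives)
        activity = self.infer_activity(features)
        engine = self.activity_engines.get(activity)
        if engine is None:
            # Non-recognized activity: pass the feature set on for later analysis.
            return {"activity": "unrecognized", "features": features}
        return {"activity": activity, "parameters": engine(features)}
```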

In some embodiments, parameters characteristic of the activity may be provided to and handled by data service 360. The data service 360 may, for example, comprise an on-board data store, e.g., memory 120, and a transceiver 140 for transmitting the data to a remote device, such as a computer. In some embodiments, data service may additionally or alternatively include an application in operation on a remote computer or a remote server configured to receive data from the activity monitor and record and/or further process the received data.

In some embodiments, any of the preprocessor 305, feature generator 310, inference engine 320, multiplexor 330, and activity engines 340-1-340-n may be embodied in whole or in part as machine-readable instructions operable on microcontroller or microprocessor 110 to adapt the microcontroller to perform a respective functionality as described above or in the following sections. Further, any of the preprocessor 305, feature generator 310, inference engine 320, multiplexor 330, and activity engines 340-1-340-n may additionally or alternatively be embodied in whole or in part as hardware configured to perform a respective functionality.

According to some implementations, inference engine 320 may additionally or alternatively be configured to identify non-recognized activities as well as activities that are not performed by a human. For example, the inference engine may receive a set of characteristic features 312 that cannot be identified as corresponding to any one of a plurality of activities recognizable by the inference engine. In this case, the data-handling architecture may, for example, be configured to provide the characteristic feature set data 312 directly to data service 360 for subsequent analysis and determination of a type of activity associated with the feature set. For example, the feature set data 312 may be routed by the multiplexor 330 from the buffer 325 to data service 360. Further, the data-handling architecture may additionally or alternatively be configured to receive machine-readable instructions and reconfiguration data 362 back from data service 360 suitable to reconfigure inference engine 320 to subsequently identify a type of activity corresponding to the previously non-recognizable feature set. In this manner, activity recognition by the activity monitor 100 may, for example, be upgraded or reconfigured at any time.

In some embodiments, a non-recognized activity may be reported to a user at a later time for subsequent identification by the user. For example, a non-recognized activity may be reported when the user is reviewing a record of monitored activities via a computer. The user may, for example, be notified that a non-recognizable activity occurred at a specific date and time and for a duration, and then the user may be queried to identify the activity. In some embodiments, the user may then identify the activity, which will in turn associate a set of characteristic features 312 with the previously non-recognized activity. In some embodiments, the system may be adapted to learn from the user identification that the previously non-recognized set of characteristic features 312 are now recognizable, and subsequently update inference engine 320 to recognize any subsequent activity exhibiting a same or similar set of characteristic features. The updating of the inference engine 320 may, for example, comprise transmitting new data structures and/or code to the microprocessor 110 for use in recognizing the new activity.
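
A minimal sketch, in Python with a hypothetical matching rule, of associating a user-supplied label with a previously non-recognized feature set so that similar feature sets are recognized on subsequent occurrences:

```python
class TrainableRecognizer:
    """Store labeled feature templates and match new feature sets against them."""

    def __init__(self, tolerance=0.2):
        self.templates = {}          # label -> representative feature values
        self.tolerance = tolerance   # hypothetical per-feature relative tolerance

    def learn(self, label, features):
        """Called after the user identifies a previously non-recognized activity."""
        self.templates[label] = dict(features)

    def recognize(self, features):
        for label, template in self.templates.items():
            if all(abs(features[k] - v) <= self.tolerance * max(abs(v), 1e-9)
                   for k, v in template.items()):
                return label
        return None                  # still non-recognized; defer to the user


recognizer = TrainableRecognizer()
recognizer.learn("jumping rope", {"peak_rate_hz": 2.2, "mean_peak": 2.5})
print(recognizer.recognize({"peak_rate_hz": 2.3, "mean_peak": 2.4}))   # "jumping rope"
```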

In some implementations, inference engine 320 may be configured to recognize one or more activities that are not performed by a human. One example of such an activity might be cyclic motion of a fan, wherein the activity monitor 100 may be strapped to the blade of a fan. Another example might be motion of an automobile or motorized vehicle. Another example might be motion of a bicycle wheel, e.g., monitor attached to spokes of a wheel. Other examples of activities not performed by a human might be motion in a clothes drying or clothes washing machine. Another example may include walking or running motion of a dog or horse. In some cases, the activities not performed by a human may, for example, be recognized to aid in preventing mis-recording of data generated to falsify human activity, e.g., in connection with health insurance incentive programs.

Another illustrative embodiment of a data flow and data handling architecture 301 is shown in FIG. 3B. The front end of the data processing is similar to that shown in FIG. 3A; however, in the embodiment of FIG. 3B, the inference engine 320 is configured to further receive data from preprocessor 305 and buffer 325, as well as an error signal indication 345 from each activity engine 340-1 . . . 340-n. The error signal may, for example, represent a confidence level of an activity processed by an activity engine. For example, a first activity engine 340-1 may be an activity engine for walking, and a second engine 340-2 an engine for running. When the user is walking, for example, a step cadence may be below a threshold criterion for running, and an output error signal from the running activity engine 340-2 might be high. The output error signal may, for example, be used by inference engine 320 to aid in identifying an activity. In some embodiments (e.g., the embodiment of FIG. 3B), all activity engines may substantially simultaneously process feature data, whereas in other embodiments (e.g., the embodiment of FIG. 3A) only one activity engine or a selected number of activity engines may be selected to process feature data.

It should be appreciated that error signaling from activity engines may, for some applications, also be employed in the embodiment shown in FIG. 3A. For example, error signals from activity engines 340-1 . . . 340-n may be fed back to inference engine 320. The presence of a large error signal for a currently identified activity may, for example, cause inference engine 320 to re-identify an activity for newly received data.

In some embodiments, the buffer 325 may be used to temporarily retain data representative of an activity while inference engine 320 identifies an activity. For example, once an activity has been identified, data may be routed from the buffer to the appropriate activity engine or engines. Such temporary buffering of data prevents loss of activity data during initial identification of an activity or during transitions from one activity to another.

IV. Pre-processing of Accelerometer Data

Referring again to FIGS. 3A and 3B, it should be appreciated that, in some embodiments, data handling and processing architecture 300, 301 may include a data preprocessor 305 and a feature generator 310. In some implementations, preprocessor 305 may, for example, filter acceleration values received from at least one axis of the accelerometer 130 to reduce noise in the data. In some embodiments, the preprocessor 305 may additionally or alternatively calculate derivatives of the acceleration values received from at least one axis of the accelerometer 130. Furthermore, in some embodiments, the preprocessor 305 may additionally or alternatively parse the received data stream into blocks of data for subsequent data processing. As described above, in some embodiments, the feature generator 310 may generate a set of characteristic features 312 that are representative of the received signals from the accelerometer 130 and preprocessor 305, and the features may subsequently be used to identify a particular activity from among a plurality of different activities as well as identify a location at which the activity monitor may be worn.

In certain embodiments, the preprocessor 305 may compute derivatives of acceleration values (also referred to herein as “acceleration-derivative values”) by calculating changes in acceleration values with respect to time according to the relation

D_{i,n} = \frac{a_{i,n} - a_{i,n-1}}{dt}  (1)

where Di,n is an nth computed acceleration-derivative value along an ith directional axis in a data stream of acceleration-derivative values, ai,n represents an nth acceleration value detected by the accelerometer along an ith axis of the accelerometer (e.g., an X axis), and dt is a time difference between the detected acceleration values ai,n and ai,n−1. The sensed acceleration values ai,n and ai,n−1 may, for example, be samples of acceleration values obtained via analog-to-digital conversion. In some embodiments, the acceleration values ai,n may be obtained at regular intervals (e.g., all values of dt are equivalent between successive values of ai,n) and the derivative of EQ. 1 may be expressed as


D_{i,n} = a_{i,n} - a_{i,n-1}  (2)

wherein the derivative with respect to time is implicit. Regular intervals of sampling may, for example, result from analog-to-digital conversion of the analog output of acceleration values from the accelerometer.

In some embodiments, computed acceleration-derivative values for multiple axes of the accelerometer 130 may be combined by preprocessor 305 for each time sample, e.g., summed together. In some implementations, an absolute value of the sum, or a squared value of the sum in some embodiments, may be computed after combining the values. In some embodiments, acceleration-derivative values for multiple axes of the accelerometer 130 may be combined according to the following relation


D_n = \sum_{i=1}^{m} |D_{i,n}|  (3)

where Dn represents an nth combined acceleration-derivative value in a data stream of combined acceleration-derivative values, and m represents a number of axes for which acceleration-derivative values are combined. For some embodiments employing EQ. 3, the absolute value of acceleration-derivative values Di,n may, for example, be taken prior to combining the acceleration-derivative values. It should be appreciated that a stream of values Dn may, for example, be produced as new acceleration values 133 are received and processed by the preprocessor 305.
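
As a concrete numerical illustration of EQS. 2 and 3 only, the per-axis differences of uniformly sampled three-axis data may be combined into a single derivative stream as in the following sketch; the function name and the sample values are illustrative assumptions, not part of the disclosure.

    # Per-axis first differences of uniformly sampled acceleration data (EQ. 2),
    # combined across axes by summing absolute values (EQ. 3).
    def combined_acceleration_derivatives(samples):
        """samples: list of (ax, ay, az) tuples; returns the stream of D_n values."""
        stream = []
        for n in range(1, len(samples)):
            d_n = sum(abs(samples[n][i] - samples[n - 1][i]) for i in range(3))
            stream.append(d_n)
        return stream

    # Example with a short three-axis record (arbitrary units).
    print(combined_acceleration_derivatives([(0, 0, 1), (2, 0, 1), (1, -1, 2)]))   # -> [2, 3]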

In some embodiments, approaches for computing and combining acceleration-derivative values similar to those shown in EQS. 1-3 may additionally or alternatively be employed. Such approaches may, for example, include computing acceleration-derivative values using different acceleration values (e.g., [ai,n+1−ai,n]), using a greater number of consecutive acceleration values to compute an average acceleration-derivative value, and squaring acceleration-derivative values when combining the values (e.g., [Di,n]2) rather than taking an absolute value. In some cases, the absolute value may not need to be taken when summing derivative values. Also, in some embodiments, raw accelerometer data may be smoothed, averaged, up-sampled, or down-sampled prior to computing combined acceleration-derivative values.

Additional embodiments of computing combined acceleration-derivative values are also contemplated. First derivative values may, for example, be computed in accordance with EQ. 2 above, or approximated in accordance with the following expression.


D_{i,n} = a_{i,n} - a_{i,n-m}  (4)

where m may be an integer greater than 1, e.g., 2, 3, 4, or greater.

In some embodiments, second derivative values may be computed in addition to, or instead of, first derivative values. The second derivative values Ci,n may be computed from the Di,n values, e.g.,


C_{i,n} = D_{i,n} - D_{i,n-m}  (5)

where m may be an integer greater than or equal to 1, and Di,n is an nth computed acceleration-derivative value along an ith directional axis. The values Ci,n may be combined in any manner as described above in connection with the combination of the Di,n values.

In some embodiments, the stream of acceleration-derivative values Dn may be used by inference engine, alone or in combination with other data values, to identify activities detected by activity monitor 100. In some implementations, for example, the stream of acceleration-derivative values Dn may be used by an activity engine, alone or in combination with other data values, to compute parameters characterizing a recognized activity.

By combining and computing the acceleration-derivative values Dn as described above, a strategy for identifying activities and/or analyzing activity-generated data that can reduce the computational load on the microprocessor may be employed in some embodiments. Further, it may not be necessary for the microcontroller to determine which axis of the accelerometer provides the most relevant data, nor to track an orientation of the accelerometer during operation to analyze activity-generated data since the acceleration-derivative values tend to emphasize high-frequency variations. The computation of acceleration-derivative values Dn as described above may, for example, provide a more stable activity analysis algorithm across various types of activities.

In some embodiments, the stream of acceleration-derivative values Dn may provide more useful data, in a smaller volume, for identifying activities and/or analyzing activity-generated data than one or more data streams of raw acceleration data 133. The stream of acceleration-derivative values Dn may, for example, accentuate higher frequency components of a raw acceleration data stream that are useful in identifying and/or analyzing an activity and suppress low frequency components of raw acceleration data streams that contribute to noise or are less useful in identifying or analyzing an activity. Additionally, in some embodiments, by combining the raw data to produce the stream of acceleration-derivative values Dn, there may be less data-processing burden on inference engine 320, e.g., it may receive one data stream of acceleration-derivative values Dn rather than three data streams of raw acceleration data, one from each axis of accelerometer 130. The stream of acceleration-derivative values Dn may, for example, be passed by feature generator 310 to inference engine 320 in some embodiments.

In some embodiments, the feature generator 310 may receive raw data 133 from the accelerometer 130 and data from the preprocessor 305 to produce one or more features to include in a set of characteristic features 312 that are provided to inference engine 320. It should be appreciated that sets of characteristic features 312 may, for example, be provided to inference engine 320 on a periodic or semi-periodic basis as new data is received by feature generator 310. Such a set of characteristic features 312 may, for example, comprise a data structure that includes one or more data entries representative of a detected activity. Examples of such data entries include, but are not limited to, any of the following or combination thereof: an array of acceleration-derivative values Dn for a pre-selected time interval, one or more arrays of raw acceleration data corresponding to the same time interval, a maximum and/or minimum value of the acceleration-derivative values Dn within the time interval, a maximum and/or minimum value of the raw acceleration data within the time interval, phase shifts between two or more cyclic patterns of the raw acceleration data, a number of peaks of the acceleration-derivative values Dn within the time interval, a distance between peaks, a width of at least one peak, and a maximum rate of signal change at the onset of a peak. It should be appreciated that other aspects of received signals may additionally or alternatively be included in a set of characteristic features 312 that is constructed to represent a detected activity.
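
As a hedged illustration only, a few of the feature entries listed above (maximum, minimum, number of peaks, and distance between peaks) might be computed from a window of Dn values as in the following sketch; the feature names and the simple local-maximum peak test are teaching assumptions, not the disclosed feature generator.

    # Build a small characteristic-feature set from a window of D_n values.
    def feature_set(d_window):
        peaks = [i for i in range(1, len(d_window) - 1)
                 if d_window[i] > d_window[i - 1] and d_window[i] > d_window[i + 1]]
        gaps = [b - a for a, b in zip(peaks, peaks[1:])]   # distances between peaks (in samples)
        return {
            "max": max(d_window),
            "min": min(d_window),
            "num_peaks": len(peaks),
            "mean_peak_distance": sum(gaps) / len(gaps) if gaps else 0,
        }

    print(feature_set([0, 3, 1, 4, 2, 5, 1]))   # 3 peaks, mean spacing of 2 samples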

In some embodiments, a set Fn of characteristic features fn may be represented symbolically, according to one example and for teaching purposes only, as follows

F_n = \{f_1, f_2, f_3, \ldots, f_n\} = \{(D_n, D_{n+1}, \ldots, D_{n+m}), P_1, \Delta P_1, P_2, \Delta P_2, dP_{12}\}  (6)

where fn represents an nth feature in the set, m represents a number of acceleration-derivative values Dn calculated within a pre-selected time interval, P1 represents a largest peak value of the acceleration-derivative value array (Dn, Dn+1, . . . Dn+m), P2 represents a second largest peak value of the acceleration-derivative value array (Dn, Dn+1, . . . Dn+m), ΔPn represents a width of the nth peak, and dP12 represents a distance between peaks P1 and P2. Each set Fn of characteristic features may, for example, be operated on by inference engine 320 to identify an activity type. In some embodiments, such a feature set may additionally or alternatively include a number of raw acceleration values from one or more axes of the accelerometer. In some implementations, a feature set may, for example, be treated as a vector for calculation purposes.

In some embodiments, data received from the accelerometer 130 may also be compressed and/or filtered by preprocessor 305 or feature generator 310. For example, data from the accelerometer may be streamed to the preprocessor at a rate of P samples per second per axis. (P may be any value such as 64, 100, 128, 200, 256, 300, 400, 500, 512, 1024, for example.) The preprocessor may, for example, compress the data streams by averaging q values and outputting one averaged value for each q values. (q may be any integer value, e.g., 2, 4, 5, 8, 10.) In some embodiments, filtering may additionally or alternatively be employed by the preprocessor or feature generator to smooth a data stream.
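
A minimal sketch of the q-sample averaging compression described above follows; q = 4 is chosen purely for illustration.

    # Replace every q consecutive samples with their mean value.
    def compress(stream, q=4):
        return [sum(stream[i:i + q]) / q for i in range(0, len(stream) - q + 1, q)]

    print(compress([1, 2, 3, 4, 10, 10, 10, 10], q=4))   # -> [2.5, 10.0]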

In some implementations, data may be scaled by preprocessor 305 or feature generator 310. For example, an auto-gain algorithm may be used to adjust accelerometer signal levels to utilize a full range, or near full range, of bits in an A-to-D converter when the raw signal level is too strong or too weak. This may, for example, allow for a better capture of a waveform with higher resolution. When gain is adjusted automatically, in some embodiments, the system may be configured to provide a gain setting or scaling value along with the data, so that absolute values of the data may be determined.
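
One possible auto-gain approach, sketched here only as an assumption (the disclosure does not specify the algorithm, and the 12-bit full-scale value and target fraction are hypothetical), is to scale samples so their peak occupies most of the converter range and to report the scale factor along with the data.

    # Scale samples toward an assumed 12-bit signed full scale and report the gain
    # so that absolute values can be recovered downstream.
    def auto_gain(samples, full_scale=2047, target_fraction=0.9):
        peak = max(abs(s) for s in samples) or 1
        scale = (full_scale * target_fraction) / peak
        return [s * scale for s in samples], scale

    scaled, gain = auto_gain([10, -20, 15])
    print(gain)   # gain/scaling value to be provided along with the data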

In some embodiments, data reduction may occur along the direction of flow of data within the data handling architecture. For example, the raw data 133 (e.g., three acceleration data streams) from the accelerometer may be preprocessed as described above to pass only a compressed and combined acceleration-derivative data stream and/or several features to the inference engine 320 or an activity engine 340-m.

V. Identification of Activities

As noted above, in some embodiments, data generated by the accelerometer 130 may be processed by preprocessor 305 to produce acceleration-derivative data Dn and processed by feature generator to produce characteristic features fn that are provided to the inference engine 320. In some embodiments, raw acceleration data may additionally or alternatively be provided to inference engine. The inference engine 320 may, for example, be configured to receive processed data, and in some cases raw data, and further process the received data to identify an activity sensed by the accelerometer. In some embodiments, the activity identified may be one activity from among a plurality of different activities that include activities performed by a human as well as activities that are not performed by a human.

In some implementations, the inference engine 320 may identify an activity using acceleration-derivative data Dn only, or a combination of acceleration-derivative data and a limited set of additional characteristic features. In some embodiments, the inference engine 320 may identify an activity using one or more raw accelerometer traces and/or characteristic features generated from the one or more raw traces. Further, in some embodiments, the inference engine may additionally or alternatively qualify the identified activity or its data (e.g., identify a level of quality of the recognized activity's data or identify an aspect of the recognized activity such as a location of the activity monitor on the subject) using any combination of acceleration-derivative data Dn, raw accelerometer data, and related characteristic features.

In some embodiments, the inference engine 320 may employ one or more identification algorithms to identify or distinguish activities. For example, in some embodiments, the inference engine 320 may employ a pattern recognition algorithm to recognize an activity based upon a waveform defined by acceleration-derivative data Dn and/or one or more raw accelerometer traces. In some embodiments, the inference engine 320 may additionally or alternatively employ fuzzy logic to identify an activity based upon a number of values in characteristic feature sets Fn. The inference engine 320 may thus, in some embodiments, employ a combination of pattern recognition and fuzzy logic to identify an activity. It should be appreciated that other “recognition” algorithms or combinations of such algorithms may additionally or alternatively be used.

In some embodiments, for fuzzy logic recognition, membership functions specific to different activities may be defined and downloaded to the activity monitor 100. For example, data received by inference engine 320 that has a predetermined number of values falling within a membership function range may be identified by inference engine as an activity associated with that membership function. For some implementations, fuzzy logic may be suitable for recognizing a large variety of different activities without placing a heavy data-processing burden on microcontroller 110. For example, fuzzy logic may, in some embodiments, require only determining whether a plurality of characteristic features from feature data Fn fall within certain ranges of values. In other embodiments, fuzzy logic may additionally or alternatively evaluate a cost factor for each candidate activity and identify an activity based upon a maximal value of the cost factor.
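
A minimal sketch of the range-based (all-or-nothing) membership test described above follows; the activities, feature names, and ranges are illustrative assumptions, not measured membership functions.

    # Identify an activity when every measured feature falls inside that
    # activity's pre-defined membership range.
    MEMBERSHIPS = {
        "walking": {"max": (40, 80), "num_peaks": (2, 6)},
        "running": {"max": (100, 200), "num_peaks": (4, 10)},
    }

    def identify(features):
        for activity, ranges in MEMBERSHIPS.items():
            if all(lo <= features[name] <= hi for name, (lo, hi) in ranges.items()):
                return activity
        return None   # non-recognized activity

    print(identify({"max": 60, "num_peaks": 3}))   # -> "walking"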

For purposes of understanding only, and without limiting the invention, one example of activity identification is described with reference to FIGS. 4A-4E. FIG. 4A represents three traces (x, y, z: top three) of raw acceleration data from a first type of activity (walking in this example). The lower trace D in FIG. 4A represents acceleration-derivative data computed from the upper traces in accordance with EQ. 3. FIG. 4B represents corresponding traces of data obtained from a second type of activity (biking in this example). FIG. 4C represents three traces of raw acceleration data and acceleration-derivative data from a third type of activity (elliptical training). FIG. 4D represents three traces of raw acceleration data and acceleration-derivative data from a fourth type of activity (tapping foot). FIG. 4E represents three traces of raw acceleration data and acceleration-derivative data from a fifth type of activity (motion in a clothes dryer). For all data shown in FIGS. 4A-4D the activity monitor 100 was supported either on the user's foot or ankle.

As can be seen from the traces of FIGS. 4A-4E, there are a number of differences in the traces. The differences include maximum and minimum values of acceleration, periodicity of the traces, number and shapes of peaks in the traces, width of peaks, and distances between peaks, among other things. The differences may, for example, be captured in characteristic feature sets Fn for each trace.

Continuing with the above example, in some embodiments, a characteristic feature set for each trace within a measurement interval Tm may be constructed with the following entries:

Fn={max value of trace (max); min value of trace (min); number of peaks with a width less than m1 samples (Np); number of valleys with a width less than m2 samples (Nv); average rate of change of the trace at half-maximum of peaks (R1/2); average distance between peaks (ΔP)}. It should be appreciated that a wide variety of characteristic features may be generated and used to identify the different activities. As can be understood from this example, the differences in the traces of FIGS. 4A-4E may, for example, be captured as numerical differences in the characteristic feature sets. In some embodiments, the inference engine 320 may then distinguish between the activities based upon such numerical differences. A judicious choice of values to include in feature sets may, in some embodiments, reduce the computational burden on inference engine 320 and enable rapid identification of different types of activities.

In some implementations, by sampling a large number of trials for each activity, variations in each of the characteristic feature values may be observed and statistics regarding the variations may be determined. The statistical results may, for example, be used to construct membership functions for fuzzy-logic activity identification. For example, it may be observed that a maximum value of acceleration-derivative trace D for running has a 2-sigma variation of 5 measurement units. A membership function for running may, for example, include the specification {(120−5)≦max value of D≦(120+5)}, where 120 measurement units is determined to be an average of the maximum value for trace D for the running activity. Continuing with the example above, a membership function for each of the activities may, for example, be defined as follows:

M_{activity} = \{(\mathrm{max}_{avg} - \partial\mathrm{max}) \le \mathrm{max} \le (\mathrm{max}_{avg} + \partial\mathrm{max});\; (\mathrm{min}_{avg} - \partial\mathrm{min}) \le \mathrm{min} \le (\mathrm{min}_{avg} + \partial\mathrm{min});\; (N_{p,avg} - \partial N_p) \le N_p \le (N_{p,avg} + \partial N_p);\; (N_{v,avg} - \partial N_v) \le N_v \le (N_{v,avg} + \partial N_v);\; (R_{1/2,avg} - \partial R_{1/2}) \le R_{1/2} \le (R_{1/2,avg} + \partial R_{1/2});\; (\Delta P_{avg} - \partial\Delta P) \le \Delta P \le (\Delta P_{avg} + \partial\Delta P)\}  (7)

where the subscript “avg” designates an average or expected value, and the quantities ±∂[ ] identify a pre-defined range within which a measured value would be considered to qualify as belonging to the membership function. In some embodiments, when characteristic features are received by the inference engine 320, for which all values qualify as belonging to the membership function, then the detected activity may be identified by the inference engine. The inference engine may then, for example, route the data from the feature generator, using multiplexor 330, to an appropriate activity engine for further analysis.

In some instances, there may be partial overlap of membership functions. For example, one or more ranges for characteristic features in one membership function may overlap or be coincident with corresponding ranges in a second membership function. Even though there may be partial overlap of membership functions, in some implementations, an activity may be identified based on non-overlapping characteristic features. For example, each activity may receive a score (e.g., a value of 1) for each feature that falls within the membership function for that activity. In some embodiments, after scores have been tallied for each activity, the one receiving the highest score may be selected as the identified activity.

In some embodiments, scores based on the described membership functions may be all-or-nothing, e.g., either a feature fn is measured and determined to be within the bounds of its corresponding membership function and contributes a score, or it falls outside the bounds and contributes nothing. Other embodiments may additionally or alternatively employ membership functions such as those depicted in FIGS. 5A-5C. The graphs show membership functions Mi,j that have been constructed for n characteristic features (denoted by the “j” subscript) and for two activities (denoted by the “i” subscript). The membership functions Mi,j may, for example, be constructed from statistical analysis of many measurements, as described above. Though the membership functions are shown as trapezoidal, they may take any suitable shape, e.g., round top, semi-circular, semi-ellipse, Gaussian, parabolic, etc.

In some implementations, when a characteristic feature is measured, e.g., f1 510, a corresponding value for each activity's membership function for that feature may be determined. For the case shown in FIG. 5A, for example, the second activity's membership function M2,1 contributes a value 512 where the first activity's membership function contributes no value. For a second measured feature 520 shown in FIG. 5B, both membership functions may, for example, contribute different values 522, 524. For another measured feature 530 shown in FIG. 5C, both membership functions may, for example, contribute identical values.

In some embodiments, a cost factor Ci may be computed for each candidate activity (denoted by the “i” subscript) based upon detected feature values 510, 520, 530 and the predetermined membership functions Mi,j according to the following relation:

C_i = \frac{\sum_{j=1}^{n} W_{i,j} M_{i,j}(f_j)}{\sum_{j=1}^{n} W_{i,j}}  (8)

where Wi,j represents a weighting factor for the jth feature of the ith activity. The weighting factor may, for example, be selected to emphasize some features and de-emphasize other features for purposes of identifying an activity. In some embodiments, an activity with the highest cost factor Ci may be selected as the identified activity. For example, with reference to FIGS. 5A-5C and considering only the membership functions shown and the measured feature characteristics 510, 520, 530, the second activity M2 would be selected as the identified activity in such embodiments.
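
For teaching purposes only, EQ. 8 might be evaluated as in the following sketch, which assumes trapezoidal membership functions; the breakpoints, weights, and feature values are illustrative assumptions.

    # Weighted cost factor of EQ. 8 using trapezoidal membership functions.
    def trapezoid(x, a, b, c, d):
        """Membership rises over [a, b], is 1 over [b, c], and falls over [c, d]."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)

    def cost_factor(features, memberships, weights):
        num = sum(weights[j] * trapezoid(features[j], *memberships[j]) for j in memberships)
        return num / sum(weights[j] for j in memberships)

    memberships = {"max": (90, 110, 140, 160), "num_peaks": (3, 4, 8, 9)}   # illustrative
    weights = {"max": 2.0, "num_peaks": 1.0}                                # illustrative
    print(cost_factor({"max": 120, "num_peaks": 5}, memberships, weights))  # -> 1.0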

In some cases, membership functions may overlap entirely, such that it would not be possible for the inference engine 320 to identify an activity between the two membership functions. This situation might occur, for example, when a new membership function is added to the inference engine 320 for recognition of a new activity not previously recognizable by the inference engine. For example, if the activity monitor 100 is configured to identify biking and is later updated to identify elliptical training activity, a new membership function for identifying elliptical training may entirely overlap with the pre-defined membership function for biking since the two activities are similar.

In some embodiments, membership functions and characteristic features used by the inference engine 320 may be expandable, so that additional characteristic features and associated membership functions can be added to the system. The additional characteristic features and revised membership functions may, for example, be added to distinguish two activities that previously had substantially overlapping membership functions. The addition and updating of membership functions and features can be accomplished, for example, via communication between the activity monitor 100 and an external device, e.g., a personal computer or computer connected to the internet.

In some embodiments, in addition to or in lieu of recognizing human activities, the activity monitor 100 may be configured to recognize activities that are not performed by a human, e.g., motion in a clothes dryer (see FIG. 4E), or activities that may be detected when the activity monitor is not supported by a human but is supported by a machine or animal. Such activities may, for example, be detected as an attempt to falsify human activity, and may be referred to herein as “fake” activities, or may be useful in analyzing certain activities such as riding a bicycle when the monitor is placed on a wheel of the bike. In some embodiments, inference engine 320 may additionally or alternatively be configured to recognize patterns (e.g., pattern recognition) and/or evaluate membership functions (e.g., fuzzy logic recognition) to identify non-human activities such as motion on a ceiling fan, motion on a wheel of a bicycle or motorized bike, or motion in a drying or washing machine. In some instances, an acceleration contribution from gravity may, for example, be tracked to deduce a fake or non-human activity, e.g., the gravitational acceleration contribution changes rapidly from axis to axis. In some implementations, a frequency of cyclic motion may additionally or alternatively be used to deduce a fake or non-human activity, e.g., by determining that the frequency is at a rate higher than humanly possible.

VI. Activity-Specific Data-Processing Engines and Calculation of METS

In some embodiments, once an activity has been identified, data from the feature generator 310, which may include raw accelerometer data, may be routed to an appropriate activity engine 340-m (m corresponding to the value of a selected engine 1, 2, 3, . . . n). Each of the activity-specific data processing engines may, for example, use any instance or combination of acceleration-derivative data Dn, one or more accelerometer trace data, and characteristic feature data to determine a value of one or more parameters associated with the activity (e.g., speed, distance, number of steps, etc.). In some embodiments, the acceleration-derivative data Dn may additionally or alternatively be used to determine an intensity of the activity (e.g., walking speed, running speed, cadence) and/or other parameters associated with the activity.

As one example and referring to FIG. 4A, the inference engine 320 may identify the activity represented by the data in the figure as walking. Accordingly, the multiplexor 330 may, for example, be configured to forward only acceleration-derivative data Dn to a walking activity engine 340-1. Walking activity engine 340-1 may, for example, be configured to determine a distance Tc between a large peak 410, which corresponds to a heel strike, and a successive smaller peak 420, which corresponds to a toe-off in a step. The distance Tc may, for example, represent the contact time of the foot with the ground, from which a walking speed can be determined. An example of a method of determining walking speed from foot contact time is disclosed in U.S. Pat. No. 4,578,769, which is hereby incorporated by reference in its entirety.
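
A hedged sketch of the walking-engine idea described above: locate the two dominant peaks in a window of Dn values, take the sample spacing between them as the contact time Tc, and map Tc to a speed through a small calibration table. The table values and the simple peak selection are hypothetical illustrations, not the method of U.S. Pat. No. 4,578,769.

    # Contact time between the two dominant peaks (heel strike and toe-off in
    # this sketch), converted to speed by nearest-neighbour table look-up.
    def contact_time(d_window, sample_period_s):
        largest = sorted(range(1, len(d_window) - 1),
                         key=lambda i: d_window[i], reverse=True)
        i_heel, i_toe = sorted(largest[:2])   # indices of the two largest values
        return (i_toe - i_heel) * sample_period_s

    CALIBRATION = [(0.75, 1.1), (0.65, 1.4), (0.55, 1.8)]   # (Tc in s, speed in m/s), illustrative

    def speed_from_contact_time(tc):
        return min(CALIBRATION, key=lambda pair: abs(pair[0] - tc))[1]

    tc = contact_time([0, 1, 9, 2, 1, 6, 1, 0], sample_period_s=0.1)
    print(tc, speed_from_contact_time(tc))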

In some embodiments, after determining an intensity for an activity, the selected activity engine may additionally or alternatively determine an energy expenditure (e.g., calorie burning rate) for the activity. The energy expenditure may, for example, be determined from a look-up table of metabolic equivalents (METs) for the activity. The look-up table may, for example, comprise a list of metabolic expenditures where each entry may be associated with one or more activity intensity parameters, e.g., step rate, speed, heart rate, etc. A look-up table for each activity that may be performed by a human may, for example, be stored in memory 120 of the activity monitor. In some embodiments, look-up tables for METs may be user-specific, e.g., specific to a user's sex, weight, and height. In some embodiments, over the course of an activity session, the activity monitor may record calorie burn rates as a function of time and also compute a total number of calories burned as a function of time of the activity. In some embodiments, the activity engine may additionally or alternatively compute other data, e.g., METs/sec, maximum speed, average speed, distance traveled, number of steps, maximum calorie burn rate, time of day, etc.
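
As a sketch of the MET look-up described above (the MET table entries here are illustrative assumptions; the kcal-per-minute conversion uses the commonly cited formula METs × 3.5 × body mass in kg ÷ 200):

    # Look up a MET value for the measured intensity, then convert to a calorie
    # burn rate for a given body mass.
    WALKING_METS = [(3.0, 2.8), (4.0, 3.5), (5.0, 4.3)]   # (speed km/h, METs), illustrative

    def mets_for_speed(speed_kmh):
        return min(WALKING_METS, key=lambda pair: abs(pair[0] - speed_kmh))[1]

    def kcal_per_min(mets, weight_kg):
        return mets * 3.5 * weight_kg / 200.0

    print(kcal_per_min(mets_for_speed(4.2), weight_kg=70))   # ~4.3 kcal/min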

In some embodiments, any data computed by the activity engine 340-m may be provided to data service 360 for subsequent presentation to the user and/or storage in a remote storage device. Additional data may also be stored with data from the activity engine in some embodiments. For example, data identifying the type, date, time, and duration of the activity may be stored in association with the data from the activity engine. In some embodiments, identifying data may be stored as a header associated with a data structure provided by the activity engine.

In some embodiments, the activity monitor 100 may additionally or alternatively be configured to store user goals in memory (e.g., number of steps per day, distance traveled, an exercise duration for a specific activity). The activity monitor may, for example, be further configured to provide an audible or tactile indication to the user when a goal is reached. For example, the activity monitor may beep or vibrate when a user has reached a goal of a walking distance within a time interval of a day.

In some embodiments, the activity monitor 100 may additionally or alternatively be configured to recognize specific motion gestures that a user may execute (e.g., shaking the activity monitor, moving it in a circle with the hand, spinning the activity monitor). The activity monitor may, for example, include one or more activity engines adapted to recognize such gestures. The recognized gestures may, for example, be used as an interface method for executing specific functions on the activity monitor (e.g., power up, power down, clear data, set a goal).

VII. Calibration and Quality of Activity Data

Determining parameters for a motion sensing device that accurately reflect an activity associated with the sensed motion, such as distance and speed of a walking or running person, may be difficult without proper calibration techniques. Thus, in some embodiments, to ensure that an activity monitor 100 provides data that accurately reflects various parameters associated with the activity, activity-dependent calibration factors may be employed, e.g., such factors may be used by an activity engine 340-m when computing activity-related data from data received from motion detection and preprocessing circuitry. Activity-dependent calibration factors for each activity may, for example, be maintained and updated in memory 120.

In some implementations, calibration factors may be used in one or more equations used by an activity engine 340-m to compute a measure of intensity of the activity. An example of such an implementation is disclosed, for example, in U.S. Pat. No. 4,578,769 (incorporated by reference above) which describes deducing a runner's speed based upon foot contact time that is detected by a sensor placed in footwear. In some embodiments, there may be a number of different calibration factors needed for an activity monitor configured to recognize a number of different activities. Such calibration factors may, for example, be determined ahead of time, e.g., through laboratory testing and experimentation, and then loaded into memory 120 of the activity monitor 100 prior to its use.

In some embodiments, calibration may be performed in conjunction with a user's review of the data and user input. For example, a user may jog for 2.0 miles and record a time, 14 minutes, 0 seconds (14:00), that it took to jog the two miles. The activity monitor may, for example, identify the activity as running, and the activity engine 340-m, using a pre-defined calibration factor, may compute a running pace of 6:50 minutes/mile. In such an implementation, the user may then, via a computer-based interface with the activity monitor, execute a calibration routine wherein the user may first select the identified activity and computed pace, and then enter a known pace for the activity. The system may then adjust or replace an internal calibration factor used by the activity monitor 100 with a new calibration value for that activity. In this manner, calibrations for various activities can be made specific to individual users of the activity monitor, which may improve the accuracy of the activity monitor for each user.

In some embodiments, the activity monitor 100 may additionally or alternatively be configured for automatic calibration or self-calibration. Such calibration routines may be executed for one or more recognizable activities. As one example, self-calibration for running will be described. When the activity monitor is placed on the foot or ankle, the activity monitor will temporarily come to rest along the direction of running (taken as x-directed in this example) as the foot plants on the ground. When the foot is planted, the x-directed velocity of the accelerometer is zero, and this can serve as a reference point for calibration. When the foot next plants, the x-directed velocity returns again to zero. By integrating the x-directed acceleration data twice, a distance between the two successive foot plants can be determined. The distance may be corrected using y- and z-directed acceleration values, since the orientation of the accelerometer changes as the foot moves forward. Once the distance is determined, a time between the foot plants may be determined from an internal clock of the activity monitor. The time and distance may then be used to calculate a velocity of the runner or walker. The velocity may be determined from two successive foot plants, or from more than two to obtain an averaged value, and the process of determining velocity may be repeated at separated intervals of time. In some embodiments, the calculated velocity may be used to update or correct internal calibration values used by the activity monitor 100. For example, the calculated velocity may be used to correct a calibration value used for estimating running speed based on foot contact time.
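
A simplified sketch of this self-calibration idea follows, assuming the along-track acceleration has already been isolated and ignoring the y/z orientation correction mentioned above; the rectangular integration and the sample values are illustrative assumptions.

    # Double-integrate along-track acceleration between two foot plants (where
    # velocity is taken as zero) to estimate stride length and average speed.
    def stride_and_speed(ax_samples, dt):
        velocity, distance = 0.0, 0.0
        for a in ax_samples:
            velocity += a * dt           # first integration: acceleration -> velocity
            distance += velocity * dt    # second integration: velocity -> distance
        stride_time = len(ax_samples) * dt
        return distance, distance / stride_time

    # 0.5 s of acceleration followed by 0.5 s of deceleration (m/s^2), dt = 20 ms.
    print(stride_and_speed([2.0] * 25 + [-2.0] * 25, dt=0.02))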

Calibrations may additionally or alternatively be used by the activity monitor in a different manner, and such calibrations may be referred to as location-dependent calibrations. For example, with regard to running or walking (without being limited to only these activities), when the activity monitor 100 is placed on the ankle or foot, a more precise measurement can be made of the activity than if the monitor were worn on the belt or placed in a trouser pocket. This can be seen, for example, in the raw accelerometer data traces of FIGS. 6A-6C. When the monitor is worn on the ankle (FIG. 6A), the z and x waveforms are more pronounced than when the monitor is worn on the belt (FIG. 6B). The timing of the foot strike and/or foot contact time can be determined more accurately using the data from a monitor worn on the ankle or foot.

In various embodiments, the activity monitor may additionally or alternatively be configured to recognize a type of activity independent of where the monitor is worn, and may be further configured to identify where the monitor is worn for the activity. Just as the data traces of FIG. 6A can be identified as walking by inference engine 320 as described above, the traces of FIG. 6B may be identified as walking where the monitor is worn on a belt, and the traces of FIG. 6C may be identified as walking where the monitor is located in a pocket. For example, the traces of FIG. 6B may generate characteristic features fn belonging most closely to one or more membership functions that would identify the activity as “walking, monitor on belt.”

In some embodiments, when an activity is identified where the activity monitor is mounted in a non-optimal location, a different calibration value, or values, may be used by activity engine 340-m to compute an intensity of the activity. For example, different calibration values may be associated with each identifiable activity and monitor location. In other embodiments, when an activity is identified where the activity monitor is mounted in a non-optimal location, information that was gathered from prior use of the activity monitor in a more optimal location may additionally or alternatively be used to infer or estimate parameters of the activity with the monitor in the non-optimal location. For example, walking data gathered when the activity monitor is worn on an ankle may be used to determine stride lengths that correspond to different walking step frequencies or cadences. Then, when the activity monitor is worn at a non-optimal location (e.g., a belt or pocket), a detected cyclic frequency or cadence may, for example, be used in conjunction with the previously-obtained data to infer or estimate a stride length for the activity. The estimated stride length may be user-specific. In some implementations, different calibration techniques and/or calibration values may be associated with each identifiable activity and monitor location.

In some embodiments, an activity may additionally or alternatively be identified using the accelerometer traces and/or characteristic features generated from these traces. Once identified, the data may, for example, be qualified or the location of the activity monitor 100 may be identified using acceleration-derivative data Dn. With reference to FIG. 6A, it should be appreciated that, in some embodiments, feature characteristics may be generated from the acceleration-derivative data trace D (e.g., peak values, number of peaks in measurement interval Tm) and may be used to identify that the activity monitor is worn on the foot or ankle and therefore qualify the data as being of high quality. The same feature characteristics generated from the acceleration-derivative data trace D of FIG. 6B may, for example, be used to determine that the activity monitor is not worn on the foot or ankle and therefore qualify the data as being of low quality. In some embodiments, feature characteristics generated from the acceleration-derivative data alone, or in combination with feature characteristics generated from accelerometer data traces, may be used to identify a particular location of the activity monitor (e.g., belt, trouser pocket, shirt pocket, arm, wrist).

In some embodiments, an intensity value or credit for an activity may additionally or alternatively be reduced by a preselected value when it is recognized that the activity monitor is located in a non-optimal location. For example and returning to the example of FIGS. 6A-6C, the system may compute an intensity for the activity (walking in this case) in a standard way, but then de-rate the computed intensity (e.g., multiply the computed intensity by a pre-selected value that would lower the energy expenditure of the user) since the quality of the data is less than optimal. The devaluing of the data may, for example, depend upon the location of the activity monitor, e.g., one value used for a belt location, another value used for a pants pocket location, etc.

In some embodiments, characteristics of an activity may additionally or alternatively be inferred by the activity monitor from prior high quality data, and an intensity for the activity may be computed accordingly. Returning again to the example of FIGS. 6A-6C, the activity monitor 100 may, for example, store in memory 120, or provide for storage in an external memory device, one or more samples of high quality data (FIG. 6A) when such data is collected and the monitor is worn in an optimal or near optimal location for characterizing the activity. In some embodiments, when the activity is repeated and the inference engine identifies the activity but with the monitor worn in a non-optimal location (FIG. 6B or 6C), the activity monitor may recall from memory higher quality data with a cadence that matches the currently sensed activity. The higher quality data may, for example, be repeatedly provided to activity engine 340-m for subsequent processing. As the currently sensed cadence changes, different samples may, for example, be retrieved from storage. In some embodiments, the samples retrieved from storage may depend on additional values of the currently sensed signals other than cadence, e.g., peak values, widths of peaks, minimum values.

In some embodiments, the activity monitor may additionally or alternatively provide a measure of confidence along with data output by an activity engine 340-m. For example, the monitor may indicate a confidence level in the recognition of the activity (e.g., >90% confidence, >75% confidence, >95% confidence), and may also indicate a level of quality of the data (e.g., best, fair, poor). Confidence may be determined, for example, by how centrally each measured feature characteristic falls within a membership function, or based upon a calculated value of the cost factor (e.g., value calculated in accordance with EQ. 8) for an activity, or how well a measured pattern matches a reference pattern. Quality of the data may be determined, for example, based upon an identified location of the activity monitor when worn by the user during the identified activity.

Another form of calibration may additionally or alternatively be implemented in some embodiments, and is referred to herein as “step conversion.” Step conversion between different activities based on metabolic equivalents has been studied previously. By way of example, if one were to ride a bicycle there would be no steps, but rather a cadence. Cadence in biking may, for example, be converted to an equivalent number of walking steps through the use of MET equivalents. The concept of MET equivalents for various activities is described, for example, in Ainsworth, B, et al., 2011 Compendium of Physical Activities: A Second Update of Codes and MET Values, Medicine & Science in Sports & Exercise, August 2011, which is hereby incorporated by reference in its entirety.

In some embodiments, using the value of the measured activity parameter and a calculated intensity value from the specific exercise, the METs that are calculated for the measured activity (e.g., bicycling) may be translated to an equivalent value (e.g., a number of steps) for another activity (e.g., walking). Furthermore, in some embodiments, the steps and intensity that are so determined may also be translated into relative speed and distance. In some embodiments, such calculations may be performed for any activity where the intensity can be determined by the activity monitor 100. Accordingly, in some embodiments, detection of one activity may be converted to equivalent efforts for other activities.
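
As an illustrative sketch only of such a step conversion (the MET values and the assumed walking step rate below are hypothetical, not values from the compendium cited above):

    # Express a non-stepping activity as equivalent walking steps by scaling a
    # nominal walking step rate with the ratio of metabolic intensities.
    def equivalent_steps(activity_mets, duration_min,
                         walking_mets=3.5, walking_steps_per_min=100):
        return duration_min * walking_steps_per_min * (activity_mets / walking_mets)

    print(equivalent_steps(activity_mets=7.0, duration_min=30))   # -> 6000.0 steps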

As described above, in some embodiments, the calibration values, characteristic features, membership functions, and/or computation algorithms used by the activity monitor 100 may be added and/or revised when the device is interfaced with a computer via transceiver 140. It should thus be appreciated that in such embodiments the activity monitor may be personalized to become more accurate for a given activity, for a given location of being worn, and/or for a particular user. For example, when the device detects accelerometer data that cannot be recognized by the inference engine 320, the device can log the data along with any related characteristic features generated from the data, the time, and duration of the non-recognized activity. In some embodiments, when subsequently in communication with an external device having a user interface, such as a computer, smart phone, PDA, or similar device, the user may, for example, be queried to identify the intensity, activity, and/or location of the monitor. In such embodiments, the information may, for example, be returned to the device, and a new membership function, features, and/or identification algorithm may be defined for the activity. The membership function and/or identification algorithm may, in some embodiments, be produced external to the activity monitor and downloaded. In some embodiments, one or more new activity engines 340-m may additionally or alternatively be added for the purpose of personalizing the activity monitor 100.

In some embodiments, the activity monitor 100 may collect quantifiable information about one's activity, and can identify any one of a plurality of different activities being performed by the user. In some implementations, the activity monitor may, for example, identify an activity independently of where the activity monitor is worn, and also identify where the monitor is worn for the activity. Further, in some embodiments, the activity monitor 100 may additionally or alternatively provide quality metrics associated with the sensed data. In some embodiments, the data collection and processing may, for example, be done at low power using data reduction techniques, a variable clock rate microcontroller, and fuzzy logic data processing.

In some embodiments, the activity monitor 100 may be useful for broad community challenges, allowing for users to be able to readily compare themselves to each other. The activity monitor may, for example, be used for more accurately handicapping users of different performance capabilities. Some embodiments of the activity monitor 100 may, for example, allow healthcare providers, insurance companies and employers to more accurately assess fitness levels and exercise regimens of individuals and provide appropriate incentives accordingly.

All literature and similar material cited in this application, including, but not limited to, patents, patent applications, articles, books, treatises, and web pages, regardless of the format of such literature and similar materials, are expressly incorporated by reference in their entirety. In the event that one or more of the incorporated literature and similar materials differs from or contradicts this application, including but not limited to defined terms, term usage, described techniques, or the like, this application controls.

The section headings used herein are for organizational purposes only and are not to be construed as limiting the subject matter described in any way.

While the present teachings have been described in conjunction with various embodiments and examples, it is not intended that the present teachings be limited to such embodiments or examples. On the contrary, the present teachings encompass various alternatives, modifications, and equivalents, as will be appreciated by those of skill in the art.

While various inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, and/or methods, if such features, systems, articles, materials, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.

The above-described embodiments of the invention can be implemented in any of numerous ways. For example, some embodiments may be implemented using hardware, software or a combination thereof. When any aspect of an embodiment is implemented at least in part in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.

In this respect, various aspects of the invention, e.g., feature generator 310, preprocessor 305, inference engine 320, activity engines 340-m, and data service 360, may be embodied at least in part as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other tangible computer storage medium or non-transitory medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the technology discussed above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present technology as discussed above.

The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of the present technology as discussed above. Additionally, it should be appreciated that according to one aspect of this embodiment, one or more computer programs that when executed perform methods of the present technology need not reside on a single computer, processor, or microcontroller, but may be distributed in a modular fashion amongst a number of different computers, processors, or microcontrollers to implement various aspects of the present technology.

Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.

Also, the technology described herein may be embodied as a method, of which at least one example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.

All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.

The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”

The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.

As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e. “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.” “Consisting essentially of,” when used in the claims, shall have its ordinary meaning as used in the field of patent law.

As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.

The claims should not be read as limited to the described order or elements unless stated to that effect. It should be understood that various changes in form and detail may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims. All embodiments that come within the spirit and scope of the following claims and equivalents thereto are claimed.

Claims

1. An activity monitor comprising:

an accelerometer; and
a microprocessor, wherein the microprocessor is configured to compute derivatives of first acceleration data received from a first axis of the accelerometer to form first acceleration-derivative values when the accelerometer is supported by a subject in motion and to process the first acceleration-derivative values to identify an activity of the subject.

2. The activity monitor of claim 1, further comprising:

a power source;
a transceiver; and
power-management circuitry configured to sense an activity level of the activity monitor and apply power to or remove power from at least the microprocessor responsive to the sensed activity level of the activity monitor.

3. The activity monitor of claim 1, wherein the activity monitor includes a strap or a clip for attaching the activity monitor to an article of clothing or strapping the activity monitor to a subject.

4. The activity monitor of claim 1, wherein the identified activity is one activity from among a plurality of activities identifiable by the activity monitor, the plurality of activities comprising two or more activities selected from the following group: walking, running, biking, swimming, using a type of exercise machine, rowing, cross-country skiing, jumping-jacks, sit-ups, push-ups, pull-ups, and jumping rope.

5. The activity monitor of claim 4, wherein the microprocessor is further configured to identify, from the processed acceleration-derivative values, a falsified human activity and/or an activity that a human is not capable of performing.

6. The activity monitor of claim 1, wherein the accelerometer is a multi-axis accelerometer and the microprocessor is further configured to compute derivatives of acceleration data received from each axis of the multi-axis accelerometer to form multi-axis acceleration-derivative values and to combine the multi-axis acceleration-derivative values according to the following relation:

D_n = Σ_{i=1}^{m} D_{i,n}

where D_n represents a combined acceleration-derivative value and D_{i,n} represents an nth computed acceleration-derivative value for an ith axis of the multi-axis accelerometer.
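
For illustration only (not part of the claims), the following minimal sketch shows one way the per-axis derivative computation of claim 1 and the summation of claim 6 might be realized. The sampling period, the finite-difference scheme, and all names (`axis_derivatives`, `combined_derivatives`, `dt`) are assumptions; the claims fix none of them.

```python
# Hypothetical sketch of claims 1 and 6: per-axis finite-difference
# "acceleration derivatives" combined by summation across m axes,
# D_n = sum over axes i of D_{i,n}.  Sample rate and difference scheme
# are assumed, not taken from the claims.

def axis_derivatives(samples, dt=0.02):
    """Approximate d(acceleration)/dt for one axis by finite differences."""
    return [(b - a) / dt for a, b in zip(samples, samples[1:])]

def combined_derivatives(multi_axis_samples, dt=0.02):
    """Combine per-axis derivative streams: D_n = sum over axes of D_{i,n}."""
    per_axis = [axis_derivatives(axis, dt) for axis in multi_axis_samples]
    return [sum(values) for values in zip(*per_axis)]

# Example: three-axis accelerometer data (in g) sampled at an assumed 50 Hz.
xyz = [
    [0.01, 0.12, 0.35, 0.20, 0.05],   # x axis
    [0.00, 0.03, 0.10, 0.07, 0.02],   # y axis
    [0.98, 1.10, 1.40, 1.15, 0.99],   # z axis (gravity plus motion)
]
print(combined_derivatives(xyz))
```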

7. The activity monitor of claim 1, wherein the microprocessor is configured to:

form the first acceleration-derivative values into a first stream of data; and
identify from the first stream of data at least one characteristic feature by extracting peak values and/or peak widths from the first data stream.

8. The activity monitor of claim 1, wherein the microprocessor is configured to:

form the first acceleration-derivative values into a first stream of data; and
identify from the first stream of data at least one characteristic feature by determining maximum and minimum values of the first data stream.

9. The activity monitor of claim 1, wherein the microprocessor is further configured to:

form the first acceleration-derivative values into a first stream of data;
process the first stream of data to identify at least one characteristic feature of the first stream of data;
provide the at least one characteristic feature to a fuzzy inference engine; and
analyze the at least one characteristic feature with the fuzzy inference engine to identify one activity from among a plurality of different activities.
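
As an illustrative sketch only (not part of the claims), the following shows one plausible reading of claims 7-9: extracting maximum, minimum, and peak features from a derivative stream and scoring them with simple fuzzy membership functions. The membership shapes, thresholds, and activity labels are invented for illustration.

```python
# Hypothetical feature extraction and fuzzy scoring in the spirit of
# claims 7-9.  Thresholds, membership functions, and activity labels
# are assumptions; the claims specify none of them.

def extract_features(stream):
    """Characteristic features of a derivative stream (cf. claims 7 and 8)."""
    return {
        "max": max(stream),
        "min": min(stream),
        "peak_count": sum(
            1 for a, b, c in zip(stream, stream[1:], stream[2:]) if b > a and b > c
        ),
    }

def triangular(x, lo, mid, hi):
    """Simple triangular fuzzy membership function."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (mid - lo) if x < mid else (hi - x) / (hi - mid)

def classify(features):
    """Score each candidate activity and return the best match (cf. claim 9)."""
    scores = {
        "walking": triangular(features["max"], 0.5, 2.0, 4.0),
        "running": triangular(features["max"], 3.0, 6.0, 10.0),
    }
    return max(scores, key=scores.get)

stream = [0.2, 1.1, 2.3, 1.0, 0.4, 1.8, 2.1, 0.9]
print(classify(extract_features(stream)))  # -> "walking" for this toy stream
```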

10. The activity monitor of claim 9, wherein the microprocessor is further configured to:

provide the first acceleration data and/or the at least one characteristic feature to one of a plurality of activity analysis engines based on the identification of the one activity; and
calculate, with the one activity analysis engine, parameters characterizing the activity.

11. The activity monitor of claim 10, wherein the parameters include one or more parameters selected from the following group: a measure of pace of the activity, a measure of energy expended during the activity, a measure of distance traveled during the activity, and a duration of the activity.

12. The activity monitor of claim 11, wherein the microprocessor is further configured to calculate at least one quality metric for one or more of the parameters, the at least one quality metric indicating a reliability of the one or more of the parameters.
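
The following sketch (illustrative only, not part of the claims) shows one way the dispatch of claims 10-12 could be organized: routing data to an activity-specific analysis routine and attaching a quality metric to the computed parameters. The stride lengths, formulas, and quality measure are placeholders, not values taken from the disclosure.

```python
# Hypothetical dispatch to activity-specific analysis engines (cf. claims
# 10-12).  All constants and formulas below are placeholder assumptions.

def analyze_walking(step_count, duration_s, stride_m=0.75):
    """Toy walking engine: distance and pace from an assumed stride length."""
    distance_m = step_count * stride_m
    pace_m_per_s = distance_m / duration_s if duration_s else 0.0
    return {"distance_m": distance_m, "pace_m_per_s": pace_m_per_s,
            "duration_s": duration_s}

def analyze_running(step_count, duration_s, stride_m=1.1):
    """Toy running engine: same form as walking, longer assumed stride."""
    return analyze_walking(step_count, duration_s, stride_m)

ENGINES = {"walking": analyze_walking, "running": analyze_running}

def analyze(activity, step_count, duration_s, signal_quality=0.9):
    """Route to the engine for the identified activity; attach a quality metric."""
    params = ENGINES[activity](step_count, duration_s)
    # Placeholder quality metric: simply report the observed signal quality.
    params["quality"] = round(signal_quality, 2)
    return params

print(analyze("walking", step_count=1200, duration_s=600))
```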

13. The activity monitor of claim 1, wherein the microprocessor is further configured to identify a location at which the activity monitor is supported by the subject and to select a data processing algorithm based on the identified location.

14. A method comprising:

computing, by at least one microprocessor, derivatives of first acceleration data received from an accelerometer to form first acceleration-derivative values, the first acceleration data being representative of acceleration along a first axis of the accelerometer; and
processing the first acceleration-derivative values to identify an activity sensed by the accelerometer.

15. The method of claim 14, further comprising:

computing derivatives of at least second acceleration data received from the accelerometer to form at least second acceleration-derivative values, the at least second acceleration data being representative of acceleration along at least a second axis of the accelerometer; and
combining the first and at least second acceleration-derivative values according to the following relation:

D_n = Σ_{i=1}^{m} D_{i,n}

where D_n represents a combined acceleration-derivative value and D_{i,n} represents an nth computed acceleration-derivative value for an ith axis of the accelerometer.

16. The method of claim 14, wherein the accelerometer is disposed in an activity monitor and the method further comprises clipping the activity monitor to an article of clothing or attaching the activity monitor to a subject.

17. The method of claim 14, further comprising:

forming a first data stream of the first acceleration-derivative values over a first period of time;
identifying at least one characteristic feature of the first data stream;
providing the at least one characteristic feature to a fuzzy inference engine; and
analyzing the at least one characteristic feature with the fuzzy inference engine to identify one activity from among a plurality of different activities.

18. The method of claim 17, wherein the identifying at least one characteristic feature comprises extracting peak values and/or peak widths from the first data stream.

19. The method of claim 17, wherein the identifying at least one characteristic feature comprises determining maximum and minimum values of the first data stream.

20. The method of claim 17, wherein the identifying at least one characteristic feature comprises calculating an average value of acceleration from the first acceleration data for the first period of time.

21. The method of claim 17, wherein the plurality of different activities includes two or more activities selected from the following group: walking, running, biking, swimming, using a type of exercise machine, rowing, cross-country skiing, jumping-jacks, sit-ups, push-ups, pull-ups, and jumping rope.

22. The method of claim 17, wherein the plurality of different activities includes falsified human activities.

23. The method of claim 17, further comprising:

determining that the at least one characteristic feature is representative of an activity not recognized by the fuzzy inference engine; and
receiving machine-readable instructions and data enabling the fuzzy inference engine to subsequently identify an activity corresponding to the at least one characteristic feature.

24. The method of claim 17, wherein the fuzzy inference engine uses historical data specific to a user in evaluating the at least one characteristic feature to identify the one activity from among the plurality of different activities.

25. The method of claim 17, further comprising:

providing the first acceleration data and/or the at least one characteristic feature to one of a plurality of activity analysis engines based on the identification of the one activity; and
calculating, with the one activity analysis engine, parameters characterizing the activity.

26. The method of claim 25, wherein the parameters include one or more parameters selected from the following group: a measure of pace of the activity, a measure of energy expended during the activity, a measure of distance traveled during the activity, and a duration of the activity.

27. The method of claim 25, further comprising calculating at least one quality metric for one or more of the parameters, the at least one quality metric indicating a reliability of the one or more of the parameters.

Patent History
Publication number: 20130158686
Type: Application
Filed: Nov 30, 2012
Publication Date: Jun 20, 2013
Applicant: FitLinxx, Inc. (Shelton, CT)
Application Number: 13/690,313
Classifications