WEARABLE MENTAL STATE MONITOR COMPUTER APPARATUS, SYSTEMS, AND RELATED METHODS

A computer-implemented method of assessing the mental state of an individual by: (1) providing the individual with a wearable device (e.g., eyewear) that includes one or more sensors for assessing the mental state of the individual; (2) using information from one or more of the sensors to assess the mental state of the individual; and (3) informing the individual or a third party of the individual's mental state. In various embodiments, the method further involves using the wearable device to determine one or more environmental factors that are related to the individual's mental state. For example, the method may involve determining (e.g., from one or more images taken using the wearable device) that the individual is frequently in a stressed emotional state when a particular person is present, when the individual is engaged in a particular activity, and/or when the wearer experiences a certain internal or external context.

Description
CROSS-REFERENCE TO RELATED APPLICATION

This application claims the benefit of U.S. Provisional Patent Application No. 62/046,406, filed Sep. 5, 2014, entitled, “Wearable Health Computer Apparatus, Systems, and Related Methods,” which is hereby incorporated herein by reference in its entirety.

BACKGROUND

Being able to detect stress in a person, and the cause of that stress based on personal and environmental factors, is of importance to many people. Accordingly, there is a need for improved systems and methods for monitoring and detecting stress for an individual. Various embodiments of the present systems and methods recognize and address the foregoing considerations, and others, of prior art systems and methods.

SUMMARY OF THE VARIOUS EMBODIMENTS

In general, in various embodiments, a computer-implemented method of assessing the mental state of a wearer of a wearable device comprises providing eyewear to the wearer that comprises one or more sensors coupled to the eyewear that are adapted to detect one or more characteristics of the wearer of the eyewear. The one or more characteristics are associated with the wearer's mental state. The system also receives one or more signals from the one or more sensors, wherein each of the one or more signals relates to at least one characteristic associated with the wearer. The at least one characteristic is selected from the group consisting of: (1) pupil size; (2) heart rate; (3) perspiration level; (4) respiration rate; (5) movement; and (6) brainwave activity. In various embodiments, the system analyzes the one or more received signals to determine the at least one characteristic associated with the wearer. The system may then facilitate the determination of a mental state of the wearer based on the at least one characteristic, and associate the mental state of the wearer with an object and/or an activity.

In various embodiments, a computer-implemented method of assessing the mental state of a wearer of a wearable device comprises providing the wearer with eyewear comprising a front-facing camera and an eye-facing camera that are adapted to detect one or more characteristics of the wearer of the eyewear. The one or more characteristics of the wearer are associated with the wearer's mental state. The system receives one or more first images from the eye-facing camera, wherein at least one of the one or more first images relates to at least one characteristic associated with the wearer. The system also analyzes, by a processor, the one or more first images to determine the at least one characteristic. The method includes facilitating determination of a mental state of the wearer based on the at least one characteristic.

BRIEF DESCRIPTION OF THE DRAWINGS

Various embodiments of systems and methods for assessing a user's mental state are described below. In the course of this description, reference will be made to the accompanying drawings, which are not necessarily drawn to scale and wherein:

FIG. 1 is a block diagram of a Mental State Monitoring System in accordance with an embodiment of the present system.

FIG. 2 is a block diagram of the Mental State Server of FIG. 1.

FIGS. 3A-3B depict a flowchart that generally illustrates various steps executed by a Mental State Monitoring Module according to a particular embodiment.

FIG. 4 is a perspective view of eyewear that may serve as the Wearable Health Monitoring Device 156 of FIG. 1.

DETAILED DESCRIPTION OF SOME EMBODIMENTS

Various embodiments will now be described more fully hereinafter with reference to the accompanying drawings. It should be understood that the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.

OVERVIEW

A wearable mental state monitoring system, in various embodiments, may, for example, be embodied in any suitable wearable device configured to monitor the mental state of a wearer. The system may, for example, be embodied as a pair of eyewear, as contact lenses, as a wristwatch, as a suitable piece of clothing (e.g., such as a suitable shirt, pair of pants, undergarment, compression sleeve, etc.), as footwear, as a hat, as a helmet, as an orthopedic cast, or any other suitable wearable item. In a particular example, a wearable mental state monitoring system embodied as a pair of eyewear may enable the system to access one or more (e.g., all five) of a wearer's senses (e.g., touch, sight, sound, smell, and taste) based at least in part on a proximity of the eyewear to the wearer's sensory systems (e.g., eyes, mouth, ears, nose) when worn by the wearer.

In various embodiments, the system comprises one or more sensors configured to determine one or more attributes of the wearer's mental state. The one or more sensors may be coupled to the wearable device in any suitable way. For instance, the one or more sensors may be embedded into the wearable device, coupled to the wearable device, and/or operatively coupled to the wearable device. The one or more sensors may include, for example, one or more heart rate monitors, one or more electrocardiograms (EKG), one or more electroencephalograms (EEG), one or more pedometers, one or more thermometers, one or more transdermal transmitter sensors, one or more front-facing cameras, one or more eye-facing cameras, one or more microphones, one or more accelerometers, one or more gyroscopes, one or more blood pressure sensors, one or more pulse oximeters, one or more respiration rate sensors, one or more blood alcohol concentration (BAC) sensors, one or more near-field communication sensors, or any other suitable one or more sensors. In particular embodiments, the system is configured to gather data, for example, using the one or more sensors, about the wearer (e.g., such as temperature, balance, heart rate, activity, activity levels, food eaten, medications taken, steps taken, position, movements, facial muscle movements, etc.).

In various embodiments, the sensors sense the mental state of the wearer by monitoring certain characteristics of the wearer including changes in pupil size, heart rate, perspiration level, composition of the wearer's perspiration, respiration rate, movement, brainwave activity, and/or any other suitable characteristic. The system then determines the wearer's mental state based on these characteristics. For instance, where the wearer has an increase in heart rate and/or perspiration level, the system may determine that the wearer is in a state of emotional stress. After determining the wearer's mental state, the system may notify the wearer or a third party (e.g., the wearer's physician) of the mental state. In particular embodiments, the system may also, or alternatively, save the mental state and related information to computer memory. In various embodiments, the system may notify the wearer via the wearable device or through a notification sent to a mobile device associated with the wearer. The system may also provide the wearer with one or more suggestions on how to address the wearer's current mental state. For instance, when the wearer is in a state of emotional stress, the system may suggest that the user meditate, remove themselves from their current physical or social situation, execute one or more exercises, etc.
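For illustration only, the characteristic-to-state determination described above could be sketched as a simple rule-based classifier. The threshold values, state labels, and function names below are hypothetical assumptions introduced for this sketch, not values taken from the disclosure:

```python
# Hypothetical sketch of rule-based mental-state determination from a
# few wearer characteristics; thresholds and labels are illustrative.

def assess_mental_state(heart_rate_bpm, perspiration_level, resting_hr_bpm=70):
    """Map wearer characteristics to a coarse mental-state label."""
    elevated_hr = heart_rate_bpm > 1.2 * resting_hr_bpm
    elevated_perspiration = perspiration_level > 0.7  # normalized 0..1 (assumed scale)
    if elevated_hr and elevated_perspiration:
        return "emotionally stressed"
    if elevated_hr or elevated_perspiration:
        return "aroused"
    return "calm"

def suggest_action(state):
    """Return a suggestion for addressing the wearer's current state."""
    suggestions = {
        "emotionally stressed": "meditate or leave the current situation",
        "aroused": "take a few deep breaths",
        "calm": "no action needed",
    }
    return suggestions[state]
```

A real embodiment would tune such thresholds per wearer (e.g., against a measured resting baseline) rather than use fixed constants.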

In various embodiments, while the system is using one or more sensors (e.g., eyewear based sensors) to assess the mental state of the wearer, the system may also (e.g., at least substantially simultaneously) capture one or more images of a person or other object located in close proximity to the wearer (e.g., using a camera, such as a forward-facing camera associated with eyewear worn by the wearer). In various embodiments, the system may capture and analyze the image to determine whether the person or other object in the image caused the wearer's mental state. In other embodiments, the system may simply capture the image to determine what the wearer was viewing while in a state of emotional stress. If the system determines that the person or other object in the image caused the wearer's mental state, the system may notify the wearer (or other individual) of the association and may also provide the wearer with suggested actions to address the wearer's mental state.

In various embodiments, the system may also capture one or more images to determine reoccurring mental states (e.g., mental state patterns) in the wearer's life. For instance, if on a first occasion when the wearer is speaking to a particular person, the wearer becomes emotionally stressed, the system may capture an image of the particular person. If later, on a second occasion when the wearer is speaking to the particular person, the wearer again becomes emotionally stressed, the system may again capture an image of the particular person. The system may compare the first image with the second image. If the system determines that the first and second images are of the same particular person, the system may notify the wearer that the wearer's mental state when speaking to the particular person is “stressed”. The system may then provide suggestions to the wearer on how to address the wearer's mental state when in close proximity to the particular person.
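The recurring-trigger logic described above could be sketched, for illustration, as a counter keyed on a recognized person. Real image matching would require a face-recognition model; here the "images" are stand-in labels assumed to come from a hypothetical recognizer, and the class and threshold are illustrative assumptions:

```python
# Illustrative sketch of detecting reoccurring mental-state patterns
# (e.g., stress whenever a particular person is present).

from collections import defaultdict

class TriggerTracker:
    def __init__(self, threshold=2):
        self.stress_counts = defaultdict(int)
        self.threshold = threshold  # occurrences before notifying the wearer

    def record_stress_event(self, recognized_person):
        """Log that the wearer was stressed while this person was present;
        return a notification once the pattern repeats enough times."""
        self.stress_counts[recognized_person] += 1
        if self.stress_counts[recognized_person] >= self.threshold:
            return f"mental state when near {recognized_person}: stressed"
        return None
```

On the first occasion `record_stress_event` returns nothing; on the second occasion with the same person it returns the notification, mirroring the two-image comparison in the passage above.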

The system may use one or more similar techniques to determine that a user is typically in a particular mental state (e.g., relaxed, happy, emotionally stressed) when engaged in a particular activity (e.g., sailing, yoga, golf, commuting, a particular work activity). In various embodiments, the system may be adapted to automatically determine that the wearer is engaged in a particular activity based on information from one or more of the sensors that the wearer is currently wearing (e.g., eyewear based sensors). For example, the system may use information from a forward-facing camera installed in a pair of eyewear that the individual is wearing to determine that the individual is on a golf course (e.g., by detecting the presence of a hole, a pin, a fairway, etc. in one or more images taken by the forward-facing camera) or that the individual is driving (e.g., by detecting the presence of roadway "stripes" in an image from the forward-facing camera).
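The scene-based activity determination above might be sketched as a mapping from detected objects to activities. The object detector itself is out of scope here; the cue sets and labels are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch of inferring the wearer's activity from objects
# detected in forward-facing camera images.

ACTIVITY_CUES = {
    frozenset({"hole", "pin", "fairway"}): "golfing",
    frozenset({"roadway stripes", "steering wheel"}): "driving",
}

def infer_activity(detected_objects):
    """Match detected scene objects against known activity signatures."""
    detected = set(detected_objects)
    for cues, activity in ACTIVITY_CUES.items():
        if cues <= detected:  # all cue objects present in the scene
            return activity
    return "unknown"
```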

Exemplary Technical Platforms

As will be appreciated by one skilled in the relevant field, the present systems and methods may be, for example, embodied as a computer system, a method, or a computer program product. Accordingly, various embodiments may be entirely hardware or a combination of hardware and software. Furthermore, particular embodiments may take the form of a computer program product stored on a computer-readable storage medium having computer-readable instructions (e.g., software) embodied in the storage medium. Various embodiments may also take the form of Internet-implemented computer software. Any suitable computer-readable storage medium may be utilized including, for example, hard disks, compact disks, DVDs, optical storage devices, and/or magnetic storage devices.

Various embodiments are described below with reference to block diagram and flowchart illustrations of methods, apparatuses (e.g., systems), and computer program products. It should be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by a computer executing computer program instructions. These computer program instructions may be loaded onto a general purpose computer, a special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions that execute on the computer or other programmable apparatus create means for implementing the functions specified in the flowchart block or blocks.

The computer instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on a user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including but not limited to: a local area network (LAN); a wide area network (WAN); a cellular network; or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).

These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner such that the instructions stored in the computer-readable memory produce an article of manufacture that is configured for implementing the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process (e.g., method) such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.

Example System Architecture

FIG. 1 is a block diagram of a Mental State Monitoring System 100 according to particular embodiments. As may be understood from this figure, the Mental State Monitoring System 100 includes One or More Networks 115, One or More Third Party Servers 50, a Mental State Monitoring Server 120 that may, for example, be adapted to execute a Mental State Monitoring Module 300, a Database 140, One or More Remote Computing Devices 154 (e.g., such as a smart phone, a tablet computer, a wearable computing device, a laptop computer, a desktop computer, etc.), and One or More Wearable Health Monitoring Devices 156, which may, for example, be embodied as one or more of eyewear, headwear, clothing, a watch, a hat, a helmet, a cast, an adhesive bandage, a piece of jewelry (e.g., a ring, earring, necklace, bracelet, etc.), or any other suitable wearable device. In particular embodiments, the one or more computer networks 115 facilitate communication between the One or More Third Party Servers 50, the Mental State Monitoring Server 120, Database 140, One or More Remote Computing Devices 154, and the one or more Health Monitoring Devices 156.

The one or more networks 115 may include any of a variety of types of wired or wireless computer networks such as the Internet, a private intranet, a mesh network, a public switched telephone network (PSTN), or any other type of network (e.g., a network that uses Bluetooth or near field communications to facilitate communication between computing devices). The communication link between the One or More Remote Computing Devices 154 and the Mental State Monitoring Server 120 may be, for example, implemented via a Local Area Network (LAN) or via the Internet.

FIG. 2 illustrates a diagrammatic representation of the architecture for the Mental State Monitoring Server 120 that may be used within the Mental State Monitoring System 100. It should be understood that the computer architecture shown in FIG. 2 may also represent the computer architecture for any one of the One or More Remote Computing Devices 154, one or more Third Party Servers 50, and one or more Health Monitoring Devices 156 shown in FIG. 1. In particular embodiments, the Mental State Monitoring Server 120 may be suitable for use as a computer within the context of the Mental State Monitoring System 100 that is configured for determining a mental state of a wearer by detecting characteristics of the wearer using signals received from sensors coupled to the eyewear.

In particular embodiments, the Mental State Monitoring Server 120 may be connected (e.g., networked) to other computing devices in a LAN, an intranet, an extranet, and/or the Internet as shown in FIG. 1. As noted above, the Mental State Monitoring Server 120 may operate in the capacity of a server or a client computing device in a client-server network environment, or as a peer computing device in a peer-to-peer (or distributed) network environment. The Mental State Monitoring Server 120 may be a desktop personal computing device (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, a switch or bridge, or any other computing device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that computing device. Further, while only a single computing device is illustrated, the term “computing device” shall also be interpreted to include any collection of computing devices that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.

An exemplary Mental State Monitoring Server 120 includes a processing device 202, a main memory 204 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 206 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 218, which communicate with each other via a bus 232.

The processing device 202 represents one or more general-purpose or specific processing devices such as a microprocessor, a central processing unit (CPU), or the like. More particularly, the processing device 202 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or processor implementing other instruction sets, or processors implementing a combination of instruction sets. The processing device 202 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 202 may be configured to execute processing logic 226 for performing various operations and steps discussed herein.

The Mental State Monitoring Server 120 may further include a network interface device 208. The Mental State Monitoring Server 120 may also include a video display unit 210 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), an alpha-numeric input device 212 (e.g., a keyboard), a cursor control device 214 (e.g., a mouse), and a signal generation device 216 (e.g., a speaker).

The data storage device 218 may include a non-transitory computing device-accessible storage medium 230 (also known as a non-transitory computing device-readable storage medium, a non-transitory computing device-readable medium, or a non-transitory computer-readable medium) on which is stored one or more sets of instructions (e.g., the Mental State Monitoring Module 300) embodying any one or more of the methodologies or functions described herein. The one or more sets of instructions may also reside, completely or at least partially, within the main memory 204 and/or within the processing device 202 during execution thereof by the Mental State Monitoring Server 120—the main memory 204 and the processing device 202 also constituting computing device-accessible storage media. The one or more sets of instructions may further be transmitted or received over a network 115 via a network interface device 208.

While the computing device-accessible storage medium 230 is shown in an exemplary embodiment to be a single medium, the term “computing device-accessible storage medium” should be understood to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “computing device-accessible storage medium” should also be understood to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the computing device and that causes the computing device to perform any one or more of the methodologies of the present invention. The term “computing device-accessible storage medium” should accordingly be understood to include, but not be limited to, solid-state memories, optical and magnetic media, etc.

Exemplary System Platform

As noted above, a system, according to various embodiments, is adapted to assess the mental state of a wearer of a wearable device. Various aspects of the system's functionality may be executed by certain system modules, including the Mental State Monitoring Module 300. The Mental State Monitoring Module 300 is discussed in greater detail below.

Mental State Monitoring Module

FIG. 3A is a flow chart of operations performed by an exemplary Mental State Monitoring Module 300, which may, for example, run on the Mental State Monitoring Server 120, or any suitable computing device (such as the One or More Health Monitoring Devices 156 or a suitable mobile computing device). In particular embodiments, the Mental State Monitoring Module 300 may assess a wearer's mental state and make suggestions to the wearer to address a particular mental state associated with the wearer.

The system begins, in various embodiments, at Step 305 by providing eyewear comprising one or more sensors coupled to the eyewear. In various embodiments, the system may do this by, for example: (1) facilitating delivery of the eyewear to an address associated with a particular individual; (2) facilitating distribution of the eyewear from a healthcare worker to the individual; and/or (3) placing an order for the eyewear with a third party for delivery to the individual. In other embodiments, this step may be executed manually (e.g., by a human being) rather than by a computer.

In various embodiments, the one or more sensors that are coupled to the eyewear (or other health monitoring device) are adapted to detect one or more characteristics of a wearer of the eyewear, wherein the one or more characteristics of the wearer are associated with the wearer's mental state. In various embodiments, the sensors coupled to the eyewear or other health monitoring device may include, for example, one or more of the following: a heart rate monitor, an electrocardiogram (EKG), an electroencephalogram (EEG), a pedometer, a thermometer, a front-facing camera, an eye-facing camera, a microphone, an accelerometer, a gyroscope, a magnetometer, a blood pressure sensor, a pulse oximeter, a respiration rate sensor, a blood alcohol concentration (BAC) sensor, a skin conductance response sensor, a near-field communication sensor, or any other suitable sensor. In particular embodiments, the sensors coupled to the eyewear comprise an eye-facing camera, a front-facing camera, and a heart rate monitor.

In various embodiments, the one or more sensors are coupled to a computing device that is associated with (e.g., embedded within, attached to) the eyewear or other health monitoring device. In particular embodiments, the eyewear or other health monitoring device comprises at least one processor, computer memory, suitable wireless communications components (e.g., a Bluetooth chip) and a power supply for powering the health monitoring device and/or the various sensors.

In particular embodiments, the sensors may be physically coupled to the eyewear in any suitable way. For example, in various embodiments, the sensors may be embedded into the eyewear. In some embodiments, the sensors may be positioned along the brow bar of the eyewear. In other embodiments, the sensors may be positioned along one or more of the temples of the eyewear. In still other embodiments, the sensors may be coupled to one or more of the lenses of the eyewear. As noted above, the one or more sensors may be coupled to a Bluetooth device that is configured to transmit the one or more signals to a handheld wireless device, and the step of receiving one or more signals from the one or more sensors (discussed below in reference to Step 310) further comprises receiving the one or more signals from the wireless handheld device (e.g., via the Internet). In particular embodiments, one or more of the sensors may be detachable from the eyewear. For instance, if a wearer does not need a temperature sensor or other particular sensor, the sensor may be removed from the eyewear.

The system continues, at Step 310, by receiving one or more signals from the one or more sensors, wherein each of the one or more signals relates to at least one characteristic associated with the wearer. In particular embodiments, the one or more signals that relate to the at least one characteristic associated with the wearer may include one or more signals that may be used to derive: (1) the wearer's heart rate; (2) the wearer's heart rhythm; (3) a distance traveled by the wearer; (4) the wearer's body temperature; (5) one or more images associated with the wearer or the environment; (6) one or more sounds associated with the wearer's body or environment; (7) a speed traveled by the wearer; (8) the wearer's blood pressure; (9) the wearer's oxygen saturation level; (10) the wearer's brainwave activity (e.g., the location of the brainwave relative to the wearer's brain, the frequency of the brainwave, and the type of brainwave (e.g., gamma waves, beta waves, alpha waves, theta waves, and delta waves)); (11) the wearer's pupil size; (12) the wearer's perspiration level; (13) the wearer's hydration level; (14) the wearer's respiration rate; (15) the number and/or cadence of steps taken by the wearer; (16) the movement of one or more of the wearer's facial muscles; (17) one or more biochemical changes within the wearer's body; (18) changes in one or more characteristics of the wearer's skin (e.g., skin paleness or clamminess); and/or (19) any other suitable attribute of the wearer or the wearer's environment. For instance, the system may receive a signal from an eye-facing camera associated with the eyewear that the wearer's brow is furrowed at the same time that the system receives a signal from the heart rate sensor that the wearer's heart rate is above a predetermined target heart rate.
In various embodiments, the system may store data related to the signals and/or data derived from this data for later review and use in determining the mental state of the wearer.

In particular embodiments, the system may receive one or more of the above-referenced signals substantially automatically. In various embodiments, the system may receive one or more of the signals on a substantially periodic basis (e.g., by the second, by the minute, hourly, daily, etc.). For example, the system may receive one or more signals every thirty seconds throughout the day. In other embodiments, the system may receive one or more signals at least partially in response to receiving an indication from the wearer that the system should receive a signal. For instance, the wearer may speak a voice command to the wearable device requesting that the device take the wearer's blood pressure. In various embodiments, the system may receive an indication from the wearer of when to have the system receive the signal. For example, the system may receive an indication from the wearer to have the system conduct a brain scan of the user (e.g., receive at least one brainwave signal from the EEG) at 8:00 a.m. and at 2:00 p.m. on a particular day. In particular embodiments, the system may receive a request from the wearer to have a particular signal received from a particular sensor at the same time that the system receives a second particular signal from a second particular sensor. For example, when the system receives a signal that indicates that the user's respiration rate has increased, the system may, at least partially in response to receiving the increased respiration rate signal, also obtain an image of the wearer's eye from an eye-facing camera associated with the eyewear.
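The acquisition policies described above — periodic polling plus a conditional follow-up capture when one signal crosses a threshold — could be sketched as follows. The sensor interface (name-to-callable mapping), the respiration threshold, and the camera stand-in are all illustrative assumptions:

```python
# Illustrative sketch of periodic signal receipt with a conditional
# follow-up: an elevated respiration rate triggers an extra capture
# from the eye-facing camera, as in the example in the text above.

def poll_sensors(periodic_sensors, eye_camera, cycles=3):
    """Read every periodic sensor each cycle; if the respiration rate
    exceeds an (assumed) threshold, also capture an eye image."""
    readings = []
    for _ in range(cycles):
        sample = {name: read() for name, read in periodic_sensors.items()}
        if sample.get("respiration_rate", 0) > 20:  # breaths/min, assumed
            sample["eye_image"] = eye_camera()
        readings.append(sample)
        # A deployed system would sleep here (e.g., 30 s between cycles).
    return readings
```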

In some embodiments, the system receives a signal of an image captured by the eyewear. In various embodiments, the system receives a plurality of images captured by the eyewear. In particular embodiments, the system receives the image from the front-facing camera. In some embodiments, the system receives the image substantially automatically from the front-facing camera. In other embodiments, the system may receive the image in response to receiving an indication from the wearer to capture the image. For example, the system may receive a voice command from the wearer to capture the image. In various embodiments, the system may store the captured image in local or remote memory. In some embodiments, the image captured by the eyewear may be a video.

In various embodiments, the system may receive only one signal from a single sensor associated with the eyewear. In other embodiments, the system may receive a signal from a plurality of the sensors associated with the eyewear. In yet other embodiments, the system may receive multiple signals from one or more of the sensors. In various embodiments, the system may be configured to receive a first signal from a first sensor at the same time that it receives a second signal from a second sensor. For example, the system may be configured to receive an image signal from a front-facing camera associated with the eyewear at the same time that the system receives a heart rate signal from a heart rate sensor associated with the eyewear. As a further example, the system may be configured to simultaneously receive a signal from both an eye-facing camera and an EEG associated with the eyewear.

Next, at Step 315, the system analyzes the one or more received signals to determine at least one characteristic associated with the wearer. For example, the system may analyze the one or more received signals to determine that the at least one characteristic associated with the wearer is an increase or decrease in pupil size. In particular embodiments, the system may analyze the one or more received signals to determine that the at least one characteristic associated with the wearer is an increase or decrease in heart rate. In some embodiments, the system may analyze the one or more received signals to determine that the at least one characteristic associated with the wearer is an increase or decrease in the wearer's perspiration rate. In other embodiments, the system may analyze the one or more received signals to determine that the at least one characteristic associated with the wearer is an increase or decrease in respiration rate. In yet other embodiments, the system may analyze the one or more received signals to determine that the at least one characteristic associated with the wearer is an increase or decrease in movement. In still other embodiments, the system may analyze the one or more received signals to determine that the at least one characteristic associated with the wearer is an increase or decrease in brainwave activity and/or frequency. In various embodiments, the system may analyze the one or more received signals to determine that the at least one characteristic associated with the wearer is a change in location of the wearer's brainwave activity. In yet other embodiments, the system may analyze the one or more received signals to determine that the at least one characteristic associated with the wearer is a change in the type of the wearer's brainwaves (e.g., a change from gamma waves to delta waves).
In some of these embodiments, the system may analyze the one or more received signals to determine that the at least one characteristic associated with the wearer is an increase in heart rate in conjunction with an increase in pupil size. It should be understood from this disclosure that the system may analyze the one or more signals to determine that the at least one characteristic associated with the wearer is a combination of characteristics that allows the system to determine the mental state of the wearer.
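For illustration only, the per-characteristic analysis described above might be sketched as follows. The baseline values, sample readings, and 10% change threshold are assumptions for the sketch and are not part of this disclosure:

```python
from statistics import mean

def detect_change(samples, baseline, threshold=0.10):
    """Classify a window of sensor samples as an increase, decrease,
    or no change relative to a baseline value. The 10% threshold is
    an illustrative assumption."""
    current = mean(samples)
    if current > baseline * (1 + threshold):
        return "increase"
    if current < baseline * (1 - threshold):
        return "decrease"
    return "no change"

def combined_characteristics(signals, baselines):
    """Evaluate several signals together, e.g., heart rate in
    conjunction with pupil size, returning one label per characteristic."""
    return {name: detect_change(samples, baselines[name])
            for name, samples in signals.items()}

# Example: heart rate rising while pupil size also rises.
changes = combined_characteristics(
    {"heart_rate": [95, 98, 101], "pupil_size_mm": [5.2, 5.4, 5.6]},
    {"heart_rate": 70, "pupil_size_mm": 4.0},
)
```

A real implementation would receive streaming samples from the eyewear's sensors rather than hard-coded lists.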

In various embodiments, the system may store the results of the analysis of the one or more received signals for later comparison with past and future analyses of the one or more received signals. In particular embodiments, the system may analyze a particular received signal at a particular time of day (e.g., morning, noon, night, etc.). In some embodiments, the system may analyze a particular received signal at the same time that the system analyzes a second particular received signal. For instance, the system may analyze the front-facing camera signal when it analyzes the heart rate signal. In particular embodiments, the system may chart the analysis of the one or more received signals in a visual diagram. For example, the system may chart the changes in the wearer's heart rate in a diagram displayed, e.g., on the lens of the eyewear or on a separate display screen associated with the eyewear.

At Step 320, the system facilitates determination of a mental state of the wearer based on the at least one characteristic. In various embodiments, the system may facilitate determination of the mental state of the wearer substantially automatically. In particular embodiments, the system may facilitate determination of the mental state of the wearer in response to receiving a manual indication or request from the wearer to determine the wearer's mental state. For example, the system may receive a voice command from the wearer requesting that the system determine the mental state of the wearer at that time. In still other embodiments, the system may facilitate determination of the mental state of the wearer after comparing various signals to predetermined thresholds to establish that the signal is indicative of at least one characteristic. In various embodiments, the system may facilitate determination of the mental state of the wearer at random. For instance, the system may determine the mental state of the wearer at different, randomly selected times throughout the day. In some embodiments, the system may facilitate determination of the mental state of the wearer periodically throughout the day at specified times. For instance, the system may determine the mental state of the wearer at 8:00 a.m., 10:00 a.m., 12:00 p.m., 2:00 p.m., and 4:00 p.m. on a given day. In other embodiments, the system may determine the mental state of the wearer at predetermined intervals of time. For instance, the system may determine the mental state of the wearer every 30 minutes or every two hours.
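The threshold comparison and interval-based scheduling described above can be sketched, under assumed illustrative values, along the following lines:

```python
def indicative(reading, threshold):
    """Compare a signal reading to a predetermined threshold to
    establish that it is indicative of a characteristic. The threshold
    value itself is an assumption for illustration."""
    return reading >= threshold

def scheduled_times(start_hour, end_hour, interval_minutes):
    """Generate (hour, minute) determination times at a fixed interval,
    e.g., every two hours between 8:00 a.m. and 4:00 p.m."""
    times = []
    minutes = start_hour * 60
    while minutes <= end_hour * 60:
        times.append(divmod(minutes, 60))
        minutes += interval_minutes
    return times
```

For example, `scheduled_times(8, 16, 120)` reproduces the 8:00 a.m. through 4:00 p.m. every-two-hours schedule mentioned above.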

In other embodiments, the determination of the mental state of the wearer may be that the wearer is in a particular emotional state (e.g., happy, sad, anxious, calm, scared, angry, surprised, ashamed, envious, curious, relaxed, emotionally stressed, confused, moody, etc.). For instance, the determination of the mental state of the wearer may be that the wearer is under emotional stress (i.e., the wearer's mental state is "emotionally stressed"). In some embodiments, the determination of the mental state of the wearer may include the wearer's experience of the wearer's current physical state (e.g., the wearer's experience of bodily pain, nausea, rapid heartbeat, etc.). For example, the determination of the wearer's mental state may include determining that the wearer is responding mentally and/or physically to sustaining a concussion. In other embodiments, the determination of the wearer's mental state may include a determination of the wearer's cognitive state (e.g., inability to concentrate, poor judgment, racing thoughts, constant worrying, etc.). For example, the determination of the wearer's mental state may be that the wearer is having memory problems. In yet other embodiments, the determination of the mental state of the wearer may include a mental state associated with a particular behavior (e.g., eating more or less, sleeping too much or too little, isolation from others, procrastinating, neglecting responsibilities, using alcohol or drugs, nervous habits or twitches, etc.). For instance, the determination of the mental state may be that the wearer is overly sleepy (which the system may determine, for example, by determining that the wearer is sleeping more than a predetermined amount of time per day).

In various embodiments, the system may determine the mental state of the wearer using a scaled rating. For example, where the system is determining a mental state such as stress of the wearer, the system may determine that the wearer's mental state (e.g., stress level) is low, moderate, or high. In some embodiments, the system may determine that the mental state of the wearer includes multiple mental states. For instance, the system may determine that the wearer is both emotionally stressed and anxious.
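The scaled rating and the possibility of multiple concurrent mental states described above might, purely for illustration, be represented as follows; the numeric 0-100 stress scale and its cut-offs are assumptions, not values from this disclosure:

```python
def stress_rating(score):
    """Map an assumed numeric stress score (0-100) onto the
    low/moderate/high scaled rating described above."""
    if score < 34:
        return "low"
    if score < 67:
        return "moderate"
    return "high"

def combined_states(ratings):
    """A wearer may be in several mental states at once, e.g., both
    emotionally stressed and anxious; return those currently present."""
    return {state for state, present in ratings.items() if present}
```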

After determining the wearer's mental state, at Step 325 the system may optionally associate the mental state of the wearer with at least one object (e.g., an animate object, such as a particular person or animal, or an inanimate object, such as a scary clown doll), at least one activity (e.g., bowling, playing tennis, driving, working on a computer, or running), at least one external condition (e.g., the user's current workload at work, the user's current credit rating, the user's marital or dating status), and/or at least one internal condition (e.g., the user's current weight, health, etc.). In yet other embodiments, the system may associate the mental state of the wearer with both an animate object and an inanimate object (e.g., a person and an object; for example, the system may determine that the wearer is typically emotionally stressed when they see both the wearer's spouse and a checkbook), or any other combination of factors described herein. In particular embodiments, the system may associate the mental state of the wearer with both a particular object and an activity (for example, the system may determine that the wearer is typically relaxed when the wearer's spouse is present and the wearer is watching TV).

In various embodiments, the system may associate the mental state of the wearer with the user's external or internal context. In some embodiments, the external or internal context may include one or more of the following for a particular time period: (1) the user's current health; (2) the user's general state of mind; (3) the user's current spending behavior; (4) the current weather conditions at the user's current location (e.g., pollen count, UV index, air quality, precipitation, wind speed and direction, barometric pressure, humidity, outdoor temperature, season, cloud levels); (5) one or more social media entries made or received by the individual at the particular time; (6) the user's credit rating at the particular time; (7) the user's employment status at the particular time; (8) the user's housing information for the particular time; (9) information from one or more e-mails at or around the particular time; (10) the user's exercise activity during the particular time; (11) indoor temperature within the wearer's home or workplace; and/or (12) any other external information that may be relevant to and/or have an impact on the user's mental state.
In order to determine this information, the system may receive information from and/or regarding, for example: (1) the wearer's health records; (2) the wearer's genetics; (3) the wearer's family history; (4) one or more of the wearer's physical attributes; (5) the wearer's workout schedule; (6) one or more social media accounts associated with the wearer; (7) one or more of the wearer's social behaviors; (8) the wearer's arrest history; (9) the weather at a user's current location (e.g., as determined from a suitable weather service); (10) the user's calendar; (11) the user's spending behaviors; (12) the wearer's credit history; (13) the wearer's employment status and/or history; (14) the wearer's marital status; (15) the wearer's current residence; (16) an email account of the wearer; (17) the wearer's travel history; (18) the wearer's aspirations; (19) the wearer's goals; (20) dietary information for the wearer, etc.

For example, the system may associate the wearer's mental state (e.g., "emotionally stressed", "unhappy", etc.) with the activity of paying bills by: (1) identifying an entry on the wearer's electronic calendar that indicates that the wearer will be paying bills in a particular time slot on a particular day; (2) confirming that the user is paying bills in the particular time slot by identifying a checkbook in an image taken by the system's front-facing camera during the particular time slot; and (3) determining the wearer's mental state during the time slot using any suitable technique, such as those described herein.
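The three-step bill-paying example above can be sketched as a simple function; the calendar fields, the detected-object list, and the "checkbook" confirmation object are illustrative assumptions:

```python
def associate_activity(calendar_entry, detected_objects, mental_state,
                       expected_object="checkbook"):
    """Step 1 supplies the calendar entry naming the activity; Step 2
    confirms it by an object detected in a camera image; Step 3 attaches
    the mental state measured in that slot. Returns None when the
    activity cannot be confirmed."""
    if expected_object in detected_objects:
        return {"activity": calendar_entry["activity"],
                "mental_state": mental_state,
                "time_slot": calendar_entry["time_slot"]}
    return None

assoc = associate_activity(
    {"activity": "paying bills", "time_slot": "Sat 10:00-11:00"},
    ["desk", "checkbook", "pen"],
    "emotionally stressed",
)
```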

Similar techniques may be used to determine longer-lasting impacts of certain external or internal conditions on a wearer's mental state. For example, the system may use any suitable technique to determine the wearer's general mental state (e.g., relatively happy, relatively sad, under relative stress) over a predetermined number of hours, days, weeks, or months, by comparing the wearer's aggregate (e.g., average) mental state for that time period with the wearer's aggregate mental state for one or more other time periods (e.g., of similar length), or by comparing the wearer's aggregate mental state for the time period with typical mental states of the wearer or one or more other individuals. The system may then access any suitable information (e.g., any of the types of object, activity, internal context or external context information described herein, or other information) to determine what may be contributing to the mental state (e.g., using any suitable data analysis techniques). For example, the system may determine that the wearer is typically relatively happy in weeks in which the wearer is exercising more than five times per week and has a balance of less than $1,000 on their personal credit card, and experiences relatively high stress levels in weeks in which the user is exercising less than two times per week.
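The aggregate comparison of time periods described above might be sketched as follows; the numeric mood scale (higher = happier) is an assumption for illustration:

```python
from statistics import mean

def aggregate_state(daily_scores):
    """Average a period's daily mood scores into an aggregate mental
    state (the numeric scale is an illustrative assumption)."""
    return mean(daily_scores)

def compare_periods(current, reference):
    """Compare the current period's aggregate state against a reference
    period of similar length, as described above."""
    cur, ref = aggregate_state(current), aggregate_state(reference)
    if cur > ref:
        return "happier than reference period"
    if cur < ref:
        return "less happy than reference period"
    return "about the same"
```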

In various embodiments, the system may associate the mental state of the wearer with a person, object, activity, internal context, or external context using one or more signals received from the eyewear's sensors and/or any other suitable data received by the system. In various embodiments, the system may associate the mental state of the wearer with a person, object, or activity by receiving a manual input from the wearer. For instance, the system may receive an indication or request from the wearer to associate a particular person with the wearer's current mental state. In these embodiments, the system may then capture an image of the person being viewed by the wearer, identify the person in the image and associate the current mental state of the wearer with the identified person. In some such embodiments, the system may also monitor the wearer's mental state when the wearer next encounters the identified person and track the wearer's mental state over time each time the wearer encounters the identified person. In this way, the system can alert the wearer if the identified person continually causes the wearer to experience the same or similar mental state each time the wearer encounters the identified person. In other embodiments, the system may associate the mental state of the wearer with an object (e.g., an animate or inanimate object) or activity substantially automatically after (e.g., in response to) determining the mental state of the wearer.

In various embodiments, the system may be configured to associate the mental state of the wearer with an animate or inanimate object by examining a received image of a first object (e.g., a first person) located in the received image. In particular embodiments, the system determines the object in the received image by identifying the object located in the image using any suitable image recognition techniques. In other embodiments, the system determines the object in the received image by comparing the image of the object with one or more stored images. At least partially in response to determining that the image of the object at least substantially matches a stored image, the system may identify the object as being the same as the known object (e.g., person or thing) in the stored image.
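The stored-image comparison described above might, purely as a sketch, operate on feature vectors produced by any suitable image recognition technique; the feature extraction step itself, the example vectors, and the similarity tolerance are assumptions outside this disclosure:

```python
def match_object(candidate, stored, tolerance=0.9):
    """Compare a candidate image's feature vector against stored
    feature vectors of known objects, returning the first label whose
    cosine similarity at least substantially matches (meets the
    assumed tolerance), else None."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb)
    for label, features in stored.items():
        if cosine(candidate, features) >= tolerance:
            return label
    return None
```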

In some embodiments, the system may associate the mental state of the wearer with an object based on the proximity of the object to the wearer. For example, the system may use a front-facing camera to determine that a particular person is in close proximity to the wearer (e.g., in front of the wearer) when the wearer is experiencing a particular mental state such as stress. In some embodiments, the proximity of a person to the wearer may be determined using an electronic device on the person (e.g., a cellphone, an RFID tag, etc.).

In various embodiments, the system may associate the mental state of the wearer with an activity based on the wearer performing the activity. For example, the system may use the system's processor and front-facing camera to determine, from images taken by the front-facing camera, that the user is climbing stairs. The system may then assess the wearer's mental state (e.g., in any suitable way described herein) and associate that mental state with stair climbing. For example, the system may determine that the user is typically mentally relaxed when climbing stairs.

Continuing to Step 330, the system stores, in memory, the association between the wearer's mental state and the object, activity, internal context, and/or external context causing the wearer's mental state. In various embodiments, the system may store the association between the wearer's mental state and the object, activity, external or internal context substantially automatically after (e.g., in response to) making the association between the wearer's mental state and the object, activity, internal context and/or external context. In particular embodiments, the system may store the association after (e.g., in response to) receiving manual input from the wearer requesting that the system store the association. In various embodiments, the system may store the association for a specified period of time. For example, the system may store the association for a day, a month, a year, etc. in the system's Database 140. In some embodiments, the system may store the association on any suitable server, database, or device. In particular embodiments, the system may store the association on the Mental State Monitoring Server 120.
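Step 330's storage of associations might be sketched with an in-memory SQLite table standing in for the system's Database 140; the schema and column names are assumptions for illustration only:

```python
import sqlite3

# In-memory table standing in for the system's Database 140.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE associations (
    mental_state TEXT, trigger TEXT, trigger_kind TEXT, recorded_at TEXT)""")

def store_association(state, trigger, kind, timestamp):
    """Persist one association between a mental state and the object,
    activity, or context it was associated with."""
    conn.execute("INSERT INTO associations VALUES (?, ?, ?, ?)",
                 (state, trigger, kind, timestamp))
    conn.commit()

store_association("emotionally stressed", "paying bills", "activity",
                  "2014-09-05T10:30:00")
rows = conn.execute("SELECT mental_state, trigger FROM associations").fetchall()
```

A deployed system would instead write to persistent storage on the device or on a remote server such as the Mental State Monitoring Server 120.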

Next, at Step 335, the system notifies the wearer or other suitable individual of the association of the wearer's mental state and the object, activity, internal context, and/or external context. For example, in some embodiments, the system may notify the wearer and the wearer's physician of the association of the wearer's mental state with the object in the received image and/or the activity that the wearer was engaged in while experiencing the mental state. In various embodiments, the system notifies the wearer of the association by displaying an image on the lens of the eyewear, or a display screen associated with the eyewear. In other embodiments, the system notifies the wearer of the association by communicating the association through an audio speaker to the wearer. In some embodiments, the system notifies the wearer of the association by sending a notification to the wearer's mobile device. In particular embodiments, the system notifies the wearer of the association via an electronic communication such as an email or text message. In other embodiments, the system may notify the wearer of a single association substantially immediately after (e.g., in response to) the system associates the wearer's mental state with a particular object. In yet other embodiments, the system may notify the wearer of all associations made on a particular day (or within another particular time period).

In other embodiments, the system may notify the wearer of the association after (e.g., in response to) detecting a particular event. For example, the system may notify the wearer of the association after the system no longer detects the presence or proximity of a particular object. In some embodiments, the system may notify the wearer of the association after a particular period of time. For instance, the system may notify the wearer of an association one hour after the system associates the wearer's mental state with a particular object or activity. In still other embodiments, the system may notify the wearer of the association at a particular time of day. As an example, the system may notify the wearer of an association between the wearer's mental state and an object, activity, internal context, and/or external context at the end of the day.
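The notification behavior described above can be sketched as a small dispatcher; the channel names and message format are illustrative assumptions:

```python
def notify(association, channel):
    """Format a Step 335 notification for one of the delivery channels
    described above (e.g., lens display, audio, mobile push, email, or
    text message)."""
    message = (f"Your mental state '{association['mental_state']}' was "
               f"associated with {association['trigger']}.")
    return {"channel": channel, "message": message}

note = notify({"mental_state": "emotionally stressed",
               "trigger": "paying bills"}, "mobile")
```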

Continuing to Step 340, the system provides the wearer with one or more suggested actions to address the wearer's current mental state. In various embodiments, the system may provide suggested actions to the wearer in any suitable way. In various embodiments, the system may provide one or more suggested actions to the wearer by displaying an image on the lens of the eyewear, or on a display screen associated with the eyewear. In other embodiments, the system may provide one or more suggested actions to the wearer by communicating through an audio speaker to the wearer. In some embodiments, the system provides one or more suggested actions to the wearer by sending a notification to the wearer's mobile device. In particular embodiments, the system may provide one or more suggested actions to the wearer via an electronic communication, such as an email or text message. In still other embodiments, the system may provide a single suggested action to the wearer. In yet other embodiments, the system may provide multiple suggested actions to the wearer.

In various embodiments, the system may provide suggested actions to the wearer to address the wearer's current mental state substantially immediately after the system notifies the wearer of the association between the wearer's mental state and the particular object, activity, internal context and/or external context. In other embodiments, the system may provide one or more suggested actions to the wearer after (e.g., in response to) detecting a particular event. For example, the system may provide suggested actions to the wearer after the system no longer detects the presence or proximity of a particular object. In some embodiments, the system may provide one or more suggested actions to the wearer after a particular period of time. For instance, the system may provide suggested actions to the wearer one hour after the system associates the wearer's mental state with the particular object. In still other embodiments, the system may provide one or more suggested actions to the wearer at a particular time of day. As an example, the system may provide one or more suggested actions to the wearer at the end of the day.

In various embodiments, the one or more suggested actions to address the wearer's current mental state may be one or more techniques for reducing stress. In particular embodiments, the suggested actions to address the wearer's current mental state may include, for example: (1) taking medication; (2) praying; (3) engaging in one or more yoga poses; (4) taking deep breaths; and/or (5) avoiding a particular object (e.g., a particular person or thing) based on the association between the wearer's mental state and the object. In other embodiments, the suggested actions to address the wearer's current mental state may include suggesting that the wearer visit links to websites containing information on the particular mental state of the wearer. In some embodiments, the suggested actions to address the wearer's current mental state may include a listing of suggested applications on the wearer's mobile device. For instance, where the wearer is in a stressed mental state, the system may suggest a yoga application on the wearer's mobile device to assist in improving the wearer's mental state. As a further example, if the system determines that the wearer typically experiences high levels of stress when the wearer is carrying high amounts of credit card debt, the system may suggest that the user reduce their debt levels.
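A minimal sketch of such a suggestion lookup follows; the mapping entries are illustrative assumptions drawn from the examples above, not an exhaustive or prescribed set:

```python
# Illustrative mapping of mental states to suggested actions.
SUGGESTIONS = {
    "stressed": ["take deep breaths",
                 "try a yoga application on your mobile device",
                 "avoid the associated object where practical"],
    "overly sleepy": ["review your sleep schedule"],
}

def suggest_actions(mental_state, limit=None):
    """Return zero or more suggested actions for a mental state,
    optionally capped at a single suggestion."""
    actions = SUGGESTIONS.get(mental_state, [])
    return actions if limit is None else actions[:limit]
```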

In various embodiments, the system, when executing the Mental State Monitoring Module 300, may omit particular steps, perform particular steps in an order other than the order presented above, or perform additional steps not discussed directly above. It should also be understood that various steps executed "in response to" a particular event occurring, or a particular condition being satisfied, may also be executed "at least partially in response to" the particular event occurring or the particular condition being satisfied.

Structure of the Eyewear

As shown in FIG. 4, eyewear 400, according to various embodiments, includes: (1) an eyewear frame 410; (2) a first temple 412; and (3) a second temple 414. These various components are discussed in more detail below. In particular embodiments, the eyewear 400 is computerized and may serve as the wearable health monitoring device 156 of FIG. 1.

Eyewear Frame

Referring still to FIG. 4, eyewear 400, in various embodiments, includes any suitable eyewear frame 410 configured to support one or more lenses 418, 420. In the embodiment shown in this figure, the eyewear frame 410 has a first end 402 and a second end 404. The eyewear frame 410 may be made of any suitable material such as metal, ceramic, polymers or any combination thereof. In particular embodiments, the eyewear frame 410 is configured to support the first and second lenses 418, 420 about the full perimeter of the first and second lenses 418, 420. In other embodiments, the eyewear frame 410 may be configured to support the first and second lenses 418, 420 about only a portion of each respective lens. In various embodiments, the eyewear frame 410 is configured to support a number of lenses other than two lenses (e.g., a single lens, a plurality of lenses, etc.). In particular embodiments, the lenses 418, 420 may include prescription lenses, sunglass lenses, or any other suitable type of lens (e.g., reading lenses, non-prescription lenses), which may be formed from glass or polymers.

The eyewear frame 410 includes a first and second nose pad 422 (not shown in figure), 424, which may be configured to maintain the eyewear 400 adjacent the front of a wearer's face such that the lenses 418, 420 are positioned substantially in front of the wearer's eyes while the wearer is wearing the eyewear 400. In particular embodiments, the nose pads 422, 424 may comprise a material that is configured to be comfortable when worn by the wearer (e.g., rubber, etc.). In other embodiments, the nose pads may include any other suitable material (e.g., plastic, metal, etc.). In still other embodiments, the nose pads may be integrally formed with the frame 410.

The eyewear frame 410 includes a first and second hinge 426, 428 that attach the first and second temples 412, 414 to the frame first and second ends 402, 404, respectively. In various embodiments, the hinges may be formed by any suitable connection (e.g., tongue and groove, ball and socket, spring hinge, etc.). In particular embodiments, the first hinge 426 may be welded to, or integrally formed with, the frame 410 and the first temple 412 and the second hinge 428 may be welded to, or integrally formed with, the frame 410 and the second temple 414.

First and Second Temples

As shown in FIG. 4, the first temple 412, according to various embodiments, is rotatably connected to the frame 410 such that the first temple 412 may extend at any angle between substantially perpendicular and substantially parallel to the frame 410. The first temple 412 has a first and second end 412a, 412b. Proximate the first temple second end 412b, the first temple 412 includes an earpiece 413 configured to be supported by a wearer's ear. Similarly, the second temple 414, according to various embodiments, is rotatably connected to the frame 410 such that the second temple 414 may extend at any angle between substantially perpendicular and substantially parallel to the frame 410. The second temple 414 has a first and second end 414a, 414b. Proximate the second temple second end 414b, the second temple 414 includes an earpiece 415 configured to be supported by a wearer's ear.

Sensors

In various embodiments, the second temple 414 has one or more sensors 430 connected to the second temple 414. In various embodiments, the one or more sensors 430 may be coupled to the frame 410, the first and second temples 412, 414, the first and second lenses 418, 420, or any other portion of the eyewear 400 in any suitable way. For instance, the one or more sensors 430 may be embedded into the eyewear 400, coupled to the eyewear 400, and/or operatively coupled to the eyewear 400. In various embodiments, the one or more sensors may be formed at any point along the eyewear 400. For instance, a fingerprint reader may be disposed adjacent the first temple of the eyewear 400. In various embodiments, the one or more sensors may be formed in any shape. In addition, the one or more sensors may be formed on the inner (back) surface of the frame 410, the first and second temples 412, 414, the first and second lenses 418, 420, or any other portion of the eyewear 400. In other embodiments, the one or more sensors may be formed on the outer (front) surface of the frame 410, the first and second temples 412, 414, the first and second lenses 418, 420, or any other portion of the eyewear 400.

In various embodiments, the one or more sensors 430 that are coupled to the eyewear (or other wearable device) are adapted to detect one or more characteristics of the eyewear or a wearer of the eyewear, wherein the one or more characteristics of the wearer are associated with the wearer's identity. In various embodiments, the one or more sensors coupled to the eyewear or other health monitoring device may include, for example, one or more of the following: a near-field communication sensor, a Bluetooth chip, a GPS unit, an RFID tag (passive or active), a fingerprint reader, an iris reader, a retinal scanner, a voice recognition sensor, a heart rate monitor, an electrocardiogram (EKG), an electroencephalogram (EEG), a pedometer, a thermometer, a front-facing camera, an eye-facing camera, a microphone, an accelerometer, a magnetometer, a blood pressure sensor, a pulse oximeter, a skin conductance response sensor, any suitable biometric reader, or any other suitable sensor. In some embodiments, the one or more sensors may include a unique shape, a unique code, or a unique design physically inscribed into the eyewear that may be readable by an individual or a remote computing device. In particular embodiments, the sensors coupled to the eyewear may include one or more electronic communications devices such as a near field communication sensor, a Bluetooth chip, an active RFID, and a GPS unit.

In various embodiments, the one or more sensors are coupled to a computing device that is associated with (e.g., embedded within, attached to) the eyewear or other wearable device. In particular embodiments, the eyewear or other wearable device comprises at least one processor, computer memory, suitable wireless communications components (e.g., a Bluetooth chip) and a power supply for powering the wearable device and/or the various sensors.

As noted above, the one or more sensors may be coupled to a Bluetooth device that is configured to transmit the one or more signals to a handheld wireless device, and the step of using the eyewear to confirm the identity of the wearer of the eyewear (discussed above in reference to Step 310) further comprises receiving the one or more signals from the handheld wireless device (e.g., via the Internet). In particular embodiments, one or more of the sensors may be detachable from the eyewear. For instance, if a wearer does not need a temperature sensor or other particular sensor, the sensor may be removed from the eyewear.

Exemplary User Experience

Sense and Track Focused States

In a particular example of a wearer using the Mental State Monitoring Module 300 to monitor their mental state, the wearer may put on the wearable device in the morning and continue to wear the device throughout the day. During this time, the system tracks the brainwave activity, including the types of brainwaves, of the wearer using the system's EEG. The system particularly tracks the gamma wave activity of the wearer, which may, for example, when within an optimal range, signify a highly focused mental state of the wearer. The system may store the brainwave activity and chart the highs and lows of the wearer's brainwave activity throughout the day in order to determine the times throughout the day when the wearer is most focused. For example, where the wearer has relatively high gamma wave activity from 9:00 a.m. until 11:00 a.m., and relatively high theta wave activity from 2:00 p.m. until 3:00 p.m., the system will notify the wearer that the wearer's highest cognitive functioning occurs from 9:00-11:00 a.m. and the wearer's lowest cognitive functioning occurs from 2:00-3:00 p.m. The wearer may then use this information to alter the wearer's activities so that the wearer is focused on important matters from 9:00-11:00 a.m. and less important matters from 2:00-3:00 p.m.
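The charting of peak-focus hours described above might be sketched as a ranking over hourly gamma-wave readings; the hour-to-activity mapping and its relative scale are illustrative assumptions:

```python
def focus_windows(hourly_gamma, top_n=2):
    """Rank hours of the day by gamma-wave activity to find when the
    wearer is most focused; hourly_gamma maps hour-of-day to a relative
    activity level (the scale is an assumption)."""
    ranked = sorted(hourly_gamma, key=hourly_gamma.get, reverse=True)
    return sorted(ranked[:top_n])

# Assumed readings: strong gamma activity mid-morning, weak mid-afternoon.
hours = {9: 0.9, 10: 0.95, 11: 0.7, 14: 0.3, 15: 0.35}
peak = focus_windows(hours)
```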

Sense and Track Low-Productivity States

Similar to the system tracking highly focused states, the system, in a particular example, will also track low-productivity states. A period of low productivity may be measured, for example, by tracking the wearer's brainwave activity, the wearer's movements, the distance traveled by the wearer, and/or the wearer's speed of travel. The system may also capture images of the wearer's surroundings for use, for example, in determining one or more activities that the wearer is engaged in. For instance, in a period in which the wearer has not moved and has had a particular type of brainwave such as theta waves for an extended period of time, the system may capture a plurality of images from the system's front-facing camera in order to determine that the wearer has been sitting in front of a television during those periods of time. The system may then track the amount of time that the wearer spent in the particular state and then notify the wearer and provide the wearer with one or more recommendations, such as to exercise rather than watch TV, or to exercise while watching TV.
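The television example above combines three observations into one flag; as a sketch, with an assumed 60-minute threshold that is not part of the disclosure:

```python
def low_productivity(minutes_without_movement, dominant_wave,
                     detected_object, threshold_minutes=60):
    """Flag a low-productivity period per the example above: little or
    no movement, sustained theta waves, and a television identified in
    the front-facing camera images. The 60-minute threshold is an
    illustrative assumption."""
    return (minutes_without_movement >= threshold_minutes
            and dominant_wave == "theta"
            and detected_object == "television")
```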

Identify Periods of High Stress

In a further particular example of a wearer using the Mental State Monitoring Module 300 of the One or More Wearable Health Monitoring Devices 156 to monitor the wearer's mental state, the system tracks the wearer's various mental states through a typical day via the wearable device (e.g., a pair of eyewear). The system tracks the wearer's mental state by monitoring the wearer's pupil size, heart rate, perspiration rate, respiration rate, movement, and brainwave activity. The system may also monitor potential causes of a certain change in the wearer's pupil size, heart rate, perspiration level, respiration rate, movement, and brainwave activity using a front-facing camera.

For example, if the wearer encounters a particular person who causes the wearer high stress, as indicated by a rapid heartbeat and an increased perspiration rate in the wearer, the system captures an image of the person causing the high stress. The system may then present this image to the wearer with a suggestion not to interact with the particular person in the future. The system may also suggest other ways of interacting with the person should avoidance be impractical. For instance, if the wearer works for the particular person who causes the wearer high stress, the system may suggest writing down what the wearer needs to speak to the person about prior to interacting with the person. The system may, for example, provide these suggestions to the wearer through a notification sent to the wearer's mobile device so that other people around the wearer are not made aware of the wearer's high stress.
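The trigger condition for capturing the image can be sketched as a joint spike test on the two physiological signals. Using a baseline of mean plus two standard deviations is an assumption made for this sketch; the disclosure does not state how "rapid" or "increased" is quantified.

```python
import statistics

def stress_event(heart_rates, perspiration, n_sigma=2.0):
    """Flag a high-stress event when the latest heart-rate and
    perspiration-rate samples BOTH spike above their recent baselines
    (baseline mean + n_sigma standard deviations). On a flagged event
    the device would capture a front-facing image and queue a private
    notification to the wearer's mobile device.
    """
    def spiked(series):
        baseline, latest = series[:-1], series[-1]
        return latest > statistics.mean(baseline) + n_sigma * statistics.stdev(baseline)

    return spiked(heart_rates) and spiked(perspiration)

# Resting baseline, then a sudden spike when a stressful person appears:
hr = [62, 64, 63, 65, 61, 96]            # beats per minute
persp = [0.10, 0.12, 0.11, 0.10, 0.12, 0.35]  # arbitrary sweat-sensor units
```

Requiring both signals to spike, rather than either one alone, reduces false alarms from ordinary exertion such as climbing stairs, which raises heart rate without the matching perspiration jump.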

Monitor Behavior and Movements

In another example, the system may monitor and track one or more specific behaviors and/or movements of the wearer in order to diagnose particular mental states. For example, the system may track the wearer's movements to determine whether a wearer frequently has one or more simple motor tics, including, for example, eye blinking or other nervous eye movements, facial grimacing, shoulder shrugging, muscle twitches, or head or shoulder jerking, or one or more complex tics, such as facial grimacing combined with a head twist and a shoulder shrug. After monitoring the wearer's movements for a predetermined period of time (e.g., a day, a week, a month, etc.), the system may notify the wearer or the wearer's physician that the wearer has one or more behavioral characteristics that are consistent with a neurological condition such as Tourette syndrome.
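The screening step can be sketched as a frequency count over labeled movement events. The event labels, the one-week window, and the 20-per-day threshold are hypothetical values for illustration only; a real screen would be calibrated clinically.

```python
def tic_screen(events, window_days=7, per_day_threshold=20):
    """Count short, repetitive movement events (as labeled upstream from
    the accelerometer, gyroscope, and eye-facing camera) over a
    monitoring window, and flag the wearer for physician follow-up when
    the average daily count exceeds the threshold.
    """
    tic_types = {"eye_blink", "facial_grimace", "shoulder_shrug",
                 "head_jerk", "muscle_twitch"}
    tic_count = sum(1 for e in events if e in tic_types)
    return (tic_count / window_days) > per_day_threshold

# A week of logged movement events, most of them simple motor tics:
week = ["eye_blink"] * 100 + ["shoulder_shrug"] * 60 + ["walk"] * 40
```

With 160 tic-type events over seven days (about 23 per day), this log would trigger the notification to the wearer or physician described above, while a log of ordinary movements would not.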

CONCLUSION

Many modifications and other embodiments of the invention will come to mind to one skilled in the art to which this invention pertains, having the benefit of the teaching presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the invention is not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for the purposes of limitation.

Claims

1. A computer-implemented method of assessing the mental state of a wearer of eyewear comprising one or more sensors coupled to the eyewear, the one or more sensors being adapted to detect one or more characteristics of the wearer of the eyewear, wherein the one or more characteristics are associated with the wearer's mental state, the method comprising:

a. receiving, by a processor, one or more signals from the one or more sensors, wherein each of the one or more signals relates to at least one characteristic associated with the wearer, the at least one characteristic being selected from a group consisting of: i. pupil size, ii. heart rate, iii. perspiration rate, iv. respiration rate, v. physical movement, and vi. brainwave activity;
b. analyzing, by a processor, the one or more received signals to determine the at least one characteristic;
c. facilitating, by a processor, determination of a mental state of the wearer based, at least in part, on the at least one characteristic; and
d. associating, by a processor, the mental state of the wearer with at least one of one or more stimuli selected from a group consisting of: (1) an object; (2) an activity; (3) an internal context associated with the wearer; and (4) an external context associated with the wearer.

2. The computer-implemented method of claim 1, further comprising the step of providing a wireless transmitter coupled to the one or more sensors, the wireless transmitter being configured to transmit the one or more signals to a handheld computing device, wherein the step of receiving one or more signals from the one or more sensors further comprises receiving the one or more signals from the handheld computing device via the Internet.

3. The computer-implemented method of claim 1, wherein the one or more sensors comprises at least one sensor selected from a group consisting of:

a. an eye facing camera;
b. a forward facing camera;
c. an electrocardiogram sensor;
d. a heart rate monitor;
e. a microphone;
f. a skin conductance response sensor;
g. an accelerometer; and
h. a gyroscope.

4. The computer-implemented method of claim 3, wherein the eyewear comprises an eye facing camera, a forward facing camera, and a heart rate monitor.

5. The computer-implemented method of claim 3, further comprising the steps of:

a. determining, by a processor, that the wearer of the eyewear is under emotional stress;
b. in response to determining that the wearer of the eyewear is under emotional stress, capturing, by a processor, an image by the forward facing camera;
c. receiving, by a processor, the image;
d. identifying an object from within the received image; and
e. facilitating notification of the wearer of the association of the wearer's emotionally stressed state and the object in the image.

6. The computer-implemented method of claim 5, wherein the object is a particular individual.

7. The computer-implemented method of claim 6, wherein the step of identifying the particular individual comprises using one or more face recognition techniques to determine that a human is present within the image.

8. The computer-implemented method of claim 6, wherein the step of identifying the particular individual comprises using one or more face recognition techniques to determine the identity of the particular individual.

9. The computer-implemented method of claim 5, wherein the step of determining that the wearer of the eyewear is under emotional stress further comprises:

a. receiving, by a processor, an image of the wearer's pupil from the eye facing camera;
b. determining, by a processor, from the received image the size of the wearer's pupil;
c. comparing, by a processor, the size of the wearer's pupil to a predefined measurement of the wearer's pupil; and
d. at least partially in response to determining that the wearer's pupil size exceeds the predefined measurement of the wearer's pupil, determining, by a processor, that the wearer is in a state of emotional stress.

10. The computer-implemented method of claim 5, wherein the step of determining that the wearer of the eyewear is under emotional stress further comprises:

a. determining, by a processor, the respiration rate of the wearer from the received one or more signals;
b. comparing the determined respiration rate to a predefined respiration rate level; and
c. in response to determining that the respiration rate exceeds the predefined respiration rate level, determining, by a processor, that the wearer is under emotional stress.

11. The computer-implemented method of claim 3, further comprising the steps of:

a. receiving, by a processor, an image captured by the eyewear;
b. after receiving the image, identifying, by a processor, an object in the received image; and
c. notifying, by a processor, the wearer of the association of the wearer's mental state and the object in the received image.

12. The computer-implemented method of claim 11, wherein the object is a person.

13. The computer-implemented method of claim 3, further comprising the steps of:

a. notifying, by a processor, the wearer of the eyewear of their mental state; and
b. providing, by a processor, to the wearer one or more suggested actions to address their current mental state.

14. The computer-implemented method of claim 13, wherein:

a. the wearer's mental state is one of emotional stress; and
b. the one or more suggested actions comprise one or more techniques for reducing stress.

15. A computer-implemented method of assessing the mental state of a wearer of eyewear comprising one or more sensors coupled to the eyewear, the one or more sensors being adapted to detect one or more characteristics of the wearer of the eyewear, wherein the one or more characteristics are associated with the wearer's mental state, the method comprising:

a. receiving, by a processor, one or more first images from an eye facing camera, wherein at least one of the one or more first images relates to at least one characteristic associated with the wearer;
b. analyzing, by a processor, the at least one or more first images to determine the at least one characteristic; and
c. facilitating, by a processor, determination of a mental state of the wearer based on the at least one characteristic.

16. The computer-implemented method of claim 15, further comprising the steps of:

a. receiving, by a processor, one or more second images from a front facing camera;
b. determining, by a processor, the mental state of the wearer based at least in part on one or more wearer characteristics selected from a group consisting of: i. the wearer's pupil size, ii. the wearer's respiration rate, iii. the wearer's perspiration rate, iv. the paleness of the wearer's skin, and v. the wearer's brain activity level; and
c. associating, by a processor, the mental state of the wearer with a factor selected from a group consisting of: i. an object in close proximity to the wearer; and ii. an activity in which the wearer is participating.

17. The computer-implemented method of claim 16, wherein the step of associating the mental state of the wearer with the factor further comprises the steps of:

a. detecting, by a processor, the object in the one or more second images; and
b. storing, in memory, the association of the wearer's mental state with the object.

18. The computer-implemented method of claim 16, wherein the step of associating the mental state of the wearer with the factor further comprises the steps of:

a. using the one or more second images to determine an activity that the wearer was engaged in when the one or more images were taken; and
b. storing, in memory, the association of the wearer's mental state with the activity.

19. The computer-implemented method of claim 16, wherein the eyewear further comprises a near-field communication sensor that is configured to detect the presence of the factor, the method further comprising the steps of:

a. receiving, by a processor, a signal from the near-field communication sensor;
b. determining, by a processor, the identity of the factor from the received signal; and
c. storing, by a processor, the identity of the factor and the mental state of the wearer in memory.

20. The computer-implemented method of claim 16, wherein the step of determining the mental state of the wearer further comprises:

a. measuring, by a processor, a pupil size of the wearer;
b. measuring, by a processor, a respiration rate level of the wearer;
c. comparing, by a processor, i. the pupil size of the wearer to a predefined pupil size; and ii. the respiration rate of the wearer to a predefined respiration rate; and
d. determining, by a processor, that the wearer is in a stressful state in response to determining that the measured pupil size exceeds the predefined pupil size and the measured respiration rate exceeds the predefined respiration rate.

21. A system for assessing the mental state of a wearer of a wearable device comprising:

a. at least one processor; and
b. memory operatively coupled to the at least one processor,
wherein the at least one processor is configured to: i. receive one or more signals from a device selected from a group consisting of: a helmet and eyewear, the device having one or more sensors embedded therein, wherein each of the one or more signals relates to at least one characteristic associated with the wearer, the at least one characteristic selected from a group consisting of: 1. pupil size, 2. heart rate, 3. perspiration rate, 4. respiration rate, 5. movement, and 6. brainwave activity; ii. analyze the one or more received signals to determine the at least one characteristic; iii. determine a mental state of the wearer based at least in part on the at least one characteristic; and iv. associate the mental state of the wearer with one or more stimuli selected from a group consisting of: (1) an object; (2) an activity; (3) an internal context of the wearer; and (4) an external context of the wearer.

22. The system of claim 21, wherein the at least one processor is further configured to monitor at least one of the wearer's pupil size and brainwave activity over time to determine whether the wearer has sustained a concussion.

Patent History
Publication number: 20160066829
Type: Application
Filed: Dec 19, 2014
Publication Date: Mar 10, 2016
Inventors: Jay William Sales (Citrus Heights, CA), Richard Chester Klosinski, JR. (Sacramento, CA), Matthew Allen Workman (Sacramento, CA), Meghan Kathleen Murphy (Davis, CA), Matthew David Steen (Sacramento, CA)
Application Number: 14/578,039
Classifications
International Classification: A61B 5/16 (20060101); A61B 3/11 (20060101); A61B 5/00 (20060101); A61B 5/117 (20060101); A61B 5/0205 (20060101); A61B 5/103 (20060101); A61B 5/0402 (20060101); A61B 7/04 (20060101); G09B 5/00 (20060101); A61B 5/0476 (20060101);