SYSTEM AND METHOD FOR INTERFACING WITH BIOLOGICAL TISSUE
There is provided a system and method for interfacing with biological tissue. The system includes: a feature extraction module to implement an extraction approach to extract one or more features from one or more physiological recording signals; a machine learning module to apply a machine learning model based on input data to detect a physiological event or condition for classification, the input data including the extracted features, the machine learning model trained using a training set including feature vectors of time-series data labelled with known occurrences of the physiological event or condition; and an output module to output the classification of the machine learning module.
The following relates, generally, to signal processing; and more particularly, to a system and method for interfacing with biological tissue.
BACKGROUND

In various applications, small footprint and/or compact computing systems, such as embedded systems, may be required to record data in a low power manner. An example of this type of requirement is an implantable medical device, such as an implantable neural device or implantable cardiac pacemaker. In the case of the implantable neural device, such devices are often tasked with finding brain states through a process that relies on understanding long-term trends in data. However, storing long-term data typically requires a relatively large memory, so conventional devices are typically limited to storing less than approximately 10 seconds of data. In some conventional devices, recorded neural signals are stored in the memory; the data is then typically windowed, and windows are compared to find increases or decreases in the signal. However, conventional window-based approaches are typically limited by the relatively small capacity memory and, as such, typically lose long-term signal characteristics.
SUMMARY

In an aspect, there is provided a system for interfacing with biological tissue, the system comprising one or more processors and one or more memory units, the one or more processors in communication with the one or more memory units, the one or more processors configured to execute: a feature extraction module to implement an extraction approach to extract one or more features from one or more physiological recording signals; a machine learning module to apply a machine learning model based on input data to detect a physiological event or condition for classification, the input data comprising the extracted features, the machine learning model trained using a training set comprising feature vectors of time-series data labelled with known occurrences of the physiological event or condition; and an output module to output the classification of the machine learning module.
In a particular case, the system is connectable to one or more electrodes implantable in the biological tissue via an analog front-end, the analog front-end comprising one or more physiological signal acquisition circuits in communication with the biological tissue and a convertor, the analog front-end communicating one or more physiological recording signals to the one or more processors.
In another case, the system is connectable to one or more physiological stimulation channels, the one or more physiological stimulation channels in communication with the biological tissue, the one or more physiological stimulation channels connectable to the one or more processors, the system further comprising a stimulation controller for generating and delivering one or more electronic signals to the one or more physiological stimulation channels, the one or more electronic signals comprising an arbitrary shape waveform.
In yet another case, the biological tissue comprising tissue within a central nervous system or a peripheral nervous system.
In yet another case, the one or more physiological signal acquisition circuits comprising neural signal recording channels.
In yet another case, the physiological signal acquisition circuits comprising at least one of: signal samplers, amplifiers, filters and analog-to-digital convertors.
In yet another case, the one or more physiological stimulation channels comprises one or more neurostimulation channels.
In yet another case, the one or more physiological stimulation channels generate and deliver at least one of current, charge, voltage, ultrasound, and magnetic signals.
In yet another case, the extraction approach comprising a dimensionality reduction approach, the dimensionality reduction approach comprising at least one of: an autoencoder neural network, a principal component analysis (PCA), and an independent component analysis (ICA).
In yet another case, the one or more features comprising at least one of: signal band energy, signal phase locking, signal cross-frequency coupling, signal temporal correlation, and signal spatial correlation.
In yet another case, the physiological event or condition comprising at least one of: a pathological brain state, a non-pathological brain state, and a physiological event in a peripheral nervous system.
In yet another case, the arbitrary shape comprising at least one of biphasic pulses, monophasic pulses, sinusoids, and functions of sinusoids.
In yet another case, the stimulation controller generating the one or more electronic signals in a temporal or spatial periodic pattern.
In yet another case, the stimulation controller generating the one or more electronic signals when the physiological event is detected by the machine learning module.
In yet another case, the detected physiological event is a pathological brain state.
In yet another case, the stimulation controller comprising a charge balancer for generating charge-balanced physiological stimulation waveforms.
In yet another case, the charge balancer comprises a charge balance monitor.
In yet another case, the one or more physiological stimulation channels are in communication with the brain via at least one of: the one or more electrodes, electromagnetic coils, antennas, ultrasound sources, light sources, and reservoirs comprising molecular, chemical, or biochemical content.
In another aspect, there is provided a method for interfacing with biological tissue, the method executable on one or more processors, the method comprising: extracting one or more features from one or more physiological signals by an extraction approach; applying a machine learning model based on input data to detect a physiological event or condition for classification, the input data comprising the extracted features, the machine learning model trained using a training set comprising feature vectors of time-series data labelled with known occurrences of the physiological event or condition; and generating and delivering one or more electronic signals to one or more physiological stimulation channels, the one or more electronic signals comprising an arbitrary shape waveform.
In another aspect, there is provided a computer-implemented method for sampling time-series data, comprising: receiving contemporary data from a time-series data stream; applying a recursive sampling window to the contemporary data; accessing previously received data from the time-series data stream; applying a temporal function to the previously received data; subtracting the previously received data, with the temporal function applied, from the contemporary data; and outputting the contemporary data after the subtraction has been applied.
In another aspect, there is provided a system for sampling time-series data, the system comprising one or more processors and one or more memory units, the one or more processors configured to execute a sampling module to: receive contemporary data from a time-series data stream; apply a recursive sampling window to the contemporary data; access previously received data from the time-series data stream; apply a temporal function to the previously received data; subtract the previously received data, with the temporal function applied, from the contemporary data; and output the contemporary data after the subtraction has been applied.
In another aspect, there is provided a computer-implemented method for arbitrary waveform generation for physiological stimulation, comprising: generating an arbitrary function signal; passing the arbitrary function signal through a charge balance monitor to monitor compliance with predetermined charge limits; applying a physiological stimulation with the signal; and applying binary exponential charge recovery (BECR) to the signal by determining a net stimulus integral and applying a reverse charge when the integral is not zero due to arbitrary waveform stimulation or due to predetermined limits having been exceeded.
In another aspect, there is provided a system for arbitrary waveform generation for physiological stimulation, the system connectable to one or more electrodes implantable in a brain, the system comprising: an arbitrary waveform generator (AWG) to generate an arbitrary function signal; a charge balance monitor to receive the arbitrary function signal and monitor compliance with predetermined charge limits; a physiological stimulator to apply a physiological stimulation with the signal; and a binary exponential charge recovery (BECR) unit to apply BECR to the signal by determining a net stimulus integral and applying a reverse charge when the integral is not zero due to arbitrary waveform stimulation or due to the predetermined limits having been exceeded.
In another aspect, there is provided a computer-implemented method for classifying time-series data for identifying a state, the time-series data comprising a series of samples, the method comprising: training a machine learning model to classify occurrences of the state by classifying a representative feature vector, using a respective training set, the respective training set comprising feature vectors of the time-series data labelled with occurrences of the state; receiving a new time-series data stream; determining whether a current sample in the new time-series data stream corresponds to an occurrence of the state by determining a classified feature vector, the classified feature vector determined by passing the current sample and samples in at least one continuous sampling window into the trained machine learning model, each continuous sampling window comprising one or more preceding samples from the time-series data, an epoch for each respective continuous sampling window determined according to a temporal function; and outputting the determination of whether the current sample corresponds to an occurrence of the state.
In a particular case, each continuous sampling window is recursively defined based on the epoch of a previous iteration of the respective window subtracted by the respective temporal function multiplied by the epoch of such previous iteration.
In another case, the at least one continuous sampling window comprises at least two continuous sampling windows, the epoch of each of the continuous sampling windows being defined by different temporal function parameters.
In yet another case, each of the temporal functions comprises an exponential decay rate, and wherein each exponential decay rate is a reciprocal of a power of 2.
In yet another case, each exponential decay rate is in the range of 1/2 to 1/(2^16).
In yet another case, each epoch is on the order of minutes or less.
In yet another case, the machine learning model comprises a support vector machine using one of linear, polynomial, and radial-basis function (RBF) kernels.
In yet another case, the at least one continuous sampling window comprises a plurality of continuous sampling windows organized into at least two banks of continuous sampling windows, each bank comprising at least one continuous sampling window, the continuous sampling windows in each bank having different temporal function parameters than the continuous sampling windows in the other banks.
In yet another case, the time-series data comprises physiological signals and the state comprises a physiological event or condition.
In yet another case, the time-series data comprises electroencephalography (EEG) signals and the state comprises one or more onset biomarkers associated with a seizure.
In another aspect, there is provided a system for classifying time-series data for state identification, the system comprising one or more processors and one or more memory units, the one or more memory units storing the time-series data comprising a series of samples, the one or more processors in communication with the one or more memory units and configured to execute: a training module for training a machine learning model to classify occurrences of the state by classifying a representative feature vector, using a respective training set, the respective training set comprising feature vectors of the time-series data labelled with occurrences of the state; an input module for receiving a new time-series data stream comprising a plurality of samples; a temporal function module for defining at least one continuous sampling window, each continuous sampling window comprising one or more samples from the time-series data preceding a current sample, an epoch for each respective continuous sampling window determined according to a respective temporal function; a support vector module for determining whether a current sample in the new time-series data stream is an occurrence of the state by determining a classified feature vector, the classified feature vector determined by passing the current sample and samples in the at least one continuous sampling window into the trained machine learning model; and an output module for outputting the determination of whether the current sample is an occurrence of the state.
In yet another case, each continuous sampling window is recursively defined based on the epoch of a previous iteration of the respective window subtracted by the respective temporal function multiplied by the epoch of such previous iteration.
In yet another case, the at least one continuous sampling window comprises at least two continuous sampling windows, the epoch of each of the continuous sampling windows being defined by different temporal function parameters.
In yet another case, each of the temporal functions comprises an exponential decay rate, and wherein each exponential decay rate is a reciprocal of a power of 2.
In yet another case, each exponential decay rate is in the range of 1/2 to 1/(2^16).
In yet another case, each epoch is on the order of minutes or less.
In yet another case, the machine learning model comprises a support vector machine using one of linear, polynomial, and radial-basis function (RBF) kernels.
In yet another case, the temporal function module defines a plurality of continuous sampling windows organized into at least two banks of continuous sampling windows, each bank comprising at least one continuous sampling window, the continuous sampling windows in each bank having different temporal function parameters than the continuous sampling windows in the other banks.
In yet another case, the time-series data comprises physiological signals and the state comprises a physiological event.
In yet another case, the time-series data comprises electroencephalography (EEG) signals captured by electrodes in communication with the system, and the state comprises one or more onset biomarkers associated with a seizure.
These and other aspects are contemplated and described herein. It will be appreciated that the foregoing summary sets out representative aspects of the system and method to assist skilled readers in understanding the following detailed description.
A greater understanding of the embodiments will be had with reference to the Figures, in which:
For simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the Figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein may be practised without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the embodiments described herein. Also, the description is not to be considered as limiting the scope of the embodiments described herein.
Various terms used throughout the present description may be read and understood as follows, unless the context indicates otherwise: “or” as used throughout is inclusive, as though written “and/or”; singular articles and pronouns as used throughout include their plural forms, and vice versa; similarly, gendered pronouns include their counterpart pronouns so that pronouns should not be understood as limiting anything described herein to use, implementation, performance, etc. by a single gender. Further definitions for terms may be set out herein; these may apply to prior and subsequent instances of those terms, as will be understood from a reading of the present description.
Any module, unit, component, server, computer, terminal or device exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the device or accessible or connectable thereto. Further, unless the context clearly indicates otherwise, any processor or controller set out herein may be implemented as a singular processor or as a plurality of processors. The plurality of processors may be arrayed or distributed, and any processing function referred to herein may be carried out by one or by a plurality of processors, even though a single processor may be exemplified. Any method, application or module herein described may be implemented using computer readable/executable instructions that may be stored or otherwise held by such computer readable media and executed by the one or more processors.
The present inventors have determined that various advantages can be achieved with the embodiments described herein using an array of hardware approximators for moving average filters, where portions of new data are incorporated into a single register, and previous values decay exponentially. In this way, advantageously, trends in neural signals ranging from years to seconds can be stored on an implanted device without requiring large and power-inefficient electronic memory. The embodiments described herein are intended to allow for various advantages; for example, reducing the size and cost of implantable brain state classifiers, reducing power requirements for devices, enabling implicit storage of long-term data with low footprint, and enabling more accurate and efficient time-series classification. With respect to implantable neural devices, the embodiments described herein are intended to enable higher performance brain state classification with fewer false positives and more true positives, and allow implicit storage of long-term data without the need for large, expensive, and power inefficient memories.
While the following embodiments may refer to implantable neural devices in applications for seizure neuromodulation, it will be appreciated that the exponentially decaying memory (EDM) techniques described herein can be used in various suitable computing devices, for example, other implantable medical devices. Generally, embodiments described herein relate to the classification of time-series data for state identification; for example, seizures in epilepsy, tremors in Parkinson's disease, neurological artefacts due to Alzheimer's disease, or the like. It is appreciated that the embodiments described herein with respect to classifying time-series data can be applied to any suitable field involving time-series classification; for example, voice recognition, financial data analysis, or the like.
Distinguishing seizure activity from normal brain activity can be a difficult task because of potentially great variation in seizure morphology. As described herein, machine learning enables the utilization of large volumes of patient recordings to accurately distinguish pathological from physiological activity. This allows for responsive closed-loop neuromodulation, which can proactively detect and inhibit the onset of seizures. In such an approach, supervised learning models can be utilized to maintain low false-detection rates for improved power efficiency and reduced side-effects. However, the use of supervised learning classifiers for seizure detection can expose a class imbalance problem, which arises from a lack of ictal recordings compared to large volumes of inter-ictal data. Furthermore, supervised classification systems, in some cases, can require accurate data labeling, and can be vulnerable to human error in annotating complex EEG recordings.
The somewhat limited success in the pharmacologic treatment of epileptic syndromes has aroused an increasing interest in the possibility of stopping seizures with brief direct electrical intracerebral stimulation. Support for the possible success of electrical perturbations in preventing seizures is based on the assumption that if the dynamics of the abnormal synchrony that characterizes paroxysms are perturbed by stimulations, then the ictus may not appear, or will be forced to stop if already initiated. Thus, the implementation of “minimal” (short duration, low frequencies and intensities) perturbations to stop the transition from preictal activity to the ictal, convulsive event by a precisely timed brief stimulation is a highly beneficial solution. Contrary to current deep brain or vagus nerve stimulation paradigms that use intermittent or continuous stimulation, present embodiments stimulate when a paroxysm is about to occur, using an on-demand feedback stimulation method based on real-time analysis of brain signals that detects a precursor of paroxysms and delivers a brief (e.g., 5-second) stimulus to stop the transition to the ictal event. Generally, an abnormal oscillation originates from an epileptogenic zone (often in the hippocampus in temporal lobe epilepsy), which may disrupt theta wave (and other) synchronization with the other hippocampus. Over time, this focal oscillation spreads, often propagates contralaterally, and develops into a paroxysmal discharge. A feedback stimulator can disrupt the local epileptic oscillation and abort the seizure development.
The following terminology is used in the present disclosure. “Paroxysms” are any abnormal electrographic activities (e.g., having a duration of greater than or equal to 10 seconds) associated with relatively high frequency and amplitude of spikes. When no apparent behavioral alterations are observed at the time of an electrographic paroxysm, the term “nonconvulsive paroxysm” is used, whereas the expression “convulsive paroxysm” is used if an abnormal behavior is observed concomitant with abnormal electrographic recording. The “paroxysm onset” is defined as the time when the amplitude of the electrographic recording in the paroxysm becomes greater than twice the standard deviation of the baseline activity. The “early paroxysm detection time” is the time between the detection of the seizure precursor and the paroxysm onset. The “preictal period” is defined as 1 minute before the paroxysm onset, and the “interictal period” is the time between the end of one paroxysm and the preictal period of the next. The convulsive paroxysms are defined according to the Racine scale (class III to class IV), whereas the nonconvulsive paroxysms are class I or class II seizures.
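By way of illustration, the paroxysm-onset definition above lends itself to a simple threshold test. The following is a minimal sketch only; the function and variable names are illustrative and not part of the disclosed embodiments:

```python
import statistics

def paroxysm_onset(recording, baseline):
    """Return the index of the first sample whose amplitude exceeds twice
    the standard deviation of the baseline activity (the paroxysm-onset
    definition above), or None if no such sample exists."""
    threshold = 2.0 * statistics.pstdev(baseline)
    for i, sample in enumerate(recording):
        if abs(sample) > threshold:
            return i
    return None
```

In practice, the baseline segment would be taken from interictal recording preceding the analyzed window.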
Activity in EEG signal frequency bands can be used to categorize irregular neural recordings. Such events include electrographic seizure onsets and interictal discharges (IIDs). This assessment is generally based on temporo-spectral changes such as low-voltage fast activity in intracranial electroencephalogram (iEEG) seizure onset. In the present embodiments, in order to provide a technological solution to capture such changes, Exponentially Decaying Memory (EDM) is presented as a hardware-efficient mechanism to represent temporal feature characteristics for machine learning. In an embodiment described herein, an unsupervised-learning-based One-Class Support Vector Machine (OC-SVM) can be used. This approach can navigate the technical problems related to class imbalance and data labeling by, for example, learning to distinguish normal neural activity from segments of clinical interest. In a particular case, irregular recording periods indicated by the OC-SVM can be reviewed by a user, such as an epileptologist, enabling ictal data to be labelled and accumulated over time. With increasing volumes of data, specialized supervised learning classifiers can be trained more effectively for closed-loop applications.
Generally, chronic neural recording implants experience considerable signal variability over time, leading to a gradual degradation of classifier performance. Thus, continuous model re-training is generally necessary to adapt to changing physiological recording conditions and maximize treatment efficacy. However, this can be impractical to perform on an implantable device, as power consumption is a primary consideration to reduce both heat dissipation and the risks associated with battery replacement surgery. In an embodiment, there is provided a patient-localized microserver that can communicate with an implanted device to enable incremental training. In some cases, data recorded by the device can be sent to the server and processed by an FPGA-accelerated OC-SVM. iEEG segments which are considered irregular are archived and sent to a remote epileptologist for review. Once an assessment is made, the microserver can re-train the model to be uploaded to the implanted device.
In a diagrammatic example illustrated in
To detect anomalous activity in neural signals, spectral energy in physiological signal bands can be used to label electrographic events. Signal bands of interest can be extracted by passing recorded samples through parallel bandpass filters for, for example, Delta (<4 Hz), Theta (4-8 Hz), Alpha (8-13 Hz), Beta (13-30 Hz) and Gamma (30-60 Hz) bands. In this example, a 256-tap Type-1 FIR filter can be used for each band with a symmetric impulse response, allowing coefficient multiplications to be shared. Each iEEG channel can be processed sequentially and filter states are stored in block RAM (BRAM) between sample processing. For each band, the absolute value of each output sample can be taken as a measure of signal energy. This approximation of instantaneous energy can be accumulated over a time window to generate a temporo-spectral measure of the signal.
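The band-energy computation described above can be sketched as follows. This is a simplified software illustration only: the taps and window length shown are placeholders (an actual embodiment uses 256 symmetric taps per band), and the coefficient-sharing enabled by the symmetric impulse response is omitted for clarity:

```python
def band_energy(samples, taps, window):
    """Approximate band energy for one band: bandpass the samples with an
    FIR filter, then accumulate |y| over each time window to produce a
    temporo-spectral measure of the signal."""
    energies = []
    acc = 0.0
    state = [0.0] * len(taps)          # filter delay line, kept between samples
    for n, x in enumerate(samples, start=1):
        state = [x] + state[:-1]       # shift the new sample into the delay line
        y = sum(c * s for c, s in zip(taps, state))
        acc += abs(y)                  # instantaneous-energy approximation
        if n % window == 0:            # emit one measure per accumulation window
            energies.append(acc)
            acc = 0.0
    return energies
```

In an embodiment, one such computation runs per band (Delta through Gamma), with each iEEG channel processed sequentially and filter states stored in BRAM between samples.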
To capture temporal evolution of machine learning features, such as signal energy, some approaches use a windowing approach where contiguous time epochs are concatenated to form a feature vector to be classified. Using this approach, it is possible to learn temporal differences between windows for events such as seizure onset. However, window-based approaches have several limitations in performance and hardware efficiency. As an example, processing larger windows requires proportionally large accumulation logic. In another example, if classification is performed at every epoch, test vector re-ordering logic may be necessary to remove old windows and add new windows. In another example, the minimum detection latency is the time required to generate a window (typically multiple seconds). In another example, as EEG recordings are patient specific, one window size may give sufficient temporal resolution in one case, but may not be optimal for another. In contrast, embodiments of the present invention advantageously provide the ability to learn feature timescales in a patient-specific manner to maximize classification performance; for example, using Exponentially Decaying Memory (EDM) as described herein.
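A single EDM register can be sketched as follows, assuming the recursive form set out in the summary: the previous value decays by a rate that is a reciprocal of a power of 2 (so hardware can use a bit-shift instead of a multiplier), and the new sample is folded in. The function names and the bank arrangement are illustrative only:

```python
def edm_update(memory, sample, k):
    """One EDM step: decay the stored value by 1/2**k and add the new
    sample, so previous values fade exponentially in a single register."""
    return memory - memory / (2 ** k) + sample

def edm_bank(samples, ks):
    """Run a bank of EDM registers with different decay rates over a
    stream, yielding multiple effective timescales without storing the
    underlying long-term data."""
    mems = [0.0] * len(ks)
    for x in samples:
        mems = [edm_update(m, x, k) for m, k in zip(mems, ks)]
    return mems
```

Larger k values retain longer-term trends, while smaller k values track recent activity; a bank of such registers replaces the explicit windows discussed above.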
A support vector machine (SVM) can be used as a supervised learning model for classification tasks of two or more classes. Generally, a similar number of examples in each class is required to prevent classifier bias. In the case of seizure detection, ictal activity is rare and accurate classification is generally a necessity to prevent the onset of a seizure.
The one-class SVM, described herein, provides an approach for datasets with class imbalances. It can be viewed as a regular two-class SVM, where the training data is taken as one class, and the origin is taken as the only member of the second class. Training is performed without labels, where data is mapped to a kernel space and separated from the origin by a hyperplane with maximum margin. To classify an input feature vector, a decision function is evaluated to distinguish an inlier (f(x)>0) from an outlier (f(x)<0):

f(x) = Σᵢ aᵢK(x, svᵢ) − b
where svᵢ are the support vectors used to construct the hyperplane, aᵢ are the corresponding weights, b is the classifier bias term, and K is implemented here as a Radial Basis Function (RBF) kernel, defined as:
K(x, sv) = e^(−γ‖sv − x‖²)
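Evaluating the decision function with the RBF kernel above can be sketched as follows. This is a software illustration only; the hardware accelerator described herein is not implied to operate this way, and the names are illustrative:

```python
import math

def rbf_kernel(x, sv, gamma):
    """RBF kernel K(x, sv) = exp(-gamma * ||sv - x||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(sv, x)))

def ocsvm_decision(x, support_vectors, weights, bias, gamma):
    """One-class SVM decision value f(x): a positive value marks an
    inlier (normal activity), a negative value an outlier (a segment of
    clinical interest)."""
    return sum(a * rbf_kernel(x, sv, gamma)
               for a, sv in zip(weights, support_vectors)) - bias
```

Only the support vectors, weights, bias, and γ need to be stored on the device to evaluate new feature vectors.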
This concept is exemplified in
In order to effectively use closed-loop neuromodulation for treating neurological disorders, (1) analog circuits are generally needed to monitor brain activity uninterruptedly, even during neurostimulation; (2) energy-efficient, high-efficacy processors are generally needed for responsive, adaptive, personalized neurostimulation; and (3) safe neurostimulation paradigms with rich spatio-temporal stimuli are generally needed for controlling the brain's complex dynamics. In embodiments described herein, an implantable neural interface processor (NURIP) is provided that generally provides the above advantages and is thus generally able to perform brain state classification for reliable seizure prediction and contingent seizure abortion. It is thus able to classify brain states (for example, seizures in epilepsy or tremors in Parkinson's disease) and provide responsive intervention with electrical stimulation. In other embodiments, NURIP can be used for enhancing other psychological states; for example, memory recall, sleep states, or the like. In an embodiment, NURIP is a low-power complementary metal-oxide-semiconductor (CMOS) device which monitors digitized neural signal recordings and detects pathological states. When a state is detected, the device generates electrical stimulation waveforms to influence neural behaviour and lead the brain to a normal physiological state.
Some devices use simplistic biomarkers for seizure detection, typically requiring manual tuning by clinicians; such devices typically have high noise levels, resulting in a high number of false stimulations, increased side-effects, and decreased battery life. Some devices also use manual biomarker thresholding for classification, which likewise typically requires manual tuning and yields a high false alarm rate. Some devices also use basic waveforms, which can limit the ability to specifically target stimulation and precisely control neural activity.
In an example of the NURIP system level architecture, as diagrammatically illustrated in
Some stimulation strategies use “low resolution” bi-phasic pulse waveforms to reduce damage caused by charge buildup at the brain-electrode interface. However, such low-resolution, low-selectivity biphasic waveforms are markedly different from measured EEG activity. Additionally, low-resolution waveforms limit the ability to selectively target and control neural activity to treat disorder symptoms. In contrast, high-resolution waveforms typically enable more intricate interaction with the nervous system but are typically more difficult to regulate from a charge perspective.
In the present embodiments, the AWG advantageously permits on-chip generation of complex waveforms to enhance spatial selectivity and to enable precise control of neural activity. The AWG charge accumulation register monitors the neural waveform and applies a charge recovery waveform when safe levels are exceeded. Advantageously, the AWG can enable more intricate interaction with the nervous system in order to control neurological disorders because charge balancing can ensure compliance with charge limits.
The ADC can be configured to automatically detect any sharp transitions in the intracranial electroencephalogram (iEEG), such as those due to a stimulation artifact. The ADC can also be configured to then shift a high-resolution input range to zoom to the input signal anywhere within the power rails. This approach is advantageous because it generally experiences no blind intervals caused by sharp input transitions. An input digital stage can include an autoencoder neural network for both iEEG spatial filtering and dimensionality reduction. Dedicated feature extraction blocks can be used to implement univariate (signal-band energy (SE)) and multivariate (phase locking value (PLV) and cross-frequency coupling (CFC)) neural signal processing. A succeeding support vector machine (SVM) accelerator employs these features for brain state classification. A further processor can be used to facilitate additional custom feature extraction and system control, as suitable. In response to a detection of a pathological brain state, an appropriate modulation waveform is generated to control the operation of the current-mode neurostimulator. In further embodiments, other psychological states can be classified; for example, memory recall, sleep states, or the like.
In an exemplary embodiment, an array of three configurable neural signal feature extractors, shown in
In the case of seizure prediction, onset biomarkers are subtle and can occur minutes before seizure onset. This presents a challenge in processing and memory requirements for implantable devices. The NURIP includes an exponentially decaying-memory support vector machine (EDM-SVM) accelerator for efficient classification of long-term temporal patterns. The EDM-SVM input stage, shown in
To capture temporal evolution of machine learning features such as signal energy, some methods typically use a windowing approach where contiguous time epochs are concatenated to form a feature vector to be classified. Using this approach, it is possible to learn temporal differences between windows for events such as seizure onset. Window-based approaches have several limitations in performance and hardware efficiency. As an example, processing larger windows requires proportionally large accumulation logic. As another example, if classification is performed at every epoch, test vector re-ordering logic may be necessary to remove old windows and add new windows. As another example, a minimum detection latency is typically the time required to generate a window, which is typically multiple seconds. As another example, EEG recordings are typically patient specific, so one window size may give sufficient temporal resolution in one case, but may not be optimal for another. Because every patient is different and presents different neurological biomarkers, using machine learning approaches, as embodied herein, advantageously allows the system to learn and apply stimulation on a patient-by-patient basis.
Advantageously, some of the present embodiments use feature timescales in a patient-specific manner to maximize classification performance. Exponentially decaying memory (EDM) is an approach which can provide such an advantage. Rather than accumulating and concatenating fixed windows, the system can use a continuous sampling recursive window defined by:
EDM(t)=EDM(t-1)−α[EDM(t-1)−x(t)] (1)
In the above formula (1), in some cases, an initial EDM “magnitude” can be 0. In this way, a new value for an EDM is the old value, minus a weighted difference between the old value and the new value. The new value, x(t), can be incorporated based on a set learning rate.
In some cases, this approach incorporates new inputs or degrades the existing memory of a feature according to a decay rate, α. Where:
α = 1/2^N
Advantageously, EDM can minimize latency as the output is continuous and can be classified at every sample, rather than every window. Furthermore, temporal resolution can be maximized as accumulation over an epoch is not required. EDM can be implemented efficiently in hardware using shift and add operations if α is limited to reciprocal powers of two. This efficiency provides a technological advantage by allowing multiple EDMs to be used in parallel, enabling multiple timescales to be processed simultaneously at a low computational cost.
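As a minimal sketch, the recursion of formula (1) can be modelled in software as follows; the variable names and floating-point arithmetic are illustrative and not the hardware implementation:

```python
# Floating-point model of equation (1): EDM(t) = EDM(t-1) - a[EDM(t-1) - x(t)].
# An initial EDM "magnitude" of 0 is assumed, as described above.
def edm_update(prev, x, alpha):
    """One recursive EDM step with decay rate alpha (0 < alpha <= 1)."""
    return prev - alpha * (prev - x)

def edm_stream(samples, alpha, init=0.0):
    """Run the EDM over a sample stream; the output is valid at every sample."""
    state = init
    out = []
    for x in samples:
        state = edm_update(state, x, alpha)
        out.append(state)
    return out

# A constant input drives the memory toward that value; smaller alpha values
# retain longer-term history, larger values track finer time resolutions.
trace = edm_stream([1.0] * 8, alpha=0.5)
```

Because the state is updated at every sample, a classification can be attempted at every sample rather than once per window.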
In the present embodiment, after the signal energy is extracted for a given EEG band, its value is passed to a corresponding bank of one or more EDMs. Each EDM implements a different decay rate, α, complementing one another by offering a different temporal perspective of the input feature to be used for classification. In this way, small values of α can result in longer-term memory, while larger values can capture finer time resolutions.
In some cases, different decay rates can be combined by arranging the different decay rates linearly in memory. For example, Decay Rate 1 can be located at memory address 0x0 (element 0 in the test vector), Decay Rate 2 can be located at 0x1 (element 1). During training, the model can be optimized based on an assumption that Decay Rate 1 will be at a first element in a test vector, and the like.
In an example, upon the detection of a seizure, an integrated digitally charge-balanced neurostimulation waveform generator can respond as demonstrated in
Turning to
Turning to
Turning to
Turning to
In an embodiment, the system 1000 is connectable to one or more electrodes 1050 implantable in a patient's brain via an analog front-end 1060.
Turning to
At block 1204, the input module 1008 receives a new time-series data stream comprising a plurality of samples. In an example, the new time-series data can come from data already stored in the one or more memory units. In another example, the new time-series data can come from a signal received by the system, for example an EEG signal received from electrodes.
At block 1206, the exponential decay module 1010 defines at least one continuous sampling window, each continuous sampling window comprising one or more samples from the time-series data preceding a current sample. An epoch for each respective continuous sampling window is determined according to a respective exponential decay rate.
At block 1208, the support vector module 1012 determines whether a current sample in the new time-series data stream is an occurrence of the state by determining a classified feature vector. The classified feature vector is determined by passing the current sample and samples in the at least one continuous sampling window into the trained machine learning model.
At block 1210, the output module 1014 outputs the determination of whether the current sample is an occurrence of the state. In an example, the output module 1014 can output to a user output device, such as a monitor or speaker. In another example, the output module 1014 can output to another computing system, or other module on the current system, via a communication network. In another example, the output module 1014 can output to a neurological stimulation device or system, such as a waveform generator as described herein.
In an example of the present embodiments, the present inventors experimentally demonstrated the system using an intracranial EEG epilepsy database with annotated clinical and subclinical seizure events. Patients were selected based on a postoperative outcome of Engel class I, indicating that intracranial electrodes were positioned at an informative location. After the first 24 hours of neural recordings are accumulated, feature extraction is performed by the system to generate an initial training set. Labelled subclinical and clinical seizure events, as labelled by an expert in the field, are removed along with a surrounding period of recordings, in this case 10 minutes of surrounding period. An OC-SVM model is trained and stored on an FPGA fabric along with feature normalization coefficients used for the training data. In this case, generally, minimizing misclassification of normal physiological neural activity while ensuring that pathological activity is captured is a key consideration. To enable this trade-off, classifier output is smoothed using a moving average window, which can be increased at the expense of detection latency. Once highlighted activity has been annotated, for example by the expert, a refined supervised model can be trained on a microserver to be uploaded to the implanted device. SVM training can then be performed on a computing system, for example, on a Zynq SoC's dual-core CPUs using a LibSVM implementation. The required training time generally scales linearly with the number of features used on the implanted device and the FPGA fabric (as illustrated in the SVM training time chart of
In this example, performance of the system was validated using 500 hours of iEEG data across four subjects in the expert-labelled epilepsy database. A combination of 16 depth and surface electrodes were determined on a per patient basis based on proximity to the seizure onset zone. The feature extraction implementation used five spectral bands per channel, each with decay coefficients of 4, 6, 8, 10, 12, 14 and 16. The resulting feature vector had a dimensionality of 560 (16 channels × 5 bands × 7 decay coefficients). An illustration of the signal from an electrode placed in the seizure onset zone is illustrated in
In this example experiment, using the system, a seizure detection rate of 97.05% was achieved. System performance for this example experiment can be summarized as:
The example experiment illustrates that the machine learning microserver of the present embodiments can enable continuous post-implantation adaptation for personalized seizure-control neuromodulation devices. The system demonstrates the efficacy of OC-SVMs to assist in the labelling of complex iEEG recordings for training supervised learning models. The concept of the patient-localized microserver provides, for example, a solution to the technical problem posed by providing life-long learning in personalized biomedical devices. The present embodiments can be used, for example, in cases where there is an accumulation of larger volumes of ictal data, whereby the present embodiments can enhance the performance of supervised learning classifiers, improving treatment efficacy and the quality of life for patients.
As described herein, another embodiment of the NURIP can be used to perform brain state classification and closed-loop control using programmable-waveform electrical stimulation. In this case, an architecture for the NURIP includes an autoencoder neural network for spatial filtering and dimensionality reduction. In this case, dedicated feature extraction blocks can be used for univariate (signal-band energy, SE) and multivariate (phase locking value, PLV, and cross-frequency coupling, CFC) neural signal processing. The exponentially decaying-memory support vector machine (EDM-SVM) accelerator can be used for hardware-efficient brain state classification with a high temporal resolution. An integrated digitally charge-balanced waveform generator can be used to enable flexibility in finding optimal neuromodulation paradigms for pathological symptom suppression. This embodiment was experimentally validated using an EU human intracranial EEG (iEEG) epilepsy dataset, achieving a seizure sensitivity of 97.7% and a false detection rate of 0.185 per hour while consuming 169 μJ per classification. In further cases, in addition to or instead of the autoencoder neural network, principal component analysis (PCA), independent component analysis (ICA), another dimensionality reduction algorithm, or the like can be used.
Synchronized firing of local neural populations within the brain can give rise to oscillations known as local field potentials, or LFPs. Using low-power CMOS, NURIP can function as an implantable medical device and integrate complex and feature-rich digital processing systems with analog circuits for the acquisition of low-amplitude neural signals, such as LFPs. The combination of digitization, biomarker extraction and individualized classification enables NURIP to measure and identify complex brain dynamics. As described herein, upon the detection of a pathological state, NURIP can deliver an electrical stimulus to influence neural activity and suppress symptoms. While some devices employ a simple bi-phasic pulse waveform, NURIP can excite networks more selectively and with reduced energy consumption. Further, because the brain is a complex dynamical system, NURIP is advantageously capable of adapting the stimulus in response to changing physiological environments. In this embodiment, NURIP can programmatically synthesize appropriate waveforms in an online manner, while constraining such waveforms to ensure that tissue damage does not arise through the effects of excessive stimulation charge buildup.
Advantageously, NURIP integrates neural signal acquisition, signal processing, machine learning model acceleration and neuromodulation waveform generation on a single SoC. NURIP does so by performing brain state classification for reliable seizure prediction and contingent seizure abortion. Advantageously, embodiments of NURIP can make use of an inclusion of an on-chip autoencoder neural network for signal conditioning and dimensionality reduction. Also advantageously, embodiments of NURIP can make use of the integration of a diverse array of univariate and bivariate neural signal processing feature extractors with on-chip machine learning acceleration. Also advantageously, embodiments of NURIP can make use of the implementation of the EDM-SVM for effective and hardware efficient time-series classification. Also advantageously, embodiments of NURIP can make use of on-chip neuromodulation waveform synthesis for precise control of neural activity and online stimulus adaptation. Also advantageously, embodiments of NURIP can make use of the introduction of binary exponential charge recovery (BECR) for digital charge-balanced neurostimulation.
Advantageously, the closed-loop neurostimulation SoC provided by the present embodiments can implement the phase locking value biomarker and threshold-based seizure detection. In an example, NURIP includes an optimized implementation of this biomarker with a 9× reduction in area and a 5× reduction in power. With the additional biomarkers described herein, along with data-driven classification, NURIP improves seizure sensitivity by greater than 20%.
Some devices use signal-band energy (SE) with a combination of SVM classifiers to trade off between detection sensitivity and specificity, but use a windowing approach with a limited ability to capture complex temporal EEG dynamics. In embodiments of NURIP described herein, the use of SE with PLV, CFC and the EDM-SVM approach was experimentally tested to result in increased seizure sensitivity and a 45% reduction in false detections. Advantageously, embodiments of NURIP enable the use of on-chip arbitrary waveform generation for precise stimulus control with a digital charge balancing technique to mitigate electrode and tissue damage.
Turning to
Turning to
In some cases, the scalable architecture of the system 1200 can be agnostic to the number and type of analog interface channels to be processed, supporting increasingly high channel counts and new interface paradigms such as optogenetic recording and stimulation. In some cases, as the neural signals of interest can be sampled at low frequencies (256 Hz-1 kHz for LFPs), the system 1200 can be optimized for reduced power consumption and area. In some cases, this advantage can be extended by time-sharing common system resources such as a configurable-order FIR filter, 32-bit MAC, hyperbolic and circular CORDIC blocks, and system SRAM.
In some cases, the number of integrated recording channels can be scaled to increase the spatial coverage of signal acquisition; and thus, there can be a corresponding increase in the volume of data which must be processed. Dimensionality reduction can be performed to reduce the required computation while minimizing the loss of information. In a particular case, spatial filtering and principal component extraction can be used through the use of an autoencoder neural network by the feature extraction module 1206. In an example, dimensionality can be reduced from 32 recording channels to 4 weighted combinations (such as in principal component analysis), reducing the processing requirements by 8×. In some cases, the autoencoder can be implemented by re-using shared on-chip computing resources which minimizes the area overhead.
In some cases, the autoencoder can be an unsupervised learning algorithm that applies backpropagation to train an encoding layer which minimizes the error between an input xj and the reconstructed output x̂j as follows:
where Hi is the i-th of N encoding hidden-layer nodes and Wj and bi are model parameters. Training can be performed offline and the feedforward path can be computed on the implant using the model stored in on-device SRAM. In some cases, the linear transfer function can be equivalent to principal component analysis (PCA). This approach can be used to separate multichannel EEG into temporally and spatially independent components that can often be associated with particular neural generators. In some cases, the biomarkers may rely on temporal preservation; and for this reason, the system 1200 can also support the storage of up to 12 raw channel sample streams.
In some cases, the autoencoder neural network can also be used to implement common average referencing (CAR) for EEG noise reduction; for example, by up to approximately 30%.
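The feedforward path of the trained encoder can be sketched as follows; the weights here are random placeholders standing in for parameters trained offline, and the 32-channel-to-4-component shapes simply mirror the 8× reduction example above:

```python
import numpy as np

# Sketch of the linear autoencoder feedforward path. Weights are assumed to be
# trained offline by backpropagation minimizing the reconstruction error
# between x and x_hat; with a linear transfer function this is equivalent in
# span to PCA. Random weights below are illustrative placeholders only.
rng = np.random.default_rng(0)
W_enc = rng.standard_normal((4, 32))   # encoding weights (trained offline)
b_enc = np.zeros(4)
W_dec = W_enc.T                        # tied decoding weights (illustrative)
b_dec = np.zeros(32)

def encode(x):
    """Reduce a 32-channel sample to 4 weighted combinations."""
    return W_enc @ x + b_enc

def decode(h):
    """Reconstruct the 32-channel sample from the 4 components."""
    return W_dec @ h + b_dec

x = rng.standard_normal(32)            # one multichannel iEEG sample
h = encode(x)                          # dimensionality-reduced features
x_hat = decode(h)                      # reconstruction of the input
```

On the implant, only the encoding step need run per sample; the 4-element output feeds the downstream feature extractors in place of the 32 raw channels.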
The system 1200 can also perform data management of incoming sample streams from an array of analog front-end ADCs. For time-series analysis, a moving window of the most recently recorded samples can be stored for signal processing purposes. For example, a 256-sample circular buffer (CB) can maintain a continuous window of incoming samples (as shown in
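The moving-window behaviour of such a circular buffer can be sketched as follows; the 4-sample depth is illustrative (the text above describes a 256-sample buffer):

```python
# Sketch of a circular buffer (CB) maintaining a continuous moving window of
# incoming samples: the write pointer wraps, so the oldest sample is always
# overwritten first and no data needs to be shifted in memory.
class CircularBuffer:
    def __init__(self, size=256):
        self.buf = [0] * size
        self.size = size
        self.head = 0          # index of the next write position

    def push(self, sample):
        self.buf[self.head] = sample
        self.head = (self.head + 1) % self.size

    def window(self):
        """Return the stored samples ordered oldest to newest."""
        return self.buf[self.head:] + self.buf[:self.head]

cb = CircularBuffer(size=4)
for s in [1, 2, 3, 4, 5]:
    cb.push(s)
# after five pushes into a 4-deep buffer, sample 1 has been overwritten
```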
Generally, there exists a trade-off in device signal processing between embedded computation and wireless transmission for remote processing. Implantable devices try to operate on a low power budget to maximize their battery life, and reduce the number of replacement surgeries; which can have risks of infection and an additional clinical burden. Further, devices generally also have a trade-off between thermal limits of heat dissipation and wireless communication. For example, the power consumption of local signal processing is generally an order of magnitude lower than that of typical wireless data transmission.
Following pre-processing, in an embodiment, an array of three configurable neural signal feature extractors can be used; as shown in
With respect to the signal band energy biomarker feature, the energy in physiological signal bands can be used to characterize brain states based on the recorded neural signals. An example is shown in
The absolute output value of each bandpass filter can be taken as a measure of signal energy, as illustrated in
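The bandpass-then-rectify extraction above can be sketched as follows; the FIR coefficients are illustrative placeholders, not the device's actual configurable-order filter design:

```python
import numpy as np

# Sketch of signal-band energy (SE) extraction: bandpass filter a channel into
# a physiological band, then take the absolute output as an energy measure.
def fir_filter(x, taps):
    """Direct-form FIR filtering (causal, zero initial state)."""
    return np.convolve(x, taps, mode="full")[: len(x)]

def signal_band_energy(x, taps):
    """Absolute value of the band-passed signal, sample by sample."""
    return np.abs(fir_filter(x, taps))

fs = 256.0                               # sampling rate (Hz)
t = np.arange(int(fs)) / fs
x = np.sin(2 * np.pi * 10.0 * t)         # 10 Hz test tone
taps = np.array([0.5, 0.0, -0.5])        # toy band-shaping FIR (placeholder)
se = signal_band_energy(x, taps)
```

In the device, the SE output would then be passed to a bank of EDMs as described herein, rather than accumulated over fixed windows.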
With respect to the phase locking value (PLV) biomarker feature, neural connectivity can refer to a pattern of anatomical links between distinct neural populations within the brain. Connectivity patterns are formed by structural links such as synapses or fiber pathways. Neural activity is constrained by connectivity, and quantitative measures are therefore useful for understanding how the brain processes information. A “preictal state” is characterized by a desynchronization of the neuronal populations related to the epileptogenic focus before a seizure onset. Phase locking can occur at specific physiological frequencies which are bandpass filtered for each channel. Synchronization can be detected by the system 1200 between a channel pair when the difference between the instantaneous phases in the extracted bands, defined as f0 and f1, remains constant.
An example illustration of PLV extraction is shown in
Δϕ(t) = ϕf1(t) − ϕf0(t)
The angle determined above is used to create an instantaneous complex vector which, for example, can be constructed using a dual-core COordinate Rotation Digital Computer (CORDIC) for sine and cosine generation. The magnitude of the average of N vectors can be used as a measure of phase locking. If the average Δϕi is 0, both f0 and f1 are phase-locked. In some cases, due to the narrow bandwidth of the signal, the above moving average can be efficiently replaced by the following IIR approximation:
ZRe(t) = ZRe(t−1) − α[ZRe(t−1) − cos(Δϕ(t))]
ZIm(t) = ZIm(t−1) − α[ZIm(t−1) − sin(Δϕ(t))]
where α = 1/2^N defines the decay rate of the moving average; Δϕ is the phase difference between the two band-passed signals; and ZRe is the real component and ZIm is the imaginary component of the complex vector for which the magnitude is determined (which provides the final PLV). Advantageously, the IIR approximation has been experimentally determined to result in a 60% decrease in group delay latency. Further, only two CORDIC computations must be performed at a given stage. The overall number can thus be reduced from five to two with resource sharing, as shown in
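The recursive averaging of the unit phase vector can be sketched in software as follows; the decay rate and sample values are illustrative, and the trigonometric calls stand in for the CORDIC hardware:

```python
import math

# Sketch of PLV extraction with an EDM-style IIR approximation of the moving
# average: the unit vector e^(i*dphi) is averaged recursively, and the PLV is
# the magnitude of the averaged complex vector (near 1 when phase-locked).
def plv_stream(dphi_samples, alpha):
    z_re, z_im = 0.0, 0.0
    for dphi in dphi_samples:
        z_re -= alpha * (z_re - math.cos(dphi))   # recursive average, real part
        z_im -= alpha * (z_im - math.sin(dphi))   # recursive average, imag part
    return math.hypot(z_re, z_im)                 # |average vector| = PLV

# A constant phase difference (perfect locking) drives the PLV toward 1.
locked = plv_stream([0.3] * 200, alpha=1 / 16)
```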
With respect to the cross-frequency coupling (CFC) biomarker feature, a physiological mechanism known as cross-frequency coupling has been identified as playing a significant role in biological information processing in the brain. Phase-amplitude coupling is a particular approach to integrate functional brain regions and transfer information from global brain networks operating at behavioral timescales, to local high-frequency cortical processing. In electroencephalogram (EEG), this mechanism can manifest itself in local field potentials that resemble amplitude modulation in communication systems, where a neural signal can be described as:
x(t) = [1 + M·cos(2πfLF·t + ϕ)]·[A·sin(2πfHF·t)]
where fLF is the low frequency modulating component, fHF is the high frequency component whose amplitude is modulated, M is the modulation index and A is the base amplitude of the high frequency activity.
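The amplitude-modulation signal model above can be sketched with illustrative frequencies and modulation index (the values below are not taken from the embodiments):

```python
import math

# Sketch of the phase-amplitude coupled signal model x(t): a high-frequency
# carrier f_HF whose amplitude is modulated by a low-frequency component f_LF.
# Frequencies, modulation index M, and amplitude A are illustrative values.
def pac_signal(t, f_lf=6.0, f_hf=80.0, M=0.5, A=1.0, phi=0.0):
    envelope = 1.0 + M * math.cos(2 * math.pi * f_lf * t + phi)
    carrier = A * math.sin(2 * math.pi * f_hf * t)
    return envelope * carrier

fs = 1000.0
x = [pac_signal(n / fs) for n in range(int(fs))]   # one second of samples
peak = max(abs(v) for v in x)                      # bounded by A*(1 + M) = 1.5
```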
Elevated CFC between pathological high frequency oscillations and low frequency activity is generally found in the seizure-onset zone of epilepsy patients when compared to normal brain regions. Abnormal PAC has also generally been found in the primary motor cortex of patients with Parkinson's Disease and a reduction has been shown in neural stimulation treatments which have alleviated symptoms. These findings suggest that CFC could act as a feedback measure in closed-loop neuromodulation devices for the present embodiments.
The very-large-scale integration (VLSI) implementation of the present embodiments allows for the use of two key metrics to enable a trade-off between low-power, low-latency and high-precision. Two measures of CFC, the mean vector length modulation index (MVL-MI) and the cross-frequency phase locking value (CF-PLV) can be used; as described herein.
In a particular case, prior to extracting the CFC feature, extraction of a low-frequency phase-modulating signal band, fLF(t), and high-frequency amplitude-modulated signal band, fHF(t) can be performed; as shown in
The mean vector length modulation index (MVL-MI) can be used to determine a relationship between the instantaneous phase of fLF(t) and the amplitude envelope fA(t). This can be achieved by building a complex-valued time-series with a phase of ϕLF(t) and an amplitude which is scaled by fA(t); which can be expressed as:
m(t) = fA(t)·e^(iϕLF(t)), MVL-MI = |(1/N)·Σt m(t)|
where the phase-amplitude coupling measure is extracted from the time-series m(t) defined in the complex plane. Each amplitude sample is represented by the length of the complex vector, whereas the phase of the modulating signal during that same sample is represented by the vector angle. When phase-amplitude coupling is not present, a uniform circular density of vector points is symmetric around zero. However, if the fLF(t) phase is modulating the high-frequency amplitude, the fHF(t) envelope is higher at certain phases. A measure of CFC can thus be quantified by taking the magnitude of the average complex vector.
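The average-complex-vector measure can be sketched as follows, using synthetic envelopes and phases to contrast the coupled and uncoupled cases (all values illustrative):

```python
import cmath
import math

# Sketch of the mean vector length modulation index (MVL-MI): each complex
# sample has length f_A(t) (the high-frequency envelope) and angle phi_LF(t)
# (the low-frequency phase); the coupling measure is the magnitude of the
# average complex vector.
def mvl_mi(amplitudes, phases):
    vectors = [a * cmath.exp(1j * p) for a, p in zip(amplitudes, phases)]
    return abs(sum(vectors) / len(vectors))

n = 512
phases = [2 * math.pi * k / n for k in range(n)]
flat = [1.0] * n                                     # envelope independent of phase
coupled = [1.0 + 0.8 * math.cos(p) for p in phases]  # envelope peaks at phase 0

no_pac = mvl_mi(flat, phases)    # near 0: uniform circular density of vectors
pac = mvl_mi(coupled, phases)    # clearly nonzero: phase modulates amplitude
```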
In this embodiment, the system 1200 can use a cross-frequency phase locking value (CF-PLV) to enable the detection of synchrony between the phase of the low frequency modulating signal ϕfP(t), and the phase of the envelope extracted from the high frequency modulated signal ϕfA(t). In some cases, the PLV accelerator can be re-used, where the phase difference between both modulating and modulated signals is determined as:
Δϕ(t)=ϕfA(t)−ϕfP(t)
As in between-channel PLV, the magnitude of an average vector can be used as a measure of CFC between the phases of the modulating low-frequency signal and the envelope of the modulated high-frequency signal. If the average Δϕi is 0, both fLF(t) and fHF(t) are phase-locked and CFC is determined to be present.
As illustrated in
In this embodiment, feature extraction accelerators can be used to reduce the pre-classification workload which would otherwise need to be performed by the on-chip CPU. A summary of this improvement in processing efficiency is shown in
In an example, an overall number of extractions to be performed is:
In this example, the largest contribution to the cycle count of these features is the required FIR filtering. Further, the use of decimation filtering before feature extraction increases throughput by reducing the processing required. Such configurations can be determined on a per-patient basis to allow design trade-offs based on recording channel noise and power-dissipation constraints.
For a given brain state, the biomarker features can be expressed with high variability from patient to patient and change with the underlying physiology over time. In the present embodiments, data-driven approaches can be used to create models based on recorded iEEG data rather than using manual thresholds of feature values defined by a clinician. These models can then be used to personalize an implanted closed-loop medical device to accurately detect a patient's neurological event, such as a seizure. In the present embodiment, machine learning techniques can be used to classify the neurological events. In the context of seizure prediction, a comparison of a Long Short-Term Memory (LSTM)/Convolutional Neural Network (CNN) deep learning approach with a Support-Vector Machine (SVM) approach (of the present embodiment) was conducted. The deep learning and SVM approaches show comparable performance, but the computational complexity of the deep learning approach is several orders of magnitude higher:
where, for the RCNN, the CNN stage was: {3×3 conv. layers: 4, 2×2 pooling layers: 2}, the LSTM was: {128 hidden units: 60}, and the FCL was: {60 hidden units: 1}. The SVM used a Radial Basis Function kernel with 321 support vectors and 384 features.
The above-illustrated efficiency renders the SVM employed by NURIP in the present embodiment particularly suited to low-power implantable seizure detection devices. Additionally, hand-tuned features can be used with the SVM based on insights from domain knowledge. Generally, deep learning approaches depart from such domain understanding in favor of data-driven learning of features using complex models.
In this embodiment, NURIP can use an exponentially decaying memory (EDM) approach to represent complex temporal relationships for efficient classification and to address the outlined obstacles associated with windowing. Rather than accumulating and concatenating fixed windows, a continuous sampling recursive window can be defined by:
EDM[t]=EDM[t−1]−λ(EDM[t−1]−x[t]) (8)
Here, input x[t] is incorporated based on a set learning rate, and the existing memory of a feature is degraded according to the decay rate, λ. Where:
λ = 1/2^α, 1 < α < 16
When the decay coefficient, λ, is constrained to a reciprocal of a power of two, the EDM update can efficiently be performed using shift and add operations. In an example, each EDM can have an approximate effective time window of 2^α/Fs (as exemplified in
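The power-of-two constrained update can be sketched on integer samples, where the multiply by λ = 1/2^α reduces to an arithmetic right shift (sample value and α are illustrative):

```python
# Sketch of a hardware-style EDM update with λ = 1/2^a: the multiply in
# equation (8) reduces to an arithmetic right shift by a bits, so the whole
# update is shift-and-add on integer samples. With a sampling rate Fs, the
# approximate effective time window is 2^a / Fs.
def edm_update_shift(prev, x, a):
    """EDM[t] = EDM[t-1] - ((EDM[t-1] - x[t]) >> a), integer arithmetic."""
    return prev - ((prev - x) >> a)

state = 0
for _ in range(100):
    state = edm_update_shift(state, 1024, a=3)   # λ = 1/8
# the state converges to the constant input value
```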
In this embodiment, the machine learning module 1208 can use a support vector machine (SVM) as a supervised learning model for classification of two or more classes. In this approach, a data point is viewed as an N-dimensional vector and the objective is to find a hyperplane that separates the two groups of input data points in the high-dimensional space to which they are mapped.
SVMs generally require a similar number of examples in each class to prevent classifier bias, but in the case of seizure detection, ictal activity is rare and prone to labeling errors. The SVM utilized in the present embodiment can use a semi-supervised one-class approach. It can be viewed as a regular two-class SVM where the training data is taken as one class, and the origin is taken as the only member of the second class. Training can be performed using only interictal data, which is mapped to the kernel space and separated from the origin by a hyperplane with maximum margin. The kernel, K, is implemented in a particular case using the Radial Basis Function (RBF):
K(sv, x) = e^(−γ∥sv − x∥²)
where sv are the support vectors used to construct the hyperplane, x is the extracted feature vector and γ is the inverse of the standard deviation of the RBF, or Gaussian function. Intuitively, the gamma parameter defines how far the influence of a single support vector reaches. In a particular case, the exponential function can be determined using a shared hyperbolic CORDIC core. In this way, the selection of linear, polynomial and RBF kernels can be a trade-off between performance, energy and memory usage.
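The resulting one-class decision function, f(x) = Σi ci·K(svi, x) − ρ, can be sketched as follows; the support vectors, coefficients, γ and offset ρ below are illustrative values, not a trained model:

```python
import math

# Sketch of a one-class SVM decision with an RBF kernel:
#   K(sv, x) = exp(-gamma * ||sv - x||^2)
#   f(x) = sum_i c_i * K(sv_i, x) - rho
# A non-negative score means the sample resembles the (interictal) training
# data; a negative score flags an outlier (possible pathological activity).
def rbf_kernel(sv, x, gamma):
    dist_sq = sum((a - b) ** 2 for a, b in zip(sv, x))
    return math.exp(-gamma * dist_sq)

def ocsvm_decision(x, support_vectors, coeffs, gamma, rho):
    score = sum(c * rbf_kernel(sv, x, gamma)
                for sv, c in zip(support_vectors, coeffs))
    return score - rho

svs = [[0.0, 0.0], [1.0, 1.0]]     # illustrative support vectors
coeffs = [0.5, 0.5]                # illustrative dual coefficients
inlier = ocsvm_decision([0.5, 0.5], svs, coeffs, gamma=1.0, rho=0.1)
outlier = ocsvm_decision([10.0, 10.0], svs, coeffs, gamma=1.0, rho=0.1)
```

A point near the support vectors scores positive; a far-away point falls below the offset ρ and is flagged as an outlier.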
In the present embodiment, the combination of EDM with SVM classification can allow for effective low-power time-series classification. As the EDM is updated every sample, inferences can be performed continuously rather than only when a window has been processed. Advantageously, EDM allows for the retention of biomarkers over time periods which are infeasible for windowing-based approaches, where device memory requirements scale linearly with the number of samples. Furthermore, the efficiency of EDM allows for the combination of N long-term and short-term memory decay rate EDMs to enable the learning of complex temporal relationships, as exemplified in
The present inventors have experimentally verified and validated the NURIP of the present embodiments using neural recording data available in the EU Epilepsy database. This database contains intracranial recordings from 30 patients with an average continuous recording time of 150 hours per patient. Due to the inherent class imbalance problem associated with seizure data (with few ictal examples compared to interictal data), sensitivity and false detection rate (FDR) measures are generally used. Sensitivity measures the proportion of real seizures that were correctly identified by a classifier while the false detection rate indicates the number of false alarms raised by a detector per hour of recording. NURIP's ability to detect clinically relevant brain states was evaluated using 500 hours of data from four patients in the EU Epilepsy database:
In this example experiment, data was first down-sampled to 256 Hz and 16 electrodes were chosen on a per-patient basis based on their proximity to the seizure onset zone. The feature extraction used five signal energy spectral bands, phase locking values between channels in the theta band, and cross-frequency coupling between the theta and gamma frequency bands. EDM decay coefficients of 8, 10, 12, and 14 were used (approximately 1, 4, 16, and 64 second effective time windows, respectively). A one-class SVM was trained using a radial basis function kernel. To estimate the classifier performance, a leave-one-record-out cross-validation scheme was employed.
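The stated mapping from decay coefficient to effective window can be checked with a short sketch (Python; the assumption that an EDM with decay rate 2**-k has an effective window of roughly 2**k samples is consistent with the values given above):

```python
FS_HZ = 256  # sampling rate after down-sampling, as in the example experiment

def effective_window_seconds(k, fs=FS_HZ):
    """Approximate effective time window of an EDM with decay rate 2**-k:
    roughly 2**k samples, converted to seconds at the sampling rate."""
    return (2 ** k) / fs
```

With k = 8, 10, 12 and 14, this yields the approximately 1, 4, 16 and 64 second effective windows referred to above.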
Upon the detection of a pathological brain state, an electrical stimulus can be applied to suppress symptoms. There are a number of technical obstacles to overcome when implementing stimulation strategies for neuromodulation devices; for example, selection of stimulation parameters that are necessary to induce the desired neural activity, minimization of the power required to achieve a given effect, and ensuring stimulation parameters are safe for chronic use.
It has been demonstrated that the use of stimulus waveforms with a net direct current component increases the probability of tissue and electrode damage. Generally, neural stimulators deliver charge-balanced bi-phasic rectangular current pulses, where the first (cathodic) phase excites the nerve fiber and the second (anodic) phase provides charge balancing. The rectangular waveform is widely used for its simplicity and ease of generation with a simple current source. However, it has been shown that arbitrary waveforms can induce complex neural activity. Stimulation parameters generally trade off between selectivity, reduced power consumption and waveform safety. The ability to programmatically control these parameters on a device, as in the present embodiments, allows online waveform adaptation based on closed-loop control techniques. NURIP, in the present embodiment, integrates a digitally charge-balanced neurostimulation waveform synthesizer (exemplified in
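For illustration, a charge-balanced bi-phasic rectangular pulse of the kind described above can be sketched as a sample sequence (Python; amplitudes and durations are illustrative only):

```python
def biphasic_pulse(amplitude, phase_samples):
    """Charge-balanced bi-phasic rectangular pulse: a cathodic
    (negative) phase followed by an equal and opposite anodic phase,
    so the net charge (the sum of the samples) is zero."""
    cathodic = [-amplitude] * phase_samples
    anodic = [amplitude] * phase_samples
    return cathodic + anodic
```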
To mitigate the issue of charge imbalanced stimulation, some approaches use pulsating voltage transcranial electrical stimulator (PVTES) to adapt the number of stimulation pulses with respect to skin-electrode impedance variation. However, this approach is limited to bi-phasic pulse stimulation. The present embodiment uses a digital charge-balancing technique to support the use of arbitrary waveforms. In a particular case, the system's MAC logic can be re-used to store the net charge sent to an analog front-end neural stimulator and hence monitor the stimulus to ensure safe limits are not exceeded. An exponential charge recovery phase has been experimentally demonstrated to safely reduce such imbalances when compared to sudden terminations. This approach can be efficiently implemented in NURIP using binary exponential charge recovery (BECR) where the inverse of the charge monitoring register is iteratively applied from most-significant bit (MSB) to least-significant bit (LSB). In some cases, while BECR ensures that the digital values sent to analog DACs do not exceed safe limits, the charge monitoring register can be adjusted with feedback from the analog domain to compensate for stimulator nonidealities and varying electrode impedances.
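One plausible software reading of the BECR scheme described above is sketched below (Python); the hardware embodiment applies the inverse of the charge monitoring register bit by bit, and the names and bit width used here are illustrative assumptions:

```python
def becr_steps(charge_register, n_bits=16):
    """Binary exponential charge recovery (BECR) sketch.

    The inverse (negation) of the accumulated net charge is applied
    iteratively from most-significant bit (MSB) to least-significant
    bit (LSB), so each recovery step at most halves the remaining
    imbalance instead of terminating the stimulus abruptly.
    """
    remaining = -charge_register  # inverse of the charge monitoring register
    steps = []
    for bit in range(n_bits - 1, -1, -1):  # MSB -> LSB
        magnitude = 1 << bit
        step = 0
        if remaining >= magnitude:
            step = magnitude
        elif remaining <= -magnitude:
            step = -magnitude
        remaining -= step
        steps.append(step)
    return steps  # the applied steps sum to the negated net charge
```

In some cases, as noted above, the charge monitoring register would additionally be adjusted with feedback from the analog domain to compensate for stimulator nonidealities and varying electrode impedances.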
In an example of the present embodiment of NURIP, a processing unit can be implemented in a 0.13-μm RF CMOS as shown in the micrograph in
Functional verification was experimentally performed by the present inventors using an external debug interface to on-chip logic for data streaming and control. Samples from the EU database are streamed to the preprocessor via a test FPGA for analysis and generated digital waveform values are accessed via memory mapped registers for visualization as shown in
where evaluations were conducted using: *MIT-CHB Database, †Local data, ⋄EU Epilepsy Database.
NURIP implements the broadest range of feature extractors at the cost of increased power consumption, but its classification performance is among the highest demonstrated using the EU Epilepsy Database, as shown in
In this embodiment, NURIP was experimentally verified to integrate accurate brain state classification for patient-specific seizure detection with neuromodulation waveform generation for precise stimulation and contingent seizure abortion, with a processing power consumption of 674.4 μW. The on-chip autoencoder for signal conditioning and dimensionality reduction greatly reduces device computational requirements as the device scales towards higher channel counts. In further embodiments, the SE, PLV and CFC array of feature extractors combined with on-chip machine learning allows NURIP to classify other brain states; for example, those found in Parkinson's disease. The EDM approach to time-series classification has been demonstrated to efficiently encode temporal relationships in biomarkers for classification and to overcome the high memory requirements associated with windowing. In further embodiments, the EDM approach can be used in other applications where long-term dependencies generally need to be considered; for example, financial time-series analysis for stock price prediction, classification of automotive sensor data for advanced driver-assistance systems (ADAS) or autonomous vehicles, analysis of Internet of Things (IoT) ambient sensor data for environmental state classification, and industrial/retail business analytics for inventory monitoring and process optimization.
Although some of the embodiments herein generally describe a support vector machine (SVM) for the machine learning model, any suitable machine learning model or technique can be used; for example, artificial neural networks (ANNs), Logistic Regression, Nearest Neighbors classifiers, or the like.
Although the foregoing has been described with reference to certain specific embodiments, various modifications thereto will be apparent to those skilled in the art without departing from the spirit and scope of the invention as outlined in the appended claims. The entire disclosures of all references recited above are incorporated herein by reference.
Claims
1. A system for interfacing with biological tissue, the system comprising one or more processors and one or more memory units, the one or more processors in communication with the one or more memory units, the one or more processors configured to execute:
- a feature extraction module to implement an extraction approach to extract one or more features from one or more physiological recording signals;
- a machine learning module to apply a machine learning model based on input data to detect a physiological event or condition for classification, the input data comprising the extracted features, the machine learning model trained using a training set comprising feature vectors of time-series data labelled with known occurrences of the physiological event or condition; and
- an output module to output the classification of the machine learning module.
2. The system of claim 1, wherein the system is connectable to one or more electrodes implantable in the biological tissue via an analog front-end, the analog front-end comprising one or more physiological signal acquisition circuits in communication with the biological tissue and a convertor, the analog front-end communicating one or more physiological recording signals to the one or more processors.
3. The system of claim 2, wherein the system is connectable to one or more physiological stimulation channels, the one or more physiological stimulation channels in communication with the biological tissue, the one or more physiological stimulation channels connectable to the one or more processors, the system further comprising a stimulation controller for generating and delivering one or more electronic signals to the one or more physiological stimulation channels, the one or more electronic signals comprising an arbitrary shape waveform.
4. The system of claim 3, wherein the biological tissue comprises tissue within a central nervous system or a peripheral nervous system.
5. The system of claim 3, wherein the one or more physiological signal acquisition circuits comprise neural signal recording channels.
6. The system of claim 5, wherein the physiological signal acquisition circuits comprise at least one of: signal samplers, amplifiers, filters and analog-to-digital convertors.
7. The system of claim 5, wherein the one or more physiological stimulation channels comprises one or more neurostimulation channels.
8. The system of claim 3, wherein the one or more physiological stimulation channels generate and deliver at least one of current, charge, voltage, ultrasound, and magnetic signals.
9. The system of claim 3, wherein the extraction approach comprises a dimensionality reduction approach, the dimensionality reduction approach comprising at least one of: an autoencoder neural network, a principal component analysis (PCA), and an independent component analysis (ICA).
10. The system of claim 3, wherein the one or more features comprise at least one of: signals band energy, signals phase locking feature, signals cross-frequency coupling, signals temporal correlation, and signals spatial correlation.
11. The system of claim 3, wherein the physiological event or condition comprises at least one of: a pathological brain state, a non-pathological brain state, and a physiological event in a peripheral nervous system.
12. The system of claim 3, wherein the arbitrary shape comprises at least one of biphasic pulses, monophasic pulses, sinusoids, and functions of sinusoids.
13. The system of claim 3, wherein the stimulation controller generates the one or more electronic signals in a temporal or spatial periodic pattern.
14. The system of claim 3, wherein the stimulation controller generates the one or more electronic signals when the physiological event is detected by the machine learning module.
15. The system of claim 14, wherein the detected physiological event is a pathological brain state.
16. The system of claim 3, wherein the stimulation controller comprises a charge balancer for generating charge-balanced physiological stimulation waveforms.
17. The system of claim 16, wherein the charge balancer comprises a charge balance monitor.
18. The system of claim 3, wherein the one or more physiological stimulation channels are in communication with the brain via at least one of: the one or more electrodes, electromagnetic coils, antennas, ultrasound sources, light sources, and reservoirs comprising molecular, chemical, or biochemical content.
19. The system of claim 2, wherein the machine learning model is continuously updated using new inputs from the one or more physiological recording signals.
20. A method for interfacing with biological tissue, the method executable on one or more processors, the method comprising:
- extracting one or more features from one or more physiological signals by an extraction approach;
- applying a machine learning model based on input data to detect a physiological event or condition for classification, the input data comprising the extracted features, the machine learning model trained using a training set comprising feature vectors of time-series data labelled with known occurrences of the physiological event or condition; and
- generating and delivering one or more electronic signals to one or more physiological stimulation channels, the one or more electronic signals comprising an arbitrary shape waveform.
21. A computer-implemented method for sampling time-series data, comprising:
- receiving new data from a time-series data stream;
- receiving contemporary data from the time-series data stream;
- applying a sampling recursive window to the contemporary data;
- accessing previously received data from the time-series data stream;
- applying a temporal function to the previously received data;
- subtracting the previously received data with the temporal function applied from the contemporary data; and
- outputting the contemporary data after the subtraction has been applied.
22. A system for sampling time-series data, the system comprising one or more processors and one or more memory units, the one or more processors configured to execute a sampling module to:
- receive new data from a time-series data stream;
- receive contemporary data from the time-series data stream;
- apply a sampling recursive window to the contemporary data;
- access previously received data from the time-series data stream;
- apply a temporal function to the previously received data;
- subtract the previously received data with the temporal function applied from the contemporary data; and
- output the contemporary data after the subtraction has been applied.
23. A computer-implemented method for arbitrary waveform generation for physiological stimulation, comprising:
- generating an arbitrary function signal;
- passing the arbitrary function signal through a charge balance monitor to monitor compliance with predetermined charge limits;
- applying a physiological stimulation with the signal; and
- applying binary exponential charge recovery (BECR) to the signal by determining a net stimulus integral and applying a reverse charge when the integral is not zero due to arbitrary waveform stimulation or due to predetermined limits having been exceeded.
24. A system for arbitrary waveform generation for physiological stimulation, the system connectable to one or more electrodes implantable in a brain, the system comprising:
- an arbitrary waveform generator (AWG) to generate an arbitrary function signal;
- a charge balance monitor to receive the arbitrary function signal and monitor compliance with predetermined charge limits;
- a physiological stimulator to apply a physiological stimulation with the signal; and
- a binary exponential charge recovery (BECR) unit to apply BECR to the signal by determining a net stimulus integral and applying a reverse charge when the integral is not zero due to arbitrary waveform stimulation or due to the predetermined limits having been exceeded.
25. A computer-implemented method for classifying time-series data for identifying a state, the time-series data comprising a series of samples, the method comprising:
- training a machine learning model to classify occurrences of the state by classifying a representative feature vector, using a respective training set, the respective training set comprising feature vectors of the time-series data labelled with occurrences of the state;
- receiving a new time-series data stream;
- determining whether a current sample in the new time-series data stream corresponds to an occurrence of the state by determining a classified feature vector, the classified feature vector determined by passing the current sample and samples in at least one continuous sampling window into the trained machine learning model, each continuous sampling window comprising one or more preceding samples from the time-series data, an epoch for each respective continuous sampling window determined according to a temporal function; and
- outputting the determination of whether the current sample corresponds to an occurrence of the state.
26. The method of claim 25, wherein each continuous sampling window is recursively defined based on the epoch of a previous iteration of the respective window subtracted by the respective temporal function multiplied by the epoch of such previous iteration.
27. The method of claim 25, wherein the at least one continuous sampling window comprises at least two continuous sampling windows, the epoch of each of the continuous sampling windows being defined by different temporal function parameters.
28. The method of claim 25, wherein each of the temporal functions comprises an exponential decay rate, and wherein each exponential decay rate is a reciprocal of a power of 2.
29. The method of claim 28, wherein each exponential decay rate is in the range of 1/2 to 1/(2^16).
30. The method of claim 27, wherein each epoch is on the order of minutes or less.
31. The method of claim 27, wherein the support vector machine learning model uses one of linear, polynomial and radial-basis function (RBF) kernels.
32. The method of claim 26, wherein the at least one continuous sampling window comprises a plurality of continuous sampling windows organized into at least two banks of continuous sampling windows, each bank comprising at least one continuous sampling window, the continuous sampling windows in each bank having different temporal function parameters than the continuous sampling windows in the other banks.
33. The method of claim 27, wherein the time-series data comprises physiological signals and the state comprises a physiological event or condition.
34. The method of claim 33, wherein the time-series data comprises electroencephalography (EEG) signals and the state comprises one or more onset biomarkers associated with a seizure.
35. A system for classifying time-series data for state identification, the system comprising one or more processors and one or more memory units, the one or more memory units storing the time-series data comprising a series of samples, the one or more processors in communication with the one or more memory units and configured to execute:
- a training module for training a machine learning model to classify occurrences of the state by classifying a representative feature vector, using a respective training set, the respective training set comprising feature vectors of the time-series data labelled with occurrences of the state;
- an input module for receiving a new time-series data stream comprising a plurality of samples;
- a temporal function module for defining at least one continuous sampling window, each continuous sampling window comprising one or more samples from the time-series data preceding a current sample, an epoch for each respective continuous sampling window determined according to a respective temporal function;
- a support vector module for determining whether a current sample in the new time-series data stream is an occurrence of the state by determining a classified feature vector, the classified feature vector determined by passing the current sample and samples in the at least one continuous sampling window into the trained machine learning model; and
- an output module for outputting the determination of whether the current sample is an occurrence of the state.
36. The system of claim 35, wherein each continuous sampling window is recursively defined based on the epoch of a previous iteration of the respective window subtracted by the respective temporal function multiplied by the epoch of such previous iteration.
37. The system of claim 36, wherein the at least one continuous sampling window comprises at least two continuous sampling windows, the epoch of each of the continuous sampling windows being defined by different temporal function parameters.
38. The system of claim 35, wherein each of the temporal functions comprises an exponential decay rate, and wherein each exponential decay rate is a reciprocal of a power of 2.
39. The system of claim 38, wherein each exponential decay rate is in the range of 1/2 to 1/(2^16).
40. The system of claim 37, wherein each epoch is on the order of minutes or less.
41. The system of claim 37, wherein the support vector machine learning model uses one of linear, polynomial and radial-basis function (RBF) kernels.
42. The system of claim 35, wherein the temporal function module defines a plurality of continuous sampling windows organized into at least two banks of continuous sampling windows, each bank comprising at least one continuous sampling window, the continuous sampling windows in each bank having different temporal function parameters than the continuous sampling windows in the other banks.
43. The system of claim 37, wherein the time-series data comprises physiological signals and the state comprises a physiological event.
44. The system of claim 43, wherein the time-series data comprises electroencephalography (EEG) signals captured by electrodes in communication with the system, and the state comprises one or more onset biomarkers associated with a seizure.
Type: Application
Filed: Feb 11, 2019
Publication Date: Dec 24, 2020
Inventors: Roman GENOV (Toronto), Gerard O'LEARY (Toronto)
Application Number: 16/968,439