SYSTEMS AND METHODS OF MONITORING SOCIAL INTERACTIONS IN A GROUP OF ORGANISMS OVER A PERIOD OF AT LEAST 24 HOURS IN A SEMI-NATURAL ENVIRONMENT

A method of monitoring social interactions among a plurality of model organisms. The method comprises marking each of a plurality of model organisms in a monitored space, divided into a plurality of spatial segments, with one of a plurality of unique visual markers, capturing a sequence of images of the plurality of unique visual markers, calculating at least one spatiotemporal model of at least one of the plurality of model organisms by identifying in which of the plurality of segments each unique visual marker is located during each of a plurality of temporal frames of a period, and outputting the at least one spatiotemporal model.

Description
RELATED APPLICATION

This application claims priority from U.S. Patent Application No. 61/514,705, filed on Aug. 3, 2011, and from U.S. Patent Application No. 61/531,107, filed on Sep. 6, 2011. The contents of the above documents are incorporated by reference as if fully set forth herein.

FIELD AND BACKGROUND OF THE INVENTION

The present invention, in some embodiments thereof, relates to monitoring systems and, more particularly, but not exclusively, to systems and methods of monitoring social interactions in a group of organisms, for example over a period of at least 24 hours in a semi-natural environment.

In recent years, various systems and methods for monitoring social interactions in a group of organisms, for example rodents, in a semi-natural environment have been developed.

For example, NewBehaviour™ developed a system, named SocialBox™, which is an extension of the IntelliCage™ system. The system allows monitoring of the spatial distribution of mice within a setup. The system uses transponder-based RFID technology to monitor the individual social behaviour of mice. Additional cages can be used either for monitoring social interactions between mice or for monitoring individual spatial preference patterns. The SocialBox™ may be used for different tasks, including monitoring of preference or avoidance in experiments by creating different environments in the boxes (e.g. different conditions of lighting, temperature, color, or padding, or by olfactory/acoustic devices or novel objects). SocialBox™ also allows detection of social patterns.

SUMMARY OF THE INVENTION

According to an aspect of some embodiments of the present invention there are provided methods and systems of identifying one or more individual or collective social behaviours of a group of model organisms by quantifying interaction complexity among members of subgroups of the group. The methods and systems are based on recorded spatiotemporal data of a plurality of model organisms in a monitored space during a period, optionally of 24 hours or more. The spatiotemporal data is analyzed to identify a plurality of interactions between members of each subgroup. This allows quantifying the interaction complexity of each subgroup and identifying accordingly one or more social behaviours, such as aggressiveness and instinctive behaviour.

According to an aspect of some embodiments of the present invention there are provided methods and systems of monitoring social interactions among a plurality of model organisms to extract a mathematical model of interactions. The methods are based, for example, on marking each of a plurality of model organisms in a monitored space, which is divided into a plurality of spatial segments, with one of a plurality of unique visual markers, such as a color marker and/or a visual machine-readable code. This allows capturing a sequence of images of the unique visual markers and calculating a spatiotemporal model of one or more of the model organisms by identifying in which of the segments each unique visual marker is located during each of a plurality of temporal frames of a period, optionally of 24 hours or more.

Optionally, an analysis of the spatiotemporal model allows characterizing the nature of one or more interactions among at least some of the model organisms.

Optionally, some of the segments are identified as related to one of a plurality of distinct regions, such as objects and food and water sources.

Optionally, the characterizing is performed by comparing the locations of different model organisms during the period and/or by analyzing one or more entropy measures extracted from the spatiotemporal model.

Optionally, a plurality of chase-escape interactions of each of a plurality of pairs of the model organisms are identified according to the spatiotemporal model. These chase-escape interactions may be used to calculate a hierarchical structure of the model organisms.

Optionally, trails of the model organisms are calculated based on the spatiotemporal model and used to characterize the nature of interactions among the model organisms.

According to an aspect of some embodiments of the present invention there is provided a system of monitoring social interactions among a plurality of model organisms that comprises an imaging system which captures a sequence of images of a plurality of unique visual markers, each marking another of a plurality of model organisms, in a monitored space divided into a plurality of spatial segments. The system further includes an image processing module which analyses, using a processor, the sequence of images to identify in which of the segments each of the unique visual markers is located during each of a plurality of temporal frames of a period, and a characterizing module which calculates a spatiotemporal model according to the identification and characterizes the nature of one or more interactions among at least some of the plurality of model organisms by an analysis of the spatiotemporal model.

According to an aspect of some embodiments of the present invention there is provided a method of monitoring social interactions among a plurality of model organisms. The method is based on recording spatiotemporal data of a plurality of model organisms in a monitored space during a period, analyzing the spatiotemporal data to identify a plurality of interactions between members of each of a plurality of subgroups of the plurality of model organisms, calculating a plurality of entropy measures, each for interactions in another of the plurality of subgroups, and characterizing a social behaviour of at least some of the model organisms according to the plurality of entropy measures.

Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.

Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.

For example, hardware for performing selected tasks according to embodiments of the invention could be implemented as a chip or a circuit. As software, selected tasks according to embodiments of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In an exemplary embodiment of the invention, one or more tasks according to exemplary embodiments of method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data. Optionally, a network connection is provided as well. A display and/or a user input device such as a keyboard or mouse are optionally provided as well.

BRIEF DESCRIPTION OF THE DRAWINGS

Some embodiments of the invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.

In the drawings:

FIG. 1 is a flowchart of a method of monitoring social interactions among a plurality of model organisms, such as rodents, according to some embodiments of the present invention;

FIG. 2 is a schematic illustration of components of a system of monitoring social interactions, for example by implementing the method depicted in FIG. 1, according to some embodiments of the present invention;

FIGS. 3A-3D depict an exemplary frame that is processed during a tracking process and processed versions thereof, according to some embodiments of the present invention;

FIGS. 3E-3G are a set of graphs that respectively depict an exemplary probability that an exemplary pixel belongs to an organism, calculated based on hue (H), saturation (S), and brightness (V) values, according to some embodiments of the present invention;

FIG. 3H depicts translations of segments into estimated trails of organisms (i.e. mice), for example using Viterbi's algorithm, according to some embodiments of the present invention;

FIGS. 4A-4D are schematic illustrations and images of a top view of an exemplary monitored space with four monitored organisms (mice), according to some embodiments of the present invention;

FIG. 4E is a set of individual histograms of time spent in the different regions of a monitored space;

FIG. 4F depicts a distribution of continuous time periods spent by one typical organism (mouse) at each region of interest;

FIGS. 4G-4H depict an open 70 centimetres (cm)×50 cm×50 cm box that encloses the monitored space, according to some embodiments of the present invention;

FIGS. 5A-5D depict exemplary spatiotemporal models generated according to the data gathered while monitoring the monitored space depicted in FIGS. 4A-4D during a monitoring period, according to some embodiments of the present invention;

FIG. 6A is a schematic illustration that depicts an accuracy of a 3rd order maximum entropy model;

FIG. 6B is a graph that depicts a distribution of weights according to the order of interactions in a regularized model, according to some embodiments of the present invention;

FIG. 6C is a schematic illustration that depicts pairwise interactions between organisms in several subgroups, according to some embodiments of the present invention;

FIG. 6D is a schematic illustration that depicts pairwise interactions of a typical group, according to some embodiments of the present invention;

FIGS. 6E-6F depict dominant pairwise interactions and triplet interactions for one group overlaid on a drawing of an exemplary monitored space, according to some embodiments of the present invention;

FIG. 7A depicts representative ethograms of contacts between organisms (mice) that were classified as aggressive and/or non-aggressive in two different spaces;

FIG. 7B depicts a group hierarchy as a tree with the minimum number of levels that preserves relative ranking between exemplary organisms; and

FIG. 7C depicts a correlation between a hierarchy rank and environment exploration.

DESCRIPTION OF EMBODIMENTS OF THE INVENTION

The present invention, in some embodiments thereof, relates to monitoring systems and, more particularly, but not exclusively, to systems and methods of monitoring social interactions in a group of organisms, for example rodents, for instance over a period of at least 24 hours in a semi-natural environment.

According to some embodiments of the present invention there are provided methods and systems of automatically gathering spatiotemporal data of the location and optionally the behaviour of a plurality of members in a group of monitored organisms and automatically analyzing the spatiotemporal data to identify or characterize social related characteristics of the monitored organism group or any subgroup thereof. The social related characteristics may include group behaviour, subgroup behaviour, social hierarchy among the model organisms, and/or the effect of external factors (also referred to as environmental factors) on the group and/or subgroup behaviour (e.g. weather change, food supply and/or the like).

According to some embodiments of the present invention there are provided methods and systems of monitoring social interactions among a plurality of model organisms by recording spatiotemporal data of a plurality of model organisms in a monitored space during a monitoring period, analyzing the spatiotemporal data to identify a plurality of interactions between members of subgroups of the model organisms, calculating entropy measures for the interactions in each subgroup, and characterizing a social behaviour of the model organisms according to the plurality of entropy measures.

Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.

Reference is now made to FIG. 1, which is a flowchart of a method 100 of monitoring social interactions among a plurality of model organisms, such as rodents, according to some embodiments of the present invention. Reference is also made to FIG. 2, which is a schematic illustration of components of a system 200 of monitoring social interactions, for example by implementing the method depicted in FIG. 1, according to some embodiments of the present invention. As shown, software components include an input interface 61 for receiving a sequence of images, for example a video sequence, from one or more image sensors 71 in a monitored space 72, optionally over a network, a capturing module 63, and an output module 65. For example, the image sensor 71 is a color sensitive camera, such as a Panasonic Color CCTV, WV-CL924AE, that is placed above the monitored space. Optionally, frames are captured at a sampling rate of about 25 frames per second (40 ms per frame).

The method 100 and system 200 allow analyzing patterns, optionally repetitive, of interactions between specific members of a group of model organisms, for example several, dozens, hundreds, and/or thousands of model organisms, in a monitored space. These patterns, referred to herein as interaction patterns, may be used for detecting social behaviours and for exploring the interplay of individual and group relations in decision making, information transfer, learning, empathy and/or the like. Optionally, the patterns are indicative of a joint activity of subgroups.

The method 100 and/or system 200 are optionally set for automatic tracking of individuals in groups of model organisms over a relatively long period of 24 hours or more, for example several days, a week, a month, a year, and/or any intermediate period. Optionally, an entropy measure, for example maximum entropy, is calculated to quantify a nature of correlated group behaviour and/or used to map the social interactions between individuals.

First, as shown at 101, each of a plurality of model organisms which are designated to be in the monitored space 72 is marked, either physically or by identifying uniquely characterising features. For example, each organism is marked with another of a plurality of unique visual markers, for example colour marks. Alternatively, the marking may be performed by selecting one or more unique visual signatures, for example a length of an organ, a walking pattern and/or the like. The monitored space 72 is optionally divided into a plurality of spatial segments, also referred to as spatial areas. For example, the organisms are color marked; the marking may be performed with fluorescent dyes, optionally hair dyes, for example semi-permanent and/or other dyes which glow under dark light.

Then, as shown at 102, a sequence of images, for example a video stream, imaging the unique visual markers in the monitored space is captured during a monitoring period of a few hours, 24 hours or more, for example several days, a week, a month, a year, and/or any intermediate period.

As shown at 103, the sequence of images (also referred to as frames) is analyzed, for example in real time, in intervals, or after the capturing is completed, to calculate one or more spatiotemporal models of one or more of the marked model organisms by identifying in which of the segments each unique visual marker is located during each of a plurality of temporal frames of the monitoring period.

Optionally, in real time, organisms are identified and tracked automatically, optionally according to their color-marked fur colors, which are learned from labeled data. Optionally, as some frames in the sequence may have reflection artifacts, for example as an outcome of a low signal-to-noise ratio (SNR) derived from dim lighting, camera sensitivity, and/or missing parts, a Bayesian model is used to infer an estimated location of an organism from a given observed location of connected colored blobs.

Reference is now made to FIGS. 3A-3D, which depict an exemplary frame that is processed during a tracking process, according to some embodiments of the present invention. FIG. 3A depicts an input frame, FIG. 3B depicts the use of an adaptive threshold based on the amount of noise in the image, and FIG. 3C depicts segments formed by assigning each pixel to its most probable organism color. FIG. 3D depicts the frame after small separated color patches are filtered out, and the probability of each segment to represent organism m, p(m|c_{t,i}), is computed. Optionally, the analysis of the captured images and the extraction of the coordinates of each organism are based on probabilistic tracking. Each video frame at time t, for example as shown in FIG. 3A, is processed using an adaptive threshold that takes into account the amount of noise in the frame. Optionally, per pixel, a probability that it belongs to an organism m is calculated based on its hue (H), saturation (S), and brightness (V) values, p(m|H,S,V), for example see FIGS. 3E-3G. The computation is based on Bayes' law:

$$p(m\mid H,S,V)=\frac{p(H\mid m)\,p(S\mid m)\,p(V\mid m)}{\sum_{m'} p(H\mid m')\,p(S\mid m')\,p(V\mid m')+p(H\mid -)\,p(S\mid -)\,p(V\mid -)},$$

where equal prior probabilities are assumed for the organisms and p(·|−) denotes the likelihood of the background (non-organism) class. For color segmentation, the most probable organism assignment is used in order to divide the segments among the organisms, see FIG. 3C. After filtering out small separated areas, a frame similar to the frame depicted in FIG. 3D is obtained. For each such segment c_{t,i}, an average probability of belonging to an organism m is computed using the following:

$$p(m\mid c_{t,i})=\frac{\sum_{(H,S,V)\in c_{t,i}} p(m\mid H,S,V)}{\sum_{(H,S,V)\in c_{t,i}} 1}.$$
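By way of non-limiting illustration, the following Python sketch computes the above per-pixel posterior and the per-segment average, assuming the per-channel likelihoods p(H|m), p(S|m), p(V|m) and a background likelihood have already been learned from labeled data and binned into lookup arrays; the function and variable names are hypothetical and are not part of the described system.

```python
import numpy as np

def pixel_posteriors(hsv_pixels, channel_likelihoods, background_likelihoods):
    """Posterior p(m | H, S, V) per pixel, assuming channel independence
    and equal priors for all organisms and the background.

    hsv_pixels: (N, 3) array of binned integer H, S, V values.
    channel_likelihoods: one entry per organism m, each a tuple of three
        1-D lookup arrays (p(H|m), p(S|m), p(V|m)) learned from labeled data.
    background_likelihoods: the same tuple for the "not an organism" class.
    """
    h, s, v = hsv_pixels[:, 0], hsv_pixels[:, 1], hsv_pixels[:, 2]
    # Numerator per organism: p(H|m) p(S|m) p(V|m)
    numerators = np.stack([pH[h] * pS[s] * pV[v]
                           for pH, pS, pV in channel_likelihoods], axis=1)
    bH, bS, bV = background_likelihoods
    background = bH[h] * bS[s] * bV[v]
    denom = numerators.sum(axis=1) + background
    return numerators / denom[:, None]          # shape (N, n_organisms)

def segment_posterior(hsv_pixels, channel_likelihoods, background_likelihoods):
    """Average posterior p(m | c_{t,i}) over all pixels of one colored segment."""
    return pixel_posteriors(hsv_pixels, channel_likelihoods,
                            background_likelihoods).mean(axis=0)
```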

Optionally, the blobs c_{t,i} are translated during the segmentation into trails of organisms (i.e. mice), for example using Viterbi's algorithm, see FIG. 3H. For each time frame t, an emission probability p(m|c_{t,i}) is computed per segment c_{t,i}, and transition probabilities p(c_{t,i}|c_{t−1,j}) are computed from all the segments c_{t−1,j} at time t−1, optionally including the probability of segments that were not picked up by the system during the segmentation, as well as transitions into and out of the nests. This allows calculating the most likely segment sequence, maximizing p(c_{1,i_1}, . . . , c_{T,i_T}|m). The transition probabilities p(c_{t,i}|c_{t−1,j}) are computed based on the distance between segments. For each frame, three additional special segments are added to account for hidden segments (in case the system loses track of the mouse during the segmentation) and for the two nests.
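A minimal sketch of the Viterbi step described above is given below, assuming the emission probabilities p(m|c_{t,i}) and the distance-based transition probabilities have already been computed per frame (including the extra "hidden" and nest segments); this is an illustrative implementation under those assumptions, not necessarily the exact one used.

```python
import numpy as np

def viterbi_trail(emissions, transitions):
    """Most likely segment index per frame for one organism.

    emissions:   list over frames t of 1-D arrays, emissions[t][i] ~ p(m | c_{t,i}),
                 including the extra "hidden" and nest segments.
    transitions: list over frames t >= 1 of 2-D arrays,
                 transitions[t][i, j] ~ p(c_{t,i} | c_{t-1,j}), e.g. decaying
                 with the distance between segment centroids.
    """
    T = len(emissions)
    log_delta = np.log(emissions[0] + 1e-12)
    back = []
    for t in range(1, T):
        # score of reaching segment i at time t through segment j at time t-1
        scores = log_delta[None, :] + np.log(transitions[t] + 1e-12)  # (i, j)
        back.append(scores.argmax(axis=1))
        log_delta = scores.max(axis=1) + np.log(emissions[t] + 1e-12)
    # Backtrack the most likely trail from the last frame to the first
    trail = [int(log_delta.argmax())]
    for ptr in reversed(back):
        trail.append(int(ptr[trail[-1]]))
    return trail[::-1]
```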

As shown at 104, the one or more spatiotemporal models are outputted.

As shown at 105, the spatiotemporal models are now analyzed to identify one or more social related characteristics of the group or any subgroup of the group. The social related characteristics may include social behaviours of certain model organisms when in proximity to other members of the group, social behaviours of certain subgroups of the model organisms, social hierarchy among the model organisms, and/or the effect of an external factor, which is in effect during the monitoring period, on the monitored organism group and/or any subgroup of the monitored organism group. The social related characteristics may be extracted from one or more dependencies (correlations) between a spatiotemporal behavioural characteristic of one organism and one or more similar and/or different spatiotemporal behavioural characteristics of one or more of the other organisms. The spatiotemporal behavioural characteristic may be a location, an activity, a dominance behaviour indication, a volume, and/or a physical characteristic which changes with time, for example bristling hair, teeth exposure and/or the like.

The method 100 and system 200 allow detecting complex behaviour patterns of individuals in a group, wherein the behaviour of one organism in the group is affected by the behaviour of other organisms in the group, and/or group behaviour patterns which are affected by external parameters, for example changes in the environment and/or the like.

The spatiotemporal models are optionally maximum entropy models which quantify the contribution of different orders (i.e. subgroup sizes) of correlations in the group (for example correlations between organisms in different subgroups), including irreducible high-order interactions between organisms that go beyond pairwise relations. The spatiotemporal models may be used to characterize a social interaction map among organisms, for example as further exemplified below. The social hierarchy in the group is optionally deduced by detecting aggressive contacts among organisms. These results quantitatively show the limitations of pairwise-based approaches, and the importance of studying groups in a naturalistic environment, to reveal complex social structure.

Optionally, spatiotemporal movement and/or location patterns of subgroup members are analyzed to identify environmental and/or genetic influences on the social behavior of the subgroup members.

For example, reference is now made to FIGS. 4A-4D, which are schematic illustrations and images of a top view of an exemplary monitored space with four monitored mice, according to some embodiments of the present invention. FIGS. 4G-4H depict an open 70 centimetres (cm)×50 cm×50 cm box that encloses the monitored space. As shown in FIG. 4A, the monitored space is divided in advance into 10 segments, which are set around the following regions of interest: an open field (1), a Z-wall (2), a water source (3), feeders (4, 5), an area above a small nest (6), an area above a large nest (7), a block (8), an inner space of the small nest (9), and an inner space of the large nest (10).

Optionally, ultraviolet (UVA) fluorescent lamps are placed above the monitored space to illuminate the surrounding area, for example with 370-380 nanometer (nm) light which mice cannot see. Optionally, to avoid reflections, the monitored space is covered. FIG. 4B depicts an aggregation of images of a video recording and color-based tracking of a group of mice in the dark. FIG. 4C depicts paths of the 4 organisms (i.e. mice) which have been monitored during a period of 15 minutes (min). FIG. 4D depicts a heat map of the monitored space showing the relative amount of time the organisms (i.e. mice) spent in different segments of the monitored space. Data shown is from one typical group during one day, where red corresponds to highly visited points and blue to less visited points. Reference is also made to FIG. 4E, which is a set of individual histograms of the time spent in the different regions of the monitored space (same group as depicted in FIG. 4D). FIG. 4F depicts a distribution of continuous time periods spent by one typical organism (mouse) at each region of interest.

Optionally, joint activity patterns, for example as acquired from the model depicted in FIGS. 4A-4D, are characterized using a discretized representation of spatial configurations thereof. For brevity, (x1,x2,x3,x4) denotes a state of the group at time t, given by a vector where xi denotes the location of organism i at t, with xi=1 . . . 10 denoting the regions defined in FIG. 4A. An example of these state vectors as a function of time is shown in FIG. 5A, with a time bin of Δt=240 milliseconds (ms). In particular, FIG. 5A depicts the independence model, where the joint configuration of the organisms (i.e. mice) at each time frame is represented by a four-dimensional vector, and each dimension denotes the location of a particular organism in one of the 10 regions of interest.

For brevity, the empirical probability to find the group in a given spatial configuration is denoted Pempirical(x1,x2,x3,x4). Optionally, group behavior patterns of the organisms, namely patterns that go beyond individual traits, are acquired by comparing between Pempirical and the prediction of an independent (correlation-free) model that assumes that the organisms choose locations based on individual preferences, ignoring other organisms, for example as follows: Pind(x1,x2,x3,x4)=P(x1)P(x2)P(x3)P(x4). Optionally, these distributions are distinct in terms of the number of observed configurations. For example, FIG. 5B indicates how a group behavior is different from a single organism behavior. FIG. 5B depicts a model that compares between the empirical probability distribution of observed configurations and the predicted distribution from a model that assumes independence between organisms (i.e. mice). Configurations are ranked from the most prevalent to the least prevalent (optional). The figure depicts a distribution of observed states for one typical group, where out of the 10^4 possible states (of four organisms (i.e. mice) in 10 zones), only about 2000 occurred during the monitoring period (marked in blue), whereas the independent model predicts that about 4000 states would typically occur during such a monitoring period (marked in red).
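For illustration, the following sketch builds the empirical distribution of four-dimensional configurations and the independent (first order) prediction from discretized location data; the data layout (a T×4 array of zone labels) and the function names are assumptions made for the example.

```python
import numpy as np
from collections import Counter

def empirical_and_independent(locations):
    """Compare P_empirical(x1..x4) with the independent model P(x1)P(x2)P(x3)P(x4).

    locations: (T, 4) integer array; locations[t, i] is the zone (1..10) of
               organism i at time bin t (e.g. bins of 240 ms).
    """
    T, n = locations.shape
    # Empirical joint distribution over observed 4-dimensional configurations
    counts = Counter(map(tuple, locations))
    p_emp = {state: c / T for state, c in counts.items()}

    # Independent (first-order) model from the individual zone histograms
    marginals = [Counter(locations[:, i]) for i in range(n)]
    def p_ind(state):
        p = 1.0
        for i, z in enumerate(state):
            p *= marginals[i][z] / T
        return p

    return p_emp, p_ind

# Example usage: how many distinct configurations were actually observed
# p_emp, p_ind = empirical_and_independent(locations)
# print(len(p_emp), "of", 10**4, "possible states were observed")
```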

Reference is now made to FIG. 5C which depicts a model mapping a fraction of uncertainty about the location of an organism that can be read from the location of other group members (mutual information about location, divided by location entropy). Every dot shows the fraction of information about the location of organism i that can be read from the joint location of the other three versus the sum of pairwise information terms between i and each of the other organisms.

Optionally, the model allows deducing the location of one organism xi from the location of another xj. This dependency between two organisms may be measured according to an amount of information, denoted herein as I(xi; xj). Optionally, a normalized measure of the fraction of uncertainty about the location of organism i that can be read from the location of organism j may be acquired by dividing I(xi; xj) by the entropy of the location of organism i, denoted herein as H(xi).

Optionally, the model allows deducing the location of one organism xi from the locations of a number of other organisms. For example, FIG. 5C depicts how the locations of three organisms give, on average, 25% of the information about the location of a fourth organism. This information, denoted herein as I(xi; {xj,xk,xl}), reflects a synergistic relation between the organisms of the group. It should be noted that I(xi; {xj,xk,xl}) is higher (more than double) than the naïve sum of pairwise information values I(xi; xj)+I(xi; xk)+I(xi; xl), indicating that the group is more complex than the whole collection of pairs (or any pair separately).
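The information quantities discussed above may be estimated, for example, with plug-in (empirical) entropies, as in the following sketch; the use of plug-in estimators and the function names are assumptions for illustration, and other entropy estimators may equally be used.

```python
import numpy as np
from collections import Counter

def entropy(samples):
    """Plug-in (empirical) entropy in bits of a sequence of hashable samples."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mutual_information(x, y):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) from paired samples."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def location_synergy(locations, i):
    """Fraction of uncertainty about organism i readable from the other three,
    versus the naive sum of the pairwise terms (locations: (T, 4) zone array)."""
    others = [j for j in range(locations.shape[1]) if j != i]
    xi = list(locations[:, i])
    joint_others = list(map(tuple, locations[:, others]))
    hi = entropy(xi)
    i_joint = mutual_information(xi, joint_others)
    i_pairs = sum(mutual_information(xi, list(locations[:, j])) for j in others)
    return i_joint / hi, i_pairs / hi
```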

FIG. 5D includes a number of models (first to fourth order models) mapping the accuracy of maximum entropy models in describing the joint group configurations of a representative subgroup. Each dot corresponds to a certain configuration state of the group. The grey funnel shows a 95% confidence interval of the estimation of the empirical distribution of configurations. Examples of two specific configurations are highlighted in all graphs (green and blue dots), to show the improvement of model accuracy over orders. The right side of the figure depicts a breakdown of the total group correlation, or multi-information IN, into the contribution of the pairwise interactions between mice, I(2), the triplet interactions between them, I(3), and the 4th order contribution, I(4).

According to some embodiments, the one or more spatiotemporal models which are extracted from the sequence of images are analyzed to identify dependencies based on a hierarchy of models which describe optional group configurations. The group configurations are optionally set based on different orders of correlations among the organisms, for example pair configurations, triplet configurations, and/or the like. The contribution of each of these optional group configurations (also referred to as correlation orders) to the whole group behavior is measured. Optionally, the distribution of values of optional group configurations is measured. As the entropy of a distribution measures its randomness or lack of structure, a minimal model that relies on these observed correlations without other assumptions is uniquely given by a maximum entropy (ME) distribution consistent with the observed correlations, see Schneidman, E., Still, S., Berry, M. J., 2nd & Bialek, W. Network information and connected correlations. Phys Rev Lett 91, 238701 (2003).

Optionally, corresponding maximum entropy models are built from the monitored data to uncover the contributions of different orders of interactions among organisms. For example, with reference to the example in FIGS. 4A-4F, a first order model may be based just on the individual behavior of each organism and optionally assumes no dependencies between the monitored organisms, so that P(1)(x1,x2,x3,x4)=P(x1)P(x2)P(x3)P(x4). A pairwise ME model, which accounts for all correlations between pairs (but no higher-order ones), may be set as follows:

$$P^{(2)}(x_1,x_2,x_3,x_4)=\frac{1}{Z}\exp\!\left(\sum_i \alpha_i(x_i)+\sum_{i<j}\beta_{ij}(x_i,x_j)\right),$$

where a parameter αi(xi) is calculated per organism i for each location xi, and a parameter βij(xi,xj) is calculated per pair i and j, one for every combination of locations xi and xj; the parameters are set such that the model marginal probabilities agree with the observed P(xi) and P(xi,xj), and Z denotes a normalization factor. Similarly, the third order model is given by a distribution of a similar form, with interaction parameters γijk(xi,xj,xk) for each triplet and combination of locations; the fourth order model, P(4), uses all possible relations among the organisms. The maximum entropy models give the most parsimonious explanation of the data, and therefore are not just an arbitrary “modeling approach”, but rather the least structured models one could build. FIG. 5D depicts the accuracy of the different models in describing the empirical distribution of the spatial configurations of the organisms (i.e. mice). A discrepancy between Pempirical and the independent P(1) model reflects the effect of the correlations of all orders among the organisms. The pairwise model, P(2), is a better model of the group behavior and captures much of the correlations, but still shows considerable differences from the empirical data. P(3) gives a close approximation to the empirical data, almost as good as P(4), which relies on all correlations between mice.
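Because the state space of four organisms in ten zones is small enough to enumerate, the pairwise ME model may be fitted, for example, by gradient ascent that matches the model marginals to the observed ones, as in the following sketch; the learning rate, iteration count, and function names are illustrative assumptions rather than the parameters actually used.

```python
import numpy as np
from itertools import product, combinations

def fit_pairwise_maxent(locations, n_zones=10, lr=0.5, n_iter=500):
    """Gradient-ascent sketch of the pairwise maximum entropy model
    P(2)(x) = (1/Z) exp( sum_i alpha_i(x_i) + sum_{i<j} beta_ij(x_i, x_j) ).

    locations: (T, n) integer array of zone indices in 0..n_zones-1.
    Returns the fitted alpha, beta and the model distribution over all states
    (as computed in the final iteration).
    """
    T, n = locations.shape
    states = np.array(list(product(range(n_zones), repeat=n)))   # (n_zones**n, n)
    pairs = list(combinations(range(n), 2))

    # Observed marginals P(x_i) and P(x_i, x_j)
    p1 = np.zeros((n, n_zones))
    for i in range(n):
        p1[i] = np.bincount(locations[:, i], minlength=n_zones) / T
    p2 = {pr: np.zeros((n_zones, n_zones)) for pr in pairs}
    for i, j in pairs:
        np.add.at(p2[(i, j)], (locations[:, i], locations[:, j]), 1.0 / T)

    alpha = np.zeros((n, n_zones))
    beta = {pr: np.zeros((n_zones, n_zones)) for pr in pairs}
    for _ in range(n_iter):
        # Model distribution over all enumerable states
        logp = sum(alpha[i, states[:, i]] for i in range(n))
        logp = logp + sum(beta[(i, j)][states[:, i], states[:, j]] for i, j in pairs)
        p = np.exp(logp - logp.max())
        p /= p.sum()
        # Gradient = observed marginals - model marginals
        for i in range(n):
            model_p1 = np.bincount(states[:, i], weights=p, minlength=n_zones)
            alpha[i] += lr * (p1[i] - model_p1)
        for i, j in pairs:
            model_p2 = np.zeros((n_zones, n_zones))
            np.add.at(model_p2, (states[:, i], states[:, j]), p)
            beta[(i, j)] += lr * (p2[(i, j)] - model_p2)
    return alpha, beta, dict(zip(map(tuple, states), p))
```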

Optionally, the contribution of each interaction order may be quantified by estimating the total correlation of all orders in the group using the multi-information IN, for example as described in Schneidman, E., Berry, M. J., 2nd, Segev, R. & Bialek, W. Weak pairwise correlations imply strongly correlated network states in a neural population. Nature 440, 1007-12 (2006). This total group correlation is given by the difference between the entropy of the independent (first order) model and the entropy of the empirical distribution, IN=H[P(1)({xi})]−H[Pempirical({xi})]. The exact contribution of order k to the multi-information is given by I(k)=H[P(k−1)]−H[P(k)], where H[P(k)] denotes the entropy of the ME model of order k, see Schneidman, E., Still, S., Berry, M. J., 2nd & Bialek, W. Network information and connected correlations. Phys Rev Lett 91, 238701 (2003). In the above example, over all groups, the pairwise ME model, P(2), explained 57.2%±10.2% of the total correlations, while the ME model that also includes third order interactions between mice, P(3), explained 92.8%±2.9% (the right side of FIG. 5D shows the results for the left side of FIG. 5D). Thus, already for a group of four organisms (mice), higher-order interactions achieve better outcomes than using pairwise interactions alone.
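Given fitted models of increasing order (for example from a routine such as the sketch above) and the empirical distribution, the multi-information and its per-order breakdown follow directly from the entropy differences defined above, as in the following short sketch under those assumptions.

```python
import numpy as np

def dist_entropy(p):
    """Entropy in bits of a distribution given as {state: probability}."""
    probs = np.array([v for v in p.values() if v > 0.0])
    return float(-(probs * np.log2(probs)).sum())

def correlation_breakdown(p_models, p_empirical):
    """Multi-information I_N and per-order contributions I(k) = H[P(k-1)] - H[P(k)].

    p_models: list [P(1), P(2), ..., P(N)] of maximum entropy models of
              increasing order, each a {state: probability} dict.
    """
    i_n = dist_entropy(p_models[0]) - dist_entropy(p_empirical)
    contributions = [dist_entropy(p_models[k - 1]) - dist_entropy(p_models[k])
                     for k in range(1, len(p_models))]
    # Fraction of the total correlation explained by each order
    fractions = [c / i_n for c in contributions]
    return i_n, contributions, fractions
```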

According to some embodiments of the present invention, the analysis of the sequence of images is used to generate one or more functional social interaction maps, for example ME models that describe the group of organisms in terms of the dependencies of different orders among the organisms. Optionally, an ME model is a general form of a Potts model, see Landau, L. D. & Lifshitz, E. M. Statistical Physics (Pergamon Press, Oxford, 1980). In such embodiments, the parameters of the models describe a map of functional interactions between the organisms, for example as described in Schneidman, E., Berry, M. J., 2nd, Segev, R. & Bialek, W. Weak pairwise correlations imply strongly correlated network states in a neural population. Nature 440, 1007-12 (2006). Optionally, in order to create a more compact model, a 3rd order ME model, denoted herein as P*, is constructed in a regularization process, for example by introducing a cost for each used parameter, for example penalizing every interaction term that is not zero. The accuracy of such a 3rd order maximum entropy model is depicted in FIG. 6A, where model predictions of spatial configurations of a group of organisms, the mice from the monitored space of FIG. 4A, are shown with reference to the empirical distribution. As depicted in FIG. 6A, this model is almost as accurate as the full P(3) model, but with fewer parameters (in the above monitored space, only 935 parameters, of all orders, are larger than 0.1); for example, see FIG. 6B, which depicts the distribution of weights according to the order of interactions in the regularized P* model above the horizontal line, compared to the model without regularization shown below the line. The distribution is over the parameters of all groups of organisms in the monitored space of FIG. 4A (for their second day in the monitored space). FIG. 6C depicts 270 pairwise interactions between the organisms (i.e. mice) in several of the subgroups from the group in the monitored space of FIG. 4A. See also FIG. 6D, which depicts the pairwise interactions of a typical group. These interactions are the weights of second order interactions in a regularized third order maximum entropy model. Each panel corresponds to one pair of mice, and its rows and columns correspond to the locations within the arena according to the legend at the bottom of the figure. In this example, most of the non-zero pairwise interactions are negative. It should be noted that FIG. 6C depicts full pairwise interaction maps for four representative groups (the group on the left is used in the following panels). In each of the maps, the colored dots represent the location of a mouse according to the color-coding at the bottom of the figure. The colors of the mice are depicted near their corresponding locations. The vertices show the strength of the interaction between the mice, which can be positive (red) or negative (blue). The vertices' width reflects the interaction strength. FIG. 6E depicts dominant pairwise interactions for one group overlaid on a drawing of the monitored space; in the figure, dominant positive and negative pairwise interactions are overlaid on a diagram of the monitored space, filled mice icons show positive interactions, empty mice icons show negative interactions, and a star denotes that the mouse is on one of the nests. FIG. 6F depicts positive and negative dominant triplet interactions for the same group as in FIG. 6E, overlaid on a diagram of the monitored space.

According to some embodiments of the present invention, the analysis of the sequence of images is used to identify a dependency between social interactions and/or group correlations and environment parameters, for example the availability of resources. The social interactions and/or group correlations may be indicative of population density, an aggression behavior pattern, a dominance behavior pattern, and/or a territoriality behavior pattern. Optionally, the dependency is identified by matching behavioral patterns and social phenotypes of groups in a natural environment and/or a standard laboratory condition environment.

According to some embodiments of the present invention, the analysis of the sequence of images is used to identify an effect of one or more environment parameters on a group and/or a subgroup characteristic, for example a social hierarchy, also referred to as a group hierarchy structure. Optionally, the hierarchy structure is set by identifying and counting the number of aggressive events between each pair of organisms. For example, these aggressive events may be organisms which fight to determine social dominance (in terms of access to females, food, and territory). For example, FIG. 7A depicts representative ethograms of contacts between organisms (mice) that were classified as aggressive and/or non-aggressive in two different spaces: a complex space, which includes a variety of objects such as shelters, tunnels, running wheels, and mouse nest boxes, and a standard space, such as a common laboratory cage. For example, FIG. 7B depicts a daily group hierarchy as a tree with the minimum number of levels that preserves the relative ranking between exemplary organisms, namely mice. Organism (mouse) i is ranked higher than mouse j if there are more aggressive events between them in which organism (mouse) i chased organism (mouse) j than the opposite. Optionally, organisms with no significant difference in their levels of aggressiveness are ranked at the same level. A contact between two organisms is optionally automatically identified and classified as an event in which the distance between the two mice is less than about 10 cm. The movement direction of one organism relative to another organism (θ) is optionally analyzed to identify the nature of the contact for each of the organisms. If the projection of the direction of movement of organism A relative to the direction of movement of organism B is small enough, namely

$$\tan(\theta)\cdot d < \theta_1,\quad \text{for } -\tfrac{\pi}{2} < \theta < \tfrac{\pi}{2},$$

then it moves towards B; if

$$\tan(\theta)\cdot d < \theta_2,\quad \text{for } \tfrac{\pi}{2} < \theta < \tfrac{3\pi}{2},$$

it moves away from it; otherwise, it is idle with respect to the other organism, where θ1 and θ2 are found by optimization. To classify aggressive and non-aggressive contacts, a hidden Markov model is used, for example see Rabiner, L. R. A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE 77, 257-286 (1989). This allows identifying post-contact behaviors in which organism A moves towards organism B, and organism B moves away from organism A. Optionally, events are labeled to learn statistical classifiers of aggressive and non-aggressive post-contact behavior. Optionally, each event is classified according to a range of parameters, including individual and relative speed, distance, and/or the like. Optionally, the following is used to classify the parameters (a non-limiting illustrative sketch of the contact detection and relative-motion classification is provided after the list):

(1) a quadratic discriminant classifier, for example see Duda, R. O., Hart, P. E. & Stork, D. G. Pattern Classification and Scene Analysis, 2nd ed. (1995),

(2) a k-nearest neighbor algorithm, and

(3) a decision-tree classifier that uses these parameters at each tree intersection, for example see Breiman, L., Friedman, J. H., Olshen, R. A. & Stone, C. J. Classification and regression trees. Chapman & Hall/CRC (1984).
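The sketch below illustrates the contact detection and the towards/away/idle classification described before the list, reading tan(θ)·d as the perpendicular miss distance of organism A's heading relative to organism B; this geometric reading, the threshold names, and the contact radius constant are assumptions for illustration only.

```python
import numpy as np

CONTACT_RADIUS_CM = 10.0   # contacts: distance between the two mice below ~10 cm

def relative_motion(pos_a, vel_a, pos_b, theta1, theta2):
    """Classify A's motion relative to B as 'towards', 'away' or 'idle', using the
    projection criterion described above (theta1 and theta2 are assumed to have
    been found by optimization on labeled events)."""
    to_b = pos_b - pos_a
    d = np.linalg.norm(to_b)
    # theta: angle between A's movement direction and the direction from A to B
    theta = np.arctan2(vel_a[1], vel_a[0]) - np.arctan2(to_b[1], to_b[0])
    theta = (theta + np.pi) % (2 * np.pi) - np.pi        # wrap to [-pi, pi)
    lateral = abs(np.tan(theta)) * d                     # perpendicular miss distance
    if -np.pi / 2 < theta < np.pi / 2 and lateral < theta1:
        return "towards"
    if (theta <= -np.pi / 2 or theta >= np.pi / 2) and lateral < theta2:
        return "away"
    return "idle"

def detect_contacts(traj_a, traj_b):
    """Indices of time bins in which the two trajectories are within contact range."""
    dist = np.linalg.norm(traj_a - traj_b, axis=1)
    return np.where(dist < CONTACT_RADIUS_CM)[0]
```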

According to some embodiments of the present invention, a social hierarchy is determined by estimating the relative aggressiveness of each pair, by comparing the number of times organism A chased organism B vs. the other way around. If this number is significantly higher for one than the other, then this organism is ranked higher (more dominant) than the other. Next, the hierarchy map is set as the lowest tree that preserves these ranks. Optionally, a variation of a topological sorting algorithm is used; see for example Cormen, T. H., Leiserson, C. E., Rivest, R. L. & Stein, C. Introduction To Algorithms. (MIT Press: 2001). This algorithm is extended in order to allow two or more organisms to get a common rank. Optionally, organisms are classified, for example, as alpha, beta, gamma, and delta, optionally on a daily basis. Optionally, a Clopper-Pearson test is performed in order to ensure that relative ranks are significant when constructing a hierarchy map, see Clopper, C. J. & Pearson, E. S. The use of confidence or fiducial limits illustrated in the case of the binomial. Biometrika 26, 404-413 (1934). The relative ranks may be used when one organism chases another significantly more times than it escaped from the other organism, within a 95% confidence interval. Optionally, the stability of the hierarchy is quantified by measuring the fraction of pairs that changed their relative ranking between consecutive days.
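As a non-limiting example, the ranking and leveling step may be sketched as follows, using the exact (Clopper-Pearson) lower confidence bound on the chase fraction of each pair and a simple longest-chain leveling in place of the full extended topological sort; thresholds and names are illustrative assumptions.

```python
import numpy as np
from scipy.stats import beta
from collections import defaultdict

def clopper_pearson_low(k, n, alpha=0.05):
    """Lower bound of the exact (Clopper-Pearson) confidence interval for a
    binomial proportion with k successes out of n trials."""
    return 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)

def dominance_levels(chases, organisms):
    """Assign hierarchy levels from pairwise chase counts.

    chases[(a, b)] = number of events in which a chased b.  Organism a is ranked
    above b only if the Clopper-Pearson interval of its chase fraction excludes 0.5.
    Levels are the longest dominance-chain length ending at each organism
    (a simple leveling sketch; ties share a level)."""
    dominates = defaultdict(set)
    for a in organisms:
        for b in organisms:
            if a == b:
                continue
            k = chases.get((a, b), 0)
            n = k + chases.get((b, a), 0)
            if n > 0 and clopper_pearson_low(k, n) > 0.5:
                dominates[a].add(b)

    level = {}
    def rank(x, seen=()):
        if x not in level:
            above = [a for a in organisms if x in dominates[a] and a not in seen]
            level[x] = 0 if not above else 1 + max(rank(a, seen + (x,)) for a in above)
        return level[x]
    for x in organisms:
        rank(x)
    return level   # 0 = top of the hierarchy (e.g. alpha), larger = lower rank
```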

Optionally, the social hierarchy of an organism is linked to its individual behavior by quantifying the entropy of its location. For example, FIG. 7C depicts a correlation between a hierarchy rank and environment exploration.

It is expected that during the life of a patent maturing from this application many relevant systems and methods will be developed, and the scope of the terms module, processor, and image sensor is intended to include all such new technologies a priori.

As used herein, the term “about” refers to ±10%. The terms “comprises”, “comprising”, “includes”, “including”, “having” and their conjugates mean “including but not limited to”. This term encompasses the terms “consisting of” and “consisting essentially of”.

The phrase “consisting essentially of” means that the composition or method may include additional ingredients and/or steps, but only if the additional ingredients and/or steps do not materially alter the basic and novel characteristics of the claimed composition or method.

As used herein, the singular form “a”, “an” and “the” include plural references unless the context clearly dictates otherwise. For example, the term “a compound” or “at least one compound” may include a plurality of compounds, including mixtures thereof.

The word “exemplary” is used herein to mean “serving as an example, instance or illustration”. Any embodiment described as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.

The word “optionally” is used herein to mean “is provided in some embodiments and not provided in other embodiments”. Any particular embodiment of the invention may include a plurality of “optional” features unless such features conflict.

Throughout this application, various embodiments of this invention may be presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.

Whenever a numerical range is indicated herein, it is meant to include any cited numeral (fractional or integral) within the indicated range. The phrases “ranging/ranges between” a first indicate number and a second indicate number and “ranging/ranges from” a first indicate number “to” a second indicate number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numerals therebetween.

It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.

Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.

All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting.

Claims

1. A method of monitoring social interactions among a plurality of model organisms, comprising:

marking each of a plurality of model organisms in a monitored space, divided into a plurality of spatial segments, with one of a plurality of unique visual markers;
capturing a sequence of images of said plurality of unique visual markers;
calculating at least one spatiotemporal model of at least one of said plurality of model organisms by automatically identifying in which of said plurality of segments each said unique visual marker is located during each of a plurality of temporal frames of a period;
identifying a plurality of interactions between said plurality of model organisms using said at least one spatiotemporal model; and
characterizing a social behaviour of at least some of said plurality of model organisms according to said plurality of interactions.

2. The method of claim 1, further comprising identifying at least one social related characteristic of at least one subgroup of said plurality of model organisms according to said at least one spatiotemporal model; wherein said at least one social related characteristic comprises a member of a group consisting of a group behaviour of said group, subgroup behaviour of a subgroup of said group, a social hierarchy among said plurality of model organisms, and/or an effect of external factors on said at least one subgroup.

3. (canceled)

4. The method of claim 1, wherein said plurality of unique visual markers comprises a plurality of unique color marks.

5-6. (canceled)

7. The method of claim 1, wherein said period is of at least 24 hours.

8-9. (canceled)

10. The method of claim 1, further comprising characterizing at least one interaction among at least some of said plurality of model organisms by an analysis of said at least one spatiotemporal model.

11. The method of claim 10, further comprising marking at least some of said plurality of segments as related to one of a plurality of distinct regions in said monitored space and characterizing said at least one interaction according to the time at least one of said plurality of model organisms spent in at least some of said plurality of distinct regions.

12. The method of claim 10, wherein said characterizing comprises using said at least one spatiotemporal model to identify a dependency in the location of one of said plurality of model organisms in relation to location of another of said plurality of model organisms in at least some of said plurality of temporal frames and performing said characterizing accordingly.

13. The method of claim 10, wherein said characterizing comprises using said at least one spatiotemporal model to identify a group behavioural pattern of at least a subgroup of organisms from said plurality of model organisms and performing said characterizing accordingly.

14. The method of claim 10, wherein said characterizing comprises using said at least one spatiotemporal model to identify a dependency between a change in at least one environmental characteristic of said monitored space and a group behavioural pattern of at least a subgroup of organisms from said plurality of model organisms and performing said characterizing accordingly.

15. The method of claim 10, wherein said characterizing is performed according to an entropy measure extracted from said at least one spatiotemporal model.

16. The method of claim 1, wherein said plurality of model organisms are rodents.

17. The method of claim 1, further comprising:

calculating a plurality of entropy measures each for interactions in another of said plurality of subgroups, and
wherein said characterizing a social behaviour of at least some of said plurality of model organisms is performed according to said plurality of entropy measures.

18. The method of claim 1, further comprising:

identifying a plurality of chase-escape interactions of each of a plurality of pairs of said plurality of model organisms using said at least one spatiotemporal model, and
calculating a hierarchical structure of at least some of said plurality of model organisms.

19. The method of claim 18, wherein said identifying further comprises hierarchically ranking each said model organism according to the prevalence of its role as a chaser or escaper in said plurality of chase-escape interactions.

20. The method of claim 1, further comprising:

calculating a plurality of trails of at least some of said plurality of model organisms using said at least one spatiotemporal model, and
characterizing at least one interaction among at least some of said plurality of model organisms by an analysis of said plurality of trails.

21. The method of claim 1, further comprising estimating an aggressiveness level of at least some of said plurality of model organisms according to an analysis of said at least one spatiotemporal model.

22. (canceled)

23. The method of claim 1, further comprising identifying at least one instinctive behaviour of at least one of said plurality of model organisms by an analysis of said at least one spatiotemporal model.

24. A system of monitoring social interactions among a plurality of model organisms, comprising:

an imaging system which captures a sequence of images of a plurality of unique visual markers, each marking another of a plurality of model organisms, in a monitored space divided into a plurality of spatial segments;
an image processing module which analyses said sequence of images to identify in which of said plurality of segments each said unique visual marker is located during each of a plurality of temporal frames of a period; and
a characterizing module which calculates at least one spatiotemporal model according to said identification and characterizes a social behaviour of at least some of said plurality of model organisms from at least one interaction among at least some of said plurality of model organisms by an analysis of said at least one spatiotemporal model.

25. A method of monitoring social interactions among a plurality of model organisms, comprising:

recording spatiotemporal data of a plurality of model organisms in a monitored space during a period;
analyzing said spatiotemporal data to identify a plurality of interactions between members of each of a plurality of subgroups of said plurality of model organisms;
calculating a plurality of entropy measures each for interactions in another of said plurality of subgroups; and
characterizing a social behaviour of at least some of said plurality of model organisms according to said plurality of entropy measures.

26. (canceled)

Patent History
Publication number: 20140207433
Type: Application
Filed: Aug 3, 2012
Publication Date: Jul 24, 2014
Applicant: Yeda Research and Development Co., Ltd. (Rehovot)
Inventors: Alon Chen (Rehovot), Elad Schneidman (Rehovot), Yair Shemesh (Rehovot), Yehezkel Sztainberg (Rehovot), Oren Forkosh (Rehovot)
Application Number: 14/236,674
Classifications
Current U.S. Class: Modeling By Mathematical Expression (703/2)
International Classification: G06F 17/50 (20060101);