Increment entropy as a measure of complexity for time series

Xiaofeng Liu 1,2,*, Aimin Jiang 1,2, Ning Xu 1,2, Jianru Xue 3

1 College of IoT Engineering, Hohai University, Changzhou, 213022, China
2 Changzhou Key Laboratory of Robotics and Intelligent Technology, Changzhou, 213022, China
3 Institute of Artificial Intelligence and Robotics, Xi'an Jiaotong University, Xi'an, 710049, China

* Corresponding author

E-mail (corresponding author): [email protected]

Abstract

Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce increment entropy to measure the complexity of time series, in which each increment is mapped onto a word of two letters, one letter corresponding to the direction of the increment and the other to its magnitude. The Shannon entropy of these words is termed increment entropy (IncrEn). Simulations on synthetic data and tests on epileptic EEG signals demonstrate its ability to detect abrupt changes, whether energetic (e.g., spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series, so it is applicable to arbitrary real-world data.

Introduction

Nowadays, the notion of complexity has been ubiquitously used to examine a variety of time series, ranging from diverse physiological signals [1-9] to financial time series [10, 11] and ecological time series [12]. Yet there is no established universal definition of complexity to date [13]. In practice, various measures of complexity have been developed to characterize the behaviors of time series, e.g., regular, chaotic, and stochastic behaviors [8, 14-18]. Among these, approximate entropy (ApEn), proposed by Pincus [16], is one of the most commonly used methods. It is a measure of the regularity of chaotic and non-stationary time series [19-23]. It has been demonstrated that the more frequently similar epochs occur in a time series, the lower the resulting ApEn. As pointed out by Pincus [16], ApEn indicates the generation rate of new information. Richman and Moorman later proposed sample entropy (SampEn) [8], which is conceptually inherited from ApEn but computationally counteracts its shortcomings: SampEn excludes self-matches of vectors and demonstrates relative consistency [8, 24].

In recent years, permutation entropy (PE) has been widely applied to quantify the complexity of a variety of experimental time series [25, 26], in particular physiological signals [27-32]. PE is robust to observational or dynamical noise, conceptually simple, and computationally extremely fast. These advantages make it suitable for analysing data sets of huge size without any preprocessing or fine-tuning of parameters. An intriguing feature of PE is that a time series is naturally mapped into a symbol sequence by comparing neighboring values. However, this ordinal pattern ignores the relative sizes of neighboring values. Recently, Liu and Wang added a factor related to the magnitude of consecutive values of a time series into the mapping pattern vector, so that epochs with an identical ordinal pattern can be discriminated [33]. This fine-grained partition makes the modified version of PE more sensitive to abrupt changes in amplitude, such as bursts and spikes [34, 35]. Another modified version of PE, weighted-permutation entropy, was introduced by Fadlallah and colleagues [36]. Similarly, they incorporated amplitude information into the pattern extracted from a given time series. As a result, their

new scheme performs better at tracking abrupt changes in a signal. In addition, PE considers only inequalities between values and neglects equal values; in practice, small random perturbations are added to break the equalities. For signals containing many equal values, e.g. heart rate variability (HRV) sequences, it is therefore inaccurate to characterize complexity by means of PE. Bian and colleagues [37] mapped all the equal values in an epoch onto the same symbol as the equal value with the smallest index in that epoch. Results on HRV indicate that this scheme performs better at distinguishing HRV signals.

In this paper, we introduce a new approach to measure the complexity of time series, termed increment entropy (IncrEn). We focus on the increments of a signal, as they reveal the characteristics of the dynamic changes hidden in it. The basic idea is that each increment is mapped onto a word of two letters, coding its sign and its size, and the symbolic entropy of these words is then calculated [38-42]. This scheme gives IncrEn the ability to detect both energetic and structural changes. Moreover, it is conceptually simple. We finally provide demonstrations of its various applications; tests on synthetic and real-world data indicate its effectiveness.

The remainder of this paper is organized as follows. Section 1 introduces the notion of increment entropy; Section 2 presents its performance on synthetic data and on real-world data such as seizure detection, including comparisons with PE and ApEn (or SampEn); Section 3 contains the discussion; Section 4 concludes the paper.

1 Increment Entropy

Consider a finite time series {x(i)}, 1 ≤ i ≤ N, where N is the length of the time series, and construct the increment time series {v(i)} from {x(i)}, where v(i) = x(i+1) − x(i), 1 ≤ i ≤ N − 1. Let m be the length of the vectors to be compared; we can then construct N − m vectors of dimension m from the increment time series, V(i) = [v(i), v(i+1), ..., v(i+m−1)]. Each element of a vector is mapped onto a word of two letters: one letter denotes the sign of the increment (positive, negative, or level), and the other denotes the size of the increment relative to the other increments in the same vector. As a result, each vector is mapped onto a word

w = s_1 q_1 s_2 q_2 \cdots s_m q_m,

where s_k = sgn(v(k)) and q_k is the quantified value of v(k) relative to the other increments in the same vector. Let Q(w_n) denote the total number of occurrences of the n-th unique word in the time series; its relative frequency is then

p(w_n) = Q(w_n) / (N - m).    (1)

The increment entropy of order m ≥ 2 is defined as

H(m) = -\sum_{n=1}^{(2R+1)^m} p(w_n) \log p(w_n),    (2)

where R is the quantifying resolution. Throughout, log denotes the natural logarithm (consistent with the numerical examples below), so H is given in nats. In general, H(m) is normalized by m − 1.

Obviously, H(m) is bounded in [0, log (2R+1)^m]. For example, for a series of ten values

(3, 3, 2, -8, -5, 4, 20, 10, 11, 8),

we can generate the increment sequence

(0, -1, -10, 3, 9, 16, -10, 1, -3)

from the given series by the simple formula x(i) − x(i−1), i ≥ 2. Taking m = 2 and R = 4, we obtain eight words, each composed of s1 q1 s2 q2, as follows.

Table I. Symbolization process

    Vector           Word (s1 q1 s2 q2)
    (3, 3, 2)        0, 0, -1, 4
    (3, 2, -8)       -1, 0, -1, 4
    (2, -8, -5)      -1, 4, 1, 1
    (-8, -5, 4)      1, 2, 1, 4
    (-5, 4, 20)      1, 2, 1, 4
    (4, 20, 10)      1, 4, -1, 3
    (20, 10, 11)     -1, 4, 1, 0
    (10, 11, 8)      1, 2, -1, 4

In this illustration, the size of each increment is quantified in terms of the standard deviation of the increment epoch and a coefficient. The word 1, 2, 1, 4 occurs twice, so its relative frequency is 2/8; each of the remaining six unique words has frequency 1/8. As a result,

H(2) = -(2/8) \log(2/8) - 6 \times (1/8) \log(1/8) = 1.9062    (3)

h(2) = H(2) / (2 - 1) = 1.9062    (4)
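For concreteness, the following Python sketch implements the procedure above. It is a sketch under stated assumptions, not the authors' reference code: the text specifies the size letter only as a quantification "in terms of the standard deviation of the increment epoch and a coefficient", so the exact quantizer below (floor, per-vector sample standard deviation, values capped at R) is our assumption and may not reproduce Table I word for word.

```python
import math
from collections import Counter

import numpy as np


def incr_en(x, m=2, R=4):
    """Increment entropy H(m) in nats, with quantifying resolution R."""
    v = np.diff(np.asarray(x, dtype=float))        # increments v(i) = x(i+1) - x(i)
    words = []
    for i in range(len(v) - m + 1):                # N - m vectors of dimension m
        V = v[i:i + m]
        sigma = np.std(V, ddof=1)                  # std of the increment epoch (assumed: sample std)
        word = []
        for vk in V:
            s = int(np.sign(vk))                   # sign letter: -1, 0, or +1
            q = 0 if sigma == 0 else min(R, math.floor(abs(vk) * R / sigma))
            word.append((s, q))                    # size letter, capped at R (assumed)
        words.append(tuple(word))
    n = len(words)
    counts = Counter(words)                        # Q(w) for each unique word
    return -sum(c / n * math.log(c / n) for c in counts.values())


# Worked example from the text; with the paper's quantizer, the eight words give
# H(2) = -(2/8) ln(2/8) - 6 * (1/8) ln(1/8) = 1.9062 nats and h(2) = H(2)/(2 - 1).
# Our assumed quantizer may split the words slightly differently.
x = (3, 3, 2, -8, -5, 4, 20, 10, 11, 8)
h2 = incr_en(x, m=2, R=4) / (2 - 1)
```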

2 Simulation and results

We conducted a variety of simulations on both chaotic and regular signals to examine IncrEn. We compared IncrEn with PE and SampEn on these synthetic data in various respects, such as reliability in detecting abrupt changes, the effect of data length, and stability.

2.1 Results from chaotic time series

First, we demonstrate that IncrEn can effectively and accurately indicate the complexity of a dynamical system using the well-known logistic map x_n = r · x_{n-1} · (1 - x_{n-1}), already examined in previous studies [25, 33]. We generated 20,000 values of the logistic map with 3.5 ≤ r ≤ 4 and initial value x_0 = 0.303. The bifurcation diagram of the logistic map is shown in Fig.1a. We can clearly see from Fig.1b-d that IncrEn is capable of distinguishing the degree of chaos, similar to the positive Lyapunov exponent. This is completely consistent with the results of previous studies [25, 33]. We can also see from Fig.1b-d that the functions h_4, h_6 and h_8 are very similar, which justifies using IncrEn of lower order in place of IncrEn of higher order when applying it to real-world data [25].

We further examined the effect of Gaussian observational noise on IncrEn. Comparing Fig.1d with Fig.1c, IncrEn with Gaussian noise is nearly identical to that without noise. As shown in Fig.1d, IncrEn (h_8) grows only slightly compared to the noise-free case, even in the low-period windows. This is very distinct from PE, which is relatively larger in the low-period windows, e.g. the period-4 and period-3 windows (see ref [25], Fig.2e).

To address the effect of the data length N, we calculated the mean and variance of IncrEn over 1000 time series produced by the logistic map with r = 4 for various data lengths. In Fig.2a, h_m increases at first, reaches its maximum at m = 4, and then decreases. The variance of h_m is rather small, as shown in Fig.2b. In general, a data size small compared to (2R+1)^m will bias IncrEn. In practice, we recommend the order m = 2, 3, 4, 5 and the quantifying resolution R = 4.
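A short usage sketch of the logistic-map experiment, reusing the incr_en function sketched in Section 1; discarding an initial transient is our addition and is not stated in the paper.

```python
import numpy as np


def logistic_series(r, n=20000, x0=0.303, discard=1000):
    """Iterate x_n = r * x_{n-1} * (1 - x_{n-1}) and return n values."""
    x = np.empty(n + discard)
    x[0] = x0
    for i in range(1, n + discard):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x[discard:]                             # drop the transient (our choice)


for r in (3.5, 3.6, 3.7, 3.8, 3.9, 4.0):
    h4 = incr_en(logistic_series(r), m=4, R=4) / (4 - 1)   # normalized IncrEn h_4
    print(f"r = {r:.1f}  h_4 = {h4:.3f}")
```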

Figure 1. Logistic map for varying control parameter r and the corresponding IncrEn at varying scales. a) Bifurcation diagram. b) Increment entropy h_6. c) h_8. d) h_8 with added Gaussian observational noise.

Figure 2. Choice of order m and effect of data length. a) Mean h_m of the logistic map (r = 4) with 10^k data points (k = 2, 3, 4, 5, 6); horizontal axis: embedding dimension m. b) Corresponding standard deviation σ of h_m.

2.2 Comparison with PE and SampEn

The notion of increment entropy is motivated by the concepts of permutation entropy [25] and approximate entropy [16], so we examined the relationship between IncrEn and PE and SampEn to illustrate its features and applications. The number of possible patterns produced in IncrEn, (2R+1)^m, is considerably larger than that produced in PE, m!, for small order m (it also depends on the quantifying resolution). Thus IncrEn is much larger than PE for random or chaotic signals, while it is a little lower than PE for regular signals, because fewer patterns are generated by the computation based on increments of the time series. For regular signals, the distances between vectors are commonly smaller than the specified threshold, which leads to a higher conditional probability. As a result, SampEn (or ApEn) is lower than IncrEn for regular time series.
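For instance, with m = 3 and R = 4, IncrEn can in principle distinguish (2R+1)^m = 9^3 = 729 distinct words, whereas PE distinguishes only m! = 6 ordinal patterns.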

Figure 3. Symbolization of analogous patterns in IncrEn and PE.

2.2.1 Distinguishing analogous patterns in IncrEn and PE

Fig.3 shows seven patterns that could be attributed to an identical motif. These seven vectors have the same ordinal structure, so all of them are transformed into the same word in PE. In contrast, in IncrEn they are mapped onto seven different words because of the distinct increment pattern in each vector, although they share an identical ordinal pattern. This endows IncrEn with greater sensitivity to changes hidden in a time series. However, some of them could be classified into the same category in terms of the distance between vectors. For example, the vector presented in Fig.3b is symmetrical to that in Fig.3g with respect to the vector in Fig.3a, so their distances to the vector in Fig.3a are equal. For an appropriate threshold r, these three vectors are deemed identical in SampEn and ApEn even though they have very distinct appearances.
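The following snippet illustrates this point with three hypothetical vectors of our own choosing, using the same assumed quantizer as the sketch in Section 1: all three are strictly increasing, i.e. they share one ordinal (PE) pattern, yet they map onto three different IncrEn words.

```python
import math

import numpy as np


def incr_word(V, R=4):
    """Map one vector of increments to its IncrEn word (same assumed quantizer)."""
    sigma = np.std(V, ddof=1)
    return tuple((int(np.sign(v)),
                  0 if sigma == 0 else min(R, math.floor(abs(v) * R / sigma)))
                 for v in V)


# Same ordinal pattern (monotonically increasing), different increment words.
for x in ([1, 2, 3], [1, 2, 10], [1, 9, 10]):
    print(x, "->", incr_word(np.diff(np.asarray(x, dtype=float))))
```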

Figure 4. Detection of energetic and structural change. (a) Regular time series consisting of 300 identical slices [0.07, 0.89, 0.15, 0.97]. (b) Time series (a) interspersed with 3 slices [0.37, 0.59, 0.45, 0.67]. (c) Time series (a) interspersed with 3 slices [0.37, 0.59, 0.45, 0.67] plus a 0.5 offset. (d) Time series (a) interspersed with 3 slices [0.97, 0.15, 0.89, 0.07].
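The four test signals can be reconstructed from the caption as in the sketch below. The slice values come from the caption; the positions of the three distorted slices, and the choice to overwrite existing slices rather than insert new ones, are our assumptions, since the text does not specify them.

```python
import numpy as np

BASE = [0.07, 0.89, 0.15, 0.97]          # the regular slice from the caption


def with_slices(slice_values, positions=(75, 150, 225)):
    """300 repeats of BASE, with 3 slices overwritten at the given positions."""
    y = np.tile(BASE, 300).reshape(300, 4)
    for p in positions:
        y[p] = slice_values
    return y.ravel()


regular = np.tile(BASE, 300)                                       # Fig.4a
att     = with_slices([0.37, 0.59, 0.45, 0.67])                    # Fig.4b: attenuated swing
amp     = with_slices(np.array([0.37, 0.59, 0.45, 0.67]) + 0.5)    # Fig.4c: +0.5 offset
struct  = with_slices([0.97, 0.15, 0.89, 0.07])                    # Fig.4d: reversed pattern
```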

2.2.2 Detecting energetic change and structural change

Synthetic data with impulsive spikes have commonly been used in previous studies to illustrate the failure of PE to detect abrupt changes in amplitude (see [33] Fig.1, [36] Fig.2a). A spike or burst in a signal is one kind of energetic change. Here, we examined the performance of increment entropy on an entirely regular time series and on regular data interspersed either with a few amplitude-amplified epochs (AMP.) or with a few amplitude-attenuated epochs (ATT.), as shown in Fig.4a-c. To examine the performance of IncrEn in detecting structural change, we also synthesized a regular signal interspersed with a few epochs of reversed ordinal pattern (Struct.) (Fig.4d).

Table 2. Entropy of regular signals with and without distorted slices.

    Entropy            Regular    ATT.      AMP.      Struct.
    IncrEn    m = 2    0.6931     0.7328    0.7345    0.7345
              m = 3    0.3466     0.3694    0.3698    0.3698
              m = 4    0.2310     0.2477    0.2479    0.2479
    PE        m = 2    0.6930     0.6930    0.6930    0.6930
              m = 3    0.6932     0.6932    0.6932    0.6932
              m = 4    0.4622     0.4661    0.4650    0.4650
    SampEn             0.0000     0.0000    0.0000    0.0008

We calculated the IncrEn, PE, and SampEn of each signal presented in Fig.4; the results are listed in Table 2. As mentioned in the previous section, IncrEn is overall smaller than PE for the orders m = 2, 3, 4. It is clear that IncrEn discriminates all three signals with distorted epochs from the regular one at all three orders (m = 2, 3, 4). In contrast, PE cannot distinguish the distorted signals from the regular one, because the PE of the regular signal equals that of the distorted signals at the lower orders m = 2, 3, except for a small difference at order 4. For all four signals, sample entropy is zero or approximately zero, which suggests that sample entropy fails to detect these changes, whether energetic or structural. Compared to PE and SampEn, IncrEn is thus sensitive to a small fraction of subtle changes, although the differences between the distorted and regular conditions are small. The case for the higher order m = 5 is similar to that for order m = 4, for both IncrEn and PE. This suggests that multi-scale entropy can reveal different aspects of data relative to single-scale entropy. However, we have to point out that the results may be biased for data of short length when using multi-scale entropy.

2.2.3 Stability of IncrEn, PE and SampEn

As presented in the previous section, the variability of IncrEn is rather small, comparable to that of PE [25]. We further computed the variability of IncrEn, PE, and SampEn on a random time series (Gaussian noise). As shown in Fig.5, IncrEn is universally much higher than PE for a random time series. The track of IncrEn is approximately as flat as that of PE (variance ≈ 10^-3 and 10^-6 for data lengths W = 500 and 1000). In contrast, the track of SampEn is full of fluctuations (variance = 0.0337 and 0.0092 for W = 500 and 1000), even though the data length is much longer than the 100 samples required for the calculation of SampEn [8]. This indicates that IncrEn performs as well as PE, and both of them are much better than SampEn in terms of variability.

Figure 5. Time-dependent IncrEn, PE and SampEn of random noise (Data length = 20,000, Window size = 200).
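A sketch of this stability test, assuming the incr_en function from Section 1. The window length is one of the values quoted in the text (Fig.5 itself quotes 200), and the use of non-overlapping windows is our choice.

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.standard_normal(20000)                 # the random series of Fig.5
W = 500                                            # one window length quoted in the text
vals = [incr_en(noise[i:i + W], m=2, R=4)          # assumes incr_en from Section 1
        for i in range(0, len(noise) - W + 1, W)]  # non-overlapping windows (our choice)
print("IncrEn variance across windows:", np.var(vals))
```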

2.3 Application to seizure detection from EEG signals

Figure 6. Detecting a short seizure in epileptic EEG signals. (a) A segment of the epileptic EEG of a female patient aged 7 years. It contains a short seizure and is recorded from the Fz-Cz electrode at a sampling rate of 256 Hz. (b) IncrEn. (c) PE. (d) SampEn. All are computed using a sliding window of 500 samples with a 499-sample overlap.

Entropy has been an effective approach for detecting seizures in EEG signals [27, 33, 36]. We calculated the IncrEn of epileptic EEGs containing a seizure and compared it with PE and SampEn. The data are continuous scalp EEG sampled at 256 Hz, available from the CHB-MIT database [43, 44]. The earliest EEG changes associated with each seizure were annotated by a clinical expert [43, 44]. We chose two different data sets to evaluate IncrEn for detecting dynamical changes in real time series: one contains a long seizure of 120 seconds and the other a short seizure of 10 seconds.

The results for detecting the short seizure are presented in Fig.6. The seizure begins at 1120 s and ends at 1129 s (see Fig.6a). It is clear that IncrEn drops when the seizure occurs and remains much lower during the seizure than before and after it (see Fig.6b). IncrEn reaches its minimum at around 1123 s. This is consistent with previous studies in which entropy decreases during the seizure stage and increases when the seizure ends [33, 45]. In contrast, PE becomes almost level for a long time after 1123 s, irrespective of the seizure (see Fig.6c). SampEn falls sharply at the initial phase of the seizure and rises at around 1123 s. In comparison to PE and SampEn, the decrease in IncrEn captures the whole course of the seizure rather than only its initial part, and the drop corresponding to the seizure is more evident.
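A sketch of the sliding-window computation behind Fig.6, again assuming the incr_en function from Section 1. The array eeg is a random placeholder for one Fz-Cz channel; loading the actual CHB-MIT recordings is outside this sketch, and the order m = 2 is our choice.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 256                                           # CHB-MIT sampling rate
eeg = rng.standard_normal(120 * fs)                # placeholder for one Fz-Cz channel
win, step = 500, 1                                 # 500-sample window, 499-sample overlap
times, h = [], []
for i in range(0, len(eeg) - win + 1, step):
    times.append((i + win) / fs)                   # time of window end, in seconds
    h.append(incr_en(eeg[i:i + win], m=2, R=4) / (2 - 1))
```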

Figure 7. Detecting a long seizure in epileptic EEG signals. (a) A segment of the epileptic EEG of a male patient aged 16 years. It contains a long seizure and is recorded from the Fz-Cz electrode at a sampling rate of 256 Hz. (b) IncrEn. (c) PE. (d) SampEn. All are computed using a sliding window of 500 samples with a 499-sample overlap.

We also calculated the IncrEn of a long seizure. Unlike the short seizure above, there are a few transient pauses during a long seizure, as shown in Fig.7a. Similar to the short-seizure case, IncrEn drops markedly at the initial phase of the seizure and rises at its end. Further, it is evident from Fig.7b that IncrEn is sensitive to the transient pauses within the seizure, which means that IncrEn can accurately track the course of the seizure. So can PE, except that its drop at the initial phase is much smaller than that of IncrEn. Although it is far lower during the seizure than after it, SampEn fails to indicate the transient pauses within the seizure. Together with the results for the short seizure, this shows that IncrEn can not only accurately detect seizures in EEG signals, but also sensitively capture the various changes within a seizure.

3 Discussion

To date, there is no universal definition of the complexity of time series. Most measures of complexity increase monotonically with randomness, giving a lower value for regular sequences and the maximum for random sequences. Other definitions assign a low value to both completely regular and purely random sequences, but give the maximum value to the intermediate stage [18, 46-48]. However, from the point of view of predictability, neither a chaotic sequence nor a random sequence is predictable; both are highly complex [13]. Our approach is an intuitive notion of complexity in that IncrEn grows with increasing disorder, maximal at the random stage and minimal at the regular stage. In this respect, it is the same as PE and SampEn. IncrEn is lower for a regular sequence, higher for a random one, and intermediate for a chaotic one, and is thus a measure of the degree of predictability of a sequence.

Our simulations have demonstrated that IncrEn is sensitive not only to subtle alterations of amplitude with identical structure, but also to tiny modifications of structure or pattern, as shown in Fig.3 and Fig.4. This allows it to accurately capture the subtle changes hidden in sequences, whether energetic or structural. As a result, IncrEn can be used to extract various features hidden in time series and to detect dynamical changes in time series relating to internal or external factors. In recent years, entropy has been an effective method for characterizing and distinguishing physiological signals across diverse physiological and pathological conditions, such as HRV abnormality [20, 21], neural disorders [22, 49, 50], and gait analysis [23]. As shown in Fig.6 and Fig.7, IncrEn exhibits better performance in seizure detection from real epileptic EEG signals than PE and SampEn. A prominent merit of IncrEn is that its computation makes no assumptions about the data, which makes it applicable to diverse signals. In particular, it is an appropriate method for examining signals containing many equal values, e.g. HRV: equal values can be coded in the same way as unequal ones, unlike in PE, which does not consider equal values [25, 37].

IncrEn is robust to observational noise. As shown in Fig.1d, noise does not disturb the appearance of IncrEn, even in the low-period windows where PE is higher and inverts the noise-free pattern (see Fig.2e in ref [25]). The noise only causes a slight increase in IncrEn, and overall IncrEn with noise is consistent with that without noise (compare Fig.1c and Fig.1d). This suggests that IncrEn is suitable for analyzing noisy data. Data length has a great impact on entropy computation [8, 16, 24, 25, 51]. Even when the data length is only 100, the variance of IncrEn across different data lengths is small, as shown in Fig.2. For comparison, we calculated the IncrEn, PE and SampEn of a Gaussian noise sequence: both IncrEn and PE have quite low variance, whereas the variance of SampEn is much higher, as shown in Fig.5, although SampEn is largely independent of record length [8, 24]. This indicates that IncrEn is a powerful tool for analyzing very short signals.

4 Conclusion

In this paper, we have presented a novel measure to quantify the complexity of time series and to characterize and distinguish time series across regular, chaotic and random states. The results on a series of simulations and on real data illustrate the power and potential applications of IncrEn for revealing the characteristics of a variety of signals and for discriminating sequences across various conditions. Our approach has a wider dynamic range than the established PE and is more reliable than SampEn or ApEn. It is a promising method for characterizing time series in diverse fields.

References

1. Hornero R, Abasolo D, Jimeno N, Sanchez CI, Poza J, et al. (2006) Variability, regularity, and complexity of time series generated by schizophrenic patients and control subjects. IEEE Transactions on Biomedical Engineering 53: 210-218.
2. Gonzalez Andino SL, Grave de Peralta Menendez R, Thut G, Spinelli L, Blanke O, et al. (2000) Measuring the complexity of time series: An application to neurophysiological signals. Human Brain Mapping 11: 46-57.
3. Friston KJ, Tononi G, Sporns O, Edelman GM (1995) Characterising the complexity of neuronal interactions. Human Brain Mapping 3: 302-314.
4. Zamora-López G, Russo E, Gleiser PM, Zhou C, Kurths J (2011) Characterizing the complexity of brain and mind networks. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 369: 3730-3747.
5. Olbrich E, Achermann P, Wennekers T (2011) The sleeping brain as a complex system. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 369: 3697-3707.
6. Costa M, Goldberger AL, Peng CK (2002) Multiscale entropy analysis of complex physiologic time series. Physical Review Letters 89: 068102.
7. Pincus SM, Goldberger AL (1994) Physiological time-series analysis: what does regularity quantify? American Journal of Physiology - Heart and Circulatory Physiology 266: H1643-H1656.
8. Richman JS, Moorman JR (2000) Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology - Heart and Circulatory Physiology 278: H2039-H2049.
9. Tononi G, Sporns O, Edelman GM (1994) A measure for brain complexity: relating functional segregation and integration in the nervous system. Proceedings of the National Academy of Sciences 91: 5033-5037.
10. Barnett WA, Medio A, Serletis A (1997) Nonlinear and complex dynamics in economics. Econometrics 9709001, EconWPA. URL http://ideas.repec.org/p/wpa/wuwpem/9709001.html.
11. Arthur WB (1999) Complexity and the economy. Science 284: 107-109.
12. Turchin P, Taylor AD (1992) Complex dynamics in ecological time series. Ecology 73: 289-305.
13. Boffetta G, Cencini M, Falcioni M, Vulpiani A (2002) Predictability: a way to characterize complexity. Physics Reports 356: 367-474.
14. Lempel A, Ziv J (1976) On the complexity of finite sequences. IEEE Transactions on Information Theory 22: 75-81.
15. Kolmogorov A (1968) Logical basis for information theory and probability theory. IEEE Transactions on Information Theory 14: 662-664.
16. Pincus SM (1991) Approximate entropy as a measure of system complexity. Proceedings of the National Academy of Sciences 88: 2297-2301.
17. Crutchfield JP, Young K (1989) Inferring statistical complexity. Physical Review Letters 63: 105-108.
18. Shiner JS, Davison M, Landsberg PT (1999) Simple measure for complexity. Physical Review E 59: 1459-1464.
19. Pincus SM, Goldberger AL (1994) Physiological time-series analysis: what does regularity quantify? American Journal of Physiology - Heart and Circulatory Physiology 266: H1643-H1656.
20. Porta A, Guzzetti S, Furlan R, Gnecchi-Ruscone T, Montano N, et al. (2007) Complexity and nonlinearity in short-term heart period variability: comparison of methods based on local nonlinear prediction. IEEE Transactions on Biomedical Engineering 54: 94-106.
21. Lake DE, Richman JS, Griffin MP, Moorman JR (2002) Sample entropy analysis of neonatal heart rate variability. American Journal of Physiology - Regulatory, Integrative and Comparative Physiology 283: R789-R797.
22. Takahashi T, Cho RY, Mizuno T, Kikuchi M, Murata T, et al. (2010) Antipsychotics reverse abnormal EEG complexity in drug-naive schizophrenia: a multiscale entropy analysis. NeuroImage 51: 173-182.
23. Buzzi UH, Ulrich BD (2004) Dynamic stability of gait cycles as a function of speed and system constraints. Motor Control 8: 241.
24. Yentes J, Hunt N, Schmid K, Kaipust J, McGrath D, et al. (2013) The appropriate use of approximate entropy and sample entropy with short data sets. Annals of Biomedical Engineering 41: 349-365.
25. Bandt C, Pompe B (2002) Permutation entropy: A natural complexity measure for time series. Physical Review Letters 88: 174102.
26. Bandt C, Keller G, Pompe B (2002) Entropy of interval maps via permutations. Nonlinearity 15: 1595.
27. Cao Y, Tung WW, Gao J, Protopopescu V, Hively L (2004) Detecting dynamical changes in time series using the permutation entropy. Physical Review E 70: 046217.
28. Xu X, Zhang J, Small M (2008) Superfamily phenomena and motifs of networks induced from time series. Proceedings of the National Academy of Sciences 105: 19601-19605.
29. Staniek M, Lehnertz K (2008) Symbolic transfer entropy. Physical Review Letters 100: 158101.
30. Olofsen E, Sleigh J, Dahan A (2008) Permutation entropy of the electroencephalogram: a measure of anaesthetic drug effect. British Journal of Anaesthesia 101: 810-821.
31. Zunino L, Zanin M, Tabak BM, Pérez DG, Rosso OA (2009) Forbidden patterns, permutation entropy and stock market inefficiency. Physica A: Statistical Mechanics and its Applications 388: 2854-2864.
32. Hornero R, Abásolo D, Escudero J, Gómez C (2009) Nonlinear analysis of electroencephalogram and magnetoencephalogram recordings in patients with Alzheimer's disease. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 367: 317-336.
33. Liu XF, Wang Y (2009) Fine-grained permutation entropy as a measure of natural complexity for time series. Chinese Physics B 18: 2690.
34. Ruiz MdC, Guillamón A, Gabaldón A (2012) A new approach to measure volatility in energy markets. Entropy 14: 74-91.
35. Zanin M, Zunino L, Rosso OA, Papo D (2012) Permutation entropy and its main biomedical and econophysics applications: a review. Entropy 14: 1553-1577.
36. Fadlallah B, Chen B, Keil A, Príncipe J (2013) Weighted-permutation entropy: A complexity measure for time series incorporating amplitude information. Physical Review E 87: 022911.
37. Bian C, Qin C, Ma QD, Shen Q (2012) Modified permutation-entropy analysis of heartbeat dynamics. Physical Review E 85: 021906.
38. Liu XF, Yu WL (2008) A symbolic dynamics approach to the complexity analysis of event-related potentials. Acta Physica Sinica 57: 2587-2594.
39. Daw CS, Finney CEA, Tracy ER (2003) A review of symbolic analysis of experimental data. Review of Scientific Instruments 74: 915-930.
40. Morse M, Hedlund GA (1938) Symbolic dynamics. American Journal of Mathematics 60: 815-866.
41. Morse M, Hedlund GA (1940) Symbolic dynamics II. Sturmian trajectories. American Journal of Mathematics 62: 1-42.
42. beim Graben P, Saddy JD, Schlesewsky M, Kurths J (2000) Symbolic dynamics of event-related brain potentials. Physical Review E 62: 5518-5541.
43. Shoeb A, Edwards H, Connolly J, Bourgeois B, Ted Treves S, et al. (2004) Patient-specific seizure onset detection. Epilepsy & Behavior 5: 483-498.
44. Shoeb AH, Guttag JV (2010) Application of machine learning to epileptic seizure detection. In: Proceedings of the 27th International Conference on Machine Learning (ICML-10). pp. 975-982.
45. Diambra L, de Figueiredo JCB, Malta CP (1999) Epileptic activity recognition in EEG recording. Physica A: Statistical Mechanics and its Applications 273: 495-505.
46. Ke DG (2013) Unifying complexity and information. Scientific Reports 3.
47. Grassberger P (1986) Toward a quantitative theory of self-generated complexity. International Journal of Theoretical Physics 25: 907-938.
48. Crutchfield JP (2012) Between order and chaos. Nature Physics 8: 17-24.
49. Srinivasan V, Eswaran C, Sriraam N (2007) Approximate entropy-based epileptic EEG detection using artificial neural networks. IEEE Transactions on Information Technology in Biomedicine 11: 288-295.
50. Kannathal N, Choo ML, Acharya UR, Sadasivan P (2005) Entropies for detection of epilepsy in EEG. Computer Methods and Programs in Biomedicine 80: 187-194.
51. Lesne A, Blanc JL, Pezard L (2009) Entropy estimation of very short symbolic sequences. Physical Review E 79: 046208.