Proceeding Paper

Reverse Weighted-Permutation Entropy: A Novel Complexity Metric Incorporating Distance and Amplitude Information †

Yuxing Li
School of Automation and Information Engineering, Xi’an University of Technology, Xi’an 710048, Shaanxi, China
Presented at the 5th International Electronic Conference on Entropy and Its Applications, 18–30 November 2019; Available online: https://ecea-5.sciforum.net/.
Proceedings 2020, 46(1), 1; https://doi.org/10.3390/ecea-5-06688
Published: 17 November 2019

Abstract

Permutation entropy (PE) is an effective complexity metric for time series, with the merits of simple and efficient calculation. To address the limitations of PE, weighted-permutation entropy (WPE) and reverse permutation entropy (RPE) were proposed to improve its performance. WPE introduces amplitude information to weight each arrangement pattern; it not only better reveals the complexity of time series with sudden changes in amplitude, but is also more robust to noise. RPE introduces distance information and is defined as the distance to white noise; it follows the reverse trend to traditional PE and is more stable for time series of different lengths. In this paper, we propose a novel complexity metric incorporating both distance and amplitude information, which we name reverse weighted-permutation entropy (RWPE); it combines the advantages of WPE and RPE. Three simulation experiments were conducted: mutation signal detection, noise-robustness testing based on complexity, and complexity testing of time series of various lengths. The simulation results show that RWPE can be used as a complexity metric that accurately detects abrupt amplitude changes in time series and is more robust to noise. Moreover, it is more stable than the other three kinds of PE for time series of various lengths.

1. Introduction

Permutation entropy (PE) was first brought forward by Bandt and Pompe in a seminal paper [1]. It introduced a simple and robust symbolic method that accounts for the arrangement patterns of a time series by comparing neighboring values. The theoretical advantages of PE promote its application to practical problems [2]. Some typical applications of PE can be found in the medical area [3], where it has been used to represent different states of human organs, including neurons, the brain and the heart. Further applications can be found in the fields of economics [4,5], mechanical engineering [6,7] and underwater acoustics [8,9]. The development of PE includes two aspects: one is the expansion of its applications in different fields; the other is the improvement of PE theory. Many researchers have proposed improved versions of PE that address its limitations.
In 2013, weighted-permutation entropy (WPE) was proposed and applied to electroencephalogram (EEG) data analysis by Fadlallah [10]. In 2017, Bandt proposed a new version of PE with a trend opposite to existing PEs, which we name reverse PE (RPE) in this paper [11]. RPE is defined as the distance to white noise and is more stable for time series of different lengths. WPE and RPE each have their own advantages in indicating the complexity of time series from different angles. In order to retain and enhance the advantages of both WPE and RPE, we propose a novel complexity metric incorporating distance and amplitude information, and name it reverse weighted-permutation entropy (RWPE). Three simulation experiments have been carried out to demonstrate the validity of RWPE through analysis and comparison with PE, WPE and RPE.

2. Reverse Weighted-Permutation Entropy

The specific steps of RWPE are as follows:
  • Step 1: Phase space reconstruction.
For a time series Y = { y(i), i = 1, 2, …, T } with T values, we reconstruct Y into L embedding vectors with time delay τ and embedding dimension m. The matrix consisting of all embedding vectors can be represented as follows:
$$
\begin{bmatrix}
y(1) & y(1+\tau) & \cdots & y(1+(m-1)\tau) \\
\vdots & \vdots & & \vdots \\
y(j) & y(j+\tau) & \cdots & y(j+(m-1)\tau) \\
\vdots & \vdots & & \vdots \\
y(L) & y(L+\tau) & \cdots & y(L+(m-1)\tau)
\end{bmatrix}
\tag{1}
$$
In Formula (1), each row of the matrix is an embedding vector, and the number of embedding vectors is L = T − (m − 1)τ.
  • Step 2: Ascending order.
We can rearrange each embedding vector in ascending order as follows:
$$
y(i+(j_1-1)\tau) \le y(i+(j_2-1)\tau) \le \cdots \le y(i+(j_m-1)\tau)
$$
When two values of the same embedding vector are equal as follows:
$$
y(i+(j_a-1)\tau) = y(i+(j_b-1)\tau)
$$
Their arrangement depends on their own temporal order as follows:
$$
y(i+(j_a-1)\tau) < y(i+(j_b-1)\tau) \quad (a < b)
$$
Hence, we can obtain original patterns of embedding vectors as follows:
$$
\pi_g = (j_1, j_2, \ldots, j_m), \quad g = 1, 2, \ldots, m!
$$
where π_g is one of the m! original patterns.
  • Step 3: Entropy calculation.
    (1) Probability calculation.
Since an original pattern corresponds to several possible patterns, we introduce amplitude information to calculate the probabilities of the original patterns on the basis of WPE. For example, Figure 1 shows an original pattern and its corresponding three possible patterns (m = 3).
We introduce amplitude information by weighting each embedding vector; the weight value of the embedding vector Y_j can be expressed as follows:
$$
\begin{cases}
\omega_g(\pi_s) = \dfrac{1}{m}\displaystyle\sum_{k=1}^{m}\bigl(y(j+(k-1)\tau)-\bar{Y}_j\bigr)^2 \\[2mm]
\bar{Y}_j = \dfrac{1}{m}\displaystyle\sum_{k=1}^{m} y(j+(k-1)\tau)
\end{cases}
$$
where Y_j corresponds to the s-th possible pattern of the g-th original pattern, Ȳ_j is the mean of Y_j, and ω_g(π_s) is the weight value of Y_j, obtained by calculating the variance of Y_j. The frequency of the g-th original pattern can be expressed as:
$$
f(\pi_g) = \sum_{s=1}^{S} f\bigl(\pi_g^{(s)}\bigr)\,\omega_g(\pi_s)
$$
where S is the number of possible patterns of the g-th original pattern, and f(π_g^(s)) is the frequency of the s-th possible pattern. Then, the probability of the g-th original pattern can be represented as:
$$
P(\pi_g) = \frac{f(\pi_g)}{\sum_{g=1}^{m!} f(\pi_g)}
$$
    (2) Calculation formula.
Following RPE, we define RWPE as the distance to white noise by introducing distance information. Therefore, RWPE can be expressed as:
$$
H_{RWPE}(m) = \sum_{g=1}^{m!}\left(P(\pi_g) - \frac{1}{m!}\right)^2 = \sum_{g=1}^{m!} P(\pi_g)^2 - \frac{1}{m!}
$$
When P(π_g) = 1/m! for all g, H_RWPE(m) attains its minimum value of 0.
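To make the three steps concrete, the following is a minimal NumPy sketch of the RWPE calculation. The function name rwpe, the use of a stable argsort to implement the temporal tie-breaking rule, and the dictionary-based accumulation of weighted pattern frequencies are illustrative choices rather than details prescribed by the paper.

```python
import numpy as np
from math import factorial
from itertools import permutations

def rwpe(y, m=3, tau=1):
    """Reverse weighted-permutation entropy of a 1-D time series (sketch)."""
    y = np.asarray(y, dtype=float)
    L = len(y) - (m - 1) * tau                      # number of embedding vectors
    # Step 1: phase-space reconstruction (L x m matrix of embedding vectors)
    X = np.array([y[i:i + (m - 1) * tau + 1:tau] for i in range(L)])

    # Step 2: ordinal patterns; a stable sort keeps the temporal order
    # when two values in the same embedding vector are equal
    patterns = np.argsort(X, axis=1, kind="stable")

    # Step 3 (1): weight each embedding vector by its variance and
    # accumulate the weighted frequency of each of the m! original patterns
    weights = np.var(X, axis=1)
    freq = {p: 0.0 for p in permutations(range(m))}
    for pat, w in zip(map(tuple, patterns), weights):
        freq[pat] += w
    total = sum(freq.values())
    if total == 0.0:                                # constant signal: all weights are zero
        return 0.0
    probs = np.array(list(freq.values())) / total

    # Step 3 (2): squared distance to the uniform (white-noise) distribution
    return float(np.sum(probs ** 2) - 1.0 / factorial(m))
```

For white Gaussian noise all patterns are roughly equally likely, so a call such as rwpe(np.random.randn(10000)) should return a value close to 0, whereas a more regular signal yields a larger value, consistent with RWPE being a distance to white noise.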

3. Simulations

3.1. Simulation 1

Mutation signal detection is carried out to verify the performance of RWPE. The synthetic signal is constructed as follows:
$$
\begin{cases}
y = 50 \ast (t == 0.498) + 0 \ast (t \ge 0 \ \& \ t \le 1) \\
s = \mathrm{randn}(t) \\
y_s = y + s
\end{cases}
$$
where the synthetic signal y_s consists of the impulse signal y and white Gaussian noise s, with a sampling frequency of 1 kHz. The time-domain waveform of the synthetic signal y_s is shown in Figure 2. We compute the values of PE, WPE, RPE and RWPE using sliding windows of 80 samples with 70 overlapping samples; the embedding dimension and time delay are 3 and 1, respectively. Figure 3 shows the four entropies of the synthetic signal y_s. As seen in Figure 3, the values of WPE and RWPE decrease and increase significantly, respectively, in the windows containing the impulse, while the values of PE and RPE remain steady in all windows. Table 1 lists the four entropies in windows 42 to 51. As seen in Figure 3 and Table 1, the impulse lies in windows 43 to 50; the entropies incorporating amplitude information differ markedly from those without it, and WPE changes in the opposite direction from RWPE. The simulation results show that WPE and RWPE perform better than PE and RPE in mutation signal detection.
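For reference, a possible reconstruction of this setup is sketched below, reusing the rwpe function from the sketch in Section 2. The exact impulse placement on the sample grid, the absence of a fixed noise seed and the sliding-window loop are assumptions; the paper's figures were produced with its own implementation.

```python
import numpy as np

fs = 1000                                     # sampling frequency: 1 kHz
t = np.arange(0, 1, 1 / fs)                   # 1 s of data
y = np.zeros_like(t)
y[np.argmin(np.abs(t - 0.498))] = 50.0        # impulse of amplitude 50 near t = 0.498 s
ys = y + np.random.randn(len(t))              # add white Gaussian noise

win, hop = 80, 10                             # 80-sample windows, 70 samples of overlap
rwpe_per_window = [rwpe(ys[s:s + win], m=3, tau=1)
                   for s in range(0, len(ys) - win + 1, hop)]
# Windows containing the impulse should stand out clearly in RWPE (and WPE),
# while PE and RPE stay roughly flat (cf. Figure 3 and Table 1).
```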

3.2. Simulation 2

Robustness testing to noise is carried out to verify the performance of RWPE. We use the same impulse signal y as in Simulation 1; synthetic signals y_s with different signal-to-noise ratios (SNRs) are obtained by adding white Gaussian noise to y. The embedding dimension and time delay are 3 and 1, respectively. Figure 4 shows the four entropies of the synthetic signal under different SNRs; each entropy value is the average of 1000 calculations. As seen in Figure 4, the values of PE and RPE are relatively stable under different SNRs, remaining close to 1 and 0, respectively. The entropies incorporating amplitude information respond to changes in SNR: the influence of noise on complexity decreases as the SNR increases, with WPE monotonically decreasing and RWPE monotonically increasing. The testing results indicate that RWPE and WPE are more robust to noise than PE and RPE.
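A sketch of the noise-robustness test follows, again using the impulse signal y and the rwpe function from the earlier sketches. The SNR definition (ratio of signal power to noise power in decibels), the SNR grid and the use of 100 rather than 1000 repetitions are assumptions made to keep the example short.

```python
import numpy as np

def add_noise_at_snr(x, snr_db, rng):
    """Add white Gaussian noise scaled so that the mixture has the given SNR in dB."""
    noise = rng.standard_normal(len(x))
    scale = np.sqrt(np.mean(x ** 2) / (np.mean(noise ** 2) * 10 ** (snr_db / 10)))
    return x + scale * noise

rng = np.random.default_rng(0)
snrs = np.arange(-10, 21, 5)                  # example SNR grid (assumed)
rwpe_vs_snr = [np.mean([rwpe(add_noise_at_snr(y, s, rng), m=3, tau=1)
                        for _ in range(100)])
               for s in snrs]
# RWPE should increase monotonically with SNR, mirroring the trend in Figure 4.
```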

3.3. Simulation 3

Complexity testing of time series with various lengths is carried out to demonstrate the stability of RWPE. We calculate the four entropies of cosine signals with a frequency of 100 Hz. The embedding dimension and time delay are set to 3 and 1, respectively. The initial data length is 2000 sampling points, and 100 sampling points are added each time until the data length reaches 12,000 sampling points. Figure 5 shows the four entropies of the cosine signals with a frequency of 100 Hz.
As seen in Figure 5, the four entropies of the cosine signals change to varying degrees as the data length increases; the variation range of WPE is one order of magnitude smaller than those of PE and RPE, and the variation range of RWPE is the smallest of the four. The complexity testing results indicate that RWPE is more stable than the other three entropies for data of different lengths.
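The length-stability test can be sketched as follows, once more relying on the rwpe function from Section 2. The sampling rate of 10 kHz is an assumption, since the paper specifies only the 100 Hz signal frequency and the range of data lengths.

```python
import numpy as np

fs = 10_000                                   # assumed sampling rate
lengths = range(2000, 12001, 100)             # 2000 to 12,000 samples in steps of 100
rwpe_vs_length = []
for n in lengths:
    t = np.arange(n) / fs
    x = np.cos(2 * np.pi * 100 * t)           # 100 Hz cosine signal
    rwpe_vs_length.append(rwpe(x, m=3, tau=1))
spread = max(rwpe_vs_length) - min(rwpe_vs_length)
# The spread of RWPE across the length sweep should be smaller than the
# corresponding spreads of PE, WPE and RPE (cf. Figure 5).
```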

4. Conclusions

This paper proposes a novel complexity metric incorporating distance and amplitude information, named RWPE. Three simulation experiments were carried out to validate the approach. First, like WPE, RWPE accurately detects abrupt amplitude changes in time series, with a trend opposite to that of WPE. Second, RWPE is more robust to noise than PE and RPE. Finally, RWPE is more stable than PE, WPE and RPE for time series of different lengths. As an effective complexity metric, RWPE could be applied to practical problems in various fields in future work.

Funding

The author gratefully acknowledges the support from the National Natural Science Foundation of China (No. 61603296).

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Bandt, C.; Pompe, B. Permutation entropy: A natural complexity measure for time series. Phys. Rev. Lett. 2001, 88, 174102.
  2. Zanin, M.; Zunino, L.; Rosso, O.A.; Papo, D. Permutation Entropy and Its Main Biomedical and Econophysics Applications: A Review. Entropy 2012, 14, 1553–1577.
  3. Nicolaou, N.; Georgiou, J. Detection of epileptic electroencephalogram based on Permutation Entropy and Support Vector Machines. Expert Syst. Appl. 2012, 39, 202–209.
  4. Zunino, L.; Zanin, M.; Tabak, B.; Perez, D.; Rosso, O. Forbidden patterns, permutation entropy and stock market inefficiency. Phys. A 2009, 388, 2854–2864.
  5. Hou, Y.; Liu, F.; Gao, J.; Cheng, C.; Song, C. Characterizing Complexity Changes in Chinese Stock Markets by Permutation Entropy. Entropy 2017, 19, 514.
  6. Zhang, X.; Liang, Y.; Zhou, J.; Zang, Y. A novel bearing fault diagnosis model integrated permutation entropy, ensemble empirical mode decomposition and optimized SVM. Measurement 2015, 69, 164–179.
  7. Yan, R.; Liu, Y.; Gao, R. Permutation entropy: A nonlinear statistical measure for status characterization of rotary machines. Mech. Syst. Signal Process. 2012, 29, 474–484.
  8. Li, Y.-X.; Li, Y.-A.; Chen, Z.; Chen, X. Feature Extraction of Ship-Radiated Noise Based on Permutation Entropy of the Intrinsic Mode Function with the Highest Energy. Entropy 2016, 18, 393.
  9. Li, Y.; Li, Y.; Chen, X.; Yu, J. A Novel Feature Extraction Method for Ship-Radiated Noise Based on Variational Mode Decomposition and Multi-Scale Permutation Entropy. Entropy 2017, 19, 342.
  10. Fadlallah, B.; Chen, B.; Keil, A.; Principe, J. Weighted-permutation entropy: A complexity measure for time series incorporating amplitude information. Phys. Rev. E 2013, 87, 022911.
  11. Bandt, C. A New Kind of Permutation Entropy Used to Classify Sleep Stages from Invisible EEG Microstructure. Entropy 2017, 19, 197.
Figure 1. An original pattern and its corresponding three possible patterns (m = 3).
Figure 2. The time-domain waveform of the synthetic signal y_s.
Figure 3. The four entropies of the synthetic signal y_s.
Figure 4. The four entropies of the synthetic signal under different SNRs.
Figure 5. The four entropies of cosine signals with a frequency of 100 Hz. (a) PE; (b) WPE; (c) RPE; (d) RWPE.
Table 1. The four entropies in windows 42 to 51.

Window   42      43      44      45      46      47      48      49      50      51
PE       0.994   0.988   0.992   0.979   0.982   0.978   0.987   0.990   0.995   0.995
WPE      0.992   0.651   0.645   0.648   0.651   0.649   0.652   0.649   0.650   0.993
RPE      0.007   0.014   0.010   0.026   0.021   0.025   0.015   0.011   0.006   0.005
RWPE     0.009   0.317   0.319   0.318   0.317   0.317   0.316   0.317   0.317   0.009