
TSAX is Trending

Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12743)

Abstract

Time series mining is an important branch of data mining, as time series data are ubiquitous and arise in many application domains. Classification is the main task in time series mining, and time series representation methods play an important role in classification and other mining tasks. One of the most popular representations of time series data is the Symbolic Aggregate approXimation (SAX), whose popularity stems from its simplicity and efficiency. SAX has, however, one major drawback: it cannot represent trend information. Several methods have been proposed to enable SAX to capture trend information, but at the expense of complex processing, preprocessing, or post-processing procedures. In this paper we present a new modification of SAX, called Trending SAX (TSAX), which adds only minimal complexity to SAX yet substantially improves its performance in time series classification. We validate this experimentally on 50 datasets; TSAX gives a smaller classification error than SAX on 39 of them.
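To make the abstract concrete, the following is a minimal sketch of classic SAX (z-normalization, piecewise aggregate approximation, then discretization against Gaussian breakpoints), together with a hypothetical per-segment trend annotation based on the sign of a least-squares slope. The `trend_tags` function only illustrates the general idea of attaching trend information to SAX symbols; it is not the exact TSAX construction described in the paper.

```python
import numpy as np

# Standard SAX breakpoints: cut points giving equiprobable
# regions under N(0, 1), indexed by alphabet size.
BREAKPOINTS = {3: [-0.43, 0.43], 4: [-0.67, 0.0, 0.67]}

def sax_word(series, n_segments, alphabet_size=4):
    """Classic SAX: z-normalize, reduce with PAA, then map each
    segment mean to a letter via the Gaussian breakpoints."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / x.std()                  # z-normalization
    paa = x.reshape(n_segments, -1).mean(axis=1)  # PAA; length must divide evenly
    symbols = np.searchsorted(BREAKPOINTS[alphabet_size], paa)
    return "".join(chr(ord("a") + s) for s in symbols)

def trend_tags(series, n_segments):
    """Hypothetical trend annotation: tag each segment 'u' (up),
    'd' (down), or 'f' (flat) from the sign of its fitted slope."""
    x = np.asarray(series, dtype=float)
    tags = []
    for seg in x.reshape(n_segments, -1):
        slope = np.polyfit(np.arange(len(seg)), seg, 1)[0]
        tags.append("u" if slope > 1e-9 else "d" if slope < -1e-9 else "f")
    return "".join(tags)

print(sax_word([1, 2, 3, 4, 5, 6, 7, 8], n_segments=4))    # "abcd"
print(trend_tags([1, 2, 3, 4, 5, 6, 7, 8], n_segments=4))  # "uuuu"
```

Pairing each SAX letter with such a tag shows why trend capture costs little extra: the tag is computed from the same segment the letter already summarizes.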



Author information


Correspondence to Muhammad Marwan Muhammad Fuad.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Muhammad Fuad, M.M. (2021). TSAX is Trending. In: Paszynski, M., Kranzlmüller, D., Krzhizhanovskaya, V.V., Dongarra, J.J., Sloot, P.M.A. (eds.) Computational Science – ICCS 2021. Lecture Notes in Computer Science, vol. 12743. Springer, Cham. https://doi.org/10.1007/978-3-030-77964-1_23

  • DOI: https://doi.org/10.1007/978-3-030-77964-1_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-77963-4

  • Online ISBN: 978-3-030-77964-1

  • eBook Packages: Computer Science (R0)
