DOI: 10.1145/3341105.3373948

Research article

Benchmarking elasticity of FaaS platforms as a foundation for objective-driven design of serverless applications

Published: 30 March 2020

ABSTRACT

Application providers have to solve the trade-off between performance and deployment costs by selecting the "right" amount of provisioned computing resources for their application. The high value of changing this trade-off decision at runtime fueled a decade of combined efforts by industry and research to develop elastic applications. Despite these efforts, the development of elastic applications still demands significant time and expertise from application providers.

To address this demand, FaaS platforms shift responsibilities associated with elasticity from the application developer to the cloud provider. While this shift is highly promising, FaaS platforms do not quantify elasticity; thus, application developers are unaware of how elastic FaaS platforms are. This lack of knowledge significantly impairs effective objective-driven design of serverless applications.

In this paper, we present an experiment design and corresponding toolkit for quantifying elasticity and its associated trade-offs with latency, reliability, and execution costs. We present evaluation results for four popular FaaS platforms operated by AWS, Google, IBM, and Microsoft, and show significant differences between their service offerings. Based on our results, we assess the applicability of the individual FaaS platforms in three scenarios under different objectives: web serving, online data analysis, and offline batch processing.
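The core idea of such an elasticity experiment can be illustrated with a minimal sketch (this is an assumption-laden illustration, not the authors' toolkit): issue bursts of concurrent function invocations at step-wise increasing parallelism, record per-request latency, and summarize each step, so that scaling effects such as cold-start spikes after a load step become visible. The `invoke` function below is a hypothetical placeholder for a real FaaS trigger call.

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def invoke():
    """Hypothetical placeholder for a real FaaS call (e.g. an HTTP
    request to the function's trigger URL); here it only sleeps."""
    time.sleep(0.01)

def run_step(parallelism, invoke_fn=invoke):
    """Issue `parallelism` concurrent invocations; return latencies in ms."""
    def timed(_):
        start = time.perf_counter()
        invoke_fn()
        return (time.perf_counter() - start) * 1000.0
    with ThreadPoolExecutor(max_workers=parallelism) as pool:
        return list(pool.map(timed, range(parallelism)))

def summarize(latencies):
    """Reduce one load step's latencies to p50/p99/max in ms."""
    ordered = sorted(latencies)
    return {
        "p50": statistics.median(ordered),
        "p99": ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))],
        "max": ordered[-1],
    }

if __name__ == "__main__":
    # Step-wise load increase: each step probes how the platform scales.
    for parallelism in (1, 10, 50):
        print(parallelism, summarize(run_step(parallelism)))
```

A real benchmark would additionally control memory configuration and payload size, and repeat each step to separate cold-start from warm-instance latency, which is what allows trade-offs with reliability and execution costs to be quantified.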


Published in

SAC '20: Proceedings of the 35th Annual ACM Symposium on Applied Computing
March 2020, 2348 pages
ISBN: 9781450368667
DOI: 10.1145/3341105
Copyright © 2020 ACM

Publisher: Association for Computing Machinery, New York, NY, United States

Overall acceptance rate: 1,650 of 6,669 submissions, 25%
