ABSTRACT
Application providers have to navigate the trade-off between performance and deployment costs by selecting the "right" amount of provisioned computing resources for their application. The high value of changing this trade-off decision at runtime has fueled a decade of combined efforts by industry and research to develop elastic applications. Despite these efforts, developing elastic applications still demands significant time and expertise from application providers.
To address this demand, FaaS platforms shift responsibilities associated with elasticity from the application developer to the cloud provider. While this shift is highly promising, FaaS platforms do not quantify elasticity; thus, application developers are unaware of how elastic FaaS platforms are. This lack of knowledge significantly impairs effective objective-driven design of serverless applications.
In this paper, we present an experiment design and corresponding toolkit for quantifying elasticity and its associated trade-offs with latency, reliability, and execution costs. We present evaluation results for four popular FaaS platforms by AWS, Google, IBM, and Microsoft, and show significant differences between the service offerings. Based on our results, we assess the applicability of the individual FaaS platforms in three scenarios under different objectives: web serving, online data analysis, and offline batch processing.