ABSTRACT
In agile, fast, and continuous development lifecycles, software performance analysis is fundamental to confidently releasing continuously improved software versions. Researchers and industry practitioners have recognized the importance of integrating performance testing into agile development processes in a timely and efficient way. However, existing techniques are fragmented: they account neither for the heterogeneous skills of the users developing polyglot distributed software, nor for their need to automate performance practices so that these can be integrated into the whole lifecycle without breaking its intrinsic velocity. In this paper we present our vision of holistic continuous software performance assessment, which is being implemented in the BenchFlow tool. BenchFlow enables performance testing and analysis practices to be pervasively integrated into continuous development lifecycle activities. Users can specify performance activities (e.g., standard performance tests) by relying on an expressive Domain Specific Language for objective-driven performance analysis. Collected performance knowledge can thus be reused to speed up performance activities throughout the entire process.
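To make the notion of objective-driven, declaratively specified performance tests concrete, the following is a minimal Python sketch of the idea. It is an illustration only, not BenchFlow's actual DSL: all names (PerformanceObjective, LoadTest, evaluate) and the pass/fail semantics are hypothetical assumptions introduced here.

```python
# Illustrative sketch of objective-driven performance testing:
# a test is declared once, together with the objectives it must meet;
# the framework runs it and checks the observed metrics against them.
# All names are hypothetical, not taken from BenchFlow.
from dataclasses import dataclass


@dataclass
class PerformanceObjective:
    metric: str          # e.g., "p95_latency_ms" or "throughput_rps"
    comparator: str      # "<=" for upper bounds, ">=" for lower bounds
    threshold: float

    def is_met(self, observed: float) -> bool:
        # Check the observed value against the declared threshold.
        if self.comparator == "<=":
            return observed <= self.threshold
        return observed >= self.threshold


@dataclass
class LoadTest:
    name: str
    users: int           # simulated concurrent users
    duration_s: int      # test duration in seconds
    objectives: list     # list of PerformanceObjective

    def evaluate(self, observed_metrics: dict) -> bool:
        """The test passes only if every declared objective is met."""
        return all(
            obj.is_met(observed_metrics[obj.metric])
            for obj in self.objectives
        )


# Usage: a declarative specification of a baseline load test.
test = LoadTest(
    name="checkout-baseline",
    users=100,
    duration_s=300,
    objectives=[
        PerformanceObjective("p95_latency_ms", "<=", 250.0),
        PerformanceObjective("throughput_rps", ">=", 400.0),
    ],
)
print(test.evaluate({"p95_latency_ms": 210.0, "throughput_rps": 450.0}))  # True
```

The point of such a declaration is that it separates *what* must hold (the objectives) from *how* the load is generated and measured, which is what allows the same specification to be reused across lifecycle stages.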
Towards Holistic Continuous Software Performance Assessment