A case analysis of enabling continuous software deployment through knowledge management

https://doi.org/10.1016/j.ijinfomgt.2017.11.005

Abstract

Continuous software engineering aims to accelerate software development by automating the whole software development process. Knowledge management is a cornerstone for continuous integration between software development and its operational deployment, which must be implemented using sound methodologies and solid tools. In this paper, the authors present and analyse a case study on the adoption of such practices by a software company. Results show that, beyond tools, knowledge management practices are the main enablers of continuous software engineering adoption and success.

Introduction

In order to preserve their competitive advantage, software producers need to deliver products and new features to customers as fast as they can. It is generally accepted that important problems in software delivery are rooted, among other aspects, in the disconnections among software development activities, causing delays in software delivery (Fitzgerald & Stol, 2017). This lack of connection does not lie only on the technical side; human aspects and knowledge management facets are among the main areas to be improved. Continuous software engineering permits the delivery of software features at rates that a few years ago would have been considered unachievable (Colomo-Palacios, Fernandes, Soto-Acosta, & Sabbagh, 2011, p. 4; O’Connor, Elger, & Clarke, 2017). This approach relies heavily on automating the overall software development process (including code collaboration tools, verification, version control, deployment and release management…) by using several tools. These tools act as structures in which different types of knowledge are coded and shared among software practitioners.

Like any other approach, continuous deployment presents benefits but also caveats. On the benefits side, the literature reports increased customer satisfaction, shorter time-to-market, higher developer productivity and efficiency, continuous and rapid feedback and, finally, higher quality and reliability. With regard to the challenges, researchers identified the wide panoply of available tools and their integration, an organizational culture that hinders the transformation process, and increased quality assurance efforts.

The continuous approach goes beyond the borders of traditional software development to reach the operational side as well. In this scenario, DevOps stands for a continuous integration between software development (Dev) and its operational deployment (Ops). DevOps efficiently integrates development, delivery, and operations, thus facilitating a lean and fluid connection of these traditionally separated silos (Ebert, Gallardo, Hernantes, & Serrano, 2016). Consequently, DevOps implies a cultural shift toward collaboration between development, quality assurance, and operations (Ebert et al., 2016). The success of DevOps is based on four principles (Humble & Molesky, 2011):

  • Culture. Joint responsibility for the delivery of high quality software.

  • Automation. Automation in all development and operation steps towards rapid delivery and feedback from users.

  • Measurement. All processes must be quantified to understand delivery capability, and proposals for corrective actions should be formulated to improve the process.

  • Sharing. Sharing knowledge enabled by tools is crucial.
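The interplay of these four principles can be illustrated with a minimal sketch. The following Python snippet is a toy model, not a real pipeline: the stage names (`build`, `test`, `deploy`) and the in-memory log are assumptions for illustration, and in practice each stage would invoke actual build, test and deployment tooling.

```python
# Toy model of the four DevOps principles: every step is automated
# (Automation), timed (Measurement), and recorded in a log that Dev and
# Ops can both read (Sharing), so responsibility for delivery is joint
# (Culture). Stage names and actions are placeholders.
import time
from dataclasses import dataclass, field

@dataclass
class PipelineRun:
    """One automated pass from commit to deployment."""
    stages: dict = field(default_factory=dict)  # stage name -> duration (Measurement)
    log: list = field(default_factory=list)     # shared record of what happened (Sharing)

    def run_stage(self, name, action):
        start = time.perf_counter()
        ok = action()
        self.stages[name] = time.perf_counter() - start
        self.log.append(f"{name}: {'ok' if ok else 'FAILED'}")
        return ok

def deploy_commit(run: PipelineRun) -> bool:
    # Any failing stage stops the pipeline; the shared log tells
    # everyone which stage failed and how long each stage took.
    for name, action in [
        ("build", lambda: True),   # placeholder: compile and package
        ("test", lambda: True),    # placeholder: automated test suite
        ("deploy", lambda: True),  # placeholder: push to production
    ]:
        if not run.run_stage(name, action):
            return False
    return True

run = PipelineRun()
assert deploy_commit(run)
print(run.log)             # shared knowledge of this delivery
print(sorted(run.stages))  # measured stages: build, deploy, test
```

In a real setting the log and the stage timings would live in the team's knowledge management tooling rather than in memory, which is precisely the role the case study assigns to such tools.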

Accordingly, knowledge management is one of the pillars of DevOps and must be implemented using sound methodologies and solid tools. The literature has reported specific knowledge management systems designed and implemented to serve DevOps scenarios (Wettinger, Andrikopoulos, & Leymann, 2015). Focusing just on the development side of DevOps, knowledge management is seen as one of the cornerstones of software quality: in this context, knowledge management comprises the aggregation, distribution and visualization of data, information and knowledge to support collaborating stakeholders in fulfilling their quality-related tasks and decisions (Del Giudice & Della Peruta, 2016).

In spite of the importance of the topic, to the best of the authors’ knowledge, no research studies go beyond describing knowledge management tools to examine knowledge management factors in continuous software engineering or DevOps scenarios. This paper aims to bridge this gap.

This case is structured as follows: Section 1 above contains a brief introduction to continuous software engineering, continuous deployment and DevOps. In Section 2, the background of the company in which the case study is conducted is presented. Section 3 presents the main aspects of the team leading the DevOps efforts based on continuous deployment. This is followed by Section 4, in which the research methodology for this case study is presented. In Section 5, the case study findings are analysed. Section 6 provides a discussion and describes the lessons learnt. Section 7 presents the main conclusions of the case study.

Section snippets

Company background

Meta4 is a world leader in human capital management solutions. Founded in 1991, Meta4 has more than 1300 clients in 100 countries, and more than 18 million employees are managed via Meta4 software. In 2016, Meta4 posted revenues of 63 million euros, 5% more than in 2015, achieving record takings through its line of cloud HR and payroll solutions.

Meta4, with 950 employees worldwide, has branches in eleven countries, although the headquarters of the company is located in Madrid, Spain. Meta4 moved from on premise

The DevOps team

This section begins by describing the scenario before the project started, after which the project scope and objectives are depicted.

According to Gartner, by 2020, 30% of global midmarket and large enterprises will have invested in a cloud-deployed human capital management suite. Meta4 started its efforts towards fully functional cloud solutions around a decade ago. However, it was not until 2013 that DevOps appeared as a possible solution to some of the issues associated with this move to the cloud.

Case study research method

Given the nature of the project and the objectives of the case study, a qualitative research methodology was adopted. More precisely, the researchers used Grounded Theory (GT). Drawing on GT, researchers are able to investigate the organisation from both a user-oriented and an organisational perspective and extrapolate findings grounded in the available data. In our case, researchers conducted a set of semi-structured interviews with project group members identified by the project

Lessons learned

The lessons learned during the different phases of the case study can be classified into three different categories as follows: organizational matters, tools and people. In what follows, these areas will be reviewed and discussed.

Conclusions

This case study illustrates the use of knowledge management tools in the adoption of DevOps practices by a traditional software vendor as a way of efficiently integrating the development, delivery and operations of cloud solutions. DevOps adoption drives a challenging cultural shift towards collaboration and knowledge-sharing between software development, quality control and operations. In this sense, several conclusions can be drawn from this case study. The need for implementing DevOps emerged when

Acknowledgment

We would like to thank Fundación CajaMurcia for the support provided.



Cited by (53)

  • Agile incident response (AIR): Improving the incident response process in healthcare

    2022, International Journal of Information Management
    Citation Excerpt :

    We propose the Agile Incident Response (IR) Framework, inspired by the Agile Manifesto (Beck & Depew, 2001) and we incorporate agile principles into IR processes to break them down into smaller, more manageable parts, focused around specific tasks, which can be prioritised and continuously delivered over shorter iterations. The Agile philosophy is widely and successfully applied in Software Engineering (e.g., Colomo-Palacios, Fernandes, Soto-Acosta, & Larrucea, 2018; Gupta, George, & Xia, 2019; Tam, Moura, da, Oliveira, & Varajão, 2020), and have been shown to reduce large project failures, by providing constant monitoring and continuous improvement throughout the project (Laanti, Salo, & Abrahamsson, 2011). Most importantly, it incorporates quick feedback and continuous adaptation (Serrador & Pinto, 2015), both of which can support IR teams to respond to incidents whilst minimising information loss and service disruption.

  • The VALU3S ECSEL project: Verification and validation of automated systems safety and security

    2021, Microprocessors and Microsystems
    Citation Excerpt :

    Business strategy [75] considers continuous planning and budgeting that evolve in response to changes in the business environment. Continuous development [76,77] considers areas such as integration, delivery, deployment, verification [78,79], compliance, and continuous architecting [80–82]. Our focus is on the continuous integration, verification, and architecting.

  • DevOps and software quality: A systematic mapping

    2020, Computer Science Review
    Citation Excerpt :

    According to Muñoz, and Negrete [54] it is important to highlight that the generic DevOps process has 4 phases, 8 activities and 40 tasks, which have to be followed in order to perform a software development process with DevOps culture. Colomo-Palacios et al. [25] observed that DevOps culture has an impact on software quality. It determines the employee’s operation which in turn improve interoperability of an application.


Ricardo Colomo-Palacios Full Professor at the Computer Science Department of the Østfold University College, Norway. Formerly he worked at Universidad Carlos III de Madrid, Spain. His research interests include applied research in information systems, software project management, people in software projects, business software, software and services process improvement and web science. He received his PhD in Computer Science from the Universidad Politécnica of Madrid (2005). He also holds a MBA from the Instituto de Empresa (2002). He has been working as Software Engineer, Project Manager and Software Engineering Consultant in several companies including Spanish IT leader INDRA. He is also an Editorial Board Member and Associate Editor for several international journals and conferences and Editor in Chief of International Journal of Human Capital and Information Technology Professionals.

Eduardo Fernandes received his Ph.D. in Physical Sciences from the European Space Agency (ESA) / Universidad Complutense de Madrid (1995). He is currently Chief Architect and Technology Product Manager at Meta4, where he is responsible for strategic decisions affecting the Meta4 product line, for example PeopleNet. His main role is to analyse technological market trends and frameworks in order to merge them into Meta4 products. Before joining Meta4, he was responsible for the software development team for MiniSat-01 (ESA), the first Spanish mini-satellite (1997). While working on his master’s in physics, he was responsible for the automation of the biggest telescope in Brazil, deeply involved in both software and hardware development (1992).

Pedro Soto-Acosta is a Full Professor of Management at the University of Murcia (Spain). He attended Postgraduate Courses in Management at Harvard University (USA) and received his PhD in Business Economics from the University of Murcia. He serves as Associate Editor and Senior Editor for several mainstream journals including Decision Sciences, Electronic Commerce Research and Applications, Electronic Markets, Information Systems Management, Journal of Knowledge Management, Journal of Strategic Information Systems, and Computational Economics. His work has been published in journals such as Computers in Human Behavior, European Journal of Information Systems, Journal of Business Research, and Technological Forecasting and Social Change, among others. Further information is available at http://webs.um.es/psoto.

Xabier Larrucea holds a Ph.D, PMP, Executive MBA, and a computer engineering degree. He is a senior project leader and research scientist at Tecnalia and a part‐time lecturer at the University of the Basque Country. His research focuses on areas such as safety‐critical software systems, software quality assurance, software process improvement, empirical software engineering, metamodeling and technology strategy.
