
1 Introduction

Human factors and ergonomics is “the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance” (IEA 2000). The Federal Aviation Administration defines human factors as a “multidisciplinary effort to generate and compile information about human capabilities and limitations and apply that information to equipment, systems, facilities, procedures, jobs, environments, training, staffing, and personnel management for safe, comfortable, and effective human performance” (FAA 2016). Human factors (HF) analysis is utilized in several fields, but is most prominent within the healthcare and aviation industries, where a single human error can lead to potentially catastrophic consequences. HF is ideally applied proactively throughout system development. In many cases, however, it is applied retrospectively, examining the consequences of human errors after a costly or damaging incident.

Human factors and ergonomics assumes that humans are inevitably fallible. Therefore, an effort needs to be made to design products, systems, and processes to reduce human error and optimize human efficiency. Research spanning behavioral economics to organizational behavior reveals that people have limited bandwidth for processing information. Furthermore, research by Kahneman (2013) showed that people attempt to reduce cognitive load by creating heuristics and are beholden to subconscious biases, which generally results in suboptimal decision making. Consequently, there is no way to completely eliminate human fallibility, only to decrease opportunities for error and mitigate the effects of errors when they occur.

One attempt at managing human factors analysis and human errors throughout all stages of development and production is the concept of Human Readiness Levels (HRL). The scale was initially developed by Phillips (2010) at the Naval Postgraduate School and was designed to complement the previously existing Technology Readiness Level (TRL) scale. HRLs are meant to integrate the human into the system and create a reliable and unbiased measure of the readiness of a technology for human operators/users (Endsley 2015).

The HRL scale is still in development, and alternatives to the scale have been discussed but not well established. A draft HRL scale was developed by Endsley (2015) and can be viewed in Fig. 1. This specific HRL scale has not been established as the optimal tool for incorporating the human component throughout the entire development and design process. Since no particular HRL scale has been officially adopted, we consider HRLs as a general concept: a consistent way to measure and define the human aspect in development, production, and related processes. We therefore advocate for HRLs as the implementation of a management process that incorporates the human component. HF is often used as a retrospective analysis; the HRL scale advocates that the human component be considered proactively throughout the entire product lifecycle. We refer to HRLs as a tool or measure that allows project teams to analyze, understand, and develop their components, processes, and systems from a human factors approach. This allows projects and programs to incorporate the human element from initial design through delivery to the end user, rather than as a form of post-analysis.

Fig. 1. HRL scale as suggested by Endsley (2015)

While HRLs have been researched, the government sector has not implemented HRLs as part of technology or system development. HRLs have not received sufficient justification to prove their impact on the success of a project. Thus, our research aims to answer:

  1. Can we justify using human readiness levels by their economic benefits and technical needs?

  2. Can we create implementation strategies for human readiness levels based on the forecasted adoption roadmap?

To answer these questions, we examine a reactive case study wherein a human factors team was asked to analyze and discover errors within production that led to significant cost and schedule overruns as well as lot failures. The team found significant errors related to the human component and made several suggestions. The economic impact of these HF issues is analyzed in terms of the triple constraints: scope, cost, and schedule. We also evaluate some indirect benefits of incorporating HRLs. This provides rationale for implementing the HF approach earlier in the development process, justifying the need for HRLs from an economic standpoint. Additionally, the negative impact on project scope demonstrates the technical need for HRLs. We then use the historical TRL adoption model to create a roadmap of HRL adoption within the government sector. Finally, our paper concludes with potential implementation strategies to increase the likelihood of successful HRL adoption within the government.

1.1 A Reactive Case Study

We provide a case study to demonstrate the specific human factors issues that arose from not incorporating the human component in the initial design phase or throughout development. This example comes from Sandia National Laboratories during development of a critical component on a large-scale weapons system. The case demonstrates a reactive approach to human factors, wherein the human component was not considered until several issues had become prominent and resulted in negative financial, schedule, and scope impacts. In our example, three lots of the component had failed and had to be discarded, which resulted in a halt in production. A human factors team was then consulted to review processes and evaluate the most common errors. They designed user-centered controls and processes to combat those errors as well as reduce the possibility of human errors further affecting the product. While this case refers to only one component, several issues were found, spanning the various sectors of HF.

The HF team identified two plant and equipment issues within the component development. The first pertained to the tools provided to support visual inspection. The inspectors struggled to adequately detect contamination, which led to scrapping of 1.2% of the components. In essence, the contamination was not visible to the human eye under normal lighting conditions. Inspectors were being asked to perform a task they were not physically capable of completing. The HF team experimented with different types of supplemental lights and determined that the contamination fluoresced under one particular color of lighting. Following the change in lighting, inspectors were able to successfully detect contamination, and no components were scrapped. The second plant and equipment issue stemmed from several fixtures and accessories that resulted in damage to the component. The HF team recommended design changes to incorporate keyed features within the fixtures that facilitated proper alignment of the component and prevented damage.

Additionally, the HF team identified several issues in the processes of the component’s production that were due to human errors. First, there was an issue pertaining to the assembly order of the component: 13% of components had to be reworked and 4% of the reworked units had to be scrapped. The HF team analyzed the complete assembly process and identified a more effective and reliable ordering of the steps involved. After the assembly process was revised, only 3% of components needed to be reworked, and no components had to be scrapped, saving significant time and resources. Second, a process was implemented to enhance inspectors’ ability to read serial numbers, which reduced handling mishaps as well as human error when reading or recording the serial number.

Finally, the HF team identified issues within the people component via two more inspection processes. No process had been established to coordinate among the four different vendors responsible for inspection and certification of piece parts used in this particular component. This resulted in a lack of communication among vendors as well as the inability to trace the life of production. Because of the poor traceability, a lot of 1,300 piece parts had to be scrapped, with a loss of $18,000 and a significant schedule delay. The HF team mapped the process among vendors to promote traceability of piece parts. Further, job aids were created to facilitate inspection of the piece parts and enhance consistency in inspection. Another phase of the production process required manual transcription of data, which carried a high probability of human error while transcribing or reading the data. The HF team redesigned forms to remove unnecessary information and pre-populated them with static information such as serial numbers. The team also converted some paper information to electronic form. These changes resulted in 16 fewer days for completion and reduced the number of human error opportunities from 8,000 to 400. The modifications also allowed operators to focus solely on the task, without the distractions associated with reading or transcribing data.

This case study provides a unique opportunity to not only examine the HF interventions, but also view the economic and business impact before and after implementing an HF analysis. With economic implications in mind, we take a project management approach while examining the benefits that resulted from incorporating the HF recommendations. Traditionally, the project management perspective allows an analyst to consider the triple constraints of a project: schedule, cost, and scope/quality.

Instead of analyzing each HF intervention independently, we consider the impact on production of component lots before and after the recommended changes (Fig. 2). We are able to quantify the impacts of the various HF recommendations through several metrics. The HF interventions resulted in a cost savings of 67% per lot, a 36% reduction in manufacturing time per lot, and a doubling in the number of components delivered. It is evident that implementing the HF recommendations resulted in significant cost savings, reduced delays, and increased quality and quantity. The technical and production teams were better able to meet their deliverables, saving both time and money.

Fig. 2. The scope, cost, and schedule percent improvements from human factors interventions that were implemented on a defense project within the U.S. government.
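As a minimal sketch of the percent-improvement arithmetic summarized in Fig. 2, the snippet below computes the reported metrics from before-and-after lot values. The dollar, day, and unit figures are hypothetical placeholders chosen only so that the resulting percentages match the case study; they are not actual program data.

```python
# Sketch of the percent-improvement arithmetic behind Fig. 2.
# All before/after figures are illustrative placeholders, not program data;
# only the resulting percentages mirror the case study results.

def percent_improvement(before: float, after: float) -> float:
    """Percent improvement relative to the pre-intervention baseline."""
    return (before - after) / before * 100.0

cost_before, cost_after = 300_000, 99_000    # dollars per lot (hypothetical)
days_before, days_after = 100, 64            # manufacturing days per lot (hypothetical)
units_before, units_after = 50, 100          # components delivered per lot (hypothetical)

print(f"Cost savings per lot:     {percent_improvement(cost_before, cost_after):.0f}%")  # ~67%
print(f"Manufacturing time saved: {percent_improvement(days_before, days_after):.0f}%")  # ~36%
print(f"Delivery increase:        {units_after / units_before:.1f}x baseline")           # 2.0x
```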

1.2 A Proactive Approach

Frequently, human factors experts are brought in to assess situations once a failure has occurred. The Three Mile Island nuclear reactor accident in 1979 was caused by operators’ failure to adequately assess the situation, due to their lack of training as well as poor usability and design of controls (GPU Nuclear Corp 1999). The fire in the King’s Cross station of the London Underground in 1987 can be attributed to similar human errors (Fennell 1988). The Challenger explosion in 1986 provides another example of a catastrophe wherein poor communication and arduous work schedules have been cited as partial contributors (Forrest 1996). These examples show the breadth of human errors and the accidents that led to the loss of human life. In each of these cases, post hoc analysis determined that the cause of system failure was primarily human error.

We propose that investing in the human component throughout all stages of a product’s lifecycle increases an organization’s flexibility and enhances its capabilities. Furthermore, it provides the organization a competitive advantage. Most government organizations consider the human dimension only when there is a system failure with significant negative consequences. Rather than this reactive approach, we suggest that a proactive approach would yield maximum benefits for an organization and the product development lifecycle. From a business perspective, the savings seen in our reactive case study could have been realized from the beginning of lot production had an HRL scale been applied from the start of development. In addition to the quantifiable economic benefits that would result from adopting an HRL scale, we suggest several other latent benefits would also arise. We provide several potential examples to illustrate the types of advantages that may occur from proactive use of HRLs, but suggest that these added benefits be further researched.

If human factors are considered early in development, through a proactive approach, training is significantly improved. The operator is trained on the correct system initially and does not have to undergo retraining when reactive adjustments are made. This also reduces the operator’s cognitive load and the possibility of human error. It is prudent to implement HF techniques from the beginning of development to reduce the negative impact on users and decrease opportunities for human error.

Additionally, if a product is developed poorly, resulting in failures, the organization’s reputation might be severely damaged and customer satisfaction may be adversely affected. The organization may appear incompetent if a product labeled as production-ready fails. In addition to the effect on customers, this would also damage employees’ and end users’ perceptions of the organization. The more incompetent an organization appears, the less trustworthy it is perceived to be by these essential stakeholders. For example, Japan lost its role as the leader of the electronics industry when competitors delivered better systems that led to more productive and efficient users. According to Panasonic’s President Kazuhiro Tsuga, “Japanese firms were too confident about our technology and manufacturing prowess. We lost sight of the products from the customer’s point of view” (Wakabayashi 2012).

2 Adoption Within the U.S. Government

We conducted an extensive literature review to capture the full picture of TRL adoption within the government sector. This helped us determine the key events leading to widespread adoption across the five stages of the Technology Adoption Lifecycle (Rogers 2003). Based on the impact of each event, we determined the milestones that led to the completion of one stage and the beginning of the next. We used the historical adoption model of TRLs to create a forecasted HRL adoption model; the two scales’ parallel nature allows us to make such predictions. This forecasted model was used to recommend project and organizational implementation strategies.

2.1 Historical TRL Adoption Model

We created a TRL adoption model to provide insight into the HRL adoption roadmap. TRLs have largely become a requirement throughout most of the government and have therefore reached complete adoption within all relevant industries. Figure 3 shows our historical TRL adoption roadmap as well as the most probable market share percentage (S-curve) as TRL adoption increased over time. We also show the main government adopter at each stage of the lifecycle.

Fig. 3. Model of our historical TRL adoption roadmap and market share percentage
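The market share trajectory in Fig. 3 follows the classic S-curve of the technology adoption lifecycle. A minimal sketch of that logistic form is shown below; the midpoint year and growth rate are illustrative assumptions for plotting purposes, not parameters fitted to the historical TRL events.

```python
# Illustrative logistic (S-curve) adoption model of the kind assumed in Fig. 3.
# The midpoint year and growth rate are placeholder assumptions, not values
# fitted to the historical TRL adoption data.
import math

def adoption_share(year: float, midpoint: float = 1999.0, rate: float = 0.35) -> float:
    """Cumulative adoption (0-100%) under a logistic growth assumption."""
    return 100.0 / (1.0 + math.exp(-rate * (year - midpoint)))

for yr in range(1974, 2011, 6):
    print(f"{yr}: {adoption_share(yr):5.1f}% estimated market share")
```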

The chasm, as described by Moore (2014), is the gap between the early adopters and the early majority. “Crossing the chasm” is often seen as the most difficult step in the technology adoption lifecycle. In the case of TRL adoption within the government, we suggest that the General Accounting Office (GAO) recommendation that the DoD begin implementing TRLs, based on the increased technological maturity and cost savings seen in private industry, allowed TRLs to cross the chasm and reach the early majority stage. The specific events included in our historical TRL adoption roadmap are explained in further detail below.

  • 1969 – A report on advanced space station technology mentioned a new idea for assessing the maturity of new technologies, called the “Technology Readiness Review” (Mankins 2009)

  • 1974 – Stan Sadin developed the first 7-level TRL scale, with one-line definitions, as a tool for assessing technological maturity for NASA (Mankins 2009)

  • 1986 – The Challenger Space Shuttle accident increased focus on rebuilding the space agency’s technological foundations through new programs (Mankins 2009)

  • 1989 – TRL use expanded due to the “Space Exploration Initiative”; the TRL scale was extended to the 9 levels that are now the standard (Mankins 2009)

  • 1991 – TRLs came into use throughout the Civil Space Program (Mankins 2009)

  • 1990s – Initial TRL adoption within the U.S. Air Force (Whelan 2008)

  • 1992–1994 – NASA’s Office of Space Science used TRLs extensively to communicate with researchers, internal and external organizations, and its management chain (Mankins 2009)

  • 1995 – Mankins developed and explained the first complete set of TRL definitions (Mankins 2009)

  • 1999 – The U.S. General Accounting Office recommended the DoD “adopt a disciplined and knowledge-based approach of assessing technology maturity such as TRL” (Mankins 2009)

  • 2000 – First DoD adoption of NASA’s TRL scale (Mankins 2009)

  • 2001 – The Deputy Under Secretary of Defense for Science and Technology issued a memorandum endorsing the use of TRLs in new major programs (Whelan 2008)

  • 2001 – Required use of TRLs in the Department of Defense accelerated adoption (Olechowski 2015)

  • 2003 – The DoD developed its own formal guidelines and definitions for assessing technological maturity (Whelan 2008)

  • 2006 – The GAO initiated a review that resulted in the DOE producing its own Technology Readiness Assessment guidelines (Alexander 2007)

  • 2007 – The GAO recommended the DOE adopt TRLs (Alexander 2008)

  • 2008 – Language supporting the GAO recommendation was incorporated into the Congressional budget allocation (Alexander 2008)

2.2 Forecasted HRL Adoption

HRL development and implementation have been limited since the scale’s inception in 2010. Based on market penetration, we estimate HRLs have been adopted only by the innovators (refer to Fig. 4). While the human element has previously been considered in system development, the lineage and adoption growth of HRLs as an actual tool/framework can be seen below.

Fig. 4. Forecasted HRL adoption roadmap and possible market share percentage

  • 2010 – Phillips developed and tested a 9-level Human Readiness Level scale at the Naval Postgraduate School (Phillips 2010)

  • 2011 – Hale et al. created a 6-level Human Factors Readiness Level scale to assess human factors needs in human-machine interactions (Hale et al. 2011)

  • 2013 – Endsley examined the feasibility of the 9-level HRL scale as a parallel measure to TRLs

  • 2015 – O’Neil et al. developed the Comprehensive Human Integration Evaluation Framework (CHIEF) model, a 5-level scale to assess the impact of human system integration on total system performance, which was implemented within the U.S. Coast Guard Office (O’Neil et al. 2015)

  • 2015 – Endsley presented “Human System Integration: Challenges and Opportunities” at the National Defense Industrial Association, arguing for the need for an HRL framework

  • 2016 – See and Morris, researchers at Sandia National Laboratories, began examining the feasibility of integrating the HRL scale within the national laboratory (See and Morris 2016)

  • 2017 – Newton, Greenberg, and See conduct research to justify the need for HRLs from an economic perspective and create a roadmap of HRL adoption

Since we are currently in the innovators stage of HRL adoption, four major catalysts are needed to bridge each stage to the subsequent one. These catalysts are events that cross the gap separating each stage of the lifecycle and help to advance adoption. To determine the HRL adoption catalysts, we use the TRL adoption roadmap as a general guideline.

Prior to reaching the early adopters stage, we believe that HRLs must have formal definitions and a finalized scale. The scale must be broad enough to cross disciplines, but specific enough that it can be implemented by an organization. We suggest that a feasibility study be conducted to determine how an HRL scale could actually be implemented. Research is currently being performed within the DoD to determine whether HRLs could be adopted. Sandia National Laboratories is currently conducting research to determine whether HRLs should be structured as a separate readiness scale or incorporated into the existing TRL scale (See and Morris 2016).

To cross the chasm, HRLs will need a strong champion that will encourage and convince government organizations to begin adopting HRLs. Given its impact on TRL adoption, we suggest an organization like the GAO would be able to provide proper justification by recommending that HRLs be adopted within the government. Private organizations have an incentive to maximize profits, which can be partially achieved by understanding the human element, even if a formal HRL process is not performed. The GAO can examine private industry and use these benefits to better understand and justify the need to use HRLs within the government. Formal policy from large government organizations, such as the DoD and DOE, requiring the use of HRLs will be needed to reach the late majority stage of adoption. Finally, HRLs will reach the laggards when the requirements policy eventually extends to all contractors and suppliers, much as occurred with TRL adoption.

3 Implementation Strategies

The TRL adoption roadmap model provides insight into a possible HRL adoption lifecycle and major implementation checkpoints. A TRL feasibility study conducted by the DoD provides additional insight into implementation logistics and potential challenges in adopting HRLs within an organization. The first challenge is to identify the organizations that would be best suited for HRL adoption. We consider which processes need to be established within an organization to increase the likelihood of a successful HRL implementation. HRL adoption is more likely to spread from one organization to another once the usefulness and triple constraint benefits can be empirically validated across government institutions. The first organization to fully implement HRLs will likely need to demonstrate particular organizational characteristics. The second challenge is to examine implementation tactics within a specific organization, especially as they relate to project management.

Organizations that are most likely to adapt to significant changes are those that demonstrate organizational change management processes. Weiner (2009) treats organizational readiness as “a shared psychological state in which organizational members feel committed to implementing organizational change and confident in their collective abilities to do so.” This indicates that everyone in the organization must act toward effectively implementing the changes and that each individual understands the justification for such modifications. These organizations tend to be much more adaptable and amenable to significant changes. Organizations with a culture that fosters change and encourages individual responsibility will be more likely to successfully implement HRLs into their processes. Strong social capital also leads to more flexible organizations that are more likely to adapt to change (Krebs 2008).

We suggest that organizations that already demonstrate some level of human factors consideration are also more likely to successfully adopt HRLs. Having HF experts incorporated into projects is an indication that the organization sees value in the human component of a system. Even if HF engineers are not formally part of a project team, they are considered valued members of the organization. These organizations clearly recognize the benefits of understanding and analyzing the human element. Furthermore, the foundational infrastructure will already be in place, as the organization already employs HF staff.

In addition to the organizational characteristics, we suggest several practical strategies that need to be in place for an organization to successfully adopt HRLs. The DoD conducted a feasibility study to ensure successful implementation of TRLs and found that successful adoption is labor intensive (Graettinger et al. 2002). Additionally, DoD and DARPA Principal Investigators (PIs) were already working under tight time constraints and exerting maximum cognitive effort prior to implementing TRLs. The DoD found a third-party objective observer to be effective for proper TRL utilization and for helping to overcome the PIs’ constraints, but this method still required extensive interaction with each PI. We suggest that successful HRL application requires a sub-group of people, or super-user group, to apply and utilize HRLs and reduce the cognitive load on the PIs. These super-users can consult with all necessary stakeholders (e.g., production workers, end users of the product, and PIs) and work with the project manager to ensure all aspects of the human component have been accounted for in the project. The most appropriate choice for a super-user is a human factors engineer or subject matter expert.

Successful implementation would also include directives from executives and upper management to adequately permeate company culture. Policy requirements would ensure that HRLs are used within all technology development initiatives. We suggest that a project manager be assigned for implementation of HRLs within an organization; this would make it possible to create a plan and to control and monitor the scope, cost, and schedule of implementation. We do not provide a specific implementation plan in this paper, but the project manager needs to consider the variation of research topics, project sizes, and individual requirements within their organization. Furthermore, this manager should also develop a plan for handling projects that are already in various stages of development.

In addition to implementing HRLs within an organization, the management team needs to consider how HRLs will affect their project processes and procedures. As a parallel, TRLs have often been used by the DoD as a threshold of technological maturity prior to acquisition: technologies must reach TRL 6 before they are ready for insertion into acquisition programs. Similarly, HRLs need to be defined and optimized for the specific organization and project types. The product or process development stage that maximizes HRL utility must also be evaluated. Project managers need to ask questions based on the potential HRL impact on the project. Will the system or component levels be evaluated? Have we considered the working and operational environments? Additionally, a contingency plan could be necessary: for example, if a project reaches completion but the HRL is too low for acquisition, what are the countermeasures?
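To make the acquisition-gate question concrete, the sketch below shows how a project manager might encode a combined readiness check. The TRL 6 threshold reflects the DoD practice noted above; the HRL threshold, the data structure, and the example values are hypothetical illustrations, since no HRL gate has been formally defined.

```python
# Sketch of a combined TRL/HRL acquisition gate. The TRL 6 threshold reflects
# the DoD practice cited in the text; the HRL threshold and the example data
# are hypothetical, since no formal HRL gate has been adopted.
from dataclasses import dataclass

@dataclass
class ReadinessAssessment:
    item: str
    trl: int   # technology readiness level, 1-9
    hrl: int   # human readiness level, 1-9 (assumed parallel scale)

def ready_for_acquisition(a: ReadinessAssessment,
                          trl_gate: int = 6,
                          hrl_gate: int = 6) -> bool:
    """Both readiness scales must meet their gate before program insertion."""
    return a.trl >= trl_gate and a.hrl >= hrl_gate

assessment = ReadinessAssessment(item="component inspection station", trl=7, hrl=4)
if not ready_for_acquisition(assessment):
    print(f"{assessment.item}: readiness below gate; invoke the contingency plan")
```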

The benefits of adopting an HRL scale or its equivalent are evident in scope, cost, and schedule, as well as in increased user and customer trust, reduced cognitive load, and improved training efficacy. There are, however, a few added costs as a result of incorporating the human component. First, adding the HRL scale requires a human factors engineer to be a core team member on projects, or at least to review projects during every stage of development. This leads to an investment in human capital, requiring additional budget allocation for the HF engineer’s salary. Incorporating an HRL scale will also entail additional requirements prior to advancing to further stages in development. For example, a product must reach an appropriate readiness level across all scales to meet its design standards and to pass reviews. An HRL scale would thus be an additional metric of maturity by which to gauge a project, process, or product. If a product does not have a high enough HRL, more effort would need to be put toward development to increase it. This may require additional costs up front, but would result in significant cost savings in the long run, as shown by our case study.

4 Conclusion

Our research provides the business and economic justification for implementing HRLs and provides a potential roadmap for HRL adoption within the government sector. Results from additional research currently being conducted are needed before HRLs will reach the early adopters stage. We mentioned several latent benefits of using an HRL scale, but further work is needed to realize these added benefits. As mentioned in our adoption roadmap, a feasibility study such as the current effort at Sandia National Laboratories (See and Morris 2016) would help define the HRL scale to ensure it meets the needs of the organization. Finally, it would be helpful to create guidelines that allow organizations to easily implement HRLs within a program. Creating a foundational procedure for incorporating HRLs into a project will enable project managers to include HRLs in their processes.

Additional research can be done that extends past the early adopters stage of the adoption lifecycle. Further, industry adoption can help clarify the benefits of HRLs. For example, understanding how HRLs might impact specific fields, such as healthcare, can provide insight into how HRLs may be enforced within that industry. Creating technologies for all users, rather than the “average” user, can also propel human factors considerations; this would be especially important for individuals with disabilities and would increase a technology’s market potential. Researchers need to understand how the human is a part of the system as research, production, and applications continue to grow in areas like TSensor systems (Walsh 2014), edge computing, robotics, artificial intelligence, and human augmentation (Sanwal 2017). Studies should be conducted to determine the best method for incorporating HRLs for each of these industry trends.