
The role of institutional factors and cognitive absorption on students’ satisfaction and performance in online learning during COVID-19

  • Sameera Butt,

    Roles Conceptualization, Data curation, Methodology, Validation, Writing – original draft

    Affiliation Institute of Quality & Technology Management, University of the Punjab, Lahore, Pakistan

  • Asif Mahmood ,

    Roles Formal analysis, Investigation, Software, Supervision, Writing – review & editing

    mahmood.engineer@gmail.com

    Affiliation Department of Innovation and Technology Management, College of Graduate Studies, Arabian Gulf University, Manama, Bahrain

  • Saima Saleem

    Roles Project administration, Supervision, Validation, Writing – review & editing

    Affiliation Institute of Quality & Technology Management, University of the Punjab, Lahore, Pakistan

Abstract

The COVID-19 pandemic has had a severe negative impact on all aspects of life, whether jobs, business, health, or education. As a result, institutions, schools, colleges and universities were shut down globally to control the spread of COVID-19. For this reason, the mode of education shifted dramatically from on-campus to online learning, with virtual teaching using digital technologies. This sudden shift elevated stress levels among students, who were not mentally prepared for it, and their academic performance was adversely affected as a result. It is therefore necessary to understand the underlying process that can make online learning more productive. To this end, the present study integrates the modified Technology Acceptance Model (TAM), the Task Technology Fit (TTF) model, the DeLone and McLean Model of Information Systems Success (DMISM) and the Unified Theory of Acceptance and Use of Technology (UTAUT) model. A sample of 404 students was obtained, of which 202 students were from the top ten public sector universities and 202 from the top ten private sector universities of Punjab. Structural Equation Modelling (SEM) was used to analyze the hypothesized framework using AMOS. The results reveal that institutional factors positively impact students’ performance, mediated by user satisfaction and task technology fit. Similarly, institutional factors affect performance through mediation by user satisfaction and actual usage in sequence. Cognitive absorption was used as a moderator between institutional factors and user satisfaction. Finally, theoretical and practical implications are discussed.

1. Introduction

The COVID-19 pandemic forced universities all around the globe to shut down their campuses indefinitely and move their instructional activities online. The virus, discovered in China in December 2019, spread throughout the world in no time and was declared a pandemic by the World Health Organization on March 11, 2020. In the spring of 2020, most universities had to close their campuses and move their entire academic programs online [1]. Universities were unprepared for such a change from traditional classroom instruction to entirely online education delivery [2]. However, the enormous technological evolution of the past few decades has proved immensely useful during this pandemic [3, 4]. Online portals were available to support online education [5]. Tools used by institutions to deliver online education during this pandemic include Zoom, Google Meet, Google Classroom, Microsoft Teams, and Google Forms, Docs and Sheets. Despite this, educators and students encountered various administrative, economic, technological and social issues [6, 7].

The pandemic and the ensuing lockdowns had a significantly destructive effect on people’s mental health all over the world [8, 9]. Such mental and psychological issues regularly impede many students from adjusting to online instruction. Moreover, not all students have the same access to, and understanding of, emerging technology. While these differences existed before, the COVID-19 pandemic has brought this digital gap to light [10]. Because of the raging pandemic, [1] was the first to explain how universities moved from classroom-based education to online education. Although many countries had extensive online education infrastructure before the pandemic [11], no university was prepared for a complete transition to online learning. Observational studies have found that students prefer learning in physical classrooms to online sessions [12]. Students miss time with their friends in classrooms and labs, and access to the library [13].

Nevertheless, students feel that online classes helped them continue their studies during the pandemic [11]. Universities are now using novel techniques and strategies to ensure that their students continue to receive a quality education [14]. According to [15], the digital divide existed before the pandemic, but it has since escalated. Online education requires consistent access to digital technology. Students with limited access to new technologies, and those unfamiliar with them, have difficulty adjusting to online education. Moreover, some students live in remote areas and have difficulty attending online classes from their homes [13]. Furthermore, because of the unanticipated transformation from traditional face-to-face on-campus courses to online learning, students, teachers and educational institutions face many challenges. Among these, the most critical are implementing high-quality online education systems, adopting the latest online learning technologies, and providing high-quality education. Because of these numerous difficulties, students’ academic performance was negatively affected and, subsequently, their grades declined [16]. Therefore, a question arises: what needs to be done to enhance the satisfaction level and academic performance of students with the help of online education systems?

Before analyzing the essential components of online learning that can elevate students’ satisfaction and academic performance, it is necessary first to understand the fundamental theory of e-learning and its various constituents. Online education is a system that imparts education with the help of the internet using laptops, smartphones, desktops, tablets, etc. [17]. Governments worldwide are making every effort to encourage the use of the latest technology in online education systems [18]. Online education is also believed to have many advantages: it saves time, supports interpersonal communication, enhances learning efficiency, delivers advanced education and learning, provides authentic information, saves cost, supports flexible location choice, and minimizes time-based problems associated with on-campus learning [19–21]. Thus, it is evident from these advantages that e-learning is effective and advantageous for the health of learners, instructors, and relevant staff in the time of the COVID-19 pandemic.

Accordingly, many researchers have made significant contributions by creating different theoretical concepts and developing various models in the context of information systems to ascertain and explain the attitudes and behaviors of users toward the relevant technology. The essential models found in the information systems literature are the Theory of Reasoned Action (TRA) [22], Theory of Planned Behavior (TPB) [23], Technology Acceptance Model (TAM) [24], Task Technology Fit (TTF) model [25], DeLone and McLean Model of Information Systems Success (DMISM) [26, 27] and Unified Theory of Acceptance and Use of Technology (UTAUT) model [28]. However, the assessment of information systems utilization has been largely neglected in these models and their corresponding theoretical conceptions [29], with the DMISM as an exception: it determines the usage of information technology by examining the influence of overall quality on user satisfaction, actual system usage and performance. It is therefore widely used to evaluate the efficiency and effectiveness of information systems [30].

Thus, in the majority of research on online learning, variables have been drawn from these models to assess the effectiveness of online learning and its numerous established frameworks. Moreover, multiple factors affect satisfaction and academic performance in online learning. Among these, institutional factors play a significant role in enhancing students’ performance. According to [31], the institutional environment and infrastructure significantly affect students’ performance. According to [32], reducing class size could improve learning, while sufficient research equipment and teaching content could significantly enhance students’ performance. [33] asserted that an improved physical environment offers comfort, security, and a better knowledge of courses and is impactful in the form of higher learning and performance. Another factor recognized as significant in online learning is cognitive absorption, an individual characteristic identified by several experts as important in using and being influenced by technology. In recent Malaysian research on digital libraries [34], a significant positive association between cognitive absorption and user satisfaction was observed. Likewise, in Spain, [35] observed that cognitive absorption had a substantial positive link with perceived usefulness, perceived ease of use, and user satisfaction. [36] examined the influence of online learning platforms on user satisfaction through cognitive and emotional engagement. Similarly, [37] determined the influence emotions have on online learning. [38] evaluated the factors that affect online learning utilization, such as effort expectancy, social influence, facilitating conditions, and performance expectancy.

Similarly, [39] determined the effect of compatibility and task technology fit as mediators on the usage of e-learning with the help of the Information System Success Model (ISSM). Hence, researchers have analyzed the association of online learning with numerous variables. Yet certain gaps remain in the literature that need to be investigated; for example, only limited studies on online learning have analyzed the impact of institutional factors on performance impact [40–42]. To date, limited studies have examined how institutions worldwide coped with the COVID-19 outbreak and prepared for it. A considerable number of assessments and forecasts have been published on the likely influence of COVID-19 on higher education. However, there have been few quantitative published studies on this subject, and research aimed at examining how online distance learning implemented during the COVID-19 lockdown has influenced the teaching approach in higher education is still developing. Moreover, minimal research has compared student performance in online distance learning during lockdown with prior learning in face-to-face classes. Likewise, contradictory findings have been reported on user satisfaction, actual system usage and performance impact. Studies that advocate the existence of this relationship include [43–45], yet studies such as [46, 47] found no such association between user satisfaction and performance impact. This contradiction provides further motivation to examine the significance, or insignificance, of the effect of user satisfaction on actual system usage and performance. Besides, the presence and contribution of mediators and moderators in online learning models have been only slightly addressed; for instance, the contribution of human attributes such as cognitive absorption is scarcely described.

Hence, specific research questions are formulated by examining the issues encountered by the online education system during the COVID-19 pandemic and the gaps observed in the prior literature. They are: (a) By what means may task technology fit (i.e., the fit between students’ academic tasks and the respective e-learning system) influence institutional factors and user satisfaction to enhance the performance of learners during COVID-19? Likewise, (b) By what means may actual usage (i.e., the frequency with which learners use the online education system) influence institutional factors and user satisfaction to enhance the performance of learners during COVID-19? (c) Can cognitive absorption of learners elevate their satisfaction with the quality of the online education system?

Thus, a research framework has been established in this study by consolidating the TTF and DMISM models and including two essential components: institutional factors, derived from the facilitating conditions variable of the UTAUT model, and cognitive absorption, derived from the perceived ease of use and perceived usefulness variables in the TAM model. In the TTF model, the significant construct is task technology fit and its relationship with performance impact, but the model neglects its association with the constructs of overall quality, user satisfaction, and actual system usage. On the other hand, the DMISM specifies the relationships among the overall quality, user satisfaction, actual usage, and performance impact constructs but neglects the task technology fit construct present in the TTF model. The UTAUT model aims to describe user intentions to use information systems and the associated usage behavior. Finally, the Technology Acceptance Model (TAM) describes how users come to accept and utilize a particular technology. Based on the literature discussed above, six variables were selected to constitute an online learning framework, where institutional factors served as the independent variable; user satisfaction, actual usage, task technology fit and performance impact acted as dependent variables; and cognitive absorption was used as a moderator between institutional factors and user satisfaction.

Furthermore, the association between institutional factors and performance impact is analyzed in two ways: via serial mediation of user satisfaction and actual usage, and via serial mediation of user satisfaction and task technology fit, while cognitive absorption acts as a moderator on the association between institutional factors and user satisfaction. This research contributes to the literature by providing empirical evidence on the causal influence of online education on student academic performance during the COVID-19 crisis using data from top universities in Punjab, Pakistan. Moreover, this research has analyzed students’ online learning satisfaction and performance after the COVID-19 outbreak caused an abrupt transition from face-to-face to online learning. Furthermore, we contribute to the empirical evidence on the influence of instructional models on student academic performance in online learning during COVID-19. This research also examined various factors at the institutional level that have a considerable impact on students’ satisfaction and academic performance in the online learning system. Finally, this research is important because it offers guidance to practitioners on how students’ academic performance can be enhanced by the use of online learning during the COVID-19 pandemic.

2. Literature review and hypotheses development

2.1. Literature review

There has been an enormous increase in the number of students acquiring distance education in the past few years, especially online education for their higher education. According to research by [48], the most significant requirement for successfully implementing distance education is to achieve student satisfaction. Therefore, given the importance of students’ satisfaction with online education, determining the factors that elevate student satisfaction has become a primary concern of institutions. Among many factors, institutional factors are considered quite significant for enhancing student satisfaction. Institutional factors consist of assistance plans or specifications delegated as standards, conventions, or principles for student engagement to fulfill the requirements set up for graduation [49]. As institutions devote their earnest efforts to the success of their online education programs, they should prioritize and give serious consideration to stabilizing and enhancing their support services, the weakening of which may become an obstacle to attaining their desired goals. This study considers four important institutional factors: institutional support, administrative support, instructional support, and technical support. As [50] recommended, an effective distance education system needs a considerable extent of institutional support directed towards enhancing distance learning and education quality. Technical support is regarded as one of the significant components, and this support is provided by highly skilled and trained people who are hardware and software experts [51]. It is the technical assistance provided to students while they use the distance learning technology. Another critical institutional factor is administrative support, which refers to services such as online student registration, record maintenance, online class scheduling, provision of course-relevant information, training, etc. [52]. Instructional support refers to the instructor’s guidance for learning, which comprises answering student queries, rectifying students’ misconceptions, delivering clear content, and providing productive feedback to learners on their class performance and assignments.

In the context of online learning, user satisfaction is described as the degree to which online learning students realize satisfaction in their sole decision to depend on such services and how effectively these services fulfill their requirements [35, 53, 54]. With reference to this study, TTF is defined as the extent to which an online learning system corresponds to users’ interests, fits their tasks, and fulfills their requirements. It also refers to the extent to which technology facilitates users in accomplishing their coursework or online learning activities. Actual usage is the extent of frequent usage of technology and the number of times it has been used [55]. Performance impact is defined as the degree to which using a system leads to the enhancement of work quality by assisting users to accomplish their tasks rapidly, enhancing job conduct, eradicating errors and improving job efficacy [56, 57]. In this research, performance impact refers to the extent to which an online learning system influences students’ academic performance in relation to asset reservation, proficiency, productiveness and knowledge attainment [58]. Cognitive absorption is defined as the extent to which users consider the system useful and are eager to use it again [44]. It also refers to a state in which the user is deeply involved with the software, or a holistic experience of the student with IT, i.e., the comprehensive experience of an individual with information technology as he or she uses the internet and online learning, in relation to attention, time and enjoyment [59]. Cognitive absorption denotes a kind of intrinsic motivation, in which “the behavior is enacted for itself, to come across the inherent contentment and satisfaction”.

In research by [60], it was observed that student satisfaction is affected by a positive attitude towards technology and an independent learning approach. [61] analyzed the association of perceived level of presence, perceived usefulness and ease of use with student satisfaction and persistence using data from an online university in South Korea. Their research highlighted that cognitive presence, teaching presence, and perceived usefulness and ease of use are essential components for achieving student satisfaction. [62] recommended that all universities have a flexible institutional structure to incorporate online learning technology and enhance learning effects. So, apart from other factors that affect student satisfaction and performance, institutional factors are also significant; they comprise areas that impact student retention, and these may or may not be changeable by the institution [63]. Therefore, institutional factors have a significant impact on elevating satisfaction with distance education, as has been discussed in detail, and many institutional factors may affect students’ understanding as well. Instructors and instructional designers must arrange a favorable learning environment [64]. The support provided to students proves to be one of the essential elements that affect student success in online learning [65, 66]. Research by [67] analyzed the acceptance factors of online education systems and observed that cognitive absorption has a positive impact on perceived ease of use and perceived usefulness. Moreover, in the context of information technology, cognitive absorption was observed to have an important influence on user satisfaction in regular website usage [68], in the usage of online learning systems [35, 69, 70] and in the usage of social networking websites such as Facebook [71]. [34] found that cognitive absorption positively influences user satisfaction.

Higher education institutions and governments are trying to introduce online learning around the world, as analyzed by [19]. In addition, [72] presented their model as significant in making e-learning more beneficial and determining its success. The results demonstrated that the strength of e-learning positively influenced individuals’ performance and satisfaction. [73] observed that the technology characteristics and task characteristics of massive open online courses positively predicted task technology fit. Apart from this, perceived relatedness, perceived competence and social recognition remarkably determine students’ behavioral intention. Furthermore, this research also found that cognitive absorption acts as a moderator on user satisfaction, where cognitive absorption had a significant positive impact on student satisfaction [19].

2.2. Hypotheses development

2.2.1. Institutional factors and user satisfaction.

According to previous studies, institutional factors substantially influence learner satisfaction. A research study carried out in the UK examined the impact of institutional factors on learner satisfaction in online learning; it was observed that the quality of teaching and learning is an essential component in postgraduate student retention and satisfaction [74]. Research by [75] proposed that as the intensity of institutional support services in online learning increases, the extent of student satisfaction also increases. Other studies also endorse that institutional factors impact user satisfaction [76–78]. Therefore, it can be inferred that institutional factors significantly intensify student satisfaction in online learning.

Various studies have recognized technical support as a significant component leading to student satisfaction. Research findings by [79] show that students’ willingness to accept or refuse an information system is significantly affected by the quality of technical support. If users face any issue with the online learning system and receive practically no assistance from support, students perceive that using that particular online learning system is simply a waste of time, and hence they may stop using it. So, the more efficient the technical support, the more satisfied students will be with the system. Research by [80] emphasizes that support from the online system administration plays a significant role in user satisfaction with technology usage. Instructional support is essentially provided by the course instructor; however, technology can be utilized to deliver support to students individually and improve their instructional environment [81]. Research by [82] indicates that instructional support significantly impacts student satisfaction.

User satisfaction is an essential factor in IS practice and a significant success element in adopting a new system [30, 83]. The present research suggests that the higher the level of support provided by the institutional factors of the online learning system, the higher the satisfaction level of the students will be. The more these technologies help students accomplish their assigned online tasks, the more they will evolve into prerequisites for accomplishing their educational commitments. Therefore, it can be hypothesized that:

  1. H0: Institutional factors do not have a significant influence on user satisfaction.
  2. H1: Institutional factors have a significant influence on user satisfaction.

2.2.2 User satisfaction and task technology fit.

Various studies have affirmed the significant role of user satisfaction in IS practices under different conditions and technological applications. Task technology fit is perceived as a highly important component when examining technology applications in various organizations [84]. Many studies have analyzed the association between task technology fit and user satisfaction and found that a substantial direct interrelationship is present between these variables [42, 57, 84–89]. Hence, in this study, it is presumed that user satisfaction has a significant positive influence on task technology fit, given the presumption that the higher the quality of the technological system used in online learning, the higher the students’ satisfaction will be, and the more they will acknowledge the technology as essentially fit to fulfill their online pedagogical activities.

  1. H0: User satisfaction does not positively predict task technology fit
  2. H2: User satisfaction positively predicts task technology fit

2.2.3 Task technology fit and performance impact.

In the past few years, vigorous progress has been observed in technology and the inclusion of several novel systems; particular emphasis is directed towards how the usage of a technological system elevates users’ performance so as to assess system productivity [30, 90, 91]. Numerous research works in the literature have empirically assessed the relationship between task technology fit and performance impact and observed that task technology fit has a significant positive influence on performance impact [42, 43, 57, 84–89]. It has been further shown that students’ performance, in terms of proficiency and productivity, is enhanced due to task technology fit [92].

Therefore, the following hypothesis is deduced:

  1. H0: Task technology fit does not positively predict performance impact.
  2. H3: Task technology fit positively predicts performance impact.

2.2.4 User satisfaction and actual usage.

Another important component in technology-oriented studies is knowledge of technology utilization by users. Many research works have investigated the relationship between user satisfaction and actual system usage, concluding that the user satisfaction construct has a considerable impact on actual system usage [93–96]. In fact, the time span of technology utilization by users is extended by user satisfaction [90]. The key element in this supposition is also user satisfaction, since when the satisfaction level of the user is high, the actual system usage will also be enhanced. Hence, it can be hypothesized that:

  1. H0: User satisfaction does not positively predict actual usage of the system
  2. H4: User satisfaction positively predicts actual usage of the system

2.2.5 Actual usage and performance impact.

The relationship between actual usage of the system and performance impact is another critical aspect in the frame of reference of technology utilization [97]. A few studies have tried to minimize this gap by doing considerable work on the relationship between actual system usage and performance impact [57, 98]. A quantitative study by [95] found that actual system usage has an important effect on performance. Moreover, research work centered on information systems has emphasized that actual usage of a system has a positive influence on the performance construct [54, 84, 98–102]. Hence, this relationship illustrates that the more persistently students use the online learning system to complete their academic tasks, the more it will contribute to their improved academic performance.

So, it is hypothesized that:

  1. H0: Actual usage of the system does not positively predict the performance impact
  2. H5: Actual usage of the system positively predicts performance impact

2.2.6 Mediating role of user satisfaction.

User satisfaction is favorably influenced by institutional factors [75, 77, 78], and user satisfaction has a considerable positive influence on task technology fit [39]. Hence, it is advocated that institutional factors positively impact task technology fit through user satisfaction. Besides, it is determined from the previously discussed literature that the time span of technological system utilization is extended by user satisfaction [58], and user satisfaction is correspondingly affected by institutional factors [19, 102].

Based on these arguments, the following hypotheses are proposed:

  1. H0: Institutional factors do not positively predict task technology fit through the mediating role of user satisfaction
  2. H2a: Institutional factors positively predict task technology fit through the mediating role of user satisfaction
  3. H0: Institutional factors do not positively predict actual usage of the system through the mediating role of user satisfaction
  4. H4a: Institutional factors positively predict actual usage of the system through the mediating role of user satisfaction

2.2.7 Mediating role of task technology fit.

It has been observed that as the potency of institutional support services increases, the intensity of student satisfaction is also enhanced, leading to task technology fit in sequence. According to [90], user satisfaction is an essential factor that significantly impacts users’ performance. User satisfaction and task technology fit are the vital components of this presumption. Moreover, as [43] observed, an important association exists between task technology fit and performance impact; this relationship was empirically verified, and it was deduced that task technology fit positively influences performance impact [43]. Indeed, students’ academic performance, in terms of efficacy and productivity, is enhanced due to task technology fit [103], and the technology fit fulfills students’ needs when they are highly satisfied, as analyzed by [39], which in turn results from a higher level of institutional factors.

Hence, it is deduced that:

  1. H0: Institutional factors do not positively predict performance impact through the mediating role of user satisfaction and task technology fit in the sequence.
  2. H3a: Institutional factors positively predict performance impact through the mediating role of user satisfaction and task technology fit in the sequence.

2.2.8 Mediating role of actual usage.

It was analyzed that an increase in the level of institutional factors of the system leads to an elevated satisfaction level among students and has an indirect impact on actual usage of the system through user satisfaction. User satisfaction and actual usage are the vital elements in this supposition [58]. Moreover, past research also suggests that actual usage of the system has a significant effect on performance [84], and actual system usage is itself affected by user satisfaction [58]. User satisfaction is essentially influenced by institutional factors [104]. Therefore, institutional factors affect performance impact through user satisfaction and actual system usage in sequence. Therefore,

  1. H0: Institutional factors do not positively influence performance impact through the mediating role of user satisfaction and actual usage of the system
  2. H5a: Institutional factors positively influence performance impact through the mediating role of user satisfaction and actual usage of the system

2.2.9 The moderating role of cognitive absorption.

Cognitive absorption comprises three discrete measures: inherent interest, curiosity and observation center [105]. Cognitive absorption has been found to positively influence the online learning system’s perceived usefulness [106] and perceived ease of use [67, 107]. Cognitive absorption is a kind of intrinsic motivation [108]. Users immerse themselves in the usage of a particular IT system for their personal objectives and feel a sense of joy and pleasure when they use it. If an online education system satisfies this intrinsic motivation and imparts a feeling of enjoyment to the students, they will keep using it and spend more time on it.

Hence, satisfaction throughout the usage of the online learning system arises from cognitive absorption, which enables the students to fully concentrate and immerse themselves in their educational activities. A research study by [109] highlights that cognitive absorption is positively associated with user satisfaction. In this research, cognitive absorption reflects how students feel completely involved with online learning, using it with full enjoyment and attention. Numerous studies emphasize that cognitive absorption is a significant factor in user satisfaction [19, 69, 71, 110]. Moreover, research studies by [34, 35, 111] determined that cognitive absorption has an essential impact on user satisfaction. Therefore, it can be inferred that cognitive absorption significantly influences user satisfaction in online learning, i.e., the greater the level of cognitive absorption of the students using the online learning system, the higher their satisfaction level will be. Thus, it is hypothesized that:

  1. H0: Cognitive absorption does not moderate the relationship between Institutional factors and User Satisfaction.
  2. H6: Cognitive absorption moderates the relationship between Institutional factors and User Satisfaction.

Based on these hypotheses, Fig 1 depicts the relationships among the variables of the study.

3. Methodology

3.1 Ethics statement

The Institution’s Ethical Evaluation Committee (NML-ERC/2021-02) provided ethical review and approval for this study on human subjects. Furthermore, the respondents gave their written informed consent to take part in the research.

3.2 Research design

Structural Equation Modelling (SEM), a multivariate statistical analysis technique, was used to analyze the structural associations and empirically test the formulated hypotheses with the help of Analysis of Moment Structures (AMOS®) 24 software. SEM comprises two components: the first, Confirmatory Factor Analysis (CFA), is utilized to assess the measurement model relating the observed and latent variables; the second, Path Analysis (PA), is used to fit the structural model among the latent variables. CFA is used for testing the validity of indicators, whereas path analysis signifies the manner in which a particular latent variable directly or indirectly causes a change in another latent variable. This two-step approach ensures that the structural model uses only those constructs that possess suitable measures. Goodness-of-fit indices were estimated in SEM between the conceptual model and the sample data. Three measures were used to assess the goodness of fit of the measurement model and the structural model: the relative chi-square ratio over the degrees of freedom (χ2/DF), the Goodness of Fit Index (GFI) and the Root Mean Square Error of Approximation (RMSEA).
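
The two-step procedure described above can be illustrated with a short, hedged sketch. The study itself used AMOS, so the snippet below is only an analogue in the open-source Python package semopy; the construct and item codes (IF1, US1, etc.) and the data file name are placeholders, not the study’s actual identifiers.

```python
# Hedged sketch of the measurement + structural model in semopy (lavaan-like syntax).
# AMOS was the actual tool used in the study; this is an illustrative analogue only.
import pandas as pd
from semopy import Model, calc_stats

model_desc = """
# Measurement model (CFA step): latent =~ indicators
IF  =~ IF1 + IF2 + IF3
US  =~ US1 + US2 + US3
TTF =~ TTF1 + TTF2 + TTF3
AU  =~ AU1 + AU2
PI  =~ PI1 + PI2 + PI3

# Structural model (path-analysis step): regressions among latent variables
US  ~ IF
TTF ~ US
AU  ~ US
PI  ~ TTF + AU
"""

data = pd.read_csv("survey_responses.csv")   # hypothetical item-level data file
model = Model(model_desc)
model.fit(data)
print(calc_stats(model).T)                   # fit indices such as chi2/df, GFI, RMSEA
```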

3.3 Sample and procedure

The population for this research study comprises students of the top ten public sector universities and top ten private universities in Punjab, as ranked by HEC Pakistan. The students’ ages ranged from 20 to 40 years. They were enrolled in bachelor’s, master’s or Ph.D. degrees or other diploma courses at the universities. Quota sampling was used to obtain data from the students. The data were gathered by distributing a self-administered questionnaire in hard copy, forwarding the questionnaire to students at various universities via email, and posting the questionnaire link on the enlisted universities’ Facebook pages. A Google Forms questionnaire was used as the tool for online data collection. Responses were only requested from students studying through some online mode, which was ensured by a screening question. A total of 1000 questionnaires were distributed among the students. Of these, 416 responses were received, and 404 were selected for further analysis, excluding 12 due to incomplete or improper information. Therefore, the usable response rate of the data collection was 40.40%.

The first section of the survey was about demographics. Table 1 shows that 60 percent of respondents were male. Moreover, 12.10% of the students were under 21 years, 79% were in the 21–30 age group, and 8.10% were in the 31–40 age group. Furthermore, the universities’ percentile responses ranged from 5.0% to 12.90%. Intermediate students made up 0.7 percent of the respondents, bachelor’s students 53.20%, master’s students 32.90%, MPhil students 8.70%, Ph.D. students 2.70%, and other students 1.50% of the overall responses.

Furthermore, as [112] suggested, an assessment of common method variance (CMV) was carried out. CMV might exist when all the scale items are measured with one questionnaire and the whole data set is gathered at one time. CMV distorts the relationship between two constructs; to put it another way, CMV introduces systematic covariance beyond the real relationship among the scale items. The ambiguity of scale items, respondents’ inability to examine the research topic, double-barreled research items, low respondent involvement in the topic, the arrangement of the scale items, the respondents’ tendency to provide extreme responses, and so on are all sources of CMV. Consequently, the altered values of the observed correlations and related indicators can lead to incorrect estimates of convergent validity and reliability in the research.

CMV can be handled in two ways: by using procedural remedies or by using different statistical techniques. The most efficient way to reduce CMV using procedural remedies is to identify the resemblance between predictor and criterion variable measurements and then limit or eliminate it. To avoid CMV, researchers apply procedural remedies in the early stages of questionnaire development. A common procedural remedy is using more than one information source to obtain data for the model’s constituents. Another approach is to use clear, concise, and suitable language to avoid misinterpretation of scale items by participants and reduce the possibility of undesired results. Researchers that use procedural remedies may indeed be able to minimize, if not eliminate, the possible effects of CMV on their research results. Otherwise, they may choose one of the statistical techniques available, of which a wide variety exists for detecting CMV. The most widely used of these is Harman’s single-factor test, also known as Harman’s one-factor test [113]. In this procedure, the researcher enters the scale items into a single exploratory factor analysis and then examines the unrotated factor solution to count the components with eigenvalues larger than one and the cumulative variance they represent. It is presumed that if CMV exists, then just one component is responsible for more than half of the covariance amongst the scale items. All 37 items in this study were subjected to a single exploratory factor analysis using this approach, and the unrotated single-factor solution accounted for only 28.79 percent of the total variance. As a result, the CMV check confirmed that the sample data do not suffer from CMV bias.
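
For illustration, the Harman's single-factor check described above can be approximated with a few lines of Python. This is a hedged sketch, not the study's actual analysis: it uses the first unrotated principal component of the item correlation matrix as the single factor, and the data file and column layout are assumptions.

```python
# Hedged sketch of Harman's single-factor test, assuming the 37 Likert items
# are columns of a pandas DataFrame (file name is a placeholder).
import numpy as np
import pandas as pd

items = pd.read_csv("survey_responses.csv")          # hypothetical file with 37 item columns
corr = items.corr().to_numpy()

eigenvalues = np.linalg.eigvalsh(corr)[::-1]          # sorted largest first
first_factor_share = eigenvalues[0] / eigenvalues.sum()

print(f"Variance explained by first unrotated factor: {first_factor_share:.2%}")
# CMV is flagged only if a single factor explains the majority (>50%) of the
# covariance; the study reports 28.79%, i.e., no serious CMV bias.
```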

3.4 Measurement scale

Previously established measurement scales were utilized in the current study for data collection, as presented in S1 Appendix. Each scale item was assessed on a seven-point Likert scale (1 = Strongly Disagree and 7 = Strongly Agree). The scale for institutional factors was adopted from [76, 114]. It has 13 items and a reported Cronbach’s alpha of 0.926. A sample item of the scale is “I knew where to ask for help when I had any technical issues”. User satisfaction has 3 items taken from [19]. The alpha value for this scale is 0.915. A sample item is “My decision to use online learning was a wise one”. Task technology fit was adopted from [90]. It has 3 items with a reported alpha value of 0.911, and a sample item is “Online learning fits with the way I like to learn and study.” Actual usage was acquired from [39]; it has 2 items, and the alpha value for this scale is 0.818. A sample item is “On average, how much time do you spend per week using the online learning?”. Performance impact was adopted from [19]. This scale has an alpha value of 0.959 with 10 items, and a sample item is “online learning helps me accomplish my tasks more quickly”. Cognitive absorption was taken from [59]. This scale has 6 items, and a sample item is “While on the Web, I am immersed in the task I am performing.”, and it has an alpha value of 0.893. The overall summary of all the items is reported in Table 2.

Table 2. Measurement scales and corresponding references for all the constructs.

https://doi.org/10.1371/journal.pone.0269609.t002

4. Data analysis and results

Analysis of Moment Structures (AMOS), a highly efficient software package, was employed in the present study for performing data analysis. It is statistical software that uses innovative mechanisms for conducting structural equation modeling (SEM). It has an easy-to-operate graphical interface, produces an explicit research model for researchers, creates quality illustrations for presentation in publications, and calculates reliable numerical estimates.

4.1. Descriptive analysis

The values measured for descriptive analysis for Institutional Factors, User Satisfaction, Cognitive Absorption, Task Technology Fit, Actual Usage, and Performance Impact are shown in Table 3. Institutional Factors had a mean of 4.8 and a standard deviation of 1.4; User Satisfaction had a mean of 4.7 and a standard deviation of 1.4. The mean of Cognitive Absorption was found to be 4.7 with a standard deviation of 1.2. Moreover, the mean for Task Technology Fit was 3.9, and the standard deviation was 1.8. For Actual Usage, the mean value was 4.7, and the standard deviation was 1.5. Lastly, the mean value for Performance Impact was 4.8, and the standard deviation was 1.6. This indicates that the coefficient of variation (CV = Std Dev/Mean) is not very large and there is not much dispersion in the data, which suggests that the responses obtained are reliable.

As reported in Table 3, the skewness values indicate that the data follow an approximately normal distribution, because the values of skewness lie between −3 and +3 and kurtosis lies between −10 and +10, which is acceptable when applying structural equation modeling (SEM) [116].
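
As a hedged sketch, the descriptive statistics and the skewness/kurtosis screening reported in Table 3 could be reproduced along the following lines, assuming construct-level scores are stored in a CSV file with placeholder column names.

```python
# Hedged sketch of the Table 3 descriptives and normality screen; column names
# and the data file are placeholders, not the study's actual variable codes.
import pandas as pd

scores = pd.read_csv("construct_scores.csv")          # hypothetical construct-level data

summary = pd.DataFrame({
    "mean": scores.mean(),
    "std": scores.std(),
    "skewness": scores.skew(),
    "kurtosis": scores.kurt(),                        # pandas reports excess kurtosis
})
# Screening rule stated above: |skewness| <= 3 and |kurtosis| <= 10 for SEM
summary["normal_enough"] = summary["skewness"].abs().le(3) & summary["kurtosis"].abs().le(10)
print(summary.round(2))
```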

4.2. Measurement model

The measurement model represents the comprehensive or implied model that relates latent variables to their corresponding indicators. It helps us determine the goodness of the measures of the constructs of interest in our hypothesized model. This, in turn, allows us to assess how efficiently we can evaluate the theories, correct the errors, and estimate the overall fit of these models. Confirmatory Factor Analysis (CFA) is an important method widely used to confirm a theoretical measurement model. It illustrates the interrelation between the unobserved (latent) variables and the observed variables or indicators. Moreover, CFA is a statistical approach founded on concepts that account for measurement error and assess the unidimensionality of the model, and so it is a recommended step in data analysis. The measurement model was assessed using construct validity and reliability. Therefore, CFA was carried out by analyzing the factor structure of the variables to evaluate the validity of the six measures, as displayed in Fig 2.

Structural Equation Modeling (SEM), as used in this study, is a highly efficient data analysis technique; few other techniques can give more precise parameter estimates, presuming multivariate normal data. SEM attempts to justify the applicability of a given hypothesis by assessing the impact of mediators on the relationship between an independent variable and a dependent variable. SEM was also used to examine the role of controls and moderators. Three attributes distinguish SEM models: the evaluation of many interconnected dependency relationships, the ability to describe unobserved concepts in these associations and to correct for measurement error in the estimation process, and the creation of a model to describe the complete set of relationships. The major assumptions in applying structural equation modeling are: normality of the acquired data; no systematic missing data; a sufficiently large sample size; and correct model specification. These assumptions were checked before the application of SEM. For example, normality was assessed through skewness and kurtosis (see Table 3). Similarly, the sample size of 404 employed in this study is greater than the minimum threshold of 200 cases typically used in SEM studies, as recommended by [117]. Finally, we employed the Ramsey Regression Specification Error Test (RESET), whose null hypothesis is correct specification [118]. The p-value of the F statistic was found to be 11.45% > 5%, indicating that the functional form of the model is correct and does not suffer from omitted variables. Furthermore, in this study, the maximum likelihood (ML) estimation method was used, which holds that if all the items load significantly on their corresponding factors, then unidimensionality is present for the constructs and hence validity is exhibited. Fig 2 represents the measurement model, and composite reliability estimates for scale reliability and the results of CFA are given in Table 4.
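
The specification check mentioned above (the Ramsey RESET test) can be illustrated with a small sketch using statsmodels; the composite-score regression, column names and file name below are assumptions made for illustration, not the exact model the authors estimated.

```python
# Hedged sketch of a RESET specification check on an auxiliary composite-score
# regression (PI on the other constructs); names and file are placeholders.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import linear_reset

df = pd.read_csv("construct_scores.csv")                      # hypothetical data
X = sm.add_constant(df[["IF", "US", "TTF", "AU"]])
ols_res = sm.OLS(df["PI"], X).fit()

# H0: the model is correctly specified (no omitted nonlinear terms)
reset = linear_reset(ols_res, power=2, use_f=True)
print(f"RESET F p-value: {reset.pvalue:.4f}")                 # > 0.05 => functional form acceptable
```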

4.2.1 Model fit.

Fit in research signifies the capability of a model to describe the data. Specifically, in CFA, model fit demonstrates how closely the observed data reproduce the interrelations suggested in the theoretical model. A model with a good fit is well suited to the data; the aim is to evaluate whether the model fits adequately in conformance with the data or not. Consequently, the goodness-of-fit of the model in relation to the data was estimated by applying numerous tests, and based on the goodness-of-fit indices, the model is affirmed or rejected.

The chi-square (χ2) statistic denotes a test that estimates the correspondence of the hypothesized model to the observed data. It is used to assess the overall model fit and the discrepancy between the sample and fitted covariance matrices. But since χ2 is sensitive to sample size, it is not employed alone to assess model fit. Instead, we calculate χ2/DF: a ratio ≤ 3 denotes an acceptable fit [119], and a value ≤ 5 represents a reasonable fit [120]. The model observed in the present study has a normed chi-square (χ2/DF) = (936.179/445) = 2.105 (<3.00), indicating a satisfactory fit. The goodness of fit index (GFI) is a metric for comparing the fit between the proposed model and the measured covariance matrix. According to [121], the Goodness-of-Fit Index (GFI ≤ 1) enumerates the fraction of variance accounted for by the estimated population covariance. A value equal to 1 is regarded as a perfect fit; however, as the sample size increases, the GFI value is also expected to increase. A GFI value > 0.95 is considered a good fit, and a GFI value < 0.65 is interpreted as a sustainable fit. The value of GFI obtained is 0.872, which represents an acceptable model fit. In covariance structure modeling, the root mean square error of approximation (RMSEA) and the standardized root mean square residual (SRMR) are regarded as the most significant constituents. RMSEA measures the difference, per degree of freedom, between the observed covariance matrix and the covariance matrix predicted by the model, whereas the SRMR is an empirical measure of fit specified as the standardized difference between the actual and estimated correlations. An RMSEA value < 0.05 is considered a good fit, a value between 0.08 and 0.10 is regarded as an average fit, and a value greater than 0.10 denotes a poor fit, whereas SRMR < 0.09 represents a good model fit [122]. The values obtained, RMSEA = 0.052 and standardized RMR = 0.0319, illustrate reasonable unidimensionality of the constructs.
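
For reference, the two headline indices reported above follow the standard formulas below; the RMSEA expression is the commonly used one (software packages differ slightly in using N or N − 1), and with the study's own χ2 = 936.179, df = 445 and N = 404 it reproduces the reported value of roughly 0.052.

```latex
\frac{\chi^2}{df} = \frac{936.179}{445} \approx 2.10, \qquad
\mathrm{RMSEA} = \sqrt{\max\!\left(\frac{\chi^2 - df}{df\,(N-1)},\, 0\right)}
              = \sqrt{\frac{936.179 - 445}{445 \times 403}} \approx 0.052
```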

4.2.2 Reliability of the variables.

Reliability in research is described as the ability of the research methods to yield consistent and uniform results. In this research, Cronbach’s alpha has been used to determine internal consistency reliability, i.e., the degree to which a set of items is related. The value of Cronbach’s alpha falls between 0 and 1, and a higher value denotes higher internal consistency. The Cronbach’s alpha values for the specified measures, tabulated in S2 Appendix, are higher than 0.70, the threshold value that denotes adequate reliability for the measures used in this study [123].
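
As a hedged illustration of the internal-consistency check described here, Cronbach's alpha for any one scale can be computed directly from its item scores; the item column names and data file below are placeholders.

```python
# Hedged sketch of Cronbach's alpha for one scale; item names are placeholders.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

df = pd.read_csv("survey_responses.csv")                       # hypothetical item-level data
print(round(cronbach_alpha(df[["US1", "US2", "US3"]]), 3))     # e.g., the User Satisfaction items
# Values above 0.70 are taken as adequate reliability, as stated above.
```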

4.2.3 Construct validity.

Validity is referred to as “the integrity or adequacy of a test or device in quantifying what it is devised to measure” [124]. In the current research, construct validity was assessed by verifying face validity, convergent validity, and discriminant validity. As the measurement items were obtained from previous research, face validity is justified. Convergent validity was estimated using the average variance extracted (AVE) and indicator reliability, i.e., the extent to which a measure is positively correlated with alternative measures of the same construct.

For estimating the reliability of the indicators, we employed factor loadings. Constructs with high loadings indicate that the associated indicators have much in common, as captured by the construct [125]. Factor loadings with a value greater than 0.50 were considered highly significant [125]. It was found that all the items were significant (p < 0.001) and the loadings were higher than 0.5 (as illustrated in S2 Appendix), which shows that the items satisfied all the conditions. Furthermore, all AVE values were greater than the recommended value of 0.50 [125]. Convergent validity was thus obtained for all constructs, and satisfactory convergent validity is exhibited in Table 4.
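
A small sketch of how composite reliability (CR) and AVE follow from standardized loadings is given below; the loading values are illustrative placeholders, not the estimates reported in Table 4.

```python
# Hedged sketch of CR and AVE from standardized CFA loadings (values are illustrative).
import numpy as np

def composite_reliability(loadings: np.ndarray) -> float:
    errors = 1 - loadings**2                       # error variance of each indicator
    return loadings.sum()**2 / (loadings.sum()**2 + errors.sum())

def average_variance_extracted(loadings: np.ndarray) -> float:
    return float(np.mean(loadings**2))

us_loadings = np.array([0.88, 0.90, 0.86])         # hypothetical standardized loadings
print(f"CR  = {composite_reliability(us_loadings):.3f}")        # should exceed 0.70
print(f"AVE = {average_variance_extracted(us_loadings):.3f}")   # should exceed 0.50
```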

Discriminant validity refers to the degree to which items distinguish across constructs, or how well the measurement model assesses distinct concepts; it was verified using the Fornell-Larcker criterion and the heterotrait-monotrait ratio (HTMT). As per the Fornell-Larcker method, the square roots of the AVEs (indicated in the diagonal of Table 4) are higher than the correlations between the constructs (relative row and column values). Hence, the constructs are more strongly linked with their own indicators than with other constructs in the model [124, 126], which specifies the existence of satisfactory discriminant validity [127]. The interrelationships between exogenous constructs are also estimated to be lower than 0.85 [128]. Hence, discriminant validity is achieved for each construct in the model.

The Fornell-Larcker test has also been criticized to some extent. [103] described that it fails to reliably detect the absence of discriminant validity in common research scenarios. Hence, an alternative method was advised based on the multitrait-multimethod matrix: HTMT, the heterotrait-monotrait ratio of correlations. Discriminant validity in the present research is therefore also assessed with the help of HTMT. Discriminant validity is likely to be problematic if the value obtained for HTMT is higher than 0.90 (HTMT0.90) or, more conservatively, 0.85 (HTMT0.85). Since all the values obtained, as shown in Table 5, were smaller than the proposed value of 0.85, discriminant validity is attained.
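
For illustration, the HTMT ratio for one pair of constructs can be computed from item-level correlations as sketched below; the column names and data file are assumptions, and the study's actual HTMT values are those in Table 5.

```python
# Hedged sketch of the HTMT ratio: mean between-construct item correlation divided
# by the geometric mean of the average within-construct item correlations.
import numpy as np
import pandas as pd

def htmt(items_a: pd.DataFrame, items_b: pd.DataFrame) -> float:
    between = np.abs(pd.concat([items_a, items_b], axis=1).corr()
                     .loc[items_a.columns, items_b.columns]).to_numpy().mean()

    def mean_within(items: pd.DataFrame) -> float:
        c = np.abs(items.corr()).to_numpy()
        return c[np.triu_indices_from(c, k=1)].mean()

    return between / np.sqrt(mean_within(items_a) * mean_within(items_b))

df = pd.read_csv("survey_responses.csv")                     # hypothetical item-level data
value = htmt(df[["US1", "US2", "US3"]], df[["TTF1", "TTF2", "TTF3"]])
print(f"HTMT(US, TTF) = {value:.3f}")                         # < 0.85 supports discriminant validity
```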

4.3. Structural model assessment

Structural model assessment is the second mandatory step in structural equation modeling. It is performed after validating the measurement model and illustrates the relationships between the constructs. Thus, the structural model explains the relationships among the variables, containing the particular associations between exogenous variables and corresponding endogenous variables and displaying the connections between constructs. The structural model results help us assess how closely the theory is supported by empirical data, i.e., whether the theory is confirmed empirically [129]. The structural model’s goodness-of-fit corresponded to the CFA measurement model’s goodness-of-fit: χ2/df = 3.14, CFI = 0.938, and RMSEA = 0.073 in the proposed structural model. These fit indices demonstrated a good match between the predicted model and the observed data.

4.4. Path analysis and hypothesis testing

Exogenous variables were studied using path analysis to identify their direct and indirect effects. The path diagram in Fig 3 demonstrates the hypothesized relationships between the constructs established from previous literature. IF is an exogenous variable, while US, AU, TTF and PI are endogenous variables.

The bootstrapping technique tests the indirect effects in the structural models by calculating the beta (β) values, R2, and the respective t-values. Moreover, the p-value is used to determine the presence of the effect [130].
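
A hedged sketch of such a bootstrap, for a single indirect path (institutional factors → user satisfaction → task technology fit) using composite scores, is shown below. Column names and the data file are placeholders, and the two-regression resampling here is a simplification of the latent-variable bootstrap AMOS performs.

```python
# Hedged sketch of a percentile bootstrap for one indirect effect (IF -> US -> TTF).
import numpy as np
import pandas as pd

df = pd.read_csv("construct_scores.csv")      # hypothetical construct-level data
rng = np.random.default_rng(42)

def indirect_effect(d: pd.DataFrame) -> float:
    # a-path: US regressed on IF (simple regression slope)
    Xa = np.column_stack([np.ones(len(d)), d["IF"]])
    a = np.linalg.lstsq(Xa, d["US"], rcond=None)[0][1]
    # b-path: TTF regressed on US, controlling for IF
    Xb = np.column_stack([np.ones(len(d)), d["US"], d["IF"]])
    b = np.linalg.lstsq(Xb, d["TTF"], rcond=None)[0][1]
    return a * b

n = len(df)
boot = np.array([indirect_effect(df.iloc[rng.integers(0, n, n)]) for _ in range(5000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect = {indirect_effect(df):.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
# The indirect effect is taken as significant when the CI excludes zero.
```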

The structural model estimation specifies the hypothesis tests, as shown in Fig 3 and Table 6. The R2 values (squared multiple correlations) for US, AU, TTF and PI were found to be 0.055, 0.197, 0.363 and 0.314, respectively. The test results supported the six alternative hypotheses developed for this study. Hence, it is evident that institutional factors positively predict user satisfaction; the results supported H1 with β = 0.238, p < 0.05. Similarly, user satisfaction positively predicts task technology fit (TTF), and the test results advocate this hypothesis; hence, H2 is supported with β = 0.713, p < 0.05, as reported in Table 6. Task technology fit had a significant effect on performance impact; therefore, H3 was also supported, with β = 0.262, p < 0.05. Moreover, user satisfaction positively predicts actual usage of the system, and the test results support this hypothesis (β = 0.373, p < 0.05), as displayed in Table 6; hence, H4 is also supported. Finally, as stated by H5, actual usage of the system positively predicts performance impact, and the test results support it, as shown in Table 6; so the results support H5 with β = 0.200 and p < 0.05.

The variance accounted for (VAF) value was used to assess the strength of the mediating effects. A VAF value above 80% indicates full mediation, a value between 20% and 80% indicates partial mediation, and a value below 20% indicates no mediation [131]. The findings in Table 7 show partial mediation effects throughout the model. For H2a, institutional factors positively predict task technology fit through the mediating role of user satisfaction, with an indirect effect of β = 0.170 and a direct effect of β = 0.278, indicating partial mediation. For H3a, an analogous mediation test showed that institutional factors predict performance impact through the sequential partial mediation of user satisfaction and task technology fit, with an indirect effect of β = 0.044 and a direct effect of β = 0.163. For H4a, the indirect effect was β = 0.089 and the direct effect was β = 0.251, again showing partial mediation; hence H4a is supported, and institutional factors positively predict actual usage of the system through the partial mediating role of user satisfaction. For H5a, the indirect and direct effects were β = 0.018 and β = 0.015, respectively; H5a is therefore supported, and institutional factors positively predict performance impact through the partial mediating role of user satisfaction and actual usage of the system.
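To make the VAF criterion concrete, here is a worked example using the H2a coefficients reported above (illustrative arithmetic only, following the definition in [131]):

```latex
\mathrm{VAF} \;=\; \frac{\beta_{\text{indirect}}}{\beta_{\text{indirect}} + \beta_{\text{direct}}}
            \;=\; \frac{0.170}{0.170 + 0.278} \;\approx\; 0.38
```

That is, roughly 38% of the total effect of institutional factors on task technology fit is transmitted through user satisfaction, which lies between the 20% and 80% bounds and therefore corresponds to partial mediation.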

The Hayes PROCESS macro was used to test H6, which states that cognitive absorption moderates the relationship between institutional factors and user satisfaction [132]. The test results are displayed in Table 8, where Coeff is the coefficient, SE is the standard error, and T is the t-statistic, which should exceed 1.96 to accept the alternative hypothesis. P is the significance value, which should be below .05, and the last two columns, LLCI and ULCI, are the lower and upper bounds of the confidence interval for the coefficient, which should not contain zero if the alternative hypothesis is to be accepted. First, the direct effect of institutional factors on user satisfaction was tested and found to be significant (β = 0.749; t = 10.719; p < 0.001). Likewise, the direct effect of cognitive absorption on user satisfaction was significant (β = 0.5014; t = 10.6868; p < 0.001). Finally, the interaction effect of institutional factors and cognitive absorption on user satisfaction was tested and was also significant (β = -0.1566; t = 4.5098; p < 0.001). Since the interaction term is significant, a moderation effect exists in our framework, as shown in Table 8. Hence, the moderating role of cognitive absorption between institutional factors and user satisfaction is significant, supporting H6.
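The moderation test above can be approximated outside the PROCESS macro with an ordinary regression that includes an interaction term. The sketch below is illustrative only (the study used the Hayes PROCESS macro in its standard form), and the columns "IF", "CA" and "US" are hypothetical placeholders for composite construct scores.

```python
# A minimal sketch of a simple-moderation model equivalent in spirit to PROCESS
# Model 1 (illustrative only; not the macro itself).
import pandas as pd
import statsmodels.formula.api as smf

def moderation_model(df: pd.DataFrame):
    d = df.copy()
    # mean-center the predictor and moderator so the interaction coefficient is interpretable
    d["IF_c"] = d["IF"] - d["IF"].mean()
    d["CA_c"] = d["CA"] - d["CA"].mean()
    # US = b0 + b1*IF_c + b2*CA_c + b3*(IF_c x CA_c); b3 captures the moderation effect
    model = smf.ols("US ~ IF_c + CA_c + IF_c:CA_c", data=d).fit()
    return model  # inspect model.summary() for the coefficient, t-value and 95% CI of IF_c:CA_c
```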

5. Discussion

Student retention is a more difficult challenge for online courses than for face-to-face learning [28]; over the past few years it has therefore become a matter of great concern for higher education institutions, and the outbreak of the Covid-19 pandemic has made it a prime concern for institutions worldwide. It is consequently critical to investigate the factors contributing to student satisfaction and academic performance in online learning. This research therefore investigated the link between institutional factors, in terms of university support, technical assistance, instructor support and administrative support, and student satisfaction and performance in online learning during Covid-19.

In this research, an effort has been made to develop a model based on the consolidation of the Technology Acceptance Model (TAM), Task Technology Fit (TTF) model, DeLone and McLean Model of Information Systems Success (DMISM) and Unified Theory of Acceptance and Use of Technology (UTAUT) model, in order to examine the associations between institutional factors, user satisfaction, TTF, actual usage, cognitive absorption, and performance impact, using data collected and analyzed from top public and private universities of Punjab, Pakistan.

According to the findings of this study, institutional factors have a significant positive impact on student satisfaction and performance. Participants gave high importance to institutional factors, which corroborates the outcomes reported by [133]. The analysis showed that institutional factors positively predict user satisfaction: the better the administrative, technical, instructional and technology support provided for the online learning system, the more likely students are to perceive that the service matches their demands, expectations, behaviour and way of life, and the more comfortable they will feel choosing to depend on online education. This finding supports previous research by [75, 79, 81]. Furthermore, [134] proposed that universities must provide students with adequate facilities and resources, and that technical assistance must be available and highly efficient in order to encourage positive perceptions and opinions among students. In addition, user satisfaction was found to positively predict task technology fit, which signifies that user satisfaction is an important element in determining the success or failure of a new technology; this is also supported by previous research [20]. The study further indicates that institutional factors positively predict task technology fit through the mediating role of user satisfaction. This means that the more content students are with institutional support for the online education technology, the more satisfied they will be with the services it provides, the more they will consider the technology fit to meet their requirements, and thus the more the technology will assist them in accomplishing their tasks [135, 136].

This study also infers that institutional factors significantly predict performance impact through the mediating roles of user satisfaction and task technology fit in series. Moreover, based on the empirical test of the relationship between task technology fit and performance impact, task technology fit positively predicts performance impact, a result consistent with the conclusions of previous studies [84–88, 137]. This indicates that the greater the institutional support for the online learning system, the higher the students' satisfaction with how well the system meets their needs and the more they perceive the technology as fit to accomplish their tasks, which in turn enhances academic performance and coursework productivity. In other words, the effectiveness and efficiency of student performance improve because the technology fits the task, and the technology fits students' needs more closely when they are more satisfied, which itself stems from strong institutional support.

In addition, the relationship between user satisfaction and actual usage was examined, and user satisfaction was found to positively predict the actual usage of the system. This result is also supported by the findings of previous research [98, 138]. The study further observed that institutional factors indirectly influence students' actual usage of the technology via user satisfaction: the greater the institutional support for the online learning system, the more content the students are with the system and the more time they spend using online learning [40, 139, 140].

This research also supports the hypothesis that actual usage of the system positively predicts students' performance impact. Few studies have examined the relationship between actual system usage and performance impact; for example, in empirical research by [95], actual system usage had a significant effect on individual performance, since using the system to accomplish tasks leads to improvements in performance. Numerous IS studies likewise emphasize that actual system usage has a significant positive effect on the performance of system users [46, 78, 95, 98, 141]. This research also hypothesized that institutional factors predict performance impact via the mediating roles of user satisfaction and actual usage of the system. It follows that the greater the institutional support for the online learning system, the higher the students' satisfaction, and consequently the more the students use the online learning system; this extended usage, in turn, has a positive impact on their academic excellence and coursework productivity [19, 142–146]. This practice thus shapes how students learn, which is an essential element in accomplishing their academic tasks.

It is further observed in this study that cognitive absorption moderates the relationship between institutional factors and user satisfaction. This implies that if students find that the online learning system satisfies their requirements and is valuable for carrying out their academic tasks, they will be highly content with it; the greater the students' level of cognitive absorption, the greater their level of satisfaction. Numerous studies support the notion that cognitive absorption has a positive effect on user satisfaction [19, 110].

6. Theoretical and practical implications

This study makes several theoretical contributions. First, it adds to the literature by analyzing the impact of cognitive absorption as a moderator on the relationship between institutional factors and user satisfaction. Second, it explores the sequential mediation from institutional factors to performance impact, where the effect is mediated by user satisfaction and task technology fit in series, and it examines the effect of institutional factors on performance impact mediated by user satisfaction and actual usage. The study is also significant from a practical point of view for several reasons. First, e-learning can substantially assist learning through efficient use of time; studying at a place of one's convenience encourages acquiring education with minimal resources and reduces spatial constraints. As educational institutions during the Covid-19 pandemic rely heavily on online education as a precautionary measure, this research is of considerable significance for institutions and students alike.

Second, this research aims to provide policymakers with an extensive framework that highlights how the utilization of online learning technologies can elevate students' academic performance, at a time when educational institutions and governments throughout the world are striving to use online education at scale to ensure that students receive quality learning and education during the pandemic. According to the findings of the recommended framework, students' academic performance in online learning can be elevated if the institutional factors of the online system and the constructs of user satisfaction, task technology fit, actual usage of the system and cognitive absorption are administered and regulated appropriately. Third, the anticipated outcome of this research will enable students to strengthen knowledge attainment, enhance educational performance, and improve their productive and innovative capabilities, and it will reduce their stress in attaining online education in the current Covid-19 pandemic situation.

Even though Pakistan is a developing nation, it can still fully utilize the benefits offered by online education so that, even with scarce resources, high-quality education can be imparted all over the country. Numerous governments have provided their students with modern technological devices and have reduced the cost of internet connections to significantly increase the uptake of online education in their respective countries. Pakistan can productively adopt this strategy and support the nationwide use of online learning. Moreover, the proposed framework would significantly help instructors, students and university administrative staff to utilize novel technologies for delivering online education and to resolve the various issues they face.

7. Conclusion

The World Health Organization (WHO) declared the novel SARS-CoV-2 viral infection a pandemic in March 2020, and since then it has become the world's most serious public health threat. To combat the pandemic, countries around the world adopted policies of social distancing, which resulted in the closure of educational institutions in most countries. All educational establishments were obliged to make mandatory and suitable changes to their existing educational framework while continuing to deliver quality education to students; therefore, all teaching and learning processes and sessions shifted to e-learning. A comparable approach was adopted in Pakistan to curb the spread of the pandemic. This unexpected shift from face-to-face classroom learning to online learning placed a heavy burden on students and significantly influenced their academic performance. The current study portrayed students' viewpoints on online learning and identified factors that can help students improve their academic performance by implementing the most appropriate technology. To address this problem, the research proposed an integrated model combining the Technology Acceptance Model (TAM), the Task Technology Fit (TTF) model, the DeLone and McLean Model of Information Systems Success (DMISM) and the Unified Theory of Acceptance and Use of Technology (UTAUT) model. Institutional factors, task technology fit, user satisfaction, cognitive absorption, actual system use, and performance impact were the essential constructs in the framework.

The output of the various tests indicated that the recommended framework effectively captures the effects of online learning on students' academic success. User satisfaction is highly significant in determining task technology fit and actual use of online learning; it also mediates the link between institutional factors and both task technology fit and actual usage, and through them the link to performance impact. Task technology fit is likewise important in evaluating academic performance and in mediating the relationship between user satisfaction and academic performance, and cognitive absorption is essential in assessing user satisfaction. The empirical tests provided significant support for the associations between the framework constructs. Policymakers and educational experts should stress these features to increase the likelihood of enhanced academic performance. Lastly, the findings of this study can effectively support the Government of Pakistan's plans and strategies for implementing online education so as to create conditions compatible with students' online learning assignments, social values, and lifestyles. In such a setting, students are more likely to use online learning to enhance their academic performance and, ultimately, the quality of their work-life.

8. Limitations and future research directions

Certain limitations observed in the current research suggest a roadmap for future work. As data for this research were gathered from students of universities situated in Punjab, researchers who plan to conduct a similar study should also collect data from universities located in all provinces of Pakistan so that the outcomes of the current research can be generalized. Moreover, future work could broaden the scope by comparing the online education systems implemented in various universities of Pakistan with those implemented in universities of developed countries. As the current research is constrained to cross-sectional data collection, future researchers should also collect longitudinal data and conduct experiments whose outcomes can be compared with the present findings to make the research framework more extensive. Apart from cognitive absorption, which is utilized as a moderator in the present study, other moderators should also be considered in future studies. From this viewpoint, perceived usefulness and perceived ease of use can be employed as moderators, and the resulting moderation effects can be studied further [45]. Moreover, numerous studies have found that the human factor plays a significant role in influencing students to acquire online education; for example, the study by [147] suggests that transformational leadership can also be considered as a moderator when analyzing relationships in online learning frameworks.

Furthermore, the relationships between the framework variables in the present study can be rearranged to formulate various scenarios; for instance, the positions of actual usage and user satisfaction could be interchanged in future research. Moreover, the present study is confined to the education sector, so future studies could assess the present framework in other sectors as well.

References

  1. 1. Bao W. COVID ‐19 and online teaching in higher education: A case study of Peking University. Hum Behav Emerg Technol 2020; 2: 113–115. pmid:32510042
  2. 2. Zhang W, Wang Y, Yang L. Suspending Classes Without Stopping Learning: China ‘ s Education Emergency Management Policy in the COVID-19 Outbreak.
  3. 3. Chatterjee I, Chakraborty P. Use of Information Communication Technology by Medical Educators Amid COVID-19 Pandemic and Beyond. J Educ Technol Syst 2021; 49: 310–324.
  4. 4. Dhawan S. Online Learning: A Panacea in the Time of COVID-19 Crisis. J Educ Technol Syst 2020; 49: 5–22.
  5. 5. Nash C. Report on Digital Literacy in Academic Meetings during the 2020 COVID-19 Lockdown. 2.
  6. 6. Lassoued Z, Alhendawi M, Bashitialshaaer R. education sciences An Exploratory Study of the Obstacles for Achieving Quality in Distance Learning during the. Educ Sci 2020; 10: 1–13.
  7. 7. Peters MA, Wang H, Ogunniran MO, et al. China ‘ s Internationalized Higher Education During Covid-19: Collective Student Autoethnography. 2020; 968–988.
  8. 8. Cao L, Zhang J, Ge X, et al. Occupational profiling driven by online job advertisements: Taking the data analysis and processing engineering technicians as an example. PLoS One 2021; 16: e0253308. pmid:34157028
  9. 9. Akhtarul Islam M, Barna SD, Raihan H, et al. Depression and anxiety among university students during the COVID-19 pandemic in Bangladesh: A web-based cross-sectional survey. PLoS One 2020; 15: 1–12.
  10. 10. Jæger MM, Blaabæk EH. Inequality in learning opportunities during Covid-19: Evidence from library takeout. Res Soc Stratif Mobil; 68. Epub ahead of print 2020. pmid:32834345
  11. 11. Mishra L, Gupta T, Shree A. International Journal of Educational Research Open Online teaching-learning in higher education during lockdown period of COVID-19 pandemic. Int J Educ Res Open 2020; 1: 100012.
  12. 12. Bojović Ž, Bojović PD, Vujošević D, et al. Education in times of crisis: Rapid transition to distance learning. Comput Appl Eng Educ 2020; 28: 1467–1489.
  13. 13. Patricia A. College Students’ Use and Acceptance of Emergency Online Learning Due to COVID-19. Int J Educ Res Open 2020; 100011.
  14. 14. Zhu X, Liu J. Education in and After Covid-19: Immediate Responses and Long-Term Visions. Postdigital Sci Educ 2020; 2: 695–699.
  15. 15. Beaunoyer E, Dupéré S, Guitton MJ. COVID-19 and digital inequalities: Reciprocal impacts and mitigation strategies. Comput Human Behav; 111. Epub ahead of print 2020.
  16. 16. Aucejo EM, French J, Paola M, et al. The impact of COVID-19 on student experiences and expectations: Evidence from a survey ☆. J Public Econ 2020; 191: 104271. pmid:32873994
  17. 17. Ruth Colvin Clark REM. e-Learning and the Science of Instruction. 2016.
  18. 18. Tenório T, Bittencourt II, Isotani S, et al. Does peer assessment in on-line learning environments work? A systematic review of the literature. Comput Human Behav 2016; 64: 94–107.
  19. 19. Aldholay AH, Abdullah Z, Ramayah T, et al. Online learning usage and performance among students within public universities in Yemen. Int J Serv Stand 2018; 12: 163–179.
  20. 20. Isaac O, Aldholay A, Abdullah Z, et al. Online learning usage within Yemeni higher education: The role of compatibility and task-technology fit as mediating variables in the IS success model. Comput Educ 2019; 136: 113–129.
  21. 21. Panigrahi R, Srivastava PR, Sharma D. Online learning: Adoption, continuance, and learning outcome—A review of literature. Int J Inf Manage 2018; 43: 1–14.
  22. 22. Ajzen M. Understanding Attitudes and Predicting Social Behaviour, Prentice-Hall, Englewood Cliffs. 1980.
  23. 23. Icek Ajzen. From intentions to actions: A theory of planned behavior. Action Control 1985; 11–39.
  24. 24. Davis Fred D. Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology.
  25. 25. Goodhue Dale L. and Thompson Ronald L. Task-Technology Fit and Individual Performance.
  26. 26. DeLone WH, McLean ER. Information systems success: The quest for the dependent variable. Inf Syst Res 1992; 3: 60–95.
  27. 27. Delone WH, McLean ER. The DeLone and McLean Model of Information Systems Success: A Ten-Year Update. J Manag Inf Syst 2003; 19: 9–30.
  28. 28. Venkatesh V, Morris MG, Davis GB, et al. User Acceptance of Information Technology: Toward a Unified View. 2012; 27: 425–478.
  29. 29. Islam AKMN. Investigating e-learning system usage outcomes in the university context. Comput Educ 2013; 69: 387–399.
  30. 30. Montesdioca G. P. Z., & Maçada ACG. ScienceDirect Measuring user satisfaction with information security practices. 8. Epub ahead of print 2014.
  31. 31. Lizzio A, Wilson K, Simons R. Studies in Higher Education University Students ‘ Perceptions of the Learning Environment and Academic Outcomes: Implications for theory and practice. 2010; 37–41.
  32. 32. Darling-hammond L, Berry B, Thoreson A, et al. Educational Evaluation and Policy Analysis the Evidence. Epub ahead of print 2001.
  33. 33. Devadoss S, Foltz J. Evaluation of Factors Influencing Student Class Attendance and Performance. 78.
  34. 34. Masrek MN, Gaskin JE. Assessing users satisfaction with web digital library: the case of Universiti Teknologi MARA. Int J Inf Learn Technol 2016; 33: 36–56.
  35. 35. Roca JC, Chiu CM, Martínez FJ. Understanding e-learning continuance intention: An extension of the Technology Acceptance Model. Int J Hum Comput Stud 2006; 64: 683–696.
  36. 36. Gao BW, Jiang J, Tang Y. The effect of blended learning platform and engagement on students’ satisfaction——the case from the tourism management teaching. J Hosp Leis Sport Tour Educ 2020; 27: 100272.
  37. 37. Mayer RE. Searching for the role of emotions in e-learning. Learn Instr 2020; 70: 101213.
  38. 38. Zhao Y, Bacao F. What factors determining customer continuingly using food delivery apps during 2019 novel coronavirus pandemic period? Int J Hosp Manag 2020; 91: 102683. pmid:32929294
  39. 39. Aldholay A, Abdullah Z, Isaac O, et al. Perspective of Yemeni students on use of online learning: Extending the information systems success model with transformational leadership and compatibility. Inf Technol People 2019; 33: 106–128.
  40. 40. Isaac O, Abdullah Z, Aldholay AH, et al. Antecedents and outcomes of internet usage within organisations in Yemen: An extension of the Unified Theory of Acceptance and Use of Technology (UTAUT) model. Asia Pacific Manag Rev 2019; 24: 335–354.
  41. 41. Cheng B, Wang M, Moormann J, et al. The effects of organizational learning environment factors on e-learning acceptance. Comput Educ 2012; 58: 885–899.
  42. 42. McGill TJ, Klobas JE. A task-technology fit view of learning management system impact. Comput Educ 2009; 52: 496–508.
  43. 43. Shim M, Jo HS. What quality factors matter in enhancing the perceived benefits of online health information sites? Application of the updated DeLone and McLean Information Systems Success Model. Int J Med Inform 2020; 137: 104093. pmid:32078918
  44. 44. Xinli H. Effectiveness of information technology in reducing corruption in China A validation of the DeLone and McLean information systems success model. Electron Libr 2015; 33: 52–64.
  45. 45. Ashfaq M, Yun J, Yu S, et al. I, Chatbot: Modeling the determinants of users’ satisfaction and continuance intention of AI-powered service agents. Telemat Informatics 2020; 54: 101473.
  46. 46. Cho KW, Bae SK, Ryu JH, et al. Performance evaluation of public hospital information systems by the information system success model. Healthc Inform Res 2015; 21: 43–48. pmid:25705557
  47. 47. Wu LY, Wang CJ. Transforming resources to improve performance of technology-based firms: A Taiwanese Empirical Study. J Eng Technol Manag—JET-M 2007; 24: 251–261.
  48. 48. Kintu MJ, Zhu C, Kagambe E. Blended learning effectiveness: the relationship between student characteristics, design features and outcomes. Int J Educ Technol High Educ; 14. Epub ahead of print 2017.
  49. 49. Dixon MJ. INSTITUTIONAL FACTORS AFFECTING DOCTORAL DEGREE COMPLETION AT SELECTED TEXAS PUBLIC UNIVERSITIES. 2015.
  50. 50. Chaney Don,Chaney Elizabeth,Eddy James M. Enhancing Online Education through Instructor Skill Development in Higher Education. Online J Distance Learn Adm; 13, http://www.westga.edu/~distance/ojdla/winter134/chaney134.html (2010).
  51. 51. Alshammari SH, Ali MB, Rosli MS. The influences of technical support, self efficacy and instructional design on the usage and acceptance of LMS: A comprehensive review. Turkish Online J Educ Technol 2016; 15: 116–125.
  52. 52. Usm AE. Towards Student-Centred Learning: Factors Contributing to the Towards Student-Centred Learning: Factors Contributing to the Adoption of E-Learn @ USM.
  53. 53. Wang YS. Assessing e-commerce systems success: A respecification and validation of the DeLone and McLean model of IS success. Inf Syst J 2008; 18: 529–557.
  54. 54. Wang YS, Liao YW. Assessing eGovernment systems success: A validation of the DeLone and McLean model of information systems success. Gov Inf Q 2008; 25: 717–733.
  55. 55. Kim HW, Chan HC, Gupta S. Value-based Adoption of Mobile Internet: An empirical investigation. Decis Support Syst 2007; 43: 111–126.
  56. 56. Alrajawy I, Mohd Daud N, Isaac O, et al. Mobile Learning in Yemen Public Universities: Factors Influence student’s Intention to Use. 7th Int Conf Postgrad Educ Univ Teknol MARA (UiTM), Malaysia 2016; 1050–1064.
  57. 57. Norzaidi MD, Chong SC, Murali R, et al. Intranet usage and managers’ performance in the port industry. Ind Manag Data Syst 2007; 107: 1227–1250.
  58. 58. Isaac O, Abdullah Z, Ramayah T, et al. Internet usage user satisfaction task-technology fit and performance impact among public sector employees in Yemen. Int J Inf Learn Technol 2017; 34: 210–241. Epub ahead of print 2017.
  59. 59. Agarwal Ritu and Karahanna Elena. Time Flies When You’re Having Fun: Cognitive Absorption and Beliefs about Information, http://www.jstor.org/stable/3250951. (2000).
  60. 60. Judy Drennan a JK b & AP. Factors Affecting Student Attitudes Toward Flexible Online Learning in Management Education. J Educ Res 2005; 2: 37–41.
  61. 61. Joo YJ, Lim KY, Kim EK. Online university students’ satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictors in a structural model. Comput Educ 2011; 57: 1654–1664.
  62. 62. Stefanovic D, Drapsin M, Nikolic J, et al. Empirical study of student satisfaction in e-learning system environment. Tech Technol Educ Manag 2011; 6: 1152–1164.
  63. 63. Marsh Gregory B. An Exploratory Investigation of the Relationship between Institutional Characteristics and Student Retention in Public Four-Year Colleges and Universities. J Allergy Clin Immunol 2010; 130: 556.
  64. 64. Moisey S. D., & Hughes JA. Supporting the online learner. In Anderson T. (Ed.), The Theory and Practice ofOnline Learning. (2nd ed.). Edmonton, AB, Canada: AU Press, 2008.
  65. 65. Rovai AP, Downey JR. Why some distance education programs fail while others succeed in a global environment. Internet High Educ 2010; 13: 141–147.
  66. 66. Wheeler S. Learner support needs in online problem-based learning. Quarterly Review of Distance Education, 7(2),. 2006.
  67. 67. Saadé R, Bahli B. The impact of cognitive absorption on perceived usefulness and perceived ease of use in on-line learning: An extension of the technology acceptance model. Inf Manag 2005; 42: 317–327.
  68. 68. Elmezni I, Gharbi JE. Mediation of cognitive absorption between users’ time styles and website satisfaction. J Internet Bank Commer 2010; 15: 1–16.
  69. 69. Leong P. Role of social presence and cognitive absorption in online learning environments. Distance Educ 2011; 32: 5–28.
  70. 70. TsaiFang Yu, Lee TW Y. The Impact of Task Technology Fit, Perceived Usability and Satisfaction on M-Learning Continuance Intention. Int J Digit Content Technol Its Appl. Epub ahead of print 2012. ID: 8761276.
  71. 71. Rouis S. Impact of cognitive absorption on facebook on students’ achievement. Cyberpsychology, Behav Soc Netw 2012; 15: 296–303. pmid:22703035
  72. 72. Aparicio M, Bacao F, Oliveira T. Grit in the path to e-learning success. Comput Human Behav 2017; 66: 388–399.
  73. 73. Khan IU, Hameed Z, Yu Y, et al. Predicting the acceptance of MOOCs in a developing country: Application of task-technology fit model, social motivation, and self-determination theory. Telemat Informatics 2018; 35: 964–978.
  74. 74. Park C. and Wells P. “The Higher Education Academy postgraduate taught experience survey”, PTES 2010 Report, The Higher Education Academy, York. 2010.
  75. 75. Hasan HFA, Ilias A, Rahman RA, et al. Service Quality and Student Satisfaction: A Case Study at Private Higher Education Institutions. Int Bus Res 2009; 1: 163–175.
  76. 76. Amoozegar A, Mohd Daud S, Mahmud R, et al. Exploring Learner to Institutional Factors and Learner Characteristics as a Success Factor in Distance Learning. Int J Innov Res Educ Sci 2017; 4: 2349–5219.
  77. 77. Wang G, Song J. The relation of perceived benefits and organizational supports to user satisfaction with building information model (BIM). Comput Human Behav 2017; 68: 493–500.
  78. 78. Cho J, Yu H. Roles of University Support for International Students in the United States: Analysis of a Systematic Model of University Identification, University Support, and Psychological Well-Being. J Stud Int Educ 2015; 19: 11–27.
  79. 79. Baleghi-Zadeh S, Ayub AFM, Mahmud R, et al. The influence of system interactivity and technical support on learning management system utilization. Knowl Manag E-Learning 2017; 9: 50–68.
  80. 80. Moses P, Bakar KA, Mahmud R, et al. ICT Infrastructure, Technical and Administrative Support as Correlates of Teachers’ Laptop Use. Procedia—Soc Behav Sci 2012; 59: 709–714.
  81. 81. Chen PSD, Lambert AD, Guidry KR. Engaging online learners: The impact of Web-based learning technology on college student engagement. Comput Educ 2010; 54: 1222–1232.
  82. 82. Ikhsan RB, Saraswati LA, Muchardie BG, et al. The determinants of students’ perceived learning outcomes and satisfaction in BINUS online learning. Proc 2019 5th Int Conf New Media Stud CONMEDIA 2019 2019; 68–73.
  83. 83. DeLone WH, McLean ER. Information Systems Success Measurement. 2016. Epub ahead of print 2016.
  84. 84. D’Ambra J, Wilson CS, Akter S. Application of the task-technology fit model to structure and evaluate the adoption of E-books by academics. J Am Soc Inf Sci Technol 2013; 64: 48–64.
  85. 85. Glowalla P, Sunyaev A. ERP system fit–An explorative task and data quality perspective. J Enterp Inf Manag 2014; 27: 668–686.
  86. 86. Lee DY, Lehto MR. User acceptance of YouTube for procedural learning: An extension of the Technology Acceptance Model. Comput Educ 2013; 61: 193–208.
  87. 87. Larsen TJ, Sørebø AM, Sørebø Ø. The role of task-technology fit as users’ motivation to continue information system use. Comput Human Behav 2009; 25: 778–784.
  88. 88. Daud NM. Factors Determining Intranet Usage: An Empirical Study of Middle Managers in Malaysian Port Industry, Multimedia University, Selangor.
  89. 89. Lee KC, Lee S, Kim JS. Analysis of mobile commerce performance by using the task-technology fit. IFIP Adv Inf Commun Technol 2005; 158: 135–153.
  90. 90. Isaac O RT, Mutahar AM. Internet Usage and Net Benefit among Employees within Government Institutions in Yemen: An Extension of DeLone. . . Int J Soft Comput 2017; 12: 178–198.
  91. 91. Shih Y-Y, Chen C-Y. The study of behavioral intention for mobile commerce: via integrated model of TAM and TTF. Qual Quant 2013; 47: 1009–1020.
  92. 92. Sinha A, Kumar P, Rana NP, et al. Impact of internet of things (IoT) in disaster management: a task-technology fit perspective. Ann Oper Res 2019; 283: 759–794.
  93. 93. Petter S, DeLone W, McLean E. Measuring information systems success: Models, dimensions, measures, and interrelationships. Eur J Inf Syst 2008; 17: 236–263.
  94. 94. Jafari SM, Ali NA, Sambasivan M, et al. A respecification and extension of DeLone and McLean model of IS success in the citizen-centric e-governance. Proc 2011 IEEE Int Conf Inf Reuse Integr IRI 2011 2011; 342–346.
  95. 95. Norzaidi MD, Chong SC, Murali R, et al. Towards a holistic model in investigating the effects of intranet usage on managerial performance: A study on Malaysian port industry. Marit Policy Manag 2009; 36: 269–289.
  96. 96. Norzaidi MD, Salwani MI. Campus-Wide Information Systems Article information: Evaluating technology resistance and technology satisfaction on students’ performance. Res Pap 2014; 9: 460–466.
  97. 97. Venkatesh Viswanath, Michael G. Morris GBD and FDD. User Acceptance of Information Technology: Toward a Unified View, http://www.jstor.org/stable/30036540 (2003).
  98. 98. Hou CK. Examining the effect of user satisfaction on system usage and individual performance with business intelligence systems: An empirical study of Taiwan’s electronics industry. Int J Inf Manage 2012; 32: 560–573.
  99. 99. Isaac O, Abdullah Z, Ramayah T, et al. Perceived Usefulness, Perceived Ease of Use, Perceived Compatibility, and Net Benefits: an empirical study of internet usage among employees in Yemen. 7th Int Conf Postgrad Educ Univ Teknol MARA (UiTM), Shah Alam, Malaysia 2016; 899–919.
  100. 100. Makokha MW, Ochieng DO. Assessing the Success of ICT’s from a User Perspective: Case Study of Coffee Research Foundation, Kenya. J Manag Strateg 2014; 5: 12–58.
  101. 101. Fan JC, Fang K. ERP implementation and information systems success: A test of DeLone and McLean’s model. Portl Int Conf Manag Eng Technol 2006; 3: 1272–1278.
  102. 102. Wang C, Teo TSH. International Journal of Information Management Online service quality and perceived value in mobile government success: An empirical study of mobile police in China. Int J Inf Manage 2020; 102076.
  103. 103. Hensler J, Ringle C M., Sarstedt . A new criterion for assessing discriminant validity in variance-based structural equation modeling. J Acad Mark Sci 2015; 43: 115–135.
  104. 104. Aldholay A, Isaac O, Abdullah Z, et al. An extension of Delone and McLean IS success model with self-efficacy Online learning usage in Yemen. Epub ahead of print 2018.
  105. 105. Webster J, Ho H. Audience Engagement in Multimedia Presentations. Data Base Adv Inf Syst 1997; 28: 63–77.
  106. 106. Hoffman DL, Novak TP. Marketing in hypermedia computer-mediated environments: Conceptual foundations. J Mark 1996; 60: 50–68.
  107. 107. Shang HF. Email dialogue journaling: Attitudes and impact on L2 reading performance. Educ Stud 2005; 31: 197–212.
  108. 108. Léger PM, Davis FD, Cronan TP, et al. Neurophysiological correlates of cognitive absorption in an enactive training context. Comput Human Behav 2014; 34: 273–283.
  109. 109. Hou ACY, Shiau WL, Shang RA. The involvement paradox: The role of cognitive absorption in mobile instant messaging user satisfaction. Ind Manag Data Syst 2019; 119: 881–901.
  110. 110. Ismail NZ, Razak MR, Zakariah Z, et al. E-Learning Continuance Intention Among Higher Learning Institution Students’ in Malaysia. Procedia—Soc Behav Sci 2012; 67: 409–415.
  111. 111. Jumaan IA, Hashim NH, Al-Ghazali BM. The role of cognitive absorption in predicting mobile internet users’ continuance intention: An extension of the expectation-confirmation model. Technol Soc 2020; 63: 101355.
  112. 112. Podsakoff PM, Mackenzie SB, Lee J, et al. Common Method Biases in Behavioral Research: A Critical Review of the Literature and Recommended Remedies. 2003; 88: 879–903.
  113. 113. Harman HH. Modern Factor Analysis. University of Chicago Press, Chicago, USA., https://www.amazon.com/Modern-Factor-Analysis-Harry-Harman/dp/0226316521 (1976).
  114. 114. Lee SJ, Srinivasan S, Trail T, et al. Examining the relationship among student perception of support, course satisfaction, and learning outcomes in online learning. Internet High Educ 2011; 14: 158–163.
  115. 115. Agarwal R, Karahanna E. Time Files When You’re Fun: Cognitive Absorption and Beliefs about Information Technology Usage. MIS Q 2004; 18: 695–704.
  116. 116. Brown TA, Moore MT. Hoyle CFA Chapter—Final Running head: CONFIRMATORY FACTOR ANALYSIS Confirmatory Factor Analysis Timothy A. Brown and Michael T. Moore Correspondence concerning this chapter should be addressed to Timothy A. Brown, Center for Anxiety & Related Disor.
  117. 117. Kline Rex B. Principles and Practice of Structural Equation Modeling, Fourth Edition. New York, NY: Guilford publications, https://books.google.com.bh/books?hl=en&lr=&id=Q61ECgAAQBAJ&oi=fnd&pg=PP1&dq=related:GDR5Xk2fZEkJ:scholar.google.com/&ots=jFgo4rDath&sig=mmRM3TIVbTk7UJ8Mx3a7k1EktVE&redir_esc=y#v=onepage&q&f=false (2011).
  118. 118. Ramsey JB. Tests for Specification Errors in Classical Linear Least-Squares Regression Analysis. J R Stat Soc Ser B 1969; 31: 350–371
  119. 119. Kline RB. Structural equation modeling o.
  120. 120. Marsh HW, Hocevar D. Application of Confirmatory Factor Analysis to the Study of Self-Concept: First- and Higher Order Factor Models and Their Invariance Across Groups. 1985; 97: 562–582.
  121. 121. Tabachnick BG, Fidell LS. Multivariat statistics, https://www.researchgate.net/publication/270761711_Using_Multivariat_Statistics (2007).
  122. 122. Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct Equ Model 1999; 6: 1–55
  123. 123. Fornell C, Larcker DF. Evaluating Structural Equation Models with Unobservable Variables and Measurement Error. J Mark Res 1981; 18: 39.
  124. 124. Thomas R Silverman J, et al. Research Methods in Physical Activity, 7E, https://books.google.com.bh/books?id=3FR1CQAAQBAJ&hl=ar&source=gbs_navlinks_s (2015).
  125. 125. Sarstedt M, Ringle CM, Hair JF. Handbook of Market Research. 2020. Epub ahead of print 2020.
  126. 126. Chin WW. The partial least squares approach for structural equation modeling. Mod methods Bus Res 1998; 295–336.
  127. 127. Sarstedt M, Ringle CM, Hair JF. Partial Least Squares Structural Equation Modeling. # Springer Int Publ AG 2017. Epub ahead of print 2017.
  128. 128. Awang ZH. Analyzing the Mediating Variables. Res Methodol Data Anal 2000; 303–311.
  129. 129. Hair JF, Ringle CM, Sarstedt M. Corrigendum to ‘Editorial Partial Least Squares Structural Equation Modeling: Rigorous Applications, Better Results and Higher Acceptance’ [LRP, 46, 1–2, (2013), 1–12], Long Range Plann 2014; 47: 392.
  130. 130. Sullivan GM, Feinn R. Using Effect Size—or Why the P Value Is Not Enough. J Grad Med Educ 2012; 4: 279–282. pmid:23997866
  131. 131. Hair JF, Ringle CM, Sarstedt M, et al. PLS-SEM: Indeed a Silver Bullet PLS-SEM: Indeed a Silver Bullet. 2014; 37–41.
  132. 132. Hayes A. Integrating Mediation and Moderation Analysis: fundamentals using PROCESS. 2013.
  133. 133. Nyachae JN. The Effect of Social Presence on Students’ Perceived Learning and Satisfaction in Online Courses. ProQuest LLC, http://search.ebscohost.com/login.aspx?direct=true&db=eric&AN=ED534970&site=ehost-live%5Cnhttp://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3476497 (2011, accessed 29 May 2022).
  134. 134. Abel R. Achieving success in internet-supported learning in higher education: Case studies il-luminate success factors, challenges, and future directions. February,. Lake Mary, FL, http://www.msmc.la.edu/include/learning_resources/online_course_environment/A-HEC_IsL0205.pdf (2005).
  135. 135. Hameed MA, Counsell S, Swift S. A conceptual model for the process of IT innovation adoption in organizations. J Eng Technol Manag—JET-M 2012; 29: 358–390.
  136. 136. Tariq A, Akter S. An assessment of m-health in developing countries using task technology fit model. 17th Am Conf Inf Syst 2011, AMCIS 2011 2011; 2: 1059–1070.
  137. 137. Gatara M, Cohen JF. Mobile-health tool use and community health worker performance in the Kenyan context: A task-technology fit perspective. ACM Int Conf Proceeding Ser 2014; 28-Septemb: 229–240.
  138. 138. Chen T, Peng L, Jing B, et al. The impact of the COVID-19 pandemic on user experience with online education platforms in China. Sustain 2020; 12: 1–31.
  139. 139. Tarhini A, Hone K, Liu X. User acceptance towards web-based learning systems: Investigating the role of social, organizational and individual factors in european higher education. Procedia Comput Sci 2013; 17: 189–197.
  140. 140. Novkovic N, Huseman C, Zoranovic T, et al. Farm management information systems. CEUR Workshop Proc 2015; 1498: 705–712.
  141. 141. Islam AKMN. E-learning system use and its outcomes: Moderating role of perceived compatibility. Telemat Informatics 2016; 33: 48–55.
  142. 142. Wimshurst K, Wortley R, Bates M, et al. The impact of institutional factors on student academic results: implications for ‘quality’ in universities. High Educ Res Dev 2006; 25: 131–145.
  143. 143. Frimpong EA, Agyeman GA, Ofosu FF. Institutional Factors Affecting the Academic Performance of Polytechnic Students in Ghana. Int J Humanit Soc Sci Stud 2016; 6959: 2349–6959.
  144. 144. ADEYEMI AM, ADEYEMI SB. Institutional factors as predictors of students academic achievement in colleges of education in South western Nigeria. Int J Educ Adm Policy Stud 2014; 6: 141–153.
  145. 145. Gopal R, Singh V, Aggarwal A. Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19. Educ Inf Technol 2021; 1–25. pmid:33903795
  146. 146. Christiana OO. Institutional Factors Affecting the Academic Performance of Public Administration Students in a Nigerian University. Public Adm Res; 3. Epub ahead of print 2014.
  147. 147. Aldholay AH, Isaac O, Abdullah Z, et al. The role of transformational leadership as a mediating variable in DeLone and McLean information system success model: The context of online learning usage in Yemen. Telemat Informatics 2018; 35: 1421–1437.