Article

The Development of a Standardized Quality Assessment Material to Support Xpert® HIV-1 Viral Load Testing for ART Monitoring in South Africa

by Lara Dominique Noble 1,*, Lesley Erica Scott 1, Asiashu Bongwe 2, Pedro Da Silva 2 and Wendy Susan Stevens 1,2

1 Department of Molecular Medicine and Haematology, School of Pathology, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg 2000, Gauteng, South Africa
2 National Priority Program, National Health Laboratory Service, Johannesburg 2000, Gauteng, South Africa
* Author to whom correspondence should be addressed.
Diagnostics 2021, 11(2), 160; https://doi.org/10.3390/diagnostics11020160
Submission received: 30 November 2020 / Revised: 17 January 2021 / Accepted: 19 January 2021 / Published: 22 January 2021
(This article belongs to the Special Issue HIV Diagnosis, Treatment, and Care)

Abstract:
The tiered laboratory framework for human immunodeficiency virus (HIV) viral load monitoring accommodates a range of HIV viral load testing platforms, with quality assessment critical to ensure quality patient testing. HIV plasma viral load testing is challenged by the instability of viral RNA. An approach using an RNA stabilizing buffer is described for the Xpert® HIV-1 Viral Load (Cepheid) assay and was tested in remote laboratories in South Africa. Plasma panels with known HIV viral titres were prepared in PrimeStore molecular transport medium for per-module verification and per-instrument external quality assessment. The panels were transported at ambient temperatures to 13 testing laboratories during 2017 and 2018, tested according to standard procedures and uploaded to a web portal for analysis. A total of 275 quality assessment specimens (57 verification panels and two EQA cycles) were tested. All participating laboratories met study verification criteria (n = 171 specimens) with an overall concordance correlation coefficient (ρc) of 0.997 (95% confidence interval (CI): 0.996 to 0.998) and a mean bias of −0.019 log copies per milliliter (cp/mL) (95% CI: −0.044 to 0.063). The overall EQA ρc (n = 104 specimens) was 0.999 (95% CI: 0.998 to 0.999), with a mean bias of 0.03 log cp/mL (95% CI: 0.02 to 0.05). These panels are suitable for use in quality monitoring of Xpert® HIV-1 VL and are applicable to laboratories in remote settings.

1. Introduction

Several countries striving to attain their 2020 UNAIDS 90%/90%/90% targets for global HIV healthcare [1] struggle with the third 90% (virological suppression). Fast-track targets were designed to address this [2], aiming to increase the number of people living with HIV (PLWH) accessing treatment and achieving virological suppression. Current global estimates show that 25.4 million people, approximately 67% of PLWH, were accessing antiretroviral therapy (ART) by end-2019 [3], and monitoring needs are likely to increase over the next decade as more people access ART. A total of 5,231,809 (70%) patients currently access ART in South Africa alone [4], with the number expected to increase as the remaining PLWH are reached. The recommended test for monitoring ART response is HIV viral load (VL) quantification [5]. This has historically been performed at centralized laboratories owing to the number of specimens requiring processing, the logistical needs of the available technologies, and the lack of accurate and cost-effective near patient VL technologies. South Africa has addressed the VL scale-up testing needs through a highly centralized model within the National Health Laboratory Service (NHLS), which is responsible for laboratory testing of ~80% of the population. The capacity of the 16 high throughput, centralized HIV VL laboratories has been further augmented through automation and instruments with increased throughput [6,7,8,9,10,11,12,13,14], most recently the cobas® 8800 (Roche Molecular, Pleasanton, CA, USA) and Alinity-m (Abbott Molecular, Des Plaines, IL, USA) systems.
Nonetheless, a number of PLWH live in remote areas and are unable to access the centralized facilities, as highlighted during the current COVID-19 pandemic, either because no collection facilities exist within travelling distance or because specimen transport to the testing laboratories is limited by the stability of HIV RNA in plasma [15,16]. While studies showing long-term stability of HIV in whole blood are available [17,18,19], the manufacturers of the VL technologies recommend testing within 24 h, with separation of plasma within six hours and specimen refrigeration [20,21], primarily to maintain the quality of low VL specimens and to overcome the extreme temperatures (>30 °C) in many high HIV prevalence regions. The use of plasma preparation tubes (PPT; Becton Dickinson, Franklin Lakes, NJ, USA) was introduced [19,22,23] to increase the specimen transport window to at least 24 h [24,25], although specimens should still be separated within six hours of collection and prior to transport [19]. Alternative options to plasma-based testing include the use of dried blood spots (DBS), and several countries have shown that this is a feasible option for remote collection and centralized testing [26,27,28,29,30,31]. The DBS matrix is nonetheless challenged by inaccuracies at the clinically relevant threshold (1000 copies per milliliter (cp/mL)), where the measured VL is inflated by the contribution of cell-associated RNA [32]. While this remains the recommended threshold for virological failure [5], there is contention regarding the use of DBS at VL below 5000 cp/mL [33,34]. A decentralized model, utilizing mobile or remote clinics, may address the needs of PLWH in remote areas through a tiered laboratory network [15,16,35], similar to that originally used for CD4 scale-up [36].
As such, the NHLS National Priority Program (NPP), in collaboration with the South African Department of Health, and through the Global Fund to Fight HIV, Tuberculosis and Malaria (Global Fund; Geneva, Switzerland), performed a pilot evaluation of the Xpert® HIV-1 VL (Cepheid, Sunnyvale, CA, USA) in remote district laboratories. The Xpert® HIV-1 VL assay was previously evaluated in collaboration with the NPP [37] and received World Health Organisation pre-qualification status in 2017 [38]. In addition to being one of the few commercially available POCT HIV VL assays ready for implementation at the time of the study, this platform was selected due to the existing GeneXpert® footprint in South Africa, through the Xpert® MTB/RIF program which comprises 207 tuberculosis testing sites, and the goal of integrated diagnosis and monitoring through multipurpose testing platforms.
As part of the HIV VL testing mandate, technologies selected for the NHLS laboratories must be verified (“fit for purpose”) upon installation and prior to testing clinical specimens, regardless of placement within the testing framework. Verification material is frequently sought by the testing laboratory (laboratory networks) from residual patient specimens [39], but it is often difficult to obtain sufficient volumes for paired (duplicate/split) testing, and this is not always possible for remote testing sites. Participation in external quality assessment (EQA) programs, such as the global Virology Quality Assurance program (VQA, supplied by the Division of AIDS (National Institutes of Health, Bethesda, MD, USA)) or the National External Quality Assessment Service (NEQAS, Sheffield, United Kingdom) HIV-1 RNA quantitation program, does provide assurance to an accredited laboratory for pathology services, but does not address pre-testing verification. Furthermore, these panels require expensive shipment, are only available at times of the annual panel testing cycles, and comprise limited numbers of specimens (n = ~5). In addition, the World Health Organisation has published considerations for POCT, including the need for instrument verification as ‘fit for purpose’ and external quality assessment at least annually [40]. Dried tube specimens (DTS) [41,42,43] were not selected, as it was desirable to minimize onsite processing, mimic plasma specimens as far as possible, and ensure sufficient specimen volume for use with the Xpert® HIV-1 VL assay (1.1 mL).
In addition to the programs described above, the South African Viral Load Quality Assessment (SAVQA) panel [44] was previously developed to address the need for scaled HIV VL services in centralized HIV VL laboratories. This panel provides an accessible option for the verification of newly installed HIV VL testing platforms, initially the RealTime HIV-1 (Abbott Molecular, Des Plaines, IL, USA) and cobas® AmpliPrep/cobas® TaqMan® (CAP/CTM; Roche Molecular, Pleasanton, CA, USA) assays, prior to testing clinical specimens, and has also been used for the rapid evaluation of new HIV VL assays [37,45,46,47]. The SAVQA panel [44] is a 42-specimen plasma panel prepared from purchased human plasma (known HIV-1 positive/negative) and quantified using RealTime HIV-1, CAP/CTM and cobas® HIV-1 (Roche Molecular, Pleasanton, CA, USA). The panel is stored and shipped frozen, and only defrosted immediately prior to testing. The panel comprises 17 negative specimens and five repeats of five positive specimens with VL ranging from 2.7 log cp/mL to 5.0 log cp/mL. The panel was designed to measure accuracy, precision, carryover and limit of blank [44]. The SAVQA panel was readily available but not suitable in its existing format; it required adaptation to avoid the need for cold-chain shipping and storage, as the remote testing sites had no refrigeration facilities. It was also desirable to include a smaller number of specimens to minimize cost and time constraints, as the GeneXpert® is a modular, cartridge-based system designed for random access, single specimen testing. We therefore designed a miniaturized, thermostable version of the SAVQA panel using a commercially available matrix, PrimeStore® Molecular Transport Medium (MTM; Longhorn Vaccines and Diagnostics LLC, Bethesda, MD, USA), to allow ambient temperature shipping and storage.
This medium achieved US FDA approval in 2018 [48], and has been evaluated with a variety of mycobacterial [49,50,51,52,53] and viral [54,55,56,57] specimens, including HIV [58]. In addition to the use of MTM-stored specimens with PrimeMix® [50,55,56], MTM has been shown to be compatible with the Xpert® MTB/RIF [52,56] and, more recently, the Xpert® Xpress SARS-CoV-2 [57,59] assays (Cepheid, Sunnyvale, CA, USA), as well as the m2000 RealTime HIV-1 assay [58]. Verification panels were developed alongside a web-based result reporting tool, which was based on the web portal (www.tbgxmonitor.com) previously developed for Xpert® MTB/RIF quality monitoring [60]. Following the successful verification rollout, an external quality assessment (EQA) panel was requested and was designed to measure pre- and post-processing analytics at these pilot laboratories. This manuscript aims to provide a detailed description of these pilot quality panels as an option for POCT HIV VL sites, using clinically relevant panel specimens which can be prepared centrally and sent to remote sites. These panels were specifically designed to meet the needs of remote testing laboratories using the Xpert® HIV-1 VL assay, notably limited cold-chain shipping and cold-storage facilities on site, low throughput testing platforms, the need for ad hoc verification products and, frequently, lower-skilled laboratory staff. The use of QA materials, particularly when evaluated between laboratories, ensures that instruments are fit-for-purpose and that onsite processing is robust, thus ensuring best possible patient result quality within a tiered laboratory framework.

2. Materials and Methods

2.1. Panel Material Preparation

A SAVQA plasma panel, as described above, was removed from storage (−80 °C) and defrosted at ambient temperature, followed by brief centrifugation (3000 rpm, 1 min). HIV-negative specimens (1.3 mL) were not mixed with MTM, providing a clinically relevant plasma specimen and avoiding the decreased viscosity/fat content of MTM-diluted material. The negative specimen is important to ensure that no cross-contamination occurs in either the reference laboratory or the testing laboratory during specimen preparation and testing. HIV-positive plasma specimens (300 µL) with known VL were added to 1 mL MTM (Longhorn Vaccines and Diagnostics LLC, Bethesda, MD, USA), giving a dilution factor of 4.3 (total volume/specimen volume). To minimize the risk of leakage, each specimen was packaged individually in a sealed plastic bag with an absorbent pad, and the complete panel was then placed into a second sealable bag. Specimens were shipped at ambient temperature using the routine NHLS specimen transport system.
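The dilution factor follows directly from the volumes given above; a minimal sketch (variable names are ours, not from the study):

```python
# Dilution of HIV-positive plasma in MTM:
# 300 uL plasma is added to 1 mL (1000 uL) of MTM buffer.
plasma_volume_ul = 300
mtm_volume_ul = 1000
total_volume_ul = plasma_volume_ul + mtm_volume_ul

# Dilution factor = total volume / specimen (plasma) volume.
dilution_factor = total_volume_ul / plasma_volume_ul
print(round(dilution_factor, 1))  # 4.3
```

Measured viral loads from the diluted specimen must be multiplied by this factor to recover the original plasma titre, as applied in the result-scoring logic described in Section 2.3.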
Two panel formats were designed: (i) a verification panel (Figure 1a) and (ii) an EQA panel (Figure 1b). The verification panel was used to ensure that instruments were functioning correctly upon installation, instrument (module) replacement or instrument movement, and could also be used for staff training. The verification panel consisted of three specimens per module tested: two specimens of known HIV VL stabilized in MTM buffer and one HIV-negative specimen (plasma only). The target ranges for the HIV-positive specimens were 2.7 log cp/mL (low), 3.0 log cp/mL (low), 4.7 log cp/mL (high), and 5.0 log cp/mL (high). All sites received one low VL specimen, one high VL specimen and one HIV-negative specimen, as per testing organization requirements. The EQA panel was necessary for ongoing monitoring of instruments and testing sites. Four specimens were provided per instrument tested, with an instrument being defined as “up to four” GeneXpert® systems attached to one computer. The panel included three specimens of a known HIV VL stabilized in MTM buffer, with target ranges of 3.0 log cp/mL, 3.7 log cp/mL and 4.7 log cp/mL, and one HIV-negative plasma specimen. On preparation of either panel format, one specimen in each range was tested using the reference laboratory GeneXpert® instrument (reference specimen; day 0).

2.2. Xpert® HIV-1 VL Quality Panel Testing

Both the verification and EQA specimens were processed according to the Xpert® HIV-1 VL manufacturer’s instructions (Cepheid, Sunnyvale, CA, USA), using the liquid panel in place of clinical plasma. Briefly, the Xpert® HIV-1 VL cartridge was opened and the entire specimen volume (1.3 mL) was transferred into the cartridge using a precision pipette or 1 mL Pasteur pipette (supplied by Cepheid as part of the kit). The specimen barcode and cartridge number were scanned, and the specimen was tested using the Xpert® HIV-1 Viral Load assay definition file. The original standard operating procedure (SOP) did not include centrifugation instructions, but the SOP was amended after the first verification panel was analysed to ensure that every specimen was briefly centrifuged (3000 rpm, 1 min) prior to processing.

2.3. Result Return and Performance Scoring

A web portal (www.viralloadmonitor.com) for the upload of both verification and EQA results and for report generation, based on the original TBGxMonitor website [60], was created in collaboration with SmartSpot Quality Pty (Ltd.) (Johannesburg, Gauteng, South Africa). Users were required to upload the comma-separated values (CSV) run files (automatically produced by the GeneXpert® software) for the Xpert® HIV-1 VL panel specimens using a USB device. Results were converted using the dilution factor (4.3), which was applied within the website logic as part of the scoring algorithm. The criteria for designing the panels were based on monitoring across the clinically relevant threshold of 1000 cp/mL [5], and the scoring system and performance monitoring were therefore applied to this critical range. This included evaluating acceptable differences between the test specimen and the Xpert® HIV-1 VL reference specimen (described above), originally defined as <1.0 log cp/mL difference. This large allowable variability was selected to account for potential artefacts generated by specimen dilution, ambient temperature shipping and result conversion. Retrospective analyses at <0.5 log cp/mL and <0.3 log cp/mL difference, in line with generally accepted VL variation [61,62], were also performed. Finally, the Xpert® HIV-1 VL reference VL was compared to the pooled mean VL achieved by the 13 testing sites, ensuring that the reference laboratory instrument was performing acceptably and that the reference result was suitable for use as the standard.
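The dilution correction and bias check described above can be sketched as follows. This is an illustrative reconstruction only: the function names are ours, and the portal's actual code is not published in this manuscript.

```python
import math

DILUTION_FACTOR = 4.3  # total volume / plasma volume (see Section 2.1)

def corrected_log_vl(measured_cp_per_ml: float) -> float:
    """Convert a measured Xpert result (cp/mL of the diluted specimen)
    back to the undiluted plasma titre, expressed in log10 cp/mL."""
    return math.log10(measured_cp_per_ml * DILUTION_FACTOR)

def within_limit(test_log_vl: float, reference_log_vl: float,
                 limit: float = 1.0) -> bool:
    """Acceptance check: difference from the reference VL below the limit.
    The pilot used <1.0 log cp/mL; retrospective analyses reused this
    with limits of 0.5 and 0.3 log cp/mL."""
    return abs(test_log_vl - reference_log_vl) < limit
```

For example, a diluted specimen measured at ~23,255 cp/mL corresponds to a plasma titre of ~5.0 log cp/mL after correction.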
The scoring system was aligned with the previously well-described TB quality program [60,63,64] and, although differences exist between qualitative (TB) and quantitative (VL) result outputs, the performance scoring was similarly applied due to the modular nature of the GeneXpert® system, as follows: each specimen tested received a score out of two: correct result (2/2); error, invalid or >1.0 log cp/mL quantifiable result bias (1/2); incorrect result (e.g., HIV positive reported as HIV negative) (0/2). Each panel was then scored out of six for verification and out of eight for EQA. Scoring logic is detailed in Table 1. The overall panel performance across all sites was measured by the mean, median, range and standard deviation (SD) of the quantifiable viral loads, which were calculated using Microsoft® Excel® 2016 (Microsoft Corporation, Redmond, WA, USA). Regression, the concordance correlation coefficient (ρc) [65,66], including a Pearson correlation coefficient (ρ; measure of precision) and a bias correction factor (Cb; measure of accuracy), and Bland–Altman [67,68] analyses were performed and graphically represented using MedCalc Statistical Software version 18.11 (MedCalc Software bvba, Ostend, Belgium; http://www.medcalc.org; 2018).
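Lin's concordance correlation coefficient, with its decomposition into Pearson precision (ρ) and an accuracy bias correction factor (Cb), can be sketched as below. This is a minimal illustration assuming NumPy; the study itself used MedCalc, and no code is given in the manuscript.

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient for paired measurements,
    decomposed as rho_c = rho * Cb (Pearson precision * accuracy)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sx2, sy2 = x.var(), y.var()            # population variances
    sxy = ((x - mx) * (y - my)).mean()     # population covariance
    rho_c = 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)
    rho = sxy / np.sqrt(sx2 * sy2)         # Pearson (precision)
    cb = rho_c / rho                       # bias correction (accuracy)
    return rho_c, rho, cb
```

Unlike Pearson's ρ alone, ρc penalizes any systematic shift between the two methods: two perfectly correlated but offset series give ρ = 1 with Cb < 1.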

2.4. Verification and EQA Pilot Field Evaluation

The pilot evaluation was nested within a field trial of near-patient VL testing, overseen by the NHLS NPP (Johannesburg, South Africa). Thirteen district laboratory facilities were selected and provided with a GeneXpert® IV (Cepheid, Sunnyvale, CA, USA). The laboratories were located in remote areas across six provinces (Eastern Cape: n = 2; Northern Cape: n = 4; Western Cape: n = 3; Free State: n = 1; Limpopo: n = 2; North West Province: n = 1). Technicians were recruited and received training on the GeneXpert® platform and the Xpert® HIV-1 VL assay. The verification and EQA materials were designed to meet the requirements of the NPP, ensuring that the instruments were fit-for-purpose and that specimen processing was being correctly performed.
Verification panels (n = 4 per site) were provided to all sites in September 2017, following instrument installation and prior to patient testing. Further verification panels (n = 5) were provided on an ad hoc basis as modules were replaced. EQA panels (n = 1 per site) were provided to the sites in June and November 2018. For the pilot evaluation, the automatically generated reports were manually checked prior to release, but the website has the capacity to automatically release reports to the sites.

2.5. Stability Testing

Prior to initial supply to sites, verification specimens (2.7 log cp/mL; 5.0 log cp/mL) were prepared and tested in duplicate at days 7, 14, 21, and 28 (as per process described above) to determine stability compared to the day 0 reference result. Extra EQA panels (3.0 log cp/mL, 3.7 log cp/mL and 4.7 log cp/mL) were prepared at the same time as those sent to the sites and tested at days 24, 43, 84 and 150 post manufacture to determine longer term stability. All specimens were stored at ambient temperature in sealed plastic bags with desiccant.

3. Results

3.1. Verification Panel Performance

All sites tested and uploaded results to the website within three days of panel receipt. Result scores and outcomes are summarized in Table 2 and Figure 2, with detailed information provided in Supplementary Table S1. Quantifiable VL results were within acceptable limits for verification (<1.0 log cp/mL difference from the reference VL, as shown in Table 2) and all reference results were within 0.3 log cp/mL of the pooled mean VL of the specimens tested, although it was noted that the VL bias was high in the 5.0 log cp/mL reference specimen (0.22 log cp/mL). In addition, the sites’ verification VL results were compared to the mean VL (data not shown) and this was comparable to analysis using the reference VL values. The ρc across all sites (n = 151 specimens) was 0.997 (95% confidence interval [CI]: 0.995 to 0.998), with a ρ of 0.997 and a Cb of 0.999. The mean bias was −0.02 log cp/mL (95% CI: −0.046 to 0.006), with a coefficient of determination (R2) value of 0.9940.
The error rate (20/171; 11.7%) for the verification panels was higher than expected and was primarily a result of processing errors (55% of errors). Seven errors (35%) were linked to internal probe failures, two to syringe pressure (10%) and eleven to input volume (errors 2096 (35%) and 2097 (20%)). The majority of errors reported (13/20; 65%) occurred in the clinically relevant negative specimen, indicating laboratory processing errors. It was determined, on discussion with the program manager, that the specimens were not being centrifuged prior to testing and that incorrect pipetting procedures may have contributed to the errors. Changes were made to the standard operating procedure (i.e., to centrifuge all specimens prior to use, as would be required for clinical specimens) and staff retraining was performed where necessary. Once these changes were implemented, the error rate (over ad hoc verification and EQA) decreased to 1.7% (2/119 further tests), indicating that correct operating procedures were being observed.

3.2. Pilot EQA Performance

Two cycles of EQA (E18V1, E18V2) were shipped to 13 sites (18 June 2018, 12 November 2018) and results were uploaded within seven days (mean: 4.1 days). All sites showed acceptable performance across both EQA panels; the program performance is summarized in Table 3 and Figure 3, and complete site results are detailed in Supplementary Table S2. Viral loads were within acceptable limits for EQA (<1.0 log cp/mL bias), and all negative specimens were reported as not detectable (no carryover). The ρc for the EQA pilot panels (two EQA panels, n = 102/104 specimens) across all sites was 0.9985 (95% CI: 0.9978 to 0.9990), with a ρ of 0.9987 and a Cb of 0.9998. The mean bias was 0.03 log cp/mL (95% CI: 0.02 to 0.05). The error rate was 1.9% (2/104 tests) and was caused by volume loading (user) errors.

3.3. Retrospective Result Analysis

Retrospective analysis of the verification and EQA results was performed after the pilot evaluation, in order to accommodate acceptable VL biases [61,62]. Amongst 107 quantifiable verification results, ten (9.3%) showed a bias of >0.3 log cp/mL (range: −0.91 to 0.36). Only one outlier specimen (4.33 log cp/mL) displayed a bias >0.5 log cp/mL: −0.91 log cp/mL compared to the reference VL and −0.69 log cp/mL compared to the pooled mean VL. This specimen was part of the 5.0 log cp/mL group, where the reference VL (5.24 log cp/mL) was notably higher than the pooled mean VL (5.02 log cp/mL). A second outlier (4.83 log cp/mL) in this group had a VL bias of −0.41 log cp/mL compared to the reference VL, with an acceptable bias of −0.19 log cp/mL compared to the pooled mean VL. Only three specimens (2.8%) had a bias of >0.3 log cp/mL compared to the pooled mean VL. All quantifiable EQA VL results (n = 76) showed a bias of <0.3 log cp/mL compared to the reference VL.

3.4. Specimen Stability

Stability of the specimens stored in MTM was evaluated prior to panel design and supply, with specimen stability acceptable up to 28 days (Figure 4a). Testing of EQA panels in the reference laboratory between weeks 4 and 20 (Figure 4b) showed stability of all specimens at week 6 (day 43) and extended stability of the higher VL range (4.7 log cp/mL) specimens until week 12 (day 84). However, by week 12, a decrease of ~0.5 log cp/mL was noted in the lower (3.0 log cp/mL) VL range. Errors were noted in the 2.7 log cp/mL specimen on day 1 (repeat) and the 3.0 log cp/mL specimen at day 24 (both error 2126; module reset), and in the 3.7 log cp/mL specimen at day 84 (invalid, error 5016: probe check error). These relate to the instrument and the cartridge, rather than the specimen. Retesting was not possible due to limited specimen availability. By day 150, all VL differed by >0.5 log cp/mL from baseline (day 0), with both the 3.7 log cp/mL and 4.7 log cp/mL specimens showing a VL decrease of >1.0 log cp/mL. Bland–Altman analysis of the reportable VL results (n = 14/16) over the weeks, including day 84, when a VL decrease was noted, but excluding day 150, when VL were no longer relevant, gave a mean bias of −0.06 log cp/mL with a lower limit of −0.34 log cp/mL (95% CI: −0.89 to −0.21) and an upper limit of 0.23 log cp/mL (95% CI: 0.10 to 0.77). Including day 150 (n = 18/20) gave a mean bias of −0.20 log cp/mL with a lower limit of −0.97 log cp/mL (95% CI: −2.11 to −0.62) and an upper limit of 0.58 log cp/mL (95% CI: 0.23 to 1.72), beyond acceptable limits for supply to sites.
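The Bland–Altman summary used throughout (mean bias with 95% limits of agreement) can be sketched as below. This is our own minimal illustration with hypothetical input arrays; the study computed these figures in MedCalc, including confidence intervals on the limits, which are omitted here.

```python
import numpy as np

def bland_altman_limits(reference, test):
    """Mean bias and 95% limits of agreement (mean +/- 1.96 * SD)
    for paired log10 VL measurements (test minus reference)."""
    diffs = np.asarray(test, float) - np.asarray(reference, float)
    bias = diffs.mean()
    sd = diffs.std(ddof=1)  # sample standard deviation
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

For example, if every test result sits exactly 0.1 log cp/mL above its reference, the bias is 0.1 and both limits of agreement collapse onto it (SD of the differences is zero).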

4. Discussion

Laboratory quality monitoring is vital to ensure ongoing patient result testing accuracy [39,69]. Instruments must be evaluated prior to implementation, verified before use in the field and monitored on an ongoing basis. Similarly, staff competency should be evaluated through training, observation and participation in quality programs. Evaluation can be performed on existing specimens (e.g., frozen plasma), prospective specimens (against a reference instrument currently in use) or on well-described quality panels (e.g., NEQAS, SAVQA). EQA, through supply of standardized specimens for testing and through continuous quality monitoring (CQM, e.g., analysis of central data repositories), enables program managers to identify potential instrument or staff deficiencies for correction. Participation in EQA programs has been shown to improve participant performance [42]. CQM of assays and instruments is becoming standard practice for many connected diagnostics. Operational dashboards, such as C360 (Cepheid), provide assay and instrument quality information on errors, utility, and various result parameters on a module/instrument/laboratory and location basis, and can be utilized for daily and monthly monitoring to identify quality issues, without waiting for EQA panel cycles [70]. CQM, through the C360 platform, was successfully applied during the near-patient testing pilot into which this evaluation was nested, but is beyond the scope of this manuscript. EQA is complementary to CQM, ensuring ongoing pre- and post-analytical performance monitoring, which is particularly important where staff turnover is high.
The Xpert® HIV-1 VL assay was previously evaluated, using both the SAVQA panel and clinical specimens [37], and has since been extensively evaluated in the field [14,71], meaning that the assay did not require further evaluation prior to implementation. However, before the implementation pilot could commence, verification of the modules was required, and was complicated by the remote placement of the instruments, as residual plasma specimens were not readily available. Alternative options for instrument verification were thus needed. This manuscript describes the design and pilot evaluation of quality panels used for POCT HIV VL. The panels were designed to meet specific requirements: (i) specimen processing as similar as possible to that of actual specimens; (ii) thermostable transport and storage; (iii) reproducible VL results, such that processing or instrument issues could be detected during verification and ongoing EQA; and (iv) safety during transport. While initially designed for module verification, the panels were easily adapted for ongoing EQA. These panels were based on similar principles to the Xpert® MTB/RIF program [63,64], which has been used successfully throughout the NHLS to monitor 207 Xpert® MTB/RIF testing sites, as well as internationally (28 countries), and was expected to provide similarly rigorous quality monitoring to Xpert® HIV-1 VL sites.
It is notable that the panels were supplied in a liquid format and that no processing was required beyond centrifugation and direct addition of specimen into the Xpert® HIV-1 cartridge, mimicking routine patient specimen testing. This was in contrast to dried tube specimens (DTS), which have been used throughout sub-Saharan Africa for EQA [41,42,43]. DTS were not selected for this pilot as the NPP preferred to minimize specimen processing variability during specimen reconstitution by using a liquid panel, although DTS met all other requirements described. Furthermore, similarly to the original SAVQA panel, the verification program was designed for rapid deployment using local resources, decreasing reliance on scheduled schemes [44]. Shipping of liquid specimens is potentially problematic, given the risk of leakage, particularly if the transport infrastructure is poor (e.g., degraded road surfaces). Panels were well packaged and no leakage of the specimen from the tube into the protective packing was observed. However, the extra packaging, as described above, is recommended for similar panels going forward to minimize risk to transport personnel and to meet IATA requirements [72]. The infectivity of HIV when stored in MTM was not tested in this pilot, but existing studies have shown that pathogens are fully inactivated on addition to the buffer [48,50,54,57,73], while RNA integrity is simultaneously preserved [50,55,56,57], including HIV-1 RNA [58].
Thermostability of the panels, with little VL variation, was shown for a minimum of twelve weeks from manufacture. Earlier studies have shown that viral RNA (e.g., influenza) can be reliably detected for up to 196 days [54] and quantified for up to 23 days [56]. This study has shown longer-term stability of HIV RNA, although it should be noted that the manufacturer only recommends storage at ambient temperature for 30 days. Furthermore, stability testing was performed in Johannesburg during the South African winter and spring, with temperatures ranging from 8 to 23 °C, but with minimal humidity. More recent studies performed during the hotter months (maximum temperature 31 °C) and with increased humidity showed decreases of >1 log cp/mL by 10 weeks (personal communication, Dean Sher, SmartSpot Quality Pty (Ltd.), Johannesburg, Gauteng, South Africa); long-term stability in warmer, more humid climates is therefore a consideration. A recent manuscript reported decreased yield of Mycobacterium tuberculosis in oral mucosa specimens stored in PrimeStore MTM after 30 days and also after extended freezing [52], a finding that may similarly affect these specimens if frozen. Further stability evaluations in humid and warmer settings are recommended to ensure similar stability in such settings. In this evaluation, the QA specimens were made to order and generally tested within one week. It should also be noted that the dilution factor was applied to this pilot in order to allow for a comparison with the original SAVQA data.
To determine whether specimen variability [74] affected site performance relative to the reference VL, the specimen VL from all sites and the reference VL were each compared to the pooled mean VL of all sites. In all cases, the mean VL and the reference VL were similar (−0.02 log cp/mL mean difference), although the reference instrument produced a higher VL (5.24 log cp/mL) than the pooled site mean (5.02 log cp/mL) in the 5.0 log cp/mL range. This difference was not clinically significant and did not affect site performance outcomes. The bias of the single outlier specimen described (−0.91 log cp/mL) was acceptable for verification in terms of the panel design, but unacceptable in the retrospective analysis; however, the site still achieved a module score of 5/6 in the retrospective analysis and patient specimen testing could commence. A benefit of a quality program spanning multiple sites was that multiple instruments were tested concurrently, so panels could be compared to the pooled mean VL rather than only the reference VL; this provided an additional quality control of the reference instrument and the potential to highlight unexpected instability of the quality material. Retrospective analysis showed that the specimens could be evaluated at 0.3 and 0.5 log cp/mL bias [61,62], and these thresholds should be implemented when using this quality panel further.
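The retrospective bias evaluation above can be sketched in a few lines of code. This is a minimal illustration, not the study's analysis pipeline: the function names are ours, the thresholds (0.3 and 0.5 log cp/mL) follow the text, and the example values are taken from the outlier described above.

```python
def log_bias(site_vl_log, reference_vl_log):
    """Difference between a site result and the reference VL, in log cp/mL."""
    return site_vl_log - reference_vl_log


def classify_bias(bias, warn=0.3, fail=0.5):
    """Flag a specimen by absolute bias against the suggested thresholds."""
    b = abs(bias)
    if b <= warn:
        return "acceptable"
    if b <= fail:
        return "review"
    return "unacceptable"


def pooled_mean(results_log):
    """Pooled mean VL across sites, used as an extra check on the reference."""
    return sum(results_log) / len(results_log)


# The outlier specimen: 4.33 log cp/mL against a 5.24 log cp/mL reference.
print(classify_bias(log_bias(4.33, 5.24)))  # -> unacceptable
```

Comparing each site result to both the reference VL and the pooled mean, as described above, also guards against a drifting reference instrument.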
This design can be adapted to tiered laboratory systems to ensure continued quality POCT HIV VL testing, although resources (MTM buffer, plasma (if purchased), staff time to manufacture the panels and collate the results, post-manufacture quality testing and shipping) and individual country needs must be evaluated on a case-by-case basis [69]. Similarly, if this quality material were adopted by commercial suppliers, the cost and feasibility of scaled manufacture at an implementation price acceptable to countries needing such QA products should be investigated. Of note is the limited evidence on stability and compatibility with HIV VL assays other than Xpert® HIV-1 VL. This was not evaluated during this pilot, but the MTM buffer has been observed to occasionally interact negatively with certain HIV VL assays (personal communication, Dean Sher, SmartSpot Quality Pty (Ltd.), Johannesburg, Gauteng, South Africa). The value of formal verification or EQA panels should not be disregarded, particularly for smaller programs where globally standardized specimens may provide more rigorous quality measures [39,69], although mandatory participation in such schemes varies [39]. A further argument for commercial EQA panels is that they free program managers from producing panels and evaluating results, allowing that time to be spent assisting the laboratories that the EQA identifies as needing help, identifying root causes and implementing corrective actions [69]. Ultimately, whether in-house or commercial, the goal is to ensure quality laboratory testing [39,69], which impacts positively on patient care and management.
Ongoing quality monitoring at all levels of a tiered laboratory network is paramount to ensure that patient results are accurate. This can be difficult for POCT instruments placed in remote settings, where quality management options used in centralized laboratories are not feasible, but where quality monitoring is vital. The quality panels described in this manuscript provide simple and convenient verification and/or EQA options for countries aiming to implement Xpert® HIV-1 VL.

Supplementary Materials

The following are available online at https://www.mdpi.com/2075-4418/11/2/160/s1, Supplementary Table S1: Detailed Site Verification Summary: September 2017–November 2018; Supplementary Table S2: Detailed Site EQA Summary: September 2017–November 2018.

Author Contributions

Conceptualization: L.E.S., L.D.N., and P.D.S.; methodology: L.D.N. and L.E.S.; validation: L.D.N., A.B., and P.D.S.; formal analysis: L.D.N. and L.E.S.; resources: W.S.S., L.E.S., and P.D.S.; writing—original draft preparation: L.D.N.; writing—review and editing: L.E.S., P.D.S., W.S.S., and L.D.N.; supervision: L.E.S., P.D.S., and W.S.S.; project administration: A.B. and L.D.N.; funding acquisition: W.S.S., P.D.S., and L.E.S. All authors have read and agreed to the published version of the manuscript.

Funding

The project was supported by funding received from the National Department of Health with funds received from the Global Fund to Fight AIDS, Tuberculosis and Malaria (sub-recipient grant number: ZAC-C-NDOH). Lesley Scott and Lara Noble were supported by funds received from the AIDS Clinical Trials Group. Wendy Stevens and Lesley Scott were supported by funding received from the South African Medical Research Council with funds received from the South African National Department of Health, the UK Medical Research Council, and the UK Government's Newton Fund under the UK/South Africa Newton Fund (no. 015NEWTON TB). Wendy Stevens, Lesley Scott and Lara Noble are supported through funding received from the Bill and Melinda Gates Foundation (OPP1171455).

Institutional Review Board Statement

Ethical review and approval were not required for this study, as the anonymized plasma specimens used were purchased through agreements with the South African National Blood Service.

Informed Consent Statement

Patient specimens were not used for this study. Anonymized plasma specimens used were purchased through agreements with the South African National Blood Service.

Acknowledgments

The authors thank SmartSpot Quality (Pty) Ltd. for assistance in packaging the panels and developing the www.viralloadmonitor.com website, John Molifi for assistance with specimen shipping, and the site staff for their participation in the pilot project.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Joint United Nations Programme for HIV/AIDS. 90–90–90—An Ambitious Treatment Target to Help End the AIDS Epidemic; UNAIDS: Geneva, Switzerland, 2014. Available online: https://www.unaids.org/sites/default/files/media_asset/90-90-90_en.pdf (accessed on 30 November 2020).
  2. UNAIDS. Fast-Track: Ending the AIDS Epidemic by 2030. 2014. Available online: http://www.unaids.org/sites/default/files/media_asset/JC2686_WAD2014report_en.pdf (accessed on 28 July 2018).
  3. UNAIDS. Global HIV and AIDS Statistics—2020 Fact Sheet. 2020. Available online: https://unaids.org/en/resources/fact-sheet (accessed on 25 November 2020).
  4. UNAIDS. UNAIDS Data 2020. 2020. Available online: https://www.unaids.org/sites/default/files/media_asset/2020-aids-data-book_en.pdf (accessed on 25 November 2020).
  5. World Health Organisation. Consolidated Guidelines on the Use of Antiretroviral Drugs for Treating and Preventing HIV Infection: Recommendations for a Public Health Approach, 2nd ed.; World Health Organisation: Geneva, Switzerland, 2016.
  6. Berger, A.; Scherzed, L.; Sturmer, M.; Preiser, W.; Doerr, H.W.; Rabenau, H.F. Comparative evaluation of the Cobas Amplicor HIV-1 Monitor Ultrasensitive Test, the new Cobas AmpliPrep/Cobas Amplicor HIV-1 Monitor Ultrasensitive Test and the Versant HIV RNA 3.0 assays for quantitation of HIV-1 RNA in plasma samples. J. Clin. Virol. Off. Publ. Pan Am. Soc. Clin. Virol. 2005, 33, 43–51.
  7. De Mendoza, C.; Koppelman, M.; Montes, B.; Ferre, V.; Soriano, V.; Cuypers, H.; Segondy, M.; Oosterlaken, T. Multicenter evaluation of the NucliSens EasyQ HIV-1 v1.1 assay for the quantitative detection of HIV-1 RNA in plasma. J. Virol. Methods 2005, 127, 54–59.
  8. Stevens, W.; Horsfield, P.; Scott, L.E. Evaluation of the performance of the automated NucliSENS easyMAG and EasyQ systems versus the Roche AmpliPrep-AMPLICOR combination for high-throughput monitoring of human immunodeficiency virus load. J. Clin. Microbiol. 2007, 45, 1244–1249.
  9. Scott, L.E.; Noble, L.D.; Moloi, J.; Erasmus, L.; Venter, W.D.; Stevens, W. Evaluation of the Abbott m2000 RealTime human immunodeficiency virus type 1 (HIV-1) assay for HIV load monitoring in South Africa compared to the Roche Cobas AmpliPrep-Cobas Amplicor, Roche Cobas AmpliPrep-Cobas TaqMan HIV-1, and BioMerieux NucliSENS EasyQ HIV-1 assays. J. Clin. Microbiol. 2009, 47, 2209–2217.
  10. Chung, E.; Ferns, R.B.; He, M.; Rigatti, R.; Grant, P.; McCormick, A.; Bhagani, S.; Webster, D.P.; Nastouli, E.; Waters, L.J. Ultra-deep sequencing provides insights into the virology of hepatitis C super-infections in a case of three sequential infections with different genotypes. J. Clin. Virol. Off. Publ. Pan Am. Soc. Clin. Virol. 2015, 70, 63–66.
  11. Manak, M.M.; Hack, H.R.; Nair, S.V.; Worlock, A.; Malia, J.A.; Peel, S.A.; Jagodzinski, L.L. Evaluation of Hologic Aptima HIV-1 Quant Dx Assay on the Panther System on HIV Subtypes. J. Clin. Microbiol. 2016, 54, 2575–2581.
  12. Sam, S.S.; Kurpewski, J.R.; Cu-Uvin, S.; Caliendo, A.M. Evaluation of Performance Characteristics of the Aptima HIV-1 Quant Dx Assay for Detection and Quantitation of Human Immunodeficiency Virus Type 1 in Plasma and Cervicovaginal Lavage Samples. J. Clin. Microbiol. 2016, 54, 1036–1041.
  13. Longo, S.; Bon, I.; Musumeci, G.; Bertoldi, A.; D'Urbano, V.; Calza, L.; Re, M.C. Comparison of the Aptima HIV-1 Quant Dx assay with the COBAS AmpliPrep/COBAS TaqMan HIV-1 v2.0 Test for HIV-1 viral load quantification in plasma samples from HIV-1-infected patients. Health Sci. Rep. 2018, 1, e31.
  14. Sacks, J.A.; Fong, Y.; Gonzalez, M.P.; Andreotti, M.; Baliga, S.; Garrett, N.; Jordan, J.; Karita, E.; Kulkarni, S.; Mor, O.; et al. Performance of Cepheid Xpert HIV-1 viral load plasma assay to accurately detect treatment failure. Aids 2019, 33, 1881–1889.
  15. Carmona, S.; Peter, T.; Berrie, L. HIV viral load scale-up: Multiple interventions to meet the HIV treatment cascade. Curr. Opin. HIV AIDS 2017, 12, 157–164.
  16. Nichols, B.E.; Girdwood, S.J.; Crompton, T.; Stewart-Isherwood, L.; Berrie, L.; Chimhamhiwa, D.; Moyo, C.; Kuehnle, J.; Stevens, W.; Rosen, S.; et al. Monitoring viral load for the last mile: What will it cost? J. Int. AIDS Soc. 2019, 22, e25337.
  17. Amellal, B.; Murphy, R.; Maiga, A.; Brucker, G.; Katlama, C.; Calvez, V.; Marcelin, A.G. Stability of HIV RNA in plasma specimens stored at different temperatures. HIV Med. 2008, 9, 790–793.
  18. Vandamme, A.M.; Van Laethem, K.; Schmit, J.C.; Van Wijngaerden, E.; Reynders, M.; Debyser, Z.; Witvrouw, M.; Van Ranst, M.; De Clercq, E.; Desmyter, J. Long-term stability of human immunodeficiency virus viral load and infectivity in whole blood. Eur. J. Clin. Investig. 1999, 29, 445–452.
  19. Hardie, D.; Korsman, S.; Ameer, S.; Vojnov, L.; Hsiao, N.Y. Reliability of plasma HIV viral load testing beyond 24 hours: Insights gained from a study in a routine diagnostic laboratory. PLoS ONE 2019, 14, e0219381.
  20. Abbott RealTime HIV-1 Kit Insert 51-602100/R10. 2014. Available online: https://www.who.int/diagnostics_laboratory/evaluations/pq-list/hiv-vrl/160530_0145_027_00_final_public_report_v2.pdf (accessed on 30 November 2020).
  21. Roche CAP/CTM HIV-1 v2.0 EXPT-IVD. 2018. Available online: https://pim-eservices.roche.com/eLD/api/downloads/ee10c6ed-0bff-e911-fa90-005056a71a5d?countryIsoCode=za (accessed on 30 November 2020).
  22. Goedhals, D.; Scott, L.E.; Moretti, S.; Cooper, M.A.; Opperman, W.J.; Rossouw, I. Evaluation of the use of plasma preparation tubes for HIV viral load testing on the COBAS AmpliPrep/COBAS TaqMan HIV-1 version 2.0. J. Virol. Methods 2013, 187, 248–250.
  23. Luo, R.; Markby, J.; Sacks, J.; Vojnov, L. Systematic review of the accuracy of plasma preparation tubes for HIV viral load testing. PLoS ONE 2019, 14, e0225393.
  24. Ginocchio, C.C.; Wang, X.P.; Kaplan, M.H.; Mulligan, G.; Witt, D.; Romano, J.W.; Cronin, M.; Carroll, R. Effects of specimen collection, processing, and storage conditions on stability of human immunodeficiency virus type 1 RNA levels in plasma. J. Clin. Microbiol. 1997, 35, 2886–2893.
  25. Dickover, R.E.; Herman, S.A.; Saddiq, K.; Wafer, D.; Dillon, M.; Bryson, Y.J. Optimization of specimen-handling procedures for accurate quantitation of levels of human immunodeficiency virus RNA in plasma by reverse transcriptase PCR. J. Clin. Microbiol. 1998, 36, 1070–1073.
  26. Sikombe, K.; Hantuba, C.; Musukuma, K.; Sharma, A.; Padian, N.; Holmes, C.; Czaicki, N.; Simbeza, S.; Somwe, P.; Bolton-Moore, C.; et al. Accurate dried blood spots collection in the community using non-medically trained personnel could support scaling up routine viral load testing in resource limited settings. PLoS ONE 2019, 14, e0223573.
  27. Rutstein, S.E.; Hosseinipour, M.C.; Kamwendo, D.; Soko, A.; Mkandawire, M.; Biddle, A.K.; Miller, W.C.; Weinberger, M.; Wheeler, S.B.; Sarr, A.; et al. Dried blood spots for viral load monitoring in Malawi: Feasible and effective. PLoS ONE 2015, 10, e0124748.
  28. Barnabas, R.; Coombs, R.; Chang, M.; Schaafsma, T.; Asiimwe, S.; Thomas, K.; Baeten, J.; Celum, C. Dried Blood Spots Provide Accurate Enumeration of HIV-1 Viral Load in East Africa. In Proceedings of the 21st International AIDS Conference, Durban, South Africa, 18–22 July 2016.
  29. Schmitz, M.E.; Agolory, S.; Junghae, M.; Broyles, L.N.; Kimeu, M.; Ombayo, J.; Umuro, M.; Mukui, I.; Alwenya, K.; Baraza, M.; et al. Field evaluation of Dried Blood Spots for HIV-1 viral load monitoring in adults and children receiving antiretroviral treatment in Kenya: Implications for scale-up in resource-limited settings. J. Acquir. Immune Defic. Syndr. 2017.
  30. Pollack, T.M.; Duong, H.T.; Truong, P.T.; Pham, T.T.; Do, C.D.; Colby, D. Sensitivity and specificity of two dried blood spot methods for HIV-1 viral load monitoring among patients in Hanoi, Vietnam. PLoS ONE 2018, 13, e0191411.
  31. Zeh, C.; Ndiege, K.; Inzaule, S.; Achieng, R.; Williamson, J.; Chih-Wei Chang, J.; Ellenberger, D.; Nkengasong, J. Evaluation of the performance of Abbott m2000 and Roche COBAS Ampliprep/COBAS Taqman assays for HIV-1 viral load determination using dried blood spots and dried plasma spots in Kenya. PLoS ONE 2017, 12, e0179316.
  32. Zida, S.; Tuaillon, E.; Barro, M.; Kwimatouo Lekpa Franchard, A.; Kagone, T.; Nacro, B.; Ouedraogo, A.S.; Bollore, K.; Sanosyan, A.; Plantier, J.C.; et al. Estimation of HIV-1 DNA Level Interfering with Reliability of HIV-1 RNA Quantification Performed on Dried Blood Spots Collected from Successfully Treated Patients. J. Clin. Microbiol. 2016, 54, 1641–1643.
  33. Smit, P.W.; Sollis, K.A.; Fiscus, S.; Ford, N.; Vitoria, M.; Essajee, S.; Barnett, D.; Cheng, B.; Crowe, S.M.; Denny, T.; et al. Systematic review of the use of dried blood spots for monitoring HIV viral load and for early infant diagnosis. PLoS ONE 2014, 9, e86461.
  34. Inzaule, S.C.; Hamers, R.L.; Zeh, C.E.; Rinke de Wit, T.F. Stringent HIV Viral Load Threshold for Virological Failure Using Dried Blood Spots: Is the Perfect the Enemy of the Good? J. Acquir. Immune Defic. Syndr. 2016, 71, e30–e33.
  35. Cassim, N.; Coetzee, L.M.; Stevens, W.S.; Glencross, D.K. Addressing antiretroviral therapy-related diagnostic coverage gaps across South Africa using a programmatic approach. Afr. J. Lab. Med. 2018, 7, 681.
  36. Glencross, D.K.; Coetzee, L.M.; Cassim, N. An integrated tiered service delivery model (ITSDM) based on local CD4 testing demands can improve turn-around times and save costs whilst ensuring accessible and scalable CD4 services across a national programme. PLoS ONE 2014, 9, e114727.
  37. Gous, N.; Scott, L.; Berrie, L.; Stevens, W. Options to Expand HIV Viral Load Testing in South Africa: Evaluation of the GeneXpert(R) HIV-1 Viral Load Assay. PLoS ONE 2016, 11, e0168244.
  38. World Health Organisation. WHO Prequalification of In Vitro Diagnostics PUBLIC REPORT Product: Xpert® HIV-1 Viral Load with GeneXpert® Dx, GeneXpert® Infinity-48, GeneXpert® Infinity-48s and GeneXpert® Infinity-80; WHO Reference Numbers: PQDx 0192-070-00, PQDx 0193-070-00, PQDx 0194-070-00, PQDx 0195-070-00. Available online: http://www.who.int/diagnostics_laboratory/evaluations/pq-list/hiv-vrl/170720_final_pq_report_pqdx_0192_0193_0194_0195_070-00.pdf?ua=1 (accessed on 2 August 2017).
  39. Payne, D.A.; Russomando, G.; Linder, M.W.; Baluchova, K.; Ashavaid, T.; Steimer, W.; Ahmad-Nejad, P. External quality assessment (EQA) and alternative assessment procedures (AAPs) in molecular diagnostics: Findings of an international survey. Clin. Chem. Lab. Med. 2020.
  40. World Health Organisation. Global TB Programme and Department of HIV/AIDS Information Note: Considerations for Adoption and Use of Multidisease Testing Devices in Integrated Laboratory Networks. 2017. Available online: https://www.who.int/tb/publications/2017/considerations_multidisease_testing_devices_2017/en/ (accessed on 12 December 2019).
  41. Parekh, B.S.; Anyanwu, J.; Patel, H.; Downer, M.; Kalou, M.; Gichimu, C.; Keipkerich, B.S.; Clement, N.; Omondi, M.; Mayer, O.; et al. Dried tube specimens: A simple and cost-effective method for preparation of HIV proficiency testing panels and quality control materials for use in resource-limited settings. J. Virol. Methods 2010, 163, 295–300.
  42. Nguyen, S.; Ramos, A.; Chang, J.; Li, B.; Shanmugam, V.; Boeras, D.; Nkengasong, J.N.; Yang, C.; Ellenberger, D. Monitoring the quality of HIV-1 viral load testing through a proficiency testing program using dried tube specimens in resource-limited settings. J. Clin. Microbiol. 2015, 53, 1129–1136.
  43. Ramos, A.; Nguyen, S.; Garcia, A.; Subbarao, S.; Nkengasong, J.N.; Ellenberger, D. Generation of dried tube specimen for HIV-1 viral load proficiency test panels: A cost-effective alternative for external quality assessment programs. J. Virol. Methods 2013, 188, 1–5.
  44. Scott, L.E.; Carmona, S.; Gous, N.; Horsfield, P.; Mackay, M.; Stevens, W. Use of a prequalification panel for rapid scale-up of high-throughput HIV viral load testing. J. Clin. Microbiol. 2012, 50, 4083–4086.
  45. Gous, N.; Bethlehem, L.; Subramunian, C.; Coetzee, J.; Stevens, W.; Scott, L.E. New Options for HIV Viral Load testing: The Panther Aptima HIV-1 Quant Dx assay (Hologics, Inc.). In Proceedings of the African Society for Laboratory Medicine, Cape Town, South Africa, 3–8 December 2016.
  46. Scott, L.; Gous, N.; Carmona, S.; Stevens, W. Laboratory evaluation of the Liat HIV Quant (IQuum) whole-blood and plasma HIV-1 viral load assays for point-of-care testing in South Africa. J. Clin. Microbiol. 2015, 53, 1616–1621.
  47. Scott, L.; Gous, N.; Carmona, S.; Stevens, W. Performance of Xpert® HIV-1 Quant compared to Roche CAP/CTM v2 and Abbott RealTime HIV-1 on a prequalification plasma validation panel. In Proceedings of the African Society of Laboratory Medicine (ASLM), Cape Town, South Africa, 30 November–4 December 2014.
  48. Evaluation of Automatic Class III Designation for PrimeStore MTM Decision Summary. 2018. Available online: https://www.accessdata.fda.gov/cdrh_docs/reviews/DEN170029.pdf (accessed on 11 December 2019).
  49. Daum, L.T.; Fourie, P.B.; Peters, R.P.; Rodriguez, J.D.; Worthy, S.A.; Khubbar, M.; Bhattacharyya, S.; Gradus, M.S.; Mboneni, T.; Marubini, E.E.; et al. Xpert® MTB/RIF detection of Mycobacterium tuberculosis from sputum collected in molecular transport medium. Int. J. Tuberc. Lung Dis. 2016, 20, 1118–1124.
  50. Daum, L.T.; Choi, Y.; Worthy, S.A.; Rodriguez, J.D.; Chambers, J.P.; Fischer, G.W. A molecular transport medium for collection, inactivation, transport, and detection of Mycobacterium tuberculosis. Int. J. Tuberc. Lung Dis. 2014, 18, 847–849.
  51. Mboneni, T.A.; Eales, O.O.; Maningi, N.E.; Hugo, J.F.M.; Fourie, P.B. Detection by RT-PCR of Mycobacterium tuberculosis from oral swab specimens using PrimeStore® molecular transport medium. In Proceedings of the 20th European Congress of Clinical Microbiology and Infectious Diseases, Amsterdam, The Netherlands, 13–16 April 2019.
  52. Molina-Moya, B.; Ciobanu, N.; Hernandez, M.; Prat-Aymerich, C.; Crudu, V.; Adams, E.R.; Codreanu, A.; Sloan, D.J.; Cuevas, L.E.; Dominguez, J. Molecular Detection of Mycobacterium tuberculosis in Oral Mucosa from Patients with Presumptive Tuberculosis. J. Clin. Med. 2020, 9, 4124.
  53. Bimba, J.S.; Lawson, L.; Kontogianni, K.; Edwards, T.; Ekpenyong, B.E.; Dodd, J.; Adams, E.R.; Sloan, D.J.; Creswell, J.; Dominguez, J.; et al. PrimeStore MTM and OMNIgene Sputum for the Preservation of Sputum for Xpert MTB/RIF Testing in Nigeria. J. Clin. Med. 2019, 8, 2146.
  54. Schlaudecker, E.P.; Heck, J.P.; MacIntyre, E.T.; Martinez, R.; Dodd, C.N.; McNeal, M.M.; Staat, M.A.; Heck, J.E.; Steinhoff, M.C. Comparison of a new transport medium with universal transport medium at a tropical field site. Diagn. Microbiol. Infect. Dis. 2014, 80, 107–110.
  55. Daum, L.T.; Worthy, S.A.; Yim, K.C.; Nogueras, M.; Schuman, R.F.; Choi, Y.W.; Fischer, G.W. A clinical specimen collection and transport medium for molecular diagnostic and genomic applications. Epidemiol. Infect. 2011, 139, 1764–1773.
  56. Daum, L.T.; Rodriguez, J.D.; Fischer, J.D.; Fischer, G.W. Influenza Viral Detection from Nasal Wash, Throat, and Nasopharyngeal Swabs Collected and Preserved in PrimeStore Molecular Transport Medium. In Proceedings of the 6th ISIRV-AVG Conference, Washington, DC, USA, 13–15 November 2018.
  57. Van Bockel, D.; Munier, C.M.L.; Turville, S.; Badman, S.G.; Walker, G.; Stella, A.O.; Aggarwal, A.; Yeang, M.; Condylios, A.; Kelleher, A.D.; et al. Evaluation of Commercially Available Viral Transport Medium (VTM) for SARS-CoV-2 Inactivation and Use in Point-of-Care (POC) Testing. Viruses 2020, 12, 1208.
  58. Gous, N.; Scott, L.; Stevens, W. Can dried blood spots or whole blood liquid transport media extend access to HIV viral load testing? In Proceedings of the African Society for Laboratory Medicine Conference, Cape Town, South Africa, 30 November–2 December 2014.
  59. Hengel, B.; Causer, L.; Matthews, S.; Smith, K.; Andrewartha, K.; Badman, S.; Spaeth, B.; Tangey, A.; Cunningham, P.; Phillips, E.; et al. A decentralised point-of-care testing model to address inequities in the COVID-19 response. Lancet Infect. Dis. 2020.
  60. Cunningham, B.; Scott, L.; Molapo, S.; Gous, N.; Erasmus, L.; Stevens, W. Web-based automated EQA and Instrument Verification reporting tool for the Xpert® MTB/RIF assay. In Proceedings of the 3rd South African TB Conference, Durban, South Africa, 2–5 June 2012.
  61. Senechal, B.; James, V.L. Ten years of external quality assessment of human immunodeficiency virus type 1 RNA quantification. J. Clin. Microbiol. 2012, 50, 3614–3619.
  62. Brambilla, D.; Granger, S.; Bremer, J. Variation in HIV RNA assays at low RNA concentration, abstr 774. In Proceedings of the 7th Conference on Retroviruses and Opportunistic Infections (CROI), San Francisco, CA, USA, 30 January–2 February 2000.
  63. Gous, N.; Cunningham, B.; Kana, B.; Stevens, W.; Scott, L.E. Performance monitoring of Mycobacterium tuberculosis dried culture spots for use with the GeneXpert system within a national program in South Africa. J. Clin. Microbiol. 2013, 51, 4018–4021.
  64. Scott, L.E.; Gous, N.; Cunningham, B.E.; Kana, B.D.; Perovic, O.; Erasmus, L.; Coetzee, G.J.; Koornhof, H.; Stevens, W. Dried culture spots for Xpert MTB/RIF external quality assessment: Results of a phase 1 pilot study in South Africa. J. Clin. Microbiol. 2011, 49, 4356–4360.
  65. Lin, L.I.-K. A note on the concordance correlation coefficient. Biometrics 2000, 56, 324–325.
  66. Lin, L.I.-K. A concordance correlation coefficient to evaluate reproducibility. Biometrics 1989, 45, 255–268.
  67. Bland, J.M.; Altman, D.G. Measuring agreement in method comparison studies. Stat. Methods Med. Res. 1999, 8, 135–160.
  68. Bland, J.M.; Altman, D.G. Statistical methods for assessing agreement between two methods of clinical measurement. Lancet 1986, 1, 307–310.
  69. World Health Organization. Laboratory Quality Management System (LQMS) Training Toolkit, Module 10: External Quality Assessment (EQA); World Health Organization: Geneva, Switzerland, 2011. Available online: https://www.who.int/ihr/training/laboratory_quality/eqa_assessment/en/ (accessed on 16 January 2021).
  70. Gous, N.M.; Onyebujoh, P.C.; Abimiku, A.; Macek, C.; Takle, J. The role of connected diagnostics in strengthening regional, national and continental African disease surveillance. Afr. J. Lab. Med. 2018, 7, 775.
  71. Nash, M.; Huddart, S.; Badar, S.; Baliga, S.; Saravu, K.; Pai, M. Performance of the Xpert HIV-1 Viral Load Assay: A Systematic Review and Meta-analysis. J. Clin. Microbiol. 2018, 56.
  72. International Air Transport Association. Infectious Substances Shipping Guidelines, 15th ed.; International Air Transport Association: Montreal, QC, Canada, 2020.
  73. Reeve, B.W.P.; McFall, S.M.; Song, R.; Warren, R.; Steingart, K.R.; Theron, G. Commercial products to preserve specimens for tuberculosis diagnosis: A systematic review. Int. J. Tuberc. Lung Dis. 2018, 22, 741–753.
  74. Nolte, F.S. Impact of viral load testing on patient care. Arch. Pathol. Lab. Med. 1999, 123, 1011–1014.
Figure 1. Processing of Verification and EQA panels. (a) Verification panel: same module must be used for each set of specimens. Verification panels are labelled with orange labels to remind users of this. (b) EQA panel: different modules must be used for each specimen. HIV: human immunodeficiency virus. VL: viral load. cp/mL: copies per milliliter. EQA: external quality assessment.
Figure 2. Verification panel VL variation (log cp/mL) across different testing sites (n = 13). (a) Regression analysis for all verification panels tested between September 2017 and November 2018. (b) Bland-Altman agreement of the viral load results, compared to the reference result obtained at panel preparation. One outlier specimen (4.33 log cp/mL; −0.91 log cp/mL difference from reference VL) was noted in the 5.0 log cp/mL category, but was within the acceptable range for the pilot panels (<1.0 log cp/mL). VL: viral load. cp/mL: copies per milliliter. SD: standard deviation.
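The agreement statistics reported in Figures 2 and 3 (and in the abstract) are Lin's concordance correlation coefficient and Bland-Altman mean bias with 95% limits of agreement. The sketch below shows how these quantities are computed; it is an illustration using our own function names and simple population-moment formulas, not the software used in the study.

```python
import math


def concordance_correlation(x, y):
    """Lin's concordance correlation coefficient using population moments."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x) / n
    syy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * sxy / (sxx + syy + (mx - my) ** 2)


def bland_altman(x, y):
    """Mean bias and 95% limits of agreement (mean diff ± 1.96 SD)."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean, mean - 1.96 * sd, mean + 1.96 * sd
```

A perfectly concordant pair of result sets yields a coefficient of 1.0; any constant offset between site and reference results lowers the coefficient and appears directly as the Bland-Altman mean bias.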
Figure 3. EQA Panel VL variation (log cp/mL) across different testing sites (n = 13) and EQA panels (n = 2). (a) Regression analysis for EQA Panels 1 and 2 (n = 102/104 specimens). (b) Bland-Altman agreement of the viral load results (n = 102/104 specimens), compared to the reference result obtained at panel preparation. EQA: external quality assessment. VL: viral load. cp/mL: copies per milliliter. SD: standard deviation.
Figure 4. Stability of EQA Pilot Panel, Baseline to Day 150. (a) Bar chart showing VL from Day 0 to Day 28, with specimens tested in duplicate. There is little VL variability. (b) Bar chart showing VL from Day 0 to Day 150. There is a decrease in VL between Day 84 and Day 150. The VL remains within 0.2 log cp/mL of the expected VL for the log 3.7 and log 4.7 specimens until Day 84. There is a decrease at Day 84 for the log 3 specimen, but it remains within 0.5 log cp/mL of the expected VL. By Day 150, all VLs differ by >0.5 log cp/mL from Day 0, with both the log 3.7 and log 4.7 specimens showing a VL decrease of >1.0 log cp/mL. EQA: external quality assessment. VL: viral load. cp/mL: copies per milliliter.
Table 1. Summary of scoring logic.
| Specimen Score | Results | Outcome |
|---|---|---|
| 2/2 | Correct result | Pass |
| 1/2 | Error, Invalid, No result; >1.0 log cp/mL quantifiable result bias | Acceptable |
| 0/2 | Incorrect result (e.g., HIV positive reported as HIV negative) | Concern |

| Verification Score | Percentage Performance | Outcome |
|---|---|---|
| 6/6 | 100% | Pass |
| 5/6 | 83.3% | Acceptable |
| ≤4/6 | 66.7% | Unacceptable |

| EQA Score | Percentage Performance | Outcome |
|---|---|---|
| 8/8 | 100% | Pass |
| 7/8 | 87.5% | Acceptable |
| 6/8 | 75.0% | Concern |
| ≤5/8 | 62.5% | Unacceptable |
The table is divided into specimen score, verification score and EQA score sections (shown in bold). Specimen Score: Each specimen generates a score out of two. Verification Score: Verification of a module generates a score out of six (three specimens per module). EQA Score: EQA of an instrument generates a score out of eight (four specimens per instrument, run over different modules). If an unacceptable score is obtained, the site is required to conduct a root cause analysis and corrective action, and to test a second verification or EQA panel. Site trainers or monitors may provide further interventions (e.g., staff training, instrument calibration). EQA: external quality assessment. HIV: human immunodeficiency virus. cp/mL: copies per milliliter.
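The scoring logic in Table 1 can be sketched in code. This is a minimal illustration under the rules stated in the table; the function names and result strings are assumptions, not taken from the study's software.

```python
# Sketch of the Table 1 scoring logic: specimens score out of 2, and per-panel
# totals map to outcomes (out of 6 for verification, out of 8 for EQA).
# Function names and result strings are illustrative.

def specimen_score(expected, result, bias_log=None):
    """Score one specimen out of 2 per the Table 1 rules."""
    if result in ("Error", "Invalid", "No result"):
        return 1
    if bias_log is not None and abs(bias_log) > 1.0:
        return 1          # quantifiable result off by >1.0 log cp/mL
    if result == expected:
        return 2
    return 0              # e.g., HIV positive reported as HIV negative

VERIFICATION_OUTCOMES = {6: "Pass", 5: "Acceptable"}       # score out of 6
EQA_OUTCOMES = {8: "Pass", 7: "Acceptable", 6: "Concern"}  # score out of 8

def verification_outcome(total):
    return VERIFICATION_OUTCOMES.get(total, "Unacceptable")

def eqa_outcome(total):
    return EQA_OUTCOMES.get(total, "Unacceptable")
```

An unacceptable outcome would then trigger the root cause analysis and repeat panel described above.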
Table 2. Site Verification Summary: September 2017–November 2018 (compared to reference VL).
| Panel | Expected VL (log cp/mL) | Reference VL (log cp/mL) | Tested (n) | Result Obtained (n (%)) | VL Bias: Mean (Median); Range (log cp/mL) | Standard Deviation of Mean Bias (log cp/mL) | Error (n) | Invalid (n) | Reference vs. Mean (log cp/mL) |
|---|---|---|---|---|---|---|---|---|---|
| 1 | Negative | Negative | 52 | 39 (75.0) | 0 | 0 | 12 | 1 | 0 |
| 2 c | Negative | Negative | 5 | 5 (100) | 0 | 0 | 0 | 0 | 0 |
| 1 | 2.70 | 2.70 | 26 | 23 (88.5) | 0.04 (0.06); −0.33, 0.34 | 0.15 | 2 | 1 | −0.04 |
| 2 c | 2.70 | 2.81 | 5 | 5 (100) | −0.11 (−0.10); −0.19, −0.06 | 0.06 | 0 | 0 | 0.11 |
| Overall (log 2.70) | - | - | 31 | 28 (90.3) | 0.02 (−0.02); −0.33, 0.34 b | 0.15 | 2 | 1 | - |
| 1 | 3.00 | 2.91 | 26 | 25 (96.2) | 0.13 (0.13); −0.17, 0.36 | 0.16 | 1 | 0 | −0.14 |
| 1 | 4.70 | 4.75 | 26 | 25 (96.2) | −0.01 (0.00); −0.19, 0.20 | 0.09 | 1 | 0 | 0.01 |
| 1 | 5.00 | 5.24 | 26 | 25 (96.2) | −0.22 (−0.20); −0.91 a, −0.01 | 0.18 | 1 | 0 | 0.22 a |
| 2 c | 5.00 | 5.21 | 5 | 4 (80.0) | −0.25 (−0.25); −0.30, −0.20 | 0.04 | 0 | 1 | 0.25 |
| Overall (log 5.00) | - | - | 31 | 29 (93.6) | −0.23 (−0.22); −0.91, −0.01 | 0.17 | 1 | 1 | - |
| Overall (57 verification panels) | - | - | 171 | 151/171 (88.3); quantified: 107/114 (93.9) | −0.02 (0.00); −0.91, 0.36 | 0.16 | 17 (9.9%) | 3 (1.8%) | 0.07 |
a increased variability owing to one outlier specimen (4.33 log cp/mL). If this specimen is excluded, the mean bias increases to −0.19 log cp/mL with a range of −0.41 to −0.01, and the difference between the reference and the pooled mean decreases to 0.19 log cp/mL. b variation around the median >0.30 when two panels are combined, but remains <0.03 log cp/mL in the individual panels. c verification panel 2 numbers are low (n = 5), so values lack robustness, but are similar to the larger panel 1. n: number. cp/mL: copies per milliliter.
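The per-panel bias statistics reported in Table 2 (mean, median, range and standard deviation of the bias against the reference VL) can be computed as sketched below. The replicate values and the reference of 4.75 log cp/mL are hypothetical, chosen only to mirror the table's layout.

```python
# Sketch of the per-panel bias summary used in Table 2: bias = site VL minus
# reference VL (log cp/mL). Data values are hypothetical.
import statistics

def bias_summary(site_vls, reference_vl):
    biases = [v - reference_vl for v in site_vls]
    return {
        "mean": statistics.mean(biases),
        "median": statistics.median(biases),
        "range": (min(biases), max(biases)),
        "sd": statistics.stdev(biases),  # sample SD of the bias
    }

# Hypothetical replicate results against a reference of 4.75 log cp/mL:
summary = bias_summary([4.74, 4.76, 4.70, 4.80, 4.75], 4.75)
print({k: (round(v, 3) if isinstance(v, float) else v) for k, v in summary.items()})
```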
Table 3. Site EQA Summary: September 2017–November 2018.
| Panel | Expected VL (log cp/mL) | Reference VL (log cp/mL) | Tested (n) | Result Obtained (n (%)) | VL Bias: Mean (Median); Range (log cp/mL) | Standard Deviation of Mean Bias (log cp/mL) | Error (n) | Reference vs. Mean (log cp/mL) |
|---|---|---|---|---|---|---|---|---|
| 1 | Negative | Negative | 13 | 13 (100) | 0 | 0 | - | 0 |
| 2 | Negative | Negative | 13 | 13 (100) | 0 | 0 | - | 0 |
| 1 | 3.00 | 3.06 | 13 | 12 (92.3) | 0.02 (0.02); −0.14, 0.30 | 0.11 | 1 | −0.02 |
| 2 | 3.00 | 3.11 | 13 | 13 (100) | 0.02 (0.04); −0.20, 0.22 | 0.12 | - | 0.05 |
| Overall (log 3.00) | 3.00 | 3.09 | 26 | 25 (96.2) | 0.02 (0.02); −0.20, 0.30 | 0.11 | 1 | - |
| 1 | 3.70 | 3.72 | 13 | 13 (100) | −0.06 (−0.06); −0.18, 0.05 | 0.07 | - | 0.06 |
| 2 | 3.70 | 3.73 | 13 | 13 (100) | −0.04 (−0.04); −0.17, 0.06 | 0.07 | - | 0.01 |
| Overall (log 3.70) | 3.70 | 3.73 | 26 | 26 (100) | −0.05 (−0.05); −0.18, 0.06 | 0.07 | - | - |
| 1 | 4.70 | 4.80 | 13 | 13 (100) | −0.05 (−0.04); −0.16, 0.11 | 0.08 | - | 0.05 |
| 2 | 4.70 | 4.78 | 13 | 12 (92.3) | −0.01 (0.01); −0.13, 0.05 | 0.05 | 1 | −0.02 |
| Overall (log 4.70) | 4.70 | 4.79 | 26 | 25 (96.2) | −0.03 (−0.02); −0.16, 0.11 | 0.07 | 1 | - |
| Overall (26 EQA panels) | - | - | 104 | 102/104 (98.1) | −0.02 (−0.02); −0.20, 0.30 | 0.09 | 2 (1.9%) | - |
EQA: External quality assessment. n: number. cp/mL: copies per milliliter.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
