key: cord-0015610-xh3zjmvm
authors: Noble, Lara Dominique; Scott, Lesley Erica; Bongwe, Asiashu; Da Silva, Pedro; Stevens, Wendy Susan
title: The Development of a Standardized Quality Assessment Material to Support Xpert(®) HIV-1 Viral Load Testing for ART Monitoring in South Africa
date: 2021-01-22
journal: Diagnostics (Basel)
DOI: 10.3390/diagnostics11020160
sha: 376acb77c5b3e09e3ed49e077c7268b469c6a4d2
doc_id: 15610
cord_uid: xh3zjmvm

The tiered laboratory framework for human immunodeficiency virus (HIV) viral load monitoring accommodates a range of HIV viral load testing platforms, with quality assessment critical to ensure quality patient testing. HIV plasma viral load testing is challenged by the instability of viral RNA. An approach using an RNA stabilizing buffer is described for the Xpert(®) HIV-1 Viral Load (Cepheid) assay and was tested in remote laboratories in South Africa. Plasma panels with known HIV viral titres were prepared in PrimeStore molecular transport medium for per-module verification and per-instrument external quality assessment. The panels were transported at ambient temperatures to 13 testing laboratories during 2017 and 2018, tested according to standard procedures and uploaded to a web portal for analysis. A total of 275 quality assessment specimens (57 verification panels and two EQA cycles) were tested. All participating laboratories met study verification criteria (n = 171 specimens) with an overall concordance correlation coefficient (ρ(c)) of 0.997 (95% confidence interval (CI): 0.996 to 0.998) and a mean bias of −0.019 log copies per milliliter (cp/mL) (95% CI: −0.044 to 0.063). The overall EQA ρ(c) (n = 104 specimens) was 0.999 (95% CI: 0.998 to 0.999), with a mean bias of 0.03 log cp/mL (95% CI: 0.02 to 0.05). These panels are suitable for use in quality monitoring of Xpert(®) HIV-1 VL and are applicable to laboratories in remote settings.

Several countries striving to attain their 2020 UNAIDS 90%/90%/90% targets for global HIV healthcare [1] struggle with the third 90% (virological suppression). Fast-track targets were designed to address this [2], aiming to increase the number of people living with HIV (PLWH) accessing treatment and achieving virological suppression. Current global estimates show that 25.4 million people, approximately 67% of PLWH, were accessing antiretroviral therapy (ART) by end-2019 [3], and monitoring needs are likely to increase over the next decade as more people access ART. A total of 5,231,809 (70%) patients currently access ART in South Africa alone [4], with the number expected to increase as the remaining PLWH are reached. The recommended test for monitoring ART response is HIV viral load (VL) quantification [5]. This has historically been performed at centralized laboratories owing to the number of specimens requiring processing, the logistical needs of the available technologies, and the lack of accurate and cost-effective near-patient VL technologies. South Africa has addressed the VL scale-up testing needs through a highly centralized model within the National Health Laboratory Service (NHLS), which is responsible for laboratory testing of ~80% of the population.

In addition to the programs described above, the South African Viral Load Quality Assessment (SAVQA) panel [44] was previously developed to address the need for scaled HIV VL services in centralized HIV VL laboratories.
This panel provides an accessible option for the verification of newly installed HIV VL testing platforms, initially the RealTime HIV-1 (Abbott Molecular, Des Plaines, IL, USA) and cobas® AmpliPrep/cobas® TaqMan® (CAP/CTM; Roche Molecular, Pleasanton, CA, USA) assays, prior to testing clinical specimens, and has also been used for the rapid evaluation of new HIV VL assays [37,45–47]. The SAVQA panel [44] is a 42-specimen plasma panel prepared from purchased human plasma (known HIV-1 positive/negative) and quantified using RealTime HIV-1, CAP/CTM and cobas® HIV-1 (Roche Molecular, Pleasanton, CA, USA). The panel is stored and shipped frozen, and only defrosted immediately prior to testing. The panel comprises 17 negative specimens and five repeats of five positive specimens with VL ranging from 2.7 log cp/mL to 5.0 log cp/mL. The panel was designed to measure accuracy, precision, carryover and limit of the blank [44]. The SAVQA panel was readily available, but was not suitable in its existing format. The panel required adaptation to avoid the need for cold-chain shipping and storage, as the remote testing sites had no refrigeration facilities. It was also desirable to include a smaller number of specimens to minimize cost and time constraints, as the GeneXpert® is a modular, cartridge-based system designed for random-access, single-specimen testing. We therefore designed a miniaturized, thermostable version of the SAVQA panel using a commercially available matrix, PrimeStore® Molecular Transport Medium (MTM; Longhorn Vaccines and Diagnostics LLC, Bethesda, MD, USA), to allow ambient temperature shipping and storage. This medium achieved US FDA approval in 2018 [48], and has been evaluated with a variety of mycobacterial [49–53] and viral [54–57] specimens, including HIV [58]. In addition to the use of MTM-stored specimens with PrimeMix® [50,55,56], MTM has been shown to be compatible with the Xpert® MTB/RIF [52,56] and, more recently, the Xpert® Xpress SARS-CoV-2 [57,59] assays (Cepheid, Sunnyvale, CA, USA), as well as the m2000 RealTime HIV-1 assay [58]. Verification panels were developed alongside a web-based result reporting tool, which was based on the web portal (www.tbgxmonitor.com) previously developed for Xpert® MTB/RIF quality monitoring [60]. Following the successful verification rollout, an external quality assessment (EQA) panel was requested and was designed to measure pre- and post-processing analytics at these pilot laboratories. This manuscript aims to provide a detailed description of these pilot quality panels as an option for point-of-care testing (POCT) HIV VL sites, using clinically relevant panel specimens which can be prepared centrally and sent to remote sites. These panels were specifically designed to meet the needs of remote testing laboratories using the Xpert® HIV-1 VL assay, notably limited cold-chain shipping and cold-storage facilities on site, low-throughput testing platforms, the need for ad hoc verification products and, frequently, lower-skilled laboratory staff. The use of QA materials, particularly when evaluated between laboratories, ensures that instruments are fit-for-purpose and that onsite processing is robust, thus ensuring the best possible patient result quality within a tiered laboratory framework. A SAVQA plasma panel, as described above, was removed from storage (−80 °C) and defrosted at ambient temperature, followed by brief centrifugation (3000 rpm, 1 min).
HIV-negative specimens (1.3 mL) were not mixed with MTM, providing a clinically relevant specimen and avoiding the decreased viscosity/fat content of the MTM. The negative specimen is important to ensure that no cross-contamination occurs in either the reference laboratory or the testing laboratory during specimen preparation and testing. HIV-positive plasma specimens (300 µL) with known VL were added to 1 mL MTM (Longhorn Vaccines and Diagnostics LLC, Bethesda, MD, USA), giving a dilution factor of 4.3 (total volume/specimen volume). To minimize the risk of leakage, each specimen was packaged individually in a sealed plastic bag with an absorbent pad, and the complete panel was then placed into a second sealable bag. Specimens were shipped at ambient temperature using the routine NHLS specimen transport system. Two panel formats were designed: (i) a verification panel (Figure 1a) and (ii) an EQA panel (Figure 1b). The verification panel was used to ensure that instruments were functioning correctly upon installation, instrument (module) replacement or instrument movement, and can also be used for staff training. The verification panel consisted of three specimens per module tested: two specimens of known HIV VL stabilized in MTM buffer and one HIV-negative specimen (plasma only). The target ranges for the HIV-positive specimens were 2.7 log cp/mL (low), 3.0 log cp/mL (low), 4.7 log cp/mL (high), and 5.0 log cp/mL (high). All sites received one low VL specimen, one high VL specimen and one HIV-negative specimen, as per testing organization requirements. The EQA panel was necessary for ongoing monitoring of instruments and testing sites. Four specimens were provided per instrument tested, with an instrument being defined as "up to four" GeneXpert® systems attached to one computer. The panel included three specimens of known HIV VL stabilized in MTM buffer, with target ranges of 3.0 log cp/mL, 3.7 log cp/mL and 4.7 log cp/mL, and one HIV-negative plasma specimen. On preparation of either panel format, one specimen in each range was tested using the reference laboratory GeneXpert® instrument (reference specimen; day 0). Both the verification and EQA specimens were processed according to the Xpert® HIV-1 VL manufacturer's instructions (Cepheid, Sunnyvale, CA, USA), using the liquid panel in place of clinical plasma. Briefly, the Xpert® HIV-1 VL cartridge was opened and the entire specimen volume (1.3 mL) was transferred into the cartridge using a precision pipette or 1 mL Pasteur pipette (supplied by Cepheid as part of the kit). The specimen barcode and cartridge number were scanned, and the specimen was tested using the Xpert® HIV-1 Viral Load assay definition file. The original standard operating procedure (SOP) did not include centrifugation instructions, but this was amended after the first verification panel was analysed to ensure that every specimen was briefly centrifuged (3000 rpm, 1 min) prior to processing. A web portal (www.viralloadmonitor.com) for the upload of both verification and EQA results and for report generation, based on the original TBGxMonitor website [60], was created in collaboration with SmartSpot Quality (Pty) Ltd. (Johannesburg, Gauteng, South Africa). Users were required to upload the comma-separated values (CSV) run files (automatically produced by the GeneXpert® software) for the Xpert® HIV-1 VL panel specimens using a USB device. Results were converted using the dilution factor (4.3), which was applied within the website logic as part of the scoring algorithm.
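As a minimal illustration of this conversion step (a sketch only; the actual website logic is not reproduced here, and the function name, constant and example values below are hypothetical), the measured result is scaled by the dilution factor and then compared on the log scale:

```python
import math

DILUTION_FACTOR = 4.3  # total volume (1.3 mL) / plasma input volume (0.3 mL)

def corrected_log_vl(measured_cp_per_ml: float) -> float:
    """Scale an Xpert-reported viral load from a diluted panel specimen back to
    the undiluted plasma equivalent and express it in log10 cp/mL."""
    return math.log10(measured_cp_per_ml * DILUTION_FACTOR)

# Example (hypothetical values): a reported 11,000 cp/mL panel result corresponds
# to ~47,300 cp/mL of undiluted plasma, i.e. ~4.67 log cp/mL.
print(round(corrected_log_vl(11_000), 2))  # 4.67
```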
The criteria for designing the panels were based on monitoring across the clinically relevant threshold of 1000 cp/mL [5], and the scoring system and performance monitoring were therefore applied to this critical range. This included evaluating acceptable differences between the test specimen and the Xpert® HIV-1 VL reference specimen (described above), originally defined as a <1.0 log cp/mL difference. This large variability was selected to account for potential artefacts generated by specimen dilution, ambient temperature shipping and result conversion. Retrospective analyses at <0.5 log cp/mL and <0.3 log cp/mL difference, in line with generally accepted VL variation [61,62], were also performed. Finally, the Xpert® HIV-1 VL reference VL was compared to the pooled mean VL achieved by the 13 testing sites, ensuring that the reference laboratory instrument was performing acceptably and that the reference result was suitable for use as the standard. The scoring system was aligned with the previously well-described TB quality program [60,63,64] and, although differences exist between qualitative (TB) and quantitative (VL) result outputs, the performance scoring was similarly applied due to the modular nature of the GeneXpert® system, as follows: each specimen tested received a score out of two: correct result (2/2); error, invalid or >1.0 log cp/mL quantifiable result bias (1/2); incorrect result (e.g., HIV positive reported as HIV negative; 0/2). Each panel was then scored out of six for verification and out of eight for EQA. Scoring logic is detailed in Table 1, and an illustrative sketch of the scoring and agreement calculations is given after the verification results below. The overall panel performance across all sites was measured by the mean, median, range and standard deviation (SD) of the quantifiable viral loads, which were calculated using Microsoft® Excel® 2016 (Microsoft Corporation, Redmond, WA, USA). Regression, the concordance correlation coefficient (ρc) [65,66], including a Pearson correlation coefficient (p; measure of precision) and a bias correction factor (Cb; measure of accuracy), and Bland-Altman [67,68] analyses were performed and graphically represented using MedCalc Statistical Software version 18.11 (MedCalc Software bvba, Ostend, Belgium; http://www.medcalc.org; 2018). The pilot evaluation was nested within a field trial of near-patient VL testing, overseen by the NHLS National Priority Programme (NPP; Johannesburg, South Africa). Thirteen district laboratory facilities were selected and provided with a GeneXpert® IV instrument (Cepheid, Sunnyvale, CA, USA). The laboratories were located in remote areas across six provinces (Eastern Cape: n = 2; Northern Cape: n = 4; Western Cape: n = 3; Free State: n = 1; Limpopo: n = 2; North West Province: n = 1). Technicians were recruited and received training on the GeneXpert® platform and the Xpert® HIV-1 VL assay. The verification and EQA material were designed to meet the requirements of the NPP, ensuring that the instruments were fit-for-purpose and that specimen processing was being correctly performed. Table 1 notes: The table is divided into specimen score, verification score and EQA score sections (shown in bold). Specimen Score: Each specimen generates a score out of two. Verification Score: Verification of a module generates a score out of six (three specimens per module). EQA Score: EQA of an instrument generates a score out of eight (four specimens per instrument, run over different modules). If an unacceptable score is obtained, the site is required to conduct a root cause analysis and corrective action, and to test a second verification or EQA panel.
Site trainers or monitors may provide further interventions (e.g., staff training, instrument calibration). EQA: external quality assessment. HIV: human immunodeficiency virus. cp/mL: copies per milliliter. Verification panels (n = 4 per site) were provided to all sites in September 2017, following instrument installation and prior to patient testing. Further verification panels (n = 5) were provided on an ad hoc basis as modules were replaced. EQA panels (n = 1 per site) were provided to the sites in June and November 2018. For the pilot evaluation, the automatically generated reports were manually checked prior to release, but the website has the capacity to automatically release reports to the sites. Prior to initial supply to sites, verification specimens (2.7 log cp/mL; 5.0 log cp/mL) were prepared and tested in duplicate at days 7, 14, 21, and 28 (as per the process described above) to determine stability compared to the day 0 reference result. Extra EQA panels (3.0 log cp/mL, 3.7 log cp/mL and 4.7 log cp/mL) were prepared at the same time as those sent to the sites and tested at days 24, 43, 84 and 150 post manufacture to determine longer-term stability. All specimens were stored at ambient temperature in sealed plastic bags with desiccant. All sites tested and uploaded results to the website within three days of panel receipt. Result scores and outcomes are summarized in Table 2 and Figure 2, with detailed information provided in Supplementary Table S1. Quantifiable VL results were within acceptable limits for verification (<1.0 log cp/mL difference from the reference VL, as shown in Table 2) and all reference results were within 0.3 log cp/mL of the pooled mean VL of the specimens tested, although it was noted that the VL bias was high for the 5.0 log cp/mL reference specimen (0.22 log cp/mL). In addition, the sites' verification VL results were compared to the mean VL (data not shown) and this was comparable to the analysis using the reference VL values. The ρc across all sites (n = 151 specimens) was 0.997 (95% confidence interval [CI]: 0.995 to 0.998), with a p of 0.997 and a Cb of 0.999. The mean bias was −0.02 log cp/mL (95% CI: −0.046 to 0.006), with a coefficient of determination (R²) value of 0.9940. The error rate (20/171; 11.7%) for the verification panels was higher than expected, and was primarily a result of processing errors (55% of errors). Seven errors (35%) were linked to internal probe failures, two (10%) to syringe pressure and eleven (55%) to input volume (errors 2096 (35%) and 2097 (20%)). The majority of errors reported (13/20; 65%) occurred in the clinically relevant negative specimen, indicating laboratory processing errors. It was determined, on discussion with the program manager, that the specimens were not being centrifuged prior to testing and that incorrect pipetting procedures may have contributed to the errors. Changes were made to the standard operating procedure (i.e., to centrifuge all specimens prior to use, as would be required for clinical specimens) and staff retraining was performed if necessary.
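To make the scoring and agreement statistics concrete, the following sketch implements one reading of the per-specimen score described in Table 1 together with the standard formulas for Lin's concordance correlation coefficient (ρc = Pearson precision × bias correction factor Cb) and the Bland-Altman mean bias. This is an interpretation of the published description rather than the website's actual code, and the function names, result strings and example arrays are invented placeholders.

```python
import numpy as np

def specimen_score(result: str, measured_log_vl: float | None,
                   reference_log_vl: float | None) -> int:
    """Score one panel specimen out of two, following the Table 1 logic:
    2 = correct result; 1 = error/invalid or >1.0 log cp/mL bias on a
    quantifiable result; 0 = incorrect qualitative result."""
    if result in ("ERROR", "INVALID"):
        return 1
    if result == "NOT DETECTED":
        return 2 if reference_log_vl is None else 0   # negative specimen expected
    if reference_log_vl is None:
        return 0                                       # negative specimen reported positive
    return 2 if abs(measured_log_vl - reference_log_vl) <= 1.0 else 1

def lins_ccc(x: np.ndarray, y: np.ndarray):
    """Lin's concordance correlation coefficient and its components:
    returns (rho_c, pearson, C_b), where rho_c = pearson * C_b."""
    mx, my = x.mean(), y.mean()
    sx2, sy2 = x.var(), y.var()                # population variances
    sxy = ((x - mx) * (y - my)).mean()         # population covariance
    rho_c = 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)
    pearson = sxy / np.sqrt(sx2 * sy2)
    return rho_c, pearson, rho_c / pearson

def bland_altman(reference: np.ndarray, test: np.ndarray):
    """Mean bias and 95% limits of agreement (bias ± 1.96 SD of differences)."""
    d = test - reference
    bias, sd = d.mean(), d.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical site vs. reference log cp/mL values, for illustration only.
ref  = np.array([2.70, 3.00, 4.70, 5.00, 3.00])
site = np.array([2.71, 3.05, 4.68, 4.98, 3.02])
print(lins_ccc(site, ref))       # rho_c, Pearson, C_b
print(bland_altman(ref, site))   # mean bias, lower/upper limit of agreement
```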
Once these changes were implemented, the error rate (over ad hoc verification and EQA) decreased to 1.7% (2/119 further tests), indicating that correct operating procedures were being observed. Two cycles of EQA (E18V1, E18V2) were shipped to 13 sites (18 June 2018 and 12 November 2018) and results were uploaded within seven days (mean: 4.1 days). All sites showed acceptable performance across both EQA panels; the program performance is summarized in Table 3 and Figure 3, and complete site results are detailed in Supplementary Table S2. Viral loads were within acceptable limits for EQA (<1.0 log cp/mL bias), and all negative specimens were reported as not detectable (no carryover). The ρc for the EQA pilot panels (two EQA panels, n = 102/104 specimens) across all sites was 0.9985 (95% CI: 0.9978 to 0.9990), with a p of 0.9987 and a Cb of 0.9998. The mean bias was 0.03 log cp/mL (95% CI: 0.02 to 0.05). The error rate was 1.9% (2/104 tests) and was caused by volume loading (user) errors. Retrospective analysis of the verification and EQA results was performed after the pilot evaluation, in order to accommodate acceptable VL biases [61,62]. Amongst 107 quantifiable verification results, ten (9.3%) showed a bias of >0.3 log cp/mL (range: −0.91 to 0.36 log cp/mL). Only one outlier specimen (4.33 log cp/mL) displayed a bias of >0.5 log cp/mL: −0.91 log cp/mL compared to the reference VL and −0.69 log cp/mL compared to the pooled mean VL. This specimen was part of the 5.0 log cp/mL group, where the reference VL (5.24 log cp/mL) was notably higher than the pooled mean VL (5.02 log cp/mL). A second outlier (4.83 log cp/mL) in this group had a VL bias of −0.41 log cp/mL compared to the reference VL, with an acceptable bias of −0.19 log cp/mL compared to the pooled mean VL. Only three specimens (2.8%) had a bias of >0.3 log cp/mL compared to the pooled mean VL. All quantifiable EQA VL results (n = 76) showed a bias of <0.3 log cp/mL compared to the reference VL. Stability of the specimens stored in MTM was evaluated prior to panel design and supply, with specimen stability acceptable up to 28 days (Figure 4a). Testing of EQA panels in the reference laboratory between weeks 4 and 20 (Figure 4b) showed stability of all specimens at week 6 (day 43) and extended stability of the higher VL range (4.7 log cp/mL) specimens until week 12 (day 84). However, by week 12, a decrease of ~0.5 log cp/mL was noted in the lower (3.0 log cp/mL) VL range. Errors were noted in the 2.7 log cp/mL specimen on day 1 (repeat) and the 3.0 log cp/mL specimen at day 24 (both error 2126; module reset), and in the 3.7 log cp/mL specimen at day 84 (invalid, error 5016: probe check error). These relate to the instrument and the cartridge, rather than the specimen. Retesting was not possible due to limited specimen availability. By day 150, all VL results differed from baseline (day 0) by >0.5 log cp/mL, with both the 3.7 log cp/mL and 4.7 log cp/mL specimens showing a VL decrease of >1.0 log cp/mL. Bland-Altman analysis of the reportable VL results (n = 14/16) over the weeks, including day 84 (when a VL decrease was noted) but excluding day 150 (when the VL were no longer relevant), gave a mean bias of −0.06 log cp/mL with a lower limit of −0.34 log cp/mL (95% CI: −0.89 to −0.21) and an upper limit of 0.23 log cp/mL (95% CI: 0.10 to 0.77).
Including day 150 (n = 18/20) gave a mean bias of −0.20 log cp/mL with a lower limit of −0.97 log cp/mL (95% CI: −2.11 to −0.62) and an upper limit of 0.58 log cp/mL (95% CI: 0.23 to 1.72), beyond acceptable limits for supply to sites. Laboratory quality monitoring is vital to ensure ongoing patient result testing accuracy [39,69]. Instruments must be evaluated prior to implementation, verified before use in the field and monitored on an ongoing basis. Similarly, staff competency should be evaluated through training, observation and participation in quality programs. Evaluation can be performed on existing specimens (e.g., frozen plasma), prospective specimens (against a reference instrument currently in use) or on well-described quality panels (e.g., NEQAS, SAVQA). EQA, through the supply of standardized specimens for testing and through continuous quality monitoring (CQM, e.g., analysis of central data repositories), enables program managers to identify potential instrument or staff deficiencies for correction. Participation in EQA programs has been shown to improve participant performance [42]. CQM of assays and instruments is becoming standard practice for many connected diagnostics. Operational dashboards, such as C360 (Cepheid), provide assay and instrument quality information on errors, utility, and various result parameters on a module/instrument/laboratory and location basis, and can be utilized for daily and monthly monitoring to identify quality issues, without waiting for EQA panel cycles [70]. CQM, through the C360 platform, was successfully applied during the near-patient testing pilot into which this evaluation was nested, but is beyond the scope of this manuscript. EQA is complementary to CQM, ensuring ongoing pre- and post-analytical performance monitoring, which is particularly important where staff turnover is high. The Xpert® HIV-1 VL assay was previously evaluated using both the SAVQA panel and clinical specimens [37], and has since been extensively evaluated in the field [14,71], meaning that the assay did not require further evaluation prior to implementation. However, before the implementation pilot could commence, verification of the modules was required, and this was complicated by the remote placement of the instruments, as residual plasma specimens were not readily available. Alternative options for instrument verification were thus needed. This manuscript describes the design and pilot evaluation of quality panels used for POCT HIV VL.
The panels were designed to meet specific requirements: (i) specimen processing needed to be as similar as possible to that of actual specimens; (ii) thermostable transport and storage; (iii) reproducible VL results, such that processing or instrument issues could be detected during verification and ongoing EQA; and (iv) safety during transport. While initially designed for module verification, the panels were easily adapted for ongoing EQA. These panels were based on similar principles to the Xpert® MTB/RIF program [63,64], which has been used successfully throughout the NHLS to monitor 207 Xpert® MTB/RIF testing sites, as well as internationally (28 countries), and was expected to provide similarly rigorous quality monitoring for Xpert® HIV-1 VL sites. It is notable that the panels were supplied in a liquid format and that no processing was required beyond centrifugation and direct addition of the specimen into the Xpert® HIV-1 cartridge, mimicking routine patient specimen testing. This was in contrast to dried tube specimens (DTS), which have been used throughout sub-Saharan Africa for EQA [41–43]. DTS were not selected for this pilot as the NPP preferred to minimize specimen processing variability during specimen reconstitution by using a liquid panel, although DTS met all other requirements described. Furthermore, similarly to the original SAVQA panel, the verification program was designed for rapid deployment using local resources, decreasing reliance on scheduled schemes [44]. Shipping of liquid specimens is potentially problematic, given the risk of leakage, particularly if the transport infrastructure is poor (e.g., degraded road surfaces). Panels were well packaged and no leakage of specimen from the tube into the protective packaging was observed. However, the extra packaging, as described above, is recommended for similar panels going forward to minimize risk to transport personnel and to meet IATA requirements [72]. The infectivity of HIV when stored in MTM was not tested in this pilot, but existing studies have shown that pathogens are fully inactivated on addition to the buffer [48,50,54,57,73], while RNA integrity is simultaneously preserved [50,55–57], including that of HIV-1 RNA [58]. Thermostability of the panels, with little VL variation, was shown for a minimum of twelve weeks from manufacture. Earlier studies have shown that viral RNA (e.g., influenza) can be reliably detected for up to 196 days [54] and quantified for up to 23 days [56]. This study has shown longer-term stability of HIV RNA, although it should be noted that the manufacturer only recommends storage at ambient temperature for 30 days. Furthermore, stability testing was performed in Johannesburg during the South African winter and spring, with temperatures ranging from 8 to 23 °C, but with minimal humidity. More recent studies performed during the hotter months (maximum temperature 31 °C) and with increased humidity showed decreases of >1 log cp/mL by 10 weeks (personal communication, Dean Sher, SmartSpot Quality (Pty) Ltd., Johannesburg, Gauteng, South Africa); long-term stability in warmer climates is therefore a consideration. A recent manuscript reported decreased yield of Mycobacterium tuberculosis in oral mucosa specimens stored in PrimeStore MTM after 30 days and also after extended freezing [52], a finding that may similarly affect these specimens if frozen.
Further stability evaluations in warmer, more humid settings are recommended to confirm stability under such conditions. In this evaluation, the QA specimens were made to order and generally tested within one week. It should also be noted that the dilution factor was applied in this pilot in order to allow comparison with the original SAVQA data. To determine whether specimen variability [74] affected the performance of sites compared to the reference VL, the specimen VL from all sites and the reference VL were compared to the pooled mean VL of all sites. In all cases, the mean VL and the reference VL were similar (−0.02 log cp/mL mean difference), although the reference instrument did produce a higher VL (5.24 log cp/mL vs. 5.02 log cp/mL) than all sites in the 5.0 log cp/mL range. This was not clinically significant and did not affect site performance outcomes. The bias of the single outlier specimen described (−0.91 log cp/mL) was acceptable for verification in terms of the panel design, but unacceptable in the retrospective analysis. However, the site still achieved a module score of 5/6 in the retrospective analysis and patient specimen testing could commence. The benefit of a quality program across multiple sites was that multiple instruments were tested concurrently and panels could be compared to the pooled mean VL rather than only the reference VL; this provided an additional quality control of the reference instrument and the potential to highlight unexpected instability of the quality material. Retrospective analysis of the specimens showed that they could be evaluated at 0.3 and 0.5 log cp/mL bias [61,62], and these thresholds should be implemented when using this quality panel further. This design can be adapted to tiered laboratory systems to ensure continued quality POCT HIV VL testing, although resources (MTM buffer, plasma (if purchased), staff time required to manufacture the panels and to collate the results, post-manufacture quality testing and shipping) and individual country needs must be evaluated case by case [69]. Similarly, if this quality material were adapted by commercial suppliers, the cost and feasibility of scaled manufacture at an implementation price acceptable to countries needing such QA products should be investigated. Of note are the limited stability data for, and uncertain compatibility with, HIV VL assays other than Xpert® HIV-1 VL. This was not evaluated during this pilot, but it has been observed that the MTM buffer occasionally interacts negatively with certain HIV VL assays (personal communication, Dean Sher, SmartSpot Quality (Pty) Ltd., Johannesburg, Gauteng, South Africa). The value of formal verification or EQA panels should not be disregarded, particularly for smaller programs where globally standardized specimens may provide more rigorous quality measures [39,69], but mandatory participation in such schemes varies [39]. A further consideration in favour of commercial EQA panels is that they free program managers from producing panels and evaluating results, allowing that time to be used to assist the laboratories which EQA identifies as needing help, to identify root causes and to implement corrective actions [69]. Ultimately, whether in-house or commercial, the goal is to ensure quality laboratory testing [39,69], which impacts positively on patient care and management. Ongoing quality monitoring at all levels of a tiered laboratory network is paramount to ensure that patient results are accurate.
This can be difficult for POCT instruments placed in remote settings, where the quality management options used in centralized laboratories are not feasible, but where quality monitoring is vital. The quality panels described in this manuscript provide simple and convenient verification and/or EQA options for countries aiming to implement Xpert® HIV-1 VL.

Joint United Nations Programme for HIV/AIDS. 90-90-90: An Ambitious Treatment Target to Help End the AIDS Epidemic
Ending the AIDS Epidemic by 2030
UNAIDS. Global HIV and AIDS Statistics-2020 Fact Sheet. 2020. Available online
World Health Organisation. Consolidated Guidelines on the Use of Antiretroviral Drugs for Treating and Preventing HIV Infection: Recommendations for a Public Health Approach
Comparative evaluation of the Cobas Amplicor HIV-1 Monitor Ultrasensitive Test, the new Cobas AmpliPrep/Cobas Amplicor HIV-1 Monitor Ultrasensitive Test and the Versant HIV RNA 3.0 assays for quantitation of HIV-1 RNA in plasma samples
Multicenter evaluation of the NucliSens EasyQ HIV-1 v1.1 assay for the quantitative detection of HIV-1 RNA in plasma
Evaluation of the performance of the automated NucliSENS easyMAG and EasyQ systems versus the Roche AmpliPrep-AMPLICOR combination for high-throughput monitoring of human immunodeficiency virus load
Evaluation of the Abbott m2000 RealTime human immunodeficiency virus type 1 (HIV-1) assay for HIV load monitoring in South Africa compared to the Roche Cobas AmpliPrep-Cobas Amplicor, Roche Cobas AmpliPrep-Cobas TaqMan HIV-1, and BioMerieux NucliSENS EasyQ HIV-1 assays
Ultra-deep sequencing provides insights into the virology of hepatitis C super-infections in a case of three sequential infections with different genotypes
Evaluation of Performance Characteristics of the Aptima HIV-1 Quant Dx Assay for Detection and Quantitation of Human Immunodeficiency Virus Type 1 in Plasma and Cervicovaginal Lavage Samples
Comparison of the Aptima HIV-1 Quant Dx assay with the COBAS AmpliPrep/COBAS TaqMan HIV-1 v2.0 Test for HIV-1 viral load quantification in plasma samples from HIV-1-infected patients
Performance of Cepheid Xpert HIV-1 viral load plasma assay to accurately detect treatment failure
HIV viral load scale-up: Multiple interventions to meet the HIV treatment cascade
Monitoring viral load for the last mile: What will it cost?
Stability of HIV RNA in plasma specimens stored at different temperatures
Long-term stability of human immunodeficiency virus viral load and infectivity in whole blood
Reliability of plasma HIV viral load testing beyond 24 hours: Insights gained from a study in a routine diagnostic laboratory
Evaluation of the use of plasma preparation tubes for HIV viral load testing on the COBAS AmpliPrep/COBAS TaqMan HIV-1 version 2.0
Systematic review of the accuracy of plasma preparation tubes for HIV viral load testing
Effects of specimen collection, processing, and storage conditions on stability of human immunodeficiency virus type 1 RNA levels in plasma
Optimization of specimen-handling procedures for accurate quantitation of levels of human immunodeficiency virus RNA in plasma by reverse transcriptase PCR
Accurate dried blood spots collection in the community using non-medically trained personnel could support scaling up routine viral load testing in resource limited settings
Dried blood spots for viral load monitoring in Malawi: Feasible and effective
Dried Blood Spots Provide Accurate Enumeration of HIV-1 Viral Load in East Africa
Field evaluation of Dried Blood Spots for HIV-1 viral load monitoring in adults and children receiving antiretroviral treatment in Kenya: Implications for scale-up in resource-limited settings
Sensitivity and specificity of two dried blood spot methods for HIV-1 viral load monitoring among patients in Hanoi
Evaluation of the performance of Abbott m2000 and Roche COBAS Ampliprep/COBAS Taqman assays for HIV-1 viral load determination using dried blood spots and dried plasma spots in Kenya
Estimation of HIV-1 DNA Level Interfering with Reliability of HIV-1 RNA Quantification Performed on Dried Blood Spots Collected from Successfully Treated Patients
Systematic review of the use of dried blood spots for monitoring HIV viral load and for early infant diagnosis
Stringent HIV Viral Load Threshold for Virological Failure Using Dried Blood Spots: Is the Perfect the Enemy of the Good?
Addressing antiretroviral therapy-related diagnostic coverage gaps across South Africa using a programmatic approach
An integrated tiered service delivery model (ITSDM) based on local CD4 testing demands can improve turn-around times and save costs whilst ensuring accessible and scalable CD4 services across a national programme
Options to Expand HIV Viral Load Testing in South Africa: Evaluation of the GeneXpert(R) HIV-1 Viral Load Assay
GeneXpert® Infinity-48, GeneXpert® Infinity-48s and GeneXpert® Infinity-80
External quality assessment (EQA) and alternative assessment procedures (AAPs) in molecular diagnostics: Findings of an international survey
World Health Organisation. Global TB Programme and Department of HIV/AIDS Information Note: Considerations for Adoption and Use of Multidisease Testing Devices in Integrated Laboratory Networks
Dried tube specimens: A simple and cost-effective method for preparation of HIV proficiency testing panels and quality control materials for use in resource-limited settings
Monitoring the quality of HIV-1 viral load testing through a proficiency testing program using dried tube specimens in resource-limited settings
Generation of dried tube specimen for HIV-1 viral load proficiency test panels: A cost-effective alternative for external quality assessment programs
Use of a prequalification panel for rapid scale-up of high-throughput HIV viral load testing
New Options for HIV Viral Load testing: The Panther Aptima HIV-1 Quant Dx assay
Laboratory evaluation of the Liat HIV Quant (IQuum) whole-blood and plasma HIV-1 viral load assays for point-of-care testing in South Africa
Performance of Xpert® HIV-1 Quant compared to Roche CAP/CTM v2 and Abbott RealTime HIV-1 on a prequalification plasma validation panel
Evaluation of Automatic Class III Designation for PrimeStore MTM Decision Summary
Xpert(R) MTB/RIF detection of Mycobacterium tuberculosis from sputum collected in molecular transport medium
A molecular transport medium for collection, inactivation, transport, and detection of Mycobacterium tuberculosis
Detection by RT-PCR of Mycobacterium tuberculosis from oral swab specimens using PrimeStore(R) molecular transport medium
Molecular Detection of Mycobacterium tuberculosis in Oral Mucosa from Patients with Presumptive Tuberculosis
PrimeStore MTM and OMNIgene Sputum for the Preservation of Sputum for Xpert MTB/RIF Testing in Nigeria
Comparison of a new transport medium with universal transport medium at a tropical field site
A clinical specimen collection and transport medium for molecular diagnostic and genomic applications
Influenza Viral Detection from Nasal Wash, Throat, and Nasopharyngeal Swabs Collected and Preserved in PrimeStore Molecular Transport Medium
Evaluation of Commercially Available Viral Transport Medium (VTM) for SARS-CoV-2 Inactivation and Use in Point-of-Care (POC) Testing. Viruses
Can dried blood spots or whole blood liquid transport media extend access to HIV viral load testing?
A decentralised point-of-care testing model to address inequities in the COVID-19 response
Web-based automated EQA and Instrument Verification reporting tool for the Xpert® MTB/RIF assay
Ten years of external quality assessment of human immunodeficiency virus type 1 RNA quantification
Variation in HIV RNA assays at low RNA concentration, abstr. 774
Performance monitoring of Mycobacterium tuberculosis dried culture spots for use with the GeneXpert system within a national program in South Africa
Dried culture spots for Xpert MTB/RIF external quality assessment: Results of a phase 1 pilot study in South Africa
A note on the concordance correlation coefficient
A concordance correlation coefficient to evaluate reproducibility
Measuring agreement in method comparison studies
Statistical methods for assessing agreement between two methods of clinical measurement
External Quality Assessment (EQA): Module 10; World Health Organization
The role of connected diagnostics in strengthening regional, national and continental African disease surveillance
Performance of the Xpert HIV-1 Viral Load Assay: A Systematic Review and Meta-analysis
International Air Transport Association. Infectious Substances Shipping Guidelines
Commercial products to preserve specimens for tuberculosis diagnosis: A systematic review
Impact of viral load testing on patient care

Acknowledgments: SmartSpot Quality (Pty) Ltd. for assistance in packaging the panels and the development of the www.viralloadmonitor.com website; John Molifi for assistance with specimen shipping; and site staff for their participation in the pilot project. The authors declare no conflict of interest.