Syphilis affects 1.4 million pregnant women globally each year. Maternal syphilis causes congenital syphilis in over half of affected pregnancies, leading to early foetal loss, pregnancy complications, stillbirth and neonatal death. Syphilis is under-diagnosed in pregnant women. Point-of-care rapid syphilis tests (RST) allow for same-day treatment and address logistical barriers to testing encountered with standard Rapid Plasma Reagin testing. Recent literature emphasises that successful introduction of new health technologies requires healthcare worker (HCW) acceptance, effective training, quality monitoring and robust health systems. Following a successful pilot, the Zambian Ministry of Health (MoH) adopted RST into policy, integrating them into prevention of mother-to-child transmission of HIV clinics in four underserved Zambian districts. We compare HCW experiences, including challenges encountered in scaling up from a highly supported NGO-led pilot to a large-scale MoH-led national programme. Questionnaires were administered through structured interviews of 16 HCWs in two pilot districts and 24 HCWs in two different rollout districts. Supplementary data were gathered via stakeholder interviews, clinic registers and supervisory visits. Using a conceptual framework adapted from the health technology literature, we explored RST acceptance and usability. Quantitative data were analysed using descriptive statistics. Key themes in qualitative data were explored using template analysis. Overall, HCWs accepted RST as learnable, suitable and effective tools that improved antenatal services and were usable in diverse clinical settings. Changes in training, supervision and quality monitoring models between pilot and rollout may have influenced rollout HCW acceptance and compromised testing quality. While quality monitoring was integrated into national policy and training, implementation was limited during rollout despite financial support and mentorship. We illustrate that new health technology pilot research can rapidly translate into policy change and scale-up. However, training, supervision and quality assurance models should be reviewed and strengthened as rollout of the Zambian RST programme continues.
Zambia is a lower-middle income country with a population of 14.1 million, spread over a large geographical area [38]. Antenatal care (ANC) is free of charge and 94% of pregnant women attend at least one antenatal visit [18,39]. The health care system is arranged in six tiers: outreach services, health posts, urban and rural health centres, district hospitals, secondary referral and tertiary referral hospitals [40]. Rural facilities are remote, often located several hours’ drive from district hubs, with limited or no electricity and limited transport to convey patient samples, results or supplies to and from district laboratories. In 2010–11, a six-country pilot study, including Zambia, evaluated the feasibility of introducing RSTs into existing maternal and child health (MCH) programmes [1,13,34]. The study was the first of its kind to incorporate quality assurance (QA) systems to monitor training and the accuracy of point-of-care (POC) test results [34]. The non-governmental organisation (NGO)-led Zambian study arm introduced RST into existing prevention of mother-to-child transmission (PMTCT) of HIV services in 15 sites in two districts: Mongu, a rural district with a syphilis prevalence of 7%, and Lusaka, an urban district with a prevalence of 2.5% [41]. It showed that integrating RST into PMTCT programmes increased testing and treatment for syphilis in HIV-positive pregnant women without compromising HIV service delivery. Following the pilot’s success, the Zambian Ministry of Health (MoH) rapidly adopted RST into national policy and led the rollout of a national RST programme, incorporating quality assurance and quality control (QA/QC) into programme design [42]. The rollout, supported by the Elizabeth Glaser Pediatric AIDS Foundation (EGPAF), commenced in 2012, initially in four underserved districts with high rates of maternal mortality: Kalomo, Lundazi, Mansa and Nyimba. Key implementation changes were made between pilot and rollout: supervision was reduced from monthly to quarterly and QA/QC activities were devolved from central to district laboratory level (Table 1). Prior to the pilot, antenatal syphilis screening was carried out using the laboratory-based Rapid Plasma Reagin (RPR) test when available. MCH HCWs sent blood samples to a laboratory for batched testing, with patients usually returning for results at a later date. In some rural facilities, non-laboratory workers performed RPR tests without a systematic QA approach. During the pilot, RST was performed by MCH HCWs, who started same-day treatment based on a reactive RST result. National rollout guidelines described several diagnostic algorithms, depending on RST, RPR and Treponema pallidum haemagglutination (TPHA) test availability at each site (Fig 1). None of the sites included in this evaluation performed TPHA.
Legend: 4Cs: condoms, counselling, compliance, contact tracing, offer HIV test; RPR: Rapid Plasma Reagin, a non-treponemal qualitative test (quantitative testing may also be available); RST: Rapid Syphilis Test, a treponemal test; TPHA*: Treponema pallidum haemagglutination assay, a treponemal laboratory test; BP: Benzathine Penicillin 2.4 megaunits IM. *The testing algorithm involving TPHA applied to tertiary care centres and was unavailable at sites included in this evaluation.
This qualitative evaluation used an HCW questionnaire (S1 Appendix) and key informant interviews to determine the feasibility and acceptability of introducing RST into PMTCT services, comparing pilot and national rollout implementation phases.
The concept of “feasibility”, drawn from the health technology literature, was defined as the process of RST programme implementation leading to end-user acceptance and utilisation, discussed further below [37]. Acceptability was defined as health workers’ positive satisfaction levels and their correct and consistent use of RST [41]. A health economic analysis of the pilot and rollout phases is presented in our accompanying paper, Shelley et al. [43]. The questionnaire used both closed questions (including five-point Likert scales) and open-ended questions eliciting qualitative responses. The following domains were included: RST advantages and disadvantages, patient experience, organisational environment and workflow, training, skill and confidence in test performance, and RST acceptability. For the rollout evaluation, the pilot questionnaire was adapted using current literature on technology acceptance; domains on QA/QC and supervision were added; and a topic guide was designed covering the same domains as the questionnaire. Further adjustment, following a pilot interview and feedback from completed MoH/EGPAF supervisory visits, strengthened the internal validity of the study. During the pilot assessment, SAS software (version 9.1) was used to randomly select four sites, two in each study district; these included a district hospital and both low- and high-volume health centres. Data on service and patient volumes, staffing numbers and location, time/motion studies and costs were collected to document changes in syphilis and HIV testing and treatment and are reported elsewhere [41]. For the rollout evaluation, health facilities were selected by convenience sampling (based on MoH guidance and practical limitations such as distance, staffing and vehicle availability) to reflect a range of urban and rural clinical settings with varying laboratory capacity. Review of records and key informant interviews took place at District Health Offices (DHO). Face-to-face interviews were conducted in English with consenting health workers, with the exception of one interview, which used a translator. In November and December 2010, two EGPAF study staff administered the questionnaire to pilot HCWs, including seven midwives, two nurses and seven lay counsellors. Twenty-four rollout HCWs (including four midwives, four nurses, five lay counsellors, one clinical officer, six laboratory technicians, three environmental health technologists and one psychosocial counsellor) were interviewed in August 2012 by one of the authors (É.A.), an independent researcher who accompanied MoH/EGPAF on supervisory visits. Responses were hand-written, and rollout interviews were also audio-recorded and subsequently transcribed by the interviewer. Interviewees were selected by convenience sampling, aiming to interview four HCWs at each pilot site and any HCW who had ever performed RST at each rollout site. Any errors identified in test performance were remediated immediately after the interview using the MoH guideline and discussed with the implementing partner. Informal interviews were held with key informants during MoH supervisory visits to gain an understanding of the planning, implementation and costs of the national RST programme. Key informants included EGPAF senior and pilot study staff, MoH HIV/AIDS STI Programme staff, and DHO staff from Mansa and Kalomo districts. Data were collected in the form of field notes recorded after each informal interview.
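To make the stratified random site selection described above concrete, the following is a minimal sketch in Python of drawing two sites at random from each study district. It is an illustration only: the study itself used SAS software (version 9.1), and the district site lists and random seed shown here are hypothetical placeholders, not study data.

```python
# Minimal sketch of stratified random site selection: two sites per district.
# The study used SAS 9.1; the site identifiers below are hypothetical.
import random

sites_by_district = {
    "Mongu": ["site_A", "site_B", "site_C", "site_D"],   # hypothetical site IDs
    "Lusaka": ["site_E", "site_F", "site_G", "site_H"],  # hypothetical site IDs
}

random.seed(2010)  # fixed seed so the draw is reproducible

selected = {
    district: random.sample(sites, k=2)  # draw two sites without replacement
    for district, sites in sites_by_district.items()
}

print(selected)  # e.g. {'Mongu': [...], 'Lusaka': [...]}
```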
A conceptual framework was created to guide the analysis, based on a model used by Asiimwe et al., which explored the feasibility of new health technology introduction. The framework divides the concept of feasibility into two inter-related domains, acceptance and usability [37]. Technology acceptance and usability originate in the study of human-computer interaction, and usability has been further broken down into various attributes [44,45]. Here, acceptance and usability were divided into six sub-domains: learnability, willingness, suitability, satisfaction, efficacy and effectiveness, attributes that have been described in other settings (Fig 2) [45]. The framework recognises that acceptance and usability may be influenced by factors related to the end-user (both HCW and client), the diagnostic tool and the health system. Health system factors include guidelines and training, quality monitoring and evaluation, supply chain, and policy, budget and planning.
Legend: In this context, these themes were understood to mean the following. Learnability: how easy or difficult it was for HCWs to learn to perform the RST, to perform it accurately and to learn about quality control and quality assurance. Willingness: willingness of HCWs to perform the RST, to take part in the cascaded training (i.e. being trained by or training other colleagues) and to take part in supervisory and quality assurance activities. Suitability: HCWs’ belief that the RST was relevant to their work and could be successfully integrated into existing services, and their belief in the appropriateness of the supporting components of the RST programme (i.e. training, supervision and quality maintenance). Satisfaction: HCWs’ satisfaction with the test itself, its impact on workflow and the supporting components of the programme. Efficacy: ability of HCWs to implement same-visit testing and treatment (STAT), to incorporate the test and to integrate quality assurance and quality control activities into their workflow. Effectiveness: how the organisational and systemic environment, including implementation of policy, guidelines, supply chain and other logistics, affected successful delivery of the programme, and how the social context (the community, patients and their partners) influenced programme delivery.
Data from the two evaluation phases were collected, entered and analysed by separate pilot and rollout teams. Quantitative data were double-entered, verified and cleaned using MS Access (2007–10) and Microsoft Excel 2003, respectively (S1 Dataset). Qualitative data were entered into MAXqda (V10) and NVivo 10, respectively, both computer-assisted qualitative data analysis software packages. Two authors (M.G. and É.A.) used template analysis to independently analyse qualitative data from each phase [46]. The initial coding template was based on the questionnaire and included codes for HCW acceptance and satisfaction, patient and partner experience of RST, training, workflow and integration with existing services, and quality assurance activities and supervision. For this paper, the authors compared data from each phase under these theme headings and explored and refined emerging sub-themes through iterative discussion.
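As an illustration of the descriptive statistics applied to the closed questionnaire items, the sketch below summarises responses to a single five-point Likert item. It is a sketch only: the study analysed such data in MS Access and Microsoft Excel, and the response values shown are invented for illustration, not study data.

```python
# Minimal sketch of descriptive statistics for a five-point Likert item.
# The study used MS Access/Excel; the responses below are hypothetical,
# coded 1 (strongly disagree) to 5 (strongly agree).
from collections import Counter
from statistics import median

responses = [5, 4, 4, 5, 3, 4, 5, 5, 2, 4, 4, 5, 3, 5, 4, 4]  # hypothetical data

counts = Counter(responses)
n = len(responses)

print(f"n = {n}, median = {median(responses)}")
for score in range(1, 6):
    freq = counts.get(score, 0)
    print(f"score {score}: {freq} ({100 * freq / n:.0f}%)")
```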
The pilot study protocol and consent procedures were approved by the University of Zambia Biomedical Research Ethics Committee (UNZAREC), the University of Alabama at Birmingham’s Institutional Review Board, the World Health Organization Research Ethics Review Committee and the Zambian MoH. The rollout study protocol and consent procedures were approved by UNZAREC, the London School of Hygiene and Tropical Medicine Research Ethics Committee and the Zambian MoH. To minimise use of personal identifiers, verbal informed consent was obtained from pilot HCWs and recorded on an approved consent form by interviewers. During rollout evaluation, written informed consent was obtained from each relevant District Health Officer and from each interviewed HCW using an approved consent form; verbal informed consent to participate was obtained from key stakeholders and recorded by the interviewer.