Maternal syphilis results in an estimated 500,000 stillbirths and neonatal deaths annually in Sub-Saharan Africa. Despite the existence of national guidelines for antenatal syphilis screening, testing is often limited by inadequate laboratory services and staffing. The recent availability of inexpensive rapid point-of-care syphilis tests (RSTs) can improve access to antenatal syphilis screening. A 2010 pilot in Zambia explored the feasibility of integrating RST within prevention of mother-to-child transmission of HIV services. Following this successful demonstration, the Zambian Ministry of Health adopted RSTs into national policy in 2011. Cost data from the pilot and the 2012 preliminary national rollout were extracted from project records, antenatal registers, clinic staff interviews, and facility observations, with the aim of assessing the cost and quality implications of scaling up a successful pilot into a national rollout. Start-up, capital, and recurrent cost inputs were collected, including the costs of extensive supervision and quality monitoring during the pilot. Costs were analysed from the provider’s perspective, incremental to existing antenatal services. Total and unit costs were calculated, and a multivariate sensitivity analysis was performed. Our accompanying qualitative study by Ansbro et al. (2015) elucidated the quality assurance and supervisory system challenges experienced during rollout, which helped explain key cost drivers. The average unit cost per woman screened during rollout ($11.16) was more than triple the pilot unit cost ($3.19). While quality assurance costs were much lower during rollout, the increased unit costs can be attributed to several factors, including higher RST prices and lower RST coverage during rollout, which reduced economies of scale. Pilot and rollout cost drivers differed due to implementation decisions related to training, supervision, and quality assurance.
This study explored the cost of integrating RST into antenatal care in pilot and national rollout settings, and highlighted important differences in costs that may be observed when moving from pilot to scale-up.
From 2008–2010, the Elizabeth Glaser Pediatric AIDS Foundation (EGPAF) in partnership with the Centre for Infectious Disease Research in Zambia (CIDRZ) conducted a pilot that examined the feasibility and acceptability of introducing RSTs into prevention of mother-to-child transmission of HIV programmes in ANC clinics in Zambia. A detailed description of the pilot and results is available elsewhere [25]; in brief, RST was introduced within a variety of ANC settings at 15 pilot facilities in two districts, comprising urban and rural locations, high- and low-volume ANC clinics, and high and low syphilis prevalence (approximately 7% in Mongu and 2.5% in Lusaka). A centralised two-day RST training workshop took place before RST was integrated within existing clinic staffing patterns, patient flow, and clinic processes alongside other routine antenatal POC tests (HIV, malaria, and haemoglobin). The pilot utilised SD BIOLINE Syphilis 3.0, a rapid POC syphilis antibody test produced by Standard Diagnostics (Yongin-Si, South Korea), with a manufacturer-reported 99.3% sensitivity and 99.5% specificity in serum versus the gold standard Treponema pallidum haemagglutination (TPHA) test [27]; of note, a recent meta-analysis of the diagnostic accuracy of SD Bioline 3.0 in field conditions reported a lower pooled sensitivity (87.9% serum; 83.8% whole blood) and specificity (96.0% serum; 98.4% whole blood) [28]. Pilot-specific QA and quality control (QC) measures were established to ensure high standards of quality management [25,29]. The pilot results showed increased syphilis testing among ANC attendees (from 79.9% to 95.6%, p<0.0001) and increased treatment of syphilis-positive pregnant women (from 51.1% to 95.2%, p<0.0001), and demonstrated the feasibility of integrating RST within busy urban and rural ANC settings in Zambia [25].
Following the successful pilot, Zambia adopted RST into national policy and recommended the use of rapid syphilis tests to offer same-day testing and treatment [30,31]. In March 2012, the Zambian Ministry of Health (MOH) launched the first phase of RST rollout in four underserved districts with high rates of maternal mortality: Mansa (Luapula Province), Kalomo (Southern Province), and Lundazi and Nyimba (Eastern Province). There were several key implementation differences between the pilot and rollout with regard to training, supervision, quality management mechanisms, and testing and treatment algorithms. First, both pilot and rollout utilised a cascaded training approach whereby at least one staff member from each participating facility attended a district-level training workshop and, in turn, trained facility colleagues in the RST testing procedure. However, in practice, rollout healthcare workers (HCWs) received substantially less on-the-job supervision and guidance following initial training. Second, the rollout supervision process involved a 10-day MOH/EGPAF joint visit to rollout districts during May-July 2012, with unannounced visits to selected facilities; whereas the pilot supervision process involved monthly EGPAF/CIDRZ monitoring visits at all pilot sites for data collection, quality checks, and remedial on-site training for poor performers identified through proficiency testing. Third, the QA/QC mechanisms were centrally led during the pilot, with study staff providing materials and feedback; for the rollout, training was provided to district-level laboratory staff in order to assume this role. Fourth, at all pilot sites, treatment was initiated on the basis of a positive RST.
RST detects antibodies specific to the causative bacterium Treponema pallidum, and does not distinguish between active and past, treated infection. The rollout, in contrast, introduced different testing algorithms utilising either RST alone, or RST as a screening test followed by RPR as a confirmatory test where this was available. The differences in treatment algorithm are described below. Our interpretation of differentials in the cost data and of the influence of implementation decisions on costs draws heavily on the qualitative findings reported in our accompanying paper by Ansbro et al. (2015, in review), which includes a detailed comparison of implementation differences in scale-up from successful pilot to national rollout [32]. Our cost methodology combined an ingredients-based approach, whereby a unit cost is multiplied by a resource quantity to generate a total cost, with a step-down cost accounting approach for facility-level data, whereby joint costs are allocated to activities through cost centres [33,34]. A variety of sources were used for data on inputs, outputs and costs, including: inspection of facility records and registers, clinic observation, expert interviews with clinic and project staff, and review of project accounts. An Excel-based cost collection tool was used to collect the cost information during the pilot and rollout phases [34]. Costs were collected between March and July 2010 from five of the fifteen pilot facilities, including two urban health centres (UHC) in Lusaka plus one UHC and two rural health centres (RHC) in Mongu; facilities were purposively sampled to represent variation in facility size, target population, and location.
For the national rollout, costs for the period March to July 2012 were collected from five facilities, including one UHC and three RHCs in Mansa District and one district hospital (DH) in Kalomo; facilities were convenience-sampled from among facilities visited by the MOH supervisory team during July and August 2012. Prevalence data from the urban facility (UHC4) diverged significantly from the average and were considered an outlier. We excluded UHC4 from the cost analysis, but the unique challenges experienced by this facility are discussed in the qualitative paper [32]. Table 1 illustrates key differences across the pilot and rollout facilities (abbreviations: DTS = Dry Tube Specimen; FP = Finger Prick; KM = Kilometre; RHC = Rural Health Centre; UHC = Urban Health Centre; VNP = Venepuncture).

Both pilot and rollout examined the incremental cost of adding RST screening and treatment onto existing ANC services, i.e. any additional costs to execute syphilis screening and treatment were included, but administrative costs to run the health facility were excluded. All research costs to collect data during the pilot and rollout were also excluded. Economic costs were collected retrospectively from the provider’s perspective. Financial and logistical constraints during the rollout evaluation prevented full replication of the pilot cost methods, the key difference being that direct observation of rollout RST test performance and clinic flow was not possible, owing either to RST stock-outs or to a lack of ANC services scheduled on the day of site visits. Therefore, rollout recurrent staff time estimates are based on the pilot costing and on interviews with HCWs and experts. All pilot and rollout costs are presented in 2012 United States Dollars (USD).
Pilot costs collected in 2010 Zambian Kwacha (ZKW) were converted to USD using the average exchange rate for 2010 (ZKW 4,743.98 = 1 USD [35]), then adjusted to 2012 USD using a 2-year inflation rate of 5.29% (2012 CPI 229.594 / 2010 CPI 218.056 = 1.0529) from the US Consumer Price Index (CPI) [36,37]. During the rollout, non-supply costs were collected in 2012 ZKW or USD; supply costs were collected in 2011 ZKW and inflation-adjusted to 2012 ZKW using Zambia’s CPI (2012 CPI 122.439 / 2011 CPI 115.091 = 1.0638) [38]. All rollout costs in 2012 ZKW were then converted to USD using the average exchange rate for 2012 (ZKW 5,219.83 = 1 USD [39]). Start-up, capital and recurrent cost inputs were collected during the pilot and rollout phases. Recurrent and capital costs were further subdivided into facility- and central-level categories to highlight the implementation costs of incorporating an extensive supervision and QA system during the pilot phase. Start-up is considered an input, rather than an activity, and therefore includes all resources required for training (e.g. personnel, per diems, conference hire, training equipment and supplies, and vehicle transport). For the pilot period only, start-up also included the costs of two district events to launch the RST pilot activities. All start-up costs during the pilot and rollout were annualised over an estimated project life of three years. Capital costs are generally considered to have a life span of more than one year and a cost greater than $100 USD per unit [40]. We included capital cost inputs for vehicles and computers. Annual financial costs were estimated using straight-line depreciation, while economic costs were annualised with a 3% discount rate [41]. RPR-related equipment was not included, as this study focussed on incremental costs to existing programmes.
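As an illustration, the currency conversion and inflation adjustments described above can be expressed in a few lines of Python; all figures are taken from the text, while the function names are ours:

```python
# Pilot: 2010 ZKW -> 2010 USD (average 2010 exchange rate),
# then inflated to 2012 USD using the US CPI.
ZKW_PER_USD_2010 = 4743.98
US_CPI_2010, US_CPI_2012 = 218.056, 229.594

def pilot_cost_to_2012_usd(cost_2010_zkw):
    usd_2010 = cost_2010_zkw / ZKW_PER_USD_2010
    return usd_2010 * (US_CPI_2012 / US_CPI_2010)  # factor ~1.0529

# Rollout supplies: 2011 ZKW inflated to 2012 ZKW using Zambia's CPI,
# then converted to USD at the average 2012 exchange rate.
ZM_CPI_2011, ZM_CPI_2012 = 115.091, 122.439
ZKW_PER_USD_2012 = 5219.83

def rollout_supply_cost_to_2012_usd(cost_2011_zkw):
    zkw_2012 = cost_2011_zkw * (ZM_CPI_2012 / ZM_CPI_2011)  # factor ~1.0638
    return zkw_2012 / ZKW_PER_USD_2012
```

Note that the inflation factor is the later-year CPI divided by the earlier-year CPI, so costs are always expressed in the more recent price level.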
None of the pilot or rollout sites had an on-site vehicle dedicated to ANC services; thus, pilot vehicle costs included travel in the project, MOH, or district health vehicle for routine study monitoring, delivery of supplies, and QA visits. During rollout, the only vehicle costs comprised return travel from Lusaka for supervisory monitoring visits. Vehicles were annualised over a five-year period. The economic costs of computers used for electronic medical records at two pilot facilities in Lusaka were annualised over a three-year period with an allocation factor of 20%. During rollout, the economic costs of computers used during RST training were annualised over three years with an allocation factor ranging from 3 to 12%, based on the number of participants attending training from each facility. Recurrent cost inputs comprised all operating costs throughout the project life, including: personnel, supplies, vehicle fuel and maintenance for supervisory visits, QA/QC, and supervision. Table 2 presents differences between pilot and rollout in terms of unit prices and quantities of healthcare resources consumed for testing commodities. Personnel time and some supplies were considered joint “shared” costs, with an allocation factor applied based on researcher time-motion observation during the pilot. Supply use was not directly observed but was modelled from monthly output data, plus a 10% adjustment to account for estimated supply wastage. Supplies for testing and treating male partners were included for the pilot. Treatment with a single dose of benzathine penicillin (BP) was assumed during the pilot because follow-up doses were not generally recorded in registers and could not be verified.
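The capital annualisation described above (straight-line depreciation for financial costs; a 3% discount rate for economic costs; item-specific allocation factors) can be sketched as follows. The $800 computer price is a hypothetical placeholder, not a figure from the study:

```python
def straight_line_annual(price, life_years):
    """Annual financial cost: simple straight-line depreciation."""
    return price / life_years

def annualised_economic(price, life_years, r=0.03):
    """Annual economic cost: price divided by the annuity factor at rate r."""
    annuity_factor = (1 - (1 + r) ** -life_years) / r
    return price / annuity_factor

# Hypothetical example: a computer with a 3-year life, of which 20% of
# use is attributed to RST work (as for the Lusaka EMR computers).
computer_rst_cost = 0.20 * annualised_economic(800.0, 3)
```

Because the annuity factor for three years at 3% is about 2.83 (rather than 3), the annualised economic cost is slightly higher than the straight-line financial cost, reflecting the opportunity cost of capital.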
Following the pilot and RST policy adoption, national treatment algorithms outlined a single injection of BP following a reactive RST; where RPR confirmation was unavailable, this was followed by two further weekly doses of BP; where RPR confirmation was available, two further BP doses were given if active syphilis infection was confirmed (a description of the algorithms is available in Ansbro et al. [32]). Two of the four rollout facilities (DH1 and RHC3) utilised confirmatory RPR; for these clinics, three doses of BP were assumed in the costing calculations for RPR-confirmed positives. Male partner testing was inconsistent during rollout; therefore, only male partner treatment costs were included. Shared supplies (e.g. biohazard bags, test tubes and needles for blood draw, sharps bins, gloves, cottonwool, and disinfectant) were given a 25% allocation factor to reflect that four blood tests were routinely conducted on ANC patients (HIV, syphilis, haemoglobin, and malaria). In contrast, supplies used only for syphilis testing (e.g. RST test kit, RST job aid, penicillin, water and needle for BP injection) were given an allocation factor of 100%. For facilities that used both venepuncture and finger prick methods of blood collection, we assumed each method was used 50% of the time. (Table 2 notes: *for facilities that reported both finger prick and venous blood draw methods for RST, we assumed 50% for each collection type; †includes 10% supply wastage; ^a higher-cost vacutainer needle was used during the rollout period.) QA/QC was considered a recurrent input that included personnel, supplies, and transportation costs for distributing and collecting known positive and negative samples for testing at the facility level during the pilot phase; a supplementary table is available with further details of the QA/QC cost calculations (S1 Table).
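The allocation-factor and wastage arithmetic applied to supplies can be sketched as below; the unit prices in the examples are hypothetical placeholders rather than study values:

```python
def supply_cost_per_test(unit_price, qty_per_test, shared, wastage=0.10):
    """Cost attributed to RST screening for one supply item, per woman.

    Shared items (gloves, sharps bins, disinfectant, ...) receive a 25%
    allocation factor, since four routine blood tests are run per ANC
    attendee; syphilis-only items (RST kit, BP, ...) receive 100%.
    A 10% wastage adjustment is applied by default.
    """
    allocation = 0.25 if shared else 1.00
    return unit_price * qty_per_test * allocation * (1 + wastage)

# Hypothetical examples: gloves as a shared item, an RST kit as
# a syphilis-only item (prices are illustrative).
gloves_cost = supply_cost_per_test(0.08, 1, shared=True)
kit_cost = supply_cost_per_test(1.00, 1, shared=False)
```

For facilities reporting both blood collection methods, the per-test supply cost would be the simple average of the venepuncture and finger prick costs, consistent with the 50/50 assumption above.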
During the rollout, a formal QA/QC system was not implemented and supervision activities occurred infrequently; therefore, the only central-level costs for supervision included personnel salaries, accommodation, and fuel costs during the supervision trip. Recurrent building utilities and waste management were excluded, given the incremental costing approach. See the supplementary table of data inputs and assumptions for additional information (S2 Table). All project outputs were retrospectively collected from facility-level ANC patient registers during the five-month cost collection period for both pilot and rollout. Outputs included the number of pregnant women attending a first ANC visit, the number of pregnant women and partners screened for syphilis with RST, the number of syphilis-reactive tests, and the number of women and partners treated for syphilis. Unit economic costs were calculated per patient tested and per patient treated at each facility, and a facility average was calculated. We performed univariate and multivariate sensitivity analyses to assess the influence of key inputs on overall unit costs. Cost inputs that were not directly observed, were highly uncertain, or differed substantially between the pilot and rollout periods were varied in the sensitivity analysis, including project life (1 to 5 years), supply wastage rate (0 to 50%), blood collection method (finger prick versus venepuncture), price of RST kits ($0.65 to $1.15), and coverage of RST among first ANC attendees (25% to 100%). Best- and worst-case scenarios were estimated by applying the minimum and maximum values for all parameters varied in the sensitivity analysis.

The RST pilot protocol was approved by the University of Zambia Biomedical Research Ethics Committee (UNZAREC), the University of Alabama at Birmingham’s Institutional Review Board, the WHO’s Research Ethics Review Committee, and the MOH Zambia.
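The best/worst-case construction in the sensitivity analysis above can be illustrated with a simplified unit-cost model. The fixed-cost total, ANC volume, and non-RST variable cost below are hypothetical placeholders; only the RST kit price range ($0.65–$1.15) and coverage range (25–100%) are taken from the text:

```python
def unit_cost(rst_price, coverage, anc_attendees=2000, fixed_costs=3000.0,
              other_variable=1.50, wastage=0.10):
    """Simplified unit cost per woman screened (illustrative only).

    Fixed costs (start-up, capital, supervision) are spread over the
    number of women screened; variable costs accrue per test, with the
    RST kit price adjusted for wastage. All defaults are hypothetical.
    """
    n_screened = coverage * anc_attendees
    variable = rst_price * (1 + wastage) + other_variable
    return fixed_costs / n_screened + variable

# Best case: cheapest kits at full coverage; worst case: the reverse.
best_case = unit_cost(rst_price=0.65, coverage=1.00)
worst_case = unit_cost(rst_price=1.15, coverage=0.25)
```

Lower coverage raises the unit cost because the same fixed costs are spread over fewer women screened, which is one of the mechanisms the study cites for the higher rollout unit cost.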
The follow-up research to collect costs during the national RST rollout was approved by the LSHTM ethics committee, UNZAREC, and the Permanent Secretary of the Zambian MOH. Additionally, written informed consent to conduct data collection related to the national rollout was obtained from each Provincial and District Health Office. During both pilot and rollout, individual patient consent was not sought: de-identified patient outcome data were retrospectively abstracted from routinely collected health register data into aggregate monthly summaries.