Health management information system (HMIS) data verification: A case study in four districts in Rwanda


Study Justification:
– Reliable Health Management and Information System (HMIS) data is crucial for monitoring, evaluation, and research in healthcare delivery.
– Variable HMIS data quality in low- and middle-income countries limits its value.
– This study aimed to assess the quality of Rwandan HMIS data for maternal and newborn health.
Highlights:
– High proportions of health facilities achieved acceptable verification factors (VFs) for certain data elements, such as the number of deliveries, antenatal care new registrants, live births, and newborns receiving postnatal care.
– However, the proportion of facilities with acceptable VFs was lower for data elements such as women receiving iron/folic acid, syphilis testing at the first ANC visit, and ANC standard visits.
– Over-reporting was observed for ANC standard visits.
Recommendations:
– Ongoing data quality assessments and training should be conducted to address gaps and improve HMIS data quality.
– Special attention should be given to data elements that require more complex calculations and knowledge, such as ANC-related data.
Key Role Players:
– Ministry of Health (MoH)
– Health facility staff
– Data managers
– ABC project team
– Trained data collectors
Cost Items for Planning Recommendations:
– Training materials and resources
– Data collection tools
– Staff time for data verification and corrections
– Monitoring and evaluation activities
– Capacity-building initiatives for health facility staff
– Technical support for data management and analysis

The strength of evidence for this abstract is 7 out of 10.
The evidence in the abstract is based on a cross-sectional study conducted in four districts in Rwanda. The study compared Health Management and Information System (HMIS) data to facility source documents for 14 maternal and newborn health data elements. The study used World Health Organization guidelines on HMIS data verification and calculated verification factors (VFs) to assess the agreement between HMIS and facility data. The abstract provides specific results for each data element and concludes that there is variable HMIS data quality. The abstract also suggests ongoing data quality assessments and training to improve HMIS data quality. While the study design and methodology are appropriate, the abstract does not provide information on the sample size or representativeness of the districts studied. Additionally, it does not mention any limitations or potential biases in the study. To improve the evidence, future studies could include a larger and more diverse sample, address potential biases, and provide a more comprehensive discussion of the limitations.

Introduction: Reliable Health Management and Information System (HMIS) data can be used at minimal cost to identify areas for improvement and to measure the impact of healthcare delivery. However, variable HMIS data quality in low- and middle-income countries limits its value in monitoring, evaluation and research. We aimed to review the quality of Rwandan HMIS data for maternal and newborn health (MNH) based on the consistency of HMIS reports with facility source documents.

Methods: We conducted a cross-sectional study in 76 health facilities (HFs) in four Rwandan districts. For 14 MNH data elements, we compared HMIS data to facility register data recounted by study staff for a three-month period in 2017. An HF was excluded from a specific comparison if the service was not offered, source documents were unavailable, or at least one HMIS report was missing for the study period. World Health Organization guidelines on HMIS data verification were used: a verification factor (VF) was defined as the ratio of register data over HMIS data. A VF below 0.90 and a VF above 1.10 indicated over- and under-reporting in HMIS, respectively.

Results: High proportions of HFs achieved acceptable VFs for data on the number of deliveries (98.7%; 75/76), antenatal care (ANC1) new registrants (95.7%; 66/69), live births (94.7%; 72/76), and newborns who received first postnatal care within 24 hours (81.5%; 53/65). The proportion was slightly lower for the number of women who received iron/folic acid (78.3%; 47/60) or were tested for syphilis in ANC1 (67.6%; 45/68), and was lowest for the number of women with an ANC1 standard visit (25.0%; 17/68) and a fourth standard visit (ANC4) (17.4%; 12/69). The majority of HFs over-reported ANC4 (76.8%; 53/69) and ANC1 (64.7%; 44/68) standard visits.

Conclusion: HMIS data quality varied by data element, with some indicators showing high quality and consistent reporting trends across districts. Over-reporting was observed for ANC-related data requiring more complex calculations, i.e., knowledge of gestational age and scheduling to determine ANC standard visits, as well as for quality indicators in ANC. Ongoing data quality assessments and training to address gaps could help improve HMIS data quality.
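To illustrate the verification rule with hypothetical numbers: if a facility's register recount shows 95 deliveries over the study period while its HMIS reports show 100, the VF is 95/100 = 0.95, within the acceptable range of 0.90 to 1.10; a recount of 80 against the same 100 reported (VF = 0.80) would indicate over-reporting in HMIS, and a recount of 120 (VF = 1.20) would indicate under-reporting.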

This was a cross-sectional study conducted to assess the quality of Rwandan HMIS data, measured as agreement between HMIS data and facility source documents, for 76 HFs on 14 HMIS data elements related to maternal and newborn health that were used to monitor quality and progress and to inform quality improvement efforts through the ABC intervention (Table 1). The antenatal care (ANC) register was the source document for 5 data elements that were only reported by health centers (HCs), while the maternity and postnatal care (PNC) registers were the source documents for 7 data elements that were reported by both HCs and hospitals. Two data elements related to neonatal admissions and deaths were recounted from the neonatology care unit (NCU) register and were only reported by hospitals.

This study included 48 HFs in Gakenke and Rulindo districts in Northern Rwanda and 28 HFs in Gisagara and Rusizi districts in Southern and Western Rwanda, respectively. The 76 HFs were grouped into seven hospital catchment areas (HCAs): Nemba District Hospital (15 HFs), Ruli District Hospital (10 HFs), Kinihira Provincial Hospital (9 HFs), Rutongo District Hospital (14 HFs), Gakoma District Hospital (6 HFs), Kibilizi District Hospital (10 HFs) and Mibilizi District Hospital (12 HFs). These MoH-operated facilities were included because they received the ABC intervention and represented 14% (69/499) of health centers and 15% (7/48) of hospitals across all 30 districts in Rwanda [36].

The ABC project was originally implemented in 2013 by Partners in Health/Inshuti Mu Buzima (PIH/IMB) in Kayonza and Kirehe districts in Eastern Rwanda, in partnership with the Rwanda MoH [34]. ABC was later scaled up to Gakenke and Rulindo (July 2017) and then to Gisagara and Rusizi (October 2017). The ABC scale-up project used health facility-based data, collected monthly through the Rwanda HMIS, to monitor indicator progress from baseline to the end of the project and to evaluate its impact. In addition, the project underwent the HMIS data verification process by recounting the same data elements in standardized HF registers for the same reporting periods. Five data elements related to antenatal care (ANC) services (ANC new registrants, syphilis testing and iron/folic acid distribution at the first ANC visit, and the number of women with first and fourth ANC standard visits) were only reported by health centers, whereas two data elements on the number of neonatal admissions and neonatal deaths in the hospital neonatology care unit were specific to hospitals [37]. Data are recorded using standardized registers developed by the MoH and provided to all HFs (see Table 1 for data sources); women attending ANC are recorded at their first ANC visit and given an ANC card and an ANC number that facilitates continuity of individual-level data recording for ANC. This study reports data from the baseline period prior to the ABC intervention in the 7 HCAs.

The ABC team worked with the MoH national HMIS team to extract HMIS report data for the study periods. HMIS data collection starts at the reporting facility, with clinical staff in each care service registering patients/clients and the care provided to them in standardized registers and/or medical files [38,39]. Then, for monthly reporting to HMIS, the facility data manager ensures the distribution of paper HMIS reporting forms to heads of services by the 25th of each month.
The head of service collects the data relevant to their specific service and submits a completed HMIS report for the previous month to the facility data manager by the 3rd day of the month following the reporting month. For timely reporting, the facility data manager should upload all facility data into DHIS 2 by the 5th day of every month. Data verification by the facility team and corrections in the system are only allowed between the 5th and 15th of each month. Any request to change data in the system after the 15th of the month must be submitted to the central MoH, and access is only granted upon strong justification of the request.

A specialized team of two trained ABC data collectors visited all HFs under study and recounted the same data in the standardized HF source documents for the same reporting periods. ABC baseline data (April-June 2017 for the 48 HFs in Gakenke and Rulindo districts and July-September 2017 for the 28 HFs in Gisagara and Rusizi) were collected during July 31-September 19, 2017 and November 14, 2017-January 11, 2018, respectively. In particular, because the unit used to record gestational age (GA) in the ANC register varied between weeks and months by facility and care provider, ABC data collectors worked with the midwives or nurses responsible for providing ANC to standardize the calculation of GA in weeks before recounting data on ANC1 and ANC4 standard visits, as reporting to HMIS on these data elements is based on GA calculated in weeks. The data collection team used a pregnancy wheel together with the recorded date of last menstrual period and the dates of ANC visits for individual women who attended ANC to determine GA at each visit. For all data elements, data collection was done in consultation with the health facility staff responsible for routine reporting of data into HMIS.

We used the WHO DQR guidelines on data verification and system assessment to calculate verification factors (VFs) for each data element [17]. A VF was defined as the ratio of register data to HMIS data (Eq 1). ABC baseline data were aggregated for the three-month reporting period, and HMIS and facility source document data were compared by data element and HF. At the HCA level, a VF was calculated by summing all non-missing values for each data element across all reporting HFs under that HCA during the study period, and then taking the ratio of the aggregated recounted data to the aggregated HMIS data. In addition, VFs for data elements with rare events were only calculated at the HCA level, where aggregated data were compared to avoid denominators with true zero values that would be expected if these data were compared at the HF level. For each data element, we excluded from our analyses any HFs that were not eligible for reporting on that data element or that had either incomplete HMIS data or missing source document data for any month during the reporting period. A VF of 1.00 indicated a perfect match between recounted data and HMIS data. The acceptable margin of error for the discrepancy between HMIS report data and recounted data in facility registers was 0.90 ≤ VF ≤ 1.10, based on the WHO DQR guidelines. A VF below 0.90 and a VF above 1.10 indicated over-reporting and under-reporting in HMIS data, respectively. We used Stata v.15.1 (StataCorp, College Station, TX, USA) and the R Language and Environment for Statistical Computing for analysis and visual presentation of data [40].
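To make the verification arithmetic concrete, the following is a minimal Python sketch of the facility- and HCA-level VF calculation described above. It is illustrative only: the facility names, data elements, and counts are invented, the gestational-age helper uses one common completed-weeks convention, and the study's actual analysis was carried out in Stata and R.

```python
from datetime import date

# WHO DQR acceptable range for the verification factor (VF)
ACCEPT_LOW, ACCEPT_HIGH = 0.90, 1.10


def gestational_age_weeks(lmp: date, visit: date) -> int:
    """Completed weeks of gestation between the last menstrual period and an
    ANC visit, mirroring the pregnancy-wheel standardization described above
    (one common convention: whole weeks elapsed)."""
    return (visit - lmp).days // 7


def verification_factor(recount, hmis):
    """VF = recounted register data / HMIS reported data (Eq 1 in the text)."""
    if not hmis:
        return None  # undefined at HF level; rare events are aggregated to HCA level
    return recount / hmis


def classify(vf):
    if vf is None:
        return "not calculable"
    if vf < ACCEPT_LOW:
        return "over-reporting in HMIS"
    if vf > ACCEPT_HIGH:
        return "under-reporting in HMIS"
    return "acceptable"


# Hypothetical three-month totals per facility for one data element;
# None marks a facility excluded for that element (service not offered,
# missing register, or at least one missing HMIS report).
deliveries = [
    {"facility": "HC 1", "recount": 97, "hmis": 100},
    {"facility": "HC 2", "recount": 80, "hmis": 100},
    {"facility": "HC 3", "recount": None, "hmis": 55},
]

# Facility-level VFs (excluding facilities with missing data)
complete = [r for r in deliveries if r["recount"] is not None and r["hmis"] is not None]
for row in complete:
    vf = verification_factor(row["recount"], row["hmis"])
    print(f"{row['facility']}: VF = {vf:.2f} ({classify(vf)})")

# HCA-level VF: sum the non-missing recounted and HMIS values across the
# reporting facilities in the catchment area, then take the ratio of the sums.
hca_vf = verification_factor(
    sum(r["recount"] for r in complete),
    sum(r["hmis"] for r in complete),
)
print(f"HCA-level VF (deliveries): {hca_vf:.2f} ({classify(hca_vf)})")

# Example of the gestational-age standardization used for ANC standard visits
print(gestational_age_weeks(date(2017, 1, 10), date(2017, 4, 11)), "weeks")
```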
The ABC scale-up project received approval from the Rwanda MoH to access HMIS data for the project's indicators for all HFs that received the intervention. This study was approved by the Rwanda National Ethics Committee (Kigali, Rwanda, protocol #0067/RNEC/2017). As this study was completed using de-identified, routinely collected aggregate data, informed consent was not required.

The innovation described in the title and description is the use of a Health Management Information System (HMIS) data verification process to improve the quality of maternal health data in Rwanda. This process involves comparing HMIS data to facility source documents to assess the consistency and accuracy of the reported data. The study found variable HMIS data quality for different maternal and newborn health indicators, with some indicators showing high quality and consistency in reporting across districts, while others showed over- or under-reporting. The study suggests that ongoing data quality assessments and training could help improve the HMIS data quality.
AI Innovations Description
The recommendation based on the study is to implement a Health Management Information System (HMIS) data verification process to improve the quality of maternal health data in Rwanda. The study found that there were inconsistencies between HMIS data and facility source documents, indicating variable data quality. By implementing a data verification process, the accuracy and reliability of HMIS data can be improved, which is crucial for monitoring and evaluating healthcare delivery and informing quality improvement efforts. The verification process involves comparing HMIS data to facility register data for specific data elements related to maternal and newborn health. A verification factor (VF) is calculated as the ratio of register data over HMIS data; a VF between 0.90 and 1.10 is considered acceptable, while a VF below 0.90 indicates over-reporting and a VF above 1.10 indicates under-reporting in HMIS data. Ongoing data quality assessments and training can help address gaps and further improve the quality of HMIS data.
AI Innovations Methodology
Based on the provided description, one potential innovation to improve access to maternal health is the implementation of a digital health management information system (HMIS). This system would digitize the collection, storage, and analysis of maternal health data, making it more efficient and accurate. It would allow for real-time monitoring of key indicators, such as the number of deliveries, antenatal care visits, and postnatal care, enabling healthcare providers to identify areas for improvement and measure the impact of interventions.

To simulate the impact of this recommendation on improving access to maternal health, a methodology could be developed as follows:

1. Baseline Data Collection: Collect data on maternal health indicators from the existing paper-based HMIS in the selected districts. This would involve extracting data from facility registers and reports for a specific period.

2. System Development: Develop a digital HMIS platform that can capture and store the collected data. The system should be user-friendly and accessible to healthcare providers at all levels.

3. Data Migration: Transfer the collected data from the paper-based system to the digital HMIS platform. This may involve manual data entry or automated data extraction, depending on the availability of electronic records.

4. Data Verification: Verify the accuracy and consistency of the migrated data by comparing it with the original paper-based records. This step ensures data integrity and minimizes errors during the migration process (a minimal sketch of such a check follows this section).

5. System Implementation: Roll out the digital HMIS platform in the selected districts, providing training and support to healthcare providers on its usage. This step may involve the procurement of necessary hardware and software infrastructure.

6. Data Analysis: Analyze the collected data using appropriate statistical methods to assess the impact of the digital HMIS on improving access to maternal health. Key indicators, such as the number of antenatal care visits, deliveries, and postnatal care, can be compared before and after the implementation of the system.

7. Evaluation and Feedback: Evaluate the effectiveness of the digital HMIS in improving access to maternal health based on the analysis results. Gather feedback from healthcare providers and stakeholders to identify any challenges or areas for further improvement.

8. Scaling Up: If the digital HMIS proves to be effective, consider scaling up its implementation to other districts or regions. This would involve adapting the system to local contexts and ensuring sustainability through ongoing training and support.

By following this methodology, the impact of implementing a digital HMIS on improving access to maternal health can be simulated and evaluated. It would provide valuable insights into the potential benefits of such an innovation and guide decision-making for its wider adoption.
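To ground step 4, here is a minimal Python sketch of how migrated digital totals could be checked against the original paper-based register totals and flagged for review. It is a sketch under stated assumptions: the facility names, indicators, counts, and the 10% tolerance are all hypothetical, and no particular digital HMIS platform or DHIS 2 API is assumed.

```python
# Hypothetical consistency check for step 4 (verification of migrated data).
# Facility names, indicators, counts, and the tolerance are invented for
# illustration; no specific digital HMIS platform or API is assumed.

TOLERANCE = 0.10  # flag monthly totals that differ by more than 10%

paper_totals = {   # monthly totals recounted from paper registers
    ("HC 1", "2017-04", "ANC first visits"): 120,
    ("HC 1", "2017-04", "deliveries"): 35,
    ("HC 1", "2017-05", "deliveries"): 33,
}

digital_totals = {  # the same totals as they appear after migration
    ("HC 1", "2017-04", "ANC first visits"): 118,
    ("HC 1", "2017-04", "deliveries"): 41,
    ("HC 1", "2017-05", "deliveries"): 33,
}

for key, paper_value in sorted(paper_totals.items()):
    digital_value = digital_totals.get(key)
    if digital_value is None:
        print(f"{key}: missing after migration -> re-enter from the paper record")
        continue
    if paper_value == 0:
        print(f"{key}: paper total is zero -> review manually")
        continue
    diff = abs(digital_value - paper_value) / paper_value
    status = "OK" if diff <= TOLERANCE else "FLAG for manual review"
    print(f"{key}: paper={paper_value} digital={digital_value} "
          f"diff={diff:.1%} -> {status}")
```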
