
The effect of a mobile-learning curriculum on improving compliance to quality management guidelines for HIV rapid testing services in rural primary healthcare clinics, KwaZulu-Natal, South Africa: a quasi-experimental study

Abstract

Background

Despite significant achievements in HIV testing, linkage to antiretroviral therapy and viral load suppression, sub-Saharan Africa continues to report the highest prevalence of HIV/AIDS, with over 26 million people living with the disease. Given the added burden that the COVID-19 pandemic places on already overwhelmed health systems, maintaining the reliability and accuracy of point-of-care (POC) diagnostic results is crucial to sustaining quality service delivery. The integration of technology-based interventions into nurse education curricula is growing, to help prepare students for a practice environment that requires access to large amounts of information. The aim of this study was to determine the effect of a Mobile Learning (mLearning) curriculum on improving the quality of HIV rapid testing services in rural clinics of KwaZulu-Natal (KZN), South Africa.

Methods

To achieve the aim of this study, pre-test and post-test audits were conducted in a quasi-experimental design. Eleven KZN clinics with the highest availability and usage of POC diagnostics were selected from a cross-sectional survey to constitute the study sample. The World Health Organization On-site Monitoring Checklist-Assessment of Quality System was adapted and used as an audit tool to evaluate four key quality components. The effect of the mLearning curriculum on HIV testing quality improvement was determined by statistically comparing pre-audit and post-audit results. The independent-samples t-test and Levene's test were employed to evaluate the equality of measured variables between the two groups. Relationships between variables were estimated using the Pearson pairwise correlation coefficient, and correlations were reported as significant at p < 0.05.

Results

A total of 11 clinics were audited at pre-test, and seven clinics were audited after piloting of the mLearning curriculum. The estimated level of compliance of the participating clinics with quality HIV rapid testing guidelines ranged between poor and moderate quality. The mLearning curriculum was shown to have no statistically significant effect on the quality of POC diagnostic services provided in rural clinics of KZN.

Conclusion

The mLearning curriculum was shown to have no statistically significant effect on the quality of HIV rapid testing services provided in participating clinics; however, multiple barriers to the full adoption of the piloted curriculum were identified. The provision of reliable technology devices and improved internet connection were recommended to enhance the adoption of technology-based interventions necessary to improve access to relevant learning material and updated information.


Background

Sub-Saharan Africa continues to report the highest prevalence of HIV/AIDS globally, with over 26 million people living with HIV/AIDS in 2019 [1]. Nevertheless, much has been achieved in HIV testing and disease management over the years. In 2019, an estimated 81% of people living with HIV knew their HIV status; of these, 82% were accessing antiretroviral treatment, and of those on treatment, 88% were reported as virally suppressed [2]. Rapid HIV testing has also been associated with decreased mother-to-child transmission of HIV and increased linkage to antiretroviral treatment (ART), both locally and globally [3, 4]. The COVID-19 pandemic has added an extra burden to overwhelmed health systems, posing a risk of severe disruptions to HIV services in sub-Saharan Africa [5, 6], and is expected to significantly increase mortality rates due to interruptions of ART supply [5].

In light of the significant role played by HIV rapid tests in improving access to HIV care, and the challenges foreseen in resource-limited settings due to the COVID-19 pandemic, maintaining the reliability and accuracy of test results remains crucial to sustaining quality service delivery. The World Health Organisation (WHO) guidelines for assuring the accuracy and reliability of HIV rapid testing services highlight organisation, personnel, quality control and assessment among the critical components necessary to assure quality service provision in all HIV testing facilities [7]. Poor adherence to guidelines has been observed to severely compromise the quality of HIV testing services provided in resource-limited settings of South Africa [8]. In a previous study, relevant stakeholders with experience in HIV testing and management also highlighted challenges in adhering to quality requirements due to a lack of continuous professional development (CPD) opportunities [9].

The South African National Department of Health has periodically conducted workshops to train and develop primary healthcare (PHC) clinic workers on HIV testing, and has distributed relevant policy documents on HIV testing and management [10]. However, adherence to quality management guidelines in clinics situated in deep rural areas remains a challenge. The aim of this study was to determine the effect of a Mobile Learning (mLearning) curriculum on improving quality components of HIV rapid testing services in rural clinics of KwaZulu-Natal (KZN), South Africa. It is anticipated that the findings will inform policy makers and curriculum designers in designing and implementing an evidence-based curriculum to improve the quality of POC diagnostic services in resource-limited settings, thereby contributing to the strengthening of health systems.

Materials and methods

Study design

In this study, pre-test and post-test audits were conducted in a quasi-experimental design to evaluate the effectiveness of an mLearning curriculum in improving the quality components (organization, personnel, process improvement, and service and satisfaction) of HIV testing services in representative rural clinics of KZN. Quasi-experimental designs are forms of experimental research used to establish the cause and/or effect of an intervention on a population without randomisation [11]. This design was considered appropriate because it has been reported to add substantial value to health research and evidence synthesis, and has been shown to be effective in determining the effects of interventions in 'real life' [12].

Sampling strategy

This study was conducted as a follow up on a cross-sectional study, which involved an audit of 100 rural PHC clinics in rural KZN [13]. The cross-sectional study was aimed at identifying barriers and challenges related to the implementation of POC diagnostics in the province. Eleven clinics (one clinic per KZN district) that have the highest availability and usage of POC diagnostics were selected from the survey and these constituted the sample of this study.

Intervention

The mLearning curriculum was designed to provide accessible, up-to-date training on quality HIV POC diagnostics to PHC nurses and HIV lay counsellors in the rural clinics of KZN. The curriculum content consisted of three learning units (counselling, the HIV testing process, and quality requirements), activities, an online quiz and an online survey. To deliver this content, the online learning platform Moodle was used to create an interactive online course through which participants could experience and give feedback on the material provided. Moodle is a free and open-source learning management system written in PHP and distributed under the GNU General Public License [14, 15]. The platform supports various forms of electronic learning (eLearning), including mLearning through the Moodle app and a web browser.

For ethical purposes, dummy usernames and passwords were created and loaded onto Moodle and sent to participants for access to the online course. Real email addresses were submitted to have accounts activated but participants’ personal identities were not loaded onto the system. Through contact sessions, participants were orientated on accessing the course via the Moodle app and the standard route for distance learning. The course was open for a period of four months, which included an additional extension of one month to accommodate participants reported to be experiencing connection problems.

Learning outcomes: after completing the course, participants would be able to:

  • Define terms, including: HIV infection, AIDS, antibody, antigens, rapid test, window period.

  • Explain the need for quality HIV testing in the prevention and treatment of HIV/AIDS.

  • Identify and prevent factors that may compromise the quality of HIV rapid testing.

  • Understand the responsibilities of the health worker in the prevention and detection of errors prior to, during and post testing.

  • Propose strategies to improve the training program.

More detail on the design and piloting of this intervention has been published in our recent article [16].

Data collection procedure

The effect of the mLearning curriculum was evaluated by comparing pre-audit and post-audit results in a pre-test/post-test design. Pre-test and post-test data were collected using the audit tool described below. Pre-test data were generated before participation in the mLearning course; post-test data were generated after four months of open access to the course. The audit team consisted of the study's primary investigator (PI), a trained research assistant, professional nurses, HIV lay counsellors (where available) and clinic managers, who worked together to conduct the audits. The PI and research assistant presented and explained the components of the checklist; the clinic managers or their representatives and/or HIV lay counsellors responded to the questions and then showed the PI and research assistant the items requested on the audit checklist. To ensure efficiency, relevant clinic personnel were informed of the purpose of the audits and the procedure to be followed prior to clinic visits. The flowchart of our study process is shown in Fig. 1 below:

Fig. 1 Study process flow chart

Audit tool and scoring guide

The World Health Organisation (WHO) provides guidelines for assuring the accuracy and reliability of HIV rapid testing services in rural and resource-limited settings [17], and the South African Department of Health has adopted these guidelines to frame national policies for ensuring the provision of quality POC diagnostic services [18]. In this study, the On-site Monitoring Checklist-Assessment of Quality System [17] was adapted and used as an audit tool (Checklist: Appendix A). The audit tool covered four key components: organization (Org), personnel (Per), service and satisfaction (Ser), and process improvement (Pro). Each component consisted of questions, referred to as "Quality System Essentials", with "yes" or "no" responses that were clear and easy for participants to understand. To determine percentages, 'Yes' responses were coded as 1 and 'No' responses as 0; the sum totals of 'Yes' responses were then calculated and reported as percentages for each component assessed.
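The coding scheme described above can be sketched as follows. This is an illustration only: the component names come from the audit tool, but the responses are invented and are not data from the study.

```python
# Illustrative sketch of the audit scoring described above: 'Yes' is coded
# as 1, 'No' as 0, and each component's score is the percentage of 'Yes'
# responses to its quality essential questions. Responses are invented.

def component_score(responses):
    """Return the percentage of 'Yes' responses for one quality component."""
    coded = [1 if r.strip().lower() == "yes" else 0 for r in responses]
    return 100.0 * sum(coded) / len(coded)

# Hypothetical audit responses for one clinic
audit = {
    "Organization": ["Yes", "No", "Yes"],
    "Personnel": ["Yes", "Yes", "Yes"],
    "Service and satisfaction": ["Yes", "No", "No"],
    "Process improvement": ["No", "No", "Yes"],
}

for component, answers in audit.items():
    print(f"{component}: {component_score(answers):.0f}%")
```

The same per-component percentages can then be averaged across clinics or questions, as the Results section describes.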

Data analysis

All data were collected manually on site, entered into an Excel spreadsheet, and cleaned and validated before being imported into Stata (version 13) for analysis. Frequencies and 95% confidence intervals (CI) were estimated for all 11 audited clinics using the t-test. Cohen's d, Hedges' correction and Glass's delta were employed to estimate the effect size for the two audit groups, with Hedges' correction recommended to minimise bias for sample sizes of less than 20 [19, 20]. Levene's test, an inferential statistic used to evaluate the equality of variances for a variable calculated for two or more groups [21], and an independent-samples t-test were employed to compare quantitative variables between the pre-test and post-test groups. Relationships between variables were then estimated using the Pearson pairwise correlation coefficient, with correlations reported as significant at p < 0.05 [22].
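As a minimal sketch of the comparison pipeline just described, the same tests can be run with SciPy in place of Stata. The pre/post scores below are invented for illustration; the study's actual data are not reproduced here.

```python
# Hedged sketch of the analysis pipeline: Levene's test for equality of
# variances, an independent-samples t-test, and a Pearson correlation.
# The score vectors are hypothetical, not the study's data.
import numpy as np
from scipy import stats

pre = np.array([0.50, 0.56, 0.62, 0.69, 0.75, 0.81, 0.88])   # hypothetical
post = np.array([0.56, 0.62, 0.69, 0.69, 0.75, 0.81, 0.94])  # hypothetical

# Levene's test: evaluates equality of variances between the two groups
lev_stat, lev_p = stats.levene(pre, post)

# Independent-samples t-test (pool variances only if Levene's test passes)
t_stat, t_p = stats.ttest_ind(pre, post, equal_var=(lev_p > 0.05))

# Pearson pairwise correlation between the paired scores
r, r_p = stats.pearsonr(pre, post)

print(f"Levene p={lev_p:.3f}  t-test p={t_p:.3f}  Pearson r={r:.3f}")
```

With p-values above 0.05 for Levene's test and the t-test, one would conclude, as the paper does, that neither the variances nor the means differ significantly between groups.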

Results

Audits were performed at the 11 selected rural clinics prior to the introduction of the mLearning intervention. Post-audits were performed at only seven clinics, those able to access the intervention. This section presents the characteristics of the audited rural PHC clinics of KZN, followed by the pre- and post-audit results and a statistical comparison of the two.

Characteristics of audited rural PHC clinics

Eleven rural PHC clinics from the KZN province in South Africa were pre-audited in March 2021 and post-audited in June 2021. All the audited clinics are located more than 10 km from the nearest town, and nine of the 11 are located 10 km or more from the nearest hospital. The other two clinics, from the uMkhanyakude and Ugu districts, had the closest proximity to a hospital (1.1 km and 4.1 km, respectively). Owing to the gradual phasing out of HIV lay counsellors, HIV testing was performed by a variable number of PHC workers in each clinic. Four clinics reported that their HIV testing programmes were supported by non-profit organisations on site; consolidated record keeping of the number of HIV tests performed daily was therefore reported to be a challenge. None of the audited clinics had access to departmental email addresses, and ten clinic managers reported using personal email accounts to communicate with the KZN Department of Health. The participating clinics' current access to technology resources is illustrated in Table 1; further participant characteristics are reported in our previous study [9].

Table 1 Audited KZN rural PHC clinics access to technology resources

Pre-audit scores for the audited rural PHC clinics in KZN

Pre-audit results showed that the 11 audited clinics' average rating scores for compliance with the WHO guidelines for health care facilities in rural and resource-limited settings ranged between poor and moderate quality (50-87.5%). The Umzinyathi (50%) and King Cetshwayo (56.2%) districts were rated the least compliant, and the Zululand district the most compliant (87.5%). Figure 2 illustrates the average pre-audit scores for each of the clinics audited per district.

Fig. 2 Average quality pre-audit scores per district

Level of compliance to audited WHO HIV rapid testing quality components

Average responses for each audit component were determined from the sum totals of the quality essential questions making up that component. For example, the organisation component consisted of three quality essential questions with frequencies of 82%, 36% and 73%, giving a consolidated average score of (82 + 36 + 73)/3 ≈ 64% for 'organisation'. Figure 3 illustrates the average responses for each of the audit components: organization, personnel, process improvement, and service and satisfaction.

Fig. 3 Level of compliance with audited WHO HIV rapid testing quality components

Figure 3 shows that audited clinics scored highest for the personnel component (88%), followed by the service and satisfaction component (74%). They scored moderate for the organisation component (64%) and lowest for the process improvement component (32%).

Post-audit scores for the audited rural PHC clinics in KZN

The post-audit results showed that the seven audited clinics' average rating scores for compliance with standard quality guidelines also ranged between poor and moderate quality (56.3-87.5%). The Umzinyathi district (56.3%) was rated the least compliant and the Zululand district (87.5%) the most compliant. The districts unable to participate in the mLearning curriculum were the Umgungundlovu, Ilembe, Uthukela and Ugu districts. Figure 4 illustrates the average post-audit scores for each of the clinics audited per district.

Fig. 4 Post-audit average rating scores for each of the clinics audited

The results obtained at the post-audit were similar to those obtained at the pre-audit for each of the audit components evaluated, with a slight improvement for the Umzinyathi district from 50% to 56.3%. Further statistical results are presented below.

Statistical tests for differences between pre and post audit results

The t-test and Levene's test for independent samples were performed for each of the audit components. Detailed results for the organisation component (Question 1) are presented below, followed by a summary of the results for all other components in Table 4.

Using an alpha level of 0.05, an independent-samples t-test was conducted to evaluate whether the average percentages of the tested quality components differed significantly between the pre-test and post-test. For the organization component Q1, the mean (M) was 0.36 at pre-test and 0.43 at post-test. The p-value was 0.798; because this exceeds 0.05, the difference between the two means is not statistically significant. The estimated change was 6.5% (SE = 2.49%); the standard error indicates how much the sample mean would be expected to vary if the study were repeated with new samples from the same population. The 95% confidence interval for the average percentage of the organization component ranged from -0.594 to 0.464. There is thus insufficient evidence (p = 0.798) of a change in the mean for the organization component between pre-test and post-test. Two further measurements for the organization component are reported in Table 2: the mean score for this component increased moderately from 0.73 at pre-test to 0.86 at post-test (Table 2).

Table 2 Comparison results for the pre and post audit results for the organization Component Q1

Levene's test was also conducted to evaluate whether the population variances of the two groups were equal. The results (F = 1.776, p = 0.201) indicate no significant difference between the two variances. We further examined the standard t-test (t = 0.616, p = 0.546), which also indicates no significant difference between the pre-test and post-test groups. The results are illustrated in Table 3.

Table 3 Independent samples test

Table 4 shows that the Levene's test and t-test results for all the quality HIV testing audit components revealed no significant differences between the pre-test and post-test results, since the p-values exceed 0.05. The only exception was one quality essential question under service and satisfaction (question 3, "When reporting to outside providers, is turnaround time appropriate?"), with p = 0.008.

Table 4 Stata results for all audited quality essential components

Similar results were obtained for various questions, most notably the last three questions under the service and satisfaction component. The consistency between the pre- and post-audit results supports the validity of the audit results obtained.

Effect size results

Effect sizes were calculated to determine the practical significance of the study findings using three techniques: Cohen's d, Hedges' correction and Glass's delta. Cohen's d (0.436), Hedges' correction (0.458) and Glass's delta (0.378) indicated a small to medium effect size on the overall audit results (0.3 to 0.5), with Hedges' correction, recommended for small sample sizes, also indicating a medium effect size. Results for the organisation component (Q1) are illustrated in Table 5 below.

Table 5 Independent samples effect sizes
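The three effect-size estimators named above can be sketched as follows. The pre/post samples are invented for illustration, and the formulas are the standard textbook definitions (with the common small-sample approximation for Hedges' correction), not a reproduction of the study's Stata output.

```python
# Hedged sketch of the three effect-size estimators: Cohen's d (pooled-SD
# standardizer), Hedges' correction (shrinks d to reduce small-sample
# bias), and Glass's delta (standardises by the pre-test SD only, as
# Glass et al. (1981) recommend). Samples are hypothetical.
import math

def mean(xs):
    return sum(xs) / len(xs)

def sd(xs):
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def cohens_d(pre, post):
    n1, n2 = len(pre), len(post)
    pooled = math.sqrt(((n1 - 1) * sd(pre) ** 2 + (n2 - 1) * sd(post) ** 2)
                       / (n1 + n2 - 2))
    return (mean(post) - mean(pre)) / pooled

def hedges_g(pre, post):
    df = len(pre) + len(post) - 2
    return cohens_d(pre, post) * (1 - 3 / (4 * df - 1))  # small-sample correction

def glass_delta(pre, post):
    return (mean(post) - mean(pre)) / sd(pre)  # pre-test SD as standardizer

pre = [0.50, 0.56, 0.62, 0.69, 0.75, 0.81, 0.88]   # hypothetical
post = [0.56, 0.62, 0.69, 0.75, 0.81, 0.88, 0.94]  # hypothetical
for name, fn in [("Cohen's d", cohens_d), ("Hedges' g", hedges_g),
                 ("Glass's delta", glass_delta)]:
    print(f"{name}: {fn(pre, post):.3f}")
```

Because Hedges' correction multiplies Cohen's d by a factor slightly below 1, it always yields a slightly smaller magnitude than d on the same data.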

Discussion

Pre- and post-audits were conducted to determine the effect of an mLearning curriculum on the quality improvement of HIV rapid testing services in 11 rural clinics of KZN. Audits covered four quality components: organization, personnel, process improvement, and service and satisfaction. The overall quality of HIV testing services was maintained pre- and post-piloting of the mLearning curriculum, with a slight improvement, from 50% to 56.25%, observed in one of the clinics with the lowest overall compliance scores. This improvement resulted from a better score on the organisation audit component, which could be attributed to improved access to quality documents. However, the statistical results showed no statistically significant differences between the pre- and post-audit results. A significant difference was observed for only one question in the service and satisfaction component, but this was not sufficient to yield a significant effect for the whole component.

Statistical analysis also showed that the differences between the pre- and post-audit samples (n = 11 and n = 7, respectively) had a small to medium effect size on the overall audit results. However, since the results showed no statistically significant difference between the two groups, it was not necessary to determine the magnitude of the effect on the overall results [23]. The smaller sample size at post-audit was due to barriers that hindered 36% of the participants from taking full advantage of the proposed intervention, including poor access to technology devices and poor, slow internet connections, which reduced participation and discouraged course completion. In quasi-experimental analysis, however, it is informative to establish whether the intervention influenced not only the mean between observations but also the standard deviation; Glass et al. (1981) recommend using the standard deviation of the pre-measurement as the standardizer [24]. For the current study, we reported three different effect size (d) techniques, with Hedges' correction recommended for sample sizes of less than 12 [23]. Although the differences were not statistically significant, Hedges' correction indicated a medium effect size, which cannot be considered negligible. This suggests that implementing the evaluated intervention in a conducive environment, where participants are afforded enough time and the necessary resources, may yield more positive results, as recommended below [23].

To the best of the researchers' knowledge, this was the first study to investigate the effect of an mLearning curriculum for POC diagnostics in resource-limited settings. There is, however, evidence on the effects of mLearning in general nursing education. In contrast to this study, a study evaluating the effectiveness of an mLearning module in supporting inter-professional knowledge construction in the health professions found that the module improved the clinical reasoning and inter-professional communication of 98% of participating students [25]. A qualitative systematic review of influences on the implementation of mLearning for medical and nursing education also found mLearning to be beneficial for communication and interaction between learners and the learning material for clinical practice [26]. However, similar to the current study, Lall et al. (2019) highlighted concerns over network connectivity and poor device functionality, especially in clinical settings, as barriers to the adoption of mLearning. The introduction of technology-based interventions such as mLearning has been reported to have the potential to improve the provision of, and access to, updated information and training material for facilities in remote areas, and to have had a positive impact on learning motivation and study performance [26, 27]. However, as also observed in the present study, the challenging realities faced by students, doctors and nurses need to be addressed first [26].

Notable strengths of this study include the use of more than one test statistic to investigate correlations and independent-sample effect sizes, which contributed to the reliability of the results obtained. Auditing the same group, which yielded more coherent results, can also be associated with the reliability and accuracy of the quasi-experiment in this study. Limitations include variations, due to COVID-19 restrictions, in the duration between pre- and post-audits for different clinics; some clinics may therefore not have been sufficiently exposed to the intervention, although the researchers ensured a minimum exposure of three months for all participants. Participants were also given a platform to express their views and evaluate the course for future improvement; the majority complimented the course content for being relevant and easy to access through Moodle, as published in our recent qualitative study [16]. Moreover, 36% of the participating clinics had poor or no internet access, could not participate in the mLearning training, and were therefore audited only once. Nevertheless, conducting audits of this nature enabled the identification of areas of improvement necessary for the future adoption of technology-based interventions.

Based on the identified benefits of technology-based interventions, as well as the barriers to access and connectivity, this study recommends the provision of small, localised WiFi servers in clinics, providing free access to information for all devices within a limited range. This study also recommends the establishment of internal information-sharing platforms at the clinics to improve access to updated quality documents and to encourage interaction between PHC workers. Furthermore, qualitative analysis of the impact of an mLearning curriculum on the quality improvement of POC diagnostic services is recommended to provide more insight into participants' experiences.

Conclusion

The aim of this study, to determine the effect of an mLearning curriculum on improving the quality of HIV rapid testing services in rural clinics of KZN, South Africa, was met. The findings showed that the mLearning curriculum had no statistically significant effect on the quality of HIV rapid testing services in participating clinics. However, critical gaps in the provision of reliable technology devices and internet connections were identified, and the necessary provisions were recommended to ensure the full adoption of technology-based interventions needed to improve access to learning material and updated information.

Availability of data and materials

Further data supporting the conclusions of this paper contain confidential information about participating clinics and are stored in the University archives; they can, however, be made available as supporting documents upon request.

Abbreviations

AIDS: Acquired Immunodeficiency Syndrome

ART: Antiretroviral treatment

KZN: KwaZulu-Natal

HIV: Human Immunodeficiency Virus

eLearning: Electronic Learning

mLearning: Mobile Learning

PHC: Primary Health Care

POC: Point of Care

WHO: World Health Organisation

References

  1. Countries: AFRICA - EAST AND SOUTHERN [Internet]. UNAIDS. 2019 [cited 9 November 2020]. Available from: (www.unaids.org/en/regionscountries/countries.

  2. UNAIDS. Factsheet: global AIDS update. UNAIDS Geneva: 2019;2019. https://aidsinfo.unaids.org/Factsheets.

  3. Mashamba-Thompson TP, Morgan RL, et al. Effect of point-of-care diagnostics on maternal outcomes in human immunodeficiency virus–infected women: systematic review and meta-analysis. Point Care. 2017;16(2):67.

    Article  Google Scholar 

  4. Horwood C, Haskins L, et al. Prevention of mother to child transmission of HIV (PMTCT) programme in KwaZulu-Natal, South Africa: an evaluation of PMTCT implementation and integration into routine maternal, child and women’s health services. Tropical Med Int Health. 2010;15(9):992–9.

    CAS  Google Scholar 

  5. Jewell BL, Smith JA, et al. Understanding the impact of interruptions to HIV services during the COVID-19 pandemic: A modelling study. EClinicalMedicine. 2020;26: 100483.

    Article  Google Scholar 

  6. Mhango M; Chitungo I, et al. COVID-19 Lockdowns: Impact on Facility-Based HIV Testing and the Case for the Scaling Up of Home-Based Testing Services in Sub-Saharan Africa. AIDS Behav. 2020:1–3.

  7. World Health O. Consolidated guidelines on HIV testing services, 2019. Geneva: World Health Organization; 2020. p. 2020.

    Google Scholar 

  8. Mwisongo A, Peltzer K, et al. The quality of rapid HIV testing in South Africa: an assessment of testers’ compliance. Afr Health Sci. 2016;16(3):646–54.

    Article  Google Scholar 

  9. Chamane N, Kuupiel D, et al. Stakeholders’ Perspectives for the Development of a Point-of-Care Diagnostics Curriculum in Rural Primary Clinics in South Africa—Nominal Group Technique. Diagnostics. 2020;10(4):195.

    Article  Google Scholar 

  10. National-Department-of-Health. National HIV Testing Services: Policy. In: Health RoSA, editor. Pretoria: National Department of Health; 2016.

  11. Eliopoulos GM, Harris AD, et al. The use and interpretation of quasi-experimental studies in infectious diseases. Clin Infect Dis. 2004;38(11):1586–91.

    Article  Google Scholar 

  12. Bärnighausen T, Tugwell P, et al. Quasi-experimental study designs series—paper 4: uses and value. J Clin Epidemiol. 2017;89:21–9.

    Article  Google Scholar 

  13. Mashamba-Thompson TP, Drain PK, et al. Evaluating the accessibility and utility of HIV-related point-of-care diagnostics for maternal health in rural South Africa: a study protocol. BMJ Open. 2016;6(6): e011155.

    CAS  Article  Google Scholar 

  14. Lambda-Solutions. Introduction to Moodle Canada: Lambda Solutions; 2019 [Available from: https://www.elearninglearning.com/moodle/platform.

  15. Moodle.org. Moodle-open-learning plartform. 2020.

  16. Chamane N; Thompson R, et al. Designing and Piloting of a Mobile Learning Curriculum for Quality Point-Of-Care Diagnostics Services in Rural Clinics of KwaZulu-Natal, South Africa. Front Reprod Health. 2022.

  17. Guidelines for Assuring the Accuracy and Reliability of HIV Rapid Testing: Applying a Quality System Approach. 2005. https://apps.who.int/iris/handle/10665/43315.

  18. National-Department-of-Health. HIV counselling and testing (HCT) policy guidelines. Pretoria: Government Printers Pretoria; 2010. p. 22–3.

  19. Hedges LV. Distribution theory for Glass's estimator of effect size and related estimators. journal of Educational Statistics. 1981;6(2):107–28.

  20. Stephanie G. "Hedges’ g: Definition, Formula" From StatisticsHowTo.com: Elementary Statistics for the rest of us 2020.

  21. Derrick B; Ruck A, et al. Tests for equality of variances between two samples which contain both paired observations and independent observations. J Appl Quant Methods. 2018; 13(2).

  22. Heß S. Randomization inference with Stata: A guide and software. Stand Genomic Sci. 2017;17(3):630–51.

    Google Scholar 

  23. Wolverton S, Dombrosky J, et al. Practical significance: ordinal scale data and effect size in zooarchaeology. Int J Osteoarchaeol. 2016;26(2):255–65.

    Article  Google Scholar 

  24. Glass GV, McGaw B, Smith ML. Meta-analysis in social research. Beverly Hills: Sage Publications; 1981.

  25. Floren LC, Mandal J, et al. An innovative mobile learning module to support interprofessional knowledge construction in the health professions. Am J Pharm Educ. 2019;84:1–22.

  26. Lall P, Rees R, et al. Influences on the implementation of mobile learning for medical and nursing education: qualitative systematic review by the Digital Health Education Collaboration. J Med Internet Res. 2019;21(2): e12895.


  27. Li KC, Lee LY-K, et al. The effects of mobile learning for nursing students: an integrative evaluation of learning process, learning motivation, and study performance. Int J Mobile Learning Organisation. 2019;13(1):51–67.



Acknowledgements

We wish to acknowledge the KZN Department of Health for granting us permission to conduct this study, and all the participating clinic managers and PHC professionals who welcomed us into their clinics, took the time to attend workshops, and shared their knowledge and experiences with the primary investigators.

Funding

Not applicable.

Author information


Contributions

N.C. and T.P.M-T. conceptualised the study. N.C. and R.E.O. conducted the statistical analysis. All authors continuously updated and verified the information in this study. All authors agreed to the published version of the manuscript.

Corresponding author

Correspondence to Nkosinothando Chamane.

Ethics declarations

Ethics approval and consent to participate

Ethical approval to conduct this study was obtained from the KwaZulu-Natal Health Research Committee (KZ_201904_008) and the University of KwaZulu-Natal (UKZN) Biomedical Research Ethics Committee (BF514/18). Permission to conduct audits in the 11 participating clinics was obtained from all 11 district health managers of KZN. Written informed consent was also obtained from each of the 11 participating clinic managers.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Chamane, N., Ebenezer Ogunsakin, R. & Mashamba-Thompson, T.P. The effect of a mobile-learning curriculum on improving compliance to quality management guidelines for HIV rapid testing services in rural primary healthcare clinics, KwaZulu-Natal, South Africa: a quasi-experimental study. BMC Health Serv Res 22, 624 (2022). https://doi.org/10.1186/s12913-022-07978-4


Keywords

  • Mobile Learning
  • Quality rapid HIV testing
  • Quasi-experimental design