In recent years, there has been increasing use of e-learning and blended learning in continuing medical education (1–4). However, comprehensive design, implementation, and impact evaluation methods are needed for possible future replication in similar settings.
E-learning can be defined as the use of information and communication technology, digital tools, or media to deliver education (5). Although aligning teaching strategies with teaching/learning objectives still requires traditional instructional skills, e-learning is a valuable addition to the teaching toolbox, as it provides instructional resources, activities, assessments, and feedback online (1). Blended learning combines e-learning and traditional face-to-face methods to run asynchronous and synchronous learning activities (6–10). The flexibility of blended learning enables the online delivery of content together with the best features of classroom interaction and live instruction to personalize learning, encourage thoughtful reflection, and individualize instruction across a diverse group of learners (11, 12). The increasing use of blended learning is strongly driven by three factors: improved pedagogy, increased access and flexibility, and increased cost-effectiveness (6). The COVID-19 pandemic has undeniably been an additional global driver of e-learning and blended learning development.
Evaluations enable us to determine whether and how well the training accomplished the assigned goals and objectives (13). Evaluation remains one of the biggest challenges for training institutions, particularly workplace training professionals, and few organizations conduct comprehensive evaluations of their training. Several reasons have been reported, such as the time between training and the opportunity to use the skill or knowledge or the challenge of evaluating training and outcomes for complex skills or problem-solving (13). Moreover, few evaluations go beyond assessing learner reaction and satisfaction, which are levels one or two of Kirkpatrick’s four-level model (Reaction, Learning, Behavior, and Results), commonly used for training evaluations (14, 15). The introduction of e-learning and distance support for these training programs does not seem to have improved training evaluations (13).
In Guinea, the development of quality human resources for health, particularly in primary health care, management of sexual and reproductive health services, and research methods, is a priority for health authorities and their partners working on strengthening the health system. Over the past 10 years, internet coverage has increased considerably across the country (from 0.4% in 2010 to 33% in 2018). The Ebola crisis (2014–2016) put several face-to-face training programs on hold and led to the development of a novel blended learning approach. Three blended learning courses were developed and implemented in Guinea. Evaluations of the first two courses (eSSP and eSSR) have already been reported (16, 17). However, these reports primarily focused on completion and success rates, factors associated with success, and learners' perceptions (reactions) of the training (levels 1 and 2 of the Kirkpatrick model) (16, 17). Moreover, reasons for dropout and abstention among training participants, the change in their work behavior (level 3), and the impact of this behavior change on the results of their organizations (level 4) were not assessed. Thus, the main focus of this study was to evaluate (1) reasons for dropout and abstention, (2) change(s) in work behavior reported by learners following the training, and (3) the impact of the work behavior change on the achievements of the organizations or services where the learners work.
Course Development and Implementation
The three courses were developed in French and implemented between 2017 and 2021 by the Maferinyah National Training and Research Center in Rural Health (Maferinyah Center), a training and research institution of the Ministry of Health in Guinea located in Forecariah district (50 km from Conakry). The development and implementation of the courses were funded by the Belgian Development Agency (Enabel) and the Directorate-General for Development Cooperation and Humanitarian Aid (DGD). More details on the Guinean context and the actual development of the courses have been provided earlier (16, 17). The first two courses, on Primary Health Care (eSSP) and Management of Sexual and Reproductive Health Services (eSSR), addressed the local health system context. They targeted health professionals working in health facilities and institutions, as well as medical students at the end of their medical school studies, before starting to work in the field (Table 1). The third course, on Research Methods (eMR), targeted medical students before completing the thesis research component at the end of medical school, and health professionals already involved or intending to be involved in public health and research (Table 1). The "Analyze, Design, Develop, Implement and Evaluate (ADDIE)" instructional design model was used to develop the training modules (18). The capacity to develop and implement the courses was built within the Maferinyah Center team through specific courses on e-learning and continuous coaching. The three courses were delivered through an open online Learning Management System (LMS), the Moodle platform. The eSSP, eSSR, and eMR comprised seven, nine, and six two-week online modules, respectively (Table 1). The three courses consisted of a major online, asynchronous learning component, with modules that could be downloaded in HTML format.
Synchronous learning included a continuous discussion forum, a Zoom call scheduled for each module (initiated during the COVID-19 pandemic, with increased use for the last training cohorts), and a five-day face-to-face capacity-building workshop at the end of the training for a sample of trainees (Table 1).
Study Design and Period
The three implemented blended courses (eSSP, eSSR, and eMR) were evaluated from June to August 2021 in a cross-sectional study using a mixed-methods approach. We collected quantitative and qualitative data in four stages: (i) through the learning platform (Moodle course statistics); (ii) via an electronic questionnaire; (iii) during the learners' capacity-building workshops (opinions on short-term effects); and (iv) in the field, interviewing participants at their workplaces.
Theoretical Framework for Training Evaluation
Completion rate was the number of learners who completed the course by performing all learning activities over the total number of enrollees. Dropout rate was the number of learners who dropped out from the course after completing some activities over the total number of enrollees. Abstention rate was the number of enrollees who ultimately did not log into the online learning platform although they had received all necessary information to access it, over the total number of enrollees. Successful completion rate or success rate was the number of learners who passed the course (with an overall mean of marks greater than or equal to five out of ten), over the number of learners who completed the course. Overall success rate was the number of learners who passed the course, over the total number of enrollees.
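Under these definitions, the abstention, dropout, and completion rates partition the enrollees, and the overall success rate equals the completion rate multiplied by the success rate among completers. A minimal sketch with hypothetical counts may make the arithmetic concrete (the function and figures below are illustrative only, not taken from the study):

```python
def course_rates(enrolled, abstained, dropped_out, passed):
    """Return the evaluation framework's rates as percentages.

    abstained: enrollees who never logged into the platform
    dropped_out: learners who left after completing some activities
    passed: learners with an overall mean mark >= 5/10 among completers
    """
    completed = enrolled - abstained - dropped_out
    return {
        "completion_rate": 100 * completed / enrolled,
        "dropout_rate": 100 * dropped_out / enrolled,
        "abstention_rate": 100 * abstained / enrolled,
        # Success rate is computed among completers only
        "success_rate": 100 * passed / completed,
        # Overall success rate is computed among all enrollees
        "overall_success_rate": 100 * passed / enrolled,
    }

# Hypothetical cohort: 200 enrollees, 20 never logged in, 45 dropped out,
# 100 of the 135 completers passed.
rates = course_rates(enrolled=200, abstained=20, dropped_out=45, passed=100)
```

With these hypothetical counts, the completion rate is 67.5% and the overall success rate is 50%, i.e., the product of the completion rate and the success rate among completers.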
Quantitative Data Collection
We collected selected sociodemographic characteristics and course results (participation, completion, and success rates) for all participants in the three courses through the Moodle open platform statistics. These data provided basic background on the courses before addressing level 2 of the Kirkpatrick model. An additional electronic questionnaire (KoboToolbox) was sent to all course participants by email to better understand behavioral changes at work following the training (level 3 of the Kirkpatrick training evaluation model).
Qualitative Data Collection During the Capacity Building Workshops
During three workshops, each lasting 5 days, which took place between June and July 2021, time was set aside to collect data from the learners who attended. In-depth individual interviews (IDIs) and focus group discussions (FGDs) were carried out by the primary author (TMM) and JMK using an interview checklist. As participants were accommodated at the Maferinyah Center, the IDIs and FGDs were conducted in the evenings and lasted around 50 and 70 min, respectively. This qualitative data collection during the workshops aimed at reaching levels 3 and 4 of the evaluation model.
Five learners per cohort were purposively selected for the workshops according to their performance levels: the first two learners on the list, one from the middle, and the last two (with a low score and/or not having completed the course). The choice of interviewees was purposive and respected the principle of maximum variation and the data saturation model (19).
Additional Qualitative Field Data Collection
A team of three people collected additional qualitative data in the field for seven days to complement the already available data, more specifically to reach levels 3 and 4 of the evaluation model. We collected data from (1) learners who were not selected for the consolidation workshops (whether or not they had passed the courses) and (2) learners in their professional context and situation. We purposively selected interviewees along the Enabel intervention axis (regions of Conakry, Kindia, and Mamou), considering the principle of maximum variation and the data saturation model (19).
For the quantitative analysis, data were extracted from the Moodle platform and KoboToolbox through an Excel spreadsheet and imported into SPSS version 21. Descriptive statistics were reported as proportions for categorical variables and as means with standard deviations for continuous variables.
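The study used SPSS for this step; purely as an illustration, the descriptive statistics named above can be sketched in a few lines of Python with hypothetical records standing in for the exported spreadsheet:

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical respondent records (illustrative only, not study data).
records = [
    {"sex": "male", "age": 34},
    {"sex": "male", "age": 41},
    {"sex": "female", "age": 29},
    {"sex": "male", "age": 38},
]

# Categorical variable: counts converted to proportions (%)
sex_counts = Counter(r["sex"] for r in records)
sex_props = {k: 100 * v / len(records) for k, v in sex_counts.items()}

# Continuous variable: mean and (sample) standard deviation
ages = [r["age"] for r in records]
age_mean, age_sd = mean(ages), stdev(ages)
```

This yields, for instance, the proportion of male respondents and the mean age with its standard deviation, the same summary quantities reported in the Results.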
Regarding qualitative data, the interviews, recorded in French, were transcribed in full. We applied thematic analysis (20, 21) to the responses to open-text questions. Each open-text response was coded in NVivo (TD). Inductive coding was used; codes were discussed with the research team and grouped into themes, following the inductive thematic saturation model (19). Qualitative data were further analyzed to identify and document connections between the themes. Quotes from the open-text responses were used to illustrate findings from the themes and the quantitative results.
The research protocol was approved by the National Ethics Committee for Health Research in Guinea (No: 022/CNERS/2020) and the ITM Institutional Review Board in Belgium (Reference Code: 1363/20). Regarding the qualitative component of the study, free, informed, and oral consent was obtained from each selected participant before carrying out the interviews. Both quantitative and qualitative data were only accessible to the research team. The database is stored on a computer protected by a password at the Maferinyah Center.
Characteristics of Learners of the Three Blended Courses
Overall, 1016 health professionals, including 235 (23.1%) women, applied for the courses (eSSP, eSSR and eMR) through calls for applications launched online between 2017 and 2020 using professional [District.Team Guinea (22)] and social (Facebook) networks.
A total of 543 learners, including 137 (25.2%) women, were selected out of the applicants (Table 2). Overall, the majority of participants were between 30 and 40 years old (n = 379, 70%), male (n = 406, 75%), medical doctors (70%), Guinean nationals, or based in Guinea (87 and 88%, respectively). Women were more represented in the eSSR course (32%) than in the other two courses. Only 23 nurses were enrolled in the three courses, and 14 midwives participated exclusively in the eSSR course. Although the majority of participants resided in Guinea (93%), some learners resided outside Guinea (n = 71) (Table 2).
Table 2. Sociodemographic characteristics of participants to three blended courses in Primary Health Care (eSSP), Management of Sexual and Reproductive Health Services (eSSR) and Research methods (eMR), Guinea, 2018–21.
Main Results for the Three Blended Courses
Results of the three courses presented in Table 3 show that a similar proportion of participants completed all course modules (ranging from 67 to 69%), with dropout rates varying from 20% (eSSP) to 29% (eMR). The success rate (among those who completed the courses) was 72% (eSSP), 83% (eMR), and 85% (eSSR). The overall success rate (among all enrollees) ranged from 50% (eSSP) to 58% (eSSR). More detailed results on factors associated with successful completion rates have been described for the eSSP and eSSR courses and published elsewhere (Table 3) (17).
Table 3. Results of learners on the blended courses in Primary Health Care (eSSP), Management of Sexual and Reproductive Health Services (eSSR), and Research methods (eMR), Guinea, 2018–21.
Characteristics of Participants in the Evaluation of the Courses
A total of 233/543 (43%) participants completed the online questionnaire, and 53 participated in individual in-depth interviews (IDIs) and five focus group discussions (FGDs). Sociodemographic characteristics of the evaluation participants (respondents) were in line with those of course participants (Supplementary Table 1). The majority of respondents were between 30 and 40 years of age (72.2%), with a mean age of 36 years (SD = 6.4), and most (77.7%) were male. Most respondents were Guinean nationals, resided in Guinea (88.8%), and lived in urban areas at the time of the training (88.0%). Medical doctors were the most represented (82.4%), and most respondents were working full-time at the time of the training (71.2%). The majority (n = 211; 90.6%) of respondents had started the course, of whom 163 (77.3%) had completed it and 48 (22.7%) had dropped out (Supplementary Table 1).
Reasons for Dropout During the Course or Not Attending After Enrolment (Abstention)
The main reasons reported for dropping out during the course (after starting at least one module) were interference with other courses or work overload (62.5%), lack of technological skills (10.4%), travel or relocation (8.3%), computer breakdown or loss (6.3%), internet connection problems (4.2%), and illness (4.2%) (Table 4). The main reasons reported for not attending the courses after enrolment (without starting even one module) were moving to a city where the internet network is not stable (31.8%), work overload (18.2%), interference with another ongoing course (18.2%), difficulty in accessing the learning platform (13.6%), and late receipt of information (13.6%) (Table 4).
Change(s) in Work Behavior Reported by Learners Following the Training
Application of Acquired (New) Knowledge and Skills
The majority (86.5%) of the participants reported that they put into practice the knowledge they had acquired during the course through several activities such as supervision (22.0%), service delivery (19.9%), training workshops (14.2%), work meetings (12.8%) and teaching (10.6%) (Table 5). An eSSR learner stated that: “During supervisions, especially integrated supervisions, I am often given the reproductive health part, as they know that I took the eSSR course. So, I calculate and interpret indicators and develop improvement plans for bottlenecks” (EIA, eSSR, female).
An eMR learner asserted: “The studies we carry out in the cardiology department, of course, require the drafting of a research protocol first of all. So, I put into practice the knowledge I acquired in the eMR course by writing research protocols and correcting students’ theses, but also by writing scientific articles for publication. These activities really allow me to apply what I learned in the eMR course” (EIA, eMR, male).
Some participants (13.5%) did not apply what they had learned because the course was not applicable to their daily work or current job.
“I have not yet applied the knowledge and skills I have acquired, but I would like to do so in the future because currently my job is not related to public health; I am in research” (EIA, eSSP, male).
Although they were not yet applying the acquired knowledge, other participants stated that they would continue to read the courses to keep up to date, hoping to practice what they had learned in the future.
Change in Behavior at Work or in Professional Practice
Participants who applied the knowledge and skills acquired in the course reported changes in their work behavior directly related to the course. The main findings are presented in Box 1. Many respondents reported feeling more confident and more comfortable at work following the training. The training also increased the visibility of some participants in their workplaces and was a source of inspiration for new activities in their work. For some respondents, the training allowed them to improve their professional practice and that of colleagues in the same department. The training also enabled some participants to change their behavior in the workplace and provide support to other health services when called upon.
Box 1. Reported behavioral outcomes and impact on work organizations/services.
Impact of the Training on the Achievements of Learners’ Organizations or Services
Trainees also reported a positive impact of the training on the utilization/coverage of services and on revenues for their health facilities (Box 1). Many participants reported an impact of the training on the public health outcomes of the organizations or services where they work, for instance through improved supervision (Box 1).
This study documented a blended learning experience with a major online component recently implemented in Guinea. Results showed that fair success rates can be achieved despite the challenges learners face in a low-resource setting such as Guinea. Following the training, participants also reported applying the acquired knowledge, with an impact on behavior and performance in their workplaces.
E-learning and blended learning are not yet commonly used in Africa. A recent systematic review of global health capacity building and evaluations in low- and middle-income countries (2009–2019) showed that, despite a sharp increase in e-learning or blended courses during the last 4 years, most studies conducted in Africa documented face-to-face teaching modalities (75%) (23). The same review reported that evaluations generally lacked standardization, especially regarding the tools, and that only face-to-face initiatives were evaluated in the long term beyond the individual level (23). Many evaluations are restricted to knowledge assessment through pre- and post-tests, which are indicative but limited (24, 25).
As a result, the findings of this study will further enrich the existing literature on e-learning and blended learning, shedding light on the design, implementation, and evaluation of e-learning and blended learning programs.
We found a course completion rate (70%) similar to that of a blended learning course for building workforce capacity for effective use of health information systems in Namibia and Tanzania (73%) (26), but lower than the completion rate of a tuberculosis course in Ethiopia (90%) (27). Our findings align with those of studies that highlighted avenues to enhance the completion rate of online courses, although many other studies reported lower completion and success results (28–31). The success rates among enrollees in our three courses, varying from 50 to 58%, are greater than those reported by several universities specialized in e-learning in Thailand, India, the United Kingdom, and France (17–48%). As Karsenti et al. (32) reported, our results demonstrate that in African settings, where internet coverage and quality and the uptake of new technologies in education remain a challenge, fair to high success rates can be achieved in e-learning or blended learning. Obtaining good results requires meeting e-learning effectiveness conditions: a simple, attractive, and easily accessible delivery method with user-friendly navigation, and a variety of communication tools for synchronous and asynchronous interaction between instructors and learners and among learners themselves. Other prerequisites or enablers are quality content, a motivating pedagogical approach with clear objectives, varied learning resources, continuous technical and pedagogical support for learners and instructors, attention to ethical aspects, and an incorporated continuous evaluation system (32).
Completing (and dropping out from) online courses remains a significant concern (31). Among those who started at least one module, reasons for dropping out of e-learning or blended learning were mostly related to workload, other work issues, or a lack of technical skills. Very few respondents who dropped out mentioned the internet connection as a reason. However, among respondents who enrolled but did not attend (even one course module), internet connection or technical issues represented a major problem. This corroborates studies reporting that technological limitations can act as a barrier to e-learning within a faculty and geographical context (33–35). A recent Cochrane review identified factors that impact e-learning, such as interaction and collaboration between learners and facilitators, considering learners' motivation and expectations, utilizing user-friendly technology, and putting learners at the center of pedagogy. The same review called for a better understanding of the enablers of and barriers to e-learning, and for a broader framework for making e-learning effective (36).
Capacity-building programs aim to provide the human resources of organizations with the knowledge that will enable them to perform their tasks and to improve their skills and capabilities to innovate in their activities. Bouzguenda argues that applying acquired knowledge and skills deals with action and the validity of learning (37). About nine out of ten respondents applied the knowledge gained during the training, reflecting the relevance of the course content to local capacity-building needs or, in short, the validity of the training. However, some learners (one in ten) could not apply the knowledge gained because of the lack of a link between what they had learned and their current job or tasks, while expressing a willingness to apply it in the future. This is in line with one theory of the factors determining learning transfer, which advocates for a favorable work environment for employees to innovate (37).
Several systematic reviews have assessed e-learning training outcomes among health professionals compared to traditional learning and have shown similar results. Two reviews relating to nursing education reported a high level of satisfaction among nurses or students (38, 39). A recent Cochrane review addressing the effectiveness of e-learning among health professionals concluded that e-learning could make little or no difference in patient outcomes or in health professionals' behaviors, skills, or knowledge (40). The impact on public health results, such as health service coverage and health outcomes, in settings such as Guinea has been less documented. Nevertheless, e-learning offers an alternative method of education. In the Guinean context, it had the added value of training health professionals at their working sites throughout the country, avoiding travel costs and preventing health facilities from being left without key human resources.
We used the Kirkpatrick model as a framework to support a more standardized evaluation. The widely used Kirkpatrick model offers a simple tool and language for dealing with the different outcomes of training and how information about these outcomes can be obtained, as well as a practical approach to the typically complex evaluation process. At the same time, it has limitations: it fails to grasp the complexity of the context and related factors, it shows a certain rigidity, and the hierarchical order and importance of the different levels can be questioned (41, 42). According to Bove and Little, "many training evaluation methods, including Kirkpatrick, are insufficient because they were not created considering the types of data now available. Further, many misconceptions about the relationship between training and workplace performance (e.g., the commonly held myth that completing a task in learning automatically translates to successful performance on the job) have led to poor evaluation practices" (43). The same authors therefore recommend using evaluation and data collection methods that are appropriate for the given training context (43).
Among the study's strengths, it assessed higher levels of the Kirkpatrick evaluation model that are not often addressed in training evaluations. Reasons behind abstention and dropout were explored, and the qualitative component included participants who had completed the courses (whether or not they passed) as well as those who had not. As for limitations, first, the relationship between outcomes and sociodemographic characteristics, covered in an earlier paper (17), was out of the scope of this mixed-methods paper. Second, although this study did assess the impact of the training on learners' daily professional activities, the qualitative interviews were conducted with a limited number of learners and relied on self-report, which is subject to bias (socially desirable responses); we could not interview the learners' superiors or colleagues to crosscheck the information provided.
This evaluation showed fair success rates and a positive impact of the training on learners’ work behavior and the achievements of their organizations.
Data Availability Statement
The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.
The studies involving human participants were reviewed and the research protocol was approved by the National Ethics Committee for Health Research in Guinea (No: 022/CNERS/2020) and the ITM Institutional Review Board in Belgium (IRB Reference Code: 1363/20). The patients/participants provided their written informed consent to participate in this study.
TM, KK, TD, and AD conceived and designed the research protocol. TM, JK, TD, and AD elaborated the evaluation tools. TM and JK collected the data (online survey, interviews and focus group discussions) and performed transcriptions. TM and TD analyzed and interpreted data and drafted the manuscript. All authors critically revised the manuscript and approved the final version before submission, and they are accountable for all aspects of the work.
This evaluation was funded by the Belgian Development Agency (Enabel).
Conflict of Interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fdgth.2022.911089/full#supplementary-material
4. Hugenholtz NIR, De Croon EM, Smits PB, Van Dijk FJH, Nieuwenhuijsen K. Effectiveness of e-learning in continuing medical education for occupational physicians. Occup Med (Chic Ill). (2008) 58:370–2. doi: 10.1093/occmed/kqn053
9. Caner M. The definition of blended learning in higher education. In: Probst H, Killian M, Gallagher E, editors. Blended Learning Environments for Adults: Evaluations and Frameworks. Crete: Information Science Reference (an imprint of IGI Global) (2012). p. 19–34.
10. Ashraf MA, Mollah S, Perveen S, Shabnam N, Nahar L. Pedagogical applications, prospects, and challenges of blended learning in Chinese higher education: a systematic review. Front Psychol. (2022) 12:1–13. doi: 10.3389/fpsyg.2021.772322
16. Millimouno TM, Delamou A, Kourouma K, Kolié JM, Manet H, Thoupia Baldé A, et al. Approche eLearning pour le renforcement des capacités des professionnels de santé en Guinée : une expérience post-Ebola. Sante Publique (Paris). (2021) 32:537–48. doi: 10.3917/spub.205.0537
17. Millimouno TM, Delamou A, Kourouma K, Kolié JM, Béavogui AH, Roegiers S, et al. Outcomes of blended learning for capacity strengthening of health professionals in Guinea. BMC Med Educ. (2021) 1:21. doi: 10.1186/s12909-021-02847-w
19. Saunders B, Sim J, Kingstone T, Baker S, Waterfield J, Bartlam B, et al. Saturation in qualitative research: exploring its conceptualization and operationalization. Qual Quant. (2018) 52:1893. doi: 10.1007/s11135-017-0574-8
22. District.Team. Mobilisation 2.0 des équipes cadres de district. Available online at: http://guinee.district.team/ (accessed October 14, 2021).
23. Naal H, El Koussa M, El Hamouch M, Hneiny L, Saleh S. Evaluation of global health capacity building initiatives in low-and middle-income countries: a systematic review. J Glob Health. (2020) 10:020412. doi: 10.7189/jogh.10.020412
24. Garley A, Eckert E, Sie A, Ye M, Malm K, Afari EA, et al. Strengthening individual capacity in monitoring and evaluation of malaria control programmes to streamline M&E systems and enhance information use in malaria endemic countries. Malar J. (2016) 15:1–8. doi: 10.1186/s12936-016-1354-y
25. Annan RA, Aduku LNE, Kyei-Boateng S, Yuen HM, Pickup T, Pulman A, et al. Implementing effective eLearning for scaling up global capacity building: findings from the malnutrition elearning course evaluation in Ghana. Glob Health Action. (2020) 13:1831794. doi: 10.1080/16549716.2020.1831794
26. Rudd KE, Puttkammer N, Antilla J, Richards J, Heffron M, Tolentino H, et al. Building workforce capacity for effective use of health information systems: evaluation of a blended eLearning course in Namibia and Tanzania. Int J Med Inf. (2019) 131:39–45. doi: 10.1016/j.ijmedinf.2019.08.005
27. Manyazewal T, Marinucci F, Belay G, Tesfaye A, Kebede A, Tadesse Y, et al. Implementation and Evaluation of a Blended Learning Course on Tuberculosis for Front-Line Health Care Professionals. Am J Clin Pathol. (2017) 147:285–91. doi: 10.1093/ajcp/aqx002
34. Bediang G, Stoll B, Geissbuhler A, Klohn AM, Stuckelberger A, Nko’O S, et al. Computer literacy and E-learning perception in Cameroon: the case of Yaounde Faculty of Medicine and Biomedical Sciences. BMC Med Educ. (2013) 13:57. doi: 10.1186/1472-6920-13-57
35. Ibrahim NK, Al Raddadi R, AlDarmasi M, Al Ghamdi A, Gaddoury M, AlBar HM, et al. Medical students’ acceptance and perceptions of e-learning during the Covid-19 closure time in King Abdulaziz University, Jeddah. J Infect Public Health. (2021) 14:17–23. doi: 10.1016/j.jiph.2020.11.007
39. Lahti M, Hätönen H, Välimäki M. Impact of e-learning on nurses’ and student nurses knowledge, skills, and satisfaction: a systematic review and meta-analysis. Int J Nurs Stud. (2014) 51:136–49. doi: 10.1016/j.ijnurstu.2012.12.017