I’ve written previously about why it’s so hard to create highly effective online learning experiences. Online learning poses particular challenges for learners in developing countries, where the average completion rate of massive open online courses is well under 5% [1]. Over the past 18 months, working with a client, I’ve been prototyping and evaluating a new approach to self-paced online training for health workers (HWs) in developing countries. This post briefly synthesizes some of the research-backed techniques we’ve used to create a simple, engaging, and effective learning experience.
What we do now
To overcome the myriad barriers that can inhibit learning, we draw on many best practices from the learning sciences. Some key features include:
- a user experience that is simple and organized. This reduces frustration and sustains learners’ sense of self-efficacy [2].
- continuous engagement in low-stakes assessment with feedback. Assessment that is designed to provide feedback and promote learning rather than to evaluate learners is a powerful way to activate learning offline and online [3, 4]. Each question is followed by a short explanation, which improves learners’ subjective experience [5].
- learning driven by clinical cases and content curated to be directly relevant to learners’ work. This promotes persistence [6] and increases HWs’ interest and motivation [7], while also allowing HWs to directly apply prior experience and knowledge stored in long-term memory.
- modules designed to iteratively repeat and build upon key concepts; this leverages the power of spaced repetition to activate learning [8].
- learning focused on only the most essential content. Given that all people have a limited working memory [9], this design choice improves knowledge retention [10].
- short courses (taking ~2-3 hours to complete) that have deadlines. Both of these factors have been shown to increase course completion rates [11, 12].
- a focus on basic knowledge and skills, rather than advanced clinical practices. This serves two key purposes: first, there are still substantial gaps in foundational knowledge and skills in our target audience [13], and second, these topics are more amenable to learning in a virtual setting than complex clinical knowledge and skills [7].
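To make the spaced-repetition idea above concrete, here is a minimal sketch of a Leitner-style review scheduler. The class names, the interval values, and the scheduling rule are illustrative assumptions for this post, not a description of our actual system:

```python
from datetime import date, timedelta

# Hypothetical Leitner-style spacing: each correct answer moves a concept
# to a longer review interval; a miss resets it to daily review.
INTERVALS_DAYS = [1, 3, 7, 14]  # illustrative spacing, not empirically tuned


class ReviewItem:
    def __init__(self, concept):
        self.concept = concept
        self.box = 0                  # index into INTERVALS_DAYS
        self.due = date.today()

    def record_answer(self, correct, today=None):
        today = today or date.today()
        if correct:
            self.box = min(self.box + 1, len(INTERVALS_DAYS) - 1)
        else:
            self.box = 0              # missed concepts return to daily review
        self.due = today + timedelta(days=INTERVALS_DAYS[self.box])


def due_items(items, today=None):
    """Return the concepts scheduled for review on or before `today`."""
    today = today or date.today()
    return [item for item in items if item.due <= today]
```

In a real course the "items" would be key concepts embedded in modules and assessments rather than flashcards, but the underlying principle of expanding review intervals is the same.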
What we could do in the future
Our pilots demonstrate that this approach is already quite effective, with high completion rates, positive learner feedback, and substantial knowledge gains [13]. As we continue to expand this work, I’ve been thinking of ways to incorporate even more best practices inspired by the learning and behavioral sciences. Some ideas include:
- improving HWs’ extrinsic motivation to learn by securing continuing professional development accreditation of our courses.
- creating more varied, interesting, and challenging learning experiences with new types of automated assessments.
- leveraging the effects of social and workplace norms on motivation by creating tools that allow HWs to share their progress with each other and their supervisors.
- improving the user experience and accessibility for our audience with features like text message-based account verification and an offline-capable phone app.
- incorporating more adaptivity and utilizing a mastery-based approach to learning.
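A mastery-based progression like the one proposed above could be sketched as a simple gate: learners advance to the next module only after demonstrating sufficient accuracy on the current one. The 80% threshold and the function names below are illustrative assumptions, not a specification of our planned implementation:

```python
# Illustrative mastery gate for module progression.
MASTERY_THRESHOLD = 0.8  # assumed cutoff, for illustration only


def mastery_reached(correct, attempted, threshold=MASTERY_THRESHOLD):
    """True once the learner has attempted items and met the accuracy cutoff."""
    return attempted > 0 and correct / attempted >= threshold


def next_module(modules, scores):
    """Return the first module the learner has not yet mastered.

    modules: ordered list of module names
    scores: dict mapping module name -> (correct, attempted)
    """
    for name in modules:
        correct, attempted = scores.get(name, (0, 0))
        if not mastery_reached(correct, attempted):
            return name           # keep practicing this module
    return None                   # all modules mastered
```

Even a simple gate like this changes the learner experience meaningfully: progress becomes contingent on demonstrated understanding rather than on time spent or pages viewed.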
All of these techniques have to be adapted to the specific learning context. And even when they are implemented, it’s critical to continuously use data from pilots and user testing to drive iterative product development and quality improvement.
References
1. Kizilcec RF, Saltarelli AJ, Reich J, Cohen GL. Closing global achievement gaps in MOOCs. Science. 2017;355:251–2.
2. Simunich B, Robins DB, Kelly V. The Impact of Findability on Student Motivation, Self-Efficacy, and Perceptions of Online Course Quality. American Journal of Distance Education. 2015;29:174–85.
3. Szpunar KK, Khan NY, Schacter DL. Interpolated memory tests reduce mind wandering and improve learning of online lectures. PNAS. 2013;110:6313–7.
4. Roediger HL, Karpicke JD. The Power of Testing Memory: Basic Research and Implications for Educational Practice. Perspectives on Psychological Science. 2006;1:181–210.
5. Thomas MP, Türkay S, Parker M. Explanations and Interactives Improve Subjective Experiences in Online Courseware. The International Review of Research in Open and Distributed Learning. 2017;18.
6. Andersen L, Ward TJ. Expectancy-Value Models for the STEM Persistence Plans of Ninth-Grade, High-Ability Students: A Comparison Between Black, Hispanic, and White Students. Science Education. 2014;98:216–42.
7. Bin Mubayrik HF. Exploring Adult Learners’ Viewpoints and Motivation Regarding Distance Learning in Medical Education. Adv Med Educ Pract. 2020;11:139–46.
8. Kerfoot BP, Fu Y, Baker H, Connelly D, Ritchey ML, Genega EM. Online Spaced Education Generates Transfer and Improves Long-Term Retention of Diagnostic Skills: A Randomized Controlled Trial. Journal of the American College of Surgeons. 2010;211:331-337.e1.
9. Forsberg A, Guitard D, Cowan N. Working memory limits severely constrain long-term retention. Psychon Bull Rev. 2021;28:537–47.
10. Brame CJ. Effective Educational Videos: Principles and Guidelines for Maximizing Student Learning from Video Content. CBE Life Sci Educ. 2016;15.
11. Rodriguez BCP, Armellini A, Nieto MCR. Learner engagement, retention and success: why size matters in massive open online courses (MOOCs). Open Learning: The Journal of Open, Distance and e-Learning. 2020;35:46–62.
12. Ihantola P, Fronza I, Mikkonen T, Noponen M, Hellas A. Deadlines and MOOCs: How Do Students Behave in MOOCs with and without Deadlines. In: 2020 IEEE Frontiers in Education Conference (FIE). 2020. p. 1–9.
13. Thomas MP, Kozikott S, Kamateeka M, Abdu-Aguye R, Agogo E, Bello BG, et al. Development of a simple and effective online training for health workers: results from a pilot in Nigeria. BMC Public Health. 2022;22:551.