In my first post, I described three principles to guide an online learning strategy. Those principles are:
- Focus on outcomes, not inputs.
- Meet learners where they are, not where you want them to be.
- Design learning experiences based on how people learn, not how you want them to learn.
Today, I want to explore the third principle: design learning experiences based on research that shows how people learn. I’ve spent much of my career at the point where research and practice meet – in the design and development of learning experiences – so I have a lot of thoughts on this topic. I contend that many learning environments can be vastly improved by incorporating what the research tells us about learning, behavior, and cognition. In this post, I distill lessons from my work into a few guiding principles for designing effective and engaging online learning. Since these principles are supported by primary research, I provide links to some of the more interesting studies along the way!
If you are lucky enough to be designing a new training program – perhaps even from the ground up – don’t squander that opportunity! Take time to clarify the outcomes you want to achieve and how you’re going to measure those outcomes. You have an amazing chance to deeply understand your learners and design your training program to meet their particular needs. If you’re interested in the earlier steps of the design process, definitely check out my previous post on meeting learners where they are.
Once you’re clear on those critical parameters (your outcomes, your audience, and how you’ll meet your audience’s unique needs), it’s time to start designing and building the learning experience itself. These steps in the process – planning and building – are the topic of this post. While this post builds on my work in health professional education and medical education, these principles are broadly applicable to many other populations of adult learners.
Three Cardinal Rules for Online Learning Design
There are plenty of frameworks out there for online learning. I’ve found that many of them are quite involved: it’s easy to get bogged down in terminology and theory without ever circling back to practice and evidence of what works. My goal here is to keep things simple and flexible without becoming simplistic. This framework isn’t meant to be a rubric for scoring an online learning experience – if you’re looking for rubrics, I recommend the excellent resources developed by Quality Matters. Instead, these are three rules you can apply to design a new online learning experience or to rapidly evaluate one that already exists. All three are supported by empirical evidence showing that they increase student learning, persistence, or engagement in online settings.
The three cardinal rules are simple. Your online learning experience should be: active and interactive, longitudinal and varied, and applied.
Active and Interactive
There are two components to this rule. “Interactive” is pretty intuitive. This means that your learners should have opportunities to interact with each other and with the instructor. The research is clear on this point – outcomes are better in online learning experiences that have more learner-learner or learner-instructor interactions. I recommend that you start with interactions that are content-centered (discussions, projects, etc.), but you may also want to create space for more social and non-academic interactions.
“Active” is a bit more involved, but still straightforward to implement. At its core, active learning is about cognitive effort. The process of learning requires effort, and up to a point, greater effort drives deeper learning. This is something we all know intuitively: the best way to develop a skill (be it writing, playing the guitar, or interpreting CT scan images) is to practice it. The same applies to learning factual content. For example, answering questions about material leads to more learning than reading text that covers the same content. Active learning comes in all shapes and sizes: it can involve solving problems, answering questions (assessment), grappling with a topic in discussion, developing a project, delivering a presentation, and more.
I am tempted to include feedback as its own cardinal rule, but in truth, feedback is a critical “force multiplier” for active and interactive learning. Low-stakes assessments are so powerful, in part, because feedback on the correct answers gives us an objective way to measure our own learning. Likewise, interactive learning is already full of indirect feedback (from nonverbal cues to constructive disagreement in discussions), but you can – and should – also build in more structured opportunities for feedback.
Longitudinal and Varied
The two sides of this rule are really just different faces of the same coin. “Longitudinal” refers to repeated opportunities to practice what we’ve learned over time. The research couldn’t be clearer on this point: spacing study sessions out over time leads to much greater learning than so-called “massed practice.” However, we humans have terrible study habits, including procrastination and cramming. These habits persist, in part, because of “illusions of fluency”: as we cram, the material begins to feel familiar, and we convince ourselves that we really know and understand it. We trick ourselves into believing that cramming is effective, even though it isn’t. When you’re building a new learning experience, think about how to ensure that your learners revisit key concepts and content over time.
“Varied” refers to engaging with the same content and concepts in a variety of ways, and to mixing up what we learn rather than studying in big blocks. In health care, there are many options: simulation, case studies, rounds, lectures, mentorship, assessment, demonstrations, observation, and more. I find that varied learning leads to much higher engagement in synchronous (live) classes. Live classes that incorporate a variety of activities (for example, short presentations, polls, breakout discussions, and Q&A) are much more interesting than lengthy lectures. Varied practice perfectly complements longitudinal learning: it prevents us (as learners) from getting bored, while ensuring that we still get those repeated and spaced opportunities to practice.
Applied
“Applied” learning is all about never missing an opportunity to anchor learning to learners’ lives and experiences. Adult learners, in particular, are more motivated when they know the material is relevant to their lives and their work. In health care, making learning applied is fairly straightforward: you might use real patient cases, fictional (yet realistic) clinical scenarios, or relevant research studies. The only real limits are your creativity and your persistence in hunting down the right examples. If you’re working in health professional education but still struggling to make the learning applied, it’s worth asking whether that learning is, in fact, relevant to your audience!
Nuances in Implementation
There’s a lot of overlap and synergy between these three cardinal rules. For example, when students practice a clinical procedure together, there are active, applied, and interactive components, all in one learning experience. Giving a case presentation and receiving peer feedback might have active, interactive, varied, and applied components. We should embrace this complexity. As long as it all makes sense and “hangs together,” a learning experience that authentically incorporates more of these cardinal principles creates more opportunities for learning.
Even accounting for those nuances, I think there is power in the simplicity and clarity of the cardinal rules. We can use them to quickly evaluate a learning experience and pinpoint where it does – or does not – apply research-backed online learning practices.