How do you make e-learning engaging and not just interactive?

Nov 15, 2023


I've been asked to brainstorm solutions to our problem with engagement in our knowledge-transfer training. None of it trains in a way that asks the learner to apply the knowledge to the tasks they carry out in their job roles.


In our case, it always turns into a variation of the talking PowerPoint with a video: essentially a self-study variant of a lecture, with a multiple-choice quiz at the end.


When I want to apply interactivity, I bump my head into the usual suspects:

Click to reveal, click a hotspot, drag to sort, drag to match pairs, and something with textual input.


The problem with those is that they work more like behavioural tar pits dug in front of the learners, or a hidden Next button, to slow them down. They do not actually give the learner a chance for relevant interaction that relates to the topics taught.


I've worked with Short Sims, where you create branching and feedback depending on choice. These work well, but are a bit more challenging to make.


Are there others I might have missed? Apart from making nicer slides and even better videos with enthusiastic presenters, preferably SMEs.

9 Replies
Judy Nollet

The big question: What should the folks be able to do after the training?

Yeah, they don't need to do what those "usual suspects" ask them to do... 

Ideally, training would let them practice the tasks they do on the job. It would do so in a safe yet challenging environment that provides access to the necessary information. (In some situations, I think the most important thing a person can learn is where to find a relevant job aid when they actually need it. 😄) 

Thus, the "CCC" approach: set the context, give them a choice, and then the consequence(s) of that choice. In other words, "branching and feedback." 
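If it helps to see the shape of it, a CCC step is really just data: a context, a set of choices, and the consequence each choice leads to. A minimal sketch, with illustrative names and content (not tied to any particular authoring tool):

```python
# Hypothetical sketch: one CCC (Context-Choice-Consequence) step of a
# branching scenario, modeled as plain data. All names and scenario
# text here are made up for illustration.

from dataclasses import dataclass

@dataclass
class Choice:
    label: str          # what the learner clicks
    consequence: str    # feedback shown after choosing
    next_node: str      # id of the node this choice branches to

@dataclass
class Node:
    node_id: str
    context: str        # the situation presented to the learner
    choices: list[Choice]

# A tiny two-node scenario
scenario = {
    "start": Node(
        "start",
        "A client asks why their results vary between batches.",
        [
            Choice("Explain the measurement principle in depth",
                   "The client loses interest; they care about results, not physics.",
                   "end"),
            Choice("Ask what the results are used for",
                   "The client explains their QA process, so you can target your answer.",
                   "end"),
        ],
    ),
    "end": Node("end", "Scenario complete.", []),
}

def play(node_id: str, pick: int) -> str:
    """Return the consequence of picking choice `pick` at node `node_id`."""
    return scenario[node_id].choices[pick].consequence
```

The point of the structure: every choice carries its own consequence and its own branch target, so feedback is specific to the decision rather than a generic "correct/incorrect."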

And, yes, good scenarios/simulations are more challenging to create. If SMEs are available, don't just ask them what people need to do. Ask about the ways that things can go wrong.

Or, depending on the topic, you might be able to prompt an AI tool for ideas. You'd still have to put those into a functioning scenario, but it might, at least, generate more situations to work with. 

Christy Tucker

In addition to branching scenarios (which are the best option for some skills, but not for everything), you can look for other ways to have people practice skills that aren't necessarily a full branching scenario.

For example, a lot of my projects include one-question mini-scenarios. These are easy to write and build, but they still require a bit more thinking (i.e., cognitive engagement) than just clicking (i.e., behavioral engagement).

Open-ended text questions for reflection can be useful too, even if you're not scoring them. There's value in having people generate their own answers and reflect on how the learning connects to their own experiences.

What are the actions people need to take on the job? What decisions do they need to make? If you can identify those, it's easier to come up with ideas to replicate those actions and decisions through more meaningful activities.

Soren J Birch

That was a nice link, and thank you for your help. It gave me something to work with and think about. I am sitting here with pen and paper, trying to think through one of our scenarios.

I feel it is kind of like teaching orbital mechanics to the rocket sales guy and the guy responsible for preparing the rockets. It doesn't hurt, but at the end of their work day, it isn't important to them: rockets are sold and go up (or bang), no matter what they were taught in physics class. Decisions are made on the basis of the products, and any calculations rely on tables that have already been drawn up.

Here is the topic, unrelated to rocket science but still science:

Flow cytometry is an analytical method for cell counting.

It was invented by the famous Mr. Flowson in the past.

Today, it is used for blood cell count in laboratories and healthcare.

In our company, flow cytometry is used to count bacteria in raw milk.

Flow cytometry uses a laminar flow, characterised by the lack of any turbulence.

Turbulence mixes liquids; laminar flowing liquids travel next to each other and do not mix.

In the instrument, the outer flow layer is a sheath layer that prevents the cells from leaving the core flow.

The core flow carries the objects of interest through the flow cell.

To be counted, the cells are stained with a fluorescent dye in a clear solution first.

The dye-stained cells are lit up with a laser as they pass through the flow cell.

When illuminated by the laser at a shorter wavelength, the cells emit light at a longer wavelength.

An optical sensor measures the light, and cell count is calculated.
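For perspective, the counting step at the end boils down to very little: each stained cell passing the laser produces one fluorescence pulse, and the instrument counts pulses that rise above a noise threshold. A toy sketch (illustrative only, nothing like real instrument firmware):

```python
# Simplified illustration of the counting step: the sensor signal is a
# series of samples, and the cell count is the number of distinct pulses
# that rise above a noise threshold. (Toy example, not instrument code.)

def count_cells(signal, threshold):
    count = 0
    above = False
    for sample in signal:
        if sample > threshold and not above:
            count += 1          # rising edge: a new pulse begins
            above = True
        elif sample <= threshold:
            above = False       # pulse has ended
    return count

# Three fluorescence pulses in a noisy baseline
sensor_samples = [0.1, 0.2, 3.5, 4.1, 0.3, 0.1, 2.9, 0.2, 0.1, 5.0, 4.8, 0.2]
print(count_cells(sensor_samples, 1.0))  # 3
```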

The MC at the end of this asks,
"What characterises a laminar flow?" 
"What is the object of interest in the flow cell?"
"What does fluorescence mean?"
"How are cells counted?" 

Why does my learner need to know about this?

The first group must talk to clients about their cell counting analysis needs. These people will probably never operate one since they mostly talk to lab managers or people entirely outside of the laboratory.

They will probably not be asked detailed questions by the client on the method, as this is actually irrelevant for both. As long as the required results arrive, none of them really cares.

The second group needs to both operate and service the instrument once it has been purchased. These people are not scientists, chemists, bioanalysts, or lab technicians, so all this information is definitely not actionable.

I post this and go back to my drawing board to see what I can come up with.


Judy Nollet

Those groups should, at the very least, follow different paths in the course. Ideally, they'd just take different courses.

  • If the courses are separate, you can update one when needed and require that group to take it. The other group would be unaffected by the versioning. 

Put the first group in scenarios with questions that they might ask and that might be asked of them. Let them pull info/resources in the course that will help them. 

For the second group, demo/explain what they need to do. When possible, try to simulate small tasks, e.g., choosing a given setting. And be sure there are good job aids they can refer to when they're actually using the instrument. (Of course, ideally, their training also involves hands-on practice overseen by an expert.) 

Soren J Birch

Thanks, but that goes too far into the overall training curriculum. They will get their hands on it, watch videos of how it runs, and see results. But that comes later. This is just the beginning of the onboarding.

It exists because "You must know the history and underlying scientific method to learn anything about how to operate it, and who operates it." - stated by the responsible persons who don't want learners attending their classes without having some prior knowledge BEFORE they show up to the training.

I will need to think harder. 

Christy Tucker

When they say it's important to know the history, ask how someone's job performance will look different because they know that history. How will someone who knows the history be more effective at the skills needed than someone who doesn't know it?

Most of the time, when I ask that question, SMEs can't answer it. The history of something is rarely that important. So, the history part can probably be cut. If politically you can't cut it entirely, make it short so people don't have to sit through irrelevant explanations for long.

But overall, Judy's right--the only way to make this relevant and engaging is to split it by role. When what they need to do is so different, you can't really have the application practice without separating it.

Judy Nollet

I like Christy's suggestion to ask the SME how knowing the history will be used on the job. Maybe that will get the SME to rethink what's "needed" before the actual do-the-task training. 

I have no idea how microwaves work. But I can effectively use one to cook or reheat food and beverages. 😄

Best wishes, Soren. And good luck with the SME.