Forum Discussion
How do you evaluate the flow of a course?
Hi everyone,
I’ve been thinking a lot about flow in Storyline and Rise courses.
Not the visuals or interactions, but the way ideas move, build and connect for the learner.
When I review courses, flow is often the first thing I look at, because so many issues trace back to it. A course can be beautifully built, but if the flow is unclear, the learner has to work harder than they should.
Here are a few questions I often ask myself:
- Does each screen naturally lead to the next?
- Does the learner know why they’re seeing this information now?
- Is anything arriving too early, too late or without enough context?
- Are we building on what the learner already knows, or jumping around?
- Is there a moment where the pace suddenly gets heavier?
These small checks often reveal more than a long checklist ever could.
I’d love to hear how others approach this.
When you evaluate the flow of a course, what do you look for?
Are there signals or questions you rely on to check whether the experience “moves” the way you hoped?
Always curious to learn from different perspectives.
4 Replies
- CydWalker_mwh (Community Member)
An option could be to pilot it with user tester(s) and have them give you feedback on the flow.
- smous (Community Member)
100% agree, CydWalker_mwh.
Leveraging beta-testers (especially early on) to evaluate the overall flow is a great strategy!
Any specific instructions you share with your testers, or things in particular that you ask them to look for?
- CydWalker_mwh (Community Member)
smous, I think about what I want to know and tell the testers what feedback I'm looking for, in addition to whatever else they come up with on their own.
Examples:
- Does the flow seem logical, without jumping around between topics?
- Is there enough context to understand what is going on?
- Is it easy to navigate?
- Is any content too heavy (too much on one slide, etc.)?
- Are the interaction instructions clear?
- Is anything confusing?
- smous (Community Member)
I love these questions!
One challenge I keep running into is that teams often ask different questions, without being able to anchor them to a shared definition of what good clarity actually looks like (including flow).
I’m currently exploring a sort of recipe for clarity: a few consistent signals that help reviewers and testers align on what they’re seeing, rather than reacting purely on gut feel.
Flow is one of the clearest places those signals show up. Beta testing seems to work best when it’s anchored to that shared baseline.