Video vs. "full" elearning for software training

Nov 09, 2011

Hello again

As I'm doing more research for turning Packt into an online training provider, I'm finding that what appears to be the most successful online software training focuses exclusively on video.

Lynda.com is the biggest example, but niche players like Drupalize.me follow a similar approach: each course is a set of videos that you can watch in order or dip in and out of. Mainly it's screencasts with a few face-to-camera lectures. And of course until recently Khan Academy was nothing but a big collection of videos.

Courses from Lynda.com don't include quizzes, branching, or any interactive elements. You just watch and learn.

For software training where a lot of the learning is watching procedures and following them yourself, is this the best approach to take? Do all the other elements of "full" elearning add a lot of value, considering the extra costs of creating them?

At the moment I'm inclined to say that online training for software really means "screencasts" and not a lot else. What do you think?

Thanks!

Dave

8 Replies
Ryan Martin

I agree with you. I've used video to learn coding, Photoshop techniques, some real nerdy stuff. I believe it comes down to the audience and the environment. The audience wants to get stuff done (usually they're trying to solve a piece of a larger problem), so time is important. As for the environment, they're usually copying what they see in the video, following along, which is the best way to learn: start building and tweaking.

A quiz would be meaningless, really, because in the end the code either works or it doesn't. If Lynda.com began requiring quizzes or branching/interactive elements, I could see it turning off a lot of developers and programmers.

David Steffek

I agree. I'm using Lynda right now to learn a new software package. I jump straight to the video that covers the task I need and then go try it out for myself. I don't need to worry about getting an assessment correct. If it doesn't work, I watch the video again, pausing as I go.

From an adult-learning perspective, I don't think it gets more "on demand" and "just in time" than that.

A lot of my job consists of developing systems training, and for my current project I'm thinking of trying out a structure similar to Lynda's. But, per our corporate standards, I will be including assessments. Baby steps, I suppose. A journey of a thousand miles begins with a single step!

Steve Flowers

Done well, videos are awesome for learning technical skills and tool piloting, particularly when the expertise of the presenter is apparent. My favorites include the videos plus something I can quickly scan or reference (a tutorial article) as a companion to the video. I prefer these to spoon-fed simulations and forced chunking. I can practice and assess my own performance, thank you very much. Let me sponge at my own speed.

Kristen Hull

I create online sessions which teach our clients how to use the latest version of my company’s software.  The clients have been using the old version of the software, so they already know some basics.  Anyway, I use Captivate and make the session interactive (you gotta click the right spot to continue).  I also include narrator-free practices following the content.  (For instance… using these criteria, enter a new account.  There will be hints if you get stuck).  Presenting the clients with a screencast and then having them try it out on their own wouldn’t work for us.  First of all, I don’t think they’d be motivated to complete the second part.  Secondly, our application links to a client database, and we don’t want them to mess that up with fake examples.  I can’t really give them a fake online database…well I guess I could, but it would be very hard to create exercises that would work over and over without refreshing the database all the time.

From my in-person training, I think people (especially non-techy types) like having scenarios when they learn/practice software.  If I train people how to build a report, and tell them to go practice what they learned, their minds tend to go blank and they can’t think of what to try.  Or they forget to practice some key features.  By creating an exercise or quiz, I can make sure they practice what we’ve covered.

All that being said, lately, I have been thinking about restructuring each session, providing three options in a menu: watch (screencast), click along, and practice.  They can do all three or any combination of them.    

Steve Flowers

Hi Kristen,

I think that's a really good idea. Providing options for different audience types will really expand acceptance and might even improve your results. I will half-heartedly participate in hand-holding sessions, at best. But if you give me just what I need, just when I need it, I'll be hooked.

I've built things both ways - most of the time the audience really isn't much like me - they don't get grumpy about spoon-feeding or hand-holding.

I'll agree with your database point. This is one instance where it's easier to provide practice in a simulation than to risk someone mucking up the real application.

David Barnes

Thanks for your thoughts.

What I'm getting out of this:

- For just in time and on demand software tutorials, video is great. Hop in, take what you need.

- For some activities you need a simulation or practice environment. If you're learning to change the font in a Word document, you can just watch a video and then try it for yourself. If you're learning to update 10,000 Active Directory user profiles, you might want some confirmation that you understood before you try it.

- Written tutorials work well if you know most of the process already. You can quickly scan them just for the parts you don't know.

Of course other kinds of content are good too. The question is about trade-offs: if you have a choice between creating a reinforcement activity for one procedure or creating a video that shares a different useful procedure, but don't have the budget for both, which are users likely to value more?

Steve Flowers

I think this really depends on the sophistication of your audience. Novices appreciate and benefit more from rudiments and conceptual orientation than more advanced users do. So for basic courses, offering more Show Me > Let Me Try pairings would probably be appreciated. For more advanced users, less so, and when included, definitely strategically. YMMV and every situation is a little different, but I'd make the audience profile the primary driver, followed by task complexity and ambiguity.

I wouldn't count out more abstract methods either. Sometimes building a mental model using block-level breakdowns that tie to examples is a strong way to help users frame up the software landscape.
