eLearning for teaching software

Hi

I currently use Articulate to design two types of courses - those based on customer service (on a freelance basis) and those based on teaching users to use software (my day job).

I find it relatively easy to design meaningful interactions and quizzes for the customer service type of eLearning, but I'm struggling a little with the software courses.

It's tricky to ask "test then tell" type questions when the aim is to teach practical skills. I do include Flash videos to show how to complete a task, but I think I could do more to introduce meaningful interaction.

My gut feeling is that getting users to complete a software simulation can be a little patronising and too easy. I try to avoid the whole "clicky clicky bling bling" approach to interaction; there's not much point in using it just for interaction's sake if the learner doesn't benefit from it.

I'd love to hear from others who have designed this sort of eLearning and have any ideas on how I can make the learning experience more interactive.

Thanks!

24 Replies
Zara Ogden

Personally, I'm a hands-on learner... I like to dig my fingers into the dirt and get messy. For this reason, I am a simulation lover.

The trick is to have solid scripting. If you simply say "click here, click there" and are very stiff about it, it can get boring. But what better way is there to teach someone than to let them use it? I could write a test on astrophysics and fluke a perfect score, but that doesn't mean I know anything about astrophysics.

We recently acquired (at my day job) a new onboard system (I work for a trucking company). This system is going to decrease the drivers' paperwork immensely and free up our dispatch team to deal with higher-priority issues. But the drivers need to learn how to use it.

By combining a mix of Engage, Quizmaker, and hyperlinking, I can create a simulation that will teach the theory and get the users comfortable enough to use the system. And hopefully keep them entertained.

My thought is that most of the time we dislike a type of training because, when we saw it, it was not executed correctly or used for the right topic. Even text-to-speech has a place in the blind and deaf communities.

I say challenge yourself to find a way to make it better for you and your team. We all fail, but it is what we learn from it that counts.

James Brown

Sam, I'm currently rewriting training for a software product. At the moment I do not plan to incorporate quizzes into the program. Instead, my training is designed to be used in conjunction with the software itself, i.e., the person could be watching the training module and then tab back to the program and follow along. Eventually I decided to come up with scenarios the end user could interact with in the training, which would show them how to perform certain tasks. Honestly, though, my type of software training is not so much about how to do the task as about where to find the information on how to do it if needed.

Another critical aspect of my redesign was to break the training up based on job description. Many people forget that software is not used by just one person. Software may be used by a wide variety of users, and based on that fact, there may be sections of the product that will be presented to one type of individual but would not be covered by others. For example, a manager may not want to know how to key in an account, but they certainly want to know where the reports are so they can get a general picture of the overall number of accounts added to the system.

I'm also not implementing quizzes for another reason: they honestly do not test the person's knowledge. With a psycho-motor skill, you have to have someone evaluate the learner to truly see whether the skill was acquired. Either they can perform the skill or not. With software, I have found that learning is based on repetition: do something enough times and you will get it.

Kristen Hull

I create eLearning sessions to teach software (specifically the differences in upgraded versions; we still deliver new-user training on-site). I use Captivate, though, and I've never thought that it had a patronizing feel to it. I haven't had that sort of feedback either. It really lets the users go through the steps, make mistakes, and then correct themselves. I'll walk them through the process step-by-step (with audio narration), and they click the screen to perform each step. Once that is complete, I'll have them do the entire process (using a similar example) without audio. If they mess up, there is a hint on the screen. If they don't mess up, yay. (Captivate lets you put hints before they click, but I don't use that... to me, that might be patronizing and too easy.)

I still incorporate quiz-type slides. These cover things like verbiage changes between versions or software-specific words/definitions.

I have tried to do this in Articulate at another company, and it was a lot more challenging; I really had to stretch my brain to figure out ways to get them to learn the software without a true software simulation. Captivate makes it so much easier and, hopefully, more effective. (But this is such a better community [they can really get crabby over there] that I still love to visit Articulateland!)

James Brown

BTW - honestly, anything you can do in Captivate you can do in Articulate. The published Articulate project is Flash, and Captivate output is also Flash. It's just that Captivate is really geared to a step-through approach. You still have to create your vector graphics, and if the person does not get the correct sequence, you may present them with an error. I have used Captivate in the past, and I must admit it's not bad for software training, but I'm also starting to see how to do the exact same thing in Articulate. Plus, I think the overall Articulate appearance has more eye-pleasing features than raw Captivate projects. Anyway, that's just my 2 cents' worth. I'll jump down from my soapbox now.

Zara Ogden

James Brown said:

BTW - honestly, anything you can do in Captivate you can do in Articulate. The published Articulate project is Flash, and Captivate output is also Flash. It's just that Captivate is really geared to a step-through approach. You still have to create your vector graphics, and if the person does not get the correct sequence, you may present them with an error. I have used Captivate in the past, and I must admit it's not bad for software training, but I'm also starting to see how to do the exact same thing in Articulate. Plus, I think the overall Articulate appearance has more eye-pleasing features than raw Captivate projects. Anyway, that's just my 2 cents' worth. I'll jump down from my soapbox now.


James, I know you like Articulate, but I always thought you were an Adobe-preferred user, especially with all your Flash experience.

See now, to me, Mr. Brown's nod on Articulate for simulations means something. 

It is just the best product out there...

Bob S

Sam,

Great question. I have two words for you...

"Scavenger Hunt"

Consider not doing all of your software training in the e-learning tool itself. Instead, show them the basics, the rationale, etc. via the course, then turn them loose on the real software (preferably in a dev environment). Have them do exercises and search for answers in the software. Then they can come back to the course to complete that phase of the scavenger hunt (i.e., answer questions).

This approach works for individual steps/tasks (e.g., navigating to a screen) or for an overall process in the software (e.g., onboarding a new customer).

Hope this helps,

Bob

James Brown

Zara Ogden said:


James, I know you like Articulate, but I always thought you were an Adobe-preferred user, especially with all your Flash experience.

See now, to me, Mr. Brown's nod on Articulate for simulations means something. 

It is just the best product out there...

Don't get me wrong. My preference is Flash, but for neophyte and novice users, Articulate has given them a way to produce eye-pleasing materials with very little effort. Personally, I wish I could export my Flash content directly to Articulate and use the Articulate player to create the menu structure and play the content. That's the one thing I think is pretty cool about Articulate. I think if there were an Articulate interface for Flash, it would make a pretty good product into a thing of beauty.

James Brown

Bob, your scavenger hunt is a nice idea, but some companies simply don't have the luxury of a training lab or dev environment, nor the time. These learners typically go through training and are baptized by fire, i.e., seeing and doing. For the most part, that is the best method for the transfer of psycho-motor skills. This is something the behaviorist B.F. Skinner did get right about some types of learning, but not all. I find that end users will sit through training, and it typically takes six to eight weeks before they are comfortable performing various tasks in the software; it generally takes a user approximately one year to gain a solid grasp of the product. I have seen this pattern repeated both with end users I have trained and with new software technicians who learn the product as part of their job.

Bob S

James,

Your points about time-to-mastery are well taken. However I do think that there are viable alternatives to having them tossed into the deep end of the pool.

Even without a dev environment, users can often be shown how to do a simple task such as navigating to a screen, given a job aid, and then sent out to execute that task and report back on something they saw there. Breaking complex processes down into simple steps like that is what we do for a living, right?

So for experienced users learning small changes, make it more complex. For new users, break it down into simple tasks.

My two cents... and change.

James Brown

I actually created software cheat sheets to supplement training. Each breaks down how to perform common tasks and basic window navigation, and I have a cheat sheet for each type of job. The software I train users on has approximately six to eight job sets, and with each job set come different goals; not everyone will use the same windows, but they do need to know where to find pertinent information. If I were to evaluate them on where to find a specific window or how to perform a certain task, it would only prove an irritant to them. If this were a trade school or college, I could see testing, but in the domain I am teaching, tests would not be helpful.

Heidi Payne

I'm amazed sometimes at this forum of mind readers! I have a new job creating software training modules at my company and have been fighting this same battle (mostly in my head). We have Captivate, which I love for software capture, but after downloading the trial version of Articulate, I think I need to ask for a new software budget. :0)

Thanks to you all for answering questions I've been trying to answer for the past two weeks. You've given me some great ideas to go forward with. James, do you create your software cheat sheets first and use them as a script? I like to give resources but don't have time to do twice the work.

James Brown

Hello Heidi. I actually use cheat sheets as a resource for after training. I'm currently building my training in PowerPoint, but later I will make an Articulate demo to showcase to my bosses, which should then trigger the authorization to purchase Articulate. I also supplement training with "how-to" documents, but end users also have an electronic help manual, short videos, and technical support via phone, email, or chat. Of course, you will not need to create e-learning courses for everything. The key is to identify tasks that are performed often and have a high impact on the software, and then focus your training on how you will convey that knowledge, i.e., e-learning, how-to PDFs, short video clips, etc. The rest of the low-impact tasks can then be covered via other means.

Hopefully that helps.

James

Sam Currie

WOW! There has certainly been some lively debate here and some great ideas. Thanks to all of you for sharing your thoughts and experiences. I also create cheat sheets and stand-alone Flash videos for step-by-step instruction, all of which are stored on our website. I'll be pointing students to these using the Resources tab so I can get some blended learning going on.

I think I will get over my reservations and build some software simulations using Captivate, so users can practice for themselves.

Thanks again - you chaps rock!!

Margharita Nehme

Hi All,

Could anyone share an example of software training they've developed using storyline?

Here is my lesson structure:

  • Title slide: includes the lesson name
  • Lesson objectives: tasks learners will be able to carry out
  • Simulation of the task being completed
  • Scenario-based task where learners have to try it out themselves, with hint prompts
  • A link to a printable step-action PDF (as per the client's request)

I'm looking for ways to make it creative, visually appealing, easy to navigate, etc.

Thanks in advance for sharing =)

James Brown

Hi Margharita,

I have been a software trainer for many years, and I have used PowerPoint as a method to do an overview of a product, but the best method I have found is to create simulations and hands-on exercises. While attending Boise State, I used Flash to create this little example.

* Example

If you mouse over the rolling circles, you will get instructions on what you need to do, i.e., type in three initials and click OK. Most tutorials I have seen were built using Adobe Captivate, but I used Flash for my example. One of the most effective methods I have seen is the use of a software emulator. Pearson has developed these emulators for MS Word, Excel, PowerPoint, and Access and packaged them into a CMS called MyITLab. They also published a book outlining the steps for each learning module, and each lesson is incorporated into a final learning project where the student is required to accomplish certain tasks.

Another idea is to develop the tutorial with Articulate but then use the tutorial in conjunction with the software. That way, the user must follow along with the tutorial and perform the tasks in the product. Unless you are going to have the student submit an assignment, there isn't going to be an easy way to gauge whether the learner has acquired the knowledge you wished them to acquire.

El Burgaluva

Hi, Sam

I don't think software simulations are inherently patronising or too easy; but I agree that many are. Like a lot of things, I guess it depends on how they're done.

One big problem with software simulations, as far as I'm concerned, is this model:

  • First, let me tell you how to do stuff -- usually without telling you why or without any clearly defined goal
  • Now I'll step you through the process, prompting you re: what to do, but not really requiring that you use your brain
  • Finally, I'll test you on some arcane knowledge-based point and decree that passing my test -- having done the first two parts, of course -- indicates you can use the software effectively
  • Pat myself on the back for a job well done.


***groan***

Instead, you might try something like this:

* First, you get a snappy overview of what you can achieve/do with this software

That is, allow me to outline the benefits to you of using this piece of software (e.g. the problems it solves compared to the "old" software, the increased efficiency due to XYZ, the lower error-rate as a consequence of ABC-feature, etc.)

* Next, you're immediately given a job: in this lesson you're going to help me complete some tasks, e.g., generate an invoice, repair a database, correctly format a template, etc., and... there's some kind of associated "workbook" task accompanying this first stage (which will be helpful for the test *hint*hint*).

You might, for example, have to complete a process chart (or some other kind of diagram) as you do the walk-through stage. Or maybe you have to circle the correct option from sets of interface images (e.g. icons, drop-downs, etc.) for completing certain tasks. And so on.

* Cue the first task... you get walked through the steps, but wherever it's reasonable to expect that you (as a novice, or even completely new, user) will be able to fill in the data or make the correct selection, you'll be asked to do so.

This might, for example, be to choose the best option from a drop-down menu -- forcing you to read the options and think about them. Or complete a set of steps demonstrated earlier. Or consult an accompanying document, look up some data and enter it in the correct field. [Endless possibilities. The key lies in the "reasonable expectation" of your being able to do it instead of just being talked at for 15 minutes and then being prompted to repeat the steps the narrator droned on about... and/or doing some hoop-jumping quiz!]

Naturally, if you get something wrong, you'll receive contextual feedback to help you get back on track (ideally without simply being given the answer!). This feedback will be of a "general" nature if you make a "general" mistake and specifically tailored if you make an "expected" mistake, i.e. one that the SME(s) flagged as a common neophyte error. For example, choosing XYZ rather than ABC. Here, the tailored feedback will have been "planted" on [Selection XYZ] in anticipation of this common error.

Note: Don't go too wild with getting the learners to do stuff at this stage! This is, primarily, the "input" stage. So while it's good to involve the learners, get them to think critically about what they're doing, and make decisions which demonstrate they're following the concepts being presented... we have to bear in mind that this isn't a "practice" or "formal test" stage.

Okay, so after you've negotiated a simple step-through that (hopefully!) engaged you enough to prevent your dozing off...

... you now get dropped into a situation/challenge that requires a "real" (read: simulated but authentic) outcome. For example:

  • Siobhan, your boss, is responsible for [doing some task which relies on (name of software that's the focus of this training course!)].
  • Unfortunately, she's been pulled away by an important and impromptu meeting.
  • Which means it's down to you to save the day!

[Show Siobhan coming in: "Look, I'm really sorry to do this to you... I know you're fairly new to using [name of software], but I'm confident you'll be able to handle this. Here's [e.g. a printout of profit margins across each sector of the organisation in the last quarter]. I need you to [do whatever the software does with this type of data] by this afternoon. If you've got any questions, you can ask Craig. He's been using [name of software] for a while now and has a pretty good handle on things. I would've asked him to do it, but he's working on something else, and besides, I think you're ready to tackle this one."]

 
You, the learner, then have to "navigate" through the software entering data, selecting drop-downs, clicking on buttons, etc. in order to complete the stated goal (which could be broken down into logically sequenced "chunks").

You may need to obtain the data from an externally linked PDF file. Lots of areas are clickable, to discourage you from just waving your cursor around until you find a link or hotspot area. And when you click any of these hotspots, all kinds of consequences occur before your very eyes (i.e., as if you were using the program!) -- including, possibly, the whole thing being destroyed and your having to start again!

You can "summon" Craig via phone or in a "chat" window (or whatever is appropriate to the context established) when you need help. He may or may not always be available. [You could also be linked to the User Manual for such occasions -- or for instances where you prefer to look it up first and only contact Craig if you get stuck, perhaps.]

 If you can successfully navigate this simulation scenario, both you and I can be pretty confident in your ability to do so in a "live" situation as well.

You may or may not have to pass through some kind of "quiz" or assessment task -- depending on the bureaucracy in your organisation or any number of other factors. Without being automatically cynical of "post-tests", it might well be appropriate to quiz you on certain terminology or concepts, for example. In most cases, though, I don't think it will be necessary, because the goal is (likely to be) to teach you how to use the software -- not how to talk about it.

 * Repeat this process for any other tasks you need to be able to do in this particular software application.

--------------

The rationale/breakdown for this approach is roughly as follows:

  • Why you should pay attention and invest time and effort into this lesson (Not in a top-down "Do it coz I said" way, but in an attempt to get learner buy-in).
  • What you're in for/What you'll be doing (Basically, making the "Can-Do Statements" clear without mentioning that term, or formally delineating "Goals" or banging on about "Learning Objectives" or any other such "trainer language" that tends to wash over learners and/or send them spiralling into a pre-course stupor) 
  • Some kind of additional "focus task" (e.g. the workbook) that will double as a "cheat sheet" in the second part -- and, likely, a "training wheels" type job aid for the first little while
  • Non-passive walk-through/input stage. [NOTE: Not testing anything you wouldn't be expected to know or be able to work out -- which is too difficult and, therefore, frustrating and discouraging. Instead, it's just enough to keep you on your toes (read: awake) and give you a little confidence boost: "Hey, even as a new user I can work this thing out!"]
  • Immersive "practice/test" which learners can relate to the real world. It's not some abstract task that they're doing just because they're being told to. Of course, we know (and in their heart of hearts they know!) it's pretty darn close, but setting it up in this way creates enough of a "realistic fabrication" onto which a bit of harmless self-delusion can be attached. Just enough to make the "test" come across as less confronting or onerous or dry or pointless. Also, it's not strictly a "test"; help is available -- just like real life.
  • If you reach the end point -- no matter how many false starts you made, things you cocked up, rabbit holes you went down, etc. -- you've demonstrated that you're able to do it (more or less) on your own. (And if you need to go back to the first part again, you always can.)

--------------

Hope that helps!

All the best with it,
Leslie

Holly MacDonald

This is a great thread - so good to read about people taking something that is usually fairly boring and making it interesting. I have a couple of thoughts that might add something.

1. Create persona-driven training (while the s/w is generic, the users are not) - this lends itself nicely to scenarios. One of my clients is doing this (although their training is classroom-based), and they are using audio clips in class to simulate the process trigger that would occur in real life.

2. Define the tasks and the level of proficiency you are going for. Too often, the goal of s/w training is to show you everything it can do and *hope* that you remember (or give you the user manual to refer to). It isn't realistic to expect all beginners to progress straight to full competence after training. Also, designing around tasks, not features/menus/screens, is much more useful.

3. If you have the ability, cover the basics in your intro tutorial and then produce short follow-up modules that are emailed (yes, emailed; well, a link to each, at any rate) to learners in a regular sequence: "Now that you've had a chance to do X in the s/w, here's a link to a short tutorial that will help you with Y." Treat learning as a process, not an event.

In terms of the eLearning aspect, I think the "get a hint" option is a good one, and the ability to contextualize feedback is really helpful. The sales orientation course in the Storyline showcase includes both of these quite nicely.

http://articulate.demos.s3.amazonaws.com/sales_orientation/story.html

I think you have some great ideas and suggestions to use. I for one am going to save this thread for future reference!

Holly