Piloting an E-learning course

Feb 05, 2014

Hi everybody! I have very little experience with e-learning, and I've been asked to write the piloting guidelines for an e-learning course. Can you give me some suggestions on how to do this? Thank you very much!


5 Replies
Nicole Legault

Hi Luca!

Thank you for joining the community, welcome! And thank you for posting your question, I think this is a great topic...

I think a good pilot program should provide you the ability to test everything you want to do for your e-learning project (development, publishing, uploading to LMS, testing the final output, etc) on a small scale to identify the value and reveal any problems, before you spend a significant amount of time and money on a large-scale project. 

Typically the pilot starts with a proposal that identifies the objectives of the pilot. It should also document how the pilot will be carried out. Who will be involved in the pilot process? What software, hardware, and other resources are required to carry out the pilot from start to finish (this would be a list of project requirements)? How will participants document the results of the pilot? What are the timelines and constraints for the pilot? You want the pilot program to be as representative as possible of your real, final large-scale project. The pilot proposal or plan should also identify metrics for how success will be determined.

Another aspect of your pilot project might be to have a user base, representative of your real learners, complete the e-learning courses from start to finish and identify whether they find the materials useful and whether they can apply the knowledge afterward; in short, whether the e-learning is actually successful with the learners. If your learners find the navigation hard to understand and sometimes don't know where to click in your pilot e-learning course, you know you'll need to make those adjustments for your large-scale project. That's an example of the kind of thing you might identify.

Hope this helps get you going in the right direction! I have a blog post coming out tomorrow about 4 reasons you need to test your e-learning course, and you may be able to glean some additional insights from that post, so stay tuned!

Bruce Graham

Hi Luca, and welcome to the forums.

I would try to pilot the course with as many "real" users as possible.

Try to get them to represent as wide a set of criteria from that group as possible (company newbies, age, experience, grade, etc.).

Ask appropriate questions that are meaningful to what you are trying to achieve, such as:

1) What difficulties did you have navigating through the e-Learning program?

2) What parts of this e-Learning program do you think worked well in terms of clearly teaching the information?

3) What parts were confusing or difficult to understand?

and so on. 

Make sure that you have a way of deciding what feedback you will follow up on (and how...), and what becomes just "information".

I am sure others will come in with more information, but I hope that gets you started.

Once again, a warm welcome to the Heroes forum, and ask as much as you need to.

PS - beaten to it by Nicole

Laura M

I would love to hear more about creating piloting guidelines, if others have done so. I have been working on a corporate-level project (I'm usually just working on smaller, site-level courses), and I had to push to make sure we could pilot the course before launching. They were in a rush to create and launch it, but it will eventually be used for every new hire globally, and I'm not comfortable just laying it all out there without some very real feedback!

We are actually in the piloting phase right now. Last week we assigned the course through our LMS to a variety of upper management throughout the organization, and I had a small sample from my own site test it (to get the variety of positions we have locally). We made sure to recruit people with a wide range of experience with the company so we could see how those with a stronger base knowledge felt versus those who were newer to our company.

We created a survey that was sent out with the invitation email and asked that everyone view the survey prior to taking the course, so they knew what we were looking for feedback on. The survey has a lot of open space after each question for write-in suggestions. We've gotten about 20% of the surveys back so far and are generally pleased with what we've heard. It's been great to see most users actually writing in their thoughts and suggestions, instead of just clicking a multiple-choice response. I do think it will be a bit tough to go back and decide which of the feedback suggestions are worth implementing, which are nice-to-haves, and which are simply one person's preference. I've worked really closely with the SME for months, so it was nerve-wracking to put ourselves out there, but also sort of a relief to know we will finally be receiving input from others! The fact that no one has called me up screaming makes me feel like we did something right.
