E-learning Evaluation

Mar 27, 2014

Hi guys,

I'm currently tasked with producing an evaluation form for learners in our pilot group to complete. I'm not entirely certain what questions should be included and was wondering if you had any suggestions.

As you might have guessed, the purpose of the evaluation form is to capture the user experience and address any issues before the module goes live.

All help is much appreciated

Harri

Edie Egwuonwu

I'd want to know the usual things like:

  1. Did they like the experience?
  2. Did they learn from the experience?
  3. Did they see how what they learned related to their job?

I'd then want to know some usability information such as:

  1. Pace - too fast, too slow, or just right?
  2. Usability - intuitive or confusing? If confusing, what was confusing (page-level feedback)? Did things work as intended?
  3. Engagement - too much interaction, not enough, or just right?
  4. Clarity - were they clear on what they were supposed to do? 
  5. Seat time.
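
A rough sketch of how a checklist like Edie's could be captured as structured survey questions and exported for whatever tool you use (the question wording, categories, scales, and file name below are hypothetical examples, not a prescribed format):

```python
# Minimal sketch: defining pilot evaluation questions as data and exporting
# them to CSV. Question text, categories, and scales are hypothetical examples.
import csv

questions = [
    {"category": "Reaction",  "text": "Did you enjoy the experience?",                  "scale": "1-5"},
    {"category": "Learning",  "text": "Did you learn something new?",                   "scale": "1-5"},
    {"category": "Relevance", "text": "Does the content relate to your job?",           "scale": "1-5"},
    {"category": "Pace",      "text": "Was the pace too fast, too slow, or just right?", "scale": "choice"},
    {"category": "Usability", "text": "Which pages were confusing, if any?",            "scale": "free text"},
    {"category": "Seat time", "text": "How long did the module take you?",              "scale": "minutes"},
]

# Write the question list to a CSV that could be imported into a survey tool.
with open("pilot_evaluation_questions.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["category", "text", "scale"])
    writer.writeheader()
    writer.writerows(questions)
```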

Nicole Legault

Hi Harri, 

Here are a few past forum discussions about e-learning pilots that might be useful:

On a personal note, when I've done pilots in the past, the things I really focused on were 1) navigation (where they got hung up, didn't know where to click, what to do, etc.) and 2) content (did they understand it? does it relate to their job? is it meaningful?). I provided ways for the testers to document their findings or feedback. I was also once able to record my testers (with a video camera and a screen recording tool) as they completed the course, which gave me additional insights into where they got stuck, how long certain parts took, and where they hesitated or clicked in the wrong places. I got a lot of good information from doing this, and then adjusted my course accordingly. Good luck!

Joshua Roberts

Harri C said:

Hi Everyone,

Thank you very much for all the responses, I'll bear them in mind when designing my evaluation form.

Also... 

Do you ask your pilot group to fill in the evaluation reflecting on the whole module or section by section? 

Thanks

Harri


I think this depends on the length of the module, to be honest. If it is a long package or course, then I would split it into manageable chunks. At the end of the day you want feedback that is constructive, and asking them to try to remember 60 minutes of a course at once may be a struggle.

Laura M

One of the things I find helpful is making sure your pilot group has access to the evaluation questions in advance, so they are aware of what they should be looking for. I know I've received surveys after the fact and wished I could go back and double-check something. We used Survey Monkey and sent out the evaluation link with the invitation to the pilot.

Also, depending on the complexity and scope of what you're doing, you may try pilot groups with different focuses. We had a broad, global group pilot a course for a wide variety of things (content, navigation, design, etc.), and then a couple of smaller groups let us know about more specific things (how was the pace and narration for someone who speaks English as a second language? As a newer employee, how would you rate your understanding of the content before and after?).

Kimberly Read

Harri, for pilots I like to use what I call "confidence surveys." I base the questions on the learning objectives of the course. For example, if the course topic is How to Change a Tire, I ask learners to rate their confidence (on a scale of 1-5) in their ability to use a jack, tighten lug nuts, etc. I provide learners the same confidence survey before they take the pilot course AND after they complete the course. This way you can measure the learning gain from having completed the course. For example, if a learner rated their confidence with lug nuts as a "1" before the course and a "5" after, you can attribute the gain directly to having taken the course. Good luck!
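
If it helps to see the arithmetic, here is a minimal sketch of tallying a pre/post confidence survey like the one described above (the objectives, ratings, and averaging approach are hypothetical placeholders, not anyone's actual data or tool):

```python
# Minimal sketch: comparing average pre- and post-course confidence ratings
# per learning objective. All objective names and ratings are hypothetical.
from statistics import mean

# Each learner rates every objective 1-5 before and after the pilot course.
pre_ratings = {
    "Use a jack safely": [1, 2, 2],
    "Tighten lug nuts to spec": [1, 1, 2],
}
post_ratings = {
    "Use a jack safely": [4, 5, 4],
    "Tighten lug nuts to spec": [5, 4, 5],
}

for objective, pre in pre_ratings.items():
    post = post_ratings[objective]
    gain = mean(post) - mean(pre)
    print(f"{objective}: {mean(pre):.1f} -> {mean(post):.1f} (gain {gain:+.1f})")
```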
