Participant Evaluations of e-Learning Modules

May 10, 2017

Hi all, I'm investigating participant evaluations of e-Learning and would greatly appreciate it if you could give me your thoughts on the following.

In what situations do you ask participants to provide an evaluation of an eLearning experience?

What are some approaches to evaluation that you have used and would recommend?

Any useful resources out there?

Many thanks

19 Replies
Jackie Van Nice

Hi Scott!

Though I haven't administered evaluations from my end, I can tell you I've had clients who tended to request that employees fill out evaluations as they're going through newly-launched training programs.

It seems to me it would be smart to get that sort of feedback before spending a lot of money to revamp older training too, but maybe that's just me. :)

eLearning Development

Hey Scott;

I have never had much luck getting value out of evals.  I find it rare that people spend the time to fill them out thoughtfully.  I do find free text to be more valuable than numeric scores.  It still may be a small percentage who provide actionable feedback but at least you can use the information you get.  I find the numbers are good for ego only.

Tim

Ben Sewell

We typically only get feedback from SMEs and instructors, so feedback from the end learners is usually third-hand.

If there is a slide we are particularly proud of, or have tried something new with, we will ask for specific feedback on it.

It's generally quite tough to get this feedback. I suppose the ideal scenario would be a pretest (prior knowledge), observation (pain points while using the course), an evaluation survey (how do you feel about it?), and a post-test (knowledge improvement).

Ashley Chiasson

I too have had clients request student feedback for newly launched programs. At the university, we have also solicited feedback in the form of student surveys and focus groups, and have done this with existing programs (to evaluate efficacy and need for revision) and with newly launched programs or pilot projects.

Surveys tend to have a low response rate, even when coupled with incentives. The focus groups have been our primary success when it comes to receiving effective feedback from students or participants.

Mark B.

I agree with the others that it can be especially valuable when launching a new program, so that you can identify any areas that may require improvement. However, rather than attaching it to the elearning itself for all participants to complete after launch, I prefer the pre-launch focus group approach. If time permits, bringing a diverse group of participants together to 'preview' the course and provide feedback is a great way to find weaknesses before officially going live to everyone. I find the feedback from this approach much more valuable: those selected often feel somewhat special and are willing to provide constructive, useful comments, whereas with a general feedback survey after the course goes live, most people don't care to provide anything meaningful once they've received their grade.

If you're in a time-crunch and can't do this pre-launch, then you could still reach out to a select group of people and either provide a survey or ask them to take notes on their experience and then get together with them to discuss in detail. When possible, getting people in a room (physical or virtual) and having them talk about their experience will provide much more value than just them filling out a survey on their own.

Beth McGoldrick

We have a 23-course eLearning program with evaluation. I use evaluation to judge whether learners are able to use what they learned: my questions lead them to tell me, based on the course they just took, how they are applying it and what challenges they will have implementing it (à la Kirkpatrick's updated Level 1 survey). I then use this information to determine whether something is missing from the course(s), how we can improve them, and how we can further support our learners. I can also put learners in contact with someone who can further help them.

Susan Wolf

I try to incorporate evaluation into every learning experience I create. A good resource is Will Thalheimer's Performance-Focused Smile Sheets (http://smilesheets.com/). Since implementing his ideas, I've learned a lot about our learners' experiences with our courses. I also think his method allows learners to reflect on their learning, which is key.

Courtney Peeler

Usually, we only use evaluations when we assign a new learning module to a pilot group before it's sent out to the masses. This allows us to make any content adjustments before the larger group takes it. We provide the pilot group a Survey Monkey link at the end of the module, with numeric and open-text fields for feedback.

Nicole Rye

For elearning courses, it's limited to the course owners and a handful of course reviewers before we publish/distribute the course. All of our current catalogue is mandatory learning for internal staff, so the thinking has been that happy sheets aren't very relevant: there generally isn't much we can do with the feedback, given the content and scarce developer time.

Louisa Fricker

I ask for detailed evaluations during pilot testing; I'll take the testers' feedback and make revisions to the elearning module as many times as needed to get it right. But after that, in most cases, the module is considered finished and I wouldn't ask for further feedback. I won't ask people to do an evaluation unless I know I'll act on what they tell me. (Of course, after a certain amount of time, you have to re-evaluate the module so it can be updated, but that's a separate process.)

Dave Neuweiler

A simple -- yet surprisingly effective -- method of course evaluation is to simply ask, "How'd today go? Write for about two minutes."

This is effective because you're not asking the learner about what's important to YOU. You're asking the learner to express whatever stood out -- whether it impressed, pleased, or aggravated them. It makes them think.

Asking them to write for only two minutes captures first (and usually best) impressions and keeps the focus on the most important aspects of the feedback.

Works for any kind of training.
Sam Sternman

Hi Scott,

Evaluations can be a tricky business. I typically don't add them to the LMSs that I support unless the client makes a request. For both LMS applications I use, there are built-in evaluation tools: one is like a survey, the other results in badging and a more millennial-friendly approach.

In both cases, if the client asks for an eval, it's been made mandatory, meaning the entire course record in the LMS can't be marked as "complete" unless the eval is completed. And I add a note on the last course screen to let users know that this requirement exists.
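For anyone curious how that kind of gating is typically wired up, here's a minimal sketch. The LMSSetValue/LMSCommit calls and the window.API discovery walk are standard SCORM 1.2; the Scorm12API interface, findAPI helper, and evalCompleted flag are illustrative assumptions on my part, not any particular LMS's actual setup:

```typescript
// Minimal sketch: gate SCORM 1.2 completion on the evaluation.

interface Scorm12API {
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: ""): string;
}

// Standard SCORM 1.2 discovery: walk up the parent frames looking for
// the API object the LMS exposes as window.API.
function findAPI(win: Window): Scorm12API | null {
  let w = win;
  for (let i = 0; i < 10; i++) {
    const api = (w as any).API as Scorm12API | undefined;
    if (api) return api;
    if (w.parent === w) break;
    w = w.parent;
  }
  return null;
}

// Report "completed" only after the learner submits the evaluation;
// otherwise leave the record "incomplete" so the LMS keeps the course
// open. Assumes the course has already called LMSInitialize("").
function reportStatus(evalCompleted: boolean): void {
  const api = findAPI(window);
  if (!api) return; // running outside an LMS, e.g. local preview

  api.LMSSetValue(
    "cmi.core.lesson_status",
    evalCompleted ? "completed" : "incomplete"
  );
  api.LMSCommit("");
}
```

The key design point is that the course never sends "completed" on the last slide; it waits for the eval submission, so the LMS record stays open until then.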

I'm wondering if you're really looking at a debrief...I would think that this might be a cool use of Articulate Review (assuming you have 360) where you could invite a panel of course participants (kinda like what Louisa mentions above) to give inline feedback on the course screens/content.

Lastly, I know that there are some LMSs that allow you to see responses for inline tests (SCORM). If you have that option, it might be worth exploring, particularly if the responses can be aggregated. One of my clients uses HealthStream (an LMS proprietary to hospitals/clinics) that has a Test Analysis report where an assessment or graded survey can produce an aggregated report of first responses; it's great for debriefs or gap analysis work post-publication.
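To give a flavor of what a "first responses" aggregation does, here's a generic sketch. This is not HealthStream's actual report or API; the ResponseRow shape is my assumption about how an LMS might export interaction data:

```typescript
// Generic sketch: keep each learner's first attempt per question, then
// count how many learners gave each answer to each question.

interface ResponseRow {
  learner: string;
  question: string;
  answer: string;
  attempt: number; // 1 = first attempt
}

function aggregateFirstResponses(
  rows: ResponseRow[]
): Map<string, Map<string, number>> {
  // Pick out each learner's earliest attempt for every question.
  const firsts = new Map<string, ResponseRow>();
  for (const row of rows) {
    const key = `${row.learner}|${row.question}`;
    const seen = firsts.get(key);
    if (!seen || row.attempt < seen.attempt) firsts.set(key, row);
  }

  // Tally answers per question across those first attempts.
  const counts = new Map<string, Map<string, number>>();
  for (const row of firsts.values()) {
    const perQuestion = counts.get(row.question) ?? new Map<string, number>();
    perQuestion.set(row.answer, (perQuestion.get(row.answer) ?? 0) + 1);
    counts.set(row.question, perQuestion);
  }
  return counts;
}
```

First attempts matter here because retries after feedback inflate the "correct" counts and hide where the course itself is unclear.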

Cheers,
Sam

Melanie Sobie

Our learning management system includes a default Level 1 evaluation that we use with all eLearning courses as an optional post-training evaluation. It has one question where they rate the training on a Likert scale, plus an open comments section. We've found the comments section very helpful.

For a set of eLearning courses that are approved for continuing-education credits, I use a required custom evaluation with nine Likert-scale questions.
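If anyone wants to do more than eyeball those ratings, here's a rough sketch of summarizing Likert responses per question; the input shape is just an assumption for illustration, not our LMS's export format:

```typescript
// Sketch: turn per-question Likert ratings into a distribution and a mean.

interface LikertResponse {
  question: string; // e.g. "Q1" ... "Q9"
  rating: number;   // 1 (lowest) .. 5 (highest)
}

interface QuestionSummary {
  mean: number;
  counts: number[]; // counts[0] = ratings of 1, ..., counts[4] = ratings of 5
}

function summarize(responses: LikertResponse[]): Map<string, QuestionSummary> {
  const summaries = new Map<string, QuestionSummary>();

  // Tally the rating distribution for each question.
  for (const r of responses) {
    const s = summaries.get(r.question) ?? { mean: 0, counts: [0, 0, 0, 0, 0] };
    s.counts[r.rating - 1] += 1;
    summaries.set(r.question, s);
  }

  // Turn each tally into a mean score.
  for (const s of summaries.values()) {
    const total = s.counts.reduce((a, b) => a + b, 0);
    const weighted = s.counts.reduce((acc, n, i) => acc + n * (i + 1), 0);
    s.mean = total > 0 ? weighted / total : 0;
  }
  return summaries;
}
```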
