How to get learners to fill out post-course evaluations?

Jan 20, 2015

As training designers we all know that post-course evals are an important part of the training evaluation process. Community member Jonathon Miller posted a great comment on an article I wrote (Post-Course Evaluations: What E-Learning Designers Need to Know) that essentially said the following:

"The problem I have with assessments is that when I can get stakeholders and trainers to agree to them, a number of attendees filling out the assessments don't put much thought into them."

I'd love feedback from the community on this... What can we do to encourage participants to take time to properly fill out post-course questionnaires? Any steps we can take as IDs to make sure we get the quality feedback we need to improve?

13 Replies
Jackson Hamner

I've been thinking about embedding the evaluation questions directly into the course and sending the data to a Google Forms document with JavaScript. The thought being that learners might be more willing to fill out answers if they don't have to leave the course to do so.

I haven't tested this out yet, but it would be interesting to see if we get more answers from users who don't have to navigate to a third-party site (SurveyMonkey, etc.) to fill in their answers.
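Roughly what I'm picturing (completely untested - the form URL and the entry.NNNNNNN field names below are placeholders you'd pull from your own live Google Form):

```javascript
// Untested sketch: post an in-course evaluation straight to a Google Form.
// YOUR_FORM_ID and the entry.NNNNNNN names are placeholders - grab the real
// ones by inspecting the live form's HTML.
var GOOGLE_FORM_URL =
  "https://docs.google.com/forms/d/e/YOUR_FORM_ID/formResponse";

function submitEvaluation(rating, comments) {
  var data = new FormData();
  data.append("entry.1111111", rating);   // placeholder ID for the rating question
  data.append("entry.2222222", comments); // placeholder ID for the comments question

  // "no-cors" because Google Forms doesn't send CORS headers; the response is
  // opaque, so treat the submission as fire-and-forget.
  fetch(GOOGLE_FORM_URL, { method: "POST", mode: "no-cors", body: data });
}

// Called from a Storyline "Execute JavaScript" trigger, it could look like:
var player = GetPlayer(); // Storyline's built-in JavaScript API
submitEvaluation(player.GetVar("Rating"), player.GetVar("Comments"));
```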

Has anyone tried this before? Did it make much of a difference?

Nicole Legault

Oooh great ideas!! 

@Jackson - maybe you could consider using the "Survey" questions offered in Storyline or Quizmaker. Since they aren't graded, they could be ideal for gathering post-course eval information.

Then you can make it look like any other quiz in the course, which kind of ties in with @Deborah's idea of making it a quiz and a required part of the course.

Bruce Graham

The problem here is simple, IMHO - the learners feel that they have done all that is required (often on the back of badly designed and irrelevant courses). Why SHOULD they do this extra thing for you at the end? It does not matter what form you put it in; you are asking them to do something to which they attach no personal value. It does not matter how you "build" it - it will not get done.

So...

Why not just make it a series of questions throughout the course? I really think the "quiz at the end" concept is now old and hackneyed; there are other ways to do things. Why do you need to wait until the end to ask them whether "...this course was relevant and can be used in your day-to-day blah blah blah"? At the end of a section within a course, why not just say something like "...Do you see how relevant to your job that is?", or whatever. Even at the start, ask "...do you know, exactly, why the next 10 minutes are relevant to you?".

It does not have to be "post course" (when learners may feel that they have done all they are morally contracted to do). If you ask questions all the way through, then one or two at the end will also (psychologically...) be more palatable, using the old trick of "if someone does you a favour, they are more likely to do you another one". Use that principle here.

I think we've all pretty much exhausted the old-skool and "expected" ways to do quizzes by now. In my years and years of doing this I've seen every variation and tweak, and in my experience they are all a bit like shuffling deckchairs on the Titanic. Storyline (for example) gives us the ability to do things in a much more subtle way, picking where questions come from for a specific result. Perhaps it's time to try that and see what happens when you do something different?

What's more important - "making them do the quiz" or "getting the business information you need"? If it is the second one (and it should be...), you just might have to work a little harder and smarter and break the current expected paradigm. And if course sponsors disagree, you have the perfect negotiation tool - you have figures. Who says you cannot measure learning? Just measure where it fails, and change the way you do things. Learner engagement with "end of course quizzes" is a perfect example of where you CAN change things in our industry for the better.

Jeff Kortenbosch

Great point, Bruce. How would the (survey) questions be stored by the LMS? We're limited to SCORM 1.2 (at least for now). Doesn't that just pass pass/complete data to the LMS?

A basic thing we use, when required, is to make the survey part of the course program, e.g. 1. e-learning module X, 2. classroom session Y, 3. end-of-course survey > certificate. The user needs to complete/pass all three to receive their certificate.

Cary Glenn

I've given up on "Reaction" surveys (or "smiley sheets" as they are often called). When I was doing in-class instruction I found that the responses focussed on food (donuts and free lunches) rather than the actual content of the course and the quality of instruction. If people got donuts, were able to get out early, and didn't really have to change, then the course was great. As a manager, the only time I really began to take notice was if there was a change in responses or an interesting trend in ratings.

Most of the courses I'm involved with are safety related. These courses often involve people changing behaviors to work safely, which often means changes in procedures, eliminating dangerous shortcuts, and people finding out that what they are doing is dangerous and sometimes illegal. Many people dislike changing, even if they know it is for the better. Sometimes the most effective courses are the ones that learners dislike - not because the course is bad, but because the content causes a cognitive dissonance within the learner, and that makes them unhappy.

You are better off measuring "Behavior" changes. That is really where the effectiveness of the course will show.

Bruce Graham

@Jeff - you could just do a series of Yes/No questions throughout, which would allow minimal analysis. If Q1 was, for example, "I know how this course will benefit me", then the "correct" answer is "Yes", and basic analysis becomes possible. I'm not even saying you need to use an LMS - this could be done by something like SurveyMonkey - I am sure the tech is out there. Bottom line: so many people spend so much time beating themselves up because of those "nasty users who will not do what we want...". The way to win is to change the paradigm. It does not matter how you do it - do something DIFFERENT, and MAKE IT MATTER TO THEM.
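And on the SCORM 1.2 point - it is not only pass/complete data. SCORM 1.2 also has a cmi.interactions bucket you can write each answer into, although how well any given LMS stores and reports that data varies a lot. A rough, untested sketch (the findAPI walk and the element names follow the usual SCORM 1.2 run-time conventions - check what your own LMS actually does with them):

```javascript
// Untested sketch: write one Yes/No survey answer to a SCORM 1.2 LMS.
// Assumes the LMS exposes the standard SCORM 1.2 API object and that the
// course (e.g. Storyline's own wrapper) has already called LMSInitialize.
function findAPI(win) {
  // Walk up the frame/opener chain looking for the LMS-provided API object.
  while (win && !win.API) {
    if (win.parent && win.parent !== win) { win = win.parent; }
    else if (win.opener) { win = win.opener; }
    else { return null; }
  }
  return win ? win.API : null;
}

function recordYesNo(questionId, answeredYes) {
  var api = findAPI(window);
  if (!api) { return; } // not launched from an LMS - nothing to record

  var n = api.LMSGetValue("cmi.interactions._count"); // next free interaction slot
  api.LMSSetValue("cmi.interactions." + n + ".id", questionId);
  api.LMSSetValue("cmi.interactions." + n + ".type", "true-false");
  // Response format varies by LMS; "t"/"f" is the usual SCORM 1.2 shorthand.
  api.LMSSetValue("cmi.interactions." + n + ".student_response", answeredYes ? "t" : "f");
  // For a survey item, "correct" just means "the answer we hoped for".
  api.LMSSetValue("cmi.interactions." + n + ".result", answeredYes ? "correct" : "wrong");
  api.LMSCommit("");
}

// e.g. recordYesNo("q1-know-how-this-course-benefits-me", true);
```

If the LMS reporting turns out to be too weak, the same function could just as easily post the answers to an external survey tool instead.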


J V

Great topic, and all the shared ideas and perspectives are very helpful! We dealt with (and are still dealing with) many of the same philosophical, practical, and technical issues. For our organization's needs, we developed standard sets of very simple instruments (only 5 questions) that allow us to compare across similar programs and keep the time requirements for our trainees to a minimum. The questions we ask are tied to organizational metrics. Strategically, we view the Level 1 survey as a "canary in the coal mine", tipping us off when there are issues we need to dig into.
