How Do You Encourage Course Feedback?

Jan 13, 2016

I create courses internally for an organization, so this question might not apply to the freelancers here, but how do you encourage feedback on your courses?

I ask because today I got a call from a learner who, after 1,400 other staff had taken the course, was the first to question any of the content.  They had three issues.  One was content they thought conflicted with the way they practice and with policy, another was a multiple-choice question where I had actually entered the wrong feedback, and the last was a randomly selected question with incorrect triggers that jumped them to the results slide too soon.

I was very thankful for the feedback, and these were genuine errors that needed correcting.  Was she just the first random person to find them, or the first to pay close enough attention?  I'm hoping it was random and everyone pays attention. ;)

It did raise a question, though: how do you promote and encourage your learners to think critically through the learning material you present, and do you offer mechanisms for learners to send you (or someone else) feedback?

I look forward to hearing what others do.

22 Replies
Bob S

Currently we manage this via the LMS survey functionality. Each course has a short survey attached, mostly Likert-scale questions with a free-form section. This lets users give feedback easily if they choose, and it feels less uncomfortable than sending an email to the L&D folks.

Best tip we found is to keep it short, short, short. For us, the amount of useful info actually received is inversely proportional to the number of questions on the feedback survey. I know this may seem counter-intuitive, but we've found it to be true. If you can keep it to just 3-5 Likert-scale questions max, you are far more likely to get broad participation, meaning you wind up with far more data than if just a handful of folks complete a novel. This approach also seems to encourage the learners who do respond to make more use of the free-form section that asks for additional thoughts.

We also typically remind them on the last screen of the course to take advantage of the brief survey and share their thoughts so we can improve the materials for everyone.

Great question and hopefully some others will share their best practices as well.

Alexandros Anoyatis

Tracy, this is such an underrated question - we're so invested in giving our courses the best treatment that we sometimes neglect, or even forget about, feedback altogether!

I agree with Bob that you should add a short survey within the LMS itself, outside the Storyline module (maybe even right after it). I also agree with keeping the questionnaire as short as possible.

However, I tend to encourage the use of text boxes instead of Likert-scale questions; those interested in providing feedback will dig right in and do it no matter what, and I prefer dealing with text rather than trying to decipher a numeric average. Additionally, none of the three issues Tracy described could have been reported through (just) a stock single-choice survey form.

Just my 2c,
Alex

Christy Tucker

I saw a suggestion a while back (I think on the LearnDash blog?) to put a link on every page of the LMS labeled "Report content errors or broken links." While I've always done a survey like Bob and Alex mention, I like the idea of having something easily accessible to users on every page. If you can make it easier for users to do that checking and reporting for you, you can hopefully spend less time trying to review the courses yourself. I would use this in combination with a survey.
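
If your LMS pages let you drop in a little script, the link itself can be trivial. Here's a minimal sketch that pre-fills an email with the page the user was on; the support address is just a placeholder, and the wording is up to you:

```typescript
// Minimal sketch: add a "Report content errors or broken links" link
// that pre-fills an email with the page the user was viewing.
// The support address below is a placeholder, not a real mailbox.
const supportEmail = "lms-feedback@example.com";

const subject = encodeURIComponent("Content error / broken link report");
const body = encodeURIComponent(
  `Problem found on: ${window.location.href}\n\nWhat happened: `
);

const link = document.createElement("a");
link.href = `mailto:${supportEmail}?subject=${subject}&body=${body}`;
link.textContent = "Report content errors or broken links";
document.body.appendChild(link);
```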

It's also helpful to have a systematic plan to review courses periodically. That can be challenging, especially with a large library, but it's one more process to catch errors.

Mike Taylor

Hi Tracy! I've had pretty good luck putting a "Comments/Questions" link in the top right toolbar so that it's always available from everywhere in the course. I also add a slide at the end specifically asking for comments and/or questions.

[Image: Comments or Questions page]

It's been a huge help in finding typos and mistakes, and even in highlighting gray areas of policies that we could then clarify to improve the course.

I made it a link to a super simple web form, so all the comments are automatically collected into a spreadsheet, and I ask people whether they'd like to be contacted about their comment. This is also great material for your performance reviews: you can show how you fixed things, or how well people like your courses. 8-)
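
If anyone wants to roll their own, the collection side doesn't need to be fancy. Here's a rough sketch of the form's submit handler; the endpoint URL is hypothetical (something like a Google Apps Script web app, or any tiny service that appends a row to a spreadsheet, would do):

```typescript
// Rough sketch of the comment form's submit handler. FORM_ENDPOINT is a
// placeholder for whatever service appends each submission to a spreadsheet.
const FORM_ENDPOINT = "https://example.com/collect-feedback";

interface CourseComment {
  course: string;      // which course the comment is about
  comment: string;     // the learner's comment or question
  contactMe: boolean;  // "would you like to be contacted about this?"
  email?: string;      // only needed if contactMe is true
}

async function submitComment(entry: CourseComment): Promise<void> {
  await fetch(FORM_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ ...entry, submittedAt: new Date().toISOString() }),
  });
}

// Example submission (the course name here is made up):
submitComment({
  course: "Annual Safety Refresher",
  comment: "The feedback on question 3 looks wrong.",
  contactMe: true,
  email: "learner@example.com",
}).catch(console.error);
```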

Tracy Parish

Although it's perhaps a bit overwhelming to think about, I really like this idea as well, Christy.  I can't believe how many calls I get that say:

  • "I got stuck on the last screen".....um no the last screen has an exit button you still had a way to go I'm afraid.
  • "You know the screen with the guy, ya that's where it didn't work"....ummm the "guy" is on 50 slides of this course
  • "The screen froze so I closed it.".....umm which screen, which course, can you give me any details.


This "report a broken link" idea would eliminate the need to rely on users to give you the information required to pinpoint the trouble with a course.  More often than not it's a quick two-minute fix, but interpreting the panic call is the time-consuming part.

Christy Tucker

My favorite one (this is an actual quote from a client):

"I have checked the link but when I click the buttons not much happens."

Not much happens? So something happens, just not enough, or not the right thing? And it's some buttons somewhere in this 12-module course? Um, yeah, I'll get right on that.  :-/

It often takes longer to figure out where and what the problem is than to actually fix it. If you could create a link on each page that automatically identified the page the user was on, that would save you the trouble of trying to get users to give you what you need.
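
In Storyline you might get partway there with an "Execute JavaScript" trigger behind a per-slide feedback link. A rough sketch: GetPlayer() and GetVar() are Storyline's documented JavaScript functions, but the CurrentSlide variable and the form URL are hypothetical; you'd maintain the variable yourself and point the link at your own form:

```typescript
// Sketch of an "Execute JavaScript" trigger for a per-slide feedback link.
// GetPlayer()/GetVar() come from Storyline's published player; declared
// here so the sketch type-checks. "CurrentSlide" is a hypothetical text
// variable you would set yourself on each slide.
declare function GetPlayer(): { GetVar(name: string): string };

const slide = GetPlayer().GetVar("CurrentSlide");
const reportUrl =
  "https://example.com/feedback" +                  // placeholder form URL
  "?slide=" + encodeURIComponent(slide) +
  "&course=" + encodeURIComponent(document.title);  // course page title
window.open(reportUrl, "_blank");
```

That way the report arrives already tagged with the slide, instead of "the screen with the guy."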

Alexandros Anoyatis
Christy Tucker

If you could create a link on each page that automatically identified the page the user was on, that would save you the trouble of trying to get users to give you what you need.

I'm pretty sure there's a (paid) widget/add-on by James Kingsley available for similar scenarios, but I can't be certain it fits your use case - it's been a while since I watched the video demo.

Mark Shepherd

Hi Tracy!

I like Mike's suggestion of placing a graphic encouraging feedback right at the end of the course.

In my organization, we are currently working on adding a Survey/Feedback option at the end of the course that links directly to a feedback platform online: SurveyMonkey!

Also, internally we already encourage many of our course users to give us feedback, but it is far more useful and effective for them to do so from right within the course itself.

 

Ulises Musseb

Hello everyone, and my two cents:

In my organization, collecting feedback and accepting suggestions is treated as part of the overall instructional design of the course. We saw a good increase in valuable feedback and comments (both encouragement and useful information for improving current and future courses) as soon as we started treating the course feedback the same way we treat the courses themselves.

At first we thought that adding a separate survey was enough to give staff the opportunity to express themselves. However, when we applied the same instructional design to the survey and made it part of the course (rather than an added link), there was a dramatic increase both in participation and in the value of the information we obtained about the course. We even got feedback about the feedback tools!

We used to spend a great deal of time designing courses appropriate to the audience, normalizing content, designing for proper presentation and delivery, etc., yet we found ourselves spending no time designing a tool for learners to provide feedback. Not surprisingly, we were not communicating our intentions and objectives when we gave staff the opportunity to offer suggestions at the end.

Miranda Verswijvelen

We have implemented this via our LMS and kept it short. We've got four Likert-scale questions (fun, informative, engaging, applicable) and then two open text areas with a 'limiting' question each: 'tell us about one thing you really liked' and 'tell us about one thing you would change or had trouble with'. We find that the last question gives us a pretty wide overview of navigation issues, bugs, content errors, etc., as experienced by different people.

The 'one thing' seems to keep their comments short and to the point as well!

Cary Glenn

The company I work for has employees from northern BC to New Orleans, and so far they haven't had trouble contacting me when there's a problem their local admin can't help them with. We seem to have most of the technical issues figured out. That may change when we make the shift to Windows 10.

I'm not a big believer in Kirkpatrick Level 1 evaluations. I really don't care whether they like the course. In some instructor-led courses I expect them to be ticked off, because I am telling them the way they have been doing something is wrong/dangerous/illegal and they are going to have to change. All too often, when I did do surveys, I got useless information. What I am more interested in is whether the course actually changes people's behavior. I am getting Will Thalheimer's new book on smile sheets (http://smilesheets.com/). Maybe it will change my mind.

Bob S

Hi Cary,

At some level I agree with you: Level 1 evals can certainly be just smile sheets. That being said, we tend to focus ours on content applicability and clarity/accuracy more than the feel-good topics. So when aggregated across a broad spectrum of users, they can reveal some interesting trends.

For example... We tried a different format for pop-ups in one particular course. And when we looked over time, we found that course consistently scored much lower than average in clarity of content. That was a good indicator our design didn't work as intended.

So yes, they can be squishy and feel-good. But if well structured, they can add some tangible value too. My two cents!

Bhavya Aggarwal

In my previous company, the trainers used to conduct a QA session of each course with the company's employees. A trainer would gather people from different backgrounds for an hour, let them go through the learning material, and watch them learn right there in the same room. Of course, he would sponsor a nice lunch in return. :) Ours was an engineering product (SolidWorks), and as a group we really did help get some of the discrepancies sorted out. This was also useful for getting feedback on the usability of the courses. In fact, this process was followed not just for new courses, but also for reviewing existing courses to check whether any updates were required.
