What Guidance Do You Give Your Beta Testers?

Feb 13, 2013

I am developing my very first e-learning course using Storyline and have included testing in my timeline. Do you give your testers a questionnaire or any guidance on the type of feedback you'd like, or do you just have them take the course and let you know if anything is broken? I'd appreciate getting substantive feedback from them so I can improve the course and my skills. How can I ensure that I get this?

If you have any samples or templates, I would greatly appreciate them. Thanks!

15 Replies
Bruce Graham

Generally (in my experience), the areas are:

  • Confusing navigation.
  • Confusing instructions when "doing" things.
  • Confusing or inconsistent messages.
  • Areas where the learning is strong/useful and/or weak/redundant.
  • Technical/usability issues.

I try not to be too prescriptive with a "form" and let the feedback come in freeform. It may be harder for me to interpret, but it seems to elicit more feedback, which is what the exercise is about.

Bruce

Holly MacDonald

Michelle -

Great step to get feedback from your learners.

If you can watch your users and observe how they interact with your course, you'll see the areas of confusing navigation. If you couple this with a feedback channel, you'll get broader input. If you can't actually watch them, think of ways you could do this remotely - maybe a screenshare or a service like usertesting.com. You could also have them record a screencast describing what they're thinking rather than writing it down; you might get more candid feedback.

I hope that helps.

Holly

Steve Flowers

I like to send out smaller tests to folks during the formative changes to see if the navigation makes sense, instructions are clear, messages are consistent, etc. I love the idea of asking folks to do a screencast if they have the tools and are comfortable doing that.

When I test with larger groups, I want to see if the strategy is moving the needle. We have a testing package we use to measure before and after snapshots of abilities and perceptions. It's pretty low tech. All paper based.

The pre-module survey and assessment contents vary based on the context. Generally we're looking for the participant to describe how they feel about a particular topic beforehand, how much they know, what their understanding of rules and consequences is, etc. This is brief, as we don't want to over-influence the experience with prep, but we do want to get some sense of where folks are before. We also assess various tasks on paper ahead of time and extend that capture with a confidence measurement (How confident are you in that answer?).

We offer a similar assessment after the module experience that measures similar task contexts and asks the same confidence questions. This gives us a general idea of whether the module did what it was designed to do. These tests are NOT part of the final deployment, though we do try to put L3 measures (are you using this on the job, and can you accomplish the task at some point in the future?) in place to evaluate how long the results last. If the L3 measures are off the mark and don't jibe with our user test results, we make an adjustment or add something a little different to span the gap. It's all you can do, really.
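To make the before/after arithmetic concrete, here is a rough sketch of the comparison. The field names, scoring scale, and 1-5 confidence scale are just assumptions for illustration, not the actual paper instrument:

    # Hypothetical sketch of a pre/post comparison for a small beta-test group.
    # Scores (0-100) and confidence ratings (1-5) are assumed scales for illustration.

    pre_results = [
        {"tester": "A", "score": 55, "confidence": 2},
        {"tester": "B", "score": 60, "confidence": 3},
        {"tester": "C", "score": 45, "confidence": 2},
    ]
    post_results = [
        {"tester": "A", "score": 80, "confidence": 4},
        {"tester": "B", "score": 75, "confidence": 4},
        {"tester": "C", "score": 70, "confidence": 3},
    ]

    def averages(results):
        """Return (mean score, mean confidence) for a set of results."""
        n = len(results)
        mean_score = sum(r["score"] for r in results) / n
        mean_conf = sum(r["confidence"] for r in results) / n
        return mean_score, mean_conf

    pre_score, pre_conf = averages(pre_results)
    post_score, post_conf = averages(post_results)

    print(f"Average score:      {pre_score:.1f} -> {post_score:.1f} (change {post_score - pre_score:+.1f})")
    print(f"Average confidence: {pre_conf:.1f} -> {post_conf:.1f} (change {post_conf - pre_conf:+.1f})")

A score gain without a confidence gain (or the reverse) is exactly the kind of mismatch that the later L3 check can confirm or contradict.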

Tim Slade

I think it really depends on what you want from your beta testers. In my experience, if you are using non-eLearning people, they're unlikely to point out any major/minor design/technical issues unless you ask them VERY specific questions. They just don't know any better. In most cases, the small details we'd like them to point out, they won't even notice.

Typically, if I'm having my SME(s) look at the content, I'll ask them to ignore the "look and feel" of the course and simply focus on content accuracy. Is everything accurate? Is the course conveying the complete message that you want? Etc.

If I'm having a group of end users from the targeted audience look at the course, I'll simply ask them to give general feedback on the course. With this group, you'll likely have to ask a lot of questions to get the right info from them. For example, was there anything you liked/disliked about how the course looked? Do you feel the content helped you meet the specified learning objectives? Etc. Again, unless they've reviewed eLearning many times in the past, they won't know to give you this type of feedback.

And lastly, if I'm looking for feedback on how the course looks, its structure, and/or its design, I'll have a group of my peers look at the course. They'll be the ones who really understand what you're looking for in terms of feedback AND they are the best group for this type of review.

Hope this helps! Good luck.

-Tim 

Marty King

I recently beta tested a course I developed by having two people complete the course while I watched. This works for me because I am in a corporate setting. I was able to observe navigation issues as well as measure engagement. I did not interact with the participants while they completed the course but did interview them afterward. Of course, I had several others complete the course without me and then interviewed each one. I highly recommend observing if you can.
