Looking for feedback on when testing a course that has been generally approved in content/navigation/etc by the client - what size of pilot group do you then push it out to for user testing?
11 Replies
Great question, Tracy. I'm not aware of any industry-wide best practice or standard for pilot group size. I'm sure each person has their own ideas, but I prefer a group of 3 to 5. I tend to prototype as I go along, so a pilot group of this size is small enough to be manageable, but still large enough to give me some diversity of opinions, experiences, and operating environments. Obviously, if I were working on a really high-risk, high-profile project, the pilot group might be a bit larger.
I've previously gone for 6-10, but that was more because I couldn't guarantee they would all have time to provide feedback, or the quality of the feedback they did provide.
A few good reviewers/testers are a lot better than a large number of "yeah, that looks good" responses.
I've used groups of 5 to 15, depending on the content. In the case of two critical HS modules for supervisors, where 2000 people would be taking 1.75 hours of training, we designated a group of 15 that included supervisors with either an affinity for or an aversion to e-learning (as well as HS coordinators who had no involvement with the course development, and a couple of people from the HelpDesk who would be fielding help requests). We also tried to choose people who are known for being picky and giving very specific, constructive feedback. We also had people launch the learning on business and home devices and a range of browsers to see what, if any, technical issues emerged. As a result, the full release was smooth, the voluntary completion rate was high, and the feedback was very positive.
5-10 always seems to work for me. I like to get them in a room together so I can observe their behavior. If it's a global rollout, I also make sure I've got at least one tester from the regions that I know struggle with infrastructure and technology.
5-10. It is more important to get a spread of abilities and to ensure that they are representative of the target audience.
It also helps if you can watch over their shoulders; UAT really changes your view of how you create things.
6-9 people. In my experience, different people look for different things, so I would have a number of people who know the content inside out, a couple of people who have a vague idea of the content, and a group of people who have no idea of the subject. I find that the SMEs will concentrate on the subject but may miss the finer points because they already understand the message; the people who have no idea of the subject usually concentrate on the navigation or the look and feel of the course, while the people who have a vague idea provide feedback on how understandable the content is.
All very important views but for different reasons.
I think it really depends on the content that you are piloting. It also depends on what you are going to be able to do with the feedback you get.
In agreement with Jeff on this one. I also roll out to 5 users and always work to get a good mix from across the organization. If it's not global, my training co-workers are always the focus group.
We always try things out internally first, before asking around 5 users to test it out.
I concur with Jason. My training co-workers are my focus group as well. I also try to make sure the people testing don't know how the eLearning is supposed to work. That way, they're more likely to click or interact somewhere unexpected and discover any errors.