What is your testing or QA process for e-learning?

Testing is a really important phase of course development. A thorough testing process can mean the difference between a polished, professional course and one that is sloppy with loose ends.

What's your process for testing and doing QA for e-learning?

Do you have it reviewed by an SME? A copy editor? A testing group?

Consultants and freelancers, do you incorporate testing time and subsequent edits into your project plan?

Any tips or recommendations for a first time e-learning developer on how they should approach testing?

Thanks in advance!

25 Replies
Alexandros Anoyatis

As far as reviews go, I prefer having our SMEs and stakeholders suggest edits during the actual development of the content. This eliminates some, many, or even all of the subsequent review cycles, depending on the case.

I do allow a final review once development is complete, just to be sure, though.

I also have a proofreader on standby, but leave it to the client to choose whether they want that service or not.

I make sure to note down early any workarounds for issues such as device inconsistencies, device limitations, or even SL bugs, so I can focus on them more during testing.

Alex

Andrew Winner

In terms of collecting and resolving feedback, I use an Issue Log similar to what a computer programmer would use. I list all the feedback I've gotten, as well as who's assigned to fix it and any comments. Each project has a dedicated log, and I just build it out as the project goes on. Sometimes, if it's a project that gets a lot of feedback, it can be 100 lines or more (which is not ideal).

I host it on Google Docs and circulate the link to the project team for 100 percent transparency. Anyone can pop in at any time to see the status of the edits. 

-Andrew

Nicole Legault

@Alex - Very helpful tips Alex! Thank you for sharing.

How do you manage getting your SME/stakeholder edits during the development phase? Do you allow a set number of reviews by the SMEs/stakeholders?

In your experience, do your clients want to do proofing, or not? And from what you've seen, is proofing worth the added expense?

What do you think about user testing? Have you ever had a few potential learners test a course to assess their reaction or feedback? 

Thanks again for your feedback Alex!

Andrew Winner

It's just a Google Sheet, so anyone can view/edit. But people can configure the security settings however they want--for example, some people may prefer to make it view only.

The headings I use are: Page / Issue Description / Priority / Date Raised / Raised By / Assigned To / Comments / Date Resolved
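Those headings map naturally onto a simple record structure. As a hypothetical sketch (the real log is just a Google Sheet, and the field values here are invented), an issue log with those columns might look like this:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Issue:
    # Fields mirror the sheet headings: Page / Issue Description / Priority /
    # Date Raised / Raised By / Assigned To / Comments / Date Resolved
    page: str
    description: str
    priority: str                      # e.g. "High", "Medium", "Low"
    date_raised: date
    raised_by: str
    assigned_to: str
    comments: str = ""
    date_resolved: Optional[date] = None

def open_issues(log):
    """Anything without a resolution date is still outstanding."""
    return [i for i in log if i.date_resolved is None]

log = [
    Issue("Slide 3", "Typo in heading", "Low", date(2015, 6, 1), "SME", "Andrew"),
    Issue("Slide 7", "Next button misfires", "High", date(2015, 6, 2), "QA", "Andrew",
          comments="Fixed in build 2", date_resolved=date(2015, 6, 3)),
]
print(len(open_issues(log)))  # prints 1 -- one issue still unresolved
```

Filtering on the "Date Resolved" column is the same trick you would use in the spreadsheet itself to see what is still outstanding at a glance.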

Nicole Legault

@Andrew - Thanks for clarifying! I would love to see a sample/template file sometime, if you would ever be willing to share one - obviously an empty or blank one, not one with any private or company info. I'd also be interested in your feedback on user testing: have you ever done it, and do you see value in it?

Ashley Chiasson

As a consultant, I do factor QA/revisions into my planning. However, the thing I find tough when it comes to testing is that as a lone worker, you're so close to the project (e.g. you've been working on it for x number of hours) that it's often tough to pick up small mistakes (e.g. rogue commas) when you've been looking at something for so long. It's one of the only things I miss about working in more of a team-oriented environment.

I also tend to factor a set number of revisions (or review cycles) into the project and provide timelines for reviews. I typically specify what types of changes will be addressed within each review (e.g. content or multimedia) in an effort to get my clients to flesh out all content-related changes in review cycle 1; that way, when it goes to voice over, I don't have to ask the VO artist to make changes after they've recorded (unless I've screwed something up, or the VO artist has). That's not to say that I won't fix grammatical errors in review cycle 2 - it just streamlines things a bit more.

I do have a review log that I send out when I think the client will have a hard time with the concept of a review, but that doesn’t tend to happen often.

At my last corporate gig, there was an ISD review, an SME review, and a QA review - so many (many) hands in the QA pot, which could be good or bad (usually depending on the mood of those individuals). We used review checklists, issue tracking logs, and test cases (which were only used immediately prior to delivery).

Tracy Parish

If it's not sensitive content, I've had my mom test courses out. She can shop and bank online, but that's about it. She's a great functionality tester, ensuring I haven't missed instructions and that buttons work as expected. More importantly, she will do things that I might not expect a learner to do, and that's where the real insight comes in.

Cody Salinas

I'm in the same boat as Ashley.

Because I often develop courses as a lone wolf and am usually ingrained in the big picture, I always have my manager review my work to catch the smaller things, such as triggers misfiring, spelling errors, and so forth. My manager looks at the course with a fresh set of eyes, and that certainly makes the course more cohesive and fluid.

If I'm building content internally for multiple teams or departments, I usually have the SME from each team review the course and answer anywhere from five to ten questions that help me edit the course accordingly.

Similar to JD, and especially when working with our SaaS teams to develop client-focused content, we make a real-life Jira board on the wall. It looks like a Kindergarten class sometimes, but I've found that these simple Scrum boards make it insanely easy to track deliverables.

Cary Glenn

Typically, I send it off to an SME (if there is one - I've had quite a few courses lately where I had to research and build the courses on my own), my manager, and sometimes the manager of the subject area (they usually only get a say at the design phase). I agree with Ashley that it is hard to get a good look at a project when you have been involved in it for quite a while. I do set deadlines for reviewers: if they haven't gotten back to me within a week, I go ahead with the project. Their silence is taken as approval.

The review process at one company I worked at included one or two SMEs, maybe their manager, a technical writer, another instructional designer, a co-worker, and the manager of training. It often felt like they were just looking for something to change - it was literally "move this image over one pixel." I soon decided that I didn't need to work there.

Cheryl Theis

We build a two-round review cycle into our schedule. Round one is the lead SME, plus one other person who knows the product but did not help write the training. This helps to verify the content and make sure it is correct, with nothing missing. While the SMEs are testing, at least one person on my team will test, focusing more on flow and understanding of the content from a learner's perspective.

Testers are given a testing log to record their changes/recommendations. These are compiled and discussed with the lead SME at the end of the test cycle. Once the agreed-upon changes are made, round-two testing consists of the lead SME approving and me verifying all is working correctly.

@Tracy - Like you, I've gained valuable insight from a non-typical tester. My 12-year-old son loves to test my courses to see if he can find any typos or buttons not working. It's the best $5 I spend sometimes.

Joshua Roberts

Reviewing should never be taken lightly. This is one of the most important aspects of development; if you don't conduct a thorough review, it can make you look amateurish and, even worse, unprofessional.

Review the storyboard (internally) 
Review 1st phase (internally)
Review 2nd phase fixes (internally)
Review final phase (internally)
Release.

Multiple phases, and the project should be checked thoroughly after each round of changes.

Brian Miller

In my experience, we generally have an internal manager review the course at each phase of the project before sending it out to our client, who then reviews and gives their feedback as well. The main issue I have seen is when someone who has not been involved in the project reviews the final version of the module and decides that a lot of changes need to be made. Has anyone else had this issue? How do you handle it?

John Nixdorf

Some 30 years ago I saw this taped to a phototypesetter (anyone remember those?). It probably goes back to scriptorium days:

Ever Have A "Typo"?

The typographic error

Is a slipery thing and sly

You can hun til you are dizzy

But is somehow will get by

Till the forms are off the presses

It is strange how still it keeps,

It shrinks down in a corner

And it never stirs orpeeps

The typographic error

Is to small for human eyes

Till the ink is on the paper ..

When it grows to mountain size

The boss she stares with horror

Then grabs her hair and groans

The copy reader drops his head

Upon his hands and moans

The remainder of the issue

May be clean as clean can bee,

But that typographic error

Is the olny thing you sea

Shawn McCreight

We review and QA for both technical functionality and content.

For starters we use http://www.reviewmyelearning.com/

Reviewing a course locally is never a good idea, as many of the functions - like web objects, variables, and triggers - do not behave as expected. It is a blessing to be able to test it in a real environment that also allows for regression testing and communication.

As in most projects, clients like to have multiple reviewers; RME allows reviewers to see each other's comments. That saves time for everyone involved, and saves guesswork too.

Note: at this point we have already had the storyboards QA'd and vetted. So basically we enter the development stage knowing the storyboards have been signed off and are complete.

A few simple rules we follow:

1. Fresh set of eyes (a designated QA person or another ID) - this person, while checking functionality, also reviews for consistency and verifies that all content exists (follows the storyboards, etc.).

2. Developer corrects issues and completes edits.

3. QA person reviews to ensure edits are complete and no new issues have arisen.

4. Developer completes any additional QA.

5. Client review.

At this point we follow the same process again with the client comments.

Hope this was helpful :)

John Nixdorf

In some organizations there is an inviolable wall between course developers and LMS administrators (who very jealously guard the secrets of the guild, give you feedback like "The SCORM on that course is broken," but won't let you see the actual error message or get behind the veil to do any diagnostics). In that case, you might want to use SCORM Cloud, a website where you can set up a free account to actually test your courseware before throwing the .zip file over the wall to the LMS folks.

https://cloud.scorm.com/sc/guest/SignUpForm;jsessionid=1549E50A4B64B6D5F019AD11B63E5529

Rachel Barnum

A while ago I shared a template that I use when I'm QA'ing my courses.

I will often start with the "back end" - checking each slide within Storyline itself for incorrect answers, too-long timelines, typos, empty triggers, etc.

Then I'll go through the entire course in preview and use my QA sheet. I check every single answer option, hover over everything, make sure they can't click something they're not supposed to, etc. I do that once or twice before sending it to a QA person.
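The slide-by-slide pass described above is essentially a fixed set of checks applied to every slide. Purely as an illustrative sketch - the slide fields and check names here are invented, not Storyline's actual data model - that kind of QA sweep could look like:

```python
# Hypothetical per-slide QA pass: run every named check against every slide
# and collect (slide, failed_check) pairs for the issue log.
SLIDE_CHECKS = {
    "answer_marked_correct": lambda s: s.get("correct_answer") is not None,
    "timeline_not_too_long": lambda s: s.get("timeline_seconds", 0) <= 120,
    "no_empty_triggers": lambda s: all(t.get("action") for t in s.get("triggers", [])),
}

def qa_pass(slides):
    """Return a list of (slide_title, failed_check_name) pairs."""
    findings = []
    for slide in slides:
        for name, check in SLIDE_CHECKS.items():
            if not check(slide):
                findings.append((slide["title"], name))
    return findings

slides = [
    {"title": "Quiz 1", "correct_answer": "B", "timeline_seconds": 45,
     "triggers": [{"action": "jump_to_next"}]},
    {"title": "Quiz 2", "correct_answer": None, "timeline_seconds": 200,
     "triggers": [{"action": None}]},  # empty trigger left in by mistake
]
for title, check in qa_pass(slides):
    print(f"{title}: {check}")
```

The point of writing the checks down as a list - whether in code, a spreadsheet, or a printed checklist - is that the same sweep gets applied to every slide, so a tired reviewer can't silently skip a step.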

From there, we go back and forth making changes until we're happy enough with it to send it to the client, or wherever it's going.