Need to Track/Report Quiz Results at the Question Level

Jan 24, 2013

Hi all,

Newbie here and very excited about finding this great resource for Articulate. I'm not new to user forums, but I am a newbie here! That said, I know that "search" is the first step...so...I just spent the last 15-20 minutes searching for this issue and really didn't find the exact scenario/answer.

We use Articulate Quizmaker to test competency in various e-learning modules on our LMS (Training Partner). Although we are able to track the "completion results," I need to be able to track responses to individual questions so I can determine whether there are any trends (e.g., the % of staff that miss the same question(s)).

Please keep in mind that I am a newbie at Articulate.  Can someone help me with this? 

Thanks!

16 Replies
Christine Hendrickson

Good morning Frank. Welcome to E-Learning Heroes!

Quizmaker will send all reporting data to your LMS, including specific information on individual questions and answers. I'd recommend taking a look at this extremely useful article on troubleshooting Articulate / LMS issues:

http://www.articulate.com/blog/9-ways-to-troubleshoot-articulate-lms-issues/

In the article, you'll learn exactly what data Articulate sends to an LMS (how that data is reported is up to the LMS), as well as information on how to test your content on SCORM Cloud. 
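To give a concrete sense of what gets sent: SCORM-published quizzes report each answered question as an "interaction." Here's a rough sketch of one such record, written as a Python dict for readability; the ID value is hypothetical, and the field names follow the SCORM 1.2 cmi.interactions data model:

```python
# A sketch of the per-question fields a SCORM 1.2 quiz reports to an LMS,
# following the cmi.interactions data model. The interaction ID below is
# hypothetical; real IDs depend on how the quiz was authored.
interaction = {
    "cmi.interactions.0.id": "Question_01_MultipleChoice",  # hypothetical ID
    "cmi.interactions.0.type": "choice",                    # question type
    "cmi.interactions.0.student_response": "b",             # what the learner chose
    "cmi.interactions.0.correct_responses.0.pattern": "a",  # the correct answer
    "cmi.interactions.0.result": "wrong",                   # correct / wrong / etc.
    "cmi.interactions.0.latency": "0000:00:12.00",          # time spent (HHHH:MM:SS.SS)
}
```

Note that SCORM 1.2 makes these interaction fields write-only, so an LMS can accept them without ever surfacing them in its reports; how the data is displayed really is up to the LMS.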

I'm not sure if there are other community members who also use Training Partner, but if so, hopefully we can get some additional information for you. 

I hope this helps!

Frank Pietrantoni

Hi Christine,

Thanks so much for the speedy reply!  I'll have a look at the article you linked for me.

And I agree that any information from other Training Partner users would be really helpful. I've found that the various LMSes each have their own strengths and weaknesses, so getting information from users of the same application as yours is ideal.

Thanks again!

Philip Deer

Following up on a similar issue. I did not find the link above helpful, as I was not able to determine what I need to do.
We have a Quizmaker 360 file published to our LMS (Wisetail).
However, when I pull a SCORM report, it only shows the final score, not pass/fail per question. We would like to be able to see a report with per-question results so we can analyze which questions are most commonly missed and see if there is a reason why.

How can we get a report that shows the breakdown per question per user?
I reached out to our LMS vendor, and they said:
"Not in the LMS reporting, but that functionality should be available in your Articulate test report."

Thanks in advance!

Brennan Penders

Hi Philip,

I know it's been a while since you brought this up, but I wanted to check in: how did your inquiry with tech support go?

We have the exact same need for question-level reporting on a project I'm working on, and I believe we need to publish as SCORM 1.2 ourselves. Did you make any headway on this?

Thanks!

Lauren Connelly

Hi Brennan!

Ryan, one of our Support Engineers, found the fix for Philip!

You need to change the resume prompt to Always resume, then see if that works.

Here's how:

https://community.articulate.com/series/74/articles/articulate-storyline-360-user-guide-how-to-change-the-resume-behavior#options

With the Always resume option, the course will always resume at the last slide the user viewed, and the user won't be prompted about whether they want to resume the course.

Thanks for checking back in! 

Philip Deer

Hi Brennan,
We still publish to SCORM 1.2.
We ended up putting our Quizmaker exams into Storyline.

This is giving us a report that shows how users did on each question.
However, if you want to analyze many users together to get a good snapshot of how people are answering questions, you have to stick with question types that only have a single answer (i.e., multiple choice, true/false, pick one).

Any question types that have multiple parts (multiple select, pick many, drag and drop, etc.) generate an inconsistent number of rows in the report from person to person, depending on how the person answered. Thus, using an Excel macro will not work, because each person's report might contain a different number of rows.
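For what it's worth, a short script can cope with the variable row counts where a fixed Excel macro can't. Here's a minimal Python sketch, assuming a hypothetical CSV export with User, Question ID, Response, and Result columns (your LMS's export will use different names):

```python
import csv
from collections import defaultdict

# Hypothetical column names -- adjust to match your LMS export.
USER_COL = "User"
QUESTION_COL = "Question ID"
RESPONSE_COL = "Response"
RESULT_COL = "Result"

def load_responses(path):
    """Group report rows by (user, question), tolerating a variable
    number of rows per question (multiple select, drag and drop, etc.)."""
    grouped = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            key = (row[USER_COL], row[QUESTION_COL])
            grouped[key].append((row[RESPONSE_COL], row[RESULT_COL]))
    return grouped

responses = load_responses("lms_question_report.csv")  # hypothetical filename
```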

We don't want to sacrifice the interactive question types, as they make for a better test experience and, in many cases, help you create questions that check better against learning objectives. Thus, we only review how users are responding on each exam once per year (maybe twice in some cases), because it is an entirely manual and lengthy process to go user by user, compare each answer against the correct answer, and then check multiple users' answers against each other.

We like to see which questions are missed most often, especially those missed greater than 50% of the time (if I can toss a coin and have a better probability of getting it correct, the question and/or content needs work...usually the question). We go user by user through those who answered incorrectly to see how they are answering; if most people who get it wrong are answering the same way, we definitely know there is an issue with the question wording or structure.
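That 50% check and the "same wrong answer" check can both be automated once the data is grouped as in the sketch above. A rough continuation, which treats a question as missed if any of its rows is marked incorrect:

```python
from collections import Counter, defaultdict

def flag_problem_questions(responses, threshold=0.5):
    """Print questions missed more often than `threshold`, along with
    the most common incorrect responses for each."""
    attempts = Counter()          # question -> users who saw it
    misses = Counter()            # question -> users who missed it
    wrong = defaultdict(Counter)  # question -> tally of wrong responses

    for (_user, question), parts in responses.items():
        attempts[question] += 1
        # Assumed convention: a row's result is the string "correct" when right.
        incorrect = [resp for resp, result in parts if result.lower() != "correct"]
        if incorrect:
            misses[question] += 1
            wrong[question].update(incorrect)

    for question, seen in attempts.items():
        rate = misses[question] / seen
        if rate > threshold:
            print(f"{question}: missed {rate:.0%} of the time; "
                  f"common wrong answers: {wrong[question].most_common(3)}")

flag_problem_questions(responses)  # `responses` from the sketch above
```

If one wrong answer dominates for a flagged question, that points at the wording, matching the rule of thumb above.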

Hope this helps.

Linda Ferguson

Philip - Thanks for your helpful info above about going through the report line by line to figure out users' correct/incorrect answers. 

I'm using Rise to create my quiz. What reporting functionality do Rise quizzes offer?

Can we get better accuracy in Rise reports of quiz answers? Our LMS uses SCORM 1.2.

Thanks,

Linda

Leslie McKerchie

Hello Linda, and welcome to E-Learning Heroes. 😊

Thanks for reaching out and sharing what you are working on in Rise 360. The conversation here was specific to Quizmaker.

The reports are dependent on your LMS, and this article shares what is sent to an LMS:

Rise 360: Quiz Data Sent to an LMS

Have you compared the reports from another LMS, such as SCORM Cloud?

If you'd like to work directly with our support engineers to investigate what's happening, don't hesitate to get in touch with them here.