Quizzes Recording Wrong

Jul 07, 2015

We are using Articulate Studio 13. Recently, we have run into an issue with several modules not recording the quiz interactions correctly: though the user answers the questions correctly, they are being marked as incorrect. We publish to SCORM 1.2 and launch our modules from Totara LMS 2.6. This is not happening with all users, or even with all users on a particular module. Any help or insight would be great!

20 Replies
Crystal Horn

Hi Jennifer! Welcome to Heroes.

It sounds like only some users are experiencing the problematic reporting. Are you able to differentiate somehow between those users, for example, by their web browsers?

And then, are you able to reproduce the issue in SCORM Cloud? If not, you may want to reach out to your LMS provider for support with tracking issues.
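If you can work directly with one of the affected users, one more thing that sometimes helps is watching exactly what the course reports to the LMS. Here is a rough sketch, assuming you can open the browser console in the frame where the LMS exposes the SCORM 1.2 API; the wrapper itself is just an illustration, not something built into Studio or Totara:

```typescript
// Rough diagnostic sketch, not Articulate or Totara code: wrap the SCORM 1.2
// API object (window.API) so every LMSSetValue call is logged before it is
// forwarded to the LMS. The element names (cmi.core.score.raw,
// cmi.interactions.n.result, etc.) are standard SCORM 1.2 data model elements.
interface Scorm12Api {
  LMSSetValue(element: string, value: string): string;
}

const api = (window as any).API as Scorm12Api | undefined;

if (api && typeof api.LMSSetValue === "function") {
  const forward = api.LMSSetValue.bind(api);
  api.LMSSetValue = (element: string, value: string): string => {
    // Every answer and score the course reports will show up here, so you can
    // see whether a "correct" answer is actually being sent as incorrect.
    console.log(`LMSSetValue("${element}", "${value}")`);
    return forward(element, value);
  };
} else {
  console.warn("No SCORM 1.2 API found in this window; try the LMS frame that exposes window.API.");
}
```

Comparing that log between a learner who gets marked wrong and one who doesn't would show whether the course itself is sending the wrong result or the LMS is recording it incorrectly.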

Jennifer Richardson

Hi!

Thanks for the reply. The only commonality we have right now is that the affected learners are from a specific group at the same location, and they are all using IE11. I have asked them whether they are all testing on the same computer, whether they have cleared their local cache, and whether they sit behind a proxy server, but I haven't heard back. The most frustrating thing is that I can sign on as them, using IE11, take the module in question, and have zero issues. So it might be a site-specific thing.

We sent the SCORM package to Articulate and tested it on SCORM Cloud with no success. We were told that changing from submitting all questions at once to submitting one at a time might solve the issue, but that is a very, very big change to make since we have over 500 modules. I have contacted the LMS vendor, and there is no issue on their end other than telling us to update to a newer release, so we did that.

Trying to find the common links is the hard part, since it is not happening to all users, even at the same location.

Thanks
Jen

Judy Nollet

Jennifer, have you verified that they are answering the questions correctly? More than once, I've had people complain that the program wasn't scoring their tests correctly, but it turned out they had misinterpreted the quiz-review screens. They thought the green checkmarks on the left were their previous answers, and the checkmarks/dots within the squares/circles were the correct answers, when it's actually the other way around. Even with additional feedback on the screen, some people get so caught up in memorizing the misinterpreted responses that they repeat the same mistake over and over -- and they insist that they're answering correctly.

You could ask to see screengrabs to verify whether they're answering correctly. Or simply send them all an annotated screenshot explaining how to interpret the review slides (sample attached). Recently, I even had to set up an online conference so someone could share their screen and take the test with me watching. Sure enough, they were answering incorrectly.

I hope your problem is as easy to solve as this!

jen monroe

I am having a similar problem (SCORM 1.2 in D2L) with a 50-question graded quiz where all answers must be submitted together at the end. During the quiz review, some questions have a green tick next to the blue bubble, but the feedback bar at the bottom says incorrect.

I've been investigating and while I can replicate the problem, I get variable results.

Now that I am investigating the specific question slides to understand whether the problem lies there, I see that my answer choices are acting weird (see attached). I deleted all the choices and re-created them, making sure they didn't get out of order, but when I switch to Form View, they seem to have shuffled. Can you explain this?

jen monroe

No, but it might be happening only on specific questions, and I just haven't seen all the problem questions yet. The questions are set to 1 attempt, but the quiz properties allow 2 attempts. The weird thing is that the first time I tested it, I wrote down the problematic questions to investigate... then, in the same LMS session but after navigating away from the SCORM to another file and coming back, I repeated the quiz, and some of the problematic questions had been 'fixed' yet others were doing it.

Jennifer Richardson

I have had users take the same SCORM package, on the same browser, with no issues. I can sign in as a user who is having issues, take the SCORM on the same browser, and can't replicate the problem.

We host our content with Akamai and did notice that it was caching our content for a very long time. This was an issue because updated versions of the content were on our repository, so we aren't sure if that had something to do with it or not. We are still investigating, but it might be something you want to look into...
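If anyone wants to rule the CDN in or out, a quick spot-check is to request one of the published files and look at the caching headers that come back. This is only a sketch with a placeholder URL, run on Node 18+; nothing here is specific to Akamai or Articulate:

```typescript
// Minimal sketch for checking whether a CDN is serving stale copies of a
// published module. The URL is a placeholder, not a real path; the headers
// inspected are standard HTTP, nothing Akamai-specific.
const url = "https://cdn.example.com/courses/module-01/quiz_content/data.xml";

async function checkCacheHeaders(): Promise<void> {
  const response = await fetch(url, { method: "HEAD" });
  console.log(`${response.status} ${response.statusText} for ${url}`);
  // Long max-age values, far-future Expires dates, or an old Last-Modified
  // would all suggest learners may still be receiving a previous version.
  for (const name of ["cache-control", "expires", "last-modified", "age", "etag"]) {
    console.log(`${name}: ${response.headers.get(name) ?? "(not set)"}`);
  }
}

checkCacheHeaders().catch(console.error);
```

If those headers show a long cache lifetime, purging the CDN or publishing updates to a new, versioned path would be worth trying before anything else.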

Jennifer Richardson

We upgraded our LMS to a newer release, cleared our server cache, told users to clear their local cache, and are waiting to hear whether they have a proxy or caching server... still haven't heard back.

As I am sure you know, it is frustrating, since we can't seem to replicate it here and it is not happening to all users at the site that is using the SCORM.

I have a feeling, though, that it has nothing to do with the LMS or servers, because we have a reseller that we send the published packages to; they host and launch from their own LMS, and they have told us about this issue as well. With that said, I believe there is a bug in the software, but we have only been told to try using Submit One at a Time... which is not really an option we want to go with.


jen monroe

Thanks for the help! Yes, we had tried it on another LMS and there were NO problems... now I've just tried it on SCORM Cloud and, as you might imagine, it functions perfectly.

D2L had these comments about how the SCORM package was behaving (and they continue to claim they are perfectly compliant while insinuating that our SCORM is not). Do we have any way of controlling the data being passed when publishing, or is this really an issue they must deal with? At the moment they have passed the buck back to us, claiming our SCORM is faulty (a rough way to spot-check their two points is sketched after the quote below):

From analyzing the SCORM package, we've been able to find a number of problems that will need to be corrected within the package:

1) The cmi.interactions.n.correct_responses.n.pattern and cmi.interactions.n.student_response elements are passing invalid data. According to the SCORM specifications (http://www.adlnet.gov/wp-content/uploads/2013/09/SCORM_1.2_RunTimeEnv.pdf, page 3-48 is where the cmi.interactions section begins), the value for those calls must be a single character (a-z or 0-9). Currently, we're seeing values such as "D_D" or "A_Dia_menjadi_lebih_peka_dan_yakin" being passed through.

2) The cmi.suspend_data exceeds the 4000-character limit. Each time a question is answered, more characters are appended to cmi.suspend_data. This doesn't actually cause a problem until very late in the process, when you reach questions 40-50 while reviewing the correct answers, but it is definitely something that should be corrected.
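For anyone who wants to check these two points against their own package before republishing, one option is to wrap the SCORM 1.2 API in the browser console and flag suspicious values as the course writes them. This is only a rough sketch based on D2L's comments above; the wrapper is hypothetical and not Articulate or D2L code:

```typescript
// Rough sketch, not Articulate or D2L code: wrap LMSSetValue and flag the two
// conditions described above -- multi-character identifiers in the interaction
// response fields, and cmi.suspend_data growing past the storage limit.
interface Scorm12Api {
  LMSSetValue(element: string, value: string): string;
}

// D2L quotes 4000 above; the SCORM 1.2 spec's maximum for cmi.suspend_data is
// commonly given as 4096 characters.
const SUSPEND_DATA_LIMIT = 4000;

// Matches cmi.interactions.n.student_response and
// cmi.interactions.n.correct_responses.n.pattern.
const RESPONSE_ELEMENT =
  /^cmi\.interactions\.\d+\.(student_response|correct_responses\.\d+\.pattern)$/;

// For choice interactions, SCORM 1.2 expects single characters (a-z, 0-9),
// optionally comma-separated for multiple-response questions.
const VALID_CHOICE_VALUE = /^[a-z0-9](,[a-z0-9])*$/i;

const lmsApi = (window as any).API as Scorm12Api | undefined;

if (lmsApi) {
  const forward = lmsApi.LMSSetValue.bind(lmsApi);
  lmsApi.LMSSetValue = (element: string, value: string): string => {
    if (RESPONSE_ELEMENT.test(element) && !VALID_CHOICE_VALUE.test(value)) {
      console.warn(`Suspect value for ${element}: "${value}"`);
    }
    if (element === "cmi.suspend_data" && value.length > SUSPEND_DATA_LIMIT) {
      console.warn(`cmi.suspend_data is ${value.length} characters (limit ${SUSPEND_DATA_LIMIT})`);
    }
    return forward(element, value);
  };
}
```

That at least shows what the package is actually sending, independent of how the LMS interprets or stores it.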
