Quiz shows correct answers incorrect on LMS

Dec 26, 2016

Hello everyone,

Very recently, we've run into a serious problem with some of our clients: some of their correct answers are marked incorrect when they take their exam on our LMS. Our courses actually come in pairs: one course for the content itself and a quiz for the exam, both built in Storyline.

Our LMS provider claims the problem cannot come from the LMS, since it is the Storyline output that performs all the calculations. But we cannot reproduce the problem on SCORM Cloud. It does not happen every time, and only some clients have been affected so far. The problem started on December 15th.

We changed some settings in Storyline to see if that would correct the issue, but nothing seems to work. My biggest problem now is that I don't know how to determine whether the issue comes from our Storyline content or from the LMS. Even though I have tested all the courses on SCORM Cloud (since the issue affects all of them), I admit I can't run as many tests as the people who actually use the courses.

Do you have any idea?

34 Replies
Dave Cox

Hi Aurélien,

Most LMSs have a way to view the SCORM communication. There should be some sort of log where you can see what the Articulate module sends to the LMS. If you can access that log, that's probably the best first step.

If that doesn't point you to the problem, the next step is to enable Articulate's debug mode. Debug mode opens a second window that displays all of the SCORM communication during the course's execution. It dumps a lot of information, so interpreting the results can be a real job, but it lets you see exactly what is sent to the LMS. Take a look here to see how to put your module into debug mode.
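For anyone curious what such a debug window does under the hood, here is a minimal sketch of the idea: wrap the SCORM API object the LMS exposes so that every call is recorded. The `lmsApi` object below is a mock stand-in for the real `window.API`, and the wrapper is only an illustration of the technique, not Articulate's actual implementation:

```javascript
// Minimal sketch: wrap a SCORM 1.2 API object so every call is logged.
// `lmsApi` is a mock stand-in for the real window.API exposed by the LMS.
const lmsApi = {
  LMSInitialize: () => "true",
  LMSSetValue: (key, value) => "true",
  LMSGetValue: (key) => "",
  LMSCommit: () => "true",
  LMSFinish: () => "true",
};

const log = [];

// Proxy every function on the API and record each call with its arguments.
const loggedApi = new Proxy(lmsApi, {
  get(target, prop) {
    const original = target[prop];
    if (typeof original !== "function") return original;
    return (...args) => {
      const result = original.apply(target, args);
      log.push({ call: prop, args, result });
      return result;
    };
  },
});

// Simulate the calls a course makes when reporting a score.
loggedApi.LMSInitialize("");
loggedApi.LMSSetValue("cmi.core.score.raw", "40");
loggedApi.LMSCommit("");

for (const e of log) {
  console.log(`${e.call}(${e.args.join(", ")}) -> ${e.result}`);
}
```

A log like this shows exactly which values left the course, which is the evidence you need when the LMS vendor says the data it received was already wrong.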

Aurélien Largeau

Hi Dave,

Thank you so much for the debug mode tip. I didn't know there was one and I'm really happy there is.

I do have access to the LMS logs, but they don't tell us much beyond what we already know. We enabled debug mode and launched the quiz more than 100 times; the bug where correct answers were marked incorrect happened once. That one time, we didn't see anything unusual in the Storyline window: all 15 quiz questions were presented and answered as usual, and we could see our score on the results slide. It was only when reviewing the answers that we realized some were marked incorrect even though they were correct (and we double-checked our Storyline file to make sure they were).

The debug window was very interesting. We noticed that the course acted as if the quiz was finished, since a score was sent to the LMS, but (1) the quiz was actually not finished, and (2) the results slide showed a score of 40% while the debug window showed 0%. To be clear, here is what happened: we answered all 15 questions and saw a score of 40% in the Storyline window, but the debug window showed only 12 answered questions, and then a score of 0% was sent to the LMS. In other words, 3 questions we actually did answer were never recorded (I guess that's why they were marked incorrect), and that presumably threw off the final score.

What I still don't understand is why the course decided to send a score, and thereby end the quiz, while 3 questions were still unanswered.
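For what it's worth, a captured debug log like this can be tallied automatically to spot missing answers: each answered question should produce a `cmi.interactions.N.id` entry. A small sketch of the idea (the log-line format here is an assumption for illustration, not Storyline's exact debug output):

```javascript
// Sketch: count how many quiz interactions actually reached the LMS,
// based on captured SCORM debug output. The log-line format is an
// assumption for illustration, not Storyline's exact debug format.
const debugLines = [
  'LMSSetValue("cmi.interactions.0.id", "Q1") -> true',
  'LMSSetValue("cmi.interactions.1.id", "Q2") -> true',
  // ...in a real capture, one line per recorded answer would appear here...
  'LMSSetValue("cmi.core.score.raw", "0") -> true',
];

const expectedQuestions = 15;

// Each answered question produces a cmi.interactions.N.id entry,
// so collecting the distinct N values counts the recorded answers.
const recordedIds = new Set();
for (const line of debugLines) {
  const match = line.match(/cmi\.interactions\.(\d+)\.id/);
  if (match) recordedIds.add(Number(match[1]));
}

console.log(`recorded ${recordedIds.size} of ${expectedQuestions} questions`);
if (recordedIds.size < expectedQuestions) {
  console.log("some answers never reached the LMS");
}
```

Running a check like this against every captured session makes a rare, intermittent gap (12 of 15 recorded) jump out without reading the whole dump by hand.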

Crystal Horn

Aurélien, that's some really good detective work.  If you'd like to share your findings with our support engineers, we'd be more than happy to help identify what is going on when that quiz prematurely sends a score to the LMS.

You mentioned that it happened only once. Was that instance in the same environment as your other testing (browser, computer, network)? If you can identify a difference (and I'm sure you've been trying), that would be helpful for us as well.

Scott Wiley

Whenever we've had inconsistent issues like this, we found that the course was exceeding the character limit of the SCORM suspend_data element.

Storyline stores a lot of data for each quiz question, and even more if your slides are set to resume to their saved state instead of their initial state.

If you are currently publishing to SCORM 1.2 or SCORM 2004 2nd Edition, this is a possibility.

We fixed our problems by choosing to publish to SCORM 2004 4th Edition (see character limits for each below).

Spec                       Character limit
SCORM 1.2                  4,096
SCORM 2004 2nd Edition     4,000
SCORM 2004 3rd Edition     64,000
SCORM 2004 4th Edition     64,000
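A pre-flight check against these limits can catch an overflow before it silently truncates the saved state. A minimal sketch of the idea (the limit table mirrors the figures above; `checkSuspendData` is a hypothetical helper, not a Storyline or SCORM API):

```javascript
// Sketch: check suspend_data length against the SCORM version's limit
// before sending it, so an overflow is flagged instead of silently truncated.
const SUSPEND_DATA_LIMITS = {
  "SCORM 1.2": 4096,
  "SCORM 2004 2nd Edition": 4000,
  "SCORM 2004 3rd Edition": 64000,
  "SCORM 2004 4th Edition": 64000,
};

function checkSuspendData(data, spec) {
  const limit = SUSPEND_DATA_LIMITS[spec];
  return {
    length: data.length,
    limit,
    overLimit: data.length > limit,
  };
}

// A long course state easily exceeds the SCORM 1.2 limit...
const bigState = "x".repeat(5000);
console.log(checkSuspendData(bigState, "SCORM 1.2"));
// ...but fits comfortably under SCORM 2004 4th Edition.
console.log(checkSuspendData(bigState, "SCORM 2004 4th Edition"));
```

The same 5,000-character state is over the limit under SCORM 1.2 but well within bounds under 4th Edition, which is exactly why switching the publish target can make an intermittent corruption issue disappear.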

Hope this helps track down the problem.

Aurélien Largeau

Thanks Crystal.

Regarding the issue, I'm wondering whether the LMS could be responsible. Our LMS provider claims it can't be, since the LMS simply acts as a receiver for the information the Storyline course sends. When we tested the course and the issue occurred, we could see that the course acted as if the quiz was finished, so it is apparently the course that is sending incorrect information. But does that clear the LMS?

I'm wondering because the courses have been online since roughly September, and the issue only appeared two weeks ago. I'm not trying to assign blame here; I'm really trying to understand how the two communicate.

Dave Cox

Hi Aurélien,

Of course it is possible that the LMS is your problem. The Storyline course sends its scores to the LMS, but the LMS is then responsible for saving and storing those scores properly. Most of the time, these types of problems lie with the course content, not the LMS, but that doesn't mean the LMS has no role to play. That is why it is important to look at the communication and compare what the course sent with what the LMS actually saved.

Since you only started seeing the problem recently, I would ask what has changed. The course was working and hasn't changed, so something else must have. If you are communicating the scores correctly to the LMS and it still isn't saving them properly, something may be going on there. It is also possible that something in the course changed and the LMS is responding to it differently. It takes careful detective work, and a bit of cooperation from both sides, to figure out these kinds of issues.
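One way to do that sent-versus-saved comparison in practice is a read-back check: after sending a value, immediately ask the LMS for it and compare. A minimal sketch, using a mock API object in place of the real SCORM 1.2 API (`setAndVerify` is a hypothetical helper, not part of any standard):

```javascript
// Sketch: after sending a score, read it back from the LMS to verify it
// was stored as sent. `api` is a mock stand-in for the real SCORM API;
// a real LMS would persist the value server-side between these calls.
const store = {};
const api = {
  LMSSetValue: (key, value) => { store[key] = String(value); return "true"; },
  LMSGetValue: (key) => store[key] ?? "",
};

function setAndVerify(api, key, value) {
  api.LMSSetValue(key, value);
  const echoed = api.LMSGetValue(key);
  const ok = echoed === String(value);
  if (!ok) {
    console.log(`mismatch on ${key}: sent "${value}", LMS returned "${echoed}"`);
  }
  return ok;
}

console.log(setAndVerify(api, "cmi.core.score.raw", "40"));
```

If the read-back ever disagrees with what was sent, the discrepancy is on the LMS side; if it always agrees but the final grade is still wrong, the course sent a bad value in the first place.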

Ashley Terwilliger-Pollard

Hi Aurélien, 

As far as the communication goes, you could enable LMS debug mode to see what your course is sending back and forth, and you could also test a copy of the course on SCORM Cloud, which is an industry standard for LMS testing. This article will walk you through how to test there.

Crystal Horn

I get it: it's an odd thing that is happening to you.  This is the quiz data that Storyline sends, and this is when a course communicates completion to an LMS.  That information may be somewhat generic for this mystery, but the more information, the better.

It looks like you're working with Cleo in your support case, so I'll be interested to see if we can identify the breakdown.  Thanks for keeping us posted!

Kathia Nieto

Hello!

I just published my very first project, and I'm getting tons of complaints about the quiz results.

The course has two parts. Part 1 has 3 modules, with a quiz at the end of each module. I set one result slide to calculate the results for "selected questions" and another slide for "selected result slides".  Users get a passing score above 80% (the minimum, where a score of 80 counts as 100%), but the LMS shows 66.66% and a "did not pass" message.

Part 2 has two modules and the same result-slide setup.  Users get 75%/80% and a "did not pass" message.

I have tried to use SCORM Cloud, but my company has it blocked.

Any ideas?


Dave Cox

Hi Kathia,

Are part 1 and part 2 included in the same or different Storyline courses? 

If they are different courses, then try this: in Part 1, place one results slide after each quiz, set to calculate the results for the selected questions. Then add one final results slide at the end, set to calculate from the selected results slides, and include your first three results slides on it. This should aggregate your scores correctly.

If they are both part of the same course, then all of your quizzes live together. In that case, you should have only one results slide set to "selected results slides" to aggregate the scores from all of your quizzes.


Aurélien Largeau

Hello Kathia,

To be totally honest, I'm not 100% sure the problem is entirely on the LMS side. We've done plenty of tests, and the best we could manage was to set results to be sent on a per-slide basis. This lets the LMS store data step by step, which is, by the way, a far more robust setting. You should also consider enabling debug mode, which is very good and reveals a lot of interesting things... if you can get the bug to show up.

Kathia Nieto

Hello to all. Thanks for your attention.

Aurélien: I do not know how to enable the debug mode. Can you please advise?

Dave: Parts 1 and 2 are different courses. The training was too long, with a lot of images and videos, so I decided to split it; the user therefore gets two passing scores at the end, one for each part.  I currently have the final score slides set up as you suggest, but the problem is there despite that configuration.

Wendy: how can I share my story?

Dave Cox

Hi Kathia,

In Part 1, slide 4.1 is the results slide for sections 1, 2, and 3. Then you have another results slide with the results for slide 3.87. Results slide 3.88 also includes slide 3.87, so I'm not sure what you are trying to do here.

The results slide that compiles the other results slides should be the last slide in your presentation, so you should probably move slide 4.1 to position 4.2.

It seems that slide 4.2 is getting its results from a single true/false slide, which would make that one slide account for about 1/4 of the total score. And since that slide is also counted by quiz 3's results slide, your results are going to be off.
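To illustrate why counting one slide on two results slides skews the total, here is a small worked sketch; the point values are illustrative, not taken from the actual course:

```javascript
// Sketch: how double-counting one question skews an aggregated score.
// Point values are illustrative, not taken from Kathia's actual course.
function aggregate(slides) {
  const earned = slides.reduce((sum, q) => sum + q.earned, 0);
  const possible = slides.reduce((sum, q) => sum + q.possible, 0);
  return Math.round((earned / possible) * 10000) / 100; // percent, 2 decimals
}

// Intended setup: three quizzes worth 10 points each; learner scores 8, 9, 7.
const intended = [
  { earned: 8, possible: 10 },
  { earned: 9, possible: 10 },
  { earned: 7, possible: 10 },
];
console.log(aggregate(intended)); // 24/30 -> 80

// If a 10-point true/false slide already inside quiz 3 is ALSO tracked by a
// fourth results slide, its points enter the possible total a second time.
// Missing that question now drags the percentage down twice as hard:
const doubleCounted = [...intended, { earned: 0, possible: 10 }];
console.log(aggregate(doubleCounted)); // 24/40 -> 60
```

A learner who "should" have 80% drops to 60% purely from the duplicated tracking, which is the same flavor of surprise drop described in this thread.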

In Part 2, the final results slide seems to be in the correct location. Slide 3.1 seems to contain the results for only the last two slides of the two quizzes.  Results slide 1.38 isn't connected to any slides, and slide 2.86 is connected to all but the last slide in that quiz.

Be sure to click the Edit Result Slide button for each result slide, and make sure that they are pointing to the correct slide.

I would fix them for you, but I'm not sure exactly what you want done, and my version of Storyline is different from yours.

Kathia Nieto

Wow you detected the issue so fast!

What I want done is:

Part 1: one quiz after each module (3 quizzes total), but only the 3rd quiz's results slide summarizes the results of all 3 quizzes and shows the user a single result for Part 1.

Part 2: same as above, but with only 2 quizzes (one per module).

I'll update my Storyline version right away.

This discussion is closed. You can start a new discussion or contact Articulate Support.