Storyline 2: serious issue setting the score after resetting the results slide

Hi everyone. After we published several courses in SL2, our customers started reporting problems with the tracking status of the SCORM packages: a course status gets set to "completed" even when the actual score is lower than the passing score. We ran a careful debugging session on SCORM Cloud, using its debug log to find where the error occurs. It happens when you include a final test with the option to retry after a failed attempt (one with a score lower than the passing score). The trigger "Reset results x.x Results When the user clicks" generates a strange sequence of score assignments that increases until it reaches 100, which in turn sets the status to completed.

Example of the wrong assignments recorded in the log, after scoring 20/100 and clicking "Retry Quiz":

 In SetScore, intScore=20, intMaxScore=100, intMinScore=0
...
In SetScore, intScore=25, intMaxScore=100, intMinScore=0
...
In SetScore, intScore=33.33, intMaxScore=100, intMinScore=0
...
In SetScore, intScore=50, intMaxScore=100, intMinScore=0
...
In SetScore, intScore=100, intMaxScore=100, intMinScore=0

All of these assignments are sent in what looks like a looping error.
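For what it's worth, the logged values match the pattern 100 × 1/n for n = 5, 4, 3, 2, 1, as if one correct answer were being divided by a shrinking pool of tracked questions while the reset runs. A minimal sketch of that hypothesis (all names invented for illustration; this is not Articulate's actual code):

```javascript
// Hypothetical reproduction of the logged sequence: one correct answer
// out of a pool of tracked questions that shrinks by one each pass.
function computeScores(correct, totalQuestions) {
  const scores = [];
  for (let tracked = totalQuestions; tracked >= 1; tracked--) {
    // score = correct / still-tracked questions, as a percentage
    scores.push(Math.round((correct / tracked) * 10000) / 100);
  }
  return scores;
}

console.log(computeScores(1, 5)); // → [ 20, 25, 33.33, 50, 100 ]
```

The output matches the SetScore sequence in the log exactly, which at least suggests the result slide is recomputing the score while question-bank entries drop out of the denominator during the reset.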

I attached the log for inspection.

This bug invalidates the completion status of every SCORM package on the LMS, so we had to remove all Storyline 2 courses from the LMS catalogue, and we are in a difficult situation.

Has anyone experienced the same?

Thank you.

68 Replies
Zio Fonta

Update: the error also occurs without random selection; it concerns the question bank itself. When we "explode" the question bank and track the score on the final results slide by checking individual questions, score attribution works fine. So we can say for sure that the issue is related to the question-bank functionality.

Zio Fonta

Another update: it seems I'm the only one here interested in this issue, but believe me, the bug is very serious!

We found a difference in how Storyline 2 increments intInteractionIndex for quiz slides. While SL1 correctly assigned intInteractionIndex sequentially, starting from 0 and increasing by 1 for each user "submit" on a quiz slide, SL2 wrongly uses a fixed value of intInteractionIndex=10 for every submit. This generates tracking problems.
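For reference, in SCORM 1.2 each submitted question is supposed to land in the next free slot of the `cmi.interactions.n.*` collection, with `cmi.interactions._count` telling you the next index. A sketch of the sequential bookkeeping SL1 apparently did (the cmi element names and LMSGetValue/LMSSetValue come from the SCORM 1.2 spec; everything else here, including the fake API object, is invented for illustration):

```javascript
// Record a quiz interaction under the next sequential index (0, 1, 2, ...),
// the way SCORM 1.2 expects. "api" stands for whatever object wraps the
// LMS's SCORM 1.2 API; this is not actual Storyline code.
function recordInteraction(api, id, learnerResponse, result) {
  // SCORM 1.2 exposes the next free slot via cmi.interactions._count
  const index = parseInt(api.LMSGetValue("cmi.interactions._count"), 10);
  api.LMSSetValue("cmi.interactions." + index + ".id", id);
  api.LMSSetValue("cmi.interactions." + index + ".student_response", learnerResponse);
  api.LMSSetValue("cmi.interactions." + index + ".result", result);
  return index;
}

// Minimal in-memory stand-in for a SCORM 1.2 API, just to exercise the logic.
function makeFakeApi() {
  const data = {};
  return {
    LMSGetValue: (key) => {
      if (key === "cmi.interactions._count") {
        // count = number of distinct interaction ids written so far
        return String(Object.keys(data).filter((k) => k.endsWith(".id")).length);
      }
      return data[key] || "";
    },
    LMSSetValue: (key, value) => { data[key] = String(value); return "true"; },
    data,
  };
}
```

If SL2 really pins the index at a constant (10) for every submit, each new answer would overwrite `cmi.interactions.10.*` instead of appending, which would explain losing per-question tracking.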

Is it possible that we are the only ones experiencing this problem? How are all of you tracking correctly on your LMS?

Emily Ruby

Hello Zio!

Are you testing this course in SCORM Cloud, or just through your LMS? What LMS are you using, in case others here have had similar issues?

Have you tried importing the slides into a new project file?

I made a quick test course with the option to retake the quiz, and no matter how many times I hit Retake, the final result showed as incomplete in the SCORM Cloud test and the latest failed test score was reported.

Would you be able to share your file for us to test as well?

Jeffrey Hurt

So glad I found this thread!!!

This is happening to us right now and causing all kinds of issues. After a user fails the course, they are taken to a "Sorry" page with two buttons: "Retake the Quiz" and "Review the Course." If the user exits the course without clicking either of the buttons that contain the "Reset Results Slide" trigger, the course remains "In Progress." As soon as the user clicks one of those buttons, a command is sent to the LMS that the course is complete, but complete with a 0% score.

If the user proceeds to retake the quiz and passes (or continues until they pass), it probably wouldn't even be noticed. However, if the user stops in the middle of the quiz for some reason (or chooses "Review the Course" and decides to leave in the middle), then they receive a "Complete" on their transcript.

While these cases are easy for us to spot because of the "Complete" status with 0%, it's not as easy telling someone, "Hey, you didn't REALLY pass this mandatory compliance course, so can you please go back in and take it until you pass?" The course was originally upgraded from SL1, but the slides have since been recreated from scratch in SL2. Even adding that "Reset Results" trigger to a timeline causes the "complete" status to be sent.

Jeffrey Hurt

https://community.articulate.com/discussions/articulate-storyline/can-i-stop-storyline-reporting-a-score-to-the-lms?page=3

I wish I knew what to modify in SCORMFunctions.js to keep this zero score from being sent as complete.

Testing on SCORM Cloud, and even our internal tests, has a hard time pinpointing the problem. The 0% complete is sent as soon as the user clicks a button containing the "Reset Results Slide" trigger in association with a question bank. It is very difficult to isolate this call, but I believe the answer could be tweaking the SCORMFunctions.js file -- only I don't know how or where to make the modifications.
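I don't know the internals either, but the debug log shows score writes funnel through a SetScore(intScore, intMaxScore, intMinScore) function. One defensive (and entirely unsupported) idea would be to hand-patch the published output so that whatever function reports status only forwards "completed"/"passed" when the last score met the passing threshold. A sketch under stated assumptions: SetScore appears in the debug log, but SetStatus, the wrapping approach, and PASSING_SCORE are all my guesses, not known Articulate internals:

```javascript
// Unsupported workaround sketch, not actual Storyline output.
var PASSING_SCORE = 80;  // assumed course passing threshold
var g_lastScore = null;  // last score SetScore saw

function wrapScormFunctions(fns) {
  var origSetScore = fns.SetScore;
  var origSetStatus = fns.SetStatus;

  fns.SetScore = function (intScore, intMaxScore, intMinScore) {
    g_lastScore = intScore;  // remember what was last reported
    return origSetScore(intScore, intMaxScore, intMinScore);
  };

  fns.SetStatus = function (status) {
    var claimsDone = status === "completed" || status === "passed";
    if (claimsDone && (g_lastScore === null || g_lastScore < PASSING_SCORE)) {
      // Refuse to report completion for a failing (or missing) score.
      return origSetStatus("incomplete");
    }
    return origSetStatus(status);
  };
  return fns;
}
```

Even if the real function names differ, the pattern (intercept the status write and veto it against the last score) should survive translation; of course it treats the symptom, not the question-bank bug itself.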

Jeffrey Hurt

We are using Intuition as our LMS vendor. It reads straight from the SCORM data being passed and doesn't apply any internal validation: if the course sends a 0% complete, it accepts and logs that. The issue has been that a user completes a question-bank quiz and gets a failing score (say 35%), which is sent to the LMS as 35% and "In Progress" because they haven't met the criteria. As soon as the person clicks the button containing the Reset Results Slide trigger, a "Complete" status is automatically sent to the LMS with a 0% score.

Thanks for your assistance, Emily!

Sara G

I'm so glad I found this thread. Otherwise we'd still be thinking the issue was with our LMS vendor! I converted our certification exam from SL1 to SL2, and revised and added some question slides. It does contain question banks.

I see the only suggestion offered by Articulate is to do away with question banks. In our case, this concerns a certification. You can't just change the structure of a certification without thinking about its validity and legal implications! 

Does anyone know if this issue applies only if you "convert" a file from SL1 to SL2? I'm willing to entertain the possibility of recreating the question banks in SL2, basically starting from scratch. Before I spend hours doing that, I'd like to know whether that will work. However, this is not sustainable: we have a number of courses with knowledge checks that are also prerequisites for this certification.

This is an issue that Articulate has to address immediately!