Tin Can API Problems on Quiz Retake

Jun 09, 2015

Hello!

I've recently had problems with a course published for Tin Can API in Storyline 2. The course is graded on a quiz at the end, with a results slide. The first time the user takes the quiz, the course correctly sends the data to the LMS. When the user retakes the quiz and reaches the results slide, the course doesn't send the new scoring data to the LMS. If the user then closes the course and bookmarks back to the results page, the course will send the newly updated scoring data to the LMS. Here's an example:

User: Takes and fails the quiz with a 0% score. Closes the course.

User: Opens the course again and re-takes the quiz and passes with a 100% score. Closes the course. At this point the LMS will still read a 0% score.

User: Opens the course a third time and closes it. Without doing anything else, the course sends the previous scoring data of 100% to the LMS, and the LMS reflects the most recent score.

Naturally, I would like the scoring data to be sent to the LMS without having to open the course a third time.

14 Replies
Andrew Downes

Seems like odd things are happening here. Tin Can follows a journalling model, in contrast to SCORM's status model, so it's equally odd that the score changes to 100% on the third attempt if the learner hasn't completed the assessment that time.
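
To illustrate the journalling model with a minimal sketch (the actor, verb, and activity IRIs below are placeholders I've made up, not what Storyline actually emits): each attempt appends a new statement to the stream, so a retake should surface as a fresh statement carrying the new score, never as an overwrite of the old one.

```python
import json

# Minimal sketch of Tin Can's journalling model: every quiz attempt
# appends a new immutable statement to the stream, unlike SCORM's
# single overwritable status field. All IRIs are placeholders.

def make_statement(verb, scaled_score, success):
    return {
        "actor": {"mbox": "mailto:learner@example.com"},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/" + verb},
        "object": {"id": "http://example.com/course/final-quiz"},
        "result": {"score": {"scaled": scaled_score}, "success": success},
    }

# The stream after the two attempts described above:
stream = [
    make_statement("failed", 0.0, False),  # first attempt: 0%
    make_statement("passed", 1.0, True),   # retake: 100%
]

# An LMS reporting the latest attempt should read the newest statement:
latest = stream[-1]
print(json.dumps(latest["result"]))
```

If the LMS still shows 0% after the retake, either the second statement was never sent or the LMS isn't reading the newest statement, which is why seeing the raw stream helps narrow it down.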

Do you have access to the stream of Tin Can statements being generated? This will help to give an indication of where the problem might lie. 

Andrew

Ashley Terwilliger-Pollard

Hi Tony,

I'd also suggest sharing which version of Storyline you're using, as Storyline 1 only supported Tin Can 0.9, while Storyline 2 offers support for Tin Can 1.0. Providing a bit more information here could help the community determine what may be happening. I'd also suggest testing this outside of your LMS in a site such as SCORM Cloud, which is an industry standard for testing LMS content.

Tony Burkhardt

Hi Ashley and Andrew! Thank you for replying!

As I said in the OP, this is a Storyline 2 course. I do all of my Tin Can testing on SCORM Cloud as well. 

Andrew, I'm not completely certain where I would find the Tin Can statements that you are asking for. Are these the statements shown in the "View Registration State" page from SCORM cloud?

Tony Burkhardt

Hello again!

I'm sorry, we are not able to share the course. However, one of my co-workers found a solution. The Submit results trigger on the results slide for the final assessment was the problem for this course. To fix it, the Submit results trigger was moved to a Continue button on the last question of the final assessment, ordered below the Submit interaction trigger. This forces Storyline to send the scoring information on slide change. If you leave the Submit results trigger on the results slide, Storyline seems to send the scoring information differently when you retry the quiz, and the LMS may not receive the new scoring information.

Andrew Downes

Glad you got it sorted!

A suggestion for next time: if you're not able to share your content, you can always make a quick unbranded course with just a few slides and dummy text to replicate your problem, and share that. If you can't replicate the problem in the dummy course, that might also help you diagnose the cause of the problem!

Drew Bertola

Sorry to dredge up an old post, but there was no resolution (only a workaround). I'm on Storyline 2 Update 7 and see this exact same behavior. Consider these three cases:

1. When the user fails the final quiz first, then retakes the course and passes, only a single "Passed" statement is sent for the quiz object - i.e. no result.completion statement is sent.  

2. When the user passes the quiz initially, there are two passed statements sent - one for the quiz and one for the course (with result.completion).

3. When the user fails initially, then closes (reloads) the course, then passes on the first attempt of the reload, behavior matches case 2 for the second attempt.
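
For reference, here is roughly what the two "passed" statements from case 2 look like next to case 1, along with the completion check that case 1 fails. This is a sketch; the activity IRIs are placeholders I've made up, not the actual IDs Storyline publishes.

```python
# Illustrative reconstruction of the statements: case 2 emits both a
# quiz-level and a course-level "passed", case 1 only the former.
# All IRIs are placeholder values.

quiz_passed = {
    "verb": {"id": "http://adlnet.gov/expapi/verbs/passed"},
    "object": {"id": "http://example.com/course/final-quiz"},
    "result": {"score": {"scaled": 1.0}, "success": True},
}

course_passed = {
    "verb": {"id": "http://adlnet.gov/expapi/verbs/passed"},
    "object": {"id": "http://example.com/course"},
    "result": {"completion": True, "success": True},  # the missing statement
}

case_1 = [quiz_passed]                 # fail, then pass on retake
case_2 = [quiz_passed, course_passed]  # pass on first attempt

def course_completed(statements):
    """True if any statement carries result.completion == True."""
    return any(s.get("result", {}).get("completion") for s in statements)

print(course_completed(case_1), course_completed(case_2))
```

Any reporting that keys off `result.completion` will therefore see case 2 as complete but never case 1, even though the learner passed in both.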

Our environment is Learning Locker LRS plus a custom front end LMS.  Courses are opened in an iframe via the frontend.

Our testing via Chrome with the Network inspector, filtering for "statement", shows that the LRS is correctly recording all the statements emitted by the course.
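
For anyone who wants to double-check the stream without the browser inspector, the statements can also be pulled straight from the LRS over the standard xAPI statements resource. A sketch of building that request (the endpoint, activity ID, and credentials are placeholders for your own LRS values):

```python
import base64
from urllib.parse import urlencode

# Sketch of an xAPI GET /statements request against an LRS.
# Endpoint and credentials below are placeholders, not real values.
LRS_ENDPOINT = "https://lrs.example.com/xapi/"

def build_statements_request(activity_id, username, password):
    # Filter the stream to statements about one activity.
    query = urlencode({"activity": activity_id, "limit": 25})
    url = LRS_ENDPOINT + "statements?" + query
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    headers = {
        "Authorization": "Basic " + token,
        "X-Experience-API-Version": "1.0.0",  # SL2 publishes Tin Can 1.0
    }
    return url, headers

url, headers = build_statements_request(
    "http://example.com/course/final-quiz", "key", "secret")
print(url)
```

Sending that with any HTTP client returns the raw JSON statement stream, which makes it easy to diff what Learning Locker stored against what SCORM Cloud stored for the same course.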

I'm not fond of using a workaround to fix an issue in some 200+ courses.  It takes a good month to republish and test.

Ashley Terwilliger-Pollard

Hi Drew,

In Tony's setup it was a specific trigger and its location that were causing the issue, so I'm not certain it was a problem with the course or with Tin Can in general. It sounds like you're seeing information passed from Storyline to your LRS, but it's not the information you'd want? Did you also follow the steps to test at SCORM Cloud, to compare that with your LRS and what information is being shared?

Drew Bertola

For the second part of your response: yes, it is not the information that we want. What we want is a consistent, Tin Can-compliant way to know whether a course is passed. There are two cases here as well: (1) a course has multiple quizzes, or (2) it has a single quiz. With Tin Can, a passed statement for one quiz in a multi-quiz course doesn't indicate course completion; hence the second passed statement, which indicates that the entire course is passed. Articulate is inconsistent in issuing the latter. Please fix it. Indications were that it was fixed in Storyline 2 Update 7, but that is not the case. We spent bucketloads of money republishing all courses (and on license purchases) only to find out that we needed a hack to make it work as required by the spec.
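
To spell out why the verb alone is ambiguous in the multi-quiz case, here's a sketch; the activity IRIs are placeholders, and the rule (a "passed" statement about the course activity itself signals course-level success) is how we interpret the spec, not anything Articulate documents:

```python
# Sketch: two quiz-level "passed" statements do not imply the course
# is passed; only a statement about the course activity itself does.
# All activity IRIs are placeholder values.
COURSE_ID = "http://example.com/course"

statements = [
    {"verb": {"id": "http://adlnet.gov/expapi/verbs/passed"},
     "object": {"id": COURSE_ID + "/quiz-1"}},
    {"verb": {"id": "http://adlnet.gov/expapi/verbs/passed"},
     "object": {"id": COURSE_ID + "/quiz-2"}},
]

def course_is_passed(stmts):
    # Only count "passed" statements whose object is the course itself.
    return any(
        s["verb"]["id"].endswith("/passed") and s["object"]["id"] == COURSE_ID
        for s in stmts
    )

# Both quizzes passed, but no course-level statement has been emitted:
print(course_is_passed(statements))
```

Without that course-level statement being emitted reliably, an LMS front end has no spec-clean signal to flip a course to "passed".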

Ashley Terwilliger-Pollard

Hi Drew,

Thanks for sharing that information here. After discussing with some colleagues, we'd like to take a look at the Tin Can debug logs you've generated from your LMS, in comparison with what our team can see while testing the same course at SCORM Cloud. I went ahead and opened a case on your behalf, so someone will be in touch shortly with a link to upload your files and a reminder of how to pull the logs (detailed here if you want to get a jump start).

Leslie McKerchie

Hello Rakesh and welcome to E-Learning Heroes :)

Thanks for reaching out. This conversation is a bit dated, and I'm not quite sure what you need help with.

It sounds like you would like to get data on the users taking your quiz?

Are you publishing to an LMS? That would be the recommended way to track this data.

This discussion is closed. You can start a new discussion or contact Articulate Support.