Storyline 3 Update & TinCan LRS Reporting Changes?

Mar 29, 2018

Hi everybody,

I first noticed this problem starting on March 9th, right after the SL3 update. Since the update, I've had to republish all of our classes (for unrelated reasons), and all of the updated versions are now exhibiting this problem. I suspect it is related to the update, but if you have any suggestions on how to fix it, that would be amazing!

Our courses have various multiple-choice graded questions throughout (5 scenes with at least 5 question slides each), plus a 25-question final exam at the end, which is a random draw from a question bank.

Whenever a student reopens the course, it re-reports to the LRS every interactive slide they have already experienced. For example, if a student completes the first two scenes and logs off, then logs back in, the LRS will report:

"Student attempted Course A"

"Student experienced Question 1"

"Student experienced Question 2"

and so on, for at least 10 questions - all within the same second.

For one student, she just experienced 79 slides in the same second - wow!
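For anyone who wants to see it concretely, here's roughly the shape of one of those re-sent statements - a sketch in TypeScript with placeholder names and IDs, not values from our actual course. The only fields that differ from the original report are the statement id and timestamp; the actor, verb, and activity id are identical:

```typescript
// Sketch only: placeholder names and IDs, not data from our actual course.
// Each re-sent statement gets a brand-new statement id and timestamp, but
// the actor, verb, and activity id match the original report exactly.
interface XapiStatement {
  id: string;                                   // new UUID every time the slide re-reports
  actor: { mbox: string; name: string };
  verb: { id: string; display: { "en-US": string } };
  object: { id: string; definition: { name: { "en-US": string } } };
  timestamp: string;
}

const resentStatement: XapiStatement = {
  id: "00000000-0000-0000-0000-000000000002",   // differs from the original statement
  actor: { mbox: "mailto:student@example.com", name: "Student" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/experienced",
    display: { "en-US": "experienced" },
  },
  object: {
    id: "http://example.com/courseA/question-1", // same activity id as the first report
    definition: { name: { "en-US": "Question 1" } },
  },
  timestamp: "2018-03-09T14:05:02Z",             // the duplicates all land in the same second
};
```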

This doesn't actually cause any complications with our reporting, as all slides still report correctly when they are first viewed (and on subsequent views, too), but as you can imagine, it's a lot of unnecessary noise for our LRS.

We are in the process of switching to GrassBlade LRS, but we still have an active SCORM Cloud account - the reporting issue is consistent across both LRSs.

Has anyone else experienced this, or does anyone have suggestions on how to fix it?

**UPDATE** I forgot to mention: we do track progress and completion using quiz results, but that results slide is tied only to the questions in the final exam, which is set to randomly draw 25 questions. It is never tied to the other questions throughout the course.

5 Replies
Katie Riggio

Hi, Michelle!

Thanks for bringing this to our attention! Have you had a chance to test one of your files on SCORM Cloud? I haven't come across a similar instance, but I created a case on your behalf for our Support Engineers so we can investigate this reporting issue with you! You should receive a confirmation email with a link to upload your file shortly.

Somebody from the team will be in touch with you directly, and I'll follow along as well!

Crystal Horn

Hey Michelle. I checked in on this issue as well, and I wanted to let you know what I saw.

When I tested a sample xAPI course in SCORM Cloud, I suspect I saw the same behavior you did. Even though the reporting was correct after resuming halfway through, the xAPI Statement Viewer showed duplicate activity IDs for what I had completed prior to resuming.

Most of our LMS protocols perform "duplicate" reporting, but the LMS should not interpret that information as new attempts from a completion or scoring standpoint. If you're running into any issues with inaccurate reporting, certainly let me know! Or if these statements are creating extra work for you, tell me about it so we can understand the impact.

I'm going to add these notes to the case Katie created so we can confirm this with our support team!

Michelle D

Hi Katie & Crystal,

Thank you both for your responses! I did receive confirmation about the support case being filed as well.

Crystal, are you saying that the duplicates are always/have always been there, but should be "removed" (or filtered out) by the LMS?

As I mentioned above, it's not necessarily producing anything that is inaccurate, but we do frequently review our LRS for various reasons, and the sheer quantity of duplicates is... inconvenient, especially if a student is experiencing some kind of trouble and closing/reopening several times - in a situation like that, it can easily exceed several pages on our LRS.
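In case it's useful context, the workaround I've been considering for our own reviews is to collapse the duplicates before reading the export - something like this sketch. The `Stmt` shape and the one-minute window are my own assumptions, not anything from Storyline or GrassBlade:

```typescript
// Rough sketch: collapse re-sent statements by keeping only the first
// occurrence of each (actor, verb, activity) triple within a time window.
// The 60-second window is an arbitrary choice, not anything from the spec.
type Stmt = {
  actor: { mbox: string };
  verb: { id: string };
  object: { id: string };
  timestamp: string;
};

function dedupe(statements: Stmt[], windowMs = 60_000): Stmt[] {
  const lastSeen = new Map<string, number>();
  const kept: Stmt[] = [];
  const ordered = [...statements].sort(
    (a, b) => Date.parse(a.timestamp) - Date.parse(b.timestamp),
  );
  for (const s of ordered) {
    const key = `${s.actor.mbox}|${s.verb.id}|${s.object.id}`;
    const t = Date.parse(s.timestamp);
    const prev = lastSeen.get(key);
    if (prev === undefined || t - prev > windowMs) kept.push(s); // first sighting in window
    lastSeen.set(key, t);
  }
  return kept;
}
```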

Our older classes on SCORM Cloud are not exhibiting this behavior - only the ones republished since the SL3 update. However, we are about to go live with a new LMS & LRS, where all of our courses have been updated to the most recent version of SL3, and all of them are experiencing this reporting issue - so, if there are unforeseen problems with it, they may surface quickly.

With a minimal number of students testing, I have yet to see it produce incorrect information in any of our final exam reporting, but that is where my concern lies, as our students are required to pass a final exam per state legislation.

I could imagine a situation where, if a student completed steps in a particular order, it might affect the score. For example:

  • Fail the exam and choose to "retake exam," which engages the "reset results" trigger
  • Progress through a question or two
  • Close the course
  • Reopen the course (thus causing all duplicates to report again and possibly contribute to the final score)
  • Achieve a passing score illegitimately

Is this an unreasonable concern, or are there preventative measures in place to limit this?
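To make the concern concrete, this is the kind of audit I have in mind for our own data - a sketch that flags any question reporting a scored result more than once inside the same attempt. The `context.registration` and `result.score` fields come from the xAPI spec; everything else here is a placeholder:

```typescript
// Rough sketch of an audit: flag any question activity that reports a
// scored result more than once within the same attempt (registration).
type ScoredStmt = {
  actor: { mbox: string };
  object: { id: string };               // the question's activity id
  context?: { registration?: string };  // one registration UUID per attempt, per the xAPI spec
  result?: { score?: { raw?: number } };
};

function suspiciousRepeats(statements: ScoredStmt[]): string[] {
  const counts = new Map<string, number>();
  for (const s of statements) {
    if (s.result?.score?.raw === undefined) continue; // only scored answers matter here
    const key = `${s.actor.mbox}|${s.context?.registration ?? "no-registration"}|${s.object.id}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  // Anything counted more than once is worth a manual look.
  return [...counts.entries()].filter(([, n]) => n > 1).map(([key]) => key);
}
```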

Thank you again, ladies! Looking forward to an update on what's happening and what we can do about it.

Ashley Terwilliger-Pollard

Thanks, Michelle. I shared this information with my colleague Ronaziel, who's working on your support case. Did you get the email with the link where you can upload your project file? That'll be the next step in troubleshooting this. If you didn't see it, let me know, or check your junk/spam folder for something from Support@articulate.com.
