Demystifying SCORM/TinCan/Javascript reporting to SCORM Cloud

Jul 12, 2021

Our institution has switched over to Articulate for our learning materials from the now defunct iBooks, mainly using Rise and now Storyline. I've been leading the charge for a year now, but I'm still somewhat lost when it comes to reporting. We have two departments on two different LMSs - one has SCORM Cloud reporting integrated (this is their LRS, I assume); the other has no integrated reporting as of yet.

We are testing some non-conventional courses using SCORM Cloud, on both our institution's account and my personal one, and trying to make sense of the data. Here are results from a recent Storyline test (TinCan module):



This shows the interaction information the way we want, but the text of each response is replaced with "choice_(gibberish)", the time each answer was submitted is not shown, and the responses are out of order.

For comparison, here's what it looks like when we try a similar TinCan module, but exported from Rise:

While the overlapping text and "scormdriver" tags aren't ideal, this at least shows us the text of each answer, and appears to order them chronologically by when they were answered.

While our LMS's SCORM reporting shows this, it does not give us access to the xAPI LRS. My personal account does, and is a little more helpful:

Still no text for answer choices, but this at least shows everything chronologically.

So looking at the data we're getting, from the types of modules we're testing, I guess my main questions are:

1) Is there something I'm missing that would show the text of answer choices in Storyline, the way Rise does? I can locate these "choice_" tags in the tincan.xml file when publishing, but changing them manually seems to break the project altogether.
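For context on those "choice_" tags: in xAPI, a choice interaction carries both a machine id and the human-readable text. The ids Storyline generates become the `id` fields of the activity definition's `choices` array, and the display text sits alongside them in each choice's `description` language map; a report that shows only "choice_(gibberish)" is displaying the id and ignoring the description. A minimal sketch of resolving ids back to text (the ids and answer texts below are hypothetical examples, not from a real export):

```javascript
// Sketch: mapping opaque choice ids back to their display text, using the
// shape the xAPI spec defines for choice-interaction activity definitions.
const definition = {
  interactionType: "choice",
  choices: [
    { id: "choice_4fXt9Qa", description: { "en-US": "Paris" } },
    { id: "choice_8kLm2Zw", description: { "en-US": "London" } },
  ],
};

// Look up a choice id in the definition; fall back to the raw id if missing.
function choiceText(definition, choiceId, lang = "en-US") {
  const choice = definition.choices.find((c) => c.id === choiceId);
  return choice && choice.description[lang] ? choice.description[lang] : choiceId;
}
```

So the answer text usually *is* in the data the module sends; whether you see it depends on whether the reporting tool reads the choice descriptions.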

2) In the SCORM report, I'm guessing all the responses are submitted at the same time with the results slide, so they are not in any kind of chronological order. Is there a way to fix this, or submit answers to questions one at a time (short of making a results slide for each question)?

3) What is the difference between the SCORM interaction report, and the xAPI LRS? We are still very lost on how LRSs work, or how we could potentially integrate one (is SCORM Cloud considered an external LRS?) into our other LMS.

4) We've also looked at custom JavaScript/xAPI statements (again, we're not sure what the difference is) to get data from things like clicks and custom variables, but even the "plain terms" tutorials seem to be mountains of code that are frankly not in my wheelhouse, and many of them seem to be years old. Are there different techniques for writing statements between SCORM and TinCan? Is it as simple as adding a JavaScript trigger to get the variable we want, or do we need to write a .js file, etc.? Is a quiz with a results slide still necessary to use xAPI statements within Storyline, and is it possible to use xAPI statements AND still see the normal quiz question/answer data in reports? I have yet to get a JavaScript trigger to work in any capacity, even some of the basic ones I've tried from this forum.
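On question 4: in Storyline, "custom JavaScript" and "custom xAPI statement" usually describe the same thing - an Execute JavaScript trigger whose code builds a statement (plain JSON) and POSTs it to the LRS. A minimal sketch, where the actor, verb choice, activity URL, LRS endpoint, and credentials are all assumptions for illustration:

```javascript
// Sketch: an Execute JavaScript trigger body that sends one xAPI statement.
// Everything identifying (emails, URLs, credentials) below is hypothetical.

// Build the statement as plain JSON, per the xAPI statement structure.
function buildScoreStatement(score) {
  return {
    actor: { mbox: "mailto:learner@example.com", name: "Learner" },
    verb: {
      id: "http://adlnet.gov/expapi/verbs/scored",
      display: { "en-US": "scored" },
    },
    object: { id: "http://example.com/course/tRAT-quiz", objectType: "Activity" },
    result: { score: { raw: score } },
  };
}

// Inside published Storyline output, GetPlayer() exposes project variables:
//   var score = GetPlayer().GetVar("Score");
// and the statement is delivered with an ordinary HTTP POST:
//   fetch("https://cloud.scorm.com/lrs/YOUR-ENDPOINT/statements", {  // hypothetical URL
//     method: "POST",
//     headers: {
//       "Content-Type": "application/json",
//       "X-Experience-API-Version": "1.0.3",
//       "Authorization": "Basic " + btoa("key:secret"),  // your LRS credentials
//     },
//     body: JSON.stringify(buildScoreStatement(score)),
//   });
```

No separate .js file is required for a sketch like this; the trigger body is the code. A results slide matters for the *built-in* quiz reporting, not for statements you send yourself.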

Sorry for the megapost and flood of questions, but the whole process of changing something, publishing, uploading, testing, repeat can be very time-consuming and it feels like I've been going in circles for a year now.

Any help would be greatly appreciated!

9 Replies
Leslie McKerchie

Hi Ian,

Thanks for reaching out and sharing what you are experiencing with your reporting.

I'd like to understand how the project is set up as well to be sure I understand what I'm seeing here. You can share your project publicly here or send it to me privately by uploading it here. I'll delete it when I'm done troubleshooting.

Ian Donmoyer

Hi Leslie,

Thanks for responding. I think I may have answered at least one or two of my own questions this morning - publishing for SCORM 2004 gives us more of what we want in SCORM reporting than TinCan does (answer text, and latency times that seem to line up with the order answers were given). I guess the question then becomes: are we still able to do advanced stuff like custom JavaScript/xAPI calls with SCORM 2004, or is that TinCan only? I'm a bit confused, because everything I read points to TinCan having more detailed data reporting.
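One clarification that may help here: Execute JavaScript triggers run no matter which format you publish to. What changes is what your code can talk to - under SCORM 2004 the LMS exposes an API object named API_1484_11 on an ancestor frame, and custom code can reuse it once the course's driver has initialized the session. A sketch of the standard discovery walk (the SetValue usage in the comment assumes a live LMS session):

```javascript
// Sketch: locating the SCORM 2004 runtime API (API_1484_11) by walking up
// the frame hierarchy, per the standard SCORM API discovery approach.
function findSCORM2004API(win) {
  let w = win;
  while (w) {
    if (w.API_1484_11) return w.API_1484_11;
    // Move to the parent frame, or to the opener window if we hit the top.
    if (w.parent && w.parent !== w) w = w.parent;
    else w = w.opener || null;
  }
  return null;
}

// Usage inside a trigger (browser only, within an LMS-launched course):
//   var api = findSCORM2004API(window);
//   if (api) api.SetValue("cmi.score.raw", "85");
```

So custom JavaScript isn't TinCan-only; with SCORM 2004 it simply writes to the SCORM data model instead of posting xAPI statements.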

A lot of these questions come from more than one specific project, but I will put something together today that addresses the major issues and send it your way. In particular, we built a virtual tRAT quiz based on this thread:

But I'm unable to get the custom variable to report properly. The tutorial for reporting the custom variable ("Score") says you need a short answer survey question linked to the results slide, but this doesn't make a lot of sense because it requires the timeline to start on a slide that's supposed to be hidden altogether. Another tutorial I read, for adding a JavaScript trigger at the end, tells you to set completion to slides viewed rather than the quiz/results slide. But doing this, you get no question/answer reporting for quiz questions, and I wasn't able to get the trigger to report anything either way.

But I'll get back to you, thanks for replying so quickly!

Ian Donmoyer

Hi Phil,

We're trying a lot to see what works; we've been leaning toward TinCan since we hear it's the newer, more advanced reporting method, but as I said in my last post, our analysts will likely prefer SCORM 2004 if the reports show latency times and plain text for answers, rather than "choice_(randomlettersandnumbers)".

I was pretty sure you would need to visit the survey question slide for that to work but boy... the tutorial about that sure is misleading about "hiding the slide from learners." Perhaps shorten the length of the slide to a fraction of a second and have it advance automatically? So they *barely* see it?

I guess then our major confusion lies with JavaScript triggers and custom xAPI statements - whether there is a difference between them, and whether it would be possible to implement them while also getting the SCORM question/answer reporting detail we already know works.
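One way to untangle the two terms: a JavaScript trigger is just a snippet of code Storyline executes; a custom xAPI statement is one kind of payload that code might send. The bridge between your slides and your code is the player object. A minimal sketch, where the variable name "Score" is an assumption taken from the tRAT example in this thread:

```javascript
// Sketch: the player API available inside an Execute JavaScript trigger.
// GetPlayer() only exists inside published Storyline output, so it appears
// here in comments rather than as runnable code:
//
//   var player = GetPlayer();
//   var score  = player.GetVar("Score");      // read a Storyline variable
//   player.SetVar("Score", score + 1);        // write one back
//
// A small pure helper for coercing whatever comes back into a usable number
// before you report it (returns null for non-numeric values).
function asScore(value) {
  const n = Number(value);
  return Number.isFinite(n) ? n : null;
}
```

Sending a statement (or a SCORM SetValue) from such a trigger doesn't replace the built-in quiz reporting; both can happen in the same course.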

Phil Mayor

For the text entry question, if everything is at the start of one timeline, you wouldn't need to shorten the length of the timeline. All you would need is a trigger to adjust the variable and then submit the interaction; as long as feedback is turned off, it will jump immediately. It should barely be visible, and it can be masked by changing the look of the slide (you could also use the slide as a content slide, which would then not need hiding).

For the LMS side, some of this will come down to the reporting of the LMS. SCORM 2004 3rd Edition and above send question-level data. In Storyline, you will get better data if you use the graded questions over the freeform questions, as they often send additional interaction data.
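The "question-level data" Phil mentions travels through the cmi.interactions branch of the SCORM 2004 data model - that is where the answer id, learner response, result, latency, and timestamp in the reports come from. A sketch of the per-question calls a published driver makes, shown here as [element, value] pairs so the shape is easy to see (the sample values are hypothetical):

```javascript
// Sketch: the per-question SetValue calls under SCORM 2004, using the
// cmi.interactions data model elements. In a live session each pair
// becomes api.SetValue(element, value).
function interactionCalls(index, q) {
  const p = "cmi.interactions." + index + ".";
  return [
    [p + "id", q.id],
    [p + "type", q.type],                              // e.g. "choice"
    [p + "learner_response", q.response],
    [p + "result", q.correct ? "correct" : "incorrect"],
    [p + "latency", q.latency],                        // ISO 8601 duration, e.g. "PT8S"
    [p + "timestamp", q.timestamp],                    // when the answer was given
  ];
}
```

This is why a SCORM 2004 report can show latency and ordering even when every interaction is written out together at the results slide: each interaction carries its own latency and timestamp values.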

For Rise, I was under the impression it didn't send question-level data, but it appears I was wrong.

If you plan on using both Rise and Storyline, I think you are best not going the JavaScript route for xAPI statements, as you will only be able to implement it in Storyline; you will also get some odd data sent back, as Storyline will still attempt to send some statements of its own.

Ian Donmoyer

Actually there is a problem with using the text entry method to report the variable - I can't select the variable. In trying to duplicate this:

The variable (in my case, "Score") isn't listed in the dropdown, so I can't add it. In fact, none of the project variables are. At first I thought it was due to creating the text entry slide out of sequence with the others, but now no matter where I place the slide, I can't select the variable for this trigger:

Am I doing something wrong here?