Forum Discussion
Storyline & SCORM HELP
Is anyone else using Storyline and SCORM with their LMS? I'm trying to figure out how to get a report from the SCORM data to see which questions my participants got correct and which they did not.
3 Replies
- Jonathan_Hill (Super Hero)
Unfortunately, much depends on your LMS reporting engine and how your course was built. If you used standard question slides (not freeform), a SCORM report within your LMS may show the data you seek - but it likely won't be in a very user-friendly format.
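For reference, Storyline reports standard question slides through the SCORM 2004 cmi.interactions data model, so the raw record your LMS stores for each answered question looks roughly like this (the IDs and values below are made up for illustration, not taken from a real course):

```python
# Rough shape of one SCORM 2004 interaction record as an LMS might store
# it for a Storyline question slide. Field names follow the
# cmi.interactions data model; all values here are illustrative.
interaction = {
    "cmi.interactions.0.id": "Scene1_Slide3_MultiChoice_0_0",
    "cmi.interactions.0.type": "choice",
    "cmi.interactions.0.description": "Which menu opens the report builder?",
    "cmi.interactions.0.learner_response": "Tools",
    "cmi.interactions.0.correct_responses.0.pattern": "Reports",
    "cmi.interactions.0.result": "incorrect",
    "cmi.interactions.0.timestamp": "2024-05-01T14:32:10",
}
```

The `description` field is where the question text lives, which is why reports can feel cryptic when it isn't populated.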
Which LMS are you using?
- AutumnWeatherho (Community Member)
Hi Jonathan,
Thank you so much for your reply. I have to say, I’m a huge fan of your work on here! 😊
To give you a bit more context, our team creates most of our e-learning courses in Articulate Storyline, with the occasional course built in Articulate Rise. The majority of our content is software training, and we assess participants in various ways, including software simulations. We publish everything as SCORM 2004 4th Edition.
We manage a few different LMS environments. The first is PeopleSoft, and the second is an in-house system called TED. TED allows participants to find and register for training, and they’re then given access through an emailed link. Neither TED nor PeopleSoft currently tracks completions automatically, so we monitor participant progress through SCORM and then manually mark completions in the LMS.
Previously, we used Adobe Learning Manager, which offered clean, detailed reporting. By contrast, SCORM reporting has felt much more limited. When we manually pull a SCORM report for a course (based on a date range), we only see whether someone has “Completed” or “Passed” but not which questions they got right or wrong.
The only way to view question-level data is to open the registration record for each participant individually, click into the details, and review their status over time. This makes it really difficult to gather meaningful insights at scale, especially since we’re trying to implement Kirkpatrick Model Level 2 evaluations to better assess learning outcomes.
If you have any suggestions, tips, or reporting strategies that might help us extract question-level data more efficiently, I’d be incredibly grateful for your insight.
Thanks again for taking the time to respond!
Warm regards,
~Autumn
- Jonathan_Hill (Super Hero)
Hey Autumn, thanks for the additional info. I faced this issue when I used both Litmos and Docebo. My experience is that most LMS platforms suffer the same drawbacks when it comes to pulling legible, easy-to-interpret, question-by-question data from SCORM. To some extent, it's easier to 'see the Matrix' if you have designed the course yourself and recognise elements (such as slide IDs, object names, and content) that appear in the individual SCORM reports. With third-party content, this is trickier, and it's compounded when freeform questions are used: often, the question itself is not displayed in the report at all, just the answers. So being familiar with the content and the expected outputs is really important.
Back in the day, I figured out how to transfer the SCORM reports as CSV files into a macro-enabled spreadsheet that paired the data with the questions and presented it in a much more user-friendly format. I've also had similar success using Power BI to carry out this analysis, and I imagine it would be relatively easy to ask an AI to do this, too. But again, it relies on being familiar with the content and spotting the patterns.
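If it helps, here's a minimal sketch of that pairing idea in Python rather than a macro. It assumes your LMS export is a CSV with `interaction_id` and `result` columns, and that you maintain a small lookup of interaction IDs to question text built from your own course files; every name here is an assumption you'd swap for your real export:

```python
import csv
from collections import defaultdict

# Hand-maintained lookup pairing Storyline interaction IDs with question
# text, built once per course from the source file. IDs are hypothetical.
QUESTION_LOOKUP = {
    "Scene1_Slide3_MultiChoice_0_0": "Which menu opens the report builder?",
    "Scene1_Slide5_TrueFalse_0_0": "Reports can be scheduled: true or false?",
}

def summarise(report_csv):
    """Tally correct/incorrect answers per question from a SCORM export.

    Assumes columns named 'interaction_id' and 'result'; rename these
    to match whatever your LMS actually exports.
    """
    tally = defaultdict(lambda: {"correct": 0, "incorrect": 0})
    with open(report_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Fall back to the raw interaction ID if it isn't in the lookup.
            question = QUESTION_LOOKUP.get(row["interaction_id"], row["interaction_id"])
            outcome = "correct" if row["result"] == "correct" else "incorrect"
            tally[question][outcome] += 1
    return dict(tally)

if __name__ == "__main__":
    for question, counts in summarise("scorm_report.csv").items():
        print(f"{question}: {counts['correct']} right, {counts['incorrect']} wrong")
```

The same logic works in Power BI or a spreadsheet; the hard part is always the lookup table, which is why knowing the content matters.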
You may have more luck sidestepping SCORM entirely. If you're authoring your own content, xAPI data passed to a separate LRS might be easier to work with. Or link to a Google Form/Spreadsheet that captures the learner's responses in a similar way.
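To make the xAPI route concrete, here's a minimal sketch of the kind of "answered" statement a course can send to an LRS. The endpoint, credentials, and IRIs are all placeholders (and it uses the third-party `requests` library), but the statement structure and the `X-Experience-API-Version` header are standard xAPI:

```python
import requests  # third-party: pip install requests

# Placeholder LRS endpoint and credentials; substitute your own.
LRS_ENDPOINT = "https://lrs.example.com/xapi"
LRS_AUTH = ("lrs_key", "lrs_secret")

# A minimal xAPI "answered" statement, the sort of record one question
# attempt produces. All names and IRIs below are illustrative.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "A. Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/answered",
        "display": {"en-US": "answered"},
    },
    "object": {
        "id": "https://example.com/courses/report-builder/q1",
        "definition": {"name": {"en-US": "Which menu opens the report builder?"}},
    },
    "result": {"success": False, "response": "Tools"},
}

response = requests.post(
    f"{LRS_ENDPOINT}/statements",
    json=statement,
    auth=LRS_AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()  # the LRS returns the stored statement's ID
print(response.json())
```

Because every statement carries the question text and the learner's response, querying the LRS gives you the question-level view in one place instead of digging through each registration record.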