xAPI Storyline 360
Jul 15, 2019
Hi,
I'm trying to track content from Storyline 360, initially sending it to SCORM Cloud for testing.
https://articulate.com/support/article/Implementing-Tin-Can-API-to-Support-Articulate-Content
Test file story attached.
1) In Publish > TinCan > Reporting, if you leave it as "story.html", SCORM Cloud will track most of the pages. However, it does not track the questions/responses, showing only the page opens and the results page (Capture.PNG). Why are the questions/responses not tracked?
2) I don't want to host in SCORM Cloud, so I need to be able to send the results to a proper LRS. No matter how I try to format the story.html launch URL, I get a 404 Page Not Found error.
Here's the base, unencoded version for clarity:
story.html?endpoint=https://cloud.scorm.com/lrs/xxxxxx/&auth=Basic xxxxxx&actor={"name": ["TEST"], "mbox": ["mailto:test@test.co.uk"]}
I have successfully used the ADL xapiwrapper.min.js to send statements from Storyline so I know the values for endpoint/auth are correct. I need to be able to track questions and responses and would rather not have to write the full statements for each question myself, seeing as that should be built-in functionality.
How do I format the xAPI launch page in Storyline?
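For reference, here's roughly how I'd expect a properly encoded launch URL to be built; the endpoint and auth values below are placeholders, and the actor has to go in as a URL-encoded JSON string:

```javascript
// Sketch: building a TinCan-style launch URL for story.html.
// Endpoint and auth values are placeholders, not real credentials.
function buildLaunchUrl(base, endpoint, auth, actor) {
  const params = new URLSearchParams({
    endpoint: endpoint,
    auth: auth,                   // e.g. "Basic " + base64-encoded credentials
    actor: JSON.stringify(actor), // actor must be a JSON string, URL-encoded
  });
  return base + "?" + params.toString();
}

const url = buildLaunchUrl(
  "story.html",
  "https://cloud.scorm.com/lrs/xxxxxx/",
  "Basic xxxxxx",
  { name: ["TEST"], mbox: ["mailto:test@test.co.uk"] }
);
console.log(url);
```

URLSearchParams takes care of percent-encoding the braces, quotes, and slashes that would otherwise break the query string.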
3) I've tried to use debug mode https://articulate.com/support/article/how-to-enable-lms-debug-mode
Why on earth does this only work with Flash output?
Thanks
Karl Manning
10 Replies
Hi there, Karl! Thanks for sharing your .story file.
Tracking by quiz result sends the question/response data to your LMS (here's a screenshot from SCORM Cloud):
Hi,
Thanks for the reply. Turning on track by quiz results has worked.
I'm also using the info from https://community.articulate.com/discussions/building-better-courses/guide-send-an-xapi-statement-from-storyline-360 which uses a JavaScript library to send other xAPI statements, for example when showing a layer, listening to audio, or watching a video.
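For anyone following along, a statement like the ones that library sends is just a JSON object; here's a rough sketch for a "viewed a layer" event (the activity ID and names are placeholders I've made up):

```javascript
// Sketch of a hand-built xAPI statement for a non-quiz interaction
// (viewing a layer). The activity ID and display names are placeholders.
const statement = {
  actor: { name: "TEST", mbox: "mailto:test@test.co.uk" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/experienced",
    display: { "en-US": "experienced" },
  },
  object: {
    id: "http://example.com/course/slide-1/hint-layer", // placeholder
    definition: { name: { "en-US": "Hint layer" } },
  },
};
console.log(JSON.stringify(statement));
```

With the ADL wrapper loaded, that object can then be sent with ADL.XAPIWrapper.sendStatement(statement).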
Karl
Perfect - glad to hear it, Karl! Thanks for linking to that other discussion too, as I'm certain it'll help anyone who stumbles across this.
Looking for an update about the debug mode for xAPI content... my organization will be discontinuing the use of Flash before the 'official' end of Flash support. Is there by chance a workaround to view the debug content?
Hi Jason!
Thanks for bringing this up! We have an active feature request logged for enabling LMS debug mode for HTML5.
You're right; when Flash is no longer supported this will not be an option. We're preparing for this change as well.
When I hear of an update, I'll be sure to report back to this discussion!
I'm curious to hear how people are obtaining the actor ID and email to send with xAPI statements, particularly if learners are accessing via an LMS. As far as I know, there's no way to pass that info from the LMS, but having students supply it separately would be onerous. Thoughts?
thanks!
andrew
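For context: when a package is launched TinCan-style with the actor in the query string, as described earlier in the thread, the course can in principle read it back; a sketch (the sample string below is a stand-in for window.location.search):

```javascript
// Sketch: with a TinCan-style launch, the LMS appends the learner's
// identity to the launch URL, and the course can recover it like this.
// The sample string stands in for window.location.search.
function getActorFromLaunch(search) {
  const params = new URLSearchParams(search);
  const raw = params.get("actor"); // JSON string, already percent-decoded
  return raw ? JSON.parse(raw) : null;
}

const sampleSearch =
  "?endpoint=https%3A%2F%2Fexample.com%2Flrs%2F&auth=Basic%20xxxxxx" +
  "&actor=%7B%22name%22%3A%5B%22TEST%22%5D%2C%22mbox%22%3A%5B%22mailto%3Atest%40test.co.uk%22%5D%7D";
const actor = getActorFromLaunch(sampleSearch);
console.log(actor.name[0]); // → "TEST"
```

Whether a given LMS actually launches the content this way is another question, but that's the mechanism the launch URL format is designed for.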
Hi Lauren,
I need the debug report for xAPI. Now that Flash is no longer supported, is there an update on implementing debug mode for xAPI?
Thanks for your prompt reply.
Hello Don!
I don't have an update to share on implementing debug for xAPI. As soon as I have an update, I'll make sure to share it in this discussion!
Bump on Andrew's question. I'm still baby-stepping into xAPI and am struggling to understand the logic of the language. I do see that my LMS is capturing activity from the xAPI package with the correct user's name. I'm trying to figure out how to write statements that will report a quiz result along with other statements.
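In case it helps, a quiz-result statement has the same actor/verb/object shape as any other, plus a result block carrying the score; a rough sketch (the activity ID and score values are placeholders):

```javascript
// Sketch: a hand-built statement reporting an overall quiz result.
// The activity ID and score values are illustrative placeholders.
const quizStatement = {
  actor: { name: "TEST", mbox: "mailto:test@test.co.uk" },
  verb: {
    id: "http://adlnet.gov/expapi/verbs/passed",
    display: { "en-US": "passed" },
  },
  object: {
    id: "http://example.com/course/quiz-1", // placeholder activity ID
    definition: { name: { "en-US": "Final quiz" } },
  },
  result: {
    score: { raw: 80, min: 0, max: 100, scaled: 0.8 },
    success: true,
    completion: true,
  },
};
console.log(JSON.stringify(quizStatement));
```

Per-question responses would typically use the "answered" verb instead, with the response text in result.response.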
Hello everyone!
Great news! We just released another update for Articulate 360 and included a few important fixes you'll see in the release notes.
The item you'll be interested in is:
New: Troubleshoot xAPI statements with an easy-to-use debugger. Find out which statements fail and why. Export your results and share them with your team for analysis.
Just launch the Articulate 360 desktop app on your computer and click the Update button for Storyline 360. Details here.
Please let us know if you have any questions, either here or by reaching out to our Support Engineers directly.