Forum Discussion
Storyline Suspend Data Compression
Good day!
As some of us know, SCORM 1.2 limits suspend data to 4096 characters. Storyline (360) compresses its data (e.g. SL variables and such) in order to fit within that limit, so there must be an underlying decompression algorithm, or a reader of its own, that Storyline uses to read the suspend data back.
My question: when this compressed suspend data is decompressed, is there any possibility of it hitting the 4096-character limit?
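For context, the value in question is what the LMS stores in cmi.suspend_data. A rough sketch of how to check how close the stored value is to the limit, assuming the SCORM 1.2 API adapter has already been found and is reachable as window.API (in a real LMS frameset it may live on a parent window):

    // Read the raw (compressed) suspend data string from the LMS and report its size
    var suspendData = window.API.LMSGetValue("cmi.suspend_data");
    console.log("cmi.suspend_data length: " + suspendData.length + " of 4096 allowed");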
- DavidHansen-b20 (Community Member)
Uh, first, those were suggestions for you to use to debug, not just to collect the info for someone else to analyze. If you get the error exception message from pako.inflate, then I might be able to help decode what it could mean.
Second, you need to actually include that output. It would be near the beginning of the debug log (just after the call to GetDataChunk). The file you attached as blank_Session_2.html does not start at the beginning, nor does it include the first GetDataChunk call that would contain the key error message.
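For reference, this is the kind of snippet that surfaces that exception (a minimal sketch; window.pako and the WriteToDebug helper are assumed from the scormdriver setup, and dataToInflate is just a placeholder for whatever is currently being passed to inflate):

    try {
        var strData = window.pako.inflate(dataToInflate, { to: 'string' });
    } catch (err) {
        // pako may throw a plain string or an Error object; concatenation captures either,
        // and this line in the debug log is the detail worth posting here
        WriteToDebug("pako.inflate exception: " + err);
    }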
- JanVilbrandt (Community Member)
Hi David,
first of all: A big thank you for your idea and sharing it with us.
My company is using the LMS "LSO" from SAP.
I ran into an error when using your code. I have included pako v2.0.4; maybe there is a bug in that package.
Second, there is a problem with your base64-encoded data. Your code does not compress the "binary data" (the zipped bytes from pako) but a text string, so the result isn't really compressed.
This is my solution, based on your idea (follow the instructions on page 1 of this conversation):
function getDataChunk() {
    .....
    try {
        var strDataC = objLMS.GetDataChunk();    // base64 string stored on the LMS
        WriteToDebug("GetDataChunk strDataCompressed=" + strDataC);
        var blob = atob(strDataC);               // base64 -> binary string
        var decarray = [];
        // binary string -> array of byte values for pako
        Object.values(blob).forEach(function (item) { decarray.push(item.charCodeAt(0)); });
        var strData = window.pako.inflate(decarray, { to: 'string' });  // decompress back to the original suspend data
        WriteToDebug("GetDataChunk strData=" + strData);
        return strData;
    } catch (err) {
        SetErrorInfo(ERROR_INVALID_RESPONSE, "DataChunk Inflate error: " + err);
        return "";
    }
}
function setDataChunk() {
    ...
    try {
        // strData comes from the surrounding scormdriver code elided above
        var strDataC = "";
        var compressed = window.pako.deflate(strData);   // Uint8Array of compressed bytes
        var blob = "";
        // bytes -> binary string so it can be base64-encoded
        Object.values(compressed).forEach(function (item) { blob += String.fromCharCode(item); });
        strDataC = btoa(blob);                           // binary string -> base64 for the LMS
        WriteToDebug("SetDataChunk strDataCompressed=" + strDataC);
        return objLMS.SetDataChunk(strDataC);
    } catch (err) {
        SetErrorInfo(ERROR_INVALID_RESPONSE, "DataChunk Deflate error: " + err);
        return "";
    }
}
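As a quick sanity check of the encode/decode chain above, the round trip can also be tested on its own, outside the scormdriver (a sketch; the function name testRoundTrip is just for illustration):

    function testRoundTrip(strData) {
        // deflate -> binary string -> base64 (what setDataChunk would send to the LMS)
        var compressed = window.pako.deflate(strData);
        var blob = "";
        Object.values(compressed).forEach(function (item) { blob += String.fromCharCode(item); });
        var strDataC = btoa(blob);
        // base64 -> byte array -> inflate (what getDataChunk does on resume)
        var bytes = [];
        Object.values(atob(strDataC)).forEach(function (item) { bytes.push(item.charCodeAt(0)); });
        var restored = window.pako.inflate(bytes, { to: 'string' });
        return restored === strData;
    }

Calling testRoundTrip with any sample string should return true if pako and the conversions are wired up correctly.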
The result works fine with Articulate Storyline 3 (version 3.15) and should work with Articulate Storyline 360 as well.
I tested with a training that creates about 7,300 bytes of suspend data.
The compressed data is only about 1,500 bytes.
The compression is great: the result is roughly 20% of the uncompressed size (1,500 / 7,300 ≈ 0.2).
Some additional notes on the instructions from page 1:
<script src="lms/API.js" charset="utf-8"></script>
is now
<script src="lms/scormdriver.js" charset="utf-8"></script>
The name of the file is now index_lms.html (not index_lms_html5.html).
Thanks again, David, for coming up with that idea.
Best wishes,
Jan
- JeremyTrott-098 (Community Member)
Is anyone able to post a solution that works with Tin Can/xAPI, please?
- NickMorrison (Community Member)
What's crazy to me is that this issue is STILL a thorn in everyone's side after all this time.
If a compression solution via JavaScript "freeware" is available, why can't Articulate just amend their programming to include this (or a similar fix) in their SCORM 1.2 export tool in the first place? We all want/need it.
This isn't something we should have to get into the guts of the SCORM package and "hack" just to make our courses work the way we (and our clients) want them to.
After all, it's not as though courses, tracking, and logging demands are getting smaller.
- BugnaitBugnait (Community Member)
Hi David, I want to use the pako.min.js file and I have followed all the instructions you suggested, but when I test the Articulate Storyline 360 SCORM 1.2 package on SCORM Cloud it is not working and my module gets stuck while starting. If possible, please share an example Storyline package with the pako.min.js implementation...
- JanVilbrandt (Community Member)
The big question is: have you read all the posts in this discussion?
If not: there has been a major update of pako.js.
The old code from the first posts isn't working anymore because of major changes in pako.js.
I have been using my code for over a year now and it works fine. You will find it in the later posts.
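To illustrate the kind of break involved (from memory, so treat the pako 1.x behaviour as my recollection rather than a changelog quote): pako 1.x could return a binary string directly from deflate, while pako 2.x returns a Uint8Array that you have to convert yourself, which is exactly what my code earlier in this thread does.

    // pako 1.x (old posts): deflate could emit a binary string directly, e.g.
    //   var blob = window.pako.deflate(strData, { to: 'string' });
    // pako 2.x (current posts): deflate returns a Uint8Array, so the byte-to-string
    // and base64 conversion has to be done by hand, as shown above.
    var compressed = window.pako.deflate(strData);   // Uint8Array under pako 2.x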
- DavidHansen-b20 (Community Member)
Yep, I concur.
This is the most relevant entry: Pako v2 update