Forum Discussion

ChristianOmpad's avatar
ChristianOmpad
Community Member
6 years ago

Storyline Suspend Data Compression

Good day!

As some of us know, SCORM 1.2 limits suspend data to 4096 characters. Storyline (360) compresses its data (e.g. SL variables and such) in order to fit within that limit, so there must be an underlying decompression algorithm or a unique reader in SL that reads the suspend data.

My question is: when this compressed suspend data is decompressed, is there a possibility of it hitting the 4096 limit?

  • If you are willing to open and manipulate files in the SCORM package, you can apply an easy patch that adds data compression to the suspend data sent to the LMS.  Note: Articulate suspend data is NOT compressed.  It is, however, not human readable.  But it is VERY compressible, typically achieving a 10:1 ratio with a typical zlib compression method.
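
    That 10:1 figure is easy to sanity-check. Here is a minimal Node.js sketch using the built-in zlib module (pako emits the same zlib stream format); the payload string is a made-up stand-in for real suspend data, which is less repetitive, so treat the ratio as illustrative:

    ```javascript
    var zlib = require("zlib");

    // Made-up stand-in for suspend data: repetitive per-slide state records
    var record = "2X4a0b1c0d1e0f~";
    var suspendData = record.repeat(2000); // ~30 KB

    // Same zlib "deflate" format that pako.deflate produces
    var compressed = zlib.deflateSync(suspendData);

    console.log("original:", suspendData.length, "bytes");
    console.log("compressed:", compressed.length, "bytes");

    // Round-trip check: inflate restores the exact original string
    console.log(zlib.inflateSync(compressed).toString() === suspendData); // true
    ```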

    Note: the following suggestion is only for people comfortable with unzipping/zipping their SCORM package and doing basic edits to text-based XML, HTML, and JavaScript files.  If this is not you, then you should not attempt this, or you should find someone who can help you with it.  A good XML/HTML/JS-savvy editor is also beneficial; the Brackets editor is a good example.

    What you need to do:

    1) Obtain the pako.min.js package. This is an open-source, well-developed and reviewed, time-tested zlib compression library written purely in JavaScript.  You can google it and download just that file, or download the latest version right from the repository using this link: pako.min.js.   You are now going to add this file to your SCORM package (zip archive).

    2) Unzip your SCORM course into a directory and change your working directory there.

    3) Put a copy of the pako.min.js file into the lms/ subdirectory.

    4) Next edit index_lms_html5.html and search for "lms/API.js".  You should find something that looks like this:

    <script src="lms/API.js" charset="utf-8"></script>

    Then add this new line after that line:

    <script src="lms/pako.min.js"></script>

    Save the changes.

    5) Next edit imsmanifest.xml, and go to the end of the file.  Just before the line </resource>, add a new line with:

    <file href="lms/pako.min.js" />

    Save the changes.

    You have now successfully added the zlib compression library into your SCORM package.   All you need to do now is modify the routines that are used to send and receive the suspend data with the LMS.  To do that:

    6) Edit the file lms/API.js

    Search for "function GetDataChunk".  Replace the line containing return objLMS.GetDataChunk(); with the following lines:

    try {
        var strDataC=objLMS.GetDataChunk();
        WriteToDebug("GetDataChunk strDataCompressed="+strDataC);
        var strData=window.pako.inflate(atob(strDataC), {to: 'string'});
        WriteToDebug("GetDataChunk strData="+strData);
        return strData;
    } catch (err) {
        SetErrorInfo(ERROR_INVALID_RESPONSE, "DataChunk Inflate error");
        return "";
    }

    Then scroll down a bit to the next function which should be "function SetDataChunk".  Replace the line containing return objLMS.SetDataChunk(strData); with the following lines:

    try {
        var strDataC=btoa(window.pako.deflate(strData, {to: 'string'}));
        WriteToDebug("SetDataChunk strDataCompressed="+strDataC);
        return objLMS.SetDataChunk(strDataC);
    } catch (err) {
        SetErrorInfo(ERROR_INVALID_RESPONSE, "DataChunk Deflate error");
        return "";
    }

    Save your work.
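
    Taken together, the two edits form a matched pair: SetDataChunk deflates and base64-encodes, while GetDataChunk base64-decodes and inflates. The following standalone Node.js sketch of that round trip uses the built-in zlib module in place of pako; the function names are illustrative, not part of the patch:

    ```javascript
    var zlib = require("zlib");

    // What the patched SetDataChunk does: deflate, then base64-encode
    function compressChunk(strData) {
        return zlib.deflateSync(strData).toString("base64");
    }

    // What the patched GetDataChunk does: base64-decode, then inflate
    function decompressChunk(strDataC) {
        return zlib.inflateSync(Buffer.from(strDataC, "base64")).toString();
    }

    var original = "slide1:viewed;slide2:viewed;".repeat(500);
    var wireForm = compressChunk(original);

    console.log(wireForm.length < original.length);      // true for repetitive data
    console.log(decompressChunk(wireForm) === original); // true: lossless round trip
    ```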

    At this point you are now done with the modification to add compression to your suspend data, so you can now:

    7) Zip the contents of your SCORM package back up.

    Some caveats about this modification: the format of the Articulate suspend data is a sequence of information matching the sequence of slides in your course.  If the data happens to get truncated, Articulate can still process the data up to the point of truncation and resume at the last slide for which data survived.  This means resuming will still "kind of" work, just not exactly.  However, if the compressed data gets truncated, the compression algorithm will fail completely and ALL the resume data will be thrown out (and you'll resume at the first slide).  For me, this is a more than worthwhile trade-off, especially since in my experience I typically see a 10:1 reduction in suspend data size - 30 KB of suspend_data becomes just 3 KB, under the default SCORM 1.2 size limit for suspend_data (4096 characters).
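
    The all-or-nothing failure mode described above is easy to demonstrate: cutting even a few bytes off the end of a zlib stream makes inflation fail outright rather than return partial data. A Node.js sketch with a made-up payload:

    ```javascript
    var zlib = require("zlib");

    var data = "slideA~slideB~slideC~".repeat(200);
    var compressed = zlib.deflateSync(data);

    // Simulate an LMS truncating the stored value: drop the last 4 bytes
    var truncated = compressed.subarray(0, compressed.length - 4);

    try {
        zlib.inflateSync(truncated);
        console.log("inflate succeeded");
    } catch (err) {
        // The whole chunk is rejected; nothing is recoverable
        console.log("inflate failed");
    }
    ```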

    Good Luck.

    As to why Articulate development hasn't just added something like this to their course exporter?  Anybody's guess, I suppose...

    • samer's avatar
      samer
      Community Member

      Thanks David for the solution! 

      Can I just copy and paste the lms/API.js to a different course, so I don't have to keep editing?

  • Hi Christian...

    Were you able to solve it? Certainly, the options offered by Articulate are not really viable.

    I'm interested in your option 3 - "Only set important slides' settings to resume".

    Regards!

    • ChristianOmpad's avatar
      ChristianOmpad
      Community Member

      Hello Zoe,

      Since our LMS only uses SCORM 1.2, I had to make do with the 4096-character limit (unless you republish to SCORM 2004 3rd or 4th edition). While the solution in #3 worked for me, there is no actual workaround for the limit. So basically, I found out that Storyline's slide setting "Resume to saved state" takes up additional suspend data, and all I did was limit its usage.

      I highly discourage using it on slides that are "media-rich"; that is, slides that have a lot of video, graphics, or sound in them. Instead, for these kinds of slides I used the "Reset to initial state" option; this helped me save some of that precious suspend data.


  • Hi Christian,

    If you have a large course that exceeds suspend data limits, here are some suggestions for correcting it:

    • Disable the resume feature in Storyline.
    • Reduce the number of slides until the resume feature works as expected. The limit will vary, depending on a variety of factors. You'll need to test your content in your LMS to verify.
    • Republish your course for SCORM 2004 3rd Edition or 4th Edition, both of which support much longer suspend data.

    There are some community ideas shared here that may be helpful to you as you research what may work best for your situation and hopefully others in the community will chime in to help you out.

  • Hi Christian, 

    The suspend data is compressed and not human readable, but it's still something that your Learning Management System (LMS) can read and decipher. I haven't seen anyone crack the algorithm or determine a way around it, though.

    If you can share a bit more about what you're hoping to accomplish or any trouble you've run into - I or others in the ELH community may be able to point you in the right direction. 

    • ChristianOmpad's avatar
      ChristianOmpad
      Community Member

      Hello Ashley,

      I am trying to figure out a way to resolve an unwanted behavior in a course I'm working on. Even when it's completed, it always resumes at an exam question. I would like it to resume on the last page the user was on when they completed the course. So far, every topic in ELH and all help from support have led me to the conclusion that I am facing a suspend_data problem.

      These are the suggested solutions that I have drawn from the discussion:

      1. Publish to SCORM 2004 3rd/4th ed.

      2. Minimize/delete slides.

      3. Only set important slides' settings to resume to saved state and set others to reset to initial state to minimize sending data to suspend_data.

      1 and 2 are not options since the client's LMS only supports SCORM 1.2 and everything in the course is based on their specs. That leaves me with option 3, but I have made little progress in making this random behavior less random.

      • ChristopherP's avatar
        ChristopherP
        Community Member

        Thanks for the solution for SCORM 1.2 suspend_data