Attempts not recording to LMS

Sep 06, 2014

I have published a course in SCORM 2004, but when I run it within the LMS every attempt just overwrites the previous attempt, and the attempt number is always set to 1 no matter how many times I re-attempt the course. What it should do is increment the attempt number and keep a history of my previous attempts.

I have not altered the default LMS settings. Any ideas?

29 Replies
Gordon Layard

Hi Phil,

I don't think that's right, because if I change tracking to be based on screen completion rather than a Results slide, new attempts are recognised as such when you fully complete a course and then reattempt it. The problem only occurs when tracking is based on a Results slide.

Also, when I conduct the same tests on SCORM Cloud, the number of attempts again remains at 1, but when I test other modules (created with different authoring systems) the number of attempts updates.

I therefore conclude that Storyline's SCORM-handling JavaScript is responsible. I agree there are no options relating to this in Storyline, and I assume that it is a bug. I have filed a bug report.

Gordon Layard

Hi Phil,

Thanks again for your replies. I'm using an LMS from Seertech, but the problem also occurs on SCORM Cloud. As I say, the course correctly updates attempts if tracking is based on slide completion rather than on a Results slide. I'm going to do some more testing today with a very basic five-slide course to make sure I can replicate the problem. I'd be surprised if I was the first to notice the issue.

Gordon Layard

I've been working on this and have come up with a solution.

If I change this line in Configuration.js:

var DEFAULT_EXIT_TYPE = EXIT_TYPE_SUSPEND;

to:

var DEFAULT_EXIT_TYPE = EXIT_TYPE_UNLOAD;

then every attempt (even if the SCO hasn't been completed) is treated as a new attempt, and I get a historic record of previous attempts rather than having the previous attempt overwritten.

Strictly speaking, this breaks the SCORM standard, because a new attempt should only begin after full completion of the SCO. Still, it's better than never recording a new attempt, which is Storyline's default behaviour and also breaks the standard.
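To make the behaviour concrete, here is a minimal sketch of how a DEFAULT_EXIT_TYPE setting like the one above typically feeds into the cmi.exit call at shutdown. The constant values, helper names, and the stub API are illustrative assumptions, not Storyline's actual runtime code:

```javascript
// Illustrative sketch only: constant values and helper names are assumptions,
// not Storyline's actual runtime code. A stub object stands in for the LMS API.
var EXIT_TYPE_SUSPEND = "suspend"; // preserves the attempt for resuming
var EXIT_TYPE_UNLOAD = "";         // SCORM 2004 treats "" (or "normal") as ending the attempt

var DEFAULT_EXIT_TYPE = EXIT_TYPE_UNLOAD; // the change described above

// Minimal stand-in for the SCORM 2004 API (API_1484_11 on a real LMS)
var fakeAPI = {
  data: {},
  SetValue: function (key, value) { this.data[key] = value; return "true"; },
  Terminate: function () { return "true"; }
};

// At shutdown the content writes the configured exit type, then terminates.
function exitCourse(api, exitType) {
  api.SetValue("cmi.exit", exitType);
  return api.Terminate("");
}

exitCourse(fakeAPI, DEFAULT_EXIT_TYPE);
```

With cmi.exit left empty (or set to "normal"), a conforming LMS ends the current attempt, which is why the change produces a fresh attempt record on each launch, at the cost, as noted, of breaking suspend/resume for incomplete sessions.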

Justin Grenier

Good Morning, Gordon.

I definitely don't want to stir the pot on whether or not we are breaking the SCORM standard, but I think I can try to explain why we've chosen to do things the way we do them:

  • One camp says that we should always set cmi.exit to normal upon completion of the SCO. This camp says that if we set cmi.exit to suspend upon completion, we are assuming that the learner has exited with the intent of returning to the SCO at a later time, and that it isn't fair for us to make this assumption. This camp also says that if we were to always set cmi.exit to normal upon completion, it would make it easier for the LMS to know that it should start a new attempt the next time the learner accesses the SCO.
  • Another camp says that setting cmi.exit to suspend will ensure that the current attempt is preserved and the run-time data is not reset the next time the SCO is launched. In order for bookmarking to work, content must be exited in a suspended state. In contrast, if cmi.exit is set to normal, the learner's attempt ends, and the run-time data of the current session will not be available if the SCO is relaunched. In other words, if we want to allow the learner to resume a completed course, we must use suspend.

Essentially, by always setting cmi.exit to suspend, we've gone with the method that we feel offers the most flexibility to the LMS. This way, if the LMS offers the option to start a new attempt each time, they can go ahead and do that. If we were to always set cmi.exit to normal upon completion, we might force the LMS to start a new attempt next time, but customers with different requirements would never be able to allow their learners to resume a completed course.
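The trade-off between the two camps can be sketched with a toy model of how an LMS might decide what to do on the next launch. This is a simplified illustration, not any particular LMS's actual logic:

```javascript
// Toy model of LMS relaunch behaviour; simplified, not any real LMS's logic.
// previousExit is the last value the content wrote to cmi.exit.
function relaunch(previousExit, savedSuspendData) {
  if (previousExit === "suspend") {
    // Same attempt resumes: run-time data survives, cmi.entry is "resume"
    return { entry: "resume", suspendData: savedSuspendData, newAttempt: false };
  }
  // "normal" (or "") ends the attempt: the next launch starts fresh
  return { entry: "ab-initio", suspendData: "", newAttempt: true };
}

var afterSuspend = relaunch("suspend", "slide=12;score=80");
var afterNormal = relaunch("normal", "slide=12;score=80");
```

This is why the choice is either/or on each exit: suspend keeps the bookmark but blocks a new attempt, while normal allows a new attempt but discards the run-time data.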

I hope that helps. Have a great day!

Gordon Layard

Hi Justin, thanks, that would all make sense, except that my testing reveals that when the default exit type is set to Unload, I am both able to resume my progress and a new attempt is recorded.

Also, your explanation doesn't account for the discrepancy between tracking via slide completion and tracking via a Results slide (when the default exit type is set to resume). When tracking via slide completion, the attempt number correctly updates following full completion of the SCO. Tracking via a Results slide, however, never increments the attempt number.



Phil Mayor

Gordon Layard said:

Hi Phil,

Thanks again for your replies. I'm using an LMS from Seertech but the problem also occurs on the SCORM cloud. As I say, the course correctly updates attempts if tracking is based on slide completion rather than on a Results slide. I'm going to do some more testing today with a very basic 5 slide course to make sure I can replicate the problem. I'd be surprised if I was the first one to note the issue.

Looks like you are right, something to bear in mind. I would have expected a completed course (passed/failed) to be opened in review mode, and with slides-viewed tracking this does happen. I am not sure that handling new attempts should be pushed back onto the LMS, as there are real reasons for tracking the number of attempts; in effect you are allowing a user to improve their grade without it showing that they failed at some point. There should also be consistency between the two reporting types.

Justin Grenier

Hello again, Gordon.

As for the results you are seeing when you modify the value of the DEFAULT_EXIT_TYPE variable, Articulate software and its published output is supported as is. We cannot offer advice on customizing the published output to work in a specific LMS environment. You've posted in the right place though, so there may be other members of the community who can share workarounds they have used. Rustici Software is also an excellent fee-based resource who can help implement customizations for your LMS.

If you'd like to observe firsthand the differences between the data we send to the LMS when (1) exiting a course that tracks based on the number of slides viewed versus (2) exiting a course that tracks using quiz result, I would recommend enabling LMS Debug Mode. I'll be interested to hear if you can isolate specifically what difference is causing your LMS to record individual attempts, since I'm not aware that we send attempt information to the LMS.
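If you want to capture the same information programmatically rather than reading the debug log, one hedged approach is to wrap the LMS API object so that every SetValue call is recorded. API_1484_11 is the standard SCORM 2004 API discovery name; the wrapper itself is our own illustration, shown here against a stub:

```javascript
// Log every SetValue the content makes, in the spirit of LMS Debug Mode.
// The wrapper is illustrative; the stub stands in for a real API_1484_11 object.
function loggingWrapper(api) {
  var calls = [];
  return {
    calls: calls,
    SetValue: function (key, value) {
      calls.push(key + " = " + value); // record what the content sends
      return api.SetValue(key, value);
    },
    GetValue: function (key) { return api.GetValue(key); }
  };
}

var stubAPI = {
  data: {},
  SetValue: function (k, v) { this.data[k] = v; return "true"; },
  GetValue: function (k) { return this.data[k] || ""; }
};

var wrapped = loggingWrapper(stubAPI);
wrapped.SetValue("cmi.completion_status", "completed");
wrapped.SetValue("cmi.exit", "suspend");
```

Diffing the recorded calls between the two tracking modes should reveal exactly which values differ at exit time.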

You might also be interested in a modified SCORMFunctions.js file that seems to help with a similar problem in the Saba LMS.

Good luck with your project!

Laura Douglas

Hey Gordon!

I tried your suggestion about changing the default exit type to Unload, but in testing my course in SCORM Cloud it still acts the same (i.e. only one attempt is tracked). I wonder if something else needs to be done?

For example, all my courses are set to resume from where they left off when the learner exits; should this be changed? Should I set the course to prompt the learner, or to begin anew each time? I feel like I've been struggling with this issue for so long, and you're the only one who's offered clear guidance on it!

Joe Koller

Thank you for the quick reply. We are trying to track each attempt the user takes (e.g. each time they click the Retake Quiz button, a separate attempt is tracked). My understanding is that Articulate does not send attempts to the LMS, and therefore the only way to track this accurately is to have the learner exit the module and re-launch it. I am trying to see if there is JavaScript that will provide a workaround.


Austin McKenzie

This just seems crazy. Tracking attempts is part of the SCORM standard. Why does Storyline not have a simple option within the software for tracking the attempt number? Adobe Captivate has this built in and is completely SCORM compliant.

Will Storyline get this basic functionality?

Leslie McKerchie

Hi Austin! This thread is a bit dated, and as shared above, the LMS tracks this data; you could also set up a variable as the users above described.

I appreciate you sharing your thoughts and would encourage you to share directly with our team here if there is a feature or enhancement that you would like to see added.

TJ McKeon

For those of you experiencing this issue and use the Saba Platform here is what I received from their technical support:

Per review of the user communication logs, as well as the results of my testing on the content, the content is passing a successful completion status, but as you stated, the content exit status is in a suspended state. Thus the content rollup is not completing and the class remains In Progress.

I checked the microsite setting for "Consider Content Attempt in "Suspended" state for completion for SCORM2004 type content" and this is currently set to zero, meaning the system will take the content-specific value.

To resolve the issue, there are two options:

1. Update the microsite content setting to 1, so that all SCORM 2004 type content in a suspended state is considered for completion.

Admin > System > Configure System > Microsite > SabaCloud > Site Properties > Content > "Consider Content Attempt in "Suspended" state for completion for SCORM2004 type content" = 1 > Save


2. On the Content Details page for this content, enable the setting "Consider Content Attempt in "Suspended" state for completion" by checking the check box and saving the change.

Note: Option 2 will only set the consider suspended state for completion for that particular content.



Laura Hastings

I see that this last update was over 3 years ago, so checking in to see if there have been any more discoveries on this topic. This has come to the top of our priority list because now the FDIC is requiring all financial institutions to track the number of attempts made on a knowledge check before the test is passed. Here's the specific language:  "Enhance the LMS tracking mechanism so that if an employee fails to achieve a passing score after several attempts, there is appropriate action taken to ensure they comprehend applicable laws and regulation."  We enjoy using the 360 suite, but if it's not able to track attempts, we won't be able to use it for any of our compliance training, which greatly reduces its usefulness.

Jose Tansengco

Hi Laura,

If you need to track the number of attempts that your learners take before passing your quizzes, you can set up a numeric variable to count the attempts, and then follow the steps in this article to send the value of the variable to your LMS: 

The numeric value sent to your LMS will then serve as a record of the number of attempts taken before a passing mark was achieved. 
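For readers comfortable with JavaScript, one possible shape for this (an assumption on our part, not necessarily the approach in the linked article) is to read the Storyline variable and write it into a SCORM 2004 interaction, which most LMSs report per attempt. Stubs stand in for the Storyline player and the LMS API:

```javascript
// Hedged sketch: push a Storyline variable's value to the LMS as a
// SCORM 2004 interaction. "attempts" is a user-created variable; the
// stubs below replace the real player and LMS API for illustration.
var stubPlayer = {
  vars: { attempts: 3 },
  GetVar: function (name) { return this.vars[name]; }
};
var stubAPI = {
  data: {},
  SetValue: function (k, v) { this.data[k] = v; return "true"; }
};

function reportAttempts(player, api) {
  var attempts = player.GetVar("attempts");
  // cmi.interactions.n.* are standard SCORM 2004 data-model elements.
  api.SetValue("cmi.interactions.0.id", "quiz_attempts_before_pass");
  api.SetValue("cmi.interactions.0.type", "fill-in");
  api.SetValue("cmi.interactions.0.learner_response", String(attempts));
}

reportAttempts(stubPlayer, stubAPI);
```

How (and whether) the interaction shows up in reports varies by LMS, so test the round trip in your own environment first.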

Hope this helps!

Joseph Francis

As an admin, where would I see that number of attempts data in my LMS?

By design, the LMS displays each attempt on a separate line, with a Status (cmi.core.lesson_status), Session Time (cmi.core.session_time), and Score (cmi.core.score.raw), where appropriate. These values then "roll up" to the course level, where I would see a Lesson Status (cmi.core.lesson_status), an aggregate of the session times (cmi.core.total_time), and a Score (cmi.core.score.raw), where appropriate.
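The aggregation step can be sketched numerically. SCORM 1.2 session times use an HH:MM:SS.SS format; the sketch below handles whole seconds only and is our own illustration of the roll-up, not actual LMS code:

```javascript
// Sum per-attempt cmi.core.session_time values into cmi.core.total_time.
// Simplified: whole seconds only, no fractional part.
function toSeconds(hhmmss) {
  var p = hhmmss.split(":").map(Number);
  return p[0] * 3600 + p[1] * 60 + p[2];
}
function toHMS(totalSeconds) {
  function pad(n) { return String(n).padStart(2, "0"); }
  return pad(Math.floor(totalSeconds / 3600)) + ":" +
         pad(Math.floor(totalSeconds / 60) % 60) + ":" +
         pad(totalSeconds % 60);
}
function totalTime(sessionTimes) {
  var sum = sessionTimes.map(toSeconds).reduce(function (a, b) { return a + b; }, 0);
  return toHMS(sum);
}

// Two attempts' session times roll up into one total
var rolledUp = totalTime(["00:12:30", "00:05:15"]); // "00:17:45"
```

Status and score roll-ups are LMS policy (e.g. best score vs. latest score), so they are not shown here.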

Laura Hastings

Hi Joe,
This is so helpful!! I think I'm going to be able to make this work!

I do have one question. I'm not a programmer, so if this is obvious, forgive my inexperience. In the first section that shows how to add a "send xAPI statement" trigger, step 4 refers to entering a valid URN in that area. What is a valid URN? If I'm attempting to send the number of quiz attempts before passing to my LMS, what should that valid URN be?

Thanks so much!

Laura Hastings | Sr. Learning Business Partner
Exchange Bank | Learning and Development
707.524.3248

Laura Hastings

Hi Joe,
I'm trying both the xAPI method and the Survey slide method to see which works best with our LMS. When I attempt to add the trigger to the survey slide, my custom variable is not showing up in the drop-down menu. So my trigger looks like this:

Why doesn't my variable name show up in the drop-down when I attempt to assign the variable in this trigger?

