Hello all. I have built a number of learning checks in Storyline 2 and have published them to track against the results slide and record a status of complete / incomplete. However, they are all tracking as passed / failed in my LMS.
I've been told that the reason for this is that my learning checks are exam-type content with a mastery score, so they will track as passed / failed by default. Apparently I need to add a condition along with the mastery score forcing the content to record as complete / incomplete.
39 Replies
Hello Leslie, I tried your solution and it worked perfectly, but when I try to launch it nothing is played... I have a new window with the player but the content is not there.
Hello Fiona - I hope you haven't been battling with this for 2 months since we last chatted 😬
Would you be able to share the .story file for me to take a look?
If you cannot share here in the forums, you are welcome to share privately here with our support team.
Thank you Leslie, I'll share it with your support team :)
Sounds great Fiona - I see where you reached out to our team (01062905) and you should be hearing from someone soon. I'll follow along as well.
Hi Fiona - I popped in to check on your case and it looks like Cleo sent you a few questions. Just wanted to be sure that you got that e-mail. It would have come from support@articulate.com - so be sure to check your spam/junk if needed and let me know if we need to resend :)
Just out of curiosity! Is it possible to change the number of slides set for tracking purposes from the published output, or can it only be done before publishing? Please let me know if more detail is needed on this :)
Hello Ashutosh and welcome to E-Learning Heroes :)
We do not support the modification of published output, so I would certainly recommend updating that within your project file before publishing.
I know this is a very old post, so I'm unsure if anyone's reading this, but I'm having the same issue. I've set my SL3 SCORM file to track via slides viewed, using complete/incomplete. However, the LMS is not reading this and appears to use passed/failed. This results in the whole course being completed prematurely.
I've tried to edit the SCORMfunctions.js file, but when I re-zip the package it isn't read by the LMS properly: 'Your course content encountered an error during the upload process.' I've seen this before when editing exported published files and re-zipping the package to upload.
The error also said the LMS couldn't locate the 'imsmanifest.xml' and asked me to check that it's in the root directory. It certainly is there, so is this an issue with modifying published output?
Any advice?
Alex
Hi Leslie, just read your comment. Daniel suggested modifying the SCORMfunctions.js file to amend the fields:
var SCORM_PASSED = "completed";
var SCORM_FAILED = "incomplete";
This helped his LMS properly track complete/incomplete in his SCORM files; otherwise it was incorrectly tracking passed/failed.
This worked for him, so is it not possible for me to make this change myself, zip the SCORM files, and upload? The upload isn't working for me, so it looks like I can't make this manual change.
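For context, the edit Daniel described amounts to something like the sketch below. This is NOT the actual contents of SCORMfunctions.js, whose structure varies by Storyline version; the stub API object and `reportStatus` function are purely illustrative. The real SCORM 1.2 data-model element being written is `cmi.core.lesson_status`.

```javascript
// Hypothetical sketch of the edit, not actual Storyline output.
// After the edit, completion is reported instead of pass/fail:
var SCORM_PASSED = "completed";
var SCORM_FAILED = "incomplete";

// Roughly how such constants would feed the SCORM 1.2 status call
// (cmi.core.lesson_status is the genuine SCORM 1.2 element name):
function reportStatus(api, passed) {
  return api.LMSSetValue("cmi.core.lesson_status",
                         passed ? SCORM_PASSED : SCORM_FAILED);
}

// Stub API object standing in for the LMS-provided SCORM runtime:
var lastStatus = null;
var stubApi = {
  LMSSetValue: function (element, value) {
    lastStatus = value;
    return "true";
  }
};

reportStatus(stubApi, true);
console.log(lastStatus); // "completed"
```

The point of the edit is simply that whatever code path used to emit "passed"/"failed" now emits "completed"/"incomplete" instead, which some LMSs interpret the way slides-viewed tracking intends.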
Hello Alex,
You can certainly make this change if that's what is needed for your LMS, we just cannot support modifying that output.
I would suggest uploading your content to SCORM Cloud to see if you experience the expected result. This is a great way to see if the issue lies within the project or the LMS.
Hi Leslie,
Thanks for your suggestion. I've tested with SCORM Cloud, but it looks like there's an issue with Storyline 3.
My tests:
Created 2x SL2 and 2x SL3 files, each with 2 slides (so completion only triggers after viewing both slides). All four set to track slides viewed with Complete / Incomplete.
In LearnUpon LMS I opened one of the SL3 modules and it set the status to 'passed' and completed the course (as explained).
Re-enrolled the test account and opened the SL2 modules and it correctly reported the status as 'Complete'. I ran the other SL2 file and same correct behaviour.
So it appears it's SL3 that's causing the issue. Our LMS provider has had a few clients complain about this. I guess it hasn't been reported more often because most clients use some kind of assessment in the module and track by that.
They say IE correctly reported the completion status (so it could be browser dependent), but I haven't been able to reproduce this. They have also seen this with other formats (xAPI / Tin Can), and they say the issue affects Storyline 360/Rise as well.
In the screenshots, ignore the other modules 5-7: I'm experimenting with whether quizzes could be used, but we'd rather not include any quizzes to track by. All modules should be evaluated on slides viewed; learners are assessed in class only.
I also tested some older modules from a course made with SL2 and the modules are tracking as 'completed' properly. SCORM debugger correctly logs: LMSSetValue( lesson_status) -> completed
I've raised a support ticket as this needs to be resolved urgently as we have a number of modules to go live soon.
Happy to share any files but there really isn't much to see.
SCORM DEBUGGER FOR CompTest4 (SL2)
Starting debugging SCORM...
LMSInitialize( )
============= CURRENT STATE
{"length":0,"items":{"suspend_data":"","max_time_allowed":"","time_limit_action":"","lesson_location":"","lesson_mode":"browse","entry":"ab-initio","lesson_status":"not attempted","score_raw":"","score_max":"","score_min":"","exit_status":"","total_time":"0000:00:00.00","comments":"","comments_from_lms":"","student_pref_audio":"0","student_pref_text":"0","student_pref_speed":"0","student_pref_lang":"","data_from_lms":"","student_id":"alex.arathoon@fastway.com.au","student_name":"Arathoon, Alex"},"le":null}
=============
LMSInitialize( true )
LMSGetValue( cmi.core.lesson_mode )
LMSGetValue( cmi.core.lesson_mode )
LMSSetValue( lesson_status) -> browsed
LMSSetValue( exit_status) -> suspend
LMSSetValue( session_time) -> 0000:00:04.96
LMSGetValue( cmi.suspend_data )
LMSSetValue( session_time) -> 0000:00:06.83
LMSSetValue( lesson_status) -> incomplete
LMSSetValue( suspend_data) ->
LMSSetValue( session_time) -> 0000:00:06.84
LMSSetValue( suspend_data) ->
LMSSetValue( lesson_status) -> completed
LMSSetValue( suspend_data) ->
LMSSetValue( session_time) -> 0000:00:09.14
LMSSetValue( suspend_data) -> 1W46070ji1001111a0101101111t0n5qu10WeIpp5.5mY31SsDjzl1^1^000
Hello Alex,
Thanks for sharing the details and letting me know that you reached out to the support team. You're in good hands there and one of our support engineers will probably be your best source of support in this scenario with the debug log that you've shared. You should be hearing from someone soon.
Thanks, Leslie. As we have time pressures we've decided to go with a workaround.
We'll accept a status of 'passed' for non-assessed modules. Setting a 'Mastery score' in the last module's settings in the LMS stops the course from completing prematurely: the LMS waits to find out whether the learner achieves a passing score. Even though no quizzes are used, this seems to do the trick.
Considerations:
1. Lock the modules so the Learner can't complete the last module and early complete their course
2. An 'Overall Score' of '0%' is visible to the Learner in LearnUpon LMS
This workaround appears to track and report well.
We don't care if the status is 'complete' or 'passed'
Screenshots indicate the status and tracking of test users.
Hopefully, this will help others out for now.
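For anyone curious why the mastery-score trick works, the SCORM 1.2 behaviour it leans on can be sketched like this. This is illustrative only, not LearnUpon's actual logic: with a mastery score configured, an LMS typically derives passed/failed from the reported raw score rather than trusting the status string from the content, and keeps the course open until a score arrives.

```javascript
// Illustrative sketch (not LearnUpon's actual code) of how an LMS
// might resolve completion when a mastery score is configured:
function resolveStatus(reportedStatus, rawScore, masteryScore) {
  if (masteryScore == null) {
    // No mastery score: take whatever lesson_status the content set.
    return reportedStatus;
  }
  if (rawScore == null) {
    // Mastery score set but no score reported yet: keep waiting,
    // which is exactly what blocks the premature completion.
    return "incomplete";
  }
  // Score present: derive pass/fail from it, ignoring reportedStatus.
  return rawScore >= masteryScore ? "passed" : "failed";
}

console.log(resolveStatus("passed", null, 80));      // "incomplete"
console.log(resolveStatus("completed", null, null)); // "completed"
```

Under this rule, an SL3 module that wrongly reports "passed" can no longer complete the course on its own, because the LMS is still waiting on a score it will never receive until the learner reaches the gated final module.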
Thanks for popping in to share Alex.