Multiple quizzes in one Articulate course
Jun 07, 2016
So here's the scenario:
We have one elearning course and a user may be able to take one of 6 different paths through the course. At the end of the course, they need to pass a test (let's say 10 questions and get 8/10 correct in order to pass).
The problem is, the test questions must relate to the journey they took through the elearning. So essentially, we need 6 different quizzes. Not a problem, Storyline handles that.
However, each learner can only take 1 path through the elearning. So each learner will only take 1 quiz (the quiz that relates to their learning path).
Problem:
If we create 6 separate tests, and the user takes only one of them, they will only have achieved 1/6th of the total number of points available to them, and the reporting in the LMS will look weird.
To put it another way, if a user scores 80/100 on their quiz, Storyline will report that they scored 80/600 (or 13.3%) to the LMS. Can anyone think of a solution that would work?
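As a quick sanity check on the numbers above: pooling six 100-point quizzes gives a 600-point maximum, so one quiz's 80 points report as 13.3%. The function name here is purely illustrative.

```javascript
// Reported percentage when one quiz's points are divided by the pooled
// maximum of all six quizzes (rounded to one decimal place).
function reportedPercent(pointsEarned, totalPoolPoints) {
  return Math.round((pointsEarned / totalPoolPoints) * 1000) / 10;
}
// reportedPercent(80, 600) → 13.3, versus the intended 80
```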
22 Replies
Hey Ryan - Do you need to track points or can a pass/fail score be used?
As long as you don't need to send the question data to the LMS, it would be possible to use JavaScript to overwrite the score. Here is a file that Steve Flowers created that does exactly that.
Hey both, the LMS needs to track the score that the user achieves in the test that they take.
Each of the six tests will be standard; let's say 10 questions with an 80% pass rate. Because a user can only take one quiz, we want the LMS to report that they got 80%. If we can't do that, then it's fine to say they scored 80 out of 100 (it doesn't need to be a percentage). The one thing we don't want passed over to the LMS is a score of 13.3%, or 80/600. Does that make sense?
You can use the code in the example I posted to send a score to the LMS, Steve commented all of the Javascript so it is easy to edit.
Just make sure that the course cannot overwrite your score or completed state. A simple way to do this is to track by slides and ensure that the learner can never reach the total required to pass (perhaps have a hidden scene that cannot be accessed).
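For reference, here's a minimal sketch of what a manual score submission can look like, in the spirit of the file linked above. `SetScore` and `SetStatus` are the wrapper calls used later in this thread; `normalizeScore`, `reportQuizScore`, and the pass-mark handling are illustrative names, so verify everything against your own published output.

```javascript
// Pure helper: convert one quiz's raw points into a 0-100 score,
// so the LMS sees 80/100 rather than 80/600.
function normalizeScore(rawPoints, maxPointsForThisQuiz) {
  return Math.round((rawPoints / maxPointsForThisQuiz) * 100);
}

// Illustrative submission routine for a Storyline JavaScript trigger.
function reportQuizScore(rawPoints, maxPoints, passMark) {
  var score = normalizeScore(rawPoints, maxPoints);
  var lmsAPI = parent;            // frame that holds the SCORM wrapper
  lmsAPI.SetScore(score, 100, 0); // score, max, min
  SetStatus(score >= passMark ? "completed" : "failed");
}
```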
Thanks very much for that. That is helpful. I tested on SCORM Cloud and it worked perfectly from my desktop, so it seemed a great solution. However, when I tested it on my iPad, it didn't work at all (it returned a score of 0%).
I accessed via Safari for iPad and ensured that JavaScript was enabled on that browser. Is there anything I can do to make this work on an iPad?
Looks like the code in there is only for Flash output; sorry, I should have checked.
This link should help:
https://elearningenhanced.com/blog/2012/09/13/update-storyline-courses-status-lms-javascript
Hi Phil, thanks for this. It worked really well on both iPad and PC. I did find one problem though:
I set up a course where I used the JavaScript in the last link you sent. I set up four options:
If I failed the first session, when I went back to the LMS it said I'd failed with a score of 30%. Perfect.
If I then opened the course again, and chose the option to get 90%, when I went back to the LMS it said I had completed with a score of 90%. Perfect.
The problem is if I tried to do the above in one session. So if I executed the JavaScript to tell the LMS that I had failed, with 30%, and then went back and executed the JavaScript to tell the LMS that I had completed, with say 90%, then what would happen is the LMS (SCORM Cloud) shows a status of Failed and a score of 90%.
Is there anything I can do to force a course status change if I've already changed the status once in the same session?
I've added my .story file in case that helps.
If I understood the problem correctly, there are multiple "final quizzes" (which collect their points in their own variables). However, you cannot tell in advance what the "actual quiz" will be.
I believe that the following might achieve what you are looking for:
Do you mean like this, Guido?
Yes - not sure if it will work (due to a problem with my VM, I currently cannot start Storyline...), but it could be worth a try :-)
I'm not sure selecting Combine Points will do what you want; I think they would then have to get 80% of 600. You could try manually adding the result variable from each quiz to the final result variable before submitting to the LMS.
I honestly don't know, and I cannot see anything wrong in your file, but it definitely does what you describe in SCORM Cloud, so this may be a limitation. It would be worth giving Steve Flowers a nudge to see if he knows any workaround.
Looking at the file, I'd probably work to make sure that the slide that grades the output is set to reset to initial state, along with each of the layers.
I see that you're asking the user to click the button after confirming score. That's cool. If it doesn't matter that the score submits automatically, I'd use something like this and move that trigger back to the base slide and send in the currentscore value:
// get a reference to the Storyline player and read the score variable
var player = GetPlayer();
var currentscore = player.GetVar("currentscore");
// the parent frame holds the LMS (SCORM) wrapper
var lmsAPI = parent;
// set score; arguments are score, max, min
lmsAPI.SetScore(currentscore, 100, 0);
if (currentscore < 80) {
  SetStatus("failed");
} else {
  SetStatus("completed");
}
Storyline *really* wants to use its own logic to send completions. I've started completely suppressing Storyline's score submission whenever I want to trigger my own submissions. This prevents Storyline from behaving in ways I don't want. The extra unreachable slide should get it in most cases; however, I have seen it fail when using things like question banks, because each pull that includes a new question will actually ADD to the slide count.
I add this to a JS trigger on the master slide at the root:
ReportStatus = function () {
  // since we don't want Storyline to overwrite our scoring... let's overwrite that $#it.
};
With this in place, there isn't any way for Storyline to send any kind of completion to the LMS. The other side-effect of completely overriding this is you can now use a results slide to capture every single question's response without worrying about how things track within the published output.
Running a few tests with your example file to see if any of these could be the culprit.
Steve, you amaze me! Does this mean I can use a results slide to send interaction data and override the score?
Wacky. None of the stuff I have above is really relevant to this case. I still recommend overwriting Storyline's tracking call. More reliable than using extra slides.
In my tests, the second SetStatus call isn't applying at all:
https://cloud.scorm.com/sc/guest/ViewDebugLog?logId=27963351-ca03-4fdb-912c-aa67a0abccaa&courseTitle=Test+My+Score2
Yes! This is what triggered me to just blow out Storyline's submission guts. It lets you still submit interactions and have COMPLETE control over completion behaviors and scoring. No longer a slave to Storyline's dead-simple but limiting options.
OK. Within the same session, it looks like the SCORM API won't accept a completed status while the status is failed. I manually triggered a passed change BEFORE calling another completed and it appeared to behave just fine.
// get LMS API
var lmsAPI = parent;
// set score; arguments are score, max, min
lmsAPI.SetScore(100, 100, 0);
// set status; possible values: "completed", "incomplete", "failed", "passed"
// send "passed" first to force past the session stop, then "completed"
SetStatus("passed");
SetStatus("completed");
I would still blow out Storyline's ReportStatus function. The other workarounds still leave space for Storyline to sabotage your custom behavior.
I got the same thing: it sets the score, just not the status. Annoying, because I wanted to use this to show progress tracking. I think I may be OK though, because the LMS it is going into sets a course to complete when it reports 100%; that's the benefit of a custom LMS.
So you could keep it at incomplete until you pass in completed, and you wouldn't have an issue?
Yes. That would do it too. As long as you never passed in failed during the session. Setting to passed / completed will force past the session stop.
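The rule Steve describes can be sketched as a small helper: never send "failed" mid-session, keep the attempt at "incomplete" instead, and on a pass send "passed" before "completed". The function name is illustrative, not from the .story file.

```javascript
// Decide which status calls to make, per the session-stop behaviour
// observed in SCORM Cloud: a "failed" sent mid-session blocks a later
// "completed" from taking effect.
function statusCallsFor(score, passMark) {
  if (score >= passMark) {
    // "passed" first forces past the session stop, then "completed"
    return ["passed", "completed"];
  }
  // leave the attempt open rather than locking in a failure
  return ["incomplete"];
}
```

In a Storyline trigger you would loop over the returned array and call `SetStatus` for each value after `lmsAPI.SetScore(...)`.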
Hey, so this is great, and has given me an idea:
We can give the learner a couple of attempts and have the elearning pass incomplete if they fail the test. Then if they don't pass on the third attempt, we can mark it as failed and have them exit the course and come back the next day. That means they will be in a new session, so if/when they pass, the course should report complete.
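That three-attempt plan boils down to one decision per attempt; here's a hedged sketch of the logic (the function name and attempt limit are illustrative).

```javascript
// Status to report after each quiz attempt: completed on a pass,
// failed only once the final attempt is used up, otherwise incomplete
// so a later pass in a new session can still report cleanly.
function statusAfterAttempt(score, passMark, attemptNumber, maxAttempts) {
  if (score >= passMark) return "completed";
  return attemptNumber >= maxAttempts ? "failed" : "incomplete";
}
```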
What do the folks think of that? Good idea or not?
Thanks for everyone's input by the way - this is truly great!
I am sure that would work.