Published Storyline Outputting Results before Assessment Completed??

Hi All,

I have been building multiple courses, each containing multiple quizzes. However, we have recently noticed that, when completing a course on our Learning Management System, a 100% score is output before the assessment is fully completed.

For example, take a quiz of 10 questions with an 80% pass rate. If a user answers 4 questions correctly and then accidentally closes the course, the course outputs a score of 100%.

The course(s) use 3 results slides:

  • 2 quiz results slides (1 per quiz)
  • 1 final results slide (used to track the score from both quizzes, currently set to "Combine points from each quiz")

Would anyone be able to advise how best to fix this or stop it from happening?

Much appreciated

Tom

EDIT: I forgot to mention when I first posted this that the quizzes do not use Question Banks, and instead use a variable technique found here:

https://community.articulate.com/discussions/articulate-quizmaker/only-require-some-questions-on-test-retake

46 Replies
Dave Cox

The first thing that I would check is the tracking settings. Make sure that you are actually reporting the results from the results slide that combines the other two results slides. It is easy to select the wrong one. If that isn't it, then you may want to post a copy of your course, or a sample, here for someone to look at for you.

Thomas Whittaker

Hi Leslie,

I have tested the SCORM package on SCORM Cloud, and the problem is still occurring. I answered 4 questions and got:

I'm not sure what to consider. Could changing the settings on our LMS potentially fix this?

Do you know of any fixes that may force users to complete the entire quiz?

Dave Cox

This is very strange. I loaded this in our test LMS sandbox and saw similar results to Thomas. When I run it locally, Assessment A runs, and appears to run correctly. When I run it on the LMS, however, Assessment B runs. If I stop after 1 or more questions, I always see it pass a raw score of 100.

You say that you aren't using question banks, but I can see that you jump through the question bank before you jump to the quiz slides. I don't really have time to dig deeper, but I think the score is being set to 100 when you pass through the question bank.

I assume you are using the question banks to randomly select your quiz. Is this correct? If you remove the question banks and use a random number generator instead, you can randomly jump to your quizzes and shouldn't have this problem. You will have to use JavaScript to create the random number, though ...
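Roughly like this, as a minimal sketch: PickedQuiz is an assumed number variable you would create in Storyline, and GetPlayer() only exists in the published output.

```javascript
// Helper kept separate from Storyline so the selection logic is easy to test.
// numQuizzes = 2 returns a result of 1 or 2.
function pickQuiz(numQuizzes) {
  return Math.floor(Math.random() * numQuizzes) + 1;
}

// In a Storyline "Execute JavaScript" trigger, you would write the result
// into a number variable (assumed name: PickedQuiz) and then branch on it
// with "Jump to slide" triggers:
//
//   var player = GetPlayer();                 // Storyline published-output API
//   player.SetVar("PickedQuiz", pickQuiz(2)); // 1 = Quiz A, 2 = Quiz B
```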

Thomas Whittaker

Hi Dave,

I use a question bank to randomly choose one of the quizzes, and hadn't considered that it could be the cause, as the question bank is not selected for tracking in the results slide(s).

I will look into this fix and let you know how it goes.

Dave Cox

Hi Thomas,

I did some more checking on your course, and I changed the slides out as I previously suggested. However, when I looked at your course, it appears that the raw score being sent to the LMS is a percentage of the questions answered so far. (This is even after I made the change.) So if you answer question 1 correctly, a raw score of 100 gets sent. If you answer question 2 correctly, 100 is sent again. If you answer question 3 incorrectly, 66.66 is sent, and so on.

I'm not sure why it is doing this; it is different behavior from what I've seen previously. This may actually be to your advantage: otherwise, completing only one of the assessments would send a raw score of 50 at most, since only half of the questions were completed. I don't know how you got this behavior, as I haven't been able to reproduce it.

If the test doesn't reach the results slide, an incomplete is sent to the LMS. The Passed status isn't sent until the results slide is reached.

I've attached the course with the changes I made for you to look at.

Thomas Whittaker

Hi Dave,

I have looked at the changes you made, and tested them in our LMS. The error is still occurring, and I'm not sure what to try next.

Any suggestions?

Thanks

EDIT: I have noticed that although the score is output, anything less than 80% is shown as incomplete. I'm unsure whether this is due to the SCORM package or our LMS.

Thomas Whittaker

Hi Dave,

I have built a rough version, and everything appears to be working as expected.

I have a feeling that only retrying incorrect answers could be the cause, but I can't be sure. Do you have any thoughts on whether this could be the case, or perhaps a different solution for it?

Thanks

Dave Cox

The behavior that I saw in your course was that when you answered question 1 correctly, 100 was returned to the LMS. For question 2, if you answered it correctly, 100 was returned again; if you answered it incorrectly, 50 was returned, and the same pattern held for question 3. So what was being sent to the LMS was the percentage correct for the questions answered so far.

All of the courses that I've built return the number of points that I set for each question, which is generally 10. So question 1, if correct, returns 10, and so on.

I'm not sure how you got the behavior that you have, but the advantage is that if you have two tests, you only get scored on the questions that have been answered. Any time I've had dual tests, I've had to overwrite the score sent to SCORM after the results slide to solve this. The disadvantage is what you saw earlier: if you only answer 4 questions but get them all correct, you will get a score of 100. To send a completion, you have to go through the results slide.

I would love to know how you got your test to score this way.

Regards,

Dave Cox

Did you try rebuilding the project without the question bank slides? I think that might be when the logic changed.

Start with just one quiz and see if everything works as it should. Once you have that working, try adding the second quiz, but use the JavaScript that I sent to select the quiz. Finally, if that works, each quiz will only score a maximum of 50%; I can send you the JavaScript you need to override that with the correct score.

Dave Cox

OK, you need to add a variable to your project: CurrentScore.

On each question slide, add a trigger to the correct layer to add 10 points to the variable CurrentScore when the timeline on that layer starts.

Then add a trigger on the results slide to run this JavaScript, and make sure this trigger comes after the submit results trigger.

var p = GetPlayer();
var cs = p.GetVar("CurrentScore");
var ns = 10; // Number of Question Slides
var currentScorePercent = Math.round(cs / ns * 10);
p.SetVar("CurrentScorePercent", currentScorePercent);
console.log("Current Percent " + currentScorePercent);
lmsAPI.SCORM_SetScore(currentScorePercent, 100, 0);
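A quick sanity check on the arithmetic in that trigger, outside Storyline: with 10 points per correct answer, cs / ns * 10 is just points earned divided by points possible, times 100, since points possible is ns * 10.

```javascript
// Same percentage math as the trigger above, as a standalone function.
// cs = points accumulated in CurrentScore (10 per correct answer)
// ns = number of question slides
function currentScorePercent(cs, ns) {
  // cs / (ns * 10) * 100 simplifies to cs / ns * 10
  return Math.round(cs / ns * 10);
}
```

So 4 correct answers out of 10 questions gives currentScorePercent(40, 10) = 40, rather than the premature 100 the LMS was reporting.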

Thomas Whittaker

Just a few little queries I want to check before implementing this.

  1. The CurrentScore variable - I'm assuming this is a number variable?
  2. The Execute JavaScript trigger - should this be on the final results slide that talks to the LMS, or on the results slide for each quiz?
Dave Cox

Yes, the CurrentScore variable should be a number variable. Sorry I wasn't specific.

The JavaScript trigger should be the last trigger on the results slide. If you are using a combined results slide, you can put it there. If not, put one on each quiz's results slide. Either will work.

You need the CurrentScore variable because, even though Storyline keeps track of the score for the slides, you can't access it from JavaScript. Therefore we have to gather the score in a variable that we can access.

Dave Cox

I just realized that the JavaScript had an extra line in it that will break it. (It's left over from something else I was doing.) Sorry about that.

Try this one. I've removed the offending line.

var p = GetPlayer();
var cs = p.GetVar("CurrentScore");
var ns = 10; // Number of Question Slides
var currentScorePercent = Math.round(cs / ns * 10);
lmsAPI.SCORM_SetScore(currentScorePercent, 100, 0);
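If you want to check this trigger before publishing, one option is to stub the Storyline and SCORM globals it relies on and run it in a plain JavaScript console. The stub shapes below are assumptions for local testing only; in the published course, GetPlayer() and lmsAPI are provided by the player.

```javascript
// Minimal stand-ins for the globals Storyline provides at runtime.
var lastScore = null;
function GetPlayer() {
  return {
    vars: { CurrentScore: 40 }, // pretend 4 of 10 answers were correct
    GetVar: function (name) { return this.vars[name]; }
  };
}
var lmsAPI = {
  SCORM_SetScore: function (score, max, min) { lastScore = score; }
};

// The trigger body from the post, unchanged:
var p = GetPlayer();
var cs = p.GetVar("CurrentScore");
var ns = 10; // Number of Question Slides
var currentScorePercent = Math.round(cs / ns * 10);
lmsAPI.SCORM_SetScore(currentScorePercent, 100, 0);

// lastScore is now 40, not the premature 100.
```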