Extra questions and answers being sent to LMS

Hi there.

I've created a confidence-based multiple-choice assessment in Storyline 360 which records the question, the answer the user selected, a score for their confidence, a score for their knowledge, and a total score (a combination of the confidence and knowledge scores). There's a bank of 10 questions, 5 of which are randomly selected each time the user takes the assessment.

I've got all of this information being reported to the LMS using hidden survey slides and some JavaScript, but for some reason it's duplicating some questions and answers in the LMS answers report when the assessment is retaken.
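For context, reporting an answer over the SCORM 1.2 JavaScript API generally looks like the sketch below. The `api` object here mocks the LMS-provided `window.API` so the sketch runs standalone; the field names are standard SCORM 1.2, but this is an illustration under those assumptions, not the actual course code.

```javascript
// Minimal sketch: recording one question result as a SCORM 1.2
// interaction. "api" stands in for the LMS-provided window.API object;
// here it is mocked so the sketch is self-contained.
const store = {};
const api = {
  LMSSetValue: (key, value) => { store[key] = String(value); return "true"; },
  LMSCommit: () => "true",
};

function reportInteraction(index, questionId, response, correct) {
  // Each answer is stored under a numeric interaction index on the LMS.
  const p = `cmi.interactions.${index}`;
  api.LMSSetValue(`${p}.id`, questionId);              // which question
  api.LMSSetValue(`${p}.type`, "choice");              // multiple choice
  api.LMSSetValue(`${p}.student_response`, response);  // what was selected
  api.LMSSetValue(`${p}.result`, correct ? "correct" : "wrong");
  api.LMSCommit("");
}

reportInteraction(0, "Q03_confidence", "b", true);
console.log(store["cmi.interactions.0.result"]); // "correct"
```

If old interaction entries from a previous session are still present under those indices, a report can end up mixing answers from different sittings, which matches the behaviour described above.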

The first attempt reports correctly (5 questions and 5 answers showing).

The second attempt reports the 5 questions and answers that were selected, but it also shows some of the questions and answers from the first attempt (in the example I tested, the second attempt showed its own 5 questions and answers plus 2 questions and answers from the first attempt). This is a problem, as there's no way for us to tell which questions and answers belong to which attempt, which renders the whole thing useless.

I've reported this to our LMS technical team and they've advised that this is an issue with Storyline rather than the LMS. Here's the response I had from them:

I got a response from the developers regarding the issue with the answers report. It seems that the course is sending the data shown in the Answers Report, but we can't determine why this is the case. To get to the bottom of this, you would have to check with the course author why the course is sending extra questions and answers to the LMS when the course is retaken. As Absorb is a course host, not a provider, we record only what the course sends to the LMS, not the content or configuration of the course.

I can send the Storyline file to someone, but as it's an assessment for a client I'd rather not upload it publicly.


Tom W

The course is set to never resume, and there's no 'Retry' button - if they fail they have to close the assessment, watch a video about the topic, then re-open the assessment and do it again. I've tried adding a 'Reset results' trigger on the first page, but this just causes it to crash as soon as it's opened.

Do I still need to reset the results even if it's set to never resume? If so, how do I do this without the course crashing?

Tom W

I just added a 'Reset results' trigger to the 'Exit module' button, so the results were reset on exiting the course, but it's still reporting extra questions and answers after the second attempt - instead of reporting 10 questions and answers (5 per attempt), it reported 13, so it duplicated 3 answers this time. So it looks like it's not related to resetting the results.

Phil Mayor

I am sure it is the reset results: each time they take the assessment you need to reset the results. (I don't know the structure of your course, but I don't think the exit button is the right place for it.)

The purpose of resetting results is specifically for retaking the assessment while still in the course: the user takes the assessment, passes or fails, and that data is sent to the LMS. If the user retakes it during that session, you need to reset the results first. That way the attempt count goes up by one and the interaction data being sent is incremented, so you can see it is a second attempt.

It sounds like a design issue: either you are forcing resume and not resetting results, or the user is able to navigate back to the questions and is getting there without the results being reset.

Resetting the results is the only way to send a second or third attempt to the LMS within a resumed session.
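Phil's point can be illustrated with a mock SCORM 1.2 session: within one session, a second attempt's answers should be appended after the first attempt's interactions rather than written over them. The `api` object below is a stand-in for the LMS-provided `window.API` (the field names are standard SCORM 1.2); this is a sketch of the general mechanism, not Storyline's internal code.

```javascript
// Mock of the LMS side of a SCORM 1.2 session, tracking how many
// interactions have been written so far in this session.
const data = {};
let count = 0;
const api = {
  LMSSetValue: (key, value) => {
    if (/^cmi\.interactions\.\d+\.id$/.test(key)) count += 1;
    data[key] = String(value);
    return "true";
  },
  LMSGetValue: (key) =>
    key === "cmi.interactions._count" ? String(count) : data[key] || "",
};

function sendAttempt(answers) {
  // Append after whatever interactions already exist in this session,
  // so the first attempt's data is preserved, not overwritten.
  const base = Number(api.LMSGetValue("cmi.interactions._count"));
  answers.forEach((a, i) => {
    api.LMSSetValue(`cmi.interactions.${base + i}.id`, a.id);
    api.LMSSetValue(`cmi.interactions.${base + i}.student_response`, a.resp);
  });
}

sendAttempt([{ id: "Q1", resp: "a" }, { id: "Q4", resp: "c" }]); // attempt 1
sendAttempt([{ id: "Q2", resp: "b" }, { id: "Q7", resp: "d" }]); // attempt 2
console.log(api.LMSGetValue("cmi.interactions._count")); // "4"
```

When the course is closed and relaunched into the *same* LMS attempt, the same append-or-overlap behaviour can produce exactly the mixed reports described earlier in the thread.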

However, there is one other possibility: the course is being marked complete and opening in review mode. In that case, though, you should not be able to pass any data at all.

Phil Mayor

It depends on the LMS; most LMSs need the user to click a new attempt button. Looks like I misread, though, and you have it set to never resume. In that case the LMS is the problem: each time they open the course, it is the same attempt as far as the LMS is concerned. You need to find a way to force a new attempt in the LMS each time the course opens, which will give you separate results. Which LMS are you using?

Tom W

I've gone back to Absorb to see if they can advise how to make the LMS force a new attempt, as I can't see how to do it in the admin panel. I've also tested the 'Completed/failed' reporting option, but it's still doing the same thing.

It seems to report properly when I add a 'Retake' button that resets the results, but when the assessment is closed and re-opened it still keeps the previous attempt's data. So the issue seems to be that the course isn't creating a new attempt when it's re-opened. I have no idea how to resolve this, though.
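If the LMS can't be made to open a fresh attempt, one possible workaround (an assumption on my part, not a confirmed fix, and subject to the LMS accepting the longer interaction ids) is to tag every interaction id with a per-launch token so answers from different sittings stay distinguishable even when they land in the same attempt. A sketch, deriving the token from the launch time:

```javascript
// Workaround sketch (hypothetical, not a confirmed fix): prefix each
// question id with a token derived from the launch time, so answers
// from different sittings can be told apart in the answers report.
function makeAttemptTag(launchDate) {
  const pad = (n) => String(n).padStart(2, "0");
  // e.g. 1 May 2024, 09:30 -> "a20240501T0930"
  return (
    "a" + launchDate.getFullYear() + pad(launchDate.getMonth() + 1) +
    pad(launchDate.getDate()) + "T" +
    pad(launchDate.getHours()) + pad(launchDate.getMinutes())
  );
}

function tagQuestionId(attemptTag, questionId) {
  // The combined id is what would be sent as the interaction id.
  return `${attemptTag}_${questionId}`;
}

const tag = makeAttemptTag(new Date(2024, 4, 1, 9, 30));
console.log(tagQuestionId(tag, "Q03")); // "a20240501T0930_Q03"
```

Two launches a minute or more apart would then report ids with different prefixes, so the report could be split by attempt even if the LMS merges the sessions.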