Extra questions and answers being sent to LMS
Jun 10, 2019
Hi there.
I've created a confidence based multiple choice assessment in Storyline 360 which records the question, the answer the user selected, a score for their confidence, a score for their knowledge, and a total score (a combination of the confidence and knowledge scores). There's a bank of 10 questions, 5 of which are randomly selected each time the user takes the assessment.
I've got all of this information being reported to the LMS using hidden survey slides and some JavaScript, but for some reason it's duplicating some questions and answers in the LMS answers report when the assessment is retaken.
The first attempt reports correctly (5 questions and 5 answers showing).
The second attempt reports the 5 questions and answers that were selected, but it's also showing some of the questions and answers that were done in the first attempt (in the example that I tested it was showing the 5 questions and answers on the second attempt and 2 questions and answers from the first attempt). This is a problem, as there's no way for us to differentiate between which questions and answers were in the first attempt and which were in the second (which renders the whole thing useless).
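For reference, the JavaScript in question isn't shown, but a typical pattern for writing survey answers through a SCORM 1.2 API looks roughly like the sketch below (the API stub, question IDs, and the `reportAnswer` helper are hypothetical, not the actual course code). The duplication described above is consistent with the LMS persisting `cmi.interactions` across launches of the same attempt, so new answers get appended after the old ones:

```javascript
// Hypothetical stub of the SCORM 1.2 API object an LMS exposes.
// In a real course this object is found by walking parent frames.
const API = {
  data: {},
  LMSSetValue(key, value) { this.data[key] = String(value); return "true"; },
  LMSGetValue(key) {
    if (key === "cmi.interactions._count") {
      // Count distinct interaction indices already written.
      const idx = new Set();
      for (const k of Object.keys(this.data)) {
        const m = k.match(/^cmi\.interactions\.(\d+)\./);
        if (m) idx.add(m[1]);
      }
      return String(idx.size);
    }
    return this.data[key] || "";
  },
  LMSCommit() { return "true"; },
};

// Append one survey-style interaction at the next free index.
// If the LMS keeps interactions from an earlier launch of the same
// attempt, _count starts above 0 and old answers sit alongside the
// new ones in the answers report.
function reportAnswer(questionId, response) {
  const n = Number(API.LMSGetValue("cmi.interactions._count"));
  API.LMSSetValue(`cmi.interactions.${n}.id`, questionId);
  API.LMSSetValue(`cmi.interactions.${n}.type`, "fill-in");
  API.LMSSetValue(`cmi.interactions.${n}.student_response`, response);
  API.LMSCommit("");
}

reportAnswer("Q03_confidence", "4");
reportAnswer("Q03_knowledge", "correct");
console.log(API.LMSGetValue("cmi.interactions._count")); // "2"
```

Nothing in the sketch distinguishes one attempt from another, which is why an attempt marker in the interaction ID (or a fresh LMS attempt) matters.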
I've reported this to our LMS technical team and they've advised that this is an issue with Storyline rather than the LMS. Here's the response I had from them:
I got a response from the developers regarding the issue with the answers report. It seems that the course is sending the data shown in the Answers Report, but we can't determine why this is the case. In order to get to the bottom of this, you would have to check with the course author why the course is sending extra questions and answers to the LMS when a course is retaken. As Absorb is a course host, not a course provider, we record only what the course sends to the LMS, not the content or configuration of the course.
I can send the Storyline file to someone, but as it's an assessment for a client I'd rather not upload it publicly.
Thanks.
15 Replies
In the identifier there is normally a number that is incremented to identify it as a second attempt. Are you remembering to reset the results slide on the second attempt?
The course is set to never resume, and there's no 'Retry' button - if they fail they have to close the assessment, watch a video about the topic, then re-open the assessment and do it again. I've tried adding a 'Reset results' trigger on the first page, but this just causes it to crash as soon as it's opened.
Do I still need to reset the results even if it's set to never resume? If so, how do I do this without the course crashing?
If you want to view the number of attempts then you will need to reset the results. You need to add a 'Reset results' trigger; I would try adding it when they leave the results slide. Timeline-start triggers for reset can sometimes crash the course, so you may need to use an on-click trigger instead.
I just added a 'Reset results' trigger to the 'Exit module' button, so the results were reset upon exiting the course, but it's still reporting extra questions and answers after the second attempt - instead of reporting 10 questions and answers (5 per attempt), it reported 13, so it has duplicated 3 answers this time. So it looks like it's not related to resetting the results.
On the 3rd attempt it reported 9 questions instead of 5. It seems to be saving any questions that were answered in a previous attempt but not in the current one, and reporting them as if they were part of the current attempt.
I am sure it is the reset results. Each time they take the assessment you need to reset the results (I don't know the structure of your course, but I don't think having it on the exit button is the right place).
The purpose of the reset results trigger is specifically for retaking a course while still in the course: the user takes the assessment, passes or fails, and that data is sent to the LMS. If the user retakes during that session then you need to reset the results. That way it will add one to the attempts, and the interaction data being sent will be incremented so you can see it is a second attempt.
It sounds like a design issue, where you are forcing resume and not resetting results. Or the user is able to navigate back to the questions and is getting there without the results being reset.
Resetting the results is the only way to send a second or third attempt to the LMS within a resumed session.
However, there is one other possibility: the course is being marked complete and is opening in review mode, but in that case you should not be able to send any data.
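The attempt-incrementing behaviour described above can be illustrated with a small sketch. The names below are illustrative, not Storyline's internal identifiers; the point is that a reset bumps a counter that makes each batch of interaction IDs distinct:

```javascript
// Illustrative attempt counter: resetting results before a retake
// bumps it, so interaction IDs from different attempts never collide.
let attempt = 1;

function resetResults() {
  // Clear quiz state for the retake (omitted here) and bump the
  // attempt counter so the next batch of IDs is distinguishable.
  attempt += 1;
}

function interactionId(questionId) {
  return `${questionId}_attempt${attempt}`;
}

console.log(interactionId("Q07")); // "Q07_attempt1"
resetResults();
console.log(interactionId("Q07")); // "Q07_attempt2"
```

Without the reset, both batches would carry identical IDs, which matches the indistinguishable duplicates seen in the answers report.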
They're not retaking the assessment during the same session - there's no way for them to do this. They're closing the course, watching a video which is hosted elsewhere, then re-opening the course and redoing the assessment.
But if they are forced to resume they remain in the same session, so the results need to be reset. I don't think the results will reset when you exit; to change any variables and have them saved in the resume data, you must jump slides before you exit.
So am I right in thinking that Storyline will report that a quiz attempt is the same one if the user closes a course and re-opens it, even when it's set to never resume?
It depends on the LMS; most LMSs will need the user to click a new-attempt button. Looks like I misread and you have it set to never resume. In this case it is the LMS that is the problem: each time they open the course it is the same attempt as far as the LMS is concerned. You need to find a way to force a new attempt in the LMS each time the course opens, which will give you separate results. Which LMS are you using?
It's Absorb. They've looked into it and seem to think it's related to Storyline rather than anything on their end.
I would check if there is an option in the LMS to force a new attempt. Storyline has no way to tell the LMS that the attempt is new; that is done when the course is opened by the LMS. Storyline sends the data, which looks like it is all getting compiled into one attempt in the LMS.
You may be able to force a new attempt by setting the course's reporting to complete/failed.
Failed is normally terminal for an LMS, and it will then start a new attempt.
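In SCORM 1.2 terms, "terminal" here means writing a final `cmi.core.lesson_status` before the session closes. A minimal sketch, again with a hypothetical stub in place of the real LMS-provided API object (whether the LMS actually opens a fresh attempt afterwards is LMS-dependent):

```javascript
// Hypothetical stub standing in for the LMS-provided SCORM 1.2 API.
const API = {
  data: { "cmi.core.lesson_status": "incomplete" },
  LMSSetValue(key, value) { this.data[key] = String(value); return "true"; },
  LMSGetValue(key) { return this.data[key] || ""; },
  LMSFinish() { return "true"; },
};

// On exit, record a terminal status. Many LMSs treat "failed" (or
// "completed") as closing the attempt, so the next launch starts
// fresh instead of resuming the old data.
function exitWithTerminalStatus(passed) {
  API.LMSSetValue("cmi.core.lesson_status", passed ? "completed" : "failed");
  API.LMSFinish("");
}

exitWithTerminalStatus(false);
console.log(API.LMSGetValue("cmi.core.lesson_status")); // "failed"
```

With Storyline's built-in "Completed/Failed" reporting option, the course sends this status itself; the sketch only shows what crosses the API boundary.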
I've gone back to Absorb to see if they can advise how to make the LMS force a new attempt, as I can't see how it can be done on the admin panel. I've tested it using the 'Completed/failed' reporting, but it's still doing the same thing.
It seems to report properly when I add a 'Retake' button which resets the results, but when the assessment is closed and re-opened it's still saving the previous attempts. So the issue seems to be with the course not creating a new attempt when it's re-opened. I have no idea how to resolve this though.
The course cannot tell the LMS to open a new attempt; that is handled at the LMS level.
You can set the course to fail if they do not meet the requirements, and that should force a new attempt, but you will not be able to get Storyline to force a new attempt itself; that is under the control of the LMS.