Need help with test scoring
Jan 20, 2020
By Kyle Millar
I'm trying to build a module that contains 20 test questions. If the user answers a question correctly, they simply move on to the next question. However, if the user answers a question incorrectly (Question A, for example), they will be given some rationale for why the answer they selected was incorrect, followed by a second question that covers the same material, with slightly different phrasing (Question B). The passing score for this exam is 100%. But what I cannot figure out is how I can set up the scoring for the exam so that regardless of how many questions the student had to answer, if they answer them all correctly, they pass the exam.
1 Reply
Others may have different takes on this. But out of the box, this kind of functionality doesn't exist.
However, people come up with ideas for new question types every day, and the way they implement them is to build the custom interaction manually, in hidden LAYERS, on top of a simpler question type, like a TRUE/FALSE question.
The concept is simple, but it's a lot of work to implement. And the major downside is that ultimately, it is only a T/F question and that is all that can be reported to the LMS.
If the learner answers your custom question correctly, a trigger sets the base-layer TRUE option to selected, then submits the question as a whole as complete.
If the answer given is not correct, then based on your description, you move to more LAYERS: a Q1Feedback layer explains what the answer should have been, and from there you can present the backup Q2 question. All the same work goes into Q2 as went into Q1.
If they get Q2 correct, you finally set the base T/F to TRUE and move on.
If not, you may have to submit it as FALSE, which means you already know they are going to fail the quiz overall, since your passing score is 100%.
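Outside of Storyline, the logic those layers and triggers encode can be sketched in a few lines. This is just an illustration of the flow described above, not anything Storyline runs; the function names and example data are hypothetical:

```python
# Sketch of the remediation flow: each base question is ultimately a
# TRUE/FALSE submission, regardless of whether the learner needed the
# backup (Q2) question. Hypothetical, for illustration only.

def grade_question(primary_correct, backup_correct=None):
    """Return the TRUE/FALSE value submitted for the base-layer question.

    Correct on Q1: base layer is set TRUE immediately.
    Wrong on Q1: learner sees the rationale layer, then answers Q2;
    the base is TRUE only if Q2 is correct, otherwise FALSE.
    """
    if primary_correct:
        return True
    return bool(backup_correct)

def passes_exam(results):
    # Passing score is 100%: every base question must end up TRUE,
    # no matter how many questions the learner actually answered.
    return all(results)

# Example: learner misses one question but gets its backup right.
answers = [
    grade_question(True),          # Q1 correct first try
    grade_question(False, True),   # Q1 wrong, backup Q2 correct
    grade_question(True),          # Q1 correct first try
]
print(passes_exam(answers))  # True: all base questions submitted as TRUE
```

The key point the sketch makes explicit: the LMS only ever sees the TRUE/FALSE result of each base question, so a learner who needed the backup question is indistinguishable from one who didn't, which is exactly what the original poster wants.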
Is that enough info to start you down the path of your custom question design?