A question about skipping a question and not counting it in the total

I've built a course that is sort of adaptive.

There are 3 modules in it, and at the beginning the user makes some selections (using some nifty sliders) that determine which modules they need to take.

There are groups of quiz slides for each of the modules.

There is one learning point that is essential to understand that is in module 2 and module 3.

I'm testing the understanding of this with a multi-select quiz question.

Currently I have that question in both of the quiz blocks for the relevant modules. Each block has a results slide. The final slide of the course is a results slide that draws from all the other results slides.

I'd like to be able to skip the question in module 3 if they already answered it correctly in module 2.

I've done this by setting a variable in the 'correct' layer of the module 2 slide and testing that variable when the module 3 question slide timeline starts. If it is set then I skip to the next slide.

The problem is that this now means that they don't get any points for that question and this impacts their score for the module and the course.

How do I avoid 'dinging' them for skipping the question but still reward them if they only end up covering module 3 and only have to answer it once?

Also how do I avoid dinging them for an entire set of missed questions if they didn't need to cover that content?

Thanks

Alan

9 Replies
Trina Rimmer

Hi Alan.

Normally I'd suggest setting up your quiz to "submit all at once," which would allow users to skip questions, but as you've observed, those questions would still be counted in the overall total. There may be a community-generated method, but I didn't see one in a search through the forums. I'll tweet this question out in the hopes of attracting some responses for you!

Kevin Thorn

Hi Alan,

I believe this can be done with some careful consideration to the logic and evaluating EVERY possible condition. 

I built a similar project last year that's set up the same way, with various quiz questions scattered through the lessons; a few were mandatory as well as duplicated in other lessons. Each lesson had its own Results slide, plus an overall completion Results slide that evaluated all answered questions.

I'll take a look at your attached project file and see if I can find a quick solution for you.

Trina Rimmer

Alan, it looks like you're in good hands with Kevin. 

The kind of complex branching you're doing isn't typical for the majority of our users. But you're certainly welcome to weigh in with a feature request here if you think this is something we should be taking a closer look at in the future.

Alan Montague, CPLP

Good to hear that you are on the case Kevin ;)

My priority is allowing learners to skip entire modules' worth of questions without the points for those skipped modules counting toward the total they need to hit to pass or fail the course.
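The idea here, excluding skipped modules from the total rather than scoring them as zero, is essentially score normalization. Storyline's built-in Results slides don't do this directly, but the arithmetic can be sketched as something you might run from an Execute JavaScript trigger. Everything below (the function name, the module structure, the `attempted` flag) is a hypothetical illustration, not something from the attached file:

```javascript
// Hypothetical sketch: recompute the course percentage over only the
// modules the learner actually attempted, so skipped modules neither
// add points nor inflate the possible total.
function adjustedScore(modules) {
  // modules: array of { attempted, earned, possible }
  var earned = 0;
  var possible = 0;
  modules.forEach(function (m) {
    if (!m.attempted) { return; } // skipped module: excluded entirely
    earned += m.earned;
    possible += m.possible;
  });
  return possible === 0 ? 0 : Math.round((earned / possible) * 100);
}
```

In a real course the inputs would come from per-module Storyline variables (read via `GetPlayer().GetVar(...)`); the point is only that the denominator is built from attempted modules, not all modules.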

Cheers
Kevin Thorn

Hi Alan,

I took some time to study your file. While I think I understand the logic, there may be a different way to approach the solution you're looking for. For now, here's what I've determined.

First, you have two shapes on the start screen that change a variable when clicked. However, that's the only place the variable is used, and I'm not sure why a user would click those shapes when there are already buttons to advance to the quizzes. One thing I did notice is the trigger order on those shapes. Remember, triggers fire from top to bottom, so if you want to change a variable AND jump to another location, make sure the trigger that changes the variable fires first (top trigger).

For the quizzes, the Question Bank setup may be what's causing some of the conflict in the final results slide, as that sample question is being pulled from the bank twice, once for each quiz.

For Quiz 1 - Question 1, the variable "Q1Right" is set to True on the Correct layer, but there is no other place that evaluates that value or decides what to do if it is in fact True. As a matter of basic logic, remember that any time you set or change a variable, the idea is to evaluate that changed value for a later purpose.

In this case, I added a trigger to Quiz 2 - Question 1 to evaluate whether "Q1Right" is True. If it is, jump to the next question when the timeline starts. Essentially, you're evaluating whether the same question was answered correctly in Quiz 1. If so, skip it; if not, show it again.
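Written out as code, that trigger amounts to the following. In Storyline this is a built-in trigger ("Jump to next slide when timeline starts, on condition Q1Right == True"), not JavaScript; the `getVar` and `jumpToNextSlide` functions below are stand-ins for illustration only:

```javascript
// Illustrative only: the skip-if-already-correct trigger as a function.
// getVar and jumpToNextSlide are hypothetical stand-ins for Storyline's
// variable lookup and "jump to next slide" action.
function onTimelineStart(getVar, jumpToNextSlide) {
  if (getVar("Q1Right") === true) {
    jumpToNextSlide(); // already answered correctly in Quiz 1, so skip
  }
  // otherwise fall through and show the question again as normal
}
```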

As for your Results slides, they are working correctly. What I mentioned earlier about a different approach may need some more thought around scoring. In its current prototype state, Quiz 2 will result in a failure if question 1 is skipped, because of the current weighted points. That failure is directly tied to the weighted points and the passing percentage set on the respective quiz Results slides. So you'll have to consider: "What is a passing score *IF* question 1 is skipped in Quiz 2?" In other words, you'll need to determine a passing score for two scenarios: 1) answer 2 questions in Quiz 1 and 2 questions in Quiz 2, or 2) answer 2 questions in Quiz 1 and 1 question in Quiz 2. From there your overall Final Results should score correctly.
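As a rough illustration of that two-scenario arithmetic, assuming (hypothetically) each question is worth 10 points and the pass mark is 80%, the points needed to pass a quiz scale with how many questions were actually asked:

```javascript
// Hypothetical numbers: 10 points per question, 80% to pass.
// The threshold shrinks when a question is skipped, which is the
// adjustment the built-in fixed passing score doesn't make.
function pointsNeeded(questionsAsked, pointsPerQuestion, passPercent) {
  return Math.ceil((questionsAsked * pointsPerQuestion * passPercent) / 100);
}
// Scenario 1: Quiz 2 asks both questions  -> need 16 of 20 points.
// Scenario 2: question 1 skipped in Quiz 2 -> need 8 of 10 points.
```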

That's as far as I have time to troubleshoot tonight. In the end, I don't think your design is too terribly difficult to set up. It just needs a bit more careful thought in the logic and scoring.

Hope this helps (file attached).

Alan Montague, CPLP

Kevin

thanks for putting in time on this.

I guess I might have even made it more complex with the question bank red herring.

I'm interested in what you say about using scoring weights to make skipping questions not cause an automatic failure.

Sadly, I can't see how this would work if the learner skipped a whole section of questions because the choices they made at the beginning mean they don't need to cover that material.

Alan

Kevin Thorn

Yeah, the combination of a Question Bank along with individual Question Slides threw me off a bit, too.

I completely understand what you're wanting to do, but a prototype for this type of design may need to include the actual number of questions per section. Essentially, build out a full prototype.

It's hard to test with just a few sample questions given the complexity of the scoring you're wanting to do. If you could outline the total number of questions per section and how many sections there are, including the branching choice(s), I'll give it another shot at putting a prototype together.