Retry INCORRECT questions only

Jul 10, 2017

Hi team

Can anyone give me instructions on how to include only the INCORRECT questions when students retry the quiz? I know there is a video using Storyline 1, but the interface and variables boxes are different when I use Storyline 2. I have true/false, multiple response, multiple choice, and drag-and-drop questions.

I have hundreds of questions for each module, and there are several modules.

Here is what I need:

1. Students get unlimited attempts.

2. Students attempt INCORRECT answers only.

(Note: it is not necessary to review the test, only to RETRY the incorrect questions.)

This is quite urgent!

Thanks in advance team

Louise Lindop

Hi Tom,

As far as I'm aware there's no automatic way to do this.

I would:

  • Create a variable for each question, initially set to 0.
  • For each question, on the base layer, add a trigger (or triggers) to change the state of the correct radio button(s) to Selected when the timeline starts. Add a condition to only do this if the question's variable is equal to 1.
  • For each question, on the base layer, add a trigger to submit the interaction when the timeline starts, again with a condition that the question's variable is equal to 1. Make sure this trigger sits below the one above.
  • For each question, on the Correct layer, add a trigger to jump to the next slide when the timeline starts if the question's variable is equal to 1.
  • For each question, on the Correct layer, add a trigger to set the question's variable to 1 when the timeline starts. Make sure this trigger sits below the one above.
  • On the results slide, make sure there is a button that displays when they don't pass. This button should reset the results and take them back to the first question slide.

I've attached a basic example.
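
If it helps to see the variable logic in one place, here's a rough sketch of what those triggers amount to, written as the kind of JavaScript you could run from an Execute JavaScript trigger just to watch the flags change (the names Q1Done etc. are placeholders; the actual state change and Submit interaction steps still have to be ordinary Storyline triggers, since JavaScript can't do those):

// Rough sketch only - mirrors the per-question flag logic in the list above.
// Assumes number variables Q1Done, Q2Done, ... exist in Storyline and start at 0.
var player = GetPlayer();

// On a question's Correct layer (below the "jump to next slide if already done" trigger):
// mark this question as answered correctly so it gets skipped on the next retry.
player.SetVar("Q1Done", 1);

// On the same question's base layer, when the timeline starts, the triggers only
// pre-select the correct option(s) and auto-submit when the flag is already 1:
if (player.GetVar("Q1Done") === 1) {
  // ...change the correct radio button(s) to Selected and submit the interaction
  // (these two steps are regular Storyline triggers, not JavaScript).
}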

Louise

John Hanson

Hi Team,

I'm trying to edit the retry-wrong-question-only.story file to add a few more questions. Ultimately I'll need to add up to 50 questions. The problem is, even though I copy the triggers, the order, and everything else I can think of, the variables for the slides I added (questions 5-7) don't get changed when you answer correctly. What am I doing wrong? I've tried this with both the variables being numbers, as well as with an earlier sample file where they were true/false variables.

I display the variables on each page so I can see what's happening.

 

Help!

Louise Lindop

Hi Andrea. No, I don't think so. You would draw the 7, then, for example, get 3/7 incorrect; on retry you'd draw another 7, which may or may not include some of the 7 from the initial draw, so I don't see this method working in this case. If any questions that crossed over from the first 7 to the second 7 had been answered correctly, they would be skipped, but you might only get, for example, 1 of those plus another 6 new ones, even though, if I understand correctly, you would really only want the 3 missed questions presented to the user. Louise.

Azizi Abdullah

Hi all. I tried Louise's solution on a 5-question quiz, but for some reason it glitches when viewing in mobile (landscape) mode only. I can get 2-3 questions wrong and go back and answer them fine in desktop mode (using the preview in Storyline 3), but when I try the same in mobile landscape preview, it navigates back to the question and then either freezes, or the three 'loading' dots appear and nothing else happens.

I was wondering if anyone would have the time to have a look at this and possibly let me know what's going on? Thanks. 

I've submitted a case as well (01241185), so here's hoping!

Louise Lindop

Hi Steven. I took a look and can't see anything obvious that's causing this. Interestingly the original sample I made doesn't behave like this, but I can't see any difference between the two. I also created another one recently and it all works OK in the landscape mobile mode too. Hopefully they'll look at your case and come up with something. In the meantime, I'd probably just start again and see if the same thing happened.

Azizi Abdullah

Hi Louise. I'm using Storyline 3, and it consistently glitches on multiple choice questions when viewed in the mobile landscape layout. Tech support have had a look and didn't quite fix it (thanks for trying, Ryan), but gave me enough food for thought to fix it myself.

I needed to use two sets of variables: one to tag each question as correct (set on the Correct layer), and a second, separate set used by the trigger that jumps to the next slide when a question has been tagged as correct. I also moved the triggers that set the variables to the top of the trigger list (Ryan's recommendation).

Fixed sample is attached. Thank you Ryan!
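
For anyone trying to replicate this without opening the file, here's my reading of the two-variable idea as a rough sketch (the variable names, and exactly where the copy happens, are my guesses; the attached file is the real fix):

// Hypothetical sketch. Q1Correct is still set to 1 on question 1's Correct layer,
// but the "jump to next slide" trigger now checks a second variable, Q1Skip.
// Copying Correct into Skip only when the Retry button is clicked means that tagging
// a question correct during the retry doesn't immediately re-fire the skip on that slide.
var player = GetPlayer();
player.SetVar("Q1Skip", player.GetVar("Q1Correct"));
player.SetVar("Q2Skip", player.GetVar("Q2Correct"));
// ...one line per question, or the equivalent Set Variable triggers in Storyline.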

Azizi Abdullah

No problem - however...

Once I modified my actual project to match the sample, the issue remained. It must be a latency issue. I delayed some of the triggers by 0.25s and it works fine now, albeit with a bit of a delay as the quiz skips through each question. I made this a little more bearable by hiding the "Continue" button on each "Correct" layer when the question was being retried.

Not the prettiest solution, but a solution that didn't require coding. That's what I love about Storyline: you can do almost anything without coding!

Freddy Blaaberg

This seems to be the solution that I need.
@Steven - I have tried your file. The only thing I'm missing is that once you have finally answered everything correctly in your table, the results slide still shows the "Retry" button.

In my case the member does not need to answer correctly, just to confirm (tick off) each issue.
The result then just has to be 100%.
My challenge is that the member can't necessarily confirm all issues in one go. So if he confirms some of them and continues, he has to be able to go into the file later to confirm the remaining issues. In the end he has confirmed all issues and has passed the course.

Is there an updated version? (I use Storyline 360.)

Jennifer Morgan

Hi Everyone,

This has been extremely helpful, since I don't use variables very much. My quizzes entail other types of questions, though, and the triggers provided don't necessarily fit: "on the base layer, add a trigger to change the state of the correct radio button(s) to Selected when the timeline starts, with a condition to only do this if the question's variable is equal to 1" isn't an option for these question types. I've been trying to figure out how to make it work for the following:

Numeric Entry

Drag and Drop 

Fill in the Blank

Can anyone help me? 

Louise Lindop

Hi Jennifer

Fill in the blank and numeric entry can be done in a similar way to multiple choice. I've attached an example where I've included these.

I'm not sure you'll be able to incorporate drag and drop. I've played around but so far have found no way of forcing the drag items into the drop areas automatically.
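
In case it helps to spell out what "a similar way" looks like for these: there's no radio button to pre-select, so the equivalent trick is to set the question's built-in entry variable (Storyline creates one named something like TextEntry or NumericEntry for each of these questions) to an accepted answer when the timeline starts, with the same "only if the flag is 1" condition, and then submit the interaction. That can be a plain Set Variable trigger; as JavaScript it would look roughly like this (the names are placeholders):

// Rough sketch, assuming Q4Done is the per-question flag and TextEntry is the
// question's entry variable. Only pre-fill when the question was already
// answered correctly on an earlier attempt.
var player = GetPlayer();
if (player.GetVar("Q4Done") === 1) {
  player.SetVar("TextEntry", "correct answer");  // any answer the question accepts
  // ...followed by a "Submit interaction" trigger with the same condition.
}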

Louise

Paul Stover

Hi. Here is an example of what my partner and I came up with to have learners retake only the incorrectly answered quiz questions. We needed to support various question types, like drag and drop, multiple-click software simulations, multiple choice, and text entry. We also wanted to eliminate the stuttering caused by the above examples as they skip through each question that was previously answered correctly. What we determined to be the easiest solution is to make a custom results slide that uses our own result variables rather than the built-in variables that a regular results slide uses. Please see the triggers on the Retry button on the "fail" layer of the results slide and also on the "Correct" layers of each question. The sample here has had all of the content removed from it but shows how various question types can be used in this way. Also, note that we did have to add a little bit of JavaScript on the pass layer to submit the results to the LMS.
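
To make the custom-results idea a little more concrete, the tally on the results slide boils down to something like the sketch below (the variable names are placeholders rather than what's in the attached file, and the same thing can be done with plain Add/Set Variable triggers instead of JavaScript):

// Hypothetical tally for a custom results slide.
// Each question's Correct layer sets its own flag (Q1Correct, Q2Correct, ...) to 1.
var player = GetPlayer();
var total = 5;        // number of questions in the quiz
var correct = 0;
for (var i = 1; i <= total; i++) {
  if (player.GetVar("Q" + i + "Correct") === 1) {
    correct++;
  }
}
player.SetVar("CustomScore", Math.round((correct / total) * 100));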

Paul Schneider

On a purely instructional design level, what is the instructional value of having someone retake a test an unlimited number of times until they get all the questions right, while letting them skip the questions they already got right? On the surface it seems you are just encouraging retrying/guessing. It doesn't really assess the user in any effective way, and if the questions are meant as learning, why not just offer a bunch of practice questions they can keep trying?

Azizi Abdullah

One could also argue that only giving the participant one chance to answer, then telling them the correct answer when they get it wrong, has less educational value than making them attempt the question again. If they attempt the question again, at least there's a 50% chance they'll actually think about WHY they got it wrong the first time. It's another opportunity for learning, albeit at the risk of guessing, but it's better than just saying "That's incorrect" and expecting them to learn from that deflating experience.

Paul Schneider

Agreed - but then you could just leave it as a test with unlimited attempts, maybe even turn off review if you want - or leave review on - but they get random pooling again, or all the questions. If they got an answer right before, there's no reason they couldn't get it right again - and reinforce the learning - unless, of course, they are just taking it multiple times and guessing, which is what unlimited attempts that eliminate what I got right (or guessed right) essentially encourage.

J. Hanson

In my case, the customer required employees to score 100% on the assessment (job-critical information and processes). When we initially set the quiz up so that they had to take all 50 questions over again if they didn't get them all right, with no feedback on what they had missed, students were understandably upset. One misread question and you were going back through the whole "experience" again, unsure of what you had missed. This was not a valuable learning experience; it just created a lot of work-around behavior. I guess people sharing answer keys is a sort of team-building activity, but not really what we were after.

Retrying only the questions they missed forced them to read those questions more carefully and focus in on what they didn't understand, rather than tediously redoing the things that they did understand. Students believed that the number of tries was being tracked (in reality it was not; we just didn't disabuse them of the notion), so they worked towards as few tries as possible. From a business-outcome perspective, the assessments were effective in solidifying people's understanding of the subject, which is what we needed.

Paul Schneider

Thanks for sharing. The fact that students believed the attempts were being tracked (or if the attempts had been limited) does add value in this situation. Another solution, though, would be to break the test into smaller tests/chunks (maybe based on some overarching topics). Then it wouldn't be so onerous to go through it all again, and it would really accomplish the same goal; and if you "guessed" right, well, you'd have to learn that the guess was correct, not just a guess. When there is no limit on attempts, it just becomes a bit silly (IMO), since it doesn't really accomplish the learning goals - it's just a way to check off a compliance requirement.

J. Hanson

Paul, I don't disagree with anything you said. My preference is to have reinforcement "knowledge checks" throughout a series of eLearning episodes that help with knowledge transfer, followed by a series of guided real-life activities with debrief/reflection, followed by an in-person verbal assessment (with a rubric for guidance) by their manager, with reinforcement options available if students don't meet the competence guidelines. If that isn't possible, then scenario-based testing would be my next option. Multiple choice, true/false, matching, drag-and-drop, multiple select, etc. would be further down the list. Unfortunately, the constraints around a lot of projects take you to the multiple-choice level pretty quickly. Then it's just a matter of figuring out which compromises to settle on to get learners as close as possible to the business outcomes you are trying to drive.