[SL2] - Pass if less than 5 incorrect answers (Multi-language)

Hi guys,

I'm building a multi-language course where a language selection is made at the start. This updates a variable and switches the states of text boxes throughout the course to the corresponding language (one state per language). I did not find a better way to do this.

The first problem is the questions/quiz. There is no way to change states here, so I need to create 6 question banks (I've got 6 languages) and navigate to the correct bank based on that same variable. No problem. However, how do I skip counting the questions in the other question banks when calculating the score?

One solution would be to do some math, but I was hoping there would be a cleaner way of doing this, only counting the questions that the user is actually served.

In my example, the user needs fewer than 5 incorrect answers to pass.
How should I check this? :)

I'm open to other solutions!

What I'm trying to accomplish here is a good way to create a multi-language course, without having to maintain 6 different projects.

My current workflow works great, except for the quiz part. The course is divided into 7 chapters, each chapter ending with a graded survey. But instead of 7 question banks, I need 42, because I need 7 for each language. (Each question bank draws 3 questions from a total of 4-6.)

 

17 Replies
Dave Cox

Hi Stian,

Yes, you can create a results slide for each question bank. The problem I've run into in the past, though, is that you have to pick one of those results slides to report to the LMS when you publish. Articulate provides no way to select the correct results slide based on the bank the learner was given.

The solution I've had success with is to create a final results slide that all of the other results slides report to, and select this one to report to the LMS. Then, since this slide reports a sum of all of your quiz slides, I use JavaScript to override the value it sends to the LMS with values that I collect during the quiz. I can post additional information if you are interested.
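Roughly, the idea is to recompute the score over only the questions the learner actually saw, then write that value to the LMS yourself. Here's a minimal sketch (SCORM 1.2 data model; the helper names are mine, and the one-step API lookup is simplified — a published course would walk the parent frame tree to find the API object):

```javascript
// Percentage over only the questions the learner actually saw.
function presentedScore(correctAnswers, presentedQuestions) {
  return Math.round((correctAnswers / presentedQuestions) * 100);
}

// Push the adjusted value to the LMS (SCORM 1.2). Guarded so the sketch
// runs outside a browser too; real courses search all parent frames for
// the API object rather than this single lookup.
function reportScore(score) {
  var api = (typeof window !== "undefined" && window.parent && window.parent.API)
    ? window.parent.API
    : null;
  if (api) {
    api.LMSSetValue("cmi.core.score.raw", String(score));
    api.LMSCommit("");
  }
}

// e.g. 16 of 20 presented questions answered correctly
reportScore(presentedScore(16, 20)); // writes 80 when an LMS API is present
```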

Stian Larsen

Hmmm, I see. So simply directing the user to the correct results slide based on the selected language (variable) will not be possible, since you have to select a single slide to report on when you export.

Yes, please share additional information if that's OK! :) I'm familiar with web development, but I've never used JavaScript in combination with an LMS/SCORM.

EDIT: Why would you need multiple results slides if you handle the result using JavaScript? In my working example, my results slide reports on all question banks, and I adjust the required percentage to reflect what actually needs to be correct. In my example, I say 13% is required, which translates to ~80% of the presented questions (as only 1/6 of the banks is used for each language). 80% of the presented questions equals fewer than 5 wrong answers. So far so good. The only problem here is that the user will see a required score of 13% instead of 80% and will be confused (this is on the LMS side).
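For reference, the arithmetic behind that 13%: when only one of six banks is presented, a pass mark over the presented questions scales down proportionally at the overall level. A quick sketch with this thread's numbers (the function name is just illustrative):

```javascript
// Overall required percentage when only a fraction of the banks is
// actually presented to the learner.
function overallRequired(requiredOnPresented, banksTotal, banksPresented) {
  return requiredOnPresented * banksPresented / banksTotal;
}

// 80% required on the presented bank, 1 of 6 banks shown:
console.log(overallRequired(80, 6, 1).toFixed(1)); // "13.3"
```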

Stian Larsen

Thanks. I will read through it. One quick question though, as I just read the original question in that thread:

Will SL2 report the correct percentage to the LMS? If I have 6 branches and 6 results slides that combine into one final slide, will that slide submit 100% completion if all questions from one branch are answered? If so, that is enough in my example. Pass/fail is handled at the LMS; all I need is the correct percentage to be transferred.

I'll read the post now :)

Stian Larsen

This is not a very stable solution. What happens if the user re-enters the course without completing it, just to check back? Will the results be overwritten then, to show the "actual" 13% score or the adjusted 80% score? Because the LMS controls pass/fail based on percentage.

It's silly that there are no options to allow for such behaviour. I can think of many occasions where one would want to include multiple question banks but only show a selection of them based on how the course develops.

Dave Cox

Hi Stian,

What happens depends on how your course and the LMS are set up. I'm sure your LMS can be configured on a course-by-course basis for whatever you need. Once we pass a successful completion, we don't allow users to re-enter the course just to check back; if they start it again, they are taking another instance of the course for another grade. We pass a completion on a passing score and then update both in the LMS.

Storyline updates the LMS on every graded question slide and on every results slide, although it only sets the completion on the final results slide. This is when I update the score using JavaScript.

I agree that this may not be the best possible behavior, but as you observed, Storyline has no provision for selecting a question bank at run time and then scoring on only that bank. This is a workaround that I came up with to solve the problem, and it has worked fairly well for us. If you can find a better solution, I'm sure there is a whole bunch of people, including me, who would love to hear about it.

Stian Larsen

I really appreciate your help, Dave, and I'm sure I will end up using your solution :)
I'm still hoping to find another way to handle this, but it seems you've already found one of the few possible workarounds.

Again, thanks a lot for your help. If anyone else happens to stumble upon other solutions, please write it here :)

OWEN HOLT

Stian,

You can use one question bank but it might take a spreadsheet to manage your variables. Essentially, all of your questions rely on variables to populate what the user sees for both the questions and the answers.

I posted about this a few years ago with a screenr example. You can find my response in this thread: https://community.articulate.com/discussions/articulate-storyline/multi-language-graded-quizzing-with-question-banks#186547

There is also a sample course file that you can see there.

OWEN HOLT

Attached is a sample POC where you can see the same branching scenario in 3 different languages. All 3 branches go back to one single assessment. The assessment uses variables to present the questions and answers in the selected language. Hopefully, this will make more sense once you see it. 

Dave Cox

I don't think there is a good way to import variables from Excel, but it would be easy enough to import them from a text file. Of course, that would require JavaScript again, but the text file could be set up so the strings are easy to manage. I did something similar for a couple of games that I wrote, where I wanted to import multiple questions and answers for the game boards based on the topic. I can display as many topics as I want and then import the questions and answers that match that topic.
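As a sketch of that approach: ship a plain "Name=Value" text file next to the published course, parse it, and copy the values into Storyline variables. The file name and variable names below are made up; `GetPlayer()`/`SetVar()` is Storyline's JavaScript bridge, and the fetch part is shown as a comment since it only works inside a published course:

```javascript
// Parse a simple "Name=Value" strings file into an object.
function parseStrings(text) {
  var out = {};
  text.split(/\r?\n/).forEach(function (line) {
    var i = line.indexOf("=");
    if (i > 0) out[line.slice(0, i).trim()] = line.slice(i + 1).trim();
  });
  return out;
}

// In a published course you would fetch the file and push the values into
// Storyline variables, roughly:
//
//   fetch("strings_no.txt")
//     .then(function (r) { return r.text(); })
//     .then(function (text) {
//       var s = parseStrings(text);
//       var player = GetPlayer();                 // Storyline JS bridge
//       player.SetVar("Question1", s["Question1"]);
//     });

console.log(parseStrings("Question1=Hva heter du?")["Question1"]); // "Hva heter du?"
```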

Stian Larsen

Thanks a lot for your example, Owen! I can see that this would work for fewer questions, but as you mention, it would require a lot of variables. Also, I need to draw 3 questions from 5+ available, in 6 chapters, plus a control question at the end with 5 draws from all chapters combined, times 6 languages. That equals a minimum of 200 questions with 600 to 1,000 answers (3 to 5 alternatives each). =/

In my previous development software, the score was simply a count of correctly and incorrectly answered questions: not the total available, just the results. So even though I had many question banks with lots of questions, a question would not count toward the score unless it was presented to me and I answered it.

OWEN HOLT

You might be able to make use of what I call a ghost assessment, or ghost questions. Basically, you use variables controlled by the questions the user actually sees to answer "questions" they never see that are part of the final assessment. This gets you to the right count on the results slide you report to your LMS. You can read more about ghost assessments in my post here, even though it is for a very different use.