Problem with the scoring "rules" between SL1 and SL2
Feb 18, 2015
Since deploying SL2, we have detected an issue that is very critical for us. With an Assessment built from a Question Bank and a given number of points per question, the passing score is computed differently in SL2 than it was in SL1.
We publish with the Tin Can API tracking standard.
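For context, a Tin Can (xAPI) statement reports the score in a `result.score` object with `scaled`, `raw`, `min`, and `max` fields, and `scaled` (commonly raw divided by max) is what the LMS typically displays as the percentage. Below is a minimal sketch of such a result block; the values are illustrative only, not captured from our files:

```python
# Minimal sketch of the "result" block of a Tin Can (xAPI) statement.
# The field names come from the xAPI spec; the values are illustrative.
result = {
    "score": {
        "raw": 10.0,              # points obtained so far
        "min": 0.0,
        "max": 180.0,             # total points available in the assessment
        "scaled": 10.0 / 180.0,   # ~0.0555 -> displayed by the LMS as 5.55%
    },
    "success": False,             # still below the 144-point passing score
}
```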
- Example with an SL1 file:
Number of questions (10 points each): 18
Minimum score to pass: 80% (144 of 180 points)
After answering the first question correctly, the reported score is 5.55%.
We have a tool that lets us inspect the tracking parameters while a module is playing, and with SL1 we can see the following:
The Score / % / Passing / % line shows that 144 points are needed to pass the module (we currently have 10.00).
So far everything is OK.
- Example with an SL2 file:
Number of questions (10 points each): 5
Minimum score to pass: 80% (40.00 of 50 points)
After answering the first question correctly, the reported score is already 100%:
The Score / % / Passing / % line shows that only 8.00 points are needed to pass the module (we currently have 10.00).
How is this possible? As we continue through the assessment, the passing score grows while the obtained score decreases or stays at the same level, depending on whether each answer is correct or incorrect. The sketch below reproduces these numbers.
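My reading of these numbers (an assumption on my part, not confirmed SL2 behaviour): SL1 divides the obtained points by the total points of the whole assessment, whereas SL2 appears to divide by the points attempted so far, which would explain both the 100% after one correct answer and the 8.00-point passing score (80% of the 10 points attempted). A minimal sketch, assuming exactly that (the function names are mine, for illustration only):

```python
def sl1_style(points_obtained, points_per_question, total_questions, pass_pct=0.80):
    """Score against the full assessment total (observed SL1 behaviour)."""
    max_points = points_per_question * total_questions
    return points_obtained / max_points, pass_pct * max_points

def sl2_style(points_obtained, points_per_question, questions_answered, pass_pct=0.80):
    """Score against the points attempted so far (what SL2 appears to do)."""
    max_so_far = points_per_question * questions_answered
    return points_obtained / max_so_far, pass_pct * max_so_far

# After one correct 10-point answer:
print(sl1_style(10, 10, 18))  # (0.0555..., 144.0) -> 5.55%, pass at 144 points
print(sl2_style(10, 10, 1))   # (1.0, 8.0)         -> 100%,  pass at 8.00 points
```

Under that assumption, the passing threshold grows with every question attempted and the scaled score can only fall or hold, which matches what we observe.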
My problem is that, since we moved to SL2, learners only need to answer one question: if they answer the first question correctly and then close the module, the reported score is 100% and they pass the module in our LMS.
With SL1 files we did not have this problem, as learners had to go through the whole assessment to reach the maximum score.
To sum up:
- SL1: the score climbs from 0 toward 100 as you navigate the assessment
- SL2: the score starts at 100 and falls (or holds) as you navigate the assessment
My questions: did you change something in SL2? Is it a bug? Is it due to the new release of the Tin Can API?
Thanks in advance for your support.