Problem with scoring "rules" between SL1 and SL2

Feb 18, 2015

Hello

Since deploying SL2, we have detected an issue that is very critical for us. When starting the Assessment (Question Bank), the passing score, which depends on the number of points per question, is calculated differently in SL2 than it was in SL1.

Some information:

We publish with the Tin Can API tracking standard.

Explanations:

  • Example with an SL1 file: 

Number of questions (10 points each): 18

Minimum score to obtain: 80% (so 144 points)

After answering the first question correctly, the current score is 5.55%.

We have a dedicated tool to check the various parameters while a module is running, and with SL1 we can see the following:

[SL1 screenshot]

The Score / % / Passing / % line shows that to pass the module we need to obtain 144 points (so far we have obtained 10.00).

So everything is OK
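
To make the numbers explicit, here is a small sketch of how SL1 appears to compute these values (my own reconstruction from the figures above, not Articulate's code; the variable names are mine):

# SL1 (as observed): the maximum score is fixed at the full assessment total
questions = 18
points_per_question = 10
passing_percent = 0.80

max_score = questions * points_per_question     # 180 points
passing_score = passing_percent * max_score     # 144 points
raw_score = 1 * points_per_question             # 10 points after one correct answer
scaled = raw_score / max_score                  # 0.0555... i.e. 5.55%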

  • Example with an SL2 file: 

Number of questions (10 points each): 5

Minimum score to obtain: 80% (so 40.00 points)

After answering the first question correctly, the current score is already at 100%:

[SL2 screenshot]

The Score / % / Passing / % line shows that to pass the module we need to obtain only 8.00 points (so far we have obtained 10.00).

How is this possible? As we continue through the assessment, the passing score grows, and the obtained score decreases or stays at the same level depending on the answer status (correct/incorrect).
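
My guess, based only on the figures above, is that SL2 computes the maximum score (and therefore the passing threshold) from the questions answered so far rather than from the whole question bank. A small sketch of that hypothesis:

# SL2 (hypothesis): the maximum score grows with each answered question
points_per_question = 10
passing_percent = 0.80

answered = 1                                    # only the first question answered
max_so_far = answered * points_per_question     # 10 points
passing_so_far = passing_percent * max_so_far   # 8 points, as shown in the tool
raw_score = 10                                  # first answer was correct
scaled = raw_score / max_so_far                 # 1.0 i.e. 100%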

My problem is that, since we moved to SL2, people only need to answer one question: if they answer the first one correctly and close the module, the score obtained is 100% and they pass the module in our LMS.

With SL1 files, we did not have this problem, as learners had to go through the whole assessment to reach the maximum score.

To sum up:

  • SL1: the score climbs from 0% towards 100% as you work through the assessment
  • SL2: the score starts at 100% and falls (or holds) as you work through the assessment
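
For illustration only, here is roughly what I would expect the Tin Can result score to contain after the first correct answer in each version, assuming the standard scaled/raw/min/max fields (I have not captured the actual statements, so these values are reconstructed from the examples above):

# Hypothetical Tin Can result.score objects after one correct answer
sl1_score = {"raw": 10.0, "min": 0.0, "max": 180.0, "scaled": 10.0 / 180.0}  # about 5.55%
sl2_score = {"raw": 10.0, "min": 0.0, "max": 10.0, "scaled": 1.0}            # 100%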

My question is: did you change something in SL2? Is it a bug? Is it due to the new Tin Can API release?

Thanks in advance for your support

Regards

Julien

4 Replies
Justin Grenier

Good Afternoon, Julien.

Before we dig into the technical details, can you please test for reproducibility of the problem within SCORM Cloud? SCORM Cloud is an industry-standard testing engine, and although "SCORM" is in its name, you can also use it to test Tin Can API content.

If your content works properly at SCORM Cloud and the problem only occurs within your LMS, please open a support case with your LMS provider to troubleshoot the issue.

On the other hand, if you can reproduce the problem within SCORM Cloud, we'd invite you to submit a Support Case so that we can take a closer look at the problem.

Thanks!

 

Julien Martin-Schmitt

Hello

I have tested on SCORM Cloud, but it only saves data once the Assessment is finished and the result slide is reached. If we only go through the course (and not the Assessment), it does not save anything (the time spent, for example).

So for me it is not a relevant test, sorry. I also don't see how the recording moment could explain the following difference:

  • In SL1, after answering the first question, the objective is 80% of the total assessment points (144 points in the example above), so our current score is around 5%.
  • In SL2, after answering the first question, the objective is 10 points (the objective for a single question), and this objective increases as we navigate through the assessment.

Even if I can accept that our LMS does not record at the right moment (i.e. not at the result slide), I don't understand why this changed between SL1 and SL2.

Can you please look into this?

I will open a support case.

Thanks

Julien
