Score by Question vs. at end - reporting issues?

Jul 08, 2013

Hi,
Has anyone ever come across SCORM reporting issues that (appear to be) based on whether question responses are all reported at the end versus question by question? It's happening in Storyline, but also in Presenter, on a completely random basis (it seems).

I have a client who appears to be saying that this is the basis of random LMS reporting errors, and whilst this sounds a little implausible to me, I thought I would put it out there and ask.

Basically, for every question answered and Continue button clicked the score is captured, but the status is only captured when the last question's Continue button is clicked (that is what they are claiming).
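
To illustrate what they're describing, here's a minimal sketch assuming a SCORM 1.2 package. The data model elements and API methods are standard SCORM 1.2, but the wrapper interface and function names are just illustrative assumptions, not what Storyline or Presenter actually outputs:

```typescript
// Illustrative sketch of the claimed reporting pattern (SCORM 1.2).
// The Scorm12API interface and function names are assumptions for clarity.

interface Scorm12API {
  LMSSetValue(element: string, value: string): string; // returns "true"/"false"
  LMSCommit(arg: string): string;
}

// Called after each question's Continue click: only the running score is sent.
function reportQuestionScore(api: Scorm12API, runningScore: number): void {
  api.LMSSetValue("cmi.core.score.raw", String(runningScore));
  api.LMSCommit("");
}

// Called only after the final question: both score and lesson_status are sent.
function reportFinalResult(api: Scorm12API, finalScore: number, passMark: number): void {
  api.LMSSetValue("cmi.core.score.raw", String(finalScore));
  api.LMSSetValue(
    "cmi.core.lesson_status",
    finalScore >= passMark ? "passed" : "failed"
  );
  api.LMSCommit("");
}
```

If the status really were only set in that final step and that step never ran (for example, the learner closed the window or lost the connection before the last Continue click committed), the LMS would be left with a score but no completion status, which would look exactly like the symptom they describe.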

They are saying that this is the cause of some users failing to complete a course despite getting all the questions correct. My argument (for starters...) is that, as most of the users have perfectly normal experiences, it must be local/environmental rather than the LMS.

Thanks

Bruce

3 Replies
Christine Hendrickson

Hi Bruce,

I definitely agree - if this doesn't happen for every user accessing the course, it's more likely that the problem is user- or machine-specific. It could also have something to do with the way users are exiting the courses, and so on.

Have they tested the course outside of the LMS at all? If they're concerned that it's a problem with the LMS, this would definitely help confirm or clarify the issue. If the course doesn't report correctly outside of the LMS (say in SCORM Cloud, etc.), then the problem may be with the project. If it does report correctly, they may want to get more information from the users who are experiencing problems and try to determine a common thread between them.

Hope you're having a great day, Bruce!

Christine
