Forum Discussion

ValeriaVillarro
Community Member
9 months ago

Looking for more details on how to better understand and/or customize LMS reporting

We have several clients who are enjoying the customized courses we built. Each course provides a pretest with unique score tracking, so that if a learner reaches the pass points/percentage for a specific module's questions, they are allowed to test out of that module, even if they did not test out of the full course.
 
Pretest: scored out of 100% total and tracked for each quiz bank. If the learner hits 100%, they have tested out of the course and are done. If they do not… the results page does the math: if the learner reached 100% in any module, they have passed that module, and while they can view the material, they will not have to pass a subsequent post test.
 
For any module where the learner does not reach 100%, they will have to review that module's content and take and pass a post test at an 80% pass rate.
At the end, all the scores are added together, calculated, and reported to the LMS.
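
For anyone who wants the logic spelled out, here is a rough sketch of how the scoring works. The module structure, helper names, and the final-score math are illustrative assumptions, not our actual course variables; only the 100% test-out and 80% post test thresholds come from the design above.

```typescript
// Rough sketch of the pretest / post test logic described above.
// Names and the final-score formula are illustrative assumptions.

interface ModuleResult {
  name: string;
  pretestScore: number;   // percent, 0-100
  posttestScore?: number; // percent, 0-100; only taken if the pretest was not 100
}

const PRETEST_PASS = 100; // testing out requires a perfect module pretest
const POSTTEST_PASS = 80; // post test pass rate

// A module is passed either by testing out on the pretest
// or by passing the post test after reviewing the content.
function modulePassed(m: ModuleResult): boolean {
  if (m.pretestScore >= PRETEST_PASS) return true;
  return (m.posttestScore ?? 0) >= POSTTEST_PASS;
}

// Final score: the best score achieved in each module, averaged across
// modules (an assumed reading of "added together and calculated out"),
// reported to the LMS once, at the end.
function finalScore(modules: ModuleResult[]): number {
  const best = modules.map(m => Math.max(m.pretestScore, m.posttestScore ?? 0));
  return Math.round(best.reduce((a, b) => a + b, 0) / modules.length);
}

function courseComplete(modules: ModuleResult[]): boolean {
  return modules.every(modulePassed);
}
```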
 
Here is the problem we are trying to solve: making this function correctly no matter how a learner uses the course. To do that, we have to turn off reporting the results of the module post tests and only report the pretest (which is essentially the final results page reporting to the LMS, or so we thought).
 
As a result, our clients cannot track the interactions for any of the post test questions. They have thousands to tens of thousands of learners taking these courses, and they like to run stats on their questions.
 
To allow for interaction tracking, we can turn on tracking the post test in the LMS tracking menu. BUT if a learner plays loosey-goosey with the course and exits after passing any of the modules, the score of THAT module gets sent to the LMS, and it will report complete, passed, and 80 or 100% for that module alone. If we turn off reporting, we can't review questions. If we clear the results prematurely, learners can't review their results. If they follow the normal path, everything works fine as expected once they get back to the main menu. (It still reports passed, since they technically did pass their most recent quiz, but it is now incomplete with a failing grade; I can live with that.)
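
For anyone digging in, here is a rough sketch of the split we are fighting with, assuming a SCORM 1.2 package with direct access to the LMS API (the API typing and helper names are illustrative): question-level data lives under cmi.interactions, while the pass/fail rollup that gets committed on an early exit comes from cmi.core.lesson_status and cmi.core.score.raw.

```typescript
// Rough sketch: SCORM 1.2 keeps question-level data (cmi.interactions.*)
// separate from the rolled-up status and score (cmi.core.*).
// The API declaration below stands in for the usual window/parent API
// discovery; treat the shape as an assumption.

declare const API: {
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
};

// Record a single post test question result for stats,
// without touching lesson_status or the score.
function recordInteraction(index: number, questionId: string, correct: boolean): void {
  API.LMSSetValue(`cmi.interactions.${index}.id`, questionId);
  API.LMSSetValue(`cmi.interactions.${index}.type`, "choice");
  API.LMSSetValue(`cmi.interactions.${index}.result`, correct ? "correct" : "wrong");
}

// This is the part that bites on an early exit: whichever quiz ran last
// decides the status and score that get committed to the LMS.
function reportFinal(statusPassed: boolean, scorePercent: number): void {
  API.LMSSetValue("cmi.core.lesson_status", statusPassed ? "passed" : "incomplete");
  API.LMSSetValue("cmi.core.score.raw", String(scorePercent));
  API.LMSCommit("");
}
```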
 
The likelihood of a learner doing this is slim to none; it's an Easter egg that would take some work and serious luck to find… but the number of learners taking the course means it might happen, or an overactive LMS might automatically mark a learner complete and passed without them doing anything.
 
So we are looking to the awesome learning crew to ask if there is someone who specializes in, or is super knowledgeable about, LMS reporting and could advise.
  • JHauglie
    Community Member

    Your dilemma sounds like an excellent case study for migrating to an LMS that can manage everything from SCORM to xAPI.
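
    To make the xAPI side of that concrete, each question attempt can be sent to an LRS as its own statement, completely decoupled from any completion rollup. A rough sketch, where the endpoint, actor, and activity IDs are placeholders rather than anything from a real system:

    ```typescript
    // Rough sketch of an xAPI "answered" statement for one post test question.
    // The actor, activity IDs, and LRS endpoint are placeholder values.
    const statement = {
      actor: { mbox: "mailto:learner@example.com", name: "Example Learner" },
      verb: { id: "http://adlnet.gov/expapi/verbs/answered", display: { "en-US": "answered" } },
      object: {
        id: "https://example.com/courses/sample-course/module-2/question-7",
        definition: { type: "http://adlnet.gov/expapi/activities/cmi.interaction" }
      },
      result: { success: true, response: "choice-b" }
    };

    // Each question attempt becomes its own record in the LRS, so question
    // stats can be pulled without touching course-level completion status.
    async function sendStatement(lrsEndpoint: string, authHeader: string): Promise<void> {
      await fetch(`${lrsEndpoint}/statements`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "X-Experience-API-Version": "1.0.3",
          Authorization: authHeader
        },
        body: JSON.stringify(statement)
      });
    }
    ```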

    It helps a great deal that you already know which output measures you are seeking. While it's true that SCORM can provide that data, your LMS may not be processing the package(s) correctly because of its own inherent limitations or architecture. So I would suggest that you explore what the LMS can do - another opportunity for strong vendor relationships, perhaps, or maybe for replacing your current system.

    We did a major update and migration last year (to SuccessFactors) and have had some positive results achieved with tracking the data through different LMS quizzing features. Ideal? Nope. Good enough (reliable) to get some data that we can stand on? Yep. Are we able to explore the steps needed to get to using xAPI? We can, but "we just upgraded the LMS so we are not pursuing any further migration paths for the foreseeable future." Sigh.