Forum Discussion

JeanMarrapodi-c
Community Member
8 days ago

Tracking Test Outs vs. Completions (Workday Learning LMS)

Good day, brilliant friends.

My LMS administrator came to me with a question I can't answer. She would like to differentiate the people who take the pretest and test out of a course from the people who go through the entire course.

We use Workday Learning, and this is what they see. Complete doesn't tell them if they were a test out. Any suggestions?

Clicking their name lets me drill down to their answers on the individual questions, but it still doesn't show me that they tested out.

This is how I have it set up in Storyline.

Is there something I can do with a completion trigger to identify who tested out?

 

  • This article lists the data sent to an LMS: https://access.articulate.com/support/article/quiz-data-sent-to-an-lms-in-articulate-storyline 

    How you pull the data depends on the LMS. 

    If you can see the question data, you'd need to differentiate the pre-check questions from the final-assessment ones. For example, use different question text and/or responses. Of course, that's not doable if both quizzes draw from the same question bank.

    Another option would be to add a disguised Short Answer Survey question to each quiz, and include that in the Results. Use a trigger to add "pre-check" or "final" to the answer field, as appropriate. That info should then be submitted to the LMS with the data from the graded questions. Note that you'll have to switch the Submit trigger so it runs with the Next button instead of the Submit button. 

    This post is about a different issue. But it does have a file that demos using a disguised question slide, so it might be useful: TIP: Limit How Many Selections a User Can Make on a Multiple-Response (Pick-Many) Question | Articulate - Community 
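
    For what it's worth, the trigger that stamps the hidden survey answer could also be done with an Execute JavaScript trigger. Here's a minimal sketch, assuming a text-entry variable named QuizContext backs the disguised Short Answer question (the variable name is hypothetical; GetPlayer() is the player object Storyline exposes to JavaScript at runtime):

    ```javascript
    // Sketch: stamp the hidden survey answer so the LMS report
    // shows which quiz the learner was in.
    // "QuizContext" is a hypothetical text-entry variable backing
    // the disguised Short Answer question.
    function tagQuizContext(player, phase) {
      if (phase !== "pre-check" && phase !== "final") {
        throw new Error("Unknown quiz phase: " + phase);
      }
      player.SetVar("QuizContext", phase);
      return player.GetVar("QuizContext");
    }

    // In an Execute JavaScript trigger on the quiz's first slide:
    // tagQuizContext(GetPlayer(), "pre-check");
    ```

    The same thing can be done without JavaScript, of course, by using a plain "Set variable to value" trigger.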

    • JeanMarrapodi-c
      Community Member

      Interesting. Thanks JudyNollet. I have two different question banks (the final is just a copy of the pretest), but if I analyze the questions, I can't tell whether they retook the test after the pretest. I wonder if there's a way I could add an invisible P for pretest or F for final to the questions via a trigger or variable that would show in the results sent to the LMS. 

      • JudyNollet
        Super Hero

        You could add "invisible" text (e.g., a letter the same color as the background). However, that's bad for accessibility, because it would still be seen/read by screen readers.

  • JoeFrancis
    Community Member

    It's been a long time since I did it, and it was in a different LMS (SumTotal) than what I'm using now (Saba), but I seem to remember setting up a course with 2 learning objects, the pre-assessment and the course (which incorporated the post-assessment). Successful completion of the pre-assessment LO would "roll up" to the course object and set it to "Passed/Complete" with a score of 100%. A score of less than 100% would lock the pre-assessment from being re-entered ("one and done") and automatically launch the learner into the course.

      • JoeFrancis
        Community Member

        The pre-assessment/"test out" is a standalone Learning Object--created in Storyline or QuizMaker--and the Storyline course is also a standalone LO. The LOs are attached to a single overarching course in the LMS as individual objects, configured to be presented sequentially.

        When the learner launches the course, the first LO in the order--the test out--launches. If the learner successfully passes--e.g., scoring 100%--the LMS course is set to Passed, the learner's transcript is updated to Completed/Passed, and he/she is placed back on the LMS course description page. At that point, no further action by the learner is required.

        If the learner does NOT pass the test-out, then the next LO in the sequence--the Storyline course--is launched.
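
        In case it helps to see it spelled out, the sequencing rule above reduces to a simple branch. This is just an illustrative sketch (the status strings and action names are assumptions; the real behavior is configured in the LMS, not coded):

        ```javascript
        // Illustrative sketch of the LMS sequencing rule described above:
        // a perfect score on the test-out rolls up to the course object;
        // anything less locks the test-out and launches the course content.
        function routeAfterTestOut(score, passingScore) {
          if (score >= passingScore) {
            return {
              courseStatus: "Passed/Complete", // rolled up to the course object
              courseScore: 100,
              nextAction: "return-to-course-description-page"
            };
          }
          return {
            courseStatus: "In Progress",
            lockTestOut: true, // "one and done": no re-entry to the pre-assessment
            nextAction: "launch-course-LO"
          };
        }
        ```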