Pre-assessment trackable by LMS in a branched course?
Hi everyone.
Here's our plan: We want to create a chain of short microlearning courses that each cover one specific topic of our company strategy. However, learners only have to take a course if they don't pass the pre-assessment for that topic. Here's a visualization of the idea:
Every time a learner fails an entry test, they're taken to the corresponding microlearning. If they pass it, however, they move on to the questions for the next topic.
Our Learning Management System is SAP Success Map Learning. We need Storyline 360 to report all pre-assessment results back to the LMS so we can track both the learners who pass the entry tests and those who end up having to take the microlearnings.
Does anyone have experience with this? Is what we're trying to do even possible?
It will be difficult. Although Storyline has different methods of tracking, and more than one can be used, it always reports the first one that is reached. You may be able to achieve this, but I think it will take a lot of trial and error.
The attached file demonstrates a method that I think will work as desired.
Insert a Results slide after each pretest with the appropriate passing score. Indicate in the Quiz settings that it is a Knowledge Check.
For a Knowledge Check, the program will show the results to the learner. However, it won't submit them to the LMS, so passing this test won't be tracked as completing the course.
Program the pretest Results slide so the learner has to go through the "microlearning" if they fail. Here's a simple way to do that:
In Slide Properties, remove the Player's Next button.
On the Failure layer, add a button that jumps to the microlearning (in my demo, it's the next slide).
On the Success layer, add a button that jumps to the next pretest (in my demo, it's the next scene).
Another option would be to edit the Next trigger with a condition that checks whether the learner passed: jump to the next slide (the microlearning) if they didn't pass; otherwise, jump to the next pretest.
After the final microlearning, insert a Results slide that scores the questions in all of the pretests. Indicate that it is the Final Assessment, and set the passing score to 0% so the course is marked as complete even if the learner fails the pretests.
My demo only has 2 sections, but the method should work for any number of sections.
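If you also need each pretest's own score in the LMS record (since a Knowledge Check isn't submitted), one possible workaround is an Execute JavaScript trigger on each pretest's Results slide that stashes the score in the SCORM 1.2 cmi.comments field. This is only a sketch, untested against Success Map Learning; the findAPI helper, the "Pretest1" label, and the Results.ScorePercent variable name are assumptions to adapt to your file:

```javascript
// Sketch only: run from an "Execute JavaScript" trigger on each
// pretest's Results slide. Assumes SCORM 1.2 publishing; names are
// placeholders.

// Standard SCORM 1.2 API lookup: walk up the frame hierarchy.
function findAPI(win) {
  var tries = 0;
  while (!win.API && win.parent && win.parent !== win && tries < 10) {
    win = win.parent;
    tries++;
  }
  return win.API || null;
}

var api = findAPI(window);
var player = GetPlayer(); // Storyline's built-in JavaScript hook

// "Results.ScorePercent" is the built-in variable for the first Results
// slide; later Results slides get names like Results1, Results2, etc.
var score = player.GetVar("Results.ScorePercent");

if (api) {
  // Append this pretest's score to cmi.comments so it lands in the LMS
  // record even though the Knowledge Check itself isn't reported.
  var existing = api.LMSGetValue("cmi.comments") || "";
  api.LMSSetValue("cmi.comments", existing + "Pretest1=" + score + ";");
  api.LMSCommit("");
}
```

Whether that comments field actually shows up in reporting depends entirely on the LMS, so test it with your LMS admin first.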
Another option:
Don't use standard Results slides for the pretests. Instead, use one or more variables to track whether the user achieved a passing score.
For example, if passing requires 100%, use a T/F variable to track whether they have to take the microlearning (that is, track whether any question is answered incorrectly). Have a trigger on each question that changes that variable to True if the answer is wrong.
That could happen when the Incorrect layer's timeline starts if you're showing feedback.
Otherwise, add a trigger to the Submit button (before the trigger that submits the question) with a condition so it only changes the variable if the state of the correct answer choice is not Selected.
Then use a final Results slide as described above.
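If it helps, here's how that variable flip looks from an Execute JavaScript trigger using Storyline's GetPlayer API. A minimal sketch; "NeedsModule1" is a made-up variable name:

```javascript
// Sketch: the same True/False tracking variable, flipped from an
// "Execute JavaScript" trigger instead of a regular trigger.
var player = GetPlayer();

// Fire this when the Incorrect layer's timeline starts (or before the
// Submit trigger, under the same condition the trigger would use).
player.SetVar("NeedsModule1", true);

// Reading it back later, e.g., on the pretest's Results slide:
if (player.GetVar("NeedsModule1")) {
  console.log("Learner needs to take microlearning 1");
}
```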
JudyNollet, THANK YOU SO MUCH! I can't really grasp your second suggested solution, but the demo works great. The only downside is that there will ultimately be only ONE completion score, meaning learners have to go through ALL the branching before getting to the final 'completed' slide, correct? Management is once again asking about tracking PRE-SCORES... Is there no way to make that happen? Thank you again
I set up the demo based on my understanding of what you wanted.
The learners must take each pretest before they get to the final Results/completion slide.
If they pass a pretest, they can jump directly to the next pretest. If not, they must go through the associated content before they get to the next pretest.
The final Results slide will score and submit the questions in all of the pretests.
Yes, the data would be submitted as if it were one big test. If you only look at the score, there's no way to tell what the learner got on each pretest. Well, unless they scored 0% (none correct) or 100% (all correct).
However, SL submits info about each question. That data could be reviewed to determine the results for each pretest.
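For reference, this is roughly the shape of that per-question data under SCORM 1.2. A sketch only: Storyline writes these interaction records itself when a Results slide submits, and the IDs and values below are invented. If each pretest's questions end up with a recognizable ID prefix, a question-level report can be grouped back into per-pretest results:

```javascript
// Illustration of the per-question data a SCORM 1.2 course reports,
// using the standard cmi.interactions fields. Storyline sends these
// automatically; this just shows what the LMS receives.
function findAPI(win) {
  var tries = 0;
  while (!win.API && win.parent && win.parent !== win && tries < 10) {
    win = win.parent;
    tries++;
  }
  return win.API || null;
}

var api = findAPI(window);
if (api) {
  api.LMSSetValue("cmi.interactions.0.id", "Pretest1_Q1"); // invented ID
  api.LMSSetValue("cmi.interactions.0.type", "choice");
  api.LMSSetValue("cmi.interactions.0.student_response", "b");
  api.LMSSetValue("cmi.interactions.0.result", "wrong"); // or "correct"
  api.LMSCommit("");
}
```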
We use another SAP Success LMS (SuccessFactors), but Phil is correct: this will be difficult. We were able to accomplish a similar effect by creating a test-out option for some courses, but each course has its own test-out element. The LMS does have some quizzing options, but these function as independent elements, so the learner would complete a Storyline module and then an LMS quiz. What you want to do is not going to be easy unless you create a series of "go/no-go" exercises that are not reported to the LMS but serve to control what the learner sees in Storyline. And even then, only one "score" would be reported to the LMS.
If your LMS supports question-level reporting, you can use each mini-assessment result to answer a "ghost question" that the user never sees. You can then use this ghost-question assessment to report the results to the LMS. Pulling a report on "questions missed" will essentially be the same as pulling a report on mini-assessments failed.
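The technique itself uses a hidden question slide that is answered by triggers on the learner's behalf, but expressed as raw SCORM 1.2 calls from an Execute JavaScript trigger, the idea looks roughly like this (a sketch; "Pretest1Passed" and the ghost ID are placeholders, and the hidden-slide approach is usually safer than hand-written calls):

```javascript
// Sketch of the ghost-question idea as raw SCORM 1.2 calls: one
// synthetic interaction per pretest, answered by the pretest outcome.
function findAPI(win) {
  var tries = 0;
  while (!win.API && win.parent && win.parent !== win && tries < 10) {
    win = win.parent;
    tries++;
  }
  return win.API || null;
}

var api = findAPI(window);
var passed = GetPlayer().GetVar("Pretest1Passed"); // hypothetical T/F variable

if (api) {
  // Use the next free index so we don't overwrite the interactions
  // Storyline records for the real questions.
  var n = parseInt(api.LMSGetValue("cmi.interactions._count"), 10) || 0;
  api.LMSSetValue("cmi.interactions." + n + ".id", "GHOST_Pretest1");
  api.LMSSetValue("cmi.interactions." + n + ".type", "true-false");
  api.LMSSetValue("cmi.interactions." + n + ".result",
                  passed ? "correct" : "wrong");
  api.LMSCommit("");
}
```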
I wrote about ghost assessments a LONG time ago, and I'm not sure if the post is still around in the new community, but reach out to me if you can't find it and I'll be happy to walk you through the process.
I noticed that you've opened a support case as well. It looks like my colleague Janina is handling your case and replied to your e-mail with additional insight. If you have questions, feel free to reconnect with Janina through your case!