Forum Discussion
Is there a way to score each individual match of a drag and drop?
I am using the matching drag-and-drop question, and there are four terms and definitions. In the "Edit Matching Drag and Drop" settings I see "score by question," but I assume that is the total: they would get 10 points if everything is correct and 0 if they get even one match wrong? Is there any way to give them partial credit, like if they get 2 out of the 4 matches correct?
Thank you!!
38 Replies
- KevinThorn (Super Hero)
Thanks for the code share, Shailesh! Gonna go try that this weekend!
- BeatrizGimeno (Community Member)
Hi Shailesh,
Could you please help me implement your workaround for lms/SCORMFunctions.js?
I honestly have no idea what to do with that code.
Thanks a lot,
Beatriz
- ShaileshMewada (Community Member)
Storyline doesn't allow scoring the individual answers.
In our case, all of the questions were drag-and-drop questions, so we created a custom score variable and then tracked that variable using the technique mentioned above.
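Shailesh's actual SCORMFunctions.js code isn't reproduced in this thread, but the general technique of pushing a custom Storyline score variable into the SCORM score fields might look roughly like the hypothetical sketch below. `GetPlayer()`/`GetVar` and `LMSSetValue` are real Storyline and SCORM 1.2 APIs, but here they are mocked so the sketch runs standalone; the variable name `ScoreVariable` and the point values are illustrative assumptions.

```javascript
// Mock of the Storyline player (in a published course, GetPlayer() already exists).
function GetPlayer() {
  return { GetVar: function (name) { return name === "ScoreVariable" ? 6 : null; } };
}

// Mock of the SCORM 1.2 API object that the LMS normally provides.
var API = {
  data: {},
  LMSSetValue: function (key, value) { this.data[key] = String(value); return "true"; },
  LMSCommit: function () { return "true"; }
};

// The workaround itself: read the custom Storyline variable and report it
// to the LMS as the raw score, bypassing Results.ScorePoints entirely.
function reportCustomScore() {
  var raw = GetPlayer().GetVar("ScoreVariable"); // e.g. 2 points per correct match
  API.LMSSetValue("cmi.core.score.raw", raw);
  API.LMSSetValue("cmi.core.score.max", 8);      // 4 matches x 2 points (illustrative)
  API.LMSCommit("");
  return raw;
}

var reported = reportCustomScore(); // here: 3 of 4 matches correct => 6 of 8
```

In a published course the same idea would live in (or be called from) the SCORM output files, where the real `API` object is available.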
Regards,
Shailesh
- SijX (Community Member)
I realise it has been a while since Kevin and Rebecca tried to figure this out but I am looking for some new answers to basically the same question.
I would like to have a question with multiple correct answers and would like to award points for each correct answer selected. I would also like these points to be added to Results.ScorePoints and be passed on to the LMS.
So, my questions are:
- How can I give points for each correct answer AND have those points added to Results.ScorePoints?
- Which variable does an LMS use to track quiz results? Is it Results.ScorePoints or something else?
- KevinThorn (Super Hero)
Hi Sij
As mentioned earlier, Storyline's Results.ScorePoints only tracks results at the level of a question's "weighted" points (or a bank of questions), not individual choices within a question. Additionally, since Results.ScorePoints is a system variable, the current version of Storyline doesn't let us manipulate the data it collects.
While you can certainly design a custom interaction/question with multiple weighted choices and per-choice feedback, the interaction itself would be part of a knowledge check and couldn't be tied to the course's end results.
I've not implemented or tried Shailesh's solution yet, but your answer lies in the SCORM output, not in Storyline itself.
- SijX (Community Member)
Thanks Kevin. I completely understood what you said. Just one thing: I've built a lot of courses but haven't handled deployment much, so SCORM and LMS stuff isn't exactly my cup of tea.
Could you please explain a little more about what you meant by "your answer lies in the SCORM output"? That sounded a bit like Shifu and Yoda.
- WilliamRLandry (Community Member)
Hm. I used to use a little program that could score partial credit on virtually any question type. The program is decent, but its interaction and media capabilities are extremely limited and complex, and the output files are horrible on any mobile browser. That's why I bought Storyline in the first place, but now that I know the program quite well, I'm becoming frustrated with its limitations regarding partial credit and question-level feedback. I still have the other program (I won't name it out of respect for the Articulate guys), which publishes in both Flash and HTML5, and I'm wondering if there might be a way to embed it in Storyline as a web object and still have it track to the LMS. Probably not... that would be too easy. As far as feature requests go, partial-credit scoring and individual question-level feedback would be invaluable! I'd like to know if Studio 13 has these functions.
- MikeTaylor (Super Hero)
Prompted by the same question from someone else, I just had a quick go at this and came up with one way to make it work.
Basically, I created an MC question, which can easily be scored by answer. Then I used a variable to keep track of how many correct items are dropped on the target. Finally, I added some slide triggers to select the appropriate answer from the MC question objects.
As you can see, I placed an image over the top of the MC question objects so they can't be seen or interacted with.
This allows you to report the correct scoring via the results page without any scripting.
Also note that I haven't included any logic to account for removing items after they've been dropped on the target, or other similar "error handling".
I've attached my quick & dirty example.
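In rough code terms, the tally-then-map idea Mike describes could be modeled like this (a hypothetical sketch only: `DropCount`, `onCorrectDrop`, and `selectChoice` are illustrative names, not Storyline trigger syntax, and the choice-letter mapping is an assumption):

```javascript
// Counter variable: incremented by a trigger each time an item
// lands on its correct drop target.
var DropCount = 0;

// Trigger: "Add 1 to DropCount when the user drops an item on its correct target."
function onCorrectDrop() { DropCount += 1; }

// Slide trigger: map the tally to one of the hidden MC choices, e.g.
// choice "A" = all 4 correct, "B" = 3 correct, and so on, so the
// built-in results page scores the attempt without any scripting.
function selectChoice(count) {
  var choices = { 4: "A", 3: "B", 2: "C", 1: "D", 0: "E" };
  return choices[count];
}

onCorrectDrop();
onCorrectDrop();
onCorrectDrop();
// Three correct drops => the trigger would select hidden choice "B".
```

Each hidden MC choice then carries its own point value in the question editor, which is what produces the partial credit.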
- KristinAugusta (Community Member)
Thank you, Mike! I can't wait to play around with this!!
- RebeccaFleischC (Community Member)
Tx, Mike.
I've attached a revised version that does account for Learners dragging items off (and back onto) the target. I adjusted a few things to make it easier visually, probably just for me. I'm easily confused.
I placed a transparent shape (left a thin outline so community can see it) where the draggable words are (labeled transparent on the timeline). Design-wise, it would be cool to have a "tray" here where Learners could draw/replace the words. But it's just a quick mock-up.
Then I added True/False variables that can be used as conditions for scoring the 4 words (fun, cool, groovy, nice), depending on where they're dragged. Each defaults to False. For example:
Add 2 to ScoreVariable when the user drops cool on the Target Rectangle if CoolVariable = False.
Adjust CoolVariable = True when the user drops cool on the Target Rectangle.
Subtract 2 from ScoreVariable when the user drops cool on the Transparent Rectangle if CoolVariable = True.
Adjust CoolVariable = False when the user drops cool on the Transparent Rectangle.
This has also been done for fun, groovy, and nice.
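Those triggers for one word can be modeled as plain code to show why the True/False flag prevents double-scoring when a word is dragged on and off repeatedly (an illustrative sketch: the function names are hypothetical, while the variable names mirror the trigger descriptions above):

```javascript
var ScoreVariable = 0;
var CoolVariable = false; // default = False: "cool" not yet on the target

// "Add 2 to ScoreVariable ... on the Target Rectangle if CoolVariable = False",
// then "Adjust CoolVariable = True".
function dropOnTarget() {
  if (!CoolVariable) { ScoreVariable += 2; }
  CoolVariable = true;
}

// "Subtract 2 from ScoreVariable ... on the Transparent Rectangle if
// CoolVariable = True", then "Adjust CoolVariable = False".
function dropOnTransparent() {
  if (CoolVariable) { ScoreVariable -= 2; }
  CoolVariable = false;
}

// Dragging on, off, and back on scores the word exactly once.
dropOnTarget();      // ScoreVariable: 2
dropOnTransparent(); // ScoreVariable: 0
dropOnTarget();      // ScoreVariable: 2
```

The same pair of triggers is repeated for each of the other three words, each with its own flag variable.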
I placed references to the variables at the top of the slide so community can see what's happening.
Questions: Would this be the only way to accommodate users changing their minds? Is there a better way? And, when you mention "other error handling," what else are you foreseeing that could go wrong?
Comment: This is a LOT of manipulation by you, and then by me, to get to partial scoring!!!