Is there a way to score each individual match of a drag and drop?

I am using the matching drag and drop, and there are four terms and definitions. In "Edit Matching Drag and Drop" I see score by question, but I assume that is the total, so they would get 10 points if all are correct and 0 if they get even one wrong? Is there any way to give them partial credit, like if they get 2 out of the 4 matches correct?

Thank you!! 

30 Replies
Kevin Thorn

Hi Kristen,

You're correct that Storyline's default drag and drop interactions are limited to some degree.

There is a way to handle what you're wanting to do with a little variable finesse.

Start with a blank slide and build all your assets (objects) first. Assign which ones will be drag items and which will be their drop targets by naming them in the Timeline, e.g., Drag 1, Drag 2, etc., and Drop 1, Drop 2, etc. Naming them in the Timeline will help later when you set up the interaction and assign variables.

Next, create a set of numeric variables for the Drag items with the same names: Drag1, Drag2, etc. Set their initial values to 0. Finally, create one last variable for Total with an initial value of 0.

Now, convert your slide to Freeform Drag and Drop. The default dialog box will appear and simply set which Drag item belongs to which Drop target. Make any other adjustments you need and Save and Close.

Now the fun part. For each drag item that gets dropped on a "correct" target add a trigger: Adjust Variable [Drag1] to =Assignment 1.00 when the user Drops Object On [Drop1]. Do the same for each Drag item.

Finally, on each of the feedback layers add your variable text box for Total - %Total%. Add triggers for when the Timeline starts on that layer: Adjust Variable "Total" +Add "Drag1" when Timeline Starts. Do the same for each Drag variable so they all roll up into Total.
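The variable math in the steps above is simple enough to sketch in plain JavaScript (the names and the points-per-match value here are hypothetical; Storyline's triggers do all of this without any scripting):

```javascript
// Sketch of the Total calculation above: each flag mirrors a Drag1..Drag4
// numeric variable, set to 1 when its item lands on the correct target.
// pointsPerMatch is a hypothetical weight, e.g. 2.5 so four matches total 10.
function partialScore(dragFlags, pointsPerMatch) {
  return dragFlags.reduce(function (total, flag) {
    return total + flag * pointsPerMatch;
  }, 0);
}
```

For example, `partialScore([1, 0, 1, 1], 2.5)` gives 7.5 — partial credit for three of four matches instead of the all-or-nothing score.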

Okay, I may have missed something just firing from thought, but I believe the above structure will get you going in the right direction. There is so much more you can build off that structure, too, like visual check marks and red Xs to show the actual correct and incorrect choices. Just use the same variables to change the states of other objects.

Hope that helps!

Rebecca Fleisch Cordeiro

Hi Kevin and Kristen,

Tx for the question, and the answer.

Directions were spot on, Kevin. I'm attaching a sample here in case Kristen or anyone else wants to take a look. Also, though, I have a question:

The total points are not, by default, picked up by the Results Slide. I know how to "get them there," and I have, for the attached (in red on the Results Slide).

But, would they be picked up by an LMS?

And what about the percent score? More triggers and variables, eh?

Kevin Thorn

Hey Rebecca,

Your example didn't alter the results; rather, you added another level of detail to the existing scoring mechanism. If you run your example and get them all correct, the results slide shows your 10 points along with the breakdown of each choice - plus the default Results.ScorePoints.

Results.ScorePoints is a default variable that's added to your variable library when you insert a Results slide. If you were to add a second Results slide to your example, another variable would be created as Results1.ScorePoints.

When you publish to LMS or Articulate Online, and set the Scoring and Tracking option to track against a Results slide rather than a minimum number of slides viewed, SCORM picks up these variables and sends them to the LMS:

- Results.ScorePoints

- Results.ScorePercent

- Results.PassPoints

- Results.PassPercent

All of the above are dependent on whether the LMS accepts them, however. In short, no, you don't have to do anything additional for that example to work on an LMS. I say that, but as sure as I just wrote it, it won't work somewhere. Pretty sure it's fine though.

What you *could* do is use that default variable and tie it in with your other scoring. First, go into the edit mode of the interaction and make sure there are no weighted points associated with Correct or Incorrect.

Then, use the same triggers and logic you used to add your choice points to "Total," but instead add them to the existing "Results.ScorePoints" variable. Example, on your Correct feedback layer: "Add Almond to Results.ScorePoints when Timeline starts."

Rebecca Fleisch Cordeiro

Hi Kevin,

Thanks for responding. I'm hoping you can explain/correct what I'm doing wrong.

Where I Am

I understand that I didn't alter the original, and I'm aware that the Results slide "by default" has these built-in variables:
Results.PassPercent, Results.PassPoints, Results.ScorePercent,
Results.ScorePoints

And I saw that if I got all 4 parts correct, I received the total score of 100%/10 points from the built-in variable, and if I got even one of the 4 wrong I rec'd a total score of 0%.

Where I Want to Be

But I'd want the one correct, so 2.5 points, handed off to the LMS. And I didn't think the Total that I'd created would be handed off. You've suggested

<<go into the edit mode of the interaction and make sure there are no weighted points associated with Correct or Incorrect.>>

Do you mean Edit the Quiz so there are no points listed for correct and incorrect?

And: <<do the same triggers and logic to add your choice points to "Total" but instead add them to the existing "Results.ScorePoints" variable. Example on your Correct feedback layer: "Add Almond to Results.ScorePoints when Timeline starts.">>

I don't seem to be able to do this. When I click the Adjust variable drop-down, Results.ScorePoints doesn't display there. I can do the opposite, I can Add Results.ScorePoints to Almonds, but that's not right!

Can you tell me how I'd do this? TIA!

Kevin Thorn

Ah, I see.

Shooting from the hip I realized my original solution is not a solution at all. You can assign Results.ScorePoints to Total but not the other way around. The default variables are not in the list to "Adjust."

As for passing the points to the LMS, therein lies the problem, because the interaction only scores per question, not per choice. For direct user feedback showing which choices were correct and/or incorrect, this will work. Overall, though, the points scored are for the whole question/interaction.

Since there's no current way to alter Results.ScorePoints from within Storyline, I'm afraid you're stuck with the weighted amount you assign per question. It is a great feature request, though, as it would open up a lot more options in how to pass a score to an LMS.

From an instructional design approach you could send the learners down a different path if they got it incorrect. Exact same interaction but with a lesser weighted score. Then converge back to the same Results slide. Correct = 20 points. Incorrect = 10 points. Or something like that.

If there are any SCORM gurus here who know how to manipulate the output files in conjunction with JavaScript, that *might* be an alternative option.
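For anyone exploring that route, here is a minimal sketch of what an "Execute JavaScript" trigger body might look like. Note the `SetScore` call and the `parent` lookup are assumptions based on how the SCORM driver of that era was typically wired up — verify the actual names against your own published output before relying on this:

```javascript
// Hypothetical sketch: push a partial score to the LMS from an
// Execute JavaScript trigger. SetScore and the host/player objects
// are assumptions; check the names in your own SCORM output files.
function reportPartialScore(rawPoints, maxPoints) {
  // SCORM 1.2 expects a 0-100 score, so normalize the raw points.
  var pct = Math.round((rawPoints / maxPoints) * 100);
  var host = (typeof parent !== "undefined") ? parent : null;
  if (host && typeof host.SetScore === "function") {
    host.SetScore(pct, 100, 0); // would end up in cmi.core.score.raw
  }
  return pct;
}
```

With the example from this thread, one correct match out of four on a 10-point question would call `reportPartialScore(2.5, 10)` and report 25.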

Sorry I took you down a dead-end journey.

Rebecca Fleisch Cordeiro

Hi Kevin,

PLEASE don't be sorry. I'm SO glad that I totally get it and wasn't wrong or confused at the outset. Yay, me :-)

And thanks for taking the time to reply, and in such detail, thinking it through with me, so to speak, rather than just saying, "oops, I was wrong - sorry." That would have left me hangin' a bit. This way I totally get that I totally get it.

It may be that all Kristen wanted was a way to provide direct Learner feedback and not pass it through to the LMS. But I know when I first got involved with Storyline, I wasn't aware of some of these things. So, I thought I'd ask.

Thanks again, Kevin. Have a nice evening.

Shailesh Mewada

Hi,

I have the same requirement: the scoring should be based on the correct responses, not on the entire drag-and-drop question.

I have implemented the following workaround and it is working fine.

In the story.html file, just after the existing ResizeBrowser call, add a line that grabs the player object:

ResizeBrowser(g_strBrowserSize);
var player = GetPlayer();   // <-- added line

In lms/SCORMFunctions.js (SCORM 1.2), at the top of SCORM_SetScore, overwrite the incoming score with a custom "MyScore" variable read from the player:

function SCORM_SetScore(intScore, intMaxScore, intMinScore){
    intScore = parent.scormdriver_content.player.GetVar("MyScore");   // <-- added line
    // ...rest of the original function unchanged

thanks,
Shailesh

Sij X

I realise it has been a while since Kevin and Rebecca tried to figure this out but I am looking for some new answers to basically the same question.

I would like to have a question with multiple correct answers and would like to award points for each correct answer selected. I would also like these points to be added to Results.ScorePoints and be passed on to the LMS.

So, my questions are:

 - How can I give points for each correct answer AND have these points added to Results.ScorePoints
 - Which variable is used in an LMS to track quiz results? Is it Results.ScorePoints or something else?

Kevin Thorn

Hi Sij

As mentioned earlier, Storyline's Results.ScorePoints only tracks the results on a question's 'weighted' points or a bank of questions and not by individual choices within a question. Additionally, since Results.ScorePoints is a system variable the current version of Storyline doesn't allow us to manipulate the data it collects.

While you can certainly design a custom interaction/question with multiple weighted choices and multiple feedback based on choice, the interaction itself would be part of a knowledge check and could not be tied to the course's end results.

I've not implemented or tried Shailesh's solution yet, but your answer lies in the SCORM output and not Storyline itself.

Sij X

Thanks, Kevin. I completely understood what you said. Just one thing: I've built a lot of courses but haven't handled deployment much, so SCORM and LMS isn't exactly my cup of tea.

Could you please explain a little more of what you meant by "your answer lies in the SCORM output"? That sounded a bit like Shifu and Yoda.

William R. Landry

Hm. I used to use a little program that could score partial credit on virtually any question type. The program is decent, but its interaction and media capabilities are extremely limited, and the output files are horrible on mobile browsers. That's why I bought Storyline in the first place, but now that I know the program quite well, I'm becoming frustrated with its limitations regarding partial credit and question-level feedback. I still have the other program (won't name it out of respect for the Articulate guys), which publishes in both Flash and HTML5, and I'm wondering if there might be a way to embed it in Storyline as a web object and still have it track to the LMS. Probably not... that would be too easy. As far as feature requests go, partial credit scoring and individual question-level feedback would be invaluable! I'd like to know if Studio '13 has these functions.

Mike Taylor

Prompted by this same question by someone else I just had a quick go at this and came up with one way to make this work. 

Basically, I created a MC question, which can easily be scored by answer. Then I used a variable to keep track of how many correct items are dropped on the target. Finally, I added some slide triggers to select the appropriate answer from the MC question objects.

As you can see, I used an image over top of the MC question objects so they can't be seen or interacted with.

This allows you to report the correct scoring via the results page without any scripting, etc.

Also note that I haven't included any logic to account for removing items after they've been dropped on the target, or other similar "error handling."

I've attached my quick & dirty example. 

Rebecca Fleisch Cordeiro

Tx, Mike.

I've attached a revised version that does account for Learners dragging items off (and back onto) the target. I adjusted a few things to make it easier visually, probably just for me. I'm easily confused.

I placed a transparent shape (left a thin outline so the community can see it) where the draggable words are (labeled "transparent" on the timeline). Design-wise, it would be cool to have a "tray" here where Learners could drag/replace the words. But it's just a quick mock-up.

Then I added T/F variables that can be used as conditions for the scoring of the 4 words (fun, cool, groovy, nice), depending on where they're dragged: default = False. For example,

Add 2 to ScoreVariable when user drops cool on the Target Rectangle if CoolVariable=False.

Adjust CoolVariable=True when user drops cool on the Target Rectangle.

Subtract 2 from ScoreVariable when user drops cool on the Transparent Rectangle if CoolVariable=True.

Adjust CoolVariable=False when user drops cool on the Transparent Rectangle.

This has also been done for fun, groovy, and nice.
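The add/subtract guard in those triggers can be sketched as plain logic (hypothetical names — the Storyline triggers above do this without any code, but the sketch may help show why the T/F flag prevents double-counting):

```javascript
// Sketch of the guard logic above for one draggable item.
// onTarget mirrors a T/F variable like CoolVariable (default False),
// so points are only added or removed once per state change.
function makeItemScorer(points) {
  var onTarget = false;
  return {
    dropOnTarget: function (score) {
      if (!onTarget) { score += points; onTarget = true; }  // add once
      return score;
    },
    dropOffTarget: function (score) {
      if (onTarget) { score -= points; onTarget = false; }  // subtract once
      return score;
    }
  };
}
```

Dropping "cool" on the target adds its 2 points once; dragging it back off subtracts them once, no matter how many times the drop events repeat.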

I placed references to the variables at the top of the slide so community can see what's happening.

Questions: Would this be the only way to accommodate users changing their minds? Is there a better way? And, when you mention "other error handling," what else are you foreseeing that could go wrong?

Comment: This is a LOT of manipulation by you, and then by me, to get to partial scoring!!!

Silvia Pernsteiner

TinCan tracking of PARTIAL quiz scores?

Now here comes the really exciting part of this discussion topic:

What if one wants to track not only the total quiz score but a learner's performance per quiz question, with PARTIAL grading for Multiple Response and Drag-and-Drop questions (count "some correct" when a user submits 3 out of 6 correct responses for a single quiz question)?

Set-up = Storyline 1 with TinCan v0.9

The partial scores of each quiz question have to be published for the LMS and sent to SCORM Cloud as the LRS.

Any work-arounds, anyone? Thank you!!

Rebekah Massmann

This was an amazing thread to find -- I think (hope) I've got my prototype working according to the final recommendation and download from Rebecca. Thank you!

However, I've run into a bit of an aesthetic/clarity issue now due to losing control over drag/drop snap to target settings. If anyone feels like taking a stab at a solution, I started a thread here:

https://community.articulate.com/discussions/articulate-storyline/modify-shape-of-drop-target-or-where-drag-item-snaps-not-in-an-interaction

Thanks!


Carmen Akerlund

Hi Rebecca, I've followed your demo exactly for a drag and drop interaction that I want to get individual scores for. But when I click Submit I only get the "incorrect" flag; it doesn't show the individual scores like it does in your demo... Is there something I have to do to make that show up that's not specifically mentioned in your or Kevin's explanation?