Where can I see latency results for quiz questions?

Mar 28, 2014

Hi there.  I've been using Articulate Storyline and have enjoyed the ease of use more than Captivate.  Hooray!  We use Moodle 2.2 and 2.3 for the two websites for which we author modules.

It turns out, though, that latency (time spent per question) is not reported, and we really would like that result.  All other items are reported, including the cmi.interactions.n.time value, which gives us the clock time at which the attempt was initiated, but that doesn't always correspond with latency.
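For clarity, here is roughly the difference between the two data model elements as I understand SCORM 1.2 (illustrative values only, not actual Storyline output):

    // 'time' is a CMITime clock stamp; 'latency' is a CMITimespan duration.
    API.LMSSetValue('cmi.interactions.8.time', '14:23:46');         // clock time of the attempt
    API.LMSSetValue('cmi.interactions.8.latency', '0000:01:12.50'); // time the learner actually spent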

I found this previous thread that ended in 2013: http://community.articulate.com/forums/p/30392/176549.aspx .  It's now March 2014.  My Captivate modules do great in Moodle for latency, but there are no latency reports for my Storyline modules.  Has there been progress on this front?  It would be much appreciated.  

-Todd Chang

Children's Hospital Los Angeles

5 Replies
Ashley Terwilliger-Pollard

Hi Todd and welcome to Heroes! 

I'm happy to hear you're enjoying Storyline. I don't have a further update to share on this issue as it's still with our QA team, but I'll add this thread to the existing report. Have you also tested this in SCORM Cloud, mentioned in the thread you included?  It's a great testing environment for SCORM content, and you can use its LMS debug mode to ensure the information is being sent.
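If you'd like to confirm it yourself, here's a rough sketch (assuming the SCORM 1.2 API object is reachable as window.API in the LMS frame, which can vary by LMS) that wraps LMSSetValue in the browser console and flags any latency calls:

    function shimLatencyLogging(api) {
      // Keep a reference to the real call, then intercept every set.
      var realSet = api.LMSSetValue;
      api.LMSSetValue = function (key, value) {
        if (key.indexOf('.latency') !== -1) {
          console.log('latency call observed:', key, '=', value);
        }
        return realSet.call(api, key, value);
      };
    }
    shimLatencyLogging(window.API); // run in the frame that exposes the API

If the course never triggers the log line, the value simply isn't being sent.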

Todd Chang

Here is the debug log for a test module (SCORM 1.2).  It reports all of the other interaction items except latency.

https://cloud.scorm.com/sc/user/ClientDebugLogReport?appId=9AJCDWGNV1&courseId=From2a018a720-21b0-45fc-9b1b-946baf37c2a6&logId=f6be744f-89d3-40a2-9a1e-8fdf39079f2a

Here's the output excerpt for Question 8:  

+ [14:23:46.246] LMSGetValue('cmi.interactions._count') returned '8' in 0 seconds
    [14:23:46.246] LMSGetLastError() returned '0' in 0 seconds
+ [14:23:46.246] LMSSetValue('cmi.interactions.8.id', 'Scene2_Slide8_MultiChoice_0_0') returned 'true' in 0.001 seconds
+ [14:23:46.247] LMSSetValue('cmi.interactions.8.type', 'choice') returned 'true' in 0.001 seconds
+ [14:23:46.248] LMSSetValue('cmi.interactions.8.student_response', 'Impetigo') returned 'false' in 0 seconds
    [14:23:46.248] LMSGetLastError() returned '405' in 0 seconds
    [14:23:46.248] LMSGetErrorString('405') returned 'Incorrect Data Type' in 0 seconds
    [14:23:46.248] LMSGetDiagnostic('') returned 'cmi.interactions.n.student_response must be a valid CMIFeedback - value must be consistent with interaction type. Your value is: Impetigo' in 0 seconds
+ [14:23:46.250] LMSSetValue('cmi.interactions.8.student_response', 'i') returned 'true' in 0.001 seconds
+ [14:23:46.251] LMSSetValue('cmi.interactions.8.correct_responses.0.pattern', 'Impetigo') returned 'false' in 0 seconds
    [14:23:46.251] LMSGetLastError() returned '405' in 0 seconds
    [14:23:46.251] LMSGetErrorString('405') returned 'Incorrect Data Type' in 0 seconds
    [14:23:46.251] LMSGetDiagnostic('') returned 'cmi.interactions.n.student_response must be a valid CMIFeedback - value must be consistent with interaction type. Your value is: Impetigo' in 0 seconds
+ [14:23:46.252] LMSSetValue('cmi.interactions.8.correct_responses.0.pattern', 'i') returned 'true' in 0.001 seconds
+ [14:23:46.253] LMSSetValue('cmi.interactions.8.result', 'correct') returned 'true' in 0 seconds
+ [14:23:46.254] LMSSetValue('cmi.interactions.8.weighting', '1') returned 'true' in 0 seconds
+ [14:23:46.254] LMSSetValue('cmi.interactions.8.objectives.0.id', 'Results') returned 'true' in 0.001 seconds
+ [14:23:46.255] LMSSetValue('cmi.interactions.8.time', '14:23:46') returned 'true' in 0.001 seconds
+ [14:23:46.266] LMSSetValue('cmi.core.score.raw', '0.9') returned 'true' in 0.001 seconds
+ [14:23:46.267] LMSSetValue('cmi.core.score.max', '100') returned 'true' in 0 seconds
+ [14:23:46.268] LMSSetValue('cmi.core.score.min', '0') returned 'true' in 0 seconds

It seems to work fine, except that cmi.interactions.n.latency is not reported.  This has been a known issue with the QA team since May 2013, but we haven't heard of any solutions.  Any reason it's taking this long?

-Todd

Ashley Terwilliger-Pollard

Hi Todd,

I don't have information to share regarding the process or timeline our QA team follows for particular issues, but I see there are only a few reports of this, which may be part of why it has taken this long. I've added your thread to the existing QA report, so the additional information shared above will be helpful as the team investigates further. I cannot offer a time frame for when or if this issue will be addressed, so at this time I'd recommend looking for a different approach for your project.
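As a rough sketch of one such approach (assuming a SCORM 1.2 course; the findAPI helper, the writeLatency function, and the trigger placement are illustrative, not built-in Storyline behavior), you could record latency yourself with Execute JavaScript triggers:

    // Trigger when the question slide starts: remember when it appeared.
    window.questionStart = Date.now();

    // Trigger on submit: write latency for interaction n in CMITimespan format.
    function findAPI(win) {
      // Walk up the frame hierarchy to locate the LMS-provided SCORM 1.2 API.
      while (!win.API && win.parent && win.parent !== win) {
        win = win.parent;
      }
      return win.API || null;
    }

    function writeLatency(n) {
      var api = findAPI(window);
      if (!api) { return; }
      var ms = Date.now() - window.questionStart;
      var pad = function (x) { return (x < 10 ? '0' : '') + x; };
      var h = Math.floor(ms / 3600000);
      var m = Math.floor((ms % 3600000) / 60000);
      var s = ((ms % 60000) / 1000).toFixed(2);
      api.LMSSetValue('cmi.interactions.' + n + '.latency',
                      pad(h) + ':' + pad(m) + ':' + pad(s));
      api.LMSCommit('');
    }

Note that the interaction index n has to match the one Storyline assigned to the question, so treat this as a starting point rather than a drop-in fix.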

Klaus Giebeler

You posted a description of the quiz data sent to an LMS here: http://www.articulate.com/support/storyline/quiz-data-sent-to-an-lms-in-articulate-storyline

To my knowledge, and as you mention, cmi.interactions.x.latency is not only very important information but also a SCORM requirement. Are there any plans to add this ASAP, as it existed in your past applications, or are you SCORM compliant without it?

Klaus

Ashley Terwilliger-Pollard

Hi all,

I wanted to provide an update here, as Storyline 2 Update 11 was just released and includes a number of fixes, which you can see in the release notes available here. The item you may be particularly interested in is the fix for an issue where the latency value would always be 0 for question slides in LMS data.

You can download the latest update here, and after installing it you’ll want to republish any existing content to ensure the updates and fixes are applied.

Let us know if you have any questions, either here or by reaching out to our Support Engineers directly.
