Forum Discussion

ChristyDotson's avatar
ChristyDotson
Community Member
18 days ago

Test results response file

Looking for:
What is the primary raw file name that contains the answer information for a quiz?

Issue:
I used the true/false interaction and updated some of the language so the labels read 'Yes' for true and 'No' for false. Even though 'Yes' (true) is selected as the correct answer, when a user takes the test the correct response is being reported as 'f' and the student answer as 'f', even if the user selected 'Yes' (true). When the student looks back at their answers in the LMS, it reports the answer as 'No' (false), which is not correct. The student still receives credit for getting the right answer despite the output reporting the wrong one.

Since our LMS provider is small, we are trying to determine whether this is the way they are unpacking the output file, or whether there is some weirdness going on with these T/F interactions because their labels have had minor language updates.

  • Hi ChristyDotson, assuming SCORM 1.2, the T/F style questions are most likely using the SCORM-defined cmi.interactions type “true-false”. This means that the (legal) responses expected by an LMS are either "0"/"1" or "t"/"f". It won't matter what the text labels are, as the LMS only accepts those values, so Storyline will be hardcoded to send either "t" or "f".

    For multiple choice, or "choice", the legal characters are "0"-"9" or "a"-"z".

    Reporting for SCORM 1.2 quizzes is pretty awful and, in my experience, very rarely used; it needs an accompanying document (a Q and A matrix) to make sense of the data stored in the LMS.

    The most flexible response type in SCORM is the "fill-in", which allows a string of up to 255 characters.

    I've never looked into this in Storyline, but when I used to hand-build courses, I would program all questions to use "fill-in". To the user, the front end may look like T/F or multiple-choice questions, but I would use the "fill-in" type so I could store more meaningful data for reporting.
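    For illustration, a minimal sketch of what a hand-built "fill-in" interaction call might look like; the interaction index, IDs and response text here are made up, and LMSSetValue is assumed to be the SCORM 1.2 API wrapper exposed to the course:

    // Hypothetical example only: reporting a question as a SCORM 1.2 "fill-in"
    // so the stored response is human-readable (up to 255 characters).
    LMSSetValue('cmi.interactions.0.id', 'Scene1_Slide1_Question_0_0');
    LMSSetValue('cmi.interactions.0.type', 'fill-in');
    LMSSetValue('cmi.interactions.0.student_response', 'Yes - the behaviour was acceptable');
    LMSSetValue('cmi.interactions.0.correct_responses.0.pattern', 'Yes - the behaviour was acceptable');
    LMSSetValue('cmi.interactions.0.result', 'correct');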

    SCORM 2004 improved on this, allowing you to store more meaningful data in multiple-choice questions.
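    A rough sketch of what the same choice data could look like under SCORM 2004, where responses are identifier strings rather than single characters (API_1484_11 is the SCORM 2004 runtime API object; the identifiers below are assumptions, not necessarily what Storyline sends):

    // Hypothetical SCORM 2004 example: "choice" responses can be readable identifiers.
    API_1484_11.SetValue('cmi.interactions.4.id', 'Scene1_Slide2_MultiChoice_0_0');
    API_1484_11.SetValue('cmi.interactions.4.type', 'choice');
    API_1484_11.SetValue('cmi.interactions.4.learner_response', 'seven');
    API_1484_11.SetValue('cmi.interactions.4.correct_responses.0.pattern', 'six');
    API_1484_11.SetValue('cmi.interactions.4.result', 'incorrect');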

     

    • ChristyDotson's avatar
      ChristyDotson
      Community Member

      Thanks for the reply, SamHill. I figured that the response file would still use the 't' or 'f' marker, but I find it odd that only the ones where I altered the screen text seem to be reporting back weird. Do you know, within the SCORM file packet, if there is an answer key that one could review to see if the answer is coded correctly? Or do most users just rely on the .story file and how it's built/responding visually in the editor?

      • SamHill's avatar
        SamHill
        Super Hero

        Hi ChristyDotson, I'm not sure I understand the question. The SCORM cmi.interactions used by Storyline are determined by the SCORM 1.2 and 2004 specification documents. To ensure that content is interoperable between different LMSs, it must follow the specification: the question type used determines the data that is expected by the LMS, and deviating from the expected data will cause issues.

        If you would like to deviate from the SCORM specification, you would need to make edits to the JavaScript file "scormdriver.js", which handles the data transaction between the LMS and the content. You would need to intercept the Get and Set functions and make the appropriate changes.
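        As a rough sketch only (the actual function names and signatures in scormdriver.js vary between Storyline versions, so treat this as an assumption rather than exact code), the idea is to wrap the Set call so values can be rewritten before they reach the LMS:

        // Sketch: wrap the existing Set function so interaction values can be
        // rewritten before being stored. Function name and behaviour are assumed.
        var originalSetValue = LMSSetValue;
        LMSSetValue = function (name, value) {
          // e.g. translate the hardcoded t/f into your own labels (not SCORM 1.2 compliant)
          if (/^cmi\.interactions\.\d+\.student_response$/.test(name)) {
            if (value === 't') { value = 'yes'; }
            if (value === 'f') { value = 'no'; }
          }
          return originalSetValue(name, value);
        };

        Bear in mind that anything outside the legal 0-9/a-z or t/f values is no longer guaranteed to be accepted by every LMS.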

        Let me know if you have any further questions.

  • ChristyDotson's avatar
    ChristyDotson
    Community Member

    It could be that I don't fully grasp how the SCORM cmi.interactions are delivered to the LMS. One of my colleagues wants to see which answers the SCORM course has indicated as correct and compare them with the output response file that is received after a user interacts with the course.

    • SamHill's avatar
      SamHill
      Super Hero

      Ok, I understand. In this instance, a document with the questions and answers is the best method. Next to each option, you would likely need to add a key to indicate how the response will appear in the LMS. Using your Yes/No question as an example:

      Q: Was the customer's behaviour acceptable? : Scene1_Slide1_TrueFalse_0_0

      Yes : t
      No : f

      Highlight the correct answer in bold, and add the response as it would be seen in the LMS. Also add the ID so it can be viewed in conjunction with the report.
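      If your colleague wants to check an LMS export against that document programmatically, the same key could also be kept as a simple lookup; this is purely illustrative, using the IDs from the example above:

      // Illustrative only: the Q and A matrix as a lookup keyed by interaction id.
      var answerKey = {
        'Scene1_Slide1_TrueFalse_0_0': {
          question: "Was the customer's behaviour acceptable?",
          options: { t: 'Yes', f: 'No' },
          correct: 't'
        }
      };

      // Compare a reported student_response against the expected correct value.
      function isCorrect(id, studentResponse) {
        var entry = answerKey[id];
        return entry ? studentResponse === entry.correct : null;
      }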

      This is the type of data that is sent to the LMS for each question that has an associated results page (example of the data stored for the question above):

      LMSSetValue('cmi.interactions.0.id', 'Scene1_Slide1_TrueFalse_0_0')
      LMSSetValue('cmi.interactions.0.type', 'true-false')
      LMSSetValue('cmi.interactions.0.student_response', 'f')
      LMSSetValue('cmi.interactions.0.correct_responses.0.pattern', 'f')
      LMSSetValue('cmi.interactions.0.result', 'correct')
      LMSSetValue('cmi.interactions.0.weighting', '10')
      LMSSetValue('cmi.interactions.0.latency', '0000:00:11.68')
      LMSSetValue('cmi.interactions.0.objectives.0.id', 'Was_the_customers__behaviour_acceptable_')
      LMSSetValue('cmi.interactions.0.time', '11:40:08') returned 'true' in 0.001 seconds

      If you use a multiple choice question, Storyline uses the first letter of the Checkbox or Radio button label to send to the LMS. For example, if you have a label of "Yes", Storyline would send "y" to the LMS.

      Note: I have just found a bug in Storyline. If you have multiple options beginning with the same letter, the responses will all be stored with that same letter, so they can't be told apart. For example:

      Q: How many stars are on the Australian flag? : Scene1_Slide2_MultiChoice_0_0

      Six : s
      Seven : s
      Eight : e
      Nine : n

      Following is the data sent to the LMS when responding to the question with incorrect answers until selecting "Six", the correct answer.

      Note: Storyline attempts to send the full label to the LMS first. As you will see, a response of 'Seven' is sent to the LMS. I'm using SCORM Cloud, which rejected the value as it is not a SCORM-compliant value. Storyline then falls back to a SCORM-compliant value (0-9, a-z) and uses the first initial as the value. The LMS vendor could decide to allow values that do not fit the 0-9, a-z requirement, though, and allow the value of 'Seven' to be stored. (A rough sketch of this fallback logic follows the log below.)

      LMSSetValue('cmi.interactions.4.id', 'Scene1_Slide2_MultiChoice_0_0')

      LMSSetValue('cmi.interactions.4.type', 'choice')

      LMSSetValue('cmi.interactions.4.student_response', 'Seven') returned 'false' in 0 seconds

      LMSSetValue('cmi.interactions.4.student_response', 's')

      LMSSetValue('cmi.interactions.4.correct_responses.0.pattern', 'Six') returned 'false' in 0 seconds

      LMSSetValue('cmi.interactions.4.correct_responses.0.pattern', 's')

      LMSSetValue('cmi.interactions.4.result', 'wrong')

      LMSSetValue('cmi.interactions.4.weighting', '10')

      LMSSetValue('cmi.interactions.4.latency', '0000:00:11.39')

      LMSSetValue('cmi.interactions.4.objectives.0.id', 'Was_the_customers__behaviour_acceptable_')

      LMSSetValue('cmi.interactions.4.time', '11:54:57')

       

      LMSSetValue('cmi.interactions.5.id', 'Scene1_Slide2_MultiChoice_0_1')

      LMSSetValue('cmi.interactions.5.type', 'choice')

      LMSSetValue('cmi.interactions.5.student_response', 'Nine') returned 'false' in 0 seconds

      LMSSetValue('cmi.interactions.5.student_response', 'n')

      LMSSetValue('cmi.interactions.5.correct_responses.0.pattern', 'Six') returned 'false' in 0 seconds

      LMSSetValue('cmi.interactions.5.correct_responses.0.pattern', 's')

      LMSSetValue('cmi.interactions.5.result', 'wrong')

      LMSSetValue('cmi.interactions.5.weighting', '10')

      LMSSetValue('cmi.interactions.5.latency', '0000:00:14.52')

      LMSSetValue('cmi.interactions.5.objectives.0.id', 'Was_the_customers__behaviour_acceptable_')

      LMSSetValue('cmi.interactions.5.time', '11:55:00')

       

      LMSSetValue('cmi.interactions.6.id', 'Scene1_Slide2_MultiChoice_0_2')

      LMSSetValue('cmi.interactions.6.type', 'choice')

      LMSSetValue('cmi.interactions.6.student_response', 'Eight') returned 'false' in 0 seconds

      LMSSetValue('cmi.interactions.6.student_response', 'e')

      LMSSetValue('cmi.interactions.6.correct_responses.0.pattern', 'Six') returned 'false' in 0 seconds

      LMSSetValue('cmi.interactions.6.correct_responses.0.pattern', 's')

      LMSSetValue('cmi.interactions.6.result', 'wrong')

      LMSSetValue('cmi.interactions.6.weighting', '10')

      LMSSetValue('cmi.interactions.6.latency', '0000:00:17.47')

      LMSSetValue('cmi.interactions.6.objectives.0.id', 'Was_the_customers__behaviour_acceptable_')

      LMSSetValue('cmi.interactions.6.time', '11:55:03')

       

      LMSSetValue('cmi.interactions.7.id', 'Scene1_Slide2_MultiChoice_0_3')

      LMSSetValue('cmi.interactions.7.type', 'choice')

      LMSSetValue('cmi.interactions.7.student_response', 'Six') returned 'false' in 0 seconds

      LMSSetValue('cmi.interactions.7.student_response', 's')

      LMSSetValue('cmi.interactions.7.correct_responses.0.pattern', 'Six') returned 'false' in 0 seconds

      LMSSetValue('cmi.interactions.7.correct_responses.0.pattern', 's')

      LMSSetValue('cmi.interactions.7.result', 'correct')

      LMSSetValue('cmi.interactions.7.weighting', '10')

      LMSSetValue('cmi.interactions.7.latency', '0000:00:19.81')

      LMSSetValue('cmi.interactions.7.objectives.0.id', 'Was_the_customers__behaviour_acceptable_')

      LMSSetValue('cmi.interactions.7.time', '11:55:06')
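      In other words, the fallback described in the note above behaves roughly like this; a sketch, not Storyline's actual code:

      // Sketch of the fallback behaviour seen in the log above (not Storyline's code).
      // SCORM 1.2 only guarantees single characters 0-9 / a-z for "choice" responses,
      // so if the LMS rejects the full label, the first letter is sent instead.
      function setChoiceResponse(index, label) {
        var element = 'cmi.interactions.' + index + '.student_response';
        var accepted = LMSSetValue(element, label);        // try the full label first
        if (accepted === 'false') {
          accepted = LMSSetValue(element, label.charAt(0).toLowerCase());
        }
        return accepted;
      }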

       

  • ChristyDotson's avatar
    ChristyDotson
    Community Member

    That was very helpful. Thank you, Sam! The plot thickens on the issue we are running into.

    We ran through the course twice, once with the known correct answer and once with the known incorrect answer. When we review the logs, it looks like the system is delivering back the student_response as 'f' in both instances. We do see a difference between the two entries' result field (one being 'correct' and one being 'wrong'). Any thoughts on what is causing this?

    The question we are viewing is Scene1_Slide8. It is a true/false interaction whose text labels in the course have been changed to Yes/No. The correct answer for this question is Yes (true). Here are the logs: Correct Answers log, Incorrect Answer log. This is also happening on one other T/F interaction where we updated the labels to Yes/No as well (Scene1_Slide12).

    • ChristyDotson's avatar
      ChristyDotson
      Community Member

      Just to see what would happen, I replicated the course and changed the two problematic slides. I used the true/false interaction as before but did not update the labels to say Yes/No (I kept the standard true/false), and they are responding how you described in your example (giving a student_response of 't' for true and 'f' for false).

      Still wondering why this is happening but I suppose at least there is a way to fix it.

    • SamHill's avatar
      SamHill
      Super Hero

      Yes, something has gone wrong with that question for it to be reporting 'f' for both Yes/No responses. It's as if you have two false responses in your T/F question, although I know it would not be possible for you to configure it that way.

      The only thing I can suggest is to delete that question and recreate it. You could opt for a multiple-choice question instead, or try the T/F again to see if there is a consistent issue that would need reporting as a bug.

  • ChristyDotson's avatar
    ChristyDotson
    Community Member

    Thanks again! I just took my replicated course that responded as anticipated and changed only the words from 'true' to 'yes' and 'false' to 'no'. The log is now reporting back student_response as 'f'. Off to submit a bug report!

    • KellyAuner's avatar
      KellyAuner
      Staff

      Hi, ChristyDotson!

      I see my teammate, Wilbert, is taking a closer look at your support case now. Please feel free to let them know if you have any questions. You're in good hands!