Forum Discussion

EmilyGallich027
Community Member
3 years ago

Report on attempt and question response data

Background: I have a Rise course that culminates in a short 5-question quiz with an 80% pass requirement. The quiz allows unlimited attempts. The course is exported in SCORM 2004, 4th edition format because we want to be able to track trends for which questions people regularly get incorrect.

The issue:

  • In our LMS (Bridge), all quiz reattempts a learner has made within our Rise course seem to be grouped and reported back as a single attempt, so learners always show as having passed on the first attempt when this is not accurate.
  • LMS reporting for quiz question responses shows only one row for each learner, with the quiz questions showing in the columns along the top of the report. (Mock-up attached). If the learner has reattempted the quiz, the questions are duplicated as additional columns for each reattempt that learner makes. This makes analysing trends impossible.
  • The LMS I used at my last company (Cornerstone) would interpret SCORM 2004, 4th edition format outputs from Rise courses to display a separate row for each learner attempt on the quiz, with the same number of questions listed across the top columns for consistency (Mock-up attached). This would enable us to analyse trends and determine which questions people typically struggled to get correct first time.

The request: Naturally, I contacted Bridge to query whether we could change the interpretation of the data / reporting in our LMS. However, they have responded saying that this is an issue with how the quiz has been set up in Rise.

Can someone please help?

  • Hi Emily.

    Thanks for reaching out. I can see that you've already reached out to our support engineers for this. Have a good one.

    • BruceMeyer-adc1
      Community Member

      Closing out the request in this way is of no use to the rest of the community, who are obviously reading this post because of some interest in the topic.

      • KarlMuller
        Community Member

        Hi Bruce,

        You can ask the OP to provide a synopsis of how the issue was resolved (or not).

  • AlexBarrett
    Community Member

    We have a similar issue.
    Cornerstone interprets all quiz attempts as one in Reporting 2.0.

    Cornerstone says this is a Rise problem
    Rise says that this is a Cornerstone problem

  • Hi - can we please see the response/solution to this issue? I'm having a similar problem with my LMS platform and keep going back and forth trying to understand whether I'm doing something wrong or whether the platform isn't able to track attempts.

  • AlexBarrett
    Community Member

    Hi Mark - I'm reasonably certain that no solution exists. What LMS are you using?

  • Hi - OP here. Sadly, we never did find a resolution. We've instead decided to manage the quiz element in Microsoft Forms and embed those forms into the Rise course, which is clunky and confusing for learners.

  • JM-0001
    Community Member

    We are encountering the same issue with Adobe Learning Manager, and from our investigation, it appears that both the LMS and content authoring platforms (like Articulate RISE) are shifting the blame without a definitive resolution.

    Technical Analysis:

    I have conducted extensive debugging on SCORM 1.2 and SCORM 2004 packages by enabling debug mode to capture the communication between the SCORM runtime API and the LMS. Upon reviewing the data exchange, it appears that RISE-generated SCORM packages are failing to include a critical data structure necessary for LMS platforms to accurately track and store quiz attempt data in a structured format.

    Key Technical Issues Identified:

    1. No Quiz Attempt Identifier 
      • SCORM does not natively report a unique attempt identifier at the quiz level.
      • Each quiz attempt would need a distinct, attempt-specific key to differentiate between multiple attempts; status fields such as SCORM 1.2's cmi.core.lesson_status record only completion/pass status, not an attempt number.
    2. Lack of a Persistent Question Identifier
      • The SCORM packages currently pass the question text string as the only identifier.
      • Without a persistent, unique ID for each question (independent of the text), LMS platforms struggle to properly aggregate learner responses across multiple attempts.
    3. Data Structuring Issue in LMS Reporting:
      • SCORM’s interaction model does not automatically create a new record for each quiz attempt.
      • Instead, the LMS must either:
        • Overwrite existing response data (leading to loss of previous attempts).
        • Append responses incorrectly due to the lack of a clear attempt hierarchy (as seen in the OP's example, where new columns are created for each new attempt and question response).
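The gap described in the list above can be sketched with a minimal mock of the SCORM 2004 runtime calls involved. The element names (`cmi.interactions.n.id`, `.learner_response`, `.result`) come from the SCORM 2004 run-time data model; the `lms` object and quiz values are illustrative stand-ins, not real Rise or LMS code.

```javascript
// Minimal mock of the SCORM 2004 runtime API; "lms" stands in for the
// real API object a course would discover in the LMS window.
const lms = {
  data: {},
  SetValue(key, value) { this.data[key] = value; return 'true'; },
};

// Record one question response as a cmi.interactions record.
function recordResponse(api, index, questionId, learnerResponse, result) {
  api.SetValue(`cmi.interactions.${index}.id`, questionId);
  api.SetValue(`cmi.interactions.${index}.learner_response`, learnerResponse);
  api.SetValue(`cmi.interactions.${index}.result`, result);
}

// First attempt: question answered incorrectly...
recordResponse(lms, 0, 'Quiz_1_Question_1', 'choice_b', 'incorrect');
// ...second attempt: a new record is appended with the same id, and nothing
// in the data model ties either record to an attempt number.
recordResponse(lms, 1, 'Quiz_1_Question_1', 'choice_a', 'correct');
```

Both records end up keyed only by interaction index and question id, which is why an LMS has to guess at the attempt boundary.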

    Potential Workaround:

    For the LMS to interpret RISE SCORM data more effectively, a workaround would involve:

    • Deriving an attempt number by detecting duplicate interactions within a Quiz session.
    • Mapping question text strings to an internal LMS-generated unique ID to ensure responses can be aggregated properly.
    • Storing each new attempt as a separate row in the reporting structure instead of appending to an existing column, which distorts data analysis.
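The first two workaround steps above could look like this on the LMS side. This is a hypothetical post-processing sketch (not Bridge or Cornerstone code), and it assumes interaction records arrive in the order they were recorded:

```javascript
// Hypothetical LMS-side post-processing: derive an attempt number by
// counting repeats of the same question id, and map the raw question-text
// id to an internal numeric id for stable aggregation.
function structureInteractions(interactions) {
  const seen = new Map();        // question id -> occurrences so far
  const internalIds = new Map(); // question id -> internal numeric id
  return interactions.map((interaction) => {
    if (!internalIds.has(interaction.id)) {
      internalIds.set(interaction.id, internalIds.size + 1);
    }
    const attempt = (seen.get(interaction.id) || 0) + 1;
    seen.set(interaction.id, attempt);
    // One row per (question, attempt): the separate-row shape the OP's
    // Cornerstone report produced.
    return {
      questionId: internalIds.get(interaction.id),
      attempt,
      result: interaction.result,
    };
  });
}
```

The obvious fragility is the ordering assumption: if the LMS stores interactions unordered, repeats of a question id can no longer be trusted to mean successive attempts.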

    Why This Matters:

    Without a structured approach to attempt tracking in SCORM:

    • Learner progress analytics are incomplete—we cannot distinguish between initial and repeated quiz attempts.
    • We lose insights into learner behavior, making it impossible to identify recurring mistakes or knowledge gaps.
    • Course refinement is hindered, as performance data is unreliable, preventing meaningful content improvements.

    Next Steps & Request for Support:

    This appears to be an inherent limitation in the way SCORM data is structured, specifically in how RISE publishes quiz attempts and responses. If there is a best practice or SCORM configuration that ensures proper attempt tracking, we need guidance on implementing it. Otherwise, this is a significant reporting flaw that should be addressed at both the authoring tool and LMS levels.

    Has anyone found a reliable solution, or is there a recommended way to modify SCORM publishing settings to better track quiz attempts and learner quiz responses? Any insights would be greatly appreciated.

    • Justin
      Staff

      Good Morning, JM, and thank you for providing such a comprehensive analysis of the behavior you've observed!

      The root cause of this problem is that the SCORM standard has not identified a concept such as a "quiz attempt."  This means that E-Learning Authoring Tools do not have a defined method for sending quiz attempts that they can expect all LMSes to understand, and that Learning Management Systems cannot expect to receive quiz attempts in a consistent format from all E-Learning Courses.

      Having said this, both Storyline 360 and Rise 360 have attempted to account for this by following these rules and logic for Question IDs within SCORM 2004:

      • Each quiz interaction gets its own interaction record in the LMS. Interactions are never overwritten by Rise.
      • Quiz IDs are truncated in length, but they will always be unique, thanks to the tail end containing the attempt count for the quiz! This is our attempt to allow data-analysis lookups for attempt counts. Note how the tail end of the example IDs below starts with 0 and iterates to 1 on the next attempt; each additional attempt increments it by 1:
        • Example for first attempt: urn:scormdriver:Quiz_1_Title_Question_2-Multiple_Response%3A_Correct_is_correct,_without_two_spaces_0
        • Example for second attempt: urn:scormdriver:Quiz_1_Title_Question_2-Multiple_Response%3A_Correct_is_correct,_without_two_spaces_1
        • Example for third attempt: urn:scormdriver:Quiz_1_Title_Question_2-Multiple_Response%3A_Correct_is_correct,_without_two_spaces_2


      Since these rules aren't agreed upon by the SCORM standard, we would expect that your LMS will need to provide access to advanced data analytics (such as the ability to write SQL queries) in order to report on the above data.
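For anyone building such a report, the attempt counter in the ID scheme above can be recovered with a one-line parse. A minimal sketch (the function name is illustrative, not part of any product API):

```javascript
// Recover the attempt counter ("..._0", "..._1", ...) that, per the scheme
// described above, sits at the tail end of each Rise SCORM 2004
// interaction ID.
function parseAttemptIndex(interactionId) {
  const match = /_(\d+)$/.exec(interactionId);
  return match ? Number(match[1]) : null; // null when no trailing counter
}
```

The same regex translates directly into most SQL dialects' regexp-extract functions for use in LMS reporting queries.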

      Please let us know if you have any other questions!