Report on attempt and question response data
We are encountering the same issue with Adobe Learning Manager, and from our investigation, it appears that both the LMS and content authoring platforms (like Articulate RISE) are shifting the blame without a definitive resolution.
Technical Analysis:
I have conducted extensive debugging on SCORM 1.2 and SCORM 2004 packages by enabling debug mode to capture the communication between the SCORM runtime API and the LMS. Upon reviewing the data exchange, it appears that RISE-generated SCORM packages are failing to include a critical data structure necessary for LMS platforms to accurately track and store quiz attempt data in a structured format.
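For readers who want to reproduce this kind of capture, below is a minimal sketch of one way to intercept the SCORM 1.2 runtime traffic in the browser. It assumes the LMS exposes the API object as window.API (SCORM 2004 content looks for window.API_1484_11 and calls SetValue/GetValue instead); the wrapper is illustrative and is not the specific debug mode referenced above.

```typescript
// A minimal sketch of a capture wrapper around the SCORM 1.2 runtime API.
// Assumption: the LMS exposes the API object as window.API (SCORM 2004
// content looks for window.API_1484_11 and calls SetValue/GetValue instead).
function wrapScormApi(api: object): object {
  return new Proxy(api, {
    get(target, prop, receiver) {
      const original = Reflect.get(target, prop, receiver);
      if (typeof original !== "function") return original;
      return (...args: unknown[]) => {
        const result = original.apply(target, args);
        // Log the data-model traffic, e.g. cmi.interactions.3.student_response
        if (prop === "LMSSetValue" || prop === "LMSGetValue") {
          console.debug(String(prop), args, "->", result);
        }
        return result;
      };
    },
  });
}

// Install the wrapper before the course launches so every call is captured.
const w = window as unknown as { API?: object };
if (w.API) {
  w.API = wrapScormApi(w.API);
}
```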
Key Technical Issues Identified:
- No Quiz Attempt Identifier
- SCORM does not natively report a unique attempt identifier at the quiz level.
- Each quiz attempt should have a distinct cmi.core.lesson_status or an equivalent attempt-specific key to differentiate between multiple attempts.
- Lack of a Persistent Question Identifier
- SCORM is currently passing the question text string as the only identifier.
- Without a persistent, unique ID for each question (independent of the text), LMS platforms struggle to properly aggregate learner responses across multiple attempts.
- Data Structuring Issue in LMS Reporting:
- SCORM’s interaction model does not automatically create a new record for each quiz attempt.
- Instead, the LMS must either:
- Overwrite existing response data (leading to loss of previous attempts).
- Append responses incorrectly due to the lack of a clear attempt hierarchy (as seen in the OP's example, where new columns are created for each new attempt and question response). The sketch below illustrates the resulting ambiguity.
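To make the overwrite-or-append dilemma concrete, here is a hypothetical illustration (field names and values are invented for this example, not captured from a real package) of what the LMS has to work with when the question text is the only identifier and no attempt key is present:

```typescript
// Hypothetical shape of the reported data, following the description above:
// the only identifier is derived from the question text, and there is no
// attempt-level key for the LMS to group records on.
interface ReportedInteraction {
  questionText: string;
  learnerResponse: string;
  result: "correct" | "incorrect";
  timestamp: string; // ISO 8601
}

// Two attempts at the same question arrive as records that differ only in
// response, result, and timestamp, so the LMS must either overwrite the first
// with the second or append both without knowing which attempt each belongs to.
const received: ReportedInteraction[] = [
  { questionText: "Which options are correct?", learnerResponse: "a,c", result: "incorrect", timestamp: "2024-05-01T09:00:00Z" },
  { questionText: "Which options are correct?", learnerResponse: "a,b", result: "correct", timestamp: "2024-05-01T09:05:00Z" },
];
```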
Potential Workaround:
For the LMS to interpret RISE SCORM data more effectively, a workaround would involve the following (a sketch of this logic appears after the list):
- Deriving an attempt number by detecting duplicate interactions within a Quiz session.
- Mapping question text strings to an internal LMS-generated unique ID to ensure responses can be aggregated properly.
- Storing each new attempt as a separate row in the reporting structure instead of appending to an existing column, which distorts data analysis.
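Here is a sketch of that logic under the same assumptions as above: the interactions from a single quiz session carry only question text, a response, a result, and a timestamp. Names such as hashQuestionText and AttemptRow are illustrative, not an existing LMS API.

```typescript
// Workaround sketch: derive an attempt number by counting repeats of the same
// question within one session, map the question text to a stable internal id,
// and emit one row per (question, attempt) instead of extra columns.
import { createHash } from "node:crypto";

interface ReportedInteraction {
  questionText: string;      // the only identifier the package provides
  learnerResponse: string;
  result: "correct" | "incorrect";
  timestamp: string;         // ISO 8601, used to order interactions
}

interface AttemptRow {
  questionId: string;        // LMS-generated stable id for the question
  attemptNumber: number;     // 1-based, derived from repetition order
  learnerResponse: string;
  result: "correct" | "incorrect";
  timestamp: string;
}

// Stable internal id derived from the question text.
function hashQuestionText(text: string): string {
  return createHash("sha256").update(text.trim()).digest("hex").slice(0, 16);
}

// Input is assumed to be the interactions of one quiz session.
function toAttemptRows(interactions: ReportedInteraction[]): AttemptRow[] {
  const seen = new Map<string, number>(); // questionId -> attempts seen so far
  return [...interactions]
    .sort((a, b) => a.timestamp.localeCompare(b.timestamp))
    .map((i) => {
      const questionId = hashQuestionText(i.questionText);
      const attemptNumber = (seen.get(questionId) ?? 0) + 1;
      seen.set(questionId, attemptNumber);
      return {
        questionId,
        attemptNumber,
        learnerResponse: i.learnerResponse,
        result: i.result,
        timestamp: i.timestamp,
      };
    });
}
```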
Why This Matters:
Without a structured approach to attempt tracking in SCORM:
- Learner progress analytics are incomplete—we cannot distinguish between initial and repeated quiz attempts.
- We lose insights into learner behavior, making it impossible to identify recurring mistakes or knowledge gaps.
- Course refinement is hindered, as performance data is unreliable, preventing meaningful content improvements.
Next Steps & Request for Support:
This appears to be an inherent limitation in the way SCORM data is structured, specifically in how RISE publishes quiz attempts and responses. If there is a best practice or SCORM configuration that ensures proper attempt tracking, we need guidance on implementing it. Otherwise, this is a significant reporting flaw that should be addressed at both the authoring tool and LMS levels.
Has anyone found a reliable solution, or is there a recommended way to modify SCORM publishing settings to better track quiz attempts and learner quiz responses? Any insights would be greatly appreciated.
Good Morning, JM, and thank you for providing such a comprehensive analysis of the behavior you've observed!
The root cause of this problem is that the SCORM standard does not define a concept such as a "quiz attempt." This means that E-Learning Authoring Tools do not have a defined method for sending quiz attempts that they can expect all LMSes to understand, and that Learning Management Systems cannot expect to receive quiz attempts in a consistent format from all E-Learning Courses.
Having said this, both Storyline 360 and Rise 360 have attempted to account for this by following these rules and logic for Question IDs within SCORM 2004:
- Each quiz interaction gets its own interaction record in the LMS. Interactions are never overwritten by Rise.
- Quiz IDs are truncated in length, but they will always be unique, thanks to the tail end containing the attempt count for the quiz! This is our attempt to allow data-analysis lookups by attempt count. Note how, in the tail end of the example IDs below, the counter starts at 0 and increments to 1 on the next attempt; additional attempts increment it by 1 each time (a small parsing sketch follows these examples):
- Example for first attempt: urn:scormdriver:Quiz_1_Title_Question_2-Multiple_Response%3A_Correct_is_correct,_without_two_spaces_0
- Example for second attempt: urn:scormdriver:Quiz_1_Title_Question_2-Multiple_Response%3A_Correct_is_correct,_without_two_spaces_1
- Example for third attempt: urn:scormdriver:Quiz_1_Title_Question_2-Multiple_Response%3A_Correct_is_correct,_without_two_spaces_2
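For example, a reporting layer could split the attempt counter off the tail end of these IDs with something like the following sketch. The pattern is assumed from the examples above, and the function name is illustrative rather than an existing API.

```typescript
// Split a Rise-style interaction id into a question key and an attempt
// number, assuming the pattern shown above: an underscore followed by a
// 0-based attempt index at the very end of the id.
interface ParsedInteractionId {
  questionKey: string;   // id with the trailing attempt counter removed
  attemptNumber: number; // 1-based for reporting (the suffix is 0-based)
}

function parseRiseInteractionId(id: string): ParsedInteractionId | null {
  const match = /^(.*)_(\d+)$/.exec(id);
  if (!match) return null; // id does not follow the expected pattern
  return {
    questionKey: match[1],
    attemptNumber: Number(match[2]) + 1,
  };
}

// Example with the second-attempt id from above:
const parsed = parseRiseInteractionId(
  "urn:scormdriver:Quiz_1_Title_Question_2-Multiple_Response%3A_Correct_is_correct,_without_two_spaces_1"
);
// parsed?.questionKey ends with "..._without_two_spaces"
// parsed?.attemptNumber === 2
```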
Since these rules aren't agreed upon in the SCORM standard, we would expect that your LMS will need to provide access to advanced data analytics (such as the ability to write SQL queries) in order to report on the above data.
Please let us know if you have any other questions!