Forum Discussion
Report on attempt and question response data
We are encountering the same issue with Adobe Learning Manager, and from our investigation it appears that the LMS and content authoring platforms (like Articulate RISE) each point to the other, with no definitive resolution.
Technical Analysis:
I have conducted extensive debugging on SCORM 1.2 and SCORM 2004 packages by enabling debug mode to capture the communication between the SCORM runtime API and the LMS. Reviewing the data exchange, it appears that RISE-generated SCORM packages omit the identifiers an LMS needs to track and store quiz attempt data in a structured format.
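For anyone who wants to reproduce the capture, the sketch below shows the kind of logging shim I used: it wraps the SCORM 1.2 runtime API object (`window.API`; SCORM 2004 content discovers `window.API_1484_11` instead) before the content loads, so every `LMSSetValue`/`LMSGetValue` call is written to the console. The shim itself is illustrative; only the API method names come from the SCORM 1.2 specification.

```typescript
// Minimal logging shim for the SCORM 1.2 runtime API (illustrative).
// SCORM 1.2 content discovers the API as `window.API`; SCORM 2004
// content looks for `window.API_1484_11` and different method names.

interface Scorm12Api {
  LMSInitialize(arg: string): string;
  LMSFinish(arg: string): string;
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
  LMSGetLastError(): string;
  LMSGetErrorString(code: string): string;
  LMSGetDiagnostic(code: string): string;
}

// Wrap the real API object, logging every get/set while delegating
// all calls to the LMS-provided implementation.
function wrapScormApi(api: Scorm12Api): Scorm12Api {
  return {
    LMSInitialize: (arg) => api.LMSInitialize(arg),
    LMSFinish: (arg) => api.LMSFinish(arg),
    LMSCommit: (arg) => api.LMSCommit(arg),
    LMSGetLastError: () => api.LMSGetLastError(),
    LMSGetErrorString: (code) => api.LMSGetErrorString(code),
    LMSGetDiagnostic: (code) => api.LMSGetDiagnostic(code),
    LMSSetValue: (element, value) => {
      console.debug(`SCORM SET ${element} = ${value}`);
      return api.LMSSetValue(element, value);
    },
    LMSGetValue: (element) => {
      const value = api.LMSGetValue(element);
      console.debug(`SCORM GET ${element} -> ${value}`);
      return value;
    },
  };
}

// Install the wrapper before the SCO runs its API discovery.
const w = window as unknown as { API?: Scorm12Api };
if (w.API) {
  w.API = wrapScormApi(w.API);
}
```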
Key Technical Issues Identified:
- No Quiz Attempt Identifier
- SCORM does not natively report a unique attempt identifier at the quiz level.
- Each quiz attempt should carry a distinct, attempt-specific key to differentiate between multiple attempts; SCORM offers nothing beyond a single cmi.core.lesson_status for the whole SCO.
- Lack of a Persistent Question Identifier
- The RISE package currently passes the question text string as the only identifier.
- Without a persistent, unique ID for each question (independent of the text), LMS platforms struggle to properly aggregate learner responses across multiple attempts.
- Data Structuring Issue in LMS Reporting
- SCORM’s interaction model does not automatically create a new record for each quiz attempt.
- Instead, the LMS must either:
- Overwrite existing response data (leading to loss of previous attempts).
- Append responses incorrectly due to the lack of a clear attempt hierarchy (as seen in the OP's example, where new columns are created for each new attempt and question response); the call sequence sketched after this list shows the problem concretely.
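To make the issue concrete, here is the shape of the cmi.interactions traffic for two attempts at the same question. The values are illustrative, not captured from a real session, but the structure matches what the debug trace shows: the question text doubles as the interaction id, and nothing marks where one attempt ends and the next begins.

```typescript
// Illustrative sequence of SetValue calls for two attempts at the same
// question (element names are SCORM 1.2; values are made up). The
// interaction id carries the question text itself, and no attempt
// identifier separates the first write from the second, so the LMS
// must guess how the records relate.
const observedCalls: Array<[element: string, value: string]> = [
  // Attempt 1
  ["cmi.interactions.0.id", "Which of the following is a SCORM data model element?"],
  ["cmi.interactions.0.type", "choice"],
  ["cmi.interactions.0.student_response", "a"],
  ["cmi.interactions.0.result", "wrong"],
  // Attempt 2 -- same question text, no attempt identifier
  ["cmi.interactions.1.id", "Which of the following is a SCORM data model element?"],
  ["cmi.interactions.1.type", "choice"],
  ["cmi.interactions.1.student_response", "c"],
  ["cmi.interactions.1.result", "correct"],
];
```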
Potential Workaround:
For the LMS to interpret RISE SCORM data more effectively, a workaround (sketched in code after this list) would involve:
- Deriving an attempt number by detecting duplicate interactions within a Quiz session.
- Mapping question text strings to an internal LMS-generated unique ID to ensure responses can be aggregated properly.
- Storing each new attempt as a separate row in the reporting structure instead of appending to an existing column, which distorts data analysis.
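A minimal sketch of that post-processing, assuming the LMS can see the interactions in submission order as (question text, response, result) tuples; the hashing scheme and the duplicate-detection rule are my assumptions, not anything SCORM or RISE provides:

```typescript
import { createHash } from "node:crypto";

// Sketch of LMS-side post-processing. Assumption: a repeated question
// within one session signals the start of a new attempt.

interface RawInteraction {
  questionText: string;
  response: string;
  result: string;
}

interface ReportRow {
  attempt: number;
  questionId: string; // stable, text-derived surrogate ID
  questionText: string;
  response: string;
  result: string;
}

// Map the question text to a stable internal ID so responses can be
// aggregated even though the package never supplies a real identifier.
function questionId(text: string): string {
  return createHash("sha256").update(text.trim()).digest("hex").slice(0, 12);
}

function toReportRows(interactions: RawInteraction[]): ReportRow[] {
  const rows: ReportRow[] = [];
  let attempt = 1;
  const seenThisAttempt = new Set<string>();

  for (const it of interactions) {
    const id = questionId(it.questionText);
    // A duplicate question within the session marks a new attempt.
    if (seenThisAttempt.has(id)) {
      attempt += 1;
      seenThisAttempt.clear();
    }
    seenThisAttempt.add(id);
    // One row per (attempt, question): new attempts become new rows,
    // not new columns, so earlier responses are never overwritten.
    rows.push({ attempt, questionId: id, ...it });
  }
  return rows;
}
```

Note that the duplicate-detection heuristic breaks down if a quiz legitimately repeats a question within one attempt (e.g., question pools), which is exactly why a real attempt identifier from the authoring tool would be preferable.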
Why This Matters:
Without a structured approach to attempt tracking in SCORM:
- Learner progress analytics are incomplete—we cannot distinguish between initial and repeated quiz attempts.
- We lose insights into learner behavior, making it impossible to identify recurring mistakes or knowledge gaps.
- Course refinement is hindered, as performance data is unreliable, preventing meaningful content improvements.
Next Steps & Request for Support:
This appears to be an inherent limitation in the way SCORM data is structured, specifically in how RISE publishes quiz attempts and responses. If there is a best practice or SCORM configuration that ensures proper attempt tracking, we need guidance on implementing it. Otherwise, this is a significant reporting flaw that should be addressed at both the authoring tool and LMS levels.
Has anyone found a reliable solution, or is there a recommended way to modify SCORM publishing settings to better track quiz attempts and learner quiz responses? Any insights would be greatly appreciated.