Engineering Journal: Storyline Quality Update Q2 2024
Published on June 25, 2024

Hello E-Learning Heroes community members! It’s Jesse Taber, Director of Engineering, and I’m back with the Q2 2024 Storyline quality update. This quarter I want to start by spotlighting one area where we’re currently focused on improving quality—course publishing—before providing updates on the typical quality metrics. Let’s dig in!

Spotlight on Publishing Quality

We’ve been tracking how often publishing fails in Storyline 360 for over a year now. In the spring of 2023, we made impressive progress in reducing this metric, but we have since regressed.

[Line chart showing Storyline 360’s publishing failure percentage for updates 74 through 86.]

Before I explain why this happened and how we plan to continue reducing publishing failures, I’d like to walk you through what’s going on in the background when you publish your project—so you understand why the process sometimes fails.

How Does Course Publishing Work?

When you publish a course, Storyline reviews every scene and slide and builds data files describing everything that needs to happen to recreate the author’s vision for their course in a learner’s browser. Every object, trigger, animation, timeline cue point, image, video, etc., gets represented as a component in these data files. Additionally, images, video, and audio may need to be compressed or re-encoded in order to work properly in a variety of browsers. Next, the course player settings are distilled into separate data files that describe the player options, colors, text labels, etc. Finally, some specialized formats like publishing to Word or video do extra work to create the final published Word document or MP4 video file.

As you can see, publishing a single Storyline course requires creating dozens of files and can be quite complex. And the more complex a process is, the higher its failure rate tends to be.

How Do We Calculate the Publish Failure Rate?
To calculate the publish failure rate, we divide the number of publish operations that failed by the total attempted. How do we know when publishing fails? It’s simple. When you publish your course, we record some basic information about it, like the audio and video quality settings, the player features that are enabled, and the output type (Web, LMS, Video, etc.). As the publish progresses, we record other information related to performance and, most importantly, whether it finished successfully.

Why Did the Publish Failure Rate Spike in July 2023?

When fixing a publishing-related bug, we discovered that we weren’t capturing every failed publish. Some failed attempts to publish were being counted as intentional cancellations. When we fixed this issue and released it in Update 78, we knew we’d likely see a rise in reported publishing failures since we’d now be capturing all of them. While it was disheartening to see this metric increase, we knew it was important to have a complete and accurate picture of what was happening for authors using Storyline in their day-to-day work.

Why Does Publishing Fail?

Even with more accurate publishing failure metrics, we could only tell how often publishing failed—not why. To understand why, we had to correlate the failed publishing metrics with the error report data we get from users who submit the in-app error report form. When we receive this data, we can tag it with keywords that allow us to recognize patterns, pinpoint issues, and fix them.

When we started digging into publishing error reports, we discovered that several of the most common failures were related to disk input/output operations (disk I/O). What does that mean? Remember earlier when we talked about how complex the publishing process is? Creating so many files requires Storyline to write and read a lot of data to and from the computer’s hard drive, which is commonly referred to as disk I/O. Unfortunately, anytime software has to perform disk I/O operations, things can go wrong.
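To make the two sections above concrete, here is a minimal sketch, in Python with hypothetical event and field names (Articulate’s actual telemetry pipeline is not public), of computing a publish failure rate from recorded publish events while distinguishing intentional cancellations from failures, the distinction the Update 78 fix restored:

```python
from dataclasses import dataclass

@dataclass
class PublishEvent:
    # Hypothetical telemetry record; these field names are illustrative only.
    output_type: str        # e.g. "Web", "LMS", "Video"
    completed: bool         # whether the publish finished successfully
    canceled_by_user: bool  # whether the author intentionally canceled

def publish_failure_rate(events: list[PublishEvent]) -> float:
    """Failed publishes divided by total attempts.

    Intentional cancellations are excluded from the attempt count here
    (an assumption); before the Update 78 fix, some genuine failures were
    misclassified as cancellations, which understated the rate.
    """
    attempts = [e for e in events if not e.canceled_by_user]
    if not attempts:
        return 0.0
    failures = sum(1 for e in attempts if not e.completed)
    return failures / len(attempts)

events = [
    PublishEvent("Web", completed=True, canceled_by_user=False),
    PublishEvent("LMS", completed=False, canceled_by_user=False),
    PublishEvent("Video", completed=False, canceled_by_user=True),  # canceled, not a failure
    PublishEvent("Web", completed=True, canceled_by_user=False),
]
print(f"{publish_failure_rate(events):.1%}")  # 1 failure out of 3 counted attempts
```

Whether cancellations belong in the denominator is a design choice; what matters, as the Update 78 story shows, is classifying each outcome correctly before dividing.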
For example:

- Your computer can run out of disk space.
- Storyline might be contending with another process to read or write a given file on disk. Background processes that scan or copy files, such as anti-malware or cloud backup services, often cause this kind of contention.
- An intermittent operating system issue could cause a disk I/O operation to fail unexpectedly, as was the case with a Windows update that rolled out in the spring of 2023.

Disk I/O failures are often outside Articulate’s control and might require your help to resolve. For example:

- If you’re running out of disk space, you’ll need to free some up in order for Storyline to successfully publish your course.
- If Storyline encounters contention with another process when trying to read or write a file, sometimes waiting a split second and trying again works. Other times, a particularly stubborn scanning service might prevent us from accessing a file for a long time. In these cases, you may need to stop the service temporarily or exclude certain files and folders from being scanned.

But how do you, as the author, figure out why publishing is failing? Unfortunately, right now there’s no easy way to do that. That’s where this next section comes in.

Next Steps

We know how frustrating it is when Storyline fails to publish your course. And that frustration is only compounded by the fact that the error report dialog tells you something went wrong but doesn’t say why or how to fix it. That’s why we’re working to redesign this experience with an eye toward providing you with more context about why the publish operation might have failed. This context will include whether the failure might be something the author could resolve themselves (e.g., freeing up disk space), as well as the specific scene and slide that were being published when the failure occurred.

Of course, not all errors that happen during publishing will require the author to intervene.
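The “wait a split second and try again” tactic mentioned above is a standard way software works around transient file contention. This is not Storyline’s actual code (Storyline is a Windows desktop application), just a minimal Python sketch of the general pattern:

```python
import time

def write_with_retry(path: str, data: bytes, attempts: int = 5, delay: float = 0.1) -> None:
    """Write a file, retrying a few times with a short pause between attempts.

    A transient lock held by another process (e.g., an anti-malware scanner)
    often clears within milliseconds, so a brief retry succeeds. A stubborn
    lock exhausts the retries, and the original error is re-raised so it can
    be reported to the user.
    """
    for attempt in range(attempts):
        try:
            with open(path, "wb") as f:
                f.write(data)
            return
        except OSError:  # includes PermissionError (sharing/lock violations)
            if attempt == attempts - 1:
                raise  # give up and surface the failure
            time.sleep(delay * (attempt + 1))  # back off a little more each time
```

The backoff schedule and retry count here are arbitrary; the key idea is that retrying distinguishes momentary contention from a persistent block that needs the author’s intervention.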
Many of these errors represent legitimate bugs in Storyline that we need to fix. We’re currently addressing the top 10 such errors to help ensure you can publish your projects successfully.

Quality Metrics

Application Error Rate

The application error rate measures how often Storyline 360 displays the “Articulate Storyline Error Report” dialog. We track this data for both Storyline 360 sessions—unique instances of opening and then later closing Storyline 360—and users. Our goal is to get this metric under 1% for both sessions and users. As we entered Q1 of 2024, the application error rate for Storyline sessions was hovering at around 1.15%, and it has continued to hold steady there.

[Line chart showing Storyline 360’s application error rate per session for updates 81 through 87.]

When I first reported the application error rate by user for Update 84 last quarter, it was around 10%; it has since risen closer to 20%. While the session-level application error rate tends to stabilize the longer a given release is available, the user-level rate often climbs. That’s because as more people adopt a new update, the chance that any given user encounters at least one unexpected error increases. We’re working hard to address unexpected errors in Storyline to improve this metric.

[Line chart showing Storyline 360’s application error rate per user for updates 81 through 87.]

Downgrades

This metric tracks how often a Storyline 360 user updates to a new version of the application—only to downgrade later to an earlier version. We interpret downgrades as an indication that authors encountered issues in a new version that prevented them from completing their work. Last year, we saw this metric dip below 1% at the end of the second quarter and remain there through the middle of the third quarter. Since then, it has climbed and seesawed between 1% and 1.6%, with Update 84 dipping under 1%.
While we’re happy to have reduced this number from the 2% we saw last year, we want to understand better why our customers downgrade. We’ll be working to clarify this in the coming months.

[Line chart showing Storyline 360’s downgrade percentage for updates 81 through 87.]

Defect Rate

This metric tracks the percentage of open support cases associated with an unfixed bug. An increase in this number is a signal that our support team is spending time fielding bug reports instead of helping customers get the most out of our products, so our goal is to keep this value below 10%. This metric has been below the 10% threshold for quite some time now, but we will continue to monitor it closely and take action if it suddenly spikes due to an influx of reports about a bug. We rely on support cases to direct our bug-fixing efforts, so I encourage you to contact our support team if you’re experiencing issues with Storyline 360.

[Line chart showing Storyline 360’s defect rate for October 2023 through April 2024.]

Incomplete Sessions

This metric tracks how often Storyline 360 quits unexpectedly due to an error. Our goal is to keep it under 1%. The Storyline team spent Q1 focused on improving this metric as much as possible. It’s been hovering around 3.5% since Q4 of last year, and our efforts to improve it have been slow to pay off. Every single issue that contributes to this metric requires a lot of time to investigate and fix, and each fix yields only a marginal improvement to the overall rate. I reduced the range of values on the vertical axis of this chart to help illustrate the progress that was made last quarter. At the beginning of Q1, we saw a rate of ~3.8%; as we headed into Q2, it had dropped to ~3.3%, an improvement of 0.5 percentage points. We have had to direct some of our quality efforts into other areas recently, but we will continue to monitor this closely and revisit it in the future.
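Before wrapping up, one aside on the session-versus-user gap noted under Application Error Rate: even a low per-session error rate compounds as a user accumulates sessions. A quick sketch, assuming for illustration that sessions are independent and the per-session rate is constant:

```python
def p_at_least_one_error(per_session_rate: float, sessions: int) -> float:
    # P(at least one error) = 1 - P(no error in any session),
    # assuming independent sessions (an illustrative simplification).
    return 1 - (1 - per_session_rate) ** sessions

# At roughly the ~1.15% per-session rate reported above:
for n in (1, 10, 20):
    print(n, "sessions:", round(p_at_least_one_error(0.0115, n), 3))
```

Under these simplifying assumptions, about 10 and 20 sessions per user yield user-level rates near 11% and 21%, which is broadly consistent with the climb from ~10% toward ~20% described above as adoption of an update matures.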
[Line chart showing Storyline 360’s incomplete session percentage for updates 81 through 87.]

Wrap-Up

We’re currently focused on improving the publishing failure metric. I’ll provide an update on our progress in next quarter’s article. In the meantime, if there are any topics you’d like to see covered in these quality updates, please reach out: jtaber@articulate.com.