Published on October 25, 2024
Hello E-Learning Heroes community members! I’m Jason Famularo, a Senior Software Engineer on the Storyline 360 team. One of my responsibilities is improving the quality of the Storyline 360 application, and that’s perfect because I’m here to give the Q3 Storyline quality update. This month, I’d like to spotlight the team’s quality process, which we use to validate new versions of Storyline 360 before they are released. As always, we’ll also review our quarterly Quality Metrics.
Spotlight on Storyline 360 Quality Procedures
Before we dive in, let’s ask, “What exactly is quality?” To quote the IEEE, “Software quality is a complex topic,” but in short, it describes the “reliability, usability, performance, and security” of a software application. All software teams want their products to function correctly, be easy to use, perform well, and certainly be secure. If a team doesn’t have a quality process defined, low-quality software will result.
When I started here at Articulate, my manager, Jesse Taber, told me that “everyone owns quality.” This is a crucial mantra, because the focus on quality happens during each step of the process as a feature is designed, implemented, changed, tested, and maintained.
So how do we ensure high-quality software on the Storyline 360 team?
Storyline 360 Quality Process Overview
The team’s quality process includes the following steps:
- Automated testing during development
- Acceptance testing when a feature is complete or updated or a bug is fixed (aka QA)
- Daily internal build testing and validation
- Weekly private beta releases
- Final release validation
- Post-release monitoring
Let’s look at each step in more detail!
Automated Testing
Storyline 360 utilizes two different automated testing strategies: unit testing and integration testing. Unit testing is additional code that tests small, individual components of the application. This very powerful tool alerts software engineers when a small, seemingly unrelated change introduces a bug. Articulate has over 15,000 unit tests! While that sounds like a lot, each one runs in a fraction of a second, so the total time needed is just ten to fifteen minutes.
Unit tests cover a wide gamut of functionality in Storyline 360, such as creating, opening, and saving project files, adding and editing content in the project, publishing courses, and creating course output.
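To make the idea concrete, here is a minimal sketch of a unit test in Python. The `Project` class and its API are hypothetical, invented purely to show the shape of a test; Storyline's actual code is far richer:

```python
import unittest

# Hypothetical stand-in for one small, isolated component of the application.
class Project:
    def __init__(self):
        self.slides = []

    def add_slide(self, title):
        if not title:
            raise ValueError("slide title is required")
        self.slides.append(title)

    def slide_count(self):
        return len(self.slides)

# Each test exercises one behavior in isolation and runs in a fraction of a second.
class ProjectUnitTests(unittest.TestCase):
    def test_adding_a_slide_increases_the_count(self):
        project = Project()
        project.add_slide("Welcome")
        self.assertEqual(project.slide_count(), 1)

    def test_empty_titles_are_rejected(self):
        project = Project()
        with self.assertRaises(ValueError):
            project.add_slide("")
```

Run with `python -m unittest`; a failing assertion immediately points the engineer at the behavior their change broke.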
Integration testing is a layer above unit testing. This style of testing is focused on testing modules and components with each other. This ensures that the application as a whole works well with itself. A change in one module might have a downstream impact on others. For example, the new AI Assistant features in Storyline introduced an AI Assistant side panel. Our integration tests can catch an issue where the new AI Assistant side panel might introduce a bug with the Triggers or Comments panel, which we might have otherwise overlooked.
Before a software engineer’s changes can be permanently added into the codebase, all unit tests must pass. Depending on the complexity of the change, we may also run integration tests at this time. If we don’t run them here, they always run at least once daily due to how long they take to complete.
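The panel interaction described above, where a new AI Assistant side panel must coexist with the existing Triggers panel, can be sketched as an integration-style test. Everything here is hypothetical Python invented for illustration, including the one-visible-panel rule:

```python
# Hypothetical sketch of two UI modules interacting through one registry.
class Panel:
    def __init__(self):
        self.visible = False

class PanelRegistry:
    """Invented rule for illustration: only one side panel is visible at a time."""
    def __init__(self):
        self._panels = {}

    def register(self, name, panel):
        self._panels[name] = panel

    def open(self, name):
        for panel in self._panels.values():
            panel.visible = False
        self._panels[name].visible = True

# An integration-style test: the new module must not break the existing one.
def test_ai_assistant_panel_coexists_with_triggers_panel():
    registry = PanelRegistry()
    triggers, assistant = Panel(), Panel()
    registry.register("triggers", triggers)
    registry.register("ai_assistant", assistant)

    registry.open("ai_assistant")  # exercise the new feature
    registry.open("triggers")      # the existing feature must still work

    assert triggers.visible and not assistant.visible
```

Unlike a unit test, this test only has value because two modules share state; a bug in how the new panel registers itself would surface here, not in either module's isolated tests.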
Acceptance Testing
Along with our automated tests, we use acceptance testing. Before a software engineer delivers their work, they write specific steps that cover how their work should be tested, including both obvious and not-so-obvious ways the code may break. This will also cover the steps they took to test the code and any other areas that should be retested as a result. The software engineer then assigns their changes to another engineer for code review and testing. QA engineers monitor each change as well and assess the need for additional testing.
For higher-risk changes, a QA engineer is assigned to do additional testing to mitigate that risk. The assigned engineers execute the test steps written by the authoring engineer and also perform additional testing they deem necessary. Edge cases are exercised at this time, and the tester attempts to use the feature in every way possible: with just a mouse and with just a keyboard, in various languages, undoing and redoing, using alternate avenues of accessing features—such as right-clicking—and just generally trying to find scenarios the author may have missed in their testing.
The tester documents how they exercised the functionality. If issues are found, the authoring engineer fixes them, and the process repeats. Once the changes are tested satisfactorily, they are merged into the codebase. Once this is done, integration tests are run automatically. If anything fails, the engineer determines why and provides fixes immediately.
A majority of bugs and issues are found during this stage.
Internal Testing
Once a day, usually overnight, a new version is created and given to Articulate employees for early access and testing. Other teams use these internal builds to access unreleased features and verify bug fixes. Other companies may call this process alpha testing or dogfooding, and while we love creatures with paws here at Articulate, we just call it internal testing.
Any crashes that happen in this version are logged and sent to the Storyline 360 team, where a software and QA engineer review and determine the course of action to take. We’ll usually reach out to our coworkers who experienced the crash and gather more information.
Private Beta
In addition to internal testing, we have a private beta for customers like you! We release new builds weekly after testing has been completed and we are confident that the beta version is stable. Private beta releases don't get as much internal scrutiny as major releases, so bugs or other issues may surface—if you do find an issue, you’ll have a direct line to the engineering team (or, more accurately, we will have a direct line to you and will be in touch if you run into issues or crashes).
We actively monitor feedback and telemetry data from the private beta to ensure the upcoming release behaves as expected.
If you are interested in participating in the private beta, please email us at beta@articulate.com. We receive great feedback from our users who are in the beta and are truly thankful for their efforts in helping us make a better product for everyone!
Final Release Candidate Validation
Storyline 360 targets a release cadence of once a month, though a cycle sometimes stretches to five or six weeks. The final week of the release cycle is called Overview Week and consists of a team-wide effort to find and stamp out any remaining issues. We create a release candidate build 7 to 10 days before we release it to users and use that for the Overview Week testing.
The first step of Overview Week is reviewing all the code changes included in the next release. Three senior engineers tackle this task and identify risky changes that require additional testing or focus.
Once the week begins, all changes made to the release are subject to a full regression test. The development team divvies up new features and bug fixes, retesting each change to catch anything missed the first time and to ensure it works with all the other changes in the release.
While that happens, the QA team does end-to-end testing of all new features along with testing the output of published courses to ensure the new changes don’t break existing functionality. We maintain a list of key features and tricky scenarios and retest them in various web browsers.
As the week goes on, the team creates courses that mimic features our customers use to create their amazing courses. For example, tabs interactions are a popular way for course authors to create dynamic and compelling content, and we make a course featuring tabs to ensure things are still working. Background audio, a way to give your course a soundtrack, is another example of a popular feature we exercise during our Overview Week.
If an issue is found at any point, software and QA engineers review, resolve, and retest it.
The final release usually happens on a Tuesday morning (in the United States).
Post-Release Monitoring
About 15% of Storyline 360 active users install a new release within the first week of its availability. By the end of the second week, about 30% of active users have installed the update. If there are any major issues in a release, we typically find them in the first two weeks, before the majority of users have adopted the release.
The support and QA teams monitor incoming bug reports and work quickly to determine the cause and plot a course of action. The QA engineers also monitor the feedback users leave when they downgrade to a previous version of Storyline 360. This feedback often contains insight about new issues and helps us expedite fixes.
Software engineers monitor telemetry data, and the support team monitors support cases and forums. If anything looks problematic, we address it as soon as possible. If the issue is particularly gnarly, we do a service release, which is a new release with critical bug fixes.
Summary
The Storyline 360 team follows a multistep process for each release to ensure product quality. We know it’s frustrating when issues creep into the release, so we are constantly working to improve our methodologies and to catch things as soon as possible.
Quality Metrics
Let’s review our quarterly quality metrics!
Application Error Rate
The application error rate measures how often Storyline 360 displays the “Articulate Storyline Error Report” dialog. We track this data for both Storyline 360 sessions—or unique instances of opening and then later closing Storyline 360—and users. Our goal is to get this metric under 1% for both.
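The session versus user distinction can be sketched with a few lines of Python. The telemetry record shape here is invented for illustration; it is not Storyline's actual schema:

```python
# Hypothetical telemetry records: (user_id, session_id, saw_error_dialog)
telemetry = [
    ("u1", "s1", False),
    ("u1", "s2", True),
    ("u2", "s3", False),
    ("u3", "s4", False),
]

def session_error_rate(records):
    # Share of sessions in which the error dialog appeared at least once.
    errored = sum(1 for _, _, saw_error in records if saw_error)
    return errored / len(records)

def user_error_rate(records):
    # Share of users who saw the error dialog in at least one of their sessions.
    users = {user for user, _, _ in records}
    errored_users = {user for user, _, saw_error in records if saw_error}
    return len(errored_users) / len(users)

print(f"session rate: {session_error_rate(telemetry):.0%}")  # session rate: 25%
print(f"user rate: {user_error_rate(telemetry):.0%}")        # user rate: 33%
```

Note that the user-level rate comes out higher: a user with many sessions needs only one bad session to count, which is also why that rate tends to climb as adoption of a release grows.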
The application error rate for Storyline sessions has been hovering around 1.15% for the past six months and has continued to hold steady.
Line chart depicting the Storyline application session error rate from update 84 through update 90. The Y axis is the error rate percentage and the X axis is Storyline updates.
The data points indicate:
- Update 84 released in January 2024 was 1.16%.
- Update 85 released in February 2024 was 1.12%.
- Update 86 released in March 2024 was 1.15%.
- Update 87 released in April 2024 was 1.13%.
- Update 88 released in May 2024 was 1.17%.
- Update 89 released in June 2024 was 1.18%.
- Update 90 released in July 2024 was 1.15%.
The graph below tracks the percentage of users who encounter an error in a release. When Jesse first reported the application error rate by user for update 84 last year, it was around 10%, but it has since risen to closer to 20%. While the session-level application error rate tends to stabilize the longer a given release is available, the user-level rate often climbs. That's because as more people adopt a new update, the chance that any given user encounters at least one unexpected error grows. We're still working hard to address unexpected errors in Storyline to improve this metric. It's been a focus for the past six months, and we are starting to see the results of our work.
Line chart depicting the Storyline application user error rate from update 84 through update 90. The Y axis is the error rate percentage and the X axis is Storyline updates.
The data points indicate:
- Update 84 released in January 2024 was 19.49%.
- Update 85 released in February 2024 was 19.76%.
- Update 86 released in March 2024 was 21.17%.
- Update 87 released in April 2024 was 19.37%.
- Update 88 released in May 2024 was 18.06%.
- Update 89 released in June 2024 was 18.95%.
- Update 90 released in July 2024 was 15.66%.
Downgrades
This metric tracks how often a Storyline 360 user updates to a new version of the application—only to downgrade later to an earlier version. We interpret downgrades as an indication that authors encountered issues in a new version that prevented them from completing their work.
Last year, we saw this metric dip below 1% at the end of the second quarter and remain there through the middle of the third quarter. Since then, it has climbed and seesawed between 1% and 2%, with a recent spike back to 2%. This spike stems from issues introduced as we work to remove old third-party dependencies in our Modern Player (so we can keep it Modern 😉). We've addressed these issues with a series of service releases and expect downgrades to return to lower levels in the coming months. The feedback we collect when users downgrade to a previous version of Storyline 360 was especially helpful here.
Line chart depicting the Storyline downgrade rate from update 84 through update 90. The Y axis is the downgrade percentage and the X axis is Storyline updates.
The data points indicate:
- Update 84 released in January 2024 was 1.45%.
- Update 85 released in February 2024 was 1.25%.
- Update 86 released in March 2024 was 1.93%.
- Update 87 released in April 2024 was 1.08%.
- Update 88 released in May 2024 was 0.84%.
- Update 89 released in June 2024 was 1.37%.
- Update 90 released in July 2024 was 1.99%.
Defect Rate
This metric tracks the percentage of open support cases associated with an unfixed bug. An increase in this number is a signal that our support team is spending time fielding bug reports instead of helping customers get the most out of our products, so our goal is to keep this value below 10%.
This metric has been below the 10% threshold for quite some time now, but it's approaching that threshold again. This, too, is due to the Modern Player dependency removal discussed in the previous section. We've addressed many of these issues and will continue to do so as new ones come in.
We rely on support cases to direct our bug-fixing efforts, so I encourage you to contact our support team if you’re experiencing issues with Storyline 360.
Line chart depicting the Storyline defect rate from January 2024 through July 2024. The Y axis is the defect rate percentage and the X axis is the month.
The data points indicate:
- January 2024 had a defect rate of 8.10%.
- February 2024 had a defect rate of 5.50%.
- March 2024 had a defect rate of 6.24%.
- April 2024 had a defect rate of 8.45%.
- May 2024 had a defect rate of 5.48%.
- June 2024 had a defect rate of 6.55%.
- July 2024 had a defect rate of 9.50%.
Publishing Failures
This metric tracks the number of users who get an error during publishing. If you enjoy reading our engineering journals and have a great memory, you may recall this number was around 4.25% last quarter. What happened?! In short, we realized that we were unfairly penalizing ourselves by counting attempts to publish that were canceled by the author.
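The correction can be sketched as follows. The outcome labels and log shape are invented for illustration; the point is simply that author cancellations drop out of both the numerator and the denominator:

```python
# Hypothetical publish-attempt log: one outcome per attempt.
attempts = ["success", "canceled", "failure", "success", "canceled", "success"]

def publish_failure_rate(outcomes):
    # Author-canceled attempts are neither successes nor failures,
    # so they are excluded entirely.
    counted = [o for o in outcomes if o != "canceled"]
    failures = sum(1 for o in counted if o == "failure")
    return failures / len(counted)

def naive_failure_rate(outcomes):
    # The earlier approach: anything that wasn't a success counted as a failure,
    # which unfairly penalized the metric for cancellations.
    not_success = sum(1 for o in outcomes if o != "success")
    return not_success / len(outcomes)

print(f"corrected: {publish_failure_rate(attempts):.0%}")  # corrected: 25%
print(f"naive: {naive_failure_rate(attempts):.0%}")        # naive: 50%
```

With cancellations excluded, the same log produces a much lower (and more honest) rate, which is why the reported number changed so sharply from last quarter.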
We’ve made a significant effort over the past six months to address most of the top publishing failures. While we haven’t seen a drop in these error percentages yet, we expect to see one because adoption of a new version of Storyline 360 often takes a few months.
Release 92 introduced redesigned publishing success and failure dialogs. If publishing fails with a problem the user can address, the dialog explains what happened and the steps to resolve it; running out of disk space is one example. Additionally, if publishing fails on a particular slide, we call out that slide so you can look at it and figure out what went wrong, be it a recent change or a complex feature. We want you to get unblocked as soon as possible!
We’ve got some work left to do to reach our goal of less than 1%.
Line chart depicting the Storyline publishing failures from update 84 through update 90. The Y axis is the publish failure percentage and the X axis is Storyline updates.
The data points indicate:
- Update 84 released in January 2024 was 1.80%.
- Update 85 released in February 2024 was 1.85%.
- Update 86 released in March 2024 was 2.10%.
- Update 87 released in April 2024 was 1.89%.
- Update 88 released in May 2024 was 1.88%.
- Update 89 released in June 2024 was 1.85%.
- Update 90 released in July 2024 was 2.10%.
Incomplete Sessions
This metric tracks how often Storyline 360 quits unexpectedly due to an error. Our goal is to maintain this metric under 1%.
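One common way to measure this, sketched here with an invented event schema rather than Storyline's actual telemetry, is to compare session-start events against clean-shutdown events:

```python
# Hypothetical event log: each session writes a "start" event, and a
# "clean_exit" event only if the application shut down normally.
events = [
    ("s1", "start"), ("s1", "clean_exit"),
    ("s2", "start"),                        # crashed or killed: no clean exit
    ("s3", "start"), ("s3", "clean_exit"),
    ("s4", "start"), ("s4", "clean_exit"),
]

def incomplete_session_rate(log):
    # A session that started but never recorded a clean exit is incomplete.
    started = {sid for sid, kind in log if kind == "start"}
    completed = {sid for sid, kind in log if kind == "clean_exit"}
    return len(started - completed) / len(started)

print(f"incomplete: {incomplete_session_rate(events):.0%}")  # incomplete: 25%
```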
The Storyline team spent Q1 focused on improving this metric as much as possible. The percentage peaked at 3.8% at the end of Q4 last year, and those efforts have made a real difference: the percentage has steadily dropped this year. We will continue to monitor this metric closely as we work toward our goal.
Line chart depicting the Storyline incomplete sessions from update 84 through update 90. The Y axis is the incomplete session percentage and the X axis is Storyline updates.
The data points indicate:
- Update 84 released in January 2024 was 3.86%.
- Update 85 released in February 2024 was 3.60%.
- Update 86 released in March 2024 was 3.56%.
- Update 87 released in April 2024 was 3.25%.
- Update 88 released in May 2024 was 3.03%.
- Update 89 released in June 2024 was 3.03%.
- Update 90 released in July 2024 was 2.94%.
Wrap-Up
Due to the lag in the number of users upgrading Storyline each month, we often see that our efforts take months to materialize in these metrics. It can be worrisome as we wonder, Did all that work do anything?! But as we see with reducing the number of incomplete sessions, that work does matter and eventually shows up in the quality statistics we track.
We spent the past six months focusing on the publishing failure metric. As the bulk of that work is in the more recent updates, we should see improvements in those metrics soon. If we don’t, we’ll revisit publishing failures and resolve troublesome bugs quickly!
In the meantime, if there are any topics you’d like to see covered in these quality updates, please reach out to the team at storyline-engineering@articulate.com.
Check out all the latest features and enhancements to Articulate 360 apps.