Help us improve Storyline 360 & Accessibility

Apr 10, 2024

Hi, E-Learning Heroes community members!

I'm Ronnie Pilman, Senior QA Engineer and Accessibility Lead for Storyline 360 at Articulate, here to give you an update on accessibility—and ask you for a favor.

Let’s start with the update and a refresher on our goals. Our mission is to move accessibility in Storyline 360 beyond Web Content Accessibility Guidelines (WCAG) standards. We want all learners—regardless of ability—to be able to navigate Storyline 360 content effectively with any web browser, device, or assistive technology. And we want to make creating accessible courses easier for authors.

To do that, we are continually improving our products to address accessibility issues and make enhancements. Most recently, we added customization options for closed captions to ensure they’re easy to see and don’t obstruct critical content. We also made all the modern player controls compliant with the latest WCAG 2.2 guidance and fixed a bug that was preventing learners from using the ESC key to close player tabs and the accessibility settings menu. (Dive into the complete list of enhancements and fixes for all the details.)

That leads us to the ask. As we plan for future Storyline 360 improvements, we need your feedback to help guide our next steps. What areas of accessibility do you think require the most attention? How can we serve you and your learners better? To let us know, please complete this short survey: https://research.articulate.com/storylinea11y.

The survey will be open until May 7, 2024. We’d love to hear from you before then.

 

Thanks,
Ronnie Pilman (He/Him/His)
Senior QA Engineer II
CPACC

18 Replies
Sam Hill

Looking forward to the outcomes from this survey. I think SL has come a long way in recent years, but it's still quite a long way behind what is possible with hand coding. I'd love to get to the stage of testing SL content without having to make allowances because it was built in SL360. I'd like to be as tough on SL360 content as I am on hand-coded content.

I think one major enhancement could be providing the ability to group elements into containers that could then have ARIA attributes, such as aria-live, added to them.

Being able to programmatically change the screen reader focus would be a big leap for developers and would make it possible to give learners a good experience when reading the content with a screen reader.
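For context, here's a minimal sketch of what hand-coded content can already do and what's being asked for here. The helper names are mine, not anything in Storyline; the underlying techniques (aria-live regions and moving focus via tabindex="-1") are standard web APIs:

```javascript
// Hypothetical helpers illustrating the two standard web techniques.

// An aria-live region: screen readers automatically announce text inserted
// into it, without moving the user's focus.
function liveRegionHTML(message) {
  // role="status" implies aria-live="polite"; "assertive" is for urgent updates
  return `<div role="status" aria-live="polite">${message}</div>`;
}

// Programmatically moving screen reader focus: give the target element
// tabindex="-1" so it is focusable from script, then call .focus() on it.
function moveFocusTo(el) {
  el.setAttribute("tabindex", "-1");
  el.focus();
}
```

An authoring tool that exposed equivalents of these two primitives (live containers, plus a "set focus to object" trigger) would close much of the gap described above.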

Bill Dane

Actually, I don't mind the Focus Order controls so much. The positive is that once you set the focus order, you know for certain how the slides are read with a screen reader. My biggest complaint is the inability to select multiple lines (using the Shift and Ctrl keys) and move them more easily. I have also noticed some inconsistency with image alt tags: sometimes they show in the Focus Order list and sometimes they don't.

Bill Dane

One thing I know videos in Storyline are in violation of is the requirement for audio descriptions.

Storyline's closed captioning editor is one of the best I've ever seen. One thing that would be infinitely helpful is the ability to select a range of captions and move them all at once. Say, for instance, you edit the audio in the video to add or remove a phrase, and now you must move several minutes of captions a few seconds one way or the other; that's quite painful as it stands now.
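Until something like that exists in the editor, one workaround, assuming you can get the captions out as a WebVTT file, is to shift every timestamp with a small script. A minimal sketch (the function name is mine, not a Storyline feature):

```javascript
// Shift every WebVTT timestamp (hh:mm:ss.mmm) by a fixed offset in seconds.
function shiftVtt(vtt, offsetSeconds) {
  return vtt.replace(/(\d{2}):(\d{2}):(\d{2})\.(\d{3})/g, (_, h, m, s, ms) => {
    let total = (+h * 3600 + +m * 60 + +s) * 1000 + +ms + offsetSeconds * 1000;
    if (total < 0) total = 0; // clamp rather than emit negative times
    const hh = String(Math.floor(total / 3600000)).padStart(2, "0");
    const mm = String(Math.floor((total % 3600000) / 60000)).padStart(2, "0");
    const ss = String(Math.floor((total % 60000) / 1000)).padStart(2, "0");
    const mmm = String(total % 1000).padStart(3, "0");
    return `${hh}:${mm}:${ss}.${mmm}`;
  });
}
```

The same idea works for SRT files if you adjust the regex for comma-separated milliseconds.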

Steve Blackwell

Well, we have just got SL360, having been users of SL3 for some years now (and Studio before that).

Shame the survey is now closed.  We're users of SL for the Player shell but import published Adobe Captivate simulations into slides.

I was just using some free time on a Friday afternoon to look into Text-to-Speech and Closed Captions and wondered if they were both available for the hint boxes in SL simulations.

I think we have never moved off Captivate because of the method of screen capturing in SL. As I recall, all the captures are in the SL Story window, and I think they also stay within the player and can't pop out to full screen.

If text-to-speech does work with the hint boxes, closed captions could replace hints, but only if you had freeform control to move the captions around each slide when necessary and the sims could pop out to full screen. That would be a winner for us.

Nom de Guerre -

Dear Articulate Staff,

You guys make one of the best products, if not "THE BEST" product, on the market. That said, I am writing with this concern as I am visually impaired, and it appears that the people who tested the product for accessibility do not have any impairments.

I state the above because, as you will see in the attached screenshot, the scale numbering in the Edit Audio dialog box is gray on white, an extremely poor choice if the audience is visually impaired. Gray on a white background does not provide the color contrast discrimination that black would, as seen in the stage/slide timeline.

If you guys could please make the update in the not-too-distant future, I (and, I am sure, anyone else with a visual impairment) would be much obliged. Thank you, and I'm looking forward to the update.

 

Kind regards.

Happy (sometimes not so happy) User

Cary Glenn

From the development side, the focus order can be difficult to deal with, especially when you have slides with multiple layers. Some kind of option to organize the layers would be greatly appreciated.
Ideally, there would also be some kind of built-in accessibility checker/reminder about alt text for images, etc.

In doing accessibility testing, I found that the text entry boxes do not have enough contrast for people who use screen magnification aids. This could be fixed by adding a border around the text entry field.

Cary Glenn

I was just thinking some more about this. If you really want to up your accessibility credibility, Articulate should either deprecate the built-in features that aren't accessible, like matching and sequencing (anything that requires a mouse), or include a warning that the activity isn't accessible.

Cary Glenn

Hi, 
I'm happy to help. Learners with visual impairments can use screen readers like JAWS, NVDA, or VoiceOver. Some may use other aids, like screen magnification. Screen readers aren't perfect, and they rely on designers and developers to think about their needs.

Consider a matching question with four options. A learner with a visual impairment has to keep the stem of the question in mind. Then they have to tab (or arrow) down the left-hand column of options to hear them. Next, they have to tab (or arrow) down the right-hand column to hear that one. They have to keep all eight options in mind and try to match them up. Then they have to go back to the left-hand column (using the keyboard commands again), find the option they want to select first, and select it with the Return key (usually). Now they have to go to the right-hand column and select the matching item, and repeat as necessary. They also have to keep in mind which responses they have already matched up. I don't actually know what they do if they want to change an answer. You might want to try using NVDA (it's free) or VoiceOver on a Mac and navigate a course with your eyes closed to get an idea of the challenge.

Matching, sequencing, etc. types of questions add a massive cognitive load for learners with disabilities. If they have to use most of the processing power to keep the mechanical features of a question in mind they have little left to answer the question. 

I recently wrote a lessons-learned document about accessibility for a company that had hired third-party testers who were visually impaired. It is humbling and distressing to watch someone spend 20 minutes just to find the start button on a course, and that was just one example. Most times they had better results, but it made me realize that even the best companies struggle with creating learning experiences that are barrier free.

Sam Hill

Great explanation, Cary. A good exercise is to turn off your monitor and try to complete the interaction blind, using only a screen reader and keyboard.

Even when you have the bias of being familiar with the visual layout of the interaction, you can quickly appreciate the cognitive load required to complete some interactions and recognise that the design isn't optimised for SR accessibility.

That cognitive load makes some interactions much more difficult to complete.

Bill Dane

Thanks, Cary. I've been creating compliant content for the past 20+ years, but it's always good to get another perspective. A good friend of mine, who is blind and worked in the accessibility lab at the Dept of Education, first gave me a glimpse of the challenges people with disabilities face. He told me that there's a difference between compliant and usable. In this case, the matching exercise is indeed compliant because you can actually navigate and complete it, but as you said, the "cognitive load" (I like that phrase) makes it fairly unusable.

I don't use any matching in the knowledge checks in our courses; we stick with mostly true/false, multiple choice, and select-all-that-apply. I also have another question type that uses columns and rows with radio buttons and is fairly easy to navigate. My general rule is that if you have to use a mouse or a monitor, it's no bueno.

BTW, if you'd like to share your lessons learned doc here, that would be great. I'm sure the Articulate developers would like to read it too.

Ronnie Pilman

Hey, everyone! I'd like to thank all of you who participated in the survey. We collected responses from more than 100 people, which is amazing! The survey closed on May 7th, so I've begun reviewing the survey results and feedback. When I'm done, I'll share a summary in a separate article. I appreciate your patience while I work on this. Thanks again!