Forum Discussion

Aïda_Domínguez
Community Member
2 months ago

Organising project phases

Hi there!

I'm curious to learn how others organise their workflow when creating e-learning content. I personally find the final revision kind of challenging: it's the stage where I need to check that everything in the content is correct and properly adjusted. I was thinking of creating a checklist to use as support, but I don't really know what kind of format would actually be useful... any ideas? How do you usually check that everything is perfect right before delivery?

20 Replies

  • IngaBryant
    Community Member

    I usually do an ALPHA review with the SME group -- usually managers and experts from target audience groups -- to check the content: is it applicable to all regions, departments, etc.? Are we targeting the right issues the right way? Is it correct? Is the grammar/syntax all right? For that I use Review360, where everyone can see the comments. Since we are a large organization that grew very fast, this very often leads to numerous discussions about processes and unification. (Which I welcome!)

    Then, once everything is ironed out, I have the Lead SME/Content Owner review the slide deck (a Word document for Storyline) or the Review site for Rise (I don't like the PDF document that gets generated from Rise).

    Once all is good, I upload the course to our LMS and run BETA testing. Depending on the course, I can have anywhere from 3-5 people (for some small courses) to 10-12 people for courses that go to 35k employees. This time, I prepare an email explaining what to look for (functionality, navigation, score/completion tracking, the ability to complete the quiz with the information provided) and send them a form with things to note/check (and provide feedback on), with columns ranging from "as is/image" to "what should be changed". Once all the beta forms are in, I review them with the SME to confirm which content-related changes should be applied.
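
    In case a structured view helps anyone picture it, here's roughly the shape of one row in that kind of beta form. This is just a sketch; the field names are placeholders I made up, not part of any tool or standard.

    ```typescript
    // Hypothetical sketch of one row in a beta-testing feedback form.
    // Field names are placeholders, not part of any tool or standard.
    interface BetaFeedbackRow {
      screen: string;        // slide or page where the issue was found
      area: "functionality" | "navigation" | "tracking" | "quiz" | "content";
      asIs: string;          // what the tester sees (or a screenshot reference)
      shouldBe: string;      // what the tester thinks it should be
      smeDecision?: string;  // filled in during the post-beta review with the SME
    }
    ```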

    It's a process... but I'd rather spend time up front and have a flawless course than have hiccups once over 30,000 employees are enrolled in the course and get stuck.

  • I'm a little late to the conversation, but this is a topic near and dear to me (As evidenced by the length of my response! 😁). My review strategy is an editorial/iterative approach. Asking for a comprehensive review of a final product at the end of a project is a very bad idea. There's no effective way to create a tool that will cover every aspect for every project type—it would be massive, and I have yet to encounter a teammate or client who would use it effectively (or at all). So, like an editor, my reviews are broken down to reflect the product's stage, and when I do a review, I make multiple passes at a product, each pass focused on a particular aspect (e.g., first grammar, then formatting, then overall flow, then overall design). This is particularly important for projects with multiple deliverables, developers, and when using a blended approach. I want quality and consistency not just for a single deliverable, but across all deliverables at all stages—a typo-riddled script distracts from getting effective feedback from a client and creates doubt as to the quality of the final product.

    Once I have my initial analysis, I meet with the team to create a style and standards guide, then have the client sign off on it. What are we creating, how are we creating it, and what kind of fonts, colors, images, graphics, buttons, interface will we use? Is there audio and if so, who's recording it or are we using SL voices (and which one)? Certain clients even have terminology preferences (e.g., client vs. sponsor; subject vs. patient) that go in the style/standards guide. Identify the standards, stick to 'em, review against 'em when you get to the relevant product phase.
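
    If it helps to picture it, that guide can even be one small structured file everyone reviews against. The example below is purely illustrative; every value in it is invented, not a recommendation.

    ```typescript
    // Illustrative only: a style/standards guide captured as data.
    // Every value below is an invented example, not a recommendation.
    const styleGuide = {
      fonts: { heading: "Open Sans Semibold", body: "Open Sans" },
      colors: { primary: "#1B3A5C", accent: "#E8A13D", background: "#FFFFFF" },
      narration: { source: "SL built-in voice", voice: "TBD with client" },
      terminology: [
        { preferred: "client", avoid: "sponsor" },
        { preferred: "subject", avoid: "patient" },
      ],
      buttons: { shape: "rounded", label: "sentence case" },
    };
    ```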

    Early phase reviews focus on the requirements outlined in the analysis phase and conducting a grammar/terminology/tense/voice check of the content, the flow, objectives, etc. Middle phase reviews focus less on the content and more on the delivery approach and design—once a project goes into development, I want my content locked down (just check for typos/grammar). I also run a check to ensure everyone is following the style/standards. When I can iterate and run a pilot, do the pilot reviewers find the format effective? Is there something about the content flow or interface that they find confusing? Late phase reviews are the "spit and polish" type: a final typo check and UX (audio and animations are synched; menus, links, buttons, triggers, state changes all work as they should; etc.). I like testing in SCORM Cloud if I'm developing for a client but don't have access to their LMS and need to check that the SCORM file is giving me the data the client needs.
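
    For that SCORM Cloud pass, a snippet along these lines can be pasted into the browser console while the course is running to spot-check what the package is reporting. It's a sketch that assumes a SCORM 1.2 package; SCORM 2004 exposes a different API object and element names.

    ```typescript
    // Walk up the frame hierarchy to find the SCORM 1.2 runtime API object,
    // then log the tracking values the client typically cares about.
    function findAPI(win: Window): any {
      while (!(win as any).API && win.parent && win.parent !== win) {
        win = win.parent;
      }
      return (win as any).API;
    }

    const api = findAPI(window);
    if (api) {
      console.log("status:", api.LMSGetValue("cmi.core.lesson_status"));
      console.log("score:", api.LMSGetValue("cmi.core.score.raw"));
      console.log("location:", api.LMSGetValue("cmi.core.lesson_location"));
    } else {
      console.log("No SCORM 1.2 API found - check the launch frame or SCORM version.");
    }
    ```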

    There are lots of other benefits to this approach, like always having content that meets the project objectives or keeping SMEs/stakeholders on task. Iterative reviews that match the phase allow reviewers to focus on specific items, so you get better feedback and fewer mistakes are overlooked. Sorry for the book-length comment, but I hope this helps!

    • Aïda_Domínguez
      Community Member

      Hi Stephanie, it definitely helps! As you said, reviews of final products can become a nightmare if there was no previous check. I think I'll try to focus on the whole creation process, so I can also implement an initial agreement on a style/standards guide. Also, if you have any comments on formats, they're more than welcome!
      Thanks a million 💗

  • AnaTirico
    Community Member

    Hi everyone! I’ve been following this thread and really appreciate all the insights—it’s super helpful to see how each person approaches final reviews.

    In my workflow, I also use separate checklists, and I’ve found that breaking things down helps me stay focused and catch more issues. Currently, I use three main checklists during the final phase:

    1. Content structure and grammar:
      This one focuses on reviewing the overall course structure, spelling, grammar, formatting, and making sure everything flows logically. I check for clear titles, instructions, and a consistent visual layout.
    2. Accessibility and responsiveness:
      Here I ensure the course is accessible for screen readers, has sufficient color contrast, alt text for images, and a logical tab order, and works well across different devices and screen sizes. (A quick console sweep, sketched right after this list, can automate part of the alt-text check.)
    3. Page length and transitions:
      This checklist helps me monitor the length of each screen—I try to avoid excessive scrolling to keep attention high. I also review hooks at the beginning and cohesion and coherence between sections, to make sure transitions feel natural and engaging.
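
    For anyone curious, the console sweep I mentioned under point 2 looks something like this. Run it in the browser dev tools on a published course page; it only flags missing or empty alt text, so treat it as a first pass rather than a full audit.

    ```typescript
    // Quick accessibility spot-check: list images on the current page whose
    // alt text is missing or empty. A first pass only, not a full audit.
    const flagged = Array.from(document.querySelectorAll("img")).filter(
      (img) => !img.hasAttribute("alt") || (img.getAttribute("alt") ?? "").trim() === ""
    );
    console.log(`${flagged.length} image(s) missing alt text:`, flagged);
    ```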

    If anyone uses similar lists or has templates to share, I’d love to exchange ideas! 😊

    • Aïda_Domínguez
      Community Member

      Wow! Super organised 💗 Noele was also commenting on how important it is to separate the "types" of feedback, so the idea of keeping separate checklists sounds great. Do you adapt them differently for Rise versus Storyline? I think the latter is way harder to review.

  • Globalec
    Community Member

    For managing e-learning projects, one of the first steps, I'd say, is creating a clear instructional script that defines the different multimedia elements that will be present; it must be clear to both stakeholders and the development team. This script will be used for the first revision and any later ones, but I think it's important to note that, as an iterative product, it will always be subject to updates (just as mobile apps are).

    • Aïda_Domínguez
      Community Member

      I was also thinking of listing, directly in the script, the elements that will be used in each slide (audio, images, videos, etc.), as I kind of need to review them separately (e.g., does the video play automatically or on click?).
      Do you have any specific format for your instructional script? I've crafted one myself but still need to test it.
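
      Something along these lines is what I had in mind, as a rough sketch (the field names are just placeholders from my draft):

      ```typescript
      // Rough sketch of one slide entry in an instructional script, with
      // per-element behavior noted so each asset can be reviewed separately.
      // Field names are placeholders, not any established format.
      interface SlideEntry {
        slide: number;
        onScreenText: string;
        narration?: string;
        media: Array<{
          kind: "audio" | "image" | "video";
          file: string;                  // placeholder path, filled in later
          trigger: "auto" | "on-click";  // e.g., does the video autoplay?
        }>;
        notes?: string;                  // open questions for SMEs/stakeholders
      }
      ```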

      • StephanieDiaz-a
        Community Member

        If I don't have some kind of source document (one commenter mentioned PPT slides), I'll create a storyboard document in Word. It's just a standard document with a table for each slide. I attached an example of something I used a few years ago, stripped out some branding, and you'll have to forgive the blocky style the stakeholders decided on for the project. 😉

        I'll use this when writing content with SMEs from scratch. It's a live document (comments/track changes) until it reaches final versioning, where I get an approval signature from a stakeholder; then it is versioned for a developer to use and becomes a live document again, since sometimes things need to change and it's nice to document where/how/why when you have a difficult stakeholder/client.

  • Aïda_Domínguez - I agree with JudyNollet here. It's definitely a matter of preference and the tools available to you.

    I've been in jobs where I created a checklist in a Word document and in a project management tool (such as Basecamp or Trello).

    For the list itself, I really thought about all the things I need to double-check before the course gets implemented - the link Judy shared is a great example! I can also see if I have an example of one of my lists, if that's something you'd be interested in seeing!

    • Aïda_Domínguez
      Community Member

      Hi! If you have any examples, they're more than welcome 💗 The more, the merrier.

  • ErinParks
    Community Member

    I think your QA partly depends on your workflow - I tend to do it regularly throughout the project so that the final testing/feedback approval doesn't feel so overwhelming.

    And yes, having a checklist helps. JudyNollet had quite a comprehensive list. One thing I might add is doing a visual check on every slide. Sometimes when I'm playing around with states, or motion paths, or what have you, I'll have moved an element out of place (even just slightly). So in a final revision I'll do a visual sweep. Sometimes I'll step back from the screen a bit or sort of squint for a different perspective.

    • Aïda_Domínguez
      Community Member

      The step back is more than necessary, yes! Just today a colleague and I were commenting on how often we get so focused on the screen/content that we can't see obvious mistakes in menus or players.

  • A lot of the e-learning I have been creating is derived from PowerPoint decks that were developed internally by some of the leaders here. I have mainly been using those as the base, but then mapping them out in an order that lends itself better to e-learning. As it's all PowerPoint, a lot of the content is getting updated to be more interactive and engaging, so I've taken a number of creative liberties. It then goes through a gauntlet of approvals and revisions. So TL;DR - I don't have a proper system; I use the content given to me as the base and then adjust as I see fit.

    • Noele_Flowers
      Staff

      This is super interesting—so basically what you're saying is that most of the actual feedback comes earlier in the needs analysis phase, vs after the course is built? 

      Does it ever come up that the course you end up building feels super different than the expectations, and how do you end up addressing that? 

  • Love this question, Aïda_Domínguez — and I think this could be a great place to see a wide range of commenters since, like JudyNollet mentioned, this is probably something that every project does slightly differently.

    But for me, when I'm launching a new project in this vein, I find it can be helpful to create a pretty prescriptive quality assurance form to make sure you're soliciting the right feedback. I find that when I'm more open-ended about the feedback I'm looking for, it may still be high quality, but it might not actually cover all the bases.

    It can be more helpful to direct people on the steps you want them to take and have them report on each step specifically—so, for example, have them respond separately about things like typos, grammatical errors, or broken links, vs. things like "how engaging or helpful was this piece of content?"

    Really curious to hear how others in this community approach quality assurance—gonna tag in some folks I think might have something interesting to say here: KellOrding​ CherylStGermain​ ChrisMcAllis074​ ErinParks​ PhoebeSterdan-0​ 

    • Aïda_Domínguez
      Community Member

      Completely agree! The feedback phase is one of my pain points right now, especially when there is more than one person reviewing content and they have different fields of expertise. The idea of separating the feedback per area sounds like a first step towards a solution! Thanks also for tagging ^^

    • Aïda_Domínguez
      Community Member

      Hi Judy! Needless to say, I have bookmarked your post as an absolute favourite 💗 haha I'm in the process of adapting the checklist into a document and it's been super helpful. Do you currently use it in any specific format?

      • JudyNollet
        Super Hero

        My usual format would be to put what I need into a table in Word—and then print it. Yup, I'm old-fashioned enough that I like to check things off with a pen. 😁