Forum Discussion
Organising project phases
I'm a little late to the conversation, but this is a topic near and dear to me (as evidenced by the length of my response! 😁). My review strategy is an editorial, iterative approach. Asking for a comprehensive review of a final product at the end of a project is a very bad idea: there's no effective way to create a tool that covers every aspect of every project type. It would be massive, and I have yet to encounter a teammate or client who would use it effectively (or at all). So, like an editor, I break my reviews down to reflect the product's stage, and within each review I make multiple passes at the product, each pass focused on a particular aspect (e.g., first grammar, then formatting, then overall flow, then overall design). This is particularly important for projects with multiple deliverables, multiple developers, or a blended approach. I want quality and consistency not just within a single deliverable, but across all deliverables at all stages: a typo-riddled script distracts the client from giving effective feedback and creates doubt about the quality of the final product.
Once I have my initial analysis, I meet with the team to create a style and standards guide, then have the client sign off on it. What are we creating, how are we creating it, and what fonts, colors, images, graphics, buttons, and interface elements will we use? Is there audio, and if so, who's recording it, or are we using SL voices (and which one)? Certain clients even have terminology preferences (e.g., client vs. sponsor; subject vs. patient) that go in the style/standards guide. Identify the standards, stick to 'em, and review against 'em when you reach the relevant product phase.
Early-phase reviews focus on the requirements outlined in the analysis phase, along with a grammar/terminology/tense/voice check of the content, the flow, the objectives, etc. Middle-phase reviews focus less on the content and more on the delivery approach and design; once a project goes into development, I want my content locked down (just check for typos/grammar). I also run a check to ensure everyone is following the style/standards guide. When I can iterate and run a pilot: do the pilot reviewers find the format effective? Is there something about the content flow or interface that they find confusing? Late-phase reviews are the "spit and polish" type: a final typo check and UX testing (audio and animations are synced; menus, links, buttons, triggers, and state changes all work as they should; etc.). I like testing in SCORM Cloud when I'm developing for a client but don't have access to their LMS and need to confirm that the SCORM package is reporting the data the client needs.
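For anyone curious what that late-phase SCORM data check looks like in practice, here's a rough sketch in TypeScript. The names `MockScormApi`, `requiredElements`, and `reportAndVerify` are all hypothetical, invented for illustration; in a real course the calls go to the LMS-provided `window.API` object (SCORM 1.2), not a mock, and the list of required elements depends on what your client actually needs reported:

```typescript
// Minimal sketch of a SCORM 1.2 data check (assumed names throughout).
// MockScormApi stands in for the LMS-provided window.API object.
class MockScormApi {
  private cmi: Record<string, string> = {};
  LMSInitialize(_: string): string { return "true"; }
  LMSSetValue(element: string, value: string): string {
    this.cmi[element] = value;
    return "true";
  }
  LMSGetValue(element: string): string {
    return this.cmi[element] ?? "";
  }
  LMSFinish(_: string): string { return "true"; }
}

// The data elements the client needs reported; adjust per project.
const requiredElements = ["cmi.core.lesson_status", "cmi.core.score.raw"];

// Simulate a course run, then return any required element left unreported.
function reportAndVerify(api: MockScormApi): string[] {
  api.LMSInitialize("");
  api.LMSSetValue("cmi.core.lesson_status", "completed");
  api.LMSSetValue("cmi.core.score.raw", "90");
  api.LMSFinish("");
  return requiredElements.filter((el) => api.LMSGetValue(el) === "");
}

const missing = reportAndVerify(new MockScormApi());
console.log(missing.length === 0 ? "all required data reported" : missing);
// → all required data reported
```

SCORM Cloud does this kind of verification for you in its registration view; the point of the sketch is just to show which `cmi.*` elements I'm eyeballing when I say "the data the client needs."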
There are lots of other benefits to this approach, like always having content that meets the project objectives and keeping SMEs/stakeholders on task. Iterative reviews that match the phase allow reviewers to focus on specific items, so you get better feedback and fewer mistakes slip through. Sorry for the book-length comment, but I hope this helps!
Hi Stephanie, it definitely helps! As you said, reviews of final products can become a nightmare if there were no earlier checks. I think I'll try to focus on the whole creation process, so I can also implement an early agreement on a style/standards guide. Also, if you have any comments on formats, they're more than welcome!
Thanks a million 💗