Development process and reviews - Suggestions?

Sep 06, 2012

I"m curious...at what points during your development process do you stop and someone does a review of the content. What specifically do you have them review for at each point? Who does your reviews?

With the help of a SME, we have an ID design and write the content. It then goes to a developer (me) to develop the content, and then to QA. We're trying to create more checkpoints along the way (without waiting until the end), but we want to be thoughtful about it (not review just to review).

Any suggestions???

Eric Nalian

Hey Kristin,

I do this at several points.

  1. Content review - to ensure that it is accurate/up to date.  This review is with the SME
  2. Design review - this one is not really a formal meeting; it's just me asking my supervisor, 'Hey, does this sound like a good idea?'
  3. SME/user review - after the course is developed, SMEs and users review the course
  4. Depending on how the review goes, step 3 is repeated until the course is perfect

-Eric

Kristin Savko

Eric, thank you for your feedback! Can I ask a few follow-up questions?

In step 2, is your supervisor also an ID, or just someone knowledgeable in learning?

Step 3 - When you say "user review," is this an actual student user or someone posing as a student?

Thanks again for the feedback. I'm sure the process needs to be different for different people and different cases, but I suspect there's a best-practices pattern... 

Bob S

The other one we've added for some projects is the "Visuals Sign-off"

Seems like there are often companies and/or certain stakeholders that want final say on the visuals that were chosen. Some reasons for this include the strict branding/visual standards that seem to have become more prevalent in recent years.

Hope this helps,

Bob

Greg Rider

I like all the posted replies.  I'm currently piloting our review process, but we decided that the SME and the Senior-most approver will review the content before I (the ISDer) get too far into the development of the PPT that will go into Articulate.  So, our process looks like this:

1) SME sends PPT draft with all content to Sr. Approver, while I start adding iStockphoto and other images into this same draft. [NOTE: This draft also includes all quiz and assessment items.]

2) Once the Sr. Approver reviews and suggests content changes, I incorporate those changes and then devote myself 100% to adding all images and animations to this approved-content version of the PPT.

3) SME reviews the PPT with all images and animations and content changes.

4) I make revisions to images and animations and start writing/finalizing the narrations.

5) SME and Sr. Approver review the narration and suggest changes.

6) After the narration is approved, I record and finalize the course in Articulate, where it will be reviewed on a cloud SCORM site before it launches in our LMS.

Hope this adds some helpful information to the conversation.  If not, please ignore.

Have a good weekend, pursuing non-ISD activities!

GREG

Helena Froyton

You could also have a review done after the completion of each of the phases in the ADDIE model. 

Does anyone else want to share their development process, when the much-needed reviews happen, and who does them?

Do any of you get a sign-off and/or review of an Instructional Strategy and Storyboards?

In your opinion, what must be present in each of these items?  What specifically is the reviewer looking for?

Thank you!

Regards,

Helena

Steve Flowers

Argh! I wrote a really long response, but the outage this morning ate it -- bad technology timing... In a nutshell, it looked a little like this:

  • A one-size-fits-all process doesn't usually fit all. So delivery artifacts--and those that build these artifacts--need to be flexible enough to adapt to what the customer needs/wants, the scope of the project, the resources provided, the trust relationship/synchronization of the client and the provider, etc.

That said, we put together a process "blueprint" when I was with the Coast Guard. This isn't intended to be a burdensome process, though it becomes one if it's taken literally. Within this process we have a set of defined delivery artifacts. Each artifact is delivered in iterations. Sometimes we can go through full iteration cycles, sometimes we can't. The idea is to avoid unnecessary work by iterating in layers. We define these iterations as:

  • Rough - this could be an outline or simply a series of questions that need to be answered. Rough... is rough and exploratory. Intended to eliminate unnecessary work on things that don't matter as much.
  • Polished - this is a deliverable based on feedback in the rough iteration. It's close to right but not intended to be 100%.
  • Final - this is the 100% stage

Within the process, illustrated in appendix E of this guide (http://www.uscg.mil/forcecom/training/docs/training_SOP7_Sep11.pdf), we recommended using some worksheets to ramp into the effort and validate whether e-learning was the right or best fit given the circumstances. We also placed some artifacts within the design stage of the process:

  • Design document - this is a really brief document that describes the purpose and promise of the solution as well as the target outcomes, objectives, and audience. I've found that the client will sometimes provide something with many of the characteristics of this framing document. This provides the trail head for the rest of the solution. Again, brief...
  • Design flow - this expands on the design document to frame specific topic and content areas. Within this document we capture all of the facts, principles, procedures, and/or processes relevant to the purpose and promise outlined in the design document. This will also outline order, priority, relationships, chunking, etc. The scope of the project will get some refinement in this artifact. This is where assumptions are tested and objectives are reshaped.
  • Assessment plan - this defines the strategy for assessment. Not just within the product but related to the entire campaign or effort. We don't (and shouldn't) always assume that we can package and dose an assessment directly connected with our e-learning products. We also might include some assessment or evaluation mechanisms with the sole purpose of "proofing" the solution. These proofing mechanisms might be removed after a test deployment. Every situation is different.
  • Functional prototype - this is another scope proofing point. Too often we designers dump a load of garbage on our developers at the last sliver of the schedule without providing an opportunity for the person at the end of the project cycle to influence decisions. Often, our design "guesses" are rewritten and passed back to the developer. This is torture for a developer (I've been on both ends of this equation.) This is wrong. The prototype serves to establish ground truth on the assumptions for patterns the design will employ to create beneficial moments within the experience -- BEFORE the storyboards are built.
  • Scripts and storyboards - NOW we get to storyboarding (or expanding our functional prototype in iterations building the storyboards INTO the product).

These don't need to be long and drawn out, nor necessarily tackled in this order every time, but I believe each of these considerations will help narrow down design decisions and prevent excess pain. There are other tools, patterns, and mindsets that I might employ to show my work in the design and exploration stages (design is a process of illumination) to make things flow smoothly. It really depends on the situation.

Design process is fun stuff. But it's situational and fairly personal.

Cheers,

Steve

Scott Hewitt

Hi,

I've looked at lots of other industries and production methods to see how they work - taking best practices and ideas to see how we can improve our own. Our process is constantly evolving and is also flexible depending on the client, the content, and the development tools that we are using. We use a range of e-learning tools for our projects, including game engines, so this can require a different approach.

We have client review points and internal development review points. We try to do these frequently - to keep the development process moving (we can't work if we are waiting for the client to review). We use elements of agile development and also MVP (minimum viable product) and build these into our review process. We know that we will want to review graphics, script, prototype, etc., but we are flexible during the development process so that we are not 100% rigid about any review stages.

Scott

Sheila Bulthuis

Steve, such a great summary of your process considerations.  But I can't believe you went through writing that out twice! 

I agree wholeheartedly with the idea that the scripting and storyboarding - no matter how you do it - is really the last (although possibly biggest) part of the process.  I've seen so many people jump right to "Let me show you all my cool ideas about how we'll develop the course and how we'll make your content interactive/engaging/impactful/whatever" before taking the time to fully understand the content, develop a plan for the flow, think about how to chunk it, identify the necessary level of detail, etc.  

I usually combine the design doc and design flow into one deliverable, and sometimes I even do it in the development tool instead of as a separate document. I don't think it matters much how you do it, but it's so important that it be done.

Helena Froyton

Hi Steve,

I am sorry for only replying to you now.  It has been a busy week.  Thank you very much for all of your comments.  I particularly liked the process framework in appendix E.  I also liked the link you attached on storyboards.  Sorry you had to write it twice, but I really appreciate your comments.  I liked the use of a blueprint and the artifacts being delivered in iterations. 

Hi Scott,

I agree with you on the need to be flexible.  If you don't mind me asking... what are the elements of agile development and MVP?

Hi Sheila,

I agree with you on the importance of scripting and storyboarding.  It is a main step in the design phase.  What elements are part of your design doc and design flow?  I am creating a design doc and at the moment have objectives, the content related to them, and their test questions.  Do you get a sign-off from stakeholders before you continue with storyboarding?

Thanks again,

Helena

Natalie Van Doren

I'm doing quite a bit of this at the moment, as my e-learning team really only came together a month ago.  Our process goes something like this:

1: Client scoping documentation / kick off meetings

2: Content gathering / SME Consultation

3: ID:  Design Document (course structure, need-to-know info, required skills after training, Activities: Absorb/Connect/Do & VARK checklist)

4: Design Doc Sign Off

5: Develop Visual Design & acquire signoff

6: Develop first module, Self Review, Peer Review, SME Review

7: Update from feedback received

8: Full module signoff.

9: Rinse and repeat for remaining modules.

I find that everyone has a different idea of what they do and don't like, so at some point I need to put a Design Freeze on what I am doing.   Too many designers spoil the course...      My boss aims for 95% complete. His view is that designers nitpick for perfection, but often the general public will not notice the tiniest of alignment issues...

Chantelle N

Saw a recent post that had some responses linked back to this one and I wanted to actually piggy-back off of Natalie's comment in her last paragraph.

I have been pondering exactly this point of subjectivity in course reviews. I have had some senior coworkers make critiques to the point where even I find myself thinking that we are wasting time. In fact, some courses had already gone through a few pilot tests before they were QA'd, with great feedback from the audience.

This is not to say that some of the things that have been critiqued are not useful or don't need to be changed, but for each handful of useful comments it seems there are a handful of equally subjective ones (e.g. not liking the rate of "spinning" of the Storyline icon animation, or issues with differing font sizes that occur in the matching activities, since you can't control the size of the boxes).

Because there are not "laws" on many design elements, I find myself being boiled down to an order taker since otherwise it just becomes a battle of "Well I think XYZ.." - "But I think ABC" . Being the subordinate, that's a game I can't win. More importantly, I don't think it's a game that can be won, since there are likely good reasons for both sides in these cases. It essentially becomes an issue of trust.

But this has had me thinking . . . how do you manage the process from a perspective of drawing that line on reviews? How do you somewhat control (or avoid, if possible) getting comments that are just based off of someone's personal preference and make a case for your own design choices without hard, factual evidence of being "right" or "wrong"?

Thanks for your thoughts.

Chantelle

Sheila Bulthuis

Helena Froyton said:

Hi Sheila,

I agree with you on the importance of scripting and storyboarding.  It is a main step in the design phase.  What elements are part of your design doc and design flow?  I am creating a design doc and at the moment have objectives, the content related to them, and their test questions.  Do you get a sign-off from stakeholders before you continue with storyboarding?

 Thanks again,

 Helena

I somehow missed this... I'm sure because I forgot to re-subscribe to the thread after I posted. 

In any case, if it still matters/is helpful:  My Design Docs cover much the same things as yours do, Helena. I don't go into the detail of the content, but I do outline it, usually with main topics and sub-topics. In a lot of cases, I'll also indicate the source for the content; that helps me pinpoint the areas where I need more content/info from the client.

I also almost always indicate the planned method of presentation; for ILT this might be "short lecture followed by structured discussion" or "simulation activity." For e-learning it's things like "explanatory narration with animated illustrative images" or "drag-and-drop activity to categorize XYZ" or "scenario-based activity built with a decision tree."

Sheila Bulthuis

Chantelle Nash said:

 But this has had me thinking . . . how do you manage the process from a perspective of drawing that line on reviews? How do you somewhat control (or avoid, if possible) getting comments that are just based off of someone's personal preference and make a case for your own design choices without hard, factual evidence of being "right" or "wrong"?

Thanks for your thoughts

Chantelle

I always let people know the purpose of each review; it sounds like these are QA reviews where the content, etc. is fine?  If that's the case, I think you could position it as "Please review for the following: typos, broken functionality, etc."  You could also try to set the expectation that you're looking for input, but will make the final decision about what to change.
