Getting started with needs analysis - share your best practices

Feb 15, 2011

Got a good question from a designer today & thought it would make a great forum discussion! Here's what she wants to know:

"I'm familiar with the ADDIE model, but I really want to get some "real world" advice about how to get information in the "Analysis" or data-gathering phase. For example, when you identify your SME, where do you go from there? In my limited experience, the SME starts out with a Powerpoint or something to that effect- but what if they do not have that information gathered yet? What if they just have an "idea?" Do you have a formula or questionnaire that you use as a method of needs assessment? I'm hoping you can help me with this! Look forward to hearing from you!"

Jeanette Brooks

Here are some thoughts off the top of my head: When SMEs ask for training, they’re looking to solve some kind of problem. Maybe there’s a skill gap among their audience, and it’s costing the company money. Or maybe it’s behavioral change they’re looking for – like they want employees to think and behave differently when making choices about safe work practices.

You mentioned that often an SME starts out with some existing PowerPoint slides or other materials -- and a lot of times those things can be helpful, but it's always a good idea to back up a little and find out the root of why they think training is necessary. Those initial conversations are a really important step in figuring out whether training is a viable way to solve the problem. And if it is, the conversations you have with SMEs and other stakeholders will help you form clear objectives for the training.

You might start out with some open-ended questions like, "What kind of business result are you looking for? What will happen if the audience doesn't get trained -- i.e., what are you trying to prevent or fix by creating this learning program? What skills do learners need, or what attitudes/feelings/beliefs do they need to change in order for you to see the result you're looking for? What kind of evidence in the workplace would let us know that they've mastered those skills (or changed their attitudes/feelings/beliefs)?"

It's a good idea to come up with a set of interview questions to guide the conversation, and maybe some of the resources in the Downloads area will help you form a plan for the types of questions you want to ask.

Zara Ogden

We have created a Training Request Form that we ask either the SME or "Owner" to complete. However, with a new client, we walk through the document together. It includes...

Basic Info

- Name

- Company

- Date Requested

- Target Delivery Date

Content Detail

- Topic

- Topic Summary

- Why are we creating this program

- What do the learners need to accomplish

- Audience (who will receive the training)

- # of people

- Recurring time frame

- Prerequisites

- Objective

- Suggested Duration

Assessments & Evaluation

- Is certification required

- Regulatory Body

- Name of Certification

- Is a final assessment needed

- Should the final assessment be graded

Tracking & Reporting

- What should be tracked (certification, assessment, training completion)

Key Contacts

- SME detail and contact info

I chose these topics because they open doors and create natural questions and conversation. I don't expect a client to fill out the form and automatically be able to create a program, but it gives me an idea of what they want. We can then begin discussing adult learning, design options, and needs.
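If it helps, here's a rough sketch of the form expressed as a simple data structure -- just an illustration I put together for this post, so the field names are made up rather than copied from our actual form:

    from dataclasses import dataclass, field
    from typing import List, Optional

    # Rough sketch of a Training Request Form -- field names are illustrative.
    @dataclass
    class TrainingRequest:
        # Basic info
        name: str
        company: str
        date_requested: str
        target_delivery_date: str
        # Content detail
        topic: str
        topic_summary: str
        business_reason: str        # why are we creating this program?
        learner_outcomes: str       # what do the learners need to accomplish?
        audience: str               # who will receive the training?
        audience_size: int
        recurring_time_frame: Optional[str] = None
        prerequisites: List[str] = field(default_factory=list)
        objectives: List[str] = field(default_factory=list)  # key topics (we cap these at 7)
        suggested_duration: Optional[str] = None
        # Assessments & evaluation
        certification_required: bool = False
        regulatory_body: Optional[str] = None
        certification_name: Optional[str] = None
        final_assessment_needed: bool = False
        final_assessment_graded: bool = False
        # Tracking & reporting
        tracked_items: List[str] = field(default_factory=list)  # e.g. "certification"
        # Key contacts
        sme_contact: str = ""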

After this process I ask a key question... What are the key topics/objectives that must be covered, to a maximum of seven? This requires conversation, because sometimes it takes massaging, digging, and probing questions. Unfortunately, I don't have the option to directly assess the end learners, as they are in transit all across the country, so I really have to rely on good questioning, analytical thinking, and problem-solving skills.

It is a challenge, but it's one of my favorite parts of the ID process.

Brian Sullivan

I agree with the "treat it as a problem strategy".

We use a variant of a Creative Problem Solving technique (based on the Basadur Simplex technique - http://www.basadur.com)  to engage SMEs and stakeholders, develop business objectives, outline curricula, pinpoint learning objectives, discover resources, identify gaps and extract information.

We never start from existing resources or current training or SME-stated objectives (though we don't ignore them either -- they become information to be sorted through and analyzed in the problem-solving process). We always start from an essentially clean slate. Sometimes this makes the SMEs initially wary, but usually the wariness goes away quickly.

We find that this is the most efficient way to use SME time and the most direct route to developing an effective communicative narrative that fits the real purpose of the required training.

Jeanette Brooks

Brian Sullivan said:

We never start from existing resources or current training or SME-stated objectives (though we don't ignore them either -- they become information to be sorted through and analyzed in the problem-solving process). We always start from an essentially clean slate. Sometimes this makes the SMEs initially wary, but usually the wariness goes away quickly.


The fact that their wariness goes away quickly is a testament to your skillfulness in developing a good relationship with them.

For a designer, that is often a challenging part of a project. A lot of times, SMEs have already invested significant time developing their own slides, documents, job aids, etc. -- and they might be really proud of that work. It's an important skill to recognize and validate their work while also guiding them through the process of looking at the problem from a blank slate. Tom has some really helpful blog posts on working effectively with SMEs:

http://www.articulate.com/rapid-elearning/how-to-get-the-most-out-of-your-subject-matter-experts/

http://www.articulate.com/rapid-elearning/what-everyone-should-know-about-working-with-subject-matter-experts/

Mike Taylor

We have a document very similar to Zara's...

Who is the "Owner" of this training initiative? 

Who will develop the training? (Internal/External Vendor) 

What are the training needs? What is happening?

Who will receive the training? Identify the specific audience(s)

Where are they, and do they have computer access?

What is the purpose of the training? 

What are the primary (terminal) objectives of the training? 

Who will provide feedback/sign-off for the training content? 

What are the completion criteria? Will there be a training assessment or any other requirements?

Does it need to be tracked in the LMS?

Is there a deadline to complete the training? 

Who will be the point of contact for questions about this training? 

Steve Flowers

For everything but compliance training we normally require some type of performance analysis that intentionally looks at all of the issues surrounding the perceived performance gap. Rarely is training the only solution to the problem, in our experience. The other elements appear in a "performance pie":

  • Capacity
  • Selection / Assignment
  • Motivation
  • Incentives
  • Resources
  • Information
  • Skills
  • Knowledge

Some of these are individual factors; others are organizational influencers. Only the last three on the list are prime candidates for a training application, and in many cases we'll do further pre-design analysis to tease out performance support system feasibility. We love training, but it's not the only tool in our bag, particularly if we can be more effective or get it done with fewer resources via an alternate solution path.
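To make the sorting logic concrete, here's a toy sketch (not an actual tool we use -- the cause labels are simplified from the pie above):

    # Toy sketch of the "performance pie" triage described above.
    # Not a real tool; cause labels are simplified for illustration.
    TRAINING_CANDIDATES = {"information", "skills", "knowledge"}
    ORGANIZATIONAL = {"capacity", "selection/assignment", "motivation",
                      "incentives", "resources"}

    def triage(causes):
        """Split identified gap causes into training-addressable vs. other fixes."""
        training = sorted(c for c in causes if c in TRAINING_CANDIDATES)
        other = sorted(c for c in causes if c in ORGANIZATIONAL)
        return training, other

    # Example: a performance analysis turns up three contributing causes.
    training, other = triage({"skills", "incentives", "resources"})
    print("Training / performance support candidates:", training)  # ['skills']
    print("Needs an organizational fix:", other)  # ['incentives', 'resources']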

We use the Harless Peak Performance System for process consistency across the different types of analyses we conduct. These include new performance planning, diagnostic, and job task analysis. We also couple a pre-design analysis (on top of the existing performance analysis) with each solution endeavor. In any case the data from surveys, interviews, and business statistics drives the high level solution decisions we make. The pre-design analysis narrows the goals and objectives to a set that is addressable within the solution span.

"Treat it as a problem strategy" is really the only way to go. Otherwise you're stabbing at invisible problems with expensive knives. By isolating and eliminating potential causes you can avoid spending resources on solutions that don't fix the problem. In some cases you'll find the problem really isn't worth the cost of solving.

I think this type of intensive analysis isn't always an option (and in some cases it can be counterproductive and cost more than a proposed solution). As lucky as we are here to have a performance focus and solution alignment based on problem identification, we'll always run into the odd case of compliance training or other push for training that can't be battled with logic and statistical evidence. For those instances we still conduct a pre-design analysis that attempts to frame the problem we're solving.

I'm curious about the Basadur Simplex process. Will have to look into that one. Thanks, Brian!

Amanda Westendorf

I think we use SMEs in a little different context than has been mentioned here. We (the training department) usually have already interviewed our project stakeholders and have an idea of where the project is going before we fully involve the SME. We tend to use the SME as a sounding board to ask clarifying questions, conduct research, proofread/test, etc.

However, before we are to this point, we have already conducted a session similar to Michael Allen's Savvy Start, where we have project stakeholders, SMEs, recent learners, training developers, etc. all in a room together to do some rapid prototyping and where everyone can come to a consensus of what the finished product should entail.

Stephanie Harnett

I run through a basic list of questions, but that list is never given to the client. These are questions that I weave into our initial conversations to get a feel for the project and for the client. I find that getting the best from the stakeholders and SMEs is mostly about balance between the technical process (information gathering) and creating connection (understanding how the client feels). Maintaining that balance from the get-go leads to a better client relationship and increases the likelihood that you will create those teachable (aha) moments with users.

I agree with the comment about starting with a blank slate. No two clients are alike, and no two topics really are either. I keep my past experience out of the initial conversations. I guide but don't steer. It's about listening to the unique position the client is explaining -- and there is always something unique. It is those bits of detail that allow me to create a custom feel for clients, even if it is a topic that has been done tons of times before.

The information I collect in the beginning of the project is first seen by the client in the form of a project charter. If I've done my job, it will capture the key messages, key topics, case study ideas, culture, things to avoid and general approach. There is usually a to-do list for the client that falls out of this; some stuff they need to go and gather internally. This gets them involved - a little skin in their own game.

Stephanie

Diana Gryckiewicz

Great information, thank you to everyone. At my workplace, the LMS and e-learning are new. I would appreciate it if anyone had an e-learning development framework from start to finish. In addition to creating the e-course itself, we have yet to develop processes for all things e-learning. What topics should be included in our 'Standards Manual'?

Renee Schuh

Hi Group!

I was looking through our community posts to see if there was anything out there on criteria for alpha/beta testing once an e-learning product is developed (and before it is delivered). Do any of you have good resources for criteria used in testing an e-learning course -- anything from copy editing/grammar to system functionality to the pedagogy of a course? If you have anything you wouldn't mind sharing, I'd love to hear from you! I'm getting ready to alpha test an e-learning course, and I'd love to make my list of questions for the testers as robust as possible.

Thanks so much!

Renee

Gerry Wasiluk

Hi, Renee!

Here are a couple of things that we give to our e-learning developers. A lot of them are not professional developers but SMEs and novice developers.

For courses that will be on the LMS (ours is named "GLN," for Global Learning Network), we maintain three versions of the LMS.

  1. A Test environment -- just for those of us who manage the LMS, and for IT, to do first testing in

  2. A QA environment -- for LMS administrators to learn and play around with the LMS, and for them to test e-learning content in before it goes to production

  3. A Production environment -- the version of the LMS that learners use, and where final versions of e-learning courses get set up
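If it helps to picture the flow, here's a toy sketch of how a course moves through those three environments (illustrative only -- not our actual tooling, and the course name is made up):

    # Toy sketch of course promotion through our three GLN environments.
    # Illustrative only -- not our actual tooling.
    ENVIRONMENTS = ["test", "qa", "production"]

    def promote(course, current_env):
        """Move a course to the next environment once testing passes."""
        nxt = ENVIRONMENTS[ENVIRONMENTS.index(current_env) + 1]
        print(f"Promoting {course!r}: {current_env} -> {nxt}")
        return nxt

    env = "test"
    env = promote("Safety 101", env)   # test -> qa
    env = promote("Safety 101", env)   # qa -> production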

We treat e-learning a little like a software application: it has to look and work right, but it also has to communicate properly and work correctly with the LMS. Our favorite quote around "Why testing?" is "Your learners are not your beta testers -- they deserve a good experience from the start."

We also have an e-learning standards document, which I've also attached. It's a bit too technical at times, and we need to split it up and put it in a new format, but it also has some of our course standards and recommendations in it. It's a bit dated on some stuff, so "buyer beware."

Hope all this helps. All the best.

Gerry Wasiluk

No problem.   Hope it helps in some small way.

And thank a little birdie named Jeanette, who remembered that I had provided some of this before and pinged me to help. I was buried in the Storyline beta.

Just another great example of how Articulate tends this great community. Even when they don't have the answer per se but know who does, they reach out behind the scenes to get their users answers. That just rocks in my book!

Steve Flowers

Great info, Gerry.

We have several different environments as well:

  • Review - This is for content / functional review of everything but LMS runs. These are updated continuously as development continues and we often ask reviewers to look at specific sections of content to verify everything is up to snuff. We may also run ad-hoc user tests on target users to verify that the solution will do what we predict it should.
  • Development - This is for LMS functional review. At this point, content should already be at the polish stage, though for external vendors we ask for a functional prototype to verify packaging.
  • Production - When everything is done, verified complete, and approved, we hang it at its final destination.

For *beta* testing / user testing we have several components we'll use. These are low-tech. Our sample size usually ranges around 12-15 users in two different groups (usually 6 or 7 users per session).

  • We begin by baselining with a survey and pre-test. These are paper-based. We want to gauge the demographic (to validate our user profiles -- rarely is this different from the audience profile that starts the design, but sometimes we learn new things) as well as the approximate level of skill / knowledge / confidence going into the learning experience. Confidence is an important data point. We collect self-reported confidence measurements in the pre-test.
  • We then have the learners access the materials while we observe each session (there is one observer for every 2 users). We take notes on where folks look perplexed, annoyed, or stuck.
  • At the end of the session, we present a post-test with similar confidence measurements.
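To give a rough idea of how we summarize that pre/post confidence data, here's a minimal sketch (user IDs, scores, and the 5-point scale are made up for illustration):

    # Minimal sketch: summarizing pre/post self-reported confidence.
    # User IDs, scores, and the 5-point scale are made up for this example.
    pre  = {"u01": 2, "u02": 3, "u03": 1, "u04": 2}   # confidence before, 1-5
    post = {"u01": 4, "u02": 4, "u03": 3, "u04": 5}   # confidence after, 1-5

    deltas = {u: post[u] - pre[u] for u in pre}
    avg_gain = sum(deltas.values()) / len(deltas)

    print("Per-user confidence gain:", deltas)
    print(f"Average gain: {avg_gain:.1f} points on a 5-point scale")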

We don't do pre/post measurements in the solution. We have a single test that can be accessed at any time that serves as the completion gate. If our user testing data is consistent, we don't need these measures to prove whether the solution will do what it was designed to do. We do include a survey in each course. 

Over the long term, we encourage the client to look at the organizational impact. The client already has many of these measures (if they don't, something is really wrong and training will never fix it). If they pay attention to those measures and set goals according to the desired outcomes, it's a better thermometer than an assessment metric (essentially short-term recall). Memory close to the learning event in no way correlates to long-term performance -- and we can never tell if all we measure is that short-term recall.

Elizabeth Israel

I guess I take a further step back before I even start my data gathering. When somebody comes to me and says "we need training," I begin asking clarifying questions to determine exactly what the issue(s) is/are, since I don't assume that training is always the solution. Additionally, I ask who else should be pulled into the meeting, as they will be impacted. Sometimes I find myself really needing to put on my consultant hat and be a facilitator so that I can get the right parties talking with each other. Once the actual concerns/issues are on the table, I start the process over again. I have to say that there have been many times when the project/product owner or SME told me what training was needed when, in fact, it turned out to be something totally outside the realm of training. I think it's important for all of us in training to remember that we need to provide value-add and prove why we're important to the organization in many ways on an ongoing basis.
