How do you identify learning objectives?

Nov 20, 2014

Question to the masses:

When tasked with an e-learning project, how do you go about identifying the learning objectives? 

Does your boss or client provide them to you? If not, are there any specific questions that you ask your client or boss to help you identify the learning objectives?

Do you analyse source content to identify them yourself? If so, what is your method for doing that?

Any tips, tricks, experiences, or recommendations would be greatly appreciated. 

Thanks in advance!

31 Replies
Ralf Baum

Hi Nicole,

I usually prefer direct communication with the client about the learning objectives. It is very difficult to identify the learning objectives only by reading/analyzing source content; the risk of misinterpretation is too high.

I recommend asking the client, "What is the goal of this training? We need an aim for didactical reasons." Usually you'll get usable answers.

It's only a conversation, and there's no need to be ashamed to ask about the learning objectives. It's too important to have clear targets.

Best regards
Ralf

Steve Flowers

In the best case, terminal and enabling objectives are the result of a comprehensive analysis. More commonly, though, objectives are driven by dominant cultural habits and, worse, by assumptions baked into the content.

Exposure to information alone rarely produces persistent and flexible knowledge. And knowledge alone (knowing something) is even less likely to result in meaningful behavior change. This can be mitigated to some degree with good questions, if the stakeholders are receptive.

In the second case, like Ralf, I like to engage in a conversation to help structure the goals. It's almost never a single conversation. Sending a question or two at a time to clarify and dig into the real performance goals serves a couple of purposes. First, it keeps the SME engaged with purposeful microbursts. Second, it gives everyone time to think of new questions. Interviews in a single sitting are nice, but spacing them out can be better, in my opinion.

I do review content ahead of time. It's an artifact that helps to frame assumptions. You can tell a lot about what folks care about from the content they highlight as important. The content produces questions:

Based on what you've told me and the content you provided, it appears that:

  • X is a common problem in your audience.
  • Y is difficult to understand.
  • Z is a frequent task.
  • A is an infrequent task.
  • B is a critical task.

Or, the content provides some insight into asking better questions:

  • How frequently is task X performed?
  • Do these concepts apply the same way to task X and task Y?
  • How stable are the processes associated with these tasks?

I like to ask if I can interview a couple of folks who regularly perform the job well. I rarely get the chance, or the SME states that they do the job themselves. Part of the conversation is getting folks to understand the relationships between accomplishments and content. Starting with content is normally wrong-headed, but it's often what we have. There's no choice but to 1) try to convince folks to partner on an analysis, or 2) use what we have at hand and try to uncover the roots as we go.

Throwing verb darts at Bloom's Taxonomy without challenging, questioning, or digging deep into what the content is really trying to solve is something I try to avoid... and poke fun at. I've totally played verb darts in the past, and it's shameful :)

Adele Sommers

Piggybacking on Ralf's point and Steve's very clear and thoughtful response, I aim (if at all possible) to have the clients explain their business case for the training they would like to create, and also answer the X, Y, Z, A, B questions themselves. 

I then remain on high alert for any evidence that the training the client envisions is poorly aligned with the organization's business goals, a band-aid for badly designed processes, or a way to work around obstacles to performance that have little or nothing to do with employee skill sets.

If training does not seem to be the answer (or at least not the best answer, or not the answer right now) to the client's needs, I attempt to steer them in a better-justified direction. Sometimes that means recommending other types of performance interventions, such as redesigning processes, removing roadblocks to productivity, or even rethinking the business model. In some cases, it has meant that I ended up with no role at all in the final picture, but it was the only right and ethical thing to do. The clients were enormously relieved to have had the opportunity to reanalyze and fine-tune their business or product model before shelling out funds for an expensive training program. As a performance consultant as well as an instructional designer, this type of guidance is another key service I can provide.

I mention these other factors because I consider all of them to be part of the needs assessment process, as well as necessary precursors to developing "learning objectives." If learning objectives aim to do nothing more than help employees work around obstacles or master overly complex procedures, they are not really valid performance goals, as they sidestep the question of how best to bridge the perceived "performance gap."

Steve Flowers

Heh. Bloom's is valuable, just not when used as arbitrarily as it commonly is. Lots of folks use Bloom's to satisfy a formality very early in the process. Picking a verb that seems to closely match the content isn't much better than tearing off the top of the taxonomy and tossing a dart at the "know" verbs.

When constructed this way, the objectives don't matter nearly as much as they could. Moving the determination of objectives further down the line can help. By first defining performance requirements, tasks, skills and subtasks, and practice and measurement opportunities, the objectives almost write themselves. Using Bloom's as a frame at that point is helpful.

Nicole Legault

These are all excellent and very insightful points.

@Steve Very good points and I like how you refer to Bloom's taxonomy. I also agree with you that as you identify the tasks and subtasks people need to complete on the job, the learning objectives write themselves.

@Adele I love your point about how being a performance consultant helps you identify if training is the solution or NOT. I personally think training is often NOT the solution. I've also wondered if there's a conflict of interest in having an ID do the needs analysis up front, because if they identify training isn't the solution, aren't they out of a gig? Hehe.

Alexander Salas

@Steve and @Adele have pretty much given us a great summary of learning objective (LO) development. Learning objectives are essential to any adult learning activity, and especially for e-learning. In most cases, interactive e-learning modules are self-directed learning experiences, and learning objectives provide referential goals for the learner to achieve. I'm currently developing my Work Product for the Certified Professional in Learning and Performance (CPLP) certification, and this has been a rewarding experience in terms of aligning content with context.

So, I'm addressing this question in the context of being an ID contractor (for hire) and not an employee in a corporation. According to the CPLP handbook, "Training aims to improve individuals' knowledge or skills, whereas performance improvement aims to improve individual and organizational performance in relation to organizational goals." Therefore, whether it is a task-based or a highly cognitive learning product, the terminal and enabling objectives should align with your testing materials, and both need to be derived from business drivers. Bloom's taxonomy is a great reference for IDs developing LOs; however, how does that help the learner? Do you present LOs as a bullet list (of boredom) in the module? Many IDs use the SMART approach (from project management) to create objectives, i.e., Specific, Measurable, Attainable, Realistic, and Timely.

Adele Sommers

Nicole, you've brought up an excellent point about the potential "conflict of interest in having an ID do the needs analysis up front because if they identify training isn't the solution, aren't they out of a gig."  I agree!

I think the problem often lies on both sides of the fence, beginning with a sponsor who may have a fixed notion that training is required to achieve a certain performance goal and therefore sees a set of nails sticking up. Enter a "hammer-wielder" (an ID service provider with one particular, even if beautifully diverse, tool set) to pound those nails. Chances are excellent that time and resources will go into developing a training program, whether it is truly warranted or not.

The tenets of the human performance technology (HPT) discipline tell us that training should always be the last resort, because it's both expensive and highly ephemeral. It may not be the best way at all to achieve a desired behavior change in a given situation; job simplification or process redesign may be a much better solution, for example. Training is usually merited only for well-designed job tasks, when performers either lack job skills/knowledge or have insufficient skill practice.

That's why I love Alexander's point about the CPLP certification program. It's an example of a professional development process that helps ensure that a consultant will enter a situation with a broader perspective and an even bigger performance improvement tool belt.

Yes, there's a tremendous temptation when working with any client in a consulting mode to simply "take the money and run." Who really wants to talk a client out of a lucrative training contract? Yet that is exactly what we must do if we suspect, and can explain, that a training approach is not the solution to the need (whether that means not at all, not of the type requested, not at the scope envisioned, or not at the current time). At a later point, training may very well be justified, but perhaps for different reasons or at a different scale. This counseling scenario is the moral and fiduciary responsibility we have to our clients: to be straight with them by viewing their situation through an overall system performance lens rather than primarily through an ID lens.

On the flip side, there are also plenty of situations in which much more training is needed than sponsors are willing to acknowledge or pay for. When training really is appropriate, it might take everything we can do to convince them to allow us to be as thorough as necessary in developing a solution that will fully meet the learning objectives!

Peter Rushton

I'm on a bit of an internal rant with myself over learning vs. training, so I had to respond to Adele's quote "...that training should always be the last resort..."

My internal ranter is saying, "Yes, but learning shouldn't be!" - with the idea that self-directed learning is better than training-directed learning.

(I don't want to suggest that anyone has posted otherwise, but I did want to get this out to clarify for myself.)

Wouldn't it be wonderful to find an employer convinced enough to ask us to create a "continuous learning library" of e-learning modules! Now there's a gig ;~))

Adele Sommers

Bravo! I love your comment, Peter! When learning is self-directed (such as with informal learning in the 70:20:10 model), it tends to be highly relevant, much less expensive than the cost of developing many types of formal training, more application-focused (since it pertains to the ways that people survive and advance on the job), and more durable, since people tend to reinforce it often.

The trick is for the organization to find ways to codify the informal learning into a knowledge base, such as through a wiki of best practices or another type of "continuous learning library," as you've suggested.

Edgar Mueller-Gensert

I go with the good ol' Kirkpatrick pattern and work it backwards when talking to the client:

1. What is the one company key figure to be changed? (revenue, employee turnover, etc.)

2. What is the desired behavior the learners must acquire to make that happen? (improving the conversion rate, motivating co-workers, ...)

3. What will the learners have to know to achieve that behavioral change? (how to ask, what motivates the co-workers, ...)

4. How must the course be designed to achieve good learning? (the classic "happy sheet" outcome)

The learning objectives, as I understand them, are hidden in item 2.

Nos. 3 and 4 are just the segue; no. 1 is a matter of good metrics on the customer's side.

In summary, I concentrate on the change of behavior. If this does not happen, forget training.

E.

Peter Rushton

Hi Adele:

Thanks for the 70:20:10 link. I am still digesting the commentary, but two things came to mind (I know, collision ;~))

First, if true, that's a heck of an ROI (as in, bad for training), and I wonder how social media might be changing the ratios and costs.

Second, I'm not sure the ratios accurately reflect the relative status of employees (e.g., new hires vs. managers), and there may be a bias in there somewhere.

Adele Sommers

Hi, Peter! I'm not sure whether we should think of the 70:20:10 model as a significant shift in the traditional training paradigm. For example, the article discusses its origins as a collection of historical observations regarding how learning actually occurs in the workplace.

Although many organizations have embraced the model and have defined strategies for extending and enhancing it, I daresay that there will always be countless important applications for formal training to develop and maintain employee skills. Although formal training represents a smaller slice of an employee's learning continuum, it's often an indispensable one!

Bruce Graham

70:20:10, in my experience, has always been the way; it's just that after x years of corporate learning and e-learning programmes, many people are BEGINNING (and I say beginning...) to wonder where they have gone wrong and to consider what other options may be open to them.

Saying that, the 70:20:10 model will (no doubt...) become another buzzword in the next few years and will open itself to misuse.

I have always said that learning objectives MUST be measurable (I'm publicly committing to this as my one objective for this year...). This is sometimes INCREDIBLY hard for a (freelance) ID to achieve, mainly because we are not always exposed to the corporate metrics used to measure them.

Sometimes the learning objectives we get are the client's own interpretation; sometimes they have no idea how, for example, to link them to "the business". In these cases, should (freelancers) fight to try and get to the decision-maker, or "do the best we can"? I always found it easier to get to the decision-makers when I was in the corporate world, and therefore to get closer to the real, measurable business objectives when writing learning objectives.

This is a very complicated subject with a HUGE number of variables.

Peter Rushton

70:20:10 - It is interesting that the original research for this was done in the 1980s and later published in 1996, all essentially before the Internet and social media buildup, and I wonder if 50:30:20 (a wild guess) isn't a better distribution today, given the pace of electronic interactions.

One idea I found intriguing was a slight rephrasing for a good "learning" paradigm:
10% training, 20% from others, 70% doing.

P.S. Google Images has some very creative visuals on this.

P.P.S. Given the high cost of corporate training, I just can't get my head around a 10% return. So, given that the original research referenced highly effective managers (vs. underlings), I still suspect some segmentation here.

Rachel Barnum

I actually just finished up this blog article about writing learning objectives: http://www.ohthatrachel.com/2014/11/21/writing-effective-task-oriented-learning-objectives/

Typically I start with the client's objectives, and we break them down into what the learners *really* need to know. Where are the weaknesses? What happens when a learner doesn't know this thing? Etc.

Nicole Legault

That is a great blog post, Rachel! Thanks for sharing your very practical approach. Also, I LOVE that little comic so much I think I need to share it on LinkedIn. 

All this talk about the 70:20:10 model got me thinking that I don't know much about that model and what it stands for. So, I did what I always do when I want to learn something new: created a mini e-learning demo about it with Storyline.

70:20:10 - A Model for Learning

These discussions are so enlightening and enriching!

Peter Rushton

Hi Nicole - nice job on the demo - love the converging animations. Wish all EL was that quick and informative ;~))

This reply also gives me a chance to compliment your writing style. As someone new to the blog, I have been reading lots from you in the beginner section. Nice breezy posts with good form and content. Thanks.
