Here is what a world without objectives might sound like:
“I don’t know. What do you think? Last year we covered ice breakers. How about a session on humor this year?”
“Well, I’m not entirely certain what scenario-based elearning would do for us, but I know our competitors are doing it.”
“I want to build a great program for the reps, but not sure what to do. Role plays? Job aids? Videos? Drills? All of them? How do I decide?”
“We got a really good speaker. She is funny, really a hoot. Let her talk about whatever.”
“Mobile devices are all the rage and I think we should put this on them, so they can check them out whenever.”
“I like panels. At least one of them will have something substantive to say.”
“Review the programs we built. Can you write some objectives for them? The funder is demanding objectives now. Ha. If we write objectives to the programs we built, we have a great chance of meeting those objectives.”
“Well, how would we evaluate this? No clue where to start.”
“Let’s just serve up a smorgasbord of rich online resources and see what happens.”
“There is so much here that I can’t tell where to focus. Do I need all of it? How do I get to what I need?”
“I’m looking at some high production value here, but I can’t judge whether it would work for us or not.”
My good friend Marc Rosenberg posted a column at Learning Solutions magazine. The title: “Why I hate instructional objectives.”
It set me off. Marc doesn’t dislike objectives, not really. We share disdain for crummy objectives that over-promise and under-deliver, and for objectives that natter on without speaking to real-world concerns. Admittedly, many objectives do that, creating cynicism in their wake.
But that doesn’t mean we should abandon objectives. Please, no. We don’t want to encourage instructional designers to flop about without clarity of purpose, purposes derived from analysis of the work, the worker, and the workplace. We must begin with the ends in mind. That is what objectives are.
What we need is not fewer objectives or no objectives; it’s better objectives. I LOVE good, useful objectives and make my case in a counter opinion, also at Learning Solutions.
Hope you will weigh in.
Loved your response. You’re my new hero!
Hey Allison,
As a practitioner in the field of ‘Improvement Design’ (Dave Ferguson’s term, which I love), it would be ridiculous to try to design anything without purpose or objective. The issue with ‘objectives’ is how they get used within the materials learners consume, in addition to their use in what I like to call the ‘Magic Bubble’ effect. The magic bubble is the hypothetical bubble of learning that gets delivered directly into a learner’s brain. That bubble is built on the belief that if I properly state the objectives to my learners, design the content around those objectives, test against those objectives, and wrap that up all nice and neat using best-practice design for whatever medium it is being consumed in, then I will successfully deliver ‘learning’.
I have even heard comments that ‘real learning’ begins with properly stated objectives. The use of the word ‘learning’ as a noun isn’t a semantic issue. It’s a design issue. Most designers design materials using the magic bubble theory. They don’t account for what is actually being learned or what the contribution of content is to the PROCESS of learning. So the issue I see with objectives isn’t the designer having explicit objectives and purpose, but rather the use of objectives in the delivery of content.
What Evidence Shows About Objectives
John Hattie has written two books that summarize hundreds of experiments on 138 different influences on learning. He reports an overall effect size of .4 on the use of behavioral objectives and advance organizers. An effect size of .4 is a moderate effect. He comments that “The overall effects of behavioral objectives show much variance but the effects are higher when the learning intentions of the lesson are articulated, when notions of success are included, and when these are shared with the students. When they are primarily for the teacher … or aimed primarily at surface learning … the effects are lower” (p. 167). He also comments that even though the effect size is not as high as for other interventions, objectives are relatively inexpensive to develop and present, and so offer a good return on investment.
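For readers who want the arithmetic behind that number: an effect size in a synthesis like Hattie’s is a standardized mean difference, along the lines of

$$
d = \frac{\bar{x}_{\text{with objectives}} - \bar{x}_{\text{without}}}{s_{\text{pooled}}}
$$

so an effect size of .4 means the groups receiving the intervention scored roughly four tenths of a pooled standard deviation higher than the comparison groups. (This is the general statistical definition, not a formula quoted from Hattie.)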
Applying the personalization principle, in my ISD class I ask designers to first write a traditional learning objective (e.g., “Given…, the learner will…, at a standard of…”) AND then, for the student materials, translate it into informal language using first and second person.
In short, evidence supports Allison’s comments.
Hattie, J. (2009). Visible Learning. Routledge.
Hattie, J. (2012). Visible Learning for Teachers. Routledge.
Hi Ruth,
Once again, I would never argue against a designer designing with intention. It would be silly of me to do that. But a couple of things about Hattie’s work which, in my opinion, don’t translate into the ‘evidence’ to support what Allison is saying (not that there isn’t evidence, just that the work cited is not it):
1) Hattie’s work focuses in large part on the formal education (academic) space and does not include substantive results from workplace performance (or so is my understanding of this work).
2) Hattie’s work is a meta-analysis that includes only ‘quantitative’ data and does not include ‘qualitative’ studies that would give us insight into what was meant by something like ‘effect’.
3) “The overall effects of behavioral objectives show much variance but the effects are higher when the learning intentions of the lesson are articulated” – what are the effects here? Are we measuring test scores where tests are constructed based on the specific objectives mentioned? Is everyone ok with the notion that these tests are a significant measure of something other than test performance?
I look at the work by Dr. Sugata Mitra, which paints a rather different picture of what may work in an academic setting with children than what we believe to be true. The fact that students are able to achieve an average score of 30% on standardized tests on a given subject without any intervention at all (no objectives, no lessons, just interest and motivation), with that bumped up considerably when there is intermittent facilitation, should have us all looking at the convention at least a little differently.
I’m not sure anyone would disagree that you need purpose and intent if you’re going to design anything. The question is: what does the final design look like? (I don’t have the answers… just questions :))
I believe learning objectives are the backbone of design and development. Using Q&D FEA techniques, I can sit in a room with a group of SMEs and elicit the skills and hidden knowledge, construct learning objectives, and gain consensus on them. Non-trainers understand crisply worded objectives because they describe what the learner must do to demonstrate mastery. So, what happens with conceptual or principle-based learning? The same thing — SMEs understand distinguishing between examples and non-examples, or the principles that guide someone counseling to improve performance. In a long day, you can get everyone on the same page and leave with a set of objectives that guide your design. Elegant objectives yield elegant design and development. They are the roadmap or the skeleton — you choose your term. They guide development of the rubrics we use to evaluate complex cases, they articulate the criteria for mastery, and they are the essential element of elegant design. Without them, we have no way to communicate expectations. An ISD who can design without learning objectives?? Ah yes, I’ve worked with him… we’re still waiting for that storyboard!
I like the last paragraph of Ruth Clark’s comment. I’ve seen learning objectives that have burned my eyes. Formal learning objectives (Mager style) should be written for the designer, not for the learner. To a learner, the same objectives can be mind-numbing. It is not hard to re-structure the same learning objectives as a demonstration of a skill or a presentation of an ideal “artifact” (a document, etc.), something meaningful to the learner that is motivational and provides a model of performance as a goal for the learning. Formal learning objectives are one of the things that give people the impression of ISD/ADDIE as stodgy, boring, and old, whether it works or not.
In my world (higher ed with very smart and experienced ed tech and student learning outcomes committee folks) we get into debates about the differences between learning outcomes and objectives.
It’s a bit indulgent.
In an adjacent world, I was chatting with the SEO guy in our marketing dept today.
His discontent: “Everyone wants us to use Pinterest now, but my first question is why and to what end?”
Keyword: discontent.
People like to feel successful. Even if success means “we decided that the project was This and not That, and we finished it.”
Objectives, outcomes and goals create feedback loops that make life worth living, no?
It seems to me that disregarding learning objectives because they do not show significant effect sizes or because they are overused is a step in the wrong direction. Learning objectives are essential to designing learning activities. I agree with all of Allison’s quotes above. I once had a student ask me to present to a group of undergraduates. I asked him questions about the presentation (What would you like me to present about? Who is the audience?) and he responded, “It’s up to you, I just want you to do a presentation.” I was so bewildered: where do I start? How do I know if I’ve addressed any of the topics he wanted me to address? We finally sat down together and I learned that he wanted a presentation that would help students prepare for graduate school – specifically, how to apply and find funding. From there, I was able to write my learning objectives and design my presentation. The idea of a teacher or instructional designer building a course without learning objectives is somewhat scary. How will they know if students are learning what they need to learn? How will they know which learning activities to choose?
I do agree that objectives can be overused at times and underused at others. I’ve seen schools that demand teachers put up the SWBAT (students will be able to) for the day on the whiteboard, and if it is not up there, the teachers get written up. The SWBAT is always in instructional design language and rarely relevant to the student. So, I agree that objectives need to be better designed, and if they are shown to the learners (which they should be, to guide learning), they should be in a language that helps the learners understand what is expected of them in the course.
Instructional designers like to argue about angels and pinheads. In the real world, my world, it is not about theory; it is about designing effective, efficient training that prepares students to perform real tasks. Remember the modern aphorism: “love of theory is the root of all evil” (William M. Briggs).
My experience is in designing military technical and operational training and training systems. It is about task performance and criterion based assessment, rather than normative assessment (although the bad habits are hard to break in some areas).
I come down somewhere in the middle between Marc and Allison.
Marc is not the first person to diss performance objectives. I remember Roger Schank dissing objectives in the early ‘90s in favour of goal-based scenarios (despite what he said, objectives/outcomes have to be embedded in the design). Today, computer technology is providing the means – games, simulations, models – to reach some of his visions, and they are increasingly being used in military and civil applications such as medical training.
So, with respect to Marc’s ‘E’ (expectations), certainly this is worthwhile, but he has it in the wrong order: it should come first, and it should reflect the outcome the organisation needs and the task being trained.
And I would like to add to Allison’s list of a world without performance objectives:
Do we need to train to qualification?
How are we going to practice this task before we get on the shop floor?
What tools and information does the performer need to practice this task?
In order to practice this task, what other tasks do we need to be competent in?
What we lose sight of is that a performance objective is a specification for development of training. The problem with most performance objectives, and particularly crummy ones, is that they don’t provide enough information to direct developers in content, in instructional methods, in media, or in outcomes. This is particularly true if one is designing for development in an LCMS environment. And if we threw out the crummy ones, we’d have nothing left at all. For many ‘designers’ the objective itself, the plan, is the objective – not the resultant performance.
So, how do we get there? It is a simple path: task, content, methods, media, design document. What is important is the process used to derive the objectives and the depth of information and guidance contained in those objectives.
First, stay grounded in the tasks. There is a beautiful symmetry between task lists and instructional designs.
Second, use a taxonomy that SMEs find easy to understand to classify content and the level of performance of content. I have used Ruth Clark’s with great success.
Thirdly, use the content classification to identify methods (Ruth Clark again) that will frame the choice of media, particularly that supporting practice, and frame the functionality of the training device/system.
Finally, optimise media selection based on organisational, cultural, and budget needs.
Learning needs practice, and if you figure out how students are going to practice, then assessment falls into place – after all, we will assess in the same environment that students practice in, won’t we? We’re not going to practice fault finding in the simulator and then have a written exam, are we?
Embedded in this process is the focus on the task and on practice, so analysis needs to focus on tasks, practicing the tasks, and providing the information and tools to support practice. Scenarios, if you will, with workplace standards driven by real manuals and procedures.
And finally, yes, you can prepare objectives – but useful ones that have content specified, that have methods specified, that have media specified, and that can actually train people to qualification.
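To make the idea of an objective as a development specification concrete, here is a minimal sketch in Python of what such a record might contain. The field names and sample values are hypothetical illustrations of “enough information to direct developers”, not the commenter’s format or any particular LCMS schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PerformanceObjective:
    """Hypothetical sketch: an objective treated as a development specification.

    Field names are illustrative only, not a real LCMS schema.
    """
    task: str                   # job task the objective is derived from
    behavior: str               # observable performance expected of the learner
    conditions: str             # tools, information, and environment for performance
    standard: str               # criterion for acceptable performance
    content_type: str           # e.g. fact, concept, process, procedure, principle
    methods: List[str] = field(default_factory=list)  # instructional methods to frame practice
    media: List[str] = field(default_factory=list)    # delivery media, including the practice environment
    assessment: str = ""        # how mastery is demonstrated (same environment as practice)


# Invented example values, purely to show how such a specification might read:
example = PerformanceObjective(
    task="Isolate a hydraulic system fault",
    behavior="Locate and isolate the faulty component",
    conditions="Given the maintenance manual and the system simulator",
    standard="Within 20 minutes, with no damage to serviceable components",
    content_type="procedure",
    methods=["worked example", "guided practice", "independent practice"],
    media=["desktop simulator"],
    assessment="Practical check in the same simulator used for practice",
)
```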
My two cents… I always preferred to think of — and use — formal instructional or performance objectives primarily as a tool that gave the ID a “track to run on” in developing the content. If the content was for ILT (classroom or virtual), then it served a similar purpose for the instructor (no such person in the case of self-paced e-Learning). Such a “track” helps the ID to stay focused, to determine what the main content is, and what the supporting material is.
It is particularly helpful in this way in dealing with SMEs — and who hasn’t worked with Subject Matter Experts who a.) think they are great teachers when they really are not, and/or b.) think *everything* must be taught, including every tangent, every nook and cranny, every side path from the main “track we are running on”, as determined by the formal objectives?
As a proponent of focusing just as much if not more on performance support content and tools and less on formal training, I think formal objectives can aid the ID and instructor on this key decision point as well: What should we include in the training content, what do we really need to tell/show/etc., what do we want them to remember, to store in the precious real estate that is their brain… versus what is better provided in a job aid, checklist, EPSS, on-demand small module, or other performance support resource?
This last point is critical — we’ve all seen the hockey-stick graphs, the increasing-increases as to just how much info/data/etc. is in the world these days. This trend isn’t going away — and as a proponent of David Allen’s GTD method and principles and his mantra to not abuse your brain by using it in ways it wasn’t intended, I think it is vital that we really focus more and more on the “training” vs. “performance support” decision points. Properly using formal objectives can help in this regard — as an ID, keep asking yourself if the next fact, concept, process, procedure, or principle is really necessary to include in the formal training content in order to optimize the chances of the performance objective being achieved for the most learners… or should that bit be saved for a performance support resource instead? In short, as an ID or instructor, never think you can win against the Forgetting Curve — you will lose. Use your objectives to help you stay focused on these critical decision points as you design and instruct.
I of course agree with — apparently both — Marc and Allison that we shouldn’t disparage X (objectives, in this case) by noting the worst examples of X. That is like attacking PowerPoint because some people create bad presentations (you could still attack PowerPoint for other reasons, just not for this reason.)
That all said, I don’t think such formal objectives are very helpful for students/learners to read or have read to them in class. They are formal. They are boring — even the ones that are perfectly written qua objectives for the ID or instructor’s needs for a “track to run on.”
Instead, I think the relevant chunks of content should provide learners/students with both context statements and WIIFMs. The context statement is a brief statement that connects the upcoming topic/lesson/course with what they already know (or what they just learned in the case of a mid-course lesson). I like Marc’s list of “expectations” — but I think most of those are covered in good WIIFMs, which I think are vital for learning motivation, yadda yadda. For these reasons I was always pleased that for a decade at Element K we always made sure our content (whether e-Learning or print courseware) had both formal and informal objectives for each content chunk — geared towards the ID/instructor — and included context statements and WIIFMs — geared towards the learner/student.
So I guess I don’t like adding an “E” to the ABCD as Marc suggests, because that seems to imply that the learner/student will still be subjected to the whole thing. The boring ABCD part, and the more relevant E part. Just give them context statements and WIIFMs at each appropriate point, and save the formal objective for behind the scenes, for the ID during design, and for the Instructor during prep and instruction.
In Allison’s response to Marc at Learning Solutions, I like that she notes a learner can quickly use a list of objectives to determine relevance of a course/etc. for them. That seems fine to me — with the caveat that they aren’t the full formal objectives. I think shorter, informally written objectives will suffice for this benefit for the students who are interested. Again, we provided these for learners in the materials we created at Element K over the years — but we downplayed even informal objectives in favor of the context statements and WIIFMs for each section.
I guess that was more than two cents… hopefully it was worth reading.
Best,
Tom Stone
One of the things I found when I started bringing an objectives-based style into my workplace is that it was an effective way of focusing the entire team on reaching our goals. Because we state the audience and the behavior we want to measure, it enables everyone on the learning team to focus their work in that direction. From a project management standpoint, they are a way of communicating part of how we will reach these desired outcomes. I work on projects where the objectives may never be shown to the learner, but having them led us to develop something we could evaluate for success.
It seems to me that Marc’s post is meant to be provocative, and in the end, he’s a proponent of overtly stating the /exigence/ for the training. When I used to teach rhetoric (i.e., freshman composition), we called it the answer to the “So what?” question.
Instructor/ISD/participant guide: By the end of the training, participants will be able to do X.
Learner: So what?
Is there an argument that we waste learners’ time by stating the objectives in their participant guides or in the first few frames of their e- or m-learning? Yes, we probably do waste their time a little by stating them. For the most part, they will care much more about the answer to the “So what?” question than about instructional objectives.
Really, if you can’t answer the “So what?” question re: why you’re designing the training, developing the training, paying for the design/development of the training, or expecting people to spend time taking the training, then the best-written objectives in the world won’t make your training worthwhile.
Of course, I completely agree that objectives are enormously useful behind the scenes. They are the structural beams that we build training on.
Why do we continue to focus on end-of-training objectives? Doesn’t it make more sense to write objectives for on-the-job performance? I agree with several other comments that instructional objectives are important for the instructor, and that they should be rephrased into informal language for the learner. Why not take it to the next step and make sure the objectives state expected and desired on-the-job application of the learning?
Working in the financial services industry, we have an intense focus on measurable impact based on profit margins and regulatory scrutiny. We can’t just hope for measurable impact. It starts with excellent, well-defined business and performance objectives. In this industry, we can’t afford to live without them.