Course Effectiveness
32 Creative Ways to Design Custom Quiz Results Slides in E-Learning #363
Quiz Results Slides in E-Learning RECAP #363: Challenge | Recap

This week’s challenge asked course designers to share creative ways to customize quiz results slides. Examples include charts and graphs, celebratory animations, custom graphics, and more.

- Example | Alison Sollars | Website
- Example | Roy Phillips
- Example | Jennifer Rowlands
- Example | McKenzie Day | Website
- Example | Download | Max Textor
- Example | Ingrid Cuthbert
- Example | Peter Mercier
- Example | Hilla Schlegel
- Example | Amar Kulshreshtha | Website | @AmarShreshtha
- Example | Emine Sharma
- Example | Yekaterina Martynova
- Example | Download | Jodi M. Sansone | Website | @jodimsansone
- Example | Download | Scott Wilson
- Example | Learn more | Jonathan Hill | Website | @DevByPowerPoint
- Example | Helen Dudley | Website
- Example | Learn more | Cynthia Rondinelli
- Example | Download | Samuel Apata | Website | @afrostem
- Example | Craig Agnew
- Example | Nancy Woinoski | Website
- Example | Morgan Thomas
- Example | Elizabeth Pawlicki
- Example | Ron Katz | Website
- Example | Download | Karin Lorbeck
- Example | Kelsey Corder
- Example | Thierry EMMANUEL
- Example | Frederic Brewer
- Example | Daniel Cañaveral
- Example | Download | Danny Benton | Website
- Example | Jaclyn Blum
- Example | Ron Katz | Website
- Example | Sharlene Daley
- Example | Sharon Plunk

New to the E-Learning Challenges? The weekly challenges are ongoing opportunities to learn, share, and build your e-learning portfolios. You can jump into any or all of the previous challenges anytime you want.
I’ll update the recap posts to include your demos. If you have a blog, please consider writing about your challenges. We’ll link back to your posts so the great work you’re sharing gets even more exposure. If you share your demos on Twitter, please include #ELHChallenge so your tweeps can track your e-learning coolness.

Share Your Quiz Results Slide Examples!

The quiz results slide challenge is still open! If you have one or more ideas you'd like to share, please jump over to the original challenge and post your links in the comments section. I'll update this recap page to include your examples.

Measure The Effectiveness of Your E-Learning Course With Kirkpatrick's 4 Levels of Evaluation
Your e-learning course is all finished and uploaded to your LMS. Your work is done, right? Think again! In our article on the ADDIE ID Model, we explained why evaluating your course is an important part of the instructional design process. After all, how else will you know if your course is effective?

If you want to evaluate your courses but are unsure how to go about it, we’ve got the answer: the Kirkpatrick Model, developed by Dr. Don Kirkpatrick in the 1950s and recently refined by the Kirkpatrick Partners Organization. It’s just what you need to determine if your course is effective. This model is based on the idea that there are four main factors that determine the effectiveness of your training: learner satisfaction, knowledge or skill acquisition, application of new knowledge or skills on the job, and the achievement of final goals. Let’s take a look at each level and how to measure it.

Level 1: Learner Satisfaction

The first, and easiest, aspect to measure is the learner’s reaction to the training. Did they enjoy it? Did they find it meaningful and relevant? Learner satisfaction is key, since motivation plays a big role in knowledge acquisition and retention. If learners find your course relevant and engaging, they’re more motivated to pay attention—and therefore more likely to actually learn and retain the information in the course.

The easiest way to find out what learners thought of your course is to have them fill out a short questionnaire at the end. When you measure learner satisfaction, strive to ask meaningful questions such as, “Was the information relevant?” or “Identify a work situation where you’ll use the new skills you’ve acquired.” These are solid questions about the content learned, not about unrelated issues, such as whether they had any technical problems when accessing the course. Check out this article for more information on measuring learner satisfaction.
If you need help coming up with other relevant questions, be sure to check out this list of 60+ Questions to Include in a Post-Course Evaluation. Remember: post-course evaluations alone are not enough to determine the success of your course, but they are a good place to start. For more information, check out this article: Post-Course Evaluations: What E-Learning Designers Need to Know.

Level 2: Knowledge/Skill Acquisition

This second aspect is pretty straightforward: how much of what they were supposed to learn did your learners actually learn? An easy way to measure how much the audience learned is to include a pre-test and a post-test. For example, ask your learners to rate themselves on a scale of 1-5 for how well they can do a task before the training, and have them do the same rating post-training. By comparing their initial score to their score after taking the course, you can determine if there’s an improvement. If you’d like some tips and considerations for using pre-tests, check out Why and How I Created This Pre-Test in Rise 360!

Level 3: Application of New Knowledge/Skills

It’s all well and good to measure how much people have learned, but what really matters is how much of that new knowledge they can apply on the job. In some cases, this is easy to measure. For example, if the performance issue is quantifiable, all you have to do is compare the “before” numbers to the “after” numbers. In other cases, when the performance issue is not easily quantifiable, it can require close observation and analysis of the learner’s behavior. The best way to do this is to have a supervisor or manager work closely with the learner to assess their behavior both before and after the training. You can then gather the information through surveys, observation, work records, and/or interviews with the managers and learners themselves.

Level 4: Achievement of Expected Outcomes

The last thing to measure is to what extent your course produced the desired business outcomes.
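Looking back at Level 2, the pre-test/post-test comparison can be made concrete in a few lines. The sketch below uses made-up learner names and 1-5 self-ratings purely for illustration; it simply averages each learner's improvement between the two ratings:

```python
# Minimal sketch of a Level 2 (knowledge/skill acquisition) check:
# compare hypothetical pre- and post-training self-ratings (1-5 scale).
# All names and scores below are invented for illustration.

pre_scores = {"Ana": 2, "Ben": 3, "Chloe": 1, "Dev": 4}
post_scores = {"Ana": 4, "Ben": 4, "Chloe": 3, "Dev": 5}

def average_gain(pre, post):
    """Mean rating improvement across learners who completed both tests."""
    gains = [post[name] - pre[name] for name in pre if name in post]
    return sum(gains) / len(gains)

print(f"Average gain: {average_gain(pre_scores, post_scores):.2f} points")
```

A positive average gain suggests learning took place; breaking the gains out per learner or per task would show where the course helped most.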
If you followed the ADDIE model and did your up-front training needs analysis, then you’ve likely identified an expected cost-benefit. This is when you revisit that cost-benefit and compare the results to the business objectives that drove you to create the course, such as reduced costs, increased sales, or higher productivity.

Now What?

There’s no use in evaluating your course if you’re just going to file away the results. If the evaluation shows your course is not as effective as you’d like, consider revising it. If the evaluation shows it’s highly effective, you know you’re on the right track and can keep doing what you’re doing.

Like this article? Subscribe to our newsletter to get the latest e-learning inspiration and insights directly in your inbox. You can also find us on LinkedIn and X (formerly Twitter). And if you have questions, please share them in the comments.

7 Tips for Writing Effective Training Evaluations
Well-designed training should keep the learner and their experience front and center. The material should be relevant and impactful, and the design logical. As an instructional designer, one of the things I constantly ask myself is, “Am I meeting my learners’ needs?” This can be a challenging question if you don’t have a direct line to your learner—and many of us don’t.

So, how do you know if your training is in line with your learners’ needs? With an inviting post-training survey that’s well-structured and well-written. A post-training survey isn’t just a way to gather data. It’s a window into your learners’ experience and your personalized path for continuous improvement. Writing effective surveys is your key to unlocking this valuable information. Follow these seven tips for writing effective post-training surveys and you’ll be evaluating like a pro in no time!

1. Align your questions with desired learning outcomes

When you design a post-training survey, write your survey questions and assessment questions up-front to align with the learning objectives. It’s a good idea to tie one survey question to the primary learning objective(s) of the course. For example, if my learning objective is “By the end of this training you will be able to list five tips for writing effective surveys,” a relevant survey question might be “As a result of this training, can you list tips for writing effective surveys?” Even though the survey isn’t a post-test, it’s still helpful to gauge whether the learner feels they’re retaining what you’ve taught them.

2. Write questions that give you measurable data

An important aspect of writing effective surveys is ensuring you’re asking the right questions. For example, if you want to know if the course was relevant to your learner’s job, ask them to rate how relevant the course was to their job on a scale of 1 to 5.
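To show why scaled questions beat open-ended ones for analysis, here is a minimal sketch of how 1-to-5 ratings aggregate into measurable data. The response values are hypothetical, invented only for illustration:

```python
# Sketch: summarizing hypothetical 1-5 relevance ratings into
# measurable data (mean, distribution, share of favorable ratings).
from collections import Counter

ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]  # one rating per respondent

mean = sum(ratings) / len(ratings)
distribution = Counter(ratings)  # how many respondents gave each rating
favorable = sum(1 for r in ratings if r >= 4) / len(ratings)

print(f"Mean rating: {mean:.1f}")
print(f"Distribution: {dict(sorted(distribution.items()))}")
print(f"Rated 4 or 5: {favorable:.0%}")
```

Numbers like these can be tracked release over release, whereas a stack of free-text comments has to be read and coded by hand before it tells you anything comparable.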
Asking an open-ended question like “Did you like this course?” may not get you measurable data without you wading through a bunch of comments.

3. Keep the survey (and questions) short and sweet

Just like with e-learning courses, learners can get overwhelmed by a survey that’s too long. When they’re overwhelmed, they’re more likely to give hurried answers that aren’t relevant or, worse, skip the survey altogether. If you want to gather valid and impactful information from as many learners as possible, keep your survey short and to the point. As a best practice, cap your survey at five questions and keep them to a single sentence each. Also, avoid open-ended questions and provide your learner with different response options—five is standard.

4. Avoid vague or leading questions

When you draft your survey, write questions that are clear and unbiased. Avoid leading questions that influence your learners’ responses. Also, stay away from vague questions that bury the actual question. In the example below, you can see how the revised question is clearly stated and doesn’t imply an answer.

Vague question: “In a world where people are using e-learning for most things, was this course able to help you better learn how to design it?”
Concise question: “After taking this training, do you feel capable of creating effective e-learning courses?”

To check your writing, put yourself in your learners’ shoes. Read the question you wrote and ask yourself, “Does this make sense?” and “Does this question have the answer written into it?”

5. Beware the nested question

Nested questions are questions within a question. They’re one of the easiest ways to skew survey results: you can never tell which question the learner was answering! Check out the example below to see what I mean.

Nested question: “How would you rate the impact of this article on your survey-writing skills and on your overall instructional design skills?”
Single question: “How would you rate the impact of this article on your survey-writing skills?”

6. Ensure your survey follows visual design principles

Just like your e-learning course, make sure your survey follows some basic visual design principles. This makes your survey inviting and easy for all learners to access. Don’t use distracting colors or graphics, and make sure you have good contrast between the text and background. Also, if possible, show only one survey question at a time. If your survey is long or asks more in-depth questions, it can be helpful to include a progress bar.

7. Write your survey so it aligns with adult learning theory

Let’s face it, everyone is busy. It can be difficult to convince learners to begin—much less complete—a survey. Keep this in mind and write your surveys (like your e-learning!) to follow adult learning theory. Tell your learner up-front what’s in it for them. If you can help your learner understand why they’re taking the survey and how the information will be used, they’ll be more invested. They’re also more likely to complete the whole survey and provide thorough answers.

The Bottom Line

Writing post-course surveys is equal parts art and science. By following the tips I’ve shared here, you’ll be off to a great start! For more advice on creating effective surveys, check out these helpful resources:

- How to Measure the Satisfaction of Learners Taking Your Online Courses
- Post-Course Evaluations for E-Learning: 60+ Questions to Include
- Post-Course Evaluations: What E-Learning Designers Need to Know

And remember to subscribe to our newsletter to get the latest e-learning inspiration and insights directly in your inbox. You can also find us on LinkedIn and X (formerly Twitter).

Tips for Creating Effective Post-Course Evaluations
As e-learning and instructional designers, we create training materials that are intended to improve performance and impact a business’s bottom line. In addition, we hope to create a learning experience that will be positive and engaging for the learner. But how can we measure the success of our training, demonstrate the value, and prove there have been changes in performance? One key way is with solid evaluation methods.

Evaluation is part of ADDIE, one of the most commonly used instructional design models. In fact, the E in ADDIE stands for Evaluation. For guidance on how and when you should evaluate, many instructional designers use the Kirkpatrick Model, which outlines four levels of evaluation designers should consider throughout the training development process. The four levels of evaluation measure:

- Reaction: The learner’s reaction or opinion of the training.
- Learning: Whether learners acquired the intended knowledge, skills, and attitudes.
- Behavior: Whether learners apply what they learned on the job.
- Results: Whether business goals were reached as a result of the training.

One way to get insights and measurements into level 1 (Reaction) is through questionnaires distributed to learners at the end of a course. These post-course evaluations have questions designed to gather information about learners’ reactions and opinions about the training. It’s important to remember that post-course evals measure participants’ opinions; the information gathered with these questionnaires is subjective, not objective. And while they might be informative and insightful, they should ideally go hand in hand with analysis and measurement of performance metrics, such as KPIs (read more: Use KPIs to Make the Business Case for E-Learning). Still, through these questionnaires you can glean some crucial, though subjective, insights into your learners’ experience.

When you craft the questions you want to include in your post-course evaluation, focus on performance- and task-based questions.
Here are examples of the types of questions you might want to include:

- Rate your level of confidence with the new skills acquired on a scale of 1-5.
- How do you feel you can apply what you’ve learned on the job?
- Do you have a task in the near future that allows you to apply the new skill/knowledge?
- Identify a specific work situation where you will apply what you’ve learned.

These questions address performance and application of new knowledge and skills, which should be the end goal of training. Answers to these types of questions will give you insights into how the training impacted the bottom line. The hope is that with the data collected you can make improvements to existing and future courses, and provide supplemental or additional training materials if needed.

Hopefully this information and the sample questions help you feel more confident about creating your own post-course evaluation. Let me know how it goes in the comments below! Follow us on Twitter and come back to E-Learning Heroes regularly for more helpful advice on everything related to e-learning. If you have any questions, please share them in the comments.

Everything You Need to Know About Measuring and Showing the Value of Training
For folks who aren’t familiar with professional training, instructional design, and e-learning, it can be tough to see how investing in strong training programs can produce results for an organization. That means it’s up to us on the training team to show how training can help improve productivity, drive innovation, and boost morale. We’ve got the in-depth guides you need to assess the training needs at your organization, measure the impact of your training programs, and show the overall value you’re delivering:

- All About Training Needs Analysis
- Use KPIs to Demonstrate the Value of E-Learning
- How To Calculate the Cost-Benefit of E-Learning
- Here’s How to Prove the Value of Training to Your Organization
- 2 Ways to Show the Value of Online Training