3 Biggest Training Evaluation Mistakes Learning & Performance Professionals Make

September 10, 2020

While many things in our world have recently changed, a few things remain constant. Among them is the misguided way that many organizations approach training and attempt to show its value.

Here are the three biggest training evaluation mistakes. Are you making them? If so, don’t worry. We will show you how simple it is to correct them.

Mistake #1: Addressing Evaluation Requirements After a Program Has Launched

All too frequently we receive calls from internal practitioners and training consultants alike who have spearheaded multi-million-dollar leadership development programs without first defining specific, tangible outcomes.

Of even more concern, they also have not identified exactly what the managers involved in the program should do to influence those metrics, nor have they prepared senior managers to coach and monitor performance.

This approach nearly guarantees that there will be little or no value to report. Ultimately, it creates “nice to have” programs that are easily cut at times like now, when budgets are tight.

To avoid this pitfall, use the Kirkpatrick Model in reverse:

  1. Start every project by identifying the key company metrics you plan to influence, and articulate how influencing them will contribute to your organization’s Level 4 Results.
  2. Then, determine what really needs to occur on the job to produce those results (Level 3 Behavior).
  3. Next, consider what training or other support is required for workers to perform well on the job (Level 2 Learning).
  4. Finally, consider what type of training will engage participants and be seen as relevant, so they react favorably (Level 1 Reaction).

If you are not sure what questions to ask up front to clearly define program outcomes, join us in our newest certification program, the Kirkpatrick Strategic Evaluation Planning Certification Program. To read a success story, pick up a copy of Bringing Business Partnership to Life.

Mistake #2: Spending the Majority of Your Training Evaluation Resources on Level 1 Reaction and Level 2 Learning

Training professionals invest nearly 70 percent of their training evaluation resources in Levels 1 and 2 (Evaluating Learning: Getting to Measurements That Matter, ATD 2016). Sadly, these are the least important levels, and this pattern leaves few resources for the more important job of ensuring training effectiveness at Level 3 Behavior and Level 4 Results.

Level 3 is the most important level not only to evaluate but also to invest in for any important program. Without on-the-job application, training has no hope of contributing to organizational results and is therefore of little value to the organization. If your program is important enough to have a Level 3 plan, then it is also important enough to have an evaluation of Level 4 Results.

If you are mystified by how to get to Levels 3 and 4 in your training evaluation, attend the Kirkpatrick bronze level certification program, or pick up our latest book.

Mistake #3: Relying Solely on Standardized Surveys

Some believe in the existence of a miracle survey that will give you all the training evaluation data you need. Don’t buy it. For mission-critical programs, it is important to employ multiple evaluation methods and tools to create a credible chain of evidence showing that training improved job performance and contributed measurably to organizational results. For less important programs, you will want to be equally careful about selecting the few evaluation items you require.

Surveys, particularly those administered and tabulated electronically, are a wonderfully efficient means of gathering data. However, response rates tend to be low, and there is a limit to the types of information that can be gathered. It is so easy to disseminate these surveys that they are often launched after every program, no matter how large or small. The questions are not customized to the program or the need for data, and people quickly pick up on the garbage in–garbage out cycle. This creates survey fatigue and makes it less likely that you will gather meaningful data for any program.

Learn how to build an evaluation plan that works in Kirkpatrick bronze level certification, or follow our advice in Kirkpatrick’s Four Levels of Training Evaluation.

Conclusion

It is also a mistake to simply feel overwhelmed and do nothing to meaningfully evaluate your key programs or change ineffective practices. If you are unsure of what to do next, please contact us. We are happy to help you. 
