Maximize Your Program Outcomes with a Training Evaluation Sonar Plan – Part 4

October 9, 2013

The navy must always be prepared before putting to sea. Submariners do not wait until they are in the middle of the ocean to think about how they will navigate their vessels and respond to threats.

They prepare for every eventuality and engage the sonar upon launch, monitoring it carefully throughout the voyage and making adjustments as required to stay on course and avoid danger.

How can you plan your evaluation efforts as diligently as if you were a navy submariner? Read on to find out.

Click here to learn more about stakeholder involvement.

Evaluation tools should be built during the design and development of a program. Don’t wait until the training is complete to think about how you will monitor on-the-job application and results. Building the tools and scheduling future pings during the design and development process increases the likelihood that the tools will actually be used and the pings will occur. Click here to learn how to automate this process.


Scale Your Training Evaluation Plan to the Importance of the Initiative

The amount of training evaluation sonar required is directly proportional to the importance of the initiative to the organization. Routine training programs can incorporate passive and active sonar that does not require much time and effort:


Sample Routine Training Evaluation Plan

Levels 1 and 2:

Levels 3 and 4:

  • Self-monitoring tool (introduced during training)
  • Delayed survey
  • WLP calls to a few supervisors to ask if they have seen the desired on-the-job performance

Mission-critical programs should include multiple passive and active techniques at each of the four levels. This helps ensure that potential implementation problems are identified and resolved before they reduce the organizational impact of the initiative:


Sample Mission-Critical Training Evaluation Plan

Levels 1 and 2:

  • Pulse checks
  • Numerous activities
  • Post-program survey
  • Post-program interviews (if indicated by survey results)

Levels 3 and 4:

  • Self-monitoring tool (submitted to supervisor or WLP weekly)
  • Supervisor observation checklist
  • Delayed survey
  • Dashboard
  • Interviews or focus group
  • Executive modeling and/or supportive messages

Click here to learn more about determining the proper degree of evaluation for various types of programs. 

With a bit of advance planning, you will maximize the impact of your training and be able to demonstrate its organizational value with the data you collect. You can learn much more about creating such a plan in the Kirkpatrick Four Levels® Evaluation Certification Program. We will soon launch an automated tool to assist you with this process as well. Click here to register for the information webinar on Oct. 17.

Finally, hats off to our armed forces. We appreciate you!



Additional Resources:

The other quick tips in this series

Kirkpatrick Four Levels® Evaluation Certification Program

Automating Kirkpatrick Evaluation: Introducing the Kirkpatrick New World Edition of Metrics That Matter®

Don’t Have Them Sign Off, Have Them Sign On

The Training Is Over…Now What?

Training Evaluation Mistake #3: Using Up All Training Evaluation Resources on Levels 1 & 2

The Brunei Window Washer: Bringing Business Partnership to Life
