Innovations: Fresh Thoughts for Managing

Measure What Matters

“Not everything that counts can be counted, and not everything that can be counted counts.” — often attributed to Albert Einstein
Eileen Whitaker

What is the value of your learning programs? I’m talking about onboarding, security awareness, technology training, professional development — all of it.

When I pose this question, I’m typically met with blank stares, or I’m told the number of attendees in classes or the quantity of classes offered. But what is the value? Attendance counts provide some insight, but metrics that measure butts in seats don’t necessarily indicate value. Peter Drucker, the father of modern management, said, “What gets measured, gets managed.” Given this truth, we must be intentional and plan what gets measured, ensuring we’re measuring what matters.

I invite you to consider four key assumptions when setting expectations for your learning programs and defining value: Purpose, Data, Quality and Effort.

PURPOSE: Learning programs should enable learners to maximize performance, not give trainers another course to deliver.

DATA: Make decisions based on data. Evaluations help maximize value and encourage continuous improvement. Without data, the effect and value of decisions cannot be measured.

QUALITY: Quality is defined by the customer — lawyers and staff. How well you meet their expectations determines success.

EFFORT: To deliver value, action is required. Learning is not a spectator sport.

When it comes to measuring learning effectiveness, training teams share two common challenges: 1) they can’t get their hands on the data, and 2) measuring is time-consuming. Yes, effort is required, and you may have to consider new strategies to obtain the data you need, but the most successful organizations measure.

The traditional framework of learning programs tends to be “develop and deliver.” By broadening that focus to include building awareness, increasing engagement, developing solutions, delivering in compelling ways, and reinforcing with strategies that transfer learning to the job, continuous learning can play a larger role in the business of law.

A learning program is valuable when it drives performance, solves a problem, increases adoption or increases efficiency. It is also essential that you can articulate that value. Value is not the number of people trained or the number who watched the e-learning. It’s not necessarily assessment scores, either, unless they connect to a business benefit.

Here are some tips to get you going on the right path.

Measure what matters to the firm. Know your firm’s business drivers and values. Identify how your learning programs align with the firm’s vision and values. Having clarity on what matters to the firm, and aligning your programs to those drivers, will determine what should be measured. You can’t measure what matters until you know what matters.

Use proven techniques to evaluate learning program effectiveness. Kirkpatrick’s Four Levels of Evaluation, paired with key performance indicators (KPIs), is a proven standard in learning. The four levels are:

  • Level 1, Reaction: To what degree did participants react favorably to the learning event?
  • Level 2, Learning: To what degree did participants acquire the intended knowledge, skills and attitudes based on their participation in the learning event?
  • Level 3, Behavior: To what degree are participants applying what they learned to the job?
  • Level 4, Results: To what degree are targeted outcomes occurring as a result of the learning event(s) and subsequent reinforcement?

Focus first on Level 4, using a top-down approach to define the desired business results of each facet of the program, and state your goals in a measurable manner. Use the Four Levels of Evaluation as a framework for measurement and KPIs to help determine progress toward set goals. KPIs for projects might include finishing on time, being under budget, and staying within scope. For a helpdesk, it might be length of talk time, speed to answer and first call resolution.

With learning programs, effectiveness might look different depending on the type of program and could warrant varying KPIs for each. Some common KPIs for measuring learning quality include user adoption, calls to helpdesk, customer satisfaction, time to competence, training cost per employee, compliance training hours per full-time employee (FTE), percentage of audit plan completed, document health, return on investment (ROI), and return on expectation (ROE).
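Of the KPIs above, ROI is the most arithmetic: net program benefit divided by program cost. As a minimal illustration (the dollar figures and function name are hypothetical, not from this column):

```python
def training_roi(benefit, cost):
    """Return training ROI as a percentage: net benefit over cost."""
    return (benefit - cost) / cost * 100

# Hypothetical: a program costing $20,000 that yields $50,000 in benefit
print(training_roi(50_000, 20_000))  # 150.0
```

A program that merely recoups its cost has an ROI of 0%; anything above that is net value you can report to the firm.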

Measuring is not “one and done.” After you identify your key performance indicators, establish a benchmark or baseline measurement. Then consider the appropriate time interval to repeat the measurement so you can indicate progress and/or know when you’ve met the goal. It’s important to socialize program goals, as you’ll likely need assistance and support from many lateral teams to help capture relevant data.
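The benchmark-then-remeasure loop above amounts to tracking how far each new measurement moves you from the baseline toward the goal. A hypothetical sketch (names and numbers are illustrative only):

```python
def progress_toward_goal(baseline, current, goal):
    """Fraction of the distance from baseline to goal covered so far."""
    return (current - baseline) / (goal - baseline)

# Hypothetical: first-call resolution baseline 60%, goal 80%, latest measure 75%
print(round(progress_toward_goal(0.60, 0.75, 0.80), 2))  # 0.75
```

Reporting progress as a fraction of the gap closed, rather than a raw score, makes dissimilar KPIs comparable on one dashboard.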

Transform your learning programs from “develop and deliver” exercises into programs that drive performance, solve a problem, increase adoption or increase efficiency. Accomplish this by establishing metrics that matter. First, know what brings value to the firm, align your learning programs to the business, and set measurable goals using the Four Levels of Evaluation. Next, establish KPIs for each learning program, determine the benchmark, and set a cadence for continued measurement. Finally, socialize your goals and measurement plans with stakeholders and lateral teams as you seek assistance and support from the helpdesk, library staff, security team and staff supervisors.

Don’t rely on speculation. Instead, confidently measure what matters and allow the data to tell the story.

If you want to position yourself as a strategic business partner to the firm, don’t miss this interactive and highly engaging session at ALA’s Annual Conference & Expo in Grapevine, Texas, where Eileen Whitaker will expand on this column and discuss how to establish metrics that matter.