Saturday, October 17, 2009

Five Examples of Learning Metrics that Matter




This week I participated in a discussion on learning metrics with my new friends at #lrnchat. What a pleasure it is to be part of these weekly events. Where else can you say things like, “if we start analyzing the problem using the Six Boxes® model based on Thomas Gilbert’s theory, we are sure to make some progress” without people running away to see if someone refilled the punch bowl?

The questions that were tossed around this week were:
  • Are learning metrics different from business metrics?
  • What are you measuring in your organization?
  • How do you tie organizational learning to business performance outcomes?
  • What else besides metrics do you use to show impact?

I was excited and a little bit surprised as I watched the responses roll by. I get so caught up in what I’m doing at work that I lose sight of what is going on in the rest of the world. Some people seemed to be struggling with measurement basics, while others seemed to have a strong grasp on the topic. This got me thinking about what we are doing at my organization, which probably falls somewhere in the middle.


I thought a lot about this discussion after it was over. Really, the heart of the matter is: What value do we as learning professionals bring to the organization? How do we show it? The short answer is that we do this by helping business leaders meet their goals.


I have been fortunate throughout my career to have been involved in a number of projects that made a huge impact on the business. But I can also recall times when it was difficult to see connections between the work that I was doing and the impact it was having on the organization. These are the times when we all fall back on measures such as “butts in seats,” level-one evaluations, and e-learning completion rates. These are important feedback indicators for the training department, but they usually don’t mean very much to business leaders. That being said, I can think of examples in which these measures alone have been important to the business. What it comes down to is that sometimes the measures associated with learning success will be obvious and glamorous, and other times they will not. Here are a few examples of both from my experience:
  1. Sales Training is one of those cases where the metrics are obvious and easy to align to the business. We run a class on how to sell against our competitors effectively. The natural metric associated with this is the number of competitive takeaways. We have clear evidence that sales reps who take this class have a higher rate of competitive takeaways than the general rep population.


  2. Equipment Service Training is another example with obvious metrics. Well-trained service reps fix problems more quickly and are able to tackle more service calls in less time.


  3. Operations Training can have a variety of metrics. We ran a class on how to write work instructions for a part of the business that was very procedurally oriented. Every time they had people leave the department due to promotions or turnover, processes broke down. Our metric was the existence of well-written work instructions in the departments that participated in the training. This helped the business leader keep continuity when there was turnover.


  4. Compliance Training metrics are not very glamorous. This is a case where counting up completions serves the measurement need. Our legal department wants to reduce or eliminate ethics violations, or failing that, be able to prove the company did its part in making employees aware of their responsibilities and the consequences. We make sure everyone goes through our business ethics course and provide a report showing that this has been done.


  5. Soft Skills Training metrics are, of course, the fuzziest area. Yes, we conduct level three evaluations to show behavioral change, but most people don’t get too excited over these. However, our Chief Human Resources Officer is very interested in employee engagement. She knows that offering soft skills training helps people with personal development, which is important to engagement. She also knows that managers and supervisors who use what they have learned in soft skills courses are going to receive higher engagement scores. So yes, we count “butts in seats” for these courses.
The point I’m trying to make with these examples is that the metrics have to match the need. When we help our organization’s leaders meet their goals by identifying and addressing the learning component of their business problem, we are truly adding value.
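To make the first example concrete, here is a minimal sketch of the kind of comparison behind a metric like competitive takeaways. The rep names and numbers are invented for illustration; in practice the data would come from your CRM or LMS.

```python
# Hypothetical takeaway counts per sales rep for one quarter.
# Reps a-c attended the competitive selling class; d-f did not.
trained_reps = {"rep_a": 4, "rep_b": 2, "rep_c": 5}
all_reps = {"rep_a": 4, "rep_b": 2, "rep_c": 5,
            "rep_d": 1, "rep_e": 0, "rep_f": 2}

def avg_takeaways(reps):
    """Average competitive takeaways per rep."""
    return sum(reps.values()) / len(reps)

trained_rate = avg_takeaways(trained_reps)
overall_rate = avg_takeaways(all_reps)
print(f"Trained reps: {trained_rate:.2f} takeaways/rep")
print(f"All reps:     {overall_rate:.2f} takeaways/rep")
```

The comparison itself is simple arithmetic; the value comes from choosing a number the sales leader already cares about, then showing the trained group outperforming the general population on it.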

6 comments:

  1. Mike, I lost the purpose of your message. Yes, some are easier than others (you used obvious examples), and yes, the metrics have to match the need.

    And one of the greatest needs in management is supervisory skills (the soft stuff). Hire for attitude, train for skills. Yes, we can train for skills and measure the outcomes. It's been my experience that too many supervisors don't know how to evaluate attitude (in interviews). So your comments stated the obvious and didn't give me any new insight into addressing (measuring) the soft stuff. BillK

    ReplyDelete
  2. Hi Mike, I found your blog while looking for specific information that your 5 points covered but I would have liked to read a little more detail.

    I am a technical trainer for a large company but my audience is not the internal employees that your 5 points address. Rather they are the installers, developers, designers and integrators who come to learn more about our products.

    Our training program was developed to try to mitigate the calls to our tech support team, since our product is relatively new to a lot of people who are getting into the renewable energy sector.

    I read a lot of blogs, white papers, and articles about the metrics of training, but they all seem to focus on the employees of a company, where the metrics can easily be analyzed for trend analysis (like your blog clearly mentions). However, this isn't the focus of my training. We have seen a drop in the number of tech support calls over the last two years, but we would like to see if it is our training that is reducing those calls or if the attendees have just become more familiar with our products through repetitive installs.

    We have a course feedback form but it is driven by our corporate office and in my opinion is pretty lame. Do you have any suggestions or anecdotal examples for my program?

    Great site, by the way, and I look forward to clicking on the bookmark as the days roll on.

    Greg

    ReplyDelete
  3. Bill K - thanks for commenting on this post. You are right that some of the examples I shared were obvious. My point was that, obvious or not, metrics used to measure learning success have to be tied back to what the business leader is trying to accomplish.

    Measuring soft skills is always a challenge. When we conduct soft skills training, we gather level 3 (behavioral change) feedback through post-training surveys with participants and their managers. These surveys show that most people are doing something differently as a result of having participated in the training. But, as I stated in my post, most business leaders don't get too excited over this. Behavioral changes don't necessarily translate into the business improvements needed. To address this, I do one of two things: use a learning transfer goal management system such as Friday5s, available from the Fort Hill Company, and/or get agreement from the business leader on what the appropriate business measure should be. For example, we run a management development program for high-potential sales managers. The program focuses on topics such as leading teams, coaching, developing business acumen, etc. Our measures of success are retention of these high-potential employees in the organization and their rate of promotion within 1-2 years of program completion. Granted, there are other factors that lead to these metrics, but our business leaders do recognize that the training contributes to the participants' success.

    ReplyDelete
  4. Greg S - Thanks for commenting on my post.

    I don't pretend to be an expert on learning metrics. Like you, I'm just trying to sort out what will be most meaningful to my customers. But for what it's worth, here are my thoughts:

    External training poses a unique challenge. You said that people come to the training to learn about your products. My first thought would be to use a level 2 product knowledge test as one of the measures. But that will probably only get you so far from a business standpoint. I think the answer lies in why the external people are coming to the training. What are they trying to accomplish? My thought here would be to have them communicate their goals to you prior to coming to training. You could then focus delivery on their needs and follow up with them afterwards on whether or not they were able to meet their goals.

    In your follow-up you could also specifically ask something like, "Did the training help you reduce the need to call our tech support?"

    I hope this provides some food for thought. Good luck with your search.

    ReplyDelete
  5. The difficulty is in tying the training directly back. Can you expand on how you've done that? I think that would be interesting. Your blog, for those of us in the industry for years, doesn't tell us much.

    Tell me more about what variables you used to get your metrics. What was it like getting that data, and what was management's response?
    Thanks

    ReplyDelete
  6. Anonymous wrote:

    "The difficulty is in tying the training directly back."

    I agree with that statement wholeheartedly. It is difficult to isolate training's impact on business results when other factors are affecting those results as well. So what we do is adopt the same metrics that the business leader uses to measure his/her success as our level 4 metric. Yes, we do Kirkpatrick levels 1, 2, and 3 as well, but that is more meaningful to us in the training department as feedback on our program design and delivery than it is to the business leader. In the end, when they measure the success of their business initiative and achieve their goals, they attribute some of that success to our efforts. They are grateful to the training department for the contribution we have made to that success.

    Thanks for your comment.

    Mike

    ReplyDelete