The title questions are ones that a fellow leadership guru uses within his own organization and recommends to everyone else. They could be asked about our training efforts as well. We all know people who have gone to classes, workshops, seminars, and conferences, then returned to work only to do the exact same things they did before. No change is evident.
I was confronted with this early in my career when several of us took a problem-solving course. When we returned to the factory, a large problem immediately presented itself. "Great," I thought, "I get to try this out." When I suggested to the production superintendent that we work on it together using the methodology we had just learned, the response was, "I don't have time. I have to fix this now." Unfortunately, he didn't fix it, because he didn't take the time to explore potential causes. He assumed a cause and changed an input, but the output didn't change.
A few years ago, I helped set up an education event for non-profit and business executives on how to have better boards of directors. (Business leaders were invited because they sit on many non-profit boards, and we thought this might help those efforts be more efficient and effective.) The event was put on by a coalition of the local Chamber of Commerce and the United Way. We told participants that we expected them to take at least one idea back to their non-profit organizations, and that we were going to follow up with them in three months. After the event, we asked each participant for three ideas they thought they could implement right away, and three ideas they would be willing to work toward implementing in the next three months. When we surveyed them later, we found that 85% of the participants had implemented at least one idea in those three months. To us, this seemed really good. It was certainly a lot different from our normal experience with education events.
We attributed the success to telling everyone up front that we were going to ask how they were going to use the information.
What if you did that every time you sent someone to a training class? Wouldn't they be more likely to pay attention and figure out how the information and skills were going to be useful to them?
There are plenty of studies showing that training is an effective intervention. One meta-analysis found that, based on 'test results,' you could correctly predict 62-65% of the time whether or not a person had been trained. Frankly, the authors of this paper seemed ecstatic (at least in academic terms) that their analysis showed training was effective. I wasn't all that impressed with the size of the change. From a quality management perspective, if we're not getting a full sigma bump in results, it hardly seems worth the effort. (Their analysis showed a bump of a bit more than half a sigma, and less than two-thirds.)
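For readers who want to see how a "percent predicted correctly" figure translates into sigma units, here is a minimal sketch. It assumes a simple model of my own choosing (two normally distributed groups with equal spread, separated by some number of standard deviations and classified with the best single cutoff), which is not necessarily how the meta-analysis did its arithmetic, so treat the numbers as illustrative.

```python
# Illustrative only: maps a classification accuracy to the effect size
# (in standard deviations) that would produce it, assuming two
# equal-variance normal groups and an optimal single cutoff score.
from statistics import NormalDist

std_normal = NormalDist()  # mean 0, standard deviation 1

def accuracy_to_sigma(accuracy: float) -> float:
    """Separation, in sigmas, implied by a given classification accuracy."""
    return 2 * std_normal.inv_cdf(accuracy)

def sigma_to_accuracy(d: float) -> float:
    """Classification accuracy implied by a d-sigma separation."""
    return std_normal.cdf(d / 2)

for acc in (0.62, 0.65):
    print(f"{acc:.0%} predicted correctly -> about {accuracy_to_sigma(acc):.2f} sigma")

# Output: roughly 0.61 sigma at 62% and 0.77 sigma at 65%.
# For comparison, a full one-sigma bump would let you predict
# correctly about 69% of the time: sigma_to_accuracy(1.0) ~= 0.69.
```

Under these assumptions, the reported prediction rates sit noticeably short of the one-sigma bar, which is the point of the comparison.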
But if you simply ask your staff to demonstrate a difference in how they operate based on the new information from whatever workshop, class, seminar, or conference they attended, I bet one of two things will happen: 1) the education will be effective; or 2) they'll stop attending fluff education events, because the benefit of being away from work is not worth the pain of trying to implement something that doesn't apply to their work situation.