There are some good arguments for adding input to the appraisal beyond the supervisor's own experience:
- We act differently around the supervisor than we do around our peers, other managers and external agents. Only the rare individual doesn't modify his behavior depending on the social context.
- Similarly, the more input the supervisor has, the more thoughtful the analysis can be in affirming strengths and providing options for career or job development.
However, there are too many pitfalls to make 360 Feedback worthwhile.
- We have a bias toward a Habit of Perception. The way we perceive someone's behavior, approach and results rarely changes unless the other person makes an extraordinary effort to be different. If we thought a person was rude two years ago, 95% of the time we'll report that he is still rude. Likewise, if we thought she was a godsend to the department when she arrived five years ago, she still walks on water today.
- The tool is no more objective when executed by a group of evaluators than by one person. It's still subject to a multitude of prejudices--favoritism ("I like him"); gender, race and age bias; categorization (stereotypes about groups of people, e.g. computer technicians, accountants, engineers, machinists, southerners, northerners, midwesterners). When the supervisor gets the pile of appraisal data, he or she will inevitably find confusing information which, for any particular attribute, amounts to "some people like you and others don't."
- Some people will rate high; that is, they will be lenient. Perhaps they hope for some leniency themselves, or they "don't want to complain." Nor do they want to make the other person feel bad, or have that person angry at them for what they said.
- On the other hand, some will be overly critical: "He is not as good as me, and I was rated average. Therefore, my ratings for him will be average or below." Some people never rate highly because "nobody walks on water."
- Only an employee who is exceptionally good or bad in another person's eyes will receive feedback from any one rater that differs significantly from all the others. Otherwise, you can predict any one feedback provider's scores by looking at their rating sheets for other people. We tend to grade everyone within a narrow range of possible scores.
- Goodness, or badness, in one area spills over into our view of other aspects of the person's performance. If we really like a person's punctuality and dependability on assignments, and we know only a little about the quality of their work, we will assume the quality to be really high and rate them accordingly.
- The most recent, emotionally charged event will overwhelm our memory of the other person's performance for the review period, usually 12 months. Unless we all keep a diary of our interactions, likes and dislikes regarding all of our teammates' performance, we will not remember her mundane good performance, nor his consistently below-average results, if we have a more recent, startling interaction with the person.
- The company's or department's overall results will shade the feedback, and could be falsely attributed to the individual. If the team is winning, the individuals must be doing good work. If the team is losing, then the poor slob whose number came up for review is the cause. In addition, we often think our own success is because we're good, but our failure is someone else's fault (management, the economy, the poor slob whose number came up, etc.).
Unless your group is demographically and culturally homogeneous, and its members have similar personalities and life experiences, you will receive a confusing, disparate set of feedback.
Of all the factors that undermine the value of 360 feedback results, the most common is the Habit of Perception. I've seen this in my own reviews and in the feedback I received about peers and subordinates. In fact, the feedback didn't change for several years, so I stopped soliciting it except every other year or every third year. Not that I expected things to change in that amount of time; I was just trying to keep my time and the raters' time from being wasted.