
Feedback Is Overrated. I'll Prove It.

 
 One of the old jokes in the travel industry is about the value of feedback. A couple visits your resort and is encouraged to jot down a few notes about their experience.
 The accommodations, price, and staff were excellent, but it rained all weekend. 3/5 stars

Maybe I'm just a curmudgeon, but don't we all cringe a little when someone mentions a review from Yelp or Glassdoor? It's a good thing to know how your audience feels, but it would help if we understood the purpose of feedback and stopped treating it as a measuring tool. Feedback is a form of data that is best used for calibration, not measurement. The central problem with feedback is that it measures how someone reports they feel, not how they actually feel. This leads to the most egregious use of feedback: global action based on individual reports.

So what's wrong with feedback? It sounds naughty just to question its value, but if you're willing to join me, we'll take this ride together. Two problems stand out:

1) Feedback is easily gamed by how you solicit it and how you design the data gathering.

2) Feedback is by definition a tool of judgment. When someone is judging, they apply a different standard. Feedback makes everyone an authority.

If we learned anything from the early days of Yelp, it should be that.

Think of your local car dealer. They send surveys after every visit, and did you know the staff are judged and paid by those results? Anything less than a 5-Outstanding-Perfect-Totally-Satisfied is considered not good enough. I've worked with dealers before, so I know this, which means I'm not going to ding our service advisor over a mistake, because anything less than a 5 hurts them. If you're keeping score at home, that means my feedback is tainted. But what if you don't know this? What if you're the kind of person who doesn't believe in perfection? The dealer still puts a survey card in your hand.

Feedback is not supposed to measure perfection. It's not even supposed to measure individual results. It is most useful when it's combined with results-oriented data. Feedback identifies problems when results are not met. A result is something like: Did they sign on the line that is dotted? Did they perform the task they were trained to do? Did they return the product to the store later? Did their measurable performance change relative to the management directive?

Once you have measured a result, feedback creates a useful counterpoint. What is the relationship between the feedback and the result? Is there a trendline that matches the results? Did reported satisfaction rise or fall in comparison to the results? Feedback can serve as a warning light or a go-ahead signal, but on its own it creates a false story that consistently fails to deliver. Many a failed project is met with complaints that all of the feedback was positive before it started. It's not a convincing argument. Neither is blaming high employee turnover on store conditions that made it impossible to get 5's on your survey cards.
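If you want to see that calibration idea in code, here is a minimal sketch. It's Python, the quarterly numbers are invented, and the series names (feedback, results) are my own - nothing here comes from a real survey tool.

    # A minimal sketch, not a real analytics pipeline. Assumes you already
    # collect two series per period: an average survey score (feedback) and
    # a results metric (hypothetical here: closed deals per quarter).
    from statistics import mean

    def pearson(xs, ys):
        """Plain Pearson correlation; near +1 means feedback tracks results."""
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var_x = sum((x - mx) ** 2 for x in xs)
        var_y = sum((y - my) ** 2 for y in ys)
        return cov / (var_x * var_y) ** 0.5

    # Invented quarterly numbers: glowing surveys while results slide.
    feedback = [4.6, 4.6, 4.7, 4.6]   # what people report
    results = [18, 15, 12, 9]         # what actually happened

    r = pearson(feedback, results)
    if r < 0.5:
        print(f"warning light: feedback diverges from results (r={r:.2f})")
    else:
        print(f"go-ahead: feedback roughly tracks results (r={r:.2f})")

The exact number doesn't matter; the divergence does. Glowing surveys over falling results is exactly the false story described above.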

Feedback isn't terrible. Why, here's an example of how feedback was personally useful. A few years ago, a webinar I presented received a number of 3 ratings on a survey scale that went to 5. A typical webinar of mine averaged above 4. Now, the feedback rated me highly as a presenter, and there were no specific complaints pinpointing the problem. How could the webinar be rated average if I was rated highly?

The webinar was new material, and I suspected the real reason was that the webinar was unfocused. The content did not deliver what was promised, but the audience liked me and didn't want to penalize me with low ratings (just as I wouldn't ding my service advisor). They registered their displeasure with 3's overall but gave me individually a 5. More important - there were no positive comments about the webinar delivered outside of the survey. Most of my webinars prompt a few individuals to write specific words of praise for what they learned, sent in by email. In this case, no one went the extra mile. Those two clues - an average rating and a lack of extra praise - led me to completely redo the material for the webinar. A later version of the webinar rated above 4, with two emails thanking me for the content.

It wasn't the feedback that was important here. It was the difference in the feedback. The solicited feedback signaled a downward trend compared to my other work. That could have meant the topic was simply not of interest, but the lack of unsolicited feedback is what identified the content as the problem. In both cases, the feedback on the quality of the webinar was an imprecise measurement, but it was a flag that made me look deeper.
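To put that "flag, don't measure" lesson in code, here's a toy sketch of the two clues. The baseline average, the drop threshold, and the praise-email count are all my own invented inputs, not from any survey system.

    # Toy sketch of the "flag, don't measure" idea. Baseline, threshold,
    # and praise-email counts are hypothetical inputs.
    BASELINE = 4.2        # my typical average across past webinars
    DROP_THRESHOLD = 0.5  # how far below baseline before I investigate

    def needs_rework(avg_rating: float, praise_emails: int) -> bool:
        """Flag content for a deeper look; the rating alone proves nothing."""
        trending_down = avg_rating < BASELINE - DROP_THRESHOLD
        no_extra_mile = praise_emails == 0
        return trending_down and no_extra_mile

    print(needs_rework(avg_rating=3.1, praise_emails=0))  # True: redo it
    print(needs_rework(avg_rating=4.5, praise_emails=2))  # False: it landed

Note that the function only flags; the decision to redo the material still came from going back and looking at the content itself.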

But what is the value of feedback? For me, it prompted improvement, but the real measure would be whether, one month after the webinar, the habits of the attendees changed. Their personal feelings have an impact on their ability to learn, but the KPI for their managers should have been performance, not whether they enjoyed my presentation. Feedback, in this case, was useful in alerting me to a problem - but results-oriented data is what is relevant to the person purchasing the webinar.

Let's think about a conversation between a recruiter and their manager after the webinar.

Manager: How was it?
Recruiter: It was good. I learned some new information, and really liked the way the presenter talked us through the changes in the software.

That sounds pretty positive, but is it accurate? 

Let's say the recruiter was underperforming and, as part of their yearly review two weeks ago, was instructed to attend training to improve. When the manager asks, "How was the webinar?" the recruiter has a lot of background to process that has nothing to do with the webinar. If the recruiter praises the webinar, they're taking responsibility for improved performance. "I learned something new" carries the subtext "and now I'll do better." What were they supposed to say? If the recruiter says, "The webinar was average and a rehash of what I already knew," they might gain an advantage. They were supposed to learn, but the webinar didn't help them, so it's not their fault if they don't perform.

At the same time, if they say they already know the material, they're claiming a knowledge base that is not leading to solid performance. If they say the webinar was average, the manager could very well conclude that the recruiter is neither interested in learning nor capable of it, believing themselves an expert without the performance to back it up. Talk about a minefield.

But wait, there's more! If the recruiter says the webinar was not useful, the manager is confronted with the fact that they paid for training that wasn't any good. If the manager can't select useful training, that is a sign they made a mistake. Telling the boss they wasted money brings yet another problem to the table.

With all of that swirling in their head, the recruiter gets an email survey asking their opinion of the webinar. Gulp. 

 I was just trying to get some feedback, and now we're about to fire a lazy recruiter who works for a clueless boss! 3 out of 5 stars, because it was raining. 

Feedback is complicated because people are complicated. Our memories are faulty, our justifications are self-serving, and what we need to function is poorly understood by the world's best neuroscientists, much less a manager with a business degree and two undergraduate psych courses from 1998.

And yet we are addicted to that sweet, sweet feedback. It is a requirement that we get feedback in a 360-degree circle, from all stakeholders, and customers, and vendors. So what's the solution? I do have one. It's a simple one, involving only the shift key on your computer. It's so simple, I'm going to make a picture to share it on Facebook. Here it is.

Feedback is not Data. 

feedback is data. 
