Feedback Is Overrated. I'll Prove It.

 
One of the old jokes in the travel industry is about the value of feedback. A couple visits your resort and is encouraged to jot down a few notes about their experience.
The accommodations, price, and staff were excellent, but it rained all weekend. 3/5 stars

Maybe I'm just a curmudgeon, but don't we all cringe a little when someone mentions a review from Yelp or Glassdoor? It's good to know how your audience feels, but it would help if we understood the purpose of feedback and stopped treating it as a measuring tool. Feedback is a form of data that is best used for calibration, not measurement. The central problem with feedback is that it measures how someone reports they feel, not how they actually feel. That leads to the most egregious use of feedback: global action based on individual reports.

But what is feedback? It sounds naughty just questioning its value, but if you're willing to join me, we'll take this ride together.

1) Feedback is easily gamed by how you solicit or design the gathering of data.

2) Feedback by definition is a tool of judgement. When someone is judging, they apply a different standard. Feedback makes everyone an authority. 

If we learned anything from the early days of Yelp, let us learn that.

Think of your local car dealer. They hand out surveys after every visit, and did you know the staff are judged (and paid) by those results? Anything less than a 5-Outstanding-Perfect-Totally Satisfied is considered not good enough. I've worked with dealers before, so I know this, which means I'm not going to ding our service advisor over a mistake, because anything less than a 5 hurts them. If you're keeping score at home, that means my feedback is tainted. But what if you don't know this? What if you're the kind of person who doesn't believe in perfection? The dealer still wants to put a survey card in your hand.

Feedback is not supposed to be used to measure perfection. It's not even supposed to measure individual results. It is most useful when it's combined with results-oriented data. Feedback identifies problems when results are not met. A result is something like this: Did they sign on the line that is dotted? Did they perform the task they were trained to do? Did they return the product to the store later? Did their measurable performance change relative to the management directive?

Once you have measured a result, feedback creates a useful counterpoint. What is the relationship between the feedback and the result? Is there a trendline that matches the results? Did reported satisfaction rise or fall in comparison to the results? Feedback can serve as a warning light or a go-ahead signal, but on its own it creates a false story that consistently fails to deliver. Many a failed project is met with the complaint that all of the feedback was positive before it started. It's not a convincing argument. Neither is high employee turnover because the conditions in the store made it impossible to get 5's on your survey cards.
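To make that concrete, here's a rough sketch of what I mean by reading feedback against results instead of treating it as the result. The monthly numbers, the close rate, and the crude trend calculation are all made up for illustration; the only point is that a divergence between the two lines is a flag to investigate, not a verdict.

```python
# Toy illustration: compare a feedback trendline against a results trendline.
# All numbers are hypothetical; the point is the comparison, not the values.

from statistics import mean

# Monthly averages from survey cards (1-5 scale) and the result you actually
# care about (say, the share of deals signed on the line that is dotted).
feedback = [4.5, 4.4, 4.6, 4.6, 4.5, 4.7]        # reported satisfaction, holding steady
results = [0.32, 0.30, 0.27, 0.24, 0.21, 0.19]   # close rate, trending down

def trend(series):
    """Crude trend: average of the last half minus average of the first half."""
    half = len(series) // 2
    return mean(series[half:]) - mean(series[:half])

feedback_trend = trend(feedback)
results_trend = trend(results)

# The interesting case: results are falling while feedback holds steady or rises.
# That divergence is the warning light -- it tells you to look deeper, not to
# conclude that everything is fine because the survey cards say so.
if results_trend < 0 and feedback_trend >= 0:
    print("Flag: results are declining but reported satisfaction isn't. Investigate.")
else:
    print("No divergence flagged; read the feedback alongside the results anyway.")
```

Run against real numbers, the only output worth acting on is the flag, not the scores themselves.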

Feedback isn't terrible. Why, here's an example of how feedback was personally useful. A few years ago, a webinar I presented received a number of 3 ratings on a survey scale that went to 5. A typical webinar of mine averaged above 4. Now, the feedback rated me highly as a presenter, and there were no specific complaints pinpointing the problem. How could the webinar be rated average if I was rated highly?

The webinar was new material, and I suspected the real reason: the webinar was unfocused. The content did not deliver what was promised, but the audience liked me and didn't want to penalize me with low ratings (just like my service advisor). They registered their displeasure with 3's overall, but gave me individually a 5. More important - there were no positive comments about the webinar delivered outside of the survey. Most of my webinars have a few individuals who write specific words of praise for what they learned, sent in by email. In this case, no one went the extra mile. Those two clues - an average rating and a lack of extra praise - led me to completely redo the material for the webinar. A later version of the webinar rated above 4, with two emails thanking me for the content.

It wasn't the feedback that was important here. It was the difference in the feedback. The solicited feedback signaled a downward trend in comparison to my other work. That could have meant the topic simply wasn't of interest, but the lack of unsolicited feedback is what identified the content as the problem. In both cases, the feedback on the quality of the webinar was an imprecise measurement, but it was a flag that made me look deeper.

But what is the value of feedback? For me, it drove an improvement, but the real measure would be whether, one month after the webinar, the habits of the attendees changed. Their personal feelings have an impact on their ability to learn, but the KPI for their managers should have been their performance, not whether they enjoyed my presentation. Feedback, in this case, was useful in alerting me that there was a problem - but results-oriented data is what is relevant to the person purchasing the webinar.

Let's think about a conversation between the recruiter and the manager after the webinar.

Manager: How was it?
Recruiter: It was good. I learned some new information, and really liked the way the presenter talked us through the changes in the software.

That sounds pretty positive, but is it accurate? 

Let's say the recruiter was underperforming and, as part of their yearly review two weeks ago, was instructed to attend training to improve. When the manager asks, "how was the webinar," the recruiter has a lot of background to process that has nothing to do with the webinar. If the recruiter praises the webinar, they're taking responsibility for improved performance. "I learned something new" has the subtext "and now I'll do better." What were they supposed to say? If the recruiter says, "the webinar was average and a rehash of what I already knew," they might gain themselves an advantage. They were supposed to learn, but the webinar didn't help them, so it's not their fault if they don't perform.

At the same time, if they say they already know the material, they're representing a knowledge base that is not leading to a solid performance. If they say the webinar was average, the manager could very well conclude that the recruiter is not interested or capable of learning, believing themselves to be an expert without the performance to back it up. Talk about a minefield. 

But wait, there's more! If the recruiter says the webinar was not useful, the manager is confronted with the issue that they paid for training that wasn't any good. If the manager can't select training that is useful, that is a sign that they made a mistake. Telling the boss they wasted money brings yet another problem to the table. 

With all of that swirling in their head, the recruiter gets an email survey asking their opinion of the webinar. Gulp. 

 I was just trying to get some feedback, and now we're about to fire a lazy recruiter who works for a clueless boss! 3 out of 5 stars, because it was raining. 

Feedback is complicated because people are complicated. Our memories are faulty, our justifications self-serving, and what we need to function is poorly understood by the world's best neuroscientists, much less a manager with a business degree and two undergraduate psych courses from 1998.

And yet we are addicted to that sweet, sweet Feedback. It is a requirement that we get feedback in a 360-degree circle, from all stakeholders, and customers, and vendors. So what's the solution? I do have one. It's a simple one, involving only the shift key on your computer. It's so simple, I'm going to make a picture to share it on Facebook. Here it is.

Feedback is not Data. 

feedback is data. 


You Don't Actually Want Recruiters To Help You

How is that for a title? Forgive me. It's been a while since I've written here and I'm just converting a Facebook comment into a post because it's pretty good. I think. You be the judge. 

Derek Zeller posted a character study at Recruiting Daily where he creates "messages" from candidates. His main thesis is that we should treat candidates better. I get it. It makes sense. But he's just dead wrong, and I'll explain why.

Derek writes, 

"Without candidates, we wouldn’t have recruiters, and you’d be just another candidate desperately searching for a job instead of the person responsible for filling them."

This is the old "our clients are our most important resource" line. Without clients, who would pay the bills? Unfortunately, that's not actually the point of those stories. Clients and customers and candidates exist because they exist. Our mindset in selling to them is important, and there's a lot of truth in the idea that people do business with people they like. The point of treating someone as special is that they will like you.

That's a sales gimmick. It's like the whole 7 Habits of Highly Effective People pitch. It's not moral. It's effective. There is a big difference.

I have two main problems with Derek's theory. 

1) People are not rational

In studying consciousness, we know that what we perceive as a justification is literally a "fake" memory, processed in the dorsolateral prefrontal cortex and manufactured as a past event in response to a judgement from the nucleus accumbens and the amygdala. Basically, our brain gets good and bad tingles, and then tells a really good story about those tingles.

That makes it very hard to get into the mind of anyone. Their brains lie to them.

2) No candidate should care about you, and you shouldn't care about them. 

I'm serious. You should care about the people you plan to care about. Pretending that a candidate is special because they called you or emailed you makes you feel better, but it does nothing for them. 

Candidates care about getting jobs. That is their focus. If you want to help them, you help them. If you want to care about them, care about them as a person, not as a candidate.  

Let's slaughter another sacred cow to make the point here. Sometimes, the best thing you can do for a candidate is to ignore them. 

What's funny is that ignoring them is not a best practice for you, but it is the best thing for a candidate you cannot help. Time spent with you, especially time spent with someone who will use your conversation to put off other conversations, is not time well spent. If a candidate likes me, and I like them, our talking together creates a false sense of security for both of us.

As a third party recruiter, I guard against this. Feeling good after a call is not the same thing as moving the recruiting ball down the field. Getting someone on the phone who isn't yelling at me is great for me - but it's a waste of time for that person. 


Let's finish this off. Here's how candidates think:
1) What's in it for me?
2) Will this person help me?
3) What do I need to do to make this person more useful to me?

It would be bizarre if they acted in any other manner. Our goal should not be to understand, but to be useful. Usefulness comes in many forms. One way of being useful is ignoring people you can't help. That's not in any training manual, but it's more useful than wasting their time to make yourself feel better about not being able to help them.

Recruiters exist to perform their job. That job is making introductions to people who can get hired. When we try to be more than that, it's because we want to feel like we are doing more.

This is not bad. I'm not saying don't be polite and don't try to help people. I'm saying we should recognize our role, our limitations, and not pretend that a sales pitch is the same thing as having a big heart. 


The Relevancy Of Relevancy Scores And The Lie Of UI

Your computer is lying to you. Or rather, the user interface is giving you information that you mistakenly believe is accurate because it Looks Official.

Some examples:

LinkedIn Zipcodes: Why do you think they are accurate? Are they entered and checked? Has the person moved since starting a LinkedIn account? Is the distance from the zipcode measured from their house or from the center of the zipcode?

We don't know. I bet the developer never thought about it - because developers don't think about such things. 

LinkedIn employee count: LinkedIn filters people by the size of their company. They can offer a Fortune 1000 filter, but how do they track the number of employees? It's almost assuredly by the number of employees on LinkedIn, which is why a construction company can show 1,000 employees on LinkedIn and have 5,000 employees overall.

Facebook Search: The number of Female Developers who work for Citibank in Boston. Do that search. That should suffice as an explanation. 

It's not just the searches. It's our systems.

What's the Relevancy Score in Facebook Ads? I asked someone yesterday, but then asked them to tell me what the score meant without using the word "relevant."  It's harder than you think. Especially since none of us "really" know, including Facebook employees who are just repeating what they are told.

Relevancy Score has a meaning to Facebook, of course. It's the number of times people click on your ad. Based on that score, Facebook determines whether you're a good or bad advertiser, which affects how often your ads show up. They want the better-performing ads to show up more.

Stop and think about that for a second. Relevancy Score is, in effect, a measure of how often you spend money with Facebook. That's useful to Facebook. It's useful to advertising firms, who are paid a percentage of the spend. Is it useful to the rest of us? Is it relevant to our goals in using Facebook ads? Maybe.
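If it helps to see the circular logic spelled out, here's a toy model of a click-based score deciding which ads get shown. To be clear, this is not Facebook's actual formula - nobody outside the company will tell you that, and I'm not sure anyone inside can - it's just the simplest version of "ads that get clicked get shown more," with made-up ad names and numbers.

```python
# Toy model of a click-based "relevancy" score. NOT Facebook's real algorithm --
# just the simplest version of "ads that get clicked get shown more."

ads = {
    # ad_name: (impressions, clicks) -- hypothetical numbers
    "hiring_banner": (10_000, 40),
    "webinar_signup": (10_000, 120),
    "brand_awareness": (10_000, 15),
}

def click_score(impressions, clicks):
    """Score is just click-through rate. 'Relevant' here means 'gets clicked.'"""
    return clicks / impressions

scores = {name: click_score(*stats) for name, stats in ads.items()}

# Delivery is weighted by score: the more you get clicked (and, if you pay per
# click, the more you spend), the more the system shows your ad.
total = sum(scores.values())
delivery_share = {name: score / total for name, score in scores.items()}

for name in sorted(delivery_share, key=delivery_share.get, reverse=True):
    print(f"{name}: score={scores[name]:.4f}, share of delivery={delivery_share[name]:.0%}")
```

Notice that nothing in that loop knows or cares whether a click meant anything to your business.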

And yet - try this - go ask someone in social what the relevancy score is. They'll say it's a score that determines how relevant your ads are. Relevant to what? Relevant to who? They'll struggle, and then probably throw something at you.

Again, it's not their fault. It's a kind of short-circuit that occurs. Relevancy Score, 10 miles from Zipcode - these seem like they are carefully calibrated to be accurate data. They are not. They are tricks (some honest, some not) of the User Interface, much like the 36,284,312 results on a Google page.

One of the challenges we have in recruiting social people is the parrot effect, where jobseekers tell us what they've heard without thinking it through. The best hires don't just repeat the words. They are not fooled by a Title Field.

 


Interview Questions To Ask A Facebook Display Ads Manager


If you're going to hire someone to manage Facebook Ads, you need to first get a sense of what you're looking for. 

Here's what I'd ask the hiring manager: 

1) What do you want from this?  (sell more, more likes, testing Facebook, testing the candidate to see if they can do more)
2) Are you going to manage the person? 
3) Are you going to be looking over their shoulder the whole time, or do you just want reports? Do you want them weekly or monthly? 
4) What budget are you looking to spend? Do you have that in your budget/the bank? What would make you not spend that budget? 
5) Have you hired someone like this before? 
6) Do you know that creative is not the same thing as ad optimization? 

 Here's what I ask the candidate:

1) Tell me what you did last Tuesday.
2) What's the most you've spent in a month? What could you have done to make that better? 
3) Were you paid on spend, or a salary? Was it worth your while to increase the spend to get a better paycheck?
4) What's the difference between a dabbler and a professional? (The words "Power Editor" should appear pretty quickly.)
5) How do you build customized audiences from scratch? What do you need to build them quicker? 
6) Did you have a Facebook rep who would show up when you called? Did you like them? (I love the second part of that question. The answer is usually no, or "as a person, or as a rep?")
7) What kind of sweepstakes and giveaways did you do? 
8) When was the last time you started a new account? (If it's not in the last six months, don't hire them at a premium.)
9) What kind of training did you receive? 
10) Talk about the CRM/email software you've worked with. 

What You're Looking For
It doesn't matter how they answer these questions. It matters that they understand why you asked them.

If you're looking for interview scripts, feel free to reach out with questions and I'll write one up to post on the blog. 

 

 


Good Recruiters Don't Need To Be Experts In Who They Recruit

When I first started recruiting, I worked on tech jobs. These were desktop, and hardware, and sys admin, and then some developer and designer roles. I knew next to nothing about the technology, but in 1999, neither did anyone else. Heck - we were hiring kids out of college with a C++ course to be Java programmers for $45-60,000.

As I got better, and the tech did as well, the old argument that recruiters should know what they were working on led some of us to get CIS degrees (most of them got out of the industry). I thrived by asking questions, and my business really went wild once I started posting interview questions, like what to ask a Java Swing candidate. In addition to traffic, hiring managers would call me and argue about what I posted (managers don't call recruiters unless they have to, in case you're wondering).

In 2006, when I rolled out of recruiting to join my wife's marketing firm, I focused on social media. Blogging, copywriting, and SEO were my bread and butter, matched with her design prowess (just check out Brandstorming.com for a taste). As I got better, social media exploded, and I found myself consulting with firms large and small to do the work I once recruited for. That led me to work hard at learning digital, including email, PPC, and the full marketing stack from story to distribution. That made me a better recruiter, because I was recruiting for the roles I performed (and later managed). 

And what I found out was that the old saw that candidates wanted recruiters to understand what they worked on was not universally true. When I called as a peer, candidates found that lying to me was more difficult than it was with the recruiters they were used to working with. I burrowed down into details, and I joked with them about using the wrong jargon. And you know what - they hated it. I didn't realize in technology how much I was missing - in my early-career blindness, I was matching candidates by how well they pitched a story and how serious they were about the job. In today's world, I see massive flaws in candidates, but I also see massive flaws in the job descriptions, and I had to learn that the perfect candidate was one who could survive the interview and then thrive at the job. That's not always clear from the resume.

It's true that knowledge of an industry helps, but after a decade in digital, I'm finding the gaps in my own experience. I don't have millions of dollars running through my fingers, which means I'm behind the curve in understanding social display for the Home Depot. I'm not testing a 5,000-page site's traffic using eyeball tracking and service virtualization (no one is doing that last one, yet). I'm not up to date on the right image sizes on the best phones for Facebook Live, and I've never looked at my website on a Microsoft Surface Pro.

In short - the experience I've relied on to sell and recruit as a digital expert is no longer accessible unless you're actively inside a team of people doing the work. That doesn't mean I can't still place someone with experience using Facebook Ads to drive webinar sign-ups that fit into the HubSpot funnel, but it does mean that if I were tasked to do the job, I'm no longer able to sit side by side with a candidate and compare notes.

It's a career arc that is strange. I went from knowing nothing to knowing everything (that you needed to know to hire for social) to knowing some parts of everything. My biggest challenges now are making sure I stick to the script, listen to the candidate, don't jump ahead, and most important - that I don't mistake nostalgia for technical competence. 

I once firmly believed that a career recruiter should be able to effortlessly switch industries, because knowing how to recruit was more important than knowing who to recruit. That insight wasn't wisdom, but rather the experience of working on different technologies that moved faster than our ability to learn them. If that's still true (and it seems a constant), then no recruiter can ever be an expert in their field unless their field is dying.

That's too much thinking, and it's the nostalgia trap instead of real understanding. Do you know why managers and candidates think they need recruiters who understand them? It's because our industry hires entry-level recruiters and burns through them. The number of inexperienced or new recruiters is always several times greater than the number of experienced recruiters. A new recruiter at a tech firm in San Francisco is going to talk to hundreds of people in a week, while I talk to the top 20% in the industry. This means that most of the people talking and emailing with recruiters are talking to inexperienced recruiters. Internally, recruiters have multiple requirements and a lot of process to manage. The niched recruiters internally tend to work themselves out of a job and move on.

It's very likely that managers and candidates are mistaking technical expertise for a recruiting model that brings them recruiting expertise. It would have been simpler to point this out in the beginning, but for those of you who've read to the end of this post, would you have believed me?


Inbound Marketing Specialist - Colony, TX

A client of mine up by Frisco is looking for someone with a HubSpot Certification to work in-house with his team of marketers in the medical device industry.

If you've been working at an agency with multiple clients, or if you've worked internally and had decent training, this might be for you. Also, if you're currently traveling too far and you live north of 121 between Frisco and Lewisville, you're going to be a happy camper.

1) You're the button pusher. You can run the campaign and make sure it's optimized to drive leads to medical professionals.

2) You may use the words "strategy" and "campaign," but you're self-aware enough to know that a couple of years in PPC or email campaigns is not enough to understand the full marketing stack.

3) When interviewing, you don't repeat the words "A/B Testing" or "Success Factors" because you think they're magic totems.

What you'll learn.

1) B2B Marketing

2) How to navigate internal marketing structures in a rapidly growing company

3) The joy of not having an hour commute.

If you're interested, send a note to socialmediaheadhunter@gmail.com with some indication that you've read this blogpost, and I'll get you in touch with the hiring manager.  



Interview Answers I Don't Like To Hear From Email Marketers

Over at Digital Marketing Headhunter, I critiqued a couple of job descriptions for email marketers, and then offered up a list of interview questions for email marketers that I would ask. 

But all questions need answers. That part of the script I haven't published, but I will show the answers I don't like to hear:

Answers I don't like to hear from candidates. 

1) We sent out 10 million emails a month (and no explanation of what they were). 
2) We did extensive A/B Testing of the emails. (What does extensive mean? What did you test? Was that a test each week before the send?)
3) I've worked with all of the email software programs and know them well.
4) We were CAN-SPAM compliant. 
5) Our data team would pull the lists each week, and we'd work with the graphics department to get the right images, and then the IT department to code the email. I would test and send the email. (Nothing wrong with that, but it suggests someone who is only good in a large operation and will need each one of those components to work. But at least they know it takes more than one person. Those who don't know this and assume they can do it all are often lacking in experience.)

 

If you have your own job description, or questions you'd like to add, leave a comment or email me and I'll publish them. If you want it confidential, please mention it in the email.


List Of Director Of Social Media Interview Questions In a B2C Market

This is a list of interview questions you can use to interview a director of social media. It's not comprehensive, but if you had all these answers, you'd have a very good idea of what they do and whether they're a fit for your position. If you find this useful and need to hire, consider reaching out. If you use it, please keep the Social Media Headhunter brand and my name in your social sharing.

Skillset: 

What kind of social media do you do? What I mean is that everyone thinks they do social. So I need to know if you use it for inbound marketing, customer response, branding and advertising, or research.

Do you utilize social display ads? Do you work with a Facebook/Twitter client partner? Can you call them on your cell if you need to? Would they answer?

How much content creation do you do personally? 

How do you feel about deleting comments on Facebook that include curse words? 

Is it worth it to invest in Twitter? Why? What businesses work best? 

Give me an example of a good viral social media plan that isn't Fiberglass pools.

Give me an example of a good national social media plan that isn't mentioned at every single conference.

Do you speak at conferences? Do you enjoy it? Why? 

Tell me what you did yesterday. 

What kind of software do you work with? Anything you're expert in? Anything you need to function?

How versed are you in mobile? Tell me why. 

What does it mean when I say social and digital should be integrated? What does that actually mean?

 

Management:

Who do you report to? What title would you like to report to?

Do you hire people in your department? How do you know if they're good?

How many people report to you? What's the largest number of people you've had reporting to you?

Did you have to fight for your budget bit by bit, or did you have it set in stone? 

How do you stop PPC/Digital from stealing your budget mid-year? 

Have you selected vendors before? How do you decide who to work with? 

What is your career path? 

Wow Factor:

IBM says they care more about Klout factor than SAT scores. Defend and then attack that position. 

Who is someone in social media you know that you're impressed with? Why?

How did Digg work? What is today's Digg? 

Talk to me about sponsored posts.

How good are your private profiles? How much of what I guess we'll call Dark Hat work do you do?

Tell me how blogs impacted SEO in 2008. What's the change today? 

Pitch me shareability like you're talking to the CEO and trying to get $1MM in budget.