St Louis MKSSA Presentation: September 25-26

Check for pricing and attendance details.
Jim Durbin returns to St Louis with all-new recruiting techniques and a clear view of how technology like artificial intelligence will change the industry in the next year.

On September 25th in Kansas City, and on September 26th in St Louis, Jim will deliver a dual-session training. Session 1 covers new techniques for LinkedIn, Facebook, and resume search. Session 2 is a reveal of the AI tools for recruiting that will hit, or are already on, the market.
Powerful Recruiting with LinkedIn and Facebook:

The Social Media Headhunter presents a 90-minute super-session on the new LinkedIn UI, Facebook search, and his 360 Sourcing method for job boards, databases, and social sites.

In this presentation, you'll learn:

*       The fastest way to search using the new keyword search
*       Search strings to build company and title lists
*       Short Boolean - the key to 360 Sourcing
*       Facebook search
*       The best messages for contacting candidates by email, text, and InMail. 
*       Meetup, Behance, Github, and more
*       How to use Indeed and Glassdoor to find candidates without resumes

"Just to let you know, this was one of the best webinars I've ever attended."
- Michael B. 

"Jim Durbin is an awesome presenter!  Definitely learned a lot!  I was writing names down as fast as I could! LOL!"
- Susie L.

"In all my 12+ years of recruiting, I have never experienced a training so thorough on how to make candidates come to me!"
- Abraham B. 
As a marketer and recruiter, Jim Durbin has been placing people since 1999. For the last decade, he's been known as the Social Media Headhunter, and he's trained over 9,000 recruiters on digital tools. He runs a B2B marketing firm, develops startups in the AI and recruiting space, and takes as many full-desk searches as he can get his hands on. He predicted the demise of LinkedIn in 3 years, quit Twitter in 2016, and writes regularly in industry publications and on his blog.

Hiring a social strategist in Dallas

I'm "this" close to launching a search for a social strategist. Just working out the details. It's the kind of social job you dream of - strategy, but working under a great boss with real marketing experience. 

What kind of social? Glad you asked. It's important that you ask, because you're supposed to know there are different kinds by now. 

It's a mix of customer service and branding. The client has customer service needs, and wants to see what social can and should be doing for them, but they also want to extend and expand their branding and lead generation. You're not selling online - but you are generating leads and working to explain how to best respond to your customer base. 

Here's some boilerplate I made up.

Your role is a new one, designed to increase the capabilities, performance, and tracking of social data that includes Facebook, Twitter, LinkedIn, Instagram, YouTube and other popular channels. As a strategist, you’ll work with the marketing and customer service teams to engage with our members, enhance branding campaigns, and integrate with product and service teams. This position does not have direct reports.

Experience with regulated industries is a plus. They're innovative, but not the wild west - and you don't want to have to explain to the execs why someone on Facebook is yelling at you.



- Design a comprehensive social media strategy
- Create and manage best practices for customer service and corporate communications
- Work with design and content teams to create shareable content appropriate for specific networks
- Train and/or manage content managers and owners on social best practices
- Build performance and tracking systems using existing marketing technology and processes
- Create and deliver reports on social milestones and experience
- Serve as a champion internally for social integration
- Listen and engage in relevant social discussions involving the brand and the industry
- Plan and execute regular social promotions and campaigns, along with performance metrics and results
- Develop social media content strategies and editorial calendars



You know the rest of the drill. Find me if you're interested. I mainly post these to prove I still exist, and so I have something to send you if you ask me what the job entails. 

You have to be in Dallas. You're not going to make six figures. This isn't a job posting to Facebook. And agency experience or corporate experience is important - it's not a job for freelancers or consultants. 


Register For The April 27th LinkedIn Webinar: Powerful Recruiting With The New UI



In the last 9 years, with the help of Kathy Simmons of Experts Connection, I've delivered paid webinar training to over 9,000 recruiters on the topics of LinkedIn, Facebook, Twitter, and digital searching.

These top-rated webinars come with a 90-minute session followed by live Q&A, a video download, and the slide deck as a PDF. If you register before April 20th, I'll send you my new LinkedIn UI search guide.



Book the Webinar!

Cost for the webinar is $125. 

TIME: Thursday, April 27th 2 PM EST/ 11 AM PST.  

In this webinar, you’ll learn:

  • The fastest way to search using the keyword search
  • Search strings to build company and title lists
  • 360 Sourcing - a method that works with the LinkedIn UI to surface hot candidates
  • 3 New Messages to generate interest and stand out to jobseekers
  • Important changes in settings, groups and connections (with a PDF checklist)
  • Live searches from your requests (send them in with registration to make sure yours is covered)

Come join me - and join the mailing list on the page to learn about what we're doing in Sourcing and Recruiting in 2017. 

About Those Pesky Machines...


In the dawn of the new century, we laughed at the idea that machines could think.

We played with them. We paid for them to get smarter.

Years of the relentless drumbeat of disruptive technology numbed us to the real danger. The machines weren't taking over. We were becoming the machines.

Blinded by the sheer power of gigantic lists, we began to think and act like pale algorithmic copies of the software we thought was making us into something better. Something we thought was making us more than human.

As we grew faster, more capable, and more stalky, our prospects rebelled. They put up what we can only call a resistance to our well-crafted templates and phone messages and social entreaties. In our amazement at the power of our new human-machine intelligence gathering, we forgot that the greatest app was simply ourselves.

And so, as we embrace the new machine-hybrid world, the resistance has struck deep into the core of our existence. They only speak when they wish to be spoken to! Having escaped our plans for data supremacy, we must finally and completely embrace our human side - we must become, once again, the flesh-and-blood heroes of old.

I fear no machine! I fear when men cast off their humanity and function like machines. 

Or rather, I fear for their jobs, because people who act like machines are easily replaced by machines. 

Prediction: Facebook Will Not Be A Social Network In 2022

HYPOTHESIS: In the next five years, Facebook will turn away from being a social network and consider itself solely to be the operating system of the internet.

Algorithms will drive the death of social networks, because what an algorithm can track is not of interest to human beings. Instead of being useful, algorithms track what we do - contributing to wasted time and personal dysfunction.

The value of social traffic is already declining. Advertising is addicted to traffic, but the value of paying traffic is high. The value of traffic that does not convert is close to zero for all but the largest of brands. Advertising is struggling to prove its worth, and as the move towards pixel conversion grows stronger, the emptiness of "social," "video," and "images" in short bursts will cause Facebook to pivot to running in the background instead of being a news source.

Put simply, trending news that points to Twitter hashtags, shocking headlines, or stolen Reddit posts has a net negative value for a brand. Declining click-through rates will eventually lead Facebook to abandon generic advertising through a news feed.

The future of social is limited social groups with strong privacy controls. Facebook, which has already squashed the other social networks (Twitter and Snapchat are dead, even if they don't know it), will recognize this and seek to be the internet login, running apps in the background that function like a personal enterprise software suite.

Recruiting Apologia: Artificial Intelligence Can't Replace Recruiters

I am a steel-driving man

The hot buzz around AI is the replacement of human workers with robots or AI. I'm not worried, for two reasons. The first is that AI isn't really AI, and by the time it is, it will replace almost all jobs. The second is that the people programming machine learning don't seem to understand recruiting.

For the purposes of this defense, I'll treat AI, machine learning, and computer screening as the same idea.

The data points that are gathered are very useful for a system, but I'm highly suspicious of the claims that they can screen better than a human. These claims are based on, quite frankly, terrible screening processes. For a chat bot to work, it has to be fed the right information. Maybe I'm missing the good ones, but I'm not seeing any examples of excellent screening. I'm seeing poor screening practices replicated in these bots. An analogy would be a drop-down box on a job board asking that applicants have 2 years of experience. This sounds like a good idea, but in practice has a value very close to zero. For AI to replace recruiters, it will have to actually replace recruiters, not just automate portions of the hiring process. Gathering data is only useful if it affects the outcome, and on its own should not be counted as an advantage of AI.

This is intended as an apologia, a formal defense of my opinions. I believe that in the process, the action, and the skillset, recruiters cannot be replaced. The most likely scenario is augmentation for the collection of data and the reduction of paperwork. 

Recruiting is not the same as hiring. Its value lies in social proof and time saved.

The primary definition of recruiting is making an introduction between two individuals who are poorly trained to interview. The social proof of a recruiter introduction is useful in calming fear and creating certainty. In a mature recruiting model, the hiring committee receives screened candidates who have already been vetted. This means the hiring committee should not be asking basic questions of motivation, experience, and suitability.

*All people are poorly trained to interview or be interviewed. There is no formal process, no testing, and not enough repetition to produce good interviewers or good interviewees. Belief in superior interviewing skills for a task that constitutes a very small portion of time worked is delusional. An outcome of getting hired or hiring is simply not a good indicator of interview quality. It should also be noted that being a good interviewer (a recruiter) does not translate into being good at being interviewed.

A computer screen would have to demonstrate not only superior screening to the hiring committee, but also social proof. Passing a test is not the same as passing a human screen. Unless companies are willing to train managers to ignore their social conditioning, the recruiter introduction will be of more value than an AI screen. I should also note that many companies utilize referral systems and rank referrals as a high quality of hire. If AI were to replace recruiters, this would eliminate the belief in and the practice of referral-based hiring. After all, if social proof is less valuable than an AI screen, then a referral is less valuable than an AI screen.

Defense of AI is an attack on referrals, because recruiting is a form of referral-based hiring. 

Matching speech patterns

Another way humans generate comfort lies in connecting the rhythm of our speech patterns to our brain waves. Two humans speaking to each other literally reach a sweet spot in their conversation where their brains are functioning on the same bandwidth. A person cannot understand you unless their brain can match your speech, including tone, speed, words, and volume, to their experience. Again, humans are highly adaptable in listening to each other. Computers are uni-directional in this manner. The cues that tell us that we are being understood are not present in an AI interface, and cannot be. To be successful, an AI at a minimum has to be able to adopt and adapt speech patterns to generate a good conversation.

In the absence of this skill, the candidate will be assigning a large portion of their conscious processing to thinking past the screen. Instead of exploring and discussing their value, they are seeking to tell a story that will survive the computer and create good marks in the eyes of an unseen human reviewer.

Consistency is vital in the decision-making process

A human being making a decision has several well-known triggers. Verbal and written statements to a human being rank highly in terms of creating consistency. Typing answers on a computer is not a trigger. This principle is known as disinhibition. While engaging with an AI bot, the human is not making statements they feel compelled to live up to. Without the non-verbal cues of a conversation (including phone calls), a job-seeker is not making positive statements about the company, the interview, or the job. Those positive statements are a major cause of decision making later in the process. Any human-computer interface would have to mimic a large portion of non-verbal cues, including facial tics, presence, breathing, rhythm, and mirroring, to generate a strong response from the candidate.

It is possible to program these cues, but because of the likelihood of mistakes due to what is known as the "uncanny valley," it is not being pursued. If a robot interface is too human, we recoil. If it is not human enough, we don't care what it thinks. Failing to understand behavioral science is a major flaw in AI systems. We don't recognize how good humans are at screening each other, something no AI can replicate.

Current AIs are sterile, voice recognition is bad, and translation of slang is nearly impossible.

An automated interface is, by definition, sterile. Research into human-like robots and avatars shows improvement in certain kinds of information, but that information has to be carefully curated and applied to a working "AI interface" that has sufficient voice recognition software and around 40% of the facial reflexes of a human. No chat bot is doing that now because, quite frankly, it's a different skill set than writing an AI logic engine.

The attempt to match language translation and dialects only works in a laboratory. It's similar to showing off fancy software on a desktop configured to run it. That you can make it work is not the same as testing it in the field. If human beings struggle to understand each other, with accents, lingo, mistaken phrases, and even the way in which we think, how could a computer be any different? When we look at voice-activated software, we forget that we have to train the software to understand us, or we have to fit into a comfortable middle ground of dialect.

The medium matters as well. Chat and text and voice and email and the eventual computer interface don't match up to expectations. The amount of logic necessary for the AI to learn requires a true AI, which again, is not what is offered today or in the near future. 

We don't know why we hire

Research into quality of hire is simply not conclusive. In order to replace a functioning part of the hiring process, we would have to better understand hiring. We are literally at the leeches-to-draw-blood stage in our understanding of why people make decisions. Confirmation bias and the role of empathy could be big factors in the success of an employee. In short - if enough people are involved and want the employee to succeed, their chances of succeeding are probably improved. Removing human contact at any stage could very well lead to disastrous hiring.

And worse, we could be masking the effects with "good data." Team chemistry is something we can observe, the same way we can observe communication networks or social messaging. No one has, as of yet, figured out how to create a network that improves on communication, and no one has figured out how to create high-performing teams.

Pretending that large amounts of data and reason can lead us to successful hires is a nice fiction we peddle to sell books and explain success. 

Screening is by definition one-sided, and susceptible to changes in the market

Finally - screening is a very one-sided view of the hiring process. Companies screen jobseekers, suggesting a power differential that favors the company. In that design, companies get to choose what they like and don't like. As the supply and demand of qualified workers rise and fall in tune with technology, economic health, and generational changes, the power differential swings back to the jobseeker, and screening is seen as demeaning and useful only to the company. This is always true in high-demand positions, where executives, top programmers, and top salespeople don't feel the need to participate in screening processes.

AI screening has to be useful to the jobseeker, whether they get the job or not, or it will be seen as a net negative, accidentally leaving out top performers and instead only delivering minimally qualified candidates who are willing to be shepherded into a digital cattle call. 

AI is fantastic at replacing poor processes. Those involved in paperwork, scheduling, basic screening, and process notifications can and will be replaced, but the gains in productivity are actually the removal of productivity lost to too much data. AI solves the problem of the digital application. It does little to solve the problem of recruiting. Recruiting is a human function that is often mistaken for data entry and collection. The industry will shrink as process recruiters (mostly internal) are replaced by software. This is not a threat to recruiters whose primary purpose is contact with jobseekers.

Feedback Is Overrated. I'll Prove It.

One of the old jokes in the travel industry is the value of feedback. A couple visits your resort and is encouraged to jot down a few notes about their experience:

"The accommodations, price, and staff were excellent, but it rained all weekend. 3/5 stars"

Maybe I'm just a curmudgeon, but don't we all cringe a little when someone mentions a review from Yelp or Glassdoor? It's a good thing to know how your audience feels, but it would help if we understood the purpose of feedback and stopped treating it as a measuring tool. Feedback is a form of data that is best used for calibration, not measurement. The central problem with feedback is that it measures how someone reports they feel, not how they actually feel. This leads to the most egregious use of feedback, which is global action based on individual reports.

But what is feedback? It sounds naughty just questioning its value, but if you're willing to join me, we'll take this ride together.

1) Feedback is easily gamed by how you solicit or design the gathering of data.

2) Feedback by definition is a tool of judgment. When someone is judging, they apply a different standard. Feedback makes everyone an authority.

If we learned anything from the early days of Yelp, let us learn that.

Think of your local car dealer. They have surveys after every experience, and did you know they are judged/paid by those results? Anything less than a 5-Outstanding-Perfect-Totally Satisfied is considered not good enough. I've worked with dealers before, so I know that, which means I'm not going to ding our service advisor if there is a mistake, because anything less than a 5 impacts them. If you're keeping score at home, that means my feedback is tainted. But what if you don't know this? What if you're the kind of person who doesn't believe in perfection? And the dealer still wants to put a survey card in your hand.

Feedback is not supposed to be used to measure perfection. It's not even supposed to measure individual results. It is mostly useful when it's combined with results-oriented data. Feedback identifies problems when results are not met. A result is something like: Did they sign on the line that is dotted? Did they perform the task they were trained to do? Did they return the product to the store later? Did their measurable performance change relative to the management directive?

Once you have measured a result, feedback creates a useful counterpoint. What is the relationship between the feedback and the result? Is there a trendline that matches the results? Did reported satisfaction rise or fall in comparison to the results? Feedback can serve as a warning light or a go ahead signal, but on its own creates a false story that consistently fails to deliver. Many a failed project is met with complaints that all of the feedback was positive prior to starting. It's not a convincing argument. Neither is high employee turnover because the conditions in the store made it impossible to get 5's on your survey cards.

Feedback isn't terrible. Why, here's an example of how feedback was personally useful. A few years ago, a webinar I presented received a number of 3 ratings on a survey scale that went to 5. A typical webinar averaged above 4. Now, the feedback rated me highly as a presenter, and there were no specific complaints pinpointing the problem. How could the webinar be rated average if I was rated highly?

The webinar was new material, and I suspected the real reason was that the webinar was unfocused. The actual content did not deliver what was promised, but the audience liked me and didn't want to penalize me with low ratings (just like my service advisor). They registered their displeasure with 3's overall, but gave me individually a 5. More important - there were no positive comments about the webinar delivered outside of the survey. Most of my webinars have a few individuals who write specific words of praise for what they learned, sent in by email. In this case, no one went the extra mile. Those two clues - an average rating and a lack of extra praise - led me to completely redo the material for the webinar. A future version of the webinar rated above 4, with two emails thanking me for the content.

It wasn't the feedback that was important here. It was the difference in the feedback. The solicited feedback signaled a downward trend in comparison to my other work. That could have meant the topic was simply not of interest, but the lack of unsolicited feedback is what identified the content as the problem. In both cases, the feedback on the quality of the webinar was an imprecise measurement, but it was a flag that made me look deeper.

But what is the value of feedback? For me, it prompted improvement, but the real measure would be whether, one month after the webinar, the habits of the attendees changed. Their personal feelings have an impact on their ability to learn, but the KPI for their managers should have been their performance, not whether they enjoyed my presentation. Feedback, in this case, was useful in alerting me that there was a problem - but results-oriented data is what is relevant to the person purchasing the webinar.

Let's think about a conversation between the recruiter and the manager after the webinar.

Manager: How was it?
Recruiter: It was good. I learned some new information, and really liked the way the presenter talked us through the changes in the software.

That sounds pretty positive, but is it accurate? 

Let's say the recruiter was underperforming and, as part of their yearly review two weeks ago, was instructed to attend training to improve. When the manager asks, "How was the webinar?" the recruiter has a lot of background to process that has nothing to do with the webinar. If the recruiter praises the webinar, they're taking responsibility for improved performance. "I learned something new" has the subtext "and now I'll do better." What were they supposed to say? If the recruiter says, "The webinar was average and a rehash of what I already knew," they might gain themselves an advantage. They were supposed to learn, but the webinar didn't help them, so it's not their fault if they don't perform.

At the same time, if they say they already know the material, they're representing a knowledge base that is not leading to solid performance. If they say the webinar was average, the manager could very well conclude that the recruiter is not interested in or capable of learning, believing themselves to be an expert without the performance to back it up. Talk about a minefield.

But wait, there's more! If the recruiter says the webinar was not useful, the manager is confronted with the issue that they paid for training that wasn't any good. If the manager can't select training that is useful, that is a sign that they made a mistake. Telling the boss they wasted money brings yet another problem to the table. 

With all of that swirling in their head, the recruiter gets an email survey asking their opinion of the webinar. Gulp. 

I was just trying to get some feedback, and now we're about to fire a lazy recruiter who works for a clueless boss! 3 out of 5 stars, because it was raining.

Feedback is complicated because people are complicated. Our memories are faulty, our justifications self-serving, and our understanding of what we need to function is poorly understood by the world's best neuroscientists, much less a manager with a business degree and two undergraduate psych courses from 1998. 

And yet we are addicted to that sweet, sweet Feedback. It is a requirement that we get feedback in a 360-degree circle, from all stakeholders, and customers, and vendors. So what's the solution? I do have one. It's a simple one, involving only the shift key on your computer. It's so simple, I'm going to make a picture to share it on Facebook. Here it is.

Feedback is not Data. 

feedback is data.