How Much Do We Really Need to Be In Someone’s Face? NC State Researchers Chris Parnin and Mahnaz Behroozi Dismantle Technical Interviews

If you’ve been living under a rock (or maybe you just don’t obsess over Hacker News) and missed some of the more thoughtful research on technical interviews, allow us to recap. TL;DR: whiteboard interviews are terrible, they measure the wrong thing, and you’re definitely missing out on quality talent because of them.

But in all seriousness, the current technical interview method sucks, and NC State researchers Dr. Chris Parnin and PhD student Mahnaz Behroozi have the data to back it up.

Among the interesting findings: the current whiteboard problem-solving exercises don’t help companies identify the best performers. In the experiment the duo ran, “people who took the traditional interview performed half as well as people who were able to interview in private.” What’s more: not one woman passed the public whiteboard interview, but every woman who took the private one did. That tells us the current model doesn’t just privilege candidates who are good at public whiteboard exercises (as opposed to candidates who are, well, just good at coding); it also makes it easy to exclude entire groups of candidates.

We sat down with Chris and Mahnaz. Here’s what they had to say.

OKAY, SO WHAT MADE YOU WANT TO LOOK AT THE TECHNICAL HIRING PROCESS THROUGH A RESEARCH LENS?

CHRIS: My research interests really started with examining how interruptions affect the coding process - and it was a natural leap to look at whiteboard interviews. If a single interruption costs an engineer 10 or 15 minutes of recovery time, then watching someone code and peppering them with questions during an interview seems even worse.

I personally had all kinds of interviews when I was a student - and they never really felt like the right way to have a conversation about what I wanted to do. It always felt very antagonistic, like the interviewers thought I was an idiot. Mahnaz brought a really interesting lens to it because of her work on stress, anxiety, and cognitive load.

MAHNAZ: My research also focuses on biometrics, and initially we thought eye trackers would be a good way to understand a candidate’s signals. But I’m also an international student; English is not my first language. Going through interviews in the US system was difficult - thinking in English is hard for me, and you’re also asked to explain your thinking aloud while solving the problem and writing on a whiteboard. It really breaks your stride.

I could never figure out what the priority was. Is it how I’m talking? My past knowledge and experience? How I’m solving the problem? And, of course, women often experience the added stress of being interviewed by men.

WHAT WAS ONE OF THE FIRST THINGS THAT MADE YOU SAY, “OH, WE’RE ON TO SOMETHING HERE”?

CHRIS: I was looking through the literature and found something called the Trier Social Stress Test and realized, “Oh my God, this is a technical interview.” It’s a procedure invented by psychologists specifically to induce stress. They strapped people in, stared at them, and made them solve math problems and give a talk while measuring their cortisol levels. Staring at someone solving a math problem at a whiteboard and making them talk about it is a technical interview!

Our industry accidentally reinvented the stress test. It’s been a progression from “How would you figure out how many gas stations are in the United States?” to “Do this red-black tree on the whiteboard.” It’s all arbitrary, and there’s a real opportunity to ask ourselves whether this is really what we want to measure when we hire someone.

MOST EMPLOYERS CAN’T SHIFT THEIR HIRING PRACTICES OVERNIGHT. ANY IDEAS FOR THEM?

CHRIS: If this one factor - putting people under a microscope while they code - is skewing interviews, just turn it off. That’s the simple fix. How much do we really need to be in someone’s face? How can we inject some privacy into the process so a candidate can focus?

My mom’s been managing a temp agency for decades. She basically interviews people for a living - and she’s found that some people can’t get through a phone interview well, but are amazing if they interview in person. We know the interview format has a huge impact on performance outcomes. We can’t say one format is best, but employers could help by offering choices - over the phone, in person, or on a video call. The same goes for a coding assessment - public with someone watching, private on your own, or a combination.

The harder answer is that we all look for signals we think are reliable. But there’s a lot of cross-interference in the current model because candidates are stressed - and that stress has no bearing on their ability to do the job.

I heard someone who had been at Google talk about this very problem. For years, he thought he was interviewing tons of people who couldn’t code. Then he went on job interviews himself and realized the candidates he’d written off weren’t idiots after all - they just didn’t do well in that format. He said, “Most probably would have been great colleagues; I was just arrogant.” The signal was unreliable. It was garbage.

WHERE MIGHT YOU GO NEXT WITH YOUR RESEARCH?

MAHNAZ: In our work, there were definitely people who were different from the majority. They actually loved to be watched and to talk about their thought process while coding. From a research point of view, that’s very interesting. We have to know who that person is, how self-aware they are, and what interview condition works best for them. There are different dimensions to studying technical interviews and uncovering what approach works best for evaluating people.