Recently I came across a blog post by Anna McDougall describing how her team at HelloBetter have reimagined the technical interview, and it really made me think! For years, we’ve been trying to balance fairness, accuracy, and efficiency when hiring people. Yet most interview processes are designed around a few conversations and some often-outdated tests, quizzes, or case studies.
As we all know, hiring is already expensive and time-consuming, and now AI tools are enabling candidates to “cheat” on traditional assessments. So how do we really get a true idea of how people actually work?
This new method, dubbed the McDougall Method, offers a practical alternative. Instead of traditional assessments or case studies, it focuses on a 90-minute collaborative interview. A candidate joins one or two existing developers, dives into a copy of a real codebase from your stack and tackles a real problem. The best part: candidates are encouraged to use the same tools they’d rely on in the job, including AI assistants!
This method flips the whole interview model: you get to see how a person actually works and thinks, not how well they’ve memorized a language or an answer. You learn how they communicate, manage their time, and most importantly, how they go about solving a problem. And you discover whether they can collaborate and fit with the culture of your organisation. The final part of the session also involves a conversation where the candidate can discuss how it went and what they would do differently. Here, you also get a glimpse into a candidate’s self-awareness and willingness to reflect and learn.
Here are three compelling effects of this method on the hiring process:
- It tests what matters. Developers don’t spend their days solving trick puzzles. They design systems, track down messy bugs, and make trade-offs. This method gets the interview closer to that reality
- It surfaces cultural fit earlier. A resume and a Q&A can’t tell you how someone responds to feedback under pressure, but a live session often can
- It improves candidate experience. Instead of hoops to jump through, candidates feel they’re being assessed on real work. This signals investment in culture and fit, leaving candidates impressed and reinforcing employer reputation
Of course, this new method requires investment from you and your team. Preparing realistic codebases and defining consistent evaluation criteria takes time. But compare that to the hidden cost of a mis-hire: wasted months, lost productivity, and morale drag. Looking at it that way, the investment is a no-brainer.
The deeper question is this: are your interviews measuring the skills developers actually use, or clinging to outdated tests simply because they’re familiar? With AI changing both how developers work and how candidates present themselves, the gap between old assessments and real ability is only going to widen.
If you want to know more, McDougall’s original post is worth a read: You Can’t Outrun AI in Tech Interviews, So We Designed Around It.
The core message is this: your technical interview should reflect how your developers actually work.
The best talent is collaborative and knows how to leverage the right tools to solve a problem. If your process doesn’t reveal that, you could be missing the candidates who would make your team thrive.
Written by Samantha Howell