It's not super controversial to assert that the software engineering interview process is broken, but I think it's worthwhile to do that anyway. The software engineering interview is broken. There are lots of reasons for this:

  • interview processes are so overoptimized for rejecting candidates that aren't good that they often reject candidates that are good. This wouldn't be a problem if it happened occasionally, but it's really routine.
  • it's difficult to design an interview process that works consistently well across different levels and different kinds of roles, and companies/teams can easily get into a place where they can really only hire one type or level of engineer.
  • while many engineering teams know that the hiring process is biased, most attempts to mitigate this focus on the bias of the interviewer, by making interview processes more consistent across candidates or easier to score objectively, while abdicating responsibility for the ways that the process itself can be biased toward certain kinds of candidates.

I've been part of lots of conversations over the years about "making interviews better," and many of the improvements that come out of these conversations don't do much and sometimes exacerbate the biases and inefficiencies of the process. I also think that the move toward remote work (and remote interviewing,) has presented an underrealized opportunity to revisit some of these questions and hopefully come up with better ways of interviewing and building teams.

To unwind a bit, the goals of an interview process should be:

  • have a conversation with a candidate to ensure that you can communicate well with them (and they with you!) and can imagine that they'll fit into your team or the desired role.
  • identify skills and interests, based on practical exercises or review of their past work (e.g. portfolio or open source work,) that would complement your team's needs. Sometimes this takes the form of "figure out if the person can actually write code," but there are lots of ways to demonstrate and assess skills.
  • learn about the candidate's interests and past projects to figure out if there's alignment between the candidate's career trajectory and the team you're building.

Most processes focus on the skills aspect and don't focus enough on the other aspects. Additionally, there are a bunch of common skills assessments that lots of companies use (and copy from each other!) and most of them are actually really bad. For example:

  • live coding exercises often have to be really contrived in order to fit within an hour interview, and tend to favor algorithms problems that folks have memorized, either because they recently took a class or because they crammed for interviews. As engineers we almost never write code like this, and the right answer to most of these problems is "use a library function," so while live coding is great for getting the opportunity to watch a candidate think and work on a problem, success or failure isn't necessarily indicative of capability or fit.
  • take home coding problems provide a good alternative to live coding exercises, but can be a big imposition timewise, particularly on people who have jobs while interviewing. Take home exercises also often require people to focus more on build systems and project-level polish than on the kind of coding they're likely to do day to day. The impulse with take home problems is to make them "bigger," and while these problems can be a little bigger than an hour's worth of work, a lot of what you end up looking at is finishing touches, so keeping them short is a good plan.
  • portfolio-style reviews (e.g. of open source contributions or public projects,) can be great in many situations, particularly when paired with some kind of session where the candidate can provide context, but lots of great programmers don't have these kinds of portfolios, either because they don't program for fun (which is totally fine!) or because their previous jobs didn't produce much open source code. It can also be difficult to assess a candidate when these work samples are old, or are in codebases with awkward conventions or requirements.

There isn't one solution to this, but:

  • your goal is to give candidates the opportunity to demonstrate their competencies and impress you. Have an interview menu [1] rather than an interview process, and let candidates select the kind of problem that they think will be best for them. This is particularly true for more senior candidates, but I think it works across the experience spectrum.

  • if you want to do a programming or technical problem in real time, there are lots of different kinds of great exercises, so avoid asking yet another candidate to implement bubble sort or graph search, or to reverse a linked list. Things like:

    • find a class (or collection of types/functions) in your codebase that you can share, and have the candidate read it, try to understand and explain how it works, and then offer up suggestions for how to improve it in some way. I find this works best with 100-200 lines of code, and as long as you can explain idioms and syntax to them, it doesn't matter if they know the language. Reading code you don't know is a big part of the job anyway.
    • provide the candidate with a function that doesn't have side effects, but is of moderate length, and have them write tests for all the edge cases (there's a sketch of this kind of exercise after this list). It's ok if the function has a bug that can be uncovered in the course of writing tests, but this isn't particularly important.
    • provide the candidate with a set of stubs and a complete test suite, and have them implement the interface that matches the test cases (the second sketch after this list shows what such a handout might look like). This works well for problems where the class in question should implement a fairly pedestrian kind of functionality, like "a hash map with versioned values for keys" or "a collection/cache that expires items on an LRU basis."
    • have the candidate do a code review of a meaningful change. This is an opportunity to see what it's like to work with them, to give them a view into your process (and code!), and, most importantly, to ask questions, which can provide a lot of insight into their mindset and methods.

    I think that the menu approach also works well here: different people have different skills and different ways of framing them, and there's no real harm in giving people a choice.

  • independent/take home/asynchronous exercises tend to be better (particularly for more senior candidates,) as they more accurately mirror the way that we, as programmers, actually work. At the same time, it's really easy to give people problems that are too big, too complex, or that just take too long to solve well. You can almost always get the same kind of signal from smaller problems anyway. I also believe that offering candidates some kind of honorarium for the interview process is generally a good practice.

  • one of the big goals of the interview process is to introduce a candidate to the team and give them a sense of who's on the team and how they operate, which I think has given rise to increasingly long interview sequences. Consider pairing up interviewers for some or all of your interview panel to give candidates greater exposure to the team without taking a huge amount of their time. This is also a great way to help your team build skills at interviewing.

  • assessing candidates should balance the candidate's skills and alignment with the team against the team's needs and capacity for ramping up new members. Particularly for organizations that place candidates on teams late in the process, it's easy to effectively run two processes (which just takes a while,) and end up with "good" candidates who are haphazardly allocated to teams that aren't a good fit.
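
To make the "write tests for a side-effect-free function" exercise concrete, here's a minimal sketch in Python. The function and its tests are hypothetical, invented for illustration rather than taken from any real codebase; the point is that a small, pure function with a few boundary conditions gives a candidate plenty of edge cases to find well within an hour.

```python
import unittest


# A hypothetical exercise function: side-effect free, short enough to
# read in a few minutes, with several edge cases worth testing.
def merge_ranges(ranges):
    """Merge overlapping or touching (start, end) ranges; return them sorted."""
    merged = []
    for start, end in sorted(ranges):
        if merged and start <= merged[-1][1]:
            # Overlaps (or touches) the previous range: extend it.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged


# The candidate's task is to enumerate edge cases like these:
class MergeRangesTests(unittest.TestCase):
    def test_empty_input(self):
        self.assertEqual(merge_ranges([]), [])

    def test_disjoint_ranges_come_back_sorted(self):
        self.assertEqual(merge_ranges([(5, 6), (1, 2)]), [(1, 2), (5, 6)])

    def test_overlapping_ranges_merge(self):
        self.assertEqual(merge_ranges([(1, 4), (3, 6)]), [(1, 6)])

    def test_touching_ranges_merge(self):
        self.assertEqual(merge_ranges([(1, 2), (2, 3)]), [(1, 3)])

    def test_contained_range_is_absorbed(self):
        self.assertEqual(merge_ranges([(1, 10), (2, 3)]), [(1, 10)])


if __name__ == "__main__":
    unittest.main()
```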
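
And here's a sketch of the stubs-plus-test-suite format, using the LRU cache example from above. The interface and names are illustrative assumptions: the candidate receives the stubbed class and the complete test suite, every test fails as handed out, and the exercise is to fill in the stubs until the suite passes.

```python
import unittest


class LRUCache:
    """A fixed-capacity cache that evicts the least recently used key."""

    def __init__(self, capacity: int):
        raise NotImplementedError  # candidate implements

    def get(self, key):
        """Return the value for key (marking it recently used), or None."""
        raise NotImplementedError  # candidate implements

    def put(self, key, value):
        """Insert or update key, evicting the LRU entry if over capacity."""
        raise NotImplementedError  # candidate implements


# The test suite is complete and handed out as-is; it doubles as the
# specification for the behavior the candidate needs to implement.
class LRUCacheTests(unittest.TestCase):
    def test_get_returns_stored_value(self):
        cache = LRUCache(capacity=2)
        cache.put("a", 1)
        self.assertEqual(cache.get("a"), 1)

    def test_missing_key_returns_none(self):
        cache = LRUCache(capacity=2)
        self.assertIsNone(cache.get("missing"))

    def test_least_recently_used_key_is_evicted(self):
        cache = LRUCache(capacity=2)
        cache.put("a", 1)
        cache.put("b", 2)
        cache.get("a")     # "a" is now more recently used than "b"
        cache.put("c", 3)  # over capacity: evicts "b"
        self.assertIsNone(cache.get("b"))
        self.assertEqual(cache.get("a"), 1)
        self.assertEqual(cache.get("c"), 3)


if __name__ == "__main__":
    unittest.main()
```

A nice property of this format is that the problem is self-grading: both you and the candidate can see exactly how far along the solution is at any point.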

These are hard problems, and I think it's important both to be open to different ways of interviewing and to reflect on the process over time. One of the great risks is that a team will develop an interview process and then keep using it even as it becomes less effective because interviewers and needs change. Have (quick) retrospectives about your interview process to help make sure that it stays fresh and effective.

I think this is a follow up, in some ways, to my earlier post on Staff Engineering. If you liked this, check that out!