The overlooked benefits of algorithms in the workplace

    You describe the potential of using candidate screening technology in the form of an online game, such as Wasabi Waiter from a company called Knack, in which the applicant plays a server at a busy sushi restaurant. How can that be effective in assessing applicants?

    It’s about thinking more creatively about what we screen for, using insights from psychology and other research into what makes a good team player. You don’t just want what we call exploitation algorithms, which look at who became successful employees in the past – someone who graduated from an Ivy League college and captained a sports team, say – you also want exploration algorithms that look beyond the profiles that have succeeded before.
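    To make the exploitation/exploration contrast concrete, here is a minimal sketch of an epsilon-greedy screening rule. The candidate features, the scoring function and the epsilon value are all hypothetical illustrations, not anything described in the interview or used by Knack.

    ```python
    import random

    # Hypothetical features of the "people who succeeded before" profile.
    PAST_SUCCESS_PROFILE = {"ivy_league": 1, "team_captain": 1}

    def exploitation_score(candidate: dict) -> float:
        """Similarity to the historical 'successful hire' profile."""
        return sum(candidate.get(k, 0) == v for k, v in PAST_SUCCESS_PROFILE.items())

    def screen(candidates: list[dict], epsilon: float = 0.2) -> dict:
        """Epsilon-greedy screening: mostly exploit past patterns, sometimes explore.

        With probability epsilon a candidate is advanced regardless of how well
        they match the historical profile, which is what lets the system learn
        whether unfamiliar profiles also turn into good hires.
        """
        if random.random() < epsilon:
            return random.choice(candidates)            # explore: unfamiliar profile
        return max(candidates, key=exploitation_score)  # exploit: repeat past patterns
    ```

    A pure exploitation rule is the epsilon = 0 case: it only ever reproduces whatever pattern is already in the historical data.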

    There is a lot of talk about the black box problem – that it is difficult to understand what the algorithm is actually doing. But from my experience as an expert witness in employment discrimination lawsuits and recruitment investigations, it is also very difficult to break through the black box of the human mind and trace what happened. With digital processes, we at least have a paper trail, and we can check whether a game or some form of automated emotional screening outperforms the previous way of screening in producing a more diverse pool of people.

    My personal experience of applying to jobs that require aptitude tests and personality tests is that I find them opaque and frustrating. Speaking to someone face-to-face can give you a little sense of how you’re doing. If the whole process is automated, you don’t even really know what you’re being tested for.

    That’s what many people feel. But here’s where I get a little more contrarian. It’s not just about how people experience the interview, but also what we know about how good people are at making judgments during an interview.

    There is quite a bit of research showing that job interviews are a poor predictor of job performance and that interviewers consistently overestimate what they can actually get out of an interview. There is even research showing how bias rears its ugly head within seconds. If we are serious about expanding the pool of people eligible for a job, the sheer number of applicants will be too much for any human to take on, at least in the early stages.

    Many of these workplace biases are well documented. We have known about the gender pay gap for a long time, but it has proved very difficult to close. Can automation help there?

    It is frustrating to see how stagnant the gender pay gap has been, even though we have equal pay laws on the books. With the huge datasets now available, I think we can do better. Textio’s software helps companies write job postings that are more inclusive and result in a more diverse applicant base. Syndio can detect wage inequality between different segments of the workforce in large workplaces, which is sometimes harder to see.

    It’s quite intuitive: if we use software to look at many different forms of pay and many different job openings, we can break through that veil of formal job descriptions in a large workforce and see what’s happening in terms of gender and race. We used to think of auditing as a one-off – once a year – but here you can have a continuous audit that runs over months and flags a sudden increase in the pay gap caused by things like bonuses.
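    A minimal sketch of the kind of segment-by-segment, repeatable pay audit described above. The column names and figures are invented for illustration, and this is not Syndio’s actual method – just one way such a breakdown could be recomputed every pay cycle.

    ```python
    import pandas as pd

    # Hypothetical payroll snapshot; the schema is illustrative only.
    payroll = pd.DataFrame({
        "job_family": ["engineer", "engineer", "sales", "sales"],
        "gender":     ["F", "M", "F", "M"],
        "base_pay":   [98000, 104000, 70000, 71000],
        "bonus":      [5000, 12000, 8000, 15000],
    })

    # Include all forms of pay, since gaps often open up when bonuses land.
    payroll["total_pay"] = payroll["base_pay"] + payroll["bonus"]

    # Median total pay by job family and gender, then the F/M ratio per segment.
    medians = payroll.pivot_table(index="job_family", columns="gender",
                                  values="total_pay", aggfunc="median")
    medians["pay_ratio_F_to_M"] = medians["F"] / medians["M"]
    print(medians)
    ```

    Rerunning a report like this after every pay cycle, rather than once a year, is what turns a one-off audit into the continuous monitoring the answer describes.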