Should you get paid for teaching a chatbot to do your job?

    As a result, the company spent a lot of time training new employees who were hired to replace those who quit. Many of the skills needed were what the researchers called “tacit knowledge,” experiential knowledge that can’t be easily codified but that large language models can absorb from chat logs and then mimic. The company’s bot helped with both technical and social skills, pointing agents to relevant technical documents and suggesting chipper phrases to calm seething customers, such as “looking forward to helping you fix this as soon as possible!”

    After the bot started helping, the number of problems the team solved per hour increased by 14 percent. In addition, the probability that an employee would quit in a given month dropped by 9 percent, and customer attitudes toward employees also improved. The company also saw a 25 percent drop in the number of customers seeking to speak to a manager.

    But when the researchers broke down the results by skill level, they found that most of the chatbot’s benefits went to the least skilled workers, who saw a 35 percent increase in productivity. The best-trained employees saw no gains, and their customer satisfaction scores even dropped slightly, suggesting the bot may have been a distraction.

    The value of that highly skilled work, meanwhile, multiplied when the AI assistant directed lower-skilled workers to use the same techniques.

    There is reason to doubt that employers will reward that value of their own accord. Aaron Benanav, a historian at Syracuse University and author of the book Automation and the Future of Work, sees a historical parallel in Taylorism, a productivity system developed in the late 1800s by a mechanical engineer named Frederick Taylor and later applied in Henry Ford’s automobile factories.

    Using a stopwatch, Taylor broke down physical processes into their component parts to determine the most efficient way to complete them. He paid special attention to the most skilled workers in a trade, says Benanav, in order “to get less skilled workers to work in the same way.” Instead of a fastidious engineer wielding a stopwatch, machine learning tools can now collect and disseminate employees’ best practices.

    That didn’t work out so well for some employees in Taylor’s day. His methods have been associated with declining incomes for higher-skilled workers, because companies could pay lower-skilled workers to do the same type of work, Benanav says. Even though some high performers were still needed, companies needed fewer of them, and competition among those workers increased.

    “Some say that played a pretty big role in fueling unionization among all these low-skilled or medium-skilled workers in the 1930s,” says Benanav. Still, some less punitive arrangements emerged. One of Taylor’s supporters, the mechanical engineer Henry Gantt—yes, the chart guy—devised a system that paid all workers a minimum wage but offered bonuses to those who hit additional targets.

    Even if employers feel incentivized to pay top performers a premium for teaching AI systems, or if employees win that premium for themselves, it can be difficult to divide the spoils fairly. For starters, data from different workplaces can be aggregated and sent to an AI company that builds a model and then sells it to individual companies.