
Report describes Apple’s “organizational dysfunction” and “lack of ambition” in AI

    Siri, Apple’s sort-of AI assistant, in an iOS interface near the iPhone’s dock.

    Samuel Axon

    A new behind-the-scenes report in The Information details Apple’s struggle to keep up with AI features and innovation amid the rise of large language models (LLMs) powering breakthrough tools like ChatGPT.

    The article focuses on the efforts of John Giannandrea, the company’s AI chief since 2018, to bring order to a fragmented AI group and make Apple more competitive with companies like Google, Giannandrea’s former employer.

    In some ways, The Information’s piece summarizes or confirms what we already know, such as Apple employees’ frustrations with the limitations of Siri’s underlying technology, which had previously been reported, but it draws on new sources to add context and depth to the story.

    For example, it reveals that the team that worked on Apple’s long-in-development mixed reality headset was so frustrated with Siri that it considered developing a completely separate, alternative voice control method for the headset.

    But the report goes beyond relaying neutral details; it marshals that information into a structured case arguing that Apple is ill-prepared to compete in the rapidly changing field of AI.

    Think differently, indeed

    As Google restructures itself to push products like Bard, and Microsoft injects ChatGPT-related AI features into a wide variety of products from Bing to Word to GitHub, Apple’s recent approach to AI has been different: it has focused almost exclusively on practical applications in its devices’ features. The emphasis is on using machine learning to improve palm detection on the iPad, give iPhone users more useful photo-editing tricks, and sharpen suggestions in Apple’s content-focused apps, among other similar things.

    That is a different tack from the ambitious experiments and headline-grabbing innovations at companies such as OpenAI, Microsoft, or Google. Apple has been relatively conservative, treating AI and machine learning as tools to improve the user experience rather than as a way to reinvent how work gets done or to disrupt existing industries.

    In fact, The Information’s sources provide plenty of examples of senior Apple leaders slowing down (or at least curbing) aggressive efforts within the company’s AI group for fear of exposing products like Siri to the same kinds of embarrassing factual errors or unhinged behavior that ChatGPT and its ilk have displayed. In other words, Apple isn’t keen on tolerating what many in AI research and product development call “hallucinations.”

    For example, Siri’s answers aren’t generative; they’re written and composed by humans. Apple’s leadership has been hesitant to let Siri developers push the voice assistant toward the detailed back-and-forth conversations seen in the latest LLM-driven chatbots. The company views those exchanges as more attention-grabbing than useful, and it is concerned about liability for bad answers.