Open Source AI Gets Founders – and the FTC – Excited

    Much of yesterday’s conversation was filled with the acronyms you’d expect from this group of high-minded panelists: YC, FTC, AI, LLMs. But interspersed throughout the conversation—fundamentally, you might say—was the advocacy for open-source AI.

    It was a huge shift (or a throwback, if you're a Linux fan) from the app-obsessed 2010s, when developers were eager to containerize their technologies and port them to larger platforms for distribution.

    The event also came just two days after Meta CEO Mark Zuckerberg declared that “open source AI is the way forward” and released Llama 3.1, the latest version of Meta’s own open source AI algorithm. As Zuckerberg put it in his announcement, some technologists no longer want to be “constrained by what Apple lets us build” or face arbitrary rules and app fees.

    Open source AI also happens to be the approach that OpenAI does not take for its largest GPT models, despite what the multibillion-dollar startup’s name might suggest. This means that at least some of the code remains private, and OpenAI doesn’t share the “weights,” or parameters, of its most powerful AI systems. It also charges fees for enterprise-level access to its technology.

    “With the rise of composite AI systems and agent architectures, using small but finely tuned open source models yields significantly better results than [OpenAI’s] GPT-4 or [Google’s] Gemini. This is especially true for enterprise tasks,” says Ali Golshan, co-founder and CEO of synthetic data company Gretel.ai. (Golshan was not present at the YC event.)

    “I don’t think it’s OpenAI versus the world or anything like that,” says Dave Yen, who runs a fund called Orange Collective for successful YC alumni to back up-and-coming YC founders. “I think it’s about creating fair competition and an environment where startups don’t risk dying the next day if OpenAI changes their pricing models or their policies.”

    “That’s not to say we shouldn’t have safeguards,” Yen added, “but we also don’t want to unnecessarily restrict speeds.”

    Open source AI models carry a number of inherent risks that more cautious technologists have warned about. The most obvious stems from the technology being open and free: people with malicious intent are more likely to use these tools to cause harm than they are to use an expensive proprietary AI model. Researchers have also pointed out that it is cheap and easy for malicious actors to train away any safety parameters present in these models.

    “Open source” is also something of a myth in some AI models, as WIRED’s Will Knight has reported. The data used to train them can still be kept secret, their licenses can restrict developers from building certain things, and ultimately they may still benefit the original model maker more than anyone else.

    And some politicians have pushed back against the unbridled development of large-scale AI systems, including California state Sen. Scott Wiener. Wiener’s AI Safety and Innovation Bill, SB 1047, is controversial in tech circles. It aims to set standards for developers of AI models that cost more than $100 million to train, requires certain levels of pre-deployment safety testing and red-teaming, protects whistleblowers working in AI labs, and gives the state’s attorney general legal recourse if an AI model causes extreme harm.

    Wiener himself spoke at the YC event on Thursday, in a conversation moderated by Bloomberg reporter Shirin Ghaffary. He said he was “deeply grateful” to people in the open-source community who spoke out against the bill, and that the state “has made a series of amendments in direct response to some of that critical feedback.” One change that has been made, Wiener said, is that the bill now more clearly defines a reasonable path to shutting down an open-source AI model that has gone off the rails.

    The featured speaker at Thursday’s event, a last-minute addition to the program, was Andrew Ng, the co-founder of Coursera, founder of Google Brain and former chief scientist at Baidu. Ng, like many of the other attendees, spoke in defense of open-source models.

    “This is one of those moments when [it will be determined] whether entrepreneurs are allowed to keep innovating,” Ng said, “or whether we have to spend the money that would go toward building software on hiring lawyers.”