When Sam Altman said a year ago that OpenAI is his Roman Empire, he wasn't kidding. In the same way that the Romans gradually amassed an empire of land stretching across three continents and a ninth of the Earth's circumference, the CEO and his cohort are now seeding the planet with their own latifundia – not farmlands, but AI data centers.
Tech executives like Altman, Nvidia CEO Jensen Huang, Microsoft CEO Satya Nadella, and Oracle co-founder Larry Ellison are fully committed to the idea that the future of the American (and possibly global) economy rests on these new IT infrastructure warehouses. But data centers are, of course, not really new. In the early days of computing, gigantic power-hungry mainframes sat in climate-controlled rooms, with coaxial cables carrying information from the mainframe to terminal computers. Then the consumer Internet boom of the late 1990s ushered in a new era of infrastructure. On the outskirts of Washington, DC, huge buildings went up with rack after rack of computers that stored and processed data for technology companies.
Ten years later, 'the cloud' became the de facto infrastructure of the internet. Storage got cheaper. Some companies, such as Amazon, capitalized on this. Giant data centers kept growing, but instead of technology companies running a mix of on-premises servers and rented data center racks, they moved their computing needs into virtualized environments. (“What is the cloud?” a perfectly intelligent relative asked me in the mid-2010s. “And why am I paying for seventeen different subscriptions?”)
Meanwhile, tech companies were collecting petabytes of data – data that people happily shared online, at corporate workplaces, and through mobile apps. Companies began finding new ways to mine and structure this 'Big Data', promising it would change lives. In many ways it did. You could see where this was going.
Now the tech industry is in the fever-dream days of generative AI, which demands new levels of computing resources. Big Data is out; big data centers, cabled up for AI, are in. Faster, more efficient chips are needed to power AI data centers, and chipmakers like Nvidia and AMD are jumping up and down on the proverbial couch proclaiming their love for AI. The industry has entered an unprecedented era of capital investment in AI infrastructure, helping to keep US GDP growth in positive territory. These are huge, whirlwind deals that might as well be handshakes at a cocktail party, slathered in gigawatts and exuberance, while the rest of us try to track down real contracts and dollars.
OpenAI, Microsoft, Nvidia, Oracle, and SoftBank have made some of the biggest deals. This year, an earlier supercomputing project between OpenAI and Microsoft, called Stargate, became the vehicle for a large-scale AI infrastructure buildout in the US. (President Donald Trump called it the largest AI infrastructure project in history, because of course he did, but that might not have been hyperbole.) Altman, Ellison, and SoftBank CEO Masayoshi Son were all involved in the deal, pledging $100 billion to start, with plans to invest up to $500 billion in Stargate over the next few years. Nvidia GPUs would be deployed. Later, in July, OpenAI and Oracle announced an additional Stargate partnership – SoftBank was noticeably absent – measured in gigawatts of capacity (4.5) and expected job creation (about 100,000).
