
Sony’s AI controls a race car like a champion

    Takuma Miyazono started driving virtual race cars at the age of 4, when his father brought home the highly realistic motorsport game Gran Turismo 4. Sixteen years later, in 2020, Miyazono became the Gran Turismo world champion, winning an unprecedented "triple crown" of esports racing events. But he had never faced a Gran Turismo driver quite like GT Sophy, an artificial intelligence developed by Sony and Polyphony Digital, the studio behind the Gran Turismo franchise.

    "Sophy is very fast, with lap times better than even the best drivers would expect," he says through a translator. "But watching Sophy, there were certain moves that I only realized afterward were possible."

    Video games have become a major sandbox for AI research in recent years, with computers mastering a growing number of titles. But Gran Turismo represents an important new challenge for a machine.

    Unlike board games that AI has mastered, such as chess or Go, Gran Turismo demands continuous judgment and quick reflexes. And unlike complex strategy games such as StarCraft or Dota, success hinges on challenging driving maneuvers. A Gran Turismo ace must balance pushing a virtual car to its limits, grappling with friction, aerodynamics, and precise driving lines, against the subtle dance of trying to overtake an opponent without unfairly blocking their line and incurring a penalty.

    “Outracing human drivers so skillfully in head-to-head competition represents a milestone for AI,” writes Chris Gerdes, a Stanford professor who studies autonomous driving, in an article published Wednesday in the journal Nature alongside the Sony study.

    Gerdes says the techniques used to develop GT Sophy could help advance autonomous cars. Today's self-driving cars use the kind of neural-network algorithm behind GT Sophy only to track road markings and to sense other vehicles and obstacles; the software that actually controls the car is written by hand. “GT Sophy’s success on the track suggests that neural networks may one day play a bigger role in automated vehicle software than they do today,” Gerdes writes.
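The split Gerdes describes can be sketched in a few lines (a hypothetical illustration, not any real vehicle stack: the function names, the proportional gain, and the perception interface are all assumptions): a learned network handles perception, while a hand-written rule handles control.

```python
# Illustrative sketch of today's typical autonomy split:
# a neural network perceives, hand-written logic controls.

def perceive(camera_frame, perception_net):
    """Learned stage: a neural network detects lane markings and obstacles,
    returning e.g. the car's lateral offset from the lane center."""
    return perception_net(camera_frame)

def steer(lane_offset_m):
    """Hand-written stage: a simple proportional rule that steers back
    toward the lane center. K_P is an illustrative gain, not a real value."""
    K_P = 0.5
    return -K_P * lane_offset_m  # offset right -> steer left, and vice versa
```

GT Sophy's result hints that the learned stage could one day absorb the second, hand-written stage as well.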

    Sony announced in 2020 that it was developing a prototype electric car with advanced driver assistance features. But the company says there are no plans yet to use GT Sophy in its automotive efforts.

    GT Sophy also shows how important simulated environments have become for real-world AI systems. Many companies developing self-driving technology use advanced computer simulations to generate training data for their algorithms. Waymo, Alphabet’s self-driving car company, for example, says its vehicles have covered the equivalent of 20 million miles in simulations.

    “Using machine learning and autonomous driving for racing is exciting,” said Avinash Balachandran, senior manager for Human Centric Driving Research at the Toyota Research Institute, which is testing self-driving cars capable of driving at extreme speeds. He says Toyota is working on “human enhancement, where technologies leveraging the experiences of motorsport experts could one day improve active safety systems.”

    Bruno Castro da Silva, a professor at the University of Massachusetts Amherst who researches reinforcement learning, calls GT Sophy “an impressive achievement” and an important step toward training AI systems for autonomous vehicles. But da Silva says moving from Gran Turismo to the real world will be challenging, because it is difficult for reinforcement-learning algorithms like GT Sophy to weigh the long-term implications of decisions, and because it is difficult to guarantee the safety or reliability of such algorithms.