
Sony’s AI Drives a Race Car Like a Champ


Takuma Miyazono started driving virtual race cars at the age of 4, when his father brought home the highly realistic racing game Gran Turismo 4. Sixteen years later, in 2020, Miyazono became the Gran Turismo world champion, winning an unprecedented “triple crown” of esports racing events. But he had never faced a Gran Turismo driver quite like GT Sophy, an artificial intelligence developed by Sony and Polyphony Digital, the studio behind the Gran Turismo franchise.

“Sophy is very fast, with lap times better than expected of the best drivers,” he said through a translator. “But watching Sophy, there were certain moves that I only believed were possible after seeing them.”

Video games have become an important sandbox for AI research in recent years, with computers mastering a growing range of titles. But Gran Turismo represents a significant new challenge for a machine.

In contrast to board games that AI has mastered, such as chess or Go, Gran Turismo demands continuous judgment and high-speed reflexes. It is also more complicated than action games like StarCraft or Dota in that it requires challenging driving maneuvers. A Gran Turismo ace must balance pushing a virtual car to its limits, grappling with friction, aerodynamics, and precise steering, against the delicate dance of trying to overtake an opponent without getting in their way unfairly and incurring a penalty.

“Outracing human drivers so skillfully in a head-to-head competition represents a landmark achievement for AI,” said Chris Gerdes, a professor at Stanford who studies autonomous driving, in an article published Wednesday alongside the Sony study in the journal Nature.

Gerdes says the techniques used to develop GT Sophy could help in the development of self-driving cars. Today, self-driving cars use the kind of neural network algorithms behind GT Sophy only to track road markings and recognize other vehicles and obstacles; the software that actually controls the vehicle is handwritten. “GT Sophy’s success on the track shows that neural networks may one day have a larger role in the software of autonomous vehicles than they do today,” Gerdes writes.
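To make Gerdes’s point concrete, here is a minimal, purely illustrative Python sketch (an assumption for explanation, not Sony’s or any carmaker’s actual code) of that division of labor: a learned perception component estimates where the car sits in its lane, while a handwritten controller turns that estimate into a steering command.

```python
class LearnedPerception:
    """Stand-in for a learned perception module.

    In a real system this would be a neural network taking camera or
    lidar frames and outputting lane position and obstacle estimates.
    Here it is a stub that returns a fixed, made-up lane offset.
    """

    def lane_offset(self, sensor_frame) -> float:
        # Hypothetical output: the car is 0.4 m left of the lane center.
        return -0.4


class HandwrittenController:
    """Classic hand-tuned control logic: no learning involved."""

    def __init__(self, gain: float = 0.8):
        self.gain = gain  # proportional steering gain, chosen by an engineer

    def steering_command(self, lane_offset: float) -> float:
        # Steer proportionally back toward the lane center.
        return -self.gain * lane_offset


# Perception (learned) feeds control (handwritten).
perception = LearnedPerception()
controller = HandwrittenController()
offset = perception.lane_offset(sensor_frame=None)
print(f"steering command: {controller.steering_command(offset):+.2f}")
```

In this split, learning stops at the perception stage; GT Sophy’s result suggests the control stage could one day be learned as well.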

In 2020, Sony announced that it was developing an electric vehicle with advanced driver-assistance features. But the company says it has no plans yet to use GT Sophy in its automotive efforts.

GT Sophy also shows how important simulated environments have become for real-world AI systems. Many companies developing self-driving technology use complex computer simulations to generate training data for their algorithms. Waymo, the self-driving car company owned by Alphabet, for example, has said its vehicles have covered the equivalent of 20 million miles in simulation.

Avinash Balachandran, senior manager of human-centered driving research at the Toyota Research Institute, which is testing self-driving cars capable of operating at extreme speeds, said the use of machine learning and automated control for racing is “very interesting.” He said Toyota is working on “human amplification, where technologies that leverage expertise from motorsport could one day improve active safety systems.”

Bruno Castro da Silva, a professor at the University of Massachusetts Amherst who studies reinforcement learning, calls GT Sophy “an impressive achievement” and an important step toward training AI systems for autonomous vehicles. But da Silva says the jump from Gran Turismo to the real world will be difficult, because reinforcement learning algorithms like the one behind GT Sophy struggle to account for the long-term consequences of decisions, and because it is hard to guarantee the safety or reliability of such algorithms.
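As a rough illustration of the long-term-consequences problem da Silva describes (a hypothetical sketch, not GT Sophy’s actual training code), a reinforcement learning agent scores a decision by the discounted sum of the rewards that follow it; the smaller the discount factor, the less a delayed penalty weighs against an immediate gain.

```python
def discounted_return(rewards, gamma):
    """Standard RL objective: sum of rewards, each weighted by gamma**t."""
    return sum(r * gamma**t for t, r in enumerate(rewards))


# Hypothetical reward stream: a risky overtake pays off immediately (+1.0)
# but leads to a crash penalty ten steps later (-5.0).
rewards = [1.0] + [0.0] * 9 + [-5.0]

for gamma in (0.5, 0.9, 0.99):
    print(f"gamma={gamma}: discounted return = {discounted_return(rewards, gamma):+.2f}")

# With gamma=0.5 the delayed penalty is almost invisible and the risky move
# still looks attractive; with gamma=0.99 the penalty dominates.
```

Tuning and validating trade-offs like this, and proving the resulting behavior is safe, is part of why moving such algorithms from a game to public roads is hard.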


