The artificial intelligence developed by DeepMind, a subsidiary of Google, has for the first time defeated two champions of electronic sports (eSports), after learning on its own to play the strategy video game StarCraft II.
Through a program called AlphaStar, DeepMind developed a deep neural network trained directly on raw data from StarCraft II, as the Google company explained in an official statement.
In a series of tests that took place on December 19, AlphaStar managed to beat the Polish player Grzegorz "MaNa" Komincz and his teammate, the German Dario "TLO" Wünsch, both members of the eSports team Team Liquid. The games took place on a competitive map of the game and without rule restrictions.
The StarCraft video game saga, a real-time strategy franchise developed by Blizzard, was considered by DeepMind a "great challenge" because of its complexity, its mechanics and the size of its maps, all of which make it hard to train automatic systems to a competitive level.
To train its AI, DeepMind used raw data from the StarCraft II interface and two techniques known as supervised learning and reinforcement learning. The neural network processes data about the game's units and uses an LSTM memory core that supports long-term learning.
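The two-stage training the article describes can be illustrated with a toy sketch. Everything below is invented for illustration (the three-state "game", the table-based policy standing in for a neural network, the reward function); it is not DeepMind's code, only the general pattern of imitation pretraining followed by reward-driven fine-tuning:

```python
import random

random.seed(0)

# Toy stand-in for StarCraft II: 3 discrete states, 3 actions.
# The "expert" (human replay data) always plays action == state.
STATES, ACTIONS = 3, 3
expert_demos = [(s, s) for s in range(STATES) for _ in range(50)]

# Policy: a table of action preferences per state (stand-in for a neural net).
policy = [[0.0] * ACTIONS for _ in range(STATES)]

def act(state):
    prefs = policy[state]
    return prefs.index(max(prefs))  # greedy action

# Stage 1 -- supervised (imitation) learning: nudge preferences toward
# the action the expert took in each recorded state.
for state, expert_action in expert_demos:
    policy[state][expert_action] += 1.0

# Stage 2 -- reinforcement learning: play the toy game, collect a reward,
# and reinforce the chosen action in proportion to that reward.
def reward(state, action):
    return 1.0 if action == state else -1.0  # matching the state "wins"

for _ in range(200):
    s = random.randrange(STATES)
    a = act(s)
    policy[s][a] += 0.1 * reward(s, a)

print([act(s) for s in range(STATES)])  # → [0, 1, 2]
```

The design choice mirrors the article's account: imitation gives the agent a reasonable starting policy cheaply, and reinforcement then improves it from the game's own feedback.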
StarCraft, Blizzard's strategy game.
The Google algorithm, a multi-agent learning system, was initially used to train AlphaStar's neural network through supervised learning: it learned from replays of human players of the Blizzard game, including their "macro" (economy and resources) and "micro" (unit control) play. With these techniques it defeated the games' most demanding built-in difficulty, known as "Elite", on 95% of occasions.
Subsequently, the researchers put AlphaStar through a reinforcement learning process, for which a continuous StarCraft II league was created in which competing agents played against one another, producing a global map of the strategies chosen by the competitors.
AlphaStar analyzed the success rate of each strategy and its possible counter tactics. With the StarCraft league, the software accumulated the equivalent of more than 200 years of real play in just 14 days.
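The league bookkeeping the article mentions (tracking each strategy's success rate and finding counters) can be sketched with a toy payoff table. The strategy names and win rates below are invented for illustration; this is not AlphaStar's league, only the shape of the analysis:

```python
# Hypothetical win rates between three toy strategies: win_rate[(a, b)] is
# the probability that strategy a beats strategy b. Numbers are invented.
strategies = ["rush", "turtle", "economy"]
win_rate = {
    ("rush", "turtle"): 0.30, ("turtle", "rush"): 0.70,
    ("rush", "economy"): 0.80, ("economy", "rush"): 0.20,
    ("turtle", "economy"): 0.40, ("economy", "turtle"): 0.60,
}

def overall_success(strategy):
    """Average win rate of a strategy against every other league member."""
    rates = [win_rate[(strategy, opp)] for opp in strategies if opp != strategy]
    return sum(rates) / len(rates)

def best_counter(strategy):
    """The opposing strategy with the highest win rate versus `strategy`."""
    return max((opp for opp in strategies if opp != strategy),
               key=lambda opp: win_rate[(opp, strategy)])

for s in strategies:
    print(s, round(overall_success(s), 2), "countered by", best_counter(s))
```

With the invented numbers the table is rock-paper-scissors-like: turtle counters rush, economy counters turtle, rush counters economy, which is exactly why a league of diverse opponents is more informative than a single fixed one.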
DeepMind's other successes
According to DeepMind, this is the first time the Google system has beaten professional eSports players, and it did so with a 5-0 result. To achieve this it took advantage of a higher average number of actions per minute (tens of thousands versus hundreds) and overcame limitations of the algorithm such as a 350-millisecond delay between observation and action.
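The observation-to-action delay mentioned above can be modeled as a fixed-length buffer: the agent always reacts to what it saw a few ticks ago. The class and tick granularity below are illustrative assumptions, not DeepMind's implementation:

```python
from collections import deque

class DelayedAgent:
    """Toy agent whose actions are based on observations from `delay_ticks`
    ago, mimicking a fixed observation-to-action latency (e.g. ~350 ms)."""

    def __init__(self, delay_ticks):
        # Pre-fill with "no observation yet" markers until real data arrives.
        self.buffer = deque([None] * delay_ticks)

    def step(self, observation):
        self.buffer.append(observation)      # newest observation goes in
        stale = self.buffer.popleft()        # what the agent actually "sees"
        return None if stale is None else f"react-to-{stale}"

agent = DelayedAgent(delay_ticks=3)
outputs = [agent.step(t) for t in range(6)]
print(outputs)
# → [None, None, None, 'react-to-0', 'react-to-1', 'react-to-2']
```

The point of such a constraint is fairness: every reaction lags the game state by a fixed amount, much as a human's does.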
South Korean Lee Sedol, world champion of the board game Go, competed against Google's artificial intelligence AlphaGo. (Photo: AP / Lee Jin-man)
DeepMind's current tests with StarCraft II are not the first Google AI experiments with video games: its systems already play other titles such as Atari games, Mario and Dota 2 and, since last summer, Quake III in "Capture the Flag" mode.