Cheyenne MacDonald at the Daily Mail has written a fascinating new article on AI and its ability to clone video game software simply by observing how a game is played.
A Georgia Tech research team has developed an AI that can replicate a video game after observing as little as two minutes of gameplay. In a typical experiment, the AI is first given a video clip of a player interacting with the game. It then uses a search algorithm to assemble a set of rules that predicts how the game must operate. The end result is a predictive model of the game's engine. At no point is the AI exposed to any actual game code; it infers how the code would need to behave in order to reproduce what it observes on screen.
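To make that idea a bit more concrete, here is a minimal, hypothetical sketch of the general approach in Python (not the Georgia Tech team's actual system): frames are reduced to sprite positions, candidate rules are simple per-sprite movement patterns, and a greedy search keeps whichever rules best predict the next observed frame.

```python
# A minimal, hypothetical sketch of the general idea (not the Georgia Tech
# team's actual code): reduce each observed frame to sprite positions, propose
# simple candidate movement rules, and greedily keep the rules that best
# predict the next frame across the observed clip.

from itertools import product

# Each "frame" is a dict mapping a sprite name to its (x, y) position.
observed_frames = [
    {"player": (0, 0), "goomba": (8, 0)},
    {"player": (1, 0), "goomba": (7, 0)},
    {"player": (2, 0), "goomba": (6, 0)},
]

def make_velocity_rule(sprite, dx, dy):
    """Candidate rule: `sprite` moves by (dx, dy) every frame."""
    def apply(frame):
        new = dict(frame)
        if sprite in new:
            x, y = new[sprite]
            new[sprite] = (x + dx, y + dy)
        return new
    apply.name = f"{sprite} moves ({dx:+d}, {dy:+d})"
    return apply

# Candidate rule space: every observed sprite paired with a small set of velocities.
sprites = {s for frame in observed_frames for s in frame}
candidates = [make_velocity_rule(s, dx, dy)
              for s, dx, dy in product(sprites, (-1, 0, 1), (-1, 0, 1))]

def prediction_error(rules, frames):
    """Count how many sprite positions the rule set predicts incorrectly."""
    errors = 0
    for current, nxt in zip(frames, frames[1:]):
        predicted = current
        for rule in rules:
            predicted = rule(predicted)
        errors += sum(predicted.get(s) != nxt.get(s) for s in nxt)
    return errors

# Greedy hill-climbing search: keep adding any rule that reduces prediction
# error, stopping once no candidate improves the current "engine".
engine = []
best = prediction_error(engine, observed_frames)
improved = True
while improved:
    improved = False
    for rule in candidates:
        score = prediction_error(engine + [rule], observed_frames)
        if score < best:
            engine, best = engine + [rule], score
            improved = True

print([r.name for r in engine])  # e.g. ['player moves (+1, +0)', 'goomba moves (-1, +0)']
```

The real system works on far richer representations of what is on screen, but the basic loop is the same: search for the rule set whose predictions match the observed footage.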
The researchers found that the AI-predicted game engine ends up very similar to the original, and any discrepancies between the original and the cloned game can usually be smoothed out by feeding the AI additional clips of gameplay. In other words, with every subsequent pass the AI improves on its own prior model.
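Continuing the hypothetical sketch above, that refinement step could amount to re-running the same greedy search on each new clip, starting from the rules already learned rather than from scratch:

```python
# Hypothetical continuation of the sketch above: refine the learned rule set
# against a new clip by re-running the same greedy search, seeded with the
# rules already found instead of starting from an empty engine.
def refine(engine, new_frames, candidates):
    best = prediction_error(engine, new_frames)
    improved = True
    while improved:
        improved = False
        for rule in candidates:
            score = prediction_error(engine + [rule], new_frames)
            if score < best:
                engine, best = engine + [rule], score
                improved = True
    return engine

# A second (made-up) clip in the same format as `observed_frames`.
second_clip = [
    {"player": (2, 0), "goomba": (6, 0), "coin": (5, 1)},
    {"player": (3, 0), "goomba": (5, 0), "coin": (5, 1)},
]
engine = refine(engine, second_clip, candidates)
```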
The researchers also compared their approach against other machine learning techniques and found that the search-based method produced game engines closer to the original than those built with neural networks.
The results are impressive, to say the least. The Georgia Tech team believes this technology could one day be used to reduce development time in the video game industry.
Games used to train the AI included Super Mario Bros., Mega Man, and Sonic the Hedgehog.
Check out the original article here to see the side-by-side images of the original games and their AI-cloned versions.