Falling through floors, walking through walls, floating off the map: glitches in gaming can cause anything from needless frustration to hours of lost playtime.
In extreme cases, these glitches can make games completely unplayable or render features unusable, resulting in massive losses of revenue or even lawsuits. For instance, Cyberpunk 2077’s buggy initial release in 2020 cost its developer, CD Projekt, more than $51 million in refunds.
However, it can be extremely difficult for developers to ship a completely bug-free product. Video games must accommodate the countless choices a player could make at any given stage of the game, and it’s not always feasible for playtesters—who play games to ensure quality control—to catch every bug before a game is released.
Where developers saw grunt work, however, computer science PhD candidate Sasha Volokh saw an opportunity for automation. The problem is the subject of a paper co-authored by Volokh and his advisor, William Halfond, a computer science professor, which won the Best Paper Award at the 17th International Conference on the Foundations of Digital Games.
Adapting to the game
Currently, video game playtesters must manually carry out every possible player choice in each game, which is both laborious and time-consuming.
To illustrate the problem, Volokh provides the example of a platformer—a game such as Super Mario Bros. or Mega Man, in which the character navigates the world in a series of two-dimensional stages full of platforms.
Proper quality control for these games means that a human needs to input every combination of button presses for hours on end, just to make sure every possible action behaves as intended.
“You’re basically trying to cause some unintended mechanic to occur,” Volokh said. “Like running through a wall or getting an item that you weren’t supposed to at that point in the game.”
Instead, Volokh’s playtesting software can adapt to the game in question, “learning” its controls and randomly inputting character choices as a human would.
“I would describe it less as a complete replacement of playtesters and more like an attempt to augment their abilities or capabilities,” Volokh said. “The more repetitive aspects of this work can be done in an automated manner, but still controlled by the playtesters.”
The program is meant to be used as a tool in the playtesting process, eliminating the need for a human to carry out all these actions themselves. Instead, the program can take care of much of this mindless work by cycling through something called a “valid action”: a combination of inputs and game context that translates into running, jumping, and so on.
“The tool looks at the code and all the different possible paths you can take, all the different ways a certain code can run,” Volokh said. “For instance, if a player is on the ground and holding the jump button. The tool would look at this combination and determine that it’s a valid action. As long as the player is on the ground, when they jump, they go up. It’s not a valid action if they’re already in the air and pressing jump won’t do anything.”
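In code, the grounded-jump example Volokh describes might look something like the following minimal sketch. The names here (`PlayerState`, `is_valid_action`) and the hand-written rules are illustrative assumptions for the sake of example; the actual tool determines valid actions by analyzing the game’s code paths rather than relying on rules written out by hand.

```python
# Illustrative sketch of a "valid action" check for a simple platformer.
# PlayerState and is_valid_action are hypothetical names; the real tool
# derives valid actions by analyzing the game's code, not from rules
# like these.
from dataclasses import dataclass


@dataclass
class PlayerState:
    on_ground: bool  # True when the character is standing on a surface


def is_valid_action(state: PlayerState, button: str) -> bool:
    """Return True if this input would actually do something in this context."""
    if button == "jump":
        # Pressing jump only has an effect while the player is on the ground.
        return state.on_ground
    # Other inputs (e.g. left/right movement) are assumed always valid here.
    return True
```

An automated tester can then draw random inputs, much as a human might, but only spend time executing the ones that count as valid actions in the current game context.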
This way, a game can be tested more thoroughly than by randomly mashing the buttons on a controller. By exhausting the valid actions that a player can take on a given stage, developers can ensure that things like environment interaction and player velocity work as intended.
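To picture what “exhausting the valid actions” could look like, here is a toy sketch: enumerate every combination of button presses for a given game state and keep only the combinations that count as valid actions. The three-button control set and the “jump only while grounded” rule are assumptions for illustration, not the tool’s real model.

```python
# Toy sketch: enumerate all button-press combinations for a game state,
# then filter out the ones that are not valid actions. The button set and
# the grounded-jump rule are illustrative assumptions.
from itertools import product

BUTTONS = ["left", "right", "jump"]


def valid_combinations(on_ground: bool) -> list[dict]:
    combos = []
    # Each button is either pressed (True) or released (False): 2^3 = 8 cases.
    for pressed in product([False, True], repeat=len(BUTTONS)):
        combo = dict(zip(BUTTONS, pressed))
        # A jump press only counts as a valid action while on the ground.
        if combo["jump"] and not on_ground:
            continue
        combos.append(combo)
    return combos


# On the ground, all 8 combinations are valid; in the air, the 4
# combinations that include a jump press are filtered out, leaving 4.
```

Systematically covering this filtered set, rather than sampling inputs blindly, is what lets an automated tester exercise every action a player could meaningfully take at a given point in the game.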
Open ended, open sourced
The program, which is currently hosted on GitHub, is available for anyone to download and try, and Volokh intends to keep it that way.
“I would like to keep it open source because it’s something that can be valuable to build on,” Volokh said. “I would be interested in seeing what others can build onto this work.”
In the future, Volokh hopes to integrate AI into the program. With AI integration, it might be able to play games with an objective in mind as a human does, such as reaching the end of a level or obtaining an item.
“It’s a very simple strategy that’s been known to be remarkably effective, but we want to go a step further,” Volokh said. “Our next level is to play with a goal in mind like completing a stage or progressing in the game, simulating how a person plays.”
Published on April 5th, 2023
Last updated on April 10th, 2023