PlaytestCloud is a platform that makes it easy for mobile game studios to playtest their games. Game developers simply upload their game builds to playtestcloud.com and watch how gamers play them; we take care of distributing the builds to the right group of players from our panel of more than 350,000 gamers. These gamers play the game on their own devices and record a screencast of their experience. As of today, 25% of the top 100 grossing App Store games have already been tested with PlaytestCloud.
We started PlaytestCloud with the idea of making the game testing process as simple as possible for our customers. A large part of that idea was that there should be no SDK that needs to be integrated with the game code. Instead, we saw an opportunity to ‘inject’ a software library, the PlaytestCloud Wrapper, directly into the binary package of the game. The Wrapper modifies the game code so that the game, when launched on the player’s device, shows a PlaytestCloud pre-roll explaining the testing process. It then captures a video of the gameplay and uploads it in the background. The PlaytestCloud Wrapper exists as two separate code bases, one for iOS and one for Android.
Modifying game code without being detected
In addition to recording the gameplay, we also hook into certain behaviors of the player’s device. For instance, we interrupt recording when a player receives a phone call, and we record where the player touches the screen so that we can highlight their touches in the video. We also overlay the game with certain screens, for example, tasks that the players must perform during the testing process.
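The touch recording above relies on a classic “wrap and forward” hooking pattern: intercept the original handler, capture what you need, then delegate unchanged. The real Wrapper does this in native iOS/Android code inside the game binary; the Python sketch below is only an analogy, and every name in it (`GameView`, `on_touch`) is hypothetical.

```python
# Illustrative sketch of "wrap and forward" hooking. The actual
# PlaytestCloud Wrapper patches native code; all names here are made up.

class GameView:
    """Stands in for the game's own touch handler."""
    def on_touch(self, x, y):
        return f"game handled touch at ({x}, {y})"

recorded_touches = []

def install_touch_hook(view_cls):
    """Wrap the original handler: record the touch, then forward it
    unchanged so the game behaves exactly as before."""
    original = view_cls.on_touch

    def hooked(self, x, y):
        recorded_touches.append((x, y))  # capture for the video overlay
        return original(self, x, y)      # forward to the game untouched

    view_cls.on_touch = hooked

install_touch_hook(GameView)
view = GameView()
result = view.on_touch(120, 480)
# The game sees its normal return value; the wrapper logged the touch.
```

Because the hook always delegates to the original handler, the game’s observable behavior stays identical, which is exactly the property the next section is about preserving.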
A significant challenge here is that, ideally, the game code itself should not notice that additional code is running alongside it. Otherwise, incompatibilities can arise when the game encounters an unexpected situation, leading to erratic behavior and crashes. For instance, some games access the topmost window and assume that it is the main game window, but in some cases it might be an overlay from the PlaytestCloud Wrapper. If the game then calls methods on that window, they might not be defined, or might do something the game does not expect.
To combat this, we try to keep the “attack surface” of the wrapper small, reducing the number of observable changes to a minimum. We also implement several “masking” mechanisms that make the game believe nothing has changed and that it is running in its expected environment.
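One way to picture such a masking mechanism, reduced to a Python sketch: the wrapper intercepts the call the game uses to enumerate windows and filters its own overlay out of the result, so the topmost window the game sees is always its own. The real mechanism operates on native window APIs; `Screen`, `Window`, and `window_list` below are hypothetical names for illustration only.

```python
# Illustrative sketch of masking: hiding a wrapper overlay from the
# window list the game queries. Native iOS/Android details omitted;
# all names here are hypothetical.

class Window:
    def __init__(self, owner):
        self.owner = owner  # "game" or "wrapper"

class Screen:
    """Stands in for the platform API the game uses to enumerate windows."""
    def __init__(self):
        self.windows = []

    def window_list(self):
        return list(self.windows)

def install_window_mask(screen_cls):
    """Patch window_list so wrapper-owned overlays never appear in it."""
    original = screen_cls.window_list

    def masked(self):
        return [w for w in original(self) if w.owner != "wrapper"]

    screen_cls.window_list = masked

screen = Screen()
screen.windows = [Window("game"), Window("wrapper")]  # overlay on top
install_window_mask(Screen)

# The game now sees only its own window, even with the overlay present.
visible = screen.window_list()
```

The design choice is the same as with the touch hook: intercept at the boundary the game already uses, and return exactly what it would have seen without the wrapper present.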
The Problem: Regression testing is a significant manual effort
When we add new features or “masking” mechanisms to the wrapper, we need to make sure they don’t cause incompatibilities with the countless ways mobile games can be implemented.
Previously, we tested for such regressions manually. We injected the PlaytestCloud Wrapper into representative game builds, installed them on devices from our office lab, and made sure that the core features, video recording and upload, still worked.
As a safety net, we used a staged rollout process: new code versions would first run on only a subset of the playtests on our platform before being rolled out across all game tests. In the worst case, that meant a single playtest on our platform would break, rather than every playtest on the platform.
Saving time on every code change with AWS Device Farm
Discovering AWS Device Farm was a game changer for us. It enabled us to automate most of what we previously did in our manual testing process. We now have a fully automated testing workflow that injects the PlaytestCloud Wrapper into a game build and installs, launches, and tests the game for all new code versions on a variety of real iOS and Android devices hosted in AWS Device Farm. We use Appium to navigate through the PlaytestCloud Wrapper screens and start screen recording on each device. After recording for about a minute, we assert that a video was successfully recorded and uploaded to our backend.
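A simplified version of such a test might look like the sketch below, assuming the Appium Python client. The pre-roll step names, device name, app path, and backend URL are placeholders, not our actual test code; in practice, AWS Device Farm hosts the Appium server and supplies the device.

```python
# Sketch of an automated recording test, assuming the Appium Python
# client (pip install Appium-Python-Client). All identifiers, paths,
# and URLs are placeholders for illustration.

import time
import urllib.request

# Hypothetical pre-roll screens the test clicks through, in order.
PREROLL_STEPS = ["Welcome", "Consent", "Start Recording"]
RECORDING_SECONDS = 60  # record for about a minute, as described above

def make_driver():
    """Create an Appium session (Device Farm supplies the real server)."""
    from appium import webdriver
    from appium.options.ios import XCUITestOptions

    options = XCUITestOptions()
    options.device_name = "iPhone (provided by Device Farm)"
    options.app = "/path/to/wrapped-game.ipa"  # build with Wrapper injected
    # Localhost is a placeholder; Device Farm hosts the Appium endpoint.
    return webdriver.Remote("http://127.0.0.1:4723", options=options)

def run_recording_test(driver, build_id):
    """Navigate the Wrapper pre-roll, record, then assert the upload."""
    from appium.webdriver.common.appiumby import AppiumBy

    for step in PREROLL_STEPS:
        driver.find_element(AppiumBy.ACCESSIBILITY_ID, step).click()

    time.sleep(RECORDING_SECONDS)  # let gameplay be recorded

    # Placeholder backend check: poll an API for the uploaded video.
    url = f"https://backend.example.com/builds/{build_id}/videos"
    with urllib.request.urlopen(url) as resp:
        assert resp.status == 200, "no video uploaded for this build"
```

Running the same script against a pool of real iOS and Android devices is what turns one test definition into coverage across many hardware configurations.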
As a bootstrapped company, we must make the most of our limited developer hours. Using AWS Device Farm saves us approximately 1.5 hours of manual testing time on each release of the recording library. With up to 25 releases per month across iOS and Android, that’s up to almost a week of developer time per month. It also makes us much more confident when making changes. We run these tests not only on each release, but on every code commit, which would have been infeasible before.
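The arithmetic behind that estimate, assuming an 8-hour workday:

```python
# Back-of-the-envelope check of the time savings quoted above.
hours_per_release = 1.5
releases_per_month = 25  # upper bound across iOS and Android

hours_saved = hours_per_release * releases_per_month  # 37.5 hours
workdays_saved = hours_saved / 8                      # ~4.7 workdays
```

At roughly 4.7 eight-hour workdays, that is indeed almost a full working week per month.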
This article was originally published in The Amazon Web Services Game Tech Blog.