Competitor analysis playtesting is a user research methodology that works wonderfully well for determining best practices for player experience in games. It is very flexible and can be used at different points in the development cycle: you can compare your game with one (or more) of its competitors, or alternatively, a competitor analysis can be carried out between multiple competitor titles.
Even beyond this basic distinction, there are lots of different ways to approach a competitor analysis. These range from largely informal exercises to much more structured studies involving real players.
Competitor analysis can be useful at any point during the development cycle, but it has standout benefits at four stages. The first is discovery and pre-design, where competitor analysis can help shape your initial concepts and design work. The second is prototyping, where your early vertical slices can be compared against other titles (taking into account the much more polished and complete states of those games, of course!). The third is mid-lifecycle, when you are looking to refresh a game's onboarding, or carry out a soft relaunch of a multiplayer/GAAS title. Finally, competitor analysis is great for post-mortems, when your aim is to look at what went well (and not so well) in preparation for your next project.
Informal Competitor Analyses
Informal competitor analyses are the type of exercise you might decide to carry out between a few members of your team: for example, playing through competitor games and discussing them in group sessions or workshops. These can be completely unstructured, or you can use tools to bring in a little structure and rigour, such as shared note-taking and pre-agreed assessment criteria and frameworks. One way to bring in an external perspective here would be to commission a set of sample reviews, which emulate the type of review you might receive from a commercial, consumer-facing publication or website. But my view here is that there are other methodologies that will deliver greater impact if you're willing to carry out some independent research.
Competitor analysis can also be carried out by a UX expert, whose task will be to break down the games into relevant points of comparison, apply heuristics and/or principles derived from psychology and human-computer interaction, and pull out good and bad practice from the titles reviewed. The goal here is not to say which of these games is 'better' or likely to perform well from an audience-facing perspective, but to highlight different ways of approaching aspects of game design in terms of the player experience. The aim is to ensure that you are adhering to (or surpassing!) best practice among the games analysed.
User Experience Studies
The above methodologies don't actually involve players, but competitor analyses are of course enhanced by getting the player perspective. A User Experience (UX) competitor analysis playtest is structured like the expert analysis outlined above, with similar goals, but rather than relying on heuristics and principles, the researcher bases their analysis on real player data. This data should be primarily qualitative for a UX analysis (video, audio and interviews) but can be supported by survey data.
What if you want an evaluative component to your competitor analysis: which do players find more fun, or frustrating, or boring? You could run a competitor playtest with more players, and the primary outcomes would be survey measures. But as you might have spotted, there's overlap between the UX-focused and evaluative approaches outlined above. You can also plan out dedicated hybrid studies, analysing qualitative data for a sub-set of players to examine UX issues, but also implementing quantitative surveys. This approach has the benefit of showing best practice across competitor games, but also looking at more standardised points of comparison across those games as well.
For example, a UX-focused competitor study might look like this:

- Goal: Onboarding comparison
- Setup: 30-minute single-session playtest per game
- Audience: 5 players who play similar games to the one tested
- Preparation: Play onboarding section of each game multiple times, noting potential challenges and the structure of each.
- Analysis procedure: Allow at least as much time for analysis as player video time that you want to analyse. Watch all videos while taking detailed notes about UX issues that players experience (if possible, categorise these while taking them to speed up interpretation: e.g. first load, onboarding, store, menus, etc) and write up issues across the lines of these categories.
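To make that categorisation step concrete, here is a minimal Python sketch of the tallying stage, assuming notes have already been transcribed while watching the session videos. The categories and issues below are purely illustrative, not real study data:

```python
from collections import Counter

# Hypothetical session notes: (category, issue) pairs taken while
# watching player videos, using the pre-agreed categories from the
# analysis plan (first load, onboarding, store, menus, etc.).
notes = [
    ("onboarding", "Player missed the jump tutorial prompt"),
    ("onboarding", "Player skipped dialogue explaining crafting"),
    ("menus", "Player could not find the settings screen"),
    ("first load", "Player confused by account-linking prompt"),
    ("menus", "Player tapped a non-interactive icon repeatedly"),
]

# Tally issues per category so the write-up can prioritise the
# areas where players struggled most.
issue_counts = Counter(category for category, _ in notes)

for category, count in issue_counts.most_common():
    print(f"{category}: {count} issue(s)")
```

Even a simple tally like this makes it easy to see which areas of each game generated the most friction, and to compare those counts across competitor titles.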
An evaluative study, by contrast, might be structured like this:

- Goal: Examine Day 7 retention
- Setup: 7-day longitudinal playtest with 15 minutes of gameplay and a survey per day and per game
- Audience: 15 players per game
- Preparation: Play each game for long enough that you understand the core mechanics and metagame, write and test player surveys that reflect common game structure.
- Analysis procedure: Carry out descriptive analysis comparing the three games, including qualitative analysis of free-text responses, and potentially inferential statistical analysis to establish reliable quantitative differences between them (e.g. to look at which scores higher on a given measure).
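As a sketch of that inferential step, the snippet below computes Welch's t statistic for two hypothetical groups of Day 7 "fun" ratings, using only the Python standard library. In practice you would more likely reach for a statistics package (for example `scipy.stats.ttest_ind`, which also gives you a p-value); all scores and group names here are made up for illustration:

```python
from statistics import mean, stdev

# Hypothetical Day 7 "fun" ratings (1-7 scale) from 15 players per game.
game_a = [6, 5, 7, 6, 5, 6, 7, 4, 6, 5, 6, 7, 5, 6, 6]
game_b = [4, 5, 3, 5, 4, 6, 4, 3, 5, 4, 5, 4, 4, 5, 3]

def welch_t(x, y):
    """Welch's t statistic for two independent samples,
    allowing for unequal variances."""
    var_x, var_y = stdev(x) ** 2, stdev(y) ** 2
    return (mean(x) - mean(y)) / ((var_x / len(x) + var_y / len(y)) ** 0.5)

# Descriptive comparison first, then the test statistic.
print(f"Game A mean: {mean(game_a):.2f}, Game B mean: {mean(game_b):.2f}")
print(f"Welch's t: {welch_t(game_a, game_b):.2f}")
```

With only 15 players per game, treat any such result as directional rather than conclusive; the quantitative comparison is there to support the qualitative findings, not replace them.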
On that note: what about benchmarking our games? In practice, benchmarking means different things to different people: it can be any of the methods outlined above, or something in between them. But I think a better, more specific definition is that benchmarking means applying standardised measures to our games (and those of competitors) to establish baseline markers: where we are now, and where we want to get to, in terms of how players experience our games. There are challenges to benchmarking that go well beyond picking a methodology: these are mostly based around the nuances of designing survey questions, and around establishing a set of measures that applies to more than one game. For example, the measures mentioned above - fun, frustrating, boring - are fairly generalisable, but not particularly informative in terms of telling us what to do to make a game better. They can help to give a high-level picture of how our game is received, but do not necessarily tell us why players experience it the way that they do. When looking at benchmarking in this way, you can either use frameworks that describe an assessment approach or pre-baked measurement tools.
Evaluative or appreciation-focused competitor analysis, with a greater number of players and focus on quantitative responses, can help you to understand whether your competitor games are hitting the targets that you also want your game to hit (or exceed), and can give you a useful steer on choosing between alternative solutions to game design challenges. Quantitative, appreciation-focused competitor analysis combined with qualitative analysis can be highly effective at eliciting both direct comparisons and rich data derived from player behaviour.
Benefits of Competitor Analysis
What are the main benefits of carrying out a competitor analysis? A UX/player-focused competitor analysis will help to ensure that your game at least equals (or potentially exceeds) your competitors' best practice in terms of main beats. Maybe one of your competitors has a great onboarding section, but a weak transition into teaching players about the metagame, while another title does the latter very well. One game might have text that is hard to read, with no relevant settings available to tweak its presentation, while another offers readable text with a variety of accessibility options. In looking at what works in these titles, you would start off with a set of principles for great onboarding, a successful transition to the metagame, readable text, and a useful set of relevant accessibility settings. This should mean less iterating on these and other UX issues, leaving you more time to work on the game's core experiences.
PlaytestCloud is a great resource to not only understand what your competitors are providing to players, but to also find your niche in your specific genre. We go into great detail on all of these points in the webinar linked below. If you are interested in running a competitor analysis with PlaytestCloud, please reach out and we can help you get started.
e.g. the work of Hochleitner, Hochleitner and Haller (https://www.researchgate.net/publication/226475861_Using_Heuristics_to_Evaluate_the_Overall_User_Experience_of_Video_Games_and_Advanced_Interaction_Games) ↩︎
Beats are a term for the main points of action or interaction in your game, or your game's story. A common beat might be the player's first interaction with an enemy, or receiving rewards for the first time. ↩︎