From Vanya Tucherov, former reporter and Tableau QA Engineer extraordinaire:
145 million people across the US pay to play video games each year, and $21.6 billion is spent worldwide annually on games across many platforms and genres. With numbers like these, there is sure to be a huge volume of data available to investigate.
More importantly, successfully bringing a new product into a market this large takes a lot of analysis: a studio needs to understand what players want, what motivates them, and what can drive a player away from a game. The data is, at least figuratively if not literally, all over the floor, so making sense of it can be a significant challenge.
Almost as much of a challenge as building and releasing a successful massively multiplayer online game that generates four terabytes of data a day: some 60 billion events carrying 360 data elements, some of it desperately needing context to be of any significant value.
The way Craig Fryar at BioWare went about doing this when working on Star Wars: The Old Republic fell into a series of stages:
- Prototype to rapidly iterate on the data which was available
- Validate to make sure the data is as good as possible
- Automate to make things sustainable. At that volume of data, it's just not possible to have a person do this by hand, but with incremental refresh and on-server scheduling, all the new data can be collected and imported during the low-traffic times overnight.
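The "automate" stage rests on a simple incremental-refresh pattern: each scheduled run loads only the events newer than a stored watermark, so overnight jobs never re-import the whole dataset. A minimal sketch of that idea, assuming a timestamped event list (the field names and sample events are illustrative, not BioWare's actual schema or pipeline):

```python
from datetime import datetime

# Hypothetical event records; "ts" and "type" are illustrative field names.
events = [
    {"ts": datetime(2013, 1, 1, 2, 0), "type": "login"},
    {"ts": datetime(2013, 1, 1, 23, 30), "type": "quest_complete"},
    {"ts": datetime(2013, 1, 2, 1, 15), "type": "logout"},
]

def incremental_refresh(events, last_loaded):
    """Return only events newer than the watermark, plus the advanced watermark."""
    fresh = [e for e in events if e["ts"] > last_loaded]
    # If nothing new arrived, keep the old watermark unchanged.
    new_watermark = max((e["ts"] for e in fresh), default=last_loaded)
    return fresh, new_watermark

# First scheduled run: an old watermark loads everything.
fresh, watermark = incremental_refresh(events, datetime(2012, 12, 31))
print(len(fresh))  # all 3 events are new

# Next run with the advanced watermark: nothing is re-imported.
fresh, watermark = incremental_refresh(events, watermark)
print(len(fresh))  # 0
```

An on-server scheduler then only has to invoke this job during low-traffic hours; the watermark guarantees each run touches just the day's new events.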
As a result, BioWare's analysts have been able to flip the 80/20 model: they now spend their time analyzing the data and helping to make decisions from it rather than fighting to get the data into shape.