[Article] Raph (Theory of Fun) Koster - The Cost of Games

Link → The cost of games – Raph's Website

This article charts the rising cost of game development (see the graph in the linked article).

It then goes into the declining price paid by players per MB of installed game versus the development cost per MB.

So the content size of games is going up as resolution, processing and graphical power have increased.

However, production costs per MB have levelled out, suggesting that game content production has hit productivity limits. For example, game engines reduce the programming load but create a market where more art/asset/content creation is needed to compete.

Raph then offers three potential forecasts:

  1. In 10 years, free games become the norm.
  2. Super-high-priced AAA games with astronomical development budgets.
  3. Systemic or procedural games reduce development costs.

Now, the author states his data set is limited and his predictions could be off, but the data does show trends that most of you are probably aware of.

The question is: how can game and engine developers break the cost-per-MB trend whilst maintaining good income from players who expect more for less?

PS My thoughts on the issues:

  • Moore’s Law hardware limits could crimp or reduce the upward demand for more content, e.g. by limiting processing/graphical power growth.
  • AI helpers and/or procedural automated content generation could boost content production and reduce costs.
  • Someone develops the “Oasis”: a super-sized MMO sandbox that players and developers can modify to make and play the games they want.

For good insight into the industry, try Raph Koster’s Industry Lifecycles (Casual Connect 2008) presentation slides.

I don’t think most indies have to worry about it…

4 Likes

I worked with Raph on a couple of games. His insights are always an interesting read, but I wouldn’t worry about it too much; they are high-level observations from a particular point of view, not really actionable.

5 Likes

Really not sure how useful cost per megabyte is as a metric. Especially with number 3: why would systems-driven games be the outcome of having high-res lightmaps and uncompressed audio?

4 Likes

Are there any indie/casual game data sets that prove your points, e.g. indie/casual/mobile games that are not growing in size and production cost while decreasing in price?

Lightmaps are part of the problem: the more assets/content you need to create for a current-generation game, the more time/money it will cost to produce.

To reduce this production cost you could move to a more procedural, systemic way of generating your games, e.g. XCOM 2’s procedural levels assembled from hand-authored chunks, or No Man’s Sky’s procedural universe.
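Here’s a toy sketch in Python of the chunk-assembly idea (the chunk names, tags, and fill strategy are invented for illustration; this is not how Firaxis or Hello Games actually do it):

```python
import random

# Each chunk is a hand-authored piece of level; tags constrain where it
# can go so that neighbouring chunks make sense together. All names and
# rules here are made up.
CHUNKS = {
    "warehouse":    {"urban"},
    "office_block": {"urban"},
    "parking_lot":  {"urban", "open"},
    "park":         {"open"},
}

def generate_level(width, height, theme, seed=None):
    """Fill a width x height grid with chunks that carry the theme tag."""
    rng = random.Random(seed)  # seeded -> the same level is reproducible
    candidates = [name for name, tags in CHUNKS.items() if theme in tags]
    return [[rng.choice(candidates) for _ in range(width)]
            for _ in range(height)]

for row in generate_level(4, 2, "urban", seed=42):
    print(row)
```

The hand-made chunks keep art quality up while the combinatorics provide the variety, which is where the cost saving comes from.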

The thing is, modern game engines do not have procedural content-generation toolkits built in. The first game engine to ship an easy-to-use, configurable procedural content toolkit will probably gain an advantage in the market.

However, built-in procedural content works against the asset-store business models of modern free game engines.

What would happen if No Man’s Sky turned their game into a game engine?

@Arowx

No doubt the biggest challenge at the moment is content development, and whilst game engines have become epically powerful, the art content pipeline hasn’t evolved much in general.

That’s not to say there aren’t answers, though, e.g. Houdini, which is being adopted by AAA studios and indies alike to try to offset some of the burden… I can tell you now, even from initial investigation, it will save me weeks if not more in pipe creation alone.

I’ve never believed 3D modelling was purely artistic; there’s a fair bit of technical skill involved, and with new tools the paradigms are shifting… It’s a necessity, but things are getting very complicated and somewhat out of control at this stage, especially when you look towards the top end.

A lot of indies, and even bigger outfits unwilling to trade blows with massive-budget endeavours, have pulled their scope in… which is a good thing, because the focus is now on fun and gameplay. I played Divinity: Original Sin 2 and it was a really good game, even though by modern standards it’s relatively small. Do it right and there are tons of hours of decent gameplay without grandiose universes.

Whilst procedural creation has become something of a necessity in higher-tier games (even from lone wolves), we know that when overused it results in games like NMS, which isn’t desired, or in projects stuck in development hell like Star Citizen…

At the end of the day, procedural methodologies are just tools… They can aid you, but the new focus on quality over quantity is a good thing; tons of “stuff” and scale for its own sake seem great in theory, but in practice it’s another matter.

What points? I made no claims.

1 Like

The number of assets isn’t what is being measured, though, only the size of those assets. Going from 2k to 4k to 8k textures is an exponential increase in file size, but if all of those files are being generated by Substance Designer, then there is no significant increase in labor. It’s the same story for lightmaps and audio. How much data they actually take up is entirely arbitrary. Audio in particular has always been compressed down to whatever size will fit on the disc.
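To put rough numbers on that (back-of-envelope, uncompressed 8-bit RGBA with no mipmaps; shipping games use block compression, but the quadrupling per resolution step is the same):

```python
# Each doubling of texture resolution quadruples the byte count, with no
# extra authoring work if the maps come out of a procedural tool.
BYTES_PER_PIXEL = 4  # 8-bit RGBA, no mipmaps

for res in (2048, 4096, 8192):
    size_mb = res * res * BYTES_PER_PIXEL / 1024**2
    print(f"{res} x {res}: {size_mb:6.0f} MB")

# 2048 x 2048:     16 MB
# 4096 x 4096:     64 MB
# 8192 x 8192:    256 MB
```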

4 Likes

I’m not sure I necessarily agree with (all) the forecasts. I think the smart AAAs (aka almost all of them) are going to find ways to cut costs. You can see EA trying to do that by implementing Frostbite across all their games. Additionally, Ubisoft uses procedural generation quite a bit in their development (they used it for AC Unity, I know, to build the city blocks, and anyone who’s seen the video for Beyond Good and Evil 2 knows they have to be using it there (though I don’t recall Ancel’s statements in that video)).

Along similar lines to that second point, an increase in proceduralism can cut costs. My mind immediately goes to procedural textures, which I know little about, but which seem to have the potential to help with this content creation problem (see the sketch below).

(after reading the actual article, he apparently addresses this. Good on him)
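As I understand it, the core idea is just that pixels come from a formula instead of hand-painting, so resolution becomes a parameter rather than extra labour. A minimal sketch (the pattern here is arbitrary, nothing like what real tools such as Substance Designer produce):

```python
# Smallest possible "procedural texture": a checker-plus-gradient
# pattern written out as an ASCII PGM image, no dependencies needed.
def texture(res):
    cell = max(res // 8, 1)
    pixels = []
    for y in range(res):
        row = []
        for x in range(res):
            checker = ((x // cell + y // cell) % 2) * 128   # 0 or 128
            gradient = (x * 127) // max(res - 1, 1)         # 0..127
            row.append(checker + gradient)                  # 0..255
        pixels.append(row)
    return pixels

def write_pgm(path, pixels):
    with open(path, "w") as f:
        f.write(f"P2\n{len(pixels[0])} {len(pixels)}\n255\n")
        for row in pixels:
            f.write(" ".join(map(str, row)) + "\n")

# The same code emits a 256px or an 8192px texture; only compute time
# and file size change, not the authoring effort.
write_pgm("tex_256.pgm", texture(256))
```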

Is he really predicting 1 TB games in 2020? That’s pretty ridiculous.

Some other things I see from reading the article:

He’s including free-to-play games with regular games. That seems like a mistake to me, mainly because such games are designed differently. This is just my opinion; I have no strong logical point to back it up.

He claims we’re close to market saturation, and that “we reached 50% of people are gamers long ago.” Well, not really. Not really at all, actually, unless by “gamers” you mean mobile gamers. And that’s once again the issue from my previous paragraph: conflating totally different types of game development as one.

As “proof,” let’s consider the sales of the breakout “hardcore” game of 2017, PUBG (and regardless of whether or not one thinks it’s “easy” or “based on luck,” a first-person shooter designed around competition between 100 people is absolutely a “hardcore” game). We’ll go with this (the most recent data I saw at a glance), which says the game has sold 30 million. Let me first stop and say: wow, that’s an incredible number.

Anyway, taking a look at this (it’s a question on Quora, but the answers link to legitimate sources including the UN) gives us a few estimates on how many people live in developed countries. I’ll ignore the fact that the market for games will get larger as underdeveloped countries get developed, and focus on the UN data which claimed in 2010 that 1 billion people live in developed countries.

30 million. 1 billion. 3%. Where in the world is he getting 50% from anyway? He doesn’t source that claim, which brings the validity of that and other claims into question, but I found this, which looks weak in some areas. As one example, for any landline answers they specifically asked to speak to the youngest person in the house, which seems incredibly sketchy to me. This understandably skews their data towards younger people, who according to their results play games in larger percentages. The 18-29 age group is 67%, 30-49 = 58%, 50-64 = 40%, and 66+ = 40% (how in the world did they come up with those group sizes?).

Along similar lines, the video games with the highest budgets are also “hardcore” games like GTA V, Destiny, TOR, DICE games, COD games, etc. The “hardcore” market is nowhere near saturated. That’s a completely ridiculous assumption.

He makes a good point about marketing costs increasing when trying to promote outside of your core audience. However, a few things like the decreasing “stigma” of games (which is still going on for hardcore games) mean that the core audience is naturally expanding, so that’s a mixed bag.

One of the most concerning things is where he gets his data on game costs, which I don’t see stated anywhere. I hope he’s not using VGChartz, which is well known for being inaccurate. If I don’t know where his costs are from, can I believe him?

Along those same lines, I see his first plot has a 360-million-dollar game from 2014. The problem is, there are no 360-million-dollar games. He’s probably talking about Destiny, which we all know was rumored to have a 500-million-dollar budget. However, if you look at the Wikipedia page for Destiny, there are comments from both devs and publisher saying that it was nowhere near 500. In fact, the 500 includes multiple things unrelated to the cost of the game itself, which could be spread out over the entire IP. According to the actual contract for the game (I didn’t read through it; I’m trusting Wikipedia, which links it) the cost was 140.

The most expensive game listed in this guy’s data is completely wrong. Why should I believe him?

On game sizes: this is also a very questionable claim, given what we know about things like compression and high-res textures. That extra “byte” size has always been part of game development, but in times past it was curtailed at the cost of performance; more recently it has been pushed to the forefront, allowing higher quality at the cost of game size. One thing that immediately comes to mind is audio compression: the audio itself (and the work to make it) is unchanged, but with less compression the size goes up.
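Rough numbers for one hour of stereo audio (round-number bitrates, not measurements from any particular game):

```python
# The same recorded audio at different compression settings: the work
# to record and mix it is identical, only the shipped size changes.
seconds = 60 * 60  # one hour of game audio

formats = {
    "16-bit 44.1 kHz stereo PCM": 44_100 * 2 * 2,  # bytes per second
    "Vorbis ~160 kbps":           160_000 / 8,
    "Vorbis ~96 kbps":            96_000 / 8,
}

for name, bytes_per_sec in formats.items():
    print(f"{name}: {bytes_per_sec * seconds / 1024**2:6.0f} MB")

# PCM lands around 600 MB; the compressed versions are an order of
# magnitude smaller for exactly the same authored content.
```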

Let’s take a look at a few AAA series over time.

AC Black Flag - 29 GB
AC Unity - 41 GB
AC Syndicate - 40 GB
AC Origins - 43 GB

As one can see, the install size seems to be leveling off.

COD Advanced Warfare - 45 GB
COD Black Ops 3 - 45 GB
COD Infinite Warfare ~ 54 GB
COD WW2 - 45 GB

So once again, there’s none of the crazy escalation implied by his data. The only company I can think of making truly crazy, ever-larger games is Microsoft, and I have no idea what’s going on with them.

Another thing is his trendlines, which clearly do not represent the data properly in a few cases (the main one being bytes). A linear regression is obviously not accurate there, so I have no idea why he’s using one.
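A quick sketch of the failure mode (the sizes below are invented to double every two years, purely to show why the choice of fit matters):

```python
import numpy as np

years = np.arange(2002, 2018, 2, dtype=float)
gb = 2.0 ** ((years - 2002) / 2)  # 1, 2, 4, ... 128: doubles every 2 years

a, b = np.polyfit(years, gb, 1)          # straight line through the sizes
c, d = np.polyfit(years, np.log(gb), 1)  # straight line through log(size)

target = 2020.0
print(f"linear fit:     {a * target + b:7.1f} GB in 2020")
print(f"log-linear fit: {np.exp(c * target + d):7.1f} GB in 2020")

# The linear fit hugs the early, flat part of the curve and badly
# under-predicts (the true value here is 512 GB); fitting log(size)
# recovers the doubling. Using the wrong model makes any extrapolation
# from it suspect.
```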

I could go on, but I’m getting tired of writing this. Basically, this guy doesn’t source his data, his data is wrong in some cases, and the conclusions are based on simplistic statistical analysis. After going through it more deeply I’m less confident (aka not confident at all) in his statements.

3 Likes

Take anything Koster posts on his blog with a grain of salt.

His claim to fame was Star Wars Galaxies: An Empire Divided; before that game he was a relative unknown in the industry. Kind of like how we know who Sean Murray is now because of No Man’s Sky, and will continue to hear about him for a few more years to come, even though he too was an unknown face in the industry before that game was announced and showcased to the public.

1 Like

Indeed. He was pretty well known at that time, and he was our team’s CD for a couple of years right after that. He’s a nice guy, and knowledgeable, but like a lot of designers who get known for a hit, he’s had a hard time repeating that success. Mostly now he does articles and speaking about observations of the industry. While interesting, they don’t really provide any practical information for developers, or much that isn’t already known. Fun to read sometimes for sure, but really just editorial commentary.

2 Likes

Battlefield
1942 (2002) 1.2 GB
2142 (2006) 2.2 GB
Bad Company 2 (2010) 15 GB
3 (2012) 20 GB
4 (2013) 30 GB
Hardline (2015) 60 GB
1 (2016) 50 GB

  • Note: this is probably just the initial install size, not the additional space needed for DLC.
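For scale, a quick compound-growth calculation on those numbers (a rough sketch assuming smooth compounding, not a forecast):

```python
import math

start_year, start_gb = 2002, 1.2   # Battlefield 1942
end_year, end_gb = 2016, 50.0      # Battlefield 1

# Compound annual growth rate of the install size over 14 years.
cagr = (end_gb / start_gb) ** (1 / (end_year - start_year)) - 1
print(f"compound growth: {cagr:.1%} per year")               # ~30%/year

# At that rate, when does a 1 TB install arrive?
years_to_tb = math.log(1024 / end_gb) / math.log(1 + cagr)
print(f"1 TB reached around {end_year + years_to_tb:.0f}")   # ~2027
```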

And as technology improves (e.g. display resolution, GPU, CPU, and SSD/HDD capacity), the industry will push to take advantage of those capabilities, which inherently pushes up install sizes.

I would expect that there is an even more pronounced install size change in the mobile space…

iPhone (Year) RAM Storage
1 (2007) 128 MB 4-16 GB
3GS (2009) 256 MB 8-32 GB
4 (2010) 512 MB 8-64 GB
5 (2012) 1 GB 8-64 GB
6 (2014) 1-2 GB 16-128 GB
7 (2016) 2-3 GB 32-256 GB
8 (2017) 2-3 GB 64-256 GB
X (2017) 3 GB 64-256 GB

Or is your argument based on price or development costs?

  1. You’re comparing games across an enormous time range, and across multiple console “generations.” Stick with one. I went with the most recent, which sees games leveling out around 50 GB. Your data supports this.

I agree that size will go up over time. However, there’s nothing to support this statement:

“We’re talking one terabyte games that cost $250m to develop, by the early 2020s.”

That’s utter nonsense.

  2. Why are you posting the size of cell phone storage? Both of our examples are based on console/PC games. For both consoles and PC, storage space is significantly larger than our “max” of 50 GB.

My argument is that most of the data doesn’t support what he’s trying to say.

So game sizes increase due to technology, but the cost of developing more content will not? What technology offsets the larger workload for developers/artists/modellers?

More data on the price of games over time, adjusted for inflation, would be ideal. Surely the industry has this data?
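As a sketch of the adjustment I mean (the CPI values below are approximate US CPI-U annual averages; exact figures should be pulled from the BLS before drawing conclusions):

```python
# Normalise historical sticker prices to 2017 dollars via CPI.
CPI = {1990: 130.7, 2000: 172.2, 2010: 218.1, 2017: 245.1}

def adjust(price, year, target=2017):
    return price * CPI[target] / CPI[year]

for year, price in [(1990, 49.99), (2000, 49.99), (2010, 59.99), (2017, 59.99)]:
    print(f"${price} in {year} is about ${adjust(price, year):.2f} in 2017 dollars")

# The nominal $50-60 price point has held for decades, so the real,
# inflation-adjusted price per boxed game has fallen substantially.
```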

No one’s saying the cost of developing the content flat-out does not increase as game size increases. But it’s not the linear correlation implied by the graphs. Murgilod and I both gave clear examples of situations where the development time or “cost” is the same yet can encompass a very broad range of final sizes: textures and audio.

I’m sure there’s data on the price. Not sure where it is, though. And keep the growth of the market in mind as well, which is probably a very nebulous thing to try to define, so it would be difficult to say whether or not the fiscal return on games (absent post-release monetization like MTX) has increased or decreased.

Everyone is an unknown until they do something that gets lots of attention. That’s how fame works. I don’t see how it’s evidence against anyone.

I definitely agree with taking this stuff with a grain of salt, but that’s because it’s a prediction of the future, not because of how the fellow making it became known.

1 Like

This. Tech and shifting markets are prone to disruption. Trends change constantly.

But more importantly, this is all just high-level observational stuff. Some of it is just common sense with a pretty graph. But let’s say these observations are 100% accurate and come to fruition. So… what? How does it impact your development today? Knowledge of broad trends (that aren’t market related) won’t make a shitty game good, or a great game fail. It’s not anything that actual studios can have an impact on (or want to). It is completely irrelevant to hobbyists. And if there is some developer out there who is going to be the next big game changer, they aren’t going to be making choices based on theoretical trends; they are going to be breaking the mold via creativity and design, not memory footprint.

——

So, please, let’s not have any arguments or heated debates over one person’s speculation about a future that doesn’t impact, and can’t be affected by, the games we are working on today. It’s interesting information, but nothing productive to argue about.

3 Likes

Plus, I suspect that at least some of it is coincidental rather than either a cause or an effect anyway.

For instance, the thing about dollars per megabyte is surely related to decreases in hard drive manufacturing costs over and above anything else.

1 Like

So it’s nothing to do with the rising power of the GPU to display the data?