Ex-NVIDIA Driver Developer on Why Every Triple-A Game Ships Broken


http://www.dsogaming.com/news/ex-nv...-every-triple-a-games-ship-broken-multi-gpus/

“Nearly every game ships broken. We’re talking major AAA titles from vendors who are everyday names in the industry. In some cases, we’re talking about blatant violations of API rules – one D3D9 game never even called BeginFrame/EndFrame. Some are mistakes or oversights – one shipped bad shaders that heavily impacted performance on NV drivers.
These things were day to day occurrences that went into a bug tracker. Then somebody would go in, find out what the game screwed up, and patch the driver to deal with it. There are lots of optional patches already in the driver that are simply toggled on or off as per-game settings, and then hacks that are more specific to games – up to and including total replacement of the shipping shaders with custom versions by the driver team.
Ever wondered why nearly every major game release is accompanied by a matching driver release from AMD and/or NVIDIA? There you go.”

This is something I have been saying for years. And DX12 or Mantle will not change it.
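The per-game driver patches the quote describes can be pictured as a lookup from executable name to a set of workaround toggles. This is a toy sketch only; every name and flag below is invented for illustration, and real driver profiles are proprietary and far more involved.

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

// Toy model of per-game driver profiles: the driver matches the running
// executable against a table of workarounds and toggles optional patches
// on or off. All identifiers here are hypothetical.
enum Workaround : unsigned {
    kNone               = 0,
    kTolerateMissingEnd = 1 << 0,  // app violates API begin/end pairing rules
    kReplaceShaders     = 1 << 1,  // swap shipped shaders for hand-tuned ones
    kClampAnisotropy    = 1 << 2,  // app requests out-of-range filter settings
};

unsigned workarounds_for(const std::string& exe_name) {
    static const std::unordered_map<std::string, unsigned> profiles = {
        {"brokengame.exe",  kTolerateMissingEnd},
        {"slowshaders.exe", kReplaceShaders | kClampAnisotropy},
    };
    auto it = profiles.find(exe_name);
    return it == profiles.end() ? kNone : it->second;
}
```

The point of the sketch is only the shape of the mechanism: a pile of app-specific hacks keyed by game, exactly as the quote describes.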


Edit:
If this were one more thread about how DX12 or Mantle will be a revolution and all games will work better with them, by now this thread would be packed with posts from "experts".

But when we talk about how things are, about reality, people have nothing to say.

I'm not surprised at all. I was expecting this.
 

That's because there usually is little to say about facts, and much to say about rumors. It's a matter of opinion, everybody has one. What else did you expect from news like this (which aren't really news as you yourself said)? All we can do is nod and say "we knew it".

If you want to expand the thread with a bit of speculation, I'm gonna say the gaming industry is plagued with poorly educated programmers, designers and especially managers. They think they know what they're doing, and they think they can do it in a certain amount of time, but reality shows otherwise. I don't mean to say all game programmers must have a formal university education, but at least they should be a step above "I taught myself (Visual) C++, now I will develop and ship this game".
 
The people who know how things work (or at least have a basic understanding and awareness) don't talk about DX12 like it will change everything.
This "we knew it" makes no sense. All I see is hype.

And even good designers or programmers need time to create something good. This is a big problem: devs must make a complex game in 1-2 years, and that leads to mediocrity.
The biggest problem is that people buy these games like junk food. "Here, take my money!"
What do you get from this?
 
The lack of demand for quality and consumer behaviour is one problem, yes. But that has been discussed at length in other threads. Besides that though, there are lots of other problems:

-An (over)saturated market that requires you to either a) find a niche (as Paradox often does) or b) serve a wide audience and pump up high marketing budgets to sell enough copies.

-Exploding budgets that require you to sell lots of copies to make it worthwhile (at least if you're aiming for a state-of-the-art blockbuster game; but even costs for small indie projects tend to be quite high). Controlling these leads to low wages, insecure jobs, tight schedules and all manner of undesirable stuff.

-A young industry. Universities have only recently picked up on game development. Regular computer scientists with a university degree are likely to look for different, much better-paying jobs (as game industry jobs tend to be severely underpaid, for the above reasons). Experience and proven workflows are lacking.

-An ever-changing industry. Game development regularly goes through paradigm shifts and severe changes of workflow: 2D to 3D, game-specific engine to reusable engine, code-driven to data-driven, to name but a few. Keeping up with industry standards can be challenging. This is something university education might actually help with: teaching young game programmers to think abstractly and in concepts, instead of learning current tech by rote.

-Game programming is hard. Games are (ever more) complex systems, consisting of dozens to hundreds of subsystems from very different backgrounds (graphics, audio, physics, networking...). Plus, games, being soft real-time simulations, have quite strict timing constraints - something a lot of business software does not have. Additionally, there's a ton of middleware you need to be able to adapt to and work with.

-Science and Art. Game development combines these in quite a unique way with lots of implications, not the least of which are a great need for iteration and testing to make things "fun", as there often isn't a "right" way.

Long story short: I'm not surprised that games are often lacking on a technical level. It's development of complex software - among the hardest you could choose - under tight timing constraints in an industry that tends to drive away qualified personnel with bad working conditions.
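The "strict timing constraints" point deserves a concrete illustration. A common pattern is the fixed-timestep loop, which advances the simulation in constant increments no matter how long rendering a frame actually took. A minimal sketch; the names and the 60 Hz step are illustrative choices, not a standard:

```cpp
#include <cassert>

// Fixed-timestep update: game logic runs in constant increments,
// decoupled from the (variable) real time each rendered frame takes.
struct Simulation {
    double time = 0.0;  // accumulated simulated time in seconds
    int steps = 0;
    void update(double dt) { time += dt; ++steps; }
};

// Consume `frame_dt` seconds of real time in fixed `step` increments;
// returns the leftover fraction to carry into the next frame.
double advance(Simulation& sim, double accumulator, double frame_dt,
               double step = 1.0 / 60.0) {
    accumulator += frame_dt;
    while (accumulator >= step) {
        sim.update(step);
        accumulator -= step;
    }
    return accumulator;
}
```

Most business software has no equivalent of this loop: if a game's update falls behind its budget, the player sees stutter immediately, which is what makes the constraint "soft real-time".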
 
The lack of demand for quality and consumer behavior is the biggest problem. This is what drives game development.

And f**k Universities.
If you really like games and making games, you don't need to go to a University where you will be distracted learning a bunch of nonsense.
And if you enter game development just to make money, then your game will suck. It will become just one more product. Your ability to express yourself artistically will be lost - you don't need it, actually. You will be like a robot who creates products for the masses.
 
The lack of demand for quality and consumer behavior is the biggest problem. This is what drives game development.
It partially drives AAA development on a business level, maybe. Even on high-profile projects, the devs themselves aren't in it to take people's money. Besides, it's still arguable whether there's any problem with many people enjoying their yearly Call Of Duty - there's no proof they're just buying it because they don't know better, or whatever other argument is thrown around so often - and developers providing accordingly. In and of itself that's valid. If you don't like it, just ignore it; there are plenty of other games out there. Or if you're convinced there's a problem, campaign against it: educate (casual) gamers to have them make more informed choices, or whatever you feel needs doing.
But that has very little to do with why there often are big technical deficits in game code. It's not like any dev sits there thinking "fuck it, people will swallow it anyway". Trust me, devs are the people who are most passionate about making it the best they can - they're often pouring several years of their lives into it, after all. It's a struggle of tight schedules, tight budgets and complex problems that need solving, which leads to hasty implementations and a lack of testing. These are issues that need solving - and I know too little to judge whether that means R&D to give devs better tools to accomplish more and higher quality in less time, reducing business overhead, budget reallocation (away from shiny graphics towards more important stuff) or making high-production-value games more expensive to cover higher development costs (don't forget that game prices have been largely stable for decades while development costs skyrocketed). Probably some of all of the above.
We're slowly getting towards more cost-efficient development with ready-made engines, data-driven development and such already. Smaller projects cut down graphics costs to make a better, if less shiny, game. New distribution channels and the rise of indies show that you can make do with less business overhead - but you're unlikely to sell millions of copies or even get any visibility for your game at all; for every Minecraft there are probably a hundred obscure but excellent indie titles.

And f**k Universities.
If you really like games and making games, you don't need to go to a University where you will be distracted learning a bunch of nonsense.
You're pointing out severe mistakes in how current game devs often code, but would deny that education might help?
It's true that if you're thorough and approach the issue professionally, you can teach yourself a lot. But it's also very easy to put together games that run, yet suffer from such severe mistakes. If you're self-taught and approach it with an "it's working, so it's fine" attitude, you'll make exactly the kinds of mistakes you pointed out.
Even if you're not enrolling in a university course, a look at a curriculum gives you valuable hints on what to study and in what order, which books are any good, and what other resources you have at your disposal. You get a solid foundation of theory that is often ignored when self-teaching, but is important for understanding things instead of merely replicating them, and for adapting to new concepts quickly. Plus, you have a schedule and are forced to learn the stuff that seems boring or useless but comes in handy at some point; your learning is supervised to some degree, and you can easily figure out where you're lacking. If the course is any good, you'll have plenty of projects that give you hands-on experience, both working on games and especially working on games in a team on a schedule. Plus, you'll have the chance to attend events, make first industry contacts, etc.
You can learn on your own, and if you're thorough and approach it professionally, something good will come of it. But you can also accept university as a tool to help you achieve that. So, no. Don't f**k Universities.

And if you enter game development just to make money, then your game will suck. It will become just one more product. Your ability to express yourself artistically will be lost - you don't need it, actually. You will be like a robot who creates products for the masses.
Right, because game developers love to live off of thin air and love. Just because you're passionate about it doesn't mean you like being screwed over by getting paid maybe half of what computer scientists make in other industries, for longer hours and often tougher challenges. Appropriate wages and no crunch time will most certainly not diminish your work's quality or kill off artistic expression. Besides, we're talking primarily about programmers here, who don't have much to do with artistic expression - not in a classic sense, anyway.
 

First, I'm a person who has played games from the beginning - an old gamer. Second, I work with 3ds Max, Photoshop, Mudbox... I create models and textures. An artist.
Maybe you have noticed in my other posts that I explain things from a technical point of view. Actually, I'm the only person here who does this.
So everything you write here is nothing new to me. But I disagree with your conclusions.

And again, even the best programmer needs time. This is obvious when you look at old teams - Bethesda, for example - who never optimize their games properly. These people aren't beginners.

Schools, Universities - centers for indoctrination.

Yes, you need money, but you don't need a lot of money.
And one wonderful thing about artists is that they love what they do. They don't think about money. These people want to paint, write, sculpt..., and that's it. They live in the moment. Every man should be an artist, at least as a hobby.
But when you start to work for the money, when you become greedy, your artistic expression is limited. This is the rule.

The reason devs are limited is that they have allowed it themselves. They have allowed a bunch of people (publishers) who don't give a fuck about games to control them.
Good old human behavior! Let's separate ourselves and then try to work together. This philosophy has never worked, and it will never work. The universe tells us this every day, every second...

But without the people, the masses, this wouldn't exist. They think this is OK, support it, buy it...
Indoctrination!
 
This is something I have been saying for years. And DX12 or Mantle will not change it.

On the contrary, SPIR-V used with Vulkan will change it. Instead of weird cheating and replacing shaders at runtime, there will be a standard, open compiler shared across all vendors. It will eliminate exactly the kind of problems described there.

Good overview: http://www.g-truc.net/post-0714.html

Of course, the question of using APIs properly will remain all the same. But there will be less mess in the lower parts of the stack.
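To make the "standard" part concrete: SPIR-V modules are a fully specified binary format, so any consumer can validate them mechanically instead of parsing vendor-specific source dialects. A tiny illustrative check (the function name is invented; the magic number 0x07230203 comes from the SPIR-V specification, where it is the first word of every module):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// First word of every valid SPIR-V module, per the SPIR-V specification.
constexpr uint32_t kSpirvMagic = 0x07230203u;

// Hypothetical helper: a cheap sanity check that a word stream could be
// a SPIR-V module (a real validator checks far more than the magic).
bool looks_like_spirv(const std::vector<uint32_t>& words) {
    return !words.empty() && words[0] == kSpirvMagic;
}
```

The point is that the format is vendor-neutral and machine-checkable, which is what removes a class of per-vendor shader-dialect mess from the lower parts of the stack.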
 
Moderator: Refrain from disparaging the qualifications of any other member, or your posts will have no place on this forum.
 
If artists without managerial skills were left to their own devices, without "money-hungry" publishers overseeing them, then not only would most of the stuff they release be complete rubbish, but almost no one would hear of their work or play it. Just look at Steam Greenlight.
 

Hm, apparently some disagree: http://flyingwildhog.com/about

We do not believe in corporate management and corporate ways. Our studio is managed without producers and we try to employ as many anti-corporate management ideas as possible. Such a flexible structure allows us to limit all the obsolete bureaucracy and hierarchy to minimum.

We do not believe in managers or leads who themselves don’t create real content of a game. People who are responsible for management in our studio also work as programmers, artists or designers.

We do believe that people who are given freedom, both for their ideas and use of their skills, work better and more effective.

We don’t complicate things that are simple.

I'm sure though that it's not as easy as it sounds.
 