Nvidia’s GameWorks: A double-edged sword for The Witcher 3

You are right about that, but that wasn't why I responded. The difference is that I believe that CDPR will do a lot to release the game with the best graphics they can. If they think Nvidia can deliver that, they will use their tech. And (to what I was originally responding) they shouldn't abandon that because someone doesn't have the right graphics card.
Are you implying that they don't need the sales from the half of PC players who use AMD cards? Because I certainly won't put my money down for something that runs like a buggy mess on my system.
 
Actually, it would be shortchanging nVidia users to require that CDPR not use GameWorks technology. There is no evidence that CDPR's choice of technology will impair performance or make a "buggy mess" of the game on AMD GPUs. That is FUD, and it need not be refuted further.

That isn't how technology works. When they use Nvidia's proprietary tech and services, they make it difficult for the game to be optimized for AMD platforms. They could always come up with the tech themselves if they so wished, or use open-source alternatives.

Actually, technology works exactly like this:

You have a game you want to produce.
You have certain qualities that you want to achieve in that game.
You find the best tools and services you can acquire and your team can use to achieve the qualities you want.

If those tools and services are proprietary, you use them anyway, because there is no excuse for doing anything less than the best job you can with what you can work with.

You don't use open source just to satisfy armchair developers who demand you use open source. You use the best technology available, whether or not it is open source.

You don't avoid using a proprietary product or service just to satisfy those who demand you use a competitor's proprietary product or service. You use the best technology available, no matter what competitor it comes from.

Anything less is not professional, it is not engineering, it is amateur and has no place in serious software development.
 
@Guy N'wah
You should know that any game, whether it be Batman, Crysis, Battlefield, or Tomb Raider, ends up borking the experience for those whose graphics card manufacturer didn't get to sign a deal. Asking CDP to gimp the game because certain hardware can't handle it is, of course, ridiculous, just like asking devs to downgrade a PC game to make it look comparable to its console counterparts. That isn't the argument. The point is that (a) CDP could have come up with this tech on their own so that it ran just as well on AMD and Nvidia hardware, and (b) they could have made it so that it didn't end up running like a train wreck on AMD platforms.
 
No, the point is that if CDPR had tried to come up with all that technology on their own, or to repurpose open source to do so, we would not be getting the game at a $60 list price eight months from now, and it would not perform as well as it will, precisely because they used established technology and professional support.

"End up running like a train wreck on AMD" is FUD. I am amazed Thracks and others would stoop to FUD.

I run an AMD card as my big iron GPU. I expect the game will perform entirely to my satisfaction on that card. I would not have chosen it otherwise.
 
I am amazed Thracks and others would stoop to FUD.
I wish I could share your optimism, but this has happened all too often with both Nvidia- and AMD-supported games to be mere conjecture. This isn't a one-off incident; this is routine. Now, usually I just refrain from buying such games, but The Witcher is a series that I am invested in. I don't think expecting reassurance from the devs on this issue is too much of an ask.
 
I'm a proud owner of an Nvidia card, I love this company and their products, and I came to this forum just to say one thing... I hope CDPR won't use GameWorks in their game, and if they do, that they will keep it to a minimum. I've played way too many broken games with Nvidia GameWorks to trust this thing anymore: CoD, AC, now Watch_Dogs. Almost always the performance sucks big time. Nvidia should stick to making excellent video cards and other hardware. Well, they can keep PhysX (it's cool), but the other libraries should die.
 
That hardly proves anything, then. CoD, AC, and W_D are all games known to be "optimized" like trash, and you're an Nvidia card user; the "alleged" crippling was for AMD users.

Almost everything these titles used from GameWorks was slapped on top of the PC version, and whether the effects were enabled or disabled, the performance was horrible either way.
 
I'll tell you one thing: Watch Dogs ran like GARBAGE on my i7-920 2.6GHz quad-core, 12GB DDR3, GTX 570.

I literally had to drop EVERY setting to off/low in order to get playable FPS at 1920x1080. The game world looks horrendous, and the actual gameplay itself is very shallow. It's amazing how hollow the world feels, but that is beside the point. If THIS is "optimized for nVidia", then I feel really bad for AMD users. Then again, Ubisoft is renowned as a rather poor PC developer.
 
That hardly proves anything, then. CoD, AC, and W_D are all games known to be "optimized" like trash, and you're an Nvidia card user; the "alleged" crippling was for AMD users.

Almost everything these titles used from GameWorks was slapped on top of the PC version, and whether the effects were enabled or disabled, the performance was horrible either way.

The thing is that it's a golden age for PC now... the majority of games run great and are well optimized, and when I see those rare games that are not, they are using GameWorks. It's a pattern for me, so it's hard to blame anything else: the devs are different, the engines are different, and the only common thing is GameWorks.
Maybe the pattern is that only bad devs use GameWorks. Dunno, but I try to avoid any game with GameWorks. WD was my last mistake; TW3 will be the only exception.

and there is this: https://twitter.com/thinkinggamer/status/452737821946970112

Devs smashing GameWorks on Twitter. And one of them is an ex-CDPR REDengine engineer, and another is a guy from Ubisoft Montreal, which is partnered with Nvidia!!
 
Actually, it would be shortchanging nVidia users to require that CDPR not use GameWorks technology. There is no evidence that CDPR's choice of technology will impair performance or make a "buggy mess" of the game on AMD GPUs. That is FUD, and it need not be refuted further.



Actually, technology works exactly like this:

You have a game you want to produce.
You have certain qualities that you want to achieve in that game.
You find the best tools and services you can acquire and your team can use to achieve the qualities you want.

If those tools and services are proprietary, you use them anyway, because there is no excuse for doing anything less than the best job you can with what you can work with.

You don't use open source just to satisfy armchair developers who demand you use open source. You use the best technology available, whether or not it is open source.

You don't avoid using a proprietary product or service just to satisfy those who demand you use a competitor's proprietary product or service. You use the best technology available, no matter what competitor it comes from.

Anything less is not professional, it is not engineering, it is amateur and has no place in serious software development.

You missed one very important element of how game development should work:

You have to satisfy and please your customers.


Using proprietary tech instead of using open source or creating something of your own isn't professional; it's the easiest way. By using proprietary tech you only fully satisfy part of your customer base, and you know that. It's a shortcoming of the tech from a business point of view. It would be better to create your own solution or use open tech in that case. That's my opinion and you don't have to agree with it, of course. ;)

About Watch_Dogs: I play the game with most settings on ultra, high textures, high shadows, and high shaders, with temporal SMAA at 1080p, at a steady 30-50 FPS on my overclocked AMD Radeon HD 7870. I can't understand how that is "horrible" optimization. The game looks great and runs great on my mid-level system. The same was true for AC IV: Black Flag, which ran at 40-50 FPS with almost everything on ultra and 4xMSAA at 1080p. Some people just don't seem to understand that open-world games can't be compared to linear shooters in terms of visual quality. It's a completely different thing to display graphics in a corridor FPS with scripted events and very few moving objects on screen, versus an open-world game with many, many moving objects on screen and stuff happening at the same time. There are far more calculations and far more simulation involved, which requires good hardware. Just have a look at GTA V and how poor it looks and runs in 720p on the old consoles. And GTA IV was a complete mess on PC for a very long time, for good reason...
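To illustrate that last point with a toy example (purely illustrative C++, not from any real engine): the per-frame simulation cost scales with the number of objects that are active at once, and an open world keeps thousands of them active where a corridor shooter keeps a few dozen.

```cpp
#include <vector>

// Toy illustration, not code from any real engine: per-frame simulation
// cost grows with the number of objects active at the same time.
struct Entity {
    float x = 0, y = 0, vx = 1, vy = 0;                   // position, velocity
    void update(float dt) { x += vx * dt; y += vy * dt; } // stand-in for AI/physics
};

void simulateFrame(std::vector<Entity>& active, float dt) {
    for (auto& e : active)  // O(active.size()) work every single frame
        e.update(dt);
}

int main() {
    std::vector<Entity> corridor(50);     // scripted shooter: a handful of actors
    std::vector<Entity> openWorld(5000);  // open city: crowds, traffic, props

    const float dt = 1.0f / 30.0f;        // 30 Hz frame step
    simulateFrame(corridor, dt);          // cheap
    simulateFrame(openWorld, dt);         // ~100x the work, before any rendering
}
```

Entity counts here are made up for the sake of the comparison; the point is only that the work per frame is proportional to what's alive in the world, not to the API drawing it.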

It's too easy to blame GameWorks for that. It's the nature of the open-world game that is the problem. That, and people who don't know much about tech, settings, and PC gaming in general, I fear.

So I don't have much fear that Witcher 3 will run worse on AMD cards than on Nvidia cards. I have a much bigger fear that it won't run that well on any system, due to its open-world structure. At the very least, most people won't be able to play it on ultra at 1440p or above. The console versions will probably be much worse than the footage we saw in the video. My problem with GameWorks in this case is just CDPR's attitude towards the topic. It seems to me that they never really cared about an open-source or self-made solution for certain tech, and that's sloppy. They do so much good PR, so why not tell us in detail why they chose GameWorks? It's one thing to decide on something, and another thing to communicate and justify it to your fans...
 
You are right about that, but that wasn't why I responded. The difference is that I believe that CDPR will do a lot to release the game with the best graphics they can. If they think Nvidia can deliver that, they will use their tech. And (to what I was originally responding) they shouldn't abandon that because someone doesn't have the right graphics card.

They can deliver the game with the BEST graphics without using EITHER Nvidia or AMD proprietary tech. Developers should NOT be biased in favor of one manufacturer. If they want to make certain tech available for Nvidia, then they should also make the game MANTLE-ready. Let me ask you: if they implement MANTLE in Witcher 3 and gamers get an extra 10 FPS over Nvidia, then by YOUR theory, they shouldn't abandon that because someone doesn't have the right graphics card??
 
Almost everything these titles used from GameWorks was slapped on top of the PC version, and whether the effects were enabled or disabled, the performance was horrible either way.
I don't think there is such a thing as getting slapped on top when it comes to GameWorks; you are allowing Nvidia to implement its libraries and optimize the code for Nvidia hardware.
 
They can deliver the game with the BEST graphics without using EITHER Nvidia or AMD proprietary tech. Developers should NOT be biased in favor of one manufacturer. If they want to make certain tech available for Nvidia, then they should also make the game MANTLE-ready. Let me ask you: if they implement MANTLE in Witcher 3 and gamers get an extra 10 FPS over Nvidia, then by YOUR theory, they shouldn't abandon that because someone doesn't have the right graphics card??
There is a big difference between adding effects and adding an entire rendering API. Mantle will ONLY run on AMD hardware, and it would obviously require maintaining another code branch and implementing it into REDengine 3.
That is NOT the case with GameWorks: some effects are Nvidia-limited, but the rest will work on both AMD and Nvidia, and will be present on both consoles as well, which, as it happens, use AMD hardware.
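To make the "another code branch" point concrete, here's a purely hypothetical C++ sketch (none of these class names come from REDengine 3 or GameWorks): a second rendering API means a whole second backend behind an abstraction, while an effects library is a call on top of whatever renderer already exists.

```cpp
#include <iostream>
#include <memory>

// Hypothetical sketch only. An entire rendering API (like Mantle) has to
// live behind a backend abstraction, maintained as a full second branch:
struct IRenderBackend {
    virtual ~IRenderBackend() = default;
    virtual void drawFrame() = 0;
};

struct D3D11Backend : IRenderBackend {
    void drawFrame() override { std::cout << "Direct3D 11 path (any GPU)\n"; }
};

struct MantleBackend : IRenderBackend {
    void drawFrame() override { std::cout << "Mantle path (AMD GCN only)\n"; }
};

// An effects library, by contrast, is layered on top of whichever backend
// is already there: one code path, no second renderer to maintain.
void applyVendorEffect() { std::cout << "effect on top of existing renderer\n"; }

int main() {
    bool gpuSupportsMantle = false; // a real engine would query this at runtime
    std::unique_ptr<IRenderBackend> backend;
    if (gpuSupportsMantle)
        backend = std::make_unique<MantleBackend>();
    else
        backend = std::make_unique<D3D11Backend>();

    backend->drawFrame();   // every feature must keep working in BOTH branches
    applyVendorEffect();    // whereas an effect rides on the one existing path
}
```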


And you make the best of the resources available to you. Do you really think creating their own hair rendering solution, cloth physics, and destruction is simple? It would take an immense amount of time and money, not to mention developers who are actually into this; there is a big difference between implementing a feature developed by experts, with proper documentation, and first creating your own solution and then learning its every crack and crevice.


I don't think there is such a thing as getting slapped on top when it comes to GameWorks; you are allowing Nvidia to implement its libraries and optimize the code for Nvidia hardware.
Yeah, there is. Take the PhysX SDK, for example: the effects these games use from it amount to a few things, while it's a full-blown physics SDK. AC uses Havok for the actual physics, for instance, and as far as I'm aware the PhysX SDK is one of the fastest physics engines right now, if not the fastest.
The things implemented by those games are mostly TXAA, HBAO+, and soft shadows.


Out of those, TXAA only works on NV hardware and is not essential at all; in fact, a lot of people dislike it because of too much blur.
HBAO+ works perfectly on both AMD and Nvidia as far as I've seen, certainly better than simple HBAO/HDAO and obviously SSAO.
Soft shadows are taxing on both vendors' hardware.


Now for the exceptions...
CoD used the fur tech for their dog, which was confirmed to be OpenCL and not CUDA, so there really is no restriction there; OpenCL is supposed to be AMD's forte.
Nvidia APEX Turbulence is the only remaining effect in the APEX library that is GPU-accelerated only.
ALL of the rest of the APEX modules run on the CPU as well as the GPU, and the latest PhysX SDK (which The Witcher 3 will be using) has SIMD optimizations specifically for the CPU.
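For what it's worth, this is roughly what a CPU-only setup looks like with the public PhysX 3.x API (a minimal sketch of my own, not CDPR's code; exact initialization details vary between SDK versions). Nothing in it touches CUDA or the GPU, which is why this path behaves the same on AMD and Nvidia systems.

```cpp
#include <PxPhysicsAPI.h>
using namespace physx;

// Minimal sketch of a CPU-only PhysX 3.x scene; no CUDA device is involved
// anywhere. Signatures may differ slightly between SDK versions.
static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    // Simulation work is dispatched to host CPU worker threads; this is the
    // vendor-neutral path the post above is talking about:
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    scene->simulate(1.0f / 60.0f); // step the world at 60 Hz
    scene->fetchResults(true);     // block until the step completes

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
```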
 

Aver (Forum veteran)
And you make the best of the resources available to you. Do you really think creating their own hair rendering solution, cloth physics, and destruction is simple?

But maybe it's not the best resource available. Look at the comments on Eurogamer when Ubisoft announced GameWorks would be in their next 3 games: the thread was dominated by disappointment and predictions of poor optimization from both Nvidia and AMD fans. You can also see devs that are using GameWorks criticizing it. Even Nvidia admitted that, even though it's possible to optimize a game without source code, it can be very difficult, and Nvidia doesn't allow the parts of the code that use their solutions to be shared with AMD.

But my favorite comment on GameWorks was a picture shopped by one poster on NeoGAF: "GameWorks. It's meant to be played on console."
 
FYI, UE4 has the GameWorks libraries already implemented.
I don't see a bright future for AMD if they don't take some action.
 

Aver (Forum veteran)
FYI, UE4 has the GameWorks libraries already implemented.
I don't see a bright future for AMD if they don't take some action.

"An Epic representative emailed me to clarify that "NVIDIA GameWorks is not built into UE4. The engine ships with PhysX." This is curious because on Nvidia's developer portal, the company states that "we’ve incorporated support for NVIDIA GameWorks directly into Unreal Engine 4 making it easier for UE4 licensees to take advantage of our technology."Nvidia is choosing their words carefully, but the intent seems to be touting the inclusion of the GameWorks libraries (not just PhysX) directly into the Unreal Engine 4 core, and Epic has made it abundantly clear to me that that's not the case."

Source:
http://www.forbes.com/sites/jasonev...ut-gameworks-amd-optimization-and-watch-dogs/
 
No offense, but this is what I am most worried about. I just want to play the game at its best, and if TW3 with GameWorks runs better on Nvidia cards (which it probably will), then I will technically be forced to buy Nvidia. Then along comes another game with Mantle (which might run a lot better on AMD hardware, you never know), and I'm stuck with Nvidia...

This shit will create a sort of split between games, and that can't be good if you ask me...

Yeah, I'm not a fan of this type of thing happening from either camp.

Nvidia does come out with some impressive graphical tech. PhysX is the top physics engine around, and AMD has no alternative. But I am not a fan of how Nvidia loves to embed its closed tech and engineers into as many games as possible, and while I doubt they intentionally gimp performance on AMD, performance on AMD is certainly not something Nvidia cares about when implementing their libraries. This wouldn't be bad if their libraries weren't a black box that AMD can't easily optimize for, but alas, they are.

And on the other hand we have AMD's Mantle (I've actually found Mantle quite impressive performance-wise when I tried it in BF4; it significantly increased my minimum FPS compared to DX11), which is an entirely new API that only runs on AMD cards (despite AMD's claims that they will 'open it', so far they haven't put their money where their mouth is). For the most part AMD has been pretty good about using open tech (OpenCL etc...), but when it comes to Mantle I've seen no sign of it ever running on anything other than AMD cards.

You end up feeling 'screwed'/missing out one way or another with either brand :/ There's always going to be some game that you anticipate being 'optimized' for whichever brand you didn't buy.
 
One thing I don't get:
Why do they use Nvidia GameWorks when both consoles use AMD cards? Or is GameWorks for PC use only?
 
One thing I don't get:
Why do they use Nvidia GameWorks when both consoles use AMD cards? Or is GameWorks for PC use only?
As stated a few times... because GameWorks is not Nvidia-limited. A few effects are, but otherwise GameWorks isn't restricted to Nvidia-only hardware.
 
As stated a few times... because GameWorks is not Nvidia-limited. A few effects are, but otherwise GameWorks isn't restricted to Nvidia-only hardware.

Let's be honest: GameWorks is 90% about Nvidia-only proprietary libraries. The only things not limited to Nvidia GPUs are the CPU PhysX effects, and they usually don't work very well. Optimized HBAO+, TXAA, optimized tessellation, and the good PhysX stuff only run on Nvidia GPUs.

You can say whatever you want: choosing GameWorks for development is favoring Nvidia over AMD.
 