Nvidia’s GameWorks: A double-edged sword for Witcher 3

I never said that they are perfect, they are not, but in that case it's not their fault. MSI added their own fan control to the firmware and it didn't work well with the driver's fan control - it's up to MSI to make their custom solutions work, not up to the drivers team.

MSI does not have their own driver team... lol... That is why it is both teams' fault. MSI and AMD work together to create a driver that works across the board for all GPUs, factory and partner AIBs.

btw, here is a great video on why MSI has their own fans/cooler/fan software... to keep this 290x cool... lol

https://www.youtube.com/watch?v=u5YJsMaT_AE quiet mode then uber mode!!!!!!!
 
MSI does not have their own driver team... lol... That is why it is both teams' fault. MSI and AMD work together to create a driver that works across the board for all GPUs, factory and partner AIBs.

MSI don't make drivers, but they do make firmware. This problem was fixed by a firmware update released by MSI, not by a new driver release, so you are wrong.

Could you now go back to the topic? Please?
 
It runs on AMD in some games... with worse performance compared to Nvidia GPUs... but you're right, my bad. HBAO+ is possible on AMD cards.

I didn't notice much performance difference when I used it, but 280x is a pretty beastly card :)

Do you have any examples of games that won't allow HBAO+ on AMD? Every game I've seen with it allowed me to use it.
 
Can we expect this kind of Nvidia GameWorks gap in W3?

Witcher 2 is my favorite game of all time, but if this kind of sabotage on AMD hardware is going to happen in this game (see video)? I wish to cancel my preorder and I will no longer support any publisher who uses GameWorks.

This is absolutely ridiculous considering an R9 290/X beat out the 970/980 at high AA/high resolutions in Shadow of Mordor and Lords of the Fallen (benchmarks on the same channel). So it only seems to happen in GameWorks games that use the Nvidia library for the core of the game.

Add to this that the consoles use AMD 7000-series GCN hardware. If anything? The game should run better on AMD at high AA/resolution (and usually does, before GameWorks). I think the developer owes us an answer to this question: will the libraries for Witcher 3 be closed to AMD, and are you using the GameWorks library for the core of the game? If so? I wish to cancel my preorder now. I am not going to buy a game with performance cut by 20-30 FPS when this kind of thing is grounds for unfair practices, and no developer should be supporting it. Witcher 2 runs faster with Ubersampling on my R9 290 Tri-X than on my GTX 970 (which I sold).

It is a simple yes-or-no question, CDPR. Does this game use GameWorks for the core of the game, or is it just effects on top? If it is the latter? Good, great, grand. You get money from Nvidia and we don't care. If it is the former? We deserve to know now. Also? Please spare me the TXAA nonsense. It is a blurry, cheap AA that should only ever be used on a HORRIBLY optimized game in place of SMAA, and then you are just choosing between blur (TXAA) and jaggies (FXAA). If Witcher 3 has to use TXAA? It simply failed optimization and the PC version is being held back. A decent-brand R9 290/GTX 980 should be running 4 times what the consoles can, and MSAA x4 is not even four times that. SSAA x4 is.
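A rough back-of-the-envelope for that last MSAA-vs-SSAA cost claim, assuming a 1080p frame and typical hardware MSAA behavior (the pixel shader runs about once per pixel, while SSAA shades every sample):

```python
# Back-of-the-envelope shading cost for a 1920x1080 frame (illustrative only).
base_pixels = 1920 * 1080

# 4x SSAA renders at 4x the sample count and shades every sample,
# so shading work is roughly 4x the base cost.
ssaa4_shaded = base_pixels * 4

# 4x MSAA stores 4 coverage/depth samples per pixel but typically runs
# the pixel shader only about once per pixel, so shading stays near 1x
# (bandwidth and resolve costs still grow, of course).
msaa4_shaded = base_pixels * 1

print(ssaa4_shaded // base_pixels, msaa4_shaded // base_pixels)  # 4 1
```

Which is why 4x SSAA really is "four times the work" in shading terms, while 4x MSAA is much cheaper than that.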

Thanks in advance, and I hope you don't sell out like Ubisoft, which I hope goes bankrupt at this point, since they are already morally bankrupt.

https://www.youtube.com/watch?v=Pobd1omu6i0&list=UUJEhJYDr4NAUEtcMRNrZmcw
 
There is always one of these posts/threads every time Ubisoft releases another shit PC port...

Yes, Witcher 3 uses GameWorks.

The game has ditched Havok in favour of the PhysX SDK, and that runs entirely on the CPU, so the GPU brand is irrelevant.
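To illustrate why a CPU-only physics step needs no particular GPU at all, here is a toy rigid-body integration sketch in plain Python (purely illustrative; this is not the PhysX API, just the kind of work a CPU physics tick does):

```python
# Toy CPU-side physics step (semi-implicit Euler) -- illustrative only,
# not the PhysX SDK. The whole simulation runs on the CPU.
GRAVITY = -9.81  # m/s^2

def step(bodies, dt):
    """Advance each body one timestep: integrate velocity, then position."""
    for b in bodies:
        b["vy"] += GRAVITY * dt        # gravity updates velocity
        b["y"] += b["vy"] * dt         # velocity updates position
        if b["y"] < 0.0:               # crude ground collision
            b["y"] = 0.0
            b["vy"] = -b["vy"] * 0.5   # lossy bounce

bodies = [{"y": 10.0, "vy": 0.0}]      # one body dropped from 10 m
for _ in range(60):                    # simulate one second at 60 Hz
    step(bodies, 1.0 / 60.0)
print(round(bodies[0]["y"], 2))        # body is still falling after 1 s
```

Nothing here touches the graphics card; GPU-accelerated PhysX is only for the optional extra effects layered on top.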

The game uses the APEX Clothing and APEX Destruction modules; this again has nothing to do with the GPU brand, because these will be present on the consoles as well.

The game will have optional features like HBAO+, HairWorks, GPU-accelerated PhysX, and TXAA. The latter two require an Nvidia GPU. HBAO+ doesn't, and HairWorks at least partially works on AMD hardware too, since it's based on DirectCompute and not CUDA.

The few of these things that DO require an Nvidia GPU are OPTIONAL, and you don't have to use them. If they aren't enabled, then they obviously cannot affect the performance of the game; any other claim is just FUD.

If you still feel like cancelling your preorder well then go ahead.
 
Witcher 2 is my favorite game of all time, but if this kind of sabotage on AMD hardware is going to happen in this game (see video)? I wish to cancel my preorder and I will no longer support any publisher who uses GameWorks.

I don't think they answer this kind of question, so I wouldn't wait. I think you should cancel your pre-order, wait for some reviews of this aspect of the game (performance, etc.), and buy it if it's up to your standards. If it's a digital version, you won't lose pre-order bonuses, and if it's boxed, maybe you'll lose only a poster or nothing - unless you have the option of getting some of the great stuff I saw in one thread (then it would suck).

I'm an AMD user and I am a little concerned as well - I know it will run better on Nvidia, because how can it not? Aren't they like best buddies now? I just hope CDPR doesn't forget there are a lot of AMD users, and we would like all the eye candy possible and for the game to run well enough.
 
Welcome to the long and ongoing PC war between Nvidia and AMD. It was inevitable CDPR was going to pick one side or the other. Ubisoft always seems to endorse Nvidia, while EA seems to endorse AMD. As far as actual features go, Nvidia definitely blows AMD out of the water. I think part of the reason CDPR made this choice is that they recognize they'll get the best results from using Nvidia tech in terms of PhysX, HairWorks, etc. It's unfortunate for AMD players, but they would not have had those benefits to begin with, or the features would have been poorly implemented had CDPR used other third-party tech. It's well known that overall Nvidia has better support for PC gaming. They invest more in it and they work heavily with developers to make Nvidia cards deliver top performance in their games.

On a side note, all AMD is missing out on are features like TXAA, HairWorks, PhysX, etc. It certainly won't ruin the game, but they do make the experience a lot more immersive and enjoyable. If you are thinking about getting a new graphics card, I'd highly recommend Nvidia. I used to own AMD, but they had terrible driver support and I had more issues than benefits using their cards with games. Nvidia is the complete opposite. They are constantly releasing new drivers, especially around game launches, and constantly improving performance and the software for their hardware. Sure it's political and Nvidia is trying to push AMD out of the way, but I wouldn't suffer needlessly if you want to reap the full benefits of what TW3 has to offer and you can afford it.

I've definitely had a much better experience with Nvidia than AMD. I'd also like to point out it didn't matter what your GPU was for Watch_Dogs... The game was terribly optimized and even stuttered on my high-end system. Ubisoft always struggles when it comes to PC ports it seems. Hopefully TW3's PC version will not suffer because of consoles.
 
No, they get a physics engine and the support that comes with it, which is critical given that they are developing the game and the engine at the same time and have limited resources.
So you are merely agreeing with me that Nvidia provides better support for PC gaming? Okay. PhysX is better than Havok anyways, so I only see this as a step in the right direction. It's a shame AMD users won't be able to benefit, but it's not exactly as if PhysX is a new kind of feature. It has been used in popular franchises such as Batman: Arkham and Borderlands just to name a few.
 
I've definitely had a much better experience with Nvidia than AMD. I'd also like to point out it didn't matter what your GPU was for Watch_Dogs... The game was terribly optimized and even stuttered on my high-end system. Ubisoft always struggles when it comes to PC ports it seems. Hopefully TW3's PC version will not suffer because of consoles.

Yeah, I have the same experience with both cards. Used to be an Nvidia owner, then tried several top AMDs because it's cheaper/cooler with better reviews. However, results were not good. It's not that it's that much slower but they're not very optimized for all games, causing artifacts/stutters and whatnot (even on popular titles). Mainly compatibility issues. I never had that experience with an Nvidia card.

I'm just waiting to buy an Nvidia card before W3 releases. CDPR has supported Nvidia since W2 or earlier, so that's a plus.

BTW, this doesn't mean W3 won't be playable on an AMD card. I'm sure it will work just fine. My Radeon works fine with W2. I just want better compatibility overall.
 
Yeah, I have the same experience with both cards. Used to be an Nvidia owner, then tried several top AMDs because it's cheaper/cooler with better reviews. However, results were not good. It's not that it's that much slower but they're not very optimized for all games, causing artifacts/stutters and whatnot (even on popular titles). Mainly compatibility issues. I never had that experience with an Nvidia card.

I'm just waiting to buy an Nvidia card before W3 releases. CDPR has supported Nvidia since W2 or earlier, so that's a plus.

BTW, this doesn't mean W3 won't be playable on an AMD card. I'm sure it will work just fine. My Radeon works fine with W2. I just want better compatibility overall.

Exactly. My issue with AMD was less so their hardware performance and more so their drivers and software. With AMD, I was having all sorts of issues running Skyrim and various other popular titles because of bad drivers. I had to constantly reload drivers or try out beta drivers just to get the game to work. It was incredibly frustrating and really left a bad taste in my mouth. Not to mention AMD cards tend to be a lot louder and much less efficient than Nvidia, which is why they are generally cheaper.

I just want an amazing experience playing The Witcher, and I believe only Nvidia will offer it to the fullest. They provide the best service, work with the most developers to make quality PC ports, constantly release drivers improving our hardware, and add all sorts of amazing graphical and game-oriented features only available on Nvidia. I just don't see a contest, as Nvidia simply seems more committed to truly offering the best PC gaming experience possible. Unless Nvidia were to do something terribly wrong, I doubt I'll ever consider AMD again for a future product.
 
Isn't this getting a bit out of hand? Cancelling pre-orders because AMD users miss 2-3 options, when it was known before the pre-orders started that the game was going with Nvidia?

And don't pretend that missing an AA option and hair physics will break the game and the immersion for you; The Witcher is more than just pretty graphics.

I understand that you want the game to run properly on your system, everybody does, and CDPR said they will optimize the game for each platform and push its limits. So don't ask CDPR to respond here; trust them, because they have already answered your question/concern.
 
The new consoles use AMD technology don't they? I don't think AMD will be pushed to the side.

Microsoft and Sony both use AMD hardware because Nvidia was too expensive and wouldn't make a deal with consoles. The game will certainly be fine on AMD GPUs, it will just be better with the additional features if you have an Nvidia GPU. This really shouldn't come as a surprise to anyone as Nvidia has been giving additional features to many AAA games for quite some time now. You want to benefit? Switch to Nvidia. It's more expensive than AMD, but they have higher quality products that are more efficient with better driver support. You pay for better service and better hardware.
 
I think Ubisoft and Nvidia outdid themselves this time around. Not only does Assassin's Creed Unity average 30 FPS on a GTX 780 and an i7 4770K, it also dips down to 17 FPS on the PS4. It is a game that is not optimized on any system, and the funny thing is that it's one of the biggest releases this year.
It's even funnier how we PC gamers spend more and more money on more powerful PCs, and at the end of the day it doesn't matter, because a poorly optimized game doesn't properly take advantage of the power available to it.
No one should buy/build PCs based on the ridiculous requirements of poorly optimized games like Unity, and I'd say buying games like this before patches and price drops just gives greedy and lazy companies like Ubisoft more money to make more unoptimized games.
And lastly, the trailers and interviews saying that Nvidia is in close collaboration to optimize the game and make it look its best on PC have turned into pure comedy after Watch Dogs and Unity. It gives me the chuckles every time I see one of those trailers, and I try to avoid such games, because it means the game is unoptimized, say, 99% of the time. These trailers and the games' performance suggest two possibilities: either Nvidia does this on purpose to force customers to buy overpriced cards, or Nvidia is simply incompetent.
I don't think Nvidia should be blamed when the game performs poorly on consoles too. Watch Dogs in particular had PC options hidden with the title "This is PC who cares". I am really unsure that Nvidia has a big role in anything Ubisoft does, despite what they may say.
 
Regarding TXAA: you're not missing out on anything.
PhysX can look nice but tends to be way overdone; it also costs a lot of performance.
 
I don't think Nvidia should be blamed when the game performs poorly on consoles too. Watch Dogs in particular had PC options hidden with the title "This is PC who cares". I am really unsure that Nvidia has a big role in anything Ubisoft does, despite what they may say.

Nvidia shouldn't be blamed at all. They don't build these poorly made, badly optimized engines. All they do is add features for the PC port and try to make the game work reasonably well with Nvidia GPUs. If not for Nvidia, it's likely PC ports from Ubisoft would be even worse.

Regarding TXAA: you're not missing out on anything.
PhysX can look nice but tends to be way overdone; it also costs a lot of performance.

TXAA is amazing. It significantly reduces performance cost with the same results as SMAA or MSAA. PhysX can admittedly hog performance, but did you see it in action in Batman: Arkham Asylum, City, and Origins? Absolutely stunning. Carpets would tear as expected. Money would flutter as expected. Batman's cape would move and behave as expected. Even trails of footprints in snow would be rendered and persist based on your path through it. PhysX makes the experience that much more immersive. HairWorks will be amazing, as hair will actually move realistically rather than being a frozen mesh with clipping issues. Long story short, I wouldn't want to live without Nvidia features. They are just too awesome.
 
Far Cry 4 will be the best test so far to see if Nvidia's GameWorks will adversely affect AMD users. Remember that Far Cry 3 was an AMD Gaming Evolved title and it ran equally well on Nvidia and AMD cards. Take a look at the recommended graphics cards for Far Cry 4.

http://blog.ubi.com/far-cry-4-pc-specs/

It recommends a GTX 680 or a Radeon R9 290x. A 290x is a much stronger card. Does that mean an AMD card needs that much more power to render the same effects? We shall see... The minimum requirements are equal because you wouldn't be activating the higher-end effects with a weak or old card anyway unless you wanted to watch a slideshow.

If the 290x recommendation is a typo, how come they haven't fixed it yet? Normally, you compare a GTX 680 with a Radeon HD 7970 and a GTX 770 with a Radeon R9 280x. Also, note that they got the counterparts correct with Assassin's Creed Unity system requirements.

The results should be interesting...
 
Far Cry 4 will be the best test so far to see if Nvidia's GameWorks will adversely affect AMD users. Remember that Far Cry 3 was an AMD Gaming Evolved title and it ran equally well on Nvidia and AMD cards. Take a look at the recommended graphics cards for Far Cry 4.
Far Cry 3 also wasn't ported by Ubi Kiev (the same studio that does the AC ports); however, Far Cry 4 will be, so we'll see how that turns out...

TXAA is amazing. It significantly reduces performance cost with the same results as SMAA or MSAA.
The real benefit of TXAA is that it gets rid of temporal aliasing; no other form of AA does that (SMAA Temporal exists, but it has plenty of ghosting). What this means is that textures do not flicker in the game.
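The temporal-accumulation idea behind that can be sketched as a simple exponential moving average over frames (illustrative only; this is not Nvidia's actual TXAA algorithm, which also uses MSAA samples, reprojection, and history clamping to limit ghosting):

```python
# Minimal temporal accumulation: blend each new frame into a running
# history so per-frame shimmer averages out over time. Illustrative
# sketch only -- not the real TXAA implementation.
def temporal_blend(history, current, alpha=0.1):
    """Exponential moving average of per-pixel values."""
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]

# One pixel that flickers between 0.0 and 1.0 every frame
# (classic temporal aliasing, e.g. a subpixel-thin highlight):
history = [0.0]
for frame in range(100):
    current = [float(frame % 2)]       # alternating flicker
    history = temporal_blend(history, current)

# The accumulated value settles near the 0.5 average instead of
# flickering between the extremes.
print(round(history[0], 2))
```

The trade-off is exactly the ghosting mentioned above: a slow-moving average lags behind genuine changes in the image, which is why real implementations add reprojection and clamping.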
 