Are AMD Cards Going To Be Well Optimized For TW3?

Before I ask this question or present my opinions, I would like to say that this thread was not created to start a flame war between Nvidia and AMD users. I also apologize if this question has been brought up in these forums in the past. I want to bring it up because TW3 was delayed due to optimization, and I would like to know CDPR's opinion on this. Firstly, I would like to say that CDPR has contributed a great deal to making PC gaming (and gaming in general) a better experience for all, as they strongly advocate a DRM-free system for video games.

However, I would like to talk about their support for Nvidia GameWorks, as there have been accusations from AMD's Robert Hallock that GameWorks "is a clear and present threat" to Radeon cards because it blocks AMD from optimizing games that use GameWorks, such as TW3 (Plunkett, 2014). It has also been said that "Radeon users are essentially simply robbed of performance, while said games run just as good with or without Gameworks for Nvidia users. In other words, Gameworks allegedly only exists to artificially inhibit performance on AMD hardware" (Dlux, 2014).

I want to know whether this is also true in the case of TW3, because I want to know if this game will look just as amazing as it would if I had an Nvidia card. I also want to know why CDPR is supporting a closed SDK like GameWorks. Furthermore, thanks to the community and CDPR for reading my thread. Once again, I apologize if a similar thread has been created before.

Source #1- http://www.kotaku.com.au/2014/05/amd-calls-nvidia-program-a-clear-and-present-threat-to-gamers/
Source #2- https://steamcommunity.com/app/292030/discussions/0/522729359055831210/?l=english
 
I have no doubt CDPR is optimizing the game for both Nvidia and AMD cards; some clarification from the devs would also be nice.
 
Well, Thracks's "clear and present danger" claim was caught out as false and FUD, so his credibility when it comes to anything but AMD's own product announcements is defunct.

Specifically, he claimed "Nvidia is asking to developers to give them exclusive access to code by contract", which was baseless and false and was refuted.

http://www.forbes.com/sites/jasonev...ut-gameworks-amd-optimization-and-watch-dogs/

The developers have stated minimum and recommended specifications for AMD GPUs and have said nothing that could be taken as a statement that the game will underperform on them. Claiming anybody said otherwise is FUD.
And at the risk of repeating myself, if AMD wishes their products to be seen as competitive with nVidia's, AMD must provide tools and support that are right now in fact, not sometime in the future, competitive with nVidia's.
 
By the sound of things, nothing is going to be very well optimized for tw3.

Depends on how you define optimized. If you mean they have crammed more graphics, action, and physics into one game than they otherwise could have, I'd say they have done that rather well. If you mean they have reduced the capabilities of the game until they have a set that can run on common hardware, I'd say I'm glad they have not.
 
Depends on how you define optimized. If you mean they have crammed more graphics, action, and physics into one game than they otherwise could have, I'd say they have done that rather well. If you mean they have reduced the capabilities of the game until they have a set that can run on common hardware, I'd say I'm glad they have not.
Well, I know it is still 4 months until release, but right now you need a top-of-the-line rig to fluidly display the game on "high" settings on PC (i7-4790, 8GB RAM, GTX 980, SSD). Ubisoft would already have been crucified by gamers for requirements like that. ;)

At OP: GameWorks isn't really a threat to AMD users, it's just an inconvenience. Usually (or rather, theoretically) if you don't use the respective Nvidia-only features, you shouldn't have any problems on AMD cards. Of course, optimization is never an exact science but a complex matter. And after all, you could even use an older, cheap Nvidia GPU to handle the PhysX work for your AMD system. Your system should benefit greatly from that in a PhysX game.
 
Depends on how you define optimized. If you mean they have crammed more graphics, action, and physics into one game than they otherwise could have, I'd say they have done that rather well. If you mean they have reduced the capabilities of the game until they have a set that can run on common hardware, I'd say I'm glad they have not.

No. More like while the game looks decent, it has no business running at the aforementioned frame rate on one of the highest tier cards available on the market.
Here's hoping that we can run the game decently on high-ish settings without a NASA computer around launch.
 
After AC Unity I am also wary. ACU was unplayable even on my high-end 280X until about two weeks after launch, when AMD released new drivers. I would really hate to take two weeks off to play TW3 only to find out it is unplayable because I own AMD hardware. Especially when AMD is in both consoles. Please, CDP, optimize this game.
 
Well, Thracks's "clear and present danger" claim was caught out as false and FUD, so his credibility when it comes to anything but AMD's own product announcements is defunct.

Specifically, he claimed "Nvidia is asking to developers to give them exclusive access to code by contract", which was baseless and false and was refuted.

http://www.forbes.com/sites/jasonev...ut-gameworks-amd-optimization-and-watch-dogs/

The developers have stated minimum and recommended specifications for AMD GPUs and have said nothing that could be taken as a statement that the game will underperform on them. Claiming anybody said otherwise is FUD.
And at the risk of repeating myself, if AMD wishes their products to be seen as competitive with nVidia's, AMD must provide tools and support that are right now in fact, not sometime in the future, competitive with nVidia's.

FYI, AMD does offer tools and SDKs that are decent. I am sorry, but Forbes covering a story about computer hardware is funny, because they have no expertise in it at all. Also, my question to you is: is GameWorks open source? Can I see the source code? From what I have read, it is not. I also believe that Nvidia may be restricting technologies needed for optimizing AMD hardware. Saying that this is false and FUD does not hold up, because we have seen games like Watch Dogs or AC Unity support this idea. Also please see this article and video: http://www.pcper.com/news/Graphics-Cards/AMD-Planning-Open-Source-GameWorks-Competitor-Mantle-Linux

Don't get me wrong, you can blame the developer for not optimizing the game, but it's clear that some games run better on Nvidia hardware. The same can be said of AMD and Mantle, with games such as BF4, DA:O, etc. As I said, it seems CDPR is still an Nvidia partner, and sure, that doesn't mean they will badly optimize the game for AMD cards, but it does mean that AMD cards won't be able to access GameWorks technologies. So my question is why CDPR is supporting Nvidia, which uses restrictive business practices. Just as they tackled the DRM-free issue, why are they affiliated with Nvidia? I am not saying that they should be an AMD partner either.
 
Forbes has a lot more technical expertise, and far less willingness to spread uncorroborated rumors, than Kotaku and the unqualified contributors on the Steam forum. Forbes writes regularly and knowledgeably on gaming in particular. So your attempt to discredit the refutation of AMD's claims holds no water.

To use a software development kit or middleware, or to work with a manufacturer on optimization, that manufacturer must have the product ready and engineers available when you need them, which means Mantle was never even in the picture.

The only proof of this pudding will be in the eating, unless CDPR makes a demonstration on AMD hardware before release. I would like to see one, too. The GPU on my highest-performing host is a Tahiti, and I really would like to get an idea of what I can expect. But until then, everything else is speculation at best and FUD at worst.
 
I just used those two sources to introduce the topic; like I said, PCPer did a very good video and overview of the whole GameWorks thing. I am sorry, but I don't trust Forbes compared to other websites out there like Guru3D, AnandTech, etc. However, I am sure that the game will run nicely because consoles use AMD hardware as well. http://www.pcper.com/news/Graphics-C...r-Mantle-Linux
 
After AC Unity I am also wary. ACU was unplayable even on my high-end 280X until about two weeks after launch, when AMD released new drivers. I would really hate to take two weeks off to play TW3 only to find out it is unplayable because I own AMD hardware. Especially when AMD is in both consoles. Please, CDP, optimize this game.
Sorry, but a 280X is not a "high-end" card anymore. IIRC AC Unity's recommended specs even demanded a better card...

Still, I hope that W3 will be well optimized for all hardware, both Nvidia and AMD.
 
I don't even see how this is a question at this point. Of course it won't be well optimized for AMD; did you not read the system reqs? In what world is a GTX 770 the equivalent of an R9 290? If the game performed as well on AMD, they would have listed the actual competitor to the 770 in the system reqs (i.e. a 7970/R9 280X).
 
Sorry, but a 280X is not a "high-end" card anymore. IIRC AC Unity's recommended specs even demanded a better card...

Still, I hope that W3 will be well optimized for all hardware, both Nvidia and AMD.

I consider it high-end. Not highest, but high. Middle would be something like 270.
Unity runs pretty well on 280 with omega drivers and looks stunning, I just hope Witcher will run well right on release.
 
I don't even see how this is a question at this point. Of course it won't be well optimized for AMD; did you not read the system reqs? In what world is a GTX 770 the equivalent of an R9 290? If the game performed as well on AMD, they would have listed the actual competitor to the 770 in the system reqs (i.e. a 7970/R9 280X).

I thought the same thing; that's why I started this thread, because an R9 290 shouldn't be treated as equivalent to a GTX 770. Furthermore, we don't even know what CDPR means by recommended specs. I think it's most likely 1080p with high settings; there's no way it can be maxed. A journalist at a recent event stated that the PC he played TW3 on had two 980s in SLI.

I consider it high-end. Not highest, but high. Middle would be something like 270.
Unity runs pretty well on 280 with omega drivers and looks stunning, I just hope Witcher will run well right on release.

I think the R9 280X/7970 should be able to at least hit high/medium settings.
 
Forbes has a lot more technical expertise, and far less willingness to spread uncorroborated rumors, than Kotaku and the unqualified contributors on the Steam forum. Forbes writes regularly and knowledgeably on gaming in particular. So your attempt to discredit the refutation of AMD's claims holds no water.

To use a software development kit or middleware, or to work with a manufacturer on optimization, that manufacturer must have the product ready and engineers available when you need them, which means Mantle was never even in the picture.

The only proof of this pudding will be in the eating, unless CDPR makes a demonstration on AMD hardware before release. I would like to see one, too. The GPU on my highest-performing host is a Tahiti, and I really would like to get an idea of what I can expect. But until then, everything else is speculation at best and FUD at worst.

Sorry, but when the Forbes article is just a written extension of Nvidia's "nuh uh" reply, I'm not going to put much faith in it.
 
Sorry, but when the Forbes article is just a written extension of Nvidia's "nuh uh" reply, I'm not going to put much faith in it.

Exactly what I thought as well. Nvidia's GameWorks is not even open source; I want to see its source code to make sure that none of their optimizations affect AMD users and their hardware.
 
Well, if The Witcher 2 is anything to go by, I'd guess not.

My GTX 970 is 10 fps faster than my HD 7990 in Witcher 2, but about 12 fps slower in Tomb Raider.

Although that could just be poor scaling with CrossFire.
 
Still no word from CDPR on this? Well, I won't be pre-ordering or buying the game until I have seen that it works well on AMD hardware, and that Nvidia bribes and dirty tricks using GameWorks to purposely lower performance on AMD hardware were not used when they "helped" make this game.
 