Are AMD Cards Going To Be Well Optimized For TW3?

Well, I won't be pre-ordering or buying the game until I've seen that it works well on AMD hardware.

This is the best thing you can do: wait for benchmarks, then decide whether you want to purchase the game or not. Pre-orders are generally a bad thing, even if it is CDPR's game.
 
For more on the GameWorks bit, I found this article. Concise, informed and worth a read.

http://www.extremetech.com/gaming/1...elopers-weigh-in-on-the-gameworks-controversy

For my purposes, I run twin 290Xs and I look forward to playing Witcher 3.

If Witcher 3 doesn't work with those cards for some insanely unlikely reason (I would blame @blank_redge), I will eBay/replace them with cards that do work.

I use the hardware to play games. Witcher 3 is a big thing for me; it's much more important that I have a good time with it than that I keep inferior hardware. The game trumps the tools I use to play it, of course.

That wouldn't be the case for many lesser games, but there are precious few Big Releases I'm excited about. For those few, I configure my system around the game, not the other way around.
 
Delaying buying the game until you've seen that it works is fine. Making unfounded accusations of illegal activities is not.

If you knew what nVidia gets up to and has done in the past, then you would not think these were unfounded claims.
Feel free to look into the nVidia Bumpgate scandal, the Assassin's Creed DX10.1 removal, the Batman anti-aliasing scandal, Nvidia bribing Origin PC to drop AMD and badmouth them, etc., etc.

Nvidia are happy to lie, cheat and bribe anyone, even their own customers and partners.

The way you're meant to be played :)
 

While I agree with a lot of people that there are lots of questions we'd all like answered in regard to so many things, I really don't see the relevance of what Nvidia has or hasn't done in the past. That has nothing to do with TW3 and CDPR.

My recommendation, as others have already said, is to wait until the game is out, see for yourself how it runs on hardware similar to your own, and then decide whether you want to support the game or not.
 
I really don't see the relevance of what Nvidia has or hasn't done in the past. That has nothing to do with TW3 and CDPR.
Since I've been a long-time fan of The Witcher, I want TW3 to be great at release; however, since they have "partnered" with Nvidia, and since GameWorks is integrated into this game, it makes me and others worried.
 
I'm not sure I "get" the controversy.

AMD seem to claim that optimising is hard without the source code for the GameWorks libraries... which sounds valid, if the source really weren't available.
NVIDIA have introduced a standardised method for distributing the source code for the GameWorks libraries, which seems to permit AMD's preferred mode of optimisation.

At this point I'm not sure what the issue is, except that NVIDIA start with their hardware/drivers/library versions aligned for good performance on their own hardware, and AMD need to play catch-up. That doesn't seem sinister or evil, and it permits the developer to work to a common API with, once the wrinkles are ironed out, similar performance on both GPU lines.
 
Since I've been a long-time fan of The Witcher, I want TW3 to be great at release; however, since they have "partnered" with Nvidia, and since GameWorks is integrated into this game, it makes me and others worried.

I get that. I have AMD too atm, just to be clear. But I just don't see the relevance of what Nvidia may or may not have done in the past with regard to TW3. CDPR is not Nvidia, and given CDPR's track record, nothing indicates that they would favour one vendor over the other.

As far as I know they also have some kind of partnership with Microsoft, and the only thing they got was an exclusive card game, if I remember correctly, as CDPR stated they wanted nothing to do with anything that would compromise or gimp the core game for other platforms. I think this also applies to AMD/Nvidia.

No matter what, we will only really know, and be all the wiser, when the game is released and we can see independent benchmarks of it.
 
I have been biased towards Nvidia; if I have to name a reason, I don't have one except Nvidia's new features. However, atm I am waiting for AMD's new cards and will most likely end up with one or two of them, because of the performance plus the GTX 970 VRAM issue.
 
Historically speaking, I've generally said this about the two: AMD tend to release cards with (strictly speaking) superior hardware versus their green counterparts, but they're not quite as good at squeezing out performance (per watt), by design/architecture and software.

NV, meanwhile, tend to release cards with less raw hardware speed but lower wattage requirements (at least more recently). They then make up the difference in hardware with architecture specifics, and with the fact that they're quite a way ahead on the proprietary GPU software side of things.

So, depending on the project/game, how it handles the card's hardware, and whether it's coded to utilise that software, the system requirements can look imbalanced between the two. Also keep in mind that of the two recommended GPUs, the 290 is the cheaper one; the 770 can be quite a bit more expensive.

As a side note, when people call CDPR out on those requirements: there are a lot of factors in this, but I commend them for posting 'actual' requirements to comfortably play the game. A lot of other developers post minimum requirements so ridiculously low that they're probably at the margin of the game literally being able to run at all.

A lot of other developers also wouldn't have so openly revealed, before the game's release, what hardware their hands-on press systems were running during the event.

---
Whoa, that ended up turning into a wall of text, sorry ;p
 
As far as I'm aware, games being optimized for particular GPUs is a myth, at least in PC land. Let me explain. PC games typically use Direct3D/OpenGL for rendering. Direct3D/OpenGL are highly abstract APIs designed to interface between the game itself and the GPU drivers, thus mitigating the need for specific architectural optimizations. That's why a game like The Witcher 3 can run on AMD, NVidia and possibly even Intel hardware, provided it's DX11-capable. So when developers optimize a game, they are in reality optimizing for Direct3D/OpenGL, and NOT for a particular GPU architecture.
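To picture that abstraction, here's a minimal sketch (Windows C++, assuming the DX11 SDK headers are available) of how a game asks Direct3D for a device without ever naming a GPU vendor:

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    got;
    const D3D_FEATURE_LEVEL wanted[] = { D3D_FEATURE_LEVEL_11_0 };

    // The game never names a GPU vendor: it asks for "the hardware device",
    // and whichever driver is installed (AMD, NVidia, Intel) answers.
    HRESULT hr = D3D11CreateDevice(
        nullptr,                   // default adapter: whatever GPU is present
        D3D_DRIVER_TYPE_HARDWARE,  // let the installed vendor driver do the work
        nullptr, 0,
        wanted, 1,
        D3D11_SDK_VERSION,
        &device, &got, &context);

    if (SUCCEEDED(hr)) {
        // Everything from here on is the same code path on any DX11 GPU.
        context->Release();
        device->Release();
    }
    return SUCCEEDED(hr) ? 0 : 1;
}
```

Everything below this call (shader compilation, memory management, scheduling) is the vendor driver's job, which is exactly why driver quality matters so much.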

Of course there are exceptions to this, such as with low-level APIs like Mantle and the upcoming DX12, where the developer has much greater control over, and ability to access, the hardware.

With DX11, though, it's the IHVs' (NVidia, Intel and AMD) responsibility to optimize their drivers for 3D games and Direct3D, because ultimately it's the drivers that interact with the GPU itself at the lowest level. The game developer, in this case CDPR, cannot access the hardware at a low level.

As for The Witcher 3 being GameWorks, this by no means guarantees that it will run better on NVidia than on AMD. Far Cry 3 was Gaming Evolved, and so was Crysis 3; both titles run faster on NVidia than they do on AMD. Also, Metro Last Light was GameWorks, but it runs slightly faster on AMD hardware. The only real performance advantage GameWorks gives to NVidia is that it allows them to get a head start on optimizing their drivers for the game.
 
In terms of Nvidia, at least: a lot of the driver "optimization" I've seen, from inspecting driver files, amounts to them going so far as to disable, or reduce, a game's specific graphical effects that don't play nice with their cards, instead of working to find a real solution. They're particularly fond of forcing reductions of game settings in the background for water/reflection-oriented effects.

E.g.: you've picked 'ultra' for a given setting, but the GPU drivers are in fact capping that setting at 'high', overriding your preference in cases where performance suffers enough from it.
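To make that concrete, here's a tiny sketch of the kind of silent cap being described. To be clear, every name in it is hypothetical; real driver profiles are opaque per-game blobs, not a public API like this:

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical names for illustration only -- this is not any real
// driver's code, just the shape of the behaviour described above.
enum Quality { Low = 0, Medium = 1, High = 2, Ultra = 3 };

// The user picks Ultra in the game menu, but the per-game driver profile
// silently caps a troublesome effect (say, reflections) at High.
Quality applyProfileCap(Quality requested, Quality profileCap) {
    return std::min(requested, profileCap);
}

int main() {
    Quality effective = applyProfileCap(Ultra, High);
    std::printf("requested=Ultra(3), effective=%d\n", effective);  // prints 2
}
```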
 
Well, I have an i7 4790K, 8 GB RAM, R9 280X. If I notice on day one that TW3 runs badly for me, I will return the game to GOG the same day. Really tired of broken games, and really tired of Nvidia's "cash style" way of life.
 
You need to define what you mean by bad. If you try running on Ultra with ubersampling at 1440p, then you might have some issues. Does this make the game bad or broken? No.

I'm still waiting for the new AMD cards; hopefully they release something before TW3 is out, otherwise I'll have to get a 970.
I have historically used AMD cards, just a preference of mine. I don't get into the fanboy squabbles; I'll just get the best that's currently available in my price range.
 
[Question to the Devs] Will The Witcher 3 be a performance mess with AMD hardware?

Edit: I didn't know there was already a topic about this. Well, we have yet another game with awful performance on AMD hardware: Dying Light. I really, really hope we'll be able to play The Witcher 3 at a decent fps.
 

Back when TW2 came out, which was another nVidia-"partnered" game, it had support for Nvidia Surround but not for AMD Eyefinity.
http://forums.cdprojektred.com/threads/10461-Eyefinity-support-in-Witcher-2

It's a long thread, so I don't expect you to read it, but basically Eyefinity support is very easy to add, and developer docs and an SDK are available from AMD, yet CDPR won't add support, as it is an nVidia game. I am all for games supporting different features of graphics cards, but to give one side multiple-monitor support and deny it to the other is wrong and not fair. Now, with TW3 being even more of an nVidia game than TW2 was, you can see where the problems can be. For a sense of why wide-display support is often called easy, see the sketch below.
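The core of basic ultra-wide support is just widening the camera's field of view for the wider aspect ratio. Here's a sketch of the standard "Hor+" maths, assuming the engine exposes a vertical FOV (the function and variable names are mine, not from AMD's SDK):

```cpp
#include <cmath>
#include <cstdio>

// "Hor+" scaling: keep the vertical FOV fixed and widen the horizontal FOV
// to match the display's aspect ratio, so a triple-monitor Eyefinity or
// Surround setup sees more of the world instead of a stretched/cropped image.
double horizontalFovDeg(double verticalFovDeg, double aspect) {
    const double pi   = 3.14159265358979323846;
    const double vRad = verticalFovDeg * pi / 180.0;
    return 2.0 * std::atan(std::tan(vRad / 2.0) * aspect) * 180.0 / pi;
}

int main() {
    const double vFov = 60.0;  // a typical design-time vertical FOV
    std::printf("single 16:9 monitor: hFOV %.1f deg\n",
                horizontalFovDeg(vFov, 16.0 / 9.0));   // ~91.5 deg
    std::printf("triple 16:9 (48:9):  hFOV %.1f deg\n",
                horizontalFovDeg(vFov, 48.0 / 9.0));   // ~144.0 deg
}
```

Of course a full implementation also has to handle HUD placement and menus across bezels, which is where the real work is, but the rendering side is largely this one adjustment.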
 
Until it ships, nobody knows. Optimization comes from the developers on the software side and from the manufacturer on the driver side. If you have an old card, it's more likely that the manufacturer won't update the driver for it; Nvidia updates their drivers more frequently. I have a notebook with 16 GB RAM, an i7 3640 and an Nvidia GTX 680M; I hope to play it on mid or a little higher. I bought the collector's edition for my Xbox One. I hope there'll be a demo or benchmark to try the game on my PC soon. I have The Witcher 1 and 2; the GOG copy of the third costs 35 USD for me, but I want to see if I can run the game decently before I pay for it.
 
More platform FUD.

Bring a claim with evidence, not discredited rumors about CDPR refusing to support AMD, or keep silent.
 