Are AMD users left high and dry?

Charges of who has been sleeping with whom should be reserved for when Geralt and Yennefer are reunited. Constrain speculation to facts that are or could be public knowledge, not confidential business arrangements of which you cannot possibly have any knowledge.

They are using Nvidia's GameWorks library, TW3 has been showcased exclusively on Nvidia cards, and Nvidia has offered free copies of the game with the purchase of their cards. The last developer that did what CDPR is doing now with Nvidia was Ubisoft with AC Unity, and everyone knows how that turned out. It is known that Nvidia GameWorks cuts AMD off at the knees, plain and simple. It's pretty clear CDPR is in a partnership with Nvidia. Does that mean this game will run badly on AMD? Guess we'll find out Tuesday.

---------- Updated at 09:09 PM ----------

Here is some anecdotal evidence. I have been an ATI card buyer since it was a Canadian company, and I have NEVER had a problem playing any game that was optimized for Nvidia cards. Why do people think that suddenly this one game will render AMD cards useless? I don't get how this manufactured problem even started. If you have a crap video card you won't get a great visual experience regardless of the company; if you have a high-end video card, be it Nvidia or AMD, you will have a great visual experience. Isn't that all that really matters? Or must your experience have the best frame rate, period?
Because AMD is not allowed to look at the GameWorks library, which means it will take them longer to release optimized drivers. Now, that's not a bad thing IF this game is not in dire need of drivers from AMD at release.
 
AMD's claims related to GameWorks were proven false and constitute FUD. They will get no hearing here.
 

---------- Updated at 09:09 PM ----------


Because AMD is not allowed to look at the GameWorks library, which means it will take them longer to release optimized drivers. /snip

So the frak what!? So you won't get the most optimized drivers right away; too bad. Drivers ALWAYS come post-launch at various speeds. It was over a month before we got the AMD driver for DA:I, and that game was optimized for AMD. You are manufacturing a problem when there isn't a real problem. This back and forth between the two companies has been going on for decades, and it has NEVER left one company's hardware unable to play a game.
 
So the frak what!? So you won't get the most optimized drivers right away; too bad. /snip
Yes, stupid me, expecting a $60 game to run well on my machine when the hardware I'm using belongs to a company that gets locked out. This is not a problem manufactured by me. If you fail to see the problem with a hardware developer having a game developer exclusively use their toolset under a licensing agreement, then there is no real point in replying to me. In PC gaming, optimization is key. If you would like to read more about it, I have provided a link below; and yes, I'm aware of the Forbes article where Nvidia's rebuttal to AMD simply consists of a "nuh-uh". The article I provided talks with two developers.
http://www.extremetech.com/gaming/1...elopers-weigh-in-on-the-gameworks-controversy
 
Even Mantle got donated to Khronos (the OpenGL people), which I wish all the devs would move to and support.

CDPR will have to use Vulkan (if it's available to them) or OpenGL + OpenCL on other platforms. DirectCompute is Windows-only, so they already don't use it on PS4, for instance.
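To make the platform point concrete, here's a toy sketch (C++, entirely hypothetical; this is not REDengine code and all the names are made up) of how an engine might pick a compute backend at build time, given that DirectCompute only exists on Windows:

```cpp
// Hypothetical compile-time compute-backend selection. Illustrative only;
// it does not reflect how CDPR's engine is actually structured.
#include <cstdio>

enum class ComputeBackend { DirectCompute, OpenCL, Vulkan };

constexpr ComputeBackend pick_backend() {
#if defined(_WIN32)
    // DirectCompute ships as part of Direct3D, so it is Windows-only.
    return ComputeBackend::DirectCompute;
#else
    // Every other platform needs a different path: OpenCL today,
    // Vulkan compute once it becomes available.
    return ComputeBackend::OpenCL;
#endif
}

int main() {
    switch (pick_backend()) {
        case ComputeBackend::DirectCompute: std::puts("Using DirectCompute"); break;
        case ComputeBackend::OpenCL:        std::puts("Using OpenCL");        break;
        case ComputeBackend::Vulkan:        std::puts("Using Vulkan compute"); break;
    }
    return 0;
}
```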
 
Yes, stupid me, expecting a $60 game to run well on my machine when the hardware I'm using belongs to a company that gets locked out. /snip


It is a manufactured problem because the game WILL run well on AMD hardware. You are implying it is a mutually exclusive arrangement; IT IS NOT. These types of manufacturer endorsements happen all the fraking time. There isn't a single manufacturer endorsement that has caused the other manufacturer's cards to stop working or to be unable to run higher settings. You are making up a problem. My R9 290 is going to run Witcher 3 like a fraking dream. I am not going to have a subpar experience because I have an AMD card. Now, if an AMD card is old and weak then yes, it will suck, but if an Nvidia card is old and weak, it too will provide an experience that sucks. You are QQing over an imaginary problem.
 
These types of manufacturer endorsements happen all the fraking time.

It's more than just endorsements. Manufacturers can tweak their drivers for existing games to improve their performance, i.e. cheating: instead of executing shaders as-is, they translate them on the fly into something that works better on their hardware. It's a major problem, because it creates all kinds of uncertainty, and with such behavior you can't rely on the standard with confidence. There is some hope that next-generation APIs like Vulkan will render this practice irrelevant. But until then, it will be commonplace.
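To make that concrete, here's a deliberately toy sketch (C++; every name in it is made up, and no real driver exposes anything like this) of the kind of per-application shader substitution being described:

```cpp
// Hypothetical illustration of per-game shader substitution inside a
// graphics driver. Purely a sketch of the concept; not real driver code.
#include <string>
#include <unordered_map>

// Vendor-maintained table: (application, shader hash) -> hand-tuned
// replacement that the vendor knows runs faster on its own hardware.
static const std::unordered_map<std::string, std::string> kKnownSwaps = {
    {"some_game.exe:0xBEEF1234", "/* vendor-rewritten water shader */"},
};

// What a "smart" driver does when an app submits shader source: instead of
// compiling it as-is (the standards-compliant, "dumb" behavior), it first
// checks whether it recognizes the app and silently swaps in its own code.
std::string select_shader_source(const std::string& app,
                                 const std::string& shader_hash,
                                 const std::string& submitted_source) {
    auto it = kKnownSwaps.find(app + ":" + shader_hash);
    if (it != kKnownSwaps.end()) {
        return it->second;   // the app never learns this happened
    }
    return submitted_source; // honest path: compile what was submitted
}
```

The uncertainty complained about above falls straight out of that lookup: the same shader can behave differently depending on which executable submits it.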
 
It's more than just endorsements. Manufacturers can tweak their drivers for existing games to improve their performance. /snip

Oh for the love of god... The whole point of drivers is to take the optimization of a game one step further. Yes, Nvidia will have an EASIER time optimizing their drivers for The Witcher 3, just like AMD had an easier time optimizing their drivers for Dragon Age: Inquisition, BUT IT DOESN'T render the other manufacturer incapable of making a driver. There is not a single case in the history of gaming where these types of endorsements have made it impossible to play with a competitor's hardware.

YOU ARE MAKING SHIT UP. YOU ARE QQing over an imaginary issue. MY R9 290 is not going to suddenly render the game only in low textures at 800×600 resolution.

Please tell me what makes The Witcher 3 so fraking different that AMD users are fucked?
 
Oh for the love of god... The whole point of drivers is to take the optimization of a game one step further.
No. The whole point of the driver is to be standards-compliant and to do what it's told. When they start twisting shader logic inside out, we get a horrible mess of incompatible implementations. It's the developer's task to work on optimizations, and the driver should be as dumb as possible.

This is not an imaginary issue; it's the view of industry insiders who know what they are talking about and who understand this mess very clearly. Maybe you didn't follow recent developments; Vulkan's developers have talked about this quite a bit.
 
Oh for the love of god... The whole point of drivers is to take the optimization of a game one step further. /snip
It just reached the point where people search for situations they can complain about.
 
The preview version of the game works great on an R9 290X at 2,560×1,440 resolution; here is a rough translation via Google Translate.

The shots were taken at 2,560×1,440 with Ultra details; HairWorks, however, was not enabled for these shots. We also dispensed with the sharpening filter, chromatic aberration, and motion blur. All remaining settings were at the maximum level. We used a Core i7-5820K @ 4.2 GHz, 16 GiB of DDR4-2600 RAM, and a Sapphire R9 290X Tri-X @ 1,125/3,200 MHz. With this setup, the frame rates, even without the day-1 patch, stayed around 50 fps for the most part. Under heavy load, for example when galloping through the most densely populated areas of the villages or through the forest, they sometimes fell to just above 40.

And here is the source.
http://www.pcgameshardware.de/The-Witcher-3-PC-237266/Specials/The-Witcher-3-Screenshots-1159185/
 
No. The whole point of the driver is to be standards-compliant and to do what it's told. /snip

Riiiiggghhhht. That is why drivers come out without fail the day a game is released and have nothing to do with optimization... oh wait...

I have even read driver release notes that say things like "further optimized...", so your fantasy land isn't the reality I live in. Ciri can open a portal to your world if you ask her nicely, because this world you have found yourself in isn't going to conform to your insistence, no matter how emphatic you are.

---------- Updated at 04:50 PM ----------

The preview version of the game works great on an R9 290X at 2,560×1,440 resolution. /snip

And this is without drivers optimized for The Witcher 3. Sounds like I am totally screwed as an AMD user!!! Oh wait...

Isn't it soooooo freaky that I said those with a good AMD card would get a good graphical experience with the game?!? How did I ever predict this? Oh, maybe because it has been like this for the entire history of the two companies!!!
 
Riiiiggghhhht. That is why drivers come out without fail the day a game is released and have nothing to do with optimization.

I'm saying that they should not. But they do twist it now; that's why the same logic used in another application won't work the same way on the same driver. Do you get the drift? They create an environment which optimizes a specific application against the standard (usually because of a private agreement), and all of that is possible because the driver is "too smart". When the driver becomes dumb and the optimization task becomes the developers' responsibility (and shader compilation is strictly standards-conformant across all drivers; see SPIR-V), then all this mess becomes impossible. That's what should happen, but it hasn't yet.
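For what the "dumb driver" model looks like in practice: under Vulkan, developers compile shaders ahead of time into SPIR-V bytecode and ship that, so the driver never sees high-level source to rewrite. A minimal sketch using the shaderc library (assuming shaderc is installed; error handling kept to a minimum):

```cpp
// Minimal sketch: ahead-of-time GLSL -> SPIR-V compilation with shaderc.
// The developer ships the resulting bytecode; the driver just consumes it.
#include <shaderc/shaderc.hpp>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    const char* glsl =
        "#version 450\n"
        "layout(location = 0) out vec4 color;\n"
        "void main() { color = vec4(1.0, 0.0, 0.0, 1.0); }\n";

    shaderc::Compiler compiler;
    shaderc::SpvCompilationResult result =
        compiler.CompileGlslToSpv(glsl, shaderc_fragment_shader, "red.frag");

    if (result.GetCompilationStatus() != shaderc_compilation_status_success) {
        std::fprintf(stderr, "%s\n", result.GetErrorMessage().c_str());
        return 1;
    }
    // The words between cbegin() and cend() are the SPIR-V module that
    // would be handed to vkCreateShaderModule at runtime.
    std::vector<uint32_t> spirv(result.cbegin(), result.cend());
    std::printf("Compiled %zu words of SPIR-V\n", spirv.size());
    return 0;
}
```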
 
It is a manufactured problem because the game WILL run well on AMD hardware. /snip

You are adding your own exaggerations to try to have an argument. You are the one manufacturing a problem, a problem I never claimed. You either get it or you do not, and you clearly do not. Neither I nor anyone else claimed GameWorks would cause AMD cards to "not work". I ask you to do some research on Nvidia's GameWorks before replying. In simpler terms: IF... IF TW3 were to have major problems on AMD cards that require a driver fix, then AMD users will be, as the topic title suggests, left high and dry, and for quite a while. There is also ample evidence of games using GameWorks that strangely underperform on AMD cards. Again, so we are perfectly clear: I am NOT saying the game won't run on AMD cards because of GameWorks; what I am saying is that it may run ***** and driver support may be a long way off.
http://www.extremetech.com/extreme/...surps-power-from-developers-end-users-and-amd
 
Seems it will run well enough on current AMD cards minus HairWorks, but that's too taxing for most Nvidia cards as well. I wonder if the upcoming R9 390X will be able to brute-force even HairWorks.
 