Will TW3 have AMD-specific tech?

BorsMistral said:
Let me start by saying that I don't care about vendor-specific tech at all. The Witcher is a great series and I want it well optimized for any kind of hardware, so that the maximum number of gamers can fully enjoy it and CDP can rake in some serious, well-deserved cash.

That being said, there seem to be a few misconceptions in this thread.

- TressFX is vendor agnostic; it uses DirectCompute. AMD isn't locking it to its own hardware, and the tech runs on nV cards without issues. Oh, and it can do hair, fur and grass just as easily.

- Mantle support would be nice. If done properly you'd probably see a 10% to 20% or greater increase in performance compared to DX11... or at least that's what they say; we'll see when the BF4 patch comes. Again, I'd rather have TW3 itself optimized to run well on all hardware.

- PhysX... let's see... The software version of it, which is what games with PhysX fall back to on non-nV systems, is purposefully hobbled, with the only goal being to boost nV GPU sales (which we may not like, but it is an OK thing to do in capitalism). Also, there are a bunch of alternatives, from Havok to open-source ones like Bullet. In the end, PhysX does look nice and could be considered worth it in some situations, but it's in essence a vendor-centric sales tool.

So CDP, please, less locked technologies and more goodness that everyone can enjoy.

The claim that CPU PhysX is or ever was deliberately crippled is FUD. Spreading FUD about one's competitors is also common behavior in capitalism, but hardly a worthy one.

The use of x87 floating point was a decision that was necessary at the time, because PhysX predates widespread support for the SSE instruction set. It was only removed in recent versions of PhysX, which now implement SSE floating point and proper multithreading.
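For anyone curious why x87 vs. SSE matters at all, here's a rough, purely illustrative sketch in C (this is not actual PhysX code, and the function names are made up). The scalar loop touches one float at a time, which is roughly what x87-style code boils down to, while the SSE version processes four floats per instruction:

```c
#include <immintrin.h>   /* SSE intrinsics */

/* Scalar position update: one float per iteration (x87-era style). */
void integrate_scalar(float *pos, const float *vel, float dt, int n)
{
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

/* SSE position update: four floats per packed instruction. */
void integrate_sse(float *pos, const float *vel, float dt, int n)
{
    __m128 vdt = _mm_set1_ps(dt);
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 p = _mm_loadu_ps(pos + i);
        __m128 v = _mm_loadu_ps(vel + i);
        _mm_storeu_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
    }
    for (; i < n; ++i)   /* leftover elements */
        pos[i] += vel[i] * dt;
}
```

That's the scalar-vs.-SIMD gap people were pointing at; how much it mattered in any given game is a separate argument.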

But since no games with PhysX support are being released with those old versions of PhysX anymore, the old claims are meaningless, and continuing to repeat them is FUD.
 
Alextyc1 said:
Ubisoft Kiev strikes again :p
(The same people who have been making the Assassin's Creed ports since 2011... and some other Ubisoft games.)

I could make a better port on a toaster, with my eyes closed, than those lazy devs...

Actually, AC4 is pretty well optimized. That chart may have been made using the 331.65 drivers. With 331.82 and later, AC4 got some pretty big increases in performance.

I'm playing it right now on my machine, in fact, and I'm getting 60 FPS most of the time with everything maxed out at 1440p. This is with SLI though.
 
PrinceofNothing said:
Actually, AC4 is pretty well optimized. That chart may have been made using the 331.65 drivers. With 331.82 and later, AC4 got some pretty big increases in performance.

I'm playing it right now on my machine, in fact, and I'm getting 60 FPS most of the time with everything maxed out at 1440p. This is with SLI though.

My friend has a quad core and a GTX 460, and he can't get 30 frames in Havana no matter what settings the game is on. The optimization is very poor, even though the game looks great and I've spent 60 hours with it. But not everyone has an SLI PC that can handle Ubisoft Kiev's bad optimization.

What is your SLI setup?
 
Alextyc1 said:
My friend has a quad core and a GTX 460, and he can't get 30 frames in Havana no matter what settings the game is on. The optimization is very poor, even though the game looks great and I've spent 60 hours with it. But not everyone has an SLI PC that can handle Ubisoft Kiev's bad optimization.

What is your SLI setup?

I'm running two Gigabyte Windforce GTX 770 4GB cards on a 3930K overclocked to 4.5 GHz, with 16GB of DDR3-2133.

Drivers have a lot to do with it, I'm telling you, because the 331.82 drivers gave a big increase in performance. And 331.93 has an updated SLI profile.
 
PrinceofNothing said:
I'm running two Gigabyte Windforce GTX 770 4GB cards on a 3930K overclocked to 4.5 GHz, with 16GB of DDR3-2133.

Drivers have a lot to do with it, I'm telling you, because the 331.82 drivers gave a big increase in performance. And 331.93 has an updated SLI profile.

I'll tell him...
Anyway, I have an AMD card :p
 

Aver

Forum veteran
PrinceofNothing said:
I'm running two Gigabyte Windforce GTX 770 4GB cards on a 3930K overclocked to 4.5 GHz, with 16GB of DDR3-2133.

Drivers have a lot to do with it, I'm telling you, because the 331.82 drivers gave a big increase in performance. And 331.93 has an updated SLI profile.

How can you say a game is well optimized when you have graphics cards worth $800? That's more than the cost of the average PC (Steam survey). If something ran badly on a PC like that, it would mean the game is completely broken, especially a game that looks like Black Flag. It would be well optimized if it worked fine on a single $400 card.
 
Aver said:
How can you say a game is well optimized when you have graphics cards worth $800? That's more than the cost of the average PC (Steam survey). If something ran badly on a PC like that, it would mean the game is completely broken, especially a game that looks like Black Flag. It would be well optimized if it worked fine on a single $400 card.
Well, it works fine on a $400 PS4
(which has a graphics core very similar to an HD 7850/7870...)
 

Aver

Forum veteran
M4xw0lf said:
Well, it works fine on a $400 PS4
(which has a graphics core very similar to an HD 7850/7870...)

But we are talking about the optimization of the PC version here. Ubisoft puts very little effort into their PC ports, so they are often broken, unoptimized, or have awful M&K controls.
 
Aver said:
How can you say a game is well optimized when you have graphics cards worth $800? That's more than the cost of the average PC (Steam survey). If something ran badly on a PC like that, it would mean the game is completely broken, especially a game that looks like Black Flag. It would be well optimized if it worked fine on a single $400 card.

I can say the game is optimized, because I'm getting 60 FPS MAXED OUT at 1440p. If the game wasn't optimized, I would not be getting such high frame rates with all the eye candy turned on.

With Borderlands 2, my computer can't even maintain 60 FPS maxed out because the game is broken with PhysX turned on, due to them using that inefficient POS DX9 coupled with the aging Unreal Engine 3.5.

So BL2 obviously isn't well optimized, despite looking far inferior to AC IV.
 
M4xw0lf said:
Well, it works fine on a $400 PS4
(which has a graphics core very similar to an HD 7850/7870...)
No, it would be more the equivalent of a 7790-7850; a 7870 is quite a bit better than a 7850.
 
RageOrb said:
No, it would be more the equivalent of a 7790-7850; a 7870 is quite a bit better than a 7850.

Architecturally, it is more a 78xx (Pitcairn) equivalent and definitely not a 77xx (Cape Verde) or 7790 (Bonaire). With 18 compute units, it falls between the 7850 (16 CU) and the 7870 (20 CU). It is clocked slower than either, at 800 MHz, so it should perform close to the 7850 (860 MHz).

The difference may be significant for engines that burden the output processors (as TW2's Red Engine does), because the Pitcairn architecture has 32 ROPs vs. 16 for the Cape Verde and Bonaire. I for one would not be surprised to see TW3 perform better on the PS4 vs. the Xbone.
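Just to put rough numbers on that (back-of-the-envelope only, using the commonly quoted specs, not benchmarks), this is the kind of quick shader-throughput math behind the comparison:

```c
#include <stdio.h>

/* Peak single-precision throughput for a GCN part:
   CUs * 64 lanes per CU * 2 ops per FMA * clock (MHz) / 1000 = GFLOPS.
   The specs below are the commonly cited ones; treat this as a rough
   comparison, not a benchmark. */
static double gcn_gflops(int compute_units, double clock_mhz)
{
    return compute_units * 64 * 2 * clock_mhz / 1000.0;
}

int main(void)
{
    printf("HD 7850 (16 CU @  860 MHz): %.0f GFLOPS\n", gcn_gflops(16,  860.0));
    printf("PS4     (18 CU @  800 MHz): %.0f GFLOPS\n", gcn_gflops(18,  800.0));
    printf("HD 7870 (20 CU @ 1000 MHz): %.0f GFLOPS\n", gcn_gflops(20, 1000.0));
    return 0;
}
```

Which is why it lands just above the 7850 on raw shader throughput, before you even get to the ROP difference.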
 
My awesome Geralt will most likely kill those wolves too quickly for me to notice their fur.

Just saying.
 