Forums - CD PROJEKT RED

Should Witcher 3 support DX11 multithreading?

prince_of_nothing

Forum veteran
#1
Feb 18, 2014
For those that aren't familiar with DX11 multithreading: it's a performance feature of DX11 that parallelizes the recording of rendering commands across CPU threads, and it relies on the GPU drivers to work. Currently, NVidia is the only IHV that supports this feature; AMD and Intel do not.
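What "parallelizes rendering" means here can be sketched in a toy model. This is Python with made-up names, not the real D3D11 API: in D3D11 terms, the per-thread lists would be deferred contexts recording command lists, and the single submitting thread would be the immediate context.

```python
# A toy model of what DX11 "multithreading" (deferred contexts) changes.
# Names here are illustrative, not the real D3D11 API: each worker thread
# records draw commands into its own command list, and only one thread
# ever talks to the "GPU" by submitting the finished lists in order.
from concurrent.futures import ThreadPoolExecutor

def record_command_list(scene_chunk):
    """Runs on a worker thread: records commands, no GPU access needed."""
    return [f"draw({obj})" for obj in scene_chunk]

def render_frame(scene_chunks):
    # Recording is parallel across threads (the CPU-side win)...
    with ThreadPoolExecutor() as pool:
        command_lists = list(pool.map(record_command_list, scene_chunks))
    # ...but submission stays on a single "immediate context" thread,
    # in a fixed order, which is why the driver matters so much.
    gpu_stream = []
    for cl in command_lists:
        gpu_stream.extend(cl)
    return gpu_stream

chunks = [["tree", "rock"], ["geralt"], ["hud"]]
print(render_frame(chunks))
```

The recording step is where the CPU bottleneck gets spread across cores; the final submission is still serial, which is why driver quality decides how much of the win actually materializes.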

Few games use it because of that, but when they do, it seems to be very effective at reducing CPU bottlenecks in the rendering process and increasing performance. Two games I know for a fact support it are Civilization V and Assassin's Creed III, and both perform much better on NVidia hardware as a result. Project CARS, due out this year, also supports DX11 multithreading, and even though the game isn't out yet, you can see that NVidia has a sizeable lead over AMD, mostly due to that feature:


As you can see in the video, the GTX 660 is roughly 55% faster than the HD 7850. That's not to say that all games with this feature would see such a large performance gain, though, as it only increases performance in CPU-limited situations. Anyway, since the Witcher 3 will be sponsored by NVidia, I think it has a good chance of supporting DX11 multithreading.

A massive game like the Witcher 3 will be both CPU and GPU intensive, and the fact that there are no loading screens compounds the matter as data has to be loaded and rendered very quickly to not break the feeling of immersion that unbroken continuity brings. Witcher 3 is a prime candidate to be a beneficiary of this technology, so we should petition CDPR to support it, unless they already are! :D
 
GuyNwah

Ex-moderator
#2
Feb 18, 2014
Real product managers and development managers do not let customer calls for specific technologies drive their decisions. They have to do what is best for the product based on everything they know about it and many things we do not know. It is also far too late in the development process to be introducing new technology.

So either they are or they are not using this already, and nothing we can say will change it. The petition thread may be used to discuss merits of the technology, but it would not be constructive to use it for what Americans call armchair quarterbacking.

The merit of DirectX 11 multithreading is far from proven. AMD chose not to implement it for the good, cruel reason that it isn't compatible with their GPU architecture. Some game engine developers chose not to implement it for the equally good reason that it makes their engines run slower, by asking the GPU to do operations that the CPU is better equipped to do.
 
Last edited: Feb 18, 2014
sidspyker

Ex-moderator
#3
Feb 18, 2014
prince_of_nothing said:
so we should petition CDPR to support it, unless they already are! :D
Production of the game has finished, though; I don't know whether they can still experiment with the renderer at this point, or have the resources to, unless it's a relatively quick and painless process, so to speak.
 
prince_of_nothing

Forum veteran
#4
Feb 18, 2014
Guy N'wah said:
The merit of DirectX 11 multithreading is far from proven. AMD chose not to implement it for the good cruel reason that it isn't compatible with their GPU architecture. Some game engine developers chose not to implement it for the equally good reason that it makes their engine run slower, by asking the GPU to do operations that the CPU is better equipped to do.
Really? I thought the reason they didn't implement it was sheer incompetence. I read that it's very difficult to do because the entire driver code base has to be changed. I know it took NVidia roughly two years, and even after they released drivers with that function enabled, it only worked in one game, and that was Civ 5.

They've got it working pretty well now though, by all reports.
 
prince_of_nothing

Forum veteran
#5
Feb 18, 2014
sidspyker said:
Production of the game has finished though, I don't know if now they can experiment with the renderer or have the resources unless it's a painlessly quick process so to speak.
Wasn't there a post recently which stated they were still working on the renderer?
 
GuyNwah

Ex-moderator
#6
Feb 18, 2014
AMD has stated they don't like it because it doesn't go far enough. It requires multiple DX11 command queues, which they don't have, and they believe (with good reason) that Mantle will do the same thing only better. They evaluated the CPU scaling of DX11 multithreading and concluded it bottlenecks on the CPU. (Naysayers may contend that everything bottlenecks on AMD CPUs, but I don't think that's either true or relevant.)

prince_of_nothing said:
Wasn't there a post recently which stated they were still working on the renderer?
I'd be surprised if they weren't tweaking the renderer right up to release. But if they have not done this already, it is a massive change.
 
Last edited: Feb 18, 2014
prince_of_nothing

Forum veteran
#7
Feb 18, 2014
Guy N'wah said:
AMD has stated they don't like it because it doesn't go far enough. It requires multiple DX11 command queues, which they don't have, and they believe (with good reason) that Mantle will do the same thing only better. They evaluated the CPU scaling of DX11 multithreading and concluded it bottlenecks on the CPU. (Naysayers may contend that everything bottlenecks on AMD CPUs, but I don't think that's either true or relevant.)
I recall some posts from an AMD dev on the Rage3D forums stating that they tried DX11 multithreading, but it gave them little or no performance increase. In fact, there is an interview with Richard Huddy on Bit-Tech in which he states that DX11 multithreading only yields about a factor-of-two increase in draw call submission, which we know is not true going by NVidia's results in Civ 5; NVidia got much higher than that.

You can read that interview here

That interview was three years ago, so even then, AMD apparently had intentions to ditch DirectX and go with a lower level API, aka Mantle.

But just because it wasn't working for AMD, doesn't mean the technology is broken or ineffective. It works very well on NVidia hardware, arguably just as effective as Mantle at reducing CPU bottlenecks. AMD seems to have inefficient D3D drivers. The numerous Mantle reviews exposed that, when they benchmarked NVidia cards for comparison:



That's from the Tech Report review: the GTX 780 Ti is faster than the R9 290X even when the 290X is on the Mantle path (and even when paired with the 7850K, in D3D), which implies that NVidia has very efficient D3D drivers.

I'd be surprised if they weren't tweaking the renderer right up to release. But if they have not done this already, it is a massive change.
Yeah you're probably right. I just hope they did it, as the Witcher 3 is such a huge game that it would really benefit from the technology I think.
 
GuyNwah

Ex-moderator
#8
Feb 18, 2014
Certainly you're right that DX11 multithreading didn't work well for AMD. That doesn't mean their performance claims for it are untrue; they're true for their architecture, and nVidia sees different results on theirs. I think it's part of their reasoning around Mantle: they have something they believe will get better results, at least on their architecture.

Since it works well for some applications on nVidia, it's certainly not broken. But it is not proven that it translates to general applicability. User-oriented reviews based on a small number of games aren't by themselves meaningful. The technology has to be attractive to developers across a variety of game engines for it to warrant adoption.
 
darcler

Senior user
#9
Feb 18, 2014
prince_of_nothing said:
Wasn't there a post recently which stated they were still working on the renderer?
They have stated some time ago that the renderer is feature complete, and they are working on tweaking it and fixing bugs. Don't remember the source, though.
 
loupblanc91

Forum regular
#10
Feb 18, 2014
Can someone explain to me what is interesting about this tech?
Why would we want something that turns CPU load into GPU load, when on a PC we are always GPU limited, not CPU limited? Especially since a good GPU is way more expensive than a good CPU.
 
SercaNesrin

Senior user
#11
Feb 18, 2014
Mantle = innovation. With a little more work done on it, I can see and understand its benefits over DirectX. Although I now have a GTX 780 "to see how PhysX changes gameplay", I really respect AMD for the past 6 years. My CPU is an 8-core AMD FX-8350 clocked at 4GHz.
 
GuyNwah

Ex-moderator
#12
Feb 18, 2014
loupblanc91 said:
Can someone explain to me what is interesting about this tech?
Why would we want something that turns CPU load into GPU load, when on a PC we are always GPU limited, not CPU limited? Especially since a good GPU is way more expensive than a good CPU.
Here is why it is interesting. Modern CPUs and GPUs are little supercomputers with number-crunching ability we could only dream of having on our desktops as recently as eight years ago.

CPUs are good at doing a lot of different things at once. Multiprocessing and threading are ways of taking advantage of this; it lets you do things like load the resources (objects like textures and meshes and maps) for cells that aren't visible yet and let Windows do its housekeeping without slowing down your game.

GPUs are good at doing one thing across a big complex scene really fast. But a lot of time is spent sending commands and resources to the GPU. In DirectX 9 and 10, only one thread communicates with the GPU at any time. You can't load the instructions to compute the next scene while interacting with the GPU to render the current scene.

DirectX 11 and Mantle change that by allowing multiple CPU threads to prepare work for the GPU. Program threads can be loading stuff without tying up the GPU, and the GPU can execute commands without interacting with the thread that loaded them. Done right, this gives you a way to use a lot more of the GPU over longer jobs. (Done wrong, it makes everything worse: unstable, slow, and deadlock-prone. Threaded programming is an art that many programmers are NOT masters of.)

Mantle goes DirectX 11 one better by making the relationship between CPU threads and command lists flexible. It makes it possible to have several threads setting up resources and both setting up and executing commands. I suspect this is a reason AMD took a pass on implementing DX11 multithreading: their engineers feel, with some justification, that they have already done better and do not wish to incur the high cost without benefit that implementing an inferior feature would be.
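The "loading resources for cells that aren't visible yet" part of the above can be sketched in a toy producer/consumer model. Everything here is made up for illustration (Python stand-ins, not a real engine API): a streaming thread loads data in the background while the main loop keeps running with whatever has already arrived.

```python
# A toy sketch of the decoupling described above: one thread streams in
# resources for cells the player can't see yet, while the main loop keeps
# rendering with whatever has already arrived. All names are made up.
import queue
import threading

loaded = queue.Queue()

def stream_resources(cells):
    """Runs on a background thread: pretend each put() is slow disk I/O."""
    for cell in cells:
        loaded.put(f"meshes+textures for {cell}")

def main_loop(frames):
    ready = []
    for _ in range(frames):
        while True:          # drain whatever finished loading; never block
            try:
                ready.append(loaded.get_nowait())
            except queue.Empty:
                break
        # ... render the current scene using `ready` here ...
    return ready

loader = threading.Thread(target=stream_resources, args=(["swamp", "city"],))
loader.start()
loader.join()                # in a real game the loop runs concurrently
print(main_loop(frames=1))
```

The key design point is that the main loop only ever polls; it never waits on the loader, which is what keeps streaming from causing hitches (or loading screens).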
 
Last edited: Feb 18, 2014
shawn_kh

Rookie
#13
Feb 18, 2014
Intel CPUs have always been superior to AMD CPUs. With both the PS4 and Xbox One using AMD CPUs, I thought that trend would change this gen, but Intel CPUs are still superior thus far. There is still time for AMD to step up their game, though. Meanwhile, I'm going to get an Intel CPU again.
 
prince_of_nothing

Forum veteran
#14
Feb 18, 2014
Guy N'wah said:
DirectX 11 and Mantle change that by allowing multiple CPU threads to prepare work for the GPU. Program threads can be loading stuff without tying up the GPU, and the GPU can execute commands without interacting with the thread that loaded them. Done right, this gives you a way to use a lot more of the GPU over longer jobs. (Done wrong, it makes everything worse: unstable, slow, and deadlock-prone. Threaded programming is an art that many programmers are NOT masters of.)
A good summary, Guy, but from what I understand, the whole point of DX11 multithreading was to take the burden of multithreading off the shoulders of the developer (the developer doesn't have to thread anything, as the driver does that automatically; they just have to set it up right) and place it on both the API and the GPU driver, especially the latter. That's why it took NVidia so long to come out with usable drivers: they had to refactor practically their entire driver codebase.

As we discussed earlier, AMD wasn't seeing any improvement so they gave up, but I don't think it's due to their architecture. DX11 multithreading is all done in software, and the drivers are absolutely critical for it to work properly. For whatever reason, AMD could not get the drivers working and, in the end, decided to abandon it and go full bore down the Mantle path.

We'll see whether it pays off for them in the long run.

Mantle goes DirectX 11 one better by making the relationship between CPU threads and command lists flexible. It makes it possible to have several threads setting up resources and both setting up and executing commands. I suspect this is a reason AMD took a pass on implementing DX11 multithreading: their engineers feel, with some justification, that they have already done better and do not wish to incur the high cost without benefit that implementing an inferior feature would be.
DX11 multithreading does much the same thing, though, except the greatest burden is placed on the GPU driver instead of the API. I don't think Mantle can survive without other IHVs supporting it, but it seems to be vendor-locked to GCN hardware only at this time, though that may change.
 
prince_of_nothing

Forum veteran
#15
Feb 18, 2014
loupblanc91 said:
Can someone explain to me what is interesting about this tech?
Why would we want something that turns CPU load into GPU load, when on a PC we are always GPU limited, not CPU limited? Especially since a good GPU is way more expensive than a good CPU.
Guy has a good summary of what the technology does. And PC games are not always GPU limited. A big game like the Witcher 3 can easily become CPU limited due to how detailed a scene is, and how long the draw distance is. From the short snippets of gameplay, it appears that the Witcher 3 has both a VERY detailed setting, and the draw distances are amazing!

So while it's too early to tell at this point, it could very well be CPU limited in certain areas.
 
GuyNwah

Ex-moderator
#16
Feb 18, 2014
@prince_of_nothing
DX11 multithreading doesn't get the developer out of having to be thread-conscious. It lets you record rendering commands from multiple threads via deferred contexts; while that's a big performance improvement, it just moves the thread-sync issues around without getting rid of them. Something still has to track the completion of threads and order the actual execution of the deferred command lists. Because the order between those lists is an application issue, I don't think it can be solved to anybody's satisfaction at the driver level.
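The ordering point can be made concrete with a toy sketch (purely illustrative Python, no real API names): recording can finish in any order across worker threads, but execution must still honor the sequence the application intended, so something has to tag and sort the finished lists.

```python
# A toy illustration of the ordering problem: deferred command lists can
# *finish recording* in any order, but they must be *executed* in the
# order the application intended, so someone has to track sequence
# numbers. Purely illustrative; no real API names.
import random

def finish_out_of_order(jobs):
    """Simulate worker threads completing (seq, commands) jobs unpredictably."""
    done = jobs[:]
    random.shuffle(done)
    return done

def execute_in_order(done):
    stream = []
    for seq, commands in sorted(done):  # the sync point the driver can't
        stream.extend(commands)         # decide for you: app-defined order
    return stream

jobs = [(0, ["clear"]), (1, ["draw terrain"]), (2, ["draw ui"])]
print(execute_in_order(finish_out_of_order(jobs)))
```

However the shuffle comes out, the executed stream is always clear, then terrain, then UI; that reordering step is the application-level responsibility that doesn't go away.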

AMD claims their implementation in Mantle is different and more flexible, allowing thread to renderer relationships that don't exist in DX11. Since public information about Mantle is scarce and probably will be for some time, I'm going to err on the side of taking their word for it, until there's clearer evidence that I'm wrong.

I'd say Mantle needs a "killer app" to survive. Support won't be forthcoming unless there's something that can be sold for enough to justify a high-risk venture. I've caught a lot of flak for saying Mantle is so closely tied to GCN that no other GPU vendor would implement it, but I still don't see anything to the contrary that has a better foundation than bare assertions or PR-speak.
 
Last edited: Feb 18, 2014
prince_of_nothing

Forum veteran
#17
Feb 18, 2014
sercan said:
Mantle = innovation. With a little more work done on it, I can see and understand its benefits over DirectX. Although I now have a GTX 780 "to see how PhysX changes gameplay", I really respect AMD for the past 6 years. My CPU is an 8-core AMD FX-8350 clocked at 4GHz.
Mantle does the exact same thing as DX11 multithreading, except that the burden for threading is placed solely on the API, and the developer has more explicit control over how threads are used. With DX11 multithreading, the threading is handled automatically by the driver I believe. The developer just has to set up their engine for it and determine how many threads are going to be used by the driver for rendering.

*Edit* It appears that the developer still has a lot of responsibility for handling threading in DX11 multithreading.
 
Last edited: Feb 18, 2014
prince_of_nothing

Forum veteran
#18
Feb 18, 2014
Guy N'wah said:
@prince_of_nothing
DX11 multithreading doesn't get the developer out of having to be thread-conscious. It lets you record rendering commands from multiple threads via deferred contexts; while that's a big performance improvement, it just moves the thread-sync issues around without getting rid of them. Something still has to track the completion of threads and order the actual execution of the deferred command lists. Because the order between those lists is an application issue, I don't think it can be solved to anybody's satisfaction at the driver level.
Well, it will be interesting to see where this technology goes, if anywhere, with the next version of DirectX. NVidia has put all this work into refactoring their drivers for it; it would be a shame to see all of that work go to waste if Microsoft abandons that model and goes with something closer to Mantle.

Perhaps the dependencies and inefficiencies could be sorted out in an improved version.

AMD claims their implementation in Mantle is different and more flexible, allowing thread to renderer relationships that don't exist in DX11. Since public information about Mantle is scarce and probably will be for some time, I'm going to err on the side of taking their word for it, until there's clearer evidence that I'm wrong.
I think AMD is supposed to publish the official spec sheet for Mantle sometime in the first quarter this year.
 
GuyNwah

Ex-moderator
#19
Feb 18, 2014
Yeah, where DirectX "Next" goes is, as of now, anybody's guess. If it even exists, it's a fair guess that AMD and nVidia have received the advance specs for it. AMD doesn't seem to want it, and it seems to have missed the train for Maxwell.

OpenGL would probably be a better solution that would be genuinely in common across all platforms, but multithreading with OpenGL's asynchronous model is its own can of worms, and getting AMD, Apple, Intel, Khronos, Microsoft, and nVidia to agree on anything is truly herding cats.
 