DirectX 12 ready?

@Guy N'wah: It's not really a hypothetical question anymore, since the Linux market is already developed and growing. Efficiency here isn't about how many games they can sell: there is enough demand for Linux and OS X games to justify developing for those platforms (and there is of course demand from developers for high-quality engines there, where REDengine could become quite attractive). Efficiency here is more about the accessibility and quality of the tools and APIs. When DX is significantly easier to use, it's more efficient for development; and if OpenGL is too inconsistent across targets and lacks good tools, it's inefficient, i.e. it requires much more effort for a comparable result. So improvement in that area increases efficiency and makes it an obvious choice.

As many developers have said, they use DX not because MS pays them money or because they care about Windows; on the contrary, they admit that it limits their possibilities. They use it because it does the job and does it efficiently. If glNext can offer them the same, they'll ditch DX easily.
 
Personally, I don't mind whether they do it or not.

If what Microsoft says is true, it would be a great improvement... but let's be realistic: right now most PC games are DX9, and only a few of them DX11. I'm not sure, but I think DX12 will only be available on the new Windows OS, which means making a game compatible with three different DX versions.

So I think most games will surely not use it, and I'd rather the game just work than be an experiment for an OS that is still in beta. Improvements are good, but being practical is good too.
 
GDC is coming and then we'll see what glNext and DX12 are all about. It will be very interesting to see what comes out of it.

Competition is good, and it was about time MS got serious about this as well. It's a smart move on MS's end to make Win 10 a free upgrade: more people will upgrade, thus more incentive to use DX12.

Will W3 see a DX12 update? I don't believe it will, but CP77 could benefit from it, since it's probably still at least two years out.
 
Will there be any effort to make the game take advantage of DirectX 12 once it's out?

Found this link today showing some advantages of DX12 on the CPU front:
http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/3

I recently bought an R9 270 4GB and don't wish to buy another card after just about a year. And I'm sure I'll have to run the game at less than max settings, which I obviously don't want to. It would help a bunch for people in a similar predicament if Witcher 3 got post-launch support for DX12.

So my question is - is it possible to upgrade the game for DX12 easily enough to make the effort worth the time? Or does DX12 work in such a way that the game will simply be able to take advantage of it when it comes out without the need for any additional work by the developers?

Oh, and what about Mantle?

From what I've read so far, migrating to DX12 will not be an easy task. It's a different API and will require no small amount of work to make use of it.

Also, from what has been shown so far, DX12 helps with CPU-bound games; if TW3 is GPU bound, there might not be much difference.
Of course it remains to be seen what other improvements there will be, but I still don't think it would make sense to do this for TW3.
 
Also, from what has been shown so far, DX12 helps with CPU-bound games; if TW3 is GPU bound, there might not be much difference.

An open-world game like W3 could easily be both GPU and CPU bound. The same is true, for example, of the Assassin's Creed games: they can be limited by both CPU and GPU depending on the situation and settings.

The more CPU power you have, the more objects you can bring on screen at the same time (combined with their scripted or dynamic behaviour/AI); the more GPU power you have, the better those objects can look.

Together with RTS games, open-world games are probably the genre that profits most from more efficient CPU processing. For Witcher 3 it's very likely coming too late, but for any upcoming game of its kind in 2016 or beyond, DX12 will probably be a very positive development, offering a more detailed and lifelike world at more fluid framerates without severe CPU limitations.
 
Personally, I don't mind whether they do it or not.

If what Microsoft says is true, it would be a great improvement... but let's be realistic: right now most PC games are DX9, and only a few of them DX11. I'm not sure, but I think DX12 will only be available on the new Windows OS, which means making a game compatible with three different DX versions.

So I think most games will surely not use it, and I'd rather the game just work than be an experiment for an OS that is still in beta. Improvements are good, but being practical is good too.

There haven't been DX9-only games for a while now; games have been developed on DX10/11 for almost five years. Only really old games still run on DX9, and upcoming new titles will probably run on DX12 within a few months, but I think it's a bit early for The Witcher 3, since we don't know when DX12 will be released yet and most users won't upgrade to Windows 10 soon.
I would love the game to support DX12, but it won't change my pleasure in playing it anyway.
 
I might have misread, but what I "got" from one of the DX12 previews was that there were significant advantages to running DX11 code under DX12, and it wasn't limited to rewritten or optimised code (though obviously it helps somewhat if there are specific optimisations for the new API).

So while W3 may not be optimised for DX12, it may still be advantageous to run under Win10/DX12 once that is available.
 
Do you have any source for that?

Intel's demo at SIGGRAPH seemed to show that, but I don't think it really did. What it showed is that it's possible, at least at the level of a demonstration, to have a program that switches between DirectX 12 and DirectX 11 on the fly. But I doubt it was done just by substituting the DLL linkage; I suspect they had different pipelines for each mode.

Some developer commentary suggests that it's not their intent to provide a "DX12 under DX11" mode. I think it will look more like a DX11 compatibility layer for existing DX11 code, and a native DX12 layer for new DX12 code. There are too many things you simply want to do differently in DX12.

"Two big ones in terms of API changes are much lower overhead and being able to pre-compile more pieces of what you need to do up front so you can then cheaply use them on the fly with pipeline state objects when required. Another aspect is multi-threading. Having more cores running more threads at lower frequencies (power states) is much more power efficient than running the same code on one thread really fast. So, multithreading in DirectX 12 is another one of the things that will let developers get great power benefits. Also, in DirectX 12, the application takes more responsibility for ensuring correct rendering and it can do that more efficiently than the Intel driver could in the past since the application knows what it needs to do in essence whereas a driver needs to be more general and conservative."

[Andrew Lauritzen talking to Steve Waskul, http://waskul.tv/portfolio/steve-waskul-andrew-lauritzen-2014/]
 
Yeah, I thought maybe DX12 would provide a compatibility shim for DX11 but implement it differently underneath (someone suggested glNext could do a similar thing for OpenGL 4), but given how different they are, it's not that easy.
 
We have to keep in mind that it takes a while for developers to become completely familiar with DX12, so we probably won't see a sudden mass migration from DX11 to DX12 in games for some time. Even if CDPR wanted to turn The Witcher 3 into a DX12 game, they would need at least several months just to get familiar with it.
There is a small possibility that a few DX12 games are released sooner than expected, from the developers who make games exclusively for Microsoft, if they had access to it earlier than others.
 
While we're talking about tech, anyone know if the Witcher 3 is incorporating physically based rendering into RedEngine?
 
While we're talking about tech, anyone know if the Witcher 3 is incorporating physically based rendering into RedEngine?

Not really a DirectX 12 question, but the answer is yes. John Mamais stated, with specifics, that they are using PBR.

"The stuff we're using that's really cool for next-gen is dynamic IBL," he continues, referring to image-based lighting. "We're using PBR (physically-based rendering), and water simulation's really interesting. It basically reacts to the weather conditions so you'll get choppier waves in wind."

Tech Analysis: The Witcher 3: Wild Hunt
 
Yep, it's physically based; couldn't you tell by looking at the screenies? :p

I was looking very closely at the screenies and could see some things indicative of PBR, like image-based specular maps on leather, swords, and so forth, but other things don't seem to be present, like subsurface scattering or indirect lighting. When I did see things that looked like SSS, they seemed to be baked into the diffuse, like the little red ridges underneath the cheek in the Zoltan screenshot.

To me, this is full utilization of physically based rendering. REDengine is just on the cusp of it, but the 14-minute demo footage isn't quite there.

[video=youtube;VwpjZ-JGXE4]https://www.youtube.com/watch?v=VwpjZ-JGXE4[/video]
 
Baked or not, I can't say, but I'm happy as long as real SSS is there instead of a faked one that makes all skin look like that of a corpse (Geralt notwithstanding). Showing off materials like metal and leather is one of the key ways games boasting of PBR are demonstrating it right now, because it's the easiest way, but PBR applies to a lot more than that.
 