DirectX 12 ready?

Maybe if Windows 10 allowed you to pay off the retail price in monthly installments (with interest of course), that would be better for me. Otherwise I'm gonna have to save up for months just to afford the upgrade.

Just get yourself a Win8 licence which you can get for less than 50 bucks quite regularly and you'll be able to get Win10 for free once it's out. ;)
 
Low-end CPUs should see the biggest benefit from APIs like DX12 and Mantle.
On the other hand, companies like EA or Ubi deliberately don't optimize their games for this CPU category.

Buzzword and hype!
 

DX 12 is not out yet, so there's nothing to optimize for there. Planning for the future, yes, but nothing will be released in the short term, more like 2016.
As for Mantle, it requires additional development time and effort and if they haven't planned that into the development process, I don't see them adding it.

I doubt any publisher would want their games to run like crap, so I hope that Ubisoft learned from the AC:U performance issues. DX 12 and Mantle could help in this regard, but they need to gain traction, and developers need to get familiar with them.
 
DX12 has already been released to developers. So developers who have the inclination are already doing revision and optimization for it. You're still probably right that there won't be any significant DX12 applications this year.

The greater question with Mantle may be whether it is seen as redundant with DX12; if it's seen as redundant and not facilitating enough sales to customers who can't go Windows 10, there may not be much use for it.

And I think ancient76 is right that this will carry the greater benefit for lesser CPUs. These (AMD A-series and FX-4's, Intel Celeron, Pentium, and Core i3) are the ones most constrained by threading limitations in earlier DirectX and OpenGL.
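To make the threading point concrete, here's a rough C++ sketch of the DX12-style submission model being discussed: several CPU threads each record their own command list, and one thread submits them all in a single cheap call. Everything here (device setup, thread count, the empty recording lambdas) is simplified and assumed for illustration, not taken from any shipping engine.

```cpp
// Minimal sketch of parallel command-list recording in D3D12.
// Error handling, PSOs, and actual draw calls are omitted on purpose;
// this only shows why weaker CPUs benefit: recording is spread across cores,
// while submission stays one small call on one thread.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    const unsigned workerCount = 4; // assumed: one recording thread per spare core
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);

    for (unsigned i = 0; i < workerCount; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Each worker records its slice of the frame independently.
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < workerCount; ++i)
    {
        workers.emplace_back([&, i]
        {
            // Real code would bind a root signature/PSO and issue draws here.
            lists[i]->Close();
        });
    }
    for (auto& t : workers) t.join();

    // Submission is still a single call from one thread.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());

    return 0;
}
```

In DX11, by contrast, the equivalent work funnels through the immediate context on one thread, which is exactly where a Celeron, Pentium, or i3 runs out of headroom first.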
 
Anand had a very interesting preview of DX 12 that shows there is clearly a lot of potential. Of course, mileage will vary depending on the game, but it still looks very promising.

Definitely the high profile developer studios will take advantage of this, so roll on GDC and let us see what they've been cooking so far. :)
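As a back-of-envelope illustration of why those previews focus on draw-call throughput: the per-call CPU costs below are made-up round numbers, not figures from Anand's article or any real driver, but they show how lower per-draw overhead plus multi-core recording changes the budget.

```cpp
// Hypothetical draw-call budget arithmetic; the per-draw costs are invented
// round numbers used only to illustrate the shape of the argument.
#include <cstdio>

int main()
{
    const double frameBudgetMs     = 1000.0 / 60.0; // ~16.7 ms per frame at 60 FPS
    const double dx11CostMsPerDraw = 0.040;         // assumed single-threaded cost per draw
    const double dx12CostMsPerDraw = 0.010;         // assumed lower per-draw overhead
    const int    recordingThreads  = 4;             // DX12 can spread recording across cores

    // If the CPU spent the whole frame just issuing draws, the ceilings would be:
    double dx11Ceiling = frameBudgetMs / dx11CostMsPerDraw;
    double dx12Ceiling = (frameBudgetMs / dx12CostMsPerDraw) * recordingThreads;

    std::printf("DX11-style ceiling: ~%.0f draws per frame\n", dx11Ceiling);
    std::printf("DX12-style ceiling: ~%.0f draws per frame\n", dx12Ceiling);
    return 0;
}
```

The absolute numbers mean nothing; the point is that the headroom scales with both the cheaper calls and the number of cores you can record on, which is why a slow quad-core gains more than a fast one.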
 
Low-end CPUs should see the biggest benefit from APIs like DX12 and Mantle.

High end rigs with multi GPUs (and even single elite cards) should see a nice increase as well. One of the biggest issues with SLI and Crossfire is the tendency to become CPU bound, as both of them dramatically increase GPU performance.

On my rig, with G1 GTX 970 SLI, I am CPU bound at times with an overclocked Core i7 4930K @ 4.5GHz, and I game at 1440p with high IQ settings. I actually returned my G1 GTX 970s to Newegg today for a refund, and I intend to buy either two GTX 980s, or a single Titan X if it becomes available soon.

I think there's a good chance that CDPR may release either a DX12 or DX11.3 performance update for The Witcher 3 by the end of the year. I know they are looking into it. Not sure about Mantle though. Mantle's lifespan on Windows, and even Linux, seems like it's on a countdown. Valve is supposed to show us GLNext at GDC this year, so there probably won't be any platform left for Mantle to gain a foothold in.
 

There's nothing to optimize for!!??

AC:U doesn't run badly because there are no APIs like DX12. It runs badly simply because it isn't optimized. This game is just one example.
And DX12 or Mantle means more work for developers - something that they don't want.
Nobody will spend more money and time just to make their game run 3-4 frames faster with a new DX. I have explained previously that the PC isn't a closed platform - you can always buy better hardware for their unoptimized game. And this is good for hardware manufacturers.

This is how things work.
 

You might have misunderstood what I meant.
Games that are currently in development might be looking into DX 12 and making their engines able to use it, but since it's not officially out yet and the feature set is not completely known, things might change at any time. I doubt games that are aiming to release this year will specifically target DX 12.
They might in the future, but I doubt it will be by the time it's released.
We don't even know which features of DX 12 will require new GPUs, and we can't expect developers to use those if there's still no hardware for them. One year from now it will most likely be a completely different story.

As for AC:U, it was badly optimized and Ubisoft is to blame for that. Could a lower-level API like DX12/Mantle help with that? Maybe yes, maybe not, maybe it could even have helped with a lot more than 3-4 frames, I don't know, but here we have a situation where the game definitely needed more time, yet they still released it.
So in this regard, the delay of W3 is a good thing; better to wait a bit longer and release a great game than hurry and release a broken mess.

In the long run many will spend money and time to use DX 12 or glNext, it doesn't matter which, especially if it allows better graphics on the same hardware, because people care about graphics and many games depend on this to sell.
Unless developers have some kind of deal with Nvidia or AMD, I don't see why they would cripple their games on purpose.
Of course not all games would need DX 12, the benefits will vary, if the game is not very graphically intense or they are not familiar with DX 12, they could easily build it the way they already know.

We don't need to buy a new GPU every year like some years back; unless you are going for 4K resolutions, a moderate setup will be more than enough for a lot of people.
 
I think AC Unity gets a lot of undeserved flak for being unoptimized. From a CPU perspective, it had excellent optimization. It uses all 6 of my cores, and I'm able to easily sustain 60 FPS at 1440p, maxed settings with FXAA, which is impressive because the game is rendering so much detail. If the game were really that unoptimized, sustaining 60 FPS would be either insanely difficult or impossible.

My biggest issue with AC Unity is that I think the engine doesn't handle draw distance as well as it should. LoD transitions and pop-in in particular can be jarring and are too noticeable.

It will be interesting to see how the Witcher 3 compares with AC Unity in terms of LoD and pop-in, as the Witcher 3 will have similar density and detail.
 

I'm gonna need some proof of those claims, because if that's true then you're the only person in the world who gets performance like that in that broken game.
 

1440p at 60 FPS maxed with FXAA?
Why am I finding that extremely hard to believe?
 
Off topic and out of order. We are not going into any arguments over AC Unity. If you had read prince_of_nothing's other posts, you would have the answer you claim to seek and not have to post aggressive challenges like that. At the risk of stealing his thunder, he's running a pair of SLI GTX 970's. Performance such as he claims is entirely credible, and your challenges need no rebuttal. Thank you for not pursuing them.
 
Yep, like Guy said, that's with GTX 970 SLI with an overclocked 4930K driving them. Anyway, I don't have the GTX 970s anymore, I returned them to Newegg for a refund, and I'm thinking about a pair of 980s instead.

But back on topic. The last bit of gameplay we saw for the Witcher 3 had a 60 FPS YouTube version. I'm curious, does the source material for a 60 FPS YouTube video have to be at 60 FPS as well for the video to play at that speed? Some guy on another forum told me that the source material didn't have to be at 60 FPS for them to do that, which I didn't find very believable.

If it's not true, then it's conceivable that CDPR already has the Witcher 3 running at 60 FPS. Likely with SLI, but it's still a good indication that the engine is no slouch and will take advantage of the hardware you throw at it.
 
@prince_of_nothing

The source file has to be 60 FPS of course, but you can capture at 60 FPS even if your game is not running at that speed :) I run AC:U at 45 FPS for example, and if I select the option to capture the video at 60 FPS in Shadowplay, it creates a 60 FPS video. I believe it upscales the frame rate, which is why the video looks a little bit fast-forwarded.

I don't know if that's the case with the last video though, because it doesn't look that fast forwarded to me.
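For what it's worth, here's a tiny sketch of the frame-duplication idea: if a fixed 60 FPS capture simply repeats the most recent game frame whenever a new one isn't ready, a 45 FPS game maps onto video frames as shown below. This is an assumption about how such a capture could work in principle, not a description of Shadowplay's internals.

```cpp
// Illustrative mapping of 45 FPS game frames onto a 60 FPS video timeline.
// Assumes the capture always shows the most recent completed game frame.
#include <cstdio>

int main()
{
    const double gameFps  = 45.0;
    const double videoFps = 60.0;

    for (int videoFrame = 0; videoFrame < 8; ++videoFrame)
    {
        int gameFrame = static_cast<int>(videoFrame * gameFps / videoFps);
        std::printf("video frame %d <- game frame %d\n", videoFrame, gameFrame);
    }
    // Three game frames fill four video frames (45/60 = 3/4), so every third
    // game frame is shown twice; the file is technically 60 FPS, but the
    // motion cadence won't look quite like a genuine 60 FPS capture.
    return 0;
}
```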
 
And DX12 or Mantle means more work for developers - something that they don't want.

Why not? If Mantle (or its iteration in glNext) turns out to be well supported on all target platforms, it would be in the interest of engine developers to support it in order to increase their reach to their potential users - i.e. game developers.
 
Seems to me that the topic of DirectX 12 in the Witcher 3 has run dry and the thread has become a random off-topic FPS/GPU/SLI mess. General graphics and PC requirements discussion already has its own thread, so please continue there. Closing this one.
 