Witcher 3 MUST be DX11

DX11 is a highly suitable technology for what CDPR means to do. OpenGL might be just as suitable. What is unsuitable, however, is a customer demanding a specific technology. Developers select the technology that enables them to deliver the results they want, and they must be free to do so without external demands. Unless you are not only a professional but also an insider -- and none of us without Red Team banners are insiders -- you have no information on which to base any kind of informed decision. You can't say "must use DX11" or "must use DX9" with any authority the developers are obliged to listen to, and you have no beef with them if they don't, because developers are paid to play the cards they're dealt, not to listen to us kibitzers.
 
PrinceofNothing said:
You need to follow the debate. You brought up 128-bit HDR and said that nobody used it because you can't tell the difference, after which I responded that the likely reason they don't use it is the performance hit.

And that's when you started talking about it being "expensive."



Yes, so they were already very detailed to begin with and would not benefit from tessellation.



Using Crysis as an example of "realistic water" doesn't really help your argument. Crysis was the greatest back in the day, but since then, other games like Crysis 2, Crysis 3 and Assassin's Creed IV have had much better-looking water.

Crysis 2 and 3 use tessellation for water, but Assassin's Creed IV doesn't. It just depends on what works best for the developer and their particular circumstances. Judging by that tech trailer, CDPR will probably decide to tessellate the ocean in the game.

[media]http://www.youtube.com/watch?v=vRj8We3JpIk[/media]



That shot was taken with streaming disabled, otherwise it would be a blurry mess. That's the whole point of tessellation. You can control the LOD of an object without using multiple textures.
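
To make the LOD point concrete, here is a minimal sketch of distance-based tessellation factors -- a generic D3D11 hull shader embedded as an HLSL string in C++. All names and constants here are illustrative; this is the general technique, not anything from CDPR's engine:

[code]
// Hedged sketch: distance-based tessellation LOD. The GPU decides how
// finely to subdivide each patch, so one mesh serves every distance --
// no swapping between pre-made detail levels. Illustrative values only.
const char* kHullShaderHlsl = R"(
cbuffer PerFrame : register(b0)
{
    float3 cameraPos;  // world-space camera position
    float  maxFactor;  // e.g. 16.0 on DX11-class hardware
};

struct VertexOut { float3 posW : POSITION; };

struct PatchTess
{
    float edges[3] : SV_TessFactor;
    float inside   : SV_InsideTessFactor;
};

PatchTess ConstantHS(InputPatch<VertexOut, 3> patch)
{
    float3 center = (patch[0].posW + patch[1].posW + patch[2].posW) / 3.0;
    float  dist   = distance(center, cameraPos);

    // Full detail within 10 units, minimum detail beyond 100 (illustrative).
    float factor = max(1.0, saturate((100.0 - dist) / 90.0) * maxFactor);

    PatchTess pt;
    pt.edges[0] = pt.edges[1] = pt.edges[2] = factor;
    pt.inside = factor;
    return pt;
}

[domain("tri")]
[partitioning("fractional_odd")]
[outputtopology("triangle_cw")]
[outputcontrolpoints(3)]
[patchconstantfunc("ConstantHS")]
VertexOut MainHS(InputPatch<VertexOut, 3> patch,
                 uint i : SV_OutputControlPointID)
{
    return patch[i];  // pass control points through unchanged
}
)";
[/code]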



Nobody said DX9 couldn't do soft shadows. Devs don't use that feature in DX9 because the performance cost is too great, not because it cannot be done.

The Witcher 3 will be a vast open-world game, so using DX9 with soft shadows would kill performance.
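
For what it's worth, the cost argument is easy to see in shader terms: "soft" shadows are usually approximated by filtering many shadow-map samples per pixel (percentage-closer filtering), and that sample count multiplies across every shaded pixel, every frame. Here's a generic sketch of the technique (SM4+ syntax for brevity; under DX9/SM3 you'd use tex2D, but the arithmetic is the same) -- not anything from REDengine:

[code]
// Hedged sketch: 4x4 percentage-closer filtering (PCF), the usual way to
// soften shadow edges. The 16 shadow-map taps per shaded pixel are where
// the cost comes from -- it scales with samples * resolution on any API.
const char* kPcfHlsl = R"(
Texture2D    gShadowMap : register(t0);
SamplerState gShadowSmp : register(s0);

float PcfShadow(float2 uv, float sceneDepth, float2 texelSize)
{
    float lit = 0.0;
    // 4x4 neighborhood: 16 depth comparisons for every pixel on screen.
    for (int y = -1; y <= 2; ++y)
        for (int x = -1; x <= 2; ++x)
        {
            float stored = gShadowMap.SampleLevel(gShadowSmp,
                               uv + float2(x, y) * texelSize, 0).r;
            lit += (sceneDepth <= stored) ? 1.0 : 0.0;
        }
    return lit / 16.0;  // fraction of taps in light => soft penumbra
}
)";
[/code]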



DX10, which is a subset of DX11.



Good for you!

I do follow the debate. But you are persistent, and you talk about things you don't understand. If you want to discuss this, make something. Experience it.

Btw, with today's programmable pipelines you can create programs (shaders) with your own instructions to create better effects, totally ignoring any API. But your game can still use DX9.
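
For instance, "your own instructions" under DX9 looks roughly like this in practice -- a minimal sketch of compiling a hand-written HLSL pixel shader through the old D3DX9 helper. The device setup and error handling are trimmed, and the shader itself is purely illustrative:

[code]
// Hedged sketch: a custom HLSL pixel shader compiled and created under
// plain D3D9. Assumes an already-initialized IDirect3DDevice9*; links
// against d3dx9.lib. Illustrative only.
#include <d3dx9.h>
#include <cstring>

IDirect3DPixelShader9* CreateCustomShader(IDirect3DDevice9* device)
{
    // The "program with your own instructions": arbitrary per-pixel math,
    // limited only by Shader Model 3.0 (DX9's ceiling).
    const char* src =
        "float4 main(float2 uv : TEXCOORD0) : COLOR0               \n"
        "{                                                         \n"
        "    return float4(uv, 0.5 + 0.5 * sin(uv.x * 40.0), 1.0); \n"
        "}                                                         \n";

    ID3DXBuffer* code   = NULL;
    ID3DXBuffer* errors = NULL;
    HRESULT hr = D3DXCompileShader(src, (UINT)strlen(src),
                                   NULL, NULL,        // no macros/includes
                                   "main", "ps_3_0",  // SM 3.0 profile
                                   0, &code, &errors, NULL);
    if (errors) errors->Release();
    if (FAILED(hr)) return NULL;

    IDirect3DPixelShader9* ps = NULL;
    device->CreatePixelShader((const DWORD*)code->GetBufferPointer(), &ps);
    code->Release();
    return ps;  // bind with device->SetPixelShader(ps) before drawing
}
[/code]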

Game devs often lie about certain features being DX10/11-only. Just remember the first Crysis and its DX10-only Very High settings. They work under DX9 and XP, with better performance.

And native DX11 games do not exist.

That's all.
 
Ancient76 said:
I do follow the debate. But you are persistent, and you talk about things you don't understand. If you want to discuss this, make something. Experience it.

While I have never programmed or created any games, I think you are overestimating your understanding and underestimating mine. You've been flat-out wrong on quite a few of your arguments, if you hadn't noticed.

Btw, with today's programmable pipelines you can create programs (shaders) with your own instructions to create better effects, totally ignoring any API. But your game can still use DX9.

If that's the case, why are new shader models released when new APIs are released? DX11 has SM 5.0, DX10 had SM 4.0, and DX9 had SM 3.0.

As far as I know, you can't use SM 5.0 in a DX9 game on DX9 hardware, as that hardware doesn't support SM 5.0's advanced feature set.
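
That coupling is visible right at the compiler: the target profile is part of the compile call, and SM 4+/5 source simply won't build for a DX9-class profile. A hedged sketch using the standalone D3DCompiler API (the shader and entry point are illustrative):

[code]
// Hedged sketch: the same HLSL source offered to two target profiles.
// The SM 4+/5 object syntax (Texture2D/SamplerState) is rejected for
// ps_3_0 (the DX9 ceiling) but accepted for ps_5_0 -- and a ps_5_0 blob
// can only be consumed by a D3D11 device, not a D3D9 one.
#include <d3dcompiler.h>   // link d3dcompiler.lib
#include <cstdio>
#include <cstring>

static bool CompileFor(const char* src, const char* profile)
{
    ID3DBlob* code = NULL;
    ID3DBlob* errs = NULL;
    HRESULT hr = D3DCompile(src, strlen(src), NULL, NULL, NULL,
                            "main", profile, 0, 0, &code, &errs);
    printf("%s: %s\n", profile, SUCCEEDED(hr) ? "ok" : "rejected");
    if (errs) errs->Release();
    if (code) code->Release();
    return SUCCEEDED(hr);
}

int main()
{
    const char* src =
        "Texture2D    tex : register(t0);                \n"
        "SamplerState smp : register(s0);                \n"
        "float4 main(float4 p : SV_Position) : SV_Target \n"
        "{ return tex.Sample(smp, p.xy); }               \n";

    CompileFor(src, "ps_3_0");  // fails: syntax/features beyond SM 3.0
    CompileFor(src, "ps_5_0");  // succeeds with the DX11-era compiler
    return 0;
}
[/code]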

Game devs often lie about certain features being DX10/11-only. Just remember the first Crysis and its DX10-only Very High settings. They work under DX9 and XP, with better performance.

DX10 was mostly supposed to increase performance (not image quality), but Crytek messed it up, as they didn't have the experience to implement it properly the first time. The first Assassin's Creed, however, had a DX10 path that was faster than the DX9 path, and it supported MSAA.

Anyway, if you ever get around to making your own AAA games, then you can come back to lecturing us and CDPR about why they should stick with ancient and outdated programming models and hardware whilst the rest of the world progresses.

And native DX11 games do not exist.

Tell that to DICE, Crytek, Firaxis, Ubisoft, 4A Games, and finally CDPR.
 
Moderator: That will be enough of telling other members what they are or are not qualified to comment on. Next post to do so gets deleted.
 
Thor666Arise said:
Well, I don't mind a DX11 OPTION! But I really hope they don't make it DX11-exclusive.
Yes, it would look better, no question, but a lot of people with weaker graphics cards couldn't play it.
It would be a shame if those people couldn't enjoy the masterpiece that The Witcher 3 is hopefully going to be.
I said this before, but with anything older than a 400-series card (which is where DX11 support starts), you shouldn't try this game on PC; it makes no sense.
It would lag even on low settings at a reduced resolution.
I would say a 760 and nothing lower.
So why cater to people who wouldn't even be able to enjoy the game because of 18-20 fps, with drops to 10 fps in fights? That makes no sense.
It's not really better for them, just worse for everyone else.
 

IsengrimR (Guest)
I love how some people are like: "DX9?! That's olllddd! Buuu!" Question is, if it looks good, if it works well, what the hell does it matter?
I am no specialist, but The Witcher 2 being "only" DX9 proved to me that Microsoft releasing new versions of DX is mostly a smokescreen to make them exclusive to their new OSes.
It doesn't have to be DX11, it doesn't have to be DX10. For all I care, it can run OpenGL (actually, that would be way better if it did, in my opinion). Certainly would be nice if it supported it all, yeah.

And btw, I will get Windows 8 when hell freezes over. I tried it out once and I hate it with a passion; it's the worst OS I've ever seen, and I've worked on Vista and ME. Windows 7 is usable after I've turned off most of the crap (services, autostarts) it wants to run.
 
ColIsengrim said:
I love how some people are like: "DX9?! That's olllddd! Buuu!" Question is, if it looks good, if it works well, what the hell does it matter?
I am no specialist, but The Witcher 2 being "only" DX9 proved to me that Microsoft releasing new versions of DX is mostly a smokescreen to make them exclusive to their new OSes.
It doesn't have to be DX11, it doesn't have to be DX10. For all I care, it can run OpenGL (actually, that would be way better if it did, in my opinion). Certainly would be nice if it supported it all, yeah.

And btw, I will get Windows 8 when hell freezes over. I tried it out once and I hate it with a passion; it's the worst OS I've ever seen, and I've worked on Vista and ME. Windows 7 is usable after I've turned off most of the crap (services, autostarts) it wants to run.


There is a huge difference between DX9 and DX11. DX11 has better performance and features not present in DX9, like tessellation.
 
ColIsengrim said:
I love how some people are like: "DX9?! That's olllddd! Buuu!" Question is, if it looks good, if it works well, what the hell does it matter?

That's the point: it wouldn't look good or work well. DX9 has too many limitations to successfully pull off a game of this magnitude. The Witcher 3 is an extremely ambitious game and will, to my knowledge, be the first big story-driven game to have no chapters or artificial breaks in the storytelling. It will also be completely contiguous, with no loading screens.

Using DX9 would sacrifice or severely diminish the impact of those important gameplay aspects.

I am no specialist, but The Witcher 2 being "only" DX9 proved to me that Microsoft releasing new versions of DX is mostly a smokescreen to make them exclusive to their new OSes.

As you say, you're no specialist, so you don't understand what it is you're talking about. Microsoft has had to walk a fine line between progressing technology and preserving backward compatibility.

If an older OS doesn't support a newer version of Direct3D, it's because the new technology is fundamentally incompatible and could not be implemented.

It doesn't have to be DX11, it doesn't have to be DX10. For all I care, it can run OpenGL (actually, that would be way better if it did, in my opinion). Certainly would be nice if it supported it all, yeah.

DX10 is a subset of DX11, so The Witcher 3 should still run on DX10 cards, albeit with lower performance and some graphical features missing.
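
Concretely, the "subset" relationship is D3D11's feature-level mechanism: a single D3D11 code path can create a device on a DX10-class card at feature level 10_0 and simply skip 11_0-only features such as tessellation. A minimal sketch (swap-chain and context handling elided):

[code]
// Hedged sketch: one D3D11 device-creation path that also runs on
// DX10-class hardware via feature levels. Hull/domain shaders
// (tessellation) need level 11_0; at 10_x you render without them.
#include <d3d11.h>   // link d3d11.lib

ID3D11Device* CreateDeviceWithFallback(bool* tessellationAvailable)
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,   // full DX11 feature set
        D3D_FEATURE_LEVEL_10_1,   // DX10.1-class GPU
        D3D_FEATURE_LEVEL_10_0,   // DX10-class GPU
    };

    ID3D11Device*        device  = NULL;
    ID3D11DeviceContext* context = NULL;
    D3D_FEATURE_LEVEL    got;

    HRESULT hr = D3D11CreateDevice(
        NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
        wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
        &device, &got, &context);
    if (FAILED(hr)) return NULL;   // no DX10-class GPU at all

    // Same API and code path either way; only optional features differ.
    *tessellationAvailable = (got >= D3D_FEATURE_LEVEL_11_0);
    return device;  // caller also owns `context` in real code
}
[/code]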

And btw, I will get Windows 8 when hell freezes over. I tried it out once and I hate it with a passion; it's the worst OS I've ever seen, and I've worked on Vista and ME. Windows 7 is usable after I've turned off most of the crap (services, autostarts) it wants to run.

Your loss. Windows 8, and especially Windows 8.1, is absolutely superior to Windows 7 in just about every respect, especially for gaming.
 
Cormacolindor said:
There is a huge difference between DX9 and DX11. DX11 has better performance and features not present in DX9, like tessellation.

I couldn't care less about tessellation. What is really good about DX11 is the performance, and the absence of the texture flashing that was all over the place in TW2.
 

IsengrimR (Guest)
PrinceofNothing said:
Windows 8, and especially Windows 8.1, is absolutely superior to Windows 7 in just about every respect, especially for gaming.

Care to explain in which aspect it is "superior"? I didn't notice anything like that while I was using it. Especially for gaming? Well, weren't there a number of issues with that?
But that's beside the point.

The last thing I would say about Microsoft, though, is that they do a good job, especially in the matter of backwards compatibility. Balance, my arse.
Disclaimer: I despise Microsoft anyway, so there it is.


PrinceofNothing said:
I couldn't care less about tessellation. What is really good about DX11 is the performance, and the absence of the texture flashing that was all over the place in TW2.

I also couldn't give a crap about tessellation. In most cases I set settings lower in games if I can't see the difference between what I can run and said lower settings.
Texture flashing? That, I didn't notice.
 
ColIsengrim said:
Care to explain in which aspect it is "superior"? I didn't notice anything like that while I was using it. Especially for gaming? Well, weren't there a number of issues with that?
But that's beside the point.

Superior in boot-up time, shutdown time, resumption from sleep or hibernation, data transfers, more efficient thread and memory management, much better security, lower power usage, and generally lower operating overhead.

As for gaming, the enhancements come from the more efficient thread management of CPU resources, more efficient memory handling, and, I also want to say, the new driver model. I'm not certain that the driver model has any direct impact, but I think the drivers on Windows 8/8.1 have less overhead than they do on Windows 7.

Don't quote me on that one, though. The driver model is largely why you can't simply drop the newer versions of Direct3D into older operating systems: they require the updated driver model as well, and that's a change at the deepest level.

Disclaimer: I despise Microsoft anyway, so there it is.

At least you admit it.

Texture flashing? That, I didn't notice.

I think he means texture pop-in.
 