The Witcher 3: Wild Hunt - PC System Requirements are here!

Coming: when the investors say so, because it's not "when the game is ready." Wasting time debating it is not my thing.

Well, in CDPR's case (since they love their fans so much and are focused on delivering a complete product) you see a sort of balancing act come into play: keeping both the goat and the cabbage happy. Making sure they meet deadlines with investors while making sure the product is as good as it can be come launch. And I respect that greatly, especially when the industry is such a bitch nowadays and developing a game of this scale is no easy task.

I somehow feel people forget just how complex game development is. If it weren't for unyielding passion, everyone would quit I reckon.
 
For the sake of putting some blunt facts out there so we can get back on topic:

♦ Games get bigger ---> DLC and patches get bigger. More audio, higher-res textures, more complex code, and it becomes ever harder to test for every scenario. A (typical) test team of 40-120 people can't match the weird stuff that surfaces when 100k+ people are all doing different things.
♦ Unless you've actually gone through game development first hand (and it's obvious which posters here have), it's hard to grasp the reasoning, logic, and difficulties involved, so making demands like that just makes you look impractical. In a public forum, people are going to want to share information and opinions.
♦ Don't want the patch? It's an offline game, you don't have to get it. But unless you know exactly what the patch is for, dismissing it outright like that is a bit silly.
 
Alright putting my mod hat on.

I think Ryanza received pretty much all the answers he wanted (or not wanted for that matter). Let's try and steer the thread back on topic.
 
A post has been deleted.

If you have nothing to say, then don't post; if you don't want to take part in a discussion, then don't. Constantly posting with the sole intent to incite will be met with action.
 
Speaking of technical stuff: The Witcher 3 is built on DirectX 11, right? With DirectX 12 coming "soon", is it possible for it to be implemented in W3 after release via a patch? I think something similar happened with Crysis 2? I could be wrong.
 
Speaking of technical stuff: The Witcher 3 is built on DirectX 11, right? With DirectX 12 coming "soon", is it possible for it to be implemented in W3 after release via a patch? I think something similar happened with Crysis 2? I could be wrong.

Possible, but there would have to be good reason for it, such as better performance on some class of PC. Doing it just to say you support DX12 would be a waste of time and money. (The same argument applies to Mantle and OpenGL.)

And "soon" deserves the scare-quotes. There has to be DX12 hardware, DX12 drivers, middleware, and development kits, developers experienced in DX12 before it becomes a low-risk proposition. And even then, it has to compete with using your developers to produce other revenue-generating products, like CP2077.
 
Speaking of technical stuff: The Witcher 3 is built on DirectX 11, right? With DirectX 12 coming "soon", is it possible for it to be implemented in W3 after release via a patch? I think something similar happened with Crysis 2? I could be wrong.
It was a different story with Crysis 2. DirectX 11 came out in 2009 and Crysis 2 came out in 2011. It was the first Crysis made for consoles, and the last gen only supported the DirectX 9 feature set (with a GL equivalent on PS3), so they did multiplatform development for the game and later patched in their DirectX 11 renderer. They had one ready because CryEngine is a commercial engine, so they had to stay top of the line and hence already had a working DX11 renderer.

DirectX 12 will come out sometime at the end of this year; devs will need to learn new functions and tricks, and that takes time and experimentation. It also works differently from DX11, with a lot of layout changes, so an engine that is DX11 right now would need its renderer rewritten against the DirectX 12 API (rough sketch of the difference below).

Further still, API adoption takes time: you need compatible middleware, you need compatible hardware (and so do the consumers), and you need to update all your tools to match the new rendering features and workload. And while MS has said some of DX12 will work with existing cards, DX12 has 'feature levels' and the spec is still in progress, so the chances of complete backward support are not very good.

TL;DR - Possible? Yes. Will it happen? I'd put the odds at 99% no.
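To make the "rewrite" point concrete, here's roughly how the submission models differ. This is just a minimal headless sketch under the assumption that you have current Windows SDK headers; no swap chain, no real error handling, and nothing here is taken from REDengine:

```cpp
// D3D11: you get an immediate context and the driver manages submission for you.
// D3D12: you create queues, allocators, and command lists yourself and submit explicitly.
#include <windows.h>
#include <d3d11.h>
#include <d3d12.h>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3d12.lib")

int main()
{
    // --- D3D11 style: device + immediate context, draw calls go straight in.
    ID3D11Device*        dev11 = nullptr;
    ID3D11DeviceContext* ctx11 = nullptr;
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &dev11, nullptr, &ctx11);
    // ctx11->Draw(...) would be issued directly here.

    // --- D3D12 style: the application owns the whole submission pipeline.
    ID3D12Device* dev12 = nullptr;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&dev12));
    if (dev12)  // only if the hardware/driver actually expose DX12
    {
        D3D12_COMMAND_QUEUE_DESC   queueDesc = {};   // direct queue, default priority
        ID3D12CommandQueue*        queue = nullptr;
        ID3D12CommandAllocator*    alloc = nullptr;
        ID3D12GraphicsCommandList* list  = nullptr;
        dev12->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));
        dev12->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&alloc));
        dev12->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT, alloc, nullptr, IID_PPV_ARGS(&list));

        // Record work (barriers, draws) into 'list', then close and submit it yourself.
        list->Close();
        ID3D12CommandList* lists[] = { list };
        queue->ExecuteCommandLists(1, lists);
    }
    return 0;
}
```

Every one of those explicit objects (plus the fences, barriers, and descriptor heaps I left out) is something a DX11 renderer never had to think about, which is why "just patch in DX12" isn't a small job.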

CDPR also hasn't been at the front line of DX adoption with TW2 - two years after the introduction of DX11, the game was still done in DX9.
Neither had most other games; just about everyone was in the same boat.
 
 
CDPR also hasn't been at the front line of DX adoption with TW2 - two years after the introduction of DX11, the game was still done in DX9.
Well considering the amount of financial troubles they had during the development of the game, it's not a big surprise. IIRC the original plan was to use the old Aurora engine for TW2 while at the same time working on Red engine and using it in TW3. But they had to scrap that plan and use the unfinished Red engine for TW2.

Considering the situation, I think the game turned out alright. And by far the best outcome was that CDPR didn't go bankrupt.
 
Seems everyone is forgetting that only a select few titles used DX11 back then, and even those didn't do much with it and kept DirectX 9 (sometimes even 10) as the second option. Half the time the DX11 version didn't bring a significant enough difference, and often came with a questionable FPS loss, despite the fact that DX11 gets rid of a lot of bloat from the DX9 codebase, leading to less overhead and more efficiency.

It's also about serialization: DX9 issued draw calls one by one from a single thread, while DX11 lets you record them on deferred contexts and then submit them through the immediate context in one go. Maybe people didn't focus too much on that at the time, since it would have been a lot of extra work. A few titles offered a bit of tessellation, but between the implementations and the GPU performance of the day the results weren't very good. Crysis 2 had some famously over-tessellated brick walls in certain places; they spent effort on non-essential things when they could have prioritized it better.
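For anyone curious what that looks like in code, here's a minimal sketch (not from any shipping engine, with device creation, state setup, and error handling mostly left out):

```cpp
// D3D11: record work on a deferred context (possibly on a worker thread),
// then submit the baked command list through the single immediate context.
// Under DX9 there was no equivalent; every call went through the one device.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

void SubmitDeferredWork(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    ID3D11DeviceContext* deferred = nullptr;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return;

    // ... set state and issue draw calls on 'deferred' here ...

    ID3D11CommandList* cmdList = nullptr;
    deferred->FinishCommandList(FALSE, &cmdList);       // bake the recorded calls

    if (cmdList)
    {
        immediate->ExecuteCommandList(cmdList, TRUE);   // single submission point
        cmdList->Release();
    }
    deferred->Release();
}
```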

Only within the last two or three years did developers start using DirectX 11 properly and dropping support for older versions.
 
Ok so my PC is:
I7 2600k 3.4
8gb ram
GTX 570 SLI
OS 64bit Win 7
DX11
Think I can run the game decently or should I just pick it up on my PS4?
 
Ok so my PC is:
I7 2600k 3.4
8gb ram
GTX 570 SLI
OS 64bit Win 7
DX11
Think I can run the game decently or should I just pick it up on my PS4?
The 1.28 GB of VRAM on the 570 might become a problem, but otherwise you have a very solid rig.
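If you want to double-check how much dedicated VRAM Windows actually reports for the card, a small DXGI query will tell you (a minimal sketch; I'm assuming adapter 0 is the primary GPU):

```cpp
// Enumerate the primary DXGI adapter and print its dedicated VRAM.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    if (factory->EnumAdapters1(0, &adapter) == S_OK)   // adapter 0: primary GPU (assumption)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        printf("%ls: %llu MB dedicated VRAM\n", desc.Description,
               (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

Keep in mind that SLI mirrors VRAM rather than pooling it, so two 570s still leave you with roughly 1.28 GB of usable video memory.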
 
The "minimum" is usually wrong though.

I mean... CoD Ghosts, Wolfenstein The New Order, Watch Dogs, Crysis 3, Dishonored, Shadow of Mordor, Infinite...
All of these games lied in either their minimum or recommended specs, some by quite a bit, especially when it came down to VRAM. TNO even lied about the API :p...

Meh, I am more than safe at 4.
 
I don't mean the system requirements of games; from personal testing I have concluded that the new minimum VRAM for current games (not older ones) should be 2 GB AT THE VERY LEAST. Below that is... suffering, and it's only gonna increase in the future.
 
I don't mean the system requirements of games; from personal testing I have concluded that the new minimum VRAM for current games (not older ones) should be 2 GB AT THE VERY LEAST. Below that is... suffering, and it's only gonna increase in the future.

Yes, that's the reality, and it's only just beginning. I think the next ATI cards will be on 20 nm and the next GeForce on 16 nm, so more room on the card and therefore more VRAM. Each generation the GPU makers focus on one point: the last generation was about power consumption, and I think the next will be about VRAM, because developers are demanding more and more of it. The standard now is 2-3 GB of VRAM; I think in a year or two it will be 4-6 GB on the new GeForce and ATI cards. We'll see if I'm right. Perhaps DirectX 12, with its new API, will also reduce the demand on the CPU.
 
I don't mean the system requirements of games; from personal testing I have concluded that the new minimum VRAM for current games (not older ones) should be 2 GB AT THE VERY LEAST. Below that is... suffering, and it's only gonna increase in the future.


Ohh I agree, I would not buy a GPU with less than 2 GB of VRAM.
Still...
Would you believe me if I told you the ATI 5770 (1GB version) still plays EVEN the newest games on low-medium settings? :)
 
Guys, do you think it will even start on a 2-core CPU at all? I've got an i5-2520M at 3.2 GHz. Assuming we'll be able to turn off shadows this time (I'm looking at you, TW2), my GT 555M should be able to handle the graphics at least on low, plus I've got 8 GB of RAM.

Edit: Sadly, I can't afford to upgrade at all, so I'm at the optimization team's mercy, but I really wanna play it even if it's just on low.
 