The Witcher 3: Wild Hunt - PC System Requirements are here!

Hmm, I may need to upgrade my GPU; I have a GTX 650 Ti. Furthermore, I'm not yet gaming at 1080p; I'm at 900p. Anyway, I'll be changing my GPU soon, so it won't be an issue. Eagerly waiting!
:hmm:
 
So an i5 2500K is the minimum? My understanding is that the only difference between an i5 and an i7 is hyperthreading. If I overclock my i5 2500K, will I be OK even though my processor just meets the minimum spec? Or will the lack of hyperthreading make a difference? I hope to upgrade my 7870 to a 970 soon. The processor is the only thing I'm concerned about.

My specs
i5 2500K
AMD 7870
Windows 7 64-bit

The Core i7 also has slightly more cache than the Core i5, but yes, overclocking can definitely help to an extent. The question is how high you're willing to go. If it were me, I'd shoot for 4.5 GHz. But as I said a few pages back, if CDPR is recommending an i7, it must be because the engine can scale to at least six threads, or maybe even eight, if I had to guess. So HT will likely give a performance increase in The Witcher 3, just not as much as having real cores instead of virtual ones.

One thing the new consoles have done to the gaming industry is force developers to thread their engines so they can use the consoles' weak CPUs effectively. If they can't thread them, they will essentially go bankrupt, since no one wants to buy underperforming games.

All of the new 3D engines scale to six threads or more, and Red Engine 3 looks like it's falling in line with the others in that respect.
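
If you want to picture what "scaling to N threads" means, here's a minimal sketch with a dummy workload standing in for real engine jobs - an illustration of the idea, not anything from Red Engine:

```cpp
// Minimal sketch: spread a batch of independent "game jobs" across
// all hardware threads the CPU reports (logical cores, so an i7's
// hyperthreads count too). The work itself is a stand-in.
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    unsigned threads = std::thread::hardware_concurrency(); // 4 on an i5 2500K, 8 on an i7 2600K
    if (threads == 0) threads = 4;                          // fallback if the runtime can't tell

    std::vector<long long> partial(threads, 0);
    std::vector<std::thread> pool;
    const long long jobs = 80'000'000; // pretend each iteration is one small engine task

    for (unsigned t = 0; t < threads; ++t) {
        pool.emplace_back([&, t] {
            // each worker takes an even slice of the job list
            for (long long i = t; i < jobs; i += threads)
                partial[t] += i % 7; // dummy work
        });
    }
    for (auto& th : pool) th.join();

    std::cout << "ran on " << threads << " threads, result "
              << std::accumulate(partial.begin(), partial.end(), 0LL) << "\n";
}
```

An engine structured like this speeds up as logical cores are added, which is exactly why an i7's eight hardware threads can pull ahead of an i5's four at the same clock.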
 
The Core i7 also has slightly more cache than the Core i5, but yes, overclocking can definitely help to an extent. The question is how high you're willing to go. If it were me, I'd shoot for 4.5 GHz. But as I said a few pages back, if CDPR is recommending an i7, it must be because the engine can scale to at least six threads, or maybe even eight, if I had to guess. So HT will likely give a performance increase in The Witcher 3, just not as much as having real cores instead of virtual ones.

One thing the new consoles have done to the gaming industry is force developers to thread their engines so they can use the consoles' weak CPUs effectively. If they can't thread them, they will essentially go bankrupt, since no one wants to buy underperforming games.

All of the new 3D engines scale to six threads or more, and Red Engine 3 looks like it's falling in line with the others in that respect.

Thank you for the detailed response.

So in the future, would you recommend I upgrade to an i7, since more game engines are going to take advantage of six or more threads?
 
So... I'm in need of an upgrade for sure. My Mobility Radeon 5650 will probably burn if I force it to run Witcher 3.
Those minimum specs are high for regular laptop users; guess it's time to look for a gaming laptop.
 
Thank you for the detailed response.

So in the future, would you recommend I upgrade to an i7, since more game engines are going to take advantage of six or more threads?

Yes, definitely. TLP (thread-level parallelism) and ILP (instruction-level parallelism) are the future. CPUs keep adding more cores/execution units and threads, and as games become bigger and more complicated, they will need to be programmed to exploit those execution units... and they are.

If someone were to buy a new gaming rig today, I would not recommend a straight quad-core CPU. Either a hyperthreaded quad-core or a hex-core would be what I recommend. I've seen some guys complaining about the newest titles like AC Unity tapping out their quad-core CPUs. And by tapping out, I mean close to 100% usage on all cores.

To be fair, though, a lot of that comes from the draw-call and driver overhead associated with PC gaming and not from the game itself. DX12 will solve that problem to a large extent, but for now, hyperthreading definitely helps to deal with that extra overhead.
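
To picture what that overhead is: the CPU has to record every draw call before the GPU can execute anything, and the DX12-style answer is to let several threads record commands at once. A toy sketch of the idea - CommandList here is a made-up stand-in, not the actual D3D12 interface:

```cpp
// Toy model of multithreaded command recording (the DX12 idea).
// CommandList is a stand-in struct, not the real D3D12 type.
#include <cstdio>
#include <thread>
#include <vector>

struct DrawCall { int mesh; int material; };
struct CommandList { std::vector<DrawCall> recorded; };

int main() {
    std::vector<DrawCall> frame(10000, DrawCall{1, 2}); // one frame's worth of draws
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 4;

    // DX11 style would record everything on one thread (serial CPU cost).
    // DX12 style: each thread records a slice into its own list,
    // then the lists are submitted in order.
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> pool;
    size_t chunk = frame.size() / workers;

    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            size_t begin = w * chunk;
            size_t end = (w + 1 == workers) ? frame.size() : begin + chunk;
            for (size_t i = begin; i < end; ++i)
                lists[w].recorded.push_back(frame[i]); // "record" the draw
        });
    }
    for (auto& t : pool) t.join();

    size_t total = 0;
    for (auto& l : lists) total += l.recorded.size();
    std::printf("recorded %zu draws on %u threads\n", total, workers);
}
```

Spread the recording out like that and spare hardware threads, including HT's virtual ones, absorb the submission cost instead of one core choking on it.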
 
So an i5 2500K is the minimum? My understanding is that the only difference between an i5 and an i7 is hyperthreading. If I overclock my i5 2500K, will I be OK even though my processor just meets the minimum spec? Or will the lack of hyperthreading make a difference? I hope to upgrade my 7870 to a 970 soon. The processor is the only thing I'm concerned about.

My specs
i5 2500K
AMD 7870
Windows 7 64-bit

Edit: I forgot to add that I have 8 GB of RAM.

I'm running an i5 2500K overclocked to 4.2 GHz with 8 GB of RAM on a Sandy Bridge motherboard - an Asus P67A-GD65 - and twin watercooled Radeon 290Xs, supported by a Corsair 800W power supply - a good one, though.

This setup currently runs Dragon Age: Inquisition, Elite Dangerous, Star Citizen, and Alien: Isolation at 6400x1600 quite smoothly (30+ FPS on High to Ultra), with some hitching in Star Citizen. SC is pretty unoptimized, though.

I check my CPU load with SpeedFan and my GPU loads with MSI Afterburner and/or GPU-Z pretty often. I can tell you the overclocked i5 2500K does fine so far.
 
@Sardukhar, you're extremely GPU-bound at those settings, though, which is a lot different from the average gamer who plays at more CPU-limited resolutions like 1080p.
 
@Sardukhar, you're extremely GPU-bound at those settings, though, which is a lot different from the average gamer who plays at more CPU-limited resolutions like 1080p.

...what? If an overclocked 2500K can keep up at those resolutions, it will be fine at a mere 1080p, provided your GPU can keep up as well.

I mean, I could tell you about running those games at lower resolutions with my 6950 a couple of months ago (except ED, which I didn't have yet), and the 2500K still wasn't a problem. But my point is that as long as your GPU can keep up, you shouldn't see serious CPU limitations from games in the neighbourhood of the ones mentioned above. Not from an overclocked 2500K.

Perhaps W3 will be so heavily CPU-dependent that an overclocked 2500K couldn't cope - but that would be a hypothesis without much evidence at this point.
 
What I mean is that the higher the resolution you game at, the less of a factor the CPU becomes in performance. The CPU issues draw-call commands to the GPU so it can render a frame. At 6400x1600, the GPUs take a lot longer to render each frame, so they're never waiting on instructions from the CPU. If you lowered your clock speed, you might not notice any performance decrease at all.

At lower resolutions, you usually get the reverse, where the GPU ends up waiting for the CPU, since the GPU can render frames much faster at lower resolutions.
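
A back-of-envelope way to see it: the CPU and GPU work mostly in parallel, so frame time is roughly whichever side is slower. All the millisecond figures below are invented, just to illustrate the two cases:

```cpp
// Rough model: frame time ~= max(CPU ms per frame, GPU ms per frame).
// All numbers are made up to illustrate GPU-bound vs CPU-bound.
#include <algorithm>
#include <cstdio>

int main() {
    double gpu_ms = 30.0;                 // twin 290Xs at 6400x1600: GPU is the slow side
    double cpu_44 = 8.0;                  // hypothetical CPU cost per frame at 4.4 GHz
    double cpu_33 = cpu_44 * 4.4 / 3.3;   // ~10.7 ms at stock 3.3 GHz (same work, lower clock)

    std::printf("OC    1600p: %.1f fps\n", 1000.0 / std::max(cpu_44, gpu_ms)); // ~33.3 fps
    std::printf("stock 1600p: %.1f fps\n", 1000.0 / std::max(cpu_33, gpu_ms)); // still ~33.3 fps

    // Drop to 1080p and the GPU side shrinks; now the CPU can become the wall.
    double gpu_1080 = 9.0;
    std::printf("OC    1080p: %.1f fps\n", 1000.0 / std::max(cpu_44, gpu_1080)); // ~111 fps
    std::printf("stock 1080p: %.1f fps\n", 1000.0 / std::max(cpu_33, gpu_1080)); // ~94 fps, CPU-bound
}
```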
 
My specs:

CPU: i5 3470K @ 3.4 GHz
GPU: GTX 760 OC
RAM: 8 GB

I hope I can play on High settings with AA and V-Sync off. Any suggestions/thoughts from fellow PC gamers?
 
My specs:

CPU: i5 3470K @ 3.4 GHz
GPU: GTX 760 OC
RAM: 8 GB

I hope I can play on High settings with AA and V-Sync off. Any suggestions/thoughts from fellow PC gamers?
I'd assume you'd be fine, since the recommended card (which I assume is for max settings) is a 770. I'd remember to turn depth of field off for more FPS, too. It's a really unnecessary effect, and I know it takes a hit on my FPS in a few games.
 
What I mean is that the higher the resolution you game at, the less of a factor the CPU becomes in performance. The CPU issues draw-call commands to the GPU so it can render a frame. At 6400x1600, the GPUs take a lot longer to render each frame, so they're never waiting on instructions from the CPU. If you lowered your clock speed, you might not notice any performance decrease at all.

At lower resolutions, you usually get the reverse, where the GPU ends up waiting for the CPU, since the GPU can render frames much faster at lower resolutions.

Oh, I get it - that's what CPU-bound is, after all. The twin 290Xs render pretty quickly, though, even at that resolution. More importantly, the machine wasn't CPU-bound even at lower resolutions, at least on my 6950. I've run a couple of things on the side monitors with the 290Xs - at 1080p - and there was no real drop there either. That I recall - it was mostly accidental. Sometimes DAI renders in one window and only uses one GPU, because Eyefinity needs fullscreen or go home.

If I do turn the i5 2500K down to its base 3.3 GHz, which I had to do while troubleshooting the horror show that setting these cards up was, I do see frame drops, especially in something like Watch Dogs.

Of course, that's Watch Dogs, so, yeah. Also DAI, but don't hold me to that.

Anyway, the overclocked 2500K shouldn't be a limiting factor at 1080p - or higher - as long as the game isn't heavily CPU-dependent.
 
VRAM will be at least 2 GB minimum and 4 GB recommended, since the cards mentioned in the respective specs have that much.

The specs seem a little high; the only way they would look reasonable is if they're for running the game at 60 FPS.

Someone said Mordor's requirements were worse than this. Well, Mordor runs perfectly with less than this; the VRAM was the only weird thing in Mordor's specs, and it turned out to be overstated, because the game runs OK with 2 GB and perfectly with 4 GB, even on mid-range cards.

I have an FX-8320, 8 GB of RAM, and an R9 270X 4GB, and the only game that doesn't run on max settings is Unity. Inquisition runs perfectly. Of course, none of them surpass 50 FPS, but none of them drop below 30 either - mostly 40 FPS in Mordor and Inquisition.

So whether these are good or bad specs depends on how fast the game runs with them. The Witcher 2 runs at 60+ FPS all the time, maxed out, on my system...
 
VRAM will be at least 2 GB minimum and 4 GB recommended, since the cards mentioned in the respective specs have that much.

The specs seem a little high; the only way they would look reasonable is if they're for running the game at 60 FPS.

Someone said Mordor's requirements were worse than this. Well, Mordor runs perfectly with less than this; the VRAM was the only weird thing in Mordor's specs, and it turned out to be overstated, because the game runs OK with 2 GB and perfectly with 4 GB, even on mid-range cards.

I have an FX-8320, 8 GB of RAM, and an R9 270X 4GB, and the only game that doesn't run on max settings is Unity. Inquisition runs perfectly. Of course, none of them surpass 50 FPS, but none of them drop below 30 either - mostly 40 FPS in Mordor and Inquisition.

So whether these are good or bad specs depends on how fast the game runs with them. The Witcher 2 runs at 60+ FPS all the time, maxed out, on my system...
Can the 770 effectively use 4 GB? I never looked into that model, but I've heard many people say it can't, or that it never uses more than 2 GB at 1080p in many games. I heard there wasn't much of an FPS gain with a 4GB 770.
 
If I do turn the i5 2500K down to its base 3.3 GHz, which I had to do while troubleshooting the horror show that setting these cards up was, I do see frame drops, especially in something like Watch Dogs.

Of course, that's Watch Dogs, so, yeah. Also DAI, but don't hold me to that.

Just out of curiosity: Are you using Mantle in DAI?

---------- Updated at 08:33 AM ----------

Can the 770 effectively use 4 GB? I never looked into that model, but I've heard many people say it can't, or that it never uses more than 2 GB at 1080p in many games. I heard there wasn't much of an FPS gain with a 4GB 770.

In older games the 4 GB is indeed not really helpful, as that much VRAM would only be needed in extreme scenarios involving high resolution and antialiasing settings, in which case the 770's GPU would be too slow anyway. However, VRAM demand has increased drastically with some recent titles - AC Unity, Watch Dogs, Lords of the Fallen, for example. Here the additional VRAM does help, since the demand for graphics horsepower hasn't actually increased that much; the game just wants more VRAM for texture streaming, etc.
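
For a sense of scale, here's the rough texture math. The counts and sizes are assumptions picked for illustration, not data from any of these games:

```cpp
// Back-of-envelope VRAM math for texture streaming.
// Sizes and counts are illustrative assumptions, not game data.
#include <cstdio>

int main() {
    // An uncompressed RGBA8 texture costs width * height * 4 bytes,
    // plus roughly 1/3 extra for the mipmap chain.
    auto tex_mb = [](int w, int h, double bytes_per_texel) {
        return w * h * bytes_per_texel * (4.0 / 3.0) / (1024.0 * 1024.0);
    };

    double t2048    = tex_mb(2048, 2048, 4.0); // ~21.3 MB each, uncompressed
    double t2048_bc = tex_mb(2048, 2048, 1.0); // ~5.3 MB at 4:1 block compression (BC3/BC7-class)

    // Say a scene keeps 300 such textures resident for streaming:
    std::printf("300 x 2048^2 uncompressed: %.0f MB\n", 300 * t2048);    // ~6400 MB
    std::printf("300 x 2048^2 compressed:   %.0f MB\n", 300 * t2048_bc); // ~1600 MB

    // Add render targets, geometry, and shadow maps on top and a 2 GB card
    // fills up fast at max texture settings, which is why the 4 GB models
    // pull ahead in the newer titles.
}
```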
 
The PC minimum specs are only slightly higher than the consoles'. And I strongly doubt that the minimums amount to "no, you can't play it". I think being fractionally below the minimums means the game will still play well at reduced settings. (Some of the requirements, like 64-bit and DX11, aren't soft, though.)

So I think taking offense or blaming CDPR for not making a game that will deliver Old Masters-quality artwork on just any PC is premature.

The console CPU runs at 1.6 GHz in the case of the PS4, with a 7850-class GPU; the XOne's CPU runs at 1.75 GHz, but with a 7770-class GPU. So I would say an i5 at 3.3 GHz with a 7870, both overclockable, are more than a little better. Or, to put it another way, even if the consoles define "the minimum" specs, the game could still run on PC with lesser graphics than the consoles.
 
My ancient Core 2 Duo E8400 and Radeon 4870 would commit suicide.
In February, worthy successors - an i7 4790 and a GTX 980 - will take up their duty.
It cannot happen soon enough!
 
Can the 770 effectively use 4 GB? I never looked into that model, but I've heard many people say it can't, or that it never uses more than 2 GB at 1080p in many games. I heard there wasn't much of an FPS gain with a 4GB 770.

It's weird, but it depends on the game. For Unity, I think it needs it, while for Watch Dogs or Mordor, not that much. I think it depends on how much the game loads into VRAM.

The VRAM problem is that developers now use it instead of system RAM, and the consoles have 8 GB of it, with at least 5 GB usable. Watch Dogs, Ryse: Son of Rome, Mordor, Unity - they all use more than 2 GB at 1080p with the highest texture settings.

I swapped my 2GB 270X for a 4GB model, and the difference in these games is huge.
 