So an i5 2500K is the minimum? My understanding is the only difference between an i5 and an i7 is hyperthreading. If I overclock my i5 2500K, will I be OK even though my processor just meets the minimum spec? Or will the lack of hyperthreading make a difference? I hope to upgrade my 7870 to a 970 soon. The processor is the only thing I'm concerned about.
My specs
i5 2500K
AMD 7870
Windows 7 64Bit
The Core i7 also has slightly more cache than the Core i5, but yes, overclocking can definitely help to an extent. The question is how high you're willing to go. If it were me, I'd shoot for 4.5GHz. But as I said a few pages back, if CDPR is recommending an i7, it must be because the engine can scale to at least 6 threads if I had to guess, or even 8. So HT will likely give a performance increase in The Witcher 3, but not as much as having real cores instead of virtual ones.
One thing the new consoles have done to the gaming industry is force developers to thread their engines so they can use the weak console CPUs effectively. If they can't thread them, they'll essentially go bankrupt, since no one wants to buy underperforming games.
All of the new 3D engines scale to at least 6 threads or more, and Red Engine 3 looks like it's falling in line with the others in that respect.
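To put some rough numbers behind why more threads (or HT) give diminishing returns, here's an illustrative Amdahl's-law sketch. The 80% parallel fraction is a made-up number purely for illustration, not anything published about Red Engine 3:

```python
# Illustrative only: Amdahl's law says the serial part of the work
# caps the gain you get from adding more threads.

def speedup(parallel_fraction, n_threads):
    """Overall speedup when only part of the frame work is parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_threads)

# Assume (hypothetically) 80% of the engine's per-frame work is parallel:
for n in (2, 4, 6, 8):
    print(n, "threads:", round(speedup(0.8, n), 2))
```

Each step from 4 to 6 to 8 threads adds less than the one before it, and Hyper-Threading's virtual cores typically deliver only a fraction of a real core's throughput on top of that, which is why real cores beat HT.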
Thank you for the detailed response.
So in the future, would you recommend I upgrade to an i7, since more game engines are going to take advantage of 6 or more threads?
Edit: I forgot to add that I have 8GB of RAM.
@ Sardukhar, you're extremely GPU-bound at those settings, though, which is a lot different from the average gamer who plays at more CPU-limited resolutions like 1080p.
I suppose this throws my plan of using an i3 4160 to play The Witcher 3 out the window ;_;
I'd assume you'd be fine, since the recommended card (which I assume is for max settings) is a 770. I'd also recommend turning depth of field off for more FPS. It's really an unnecessary effect, and I know it takes a hit on my FPS in a few games.

My specs:
CPU: i5 3470K 3.4GHz
GPU: GTX 760 oc
RAM: 8 GB
I hope I can play on High settings with AA and Sync off. Any suggestions / thoughts from fellow pc gamers?
What I mean is that the higher the resolution you game at, the less of a factor the CPU becomes in performance. The CPU issues draw-call commands to the GPU so it can render a frame. At 6400x1600, the GPUs take so long to render a frame that they're never waiting on instructions from the CPU. If you lowered your clock speed, you might not notice any performance decrease at all.
At lower resolutions, you usually get the reverse, where the GPU ends up waiting for the CPU since the GPU can render frames much faster at lower resolutions.
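The CPU/GPU bottleneck idea above can be sketched as a toy model (not a benchmark, and the millisecond figures are invented for illustration): per-frame time is roughly bounded by the slower of the two stages.

```python
# Toy model: a frame can't finish faster than its slowest stage,
# whether that's the CPU (draw-call submission) or the GPU (rendering).

def frame_time_ms(cpu_ms, gpu_ms):
    """The pipeline runs at the pace of its slowest stage."""
    return max(cpu_ms, gpu_ms)

# High resolution (e.g. 6400x1600): the GPU takes far longer than the
# CPU, so slowing the CPU down barely changes the frame rate.
print(frame_time_ms(cpu_ms=5.0, gpu_ms=25.0))  # 25.0 ms -> GPU-bound
print(frame_time_ms(cpu_ms=8.0, gpu_ms=25.0))  # still 25.0 ms

# Low resolution (1080p): the GPU finishes quickly and waits on the
# CPU, so CPU speed now sets the frame rate.
print(frame_time_ms(cpu_ms=8.0, gpu_ms=4.0))   # 8.0 ms -> CPU-bound
```

In the GPU-bound case, a slower CPU (5 ms vs 8 ms per frame) makes no difference at all, which matches the observation that underclocking may go unnoticed at very high resolutions.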
Can the 770 effectively use 4GB? I never looked into that model, but I've heard many people say it can't, or that it never uses more than 2GB at 1080p in many games. Heard there wasn't much of a gain in FPS with a 4GB 770.

VRAM will be at least 2GB minimum and 4GB recommended, since the cards mentioned in the respective specs have that much.
The specs seem a little high; the only reason they'd make sense is if they're for running it at 60fps.
Someone said Mordor is worse than this, but Mordor runs perfectly on less than this. The VRAM was the only weird thing in Mordor's specs, and it turned out to be overstated, since it runs fine with 2GB and perfectly with 4GB, even on mid-range cards.
I have an FX 8320, 8GB RAM, and an R9 270x 4GB, and the only game that doesn't run on max settings is Unity. Inquisition runs perfectly. Of course, none of them surpass 50 fps, but none of them drop below 30 either; mostly 40 fps in Mordor and Inquisition.
So whether these are good or bad specs depends on how fast the game runs with them. The Witcher 2 runs 60+ fps all the time maxed out on my system.
If I do turn the i5 2500K down to its base 3.3GHz, which I had to do while troubleshooting the horror show that setting these cards up was, I do see frame drops, especially in something like Watch Dogs.
Of course, that's Watch Dogs, so, yeah. Also DAI, but don't hold me to that.
The PC minimum specs are only slightly higher than the consoles. And I strongly doubt that the minimums amount to "no, you can't play it". I think being fractionally below the minimums means the game will still play well at reduced settings. (Some of the requirements, like 64-bit and DX11, aren't soft, though.)
So I think taking offense or blaming CDPR for not making a game that will deliver Old Masters-quality artwork on just any PC is premature.