Something to consider as well is the number of cores versus clock frequency. Games benefit less from many cores at low frequencies than from fewer cores at high frequencies. More cores at lower frequencies are better for productivity (rendering, compiling code, etc.). More cores do not directly translate into better gaming performance.
The way I've always chosen is to look at the highest clock frequency available for the fewest cores. So, say there are 4-core chips hitting 5.2 GHz. If I want to future-proof, I'll look for an 8-core chip in that same ballpark, say 8 cores at 4.8 GHz.
- A 12-core running at 3.5 GHz is likely to be merely "good": alright for now, but it'll probably show its limitations within a year or so.
- Conversely, a 12-core at 4.3 GHz is overkill: expensive, and likely to offer very little return for gaming. (It's like buying a semi-truck with a Ferrari engine and using it for grocery shopping.) By the time games are made that truly benefit from that power, the chips will be far more advanced and far cheaper.
- I'd want no part of a 16-core at 2.9 GHz (this would be a pure workstation CPU, and likely to offer surprisingly poor gaming performance before too long).
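The rule of thumb above can be sketched as a tiny script. The chip list, the 0.4 GHz "same ballpark" tolerance, and the "twice the cores" target are my own illustrative assumptions to show the logic, not benchmark data.

```python
def pick_future_proof(chips, ballpark=0.4):
    """Given (cores, ghz) tuples, take the fastest low-core chip as a
    clock baseline, then prefer a chip with at least twice the cores
    whose clock is within `ballpark` GHz of that baseline."""
    fewest = min(cores for cores, _ in chips)
    baseline = max(ghz for cores, ghz in chips if cores == fewest)
    candidates = [(cores, ghz) for cores, ghz in chips
                  if cores >= 2 * fewest and ghz >= baseline - ballpark]
    # Fall back to the fast low-core chip if nothing qualifies.
    return max(candidates, default=(fewest, baseline),
               key=lambda chip: chip[1])

# The four example chips from the list above:
chips = [(4, 5.2), (8, 4.8), (12, 3.5), (16, 2.9)]
print(pick_future_proof(chips))  # -> (8, 4.8)
```

Per the heuristic, the 8-core at 4.8 GHz wins: it doubles the core count while staying in the ballpark of the fastest 4-core, while the 12- and 16-core chips give up too much clock.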
For gaming, I want to ensure that the game maintains a competitive clock speed while running on a single core. That should ensure relatively smooth performance, and a GPU at the same level will handle the load very nicely. Games can't be relied upon to utilize that much threading effectively. (Not yet.) Backwards compatibility is also a factor if I intend to hang onto the games of yore and enjoy them for years to come.
As always, I'll never buy anything that was "just released". Give it a generation or two. It's always a bit of a gamble, but I'd say a very strong
8-core in the ~5.0 GHz range is the best bang for the buck right now.
For AMD chips, I'm happy to subtract 0.5 GHz. AMD chips have traditionally offered performance equivalent to Intel chips at significantly lower clock speeds and much lower cost. Don't rely too much on "benchmarks", either. For whatever reason, AMD chips tend to get rather spotty scores in bench tests, but their actual in-game performance can often whomp Intel chips. I've owned three AMD systems over time, and these are my considerations:
- AMD chips have not been adopted by developers as readily as Intel chips. There are occasionally odd issues that arise for certain titles / drivers / Windows updates. These issues are actually pretty darn uncommon, to be fair -- AMD chips work flawlessly with most titles. However, the issues that do arise are not always formally addressed.
+ The cost offset makes AMD chips a very real consideration. I'd much rather buy an AMD chip at the specs I want than shift down to a lower-grade Intel chip for the same price. (The primary reason I purchase Intel is for that extra bit of reliability, and because I can usually get them almost at cost. I normally have to wait for months, though.)
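The 0.5 GHz allowance described above boils down to a one-line normalization. The function name and the idea of folding the offset into an "effective" clock are my own framing of the rule of thumb, not anything from a spec sheet.

```python
AMD_OFFSET = 0.5  # GHz an AMD chip can give up and still compete (rule of thumb)

def effective_ghz(ghz, vendor):
    """Normalize clock speeds so AMD and Intel chips compare on one scale."""
    return ghz + AMD_OFFSET if vendor == "amd" else ghz

# An AMD chip at 4.5 GHz compares like an Intel chip at 5.0 GHz.
print(effective_ghz(4.5, "amd"))    # -> 5.0
print(effective_ghz(5.0, "intel"))  # -> 5.0
```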