Building a gaming PC

On price per core and frames per dollar I’m sure AMD is ahead, but from a sheer number-of-frames perspective I’m fairly certain the 9900K etc. are still top dogs in game benchmarks.

correct me if I’m wrong tho
 
On price per core and frames per dollar I’m sure AMD is ahead, but from a sheer number-of-frames perspective I’m fairly certain the 9900K etc. are still top dogs in game benchmarks.

Not really, unless you are using some ancient and poorly designed games that can't use all cores. Modern games are much more commonly GPU-bottlenecked, and they spread the load evenly across CPU cores. So the more cores you get, the less the CPU matters.

TL;DR: more cores > higher CPU frequency for modern games, which use proper parallelism (with something like Vulkan) to keep the GPU saturated.
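
To show what I mean by spreading the load, here's a minimal Python sketch (nothing like a real engine; simulate_entity, the entity count, and the worker counts are all invented for illustration). The total CPU work per frame is fixed, so the time per frame drops as workers are added, until the GPU becomes the limit:

```python
# Minimal sketch: fixed CPU work per frame split across N worker processes.
# simulate_entity() is an invented stand-in for per-entity AI/physics work.
from concurrent.futures import ProcessPoolExecutor
import math
import time

def simulate_entity(seed: int) -> float:
    """Stand-in for one entity's per-frame CPU work."""
    return sum(math.sin(seed * i) for i in range(50_000))

if __name__ == "__main__":
    entities = list(range(64))
    for workers in (1, 4, 8):
        start = time.perf_counter()
        with ProcessPoolExecutor(max_workers=workers) as pool:
            list(pool.map(simulate_entity, entities))
        elapsed = time.perf_counter() - start
        print(f"{workers} worker(s): {elapsed:.2f}s for one 'frame' of work")
```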
 
Depending on the price, I’ve got my eyes on either the RTX 3080 or the RTX 3070. The RTX 3080 Ti is out of the question; I’d rather use the spare cash on a Ryzen 9 3900X instead of a 3700X.
 
Depending on the price, I’ve got my eyes on either the RTX 3080 or the RTX 3070. The RTX 3080 Ti is out of the question; I’d rather use the spare cash on a Ryzen 9 3900X instead of a 3700X.

I think you’ll be fine with either of those. From what we’ve heard, the performance jump for this series is set to be impressive.
 
On price per core and frames per dollar I’m sure AMD is ahead, but from a sheer number-of-frames perspective I’m fairly certain the 9900K etc. are still top dogs in game benchmarks.

correct me if I’m wrong tho

It likely depends on the game. As Gilrond mentions, a lot of places trend toward testing older titles. Perhaps because they're familiar with them, have done older tests and are attempting to compare/contrast new parts with old ones, and/or don't want to redo tests with new titles on old hardware. Regardless, the notion that more cores do nothing for gaming is becoming less true. It stands to reason that as games advance, there will be improvements in this area.

Even if we assume Intel is better for gaming, how much better is a relevant consideration. An extra 10 fps may not matter in all cases. If my target is 1440p/60 fps and I get 70-75 with an AMD CPU and 80-85 with Intel, it's not a big deal, for instance. If you're trying to achieve 300 fps in an old competitive shooter and think 10 fewer fps is going to lead to fewer hits to the dome, you might look at this differently.

Personally, my competitive shooter days are behind me. Not that I was ever truly exceptional at them in the first place. Most games I play have fps intentionally capped around the monitor refresh rate for the given resolution. This may mean lower theoretical fps, sure. It also means consistent fps and feels smoother to me. It takes a load off the GPU because it's not being pushed to the max. So, lower temps, less noise, etc.
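
For anyone curious what a cap actually does, here's a rough sleep-based frame limiter sketch (Python, purely illustrative; render_frame is a made-up stand-in, and real caps live in the engine or driver with far better timers):

```python
# Rough sketch of a sleep-based frame cap: only start the next frame once
# the frame budget has elapsed, so the GPU idles instead of running flat out.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def render_frame() -> None:
    """Invented stand-in for the real render call."""
    time.sleep(0.005)  # pretend rendering took 5 ms

for _ in range(120):  # two seconds' worth of capped frames
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # the remaining ~11.7 ms is idle time -> lower temps, less noise
        time.sleep(FRAME_BUDGET - elapsed)
```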
 
It likely depends on the game. As Gilrond mentions, a lot of places trend toward testing older titles. Perhaps because they're familiar with them, have done older tests and are attempting to compare/contrast new parts with old ones, and/or don't want to redo tests with new titles on old hardware. Regardless, the notion that more cores do nothing for gaming is becoming less true. It stands to reason that as games advance, there will be improvements in this area.

Even if we assume Intel is better for gaming, how much better is a relevant consideration. An extra 10 fps may not matter in all cases. If my target is 1440p/60 fps and I get 70-75 with an AMD CPU and 80-85 with Intel, it's not a big deal, for instance. If you're trying to achieve 300 fps in an old competitive shooter and think 10 fewer fps is going to lead to fewer hits to the dome, you might look at this differently.

Personally, my competitive shooter days are behind me. Not that I was ever truly exceptional at them in the first place. Most games I play have fps intentionally capped around the monitor refresh rate for the given resolution. This may mean lower theoretical fps, sure. It also means consistent fps and feels smoother to me. It takes a load off the GPU because it's not being pushed to the max. So, lower temps, less noise, etc.
Fair enough. I'm not a competitive player by any means and prefer a good story-driven single-player game. So honestly I could be happy at a capped 60 fps, but as I've got a monitor that reaches 120 Hz @ 3440x1440, I was looking to squeeze as many frames out of the hardware as possible. However, as a couple of you have said, things are leaning towards greater core utilisation as games advance, so I'm ever more leaning towards a 4900X.
 
I think you’ll be fine with either of those. From what we’ve heard, the performance jump for this series is set to be impressive.

Pretty sure even a low-end 3060 card is going to be an impressive improvement over previous cards. In fact, it's what I am personally aiming for. I'm hoping for a good card in the $400-500 range. I just don't know if I will be able to play CP2077 with my current card until the 30xx series comes. Although I was positively surprised with RDR2 that I could, in fact, play it comfortably.

The RTX 3080 is going to arrive like a Star Destroyer out of hyperspace and blow the competition to smithereens. This is inevitable. The only questions left are "when" and "how much".
 
Hello, I have an HP Pavilion Gaming laptop with an Intel Core i5-9300H and an Nvidia GeForce GTX 1660 Ti. Are these components enough to run CYBERPUNK 2077?
Sorry for my poor English, I'm French :LOL:
 
Hello, I have an HP Pavilion Gaming laptop with an Intel Core i5-9300H and an Nvidia GeForce GTX 1660 Ti. Are these components enough to run CYBERPUNK 2077?
Sorry for my poor English, I'm French :LOL:

We don't know yet. My guess would be it should definitely be playable. I'm going to really reach and guess at a mix of Med-Ultra settings, minus some of the bells and whistles like ray tracing, 2K+ resolutions, etc.

EDIT: Ninja'ed by Bloodartist.
 
I expect it to be well playable on max settings with the upcoming AMD Navi 20 cards. And I hope they'll switch to standard Vulkan ray tracing extensions, instead of using Nvidia-only ones.
 
Not really, unless you are using some ancient and poorly designed games that can't use all cores.
Basically 95% of all games then. Which is why, across the board, you get better performance in games from gains in single-threaded CPU performance vs adding more cores/threads.

And frankly, I still play a lot of old and poorly optimized games (like The Witcher 1), which is why I'm still very much keen on seeing proper IPC improvements in CPUs every time I upgrade. Makes those replays so much more enjoyable. And it's really nice to see AMD being so committed to improving their IPC. IIRC their target was a 7% minimum improvement with each new gen of CPUs.
 
Basically 95% of all games then

Old games. And I wouldn't worry about them; they usually aren't so demanding that they won't run well on any modern CPU. Witcher 1 works without any issues on Ryzens, I get some crazy FPS there.

So basically, the argument of "Intel is better because of higher boost" is already irrelevant.
 
Old games. And I wouldn't worry about them; they usually aren't so demanding that they won't run well on any modern CPU. Witcher 1 works without any issues on Ryzens, I get some crazy FPS there.

So basically, the argument of "Intel is better because of higher boost" is already irrelevant.

Boost? He said IPC. Intel's lead there has shrunk a lot, but it's still a lead, and for games it's better.

Ryzens are massively better value for money, and I would recommend them for every build that isn't "money no object, best performance for games".
 
I think those who prefer Intel for gaming care about boost (i.e. higher frequency). Better IPC, if anything, is beneficial for parallel workloads, which it was claimed above aren't the bottleneck of older games. The argument used to be: "games are poorly parallelized, so the higher the core frequency, the better they perform". And better IPC with fewer cores can't win vs more cores either way.

Better IPC only matters when you have the same number of hardware execution units (i.e. physical cores, with SMT thrown in and such) in the systems you compare. I.e. if you have a 16-core processor with better IPC than another 16-core processor, then sure, the first one is better. But if you have an 8-core processor with better IPC, a 16-core processor with worse IPC will beat it anyway, even in workloads that benefit from better IPC, as long as they scale across cores.

So overall, Ryzen already has the best performance for games. Intel fell behind simply due to having fewer cores today (for comparable prices).
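
Back-of-the-envelope version of that argument, as a toy model (throughput ~ cores × IPC × clock, which only holds when the workload actually scales across cores; every number below is invented):

```python
# Toy model: per-core throughput ~ IPC * clock; total ~ cores * IPC * clock.
# Only meaningful when the workload scales across cores. Numbers invented.
def relative_throughput(cores: int, ipc: float, ghz: float) -> float:
    return cores * ipc * ghz

cpu_a = relative_throughput(cores=8, ipc=1.10, ghz=5.0)   # fewer cores, better IPC/boost
cpu_b = relative_throughput(cores=16, ipc=1.00, ghz=4.2)  # more cores, worse IPC

print(f"8-core, higher IPC : {cpu_a:.1f}")  # 44.0
print(f"16-core, lower IPC : {cpu_b:.1f}")  # 67.2 -> wins once the work parallelizes
```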
 
TW1 has a lot of trouble spots though, where performance still tanks due to its reliance on single-threaded CPU performance.

Old Vizima would be a prime example of that, probably due to the high NPC count (with soldiers and Scoia'tael fighting each other). There are certain places where I can get 100 fps and it just tanks to 60 when I move the camera, because of CPU performance. The game barely uses GPU resources anyway.

You can still argue that it's an old game and of lesser importance, but this game is pretty damn important to me. More so than any other game ever made. The first game I benchmarked after upgrading to a 9900K was TW1, and I was definitely pleased to see those trouble spots being smoother than on my old 4790K.
 
It almost certainly will not happen, but imagine if they released a benchmark tool like Final Fantasy XV's to gauge the performance of our hardware before release and determine if any upgrades are required. That would be a giant help.

Also, I'm looking into new monitors to experience Cyberpunk 2077 on. Which panel/resolution do you prefer? I have been gaming on 1080p for over 10 years, so I am a bit behind. What would you recommend between 2560x1440, 3440x1440, or 3840x2160? And what about panel type, IPS or VA?
 
It almost certainly will not happen, but imagine if they released a benchmark tool like Final Fantasy XV's to gauge the performance of our hardware before release and determine if any upgrades are required. That would be a giant help.

Also, I'm looking into new monitors to experience Cyberpunk 2077 on. Which panel/resolution do you prefer? I have been gaming on 1080p for over 10 years, so I am a bit behind. What would you recommend between 2560x1440, 3440x1440, or 3840x2160? And what about panel type, IPS or VA?
Specifically bought my Acer Predator X34P for Cyberpunk a few months back:
34” curved ultrawide
3440x1440
IPS panel (better colours) with over 1 billion colours
4 ms GtG response time
Love it.

IPS wins every time for single-player story-driven games like Cyberpunk, for the best picture quality in terms of colour accuracy. Imo anyway.
 
Specifically bought my Acer Predator X34P for Cyberpunk a few months back:
34” curved ultrawide
3440x1440
IPS panel (better colours) with over 1 billion colours
4 ms GtG response time
Love it.

IPS wins every time for single-player story-driven games like Cyberpunk, for the best picture quality in terms of colour accuracy. Imo anyway.
I have the X34A, and btw, those panels are 8-bit. The whole "over 1 billion colours" is mostly a marketing term; the actual figure that 8-bit panels can produce is 16.7 million colors. Which, btw, is enough.

True 10-bit panels are able to reproduce 1.07 billion colors, but those are rare and very expensive.

When it comes to resolution, 1440p strikes the best balance right now in terms of image sharpness and performance.
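
For anyone wondering where those figures come from, it's just powers of two per color channel, and the resolution trade-off is just pixel counts:

```python
# Colors per pixel = (2 ** bits_per_channel) ** 3 for the R, G, B channels.
for bits in (8, 10):
    print(f"{bits}-bit panel: {(2 ** bits) ** 3:,} colors")
# 8-bit panel: 16,777,216 colors      (the "16.7 million")
# 10-bit panel: 1,073,741,824 colors  (the "1.07 billion")

# And the resolution side of the trade-off is just pixel counts:
for w, h in ((2560, 1440), (3440, 1440), (3840, 2160)):
    print(f"{w}x{h}: {w * h:,} pixels")
# 3840x2160 pushes 2.25x the pixels of 2560x1440, hence the performance cost.
```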
 
So overall, Ryzen already has the best performance for games. Intel fell behind simply due to having fewer cores today (for comparable prices).

Not true, otherwise Zen would be on top of game benchmarks.
Also, cores? Currently there are 8 cores with the 9700K and 8 cores/16 threads with the 9900K. Coming at the end of this month are the 10c/20t 10900K and the 8c/16t 10700K. Currently, for games, that's enough.

If cores were the be-all and end-all, then the 64-core Threadripper would be at the top, but it's not. There's only one game, Ashes of the Singularity, that does better the more cores you throw at it. Does anyone actually play it?

Intel has been left in the dust in workstation-type workloads, not games.
If you do that kind of work, of course, get a Zen. If you want good value, get Zen. If you want the best gaming PC, get Intel. It's not subjective. Things may change with Zen 4000, but that's not now, nor confirmed.

Best gaming rig now: Intel CPU & Nvidia GPU. (As it's been for a very long time.)

Further on that, it's why I've said that AMD has their work cut out for them. Given what Intel can still do even after a major screwup, when they finally get back on track with new IPC and a smaller process, things may look bad for AMD.

Also, I'm looking into new monitors to experience Cyberpunk 2077 on. Which panel/resolution do you prefer? I have been gaming on 1080p for over 10 years, so I am a bit behind. What would you recommend between 2560x1440, 3440x1440, or 3840x2160? And what about panel type, IPS or VA?

Concur with Eskimoe, 1440p.
21:9 or 16:9, whatever your preference. Mine is 16:9, as it's most compatible with all content and I don't like 21:9's lack of vertical height.
Same with IPS or VA, both have their strengths and weaknesses.
Whatever you go with, I strongly recommend getting over 30". It's immersive, and video/work/other use will benefit.

This is being released this month: the Acer Predator XB323U.
Quite a few people have been waiting for a decent non-curved 32" 16:9 1440p high-refresh panel. (The fad of curving non-21:9 screens needs to stop.)
It's IPS, 165 Hz, claimed 1 ms response, G-Sync & FreeSync compatible, and height adjustable. Let's hope there's minimal IPS glow.

If you like 21:9, you're in luck, because there's a ton to choose from. If you can afford it, the 38GL950G.

2:1 ratio:
Wish 2:1 ratio monitors existed. Smack bang in the middle of 16:9 & 21:9. The best one-size-fits-all.
Minimal black bars for other content, not lacking in vertical height, still captures wide shots beautifully, still fine for close-ups. More and more content is being filmed in 2:1.
3000x1500, 3500x1750, 4000x2000, etc. resolutions. Everything about it is aesthetically pleasing. Guaranteed they'd sell well if released. This needs to be a thing.
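
Quick sanity check on where 2:1 sits, just dividing the ratios out (the resolutions are the ones wished for above):

```python
# 2:1 sits between 16:9 (~1.78) and 21:9 (~2.33); the wished-for resolutions
# below all divide out to exactly 2.0.
for name, w, h in (("16:9", 16, 9), ("2:1", 2, 1), ("21:9", 21, 9)):
    print(f"{name}: {w / h:.2f}")

for w, h in ((3000, 1500), (3500, 1750), (4000, 2000)):
    assert w / h == 2.0
    print(f"{w}x{h} -> 2:1")
```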
 