Building a gaming PC

I think 4K is still too taxing if you want a decent framerate. At least for me, a higher framerate is worth more than a higher resolution. Even with the new generation of GPUs, I think 2560x1440 will remain the sweet spot. Maybe in a couple more generations 4K will be feasible with decent framerates in high-end games.

Also, I don't quite get the appeal of pushing resolution higher if you need to resort to upscaling to do it. Higher resolution is about increasing image quality, while upscaling decreases it as a trade-off for framerate, so one kind of cancels out the other. It makes more sense to me to avoid upscaling and run a resolution the GPU can handle natively, getting the best image quality you can that way.

I.e. paying that kind of money for a GPU and then upscaling things seems strange to me. For lower-end GPUs upscaling can make more sense though, since framerates there drop below comfortable levels even at average resolutions.
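
For reference, here's a rough sketch of what upscaling actually renders internally. The per-axis scale factors below are the commonly cited ones for DLSS-style quality presets (my assumption for illustration, not something from this thread):

```python
# Rough sketch: internal render resolution for common upscaler presets.
# Scale factors are the commonly cited per-axis values for DLSS-style
# upscalers (assumed here for illustration, not measured).
PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
}

def internal_resolution(out_w, out_h, scale):
    """Return the approximate internal render resolution for a preset."""
    return round(out_w * scale), round(out_h * scale)

for name, scale in PRESETS.items():
    w, h = internal_resolution(3840, 2160, scale)
    print(f"4K output, {name}: renders at about {w}x{h}")
# Quality mode at 4K renders at roughly 2560x1440 internally, which is
# why "4K with upscaling" and "native 1440p" end up closer than they sound.
```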
 
Depends for me. I've played in 4K pretty much since it came to PC. In the beginning it took 3 graphics cards to get a decent framerate; now I can do it with 1. It's very expensive and definitely not necessary, I agree. I'm a bit of a nerd though and my only hobby is gaming, so for me it was worth it. Don't go 4K unless you can keep it up though, it sucks to have to go back -.- 4K is sadly just so much crisper than even 1440p (even with DLSS), and with a great 4K monitor it's even better. I'm fine with around 60 fps in most games I play (RPGs), but if I played shooters it would probably make more sense to go with 1440p or even 1080p. You're pretty CPU-bound at those resolutions with mega ultra graphics cards (4090) though, so it's always a tradeoff. Prices are way out of line and power creep is also quite concerning, especially the new 13900K -.- 300 W? Really? The 4090s can draw 600 W -.- that's almost 1000 W if you can load both up to 100%. Insane.
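
Just to put that combined draw in numbers, a back-of-the-envelope sketch using the wattages mentioned above; the "rest of system" allowance and the headroom factor are my own rough assumptions:

```python
# Back-of-the-envelope system power estimate using the figures above.
# The "rest of system" allowance and headroom factor are assumptions.
cpu_w = 300      # peak draw mentioned for a 13900K
gpu_w = 600      # worst-case draw mentioned for a 4090
rest_w = 100     # board, RAM, drives, fans (rough guess)

peak_w = cpu_w + gpu_w + rest_w
headroom = 1.3   # common rule of thumb so the PSU isn't running flat out

print(f"Estimated peak draw: {peak_w} W")
print(f"Suggested PSU size:  {round(peak_w * headroom, -1):.0f} W")
# -> roughly 1000 W peak and a ~1300 W PSU if you really could load
#    both CPU and GPU to 100% at the same time.
```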
 
Yeah, I'm also pretty comfortable with 1440p. Not sure I'd even want a monitor large enough to justify 4K. It would have to be pretty big. And while I do notice the difference between 1440p and 1080p, I still don't mind playing at 1080p, as it's not a huge difference for me personally.

The power consumption of certain new CPUs and GPUs is bothering me too. At this point I'd rather have less performance and low-to-medium power consumption. So far the Intel 10700 is enough for me, and I usually prefer not to push my 3060 Ti too hard. I'll probably stick with that setup for as long as it works and plays most of the games I enjoy.
 
Do you guys have a recommendation for a 1440p monitor? I'm looking for something a bit nicer - with higher-quality HDR, G-Sync and at least 144 Hz.

I've read reviews and stuff, just curious if anyone has one that has worked out particularly well for them. :)
 
I'm currently using an LG 27GP850 (27", 2560x1440). Very good colors (IPS), very good response time, 180 Hz max refresh rate, adaptive sync with LFC. HDR isn't something I care about yet, since it's not properly supported on Linux. But it has some very basic level of it (HDR10 / DisplayHDR 400).

https://www.lg.com/us/monitors/lg-27gp850-b-gaming-monitor#pdp_spec

Here is a good video about HDR btw and why it's hard to implement:

 
Tbh I have a 27" 4K display and I'm loving it. The pixel density is insane, and it's IPS, 144 Hz and HDR10 with 1000 nits. Expensive as hell though, and I don't think they even make it anymore since it was so expensive. I'm guessing it sold very badly; now they make them at like 42" with crap HDR instead, but the price is the same -.-
 
About power consumption, an interesting detail for the new Zen 4 AMD CPUs - you can run them in eco mode, reducing TDP from, say, 170 W to 105 W. The performance decrease is pretty small while power usage drops a lot. It's not a very well-known feature and it's something I'll probably be using once I get one of those.
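
For what it's worth, the actual socket power limit (PPT) on AMD is usually about 1.35x the advertised TDP, so the eco-mode drop is even bigger in absolute watts. A quick sketch; the 1.35 factor is the commonly reported relationship, so treat it as an assumption rather than an official spec:

```python
# Sketch of what Zen 4 eco mode does to the actual socket power limit.
# PPT ~= 1.35 * TDP is the commonly reported AM4/AM5 relationship
# (assumed here, not an official spec I can point to).
def ppt_from_tdp(tdp_w, factor=1.35):
    return tdp_w * factor

stock_tdp, eco_tdp = 170, 105
stock_ppt = ppt_from_tdp(stock_tdp)   # roughly 230 W
eco_ppt = ppt_from_tdp(eco_tdp)       # roughly 142 W

saving = 1 - eco_ppt / stock_ppt
print(f"Stock PPT ~{stock_ppt:.0f} W, eco PPT ~{eco_ppt:.0f} W "
      f"({saving:.0%} less socket power)")
```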
 
AMD's and Intel's TDP numbers are pure bullshit nowadays -.- if the CPU draws 330 W, your lowest TDP should be just that. Eco mode limits the power draw, but it seems we're pretty close to diminishing returns as it is. It's the same with the 4090 now; some people are saying you only lose about 5% performance by lowering the power target quite a bit. But the whole TDP thing has been bullshit for a few generations already. My 3900X is rated 105 W but it easily draws over 130 W even on auto settings.
 
Yeah, I have a 5900X now and it's also officially 105 W TDP. Not looking forward to 170 W. So that eco mode looks really useful in the new models.

And good point about diminishing returns. It seems like everyone is racing to pump up power usage for a small performance gain just to look better in benchmarks. Bad trend.
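
The diminishing returns follow straight from how dynamic power scales: roughly P ~ C * V^2 * f, and the voltage needed also rises with frequency, so the last few percent of clock speed cost disproportionate watts. A toy illustration; the voltage/frequency points below are invented purely to show the shape of the curve, not measured from any real CPU:

```python
# Toy illustration of why the last bit of clock speed is so expensive.
# Dynamic power scales roughly as P ~ C * V^2 * f, and V has to rise
# with f near the top of the curve. The V/f points below are invented
# purely to show the shape, not measured from any real chip.
points = [
    # (frequency GHz, core voltage V)
    (4.5, 1.10),
    (5.0, 1.20),
    (5.5, 1.35),
    (5.8, 1.45),
]

base_f, base_v = points[0]
base_p = base_v**2 * base_f
for f, v in points:
    rel_perf = f / base_f
    rel_power = (v**2 * f) / base_p
    print(f"{f:.1f} GHz: ~{(rel_perf - 1) * 100:4.0f}% more perf "
          f"for ~{(rel_power - 1) * 100:4.0f}% more power")
# The last few hundred MHz cost far more power than they return in clocks.
```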
 
The weirdest thing about it is that watts in don't even show up in AMD's TDP calculation. It's all temperature-based from the start. If you put 300 W in, you need to remove the heat those 300 W generate; thermodynamics is an unbreakable law. Rising temperatures are also a problem, I think: more heat = higher resistance = more voltage = higher power draw. This is why sub-zero overclocking exists - those low temps enable overclocks that just aren't possible at "normal" temps.
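
Yeah, the formula that's been reported for AMD's TDP is purely thermal: TDP = (tCase - tAmbient) / theta_ca, where theta_ca is the assumed thermal resistance of the cooler. A quick sketch; the example numbers are the ones that have floated around in third-party reporting for the 105 W parts, so treat them as approximate assumptions, not AMD documentation:

```python
# The TDP formula reported for recent Ryzen parts is thermal, not electrical:
#   TDP (W) = (tCase_max - tAmbient) / theta_ca
# where theta_ca is the assumed cooler thermal resistance in degC per watt.
# The example values are the commonly quoted ones for the 105 W class
# (approximate, from third-party reporting, not AMD documentation).
def amd_tdp(t_case_c, t_ambient_c, theta_ca_c_per_w):
    return (t_case_c - t_ambient_c) / theta_ca_c_per_w

print(f"{amd_tdp(61.8, 42.0, 0.189):.1f} W")  # ~104.8 W -> marketed as 105 W
# Note there is no electrical power anywhere in the formula, which is
# why a "105 W" chip can happily pull 130+ W at the socket.
```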
 
It certainly doesn't help that here in the US there is still a 25% tariff on things like GPUs manufactured in China. I was hoping that President Biden would drop that nonsense, but he hasn't. All it effectively does is add 25% to the price the consumer pays.

If AMD wants to compete with the 4090, which is actually considerably more efficient than the 3090, then they are going to have the exact same problem. Like Notserious80 noted, the laws of thermodynamics are unbreakable, at least with any current technology for the foreseeable future. Even if they only try to compete with the 4080 (320 W total board power), they are likely going to draw at least 300 W to do so. There's really no way around it, because while circuit design can give you some minor gains, most of the gains come from shrinking the process node, and we're about at the theoretical limit already.
 
The recent leaks seem to say it's around 450 watts too :) with more rasterization power but less RT and no tensor cores (no CUDA either, for those that need that). The 7950 XT is also supposedly competing with the 4090 Ti this time around, so we shall see; it's only rumors and leaks so far. Still, it's a troubling trend with higher and higher power draw when we're in an energy crisis and inflation is rampant. I'll wait until next year anyway; all the 4090s are already out of stock over here and the CPUs are just way too expensive -.- it's like 5000 dollars for both and it just seems like the wrong time to spend that.
 
Physics is unbreakable, but who said we need such an increase in performance at the cost of such insane cooling? I certainly don't see it as a good idea. Let them progress without worsening the power draw and cooling requirements, even if the performance increase won't look as good in their PR benchmarks.

One thing's for sure - I'm not going to buy such a ridiculously monstrous-sized GPU. Some more reasonable lower-tier model will do if needed.
 
One customer ordered a Ryzen 9 7950X with 128 GB DDR5 and an RTX 4090. I was lucky to find two RTX 4090s in our supplier's stock on the same day an article came out about scalpers selling 4090s for double to triple the price on eBay. I think the price for the card was good: about 40,000 CZK excluding VAT and our margin, which comes to about 50,000 CZK including 21% VAT and our 2.5% margin. 50,000 CZK is about 2,000 USD at today's rate.
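
The math on that works out roughly like this (just redoing the arithmetic from the numbers above; the exchange rate is my rough assumption):

```python
# Redoing the price arithmetic from the numbers above.
net_czk = 40_000      # card price excluding VAT and margin
vat = 0.21
margin = 0.025
czk_per_usd = 25      # rough exchange rate at the time (assumed)

gross_czk = net_czk * (1 + vat) * (1 + margin)
print(f"{gross_czk:,.0f} CZK ~= {gross_czk / czk_per_usd:,.0f} USD")
# -> about 49,600 CZK, i.e. roughly 2,000 USD at ~25 CZK per dollar.
```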
 
I kinda agree. I tend to watercool my cards anyway, so for me it's not that big of an issue; it makes the cards more expensive, but I gain so much by it that I don't mind. The price is the biggest thing for me -.-
 
I know almost nothing about electronics, but from what I know about electricity, distributing all that power through such a small connector sounds like a really bad idea... Pretty scary if you don't notice it quickly enough o_O

This is what Nvidia's 12VHPWR connector looks like from the inside. There's only a thin piece of metal that connects the bridge plate to the six pins. This could be a problem for the outer wires when the cable gets bent.

Usually, each of the six pins would be crimped to one wire, without soldering.
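
The numbers make it clear why there's so little margin: 600 W at 12 V is 50 A, split over only six current-carrying pins. A quick sketch; the per-pin rating below is the commonly quoted figure for these terminals, so treat it as approximate:

```python
# Why the 12VHPWR connector has so little margin: 600 W at 12 V over six pins.
power_w = 600
voltage_v = 12.0
pins = 6
pin_rating_a = 9.5   # commonly quoted per-pin rating (approximate/assumed)

total_a = power_w / voltage_v
per_pin_a = total_a / pins
print(f"Total current: {total_a:.1f} A, per pin: {per_pin_a:.2f} A "
      f"({per_pin_a / pin_rating_a:.0%} of the per-pin rating)")
# -> 50 A total, ~8.3 A per pin, ~88% of the rating even when all six pins
#    share the load evenly. If a bent cable lifts one pin, the rest
#    have to carry its share.
```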

[Images: close-ups of the 12VHPWR connector internals - screenshots from Paul's Hardware, in reference to Igor's Lab.]
 