Hardware Thread - General.

I'm somewhat wary of the whole ray-tracing thing, and I'd say people get too hung up over it. It's clearly Nvidia's latest buzzword to get people hyped, and I'm not sure we should give it anywhere near the value Nvidia would like us to attribute to it.

There are maybe four games currently on the market that use ray tracing, and all it does is make the fall of light SLIGHTLY prettier than it normally would've been. You'd have to stand still and alt-tab out of your game to compare screenshots.

Sure, if you have a $2000+ rig that you upgrade every other week, I can understand you might be worried about what your ray tracing might do. At that point you're more of a tech enthusiast than a gamer. Literally everyone else should just focus on enjoying their games instead of getting distracted by Nvidia's latest carrot-on-a-stick. :p

You'll be alright and SigilFay just laid out the most important bit.

That is the goal with the marketing. :sneaky:

Presenting the RTX line! Presenting the RTX 2080! Presenting the RTX 2080 ti! Presenting the RTX Limited 2090!

Yaaawwwnnn...

My philosophy remains the same. Choose the best of the last-gen tech you can afford, keep expectations real, expect problems no matter what, and enjoy the game.

xx60 = budget line. Great performance for present titles. Some minor compromises will be needed for especially demanding titles. It will show its limits in about a year.

xx70 = solid standard. Expect really good performance for ~2-3 years.

xx80 = screaming performance and future-proofing. Quite expensive, but it should give excellent performance for ~3-5 years.

A "ti" version is always worth it if I can afford it. It's literally maximized parts and performance for that model of card. That's worth the extra bucks.

I think ray tracing will wind up being a lovely addition, but also a very processing-intensive one early on (until the hardware advances). Personally, as lovely as the pretty lighting will be, I'd focus more on 4K performance if I had to choose. Getting rid of anti-aliasing hogging 20%-30% of my FPS is far more important to me than ensuring every pixel is reflecting the right tint of light.
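To put that 20%-30% figure in rough perspective, here's the arithmetic with made-up numbers (reading "hogging X%" as the share of the no-AA frame rate that's lost):

```python
# Back-of-the-envelope: FPS regained by turning AA off, treating the
# "AA costs X% of my FPS" figure as the share of the no-AA frame rate lost.
# All numbers here are made up for illustration.
def fps_without_aa(fps_with_aa, aa_cost_fraction):
    return fps_with_aa / (1.0 - aa_cost_fraction)

for cost in (0.20, 0.30):
    print(f"60 FPS with AA costing {cost:.0%} -> ~{fps_without_aa(60, cost):.0f} FPS without")
# 60 FPS with AA costing 20% -> ~75 FPS without
# 60 FPS with AA costing 30% -> ~86 FPS without
```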

We're in a spot with graphics right now where the new techniques are likely to be very expensive, performance-wise, for rather subtle returns. The only thing I'm super-excitedly-eagerly-hyped for is 4K resolutions. Performance is going to become sooo smooth once that's standard.
 

When I can play games at 120+ FPS at 4K, I'll be on board. Until then, I'll gladly stick with 24" 1080p. But I can understand why people prefer visual fidelity.

Personally, I find it remarkably difficult to immerse myself in a game that runs sub-60, visuals aside.

And Witcher 3 manages to look drop-dead gorgeous on even medium settings (high textures, though). I'm confident the same will be true of Cyberpunk 2077.
 
Personally, I find it remarkably difficult to immerse myself in a game that runs sub-60, visuals aside.

My current rig is what you could consider a dying specimen: a six-year-old AMD FX-8350 on an equally old motherboard, with two sticks of DDR3 RAM that are more than a decade old. The newest component is the GTX 970, which I bought when Witcher 3 came out.

The USB ports often fail, causing me to lose sound, keyboard, or mouse on occasion, and just three days ago I was forced to remove one of the RAM sticks because the system refused to boot with it installed. I *hope* that also rectifies the random black-screen reboots I've been having. There are unexplained dips in performance in newer games, causing either a four-second freeze or FPS drops into the 20s for a minute until the system shakes it off.

You eventually get used to the sub-60 performance, but by god I could use the upgrade. It's just not worth fixing whatever ails the old boy.


Because next to that geriatric system stands this:




*Pokes CDPR*

What do you want? Which GPU/CPU do you want me to put in there? Your altar to Cyberpunk 2077 stands ready to receive you! Do you want me to sacrifice a chicken in your name? I *will* sacrifice a chicken in your name!


I'd take a picture of the old rig too, but I'm afraid that the sudden flash of light would be enough to finally put it out of its misery. :(
Hang in there, ol' timer! Daddy needs you! Hang on long enough for CDPR to give us the system specs at E3. Your watch will have finally ended...
 

The random frame dips could be a result of CPU bottlenecking or thermal throttling. Perhaps snag RivaTuner Statistics Server + MSI Afterburner and turn on the in-game performance overlay so you can see temperature and usage.
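If an overlay isn't your thing, a tiny script polling nvidia-smi (which ships with the Nvidia driver) does the same job. This is only a rough sketch; the polling interval and queried fields are arbitrary choices:

```python
# Minimal GPU temperature/usage logger using nvidia-smi.
# Just a sketch for spotting thermal throttling during those dips.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=temperature.gpu,utilization.gpu,clocks.sm",
    "--format=csv,noheader",
]

while True:
    print(subprocess.check_output(QUERY, text=True).strip())
    time.sleep(2)  # watch for temps creeping high while the core clock drops
```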

Anyway, I know what it's like to have a dying system. I used to have one with very similar specs to yours. I have a beefy rig now and I enjoy high framerates, but that's not because I'm rich (on the contrary). It took a lot of saving up, and a lot of priority shifting (I don't buy cars, TVs, or even half-decent mattresses).

I could give you some input on the hardware side if you'd like, if I knew what your budget was. If you'd prefer to wait for official specs, I understand that as well.
 

Cheers mate! Well, this is the hardware thread. ^^

Don't worry, my budget is whatever CDPR tells me my budget is. I've saved up more than enough. I'll meet whatever the recommended specs are and that'll be it. I was thinking/hoping for a GTX 1660 Ti and a Ryzen 5 2600X, but I'd also have to consider the new Ryzen 3000 chips, whose reveal won't be long in coming. Maybe hold out for those instead.

As shown in the picture, these are the newly bought components I already have for the new rig:
A NZXT H400 case
A B450M Mortar motherboard (which should accept both Ryzen 2000 and Ryzen 3000 chips after a small BIOS update)
2x8 GB Vengeance LPX 3000MHz DDR4 RAM
A Mugen 5 PCGH CPU cooler.
A 240 GB Kingston SSD

And it'll be my first time attempting to overclock.

I can salvage the 600-watt BeQuiet PSU, a 1 TB HDD, and the GTX 970 from the old rig to carry me over should the old bird die, but I'd be without a CPU. For now, I hope the old one lasts long enough for me to hold out and make an informed decision once CDPR reveals Cyberpunk's system requirements and/or AMD releases the Ryzen 3000 series.
If the old rig does die ahead of the reveal, I'll go for the Ryzen 5 1600X, then upgrade to a better 3000-series chip should it not be enough for Cyberpunk.

My only current headache is whether or not I've bought the right RAM. It says it's a CMK16GX4M2B3000C15 V5.22, while the motherboard's website doesn't list THAT PARTICULAR version as being supported. Hope it won't matter... and I'll give MSI a call tomorrow morning to check. I still have the warranty.
 
Should be no issue for a 2060. What I foresee here, though, is that as a "budget" model of the RTX line, the ray-tracing functionality is probably going to get outclassed fast, and utilizing it in demanding titles will wind up costing significant performance. It should hold its own for a year or two, I imagine.

Yeah, I made the purchase fully aware that it's not an ideal card for RT, only providing a very basic level of RT performance. The RT demo of Quake really impressed me, but I'm OK with holding off on it for now. In a few years' time, I'll be purchasing my "real" RT card, and I'm sure the tech will have matured nicely by then.

Personally, as lovely as the pretty lighting will be, I'd focus more on 4K performance if I had to choose. Getting rid of anti-aliasing hogging 20%-30% of my FPS is far more important to me than ensuring every pixel is reflecting the right tint of light.

This is where DLSS comes in, which is the other piece of tech exclusive to the RTX line. Pretty clever of Nvidia to have more than one type of carrot ready. Sadly, for me, DLSS is entirely focused on 4K atm, and is almost non-existent at 1080p.
 
Yeah, I made the purchase fully aware that it's not an ideal card for RT, only providing a very basic level of RT performance. The RT demo of Quake really impressed me, but I'm OK with holding off on it for now. In a few years' time, I'll be purchasing my "real" RT card, and I'm sure the tech will have matured nicely by then.

Probably. Plus, "tricks" will invariably be figured out over time to speed up the processing for it. Besides, building with the intention of upgrading is one way to go. I've even turned a small profit on that sometimes by cannibalizing used parts I was done with and building complete systems I would sell for a few hundred bucks. You won't get rich, but it's better than letting the parts gather dust in a closet.


This is where DLSS comes in, which is the other piece of tech exclusive to the RTX line. Pretty clever of Nvidia to have more than one type of carrot ready. Sadly, for me, DLSS is entirely focused on 4K atm, and is almost non-existent at 1080p.

That's probably going to wind up being transitional tech...as it seems to be primarily interested in mitigating the effect of things like AA and resolution scaling when the framerate starts to dip. Such processes should not be necessary when the actual hardware is capable of handling the raw flops and has enough / fast enough RAM to get those textures in place, as the data values involved will be much, much larger. Still, the DLSS idea of doing as much of the work as possible before rendering is the right one!

However, true 4K hardware will make things like AA completely obsolete. No human eye would ever be able to see the jaggies as the pixels of such displays will be too physically small. That means a lot more polygons on screen and increased FPS. Best of both worlds. But it will take a lot of processing power on the video card's end. Right now, things are still built with all sorts of parts being dedicated to post-process image "clean-up".
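For anyone who wants to sanity-check that, here's a rough pixels-per-degree estimate. The panel size and viewing distance below are just example values, not anyone's actual setup:

```python
# Rough angular-resolution check; ~60 px/degree is roughly the limit of 20/20
# vision (about one pixel per arcminute).
import math

def pixels_per_degree(diag_in, res_w, res_h, distance_in):
    ppi = math.hypot(res_w, res_h) / diag_in            # pixels per inch
    px_size = 1.0 / ppi                                  # inches per pixel
    deg_per_px = math.degrees(2 * math.atan(px_size / (2 * distance_in)))
    return 1.0 / deg_per_px

# 27" panel viewed from ~28 inches away (example numbers only)
print(round(pixels_per_degree(27, 1920, 1080, 28)))  # ~40 -> jaggies still visible
print(round(pixels_per_degree(27, 3840, 2160, 28)))  # ~80 -> past 20/20 acuity
```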

Soon. Soon enough. (Then, it will be in the past, and young people will be scoffing at true 4K rendering as "old-school" and "so bad". :p)
 
I actually like no AA. I know, I'm crazy. But it makes the game feel crisper somehow. AA makes things too blurry.
 
I actually like no AA. I know, I'm crazy. But it makes the game feel crisper somehow. AA makes things too blurry.

Exactly. It's a bit of a compromise. SMAA is pretty crisp, and it cleans up most scenes in a game, especially for extremely long draw distances...but it's extremely performance heavy.

TAA is just horrible. Like, hey! -- know how to ensure there are no jaggies??? -- Blur every single pixel on the screen so there's not one hard edge anywhere!!! (So, naturally, that's become the industry standard lately.)

No AA is the crispest image one can get, since it's the raw geometry drawn straight to the screen.

FXAA has been my preference for a while. It's a bit fuzzy, but it's a nice blend, as it hits all pixels evenly across the screen without totally destroying jaggies. So, you get a sort of "free" mip mapping for textures, too. It's also extremely process-friendly. (Although, I can easily see why people don't like it, as it can make depth seem a bit "flat".)
 
I believe they said it's already a next-gen game, but there are options for older comps.
And what does "next gen" refer to in this case? Or did they mean consoles?

I think I'm going to aim for an RTX 2060 since my budget is not as great as it once was, but my current GTX 980 Ti has a fan that has started rattling. I figure I may have to upgrade before CP2077 comes out. I don't care about ultra-cool graphics or somesuch, I just want to be able to play at non-terrible FPS where I can enjoy the gameplay. And I figure RTX is, one way or another, going to be the next "leap".
 
And what does "next gen" refer to in this case? Or did they mean consoles?

Yeah. Meaning the game is being built for the upcoming versions of the Xbox and PlayStation.


I think I'm going to aim for an RTX 2060 since my budget is not as great as it once was, but my current GTX 980 Ti has a fan that has started rattling.

Honestly, aside from the ray-tracing option, going from a 980 Ti to a 2060 would probably not offer much performance gain. Even in benchmark testing, the 2060 offers mostly minor gains...and the variance in real-world applications is usually going to be much tighter than benchmark results. In a few cases, the 980 Ti actually performs better than the 2060.

I'd argue that in order to get an undeniable gain in performance, one would need to grab a 1080 Ti or RTX 2080. (Just compare the 980 Ti to a 1080 Ti on that same site to get a ballpark of the difference.)
 
I plan to build a brand new gaming PC for this game, as I did for Crysis in 2007. If you plan on upgrading, put off the video card until the game is out at retail and folks on the [H]ardOCP forums, AnandTech, Tom's, ExtremeSystems, GamersNexus -- i.e., real gamers -- have had it in hand for a few days, to get a much truer picture of what this iteration of the REDengine needs to stretch its legs. I am guessing THIS title will make good use of 6+ cores + SMT (as it ran on an 8700K + 32GB of DDR4 RAM / NVMe / 1080 Ti) and will run great on 16GB, but WILL take advantage of 32GB if on offer... I have all my parts picked except the GPU. Quality RAM is cheap again: you can get 2x16GB or 4x8GB of Corsair, G.Skill, Crucial, or HyperX DDR4-3000 for 195.00 Canadian, delivered, on Amazon.
 
SMAA is pretty crisp,

TAA is just horrible,

No AA is the crispest image

FXAA has been my preference for a while.

Wish I could make heads or tails of any of those terms. I've fallen off the bandwagon on them for a decade or so.

I'm going to look up a guide when CP2077 launches.
 
Holy heck! Back when I built my last rig some six years ago, your choices for a monitor were easy: 1920 x 1080 at 60 Hz with decent contrast. Done! I was so focused on the case I almost forgot the bloody monitor.

Now you've got your TN, IPS, VA, FreeSync, G-Sync, 60-240 Hz, millisecond response times, QHD, 4K, and backlighting.

I'm drowning!


Which would you say is best for a game like Cyberpunk? Or, to a similar extent, Monster Hunter World, Witcher 3, and the latest management games like Frostpunk, Satisfactory, and Anno 1800? I just want to get immersed in different worlds, with only light shooting/action...

I'm now considering the MSI MAG27CQ. It's a VA panel with a 144 Hz refresh rate, a 1 ms response time, and 2560 x 1440 at 27 inches. I'll pick a GTX 1660 Ti to go with it, but I'll probably upgrade to a Navi 20 AMD GPU later on.

https://www.msi.com/Monitor/Optix-MAG27CQ

Round about 400 dollars/euros is about all I'm willing to spend on a monitor, so I'm not going 4K.

As far as I can tell, a TN screen would've been better for straight-up competitive shooters, which I don't play, whereas the MSI MAG27CQ is better for getting immersed, with better-quality colors but some ghosting if the action gets too fast. Or are the 144 Hz refresh rate and the 1 ms response time total overkill for my gaming habits, and should I go for something more modest?
 
Well, my preference has always been IPS monitors because of the deep, rich color range. However, they tend to have lower refresh rates and/or be much more expensive at higher refresh rates. IPS monitors aren't really made for gaming in particular.

Modern games are taking active advantage of the higher Hz, so Gsync / Freesync is a worthy consideration. Play a shooter on a 60 Hz monitor, then play it on a 120 Hz one, and you'll start to feel a difference. They will almost undeniably improve the experience in games like Rainbow Six Siege, Apex Legends, or Counter-Strike: GO (where tiny, precise mouse movements can make a huge difference). While games like Monster Hunter World, Sekiro, or TW3 will not really see any gameplay improvements, you'll likely notice that they feel smoother and slicker.

Regardless, I would not recommend pouring money into a 4K monitor, as true 4K support is still probably a ways off. I think most people would be far happier with a higher-tier 1080p or 1440p monitor in practice. (If you have the money to burn, then there's certainly no problem with 4K monitors...it's just that there's not much payoff for all that money, presently. [From a strictly gaming perspective.]) In general, scaling the resolution is probably the single most impactful thing that can be done to affect a game's performance. Games today are not really designed to be run in 4K, so FPS in demanding games will likely plummet. Smoothing things out will require either turning down graphics options...or scaling back the resolution...:think:.
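To put some numbers on why resolution is such a big lever, here are the raw pixel counts (just the pixel math, not benchmark figures):

```python
# Raw pixels per frame -- the main reason resolution scaling hits FPS so hard.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels per frame ({w * h / base:.2f}x 1080p)")
# 1080p: 2,073,600 (1.00x) | 1440p: 3,686,400 (1.78x) | 4K: 8,294,400 (4.00x)
```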

Despite my own preferences, I'd say the solution most people would be satisfied with is:

Gsync or Freesync
32-40 inches
Aspect Ratio -- 16:9
Resolution -- 1080p-1440p
Refresh Rate -- 120-144 Hz
 
Minus a few inches, those general preferences match the MSI MAG27CQ:

Freesync
27 inches
16:9
1440 p
144 Hz

Now I'm just left wondering whether I'm going overboard on the 144 Hz refresh rate and the 1 ms response time for a game like Cyberpunk 2077.
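For reference, here are the frame-time budgets involved (plain arithmetic, nothing Cyberpunk-specific):

```python
# Frame-time budget per refresh rate -- handy for judging whether a 1 ms panel
# is overkill.
for hz in (60, 120, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms | 120 Hz -> 8.3 ms | 144 Hz -> 6.9 ms
# A 1 ms response time fits comfortably inside even the 144 Hz budget.
```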
 
Minus a few inches, those general preferences match the MSI MAG27CQ

Like Sigil says, I'd recommend 30+ inches. 27" is too small; go 32" at 16:9.

16:9 is compatible with nearly everything. 21:9 lacks height (I wish 2:1 were the standard). Also, a 32" 16:9 has a larger screen area than a 34" 21:9.
A 32" 1440p panel is the exact same DPI as a 24" at 1080p (~92 DPI), the de facto standard for the myriad of software out there, so zero scaling issues: you leave scaling at 100% and everything works.

LG 32GK650F

I should say, if your desk is really shallow, a 32" may not be good. I'd upgrade the desk to one 900mm deep; look at office desks, e.g. 1500 x 900mm or 1800 x 900mm.
Get a good chair while you're at it, if you haven't already. Not a crappy 'gaming' chair or a Chinese eBay knockoff -- a good-quality, comfy office chair. Do not get fake PU leather; go cloth or mesh (or real leather).
 
Hah! The review of the LG 32GK650F just popped up in my YouTube subscription feed and I was already giving it serious consideration, so thanks for weighing in, Triffid.

I've already got a decent IKEA desk and the IKEA Markus office chair (upholstered with cloth, like this, sans fake leather; the setup isn't mine, but I have the same chair + desk). I'm happy with those, but thanks for the advice. I can tell you're passionate. :D

Yep, Hardware Unboxed are great for no-nonsense info. Just them and Gamers Nexus and you're 99% covered.
Good chair.
The desk might be too shallow, but that setup uses two monitors and is going to be wider than the 32", so if he can do it, you should be fine.
(Yep, passionate and a tad pedantic, and I want others to enjoy the goodness.)
 
SYSTEM: i7-7700, 3.6 GHz | GTX 1070, 8GB VRAM | 16GB RAM

As long as I get good performance on Very High settings at 1080p, I'll be very happy. I can't imagine that this game will be much more demanding than Deus Ex: Mankind Divided, where I get good performance on Very High settings at 1080p. I will need to upgrade my 60 Hz monitor, though.

We have to remember that Cyberpunk 2077 started development back in 2013/14. I think development on Deus Ex: Mankind Divided started only one year prior to that, back in 2012/13. Three generations of Nvidia graphics cards have been released since then: the GTX 900 series dropped back in 2014, the 1000 series was released in 2016, and of course the 2000 series arrived in late 2018.

Hopefully Cyberpunk 2077 won't be much more demanding than some of the AAA titles that were released back in 2016/17.
 