GTX 1080. Should be enough for decently high quality visuals at 1080p60.
GTX 1080 here as well. Bought in October 2017, my first high-end GPU, and it brings me a lot of gaming joy. It's mind-blowing to see how it becomes more and more obsolete for some modern titles and doesn't always suffice for absolutely maximum settings.
Sounds all too familiar. If my 970 were to die -- heaven forbid! -- I'd be out of the gaming loop for quite a while.
Funnily enough, I bought my 970 (amazing card btw) around five years ago mostly for The Witcher 3, and now I'm torn between the 3070, RX 6800, and 1660 SU. Guess I'll wait for tests.
The latter's CPU and GPU system requirements, by the way, seem to be higher than Cyberpunk 2077's, which gives me lots and lots of hope for wonderful optimization. Yet Cyberpunk 2077 is apparently more RAM-heavy, requiring 12 GB. Prey (2017) required 16 GB and gave me occasional slightly-sub-60 FPS because of this, though it was almost always over 60 FPS.
Feeling actually pretty safe here, not gonna lie.
A long, long time ago (okay, 2 years ago), I thought I would build a computer that would be future-proof for Cyberpunk 2077. I had to have a good graphics card with real-time ray tracing, so I immediately bought an Nvidia 2080 for $1100. Obviously for this money I was surely future-proof. After trying it out on some ray-traced games, it was obvious it was not capable of running real-time ray tracing well. Then two years later they release a $700 card that smashes mine. It was my most expensive computer, and needless to say I spent over a year just playing World of Warcraft Classic on a $3000 machine. Maybe next time I'll be a little more patient with my purchase.
I did some testing with the 3DMark benchmarking tool, which has an RTX+DLSS test. At 4K with RTX and DLSS 2.0 it could only get 30+ fps; at 1440p it got 60+ fps. This was on an Nvidia 2080.
Two years ago, I bought a Radeon VII on the used market (Facebook) from a kid upgrading to a 2080 Ti.
I'm not a developer, let alone a developer who makes stuff for consoles, but performance issues (if the delay was indeed performance-related) do not necessarily mean that the game's requirements are too high for the hardware to handle. It may just as well be a software issue, where a single developer broke something in a way that is difficult to revert. I'd feel safe with my GTX 970 being over the minimum requirements...
However, the latest delays being all about not managing to get it to work on PS4/XBone systems is making me kinda worried.
Let's say they don't manage, cut their losses, and declare that anything below an Nvidia 2000-series card won't do. It's a possibility I'm not looking forward to.