Which card are you using for Cyberpunk 2077?

Select the card series you will use.

  • Nvidia 30 series: 57 votes (18.3%)
  • Nvidia 20 series: 104 votes (33.3%)
  • Nvidia 10 series: 87 votes (27.9%)
  • AMD 6000 series: 4 votes (1.3%)
  • AMD 5000 series: 17 votes (5.4%)
  • AMD RX series: 15 votes (4.8%)
  • Other: 28 votes (9.0%)

  Total voters: 312
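
For anyone wanting to double-check the shares, here's a minimal Python sketch (my own, assuming nothing beyond the counts listed above) that recomputes each option's percentage:

    # Recompute each option's share as votes / total_voters * 100,
    # using the counts from the poll above.
    votes = {
        "Nvidia 30 series": 57,
        "Nvidia 20 series": 104,
        "Nvidia 10 series": 87,
        "AMD 6000 series": 4,
        "AMD 5000 series": 17,
        "AMD RX series": 15,
        "Other": 28,
    }
    total = sum(votes.values())  # 312, matching the "Total voters" figure
    for card, count in votes.items():
        print(f"{card}: {count} votes ({100 * count / total:.1f}%)")
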
GTX 1080. Should be enough for decently high quality visuals at 1080p60.


Rip.

Yeah, I 100 percent thought it was just going to be another "slight upgrade over last series". But the 3000 cards really blew them out of the water (for the same price), and now AMD is also showing up.
 
I feel bad for anyone who bought a 2080 Ti last year in anticipation of the April launch. The game getting delayed and then the RTX 30xx series launching would be a double whammy of disappointment.
 
I have a 1070. As long as the game looks and runs decently, I'm good. I would like to experience ray tracing at some point, though. Maybe on a future playthrough.
 
I feel bad for anyone who bought a 2080 Ti last year in anticipation of the April launch. The game getting delayed and then the RTX 30xx series launching would be a double whammy of disappointment.

Just curious, what card do you have?
 
Titan X (Pascal), roughly equivalent to a 1080 Ti.

I'm hoping it's enough to crank everything at 1080p and enjoy. For my tastes, I can do without ray tracing. I'll upgrade my GPU when I want to max things out at higher resolutions in the future.
 
GTX 1080. Should be enough for decently high quality visuals at 1080p60.


Rip.
GTX 1080 here as well. Bought in October 2017, my first high-end GPU, and it has brought me a lot of gaming joy. It's mind-blowing to see how it's becoming more and more obsolete for some modern titles and doesn't always suffice for absolute maximum settings.

That being said, my i5-3470 and 8 GB of not-so-high-end RAM may be the cause, but the absolute majority of games still run beyond 60 FPS - especially DOOM and DOOM Eternal.

The latter's CPU and GPU system requirements, by the way, seem to be higher than Cyberpunk 2077's, which gives me lots and lots of hope for wonderful optimization. Cyberpunk 2077 is apparently more RAM-heavy, though, requiring 12 GB; Prey (2017) required 16 GB and gave me occasional slightly-sub-60 FPS dips because of that, yet it was almost always over 60 FPS.

Feeling actually pretty safe here, not gonna lie.
 
Sounds all too familiar. If my 970 were to die -- heaven forbid! -- I'd be out of the gaming loop for quite a while.

Don't feel too bad. They've been starving us of a reasonably priced GPU for almost half a decade. If only the current 3000 series were in stock anywhere. :(

Another GTX 970 owner here. I'm surprised how long it has lasted. I'm likely to use it for CP77 for as long as the RTX 3070 drought lasts. Should it break tomorrow, I'll probably go with a 5700 XT and not bother with the RTX 3000 / AMD 6000 series.

This isn't going to be pretty...
 
Funnily enough, I bought my 970 (amazing card, btw) around five years ago, mostly for The Witcher 3, and now I'm torn between the 3070, RX 6800, and 1660 Super. Guess I'll wait for the benchmarks.
 
Funnily enough, I bought my 970 (amazing card, btw) around five years ago, mostly for The Witcher 3, and now I'm torn between the 3070, RX 6800, and 1660 Super. Guess I'll wait for the benchmarks.

Bought my GTX 970 in an MSI deal that had The Witcher 3 shipped with it! :D It was a Steam key, so I bought it on GOG later. Then I bought Blood and Wine on GOG... then I bought The Witcher 3 GOTY on GOG, just for the hell of it.

Not sure I'll do the same again for Cyberpunk...

And yeah, agreed! Amazing card.

The latter's CPU and GPU system requirements, by the way, seem to be higher than Cyberpunk 2077's, which gives me lots and lots of hope for wonderful optimization. Cyberpunk 2077 is apparently more RAM-heavy, though, requiring 12 GB; Prey (2017) required 16 GB and gave me occasional slightly-sub-60 FPS dips because of that, yet it was almost always over 60 FPS.

Feeling actually pretty safe here, not gonna lie.

I'd feel safe with my GTX 970 being over the minimum requirements...

However, the latest delays being all about not managing to get the game to work on PS4/XBone systems are making me kinda worried.

Let's say they don't manage, cut their losses, and declare that anything below an Nvidia 2000-series card won't do. It's a possibility I'm not looking forward to.
 
A long, long time ago (two years ago), I thought I would build a computer that would be future-proof for Cyberpunk 2077. I had to have a good graphics card with real-time ray tracing, so I immediately bought an Nvidia 2080 for $1,100. Obviously, for that money, I was surely future-proof. After trying it out on some ray-traced games, it was obvious it was not capable of running real-time ray tracing. Then, two years forward, they release a $700 card that smashes mine. It was my most expensive computer, and needless to say, I spent over a year just playing World of Warcraft Classic on a $3,000 machine. Maybe next time I'll be a little more patient with my purchase.

I did some testing with the 3DMark benchmarking tool, which has an RTX + DLSS test. At 4K with RTX and DLSS 2.0 it could only get 30+ FPS; at 1440p it got 60+ FPS. This was on an Nvidia 2080.
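
To put those numbers in frame-time terms, a quick sketch (plain arithmetic on the FPS figures above, not actual 3DMark output):

    # Convert the benchmark FPS figures into per-frame time budgets:
    # 30 FPS leaves ~33.3 ms per frame, 60 FPS leaves ~16.7 ms.
    for label, fps in [("4K, RTX + DLSS 2.0", 30), ("1440p, RTX + DLSS 2.0", 60)]:
        print(f"{label}: {fps} FPS -> {1000 / fps:.1f} ms per frame")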
 
A long, long time ago (two years ago), I thought I would build a computer that would be future-proof for Cyberpunk 2077. I had to have a good graphics card with real-time ray tracing, so I immediately bought an Nvidia 2080 for $1,100. Obviously, for that money, I was surely future-proof. After trying it out on some ray-traced games, it was obvious it was not capable of running real-time ray tracing. Then, two years forward, they release a $700 card that smashes mine. It was my most expensive computer, and needless to say, I spent over a year just playing World of Warcraft Classic on a $3,000 machine. Maybe next time I'll be a little more patient with my purchase.

I did some testing with the 3DMark benchmarking tool, which has an RTX + DLSS test. At 4K with RTX and DLSS 2.0 it could only get 30+ FPS; at 1440p it got 60+ FPS. This was on an Nvidia 2080.
Two years ago, I bought a Radeon VII on the used market (Facebook) from a kid upgrading to a 2080 Ti.
For €300.
:D

You want some salt with that, Thomass79?
 
If everything goes to plan, the Gigabyte Aorus 3080 Xtreme will run it; otherwise, my trusty 1080 Ti will be at the ready.

Four weeks to go, and here's hoping some cards find their way to European retailers.
 
I'd feel safe with my GTX 970 being over the minimum requirements...

However, the latest delays being all about not managing to get the game to work on PS4/XBone systems are making me kinda worried.

Let's say they don't manage, cut their losses, and declare that anything below an Nvidia 2000-series card won't do. It's a possibility I'm not looking forward to.
I'm not a developer, let alone one who makes stuff for consoles, but performance issues (if the delay was indeed performance-related) do not necessarily mean that the game's requirements are too high for the hardware to handle. It may just as well be a software issue where a single developer messed something up in a way that is difficult to revert.

As in, there is probably at least one piece of code somewhere inside the current build that is causing the trouble on consoles, yet is important and has to be reworked properly rather than scrapped entirely. Moreover, simply removing something may often lead to other issues; it's really a snowball of things going wrong, because everything is interlaced and tangled pretty hard, both in terms of programming and marketing or whatever else.

Hopefully the newest generation of consoles is more reliable and more consistent with PC development; that'll help us all see games drop faster.
 