C:\cp77\hardware_requirements.info

I suspect it will be like Watch Dogs: Legion, where the only two cards that can get you 4K ultra are the RTX 2080 Ti and the RTX 3090. I suspect we will be getting 30 FPS. The trailer was at 30 FPS, so I suspect that was on the 3090, yikes. I might have to lower some of the settings.

Also, all ASUS TUF cards have the smaller capacitors.

If so, I wonder if the 3080 is only 10% slower?

Also... wasn't the trailer 60 FPS?
 
MLCCs are the smaller ones, and they are better. The ASUS TUF was the only card that shipped with 6x10 MLCCs; the FE has 2x10. The big black caps are the SP-CAPs / POSCAPs. But it also depends on other components, like the actual quality of the chip, so if you got a card with 6 SP-CAPs, it doesn't mean the card will always crash. It's just more likely to crash, and you probably can't overclock it.

Edit: More recent benchmarks from several sources show that the difference between MLCCs and SP-CAPs is not that important, and cards can still crash despite having 6x10 MLCCs.
 

If that is the case, what happens with the Zotac and Gigabyte cards that don't have any MLCC capacitors?
 
The online stores are probably showing older photos of the cards. As far as I know, PC Partner (Zotac's parent company) has already said that they are reworking their cards.
 

Ah, ok.
I've only seen Gigabyte cards without MLCCs online so far.
 
2. If we're talking about input lag, then that is something you feel in your hands rather than see with your eyes. You press a button, then the sound is delayed, and then the visuals are delayed even further. You can't just stare at a screen and say that you don't see a difference between 120Hz and 240Hz. And it also makes a difference in non-competitive games. But it's true that the noticeable difference between 120Hz and 144Hz is much smaller than between 50Hz and 60Hz.

This has been tested on numerous occasions, and the difference between 120 and 240 Hz/FPS is almost indistinguishable in any form by pro gamers, let alone the average gamer.
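
For anyone who wants the raw numbers behind that, here's a quick back-of-the-envelope calculation (just arithmetic, not a claim about what anyone can actually perceive):

```python
# Frame interval in milliseconds at a given refresh rate: 1000 / Hz.
for hz in (50, 60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")

# 50 -> 60 Hz shaves 20.00 - 16.67 = 3.33 ms off every frame,
# while 120 -> 144 Hz only shaves 8.33 - 6.94 = 1.39 ms, which
# matches the point above about diminishing returns.
```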

3. Plenty of FPS would mean a stable 120 FPS on 120 Hz, and that actually requires more than 120 FPS on average. If a person wants to get the best out of those 120 Hz, I wouldn't call it a complete waste to spend more money on a better graphics card or CPU.

A good ballpark figure seems to be 140-150 if you want to ensure a smooth 120. Anything beyond that is indeed a waste of money, since the benefit is almost insubstantial and the cost significant. The worst possible bang for your buck; whimper for a grand, if you will.

Which may just be the cost difference between a solid 120 FPS / 1440p rig and an ePeen one.
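
To put a rough number on that 140-150 ballpark, here's a toy simulation of why average FPS has to sit above the refresh rate (purely illustrative; it assumes normally distributed frame times with a made-up 1.5 ms jitter, which real games only approximate):

```python
import random

def frames_within_budget(avg_fps: float, jitter_ms: float = 1.5,
                         budget_ms: float = 1000 / 120,
                         n: int = 100_000) -> float:
    """Fraction of simulated frames that finish within the 120 Hz budget,
    modelling frame time as a normal distribution around 1000 / avg_fps."""
    mean_ms = 1000 / avg_fps
    hits = sum(random.gauss(mean_ms, jitter_ms) <= budget_ms for _ in range(n))
    return hits / n

for fps in (120, 130, 140, 150):
    print(f"{fps} FPS avg -> {frames_within_budget(fps):.1%} of frames make 8.33 ms")
```

At exactly 120 FPS average, roughly half the frames miss the 8.33 ms budget; the hit rate only climbs toward "smooth" once the average sits comfortably above the refresh rate.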
 
No, the recent YouTube trailer from Nvidia for this game didn't have the option for 4K 60 FPS.
 
I have an i7-3770K with 8 GB RAM and am upgrading to an RTX 2060 or 2080. I won't be playing it on ultra, but does anyone think I will have any issues running it?
 
Judging by the released system requirements and your measured expectations, I think you'll be more than fine with that setup, unless you want to play it at 4K at 60+ FPS. What resolution and FPS are you aiming for, by the way?
 

I just want it to run smoothly on medium to high settings, with no need to overclock anything, so I don't cut short the life of my parts.
 

I believe you'll be alright then :ok: Of course, I'd advise you to upgrade your PC (e.g. buy more RAM), but if you're satisfied with your setup and/or don't have the money for upgrading, then there'll be no need to worry too much, especially after the game gets a few patches post-release.
 

Thanks for the feedback, and I can always just put another 8 gigs in it; I still have free slots. :cool:
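
If anyone wants to double-check their installed sticks and free slots before ordering more RAM, something like this works (one possible approach; it assumes Windows for the wmic query and the third-party psutil package):

```python
import subprocess

import psutil  # third-party: pip install psutil

# Total and available RAM as the OS sees it.
vm = psutil.virtual_memory()
print(f"Total: {vm.total / 2**30:.1f} GiB, available: {vm.available / 2**30:.1f} GiB")

# List the populated DIMM slots and their sizes (Windows only).
out = subprocess.run(["wmic", "memorychip", "get", "DeviceLocator,Capacity"],
                     capture_output=True, text=True)
print(out.stdout)
```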
 
Yes, the 3080 is 10-15% slower, but it doesn't have enough VRAM for 4K ultra.
Both will be able to run Cyberpunk 2077.
But the latest and greatest GPU (just one part of a computer that will play this game) alone is about the same size as your console, haha.
I can't wait to play this game on an RTX 3090.

 
I can link to an article like this one from TweakTown that did some VRAM testing at 4K on a number of games, and the highest number at 4K was 6.4 GB of VRAM (up to 8.4 GB of VRAM at supersampled 8K). In general, across many other sites, I see everyone basically quoting around 8 GB of VRAM for 4K... I have yet to find any sources claiming you need substantially more than that.

Obviously, not all games are created equal, but would you mind providing some sources that back up your claim of needing over 10 GB of VRAM to run CP2077 at 4K?
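
For what it's worth, anyone can measure this on their own card while a game is running. A minimal sketch using NVIDIA's NVML bindings (assumes an NVIDIA driver and pip install nvidia-ml-py; note this counts allocation across all processes, which is also what most overlay tools report):

```python
import pynvml  # third-party: pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # used/free/total, in bytes
print(f"VRAM used: {mem.used / 2**30:.2f} GiB of {mem.total / 2**30:.2f} GiB")
pynvml.nvmlShutdown()
```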
 
Watch Dogs: Legion requires 11 GB of VRAM per the developer's officially published requirements.

Cyberpunk 2077 will implement even more ray tracing features. I suspect it will be on par with Watch Dogs: Legion and require 11 GB or more of VRAM.

Just speculation, yes. But 10 GB won't be enough VRAM for 4K ultra in the latest games when, in less than a month, Watch Dogs: Legion comes out requiring 11 GB.
 
Well... in that case, I'm speculating that CP2077 will be better optimized than anything Ubi can release and will only need 9 GB of VRAM, because other games need less. :shrug:
 

The 3090 isn't optimized for gaming. It's a workstation GPU.
It actually is optimized for gaming. There's a chance that driver pathways won't even be available for productivity applications. Just because it's a very expensive card doesn't mean it's a “workstation” thing. It is rather a top-of-the-line card that was announced with the intention of artificially raising the prices of the 3080s in a market that basically has no competition from AMD and Intel. With its 15% more power, it is clearly made to make the 3080 look more affordable.
The EVGA 3090 pre-order is currently listed for 2200€ in the Netherlands. Absolutely insane...
 

The only place where it really shines is workstation workloads. The difference between 3080 and 3090 looks kinda lackluster in gaming.

But after looking at it, apparently it lacks the software for true workstation workloads. So basically it's a card without a justifiable use case.

God, I hope AMD gets their shit together with their drivers.
 