Can CDPR Devs share E3 2019 System Specs?

Like others have said, it would have been the best machine you could game on at the time.
OP - when you say "not overkill", there's nothing that beats the list below; it's the current ultimate:

9900K
2080 Ti
32 GB RAM (even if 16 GB will be fine for the released game)
Fast SSD.

It played at 1080p with frame drops; not sure of the FPS. The E3 2018 demo on a 1080 Ti was locked @ 30 FPS.

Anyone wanting global ray tracing at 1440p had better hope the 3080 Ti is released by April.
Anyone wanting a minimum of 60 FPS @ max settings without ray tracing had better revise what they think will be required.
4K people, I wish you the best of luck. (There are reasons it's not recommended.)
 
That's correct, the demo was shown on a PC. For presentations like that we always use the most powerful hardware on the market, so the specs of the presentation PC are in no way an indication of any hardware requirements.

During E3 we used:
CPU: Intel i7-8700K @ 3.70 GHz
MB: ASUS ROG STRIX Z370-I GAMING
RAM: G.Skill Ripjaws V, 2x16GB, 3000MHz, CL15
GPU: Titan RTX
SSD: Samsung 960 Pro 512 GB M.2 PCIe
PSU: Corsair SF600 600W
 
That's correct, the demo was shown on a PC. For presentations like that we always use the most powerful hardware on the market, so the specs of the presentation PC are in no way an indication of any hardware requirements.

Regardless, thanks for stopping by and sharing.

With a year yet to go, it might get optimized for architecture that isn't even out yet. Ray Tracing and Zen 2 have barely had their potential tapped or refined.

Still, that i7-8700K is already lower than many would have feared. The lowest-tier Ryzen 3000 (the 3600) is already rumored to keep up with it, and the 3600 is "only" a $200 CPU.
 
That's correct, the demo was shown on a PC. For presentations like that we always use the most powerful hardware on the market, so the specs of the presentation PC are in no way an indication of any hardware requirements.

During E3 we used:
CPU: Intel i7-8700K @ 3.70 GHz
MB: ASUS ROG STRIX Z370-I GAMING
RAM: G.Skill Ripjaws V, 2x16GB, 3000MHz, CL15
GPU: Titan RTX
SSD: Samsung 960 Pro 512 GB M.2 PCIe
PSU: Corsair SF600 600W

Thanks Vattier! For where the game is right now, that is some impressive performance given the hardware. Ray tracing is still in its infancy, so if you were running the demo with it enabled and getting solid performance, that speaks volumes. It's also good to see it still running on a 6-core 8700K at stock 3.7 GHz, unchanged from last year. That shows great optimization at this stage. I thought that with the extra features AND ray tracing an 8-core with high IPC would be absolutely required, and I'm very pleased to see that it isn't.

Once again I tip my hat to your team, I am very impressed :)
 
That's correct, the demo was shown on a PC. For presentations like that we always use the most powerful hardware on the market, so the specs of the presentation PC are in no way an indication of any hardware requirements.

During E3 we used:
CPU: Intel i7-8700K @ 3.70 GHz
MB: ASUS ROG STRIX Z370-I GAMING
RAM: G.Skill Ripjaws V, 2x16GB, 3000MHz, CL15
GPU: Titan RTX
SSD: Samsung 960 Pro 512 GB M.2 PCIe
PSU: Corsair SF600 600W

Dang now that's some juice!
I'm sitting on an i9-9900K, a 2 TB SSD, and 64 GB of RAM here; hope the 2080 (which I got earlier this year) remains a top-tier option when the game releases lol
 
That's correct, the demo was shown on a PC. For presentations like that we always use the most powerful hardware on the market, so the specs of the presentation PC are in no way an indication of any hardware requirements.

During E3 we used:
CPU: Intel i7-8700K @ 3.70 GHz
MB: ASUS ROG STRIX Z370-I GAMING
RAM: G.Skill Ripjaws V, 2x16GB, 3000MHz, CL15
GPU: Titan RTX
SSD: Samsung 960 Pro 512 GB M.2 PCIe
PSU: Corsair SF600 600W

Vattier, could you tell us what the rough FPS was? Was it locked at 30?
cheers
 
@ReeseNE
CPU and GPU might be reasonable, but RAM is overkill.

At this point I'd like to point out that the amount of RAM, VRAM, and even disk space that games demand is highly overrated.
It's no wonder, since much of the "AAA" industry doesn't care about optimization at all, which leads to high requirements for those games.
CDPR did a very good job with The Witcher 3 in this regard: the game was huge and the engine looked fantastic, yet it required less than half the specs of many other games that looked worse and had less content than The Witcher 3.

Optimization will probably be the last stage of development for CP2077, a few months before release. If CDPR does it right, the game itself will already be done, and they'll spend that time on optimization, bug-fixing, etc. to get the most out of the engine before launch; that's also why we'll most likely only see system requirements shortly before the release date.
Well, there was that instance of Nvidia HairWorks annihilating your performance if you had an AMD graphics card? It roughly cut it in half.
 
Well, there was that instance of Nvidia HairWorks annihilating your performance if you had an AMD graphics card? It roughly cut it in half.
https://www.youtube.com/watch?v=R-6cFURV5qg

I'm not super-deep into hardware, but I feel safe to say that's not because of bad optimization or RAM shortage.

The Nvidia card simply has dedicated software/hardware that lets it handle the hair physics much better. Enabling HairWorks on an Nvidia card allowed The Witcher to run more or less normally. Like pouring beef into a meat grinder: it comes out the way you'd expect, because the native ability to handle the task is there.

It's Nvidia's gimmick, at AMD's expense: Nvidia cards are designed to handle it.

Use an AMD graphics card and it's like pouring beef into a slushy machine. It comes out minced, but the poor device struggles to do a meat grinder's job. You weren't supposed to confront an AMD card with the game's request to render Nvidia hair physics: it will try to use its existing abilities, but lacking Nvidia's native capability, its performance tanks as it struggles through that rendering at the cost of everything else. There's no optimizing that.
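Not that it helps after the fact, but the usual safeguard is just a capability gate: only turn on the vendor-specific path on hardware known to handle it, and fall back to standard hair otherwise. A minimal sketch of that idea (all names hypothetical, not actual Witcher 3 or HairWorks code):

```python
# Minimal sketch of a capability gate for a vendor-specific effect.
# Everything here is hypothetical; this is not Witcher 3 / HairWorks code.

def pick_hair_path(vendor_lib_available: bool, fast_tessellation: bool) -> str:
    """Only use the heavy tessellation-based hair when the GPU is known to cope."""
    if vendor_lib_available and fast_tessellation:
        return "vendor_accelerated_hair"  # HairWorks-style path
    return "standard_hair"                # cheaper mesh/alpha-card fallback

# A card without the vendor path simply keeps the standard hair:
print(pick_hair_path(vendor_lib_available=False, fast_tessellation=False))
```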


I expect it'll be much the same story for Ray Tracing.
An RTX 2060 barely handles it. Neither Vega nor any of the Navi cards is really cut out for it.

AMD has promised that the next-gen consoles will have ray tracing on their AMD hardware, but AMD is going to have to figure something out to match an Nvidia RTX card. I don't think they'll manage to brute-force their way through. Maybe they'll develop their own equivalent effect.
 
I'm going to need an upgrade :oops:

Just remember that CDPR has used the best hardware available to present that demo and still has 9 months to go before launch. ;) That list is not representative of the hardware you're eventually gonna need. If you can hold off upgrading, wait until 2020 and see if they'll update the list, or what the general recommendations across gaming will be.

For people like me who are still running on 6 year-old hardware though, yeah, we'll definitely need an upgrade.
 
For people like me who are still running on 6 year-old hardware though, yeah, we'll definitely need an upgrade.
6 years old is the right number; that's about how old my rig is.
I was able to play The Witcher 3 on high settings with good frames, so if that's any indication I'm not too fussed about my specs, but still... 6 years is a long time...
 
They used a Titan RTX GPU at E3.
Here are the full specs of the machine they used:
https://www.guru3d.com/news-story/cyberpunk-2077-demo-was-nvidia-titan-rtx.html

Not too overkill overall, to be honest, except the GPU and RAM of course. But we can assume there has been little to no optimization done yet.

Hahaha! We know. Where do you think that site got its information from? ;) Scroll up about 10 posts from yours.

That's correct, the demo was shown on a PC. For presentations like that we always use the most powerful hardware on the market, so the specs of the presentation PC are in no way an indication of any hardware requirements.

During E3 we used:
CPU: Intel i7-8700K @ 3.70 GHz
MB: ASUS ROG STRIX Z370-I GAMING
RAM: G.Skill Ripjaws V, 2x16GB, 3000MHz, CL15
GPU: Titan RTX
SSD: Samsung 960 Pro 512 GB M.2 PCIe
PSU: Corsair SF600 600W
 
More updates via WCCF:
https://wccftech.com/cyberpunk-2077-e3-2019-demo-ran-1080p-ultra-rtx-on/

CDPR's Alvin Liu

Ray Tracing was on [in the demo]. We were showing off Ray Traced Emissives, Sky Light, and Ambient Occlusion. However, I've seen super impressive screenshots internally about ray tracing (they get sent out in a digest e-mail), so we're clearly still working on it, as they looked more impressive than what I remember seeing in the demo. Especially at night and with neon reflections. NVIDIA also has representatives who work with our studio to continue to improve and utilize this technology, similar to The Witcher 3 and HairWorks.

The game was running on Ultra, but we are continuing to improve our visuals.

The game demo was running at 1080p, but our trailers and publicly released assets are at 4K. The UI is designed mostly at 4K (eventually it will be entirely 4K native), but we have the technology to swap assets and do intelligent scaling to handle 1080p, widescreen, 720p, 1440p, and so on. We can also design specific UI at 1080p and other resolutions on a need-by-need basis, such as for a screen or graphics with heavy icons that might look bad otherwise.
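The asset-swap idea he describes could look roughly like this in pseudocode (purely illustrative, hypothetical names and numbers, not CDPR's actual UI code): author at 4K, swap in hand-made variants where plain scaling would look bad, and scale everything else.

```python
# Purely illustrative sketch of resolution-based UI asset selection.
# Names and numbers are hypothetical, not CDPR's implementation.

HAND_TUNED_SETS = {
    (3840, 2160): "ui_4k",     # native-authored assets
    (1920, 1080): "ui_1080p",  # purpose-made variants for icon-heavy screens
}

def pick_ui_assets(width: int, height: int):
    """Return (asset set, scale factor) for a given output resolution."""
    if (width, height) in HAND_TUNED_SETS:
        return HAND_TUNED_SETS[(width, height)], 1.0
    # Otherwise scale the 4K-authored assets to the target height.
    return "ui_4k", height / 2160

print(pick_ui_assets(2560, 1440))  # ('ui_4k', 0.666...)
```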
 
For people like me who are still running on 6 year-old hardware though, yeah, we'll definitely need an upgrade.

Considering that the game also releases on a 6-year-old console (PS4), shouldn't a 6-year-old PC be enough to play the game with decent performance?
 
Considering that the game also releases on a 6-year-old console (PS4), shouldn't a 6-year-old PC be enough to play the game with decent performance?

I fear the game might be better optimized for the 6-year-old, familiar console than for my random collection of hardware, though. But you have a point.
 



Just dying to run this on my own computer. I'm thinking 1080p "Overkill" (the top graphics preset in the leaked 2077 menu, which lists Overkill, Very High, High, Medium, Low...) will be a definite go, maybe 1440p "Overkill" with resolution scaling turned down a touch to 90%. I'm an eye-candy nut and strongly prefer "maxed" at 1080p or 1440p over high or a mix of high/very high at a higher resolution like 4K. I want character geometry and texture detail set to 11/10 on my first playthrough. I hope my EVGA 2080 Ti Black will be able to handle it. When Crysis hit in '07, I built a new PC weeks before it was released with a Q6600 @ 3.3 GHz, 2x1 GB of RAM, and an EVGA 8800 GTX, the best at the time, and I still had to turn things down a bit at 1080p. I hope this game, like The Witcher series, has good support for SLI.
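For a rough sense of how those targets compare, here's a quick pixel-count comparison (assuming the resolution scale slider applies per axis, which is typical but not confirmed for 2077):

```python
# Rough pixel-count comparison; assumes the scale slider applies per axis.
targets = {
    "1080p":       1920 * 1080,                        # ~2.07 M pixels
    "1440p @ 90%": int(2560 * 0.9) * int(1440 * 0.9),  # 2304 x 1296 ~= 2.99 M
    "4K":          3840 * 2160,                        # ~8.29 M pixels
}
for name, px in targets.items():
    print(f"{name:>12}: {px / 1e6:.2f} M pixels ({px / targets['1080p']:.1f}x 1080p)")
```

So 1440p at 90% scale renders roughly 1.4x the pixels of 1080p, while native 4K is about 4x, which is most of the reason 4K is so much harder to drive.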
 