Hardware/Software Technical Discussion Thread

The machine the demo ran on may not be representative of the final requirements. I'd guess the demo used very high settings, possibly higher than ultra (the kind only achievable by editing configuration files), to look as impressive as possible, while the actual released game could be less demanding than that. Still, if this is going to be a cross-gen release where the PC version is similar to the next-gen consoles while the PS4 and XB1 are supported but significantly downgraded, I can still see a large increase in requirements compared to TW3, especially if the game doesn't arrive until 2020.
If it's coming to current gen, even downgraded, I see no reason why the devs wouldn't allow significant downward mobility on the graphics options.

Aye. Got my 1080 Ti a few months ago. There's no chance I'm upgrading unless the new 1180 or 1180 Ti is significantly better.
Just don't even worry, chasing the bleeding edge is pointless. Something new ALWAYS comes out. Your 1080 Ti will last just fine.

Not concerned about CPU bottlenecks. Have an 8700K with a Noctua NH-D15.

Just so you know, even that can bottleneck depending on the demands of the game. I've bottlenecked a 6700K on an overclock with lesser cards before in certain games. It just depends on how much the game asks it to track. Granted, this isn't a problem here, but a lot of multiplayer games have CPU bottlenecks. BF is famous for it.
 
Yep, you're most definitely right about bottlenecking. The reason I said I'm not worried is that I've never had issues with bottlenecking in open-world games on this system before (not even GTA V), and I don't play multiplayer games (except Overwatch and HotS, occasionally). Obviously, I can't speak for 2077 since it's not out, but I don't anticipate any issues. It depends on how much detail CDPR crams into the game, and how much that detail can be tweaked as needed (lowered NPC count, for example).

You're right about not chasing the bleeding edge of hardware. As long as 2077 can run at 144 Hz (or at least 120 Hz) at 1080p on medium-high settings, I'm happy.

But if a 1080 Ti proves incapable of that when the game launches, I'll have to rethink it a bit. Not interested in 60 or 30 FPS gameplay personally. I'm not one of those people who need powerful hardware as a status symbol, I just want high FPS. Purely personal. :)
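For context, those refresh targets translate into pretty tight frame-time budgets. Nothing official, just a quick sketch to put numbers on it; the rates are only the ones mentioned in this post:

// Frame-time budget per target refresh rate (C++17, compiles standalone).
#include <cstdio>

int main() {
    const double targets_hz[] = {60.0, 90.0, 120.0, 144.0};
    for (double hz : targets_hz) {
        // Budget per frame in milliseconds: 1000 ms divided by the target rate.
        std::printf("%6.1f Hz -> %5.2f ms per frame\n", hz, 1000.0 / hz);
    }
    return 0;
}

So 144 Hz leaves the game under 7 ms per frame, versus 16.7 ms at 60 Hz, which is why medium-high settings at 1080p is the realistic ask.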
 
Yeah, was just reading something on TechPowerUp yesterday about Nvidia basically being ready to release their next GPU family, but they're waiting on Pascal inventory to clear out right now.
 
I think they will use the very successful GTAV recipe...

Launch the game 1 year before new gen consoles come out, relaunch it for new gen consoles 2 years after the first release.

The goal is to make many of us buy this game twice.

That's pretty much what's gonna happen. If this game turns out to be GOAT, I don't think people will mind double dipping.
 


I get really annoyed by fluctuations in frame rate. Take HITMAN, for instance: I can get over 100 FPS average, but some areas drop to a 60 FPS low. I usually target a 90 FPS low, with a 120 FPS frame cap on my G-Sync display. This game is special to me though, I want my first experience to be no compromise... 1440p with all the bells and whistles. Unfortunately I have a feeling it's going to take more than a GTX 1180 to achieve that, and I don't even have an SLI-compatible motherboard lol
 
144 Hz? Yeah, you'll PROBABLY need more than just one card if you're trying to run 1440p at high settings.

Scratch that, yeah, you need SLI or CrossFire. I'm not even gonna entertain doing so without it.

You should get a better motherboard anyway. What is it, mITX?
 
Screw that. He needs double SLI, next-gen Titans. 4 total.
 


Yeah, it's mITX. Asus Z270-I, Intel 6700K, G.Skill 16 GB 3733 MHz, EVGA 1080 Ti Founders Edition, Samsung 950 Pro M.2 SSD.
 
I don't care about consoles, I just need the maximum quality my GTX 1070 can achieve. This card is 2-3x more powerful than the PS4 or Xbox One. The Witcher 3 was very demanding but perfectly optimised, and everyone who plays on PC needs one thing: don't downgrade the game because of consoles.
 
I hope the RED Engine will be perfectly optimized and smooth as butter.
I just played 2 hours of Homefront: The Revolution, and oh boy, what a fucking mess of an engine CryEngine is in that game... FPS is perfectly acceptable, it's a 2016 game of course, so there's not a single performance drop with a GTX 980. But my GOD, the TEMPERATURES! The GPU is almost at 80°C! I thought there was a plane in my bedroom; it's been a VERY long time since I've heard my 980's fans blowing at full speed. In comparison, The Witcher 3 barely hits 65°C over a 3+ hour session and my GPU stays rather silent. That says a lot about the quality of CDPR's homemade engine, really.

Actually, I'd like to know if the CDPR devs were monitoring the CPU/GPU temperatures on the rig they used for the E3 demo. What were the cooling methods for both?
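In the meantime, anyone who wants to check their own card can poll the temperature and fan speed through NVML, the monitoring library that ships with the NVIDIA driver. A minimal sketch, assuming an NVIDIA GPU and linking against nvidia-ml (e.g. g++ temps.cpp -lnvidia-ml); obviously not CDPR's tooling, just a home-rig check:

// Poll GPU temperature and fan speed of the first NVIDIA GPU via NVML.
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        std::fprintf(stderr, "NVML init failed\n");
        return 1;
    }

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {   // first GPU
        char name[NVML_DEVICE_NAME_BUFFER_SIZE];
        unsigned int tempC = 0, fanPct = 0;
        nvmlDeviceGetName(dev, name, sizeof(name));
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &tempC);
        nvmlDeviceGetFanSpeed(dev, &fanPct);                     // percent of max fan speed
        std::printf("%s: %u C, fan %u%%\n", name, tempC, fanPct);
    }

    nvmlShutdown();
    return 0;
}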
 
That has more to do with the developers of Homefront, not CryEngine itself. CryEngine is fine.

The temps could be a driver issue tbh. None of the CryEngine games I've ever played have done that.
 

Yes, I should have said that the way they handled CryEngine was a fucking mess.
And indeed, most other CryEngine games are just fine; in my case, Crysis 3 and Kingdom Come run flawlessly. I remember Homefront's launch was a mess, filled with bugs and badly optimized, until it received a couple of updates (and it's still running on fumes from what I see). Not a bad game, though.
Hopefully, the RED Engine never suffered from such misuse, and never will.
 
Same here with the 1080 Ti. Bought my EVGA 1080 Ti just months ago. I will snag a second 1080 Ti, or a better single card at launch, as I want to play CP2077 with all the eye candy cranked up to 11.
 
[QUOTE="pwndo1, post: 11036882, everyone who plays on PC we need one thing: don't downgrade the game because of consoles.[/QUOTE]



This !

Please, CDPR: no graphical downgrades for those of us on PC, I beg. I purchased an entirely new 💻 for Crysis 1, and I did the same for Morrowind years earlier... Cyberpunk 2077 is in that league! A title with this tremendous level of polish comes along once every 7-10 years, imho. "But can it run Crysis?" will soon be "But can it run Cyberpunk?"

Most game engines, in some sense, are designed around the lowest common denominator graphically, with file systems that presume the player's machine uses spinning rust, i.e. 5400-7200 RPM SATA hard drives. Here's hoping the new RED Engine 4 was built with fast NVMe/Optane drives, 4-8 physical CPU cores, 8-32 GB of system RAM, and Vulkan/DX12 in mind as a baseline.
 

Did they say what OS it was running on in the demo? Vulkan would be great, of course.

Optane sounds like overkill as a requirement, though. Even an NVMe SSD as a requirement would be extreme.
 

SLI/CrossFire are obsolete; today multiple GPUs are supposed to communicate over PCIe using explicit APIs like Vulkan. That gives developers more control: they treat each GPU as a separate device rather than one virtual GPU. It also allows heterogeneous setups.
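To give a rough idea of what "separate device" means, here is a minimal Vulkan sketch that just enumerates the physical GPUs an application could target explicitly. It assumes the Vulkan SDK/loader is installed and isn't tied to 2077 or REDengine in any way:

// List the physical GPUs Vulkan exposes; each one can be driven explicitly.
#include <cstdio>
#include <vector>
#include <vulkan/vulkan.h>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.pApplicationName = "gpu-list";
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    // Each entry is its own physical device the app can submit work to,
    // instead of a driver-managed SLI/CrossFire pair pretending to be one GPU.
    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::printf("GPU: %s (discrete: %s)\n", props.deviceName,
                    props.deviceType == VK_PHYSICAL_DEVICE_TYPE_DISCRETE_GPU ? "yes" : "no");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}

Whether a given game actually splits work across those devices is entirely up to its developers, which is the catch with explicit multi-GPU.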
 
This is exactly the reason I'm waiting until we get some accurate hardware specs before starting to build a new gaming platform.
 