Hardware Thread - General.

Since I don't do FPS games, I've found my GTX 970 runs every game I have quite well at max settings.
And again, I'm sure it doesn't hurt that the OS and drivers are on an SSD.

The 970 is closely comparable to a 1060, making it borderline overkill for 1080p. So even if you did play FPS games, you'd be pretty well off. Oh, wait... if you played FPS games, you wouldn't be happy with anything less than 120 FPS at 8K with a neural interface to avoid the input latency of having to use your hands to manipulate the controls! :p

RTX 2080 is going to debut at 650-700 USD

...which is more than a lot of folks spend on their entire tower :cautious:

You're trolling... a bit. Actually, as long as a card supports the correct API and has enough memory, it should boot lol. I don't remember which generation DX11 support started with... anyway, 4 GB of VRAM minimum would be my guess.

The GTX 4xx series could do DX11. However, the 550 Ti only had 1 GB of VRAM.
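If anyone wants to check what their own card actually reports instead of guessing at generations, a minimal probe along these lines should do it (standard Direct3D 11 API; passing null device/context pointers makes the call a pure capability test):

```cpp
// Quick DX11 capability check: ask the driver whether the hardware
// supports feature level 11_0 without actually creating a device.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL wanted = D3D_FEATURE_LEVEL_11_0;
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
    HRESULT hr = D3D11CreateDevice(
        nullptr,                   // default adapter
        D3D_DRIVER_TYPE_HARDWARE,  // real GPU, not a software fallback
        nullptr, 0,
        &wanted, 1,
        D3D11_SDK_VERSION,
        nullptr, &got, nullptr);   // no device/context, just the verdict
    std::printf(SUCCEEDED(hr) ? "DX11 (feature level 11_0): supported\n"
                              : "DX11 (feature level 11_0): not supported\n");
    return 0;
}
```

VRAM is a separate question; you'd query that through DXGI (the adapter description) rather than this call.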

A proper troll would've asked about using a GT240 ;)
 
I'm trolling a bit here, but I have a 550 Ti card. How f'd am I if I try to run this game? Can I get anything on the lowest settings?

Run Witcher 3 and see how it goes, for a start. I believe CP77 will be made in the same engine, so it's a good place to test. Update your drivers, switch your driver settings from quality to high performance, set pre-rendered frames to 1, turn all the graphics and post-processing down, and see how Witcher 3 runs.

Unfortunately, I did that with my GT 640 and only got like 10-20 FPS.
 
Far from it. CP77 is made on a new engine. W3 runs decently on a GTX 650 (and only if you're fine with decent graphics without all the fireworks), but I don't think CP77 will run on anything lower than a 970/1050. There's also the matter of RAM and CPU. If you've got a 550 Ti, your RAM and CPU are probably far from crazy as well :) My guess is 16 GB RAM, an i5, and a 1060 will be listed as the minimum.
 
I'm thinking that if it can't run on an 8GB i3 with a 1050, at least on Low, they can kiss half their sales goodbye. Sure, a lot of us here on these forums have better, but we're not the only gamers in the market either. I'm not saying CDPR should run on total legacy hardware, but it should at least run on the sorts of rigs a lot of casual gamers have.
 

I don't agree. This game will sell for years, just like GTA5; that's the nature of AAA games nowadays. Most PC gamers will upgrade their cards for this game, and by 2020 the 1080 Ti will be standard. Also, Intel is releasing budget high-end cards, so prices will go down.
 
It's going to run on old hardware. No doubt about that. The 970 is only one generation old, technically. Two at most. No PC dev who cares about their community is going to make a game exclusively for those with hardware that new. It's stupid and unsustainable, and it's also why I no longer buy Ubisoft (or Bethesda-published [not developed]) titles on launch or via pre-order. Technically, I have the hardware (1080 Ti/8700K) to deal with their crappy optimization, but so many others don't, and it's not a practice I want to support.

The difference is the end result. I'm certain users will be able to get 30 FPS on low settings with much older hardware than a 970. But that's it. If you want 60 or even 144 FPS in the latest games, you buy new hardware. It's not CDPR's fault if you can't/won't upgrade, and they aren't obligated to hold their visual style back for those individuals. I'm certainly not going to come to Apple with my first-gen iPhone and demand they put the latest version of iOS on it.

But it will be playable, and that's what matters to people with older hardware.
 

As for the 1080 Ti being standard by 2020: ...and we'll all have flying cars and take weekend vacations on the moon.

You're overlooking the fact that current i3s compare favorably to older i5s like my 4460, that a 1050 Ti can run GTA5 at 55-60 FPS on High at 1080p, and that many gamers' entire towers probably cost less than some folks here have in one of their video cards.


Agreed that it'll be playable. Like I said, I'm not asking for legacy hardware support, but I think it'd be a bad idea to put out a game that can't be run on a $600 rig. I'll change my mind if anyone can show me a 16GB i5-8xxx/1070 rig that cheap, but until then I think the lowest specs should be closer to the average or median rig than to what those spoiled by their i9/twin-1080Ti rigs think of as "peasant rigs".
 
Should/will CP2077 have ray tracing? I feel like if any game should have it, it would be this game.

The more I dig, the less I think that's a good idea. First off, it'd be a lot of effort to make an engine that could even do ray tracing. I think it's safe to assume that CDPR already has an engine in mind for CP2077 and is unlikely to put the entire project back to square one for the sake of a small percentage of early adopters with deep pockets. Maybe they will remaster CP2077 with an RT engine at a later date, but I suspect the initial release will be non-RT simply so that they can ship something within the next 2-3 years.

Second, the first generation of any radical new technology is often a dog. I made the mistake of owning a Clarkdale, thinking it would be a deathblow to the Core 2 series of CPUs. Then Sandy Bridge (the i3/5/7-2xxx series) came out and had a bigger performance advantage over Clarkdale than Clarkdale had over the Core 2. So yeah, the RTX 2080 may seem shiny, but RT won't really take off until the 2180 comes out, if that soon.
 

Only there are some major titles, like Exodus, coming out that take advantage of it. And they said they've had the code for months.

Soooooo
 

If Cyberpunk were made on an engine that supports RT, that would be wonderful. I would probably look at buying an RTX just for that.

But RTX is still in its early stages. The 2080 Ti could only run RT at 1080p at approximately 30 FPS, so it still needs some improvements before it becomes the norm for engines to support it. And Cyberpunk has been in development for longer than Tomb Raider or BF1, so I would be pleasantly surprised if their engine supports RT.
 

Correct me if I'm wrong, but from what I've read, the 2080 Ti struggles to give 60 FPS at 1080p with ray tracing in Tomb Raider and the few other titles on display at Gamescom.
There's been a lot of poo-pooing of the 2080, but it's a great development. It's just the start of better visual tech.

I've been saying for a while now that I'd like a halt to the hunt for extra pixels. Put GPU power into pumping up models/lighting/textures. What we want is lifelike visuals. Minecraft will still look like Minecraft at 20K resolution. Blu-ray looks great at 1080p. It's models/textures/lighting.

The only reason for resolution over 1080p at 27"+ monitor sizes is that we sit so much closer to a monitor than to a TV that we see individual pixels. As long as you cross that threshold, it's fine; e.g., 32" @ 1440p is enough (which happens to be the exact same DPI as 24" @ 1080p).
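For what it's worth, that DPI claim checks out; here's a quick back-of-the-envelope sketch (plain C++, just the diagonal resolution in pixels divided by the diagonal size in inches, using the resolutions and sizes above):

```cpp
// Pixels per inch: diagonal resolution in pixels over diagonal size in inches.
#include <cmath>
#include <cstdio>

double ppi(double wPx, double hPx, double diagInches) {
    return std::sqrt(wPx * wPx + hPx * hPx) / diagInches;
}

int main() {
    std::printf("32\" @ 1440p: %.1f PPI\n", ppi(2560, 1440, 32)); // ~91.8
    std::printf("24\" @ 1080p: %.1f PPI\n", ppi(1920, 1080, 24)); // ~91.8
    return 0;
}
```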

A game 10 years from now will look a zillion times better at 1080p than any 4k+ current game could muster.
 
I have an i7 6700K, a GTX 1070, and 16 GB of DDR4-3200. I don't know about any fancy new tech or features, and I have no idea what ray tracing is, but if that rig can't run the game on at least medium-high settings, it can stay on the shelf.
 

As for Exodus taking advantage of RT: I don't have enough details on how the 4A engine compares to REDengine4, but I don't see Ford putting out Hemis simply because Dodge comes out with a new motor either.

If by "struggles" you mean "runs at 30-45 FPS nearly all the time" then yes.


While I agree with you about the pixel hunt, there's a ton of folks who think that anything less than 1440p is too jagged, regardless of screen size. Realistically, comparing the viewing distances and adjusting, my 32" 1080p screen has about the same effective pixel density as my phone. But look how many folks think a phone with "only" 300 pixels per inch looks like Atari 2600-level graphics.
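To put a rough number on that "adjusting for viewing distance" point: what matters is pixels per degree of visual angle, i.e. pixel density times the length one degree covers at your viewing distance. A sketch, with the caveat that the 30" and 10" viewing distances here are my assumptions, not measurements:

```cpp
// Pixels per degree: PPI times the length (in inches) that one degree
// of visual angle subtends at a given viewing distance.
#include <cmath>
#include <cstdio>

double ppd(double ppi, double viewDistInches) {
    const double kPi = 3.14159265358979;
    return ppi * viewDistInches * std::tan(kPi / 180.0);
}

int main() {
    // A 32" 1080p monitor is ~68.8 PPI; the viewing distances are guesses.
    std::printf("32\" 1080p at 30\":    %.0f PPD\n", ppd(68.8, 30.0));   // ~36
    std::printf("300 PPI phone at 10\": %.0f PPD\n", ppd(300.0, 10.0));  // ~52
    return 0;
}
```

Same ballpark, which is the point; the exact numbers swing a lot with the distances you assume.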


As for the 6700K/GTX 1070 rig: if that won't run it on full-on High, then CDPR will lose at least 3/4 of its sales when you consider what sort of rigs average gamers have. Of course, they could get away with making a rig like that the minimum if they push CP2077 back to at least 2022...
 