Tom's Hardware "Don't Trust Projekt Red system requirements"

And they did release it the way Ubisoft did. They just didn't release it fully yet. They still need to show the rest.

Last time I checked, Ubisoft showed us the specs required to play Watch Dogs: Legion at 4K/Ultra with RT on.

Where is that for Cyberpunk 2077?

And what are they waiting on? If they are waiting for the 3080/3090, then that implies the 2080 Ti cannot handle it.
 
I was a bit furious when I saw this article, not going to lie. They make valid arguments as to the necessity of those components, but they did it in such a dirty way. They linked every specific part you'd need for the build on their page and we're not supposed to think twice about the kickbacks they're going to get from anyone dumb enough to buy those parts off the article? I especially loved the Recommended specs including a Kraken X63. I didn't realize the model and brand of cooler you used was now a part of Recommended specs to play games.

I've lost a lot of respect for Tom's over this article, which is a shame; they've been a reputable publication in the past. As for the other comments on it, I would also love to see CDPR release output-specific specs. What will we need for the best 4K output? 1440p? 1080p? 60 fps? Those, I agree, make much more sense in the broader world of PCs today. Not that I'm upset at all with what we got; I can guesstimate off that at least.
 
CD Projekt Red can only optimize their game so much before it becomes downgrading. Red Dead Redemption 2 had similar spec requirements, but that game was poorly optimized, so most systems had a hard time running it. A better example might be Death Stranding, as that game was optimized really well, but the engine and environment are so different from Cyberpunk that it doesn't make for a really good comparison.

A good way to gauge how your PC will perform in Cyberpunk is to play TW3 at max settings and go to Novigrad. If you get around 100 fps at ultra settings, you might get 70-80 fps at those same settings in Cyberpunk.
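Purely as a back-of-the-envelope illustration of that rule of thumb (my own toy sketch; the 0.7-0.8 scaling factor is just the guess above, not a benchmark):

```python
# Rough estimate only: assumes Cyberpunk 2077 lands at ~70-80% of your
# Witcher 3 Novigrad frame rate at the same settings (pure guesswork).
def estimate_cp2077_fps(tw3_novigrad_fps, scale_low=0.7, scale_high=0.8):
    return tw3_novigrad_fps * scale_low, tw3_novigrad_fps * scale_high

low, high = estimate_cp2077_fps(100)
print(f"Estimated Cyberpunk 2077 fps: {low:.0f}-{high:.0f}")  # 70-80
```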
 
Last time I checked, Ubisoft showed us the specs required to play Watch Dogs: Legion at 4K/Ultra with RT on.

Where is that for Cyberpunk 2077?

And what are they waiting on? If they are waiting for the 3080/3090, then that implies the 2080 Ti cannot handle it.

CDPR have confirmed that the specs they showed are for low/high at 1080p, and said they will release the 4K/1440p/RTX ones later.

I'm not sure what they're waiting for, but it isn't the 3000-series cards, lol. They probably got 3000-series cards a while ago.
 
CDPR have confirmed that the specs they showed are for low/high at 1080p, and said they will release the 4K/1440p/RTX ones later.

It's not like people have much choice in cards when it comes to ray tracing. I would personally consider those recommendations kinda pointless.
 

It's not like people have much choice in cards when it comes to ray tracing. I would personally consider those recommendations kinda pointless.
People can still choose not to make use of RT, right? I mean, it's nice to have, but it won't change the story of the game.
When I played Morrowind for the first time, I spent so much time staring at those stupid trees, trying to see if I could find any aliasing... A friend once told me at a LAN: "Dude, it doesn't look that different, just play the game!" And after all these years I have to admit he was right, 100%. You can still play and enjoy the game at the lowest settings. Don't focus on just the visuals. Just imagine all the side quests, isn't that crazy? You don't need ultra settings for that.
That being said, I would love to see it in 16K, lol!
 
Switching from Samsung to TSMC would mean that Nvidia needs to design their chip from scratch, which in turn means it would be a completely new RTX series, not a 3000-series refresh. So it would make more sense to develop a 5nm chip instead of switching to 7nm.

My feeling is that the 3000 series was initially supposed to be 7nm and was then made 8nm-compatible after the fact, and that's the root cause of the higher heat output.

I'm also interested in what AMD will show next month. I assume their Big Navi will run a little worse than the 3080, cost a little less, and have more VRAM. And I'm also pretty sure that AMD will not be able to implement alternatives to Nvidia's new technologies (I/O, RT, DLSS 2.0) in any meaningful way.

I/O tech was pioneered by AMD on professional-side GPUs a while back, and they are further developing the tech with both Microsoft and Sony. So I'd be surprised if Nvidia has more of an edge on I/O than brand recognition.

High-end hardware RT seems like a gimmick: less a solid value proposition than a solution desperately searching for a non-obsolete problem. Given how rapidly software ray tracing is advancing, it would not be all that surprising if AMD were able to brute-force RT through drivers, which seems to be the solution they are going for. Unless the game engine does it for them, of course.

DLSS, or Deep Learning Super Sampling, is a software solution, so it can be handled completely on the driver side. That means AMD's offering there depends on the depth of their partnerships with Sony and Microsoft and whether they have decided to create an alternative together. Microsoft has the servers to do the brute work of creating the bespoke libraries for each game.

Last time I checked, Ubisoft showed us the specs required to play Watch Dogs: Legion at 4K/Ultra with RT on.

Where is that for Cyberpunk 2077?

And what are they waiting on? If they are waiting for the 3080/3090, then that implies the 2080 Ti cannot handle it.

As you can tell from the look of the game and the minimum specs, CDPR is rather good at game-engine, uh, engineering, so it may just be that the specs for RT/Ultra are too low for Nvidia to be happy letting their partners at CDPR release the max specs this close to the 3000-series launch.

It may just be that the software RT in Cyberpunk is good enough to hang a big old question mark over the concept of hardware RT altogether. Remember PhysX?
 
My feeling is that the 3000 series was initially supposed to be 7nm and was then made 8nm-compatible after the fact, and that's the root cause of the higher heat output.

That's not how chip design works. You need to know beforehand which manufacturing process will be used, not the other way around.

I/O tech was pioneered by AMD on professional-side GPUs a while back, and they are further developing the tech with both Microsoft and Sony. So I'd be surprised if Nvidia has more of an edge on I/O than brand recognition.

Are you sure that the I/O solutions in the new console generation are of AMD's design, and not Sony/MS proprietary tech?

High-end hardware RT seems like a gimmick: less a solid value proposition than a solution desperately searching for a non-obsolete problem. Given how rapidly software ray tracing is advancing, it would not be all that surprising if AMD were able to brute-force RT through drivers, which seems to be the solution they are going for. Unless the game engine does it for them, of course.

Software RT would kill any performance win the new AMD cards may (and it's a big "if") have over Nvidia. A software solution loses to a hardware solution almost universally. Also, the RTX 3070, and probably even the RTX 3060, will offer a good RT experience, which means more gamers will get RT-ready cards, which in turn will incentivize devs to add it to their games.

DLSS, or Deep Learning Super Sampling, is a software solution, so it can be handled completely on the driver side. That means AMD's offering there depends on the depth of their partnerships with Sony and Microsoft and whether they have decided to create an alternative together. Microsoft has the servers to do the brute work of creating the bespoke libraries for each game.

For this, MS would still need appropriate GPU server hardware to do the computations in their data centres. Currently, only Nvidia offers this kind of hardware. So don't expect anything even remotely close to this in the near future.
 
People can still choose not to make use of RT, right?

Yes, but having separate recommendations for normal and RT isn't going to help those people choose between them, is it? It's a personal choice. Personally, I think RT is the way of the future, but until cheap 4K monitors are available, most people are not going to make that leap.

I imagine the reality is that most "common people" don't own 4K monitors or large bank accounts and will make do with 1080p (aka Full HD) simply because their monitor cannot handle anything more. After my initial shock at seeing the recommended specs, they do make sense. CDPR did plan the game specifically around the current-gen consoles, after all.

Now, I assume that the upper limit for the graphics, including RT, is going to be high, especially after working together with Nvidia, so you will probably get a good return on your investment if you go for the new RTX cards.
 
That's not how chip design works. You need to know beforehand which manufacturing process will be used, not the other way around.

For the design of the chip, yes. Design of an individual core? Another story.

Are you sure that the I/O solutions in the new console generation are of AMD's design, and not Sony/MS proprietary tech?

While I didn't remember the name of it, I know that AMD did release a professional-tier card with that functionality. [insert]: named the Radeon Pro SSG. A bit prematurely, perhaps; I misremembered the card as being older. But they have been sitting on tech like that for a while, that's just a fact. The Sony/MS partnership simply allows a niche solution to become a mass-market product.

Software RT would kill any performance win the new AMD cards may (and it's a big "if") have over Nvidia. A software solution loses to a hardware solution almost universally. Also, the RTX 3070, and probably even the RTX 3060, will offer a good RT experience, which means more gamers will get RT-ready cards, which in turn will incentivize devs to add it to their games.
Yeah, that's why we all lug around bespoke cameras, wristwatches, PDAs, and phones instead of a single software-driven device.
Software RT doesn't need to be better, just good enough at the price point of free.
For this, MS would still need appropriate GPU server hardware to do the computations in their data centres. Currently, only Nvidia offers this kind of hardware. So don't expect anything even remotely close to this in the near future.
And? Do you actually think execs at MS, Sony, or even AMD would care if they have to run Nvidia hardware to make the necessary datasets?
And burning a neural net onto a chip is not any harder than burning any other instruction set. It's all just math. Simple logical math.
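To put the "it's all just math" point in concrete terms with a toy example (my own sketch, not any real DLSS or driver code): a neural-net layer boils down to multiply-accumulate plus a clamp, exactly the kind of fixed arithmetic that maps naturally onto silicon.

```python
import numpy as np

# A single dense layer with ReLU: a matrix multiply, an add, and a clamp.
# Hypothetical toy values, just to show the arithmetic involved.
def dense_relu(x, W, b):
    return np.maximum(W @ x + b, 0.0)

x = np.array([0.5, -1.0, 2.0])               # toy input vector
W = np.array([[1.0, 0.0, 0.5],
              [-0.5, 1.0, 0.0]])             # toy weight matrix
b = np.array([0.1, -0.2])                    # toy bias
print(dense_relu(x, W, b))                   # prints [1.6 0. ]
```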
 

Basically their conclusion is that if you want the best experience, get an RTX 3080. What are your thoughts?

I don't necessarily think that their only conclusion was to buy a 3080, because they listed a more affordable configuration for ray tracing.
The question I have is: if this game is going to implement quite a few different RTX capabilities and will probably be this year's showcase game for RTX, what kind of performance can we expect from ray-tracing-capable cards in this game?

Also, the cards listed seem to be aiming at a locked 30 fps, but that's not directly communicated, and for me personally, frames per second matter more than eye candy, especially in a first-person shooter.

Also, in response to the Tom's Hardware "Just Buy RTX Cards" headline posted earlier: that was part of a pair of counterpoint articles they published, one pro RTX and early adoption, versus one arguing that early adoption of fresh technology is a waste and isn't worth the cost. Personally, I don't like the way he framed his argument, because it lends itself to indiscriminate spending on untested technologies, but I liked the format of having differing opinions on a topic come from one outlet.
Post automatically merged:

To be fair, phones are able to take high-quality photos due to the software AND the hardware implementation. Their cameras are linked directly to their processors and have hardware dedicated to processing the actual image, and the actual lenses are improving every year... so the software isn't doing ALL the heavy lifting.

Dedicated cameras smoke laptops because laptop cameras still have to go through the snail-paced USB header.

RTX having hardware dedicated to the calculations typically found in ray-traced lighting is going to be a better solution than something purely software-based. Crytek found a way around it and has ray tracing working on non-RTX cards even in DX11, but you still get the best performance out of cards with RT cores.
 
I haven't seen anyone mention the most important point in this whole discussion. I noticed the screens from the CP2077 menu, and the graphics settings were Low, Medium, High, Ultra, and Overkill! Recommended is for 1080p on High. Everything else everyone is arguing about falls under Ultra and Overkill, which are not recommended for everyone, because the cost to unlock all the eye candy isn't practical for everyone; it's a premium. I'm sure High will look badass and you won't know any better unless you plan on taking screenshots or just staring at water reflections and shit. It could be argued that, in a way, all the hyper-detail almost becomes counter-immersive at a point.
 
I was a bit furious when I saw this article, not going to lie. They make valid arguments as to the necessity of those components, but they did it in such a dirty way. They linked every specific part you'd need for the build on their page and we're not supposed to think twice about the kickbacks they're going to get from anyone dumb enough to buy those parts off the article? I especially loved the Recommended specs including a Kraken X63. I didn't realize the model and brand of cooler you used was now a part of Recommended specs to play games.

I've lost a lot of respect for Tom's over this article, which is a shame; they've been a reputable publication in the past. As for the other comments on it, I would also love to see CDPR release output-specific specs. What will we need for the best 4K output? 1440p? 1080p? 60 fps? Those, I agree, make much more sense in the broader world of PCs today. Not that I'm upset at all with what we got; I can guesstimate off that at least.

I agree; the whole minimum-vs-recommended requirements approach is so outdated by today's standards.
 
To be fair, phones are able to take high-quality photos due to the software AND the hardware implementation. Their cameras are linked directly to their processors and have hardware dedicated to processing the actual image, and the actual lenses are improving every year... so the software isn't doing ALL the heavy lifting.

Dedicated cameras smoke laptops because laptop cameras still have to go through the snail-paced USB header.

And bespoke cameras are far better than webcams, laptop cameras, or smartphone cameras. But for the average consumer, the smartphone camera is good enough. And RTX is a bespoke camera. Hell, I'd even go as far as to say that the primary use case for RTX is not gaming, but science.

RTX having hardware dedicated to the calculations typically found in ray-traced lighting is going to be a better solution than something purely software-based. Crytek found a way around it and has ray tracing working on non-RTX cards even in DX11, but you still get the best performance out of cards with RT cores.

Is it, though? RTX is a solution in Nvidia's hands, forcing game devs to optimize their game environments to the hardware. Software RT is in-engine and thus under the devs' control, allowing them to optimize their environments in general. There are some indications that high-level HW RT is not necessarily better than lower levels.

Software RT is going to be easier to work with and will grant more artistic freedom as well. The exact same thing happened with PhysX and HairWorks, to name some more recent gimmicks.

There will of course be hardware acceleration, but there will not be a bespoke edge for Nvidia, because the difference between generic acceleration and bespoke acceleration will be indistinguishable for the majority of consumers. Already, the actual difference between software RT and hardware RT is almost non-existent.

And finally, as the technology is optimized, it may just become cheap enough that brute-forcing it without any bespoke hardware is the way to go. Unless, of course, we switch from rasterization to ray tracing as the basic way of processing graphics. But for that to happen, you would need HW RT to be generic.
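For anyone curious what the core of that per-ray math actually looks like, here is a minimal, purely illustrative Python sketch of a single ray-sphere intersection test (a toy example of my own, not taken from any engine or API). The point is simply that it's ordinary arithmetic, which is why it can in principle be brute-forced on generic hardware:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest hit in front of the ray origin,
    or None if there is none. All vectors are 3-tuples; direction must be
    normalized."""
    ox, oy, oz = (o - c for o, c in zip(origin, center))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c              # a == 1 for a normalized direction
    if disc < 0.0:
        return None                     # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0    # nearest of the two intersections
    return t if t > 0.0 else None

# One primary ray fired down the -z axis at a unit sphere 5 units away.
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # prints 4.0
```

A real renderer repeats tests like this (against triangles and acceleration structures) millions of times per frame, which is the workload that either dedicated RT cores or raw shader/driver brute force has to chew through.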

[edit]:
After some further research, I chanced upon path tracing, which seems to be, so far at least, a software-based iteration on ray tracing.

Check it out;
 
Also, in response to the Tom's Hardware "Just Buy RTX Cards" headline posted earlier: that was part of a pair of counterpoint articles they published, one pro RTX and early adoption, versus one arguing that early adoption of fresh technology is a waste and isn't worth the cost. Personally, I don't like the way he framed his argument, because it lends itself to indiscriminate spending on untested technologies, but I liked the format of having differing opinions on a topic come from one outlet.

Regardless, the "just buy it" bit was what we can call a credibility killer. Hardware "tech sites" should be tasked with providing information. Information is objective: facts, evidence, logic, reliable sources, etc. It's not wild speculation, rumor merry-go-rounds, dramatized BS, and presenting "opinions". Sadly, that doesn't generate as many clicks. Fewer clicks = less ad revenue/data sniping. So all these "content creators", "influencers", and tech sites have gravitated toward such behavior (they all do it now, and no, it's not forgivable).

"Just buy it 2.0" sure sounds like a continuation of this... issue. None of this is to say the claims are inaccurate. The underlying point is that Tom's Hardware painted themselves into "unreliable" status.

RTX having hardware dedicated to the calculations typically found in ray-traced lighting is going to be a better solution than something purely software-based. Crytek found a way around it and has ray tracing working on non-RTX cards even in DX11, but you still get the best performance out of cards with RT cores.

Sure, you could argue this point. The trouble is that dedicated RT hardware requires die space. That hardware has to go somewhere, and the space could theoretically be allocated to other tasks, say, "traditional" rendering. It's not helping a great deal there when it's specialized for RT, especially if the software (the game) isn't using RT.

Of equal worry, it's arguably not good when the goal of the hardware provider becomes locking things behind their own devices. It happens with operating systems all the time (Microsoft, Apple). It happens with hardware. RT is an example of the latter. This type of behavior and its adjacent practices are rarely consumer-friendly.
 
I'm hoping for an answer to which GPU we'll need for 4K max candy at 60 fps.

...and whether the 3090 only gets +10% over the 3080.
 
The game is not even out yet, and they are already recommending you buy a brand-new $1,800 computer. Even the editor who wrote it stated, "In terms of performance, while we don't know exactly how demanding Cyberpunk 2077 will be", so what's the point of this article?

Just wait until the game comes out, do the testing for CPUs and GPUs, then give your recommendations.

Tom's Hardware has really been going downhill for the past few years. I hardly use that site anymore.
 