Building a gaming PC

I'm pretty sure that the Radeon 6800 will be able to do ray tracing at 1440p in the future, performing similarly to an RTX 3070 (DLSS disabled) and at least as well as an RTX 2080 (DLSS disabled) in newer titles.

Since Cyberpunk is sponsored by NVIDIA, it's possible that CDPR is not allowed to put the names of AMD cards in those green-lit columns. Another, simpler reason could be that the AMD drivers are just not ready yet.

Also, all the next gen consoles feature AMD graphics, so although RT looks outstanding from what we've seen so far, it just can't be that important for the game experience.
The visuals are a big draw for this game for me, too. I'll be happy with ultra settings if I can't have ray-tracing. From what I understand you can use ray-tracing on the 6800, but anything over 1080p will drop your fps to less than 30. Maybe by next December I'll actually be able to find a 3080.
 
Making DLSS a requirement is already weird. Isn't ray tracing hardware accessible through common interfaces both in Vulkan and DX12 (aka DXR)? So what is so Nvidia specific there, let alone tied to DLSS?
 
Making DLSS a requirement is already weird. Isn't ray tracing hardware accessible through common interfaces both in Vulkan and DX12 (aka DXR)? So what is so Nvidia specific there, let alone tied to DLSS?

It means it runs like shit on the 2060 with RT on, which shouldn't really surprise anyone: everything runs like shit on the 2060 with RT on. DLSS makes it acceptable, but the AMD equivalent isn't ready yet, and their RT capabilities are closer to the 2000 series than the 3000 series.
 
How it runs is less relevant; the question is why something "doesn't run" or is "required". If something doesn't perform well, you can turn it off. But the above sounds like "it can't run", which is not the same thing. If they are using common interfaces, it shouldn't be tied to Nvidia.
 
How it runs is less relevant; the question is why something "doesn't run" or is "required". If something doesn't perform well, you can turn it off. But the above sounds like "it can't run", which is not the same thing. If they are using common interfaces, it shouldn't be tied to Nvidia.
Depends what people think of as "requirement".

Certainly DLSS is not a "must" in the sense that things simply don't run without it. It improves frame rate, that's all. It is a bonus.

But DLSS matters specifically in the low-framerate cases, when the card has to work hard. Naturally you mention the improvement it makes when you try to run it on a barely capable card like the 2060. And it is exactly "how it runs" that makes this whole discussion relevant.
 
That's not how it was phrased. The quote clearly said it requires Nvidia, meaning it won't work (not that it will work worse) with AMD. Unless they said one thing and meant another.
 
Here's where the quote comes from.
Nvidia mentions DLSS in every part for each resolution.
I assume that it's not possible to run Cyberpunk on a 3090 at 4K with DLSS disabled, at least not consistently. The RT recommendations look similar to the ones we saw for Watch Dogs: Legion, and the in-game benchmark at 8:08 shows only 30 fps.
So it's not just the 2060 that benefits from DLSS...
 
@DC9V: Well, you don't need 4K to run the game, no? Or did they mean that it won't work without upscaling at any resolution? Then I'd say this whole ray-tracing hardware is highly overhyped and simply not ready, even on Nvidia cards. Which doesn't surprise me. Ray tracing is a very brute-force approach in general. Not smart to use in real-time scenarios to begin with.
 
I don't think I've ever seen so much hand-wringing about making a game run perfectly before anyone has even played it, knowing that at its base it hasn't been optimized for next gen yet, even if that will come quickly. The game should run very well on a good configuration without necessarily breaking the bank. Especially since what we've mostly seen are "cinematics", heavily produced and with very good video editing. When I look at the bits of pure, raw gameplay, I remind myself that we shouldn't expect the actual gameplay to look like the cinematics, even if it will be beautiful and high quality.
 
It's interesting that AMD's 6800 series beats even supposedly higher-end Nvidia cards in some benchmarks (e.g. I'm a bit surprised to see the 6800 XT beating the 3090 in some of them). That shows that AMD has really caught up to Nvidia in raw compute power.

So the whole ray tracing / DLSS push by Nvidia was done in anticipation of that, since they realized they'd lost the compute power advantage. Arguments like "but RT works better" are the new marketing strategy for them.
 
What I mean is techniques that provide something much better than janky hacks but can rely on standard GPU compute units, not on ASICs limited to ray tracing. See the linked paper, which gives an idea.


I think ASICs for ray tracing are a technological dead end. You can't make ASICs for every single need; you'll run out of space on the card. But if you can find smart ways to achieve the same results using the GPU's regular compute units, you can replace the ASICs with them.

Gotcha. This is exactly what I'm talking about as well: it not being fully implemented yet. Right now, we still don't know exactly how best to handle RT. Perhaps the software can be rebuilt to utilize existing hardware much more efficiently. Perhaps it will require dedicated components. Perhaps the entire GPU will need to be built in such a way that it can do only RT, and it won't be backwards-compatible with older rasterization techniques. Maybe we'll see totally separate cards to handle just RT, while separate 4K GPUs handle only geometry teraflops.

Again, I'd recommend people not dump large amounts of money on top-of-the-line RT cards unless you have money to burn. Even then, I wouldn't expect "stellar" results. Quite literally, what everyone is buying right now are prototypes to beta test. The technology itself, software and hardware, is not fully understood or realized yet.
 
It's interesting that AMD's 6800 series beats even supposedly higher-end Nvidia cards in some benchmarks (e.g. I'm a bit surprised to see the 6800 XT beating the 3090 in some of them). That shows that AMD has really caught up to Nvidia in raw compute power.

So the whole ray tracing / DLSS push by Nvidia was done in anticipation of that, since they realized they'd lost the compute power advantage. Arguments like "but RT works better" are the new marketing strategy for them.

I think it was Nvidia's marketing strategy for their 2000 series, before AMD's 5000 series even came out, let alone the 6000 series that now seriously rivals them.

Go back a few years and a hundred pages on this thread and you’ll see me and others bemoaning the GeForce 2000’s Ray-Tracing as a gimmick to over-price their cards. You were there.

Thing is, I'm not so sure it's a gimmick anymore, or that it was intended as a preemptive response to what AMD ended up doing now.

We’re in the second generation of ray-tracing now and AMD seems to be following in earnest. By now it’s more an evolution in the field and less of a marketing ploy.

One or two more GPU generations from now, AMD will have caught up and ray tracing will be enough of a norm that we'll stop talking about it. But unlike Hair FX, I think ray tracing is a definite thing, and Nvidia simply managed to be the market leader.
 

Man those RT reflections look great. I hope I can enable them without FPS tanking too much (with DLSS). The rest of the RT features I can live without.
 
@Skirlasvoud: I'm pretty sure it's still a gimmick, in the sense that it's more of a hype value than real value with the current approach.

I can even argue it's an anti-feature, as long as it requires dedicated hardware on the GPU, because I'm not convinced that adding ray-tracing ASICs is beneficial going forward. I already posted about that above.

I'd like to see innovation in actual lighting techniques, instead of a race of "who will add more ASICs for computationally intense approaches". In other words, instead of speeding up the horse, find something that's faster than a horse.

Also, going back to that particular example with CP2077: how has no one pointed out that it's a dubious idea to use ray tracing (supposed to improve visual quality) that requires upscaling (which reduces visual quality) in order to function? That's just a weird combination.
 

Man those RT reflections look great. I hope I can enable them without FPS tanking too much (with DLSS). The rest of the RT features I can live without.

Ideally, yes, as that would be how RT functions. Rather than creating a separate shader for "mirrors" and "puddles", etc., you can simply treat them like any other surface in the game. The engine will simply be told to reflect more like 100% of the light that strikes it, with minimal refraction and no alteration of colors. Hence, that surface will look like a flat puddle reflecting all of the world around it. If I apply a red texture and tell the engine to reflect only 75% of the light that strikes it, I could make that same puddle look like a pool of blood instead: duller, deep red, and showing only muted highlights to make it appear wet and shiny. If I applied a white texture that looked like a pattern of cracks and told the engine to reflect only 50% of the light that strikes it, I could make it look like ice: more matte, with the texture giving the appearance of cracked ice.
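The parameter-driven idea above can be sketched as a toy shading rule. This is not any engine's actual API; `Material` and `shade` are made-up names, and the numbers are just the reflectivity values from the examples:

```python
from dataclasses import dataclass

@dataclass
class Material:
    base_color: tuple        # RGB, components in 0..1
    reflectivity: float      # fraction of incoming light the surface reflects

def shade(material: Material, incoming_light: tuple) -> tuple:
    """One rule for every surface: blend reflected light with the base color."""
    r = material.reflectivity
    return tuple(r * light + (1 - r) * base
                 for light, base in zip(incoming_light, material.base_color))

# The same single rule yields a mirror-like puddle, a pool of blood, or ice:
puddle = Material(base_color=(0.0, 0.0, 0.0), reflectivity=1.0)   # reflects ~100%
blood  = Material(base_color=(0.4, 0.0, 0.0), reflectivity=0.75)  # duller, deep red
ice    = Material(base_color=(0.9, 0.9, 0.95), reflectivity=0.5)  # more matte
```

No per-effect shader is needed: varying two parameters changes which of the three surfaces you get.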

The more variety and complexity I add to a scene, the slower the performance is going to be. However, the days of needing to "artificially create the illusion of reflections" by loading shaders that actually invert the vertices of 3D geometry, their textures, etc., render and draw them onto the "reflective surface draw space" as a separate pass, and then render the composite of both frames during a third pass to produce the final frame will be a thing of the past.
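The "invert the geometry" trick described above amounts to mirroring every vertex across the reflective plane and rendering the scene a second time. A minimal sketch of just the mirroring step (the function name is made up; real engines also handle clipping and the extra draw passes):

```python
def mirror_across_plane_y(vertex, plane_y=0.0):
    """Reflect a 3D point across a horizontal plane at height plane_y."""
    x, y, z = vertex
    return (x, 2 * plane_y - y, z)

# A lamp 3 units above a floor-level puddle is redrawn 3 units below it,
# then composited onto the reflective surface in a separate render pass.
print(mirror_across_plane_y((1.0, 3.0, 2.0)))  # (1.0, -3.0, 2.0)
```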

Ray tracing uses the same data to create lighting effects regardless of whether a surface is supposed to look like wood, blacktop, shiny leather, skin, or chrome. To alter the appearance of everything in the scene if I were to, for example, shine a flashlight on it, I don't have to create "glow shaders" to "paint" that extra lighting over the existing textures. I just create a light source, and it adds its rays into the scene along with all of the other rays that were already there, interacting with all surfaces and all other light sources equally.

The downside is that this is a bleep-ton of processing to do overall, as ALL of those rays need to be constantly updated, "pixel-by-pixel", across every single frame. It's a fantastic idea because it's a universal system, not requiring different techniques for different effects. It's just something that neither software nor hardware has been capable of doing up to this point. So, it's a bit of a future blessing:
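That per-pixel, per-frame cost is easy to put a rough number on (illustrative arithmetic, not a benchmark):

```python
width, height = 2560, 1440      # 1440p
fps = 60
rays_per_pixel = 1              # primary rays only; every bounce or extra
                                # sample multiplies this figure again
rays_per_second = width * height * fps * rays_per_pixel
print(f"{rays_per_second:,} rays/s")  # 221,184,000 rays/s
```

Over 200 million rays per second before a single bounce is why current hardware leans on upscaling like DLSS to cut the pixel count.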

As hardware becomes more powerful, rather than adding even more layers of rasterization to achieve more detailed lighting results requiring more and more rendering passes...we can now devote 100% of the processing power to a single lighting technique, managing most effects in one pass. Hence, right now, hardware will struggle to do it -- but 5 years from now, it will be MUCH faster and more efficient to use RT instead of additional layers of rasterization to achieve the equivalent levels of detail.
 