Building a gaming PC

I was going to sell my 1080 Ti and get a 2080 Ti to play with ray tracing, but I want a steady 100 fps. Do you think it's better to just keep my 1080 Ti?
Just for ray tracing? No. It's a big system hog for current cards and the games that use it still look great without it.
 
I think you're right; I'm gonna have to wait closer to release to decide whether to upgrade or not. I hope the 1080 Ti is enough for ultra graphics in this game.
 
Red Herring? how?

Little gain for the space wasted on dedicated ray-tracing ASICs on the GPU die. That space could be used for more general-purpose compute units. There have been many reviews of this topic already. Serious ray tracing can't be handled even by the ASICs on the GPU (there simply aren't enough of them), while some light ray tracing can be handled by general-purpose cores. So dedicated ASICs are neither here nor there: better than general compute units, but not enough for the real job. That's why it's not yet clear that adding those ASICs to a GPU is a beneficial strategy. Nvidia gambled on it, and so far it's not clear whether it's worth it.

AMD explicitly said that even if they add some dedicated ASICs for it, the implementation will be minimalistic (and probably just a marketing match for Nvidia), and that they will focus on ray tracing through general-purpose GPU cores.
 
Red Herring? How? Will your tune change next year when the AMD Radeon cards with it arrive? (They have the tech ready for the consoles, so it is coming.)

No one doesn't want a hardware ray-tracing solution; it's been the holy grail for at least 25 years.

For now, I'd say that's probably how it will go. In the future (next 5-10 years) I imagine that ray tracing may simply take over as a standard lighting model. Once GPUs come with the hardware required as standard, then I'd say that buying an "RTX" card would be worthwhile...but of course, by then, everything will have "RTX" built-in.

I'd argue it's a very subtle effect in exchange for a pretty significant performance impact. So I'd say that building a brand new system with an RTX 2080 is a good option if the funds are there. Upgrading from something like a 960 or 1060 to a 2080 could be worth considering. Upgrading from a 1080 to a 2080 is likely to give middling improvements at best -- definitely not worth the $1,000+ it will cost, imo. (For that price, you could grab a high-res, 120 Hz, IPS monitor with G-Sync / FreeSync -- then ALL of your games will suddenly gain smoothness and amazing color range.)

For ray-tracing itself, it's the classic situation. The tech is brand new, super-expensive, and not yet optimized. Few titles will take major advantage of it, and the ones that do will be guinea pigs while devs explore the ins and outs. By the time it becomes standardized and much more performance friendly, the hardware will be ~50% cheaper, orders of magnitude more powerful, highly optimized, and many / most games will be using it.
 

"For now" is kind of an important piece of information though, isn't it? Where it goes in the future isn't very relevant to where it is at now. Not when buying a card now. I've been considering upgrading my card but haven't pulled the trigger on it yet. I likely won't do so until at least the next generation of cards release. For the simple reason I don't buy the marketing. Ray tracing is the greatest thing since forever, will make your games look super duper awesome and you're a terrible person if you buy a card without it, please give me $1200. Nope.... Not going to happen. Nvidia can blow it out their ass :).

None of this is to say existing options with ray tracing are a bad purchase either. It's just I am not willing to buy a card because, oh boy, it has ray tracing. I'd buy it because it's a good video card and yields the desired performance at the desired price point. Ray tracing has nothing to do with it.

I'd add, I've seen various videos showing the differences between ray tracing off vs on. Honestly, I wasn't impressed. The improvements looked subtle at best.
 

That's the gist of it, yeah. Totally agree. But, of course, there are people in the world with money to burn. No reason not to have all the cool new toys if you can afford them. That's what ultimately funds the later hardware. (I like to think of them as volunteers. :) )

When trying to budget, that's where it literally pays to think price vs. value. Will a 2080 be better than a 1080? Assuredly. Will it be over a grand's worth of better? Now I start thinking of other things I could buy for a thousand dollars that would have a more universal impact on my gaming experience.

That's the kicker for me with ray tracing. When I'm absorbed in a game's story, or focused on a combat sequence, or trying to figure out a puzzle...how much of my attention is really going to be on how accurate the lighting is?

Is it as important to the experience as the first sound cards that were introduced?

Is it as significant a leap as going from 2D to the first 3D accelerators?

Is it as impactful as the step from HDD to SSD?

Ray tracing may be nice to have, but I'm not sure it's worth $1,000.
 
Seems to me that people are "slightly" underestimating how big of an impact ray tracing can have on visuals:

Like holy shit it can make a difference.


The DF Metro analysis is worth watching. It goes really in-depth on just how much ray-traced GI affects the visuals.

I don't really think it's worth it yet, but I'm very much looking forward to seeing how it's going to be used in next gen games.
 
Good ray tracing surely has a major impact. But good ray tracing can't be achieved on these GPU ASICs; it's too demanding for them to handle in real time. And more limited ray tracing really doesn't have as significant an impact on visuals as some try to sell it for. The question is whether it's worth devoting so much die space to it, at the cost of having less computing power for more general-purpose GPU tasks.

It's all a trade-off. Ideally, if you need it so much, there should simply be a separate processor just for ray tracing (let's call it an RTPU) that doesn't take away from the GPU doing what it's usually supposed to be doing. Then you could get some interesting results.
 
Nope, I don't think it's being underestimated. It certainly looks better.

The first problem is, even using the example video provided, it doesn't look much better all of the time in my opinion. How much better it looks depends on the conditions in the environment at the time. In certain cases it looks great. In others... meh. Let me put it this way, in some of the scenes in the video link provided I wouldn't have noticed it missing. In many of the large open areas I wasn't impressed, for the most part. The difference was considerable in some of the smaller, enclosed areas. I suspect there is a reason for the discrepancy :). I'd add, it's not like Metro Exodus looks bad with it off. That game is freaking gorgeous either way.

The other problem would be performance. Great, "ray trace" an area of the game to an extreme. It looks amazing. It would look even more amazing if it didn't send FPS off a cliff. Quality may go up, but it's a moot point if performance is tanked in the process. I'm not at all saying it will yield those results in all cases or implementations. I'm saying it would intuitively be a great way to bring hardware to its knees.

Until games hit a point where they're widely implementing ray tracing in a way that drastically improves visuals with reasonable performance on affordable hardware, it comes off as yet another fish hook with shiny bait attached to it. It's hard enough sifting through all the bullshit, misleading, hyped-up specs on hardware for... everything. If there are any lessons to be learned from the history of the hardware world, approaching "new" tech apprehensively and keeping a healthy resistance to the plethora of misleading marketing are toward the top of the list.

Again, none of this is to say cards should be avoided if they have the words "ray tracing" slapped on them. It's to illustrate that ray tracing is being pushed as hook, line, and sinker to get people paying more than they would otherwise. I'm entirely too cynical to buy into the hype. At least for existing cards with the "feature".
 
Ray tracing is basically realistically simulated lighting. Simply put, as a dev, you only need to set a light source and its parameters; the rest (light distribution, diffusion, reflections, refraction) is handled automatically by the tech.
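To make the "set a light source and the rest is automatic" point concrete, here's a toy sketch (nothing to do with Nvidia's actual RTX pipeline; the scene and numbers are made up). We cast one ray at a sphere and the brightness falls straight out of the geometry and the light's position; nothing is hand-painted:

```python
# Toy ray tracer fragment: one ray, one sphere, one point light.
# The only artist input is the light position; shading is derived.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit length, so the 'a' term is 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(hit_point, center, light_pos):
    """Lambertian (diffuse) shading: surface normal dotted with light direction."""
    normal = normalize(tuple(h - c for h, c in zip(hit_point, center)))
    to_light = normalize(tuple(l - h for l, h in zip(light_pos, hit_point)))
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))

origin, direction = (0, 0, 0), (0, 0, -1)
center, radius = (0, 0, -3), 1.0
light = (2, 2, 0)  # the one thing the "dev" places

t = ray_sphere(origin, direction, center, radius)
hit = tuple(o + t * d for o, d in zip(origin, direction))
print(round(shade(hit, center, light), 3))  # prints 0.577
```

Move the light and the shading updates by itself; that's the contrast with rasterized pipelines, where much of this ends up baked or approximated per-effect by hand.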

As a user, it's a bit comparable to the "pixel shader" leap of the early 2000s, I think. Not as extreme - when that tech came into use you literally had to buy a card that supported it or the game couldn't run - but it's still in its infancy.

Without it - as was the case in Metro Exodus, for example - all of those need to be "written" in. That's why you get unnaturally lit or unlit areas: the lighting is independent of an actual source or focal point. The video really just points out how "well" the lighting was made without the added benefit of automatic simulation, at least in the open areas, where the detail is a bit harder to notice given the nature of the illumination.

The visual fidelity difference between the first two pics is telling. The first seems like your run-of-the-mill lifeless empty area, part of a grander open world that simply didn't need developer attention, while the latter seems part of a well-crafted ambient environment.

Some extreme showcases of ray tracing's benefits probably wouldn't be possible using the standard approach, but to be able to see them, scenes must be built specifically to take advantage of the tech (one example of this would be the Minecraft crystal cubes).

There simply hasn't been a game made to fully display what ray tracing is capable of. That said, such a game would probably require much more processing power than current-gen ray tracing cards can handle. So we need to make do with crumbs - very pretty crumbs, but probably not much else, for now.

All of this isn't to say you shouldn't buy an RTX card. The 2080 is simply a damn good card, RTX or not, and the price is the same as it was when the 1080 launched (around 750 euros), at least in my area.
 
In general, Nvidia isn't new to cramming their GPU die with ASICs. They did it with the tensor ASICs, and are now doing it with ray tracing ASICs. The question is, what's next? Even the gigantic GPU die Nvidia is using isn't unlimited, so this approach doesn't really scale well. As above, at that rate it's better to split the function into a separate device (the same way the GPU was split from the CPU to begin with).
 
Okay, but the thing is, I get great performance out of my 1080 Ti with RT on?

I'm fairly confident RT isn't just "on". I spent a bit of time the other day reading up on it for shits and giggles and it sounds like there is quite a bit of nuance to how it's applied, what it's applied to, the extent of the application, etc. The reading indicates one technique for getting a certain effect efficiently, in terms of the expense incurred on hardware to perform RT, may not deliver another effect efficiently. Likewise, the same approach may not be efficient when RT is applied to a different type of object.

All of the above is why I come out feeling slightly irritated with the marketing. A card or game "supports" ray tracing. That sounds great but it doesn't tell you anything particularly useful. It says nothing of the graphical benefits you're going to get from the support nor the performance loss it may entail. Buying a souped up RTX card right now for CP2077 exclusively because it supports RT sounds misguided. Buying a new card right now because you want a better card with RT as a possible cherry on top down the road might make sense.
 
Yes, but RT can be faked with OpenGL and with Vulkan, which is an OpenGL/DirectX hybrid.

That is kind of part of the point. RT can be faked, in which case it's pseudo-RT. Even when it isn't, it doesn't sound like there is a one-size-fits-all approach to applying it to everything you may see in a given scene or game. While I don't know this to be absolutely true, intuitively I'd expect some variance in how it is applied. As an example, it may be applied to one type of object in a certain game but not in others (by object I mean type of surface). One game may make the proverbial "rays" bounce three times vs. another making them bounce twice. Those differences in application within the game itself, regardless of whether the hardware can handle the demand ("RTX" card or not), would presumably change the result.
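The bounce-count point can be made concrete with a hypothetical back-of-envelope count (the function and all its numbers are illustrative, not any real engine's behavior). Assuming each hit can spawn a fixed number of secondary rays per bounce, the ray budget grows fast with depth:

```python
# Hypothetical cost model: upper bound on rays traced per frame as a
# function of bounce depth. Purely illustrative; real engines prune
# aggressively and denoise instead of brute-forcing.
def rays_cast(pixels, samples_per_pixel, max_bounces, secondary_per_bounce=1):
    """Sum rays over all bounce depths: geometric growth in the
    branching factor `secondary_per_bounce`."""
    per_path = sum(secondary_per_bounce ** d for d in range(max_bounces + 1))
    return pixels * samples_per_pixel * per_path

frame = 1920 * 1080
for bounces in (1, 2, 3):
    # with 2 secondary rays per bounce, each extra bounce roughly
    # doubles the marginal cost
    print(bounces, rays_cast(frame, 1, bounces, secondary_per_bounce=2))
```

So "supports ray tracing" genuinely underspecifies the product: two games (or two quality presets) with different bounce budgets are doing very different amounts of work and producing different images.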

Marketing tends to omit these finer details. It's more like they're saying you need to buy X because it "supports" ray tracing, no other technology can do so, and you'll get a fully ray-traced, super awesome graphical experience. It doesn't appear to be that simple, and that's relevant. It would be like buying a QLC NVMe drive: you started out in a Mercedes, then the cache got full and your fancy car transformed into a wagon pulled by a three-legged horse. Find that on the spec sheet :).
 