I have found that the differences are subtle in some areas of the game but quite obvious in others, as you can see from the screenshots below.
Raytracing OFF
View attachment 11243986
Raytracing ON
View attachment 11243989
Raytracing OFF
View attachment 11244001
Raytracing ON
View attachment 11244007
While I agree that it's pretty taxing and resource-heavy, it does make some difference that can be fairly easily seen. The way it works is more dynamic than the old ways of doing lighting, and shadows generally look more "real". But it is a resource hog, and since it's not that well developed yet, it's even worse. It's also a bit unfortunate that there are already different ways of doing it. One problem I wasn't expecting is the toll it takes on the graphics card.

The only real difference I noticed between raytracing on/off is the overall tax on your rig exploding to uncanny levels. Joe Biden will invite you to the White House for a cup of coffee a lot sooner than you'll ever play CP77 with raytracing on, on your laptop, on battery power alone, without it stuttering itself stuck or simply giving up.
Not going to bother mentioning extreme rises in temperature.
Raytracing sounds awesome as sales propaganda, but it's another one of those new features that come very close to the limits of the human eye, hence the very meager differences noticeable in your pictures.
I think it's wise advice to consider whether or not this progress is worth the high resource demand.
Maybe it's the number of light sources and reflections, which are pretty insane in Cyberpunk and presumably fewer in Valhalla (I didn't play it, but for a "medieval" England that seems obvious to me).

If you take a look at the power usage of my graphics card, you can see the effect of resolution scaling more clearly. Since the RTX cards have RT/tensor cores and use them specifically for these things, I hit my power limit much more easily than I do in, for example, Assassin's Creed Valhalla, which doesn't. I wasn't expecting it to be quite such a power hog.
Hmm, Assassin's Creed doesn't use RTX or DLSS at all, so I'm guessing those cores don't pull any power when I play that game, and I see lower power usage just because of that. I only compared it because it's a newer game with similar graphical fidelity. There isn't as much light as in CP2077, but they have other stuff and another engine, so it's not exactly apples to apples. Comparing to Metro Exodus and Control would make more sense in that case, and there I see power usage similar to CP2077. Anyway, it's a pretty cool way to do light/shadows and reflections, so I'm all for it.
While I think there are some pretty big memory leaks or bugs in the system, I don't get that stutter-lag that forces me to restart the game. The game does crash at times, but it's never something I can notice before it happens. I use DLSS 2.2.11 (updated the file), and I don't use auto but set the level myself. Auto determines the scaling based on FPS, I think, so it might change and cause the stutter you're talking about. I'm not an expert though, so I'm basically guessing here.

I'm running raytracing on a pretty modest rig: Ryzen 7 3700X, Nvidia RTX 2060, 32 GB RAM. I get around 50 to 60 FPS with ray-traced lighting at ultra, shadows on, and DLSS set to auto; the highest temps are around 43°C CPU and 73°C GPU, well within tolerated limits. It is definitely playable without stutter, however there appears to be some kind of memory-leak bug: after about 20 to 30 minutes of playing with RT on there is a stuttery lag, which goes away after restarting the game but then returns.
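The guess about auto mode could work roughly like this: the driver picks an internal render scale from the measured FPS, and each switch between scales can show up as a hitch. This is just a made-up sketch of that idea; it is not NVIDIA's actual DLSS logic, and the function name, target FPS, and scale steps are all assumptions for illustration.

```python
# Hypothetical FPS-driven render-scale picker -- NOT real DLSS code,
# just an illustration of why an "auto" mode might change scale mid-game.

def pick_render_scale(fps: float, target_fps: float = 60.0,
                      scales=(0.50, 0.58, 0.66, 0.77, 1.00)) -> float:
    """Return the largest render scale the current FPS headroom supports."""
    ratio = min(fps / target_fps, 1.0)  # 1.0 means we're hitting target
    for scale in reversed(scales):      # try full resolution first
        if ratio >= scale:
            return scale
    return scales[0]                    # worst case: lowest internal res

# As FPS dips, the chosen scale drops -- each switch can register as a stutter.
print(pick_render_scale(60))  # 1.0  (at target, render at full resolution)
print(pick_render_scale(45))  # 0.66 (75% of target -> step down)
print(pick_render_scale(30))  # 0.5  (half of target -> lowest scale)
```

If something like this is what's happening, fixing the DLSS level manually (as described above) would keep the scale constant and avoid those switches.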