DLSS 2.2.11 works miracles (Remember to update Nvidia Drivers)

I have tested the latest DLSS version, 2.4.3, and I instantly noticed severe ghosting on AVs, the legs of moving NPCs, and other rapidly moving objects. It looks absolutely horrendous in Hitman 3 with its newest patch, too.

From my personal experience, the DLSS version the game now ships with is hugely better in terms of ghosting than DLSS 2.4.3. DLSS 2.3.9 seems to be the sweet spot to me so far, until they get rid of ghosting issues with the 2.4.x iterations.
Interesting... to me it seemed like 2.4.3 and 2.3.9 were similar in terms of ghosting, so I'll have to check again...
 
I'm thinking the driver might also affect DLSS in some ways; I noticed a pretty big drop in VRAM usage with the newer driver.
 
I ran the in-game benchmark and did not notice any performance regression or improvement; however, it seems that the trailing has been reduced. It would be interesting to hear other opinions, though.
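If anyone wants to compare versions with numbers rather than eyeballing the result screen, here's a minimal sketch of how I'd do it. It assumes you've captured frame times to a plain text file (one milliseconds value per line) with a capture tool such as PresentMon; the file names and version labels are just placeholders.

```python
# Minimal sketch: compare two benchmark runs from frametime captures.
# Assumes plain text files with one frame time (in ms) per line; the
# file names below are hypothetical placeholders.
import statistics

def summarize(path):
    """Return (average FPS, 1% low FPS) for one capture file."""
    with open(path) as f:
        ms = sorted(float(line) for line in f if line.strip())
    avg_fps = 1000.0 / statistics.fmean(ms)
    worst = ms[-max(1, len(ms) // 100):]   # slowest 1% of frames
    low_fps = 1000.0 / statistics.fmean(worst)
    return avg_fps, low_fps

for label, path in [("DLSS 2.3.9", "run_239.txt"),
                    ("DLSS 2.4.3", "run_243.txt")]:
    avg, low = summarize(path)
    print(f"{label}: {avg:.1f} avg FPS, {low:.1f} FPS 1% low")
```

Average FPS alone can hide stutter, which is why the sketch also reports a 1% low.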
 
That "trailing" is likely a result of TXAA.

The most bloody effective anti-aliasing technique ever created that I hate with a passion.

DLSS (at least normally) disables all standard AA methods.
 
Interesting...

Anyway, there was some trailing on the birds in the benchmark on the previous version, but it seems to be gone now. I'll have to keep playing before making any confident statements, but as I've said, it appears gone now :)
 
They released a new DLSS version again. I will give it a try when I have some spare time and also conduct a quick benchmark.

 
Given that there are many reports of 2.4.12 being awful compared to 2.4.6, I will skip benchmark-comparing this version, since I will definitely not be using it anyway.
 
There was talk of GeForce Experience installing the latest DLSS DLL known to work for you as part of its optimization, but if I look in my bin directory, it doesn't appear to have applied the 2.4.12 library. Isn't it better to use GeForce Experience for this?
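In case anyone prefers doing the swap by hand instead, here is a minimal sketch of the usual backup-and-replace. The Steam install path and the downloaded DLL location are placeholders you'd adjust for GOG/Epic or a different drive; nvngx_dlss.dll in bin\x64 is the library the game loads.

```python
# Minimal sketch of a manual DLSS DLL swap for Cyberpunk 2077.
# Both paths below are assumptions for illustration; adjust for your setup.
import shutil
from pathlib import Path

GAME_BIN = Path(r"C:\Program Files (x86)\Steam\steamapps\common"
                r"\Cyberpunk 2077\bin\x64")                # assumed install path
NEW_DLL = Path(r"C:\Downloads\dlss_2.3.9\nvngx_dlss.dll")  # version to try (placeholder)

def swap_dlss(game_bin, new_dll):
    current = game_bin / "nvngx_dlss.dll"
    backup = game_bin / "nvngx_dlss.dll.bak"
    if not backup.exists():            # keep the untouched original, once
        shutil.copy2(current, backup)
    shutil.copy2(new_dll, current)     # overwrite with the new version
    print(f"Swapped in {new_dll} (original saved as {backup.name})")

swap_dlss(GAME_BIN, NEW_DLL)
```

Restoring is just copying the .bak back over; note that a game update or file verification can put the shipped DLL back, so it's worth re-checking the version after patches.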
 
Speaking of DLSS, is anybody here aware of the FSR 2.0 mod? It's quite interesting for people like me with an older NVIDIA GPU, or no NVIDIA GPU at all.

Take a look at what this (IMO better) alternative can do compared to DLSS. It looks quite promising (the mod has a few problems right now, like ghosting), but the FPS boost is really nice, judging by what I have seen.
 
The interesting thing also is that it works on any other game that supports DLSS, and it seems quite "easy" to integrate; presumably that's because DLSS and FSR 2.0 consume essentially the same inputs (color, depth, and motion vectors), so the mod can pose as the DLSS DLL and hand those inputs to FSR 2.0 instead. This bodes well for the future of FSR, given that developers might be able to integrate it just as easily.
 
Not sure how to feel about this... I just got my 3090 a few months ago. The pricing is crazy high, and DLSS 3 being locked to the 4000 series really sucks.

It's the onward march of technology. The 40 series likely has features that DLSS 3 needs which are missing from the 30 series. (Windows 11 won't install on some computers because they lack certain features!)

This just seems to be a more significant jump than we expected. I think it's awesome, but also know that the price and energy costs of these new cards will put them out of my reach for the time being. Sigh.

One of the things I was thinking was how CP2077 really seems to have been designed for technology that didn't exist yet! And now, we're getting there! :)

I've been amused by comments from PS5 owners saying full RT should be implemented on that console... which costs significantly less than one of these cards! (I love my PS5, but the technical requirements for ray tracing are incredible.)

I'll be interested in the GPU landscape in another year - I'm hopeful that we may have some affordable options.

Cheers,
Merric
 
Yeah, I think it's the "new" generation of Tensor cores. Just like RT efficiency gets better and better with each generation as they change the RT cores. Also, I just saw on the slides... 1400 Tensor cores vs 320 in the 3090... that kind of explains the difference right there, I think.

I'm actually a bit scared for the future. The consoles are already so far behind top PC hardware that it's going to start limiting PC gaming way faster than I thought. A 4090 has the TFLOPS of 7.5 PS5s -.-
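For a rough sanity check on that figure, using the public spec-sheet numbers: the 4090 is rated at about 82.6 FP32 TFLOPS and the PS5 at about 10.3, and 82.6 / 10.3 ≈ 8, so 7.5 to 8 PS5s is the right ballpark (keeping in mind raw TFLOPS is a crude comparison across architectures).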
 
I hope new consoles will be released more frequently... to try to keep up with PCs. But on the other hand, I can't see an Xbox sold at the price of a high-end PC (you said 7.5 PS5s, but in your opinion, how many PS5s will this GPU cost? ^^)
 
That's true, but at the extremes it just doesn't matter. The PS5 and Xbox have pretty much the same specs (with small differences), and compared to a 4090 they're way cheaper, of course. The thing is, games are not going to be able to take advantage of all that, or we'll have the same situation this game had: runs great on PC and can't run on consoles. This is one of the reasons high-end PCs ran the game pretty well even at launch; we just brute-forced it a bit more. We haven't seen the cheaper cards of the Lovelace generation (4060/4050) yet either, but if even those are cheaper and/or faster than the consoles, I foresee issues in the future if it takes 8 years for next-gen consoles -.-

Oh, I didn't answer your question: it's around 3 Xboxes for a 4090 at MSRP prices.
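(Going by launch MSRPs, that's $1,599 for the 4090 against $499 for a Series X: 1599 / 499 ≈ 3.2, so about three consoles for one card.)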
 
Yeah, it's not new that consoles are (far) behind, but I think the gap will grow more quickly (8 years is an eternity ^^).
Just my opinion, but I'm not sure that studios will limit their games because of console limitations. So I think we will see more and more PC-exclusive games (or games released on consoles years later). As a console user, I totally understand it... If I were making a game, I would want to use the most advanced technology to make it as good/beautiful as possible.
Yeah. And that's just the GPU; with a 4090 alone, you can't play a game :D
 