DLSS 2.2.11 works miracles (Remember to update Nvidia Drivers)

From my understanding, they are meshing three different techs in the new DLSS pipeline: the existing pixel scaler/temporal anti-aliasing, AI-generated motion-based frames, and their low-latency tech Reflex. The system will analyse the last few frames and, based on current motion, predict the next frame(s) that would have been rendered, but will generate them directly instead of waiting for the actual GPU/frame buffer (a rough sketch of the idea is at the end of this post).

I think Nvidia Reflex is thrown in the mix to lower the input lag as much as possible, since frame interpolation could otherwise feel janky and laggy.

Digital Foundry should have a video on the subject in the next few days.
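
To make the frame-generation idea above concrete, here is a minimal toy sketch: warping the previous frame along per-pixel motion vectors to synthesize a frame that was never rendered. This is purely illustrative and not Nvidia's actual algorithm; every name in it is made up, and the real DLSS 3 pipeline also uses a dedicated optical flow accelerator and fills disocclusion holes with the network.

```python
import numpy as np

def generate_frame(prev_frame: np.ndarray, motion_vectors: np.ndarray) -> np.ndarray:
    """Toy frame generation: scatter each pixel of the previous frame
    along its motion vector instead of rendering a new frame.

    prev_frame:     (H, W, 3) RGB image.
    motion_vectors: (H, W, 2) per-pixel (dy, dx) displacement, the kind
                    of data a game engine already feeds to DLSS.
    """
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Predicted landing position of every pixel, clamped to the screen.
    ny = np.clip(ys + motion_vectors[..., 0].round().astype(int), 0, h - 1)
    nx = np.clip(xs + motion_vectors[..., 1].round().astype(int), 0, w - 1)
    out = np.zeros_like(prev_frame)
    out[ny, nx] = prev_frame[ys, xs]  # move pixels; uncovered holes stay black
    return out  # the real pipeline inpaints those holes with the NN
```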
IIRC, the term Nvidia Reflex originally described their low-latency monitor technology (e.g. on the ASUS ROG Swift PG259QNR 360 Hz), but now it also seems to cover the standard Low Latency Mode that can be activated in the Control Panel, independent of the monitor. Low Latency Mode reduces the number of pre-rendered frames. They're probably going to reuse the term, even if it's a completely new technology.
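
To put rough numbers on why reducing pre-rendered frames matters, here is a back-of-envelope sketch. It assumes, simplistically, that each frame waiting in the render queue adds about one frame time of input latency; the queue depths are just typical example values, not official figures.

```python
# Back-of-envelope: each pre-rendered frame queued up by the CPU adds
# roughly one frame time of input latency before your input hits the screen.
fps = 60
frame_time_ms = 1000 / fps  # ~16.7 ms per frame at 60 fps

for queued in (3, 1):  # e.g. a deep default queue vs. a capped one
    print(f"{queued} queued frame(s): ~{queued * frame_time_ms:.1f} ms of queue latency")
# 3 queued frame(s): ~50.0 ms of queue latency
# 1 queued frame(s): ~16.7 ms of queue latency
```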
 
I imagine that the 40 series will at least have the merit of lowering the price of the 30 series (I hope) :)
Sadly, I'm not sure that will happen. I've been hearing some weird rumors around Nvidia the last couple of weeks. They are not happy the bitcoin mining market collapsed, and now they are sitting on loads of GPUs that they want to sell at as high a price as possible. We shall see what happens, but the dip that already happened with the 3000 series is probably the lowest it will get. Most stores where I live are running out after the big sales atm, and it seems nobody wants to risk ordering more and being stuck with it when the 4000 series comes -.-
 
Very possible. At the same time, the context is quite unusual (without stock, there's no reason to lower the price...).
 
That's the rumor: they have stock, but they are limiting sales to keep prices up. Supply and demand, so if they limit sales to third-party GPU makers, they can't flood the market. Stores have had big sales on these cards, so they are getting emptier. Like I said, I've not seen any real evidence for it except their latest report for shareholders. Some very shady comments in that -.- oh, and the stock situation in my country.
 
Ok :)
It was just a guess, because I believe it's better to sell a 30 series card at a little less profit than to sell nothing at all because the 40 series is too expensive (but I'm no businessman ^^).
 
I'm not either, but it kinda makes sense. That's why I'm a bit worried. We shall see. October 15 seems to be the 4000 series release date.
 
You're right, and not only that, but there's also the power consumption of the 40 series, which will likely force a lot of people to upgrade their PSUs etc. Nvidia is asking a lot. Adding to that, the cards announced by some of their partners like Asus and Gigabyte look ridiculous. At this point, tower manufacturers need to start building special containers in the case to hold these bricks... I mean GPUs.
 
Yea, they're getting bigger and more power hungry. Sadly, it's probably the way to go if you need more and more computing power; even if you make it more efficient, you still need the uplift in raw performance too. I think Nvidia is kinda swinging for the fences now because of AMD. If the specs are correct, the 4090 has 2x the TFLOPS of the 3090 (quick math at the end of this post). It's rare to see that kind of uplift in that short a time.

I kinda like the new "bricks" though, since at least they seem rigid and might not sag as the 3090 did. And I always watercool them anyway, so I'll be disassembling it pretty fast if I get one :D Hopefully the cooling solution will keep up with the added heat from the higher power consumption too.

The new PSUs will be interesting to see too, since the new PCIe 5 cable is rated for 600 watts. Hopefully it won't be as odd as the 3090, but I kinda doubt it; Gamers Nexus saw some really big spikes on most cards, so the announced 450 watts might spike much, much higher. They're basically big space heaters at this point with some wicked compute power :D It's both really silly and fun at the same time.
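For what it's worth, that 2x figure roughly checks out on paper: FP32 TFLOPS is just shader count times 2 FLOPs per clock (one FMA) times boost clock. The core counts and clocks below are the announced specs as I recall them, and paper TFLOPS never translate 1:1 into game performance.

```python
# FP32 throughput: each CUDA core retires one FMA (2 FLOPs) per clock.
def tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * 2 * boost_ghz / 1000

rtx_3090 = tflops(10496, 1.70)  # ~35.7 TFLOPS
rtx_4090 = tflops(16384, 2.52)  # ~82.6 TFLOPS per the announced specs
print(f"3090: {rtx_3090:.1f} TFLOPS, 4090: {rtx_4090:.1f} TFLOPS "
      f"({rtx_4090 / rtx_3090:.1f}x)")
```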
 
Funny you mention that, because in Gamers Nexus' review the Galax 40 series cards come with a GPU "support stick". They will definitely sag lol. When the manufacturer starts including a support stick... RGB of course, and no doubt at an additional cost buried in the final price... then you know we have a problem. It's certainly entertaining to watch.
 
Hehe, well, prices are probably going to be crazy. That's the way Nvidia and most of today's stuff works; tbh, it has always been like that when it comes to computer parts. The Asus gfx card looks like it has a solid frame around it now, so I'm hoping that one won't sag. Don't really care how it looks though, since it's performance I'm after. I'll probably wait for some time though...
 
This is true of all parts tbh; CPU threads and so on have been very underutilized in gaming until recently, since the older systems could not use more than 4 anyway. Thank god there's DirectStorage on the consoles now, so we can at least start using that.
That's true, but it is also nice to have a couple of unused threads on the side for things like Firefox and the like ;)

Yea, I might get one too. I'll wait a while though to see if prices drop, since they have pretty much screwed us with the 3000 series ^^ Nvidia will probably be better than AMD on RT at least. Raw compute will probably be similar, or slightly ahead for Nvidia, since AMD goes the efficiency route.
I agree, although it will still be quite interesting to see how close the 4090 and 7900XT (7950XT?) will be.

I really wish they started making consoles properly upgradeable. They could make their own upgrade parts or something. But that would make it hard to set standards for what a game can do, and so on. Anyway, rant off ^^
I think the whole point of consoles is not needing to upgrade them, so offering such a path would conflict with their intended simplicity.

That was my reasoning, too. They looked at what the scalpers/miners were paying and thought "hey, we could do some scalping of our own!", so they jacked up the prices on the 4000 series while locking the new DLSS behind a new-gen paywall. The most questionable move is the 4070 being rebranded as a "4080 12GB". I've been with Nvidia for over a decade, but I'll seriously consider AMD this time around. Anyway, it took 15 months to get my 3090 at a reasonable price; I can wait for prices to drop.

I just hope they'll continue to improve DLSS 2.x and not abandon it altogether.
I used to be an AMD user for quite a while, until I went nuts and wanted the absolute high end with RT. Then you are basically stuck with Nvidia...

Currently, no one needs a 4080/90 to play any game on the market. You don't even need a 3080/90. People just unfortunately fall prey to marketing and upgrade for no other reason than "bragging rights". I've had a 1080 since launch, and I've yet to need to change it. I not only game with it but also do video editing. I could change it, but I don't enjoy giving my money to wealthy companies for no reason.
This has been addressed before, but not everyone owning a 3090 or wanting to purchase a 4090 falls prey to marketing or bragging rights. I, for one, have a 3090, and I intend to buy a 4090 or 7900XT simply because I want to play at the highest settings at a decent framerate. I do think there is a difference between ultra and high settings, albeit I agree the difference is not that large. Also, I want to have RT at maximum, and given that I have a 5120x1440 monitor, DLSS below Quality starts having too many issues.

As a result, I buy one of those high-end GPUs, well aware that I pay extra for a comparatively smaller performance uplift.

When you see people swapping GPUs at every new release, then you know it's not about performance anymore; it's just about having the latest thing, because performance rarely changes drastically from one development cycle to the next.
I think if you go high-end, you have pretty much already made the decision to swap GPUs often. Naturally, it is good to look at reviews and such, but if I decide I will spend $1.6k on a new GPU, then that's perfectly fine. A friend of mine gets new bikes rather frequently, and compared to those, PC gaming is cheap, even at these prices. Likewise, gaming is a hobby of mine, and I have no qualms about putting money into it.

Yea, at 1080p/1440p it's harder to justify a 3090 within reason, unless you really need the extra VRAM. I was very conflicted before buying it too, but at 4K nowadays I see up to 16 GB of VRAM "usage" at times, so I'm glad I didn't get a 3080 with 10 GB -.-
Same. Now even more so, given that with the settings I use for Cyberpunk 2077, I see VRAM allocation going up to 22 GiB.
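
Worth noting for these numbers: most overlays report allocated VRAM, not what the game actively touches, so they tend to overstate the real requirement. If you want to read the allocation figure yourself, Nvidia's NVML bindings expose it; a small sketch using the nvidia-ml-py package (GPU index 0 assumed):

```python
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
# NVML reports *allocated* VRAM, not the active working set, so this is
# the same "usage" figure overlays show rather than true demand.
print(f"VRAM allocated: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```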

^ That's also coming from someone who used to be a "Power User" during the 1990s and early 2000s. I get the lure of trying to put together the biggest, baddest rig, spending hundreds of extra dollars trying to squeeze every last frame out of any game, going for those huge numbers! I get it well.
Kind of a fun fact, but when I was younger I never went high-end, because I could not really afford it and I was also rather content with my GPUs. To me, people striving for the high end seemed a bit detached from reality, as if they were just throwing money away; after all, I enjoyed those games as well, at reasonable settings of course. Nowadays, though, I do chase the high end, but I understand both sides.

(I'm not stating that everyone not going for the high end can't afford it!)

Looking forward to the 4000 series... I'd simply wait. I'd like to see what the real-world performance is like once the cards are on the market. I would not put too much stock into DLSS 3 being some revolutionary step forward; I really doubt the results of DLSS 3 are going to be all that noticeable compared to present DLSS. As graphics approach true 4K resolutions, we are getting seriously into the realm of diminishing returns. The simple fact is that, at those resolutions, the human eye can't actually see the pixels. I might be able to clean up the edges of a polygon with nearly 300% additional accuracy, but unless you hold a magnifying glass up to the screen, you won't even notice.
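
The diminishing-returns point can be sketched with the usual pixels-per-degree rule of thumb: around 60 PPD is commonly cited as the limit of 20/20 acuity. The panel size and viewing distance below are example values, not anything from the post.

```python
import math

def pixels_per_degree(h_res: int, screen_width_cm: float, distance_cm: float) -> float:
    """Horizontal pixels packed into one degree of the viewer's visual field."""
    h_fov_deg = math.degrees(2 * math.atan(screen_width_cm / (2 * distance_cm)))
    return h_res / h_fov_deg

# A 27" 16:9 panel is ~59.8 cm wide; 60 cm is a typical desktop distance.
print(f"4K at 60 cm: ~{pixels_per_degree(3840, 59.8, 60):.0f} PPD")
# ~72 PPD, above the ~60 PPD acuity threshold, so individual pixels
# are no longer resolvable and extra edge sharpening goes unnoticed.
```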
That's always a reasonable approach.

When it comes to releases, I think publishers are gonna focus on releasing games for the most users. And console users outnumber PC gamers by orders of magnitude.
I disagree with console gamers outnumbering PC gamers by orders of magnitude. Sure, they outnumber PC gamers, but I would argue that PS, Xbox, and PC each hold roughly 1/3 of the cake ;)
Depending on the game, one platform might outnumber another, like with Cyberpunk 2077, but it is usually distributed fairly evenly.
 
So here is a thought (and maybe someone has brought this up already)...

But CDPR was obviously working with NVIDIA on this. Cyberpunk was used to show off DLSS 3 and also the new ray tracing stuff. They then went on to show how easy it is to update older games with that technology.

Perhaps that is why we haven't gotten the next-gen version of The Witcher 3 until Q4 of this year? Maybe they are using that technology in the next-gen version to really soup it up and make it look even more insanely beautiful than it already is?
 
That's kinda the point being made: high end doesn't mean swapping GPUs every cycle. In the PC community, it's about future-proofing, which is the opposite of that. That's why PC gamers got away from consoles, so they don't have to buy a new thing every Christmas.

Again, if spending on the latest is your thing, great, but practically speaking, performance doesn't change drastically enough for the average gamer to warrant it. As others have said, you'd never notice the difference, and the majority don't even have a capable monitor; it would just be downscaled. I have a high-end build, so I'm not suggesting people shouldn't; it's about what your need is.

For enthusiasts and professionals it's fine, but those aren't the people being targeted with these edgy card designs or Anonymous-looking box art with names like "The Night Baron" or "Brutal by Nature". My personal favorite: "The Midnight Kaleidoscope" lol

Anyway, my point was encouraging informed decisions if the objective is performance, rather than impulse buys. I'll leave it at that so we can get back on topic.
Lol, I don't think you can disagree with facts. PC gamers in no way have that large a share of the market, as much as I would love that. Game sales support this.

Consoles are essentially plug-and-play when it comes to gaming, and that appeal is why they will always outnumber PC gamers. There's a lot of maintenance involved with PCs that the average or casual gamer will not bother with.

I'm thankful CDPR always ensures a simultaneous release for PC, but most publishers will prioritize console releases, and technologies like the one being discussed here will take years before we see their impact on the wider gaming community.
 
From what I've seen, this frame interpolation technique causes even more artifacts than traditional DLSS. In the Cyberpunk-related footage, I can clearly see some distant building textures flickering; this does not happen when DLSS 2 is enabled. I like the performance boost DLSS brings, but I cannot stand all those surfaces flickering as I move around: tiles, street lights, and so on. It is very noticeable at 1440p and DLSS Quality with the game maxed out. Deactivating DLSS shows how clean and stable the image is; the only problem is the 20 fps you get hehehe...
 
Hmm, I haven't noticed that in some time. The net meshes do get odd at times from some distances, but generally it looks almost full res at 4K/Performance. Hopefully with the next-gen cards you can play at at least 40 fps without DLSS :)
 
Just installed DLSS 2.4.12.
It fixed almost all the blinking reflections on white metal parts.

Now the only problem I get (and I had it with 2.3.4) is that some glass textures are not clear; they look blurry, as if it were frosted glass. For some of them that seems logical, but sometimes it's the glass of shops with neon behind it.
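
For anyone who wants to try the same thing: upgrading a game's DLSS version usually just means replacing the nvngx_dlss.dll it shipped with. A sketch of the swap below; both paths are illustrative and will differ on your machine, and keeping a backup of the original DLL is the important part.

```python
import shutil
from pathlib import Path

# Illustrative paths; adjust for your own install and download locations.
game_bin = Path(r"C:\Program Files (x86)\Steam\steamapps\common"
                r"\Cyberpunk 2077\bin\x64")
new_dll = Path(r"C:\Downloads\dlss_2.4.12\nvngx_dlss.dll")

target = game_bin / "nvngx_dlss.dll"
backup = game_bin / "nvngx_dlss.dll.bak"

if not backup.exists():
    shutil.copy2(target, backup)  # keep the DLL the game shipped with
shutil.copy2(new_dll, target)     # drop in the newer DLSS build
print(f"Swapped in {new_dll} over {target}")
```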
 
Is this with ray tracing, rasterization, or does it happen on both?
 
Around minute 7:53, look at the big building's texture in the upper right corner of the screen... it's flickering. In the other two clips it's not. Also, cables and such are still very unstable; I hope they will be able to improve on that.
 