DLSS 2.2.11 works miracles (Remember to update Nvidia Drivers)

Yeah, it's not new that consoles are (far) behind, but I think the difference will grow more and more quickly (8 years, it's an eternity^^).
Just my opinion, but I'm not sure that studios will limit their games because of console limitations. So I think we will see more and more PC-exclusive games (or games being released on consoles years later). As a console user, I totally understand it... If I wanted to make a game, I would want to use the most advanced technology to make it as good/beautiful as possible.
I kinda doubt it; the high-end PC market is very small, pretty much like SLI was before. The people spending that much money on a gaming hobby will be limited, so there's little reason to cater to that market. The problem is it's hard to scale stuff too much, so you pretty much have to make two different games, and that's just not going to happen. Something running well on a PS5 and a 4090 is a huge difference. True, we on PC will probably get more settings and higher fidelity with more FPS, but it's getting harder and harder to justify these prices unless the games keep up :D Unless you also do other stuff with your PC, of course, like 3D modeling/rendering/AI stuff.

True, for a maxed-out PC you would probably pay a lot more than MSRP for a 4090. But if you, like me, already have a decent PC, it's a smaller cost to keep it up to date. I started with my PC a loooong time ago with like 500 dollars too. That's what I like about PC: the modularity.

Edit: dammit, I derailed the thread, we can take it to private if you want to keep this going :)
 
I kinda doubt it; the high-end PC market is very small, pretty much like SLI was before.
Sure, but I think it's still a "showcase" of what can be done on PC (and what studios can achieve). So limited as it is, it's unavoidable.

If I remember correctly (maybe it has changed since, so I could be wrong), most PC players don't even have an RTX GPU, but that doesn't prevent studios from using ray tracing in their games. So I think it won't be different with DLSS 3 and the RTX 40 series, studios will use it anyway :)
I think it would rather be like this: we will make the most gorgeous game possible with all the newest technology available on PC, and then try to do what we can to adapt the game to consoles.
 
So I think it won't be different with DLSS 3 and the RTX 40 series, studios will use it anyway
Since we know very little about what's different with DLSS 3, it's kinda hard to say. If it's the same and doesn't require having a 2.4 and a 3.0 file difference, it would work on everything but be faster, I'm guessing. I kinda doubt it since they don't seem to implement it on the 3000 series. We "know" that the 4000 series will have many more tensor cores, for example, so higher throughput.

I get your point about demoing it on PC, though, since that has been a thing since forever. It's partially because games get made on PC; they aren't console friendly until the game is pretty much finished, and you need more power to run anything that's badly optimised. Thing is, games aren't really scalable to the degree we will end up with if this keeps up. Nothing against consoles, but I'm seeing a bigger and bigger difference in performance building up.

(This is of course assuming Nvidia is honest with all the figures and so on; they might be exaggerating.)
 
It's the onward march of technology. The 40 series likely has features that enable DLSS3 to work which are missing on the 30 series. (Windows 11 won't install on some computers because they lack certain features!)
This might be the case, but Nvidia might also block older hardware just because this might increase their profits on Lovelace ;)

I'll be interested in the GPU landscape in another year - I'm hopeful that we may have some affordable options.
This depends on AMD and also Intel. If you are not particularly interested in the high-end, then the prices might be more acceptable there.

I'm actually a bit scared for the future. The consoles are already so far behind top PC hardware that it's going to start limiting PC gaming way faster than I thought. A 4090 has the TFLOPS of 7.5 PS5s -.-
Usually developers do a good job at optimising console hardware and getting as much out of it as possible. This can be seen when you compare the visual fidelity of games at the start of a generation with those that are released at the end of one.

I'm not sure that studios will limit their games because of console limitations. So I think we will see more and more PC-exclusive games (or games being released on consoles years later).
I strongly doubt that, developers need the combined purchasing power of PC and consoles to justify the production cost of AAA games.

The problem is it's hard to scale stuff too much, so you pretty much have to make two different games, and that's just not going to happen. Something running well on a PS5 and a 4090 is a huge difference.
Well, there are a few quick wins like increasing the ray count for raytracing, using full resolution on RT reflections, things like that. But the general limitations of an engine will be designed around consoles, and therefore no amount of processing power will change that. (For example, the LOD in Cyberpunk 2077.)
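To make that concrete, here's a toy sketch of what that kind of "dial it up on PC" scaling could look like. The setting names and numbers are invented for illustration only; they're not taken from any real engine or from Cyberpunk itself:

```python
# Toy illustration of per-platform RT scaling: same engine and content,
# just bigger ray/resolution budgets on stronger hardware.
# All names and values here are hypothetical.

RT_PRESETS = {
    "console":     {"rays_per_pixel": 1, "reflection_res_scale": 0.5, "max_bounces": 1},
    "pc_high_end": {"rays_per_pixel": 2, "reflection_res_scale": 1.0, "max_bounces": 2},
}

def apply_rt_preset(base_settings: dict, tier: str) -> dict:
    """Overlay the chosen RT preset on top of the base renderer settings."""
    return {**base_settings, **RT_PRESETS[tier]}
```

The point being: those knobs are easy to turn up, but the underlying engine decisions (LOD systems, streaming budgets, etc.) stay designed around the console baseline.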

(This is of course assuming Nvidia is honest with all the figures and so on; they might be exaggerating.)
Usually they don't exaggerate outright, even though they end up optimising their charts as much as possible.


Personally, I might end up purchasing a 4090 or a 7900XT and sell my 3090, which is what I always have intended anyway. Currently, I'm a bit more interested in the 4090, due to RT Overdrive and DLSS 3.0. Now, I'm quite sure that RT Overdrive can be enabled on AMD GPUs like the 7900XT as well, but it might be inferior in raw RT performance. However, this remains to be seen.

I'm also wondering if a new Cyberpunk 2077 patch will be released around the release date of the 4090, along with RT Overdrive and DLSS 3.0. (It also explains why cdpr_internal_dev3 and cdpr_internal_dev2 are called "Nvidia test branch".)
 
I strongly doubt that, developers need the combined purchasing power of PC and consoles to justify the production cost of AAA games.
Sure, they need both platforms, but I think not at the cost of making their game look "crappy" (compared to other PC-exclusive games) by avoiding new technologies that are unavailable on consoles :)
(I could be wrong, but if that's the case, it just seems to me to be a bad "strategy".)
Edit: but yeah, we're going OT. In any case, I can't wait to see what Cyberpunk does on the next generation of GPUs :)
 
Usually developers do a good job at optimising console hardware and getting as much out of it as possible. This can be seen when you compare the visual fidelity of games at the start of a generation with those that are released at the end of one.
Yeah, people always say that. You can't optimise away 7.5x the pure processing power. Yeah, they learn to use the hardware better and better, but you just can't make up that big of a difference. It still looks good, don't get me wrong, but just like you pointed out, the LODs in this game are an example.
It just feels like "we" are heading into another disaster. It's partly AMD's fault since they compete now and Nvidia can't do small upgrades anymore :D
Well, there are a few quick wins like increasing the ray count for raytracing, using full resolution on RT reflections, things like that. But the general limitations of an engine will be designed around consoles, and therefore no amount of processing power will change that.
Yeah, exactly. They will make some stuff better, but it still won't really make use of the new hardware. You're gimping the more powerful system because the "older" one just can't do the same amount of work. This is true of all parts, tbh; CPU threads and so on have been very limited until recently in gaming since the older systems could not use more than 4 anyway. Thank god there's DirectStorage on the consoles now, so we can start using that at least.
Personally, I might end up purchasing a 4090 or a 7900XT and sell my 3090, which is what I always have intended anyway. Currently, I'm a bit more interested in the 4090, due to RT Overdrive and DLSS 3.0. Now, I'm quite sure that RT Overdrive can be enabled on AMD GPUs like the 7900XT as well, but it might be inferior in raw RT performance. However, this remains to be seen.
Yeah, I might get one too. I'll wait a while though to see if prices drop, since they have pretty much screwed us with the 3000 series ^^ Nvidia will probably be better than AMD on RT at least. Raw compute will probably be similar or slightly ahead for Nvidia since AMD goes the efficiency route.

Really wish they started making the consoles upgradeable for real. They could make their own upgrade parts or something. But that would make it hard to set standards for what we can do with this game and so on; anyways, rant off ^^
 
This might be the case, but Nvidia might also block older hardware just because this might increase their profits on Lovelace ;)
That was my reasoning, too. They looked at what the scalpers/miners were paying, thought "hey, we could do some scalping of our own!", and jacked up the prices on their 4000 series, while locking the new DLSS behind the new-gen paywall. The most questionable move is the 4070 being rebranded as a "4080 - 12GB". I've been with nVidia for over a decade, but I'll seriously consider AMD this time around. Anyways, it took 15 months to get my 3090 at a reasonable price; I can wait for the prices to drop.

I just hope they'll continue to improve DLSS 2.x and not abandon it altogether.
 
That was my reasoning, too. They looked at what the scalpers/miners were paying, thought "hey, we could do some scalping of our own!", and jacked up the prices on their 4000 series, while locking the new DLSS behind the new-gen paywall. The most questionable move is the 4070 being rebranded as a "4080 - 12GB". I've been with nVidia for over a decade, but I'll seriously consider AMD this time around. Anyways, it took 15 months to get my 3090 at a reasonable price; I can wait for the prices to drop.

I just hope they'll continue to improve DLSS 2.x and not abandon it altogether.
I kinda feel you. Been with Nvidia since the 680s this time. Had a short stint with AMD before that, but I was an SLI boy too, so I spent way too much. AMD are looking better and better, and unless you want the best RT experience, I think I would go with them. We shall see how it goes this time around.
 
This might be the case, but Nvidia might also block older hardware just because this might increase their profits on Lovelace ;)
That would never happen. New technologies take years to adopt. There are currently games/consoles in the pipeline that will be in development for years more before release... on older architecture. No company is going to erode their margins by halting legacy hardware abruptly.

Adding to that, there is still a chip shortage; they literally do not have the supply to replace every single card. That's just not how these decisions are made.

Currently no one needs a 4080/90 to play any game on the market. You don't even need a 3080/90. People just unfortunately fall prey to marketing and upgrade for no other reason than "bragging rights". I've had a 1080 since launch and I've yet to need to change it. I not only game with it but also do video editing. I could change it but I don't enjoy giving my money to wealthy companies for no reason.
 
I could change it but I don't enjoy giving my money to wealthy companies for no reason.
I will normally use whatever hardware I have until it's completely useless. And I do not ever recommend buying top-of-the-line stuff (unless one has piles of money to burn and enjoys paying a premium to be used as a live tester.)

In general, purchasing the best hardware from the prior generation is going to result in excellent performance, fewer bugs and issues, and will significantly decrease the overall cost in exchange for much more stable and consistent gameplay.
 
Currently no one needs a 4080/90 to play any game on the market. You don't even need a 3080/90. People just unfortunately fall prey to marketing and upgrade for no other reason than "bragging rights". I've had a 1080 since launch and I've yet to need to change it. I not only game with it but also do video editing. I could change it but I don't enjoy giving my money to wealthy companies for no reason.
Hehe, nobody needs anything really except food/shelter/water. It's all relative. Bragging rights can possibly be a reason for some people; I just want to play my games at high settings at a decent FPS. I also like testing new stuff such as AI/rendering/3D modeling and so on. Can it be done on a 1080? Sure, it will be slower though. 1500 dollars is nothing compared to real pro-level gear, so for me it's worth it. Not sure if I'll go for the 4000 series yet, need to see some tests, and I'll probably wait until next year at least. Have more need of a new MB/CPU atm, tbh.
 
Hehe, nobody needs anything really except food/shelter/water. It's all relative. Bragging rights can possibly be a reason for some people; I just want to play my games at high settings at a decent FPS. I also like testing new stuff such as AI/rendering/3D modeling and so on. Can it be done on a 1080? Sure, it will be slower though. 1500 dollars is nothing compared to real pro-level gear, so for me it's worth it. Not sure if I'll go for the 4000 series yet, need to see some tests, and I'll probably wait until next year at least. Have more need of a new MB/CPU atm, tbh.
If you're an enthusiast, sure, but the average gamer isn't doing rendering, and honestly it's even less likely they have a monitor capable of displaying the high settings they're buying a card for.

When you see people swapping GPUs at every new release, then you know it's not about performance anymore; it's just about having the latest thing, because performance rarely changes drastically from one development cycle to the next.

In any case, as @SigilFey said these releases amount to just beta testing so it's always best to buy the prior generation anyway. I'll probably look to a 3080 once I feel my 1080 is at the end of its life cycle.
 
Currently no one needs a 4080/90 to play any game on the market. You don't even need a 3080/90. People just unfortunately fall prey to marketing and upgrade for no other reason than "bragging rights". I've had a 1080 since launch and I've yet to need to change it.
I'm just curious why you are in the thread for DLSS, then? The GTX 1080 doesn't support that tech. Honestly, if you enjoy the game at lower settings, of course it's fine, all the more power to you. What anyone does with their disposable income is not really an issue here. We are on a Cyberpunk forum talking about bleeding-edge rendering technology; I think it's the right place to discuss the latest and greatest graphics hardware.

Back on the subject, it seems that DLSS3 might someday be available to other RTX gens after all, just without the frame interpolation component (maybe that's the solution to keep backward compatibility simple, instead of going back to DLL hell).
 
If you're an enthusiast, sure, but the average gamer isn't doing rendering, and honestly it's even less likely they have a monitor capable of displaying the high settings they're buying a card for.

When you see people swapping GPUs at every new release, then you know it's not about performance anymore; it's just about having the latest thing, because performance rarely changes drastically from one development cycle to the next.

In any case, as @SigilFey said these releases amount to just beta testing so it's always best to buy the prior generation anyway. I'll probably look to a 3080 once I feel my 1080 is at the end of its life cycle.
Yeah, at 1080p/1440p it's harder to justify a 3090 within reason unless you really need the extra VRAM. I was very conflicted before buying it too, but at 4K nowadays I see up to 16 GB of VRAM "usage" at times, so I'm glad I didn't get a 3080 with 10 now -.-

I kinda do have a problem with Nvidia now though, so we shall see what happens next year. Wish AMD would catch up on RT, but I kinda doubt it :(

Back on the subject, it seems that DLSS3 might someday be available to other RTX gens after all, just without the frame interpolation component (maybe that's the solution to keep backward compatibility simple, instead of going back to DLL hell).
Haven't really read up on DLSS3 yet, sounds like I need to :D
 
I'm just curious why you are in the thread for DLSS, then? The GTX 1080 doesn't support that tech. Honestly, if you enjoy the game at lower settings, of course it's fine, all the more power to you. What anyone does with their disposable income is not really an issue here. We are on a Cyberpunk forum talking about bleeding-edge rendering technology; I think it's the right place to discuss the latest and greatest graphics hardware.

Back on the subject, it seems that DLSS3 might someday be available to other RTX gens after all, just without the frame interpolation component (maybe that's the solution to keep backward compatibility simple, instead of going back to DLL hell).
It might help if you read the thread I replied to, which specifically spoke to the 40 series and the impact on legacy hardware. You can figure the rest out from there. I've never needed to play the game at lower settings, and even if I did, how would that even matter? Lol, I guess that goes back to the "braggers" I mentioned. It might be worth noting that the theme of CP2077 that you bring up is a cautionary tale about the obsession/addiction with the "latest and greatest" technology.
In any case... no one is telling you how to game, mate; I don't even know who you are, nor did I reply to you. If that's what makes you happy, do your thing.
 
Okay, a reminder here that no one's opinion is superior to another's. Let's not start labeling people that choose to focus on things that we, personally, find unimportant. Feel free to express your thoughts and views on the topic. Do not assume to speak for others, and be sure that all views are being respected.


_______________


^ That's also coming from someone that used to be a "Power User" during the 1990s and early 2000s. I get the lure of trying to put the biggest, baddest rig together, spending hundreds of extra dollars trying to squeeze out every last frame from any game, going for those huge numbers! I get it well.

It was the constant frustration that resulted with trying to fight the instability of aggressive overclocking, drivers that weren't working terribly well with state-of-the-art tech, games that simply didn't incorporate brand new features, hardware damage or failures that would occur when pushing it a bit too far...etc. That all ended when I had a looong talk with a Falcon-NW rep on the phone as I configured a nearly $4,000 USD Mach V tower, and he consistently tried to push me away from any overclocking, and discouraged me from upgrading certain components because of the motherboard I was using.

In the end, a system config needs to ensure that all of its components are in balance with one another in order to ensure smooth, stable performance across the board. If a piece of hardware doesn't really line up with the overall system functionality, millisecond to millisecond, then that will result in performance hitching or instability as the rest of the system either fails to keep up or gets too far ahead. Plus, there's no benchmark software on the planet that can accurately simulate what the real-world performance of a system will be running actual applications in practice.

So, as is the case with pretty much everything in life, "throwing money at the problem" won't usually solve anything. Most often, it will simply create new issues. The more I learn about Nvidia's 4000 series, the more I'm beginning to smell fast food. I don't think this is really a new iteration of hardware. It feels more like an attempt to put fancy sauce on a cheap meat patty, then jack the price up because it's "the next big thing".

To be honest, if anyone is looking for an upgrade in the present GPU market, I would recommend the RTX 3060 ti. I was really impressed by the performance of the 3060 standard that came in my present rig, and I'm disappointed that I couldn't track down a model with a 3060 ti. Bang for the buck right now, you can't do much better. Granted, it can't really do any meaningful ray tracing: the tech is there, but it simply doesn't have the power to create playable FPS in most games with ray tracing on. It absolutely screams with rasterized graphics, though. Plus, the voltage requirements mean that anyone using a 1000 or 2000 series card will likely be able to just plug it in (mobo permitting).

If players want to future-proof, I'd recommend a 3090 ti. That is a big, bad, mean card. You'll pay for it, but you'll get what you pay for. They are beasts in terms of both performance and price. (Personally, I absolutely refuse to pay that much money for a GPU. It's unhealthy marketing, and it's setting a terrible precedent for the future.)

Looking forward to the 4000 series...I'd simply wait. I'd like to see what the real-world performance is like once the cards are on the market. I would not put too much stock into DLSS3 being some revolutionary step forward. I really doubt if the results of DLSS3 are going to be all that noticeable when compared to present DLSS. As graphics approach true 4K resolutions, we're getting seriously into the realm of diminishing returns. Simple fact is that, at those resolutions, the human eye can't actually see the pixels. I might be able to clean up the edges of a polygon with nearly 300% additional accuracy...but unless you have a magnifying glass up to the screen...you won't even notice.
 
Haven't really read up on DLSS3 yet, sounds like I need to :D

From my understanding, they are meshing 3 different techs in the new DLSS pipeline: the existing pixel scaler/temporal anti-aliasing, AI-generated & motion-based frames, and their low-latency tech Reflex. The system will analyse the last few frames and, based on current motion, predict the next frame(s) that would have been rendered, but will generate them directly instead of waiting for the actual GPU / frame buffer.

I think Nvidia Reflex is thrown in the mix to lower the input lag as much as possible, since frame interpolation could otherwise feel janky and laggy.
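If it helps to picture the flow, here's a minimal sketch of that idea in Python. This is my own simplification of "upscale, then insert a generated frame", not Nvidia's actual pipeline; render_frame, upscale, optical_flow, interpolate and present are all hypothetical placeholders you'd supply:

```python
# Minimal conceptual sketch of an upscale + frame-generation loop (assumed, simplified).
from collections import deque

def run_pipeline(render_frame, upscale, optical_flow, interpolate, present, num_frames):
    history = deque(maxlen=2)              # most recent upscaled frames
    for i in range(num_frames):
        raw = render_frame(i)              # GPU renders at a lower internal resolution
        frame = upscale(raw, history)      # DLSS 2-style temporal upscaling step
        if history:
            # Frame generation: synthesize an in-between frame from the previous
            # real frame, the new one, and the estimated motion between them,
            # and show it before the new real frame. Holding the new frame back
            # like this adds latency, which is where Reflex comes in.
            motion = optical_flow(history[-1], frame)
            present(interpolate(history[-1], frame, motion))
        present(frame)
        history.append(frame)
```

So roughly half the displayed frames would be generated rather than rendered, which is why the latency-reduction piece matters so much.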

Digital Foundry should have a video in the next few days on the subject:

 
Yeah, it's not new that consoles are (far) behind, but I think the difference will grow more and more quickly (8 years, it's an eternity^^).
Just my opinion, but I'm not sure that studios will limit their games because of console limitations. So I think we will see more and more PC-exclusive games (or games being released on consoles years later). As a console user, I totally understand it... If I wanted to make a game, I would want to use the most advanced technology to make it as good/beautiful as possible.

Yeah. Knowing that with a 4090 alone, you can't play a game :D

When it comes to releases, I think publishers are gonna focus on releasing games for the most users. And console users outnumber PC gamers by orders of magnitude. Sometimes it's pure luck that they decide to port a game to PC. One example is the PlayStation game Detroit: Become Human, which was released on Steam. If they hadn't ported it, I'd probably not even know anything about this game.
PS. I'd give D:BH a 9/10 score, btw. With 40 endings, it has replay value despite being linear. I wish more developers would embrace branching storylines like these.
 
Saw the Cyberpunk trailer for DLSS3 and ray tracing on the 40 series. It does look insanely good.

I have a 2080 Super and only play on 1080p, so I think I am good for now. But I kind of want to snatch up one of the 4080 12GB (basically the 4070 from how I understand it) so that I can get the visual upgrade.
 
When it comes to releases, I think publishers are gonna focus on releasing games for the most users. And console users outnumber PC gamers by orders of magnitude. Sometimes it's pure luck that they decide to port a game to PC. One example is the PlayStation game Detroit: Become Human, which was released on Steam. If they hadn't ported it, I'd probably not even know anything about this game.
PS. I'd give D:BH a 9/10 score, btw. With 40 endings, it has replay value despite being linear. I wish more developers would embrace branching storylines like these.
Agreed, but I think that every party involved in game development is interested in "using" the latest technology and hardware (whatever hardware is "most" used). For example, I'm quite confident that Nvidia, when the 40 series is released (it's maybe even already the case^^), will provide GPUs to studios at "very low cost" (if not for free) to "demonstrate" how powerful their new products are. In the same way, Microsoft certainly "helps" studios to "show" how powerful the Series X is (knowing that for now, most Xbox players can't get their hands on an Xbox Series S/X and still play on XB1).

So in short, what better example to show everyone how powerful your brand new GPU is and what it's capable of than a game :)
The more I learn about Nvidia's 4000 series, the more I'm beginning to smell fast food. I don't think this is really a new iteration of hardware. It feels more like an attempt to put fancy sauce on a cheap meat patty, then jack the price up because it's "the next big thing".
I imagine that the 40 series will at least have the merit of lowering the price of the 30 series (I hope) :)
 