Low Performance Cyberpunk 2077 PC

There was never a time when this wasn't the case. Even back in the 1990s, you could find me tweaking parts and adding fans to the case to crank out another 5-10 FPS to get "blazing" 60 FPS gameplay. Most games back then had trouble maintaining a steady 30, but I was a "power user"!
I actually just finished doing something which approximates this.

So, I have a Radeon 6600XT and a Ryzen 3700X connected to a B550 board. BIOS update comes out. I apply it. New feature - Resizeable BAR support. What's this, I wonder. Oh, it's the real name for AMD's Smart Access Memory. Doing some checking, it seems like, on a rig like mine, I should be able to get somewhere on the order of 10% FPS improvement on some games. So, I go to turn it on. Can't. Why? UEFI Legacy / CSM mode is on. Well, crap.

So, I figured out how to switch it all to UEFI mode WITHOUT a reinstall (because who wants to do that?).

Docs:


Once one figures it out, it only takes about an hour unless, of course, you have to move stuff around on your PV (LVM physical volume) to make room.
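For anyone wanting to try the same, the rough shape of it on a single-disk, Ubuntu-style install is sketched below. Device names and partition numbers are placeholders (a mirrored boot setup like mine needs an ESP on each member), so read the real docs and have backups before touching a boot disk:

# 1. Convert the boot disk's partition table from MBR to GPT in place
sudo gdisk /dev/nvme0n1            # gdisk converts MBR -> GPT when you write with 'w'
# 2. Add a ~512MB EFI System Partition (gdisk type EF00) and format it
sudo mkfs.fat -F32 /dev/nvme0n1p5
# 3. Mount it and install the UEFI flavour of GRUB
sudo mkdir -p /boot/efi
sudo mount /dev/nvme0n1p5 /boot/efi
sudo apt install grub-efi-amd64
sudo grub-install --target=x86_64-efi --efi-directory=/boot/efi
sudo update-grub
# 4. Reboot into firmware setup, disable CSM, enable Above 4G Decoding and Resizable BAR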

dmesg reports success:

[ 4.167332] [drm] Detected VRAM RAM=8176M, BAR=8192M
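If you want a second opinion beyond dmesg, lspci also reports the negotiated BAR size. Something along these lines (the bus address is just an example; find yours with lspci | grep VGA):

sudo lspci -vv -s 0c:00.0 | grep -A4 'Resizable BAR'
# look for something like "BAR 0: current size: 8GB" rather than the legacy 256MB window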

Not sure how much the actual effect is. But I can now upgrade firmware (nee BIOS) from inside the OS:

(matt@bluebox) ~$ sudo fwupdmgr update
[sudo] password for matt:
Devices with no available firmware updates:
• APS-SE20G-2T
• BCM20702A0
• CT4000MX500SSD1
• PNY CS2130 2TB SSD
• SSD 870 QVO 4TB
• ST10000DM0004-2GR11L
• ST8000DM004-2U9188
• System Firmware
• UEFI Device Firmware
• UEFI Device Firmware
• UEFI dbx
No updatable devices
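For anyone who hasn't poked at fwupd before, the usual flow is to refresh the metadata first and then ask what applies -- same CLI, roughly:

sudo fwupdmgr refresh        # fetch the latest firmware metadata from the LVFS
sudo fwupdmgr get-updates    # list updates applicable to this machine
sudo fwupdmgr update         # download and install/stage them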

This is both convenient and frightening, as it allows for the installation of persistent malware.

And yes, this box has 6 disks: 2 NVMe drives in a RAID 1 for boot and critical data, 2 4TB SSDs for video games and other stuff that requires high speed, and then 18TB of rusty platters as the data fortress (generally a pile of ROMs and such downloaded from archive.org).

The important stuff gets backed up to the SOHO server (12TB RAID array), which is backed up to 2 external drives kept in separate fireproof safes in different buildings on the compound. The less important stuff (the aforementioned archive.org material) is backed up to an external drive which is tossed in a drawer.
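Nothing exotic behind it, either -- a cron'd rsync over SSH along these lines covers that kind of push (the hostname and paths below are made-up placeholders):

rsync -aHAX --delete /home/matt/important/ backup@sohobox:/srv/backup/bluebox/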

Because normal people run backups like this, right?
 
Because normal people run backups like this, right?
Yes. YES-YES-YES. Always.

You can normally save your UEFI / BIOS settings to a USB nowadays. So, if you have it running well, and you want to play around and tweak, just do that. That's almost as important as making a backup of important data on your drives.


Resizeable BAR support.
I'm not sure what AMD has done to compete with this, but it's an Nvidia specific technology. The general gist is that it allows the VRAM to buffer data for streaming game assets (like data for huge open worlds) concurrently, rather than scheduling it sequentially. In layman's terms, it makes bigga-huge, open world games like Cyberpunk, GTA, DayZ, PubG, etc. run with fewer loading stutters.

Like I said, though, I'm not sure if enabling it on an AMD card will do anything. Normally, AMD will come up with ways of utilizing this stuff.


_______________


On my end, I just got done removing the overclocking that was insidiously activated on my system through the BIOS. Man, that took a lot of research. Running a Ryzen 5600X, and the sucker was hitting 70°C just surfing the net. Fans sounded like they were running full blast constantly. Sure enough, there were two default settings that were driving the voltage up to 1.500 V (!?) for no bloody reason. Basically, it wasn't "boost" settings -- it was aggressive OC that cooked the CPU whenever the mobo decided there was voltage to spare.

Disabling that rubbish, and I'm getting well over 120 FPS in any game, no stuttering, no fan noise, and the CPU temps are remaining around 50°C. Under load, I've yet to hit 70°C. I have to chalk this up to terrible BIOS / UEFI design. There is 0% need for that. It was actually introducing minor issues, and man, those settings were buried in the BIOS. (I've not needed to do that much reading, researching, and cross-referencing for years.)
 
I'm not sure what AMD has done to compete with this, but it's an Nvidia specific technology. The general gist is that it allows the VRAM to buffer data for streaming game assets (like data for huge open worlds) concurrently, rather than scheduling it sequentially. In layman's terms, it makes bigga-huge, open world games like Cyberpunk, GTA, DayZ, PubG, etc. run with fewer loading stutters.

Like I said, though, I'm not sure if enabling it on an AMD card will do anything. Normally, AMD will come up with ways of utilizing this stuff.
Nah, it's AMD tech from the start, just with a different name. Called Smart Access Memory or something like that. Nvidia just said "oh, it's just this" and made it too. It's nothing special from the start, and they've been able to do it for ages but haven't, it seems ^^
 
I'm not sure what AMD has done to compete with this, but it's an Nvidia specific technology.

I do not believe that is correct. It is a PCIe standard. See:


and


NVidia used the real name. AMD decided to make up their own name for a standard (much like IEEE 1394 got named different things depending on the company from which you bought it).

The general gist is that it allows the VRAM to buffer data for streaming game assets (like data for huge open worlds) concurrently, rather than scheduling it sequentially. In layman's terms, it makes bigga-huge, open world games like Cyberpunk, GTA, DayZ, PubG, etc. run with fewer loading stutters.

I'm not sure buffer is the correct term here. My understanding is that the CPU and GPU arbitrate access to the GPU's memory, probably via a DMA controller (not certain that this is true, but it is a common design pattern). The CPU fills stuff which the GPU then uses. The "normal" access mode means the CPU needs to break its accesses into 256MB chunks. So, if you need to push 300MB of textures, it's 2 chunks. The "bigly" mode (ReBAR enabled) would let the CPU do it as one scheduled DMA transfer. Therefore it should reduce the transactional overhead incurred, resulting in a slight performance boost.
Like I said, though, I'm not sure if enabling it on an AMD card will do anything. Normally, AMD will come up with ways of utilizing this stuff.

According to the interwebs, Linux has had it for a looong time.

https://www.reddit.com/r/nvidia/comments/mglv5g
Probably because it is a PCIe standard and, from what I read, is common in server applications.

As mentioned above, all I did is turn it on and my Ubuntu 22.04 box reports that it's enabled.
On my end, I just got done removing the overclocking that was insidiously activated on my system through the BIOS. Man, that took a lot of research. Running a Ryzen 5600X, and the sucker was hitting 70°C just surfing the net. Fans sounded like they were running full blast constantly. Sure enough, there were two default settings that were driving the voltage up to 1.500 V (!?) for no bloody reason. Basically, it wasn't "boost" settings -- it was aggressive OC that cooked the CPU whenever the mobo decided there was voltage to spare.

So, I have 2 rants about this.

First, I don't blame you. I don't generally overclock things. But, I love that people do. Why? Because it has led manufacturers to use high quality components in enthusiast boards, which drives the cost down. (I remember buying TYAN multiprocessor boards -- that is, multiple sockets -- in the early 2000s, and I'd be out $300 for the board in 20-years-ago dollars.) Now I can get a board with high quality caps, good power conversion, etc. for $150.

Now, that said, I do leave "factory overclocked" things alone. The extensive testing that those engineers did to make sure it will be stable means I just set to their recommendations and forget it.

As far as fans, I hate fan noise. When I built this case originally (around 2003 or so), I used PC Power and Cooling fans which were the quietest I could find back in the day. It's had a succession of power supplies, boards, etc. in it, but the case is still the same.

About a year ago, I replaced all the fans with Noctua fans. They are quieter, but that may have a lot to do with the bearings wearing out on the original fans. I also replaced the Radeon R9 390X (which was a 290W beast which ran SO HOT) with the 6600XT, which draws something like half the power, has better fans, and is much quieter.

As a final piece, I added one of those NH-D15 insane dual fan setups. Combined with the Ryzen 3700X (highest performance 65W TDP Ryzen CPU that was out at the time I bought it), it is pleasantly quiet and idles around 40C.

My second rant is the use of "turbo boost" on CPUs. Ahem.

Turbo boost is a stupid idea. "Oh, let's run our CPU hot and let the thermal throttling stop it from actually melting". Are you really serious with this foolishness? This results in die temps upwards of 90C, a pile of thermal throttling messages in the logs, and heat buildup elsewhere in the system.

This is why I set the CPU max frequency on my laptop to the non-boost max frequency. Then it never thermally throttles.
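On Linux that's basically a one-liner, depending on which cpufreq driver is loaded -- the standard knobs, pick whichever exists on your machine (the 2800MHz cap is just an example base clock):

echo 1 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo    # intel_pstate driver
echo 0 | sudo tee /sys/devices/system/cpu/cpufreq/boost            # acpi-cpufreq / amd-pstate
sudo cpupower frequency-set --max 2800MHz                          # or just cap the max clock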
Disabling that rubbish, and I'm getting well over 120 FPS in any game, no stuttering, no fan noise, and the CPU temps are remaining around 50°C. Under load, I've yet to hit 70°C. I have to chalk this up to terrible BIOS / UEFI design. There is 0% need for that. It was actually introducing minor issues, and man, those settings were buried in the BIOS. (I've not needed to do that much reading, researching, and cross-referencing for years.)

I think the reason they have it is that it sells. People buy it. I turn all that off, set the fan controllers to "silent" mode, and run it like that.

Nah, it's AMD tech from the start, just with a different name. Called Smart Access Memory or something like that. Nvidia just said "oh, it's just this" and made it too. It's nothing special from the start, and they've been able to do it for ages but haven't, it seems ^^

I do not believe this is accurate. My understanding is that it is a PCIe standard. NVidia uses the standard name, AMD decided to call it something different. See my links above.
 
I do not believe this is accurate. My understanding is that it is a PCIe standard. NVidia uses the standard name, AMD decided to call it something different. See my links above.
I think it was AMD that introduced these changes first and named it Smart Access Memory; then Nvidia made a big stink about it due to it being pretty much PCIe tech from the start, just that it hadn't been used before, and enabled it through BIOS/drivers and called it Resizable BAR (its real name). The 6900 XT beat the 3090 in some titles with it enabled before Nvidia got it to work, I think. Also, I'm pretty sure they have another tech that required an AMD CPU + GPU to work. Might be mixing up the name with something, but it appeared at the same time the 6000 series came.


Even I had forgotten how it happened :D But I'm fairly sure AMD was first to enable it, and Nvidia got pissed and pushed through the same changes and enabled it on more motherboards and so on. It was a weird time when, for once, AMD was a threat to Nvidia, and I was so happy to get it thanks to the competition.
 
I think it was AMD that introduced these changes first and named it Smart Access Memory; then Nvidia made a big stink about it due to it being pretty much PCIe tech from the start, just that it hadn't been used before

For sure. In my head, it went like this:

Enterprise server person from the Epyc line: "Hey, uh... we implemented Resizeable BAR so PCIe storage devices and stuff don't need to page. Wouldn't that be useful for these new video cards that have gobs of RAM?"

Ryzen engineer: "Hey, that's a good idea. We should put them in the next gen stuff. Shouldn't be hard, just drop an IP core on the die and we're off to the races. But let's get the Radeon people on board."

Radeon people: "Seems like a good idea to me. Let's talk to product management."

Product management: "Hmmm, yes. It's a good idea, but we can't just call it what it actually is - Resizeable BAR sounds too engineer-y, ReBar sounds like construction material. We need it to sound smart.. hmmm... let's see. It's memory, we're accessing it smartly, hmmmm.. How about Smart Access Memory?"

All the Engineers: "Whatever you want to call it is fine with us. Can we go build it now?"

Product management: "Go nuts"

.... six months later ....

AMD rep at trade show: "Hey, look, we have this cool thing. See my graphs that prove it?"

NVidia rep at the trade show on phone: "Uh, management? AMD's got this thing."

NVidia management: "Buy some and bring them home, we need to figure out what they're up to."

NVidia engineers: "So, we looked at the thing and they just gave Resizeable BAR a trade name."

NVidia management: "Well, then, we need that too. Get after it!"
 
I do not believe this is accurate. My understanding is that it is a PCIe standard. NVidia uses the standard name, AMD decided to call it something different. See my links above.
Not sure who came up with the actual tech first, but Nvidia was the company that coined the term Resizable BAR. Also, it's a PCIe standard now because it has been adopted by both Nvidia and AMD. Regardless of who first created it, it was not likely a motherboard manufacturer. It almost certainly came from AMD or Nvidia, then mobo manufacturers scrambled to include it in their hardware and BIOS. It's only a PCIe standard now.

But this is something that will likely affect CP2077 performance in a good way on any GeForce 3000 series cards and beyond or Radeon equivalents.

Even I had forgotten how it happened :D But I'm fairly sure AMD was first to enable it, and Nvidia got pissed and pushed through the same changes and enabled it on more motherboards and so on...
For sure. In my head, it went like this:
That's about right! They are so viciously cut-throat with one another, it's ridiculous at times. Plus, both can be so greedy and profit driven that neither one is ever going to do what's needed to become the "definitive" solution.

Plus, even if they achieved it for a generation or so, they'd never be able to maintain it and still drive profits through the roof. Hopefully the GPU crash will get things back on good path.

I like the mentality behind your philosophy, though, @Notserious80 . There's really nothing else to do but be thankful that the companies' desire to burn out their own parts more slowly than the competition gives us GPUs that will likely run for decades. :D

So, I have 2 rants about this.

First, I don't blame you. I don't generally overclock things. But, I love that people do. Why? Because it has led manufacturers to use high quality components in enthusiast boards, which drives the cost down. (I remember buying TYAN multiprocessor boards -- that is, multiple sockets -- in the early 2000s, and I'd be out $300 for the board in 20-years-ago dollars.) Now I can get a board with high quality caps, good power conversion, etc. for $150.

Now, that said, I do leave "factory overclocked" things alone. The extensive testing that those engineers did to make sure it will be stable means I just set to their recommendations and forget it.

As far as fans, I hate fan noise. When I built this case originally (around 2003 or so), I used PC Power and Cooling fans which were the quietest I could find back in the day. It's had a succession of power supplies, boards, etc. in it, but the case is still the same.

About a year ago, I replaced all the fans with Noctua fans. They are quieter, but that may have a lot to do with the bearings wearing out on the original fans. I also replaced the Radeon R9 390X (which was a 290W beast which ran SO HOT) with the 6600XT, which draws something like half the power, has better fans, and is much quieter.

As a final piece, I added one of those NH-D15 insane dual fan setups. Combined with the Ryzen 3700X (highest performance 65W TDP Ryzen CPU that was out at the time I bought it), it is pleasantly quiet and idles around 40C.

My second rant is the use of "turbo boost" on CPUs. Ahem.

Turbo boost is a stupid idea. "Oh, let's run our CPU hot and let the thermal throttling stop it from actually melting". Are you really serious with this foolishness? This results in die temps upwards of 90C, a pile of thermal throttling messages in the logs, and heat buildup elsewhere in the system.

This is why I set the CPU max frequency on my laptop to the non-boost max frequency. Then it never thermally throttles.
In this case, it was a matter of the (cheap-o) ASRock mobo shipping with its own overclocking nonsense enabled by default, and that would directly compete with AMD's "Performance Boost" settings to decide which one was going to control all the voltage and settings. If the voltage dropped, the ASRock would relinquish control back to the AMD firmware for the CPU...but the minute the voltage was available again, the ASRock firmware would be, like, "Gimme back total control" and start jacking things up insanely, stupidly high.

I may fiddle with using only the AMD defaults again in the future, but for now, it's resolved. Everything is rock solid and smooth as butter. And that's my goal for gaming. Could care less about the FPS "numbers". (The number of times I've heard players gloat about getting 144+ FPS in this or that game...then proceed to display their evidence on a clearly 60Hz monitor. It is amusing.)
 
Not sure who came up with the actual tech first, but Nvidia was the company that coined the term Resizable BAR. Also, it's a PCIe standard now because it has been adopted by both Nvidia and AMD. Regardless of who first created it, it was not likely a motherboard manufacturer. It almost certainly came from AMD or Nvidia, then mobo manufacturers scrambled to include it in their hardware and BIOS. It's only a PCIe standard now.
Fair point. I figured it was added by PCI-SIG, but there are reps from all the major vendors there, so.. who knows who introduced it? I mean, I'm sure the minutes of the meetings and comments on the specs say, but that exceeds my level of curiosity. ;-)
 
Fair point. I figured it was added by PCI-SIG, but there are reps from all the major vendors there, so.. who knows who introduced it? I mean, I'm sure the minutes of the meetings and comments on the specs say, but that exceeds my level of curiosity. ;-)
Heh heh heh...I wonder if that was the actual way of it now. Some low-level R&D guy making PCIe tech goes: "You know, why don't we add a couple of pins here. We could use that to channel data directly to the VRAM in asymmetrical chunks for faster asset streaming in open worlds. Eh?"

2 weeks later...

AMD Rep: "...proud to introduce the newest in open-world gaming technology! What we're calling -- "

Nvidia Rep: (Directly upstaging the presentation.) "We're calling it Resizable BAR, being included now in all GeForce 3000 series GPUs! This new -- "

(Stage manager switches off their mics. AMD and Nvidia reps begin slappy-fighting. PCI-SIG rep sits in the third row, arms crossed, scowling.)
 
I may fiddle with using only the AMD defaults again in the future, but for now, it's resolved. Everything is rock solid and smooth as butter. And that's my goal for gaming. Could care less about the FPS "numbers". (The number of times I've heard players gloat about getting 144+ FPS in this or that game...then proceed to display their evidence on a clearly 60Hz monitor. It is amusing.)
Hehe, I got into a scuffle online over being happy with 98Hz on my 4K HDR screen (it supports 144Hz, but I'd need to overclock it and disable HDR and 10-bit color, since it's only DP 1.4 and can't handle the data at higher settings). It's really, really rare to see anywhere close to 144 FPS in any game at 4K, tbh, so it's kind of useless. But I like the HDR and the IPS panel, so.. That "dude" got pissed that I didn't aim for 144 and play at 1440p instead just to achieve that. I was like, dude, it's my PC. You play however you want; I want my 4K! :D

And the 60Hz / 144 FPS thing is way too true. Before I got this screen I had a standard 4K 60Hz panel, and whenever it got over 60 I would get screen tear deluxe instead ^^ People are kinda funny sometimes :D Thank god for G-Sync too, so I never get tears anymore :D
 
Hehe, I got into a scuffle online over being happy with 98Hz on my 4K HDR screen (it supports 144Hz, but I'd need to overclock it and disable HDR and 10-bit color, since it's only DP 1.4 and can't handle the data at higher settings). It's really, really rare to see anywhere close to 144 FPS in any game at 4K, tbh, so it's kind of useless. But I like the HDR and the IPS panel, so.. That "dude" got pissed that I didn't aim for 144 and play at 1440p instead just to achieve that. I was like, dude, it's my PC. You play however you want; I want my 4K! :D

And the 60Hz / 144 FPS thing is way too true. Before I got this screen I had a standard 4K 60Hz panel, and whenever it got over 60 I would get screen tear deluxe instead ^^ People are kinda funny sometimes :D Thank god for G-Sync too, so I never get tears anymore :D
Yeah, I need to get one of those variable sync monitors. No tearing would be sooo nice.

My monitor is actually a Samsung 4K TV from 2016. It does 4K @ 30Hz (which is fine for desktop work, and is why I bought it) and 1080p @ 60Hz. The 6600XT will push CP2077 at slightly above 60FPS in 1080p, but I set it to sync with the monitor refresh so it doesn't tear (unless it's not making frame rate, which is rare, but does happen).

My problem is, 4k monitors in the 40" range are stupid expensive.

I play some games at 4K - stuff like Cities: Skylines and X-Com, but I need the frame rate up for stuff like CP2077, so I drop to 1080p.

Thing is, it's still fine. I was showing my nephew who plays 2077 at 1080 on Xbox (before the next gen console update) and he was blown away at how much better it looked. So, it's not like it looks bad, and anything above 60Hz is faster than we're supposed to be able to see anyway.

I swear a lot of this is just competition for the sake of competition. Or egocentric posturing. Or both.

*shrug*
 
Thing is, it's still fine. I was showing my nephew who plays 2077 at 1080 on Xbox (before the next gen console update) and he was blown away at how much better it looked. So, it's not like it looks bad, and anything above 60Hz is faster than we're supposed to be able to see anyway.

I swear a lot of this is just competition for the sake of competition. Or egocentric posturing. Or both.

*shrug*
Yeah, the whole "can't see over XX Hz anyway" thing is an old one :D Thing is, you kinda can. Or more like you "feel" it. The smoothness in turning and moving the camera is very different even from 60 to 98, tbh. It's hard to explain; the biggest gain is the frametime, though, and the way it seems smoother. Like @SigilFey said before, a constant XX FPS is better than high spikes into the XXX.
 
Hehe, I got into a scuffle online over being happy with 98Hz on my 4K HDR screen (it supports 144Hz, but I'd need to overclock it and disable HDR and 10-bit color, since it's only DP 1.4 and can't handle the data at higher settings). It's really, really rare to see anywhere close to 144 FPS in any game at 4K, tbh, so it's kind of useless. But I like the HDR and the IPS panel, so.. That "dude" got pissed that I didn't aim for 144 and play at 1440p instead just to achieve that. I was like, dude, it's my PC. You play however you want; I want my 4K! :D

And the 60Hz / 144 FPS thing is way too true. Before I got this screen I had a standard 4K 60Hz panel, and whenever it got over 60 I would get screen tear deluxe instead ^^ People are kinda funny sometimes :D Thank god for G-Sync too, so I never get tears anymore :D
Normally, I find that it's best to leave the actual Refresh Rate at maximum and worry about limiting frames to achieve smooth results. In the past, I've had legacy titles that really didn't want to cooperate with high refresh rates. Ghosting, stutters, physics explosions, etc. In those cases, it may be necessary to drop refresh rate itself down to achieve stability (or sometimes even playability). But generally, high refresh rates simply mean duped frames and smoother images. Hitting a nice balance around half-refresh rate is a great way of creating extremely consistent FPS with no image ghosting at all.

Also, if you haven't yet played with Nvidia's "Low-Latency" modes, I highly recommend you switch it to Ultra. The effect this has on games running between 60 - 120 FPS is phenomenal.

Yeah, I need to get one of those variable sync monitors. No tearing would be sooo nice.

My monitor is actually a Samsung 4K TV from 2016. It does 4K @ 30Hz (which is fine for desktop work, and is why I bought it) and 1080p @ 60Hz. The 6600XT will push CP2077 at slightly above 60FPS in 1080p, but I set it to sync with the monitor refresh so it doesn't tear (unless it's not making frame rate, which is rare, but does happen).

My problem is, 4k monitors in the 40" range are stupid expensive.

I play some games at 4K - stuff like Cities: Skylines and X-Com, but I need the frame rate up for stuff like CP2077, so I drop to 1080p.

Thing is, it's still fine. I was showing my nephew who plays 2077 at 1080 on Xbox (before the next gen console update) and he was blown away at how much better it looked. So, it's not like it looks bad, and anything above 60Hz is faster than we're supposed to be able to see anyway.

I swear a lot of this is just competition for the sake of competition. Or egocentric posturing. Or both.

*shrug*
I use a 1440p and find that it's now a perfect match for the RTX 3060. My old 980 ti could do a few games at 1440p, but mostly 1080p. Having played around with a couple of 3090 ti systems, I'm still not convinced the "days of true 4K gaming" are really here yet. I watched FPS plummet here and there at 4K in quite a few games. I mean, it can do it, but it's far from what I'd call smooth or consistent performance.

The upside is that a 1440p monitor running at 144 Hz is still capable of DSR all the way up to 4K in a few games, which of course, is scaled, but still a beautifully crisp image for games that the 3060 can power through. Personally, though, I find I'm perfectly content running at 1440p with DLSS if available. I can get 72+ FPS at full Ultra settings in pretty much everything.
 
I use a 1440p and find that it's now a perfect match for the RTX 3060. My old 980 ti could do a few games at 1440p, but mostly 1080p. Having played around with a couple of 3090 ti systems, I'm still not convinced the "days of true 4K gaming" are really here yet. I watched FPS plummet here and there at 4K in quite a few games. I mean, it can do it, but it's far from what I'd call smooth or consistent performance.

The upside is that a 1440p monitor running at 144 Hz is still capable of DSR all the way up to 4K in a few games, which of course, is scaled, but still a beautifully crisp image for games that the 3060 can power through. Personally, though, I find I'm perfectly content running at 1440p with DLSS if available. I can get 72+ FPS at full Ultra settings in pretty much everything.
Thing is, I need a real 4K monitor just to get the screen real estate for work.

The monitor I have earmarked (ASUS XG43UQ) says it does 1440p @ 120Hz as well, so I'll likely try running 1440p on it with my current card and see how that goes. Realistically though, I'll probably upgrade to the Radeon 7000 series cards once they drop. But, monitor first.

Other monitor suggestions are welcome, but they have to be a pretty direct replacement for the Samsung 43" 4K TV that I'm currently using (but they don't need TV functionality), and I'm not willing to compromise on either screen real estate or physical size (because I'm not going to squint at a 30" 4K screen).
 
To be fair, the game runs decently well. At release I played on my old rig, which used a 4790K, 16GB of 2400 RAM, and a GTX 970; that system wasn't capable of much in 2077 -- 720p low/medium at best -- so a 1060 won't be capable of much more.

My current setup is an AMD 5600X, 16GB at 3600, and an RX 5700 "XT". I recently also got a 1440p 144Hz display, and I can run the game on ultra at 1440p with FSR on Ultra Quality. I can, but I run it on high with volumetric on low and shadows on medium.

What eats performance, and always has, is shadows -- but even worse is volumetric clouds, fog, and steam.

Also, great explanation on what can be narrowed down to "bigger number does not mean better". The 1080 Ti still rips most modern GPUs a new one and was probably the biggest mistake Nvidia ever made, but as another example, an FX 8320 is not better than a Ryzen 5600X just because the number is higher.

Also, I got half a headache reading the OP's second post.
 
The upside is that a 1440p monitor running at 144 Hz is still capable of DSR all the way up to 4K in a few games, which of course, is scaled, but still a beautifully crisp image for games that the 3060 can power through. Personally, though, I find I'm perfectly content running at 1440p with DLSS if available. I can get 72+ FPS at full Ultra settings in pretty much everything.
Yeah, I tested some DSR today in Mass Effect 2 Legendary Edition. Playing at 6K was quite the experience and still did not max out my card. Sadly, DSR doesn't have an 8K upscale, and there are some UI issues and scaling problems within the UI.
 
Yeah, I tested some DSR today in Mass Effect 2 Legendary Edition. Playing at 6K was quite the experience and still did not max out my card. Sadly, DSR doesn't have an 8K upscale, and there are some UI issues and scaling problems within the UI.
I was having the same issue with X-Com 2 at 4k. Looked fine, but I had to squint at the words. Dropping back to 1080 made them readable.
 
I was having the same issue with X-Com 2 at 4k. Looked fine, but I had to squint at the words. Dropping back to 1080 made them readable.
Yeah, the issue is that it changes the scale to 6K and then downsamples it. Text and UI elements can become way smaller, and if you adjust it (if you can), it never becomes quite as good as it is at regular resolutions. The textures and stuff get very sharp and nice, though ^^ Kinda have to hack it into the game by using the same resolution on the desktop, though, since ME Legendary won't show up in my NV Experience -.-
 
I was having the same issue with X-Com 2 at 4k. Looked fine, but I had to squint at the words. Dropping back to 1080 made them readable.
Yup. Always an issue. 2D assets cannot dynamically scale unless they're rendered with vector graphics. Not too many vector fonts out there, that I know of. (Might not be a bad idea, though...) So, in order to scale 2D stuff for higher resolutions, all of the assets need to be rebuilt from the ground up at the new scale. Not a very cost-effective venture, usually.
 
From what I understood, optimization means: "Achieve the same results for fewer resources."

It is especially apparent in aviation where aircraft manufacturers constantly are trying to reduce weight, reduce drag, adjust the fan-blades etc., to make the plane consume less and less fuel, while maintaining the same speed.

Concerning Cyberpunk, having done some DLSS testing, it's safe to say that although the game has already received a vast amount of optimization, there is still a lot of room left for improvement concerning resource consumption.

Cyberpunk is a real gas-guzzler -- so much so that I'm pretty convinced Europe would never-ever in a million lightyears permit it to drive on the open roads were it a car. Nvidia's ultimate flagship graphics card can have severe difficulties keeping the FPS rate up in those heavily taxing spots on the map.

I had DLSS set on 'Auto', assuming that the game would know how to best orchestrate it. It resulted in the game CTDing once every 1 to 2 hours. Only when I set it to 'High Performance' did the CTDs finally cease. Using an RTX 3070 laptop.

Optimization = maximization under constraint. Achieving the same results with fewer resources is efficiency. That said, we can flip the constraint on its head and speak of optimizing efficiency for a given output, so in that sense they're largely equivalent. Hypothetically, if the developers could achieve the same result (whatever that result is) with fewer resources, then I would argue that optimization has not been achieved.
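To put that distinction in symbols (just a sketch of the framing above, with f as the quality of the result, g as the resources it burns, B a resource budget, and f_0 a target result):

optimize:  maximize f(x)  subject to  g(x) <= B
economize: minimize g(x)  subject to  f(x) >= f_0

The second problem is the "same results for fewer resources" reading; the equivalence mentioned above is the usual trade-off between the two formulations.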
 