So FSR never got fixed and is still stuck with the constant flickering?

Like really? They just didn't care or did not see all those people talking about the issue?
 

Attachments

  • IMG_20241027_203111.jpg
FSR is an AMD technology. Integrating it isn't as simple as saying, "I'm gonna put FSR in my game." Just like DLSS, it's a learning algorithm that AMD needs to implement in order to get it working well. That takes time, as the algorithm needs to run frames millions of times in order to figure out the best methods to create the cleanest images. That requires both a studio and AMD's labs to work together over time.
 
We all know that there are better and worse implementations of FSR. Unless you have proof that AMD neglected the implementation process (as far as I know, FSR is open source, so there's not really much for them to do on their side), it's CDPR's fault it looks less than stellar. And why they decided to release FSR 3 long after the actual release date, and in an old version (3.0 instead of 3.1), only they probably know...
 
And why they decided to release FSR 3 long after the actual release date, and in an old version (3.0 instead of 3.1), only they probably know...
Why was Final Fantasy VII Rebirth released with Unreal Engine 4? Because that's what they had started the work on.
Maybe there are breaking changes in FSR 3.1, and they would have had to rework a lot of things to make it work with the RED Engine.
 
FSR is an AMD technology. Integrating it isn't as simple as saying, "I'm gonna put FSR in my game." Just like DLSS, it's a learning algorithm that AMD needs to implement in order to get it working well. That takes time, as the algorithm needs to run frames millions of times in order to figure out the best methods to create the cleanest images. That requires both a studio and AMD's labs to work together over time.

While this partially applies to DLSS(largely due to Nvidia being Nvidia), it's completely incorrect for FSR.

As @radosuaf_ pointed out, FSR is completely open source. Anyone can integrate it into their game at any time. AMD has released comprehensive guides on how to implement it and since it's completely GPU-agnostic, it also doesn't require any AMD hardware for integration. There is absolutely no need for a studio to collaborate with AMD's labs to properly integrate FSR into their game.

Why was Final Fantasy VII Rebirth released with Unreal Engine 4? Because that's what they had started the work on.
Maybe there are breaking changes in FSR 3.1, and they would have had to rework a lot of things to make it work with the RED Engine.

Not exactly an apples-to-apples situation. Upgrading from UE4 to UE5 is a massive undertaking that requires a lot of work to convert everything. The further along you are, the worse it gets, obviously.

Not nearly as true with FSR.

And FSR 3.1 has already been implemented in the game... just not by CDPR. It's not a RED Engine issue. The engine can handle it just fine.
 
Oh luds...you're completely correct. My mistake. Don't really know how I missed that (aside from not having an AMD card).

Frankly, it's an easy mistake to make. Most people still assume FSR is simply AMD's direct answer to DLSS when it's not. It's not even what AMD advertises it as. It's a far more consumer-friendly alternative that really throws shade at Nvidia's ethics in my opinion.

I have really been reconsidering my allegiance to team green/blue these last few months lol.
 
Frankly, it's an easy mistake to make. Most people still assume FSR is simply AMD's direct answer to DLSS when it's not. It's not even what AMD advertises it as. It's a far more consumer-friendly alternative that really throws shade at Nvidia's ethics in my opinion.

I have really been reconsidering my allegiance to team green/blue these last few months lol.
Looking at what they've done, I have to say, it's a fair answer to the exclusivity of DLSS...but it's apparently just a form of layering temporal anti-aliasing on much lower resolutions to gain frames. I can definitely say that I don't know if something like that would ever work well with CP2077 because of the sheer amount of fine details in the game world. I imagine CP would have artifacting all over the place.

Using DLSS for some years now, I can definitely vouch for it. I can also understand why it would need specifically integrated hardware to do what it does. But I'm simultaneously not a fan of exclusivity. Hopefully, AMD can eventually evolve FSR into something that will work as effectively as their Free-Sync tech did as an answer to G-Sync.
 
Looking at what they've done, I have to say, it's a fair answer to the exclusivity of DLSS...but it's apparently just a form of layering temporal anti-aliasing on much lower resolutions to gain frames. I can definitely say that I don't know if something like that would ever work well with CP2077 because of the sheer amount of fine details in the game world. I imagine CP would have artifacting all over the place.

Using DLSS for some years now, I can definitely vouch for it. I can also understand why it would need specifically integrated hardware to do what it does. But I'm simultaneously not a fan of exclusivity. Hopefully, AMD can eventually evolve FSR into something that will work as effectively as their Free-Sync tech did as an answer to G-Sync.

It's a bit more than that. Two friends of mine are currently working on their own indie game. I help them on my off time. They're both industry vets of 15+ years. I'm not and while I've learned a lot, their explanation of the intricacies of FSR vs DLSS was far too advanced for me to be able to regurgitate it here in a way that makes sense but there is more to it than that.

And it can definitely work well with CP2077. If you're interested, look into LukeFZ's work. He's a modder who dedicated his time to getting FSR 3/3.1 into various games. His 3.1 implementation in CP2077 is almost flawless. Gives access to FG and has all the crispness of 3.1 that CDPR's 3.0 lacks with zero artifacts (assuming one has the hardware to do so). It's quite amazing work to be honest.

Don't get me wrong, the image quality with DLSS is still superior. I'm rethinking my allegiance to Nvidia, sure, but at this point in time there is no doubt I'm still buying a 5080 as soon as they are released to replace my 3080ti. There is still a gap but AMD is closing this gap by wide margins with every new version and it really throws a wrench into Nvidia's explanation for gatekeeping things like FG behind newer hardware.
 
It's a bit more than that.
Absolutely, but FSR seems to be using the same "fast-and-dirty" approach to upscaling that FXAA took with anti-aliasing. It's definitely faster, but image quality will likely fluctuate wildly between titles. From what I've read, it apparently takes subsequent frames at the lower resolution, scales them up, then attempts to "blend" the upscaled frames with a TAA-like effect in several passes that is still faster than actual TAA. Hence all the artifacting that can result, especially on the edges of transformable textures (like hair with physics applied). All in real time, so software only.
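To make that "upscale, then blend frames over time" idea concrete, here's a deliberately over-simplified Python sketch. This is my own illustration of the general temporal-accumulation technique, not FSR's actual code; real FSR 2/3 also uses motion vectors, depth, and per-pixel history rejection, none of which is modeled here.

```python
# Toy temporal upscaling: render low-res, upscale naively, then blend
# each new upscaled frame into an accumulated history buffer.

def upscale_nearest(frame, factor):
    """Nearest-neighbour upscale of a 2D list of grey values."""
    out = []
    for row in frame:
        wide = [px for px in row for _ in range(factor)]
        for _ in range(factor):
            out.append(list(wide))
    return out

def temporal_blend(history, upscaled, alpha=0.1):
    """Exponential blend: mostly history, a little of the new frame."""
    return [
        [(1 - alpha) * h + alpha * u for h, u in zip(hrow, urow)]
        for hrow, urow in zip(history, upscaled)
    ]

# Two jittered 2x2 low-res "frames" of the same scene, upscaled 2x to 4x4.
frame_a = [[0.0, 1.0], [1.0, 0.0]]
frame_b = [[0.2, 0.8], [0.8, 0.2]]

history = upscale_nearest(frame_a, 2)
history = temporal_blend(history, upscale_nearest(frame_b, 2))
# After one blend step the first pixel has nudged from 0.0 toward 0.2;
# over many frames the buffer converges while staying cheap per frame.
```

The artifacting described above corresponds to the failure case this toy ignores: when something moves, blindly blending history against the new frame smears or ghosts, which is exactly what motion vectors and history rejection exist to suppress.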

I'd be interested to see what could be done manually by tweaking the way the drivers handle a certain game. Like:

And it can definitely work well with CP2077. If you're interested, look into LukeFZ's work. He's a modder who dedicated his time to getting FSR 3/3.1 into various games. His 3.1 implementation in CP2077 is almost flawless. Gives access to FG and has all the crispness of 3.1 that CDPR's 3.0 lacks with zero artifacts (assuming one has the hardware to do so). It's quite amazing work to be honest.
Impressive! I can already see why CDPR may have gone with one but not the other. I'm guessing that AMD would want some sort of significant licensing fee from established studios to gain access to all of the tools that must be available. And it must have been a pretty big task getting it to work that well.

Or, maybe it's another exclusivity thing. Maybe Nvidia didn't want to support CP2077 if they were also going to do FSR support. Business sucks, at times. But that's just speculation -- I don't actually have any idea.


Don't get me wrong, the image quality with DLSS is still superior. I'm rethinking my allegiance to Nvidia, sure, but at this point in time there is no doubt I'm still buying a 5080 as soon as they are released to replace my 3080ti. There is still a gap but AMD is closing this gap by wide margins with every new version and it really throws a wrench into Nvidia's explanation for gatekeeping things like FG behind newer hardware.
Yeah, they're actually nothing even remotely alike. I was way off. There's no AI learning between frames and resolutions at all with FSR. I didn't realize that all of it was handled in real time. It's really its own unique approach.

For myself, while I do use DLSS pretty much whenever it's available, I probably won't really spend money on a card anytime soon. I absolutely refuse to pay these prices, and I'm more than happy with my 3060. When I can upgrade meaningfully for ~$300-$500, then I'll think about a new GPU. 60 FPS is still perfectly adequate for me. I actively use 30 and 48 at times with other games. And I mostly get 72-144 FPS in the games I'm playing now.

It's lost on me. Totally don't care whether I get 30 or 300 FPS as long as it feels smooth.
 
For myself, while I do use DLSS pretty much whenever it's available, I probably won't really spend money on a card anytime soon. I absolutely refuse to pay these prices, and I'm more than happy with my 3060. When I can upgrade meaningfully for ~$300-$500, then I'll think about a new GPU. 60 FPS is still perfectly adequate for me. I actively use 30 and 48 at times with other games. And I mostly get 72-144 FPS in the games I'm playing now.

It's lost on me. Totally don't care whether I get 30 or 300 FPS as long as it feels smooth.

I wish I could say the same, truly.

I grew up poor, the best I could hope for was integrated graphics that could barely play Diablo 2. Mid/high 10s to low 20s FPS was my norm. My first GPU was some low end Geforce4 a friend gifted to me so I could play WoW with him. I don't think it even met the minimum requirements... but it worked! Then I got a 9500.. GT I think, that I kept all the way through university. Choppy gameplay is all I knew for the longest time.

You'd think somehow I'd be fine with lower FPS but I can't stand anything under 60 FPS anymore. I know there are various things you can do to make 30 FPS feel much smoother but it just doesn't do it for me. I'm not claiming I see a difference between 60 and 58, I don't, but anything in the low 50s and below I immediately notice and just can't stand. Plus, I really like my bells and whistles in games these days since I can actually afford it now. I don't want to spend that much on just a GPU but I've come to the conclusion that I don't really have a choice in the matter. There is just no way to reconcile wanting top end graphics, high performance and not paying stupid prices.

At this point I've accepted I'm not reasonable on this.
 
I wish I could say the same, truly.

I grew up poor, the best I could hope for was integrated graphics that could barely play Diablo 2. Mid/high 10s to low 20s FPS was my norm. My first GPU was some low end Geforce4 a friend gifted to me so I could play WoW with him. I don't think it even met the minimum requirements... but it worked! Then I got a 9500.. GT I think, that I kept all the way through university. Choppy gameplay is all I knew for the longest time.

You'd think somehow I'd be fine with lower FPS but I can't stand anything under 60 FPS anymore. I know there are various things you can do to make 30 FPS feel much smoother but it just doesn't do it for me. I'm not claiming I see a difference between 60 and 58, I don't, but anything in the low 50s and below I immediately notice and just can't stand. Plus, I really like my bells and whistles in games these days since I can actually afford it now. I don't want to spend that much on just a GPU but I've come to the conclusion that I don't really have a choice in the matter. There is just no way to reconcile wanting top end graphics, high performance and not paying stupid prices.

At this point I've accepted I'm not reasonable on this.
Heh -- it's all a psychological thing in the end. I grew up on Apple II's and the Commodore 64. 4-8 FPS was all that was possible at that point. To this day, I can definitely tell the difference between 30 and 60. I can feel the difference between 60 and 80. Anything above that, and I couldn't tell you if something was running at 80 or 144.

But, just to ensure we keep this on-topic, upscaling will definitely be the way of the future, at least until true 4K with ray tracing becomes standardized. In the meantime, Nvidia has definitely cornered the process with DLSS at present. I'm often hard pressed to find any meaningful difference in image quality versus native res at 1440p, even when dropping the internal sample to 720p. I mean, there are differences, obviously, especially in the far field, but textures remain crisp and performance often doubles. The only thing I'll get, in certain games only (like DD2), is a shimmer artifact around those transformable textures, or when a foreground figure is standing in front of an animated texture like water or smoke. Fiddling with sharpening will normally rectify most of that in most scenes. But in other games (like Mechwarrior 5, Darktide, Elite, BG3, or Space Marine 2), the difference in image clarity is so negligible as to be a virtual non-factor.
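The "performance often doubles" part is easy to sanity-check with pixel arithmetic. A quick sketch using the per-axis scale factors AMD documents for FSR 2's quality modes (DLSS's Quality and Performance presets use similar ratios); the output resolution here is just the 1440p example from above:

```python
# Per-axis render-scale factors from AMD's FSR 2 documentation.
# Shading cost tracks pixel count, so a 1.5x per-axis scale is
# already a ~2.25x reduction in shaded pixels.
MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(out_w, out_h, scale):
    """Internal render resolution for a given output size and scale factor."""
    return round(out_w / scale), round(out_h / scale)

for mode, scale in MODES.items():
    w, h = render_resolution(2560, 1440, scale)
    print(f"{mode:>17}: {w}x{h} internal (~{scale * scale:.2f}x fewer pixels)")
# Performance mode at 1440p renders internally at 1280x720 -- exactly the
# 720p-to-1440p jump mentioned above, shading 4x fewer pixels per frame.
```

Since the upscale pass itself has a fixed cost, the real-world gain is less than the raw pixel ratio, but "often doubles" is entirely plausible for Performance mode.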

With FSR, which I've now fiddled with in a few games just to see (disabling DLSS altogether) there is not that much performance gain...and the image noticeably suffers. There is a lot of artifacting around in-game models with it enabled, and some real blockiness that happens with translucency/transparency on textures like chain-link fences or windows.

Now, I'm trying it on an Nvidia card, so I would imagine an AMD card might fare better in both FPS and image fidelity, but it's not something that I would call "comparable" to DLSS tech. It's there, and it's available for people that don't have the option of DLSS, so that's great -- and I fully expect a lot of issues to be ironed out over time. But I could imagine that incorporating FSR in an official capacity may be in the realm of diminishing returns for something the size of CP2077. Meaning: personnel pulled off of other projects to continue supporting it through CDPR's and AMD's updates, when it may produce only middling-to-meh results... perhaps not even consistent enough performance to officially integrate frame generation on existing hardware (based on minimum specs for both PC and consoles). Again, just speculating.
 
Looking at what they've done, I have to say, it's a fair answer to the exclusivity of DLSS...but it's apparently just a form of layering temporal anti-aliasing on much lower resolutions to gain frames. I can definitely say that I don't know if something like that would ever work well with CP2077 because of the sheer amount of fine details in the game world. I imagine CP would have artifacting all over the place.

Using DLSS for some years now, I can definitely vouch for it. I can also understand why it would need specifically integrated hardware to do what it does. But I'm simultaneously not a fan of exclusivity. Hopefully, AMD can eventually evolve FSR into something that will work as effectively as their Free-Sync tech did as an answer to G-Sync.

1. There are lots and lots of "detailed" games using FSR and they still look fine.
2. CDPR's TAA solution is mediocre at best, as mentioned by many reviewers - it loses a lot of fine detail and is built into the game itself.
3. XeSS implementation in CP2077 is actually pretty decent, which shows hardware agnostic solutions can be well integrated with the engine.
4. Also, while XeSS versions are easily swappable, CDPR didn't even care to put the 3.1.1 version in the game (released July 2024) and used 3.1.0 instead. I replaced it myself and it took me about 73 seconds. FSR has also been easily swappable since 3.1, and AMD just released a new 3.1.2 version yesterday. But this I can't use, since CDPR put 3.0 in the game...
5. FSR 3.1 decouples upscaling and frame generation, which would allow the NVIDIA 3xxx series to use DLSS for upscaling and FSR 3 for frame generation, but this is probably something NVIDIA wouldn't like... I love the game, but CDPR's decisions regarding the implementation of AMD and Intel solutions are questionable, to say the least.
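For anyone curious what "easily swappable" means in practice: since these upscalers ship as standalone DLLs, the swap is essentially a backup-and-overwrite of one file. A hypothetical Python sketch of that process -- the DLL name and paths here are placeholders for illustration, not the actual file names CDPR ships, so check your game's bin directory first:

```python
# Hypothetical DLL-swap helper: back up the upscaler DLL the game
# ships with (once), then overwrite it with a newer version.
from pathlib import Path
import shutil

def swap_dll(game_bin: Path, new_dll: Path, dll_name: str) -> None:
    target = game_bin / dll_name
    backup = target.with_suffix(target.suffix + ".bak")
    if not backup.exists():          # keep the original exactly once
        shutil.copy2(target, backup)
    shutil.copy2(new_dll, target)    # overwrite with the newer version

# Example usage (paths and file name are placeholders):
# swap_dll(Path(r"C:\Games\Cyberpunk 2077\bin\x64"),
#          Path(r"C:\Downloads\libxess.dll"),
#          "libxess.dll")
```

The "about 73 seconds" above is believable precisely because this is the whole operation; no rebuild of the game is involved as long as the new DLL keeps the same API the game was built against.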

I mean, I'd gladly read an explanation from the developers as to why they made such choices, but we all know they won't say a word.
 
With FSR, which I've now fiddled with in a few games just to see (disabling DLSS altogether) there is not that much performance gain...and the image noticeably suffers. There is a lot of artifacting around in-game models with it enabled, and some real blockiness that happens with translucency/transparency on textures like chain-link fences or windows.

Now, I'm trying it on an Nvidia card, so I would imagine an AMD card might fare better in both FPS and image fidelity, but it's not something that I would call "comparable" to DLSS tech. It's there, and it's available for people that don't have the option of DLSS, so that's great -- and I fully expect a lot of issues to be ironed out over time. But I could imagine that incorporating FSR in an official capacity may be in the realm of diminishing returns for something the size of CP2077. Meaning: personnel pulled off of other projects to continue supporting it through CDPR's and AMD's updates, when it may produce only middling-to-meh results... perhaps not even consistent enough performance to officially integrate frame generation on existing hardware (based on minimum specs for both PC and consoles). Again, just speculating.

I'm genuinely surprised by the results you saw.

My own fiddling with FSR, even 2.0, showed very comparable results to DLSS in terms of performance gain. Obviously, my own tinkering is also entirely on an Nvidia GPU.

And it's also what various outlets report: FSR can rival DLSS perfectly fine. You can look it up if you want; there are plenty of DLSS vs. FSR measurements out there. In fact, there are games where FSR wins.

Where DLSS always wins is in terms of image quality. That is not in doubt but I can't say I've experienced the issues you did. It certainly did look more washed out, especially around certain types of smaller textures/models.

From everything I've experienced and seen out there, the "race" between the two products is much closer than you seem to have experienced on your end.

Let's also not forget that Nvidia released DLSS about 3 years earlier than FSR. Which makes it that much more impressive that the race is that close.
 
FSR is an AMD technology. Integrating it isn't as simple as saying, "I'm gonna put FSR in my game." Just like DLSS, it's a learning algorithm that AMD needs to implement in order to get it working well. That takes time, as the algorithm needs to run frames millions of times in order to figure out the best methods to create the cleanest images. That requires both a studio and AMD's labs to work together over time.
There is a mod that works flawlessly... So if one modder can do it...
 
There is a mod that works flawlessly... So if one modder can do it...
We've already clarified that I had misunderstood how FSR worked. I thought it was a software-based version of Nvidia's DLSS technology. It's completely its own thing, and also works very differently. As stated, just because it may be simpler to implement, that doesn't mean that there is not something else preventing CDPR from officially supporting it. Could be anything from the studio just doesn't like the results it creates to some sort of legal barrier. We have no info...except the fact that FSR is not working as spectacularly as it could.

My guess would be that over time, it will wind up being addressed. But that's just a guess.
 