Why did they decide to release FSR3 long after the game's actual release date, and in an old version (3.0 instead of 3.1)? Only they know, probably...
FSR is an AMD technology. Integrating it isn't as simple as saying, "I'm gonna put FSR in my game." Just like DLSS, it's a learning algorithm that AMD needs to implement in order to get it working well. That takes time, as the algorithm needs to run frames millions of times in order to figure out the best methods to create the cleanest images. That requires a studio and AMD's labs working together over time.
Why was Final Fantasy VII Rebirth released on Unreal Engine 4? Because they had already started the work on it. Maybe there are breaking changes in FSR 3.1, and they would have had to rework a lot of things to make it work with REDengine.
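For a sense of why a point release can mean real rework: if an SDK reshapes its API between versions, every call site plus the renderer plumbing around it has to be revisited. Below is a purely hypothetical C++ sketch; none of these names are AMD's real FidelityFX API, they only illustrate the kind of breaking change (for instance, 3.1 splitting frame generation from upscaling) that forces engine work rather than a drop-in swap.

#include <cstdio>

// v3.0-style (imaginary): one context owns both upscaling and frame
// generation, and dispatch takes the inputs the engine already has.
namespace v30 {
struct Context { int displayW, displayH; };
Context create(int w, int h) { return {w, h}; }
void dispatch(const Context& c, const float* /*color*/,
              const float* /*motionVectors*/) {
    std::printf("v3.0 dispatch at %dx%d\n", c.displayW, c.displayH);
}
}

// v3.1-style (imaginary): frame generation is split into its own
// object, and dispatch now requires a "reactive" transparency mask.
// If the renderer never produced such a mask, that is new engine work,
// not a search-and-replace upgrade.
namespace v31 {
struct Upscaler { int displayW, displayH; };
struct FrameGen { bool enabled; };
Upscaler createUpscaler(int w, int h) { return {w, h}; }
FrameGen createFrameGen() { return {true}; }
void dispatch(const Upscaler& u, const float* /*color*/,
              const float* /*motionVectors*/, const float* /*reactiveMask*/) {
    std::printf("v3.1 dispatch at %dx%d (mask required)\n",
                u.displayW, u.displayH);
}
}

int main() {
    float color[1] = {0}, motion[1] = {0}, mask[1] = {0};

    v30::Context oldCtx = v30::create(3840, 2160);
    v30::dispatch(oldCtx, color, motion);

    v31::Upscaler up = v31::createUpscaler(3840, 2160);
    v31::FrameGen fg = v31::createFrameGen();   // now a separate step
    if (fg.enabled)
        v31::dispatch(up, color, motion, mask); // extra input needed
    return 0;
}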
While the learning-algorithm part is partially true of DLSS (largely due to Nvidia being Nvidia), it's completely incorrect for FSR.
Oh luds...you're completely correct. My mistake. Don't really know how I missed that (aside from not having an AMD card).
Frankly, it's an easy mistake to make. Most people still assume FSR is simply AMD's direct answer to DLSS when it's not. It's not even what AMD advertises it as. It's a far more consumer-friendly alternative that really throws shade at Nvidia's ethics, in my opinion.
I have really been reconsidering my allegiance to team green/blue these last few months lol.
Looking at what they've done, I have to say it's a fair answer to the exclusivity of DLSS... but it's apparently just a form of layering temporal anti-aliasing over much lower resolutions to gain frames. I honestly don't know if something like that would ever work well with CP2077, given the sheer amount of fine detail in the game world. I imagine CP would have artifacting all over the place.
Having used DLSS for some years now, I can definitely vouch for it. I can also understand why it would need specifically integrated hardware to do what it does. But I'm simultaneously not a fan of exclusivity. Hopefully, AMD can eventually evolve FSR into something that answers DLSS as effectively as their FreeSync tech answered G-Sync.
Absolutely, but FSR seems to be using the same "fast-and-dirty" approach to upscaling that FSAA took with anti-aliasing. It's definitely faster, but image quality will likely fluctuate wildly between titles. From what I've read, it apparently takes subsequent frames at the lower resolution, scales them up, then attempts to "blend" the upscaled frames with a TAA-style effect in several passes that is still faster than actual TAA. Hence all the artifacting that can result, especially on the edges of textures that move or deform (like hair with physics applied). All in real-time, so software only.
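In toy form, that "upscale each frame, then blend it with previous frames" idea looks something like the following. This is a deliberately simplified C++ sketch, not FSR's actual algorithm or code; real temporal upscalers also reproject the history along motion vectors, which is skipped here.

#include <cstddef>
#include <cstdio>
#include <vector>

using Image = std::vector<float>; // grayscale pixels, row-major

// Bilinear upscale from (lw x lh) to (hw x hh).
Image upscale(const Image& low, int lw, int lh, int hw, int hh) {
    Image out(hw * hh);
    for (int y = 0; y < hh; ++y) {
        for (int x = 0; x < hw; ++x) {
            // Map the output pixel back into low-res coordinates.
            float sx = x * (lw - 1) / float(hw - 1);
            float sy = y * (lh - 1) / float(hh - 1);
            int x0 = int(sx), y0 = int(sy);
            int x1 = (x0 + 1 < lw) ? x0 + 1 : x0;
            int y1 = (y0 + 1 < lh) ? y0 + 1 : y0;
            float fx = sx - x0, fy = sy - y0;
            float top = low[y0 * lw + x0] * (1 - fx) + low[y0 * lw + x1] * fx;
            float bot = low[y1 * lw + x0] * (1 - fx) + low[y1 * lw + x1] * fx;
            out[y * hw + x] = top * (1 - fy) + bot * fy;
        }
    }
    return out;
}

// The TAA-like accumulation step: blend the new upscaled frame into a
// running history. The history is what smooths edges across frames, and
// it is also exactly what ghosts/artifacts when pixels move, since this
// toy skips reprojecting the history along motion vectors.
void accumulate(Image& history, const Image& current, float alpha) {
    for (std::size_t i = 0; i < history.size(); ++i)
        history[i] = history[i] * (1.0f - alpha) + current[i] * alpha;
}

int main() {
    const int LW = 4, LH = 4;   // internal render resolution
    const int HW = 8, HH = 8;   // output resolution
    Image history(HW * HH, 0.0f);
    for (int frame = 0; frame < 4; ++frame) {
        Image low(LW * LH, 0.1f); // stand-in for a rendered low-res frame
        low[frame] = 1.0f;        // one bright pixel drifting each frame
        accumulate(history, upscale(low, LW, LH, HW, HH), 0.25f);
    }
    std::printf("center pixel after 4 frames: %.3f\n",
                history[(HH / 2) * HW + HW / 2]);
    return 0;
}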
It's a bit more than that. And it can definitely work well with CP2077. If you're interested, look into LukeFZ's work. He's a modder who has dedicated his time to getting FSR 3/3.1 into various games. His 3.1 implementation in CP2077 is almost flawless: it gives access to frame generation and has all the crispness of 3.1 that CDPR's 3.0 lacks, with zero artifacts (assuming one has the hardware for it). It's quite amazing work, to be honest.
Impressive! I can already see why CDPR may have gone with one but not the other. I'm guessing that AMD would want some sort of significant licensing fee from established studios to gain access to all of the tools that must be available. And it must have been a pretty big task getting it to work that well.
Yeah, they're actually nothing even remotely alike. I was way off. There's no AI learning between frames and resolutions at all with FSR. I didn't realize that all of it was handled in real-time. It's really its own, unique approach.
Don't get me wrong, the image quality with DLSS is still superior. I'm rethinking my allegiance to Nvidia, sure, but at this point in time there is no doubt I'm still buying a 5080 as soon as they're released, to replace my 3080 Ti. There is still a gap, but AMD is closing it by wide margins with every new version, and that really throws a wrench into Nvidia's explanation for gatekeeping things like frame generation behind newer hardware.
For myself, while I do use DLSS pretty much whenever it's available, I probably won't spend money on a new card anytime soon. I absolutely refuse to pay these prices, and I'm more than happy with my 3060. When I can upgrade meaningfully for ~$300-$500, then I'll think about a new GPU. 60 FPS is still perfectly adequate for me; I actively play at 30 and 48 at times in other games, and I mostly get 72-144 FPS in the games I'm playing now.
It's lost on me. Totally don't care whether I get 30 or 300 FPS as long as it feels smooth.
Heh -- it's all a psychological thing in the end. I grew up on Apple IIs and the Commodore 64; 4-8 FPS was all that was possible at that point. To this day, I can definitely tell the difference between 30 and 60. I can feel the difference between 60 and 80. Anything above that, and I couldn't tell you if something was running at 80 or 144.
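There's simple arithmetic behind that ceiling, too: frame time is 1000 / FPS milliseconds, so the absolute time saved per frame shrinks quickly as the rate climbs. A quick C++ illustration (my own numbers, nothing official):

#include <cstdio>

int main() {
    // Frame time in milliseconds is 1000 / FPS. Note how the absolute
    // gap shrinks: 30 -> 60 FPS saves ~16.7 ms per frame, while
    // 80 -> 144 FPS saves only ~5.6 ms.
    const double rates[] = {30.0, 60.0, 80.0, 144.0};
    for (double fps : rates)
        std::printf("%6.1f FPS = %6.2f ms per frame\n", fps, 1000.0 / fps);
    return 0;
}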
I wish I could say the same, truly. I grew up poor; the best I could hope for was integrated graphics that could barely play Diablo 2. Mid-to-high teens to low-20s FPS was my norm. My first GPU was some low-end GeForce4 a friend gifted to me so I could play WoW with him. I don't think it even met the minimum requirements... but it worked! Then I got a 9500... GT, I think, that I kept all the way through university. Choppy gameplay was all I knew for the longest time.
You'd think somehow I'd be fine with lower FPS, but I can't stand anything under 60 FPS anymore. I know there are various things you can do to make 30 FPS feel much smoother, but it just doesn't do it for me. I'm not claiming I see a difference between 60 and 58 (I don't), but anything in the low 50s and below I immediately notice and just can't stand. Plus, I really like my bells and whistles in games these days, since I can actually afford them now. I don't want to spend that much on just a GPU, but I've come to the conclusion that I don't really have a choice in the matter. There is just no way to reconcile wanting top-end graphics and high performance with not paying stupid prices.
At this point I've accepted I'm not reasonable on this.
With FSR, which I've now fiddled with in a few games just to see (disabling DLSS altogether), there is not that much performance gain... and the image noticeably suffers. There is a lot of artifacting around in-game models with it enabled, and some real blockiness that happens with translucency/transparency on textures like chain-link fences or windows. I'd guess that's because transparent surfaces often don't write proper motion vectors, which gives a temporal upscaler nothing to work with.
Now, I'm trying it on an Nvidia card, so I would imagine an AMD card might fare better in both FPS and image fidelity, but it's not something I would call "comparable" to DLSS tech. It's there, and it's available for people who don't have the option of DLSS, so that's great -- and I fully expect a lot of issues to be ironed out over time. But I could imagine that incorporating FSR in an official capacity may be in the realm of diminishing returns for something the size of CP2077. Meaning: personnel pulled off of other projects to continue supporting it through CDPR's and AMD's updates, when it may produce only middling-to-meh results... perhaps not even consistent enough performance to officially integrate frame generation on existing hardware (based on minimum specs for both PC and consoles). Again, just speculating.
There is a mod that works flawlessly... So if one modder can do it...
We've already clarified that I had misunderstood how FSR worked. I thought it was a software-based version of Nvidia's DLSS technology. It's completely its own thing, and it also works very differently. As I said, just because it may be simpler to implement doesn't mean there isn't something else preventing CDPR from officially supporting it. It could be anything from the studio simply not liking the results it produces to some sort of legal barrier. We have no info... except the fact that FSR is not working as spectacularly as it could.