Well, if we had comparison screenshots it would not be subjective. You can argue about a game's style, but people generally agree that higher resolution is better and a longer view distance is better.
No, I'm arguing that whether or not it bothers you, or "ruins the game", or "destroys the immersion" is subjective. I'm not arguing that it wasn't changed -- of course it was. I'm living proof that it's not objectively an "issue". Doesn't really bother me in the least. It's just another step on the path. Same thing was done with Witcher 3. Look at TW3 now. Specs haven't changed. It's a process.
I'm also running CP2077 on exactly the same system I built for TW3 back in 2015. No upgrades. Running full Ultra, RTX off (as it doesn't exist for the 980 Ti), 1080p. Still not noticing any real performance woes. A bit more steady overall. Still getting 45-56 FPS everywhere, frame cap locked at 56, Vsync on. There's now a little more near-field draw-in.
Running on a nearly 6-year-old system...I'd say that's pretty optimized.
Well, there is a "garbage mountain" that looks awful from a LOD perspective:
Also, while I do love Witcher 3, comparing the LOD behaviour of a 2020 release to a game from 2015 is not right. I know I often draw the Witcher 3 comparison as well, because both use different versions of the RED Engine, but if CDPR was unable to get this right for Cyberpunk 2077, they should definitely invest the required time for their next game. Especially when compared to what Unreal 5's Nanite technology is capable of. (Although Nanite should be taken with a grain of salt, because we have yet to see the technology deployed in a real game.)
Sure, but those are the types of things that can now be optimized further as time goes on. I can't really solve problems like that before a) I see there's a problem, and b) I discover a balanced solution. Now, the numbers are known. The engine has been re-budgeted to solve issues, and it's known how far the values can be pushed without causing issues that were discovered after release. Now, for example, someone can go in and remodel those garbage heaps to cut the polygons used by a third, and/or rebuild the textures and maps to make them less intensive. Then, that aspect of the LoD can be tweaked back out a ways.
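As a rough illustration of that re-budgeting idea, here's a minimal sketch, assuming a simple distance-and-budget LOD scheme with made-up numbers (nothing to do with the RED Engine's actual code):

```cpp
// Minimal sketch of a distance/budget LOD pick, with hypothetical numbers.
// "remodeled" stands in for an asset like the garbage heaps rebuilt with
// fewer triangles, which lets its detailed LOD's switch distance move back out.
#include <cstdio>
#include <vector>

struct LodLevel {
    int triangles;      // rendering cost of this LOD
    float maxDistance;  // farthest camera distance at which this LOD is used
};

// Most detailed LOD that is both close enough and cheap enough; otherwise
// fall back to the coarsest one.
int pickLod(const std::vector<LodLevel>& lods, float distance, int budget) {
    for (size_t i = 0; i < lods.size(); ++i) {
        if (distance <= lods[i].maxDistance && lods[i].triangles <= budget)
            return static_cast<int>(i);
    }
    return static_cast<int>(lods.size()) - 1;
}

int main() {
    // Hypothetical asset before and after a remodel that cuts the triangle count.
    std::vector<LodLevel> original  = {{90000, 40.f}, {30000, 120.f}, {5000, 500.f}};
    std::vector<LodLevel> remodeled = {{60000, 70.f}, {20000, 160.f}, {5000, 500.f}};

    int budget = 70000;  // per-object triangle budget for this frame
    std::printf("original  at 60 m -> LOD %d\n", pickLod(original, 60.f, budget));
    std::printf("remodeled at 60 m -> LOD %d\n", pickLod(remodeled, 60.f, budget));
    return 0;
}
```

At 60 m the original heap has already dropped to its coarser LOD, while the remodeled one can keep the detailed mesh, because the reduced triangle count let its switch distance be pushed out.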
There's no "snap my fingers and everything's better." It's going to be back and forth for a bit until everything is just so. It took about 3 years after release before the final version of TW3 was released. (And there are still people encountering issues...)
Honestly, until recently old-gen did have the utmost priority, and those fixes were needed fast. If two fixes were on the table, one that would improve performance on old-gen while keeping the visual fidelity unchanged, and a similar one that would additionally reduce the visual fidelity but take only a third of the time to implement, I think they might have gone with the latter.
I think they needed a bunch of fixes quickly, which they had to deploy on a running system. If Cyberpunk 2077 were released today, it would probably look better on PC and run smoother on old-gen. (Unless there truly is a fundamental issue with the RED Engine.)
How? Reality knocking again. Obviously, one of the main reasons the game was suffering from such terrible performance issues was because the rendering engine was demanding far too much of the hardware. It couldn't handle it. Thus, obviously, that needed to be tweaked to get it working within the limitations of what is physically possible on last-gen hardware.
Obviously, lower-end hardware is not going to be capable of the same degree of detail as much higher-end hardware. The general rule of thumb with consoles is to compare graphics to a PC with the same specs. Very often, consoles will look about that good and offer slightly better performance. Consoles are definitely better streamlined for gaming than an equivalent PC will be by default.
But if I'm expecting my PS4 or XB1 to somehow rival the graphical detail and performance of a PC running an i9 processor with an RTX 3000 series GPU...there's no possible way. That's simply an unrealistic expectation for the last-gen hardware and capabilities. (The same would be true of a PC player using a $900 laptop expecting their game to look and run like a demo they watched of the game running on a custom-built PC rig worth $3,000.) The specifications table clearly outlines what detail settings to expect in order to get the game running between 30 and 60 FPS.
Yeah, I also believe that even if they downgraded Cyberpunk 2077 before, they won't continue doing so. From here on out, it's probably moving forward in terms of visual fidelity.
A "downgrade" would be:
- "We've removed support for ray-tracing for the game. It's no longer supported."
- "The game no longer supports 64-bit processing."
- "The game will no longer support DirectX 12, it will only render at DirectX 11 quality."
"We've worked with the LoD scaling to ensure smoother performance," is not a downgrade. It's optimization to ensure that people see fewer crashes, more stable FPS, fewer visual bugs or glitching, etc.
I miss those days when ultra settings were truly ultra and players weren't able to run the game at that setting even with the best computer money could buy at the time. It was just developers using and trying crazy things, like Witcher 2's Bokeh filter. Nowadays, people get angry when they can't run a game at ultra with 60 FPS on a toaster. (I'm not talking about console users here; the game was marketed and sold for old-gen, so these people need to get a game that runs!)
There is no such time that I can recall. For every "big game" ever released, there was a crowd of people that lauded it as the best thing ever made...a crowd that complained it was total garbage compared to [ThisGame] or [ThatGame]...and a gigantic majority of people in the middle, between either end of the spectrum.
Nothing in this arena has changed since...whenever. It's always been the same considerations and the same arguments. If anything, we now have the ability to run pretty much any game out there at completely playable levels, even on a toaster! (Seriously, I think there are microwaves now with more computing power than the gaming rigs I built in the '90s.)
The game works just fine on PC, it's simply that many players aren't aware of the steps they may need to take to get things working for games that pose challenges. It took me a few days of troubleshooting to figure out why my game was crashing, and another day to get it running without issue. Sometimes that's required. All part of gaming on PC.
Where I do agree with this statement is for console players. Yes, I agree that it was a mistake to release the game on last-gen in that state, and it's a mistake that CDPR has long since owned and offered compensation for. At the same time, as I stated above, no, the last-gen builds are absolutely not going to have the same graphical fidelity as the builds for next-gen consoles. It will, however, be completely playable on last-gen. It already is for many, as I have seen it running on my buddy's XB1. It's not crashing and burning -- it's a very consistent 30 FPS with a few areas that chug a bit. The videos I've seen of other people's games on XB1 or PS4: whoa...yeah...that's an issue.
I disagree here. Developers should take that into account, and they usually do; if their game is affected too much by an uncommon aspect ratio, then that's just bad design. Personally, I'm gaming at 5120x1440 and I haven't had any problems yet. Sure, I had to edit Witcher 2's binary to support that resolution, but afterwards the game ran fine. With more modern games, like Witcher 3 or Cyberpunk 2077, I didn't have any issues.
Heh -- I wish! I've got to play in a 1920x1080 window -- can't even get it smooth around 50 FPS at 1440p. Sort of miss CRT monitors' ability to resize the whole screen space. But I'm not giving up that detail!
But non-standard resolutions are non-standard for a reason. Many don't have any idea how much work is involved in supporting different aspect ratios. Namely, all 2D assets need to be completely rebuilt -- every single thing -- for each aspect ratio the game will use. Once the resolution for an aspect ratio gets past a certain threshold, the assets need to be completely rebuilt for that as well. Again.
That is not only incredibly time-consuming, but incredibly expensive, as people aren't going to do that sort of work for free. Even modders tend not to touch that with a 75-foot pole.
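To give a flavour of what that work looks like, here's a minimal sketch, purely hypothetical and only covering the layout side (the art itself still has to be redrawn), of re-anchoring one HUD element for a 32:9 screen instead of stretching it:

```cpp
// Rough sketch of why wide ratios need dedicated 2D layout work: a HUD
// element authored for 1920x1080 is re-anchored to the screen edge and
// scaled by height only, instead of being stretched across a 32:9 display.
// Names and numbers are hypothetical, not any engine's real API.
#include <cstdio>

struct Rect { float x, y, w, h; };  // pixel coordinates, origin top-left

// Keep the element the same relative size and pinned to the right edge.
Rect anchorRight(const Rect& authored, int screenW, int screenH) {
    float scale  = screenH / 1080.0f;                              // scale with height only
    float margin = (1920.0f - (authored.x + authored.w)) * scale;  // gap to right edge
    Rect out;
    out.w = authored.w * scale;
    out.h = authored.h * scale;
    out.x = screenW - margin - out.w;
    out.y = authored.y * scale;
    return out;
}

int main() {
    Rect minimap = {1600.f, 50.f, 280.f, 280.f};   // authored for 1920x1080
    Rect wide = anchorRight(minimap, 5120, 1440);  // same element at 5120x1440
    std::printf("minimap at 5120x1440: x=%.0f y=%.0f w=%.0f h=%.0f\n",
                wide.x, wide.y, wide.w, wide.h);
    return 0;
}
```

And that is just the positioning logic for a single element; every piece of 2D art still needs versions that actually look right at the new ratio, which is where the time and money go.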
It's your SSD, so do whatever you like, but you're probably being a bit paranoid; as long as you keep 20-25% free (the more the better), it will be fine.
Well, as I've told many people in the past, the first time they deal with a complete hard-drive crash at precisely the wrong moment, then have no way of recovering critical data for work or something, and even after a reformat and reinstallation of the OS, there are so many bad sectors that they still have to deal with ongoing file system errors, until they finally replace the drive, requiring that they go through everything again...
...they'll start ensuring that there's plenty of free space on every drive. 10% is actually cutting it a bit close -- especially with individual files able to exceed 4 GB (movies, music, etc.) nowadays.
I stand by my statement that optimisation is only optimisation if the visual fidelity stays roughly the same. Theoretically, I could remove all NPCs in Cyberpunk and call that optimisation as well. Sure, it is up for discussion what "roughly" means, but if the LOD issue wasn't as pronounced in 1.06, I would call it a downgrade and not optimisation.
To stress the point I made earlier once more, I doubt that those things were really gameplay-related and broke the game fundamentally. I would much rather assume this matter is very complicated and that CDPR had neither the time nor the resources to fix it properly. That being said, I agree with your last statement that since 1.23 this enormous pressure has been lifted from their shoulders.
Also, looking at the patch notes, CDPR was able to fix a myriad of issues, which also makes the gameplay experience far more enjoyable on PC than it was at launch.
You can stand by that belief if you want, but that's not how it works. Of course, it's ideal if things work out that way, but pick any professional game developer that you want anywhere in the industry, and they'll be able to explain in detail how and why it never works out that way.
The example I gave is not meant to be taken literally -- it was an intentionally simplified example of how various aspects of an engine can connect in ways that a player has no ability to see or understand. If you'd like a real-world example of how obscure and ridiculously difficult this sort of engine issue can be, research the lip-sync bug for Skyrim. It took nearly 5 years of work, if I remember right, long after Bethesda had written it off, for a modder to finally find a work-around (not a true fix). Just for one bug. One that created a terribly distracting issue with the game.
The core of your stance is kind of like trying to argue that if the rocket didn't get to space, then it needs to be a bigger rocket. So, just build a bigger one. That's the way it should have been done to begin with.
That's not how reaching escape velocity works. (And yes, actually, trying to budget resources for a very demanding engine is a lot like rocket science. With fewer explosions. [Not "no" explosions...just...not as many...])