Did patch 1.1 mess with the LOD Distance?

It does allow the user to configure these settings -- just not the same way it worked before. That was the whole point of the optimization.

Because of the way online marketplaces function, it's not possible to update a game officially while still allowing players to roll back to prior versions. It's a support / liability concern -- a legal issue and decision. I think it's pretty dumb in practice, as well, but it won't be changing. Totally impossible on Steam or Origin, as far as I know. It might be possible on GOG if you download the installation packages manually through your browser (i.e., not Galaxy). The base installation may still be 1.0, with later updates as separate packages. (I don't know for sure about CP2077; you'll have to check how the downloads are structured.) If it's individual packages for the game and updates, just install up to 1.06, then stop.
Nope, I just checked that: the only offline installer is for version 1.3, and even that doesn't have the whole fileset. There are no updates of any kind available.
 
Nope, I just checked that: the only offline installer is for version 1.3, and even that doesn't have the whole fileset. There are no updates of any kind available.
On GOG, the old versions will appear when you uncheck auto-updates ;)
 
Ye-es, but the problem is apparently with the event system, not FPS. They should be entirely decoupled, and if they're not, there's a fundamental problem with how the engine works. Otherwise your game could break if you're running it on a low-end PC with choppy graphics, or vice versa. I can't think of any other instance offhand where events and triggers break because the engine's graphics thrash the system too much -- at least not for published games. Graphics breaking and the game crashing if things are pushed too hard, yes, or funky physics because someone's routine was written for a console with a fixed 30 FPS. But triggers? I guess the physics case could break scripts if object x ends up in a lake unexpectedly, but that's not what happens here.

Closest I can think of is the shitty old days when everything was single-threaded and the UI could start lagging hard when a game pushed the system. Paradox was "good" at this before they upgraded the engine to "multi-threaded", i.e. the UI runs in one thread, scripts run in another, and the other 10 cores in your system do nothing.
An example I have personal (and extensive) experience with is Morrowind's Gamebryo, which is exactly the same base engine that was used for Oblivion, Fallout 3, Fallout New Vegas, Skyrim, Fallout 4, and Fallout 76. Frame timing for all game functions operated off of 60 FPS or less. If I increase the FPS beyond that point, things like player movement speed, idle animations, and weapon attack speed are accelerated linearly with every frame over 60. The functions are based on the number of frames actually drawn to the screen.

Physics calculations were based on keyframes that were rendered and drawn to the screen, again ticking complete cycles every 60 frames maximum. If the FPS exceeded 60, or if multiple partial keyframes were drawn (because of unlimited FPS / vsync off), the additional frame data would be treated as a multiplier on the physics calculations and would result in a "physics explosion". (I bump into a cart, and it flips over, sending a cabbage into orbit.)

But most shocking to people struggling with "unexplainable" glitches and errors was the fact that Papyrus scripts (Bethesda's scripting system that manages everything concerning quests, weather, day/night cycles, random NPC population and placement, animation packages, Radiant AI behaviors, etc. -- everything that makes the game "do stuff") also operate on a timer that uses 60 FPS frame timing to function. Frames passing is what counts as "ticks" on its internal clock. If I exceed 60, it means that the clock is now running in fast-forward. All sorts of broken stuff can start appearing.
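To make that concrete, here's a minimal sketch of the pattern (invented names, not actual Gamebryo/Creation Engine code): game logic that advances once per rendered frame, with a step authored for 60 FPS. Render faster than that and everything fast-forwards.

```cpp
#include <cstdio>

// Sketch of the pattern only -- not engine source. The per-frame step was
// authored assuming the game never exceeds 60 FPS.
const float kTunedStep = 1.0f / 60.0f;

struct Actor {
    float position = 0.0f;
    float speed = 3.0f;  // units per second, as the designer intended
};

// Called once per *rendered frame*: frame count stands in for the wall clock.
void UpdatePerFrame(Actor& a, long& scriptTicks) {
    a.position += a.speed * kTunedStep;  // at 120 FPS this advances 2x real time
    ++scriptTicks;                       // quest/weather timers tick faster too
}

int main() {
    Actor a;
    long ticks = 0;
    for (int frame = 0; frame < 120; ++frame)  // one real second at 120 FPS
        UpdatePerFrame(a, ticks);
    // A 60 FPS machine would report position=3.0 after one second; here we
    // get 6.0 and twice the script ticks -- the "fast-forward" effect.
    std::printf("position=%.1f ticks=%ld\n", a.position, ticks);
    return 0;
}
```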

Other games that use similar frame-timing functions that I know of: Dark Souls 1-2, Dragon's Dogma, Dragon Age: Origins through Inquisition, Mass Effect 1-3, Halo 1-3, Supreme Commander and Forged Alliance, Total War 1 through Medieval 2... Each of them suffers from some pretty serious issues if FPS exceeds 60 (or 72 in some cases). It's not always a guaranteed issue, nor does it mean that players are aware that issues are occurring. But they're there.

...

It's easiest to understand from that direction, but it can work the other way, as well.

If I fall below a certain FPS threshold, it can prevent certain game functions from happening, or from happening on time. Why is this character not responding? Why is the next part of the mission not starting? Why are all the textures blurry on these NPCs? Why are those crates popping in, like, 5 seconds after I look at them?

And the answer can very likely be: not enough performance overhead. The engine is making calls to render and draw other things, and there's not enough system resource available to handle everything because there are, for example, too many distant LoD assets trying to load during what's supposed to be the "fast" bit of the rendering process. That needs to happen first so that players can see what's happening in front of them before other, more complex functions take over. And if that's slowing things down too much, then other functions can't execute because they're waiting on the go-ahead from the engine.

Now we have a chain reaction: some NPC stares blankly at you and refuses to initiate the quest dialogue because the scripted-scene system can't start, because the distant rendering algorithm is still trying to add specular-map textures to distant walkway railings to make them look wet, because the weather system says it's storming outside. And the only reason that happened is that the player approached from the east instead of the west, north, or south. Because they happened to be looking in that direction, their on-screen backdrop contained over 5,000 additional distant LoD assets that would not otherwise have needed to be rendered and drawn if they had been looking in a different direction at that exact moment in the game.
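A hypothetical sketch of that kind of per-frame budgeting (invented names, nothing to do with REDengine's real internals): the streamer only services its queue until a small time slice runs out, and everything else slips to later frames.

```cpp
#include <chrono>
#include <cstdio>
#include <functional>
#include <queue>

using Clock = std::chrono::steady_clock;

// One streaming job: a distant LoD chunk, quest-scene assets, and so on.
struct StreamJob {
    int priority;
    std::function<void()> load;
    bool operator<(const StreamJob& other) const { return priority < other.priority; }
};

class Streamer {
    std::priority_queue<StreamJob> queue_;
public:
    void Enqueue(StreamJob job) { queue_.push(std::move(job)); }

    // Called once per frame with a small slice of the frame budget. Whatever
    // doesn't fit slips to later frames -- including work that a gameplay
    // system (e.g. a scripted scene) may be waiting on.
    void Tick(std::chrono::microseconds budget) {
        const auto start = Clock::now();
        while (!queue_.empty() && Clock::now() - start < budget) {
            queue_.top().load();
            queue_.pop();
        }
    }
};

int main() {
    Streamer streamer;
    streamer.Enqueue({80, [] { std::puts("loading quest-scene assets"); }});
    // A sudden flood of distant-LoD jobs competes for the same slice each frame.
    for (int i = 0; i < 5000; ++i)
        streamer.Enqueue({10, [] { /* load one distant LoD asset */ }});
    streamer.Tick(std::chrono::microseconds(2000));  // ~2 ms of this frame
    return 0;
}
```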

Welcome to rendering engine optimization.

So...what do we do? Totally recreate the rendering from the ground up? That's the same as saying we'll start making the game over again from the beginning. It would require not only rebuilding the rendering algorithms, but rebuilding everything that has any reliance on the rendering to function...which would be virtually everything else. Not possible. Just tell the computer to start the dialogue first? Okay, then it will need to suspend the rendering altogether and prevent assets from loading, meaning when that dialogue scene triggers, you'll be staring at void space, or giant pink polygons in the distance. Customize the distant LoD manually for that scene? Okay, but then I have to do the same for every possible NPC interaction from every possible viewing angle (which is 360° -- horizontal and vertical!) for every single part of the game from beginning to end. That's ridiculous.

Or -- we can work on the overall rendering, balancing it in such a way as to prevent such a problem from occurring. We pull the level of detail closer to the camera, allowing the game to finish processing the visuals in time and letting the rest of the game functions execute as designed.
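As a rough illustration of what "pulling the LoD in" can mean in practice (invented numbers and names, not CDPR's actual values): each asset picks a detail level from its camera distance, and shrinking one global scale factor makes every transition happen closer to the camera, cutting the frame's rendering cost.

```cpp
#include <cstddef>
#include <cstdio>

// One detail level per entry, ordered near -> far; the last entry is the
// cheapest (billboard/impostor). Numbers are invented for illustration.
struct LodLevel { float maxDistance; /* mesh, material set, ... */ };

std::size_t PickLod(const LodLevel* levels, std::size_t count,
                    float distToCamera, float globalLodScale) {
    for (std::size_t i = 0; i + 1 < count; ++i)
        if (distToCamera < levels[i].maxDistance * globalLodScale)
            return i;
    return count - 1;  // beyond every threshold: cheapest representation
}

int main() {
    const LodLevel levels[] = {{50.0f}, {150.0f}, {400.0f}};
    // Same asset, same distance: scale 1.0 keeps the mid-detail model,
    // scale 0.7 "pulls the LoD in" and drops to the cheap one sooner.
    std::printf("scale 1.0 -> lod %zu\n", PickLod(levels, 3, 120.0f, 1.0f)); // 1
    std::printf("scale 0.7 -> lod %zu\n", PickLod(levels, 3, 120.0f, 0.7f)); // 2
    return 0;
}
```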

This, of course, is a drastically simplified example of just one possible issue that can arise in any game. For CP2077, I think the visual changes were more to avoid conflicts with other parts of the rendering and possibly the collision systems. Unlike a lot of other games (Bethesda games, again, are a perfect example of this) that reuse the same exact assets for almost all scenes, CP2077 includes a lot of unique assets that are only ever seen in one part of the game. (Not that there are no assets that are reused -- of course there are. Things like barrels, railings, trash heaps, vending machines, lamp posts, etc. are often identical assets no matter where you see them.) Other bits, like unique buildings, unique graffiti on walls, unique storefronts, and static 3D assets like the piping by the docks or the park area in the corporate center, are going to take a lot more rendering power to load into the game...as they can't be as readily pre-cached in RAM. They have to be streamed in on demand. And unlike other open-world games that feature the occasional unique area, most of Night City is unique like that. Hence, it may not be possible to balance that level of detail the same way it worked for other open-world games (like GTA, Red Dead, Assassin's Creed, Just Cause, Far Cry...).

That's my best guess on why the changes to scale back the LoD were made. But I don't know. I was pretty astounded by how much of NC was hand-crafted, as opposed to the way every other open-world game I've ever seen has worked. It was more than just plopping the same 25 city buildings down facing in different directions, like GTA or Far Cry. Here, rather, it looks like the devs took the approach they took for creating Oxenfurt and Novigrad in TW3 and said, "Let's do that...for the entire map of Night City." That's going to come with performance and optimization challenges.
 
IMHO this is a VERY good and important point. Especially for PCs, where thousands of major factors can be different and it's nearly impossible to track minute details like the "silicon lottery". It's very similar with consoles, just that it isn't advertised, because the product strives to be "simple".
Definitely. I first experienced it when I went to swap a hard drive from an Xbox 360 that I bought in the Middle East into an Xbox 360 I had in America. Opened up the units, and I didn't understand what I was looking at. The difference between the two machines was night and day. Components weren't even in the same spots. I thought I had been given some sort of imitation system or something. Called Microsoft, and they assured me that both were authentic. They even helped me get through the crazy internal plastic covering on one, which was added to consoles in that region to improve airflow.

They're getting to be almost as bad as PCs!


I didn't have any particular stability issues with 1.06 (the occasional crash, much like now, but nothing significant), so we can speculate as much as we want, but it still doesn't convince me as an acceptable explanation for what I see now in my PC version (with all due respect to your opinions, of course :beer:)
Me neither. I initially had crashes immediately upon exiting character creation, but upgrading from Win7 to Win10 solved all of that. I had only two crashes in my entire first playthrough. Performance was pretty steady throughout. No big issues with any of the other patches.

I'm seeing the scaled-back LoD now, just like everyone else. It just doesn't bug me much, and there are no stability issues at all. I'm going to do a clean reinstall before I start my next real playthrough. Maybe I'll see new things then.


Well, if we had comparison screenshots...
(I'll respond to this in a separate post...)

Nope, I just checked that: the only offline installer is for version 1.3, and even that doesn't have the whole fileset. There are no updates of any kind available.
On GOG, the old versions will appear when you uncheck auto-updates ;)
Good catch! I used to exclusively download installation packages with GOG. Never used Galaxy. Then, I got lazy around the time Gwent released...and I started using Galaxy. :p But I love the fact that GOG allows this. It has helped out so many times.
 
An example I have personal (and extensive) experience with is Gamebryo,...
Taking the Gamebryo engine as an example is a bit of a stretch, because that engine is a mess. No offense to Bethesda, but it has many faults, which in turn enable them to do many object- and NPC-related things. (Many of which Cyberpunk 2077 does not.)

Granted, we do not know if the RED Engine is a mess as well; that's difficult to say, and I assume in a way every engine is a bit of a mess.

Origins through Inquisition, Mass Effect 1-3
Can you back up this claim? Because I have a hard time believing it, given that Inquisition uses the Frostbite engine, which would mean many other Frostbite games would have the same issue, like Battlefield. Likewise, Mass Effect 1 uses a different engine than 2 and 3. Mass Effect 2 and 3 use Unreal Engine 3, which would also mean that most games of that era would have the very same problem.


Welcome to rendering engine optimization.
While I don't dispute that game engines are peculiar and difficult things to manage (after all, I have read Mile's retweet on Twitter), I still hold the assumption that the graphics system is, and should be, decoupled from everything else as much as possible. I don't argue that sometimes this might not be possible, and other times bugs exist that shouldn't exist, but if everything were that closely coupled, things like the Unreal engine would probably be far less popular. Also, regarding Mile's retweet, it should be mentioned that the problems stated by this developer are first and foremost physics-related, which does make sense.

Also, another reason those things are not that strongly related is that NPC behaviour is calculated on the CPU, whereas the graphical part is done on the GPU. (I know, draw calls have to be made, so the CPU is involved as well.)

So...what do we do? Totally recreate the rendering from the ground up? That's the same as saying we'll start making the game over again from the beginning. It would require not only rebuilding the rendering algorithms, but rebuilding everything that has any reliance on the rendering to function...which would be virtually everything else. Not possible.
Actually, an engine is just a toolset, and if necessary certain tools are redesigned from the ground up. For instance, raytracing is completely different from rasterisation.

Or -- we can work on the overall rendering, balancing it in such a way as to prevent such a problem from occurring. We pull the level of detail closer to the camera, allowing the game to finish processing the visuals in time and letting the rest of the game functions execute as designed.
Again, I do doubt that this was done to allow the game to execute normal/necessary functions properly and in time.

This, of course, is a drastically simplified example of just one possible issue that can arise in any game. For CP2077, I think the visual changes were more to avoid conflicts with other parts of the rendering and possibly the collision systems.
I fear that the RED Engine might have a fundamental problem with its draw distance, and it might very well be that it breaks if too many assets are loaded, or that the performance requirements increase disproportionately.

Similar to the Witcher 2 dithering issue, which was also engine-related and probably could not have been solved for that iteration of the RED Engine.

That's my best guess on why the changes to scale back the LoD were made. But I don't know. I was pretty astounded by how much of NC was hand-crafted, as opposed to the way every other open-world game I've ever seen has worked. It was more than just plopping the same 25 city buildings down facing in different directions, like GTA or Far Cry. Here, rather, it looks like the devs took the approach they took for creating Oxenfurt and Novigrad in TW3 and said, "Let's do that...for the entire map of Night City." That's going to come with performance and optimization challenges.
This is where the RED Engine shines: in the small details you are able to see everywhere. Then it can truly be breathtaking and magnificent, and luckily, from my experience, it is most of the time.
 
compared with an old video (look starting at 2:17...)

Interesting. While I did notice the very issue above only on the particular mountain in my screenshot, the video you posted clearly shows a more densely populated Night City, and I also could not make out any apparent LOD issues.
 
My 2c.

I don't know how much of the narrative that the game looked way better back at launch is down to rose-tinted glasses, and how much is down to people's different experiences with 1.3 on different hardware.

I didn't play it in December, but my experience of playing it in January was that it was a janky mess with horrible pop-in, frame-rate crashes, and often deserted streets, that at times could be quite beautiful.

On 1.3 I'm seeing a way busier city with more detail on buildings, more NPCs, more traffic, more environmental storytelling.

At times the number of NPCs on the screen can actually be quite impressive, even if there are probably duplicates to be found.

photomode_22082021_220933.png


Also I'm very much enjoying the more variable weather.

photomode_21082021_002736.png


Even if it does insist on drawing 1,000,000 2D car sprites in the distance on deserted roads.

Cyberpunk 2077 (C) 2020 by CD Projekt RED 22_08_2021 20_55_37.png
 
Don't have any screenshots left from back in December since I've formatted my PC. This is 1.3. Had to convert it to JPEG, but it should be pretty much the same.
 

Attachments

  • photomode_23082021_181650.jpg
In my game, when I set "slow HDD = on", some buildings don't load. Another thing I noticed is that the game is not creating the "cache" folder in appdata, where "gamepipelinelibrary.cache" is located.
 
In my game, when I set "slow HDD = on", some buildings don't load.
Don't want to derail the thread, but isn't this a recommended performance improvement option anyway, since it results in the data being loaded into RAM instead of being streamed from disk?
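That's the usual idea behind such toggles, at least. A generic sketch of the trade-off (invented names, not CP2077's actual implementation):

```cpp
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

// With the preload path, the disk cost is paid once at load time and later
// fetches are RAM copies; with the streaming path, every request touches the
// disk, which stutters when the drive can't keep up with the camera.
std::vector<char> OpenAssetPak(const std::string& pakPath, bool slowDiskMode) {
    std::vector<char> resident;
    if (slowDiskMode) {  // read the whole package into RAM up front
        std::ifstream file(pakPath, std::ios::binary);
        resident.assign(std::istreambuf_iterator<char>(file),
                        std::istreambuf_iterator<char>());
    }
    // Otherwise 'resident' stays empty and asset ranges are streamed from
    // disk on demand as the player moves.
    return resident;
}

int main() {
    auto assets = OpenAssetPak("content.pak", /*slowDiskMode=*/true);
    return assets.empty() ? 1 : 0;
}
```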
 
Hmm, I'm starting to wonder if the LODs are just sharper nowadays, and hence you see the lower-detail/blocky textures more clearly. Looking out of V's window at the neon sign, it's really blocky now. I remember it being smoother before.
 
Don't want to derail the thread, but isn't this a recommended performance improvement option anyway, since it results in the data being loaded into RAM instead of being streamed from disk?
I always played with "slow hdd = on" because the performance improves a lot, but after update 1.3, some buildings don't load with this option enabled. Then I noticed that the "cache" folder is not being created in appdata.

IMG_20210824_173535.jpg
 
Interesting. While I did notice the very issue above only on the particular mountain in my screenshot, the video you posted clearly shows a more densely populated Night City, and I also could not make out any apparent LOD issues.

Yes, it's an old video from December 2020, as I've said, so not 1.3, and here I can see far more detail in the distance compared to now -- the amazing detail of v1.06.


Hmm, I'm starting to wonder if the LODs are just sharper nowadays, and hence you see the lower-detail/blocky textures more clearly.
Not even in a dream. When I updated to 1.1, I was shocked from the first gameplay; it seemed like the game had changed all settings to low compared to the earlier gameplay that same morning... I couldn't believe it at first, but then came the sad truth...
 
Well, if we had comparison screenshots, it would not be subjective. Usually you can argue about the style of a game, but people commonly agree that the higher the resolution the better, and the farther you can see the better.
No, I'm arguing that whether or not it bothers you, or "ruins the game", or "destroys the immersion" is subjective. I'm not arguing that it wasn't changed -- of course it was. I'm living proof that it's not objectively an "issue". Doesn't really bother me in the least. It's just another step on the path. Same thing was done with Witcher 3. Look at TW3 now. Specs haven't changed. It's a process.

I'm also running CP2077 on exactly the same system I built for TW3 back in 2015. No upgrades. Running full Ultra, RTX off (as it doesn't exist for the 980 Ti), 1080p. Still not noticing any real performance woes. A bit steadier overall. Still getting 45-56 FPS everywhere, frame cap locked at 56, Vsync on. There's now a little more near-field draw-in.

Running on a nearly 6-year-old system...I'd say that's pretty optimized.

Well, there is a "garbage mountain" that looks awful from a LOD perspective:

mnt_bug_02.jpeg


Also, while I do love Witcher 3, comparing the LOD behaviour to a game from 2015 is not right for a game that was released in 2020. I know I often draw a Witcher 3 comparison as well, because both use different versions of the RED Engine, but if CDPR was unable to get this right for Cyberpunk 2077, they should definitely invest the required time for their next game -- especially when compared to what Unreal 5's Nanite technology is capable of. (Although Nanite should be taken with a grain of salt, because we have yet to see this technology deployed in a real game.)
Sure, but those are the types of things that can now be optimized further as time goes on. I can't really solve problems like that before a.) I see there's a problem, and b.) I discover a balanced solution. Now, the numbers are known. The engine has been re-budgeted to solve issues, and it's known how far the values can be pushed without causing issues that were discovered after release. Now, for example, someone can go in and remodel those garbage heaps to cut the polygons used by a third, and/or rebuild the textures and maps to make them less intensive. Then, that aspect of the LoD can be tweaked back out a ways.

There's no "snap my fingers and everything's better." It's going to be back and forth for a bit until everything is just so. It took about 3 years after release before the final version of TW3 was released. (And there are still people encountering issues...)

Honestly, until recently old-gen did have the utmost priority, and those fixes were required fast. If two fixes were presented, one that would increase performance on old-gen while keeping the visual fidelity unchanged, and a similar one that would additionally decrease the visual fidelity but take only a third of the time to implement, I think they might have gone with the latter.

I think they needed a bunch of fixes quickly, which they had to deploy on a running system. If Cyberpunk 2077 were released today, it would probably look better on PC and run smoother on old-gen. (Unless there truly is a fundamental issue with the RED Engine.)
How? Reality knocking again. Obviously, one of the main reasons the game was suffering such terrible performance issues was that the rendering engine was demanding far too much of the hardware. It couldn't handle it. Thus, obviously, that needed to be tweaked to get the game working within the limitations of what is physically possible on last-gen hardware.

Obviously, lower end hardware is not going to be capable of performing at the same degree of detail as much higher end hardware. General rule of thumb with consoles is to compare graphics to PC at the same specs. Very often, consoles will look about that good and offer slightly better performance. Consoles are definitely better streamlined for gaming than an equivalent PC will be by default.

But if I'm expecting my PS4 or XB1 to somehow rival the same graphical detail and performance as a PC running an i9 processor with an RTX 3000-series GPU...there's no possible way. That's simply an unrealistic expectation for the last-gen hardware and capabilities. (The same would be true of a PC player using a $900 laptop expecting their game to look and run like a demo they watched of the game running on a custom-built PC rig worth $3,000.) The specifications table clearly outlines what detail settings to expect in order to get the game running between 30-60 FPS.

Yeah, I also believe that if they downgraded Cyberpunk 2077 before, they won't continue doing so. From here on out, it's probably moving forward in terms of visual fidelity.
A "downgrade" would be:
  • "We've removed support for ray-tracing for the game. It's no longer supported."
  • "The game no longer supports 64-bit processing."
  • "The game will no longer support DirectX 12, it will only render at DirectX 11 quality."
"We've worked with the LoD scaling to ensure smoother performance," is not a downgrade. It's optimization to ensure that people see fewer crashes, more stable FPS, fewer visual bugs or glitching, etc.

I miss those days when ultra settings were truly ultra, and players weren't able to run the game at that setting with the best computer money could buy at the time. It was just developers using and trying crazy things, like Witcher 2's Bokeh filter. Nowadays, people get angry when they can't run a game at ultra with 60 FPS on a toaster. (I'm not talking about console users here; the game was marketed and sold for old-gen, so those people need a game that runs!)
There is no such time that I can recall. For every "big game" ever released, there was a crowd of people that lauded it as the best thing ever made...a crowd that complained it was total garbage compared to [ThisGame] or [ThatGame]...and a gigantic majority of people in the middle, between either end of the spectrum.

Nothing in this arena has changed since...whenever. It's always been the same considerations and the same arguments. If anything, we now have the ability to run pretty much any game out there at completely playable levels, even on a toaster! (Seriously, I think there are microwaves now with more computing power than the gaming rigs I built in the '90s.)

The game works just fine on PC, it's simply that many players aren't aware of the steps they may need to take to get things working for games that pose challenges. It took me a few days of troubleshooting to figure out why my game was crashing, and another day to get it running without issue. Sometimes that's required. All part of gaming on PC.

Where I do agree with this statement is for console players. Yes, I agree that it was a mistake to release the game on last-gen in that state, and it's a mistake that CDPR has long since owned and offered compensation for. At the same time, as I stated above, no, the last-gen builds are absolutely not going to be the same graphical fidelity as the builds for next-gen consoles. It will, however, be completely playable on last-gen. It already is for many, as I have seen it running on my buddy's XB1. It's not crashing and burning -- it's very consistent 30 FPS with a few areas that chug a bit. The videos I've seen of other people's games on XB1 or PS4: whoa...yeah...that's an issue.

I disagree here; developers should take that into account, and they usually do, and if their game is affected too much by an uncommon aspect ratio, then that's just bad design. Personally, I'm gaming at 5120x1440 and I haven't had any problems yet. Sure, I had to change Witcher 2's binary to support that resolution, but afterwards the game ran fine. With more modern games, like Witcher 3 or Cyberpunk 2077, I didn't have any issues.
Heh -- I wish! I've got to play in a 1920x1080 window -- can't even get it smooth around 50 FPS at 1440p. Sort of miss CRT monitors' ability to resize the whole screen space. But I'm not giving up that detail! :D

But non-standard resolutions are non-standard for a reason. Many don't have any idea how much work is involved in supporting different aspect ratios. Namely, all 2D assets need to be completely rebuilt -- every single thing -- for each aspect ratio the game will use. Once the resolution for an aspect ratio gets past a certain threshold, the assets need to be completely rebuilt for that as well. Again.

That is not only incredibly time consuming, but incredibly expensive, as people aren't going to do that sort of work for free. Even modders tend not to touch that with a 75-foot pole.


It's your SSD, so do whatever you like, but you're probably a bit on the paranoid side; as long as you keep 20-25% free (the more the better), it will be fine.
Well, as I've told many people in the past: the first time they deal with a complete hard-drive crash at precisely the wrong moment, then have no way of recovering critical data for work or something, and even after a reformat and reinstallation of the OS there are so many bad sectors that they still have to deal with ongoing file-system errors until they finally replace the drive, requiring that they go through everything again...

...they'll start ensuring that there's plenty of free space on every drive. 10% is actually cutting it a bit close -- especially with individual files able to exceed 4 GB (movies, music, etc.) nowadays.

I stand by my statement that optimisation is only optimisation if the visual fidelity stays roughly the same. Theoretically, I could remove all NPCs in Cyberpunk and call that optimisation as well. Sure, it is up for discussion what "roughly" means, but if the LOD issue wasn't as pronounced in 1.06, I would call it a downgrade and not optimisation.

To stress the point I've made earlier again: I doubt that those things were really gameplay-related and broke the game fundamentally. I much rather assume this matter is very complicated and that CDPR had neither the time nor the resources to fix it in a proper manner. That being said, I agree with your last statement that since 1.23 this enormous pressure was lifted from their shoulders.

Also, when having a look at the patch notes, CDPR was able to fix a myriad of issues, which also makes the gameplay experience far more enjoyable on PC than it was at launch.
You can stand by that belief if you want, but that's not how it works. Of course, it's ideal if things work out that way, but pick any professional game developer that you want anywhere in the industry, and they'll be able to explain in detail how and why it never works out that way.

The example I gave is not meant to be taken literally -- it was an intentionally simplified example of how various aspects of an engine can connect in ways that a player has no ability to see or understand. If you'd like a real-world example of how obscure and ridiculously difficult this sort of engine issue can be, research the lip-sync bug for Skyrim. It took nearly 5 years of work, if I remember right, long after Bethesda had written it off, for a modder to finally find a work-around (not a true fix). Just for one bug. One that created a terribly distracting issue with the game.

The core of your stance is kind of like trying to argue that if the rocket didn't get to space, then it needs to be a bigger rocket. So, just build a bigger one. That's the way it should have been done to begin with.

That's not how reaching escape velocity works. (And yes, actually, trying to budget resources for a very demanding engine is a lot like rocket science. With fewer explosions. [Not "no" explosions...just...not as many...])
 
That's a long post, and I'm not going to address every little thing, but to head things off: I did bring up bad console ports with forced 30 FPS breaking physics with unexpected results if you make them run faster.
An example I have personal (and extensive) experience with is Morrowind's Gamebryo, which is exactly the same base engine that was used for Oblivion, Fallout 3, Fallout New Vegas, Skyrim, Fallout 4, and Fallout 76.

Physics calculations were based on keyframes that were rendered and drawn to the screen, again ticking complete cycles every 60 frames maximum. If the FPS exceeded 60, or if multiple partial keyframes were drawn (because of unlimited FPS / vsync off), the additional frame data would be treated as a multiplier on the physics calculations and would result in a "physics explosion". (I bump into a cart, and it flips over, sending a cabbage into orbit.)

Bad design there. Nothing to do with graphics details, of course, and trivial to limit the framerate to 60 FPS. That's also something that should be relatively easy to fix; you're saying you saw this with Morrowind, but Skyrim, for example, had an engine upgrade.

It's easiest to understand from that direction, but it can work the other way, as well.

If I fall below a certain FPS threshold, it can prevent certain game functions from happening, or from happening on time. Why is this character not responding? Why is the next part of the mission not starting? Why are all the textures blurry on these NPCs? Why are those crates popping in, like, 5 seconds after I look at them?

Yes, choppy performance due to excessively high graphics settings can cause glitches and crashes; I mentioned this as well. This is somewhat self-correcting, because most people will dial things down if it gets too slow.

And the answer can very likely be: not enough performance overhead. The engine is making calls to render and draw other things, and there's not enough system resource available to handle everything because there are, for example, too many distant LoD assets trying to load during what's supposed to be the "fast" bit of the rendering process. That needs to happen first so that players can see what's happening in front of them before other, more complex functions take over. And if that's slowing things down too much, then other functions can't execute because they're waiting on the go-ahead from the engine.

Now we have a chain reaction: some NPC stares blankly at you and refuses to initiate the quest dialogue because the scripted-scene system can't start, because the distant rendering algorithm is still trying to add specular-map textures to distant walkway railings to make them look wet, because the weather system says it's storming outside. And the only reason that happened is that the player approached from the east instead of the west, north, or south. Because they happened to be looking in that direction, their on-screen backdrop contained over 5,000 additional distant LoD assets that would not otherwise have needed to be rendered and drawn if they had been looking in a different direction at that exact moment in the game.
Now we're getting somewhere. It's interesting you say "script" in relation to LOD; perhaps scripting is running in a single thread, a la Paradox games? You hardly "have to" render the scene before you can run any scripts, not with multithreading going on. Yes, running scripts asynchronously causes additional issues, but it entirely decouples the graphics engine from scripts. Things can get interesting if FPS gets very low, but that's again usually a self-correcting problem. You obviously have to synchronise at some point, which can be time-based and entirely asynchronous, or done while the next frame is rendering.
Just tell the computer to start the dialogue first? Okay, then it will need to suspend the rendering altogether and prevent assets from loading, meaning when that dialogue scene triggers, you'll be staring at void space, or giant pink polygons in the distance.

If that's how the RED engine works, there's indeed a fundamental problem. Are you really running rendering and scripting in a single thread? Even if you did, you're not running thousands of scripts like the Clausewitz engine, so the script execution phase should hardly block rendering for any significant amount of time.
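For reference, the standard decoupling pattern being argued for here is the fixed-timestep accumulator ("fix your timestep"). A sketch of the idea, not REDengine code: simulation and scripts tick at a fixed rate no matter how fast or slow frames render.

```cpp
#include <chrono>
#include <cstdio>

using Clock = std::chrono::steady_clock;

static long gSteps = 0;

// Gameplay: physics, scripts, triggers. Always advances by the same dt.
static void SimulateFixedStep(double dt) { (void)dt; ++gSteps; }

// Visuals: draws the world, optionally blending between the last two
// simulation states by 'alpha' for smooth motion.
static void RenderFrame(double alpha) { (void)alpha; }

int main() {
    const double kStep = 1.0 / 60.0;  // simulation rate, independent of FPS
    double accumulator = 0.0;
    auto previous = Clock::now();
    const auto demoEnd = previous + std::chrono::seconds(1);

    while (Clock::now() < demoEnd) {
        const auto now = Clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run as many fixed steps as real time demands: two per frame at
        // 30 FPS, usually zero or one at 144 FPS. Gameplay timing never
        // depends on how fast frames render.
        while (accumulator >= kStep) {
            SimulateFixedStep(kStep);
            accumulator -= kStep;
        }
        RenderFrame(accumulator / kStep);
    }
    std::printf("fixed steps in one second: %ld (~60 expected)\n", gSteps);
    return 0;
}
```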
 
Taking the Gamebryo engine as an example is a bit of a stretch, because that engine is a mess. No offense to Bethesda, but it has many faults, which in turn enable them to do many object- and NPC-related things. (Many of which Cyberpunk 2077 does not.)

Granted, we do not know if the RED Engine is a mess as well; that's difficult to say, and I assume in a way every engine is a bit of a mess.
No stretch at all. Remember that this was the very first fully 3D, open-world engine to be used for a full-on CRPG experience that was not only capable of running solidly on a console (the original Xbox), but also came with an unimaginably robust construction kit that allowed PC users to build their own games if they wanted to take the time. And that's exactly what was done. It's the engine that turned modding into its own aspect of gaming...which now affects even console games.

The point is that every engine under the sun has limitations that must be worked within. They are not going to make human sense. They are mostly going to make sense within the logical / mathematical construct of the computer's code.

To be perfectly honest with you, every engine I've ever dealt with is a mess. The more creative and innovative a game tries to be, the more (and more obscure) tricks are going to create really weird results as developers fiddle and experiment to get the effect they're looking for.

This is why game development is an Art -- not a Science.

Can you back up this claim? Because I have a hard time believing it, given that Inquisition uses the Frostbite engine, which would mean many other Frostbite games would have the same issue, like Battlefield. Likewise, Mass Effect 1 uses a different engine than 2 and 3. Mass Effect 2 and 3 use Unreal Engine 3, which would also mean that most games of that era would have the very same problem.
Sure, for all of them. Any of these things can be researched in only a few minutes online. Start with the studio's forums, for most of them...looking for conversations a lot like this one:
  • Dark Souls 1-2: DS 1 had severe collision issues that would result in the player falling through the world, especially when climbing ladders. DS 2 had weapon degradation tied to 30 FPS. If you played at 60, your weapons wore out twice as fast as they should have (and the ladder clipping issue remained).
  • Dragon's Dogma: Both weather updates for the overworld and several magical spells had their damage and hit proccing tied to 60 FPS max. (Also, while the engine was capable of natively generating shadows on grass, doing so could tank performance in certain scenes to <10 FPS and/or cause crashing on the highest-end hardware at the time of release. Not FPS glitching, but a cool graphical feature that was disabled by default for stability and performance purposes.)
  • Dragon Age: Origins through Inquisition, Mass Effect 1-3: Issues with both frame timing for in-engine cinematics and music. Potentially severe stuttering and crashing if FPS exceeded 60. Inquisition's scenes are a nightmare to get working correctly to this day, requiring a manual frame limit of something like 59.92 to avoid the most severe freezing and stuttering during cutscenes, though some stuttering still occurs, regardless, on virtually all PC systems.
  • Halo 1-3: Clipping issues as well as being able to launch yourself outside the borders of the map using the jumping platforms if you were over 60 FPS. (I believe this was eventually fixed on PC.)
  • Supreme Commander and Forged Alliance: Both AI pathfinding and hit-detection / damage proccing for area-of-effect or continuous-fire weapons were very unreliable at FPS over 60. Might do no damage...might do some damage...might do extra damage. It all depended on how many frames were dropped, registered, or duplicated. No idea if this is fixed or not; haven't played in years.
  • Total War 1 - Medieval 2: Morale checks, pathfinding, and AI routine updates could be severely borked if FPS exceeded 60. Could result in instability on certain maps (crashing). Also the cause of AI units running aimlessly back and forth without engaging the enemy.
As for Frostbite itself -- go research the sheer amount of work that needed to be done to adapt it to other gameplay genres. It was a nightmare. Very similar to the struggle 2K had trying to get Unreal Engine 3 adapted to XCOM. And yes, the work done on each such adaptation of the engine will result in unique bugs and glitches that do not appear on any other game using that same engine.

While I don't dispute that game engines are peculiar and difficult things to manage (after all, I have read Mile's retweet on Twitter), I still hold the assumption that the graphics system is, and should be, decoupled from everything else as much as possible. I don't argue that sometimes this might not be possible, and other times bugs exist that shouldn't exist, but if everything were that closely coupled, things like the Unreal engine would probably be far less popular. Also, regarding Mile's retweet, it should be mentioned that the problems stated by this developer are first and foremost physics-related, which does make sense.

Also, another reason those things are not that strongly related is that NPC behaviour is calculated on the CPU, whereas the graphical part is done on the GPU. (I know, draw calls have to be made, so the CPU is involved as well.)
Okay...here's a paradox in exchange:
I want to ensure that gameplay commands, music, PhysX effects, animations, lighting, shadows, and all gameworld interactions are synced up to provide visual feedback to the player. Everything you do will create a visible reaction in the game and create a seamless, cinematic experience.

But don't connect any game functions to the visuals.

Good luck, people! Have fun! Teamwork!

Actually, an engine is just a toolset, and if necessary certain tools are redesigned from the ground up. For instance, raytracing is completely different from rasterisation.
Ah, I think you're confusing a "rendering engine" with a "game engine". It is possible to use some rendering engines alone (Unreal, Quake, Unity, etc.) and build them into your own game engine. This is actually what they were designed for -- to be licensed to outside studios.

Other engines are comprehensive, like REDengine, Gamebryo/Creation Engine, Anvil, etc. They were built specifically to function as part of a singular game engine. Another studio may choose to build their game using the entire game engine, but it would not be worth the effort -- or would even be impossible -- to use only the renderer with another game engine, or to replace the existing rendering engine with something else. A studio would basically have to build their own engine from scratch on the concepts they wanted, allowing them to add functionality that wasn't possible by default. Might as well just make your own engine at that point...or build around an existing rendering engine.

Again, I do doubt that this was done to allow the game to execute normal/necessary functions properly and in time.
Okay. What do you think it was for then?

I fear that the RED Engine might have a fundamental problem with its draw distance, and it might very well be that it breaks if too many assets are loaded, or that the performance requirements increase disproportionately.

Similar to the Witcher 2 dithering issue, which was also engine-related and probably could not have been solved for that iteration of the RED Engine.
Truth! And this has been the case throughout. It's a very robust engine, but it's definitely not one of the fastest. This is why there were exactly the same type of rendering concerns for TW3, and it took as long as it did to finally sort out. It's ambitious, and the devs obviously try to take full advantage of its capabilities. However, as we see, it's possible to overreach without meaning to. This is also part of any creative process.

This is where the RED Engine shines: in the small details you are able to see everywhere. Then it can truly be breathtaking and magnificent, and luckily, from my experience, it is most of the time.
I agree. I absolutely adore Night City. I've said before, it's the most realistic feeling city I've ever experienced in a game. GTA may come close in macro detail and expression, but nowhere near the micro detail that CP2077 creates. It's incredible.

But it's not perfect. What is? Work is ongoing!
 
Okay...here's a paradox in exchange:
I want to ensure that gameplay commands, music, PhysX effects, animations, lighting, shadows, and all gameworld interactions are synced up to provide visual feedback to the player. Everything you do will create a visible reaction in the game and create a seamless, cinematic experience.

But don't connect any game functions to the visuals.

Good luck, people! Have fun! Teamwork!

A false premise there. Decoupling game functions from visuals does not mean removing any connection whatsoever; synchronization between the two can be done in many ways. I'm an embedded programmer, among other things. I design systems to run on a heartbeat triggering once a millisecond or so to process things, on top of asynchronous events from inputs. That's obviously not going to work on something with a thousand times more code, but the basic principle is there. Or you can synchronize scripts while the next frame is rendering. That gives your game world a 1-frame delay, which is "whatever" for most people outside the esports, amphetamine-amped manic-rabbit demographic.
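A toy version of that "synchronize while the next frame renders" idea (invented types, kept single-threaded here for clarity): the simulation fills snapshot N while the renderer draws snapshot N-1, with one swap per frame as the sync point.

```cpp
#include <cstdio>
#include <utility>

// Invented types -- a sketch of the handoff, not any particular engine's code.
struct WorldSnapshot { int frame = 0; /* transforms, animation state, ... */ };

static void Simulate(WorldSnapshot& out, int frame) { out.frame = frame; }
static void Render(const WorldSnapshot& s) { std::printf("drawing frame %d\n", s.frame); }

int main() {
    WorldSnapshot bufA, bufB;
    WorldSnapshot* writeBuf = &bufA;  // simulation's target: frame N
    WorldSnapshot* readBuf  = &bufB;  // renderer's source: frame N-1

    for (int frame = 1; frame <= 3; ++frame) {
        // In a threaded engine these two calls run in parallel on different
        // cores: scripts never wait on the GPU, and the renderer never waits
        // on scripts -- it always has a complete snapshot to draw.
        Simulate(*writeBuf, frame);
        Render(*readBuf);
        std::swap(writeBuf, readBuf);  // the single sync point per frame
    }
    return 0;
}
```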

Yes, it can be hard to implement, and it may not be plausible at all depending on how the particular engine was written, but to say "it's unpossible, you're daft to suggest such a thing!" is not a valid premise. A bit like Paradox apologists coming up with excuses for why it's impossible to run the damn scripts on threads -- every complication is like a mountain that couldn't possibly be scaled, as if multi-threading were a novel problem nobody had tackled. Yes, that engine has been around since 2007 in its earliest form, and the event-handling core would have to be torn apart to make it run multi-threaded, but it's just silly to say "It's impossible! What if unit x in combat gives wrong results because script z running weather didn't get updated before the tick?!"... A significant investment of labour, for sure.
 