Bad design there. It has nothing to do with graphics details, of course, and it's trivial to limit the framerate to 60 FPS. That should also be relatively easy to fix -- you say you saw this with Morrowind, but Skyrim, for example, got an engine upgrade.
It did have the 60 FPS ticker for Morrowind. (Just like it's been a part of every Bethesda game since. Here's the issue in Fallout 76. The YouTuber understands how to affect it, but doesn't really understand its purpose.) But, for Morrowind, if you could get any game to run at a steady 60 FPS at that point, you had a monster PC rig. Plus, there were no PhysX sorts of things in Morrowind to mess up -- just animation packages and the game's own scripts (Papyrus didn't arrive until Skyrim). I'd disagree that it was in any way a bad design -- it made perfect sense to do it that way at the time. That's how all computer games had operated up to that point, ever since the beginning of computers themselves.
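To make the trade-off concrete, here's a hedged sketch (not Bethesda's actual code) of why a frame-tied engine breaks at high refresh rates: if every rendered frame advances the simulation by a fixed baked-in tick, the amount of game time simulated per real second scales with the frame rate.

```python
# Hypothetical frame-tied engine: each frame advances a fixed timestep,
# whether the frame actually took 1/60 s of real time or not.

ASSUMED_DT = 1.0 / 60.0  # the engine's baked-in tick length, in seconds

def simulated_seconds(real_fps, real_seconds=1.0):
    # Total game time advanced during `real_seconds` of wall-clock time.
    frames = real_fps * real_seconds
    return frames * ASSUMED_DT

print(simulated_seconds(60))   # ~1.0 -> game speed is correct
print(simulated_seconds(144))  # ~2.4 -> physics runs 2.4x too fast
```

At 144 Hz the same code simulates 2.4 seconds of game time per real second, which is exactly the kind of runaway-physics behaviour people see when they uncap these engines.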
Earlier on, it had been processor cycles that dictated game speed... meaning that when processors became much, much more powerful and faster, older games became unplayable because they would run at hundreds of "ticks" per second. Hardware features like the Turbo button (which, switched off, slowed the CPU to a compatible speed) and emulators like DOSBox were created specifically to handle this, throttling processor cycles so older programs could run at their intended pace.
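The cycle-throttling idea can be sketched roughly like this (names invented; a simplification of what DOSBox's `cycles` setting actually does): the emulator runs a fixed budget of emulated instructions per real millisecond, so the old game's speed no longer depends on how fast the host CPU is.

```python
# Hedged sketch of cycle throttling: execute a fixed number of emulated
# instructions per real millisecond, regardless of host CPU speed.

def emulate_millisecond(cpu_step, cycles_per_ms):
    # Run exactly cycles_per_ms emulated instructions; the caller then
    # waits until the next real millisecond before the next batch.
    for _ in range(cycles_per_ms):
        cpu_step()

# Usage: roughly 4770 cycles/ms approximates a 4.77 MHz 8088.
executed = []
emulate_millisecond(lambda: executed.append(1), 4770)
print(len(executed))  # 4770 -- the same count on any host machine
```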
Who ever expected that computers would get so crazily fast so quickly?
Following that, Bethesda's solution was pretty slick, ensuring that their engine would run off of something that any computer at the time (and every computer now) had total control over -- the refresh rate of their display and the draw rate of their GPU. No worries at all in the future if hardware advanced -- the games could still be made to function perfectly.
Who ever expected that 1080p resolutions and 100+ FPS gaming with 1 ms latency would become the norm?
And here we have the core distinction of this discussion about managing a game engine.
When it is made to function a certain way, then that's the way it functions. If I try to make it do something beyond its capabilities, I'm almost guaranteed to see errors. (This happens to the devs, sometimes, too!) The only solution is to put things into a place that is cooperative with the engine. And the types of errors I see might be caused by things that don't seem logically connected. (Who would ever have thought that the reason the stuff on that table went flying like a grenade went off as I sat down... is because the refresh rate of my monitor was set to 144 Hz?)
Also, it's not simple to fix these things -- often impossible, unless the entire engine is rebuilt. I think Beth managed to tweak it from Skyrim onward so that, at 144 Hz, the default vsync automatically drops to half refresh rate (72 FPS), which avoids the most glaring issues.
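That half-refresh behaviour can be sketched as a divisor search: sync to every 2nd, 3rd, ... refresh until the frame rate is at or below a safe cap. The 72 FPS threshold here is my assumption, chosen to match the behaviour described above.

```python
# Hypothetical vsync-interval picker: present every Nth refresh so the
# effective frame rate stays within the engine's comfort zone.

def vsync_cap(refresh_hz, max_safe_fps=72):
    interval = 1
    while refresh_hz / interval > max_safe_fps:
        interval += 1  # skip to every 2nd, 3rd, ... refresh
    return refresh_hz / interval

print(vsync_cap(60))   # 60.0 -> full refresh rate
print(vsync_cap(144))  # 72.0 -> half refresh rate
print(vsync_cap(240))  # 60.0 -> quarter refresh rate
```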
Yes, choppy performance from graphics settings that are too high can cause glitches and crashes; I mentioned this as well. It's somewhat self-correcting, because most people will dial things down if the game gets too slow.
Choppy performance is not what I'm discussing. (I may have gotten carried away with the hypotheticals in my example.) I'm talking about elements of any engine that require a sync with some other part of the engine. It has nothing to do with whether my FPS is "good" or "bad". (That's why games have minimum specs.)
All engines are built to process things in specific orders. (No, these cannot be changed around willy-nilly -- they are what make the engine function. This is why there are different engines that are better or worse at different things.) One engine might be really good at processing visuals really, really quickly. (There's your Unreal and Quake engines.) Others may be excellent at handling outrageous amounts of AI scheduling. (Think of Lua-scripted engines, or Creative Assembly's in-house Total War engine.) Others may simply be very robust jacks-of-all-trades/masters-of-none. (Like Unity.) There are no defined categories. You either pick an engine that does what you need... or you build an engine from scratch to do what you need.
Once that's done, I don't get to simply say (as a fictional example), "Oh, my StratMaster engine doesn't create the graphics I want. Uhm, tell it to prioritize graphics, not pathfinding." Well... that's... not possible. That's not how this engine functions. It always does pathfinding first, because that's what the engine was built to do. If I want to free up more resources for graphics, I need to remove elements that are creating pathfinding load. And it does some really intense pathfinding, man. If I want my desired level of graphical result, I may need to remove 70% of all of these military units, and pull helicopters completely out of the game. It's not possible to deliver that level of graphics with the game I've built -- the engine won't allow for it.
So, a balancing act begins. Give and take. If I'm seeing an issue in one area, I can tweak and nudge things around and make it better -- but I can't just take my StratMaster engine, built for really complicated pathfinding and AI strategy routines, and simply make it into the Unreal Engine because I want to.
That's like saying I'm going to take that semi-truck and give it a higher top speed and better handling than a Ferrari. That's not what the machine is designed to do. The only thing I can do is make it a bit faster and handle a bit better. But that also means I'll need to use a smaller trailer, and I won't be able to haul as many goods at once. I need to drop weight to increase acceleration and handling.
That's a pretty direct analogy to how a game engine works. It's built for a particular task, and it handles it a certain way. I can tweak, but I must work within the limitations of the machine's core design. A semi, a motorboat, and a sportscar are not capable of the same things, no matter how much I think they should be.
Now we're getting somewhere. It's interesting you say "script" in relation to LOD; perhaps scripting runs in a single thread, a la Paradox games? You hardly "have to" render the scene before you can run any scripts, not with multithreading going on. Yes, running scripts asynchronously causes additional issues, but it entirely decouples the graphics engine from the scripts. Things can get interesting if FPS gets very low, but that's again usually a self-correcting problem. You obviously have to synchronise at some point, which can be time-based and entirely asynchronous, or done while the next frame is rendering.
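(For reference, the asynchronous-scripts-with-a-sync-point idea above can be sketched like this; every class and method name here is invented for illustration, not taken from any real engine.)

```python
import threading

# Scripts run on their own thread; the renderer reads a consistent
# snapshot at a single locked sync point once per frame.

class ScriptRunner:
    def __init__(self):
        self._lock = threading.Lock()
        self._state = {"tick": 0}

    def run_scripts(self, ticks):
        # Runs independently of rendering; only the handoff is locked.
        for _ in range(ticks):
            with self._lock:
                self._state["tick"] += 1

    def snapshot(self):
        # The renderer calls this once per frame -- the sync point.
        with self._lock:
            return dict(self._state)

runner = ScriptRunner()
worker = threading.Thread(target=runner.run_scripts, args=(10_000,))
worker.start()
mid_frame = runner.snapshot()  # render thread reads whatever is done so far
worker.join()
print(runner.snapshot()["tick"])  # 10000 once the script thread finishes
```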
No...that's nowhere close to what I was saying. It has nothing to do with single versus multi-threading -- that's not even something that's engine bound. Pretty much any engine can be coded to take advantage of it, and most games, believe it or not, use no more than 2 cores at once, regardless of how many are available. Many games today continue to use only one core.
Graphics are not handled through "scripts". Scripts are a gameplay function. Graphics are controlled by drivers. But a game engine will always require that certain other functions, like graphics, or sound, or AI scheduling, or whatever, be processed in a certain order. Computers don't "think". They don't "figure things out". They must be specifically instructed to do every single thing they do. And don't think about this like telling a computer to, "Pick up that cup, then sweep the floor." Coding requires more like, "Identify 'Cup'. Rotate 37° to face 'Cup'. Advance 0.47 m. Rotate right wrist 90°. Extend right digit fingers 1-4 by 45°. Extend right thumb by -45°..." and so on.
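A playful sketch of that point -- nothing "figures itself out"; every sub-step must be spelled out. The Robot class and the specific commands are invented for illustration.

```python
# Every low-level command must be issued explicitly; the machine does
# none of the decomposition for you.

class Robot:
    def __init__(self):
        self.log = []  # record of every explicit command received

    def command(self, name, *args):
        self.log.append((name, *args))

def pick_up_cup(robot):
    # "Pick up that cup", decomposed the way code actually requires:
    robot.command("identify", "Cup")
    robot.command("rotate_deg", 37)
    robot.command("advance_m", 0.47)
    robot.command("rotate_wrist_deg", "right", 90)
    robot.command("extend_fingers_deg", "right", (1, 2, 3, 4), 45)
    robot.command("extend_thumb_deg", "right", -45)
    robot.command("close_grip", "right")

r = Robot()
pick_up_cup(r)
print(len(r.log))  # 7 explicit steps just to grab a cup
```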
I don't take 1,000,000+ lines of code and "just do this part totally differently". Any game, once built, needs to work within the confines of the engine and existing code, and that means only certain things are possible. It requires the devs to be clever. To think outside the box. To find ways of making certain processes much simpler... without letting the result appear any different. Like this:
If I have an issue, I need to take something from one area and give it to another. I don't get to create "chocolate" out of thin air. It needs to come from somewhere.
Hence, the amount of "chocolate" taken from the graphical rendering for whatever was needed is pretty noticeable at the moment. Over time, the "chocolate" that is left can be better arranged to make it seem like the whole "bar" didn't get any smaller. In other words, using the graphical budget that remains, the individual graphical elements can be tweaked and fiddled with until they feel more smooth.
Thus, for a random, hypothetical example: if I have an issue where a quest script will not trigger because a required graphical asset cannot load in time (which, yes, can happen -- not saying it did happen that way, just saying it's an example scenario), I must take processing time away from the graphics in order to free up that processing time for the quest script. If the engine needs the graphical element to be finished first, then that's what's needed. I have to work within that confine to solve my script issue. Graphics have to lose. But that doesn't mean I can't polish them up with what wiggle room is left.
Hopefully, that's more clear.