I don't know, I've been on consoles since the early '90s and rarely had problems with games. At least until the 2010s and the ability to patch games after launch (and even then, at worst the game was good enough within a couple of weeks). But, you know, closed systems mean every user has (within the physical differences that come from hardware manufacturing and aging) the same experience, plus devs don't have to consider the billions of possible combinations you have on PC. So the situation is easier there, if you will.
P.S. Thank god indie devs know their limits and, when talented enough, deliver gems. The problem comes when they try to overreach. I can cite the very recent example of Baldo.
That was a major reason consoles took off. Unified hardware and software. It both limited the resources that developers had to use and, at the same time, created a solid framework for them to work within. That meant I might not be able to pull off everything I wanted with my game, but I could put together a finished title in far less time with fewer issues.
That was how it started...
...and then not only did consoles start becoming far more complex and robust, but they also started becoming non-standardized, with different makes and models of hardware and software being used in different regions. While they all have the same specs, they're not all manufactured using exactly the same parts...
...and on top of that, there are now far more consoles and generations of consoles that come out a lot more regularly than in the past...
...and that technology becoming more complex means that there are more moving parts in every machine. Building graphics for a 16-bit, 2D sprite system is not the same thing as building modern real-time, 3D procedural rendering that utilizes multiple layers of textures and shaders and runs on independent graphics processing units, which must be able to communicate with other hardware in the system to interpret the calls a game is making.
Games also weren't 1 million+ lines of code in the past. In-game functionality has increased exponentially since the '90s. A complete RPG might have been 100,000 lines of code back then. Nowadays, that block of code might be just the part of the engine that controls the camera. Every added piece is something that needs to be connected to various other parts, and every junction is a new opportunity for something to go wrong.
So even if writing for consoles is still potentially "easier" than writing for PC, it has still become a monumental task that requires huge teams of developers to pull off a big project. If we're talking about 20 people total working in a single office -- we're not going to be producing something like Cyberpunk 2077. We're going to be producing something like Shadowrun Returns. Those can be fantastic games -- but they're not on the same level of development.
And look at how much patching those smaller games still need.
It's not a matter of competence. It's not a matter of greed. It's a matter of the sheer complexity of the task. The options are the same as anything else in life: we play it safe, keep it small, and do something that's been done before...or we branch off, take risks, and know full well that challenges might spring out of thin air. One of these results in simpler games that can still be very good but are less likely to "wow!" people...the other is ambitious and likely to get the spotlight, but also prone to unexpected problems that can result in failures along the way.
No risk, no reward. No pain, no gain.