But the game was not planned for the PS5 and Series X; it was planned for the PS4 and Xbox One.
When the game was announced, neither the PS5 nor the Series X existed. Part of why people are frustrated with this game may be the teaser, released 12 years earlier, in which they said "it will be launched when it's ready," a promise that obviously wasn't kept.
I am in the camp that they should have abandoned concurrent console development years ago and ported to console afterward. They clearly over-estimated the hardware capabilities of the then-current generation of consoles from the start of 2077's development. The primary reason for the shite performance, aside from the integration of ray tracing and the absurd number of scripts that appear to be firing improperly, is the number of shadow-casting objects packed into the same area. Fallout 4 showcases this directly on every system, including PC: raise shadow Quality/Distance above low/medium settings on any card, and downtown Boston causes absurd frame drops due to the sheer quantity of shadow-casting objects and the amount of lighting geometry that has to be calculated.

2077 was absurdly ambitious, and people under-estimated the time it would take because of goodwill CDPR had earned on much, much simpler games in terms of graphics quality and complexity. This was a PR nightmare from the start, with certain individuals at CDPR and among shareholders blatantly disregarding the input they got from developers.
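The shadow cost described above can be sketched with a toy model (everything here is illustrative, not data from any actual engine): with basic shadow mapping, each shadow-casting light triggers an extra geometry pass over every caster in its range, so the cost in a dense area grows roughly with lights × casters, not with either count alone.

```python
# Toy model of shadow-map cost: one extra "draw" per (light, caster-in-range)
# pair. Numbers and layout are made up purely to illustrate the scaling.

def shadow_pass_draws(lights, casters, radius):
    """Count extra draw calls: one per shadow-casting light/caster pair in range."""
    draws = 0
    for lx, ly in lights:
        for cx, cy in casters:
            if (lx - cx) ** 2 + (ly - cy) ** 2 <= radius ** 2:
                draws += 1
    return draws

casters = [(x, 0) for x in range(100)]        # 100 casters along a street

lights_sparse = [(0, 0)]                      # suburb: one light
print(shadow_pass_draws(lights_sparse, casters, radius=10))   # 11 pairs

lights_dense = [(x, 0) for x in range(10)]    # downtown: 10 overlapping lights
print(shadow_pass_draws(lights_dense, casters, radius=10))    # 155 pairs
```

Real engines cull and cache far more aggressively than this, but the pairing effect is why cranking shadow distance in a caster-dense area (downtown Boston in Fallout 4, Night City's stacked geometry) costs disproportionately more than the same setting does in a sparse one.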
There is an argument to be made here that not only do studios need to be more realistic about what they promise in terms of game performance and quality, but consumers need to be aware of what the limitations of their systems actually are. I ignored the hype videos prior to release until after I had played the game, then went back and watched them this week. It is blatantly obvious to anyone keeping up with hardware specs, hardware limitations, and best development practices that they were massively over-ambitious in what they promised: roughly a quarter of the promised systems were unimplementable at the visual quality consumers expected, such as 4K@60, or 1080p@60 with ray-traced lighting. Hell, anyone who has attempted to make a mod for Skyrim or Fallout 4, or one of the graphics mods for Minecraft, is well aware of many of the limitations of current coding practices in open-world games and their impact on performance.
Even with those limitations in place, CDPR's dev team has done a fucking amazing job. Prior to patches 1.05/1.06 I couldn't run the RT Psycho setting at all while maintaining stable framerates on my RTX 2080 Super; now I can, likely thanks to the optimization work they are doing. I still don't think it's going to run as well, or look as pretty, on console, but it should approach playable territory pretty quickly. Even the PS5 and Xbox Series X|S aren't going to look as good as PC, nor run as well above 1080p; that expectation is unrealistic. The scale of the game is too large and the scenes are too complex for that. Those systems sit somewhere between a GTX 1080 and an RTX 2080 in actual hardware performance, which now makes them low-to-mid-range PC equivalents. They run as well as they do because of software specifically designed for them, internal upscaling systems, and locked-in low-to-mid-range graphics settings. They will never approach the graphics fidelity or stability of a PC at high resolutions with high shadow and lighting settings. Instead they will focus, as they have continued to focus, on mid-range graphics settings upscaled via built-in acceleration for HDR and 4K; native 4K remains performance-heavy even for high-end systems.