LOL!
All games, including console games, are made on a PC and then ported to the consoles through specific dev kits. Game development in the console environment does not exist. What exists is game development for the console environment.
I'm obviously not talking about developing IN the console. I wasn't aware that needed to be spelled out.
Yes, they're made on a PC, but the dev kit is the "console environment" I'm talking about. It's not really a port when you work off of the dev kits. There's a reason they're called PC ports and not the other way around.
But the problem is that many ports are a product of laziness. And it's not only graphics, but also the gameplay elements that must be developed for the consoles, gamepads for example.
Which goes back to the point: they're developed in a different process than PC games because of the environment they're working in. Lazy ports just take that code and make it work on PC rather than having separate development. There's a reason games like Metro and TW3 pushed graphical boundaries on PC and games like Darksiders didn't even have graphics options beyond resolution. Consoles certainly didn't stop Metro from tossing on heaps of tessellation in Last Light when the tech was new, and they didn't stop TW3 from trying to implement HairWorks, etc.
You're right on controls, the number of buttons on a gamepad limits things, but really, outside of sims, this doesn't matter at all. There really isn't a need for more functions than that, and frankly, I don't want to play a game where I need 50 key bindings to cover all my bases.
Again, graphics are stagnating only because of the consoles, and that's a fact. This is why people today can play these multiplatform games in 2K or 4K, which wasn't even close to possible when games were PC exclusives.
Those resolutions weren't even accounted for in any games until the past few years. You're forgetting how new ultra HD resolutions are; they've only just now become mainstream. PC-exclusive games weren't even developed with them in mind until recently because the tech wasn't there yet.
The biggest limiting factor on that was VRAM anyway, and consoles absolutely did NOT stop GPU makers from shoving more VRAM in. Once higher resolutions started becoming a thing, you saw how quickly the GPU makers started boosting VRAM. Everything was 2-4 gigs for a long time and we shot up to 12 real fast.

To act like consoles were the limiting factor is misguided. You have to remember that the GPU makers are always pushing tech, and they do a lot more than just consumer graphics cards. Tech has come a LONG way since the Xbone and PS4 launched. Consoles haven't held back much; it's just the uneven way in which the tech has developed. To blame consoles for tech stagnation when devs were still figuring out how to use the ever-expanding mass of CPU threads they had, plus all the new features each subsequent API came out with, just isn't fair at all.
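To put rough numbers on the VRAM-vs-resolution point, here's a quick back-of-the-envelope sketch in Python. The 4-bytes-per-pixel (RGBA8) single-buffer figure is a simplifying assumption for illustration; real games allocate many render targets plus textures on top of this, which is why 4K pushed VRAM requirements up so sharply:

```python
# Back-of-the-envelope: memory for one RGBA8 color buffer at common resolutions.
# Assumption: 4 bytes per pixel, single buffer only (real usage is far higher).

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

BYTES_PER_PIXEL = 4  # RGBA, 8 bits per channel

for name, (w, h) in RESOLUTIONS.items():
    mib = w * h * BYTES_PER_PIXEL / (1024 ** 2)
    print(f"{name}: {w}x{h} -> {mib:.1f} MiB per RGBA8 buffer")
```

Note that 4K has exactly four times the pixels of 1080p, so every per-pixel buffer quadruples in size, and that multiplies across however many buffers the renderer keeps in flight.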
Do you even play console games often? Putting the PS4 versions side by side with the PC version shows a stark difference in many cases.
The best example of this is 2007's Crysis.
That's a terrible example. Crysis was terribly optimized; you couldn't run it well on anything because they never optimized it properly. Crysis Warhead took most of the same tech and made it run twice as smoothly in about a year's time.
Crysis wasn't a good example of games pushing boundaries in a healthy way; it was a great case of devs way overestimating their ability to optimize features and completely overestimating what the current hardware was capable of. They didn't succeed in pushing graphics so much as they flew too close to the sun and managed to turn it into a marketing point. They dumbed their own engine down before it ever touched consoles because even they knew they went too far.
I can point to the RTX 2080 because in 2009 I paid 450 euros for the fastest Nvidia graphics card, which was the fastest graphics card back then. Now the fastest card costs 1300 euros and above.
In 2009, all the cards were operating on the same playing field, though. Ray tracing is new tech that AMD doesn't have out yet, and you can't compare something entirely different to AMD's stuff. It makes far more sense to compare a 1080 Ti. Wait until AMD has a ray tracing card to compare the 2080 against.
This is like comparing the price of an AGP card to when PCIe was brand new tech. It's just the price you pay to be an early adopter, and it isn't really a relevant comparison to match price points.