I'm obviously not talking about developing IN the console. I wasn't aware that needed to be spelled out.
Yes, they're made on a PC, but the dev kit is the "console environment" I'm talking about. Working off the dev kits isn't really porting; there's a reason they're called PC ports and not the other way around.
Which goes back to the point: they're developed through a different process than PC games because of the environment they're working in. Lazy ports just take that code and make it work on PC rather than doing separate development. There's a reason games like Metro and TW3 pushed graphical boundaries on PC while games like Darksiders didn't even have graphics options beyond resolution. Consoles certainly didn't stop Metro from tossing heaps of tessellation into Last Light when the tech was new, and they didn't stop TW3 from trying to implement HairWorks, etc.
You're right on controls; the number of buttons on a gamepad limits things. But really, outside of sims, this doesn't matter at all. There isn't a need for more functions than that, and frankly, I don't want to play a game where I need 50 key bindings to cover all my bases.
Those resolutions weren't even accounted for in any games until the past few years. You're forgetting how new ultra-HD resolutions are; they've only just now become mainstream. PC-exclusive games weren't even developed with them in mind until recently because the tech wasn't there yet.
The biggest limiting factor on that was VRAM anyway, and consoles absolutely did NOT stop GPU makers from shoving more VRAM in. Once higher resolutions started becoming a thing, you saw how quickly the GPU makers boosted VRAM: everything was 2-4 GB for a long time, and we shot up to 12 real fast. To act like consoles were the limiting factor is misguided. You have to remember that the GPU makers are always pushing tech, and they do a lot more than just consumer graphics cards. Tech has come a LONG way since the Xbone and PS4 launched. Consoles haven't held back much; it's just the uneven way the tech has developed. Blaming consoles for tech stagnation while devs were still figuring out how to use the ever-expanding mass of CPU threads they had, plus all the new features each subsequent API introduced, just isn't fair.
Do you even play console games often? Putting the PS4 versions side by side with the PC version shows a stark difference in many cases.
That's a terrible example. Crysis was terribly optimized; you couldn't run it well on anything because they never optimized it properly. Crysis Warhead took most of the same tech and made it run twice as smoothly in about a year's time.
Crysis wasn't a good example of games pushing boundaries in a healthy way; it was a great case of devs wildly overestimating their ability to optimize features and completely overestimating what the current hardware was capable of. They didn't succeed in pushing graphics so much as they flew too close to the sun and managed to turn it into a marketing point. They dumbed their own engine down before it ever touched consoles, because even they knew they went too far.
In 2009, all the cards were operating on the same playing field, though. This is new tech that AMD doesn't have out yet; you can't compare something entirely different to AMD's stuff. It makes far more sense to compare a 1080 Ti. Wait until AMD has a ray-tracing card to compare the 2080 against.
This is like comparing the price of an AGP card to PCIe when PCIe was brand-new tech. It's just the price you pay to be an early adopter, and it isn't really a relevant comparison for matching price points.
You are either young and didn't experience many of the things you talk about (which you shouldn't do, btw), or you are ignorant.
Back then, 2K resolutions were possible. You could play Crysis or STALKER, for example, at those resolutions, though of course performance was bad. That was 11 years ago, not a few years ago.
Also, back then gamers didn't even play at 1080p because PC-exclusive games were too demanding overall. And VRAM was not the limiting factor, because the games had low-res textures. Besides, VRAM is irrelevant if your GPU can't render everything at an acceptable framerate; you can put 50 GB of VRAM on a weak GPU and it's going to be useless.
Crysis is an excellent example because even today it can be compared to current games; basically it only needs higher-res textures. It was so far ahead of everything, hence the heavy performance cost.
And CryEngine was never dumbed down. What was dumbed down was Crysis 2/3, because the consoles were not capable of running large maps with that level of graphics, which is exactly what I'm talking about here.
The reason you can play today's games at 2K/4K resolutions is that graphics are stagnating because of consoles, and publishers don't want their games to look better on a much stronger PC. One example is Skyrim, which looked like a joke compared to Crysis-level graphics even though it was a much newer game. It didn't even have ambient occlusion, which would have been perfectly playable on a PC.
It's very simple: today you have much stronger GPUs than the consoles, you have console graphics, so you just add a lot of VRAM and play at 2K/4K.
And btw, Metro and TW3 were not graphical revolutions in any way.
Again, CryEngine was never dumbed down or simplified or anything similar. I know this because I work with CryEngine.
About the prices, here is a good video: