Is this game only optimized for Nvidia GPUs?

I'm obviously not talking about developing ON the console. I wasn't aware that needed to be spelled out.

Yes, they're made on a PC, but the dev kit is the "console environment" I'm talking about. Building off the dev kits isn't really porting. There's a reason the PC versions are the ports and not the other way around.



Which goes back to the point: they're developed through a different process than PC games because of the environment they're working in. Lazy ports just take that code and make it work on PC rather than getting separate development. There's a reason games like Metro and TW3 pushed graphical boundaries on PC while games like Darksiders didn't even have graphics options beyond resolution. Consoles certainly didn't stop Metro from tossing heaps of tessellation into Last Light when the tech was new, and they didn't stop TW3 from trying to implement HairWorks, etc.

You're right on controls: the number of buttons on a gamepad limits things. But really, outside of sims, this doesn't matter at all; there isn't a need for more functions than that. And frankly, I don't want to play a game where I need 50 key bindings to cover all my bases.



Those resolutions weren't even allowed for in any games until the past few years. You're forgetting how new ultra-HD resolutions are; they've only just become mainstream. PC-exclusive games weren't even developing with them in mind until recently, because the tech wasn't there yet.

The biggest limiting factor on that was VRAM anyway, and consoles absolutely did NOT stop GPU makers from shoving more VRAM in. Once higher resolutions started becoming a thing, you saw how quickly the GPU makers started boosting VRAM: everything sat at 2-4 GB for a long time and then shot up to 12 GB very fast. To act like consoles were the limiting factor is misguided. You have to remember that the GPU makers are always pushing tech, and they do a lot more than just consumer graphics cards. Tech has come a LONG way since the Xbone and PS4 launched. Consoles haven't held back much; it's just the uneven way the tech has developed. Blaming consoles for tech stagnation, when devs were still figuring out how to use the ever-expanding mass of CPU threads they had and all the new features each subsequent API introduced, just isn't fair.

Do you even play console games often? Putting the PS4 versions side by side with the PC versions shows a stark difference in many cases.

That's a terrible example. Crysis was terribly optimized; you couldn't run it at anything because they never optimized it properly. Crysis Warhead took most of the same tech and made it run twice as smoothly in about a year's time.

Crysis wasn't a good example of games pushing boundaries in a healthy way. It was a great case of devs overestimating their ability to optimize features and completely overestimating what the current hardware was capable of. They didn't succeed in pushing graphics so much as they flew too close to the sun and managed to turn it into a marketing point. They dumbed their own engine down before it ever touched consoles, because even they knew they went too far.



In 2009, all the cards were operating on the same playing field, though. This is new tech that AMD doesn't have out yet; you can't compare something entirely different to AMD's lineup. It makes far more sense to compare against a 1080 Ti. Wait until AMD has a ray-tracing card to compare the 2080 against.

This is like comparing the price of an AGP card to when PCIe was brand-new tech. It's just the price you pay to be an early adopter, and it isn't really a relevant comparison for matching price points.

You are either young and didn't experience many of the things you talk about, which you shouldn't do, by the way, or you are ignorant.
Back then, 2K resolutions were available. You could play Crysis or STALKER, for example, at those resolutions, but of course performance was bad. This was 11 years ago, not a few years ago.
Also, back then gamers didn't even play at 1080p, because PC-exclusive games were too demanding overall. And VRAM was not the limiting factor, because the games had low-res textures. Besides, VRAM is irrelevant if your GPU is not capable of rendering everything at an acceptable framerate. You can put 50 GB of VRAM on a weak GPU and it's going to be useless.
Crysis is an excellent example because even today it can be compared to today's games. Basically it only needs higher-res textures. It was so far ahead of everything, hence the heavy performance cost.
And CryEngine was never dumbed down. What was dumbed down was Crysis 2/3, because the consoles were not capable of running large maps with that level of graphics, which is exactly what I'm talking about here.
The reason you can play today's games at 2K/4K resolutions is that graphics are stagnating because of consoles, and publishers don't want their games to look better on a much stronger PC. One example of that is Skyrim, which looked like a joke compared to Crysis-level graphics, and it was a much newer game. It didn't even have ambient occlusion, which would have been perfectly playable on a PC.
It's very simple: you have much stronger GPUs today compared to consoles, you have console graphics, and you just add a lot of VRAM and play at 2K/4K.
And by the way, Metro and TW3 were not graphical revolutions in any way.

Again, CryEngine was never dumbed down or simplified or anything similar. I know this because I work with CryEngine.

About the prices, here is a good video:
 

xer21

Forum veteran
You are either young and didn't experience many of the things you talk about, which you shouldn't do, by the way, or you are ignorant.
Please, don't even try.

Back then, 2K resolutions were available. You could play Crysis or STALKER, for example, at those resolutions, but of course performance was bad. This was 11 years ago, not a few years ago.

STALKER absolutely didn't have 2K out of the box, and I already addressed Crysis trying to do WAY too much too fast.

Also, back then gamers didn't even play at 1080p, because PC-exclusive games were too demanding overall. And VRAM was not the limiting factor, because the games had low-res textures. Besides, VRAM is irrelevant if your GPU is not capable of rendering everything at an acceptable framerate. You can put 50 GB of VRAM on a weak GPU and it's going to be useless.

And a fast GPU doesn't matter if you don't have the VRAM for the textures, which has been an issue for a while now. And yeah, the textures in 2007 were lower res... but the flagship cards had maybe a full gigabyte of memory, so it was the same issue. The 8800 GTX didn't even have a full GB.

VRAM was never irrelevant and has ALWAYS been a problem when going past 1080p. People didn't play past 1080p for a lot of reasons, and VRAM was one of them; the fact that a lot of games didn't support it natively was another. The rough numbers below show why.
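Rough numbers, as a minimal sketch (Python; the byte-per-pixel figures, buffer count and texture sizes are assumptions, and it ignores compression, MSAA and driver overhead):

```python
# Back-of-the-envelope VRAM arithmetic; all sizes here are assumptions.
def render_targets_mb(width, height, bytes_per_pixel=4, target_count=3):
    # color + depth + one extra buffer as a crude stand-in for a real pipeline
    return width * height * bytes_per_pixel * target_count / 1024**2

def texture_mb(size_px, bytes_per_texel=4, mip_overhead=4 / 3):
    # one uncompressed RGBA8 texture with a full mip chain
    return size_px * size_px * bytes_per_texel * mip_overhead / 1024**2

for name, (w, h) in [("1080p", (1920, 1080)), ("1440p", (2560, 1440)), ("4K", (3840, 2160))]:
    print(f"{name}: ~{render_targets_mb(w, h):.0f} MB just for basic render targets")

# A single uncompressed 2048x2048 texture is ~21 MB, so a few dozen of them
# already crowd a 768 MB card like the 8800 GTX before anything else is loaded.
print(f"one 2048^2 texture: ~{texture_mb(2048):.0f} MB")
```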

I mean, I can still grab my STALKER discs and load up unpatched versions: they didn't have 2K resolutions natively. 1440p wasn't offered as standard for a long while, and you can't even blame consoles for that, because consoles were just cresting 720p themselves at the time.

Crysis is an excellent example because even today it can be compared to today's games. Basically it only needs higher-res textures. It was so far ahead of everything, hence the heavy performance cost.

It was poorly optimized, period. You can make excuses for it, but the fact of the matter is, it went far beyond the limits of the hardware available for many reasons, not the least of which is that they tried to do literally everything.

Even today it's poorly optimized.

Crysis Warhead runs way better than Crysis 1 not only because they turned things down a little, but because they did a hell of a lot better job of optimization. And for the most part it looks just about as good. Crysis 1 was poorly optimized, and Crytek even acknowledged this.


And CryEngine was never dumbed down. What was dumbed down was Crysis 2/3, because the consoles were not capable of running large maps with that level of graphics, which is exactly what I'm talking about here.

Crysis Warhead turned down some of the features. The engine didn't get dumbed down as an engine, but the way it was presented to consumers was. It's a known fact that Warhead turned down some features to get it running better. That's the point: Crytek knew they went too far in Crysis 1 and actually downgraded some of their feature implementations because they overstepped. That's what I'M talking about.


The reason you can play today's games at 2K/4K resolutions is that graphics are stagnating because of consoles, and publishers don't want their games to look better on a much stronger PC.
And, you know, because the tech is actually catching up. Or do you think it's a good thing when games run like garbage on current tech? That's just lazy.

Publishers aren't downgrading games to keep in line with consoles. They're making lazy ports, yes, which results in a lot of games looking similar, but that's happening because it's easier and cheaper, not to hide graphics progress from console users. Lots of well-done ports look MILES better than their console counterparts; lots of junk ports don't. But it's not a conspiracy. Console users don't even care, lol. They expect PC to look way better.


One example of that is Skyrim, which looked like a joke compared to Crysis-level graphics, and it was a much newer game. It didn't even have ambient occlusion, which would have been perfectly playable on a PC.

It's a good example of a lazy port, because Bethesda has been giving us lazy ports since Oblivion. Skyrim looks like a joke because they didn't expend any effort on the port. They built it for Xbox and ported it out afterwards. They never even tried to push PC graphics, because that was extra time and money they didn't want or need to spend.


It's very simple: you have much stronger GPUs today compared to consoles, you have console graphics, and you just add a lot of VRAM and play at 2K/4K.

So you're admitting VRAM was a limiting factor.

OK.

And by the way, Metro and TW3 were not graphical revolutions in any way.
I didn't say they were. I said they were examples of games where the PC versions looked significantly different from the console versions. They were well-done ports that exemplify the fact that some devs care about the PC market and some devs don't. They took advantage of the power they had and used it. They had significantly better graphics than their console counterparts and pushed well past what the consoles did.

And what counts as a graphical revolution anyway? In 2015 there wasn't a single open-world game that ran as well as TW3 with that amount of NPC density and draw distance. Last Light was one of the first games to lean that heavily on tessellation. Both were clear advancements in their respective genres; to claim otherwise is just wrong. Not every advancement is about pushing numbers on frames and resolution.

About the prices, here is a good video:

I never said cards don't cost more now. I said you can't compare the RTX cards to AMD's offerings, because it's not an apples-to-apples comparison, and it won't be until AMD gets ray tracing. They're not competing for the same consumer base. The 1080 Ti (which is still more expensive than AMD's counterpart) is the only comparison that makes any sort of sense. Saying Nvidia costs twice as much is disingenuous, because it's only due to them hawking tech that doesn't even exist on AMD's side. Like, yeah, it's more expensive; it's got something that AMD literally doesn't have. That's not a fair comparison unless you're trying to cherry-pick points. You're not cross-shopping the RTX with anything from AMD, because it's literally your only choice for ray tracing. It's in its own market, and there's no competition, period.
 
[...] It's a good example of a lazy port, because Bethesda has been giving us lazy ports since Oblivion. Skyrim looks like a joke because they didn't expend any effort on the port. They built it for Xbox and ported it out afterwards. They never even tried to push PC graphics, because that was extra time and money they didn't want or need to spend. [...]
Exactly this.
However, it's not just the graphics. They neglect a lot more aspects, such as performance, controls, screen real estate, interface accessibility, etc., due to the lazy porting of multi-platform titles from console to PC.
And it's not just Bethesda; it's the major triple-A industry as a whole.

[...] So you're admitting VRAM was a limiting factor.

OK. [...]
I wouldn't say @Ancient76 is wrong; I actually agree with both of your arguments. It very much depends on the developer and publisher.
Resolution surely makes the biggest impact on VRAM usage, but in most cases a high amount of VRAM usage isn't justified even at high resolutions, because most of the publishers you're talking about don't care about optimization in general, as long as the crowd buys their games regardless.

Talking about console ports, a good example is Deus Ex Human Revolution vs Deus Ex Mankind Divided vs The Witcher 3.
The graphics of Deus Ex HR look very similar to Deus Ex MD; there is almost no visual improvement. Yet Deus Ex MD requires about 4-4.5 GB of VRAM at just 1080p on maximum graphical details.
If we take the latter and compare it to The Witcher 3, which uses REDengine and looks a lot better with a much larger world (Deus Ex MD is a joke compared to The Witcher 3), with far more detail put into the visuals and level of detail, The Witcher 3 only requires about half the VRAM that Deus Ex MD demands: roughly 2-2.5 GB at 1080p on maximum graphical details + AA + HairWorks.
And things are similar with file sizes and more.
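As a purely hypothetical illustration of how two games at the same resolution can end up that far apart in VRAM, here is a sketch of how much the texture format alone swings the budget (the counts, sizes and byte figures below are invented, not measured from Deus Ex MD or The Witcher 3):

```python
# Invented texture budget: counts, sizes and bytes-per-texel are illustrative only.
def texture_set_gb(count, size_px, bytes_per_texel, mip_overhead=4 / 3):
    return count * size_px**2 * bytes_per_texel * mip_overhead / 1024**3

uncompressed = texture_set_gb(200, 2048, 4)  # uncompressed RGBA8
compressed = texture_set_gb(200, 2048, 1)    # BC7-style block compression, ~1 byte/texel
print(f"200 textures, uncompressed:     ~{uncompressed:.1f} GB")
print(f"200 textures, block-compressed: ~{compressed:.1f} GB")
```

Format and streaming choices like these are exactly the kind of optimization work a careless port skips.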

Coincidence? I doubt it. It's as you mentioned before: lazy porting, plus whether there is the will and/or the need to do proper optimization.

[...] I never said cards don't cost more now. I said you can't compare the RTX cards to AMD's offerings, because it's not an apples-to-apples comparison, and it won't be until AMD gets ray tracing. They're not competing for the same consumer base. The 1080 Ti (which is still more expensive than AMD's counterpart) is the only comparison that makes any sort of sense. Saying Nvidia costs twice as much is disingenuous, because it's only due to them hawking tech that doesn't even exist on AMD's side. Like, yeah, it's more expensive; it's got something that AMD literally doesn't have. That's not a fair comparison unless you're trying to cherry-pick points. You're not cross-shopping the RTX with anything from AMD, because it's literally your only choice for ray tracing. It's in its own market, and there's no competition, period. [...]
But what we can all agree on is that, because AMD has no equivalent products to compete with Nvidia's high-end products, Nvidia at this point enjoys the benefit of a monopoly market and can raise prices (which they did), thus becoming more expensive.
It's what Intel did for the last decade because AMD had no equivalent offers for such a long time until Ryzen. And now we even have a $500 AMD CPU that beats a $1200 Intel CPU :shrug:
 

I'm not trying anything. You do this to yourself. The moment you said CryEngine was dumbed down, I knew that you don't know what you are talking about.
Some people only look at the framerate when we talk about 2007's Crysis, but they never look at what's under the hood, because they don't understand how advanced it was. That game was so far ahead in everything, but the most impressive feature was its shaders.
Actually, Crysis' shaders are more advanced than TW3's shaders, especially the foliage, terrain and water shaders. And we're talking about a 2007 game versus a 2015 game.
And by the way, Warhead didn't turn down anything. Where did you get this? Both games perform very similarly and look very similar. What they changed in Warhead is mostly the lighting, which is more yellowish, and that didn't affect performance.

And yes, like I said before, we have problems with lazy companies today. They make games primarily for consoles, which is exactly the reason graphics are stagnating. This is very easy to see. And this is the reason you can play games at 2K or 4K.
In the good old days this was not the case. The era of graphics revolutions started with DX9. The first graphics revolution was 2004's Far Cry, which was a huge leap, and then HL2 and Doom 3. Then in 2006 came Oblivion and Gothic 3. In 2007 it was STALKER, and Crysis as the biggest graphics revolution ever. In 2009 it was STALKER CS, and after that the era of PC exclusives pushing graphics was finished. All of the games I mentioned above are PC exclusives, except Oblivion. But Oblivion was very much graphically PC-centric, although not the best example of optimization, like all Bethesda games.

And VRAM was not the limiting factor, because these PC exclusives pushed graphics so much that it was necessary to have faster GPUs. VRAM was irrelevant here. I have explained this to you. That's why it was not possible to play games at 2K/4K back then, even if you had 8 or more gigs of VRAM.

STALKER SoC could easily be played at extreme resolutions by simply enabling supersampling. It's the same thing, but of course performance was bad.
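For what it's worth, supersampling just renders at a multiple of the output resolution and downscales, so the pixel cost is roughly that of running the higher resolution natively. A quick sketch with an example 2007-era monitor resolution (the numbers are only illustrative):

```python
# 2x2 supersampling shades 4x the pixels of the base resolution.
base_w, base_h = 1680, 1050   # example 2007-era monitor resolution (assumption)
factor = 2                    # 2x per axis
ss_pixels = (base_w * factor) * (base_h * factor)
print(f"2x2 SSAA at {base_w}x{base_h}: {ss_pixels:,} shaded pixels per frame")
print(f"native 4K for comparison:      {3840 * 2160:,} shaded pixels per frame")
```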

I'm not comparing RTX cards to AMD. I'm comparing the prices of high-end Nvidia cards from 10 or 5 years ago to today's prices. That's insanity.

And somebody actually did compare it. In the Crysis picture they use some texture mod, but the rest is the same.
By the way, show this Crysis picture to some younger player who never played it, and he's going to think it's some kind of next-gen game:
https://external-preview.redd.it/_b...bp&s=dfdc584305c8147299783e3e7db354a7524d1dec
 

xer21

Forum veteran
Dude, Crytek straight up said they tweaked things in Warhead. And on top of that it was just so much better optimized. They don't run the same at all; Warhead runs way better. That was literally their main marketing hook, like, come on.

You can't even say "That's why it was not possible to play games at 2K/4K back then, even if you had 8 or more gigs of VRAM." It literally didn't exist. We hadn't even broken the 1 GB barrier then. No one was going to play anything at 2K or 4K on 1 GB no matter the texture size. It was a limiting factor and always was.


When I asked about graphics leaps, I care about WHY you think they were leaps. I explained why I think TW3 and Metro count; all you did was list games. You're over here comparing shaders, and that's cool, but name an open-world game that runs as well at the level of fidelity TW3 has, because no one else has really done that. It seems like all you care about is pushing cutting-edge tech, and that's cool, but that's not all there is to good graphics.

Like, what was revolutionary about all those titles? I'd agree on Far Cry and Doom 3, but the biggest revolution in HL2 was physics-based, and I'm not sure exactly what you're pointing to with Oblivion and Gothic, or STALKER for that matter, which outside of lighting and weather was fine but not amazing. And if we're just counting specific features, then why DOESN'T Metro Last Light count? Nothing you're saying is consistent, and it feels like you're cherry-picking examples to mock consoles.


You're talking in circles about things you think you explained but never did, and at the end of the day your entire argument is "publishers are afraid of PC graphics," which is just way too big an assumption and has no real basis in reality. The only thing holding back PC graphics is lazy porting. It's not some conspiracy from publishers who think console gamers are going to be offended, because it's common knowledge among console guys that PC looks insanely better, and no one cares. It's just about money, and that's all it ever was.
 

xer21

Forum veteran
Both of you calm down, for goodness sake.
As much as I'd like to continue, realistically, I can feel the ban coming and I probably shouldn't.

Yeah, yeah, I can't wait to read about how I'm bowing out because I'm conceding. Nah, I just already know the last post is probably going to get me in trouble, and I don't want to push my luck.

This legit made me edit certain things out of my post. Thanks for stopping me.
 
You're talking in circles about things you think you explained but never did, and at the end of the day your entire argument is "publishers are afraid of PC graphics," which is just way too big an assumption and has no real basis in reality. The only thing holding back PC graphics is lazy porting. It's not some conspiracy from publishers who think console gamers are going to be offended, because it's common knowledge among console guys that PC looks insanely better, and no one cares. It's just about money, and that's all it ever was.

It's probably safe to say most modern games are targeted at a variety of platforms, hardware or, for simplicity, "configurations". Presumably, it's important to consider which configurations the game is targeted toward. Most people aren't running top of the line PC hardware. Most are toward the budget, console or "less than top of the line" part of the spectrum. So the game has to run well on these configurations. Presumably, this may mean making some sacrifices. In other words, the game isn't made to fully leverage top of the line hardware.

One would think it would be possible, to some extent, to build a game able to leverage the best of the best but still perform well on a less powerful configuration. Unfortunately, it never boils down exclusively to whether it is possible. It's whether less or more money can be made doing it. So, again, sacrifices are probably made. How many are made probably depends on the game. Most modern games have 14 million adjustment sliders to get the desired balance between performance and quality (they didn't always have these). How far you can get with this probably depends on a long list of factors (again, presumably).
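To make the "one build, many configurations" idea concrete, here is a minimal sketch of the kind of preset table those sliders usually boil down to (the preset names and values are invented, not taken from any real engine):

```python
# Invented quality presets: the same renderer reads one of these tables instead
# of being rebuilt per platform. Values are purely illustrative.
PRESETS = {
    "budget_or_console_like": {"texture_res": 1024, "shadow_map": 1024, "draw_distance_m": 600},
    "high_end_pc":            {"texture_res": 4096, "shadow_map": 4096, "draw_distance_m": 2000},
}

def describe(name):
    p = PRESETS[name]
    print(f"{name}: {p['texture_res']}px textures, {p['shadow_map']}px shadow maps, "
          f"{p['draw_distance_m']} m draw distance")

for preset in PRESETS:
    describe(preset)
```

How much real headroom the top preset exposes is exactly the cost/benefit trade-off described above.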

I'd think any reasonable person could see some merit to the above. Where I would disagree is when people blindly throw consoles under the bus and claim they're holding everything back. A console, smart-phone, smart-tv, home router, etc., are effectively both a computer and a personal device. So by all rights the distinction between "PC" and "console" is meaningless. At least, beyond indicating the differences between the two configurations. So if a console is holding things back because it's lesser hardware, so is a low end or budget PC. All of these console vs PC, Nvidia vs AMD, Intel vs AMD, A vs B "wars" are really quite comical.

You cannot really fault hardware developers for pushing cheaper, lesser hardware. Nor can you fault software developers for trying to build software able to run well on lesser hardware. Sure, game devs could build a game only able to run on top of the line hardware. They could also try to build it where it fully leverages top of the line hardware but still runs well on lesser hardware. The former would probably be pretty stupid and the latter presumably carries added cost.

Getting back to the "optimization". Who cares? Game dev A has some type of connection to hardware seller B. So the marketing claims game A is "optimized" for hardware sold by B. I'd wager a lot of that is... marketing. Yeah, it probably means the game runs better if you use hardware B. I seriously doubt it means it runs terribly on any other hardware. That would be stupid. I'd put as much thought into that as marketing surrounding "new" (not new) technologies such as Raytracing. Hardware/PC/game marketing is notorious for adding a lot of fluff, is the point. Don't get caught up in the fluff.

If someone stayed at a holiday inn last night and is a self proclaimed expert feel free to "correct" any of the above :).
 
I'd think any reasonable person could see some merit to the above. Where I would disagree is when people blindly throw consoles under the bus and claim they're holding everything back. A console, smart-phone, smart-tv, home router, etc., are effectively both a computer and a personal device. So by all rights the distinction between "PC" and "console" is meaningless. At least, beyond indicating the differences between the two configurations. So if a console is holding things back because it's lesser hardware, so is a low end or budget PC. All of these console vs PC, Nvidia vs AMD, Intel vs AMD, A vs B "wars" are really quite comical.

It's a matter of interpretation to what extent, and in which areas, consoles may hold back PC.
But it's not so much that they're holding it back; it's more that consoles give developers the opportunity to neglect issues on PC.
Even CDPR was under this pressure! In terms of performance CDPR did a great job with The Witcher 3, but the game had its flaws in other areas, such as menu screen real estate:
Basically, not many changes were made to differentiate PC from consoles. However, in many other mainstream games it's much worse. CDPR struck a balance between the two (it's not the greatest solution, though), but a proper solution would necessarily make the PC version look different from the console versions, not just visually but also feature-wise.
witcher_3_screen_real_estate.png

Anyway, on the graphical aspect, which @xer21 and @Ancient76 were discussing, it also depends on the publisher and developer, pretty much like everything else in the end. The argument that consoles are stagnating PC graphics isn't entirely correct, though. Some games may have this problem, but it's more likely because of laziness/greed, as @xer21 already mentioned.
 

Well of course Crytek tweaked some things, but that's not a downgrade. Warhead is a smaller, more linear game. But for example it has more intensive POM and better explosions, and the lighting is different. Overall, though, both games perform the same. I have played them both more than once, and I have even made some modifications to both. I know how CryEngine works.

Yes, I repeat myself because you don't understand something that is so simple. Let's say that today someone releases a game that pushes graphics just like Crysis did in 2007. All the VRAM that we have today would be useless, and you would not be able to play that game at 2K/4K, not even close.
Back then Crysis required about the same amount of VRAM as every other game; what was so advanced and demanding were the shaders, geometry and lighting. And for these graphical elements you need a strong GPU, not a huge amount of VRAM.
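A rough sketch of the distinction being drawn here, with invented numbers: per-frame shading cost scales with shader work per pixel (a GPU-compute problem), while VRAM use mostly scales with asset resolution:

```python
# Invented figures to illustrate "strong GPU vs lots of VRAM":
# shading cost per frame ~ pixel count * shader work per pixel.
pixels_1080p = 1920 * 1080
for label, ops_per_pixel in [("very heavy shaders", 2000), ("typical shaders", 500)]:
    print(f"{label}: ~{pixels_1080p * ops_per_pixel / 1e9:.1f} billion shader ops per frame")
```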

And thank you for telling me that back then we didn't have 8 GB of VRAM! I really didn't know this! By the way, have you ever heard of the expression "read between the lines"?

Also, I'm not cherry-picking. STALKER SoC was the first game that had dynamic real-time lighting and shadows, and it was so impressive. HL2 was impressive with its facial expressions, which were so advanced. Overall, all these PC exclusives were on par back then with excellent graphics, and they were pushing graphics forward.
By the way, I didn't change my graphics card (an HD 5850) for 5 years. When TW3 was released in 2015, that's when I changed it, because it was too weak for that game. This just tells you how graphics are stagnating.

Now you can say that lazy porting is the problem, but these are actually multiplatform games developed and released for all platforms at the same time. So technically they are not ports. But all of this sh*t that we have today is because of consoles.
Generally I don't have a problem with graphics, because all games look good enough today. But I have a problem with gameplay that is designed around a gamepad, and an even bigger problem is that it's designed for a console audience, more casual, etc.
This is why RPG elements have almost vanished. RTS games, my favorite genre, have also almost vanished, because you can't make an RTS for consoles.

By the way, let me enlighten you a little bit. Today people complain like never before, because the games are worse than ever, especially AAA games. And the reason these games are bad is not greedy publishers, but consumers who don't think critically. If you want to stop this, don't buy sh*tty games.
 
[...]
Yes, I repeat myself because you don't understand something that is so simple.
[...]
And thank you for telling me that back then we didn't have 8 GB of VRAM! I really didn't know this! By the way, have you ever heard of the expression "read between the lines"? [...]

While I like your enthusiasm in general, I think that getting too personal isn't the right reaction here, because, to be fair, there is something that you don't understand either.
That's the fact that you don't agree with @xer21's opinion, which, however, doesn't make his arguments invalid, as you're trying to suggest. So how about we agree that we don't agree with each other's opinions instead?

[...]
Let's say that today someone releases a game that pushes graphics just like Crysis did in 2007. All the VRAM that we have today would be useless, and you would not be able to play that game at 2K/4K, not even close.
Back then Crysis required about the same amount of VRAM as every other game; what was so advanced and demanding were the shaders, geometry and lighting. And for these graphical elements you need a strong GPU, not a huge amount of VRAM. [...]

This doesn't make any sense...
Not to mention that what you're trying to imply here is very, very far-fetched and probably never going to happen anyway; the comparison with Crysis at this point is irrational.
Crysis back in 2007 had the problem of not utilizing CPU core counts to their full extent. At the time, multi-core CPUs with 2 or at most 4 cores were common, and Crysis wasn't even using those properly. Crysis was, and still is, heavily CPU-dependent, and that alone points to an optimization problem rather than to capabilities that didn't exist yet. Even with the fastest graphics cards today you still can't properly play Crysis (1), so what does that comparison prove? See the small sketch below.
Let's set CryEngine aside for a moment; this whole episode actually resulted in heavy criticism and losses on Crytek's end, if you followed the press back then.
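A toy sketch of the single-threaded versus multi-threaded difference being described (this is not how CryEngine actually schedules work; the chunk count and workload are invented):

```python
# Toy illustration: running per-frame simulation work on one thread vs spreading
# the same chunks over however many cores the machine has.
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(chunk_id):
    # stand-in for physics/AI work on one slice of the game world
    return sum(i * i for i in range(200_000))

if __name__ == "__main__":
    chunks = list(range(8))

    # "2007 style": every chunk runs on the main thread, extra cores sit idle
    serial_results = [simulate_chunk(c) for c in chunks]

    # the same chunks distributed across a worker pool
    with ProcessPoolExecutor() as pool:
        parallel_results = list(pool.map(simulate_chunk, chunks))

    assert serial_results == parallel_results
```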

Also regarding screen resolution: 1080p was pretty much the highest widely available in the display market back in 2007 anyway. Anything higher than that wasn't available at a reasonable price; maybe a few 1440p displays existed, but not 2160p, at least not for ordinary customers in late 2007. I got my first 1080p display a year or two later.

[...]
Also, I'm not cherry-picking. STALKER SoC was the first game that had dynamic real-time lighting and shadows, and it was so impressive. HL2 was impressive with its facial expressions, which were so advanced. Overall, all these PC exclusives were on par back then with excellent graphics, and they were pushing graphics forward.
By the way, I didn't change my graphics card (an HD 5850) for 5 years. When TW3 was released in 2015, that's when I changed it, because it was too weak for that game. This just tells you how graphics are stagnating.

Now you can say that lazy porting is the problem, but these are actually multiplatform games developed and released for all platforms at the same time. So technically they are not ports. But all of this sh*t that we have today is because of consoles.
Generally I don't have a problem with graphics, because all games look good enough today. But I have a problem with gameplay that is designed around a gamepad, and an even bigger problem is that it's designed for a console audience, more casual, etc.
This is why RPG elements have almost vanished. RTS games, my favorite genre, have also almost vanished, because you can't make an RTS for consoles. [...]

I think you're a bit off focus here. Reading through the discussion once again, it looks like you're misinterpreting what @xer21 was trying to say in his earlier posts, and that is what's producing the disagreement.

@xer21 agrees with most of what you're saying, and so do I, but you make consoles responsible for everything that is currently happening. Some aspects might indeed apply, but it's not solely or entirely their fault, as I also mentioned in my earlier post.

[...] And the reason these games are bad is not greedy publishers, but consumers who don't think critically. [...]

That's wrong.
It's both: greedy publishers and careless customers.
However, to be fair, the careless aren't fully at fault either, because the greedy publishers try everything in order to mislead their customers into thinking that the product they offer is great, no, greater than great. Call it propaganda if you want; it's a fact, it's happening, and it's working.
 
I believe, and someone can correct me if I'm wrong, that the argument that "consoles limit PCs" is not necessarily a literal one. It can still be an accurate statement while also taking into account the greed/laziness points made above (which are totally valid, by the way).

As others have said, developing for consoles first simply gives devs an excuse to be lazy with PC ports. Because, at the end of the day, they know we will still buy their games -- and that's on us as much as it's on them (ok, not as much as, but at least more than 100/0). Consoles, combined at least, are also simply the majority of the market -- that, plus the fact that we buy the games no matter what, leads most devs to say "screw it" and ditch any sort of optimizations.

However, PC players (I play almost exclusively on PC, so I count myself here) do at some point need to realize that, while we can and should argue in favor of better ports with better controls, moaning about consoles and attacking fellow gamers really isn't going to get us anywhere.

Anyway, all of us are getting a bit off-track here. This turned into hardware thread 2.0 instead of "Game being optimized for Nvidia GPUs." o_O

To circle back around to that, I think we can expect the game to be optimized first for Nvidia GPUs because CDPR has had a long-standing partnership with them, but it's tough to say what form that optimization will take. I don't have any benchmarks handy, but I'd be curious to see how well TW3 performed on, say, a mid-range AMD card vs a comparable mid-range Nvidia card back around the time the game first came out. My gut tells me Nvidia would have performed better due to this partnership, but I have no evidence for that. Maybe the partnership is just for fancy features like RTX/Hairworks?
 
By the way, TW3 was not optimized for Nvidia, so I expect the same with Cyberpunk.

Good to know. I genuinely could not remember, but it seemed logical. So, hopefully, we can expect no more than a few extra aesthetic things (RTX/Gameworks) but otherwise even performance across both GPU makers.

No, we will not buy their shitty games, or at least I will not. But if someone can easily trick you, well, that just says something about you. In a nutshell: http://www.c4lpt.co.uk/blog/wp-content/uploads/2015/10/change-1024x848.png

No, it says something about the vast majority of PC gamers. You realize even poorly-optimized games sell like hotcakes, right? Of course, the sales will taper off or refunds will be issued if the issues are particularly bad and never fixed, but otherwise, they do just fine.

Personally, I'm like you. I will either not buy (if I'm aware of the issue beforehand) or refund a game that doesn't meet my port expectations. I have other principles (such as not buying on Epic) that I also stick to, no matter how much a company tries to strong-arm me.

But I was making a general statement regarding the PC market as a whole. They will complain, post 30k-upvoted Reddit posts, email the devs, etc. but they still buy the games. Not everybody is perfectly in-tune with the "issues," nor should they be.
 

xer21

Forum veteran
To circle back around to that, I think we can expect the game to be optimized first for Nvidia GPUs because CDPR has had a long-standing partnership with them, but it's tough to say what form that optimization will take. I don't have any benchmarks handy, but I'd be curious to see how well TW3 performed on, say, a mid-range AMD card vs a comparable mid-range Nvidia card back around the time the game first came out. My gut tells me Nvidia would have performed better due to this partnership, but I have no evidence for that. Maybe the partnership is just for fancy features like RTX/Hairworks?
It's going to come down to a lot of things. Lots of partnered games end up running better on competitor hardware for various reasons.
 
It's going to come down to a lot of things. Lots of partnered games end up running better on competitor hardware for various reasons.

Yeah, I've changed my mind. It was a fleeting thought, not really a hill I was going to die on. It just seemed logical to me that "partnership = favoritism," but if that is not the case, I am more than happy to accept that. Competition is good, and fairness is even better!
 

xer21

Forum veteran
No, we will not buy their shitty games, or at least I will not.
If your main criterion for what makes a game shitty is graphics, well then, have fun missing out on a huge swath of what games have to offer.

Seems like a terribly close-minded way to view games.

Gaming is better when you don't keep yourself a slave to the tech.
 