Nvidia’s GameWorks: A double-edged sword for Witcher 3

Scams usually imply falsely offering something, and no, that's not this dev team. I'm talking about locked exclusives tied to specific hardware; those can always be waited out for a few years, I just hope it's worth it. I mean, it is their income, and hopefully excluding all those who do care won't have as much impact. The reactions here show me no one's worried, I gather, or maybe there's a little sarcasm...
I'm guilty of using sarcasm but that is normal for me and found in most of my posts :whistle:.

And I agree. I don't think CDPR is attempting to scam anyone and I will be quite shocked if the game doesn't run decently on most systems (just like the previous two games). Yes, graphic settings might have to be tweaked for each individual system but that's no big deal and typical for PC games. As long as options exist to make the game run smoothly on a large variety of hardware (scalable) then people will be able to enjoy the game regardless of whether it's maxed out or not.
 
My problem with hairworks is that it is unrealistic at times. It's as if the animals have been taken to a salon, washed with expensive shampoo, and then blow-dried. Animal fur/hair in nature is dirty, sticky, muddy, and thick for the most part, not clean and flowing like models in shampoo commercials.

Agreed. However it's still a giant leap over flat textures, and a generous step above Ubisoft or Rockstar's fur implementation.

Whether it's worth the performance hit or not, well, that's up to the individual. Step by step, really: it already seems to have improved a lot from Ghosts to FC4, hopefully it'll be even better in TW3, and in the future we'll see it react more naturally to stuff like getting muddy, dirty, clumping, etc.
 

The comment made in the other thread is really all I'm looking for: 60 FPS with this silliness on a 290 and the worries melt away. What's confusing me is that they aren't running demos on 290Xs; that's probably all I'm patiently waiting for, or at least something close to it, since not many people have a 295X either. lol

No worries, I may just be defensive because I'm a Witcher fanboi to begin with, and I don't want to see either side of the GPU/CPU/whatever tech war dumped on all of those wanting to enjoy the game as best as reasonably possible. ;)
 

Shoot, if I can run TW3 on my aging GTX 680 at a decent FPS with all the fancy GameWorks features disabled, I'll be happy. I couldn't really care less about them, and I don't think they'll discourage me from playing or hamper my enjoyment of the game in any way.

Now, if Nvidia made it so these special effects couldn't be disabled in any game that used them then I'd be the first person to call it a scam and BS. As it currently stands, in every game I've seen that used these features they were completely optional. More options is almost always a good thing.
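To make "optional" concrete, here's a toy C++ sketch of the pattern, purely my own illustration; every name in it (GraphicsOptions, hairWorksEnabled, VendorFurSupported, and so on) is made up and has nothing to do with actual GameWorks or REDengine code. The vendor path only runs if the player turns it on and the hardware/driver reports support; everyone else gets the standard fallback.

```cpp
#include <iostream>

// Hypothetical sketch (not CDPR's or Nvidia's actual code): a vendor effect
// stays optional by gating it behind a user setting plus a capability check,
// with a standard fallback path for everyone else.
struct Creature { const char* name; };

struct GraphicsOptions {
    bool hairWorksEnabled = false;             // off by default, user opt-in
};

bool VendorFurSupported() { return false; }    // stub: pretend the GPU/driver says no

void RenderVendorFur(const Creature& c) { std::cout << c.name << ": vendor fur\n"; }
void RenderShellFur(const Creature& c)  { std::cout << c.name << ": shell-fur fallback\n"; }

void RenderCreatureFur(const Creature& c, const GraphicsOptions& opts) {
    if (opts.hairWorksEnabled && VendorFurSupported())
        RenderVendorFur(c);                    // optional, costs extra GPU time
    else
        RenderShellFur(c);                     // runs everywhere, cheaper
}

int main() {
    RenderCreatureFur({"wolf"}, GraphicsOptions{});   // setting off: falls back cleanly
}
```

Flip the setting off and the expensive path simply never executes, which is why an unaffordable "max" option doesn't stop anyone else from playing.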
 

Because, man, I don't care about you, or me, I care about everyone who's a fan. Exclusivity of features on specific but equivalent hardware alienates your base; this isn't rocket science. Maybe we can just agree to disagree?
 

I agree that it's not rocket science. I'm basing my conclusions off the previous two games. Most players did not max the previous games out yet they were still capable of enjoying the games. I was simply saying I don't see why this trend won't continue in the third game. The features we are discussing are not crucial to gameplay so they'll have very little to no impact if they aren't enabled. I don't think the effects are significant enough to alienate anyone if they can't enable them on their particular system. The effects are about as important as ubersampling was for TW2 (which is why I originally used that example). Almost nobody ran ubersampling and nobody cared about not running it; it was simply insignificant.

If you don't agree with my prediction or views then I have no problem with agreeing to disagree ;).
 

I know what you mean, and that's fine, but my main concern isn't happening here. From what I'm watching unfold, it seems to be functional on the other 33% of gamers' hardware so far, at least in one game. That's where my concerns have already been eased.

What I was fearing was market-share influence buffing up Nvidia at the cost of PC gamers who would normally have supported CDPR. We simply view this differently, which is alright. I just have multiple options to enjoy the game, now or later, and once those test results come out for both cards, I can simply be at ease about where I know my money deserves to go. ;)
 

So your fear was that Nvidia is attempting to strong-arm consumers into using their product by using proprietary software/technology which only runs on their own hardware (basically what Microsoft does with DirectX). And like I previously said, that would be a more valid concern if the GameWorks software was either a) crucial to gameplay or b) forced always on and not optional.

In my opinion it makes more sense to fight the existing monopoly (Microsoft, Windows, DirectX) as opposed to fighting a theoretical problem that doesn't yet exist (Nvidia, GameWorks becoming a monopoly).
 

freakie1one, I do not like any of them at all once they hit monopoly status.

I trust none of them once they get there, nope. Maybe I'm stubborn, but yeah, there it is. It's the reason I, ugh, still have consoles and play with other OSes like Ubuntu, to keep a balance. Also, indeed, I get that you support locked exclusives for both; that's probably the only thing we see differently. I think they're nothing more than problems for fans on both sides to begin with.
 

I don't trust any businesses. Almost all of them are willing to screw people over for a profit. I think there are just varying degrees of corruption; some will go further than others :p.

I also think gamers and developers alike would benefit if all proprietary software became open source. Sadly, it's usually not very profitable for businesses to spend money on researching technology only to give it away for free to their competitors. Making proprietary software and technology isn't bad as long as it's optional. Forcing everyone to use it is bad (read: monopoly).
 

Yeah, like the lens flares having issues in another title right now on specific cards; that's locked on, forced onto the players, and it's also brand-specific. I know what you mean. However, it does remind me of a simple issue I had with a good old game: like all games it's not open source, so I'll never know if I could ever restore the original OST in that title...

Also, how many titles forced PhysX installs? Just one example of something I never asked for.

You think this is irrational concern, with that track record? I'd say having both GPUs of a given era able to perform equally with the code would be the only reason to even choose one, and to change brands each time on equal footing, at least for me (that's how I've done it since 3dfx). Since that has not been the case, I will wait on my next purchase from them. So I guess it all comes down to what the reaction influences; I will go for the underdog until I feel the market is fairer. If this doesn't change things (meaning they stop locking stuff out and allow ATI to catch up with drivers within reason, and of course not given extra time, mind you), I'll be excited to switch back to Nvidia on my next upgrade. It's just how I see it.
 
A few screenshots don't give us the whole picture... Does it run poorly on all video cards/setups? Is it a driver issue? Is it an issue with the game not being optimized properly? Do all games which use the technology run poorly, or just this one?

Also, most new technology causes a performance hit when first released. There is a reason that many graphics options can be enabled/disabled in most PC games. The Witcher 2 was a great example of this: when the game was first released, almost no systems were powerful enough to run it at 60+ FPS with ubersampling enabled. Does that make TW2 a scam too, and the people who played it fanboys?
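(A rough aside on why ubersampling was so brutal, with my own back-of-the-envelope numbers: as I understand it, it's essentially supersampling, i.e. rendering at a multiple of the output resolution and downsampling, so the shading cost grows with the square of the scale factor.)

$$\text{pixels shaded}\approx (kW)\times(kH)=k^{2}WH;\qquad k=2\ \text{at}\ 1920\times1080:\ 4\times 2.07\,\text{MP}\approx 8.3\,\text{MP per frame}$$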

I'm running the latest Nvidia drivers for Far Cry 4 (I think they're 344.75 or something like that) on a GTX 780 Ti with the latest Ubisoft patch 1.4, and I get a 20-35 FPS drop whenever a pair of animals with fur is close to me... so I think that's pretty much the whole picture here. If you want I can upload a video or some more pictures of other situations, but I can tell you that in that particular scene I'm averaging 10+ FPS more than in other scenes with more trees, foliage, and draw distance...


You're saying new technology that doesn't run well on current systems is fluff and for "fanbois" while the other dude is saying it's a scam. PC games have almost always been this way concerning new technology and it's nothing new. Usually it takes a next generation GPU (after a game is released) in order to max the game out and maintain a high FPS. The technology people are bitching about is totally optional so it doesn't even affect gameplay if someone doesn't have a system powerful enough to run it since it can be disabled; just like 99% of the people who couldn't play TW2 with ubersampling enabled when it launched because it was too demanding.

Designing a game to scale well with future GPUs gives the game a longer shelf life and allows a broader spectrum of PCs to play it, since most of the options can be disabled if people don't have a powerful enough rig to run it maxed out. So far no one has posted screenshots of the game failing to run or work. They have posted screenshots of the game running at fantastic framerates once they adjusted the settings to what their PC was capable of running. Also, Vigilance pointed out that the game is obviously horribly optimized, since it doesn't even remove the effects on creatures far off in the distance, which kills performance, so the game shouldn't be considered a benchmark for anything.

The thing here isn't what you're talking about, the theoretical implementation of future technology, but whether today's technology is, right now, a fact. We have to ask whether our top graphics cards are capable of doing more than what they're showing us, and my personal answer is absolutely yes.

Look at the evolution of graphics, physics, and AI in games from Crysis (2007) to Far Cry 4 (2014), for example. I can't clearly see the correlation between performance, the graphics technology actually implemented in games, and theoretical graphics processing power (GFLOPS). Do the math. Top-tier graphics cards from their respective years: the 8800 GTX (2007) had around 500 GFLOPS (there are other characteristics like VRAM, bus, bandwidth, cores, etc., but I'm using GFLOPS because it's the theoretical sum of all of this), while the GTX 780 Ti (2013-2014) has around 5000 GFLOPS. OK, we have a huge jump in raw horsepower over seven years; I'm fine with that, I can see the evolution of the technology, I can see the work there. But now run Crysis on an 8800 GTX and run Far Cry 4 (or another 2014 AAA game that maxes out the card's capabilities) on a GTX 780 Ti. I personally don't see that evolution implemented in games matching the evolution of the theoretical graphics power of modern GPUs. It's simple and clear: there's a black hole in optimization, in the implementation of that GPU technology, and in the efficient utilization of our graphics cards. Remember the fact: from 500 GFLOPS (2007) to 5000 GFLOPS (2014) is 10 times the graphics processing power. Can you sincerely tell me you can see that 10x efficiently implemented in today's games?
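(Just to pin down the arithmetic being invoked, since "1000%" and "10 times" tend to get mixed together: this is only the ratio of peak theoretical throughput, which by itself says nothing about what reaches the screen.)

$$\frac{5000\ \text{GFLOPS (GTX 780 Ti)}}{500\ \text{GFLOPS (8800 GTX)}}=10\times\ =\ +900\%\ \text{(i.e. }1000\%\text{ of the original figure)}$$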

I call it a scam because I have the feeling that Nvidia and AMD are running the business their way, and wake up! Their business is to sell hardware! I can't understand why that's so hard to see. The world is manifestly full of monopolies and oligopolies that control and manipulate market rules; why would the video game industry (one of the biggest industries nowadays) be any different? Is that so hard to understand?


I'm uploading some more screens, this time of the different types of AA. Not everything is negative in graphics nowadays, and I can tell you I'm pretty happy with the SMAA class of AA. I don't know if it's related to Nvidia GameWorks, but SMAA seems to take the best of TXAA, MSAA, and FXAA and put it all together in a very performance-friendly way; I can tell you it works very well.

2x MSAA: 53 FPS
2x TXAA: 52 FPS
SMAA: 63 FPS

I don't know if you can see the difference very well in these stretched pictures, but I can tell you TXAA looks really smooth though the textures are a bit blurry, MSAA has great texture definition but the edges aren't as smooth as TXAA, and SMAA gives a very smooth picture with less blurry textures than TXAA and good performance! This is just great; I hope The Witcher 3 will support SMAA along with other types of AA. The bad thing is I noticed SMAA has a shorter draw distance and more pop-in, but it's barely noticeable and looks fantastic.
 


I totally agree that graphics cards are capable of doing more than what many game developers use them for. However, all the blame can't be placed on Nvidia or AMD. Game developers must be capable and willing to utilize a graphics card in efficient ways to really benefit from more powerful GPU's. If a game is horribly optimized (example: special effects on distant wolves in FC4 being applied when it's not necessary and drains performance) then it is no fault of the GPU manufacturer. Game developers are just as much responsible for whether or not a game takes advantage of a GPU's capabilities.
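To illustrate the kind of distance check being complained about, here's a purely illustrative C++ sketch (not Ubisoft's or Nvidia's actual code; Vec3, DrawCreature, and the 30 m radius are all made-up names and numbers): pay for the expensive fur simulation only near the camera and fall back to something cheap for distant creatures.

```cpp
#include <cmath>
#include <iostream>

// Purely illustrative LOD sketch (not FC4's or GameWorks' actual code):
// only pay for the expensive fur simulation near the camera.
struct Vec3 { float x, y, z; };

float Distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

constexpr float kFurSimRadius = 30.0f;   // hypothetical tuning value, in meters

void DrawCreature(const Vec3& creaturePos, const Vec3& cameraPos) {
    if (Distance(creaturePos, cameraPos) < kFurSimRadius) {
        std::cout << "simulate + render tessellated fur\n";   // expensive path
    } else {
        std::cout << "render baked fur texture only\n";       // cheap path for distant wolves
    }
}

int main() {
    DrawCreature({5, 0, 0},   {0, 0, 0});    // close: full fur
    DrawCreature({200, 0, 0}, {0, 0, 0});    // far: no per-strand cost
}
```

Skipping that check is the kind of thing that drains performance for no visible gain, and it's a game-side decision, not something the GPU vendor controls.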

If you want a great example of a game that IS taking full advantage of modern GPUs and CPUs then check out the game Sui Generis. They aren't using any third party software because they realize it is all too limited and not capable of what they need it to do. Since they've created every single part of the game engine from scratch they are able to do things which other games are not. Physics are applied to everything in Sui Generis (all objects, characters, clothing, hair, weapons, armor), all animations are procedurally generated based upon physics instead of using static animations, all objects can be interacted with and are not just static objects, there is advanced AI that is leagues ahead of any other game (NPCs have emotions, can form opinions and make decisions based upon what is occurring around them, have their own goals and objectives they work towards, etc.), full dynamic lighting and shadows, super sampling that can be enabled with high framerates on mediocre systems. This is all being created by a small indie team with only one programmer while using no third party proprietary software; and the game looks and runs great (I've played several alpha versions).


Have you actually tested your theory out by running the same exact game (Crysis 2007) with two different video cards to see what the gains actually are? I've seen huge improvements in graphics quality over the years. Simply compare TW1 to TW2 if you want a great example. Compare Battlefield 2 to Battlefield 4 for yet another. And I'm predicting that another huge leap will be made once DX12 is used in next gen games since it is supposed to be quite a bit more efficient than DX11.

Whether or not a game developer utilizes all of a GPU's processing power in an efficient manner isn't always going to be directly related to some grand scheme by GPU manufacturers to sell more product. Developing a game isn't easy and takes a lot of time, money, and skill. Some developers do it better than others; it's not a conspiracy.


Read my previous post. I said I don't trust any business because I know the vast majority of businesses are corrupt and will screw people over if they can profit from doing so. I am not blind and I can see this as clearly as anyone. The question is: what is your solution to this problem?
 

My solution is very simple: don't buy anything from big companies. But it's useless, since millions of players already own a PS4, an Xbox One, or a high-end graphics card and keep buying stuff from money-making publishers like EA, Activision, or Ubisoft, among others. What I'm trying to explain isn't a conspiracy; it's a very simple concept and it isn't even my own. You know the term built-in obsolescence? It's a common thing in mass-production industries in the second half of the 20th century and into the 21st, and many engineers and specialized journalists have talked about it extensively. How does this apply to the technology industry? Easy: ask Apple, Microsoft, vehicle manufacturers, etc. You think the video game industry is different? I don't know exactly how they run the hardware business, but I'm pretty sure they're doing dirty stuff to keep the dollar machine running at 100%.

And you're going to tell me it's quite normal that new games require more powerful hardware, blah blah blah; yeah, OK, that's preschool logic, I'll give you that. But keep in mind that on one side we have closed-system manufacturers (consoles) with no hardware evolution in 6, 7, or 8 years who need to keep players thrilled by new graphics year after year, while on the other side we have nonstop hardware manufacturers that multiply the graphics processing power of their hardware exponentially year after year. So I'm asking you: is the coexistence of these two sides of the business, alongside the evolution of graphics, normal? I don't get it. Can somebody explain to me how it's possible that the same publishers are making huge amounts of money on AAA titles for all platforms, with practically zero benefit for PC gamers whose high-end cards have at least 10 times more power? And not only that, but PC gamers often have to suffer poor performance, more bugs, crashes, etc. How is it possible that a graphics card with 3 or 4 times more horsepower can't run a game decently when that same game runs on a console? And why do PC games always require a lot more power to run at the same settings as on consoles?

And yeah, again you're going to tell me developers optimize a lot more for consoles because it's a closed system they know better and can do more with less, blah blah blah. But I'm not talking about optimization anymore; I'm talking about graphics power being underused on purpose. I've already made the comparison between the 8800 GTX with 500 GFLOPS in 2007 and what we get nowadays from the GTX 780 Ti with 5000 GFLOPS (10 times more); I don't think you realize the size of the jump here. It's no longer about optimizing better or worse, or about the talent of the developers making graphics engines; if it were, I would give you that. But nobody can convince me that a AAA developer or engineer at a big company can't take advantage of 5000 GFLOPS even if he were the laziest man on earth; it's just nonsense.

In short, hardware manufacturers need to sell new GPUs every year, and console manufacturers need to keep selling the same system year after year (plus games, plus subscriptions, etc.). That's their ultra-capitalist logic. They're ultra-big international companies with big investors who keep demanding more profit year after year; that's their logic and you're not moving them from it, and that is 100% a fact (I'm talking about big companies here, not CDPR). They're not going to unleash a war against each other (Sony vs. Microsoft vs. Nvidia, etc.); that's pretty clear, they have a lot more to lose than to win in a war. So what do we get? An oligopoly. They're all winning; one year this company wins a bit more, the next year the other one does. That's the game.

I'm not an engineer, but I think it's pretty easy to set the performance ratio when you control the drivers; via drivers you can make a game run well or not. Call it conspiracy or whatever, but for me the situation today isn't clear at all; it simply doesn't make any sense. At least to me.
 
At least some of this is absorbed by higher pixel counts. In addition, many of the pixel shaders use very simple raytracing, and these additional processes often take multiple samples per pixel. It is easy to consume far more than 10x the number of samples/processes with fairly small increases to the fidelity of a handful of shaders.
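To put rough, purely illustrative numbers on that (my own, not a real profile of any game): bumping the resolution and adding a couple of multi-sample per-pixel effects multiplies the shading work very quickly.

$$\frac{1920\times 1080}{1280\times 720}\ (=2.25\times\text{ pixels})\ \times\ 4\ \text{samples/pixel (effect A)}\ \times\ 2\ \text{samples/pixel (effect B)}\ =\ 18\times\text{ the per-frame shading work}$$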

That resolutions, fidelity and framerates are all higher than they were a decade ago is a given (It used to be that 20fps was 'good' and 30fps was exceptionally high... now 30 is considered a failure, and 60 is the new "minimum" for some people...).

This seems to be more hyperbole than actual requirement: so long as the physics engine is running at a high refresh, with responsive controls, the actual display frame rate is far less important.
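For what it's worth, here's a minimal generic fixed-timestep loop in C++ showing what "physics at a high refresh regardless of display frame rate" usually means in practice; it's not any particular engine's code, and the 120 Hz step and 33 ms frame time are arbitrary illustrative values.

```cpp
#include <chrono>
#include <iostream>
#include <thread>

// Generic fixed-timestep loop (not any particular engine's code): physics
// advances in constant 120 Hz steps no matter how long each frame takes to
// render, so simulation cadence and input handling don't depend on FPS.
int main() {
    using clock = std::chrono::steady_clock;
    const double dt = 1.0 / 120.0;                 // fixed physics step (seconds)
    double accumulator = 0.0;
    auto previous = clock::now();
    int physicsSteps = 0;

    for (int frame = 0; frame < 30; ++frame) {      // stand-in render loop at ~30 FPS
        std::this_thread::sleep_for(std::chrono::milliseconds(33));  // pretend rendering is slow
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        while (accumulator >= dt) {                 // catch up in fixed-size steps
            // stepPhysics(dt); pollInput();        // simulation stays at 120 Hz
            accumulator -= dt;
            ++physicsSteps;
        }
        // render();                                // however slow, physics cadence is unchanged
    }
    std::cout << "frames: 30, physics steps: " << physicsSteps << "\n";  // roughly 120 per second
}
```

Render at 30 FPS or 144 FPS and the simulation runs the same number of steps per second either way, which is the point being made.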
 

Well, I don't know where you got that "20 FPS used to be good," because personally with my PS2 I had the best and smoothest experience ever; it was 30 FPS, yeah, but rock solid, man. Those days... when you could rent a game for a week and get an enjoyable experience for 8 bucks... good old times!
 

If you choose to not play games because you won't buy anything from big companies (those who currently produce consumer grade GPU's and CPU's) then that is your personal choice. However, I am still going to buy whatever GPU offers the best price to performance ratio while allowing me to run games at high settings. Gaming is one of my passions and I'm not going to give up gaming to try and make a political statement or to fight a theoretical war. While this solution may be acceptable to you it most certainly is not acceptable to most gamers. Telling gamers to give up their passion is hardly a viable solution.


This is simple: consoles don't run the same version of games as PC's do because consoles simply aren't powerful enough to do so. In order to make a PC game run on a console developers must cut things out of the game. Shorter view distances, lower resolutions, less frames per second, etc. Developers must use "tricks" in order to get the game to run at all on consoles while still looking decent since they have a lot less processing power. Again, not a conspiracy.
 

I'm only saying to stop buying stuff like crazy from the big companies who rule the market... I'm not telling you to give up your passion; there are always alternative ways to enjoy it.

What you're saying about consoles was already said in my comment, and really it's not an answer, just the obvious reasoning. If you read all my posts together and reflect on what I'm laying out, I'll be pleased to answer. But it seems people go speechless and argument-less when a guy makes a long statement about something.
 
Ok, the discussion here really isn't about The Witcher or how this may impact The Witcher anymore. Bring it back onto the Witcher, or it will get closed.
 