Predicted Witcher 3 system specs? Can I run it?

Good point, but that is just your opinion. They could target a powerful GPU, but not a Titan; that is a very expensive card (around 1000 euros), and the 780 Ti is about 600 euros. I think the standard or high graphics target could be a GTX 680 or an HD 7870 (a GPU similar to the PS4's).

What I mean is that the Titan is not a standard GPU that everyone has at home. I understand buying the game and upgrading your GPU, but not buying a very expensive card just to play it. There is an economic crisis right now and I don't think it's a good moment for purchases like that. All this is only my opinion, nothing more, and I don't mean to disrespect anyone.
Thank you very much.

Now, you are entirely missing the point. A GTX 680 or ATI 7870 will still run TW3, and the game will look as good (at the same FPS) as any other properly made game that runs on a GTX 680 or ATI 7870. But the GTX 680 and ATI 7870 will be 3 years old by the time TW3 is released, and it would be silly of the devs to limit the graphical fidelity of TW3 by requiring that a GTX 680 or ATI 7870 be able to run TW3 smoothly on max settings. This is nothing more than the natural progress of computer hardware (every 18 months, or so, hardware performance doubles - Moore's law).
 

Keep in mind though that the speed at which performance gains are made has been slowing down in recent years. Architectures are milked for all that they are worth while not showing that much of an increase in sheer power.
 
What he said.

Gone are the days when you had a 5-year-old GPU, bought a new one, and suddenly got 5x the performance.
 

Yes, your points about the PC world are very realistic, but I think the PC world could try to get more performance out of less expensive GPUs, because this is not a good moment for big purchases. That said, I always respect other opinions.
Thank you very much.
 

To be accurate, if you had a 5-year-old GPU and bought a new one today, Moore's law would predict the new one to be roughly 10 times more powerful: 5 years is about 3.33 doubling periods (60 months / 18 months), and 2^3.33 ≈ 10. I know that Moore's law isn't 100% accurate (that's why I wrote "or so").
I had an ATI 6950, recently bought a GTX 770, and saw FPS roughly double in all games. That's about 2-2.5 years for a doubling of performance (at the same price), which is not that far off Moore's law.
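Just to put numbers on that, here is a back-of-the-envelope sketch of the Moore's-law arithmetic above (the 18-month doubling period is only the usual rule of thumb, and the function name is mine, not from any benchmark):

```python
# Rough "Moore's law" style estimate: performance assumed to double every 18 months.
# Purely an illustration of the arithmetic above, not a real GPU benchmark.

DOUBLING_PERIOD_MONTHS = 18  # "every 18 months, or so"

def expected_speedup(years_between_cards: float) -> float:
    """Expected performance ratio vs. a card bought `years_between_cards` ago."""
    doublings = (years_between_cards * 12) / DOUBLING_PERIOD_MONTHS
    return 2 ** doublings

print(f"5-year-old card  -> ~{expected_speedup(5):.1f}x expected")    # ~10.1x
print(f"2.5-year upgrade -> ~{expected_speedup(2.5):.1f}x expected")  # ~3.2x (I saw ~2x)
```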
 

True but it used to go a heck of a lot faster in the past. Faster cards kept being pumped out at an almost alarming rate and new tech seemed to pop out of nowhere every few months or so. These days a new architecture is introduced with a decent performance gain and then milked by having several iterations of said architecture.

One could also argue that the prices of video cards have not been getting any more attractive lately. A card with 80 on the end used to be considered the top-of-the-line model. With crazy products like the Titan, Ti, Titan Black and such, the 80 has become more of a high-middle-end card with its crazy brothers towering over it. My post is not a rant, btw; it is just to say that I believe the GPU market has become slower and more expensive in general.
 
Well, that's exactly what I mean: you're getting double the FPS now. If it were, say, a decade ago, the difference would be bigger.
We only get 'diminishing' returns now, so to speak.

I'm not trying to say the advancements made in technology between generations are irrelevant, but they don't always translate into raw performance gains the way they did before.
 

In the good old days we had more competition (3dfx, Nvidia, ATI, VIA S3), and they were all fighting amongst each other to make better graphics cards. Now, and for over 10 years, we have only had ATI and Nvidia. If you have studied economic theory (especially microeconomics, and especially game theory), you will notice that ATI and Nvidia aren't fighting as much. It feels like they are a cartel. But let's not delve into it.
 
IMO there isn't much that optimization can do; it's not some kind of magic or wizardry. If REDengine 3 is made properly (which it is), then the game is pretty much optimized already. The only optimization left is to reduce graphical fidelity in ways that wouldn't be noticed that much (like reducing the view distance from, say, 2 miles to 1 mile, or reducing the number of animated hairs on a wolf from 200,000 to 100,000 or 50,000, and stuff like that).

Sorry, I don't agree with your view of optimization. Obviously it's not some kind of magic that will give you a double performance boost, but good decisions and the right use of technology can make or break a game.

Just look at what happened to Skyrim: it's plagued by performance issues, especially if you try to run it on max settings. There is a lot of stuttering, long loading times, texture-popping issues, etc. I just upgraded to 8 GB of RAM and all these issues have been greatly reduced. At the time it was released, 8 GB wasn't the standard; 4 GB should have been enough for this game if they had built better streaming tech. Optimization is basically about clever use of tech to provide an overall better experience. This article explains my point:

http://www.lazygamer.net/general-news/the-witcher-3-wont-overload-your-video-card/

Now, according to this, there is no reduction or compromise of graphical fidelity, because what you're currently looking at is still rendered in full detail, while the load is reduced by using lower-quality models for things that are not in your focus. This is the right use of tech.
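That load-shedding trick is essentially level-of-detail (LOD) selection. Here is a minimal sketch of the idea, with made-up distance thresholds and asset names (nothing here comes from REDengine 3 or the article):

```python
# Illustrative level-of-detail (LOD) selection: cheaper models for things out of focus.
# Thresholds and file names are invented for this example.

LOD_LEVELS = [
    (20.0,  "wolf_high.mesh"),    # close / in focus: full detail
    (60.0,  "wolf_medium.mesh"),  # mid distance: cheaper model
    (200.0, "wolf_low.mesh"),     # far away: lowest detail
]

def pick_model(distance_to_camera: float) -> str:
    """Return the cheapest model that still looks right at this distance."""
    for max_distance, model in LOD_LEVELS:
        if distance_to_camera <= max_distance:
            return model
    return "wolf_billboard.sprite"  # beyond the last threshold: flat impostor

print(pick_model(10.0))   # wolf_high.mesh
print(pick_model(150.0))  # wolf_low.mesh
```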
 
Hopefully there will be other versions of anti-aliasing supported by this game, like SMAA or even FXAA, which might not look the best but does the job (in most games) and doesn't affect performance that much (or at all, if implemented properly).
 
I'd rather not use FXAA, because it really makes textures blurry. If I have a choice between FXAA and no AA, I'll go with no AA to save the textures.
 
At least we know there's going to be MSAA and TXAA, so there's a chance there might be post-process ones like SMAA.
 
I am using TXAA in AC4 and I must say it is less demanding than other anti-aliasing methods (for example EQAA). For Nvidia, TXAA is the best solution; of course, you must have a good GPU, for example a GTX 780, if you want everything maxed + TXAA.
 
It's all about personal preference (or necessity, in my case, since my rig is mid-range).


TXAA is an Nvidia exclusive (if I'm not mistaken), so AMD users (such as myself) can't use it, but I wouldn't mind SMAA, since it's somewhere between MSAA and FXAA in terms of how much it affects performance.
 
Temporal SMAA (T2X, 4X modes) should supersede TXAA. It's universal and injectable, unlike TXAA, which is proprietary. I'd be pleased to find that the game works with injected SMAA and even more pleased to find that it includes it as a built-in option.
 
On a related note, the requirements for running the upcoming game Watch Dogs on ultra settings have been confirmed to be an i7-4770K and a 780. It seems like the time of crazy requirements is upon us.
 
I really don't get how; the game doesn't look that good. But then I remember it's Ubisoft, and AC4 barely used 2 of my 6 cores, so I won't put much faith in overblown system requirements.

If the PS4's $100-150 GPU can pull it off, I don't see any reason why we'd need a $500+ card; it's just insane.
 