Predicted Witcher 3 system specs? Can I run it?

In the case of The Witcher 2 and possibly 3, performance is tied closely to pixel count, and 1440p has about 1.78x the pixels of 1080p. I think it will take a lot more than a 25% to 40% bump in GPU power to get the same results at 1440p. Still, @Vigilance, you're right; it may require substantial cards in SLI or Crossfire to get high performance at 1440p.
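For anyone who wants to check the pixel math themselves, here's a quick Python sketch (it just assumes the usual 1920x1080, 2560x1440, and 3840x2160 resolutions):

# Rough pixel-count comparison between common resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}
base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
# Prints roughly:
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)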
 
You might be right, but the required jump isn't huge.

My initial math might have been off... Looking at some benchmarks, a 780 Ti is about 60% more powerful than a 680, and a single 780 Ti achieves about the same framerate at 1440p as a 680 does at 1080p. So the window is probably not 25-40% and more like 30-60%. However, it all depends heavily on the GPU, engine, optimization, etc.
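To make that estimate explicit, here's a back-of-the-envelope sketch in Python (the 60% uplift is just the rough benchmark figure quoted above, not a measured value):

# Back-of-the-envelope: how much extra GPU did it take to hold the
# same framerate at 1440p, per the benchmarks mentioned above?
pixels_1080p = 1920 * 1080
pixels_1440p = 2560 * 1440

pixel_increase = pixels_1440p / pixels_1080p  # ~1.78x more pixels to render
gpu_uplift = 1.60                             # 780 Ti vs 680, rough figure from the benchmarks

print(f"Pixels: {pixel_increase:.2f}x, GPU power that matched it: {gpu_uplift:.2f}x")
# ~60% more GPU covering ~78% more pixels suggests the cost grows a bit
# slower than the raw pixel count, hence the loose 30-60% window.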

So it definitely won't be as easy as someone going, "I'll buy a 980 rather than a 970 and play 1440p at the same framerate as 1080p," because that simply isn't enough juice. However, going from a single 970 to SLI would definitely allow someone to bump the resolution up to 1440p, and they'd probably get even better FPS than the single card at 1080p. It always comes down to money and how much someone is willing to spend; 1440p is certainly achievable (most likely unlike 4K), and it definitely provides a nice, obvious boost over 1080p.
 
1440p is achievable, yes, but it's not as easy as some may believe, because the pixel count is quite a bit higher. As for the framerate, the bus width also matters, depending on how much higher you push the resolution.
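As a rough illustration of why bus width matters as the resolution climbs, here's a small bandwidth sketch in Python (the bus widths and memory clocks below are the published specs for those cards, but treat the output as approximate):

# Memory bandwidth is roughly (bus width in bytes) x (effective memory clock).
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return (bus_width_bits / 8) * effective_clock_mhz / 1000  # GB/s

# Approximate reference specs; double-check the exact numbers per card.
cards = {
    "GTX 970 (256-bit bus, 7010 MHz effective)": (256, 7010),
    "GTX 780 Ti (384-bit bus, 7000 MHz effective)": (384, 7000),
}

for name, (bus_bits, clock_mhz) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bus_bits, clock_mhz):.0f} GB/s")

# More pixels per frame means more data over that bus every frame, which is
# why a wider bus (or faster memory) helps as the resolution goes up.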
 

Yep. Even if my math is correct (and I'm sure there's something wrong with it somewhere), 30-60% is no laughing matter. It pretty much means you more than likely have to go SLI if you want the most recent and best-looking games at max settings/60 FPS, and that's where the $$$ comes into play.

I'm still waiting and praying that come January or early February we'll see revised 980s/970s, or a new card with a wider bus and more VRAM.
 
Hopefully dual GTX 970s are up for the job, since I just recently ordered two MSI Gaming 4G cards.
 
If you find two aren't benefiting you, send one this way :p
 
Hey guys, just dropped in to get an evaluation.
I'm planning to upgrade my PC before TW3 comes out; I'm thinking of getting a GeForce GTX 960 with at least 2-4 GB of VRAM.

But my processor is pretty old. I still have an Intel Core 2 Quad at 2.33 GHz; up until now it was enough for every game I played, but it could get problematic with next gen. So my question is: new processor, yes or no? (If I get a new one, it will probably be an i5 at 3.5 GHz.) It's a tough decision for me, because a new processor means a new mainboard (probably MSI or Gigabyte).

If I get a new processor, I might want to buy a cheaper card instead, such as a GTX 660 or 760; would that be enough then? I don't want to run the game on max, I just want decent medium settings.

(Additional info: I'm on Win 7 with 8 GB RAM, and my monitor is a Samsung SyncMaster at 1680x1050.)

I ran the first one on an eMachines with a Pentium D 945 (3.4 GHz) and an Nvidia 9800 GT, and that did fine.

Man, I ran The Witcher 2 with a 9800 GT (Intel Core 2 Quad at 2.33 GHz and 8 GB RAM) on medium settings and it held 30 FPS most of the time.
 
Since you have to get a new motherboard anyway, try looking into AMD CPUs. They're cheaper than Intel and still great for gaming. You could maybe get an AMD CPU and still afford a 960 or 770.

http://www.logicalincrements.com/

Some suggestions. That site isn't the be-all and end-all, but it has good suggestions.

And http://pcpartpicker.com/ for finding the cheapest prices on the mobo, CPU, and GPU.
 

Idk, I've never had an AMD CPU. I don't know a lot about them and have been very happy with Intel so far, especially since I also work with Nvidia most of the time.
I'll look into it, thanks for the reply.
 

You'll probably have to change the CPU to avoid a bottleneck. A good i5 with a 760 could give you good performance. Anyway, if you can, it may be worth waiting a little for the 960 to see its price and performance.

P.S. Personal opinion: I'd stick with Intel for the CPU.
 
That's the reason why I was worried about CDPR going with Nvidia in the development of TW3.

http://www.dsogaming.com/news/far-c...recommends-gtx-680-requires-30gb-of-free-hdd/

It's a Ubisoft game, so I'm not really surprised, but what's extremely odd is that they're pairing the GTX 680 against the R9 290X. Even the R9 290 is around 30% faster than the GTX 680, let alone the R9 290X, so this means the game will either run like crap on AMD GPUs or is simply poorly optimized for AMD.

I hope CDPR optimizes their game fairly for all hardware and doesn't end up Nvidia-biased.
 

Fairness and bias are loaded terms that may be thrown around with little concern for how the industry actually manages to get things done.

You work with the partners who will work with you, the way you want to. "Fairness" is an expensive luxury, and "bias" is a cheap accusation.

If nVidia provides better middleware and assistance, in a way your team can use effectively, you work with nVidia. If AMD does, you work with AMD. It's not any more complicated than that.
 

You work with your partners, yes, but you don't work for them; your end goal is to please the market, not the partners. Your game should work well for all possible customers out there, not just the subset that owns Nvidia hardware.

In any case, it doesn't make sense to compare the GTX 680 with the R9 290X when the latter is around 30% faster or more; the word "bias" makes sense here.
 
I think AMD cards will get optimized just as well. After all, the new consoles are using AMD technology, aren't they?
 

Actually, you work with the partners your effort is best spent working with, not toward a market objective you cannot achieve on a limited budget and schedule.

Accusations that CDPR is intentionally supporting nVidia, Microsoft, or any other partner at the expense of others are inherently unprovable and unfair.
 

Yes, that's right. I just hope their optimizations scale well on PC, for example when you opt for 1080p at 60 FPS with all the bells and whistles.


Well, without turning this into an argument, I'll just clarify my previous statements.

I said I "hope" they don't go Nvidia-biased, not that they "will" go Nvidia-biased (which would be accusatory). The word "bias" was used as an example of what Ubisoft is doing to AMD users; it wasn't intended as an accusation towards CDPR.

There are several games out there that use Nvidia tech but still perform well on AMD hardware and, above all, list their system requirements on an equivalent basis. I only wish that CDPR also makes a game that works well on both platforms, unlike Ubisoft, which clearly favors Nvidia due to their close partnership.
 
Now, I have a GeForce 780 with 3 GB of VRAM, 16 GB of RAM, and an Intel Core i7 4770. I want to know if, with this, I can play the game at 1080p on high settings at 60 frames or more. A yes or no is enough for me, because I'm thinking of buying a GTX 980, but if my rig is enough to play it, why spend the money? (I already bought the game on GOG.)
 
I wouldn't worry about that; Ubisoft hasn't been a good example of PC ports lately in the first place, so Nvidia or AMD doesn't really matter.

Obviously TW3 is part of NV's TWIMTBP program, so there are going to be some GPU-specific goodies, like TXAA for example, but other than those things it shouldn't matter what GPU you have; most of the PhysX SDK runs on the CPU now, not the GPU, and only certain effects need the GPU.
 