The Witcher 3: Wild Hunt - PC System Requirements are here!

And I wouldn't get a 'Titan'-branded card for gaming under any circumstances, since the price is plain robbery, period.

Two silver linings on the horizon: AMD's Fiji GPU (R9 390(X)), which will most likely force Nvidia to introduce cheaper SKUs of their GM200 GPU (a GTX 980 Ti, or whatever they may call it), plus the ensuing price competition once all these cards are out. The downside: this will probably not happen in time for TW3's release.
 
I'm still not ready to get sad; it really doesn't make sense to me...

If you go look at some of the other benchmarks for the Titan X, you've got games like DA:I, FC4 and Crysis 3 all running at fantastic framerates on Max settings at 1440p. DA:I is sitting at 79 FPS, FC4 at 76 and Crysis 3 at 85.
Based on the gameplay videos, I absolutely CANNOT see a quality difference between something like DA:I/FC4 and TW3 on High settings that would warrant such a huge reduction in FPS. Using an even closer example: Shadow of Mordor, which has day/night cycles, a weather system, tons of AI work going on, an open world and those crazy Ultra textures, still runs super well - 86 FPS.
If TW3 at High/1440p ends up only running at 30-40 FPS on a Titan X, I'm sorry, but that's unoptimized as fuck.

Honestly, I wouldn't get sad just yet.

Just as an FYI, the Titan X didn't run Crysis 3 at max settings at 85 FPS; it ran on high settings using FXAA, which is much less demanding. I assume you used the AnandTech review. Using 2x MSAA, an overclocked Titan X did achieve 61 FPS average (source: Linus Tech Tips). So to be fair, you wouldn't actually get 60 FPS on max settings playing Crysis 3 with a Titan X unless, obviously, you overclocked it as well.

I agree that 30 FPS on high settings with a Titan is unoptimized, but let's be fair and stress that high settings and ultra settings will be extremely different in terms of how demanding they are, since ultra settings use a lot of extra graphical features as well ("fur tech", possibly ubersampling, etc.). If your computer isn't bottlenecked anywhere, I'd say achieving 30 FPS on ultra settings (with all extra features enabled) is quite fair, especially since anti-aliasing is a demanding feature and ubersampling just fries computers. Even today, playing The Witcher 2 with ubersampling requires quite a lot of muscle.

The Witcher 2 was honestly a bit ahead of its time in terms of the system needed to play it on max settings (or, put another way, all other games are still way behind), and The Witcher 3 seems to follow the same curve. If anything, that's good for the consumer market, because it pushes Nvidia and AMD to further develop better hardware, and in a couple of years, when you'll be able to pick up a GTX 980's performance for 200 € and a Titan X's performance for 350 €, you'll be able to play an "older" game that still looks damn fine.
 
And I wouldn't get a 'Titan'-branded card for gaming under any circumstances, since the price is plain robbery, period.

It actually isn't as bad with the Titan X this time around. It's still completely skewed in terms of price/performance, but people who buy Titans generally don't seem to care much about how badly they're getting ripped off, which is why it's strange that Nvidia went easy on the X: it's actually better (though still not good) in terms of price/perf, especially compared to the OG Titan or Titan Black.

I agree that 30 FPS on high settings with a Titan is unoptimized, but let's be fair and stress that high settings and ultra settings will be extremely different in terms of how demanding they are, since ultra settings use a lot of extra graphical features as well ("fur tech", possibly ubersampling, etc.). If your computer isn't bottlenecked anywhere, I'd say achieving 30 FPS on ultra settings (with all extra features enabled) is quite fair, especially since anti-aliasing is a demanding feature and ubersampling just fries computers. Even today, playing The Witcher 2 with ubersampling requires quite a lot of muscle.

Yeah, that's why I kept my ramblings to just the 'High' standard. Ultra is a big wildcard at the moment, and I generally don't bother going into it, because much of what it actually adds is unknown.
I didn't realize Crysis 3 wasn't on its max settings, but I guess that's why it's Crysis. FC4 and DA:I, however, are on their highest (besides AA), and you're still looking at the high 70s in FPS @ 1440p. The Witcher 3 on High, from what we've seen, appears to look roughly like those games near Max, so as I said, I'm failing to see how TW3 couldn't do High/1440/60. And as you said, 30 FPS for Ultra seems fair enough taking that into consideration, but it's still too much of a wildcard to make any solid guesses. That said, 30 FPS is still playable, and if Ultra looks that good, then with the resolution bump of 1440p it might be worth sacrificing 60 FPS - depends on preference (I wouldn't, frankly).

I'm just saying: don't get upset too early; wait for the benchmarks. TW3's requirements seem a tad off to begin with, but then trying to use that data to draw conclusions about a GPU and how it will perform at settings we don't know, at a resolution these requirements aren't designed for - there are too many variables. Wait and see.
 
It actually isn't as bad with the Titan X this time around. The OG Titan was like walking up to a table of four and stealing everyone's meal. The Titan X is like doing the same but only stealing two meals. It's still completely skewed in terms of price/performance, but people who buy Titans generally don't seem to care much about how badly they're getting ripped off, which is why it's strange that Nvidia went easy on the X: it's actually better (though still not good) in terms of price/perf, especially compared to the OG Titan or Titan Black.
Here in Euroland, the X is listed at 1150 bucks. That's even worse than the original Titan - and without any high double-precision (DP) performance for professionals, or some similar excuse, to justify it.
Out of the box, the Titan X is nothing but a GTX 980 plus some 35% performance-wise, for a +100% price tag.
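
To put rough numbers on that, here's a quick back-of-the-envelope sketch. The 1150 € Titan X price and the +35% performance figure are from this post; the ~575 € GTX 980 price is an assumption derived from the +100% claim, not a quoted listing:

```python
# Rough perf-per-euro comparison using the figures from this post:
# Titan X at 1150 EUR and +35% performance over a GTX 980, with the
# 980 assumed at ~575 EUR (half the Titan X, per the +100% claim).

cards = {
    "GTX 980": {"price_eur": 575, "rel_perf": 1.00},
    "Titan X": {"price_eur": 1150, "rel_perf": 1.35},
}

for name, c in cards.items():
    eur_per_perf = c["price_eur"] / c["rel_perf"]
    print(f"{name}: {eur_per_perf:.0f} EUR per unit of performance")

# GTX 980: 575 EUR/unit, Titan X: ~852 EUR/unit -- roughly 48% more
# money per frame, which is the "plain robbery" part.
```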
 
Here in Euroland, the X is listed at 1150 bucks. That's even worse than the original Titan - and without any high double-precision (DP) performance for professionals, or some similar excuse, to justify it.
Out of the box, the Titan X is nothing but a GTX 980 plus some 35% performance-wise, for a +100% price tag.

Well, if that's the deal in euros, then that does seem a bit crazy. I'm not too sure how everything is priced in euros, but having followed US prices on all these GPUs for the last few years, the X is a much better deal than the OG Titan - though it's all relative.

Anyway, we've probably discussed the X enough; we'll have to wait and see what Ultra does and how those benchmarks come out.
 
Here in Euroland, the X is listed at 1150 bucks. That's even worse than the original Titan - and without any high double-precision (DP) performance for professionals, or some similar excuse, to justify it.
Out of the box, the Titan X is nothing but a GTX 980 plus some 35% performance-wise, for a +100% price tag.

For my part, I think the same. I'm pro-Nvidia, so I will wait for the replacement of the GTX 980; since I have a GTX 780 now, I could wait one or two years before requirements force me to upgrade. And right now the Nvidia Titan X is plain robbery (1,250 €, my goodness). I think The Witcher 3 will for sure run on Ultra at 720-1080p on a GTX 980, and at 4K on a Titan X (perhaps created for that purpose). Also, as M4xw0lf said, new Radeon GPUs will arrive this year - not in time for The Witcher 3, but clearly good enough to run it on Ultra settings after launch. I think I will play on High at 1080p this year, and after I upgrade, try Ultra.
 
Hello everyone, I'm a newbie here. Right now I'm counting every day until the release, but when I saw this post it scared the hell out of me :(.
I just want to ask if my PC can handle this game or not, and if it can, at what settings - low, medium, high? Sorry, my English is not as good as you guys' ^^
Here is my PC:
Intel Core 2 Quad Q8400 @ 2.66 GHz
Asus Strix Nvidia GeForce GTX 750 Ti OC 2 GB GDDR5
8 GB RAM @ 1333 MHz
Windows 7 64-bit

Thank you!
 
Hello everyone, I'm a newbie here. Right now I'm counting every day until the release, but when I saw this post it scared the hell out of me :(.
I just want to ask if my PC can handle this game or not, and if it can, at what settings - low, medium, high? Sorry, my English is not as good as you guys' ^^
Here is my PC:
Intel Core 2 Quad Q8400 @ 2.66 GHz
Asus Strix Nvidia GeForce GTX 750 Ti OC 2 GB GDDR5
8 GB RAM @ 1333 MHz
Windows 7 64-bit

Thank you!

You'll probably be aiming for low settings at a low resolution, since you're below the minimum requirements. If you plan to upgrade, it would require a complete rebuild, since your CPU/socket is quite obsolete.
 
This means GTX 970 owners can also enjoy 60 FPS with a little overclock, because it becomes a 980 with overclocking? Good work, CDPR - hats off to you! I hope AMD also steps up now and releases that 300 series along with new drivers, so people still waiting to upgrade can go for it.
 
This means GTX 970 owners can also enjoy 60 FPS with a little overclock, because it becomes a 980 with overclocking?
Sadly, it does not. A cut die is always a cut die.

Although mine do overclock to 1.5 GHz with ease, which is quite impressive and does offer a nice performance boost.
 
Sadly, it does not. A cut die is always a cut die.

Although mine do overclock to 1.5 GHz with ease, which is quite impressive and does offer a nice performance boost.

Yes, but I am talking about performance numbers. I think if you give a 970 a moderate OC it will reach reference 980 performance quite easily (rough numbers in the sketch below), and you have 970s in SLI - with those you can easily handle it at Ultra :)
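
For a rough sanity check on that claim, here's a back-of-the-envelope throughput comparison using the public Maxwell spec-sheet numbers (core counts and reference boost clocks). It deliberately ignores the 970's cut ROPs and segmented memory - the "cut die" caveat above:

```python
# Theoretical FP32 throughput: CUDA cores x clock x 2 FLOPs/cycle (FMA).
# Core counts and reference boost clocks are the public Maxwell specs;
# 1.5 GHz is the overclock quoted above.

def tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Single-precision TFLOPS, assuming one FMA (2 FLOPs) per core per cycle."""
    return cuda_cores * clock_ghz * 2 / 1000

print(f"GTX 970 stock:    {tflops(1664, 1.178):.2f} TFLOPS")  # ~3.92
print(f"GTX 970 @1.5 GHz: {tflops(1664, 1.500):.2f} TFLOPS")  # ~4.99
print(f"GTX 980 stock:    {tflops(2048, 1.216):.2f} TFLOPS")  # ~4.98
```

So on raw shader throughput alone, a 1.5 GHz 970 does land right at a reference 980 - just keep in mind real games also lean on ROPs and memory bandwidth, where the cut die still shows.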

I am glad the game works well on a single 980, so with two of them in SLI, I can DSR it to 1440p until I get a real 1440p monitor. Yeah, I have a 1080p monitor at the moment :p
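
(For anyone wondering which DSR setting that is: the control panel expresses the factor as a ratio of pixel counts, so 1440p on a 1080p panel is the 1.78x option. A quick check:)

```python
# DSR renders at a higher resolution and downscales to the native panel.
# Nvidia expresses the DSR factor as a ratio of pixel counts:

native = 1920 * 1080   # 1080p panel
target = 2560 * 1440   # desired 1440p render resolution
print(f"DSR factor: {target / native:.2f}x")  # -> 1.78x
```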
 
Nice! My i7 5930K (6 cores @ 3.5 GHz), GTX 980 and 1080p HDTV are ready! I might even overclock these for some higher performance.

EDIT - Just overclocked the CPU to 4.3 GHz and the GPU to a 1450 MHz boost clock. Bring it, Witcher 3 :D
 
I have a question. I haven't upgraded my CPU in ages, because every new Intel generation was just a little bit faster, and I never overclock. I have the following system ATM and can't decide whether my current CPU will be a bottleneck for running TW3.

Intel Core i5-2400 (Sandy Bridge, 4x 3.10 GHz)
Nvidia GTX 970 4 GB
8 GB RAM
Windows 7 64-bit

My aim is to run at 1080p or 1440p with a mix of High/Ultra at 30 FPS. IIRC, @GuyNwah said that Sandy Bridge and newer CPUs should be OK, but maybe I misinterpreted.

What do you guys think - do I need to upgrade my CPU, or am I good?
 
The thing about Sandy Bridge is that it was a big advance, not so much in pure execution performance as in memory management. Sandy Bridge CPUs are still superior to Ivy Bridge and early Haswell in cache and main-memory access times. Sandy Bridge-E is still used in the server market, largely for that reason. Only the "4th generation" Haswells finally passed them.

On a desktop, the main limitation of Sandy Bridge is that it doesn't support PCIe 3.0. This isn't a big deal unless you are running high-performance cards in SLI/Crossfire (rough bandwidth numbers below). For a single high-performance card like yours, Sandy Bridge CPUs are well within any reasonable expectation of performance, and I wouldn't think of replacing one.
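
If you want the raw numbers behind that, here's a quick sketch of per-direction x16 bandwidth under each standard (PCIe 2.0 uses 8b/10b encoding, 3.0 uses the more efficient 128b/130b):

```python
# Per-direction bandwidth of a x16 slot: transfer rate x encoding
# efficiency x 16 lanes, converted from Gbit/s to GB/s.

def x16_bandwidth_gbs(gt_per_s: float, encoding: float) -> float:
    """GB/s per direction for 16 lanes (1 GT/s = 1 Gbit/s per lane)."""
    return gt_per_s * encoding * 16 / 8

print(f"PCIe 2.0 x16: {x16_bandwidth_gbs(5.0, 8/10):.2f} GB/s")     # 8.00
print(f"PCIe 3.0 x16: {x16_bandwidth_gbs(8.0, 128/130):.2f} GB/s")  # ~15.75
```

A single card rarely saturates even the 8 GB/s of 2.0, which is why the difference mostly shows up in multi-GPU setups running at x8/x8.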
 
Can I assume that my PC (i7 4770, 12 GB RAM, GTX 780 Ti (OC)) will handle the game on high settings with good FPS?
 