The Witcher 3: Wild Hunt - PC System Requirements are here!

Recommends an i7?
Taking these with a grain of salt in a very big glass of water.

I suspect an R9 490 will be needed for a solid 60 fps @ 1080p maxed (hope not).
 
Nice, I was a bit concerned about my CPU (i5-2500K, still at stock settings). OC that and grab the 970/980 (dunno which one yet) I've always wanted, and I should be good to go!
 
Hopefully the CPU requirements are just overblown. Right now I'm playing AssCreed Unity on my 4670K underclocked to 2 GHz, and I still get the same 50-60 fps as when I'm running at the stock 3.4 GHz (GTX 970), though the CPU usage obviously goes up. Basically, most Intel desktop CPUs have lots of headroom, especially compared to the weak console CPUs.

I was even playing Tomb Raider and Witcher 2 at 1 GHz last night! And still 60 fps (maybe some stuttering now and then).
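
To put rough numbers on why that works: a frame can't finish faster than the slower of the CPU and GPU sides, so downclocking the CPU is nearly free until its share of the frame overtakes the GPU's. A toy sketch of that model (the timings are made-up illustrative values, not measurements):

```python
# Toy frame-time model: each frame is gated by whichever of the CPU
# or GPU takes longer. All timings below are illustrative guesses.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate fps when CPU and GPU work overlap per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

GPU_MS = 16.0                          # assumed GPU cost per frame
for clock_ghz in (3.4, 2.0, 1.0):
    # Assume CPU frame cost scales inversely with clock speed.
    cpu_ms = 8.0 * (3.4 / clock_ghz)
    print(f"{clock_ghz} GHz -> {fps(cpu_ms, GPU_MS):.0f} fps")
```

At 3.4 and 2.0 GHz the GPU is still the bottleneck, so the fps doesn't move; only around 1 GHz does the CPU become the limit, which matches the occasional stuttering.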
 
Any word on the VRAM requirements for the highest texture quality at 4K? Will a multi-GPU setup of 290Xs or 980s with their 4 GB of VRAM be able to do it, or do we need 6-8 GB cards to make that happen?
 
SLI or Crossfire doesn't matter, because VRAM doesn't stack: you can have 4x 4 GB cards and you'll still only be able to use 4 GB.

And obviously the VRAM requirements will shoot up if you want to do 4K/supersampling/ubersampling.
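
A quick back-of-the-envelope on both points, assuming classic AFR SLI/Crossfire (each card keeps a full copy of textures and render targets) and a very rough per-pixel buffer cost:

```python
# VRAM back-of-the-envelope. Under AFR SLI/Crossfire every GPU mirrors
# the full set of assets, so usable VRAM is the smallest card's pool,
# not the sum across cards.

def effective_vram_gb(cards_gb):
    return min(cards_gb)                     # AFR mirrors data per GPU

print(effective_vram_gb([4, 4, 4, 4]))       # 4x 4GB cards -> 4 GB usable

# Render-target cost grows linearly with pixel count, so 4K alone is
# 4x the 1080p cost, before any supersampling multiplier on top.
def rt_cost_mb(width, height, bytes_per_pixel=16, supersample=1.0):
    # bytes_per_pixel = 16 is a crude stand-in for a few stacked buffers.
    pixels = width * height * supersample ** 2
    return pixels * bytes_per_pixel / 2**20

print(f"1080p:               {rt_cost_mb(1920, 1080):7.0f} MB")
print(f"4K:                  {rt_cost_mb(3840, 2160):7.0f} MB")
print(f"4K, 2x supersample:  {rt_cost_mb(3840, 2160, supersample=2):7.0f} MB")
```

The actual numbers depend entirely on the engine; the point is just the scaling: multi-GPU buys fill rate, not memory, and resolution multiplies the buffer cost.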
 
Intel® Core™ i5-3570K @ 3.4 GHz
GIGABYTE GTX 760 OC 2GB GDDR5 rev 2.0
8GB DDR3 @ 1600 MHz

Hoping for High+ settings with some features like AA and Sync off.
 
Intel® Core™ i5-3570K @ 3.4 GHz
GIGABYTE GTX 760 OC 2GB GDDR5 rev 2.0
8GB DDR3 @ 1600 MHz

Hoping for High+ settings with some features like AA and Sync off.
I'm guessing you need a 770 to be able to run those settings at 1080p with a smooth frame rate. The 760 is still quite a bit slower than a 770.
 
I know it's a bit slower than a 770, but my specs are still above the console "specs". CDPR said the console version is supposed to be running on High. That being said, I'm expecting to run W3 on High, maybe with some settings even set to Ultra, but with AA and Sync off. In other words, a mix between Ultra and High.

P.S. I will be running it at 1680x1050, since that's the native resolution of my monitor. That should also give me a boost, since I'm not going for 1080p (my monitor doesn't allow it).
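
That boost is easy to put a number on, since fill-rate-bound load scales roughly with pixel count:

```python
# Rough pixel-count comparison between the two resolutions.
px_1050 = 1680 * 1050      # 1,764,000 pixels
px_1080 = 1920 * 1080      # 2,073,600 pixels
print(f"{px_1050 / px_1080:.0%} of the 1080p pixel load")   # ~85%
```

So roughly 15% fewer pixels to shade per frame, all else being equal.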
 
I know it's a bit slower than a 770, but my specs are still above the console "specs". CDPR said the console version is supposed to be running on High. That being said, I'm expecting to run W3 on High, maybe with some settings even set to Ultra, but with AA and Sync off. In other words, a mix between Ultra and High.

P.S. I will be running it at 1680x1050, since that's the native resolution of my monitor. That should also give me a boost, since I'm not going for 1080p (my monitor doesn't allow it).
A good point. But we still don't know the native resolutions of the console versions.
 
A good point. But we still don't know the native resolutions of the console versions.

Well, I doubt they will reach 1080p... Even if they do, it won't be 60 fps for sure, just a 30 fps lock. I'd guess 900p for PS4/Xbox One.
 
Yes! I meet the recommended specs (thanks to a recent overall upgrade), matching the RAM and CPU but not the GPU, which is a GTX 760. But considering I play at 1280x1024 because of my old monitor, I think my rig will handle it well.
 
Now if only we had a list of graphics options :) I wonder if "Ultra" for textures and shadows is going to use 4K textures (which you don't need unless you have a 4K monitor).
Hoping for "real" AA options this time around.
 
Now if only we had a list of graphics options :) I wonder if "Ultra" for textures and shadows is going to use 4K textures (which you don't need unless you have a 4K monitor).
Hoping for "real" AA options this time around.

Not to get too far off subject, but does anybody else find AA to have a barely noticeable effect outside of screenshots? Whenever a game is in motion, I never notice much benefit from it other than a performance chug.
 
Not to get too far off subject, but does anybody else find AA to have a barely noticeable effect outside of screenshots? Whenever a game is in motion, I never notice much benefit from it other than a performance chug.

Same here. Even in W2, when switching AA on and off, I can barely see any difference.
 
Same here. Even in W2, when switching AA on and off, I can barely see any difference.

I messed around with it quite a bit in AC4. Couldn't spot any difference outside of far-off tree clusters, and even then it was more like it was getting rid of a mild fuzz that didn't bother me anyway. At any rate, I imagine it will be the first thing I turn off for Witcher 3.
 
Not to get too far off subject, but does anybody else find AA to have a barely noticeable effect outside of screenshots? Whenever a game is in motion, I never notice much benefit from it other than a performance chug.

Depends on what type of game you play.

It does make a huge difference in titles such as Steel Beasts Pro, where thin, hard-edged barrels, antennas, etc. shimmer unpleasantly without AA enabled, and those can be important recognition features, or the most visible part of a concealed vehicle. The framerate hit is minimal compared to that of the terrain detail and visibility options: draw distances are routinely out to 4 km (for vehicles with rendered thermal imagers, at least) and can be over 12 km, so the terrain and rendering engine is the main limiting factor (though the number of actors, and the LOS/routing checks that come with high unit density, can be enough to bring most machines to their knees if restraint isn't used).

It's also relevant in flight simulators, where narrow, hard-edged wings and empennage shimmer if not sub-pixel filtered.
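
Thin edges are exactly the case where sub-pixel sampling earns its keep: a sub-pixel-wide barrel or wing edge either hits or misses a pixel's single sample, so it pops in and out as it moves, while averaging several samples per pixel turns that into stable partial coverage. A minimal supersampling sketch of the idea (pure illustration, not any engine's actual AA path):

```python
# Why sub-pixel sampling stabilizes thin edges: "render" a vertical
# line 0.3 px wide sliding across one pixel. One sample per pixel
# snaps between 0 and 1 (shimmer); 16 samples report fractional
# coverage instead.

def coverage(line_x: float, width: float, samples: int) -> float:
    """Fraction of evenly spaced samples in pixel 0 covered by the line."""
    hits = sum(line_x <= (i + 0.5) / samples < line_x + width
               for i in range(samples))
    return hits / samples

for line_x in (0.1, 0.4, 0.7):               # line sliding across the pixel
    print(f"line at {line_x:.1f}: 1 sample -> {coverage(line_x, 0.3, 1):.2f}, "
          f"16 samples -> {coverage(line_x, 0.3, 16):.2f}")
```

The single-sample column flips between 0.00 and 1.00 as the line moves (that's the shimmer), while the 16-sample column stays around 0.25-0.31.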
 