The Witcher 3: Wild Hunt - PC System Requirements are here!


 
An i5 with a GTX 970 here, an i7 with SLI'd 980s there... am I really the only one here who hopes to be able to play the game at least on low at 30 fps with a G3258 and a GTX 750 Ti? :sad:
 
Well, there you go, found your problem. That's not exactly a new affair with Ubi games (or Arma, for that matter), but it doesn't make the i5 a bottlenecking CPU, just not the optimal hardware for certain applications, not 'most' applications. In the same way, any i5 will be beaten by an i7 at encoding videos, because encoding is very thread-intensive, while video games for the most part rely on single-threaded performance.

As far as I know, the easiest way to check for a CPU bottleneck is to drop the resolution: if the framerate improves considerably, it's not a CPU bottleneck; if it stays the same, it is. Watch_Dogs does this, drop the resolution and the framerate stays the same, hence a CPU bottleneck. Tested at as low as 800x600.
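
In rough code form, that rule of thumb looks something like this, a minimal Python sketch where the 10% threshold is just an assumed cutoff for illustration:

```python
def looks_cpu_bound(fps_native, fps_low_res, threshold=0.10):
    """Rule of thumb: if dropping the resolution barely changes the
    framerate, the CPU (not the GPU) is the limiting factor.
    The 10% threshold is an arbitrary, illustrative choice."""
    return (fps_low_res - fps_native) / fps_native < threshold

# e.g. 55 fps at 1080p vs 57 fps at 800x600 -> likely CPU-bound
print(looks_cpu_bound(55, 57))  # True
```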

I don't think you need to drop the resolution to check for a CPU bottleneck; you can easily check it with Task Manager. AC Unity in this video is a good example of it.
Actually, no matter what application or game you use, if any of your CPU threads (even just one thread) is being used at 95%+, your CPU will hold back your GPU from rendering more frames. Most modern games (FC4, AC Unity, Dying Light, etc.) use hyperthreading, and The Witcher 3 will as well.
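
As a rough illustration of that per-thread check, here is a minimal Python sketch (assuming the third-party psutil package is installed) that prints per-core usage while the game runs; one core pegged near 100% while the GPU isn't fully loaded points to a CPU limit:

```python
import psutil  # third-party: pip install psutil

# Sample per-core CPU usage once per second for 30 seconds while the
# game is running. A single core stuck above ~95% while the others sit
# idle suggests a single-thread (CPU) bottleneck rather than a GPU one.
for _ in range(30):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    flag = "  <-- possible CPU bottleneck" if max(per_core) >= 95 else ""
    print(" ".join(f"{p:5.1f}" for p in per_core) + flag)
```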

Here is another good example of a CPU bottleneck with an i5 and a GTX 970 in Far Cry 4:
https://www.youtube.com/watch?v=Qg_sNyZnMCI
When core 2 exceeds 90%, GPU usage is no longer constant; it dips (even to 40%) and the framerate dips to 30 as well.

I don't know if The Witcher 3 is really CPU-heavy or not, but most open-world games are, because they need to load content continually while you play.
 
An i5 with a GTX 970 here, an i7 with SLI'd 980s there... am I really the only one here who hopes to be able to play the game at least on low at 30 fps with a G3258 and a GTX 750 Ti? :sad:

I'm wondering the same thing... I admit my knowledge of GPUs and the performance of cards is pretty poor.

So excuse my ignorance... I have an i5 3470, 8 GB RAM and an EVGA GTX 750 Ti, and despite the initial doubts I managed to play Shadow of Mordor with most settings on High at 45 fps. To me that is ample; I will never be able to run/afford a system that can run the latest games at ultra.

I understand that The Witcher 3 will be more demanding than SoM, but to the extent that I will have to run everything on low settings and still maybe only get 30 fps?
 
the i5 would be a bottleneck for the 980

The only game in which my i5 4690K reached 100% utilization, thus dropping fps (a bottleneck), is Assassin's Creed Unity, and there I am talking about GTX 980 SLI with the game running over 119 fps, so there is no way a single 980 will be bottlenecked.

An i5 with a GTX 970 here, an i7 with SLI'd 980s there... am I really the only one here who hopes to be able to play the game at least on low at 30 fps with a G3258 and a GTX 750 Ti?

Can't really say mate, we don't know how demanding it will be on low or medium; we just know that it was running at almost 60 fps with an i7 4790K and a GTX 980 at ultra settings (as revealed by the GameStar guys).

Here is another good example of a CPU bottleneck with an i5 and a GTX 970 in Far Cry 4:
https://www.youtube.com/watch?v=Qg_sNyZnMCI
When core 2 exceeds 90%, GPU usage is no longer constant; it dips (even to 40%) and the framerate dips to 30 as well.

That's the problem with shoddy code not utilizing all cores properly. TW3 looks like an optimized game, so it should distribute load evenly across all cores.
 
That's the problem with shoddy code not utilizing all cores properly. TW3 looks like an optimized game, so it should distribute load evenly across all cores.


Furthermore, a game that's limited by the performance of a single core won't benefit from hyperthreading. One way this can arise is with Lua scripting, since Lua doesn't do multithreading. If you have a lot of scripted activity going on (and an open world with many actors will have a lot of it), you're limited by how fast the thread running the scripts can execute.

DirectX 11 also imposes a single-thread bottleneck, since you're feeding the GPU from one thread (even if you're queuing up commands in several threads). This won't be relieved until we have DirectX 12 or Vulkan to work with.
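
To make that concrete, here is a minimal Python sketch, purely a conceptual analogy and not real D3D11 code: several worker threads record "command lists" in parallel, but only one thread is allowed to submit them, so total throughput is capped by that single submission thread no matter how many workers you add:

```python
import queue
import threading
import time

submit_queue = queue.Queue()

def worker(n_batches):
    for _ in range(n_batches):
        time.sleep(0.001)             # "recording a command list" (parallel)
        submit_queue.put("cmdlist")

def submitter(total):
    for _ in range(total):
        submit_queue.get()
        time.sleep(0.002)             # "submitting to the GPU" (serial!)

workers = [threading.Thread(target=worker, args=(100,)) for _ in range(4)]
sub = threading.Thread(target=submitter, args=(400,))
start = time.time()
for t in workers:
    t.start()
sub.start()
for t in workers:
    t.join()
sub.join()
# Elapsed time is dominated by the serial submission (~400 * 2 ms),
# regardless of how many worker threads record command lists.
print(f"elapsed: {time.time() - start:.2f}s")
```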

When you're limited by how fast one or two threads can execute, a really fast dual core is just as effective as an 8-bogocore FX or i7.
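
That intuition is basically Amdahl's law. A back-of-the-envelope sketch in Python, where the 70% serial share is just an assumed example rather than a measured figure for any real game:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the fraction of work that can run in parallel
# and n is the number of cores.
def speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical frame where 70% of the time is stuck on one thread
# (scripts, draw-call submission), so only 30% scales with core count.
p = 0.30
for cores in (2, 4, 8):
    print(f"{cores} cores -> {speedup(p, cores):.2f}x")
# 2 cores -> 1.18x, 4 cores -> 1.29x, 8 cores -> 1.36x: going from
# 2 to 8 cores barely helps, while a faster clock speeds up everything.
```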

But another issue is cache. Instructions and data get used from cache, not main memory; the CPU uses main memory as a backing store for its cache. Core i7s have a lot more cache (8 MB L3 vs. 6 MB in Core i5s) and this may affect performance as much as hyperthreading does.
 
I believe what tahirahmed says above is probably correct. The i5 4460 is not the most powerful CPU out there, but I think it's pretty difficult for it to bottleneck a single graphics card unless it's one of the dual-GPU monsters or you're using SLI/CrossFire.
 
Can't really say mate, we don't know how demanding it will be on low or medium; we just know that it was running at almost 60 fps with an i7 4790K and a GTX 980 at ultra settings (as revealed by the GameStar guys).
Yep, I know that nobody knows (:D), but I'm rather more skeptical about the CPU than the GPU, as it is just a dual core.
 
Yep, I know that nobody knows (:D), but I'm rather more skeptical about the CPU than the GPU, as it is just a dual core.

Well, a friend of mine is using a Core i3 2100, which is a dual core with four threads, with 8 GB RAM and a GTX 660 on a 1680x1050 monitor, and he's still okay with his system. None of the games give him big issues, but at the same time he doesn't ask for a lot; he's just looking to run them at 30-40 fps with settings as high as possible while keeping that frame rate stable.

So the thing is, you may be able to run The Witcher 3 with a dual core, but how much of a bottleneck you'll face we don't know; it will be pushing your CPU for sure. The best thing you can do is wait for the game to come out and then read some performance reviews to get an idea of how punishing it is on the CPU and GPU, and then decide, because getting a high-end i5 or i7 won't give you a huge benefit if your GPU is lagging behind. The game is going to be costly on both.
 
What do you think? Will there be a difference in performance when entering a big city? FPS drops with a GTX 980 or equivalent?


Here a guy talks about the difference between the GTX 970 and the 980 in regard to The Witcher 3, and it turns out that the 970 should run pretty similar to the 980: it should run on ultra at 1080p at 50-55 fps, and that sounds good to me ;)

http://www.reddit.com/r/witcher/comments/3077p2/regarding_the_980_vs_the_970_when_it_comes_to/

I'm worried about the same issue, because the 980 has 4 GB of VRAM and the 970 effectively has 3.5, and the performance of the 970 is 10-15% lower, so if the 980 reaches 60 fps, the 970 reaches somewhere in the low 50s. I think the game maxed out with everything on ultra, HairWorks, uber, etc., won't pass 40 fps, and in large populated cities or forests it'll be around 30 fps or lower, but we can't be certain, because the game isn't out yet.
 
Performance hits on entering an area with many actors are more likely to come from a CPU bottleneck. Unless they have a scripting language more thread-friendly than Lua, all the actors have to be evaluated on one thread.

The 3.5GB canard needs to be taken out and shot.
 
The recommended GTX 770 has 2GB of VRAM. I don't think the game will reach 3.5 gigs at 1080p, even with all graphical bells and whistles activated.

Unless CDPR delivers a crazy-high-resolution texture pack like Shadow of Mordor did (which really didn't make that much of a difference compared to the next lower setting).
 
I very recently got a GTX 970. I tested it with DA:I just to see how it plays, because the game can be pretty demanding.

I managed to run it at 1080p with everything on ultra but no MSAA at an average of 75 fps. The maximum amount of VRAM the game ever used was 1700+ MB, a little below 1800, after 2+ hours of gaming.

My point is that the 3.5 GB is waaaay more than enough for every game that will be coming out this year. The last 500 MB, which run slower, are still there just in case, but I doubt they will ever be needed for now :)
 
The recommended GTX 770 has 2GB of VRAM. I don't think the game will reach 3.5 gigs at 1080p, even with all graphical bells and whistles activated.

Unless CDPR delivers a crazy-high-resolution texture pack like Shadow of Mordor did (which really didn't make that much of a difference compared to the next lower setting).

Does ubersampling qualify...? I doubt it, but could it reach 3-3.5 GB of VRAM?

And yeah I know it's coming post launch, not at release.
 