The GameStar Hands-On Session: Technical Matters and First Impressions

After some time thinking about it, I actually always suspected this. It seems that PC requirements are always up to the developer or publisher of the game and what they feel the requirements should mean. There is no perfect way to measure them exactly, or maybe there is, but it's not as if all game creators determine requirements the same way.
 
So if the recommended system requirements are for 30 fps, what are the minimum for? A 10 fps slideshow? I'm a die-hard fan of The Witcher franchise, but I am not going to spend $1000 just to get a playable framerate.
 
It means you get the framerate you need by lowering the graphics settings, just as it always does. Is there any reason why you would expect it to mean anything else?
 
Yeah, but I think all the worries come from the fact that the requirements seem pretty steep, and PC gamers always aim for 60 fps (though there's no need to reach that if the game runs smoothly even at lower framerates). And now all this talk about the 980 and such is lowering expectations. So I can understand why more performance was generally expected, also considering that this game was probably developed with the previous generation of GPUs in mind. If so, I hope the delays and the release of new GPUs don't excuse any laziness in the optimization process. But for now everyone must give CDPR time to optimize everything before judging, and I personally am quite optimistic about what they will come up with.
 
30 fps for recommended settings is nothing but plain bullshit. I really believe that he (Adam Badowski) simply doesn't know what he's talking about; otherwise, everything regarding the "optimisation" and "being fair to PC users" statements is in plain contradiction with his own words.

Seriously, a GTX 770/780 should kill that game on CONSOLE graphics settings, which are a mix of low and medium. What the hell. How could high-powered PCs run such settings at only 30 fps? That doesn't make any logical sense. At all.

Or maybe CDPR expects that every freakin' PC owner has a pair of 770s/780s, or TITANs, in their case?...

I'm planning on upgrading my PC with a GTX 970 Ti or 980 Ti, not sure yet which I'm gonna pick, but I'm sure as hell that I won't upgrade until I've experienced Witcher 3 on my current machine, which btw is on par with the recommended specs.
 
First, I think we should scale back a little bit and remember that when the Witcher 2 specs were released, nobody knew how the game would be or how it would play on substandard cards. Plus, the GTX 260 and the 4850 weren't exactly bargain-bin cards at the time either. It turned out that sub-par cards did really well with the game. Will it be the same for Witcher 3? I don't know, but much like with Witcher 2, time will tell.
EDIT: Almost forgot, when the system specs were revealed for W2, it was stated that a video card with 1 GB of VRAM was recommended, and a GTX 260 had only 896 MB, so I tend to take game specs with a grain of salt.
 

Whaaat. I had a 260 and could only manage 60 fps consistently on the very lowest settings. And it mattered: the game was buggy and would nosedive in responsiveness at lower framerates. That game actually caused me to upgrade my graphics card, twice.
 
I won't upgrade until I've experienced Witcher 3 on my current machine, which btw is on par with the recommended specs.
I thought you had a GTX 760?

Also, let's wait and see until the game is launched before we start accusing CDPR of making "unoptimized console ports". I've said this before, but I have no problem with a graphics-intensive game that eats little GPUs for breakfast if it means the game will look spectacular even several years from now.
 
Whaaat. I had a 260 and could only manage 60 fps consistently on the very lowest settings. And it mattered: the game was buggy and would nosedive in responsiveness at lower framerates. That game actually caused me to upgrade my graphics card, twice.
I never said anything about framerates. I also think all the hoopla about having to achieve a 60-plus framerate is BS. I played Witcher 2 at 30 fps and it was very playable at that rate. The only time it became unplayable was when it dipped into the low 20s. I do agree on higher framerates for shooters, though, where quick turns are a do-or-die situation.
 
These excuses about 30 fps on PCs are nothing but pathetic. Cool story, bro, but I bet people who own GPUs for high-end gaming are aiming for a bit more than that. You know, you could just own a console for okay-ish graphics and locked 30 fps performance, right?

Buying high-end cards and playing games at 30 fps makes no fucking sense, pal.

As for myself, I can't stand anything below 60 fps. The difference in experience is like night and day.

@eskimoe - my brother's rig is far above those recommended specs.
 
I never said anything about framerates. I also think all the hoopla about having to achieve a 60-plus framerate is BS. I played Witcher 2 at 30 fps and it was very playable at that rate. The only time it became unplayable was when it dipped into the low 20s. I do agree on higher framerates for shooters, though, where quick turns are a do-or-die situation.

You did not specifically use the words "frame rate", but I assumed that your saying certain cards "did well with the game" was a judgement on performance, and frame rate is a measure of performance.

I'm willing to give people the benefit of the doubt when they say they don't personally experience it. My experience has been that W2 was very temperamental and would only play nice with certain configurations. A few people on the Steam forums seemed to think that it's very system-dependent; I remember reading about someone moving to a supposedly more powerful card and getting worse performance. For me, everything from menu lag to Geralt's responsiveness in combat to the speed of the mouse pointer seemed to change with the frame rate. I've also heard that things got worse for some people after patches (and better for others). One thing I've noticed is that every increment of "maximum pre-rendered frames" in the NVIDIA settings adds an extra frame of input latency. Furthermore, if I force supersampling in NVIDIA Inspector, it causes several frames of input delay. It seems strangely coded, as if inputs won't get parsed until certain tasks have completed, even if those tasks last longer than a frame.
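
To put rough numbers on the pre-rendered-frames point (back-of-the-envelope reasoning on my part, not anything NVIDIA documents in these terms): each queued frame adds roughly one frame time of input latency, so the same queue depth hurts far more at a low framerate. A quick sketch:

```python
# Rough estimate of input latency added by queued (pre-rendered) frames:
# each queued frame delays input by roughly one frame time.
# Illustrative numbers only; real pipelines vary.

def added_latency_ms(fps, prerendered_frames):
    frame_time_ms = 1000.0 / fps
    return prerendered_frames * frame_time_ms

for fps in (60, 30, 20):
    for queue in (1, 2, 3):
        print(f"{fps} fps, queue depth {queue}: "
              f"~{added_latency_ms(fps, queue):.0f} ms extra")
```

At 60 fps a 3-deep queue adds ~50 ms, but at 20 fps the same queue adds ~150 ms, which fits my experience of responsiveness cratering at low framerates.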

Funny you should mention shooters: I played Crysis on a 260 and found the ~25 fps quite playable. I didn't experience "choppiness"; it was good at hiding it through parlour tricks (motion blur?). Maybe if I tried that today it wouldn't be the case, as I've grown accustomed to 60 fps, but it seems to me that a given fps in one game can feel very different from the same fps in another; one can seem like a slideshow and the other not. I'm not sure of the reason for it.
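
If I had to guess (pure speculation on my part), frame pacing is a big part of it: two games can average the same fps while one delivers frames evenly and the other in fast/slow bursts, and the bursty one is what reads as a slideshow. A toy illustration with made-up frame times:

```python
# Toy frame-pacing example: both traces average 25 fps (40 ms/frame),
# but one delivers frames evenly and the other in fast/slow bursts.
# The bursty trace's 65 ms spikes are what the eye perceives as stutter.

even_trace   = [40, 40, 40, 40, 40, 40]   # ms per frame
uneven_trace = [15, 65, 15, 65, 15, 65]   # same 40 ms average

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

for name, trace in (("even", even_trace), ("uneven", uneven_trace)):
    print(f"{name}: {avg_fps(trace):.0f} fps average, "
          f"worst frame {max(trace)} ms")
```

Both traces report 25 fps, so a plain fps counter can't tell them apart; steady pacing plus motion blur may be exactly the parlour trick that made Crysis feel smooth.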

And so this doesn't completely go off topic: I'm anxious as to whether Witcher 3 will play nicely with my PC or whether it will be a repeat of my Witcher 2 experience.
 
First, I think we should scale back a little bit and remember that when the Witcher 2 specs were released, nobody knew how the game would be or how it would play on substandard cards. Plus, the GTX 260 and the 4850 weren't exactly bargain-bin cards at the time either. It turned out that sub-par cards did really well with the game. Will it be the same for Witcher 3? I don't know, but much like with Witcher 2, time will tell.
EDIT: Almost forgot, when the system specs were revealed for W2, it was stated that a video card with 1 GB of VRAM was recommended, and a GTX 260 had only 896 MB, so I tend to take game specs with a grain of salt.

Yeah, exactly. I'm surprised at all the panic (and anger) over the system requirements when I thought it was common knowledge that they are always exaggerated. Partly to give themselves some cautionary room, and partly because these things are often established while the game is still being optimized. In this case we've been handed the system requirements 5 months ahead of the game.
 
@caruga
I certainly hope Witcher 3 plays nice with all computers, but if experience on this forum has taught me anything, it is that it probably won't play nice with some of the systems that people try to install the game on. ;)
 
These excuses about 30 fps on PCs are nothing but pathetic. Cool story, bro, but I bet people who own GPUs for high-end gaming are aiming for a bit more than that. You know, you could just own a console for okay-ish graphics and locked 30 fps performance, right?

Buying high-end cards and playing games at 30 fps makes no fucking sense, pal.
Take a deep breath, calm down and wait until the game is released. There's no reason to get your knickers in a twist at this point. It's still unfinished and unoptimized, and there were features disabled (like, apparently, some of the NVIDIA HairWorks stuff) that need to be taken into consideration before throwing wild accusations at the developers. When the game is released and benchmarked by various sites, if it turns out to be unoptimized, then by all means accuse the developers of making a shoddy PC version. In a civil manner, of course.
 
Look, if you even cared to read, especially my previous post, you wouldn't be spending time here "teaching" me good manners. I even said the exact same thing, that I won't upgrade until I experience the game on my current 760/780s, so there's no need to play nice guy with me.

First off - I'm stating the fact that high-end PC owners are not playing games at 30 fps.
Second - no one is accusing CDPR (yet), since the game is not out. However, it doesn't change the fact that Badowski's answers contradict CDPR's statements about how the PC version will be superior, since he recently said it will all be the same and that the recommended specs will only give you 30 fps, which is utter BS for any self-respecting PC gamer.
 
However, it doesn't change the fact that Badowski's answers contradict CDPR's statements about how the PC version will be superior, since he recently said it will all be the same and that the recommended specs will only give you 30 fps, which is utter BS for any self-respecting PC gamer.

Damien also made a statement like that :huh:
Are there any differences between the PC and Console versions of the game?

No, there are no differences. On the PC you will be able to change the resolution and enable some special effects like NVIDIA Fur, but that’s typical for the platform. Aside from that, it’s the same game. http://connecteddigitalworld.com/20...onal-interview-damien-monnier-cd-projekt-red/
 
Look, if you even cared to read, especially my previous post, you wouldn't be spending time here "teaching" me good manners. I even said the exact same thing, that I won't upgrade until I experience the game on my current 760/780s, so there's no need to play nice guy with me.

First off - I'm stating the fact that high-end PC owners are not playing games at 30 fps.
Second - no one is accusing CDPR (yet), since the game is not out. However, it doesn't change the fact that Badowski's answers contradict CDPR's statements about how the PC version will be superior, since he recently said it will all be the same and that the recommended specs will only give you 30 fps, which is utter BS for any self-respecting PC gamer.

Your point is valid. It is also worth noting that game requirements are most likely going to skyrocket when more demanding games start coming out. If anything, the games coming out this year are going to be the new baseline compared to what will likely be shown at E3.
 
Look, if you even cared to read, especially my previous post, you wouldn't be spending time here "teaching" me good manners. I even said the exact same thing, that I won't upgrade until I experience the game on my current 760/780s, so there's no need to play nice guy with me.

First off - I'm stating the fact that high-end PC owners are not playing games at 30 fps.
Second - no one is accusing CDPR (yet), since the game is not out. However, it doesn't change the fact that Badowski's answers contradict CDPR's statements about how the PC version will be superior, since he recently said it will all be the same and that the recommended specs will only give you 30 fps, which is utter BS for any self-respecting PC gamer.

I think you misunderstood what they said...
With the recommended rig you could achieve 30 fps on high or ultra; they did not say that the PC version will be capped at 30 fps.
 