GRYOnline.pl interview with CDPR studio lead Adam Badowski - translation

Why, oh why, would you phrase it that way....

Because it's the plain truth?

I meant, why did Badowski say it like that? This is terrible, PR-wise.

What would you rather have him say? Beat around the bush, sugar-coat it, spin it like other devs seem keen on doing, and for which they often end up catching heavy flak? You live by the sword, you die by the sword. Fact is, the two-, three- or fourfold bigger crowd TW3 is attracting compared to TW2 has been primarily lured in by the exceptional visuals. This is the area where the REDs certainly have to deliver, and which, either way, will set a precedent constraining how CP2077 will initially be received.

If we put Badowski's statements on the table along with Marcin's reassuring answers, frankly I see an unsolvable equation, a jigsaw that seems to deny completion. How can different scenes show such a dramatic difference in graphical fidelity if they're running on the same engine, and, simultaneously, how can three distinct platforms purportedly output pretty much the same fidelity? It boggles my mind, but for the time being I am putting faith in Marcin's words.
 
Yeah... but the important thing is which settings: High or Ultra? 30 FPS at Ultra on a GTX 770, or 30 FPS at High on a GTX 770?

Why is that "most important"? And it will be most likely high given the fact that even the high-end system used for the hands-on struggled with a fluid framerate in some cases...
 
Why is that "most important"? And it will be most likely high given the fact that even the high-end system used for the hands-on struggled with a fluid framerate in some cases...
Yeah, it's not the most important thing, but it's something that should obviously be asked. I too think it's High, and if so, that's horrible. That would mean a GTX 980 does about 50 fps on High, not Ultra. Of course things can change, but if you were to test it now, that is the state it would be in.
 
Yeah... but the important thing is which settings: High or Ultra? 30 FPS at Ultra on a GTX 770, or 30 FPS at High on a GTX 770?

They've said Ultra is currently locked out. This is mainly because Ubersampling will not run at a practical frame rate on most hardware.

High is what was demonstrated and what will be available on release, unless that situation changes.
 
The prologue and the streamlining that we create for the new user is the second aspect, also really important. The quest goals are clear and readable, but the player who knows the world won't look at them.
No one can accuse us of dumbing down the game.
The game is exactly the same; grass drawing distance is identical everywhere.
To sum it up, there are not many differences between PC, PS4 and Xbox One; they fix certain problems rather than change the configuration.

Thanks for these honest words, maybe the most honest words I've read from CDPR in months. But I also got very sad when I read this interview.

It was the uniqueness, the niche appeal, and the PC-centered approach of W1 and W2 that made those games so great, imo. Streamlining the formula and opening it up to the mass console market will most likely lead to an inferior product at its very core, no matter how much they talk about features and improvements. And to be honest, those bits about graphics on PC sound a whole lot different from what they "promised" us a few months back, aka "the PC version will look MUCH better than the console version" and statements along that line. Apparently that was very much exaggerated.

Don't get me wrong. I still hope W3 will be an awesome game, and I don't really care that much about graphics. But I do care about that "mass market" appeal and the compromises made to achieve it. They might tell me that I can deactivate quest markers and such, but that doesn't change how quests are structured. The hands-on previews pretty much proved that most (all?) quests didn't require much real thinking or old-school RPG discovery. Just enable your godlike Witcher Senses and you have your solution. Dumbed-down quest design at its finest, imo. If that is only one aspect of "streamlining the game for mass appeal", I fear what else might follow that we haven't heard about so far.

It's pretty sad to see that CDPR seem to follow the "typical" franchise development. You make a unique and innovative, but small and niche, game that acquires a really enthusiastic community; then you make a bigger and better sequel for basically the same audience; and then, when you've reached a certain size threshold, you decide that the third game must have mass appeal. Your former core community (and customer base) isn't the prime demographic anymore but just one part of it. You want the bigger market beyond that: the mass-market AAA gamer who wants vanilla mainstream formulas like "open world", "fast rewards", "simple quests that don't require much effort or thinking", "accessible gameplay". And yes, maybe you need them to justify your size. But you cannot deny that you've given something up in the process, that you betrayed, or at least disappointed, your former core community to a certain extent by spreading too far and too thin. That's sadly somehow and somewhere inevitable in such a process.

In the end there is a difference between a GOTY and a cult classic. One is "just" a great game loved or at least liked by many, the other is among the all time favorites of some people. I fear W3 will never belong to the latter category but very likely belong to the former one. Forgive me, but for me, as an old-school PC gamer, a cRPG enthusiast and a die-hard Witcher fan this is bad news.
 
How do you know that? Did the devs confirm?

No, I'm just guessing. But in the interview he did say it would run at 30 fps on High with a GTX 770. So think about it: a GTX 980 is about 40% faster than a GTX 770, so take that 30 fps on a GTX 770, swap in a GTX 980, and you'd get about 50 fps. I don't think I'm too far off, because the PCs running TW3 had a GTX 980 and people said it was playing at about 30 to 40 fps on High.
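For what it's worth, that linear-scaling estimate can be written out in a couple of lines. This is only a sketch: the 40% speedup is the poster's assumption rather than a benchmark, and real frame rates rarely scale perfectly linearly with GPU speed (CPU limits, memory bandwidth and so on get in the way).

```python
# Naive linear scaling of frame rate with assumed GPU speedup.
baseline_fps = 30    # reported GTX 770 result on High
speedup = 1.40       # assumed GTX 980 vs GTX 770 ratio (not a benchmark)
estimated_fps = baseline_fps * speedup
print(estimated_fps)  # 42.0 -- a bit short of the 50 fps guessed above
```

Even under that optimistic assumption the estimate lands closer to 40 fps than 50, which lines up with the 30-40 fps hands-on reports.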
 
They've said Ultra is currently locked out. This is mainly because Ubersampling will not run at a practical frame rate on most hardware.

High is what was demonstrated and what will be available on release, unless that situation changes.
I don't think ubersampling is part of the ultra settings. It most likely will be a thing of its own, just like supersampling in W2, on top of ultra settings.
 
Yeah it's not the most important but it's something that should obviously be asked. I too think it's High and if so that's horrible. That means a GTX 980 will do about 50fps on High and not Ultra. Of course things can change but if you were to test it now that would be the state it would be in.

So far, what we have witnessed is that a GTX 980 will do about 30 fps, not 50, in High, not Ultra. A claim of 50 has no foundation.

---------- Updated at 05:23 PM ----------

I don't think ubersampling is part of the ultra settings. It most likely will be a thing of its own, just like supersampling in W2, on top of ultra settings.
It was stated. I'm reluctant to make assumptions to the contrary of statements made. (Also, Ubersampling was initially enabled in Witcher 2 Ultra. It wasn't removed from Ultra settings until later.)

And how will the PC ultra compare to consoles?

You will be able to find some small differences, like NVIDIA HairWorks for example, but they are very demanding graphically, so you must have a strong machine. It may not be the diplomatic thing to say, but in the future you will be able to turn on Ubersampling, which killed The Witcher 2 at release and will do the same now, so we don't want to unlock it yet. The game looks better with it but has absurd requirements.
 
So far, what we have witnessed is that a GTX 980 will do about 30 fps, not 50, in High, not Ultra. A claim of 50 has no foundation.
On the current build, yes. But they did say it could run better in the future with optimization. So 50 fps on High is possible, I think.
 
It was stated. I'm reluctant to make assumptions to the contrary of statements made. (Also, Ubersampling was initially enabled in Witcher 2 Ultra. It wasn't removed from Ultra settings until later.)

And how will the PC ultra compare to consoles?

You will be able to find some small differences, like NVIDIA HairWorks for example, but they are very demanding graphically, so you must have a strong machine. It may not be the diplomatic thing to say, but in the future you will be able to turn on Ubersampling, which killed The Witcher 2 at release and will do the same now, so we don't want to unlock it yet. The game looks better with it but has absurd requirements.
Maybe we read that passage differently, but imo it wasn't stated there that Ubersampling is part of the Ultra settings. He just mentions Ubersampling in the same passage. To me, his answer means that Ubersampling will be a thing of its own that will have "absurd requirements"; that obviously doesn't apply to the Ultra settings.
 
Not sure what to get. Guys, do you think I can play on High with a GTX 960, or should I go for a 970 with 4 GB VRAM?
(my screen resolution is 1680x1050)
 
On the current build, yes. But they did say it could run better in the future with optimization. So 50 fps on High is possible, I think.

50 fps vs. 30 fps is a reduction from 33 msec/frame to 20. That means you have to cut out 13/33 (40 percent) of per-frame processing. It would not be realistic to expect that.

I'd rather they get a solid floor under that 30 fps. It's a bummer when your controls fail because the frame rate momentarily dropped to 12 in combat.
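The frame-time arithmetic above can be checked in a few lines of plain Python:

```python
# 30 fps -> ~33.3 ms per frame; 50 fps -> 20 ms per frame.
ms_per_frame_30 = 1000 / 30
ms_per_frame_50 = 1000 / 50
# Fraction of per-frame work that has to disappear to reach 50 fps:
cut = (ms_per_frame_30 - ms_per_frame_50) / ms_per_frame_30
print(f"{ms_per_frame_30:.1f} ms -> {ms_per_frame_50:.1f} ms, cut {cut:.0%}")
# 33.3 ms -> 20.0 ms, cut 40%
```

Shaving 40% off every frame through optimization alone is a tall order, which is the point being made here.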
 
50 fps vs. 30 fps is a reduction from 33 msec/frame to 20. That means you have to cut out 13/33 (40 percent) of per-frame processing. It would not be realistic to expect that.
Well, given that AC Unity runs at 50-60 FPS on ultra/maximum settings on a GTX 970, it would be pretty sad if The Witcher 3 couldn't at least reach the same on high settings...
 
Not sure what to get. Guys, do you think I can play on High with a GTX 960, or should I go for a 970 with 4 GB VRAM?
(my screen resolution is 1680x1050)

With 1680x1050, which is about 85% of the pixel count of 1920x1080, the GPU load will be somewhat lighter, so you should be able to get by with a lesser GPU. We don't know what the demand on VRAM will be; it has not been stated.
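For reference, the pixel-count ratio between the two resolutions works out like this:

```python
# Pixel counts of the two resolutions mentioned above.
pixels_1680x1050 = 1680 * 1050   # 1,764,000
pixels_1920x1080 = 1920 * 1080   # 2,073,600
ratio = pixels_1680x1050 / pixels_1920x1080
print(f"{ratio:.1%}")  # 85.1% -- about 15% fewer pixels than 1080p
```

That helps, but pixel count is only one input; per-frame cost doesn't fall strictly in proportion to resolution.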

---------- Updated at 05:33 PM ----------

Well, given that AC Unity runs at 50-60 FPS on ultra/maximum settings on a GTX 970, it would be pretty sad if The Witcher 3 couldn't at least reach the same on high settings...

Witcher 3 is not AC Unity. I know that sounds like Captain Obvious, but they are entirely different engines, and experience with one has no bearing on performance with the other.
 
With 1680x1050, which is about 85% of the pixel count of 1920x1080, the GPU load will be somewhat lighter, so you should be able to get by with a lesser GPU. We don't know what the demand on VRAM will be; it has not been stated.

Sounds awesome.
Won't change the fact that I have to build a new (future-proof) system however.

I think I'll go with an i5 at 3.5 GHz + a GTX 960 (or 970), a 550 W power supply and 8 GB RAM; that should do it for the next 3 years.

Witcher 3 is not AC Unity. I know that sounds like Captain Obvious, but they are entirely different engines, and experience with one has no bearing on performance with the other.

Exactly.
Also, there is a lot to take into account: the number of objects on screen, the number of assets in use at any given time and the systems driving those assets, the world's size, draw distance, movement, special effects, character detail, etc. "Low", "Medium" and "High" are only generalizations of dozens of graphical settings. In some games a single graphical feature (one that doesn't make much visual difference) can determine the performance of the whole game by a huge margin (can, not necessarily has to). In the end it all depends on the individual variables in the equation.
 
Witcher 3 is not AC Unity. I know that sounds like Captain Obvious, but they are entirely different engines, and experience with one has no bearing on performance with the other.
True. But I'm not a programmer or developer; I'm a gamer and a customer. I compare what I get in the game with what I've invested in hardware. So I compare framerates and visual quality. In all honesty, it's CDPR's damn job to optimize their engine. If the all-day-long-criticized-to-crucified Ubisoft can deliver better "frames per fidelity", shouldn't the much-praised CDPR do at least the same? ;)

---------- Updated at 02:38 AM ----------

I think I'll go with an i5 at 3.5 GHz + a GTX 960 (or 970), a 550 W power supply and 8 GB RAM; that should do it for the next 3 years.
Maybe you should think about buying a Xeon CPU instead, with DX12 and even better multithreading support just around the corner... ;)
 