The GameStar Hands-On Session: Technical Matters and First Impressions

This news about the performance of The Witcher 3 on PC at the recent hands-on sounds ridiculous.

I just recently killed myself to get a new PC with an Nvidia GTX 970 and an Intel Core i5-4690. Some of the games I've managed to test on it are Shadow of Mordor and Assassin's Creed Unity (the latter I... *cough* downloaded).

I can run both at 60 FPS on ultra, and Assassin's Creed, nightmare that it is and as horribly as it was said to run on PC, only drops to 50 FPS at most for a couple of seconds, even with its taxing crowds.

On consoles, Assassin's Creed Unity runs at an awful frame rate that hovers between 20 and 30 FPS. It seems unreasonable that a game with higher requirements than The Witcher 3 runs at 60 FPS on my PC, while The Witcher 3 on high is supposedly limited to 30 on a machine with a 980 and an i7. If the requirements were really that high, it would be impossible to run the game on consoles at all.
 
It seems unreasonable that a game with higher requirements than The Witcher 3 runs at 60 FPS on my PC, while The Witcher 3 on high is supposedly limited to 30 on a machine with a 980 and an i7. If the requirements were really that high, it would be impossible to run the game on consoles at all.

Where are you getting this information from? 30 FPS at 1080p on a 980 and a Core i7 isn't what the article in the OP stated. They said the frame rate was consistently fluid, which I'm assuming means it was well above 30 FPS. Besides, by the time the optimization process is complete and the drivers are tuned, performance is going to be much better than it was in the demo anyway.
 
Where are you getting this information from? 30 FPS at 1080p on a 980 and a Core i7 isn't what the article in the OP stated. They said the frame rate was consistently fluid, which I'm assuming means it was well above 30 FPS. Besides, by the time the optimization process is complete and the drivers are tuned, performance is going to be much better than it was in the demo anyway.

Pardon, I didn't read that right; they were talking about the recommended specs targeting 30 FPS, not the machines being used at the hands-on session. It still sounds odd to me that the recommended specs would only manage 30.
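
For what it's worth, the gap between a 30 and a 60 FPS target is just the per-frame time budget. A quick back-of-the-envelope (my own arithmetic, not anything stated in the article):

```cpp
#include <cstdio>

int main() {
    // Per-frame time budget in milliseconds for common frame-rate targets.
    const double targets[] = {30.0, 60.0};
    for (double fps : targets) {
        std::printf("%2.0f FPS target -> %.1f ms per frame\n", fps, 1000.0 / fps);
    }
    // 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms: a 30 FPS recommended spec has
    // twice the per-frame headroom of a 60 FPS one, so the same GPU tier
    // reads very differently depending on which target was assumed.
    return 0;
}
```

So a "recommended" card aimed at 30 FPS can look surprisingly modest next to one aimed at 60, even for a similarly demanding game.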
 
Gawd damn it! It looks like I'll have to get a second GTX 980 in early May if I'm to have any hope of running The Witcher 3 with Uber Sampling.
 
I hope I'll be able to run the game on Ultra with a stable 30 fps (even without AA). I guess I'll have to wait for others to find out what specs are needed for that. Until then, let's save some money.
 
My experience from various development processes and internal beta teams is that, during the optimisation/testing phase, the software is commonly slower than the *exact same* code in the Release Candidate build, because there is almost always debug code and other monitoring/profiling running in the background. Sometimes this is very noticeable; at other times there's barely a hint of it.
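
To give a rough picture of what I mean (a generic sketch, not anything from CDPR's actual codebase), internal builds often carry timing/logging hooks like this, which compile away completely in the release configuration:

```cpp
#include <chrono>
#include <cstdio>

// Internal/optimisation builds define ENABLE_PROFILING, so every scoped timer
// measures its block and writes a log line. In the release configuration the
// macro expands to nothing and the same source carries zero overhead.
#ifdef ENABLE_PROFILING
struct ScopedTimer {
    const char* name;
    std::chrono::steady_clock::time_point start = std::chrono::steady_clock::now();
    explicit ScopedTimer(const char* n) : name(n) {}
    ~ScopedTimer() {
        auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                      std::chrono::steady_clock::now() - start).count();
        std::printf("[profile] %s: %lld us\n", name, static_cast<long long>(us));
    }
};
#define PROFILE_CONCAT_INNER(a, b) a##b
#define PROFILE_CONCAT(a, b) PROFILE_CONCAT_INNER(a, b)
#define PROFILE_SCOPE(name) ScopedTimer PROFILE_CONCAT(profileTimer_, __LINE__)(name)
#else
#define PROFILE_SCOPE(name) // compiled out: no timing, no I/O
#endif

void updateFrame() {
    PROFILE_SCOPE("updateFrame");
    // ... per-frame game work here ...
}

int main() {
    updateFrame();
    return 0;
}
```

Multiply that kind of instrumentation across thousands of scopes per frame and the overhead can be anywhere from invisible to brutal, which is exactly why pre-RC numbers are shaky.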

Until the RC is demonstrated (or the final version is on shelves) ~ and there are driver optimisations (if required) from NVIDIA or AMD, no-one will have a really good idea of what the final performance will be.

(I'm more concerned with what the low image-quality settings look like at around 1400x900 (my monitor's resolution), and what the likely performance of a variety of sub-minimum and near-minimum spec cards would be, than with endless debate about some largely hypothetical supercomputer running expensive additional shaders at unreasonably high frame rates and resolutions, and how those pieces don't quite fit together.)
 
This news about the performance of The Witcher 3 on PC at the recent hands-on sounds ridiculous.

I just recently killed myself to get a new PC with an Nvidia GTX 970 and an Intel Core i5-4690. Some of the games I've managed to test on it are Shadow of Mordor and Assassin's Creed Unity (the latter I... *cough* downloaded).

I can run both at 60 FPS on ultra, and Assassin's Creed, nightmare that it is and as horribly as it was said to run on PC, only drops to 50 FPS at most for a couple of seconds, even with its taxing crowds.

On consoles, Assassin's Creed Unity runs at an awful frame rate that hovers between 20 and 30 FPS. It seems unreasonable that a game with higher requirements than The Witcher 3 runs at 60 FPS on my PC, while The Witcher 3 on high is supposedly limited to 30 on a machine with a 980 and an i7. If the requirements were really that high, it would be impossible to run the game on consoles at all.

What I find funniest is that Assassin's Creed Unity's recommended specs are higher than TW3's, and ACU runs fine on my PC with everything maxed out. Why is it that I get the feeling from all the articles I've been reading that TW3 is much more taxing than ACU? Remember, ACU's recommended spec is a GTX 780, while TW3's is a GTX 770. Only the recommended CPU is the same for both, an Intel Core i7-3770. In other words, TW3 should run fine on our systems, unless CDPR lied and TW3's real requirements are higher than what's shown.
 
I think it's more likely that:

1. AC:U's requirements were fudged.

2. TW3's requirements were based on a target frame rate of 30.

Claims that CDPR "lied" want proof, not mere assertions. And it had better be proof that turns aside the "No true Scotsman" argument.
 
Gawd damn it! It looks like I'll have to get a second GTX 980 in early May if I'm to have any hope of running The Witcher 3 with Uber Sampling.

I highly suspect people won't be able to run ubersampling at a decent frame rate for years to come, i.e. it'll be as much of a killer as it was in The Witcher 2 for 2011's PCs.
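
Just to put a rough number on why it hits so hard: ubersampling is essentially full-scene supersampling, so the card shades far more samples than the output resolution. A back-of-the-envelope sketch, assuming a 2x2 sample pattern (I don't know CDPR's exact implementation):

```cpp
#include <cstdio>

int main() {
    // Hypothetical numbers: 1080p output with a 2x2 supersample pattern.
    const long long width = 1920, height = 1080;
    const long long samplesPerPixel = 4;  // assumed factor; the real one may differ

    const long long outputPixels  = width * height;                 // ~2.07 million
    const long long shadedSamples = outputPixels * samplesPerPixel; // ~8.3 million

    std::printf("Pixels shaded without ubersampling: %lld\n", outputPixels);
    std::printf("Samples shaded with ubersampling:   %lld\n", shadedSamples);
    std::printf("Rough pixel-shading multiplier:     %lldx\n",
                shadedSamples / outputPixels);
    return 0;
}
```

So before you even get to memory bandwidth, you're asking the GPU for roughly four times the per-pixel work, which is why it scaled so badly on 2011 hardware.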

---------- Updated at 08:05 PM ----------

A bit ironic that fans of a game known for its grey morality are so black-and-white in their own views.

Black & white is another term for grey. :p
 
I highly suspect people won't be able to run ubersampling at a decent frame rate for years to come, i.e. it'll be as much of a killer as it was in The Witcher 2 for 2011's PCs.

Well, I could get 50-60 FPS with two 580s running W2 with Uber Sampling back in the day. That was after CDPR fixed a rendering bug and Nvidia optimised their drivers of course. So if two 980s aren't enough, something is very wrong.
 
Well, I could get 50-60 FPS with two 580s running W2 with Uber Sampling back in the day. That was after CDPR fixed a rendering bug and Nvidia optimised their drivers of course. So if two 980s aren't enough, something is very wrong.

Ah. The only dual-GPU config I tried was AMD (a 7990, which is two 7970s, which in raw computational power are a little bit faster than 580s), and the ubersampling frame rate was unacceptable, but W2 is known for favouring Nvidia. As far as single GPUs go, my 970 manages 60 FPS indoors and ranges between 45 and 60 running around Vergen. So it took until late 2014 for it to be playable on a single GPU. :)
 
Ah. The only dual-GPU config I tried was AMD (a 7990, which is two 7970s, which in raw computational power are a little bit faster than 580s), and the ubersampling frame rate was unacceptable, but W2 is known for favouring Nvidia. As far as single GPUs go, my 970 manages 60 FPS indoors and ranges between 45 and 60 running around Vergen. So it took until late 2014 for it to be playable on a single GPU. :)


Can I ask what your setup is? 50 FPS on average is fantastic. I'm still debating whether I should go for a top-of-the-line powerhouse or venture into the PS4. I guess that PC footage on ultra is what's going to break the current tie.
 
Can I ask what your setup is? 50 FPS on average is fantastic. I'm still debating whether I should go for a top-of-the-line powerhouse or venture into the PS4. I guess that PC footage on ultra is what's going to break the current tie.

Umm, I actually built it five and a half years ago. :) An i7 920 @ 4.2 GHz, 6 GB of RAM, and the 970.

The 45-60 FPS is at 1080p (i.e. not 16:10) with unmodified ultra settings plus uber (I think the only setting you can raise further after selecting ultra, in the standard configurator, is the LOD).

*EDIT: Actually I'm clocked at 3.8 GHz right now, and I tried increasing the LOD and it made zero difference to the frame rate; must be a scenery thing.
 
I think it's more likely that:

1. AC:U's requirements were fudged.

2. TW3's requirements were based on a target frame rate of 30.

Claims that CDPR "lied" want proof, not mere assertions. And it had better be proof that turns aside the "No true Scotsman" argument.
So ACU's requirements are based on a target frame rate of 60? What was fudged about ACU? If you mean it was a bad port, then I agree with that.
 
Nobody here knows anything about the technical aspects of AC:U. Anybody who did would be barred from commenting.

But when a game is stated to require certain hardware and yet performs admirably on hardware that is much weaker, and the developer of that game has offered no explanation of how those hardware specifications were arrived at, one may suspect that the requirements were stated with a considerable margin of safety.

Developers may legitimately do this to provide a buffer against support calls (or, in the case of hardware, warranty claims). But in comparison with hardware requirements drawn carefully from actual observation of the game and the progress of development, it's fudging.
 
I think the main reason The Witcher 3 may have more trouble running on your PC than Asscreed Unity is the global illumination it uses, which IMO looks better than ACU's lighting. ACU also doesn't have the dynamic weather and beautiful particle effects we saw in the latest gameplay trailer. I'm no technical guy, but maybe ACU doesn't have as many polygons as The Witcher 3. Maybe the beautiful environments have many more polygons, especially in the vegetation, which is why it looks so realistic.
 
I think the main reason The Witcher 3 may have more trouble running on your PC than Asscreed Unity is the global illumination it uses, which IMO looks better than ACU's lighting. ACU also doesn't have the dynamic weather and beautiful particle effects we saw in the latest gameplay trailer. I'm no technical guy, but maybe ACU doesn't have as many polygons as The Witcher 3. Maybe the beautiful environments have many more polygons, especially in the vegetation, which is why it looks so realistic.

Polygon count for sure is a factor. I always thought Resident Evil 5 was a damn good-looking game but it ran well even on low-end cards of its time, because the environments were very basic.
 
This isn't exactly from the latest video, because it wasn't shown there, but when Geralt used the Aard sign while fighting in the swamps, the water ripples looked kind of bad: the blast produced several circular ripples as it travelled over the water instead of a cone shape.
I'm wondering if they intend to fix that detail.
(It's from the 35-minute gameplay video.)
 
@onionshavelayers
You may have hit it on the head. Lighting is often the most expensive computation. They've put a lot of effort into lighting. It's where the biggest difference between the PC version and the consoles is visible.

By comparison, texture size is a red herring. PC and consoles have roughly comparable main and graphics memory size, and if anything the consoles have an advantage because they do not have to copy textures into VRAM. There is no good reason that the PC should be expected to have "better" textures. What the PC can do is render the same textures with better effects.

I suspect we'll find the difference really comes down to lighting.
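
As a hand-wavy cost model (my own back-of-the-envelope, nothing to do with REDengine's actual GI), texturing is roughly a fixed number of fetches per pixel, while any global-illumination approximation tends to loop over many samples per pixel, so its cost multiplies with the sample count:

```cpp
#include <cstdio>

int main() {
    // Hypothetical per-frame numbers at 1080p.
    const double pixels = 1920.0 * 1080.0;

    // Plain texturing: roughly one filtered fetch per pixel per material layer.
    const double textureLayers  = 3;  // e.g. albedo + normal + specular (assumed)
    const double textureFetches = pixels * textureLayers;

    // A GI-style gather: several irradiance/occlusion samples per pixel,
    // each of which is itself a fetch plus shading maths. 16 is an arbitrary example.
    const double giSamplesPerPixel = 16;
    const double giSamples = pixels * giSamplesPerPixel;

    std::printf("Texture fetches per frame: %.1f million\n", textureFetches / 1e6);
    std::printf("GI samples per frame:      %.1f million\n", giSamples / 1e6);
    return 0;
}
```

Which is why bigger textures mostly cost memory, while better lighting costs shader time every single frame.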
 