The Witcher 3 - Visuals

Status
Not open for further replies.
You are forgetting two things:

1) PCs require a bit more "horsepower" to run games at the same level as a console does. Consoles utilise their weaker GPU/CPU better than a PC with the same specs would, because they are machines optimised for gaming, and there is no variance in hardware, which makes optimisation easier for devs.

2) The GTX 980 is for 1080p/60FPS, which consoles cannot achieve, plus HairWorks and some other little bits and bobs. If you were to run the game at 1080p/30FPS or 900p/30FPS, the required specs scale down much closer to console hardware, even more so without HairWorks.

Basically, yes, there isn't a huge difference between PC Ultra and PS4; yes, the draw distance is a bit better and you'll have things like HairWorks. But calling the visual parity statement from earlier this year a "PR move not to upset console players/M$" is simply BS. Does the game look amazing visually? Hell yes it does. Are there clear differences from what we were shown in promotional material way back? Yes. Am I mad, bros/brahs? Nope, the game looks good and every other element (gameplay/story/atmosphere) looks fantastic, so I am totally pumped.

I'm going to say here what I said in another forum: PC gaming has never been about balls-to-the-wall graphics. It's always been about scalability, i.e. using settings and tweaks to adjust the game to fit your hardware's capabilities.

What I find irritating about The Witcher 3 is that CDPR has left very little room for the scalability of assets and effects. Even worse is the outright removal of certain effects because of the consoles. The blood decals should never have been "CUT" simply because consoles can't handle them or whatever. It's just a graphical effect and doesn't affect gameplay whatsoever, so why cut it out?

That's what settings are for. Leave a way for those who can run it to enable it.
 
How come the PC footage with a GTX 980 looks nearly the same as the PS4 footage?? Huh huh?

Real-world performance from high-end PC parts is a completely different matter once optimization enters the picture, and that's the discussion here. We all know a GTX 980 is much more powerful than what's in the PS4, but it's all about the optimization.

A. The PC version was downgraded to look like the consoles. The PC could make the game look much better if it weren't held back; the consoles cannot.
B. 60fps is WAY different than 30fps.
 
How come the PC footage with a GTX 980 looks nearly the same as the PS4 footage?? Huh huh?

60 FPS versus 30 FPS? Also, one thing I missed: textures SHOULD look almost exactly the same between consoles and the highest-end PCs, because the consoles have 8GB of unified RAM, which puts them on par with even the most powerful GPUs, and textures are mainly a memory issue.
 
How come the PC footage with a GTX 980 looks nearly the same as the PS4 footage?? Huh huh?

Real-world performance from high-end PC parts is a completely different matter once optimization enters the picture, and that's the discussion here. We all know a GTX 980 is much more powerful than what's in the PS4, but it's all about the optimization.

Optimization can only get you so far. No amount of optimization is going to bridge the massive chasm in raw processing power, capability and bandwidth between the GPU in the PS4 and a GTX 980.

Also, 60 FPS is MUCH harder to run than 30 FPS.
 
What really annoys me is that all the high-quality assets, textures and effects were ALREADY developed, but then they actively set out to downgrade them, creating MORE work when the work was already done. Why not just LEAVE THEM BE for the PC version and create the weaker assets for the consoles? There was a huge difference between The Witcher 2 on PC and the Xbox version; CDPR didn't seem to care for console parity then. Last year I upgraded my entire PC to an i7 4790K and GTX 980 SLI in time for The Witcher 3 in January, only to learn that the entire time between then and now has been spent downgrading the visuals. I could have saved over £1,000.


60 FPS versus 30 FPS? Also, one thing I missed: textures SHOULD look almost exactly the same between consoles and the highest-end PCs, because the consoles have 8GB of unified RAM, which puts them on par with even the most powerful GPUs, and textures are mainly a memory issue.

Consoles do NOT have access to the full 8GB. On PS4 they can only access up to 5GB. I'm not sure about Xbox, but I suspect it would be the same.
 
Optimization can only get you so far. No amount of optimization is going to bridge the massive chasm in raw processing power, capability and bandwidth between the GPU in the PS4 and a GTX 980.

Also, 60 FPS is MUCH harder to run than 30 FPS.

60 FPS is only 2x as hard to run as 30 FPS - 2x the number of frames per second.
A 980 is many times as powerful as the PS4 and XB1, and I would daresay more than 2x as powerful as the X1 and PS4 combined. Make of that what you will.
And this:
Comparing youtube videos is not the best idea. Wait for release then compare PC max settings to consoles.
 
To those who think the 35-minute demo has only been downgraded since, I want to say something.

Look at the shadows in the 35-minute demo: they pop in right in front of us. Look at the characters' shadows in the Novigrad scene: on almost every character, the head and arms cast no shadows, and many objects don't cast shadows at all. Why does nobody talk about the improvements, only the downgrades?

I'm not saying there hasn't been any downgrade:

1. Head turning while walking.
2. Dust particles while galloping.
3. The water effect when using Aard.
4. Thunder/lightning illumination.
5. Some fire effects and fire animations.
All of this breaks my heart!

Maybe RED really wanted to make all platforms look similar. I am a PC gamer too. And if HairWorks and a few other barely noticeable differences are all that separate PC from consoles, why on earth should I buy a GTX 980? For an extra 30 FPS and HairWorks? I don't think 30 FPS and HairWorks are worth $300. I can buy a new PS4 for that price.

BUT!!!!!!

Imagine there are two versions of The Witcher 3:

First: a game with terrible shadows but some special effects, which are actually not a big deal.
Second: a game without some special features that don't break gameplay immersion, but with better shadows.

Which one would you choose?

Look at the ground. It is completely flat.




And I really didn't want to use Inquisition as a comparison, but:

Look at the overall graphics, model quality, foliage, lighting, etc., and you will understand why tessellation is missing on the little rocks.
 
What really annoys me is that all the high-quality assets, textures and effects were ALREADY developed, but then they actively set out to downgrade them, creating MORE work when the work was already done. Why not just LEAVE THEM BE for the PC version and create the weaker assets for the consoles? There was a huge difference between The Witcher 2 on PC and the Xbox version; CDPR didn't seem to care for console parity then. Last year I upgraded my entire PC to an i7 4790K and GTX 980 SLI in time for The Witcher 3 in January, only to learn that the entire time has been spent downgrading the visuals. I could have saved over £1,000.

I'm kinda worried about CDPR downgrading the game on PC versus the consoles. I hope it's not the case, but if it is, I'll be a pissed-off dude. I'm debating whether to keep my preorder from GOG or cancel it. I want to believe CDPR is giving us something better, but hmm, lately with the videos and pictures I'm starting to wonder. :(
 
Consoles do NOT have access to the full 8GB. On PS4 they can only access up to 5GB. I'm not sure about Xbox, but I suspect it would be the same.

This is true, but the consoles have much lower overhead (especially latency) when accessing memory than a PC does. A PC likely requires more memory (including cache) than the consoles to achieve similar performance, since it uses a discrete memory architecture.
 
Consoles do NOT have access to the full 8GB. On PS4 they can only access up to 5GB. I'm not sure about Xbox, but I suspect it would be the same.

Still more than the vast majority of video cards, which have 4GB at best unless you dabble in Titan territory.

60 FPS is only 2x as hard to run as 30 FPS - 2x the number of frames per second.
A 980 is many times as powerful as the PS4 and XB1, and I would daresay more than 2x as powerful as the X1 and PS4 combined. Make of that what you will.

You forget two things: anisotropic filtering, which is always lower on consoles, and anti-aliasing, which is generally non-existent on consoles and, even when present, not comparable to a performance hog like MSAA.

Look at the overall graphics, model quality, foliage, lighting, etc., and you will understand why tessellation is missing on the little rocks.

DA has superb lighting, great foliage (not in that area, since it's a bloody mountain) and amazingly detailed models.
 
Still more than the vast majority of video cards, which have 4GB at best unless you dabble in Titan territory.



You forget two things: anisotropic filtering, which is always lower on consoles, and anti-aliasing, which is generally non-existent on consoles and, even when present, not comparable to a performance hog like MSAA.

This memory is SHARED. You seem to think that only textures get stored there. Sounds, AI, music, etc. are ALSO stored there.
 
60 FPS is only 2x as hard to run as 30 FPS - 2x the number of frames per second.
A 980 is many times as powerful as the PS4 and XB1, and I would daresay more than 2x as powerful as the X1 and PS4 combined. Make of that what you will.

60 FPS is the gold standard for PC gaming. From 30 to 60 there's a massive leap in visual feedback. Beyond 60, diminishing returns become more and more apparent.

Also, frame rate isn't just about the GPU. The CPU also plays an important role. Much of the time, the GPU in high-end PCs is bottlenecked by the CPU because the API has very high overhead. DX12 is supposed to fix that problem.
 
I must admit that I see a rather large change in graphical fidelity since the first trailers went out. Well, I could live with it, because the 35-minute demo, even though not as atmospheric to me, was still quite pleasing. But now there seems to have been yet another notch down between the 35-minute demo and what we have seen as of late.

For me it was the screenshot released yesterday, with the wall suddenly missing its cornerstones, the textures looking flat and, in general, a step away from the "realism" that was present earlier, that has my alarm bells ringing right now. I still hope it's a bug, but at the moment, tbh, I very much doubt it. Anyway, as I have mentioned before, I am no graphics expert, but even to me these visual changes are very obvious, and I must admit I am disappointed.

But instead of yelling "downgrade" and sitting here crying about it, I will use this opportunity to cancel my preorder, and I will now just wait and see how the game is received on release and then decide if I want to buy it or not. Don't get me wrong, I am still excited for the game, but there are just too many unanswered questions right now and too much of a drop in atmosphere, which to me is quite a big deal in an RPG.

So I prefer to wait and see instead of keeping my preorder.
 
60 FPS is the gold standard for PC gaming. From 30 to 60 there's a massive leap in visual feedback. Beyond 60, diminishing returns become more and more apparent.

Also, frame rate isn't just about the GPU. The CPU also plays an important role. Much of the time, the GPU in high-end PCs is bottlenecked by the CPU because the API has very high overhead. DX12 is supposed to fix that problem.

For 60 FPS, you only need to insert one extra frame between each pair of frames from 30 FPS. Each frame takes 33.3 ms at 30 FPS and 16.7 ms at 60 FPS, so yes, it is, in general, a linear relationship.
And yes, the CPU can be a bottleneck. I was assuming no CPU bottleneck (as one usually does when comparing GPU power).
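The frame-time arithmetic above is easy to verify; here is a minimal sketch (the function name is just for illustration):

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render a single frame at a given frame rate."""
    return 1000.0 / fps

# Doubling the frame rate halves the time the GPU has per frame:
print(round(frame_budget_ms(30), 1))  # 33.3 ms per frame at 30 FPS
print(round(frame_budget_ms(60), 1))  # 16.7 ms per frame at 60 FPS
```

So hitting 60 FPS means finishing every frame in half the time, which is why the relationship is roughly linear when the GPU is the limit.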
 
DA has superb lighting, great foliage (not in that area, since it's a bloody mountain) and amazingly detailed models.


In DA everything is static, there's no day/night cycle, and there's way less foliage than in The Witcher 3, and I think the foliage doesn't react to anything on screen. I don't know why people compare the two; they are totally different games.
 
In DA everything is static, no day/night cycle

That's a choice on BioWare's part, certainly not a limitation of their graphics engine. Their engine supports that capability, as Battlefield 4, built on the same engine, showed.

Of course, CDPR gets credit for making their own engine, unlike BioWare, which took DICE's work. There are certainly areas where DA falls short, but not in those that I listed. Also, the foliage does react to wind (though there's little of it) and to characters moving through the grass.

I really don't need to go to the mire and throw some fireballs to show just how amazing the lighting looks, do I?





You may notice the rocks under the water are NOT flat.
 