Witcher 3 Graphics

I really hope that this whole downgrade business has nothing to do with parity, and that there's a technical reason why something as seemingly simple as enabling the cinematic lighting during gameplay hasn't been done, even though the ambience obviously looks much better with it. I really hope the motivation is something other than having the same (or very similar) visuals as consoles at all costs, otherwise it would be a really sad scenario, and not only for CDPR and The Witcher, but in general.

Since we can't know for sure, I prefer to think there's a reason why CDPR cannot release a patch to enable that lighting in gameplay, because the alternative would probably stop me from enjoying the game as much as I do at the moment.
 
Gameplay > Graphics

If you disagree, you're probably playing the wrong genre of games. Making an open world game isn't like making a linear game. In order to create entire worlds with all the nooks and crannies you can explore they have to sacrifice some visual performance. You can't have both... maybe not right now. But honestly the graphics do not ruin the game, far from it. Dark Souls had horrible framerate issues that would drop to a slow crawl for nearly entire areas. It wasn't pretty, but damn, was that one of the greatest games of all time... that's my opinion at least.

Fallout, Knights of the Old Republic, hell, Minecraft, WoW, Diablo, Counter-Strike, Amnesia: The Dark Descent, Silent Hill, Ultima, Final Fantasy 7 (don't even start defending that one, the cutscenes were great but Cloud looked like Popeye) and so on. Games that didn't have good graphics even in their day and don't really age well today, but who cares.
 

Hi, it doesn't increase draw distance for me, am I the only one?
 
But honestly the graphics do not ruin the game, far from it.

They do if you know that the only reason they aren't much better is some arbitrary external imposition like "visual parity with consoles". In that case, yes, they can ruin the enjoyment of the game for me. Principles are often more important than momentary satisfaction, or at least they are for some people.

I have no problem with low graphics (and even with a downgrade, W3 is still really good in the graphical department) as I come from an era of gaming where practically 80% of the visuals were created by your imagination, but this is a different concept than "caring about graphics". The same thing can just as easily happen with gameplay, or anything else. If this concept of "parity" is real and is allowed to pass unchallenged, then a time will come when every aspect of a game, from graphics to interface to gameplay, will be exactly the same on all platforms, regardless of their individual capabilities.

At that point you would know, every time you play a game, that you could have had a much better product if only the devs weren't constrained by arbitrary impositions that level the field at the lowest common denominator.
 
I really hope that this whole downgrade business has nothing to do with parity, and that there's a technical reason why something as seemingly simple as enabling the cinematic lighting during gameplay hasn't been done, even though the ambience obviously looks much better with it. I really hope the motivation is something other than having the same (or very similar) visuals as consoles at all costs, otherwise it would be a really sad scenario, and not only for CDPR and The Witcher, but in general.

Since we can't know for sure, I prefer to think there's a reason why CDPR cannot release a patch to enable that lighting in gameplay, because the alternative would probably stop me from enjoying the game as much as I do at the moment.
Clearly it's a parity thing, or simply them not caring about optimizing the visuals for PC. Nothing else makes sense, because it's literally an on/off function, just like Watch Dogs' E3 settings that some guy found and enabled. That game too had really cool cutscene graphics and models that vanished when gameplay started. For consoles it's a necessary optimization for smooth gameplay; for PC it's something you do to show your consumers you don't respect their hardware or value their freedom of choice.

Making an open world game isn't like making a linear game. In order to create entire worlds with all the nooks and crannies you can explore they have to sacrifice some visual performance.
This hasn't really ever been true and is clearly not the case these days. Assassin's Creed is the perfect example of huge open world games with graphics better than most other, mostly linear, video games. You can have both; it's simply a matter of budget, and open world games typically have huge budgets. Two of the most visually amazing looking 'games' I've played are Space Engine and Outerra, and those are both bigger than every other game out there combined! Then you can bring up the Crysis argument: a really huge (and old) shooter that still looks better than most games even though they're much smaller.

And I personally really like the Souls games' graphics and wouldn't play them if they looked bad. It's the reason I'm completely uninterested in Bloodborne.
 
Gameplay > Graphics

If you disagree, you're probably playing the wrong genre of games. Making an open world game isn't like making a linear game. In order to create entire worlds with all the nooks and crannies you can explore they have to sacrifice some visual performance. You can't have both... maybe not right now. But honestly the graphics do not ruin the game, far from it. Dark Souls had horrible framerate issues that would drop to a slow crawl for nearly entire areas. It wasn't pretty, but damn, was that one of the greatest games of all time... that's my opinion at least.

Fallout, Knights of the Old Republic, hell, Minecraft, WoW, Diablo, Counter-Strike, Amnesia: The Dark Descent, Silent Hill, Ultima, Final Fantasy 7 (don't even start defending that one, the cutscenes were great but Cloud looked like Popeye) and so on. Games that didn't have good graphics even in their day and don't really age well today, but who cares.

Go play Tetris...

It's a joke, but for some games graphics play a big role in immersion, just like 3D vision (less than 3D vision in fact, but both of these aspects of presentation have a big impact, especially for those who are ready to buy high-end hardware just for that).

Anyway: a sad truth > a beautiful lie

Some things said here are harsh, I admit, but it's a matter of feeling, especially for those who put great trust in what CDPR said, and said again several times.

As far as I know, the problem is never the developers themselves or their hard work, but decisions coming from their leadership and the way it is communicated.

The game is good, but a lot of us are disappointed for good reasons; there's no arguing that last point, even if it doesn't carry the same weight for everyone.

And now it seems to happen with nearly every multiplatform game, even PC-based franchises, which is sad and disturbing...
If only it hadn't happened to this game of all games, which to me (and to others, I'm sure) was the most anticipated of the last few years...

Another equation:
greatest hope = (risk of greatest disappointment) / (trust in devs)
And here trust in devs was really high, and the fall was tough too.
 
I don't know the answer, but my guess is visual parity with the consoles. The PS4 is struggling to hold 30 fps on high settings, and the X1 runs at 900p; imagine using this lighting?

Anyway, to be honest, if CDPR said they would open up the cutscene lighting as an option in a future patch, I would totally stop playing this game now and wait for the patch!

Ha. I haven't started playing yet. Waiting for the PC version to finish evolving.
 
A quick update on the cutscene lighting: I think I found it, and it's working. It looks exactly like the screenshots Kiobi posted. Gonna record a video, better than screenshots.

Well, could you tell us how to enable it so that we can enjoy the game as well?
 
I prefer the "gameplay" settings for daylight and "dialog" for nighttime. Dialog in the day looks too washed out in the pictures, on my TV and for me.
 
I prefer the "gameplay" settings for daylight and "dialog" for nighttime. Dialog in the day looks too washed out in the pictures, on my TV and for me.

It's not "washed out", it's the ambient light reflecting off surfaces, as happens in real life.
 
Well, could you tell us how to enable it so that we can enjoy the game as well?

It's not playable: the view distance is reduced, the sky has an overblown amount of light, and there's too much light coming from the sun. And since bloom is on by default day and night, the bloom reacts and creates a "foggy" fake ambient light that hides the lack of light in shadowed areas (basically wherever sunlight doesn't reach). Or it could be a fog particle, I don't know, just speculating.

Anyways, please don't overblow this and blame CDPR for hiding some secret settings; there's probably a reason why it's not used.
 
Damn, you got our hopes up, sigh.
As I thought, the "fog" setting is hardcoded into the effect and you cannot remove it without a proper tool.
 
It's not playable: the view distance is reduced, the sky has an overblown amount of light, and there's too much light coming from the sun. And since bloom is on by default day and night, the bloom reacts and creates a "foggy" fake ambient light that hides the lack of light in shadowed areas (basically wherever sunlight doesn't reach). Or it could be a fog particle, I don't know, just speculating.

Anyways, please don't overblow this and blame CDPR for hiding some secret settings; there's probably a reason why it's not used.

Everything you've mentioned seems relatively easy to manipulate. Those things are like that because they've been set to those parameters, and because the whole setup has been relegated to triggering only during cutscenes (directed camera).

edit: It's not like CDPR said "oops, this looks weird, we can't use it, keep it for cutscenes" lol. It's like that because they set it like that. The current "cutscene lighting" is the exact lighting system they used to showcase the 35-minute "gameplay" demo for E3 2014.

I'm not at home at the moment, but I presume it's a state triggered via the env files?
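
Just to make concrete what I mean by "set to those parameters", here is a minimal sketch in Python of the idea that the cutscene look is simply a second parameter set the engine swaps in when the directed-camera state is active. Every name and value below is hypothetical, not an actual REDengine or env-file parameter:

# Hypothetical sketch: none of these names or values are real REDengine
# parameters. It only illustrates "cutscene lighting" as a parameter set
# selected by the current camera state.

GAMEPLAY_LIGHTING = {
    "sun_intensity": 1.0,
    "bloom_strength": 0.3,
    "ambient_fog_density": 0.05,
    "view_distance_scale": 1.0,
}

CUTSCENE_LIGHTING = {
    "sun_intensity": 1.6,        # stronger key light for framed shots
    "bloom_strength": 0.8,       # the "foggy" fill described above
    "ambient_fog_density": 0.20,
    "view_distance_scale": 0.5,  # reduced draw distance behind the camera
}

def active_lighting(in_cutscene: bool) -> dict:
    """Return whichever parameter set the current camera state selects."""
    return CUTSCENE_LIGHTING if in_cutscene else GAMEPLAY_LIGHTING

print(active_lighting(in_cutscene=True))

If it really works roughly like that, "enabling" the E3-style lighting in gameplay would mostly be a question of which set gets applied, which is why it feels like a deliberate choice rather than a hard technical limit.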
 
It's not playable: the view distance is reduced, the sky has an overblown amount of light, and there's too much light coming from the sun. And since bloom is on by default day and night, the bloom reacts and creates a "foggy" fake ambient light that hides the lack of light in shadowed areas (basically wherever sunlight doesn't reach). Or it could be a fog particle, I don't know, just speculating.

Anyways, please don't overblow this and blame CDPR for hiding some secret settings; there's probably a reason why it's not used.
How much has Microsoft paid you not to tell us your secrets? :D:D:D
 
edit: It's not like CDPR said "oops, this looks weird, we can't use it, keep it for cutscenes" lol. It's like that because they set it like that. The current "cutscene lighting" is the exact lighting system they used to showcase the 35-minute "gameplay" demo for E3 2014.

Still, it remains to be understood why they decided to relegate that much better lighting to cinematics only on PC. I don't think a good PC would be incapable of maintaining good FPS with that lighting on all the time, so there must be some other reason why they committed to this choice. If it's for parity with consoles, then that's a problem. Even with just the lighting from the 35-minute demo the game would look much better than it does now, and most probably without taking a huge hit to FPS either.
 
Still, it remains to be understood why they decided to relegate that much better lighting to cinematics only on PC. I don't think a good PC would be incapable of maintaining good FPS with that lighting on all the time, so there must be some other reason why they committed to this choice. If it's for parity with consoles, then that's a problem. Even with just the lighting from the 35-minute demo the game would look much better than it does now, and most probably without taking a huge hit to FPS either.

You can believe what you prefer. I've been a huge fan of CDPR since the first Witcher. But the only feasible reasoning, to me, is either A: it was a (bad) design decision, or B: it was done for visual parity with consoles.

Unless they used some crazy 128-bit precision RT for the previous lighting (which I seriously doubt), there's absolutely no reason any modern gaming system shouldn't be able to handle it without issue.

Not that it really means anything, because of the directed camera positions and so on, but I personally have no frame rate issues during cutscenes on my system.

---
#Edit: To clarify - I'm not trying to start some 'debate' here. I'm simply very disappointed in them, if true.
 
#Edit: To clarify - I'm not trying to start some 'debate' here. I'm simply very disappointed in them, if true.

Me too. My position on this whole thing is mostly incredulity more than anything else. It's like when you're told that someone you know very well and respect has done something reprehensible and you simply cannot believe it, so you desperately try to find a way to prove that what you've been told is false.

Or, as Nietzsche said:
"'I have done that,' says my memory. 'I cannot have done that' ‑ says my pride, and remains adamant. At last ‑ memory yields."
 