PS4 graphics

Trust your instinct: it will be impossible. Sony and Microsoft have simply come out with underperforming machines this time; they are not what the PS1/PS2/PS3/Xbox/Xbox 360 were when they were released.
Don't have wild expectations for the game's graphics, and get over the fact that it won't come close to the PC version visually (I think the PS4 version will look almost like the Royal Wyvern footage).

Keep in mind that I don't want to start a system/console war, as I will play TW3 on PS4.

Come close to the PS1/PS2? What? The PS1 and PS2 were not powerful at all and were arguably (iirc) the weakest consoles of their time.

PS3 was the only time Sony tried to play the power game.

Also, yes, the consoles are said to come VERY close to the PC version, as all versions are said to look identical and 'Ultra' is said to hold "negligible differences".

Which admittedly worries me, but fuck it. But yeah, this generation is very much in line with prior generations in terms of power.

I'm also worried about how the PS4 is even comparable to High when:

>Recommended settings (770/290 + i7 processor)
>Said to achieve 1080p @ 30fps on medium-high settings

How is the PS4 even coming CLOSE to matching the recommended settings? By the sound of it, it will achieve the same results as the recommended spec despite being FAR weaker.

Either this is console parity or some god-tier optimization.
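
To put rough numbers on the "FAR weaker" point, here is a quick back-of-envelope comparison using approximate published peak-compute figures; the TFLOPS values are ballpark, and raw throughput is only a crude proxy for real in-game performance:

[code]
# Rough peak single-precision compute, in TFLOPS (approximate public figures).
# Raw TFLOPS is a crude proxy for real performance, but it shows the gap.
gpus = {
    "PS4 GPU (18-CU GCN)": 1.84,
    "GTX 770": 3.2,
    "R9 290": 4.8,
}

baseline = gpus["PS4 GPU (18-CU GCN)"]
for name, tflops in gpus.items():
    print(f"{name}: {tflops:.2f} TFLOPS (~{tflops / baseline:.1f}x the PS4)")
[/code]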

Also, @MadYarpen do you work with CDPR?
 

The graphics back then were considered astonishing and marked new standards. That's it.

Going back to the topic, there is a noticeable and substantial difference between the last Royal Wyvern footage and the previous ones. The difference will be there and it will be visible. I'm not saying the game will be bad on consoles, just don't expect miracles from a technical standpoint.
 
"The graphics back then were considered astonishing and marked new standards. That's it."

Yes. Games like The Order/Driveclub/Ryse are seen as a massive step up visually from the PS360 era.

PS1 and PS2 were not powerful at all compared to the competition. The PS2 was even weaker than the Dreamcast iirc.

"Going back to the topic, there is a sensible and substantial difference between the last Royal Wyvern footage and the previous ones."

What Wyvern footage? The GDC footage? The one shot at dawn when there's a far higher contrast?

The game looks more or less identical to prior gameplay videos; even the recent 4K 'Ultra'/'High' screenshot looked identical, with the main difference being that it was in 4K.

"The difference will be there and it will be visible"

Not from what CDPR have previously said. They said the game is identical on all versions, and until I see Ultra in motion, every gameplay video, both PC and Xbox One, has looked more or less identical.

The main differences I expect to come primarily from higher resolutions. Nvidia HairWorks etc. will obviously make a difference, but for the most part, until I'm shown Ultra in motion, the game looks to be identical.
 
"How is the PS4 even coming CLOSE to matching the recommended settings? By the sound of it, it will achieve the same results as the recommended spec despite being FAR weaker."

The main reason you can get such good performance from consoles is the APIs you are working with and the fact that you can optimize for a very narrow range of hardware. The phrase that gets thrown around is "getting close to the metal", which means you bypass a lot of the overhead you have on a PC, regardless of how well it is all set up. A PC built to the PS4's console specs probably couldn't even start a game like The Witcher 3, much less run it at minimum settings. It's also why console-to-PC ports are so notorious: the PC is a much harder platform to optimize for, so the specs for a similar experience are much higher. That's the basics, anyway.
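
To make that overhead argument concrete, here is a toy frame-budget calculation; the draw-call count and per-call costs below are invented for illustration, not measurements of any real driver or API:

[code]
# Toy frame-budget arithmetic: at 30 FPS you have ~33.3 ms per frame.
# If every draw call costs a fixed amount of CPU time inside the API/driver,
# a thinner ("closer to the metal") API leaves far more of the budget for the game.
FRAME_BUDGET_MS = 1000.0 / 30        # ~33.3 ms per frame at 30 FPS
DRAW_CALLS = 3000                    # assumed draw calls per frame (illustrative)
PER_CALL_COST_US = {
    "high-overhead PC API": 8.0,     # hypothetical microseconds per draw call
    "low-overhead console API": 1.5, # hypothetical microseconds per draw call
}

for label, cost_us in PER_CALL_COST_US.items():
    submit_ms = DRAW_CALLS * cost_us / 1000.0
    share = 100.0 * submit_ms / FRAME_BUDGET_MS
    print(f"{label}: {submit_ms:.1f} ms of a {FRAME_BUDGET_MS:.1f} ms frame spent on submission ({share:.0f}%)")
[/code]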
 
"What Wyvern footage? The GDC footage? The one shot at dawn when there's a far higher contrast?"

No. The wyvern footage is the PAX video. And I don't mean to talk down consoles, because I'm a console gamer too, but having the same textures on all versions and that kind of thing doesn't mean the game will look the same. It's clever of them to say it, because it's commercial rhetoric: the statement is both wrong and true (that's why it's clever). Textures are the same across all versions of a game; that has always been the case, and often the draw distance is too. What makes the difference on PC is the rest: texture resolution and all the nice effects consoles can't get. Saying the game is the same is just a clever way to avoid frustration for most gamers. Only the graphics freaks will worry when they read that, and there aren't that many of them.

But the game on PC really is better. Gamestar has already confirmed it. And I trust CDPR when they say the game is not downgraded. The GDC video is already superior to the PAX and hands-on videos. I don't see how it's possible for The Witcher 3 to look as good on console as on PC, but I have no doubt the PS4 version will be stunning too.
 
Best thing to do is wait, really. You have to consider that devs won't straight out say there's a huge difference, and the only answers you will probably get will be nothing more than "It plays just fine", "It looks smooth enough", "There's no major difference", "I can't tell the difference". They have to sell a product and remain neutral.

Not that I mind, since on release day it's just a matter of comparing screenshots of PC high settings against the consoles and seeing if they really are equivalent, and with the release date around the corner it's not that much of an issue. Honestly, as long as the game looks like that initial trailer, there is no problem. The article that guy posted already said there were some differences, especially with draw distance, so it's obviously not 1:1.
 

The thing is, they aren't just saying "it's smooth", "it looks fine", etc. They are directly drawing comparisons with a specific setting, a setting that a relatively high-end rig (far more powerful than a PS4) is said to barely achieve.
 
I'd take those PC requirements with a large grain (bag) of salt when it comes to speculating about where the PS4 will sit.
Take a look at the minimum Intel vs. AMD CPU requirements: the AMD is not even close to the Intel.

The 'recommended' specs? The AMD 290 outperforms even the GTX 780, let alone the 770.
The AMD 280X is the GPU most comparable to the GTX 770; it's the AMD GPU that should have been listed under 'recommended'.

So, until we get the game in our hands, some mature drivers on board, and maybe a patch or three, everything at this point is conjecture. Fun times!!!

PS: Remember when we had to just buy games off a shelf and hope they worked on our machines at home?
No internet, no speculating, heck... no advertising!
 

All that shows is that this is another Nvidia-prioritized game, which isn't much of a surprise.

But I agree with the wait-and-see. I'm just concerned about all the downgrade/console-parity talk, which seems to come up more and more, but fuck it, wait and see I suppose.
 
I can't believe this subject is still being discussed... Fact: Nearly all the people who tried the game on the hands-on events said that there is a visible difference between PS4 and PC-High (I'm guessing mostly on aliasing, LOD and pop-in wise but still). Another fact: Gamestar magazine got 60 FPS on 1080p Ultra with a GTX 980. Which kinda means that a GTX 770 will be able to get much more than 30 FPS on High settings. So there ya go; more frames per second on higher settings than console versions... (Also, if the Gamestar magazine is to be believed, the fidelity difference between High and Ultra is not that "negligible", as there are even settings for more and higher quality foliage and so on ;) ).

I would suggest you guys not take the companies too seriously when they say "Oh, all of the versions will look really good and close to each other!", as they are most probably doing PR, or there are agreements in place with the console companies.
 

This. What's more, I would also suggest that you not try to read details into the words of gaming reporters, especially when they are paraphrasing statements made by speakers who are not native speakers of English. We know that canards like "downgrade", "frame rate limit", and "same on all platforms" have entered this forum through a combination of representatives who spoke unguardedly, reporters who inadvertently or deliberately misunderstood or chose loaded words for effect, and members who have been all too willing to pounce on these because they believe the industry to be tainted. All of it is a tempest in a very small teapot.
 
" can't believe this subject is still being discussed... Fact: Nearly all the people who tried the game on the hands-on events said that there is a visible difference between PS4 and PC-High "


1. Not even the recommended settings will be achieving High (http://wccftech.com/the-witcher-3-1080p-ps4-optimization-issues/), and we already know it's only said to be able to hit 30fps. So the recommended settings (going by CDPR) are for 1080p @ 30fps on medium-high settings.
2. All I read was that the LOD was halved and that the textures looked SLIGHTLY worse on the PS4, but the journalist thought it might have just come down to the TV.

"Gamestar magazine got 60 FPS on 1080p Ultra with a GTX 980."

1. A 770 isn't even close to a 980
2. They also had a juicy i7 processor, m8. Not to mention that it didn't maintain a consistent 60fps and dropped noticeably in the major cities, nor do we know how much of a difference Ultra even makes. So no, that doesn't come close to proving what a 770 will achieve.
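
For what it's worth, even a naive linear scaling of the Gamestar figure down to a 770 (ignoring the Ultra-vs-High workload difference, the CPU, HairWorks, and everything else raised above; the TFLOPS values are approximate) only gives a ballpark, not proof either way:

[code]
# Naive linear scaling of the reported GTX 980 result down to a GTX 770.
# Assumes FPS scales with raw GPU throughput and ignores the Ultra-vs-High
# workload difference, CPU limits and HairWorks -- ballpark only.
REPORTED_980_FPS = 60.0   # Gamestar's reported 1080p Ultra figure
TFLOPS_980 = 4.6          # approximate peak single-precision compute
TFLOPS_770 = 3.2          # approximate peak single-precision compute

naive_770_fps = REPORTED_980_FPS * TFLOPS_770 / TFLOPS_980
print(f"Naive GTX 770 estimate at the same settings: ~{naive_770_fps:.0f} FPS")
[/code]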

"Also, if the Gamestar magazine is to be believed, the fidelity difference between High and Ultra is not that "negligible", as there are even settings for more and higher quality foliage and so on"

All they said was that Ultra looked gorgeous; all the journos have been saying High looks gorgeous too. They also said that in a lot of areas they couldn't tell the difference between High and Ultra.

"
I would not suggest you guys to take the companies that seriously when they say "Oh, all of the versions will look really good and close to each other!", as they are most probably doing PR"


That's not all they've said though. They directly drew comparisons with a specific setting. Simply saying "they are all gorgeous on any platform" would have sufficed, but instead CDPR themselves said current gen was equivalent to high.
 
The graphics look great across the board, so who cares? At first, yeah, sure, but I think some of the screenshots show better graphics now too. That's not what to be worried about; it's more the story, gameplay, and keeping interest. I'm almost positive it will do all of those and have amazing graphics, maybe not The Order: 1886, but then again that was a _ hour game. I prefer length and interest in games, ones that are really worth the money.
 
"equivalent to high"

Equivalent might be too strong a word, but close to PC High seems to be true from everything I've read. It's not going to be on the level of some of the screenshots and trailers, but even from the E3 Xbox One footage you can see, as pretty much everyone has said: it is the best-looking game on consoles so far. Which is no small achievement considering how busy this game is.
 
"It is the best looking game on consoles so far. Which is no small achievment considering how busy this game is."

Eh, I genuinely thought AC Unity looked better.

"Equivalent might be too strong a word, but close too pc-high "
Yea. Recommended settings are medium-high which is what I think the current gen consoles are. So all the 'comparable to high' said by the hands on players recently was most likely due to it not being fully high.

Still, fucking remarkable that CDPR managed to get the consoles optimized so well given the graphical fidelity and scope of the game. Or will it turn out to be an unoptimized mess on release? Only time will tell!
 
Idk about the best; The Order: 1886 got a lot of praise for its graphics, and they are amazing. But The Witcher 3 is much bigger, so I do think overall it will look better, in a different way... And I'm wondering how big the difference between Xbox One and PS4 will be. PS4 is close to High, Xbox close to minimum, right?
 
"Come close to the PS1/PS2? What? The PS1 and PS2 were not powerful at all and were arguably (iirc) the weakest consoles of their time."

At release, the PS1 and PS2 were absolutely considered powerful machines, more powerful than even the PC hardware of the time. They only became the "weakest" of their ilk because Sony was always first to market; the competition didn't show up until over a year later. In relative terms, the PS4 is nowhere near as powerful as its predecessors were at launch; it's a midrange PC.

As far as the PC requirements are concerned, those are always exaggerated. I also wouldn't take CDPR's word on the console version's relative performance. Yesterday Damien also said there is no difference between 900p and 1080p... I don't know about you, but my eyes can absolutely see the difference.
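
Just to quantify that 900p vs. 1080p point (assuming the usual 1600x900 and 1920x1080 render targets), the pixel-count gap is easy to work out:

[code]
# Pixel counts for the two common render targets; 1080p pushes ~44% more pixels per frame.
resolutions = {"900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")
print(f"1080p renders {pixels['1080p'] / pixels['900p'] - 1:.0%} more pixels per frame than 900p")
[/code]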
 
Unity was pretty good except for the 3-meter halo of grass that followed you when you got off a street. I think Second Son was better, but that had a lot to do with lighting. I haven't played The Order, but it's going for a very different feel, so it's hard for me to compare.

It could be a mess, but remember Damien said PS4 was the platform he's choosing to play it on, so I feel pretty good that it's going to play well.

Time will tell. That's one of the reasons I'm waiting for a review or two before I buy. I may not agree with some of the things they say, but they do tend to notice buggy or unoptimized games.
 
@Thebull94

You do realize that you are posting an article from two months ago, written about the January hands-on build, while Gamestar played the new build for two days without any limitations and said they were surprised by how much optimization had been done? The 60 FPS 1080p Ultra on a GTX 980 came from the optimized build, so the article you posted is kinda obsolete right now. Also, a GTX 980 is of course a lot different from a GTX 770, but apparently Ultra is also different from High (they even had HairWorks enabled and still got that performance, and I'm sure we all know how much of a performance hog that is). And when I said there are settings for more and higher-quality foliage in the Ultra preset, I wasn't guessing; that came from the article itself ;)

But I concur; I've had this discussion too many times and know that it will most probably go nowhere (I got that idea from the reactions I got from you, at least :) ). So I don't want to spend any more time on it. If you want to discuss more, have fun! :)

Side note to everyone else: I'm not saying that the console versions won't look good. I'm sure they will have some of the best fidelity we have seen so far (not counting The Order: 1886, as probably just the White Orchard area of The Witcher 3 is bigger than that whole game). I'm just saying that they won't be comparable to the PC version, as logic dictates, because of the scalability of the PC platform. You get the gist.
 