Obvious Downgrade

All this discussion is nonsense; all games have the same problems and performance issues at some point. The worst we saw were Unity and Watch Dogs, and they got improved with patches. DAI works pretty well on all cross-generation platforms. Now we are digging for an excuse to explain why this game doesn't run well on PC? No sir. I will tell you why this happens.

This happens because the game is AMAZING, no doubt, but it's designed and optimized for consoles, especially the PS4. The PC Witcher 3 is a PORT, and the game has only been delayed for the sake of the consoles; everyone is waiting, including CDPR itself. So the game will come out, running amazingly, and be the GOTY 2015... for consoles. PC Master Race? Good luck; pray for a miraculous optimization in the last weeks.

We spent the majority of our time on the Xbox One version and it was immediately obvious that the console versions don’t look nearly as good as the PC.

GC: It’s not necessarily a criticism of you, but consoles aren’t magic.
MI: Yeah, so it’s always a compromise and I can go back to the time of The Witcher 2 on 360 and it looked like the PC version running on medium. So, the PC, that’s the nature of the format – it’s scalable up

30 frames per second on the PS4.

Sure thing.
 
Disclaimer: I'm no hardware or software engineer, merely an enthusiast, and some of these concepts may be wrong, but I try my best.

It'll probably go to waste, but let me try to chime in...

A game that isn't optimized yet, because that phase is still ongoing, has to rely on raw hardware brute force. This proves absolutely nothing. If the game still performed like this at launch, then there would be something to it, but right now it means absolutely nothing. The game was running at roughly 30 fps on a GTX 680 when it was shown for the first time, back when it was little more than a modified TW2 on RedEngine 2. I don't know where you saw performance figures mentioned, since it was a hands-on event and nobody got to test the game's performance. But for the moment let's assume it was 30 fps: that means it's running around the same mark on a newer card without optimizations, and it has stayed there precisely because optimizations are still ongoing. This is not a bad thing at all; if the performance is still like this after optimization, THEN it's worrisome.

Assets means things like textures and models. This has nothing to do with technology like HairWorks, TXAA, PCSS or anything of the sort; those are additions to the PC version. This sudden "where are they?" is total nonsense FUD spreading.

It's a multi-platform game and all three targeted platforms are x86_64 architecture (with extensions), so it's only natural they share one code tree which later branches into three separate branches, one per platform. This again is not news; we're not in the PS3/360 era, where all three platforms had different architectures. This is the preferred method of development.

Textures...

Theoretical: Yes, all three use the same assets, and? Higher quality textures mean more VRAM usage, and the consoles have 8 GB of shared memory. The newest GPUs have 4 GB on them, and 8 GB cards are much, much rarer still. You think a 4 GB card is going to run textures of substantially higher quality than 8 GB?

Practical: 8 GB is probably not the entire pool available to the game; some of it is no doubt used by OS functions, but that still leaves more than 4 GB in total for ALL video data. That's not how it works on a PC, where data has to be copied back and forth between system RAM and VRAM all the time.
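To put rough numbers on the texture side (purely illustrative; the resolutions, formats and counts below are my own assumptions, not anything from CDPR):

```python
# Back-of-envelope VRAM cost of textures; figures are illustrative assumptions,
# not measurements from The Witcher 3 or any other game.

MIP_OVERHEAD = 4 / 3  # a full mip chain adds roughly one third on top of the base level

def texture_mib(width, height, bytes_per_texel, mips=True):
    """Approximate size of a single texture in MiB."""
    size = width * height * bytes_per_texel
    if mips:
        size *= MIP_OVERHEAD
    return size / (1024 ** 2)

# Uncompressed RGBA8 is 4 bytes per texel; block-compressed BC1 is ~0.5, BC7 is ~1.
for side in (1024, 2048, 4096):
    print(f"{side}x{side}: RGBA8 ~{texture_mib(side, side, 4):.0f} MiB, "
          f"BC7 ~{texture_mib(side, side, 1):.0f} MiB")

# A scene streaming, say, 300 BC7 textures at 2048x2048 would already need:
total_gib = 300 * texture_mib(2048, 2048, 1) / 1024
print(f"300 x 2048x2048 BC7 textures: ~{total_gib:.1f} GiB")
```

Whatever the exact budget ends up being, the point stands: it's the memory budget that decides texture quality, not which platform the assets were authored on.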

TFLOPS figures are useless in this day and age; hardware instruction sets run our world.

Last but not least, the API.
DirectX 11 is a high-level API that has been shown to hit its limit at around 10,000 draw calls, and even that is pushing it, AND it pushes them one by one through the CPU, in a single lane.

Consoles use a low-level API, and a low-level API can push as many as 100,000 (or more) draw calls. I obviously don't know the specifics of the console APIs, but that's beside the point; the point is that the difference is huge. The machines can extract A LOT more juice out of the same hardware, and they don't push draw calls single-file either. Add the reduced overhead of the API itself and you can see the advantage they have.
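A rough way to see why that draw-call ceiling matters (the per-call costs below are ballpark assumptions I made up for illustration, not benchmarks of DX11 or any console API):

```python
# How many draw calls fit in a frame if each submission costs the CPU a fixed amount?
# Per-call costs are assumed ballpark figures, purely for illustration.

FRAME_BUDGET_MS = {30: 1000 / 30, 60: 1000 / 60}  # milliseconds available per frame

ASSUMED_COST_MS = {
    "high-level API (DX11-style)": 0.002,     # assume ~2 microseconds of CPU per call
    "low-level API (console-style)": 0.0002,  # assume ~0.2 microseconds per call
}

for api, cost in ASSUMED_COST_MS.items():
    for fps, budget in FRAME_BUDGET_MS.items():
        # Assume only half the frame's CPU time can go to draw-call submission.
        calls = int(0.5 * budget / cost)
        print(f"{api}: ~{calls:,} draw calls per frame at {fps} fps")
```

With numbers in that ballpark you land close to the roughly 10,000 vs 100,000 figures people throw around, which is the whole point: the ceiling is set by CPU-side submission overhead, not by the GPU silicon.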

TL;DR version:
So what am I really saying?

Consoles are stronger than current PCs? No.

Consoles are using the hardware way more efficiently than current PCs? Yes, and it would be short-sighted to underestimate that, put your fingers in your ears and yell "PC! PC! PC! PC!", and ignore it.

It's premature to worry about performance right now because the optimizations are still underway; do not believe random framerate figures anybody makes up. It was a hands-on event; nobody could have tested performance in that environment. Adam Badowski is not a technical engineer; he is making an estimate based on what he knows or what he was told, not giving a figure set in stone.

All platforms use the same architecture, so it's natural they share a codebase; this is not unusual.

Assets like textures and models being the same is not unusual either. That depends on the resource budget, not on what platform it's running on. Nor does a regular PC have more VRAM than the consoles to allow for exceptionally higher quality textures.

You don't need to be a software engineer to know that RAM, whether VRAM or system RAM, is all the same: it's all RAM. Some video cards use normal DDR RAM rather than GDDR, and some systems run on GDDR; the difference is in speed and latency.

So my computer has 12 GB of RAM: 8 for the system, 4 on the video card. Have you seen any game using 8 GB of RAM? LOL, 3.5 at most, not even close to 4.

Consoles can use only up to 5 GB max, and no game uses all of it. You don't use VRAM for everything; consoles are computers too and need system RAM, and games use RAM for different things, since not everything needs high-speed GDDR5. So 3 GB of VRAM is enough for the data console games use for textures and display; other data can run in normal RAM, and even if it doesn't, with 4 GB you are fine. The Witcher 3: Wild Hunt won't use 8 GB of RAM; it will use 2-3, maybe 4.
 
Baloney.

A PC exclusive was not a viable proposition at the time it was planned, and it is not one now. Not on this scale, not at this quality, not in a world where developers have to be paid. Without the console market, there isn't the money to make a game like this for PC.

Really? Then how were The Witcher 1 and The Witcher 2 made?
For PC, optimized for PC, with amazing graphics for PC.

Now, with The Witcher 3, they've ruined it all with the downgrades, ports, bad performance, delays, etc.
 
Disclaimer: I'm no hardware or software engineer, merely an enthusiast, and some of these concepts may be wrong, but I try my best.

It'll probably go to waste, but let me try to chime in...

A game that isn't optimized yet, because that phase is still ongoing, has to rely on raw hardware brute force. This proves absolutely nothing. If the game still performed like this at launch, then there would be something to it, but right now it means absolutely nothing. The game was running at roughly 30 fps on a GTX 680 when it was shown for the first time, back when it was little more than a modified TW2 on RedEngine 2. I don't know where you saw performance figures mentioned, since it was a hands-on event and nobody got to test the game's performance. But for the moment let's assume it was 30 fps: that means it's running around the same mark on a newer card without optimizations, and it has stayed there precisely because optimizations are still ongoing. This is not a bad thing at all; if the performance is still like this after optimization, THEN it's worrisome.

Assets means things like textures and models. This has nothing to do with technology like HairWorks, TXAA, PCSS or anything of the sort; those are additions to the PC version. This sudden "where are they?" is total nonsense FUD spreading.

It's a multi-platform game and all three targeted platforms are x86_64 architecture (with extensions), so it's only natural they share one code tree which later branches into three separate branches, one per platform. This again is not news; we're not in the PS3/360 era, where all three platforms had different architectures. This is the preferred method of development.

Textures...

Theoretical: Yes, all three use the same assets, and? Higher quality textures mean more VRAM usage, and the consoles have 8 GB of shared memory. The newest GPUs have 4 GB on them, and 8 GB cards are much, much rarer still. You think a 4 GB card is going to run textures of substantially higher quality than 8 GB?

Practical: 8 GB is probably not the entire pool available to the game; some of it is no doubt used by OS functions, but that still leaves more than 4 GB in total for ALL video data. That's not how it works on a PC, where data has to be copied back and forth between system RAM and VRAM all the time.

TFLOPS figures are useless in this day and age; hardware instruction sets run our world.

Last but not least, the API.
DirectX 11 is a high-level API that has been shown to hit its limit at around 10,000 draw calls, and even that is pushing it, AND it pushes them one by one through the CPU, in a single lane.

Consoles use a low-level API, and a low-level API can push as many as 100,000 (or more) draw calls. I obviously don't know the specifics of the console APIs, but that's beside the point; the point is that the difference is huge. The machines can extract A LOT more juice out of the same hardware, and they don't push draw calls single-file either. Add the reduced overhead of the API itself and you can see the advantage they have.

TL;DR version:
So what am I really saying?

Consoles are stronger than current PCs? No.

Consoles are using the hardware way more efficiently than current PCs? Yes, and it would be short-sighted to underestimate that, put your fingers in your ears and yell "PC! PC! PC! PC!", and ignore it.

It's premature to worry about performance right now because the optimizations are still underway; do not believe random framerate figures anybody makes up. It was a hands-on event; nobody could have tested performance in that environment. Adam Badowski is not a technical engineer; he is making an estimate based on what he knows or what he was told, not giving a figure set in stone.

All platforms use the same architecture, so it's natural they share a codebase; this is not unusual.

Assets like textures and models being the same is not unusual either. That depends on the resource budget, not on what platform it's running on. Nor does a regular PC have more VRAM than the consoles to allow for exceptionally higher quality textures.

Thanks for the insightful reply :).

I agree that optimizations are ongoing, especially in the late months of development, and there is a lot to be expected of them. I think the questions about assets stem from two things: the fact that one trailer showed the king of the Wild Hunt with what looked like quite degraded textures, and the idea that this stemmed from the memory constraints of the consoles. I have no idea if this is true, but there you have it.

@facemeltingsolo made the valid point that, while consoles do have 8 GB of memory, the amount that can effectively be dedicated to assets and textures could be less than what a 4 GB video card would allow. It depends on the engine, asset streaming and all kinds of things we have absolutely zero idea about. But this is where part of the speculation and trouble comes from.

I completely agree about the rest of your post regarding hardware efficiency as you already know :).

All in all, we can only wait and wish for the best.
 
So it's time to accept not only that the console market is a contributor to the success of this game, but also that the consoles in some ways exceed the performance of PCs and are not simply there to be disparaged as a poor man's platform for inferior games and blamed for supposed downgrades of cross-platform games.

Let's not pretend that consoles don't destroy almost everything they touch; they are always much worse graphically, even at sub-30 fps. They are inferior, and they damage every single game made with them in mind, graphically or in any other way: Watch Dogs, The Division (which has already been downgraded thanks to consoles), Thief, etc. At the end of the day, the endless marketing theory about console performance doesn't seem to reflect reality.

This is reality, even if console users don't like it.

A PC exclusive is quite viable; I don't know where you got that idea. Crysis 1 was a PC exclusive, sold much more than the following multiplatform games, and was graphically much better than anything else. I don't know about you, but neither Sony nor Microsoft has brainwashed me yet.
Also, why don't we take a look at The Witcher 2's Xbox sales? I'm sure they would be quite revealing.

And if the game isn't an exclusive, the minimum I expect is models and textures made with the PC in mind, and then downgraded console versions.
 
You do realize that CDPR almost went bankrupt before they released Witcher 2, right? And they barely made it out financially post-release btw.

I didn't know that.
But it was thanks to the fans; they bought their game, and it was a success.

But now their reputation is on the line.

What do you think will happen when they release the game?
The reviewers may destroy this game, and this may affect the sales.

This happened a while ago with Aliens: Colonial Marines.
They hyped us with an amazing gameplay trailer, and in the end, we all know what happened.
 
There is an even simpler explanation: consoles + greed = doomed the moment it was announced. There are around half a billion games that show this.

I mean, you all know what happens to every single game that is suddenly made for consoles, right?

I get the feeling proponents of the downgrade thesis - which I'm partially aligned with - keep shooting themselves in the foot to the point of having no toes left.

Greed?

The game has been delayed twice to accommodate more polishing, at the risky cost of potentially aggravating shareholders, discouraging buyers and thus depreciating overall company value, and you pin this on greed, of all things?

Be reasonable.

You have got to come to terms with the fact that consoles hold a very significant market share. Whether you like it or not, AAA open world RPGs cannot be realized while alienating console gamers. Period. It's either The Witcher 3 on PC and on consoles or no The Witcher 3 at all. Simple as that.
 
Disclaimer: I'm no hardware or software engineer, merely an enthusiast, and some of these concepts may be wrong, but I try my best.

It'll probably go to waste, but let me try to chime in...

A game that isn't optimized yet, because that phase is still ongoing, has to rely on raw hardware brute force. This proves absolutely nothing. If the game still performed like this at launch, then there would be something to it, but right now it means absolutely nothing. The game was running at roughly 30 fps on a GTX 680 when it was shown for the first time, back when it was little more than a modified TW2 on RedEngine 2. I don't know where you saw performance figures mentioned, since it was a hands-on event and nobody got to test the game's performance. But for the moment let's assume it was 30 fps: that means it's running around the same mark on a newer card without optimizations, and it has stayed there precisely because optimizations are still ongoing. This is not a bad thing at all; if the performance is still like this after optimization, THEN it's worrisome.

Assets means things like textures and models. This has nothing to do with technology like HairWorks, TXAA, PCSS or anything of the sort; those are additions to the PC version. This sudden "where are they?" is total nonsense FUD spreading.

It's a multi-platform game and all three targeted platforms are x86_64 architecture (with extensions), so it's only natural they share one code tree which later branches into three separate branches, one per platform. This again is not news; we're not in the PS3/360 era, where all three platforms had different architectures. This is the preferred method of development.

Textures...

Theoretical: Yes, all three use the same assets, and? Higher quality textures mean more VRAM usage, and the consoles have 8 GB of shared memory. The newest GPUs have 4 GB on them, and 8 GB cards are much, much rarer still. You think a 4 GB card is going to run textures of substantially higher quality than 8 GB?

Practical: 8 GB is probably not the entire pool available to the game; some of it is no doubt used by OS functions, but that still leaves more than 4 GB in total for ALL video data. That's not how it works on a PC, where data has to be copied back and forth between system RAM and VRAM all the time.

TFLOPS figures are useless in this day and age; hardware instruction sets run our world.

Last but not least, the API.
DirectX 11 is a high-level API that has been shown to hit its limit at around 10,000 draw calls, and even that is pushing it, AND it pushes them one by one through the CPU, in a single lane.

Consoles use a low-level API, and a low-level API can push as many as 100,000 (or more) draw calls. I obviously don't know the specifics of the console APIs, but that's beside the point; the point is that the difference is huge. The machines can extract A LOT more juice out of the same hardware, and they don't push draw calls single-file either. Add the reduced overhead of the API itself and you can see the advantage they have.

TL;DR version:
So what am I really saying?

Consoles are stronger than current PCs? No.

Consoles are using the hardware way more efficiently than current PCs? Yes, and it would be short-sighted to underestimate that, put your fingers in your ears and yell "PC! PC! PC! PC!", and ignore it.

It's premature to worry about performance right now because the optimizations are still underway; do not believe random framerate figures anybody makes up. It was a hands-on event; nobody could have tested performance in that environment. Adam Badowski is not a technical engineer; he is making an estimate based on what he knows or what he was told, not giving a figure set in stone.

All platforms use the same architecture, so it's natural they share a codebase; this is not unusual.

Assets like textures and models being the same is not unusual either. That depends on the resource budget, not on what platform it's running on. Nor does a regular PC have more VRAM than the consoles to allow for exceptionally higher quality textures.

The console SDK gives access to something like 4.5 GB TOTAL, to be used for the game and VRAM combined. Here, this should help.

http://www.dualshockers.com/2014/04...am-cpu-and-gpu-compute-to-make-our-jaws-drop/

Comparing a 4 GB VRAM card and an i7 with 8 GB of system RAM to a "next gen console" is like comparing a Yugo to a Ferrari. Want to know why the consoles didn't use the 6 GB VRAM textures, or even the next step down, in Shadow of Mordor? They don't have the RAM available for VRAM. You do realize we have MSI Afterburner, right? We can see memory usage in these "next gen games" at console settings, and the lower the resolution, the less VRAM (and Witcher 3 probably isn't going to be 1080p).

Anyway, I am done with this thread. We are now being sold technical fairy tales.

You heard it here first, people: Star Citizen on Xbox One and PS4. Here is a pre-alpha screenshot.

http://www.wcnews.com/background/images/sm1screenshot08.gif

Also, a 2010 GTX 480 that was UNDERCLOCKED managed to beat the PS4/Xbox One in the horribly ported Watch Dogs at console settings. Why? Because the consoles are just so awesome.

https://www.youtube.com/watch?v=tYG-7T2LVDc

Whatever though, CDPR. Obviously Monolith with a third-party modded engine > CDPR as far as being a PC developer goes. Enjoy selling this crap. No one is going to believe it except people who own XB1s and PS4s, and they aren't here; they are on IGN.com. That is who you want to sell to though, so have a blast.
 
Greed, appealing to the mass market to get more money at the cost of quality. Delays are made because they are considered profitable.

You even admit that the relevant factor here is market share, money.

It can be done. Period. Make it for PC with no console limitations or resources used for console versions. Simple as that.
 
What do you think will happen when they release the game?
The reviewers may destroy this game, and this may affect the sales.

Did you even look at the previews? Did the majority of the journos who went there seem to say that the game was awful? Please link me an article that says that (DO NOT LINK ME THE NATHAN GRAYSON ONE :) )
 
Delays are made because they are considered profitable.


In what bizarro world?
 
Did you even look at the previews? Did the majority of the journos who went there seem to say that the game was awful? Please link me an article that says that (DO NOT LINK ME THE NATHAN GRAYSON ONE :) )

He actually liked it quite a bit, but loaded the article with clickbait, rage-inducing snobbery of his to get Kotaku dem monies.
 
You don't need to be a software engineer to know that RAM, whether VRAM or system RAM, is all the same: it's all RAM. Some video cards use normal DDR RAM rather than GDDR, and some systems run on GDDR; the difference is in speed and latency.

So my computer has 12 GB of RAM: 8 for the system, 4 on the video card. Have you seen any game using 8 GB of RAM? LOL, 3.5 at most, not even close to 4.

Consoles can use only up to 5 GB max, and no game uses all of it. You don't use VRAM for everything; consoles are computers too and need system RAM, and games use RAM for different things, since not everything needs high-speed GDDR5. So 3 GB of VRAM is enough for the data console games use for textures and display; other data can run in normal RAM, and even if it doesn't, with 4 GB you are fine. The Witcher 3: Wild Hunt won't use 8 GB of RAM; it will use 2-3, maybe 4.

No, RAM is not all the same. Except in uniform memory architectures, system RAM and VRAM are in different address spaces, and processors can access one but not the other. Your PC with 8 GB system RAM and 4 GB VRAM is NOT a PC with 12 GB RAM; it is a PC with separate 8 GB and 4 GB memories. Furthermore, most of what is in the VRAM is duplicated in the system RAM. So you don't even have 8 + 4; you have something like (8 - 3) + 4, less what Windows uses for itself.

On a traditional PC with PCI-e GPU, the GPU can render only from data that are already in VRAM. If a texture is needed and it's not already loaded into VRAM, it has to be moved in. At the least, this costs you a PCI-e transfer from main memory to the GPU VRAM, and some CPU time to supervise it. It may also require stealing memory from a resource that wasn't used recently, to make room in the VRAM address space. (This is where the 970 in particular gets into trouble.)

PS4 and Xbox have partial implementations of UMA. They're not full AMD hUMA; that was developed too late for these processors. But they're clever enough to avoid the mandatory copy between system RAM and VRAM that exists on the PC. This means that 8 GB of PS4 RAM is more than, say, 6 GB of PC RAM plus 4 GB of GPU VRAM.
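As a rough sketch of what that mandatory copy costs on a discrete-GPU PC (the bandwidth and data sizes below are illustrative assumptions, not measurements):

```python
# Cost of moving texture/mesh data from system RAM to VRAM over PCIe,
# versus a unified-memory console where the GPU reads the same physical RAM.
# All figures below are illustrative assumptions.

PCIE3_X16_GBPS = 16.0      # theoretical PCIe 3.0 x16 bandwidth, roughly 16 GB/s
EFFECTIVE_FRACTION = 0.75  # assume real-world transfers reach ~75% of theoretical

def upload_ms(megabytes):
    """Time to copy `megabytes` from system RAM to VRAM, in milliseconds."""
    gigabytes = megabytes / 1024
    return gigabytes / (PCIE3_X16_GBPS * EFFECTIVE_FRACTION) * 1000

# Suppose streaming into a new area needs ~500 MB of assets moved to the GPU.
print(f"500 MB over PCIe: ~{upload_ms(500):.1f} ms (more than one 30 fps frame)")
print(f" 50 MB over PCIe: ~{upload_ms(50):.1f} ms")

# On the consoles' unified memory there is no equivalent copy step: the GPU can
# read what the CPU wrote in place, so this particular cost simply isn't paid
# (contention and scheduling aside).
```

That copy is exactly why engines spread streaming over many frames on PC, and it's one concrete way a unified pool of 8 GB goes further than the raw numbers suggest.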

I'll leave the avoidance of Windows and the inefficiencies of DirectX 11 as an exercise for the reader. Sidspyker already discussed the importance of these. Bottom line, as an engine for running games, the consoles are substantially superior to a comparably equipped PC, and nobody should be surprised that they perform better when running a well-engineered game.
 
The console SDK gives access to something like 4.5 GB TOTAL, to be used for the game and VRAM combined. Here, this should help.

http://www.dualshockers.com/2014/04...am-cpu-and-gpu-compute-to-make-our-jaws-drop/

Comparing a 4 GB VRAM card and an i7 with 8 GB of system RAM to a "next gen console" is like comparing a Yugo to a Ferrari. Want to know why the consoles didn't use the 6 GB VRAM textures, or even the next step down, in Shadow of Mordor? They don't have the RAM available for VRAM. You do realize we have MSI Afterburner, right? We can see memory usage in these "next gen games" at console settings, and the lower the resolution, the less VRAM (and Witcher 3 probably isn't going to be 1080p).

Anyway, I am done with this thread. We are now being sold technical fairy tales.

You heard it here first, people: Star Citizen on Xbox One and PS4. Here is a pre-alpha screenshot.

http://www.wcnews.com/background/images/sm1screenshot08.gif

Also, a 2010 GTX 480 that was UNDERCLOCKED managed to beat the PS4/Xbox One in the horribly ported Watch Dogs at console settings. Why? Because the consoles are just so awesome.

https://www.youtube.com/watch?v=tYG-7T2LVDc

Whatever though, CDPR. Obviously Monolith with a third-party modded engine > CDPR as far as being a PC developer goes. Enjoy selling this crap. No one is going to believe it except people who own XB1s and PS4s, and they aren't here; they are on IGN.com. That is who you want to sell to though, so have a blast.

I'm quite hostile towards consoles just like you and, while I recognize they can make better use of their resources, I think they are underpowered and are indeed holding PCs back. But we're talking about TW3. Many here criticize and talk crap about a game which isn't out yet, without knowing the actual differences between platforms, based only on assumptions and without reliable information that only the developers hold. And they're right, given how much FUD gets spread around. While part of your reasoning may be acceptable, it does not imply (even from a strictly graphical standpoint) that TW3 cannot and won't be a great game.
 
I think I'm also done with this thread. There is nothing more to be gained from talking about this; I think our points of concern have come across.

I am also a bit wary of being attacked for the crime of discussing console hardware in order to understand where we are, even though I fully agree about the PC's superior performance.

@Guy N'wah Thanks for clearing that one up, I didn't want to get into it ^^".

Good night :).
 
Greed, appealing to the mass market to get more money at the cost of quality. Delays are made because they are considered profitable.

You even admit that the relevant factor here is market share, money.

It can be done. Period. Make it for PC with no console limitations or resources used for console versions. Simple as that.


Which AAA open-world, story-driven RPG released on PC alone in recent years can you name off the top of your head?

Even a significantly more modest title like Kingdom Come: Deliverance went cross-platform. Sure, you might be able to finance a PC-only TW3; I mean a Lilliputian version of it, a crippled mockery of REDs' original vision. Thanks, but no thanks.

The irony is that you're protesting - and in my eyes, rightfully so - against the graphical downgrade, and yet you're eager to deprive CD Projekt of the very bread and butter that has allowed them to aim for that kind of graphical fidelity in the first place.

You're ignoring basic business model realities.
 