Let's not pretend that consoles don't damage almost everything they touch; they are always much worse graphically, even at sub-30 fps. They are inferior hardware and drag down every single game made with them in mind, graphically or in any other way: Watch Dogs, The Division (already downgraded thanks to consoles), Thief, etc. At the end of the day, the endless marketing theory about console performance doesn't seem to reflect reality.
This is reality, even if console users don't like it.
A PC exclusive is quite viable; I don't know where you got that idea. Crysis 1 was a PC exclusive, sold much more than the multiplatform games that followed, and was, graphically, much better than anything else. I don't know about you, but Sony and Microsoft haven't brainwashed me yet.
Also, why don't we take a look at The Witcher 2's Xbox sales? I'm sure they'd be quite revealing.
And if the game isn't an exclusive, the minimum I expect is models and textures made with the PC in mind, then downgraded for the console versions.
Others have pointed out, correctly as far as my technical understanding goes, that the consoles exceed the resource-handling ability of PCs. So if anything, the textures have to be downgraded for the limited VRAM and the additional copies required only on the PC; the consoles can run textures that only 4GB+ cards can handle on PC.
Claims that "consoles destroy almost everything they touch" don't have enough substance or foundation to be worth refuting.
I think what he is getting at is the case where texture resolution can be higher on PC than on consoles, thanks to the greater actual VRAM available. But we've been there time and time again ^^".
I really should go, this might be condescending or arrogant but thanks to everyone that has engaged calmly in this. If nothing else, it does feel like you guys listen even if you don't agree.
And again, IT'S NOT TESSELLATION, IT'S BUMP MAPPING, WHICH IS FAR SIMPLER AND MUCH OLDER.
Actually, the point is that there is no greater VRAM pool available on the PC. The PC is limited to GPU VRAM, because resources must be copied into VRAM before they can be used. The consoles avoid copying resources to VRAM, so they can use more than 4GB of system RAM for GPU resources.
It's really not important anyway, so I don't want to get hung up on this. But if we do the maths, consoles have between 4.5 and 5 gigs to work with. So between physics, sound, game logic and other stuff, you end up with something in the ballpark of 2.5 to 3 gigs of strictly GPU data. Don't get me wrong, that data is usable insanely efficiently, with special caching and direct write access. But in terms of raw numbers, I think the pool is not as large as on PC.
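A rough back-of-the-envelope version of that budget, using the ballpark figures above (the OS reservation and the CPU-side split are assumptions, not official numbers):

```python
# Rough console memory budget sketch; all figures are ballpark
# estimates from the discussion above, not official numbers.
total_ram_gb = 8.0
os_reserved_gb = 3.0          # assumed OS/system reservation -> ~5 GB usable
game_available_gb = total_ram_gb - os_reserved_gb

cpu_side_gb = 2.0             # physics, sound, game logic, etc. (assumption)
gpu_data_gb = game_available_gb - cpu_side_gb

print(f"Available to the game: {game_available_gb:.1f} GB")
print(f"Ballpark GPU data pool: {gpu_data_gb:.1f} GB")
```

Shift the assumed splits around and you land anywhere in the 2.5 to 3 gig range mentioned above, which is the point: it's comfortable, but not obviously bigger than a high-end PC card's dedicated pool.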
Of course, PCs will have to eat the huge latency cost of swapping data out and back in when they hit the 4GB cap. And yes, that was a cheap shot at the GTX 970 earlier on; I laughed sourly at mine.
G'night !
Uh, no, an 8GB console has more RAM to work with than an 8GB PC does. They have no operating system, or only a kernel with a minimal userland, not the hippopotamic bloat of Windows. The game code is the same; the key is the graphical resources, which make up the great majority of the game content and are kept in RAM either way, only on the consoles there is just one copy.
What makes you think it'll look worse? Most people who describe it say they like the scene itself, but the visuals there are... pretty damn bad, honestly: everything looks half-done/unfinished and very rough. The atmosphere is what captures you, but the individual assets? They are really bad. Look at the wood on the windmill, the character models, the 2D-looking huts, grass so sharp it looks as if it would cut you, Geralt's hair.
What I've found is that every time someone mentions it, they describe the atmosphere, the scene and the liveliness of it, and mistake that for visual fidelity.
4K textures? Uh, no, no game exists that uses 4K textures throughout. Max Payne 3 had 2K textures for EVERYTHING on PC and was 35GB because of it; it wasn't an open-world game, it wasn't extremely big, and it was still 35GB. Do you really want to imagine the size of a game that uses 4K textures throughout? Not to forget that we don't have the necessary VRAM available to support such a game either.

For sure a high-end PC can handle 4K textures, and I doubt a console can handle 2K. If the game were a PC exclusive, I'm pretty sure we would see much better graphics, same as in the early trailers.
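To put a number on why going from 2K to 4K throughout blows up a game's size: doubling the side of a texture quadruples the texel count. A quick sketch, where the bytes-per-texel figure and mip overhead are illustrative assumptions (roughly DXT5/BC7-class block compression with a full mip chain):

```python
def texture_mib(side_px, bytes_per_texel=1.0, mip_overhead=4/3):
    """Approximate size of one compressed texture with a full mip chain.

    bytes_per_texel=1.0 is an assumption roughly matching DXT5/BC7-class
    block compression; mip_overhead=4/3 accounts for the mip pyramid.
    """
    return side_px * side_px * bytes_per_texel * mip_overhead / 2**20

size_2k = texture_mib(2048)
size_4k = texture_mib(4096)
print(f"2K texture: ~{size_2k:.1f} MiB")
print(f"4K texture: ~{size_4k:.1f} MiB ({size_4k / size_2k:.0f}x the memory)")
```

So whatever a 2K-everywhere game weighs, the 4K-everywhere version of it is in the neighbourhood of four times that, on disk and in VRAM.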
No, it doesn't. The PS4, for example, has at most 5.5GB of memory available to developers. They do run a form of operating system. And that 5.5GB is shared; it's not exclusively available to the GPU, unlike the 4GB in a GTX 980, for example. And that's not even taking into account the gulf in computational power a PC has. Hence why Watch Dogs, for example, looks and runs far better on a good PC than on any console. At least once its high-end graphical features are re-enabled, that is.
http://www.vg247.com/2013/07/26/ps4...pers-4-5gb-guaranteed-1gb-of-flexible-memory/
http://www.dualshockers.com/2014/03...l-and-how-they-can-make-them-run-really-fast/
That's a strength, not a weakness. On PC, that data has to be copied to both System RAM and VRAM.
What is even going on in that PC shot?
Basic fact about VRAM. It's a duplicate of system RAM, not an extension of it. There is not an additional 4GB of RAM in a PC with a 4GB GPU.
A 5GB program including graphical resources takes 5GB on an 8GB console and 5GB on an 8GB PC, except on the PC it also takes maybe 3GB of additional VRAM.
When CPU cycles are not the limiting resource, the performance difference between a Jaguar and a Core i5 is spectacular but quite meaningless.
My point is not that a PC cannot exceed the performance of a console. My point is that we have to drop the prejudice that a console is automatically inferior. And if we do not drop at the same time the unfounded belief that consoles are damaging to PC gaming, we are all in for a hard lesson.