What I want to know is, since I don't have some supercomputer and simply run a GTX 750 Ti, 8 GB RAM and an i5-3470... how on earth is this game going to run on any next-gen consoles when, comparatively speaking, their specs aren't that much better than the PC I run? Is it simply due to poor optimization that the PC specs are so much more demanding than those of the consoles?
This is easily explained.
On the one hand you have the console:
Every single console out there runs the exact same hardware and software. Keep this in mind; it's important.
Not only that, but all drivers, the cooling, simply every single piece of a console is exactly the same.
Thus the devs know exactly what they have to deal with. There is no badly set-up console - it's plug-and-play.
Which, from a programmer's point of view, is heaven. (I am an active software developer - not for CD Projekt, to make that clear - and mod maker, just so you know that I know what I'm talking about.)
You can dig through your code and optimise every single line, every piece of the render pipeline, to run as smoothly as possible on the given specs, which you know won't change.
You can even play dirty little tricks to make the framerate jump, thanks to workarounds and optimisations that may only work with the particular chipset present in all consoles.
On the other hand you have the PC:
I honestly can't remember when I last bought a "ready" PC. You have thousands upon thousands of possibilities to build your own PC. There are thousands of combinations of hardware alone - not even speaking of software, which is a big deal on its own.
What if I told you a game - let's take The Witcher 2, since we're on CD Projekt's forums here - does not even run at the same framerate (which is only a small part of a program's actual performance) on two systems that are identical from a hardware point of view? I experience this first hand often enough during testing phases.
All dev PCs at my work are identical in terms of hardware. But you can safely bet that the app/program/game will perform worse on a graphic designer's PC than on a pure coder's. Simply because of what is installed on each PC: how messed up the overall driver installation is, the state of the registry, how full the hard drive is, how fragmented...
And that's only the influence of the PC's components. Now add PC-exclusive techniques into the mix (the PC version of TW3 uses a different AA than the consoles, because the PC's AA would be too demanding for them), nVidia's fur tech, PhysX, etc. Of course you can turn this stuff off, but there is a bare minimum the devs want you to have without killing off the game's artistic vision. This minimum is always going to be higher than on consoles, for all the aforementioned reasons.
Let's have another example:
If you went and bought the parts to build a console yourself (same CPU chip, same GPU chip, same memory, ...), your self-built "console" would perform worse than the actual console.
Why? The console has optimised drivers, optimised circuit boards, chipsets, ..., and close to no OS - as opposed to a rather demanding (in comparison) Windows installation with all its drivers. That makes it superior to your self-built "console", even if you've set everything up perfectly - which most people don't.
To make it short:
It's way harder to optimise a program for a system you don't know, which is why a PC version will always be more demanding than a console version.
Still, there is a difference between the pure limits of optimisation and poor optimisation - AC Unity, for example, is poorly optimised. Crysis was pretty well optimised, despite offering options you couldn't activate without melting your PC.
Edit: GuyNwah has the quick-to-read answer for you - mine takes time.