When I first saw the Witcher 3 gameplay at E3, I was blown away, and the same goes for the trailers. But I wasn't hyped, because I expected the requirements to be so high that I wouldn't be able to play this game for the next 4 (or so) years (my husband and I are both students, so we don't have the money to simply buy a high-end GPU yet). When the requirements finally became official, I took a second look. Noticing that I actually CAN run W3 - not on ultra of course, but on medium settings (i5, GeForce GTX 760, 8GB RAM...) - I watched Angry Joe's video on YouTube and was blown away again. My first thought was to wonder how they managed to make this game run on my system at all, even though I don't match the ultra requirements.
So if there was a downgrade (and it seems likely), they had a reason for it. Looking at YouTube videos, I can say the game looks absolutely fine on those "ultra" settings (which will probably make me bang my husband's head against the wall until he drops his encryption password, so that I can play it on his PC, which is a little bit better than mine - and he isn't even interested in the game).
I didn't even think of comparing the game to its trailers. That is never a good idea; it only leads to disappointment. I'm part of an era where screenshots almost always end up looking bad - because once you take the motion away, you have time to stare at every little detail, and you will find the ugly shit. I don't like screenshots at all. And I don't expect the game to be perfect. There will be bugs, clipping, ugly textures, collision issues, shitty particles and so on, because the game is huge, and even if a million people worked on it... anyways.
I understand that this community holds the game to really high standards. That's fine, but it hasn't even been released yet. Maybe they will patch things later, or add a huge graphics update at some point where you can turn your settings up so high it will melt your GPU. Maybe not.