For those PC gamers who are craving a Min vs Max comparison

I'm baffled by the rage that's ensued. I remember when we only cared about a great game being made. But now our priorities have shifted to having a game that's better looking than the same game on another platform.

A game is no longer judged on its own merits. It's now judged by how it measures up against itself on another platform.

Bro - for me - it honestly comes back to how they portrayed the game in 2013 and at E3 2014. It was misleading, and before you say anything, I did cancel my pre-order, and I am bummed. I saved up some money to upgrade my PC to run this game at full quality, and I won't come close to utilizing it.

Go ahead, you can make fun of me, but PCs can be upgraded for a reason. A $400 console should not look the same as a $2k gaming computer. Just my humble opinion. No need to flame.
 
http://www.geforce.com/whats-new/articles/the-witcher-3-wild-hunt-is-your-system-ready

Look what an Intel i7-5960X and 16GB DDR4 RAM with The Witcher 3: Wild Hunt Game Ready GeForce GTX driver can do with a dedicated GPU:
1920x1080, High settings: GTX 960
1920x1080, Uber settings: GTX 970
1920x1080, Uber settings w/ GameWorks: GTX 980
2560x1440, Uber settings: GTX 980
2560x1440, Uber settings w/ GameWorks: GTX TITAN X, or 2-way SLI GTX 970
3840x2160, Uber settings: GTX TITAN X, or 2-way SLI GTX 980
3840x2160, Uber settings w/ GameWorks: 2-way SLI GTX 980 or GTX TITAN X

And from my experience, Nvidia GeForce Experience doesn't optimize for a 60 fps level of performance by default.
And why am I supposed to care about Nvidia's hardware recommendations when an unrelated, independent, and well-reputed firm - PCGH - has had its say and tested it out?
 
Yeah guys - I honestly think that is way overshot. You won't need that much hardware to run the game. I can almost guarantee it.
 
There are always two sides to the discussion when it comes to making a game, and in a way I'm glad CDPR chose this one: instead of making a game that runs on very few PC configurations (because you would need Titan X's and dual-SLI 980's to run it), they decided to make it accessible to more people and very scalable in terms of hardware requirements.

The needs of the many (in this case, CDPR wanting more people to play their game by going multiplatform and making a build that scales very well with your hardware) outweigh the needs of the few - those few who care about graphics in an RPG. Planescape: Torment, Gothic, Baldur's Gate, and all those amazing RPGs of old say hello; they didn't give a damn about graphical fidelity, and they have remained in history as some of the best RPGs of all time. And yet here we are, bickering about how dense the fucking grass is on which configuration.

Based on that PC Games Hardware article, the game is very well optimized and doesn't need a high-end CPU to look good and run:

"PCgameshardware.de posted: They are running the game in 4k (downsampled) on a single titan x (w/o hairworks).

- In general they seem to be very pleased with the game (no day1 patch and no nvidia optimized drivers so far)

- Multicore CPU useage is good.

- The game doesn't seem to be very CPU heavy (they mention that they were able to downcloak their test CPU to 2 ghz and the game still ran good).

- High End GPUs can render the game in 1440p (ultra, w/o hairworks).

-Mid range GPUs run the game in 1080p (ultra, w/o hairworks).

- VRAM usage is pretty moderate: 2560x1440 ~ 2.5 gb vram (max usage).

-No loading screens."

- Tessellation is not that high (8x-16x), but it still looks good.

- NVIDIA HairWorks is very performance-hungry and should only be used with high-end GPUs (keep in mind this is before the day-1 patch and Nvidia drivers).

The conclusion is this: aesthetics > technical. The game looks very pretty even if every pore on every human's face and every wall texture is not crisp and sharpened to kingdom come like in those first screens, which, personally, made my eyes bleed. You can do so yourself by enabling that option.

At the end of the day, you play Witcher games for the story, not for the graphics.
I fully agree with you. In terms of performance and optimization, TW3 seems to excel, as indicated by that German website (god bless Germans <3), and it's nothing short of amazing. Plus the game still looks excellent, and I'm sure it'll be even better on release.
The game looks rather close to the SOD trailer, at least I think so. And even if I'm mistaken, don't forget that the game will still have support for at least 2 years post-release, meaning they may improve it even more graphically, solve potential bugs, and improve the overall experience for PC users. :D
 
Personally, I'm waiting for 2 things to happen:

1. PC folks with high-end graphics cards being mad that the game also runs well on mid-tier PCs. I think we're quite near this point. And it's going to be a glorious sight.

2. People realising that how the game looks does not equal how the game looks+plays+feels.
 
I hear what you are saying, but I think you are still missing the point. The very essence of owning a PC is that you can (IF YOU CHOOSE - and have the money to) run the best graphics possible. If all games are going to have across-the-board system parity, you are correct in your thought process.

See, I used to think exactly the same, if you'd asked me a few months ago. Now I think having a PC only to play games on it - and thus having to upgrade it basically every year - is very much a luxury. If there is a cheaper alternative that is also good, I'd like it. I never wanted a console because I was able to upgrade my PC without having to spend THAT MUCH to get better graphics than consoles had. Also, most stuff on consoles looked like shit. Now I feel those times could be over, and if I am honest, even if I had the money, I don't think spending that much on a GPU just to compete for 2 more years is worth it.

I feel bad for everyone who upgraded their PC in expectation of cinematic graphics. I really do. But CDPR is not to blame, because it wasn't a downgrade. You can't downgrade something that doesn't exist... the game wasn't finished back then. They made design decisions, and they also want the game to sell. Not only to us PC elitists.

PS: "To each his own"... that phrase has been outdated since 1945. It really has.
 
@phaino: When you have 3 platforms to work on in the course of 3 years and you have a limited budget, this is what happens: from the PC-only build we saw at E3 2013 to a single build that works on all 3 platforms (the architecture of the consoles is very similar to a PC, which helped in this case).

I'm sorry that they wanted to cater to more than just the PC crowd and try to appeal to the mainstream. This was their decision, and I respect it. It will bring them more sales and, by giving them a bigger budget, make Cyberpunk 2077 and future games even better (hopefully more Witcher ones after CP 2077).

Regarding the machines: no, you can't play on a low-end machine if the game requires high-end machines to even run at an acceptable framerate (no PC-only gamer would be "fine" with 30 fps, let's be honest here).

When all is said and done, we have to wait until release to see exactly how much we can tweak those ini files in terms of foliage, shadow textures, and everything else. I also noticed that HBAO+ wasn't enabled in ANY of these videos, and that increases the image quality GREATLY.
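
To give an idea of the kind of tweaking I mean, here is a rough sketch of what such an ini section might look like - the section and key names below are pure guesses on my part until we see the shipped files, not confirmed settings:

[Rendering]
GrassDensity=2400        ; hypothetical key: how thick the foliage is drawn
FoliageDistanceScale=1.5 ; hypothetical key: how far out grass and trees stay visible
ShadowQuality=4          ; hypothetical key: shadow map resolution tier
TextureDownscale=0       ; hypothetical key: 0 = full-resolution textures

If the engine doesn't clamp these values, pushing them past the in-game Ultra limits is the usual way PC folks squeeze out extra fidelity.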
 
Personally, I'm waiting for 2 things to happen:

1. PC folks with high-end graphics cards being mad that the game also runs well on mid-tier PCs. I think we're quite near this point. And it's going to be a glorious sight.

2. People realising that how the game looks does not equal how the game looks+plays+feels.

You and me both, kind sir ^^
-hands some popcorn to him-
 
And why am I supposed to care about Nvidia's hardware recommendations when an unrelated, independent, and well-reputed firm - PCGH - has had its say and tested it out?

Why, you ask? Well, I didn't read anything about fps in their article. Maybe to them (PCGH) it is OK if it doesn't run at a stable 60 fps; maybe they are OK with the low 40s or 50s - some people can't tell the difference. The thing is, there is not a single fact in their article, just a sentence saying it will run OK. And do you presume GPU makers are ignorant? Why?
 
First, the game looks great as it is on each platform. Second, the similarity between the PS4 and PC max settings is indeed a little unsettling, but I still fully expect the game to look better - let's say - half a year down the line compared to how it looks now. I mean, patches should definitely help, and don't forget about the expansions. While they won't be like the previous Enhanced Editions, I believe CDPR can (and will) make some overall improvements to the game, and not just put in some new quests and locations.
 
Honestly, yes. I spent money to get great graphics.

/no regrets. 8)

Hey, more power to you. But you have to realize you are a small minority among both console gamers and PC gamers. According to the last Steam survey, the Intel HD 400 is the most commonly used card. Why would a business prioritize the small minority of PC gamers who can afford 1,000-2,000 dollars to get great graphics?
 
I think this is the first time ever that I am not sure whether I should be happy that my 7950 can hold a game at ultra settings... This is so weird :p
 
I just laugh at the people who think that because they have a GTX Titan X, CDPR will make a game for them - 0.1%-0.5% of PC users (based on Steam charts). CDPR did the right thing: make an amazing game for all platforms. Remember: Gameplay > Graphics.
 
According to the last Steam survey, the Intel HD 400 is the most commonly used card. Why would a business prioritize the small minority of PC gamers who can afford 1,000-2,000 dollars to get great graphics?
That's just a factor you have to remove, though; iGPUs aren't for playing games. So just move to the most common dGPUs, and those are the GTX 760 and 660. That's what most PCs run. Guess what they're close in performance to.

I have a 970, a dedicated 750 Ti for PhysX (overkill), and an i5 4690K, but I know I am in the niche, not the norm.
 