I apologize. I went into too much detail and politics for the purposes of this thread, which ended up obscuring my main point.
My advice regarding The Witcher 3: do not buy or build a PC based on the ridiculous requirements of games that suffer from poor optimization. It is a waste of money, and in the end it does not matter: a poorly optimized game is still a poorly optimized game, no matter what kind of PC you run it on. From what we can gather from the interviews, the CDPR team seems to have put a lot of emphasis on optimizing The Witcher 3 across all platforms. If they deliver and the game is properly optimized, we should not need two 780s to get stable 40+ FPS on ultra at 1080p. Of course, we cannot be sure until we see the requirements, and even then I'll hold off until I see the game in action.
Ultra settings don't mean anything by themselves. You can't use those settings as a criterion for comparing performance. I mean, if you enable SSAA, 4x MSAA, or TXAA, you don't have the right to complain about performance.
CD Projekt RED shouldn't be afraid to add some "uber" settings aimed at future hardware.
Please don't repeat their CRRRAP. It makes it really hard for those of us who take giving technical advice seriously to have to tell people to ignore it.
Short answer: we don't know yet how many GPU hamsters it will take to spin this game at any specific frame rate, resolution, and quality. The best guesses are indeed that it will take a high-end GPU to get something close to 60 fps at 1920x1080, a pair of them in SLI or CrossFire to handle 1440p, and not much chance of 4K, which is 4x the pixels of 1920x1080.
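The "4x the pixels" arithmetic above is easy to verify; a quick sketch (the resolution table below is just the common labels, not anything quoted from the thread):

```python
# Rough pixel-count math behind the "4x the pixels" claim.
# Resolutions are (width, height) pairs under their common names.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {pixels / base:.2f}x of 1080p")
```

1440p works out to about 1.78x the pixels of 1080p, and 4K to exactly 4x, which is why the jump to 4K is so much harder on the GPU than the jump to 1440p.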
If I were determined to have 4K at 60 fps at any cost, I would get a Xeon on an LGA 2011 motherboard, so I would have enough PCI-Express support for 3x SLI. If I were not so determined and willing to settle for a lower resolution or frame rate, I would get a Z97-chipset motherboard with a good layout for two damned big GPUs in SLI, and only buy one GPU. That way, I can determine whether the second GPU is really needed before spending another several hundred dollars.
Okay, so help an idiot out here - if I have an i7-2600k 3.4 GHz with 16GB DDR3-1600 RAM, what's about the biggest GPU that makes sense? I'm willing to splurge for this game, but I also don't want to buy a prohibitively expensive GPU that I can't take full advantage of.
There isn't any single GPU that a Sandy Bridge would bottleneck. I would be reluctant to put two high-end GPUs in SLI on it, because it has only PCI-e 2.0, but that is more than enough bandwidth for any single card.
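For context on why PCI-e 2.0 is still plenty for one card: per-direction bandwidth for an x16 slot can be estimated from the link rate and the line encoding. A rough sketch, using the published per-generation rates (not measurements from any particular board):

```python
# Back-of-envelope PCIe bandwidth, one direction, for an x16 slot.
# Each entry: (link rate in GT/s, encoding efficiency).
GENS = {
    "PCIe 2.0": (5.0, 8 / 10),     # 8b/10b line encoding
    "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b line encoding
}
LANES = 16

for gen, (gt_s, eff) in GENS.items():
    # GT/s * efficiency gives usable Gbit/s per lane; /8 converts to GB/s.
    gb_s = gt_s * eff * LANES / 8
    print(f"{gen} x{LANES}: ~{gb_s:.1f} GB/s per direction")
```

That puts PCI-e 2.0 x16 at roughly 8 GB/s each way, about half of 3.0 x16, and in practice even a single high-end card rarely saturates the 2.0 link.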
I think Ubisoft and Nvidia have outdone themselves this time around. Not only does Assassin's Creed Unity average 30 FPS on a GTX 780 with an i7-4770K, it also dips down to 17 FPS on the PS4. It is a game that is not optimized on any system, and the funny thing is that it's one of the biggest releases this year.
It's even funnier how we PC gamers spend more and more money on more powerful PCs, and at the end of the day it doesn't matter, because a poorly optimized game doesn't properly take advantage of the power available to it.
No one should buy/build a PC based on the ridiculous requirements of poorly optimized games like Unity, and I'd say buying games like this before patches and price drops just gives greedy, lazy companies like Ubisoft more money to make more unoptimized games.
And lastly, the trailers and interviews saying that Nvidia is collaborating closely to optimize the game and make it look its best on PC have turned into pure comedy after Watch Dogs and Unity. I chuckle every time I see one of those trailers, and I now avoid such games, because 99% of the time it means the game is unoptimized. These trailers and the games' performance suggest two possibilities: either Nvidia does this on purpose to force customers to buy overpriced cards, or Nvidia is simply incompetent.
There is no scam and no call for flinging foul language around. Do not make me put on my moderator hat; you will not like the result.
There is no comparison whatsoever between the artistic and production standards of games that were current seven years ago and first-class games of today.
As an engineer, I will answer you: you can only put so much into a game before it exceeds the capacity of current hardware. Blaming the engineers because the highest-quality production will not run on anything short of the most modern high-end hardware is a gross and offensive insult.
And I have to repeat something I said earlier. OPTIMIZATION IS NOT REDUCING THE FEATURES OF A GAME UNTIL IT WILL RUN ON YOUR HARDWARE.
Whoa, take it easy, man. If I cannot explain my point of view, you don't need to ban me; I'll leave on my own.
I'm just saying what is obvious to everyone. If you're an engineer, please explain to me how you get 40 FPS, with crashes, stuttering, and frame drops, on a 5000-GFLOPS card. Can you explain exactly how developers optimize a game? If you could, you probably wouldn't be here moderating. Don't take that personally, but optimization these days is a fucking joke. I'm a consumer: I pay for games, hardware, peripherals, and all the paraphernalia. It's not fair that you buy a $500 graphics card, a $200-300 CPU, and a $60 game, and then, surprise! They've got you: poor optimization, bugs, crashes, disconnections, etc. And all that crap carries the big Nvidia logo: "The way it's meant to be played!" Sure, to be played at a museum. Can you explain that to me too? I'm a consumer; if I can't say these words and I'm just here to buy and be bought, then we are all full of shit.
ATTENTION!
I am warning you before you test cards in 3DMark Fire Strike (latest version).
This thing killed my excellent reference 980 (1510/7810 at stock).
What a pal told me has been confirmed: that "benchmark's" Combined test kills GPUs after a strong OC.
It killed my 780 Ti Lightning before; now my reference 980 :/
In some tournaments, people have lost their GPUs and even PSUs on that Combined test :/
What a shame, Futuremark. Please be careful with your GPUs, everyone.
Quoted for truth.
Once you overclock, you are working at your own risk. It is entirely possible to destroy hardware by running it at full continuous load while overclocked, especially if it is also overvolted.
But blaming the benchmark programs for causing the failure is really not accurate. The failure was caused by the decision to overclock and then run at full load without being ready to back down really fast at the first sign of trouble.
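The reason overvolting compounds the risk can be sketched with the standard first-order model for dynamic power in CMOS logic, P ≈ C·V²·f. The multipliers below are illustrative only, not figures for any specific card:

```python
# Dynamic power in CMOS logic scales roughly as P ~ C * V^2 * f,
# so a modest voltage bump compounds with the clock increase.
def relative_power(v_scale: float, f_scale: float) -> float:
    """Power draw relative to stock for given voltage/clock multipliers."""
    return (v_scale ** 2) * f_scale

stock = relative_power(1.0, 1.0)
oc = relative_power(1.10, 1.15)  # hypothetical +10% voltage, +15% clock
print(f"~{oc / stock:.2f}x stock power draw")
```

With those hypothetical numbers, a 10% voltage bump plus a 15% clock bump lands near 1.39x stock power, all of which has to be dissipated as heat through the same cooler and fed through the same VRMs, which is why a stress test like the Combined run is where marginal overclocks fail.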
ATTENTION!
I am warning you before you test cards in 3DMark Fire Strike (latest version).
Isn't that thing just for showing off?