Predicted Witcher 3 system specs? Can I run it?

I hope so. I trust them, and hopefully that includes excellent SLI profile support... whatever graphics card you have, it will be awesome...

Let's hope they listen to us. =D
 
I apologize. I went into too much detail and politics for the purposes of this thread, which ended up obscuring my main point.
My advice regarding The Witcher 3 is: do not buy or build a PC based on the ridiculous requirements of games that suffer from poor optimization, because it is a waste of money and it does not matter in the end. A poorly optimized game is still a poorly optimized game, no matter what kind of PC you run it on. From what we can gather from the interviews, the CDPR team seems to have put a lot of emphasis on optimizing The Witcher 3 across all platforms. If they deliver and the game is properly optimized, then we should not need two 780s to get a stable 40+ FPS on ultra at 1080p. Of course we cannot be sure until we see the requirements, and even then I'll hold off until I see the game in action.

AC Unity is a good example of how lazy devs can be... but CD Projekt is a different story. :)
 

Ultra settings don't mean anything by themselves. You can't use those settings as a criterion for comparing performance. I mean, if you enable SSAA, MSAA x4, or TXAA, you don't really have the right to complain about performance.
CD Projekt RED shouldn't be afraid to include some uber settings for future hardware.
 

But AC Unity is a poorly optimized mess anyway.
Lots of bugs, and even on consoles the framerate is horrible, dropping below 20 FPS. :)
 

^ This. The point above about ultra settings not being a fair basis for comparison is exactly right.

Optimization does NOT mean cutting the game down until whatever features remain can run at their best on ordinary hardware.

Optimization does NOT mean pretending the game will run its best and highest features on ordinary hardware without reducing or compromising those features.

The goal of optimization is to deliver the best possible performance across a wide range of platforms, including sub-minimum ones, right up to a level of performance that is only possible on Big Iron and that delivers an experience that makes owning the Big Iron worth it to you.
 
Please don't repeat their CRRRAP. It makes it really hard for those of us who take giving technical advice seriously, because we end up having to tell people to ignore it.

Short answer: We don't know how many GPU hamsters it will take to spin this game at any specific frame rate, resolution, and quality yet. The best guesses are indeed that it will take a high-end GPU to get 1920x1080 at something close to 60 fps, a pair of them in SLI or Crossfire to handle 1440p, and not much chance of 4k, which is 4x the pixels of 1920x1080.
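For a rough sense of that scaling, here is a minimal sketch (plain Python, nothing game-specific) comparing raw pixel counts at the resolutions under discussion; the assumption that GPU load grows roughly with pixel count is only a first-order approximation, since geometry, AI, and driver overhead do not scale the same way.

```python
# Raw pixel counts at common gaming resolutions.
# Assumption: shading cost scales roughly with pixel count,
# which is only a first-order approximation of real GPU load.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the pixels of 1080p)")
```

The output shows 1440p at roughly 1.78x and 4K at exactly 4x the pixels of 1080p, which is why the jump from one high-end card to SLI/Crossfire tends to come up at exactly those resolutions.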

If I were determined to have 4K at 60fps at any cost, I would get a Xeon on an LGA 2011 motherboard, so I would have enough PCI-Express support for 3x SLI. If I were not so determined and willing to settle for lesser resolution or frame rate, I would get a Z97-chipset motherboard that had a good layout for two damned big GPUs in SLI, and only buy one GPU. That way, I can determine whether the second GPU is really needed, before spending another several hundred dollars.



I see. This is what I'm planning to put together to run at 1440p on ultra, possibly at 60 FPS. Let me know if you think it's enough.


Case: Cooler Master Storm Stryker tower, white

Power supply: Cooler Master Silent Pro, 700 W

Motherboard: Asus Maximus VII Hero

CPU: Intel Core i7-4790K, 4.40 GHz

RAM: 16 GB

Storage (4 drives): primary 500 GB SSD, secondary 1 TB HDD, a third 500 GB HDD, and lastly another 500 GB HDD

GPU: Asus GTX 980, 4 GB

I will do this upgrade ANYWAY (I already have most of the parts listed, including the GPU). I'm asking whether you think this is enough for what I'm after only because I have doubts about putting another 980 in SLI with this configuration; that's the only variable here.
Thank you.

If it matters, I'm planning to use an Asus PA279Q as my monitor, at 1440p obviously.
 
Okay, so help an idiot out here - if I have an i7-2600k 3.4 GHz with 16GB DDR3-1600 RAM, what's about the biggest GPU that makes sense? I'm willing to splurge for this game, but I also don't want to buy a prohibitively expensive GPU that I can't take full advantage of.
 

There isn't any single GPU that a Sandy Bridge would bottleneck. I would be reluctant to put two high-end GPUs in SLI, because it has only PCI-e 2.0. But it has more than enough bandwidth for any single card.
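To put numbers behind the PCI-e 2.0 remark, here is a minimal sketch of the theoretical one-direction bandwidth of a x16 slot per generation; the transfer rates and encoding overheads are the published figures, and actual frame-data traffic normally sits well below these ceilings.

```python
# Theoretical one-direction bandwidth of a PCI-Express x16 slot.
# Values: (transfer rate in GT/s, encoding efficiency) per generation.
generations = {
    "PCIe 2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

LANES = 16
for gen, (gt_per_s, efficiency) in generations.items():
    # GT/s * efficiency = Gbit/s per lane; divide by 8 bits per byte.
    gb_per_s = gt_per_s * efficiency / 8 * LANES
    print(f"{gen} x{LANES}: ~{gb_per_s:.1f} GB/s per direction")
```

That works out to roughly 8 GB/s versus 16 GB/s per direction, which is why a single card in a PCI-e 2.0 x16 slot is fine, while two high-end cards splitting into x8/x8 on the same platform leave less margin.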
 

Well, crap. Now I'm going to have to rely on restraint and rationality; I was hoping my hardware limitations would make the choice for me. Haha. But thanks, man.
 
I think Ubisoft and Nvidia have outdone themselves this time around. Not only does Assassin's Creed Unity average 30 FPS on a GTX 780 with an i7-4770K, it also dips down to 17 FPS on the PS4. It is a game that is not optimized on any system, and the funny thing is that it's one of the biggest releases of the year.
It's even funnier how we PC gamers spend more and more money on ever more powerful PCs, and at the end of the day it doesn't matter, because a poorly optimized game doesn't properly take advantage of the power available to it.
No one should buy or build a PC based on the ridiculous requirements of poorly optimized games like Unity, and I'd say buying games like this before patches and price drops just gives greedy, lazy companies like Ubisoft more money to make more unoptimized games.
And lastly, the trailers and interviews claiming that Nvidia is working in close collaboration to optimize the game and make it look its best on PC have turned into pure comedy after Watch Dogs and Unity. It gives me the chuckles every time I see one of those trailers, and I tend to avoid such games, because 99% of the time it means the game is unoptimized. These trailers and the games' actual performance suggest two possibilities: either Nvidia does this on purpose to push customers toward overpriced cards, or Nvidia is simply incompetent.

Completely agree with you, man. We need to stop buying hardware and software until they show us they're working hard on their products. How is it possible that a one-year-old $500 card cannot maintain 50 FPS in a new game at full 1080p? Simple: it's a fucking SCAM (sorry). How is it possible that a ~5000 GFLOPS card only runs 15-20 FPS faster in ACU than the ~1800 GFLOPS GPU in a closed system? Is there an engineer anywhere who can seriously explain that? Something is starting to smell very rotten in the video game industry. I'm personally sick of upgrading and paying and paying and paying more to get less and less and less. What a mess.

Glorious, those days of the GeForce 8000 series and Crysis. That was technology really worth paying for. Now you pay gold and get a wooden wheel...
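For what it's worth, those GFLOPS figures come from a simple formula: shader count × 2 FP32 operations per clock (fused multiply-add) × clock speed. The sketch below plugs in the commonly quoted specs for a GTX 980 and the PS4's GPU as assumptions; the point is precisely that this theoretical peak says nothing about frame rate in a particular game.

```python
# Theoretical peak FP32 throughput: shaders * 2 ops per clock (FMA) * clock.
# The spec figures below are the commonly quoted ones, used here as assumptions.
gpus = {
    "GTX 980 (boost clock)": (2048, 1.216),  # shader count, clock in GHz
    "PS4 GPU":               (1152, 0.800),
}

for name, (shaders, clock_ghz) in gpus.items():
    gflops = shaders * 2 * clock_ghz
    print(f"{name}: ~{gflops:.0f} GFLOPS peak FP32")
```

Peak FLOPS is an upper bound on arithmetic throughput, not a prediction of FPS; memory bandwidth, CPU and driver overhead, and how well the engine feeds the GPU decide the rest, which is exactly where poor optimization shows up.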
 
There is no scam and no call for flinging foul language around. Do not make me put on my moderator hat; you will not like the result.

There is no comparison whatsoever between the artistic and production standards of games that were current seven years ago and first-class games of today.

As an engineer, I will answer you. You can only put so much into the game before it exceeds the capacity of current hardware. Blaming the engineers for not being able to run the highest quality production on anything short of the highest quality and most modern hardware is a gross and offensive insult.

And I have to repeat something I said earlier. OPTIMIZATION IS NOT REDUCING THE FEATURES OF A GAME UNTIL IT WILL RUN ON YOUR HARDWARE.
 

Wow, take it easy, man. If I can't explain my point of view, you don't need to ban me; I'll leave on my own.

I'm just saying what is obvious to everyone. If you're an engineer, please explain to me how you end up with 40 FPS, crashes, stuttering, and frame drops on a 5000 GFLOPS card. Can you explain to me exactly how developers optimize a game? If you could, you probably wouldn't be here moderating. Don't take that personally, but optimization these days is a fucking joke. I'm a consumer: I pay for games, hardware, peripherals, and all the paraphernalia. It's not fair that you buy a $500 graphics card, a $200-300 CPU, and a $60 game, and then, ta-da, they've got you: poor optimization, bugs, crashes, disconnections, and so on. And all that crap ships with the big Nvidia "The Way It's Meant to Be Played" logo!! Sure, meant to be played at the museum. Can you explain that to me as well? I'm a consumer; if I can't say these things and I'm only here to buy and be bought, then we're all in a sorry state.
 

I've been an engineer longer than most of the members of this forum have been walking, so I've seen enough cases of bad development to ascribe it to bad development.

Not availing yourself of the assistance provided by the graphics card maker that is in a position to provide that assistance, in the name of some fictitious notion of fairness, is the best possible way to compound all the mistakes you made into an even worse excuse for a product.

And as a moderator, I have no objection to your stating your opinion, but it is my duty to tell you to do so decently and in order or keep silence instead.
 
ATTENTION!
I am warning you before you test cards in 3DMark Fire Strike (latest version).

This thing killed my excellent reference 980 (1510/7810 from stock).

What a friend told me has now been confirmed: the Combined test in that "benchmark" kills GPUs after a strong OC.
It killed my 780 Ti Lightning, and now my reference 980. :/

In some tournaments, people have lost GPUs and even PSUs on that Combined test. :/
Shame on you, Futuremark. Please be careful with your GPUs, everyone.
 

Quoted for truth.

Once you overclock, you are working at your own risk. It is entirely possible to destroy hardware by running it at full continuous load while overclocked, especially if it is also overvolted.

But blaming the benchmark programs for causing the failure is really not accurate. The failure was caused by the decision to overclock and then run at full load without being ready to back down really fast at the first sign of trouble.
 

But ONLY the 3DMark Fire Strike Extreme Combined test does that.
In the past two months I have had 9 different GTX 780 Tis and 4 GTX 980s, and ONLY the 780 Ti Lightning and one of the reference 980s died, during this test.
The 780 Ti Lightning was at stock voltage...

3DMark Fire Strike (newest version) seems to have some kind of voltage-spike bug during the Combined test.
 
Mobo: Asus Maximus VII Ranger
CPU: Intel Core i5-4690K
RAM: Corsair Vengeance Pro 2133 MHz, 8 GB
Fan: Cooler Master Hyper 212 EVO
GPU: Asus Strix GeForce GTX 970
PSU: Corsair RM 850W
(Case: Corsair Graphite 780T)
Total: approx. €1200

Opinions would be appreciated. :D

Edit: I want to go SLI in 1-2 years, hence the large PSU. And I already have 2.5 TB of HDD space.
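As a sanity check on the 850 W choice, here is a rough power-budget sketch; the per-component wattages are ballpark TDP/board-power figures, not measurements of this exact build, so treat them as assumptions.

```python
# Rough DC power budget for the listed build with a second GTX 970 added later.
# All wattages are typical TDP / board-power figures, used here as assumptions.
components = {
    "i5-4690K":                 88,
    "GTX 970 x2 (future SLI)":  2 * 145,
    "Motherboard + RAM":        50,
    "Drives, fans, CPU cooler": 30,
}

total = sum(components.values())
psu_watts = 850
print(f"Estimated load under SLI: ~{total} W")
print(f"Headroom on an 850 W unit: ~{psu_watts - total} W ({total / psu_watts:.0%} utilisation)")
```

Even with overclocking headroom added on top, an 850 W unit leaves a comfortable margin for two 970s, so the RM 850 looks like a reasonable pick if SLI really is planned.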
 